Mesh Shaders: Why aren't games using them?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by bluedevil, Mar 24, 2022.

  1. bluedevil

    bluedevil Master Guru

    Messages:
    415
    Likes Received:
    28
    GPU:
    Kfa2 RTX 2060 6gb
    The Turing architecture was released in late 2018, Nvidia had a demo called Asteroids, and there is even a 3DMark test for this tech, so why, after over 3 years, are no mainstream games using it?
     
  2. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    5,896
    Likes Received:
    7,178
    GPU:
    RX 6800 XT
    Probably because it's a lot of work.
    It means replacing the entire geometry part of the pipeline.
    Vertex Shaders + Hull Shaders + Domain Shaders + Geometry Shaders => Amplification Shaders + Mesh Shaders
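    For anyone curious what the new pipeline actually consumes: a mesh shader workgroup processes a small pre-built cluster of a mesh (a "meshlet"), not individual vertices. Below is a rough CPU-side Python sketch (not real shader code) of how a tool might split an index buffer into meshlets. The 64-vertex / 126-triangle limits follow common D3D12 guidance, and all names are made up for illustration.

    ```python
    def build_meshlets(indices, max_verts=64, max_tris=126):
        """indices: flat triangle index list (3 entries per triangle).
        Returns a list of (local_vertex_list, local_triangle_list) meshlets."""
        meshlets = []
        verts, tris, vmap = [], [], {}  # vmap: global index -> local index

        def flush():
            nonlocal verts, tris, vmap
            if tris:
                meshlets.append((verts, tris))
            verts, tris, vmap = [], [], {}

        for i in range(0, len(indices), 3):
            tri = indices[i:i + 3]
            unique = list(dict.fromkeys(tri))
            new = [v for v in unique if v not in vmap]
            # Start a new meshlet if this triangle would overflow the limits.
            if len(verts) + len(new) > max_verts or len(tris) + 1 > max_tris:
                flush()
                new = unique
            for v in new:
                vmap[v] = len(verts)
                verts.append(v)
            # Triangles are stored with meshlet-local vertex indices.
            tris.append(tuple(vmap[v] for v in tri))
        flush()
        return meshlets
    ```

    On the GPU each meshlet then maps to one mesh shader workgroup, which is what makes per-cluster culling in the amplification stage cheap.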
     
  3. Undying

    Undying Ancient Guru

    Messages:
    20,783
    Likes Received:
    9,093
    GPU:
    RTX 3070 OC
    I think Unreal 5 was gonna use mesh shaders with Nanite.
     
  4. TimmyP

    TimmyP Maha Guru

    Messages:
    1,081
    Likes Received:
    172
    GPU:
    RTX 3070
    UE5 Nanite is the same thing, just a different name. We have had many demos.
     

  5. S3r1ous

    S3r1ous Member Guru

    Messages:
    134
    Likes Received:
    21
    GPU:
    Sapphire RX 6700
    There was already an, uhh... umbrella? Is that the correct term? For this technology: it's called Tessellation.
    Mesh shaders seem to be the next step for it, but I assume the reason it isn't in every game is the same reason most games don't unify everything under tessellation and instead rely on what's in the drivers/graphics card itself.
    My personal theory on why this doesn't end up in a lot of games is that it's too much work for hardly any gain. Sure, you'll simplify your pipeline and maybe get things done more easily in the future, but nobody really thinks about these things unless they specifically work on game engines and see massive gains for the pipeline; with things like this, the tradeoffs usually mean that sticking to proven models outweighs any potential gains. Besides, tessellation was a pain in the ass to get working well and optimized back in the day, so that's now set in the thinking of a lot of devs, even if it's a lot easier to work with nowadays.
     
    Last edited: Mar 24, 2022
  6. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    5,896
    Likes Received:
    7,178
    GPU:
    RX 6800 XT
    No.
    Tessellation is just Hull Shaders + Domain Shaders.
     
  7. S3r1ous

    S3r1ous Member Guru

    Messages:
    134
    Likes Received:
    21
    GPU:
    Sapphire RX 6700
    Well, that's what I am saying: it's the next step, with more things unified.
     
  8. TimmyP

    TimmyP Maha Guru

    Messages:
    1,081
    Likes Received:
    172
    GPU:
    RTX 3070


    Mesh shading is used in this. Also, if you haven't seen it, prepare to have your mind absolutely blown.

    @8:41, btw. Nanite is mesh shaders. The exact same thing.
     
    Last edited: Mar 25, 2022
  9. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,407
    Likes Received:
    661
    GPU:
    RTX 4090
    Because old consoles can't run them, and no games have been made yet that drop support for them. It's also highly likely that GPUs without mesh shader support won't be able to run such games at playable framerates, which means that many people would be left out.
     
    Dragam1337 and cucaulay malkin like this.
  10. bluedevil

    bluedevil Master Guru

    Messages:
    415
    Likes Received:
    28
    GPU:
    Kfa2 RTX 2060 6gb
    Old consoles can't run RTX stuff, and yet we have some big games that do use this tech. Game engines can implement different code paths for this feature, and considering that you get about a 500% performance increase in benches, I see no reason against using mesh shaders. Laziness is not a valid argument.
    Most of the time the GPU spends its resources computing geometry, textures, and lighting, so optimizing any of them would bring big gains in performance.
    Even if in real-world scenarios we don't get the full performance increase that benchmarks like 3DMark show, that's still a lot of performance we are missing.
    Unreal Engine's Nanite uses mesh shaders, and the performance increase is huge.
    Check out this link to see performance increase on current cards in 3dmark:
    https://videocardz.com/newz/ul-rele...t-results-of-nvidia-ampere-and-amd-rdna2-gpus
    Also:
     
    Last edited: Mar 28, 2022

  11. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,407
    Likes Received:
    661
    GPU:
    RTX 4090
    RT is a visual improvement which can be easily turned off. You can't easily turn off 90% of geometry.

    Geometry is very rarely a bottleneck in modern games. Switching current geometric complexity to mesh shaders won't give you anything in terms of performance.
     
    Dragam1337 and cucaulay malkin like this.
  12. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    5,896
    Likes Received:
    7,178
    GPU:
    RX 6800 XT
    With RT, geometry can be a huge bottleneck.
    Remember that RT casts one or more rays against each triangle or primitive.
    This is why having a good BVH implementation is so important.
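    To illustrate why the acceleration structure matters so much: without one, a ray has to be tested against every primitive, while a bounding hierarchy only descends into nodes whose bounds the ray actually touches. Here's a toy 1-D Python sketch (intervals instead of triangles, a point query instead of a ray); everything here is illustrative, not a real RT implementation.

    ```python
    def build_bvh(prims):
        """prims: list of (lo, hi) intervals, assumed sorted by lo."""
        if len(prims) == 1:
            lo, hi = prims[0]
            return {"bounds": (lo, hi), "prim": prims[0]}
        mid = len(prims) // 2
        left, right = build_bvh(prims[:mid]), build_bvh(prims[mid:])
        lo = min(left["bounds"][0], right["bounds"][0])
        hi = max(left["bounds"][1], right["bounds"][1])
        return {"bounds": (lo, hi), "left": left, "right": right}

    def count_tests(node, x):
        """Number of bounds/primitive tests needed to locate point x."""
        tests = 1
        lo, hi = node["bounds"]
        if not (lo <= x <= hi):
            return tests          # bounds miss: prune this whole subtree
        if "prim" in node:
            return tests          # leaf: primitive test
        return tests + count_tests(node["left"], x) + count_tests(node["right"], x)

    prims = [(i, i + 0.5) for i in range(64)]  # 64 disjoint primitives
    brute = len(prims)                         # brute force: test all 64
    bvh = count_tests(build_bvh(prims), 10.25) # hierarchy: ~2*log2(n) tests
    ```

    The gap grows with scene density, which is why dense geometry puts so much pressure on BVH quality.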
     
  13. user1

    user1 Ancient Guru

    Messages:
    2,338
    Likes Received:
    1,007
    GPU:
    hd 6870
    Probably because it operates in a very different way from traditional vertex/geometry shaders. Ultimately, most game developers are under constant time crunch, so unless learning the new thing gives easy, obvious, large improvements visually or performance-wise, it doesn't get picked up for a very long time, if ever.

    A good example would be how widespread DX9 usage persisted well after DX10 came out, and into the DX11 era.
     
  14. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,407
    Likes Received:
    661
    GPU:
    RTX 4090
    That's an RT bottleneck, not a geometry bottleneck.
     
  15. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    5,896
    Likes Received:
    7,178
    GPU:
    RX 6800 XT
    More geometry can cause an increase in ray count, which can become a bottleneck.
    Remember that we must test each triangle. So if we have more geometry, we have to cast more rays to test.
     

  16. Erick

    Erick Member Guru

    Messages:
    123
    Likes Received:
    21
    GPU:
    RTX 2060 Super 8GB
    Forspoken will use it. Unreal Engine 5 has it built in, and they showed off DirectStorage for PC.
     
  17. Erick

    Erick Member Guru

    Messages:
    123
    Likes Received:
    21
    GPU:
    RTX 2060 Super 8GB
    Yep, and either way, the Win32 I/O path can't handle it anymore without overclocking the GPU/CPU. This is where DirectStorage and NVMe tech have to replace Win32... as soon as devs have finished learning something new.
     
  18. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,407
    Likes Received:
    661
    GPU:
    RTX 4090
    It can't cause any increase in ray count. It can cause more divergence in ray traversal, which will lead to worse h/w occupancy. But as I've said, this is not a geometry issue.

    This is not how it works.
     
  19. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    5,896
    Likes Received:
    7,178
    GPU:
    RX 6800 XT
    Then please explain.
     
  20. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,407
    Likes Received:
    661
    GPU:
    RTX 4090
    Already did. More geometry may lead to higher ray divergence. This is an RT problem which may result in an (even more pronounced) RT bottleneck. It can be avoided by using simplified proxy geometry; UE5 does it this way. Thus the geometry which you see and the one which is used for RT would be different.
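    The divergence point is easy to see with a toy model: a SIMD group ("warp") of rays stays resident until its slowest ray finishes traversal, so lanes whose rays finish early sit idle. Denser, more irregular geometry tends to spread the per-ray step counts further apart. A hedged Python sketch, with made-up numbers:

    ```python
    def warp_utilization(steps_per_ray):
        """Fraction of lane-cycles doing useful work: the group runs for as
        many steps as its slowest ray, so idle lanes waste the rest."""
        longest = max(steps_per_ray)
        return sum(steps_per_ray) / (len(steps_per_ray) * longest)

    coherent  = [10, 10, 11, 10]  # e.g. simplified proxy geometry: similar work per ray
    divergent = [5, 40, 12, 3]    # e.g. dense, irregular geometry: one slow ray stalls the group
    ```

    With the coherent counts utilization stays above 90%, while the divergent group drops well below half, even though the total ray count is identical, which is the sense in which this is an RT bottleneck rather than a geometry one.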
     
