Asynchronous Compute

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Carfax, Feb 25, 2016.

  1. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    GoW:U was also a joke, as was the UAP/UWP version of Tomb Raider.
     
  2. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Those two are fine (well, not fine, but not the worst), but after Quantum Break, Hitman is probably the worst example of DX12 I've encountered.
     
  3. narukun

    narukun Master Guru

    Messages:
    228
    Likes Received:
    24
    GPU:
    EVGA GTX 970 1561/7700
  4. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,495
    Likes Received:
    124
    GPU:
    GTX Titan Sli
    Deus Ex: MD just got its DX12 support and the story is the usual one: better performance on AMD, worse on Nvidia.
     

  5. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,930
    Likes Received:
    1,044
    GPU:
    RTX 4090
  6. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,500
    Likes Received:
    1,875
    GPU:
    7800 XT Hellhound
  7. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
  8. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,222
    Likes Received:
    1,540
    GPU:
    NVIDIA RTX 4080 FE
    And since it does not support DX11, there's no way to know if it is better or worse. My guess is that it would run much the same on DX11, give or take a couple of frames per second. Ditto for Quantum Break, which is not very well optimised under DX12 anyway.
     
  9. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,952
    Likes Received:
    1,244
    GPU:
    .
    People still do not understand that most games and game engines are designed first to be GPU-bottlenecked rather than GPU-driven (AOS is one of the few attempts at the latter). Moreover, you cannot pretend that 3-5 years (or even more!) of work on a product developed around one set of concepts can simply vanish by adding some code to run on a new API.

    Low-level APIs do not help much in these scenarios. Though they have some features to help utilize the GPU better, the easiest tasks when converting a game/engine from a high-level API to a low-level/low-overhead API are focused on reducing CPU overhead.
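
    To make the CPU-overhead point concrete, here is a minimal D3D12 sketch of multithreaded command-list recording, the kind of work low-level APIs make cheap compared to a DX11 immediate context. It assumes an already-initialized ID3D12Device and ID3D12CommandQueue; RecordSlice is a hypothetical helper standing in for real draw code, and error handling is omitted:

```cpp
// Minimal sketch, not a complete app: assumes an already-initialized
// ID3D12Device and ID3D12CommandQueue. Error handling omitted;
// RecordSlice is a hypothetical helper standing in for real draw code.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

// Record the draw commands for one slice of the scene into its own list.
void RecordSlice(ID3D12GraphicsCommandList* list, int slice)
{
    // ... set pipeline state, root signature, issue draws for `slice` ...
    list->Close();  // list is now immutable and ready for submission
}

void RecordFrameInParallel(ID3D12Device* device, ID3D12CommandQueue* queue,
                           int workerCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocs(workerCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);
    std::vector<std::thread>                       workers;

    for (int i = 0; i < workerCount; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocs[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        // Unlike a DX11 immediate context, each thread records into its
        // own command list with no driver-side serialization.
        workers.emplace_back(RecordSlice, lists[i].Get(), i);
    }
    for (auto& w : workers) w.join();

    // One cheap submission of everything the workers recorded.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```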

    Moreover, most benchmarks are done with high-end CPUs or at high quality settings only. Most people do not have a high-end CPU, nor a GPU capable of running the latest AAA game at high settings. But those are the clickbait scenarios: everyone loves to dream of having a high-end PC when reading reviews...
     
  10. This is all true. But there is no excuse for shoddy game design.
     

  11. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,930
    Likes Received:
    1,044
    GPU:
    RTX 4090
    None of this justifies a DX12 renderer running slower than DX11. That is just a badly coded renderer, nothing more.
     
  12. somemadcaaant

    somemadcaaant Master Guru

    Messages:
    454
    Likes Received:
    65
    GPU:
    Red Devil 6900XT LE
    HAHAHAHA... have to agree, that's my new wallpaper, thanks.
     
  13. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,930
    Likes Received:
    1,044
    GPU:
    RTX 4090
    Your card supports concurrent execution of async compute.
     
  14. somemadcaaant

    somemadcaaant Master Guru

    Messages:
    454
    Likes Received:
    65
    GPU:
    Red Devil 6900XT LE
    I'm sure it does, mate; guess we just wish it worked better, or as intended, then?

    If Nvidia had a decent implementation of async compute they would have already introduced it. We will see with the release of BF1.

    Do we think Nvidia is currently ****ting bricks?
     
  15. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,930
    Likes Received:
    1,044
    GPU:
    RTX 4090
    Did you even bother to read the thread?

    "Decent implementation of async compute" is impossible since async compute is a software construct, there's only one implementation of it in DX12 and another in Vulkan, both just are.

    A h/w which is able to run compute concurrently with graphics in an optimal way would not gain any performance from such situation. A h/w which does is bad at running graphics and/or compute alone.
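
    For illustration, "async compute" in D3D12 is literally just a second command queue of type COMPUTE. A minimal sketch follows (it assumes an initialized ID3D12Device and two pre-recorded, closed command lists; names and structure are illustrative, not any engine's actual code):

```cpp
// Minimal sketch: in D3D12, "async compute" is nothing more than a second
// command queue of type COMPUTE. Assumes an initialized ID3D12Device and
// two pre-recorded, closed command lists; error handling omitted.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void SubmitGraphicsAndCompute(ID3D12Device* device,
                              ID3D12CommandList* gfxWork,
                              ID3D12CommandList* computeWork)
{
    ComPtr<ID3D12CommandQueue> gfxQueue, computeQueue;

    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics + compute + copy
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&gfxQueue));

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute + copy only
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));

    // The API only expresses two independent streams of work; whether the
    // GPU actually executes them concurrently is entirely up to the
    // hardware and driver.
    gfxQueue->ExecuteCommandLists(1, &gfxWork);
    computeQueue->ExecuteCommandLists(1, &computeWork);

    // Cross-queue ordering (e.g. graphics consuming the compute result)
    // would be expressed with an ID3D12Fence via Signal/Wait on the queues.
}
```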

    Do you think that NV is ****ting bricks? Can you give any reason for them to?
     

  16. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    I'm already getting 100fps+ in BF1 @1920x1200. Not my kind of game, but performance is great. What more do you want to see? At my resolution, you should also easily be getting 100fps.

    Hypothetically, if you had 120fps in DX11, is your desire to see 130-140fps in DX12?
     
  17. siriq

    siriq Guest

    Messages:
    794
    Likes Received:
    21
    GPU:
    Evga GTX 570 Classified

    Lol.

    Async has to be supported by both hardware and software. The hardware implementation is weak in Kepler/Maxwell and only a bit better in Pascal; NV is very limited on the hardware side here. Please read more about this matter and don't give out misinformation. I assume you have seen the diagram of Maxwell/Kepler async hardware capability; it is very limited compared to AMD's GCN.

    The thing is, this is an old story. We all know how it works and what causes the limits.
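
    As an aside, what a GPU exposes queue-wise can at least be inspected from Vulkan. A minimal sketch using only core Vulkan 1.0 calls, listing each device's queue families; note this only shows what the driver exposes, not how well the hardware actually overlaps graphics and compute work:

```cpp
// Minimal sketch using only core Vulkan 1.0 calls: lists each GPU's queue
// families. A family with COMPUTE but without GRAPHICS is a dedicated
// compute queue family.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main()
{
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_0;
    VkInstanceCreateInfo info{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    info.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        std::printf("%s\n", props.deviceName);

        uint32_t famCount = 0;
        vkGetPhysicalDeviceQueueFamilyProperties(gpu, &famCount, nullptr);
        std::vector<VkQueueFamilyProperties> fams(famCount);
        vkGetPhysicalDeviceQueueFamilyProperties(gpu, &famCount, fams.data());

        for (uint32_t i = 0; i < famCount; ++i) {
            bool gfx     = fams[i].queueFlags & VK_QUEUE_GRAPHICS_BIT;
            bool compute = fams[i].queueFlags & VK_QUEUE_COMPUTE_BIT;
            std::printf("  family %u: graphics=%d compute=%d queues=%u\n",
                        i, gfx, compute, fams[i].queueCount);
        }
    }
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```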
     
  18. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,500
    Likes Received:
    1,875
    GPU:
    7800 XT Hellhound
    I'd rather miss out on DX12 gains than have disastrous performance in some other games:
    http://gamegpu.com/rpg/роллевые/obduction-test-gpu

    GCN quickly seems bottlenecked in other scenarios; it's just not hyped as much since it can't be pinned on a single missing feature.
     
  19. siriq

    siriq Guest

    Messages:
    794
    Likes Received:
    21
    GPU:
    Evga GTX 570 Classified
    Developers have to follow the general directives in their code paths, and that engine is not one of them :D It's more like DX11-style code. In time, most companies will do it this way, which favors next-gen Nvidia as well as old, current, and next-gen AMD. The same happened before with DX11.
     
    Last edited: Sep 10, 2016
  20. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,500
    Likes Received:
    1,875
    GPU:
    7800 XT Hellhound
    Yeah, just like everything until today, except Doom with Vulkan on GCN.
    Game developers screw up DX12 (and they would likely screw up Vulkan as well); I have way more confidence in Nvidia's DX11 driver than in current developers' DX12 skills.
     
