Asynchronous Compute

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Carfax, Feb 25, 2016.

  1. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,359
    Likes Received:
    254
    GPU:
    Gigabyte 3080 Ti
    GoW:U was also a joke. As was UAP/UWP Tomb Raider.
     
  2. Redemption80

    Redemption80 Ancient Guru

    Messages:
    18,495
    Likes Received:
    266
    GPU:
    GALAX 970/ASUS 970
    Those two are fine (well, not fine, but not the worst), but after Quantum Break, Hitman is probably the worst example of DX12 I've encountered.
     
  3. narukun

    narukun Master Guru

    Messages:
    217
    Likes Received:
    24
    GPU:
    EVGA GTX 970 1561/7700
  4. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,485
    Likes Received:
    120
    GPU:
    GTX Titan Sli
    Deus Ex: MD just got its DX12 support, and the story is the usual one: better performance on AMD, worse on Nvidia.
     

  5. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,259
    Likes Received:
    567
    GPU:
    RTX 3080
  6. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    1,640
    Likes Received:
    422
    GPU:
    3060 TUF
  7. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,772
    Likes Received:
    383
    GPU:
    MSI GTX1070 GamingX
  8. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    16,072
    Likes Received:
    586
    GPU:
    EVGA GTX 1080 Ti SC2
    And since it does not support DX11, there's no way to know if it is better or worse. My guess is that it would likely run exactly the same, give or take a couple of frames per second, on DX11. Ditto for Quantum Break, which is not very well optimised under DX12 anyway.
     
  9. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,270
    Likes Received:
    822
    GPU:
    .
    People still do not understand that most games and game engines are designed first to be GPU-bottlenecked rather than GPU-driven (AOS is one of the few attempts at the latter). Moreover, you cannot pretend that 3-5 years (or even more!) of work on a product built around one set of concepts will simply vanish by adding some code to run on a new API.

    Low-level APIs do not help much in this scenario. Though they have some features that help utilize the GPU better, the easiest tasks when converting a game/engine from a high-level API to a low-level/low-overhead one are focused on reducing CPU overhead.

    Moreover, most benchmarks are done with high-end CPUs or at high quality settings only. Most people do not have a high-end CPU, nor a GPU capable of running the latest AAA game at high settings. But those are the clickbait scenarios: everyone loves to dream of having a high-end PC when reading reviews...
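The point above, that low-overhead APIs mainly cut CPU submission cost and so help little on the GPU-bound review rigs most benchmarks use, can be sketched with a toy frame-time model. All numbers below are hypothetical, chosen only to illustrate the shape of the argument:

```python
# Toy frame-time model (all numbers hypothetical) showing why an API that
# only reduces CPU submission overhead helps CPU-bound systems far more
# than GPU-bound ones.

def frame_time_ms(cpu_work, api_overhead, gpu_work):
    """Frame time is gated by the slower of the CPU side (game logic plus
    draw-call submission overhead) and the GPU side, since the two pipeline
    across frames."""
    return max(cpu_work + api_overhead, gpu_work)

# Slow CPU, fast GPU: cutting API overhead (the DX11 -> DX12 promise)
# removes the bottleneck and frame time drops.
slow_cpu_dx11 = frame_time_ms(cpu_work=10.0, api_overhead=8.0, gpu_work=12.0)
slow_cpu_dx12 = frame_time_ms(cpu_work=10.0, api_overhead=2.0, gpu_work=12.0)

# Fast CPU at max settings (the typical review rig): GPU-bound either way,
# so the low-overhead API changes nothing measurable.
fast_cpu_dx11 = frame_time_ms(cpu_work=4.0, api_overhead=3.0, gpu_work=20.0)
fast_cpu_dx12 = frame_time_ms(cpu_work=4.0, api_overhead=1.0, gpu_work=20.0)
```

On the slow-CPU system the hypothetical frame time falls from 18 ms to 12 ms; on the GPU-bound rig it stays pinned at 20 ms, which is exactly why benchmark charts from high-end test systems undersell the benefit.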
     
  10. CrazyBaldhead

    CrazyBaldhead Master Guru

    Messages:
    300
    Likes Received:
    14
    GPU:
    GTX 1070
    This is all true. But there is no excuse for shoddy game design.
     

  11. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,259
    Likes Received:
    567
    GPU:
    RTX 3080
    None of this justifies a DX12 renderer running slower than DX11. That is just a badly coded renderer, nothing more.
     
  12. somemadcaaant

    somemadcaaant Master Guru

    Messages:
    422
    Likes Received:
    32
    GPU:
    Red Devil 6900XT LE
    HAHAHAHA... have to agree, my new wallpaper thanks.
     
  13. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,259
    Likes Received:
    567
    GPU:
    RTX 3080
    Your card supports concurrent execution of async compute.
     
  14. somemadcaaant

    somemadcaaant Master Guru

    Messages:
    422
    Likes Received:
    32
    GPU:
    Red Devil 6900XT LE
    I'm sure, mate; guess we just wish it worked better, or as intended, then?

    If Nvidia had a decent implementation of async compute, they would have already introduced it. We will see with the release of BF1.

    Do we think Nvidia is currently ****ting bricks?
     
  15. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,259
    Likes Received:
    567
    GPU:
    RTX 3080
    Did you even bother to read the thread?

    A "decent implementation of async compute" is impossible, since async compute is a software construct: there is one implementation of it in DX12 and another in Vulkan, and both just are.

    Hardware that already runs graphics and compute optimally on its own would not gain any performance from running them concurrently; hardware that does gain is bad at running graphics and/or compute alone.

    Do you think that NV is ****ting bricks? Can you give any reason for them to?
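The argument in the post above, that concurrency only buys performance when graphics leaves execution units idle, can be sketched as a toy occupancy model. Unit counts and timings below are hypothetical, not figures for any real GPU:

```python
# Toy occupancy model (hypothetical numbers): async compute only helps
# when the graphics workload leaves execution units idle.

def serial_ms(units, graphics_ms, compute_unit_ms):
    """Run graphics, then compute, back to back. `compute_unit_ms` is the
    total compute work in unit-milliseconds."""
    return graphics_ms + compute_unit_ms / units

def async_ms(units, graphics_busy, graphics_ms, compute_unit_ms):
    """Overlap compute with graphics: compute fills whatever capacity
    graphics leaves idle, and any remainder runs afterwards."""
    idle_capacity = (units - graphics_busy) * graphics_ms  # unit-ms free
    overlapped = min(compute_unit_ms, idle_capacity)
    return graphics_ms + (compute_unit_ms - overlapped) / units

# GPU whose graphics pass already saturates all 64 units: overlap gains
# nothing, frame time is identical to the serial schedule.
saturated_serial = serial_ms(64, 10.0, 64.0)
saturated_async = async_ms(64, 64, 10.0, 64.0)

# GPU whose graphics pass keeps only 40 of 64 units busy: the idle
# capacity hides the compute work entirely.
starved_serial = serial_ms(64, 10.0, 64.0)
starved_async = async_ms(64, 40, 10.0, 64.0)
```

In this sketch the saturated GPU sees no benefit (11.0 ms either way), while the undersaturated one drops from 11.0 ms to 10.0 ms, which is the claim in the post: a gain from async compute implies idle units during graphics in the first place.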
     

  16. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,772
    Likes Received:
    383
    GPU:
    MSI GTX1070 GamingX
    I'm already getting 100fps+ in BF1 @1920x1200. Not my kind of game, but, performance is great. What more do you want to see? At my resolution, you should also be easily getting 100fps.

    Hypothetically, if you had 120fps in DX11, is your desire to see 130-140fps in DX12?
     
  17. siriq

    siriq Master Guru

    Messages:
    790
    Likes Received:
    14
    GPU:
    Evga GTX 570 Classified

    Lol.

    Async has to be supported by both hw and sw. The hw implementation is bad in Kepler/Maxwell and a bit better now in Pascal, so NV is very limited on the hw side here. Please read more about this matter; don't give out misinformation. I assume you have seen the diagram of Maxwell/Kepler async hw capability. It is very limited compared to AMD's GCN.

    The thing is, this is an old story. We all know how it works and what causes the limits.
     
  18. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    1,640
    Likes Received:
    422
    GPU:
    3060 TUF
    I'd rather miss out on DX12 gains than have disastrous performance in some other games:
    http://gamegpu.com/rpg/роллевые/obduction-test-gpu

    GCN quickly seems bottlenecked in other scenarios; it's just not hyped as much since it can't be linked to a single missing feature.
     
  19. siriq

    siriq Master Guru

    Messages:
    790
    Likes Received:
    14
    GPU:
    Evga GTX 570 Classified
    Developers have to follow the general directives in their code paths, and that engine is not one of them :D It's more like DX11-style code. In time, most companies will do it this way, which favors next-gen Nvidia as well as old, current, and next-gen AMD. The same happened before with DX11.
     
    Last edited: Sep 10, 2016
  20. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    1,640
    Likes Received:
    422
    GPU:
    3060 TUF
    Yeah, just like everything until today, except Doom with Vulkan on GCN.
    Game developers screw up DX12 (and they would likely screw up Vulkan as well); I'm way more confident in Nvidia's DX11 driver than in current developers' DX12 skills.
     
