Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Carfax, Feb 25, 2016.
GoW:U was also a joke. As was UAP/UWP Tomb Raider.
Those two are fine (well, not fine, but not the worst), but after Quantum Break, Hitman is probably the worst example of DX12 I've encountered.
Deus Ex: MD just got its DX12 support, and the story is the usual one: better performance on AMD, worse on Nvidia.
In real game scenes of Deus Ex, DX12 is also slower than DX11 on AMD cards:
The game's internal benchmark seems to be some kind of scam, just like it was with Total War.
I'm not surprised tbh. Cheaters.
And since it does not support DX11, there's no way to know if it is better or worse. My guess is that it would likely run exactly the same, give or take a couple of frames per second, on DX11. Ditto for Quantum Break, which is not very well optimised under DX12 anyway.
People still do not understand that most games and game engines are designed from the start to be GPU-bottlenecked rather than GPU-driven (AOS is one of the few attempts at the latter). Moreover, you cannot pretend that 3-5 years (or even more!) of work on a product built around one design concept can simply vanish by adding some code to run on a new API.
Low-level APIs do not help much in these scenarios. Though they have some features that help utilize the GPU better, the easiest part of converting a game/engine from a high-level API to a low-level/low-overhead one is focused on reducing CPU overhead.
Moreover, most benchmarks are done with high-end CPUs, or using high quality settings only. Most people do not have a high-end CPU, nor a GPU capable of running the latest AAA game on high settings. But those are the clickbait scenarios: everyone loves to dream of having a high-end PC when reading reviews...
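To illustrate the CPU-overhead point above: the biggest easy win when porting to DX12/Vulkan is that command recording becomes cheap and can be spread across worker threads, which DX11's immediate context could not do. A rough D3D12-style sketch (the queue/list/allocator objects and `RecordDrawsForChunk` are assumed to exist from earlier setup; error handling omitted, so this is not a compilable program):

```
// Main CPU-side win of a low-level API: record command lists in parallel.
std::vector<std::thread> workers;
for (int t = 0; t < kNumThreads; ++t) {
    workers.emplace_back([&, t] {
        // One command allocator + list per thread; no driver-side locking.
        lists[t]->Reset(allocators[t], pipelineState);
        RecordDrawsForChunk(lists[t], t);   // hypothetical helper
        lists[t]->Close();
    });
}
for (auto& w : workers) w.join();

// One cheap submission of everything that was recorded in parallel.
queue->ExecuteCommandLists(kNumThreads, lists);
```

None of this makes the GPU itself any faster, which is why an engine that was already GPU-bound sees little benefit from the port.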
This is all true. But there is no excuse for shoddy game design.
None of this justifies the DX12 renderer running slower than DX11. It's just a badly coded renderer, nothing more.
HAHAHAHA... have to agree, my new wallpaper thanks.
Your card supports concurrent execution of async compute.
I'm sure, mate. I guess we just wish it worked better, or as intended, then?
If Nvidia had a decent implementation of async compute they would have already introduced it. We will see with the release of BF1.
Do we think Nvidia is currently ****ting bricks?
Did you even bother to read the thread?
A "decent implementation of async compute" is impossible, since async compute is a software construct: there's one definition of it in DX12 and another in Vulkan, and both just are what they are.
Hardware that can already run graphics or compute alone in an optimal way would not gain any performance from running them concurrently. Hardware that does gain from it is simply bad at running graphics and/or compute alone.
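For context on why async compute is "just software": in DX12 it amounts to submitting work on a second, compute-type queue and letting the hardware decide whether to overlap it with graphics. A rough D3D12-style sketch (real API names, but setup and error handling omitted, so not a compilable program):

```
// "Async compute" in DX12: a second queue, nothing more.
D3D12_COMMAND_QUEUE_DESC computeDesc = {};
computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute-only queue
device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

// Graphics work goes to the direct queue as usual...
directQueue->ExecuteCommandLists(1, &graphicsList);
// ...while independent compute work is submitted to the compute queue.
computeQueue->ExecuteCommandLists(1, &computeList);

// A fence keeps the frame in sync. Whether the GPU actually runs the two
// queues concurrently is entirely up to the hardware and driver -- the
// API only expresses the opportunity.
computeQueue->Signal(fence, ++fenceValue);
directQueue->Wait(fence, fenceValue);
```

The API only expresses the opportunity for concurrency; whether the GPU gains anything from it depends on how well its units were already utilized by graphics or compute alone.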
Do you think that NV is ****ting bricks? Can you give any reason for them to?
I'm already getting 100fps+ in BF1 @1920x1200. Not my kind of game, but, performance is great. What more do you want to see? At my resolution, you should also be easily getting 100fps.
Hypothetically, if you had 120fps in DX11, is your desire to see 130-140fps in DX12?
Async has to be supported by both hardware and software. The hardware implementation is weak in Kepler/Maxwell and only a bit better now in Pascal, so Nvidia is very limited on the hardware side here. Please read more about this matter; don't give out misinfo. I assume you have seen the diagram of Maxwell/Kepler async hardware capability. It is very limited compared to AMD's GCN.
The thing is, this is an old story. We all know how it works and what's causing the limits.
I'd rather miss out on DX12 gains than have disastrous performance in some other games:
GCN quickly becomes bottlenecked in other scenarios; it's just not hyped as much since it can't be pinned on a single missing feature.
Developers have to follow the general directives in their code paths. That engine is not one of them; it's more like DX11-style code. In time, most companies will do it; this approach favors next-gen Nvidia as well as old, current, and next-gen AMD. The same happened before with DX11.
Yeah, just like everything until today, except Doom with Vulkan for GCN.
Game developers screw up DX12 (and they would likely screw up Vulkan as well); I am way more confident in Nvidia's DX11 driver than in current developers' DX12 skills.