Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 5, 2019.
Can you post the FPS difference between the 399.x drivers and the latest ones?
People are already creating conspiracy theories about Nvidia gimping drivers, no...
These are GCN optimizations that came from the consoles, as well as from the latest COD. Pascal is going to die pretty soon...
2020 AAA titles will probably run well on Turing and GCN/RDNA/RDNA2, but pretty poorly on Pascal, Fermi and Maxwell. Would love to see a 980 Ti and a 780 Ti in the benchmark, hahaha!
Yeah, I don't give much value to what they say. If someone can prove that Turing is using specific hardware to "yield better image quality", then sure... but I very much doubt that can be proven, as I seriously doubt it to be the case. Nvidia will probably claim it is, but I take everything they say with a huge truckload of salt. I am willing to bet they are using the exact same driver software implementation.
Speaking of Nvidia drivers, is anyone interested in giving the latest Vulkan developer drivers, also released yesterday, a go?
Waste of Hilbert's time. He's just trolling.
Assassin's Creed Odyssey. 980 Ti slower than 1060...
A game is just one game; different titles affect the cards in different ways.
In The Outer Worlds, for example, the Radeon VII is on par with the RX 5700, and the RX 5700 XT is a good 15% to 20% faster.
So, could we say that AMD is deliberately crippling Radeon VII to favor Navi?
Obviously they won't push cards that are no longer made. But they do support them; saying otherwise is just not true. Do you ever read the driver notes?
I was always skeptical about Nvidia gimping their old GPUs, but seeing the 1080 Ti's performance in RDR2 (and the new CoD), I think it's time to revisit that topic.
You know, you may be right, but why deny the possibility that they simply aren't optimizing it properly for Pascal, and that this is just the beginning, given that the title itself is a big deal? How do smartphone manufacturers force people to change their phones? Why would they release new ones if they made the old ones run better with each year? It's all speculation, obviously, but it could just as well be the beginning of the end for Pascal.
Yes, they fix glaring issues, but they sure as hell don't optimize game performance for Pascal, which is very evident with this game.
Lots of crazy talk in this thread. This is madness!
There is a pretty large difference between Nvidia not optimizing for Pascal and Nvidia intentionally gimping Pascal - which is what people are saying is occurring (and that Nvidia has a history of doing this - they don't). Just like there is a difference between Apple not releasing updates for older phones and Apple releasing updates that actually lower the performance on older phones.
In this case there are a ton of different variables - Turing's general compute, for example, is significantly higher than Pascal's, and its ability to dual-issue INT/FP simultaneously may cause a developer, when writing shaders, to lean into those advantages. There is no amount of optimization Nvidia can do from their side to make Pascal better in that case. FP16 is now supported on RDNA/Vega/Turing - any shaders that utilize it are going to be faster on those architectures, and no optimization is going to make them faster on Pascal. There is a whole slew of architecture enhancements on Turing that devs can take advantage of that can't be "optimized" for on Pascal. In those cases it isn't even "Nvidia not doing it properly"; it's "Nvidia can't do anything" to make up the difference. That's not to mention that developers simply targeting newer NVAPI/ISA versions can improve Turing performance across the board, even in games that aren't otherwise tuned for older architectures.
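To make the FP16 point concrete, here's a minimal CUDA sketch (the kernel names and the trivial multiply-add are illustrative only, not taken from any actual game): the packed-half path issues two FP16 FMAs per instruction on hardware with full-rate FP16, while consumer Pascal runs FP16 math at a small fraction of its FP32 rate, so once a shader is authored this way no driver update can close the gap.

```cuda
#include <cuda_fp16.h>

// Hypothetical kernels for illustration only - not from any real engine.

// One FP32 fused multiply-add per element.
__global__ void fma_fp32(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = fmaf(a[i], b[i], 1.0f);
}

// Two FP16 fused multiply-adds per instruction via the packed __half2 type.
// Full-rate FP16 hardware (Vega/Turing/RDNA) can run this at up to 2x FP32
// throughput; consumer Pascal executes __hfma2 at a small fraction of its
// FP32 rate, so this code path is inherently slower there.
__global__ void fma_fp16x2(const __half2* a, const __half2* b, __half2* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = __hfma2(a[i], b[i], __float2half2_rn(1.0f));
}
```

The point being: if a developer ships the packed-half path because it's faster on the newer architectures, Pascal falling behind isn't sabotage, it's just hardware.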
Is that what's going on here? I have no idea - but neither does anyone in this thread. To automatically assume intentional downgrading is a garbage assumption. I don't care for it.
Some sensible person here!
LOL. You must be new here, but Denial has been defending Nvidia at every controversy for as long as I can remember. I don't think I ever saw him offer a single criticism of Nvidia. Take from that what you will, but I always try to listen to people who at least try to remain neutral. Blind devotion and fanboyism isn't healthy for anyone.
I'm not saying you're wrong, just that what is going on in this title is odd, to say the least. Who knows, maybe Turing has a ton under the hood and we were only seeing the tip of the iceberg up until this point.
Good news, I have another reason to wait for the Steam release. Not going to buy a 20xx for that, sorry Nvidia.
5.x GB of VRAM is not something you see every day at 1080p, but I saw 4.3-4.4 GB often enough (BL3, RE2, modded Skyrim) to know that 4 GB will limit you for sure in 2019.
I think this is amusing because I've been called an AMD fanboy on these forums too, for calling out equally dumb opinions from Nvidia people against AMD. I was outspoken on the 970 memory issue, the GPP program (when more evidence of it came out) and various other issues. I also own $30K of AMD stock. But yeah, I guess I'm an Nvidia fanboy because I don't buy into random people saying "Nvidia is intentionally gimping performance" based on literally nothing.
Wrong!....He gets on everybody's azz....
So you do change your stance on Nvidia once there's overwhelming evidence of their wrongdoing, but until that point you happily defend them and talk down every concern people might have about performance or other issues. I don't know how you can say, with a serious face, that the 1080 Ti's performance here is normal, but whatever.