Discussion in 'Benchmark Mayhem' started by DannyD, Oct 8, 2020.
With the 380W BIOS.
I totally get it, but remember benching and gaming are totally different. Cheers for your DLSS-off result; surely you have more juice in the card yet though, eh?
And I'll stand by my point: benchmarks and DLSS don't mix.
I love DLSS - but there is a definite loss of visual fidelity. Maybe 50% as bad as turning on FXAA...?
Definitely more noticeable in some scenes than others. This is exactly the scenario (if you were actually playing the game) where it's most beneficial: a game where you want to keep RT (or whatever feature) on, so you sacrifice overall visual fidelity slightly in order to bring the FPS back into comfortable ranges.
I'd be interested to know how much the fidelity of DLSS is affected by the data that's fed into Nvidia's training model.
I see DLSS as a very well-crafted 'trick' to run games at lower resolutions without noticeable image loss.
I see benchmarks as a way to see the true power of your GPU, so for me I would naturally disable all the 'tricks' first.
And on the point about bottlenecks: that's just crazy. Enjoy your system and enjoy the experience of upgrading it.
Also, I like DLSS and have zero issues with it, but if I had a GPU that could raw-power a game without it, then it gets disabled.
Have you BIOS-unlocked your card?
DLSS is not the future, significant IQ advancements aside.
Native or higher will continue to be the gold standard - DLSS is only there to make up for FPS deficiencies.
That's patently false.
There is an appreciable loss of quality in most circumstances, including some severe artifacting at times. It's a fantastic technology, but the fact remains - unless you have poor eyesight (a very real possibility) it's not without drawbacks.
It's no different to anti-aliasing techniques - they have pros and cons.
Dude, the hell have you done to my thread?