Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 10, 2022.
That cache is just a band-aid - it doesn't come anywhere near what a full-fat bus provides
The numbers on the 6950XT look pretty good... but... as others pointed out, with MPT TDP adjustments my AMD 6900XT achieves almost identical scores. I'd like to see a bit of a technical dive into the new GPU steppings, firmware, memory timings, etc. to see what the differences are. Importantly, how well do they overclock?
If you held off purchasing a 6900XT during the inflated price period, well, your money is certainly going to go further now.
Could you try getting an OC Formula from ASRock? That would be a great comparison. The OC Formulas are the highest-performing models out there...
In the review I see a lot of focus on ray-tracing benchmarks, but in my opinion it would be nice to have both RT on and RT off.
After all, who actually plays heavy games like Cyberpunk with RT enabled?
I suppose the following group enables the setting:
- 1080p users --> 83 FPS with a 3090 Ti
But I doubt anybody with a higher GPU load would enable it:
- 1440p 144Hz users --> 61 FPS with a 3090 Ti (I assume they don't enable it because of the horrible 1% low FPS, and because they will probably want an average of at least 80-90 FPS if they are used to playing above 60Hz/FPS).
- 2160p users --> 34 FPS with a 3090 Ti; there is no way they use the setting. (And I wouldn't swap native rendering for DLSS just to change shader lights/reflections into ray-traced lights/reflections.)
- ultrawide 1440p users
RT on AMD is much worse, and that should be mentioned in every review. I also have an AMD card, and if I have FPS to spare, I still enable RT. (Example: I played the following games with RT on, because they still give 100-120+ FPS: Doom Eternal and Warzone.)
But it would be nice to have the non-RT numbers in the review as well. At least until enabling RT becomes the norm and most players stop playing without it. (That is, until RT is broadly usable in the mid-range.)
Right now I see RT as an inefficient technology on Nvidia cards, and far less efficient still on AMD cards. If it were efficient at least on Nvidia, I'd consider it more seriously.
That's where you're mistaken. If you enable DLSS, your framerate will roughly double across all resolutions, making RT viable for a wider range of cards, and honestly it looks like native rendering.
Could someone explain to me why it got such low scores in these tests?
The ray-tracing numbers are interesting: 20-30% higher than the 6900XT. Driver improvements, perhaps? Or silicon changes? It would probably be a good idea to re-run the 6900XT tests to see what's going on.
Which is why I think the review should compare more scenarios:
1) Cyberpunk (ray tracing) -> including some entries with DLSS to show meaningful FPS for Nvidia GPUs.
2) Cyberpunk (No RT)
RT without DLSS is a waste of FPS in Cyberpunk and other heavy games. With DLSS, if you get 2x the FPS without losing any quality, then I approve.
I would also recommend running the May Performance drivers; having tried both, I prefer the May Performance drivers over 22.5.1. The performance gains in some D3D11 titles are impressive - not everything will benefit, but many games do. My reference 6900XT (bought at Dec 2020 MSRP!) keeps up with this pretty well, but I'm running the May Performance drivers too. Great cards, and as others have mentioned, by far the best buys available today.