Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 18, 2020.
Wait for Guru3d CP2077 game performance review, that'll be one hell of a review.
It's only OK when Nvidia does something; the world ends when AMD does something.
Not really, it just shows the 6800/XT working properly, just like in most other games, while not showing the 3070/3080 working properly.
It's an invalid point to make because it doesn't treat both companies' cards equally and it's an outlier. It realistically only matters if you're only ever going to play that one game.
It'd be a useful benchmark IF:
The performance differences between the 2080 Ti, 3080, and 3090 were all in line with what was expected, AND AMD was still in the lead as it shows.
But that's not the case; it's an issue with the game, not a benefit to the 6800/XT.
What's with the tapering-off performance above 1080p? Is it a bandwidth thing, or perhaps drivers? I mean, these cards are absolute kings at 1080p, but that's hardly the standard resolution at this price point.
If they were at 1440p as well things would get interesting
Can't you make an assessment based on previous generations and the data you already have now?
I guess the impossible is possible; got mine almost 6 hours after launch. I guess shops cancelled some of the orders from bots / people who ordered more than one. Thanks to X-KOM in Poland.
Debating between the 6800 XT and the 3080 or 3080 Ti, mainly because nVidia offers more in terms of DLSS 2.0 and RTX, plus better 4K performance.
Lower resolutions use lower-resolution buffers. Those can fit into the 128 MB Infinity Cache, but at higher resolutions the card has to move more data to and from memory with each frame.
Basically, lower resolutions can get away with just a higher GPU clock; higher resolutions need more memory bandwidth.
It's reasonable to think that the moment AMD starts to use faster memory or a wider IMC, they'll regain the performance that's lost at higher resolutions.
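The cache argument above comes down to simple arithmetic. A rough back-of-envelope sketch — the 24 bytes/pixel figure and the buffer breakdown are illustrative assumptions, not AMD's actual render-target layout:

```python
# Back-of-envelope estimate: how much per-frame render-target data would fit
# in a 128 MB Infinity Cache at each resolution.
# Assumed (hypothetical) buffers per pixel: 4-byte color target,
# 4-byte depth/stencil, and a 16-byte G-buffer (e.g. four RGBA8 attachments).

INFINITY_CACHE_MB = 128
BYTES_PER_PIXEL = 4 + 4 + 16  # color + depth + assumed G-buffer

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
}

for name, (w, h) in RESOLUTIONS.items():
    mb = w * h * BYTES_PER_PIXEL / (1024 * 1024)
    verdict = "fits in" if mb <= INFINITY_CACHE_MB else "exceeds"
    print(f"{name}: ~{mb:.0f} MB of render targets -> {verdict} the 128 MB cache")
```

Under these assumptions, 1080p (~47 MB) and 1440p (~84 MB) stay inside the cache, while 4K (~190 MB) spills out to VRAM every frame — which would line up with performance tapering off above 1080p.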
= = = =
Here, RTX 3070s are starting to become available. No surprise that shops list them at the same price as the 6800 XT.
And I noticed that some RTX 3090s have incoming stock on the 26th and can be pre-ordered. So that means people haven't bought out that incoming batch yet.
"Properly coded" for specific HW while ensuring the competition's HW is gimped in the process. Fair enough, they're just doing what Nvidia sometimes did to them in the past. AMD wants a chance to play the unethical monster they've always bitched and moaned about, to see if they can replicate Nvidia's success. The "we're in it for gamers' overall benefit" principle no longer applies once they find the shoe on the other foot.
And what if it's just resource optimizations to maximize the benefit of the IC?
nVidia has a small cache in comparison and could just as well choke on its IMC if a game uses latency-sensitive operations.
Congrats. (Change that image provider; it doesn't work here.)
The truth is, I don't know if other AIBs (apart from Sapphire, the company that builds most reference AMD Radeon cards) will put Hitachi's carbon thermal pad on the GPU chip for better thermal transfer and a much cooler GPU.
Thanks, I paid around €660.
Mine also arrived today. Now I'm just waiting for the delivery of the Ryzen 5600X.
Here is my PowerColor RX 6800 score at 1440p. Badass.
I honestly don't believe they have gimped the competition, just allowed their own hardware to spread its wings and perform to its full potential. That's not always possible, and as much as I like my Nvidia cards, I have to say the "sponsoring games" thing was always a bit fishy.
Let's hope AMD won't drop to that level.
It was shocking when Intel used to force the "less efficient path" in their compilers when an AMD CPU was detected. They deserved to be sued and pay the fine; that sort of behaviour is just criminal.
I wasn't referring to overall performance, but rather to RT performance, where Nvidia has more capable HW for that purpose. But apparently AMD and the game devs can side-step that and implement RT to the best capabilities of AMD's own HW, even if it may be an inferior solution. It's understandable, and I won't fault AMD for it; they have a business to run and they NEED to show examples of doing better than the competition.
AMD Helped Godfall and Dirt 5 Devs Implement Ray-Traced Shadows, Though It’s Barely Noticeable
My Time Spy graphics score is also higher.
More games to add to my not interested list, cheers.
Hh. I think I LOLed a bit. Thanks. Except you do it before you even know if the game will be broken for nVidia users.
With PhysX and GameWorks in general, games were atrocious for over 10 years, and often ran poorly even on Nvidia's own cards.
What you're afraid of is that AMD will employ nVidia's strategy. Yet did you boycott games that were broken for AMD, or for both sides, thanks to nVidia's code?
I think people had plenty of time to slap nVidia on the fingers. It did not happen; gamers did not care.
If AMD returned the same treatment now, maybe some gamers would grow a spine. For most, it is too late.
Where are your comments on AMD's RT doing nothing for these games and making hardly any difference? Why aren't you even questioning the performance, RT on or off?
The difference with PhysX and GameWorks is that there was an actual difference in fidelity and/or the effects possible, and it crippled performance on both sides. They were next-gen effects that would have been years away without it.
Why should I even care, when team red's history is that they don't work with or help devs?