Review average of 17 websites shows 6800 XT to be 7.4% slower than GeForce RTX 3080

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 20, 2020.

  1. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,205
    Likes Received:
    2,996
    GPU:
    5700XT+AW@240Hz
    False. My methods are perfectly suited for a real comparison. I have explained them all multiple times; they are not hard to understand at all.

    For example, a physically smaller screen can get away with lower fps, because objects in motion travel a smaller physical distance on the screen. They still move the same number of pixels if both screens in the comparison have the same resolution.

    It is like Android devices historically being locked to 30 fps gaming. Not many people would notice on such small screens; only in recent years have some games gotten a 60 fps mode.
    And only recently are we getting Android devices with 90 Hz and now 120 Hz screens.
    For a phone, 90/120 Hz is overkill unless the person wants to use the device in a VR headset.

    When I trigger motion which results in an object moving 2 cm per frame on my screen at 240 fps, it translates to 8 cm per frame on a 60 Hz screen, no matter how many fps the game can produce, as long as the screen size is equal.
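    (A minimal Python sketch of that per-frame travel relation; the 480 cm/s value is simply what 2 cm per frame at 240 fps implies, assuming equal screen size and resolution.)

        # Physical distance an object travels between displayed frames:
        # the same on-screen speed covers more distance per frame at a
        # lower refresh rate.
        def travel_per_frame_cm(speed_cm_per_s, refresh_hz):
            return speed_cm_per_s / refresh_hz

        speed_cm_per_s = 2 * 240                          # 2 cm/frame at 240 fps -> 480 cm/s
        print(travel_per_frame_cm(speed_cm_per_s, 240))   # 2.0 cm per frame
        print(travel_per_frame_cm(speed_cm_per_s, 60))    # 8.0 cm per frame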

    And it is the same for image quality comparisons on screens of the same size. When you have a 1440p screen and run whatever detail settings you can while the game holds a stable 60 fps, and then take a 4K screen of the same size with the same HW and use settings which get you the same 60 fps... you have effectively removed the temporal part of the comparison and can perform an IQ comparison.
    - - - -
    What you described is only true in the case of unlimited power. And that's the whole problem: not even the 3090 has that. I made a joke about the power draw of the Suprim card, but that does not translate to gaming; the power is not there. To blow it out of proportion to demonstrate it: choose DX12 + DX-R at maximum IQ at 1080p, or maximum DX9c at 8K, or DX7 (no shaders) at 16K.

    Sure, maybe in 20 years, when we have 32K screens, everything will be about tiny polygons and things will return to the times of simple TnL. But I doubt it.
     
  2. coth

    coth Master Guru

    Messages:
    466
    Likes Received:
    49
    GPU:
    KFA2 2060 Super EX
    All 5120x1440 monitors are 49". They are a doubled 24" 2560x1440. That is 120 ppi. Not only is that a low resolution with a still-large pixel, but 125% scaling is also not supported by 99.9% of apps. Any raster scaling to 125% is severely distorted.
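    (For reference, a minimal sketch of the standard diagonal pixel-density formula; the 24" 2560x1440 and 49" 5120x1440 sizes are just the ones mentioned above.)

        import math

        # ppi = diagonal in pixels / diagonal in inches
        def ppi(width_px, height_px, diagonal_in):
            return math.hypot(width_px, height_px) / diagonal_in

        print(ppi(2560, 1440, 24))   # ~122 ppi for a 24" 1440p panel
        print(ppi(5120, 1440, 49))   # ~109 ppi for a 49" 32:9 1440p panel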
     
    Dragam1337 likes this.
  3. KingK76

    KingK76 Member Guru

    Messages:
    106
    Likes Received:
    11
    GPU:
    Pascal Titan X Under H2O
    Okay, so I did the math myself using Guru3D's benchmarks... At 1440p the RTX 3080 is faster in 7 of 14 games and tied in one. That means the RTX 3080 wins at 1440p, as the 6800 XT was faster in 6 games and tied in 1. When averaging the games benchmarked, the RTX 3080 has an average of 138.71 fps vs 136.79 for the 6800 XT. Another win, by 1.4% at 1440p, for the RTX 3080... So yes, even on Guru3D the RTX 3080 is faster at 1440p than the 6800 XT. I didn't bother with 1080p, as NO ONE buying a top-tier GPU is gaming at 1080p... I could say the same for 1440p, as I know that I game at 2160p myself and most people buying a top-tier GPU are going to be gaming at 4K... no matter what AMD "enthusiasts" like to say. I will state that the 6800 XT is an AMAZING GPU coming from AMD. The gains gen over gen are astounding; I give them all the credit in the world. But when you consider the performance gap at 4K for the RTX 3080 and the slim victory at 1440p, added to the large gap in ray tracing performance and features like DLSS, only the blind would say the 6800 XT is the better card.
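    (The quoted 1.4% follows directly from the two averages; a minimal sketch using only the numbers stated above, not the per-game results.)

        # Lead implied by the average fps figures quoted above
        rtx_3080_avg = 138.71
        rx_6800_xt_avg = 136.79

        lead_pct = (rtx_3080_avg / rx_6800_xt_avg - 1) * 100
        print(f"RTX 3080 lead at 1440p: {lead_pct:.1f}%")  # ~1.4%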
     
  4. KingK76

    KingK76 Member Guru

    Messages:
    106
    Likes Received:
    11
    GPU:
    Pascal Titan X Under H2O
    In 1 or 2 years Nvidia will also be on a proper die shrink, which will only increase their advantage... Samsung's 8nm is really 10nm, so AMD currently has a big advantage over Nvidia when it comes to clock speed and power. When that advantage is gone, so is what's holding the gap down, and the small gap Nvidia currently has over AMD will grow. In other words, the gap will be much larger... And for that I would bet we only have to wait until next year...
     

  5. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    6,979
    Likes Received:
    209
    GPU:
    980
    It's not like AMD will be standing still. Even with the node difference, I doubt Nvidia would have gotten dramatically better on 7nm. Nvidia barely has any gap, and mostly that gap is in RT, which then again makes sense, them having dedicated cores for that and Tensor cores for DLSS. But, like, c'mon.
     
    HandR and Kosmoz like this.
  6. Kosmoz

    Kosmoz Member

    Messages:
    39
    Likes Received:
    14
    GPU:
    GTX 1060 6GB
    First, AMD will improve too with next-gen GPUs; they will also push harder and will be competitive from now on. They achieved what they said they would with RDNA 2, so I can't doubt them now for going for another +50% with RDNA 3, like they are saying. I do not expect Nvidia to pull away again, though I do expect them to trade punches, like now.

    Second, I was referring to how this will look in 1-2 years' time, for this gen: how the 3000 series vs the 6000 series will look when we benchmark them in 1-2 years, compared to now. That's what I meant, not next gen vs next gen. This gen from AMD will age even better than Nvidia's; the fine wine will be even greater when they add their own DLSS alternative to the mix, which I predict will come within 6 months from now.

    So people are jumping the gun with their conclusions, saying no DLSS and thus RT(X) = Nvidia is better, when in a few months' time even that advantage will be gone.

    There are 2 types of buyers for these GPUs: those that want the performance NOW (RTX 3000s), and those that think ahead, know that AMD will deliver, and buy the better future product (Radeon 6000s) with enough performance for now to not be too inferior anyway.
     
  7. Mufflore

    Mufflore Ancient Guru

    Messages:
    12,301
    Likes Received:
    1,035
    GPU:
    Aorus 3090 Xtreme
    lol wut?
     
  8. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    6,979
    Likes Received:
    209
    GPU:
    980
    Oh, tbh, the advantage Nvidia has in RT won't be gone unless AMD delivers a way better DLSS alternative, which I doubt. It will be smaller, but even then the best RT performance will be on Nvidia cards.
     
  9. Kosmoz

    Kosmoz Member

    Messages:
    39
    Likes Received:
    14
    GPU:
    GTX 1060 6GB
    Right, but if it ends up being a ±10 fps difference, not a lot of people will really care. I won't, for sure, as long as it is over 60 fps.
     
  10. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,205
    Likes Received:
    2,996
    GPU:
    5700XT+AW@240Hz
    I would not buy a 3080/6800 XT for higher than 1080p resolution if I were into DX-R.
    And I do not want a 6800 XT for 4K, as its performance would not last until the next generation. It is a great 1440p GPU and it will last, outside of DX-R.

    That's why I am on 1080p and have no reason to move above it even if I get a 6800 XT.

    If nVidia doubles DX-R performance per $ with the next generation, maybe DX-R games will make more sense at 1440p. Till then, fake pixels for some, 1080p for others.
     

  11. jarablue

    jarablue Active Member

    Messages:
    99
    Likes Received:
    10
    GPU:
    nVidia 960m 2GB
    This is wrong. 1440p with a 3080 will net me near 144 fps on my 144 Hz monitor with every setting cranked. It is not at all a waste of money.
     
  12. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    6,979
    Likes Received:
    209
    GPU:
    980
    Well, yes. I didn't mean that DX-R performance is anything to write home about on any GPU when it comes down to it, especially at higher res.

    I am on 5120x1440 mainly because of work; I needed the real estate. But that is a reason for me to buy a really fast GPU now too... Most likely I will go for that 6900 XT unless I can get an AIB card tomorrow. It's just so frikkin annoying to run this display at 60 Hz on my 980 :D Or to play World of Tanks at 30-40 fps...
     
    Fox2232 likes this.
  13. KingK76

    KingK76 Member Guru

    Messages:
    106
    Likes Received:
    11
    GPU:
    Pascal Titan X Under H2O
    Come on... How do you figure Nvidia's advantage in RT will be gone in a few months? Where are you coming from with that? Look at raw RT performance in Minecraft and Nvidia has something like a 5x advantage... And AMD has nothing that can compete with how Nvidia does DLSS, as Nvidia has the Tensor cores... I think you're stretching it quite a bit with that statement... And Nvidia also won't be standing still... AMD have certainly put a scare in them now. Unlike Intel, who seem to have nothing to come back at AMD with, Nvidia are much better positioned to up their game. And when both AMD and Nvidia are on 7nm, Nvidia will only increase their gap. 5nm is next, but AMD aren't close to moving to that... Maybe another 1.5 years I'd say, 2022. And I would bet Nvidia won't rest on their laurels and let AMD get to that node quicker than they will... Don't forget who the bigger company is, with more money to spend to keep their lead.
     
  14. Kosmoz

    Kosmoz Member

    Messages:
    39
    Likes Received:
    14
    GPU:
    GTX 1060 6GB
    The advantage will not be 100% gone, as in AMD matching Nvidia's RT performance exactly once they have their DLSS alternative (FidelityFX Super Resolution), but I do expect 70-80% of that advantage to be gone. It would be a case of Nvidia having, let's say at 1440p, 100 fps with RTX+DLSS while AMD has 80 fps with RT+FidelityFX Super Resolution. That will look much better than now, and some people won't care whether it's 100 or 80 fps at that point. I know I won't. The gap will be much smaller.

    Also, you are wrong about AMD and 5nm: they are moving to 5nm for RDNA 3, hence the +50% perf/watt over RDNA 2, which they say will happen again as it did over RDNA 1. There will not be a case where both AMD and Nvidia are on 7nm with the same generation of GPUs.
     
  15. AuerX

    AuerX Member Guru

    Messages:
    105
    Likes Received:
    40
    GPU:
    PNY RTX2070 OC
    Imaginary stories about unobtainable hardware are the best.
     

  16. Kosmoz

    Kosmoz Member

    Messages:
    39
    Likes Received:
    14
    GPU:
    GTX 1060 6GB
    For now, yes; see you in 4-6 months. It will still be this gen even then, not a new next gen...

    Some people have no foresight at all. Even when we clearly know more things will come to improve the NOW situation (especially for AMD), they ignore all of that and think that NOW = FOREVER. A few months is nothing compared to years.

    Like the blinders-wearing Xbox fanboys who kept going on about their most powerful console in the world, how the 12 TF XSX would demolish the "weakstation5", etc. And now they see that not only are they on par, but the PS5 can actually be even better in some scenarios in 3rd-party games. Where is that +20% performance, hm? Where is the demolishing, the stomping all over the PS5?

    They likewise said Radeon would be a 2080 Ti +15%, and that was likewise proven wrong. Concerning AMD vs Nvidia this gen, 6000s vs 3000s, AMD still has another big gun to bring to the fight, and that is FidelityFX Super Resolution. Nvidia has nothing left, except to start developing the 4000 series sooner and faster. I'm not saying next gen, Hopper vs RDNA 3, will be the same; Nvidia won't stand still and take a beating like Intel, but this gen the one that surprised us more is definitely AMD.
     
  17. mitzi76

    mitzi76 Ancient Guru

    Messages:
    8,722
    Likes Received:
    19
    GPU:
    MSI 970 (Gaming)
    And the price listed on OcUK is £750-ish for a 6800 XT!
     
  18. StevieSleep

    StevieSleep Member

    Messages:
    10
    Likes Received:
    2
    GPU:
    GTX 1080
    Lovely story, and by all means, that's all it is: a story, and not even a factually correct one.
    There is no RTX 1080; I have a GTX 1080.
    The GTX 1080 never delivered 4K in any modern game (it would have been nice if you had looked that one up).
    I want 4K performance especially for old games. Between DLSS 2 and resolution scaling, needing to hit native 4K in modern games is pointless.
    Nice story about the fake pixels, but I really don't care, and I'll tell you a secret: NOBODY DOES. As long as it looks good I don't pixel-peep; I didn't do it when I played at 1080p and I'm sure AF not going to do it when I have 4x the real estate.
    Now, unless you actually had a point instead of meandering about fake cards, unrealistic performance metrics, and a short diatribe about fake pixels, I'm pretty sure we're done here.
     
  19. StevieSleep

    StevieSleep Member

    Messages:
    10
    Likes Received:
    2
    GPU:
    GTX 1080
    Yup, my bad, I didn't take the high-FPS crowd into account.
    But honestly you can't blame me. I'm actually one of those slowpokes that can barely tell the difference, and for me, as long as I don't experience any input lag, it's really hard to care about going well over 60. I am well over 30 at this point, though, so expecting my reflexes to improve is a bit of a stretch.
     
  20. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,205
    Likes Received:
    2,996
    GPU:
    5700XT+AW@240Hz
    Thanks for pointing out the typo. But using it as an argument? :D
    https://www.guru3d.com/articles_pages/nvidia_geforce_gtx_1080_review,17.html
    The literal release-date benchmark set says that card delivered 40-50 average fps at 4K in most of the modern games of the time. In the lighter ones, the average was closer to 60.
    From that test set, only Alien: Isolation did not spend a large portion of its time under 60 fps.

    I get it, you bought a GTX 1080 in 2016 to play games from 2012 at 4K. And now, in 2020, you still enjoy playing those 2012 games at 4K. And maybe some new undemanding indie games.
    But if anyone had persuaded me, with whatever magic, to buy a GTX 1080 for 4K, I would feel pretty stupid. The same way I would feel stupid buying any of the current Ampere/RDNA2 GPUs for 4K gaming, since Turing introduced an endless black hole for performance in the form of raytracing. And DX-R is not going to go away any time soon.

    4K performance has always sucked on modern HW, and within a year of each HW release new AAA titles turn it back into a mediocre performance level. Not much has changed since the time AMD/nVidia started marketing their GPUs as "True 4K" GPUs.

    They are still not 4K GPUs, and that's why nVidia came up with fake pixels and AMD has been pushed to promise their own fake-pixels alternative.
     
