Review average of 17 websites shows 6800 XT to be 7.4% slower than GeForce RTX 3080

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 20, 2020.
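For those curious how a cross-review figure like this is typically produced: one common approach is taking the geometric mean of each site's relative performance result. The sketch below is a minimal illustration with made-up ratios; it is not the actual data or methodology behind the 7.4% figure.

```python
from statistics import geometric_mean

# Illustrative per-site ratios of 6800 XT fps / RTX 3080 fps (1.00 = equal,
# below 1.00 = the 6800 XT is slower). These values are made up for the
# example; they are NOT the actual data from the 17 reviews.
site_ratios = [0.94, 0.92, 0.93, 0.91, 0.95, 0.92, 0.93]

# Geometric mean avoids letting one outlier game/site skew the average
# as much as an arithmetic mean would.
avg = geometric_mean(site_ratios)
print(f"6800 XT is {(1 - avg) * 100:.1f}% slower on average")
```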

  1. KingK76

    KingK76 Member Guru

    Messages:
    106
    Likes Received:
    11
    GPU:
    Pascal Titan X Under H2O
    In 1 or 2 years Nvidia will also be on a proper die shrink, which will only increase their advantage... Samsung's 8nm is really a 10nm-class process, so AMD currently have a big advantage over Nvidia when it comes to clock speed and power. When that advantage is gone, the small gap Nvidia currently have over AMD will become much larger... And I would bet we only have to wait until next year for that...
     
  2. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,041
    Likes Received:
    259
    GPU:
    6800 XT
    It's not like AMD will be standing still. Even with the node difference, I doubt Nvidia would have been dramatically better on 7nm. Nvidia barely have any gap, and most of that gap is in RT, which makes sense given they have dedicated cores for it and Tensor cores for DLSS. But come on.
     
    HandR and Kosmoz like this.
  3. Kosmoz

    Kosmoz Member Guru

    Messages:
    119
    Likes Received:
    72
    GPU:
    GTX 1080
    First, AMD will improve too with next-gen GPUs; they will keep pushing hard and will be competitive from now on. They achieved what they said they would with RDNA 2, so I can't doubt them now when they say another +50% is coming with RDNA 3. I do not expect Nvidia to pull ahead by a big gap again; I do expect the two to trade punches, like now.

    Second, I was referring to how this will look in 1-2 years' time, for this gen: how the 3000 series vs the 6000 series will look in 1-2 years when we benchmark them, compared to now. That's what I meant, not next gen vs next gen. This gen of AMD cards will age even better than Nvidia's; the fine wine will be even greater once they add their own DLSS alternative into the mix, which I predict will come within 6 months from now.

    So people are jumping the gun with conclusions like "no DLSS and thus RT(X) = Nvidia is better", when in a few months' time even that advantage will be gone.

    There are two types of buyers for these GPUs: those who want performance NOW (RTX 3000s), and those who think ahead, trust that AMD will deliver, and buy the better future product (Radeon 6000s) with enough performance for now to not be too far behind anyway.
     
  4. Mufflore

    Mufflore Ancient Guru

    Messages:
    12,705
    Likes Received:
    1,295
    GPU:
    Aorus 3090 Xtreme
    lol wut?
     

  5. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,041
    Likes Received:
    259
    GPU:
    6800 XT
    Oh, tbh the advantage Nvidia has in RT won't be gone unless AMD delivers a far better DLSS alternative, which I doubt. It will be smaller, but even then the best RT performance will be on Nvidia cards.
     
  6. Kosmoz

    Kosmoz Member Guru

    Messages:
    119
    Likes Received:
    72
    GPU:
    GTX 1080
    Right, but if it ends up being a +/-10 fps difference, not a lot of people will really care. I certainly won't, as long as it stays over 60 fps.
     
  7. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,361
    GPU:
    6900XT+AW@240Hz
    I would not buy a 3080/6800 XT for anything higher than 1080p if I was into DX-R.
    And I do not want a 6800 XT for 4K; the performance would not last until next generation. It is a great 1440p GPU, and outside of DX-R it will last.

    That's why I am on 1080p and have no reason to move above it, even if I get a 6800 XT.

    If nVidia doubles DX-R performance per $ with the next generation, maybe DX-R games will make sense at 1440p. Till then: fake pixels for some, 1080p for others.
     
  8. jarablue

    jarablue Member Guru

    Messages:
    117
    Likes Received:
    10
    GPU:
    nVidia 960m 2GB
    This is wrong. 1440p with a 3080 will net me close to 144 fps for my 144 Hz monitor with every setting cranked. It is not at all a waste of money.
     
  9. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,041
    Likes Received:
    259
    GPU:
    6800 XT
    Well, yes. I didn't mean that DX-R performance is anything to write home about on any GPU when it comes down to it, especially at higher resolutions.

    I am on 5120×1440, mainly because of work; I needed the screen real estate. But that is also a reason for me to buy a really fast GPU now... Most likely I will go for the 6900 XT unless I can get an AIB card tomorrow. It's just so frikkin annoying to run this display at 60 Hz on my 980 :D Or to play World of Tanks at 30-40 fps...
     
    Fox2232 likes this.
  10. KingK76

    KingK76 Member Guru

    Messages:
    106
    Likes Received:
    11
    GPU:
    Pascal Titan X Under H2O
    Come on... How do you figure Nvidia's advantage in RT will be gone in a few months? Where is that coming from? Look at raw RT performance in Minecraft: Nvidia has something like a 5x advantage... And AMD has nothing that can compete with how Nvidia does DLSS, since Nvidia has the Tensor cores... I think you're stretching it quite a bit with that statement... And Nvidia also won't be standing still; AMD have certainly put a scare into them. Unlike Intel, who seem to have nothing to answer AMD with, Nvidia are much better positioned to up their game. And once both AMD and Nvidia are on 7nm, Nvidia will only increase their gap. 5nm is next, but AMD aren't close to moving to it... maybe another 1.5 years I'd say, 2022. And I would bet Nvidia won't rest on their laurels and let AMD get to that node quicker than they do... Don't forget who the bigger company is, with more money to spend to keep their lead.
     

  11. Kosmoz

    Kosmoz Member Guru

    Messages:
    119
    Likes Received:
    72
    GPU:
    GTX 1080
    The advantage will not be 100% gone, as in AMD exactly matching Nvidia's RT performance once they have their DLSS alternative (FidelityFX Super Resolution), but I do expect 70-80% of that advantage to be gone. It would be a case of, let's say at 1440p, Nvidia getting 100 fps with RTX+DLSS and AMD getting 80 fps with RT+FidelityFX Super Resolution. That will look much better than now, and at that point some people won't care whether it's 100 or 80 fps. I know I won't. The gap will be much smaller.

    Also, you are wrong about AMD and 5nm: they are moving to 5nm for RDNA 3, hence the +50% perf/watt over RDNA 2 which they say will happen again, as it did over RDNA 1. There will not be a case where both AMD and Nvidia are on 7nm with the same generation of GPUs.
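    As a quick sanity check of the 100 vs 80 fps figures above (which are illustrative numbers from the post, not real benchmarks), note that the same gap can be quoted two different ways:

```python
# Hypothetical figures from the post above, not real benchmark results.
nvidia_fps = 100  # RTX + DLSS
amd_fps = 80      # RT + FidelityFX Super Resolution

# The same gap reads differently depending on the baseline chosen:
nvidia_lead = (nvidia_fps - amd_fps) / amd_fps * 100     # Nvidia faster, in %
amd_deficit = (nvidia_fps - amd_fps) / nvidia_fps * 100  # AMD slower, in %

print(f"Nvidia lead: {nvidia_lead:.0f}% / AMD deficit: {amd_deficit:.0f}%")
```

    So "Nvidia 25% faster" and "AMD 20% slower" describe the exact same pair of numbers, which is worth keeping in mind when comparing review headlines.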
     
  12. AuerX

    AuerX Member Guru

    Messages:
    171
    Likes Received:
    81
    GPU:
    PNY RTX2070 OC
    Imaginary stories about unobtainable hardware are the best.
     
  13. Kosmoz

    Kosmoz Member Guru

    Messages:
    119
    Likes Received:
    72
    GPU:
    GTX 1080
    For now, yes; see you in 4-6 months. It will still be this gen even then, not a new gen...

    Some people have no foresight at all. Even when we clearly know more things are coming to improve the current situation (especially for AMD), they ignore all of that and assume NOW = FOREVER. A few months is nothing compared to years.

    Like the horse-blinders-wearing Xbox fanboys who kept going on about their most powerful console in the world, how the 12 TF XSX would demolish the "weakstation5", etc. And now they see that not only are the two on par, the PS5 can actually be better in some scenarios in 3rd-party games. Where is that +20% performance, hm? Where is the stomping all over the PS5?

    The same people said Radeon would be 2080 Ti +15%, and that was proven wrong too. Concerning AMD vs Nvidia this gen, 6000s vs 3000s, AMD still has another big gun to bring to the fight, and that is FidelityFX Super Resolution. Nvidia has nothing left except starting on the 4000 series sooner and pushing it faster. I'm not saying next gen, Hopper vs RDNA 3, will play out the same; Nvidia won't stand still and take a beating like Intel did. But this gen, the one that surprised us more is definitely AMD.
     
  14. mitzi76

    mitzi76 Ancient Guru

    Messages:
    8,722
    Likes Received:
    19
    GPU:
    MSI 970 (Gaming)
    And the price listed on OcUK is £750-ish for a 6800 XT!
     
  15. StevieSleep

    StevieSleep Member

    Messages:
    13
    Likes Received:
    2
    GPU:
    Nvidia GTX 1080
    Lovely story, and by all means, that's all it is: a story, and not even a factually correct one.
    There is no RTX 1080; I have a GTX 1080.
    The GTX 1080 never delivered 4K in any modern game (it would have been nice if you had looked that up).
    I want 4K performance especially for old games; between DLSS 2 and resolution scaling, needing to hit native 4K in modern games is pointless.
    Nice story about the fake pixels, but I really don't care, and I'll tell you a secret: NOBODY DOES. As long as it looks good I don't pixel-peep; I didn't do it when I played at 1080p, and I'm sure as hell not going to do it now that I have 4x the real estate.
    Now, unless you actually had a point beyond meandering about fake cards, unrealistic performance metrics and a short diatribe about fake pixels, I'm pretty sure we're done here.
     

  16. StevieSleep

    StevieSleep Member

    Messages:
    13
    Likes Received:
    2
    GPU:
    Nvidia GTX 1080
    Yup, my bad, I didn't take the high-FPS crowd into account.
    But honestly you can't blame me; I'm one of those slowpokes who can barely tell the difference, and as long as I don't experience any input lag it's really hard for me to care about going well over 60 fps. I am well over 30 years old at this point, so expecting my reflexes to improve is a bit of a stretch.
     
  17. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,361
    GPU:
    6900XT+AW@240Hz
    Thanks for pointing out the typo. But using it as an argument? :D
    https://www.guru3d.com/articles_pages/nvidia_geforce_gtx_1080_review,17.html
    The literal release-date benchmark set says the card delivered a 40-50 fps average at 4K in most games that were modern at the time; in the lighter ones, the average was closer to 60.
    From that test set, only Alien: Isolation did not spend a large portion of its time under 60 fps.

    I get it: you bought a GTX 1080 in 2016 to play games from 2012 at 4K, and now, in 2020, you still enjoy playing those 2012 games at 4K, plus maybe some new undemanding indie games.
    But if anyone persuaded me, by whatever magic, to buy a GTX 1080 for 4K, I would feel pretty stupid. The same way I would feel stupid buying any of the current Ampere/RDNA 2 GPUs for 4K gaming, since Turing introduced an endless black hole for performance in the form of ray tracing. And DX-R is not going away any time soon.

    4K performance has always sucked on the hardware current at the time, and within a year of each release, new AAA titles turned it into a mediocre performance level. Not much has changed since AMD/nVidia started marketing their GPUs as "True 4K GPUs".

    They are still not 4K cards, and that's why nVidia came up with fake pixels and AMD has been pushed to promise their own fake-pixel alternative.
     
  18. kapu

    kapu Ancient Guru

    Messages:
    4,717
    Likes Received:
    411
    GPU:
    Radeon 6800
    That's exactly what it is; I'm on 1080p (considering a 1440p screen now). I just got a 6800 last week, and I realized I don't push more than 120 fps at 1080p! In 2-3 years it will degrade to the 60-70 fps level, I think, but I will still be able to max games out at 1080p.
    1080p is the top resolution right now and that's not going to change anytime soon; the performance drop from 1080p to 4K is enormous... play at 4K and drop the quality settings, what's the point in that?
    I would still rather push top quality settings and sit at 1080p with nice framerates. The 6800 is a very nice 1080p GPU; it can actually outperform the 3080 in many games.
    I think two generations from now (the next consoles) RT will be the thing: most games will have it and GPUs will be able to support it at decent performance levels. Like it was with T&L some time ago :D (and tessellation, which is used in almost every game now?)
     
    Fox2232 likes this.
  19. StevieSleep

    StevieSleep Member

    Messages:
    13
    Likes Received:
    2
    GPU:
    Nvidia GTX 1080
    No, you pretty much did not get anything right. I wouldn't have bought the 1080 for 4K, because everyone knew it wasn't a good 4K card. When your best case is maybe 60 fps, you're not a 4K card. I don't know why you felt the need to reiterate that, but let's move on.
    I bought a card that was on sale at the correct price, not the price the blockchain craze pushed it to. Later on there was a Black Friday sale on 4K monitors, so I got two (I know, overkill, but gaming isn't my main profession and they're useful for work). The problem is that, because of the way Windows handles resolution changes, it now forces me to either play at 4K or rearrange everything on the screen every time I close a game.
    So to make it clear: 4K gaming was never my aim, a 4K desktop was.

    Nvidia came out with a solution that fit their architecture, instead of the resolution scaling we already had. They were in a comfortable lead, and they spent a generation working on Tensor cores and RT cores because this is the future they envisioned for the platform.
    AMD kept grasping at straws, and even now AIBs can't sell the cards for the prices AMD promised because they were already on a knife's edge on margins. The less we talk about the tech AMD implemented in the latest cards the better; it's embarrassing enough as it is, and I'm not here to bash AMD.

    I'm not particularly happy about the situation either. Nvidia was always a closed-off sort of platform, and why shouldn't they be? They were the ones spending money on creating the implementation, so obviously they wouldn't just give it away. AMD came around to it later, but at that point making it open source was the only way to get their implementation used by devs.

    For me it's a simple equation. I want something that will last me the next 5 years, since I have other plans in the meantime:
    - I don't need the latest graphics.
    - I don't care as much about the "fake pixels" you are so obsessed with.
    - I want something that has the best chance of offering me a good experience, not only now but later on.
    - I'm definitely not interested in 8K or HDR or whatever they come up with next. Not unless Windows fixes its crap, or Linux actually becomes a serious contender as a desktop environment when it comes to media.

    If I'm going to take a gamble on the long run between Nvidia and AMD, there's no question who I'd pick. Because one thing's for certain: the chance of Windows getting its poop in a group is a bit unrealistic.
     
  20. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,361
    GPU:
    6900XT+AW@240Hz
    No. You can run the desktop at 4K and game at any other resolution in windowed-fullscreen mode, which does not affect the desktop in any way. The mode that does affect the desktop is exclusive fullscreen, and that's not exactly the future of Windows 10.
    Maybe you missed it, but 4K is not an AMD vs nVidia thing; neither has delivered true 4K cards in the past. And if nVidia's drivers force a change of desktop resolution with every in-game resolution change, then they are inferior to AMD's in scaling methods, and I guess that's not the case. So maybe you should revisit the way you set up your driver scaling options and your display modes in games.
    What you call "no choice" is actually a pattern in AMD's vs nVidia's behavior: one puts out things that elevate everyone willing to be elevated, the other steps on the shoulders of anyone whose guard is down.
    You can gamble all you want; I prefer a fairly deterministic approach, which says 1440p is sustainable and 4K is not. Want to keep the same hardware for 5 years? With DX-R in games, get ready for 1080p on your 4K screen with integer scaling.

    As for the other technologies: HDR will come, and it will bring a visually interesting change too. The performance hit from it is small; the price of a good HDR screen is anything but small for now.
    And that's the entire point: is it worth the investment?
     
