Review average of 17 websites shows 6800 XT to be 7.4% slower than GeForce RTX 3080

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 20, 2020.

  1. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,205
    Likes Received:
    2,996
    GPU:
    5700XT+AW@240Hz
    Lovely bedtime story... with a GTX 1080 running 4K.
    I bet those new games really run well at 4K. Higher resolution does supplement AA, but the cost of running reduced details (missing effects) and losing actual per-pixel precision is not worth it.
    Especially since the other option is a crappy framerate.
    It is simple. You have a GTX 1080. When did you buy it? Then we can look at how long that card delivered 60fps+ at 4K.
    But today, your dreamy 4K bubble has burst into 20~40fps on average in most recent AAA games.
    The sacrifices you have to make to keep 60fps, versus the sacrifices someone makes at 1440p (with the same HW), result in much worse image quality on your side.
    You could just as well play some of those heavier games at 1080p with 2:1 pixel scaling; they would look better that way and run better too.

    And when you even remotely start thinking about the future of DX-R...
    Top cards may have trouble keeping 60fps+ at 1440p a year after they are out.

    4K people can only pray that fake pixels will come to the rescue.
     
  2. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,489
    Likes Received:
    2,029
    GPU:
    HIS R9 290
    The 6800 XT is 7.4% slower than the 3080... on Windows ;)
    I'm most likely going to get a 6700; the 6000 series is the only obvious choice for Linux users right now. The Linux drivers are much better optimized (especially for compute tasks - the 6800 can often outperform the 3080). These GPUs suck at raytracing, but I don't know if I'll be able to take advantage of that any time soon anyway, so it's not much of a loss to me. Even if I were a Windows gamer, I wouldn't be dumping any money into raytracing, since the technology is going to need another couple of years of maturity. As I've said before, it reminds me a lot of the early days of tessellation.
     
  3. kapu

    kapu Ancient Guru

    Messages:
    4,135
    Likes Received:
    173
    GPU:
    MSI Geforce 1060 6gb
    4K is for 1% of gamers. Sure, valid. For that 1%.
     
    Kosmoz likes this.
  4. AuerX

    AuerX Member Guru

    Messages:
    105
    Likes Received:
    40
    GPU:
    PNY RTX2070 OC
    There are plenty of games out there that even potatoes like a 1080/2070/5700 can play at 4K.
    Even better if they have DLSS.
     

  5. angelgraves13

    angelgraves13 Ancient Guru

    Messages:
    2,218
    Likes Received:
    656
    GPU:
    RTX 2080 Ti FE
    The 6900 XT with SAM and Rage Mode will smoke the 3090 in everything but ray tracing. If developers begin optimizing for RDNA2, Nvidia will need to release the 40 series on TSMC 7nm next year.
     
  6. Hawaii_Boy_808

    Hawaii_Boy_808 New Member

    Messages:
    5
    Likes Received:
    2
    GPU:
    Radeon 5700 XT
    Clickbait title and nothing more.
     
    Kosmoz likes this.
  7. Elder III

    Elder III Ancient Guru

    Messages:
    3,696
    Likes Received:
    296
    GPU:
    Both Red and Green
    *waves* I've been using 4K for over 5 years now. Nice to meet you! ;)

    I was fairly sure there were a number of people on this particular forum who have been using 4K for a while now too (granted, we are a small number, of course). I agree with everything else you said, though.
     
  8. kcthebrewer

    kcthebrewer Member Guru

    Messages:
    184
    Likes Received:
    2
    GPU:
    BFG Nvidia 6800GT OC
    The 6900 XT has 11% more CUs than the 6800 XT.
    Same cache, same memory capacity, same memory speed.

    The problem with the 6900 XT is that memory bandwidth will be a bottleneck at 4K.

    The 3090 will scale much better thanks to having ~20% more CUs and more memory bandwidth.

    Yes, the 3090 costs a stupid amount, but you at least get *some* value. There is pretty much zero reason to buy a 6900 XT over a 6800 XT.
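    For what it's worth, the ~11% and ~20% deltas above can be sanity-checked in a couple of lines. A minimal sketch; the unit counts and bandwidth figures are the public launch specs (6900 XT: 80 CUs, 6800 XT: 72 CUs; 3090: 82 SMs / 936 GB/s, 3080: 68 SMs / 760 GB/s), quoted here as assumptions rather than taken from the post:

```python
def pct_more(a, b):
    """How much larger a is than b, in percent."""
    return (a / b - 1) * 100

# Assumed launch specs, not figures from the post itself.
print(f"6900 XT vs 6800 XT CUs: {pct_more(80, 72):.1f}%")    # ~11.1%
print(f"3090 vs 3080 SMs:       {pct_more(82, 68):.1f}%")    # ~20.6%
print(f"3090 vs 3080 bandwidth: {pct_more(936, 760):.1f}%")  # ~23.2%
```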
     
    Maddness and JonasBeckman like this.
  9. moo100times

    moo100times Master Guru

    Messages:
    275
    Likes Received:
    121
    GPU:
    295x2 @ stock
    Maybe say that after we get 6900 XT reviews. The 6800 XT is hot on the heels of the 3090 in some benchmarks, so I wouldn't count it out just yet.
     
    Kosmoz likes this.
  10. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,190
    Likes Received:
    2,649
    GPU:
    AMD S. 5700XT Pulse
    Having just recently read about the memory situation: the Infinity Cache shortcomings at 3840x2160, let alone higher resolutions, will still be a problem. AMD would need a bigger bus width or more cache to alleviate that, and possibly faster GDDR6 memory too.
    It does manage to keep performance up even so, but compared to the drop from 1920x1080 -> 2560x1440, once you go up to 3840x2160 it just falls off entirely, and that's going to be a problem for them with this setup.

    That's the issue: even if you alleviate the bandwidth problems, there's simply not enough cache for this to be fully effective at higher resolutions. It's still doing its job, but the resulting performance hit is evident too.

    Though I suppose AMD could make a revision later on, but I wonder if RDNA3 isn't more likely for late next year, and whatever changes that will bring.
    Although it sounds like the 5000 series CPUs are getting a revised model, if that info is still accurate, so maybe something could be done for the GPUs in between RDNA2 -> RDNA3. We'll just have to see what happens.

    And if the hit rate and utilization for the Infinity Cache is ~80%, possibly in the 70s, going from 1080 to 1440, then additional whatever (memory speed, cache size, bus width. All? :D ) could give a neat little performance bump if there were something like a 6850 or 6950 model available.

    Existing owners might be less than pleased, however, if AMD comes out with a variant like that soon after launch, assuming the pricing isn't entirely [beep] because of reasons, hmm.
     
    Last edited: Nov 21, 2020

  11. Kosmoz

    Kosmoz Member

    Messages:
    39
    Likes Received:
    14
    GPU:
    GTX 1060 6GB
    If Assassin's Creed Valhalla and Dirt 5 are any indication of how games optimized for AMD will look from now on, the RX 6000 series will mop the floor with the nvidia RTX 3000 @ 1440p (and 1080p for that matter). Just look at how big a win the 6800 XT has in those 2 games in the Hardware Unboxed review and benchmarks: it surpasses the 3090, making a joke of that card and its silly price, basically shaming it while being almost 3 times cheaper... it's crazy.

    Considering a lot of games will be optimized for PS5/XSX from now on, that means on PC we will get great performance. When AMD brings their DLSS alternative, it will have even better performance, and frankly, unless nvidia throws money at every single game to close the gap or straight-up buys its performance with nvidia-sponsored titles (to the detriment of AMD), it will not win in the long run with these 3000 cards. I bet they are already accelerating their 4000 series development, since that is their only viable long-term solution.

    Not to mention the huge OC headroom the 6000 series has over the nvidia 3000. I think we will soon see some crazy AIB variants that can OC like never before for a Radeon card.

    Here is the summary: the Radeon 6000s are already winning by a mile in AMD-optimized games, they have more VRAM than equivalent-class cards from nvidia, they beat the silly 3090 on both performance and price @ 1440p & 1080p, they already have SAM (for extra performance), soon they will have a DLSS alternative (more performance, even in RT), and they have crazy OC headroom (for even more performance). I mean, how many advantages does one need to see how clearly this will go in Radeon's favor?
     
    lukas_1987_dion likes this.
  12. Sukovsky

    Sukovsky Master Guru

    Messages:
    937
    Likes Received:
    31
    GPU:
    GTX 1080
    It would be interesting to see this comparison at 1440p.
     
    lukas_1987_dion likes this.
  13. angelgraves13

    angelgraves13 Ancient Guru

    Messages:
    2,218
    Likes Received:
    656
    GPU:
    RTX 2080 Ti FE
    RDNA3 will be interesting for sure. I wonder if it will have even more cache, like 1GB.

    I feel like my 2080 Ti may get me by until RDNA3 if it comes late next year.
     
    lukas_1987_dion and JonasBeckman like this.
  14. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,190
    Likes Received:
    2,649
    GPU:
    AMD S. 5700XT Pulse
    It's fast, it has mature drivers, and as such it should hold up just fine, the way I see it.
    Memory-wise, AMD will have data on how RDNA2 performed, its strengths and shortcomings. I don't know what the design plan is for RDNA3, but if it's planned for late 2021 it'd already be underway, though changes and tweaks can happen in response to feedback.

    It's certainly improved from RDNA1, making for some really impressive performance results. If they can keep that sort of momentum going, that's going to be really impressive, although there could be issues or other unexpected developments.
    Looking forward to seeing how things go, and hopefully also to upgrading this current 5700 XT to something faster, so hopefully the availability situation will improve as well.

    November 25th, I think it was, and then we'll see how things look with the 6900 XT on December 8th, and whenever the 3080s are actually in stock on the NVIDIA side, watching how that develops too.

    The 5000 series CPUs now appear to have an estimated availability date of December 18th, but there are no estimates of how many 5800s or higher. GPU-wise, the OEM 6800s are entirely gone, and it's unknown when the 3080s will be available again in any actual quantity, not singular numbers.
     
  15. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,205
    Likes Received:
    2,996
    GPU:
    5700XT+AW@240Hz
    Sure, and plenty do not. Especially if they are new AAA titles, where a GTX 1080 does not manage 4K.
    And the cards you listed were never even 4K cards. All barely deliver 1440p at an average 60fps with minor sacrifices.

    But you can still argue that old games run at 4K. They will always run at 4K, because they will never change. Or that there is this new game called Pixel Dungeon which runs at 4K just fine.
    (Arguments absent of logic hardly persuade anyone.)

    I can say that people can play AC: Valhalla on an RX 5700 (XT) with reasonable details. But at 4K, those cards deliver 30-33fps on average.
    In tests, you have to go down to Medium details to even gain a 35% fps boost. That's still only about 44.5fps on average at 4K. So you are talking about new games played at Low.
    And guess what? The 1080 does even worse there.

    So the choice is 1440p Ultra with an average of 57~59fps, or 4K at 55fps on Low. GG.
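    The fps arithmetic in that comparison can be sketched directly (the numbers are the post's own estimates for AC: Valhalla on an RX 5700 XT, not new measurements):

```python
def fps_after_boost(base_fps, boost_pct):
    """Average fps after a percentage gain, e.g. from lowering detail settings."""
    return base_fps * (1 + boost_pct / 100)

ultra_4k = 33.0                            # ~30-33 fps at 4K, reasonable details
medium_4k = fps_after_boost(ultra_4k, 35)  # Medium gives roughly a 35% boost
print(f"4K Medium: {medium_4k:.1f} fps")   # still well short of a 60 fps target
```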
     

  16. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,025
    Likes Received:
    1,037
    GPU:
    Rtx 3090 Strix OC
    @Fox2232 I will give you this - 1440p makes much more sense from a cost-vs-performance perspective, as it is much cheaper to reach 1440p60 than 4K60. Something like 3 times cheaper, and 4K doesn't look 3 times better.

    However, 4K does look better, and some of us don't care as much about the cost as we do about the visual fidelity, thus 4K is our choice :)
     
  17. heffeque

    heffeque Ancient Guru

    Messages:
    4,062
    Likes Received:
    73
    GPU:
    nVidia MX150
    The power-performance ratio increased 50% going from RDNA 1 to RDNA 2, and it has gotten AMD a long way, to being basically on par with nVidia.
    Another 50% power-performance gain is expected going from RDNA 2 to RDNA 3, so... either nVidia also makes a big leap with its next gen, or AMD will finally surpass nVidia on basically all metrics.
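    The compounding behind that claim is simple to check (a sketch; the 50% per-generation figures are AMD's marketing claims as quoted in the post, not benchmark results):

```python
def compound_perf_per_watt(per_gen_gain, generations):
    """Cumulative perf-per-watt multiplier after n generations of equal gains."""
    return (1 + per_gen_gain) ** generations

# Two consecutive +50% generations: RDNA 1 -> RDNA 2 -> RDNA 3.
print(compound_perf_per_watt(0.50, 2))  # 2.25, i.e. 2.25x RDNA 1's perf per watt
```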
     
  18. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,205
    Likes Received:
    2,996
    GPU:
    5700XT+AW@240Hz
    Yet there is temporal information too. Ignoring that is not exactly the way to go.
    What I said stands the trial of time. You buy a so-called 4K GPU, and pretty damn soon it can't do 4K at reasonable fps.
    Some of those cards could not do it the day they came out.

    I am on 1080p for 240Hz. Very few users have more than 60Hz at 4K. 1440p screens can be had with 144/165Hz.
    The best balance between per-frame information and temporal information comes at 1440p. The same applies to reasonable GPU longevity.

    Your card does 60fps on average in 4K AC: Valhalla. Where are the minimums? 45~50? When there is a lighter scene, how many 4K screens can balance that by displaying an actual 75fps?
    With a 60Hz screen, 60fps on average generated by the GPU actually means something like 53~55 frames shown on screen on average, because anything above 60fps gets temporally cropped, G-Sync/FreeSync or not.
    Then you have DX-R games. You're not playing WD: Legion at 4K with DX-R, as that would be ~36fps. You run upscaling from 1080/1440p, depending on how stable a framerate you want.

    The 3090 is the strongest GPU we have had to date. And the data says it is not a 4K GPU. It needs upscaling in already existing games just to reach a mere 60fps.
     
  19. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,025
    Likes Received:
    1,037
    GPU:
    Rtx 3090 Strix OC
    Temporal info is only really relevant for fast-paced games. For slow-paced games, the image clarity from resolution makes a far bigger difference.

    But your argument about GPU longevity is fairly moot, as 1440p at 144fps requires roughly the same amount of GPU power as 4K60 does, so to maintain 1440p144 you are gonna need to swap your GPU just as often as you would with 4K60. The same goes if you want to actually make use of that 240Hz on your 1080p display.

    You're right, I'm not playing Legion. And if I did, I would not be using raytracing, as you'd know if you had read just about any of my posts... ever. The same goes for DLSS.

    How is the 3090 not a 4K GPU? Show me a game where it doesn't get 60fps at 4K native.
     
    AuerX likes this.
  20. lukas_1987_dion

    lukas_1987_dion Master Guru

    Messages:
    484
    Likes Received:
    51
    GPU:
    RTX 2080 Ti AMP! OC
    The 6800 XT is a much better GPU for my resolution (2560x1440 @ 75Hz) and draws 50W less power than the 3080; the problem is I can't buy it anywhere :(
     
    Last edited: Nov 21, 2020
