Review average of 17 websites shows 6800 XT to be 7.4% slower than GeForce RTX 3080

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 20, 2020.

  1. AuerX

    AuerX Ancient Guru

    Messages:
    2,537
    Likes Received:
    2,332
    GPU:
    Militech Apogee
Plenty of games out there that even potatoes like a 1080/2070/5700 can play at 4K.
Even better if they have DLSS.
     
  2. Hawaii_Boy_808

    Hawaii_Boy_808 Guest

    Messages:
    5
    Likes Received:
    2
    GPU:
Radeon 5700 XT
Clickbait title and nothing more.
     
    Kosmoz likes this.
  3. Elder III

    Elder III Guest

    Messages:
    3,737
    Likes Received:
    335
    GPU:
    6900 XT Nitro+ 16GB
    *waves* I've been using 4K for over 5 years now. Nice to meet you! ;)

I was fairly sure there were a number of people on this particular forum who have been using 4K for a while now too (granted, we are a small number of course). I agree with everything else you said though.
     
  4. kcthebrewer

    kcthebrewer Guest

    Messages:
    190
    Likes Received:
    3
    GPU:
    BFG Nvidia 6800GT OC
The 6900XT has 11% more CUs than the 6800XT.
Same cache, same memory, same RAM speed.

The problem with the 6900XT is that memory bandwidth will become a bottleneck at 4K.

The 3090 will scale much better due to having roughly 20% more SMs and more memory bandwidth than the 3080.

Yes, the 3090 costs a stupid amount, but you at least get *some* value. There is pretty much zero reason to buy a 6900XT over a 6800XT.
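A quick sanity check of those ratios, for anyone who wants the arithmetic; the specs are the public launch figures, and this is just a sketch (unit counts are only a rough proxy for real performance):

```python
# Back-of-envelope check of the scaling claims above, using public
# launch specs (approximate).
cards = {
    # name: (CUs/SMs, memory bandwidth in GB/s)
    "6800 XT":  (72, 512),
    "6900 XT":  (80, 512),   # more CUs, identical bandwidth
    "RTX 3080": (68, 760),
    "RTX 3090": (82, 936),   # more SMs *and* more bandwidth
}

def gain(base, step, idx):
    """Percentage increase of cards[step] over cards[base] in column idx."""
    return 100 * (cards[step][idx] / cards[base][idx] - 1)

print(f"6900 XT vs 6800 XT: +{gain('6800 XT', '6900 XT', 0):.0f}% CUs, "
      f"+{gain('6800 XT', '6900 XT', 1):.0f}% bandwidth")   # +11%, +0%
print(f"3090 vs 3080:       +{gain('RTX 3080', 'RTX 3090', 0):.0f}% SMs, "
      f"+{gain('RTX 3080', 'RTX 3090', 1):.0f}% bandwidth") # +21%, +23%
```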
     
    Maddness and JonasBeckman like this.

  5. moo100times

    moo100times Master Guru

    Messages:
    566
    Likes Received:
    323
    GPU:
    295x2 @ stock
Maybe say that after we get 6900XT reviews. The 6800XT is hot on the heels of the 3090 in some benchmarks, so I wouldn't count it out just yet.
     
    Kosmoz likes this.
  6. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
Having just recently read about the memory and Infinity Cache shortcomings at 3840x2160, let alone higher, that will still be a problem. AMD would need a bigger bus width or more cache memory to alleviate it, and possibly faster GDDR6 memory too.
It does manage to keep performance up even so, but compared to the drop from 1920x1080 -> 2560x1440, once you go up to 3840x2160 it just falls off entirely, and that's going to be a problem for them with this setup.

That's the issue: even if you alleviate the memory bandwidth problems, there's simply not enough cache memory at higher resolutions for it to be fully effective. It's still doing its job, but the resulting performance hit is evident too.

I suppose AMD could make a revision later on, but I wonder if RDNA3 isn't more likely for later next year, and whatever changes that will bring.
It sounds like the 5000 series CPUs are getting a revised model, if that info is still accurate, so maybe something could be done for the GPUs in between RDNA2 -> RDNA3, but we'll just have to see what happens.

And if the hit rate and utilization for the Infinity Cache is ~80%-ish at 1080p, possibly dropping into the 70s going to 1440, then maybe additional whatever (memory speed, Infinity Cache size, bus width. All? :D ) could see a neat little performance bump if there was something like a 6850 or 6950 model available.
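To illustrate why that hit rate matters, here's a toy model of my own (it assumes cache hits are effectively free, and uses the rough hit-rate figures AMD showed at launch; treat all numbers as illustrative):

```python
# Toy model: every Infinity Cache hit avoids a GDDR6 access, so
# effective bandwidth is roughly dram_bw / (1 - hit_rate) when hits
# are treated as free. Hit rates are approximate launch-slide figures.
DRAM_BW = 512  # GB/s, 256-bit GDDR6 on Navi 21

for res, hit_rate in [("1080p", 0.80), ("1440p", 0.70), ("4K", 0.58)]:
    effective = DRAM_BW / (1 - hit_rate)
    print(f"{res}: ~{effective:.0f} GB/s effective")
# Drops from ~2560 GB/s at 1080p to ~1219 GB/s at 4K, matching the
# "falls off at 4K" behavior described above.
```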

Existing owners might be less than pleased, however, if AMD comes out with a variant like that soon after launch, assuming the pricing isn't entirely [beep] because of reasons, hmm.
     
    Last edited: Nov 21, 2020
  7. Sukovsky

    Sukovsky Guest

    Messages:
    967
    Likes Received:
    76
    GPU:
    GTX 1080
    Would be interesting to see this comparison with 1440p.
     
    lukas_1987_dion likes this.
  8. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
It's fast, it has mature drivers, and as such it should hold up just fine, the way I see it.
Memory-wise, AMD would have data on how RDNA2 performed, its strengths and shortcomings. I don't know what the design plan is for RDNA3, but if it's planned for late 2021 it'd already be underway, though changes and tweaks can happen in response to feedback.

It's certainly improved from RDNA1, making for some really impressive performance results. If they can keep that sort of momentum going, that's going to be really impressive, although there could be issues or other unexpected developments.
Looking forward to seeing how things go, and hopefully to upgrading this current 5700XT to something faster, so hopefully the availability situation will also improve.

November 25th I think it was, and then we'll see how the situation looks with the 6900XT on December 8th, plus whenever the 3080s are actually in stock on the NVIDIA side; watching how that develops too.

The 5000 series CPUs appear to have an estimated availability date of December 18th now, but with no estimates as to how many 5800s or higher. GPU-wise, the OEM 6800s are entirely gone, and it's unknown when the 3080s will be available again in any actual quantities, not singular numbers.
     
  9. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
Sure, and plenty do not. Especially if they are new AAA titles; a GTX 1080 does not handle 4K in those.
And the cards you listed were never even 4K cards. All barely deliver 1440p at an average of 60fps with minor sacrifices.

But you can still argue that old games run at 4K. They will always run at 4K, because they will never change. Or that there is this new game called Pixel Dungeon which runs at 4K just fine.
(Arguments absent of logic hardly persuade anyone.)

I can say that people can play AC: Valhalla on an RX 5700 (XT) while using reasonable details. But at 4K, those cards deliver 30-33fps on average.
In tests, you have to go down to Medium details to even gain a 35% fps boost. That's still only 44.5fps on average at 4K. So you are talking about new games played at low settings.
And guess what? The 1080 does even worse there.

So the choice is 1440p Ultra with an average of 57~59fps, or 4K at 55fps on Low. GG.
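For the record, the arithmetic behind those numbers (inputs are the estimates from this post, not new benchmark data):

```python
# RX 5700 XT, AC: Valhalla at 4K - the post's own estimates.
fps_high = 33.0        # 4K average at reasonable details (top of the 30-33 range)
medium_boost = 1.35    # ~35% gain from dropping to Medium

fps_medium = fps_high * medium_boost
print(f"4K Medium estimate: {fps_medium:.2f} fps")  # ~44.5 - still well under 60
```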
     
  10. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
@Fox2232 I will give you this - 1440p makes much more sense from a cost-vs-performance perspective, as it is much cheaper to get 1440p60 than 4K60. Something like 3 times cheaper, and 4K doesn't look 3 times better.

However, 4K does look better, and some of us don't care as much about the cost as we do about the visual fidelity, thus 4K is our choice :)
     

  11. heffeque

    heffeque Ancient Guru

    Messages:
    4,422
    Likes Received:
    205
    GPU:
    nVidia MX150
The power-performance ratio increased 50% going from RDNA 1 to RDNA 2, and that's gotten AMD a long way, to basically being on par with nVidia.
The power-performance ratio is expected to increase another 50% going from RDNA 2 to RDNA 3, so... either nVidia also makes a big leap with their next gen, or AMD will finally be able to surpass nVidia on basically all metrics.
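Compounded, those two figures look like this (a trivial calc on the post's own numbers; the RDNA 3 step is a projection, not measured data):

```python
# Two successive +50% perf-per-watt jumps compound multiplicatively.
rdna1 = 1.00
rdna2 = rdna1 * 1.50   # RDNA 1 -> RDNA 2: +50% (AMD's stated figure)
rdna3 = rdna2 * 1.50   # RDNA 2 -> RDNA 3: +50% (projection only)
print(f"RDNA 3 vs RDNA 1 perf/watt: {rdna3:.2f}x")  # 2.25x over two generations
```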
     
  12. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
Yet there is temporal information too. Ignoring that is not exactly the way to go.
What I said stands the trial of time. You buy a so-called 4K GPU, and pretty damn soon it can't do 4K at reasonable fps.
Some of those cards could not do it on the day they came out.

I am on 1080p for 240Hz. Very few users have more than 60Hz at 4K. 1440p screens can be had with 144/165Hz.
The best balance between per-frame information and temporal information comes at 1440p. The same applies to reasonable GPU longevity.

Your card does 60fps on average in AC:Valhalla at 4K. Where are the minimums? 45~50? When there is a lighter scene, how many 4K screens can balance that by displaying an actual 75fps?
With a 60Hz screen, 60fps on average generated by the GPU actually means something like 53~55 frames shown on screen on average, because anything above 60fps gets temporally cropped, G-Sync/FreeSync or not.
Then you have DXR games. You're not playing WD:Legion at 4K with DXR, as that would be ~36fps. You run upscaling from 1080p/1440p depending on how stable a framerate you want.

The 3090 is the strongest GPU we have had to date. And the data says it is not a 4K GPU. It needs upscaling in already released games just to reach 60fps.
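A quick toy simulation of that "temporal cropping" point (my own illustration with a made-up fps distribution, not measured frame times):

```python
import random

# A GPU that *averages* 60fps actually fluctuates around it. On a
# 60Hz panel, every frame delivered above 60fps is cropped, so the
# displayed average lands below 60 even though the rendered average
# hits it. The uniform 45-75 fps spread here is an invented example.
random.seed(1)
rendered = [random.uniform(45.0, 75.0) for _ in range(100_000)]
displayed = [min(f, 60.0) for f in rendered]  # the 60Hz cap crops the top

print(f"rendered avg:  {sum(rendered) / len(rendered):.1f} fps")   # ~60
print(f"displayed avg: {sum(displayed) / len(displayed):.1f} fps") # ~56, below 60
```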
     
  13. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
Temporal info is only really relevant for fast-paced games. For slow-paced games, image clarity from resolution makes a far bigger difference.

But your argument about GPU longevity is fairly moot, as 1440p144 requires roughly the same amount of GPU power as 4K60 does (see the quick pixel-rate check below), so to maintain 1440p144 you are gonna need to swap your GPU just as often as you would with 4K60. The same goes if you want to actually make use of that 240Hz on your 1080p display.
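The pixel-rate arithmetic behind that claim, for reference (pixels per second is a crude proxy for GPU load; it ignores CPU limits, settings, and per-frame fixed costs):

```python
# Pixels pushed per second at each target, as a rough load proxy.
px_1440p144 = 2560 * 1440 * 144   # ~531 Mpx/s
px_4k60     = 3840 * 2160 * 60    # ~498 Mpx/s
px_1080p240 = 1920 * 1080 * 240   # ~498 Mpx/s - same ballpark again

print(f"1440p144 / 4K60: {px_1440p144 / px_4k60:.2f}x")   # ~1.07, roughly equal
print(f"1080p240 / 4K60: {px_1080p240 / px_4k60:.2f}x")   # 1.00, identical
```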

You're right, I'm not playing Legion. And if I did, I would not be using ray tracing, as you'd know if you had read just about any of my posts... ever. The same goes for DLSS.

How is the 3090 not a 4K GPU? Show me a game where it doesn't get 60fps at 4K native.
     
    AuerX likes this.
  14. lukas_1987_dion

    lukas_1987_dion Master Guru

    Messages:
    701
    Likes Received:
    167
    GPU:
    RTX 4090 Phantom GS
The 6800 XT is a much better GPU for my resolution (2560x1440 @ 75Hz) and draws 50W less power than the 3080; the problem is I can't buy it anywhere :(
     
    Last edited: Nov 21, 2020
  15. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
Still waiting for 3DCenter's proper full launch analysis. Mostly curious about the power numbers.

Probably Monday.
     

  16. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,547
    Likes Received:
    608
    GPU:
    6800 XT
Tbh AMD doesn't fall off at 4K, just the normal amount vs 1440p. The RTX 3000 cards are just abysmal at lower resolutions, akin to the Fury X vs the 980 Ti, where AMD's more complex architecture caught up at 4K.

There was a good comparison of the percentage performance drop from 1440p to 4K for all cards; the 3080, 2080 Ti and 6800 XT drop roughly the same percentage.
     
    Dragam1337 likes this.
  17. Calmmo

    Calmmo Guest

    Messages:
    2,424
    Likes Received:
    225
    GPU:
    RTX 4090 Zotac AMP
    The AMD card is going to be faster within a year.
    In raster only.
     
  18. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,547
    Likes Received:
    608
    GPU:
    6800 XT
And in RT it will be faster than it is now, but it won't beat Nvidia. Still a relatively good first try.
     
  19. Mannerheim

    Mannerheim Ancient Guru

    Messages:
    4,915
    Likes Received:
    95
    GPU:
    MSI 6800XT
    My RX 580 is like ~10% average MAX :D
     
  20. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
Sounds about right.
Some show 10%, some 5%; I'd say 7.5% is pretty well normalized.
It's good enough to challenge the 3080, but RT performance and DLSS will keep Nvidia on top this time.

For a card that's over 30% faster than a 2080 Ti, right......... let's buy one for 1080p.
Some people will always pick results depending on which brand they cheer for.
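For what it's worth, that "normalizing" is just averaging the per-site performance ratios; here's a sketch with made-up deltas in the 5-10% range the post mentions (not the actual 17 review results):

```python
# Hypothetical per-site results: 6800 XT performance relative to the
# RTX 3080 (1.00 = equal). These values are invented for illustration.
ratios = [0.90, 0.95, 0.93, 0.92, 0.94]

arith = sum(ratios) / len(ratios)
geo = 1.0
for r in ratios:
    geo *= r
geo **= 1.0 / len(ratios)

print(f"arithmetic mean: {arith:.3f} -> {100 * (1 - arith):.1f}% slower")
print(f"geometric mean:  {geo:.3f} -> {100 * (1 - geo):.1f}% slower")
# Both land around 7%, which is how scattered 5-10% results
# normalize into a single figure like 7.4%.
```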
     
    Last edited: Nov 21, 2020
    Stormyandcold and AuerX like this.
