Review average of 17 websites shows 6800 XT to be 7.4% slower than GeForce RTX 3080

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 20, 2020.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,317
    Likes Received:
    18,405
    GPU:
    AMD | NVIDIA
  2. DannyD

    DannyD Ancient Guru

    Messages:
    6,706
    Likes Received:
    3,730
    GPU:
    evga 2060
    These tests also show 2080 Ti > 3070
     
    Dragam1337 and rl66 like this.
  3. SpajdrEX

    SpajdrEX Ancient Guru

    Messages:
    3,395
    Likes Received:
    1,651
    GPU:
    Gainward RTX 4070
    I'm missing information on which CPU they used for testing. Is it Intel or AMD (with SAM enabled)?
     
  4. tuco ramirez

    tuco ramirez Guest

    Messages:
    2
    Likes Received:
    3
    GPU:
    Radeon 480
    Seems pretty accurate since it's for 4K; at 1440p the 6800 XT is clearly the stronger card, though.
     

  5. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    You're missing the point. Its purpose is to give the average difference between the GPUs using results from various configurations. However, I'd hazard a guess that the majority of sites would've used Intel in their test setups.

    IMHO, SAM is only relevant to new system builders/buyers. For those of us without SAM capability, the article is much more representative of the performance we can expect. Make no mistake though, SAM makes a difference, and 2% is 2% in my book.
     
    IceVip and JonasBeckman like this.
  6. BLEH!

    BLEH! Ancient Guru

    Messages:
    6,401
    Likes Received:
    418
    GPU:
    Sapphire Fury
    How do the two compare in price, power consumption, and availability, though?
     
  7. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    For now, though, if NVIDIA does add this support as well and it works the same (which it should), it's going to be interesting to see how AMD responds. :)
    ~7% now, then possibly 4-5% once SAM edges things out a little bit more.

    Then back to ~7% once NVIDIA has it too, and since they'd also support Intel and PCI-E 3.0, that would become a 9-10% lead for the 3080 on those systems instead.

    Might give AMD a reason to unlock this fully, if it's really just a standard feature they're dressing up as something special to help sell hardware. At least that's how I understood it works.
     
    Last edited: Nov 20, 2020
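For anyone who wants to check how the percentages discussed above stack up, here is a minimal sketch of the arithmetic. The 7.4% figure is the review average from the article; the ~3% SAM uplift is only an assumed value for illustration, not a measured one.

```python
# Back-of-envelope sketch of how the percentage gaps above combine.
# The 7.4% average gap is from the article; the ~3% SAM uplift is an assumption.

avg_gap = 0.074      # 6800 XT measured ~7.4% slower than the RTX 3080 on average
sam_uplift = 0.03    # assumed average gain for the 6800 XT with SAM enabled

rel_perf = 1.0 - avg_gap                      # 6800 XT relative to a 3080 (= 0.926)
rel_perf_sam = rel_perf * (1 + sam_uplift)    # ~0.954 with SAM enabled

print(f"Gap without SAM: {(1 - rel_perf) * 100:.1f}%")      # 7.4%
print(f"Gap with SAM:    {(1 - rel_perf_sam) * 100:.1f}%")  # ~4.6%
```

With that assumed uplift the remaining gap lands in the 4-5% range the post mentions; a larger uplift would shrink it further, which is why the NVIDIA counter-move is the interesting part.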
  8. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,390
    Likes Received:
    3,064
    GPU:
    PNY RTX4090
    Let's just hope that we get some "fine wine" treatment with the 6000 series GPUs, like we did with the HD 7000 series of cards, which aged so damn well.

    AMD do seem a lot more focused these days, and with my 3900X CPU I have already seen this "fine wine" improvement over time with new BIOS/AGESA updates. The latest AGESA V2 PI 1.1.0.0 Patch C has brought some really good improvements to the way the chip boosts. Before, I would see 4.65GHz for a nanosecond and the chip would stay around 4.2GHz at stock with PBO enabled. Now, with the same settings, I am seeing 4.35GHz across all cores, and it boosts to 4.65GHz a lot more often during bursty loads.

    We need to see the same kind of improvements on the GPU side as well.

    I have my links bookmarked and shall be checking them every day to see if I can snag a 6800 XT. I would love an Asus TUF 6800 XT; the 3080 TUF looks so good and I love the blacked-out theme on it. It would suit my build very well.

    If anyone is interested, OCUK has 3090s in stock (Zotac and another brand), over 20 of them, if that's the card for you.
     
  9. ETAxDOA

    ETAxDOA Member

    Messages:
    23
    Likes Received:
    6
    GPU:
    GTX 1080 SLI
    The title of the article should have "at 4K" added to the end of it to reflect the content of the story. For people only ever intending to run 1080p/1440p, the statistics are different again.

    Otherwise the title could just as easily have read "RTX 3080 and RX 6800 XT both have abysmal performance"... at 8K.
     
  10. Undying

    Undying Ancient Guru

    Messages:
    25,206
    Likes Received:
    12,611
    GPU:
    XFX RX6800XT 16GB
    The 5700 XT was slower than the 2070 at first; a few months later it was competing with the 2070 Super, and it will be the same this time. AMD needs some nice driver improvements and optimizations, and it will come out on top. 16GB of VRAM also comes in handy in the long run. The most interesting of all will be the Super Resolution feature, so we can compare its quality and performance vs Nvidia's DLSS.
     

  11. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,445
    Likes Received:
    2,538
    GPU:
    TUF 6800XT OC
    So it's a tiny bit slower for a theoretically 8% lower price, with 60% more memory and significantly lower power consumption.
    FineWine will most likely put it ahead a year into the future.

    This sounds like a win to me...
    ... assuming there will be any stock.

    But I think the real winners will be the partner cards. I'm already noticing ridiculous clocks out there, 2500MHz+.
     
    tunejunky likes this.
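As a quick sanity check on the price/memory/power deltas in the post above, here is a small sketch that uses the reference launch MSRPs and rated board power as inputs. These are list specs plugged in for illustration, not the measured figures from the reviews.

```python
# Relative deltas between the reference cards, using launch list specs
# (assumed inputs for illustration, not review measurements).

cards = {
    "RX 6800 XT": {"price_usd": 649, "vram_gb": 16, "board_power_w": 300},
    "RTX 3080":   {"price_usd": 699, "vram_gb": 10, "board_power_w": 320},
}

amd, nv = cards["RX 6800 XT"], cards["RTX 3080"]

print(f"Price:  {(nv['price_usd'] - amd['price_usd']) / nv['price_usd']:.1%} cheaper")       # ~7.2%
print(f"Memory: {amd['vram_gb'] / nv['vram_gb'] - 1:.1%} more VRAM")                          # 60.0%
print(f"Power:  {(nv['board_power_w'] - amd['board_power_w']) / nv['board_power_w']:.1%} lower rated board power")  # ~6.2%
```

On list specs the price gap works out to roughly 7%, which is where the "theoretically 8% lower price" and the 7.2% figure quoted later in the thread both come from.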
  12. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    Nvidia needs some nice driver improvements and optimizations, and it will stay on top. SAM support also comes in handy in the long run. The most interesting of all will be the DLSS 3.0 feature, so we can compare its quality and performance vs the older Nvidia DLSS.
     
  13. geogan

    geogan Maha Guru

    Messages:
    1,266
    Likes Received:
    468
    GPU:
    4080 Gaming OC
    Funny that, I noticed it myself on some sites. Maybe people are finally realizing what a rip-off it is and not bothering with it. Or waiting to see the price of the 3080 Ti (which will miss Christmas though :() as I am...
     
    CPC_RedDawn and Stormyandcold like this.
  14. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,511
    Likes Received:
    2,353
    GPU:
    Nvidia 4070 FE
    From what I've seen, Nvidia usually has pretty good, optimised drivers right from the beginning, apart from the occasional initial bug-like problem, such as the one that caused the whole capacitor spectacle with the 3000 series. AMD, however, seems to need months to figure out how its own hardware works and to optimise the drivers.

    I wonder what would have happened if AMD had invested in a wider memory bus. The 128MB miracle cache was supposed to help, but it's precisely at 4K where the 6800 (XT) seems to be lagging. Does the cache fail to deliver at higher resolutions? I'd like to imagine AMD tested the whole thing somehow during development. Or maybe the reason lies elsewhere.
     
    Maddness likes this.
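Kaarme's question about the cache at 4K can be illustrated with a toy effective-bandwidth model. Everything in it apart from the 512 GB/s raw GDDR6 figure is an assumption (the hit rates and the cache bandwidth are plugged-in guesses), so it only shows the shape of the effect, not real numbers: as the frame's working set grows with resolution, the hit rate of a fixed 128MB cache falls and effective bandwidth trends back toward raw VRAM bandwidth.

```python
# Toy model: does a fixed 128MB last-level cache deliver less at 4K?
# Hit rates and cache bandwidth are illustrative assumptions, not AMD figures;
# 512 GB/s is the 6800 XT's raw 256-bit GDDR6 bandwidth from the spec sheet.

VRAM_BW = 512        # GB/s, 256-bit GDDR6 @ 16 Gbps
CACHE_BW = 1660      # GB/s, assumed peak Infinity Cache bandwidth

# Assumed hit rates: the working set grows with resolution, so a smaller
# share of accesses lands in the 128MB cache at 4K than at 1080p.
hit_rates = {"1080p": 0.80, "1440p": 0.70, "2160p (4K)": 0.58}

baseline = None
for res, hit in hit_rates.items():
    effective = hit * CACHE_BW + (1 - hit) * VRAM_BW   # simple bandwidth mix
    baseline = baseline or effective                   # 1080p sets the reference
    print(f"{res}: ~{effective:.0f} GB/s effective ({effective / baseline:.0%} of the 1080p figure)")
```

Under these assumptions the effective bandwidth at 4K is roughly 80% of the 1080p figure, which is one plausible reading of why the card's relative standing slips most at that resolution.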
  15. Freitlein

    Freitlein Member

    Messages:
    28
    Likes Received:
    5
    GPU:
    VII
    So AMD even optimizes after "the beginning"? In other words: you are a hardware and driver developer for both Nvidia and AMD, so you can surely back up your statements in more detail. Thanks in advance.
     

  16. ACEB

    ACEB Member Guru

    Messages:
    129
    Likes Received:
    69
    GPU:
    2
    I would say wait for the Nitro and Devil before making your mind up; apparently they can run a stable 2500MHz on stock cooling, lol. Hopefully the premium isn't too high. In any case, at least AMD has proven it can compete with Nvidia and beat Intel into the ground after being near bankruptcy a few years ago. Crazy stuff, and Lisa Su is going to be headhunted by everyone.
     


    Kosmoz likes this.
  17. Revenge81

    Revenge81 Guest

    Messages:
    2
    Likes Received:
    1
    GPU:
    5700XT
    I think those percentages are also a consequence of ray tracing performance (due to the fact that, at the moment, RT use is not optimized for DX12U...). Without RT, I think we would be talking about a totally different story.

    At the moment, I think the 6800 XT is the card to pick... It has performance on par with the 3090 in some cases, especially at 2K (with 180W less), it's silent, it's cheaper and, in the right rig (5900X + SAM), it will definitely kick @sses at 4K as well...

    Time will tell who wins this battle, but I think next-gen consoles will push development toward DX12U, and this will surely help AMD...
     
    Kosmoz likes this.
  18. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,776
    Likes Received:
    1,388
    GPU:
    黃仁勳 stole my 4090
    Yeah, but it's at 4K. Everyone I know, literally all of them, falls into one of three categories (in order of how common they are):
    1 - The Plebeian: 1080p/60
    2 - The Ascended: 1440p/144 or 165
    3 - The Madman: 3440x1440/144

    I don't know a single person who uses 4K, and if the Steam charts are any indicator, that's how it is worldwide. It seems to me the audience that's likely to buy a 6800 XT or 3080 is going to be using 1440p or ultrawide 1440p, and at those resolutions the 6800 XT is close to par with, or slightly faster than, a 3080... in pure rasterization performance. Too bad its lack of DLSS and abysmal RT performance make that moot.
     
    rl66 and Lebon30 like this.
  19. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Then someone can come along and say:
    The 6800 XT is 7.2% cheaper and eats 12% less energy on the reference design.

    Better not to even look at AIB cards. Other differences don't even need to be mentioned, like the performance balance at 1080p vs 1440p vs 4K.

    The 1440p results are the most relevant. 4K is almost irrelevant, and 1080p too, as very few people will pair a 6800 (XT) with a 1080p screen.
     
    tunejunky likes this.
  20. EspHack

    EspHack Ancient Guru

    Messages:
    2,794
    Likes Received:
    188
    GPU:
    ATI/HD5770/1GB
    Yeah, RTX cards are faster at 4K, but how is having roughly half the VRAM going to work out in a year or two? I'm almost certain even the 6800 will outpace them when newer games show up asking for more VRAM; Doom Eternal is already doing it.

    Were it not for the huge VRAM asterisk, the 3070/3080 would be a simple buy. Nvidia loves doing this.
     
    Kosmoz and Dragam1337 like this.
