Review average of 17 websites shows 6800 XT to be 7.4% slower than GeForce RTX 3080

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 20, 2020.

  1. Noisiv

    Noisiv Ancient Guru

    Messages:
    7,355
    Likes Received:
    877
    GPU:
    2070 Super
    Still waiting for 3dcenter's proper full launch analysis. Mostly curious about the power.

    Probably Monday.
     
  2. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    6,979
    Likes Received:
    209
    GPU:
    980
    Tbh AMD doesn't fall off at 4K, just the normal amount vs 1440p. The RTX 3000 cards are just abysmal at lower resolutions, akin to the Fury X vs 980 Ti, where AMD's more complex architecture caught up at 4K.

    There was a good comparison of the performance percentage drop for all cards from 1440p to 4K; the 3080, 2080 Ti and 6800 XT all drop by roughly the same percentage.
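
    (As a rough sketch of that math, with made-up fps numbers rather than actual review data:)

        # Percentage drop going from 1440p to 4K. All fps values below are
        # invented purely to illustrate the calculation, not real benchmarks.
        fps = {
            "RTX 3080":    {"1440p": 150, "4K": 95},
            "RTX 2080 Ti": {"1440p": 120, "4K": 76},
            "RX 6800 XT":  {"1440p": 155, "4K": 97},
        }

        for card, res in fps.items():
            drop = (1 - res["4K"] / res["1440p"]) * 100  # % of fps lost at 4K
            print(f"{card}: {drop:.1f}% drop from 1440p to 4K")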
     
    Dragam1337 likes this.
  3. Alex13

    Alex13 Ancient Guru

    Messages:
    2,388
    Likes Received:
    171
    GPU:
    GT710
    The AMD card is going to be faster within a year.
    In raster only.
     
  4. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    6,979
    Likes Received:
    209
    GPU:
    980
    And in RT it will be faster than it is now, but it won't beat Nvidia. Still a relatively good first try.
     

  5. Mannerheim

    Mannerheim Ancient Guru

    Messages:
    4,801
    Likes Received:
    19
    GPU:
    Gigabyte RX580 8GB
    My RX 580 is like ~10% average MAX :D
     
  6. cucaulay malkin

    cucaulay malkin Master Guru

    Messages:
    215
    Likes Received:
    103
    GPU:
    1070 Strix
    Sounds about right.
    Some show 10%, some 5%; I'd say 7.5% is pretty well normalized.
    It's good enough to challenge the 3080, but RT performance and DLSS will keep Nvidia on top this time.

    For a card that's over 30% faster than the 2080 Ti, right... let's buy one for 1080p.
    Some people will always pick results depending on what brand they cheer for.
     
    Last edited: Nov 21, 2020
    Stormyandcold and AuerX like this.
  7. Kosmoz

    Kosmoz Member

    Messages:
    39
    Likes Received:
    14
    GPU:
    GTX 1060 6GB
    Fortunately for us consumers (because competition is good), Nvidia is not Intel; they don't slack off or fall behind as much, even when they get caught up to. So after seeing how good RDNA 2 is, and after literally being punched by the Radeon 6000 series, it's almost a given that Nvidia's RTX 4000 series will use a much better node (TSMC 7nm/7nm+ at least, if not 5nm) and that they will make the best of it. So I expect fierce competition, with even better performance jumps than this gen, from the next gen coming in late 2021 or 2022.
    You can buy a high-end GPU even for 1080p if you want to play on Ultra at a high refresh rate and not have to care for 4-5 years. This is what a lot of people can't seem to comprehend: you can buy an overkill GPU for a lower resolution so it lasts you more years, instead of buying a new one every 1-2 years because it's only just good enough right now...
     
  8. cucaulay malkin

    cucaulay malkin Master Guru

    Messages:
    215
    Likes Received:
    103
    GPU:
    1070 Strix
    Yes, but how does that make 4K irrelevant for the 3080?
     
    Dragam1337 likes this.
  9. asturur

    asturur Master Guru

    Messages:
    938
    Likes Received:
    273
    GPU:
    Geforce Gtx 1080TI
    I'm not sure how a 3090 can last much longer than a 3080. They perform so similarly that when the 3080 can't hit 60 fps anymore, the 3090 will be barely above it.
     
    cucaulay malkin likes this.
  10. cucaulay malkin

    cucaulay malkin Master Guru

    Messages:
    215
    Likes Received:
    103
    GPU:
    1070 Strix
    Luckily, both will easily last 4 years.
     

  11. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,026
    Likes Received:
    1,037
    GPU:
    Rtx 3090 Strix OC
    On the 3080, VRAM at 4K with max settings will become an issue much sooner than hitting 60 fps will... in the average game anyway. But in the very demanding games, such as Valhalla, the 3090 gets 60 fps whereas the 3080 gets 50 fps - that is a fairly substantial difference.
     
  12. asturur

    asturur Master Guru

    Messages:
    938
    Likes Received:
    273
    GPU:
    Geforce Gtx 1080TI
    I'm personally happy there are 4 fast cards to choose from.
    Now, if my 1080 Ti can't hit 144 fps at 1440p in the new WoW game, I'll have to upgrade.
     
  13. Abc666

    Abc666 Member Guru

    Messages:
    111
    Likes Received:
    23
    GPU:
    EVGA FTW 1070 8GB
    So my next monitor is going to be 2560x1440. This generation's cards again take a massive performance hit with RT, even at the highest end, which makes the choice obvious (for me): turn RT off and unleash 30-40% more fps.

    So why would I go 3080? It uses 100 watts more, is slower, has less RAM, and costs more.
     
    Dragam1337 likes this.
  14. metagamer

    metagamer Ancient Guru

    Messages:
    1,821
    Likes Received:
    692
    GPU:
    Palit GameRock 2080
    Wait for the Cyberpunk 2077 performance reviews. RTX + DLSS 2.0 on Nvidia will run circles around anything AMD can offer.
     
  15. cucaulay malkin

    cucaulay malkin Master Guru

    Messages:
    215
    Likes Received:
    103
    GPU:
    1070 Strix
    AMD needs a DLSS equivalent.
    Until they have one, even a 2070S with DLSS will run circles around the 6800 XT in RT.


    https://www.purepc.pl/test-kart-graficznych-amd-radeon-rx-6800-xt-vs-geforce-rtx-3080?page=0,13
    https://www.purepc.pl/test-kart-graficznych-geforce-rtx-3070-vs-geforce-rtx-2080-ti?page=0,13

    47 vs 35 fps at 1440p

    Seriously, releasing RT-capable cards without a DLSS equivalent was pointless.
    Even a 3080 needs it for 1440p.
    The 2080 Ti needed it for 1080p.
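
    (Just to spell out the size of that 47 vs 35 fps gap, a quick sketch; see the purepc links above for which card scored which number:)

        # Relative gap between the two RT results quoted above.
        # "faster"/"slower" are generic labels, not an attribution.
        faster_fps = 47
        slower_fps = 35
        gap = (faster_fps / slower_fps - 1) * 100
        print(f"The faster card leads by {gap:.0f}% here")  # ~34%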
     

  16. Mufflore

    Mufflore Ancient Guru

    Messages:
    12,302
    Likes Received:
    1,035
    GPU:
    Aorus 3090 Xtreme
    Memory requirements can easily exceed 10GB if devs are allowed to leverage it, i.e. if memory sizes increase next gen, both AMD and Nvidia will have surpassed 10GB on high-end cards.
    RT can suck up VRAM like a hoover, as can very high-res textures, and both will be used for max quality if enough cards have the memory.
    I expect the 3080 to have no major issues with max quality settings over the next 2 years, but I'm not so sure beyond that.
    This is the major appeal of the 3080 Ti or a 20GB version.
    It's not only stock games: game mods can chew through VRAM as well.
     
  17. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,205
    Likes Received:
    2,996
    GPU:
    5700XT+AW@240Hz
    By the time 1440p falls to 60 fps on average, 4K is already in 25~30 fps hell.
    I do not play CC games, because they are all locked to 30 fps. They are not fast-paced games, but waiting over 33 ms for the game to react at the engine level increases latency and drastically reduces your ability to issue a higher number of commands per second.
    Sure, you can play Civilization at 4K.
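
    (That 33 ms is just the frame time at a 30 fps lock, a quick sketch:)

        # Frame time in milliseconds at a given fps: 30 fps -> ~33.3 ms per frame.
        def frame_time_ms(fps: float) -> float:
            return 1000.0 / fps

        for fps in (30, 60, 90, 144):
            print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")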

    And I'll repeat it for you and anyone else: buying a GPU for 1440p 144Hz does not mean every game always has to run at 144 fps. In time the card will be too weak and fps will fall to a still comfortable 90; years later it will be 60.

    You made a statement based on a false premise: that someone with a 144Hz screen will not have a good experience when a game does 60 fps,
    and that it is the same as when a 4K game does 25 fps. It is not the same in any way or form. Having less than 50 fps on average is already torture, and no self-respecting PC gamer would willingly accept it by using too high a resolution.

    The reality of your statement is:
    One can have a good GPU which starts its journey at either 1440p 144fps or 4K 60fps.
    Where is that journey in half a year, when heavier games come?
    1440p at 120 fps, or 4K at 50 fps.
    What about when 1440p can do only 100 fps? 4K at around 42 fps.
    At that point the 1440p owner is still happy; the 4K dude is reducing game details.

    And at that point, if you took two 32-inch screens, one 1440p and one 4K, and made an image quality comparison from the same distance, you would say that the 1440p looks better.

    In the end, it is the same case as the GTX 1080 which started this discussion. It can get you 4K @60fps in AC: Valhalla, but only if you are willing to play at LOW details.
    I am pretty sure 1440p at high details wins in image quality on screens of the same size. And I would personally have no problem enjoying extra-fluid movement at 1080p thanks to even higher fps.

    You know, 4K can deliver higher IQ than 1440p, but that requires proper use of high-detail textures and high-precision shaders. That is the direct opposite of what you get when you turn details from ultra down to medium/low.
     
  18. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,205
    Likes Received:
    2,996
    GPU:
    5700XT+AW@240Hz
    10GB is just allocation; only a tiny fraction of that data is actually loaded by the GPU each frame.
    The RX 6800 (XT) can actively use around 7GB of data from VRAM before its fps falls under 60.
    Sure, 9GB of extra cache is nice and will prevent stutter, but the higher the real VRAM usage, the worse the fps on current RDNA2 cards.

    AMD titles will not push real VRAM use above 7GB, no matter how much spare VRAM the cards have and no matter how much allocation consoles can take. If anyone pushes extra VRAM usage, it will be nVidia, and I doubt they will deliver a KO to all 8GB cards anytime soon.
    6GB, sure, that may happen within 2 years. But which card with 6GB VRAM is really powerful? And if one is, won't reducing the one detail type that eats the most VRAM be sufficient?
    Dynamic shadows are quite costly in bandwidth and occupy some extra VRAM. Maybe an ultra texture pack is not exactly the way to go on an older GPU either.

    But when we talk about cards with 10GB of VRAM, I would talk bandwidth first; that is a more pressing parameter than capacity. Especially since Turing has Sampler Feedback, as does RDNA2: new games that use more VRAM will implement the available methods, so they will not choke cards with 8GB.
    Recent history says that 4GB VRAM is not enough in some cases, and that is fixed on the affected cards by simply reducing a detail or two. But those cards do not have the power to deliver good fps at those details anyway.
    We can start worrying about cards with 8GB VRAM once this applies to cards with 6GB.
     
  19. Mufflore

    Mufflore Ancient Guru

    Messages:
    12,302
    Likes Received:
    1,035
    GPU:
    Aorus 3090 Xtreme
    Yawn.
     
  20. metagamer

    metagamer Ancient Guru

    Messages:
    1,821
    Likes Received:
    692
    GPU:
    Palit GameRock 2080
    DLSS 2.0 pretty much doubles my fps in Control. I can go from the game completely maxed, including RT, at 1440p running at 30-40 fps, to 60-70 fps with DLSS 2.0 and the same settings. Dropping RT down a notch or a setting here and there, the game runs at 100 fps no problem and looks amazing.

    I agree AMD needs a DLSS alternative. I'll give them this, though: AMD's RT performance is slightly better than I expected it to be. Not bad for a first try. The problem is, Nvidia now has a lead here, having had a previous generation to iron things out, and as I've said many times in the past, DLSS 2.0 is a game changer.
     
    AuerX and cucaulay malkin like this.
