Review average of 17 websites shows 6800 XT to be 7.4% slower than GeForce RTX 3080

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 20, 2020.

  1. jbscotchman

    jbscotchman Guest

    Messages:
    5,871
    Likes Received:
    4,765
    GPU:
    MSI 1660 Ti Ventus
Agreed, but adding extra VRAM doesn't compensate for a lack of horsepower. I mean, hell, a 5700 XT with 4 gigs of VRAM would easily outperform an RX 580 with 8 gigs.
     
  2. jbscotchman

    jbscotchman Guest

    Messages:
    5,871
    Likes Received:
    4,765
    GPU:
    MSI 1660 Ti Ventus
Well, that entirely depends on the settings.
     
  3. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,691
    Likes Received:
    2,671
    GPU:
    Aorus 3090 Xtreme
    I think you care about this far more than me.
     
  4. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
Love how you use strawman arguments all the time. I don't drop below my 58 fps cap - so all your ramblings are just BS strawman arguments - nothing more, nothing less.

And when people buy a high-Hz display, it's to make use of it - otherwise they might as well have bought a 60 Hz display in the first place and saved themselves a lot of money. So yeah, you probably wasted your money on that 240 Hz display with that 5700 GPU...
     
    jbscotchman likes this.

  5. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
    No it doesn't.

    Yes it is.

According to the Steam surveys, 3440x1440 is by far the least used resolution - better luck next time.

I'm the one who is clueless, yet you can't even get the specs right when given the model number of the monitor...
     
  6. ViperAnaf

    ViperAnaf Guest

    Messages:
    404
    Likes Received:
    125
    GPU:
    ASUS TUF 3080 OC
DLSS 2.0 + RT makes the 6800 XT a no-go card... unless you are poor, of course...
     
  7. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
Actually, now you have done it. Lovely projection.
But let's be honest: if you are maxing out new games, you are running under 60 fps at 4K. Saying otherwise is lying to everyone around. And in a year, you will be running new games at what, on average? 50? 45? 40?
Me with my poor 240Hz 1080p screen: if I drop under 160 fps, I'll still have a fluid framerate. Yeah, no 240 fps in new games. But that's the thing - I can afford to lose 70% of my fps over time and all games will still be playable.
1440p can lose 50% of its fps and games will still be playable.
But when you start at around 60 fps on average, you are making sacrifices on day one. Or you have a subpar experience and are lying about it.
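    (A minimal Python sketch of that headroom arithmetic; the starting framerates and loss percentages below are illustrative assumptions, not measured figures:)

    ```python
    # Sketch: how much framerate is left after losing a given fraction of an
    # assumed starting framerate (all numbers purely for illustration).
    def remaining_fps(start_fps: float, loss_fraction: float) -> float:
        """Return the framerate left after losing loss_fraction of start_fps."""
        return start_fps * (1.0 - loss_fraction)

    examples = [
        ("1080p high-refresh start", 240, 0.70),  # losing 70% still leaves 72 fps
        ("1440p start",              120, 0.50),  # losing 50% still leaves 60 fps
        ("4K start",                  60, 0.50),  # losing 50% leaves only 30 fps
    ]
    for label, start, loss in examples:
        print(f"{label}: {start} fps - {loss:.0%} = {remaining_fps(start, loss):.0f} fps")
    ```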

    I am not saying that 240Hz is optimal; it is not. Maybe when we get a 480Hz screen, I'll stop seeing a difference. But I am not saying that 120/144/165Hz is bad either. Quite the contrary: for most people, 1440p at those framerates is optimal. There is hardware which can play games at it without sacrifices, and which will be able to do so for quite some time.
    It is 4K that never really got its framerates off the ground unless sacrifices were made.

    Everyone knows it; every tech site shows the same data. Yet buyer's bias is buyer's bias. Mistakes happen, and people have to live with them.
    Acknowledging it and learning does not hurt. Or live in denial.
     
  8. AuerX

    AuerX Ancient Guru

    Messages:
    2,533
    Likes Received:
    2,332
    GPU:
    Militech Apogee
    No correction, but I intend to game at 4K60 with an 80/90 series card. I have played a lot of games as it is at 4K with a 2070.

    Looking forward to doing it better, and with some titles I never really attempted due to hardware limitations.
     
    Dragam1337 likes this.
  9. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
    Again, using strawman arguments. Look through the tests Hilbert did of the 3090 - how many of the games averaged at or below 60?

    The vast majority of games run considerably faster than 60 fps at 4K maxed out on a 3090. The only exceptions are Valhalla and Flight Simulator 2020, which run at or slightly below 60 fps. But that obviously doesn't fit your narrative that NO GPU CAN RUN 4K !!!!!!111...
     
  10. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    Soooo we can argue, but we gotta be civil about it. It's been good so far with a few... slips, but overall let's keep it like this.
     

  11. W@w@Y

    W@w@Y Ancient Guru

    Messages:
    8,227
    Likes Received:
    86
    GPU:
    Strix 4080
    [attached image: upload_2020-11-22_14-27-59.png]
     
    Stormyandcold, itpro and Dragam1337 like this.
  12. asturur

    asturur Maha Guru

    Messages:
    1,371
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    Consoles list 120Hz in the specs, but I'm sure nothing will actually do that.
    Maybe some low-poly artistic game - cool, but not a reference.
    Maybe PS4 games at 1080p can be run at 120 fps, exactly as we can run games at medium settings on a 1080 Ti.
     
  13. leszy

    leszy Master Guru

    Messages:
    348
    Likes Received:
    39
    GPU:
    GB 7900XT Gaming OC
    Interesting comparison. It looks like NVidia's PR department in Germany is very influential ;) Tests on German sites gave NVidia a great advantage. And 3DCenter carefully skipped sites where the RX 6800 XT won, for example notebookcheck.net (+4.75%) or techgage.com (+1.5%). By the way, NVidia cards are usually tested in the best possible environment for them. This time, when testing a PCIe 4.0 card, most of the well-known sites turned to the old PCIe 3.0 platform. Is this a nod to Intel or NVidia? Sometimes I wonder what the point is for companies to develop new advanced technologies, since reviewers carefully pretend not to notice their importance (unless it is a technology from that company which should be supported, such as DLSS, available in (literally) several games) ;)
     
  14. leszy

    leszy Master Guru

    Messages:
    348
    Likes Received:
    39
    GPU:
    GB 7900XT Gaming OC
    I have a strange feeling that games on consoles will again run better than on a PC. At least the titles whose ports are sponsored by NVidia ;)
     
  15. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    Really?
    Or maybe calling a well-established and reputable tech site "Jensen bitches" is more likely to be the grounds for a ban?

    It was so refreshing to see that the discussion under G3D's 6800 XT review contained so little fanboy hatred, because on the other sites I visit it was a complete ****show, just pathetic.

    Can we please have one tech site where the term "Jensen bitch" isn't thrown around?
     
    Last edited: Nov 22, 2020

  16. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    As @Agonist told you: "Resolution is not the only defining thing here."

    If a game renders a 960x540 image and it is then displayed on 3 screens of the same physical size - 1080p, 1440p, 2160p - via integer scaling, all 3 screens will deliver the same image quality. That's because the game's output image quality did not exceed the capabilities of any of those screens.
    And when it comes to modern games, their output image quality rarely exceeds that of 1440p.
    For that, you need higher-resolution textures and complex shader code that processes them at such quality. And the last factor needed to actually show the difference: objects with higher detail potential must be close enough to the viewport. Like the Crysis suit stuck right in your face, so you do not see what actually matters in the game (everything that's going on around you).
    And then we get to the image quality delivered at the same screen size. When the difference is that small, people do not perceive it in motion, only in stills. Makes lovely photos, sure, but not while playing the game.
    4K people with low fps tend to keep motion blur on to get an illusion of smoothness. That itself defeats any potential gains. In contrast, those who can keep high fps will turn motion blur OFF immediately.
    I can tell you that if someone with a 4K screen uses motion blur, even I with a 1080p screen have higher image quality, while also having higher motion clarity and temporal smoothness of motion.
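    (A minimal Python sketch of the integer-scaling point; the helper function is made up for this illustration and only checks whether a display resolution is an exact multiple of the render resolution:)

    ```python
    # Sketch: integer scaling only applies when the display resolution is an exact
    # NxN multiple of the render resolution, so every rendered pixel maps to a
    # block of identical screen pixels with no interpolation or blurring.
    def integer_scale_factor(render, display):
        rw, rh = render
        dw, dh = display
        if dw % rw == 0 and dh % rh == 0 and dw // rw == dh // rh:
            return dw // rw  # clean NxN duplication of every source pixel
        return None  # not an exact multiple; the scaler would have to interpolate

    render = (960, 540)
    for name, display in [("1080p", (1920, 1080)), ("2160p", (3840, 2160))]:
        factor = integer_scale_factor(render, display)
        print(f"{name}: {factor}x integer scale" if factor else f"{name}: needs interpolation")
    ```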

    And back to achievable image quality. For 4K to deliver higher perceivable IQ than 1440p, it needs to use appropriate resources, and that often results in low fps, which people then compensate for with lower detail settings. That defeats the entire point of this discussion.
    If a card drops under 60 fps at 4K in modern games, it is not suitable for 4K gaming. Not even the 3090 can keep it up in already-released games, because 3 out of 13 games in Hilbert's test set show exactly that. (You may notice that the current test set has 14 games; I excluded MS:F.Sim, as it is CPU bound and only an idiot would count it towards GPU-limited games and the total, since there is no opportunity to succeed or fail.)
    And this test set of Hilbert's really has only a few very new games. Were it to consist purely of new AAA games, to show the contemporary abilities of the cards released now, it would be a 4K nightmare.

    So once again: given the same screen size, someone running 1440p will have higher image quality, while having perfectly stable and fluid fps, than someone with a 4K screen trying to achieve the same framerate fluidity with the same GPU. (In modern "new" games.)
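    (For context on that trade-off, a trivial pixel-count calculation in Python - an illustration, not a benchmark:)

    ```python
    # Sketch: pixels shaded per frame at 1440p vs 4K, which is roughly why the
    # same GPU has far more fps headroom at 1440p (ignoring CPU limits, memory
    # bandwidth and other real-world factors).
    pixels_1440p = 2560 * 1440   # 3,686,400 pixels
    pixels_2160p = 3840 * 2160   # 8,294,400 pixels
    print(f"4K pushes {pixels_2160p / pixels_1440p:.2f}x the pixels of 1440p per frame")  # 2.25x
    ```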

    Looking into the past is a false argument. It's like saying the HD 7970 is a 4K GPU because it can play Mario 64 at 4K. The reality is, it would fail to deliver good fps at 4K in the games of the day it was released, and it would fail to do so with future games. Just because there are cases where something does not fail does not mean that it is failproof.
    The 3090 is failing to deliver a good 4K experience in some games released before its launch. And considering the trend, it will continue to do so. That means it is not a 4K card. And as such, there is currently no 4K graphics card, because there is no stronger 4K graphics card than the 3090.

    People can play at 4K even with older cards. It is just a question of how old the game is and how low the details are. People who shill for 4K or the 3090 should think about that.
    And as I wrote the moment nVidia unveiled RTX cards for the first time: people will be returning to lower-resolution screens. Because raytracing is a bottomless pit into which you can throw any and all computational power, and it will not be enough. Some will just take longer to realize how reality shifted under their feet. Some will not realize it any time soon.
    (And btw., most of this is not aimed at you, @W@w@Y.)
     
    cucaulay malkin likes this.
  17. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    A lot of good points here.
    Most people define IQ from static shots, while motion clarity and fluidity are as much a part of perceived image quality, if not more.

    That's why I don't understand how technologies like ULMB are not so popular while people buy 240Hz monitors like hot cakes. It just doesn't make sense.

    I have a 24" 1440p ULMB monitor and I wouldn't trade it for anything. It just doesn't get better than that for a crisp, fluid image achievable with most modern upper-mid-range GPUs.
     
  18. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    120fps on XSX:
    CoD: Cold War
    CoD: Warzone
    Devil May Cry 5
    Dirt 5
    Falconeer
    Gears 5
    ...

    They surely do not fall into the category of "nothing". And they're not low-poly artistic games either.
    To be honest, people do not buy 240Hz like hot cakes. :) That's probably why.
    And some manufacturers like to keep manufacturing costs in check to remain competitive.
    Most people will buy a screen by resolution and panel type. Some will check the refresh rate and whether the screen has AdaptiveSync.
    Only a very small minority checks special features.

    And I think special features like ULMB will rank even lower on priority lists as HDR becomes a bigger selling point.
     
    Last edited: Nov 22, 2020
    itpro likes this.
  19. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    IMO, saying that consoles aim for 120 while high-end PCs aim for 60 is wrong.

    In the end, it's performance that matters. With a 3070/6800 you can do more than with an Xbox, period - especially with DLSS 2.0 available for RTX cards.
     
  20. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Sure you can. And consoles do aim to deliver some games capable of 120 fps (locked) - most of them at 1080p, some at 1440p.
    But that statement was meant to show the contrast that even consoles balance image quality against frame rate.
    Because some members here declare that 4K at a low and unstable framerate is OK, because pixel density matters more than fluid gameplay. (Paraphrasing here.)
     
    itpro likes this.
