Review average of 17 websites shows 6800 XT to be 7.4% slower than GeForce RTX 3080

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 20, 2020.

  1. leszy

    leszy Master Guru

    Messages:
    325
    Likes Received:
    17
    GPU:
    Sapphire V64 LC
    I have a strange feeling that games on consoles will again run better than on a PC. At least the titles whose porting is sponsored by NVidia. ;)
     
  2. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    1,566
    Likes Received:
    743
    GPU:
    107001070
    really?
    or maybe calling a well-established and reputable tech site "Jensen bitches" is more likely to be grounds for a ban?

    It was so refreshing to see that the discussion under G3D's 6800 XT review contained so little fanboy hatred, because on other sites I visit it was a complete ****show, just pathetic.

    Can we please have one tech site where the term "Jensen bitch" isn't thrown around?
     
    Last edited: Nov 22, 2020
  3. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    As @Agonist told you: "Resolution is not the only defining thing here."

    If a game renders a 960x540 image and it is then displayed on 3 screens of the same physical size (1080p, 1440p, 2160p) via integer scaling,
    all 3 screens will deliver the same image quality. That's because the game's output image quality did not exceed the capabilities of any of those screens.
    And when it comes to modern games, their output image quality rarely exceeds that of 1440p.
    To go beyond that, you need higher-resolution textures and complex shader code that processes them at such quality. And one last factor is needed to actually show the difference: objects with higher detail potential must be close enough to the viewport. Like a Crysis suit stuck right in your face, so you do not see what actually matters in the game. (Everything that's going on around you.)
    And then there is image quality delivered at the same screen size. When the difference is that small, people do not perceive it in motion, only in stills. Makes lovely photos, sure, but while playing the game, no.
    4K players with low fps tend to keep motion blur on to get an illusion of smoothness. That in itself defeats any potential gains. In contrast, those who can keep high fps will turn motion blur OFF immediately.
    I can tell you that if someone with a 4K screen uses motion blur, even I with a 1080p screen have higher image quality, along with higher motion clarity and temporal smoothness of motion.

    And back to achievable image quality. For 4K to deliver higher perceivable IQ than 1440p, it needs to use appropriate resources, and that often results in low fps, which people compensate for with lower detail settings. That defeats the entire point of this discussion.
    If a card drops under 60 fps at 4K in modern games, it is not suitable for 4K gaming. Not even the 3090 can keep it up in already released games:
    3 out of 13 games in Hilbert's test set show exactly that. (You may notice that the current test set has 14 games; I excluded MS Flight Simulator because it is CPU bound, and only an idiot would count it towards GPU-limited games and the total, as there is no opportunity to succeed or fail there.)
    And Hilbert's test set really has only a few very new games. If it consisted purely of new AAA games, to show the contemporary abilities of the newly released cards, it would be a 4K nightmare.

    So once again: given the same screen size, someone running 1440p will have higher image quality with perfectly stable and fluid fps than someone with a 4K screen trying to achieve the same framerate fluidity on the same GPU. (In modern, "new" games.)

    Looking into the past is a false argument. Like saying the HD 7970 is a 4K GPU because it can play Mario 64 at 4K. The reality is, it would fail to deliver good fps in games that were current the day it was released, and it would fail to do so in future games. Just because there are cases where something does not fail does not mean it is failproof.
    The 3090 is failing to deliver a good 4K experience in some games released before its launch, and considering the trend, it will continue to do so. That means it is not a 4K card. And that means there is currently no 4K graphics card, because there is no stronger graphics card than the 3090.

    People can play at 4K even with older cards. It is just a question of how old the game is and how low the details are. People who shill for 4K or the 3090 should think about that.
    And as I wrote the moment nVidia unveiled RTX cards for the first time: people will be returning to lower-resolution screens, because raytracing is a bottomless pit where you can throw any and all computational power and it will not be enough. Some will just take longer to realize how reality has shifted under their feet. Some will not realize it any time soon.
    (And btw, most of this is not aimed at you, @W@w@Y.)
     
    cucaulay malkin likes this.
  4. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    1,566
    Likes Received:
    743
    GPU:
    107001070
    a lot of good points here
    most people define IQ by static shots
    motion clarity and fluidity are just as much a part of the perceived image, if not more.

    that's why I don't understand why technologies like ULMB are not more popular while people buy 240Hz monitors like hot cakes. just doesn't make sense.

    I have a 24" 1440p ULMB monitor and I wouldn't trade it for anything. it just doesn't get better than that for a crisp, fluid image achievable with most modern upper-mid-range GPUs.
     

  5. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    120fps on XSX:
    CoD: Cold War
    CoD: Warzone
    Devil May Cry 5
    Dirt 5
    Falconeer
    Gears 5
    ...

    They surely do not fall into the category of "nothing". And they are not low-polygon-artistic either.
    To be honest, people do not buy 240Hz like hot cakes. :) That's probably why.
    And some manufacturers like to keep manufacturing costs in check to remain competitive.
    Most people will buy a screen by resolution and panel type. Some will check the refresh rate and whether the screen has AdaptiveSync.
    A very small minority checks special features.

    And I think special features like ULMB will fall even lower on priority lists as HDR becomes a bigger selling point.
     
    Last edited: Nov 22, 2020
    itpro likes this.
  6. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    1,566
    Likes Received:
    743
    GPU:
    107001070
    imo saying that consoles aim for 120 while high-end PCs aim for 60 is wrong.

    in the end it's performance that matters. with a 3070/6800 you can do more than with an Xbox, period. especially with DLSS 2.0 available for RTX cards
     
  7. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    Sure you can. And consoles do aim to deliver some games capable of 120 fps (locked), most of them at 1080p, some at 1440p.
    But that statement was meant to show the contrast that even consoles balance image quality against frame rate.
    Because some members here declare that 4K at a low and unstable framerate is OK, because pixel density is placed above fluid gameplay. (Paraphrasing here.)
     
    itpro likes this.
  8. itpro

    itpro Master Guru

    Messages:
    964
    Likes Received:
    517
    GPU:
    Radeon Technologies
    5120×1440 almost rivals "4K" though. The G9 and other such monitors are better overall for pure PC gaming now.
     
  9. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    1,566
    Likes Received:
    743
    GPU:
    107001070
    maybe for the way they play, it's a true statement.
     
  10. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,042
    Likes Received:
    259
    GPU:
    6800 XT
    Well, it's double 1440p for the G9, while 4K is 2.25 times. They are indeed close. I have a G9 myself and it's awesome.

    It is more immersive than a flat screen for sure.
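
    For reference, those pixel-count ratios are easy to check; a minimal Python sketch (the resolutions used are the standard 2560x1440, 5120x1440 and 3840x2160 figures, nothing beyond that):

    ```python
    # Pixel-count comparison: 1440p vs the 32:9 G9 vs 4K UHD.
    resolutions = {
        "1440p (2560x1440)":  2560 * 1440,   # 3,686,400 px
        "G9 (5120x1440)":     5120 * 1440,   # 7,372,800 px
        "4K UHD (3840x2160)": 3840 * 2160,   # 8,294,400 px
    }

    base = resolutions["1440p (2560x1440)"]
    for name, pixels in resolutions.items():
        print(f"{name}: {pixels:,} px = {pixels / base:.2f}x 1440p")
    # The G9 comes out at exactly 2.00x 1440p, 4K UHD at 2.25x.
    ```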
     

  11. Undying

    Undying Ancient Guru

    Messages:
    15,505
    Likes Received:
    4,509
    GPU:
    Aorus RX580 XTR 8GB
    It is the only site where Intel CPUs are still beating AMD and where the 6800 XT is slower even at lower resolutions. That reputable site is a joke; their opinions mean nothing to me. Btw, a few people liked it, so I'm not the only one. ;)
     
    moo100times likes this.
  12. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    1,566
    Likes Received:
    743
    GPU:
    107001070
    who df cares what CPUs you cheer for in a G3D discussion about graphics cards

    I never liked how they tested CPUs and RAM for gaming either, but really, it's completely irrelevant here. I don't suspect them of cheating the results, so even if I don't like them, I should accept them. That's called being objective. You can't just s**t on results you don't like.


    no, it's not the only site that shows the 3080 ahead at 1440p, not even close to true. in fact, most of the sites I read don't show the 6800 XT ahead there.

    computerbase doesn't, pcgh doesn't, purepc doesn't.
    in fact, computerbase and pcgh show an even bigger lead than TPU! is everyone biased in your opinion?
     
    Last edited: Nov 22, 2020
  13. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,636
    Likes Received:
    335
    GPU:
    MSI GTX1070 GamingX
    Fox2232 re: post 126

    That's all theoretical. In my experience it actually doesn't hold up. I'm on 1440p with a GTX 1070 and my son is on a laptop with 4K and a GTX 1060. It's obvious to me that even when I'm running high details and he's on a mix of low-medium, his 4K screen looks better, and the difference is really obvious. Even though his "total assets" may be lower than mine, the 4K resolution still shows more "detail" than my 1440p can. That's especially noticeable when it comes to games with long draw distances. With 4K, you also have the flexibility of being able to run 1080p if more performance is needed, and it still looks right.

    I bought my 1440p 144Hz monitor in March 2018, due to my own negative perception of 4K and how demanding it "was". Today, after seeing it in action, my view has totally changed. 4K is just better; there's no way to overcome it. Even with the compromises it's still better. However, I tend to use monitors until they break, so I probably won't change my monitor until around 2025 (I'm estimating it'll last ~7 years), but the pricing is getting better and 4K 144Hz+ is starting to come within reach of what I'm willing to pay. I can't wait to upgrade tbh, it looks sweet.
     
    AuerX likes this.
  14. KonoSuba

    KonoSuba New Member

    Messages:
    8
    Likes Received:
    2
    GPU:
    ASUS Dual RX 5600XT
    I have no money for any of the tested options...
     
  15. Mufflore

    Mufflore Ancient Guru

    Messages:
    12,707
    Likes Received:
    1,295
    GPU:
    Aorus 3090 Xtreme
    Two 1440p monitors' worth is 5120x1440 UW, 1.33x the horizontal pixels of UHD.
    I use a CRG9 with my UHD TV also in view centrally, and it's incredibly immersive.
    It can run 3840x1080 while looking very good too, which makes it accessible for 1080 Ti owners and even some lesser cards.

    A review, in case it's useful:
    https://www.displayninja.com/samsung-c49rg9-review/
     
    Last edited: Nov 22, 2020

  16. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    Yeah, and games on cellphones look good too. Apples to apples means the same screen size.
    And the entire time we were talking about this: 4K being viable for the games existing at the GPU's release, and viable for the future. Meaning at least that if a person is willing to burn money on the best GPU available, that best GPU can carry maximum-detail gaming until the next generation's best releases.
    And that simply never held true in the past, nor does it today. nVidia released great and powerful GPUs, but they already can't handle many existing games.
    So, if you have a 4K screen of the same size as your 1440p, put them side by side. Set maximum details on the system using the 1440p screen, and then reduce details on the system using the 4K screen until the fps is equal.

    When fps is equal, temporal perception (quality) is equal, and one can compare actual image quality. Then you can swap the devices around and give 1440p the handicap of the laptop's performance.

    But different screen sizes give a different perception even of frame rate.
     
  17. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,636
    Likes Received:
    335
    GPU:
    MSI GTX1070 GamingX
    Again, your viewpoint is theoretical and suited more to testing purposes. In real life it's irrelevant. Your testing methods are more akin to GPU benchmarking than to how things work in actual use.

    As I mentioned, in any game with a decent draw distance the 4K resolution makes a perceptible difference. No matter how detailed you try to make a background object, if it's made of only 100 pixels at 1440p but 225 pixels on the 4K screen, you will notice this straight away. There's no way to get around it. The pixels on the 1440p screen under your conditions might look nicer "as a whole", but there are just more of them on the 4K screen.
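
    That 100 vs 225 pixel figure is just the area ratio between the two resolutions; a quick Python sketch of the arithmetic (assuming the standard 2560x1440 and 3840x2160 resolutions):

    ```python
    # An object covering 100 pixels at 1440p covers area_ratio times as many
    # pixels at 4K when it occupies the same fraction of the screen.
    area_1440p = 2560 * 1440
    area_4k = 3840 * 2160

    area_ratio = area_4k / area_1440p   # = 2.25
    print(100 * area_ratio)             # 225.0 pixels at 4K
    ```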

    The negatives of 4K end when it comes to viewing content and productivity. For those who do those things, it's a no-brainer. I'm pretty sure this same argument will present itself with 4K vs 5K, 4K vs 8K, 5K vs 8K, etc. For me, resolution makes a huge difference, and gaming is only one aspect of it. Thankfully the advantage of a PC is that it's a user-customisable experience and we can tweak to taste. No one is expecting any of these cards to still offer the best performance beyond 2 years. However, with consoles driving the market, I also don't think we'll be looking at needing 2x RTX 3090 performance to get playable 4K in 2023+, though I'm sure such cards will exist by then. Within a few years 4K will make sense; it's inevitable. Everything from consoles and movies to TV/monitor manufacturers to streaming services is pushing ahead with it.
     
    carnivore likes this.
  18. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,810
    Likes Received:
    3,363
    GPU:
    6900XT+AW@240Hz
    False. My methods are perfectly suited for a real comparison. I have explained them all multiple times; they are not hard to understand at all.

    For example, a physically smaller screen can get away with lower fps, because in motion, objects travel a smaller physical distance on the screen. They still move the same number of pixels if both screens in the comparison have the same resolution.

    Like Android devices historically being locked to 30 fps gaming; not many people would notice on such small screens. Only in recent years have some games gotten a 60 fps mode.
    And only recently are we getting Android devices with 90Hz and now 120Hz screens.
    For a cellphone, 90/120Hz is overkill unless the person wants to use the device in a VR headset.

    When I trigger motion that results in an object moving 2cm per frame on my screen at 240 fps, it translates to 8cm per frame on a 60Hz screen, no matter how many fps the game can produce, as long as the screen size is equal.
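
    Putting numbers on that example (a minimal Python sketch restating the 2 cm per frame figure from the paragraph above; the implied on-screen speed is derived from it, not measured):

    ```python
    # 2 cm of travel per refresh at 240 Hz implies an on-screen speed, and the
    # same physical motion on a 60 Hz screen of equal size covers more per refresh.
    travel_per_frame_240hz = 2.0                   # cm per refresh at 240 Hz
    speed_cm_per_s = travel_per_frame_240hz * 240  # 480 cm/s across the screen

    travel_per_frame_60hz = speed_cm_per_s / 60    # 8.0 cm per refresh at 60 Hz
    print(travel_per_frame_60hz)
    ```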

    And it is the same for image quality comparisons on screens of the same size. When you have a 1440p screen and run whatever details you can while the game holds a stable 60 fps, and then take a 4K screen of the same size with the same hardware and use settings that get you the same 60 fps... you have effectively removed the temporal part of the comparison and can perform the IQ comparison.
    - - - -
    What you described is only true in the case of unlimited power. And that's the whole problem. Not even the 3090 has that. I made a joke about the power draw of the Suprim card, but that does not translate to gaming; the power is not there. To blow it out of proportion to demonstrate: choose DX12 + DXR at maximum IQ at 1080p, or maximum DX9c at 8K, or DX7 (no shaders) at 16K.

    Sure, maybe in 20 years when we have 32K screens, everything will be about tiny polygons and things will return to the times of simple TnL. But I doubt it.
     
  19. coth

    coth Master Guru

    Messages:
    488
    Likes Received:
    53
    GPU:
    KFA2 2060 Super EX
    All 5120x1440 monitors are 49". That is effectively two 27" 2560x1440 panels side by side, about 109 ppi. Not only is it a fairly low resolution with a still rather large pixel, but 125% scaling is also not supported by 99.9% of apps, and any raster scaling to 125% is severely distorted.
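
    A back-of-envelope check of that pixel density (a Python sketch assuming the 49-inch diagonal and 32:9 5120x1440 resolution mentioned above):

    ```python
    import math

    # PPI of a 49" panel at 5120x1440 (32:9).
    h_px, v_px = 5120, 1440
    diagonal_in = 49.0

    diag_px = math.hypot(h_px, v_px)          # diagonal length in pixels
    width_in = diagonal_in * h_px / diag_px   # ~47.2" physical width
    ppi = h_px / width_in                     # ~108.5 ppi, roughly a 27" 1440p panel
    print(round(ppi, 1))
    ```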
     
    Dragam1337 likes this.
  20. KingK76

    KingK76 Member Guru

    Messages:
    106
    Likes Received:
    11
    GPU:
    Pascal Titan X Under H2O
    Okay, so I did the math myself using Guru3D's benchmarks... at 1440p the RTX 3080 is faster in 7 of 14 games and tied in one. That means the RTX 3080 wins at 1440p, as the 6800 XT was faster in 6 games and tied in 1. Averaging the games benchmarked, the RTX 3080 has an average FPS of 138.71 vs 136.79 for the 6800 XT. Another win, by 1.4% at 1440p, for the RTX 3080... So yes, even on Guru3D the RTX 3080 is faster at 1440p than the 6800 XT.

    I didn't bother with 1080p as NO ONE buying a top-tier GPU is gaming at 1080p... I could say the same for 1440p, as I know that I game at 2160p and most people buying a top-tier GPU are going to be gaming at 4K... no matter what AMD "enthusiasts" like to say.

    I will state that the 6800 XT is an AMAZING GPU coming from AMD. The gains gen over gen are astounding, and I give them all the credit in the world. But when you consider the performance gap at 4K for the RTX 3080, the slim victory at 1440p, and the large gap in ray tracing performance and features like DLSS, only the blind would say the 6800 XT is the better card.
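
    The 1.4% figure follows directly from the two averages quoted in the post (a minimal Python sketch; only the 138.71 and 136.79 values stated above are used, the per-game numbers are not reproduced here):

    ```python
    # Relative 1440p lead of the RTX 3080 over the 6800 XT, from the averages above.
    rtx3080_avg_fps = 138.71
    rx6800xt_avg_fps = 136.79

    lead_pct = (rtx3080_avg_fps / rx6800xt_avg_fps - 1) * 100
    print(f"{lead_pct:.1f}%")   # ~1.4%
    ```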
     
