Review average of 17 websites shows 6800 XT to be 7.4% slower than GeForce RTX 3080

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 20, 2020.

  1. W@w@Y

    W@w@Y Ancient Guru

    Messages:
    8,101
    Likes Received:
    39
    GPU:
    RTX 3080
    [image attachment: upload_2020-11-22_14-27-59.png]
     
    Stormyandcold, itpro and Dragam1337 like this.
  2. asturur

    asturur Master Guru

    Messages:
    938
    Likes Received:
    273
    GPU:
    Geforce Gtx 1080TI
    Consoles list 120Hz on the spec sheet, but I'm sure nothing will actually run at that.
    Maybe some low-poly artistic game, which would be cool, but not a reference point.
    Maybe PS4 games at 1080p can run at 120fps, exactly as we can run games at medium settings on a 1080 Ti.
     
  3. leszy

    leszy Master Guru

    Messages:
    323
    Likes Received:
    15
    GPU:
    Sapphire V64 LC
    Interesting comparison. It looks like NVidia's PR department in Germany is very influential ;) Tests on German sites gave NVidia a great advantage. And 3DCenter carefully skipped sites where the RX 6800 XT won, for example notebookcheck.net (+4.75%) or techgage.com (+1.5%). By the way, NVidia cards are usually tested in the best possible environment for them. This time, when testing a PCIe 4.0 card, most of the well-known sites fell back to the old PCIe 3.0 platform. Is this a nod to Intel or to NVidia? Sometimes I wonder what the point is for companies to develop new advanced technologies, since reviewers carefully pretend not to notice their importance (unless it is that company's own technology which should be promoted, such as DLSS, available in (literally) several games) ;)
     
  4. leszy

    leszy Master Guru

    Messages:
    323
    Likes Received:
    15
    GPU:
    Sapphire V64 LC
    I have a strange feeling that games on consoles will again run better than on PC. At least the titles whose ports are sponsored by NVidia ;)
     

  5. cucaulay malkin

    cucaulay malkin Master Guru

    Messages:
    215
    Likes Received:
    103
    GPU:
    1070 Strix
    really?
    or maybe calling a well-established and reputable tech site "jensen bitches" is more likely to be the grounds for a ban?

    It was so refreshing to see that the discussion under G3D's 6800 XT review contained so little fanboy hatred, because on the other sites I visit it was a complete ****show, just pathetic.

    Can we please have one tech site where the term "Jensen bitch" isn't thrown around?
     
    Last edited: Nov 22, 2020
  6. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,205
    Likes Received:
    2,996
    GPU:
    5700XT+AW@240Hz
    As @Agonist told you: "Resolution is not the only defining thing here."

    If a game renders a 960x540 image and it is then displayed on 3 screens of the same physical size (1080p, 1440p, 2160p) via integer scaling,
    all 3 screens will deliver the same image quality. That's because the game's output image quality did not exceed the capabilities of any of those screens.
    And when it comes to modern games, their output image quality rarely exceeds that of 1440p.
    For that, you need higher-resolution textures and complex shader code that processes them at such quality. And the last factor is needed to actually show the difference: objects with higher detail potential must be close enough to the viewport. Like the Crysis suit stuck right in your face, so you do not see what actually matters in the game (everything that's going on around you).
    And when we get to image quality delivered at the same screen size: when the difference is that small, people do not perceive it in motion, only in stills. Makes lovely photos, sure, but not while playing the game.
    4K players with low fps tend to keep motion blur on to get an illusion of smoothness. That by itself defeats any potential gains. In contrast, those who can keep high fps will turn motion blur off immediately.
    I can tell you that if someone with a 4K screen uses motion blur, even I with a 1080p screen have higher image quality, along with higher motion clarity and temporal smoothness of motion.
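
    For reference, a minimal sketch of what integer scaling does, assuming NumPy and plain nearest-neighbour pixel repetition (an editor's illustration, not code from the post): each source pixel is duplicated by a whole factor per axis, so the upscaled frame carries exactly the information of the 960x540 render, no more and no less. A 960x540 frame fits 1080p at 2x and 2160p at 4x.

        # Minimal illustrative sketch: nearest-neighbour integer upscaling.
        # Each source pixel is repeated k times on both axes, so no detail is
        # invented and none is lost relative to the 960x540 render.
        import numpy as np

        def integer_scale(frame, k):
            """Upscale an H x W (x C) frame by an integer factor k per axis."""
            return np.repeat(np.repeat(frame, k, axis=0), k, axis=1)

        render = np.random.randint(0, 256, size=(540, 960, 3), dtype=np.uint8)
        print(integer_scale(render, 2).shape)  # (1080, 1920, 3) -> fits 1080p at 2x
        print(integer_scale(render, 4).shape)  # (2160, 3840, 3) -> fits 2160p at 4x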

    And back to achievable image quality. For 4K to deliver higher perceivable IQ than 1440p, it needs to use appropriate resources, and that often results in low fps, which people compensate for with lower detail settings. That defeats the entire point of this discussion.
    If a card drops under 60fps at 4K in modern games, it is not suitable for 4K gaming. Not even the 3090 can keep that up in already-released games: 3 out of 13 games in Hilbert's test set show exactly that. (You may notice that the current test set has 14 games; I excluded MS Flight Simulator, as it is CPU-bound and only an idiot would count it towards GPU-limited games and the total, since there is no opportunity to succeed or fail.)
    And Hilbert's test set really has only a few very new games. Were it made up purely of new AAA games, to show the contemporary abilities of the cards released now, it would be a 4K nightmare.

    So once again: given the same screen size, someone running 1440p will have higher image quality with perfectly stable and fluid fps, compared to someone with a 4K screen trying to achieve the same framerate fluidity on the same GPU. (In modern, "new" games.)

    Looking into the past is a false argument. It's like saying the HD 7970 is a 4K GPU because it can play Mario 64 at 4K. The reality is, it failed to deliver good fps in the games of the day it was released, and it failed to do so with later games. Just because there are cases where something does not fail does not mean that it is fail-proof.
    The 3090 is failing to deliver a good 4K experience in some games released before its launch, and considering the trend, it will continue to do so. That means it is not a 4K card. And that in turn means there is currently no 4K graphics card, because there is no stronger card than the 3090.

    People can play at 4K even with older cards. It is just a question of how old the game is and how low the details go. People who shill for 4K or the 3090 should think about that.
    And as I wrote the moment nVidia unveiled RTX cards for the first time: people will be returning to lower-resolution screens, because raytracing is a bottomless pit into which you can throw any and all computational power and it will not be enough. Some will just take longer to realize how reality has shifted under their feet. Some will not realize it any time soon.
    (And by the way, most of this is not aimed at you, @W@w@Y.)
     
    cucaulay malkin likes this.
  7. cucaulay malkin

    cucaulay malkin Master Guru

    Messages:
    215
    Likes Received:
    103
    GPU:
    1070 Strix
    a lot of good points here
    most people judge IQ by static shots
    motion clarity and fluidity are as much a part of the perceived image, if not more.

    that's why I don't understand how technologies like ULMB are not popular while people buy 240Hz monitors like hot cakes. just doesn't make sense.

    I have a 24" 1440p ULMB monitor and I wouldn't trade it for anything. it just doesn't get better than that for a crisp, fluid image achievable with most modern upper-mid-range GPUs.
     
  8. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,205
    Likes Received:
    2,996
    GPU:
    5700XT+AW@240Hz
    120fps on XSX:
    CoD: Cold War
    CoD: Warzone
    Devil May Cry 5
    Dirt 5
    Falconeer
    Gears 5
    ...

    They surely do not fall into the category of "nothing". And they're not low-poly artistic games either.
    To be honest, people do not buy 240Hz like hot cakes. :) That's probably why.
    And some manufacturers like to keep manufacturing costs in check to remain competitive.
    Most people buy a screen by resolution and panel type. Some will check the refresh rate and whether the screen has Adaptive-Sync.
    A very small minority checks special features.

    And I think special features like ULMB will drop even lower on priority lists as HDR becomes a bigger selling point.
     
    Last edited: Nov 22, 2020
    itpro likes this.
  9. cucaulay malkin

    cucaulay malkin Master Guru

    Messages:
    215
    Likes Received:
    103
    GPU:
    1070 Strix
    imo saying that consoles aim for 120 while high-end PCs aim for 60 is wrong.

    in the end it's performance that matters. with a 3070/6800 you can do more than with an Xbox, period. especially with DLSS 2.0 available for RTX cards
     
  10. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,205
    Likes Received:
    2,996
    GPU:
    5700XT+AW@240Hz
    Sure you can. And consoles do aim to deliver some games capable of 120fps (locked), most of them at 1080p, some at 1440p.
    But that statement was meant to show, by contrast, that even consoles balance image quality against frame rate.
    Because some members here declare that 4K at a low and unstable framerate is OK, because pixel density ranks above fluid gameplay. (Paraphrasing here.)
     
    itpro likes this.

  11. itpro

    itpro Master Guru

    Messages:
    676
    Likes Received:
    374
    GPU:
    Radeon Technologies
    5120×1440 almost rivals "4K" though. The G9 and other such monitors are better overall for pure PC gaming now.
     
  12. cucaulay malkin

    cucaulay malkin Master Guru

    Messages:
    215
    Likes Received:
    103
    GPU:
    1070 Strix
    maybe for the way they play it's a true statement.
     
  13. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    6,979
    Likes Received:
    209
    GPU:
    980
    Well, it's double 1440p for the G9, while 4K is 2.25 times, so they are indeed close. I have a G9 myself and it's awesome.

    It is more immersive than a flat screen for sure.
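
    For the curious, the pixel-count arithmetic behind that comparison (a quick illustrative calculation, not part of the original post):

        # Illustrative pixel-count comparison: the G9 (5120x1440) has exactly
        # 2x the pixels of 2560x1440, while 4K UHD (3840x2160) has 2.25x.
        resolutions = {
            "1440p (2560x1440)": 2560 * 1440,
            "G9 (5120x1440)": 5120 * 1440,
            "4K UHD (3840x2160)": 3840 * 2160,
        }
        base = resolutions["1440p (2560x1440)"]
        for name, pixels in resolutions.items():
            print(f"{name}: {pixels:,} pixels, {pixels / base:.2f}x of 1440p")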
     
  14. Undying

    Undying Ancient Guru

    Messages:
    14,186
    Likes Received:
    3,368
    GPU:
    Aorus RX580 XTR 8GB
    It is the only site where Intel CPUs are still beating AMD and where the 6800 XT is slower even at lower resolutions. That "reputable" site is a joke; their opinions mean nothing to me. Btw, a few people liked it, so I'm not the only one. ;)
     
    moo100times likes this.
  15. cucaulay malkin

    cucaulay malkin Master Guru

    Messages:
    215
    Likes Received:
    103
    GPU:
    1070 Strix
    who df cares what cpus you cheer for in a g3d discussion about cards

    I never liked how they test CPUs and RAM for gaming either, but really, it's completely irrelevant here. I don't suspect them of cheating the results, so even if I don't like them, I should accept them. That's called being objective. You can't just s**t on results you don't like.

    [image]

    no, it's not the only site that shows the 3080 ahead at 1440p, not even close to true. in fact, most of the sites I read don't show the 6800 XT winning there.

    computerbase doesn't, pcgh doesn't, purepc doesn't.
    in fact computerbase and pcgh show an even bigger 3080 lead than tpu! is everyone biased in your opinion?
     
    Last edited: Nov 22, 2020

  16. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,507
    Likes Received:
    278
    GPU:
    MSI GTX1070 GamingX
    Fox2232 re: post 126

    That's all theoretical. In my experience it doesn't hold up. I'm on 1440p with a GTX 1070 and my son is on a laptop with a 4K screen and a GTX 1060. It's obvious to me that even when I'm running high details and he's on a mix of low-medium, his 4K screen looks better, and the difference is really obvious. Even though his "total assets" may be lower than mine, the 4K resolution still shows more "detail" than my 1440p can. That's especially noticeable in games with long draw distances. With 4K you also have the flexibility of being able to run 1080p if more performance is needed, and it still looks right.

    I bought my 1440p 144Hz monitor in March 2018, due to my own negative perception of 4K and how demanding it "was". Today, after seeing it in action, my view has totally changed. 4K is just better; there's no way around it, and even with the compromises it's still better. However, I tend to use monitors until they break, so I probably won't change my monitor until around 2025 (I'm estimating it'll last ~7 years), but the pricing is getting better and 4K 144Hz+ is starting to come within reach of what I'm willing to pay. I can't wait to upgrade, tbh; it looks sweet.
     
    AuerX likes this.
  17. KonoSuba

    KonoSuba New Member

    Messages:
    5
    Likes Received:
    1
    GPU:
    VEGA 8
    I have no money for any of the tested options...
     
  18. Mufflore

    Mufflore Ancient Guru

    Messages:
    12,302
    Likes Received:
    1,035
    GPU:
    Aorus 3090 Xtreme
    Two 1440p monitors side by side make 5120x1440 UW, 1.33x the horizontal pixels of UHD.
    I use a CRG9 with my UHD TV also in view centrally, and it's incredibly immersive.
    It can also run at 3840x1080 while looking very good, which makes it accessible to 1080 Ti owners and some lesser cards.
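
    A quick illustrative check of the numbers above (my arithmetic, not the poster's):

        # 5120/3840 horizontal ratio, and how much lighter the 3840x1080 mode is
        # relative to the CRG9's native 5120x1440.
        print(5120 / 3840)                    # ~1.33x the horizontal pixels of UHD
        print((3840 * 1080) / (5120 * 1440))  # 0.5625 -> ~56% of the native pixel load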

    A review, in case it's useful:
    https://www.displayninja.com/samsung-c49rg9-review/
     
    Last edited: Nov 22, 2020
  19. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,205
    Likes Received:
    2,996
    GPU:
    5700XT+AW@240Hz
    Yeah, and games look good on cellphones too. Apples to apples means the same screen size.
    And the entire time we were discussing whether 4K is viable for the games that exist at a GPU's release, and viable going forward. As in: if a person is willing to burn money on the best GPU available, that GPU should carry maximum-detail gaming until the next generation's best releases.
    And that simply never held true in the past, nor does it today. nVidia released great and powerful GPUs, but they already can't handle many existing games.
    So, if you have a 4K screen of about the same size as your 1440p one, put them side by side. Set maximum details on the system using the 1440p screen, and then reduce details on the system using the 4K screen until the fps is equal.

    When fps is equal, temporal perception (quality) is equal, and one can compare actual image quality. Then you can swap the devices around and give 1440p the handicap of the laptop's performance.

    But different screen sizes give a different perception even of frame rate.
     
  20. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,507
    Likes Received:
    278
    GPU:
    MSI GTX1070 GamingX
    Again, your viewpoint is theoretical and suited more to testing purposes. In real life it's irrelevant. Your testing method is more akin to GPU benchmarking than to how things work in actual use.

    As I mentioned, in any game with a decent draw distance the 4K resolution makes a perceptible difference. No matter how detailed you try to make the background, if an object is only made of 100 pixels at 1440p but 225 pixels on the 4K screen, you will notice it straight away. There's no way to get around this. The pixels on the 1440p screen might look nicer "as a whole" under your conditions, but there are simply more of them on the 4K screen.
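
    As a rough illustration of that pixel-count point (an editor's back-of-the-envelope sketch, not from the post): an object covering a fixed fraction of the screen gets (2160/1440)^2 = 2.25x as many pixels at 4K as at 1440p, since both axes scale by 1.5.

        # Back-of-the-envelope: scale a pixel footprint measured at 1440p to
        # another 16:9 resolution; both axes grow by the same factor.
        def pixels_at(height, base_height=1440, base_pixels=100):
            return base_pixels * (height / base_height) ** 2

        print(pixels_at(1440))  # 100.0 at 1440p
        print(pixels_at(2160))  # 225.0 at 4K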

    The negatives of 4K end when it comes to viewing content and productivity; for those who do those things, it's a no-brainer. I'm pretty sure this same argument will present itself with 4K vs 5K, 4K vs 8K, 5K vs 8K, and so on. For me, resolution makes a huge difference, and gaming is only one aspect of that. Thankfully the advantage of a PC is that it's a user-customisable experience and we can tweak to taste. No one expects any of these cards to still offer the best performance beyond 2 years. However, with consoles driving the market, I also don't think we'll need 2x RTX 3090 performance to get playable 4K in 2023+, though I'm sure such cards will exist by then. Within a few years 4K will make sense; it's inevitable. Everything from consoles and movies to TV/monitor manufacturers to streaming is pushing ahead with it.
     
    carnivore likes this.
