HDR Benchmarks: performance impact on AMD (-2%) vs Nvidia (-10%) - 12 Games at 4K HDR

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 22, 2018.

  1. nizzen

    nizzen Master Guru

    Messages:
    803
    Likes Received:
    160
    GPU:
    3x2080ti/5700x/1060

    Asus PG27UQ works like this:

    4:4:4 HDR: 98 Hz
    4:2:2 HDR: 120 Hz
    Overclocked 4:2:2 HDR: 144 Hz

    Set 4K, 120 Hz, 8-bit, SDR, RGB for normal usage.
    Set 4K, 98 Hz, 10-bit, HDR, 4:4:4 for HDR usage.


    -----

    At the risk of further complicating things, let's quickly go over the supported refresh rates and settings for SDR and HDR content; a quick bandwidth check after the lists shows where these limits come from. Here are the ones for SDR:
    • 98Hz, RGB (4:4:4), 10-bit color depth
    • 120Hz, RGB (4:4:4), 8-bit color depth
    • 144Hz, YCbCr 4:2:2, 8-bit color depth
    And here are the supported modes and settings for HDR content:
    • 98Hz, RGB (4:4:4), 10-bit color depth
    • 120Hz, YCbCr 4:2:2, 10-bit color depth
    • 144Hz, YCbCr 4:2:2, 10-bit color depth
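    These combinations fall straight out of the display link's bandwidth. A minimal back-of-the-envelope sketch, assuming the PG27UQ's DP 1.4 connection (HBR3 x 4 lanes = 32.4 Gbit/s raw, ~25.92 Gbit/s usable after 8b/10b encoding) and ignoring blanking overhead, so the real cutoffs sit slightly lower:

    # Rough DP 1.4 bandwidth check for the PG27UQ's 4K modes.
    # HBR3 x 4 lanes = 32.4 Gbit/s raw; 8b/10b encoding leaves ~25.92 Gbit/s.
    DP14_DATA_RATE = 25.92e9  # usable bits per second

    def mode_bandwidth(w, h, hz, bits_per_channel, chroma):
        """Approximate bits/s a video mode needs (active pixels only)."""
        # RGB/4:4:4 carries 3 full-resolution channels; 4:2:2 halves the
        # two chroma channels horizontally, leaving 2 effective channels.
        channels = 3 if chroma == "4:4:4" else 2
        return w * h * hz * bits_per_channel * channels

    modes = [
        ("98 Hz  RGB   10-bit",  98, 10, "4:4:4"),
        ("120 Hz RGB   10-bit", 120, 10, "4:4:4"),  # the mode that doesn't fit
        ("120 Hz RGB    8-bit", 120,  8, "4:4:4"),
        ("120 Hz 4:2:2 10-bit", 120, 10, "4:2:2"),
        ("144 Hz 4:2:2 10-bit", 144, 10, "4:2:2"),
    ]
    for name, hz, bpc, chroma in modes:
        bw = mode_bandwidth(3840, 2160, hz, bpc, chroma)
        verdict = "fits" if bw <= DP14_DATA_RATE else "exceeds DP 1.4"
        print(f"{name}: {bw / 1e9:5.2f} Gbit/s ({verdict})")

    4K RGB 10-bit at 120 Hz needs roughly 29.9 Gbit/s, which is why full-chroma HDR tops out at 98 Hz, while 4:2:2 subsampling squeezes 120-144 Hz under the cap.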
     
  2. Pimpiklem

    Pimpiklem Member Guru

    Messages:
    162
    Likes Received:
    49
    GPU:
    amd
    I can tell when people don't own HDR.
    I do, and I'm loving it.
    Right now my settings say:

    RGB 4:4:4, 10-bit, 144 Hz, 32", 1 ms, 600 nits, FreeSync 2.
    Sorry Nizzen, your reference is flawed.
     
  3. nizzen

    nizzen Master Guru

    Messages:
    803
    Likes Received:
    160
    GPU:
    3x2080ti/5700x/1060
    Why so toxic? Give the right answer then...
     
  4. Denial

    Denial Ancient Guru

    Messages:
    12,417
    Likes Received:
    1,657
    GPU:
    EVGA 1080Ti
    You're both right. Nizzen is correct for 4K monitors - the EX3203R that I assume Pimpiklem is using isn't a 4K monitor.
     

  5. Pimpiklem

    Pimpiklem Member Guru

    Messages:
    162
    Likes Received:
    49
    GPU:
    amd
    Yeah mate, I'm using WQHD.
    I find 4K for gaming kind of idiotic, just pointless.
    Great for productivity, but for gaming it just seems overkill.
    Hell, we haven't even saturated 1920x1080 yet, because no game looks as good as an HD movie yet.
    http://www.guru3d.com/articles-pages/samsung-c32hg70-freesync-2-hdr-monitor-review,1.html
    Not bad for $500.
    I have a Vega 64 to run it, so I'm happy.
    I'm good for the next 3 years; I'll wait until I can double my performance again.
    I don't do these 25% jumps Nvidia pushes out, I wait for 100%.

    I swapped out the stand for one I already had.
    I wasn't impressed with how far the stand sits from the wall.
    It was like CRT all over again, so I swapped it.


    And sorry about the grammar, I've edited this like 20 times.
     
  6. Stefem

    Stefem Member

    Messages:
    40
    Likes Received:
    4
    GPU:
    16
    Yep, these are due to restrictions imposed by the current DP and HDMI standards.
     
  7. fOrTy_7

    fOrTy_7 Master Guru

    Messages:
    344
    Likes Received:
    36
    GPU:
    N/A
    That's the biggest nonsense I have read on these forums in a long time.

    Thanks for sharing the HDR benchmark results, though.
    Guru3d.com is getting less and less reliable with every post they make.
    They post way too much gossip and unconfirmed information.
     
  8. RooiKreef

    RooiKreef Master Guru

    Messages:
    374
    Likes Received:
    42
    GPU:
    Vega 56 Red Devil
    Looking at the charts, it's clear that both AMD and Nvidia do get performance drops from 4K HDR. It must be the way the game developers implement the HDR effect that correlates with the performance impact. A driver update can improve performance, I'd say, but I'm thinking it's more that the developers must improve the way they implement HDR effects.
    Why is it that some game engines don't impact performance and others do? That points towards the developers more so than AMD or Nvidia.
     
  9. Pimpiklem

    Pimpiklem Member Guru

    Messages:
    162
    Likes Received:
    49
    GPU:
    amd
    It may sound like nonsense to you because you haven't thought about it.
    Well, you tell me of a game that looks more photorealistic than any 1920x1080 movie.
    There isn't one, because we are nowhere close to hitting that realism.
    Increasing the screen res is trying to run before learning how to walk.

     
  10. Denial

    Denial Ancient Guru

    Messages:
    12,417
    Likes Received:
    1,657
    GPU:
    EVGA 1080Ti
    Ignoring the fact that there are thousands of games not attempting photorealistic graphics or the fact that people watch 4K movies on their monitors.. that statement holds true for 720p and arguably 480p in games. So why are you even gaming on a QHD monitor? We haven't maxed 720p to photorealism yet - you better downgrade.
     

  11. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    14,726
    Likes Received:
    1,317
    GPU:
    RTX 2070FE
    The big difference between the two is response time and refresh rate. I have the LG 43UJ6300; when I bought it, I paid $600. The internal refresh rate is 60 Hz, but with post-processing frame interpolation it outputs 120 Hz. This is noticeable, though, and kind of makes the response time seem a little worse... The problem is, the response time is 14 ms. Right now LG offers this TV for $349 as a special, but normal pricing is $499.

    I'm not going to include monitors that hit 120 Hz, as the 120 Hz feature on the LG TV I have isn't real, and I'm not going to factor in size either.
    https://www.newegg.com/Product/Product.aspx?Item=N82E16824011158

    $421.46 for this; specs aside from audio are very comparable to the LG TV I listed earlier, with a response time of 4 ms. Jump up to 144 Hz (and realistically there's no hardware out right now that can run 4K 144 Hz HDR in current games) and you get the big price jump to $2,000 and only... 2 or 3 choices?

    Point is, the panels you get from a TV can be really nice: very good quality, nice picture, but response time is the limiting factor. Coming from my Hanspree HT 231 1080p monitor with a 5 ms response time and 75 Hz refresh rate, the resolution is a lot lower, but I definitely notice a huge difference in input lag playing between the two.

    Correction: my monitor is 60 Hz. I have it at 65 Hz just for giggles lmao.
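    To put rough numbers on the response-time point, here's a small sketch comparing the transition times quoted above to the frame interval; grey-to-grey figures vary per transition in practice, so treat it as an illustration only:

    # Share of each refresh interval spent transitioning, using the
    # response times quoted above (14 ms TV vs 4 ms monitor).
    for label, hz, response_ms in [("LG TV", 60, 14.0),
                                   ("4 ms monitor", 60, 4.0),
                                   ("4 ms monitor", 144, 4.0)]:
        frame_ms = 1000.0 / hz
        print(f"{label} at {hz} Hz: {frame_ms:.1f} ms frame, "
              f"{response_ms:.0f} ms response "
              f"({response_ms / frame_ms:.0%} in transition)")

    At 60 Hz the TV spends roughly 84% of every frame mid-transition versus about 24% for the 4 ms panel, which lines up with the smearing difference described above.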
     
  12. Eastcoasthandle

    Eastcoasthandle Ancient Guru

    Messages:
    2,190
    Likes Received:
    233
    GPU:
    Nitro 5700 XT

    Edit:
    Never mind, I read your earlier post.
     
  13. X7007

    X7007 Maha Guru

    Messages:
    1,451
    Likes Received:
    10
    GPU:
    Inno3D 1080GTX
    I would prefer an OLED TV (E6 55") over an Nvidia G-Sync IPS panel. I can't stand IPS anymore, or anything other than OLED; the colors and the blacks are unreal. Also, the 3D is the biggest bonus, as I can play any game in 3D using ReShade + the SuperDepth3D shader. The quality is top and the 3D is top. I also bought a DreamScreen to go with my set, as it gives matching light all around the TV: top, bottom and sides.

    These are the things you need if you want to play in 3D:

    https://reshade.me/
    https://github.com/BlueSkyDefender/Depth3D

    This is the 4K light support; I bought it because I had a Philips and couldn't do without it. It adds so much:
    https://www.dreamscreentv.com/

    I have G-Sync at 100 Hz on my Asus G751JT laptop, and I wouldn't in any way prefer G-Sync IPS over OLED 3D any time soon.

    I could go for the C7 series with 100 Hz and low input lag (I think it's 21.3 ms), but I would lose the 3D.
     
  14. fOrTy_7

    fOrTy_7 Master Guru

    Messages:
    344
    Likes Received:
    36
    GPU:
    N/A
    You're comparing apples to oranges. To be precise, you're comparing image quality, which is a strictly technical factor, with artistic expression and the interpretation of it.

    To put it simply: if you play the same movie at different resolutions (and therefore different bitrates), you will see quite a difference, assuming the source material was high resolution, e.g. 4K.

    It's like TVs going from SD to Full HD and then to 4K. At a higher resolution you can see much more detail, since there are more pixels per inch over a similar viewable area (screen). The same goes for games; the visual jump between resolutions is less apparent, but you still get more detail on screen and less aliasing.

    Just try playing a random YouTube video at different resolutions: 480p (SD), 1080p (Full HD), 2160p (4K);
    start at 0:50.
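    For concreteness, here's the pixels-per-inch point as a tiny sketch (the 32-inch diagonal is just an example size):

    # Pixel density of common resolutions on the same 32" panel:
    # same viewable area, more pixels per inch as the resolution rises.
    import math

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch along the panel diagonal."""
        return math.hypot(width_px, height_px) / diagonal_in

    for name, w, h in [("480p",   854,  480),
                       ("1080p", 1920, 1080),
                       ("2160p", 3840, 2160)]:
        print(f'{name}: {ppi(w, h, 32):6.1f} PPI on a 32" screen')

    That works out to roughly 31, 69 and 138 PPI: about a 4.5x jump in linear detail going from 480p to 2160p on the same screen.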




    As for photorealism, we're not as far away as one might think.








    I find the Siren girl rather creepy, though. :D
     
  15. Pimpiklem

    Pimpiklem Member Guru

    Messages:
    162
    Likes Received:
    49
    GPU:
    amd
    OK mate,
    now please link me a game.

    Thanks for helping me prove my point.
     

  16. Keesberenburg

    Keesberenburg Master Guru

    Messages:
    851
    Likes Received:
    24
    GPU:
    EVGA GTX 980 TI sc
    The GTX 1080 Ti doesn't get the performance hit when HDR is enabled, not with 4:2:2 and not with RGB. The FPS is the same.
     
  17. Kolt

    Kolt Ancient Guru

    Messages:
    1,648
    Likes Received:
    476
    GPU:
    RTX 2080 OC
    https://www.benoitdereau.com/


    You can download it and walk around an apartment that is pretty photorealistic. It's not technically a game, but what is stopping anyone from using this tech and design in games? (I'll give you the most likely answer... weak consoles!?)
     
  18. Monchis

    Monchis Maha Guru

    Messages:
    1,304
    Likes Received:
    36
    GPU:
    GTX 950
    I don't know what kind of testing you did, but everybody likes the 1000-nit monitors so much more.
     
  19. warlord

    warlord Ancient Guru

    Messages:
    2,460
    Likes Received:
    812
    GPU:
    Null
    No one likes being blinded by brightness; it's only e-peen to justify having the most expensive TV or monitor. Yep, I agree: so much more money for nothing. Even mobile HDR annoys me when it goes fully bright. I can't imagine more than 1000 nits at two meters on a 65". It shouts at me: get a life, lazy boy, or visit a doc.
     
  20. alanm

    alanm Ancient Guru

    Messages:
    8,982
    Likes Received:
    1,328
    GPU:
    Asus 2080 Dual OC
    I think HDR 1000 is more suited to TVs viewed from a distance, not up-close monitors. That won't stop monitor makers from pushing it as the next big thing; they will milk whatever technology to whomever they can, whether it's good for them or not.
     
