HDR Benchmarks perf impact on AMD (-2%) vs Nvidia (-10%) - 12 Games at 4K HDR

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 22, 2018.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,544
    Likes Received:
    18,856
    GPU:
    AMD | NVIDIA
  2. Keesberenburg

    Keesberenburg Master Guru

    Messages:
    886
    Likes Received:
    45
    GPU:
    EVGA GTX 980 TI sc
    1440p AC Origins HDR with the GTX 1080 Ti: -2 fps.
     
  3. robintson

    robintson Guest

    Messages:
    423
    Likes Received:
    114
    GPU:
    Asus_Strix2080Ti_OC
    This looks like a bench made to prove something in favor of AMD. Anyway, the fact is that the GTX 1080 is not the green team's flagship graphics card, while the RX Vega 64 is the red team's flagship. If we want to be objective, then let's run SDR/HDR tests of the 1080 Ti vs the RX Vega 64.
    The point of these tests is just to suggest that the RX Vega 64 is a better card than the 1080, which is simply not true at all IMO. The Vega 64 has a 295 W TDP while the 1080 has a 180 W TDP, roughly 64% higher, and the GTX 1080 still overclocks much more than the RX Vega 64.
    I would be very happy to see AMD beating Nvidia on power consumption, meaning much less power-hungry and much higher-performance graphics cards from AMD. More efficiency from AMD is the name of the game.
     
  4. Romulus_ut3

    Romulus_ut3 Master Guru

    Messages:
    780
    Likes Received:
    252
    GPU:
    NITRO+ RX5700 XT 8G

    I think you've missed the point of these benchmarks. This test isn't meant to determine which is the better card, but rather to showcase the performance penalty the RX Vega 64 and the GTX 1080 each suffer when HDR is enabled.

    It looks to me as if something within nvidia's driver is amiss, and they'll have this sorted out. There's absolutely no favorites or bias here. The only people who would find these results offensive are fanboys.
     
    Last edited: Jul 26, 2018
    airbud7, Embra and Duke Nil like this.

  5. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
    The GTX 1080 and Vega 64 are on the same performance level in general, and the benchmark is made to show the discrepancy between HDR and SDR.
    AMD never positioned the Vega 64 as a "flagship" product (it's a compute chip, after all) to compete with the likes of the Titan and the Ti.

    As for power consumption, spare us. A 1630 MHz factory-overclocked Vega 64 Nitro+ never goes over 276 W. A GTX 1080 Ti Xtreme (and all the 2012 factory-overclocked models) burns north of 330 W constantly. (I have both.)
    And when was the last time you checked the massive power consumption of an overclocked GTX 1080? These things do not consume power linearly; the draw rises sharply above roughly the 1780 MHz mark. A watercooled GTX 1080 Armor at 2190 MHz core was burning 325 W.
    For the same performance level, the Vega 64 has to run at 1724 MHz, burning the same amount of power (330 W).
     
    Last edited: Jul 22, 2018
  6. Brit90

    Brit90 Member Guru

    Messages:
    124
    Likes Received:
    53
    GPU:
    R390X 8GB
    What you said has absolutely nothing to do with what is shown in these graphs. The FACTS show that AMD's performance is a lot better in these tests than Nvidia's. This has nothing to do with the 1080 not being top of the range. By your miscalculation, you're saying that if the top-end card had been in play, HDR would have made no difference, and those who bought lesser cards can just suffer.

    Fact: Nvidia's drivers right now suck. Hopefully Nvidia can fix it with a driver update; all the people who bought premium G-Sync monitors may be a bit upset that they get this big loss, and will be praying that a driver fixes it.
     
  7. Pimpiklem

    Pimpiklem Guest

    Messages:
    162
    Likes Received:
    49
    GPU:
    amd
    I can confirm zero performance impact in Far Cry 5 with HDR on a Vega 64.
    I think the fact that Nvidia's drivers default to limited colour range somehow has something to do with this result.
    They hide something, and HDR reveals it.
    Just a guess, but maybe it has something to do with the fact that Nvidia has to drop chroma info in HDR.
    Not sure, but it's interesting.
     
    Last edited: Jul 22, 2018
  8. Pimpiklem

    Pimpiklem Guest

    Messages:
    162
    Likes Received:
    49
    GPU:
    amd
    TDP...
    If you drop the clock speeds on a Vega 64 by just 5% and compensate by overclocking the HBM2, you will slash power in half.
    I'm not talking about just an undervolt; I'm talking about dropping the clock speed as well.
    I'm not exaggerating: you slash power in half.
    AMD set their clocks too high, in the red zone; the card gains huge amounts of efficiency when you drop the clock speeds.
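
    A rough back-of-envelope sketch of why a small clock drop can pay off so heavily (illustrative only; it assumes dynamic power scales roughly as P ∝ f·V², and the clock and voltage figures are hypothetical examples, not measured Vega 64 values):

        # Illustrative only: dynamic power scales roughly with frequency * voltage^2.
        # Clocks and voltages below are hypothetical examples, not measured Vega 64 figures.
        def dynamic_power(freq_mhz: float, voltage: float) -> float:
            """Relative dynamic power estimate, P ~ f * V^2 (arbitrary units)."""
            return freq_mhz * voltage ** 2

        stock = dynamic_power(freq_mhz=1630, voltage=1.20)         # stock-ish clock at a high voltage
        tuned = dynamic_power(freq_mhz=1630 * 0.95, voltage=0.95)  # ~5% lower clock, much lower voltage
        print(f"relative power after tuning: {tuned / stock:.2f}x")  # ~0.60x with these numbers

    The big saving comes from the voltage drop that the lower clock allows, not from the 5% clock reduction itself; an actual halving would need an even larger voltage cut than assumed here.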
     
    Last edited: Jul 22, 2018
  9. Texter

    Texter Guest

    Messages:
    3,275
    Likes Received:
    332
    GPU:
    Club3d GF6800GT 256MB AGP
    Maybe nVidia's memory compression methods don't work very well with bright pixels, so the 256-bit memory bus on that 1080 returns to what it actually is, i.e. $250 video card material. Just beating around the bush, of course...
     
    cowie likes this.
  10. Keesberenburg

    Keesberenburg Master Guru

    Messages:
    886
    Likes Received:
    45
    GPU:
    EVGA GTX 980 TI sc
    Same settings? Is Nvidia's HDR 10-bit full range?
     

  11. Corbus

    Corbus Ancient Guru

    Messages:
    2,469
    Likes Received:
    75
    GPU:
    Moist 6900 XT
    Vega 64 is better than I remember.
     
  12. Pimpiklem

    Pimpiklem Guest

    Messages:
    162
    Likes Received:
    49
    GPU:
    amd
    I wonder if Vulkan will ever have HDR.
    Just keep in mind, peeps:
    if you buy an HDR screen, avoid anything higher than 600 nits unless it's a TV.
    Honestly, 600 nits is blinding from 2 feet away; anything more is folly.
     
  13. Krzysztof

    Krzysztof Member

    Messages:
    29
    Likes Received:
    5
    GPU:
    MSI RX VEGA56
    Maybe the answer is easy: Vega at 12.6 TFLOPS vs GTX 1080 Ti at 10.6 TFLOPS?
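
    For reference, those peak figures come straight from shader count × 2 FLOPs per FMA × clock; a quick sketch (using the advertised reference clocks, which is an assumption on my part, since actual in-game clocks vary):

        # Peak FP32 throughput = shaders * 2 FLOPs per clock (one FMA) * clock rate.
        # Reference boost clocks; real sustained clocks depend on the card and workload.
        def peak_tflops(shaders: int, clock_mhz: float) -> float:
            return shaders * 2 * clock_mhz * 1e6 / 1e12

        print(f"Vega 64     : {peak_tflops(4096, 1546):.1f} TFLOPS")  # ~12.7
        print(f"GTX 1080    : {peak_tflops(2560, 1733):.1f} TFLOPS")  # ~8.9
        print(f"GTX 1080 Ti : {peak_tflops(3584, 1582):.1f} TFLOPS")  # ~11.3

    Raw TFLOPS alone wouldn't explain why enabling HDR costs one card a larger share of its performance than the other, though.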
     
  14. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,518
    Likes Received:
    2,361
    GPU:
    Nvidia 4070 FE
    If they had used a 1080 Ti instead, I assume you would have been here posting that the whole point of the test was to make Nvidia look better and AMD worse. Right?
     
  15. Ricepudding

    Ricepudding Master Guru

    Messages:
    872
    Likes Received:
    279
    GPU:
    RTX 4090
    HH, any chance you can run your own tests that are more reliable?

    Last time I checked, HDR should have little to no performance hit; it's just colour reproduction, right?
     

  16. Pimpiklem

    Pimpiklem Guest

    Messages:
    162
    Likes Received:
    49
    GPU:
    amd
    This is how I see things in my mind's eye, from my observations.
    Do you disagree?

    https://s22.**********/l1obdbl35/powerr.jpg
     
  17. nizzen

    nizzen Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,157
    GPU:
    3x3090/3060ti/2080t
    Flawed test.

    The performance loss was due to the way NVIDIA converts RGB to reduced-chroma YCbCr 4:2:2 in HDR, which is what was used for their HDR test. If they had used 4:4:4 or RGB, the performance would be the same as in SDR.
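
    For context, the bandwidth arithmetic behind that conversion (a minimal sketch of active-pixel rates only, ignoring blanking; the ~14.4 Gbit/s HDMI 2.0 figure is the commonly quoted effective data rate):

        # Video signal bandwidth at 4K, 10-bit, 60 Hz for different pixel formats.
        # RGB / YCbCr 4:4:4 carry 3 samples per pixel; 4:2:2 halves the two chroma
        # channels horizontally, so it averages 2 samples per pixel.
        WIDTH, HEIGHT, REFRESH, BITS = 3840, 2160, 60, 10

        def gbit_per_s(samples_per_pixel: float) -> float:
            return WIDTH * HEIGHT * REFRESH * samples_per_pixel * BITS / 1e9

        print(f"RGB / 4:4:4 : {gbit_per_s(3):.1f} Gbit/s")  # ~14.9, over HDMI 2.0's ~14.4
        print(f"YCbCr 4:2:2 : {gbit_per_s(2):.1f} Gbit/s")  # ~10.0, fits comfortably

    That squeeze is presumably why the driver falls back to 4:2:2 for 4K60 HDR over HDMI 2.0 in the first place.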

    Nice to know.
     
    DxVinxGado likes this.
  18. cowie

    cowie Ancient Guru

    Messages:
    13,276
    Likes Received:
    357
    GPU:
    GTX
    Could be, rabbit, could be.
    If it comes down to bus size anyway, they just love cutting down the bus width to save money. It really pissed me off when they put out the 7600 GT to replace the 6800 with a 128-bit bus, but it doesn't hurt as much until the resolution goes up; then they crap out, lol.
    Can any 1080 Ti users say different on the performance hit?

    Could be, rabbit, could be... just a setting?
     
    Last edited: Jul 22, 2018
  19. DxVinxGado

    DxVinxGado Active Member

    Messages:
    62
    Likes Received:
    17
    GPU:
    Zotac RTX 3090 Ti
    EXACTLY!

     
  20. DxVinxGado

    DxVinxGado Active Member

    Messages:
    62
    Likes Received:
    17
    GPU:
    Zotac RTX 3090 Ti
    Honestly, this is hardly front page news. At the very least, Hilbert should update the post to reflect the oversight in Computerbase's testing methods.
     
