Review: Hitman 2016: PC graphics performance benchmarks

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 16, 2016.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,544
    Likes Received:
    18,856
    GPU:
    AMD | NVIDIA
    This week Hitman was updated with the second episode, set in the Mediterranean town of Sapienza. After the update you'll notice that Radeon cards work great with the latest AMD Radeon Software Crimson 16.4.2 driver. The results for GeForce cards (GeForce 364.72 WHQL driver) are pretty dramatic, with DX12 performance even lower than DX11.

    Read the updated content on page 7 of the article.
     
  2. Koniakki

    Koniakki Guest

    Messages:
    2,843
    Likes Received:
    452
    GPU:
    ZOTAC GTX 1080Ti FE
    Just.... Painful....
     
  3. Cave Waverider

    Cave Waverider Ancient Guru

    Messages:
    1,883
    Likes Received:
    667
    GPU:
    ASUS RTX 4090 TUF
    Why is the Titan X performing so much worse than the 980 Ti in Ultra HD DirectX 12? 27 vs. 35 FPS sure does make a difference.
     
  4. Humanoid_1

    Humanoid_1 Guest

    Messages:
    959
    Likes Received:
    66
    GPU:
    MSI RTX 2080 X Trio
    Thanks for the update, Hilbert :)

    They really should have sorted out that delayed start-up for Nvidia users before releasing this, though; they are just going to get it in the neck over it...
     

  5. Singleton99

    Singleton99 Maha Guru

    Messages:
    1,071
    Likes Received:
    125
    GPU:
    Gigabyte 3080 12gb
    I can't take the pain of my 980 Ti's getting their bum kicked. lol
    Good show for AMD
     
  6. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000
    Anyone have any bright ideas that could explain these huge differences in results?

    http://www.pcgameshardware.de/Hitma...Episode-2-Test-Benchmarks-DirectX-12-1193618/

    I have no idea, it's strange. Maybe clocks? It's unclear to me what clocks these cards effectively run at during the test.
     
    Last edited: Apr 30, 2016
  7. fOrTy_7

    fOrTy_7 Guest

    Messages:
    345
    Likes Received:
    36
    GPU:
    N/A
    The other site used the latest drivers for both graphics card types, while Guru3D used the latest official WHQL driver for Nvidia cards (364.72).

    The other site:
    Nvidia - GeForce 364.96 Hotfix
    AMD - Radeon Software 16.4.2 Beta

    The GeForce 364.96 Hotfix driver is rumored to fix performance issues where the GPU could drop into an incorrect P-state (performance state).

    I would like to see an updated review with the game tested on the GeForce 364.96 Hotfix driver for Nvidia cards.
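
    For anyone who wants to check that on their own machine, a minimal sketch (assuming an Nvidia GPU with nvidia-smi on the PATH; the query fields and the one-second interval are just one way to do it) that polls the P-state and graphics clock during a run:

        # Polls the current performance state and graphics clock once per second,
        # so you can spot a card that drops into a low P-state mid-benchmark.
        import subprocess
        import time

        QUERY = ["nvidia-smi",
                 "--query-gpu=pstate,clocks.current.graphics",
                 "--format=csv,noheader"]

        while True:
            out = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
            print(time.strftime("%H:%M:%S"), out)  # e.g. "P0, 1342 MHz"
            time.sleep(1)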
     
  8. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000
    Hilbert, can't we get reported clocks for these tests? I hate having to assume it's ~1200 MHz just because that is what the reference design says.

    The PCGH results also have another oddity, btw: at the top of the DX12 graph it says "Max details ingame",

    but on the actual y-axis it says (lower detail!) for both the Fury and the Ti.
     
  9. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,544
    Likes Received:
    18,856
    GPU:
    AMD | NVIDIA
    We ALWAYS use reference clocks and reference cards. PCGH uses factory-overclocked GPUs. For the 980 Ti that can add upwards of a 20% extra performance benefit.
     
  10. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000
    Thanks for the clarification, but I'm wondering about, for example, the 980 Ti: the reference boost clock is 1076 MHz if I'm not mistaken. Does it actually run at 1076 MHz? I remember it running higher even on the reference design.
     

  11. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,544
    Likes Received:
    18,856
    GPU:
    AMD | NVIDIA
    No, the Nvidia boost clock is a complex thing anno 2016, depending on game load and the many limiters that monitor temperature thresholds, power draw, voltages, clocks, relative clocks and so on.

    However, the overclocked versions, i.e. factory higher-clocked products, do see that same increase on the boost clock. Typically I've seen the 980 Ti reference jump to 1200~1275 MHz in games; however, I have seen the tweaked cards run to 1400~1500 MHz on the boost frequency, as the incremental increase in the clock frequency moves the boost performance upwards as well. Especially the better-cooled products, in combo with an added 150~200 MHz, handle a higher boost really well.

    Hence the better factory-tweaked models can run ~20% faster.
    So the 62 FPS PCGH gets in WQHD for the Superclocked edition they use could be 50 FPS if they had used a reference card/clock. It's a substantial difference.
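
    As a rough sanity check on that estimate, a minimal sketch assuming FPS scales roughly linearly with core clock (a simplification; memory bandwidth and the boost limiters also matter), with the ~1450 and ~1200 MHz boost clocks as assumed values rather than measurements:

        # Estimate FPS at a different core clock by linear scaling (an approximation).
        def scale_fps(fps, clock_from_mhz, clock_to_mhz):
            return fps * clock_to_mhz / clock_from_mhz

        # PCGH's superclocked 980 Ti at WQHD: 62 FPS at an assumed ~1450 MHz boost.
        # Rescaled to an assumed ~1200 MHz reference boost:
        print(round(scale_fps(62, 1450, 1200), 1))  # -> 51.3, close to the 50 FPS figure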
     
  12. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000
    I've personally never experienced my card going over the stock boost no matter what the temperature/TDP is: 1366 is written on the box, and 1366 is what I get.

    Nonetheless I think at this point it would be reasonable to assume it was running at an average of 1200 MHz in your review.

    This would put the super-clocked 1350 MHz card (assuming it was 1350 and not higher) at a ~13% advantage, certainly not enough to explain these results.
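
    The arithmetic behind that ~13%, with both clock values being the assumptions stated above:

        ref_boost_mhz, sc_boost_mhz = 1200, 1350  # both assumed, not measured
        print(f"{(sc_boost_mhz / ref_boost_mhz - 1) * 100:.1f}%")  # -> 12.5%, i.e. "~13%"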
     
  13. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    The total "full" Boost 2.0 clock is ~144-164 MHz above the listed base frequency.

    Kepler GK110B with Boost 2.0, e.g. my old GTX 780: 940 MHz base, 1006 MHz listed boost (real 1084 MHz), i.e. ~144 MHz above base.

    Maxwell with Boost 2.0, e.g. my card: 1178 MHz base, 1279 MHz listed boost (real 1342 MHz), i.e. ~164 MHz above base.

    Now that I've flashed my card to a 1241 MHz base, it boosts to 1405 MHz, still that 164 MHz.
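
    A quick sketch to check that fixed-offset pattern against the numbers above (the 144/164 MHz offsets are observations from these specific cards, not a general rule):

        # Real Boost 2.0 clock = base clock + a fixed per-card offset.
        def real_boost(base_mhz, offset_mhz):
            return base_mhz + offset_mhz

        print(real_boost(940, 144))   # -> 1084, the GTX 780 (Kepler)
        print(real_boost(1178, 164))  # -> 1342, the Maxwell card at stock
        print(real_boost(1241, 164))  # -> 1405, the same card after the 1241 MHz BIOS flash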
     
  14. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000
    It's not so much that the actual boost is higher than advertised, it's that it varies.

    My card will boost to 1390 if I'm not mistaken, if Gigabyte 'OC mode' is enabled from their hideous Guru app, but that also raises the voltage so it goes one bin (13 MHz) up.

    Anyway, my point is *if* the reference Ti was boosting at 1200+, and not 1076 or whatever, it wouldn't account for the disparity between the PCGH results and these. Does anyone who plays the game test its performance across driver sets? Could the delta be in the drivers?
     
  15. Broken Haiku

    Broken Haiku Member

    Messages:
    42
    Likes Received:
    0
    GPU:
    AMD 280x 3GB
    "We have to say that we are incredible dissapointed about the quality control on DX12 from the developper. These things just shouldn't happen. Very sloppy."

    Oh the irony...."dissapointed" and "developper". Very sloppy :^).
     

  16. cowie

    cowie Ancient Guru

    Messages:
    13,276
    Likes Received:
    357
    GPU:
    GTX
    Like Hilbert said,
    even a reference card that runs at, e.g., 1250 MHz can vary in clocks depending on temps and power... the better-cooled custom cards, with a factory BIOS enhanced for more power, can sustain a higher boost clock for more of the time than reference.
    It could mean a big difference in the numbers.
     
  17. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000
    It's not the clocks. PCGH uses their own benchmark run in-game, not the internal benchmark. It seems like real-world performance differs from benchmark performance.
     
  18. pharma

    pharma Ancient Guru

    Messages:
    2,496
    Likes Received:
    1,197
    GPU:
    Asus Strix GTX 1080
    Curious why you did not include updated results using the Nvidia GeForce 364.96 Hotfix driver? In the article the results were updated on March 18 using the Radeon Crimson 16.3.1 Hotfix driver, so it really isn't a question of only using WHQL drivers...
     
  19. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Let's be real here for a moment, and then we can start or stop pointing fingers.

    Hilbert's Results for 1080p DX12:
    Titan X = 78fps; GTX 980 Ti = 76fps.

    PCGHW results for 1080p DX12:
    GTX 980 Ti Superclocked = 79.9fps
    - - - - - - - - - - - - - - - - - - - - -
    Hilbert's Results for 1440p DX12:
    Titan X = 60fps; GTX 980 Ti = 58fps.

    PCGHW results for 1440p DX12:
    GTX 980 Ti Superclocked = 61.9fps
    - - - - - - - - - - - - - - - - - - - - -
    Hilbert's Results for 4k DX12:
    Titan X = [mistakes happen]; GTX 980 Ti = 35fps.

    PCGHW results for 4k DX12:
    GTX 980 Ti Superclocked = 37.1fps
    The difference is not in the GTX 980 Ti results, but in the very different Fury X results: 9~15%. (They claim HQ-AF in the driver. I have many more IQ-improving tweaks, including -5 TexLOD + SSAA to remove the resulting shimmering => the best AF one can get, and AA on top of that.)
    I'll say it openly: "Those clowns at PCGHW borked the Fury X with very bad driver tweaks through registry settings."

    This is my difference between the 3DMark driver standard (left) vs. my improved IQ (right):
    http://www.3dmark.com/compare/fs/8345321/fs/8345401
    1.6% difference in graphics score. (It should be noted that I apparently did not force 3DMark to run with AA, just with a 1-frame pre-render limit, -5 LOD, HQ-AF and a few minor things.)
     
    Last edited: May 1, 2016
  20. Ieldra

    Ieldra Banned

    Messages:
    3,490
    Likes Received:
    0
    GPU:
    GTX 980Ti G1 1500/8000

    You're ignoring the main point, which is the relative performance of the 980 Ti vs the Fury X. Guru3D didn't test Episode 2 in DX11.

    PCGH uses their own benchmark run, not the built-in benchmark. That's the main difference AFAIK. I think driver settings should be completely default when testing; I have no idea what you're on about with those AMD settings.
     
    Last edited: May 1, 2016
