Review: Total War WARHAMMER DirectX 12 PC graphics performance

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 2, 2016.

  1. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,430
    Likes Received:
    2,771
    GPU:
    MSI 6800 "Vanilla"
    There are a couple of things in the config file as well, but I don't think any of them are fully implemented for this version of the "Total War" engine (I forget its actual name).

    Depth of Field quality can go beyond on/off and be set to a "high" setting.

    Tessellation can be enabled, but if I remember correctly only Shogun 2 used it
    on units; the later games carried over the setting but it had no effect.
    (You're unlikely to be an effective general in the cinematic camera view or zoomed in on the unit models' backsides, after all, and from up above controlling the battle, tessellation won't really be noticeable.)

    Alpha-blending / OIT / Order Independent Transparency can be enabled; it was used in Rome 2, where it pretty much killed the framerate unless you used the alternate method, which relied on Intel CPUs with integrated graphics.

    There's a setting for blood, but as with the last three (I think) Total War games, it's likely to be a separately sold DLC, for ratings reasons or some such nonsense.
    (Although even without actual blood and dismemberment, some unfortunate unit still gets his day totally ruined. - http://steamcommunity.com/sharedfiles/filedetails/?id=394151276 for Attila - http://steamcommunity.com/sharedfiles/filedetails/?id=176036992 for Rome 2 )

    There is a setting for AA where a value of 1 should be SMAA, as opposed to 0 (off) and 2 (MLAA), but it seems this game has done away with SMAA support entirely in favor of the (in my opinion inferior) MLAA method.
    (Or was it FXAA and not SMAA? I can't entirely remember, and FXAA is still pretty popular for its low performance cost.)
     
    Last edited: Jun 2, 2016
  2. Clouseau

    Clouseau Ancient Guru

    Messages:
    2,639
    Likes Received:
    375
    GPU:
    ASUS STRIX GTX 1080
    The main point I took away from the CPU scaling was that IPC is still more important than the number of "cores" once the 8370 was included. It was interesting to see that there is a point of diminishing returns between 1080p and 1440p once the performance floor was surpassed. Does not seem like Bulldozer was on the ground floor. Good thing Zen is coming.
     
  3. vazup

    vazup Master Guru

    Messages:
    302
    Likes Received:
    7
    GPU:
    r9 280X
    Are you saying that AMD's high end should not be compared with Nvidia's? In any case, the regular Fury smashes the 980 in this benchmark too.
     
    Last edited: Jun 2, 2016
  4. JAMVA

    JAMVA Master Guru

    Messages:
    244
    Likes Received:
    180
    GPU:
    RYZEN 5800X :)
    I was just about to say that ;) I own a 780, but that R9 390 has been a cracking card :)

    Also nice to see AMD starting to get DX12 frametimes as good and as smooth as they have been on DX11; they were struggling a tiny bit in Hilbert's other DX12 tests.

    I'm really looking forward to Polaris & Vega. AMD seem to support their customers very well, while Nvidia's support has been very poor when it comes to newer game optimizations for their older products.
     

  5. Aelders

    Aelders Banned

    Messages:
    870
    Likes Received:
    0
    GPU:
    980Ti G1 @ 1490|8000
    No. He isn't. He just got confused.

    There are two Vega GPUs, four SKUs.

    "big vega" will be go against Titan/Ti
    "little vega" will go agaisnt 1080/1070

    Fury is faster by more than 20%!

    Interesting number, that, 20%... 20% is also an average OC on a 980. I wonder how a 980 @ 1500MHz would do in this test.
     
    Last edited: Jun 2, 2016
  6. eclap

    eclap Banned

    Messages:
    31,497
    Likes Received:
    3
    GPU:
    Palit GR 1080 2000/11000
    It performs pretty much identically to the much cheaper 1070. At least that's what I'm seeing. And it only just outperforms the cheaper 980 Ti. Then there's the MLAA thing favoring AMD. Nvidia looks fine in this chart. I can't see any advantage for AMD at all. Nvidia performs the same and is cheaper (both the 1070 and the 980 Ti).

    Not to mention overclocking; factor that in and it's a clear win for Nvidia.

    On lower-end cards, though, AMD seem to have an advantage.
     
    Last edited: Jun 2, 2016
  7. Dazz

    Dazz Master Guru

    Messages:
    902
    Likes Received:
    97
    GPU:
    ASUS STRIX RTX 2080
    But like everything else they can be overclocked too; I think having a set baseline of default clocks makes sense, and then if you overclock, well, more performance. The biggest performance killer is depth of field. Not sure what the unlimited video memory option is used for, but I have an R9 290 so I'm limited to 4GB of RAM. I wonder if unlimited video memory is for caching or something. Maybe the 8GB cards will pull away with that enabled?
     
  8. Denial

    Denial Ancient Guru

    Messages:
    13,232
    Likes Received:
    2,720
    GPU:
    EVGA RTX 3080
    I didn't get confused; I know there will be two Vega GPUs, but when you use "Vega" like that and don't specify, I think it's pretty clear that you're either referring to the larger of the two or don't know two exist in the first place.

    I'm not saying AMD's high end should not be compared to Nvidia's, but Nvidia's high end isn't GP104, it's GP102, which will probably be out around the same time Vega is. Just because AMD isn't currently competing against GP104 doesn't magically make GP104 the best Nvidia has to offer with Pascal.
     
  9. chinobino

    chinobino Maha Guru

    Messages:
    1,078
    Likes Received:
    43
    GPU:
    GTX980Ti Lightning
    "Hitman is a cache what you can continuously type of title"

    Not sure what happened there but I think it might need to be fixed?

    Great article otherwise, very interesting AMD CPU results.
     
  10. Aelders

    Aelders Banned

    Messages:
    870
    Likes Received:
    0
    GPU:
    980Ti G1 @ 1490|8000
    I wouldn't call GP102 high-end, simply because then you'd have to call GP104 midrange.

    Let's call GP102 enthusiast :p

    I'm not sure what you mean there. My point is that a stock 980 runs at around 1220MHz (reference), and it can comfortably hit 1500+.

    A stock Fury runs at 1.05GHz, and you're lucky if you're stable at 1150.

    ~10% vs ~25%
     

  11. Dazz

    Dazz Master Guru

    Messages:
    902
    Likes Received:
    97
    GPU:
    ASUS STRIX RTX 2080
    What I mean is, I think HH was a little stretched for time to play about with tweaking every single graphics card, and although yours may hit 1500MHz, that does not mean they all will, so leaving them at a baseline is the best thing to do.
     
  12. Denial

    Denial Ancient Guru

    Messages:
    13,232
    Likes Received:
    2,720
    GPU:
    EVGA RTX 3080
    What happens when your baseline is more expensive and worse performing? lol

    I feel like the review process needs to change for the 1080, or we just punish Nvidia for coming up with it in the first place.
     
  13. Aelders

    Aelders Banned

    Messages:
    870
    Likes Received:
    0
    GPU:
    980Ti G1 @ 1490|8000
    I never suggested HH does that. I don't have a 980. Most REFERENCE 980s in reviews hit approximately 1500MHz, so it seems like a good OC baseline.

    The fact is, when you see reviews citing 980 performance they're talking reference, and you can squeeze another 20-25% out of them. I'm not saying Hilbert should test every card OC'd, but readers should be aware of this.
     
  14. Dazz

    Dazz Master Guru

    Messages:
    902
    Likes Received:
    97
    GPU:
    ASUS STRIX RTX 2080
    Well, in fairness, they want to make their new product look good; they don't want the 980 Ti @ 1.4GHz catching up to their shiny new 1080, which makes people think, well, why the hell would I buy that?

    But again, not everyone overclocks; that's why I said if you know what you get at baseline, and you overclock and get more, then fair enough. A lot of people will check benchmarks for games, and I bet a good number of them have never even overclocked a video card. If you overclock and they don't, then they think, why is my card so slow? No one will complain if theirs is faster, see my point?

    So anyway, assuming the ideal case where performance increases by exactly the percentage of the overclock, the 980 would go from 51 to about 63 fps while the Fury would go from 63 to about 69 fps at 1440p.
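As a back-of-the-envelope sketch of that ideal case, here is the arithmetic, assuming fps scales perfectly linearly with core clock (real games rarely do) and using the reference/OC clock figures quoted earlier in the thread:

```python
def oc_fps(stock_fps: float, stock_mhz: float, oc_mhz: float) -> float:
    """Estimate overclocked fps assuming perfectly linear clock scaling."""
    return stock_fps * oc_mhz / stock_mhz

# GTX 980: 51 fps at the ~1220 MHz reference clock, overclocked to ~1500 MHz
print(round(oc_fps(51, 1220, 1500)))  # -> 63

# Fury: 63 fps at the 1050 MHz stock clock, overclocked to ~1150 MHz
print(round(oc_fps(63, 1050, 1150)))  # -> 69
```

In practice scaling is sub-linear (memory bandwidth, CPU limits, boost behaviour), so treat these as upper bounds.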
     
    Last edited: Jun 2, 2016
  15. Caesar

    Caesar Maha Guru

    Messages:
    1,231
    Likes Received:
    471
    GPU:
    GTX 1070Ti Titanium
    Hmmmm..... AMD Fury and DX12 seem to be "glamorous".

    Forthcoming AMD GPUs appear to be cost-effectively elegant... Waiting....for....
     

  16. eclap

    eclap Banned

    Messages:
    31,497
    Likes Received:
    3
    GPU:
    Palit GR 1080 2000/11000
    I think most people who invest $500+ in a GPU will be enthusiasts and therefore overclock. I have overclocked every single GPU I've owned since my 3dfx Voodoo. I think most do.
     
  17. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,006
    Likes Received:
    237
    GPU:
    Guru3d GTX 980 G1
    I don't even know what to think of these DX12 benchmark tests. For a start, they all seem to favor AMD graphics cards, which means god only knows what Nvidia cards are really capable of.

    We have to remember that HH only uses default clocks on reference cards. I doubt anyone here even runs a reference card.

    What I do know is that my 980 G1 will outscore a reference card by as much as 20% in some tests/benchmark results.

    If HH does a test like this, I usually get about an extra 10 fps on top of his reference results.

    He's getting 72 fps on a reference 980 @ 1080p, which for all we know is crippled by the AMD-partnered game engine. I'd get 80+ fps on my G1 980, which for me is more than enough to play an AMD DX12 game from 2016, so I'm actually very happy with that and hope to see some actual Nvidia-favored DX12 games in the near future.
     
  18. Horus-Anhur

    Horus-Anhur Master Guru

    Messages:
    357
    Likes Received:
    204
    GPU:
    GTX 1070
    OMG! Just look at my R9 390 beating the GTX 980...
    Finally we see DX12 bring out the full power of GCN, and it's amazing.
     
  19. Dazz

    Dazz Master Guru

    Messages:
    902
    Likes Received:
    97
    GPU:
    ASUS STRIX RTX 2080
    See, I am not so sure about that, as they only partnered up with AMD less than two months before release, so to me that's too far gone. The main reason AMD is performing so well in DX12 titles is the massive driver overhead that has held AMD back in DX11 on an otherwise sound architecture. The fact is, Nvidia has been competing against crippled products because they had the advantage of well-optimised drivers to get the most out of their own architecture.
     
  20. jbmcmillan

    jbmcmillan Ancient Guru

    Messages:
    2,763
    Likes Received:
    275
    GPU:
    Gigabyte G1 GTX970
    Amen, brother, lol. Ordinary Joes pay 250, maybe 300 on a good day.
     
