Review: Total War WARHAMMER DirectX 12 PC graphics performance

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 2, 2016.

  1. evilkiller650

    evilkiller650 Ancient Guru

    Messages:
    4,802
    Likes Received:
    0
    GPU:
    Zotac 980ti AMP! Extreme
    Jesus Christ man, again?
    Stop rambling on with your hate and crap. You've been told TW: Warhammer is performing and running great, yet you still can't keep your mouth closed and ramble on about what you feel :3eyes:

    We understand.... you had a bad experience with the devs in the past, but as has been said.... they really improved the engine this time.
     
  2. Moegames

    Moegames Guest

    --------------
    It's not just the Fury that looks glamorous in DX12; just about all of the AMD cards do. I have a gut feeling this is the start of AMD's big comeback in terms of performance of its existing cards, since it's a lot better at DX12 than Nvidia. I always get a giggle or two out of people who still feel, in their subconscious, that they have to downplay AMD's DX12 performance. Numbers don't lie, fellas, unless you spin the crap out of the results.

    More is incoming from other review sites on this game and more upcoming DX12 games. This is one of the newer games that was actually built fully for DX12.

    PS: I wouldn't downplay AMD's new RX 480 either, like a lot of folks want to do. If the RX 480 has good overclocking headroom, it will be within striking distance of the 1070, mark it down! And as time goes by, it'll mature into another nice GPU for two bills.

    :nerd:
     
    Last edited by a moderator: Jun 3, 2016
  3. Moegames

    Moegames Guest

    You're gonna eat a big plate of crow soon enough, dude, I promise you this. And yes, you do sound like a broken record, because you're one of the biggest anti-AMD dudes around here, man.

    AMD does have better DX12 solutions in their graphics cards, and 2016 and beyond will prove it. It's already starting to happen with these betas and some of the newer DX12 games just starting to come out now.
     
  4. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    GOW remake maybe, but TR had DX12 from the start, just disabled in the background. Btw, there is an interesting boost in TR too: Geothermal Valley, with lots of vegetation, gains up to 20fps in min fps; DX11 65fps, DX12 87fps.
     

  5. Aelders

    Aelders Guest

    Messages:
    870
    Likes Received:
    0
    GPU:
    980Ti G1 @ 1490|8000
    Yeah, the AMD GPUs are just higher specced in terms of compute, but saying GCN performs better in DX12 (not saying you said this) or that GCN is better in AotS suggests it would be better even at spec parity. This is not the case. This is what I was more interested in: whether the shader array on the Nvidia architectures would be less utilized compared to GCN. It doesn't appear so.
    I don't think it's weird; it's a company, and this is good for them, I guess. Anyway, there is a lot more money being invested in R&D over at NV. Was it 8 or 9bn on Pascal? It's just the manufacturing costs they recoup; I can't find any satisfying figures on this, but the cost of developing the custom cell libraries Nvidia famously employs must be pretty big as well.

    I love CUDA, I'm sure it shows. CUDA is amazing; there are tons of libraries you can access for free (non-commercial) and it has saved me and my friends tons of time. Friends of mine working on a neural net competition ported their code to TensorFlow with CUDA in one day, and training time went from 8-10 hours to 40-60 minutes. In one day.

    Everyone and their mother uses CUDA in academia.
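
    Just to give a feel for it, here's a minimal sketch of the kind of thing I mean, using the current TensorFlow Python API rather than whatever my friends actually wrote (so treat the exact calls as illustrative): once you have a CUDA-enabled TensorFlow build, pointing the heavy math at the GPU is basically free, and that's where the 8-10 hours vs. 40-60 minutes difference comes from.

        import tensorflow as tf  # assumes a CUDA-enabled TensorFlow build is installed

        # If this prints an empty list, TensorFlow fell back to the CPU and you
        # get the multi-hour training runs instead of the ~1 hour ones.
        print(tf.config.list_physical_devices('GPU'))

        # Ops placed under this scope run on the first CUDA device; the matmul
        # below is dispatched to the GPU instead of the CPU.
        with tf.device('/GPU:0'):
            a = tf.random.normal([4096, 4096])
            b = tf.random.normal([4096, 4096])
            c = tf.matmul(a, b)
        print(c.shape)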

    It would be amazing, much better than having to look for a party online and try to meet people.

    Nobody wants to do evil campaigns :(
     
  6. Illyrian

    Illyrian Guest

    Messages:
    223
    Likes Received:
    13
    GPU:
    Gigabyte 4080 Aero
    Rambling? lol.... those aren't just my opinions; the Total War community as a whole feels they got burnt with Rome 2 and Attila... now all you Warhammer nerds are coming out of the woodwork talking like you know a thing or two about the series.

    You had no previous Total War experience but tried wise-guying me in the other thread, got whipped, and it's clear you're still butthurt about it...
    So yeah, AGAIN, take another loss here and keep it moving, bruh.
     
  7. evilkiller650

    evilkiller650 Ancient Guru

    Messages:
    4,802
    Likes Received:
    0
    GPU:
    Zotac 980ti AMP! Extreme
    You're worryingly delusional bro. You keep rambling that the game sucks and is unplayable. You got whipped. Why would you even think I was "butt-hurt"?
    I and others keep telling you the game is fine but you just can't comprehend that and can't stop yourself from wasting your own time.
    As I said in the other thread, I clearly do not need experience with the other games. You said not to buy this game because you're literally going to get horrible performance and the engine will suck, when..... SURPRISE.... the game runs fine.

    I get it. You're butt-hurt about your past experiences with this series and want to *try* and convince others not to support the devs.
    You just need to deal with the fact that the game is fine and it's the best-selling Total War game so far :)
     
  8. Monchis

    Monchis Guest

    Messages:
    1,303
    Likes Received:
    36
    GPU:
    GTX 950
    Ha, look at how long the GM206 lasted :cussing:
     
  9. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    I hear you about CUDA, but something like that shouldn't be proprietary. OpenCL should solve this once enough libraries are around. I don't believe for a moment that NVIDIA spent 9bn on Pascal. For Maxwell, Pascal, the transition to 16nm AND the packed libraries, along with GameWorks and drivers, then MAYBE. They haven't even had the pure income for that in the last few years. I believe that, as usual, Huang gave an impressive number with a lot of asterisks. That's why I dislike him.

    As for NVIDIA being weird, I mean it in the sense that they are very specialized, really only in a single market. The last company like that was ATi, and it merged with AMD. Between NVIDIA, AMD and Intel, NVIDIA is the least flexible one and the one depending most on a single market. If it didn't achieve excellence and huge profit margins in this single market, it could go away much faster than AMD, which is latched onto many more.
    I'm in for evil campaigns, but my friends and I are spread across four different countries :/

    Please dude, you are not the only person playing Total War games either. Some of us both play Total War AND like Warhammer. And if you know Total War as much as you claim, you know the golden rule for Total War, which is:

    1) Good game
    1a) Mediocre expansion
    2) Bad game
    2a) Good Expansion
    3) Good game
    3a) Mediocre expansion.

    In names these are Empire (bad game), Napoleon (good expansion), Shogun 2 (good game), Rome 2 (bad game), Attila (good expansion), Warhammer (good game).

    This time they even fixed the ****ty engine. What's your SPECIFIC problem with the game?
     
    Last edited: Jun 3, 2016
  10. Bansaku

    Bansaku Guest

    Messages:
    159
    Likes Received:
    7
    GPU:
    Gigabyte RX Vega 64
    Last night I ran the DX12 Ashes of the Singularity benchmark on my twin HD 7950s for the first time and was shocked at their performance vs. the Fury X and 980 Ti (comparing the scores from this site)! I am excited for this game now. I just hope Total War: Warhammer doesn't eat up all 16GB of RAM and crash like AotS does after one benchmark run.

    (speaking of AOTS, I miss that show... :cry: )
     

  11. 0blivious

    0blivious Ancient Guru

    Messages:
    3,301
    Likes Received:
    824
    GPU:
    7800 XT / 5700 XT
    I swear you guys must each own stock in one of these rival companies, because otherwise all the arguing seems really silly, trivial, and completely lacking in productivity.

    Anyways, carry on...
     
  12. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Thank you for this serious, deep and productive comment. And for your permission to carry on.
     
  13. Cyberdyne

    Cyberdyne Guest

    Messages:
    3,580
    Likes Received:
    308
    GPU:
    2080 Ti FTW3 Ultra
    Everything is "Ashes of the Singularity". New GPU? Ashes of the Singularity. New game? Ashes of the Singularity.

    So annoying. You see both AMD and NVIDIA perform about the same in a DX12 game and it's still "Ashes of the Singularity".
    It's one game, and it was fixed by new hardware. We don't have any other game or benchmark that also does this. We don't know if AMD's new cards will still have some sort of inherent advantage when they come out, compared to the 1070/1080 'fix'.

    Ashes of the Singularity itself, as a video game, is such a boring piece of 'MEH'. It's a crap game.

    Then we've got one guy here who went on a "this is what companies do to make money" rant. Another guy goes on a rant about the game's graphics engine.
    And now I'm ranting!
     
  14. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    FFS can we have just one thread that doesn't get run down with the same old AoS arguments?
     
  15. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,754
    Likes Received:
    9,647
    GPU:
    4090@H2O
    No we can't, it's the AMD fanboys' wet dream.




    To get back on topic: an overclocked 980 (which I have, and no, there's no SLI support) running at about 1500MHz gets you below 60fps on the campaign map, with dips to below 10 during the turn end (when the AI takes its turn). On average it shows 59 when I have the ShadowPlay overlay on. But the game runs fine; I have a G-Sync monitor.

    What I see is that I would actually downgrade my fps by going for DX12. So.... it's not a thing to do, as simple as that. DX12 offers nothing in better graphics or performance... no gain, so it's a fail for a well-built machine that happens to have an Nvidia graphics card. Between DX11 and DX12, getting 59 or 60 fps with an overclocked card doesn't make any difference.

    By the way, I'm not surprised this game runs better on AMD, it's in AMD's partner program... :rolleyes:
     

  16. 0blivious

    0blivious Ancient Guru

    Messages:
    3,301
    Likes Received:
    824
    GPU:
    7800 XT / 5700 XT
    No. Starting to reach "Kepler downgrade!" status though.


    Maybe I'm on crack, but it looks like, in HH's charts, a $700 Nvidia card is in the lead pretty significantly, with a $600 AMD card now neck and neck with the upcoming $400 1070. Why is DX12 AoS even a thing anymore?
     
  17. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,551
    Likes Received:
    608
    GPU:
    6800 XT
    My 290X is doing well :p I will get this game later on, when I can get Chaos with the game once again.

    When I am gaming at 1200p I feel no need to upgrade my GPU whatsoever. The benefits ain't there. Well, maybe for Black Desert there would be some.
     
  18. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,754
    Likes Received:
    9,647
    GPU:
    4090@H2O
    You could have gotten Chaos in the first week actually; I was surprised about that, but I just bought it a day before release and got Chaos and the preload.

    Does your CFX work in TW:WH Ryu?
     
  19. eclap

    eclap Banned

    Messages:
    31,468
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    A $379 Nvidia GPU (1070) on par with a $629 AMD GPU (Fury X). Also, the Fury X is only slightly ahead of the cheaper 980 Ti. After overclocking both, the 980 Ti pulls ahead of the Fury X while remaining cheaper.

    I don't see this as an AMD win.
     
  20. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,551
    Likes Received:
    608
    GPU:
    6800 XT
    It worked with earlier games just fine actually; it most likely would have worked with this too, though maybe not in DX12. I didn't get TW:WH because they never sent me the limited edition I pre-ordered, so I decided to wait :)
     
