The Witcher III: Wild Hunt VGA performance review

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 20, 2015.

  1. Yecnot

    Yecnot Master Guru

    Messages:
    856
    Likes Received:
    0
    GPU:
    R9 270 2GB
So give it to me straight: is it logical for the 960 to be faster than the 780 (as if Nvidia pulled some guru **** in one generation)?
     
  2. Undying

    Undying Ancient Guru

    Messages:
    12,504
    Likes Received:
    1,955
    GPU:
    Aorus RX580 XTR 8GB
It's virtually impossible for a 960 to be faster than a 780. It's magic!
     
  3. 0blivious

    0blivious Ancient Guru

    Messages:
    2,639
    Likes Received:
    263
    GPU:
    MSi 1070 X / 970 / 780Ti
    So one game comes out that leaves behind the last generation of equipment (on Day 3) and it's a massive conspiracy by Nvidia?

    What about all the other titles released in the last 6 months?
     
  4. Undying

    Undying Ancient Guru

    Messages:
    12,504
    Likes Received:
    1,955
    GPU:
    Aorus RX580 XTR 8GB
I guess you just didn't listen. It started with weird performance in FC4 and has continued up until now; TW3 is just the worst case yet.
     

  5. eclap

    eclap Banned

    Messages:
    31,497
    Likes Received:
    3
    GPU:
    Palit GR 1080 2000/11000
    The 780 is faster than the 960 at every resolution. This is how witch hunts start, people misinterpret stuff.
     
  6. ---TK---

    ---TK--- Ancient Guru

    Messages:
    22,112
    Likes Received:
    2
    GPU:
    2x 980Ti Gaming 1430/7296
Saw on the Nvidia forums that ManuelG is supposedly looking into it.
     
  7. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,405
    Likes Received:
    91
    GPU:
    GTX Titan Sli
    Do you mean the claims about Nvidia making Kepler perform worse using the latest drivers?
    Or is it about Kepler's low performance for TW3 specifically?
     
  8. waltc3

    waltc3 Maha Guru

    Messages:
    1,031
    Likes Received:
    310
    GPU:
    AMD 50th Ann 5700XT
    Unless nVidia is pulling another rabbit out of its slimy bag of tricks...;)...(sorry, couldn't resist), I might be tempted to reiterate HH's comment that the game goes light on textures and heavy on the shaders (in 2015, for a DX11-level game, using < 2GBs vid mem with a 40GB game @ Ultra HD res spells one heavily shader-optimized game, imo)...which could easily explain Maxwell's dominance over the earlier nV architectures...it's too early for Crossfire profiles I suppose (if they are possible with this game, which I haven't heard anything about), but HH mentioned that, as well.

If it's true, I really like the idea of more heavily utilized shaders in games--over strictly using large textures while going "lite" on the shaders. The game seems fairly PC-optimized to me--at least, at the moment--but I think it should not be forgotten that today's consoles *are* PCs in the literal sense of the word--just scrawny and under-performing PCs, to be sure, but PCs all the same.

IMO, "optimizing for a console" is fairly synonymous with PC optimization these days. What is a PS4, after all, except a slightly hobbled Pitcairn HD 7850 that runs off shared-with-the-CPU GDDR5, has a few more shaders but a lower stock clock (800MHz vs. 860MHz)...? (Mine runs ROOB @1.05GHz on stock voltage & cooling & has 2GB of its own dedicated GDDR5.) Oh, yeah, the PS4 has a fairly slow x86 AMD CPU (among the AMD CPU family) to move things along (an x86 8-core 1.6GHz Jaguar--I just bought a 4GHz AMD 8-core for $135.00). The xBone is even weaker in the GPU department, but has the same CPU as the PS4, clocked 150MHz faster @ 1.75GHz--but both are 100% x86 PCs inside. Dog slow ones, at that.

Anybody remember the Tomb Raider reboot a couple years back? In that game, an AMD Gaming Evolved title, ATi had a neat little hair-shader thing going on--which dumped all over nV at the time, because nV GPUs looked & performed sadly running it...;) The TR stuff was heavily optimized for the current AMD architecture--which of course put nV in a bad light, just as Geralt's hairdo is trying to do with AMD GPUs in this game, etc. Pure marketing hype. This sort of thing is just PR, don't cha' know--it has very little to do with whose GPU is better than whose...;) Just my opinion, of-a course...;)
     
  9. Daftshadow

    Daftshadow Maha Guru

    Messages:
    1,312
    Likes Received:
    8
    GPU:
    MSI GTX 1080 Armor 8G OC
Couldn't agree more. You would think the 980 would handle it no problem; guess again. The fps drops more noticeably when it zooms in during conversation scenes.
     
  10. lordofthering

    lordofthering Banned

    Messages:
    93
    Likes Received:
    0
    GPU:
    2600XT
I've said this in the past, but it is true: AMD GPUs are almost (if not just) always much more powerful than Nvidia's; what NV does is good driver work for (artificial) performance. This has been the case since the 9000 series. In raw compute power, AMD was always way ahead, and that shows once a product becomes less relevant.

Long story short, if you want something that has real performance and will last some time, go with AMD.
     

  11. Yecnot

    Yecnot Master Guru

    Messages:
    856
    Likes Received:
    0
    GPU:
    R9 270 2GB
That was temporary, IIRC. TressFX works pretty well on Nvidia drivers now. The difference is that the HairWorks code is locked down, while Nvidia was allowed to optimise for TressFX.
     
  12. 0blivious

    0blivious Ancient Guru

    Messages:
    2,639
    Likes Received:
    263
    GPU:
    MSi 1070 X / 970 / 780Ti
    What are you smoking?

    Personally, I find most AMD cards to be cheap junk that use way too much power, get way too loud, and produce way too much heat for the performance delivered. If you want seriously garbage drivers, go AMD. They do cost less though (I just bought another one last week). They're kind of like AMD processors. Underwhelming.
     
  13. Lane

    Lane Ancient Guru

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
I don't know why, but I can imagine you'll see a new setting in W3 really soon for the tessellation level on hair, or maybe just a performance optimisation by the developers (read: a reduction of HairWorks' tessellation level). Does anyone really need 8x AA and more than 64x tessellation on hair strands, when 32x is already considered really high for benchmarks?

Especially when AMD users can do it directly from the driver, and so easily negate the impact of HairWorks.

I'm pretty sure Nvidia users (especially the ones with mid-range GPUs) would be happy to get a medium quality setting in the game for it, and to push 10fps more without a big hit on quality.
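For what it's worth, you don't necessarily have to wait for a patch to experiment. Community tweak guides report that The Witcher 3 exposes HairWorks knobs in its user.settings file; the key names and values below come from those guides rather than any official documentation, so treat them as an assumption and back the file up first.

```ini
; Documents\The Witcher 3\user.settings (community-reported keys; verify against your own file)
[Rendering]
HairWorksLevel=1     ; reported values: 0 = off, 1 = Geralt only, 2 = all creatures
HairWorksAALevel=4   ; reported default is 8; lowering the hair AA is the cheap fps win
```

Dropping HairWorksAALevel is roughly the in-between "medium" HairWorks setting being asked for, without touching the driver at all.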

[Images: tessellation comparison screenshots, courtesy of WC CF; go there if you want to see them at a bigger resolution.]
     
    Last edited: May 21, 2015
  14. BadAssMusician

    BadAssMusician Member

    Messages:
    35
    Likes Received:
    0
    GPU:
    GTX 980Ti EGPU
Heya, can we add some low-end testing as well? Not many of us have a 5960X, and plenty of folks wouldn't have the money for anything beyond 720p / 768p / 900p testing.
That way, people with a low budget or a low-power rig at least know what it takes to run this or future AAA titles at low to medium settings.

    Yeah, just a thought. Though I have GTX 970s in SLI, my daily driver has shifted to my Gigabyte BRIX with an R9 M275 for a while.
     
    Last edited: May 21, 2015
  15. Ven0m

    Ven0m Ancient Guru

    Messages:
    1,764
    Likes Received:
    0
    GPU:
    ASUS GTX 970 STRIX OC
    While in these benchmarks 960 isn't faster than 780, it's way too close in my opinion, and surprisingly far ahead of 770. Also, the gap between 290x and 780(Ti) is huge, much greater than in any other benchmarks I've seen before.

    I can see the logic behind it.

    While I wouldn't say that 6xx cards and 7xx cards were gimped, they're simply abandoned. I'd expect the performance boosts to be greater for 9xx series, as their drivers aren't as mature, however seeing *any* performance boost with recent drivers for 6xx and 7xx over early 347 driver series would be a nice thing to see.

It's logical, what they're doing right now. Investing money in old cards instead of new ones might give some PR boost, but it won't really translate into sales. If we were getting all the goodies, it would be difficult to sell the new cards. Take my GTX 680, for example. Initially it was on par with or slightly faster than the 770, then the 770 looked better and better, and now it appears that even the 960 benches faster (despite not being even close in the beginning). This suggests to me that I should buy a new card. Without that nudge, why would I even consider replacing this 3-year-old card? And what options do I have? An AMD 290X, which is not even close to the 9xx series in terms of power usage and noise (which are important to me)?

But I haven't got Witcher 3, and the only game that might make me want to replace my card right now is Project CARS, yet the experience with the 680 is good. What I can see is that when I got my 680, it was on par with or slightly faster than the 7970 GHz edition. When I checked the most recent graphics card review on Guru3D (Gigabyte GeForce GTX 960 G1 Gaming 4GB), the 7970 GHz edition was 13% faster in 3DMark Fire Strike. That's quite a lot, and it shows that AMD cards age better; it's quite an argument against all the AMD driver bashing. While AMD driver support may not be the best on release day, in the end it's good. Let's see what the 390(X) will be like. If they improve the power consumption and bundle something like NV ShadowPlay, I'll have very little reason to stay with the green team, despite having had NV cards since the Riva 128.
     
    Last edited: May 21, 2015

  16. Supertribble

    Supertribble Master Guru

    Messages:
    577
    Likes Received:
    21
    GPU:
    1080Ti/RadVII/2070S
Yeah, to be honest, the 960 shouldn't be getting close to a 780; it's what, 5fps behind? Way too close. Anyway, with the news that Witcher 3 will be getting a new patch with graphical improvements, I've swapped out the 960 and gone back to using the 290. I swear I'm swapping cards in and out every other day at this point.
     
  17. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    37,027
    Likes Received:
    6,101
    GPU:
    AMD | NVIDIA
    Added:

    • GTX 980 SLI results
    • GTX 980 FCAT
    • GTX 980 SLI FCAT
    • Some Hairworks perf
• Quality modes scaling with GTX 770
     
  18. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,939
    Likes Received:
    2,292
    GPU:
    5700XT+AW@240Hz
Crysis-level tessellation dates from the time of the HD 5870; AMD doubled tessellation power with the HD 7970, and then again with the 290/285 series.
While Maxwell is stronger at it than GCN 1.2 or Kepler, those two latter are roughly equal.
     
  19. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,939
    Likes Received:
    2,292
    GPU:
    5700XT+AW@240Hz
It is not faster; the performance problems of any card in W3 are caused by a few particular things on which the older generations see hugely prolonged render times.
As has already been said, tessellation takes a huge amount of time while making a very small visual difference.

On the other hand, the badly coded tree shadow system is another culprit behind long render times. And unfortunately those trees look very bad without shadows.
     
  20. zer0_c0ol

    zer0_c0ol Ancient Guru

    Messages:
    2,976
    Likes Received:
    0
    GPU:
    FuryX cf
    actually it is
     