Forbes: AMD Is Wrong About 'The Witcher 3' And Nvidia's HairWorks

Discussion in 'Frontpage news' started by pharma, May 22, 2015.

  1. pharma

    pharma Maha Guru

    Messages:
    1,095
    Likes Received:
    123
    GPU:
    Asus Strix GTX 1080
    http://www.forbes.com/sites/jasonevangelho/2015/05/21/amd-is-wrong-about-the-witcher-3-and-nvidias-hairworks/
     
  2. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,755
    Likes Received:
    2,203
    GPU:
    5700XT+AW@240Hz
    Forbes... and this particular dilettante...
     
  3. (.)(.)

    (.)(.) Banned

    Messages:
    9,094
    Likes Received:
    0
    GPU:
    GTX 970
AMD is all talk, and has been since they fell off the top almost a decade ago.
     
  4. Denial

    Denial Ancient Guru

    Messages:
    12,388
    Likes Received:
    1,630
    GPU:
    EVGA 1080Ti
    Well to be fair Ars basically comes to the same conclusion:

    http://arstechnica.co.uk/gaming/201...completely-sabotaged-witcher-3-performance/2/

    I guess tessellation performance is the real issue here. I don't really know if new AMD drivers could fix their tessellation performance, though. What I will say is that I expect more and more games to use more and more tessellation, which probably explains why Kepler is falling behind Maxwell: not only is Kepler "maxed out" when it comes to optimization, but Maxwell is supposedly 3x faster at tessellation than Kepler.

    AMD needs a new card out. Obviously we are getting that soon in June, but I really think it should have been out already. They are basically handing market share to Nvidia on the assumption that HBM is somehow going to pull people back. I personally don't think it will. I'll be really surprised if the 390X outperforms a Titan X by a considerable margin; at most I'm betting it will be within 10% of its performance.
     

  5. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,755
    Likes Received:
    2,203
    GPU:
    5700XT+AW@240Hz
    I would not expect CDPR to implement TressFX and HairWorks at the same time; first come, first served. But I would expect them to test every effect they use, even third-party ones, and check whether its settings make sense at a given resolution.
    Because HairWorks ON apparently uses higher than 64x tessellation, since capping it there already improves performance a bit. And that means polygons smaller than one pixel of a 1080p screen, with 8x MSAA applied on top of that by default.
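    The sub-pixel claim is easy to sanity-check with a back-of-envelope calculation. A minimal sketch, where the 50 px on-screen segment length is a hypothetical illustrative number, not measured HairWorks geometry:

    ```python
    # Rough sanity check of the "polygons smaller than a pixel" claim.
    # The 50 px segment length is an invented example value.
    def subsegment_px(segment_px: float, tess_factor: int) -> float:
        """Screen-space length of one tessellated sub-segment."""
        return segment_px / tess_factor

    # A hair segment covering 50 px of a 1080p screen, split 64 ways:
    print(subsegment_px(50.0, 64))  # 0.78125 -- already below one pixel
    ```

    So even at the 64x cap, moderately sized strands tessellate into sub-pixel geometry, and anything beyond the cap only shrinks the triangles further before MSAA is applied on top.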

    So, while I like the Witcher games (since I love the books), I am not very happy that after the two hours I spent configuring the game, I got practically the same level of detail as CDPR's "uber" preset while running 30~40% higher fps.

    They have access to everything their engine does and could tweak it to 60+ fps at 1080p even on hardware like an HD 7850/6870 or a GTX 660 Ti.
    I really wonder whether their SpeedTree shaders are the best-looking ones with the least performance impact, or whether they stopped working on them the moment the image started to look OK.

    I may look at the game with HelixMod to see if there are some rogue shaders with a drastic impact.
     
  6. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,656
    Likes Received:
    499
    GPU:
    2070 Super
    Shoddy needs to go.
    All he's done with his crying over GameWorks is make AMD look weak and incompetent.

    And that's even if we assume THAT EVERYTHING HE'S SAYING IS 100% TRUE.

    A tough sell coming from PR talking about a competing company, BUT EVEN SO.
    If you can't innovate and push your own technology, if you can't follow the competition's pace,
    and instead all you do is complain, yet can do nothing about it - IT'S TIME TO GO.
     
  7. cowie

    cowie Ancient Guru

    Messages:
    13,195
    Likes Received:
    286
    GPU:
    GTX
    I never liked anyone that ALWAYS passes the blame.
    Just put out your new card and make sure it runs all the HWBot benchmarks better than we have ever seen.
     
  8. elpsychodiablo

    elpsychodiablo Master Guru

    Messages:
    349
    Likes Received:
    0
    GPU:
    Retina Z2 + Vlab Motion
    They really think the gaming community are morons... OK, maybe they are, lol.
    If I look at GameWorks games versus games without GameWorks...

    GameWorks titles:

    Project Cars = AMD performance sucks
    The Witcher 3 = AMD performance sucks
    World of Warcraft added some GameWorks effects = a GTX 960 is almost twice as fast as a 290X!
    The Crew = a GTX 760 is equal to a 290X
    Far Cry 4 = a GTX 670 is equal to a 290, and a GTX 770 destroys a 290X
    Watch Dogs = OK, we don't need to talk about that disaster

    Now if I look at titles without GameWorks:

    Crysis 3 = normal
    Battlefield 3 = normal
    Battlefield 4 = normal
    Battlefield Hardline = normal
    Dragon Age: Origins = normal
    Middle-earth: Shadow of Mordor = the 290X is even equal to a GTX 980
    GRID 2 = normal
    GRID Autosport = normal
    Skyrim = normal

    So GameWorks is not the problem?

    Two things are possible:
    1. Nvidia sabotages AMD in every game with GameWorks.
    2. Devs who use GameWorks have no coding skill and are absolute morons.
     
    Last edited: May 22, 2015
  9. Denial

    Denial Ancient Guru

    Messages:
    12,388
    Likes Received:
    1,630
    GPU:
    EVGA 1080Ti
    Yeah, except all of this is wrong and fabricated bull****.

    http://www.tomshardware.com/reviews/far-cry-4-benchmark-performance-review,4019-4.html

    http://www.techspot.com/review/925-the-crew-benchmarks/page3.html

    So right there, your Far Cry 4 and The Crew nonsense is just plain wrong.


    http://www.bit-tech.net/hardware/graphics/2014/09/19/nvidia-geforce-gtx-980-review/6

    You say BF4 is normal, yet a 980 beats a 290x in BF4 by ~17% @ QHD.

    http://www.guru3d.com/articles_pages/the_witcher_3_graphics_performance_review,6.html

    In The Witcher 3, the 980 beats a 290x by ~20% @ QHD.

    So you're telling me that 3% is the difference between "normal" and "AMD performance sucks"? Give me a break.

    If you're going to peddle a narrative, at least base it somewhat in reality. It's really getting old seeing stupid posts like this with absolutely zero evidence backing them up.
     
  10. -Tj-

    -Tj- Ancient Guru

    Messages:
    16,412
    Likes Received:
    1,497
    GPU:
    Zotac GTX980Ti OC
    Just some guy's POV; anyone could write that.
     

  11. elpsychodiablo

    elpsychodiablo Master Guru

    Messages:
    349
    Likes Received:
    0
    GPU:
    Retina Z2 + Vlab Motion
    Look, fanboy, this Guru3D benchmark is not a valid benchmark for GameWorks, because Guru3D disables HairWorks, which is a GameWorks effect. The only thing this benchmark proves is that if you disable GameWorks effects, the FPS returns to something normal. Enable it and look at the benches again.

    Far Cry 4 and The Crew had problems on AMD cards at release. It would be terrible if they weren't fixed by now; there wasn't just a driver fix for those games, there were patches for the games themselves, too.
     
  12. Denial

    Denial Ancient Guru

    Messages:
    12,388
    Likes Received:
    1,630
    GPU:
    EVGA 1080Ti
    ok..

    http://www.techspot.com/articles-info/1006/bench/2160_Ultra.png

    Once again, a 3% shift from BF4 to now: ~20% between the two cards.

    In the worst possible case, the biggest gap I can find between the 980 and the 290x in The Witcher 3 with HairWorks on is about 30%. Going back through 980 launch benchmarks, there were a few titles where the 980 reached ~30% gains over a 290x.

    So obviously enabling tessellation, something AMD has worse performance in, is going to hurt AMD's performance. If that's the argument here, then I agree. But the bottom line is this: more and more games are going to be using tessellation for tons of different things. It allows for displacement mapping, better character models, better LOD, scalable assets, etc. Blaming Nvidia for using it is stupid. What AMD should be focusing on is advertising how the 390X is going to improve tessellation performance or something, giving me more of a reason to buy their card (which I might do anyway, since HBM is going to be decent at 4K).
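    The LOD point can be illustrated with a toy distance-based tessellation rule. This is a generic sketch, not how HairWorks or any particular engine actually computes its factors; the function name and all constants are invented for illustration:

    ```python
    def tess_factor(distance: float, max_factor: float = 64.0,
                    ref_dist: float = 1.0) -> float:
        """Toy LOD rule: halve the tessellation factor every time the
        distance from the camera doubles, clamped to [1, max_factor]."""
        raw = max_factor * ref_dist / max(distance, ref_dist)
        return max(1.0, min(max_factor, raw))

    # Close-up geometry gets the full factor, distant geometry almost none:
    for d in (1.0, 4.0, 16.0, 256.0):
        print(d, tess_factor(d))  # 64.0, 16.0, 4.0, 1.0
    ```

    This is the "scalable assets" idea in miniature: one high-detail asset serves every distance, with the hardware filling in or dropping triangles as needed instead of artists authoring multiple LOD meshes.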
     
  13. elpsychodiablo

    elpsychodiablo Master Guru

    Messages:
    349
    Likes Received:
    0
    GPU:
    Retina Z2 + Vlab Motion
    At 4K, anyway, no card can really keep up.

    Look at 1080p, which is what most people use.
     
  14. sdamaged99

    sdamaged99 Ancient Guru

    Messages:
    2,028
    Likes Received:
    21
    GPU:
    Inno3d GTX1080 Ti
    As much as I do believe Nvidia can be questionable, I think the issue this time lies firmly with AMD.

    I love my R9 295X2, but starting The Witcher 3 without working CrossFire drivers is unacceptable.

    I also hate their smug attitude when they are questioned about drivers.

    The fact that the most recent question to Roy@AMD about a CrossFire driver for The Witcher 3 was answered with "next week" and a smiley face irritated the hell out of me.

    Why did they not have a working driver at launch???

    Competition is vital in this industry, but AMD strike me as lazy and unwilling to put the effort in.

    I can either hammer The Witcher 3 on reduced settings, or sit back and wait for a CrossFire driver which will arrive God knows when.

    Sick of AMD and sick of defending them.
     
  15. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    1,741
    Likes Received:
    137
    GPU:
    Guru3d GTX 980 G1
    Whatever; I personally think Huddy is full of it. So many lies it's hard to distinguish fact from fiction.

    I'm not saying Nvidia are angels, but the underdog seems to have to spin a yarn a lot more than the leader does.

    Again, what does all this mean? It means the consumers (us) get shafted time and time again.

    I know there's a lot of money involved but this stuff is starting to seem more and more childish as time goes on.
     

  16. Duke Nil

    Duke Nil Maha Guru

    Messages:
    1,187
    Likes Received:
    19
    GPU:
    GTX 1080
    Yeah that article is pretty tough for me to disagree with
     
  17. Black_ice_Spain

    Black_ice_Spain Ancient Guru

    Messages:
    4,553
    Likes Received:
    0
    GPU:
    970GTX
    I have it disabled anyway; I don't get all the noise around it. It costs so many fps, and its in-game impact is quite meh.

    Anyway, what AMD says is perfectly possible; after all, it's competition between companies. I've seen worse things.
     
  18. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    3,173
    Likes Received:
    477
    GPU:
    EVGA 1070 FTW
    When it comes to things like this, I say always follow the money - or simply, cui bono.

    Nvidia does tie studios into contracts to get GameWorks support and tools, and in doing so will flat out order the studio not to disclose the code and any optimisations to someone like AMD. This has always made Nvidia's drivers and hardware outperform AMD's, because the games are essentially coded to work with Nvidia.

    All the while, AMD seem to think they don't have to lift a finger to help gamers by supporting studios throughout the process. Or that's how it seems.

    I don't know what you guys think, but with AMD it seems like the gaming market is an add-on subsidiary to their 'real' business - whereas with Nvidia, all of their non-gaming interests seem secondary to gaming.

    Look, I don't particularly love or like either company, but in terms of approach to the gaming market, I believe Nvidia has the right attitude. We all know this doesn't make their hardware any better than AMD's, but it certainly gives me a warmer feeling about having an Nvidia card in my machine than an AMD one.
     
  19. elpsychodiablo

    elpsychodiablo Master Guru

    Messages:
    349
    Likes Received:
    0
    GPU:
    Retina Z2 + Vlab Motion
    I would like to blame Microsoft a bit.
    If every game needs driver optimization, especially if you have CrossFire or SLI, then the DirectX API is wrong.

    Microsoft has consoles, PC gamers and now mobile games; they should give devs all the tools for coding: a DirectX-based physics system, plus cloth, hair, rain, weather and fire tools, independent of Nvidia or AMD.

    That's the only way it could work for PC gamers in the future.

    Remember Creative in the Windows XP days: they had a monopoly on audio acceleration and sound effects, and if you played back then without a Creative card you had lower fps and no good sound either.

    Microsoft came and changed the whole sound API; now you can play with any sound card you want: Asus, Creative, a USB DAC, ...

    We need this again for GPUs!
    Remember when Intel wanted to join the GPU market years ago: on paper they had the best hardware, but they had no chance against AMD and Nvidia because of their drivers.

    If Microsoft changed things again, I'm sure we could have strong Intel GPUs and other companies in PC gaming.
     
    Last edited: May 23, 2015
  20. Lane

    Lane Ancient Guru

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
    Well, he has his opinion and I respect that, but seriously, his arguments really miss the situation.

    - If AMD forced a lower tessellation level in its driver, similar to what we can do by forcing it in the CCC, it would not take long before reviewers and users accused AMD of cheating the benchmarks. That is why they have never forced tessellation in a driver profile but left the choice to the end user (it was even a concern when they introduced this setting in the CCC).

    Reviewers would disable such a profile anyway when testing the game.

    The extreme tessellation level HairWorks uses in The Witcher 3 is only half the problem; capping it mitigates the issue, it does not solve it.

    - As for AMD having worked on The Witcher 3 for a while: Huddy claims they only got the copy with HairWorks two months ago. Well, that is not impossible at all; with this type of library it is often the case that it is not enabled in work-in-progress copies of the game. Look at Assassin's Creed and Far Cry 4: most features were not even included in the game and were patched in after release (including HairWorks for FC4). That even reminds me of the TressFX story with Tomb Raider (on purpose, as TressFX was still under wraps at the time and was presented at the same time as the game's release).

    - The developer says they can't optimize the GameWorks features for AMD; like him, I think many things could be the cause: money, time, or they just don't care. The problem is that, due to the nature of GameWorks, you can't easily optimize the driver for the specific implementation that is put into the game.

    Is Nvidia pushing tessellation to an extreme level in HairWorks knowing this will bring other hardware to its knees? Maybe. In the end it is their right, it is their library, even if it is questionable. At this level, AMD can't do anything on the driver side.

    The funniest part is that TW3 uses an engine extremely close to what works best on AMD GPUs, and the performance without GameWorks shows it nicely. Most graphics engines are moving to tiled deferred + forward rendering, something close to AMD's Forward+ rendering.

    - The GTA5 thing is a counter-example: GTA5 is not a GameWorks game, even if it uses one feature you can find in GameWorks (HBAO+). Rockstar has a different approach to game development; they are surely the only ones who use Bullet as a physics engine. Rockstar, unlike Ubisoft, tries to stay as far as it can from the hardware brands.

    The problem today, whatever the reason, is that this seems to continue a trend we have seen for some years with TWIMTBP games, and the list is getting really long. So is AMD responsible, or Nvidia, or the situation? I don't know, but we can't deny the trend. Since GameWorks launched, we have had four major games where everyone showed performance trouble: Assassin's Creed, COD, the Batman series, FC...

    I tend to believe AMD bears its share of responsibility to some extent too; maybe they don't push enough work on those titles, or don't pressure the developers enough, who knows. Maybe they should improve their relations with the studios that use GameWorks or are close to Nvidia.

    But it's funny to see this Forbes guy's reversal: to obtain even half of what he wants, AMD would have to sue every developer who works with Nvidia.
     
    Last edited: May 23, 2015
