Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 20, 2015.
161 days since the last Catalyst WHQL release... what is going on with AMD?
All settings on ultra, and when I activate NVIDIA HairWorks the frame rate drops from 70 fps down to 51 fps. At 1080p resolution.
Would not be surprised to see a large driver update very soon with the release of new cards.
I'm guessing these performance issues are more related to the fact that the game is just another console-to-PC port. Let's not forget that both the Xbone and the PS4 are based on AMD tech, so the game being a port should mean it's better optimized on an AMD card than an Nvidia one... I don't believe those conspiracy theories, since any other game runs great with the current driver. Has anyone done a benchmark of FC4 with these new drivers? Or does that not even matter anymore?
Oh, and btw, Computex is around the corner, DX12 is around the corner, and I also guess AMD has a lot of work to do to get DX12 running without issues on the Xbone... and that's why they haven't released any new driver.
Love to both AMD and Nvidia fanboys; let's game instead of trading silly accusations.
Someone at hardware.fr forum did a pixel fill rate comparison of the cards in the guru3d review:
Interesting how it (roughly) lines up with the performance of the cards. Although I don't necessarily link it to the results, Kepler's pixel fill rates are notably lower than their counterparts'.
It has nothing to do with pixel fill rate; that only comes into play at higher resolutions, just like texture fill rate.
Note: the 960 with 36 Gpixel/s vs. the 780 Ti with 52 Gpixel/s, and yet it somehow almost beats it in this game, which further confirms there's no connection.
Btw, the 780 Ti has a higher texture fill rate than the Titan X, just FYI:
stock Titan X: 192 Gtexel/s
stock 780 Ti: 210 Gtexel/s
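Those quoted figures are consistent with the usual back-of-the-envelope formula: texture fill rate is roughly TMU count times core clock. A minimal sketch, assuming the commonly listed specs (240 TMUs at ~0.875 GHz for the 780 Ti, 192 TMUs at ~1.0 GHz for the Titan X):

```python
def texture_fill_rate(tmus, clock_ghz):
    """Approximate texture fill rate in Gtexel/s: TMUs x core clock (GHz)."""
    return tmus * clock_ghz

# Assumed base-clock specs, not taken from this thread:
print(round(texture_fill_rate(240, 0.875)))  # 780 Ti -> 210 Gtexel/s
print(round(texture_fill_rate(192, 1.000)))  # Titan X -> 192 Gtexel/s
```

The Titan X ends up lower despite being the newer card because Maxwell cut the TMU count per chip while raising clocks only modestly.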
No it's not. Stop lying to yourself. Many gurus will confirm that HW tanks the fps on their 970/980, even in SLI.
But it doesn't 'almost beat it'.. in fact it seems perfectly in line with the FPS, as do the other cards between them, so I don't get where this "almost beats it" comes from.
Pretty much wrong.
Maxwell DOUBLES Kepler's ROP count:
GK110 (48) -> GM200 (96)
GK104 (32) -> GM204 (64)
Which is why pixel fill rate, and NOT raster operations, becomes the limiting factor.
And your point is? The 960 doesn't have 64 ROPs. Crysis 1 liked a higher ROP count more than texture fill rate.
And at 1080p, twice the ROPs doesn't help much. Look at some more reviews: once you go to 1440p, the Titan X with 96 ROPs pulls further ahead.
This is with Hairworks off; with it on, the 960 can apparently beat the 780 Ti, despite much lower pixel and texture fill rates, never mind lower tessellation and bandwidth performance.
Interesting. It looks like this is an extremely pixel-fill-rate-heavy game.
If it were, then the 290X, with 12 Gpixel/s more than the 970, would beat it, but it doesn't. So nope.
HairWorks actually carries a smaller performance penalty (Maxwell, Kepler), or virtually the same one (Tonga), as TressFX.
And that's despite TressFX doing nothing but Lara's hair.
Maxwell is a waaaay more advanced and efficient architecture. A direct comparison like that between different manufacturers is useless. On top of that, there are other factors that may make the game run better on Nvidia.
In my case, my card has an 84 Gpixel/s fill rate (at 1.5 GHz), while my previous GTX 780 (1.2 GHz) had 57.6 Gpixel/s. That's a large enough difference to show easily. Since I've had both, it's obvious to me which card is faster. The GTX 780 had too high a texture fill rate relative to its pixel fill rate, in my opinion.
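The poster's figures line up with the standard estimate that pixel fill rate is ROPs times core clock. A minimal sketch, assuming the card in question is a GTX 970-class part with 56 ROPs (the GTX 780 has 48 ROPs; neither ROP count is stated in the post itself):

```python
def pixel_fill_rate(rops, clock_ghz):
    """Approximate pixel fill rate in Gpixel/s: ROP count x core clock (GHz)."""
    return rops * clock_ghz

# Assumed ROP counts for illustration:
print(round(pixel_fill_rate(56, 1.5), 1))  # 84.0 Gpixel/s, matches the quoted figure
print(round(pixel_fill_rate(48, 1.2), 1))  # 57.6 Gpixel/s, matches the GTX 780 figure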
Read the chart below alongside your edited post..
No it's not. The only factor here is that the game was specifically designed to run better on Maxwell: last-minute optimizations, which is why it got postponed a few times at the end, with all that NV bs slapped on top of it.
The 780 Ti should be faster than the 290X (even though it has a lower ROP count, it shows a higher pixel fill rate in 3DMark Vantage's fill rate test), but it isn't.
So yeah, a direct pixel fill rate comparison doesn't explain why it runs better on Maxwell rather than on GCN with a higher fill rate.
Pixel fill rate is one factor; after all, The Witcher 3 is far more than a simple fill rate benchmark. Also, on paper the 290X has the higher pixel fill rate, so it's no big surprise if it has the edge in some real-life situations.
That's what I've been saying: it doesn't make any real difference at 1080p (the 960's is lower, yet with HairWorks it can apparently beat the 780 Ti). Bandwidth isn't the cause either (it's lower), texture fill rate is also lower (even lower than the Titan X's), and tessellation isn't thaaat much higher either..
The more I look at all these benchmarks and GPU specifications, the clearer it is that it's just an NV dirty trick plus this game dev's code written to sell them. They know The Witcher is a hyped game, and the fact that it's now bundled with Maxwell kinda says it all.
OK but ATTENTION post #93!!!
Better performance with HairWorks™ than TressFX™
I knew it: the game shipped with unoptimized shaders. That's the most common cause of fill rate problems, and this game scales linearly with fill rate. Some references on fill-rate-bound scenarios:
Now get up off your collective buttocks, CDPR, and optimize your shaders. It's ridiculous that people with a GTX 970 have to deal with these kinds of LODs (see the bushes) to get 60 fps:
What a pure smooth reference!