Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 24, 2017.
Driver 376.60 gives me much better performance, I think. Or does the game have better graphics with 378.49?
Looks like a mess for benchmarking. Too unstable to draw any meaningful conclusions.
How can a Game Ready driver break performance? HH said performance is all over the place and can differ from benchmark to benchmark.
The 390X is a beast!
If you saw the original benchmark with 376.60, you would know 378.xx breaks it hard. :grin:
pcgameshw did a review too, although it doesn't look so out of control there; they used the same new r378.xx driver.
Maybe Guru3D tested with shadow cache on?
Shadow cache on indeed seems to be the root of all issues. I'll do some further investigation and update the charts today if that is indeed confirmed as the root cause.
Hi guys, can you tell me which Radeon RX 480 you used in the test, and which 1060, please?
I have been saying that these cards with their 4 GB of RAM are all jokes. It doesn't matter what kind of RAM it is; simply put, there isn't enough of it at 4 GB to make a true difference.
I saw the "red" flags when the card dropped originally.
Personally, I believe the card should never have been released without at least 8 GB of RAM. Then the card would have been an investment as opposed to a pushed product.
Very interesting results at pcgameshardware. They also included Tahiti, which is faster than the 780 and the 970 (lol).
Hell's Kitchen? ("Resident Evil 7 (Launch; Steam), PCGH-Benchmark 'Hell's Kitchen'")
So they did their testing in that small room in the first house? Well, that would explain why the framerate is so high for their GPUs, although it looks like they encountered some issues as well, going by some other images and posts: something about shadow quality on Very High and the cache setting resulting in a significant performance hit. And even though Capcom said they'd look into it when people asked about 21:9 in the demo, it's still not fixed, although I guess the first post-release update might correct the aspect ratio and FOV for that view mode.
EDIT: Ah, shadow cache works similarly to Infinity Ward's "Call of Duty" games, storing data on the GPU at the expense of VRAM, resulting in a pretty noticeable performance hit for GPUs with 4 GB or less of said memory, at least for that game.
(But if VRAM is readily available, such as on the 480 8 GB or the 1080 / Titan X Pascal, then it likely helps a bit.)
EDIT: Seems to be the same with RE7 from what I've read; it simply takes too much VRAM to keep shadow caching on.
(Probably also depending on the selected screen resolution and some other factors, but it still takes a lot of video memory.)
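The trade-off described above boils down to whether the cache fits inside leftover VRAM. A minimal sketch of that reasoning; the function name and all figures are made up for illustration, not measured from RE7:

```python
# Hypothetical sketch: a shadow cache only pays off when it fits inside
# the VRAM left over after the game's baseline usage. All numbers are
# illustrative assumptions, not real measurements.

def shadow_cache_helps(vram_gb, base_usage_gb, cache_gb=1.5):
    """Return True if the cache fits without blowing the VRAM budget."""
    return base_usage_gb + cache_gb <= vram_gb

# A 4 GB card already near its budget has no headroom for the cache...
print(shadow_cache_helps(4, 3.5))   # False -> expect a performance hit
# ...while an 8 GB card has room to spare.
print(shadow_cache_helps(8, 3.5))   # True -> caching can pay off
```

Once the budget is exceeded, data spills to system RAM over PCIe, which is where the big performance hit would come from.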
Interesting results; guessing there are still some quirks with the game, though, as the 1070 appears to be underperforming in your tests, and while I am a Fury owner, I don't see it being able to close in on the 1080 like that.
(Going by the results of the 2560x1440 performance chart at least.)
Still in need of a patch then, I suppose, or just for AMD and Nvidia to get their drivers into better shape.
EDIT: It evens out a bit at 3840x2160 when the GPUs start getting more bogged down; the GPUs are roughly in the order I'd expect performance-wise, even if the actual framerates are very close to one another.
The Titan X Pascal pulls ahead by what I guess is almost sheer power, with the 1080 a little behind (still a really fast GPU, but that's already well established), and trailing a bit behind that is the earlier Titan X (Maxwell); then things clump up a bit.
The 1070 and 980 Ti are almost completely tied, followed by the 390X 8 GB just a little above the Fury X (VRAM advantage, I guess); similarly, the 390 is just a bit above the Fury, with the 980 and 480 a little below.
(The 480 is keeping up pretty well, though, despite its weaker hardware; guessing the Nano is being held back by throttling, since it would otherwise be a pretty strong GPU.)
Only it's kinda like $250 right now, and it blows everything out of the water. It's not exactly a long term investment, but if you game at 1080p it should destroy anything in the same price bracket for quite some time (and it does).
This game looks like a badly optimized title. I wonder what kind of results Hilbert would get if he forced Vsync on.
Hmm, I guess you're right. It's going to be interesting to see what the charts will look like after a patch or two, if Hilbert keeps them updated.
Possibly also after some newer drivers from AMD or Nvidia are released, unless they just shift focus to whatever comes out in February in terms of "AAA" published games: Tom Clancy's Ghost Recon Wildlands and For Honor, I think, with Ubisoft traditionally having a good partnership with Nvidia.
(Sniper Elite 4 and Dead Rising 4 - Steam version - are planned for March, I believe.)
EDIT: Of course, close partnerships between a GPU vendor and developer (or publisher) usually mean a driver is ready weeks or months in advance, with perhaps some corrections driver-side closer to release, such as using the game's name and not some debug exe.
(For Honor has a profile in AMD's 17.1.1 driver, for example, and Nvidia probably had one too, possibly updated a bit now in time for the soon-to-start beta event.)
The shadow cache setting is a trivial one. I added this to the conclusion page:
1) If you have a graphics card with up to 4 GB, please turn it off.
2) However, if you have a 6 GB or 8 GB card, leave shadow cache activated. 8 GB AMD cards especially benefit greatly in performance; Nvidia 6 GB+ cards, not so much.
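The two-point advice above reduces to a simple VRAM threshold. A sketch of that rule; the thresholds are from the post, the function name is mine:

```python
def recommended_shadow_cache(vram_gb):
    """Suggested in-game shadow cache setting, per the advice above:
    off for cards with up to 4 GB of VRAM, on for 6 GB and above."""
    return "off" if vram_gb <= 4 else "on"

print(recommended_shadow_cache(4))  # off  (e.g. a 4 GB card)
print(recommended_shadow_cache(8))  # on   (e.g. an 8 GB card)
```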
I have now tested all cards three times in roughly 24 hours, and all result sets have been different, so I am putting this to rest for now. It remains a hard-to-test title, but then again the framerates are more than acceptable for pretty much anyone. Honestly, I can really recommend the game; it is great.
True, performance has been pretty good so far, outside of going completely over the top with, say, the resolution scale slider, although I'm not very far into the game yet. As for how it looks, well, aside from some lower-res textures it's pretty good. "RE Engine", or whatever this one is called. (Not the MT Framework engine this time.)
For performance tests like this, I guess it's always going to be tough to test a game if there's no dedicated benchmark mode for it. Even when there is, doing a run first for the GPU to "warm up" and finish things like shader caching before checking the performance results might be useful.
(Well, I guess that's why dedicated benchmark software generally has a setting for how many runs to do before showing the results.)
Although I'm not sure a benchmark mode is something most developers would be interested in spending extra time implementing, even a fairly simple one.
(E.g., for this game, having the camera run from the forest area at the start of the game up to the house entrance would work for one version, and then perhaps an indoor camera run through the various rooms of said house.)
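The warm-up-then-measure idea above could look roughly like this; a generic sketch, not tied to any real benchmarking tool, where `run_pass` stands in for a scripted camera run and all FPS numbers are invented:

```python
import statistics

def benchmark(run_pass, warmup_runs=1, measured_runs=3):
    """Run a scene pass several times, discarding warm-up passes so
    shader compilation and caching don't skew the reported average."""
    for _ in range(warmup_runs):
        run_pass()                      # result thrown away
    samples = [run_pass() for _ in range(measured_runs)]
    return statistics.mean(samples)

# Toy stand-in for a scripted camera run returning an average FPS.
# The first (cold) pass is slow because caches aren't warm yet.
fps_samples = iter([55.0, 88.0, 90.0, 89.0])
avg = benchmark(lambda: next(fps_samples))
print(round(avg, 1))  # 89.0
```

Without the warm-up pass, the cold 55 fps run would drag the average well below what the card actually sustains.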
EDIT: Interesting frame time results there, now that I've read the rest of the benchmark article; 4 GB is slowly starting to get a little more limiting in newer games. (A number of little "hiccups" in the Fury results.)
Good job! Also, if you still have the shadow-cache-on chart for AMD cards, put it up, and you could also make a performance comparison chart. AMD has a massive benefit, as you have mentioned, same as DX12. I would say around 20%.
RX 480 with shadow cache on: 128 fps. That puts this handy little card next to the Titan X Maxwell, which has shadow cache off.
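The "around 20%" estimate above is just a relative-FPS comparison. A quick sketch of the arithmetic; the 128 fps figure is from the post, while the cache-off number is an assumed example, not a measured result:

```python
def relative_gain(fps_a, fps_b):
    """Percentage gain of result a over result b."""
    return (fps_a - fps_b) / fps_b * 100

# Hypothetical: 128 fps with shadow cache on vs. an assumed 106 fps off.
print(round(relative_gain(128, 106), 1))  # 20.8
```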
You rock brotha!