Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Dec 14, 2020.
Where is this benchmark tool you used? I don't see a built-in benchmark. Is there one?
I did the hack on my system (3900X/1070) and can see that the 24 threads are now being more evenly utilized; overall CPU usage went up about 2%, from 22% to 24%. There was no real difference in my frame rate though, although I only tested standing in the training simulator, where I'm still getting about 34 FPS @ 1440p on the HIGH setting.
My CPU sure gets toasty with the mod. I'll try without and see what happens.
That's why I didn't mod it. Why would I want the CPU to ramp up if it's working perfectly fine the way it is?
Weirdly enough there isn't much of a difference even though SMT is enabled. We have two CCX so I guess it's natural that it gets warm.
Wow... so... it was sloppiness on CDPR's part combined with laziness from AMD. I'm not sure what's worse.
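For anyone wondering what the "hack" actually is mechanically: the versions circulating are a one-byte hex edit of the game executable that flips a conditional jump after the CPU-vendor check so the SMT code path is always taken. Here is a minimal sketch of that kind of patch in Python; the byte signature below is a placeholder for illustration, not a verified offset for any particular game build.

```python
# Sketch of the kind of hex patch being discussed: locate a short byte
# signature in the executable and flip a conditional jump (JNE, opcode 0x75)
# into an unconditional short jump (JMP, opcode 0xEB), so the branch that
# enables the SMT thread pool is always taken.
# NOTE: the signature used in the test below is illustrative only.

def patch_jne_to_jmp(data: bytes, signature: bytes) -> bytes:
    """Return a copy of `data` where the first byte of the first occurrence
    of `signature` is changed from 0x75 (JNE) to 0xEB (JMP short)."""
    offset = data.find(signature)
    if offset == -1:
        raise ValueError("signature not found - wrong binary or game version?")
    if data[offset] != 0x75:
        raise ValueError("expected a JNE (0x75) at the matched offset")
    patched = bytearray(data)
    patched[offset] = 0xEB  # JNE -> JMP: unconditionally take the jump
    return bytes(patched)
```

In practice people back up the original .exe first, since the signature only matches the specific build it was taken from, and a game patch will overwrite or invalidate the edit.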
I don't drop below 50 fps; it's between 50-60 fps at 1440p. I'm using custom settings, a mix of ultra/high, with screen space reflections off, because anything other than off results in very grainy graphics, except for Psycho mode, but that's unplayable: I get like 20 fps on Psycho.
Ok. So after 4 hours of play with the mod, I no longer know if it has really changed anything or not. :-(
The VRAM used is topping out at 8.3GB (but that's because I use HBCC to expand the 8GB of the Vega 64)
The RAM usage is between 7.5 and 8.7GB (for the whole game + OS)
My FPS are about the same: between 45 and 65 FPS
My CPU usage seems a bit higher (40 to 60%), but even before applying the mod, I never experienced what we can see on the left of the TechPowerUp screenshot.
Finally, the sluggish mouse control is somewhat less noticeable, but it's definitely still there.
The good thing is that the game is - for me - playable as it is even if a solid 60+FPS and a more fluid mouse control would make it better.
I would prefer CDPR to address the collision bugs, the driving, etc.
Graphically and performance wise, it is already acceptable (I am not forced to game with ultra settings after all).
I don't expect results will change much, if at all, at frame rates around there, but I bet the smoothness of your overall experience will be better with reduced low spikes. I only tested one spot recently where my save was, staring off into the crowd at the base of the home building where I'm already completely GPU choked, and it still made a small difference to average FPS.
Settings: Pretty much DF's optimized RT settings but a few things turned up, like colour accuracy, & I turn off all the post processing effects because I despise them.
Edit: I forgot to mention, I have the following ReShade shaders on as well FWIW - CAS, filmic tonemap, Technicolor 2. Tiny hit from all 3 combined.
Specs: 1440p (non-HDR), 3900X @ "stock" (Fabric/controller/mem @ 1900MHz), TUF 3080 OC @ +135/+1000 (2025-2125MHz core, "10.5/21GHz" mem), 16GB DDR4-3600@3800 16-16/17-16-32, WD Black 1TB NVMe SSD
Pre mod: 79-80 fps
Post mod: 84-85 fps
Roughly a 6% improvement to average FPS in what looked like a 100% GPU bound scene is more than I expected. And as I said before load times are faster. It's anecdotal so far with not enough of a sample, but the game feels smoother due to presumably softened low spikes. 6% is basically half the gap between a 3080 and a 3090 in most games -_-'
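The quoted ~6% checks out against the midpoints of the two runs above; a quick sanity check:

```python
# Sanity check on the quoted improvement: midpoint of the 79-80 fps pre-mod
# range vs the midpoint of the 84-85 fps post-mod range.
pre = (79 + 80) / 2    # 79.5 fps
post = (84 + 85) / 2   # 84.5 fps
gain = (post - pre) / pre * 100
print(f"{gain:.1f}% average FPS improvement")  # prints "6.3% average FPS improvement"
```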
Applied the hack. Loading times went way down and it also feels smoother. Also have some random speed-ups while playing (way faster movement). Maybe that's why it was disabled?
Hacks, "hundreds" of bioses and agesa's. When is the zen plattform ready for use? There is no time for playing games, when we are "hacking" and updating and tweaking bioses all the time
Jealous of Intel users, for whom it just works.
Blame software devs over the past 10 years completely ignoring AMD and just optimizing software for Intel CPUs.
Looks to me like the benchmarked area is not the same, and that what the author of this article wrote is the reality: less performance, not more. The rest is, to put it mildly, wishful thinking.
Honestly AMD updated the library 3 years ago.
It has only happened once to me so far since applying the hex edit to the game's executable (can anybody else confirm?).
The framerates improved by approx. 8-15 fps (depending on the scenario) in the beginning. However, after a while (an hour or so) the framerates gradually decreased to an even worse state than before the hex edit, at the same frequencies and temperatures (CPU and GPU). Before I rebooted, I ended up with about 43 fps in places where I normally get 80 fps even without the hex edit, so something was wrong in that session. After a system reboot the framerates were normal again.
I think this might have been due to some kind of memory leak, but I'm not sure if it was caused by the hex edit or was just a one-time issue with my rig. I've never had this in longer sessions without the hex edit before.
I double-checked: the frequencies on my CPU and GPU were not underclocking or anything, and temps were fine too. However, I didn't check the state of my VRAM clocks and usage.
Can anybody confirm a similar or the same strange behaviour?
About max fps: my R5 2600 went from 52 fps before the hack to 85 fps after the hack.
Edit: at 3.8 GHz.
I remember doing this kind of memory hack 20 years ago to cheat my level in an online FPS shooter.