Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 26, 2021.
AMD does AMD things
I noticed this with 5900x and Gigabyte 6900xt Master.
Even on a 60Hz monitor, more than 100fps feels smoother in many games.
Like in the old days with Battlefield 3: under 80fps felt "stuttery", while over 100fps was smooth as butter. That was on a 60Hz monitor without G-Sync. You can "feel" it when you are playing...
Interesting. I have not noticed this behavior at all with my 6900XT, but then again I do not play any games that push 200+ fps at 4k 120hz. My son plays some older games at 1440p with his 6900XT. I wonder if he has noticed this?
Unfortunately, in many if not most games the logic is tied to the framerate. It doesn't have to be that way, but it is.
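As a rough illustration of that point, here is a minimal Python sketch (the function names and numbers are made up for illustration, not taken from any real engine) contrasting frame-tied logic with a fixed timestep:

```python
# Hypothetical sketch of why frame-tied logic misbehaves: moving an
# object "speed per frame" makes game speed depend on fps, while
# scaling by frame time (dt) does not.

def simulate_frame_tied(fps, seconds, units_per_frame=1.0):
    """Position after `seconds` when movement is applied once per frame."""
    return units_per_frame * fps * seconds

def simulate_fixed_timestep(fps, seconds, units_per_second=60.0):
    """Position after `seconds` when movement scales with frame time."""
    dt = 1.0 / fps
    pos = 0.0
    for _ in range(int(fps * seconds)):
        pos += units_per_second * dt
    return pos

# Frame-tied: doubling fps doubles movement (the game runs "faster").
print(simulate_frame_tied(60, 1.0))    # 60.0
print(simulate_frame_tied(120, 1.0))   # 120.0

# Fixed timestep: same in-game result regardless of fps.
print(round(simulate_fixed_timestep(60, 1.0), 6))   # 60.0
print(round(simulate_fixed_timestep(120, 1.0), 6))  # 60.0
```

This is why a game whose simulation is tied to the framerate can feel (or even behave) differently at 80fps versus 120fps, even on the same 60Hz screen.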
Lol kind of reminds me of Command and Conquer Renegade, when it launched it had no loading/start timer (i think a subsequent patch added one or maybe it was a server setting later on, I don't recall), if you had a Western Digital Raptor or other high RPM hard drive, you could load in with a couple other buddies and do a C4 suicide run on the ob/AGT before the other team even got into the game. It was like the first time I recognized having a fast computer gave you an advantage over other people - although most games work on mitigating that now.
I played a few and had this in Diablo 3.
LoL, nothing. CS, nothing.
But it was easily fixed with a game profile: I locked the clocks and disabled AMD Chill.
I'm looking for someone who is having the issue discussed in this thread and would like to be a Guinea Pig to see if the following resolves it.
Before I give the instructions, note the following:
You will need to uninstall and reinstall the AMD Radeon Adrenalin driver to undo the work you are performing.
I am not responsible for any damage or data loss you may experience.
A backup, or at the very least a restore point, should be available.
For the record, I have performed this operation a number of times without any issue, except better GPU performance, meaning fewer hiccups in frame rate and FPS. Time will tell if this is the case for other users.
The work involves the uninstall of the "AMD Crash Defender" from the Device Manager under the "System Devices" section.
Step 1) Restart the computer.
Step 2) Open Device Manager, expand the "System Devices" section, right click on "AMD Crash Defender" and select "Uninstall Device". A new window will open with the following statement: "Warning: You are about to uninstall this device from your system." Make sure to tick the box in front of the statement that reads "Delete the driver software for this device."
Step 3) Left click on the "Uninstall" button
Step 4) Reboot the computer once the uninstall has been completed
Step 5) Once the computer reboots, start the game in which you have had the issue discussed in this thread and test its performance.
Step 6) Report back to this thread.
I will be monitoring this thread for responses.
Wait a minute, wasn't this exact same or similar observation made about Nvidia a while back, alleged driver overhead?
Makes you wonder if there really is any issue at all.
I can't really see how that would help. The problem is "CPU overhead" and the resulting bottleneck (even with a high-end CPU). Forcing clocks and disabling AMD Chill helped in the cases I tested.
There is one part of that statement that can be shown false in no time: "CPU intensive".
I fired up CS:GO and did some testing. The framerate drops are real, but the cause is not between the CPU and the driver.
CS:GO scenario tested: game on max settings except motion blur. No vsync (FreeSync enabled). 235fps limit set in game (240Hz screen). Data taken from the in-game graph function.
- Default clock mode on the card: issue exists. Fps drops even to 145, frametime variance up to 1.2ms. It takes the form of occasional hitching or lower fps lasting for many seconds.
- Min clock 2GHz: issue exists. Fps drops even to 160, frametime variance up to 1.1ms. It takes the form of lower fps lasting for many seconds.
- Max clock 750MHz: issue exists. Fps moves between 200~235, frametime variance up to 0.6ms. It takes the form of lower fps lasting for many seconds.
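For reference, converting the fps figures above into frametimes makes the size of the drops easier to see. A small Python sketch using the numbers quoted in this post:

```python
# Frametime in milliseconds is simply 1000 / fps; the figures below
# are taken from the CS:GO test results quoted above.

def frametime_ms(fps):
    return 1000.0 / fps

cap = frametime_ms(235)          # ~4.26 ms at the in-game fps cap
worst_drop = frametime_ms(145)   # ~6.90 ms during the worst default-clock drop
print(f"cap: {cap:.2f} ms, worst drop: {worst_drop:.2f} ms, "
      f"delta: {worst_drop - cap:.2f} ms")
```

So a drop from 235fps to 145fps means roughly 2.6ms of extra time per frame, well above the ~1.2ms frametime variance measured inside the stable stretches.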
So there are 2 causes:
- the card ignores the workload because it is too small (zones of low fps). This happens no matter what clock the GPU runs at, but is worse when the clock is higher.
- the card reduces the clock without increasing utilization fast enough (occasional hitching).
So the hitching part is a consequence of the clock changes and will go away the moment the GPU manages its peak clock in more reasonable steps.
The main issue therefore is the GPU not properly recognizing the workload, leaving it in the queue for later even though there is nothing else to do anyway.
It is kind of like:
- GPU is done with frame
- Gets into an idle state, running empty cycles in some internal super-low-clock power-saving mode
- Should wake up for the next frame, but that takes time
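That idle/wake pattern can be sketched as a toy model (the function and all numbers here are hypothetical illustrations, not measured driver behavior):

```python
# Toy model of the wake-up penalty described above: if the GPU parks
# at a low clock between frames, each new frame pays a ramp-up delay
# on top of its actual render time. All numbers are made up.

def frame_time_ms(render_ms, gpu_parks, ramp_up_ms=2.0):
    """Effective frametime when the GPU must first ramp up from idle."""
    return render_ms + (ramp_up_ms if gpu_parks else 0.0)

light_load = 2.0  # ms of actual GPU work per frame (a light, CS:GO-like load)
print(frame_time_ms(light_load, gpu_parks=False))  # 2.0 ms
print(frame_time_ms(light_load, gpu_parks=True))   # 4.0 ms
```

With a 2ms ramp penalty on a 2ms frame, the effective frametime doubles, which matches the pattern of a lightly loaded GPU hitching while a heavily loaded one (which never idles) does not.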
GPU-intensive games have no problem delivering a stable 235fps.
Edit: running CS:GO with VSR 4K @240Hz removes the hitching too, as the clock no longer changes in big steps. Fps holds mostly above 200, but occasionally goes to 180.
And I did test playing with the fps limit at 400/800. Those people are stupid or blind. Going above the FreeSync range causes extra frametime variance from the actual monitor syncing, and it is very noticeable. Running ~350fps on a 240Hz screen was a much worse visual experience than running the game at 160~235fps with FreeSync.
And that's because it is better to have up to 1.1ms frametime variance in the game engine/driver with instant syncing to the screen (data-to-photon delay is practically the same between frames) than to have an average ~2ms syncing variance between GPU and screen (data-to-photon delay is wild between frames). Those people playing on 120/144Hz screens pushing 350fps+ have quite some variance between data and photons. Anyone who has G-Sync/FreeSync should not go above the monitor's range; it will only reduce the smoothness of the image stream.
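A rough, idealized Python sketch of that syncing argument (this is a simple model, not a measurement; it assumes the display falls back to a fixed 240Hz scanout grid once fps exceeds the VRR range):

```python
# Above the VRR range the display refreshes on a fixed grid, so the
# wait between a frame finishing and being scanned out varies frame
# to frame. Inside the VRR range, the panel scans each frame out as
# it completes, so that wait is roughly constant.

refresh_ms = 1000.0 / 240  # fixed scanout interval on a 240Hz grid

def sync_waits(frame_ms, n_frames):
    """Delay from frame-ready to the next fixed scanout, per frame."""
    waits = []
    t = 0.0
    for _ in range(n_frames):
        t += frame_ms
        next_scan = (t // refresh_ms + 1) * refresh_ms
        waits.append(next_scan - t)
    return waits

waits_350 = sync_waits(1000.0 / 350, 200)  # ~350fps, above the VRR range
spread = max(waits_350) - min(waits_350)
print(f"sync-wait spread at 350fps on a 240Hz grid: {spread:.2f} ms")
```

The spread comes out around 4ms in this model: far more than the ~1.1ms engine-side frametime variance you get when staying inside the FreeSync range, which is the whole point.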
Yes, I tested Horizon Zero Dawn and Valhalla at 720p (at low settings); they easily push 250-300 FPS with zero problems. But note they are both DX12 engines. This issue is isolated to DX9-11 and is connected to the "driver overhead" AMD has had for years in older titles.
Yet there is no new overhead regression in the RDNA1/2 driver. I checked some games where I remember issues in the past, and those issues are gone.
Like Dead Island (DX9), which used to have horrible fps instability; that is long gone and it now holds a super stable ~235fps.
In the same way, HL2/Portal 2 (DX9), cousins of CS:GO, keep a perfectly stable 235fps.
A CPU bottleneck definitely exists in games like Grim Dawn. But that's because the game uses two threads for everything.
32-bit DX9 - 88fps; 32-bit DX9 (DXVK) - 113fps; 64-bit DX11 - 110fps; 64-bit DX11 (DXVK) - 120fps. Where DXVK helps, there objectively is overhead. But is it driver overhead, or overhead in the way the game communicates rendering data to the driver?
Because some games put on screen many more objects and use many more shaders, yet they can still deliver stable high fps.
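To make the DXVK comparison concrete, the fps figures above can be converted into per-frame cost in milliseconds (figures taken from the Grim Dawn results quoted above):

```python
# The fps numbers understate how big the overhead is per frame;
# converting to milliseconds makes the comparison direct.

def per_frame_cost_ms(fps_native, fps_dxvk):
    """Extra ms spent per frame on the native API path vs DXVK."""
    return 1000.0 / fps_native - 1000.0 / fps_dxvk

dx9_cost = per_frame_cost_ms(88, 113)    # 32-bit DX9 vs DX9-over-DXVK
dx11_cost = per_frame_cost_ms(110, 120)  # 64-bit DX11 vs DX11-over-DXVK
print(f"DX9 overhead:  {dx9_cost:.2f} ms/frame")   # ~2.51 ms
print(f"DX11 overhead: {dx11_cost:.2f} ms/frame")  # ~0.76 ms
```

So whatever the overhead is, the DX9 path pays roughly 2.5ms per frame more than the Vulkan translation, while the DX11 path pays well under 1ms.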
Yet here is another perspective, the perspective of time. When DX9 came to be, and when DX11 came to be, what were the refresh rates of screens? And what were the resulting expectations for "good fps"?
If Grim Dawn had been played under DX9 on a CPU from the DX9 era, it would have run at less than 60fps and performed poorly. Objectively it would have been bad even under DX11 at the time of DX11's release, unless one had the newest and greatest Intel Core. Those few and expensive 120Hz screens would have looked sadly at ~60fps in 2009/2010.
So is it so unexpected that some old or poorly coded games deliver low fps even with a modern CPU?
Now back to CS:GO, or other games. That was shortly after AMD released Fiji (with an i5-2500K @4.5GHz).
There is really no change for the worse. Dead Island actually performs better now due to gradual improvements in CPUs.
But if we look at CPU performance and the screen refresh rates generally available when each API came out versus today, CPU performance went up at a much smaller rate than screen refresh rates.
Expecting 240fps from some old or poorly coded game, developed by someone who used a 60Hz screen and did not even consider the possibility of 120fps, is like shopping for 8GHz CPUs.
Someone here with an nVidia GPU could jump in with a Zen 2 CPU and Grim Dawn and test what fps they get. Then we would have some reasonably comparable data on driver overhead vs. bad game coding.
Or Killing Floor 1, but I do not want to install that pile. (I liked the game, but people simply did not realize that there are certain limits on polygon counts, and some maps perform very poorly due to poor map design and improper use of automatic geometry-culling zones.) I would install it if someone with Zen 2 and an nVidia GPU wants to do the comparison.
I will test CS:GO today and post my findings.
I don't have this problem at all. I actually fired up the game yesterday for the first time in forever because I was curious what my 3900X/3080 would pull. I got about 320fps with my config (which is mostly broken now because the game has changed so much); smoothness was fine, I had no issues.
That's the problem. If you consider ~320fps (vsynced or not) better than 120/144/165Hz G-Synced, you are on placebo.
But it is OK if you are not able to see the difference and consider both equal. I am simply used to a rather different experience and notice even small stutter.
Yeah, I experience this a lot with Fortnite. The game is running high fps, then you move the camera fast in any direction and you see the core clocks drop from 2GHz+ to 500MHz, with GPU load hovering around 40% at 500MHz, causing massive frame times of 500ms upwards. It seems to be worse the more you lighten the load on the GPU too, say if you go to low settings.
https://forums.guru3d.com/threads/r...-mods-bios-tweaks.434911/page-44#post-5891935 It's been going on a while. I noticed the frame buffer goes to 0% on my graph.
Also, if you set the minimum GPU clock speed above 500 in Wattman while playing high-fps games that cause this hitching/glitching, you'll usually get a black screen and maybe a system reboot. If you look at Event Viewer, it usually shows a WHEA error, something to do with the CPU or AMD drivers.
I'm assuming that's the equivalent of, or the same thing as, nVidia's FastSync? FastSync has been measured as adding input lag, just nowhere near as much as vsync, unless there were mistakes made in the videos I saw. But at the same time, it seems the higher the fps, the lower that already much-lower-than-vsync input lag would be.
I don't know if it adds any lag, but it certainly allows for faster input than with V-Sync on, so for casual gamers who don't care about removing every millisecond it's great.
Also, if you drop under your monitor's refresh rate, it stops waiting for V-Sync, so you don't get the huge drop in FPS; you just get tearing under the refresh rate.