Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Darren Hodgson, Aug 20, 2020.
Game Mode would disable this unless 100ms input lag doesn't bother you.
Fun thing is that back when I had my Xbox 360 and connected it to an old 60 Hz monitor, the 30 fps were still "console" fps lul.
Care to give a bit more detail? I may try this.
I was kind of hesitant to write this, but could the distance from viewer to display play a part here? Up close you see all the "wrongs", but the greater the distance, the less you notice those imperfections, so it smooths out a little?
There is also a latency difference between input methods, controller vs. mouse. On the rare occasions I have played games at 30 fps on my PC, it feels a lot better using a controller than a mouse. Mouse movement feels stuttery and jittery at 30 fps, but it feels OK with a controller.
Why would "distance" make any difference? It will be 30 fps at 100 meters or at 1 meter... I don't understand this logic. The only time distance can help is with resolution; for example, even 720p on a big TV can look fairly decent if you are far enough from the screen. I can't see how this applies to animated images in motion, especially a game. I used to sit super close to my console as a kid, and if I play any game on any console to this date, it will be as "smooth" as I remembered it, whether I sit as close as possible or not.
Motion blur truly is a horrible experience. I understand the need for it, but my god, my eyes go bloodshot and start watering after 30 minutes to an hour with it on. That, and it feels like the TV is covered in vaseline.
God of War has given a lot of people motion sickness with its blur techniques, so bad that some have nearly felt like passing out.
I am playing the release version of Marvel's Avengers with a 30 fps framerate cap in RTSS for consistent performance on maxed-out settings at 1440p, because even the in-game dynamic resolution targets of 60 fps (and even 30 fps) don't seem to work properly, making the game feel like it is speeding up and slowing down depending on whether the framerate is nearer 60 fps or 30 fps. It's very off-putting even though I am using a 165 Hz G-SYNC display. Maybe my ageing i7-4770K is to blame here, even though I don't see particularly high usage even when targeting 60 fps.
The 30 fps cap fixes this and results in a game that runs 99.999% locked, even in the cutscenes and loading screens (which are animated). Furthermore, this is a smooth 30 fps that feels on a par with an optimised 30 fps console game, which is refreshing to play on PC. Normally I find 30 fps too unsmooth in feel, and too hard on my eyes, to use for any length of time on PC.
@Darren Hodgson I've noticed this occur in several games on my fairly beefy rig (3900X + 2080S + 32 gigs 3200 MHz RAM + NVMe SSD @1440p G-Sync). For example Arkham Knight stutters like crazy for me unless I cap it to 30 via RTSS (in-engine limiter still stutters for me).
My "theory" is that some console ports were simply built around 30 fps: they work best (no I/O stutter or microstuttering) when outputting at 30, even on PC, while exhibiting bizarre issues above that. I saw the exact same behavior in the same games on my last 8700K + 1080 Ti PC too. Now, I'm not saying this is super common (I'm not sure), but I have seen similar behavior in certain PC ports. In such cases (unless you have G-Sync) I would use half-refresh v-sync plus either an in-engine cap at 30 if available, or an RTSS cap of ("true refresh"/2) - 0.01 (the low-lag v-sync trick from the Blur Busters article), and probably reduce the flip queue/pre-render queue to ultra/minimum for more acceptable latency (or use an in-game setting like Overwatch's Reduce Buffering, which is preferred to the control panel settings from what I gather).
For me, I would prefer a steady 30 over lots of microstutter -- I really just can't stand hitching of any kind, it drives me mad lol.
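The cap value from that trick can be computed mechanically. A minimal sketch (plain arithmetic, nothing from RTSS itself; the 59.94 Hz "true refresh" is an example value for illustration, use whatever your monitor actually reports):

```python
def rtss_cap(true_refresh_hz: float, divisor: int = 2) -> float:
    """Blur Busters low-lag v-sync cap: (true refresh / divisor) - 0.01.
    divisor=2 for half-refresh v-sync, 4 for quarter-refresh."""
    return round(true_refresh_hz / divisor - 0.01, 3)

# A nominal 60 Hz panel often reports ~59.94 Hz true refresh (example value):
print(rtss_cap(59.94))      # -> 29.96
# Quarter-refresh on a 144 Hz panel:
print(rtss_cap(144.0, 4))   # -> 35.99
```

The 0.01 offset just keeps the limiter fractionally below the refresh divisor so the framerate limiter, not v-sync back-pressure, is what paces frames.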
Tried 30 fps with various additional settings in AC Origins today on my new 144 Hz LCD. It didn't end well; I nearly puked instantly when I tried to move around.
@SpajdrEX Is your panel FreeSync/G-Sync? If not, then 30 might look really bad with traditional v-sync since it doesn't divide evenly into 144 Hz. Strangely (I've tested this back to back on a traditional panel vs. G-Sync), 30 fps with half-refresh v-sync on a fixed-refresh 60 Hz panel looks noticeably smoother to my eye than 30 fps with G-Sync + RTSS limit + control panel v-sync on (though input lag is definitely better with G-Sync at 30 than with half-refresh in my experience). If you're using 144 Hz fixed refresh, then I believe you'd need to use something like Nvidia Inspector's 1/4-refresh v-sync (36 fps on a 144 Hz panel) and then set the framerate limit at 36 (or use the ("true refresh"/4) - 0.01 trick from the Blur Busters low-lag v-sync guide if you're using RTSS).
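The "divides evenly" point is easy to sanity-check. A tiny illustrative helper (hypothetical, not part of any tool) that lists the caps that fit a whole number of refresh cycles per frame, which is what fixed-refresh v-sync needs for even pacing:

```python
def even_caps(refresh_hz: int) -> list[int]:
    """Framerate caps where each frame is shown for a whole number of
    refresh cycles -- the only caps that pace cleanly with traditional
    fixed-refresh v-sync."""
    return [refresh_hz // d for d in range(1, refresh_hz + 1)
            if refresh_hz % d == 0]

print(even_caps(144)[:5])   # [144, 72, 48, 36, 24] -- no 30 at 144 Hz
print(30 in even_caps(60))  # True -- which is why 30 paces fine at 60 Hz
```

At 144 Hz a 30 fps frame would need 4.8 refresh cycles, so frames alternate between 4 and 5 cycles on screen, which is the judder being described.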
A little reminder, Unwinder's scanline sync /2 is a good option as well when trying to reach a smooth 30 fps on a 60Hz monitor without VRR.
@emperorsfist Scanline Sync is great when it works. In my own tests on an older laptop of mine, I found that unless I had a lot of overhead, the tear would frequently move around enough that I couldn't consistently keep it off screen. This depends on the game though and assuming I had the overhead, it was pretty great yeah.
My apologies I forgot to respond to this.
If you wish to start getting into tweaking your DDR4 memory kit please read this document for Intel and AMD:
https://github.com/integralfx/MemTestHelper/blob/master/DDR4 OC Guide.md
This will provide you with pretty much all the information you need to get started and the software needed for memory benchmarking / error checking. You can skip to section four in the document if you wish to get started right away.
With my Corsair Vengeance Samsung B-die DDR4 modules (2133 MHz stock, 3000 MHz XMP) I was able to get a COMPLETELY stable 3400 MHz at 16-19-19-37, although I think I lost the silicon lottery with this kit as I've seen better timing/frequency results online. The performance of the system and general latency have improved significantly though.
A few things from my experimentation I would like to share:
First I'll start with my specs:
i7-8700K @ 4.7 GHz (all cores fixed frequency), EIST/C-states disabled in BIOS
MSI Z370 Gaming Plus motherboard
MSI RTX 2070 Super Gaming X
16 GB (2x 8 GB) Corsair/Samsung B-die DDR4, 3000 MHz XMP (2133 MHz default) - I run these overclocked @ 3400 MHz with primary timings 16-19-19-37
Windows 10 64-bit
Samsung EVO M.2 NVMe 500 GB
1. In the UEFI (bios) on my MSI motherboard, there are two settings under the "Advanced Memory Timings" to be aware of, one called "Memory Performance Finetune" and another called "Turn Around Timing Optimize" - Disable both of these options. If you're using a motherboard from another manufacturer like ASUS there might be similar options to be aware of.
2. Leave options to optimize RTL / IO-L's enabled as these won't (on my MSI board) change any other timings automatically during memory training.
3. Voltage. There are three voltage controls to be played with that DO affect perceived 'smoothness' of games and the windows mouse cursor (again board specific and your results may vary)
CPU SA Voltage, CPU IO Voltage, DRAM Voltage. BE CAREFUL.
I would suggest making a note of your XMP profile voltages and then applying them to your custom overclock as a baseline. My XMP SA/IO voltages were automatically set VERY high!
4. When it comes to secondary timings take your time, in the document linked above only use the "presets" shown in the boxes and then stress test to ensure there are no errors.
5. Ring ratio / cache (LLC) clock can affect your timings in positive and negative ways. When I asked MSI, they said that when overclocking it's generally best to have it set three 'clicks' below your CPU frequency, so for me 4.7 GHz CPU = 4.4 GHz ring. I would like more information on this if possible.
6. Use the AIDA64 latency / read test after each change to ensure the performance is actually improving - many times the tightest settings are NOT the best. Leave it a couple of minutes after Windows reboot to ensure the system has settled before benchmarking.
7. Keep an eye on CPU temps especially when running tightened secondary timings, things can heat up a bit more than normal!
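On point 6, AIDA64 is the right tool. Purely as an illustration of what a latency test measures, here is a crude pointer-chasing sketch; interpreter overhead inflates the absolute numbers enormously, so only relative changes between runs mean anything, and this is in no way a substitute for AIDA64:

```python
import random
import time

def chase_latency_ns(n: int = 1 << 20, hops: int = 1 << 20) -> float:
    """Walk a random single-cycle permutation through a large buffer and
    return the average time per hop. Each load depends on the previous
    one, which is what defeats prefetching and exposes memory latency."""
    order = list(range(n))
    random.shuffle(order)
    nxt = [0] * n
    for a, b in zip(order, order[1:] + order[:1]):  # link into one big cycle
        nxt[a] = b
    i = 0
    t0 = time.perf_counter_ns()
    for _ in range(hops):
        i = nxt[i]
    return (time.perf_counter_ns() - t0) / hops

print(f"{chase_latency_ns():.1f} ns/hop")
```

The same idea (dependent loads through a buffer larger than cache) is what real latency benchmarks implement in native code.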
Although I was disappointed with my end stable results, I found the process interesting and well worth doing. Good luck!
It's FreeSync (AOC 27G2U). I tried 1/4-refresh v-sync + an RTSS limit in Remnant: From the Ashes and so far it looks really good in movement. Thanks for the tip!
Guys, what's the general idea about using 1/2-refresh v-sync from Nvidia Inspector combined with Ultra Low Latency mode?
I would rather not involve RTSS if possible.
In some games this works great, but often Nvidia's 1/2 sync gives me occasional microstutters that I do not have with RTSS.
My personal method to test this (in first/third-person games) is to turn the view around really slowly while watching the panorama. If there are no microstutters in this test, I'm OK.
It simply comes down to proper sync, frame pacing, and visual effects/shaders (such as any kind of motion blur) being specifically sampled/tuned for the framerate.
Shadow of the Colossus on PS4 has a 30 fps mode whose motion blur is not tuned for 30 fps, and it made me sick within 5 seconds; thankfully it also has a very decent 60 fps mode.
In the end, when given alternative framerates, not even consoles can get 30 fps right.
I have a liquid-cooled 2080 Ti and a G-Sync monitor, so in theory I can run pretty much all games at max everything. However, there are two things I never use because I hate them with a passion: motion blur and G-Sync. It may just be me, as I have a genuine medical reason: flickering lights of certain frequencies trigger migraines, so most old-school CRTs were a big no-no, as are some lights. I find motion blur and G-Sync sometimes collude to create that exact set of circumstances.
As a result, I find games play much smoother if I just set the resolution etc. at max in the controls.
I use Steam in Big Picture mode and I would prefer not to have another extra thing hooking in the game if possible, and another configuration menu to go through, hence why I'm not using RTSS yet.