Why does 30 fps on PC not look as smooth as 30 fps on console?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Darren Hodgson, Aug 20, 2020.

  1. dampflokfreund

    dampflokfreund Master Guru

    Messages:
    203
    Likes Received:
    31
    GPU:
    8600/8700M Series
    Deactivate V-Sync in games and force 1/2 refresh rate V-Sync using Nvidia Inspector. You get smooth-as-butter 30 FPS, similar to consoles.
     
    Dragam1337 likes this.
  2. Smough

    Smough Master Guru

    Messages:
    984
    Likes Received:
    303
    GPU:
    GTX 1660
    You could test the Nvidia frame rate limiter, which seems to be very solid, or the one included in the Radeon software.

     
    BlindBison likes this.
  3. OrdinaryOregano

    OrdinaryOregano Guest

    Messages:
    433
    Likes Received:
    6
    GPU:
    MSI 1080 Gaming X
    @Darren Hodgson It seems like there's been a lot of discussion here, but it doesn't look like anybody has mentioned this one thing I've felt and noticed.

    The biggest potential reason in your setup is G-Sync. I have noticed that 30 FPS on G-Sync looks like complete garbage and looks a lot better on a regular display. Blur Busters mentioned that when a G-Sync monitor goes under ~36 FPS, the monitor starts adding duplicate refreshes in between to compensate, and that's been my experience ever since I bought a G-Sync monitor years ago. I've pretty much stopped playing anything where I can't maintain a minimum of 40+ FPS; otherwise it's an unplayable, choppy mess that I cannot stand. I would suggest that you try those same games after disabling G-Sync to see if you notice any difference.
     
    Darren Hodgson, enkoo1 and BlindBison like this.
  4. Xhah

    Xhah Guest

    Messages:
    7
    Likes Received:
    3
    GPU:
    HD7790
    It's not all that difficult. There are 3 key things one must remember:

    1. Frame rate: Your frame rate must divide evenly into your refresh rate (e.g. exactly half or a quarter of it), otherwise you'll notice frame judder or frame jumping. If you have a 120 Hz panel then cap it to 30, 60 or 120 and not somewhere in between like 75 or whatever. The OP locked the game at 30 FPS on his 165 Hz monitor, which is a HUGE no-no, as 165 ÷ 4 = 41.25 (see the sketch after this list).

    2. Frame time: Another aspect that's often overlooked. If you look at Digital Foundry's console benchmarks you'll see that 30 FPS console games run at an exact 33.33 ms, which contributes to their smoothness, as each frame is held on screen for exactly 2 refresh cycles and not a millisecond more.

    Now here's the thing: if you turn on the frame time graph in MSI Afterburner, you'll notice that Nvidia's and AMD's built-in frame limiting utilities don't work as well as they should and fluctuate a lot, usually in the 15-20 ms range, which is a LOT, especially if you compare them to consoles! Your eyes may not notice such minuscule stutters, but they're there alright! The best tool for this purpose is RTSS, which comes bundled with MSI Afterburner and does a MARVELOUS job of locking your frame times.

    3. Vsync: This one is a no brainer, really. If you want smoothness then you'll have to sync your frames with your monitor's refresh rate otherwise you'll notice tearing and micro stutters.
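    To put some quick numbers behind points 1 and 2, here's a minimal sketch (just my own illustration of the arithmetic, nothing official) that checks whether a frame cap divides evenly into a refresh rate and what frame time that implies:

    Code:
    # Rough illustration of the divisor rule above; names and output format are mine.
    def check_cap(refresh_hz: float, cap_fps: float) -> None:
        cycles = refresh_hz / cap_fps        # refresh cycles each frame stays on screen
        frame_time_ms = 1000.0 / cap_fps     # target frame time in milliseconds
        verdict = "even, should pace cleanly" if cycles.is_integer() else "uneven, expect judder"
        print(f"{cap_fps} fps @ {refresh_hz} Hz: {frame_time_ms:.2f} ms/frame, "
              f"{cycles:g} cycles/frame ({verdict})")

    check_cap(60, 30)    # 33.33 ms, 2 cycles   -> even
    check_cap(120, 30)   # 33.33 ms, 4 cycles   -> even
    check_cap(165, 30)   # 33.33 ms, 5.5 cycles -> uneven, the OP's situation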
     
    Last edited: Oct 4, 2020
    MotherSoraka likes this.

  5. Smough

    Smough Master Guru

    Messages:
    984
    Likes Received:
    303
    GPU:
    GTX 1660
    As has been discussed before here, it's not frame times. Why do people keep repeating this? Also, it's not "vsync"; you must use "half v-sync" in combination with RTSS at 30 fps. Even getting 30 fps at 33.3 ms on PC, you won't get the feel of console 30 fps smoothness. You should have taken the time to read everything that has been said in this thread instead of giving such a blind response, one that has already been suggested by some in here as well. Most of us are well aware of what frame time is, and often this isn't the problem when you try to get "console-like" 30 fps on PC.
     
  6. Xhah

    Xhah Guest

    Messages:
    7
    Likes Received:
    3
    GPU:
    HD7790
    Let me ask you something first: do you think half vsync caps a 60 Hz monitor to 30 Hz?

    No, it doesn't. It simply displays each frame for 2 refresh cycles, and one can achieve the same "effect" by locking the frame rate with RTSS. What's more, I just forced half vsync in Hitman (my PC can't quite push it to 60) and didn't notice any difference between half and full vsync.

    But if you think half vsync makes a world of difference and frame times are absolutely irrelevant, then so be it! I'm not looking for a pointless debate.
     
  7. fernake

    fernake Master Guru

    Messages:
    231
    Likes Received:
    21
    GPU:
    Asus strix GTX 1080 OC
    Don't bother trying any more methods to get console-like 30 fps; nothing works. The only thing you can achieve is raw 30 fps without any of the smoothness. It is what it is.
     
    enkoo1 likes this.
  8. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Hmm. My display has a 40-120 range. I didn't notice this issue when I tried to limit the fps to a constant number below 40.
    NVIDIA didn't give it a name like AMD did (Low Framerate Compensation, if I recall correctly), but they supposedly had it from G-Sync's initial public launch (AMD only added this to FreeSync a while after FreeSync's introduction).
    So maybe it only happens with varying framerates below your range. But since we're talking about a constant 30 fps limit here anyway..., I don't know...
    G-Sync should be "perfect" for a constant 30 fps limited scenario (low lag, no tearing, etc). Most displays would of course actually get 60 from the driver but those kinds of even frame duplications should be fully invisible (the image doesn't actually change on the screen at all - you can't see what's not even there...).
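    Just to spell out the frame-duplication idea (this is only my rough sketch of the concept, not NVIDIA's or AMD's actual algorithm): when the frame rate drops below the panel's minimum VRR rate, each frame gets sent multiple times so the effective refresh lands back inside the supported range.

    Code:
    # Toy model of low-framerate compensation / frame duplication as I understand it.
    def lfc_refresh(fps: float, vrr_min: float, vrr_max: float) -> tuple[int, float]:
        multiplier = 1
        while fps * multiplier < vrr_min:    # duplicate frames until we re-enter the range
            multiplier += 1
        refresh = fps * multiplier
        if refresh > vrr_max:
            raise ValueError("no duplication factor fits this VRR window")
        return multiplier, refresh

    # Constant 30 fps on my 40-120 Hz display: each frame shown twice, panel runs at 60 Hz.
    print(lfc_refresh(30.0, 40, 120))   # -> (2, 60.0)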

    Do in-game and/or RTSS / nV v3 limiters help to smooth out the sampling of the inputs, aka create even scene pacing?
    I think AMD Chill claims to do something like that (on top of limiting the fps). But I never tried that with FreeSync (especially not with a constant 30 fps limit -> min=max=30 in this case).
     
    Last edited: Oct 6, 2020
  9. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    @OrdinaryOregano I have G-Sync myself and have tested 30 FPS with G-Sync (FPS limiters) vs 30 FPS (FPS limiters) with half-refresh V-Sync on my fixed 60 Hz panel, and yeah -- 30 fps definitely looks smoother to the eye with the fixed refresh panel/half-refresh V-Sync. That said, input lag (even with the Blur Busters low-lag trick with an FPS limiter) is noticeably better with G-Sync, so it might still be preferred overall for that reason.
     
  10. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    I actually agree with @Smough on this. I do not think the effect is placebo in some cases. I can only speculate as to "why" (which I will do in a moment) but I have observed this phenomenon too. I was actually just testing this today in a few games where I cap the game with RTSS to a perfect 30 with g-sync + v-sync then compare to my laptop using traditional half refresh rate v-sync then compare against my console outputting to 30 to tv.

    30 FPS on PC actually does look quite smooth and reasonably close to what my console is producing "if" I'm using traditional half refresh rate v-sync from Nvidia Inspector with motion blur enabled. Back to back, this looks a lot smoother than 30 FPS on my G-Sync monitor capped with RTSS. Like, a lot -- noticeably. One theory I have here, since RealNC mentioned some time ago that he could not reproduce this, is that it comes down to differences in monitor type, as some produce different levels of image blur. Something I will get around to testing is plugging my Switch with BotW into my TV, monitor, and laptop screen; then I can report back on whether I observe differences in the same area/panning the camera.

    My other theory is that this has something to do with the way some games handle camera animation. For example, in specific games, even though the frametime readout is flawless/completely perfect, there is visible "jitter" when panning the camera. MCC and DX12 BFV did this for me with G-Sync + V-Sync ON + consistently reaching an RTSS cap, then just panning the camera. In fairness, it's been quite a while since I tested this, so it may just come down to the way some games handle their camera animation.

    The other thing it could be, maybe, is the way games tune their motion blur. For example, I noticed that with G-Sync + V-Sync ON, capped internally to 60 fps in Dark Souls Remastered, panning the camera looks very, very smooth -- basically perfectly smooth, with no visible camera jitter I could make out (using the in-game built-in cap). The thing that struck me in that game is that the motion blur looks amazing to me in terms of how they tuned the strength/shutter speed or what have you. I wonder if console games built with 30 in mind use a much stronger/longer-shutter motion blur or some such. I noticed in Sekiro that if you use a mod to remove the 60 fps cap and then set an RTSS cap to 120, it actually looks perceptually less smooth than setting an RTSS cap to 60 in that game; for one because the motion blur strength is basically nonexistent at the higher values, and then I suspect perhaps also because the game was not built to read input/interpolate the camera animation correctly at FPS above 60 -- but I can only guess.

    But yeah, seems to me there can be quite a noticeable difference -- in some cases -- where perfect frametime readouts are no guarantee of flawless smoothness. Impressive smoothness being a characteristic console 30 fps often seems to have (though in other cases you're clearly better off on PC since Fromsoftware 30 is garbage on console, etc).
     
    Last edited: Apr 12, 2023

  11. Benik3

    Benik3 Master Guru

    Messages:
    605
    Likes Received:
    73
    GPU:
    Aorus E. RTX2080Ti
    Yeah, consoles use motion blur to mask the 30fps stuttering.
    The sharper the image and faster the monitor you have, the more you will see the stuttering.
    If you look at 30 fps on a G-Sync/FreeSync monitor with low latency, you will notice the stuttering. If you look at 30 FPS on a TV, which has a slow panel and can also automatically apply image interpolation, it will look much smoother.
    This effect is also more easily seen on OLED panels, because they are extremely fast (there is almost no blurring when the picture changes), so e.g. camera movement can be seen stuttering.

    Anyway, even on PC there are visible differences between games in their smoothness at the same FPS. I know that some games look smooth even at 40 FPS and some are "unplayable". I think it also depends on frame time consistency and input lag, but I never investigated it...
     
  12. Memorian

    Memorian Ancient Guru

    Messages:
    4,021
    Likes Received:
    890
    GPU:
    RTX 4090
    You mean judder, not stuttering...
     
  13. oneoulker

    oneoulker Member Guru

    Messages:
    193
    Likes Received:
    225
    GPU:
    NVIDIA RTX 3070
    1/2 vsync works but has horrendous input lag. Somehow the PS4 does not have similar input lag.
    1/2 vsync + a 30 fps cap reduces the input lag quite a bit, but it needs to be aligned to VBLANK or something, or you will get random judders here and there.

    1/2 vsync at 60 Hz on the exact same G-Sync screen feels smoother than a locked 30 with G-Sync enabled. VSYNC makes the picture appear smoother somehow/someway.

    However, the input lag is a problem.

    Also try vsync forced on (no fractional) and an in-game 30 fps cap. That too kind of works, but from time to time it produces judders. I heard that if the frame cap is aligned to VBLANK, it wouldn't do that, and sadly, NVCP/RTSS frame caps are not aligned to VBLANK from what I heard. Only Special K's frame cap is aligned to VBLANK. Might be worth a try. But as I said in the other thread, external frame limiters add 1 frame of input lag, which is problematic with aiming mechanics at low framerates.
     
    Last edited: Apr 13, 2023
    BlindBison likes this.
  14. Benik3

    Benik3 Master Guru

    Messages:
    605
    Likes Received:
    73
    GPU:
    Aorus E. RTX2080Ti
    Yeah, I didn't remember that name.
    BTW, on RTINGS they have it the exact opposite way:
    stutter is related to screen speed, judder is about the consistency of frame times:
    https://www.rtings.com/tv/tests/motion/stutter

    VRR (G-Sync/FreeSync/Adaptive Sync) is not the same as V-SYNC. It changes the refresh rate of the monitor, but it doesn't do exact frame syncing. That's why you can still see picture tearing even with VRR enabled. You can enable V-SYNC together with VRR to get the optimal result; just keep the FPS at least 3 FPS lower than your maximum VRR refresh rate. Then you will have synced frames without added input lag. It's been discussed here and on the internet many times; Blur Busters also ran tests for various VRR + V-SYNC scenarios.
    I have V-SYNC enabled and a framerate limiter set in the global settings of the NVIDIA driver.
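    If it helps, this is the simple arithmetic behind that rule of thumb (just my own illustration, the function name is made up):

    Code:
    # Cap a few FPS below the panel's max VRR refresh so the FPS never hits the V-SYNC ceiling.
    def vrr_cap(max_refresh_hz: int, margin_fps: int = 3) -> tuple[int, float]:
        cap = max_refresh_hz - margin_fps
        return cap, round(1000 / cap, 2)   # (cap in fps, resulting frame time in ms)

    for hz in (60, 120, 144, 165, 240):
        print(hz, "Hz ->", vrr_cap(hz))    # e.g. 240 Hz -> (237, 4.22)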
     
    Last edited: Apr 13, 2023
    janos666 likes this.
  15. pegasus1

    pegasus1 Ancient Guru

    Messages:
    5,188
    Likes Received:
    3,588
    GPU:
    TUF 4090
    Most Reddit comment ever
     
    BlindBison likes this.

  16. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    I've noticed this too -- my crackpot theory, which could definitely be wrong, is that it has to do with frame presentation times. Maybe you need both accurate/perfect timing for releasing the frame from the CPU to the GPU, and then you'd also need the GPU to wait and release the frame to the monitor with perfect timing, as it does with half-refresh v-sync? So, RTSS with its Async mode is very accurate in terms of when it releases the frame buffer from the GPU. This frame presentation is what is measured in the RTSS frame time readout by default. "But" if you measure actual "frame presentation" you will notice that in-engine limiters, RTSS, the driver limiter -- whatever really -- all have a good amount of variance, and I assume that's because each frame takes a very slightly different amount of time to render, and with G-Sync these frames get sent to the monitor as soon as they are finished. This means it would look like this, I believe:

    G-Sync:
    1) CPU -> GPU -> Monitor
    2) 30 FPS cap in RTSS with default Async Mode ON
    3) CPU release frame to GPU at flawless 33.3 ms intervals (I "think" this means the input is also read at "perfect" intervals since that's on the CPU side of things so hopefully you get correct camera motion/animation and such)
    4) GPU receives frame and does its work -- once finished it immediately sends the frame to the monitor.
    5) Because each frame takes a slightly different amount of time for the GPU to render you get some variation in frame present timing and thus it does not look "perfectly flawlessly smooth" like half refresh v-sync would.

    Traditional 60 Hz panel with 1/2 Refresh Rate V-Sync:
    1) Steps 1 through 3 are the same
    2) Step 4 instead of sending to monitor immediately it "waits" until the perfect screen refresh moment to submit (v-blank window iirc).
    3) Since the GPU waited until the perfect 33.3 ms moment to submit you get a flawless 30 across the board.

    Of course RTSS has a front edge sync mode that focuses on frame presentation, but then you're not reading the user's input or releasing the frame from the CPU at perfect intervals anymore with that mode, so I expect it could still not look perfectly smooth (in my tests that mode typically doesn't look smoother to my eye than the default async mode). So, for a "true" half-refresh rate v-sync setup with the RTSS perfect cap for the low-lag setup, we are both releasing the frame from the CPU at perfect intervals and then also, after the GPU renders the frame, it has to wait until the perfect release point to submit it. Thus you get truly smooth camera motion and such.
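    To make the contrast concrete, here's a toy simulation of what I mean (purely my own sketch of the idea, not how the driver actually schedules anything): the CPU hands frames off at a perfect 33.33 ms cadence, the GPU render time varies a little per frame, and then we compare the on-screen spacing when frames are shown immediately (G-Sync) versus held for the next vblank (half-refresh v-sync):

    Code:
    import random

    REFRESH_MS = 1000 / 60          # 60 Hz panel
    INTERVAL_MS = 2 * REFRESH_MS    # 30 fps cap = one frame every 2nd vblank
    random.seed(1)

    gsync_prev = vsync_prev = None
    for i in range(6):
        submit = i * INTERVAL_MS                      # CPU hand-off at perfect intervals
        done = submit + random.uniform(8.0, 14.0)     # GPU finishes after a variable render time

        gsync_present = done                          # G-Sync: shown the moment it's done
        vsync_present = (int(done // REFRESH_MS) + 1) * REFRESH_MS   # v-sync: wait for next vblank

        if gsync_prev is not None:
            print(f"frame {i}: G-Sync gap {gsync_present - gsync_prev:6.2f} ms | "
                  f"half-refresh V-Sync gap {vsync_present - vsync_prev:6.2f} ms")
        gsync_prev, vsync_prev = gsync_present, vsync_present

    The G-Sync gaps wobble around 33.33 ms by a few milliseconds (because each frame's render time differs), while the half-refresh v-sync gaps come out at exactly 33.33 ms every time, which is basically the difference I think I'm seeing.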

    I could be off base regarding this. It would be good to get some insight or correction from people more in the know on the nitty gritty details of frame rate limiting/adaptive sync/v-sync. If I am correct I wonder if it would be possible to have an Async + Front Edge hybrid limiting mode where it both waits to release the frame buffer from the CPU to GPU at perfect intervals but then also has the GPU wait slightly such that the present times all have precisely the same gaps between them. I expect input lag would not be good in this mode/it might be the same as traditional half-refresh rate v-sync at that point. @Unwinder No need to respond if you're busy of course, but this is a subject I am very curious about. Thanks,
     
    Last edited: Apr 14, 2023
  17. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    There is a pretty nutty amount of wrong information that somehow reaches the top comments on Reddit. Not saying that never happens here, of course it does, but generally what I've seen is that users here are better about presenting things they don't know as questions, or as though they are not certain. That, and a lot of users here really do know a ton about niche PC and technical topics. On Reddit I used to regularly see people making technical statements as though they were 100% certain when they were totally wrong, and then somehow these would be the topmost comments. Makes me wonder how exactly their comment hierarchy rating algorithm works.
    At native 4K you are probably right that most of the new batch of GPUs from AMD and Nvidia would be unable to do maximum settings with full RT and reach 60+ fps constantly. But I'd argue that's not really a typical use case for most PC users. They may be playing with a 1440p monitor. They probably will be using DLSS or FSR2. Probably most users would use something like High settings rather than Ultra and if they do opt to use RT many users will set it to Medium quality rather than High/Ultra RT quality. Savings like that can go a very long way. Even cards like the 3070 which are now sadly becoming very VRAM constrained can run games pretty well if you're willing to turn down the Textures/Shadow maps to Medium and use Image Reconstruction.

    I do agree with you that both Nvidia and AMD should provide the tools for a proper "console-like" 30 FPS, but they sort of do already. Or at least Nvidia users can, if they're willing to download Nvidia Profile Inspector for 1/2 refresh V-Sync. In conjunction with a framerate cap and maybe one of the low-lag modes, one can achieve a pretty close to "console-like" 30 in most cases. Now, I think they should take this even further -- Nvidia should have a checkbox on their FPS limiter to allow for decimal precision out to two places (RTSS has this already and it's awesome), and Nvidia should just go ahead and add actual 1/2 refresh rate v-sync to their driver options rather than only having "adaptive" there by default and requiring users to download Profile Inspector. There are also some issues IIRC with Optimus laptops and driver v-sync, so yeah, I think you're right that more work should be done there in giving users options.
     
    Last edited: Apr 14, 2023
    pegasus1 likes this.
  18. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    Hm, I will have to look up that double game speed/render back technique. I'm having a hard time picturing how that would work in my head. It does seem like motion blur is tuned very well in most console games around their target framerate. One thing I noticed in Dark Souls Remastered on PC is that the motion blur in that game on the camera is just
    [image]
    Camera motion in general looks extremely smooth in that title -- like, smoother to my eye than other games running at 90 FPS, while that game is capped internally to 60. Makes me wonder if "smoothness" has more to it than just framerate. Maybe it has to do with a wider combination of factors such as how the game reads input, frame pacing behavior, motion blur implementation, etc. Frame pacing might also be more complex than I originally thought, since you have the CPU releasing the frame buffer to the GPU on one end, then you also have when the GPU releases the final frame to the monitor, and then how games read input and handle interpolation and stuff like that. It's over my head and I don't know enough to say beyond certain games looking much "smoother" than others. Console games at 30 sometimes seem to be doing "something" different, as a game like Uncharted 4 on PS4 looks very smooth to my eye despite outputting at 30 in the campaign. I think they used a pretty heavy motion blur shutter speed though.
     
  19. P_G19

    P_G19 Member

    Messages:
    30
    Likes Received:
    5
    GPU:
    GTX 1660 / 6GB
    Best 30 FPS experience was Special K for me.
     
    BlindBison likes this.
  20. Calenhad

    Calenhad Active Member

    Messages:
    66
    Likes Received:
    23
    GPU:
    MSi 3080ti Suprim X
    VRR/G-Sync/FreeSync: The display is synchronised with the GPU output rate; tearing should only happen when the FPS is outside the VRR frequency range of the display, which for some displays is way too narrow. Running 30 FPS content on a VRR display with a lower bound of 48 Hz and no frame doubling support can result in a bad experience, for example.

    Vsync: GPU is synchronised with display refresh rate. Tearing should never happen. Only really problematic if your CPU/GPU is unable to keep up with the display refresh rate. And input lag can be an issue with low refresh rate and vsync, depending on content

    Both: YMMV

    Setting an FPS limit will impact how this works as well.

    FWIW I run g-sync compatible enabled, vsync enabled and a 238 fps limit in the Nvidia control panel with my two 240Hz monitors. I don't remember the last time I saw any overshoot above 240 FPS, so I have no issues to speak of. And a frame time of 4.20ms in some games is "funny"
     
    BlindBison likes this.
