Does lowering "Max Prerendered Frames" reduce input lag when using traditional V-Sync?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by BlindBison, Dec 2, 2019.

  1. Mda400

    Mda400 Maha Guru

    Messages:
    1,090
    Likes Received:
    201
    GPU:
    4070Ti 3GHz/24GHz
    Theoretically yes, but it ultimately depends on how well the application takes advantage of the hardware.
    You mainly want your GPU to be the bottleneck since it's the last piece of hardware before it outputs to your display.

    Powerful and Weak were generalizations and it isn't as black and white as you may think.
    Only by looking at benchmark results, and even then by testing for yourself, will you know for sure what works in each situation.

    No one setting will be the best as not all applications are made the same.
    One starting point is to assume your CPU is "powerful" and globally set Low Latency Mode to Ultra; then, if you find it is "weak" in a specific application, adjust Low Latency Mode for that program individually. The trade-off is either a more consistent framerate or lower latency.
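    As a rough illustration of that global-default-plus-overrides idea (this is just conceptual Python, not any real driver API, and the executable names are made up):

```python
# Conceptual sketch: a per-program setting wins over the global default.
GLOBAL_LOW_LATENCY_MODE = "Ultra"

per_program_overrides = {
    "cpu_weak_title.exe": "On",  # a title found to stutter with Ultra
}

def effective_low_latency_mode(executable: str) -> str:
    """Return the per-program override if present, else the global default."""
    return per_program_overrides.get(executable, GLOBAL_LOW_LATENCY_MODE)

print(effective_low_latency_mode("some_shooter.exe"))    # Ultra (global)
print(effective_low_latency_mode("cpu_weak_title.exe"))  # On (override)
```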
     
    Last edited: Dec 4, 2019
  2. PQED

    PQED Active Member

    Messages:
    92
    Likes Received:
    32
    GPU:
    MSI 1070 Gaming X
    I don't play any of the games tested in that video, but in the ones I do (whether they tax my CPU or GPU to the max or not), pre-render 1 causes a marked improvement, and NVIDIA's new "Ultra" setting even more so. It's simply an unmistakable difference. No placebo.

    For anyone who has Dying Light (a title notorious for input lag), go ahead and try it for yourselves; you'll most likely see a marked improvement (especially if you're stuck with 60Hz & V-Sync only, like me).
    I'm not saying it will help in every title though, because as we all know, titles may differ greatly (like Dying Light does), and you may be met by micro-stuttering (as many may well have been).

    You're likely to see the best results with only V-sync and the Ultra setting; limiters may add lag.

    I would advise against this personally. There are no doubt many titles that don't react well to this setting, and in my opinion you are therefore better off only setting it for titles where you know it benefits you.

    This is going to be my final post on the matter; I really don't have anything else to add that hasn't already been said in one way or another.


    On a side note: I see that my little jab at Astyanax has been removed, whereas his is allowed to stand. What kind of moderation is that?
    I may have provoked him, but not before he did so towards others in this thread first, with his rather toxic attitude and little explanation or even proof to back up what he was saying. Not a first for him on this forum, either.
     
  3. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    Thanks, that's good to know, @PQED. I appreciate your explaining all of that.
     
  4. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    Correct, there is a real effect. Are you done yet?


    Nope, your theoretical is not what happens in reality, as those two events rarely occur with exactly the same timing. You're arguing with Unwinder, who has likely forgotten more than you pretend to know.
     

  5. Mda400

    Mda400 Maha Guru

    Messages:
    1,090
    Likes Received:
    201
    GPU:
    4070Ti 3GHz/24GHz
    Right, I should have put it this way for the sake of preference:

    If input latency matters to you, set Low Latency Mode globally to Ultra and then change on a per-program basis if performance is not optimal.

    If consistently high framerates matter to you, keep Low Latency Mode off. However, you may put yourself at a disadvantage in applications where input response is crucial.
     
    PQED likes this.
  6. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    @Mda400 It's not slightly lower overall fps I worry about; it's the potential hitching and stuttering that reducing the size of the flip queue may cause (according to some at least, including AMD apparently, going by their note on the flip queue setting in that Google Doc link I attached earlier). Better frametimes rather than framerates is one of the things CaptaPraelium talked about in that Future Frame Rendering BFV reddit thread, and it's a bit agitating for me that when Hardware Unboxed (and other outlets) test reducing the size of the flip queue, the only thing they normally look at is fps. You can have almost the same average fps while getting more stutters/downward framerate spikes, and that's what I'm primarily concerned with. I despise input lag (like most people, I imagine), but I despise hitching and stuttering even more; it really takes one out of the game when it happens, and in my experience way too many games have a hitching problem with uncapped fps for some reason.
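    To put rough numbers on what I mean, here's a quick sketch with two made-up frametime traces; the figures are purely illustrative, not from any real benchmark:

```python
# Two made-up frametime traces (milliseconds) with nearly identical
# average fps but very different stutter behaviour.
smooth = [16.7] * 100                 # steady ~60 fps
stuttery = [15.0] * 95 + [50.0] * 5   # similar average, periodic spikes

def avg_fps(frametimes_ms):
    return 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

def one_percent_low_fps(frametimes_ms):
    # fps implied by the worst 1% of frametimes
    worst = sorted(frametimes_ms)[-max(1, len(frametimes_ms) // 100):]
    return 1000.0 / (sum(worst) / len(worst))

for name, trace in (("smooth", smooth), ("stuttery", stuttery)):
    print(f"{name}: avg {avg_fps(trace):.1f} fps, "
          f"1% low {one_percent_low_fps(trace):.1f} fps")
```

    Both traces average roughly 60 fps, but the 1% lows (about 60 vs 20 fps) are worlds apart, which is exactly what an fps-only comparison hides.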
     
    Last edited: Dec 6, 2019
  7. Mda400

    Mda400 Maha Guru

    Messages:
    1,090
    Likes Received:
    201
    GPU:
    4070Ti 3GHz/24GHz
    Set it to what you prefer; not every application will behave the same, which is why there are options.

    But provided that you are not CPU-bottlenecked, lower values should reduce input latency as the GPU keeps getting fed more recent frames to paint.

    If there is stutter with any value, it can depend on the application, the hardware in your PC, or the graphics/OS power settings you use (like framerate limiting, vsync, or the Windows power plan).
     
    Last edited: Dec 6, 2019
  8. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    @Astyanax Sorry to bother you again, but if the user simply caps their framerate to a value that is consistently achievable, are you saying the default prerender settings will not have higher input delay than the low-lag modes/Ultra? (So, for example, if you were using traditional V-Sync in conjunction with the BlurBusters low-lag V-Sync guide, where you have an RTSS fps cap, or if you just capped fps to a consistently achievable value on a G-Sync panel, etc.)

    I read CaptaPraelium's reddit post on Future Frame Rendering for Battlefield V (https://www.reddit.com/r/BattlefieldV/comments/9vte98/future_frame_rendering_an_explanation/) and, provided I'm understanding him correctly, it doesn't sound like the CPU is preparing 3 whole frames at once (correct me if I'm wrong there). Rather, it prepares 1 and sends it to the GPU; then, while the GPU is working on that, it starts on the second frame, and then a third "if" the GPU is still busy with the first.

    However, if the GPU is able to keep up, then I would expect the flip queue to just stay empty on its own most of the time -- for example, if a user caps fps to a consistently achievable value. I might be misunderstanding this though, so correct me if I'm wrong.
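    To make sure I'm picturing it right, here's a toy model of that pipeline; all the timings are made up, and this is just my reading of the writeup, not actual driver behaviour:

```python
# Toy flip-queue model: the CPU may run at most `max_queue` frames
# ahead of the GPU; a frame's latency is measured from CPU start
# (roughly when input is sampled) to GPU finish. A frame cap paces
# the CPU directly. All numbers are invented for illustration.

def simulate(cpu_ms, gpu_ms, max_queue, cap_ms=0.0, frames=1000):
    cpu_done = [0.0] * frames
    gpu_done = [0.0] * frames
    latencies = []
    for i in range(frames):
        start = cpu_done[i - 1] if i else 0.0
        if cap_ms:
            start = max(start, i * cap_ms)               # limiter paces the CPU
        if i >= max_queue:
            start = max(start, gpu_done[i - max_queue])  # queue full, CPU waits
        cpu_done[i] = start + cpu_ms
        prev_gpu = gpu_done[i - 1] if i else 0.0
        gpu_done[i] = max(prev_gpu, cpu_done[i]) + gpu_ms
        latencies.append(gpu_done[i] - start)
    return sum(latencies[100:]) / len(latencies[100:])   # skip warm-up

# GPU-bound (CPU 4 ms, GPU 10 ms): the queue fills, and depth adds latency.
for q in (1, 3):
    print(f"GPU-bound, queue {q}: ~{simulate(4.0, 10.0, q):.0f} ms")
# Capped to 60 fps with GPU headroom: the queue stays empty, depth barely matters.
for q in (1, 3):
    print(f"capped,    queue {q}: ~{simulate(4.0, 10.0, q, cap_ms=1000/60):.0f} ms")
```

    In this toy model, the GPU-bound case goes from ~30 ms down to ~14 ms with a queue of 1, while the capped case sits at ~14 ms regardless of queue depth, which matches my intuition that a consistently achievable cap keeps the queue empty on its own.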

    Part of why all of this is so confusing is that sometimes results come out that seem a bit odd/not what one would expect. For example, BattleNonSense's test of the Ultra low lag mode for Overwatch found it actually increased input lag a bit when making use of the in-game fps limiter. Of course, that might be a game-specific case, I'm not sure (or it might not happen with an RTSS limit rather than the in-game one, who knows). Anyway, thanks for your time and helpful comments.

    EDIT: On the other hand, DisplayLag's Street Fighter 4 tests from forever ago (years back at this point) found that reducing the size of the prerender queue to 1 did still reduce input lag in conjunction with V-Sync (and that game caps its fps to 60 internally). Notably, BattleNonSense did not test external limiters, nor did he test the in-game ON setting for Reduce Buffering in Overwatch, or just a value of ON for the low latency mode -- he only tested the Ultra mode in conjunction with the in-engine fps cap for that title. Sorta wish he'd been more thorough.
     
    Last edited: Jun 11, 2020
  9. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,380
    GPU:
    GTX 1080ti
    The low-lag settings are more likely to reduce perceived latency when the GPU is under high load. Since capping means you're typically not going to be GPU-load-limited, you'll perceive little if any difference from changing the prerender.
     
    BlindBison likes this.
  10. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,507
    Likes Received:
    1,877
    GPU:
    7800 XT Hellhound
    The ULL setting also affects input lag with the RTSS and driver limiters. Capping via vsync should be a different case; ULL might only feel more direct when falling below the cap.

    Edit: I've just tested this in BF4, and without ULL (Ultra; didn't test prerender limit 1), the input latency gets atrocious with vsync. So I suspect it's simply adding up the latency of the CPU prerender and the vsync backbuffers, and thus ULL Ultra always helps.
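    Rough arithmetic for why those would add up at 60 Hz; this is purely illustrative, since a real pipeline overlaps stages and actual latency will be lower:

```python
# Back-of-the-envelope: at 60 Hz, each frame queued ahead of the
# display costs roughly one 16.7 ms refresh when the pipeline is
# saturated. Queue depths here are typical examples, not measured.
refresh_ms = 1000.0 / 60.0

for prerender_q in (3, 1):   # driver default-ish queue vs ULL-style 1
    backbuffers = 2          # a typical vsync swap chain
    total = (prerender_q + backbuffers) * refresh_ms
    print(f"prerender {prerender_q} + {backbuffers} backbuffers "
          f"= {total:.0f} ms queued ahead of display")
```

    That's ~83 ms versus ~50 ms of queueing in the worst case, which would explain why cutting the prerender side helps even though the vsync backbuffers are untouched.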
     
    Last edited: Jun 11, 2020
    BlindBison likes this.

  11. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    @Astyanax Thanks! That's helpful -- yeah, if I remember correctly, BattleNonSense's video did explain that the features don't really help unless you're GPU limited, so that makes sense to me. If you're CPU limited, then being able to pre-render is probably worth it, going off the CaptaPraelium writeup at least (as I'm understanding it, and assuming it's correct).

    @aufkrawall2 Question: for your test of V-Sync in BF4, were you able to always maintain your target V-Sync cap, and were you tracking your fps with an overlay like RTSS? For example, if you were doing double-buffered V-Sync at 60 fps, were you able to maintain that target, or were you using something like a variable triple-buffered V-Sync? I see you mention that you think ULL might only help when you fall below the cap -- that's sort of what I'm wondering myself.

    Also, were you capping your fps with RTSS, with an in-engine limiter, or just using V-Sync without any fps cap? (BlurBusters' low-lag guide is typically an fps cap + V-Sync, iirc.)

    If you were frequently dropping fps beneath your target/refresh rate, then I'd expect you might be GPU limited, meaning the low-lag modes should help in those cases. @Astyanax Sorry to spam you, but any thoughts on aufkrawall2's comment there? Thanks for your time.
     
  12. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,507
    Likes Received:
    1,877
    GPU:
    7800 XT Hellhound
    I went ahead and also tried both vsync scenarios: ULL Ultra reduces latency both when being capped by vsync and when being GPU bound below the refresh rate with vsync (proper triple buffering in the case of BF4: no weird jumps in fps or frame times, and no tearing). I think this supports the idea that ULL only affects the CPU prerender queue, and that the vsync backbuffer is entirely independent of it.
    I'm going to post my results and conclusions in a separate thread, as questions around it pop up often.
     
  13. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    Alright, thanks for the heads-up, sounds good
     
  14. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    Isn't it best to use a limiter like RTSS together with adaptive sync or G-Sync, rather than also enabling V-Sync?
     
  15. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    All in all, the overall best is G-Sync ON, V-Sync ON, LLM Ultra (unless it causes issues with certain games, in which case simply On) + either the V3 slider or the RTSS limiter at a small negative offset (like 118 for 120Hz). That should cover all scenarios with the best, or least bad, option.
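    The offset itself is trivial to compute; something like this, where the offset size is just a rule of thumb and not official guidance:

```python
def suggested_cap(refresh_hz: float, offset_fps: float = 2.0) -> float:
    # Cap a little under refresh so the limiter, not V-Sync, paces
    # frames. The default offset here is a rule of thumb.
    return refresh_hz - offset_fps

print(suggested_cap(120))  # 118.0, the example above
print(suggested_cap(144))  # 142.0; a display whose fps sways may want a bigger offset
```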
     

  16. PQED

    PQED Active Member

    Messages:
    92
    Likes Received:
    32
    GPU:
    MSI 1070 Gaming X
    I'll back this statement with a small addition of my own (somewhat unsure if it's correct or not, but I'd like to find out): some Freesync displays may sway a bit more in framerate, so you may want to use a bigger offset in your limiter so as not to touch your V-Sync refresh-rate cap.

    For example, I use 134 fps as the setting in NVCPL (also tried RTSS) with my 144Hz AOC 24G2U2 (usually just referred to as 24G2U in reviews, as "U2" is the model with the USB hub on it). This is not an officially recognized G-Sync Compatible display, but I haven't experienced any issues with it -- and have been unable to find anyone who has, either.
    As for ULL, I decided to use simply "On" as the global setting, only using Ultra with titles that won't stutter with it on.

    Most recently, Borderlands 3 was horrible to play with ULL on Ultra; this was running on a Corsair MP510 NVMe drive, so I don't believe it was I/O-related.
    Possibly a CPU limitation, as I'm not exactly on the cusp of technological advancement there with a moderately OC'd 4770K. ULL set to just "On" worked fine though, with only minor, fairly infrequent, and less intense stutters.

    You wouldn't think such a low cap would be necessary, what with limiters being as good as they are, and maybe it isn't.
    Maybe it's entirely placebo, which is why I'd encourage more people to test this for themselves and post their results. I'd like to see how everyone perceives it, and whether there is any difference at all.

    I will say though that my monitor's OSD frame counter would reach 144Hz unless I set the limit to 134 fps. How accurate that counter is I can't say, so I have only my perceptual impressions to go on beyond that.

    The NVIDIA Pendulum demo was among the tools I used, because of its claim to fame of being consistent.
    Framerate/frametime measurement was conducted with RTSS, as I have no better tools at hand. Perhaps someone else here does and can give us a better look at it.

    From what I understand, Freesync is mostly software-based (this is what I heard for AMD; is it also accurate for NVIDIA?), as opposed to a G-Sync module that would control this "sway" at the hardware level.


    This is my take on the topic after finally dragging myself into the "high" synced-refresh-rate game. I was previously using a 60Hz non-sync Dell U2414H.

    If you think/know that I got anything wrong, please go ahead and correct me (preferably with data, if you have it) as I don't want to spread misinformation unnecessarily.


    Edit: Added some pertinent info about my current monitor.
     
    Last edited: Jun 13, 2020
