Why does "Ultra Low Latency" mode fix stuttering in some games? (Ryse Son of Rome for example)

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by BlindBison, May 30, 2020.

  1. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,433
    Likes Received:
    2,789
    GPU:
    MSI 6800 "Vanilla"
    Might be useful depending on how it's paired with synchronization, though without it, or with traditional VSync, a lower value might be faster and more responsive. For D3D11, features like flip model presentation and flip discard are rarely utilized, so experimenting with more back buffers and pre-rendered frames might be less effective, outside of, I suppose, SpecialK for setting the sync model to flip. :)

    Don't know if that's been compared much though. Frame rate limiters and overall input latency and responsiveness have been, but flip model is mostly a D3D12 or Vulkan thing, even if it's supported under D3D11 on Windows 8 and up (discard as of Win10, I believe). If it's not used by default, then ULL or similar probably provides better results. (Radeon Anti-Lag, I suppose, on the AMD side of driver settings, when it's not restricted.)


    Might also depend on how many back buffers and pre-rendered frames some of these games use by default and how they handle triple buffering (an additional step in this whole chain). It's probably not always easy to just look up without digging into how the game uses D3D11 and any possible issues with functionality such as the in-game framerate limiter. (Thread sleeping and timing issues that many titles have problems with, resulting at worst in stuttering or hitching, plus an uneven framerate cap.)

    https://cookieplmonster.github.io/2020/05/16/silentpatch-wonderful-101/

    The broken frame pacing section for what that's about. :)
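
    As a rough illustration of why the pre-rendered frames setting matters for latency (a toy model of my own, not how the driver actually works): when GPU-bound, input sampled for a new frame has to wait behind every frame already sitting in the queue.

    ```python
    def input_latency_ms(frame_time_ms, prerender_limit):
        """Toy model: when GPU-bound, the CPU fills the pre-render queue,
        so input sampled now is only displayed after every queued frame
        (plus the current one) has finished rendering."""
        return (prerender_limit + 1) * frame_time_ms

    # At 60 fps (16.7 ms per frame), GPU-bound:
    print(input_latency_ms(16.7, 3))  # deeper queue: several frames of lag
    print(input_latency_ms(16.7, 1))  # reduced queue: roughly half that
    ```

    The numbers are crude, but they show why a deeper queue can smooth pacing (more slack) at the cost of extra frames of input lag.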


    Have some reading to do about this myself though, only started looking into it more recently as I started experimenting more around this and different modes of syncing plus game/software specific issues and various drawbacks and advantages.


    EDIT: Well that's a lot of text for effectively a very good driver feature and oh what a surprise that there's some problematic game to game or game engine issues ha ha.
    (Well they're not all catastrophic but some can be pretty bad with hitching/stuttering and that sort of thing.)
     
    BlindBison likes this.
  2. BlindBison

    BlindBison Master Guru

    Messages:
    604
    Likes Received:
    108
    GPU:
    RTX 2080 Super
    @aufkrawall2 So, in one of the old AMD control panels where you could modify the flip queue size they had a note saying that reducing the flip queue could result in stuttering. So, that top screenshot here: https://docs.google.com/document/d/1ydvAKHF4NvybJIWmZEocoi0-gMYg4NhvYhWNLyohNAQ/edit?usp=sharing

    That's primarily what I was basing my assumption on, in addition to the CaptaPraelium write-up on the BFV subreddit about Future Frame Rendering -- at this point, I don't know what to believe lol. What I'll probably do is just leave the low-lag mode OFF globally, but then if I get stuttering/microstutter or bad perf in a game, I'll try the various low-lag modes and various fps limiting methods to see if it helps (in some cases it clearly does -- such as Crysis 3/Ryse in my case -- I'm unsure if this is hardware related/Ryzen related or what at this point).
     
  3. BlindBison

    BlindBison Master Guru

    Messages:
    604
    Likes Received:
    108
    GPU:
    RTX 2080 Super
    @JonasBeckman Thanks for that link, I'll give it a read after work.
     
  4. aufkrawall2

    aufkrawall2 Master Guru

    Messages:
    772
    Likes Received:
    106
    GPU:
    6800 reference UV
    I have never seen a case myself where there was stutter with prerender limit 1 while there was none with prerender limit 3/app controlled. Only the opposite.
     

  5. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,173
    Likes Received:
    1,921
    GPU:
    Zotac GTX980Ti OC
    ^
    Yeah most of the time.



    I think OT should leave it at default or ON and then test with the Windows timer rate; it's the only thing the game has issues with.


    The Win8 to Win10 transition introduced that glitch; setting the Windows timer to 0.500 ms fixed it again: high GPU usage all the time and great, stable fps.
    I tested this when I was still on a GTX 780, so it's been happening for a long time now.
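
    To see why the timer resolution matters for frame pacing, here's a toy sketch (my own illustration, not any specific game's limiter): a sleep-based limiter can only wake on timer interrupts, so the actual wait rounds up to a whole number of ticks.

    ```python
    import math

    def capped_frame_time_ms(target_ms, timer_resolution_ms):
        """Toy model of a sleep-based frame limiter: Sleep() only wakes
        on timer ticks, so the wait rounds up to the next whole tick."""
        ticks = math.ceil(target_ms / timer_resolution_ms)
        return ticks * timer_resolution_ms

    # Targeting 60 fps (16.67 ms per frame):
    print(capped_frame_time_ms(16.67, 15.6))  # default ~15.6 ms timer: overshoots to ~2 ticks
    print(capped_frame_time_ms(16.67, 0.5))   # 0.5 ms timer: lands close to the target
    ```

    With the coarse default resolution, a 60 fps cap can quantize to roughly double the frame time (so ~32 fps with hitching), which is consistent with the 0.5 ms timer smoothing things out.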
     
    BlindBison likes this.
  6. janos666

    janos666 Master Guru

    Messages:
    995
    Likes Received:
    159
    GPU:
    MSI RTX3080 10Gb
    I guess the main assumption is that "increased buffering can help increase the frame rate and/or smooth out the frame times". However, I guess that's only true (talking about the smoothing effect here) as long as the buffers stay filled up (bloated). That holds with traditional V-Sync while the fps is limited to the refresh rate by back-pressure. Once the fps starts fluctuating, though, the buffer length can fluctuate with it. But since in-game animations and real-world mouse movements are independent of those fluctuations (the in-game movement of an object has its own speed and acceleration vectors, as does the mouse in your hand), a fluctuating buffer length can result in fluctuating latency (from mouse to light on screen, or even from the in-game animation's sampled state to light on screen).
    I am still not entirely sure how these kinds of phenomena manifest (if at all) on frame-time graphs. I think one can have a "flat" frame-time graph and still experience "micro stutter" (perceived uneven movement of objects that ought to move smoothly in the game world) due to these kinds of things.
    So, I am not at all surprised if reduced buffering (when done successfully, even at some cost like a somewhat reduced average uncapped fps potential) can help in certain situations.
    And the general warning could also imply that overriding something like this could potentially break the game's behavior. It might cause unexpected results, not because you decreased the value and decreased values are inherently worse, but simply because you altered the default, the game was never fully tested with non-default values, and virtually all software has some bugs (forever undiscovered because the edge cases are never hit in practice).
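
    A toy illustration of that "flat frame-time graph, wobbling latency" point (my own sketch, not any real driver's behavior): with a perfectly constant frame time the frame-time graph is flat, yet if the number of queued frames fluctuates, input-to-display latency fluctuates with it.

    ```python
    def per_frame_latency(frame_time_ms, queue_depths):
        """Toy model: each frame's input-to-display latency is roughly the
        number of frames queued ahead of it times the (constant) frame
        time, so a flat frame-time graph can hide a wobbling latency."""
        return [depth * frame_time_ms for depth in queue_depths]

    # Constant 16 ms frames, but buffer occupancy drifts between 1 and 3:
    print(per_frame_latency(16, [1, 3, 2, 3, 1]))
    # -> [16, 48, 32, 48, 16]  (latency swings by two full frames)
    ```

    That swing would show up as uneven perceived motion even though a frame-time overlay reports a steady 16 ms.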
     
    BlindBison likes this.
  7. BlindBison

    BlindBison Master Guru

    Messages:
    604
    Likes Received:
    108
    GPU:
    RTX 2080 Super
    @-Tj- Sorry, but — what do you mean by OT? Thanks.

    That’s really interesting about the windows timer. I wonder if the problem is that those games I’m testing were made with the old windows timer in mind perhaps? Correct me if I’m misunderstanding you though.

    @janos666 Very interesting, huh — thanks for your thoughts there.
     
  8. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,173
    Likes Received:
    1,921
    GPU:
    Zotac GTX980Ti OC
    On topic? - thread starter :)


    From what I remember, older CryEngine games, e.g. CryEngine 3, had some timer issues and didn't use the highest possible resolution, or something along those lines.. this timer at max fixed it again.

    Some said it even cured the 30Hz bug. I never had that glitch though, just the fps issue with low GPU utilization.
     
    Last edited: Jun 19, 2020
    BlindBison likes this.
  9. windrunnerxj

    windrunnerxj Master Guru

    Messages:
    358
    Likes Received:
    54
    GPU:
    MSI Gaming 1060 6G
    So... in theory, what would be the best LLM setting for CPU limited games without using GSYNC/VSYNC?
    Take an extreme scenario with CSGO (DX9) for example, use something like 60hz display, FX-4300 and RTX2080 to run it in 720P with default game fps cap of 400. You'll have wild frametime and FPS variance - 60~400FPS depending on what's happening on the screen and the map with maybe an average of 100fps. Any point in changing LLM from Off to something else in this case?
     
  10. AsiJu

    AsiJu Ancient Guru

    Messages:
    6,965
    Likes Received:
    1,981
    GPU:
    MSI RTX 2070 Armor
    Question: does limiting framerate always add a bit of input lag?

    If so, which results in less input lag, fps cap or hitting the vsync ceiling with a 144 Hz VRR display?
     

  11. janos666

    janos666 Master Guru

    Messages:
    995
    Likes Received:
    159
    GPU:
    MSI RTX3080 10Gb
    Good question.
    I gather that the overall conclusion of these tests is usually that CPU-side limiters "add 0-1 frames of lag", so we should calculate with 1 in the worst case and maybe 0.5 on average (I don't know, but maybe this applies to some "well oiled system" with neither the CPU nor the GPU hitting ~100% utilization and everything playing together nicely).
    V-Sync lag should be pretty high (regardless of whether it's paired with G-Sync or not). And yes, you have LLM On and Ultra now, which should theoretically reduce this (V-Sync) lag as well. But then again, Ultra applies an automatic fps cap (via the V3 limiter), so nVidia seems to be sure that's the better solution (and they should know how much lag to expect with either V-Sync + a reduced pre-render queue length or their fps limiter). Don't forget nV has access to very robust and precise measuring techniques and equipment (if for nothing else, due to their G-Sync certification labs) which can, among other things, measure end-to-end lag (probably >= 1000 Hz sampling rate, and plenty of time to carry these tests out, because the people in the lab get paid to do it and probably have the skills to automate it well). Independent (fairly good approximate) real-world measurements seem to confirm these assumed findings and theories.
    So, all in all, G-Sync + V-Sync + Ultra + the V3 slider at a -2 offset seems to be a great choice of global setting (with some application profiles modified if necessary).

    What I never fully understood is what really happens when the GPU (or the CPU, for that matter) runs at constant ~100% utilization and the fps is limited by raw processing resources.
    Does LLM help in this case? Do CPU-side limiters add extra latency when they aren't effectively limiting the fps (on top of the already increased lag)?
    Why the increased lag in this case with LLM Ultra? Which buffers/queues get filled up/bloated? Why can't we see, or better yet, control those queues with current software?
    Are those transparent hardware-side scheduler buffers/queues/caches (or at least low-level driver-side ones, depending on the GPU arch / driver architecture)? Why can't those be starved efficiently?
     
    Last edited: Jun 19, 2020
    AsiJu and BlindBison like this.
  12. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,173
    Likes Received:
    1,921
    GPU:
    Zotac GTX980Ti OC
    I think all that is because of the DX9-11 API bottleneck; they still use one core for the main render thread, while DX12 and Vulkan are highly multithreaded, so no more stalls.
     
    BlindBison likes this.
  13. BlindBison

    BlindBison Master Guru

    Messages:
    604
    Likes Received:
    108
    GPU:
    RTX 2080 Super
    @windrunnerxj So, I am basing this off of the Google doc I linked above and also basing this off of CaptaPraelium's Battlefield V Future Frame Rendering explanation to answer your question (https://www.reddit.com/r/BattlefieldV/comments/9vte98/future_frame_rendering_an_explanation/).

    Ideally, if anything there is incorrect or if there's more to say, then folks will correct me/add to what I'm saying.

    My position at this point is to leave pre-render settings globally at the defaults (so, low-lag modes OFF). As I understand it, pre-rendering (when done correctly, at least in games where it works) should actually help the CPU/help you most when you're CPU limited, whereas if you're almost always GPU limited, that's when the low-lag modes can help most. Again, someone correct me if anything I said there is inaccurate; that's just my conclusion assuming CaptaPraelium's write-up is correct.

    Now, some games at least in my tests and with my rig DO seem to have a microstutter issue and even performance issues with default prerender settings -- as is what this whole thread is about.

    So, basically what I plan to do myself is globally leave prerender settings at defaults then if I get strangely poor perf and/or bad framepacing/microstutter like I do in Crysis 3/Ryse then I'll go into that games individual profile and test out the various low-lag modes to see if it makes a difference. If it helps (such as in the case of Crysis 3 for me) then I'll change the setting for that particular game.

    Another thing to try could be various FPS limiters -- some games stutter like mad with uncapped fps but seem to smooth up a ton when capped (while others are perfectly smooth to my eye uncapped, like DOOM 2016/Eternal). Arkham Knight, for example, on my rig: capping to a consistently achievable fps target with RTSS seems to smooth that game out a lot for me, while the in-engine caps don't work at all in my experience (still lots of stutter). Sadly it seems to be a case-by-case thing, and it could even be hardware dependent -- for example, I don't recall getting this microstuttering issue with default prerender settings in Crysis 3 back when I had an Intel CPU (now I have a 12 core/24 thread Ryzen CPU), but perhaps I'm just misremembering.
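
    For what it's worth, the usual explanation for why external limiters like RTSS pace better than sleep-only in-engine caps comes down to how the wait is split. A minimal sketch of that idea (my own illustration, not RTSS's actual implementation): sleep only for whole timer ticks that safely fit under the target, then busy-wait the remainder for precision.

    ```python
    import math

    def split_wait(remaining_ms, timer_resolution_ms):
        """Hybrid-limiter sketch: sleep in whole timer ticks, leaving at
        least one tick of safety margin, then busy-wait (spin) the rest
        so the frame is released at a precise time, not on a coarse tick."""
        safe_ticks = max(0, math.floor(remaining_ms / timer_resolution_ms) - 1)
        sleep_ms = safe_ticks * timer_resolution_ms
        spin_ms = remaining_ms - sleep_ms
        return sleep_ms, spin_ms

    # 16.7 ms to wait with the default ~15.6 ms Windows timer:
    print(split_wait(16.7, 15.6))  # can't sleep safely at all: spins the whole wait
    # With a 0.5 ms timer, almost all of the wait can be slept:
    print(split_wait(16.7, 0.5))   # sleeps 16.0 ms, spins under 1 ms
    ```

    The spin portion is what makes the cap precise, but it burns CPU, which is another reason a finer timer resolution helps: it shrinks how long the limiter has to spin.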

    If default prerender settings don't give you microstutter or perf issues, I'd just leave them as they are.
     
  14. aufkrawall2

    aufkrawall2 Master Guru

    Messages:
    772
    Likes Received:
    106
    GPU:
    6800 reference UV
    I don't think that level of cautiousness is required, unless somebody comes up with an example of ULL on/ultra actually hurting frame time consistency.

    The FFR setting in BF5, btw, is crap vs. ULL on/ultra in the driver. ULL on/ultra in the driver still keeps the GPU fed at ~99% and flattens frame time consistency, while FFR off in BF5 butchers GPU-bound performance (reported GPU utilization drops significantly) and ruins frame time consistency. Those should not be mixed up: BF5's FFR off is a CPU prerender of 0, while ULL ultra is ~between 0 and 1.
     
  15. theahae

    theahae Member

    Messages:
    49
    Likes Received:
    5
    GPU:
    GTX 1060
    Did you apply the "workaround" for the BF5 Nvidia profile? The game resets the profile to default.
     

  16. aufkrawall2

    aufkrawall2 Master Guru

    Messages:
    772
    Likes Received:
    106
    GPU:
    6800 reference UV
    Yes. Btw, it also works when you set it globally.
     
  17. BlindBison

    BlindBison Master Guru

    Messages:
    604
    Likes Received:
    108
    GPU:
    RTX 2080 Super
  18. BlindBison

    BlindBison Master Guru

    Messages:
    604
    Likes Received:
    108
    GPU:
    RTX 2080 Super
    @-Tj- How can I change the Windows timer you're referring to above? I'd like to do some testing with what you're describing to see if changing the timer resolves the horrible performance and framepacing I see in Crysis 3/Ryse with the default prerender settings. Thanks!
     
  19. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,173
    Likes Received:
    1,921
    GPU:
    Zotac GTX980Ti OC

    This app, keep it running in the background

    V.1.2 works too
     
  20. jl94x4

    jl94x4 New Member

    Messages:
    1
    Likes Received:
    1
    GPU:
    RTX 2080 TI
    Not sure if the "this" should be clickable, but it's not, so I'm not sure which software you're referring to :)
     
    BlindBison likes this.
