
New "Ultra Low-Latency" Mode from Nvidia in New Drivers -- Pros and Cons?

Discussion in 'Videocards - NVIDIA GeForce' started by EerieEgg, Aug 22, 2019.

  1. EerieEgg

    EerieEgg Member Guru

    Messages:
    151
    Likes Received:
    11
    GPU:
    GTX 1080 Ti
    https://www.howtogeek.com/437761/how-to-enable-ultra-low-latency-mode-for-nvidia-graphics/

    It seems Nvidia is making its own version of AMD's Anti-Lag mode. There are some important questions here though -- I expect enabling this will have notable cons, or it would've been set up this way by default long ago, eh?

    One quote from the article is as follows:

    "With the default settings of “Off,” the game’s engine will queue one to three frames at a time. The “On” setting will force the game to only queue a single frame, which is the same as setting Max_Prerendered_Frames to 1 in older NVIDIA drivers. The Ultra setting submits the frame “just in time” for the GPU to pick it up—there will be no frame sitting in the queue and waiting."
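    The quoted queue behavior can be sketched as a back-of-the-envelope model. This is purely illustrative (the real driver is far more complex); the mapping of settings to queue depths just follows the article's description, and the rule of thumb is that each queued frame adds roughly one frame-time of input lag:

```python
# Rough model of the three driver settings, per the article's description:
# "Off" queues up to 3 frames, "On" queues 1, "Ultra" aims for 0.
QUEUE_DEPTH = {"off": 3, "on": 1, "ultra": 0}

def worst_case_added_lag_ms(setting: str, fps: float) -> float:
    """Each queued frame adds about one frame-time (1000/fps ms) of input lag."""
    return QUEUE_DEPTH[setting] * 1000.0 / fps

for setting in ("off", "on", "ultra"):
    print(f"{setting:>5} at 60 fps: ~{worst_case_added_lag_ms(setting, 60.0):.1f} ms extra lag")
```

    So at 60 fps the gap between "Off" and "Ultra" can be tens of milliseconds in the worst case, which is why the setting is interesting at all.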

    The introduction of this setting raises a lot of questions, though. Will users get lower overall fps with this forced ON? If so, by how much? Will it affect frame pacing? Will it cause hitching/stuttering? I've had a pretty difficult time tracking down benchmarks done with the Flip Queue / Max Prerendered Frames forced to 1-4, but going off the article I expect this would be similar to moving from a higher flip queue value to a lower one.

    Everybody online seems to hotly debate what to set "Max Prerendered Frames" to, so truthfully I still don't have a clue what the best value is, or whether this new setting is best left ON or OFF. I'll have to give it a try and perhaps run some of my own benchmarks, or just judge by feel whether games stutter more.

    Anyway, thanks for your time. If anyone knows more about the cons of this sort of thing, it would be great if you could share them below. I'd also like to better understand how prerendering really works and why it's set to 3 by default, considering the input lag that appears to cause.

    EDIT:

    Hardware Unboxed did a video on AMD's Anti-Lag so perhaps forcing this new Nvidia Setting to Ultra will impact games similarly:

    In those tests, Anti-Lag was found to have a negative impact on overall fps, ranging from relatively small to upwards of 10-15% in some titles, iirc.

    EDIT #2:

    Seems the new driver has removed the option to manually set the flip queue size (max prerendered frames). I dislike this -- now, for example, you cannot set the queue size to "2" from the control panel. There's only Default (probably 3), On (which seems to be 1), and the new Ultra setting.

    EDIT #3:

    https://www.reddit.com/r/nvidia/comments/ct3b28/low_latency_mode_can_cause_major_stuttering_dont/

    Going off this Reddit thread, it seems that setting this to Ultra, or even On, can induce discernible hitching/stuttering in some titles.
     
    Last edited: Aug 23, 2019
    Passo's likes this.
  2. Astyanax

    Astyanax Ancient Guru

    Messages:
    3,037
    Likes Received:
    783
    GPU:
    GTX 1080ti
    All I can say is, it's not a setting that should be exposed in the global settings. Or, games and applications with known issues could be given profile overrides to either prevent Ultra, or apply a "Treat Ultra as Low" setting.
     
  3. Chastity

    Chastity Ancient Guru

    Messages:
    1,900
    Likes Received:
    461
    GPU:
    Nitro 390/GTX1070M
    It's really based on each game, so just experiment for yourself instead of waiting for another person's dubious opinion. :)
     
    A M D BugBear, HARDRESET and max2 like this.
  4. EerieEgg

    EerieEgg Member Guru

    Messages:
    151
    Likes Received:
    11
    GPU:
    GTX 1080 Ti
    Ideally, it wouldn't be a subjective opinion, but rather some form of impartial testing of the flip queue, or input from a developer/engineer who understands exactly how it works and the pros and cons of different flip queue values.

    I would say that for this sort of thing a "general consensus" on what's best overall usually emerges after the pros and cons have been carefully weighed and tested, and it seems likely that somebody out there understands this much better than I do. Unfortunately, it also seems like few people really understand what the pros and cons of prerendering/MPRF actually are (other than input lag).

    Once I get the new driver, I'll run some tests in a few titles that have built-in benchmarks and try to keep track of any differences I encounter. Still, it's good to have information from other, potentially more knowledgeable people about their experience or the technology itself, ofc.
     

  5. EerieEgg

    EerieEgg Member Guru

    Messages:
    151
    Likes Received:
    11
    GPU:
    GTX 1080 Ti
    @Astyanax Yes, that was my concern as well -- namely, are some games (or most games) going to have a problem with this setting turned ON, where their performance tanks or hitching/stutter becomes frequent/unbearable? No way to know without testing it, I suppose, but going off the article it should be similar to adjusting the flip queue. And changing the flip queue is also something where the pros and cons don't seem to be well known (other than input lag).
     
  6. spajdrik

    spajdrik Ancient Guru

    Messages:
    1,959
    Likes Received:
    333
    GPU:
    GB RX 5700 Gaming
    btw, the Hardware Unboxed test was done before the Adrenalin 19.8.1 release, which resolved: "Radeon Anti-Lag may slightly impact performance in select games"
     
    EerieEgg likes this.
  7. EerieEgg

    EerieEgg Member Guru

    Messages:
    151
    Likes Received:
    11
    GPU:
    GTX 1080 Ti
    Just installed the new driver -- setting it to Ultra low-latency mode seems to cause more stuttering in Crysis 3 (w/ 8700K + 1080 Ti at stock clocks). Haven't tested other titles yet though. One thing I dislike is that there's no longer any fine-grained control of your flip queue size. It's either Default (probably 3), On (1), or Ultra. What if a user wants to use "2"?

    Another oddity is that they left the flip queue size setting in for virtual reality -- why not make Ultra available there too? Or why not also show the flip queue sizes alongside the new Ultra setting?
     
  8. Mustang104

    Mustang104 Active Member

    Messages:
    61
    Likes Received:
    8
    GPU:
    nVidia 1080 GTX FE
    You could still use Nvidia Profile Inspector to control the MPRF setting, I think?
     
  9. Mda400

    Mda400 Master Guru

    Messages:
    740
    Likes Received:
    34
    GPU:
    1660 - 2130/10ghz
    Crysis 3 may not like, or be aware of, the just-in-time behaviour of the driver's low latency mode "Ultra" setting. Games have always assumed at least one frame is being prepared by the CPU before the GPU starts painting it. This is still a 'beta' setting, probably for this very reason, and it may or may not mature in the future.

    Usually, if a low pre-rendered setting causes stuttering, your CPU isn't able to keep up with the demand of the GPU in that specific application. If it stutters at a high pre-rendered setting, you are likely bottlenecking the GPU, bloating the chain by queuing more frames from the CPU than needed. It's a kind of buffer, which is why it can be application-dependent.
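    That buffering argument can be illustrated with a toy simulation (the numbers and the backpressure rule here are made up for illustration, not how the driver actually schedules work): a CPU with jittery frame-prep times feeds a bounded queue, and a GPU drains it at a fixed rate. A shallow queue exposes CPU spikes as GPU waits; a deeper queue absorbs them.

```python
import random

def count_gpu_waits(queue_limit, frames=5000, cpu_ms=9.0, jitter=6.0, gpu_ms=10.0):
    """Toy render pipeline. The CPU preps each frame (cost varies with jitter)
    and may run at most `queue_limit` frames ahead of the GPU; the GPU takes
    gpu_ms per frame. Returns how often the GPU sat idle waiting for a frame,
    which on screen shows up as a hitch."""
    random.seed(1)
    cpu_finish = [0.0] * (frames + 1)
    gpu_finish = [0.0] * (frames + 1)
    waits = 0
    for i in range(1, frames + 1):
        cost = random.uniform(cpu_ms - jitter, cpu_ms + jitter)
        # Backpressure: with a queue of N, frame i can't start prep until the
        # GPU has consumed frame i - N - 1 (at most N frames queued ahead).
        gate = gpu_finish[i - queue_limit - 1] if i - queue_limit - 1 >= 0 else 0.0
        cpu_finish[i] = max(cpu_finish[i - 1], gate) + cost
        if cpu_finish[i] > gpu_finish[i - 1]:
            waits += 1  # queue was empty when the GPU wanted the next frame
        gpu_finish[i] = max(gpu_finish[i - 1], cpu_finish[i]) + gpu_ms
    return waits

# A shallow queue exposes CPU spikes; a deeper queue absorbs them.
print(count_gpu_waits(queue_limit=1), count_gpu_waits(queue_limit=3))
```

    With these made-up numbers the average CPU cost (9 ms) is below the GPU's 10 ms per frame, yet the jitter alone is enough to starve the GPU far more often at queue depth 1 than at depth 3 -- the same trade-off being described above.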

    They may not have designed the functionality for virtual reality yet. As for why they don't show the size for the new "Ultra" setting: Ultra isn't a queue size but a timing algorithm between the CPU and GPU in the driver, which tries to keep frames rendering just as they are sent to the GPU.
     
    Last edited: Aug 23, 2019
    EerieEgg likes this.
  10. EerieEgg

    EerieEgg Member Guru

    Messages:
    151
    Likes Received:
    11
    GPU:
    GTX 1080 Ti
    @Mustang104 You may be right about Nvidia Inspector -- though I've run into issues with Inspector in the past. For example, back in the day on my 770, I could never get Half-Refresh V-Sync to work consistently unless I literally reinstalled the driver each time I wanted to turn it on in a game. I'll have to check whether the new versions still let you change the value now that the new low-latency settings have replaced the flip queue. I'd still prefer it simply be left in the control panel though.
     

  11. Ital

    Ital Member Guru

    Messages:
    163
    Likes Received:
    48
    GPU:
    ZOTAC GTX 1080 AMP
    There was a bug with the half refresh rate adaptive v-sync feature which was fixed in 431.70. The fix should also be in 436.02 unless Nvidia screwed up.
     
    EerieEgg likes this.
  12. EerieEgg

    EerieEgg Member Guru

    Messages:
    151
    Likes Received:
    11
    GPU:
    GTX 1080 Ti
    @Ital Hurray! Thanks for the heads-up -- that bug drove me nuts!
     
  13. Ital

    Ital Member Guru

    Messages:
    163
    Likes Received:
    48
    GPU:
    ZOTAC GTX 1080 AMP
    Don't celebrate just yet.

    Try it out on your system first and please let us know whether it's truly fixed, or whether Nvidia screwed up yet again. :)
     
  14. kurtextrem

    kurtextrem Member Guru

    Messages:
    189
    Likes Received:
    10
    GPU:
    NVIDIA GeForce GTX 970
    ManuelG said Ultra works best with high FPS and GPU-bound games. So... what is high FPS? The article says 60-100 fps. What about 200 fps?
     
  15. Cyberdyne

    Cyberdyne Ancient Guru

    Messages:
    3,113
    Likes Received:
    93
    GPU:
    RTX 2070 XC Ultra
    I think 200 is more than 100.
     

  16. RealNC

    RealNC Ancient Guru

    Messages:
    2,985
    Likes Received:
    1,247
    GPU:
    EVGA GTX 980 Ti FTW
    At 200FPS the CPU is probably the bottleneck. But again, it depends on the game. Note that most people consider "higher than 60" to be "high FPS". Peasants :p

    Also, at 200FPS, getting rid of 1 frame of latency gives a 5ms advantage. Most people consider this insignificant. At 60FPS on the other hand, removing 1 frame gives a 16.7ms advantage, which is significant. At 100FPS, you get 10ms less lag. This is why you see statements like "works best between 60 and 100."
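    The arithmetic above is just frame-time math -- nothing driver-specific, simply 1000/fps per queued frame removed -- which can be sanity-checked in a couple of lines:

```python
def lag_saved_ms(fps: float, frames_removed: int = 1) -> float:
    """Removing a queued frame saves one frame-time: 1000/fps milliseconds."""
    return frames_removed * 1000.0 / fps

for fps in (60, 100, 200):
    print(f"{fps:>3} fps: ~{lag_saved_ms(fps):.1f} ms saved per queued frame removed")
```

    Hence ~16.7 ms at 60 fps, 10 ms at 100 fps, and only 5 ms at 200 fps.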
     
    mohiuddin and EerieEgg like this.
  17. Caesar

    Caesar Master Guru

    Messages:
    671
    Likes Received:
    233
    GPU:
    GTX 1070Ti Titanium
    Games get minimized every 5-8 seconds when this is enabled at 144Hz (no FreeSync/G-Sync).
     
  18. mohiuddin

    mohiuddin Master Guru

    Messages:
    756
    Likes Received:
    50
    GPU:
    GTX670 4gb ll RX480 8gb
    And here I am, thinking 30 fps is playable and 45+ fps more than enough. :)
     
    nikobellic and mbk1969 like this.
  19. BlackNova92

    BlackNova92 Member

    Messages:
    20
    Likes Received:
    5
    GPU:
    16gb
    I'm curious: in games like Overwatch, do I need the in-game option (Reduce Buffering) if I already have the option to control pre-rendered frames in the driver?
    I'm using both right now, but I've always wondered whether it's pointless to use both. Reduce Buffering is also bugged as hell.
     
  20. EerieEgg

    EerieEgg Member Guru

    Messages:
    151
    Likes Received:
    11
    GPU:
    GTX 1080 Ti
    @BlackNova92 I could be wrong on this, but my understanding is that "Reduce Buffering" simply reduces the size of the flip queue from its default. Now, I don't know whether "Reduce Buffering" takes the game from its default of "3" down to "2" or down to "1", and I don't even know what the default value is in Overwatch (it may be 2 or 3 -- usually games use 3 by default, iirc). But my expectation is that the "Ultra" setting in the control panel would override the in-game setting, provided OW uses one of the APIs the setting supports. I don't expect there's a problem with enabling both the in-game setting and the control panel one, just in case.

    If you start noticing more frequent discernible hitching/stutter, or if the game starts looking "choppy", I'd experiment with the default flip queue values, since some users (and AMD themselves) have stated that reducing the flip queue size can cause behavior like that.
     
