So I'm always looking for the lowest input delay possible (while keeping a stable frametime), and so far in the game I play the most (Fortnite) I have been using the following settings:

- All graphics quality options set to lowest, 1080p resolution (GPU load never exceeds 30-50% in my case).
- V-Sync disabled in game and in the NVIDIA driver options.
- Low Latency Mode in the NVIDIA driver set to Ultra.
- In-game fps limiter set to 120 (my monitor is 120 Hz).

I can get a LOT higher average fps with unlimited fps in the game options, or for example if I set the in-game limiter to 240 instead of 120. But in situations where there are many players in my vicinity, the frametime gets really bad and the fps dips a lot, making mouse input feel slightly sluggish and the picture stutter (the stuttering is probably because I use LightBoost/blur reduction, which works best with a 100% stable fps/frametime). With the 120 fps limit set, however, it never ever falls under 120 fps no matter what is going on in the match or how many players are near me, and the frametime overlay graph is more or less completely flat 99.999% of the time, looking and feeling great. There is slight tearing, but it's totally worth it IMO; I'm much less sensitive to tearing than to input delay, if that makes sense.

Anyway, today I saw Chris from Battle Nonsense's YouTube video on AMD Anti-Lag and the NVIDIA Low Latency option here:

From what I can understand of his testing, he claims that the Low Latency option ONLY helps when your GPU is near max load, and will in fact even make input delay worse if used in scenarios where GPU load is not maxed? Maybe I understood the video incorrectly, but I am 100% positive that in my testing there is a HUGE difference in how light/responsive the mouse feels between these 3 settings:

- Low Latency set to "Off" in the NVIDIA driver (i.e. 3 maximum pre-rendered frames): the mouse feels pretty heavy, but the picture/motion feels a bit smoother.
- Low Latency set to "On" in the NVIDIA driver (i.e. 1 maximum pre-rendered frame): the mouse feels a lot more responsive than Off/3 MPRF. It's very noticeable just by turning the mouse around fast; the mouse feels much "lighter" compared to Off/3 MPRF. The picture/motion is perhaps very slightly less smooth, but there are no frame drops, and it's a much better option for input delay than Off/3 MPRF in my scenario.
- Low Latency set to "Ultra" in the NVIDIA driver (i.e. "0" maximum pre-rendered frames; frames are sent just in time to the GPU): very similar feeling to On/1 MPRF, but after many hours and sessions of testing, it does feel like the mouse is even lighter/more responsive than with On/1 MPRF. It's quite hard to tell the difference in my scenario, but if it's not placebo and I'm not mistaken, the mouse does feel slightly lighter/more responsive with Ultra/0 MPRF than with On/1 MPRF.

So what is the consensus and truth here, really? I would like a clear answer so I can sleep better at night, lol. For me, the difference between Low Latency "Off" and "On" is night and day; without question, input delay is lower with it set to "On" vs "Off". I'm not as sure about "Ultra" vs "On", but in theory, just from how it is supposed to work (frames are sent just in time for the GPU to render), it seems obvious this should give lower input delay no matter what?

An interesting note about his video is that he only tested with epic/ultra/max quality settings from what I can see. Can this affect the results negatively, so to speak? He also does not compare Low Latency "On" vs "Off"/"Ultra", only "Off" vs "Ultra" from what I can see? It would be so much more useful to see all 3 options compared against each other, IMO.
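To put rough numbers on the "maximum pre-rendered frames" difference, here's a back-of-envelope sketch (my own simplification, not how the driver actually schedules frames): when the CPU is allowed to queue N frames ahead of the GPU, a sampled input can sit behind up to N queued frames, each costing one frametime. At a 120 fps cap that's about 8.33 ms per frame. The catch, and I think Chris's point, is that the queue only fills up when the GPU is the bottleneck, so at 30-50% GPU load these worst-case numbers rarely apply:

```python
# Back-of-envelope only: worst-case extra latency from the render queue,
# assuming every slot in the queue is full (i.e. a GPU-bound scenario).
# At low GPU load the queue rarely fills, so the real difference is smaller.

FRAME_TIME_MS = 1000 / 120  # ~8.33 ms per frame at a 120 fps cap

def worst_case_queue_latency_ms(max_prerendered_frames: int) -> float:
    """Extra input latency added by the CPU->GPU render queue alone,
    if all queued slots are occupied."""
    return max_prerendered_frames * FRAME_TIME_MS

for setting, mprf in [("Off (3 MPRF)", 3), ("On (1 MPRF)", 1), ("Ultra (~0 MPRF)", 0)]:
    print(f"{setting}: up to {worst_case_queue_latency_ms(mprf):.2f} ms of queue latency")
```

This also matches what I feel: Off vs On can differ by a couple of frametimes when the queue backs up, while Ultra vs On differs by at most one frametime and only when a frame is actually queued, which would explain why that difference is so hard to tell apart.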
In my case, where I set the graphics quality options to lowest, GPU load never exceeds 30-50%, and I limit fps with the in-game limiter to 120, should I use Low Latency "Ultra" or "On"? I definitely will not use "Off", as that 100% raises input delay and makes the mouse feel heavy compared to the other two options.

I would greatly appreciate any feedback and technical knowledge/input on what would be best in my case and in general, as well as comments on his video and testing. To me, the testing seems flawed/not enough scenarios tested (only super high quality settings tested instead of low settings, and no test of Low Latency "On" vs "Off"/"Ultra", only "Off" vs "Ultra"). Cheers and thanks in advance for any help on this!

PS, my specs: i7-9700k / RTX 2060 / 2x8GB DDR4 3200MHz CL15 RAM / SSD / Win 10 1903 with all latest updates / latest NVIDIA 440.97 driver / ASUS VG248QE 1 ms 144Hz monitor set to 120Hz with LightBoost (blur reduction) at 10% strobing brightness / high performance power plan enabled in Windows, with DVR/background apps/etc. disabled.

Edit: if it's okay, I would like to tag/mention a couple of users that I know from the past are knowledgeable on things like this: @RealNC @Mda400. Anyone is free to comment though! Thank you.