Battle(non)sense (YouTuber) claims low latency mode only helps when GPU load is 99%?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by alexander1986, Oct 22, 2019.

  1. alexander1986

    alexander1986 Master Guru

    Messages:
    201
    Likes Received:
    21
    GPU:
    RTX 2060
So I'm always looking for the lowest input delay possible (while keeping a stable frametime), and so far in the game I play the most (Fortnite) I have been using the following settings:



    - All graphics quality options set to lowest, 1080p resolution (GPU load never exceeds 30-50% in my case)

    - Vsync disabled in game and nvidia driver options.

    - Low latency mode in nvidia driver set to Ultra.

    - Ingame fps limiter set to 120 (my monitor is 120 Hz). I can get a LOT higher average fps with unlimited fps in the game options, or if I set the ingame fps limiter to 240 instead of 120, but in situations where there are many players in my vicinity the frametime gets really bad and the fps dips a lot, making mouse input feel slightly sluggish and the picture stutter (the stuttering is probably because I use lightboost/blur reduction, which works best with a 100% stable fps/frametime).


    However, with the 120 fps limit set, it never falls under 120 fps no matter what is going on in the match or how many players are near me, and the frametime overlay graph is more or less completely flat 99.999% of the time, looking and feeling great. There is slight tearing, but it's totally worth it IMO; I'm much less sensitive to tearing than to input delay, if that makes sense.
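    The frametime reasoning above is just arithmetic on the fps cap; a quick sketch (nothing driver-specific, just the per-frame time budget):

    ```python
    # Per-frame time budget at a given fps cap.
    def frame_time_ms(fps: float) -> float:
        return 1000.0 / fps

    print(f"120 fps cap: {frame_time_ms(120):.2f} ms/frame")  # ~8.33 ms
    print(f"240 fps:     {frame_time_ms(240):.2f} ms/frame")  # ~4.17 ms
    ```

    A "flat" frametime graph at a 120 fps cap means every frame lands close to that ~8.33 ms budget; dips below 120 fps show up as spikes above it.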




    Anyway, today I saw the video from Chris of Battle(non)sense on YouTube about AMD Anti-Lag and the Nvidia low latency option:


    From what I can understand of his testing, he claims that the low latency option ONLY helps when your GPU is near max load, and will in fact even make input delay worse in scenarios where GPU load is not maxed?


    Maybe I understood the video incorrectly, but I am 100% positive that in my testing there is a HUGE difference in how light/responsive the mouse feels between these 3 settings:



    - Low latency set to "off" in the nvidia driver (i.e. 3 maximum pre-rendered frames): mouse feels pretty heavy, but picture/motion feels a bit smoother.


    - Low latency set to "on" in the nvidia driver (i.e. 1 maximum pre-rendered frame): mouse feels a lot more responsive than off/3 MPRF; it's very noticeable just by turning the mouse around fast, the mouse feels much "lighter" compared to off/3 MPRF. Picture/motion is perhaps very slightly less smooth, but there are no framedrops, and it's a much better option for input delay than off/3 MPRF in my scenario.


    - Low latency set to Ultra in the nvidia driver (i.e. "0" max pre-rendered frames; frames are submitted just in time for the GPU): very similar feeling to low latency "on"/1 max pre-rendered frame, but after many hours and sessions of testing it does feel like the mouse is even lighter/more responsive than with "on"/1 MPRF. It's quite hard to tell the difference in my scenario, but if it's not placebo and I'm not mistaken, it does feel slightly better/lighter/more responsive with "Ultra"/0 MPRF instead of "On"/1 MPRF.
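    The intuition behind these three settings can be sketched with a toy queue model. Big caveat: this is an assumption-laden sketch, NOT how the driver pipeline actually works, and it assumes the pre-render queue is actually full, which mainly happens when the CPU outruns the GPU (i.e. GPU-bound) -- which is exactly why the benefit may shrink at 30-50% GPU load:

    ```python
    # Toy model (assumption, not the real driver): each frame sitting in the
    # pre-render queue ages the input a displayed frame was built from by
    # roughly one frame interval -- but only when the queue is actually full.
    def input_latency_ms(frame_time_ms: float, queued_frames: int) -> float:
        # one interval to build/render the frame itself,
        # plus one interval per frame queued ahead of it
        return frame_time_ms * (1 + queued_frames)

    ft = 1000.0 / 120  # ~8.33 ms per frame at a steady 120 fps
    for label, mprf in [("Off", 3), ("On", 1), ("Ultra", 0)]:
        print(f"{label:5s} (MPRF {mprf}): ~{input_latency_ms(ft, mprf):.1f} ms")
    ```

    Under this model a full 3-deep queue roughly quadruples queueing latency versus an empty one; when the queue never fills, all three settings converge, which is one reading of Battle(non)sense's result.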



    So what is the consensus and truth here, really? I would like a clear answer so I can sleep better at night lol :p For me, the difference between low latency "off" and "on" is night and day; without question, input delay is lower with it set to "on" vs "off". I'm not as sure about "ultra" vs "on", but in theory, just from how it is supposed to work (frames are submitted just in time for the GPU to render), it seems obvious this should give lower input delay, no matter what?


    An interesting note from his video is that he only tested with epic/ultra/max quality settings from what I can see; can this skew the results, so to speak? He also only compares low latency "off" vs "ultra", never "on" vs the other two, from what I can see? It would be so much more useful to see all 3 options compared against each other IMO.


    In my case, where I set the graphics quality options to lowest, GPU load never exceeds 30-50%, and fps is limited with the ingame limiter to 120, should I use low latency "ultra" or "on"? I definitely will not use "off", as that 100% raises input delay and makes the mouse feel heavy compared to the other two options.



    Would greatly appreciate any feedback and technical knowledge/input on what would be best in my case and in general, as well as comments on his video and testing. To me the testing seems flawed/not enough scenarios tested (only super high quality settings tested instead of low settings, and no testing of low latency "on" vs "off"/"ultra", only "off" vs "ultra").


    cheers and thanks in advance for any help on this :)



    PS my specs: i7-9700K / RTX 2060 / 2x8GB DDR4-3200 CL15 RAM / SSD / Win 10 1903 with all latest updates / latest nvidia 440.97 driver / Asus VG248QE 1 ms 144 Hz monitor set to 120 Hz with lightboost (blur reduction) at 10% strobing brightness / high performance power plan enabled in Windows, and DVR/background apps/etc. disabled


    Edit: if it's okay I would like to tag/mention a couple of users that I know from the past are knowledgeable about things like this: @RealNC @Mda400

    anyone is free to comment though!

    Thank you.
     
    Last edited: Oct 22, 2019
  2. janos666

    janos666 Ancient Guru

    Messages:
    1,645
    Likes Received:
    405
    GPU:
    MSI RTX3080 10Gb
    I think this is at least the third topic for this video. :(
     
  3. alexander1986

    alexander1986 Master Guru

    Messages:
    201
    Likes Received:
    21
    GPU:
    RTX 2060

    My apologies, to keep it simple and short:

    I asked several questions not addressed in the video! For example: vsync off, 120 fps limit in game, ~50% GPU load maximum, is it better with MPRF 1 or 0? (low latency ON vs ULTRA) I know I can just test both myself, and I have for many hours lol, but I can't really come to a 100% conclusive answer :p
     
  4. kurtextrem

    kurtextrem Master Guru

    Messages:
    251
    Likes Received:
    40
    GPU:
    NVIDIA GeForce GTX 970
    Off does not mean 3 pre-rendered frames; it depends on the game. Rainbow Six Siege, for example, already sets it to 1, so I'm not sure if "On" would do anything there.

    To quote Nvidia: "Low Latency modes have the most impact when your game is GPU bound, and framerates are between 60 and 100 FPS, enabling you to get the responsiveness of high-framerate gaming without having to decrease graphical fidelity."
    So how do you notice the game is GPU bound? Watch GPU usage, and change GPU-bound graphics settings to see if usage drops.
     
    alexander1986 likes this.

  5. alexander1986

    alexander1986 Master Guru

    Messages:
    201
    Likes Received:
    21
    GPU:
    RTX 2060

    Ahh, good point about some games defaulting to 1 MPRF. And yes, I know that quote from nvidia, but they say the *most* impact; to me this sounds like there is still an impact/benefit to latency and input delay even when not GPU bound and framerates are between 60 and 100 fps, right?


    Just from how Ultra low latency is supposed to work, frames being submitted just in time for the GPU to draw, i.e. a fully minimized queue or "0" MPRF, it seems logical to me that this would improve input latency in every single scenario, or am I wrong with this logic? (I understand some games/engines might behave badly with this, or it could introduce side effects like frameskips/stutters/hitches/etc., but still.)
     
  6. flow

    flow Maha Guru

    Messages:
    1,023
    Likes Received:
    17
    GPU:
    Asus TUF RTX3080 oc
    I tried the low latency settings, also with the latest driver, but I can't seem to like either setting. Off or ultra, it just doesn't feel right: online shooters appear to fluctuate a lot at ultra, while off puts me behind in time.
    I know BFV sets you to 1 pre-rendered frame in DX11, but somehow it feels more accurate with an older driver that doesn't have the low latency feature.
    Funnily enough, the ultra setting did work better than off even while pre-rendered frames were fixed at 1 by the game. The driver must do some background work alongside, IMO. Maybe that's why it feels off to me.
     
    alexander1986 likes this.
  7. alexander1986

    alexander1986 Master Guru

    Messages:
    201
    Likes Received:
    21
    GPU:
    RTX 2060
    Appreciate your input and feedback at least! For me in Fortnite (Unreal Engine) it is very hard to say 100% which is better between "on" and "ultra", but ultra does feel slightly better if I had to choose one to keep :p I wish someone would do a real/serious input delay test with the highspeed camera + LED method in this game or engine when you are NOT GPU bound and have a 120/144/240 fps limit...
     
  8. kurtextrem

    kurtextrem Master Guru

    Messages:
    251
    Likes Received:
    40
    GPU:
    NVIDIA GeForce GTX 970
    The "simplest" and "hardest" solution at the same time is: get MSI Afterburner, activate the frametime overlay (graph), and compare the frametimes with on vs. ultra. More spikes + higher spikes with ultra? Stick with on. No spikes and a lower avg. frametime with ultra? Stick with ultra.
    At 240 fps you barely get any benefit out of ultra. At 60 fps you might gain one frame or half a frame, which is roughly 16 or 8 ms. At 240 fps it's roughly 4 ms or 2 ms. I wouldn't trade frametime stability for 2-4 ms.
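    The arithmetic behind those numbers is just the frame interval times the frames saved; a quick check (the "one frame or half a frame" saving is the poster's estimate, not a measured figure):

    ```python
    # Hypothetical latency saving if the queue shrinks by a given
    # number of frames (1.0 or 0.5 per the estimate above).
    def saved_ms(fps: float, frames_saved: float) -> float:
        return 1000.0 / fps * frames_saved

    for fps in (60, 120, 240):
        print(f"{fps:3d} fps: 1 frame = {saved_ms(fps, 1.0):4.1f} ms, "
              f"0.5 frame = {saved_ms(fps, 0.5):4.1f} ms")
    ```

    This is why the potential gain shrinks as the framerate rises: the frame interval itself is the upper bound on what removing one queued frame can save.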
     
    alexander1986 likes this.
  9. Sajittarius

    Sajittarius Master Guru

    Messages:
    490
    Likes Received:
    76
    GPU:
    Gigabyte RTX 4090
    It sounds like you already found the sweet spot at 120 fps; better not to have those dips when lots of things are on the screen, lol.

    I would also make sure you have your mouse sensitivity (in game, and also mouse DPI) set correctly; it makes a huge difference in hitting things. I used to use the Overwatch DPI tool; the website will tell you optimal settings based on resolution, and there is also a converter from Overwatch settings to Fortnite sensitivity (so you can use the same DPI). I'm not sure if there is a DPI tool just for Fortnite.

    Example: in Overwatch I used to use 3450 DPI and 2 sensitivity. In Fortnite I kept the 3450 DPI but used a different mouse sensitivity (2 in Overwatch = 0.024 in Fortnite; you might have to use the INI file to set it). I had to play around with DPI and sensitivity to find the feel I liked, but once I did, it definitely felt more accurate.

    Edit: links

    https://pyrolistical.github.io/overwatch-dpi-tool/ DPI tool
    https://gearbroz.com/overwatch-fortnite-sensitivity/ Overwatch to Fortnite Converter
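    The conversion in the example above implies a fixed ratio between the two games' sensitivity scales; a tiny sketch using the factor inferred from the poster's own numbers (2.0 in Overwatch = 0.024 in Fortnite; this is not an official rate, just what those two values imply):

    ```python
    # Factor inferred from the numbers in this post; keep the mouse DPI the same.
    OW_TO_FORTNITE = 0.024 / 2.0  # = 0.012

    def ow_to_fortnite(ow_sens: float) -> float:
        """Fortnite sensitivity matching an Overwatch sensitivity at equal DPI."""
        return ow_sens * OW_TO_FORTNITE

    print(ow_to_fortnite(2.0))  # 0.024
    ```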
     
    alexander1986 likes this.
  10. alexander1986

    alexander1986 Master Guru

    Messages:
    201
    Likes Received:
    21
    GPU:
    RTX 2060

    I'm using a Logitech G400s mouse; 400/800 DPI is what feels best on that mouse for me, and 800 is the native DPI AFAIK, but 400 feels great as well with a doubled sensitivity setting compared to what I use at 800 DPI. I play at a pretty high/competitive level, not professional, but still, the lowest possible input lag with a stable frametime is super important for me,

    as you can probably tell lol. Anyway, the difference between on/ultra is slight, but it definitely feels like it's there. It would be gold to have a true button-to-pixel-delay highspeed camera test measuring this in a non-GPU-bound situation, to rule out placebo and/or the game defaulting to 1 MPRF...
     
