Input lag in CS:GO / Nvidia Profile Inspector

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by robis85, Jan 2, 2020.

  1. robis85

    robis85 Member

    Messages:
    23
    Likes Received:
    3
    GPU:
    gtx 960 1gb
    Hello, so recently I found this pic on the internet (https://prnt.sc/nudj25) with people saying that force off / delay flip by flip is better than the default and reduces input lag/stuttering. Which option should I go for? I also get different options in different versions of Inspector, for example https://prnt.sc/nudkue and https://prnt.sc/nudks2.
    Or should I keep it at the default, even though the default stutters and the game feels slow and laggy?

    I play only CS:GO, professionally, and I'm willing to trade some stuttering and choppiness for lower input lag/delay. Thanks a lot in advance! :)
     
  2. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,100
    Likes Received:
    120
    GPU:
    MSI RTX 2080
    Don't use the driver limiter (V1/V2 in NPI), as it will more than likely add more latency than just using the built-in limiter in CS:GO. If you want to use an external limiter, RTSS adds the least lag; see the sketch below. (Using a fractional limit may produce less delay than a standard whole number, but the latency oscillates instead.)
    https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/11/
    You can set pre-rendered frames to 1, or use the new Low Latency modes in NVCP too.
    You should use the most recent NPI version.
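    To illustrate why an external limiter can pace frames so consistently, here is a minimal sketch of the spin-wait style of limiting that tools like RTSS are commonly described as using (an assumption about the general technique, not RTSS's actual code):

    Code:
    #include <chrono>

    // Call once per frame, just before presenting.
    void limit_frame(double target_fps) {
        using clock = std::chrono::steady_clock;
        static clock::time_point next = clock::now();
        next += std::chrono::duration_cast<clock::duration>(
            std::chrono::duration<double>(1.0 / target_fps));
        // Spin until the deadline instead of sleeping: it costs CPU time,
        // but the frame is released within microseconds of the target,
        // which is why spin-wait limiters pace frames so evenly.
        while (clock::now() < next) {}
        // If the frame itself overran the deadline, resync instead of
        // rushing the next few frames to catch up.
        if (clock::now() - next > std::chrono::milliseconds(5)) next = clock::now();
    }

    A sleep-based limiter would wake late by however much the OS scheduler slips, and that slip shows up directly as frame time jitter.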
     
    robis85 likes this.
  3. robis85

    robis85 Member

    Messages:
    23
    Likes Received:
    3
    GPU:
    gtx 960 1gb
    Thanks for the answer MrBonk! I have turned off the driver limiter in NPI as far as I know, yet when I change the "frame rate limiter mode" option, for example from delay-ce to force off, I notice changes in my game's behaviour. Could that be placebo, or does it really matter in the end? I use the in-game limiter, "fps_max 300".
     
  4. Kokotko

    Kokotko New Member

    Messages:
    3
    Likes Received:
    0
    GPU:
    Nvidia GTX 1050-Ti
    This seems like a controversial topic: the guy literally posted a table of results/tests that someone made, and you simply dismissed it with a rule of thumb. This needs some testing or a better explanation. I am also curious what all those frame limiting values add up to when set correctly. And tell me again that a driver-level limiter isn't closer to the source / doesn't have more options for frame limiting, i.e. an end product of smoother gameplay.
     

  5. aufkrawall2

    aufkrawall2 Master Guru

    Messages:
    565
    Likes Received:
    41
    GPU:
    GTX 1070 OC/UV
    In the case of CS:GO, just set ULL to ultra and use the driver limiter. The game's own limiter produces crap frame time consistency, and ULL ultra reduces the driver limiter's lag with D3D9/10/11 (and probably OGL).
    Also disable the game's internal multithreading option; it introduces an additional, ridiculous amount of latency.
    The stutter is shader compilation stutter; it should get smoother the more you play.
     
  6. Piwielle

    Piwielle New Member

    Messages:
    3
    Likes Received:
    1
    GPU:
    RTX 2080
    Hi. I've been wondering about this, because while you seem very sure of yourself, it sounded somewhat odd to me, so I set out to do some testing.
    My testing setup consists of:
    - 8700k at 5Ghz
    - RTX 2080, mildly OC, nothing crazy.
    - 16GB at 4000MHz CL 16
    It's a high-end, optimized system. I have no problem reaching over 500 FPS, and I use fps_max 400 in CS:GO.

    I was mainly interested in input lag, so to test that I'm using an Arduino Leonardo, plugged into my PC via USB, with a photodiode stuck on my monitor. I specifically go to a very bright place (Vertigo CT spawn, nose in the two lights on the right) and open my buy menu.
    The Arduino code does the following (see the sketch below):
    - sends an "escape" USB input to leave the buy menu, and records the time at which it sends the input
    - waits for the photodiode to change value (it has a LOW and a HIGH output, based on light)
    - records the time at which the photodiode switches from LOW to HIGH
    - subtracts the two timestamps; the result is the total time between the input being sent and the monitor changing color, which I take as input lag.
    It does not take the internal mouse lag into account, but since I'm using it for comparisons, it should be fine for telling me which settings are better. The time between input lag tests is also randomized, to avoid any weird syncing between the refresh rate and the measurement.
    The precision has been tested with an LED plugged into the Arduino instead of the monitor, and the values are precise to within ±10 µs.
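    For reference, here is a minimal sketch of what such a Leonardo program could look like (my own reconstruction from the description above, not Piwielle's actual code; the pin number, delays, and the "b" buy-menu bind are assumptions):

    Code:
    #include <Keyboard.h>   // native USB HID keyboard on the Leonardo

    const int PHOTODIODE_PIN = 7;   // assumed: digital photodiode module on pin 7

    void setup() {
      pinMode(PHOTODIODE_PIN, INPUT);
      Serial.begin(115200);
      Keyboard.begin();
      randomSeed(analogRead(0));    // seed the randomized pause between samples
      delay(5000);                  // time to tab into the game
    }

    void loop() {
      // Precondition: buy menu is open, so the sensor over the bright spot reads LOW.
      if (digitalRead(PHOTODIODE_PIN) == LOW) {
        unsigned long t0 = micros();
        Keyboard.press(KEY_ESC);    // close the buy menu...
        Keyboard.release(KEY_ESC);
        while (digitalRead(PHOTODIODE_PIN) == LOW) {}   // ...and wait for light
        unsigned long t1 = micros();
        Serial.println(t1 - t0);    // input-to-photon time in microseconds
        delay(500);
        Keyboard.write('b');        // re-open the buy menu (assumed default bind)
        delay(1000 + random(0, 1000));   // randomized gap between samples
      }
    }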

    So now, here are my results, which are an average of 315 values for the first test and 328 values for the second test:

    Low latency mode: ON
    FPS limiter type: in-game (fps_max 400)
    Multicore rendering: ON
    Input lag: 10.46009524 ms

    Low latency mode: ULTRA
    FPS limiter type: none (I wasn't reaching 400 FPS without multicore rendering, so it bounced around 380 without any kind of limiter)
    Multicore rendering: OFF
    Input lag: 10.03969512 ms

    So, while you were technically right about multicore rendering adding some latency, I wouldn't call it ridiculous at all. Obviously this is just my setup, and it might differ on other PCs (a weaker CPU pushed to high utilization by this option will probably suffer more added latency).

    My next step would be testing the in-game frame limiter versus RTSS versus the driver frame limiter, but I'm currently testing the 450-branch drivers for other reasons, and they don't have the frame limiter option yet. Maybe some other day.
    Although I'm fairly sure the input lag will be very similar, frame time pacing should be better with RTSS.
     
  7. aufkrawall2

    aufkrawall2 Master Guru

    Messages:
    565
    Likes Received:
    41
    GPU:
    GTX 1070 OC/UV
    You don't need to worry much about input getting delayed by a frame or two at crazy frame rates like 400fps, as 1 frame is just 1000/400 = 2.5ms. I doubt any human can notice such differences in delay in double blind tests.
    It is much more apparent at lower fps, e.g. in Danger Zone. The game's MT can be toggled without restarting, and it seems the Nvidia driver is doing most of the work anyway (hardly any difference in fps and CPU utilization while CPU bound in Danger Zone), unlike the AMD driver.
     
    Last edited: Apr 11, 2020
  8. Andre Souza

    Andre Souza Member

    Messages:
    17
    Likes Received:
    2
    GPU:
    1080
    Never test (video) input lag using something that depends on the tick rate and/or the network.

    The correct way to measure is by using a camera-angle event. I see almost everyone making that mistake, even well-known and smart people like Battle(non)sense and the guys from Blur Busters.

    Network delay and the way the game handles the rate is a different subject, and there are far too many variables to check. A client running at a perfect 127.5 fps could behave very differently from one at 128.5 fps if the tick rate is 128, and it all depends on how the game handles rate throttling.
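    One way to see why those two rates behave differently at 128 tick: at 127.5 fps some ticks get no fresh frame at all, while at 128.5 fps some ticks get two. A toy simulation of the aliasing (my own illustration, not Andre's method):

    Code:
    #include <cmath>
    #include <cstdio>

    int main() {
        const double tick_hz = 128.0;
        for (double fps : {127.5, 128.5}) {
            const int ticks = 1280;   // simulate 10 seconds at 128 tick
            int empty = 0, doubled = 0;
            for (int k = 0; k < ticks; ++k) {
                // Frames completed by the end of tick k vs. by the end of tick k-1.
                long done_now  = (long)std::floor((k + 1) / tick_hz * fps);
                long done_prev = (long)std::floor((double)k / tick_hz * fps);
                long in_tick   = done_now - done_prev;
                if (in_tick == 0) ++empty;
                if (in_tick >= 2) ++doubled;
            }
            std::printf("%.1f fps @ 128 tick: %d ticks with no new frame, %d ticks with two (of %d)\n",
                        fps, empty, doubled, ticks);
        }
    }

    Over 10 simulated seconds, 127.5 fps leaves about 5 ticks with no new frame (those ticks send stale input), while 128.5 fps instead produces about 5 ticks where one frame is effectively wasted. How much that matters depends, as Andre says, on how the client throttles its rate.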
     
    Last edited: Apr 12, 2020
  9. aufkrawall2

    aufkrawall2 Master Guru

    Messages:
    565
    Likes Received:
    41
    GPU:
    GTX 1070 OC/UV
    Care to elaborate?
     
