Hi there guys, I read here: https://forums.guru3d.com/threads/rtss-vs-in-engine-framerate-capping.415494/ that frame limiters can reduce input lag by one or two frames (one frame if it's RivaTuner, two if it's an in-game frame limiter): if you hit 70 fps capped, your input lag will be lower than if you hit 70 fps uncapped. This latency reduction also goes a long way toward explaining the low-lag traditional vsync setups described on Blur Busters, which involve capping your framerate to keep the buffers empty (as I understand it).

My question is in the title: why, or rather how, does capping your framerate reduce latency compared to running uncapped at the same frames-per-second value? Is it because the game can sample input closer to the delivery of the frame, since it knows exactly when to expect it? I don't really get how something like this is possible and I'd really like to know how it works. Thank you very much, I really appreciate it.
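To make my mental model concrete, here's a rough sketch (in Python, just as pseudocode for the timing logic; the function names and the 70 fps cap are my own placeholders, not anything from RTSS or a real engine) of what I imagine a frame limiter doing: sleep until the next frame slot, sample input only then, and submit the frame to an otherwise empty queue, so the input is at most about one frame-time old when the frame goes out.

```python
import time

TARGET_FPS = 70                 # hypothetical cap, matching my 70 fps example
FRAME_TIME = 1.0 / TARGET_FPS   # ~14.3 ms per frame slot

def limited_frame_loop(sample_input, render_frame, frames=10):
    """Sketch of a capped loop: wait for the next frame slot, sample
    input *right before* rendering, then present. Because the loop
    never runs ahead of the display, no frames pile up in a queue,
    so input-to-present latency stays well under one frame time."""
    next_deadline = time.perf_counter()
    latencies = []
    for _ in range(frames):
        # Idle until the start of the next frame slot.
        now = time.perf_counter()
        if next_deadline > now:
            time.sleep(next_deadline - now)
        t_input = time.perf_counter()
        sample_input()              # input read as late as possible
        render_frame()              # submitted to an empty pipeline
        t_present = time.perf_counter()
        latencies.append(t_present - t_input)
        next_deadline += FRAME_TIME
    return latencies
```

By contrast, my understanding of the uncapped case is that the CPU keeps producing frames ahead of the GPU, so each frame (and the input baked into it) sits in a render queue for one or more extra frame-times before it's displayed. Is that roughly what's going on?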