This is something that really confuses and, frankly, pisses me off. Basically:

60Hz Vsync + 30FPS cap = constant microstuttering, tolerable input lag.
60Hz 1/2 Refresh Vsync + 30FPS cap = no microstuttering, but a large increase in input lag.

It's ridiculously frustrating. I've got plenty of overhead to hold a perfect 30FPS whenever I can't maintain a perfect 60FPS in a given game. But every few seconds, or whenever the camera moves, a frame gets dropped or there's a spike in frame delivery time, and it creates judder/microstutter. I don't understand how this is happening. I get that with double-buffered Vsync, if you can't maintain 60 you drop straight to 30. So why does it still happen when a 30FPS cap is being used? Triple buffering doesn't seem to help either.

I've noticed this in a few games, and it's a big problem in the one I'm playing now, Mafia II. Maybe it also explains, somehow, the constant microstuttering I get in Dead Island with a 30FPS cap and Vsync on. I don't recall trying 1/2 Refresh there to check for microstutter, but I do remember trying it to check input lag, and it was a bit worse.

*Video Link:* https://www.copy.com/s/asp9Aw1Er9oP/Stutter with Framerate showing.avi

Anyone else ever notice this?
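
Edit: to illustrate what I *think* is going on, here's a toy C simulation. The mechanism and all the numbers in it are my assumptions, not how any actual driver works. The idea: a CPU-side 30FPS limiter ticks on its own clock, never synchronized to the monitor's vblank, so whenever a frame happens to finish right around a refresh boundary, plain Vsync scans it out either one refresh early or one refresh late, and the on-screen intervals bounce between 16.7 / 33.3 / 50 ms (microstutter). 1/2 Refresh Vsync instead flips on every 2nd vblank no matter what, which locks pacing to a flat 33.3 ms but makes every frame wait the full two refreshes, hence the extra lag:

```c
/* Toy model of both cases -- my guess at the mechanism, not any
 * driver's actual code. RENDER is picked to sit near a vblank
 * boundary on purpose, to make the beat visible in a few frames. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define VBLANK (1000.0 / 60.0)  /* 16.667 ms per 60 Hz refresh */
#define CAP    (1000.0 / 30.0)  /* limiter tick: 33.333 ms     */
#define RENDER 16.0             /* assumed frame time, ms      */
#define FRAMES 12

int main(void)
{
    srand(7);
    double prev_plain = 0.0, prev_flip = 0.0;

    printf("frame | plain: interval  lag | 1/2 refresh: interval  lag (ms)\n");
    for (int i = 1; i <= FRAMES; i++) {
        /* the limiter ticks every 33.3 ms on its own clock; render
           time jitters +/-2 ms, so the frame finishes just before
           OR just after a vblank, at random                        */
        double start  = i * CAP;
        double render = RENDER + ((rand() % 4001) - 2000) / 1000.0;
        double ready  = start + render;

        /* plain vsync: scan out at the first vblank after 'ready';
           the jitter flips that between 1 and 2 refreshes after the
           tick, so intervals bounce 16.7 / 33.3 / 50 ms            */
        double plain = ceil(ready / VBLANK) * VBLANK;

        /* 1/2 refresh: the swap blocks until every 2nd vblank; the
           next frame starts right at the flip and, since render is
           well under 33.3 ms, is always ready for the next one --
           flat 33.3 ms pacing, but a full two refreshes of lag     */
        double start_half = prev_flip;
        double flip       = start_half + 2.0 * VBLANK;

        printf("%5d | %8.1f %8.1f | %14.1f %8.1f\n",
               i, plain - prev_plain, plain - start,
                  flip  - prev_flip,  flip  - start_half);
        prev_plain = plain;
        prev_flip  = flip;
    }
    return 0;
}
```

Compile with something like `gcc sim.c -lm` and the plain-Vsync column should jitter while the 1/2 Refresh column stays flat at 33.3 ms with a consistently higher lag number. If that model is right, it would also explain why triple buffering doesn't help: the extra buffer doesn't change which vblank a finished frame snaps to.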