Anti-Aliasing: Surprised, but I shouldn't be... (Upscaling too)

Try disabling it for a little while yourself, but to my brain, "true" anti-aliasing (not the image-blurring type of "AA" like FXAA, MLAA, SMAA, etc.) causes input delay. Even when your graphics card isn't fully utilized and you're hitting the max framerate. I'm surprised I've been using it all this time without realizing what MSAA/FSAA/SSAA actually does to the original rasterized image.

It's much the same kind of process as GPU upscaling (which also causes input delay, even on consoles). Upscaling on the GPU invents artificial color information for each pixel, guessing what would be there if the game were actually running at the higher resolution. Anti-aliasing does the same sort of thing, except it works at the existing resolution, scanning over the existing edges (like layers), and it's just as "artificial" in that sense. Both add extra processing before the image gets shooed out to your display. (I've put a couple of rough sketches of what I mean at the bottom of this post.)

My theory is that while displays still "upscale", they do it differently from a GPU. A display's scaler has a known set of "coordinates" for mapping each pixel of a smaller resolution onto its panel's native resolution, and that native mode is right there in the display's EDID under the "detailed timings" (you'll see it if you ever use an EDID reader like MonInfo). So if there is lag from upscaling on the display rather than on your GPU, in theory it should be much lower, and to me it is, like running an Xbox 360 at 720p versus letting it upscale to 1080p.

Take an easy-running game like Counter-Strike: Global Offensive (or any easy-running Source game, for that matter) and toggle MSAA for a bit (again, FXAA excluded); iirc you can flip it from the dev console with mat_antialias 0 / mat_antialias 8 instead of digging through the video options. Twitch FPS gamers should see the difference pretty quickly, as I did; your mouse movement won't feel as heavy as it did before. But it's up to your brain, and this thread is only here to reinforce my findings.

I now realize why consoles don't bother with anti-aliasing. The graphics power it needs and the delay it adds to the image pipeline just aren't worth it when you're trying to ship a well-built but still good-looking game. This makes buying a higher-resolution display all the more necessary once people discover this drawback for themselves. Or maybe I'm alone and crazy :nerd:?

I mean, there are a lot of threads if you Google whether anti-aliasing adds input lag. In most of them someone says "if you mean it adds more work for the GPU, of course it's a delay, but as long as your graphics card is fast enough you won't notice it." Bollocks, I say. As mentioned above, it's an artificial "resolution" process, unlike higher-res textures, higher-res shadows, or higher levels of anisotropic filtering.

I own an Nvidia GPU and I thought this was more of a driver-oriented feature, so that's why I posted here, if anyone is wondering. This is going to make my graphics card a little more capable in the future, not having to worry about anti-aliasing anymore (though I now have a reason to get a 4K screen).
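
To show what I mean by "artificial color information" when something upscales: here's a very rough sketch of plain bilinear interpolation, which is one common way a scaler guesses the in-between pixels (real GPU/driver scalers may use fancier filters, this is just the idea). The tiny 2x2 "image" is made up purely for illustration.

```python
# Rough sketch of bilinear upscaling: output pixels that don't line up with a
# real source pixel get a blended ("artificial") color made from the four
# nearest real samples. Plain Python, no libraries needed.

def bilinear_upscale(src, new_w, new_h):
    """src is a list of rows, each row a list of (r, g, b) tuples."""
    src_h, src_w = len(src), len(src[0])
    out = []
    for y in range(new_h):
        # Map the output coordinate back into source space.
        fy = y * (src_h - 1) / (new_h - 1) if new_h > 1 else 0
        y0, ty = int(fy), fy - int(fy)
        y1 = min(y0 + 1, src_h - 1)
        row = []
        for x in range(new_w):
            fx = x * (src_w - 1) / (new_w - 1) if new_w > 1 else 0
            x0, tx = int(fx), fx - int(fx)
            x1 = min(x0 + 1, src_w - 1)
            # Blend the four surrounding real pixels by distance.
            px = []
            for c in range(3):
                top = src[y0][x0][c] * (1 - tx) + src[y0][x1][c] * tx
                bot = src[y1][x0][c] * (1 - tx) + src[y1][x1][c] * tx
                px.append(round(top * (1 - ty) + bot * ty))
            row.append(tuple(px))
        out.append(row)
    return out

# Tiny made-up 2x2 "image": most pixels of the 4x4 result are colors that
# never existed in the original, which is the "guessing" I'm talking about.
tiny = [[(0, 0, 0), (255, 0, 0)],
        [(0, 0, 255), (255, 255, 255)]]
for row in bilinear_upscale(tiny, 4, 4):
    print(row)
```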
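
And the anti-aliasing side of it: the simplest "true" AA (supersampling) renders more samples than there are pixels and then averages them back down, so the edge pixels you end up seeing are, again, blended colors rather than single rasterized samples. Very rough sketch of just the 2x downsample step; MSAA is smarter about only doing this work at triangle edges, but the idea is the same.

```python
# Rough sketch of 2x supersampling's downsample step: every final pixel is the
# average of a 2x2 block of rendered samples, so edges come out as in-between
# colors instead of hard steps -- the smoothing, and the extra work.

def downsample_2x(hi_res):
    """hi_res is a list of rows of (r, g, b) tuples; width/height assumed even."""
    out = []
    for y in range(0, len(hi_res), 2):
        row = []
        for x in range(0, len(hi_res[0]), 2):
            block = [hi_res[y][x], hi_res[y][x + 1],
                     hi_res[y + 1][x], hi_res[y + 1][x + 1]]
            row.append(tuple(sum(p[c] for p in block) // 4 for c in range(3)))
        out.append(row)
    return out

# A hard black/white diagonal edge "rendered" at 4x4 turns into grey
# in-between values once it's averaged down to 2x2.
edge = [[(255,)*3, (255,)*3, (255,)*3, (255,)*3],
        [(0,)*3,   (255,)*3, (255,)*3, (255,)*3],
        [(0,)*3,   (0,)*3,   (255,)*3, (255,)*3],
        [(0,)*3,   (0,)*3,   (0,)*3,   (255,)*3]]
for row in downsample_2x(edge):
    print(row)
```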
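
On the EDID "detailed timings" thing: the first detailed timing descriptor in the EDID carries the panel's native mode, which is what I figure the display's scaler maps everything onto. Here's a sketch of pulling the native resolution out of the standard 18-byte descriptor layout; the example descriptor below is the common 1920x1080 one with the trailing bytes zeroed out for brevity, in a real dump (from MonInfo or similar) you'd read the actual bytes.

```python
# Rough sketch of decoding the native mode from an EDID detailed timing
# descriptor (what MonInfo shows as "detailed timings"). Field layout follows
# the standard 18-byte descriptor; in a full 128-byte EDID block the first
# descriptor sits at byte offset 54.

def decode_dtd(dtd):
    """dtd: an 18-byte detailed timing descriptor as bytes."""
    pixel_clock_khz = int.from_bytes(dtd[0:2], "little") * 10
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)   # upper 4 bits live in byte 4
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)   # upper 4 bits live in byte 7
    return h_active, v_active, pixel_clock_khz

# Example 1920x1080 descriptor; the trailing bytes (sync widths, physical
# size, flags) are zeroed here since we don't read them in this sketch.
example = bytes([0x02, 0x3A, 0x80, 0x18, 0x71, 0x38, 0x2D, 0x40] + [0] * 10)
w, h, clk = decode_dtd(example)
print(f"native mode: {w}x{h} @ pixel clock {clk / 1000:.1f} MHz")
```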