Anti-Aliasing: Surprised, but I shouldn't be...

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Mda400, Jan 11, 2016.

  1. Mda400

    Mda400 Maha Guru

    Messages:
    1,089
    Likes Received:
    200
    GPU:
    4070Ti 3GHz/24GHz
    Anti-Aliasing: Surprised, but I shouldn't be... (Upscaling too)

    Try disabling it for a little while yourself, but to my brain, "true" anti-aliasing (not the image-blurring type of "AA" like FXAA, MLAA, SMAA, etc.) causes input delay, even if your graphics card isn't being fully utilized and you're hitting the max framerate.

    I was surprised that I've been using it all this time without realizing what MSAA/FSAA/SSAA does to an original rasterized image. It's much the same kind of process as GPU upscaling (which also causes input delay, even on consoles).

    Upscaling (on the GPU) adds artificial color information to each pixel, guessing what it would look like if the game were running at a higher resolution. Anti-aliasing does a similar sort of thing, but at the existing resolution, scanning over the existing edges (like layers), and it is "artificial" in the same sense. Both delay the image further before it is shooed out to your display.
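
    If you want to see what I mean by "artificial color information", here's a toy Python sketch of bilinear upscaling, the simplest kind of interpolation a scaler might use (purely illustrative, not how any particular GPU or TV actually does it):

        # Toy sketch of bilinear upscaling: the scaler "invents" in-between
        # pixel values by blending the four nearest source pixels.

        def bilinear_upscale(src, new_w, new_h):
            """src is a 2D list of grayscale values; returns a new_h x new_w image."""
            old_h, old_w = len(src), len(src[0])
            out = []
            for y in range(new_h):
                # Map the output pixel back into source coordinates.
                fy = y * (old_h - 1) / (new_h - 1)
                y0, ty = int(fy), fy - int(fy)
                y1 = min(y0 + 1, old_h - 1)
                row = []
                for x in range(new_w):
                    fx = x * (old_w - 1) / (new_w - 1)
                    x0, tx = int(fx), fx - int(fx)
                    x1 = min(x0 + 1, old_w - 1)
                    # Blend the four neighbours; the blended value is color
                    # information that never existed in the source image.
                    top = src[y0][x0] * (1 - tx) + src[y0][x1] * tx
                    bot = src[y1][x0] * (1 - tx) + src[y1][x1] * tx
                    row.append(top * (1 - ty) + bot * ty)
                out.append(row)
            return out

        # A 2x2 image upscaled to 4x4: the in-between pixels are guesses.
        print(bilinear_upscale([[0, 255], [255, 0]], 4, 4))

    Every output pixel that doesn't land exactly on a source pixel is a blend of its neighbours, i.e. a guess.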

    My theory is that since displays still "upscale", they do it differently than a GPU. They have some sort of known "coordinates" for where to place each pixel when going from a smaller resolution to the panel's native resolution, found in a display's EDID under "detailed timings" (if you ever use an EDID reader like Moninfo). This is why, if there is lag to upscaling, doing it on the display rather than the GPU would in theory add much less, and to me it does, like when using an Xbox 360 at 720p vs 1080p upscaling.

    Take an easy-running game like Counter-Strike: Global Offensive (or any easy-running Source game, for that matter) and toggle MSAA anti-aliasing for a bit (again, FXAA excluded). Twitch FPS gamers should see the difference pretty quickly, as I did (your mouse movement won't feel as heavy as it did before). But it is up to your brain, and this thread is only to reinforce my findings.

    I now realize why consoles don't bother with anti-aliasing. The amount of graphics power needed, and the delay it brings to the image process, is not worth the time when making a well-built but beautiful-looking game. This makes buying a higher-resolution display all the more necessary once people discover this drawback for themselves.

    Or maybe I'm alone and crazy :nerd:? I mean, there are a lot of threads in a Google search on the question of whether anti-aliasing adds input lag. In most of them you will find someone saying "if you mean adding more work for the GPU to do, of course it's a delay. But as long as your graphics card is fast enough you won't notice it". Bollocks, I say. As mentioned above, it is an artificial "resolution" process, unlike higher-res textures, higher-res shadows, or higher levels of anisotropic filtering.

    I own an Nvidia GPU and I thought this was a more driver-oriented feature, so that's why I posted here, if anyone is wondering... This is going to make my graphics card a little more capable in the future, not having to worry about anti-aliasing anymore (though I now have a reason to get a 4K screen).
     
    Last edited: Jan 11, 2016
  2. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
  3. Mda400

    Mda400 Maha Guru

    Messages:
    1,089
    Likes Received:
    200
    GPU:
    4070Ti 3GHz/24GHz
    Oh, none taken. But all I'm saying is: turn AA off for a bit (any game) and let your own mind review whether it made a difference for you or not, instead of getting defensive. ;)

    I don't mean to push this belief on you guys; I'm just giving my own personal experience. If enough people actually try it out and agree, then it could be a minor revelation, and only the most sensitive (like myself) would probably go without AA for the sake of the perceived reduction in delay. If not, then we'll all go about our lives.
     
    Last edited: Jan 11, 2016
  4. Martigen

    Martigen Master Guru

    Messages:
    534
    Likes Received:
    254
    GPU:
    GTX 1080Ti SLI
    Ah, hmm. Where to begin. You may be a little bit crazy :)

    No scaling happens on the display (unless you choose it to, e.g. on many TVs). It's merely displaying the 1080p image the GPU feeds it. Any MSAA latency you perceive is GPU latency, with the GPU working harder to produce that single frame that gets fed to the display. If you're maintaining, say, 60fps with MSAA, there is no extra latency.

    Upscaling likewise should not itself produce input delay. It's not adding artificial color information; it's actually rendering the image at a higher resolution, then downscaling it with an algorithm to get, say, a 1080p image. It's just more GPU processing. There's no input delay here -- if the GPU can deliver 60 upscaled frames per second, then that's what it delivers. If upscaling causes your FPS to drop below 60, then you may get input delay, just the same as if the FPS had dropped below 60 without upscaling.

    Consoles forgo anti-aliasing because they can't afford it. Additionally, TVs are different from PC displays (which is partly why PC displays can be more expensive); pixels often blend due to the layout of the pixels and the underlying technology being used, leading to a naturally softer image. 4K displays may be different, though, as you note...

    I just realised you may be talking primarily about TV displays and the Xbox, in which case, while I personally don't know enough about how TVs 'upscale', I can imagine they would need at least a single frame buffer to store the upscaled image, in which case you may get input delay. But the same is not true if the Xbox is doing the upscaling (assuming it has the power to), nor a PC doing it.

    Regardless, any AA method, be it MSAA or post-processing, happens before the final image is pushed to the display, and if the GPU is asked to deliver 60 of these per second and it can, then that's what it delivers, and your maximum delay should be no more than 16.6ms, the standard frame time for 60fps.
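
    To put rough numbers on that, here's a quick Python sketch of the frame-time budget argument (the render and MSAA costs are made-up illustrative figures, not measurements):

        # Back-of-the-envelope: frame time vs. the 60fps budget.

        REFRESH_HZ = 60
        frame_budget_ms = 1000 / REFRESH_HZ   # 16.67 ms per refresh

        base_render_ms = 9.0                  # hypothetical frame without AA
        msaa_cost_ms = 4.0                    # hypothetical extra MSAA work

        total_ms = base_render_ms + msaa_cost_ms
        if total_ms <= frame_budget_ms:
            # The frame still finishes inside one refresh interval, so the
            # displayed image is no older than it was without MSAA.
            print(f"{total_ms:.1f} ms fits the {frame_budget_ms:.1f} ms budget: no added display lag")
        else:
            # Only once the budget is blown does the frame slip to a later refresh.
            print(f"{total_ms:.1f} ms misses the budget: frames now arrive late")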
     
    Last edited: Jan 11, 2016

  5. Mda400

    Mda400 Maha Guru

    Messages:
    1,089
    Likes Received:
    200
    GPU:
    4070Ti 3GHz/24GHz
    I understand what you are saying, and I stated above my reasons for why I felt upscaling and anti-aliasing cause delay. I have done this back-and-forth with my own 360 at 720p and 1080p, and on my PC with GPU scaling vs. display scaling in the NVCP, using the "DVI PC" input label on my Samsung TV, which I also use as my PC monitor.

    It basically becomes your "PC monitor". The PC label (or a similar advanced option on TVs that support it) switches the display to receiving pure full-chroma RGB signals instead of chroma-subsampled YCbCr images, and disables all enhancements, kind of like game mode would. So my 40" HDTV is essentially a monitor now.
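
    For anyone wondering what chroma subsampling actually gives up, here's a quick Python sketch of the sample counts (the frame size is just an example; 4:2:0 is one common subsampling scheme):

        # Why chroma subsampling loses data: 4:2:0 keeps full luma (Y) but
        # stores one chroma pair (Cb, Cr) per 2x2 pixel block.

        width, height = 1920, 1080
        pixels = width * height

        rgb_444_samples = pixels * 3                   # full RGB: 3 samples/pixel
        yuv_420_samples = pixels + 2 * (pixels // 4)   # Y per pixel, Cb+Cr per block

        print(f"RGB 4:4:4  : {rgb_444_samples:,} samples per frame")
        print(f"YCbCr 4:2:0: {yuv_420_samples:,} samples per frame "
              f"({yuv_420_samples / rgb_444_samples:.0%} of full chroma)")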

    Good thing, in my case, that consoles forgo AA, given how I feel about it now.

    As someone somewhere on the internet put it: "Content looks better at the native resolution it was created in. There is no replacement for missing data."

    I have no fancy "LED attached to a mouse button, then time the screen draw against the LED lighting up" setup to prove this. Just giving out some insight for those who want to geek out. :nerd: Maybe they will experience what I did, maybe not... sorry for my stubbornness in advance.
     
    Last edited: Jan 12, 2016
  6. Martigen

    Martigen Master Guru

    Messages:
    534
    Likes Received:
    254
    GPU:
    GTX 1080Ti SLI
    Ah right. Well, again, you may get a delay if you're letting the Samsung do the upscaling, but you shouldn't if the PC is doing it.

    And the things I stated aren't stereotypes; they're the facts of how the technology works. Your experience doesn't invalidate what's actually happening; the error could be in your perception. Placebo? :) Or it's there, but the cause lies somewhere else.

    EDIT: Actually, I'm going to correct myself here -- "Regardless, any AA method, be it MSAA or post-processing, happens before the final image is pushed to the display, and if the GPU is asked to deliver 60 of these per second and it can, then that's what it delivers."
    This remains true, but it's possible the GPU may use an extra buffer to do this with upscaling (presumably it has the higher-res image in one buffer, then downscales it into another), which would introduce a frame delay. Who knows? Someone more knowledgeable would have to answer this.

    Absolutely correct. Nothing makes up for lost data. But a GPU doing upscaling or AA won't be adding input delay.

    Also, it's worth noting that no matter the resolution of a display, pixels are still square(-ish; TVs tend to use differently shaped pixels). You will always get jaggies; they're just harder to notice at higher resolutions. MSAA and post-processing AA will always improve image quality, as they smooth out these jaggies. I'm typing this on a 3440x1440 display and I wouldn't play any game without AA; I can see the jaggies easily.

    Upscaling is superb in this regard, as it's really the best method to reduce aliasing across the board, for geometry, textures, specular and so on, though even here a little AA can help post-upscaling. Finally, keep in mind upscaling produces a higher-quality image: depending on the source textures, an upscaled-then-downscaled image will show more texture detail than a native image, as it's actually being rendered at a higher resolution and then reduced to a smaller size. That image is more accurate texture-wise than the same textures rendered at native resolution, because at native resolution there's less data to represent them.

    Upscaling is demanding as hell, but totally worth it for any game where you can use it and maintain 60fps.
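
    The downscale step is just averaging real rendered samples, which is why it cleans up edges so well. A minimal Python sketch, assuming a simple 2x2 box filter (real drivers use fancier filters):

        # Sketch of the downscale step in super-sampling (e.g. rendering at
        # 2x2 the target res): each output pixel averages a 2x2 block of
        # actually-rendered samples. Illustrative only.

        def box_downscale_2x(hi_res):
            """hi_res: 2D list of grayscale values with even width/height."""
            out = []
            for y in range(0, len(hi_res), 2):
                row = []
                for x in range(0, len(hi_res[0]), 2):
                    block = (hi_res[y][x] + hi_res[y][x + 1] +
                             hi_res[y + 1][x] + hi_res[y + 1][x + 1])
                    row.append(block / 4)  # average of real rendered samples
                out.append(row)
            return out

        # A hard black/white edge rendered at 4x4 becomes a smoothed 2x2 image.
        edge = [[0, 0, 255, 255],
                [0, 0, 255, 255],
                [0, 255, 255, 255],
                [0, 255, 255, 255]]
        print(box_downscale_2x(edge))  # [[0.0, 255.0], [127.5, 255.0]]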
     
    Last edited: Jan 11, 2016
  7. Mda400

    Mda400 Maha Guru

    Messages:
    1,089
    Likes Received:
    200
    GPU:
    4070Ti 3GHz/24GHz
    No, I don't have the Sammy upscaling anything unless a game can't go up to 1080p. I have my scaling option set to "Display - no scaling", as my display knows what to do with 720p, 1024x768, etc. You know, because "GPU scaling for me adds input lag".

    How can you be so sure? Those two methods are essentially "adding" something that isn't there...

    I do know this, but from my experience, the jagged edges are the natural appearance and a limitation of how our technology is built (GPUs work with pixels, as you know). "Adding" extra color information that isn't in the original image, whether on the GPU or the display, is where I believe the lag comes from. The higher the resolution, the greater the pixel density at a given screen size, and the less need for AA, so that's my next avenue. You probably know this already, though, having a 3440x1440 monitor. A 4K or 5K display would make it even less noticeable (duh, MDA).

    Unless upscaling were to cause input delay even when it didn't saturate my device's resources... that's when someone like my crazy ass (lol) would sacrifice the extra (fake) clarity for a more immediate response. The reason I bring up the Xbox 360 so much is that no console before it, or in its generation, had as much flexibility upscaling its native resolution (720p in its case). It was built just like a small desktop PC (no surprise, being designed by Microsoft).

    I'm sorry man, I'm standing by my stubbornness here.
     
    Last edited: Jan 11, 2016
  8. CK the Greek

    CK the Greek Maha Guru

    Messages:
    1,316
    Likes Received:
    37
    GPU:
    RTX 2060S
    Consoles... TVs... etc. Nothing beats a real PC monitor, if that's the case here. Sure, there are some good TVs out there, but NONE can compare with a similar PC monitor; I think you know that.

    (Too many walls of text; I've read them "photographically", so sorry if I am a bit off-topic...)
     
  9. GanjaStar

    GanjaStar Guest

    Messages:
    1,146
    Likes Received:
    2
    GPU:
    MSI 4G gtx970 1506/8000
    I do notice that mouse responsiveness is lower with DSR compared to native res, even when I'm capped at a constant 60 fps in both.

    Same with GPU-scaling-based custom resolutions.

    I never did any testing, though, to see if frame time or something like that could be the culprit.
     
  10. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    I played most of my PC gaming life (since 1996) without using AA or AF at all, due to the detectable latency. These days the latency is much less noticeable and these features are less of a resource hog. If you're playing offline/single-player games, then AA-on is fine. However, if you want the lowest latency possible in online FPS play, then AA-off along with v-sync-off can make a little difference. Older hardware will benefit from AF-off as well.
     

  11. stereoman

    stereoman Master Guru

    Messages:
    884
    Likes Received:
    181
    GPU:
    Palit RTX 3080 GPRO
    I'm sure anti-aliasing adds some delay, but I've never noticed it, unlike v-sync, which for me is very noticeable and one of the reasons I upgraded to G-Sync. Everyone's perception is different, though, so what might look delayed to one person might look perfectly fine to another. It would be interesting to see some data on just how much anti-aliasing affects latency, though.
     
  12. Mda400

    Mda400 Maha Guru

    Messages:
    1,089
    Likes Received:
    200
    GPU:
    4070Ti 3GHz/24GHz
    No problem. The topic is really about how I think anti-aliasing (and other resolution-enhancing techniques like upscaling) could add input delay when playing a game.

    Everyone has a different brain; some can detect it more easily than others. I just know that from fiddling around with it, I've had to lower many of my games' sensitivities to compensate for it.


    With downscaling, I would think you wouldn't perceive as much delay as with upscaling. You are taking actual color information and packing it down through an interpolation filter to make it fit smoothly into a smaller window.

    So there's no fake information being added, as is my opinion with anti-aliasing or upscaling. Maybe the smoothing process creates a bit of delay, but I have not played with DSR enough, nor do I like my in-game UI being really small due to the scaling differences between resolutions.



    That's cool if you're noticing some of what I might be. If something affects my game response in multiple games, I'll usually disable it in all cases.

    As for AF, in my opinion that's a little different as far as possibly inducing latency, since you are resampling textures at extreme angles to make them stand out with clarity, unlike AA, where you're affecting the actual resolution. But I might test that too, now that you mention it.

    No doubt; it is a proven fact that normal vsync adds at least a frame of lag. I do use Adaptive Vsync: even though there is added latency with vsync, with adaptive it only kicks in at my monitor's refresh rate, and vsync lag is at its lowest when your framerate is running in sync with your refresh rate. I can't stand tearing, and I feel the amount of lag adaptive gives is low enough not to ruin my gaming experience for now.

    I would like to see stats too on things like MSAA and upscaling. As I stated before, I don't have the precise capture tools to detect the latency differences; it's all in my noggin for now.
     
    Last edited: Jan 11, 2016
  13. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,725
    Likes Received:
    1,854
    GPU:
    EVGA 1070Ti Black
    Anything adding to the original image adds input delay; there is nothing new about that. That is why HDTVs tend to have more input delay than monitors: HDTVs have a lot of image processing going on in comparison. Upscaling, downscaling, AA, and AF all add some sort of input delay; whether it is seen and felt by everyone is debatable.

    Console games use AA too, so I wouldn't say they "don't bother" either.
     
  14. Mda400

    Mda400 Maha Guru

    Messages:
    1,089
    Likes Received:
    200
    GPU:
    4070Ti 3GHz/24GHz
    Yes, of course. HDTVs can be configured into a monitor as well, as I described in the first post. In this case, though, it is more of a graphics card delay. I believe what separates the delay of AA, upscaling, etc. from any other graphics setting is how they alter the resolution.

    One thing AA, scaling mode, and vsync have in common is that they all refresh the screen (it goes black and comes back on) when you switch them. This is where I get the notion that an unnatural process is being inserted into the image chain, creating unneeded input delay.

    AF, texture settings, shadows, effects, etc. don't do this, which leads me to believe those settings only cause delay if the graphics card is maxing out its resources.

    Consoles rarely ever use MSAA. It's mostly FXAA, which is a blur filter rather than a resampling technique.
     
    Last edited: Jan 12, 2016
  15. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,886
    Likes Received:
    1,015
    GPU:
    RTX 4090
    Rendering a frame is a process, and any process takes time, which is a delay. Unless you're not rendering anything at all, you will always have some kind of delay. The more time it takes to render a frame, the bigger the delay will be.

    This is, however, completely unrelated to the delays added by vsync / SLI / anything else that adds an extra X frames of latency on top of the rendering process itself. This is also what usually happens with TVs, as their image processing requires such a delay to do its work.

    Consoles usually run at 30 fps, meaning you have a 33ms delay by default. On top of that you should add, at the very least, the TV chain delay (usually another 10ms at minimum) and the gamepad delay (which can be pretty sizeable on modern wireless gamepads). All in all, the lag on consoles is usually higher than on a similarly specced PC.
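
    Summing that chain in a quick Python sketch (the gamepad figure is a hypothetical placeholder; the others are the ballpark numbers above):

        # Rough console input-to-photon budget along the lines described above.

        latency_chain_ms = {
            "wireless gamepad": 8,            # hypothetical; varies a lot by pad
            "30fps frame time": 1000 / 30,    # ~33.3 ms just to render one frame
            "TV processing":    10,           # "usually another 10ms at minimum"
        }

        total = sum(latency_chain_ms.values())
        for stage, ms in latency_chain_ms.items():
            print(f"{stage:>18}: {ms:5.1f} ms")
        print(f"{'total':>18}: {total:5.1f} ms before any vsync/buffering is added")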
     

  16. bishi

    bishi Master Guru

    Messages:
    575
    Likes Received:
    17
    GPU:
    GTX 1080 SLI
    This feeling of delay might be from microstutter introduced by the additional overhead of AA or DSR.
    Speculation, but even if you are hitting 60fps, there may be sub-frame stutters caused by the extra workload of bullet casings or an explosion that is only visible on screen momentarily.
     
  17. Mda400

    Mda400 Maha Guru

    Messages:
    1,089
    Likes Received:
    200
    GPU:
    4070Ti 3GHz/24GHz
    Again, my theory is that there is a minimum lag (just like the lag vsync adds) that AA (or any resolution-enhancing method) imposes on the GPU, and like I said in the first post, the GPU can be running full steam ahead at 60 fps, yet there is more lag with AA in use than without.

    So this is the enhancement/detail-faking I'm trying to describe in the GPU render time. Like another framebuffer the image is going through.
     
    Last edited: Jan 12, 2016
  18. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,886
    Likes Received:
    1,015
    GPU:
    RTX 4090
    You're right if you're rendering with vsync. Without vsync, your input registers even in frames which won't be shown, and thus the perceived lag will be less than with vsync. Anyone with a 60Hz display can try this in a game which is able to reach ~120 fps: with a good mouse and display you will feel the difference in input lag between vsynced and non-vsynced rendering, even though you'll see only 60 frames in both cases.

    There is no such thing as a "minimum lag". When you add AA, or anything else which impacts performance, you increase the time in which a frame is rendered, and as a result you increase the delay between your input and the presentation of the frame which registered that input. There's nothing more to it.
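
    A crude Python model of the 120fps-on-60Hz example (input sampled when a frame starts rendering, scanout every 16.7 ms; the numbers are illustrative, real pipelines have more buffering than this):

        refresh_ms = 1000 / 60

        # Vsynced at 60fps: each displayed frame started rendering one full
        # refresh before its scanout, so the input it contains is ~16.7 ms
        # old (more with double/triple buffering).
        vsync_age = refresh_ms

        # Uncapped at 120fps: the newest completed frame started, on average,
        # ~1.5 render intervals before scanout (render time plus half the
        # gap between frame completions).
        frame_ms = 1000 / 120
        uncapped_age = 1.5 * frame_ms

        print(f"vsynced  @ 60fps : input ~{vsync_age:.1f} ms old at scanout (or more)")
        print(f"uncapped @ 120fps: input ~{uncapped_age:.1f} ms old at scanout")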
     
    Last edited: Jan 12, 2016
  19. Mda400

    Mda400 Maha Guru

    Messages:
    1,089
    Likes Received:
    200
    GPU:
    4070Ti 3GHz/24GHz
    Just tested CS:GO capped at 60 fps in a bot match. Initially I had only enabled AA in-game, and when you asked about forcing it through the control panel, that was something I hadn't cross-referenced yet. So I went back and forth enabling/disabling it in the control panel, and I could still feel the lag...

    It wouldn't apply to ambient occlusion or filtering, as they don't mess with the organization of the resolution like AA, upscaling, or downscaling do. Vsync is in its own category, as it's another buffer.
     
    Last edited: Jan 12, 2016
  20. RealNC

    RealNC Ancient Guru

    Messages:
    4,953
    Likes Received:
    3,230
    GPU:
    4070 Ti Super
    Yep. MSAA is done as part of the rendering phase. There is no latency associated with it other than the increase in frame latency (as with any other setting that affects performance). In other words, the latency increase from MSAA has the same cause as the one you get from upping the resolution, increasing shadow quality, etc. It's just an increase in frame latency (aka "lower FPS").
     
