Anti-Aliasing: Surprised, but I shouldn't be.

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Mda400, Jan 11, 2016.

  1. Mda400

    Mda400 Maha Guru

    Messages:
    1,090
    Likes Received:
    201
    GPU:
    4070Ti 3GHz/24GHz
    That's what most would think, but again, I'm capped at 60 fps and not maxing out my GPU at all, and it still makes a difference for me. That's not what I just experienced going through CS:GO, Gears of War for Windows, and Doom 3: BFG Edition with AA enabled/disabled in-game or in the control panel just now. My on-screen display always sits between 59.9 and 60 fps. There is no "lower fps" for me in those games.

    I don't expect anyone to come back here and admit they actually felt the difference of disabling AA, upscaling, etc., but it would be cool if people could be that honest through text, you know? :)

    Yup, it's like polling rate on a mouse. 1000 Hz on a 60 Hz display means the cursor position is sampled 16.67 times per refresh to make sure that's where your mouse cursor was when you moved it in real life. 1000 fps on a 60 Hz display is the same deal.

    I don't do this because I'd rather not hear my GPU squeal and whine while playing, and I do care about how much energy I'm wasting on my electric bill :D Adaptive Vsync and G-Sync were the best things Nvidia has added to their drivers in a while.
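The polling-rate analogy above works out like this (a quick back-of-the-envelope sketch; the function name and numbers are mine, not from the thread):

```python
# How many mouse position samples land inside one display refresh,
# given a polling rate and a refresh rate (both in Hz).
def polls_per_refresh(polling_hz: float, refresh_hz: float) -> float:
    return polling_hz / refresh_hz

print(polls_per_refresh(1000, 60))   # ~16.67 samples per 60 Hz refresh
print(polls_per_refresh(1000, 144))  # ~6.94 samples per 144 Hz refresh
```

The same ratio applies to frames: 1000 fps on a 60 Hz display means many rendered frames per refresh, with only the newest one shown.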
     
    Last edited: Jan 12, 2016
  2. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,930
    Likes Received:
    1,044
    GPU:
    RTX 4090
    Even when you limit your maximum fps, there are still frame rendering times, which can be longer or shorter depending on how complex the rendering is. This might theoretically be perceived by a really good nervous system -) since you're still getting different delays between input and presentation. But I find it hard to believe, as we're obviously talking about times of less than 16 ms (that's 0.016 of a second). You'd have to be Batman to notice these differences.

    Did you try a different method of fps limiting, btw? It's quite possible that what you're seeing is actually microstutter introduced by the frame limiter, not lag.
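The point about render times under a cap can be sketched with made-up numbers (the frame times below are hypothetical, not measurements): even at a locked 60 fps, the delay between sampling input and finishing the frame equals the render time, which varies with scene complexity.

```python
# Hypothetical sketch: a 60 fps cap fixes the interval between presented
# frames, but input-to-frame-completion delay still equals the render time.
CAP_INTERVAL_MS = 1000 / 60  # ~16.67 ms between presented frames

for render_ms in (4.0, 9.0, 15.0):  # e.g. light scene vs. heavier settings
    idle_ms = CAP_INTERVAL_MS - render_ms  # the limiter sleeps out the rest
    print(f"render {render_ms:4.1f} ms -> frame done {render_ms:4.1f} ms after "
          f"input, limiter idles {idle_ms:5.2f} ms, pacing stays {CAP_INTERVAL_MS:.2f} ms")
```

So pacing can look perfectly flat on an overlay while the input-to-photon delay still shifts with rendering load.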
     
  3. Mda400

    Mda400 Maha Guru

    Messages:
    1,090
    Likes Received:
    201
    GPU:
    4070Ti 3GHz/24GHz
    I use the frametime indicator on my on-screen display. It usually only deviates -0.5/+0.5 ms, except for one game I have installed, Firefall, which deviates from 8 ms to 31 ms, but that game's a mess anyway...

    If there were frequent microstutters, I would give up caring about input lag in that particular game ;)

    But I use Adaptive Vsync, and yes, I know there's some lag that comes with it, but not as much as normal Vsync, given the way it works. Don't know if I mentioned that before.
     
    Last edited: Jan 12, 2016
  4. Mda400

    Mda400 Maha Guru

    Messages:
    1,090
    Likes Received:
    201
    GPU:
    4070Ti 3GHz/24GHz
    Because this isn't about Vsync. I already know what Vsync does, and Adaptive makes it less apparent: syncing at your monitor's native refresh rate causes the least amount of lag, and when you drop below it, Adaptive turns Vsync off, so there is no lag under your native refresh rate.

    I can stand the occasional tearing when I dip below my refresh rate (which is somewhat rare with my rig and display these days), but I can't stand the rapid tearing of faster frame rates (not to mention the GPU whining when the frequency of frames processed is high), and I'm willing to sacrifice a bit of response time for image quality.

    With AA it's vice versa: I can live without it if that reduces the noticeable delay for me while retaining a stable picture.

    As I've stated multiple times now, for me this makes a difference in input delay even with Adaptive Vsync enabled.
     
    Last edited: Jan 13, 2016

  5. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    This is a different issue though.
    The mouse behaves differently at different resolutions. With DSR they are trying to do something intermediate to fix it somewhat, with mixed results.
    The same thing doesn't happen with gamepads.
     
  6. kx11

    kx11 Ancient Guru

    Messages:
    4,841
    Likes Received:
    2,646
    GPU:
    RTX 4090
    The temporal AA found in RYSE + FO4 is far better than any AA I have seen,

    except SSAA, which makes the image quality superior but hits the fps hard no matter what rig you have.
     
  7. TheRyuu

    TheRyuu Guest

    Messages:
    105
    Likes Received:
    0
    GPU:
    EVGA GTX 1080
  8. nakquada

    nakquada Guest

    Messages:
    352
    Likes Received:
    0
    GPU:
    Gigabyte GTX 1080 FE
    I find that if I enable MSAA in most games, it feels horrifically floaty due to apparent input lag. Because of this I never use MSAA. Battlefield 4 is a perfect example: even at 60+ fps, with AA on it just feels awful compared to no AA.

    At the end of the day, I'm sure it's just an engine thing rather than a global issue.
     
  9. Shadowdane

    Shadowdane Maha Guru

    Messages:
    1,464
    Likes Received:
    91
    GPU:
    Nvidia RTX 4080 FE
    ANY graphics setting you change to improve the visuals adds latency. It doesn't matter if it's MSAA or shadow detail.

    For example, take a game you're rendering at 130 fps on High graphics settings. That comes out to 7.692 ms between frames to render 130 frames each second. So that gives your computer 7.692 ms to process everything: AI, physics, sound, video, etc.

    Now you increase your detail settings to Very High and the game runs at around 90 fps. That increases your frame time to 11.111 ms.

    So you just increased your input latency by about 3.4 ms simply by raising the graphics detail to Very High. Granted, MSAA or SSAA puts a significantly higher load on the GPU, so it will add even more latency.

    That said, if you use something like V-Sync or a frame limiter, that won't add any additional latency as long as your GPU can still render the frame within the time allocated by your V-Sync rate or the limited rate. Of course, at higher refresh rates, it just gives your GPU less time to do all the processing it needs to do. Why do you think almost all console games run at 30 Hz? It gives the console 33.3 ms per frame to get everything onto the screen.

    For example:
    30Hz = 33.333ms per frame
    60Hz = 16.667ms per frame
    75Hz = 13.333ms per frame
    85Hz = 11.764ms per frame
    100Hz = 10.000ms per frame
    120Hz = 8.333ms per frame
    144Hz = 6.944ms per frame
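The table above is just 1000 divided by the refresh rate, and the same arithmetic gives the ~3.4 ms figure from the 130-to-90 fps example (a quick check; nothing here beyond the numbers already in the post):

```python
# Reproduce the ms-per-frame table: frame time = 1000 ms / rate.
for hz in (30, 60, 75, 85, 100, 120, 144):
    print(f"{hz:3d} Hz = {1000 / hz:.3f} ms per frame")

# And the High -> Very High example from the post: 130 fps vs. 90 fps.
delta_ms = 1000 / 90 - 1000 / 130
print(f"Dropping from 130 to 90 fps adds {delta_ms:.3f} ms per frame")  # ~3.419 ms
```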
     
  10. quickkill2021

    quickkill2021 Guest

    Messages:
    131
    Likes Received:
    3
    GPU:
    1080ti sli Poseidon
    The solution here is to go to the graphics options and turn all settings to low.

    The less you render, the less lag there is! If you can turn down those options so all you see is a block, I am confident that all your lag will be gone :)
     

  11. Shadowdane

    Shadowdane Maha Guru

    Messages:
    1,464
    Likes Received:
    91
    GPU:
    Nvidia RTX 4080 FE

    Yup, just look at Quake 3 back in the day... I remember turning everything down, making all the textures a blurry mess, to make sure it never dropped below 120 fps!


    r_picmip 7, anyone?? Granted, this pic is someone doing it in Quake Live, but you get the idea!

    [IMG]
     
  12. pesticle

    pesticle Guest

    Messages:
    29
    Likes Received:
    0
    GPU:
    980GTX 4GB
    I cannot fathom how someone so sensitive to mouse input lag would have any form of Vsync on. :fuse:
     
  13. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    More likely a placebo effect, with other factors you aren't noticing.

    Unless you have objective testing backing you up, showing that MSAA, and MSAA alone, at the same exact framerates with no drops, is causing the issue.
     
  14. Mda400

    Mda400 Maha Guru

    Messages:
    1,090
    Likes Received:
    201
    GPU:
    4070Ti 3GHz/24GHz
    Because it's a visual annoyance without Vsync. The input delay for me with Max Pre-Rendered Frames at 1 (if you're not CPU limited) and Adaptive Vsync is hardly noticeable compared to enabling hardware AA. I'm OK if it tears under the refresh rate, as that's where input lag would occur with normal Vsync. But having it disabled and frame-limited (to prevent coil whine), then having to watch a steady tear line across my screen, isn't the most comfortable sight.
     
  15. pesticle

    pesticle Guest

    Messages:
    29
    Likes Received:
    0
    GPU:
    980GTX 4GB
    Ah, I see. Is that for any and all games you play? I always dealt with the tearing rather than that sludgy Vsync mouse feel, with some exceptions (some single-player games, games that favored a controller, or games that just seem to tear much more than others).
     

  16. quickkill2021

    quickkill2021 Guest

    Messages:
    131
    Likes Received:
    3
    GPU:
    1080ti sli Poseidon
    You are clearly a superhero. If there is ever a league of unfit superheroes, you need to join.

    You can say your superpower is having the uncanny sense to perceive input lag in video games down to billionths of a millisecond and beyond.

    From now on, the real question is: what's your superhero name? This is a big deal; we all just discovered a new superhero. You need to have a name.

    Maybe if you concentrate your powers, you can even perceive the lag of your own body movements: when the brain sends a command, you can see the delay. You definitely need to train more; who knows how much lag there is in the universe for you to notice.
     
    Last edited: Jan 21, 2016
  17. slayer213

    slayer213 Master Guru

    Messages:
    208
    Likes Received:
    0
    GPU:
    Zotac GTX 1070 Amp! 8 GB
    Ever since I started using SweetFX, I have completely abandoned AA and all its variants. Personally, it never made any difference to me while gaming, though of course theoretically it does. When you're immersed in the game, it hardly matters.

    On the other hand, the visual difference with SweetFX enabled is hugely apparent. Plus, I don't have to suffer the performance hit that comes with AA (like MSAA).
     
  18. Mda400

    Mda400 Maha Guru

    Messages:
    1,090
    Likes Received:
    201
    GPU:
    4070Ti 3GHz/24GHz
    Yes, all games. There's a point where I start getting energy-conscious about my PC, and that's running 300+ fps at 500+ watts for ~10 ms less latency. I try to minimize everything I can under the 60 fps limit (or 120/144 Hz if I had a higher-refresh display).

    I dunno, Lag Bolt? Strong as one, yet I can perceive lag as quickly as a bolt of lightning. :funny:

    While I don't use SweetFX (to me, it's basically a software enhancer that ruins the game's original look), I do want to use AA, just a type that's not resource-intensive. SMAA is something that should be included in all games (along with an FOV slider and mouse smoothing/acceleration toggles). It takes barely any GPU resources while providing a far less blurry image than FXAA and rivaling the accuracy of normal MSAA.
     
    Last edited: Jan 23, 2016
  19. GanjaStar

    GanjaStar Guest

    Messages:
    1,146
    Likes Received:
    2
    GPU:
    MSI 4G gtx970 1506/8000
    Yep, I didn't think it was the same thing, but yes, it is noticeable with a mouse but not a gamepad. Any sort of downsampling requires an increase in mouse DPI settings to feel as responsive as it does at native res.
     

Share This Page