NVIDIA: V-Sync OFF Not Recommended as a Global Setting Starting

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 12, 2021.

  1. Venix

    Venix Ancient Guru

    Messages:
    2,681
    Likes Received:
    1,391
    GPU:
    Palit 1060 6gb
@NiColaoS I think the 144 and 165 figures come from what the produced panels can actually do; since they can hit those numbers one way or another, manufacturers won't cap them and advertise them as 120 Hz :p . On a FreeSync monitor, the higher the ceiling the better! Besides, such monitors can run at lower refresh modes too.
     
  2. asturur

    asturur Maha Guru

    Messages:
    1,279
    Likes Received:
    460
    GPU:
    Geforce Gtx 1080TI
@NiColaoS Any 144 Hz monitor can be set to 120 Hz, 100 Hz, and sometimes 90 Hz, in addition to 75 Hz. So even without v-sync you can still gear the refresh rate to your machine.
I have an old i7 990X that couldn't sustain 144 fps in modern WoW, but it does 100 most of the time.
     
  3. asturur

    asturur Maha Guru

    Messages:
    1,279
    Likes Received:
    460
    GPU:
    Geforce Gtx 1080TI
Yes, but my main point is that people call it `input lag` when it's the whole experience that lags. What happens then is you browse forums full of people arguing:
`The VsYnC MakKESS my keYboArd go slower`.
     
  4. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,808
    Likes Received:
    3,370
    GPU:
    6900XT+AW@240Hz
Well, they kind of forgot to cover the interaction with adaptive sync here, because a typical screen nowadays has G-Sync/FreeSync,
and there people quite often disable v-sync. Then there is that option in Windows settings.

[image: upload_2021-2-12_13-17-2.png]
Will the above apply to those windowed applications too?
Will there be some kind of composition deadline in terms of milliseconds? (E.g., wait 1 ms after the main window finishes redrawing until the others have to be ready, or the image is presented to the driver/screen?)
I think the proper name is "motion-to-photon delay", and you are quite right in your depiction of it.
     
    Last edited: Feb 12, 2021

  5. ObscureangelPT

    ObscureangelPT Master Guru

    Messages:
    552
    Likes Received:
    66
    GPU:
    Zotac GTX 1650 Supe
Not 100% sure about that.
If that were the case, a per-game implementation of Nvidia Reflex wouldn't be needed; it could just be turned on via the driver anyway.
It can't be only triple buffering, as I have always noticed that triple buffering is somewhat inconsistent even with a framerate lock, and I don't get the same feeling with Nvidia Reflex + v-sync.
     
  6. Camaxide

    Camaxide Active Member

    Messages:
    87
    Likes Received:
    32
    GPU:
    MSI 1080 Ti Gaming X SLI
You are right, it's the delay of the image, not the input, that is truly delayed. However, hand-eye coordination suffers when the motion you make takes more time to display, which is why it's referred to as input lag. It doesn't help that the PC knows where your crosshair and your opponent currently are when you don't. The 'input lag' that is created makes it much harder to track a moving target, for two reasons: 1) the enemy is in reality no longer where you see him, since your monitor is already showing an old image that is not up to date; 2) when you flick the mouse, you will not see the motion until a few frames later. How are you then supposed to hit the fire button exactly when the crosshair meets the player? By the time you see them match, both the player and your crosshair have already moved on for a couple of frames. This is for gaming, but a snappy, accurate on-screen mouse update is really a benefit in almost all tasks; the punishment for high delay is just not as severe in most Windows tasks.
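The delay described above can be put into rough numbers. A minimal back-of-the-envelope sketch, assuming a simple model where each frame queued ahead of the display costs one refresh interval, plus up to one interval waiting for the next vblank (all figures illustrative, not measured):

```python
def motion_to_photon_ms(refresh_hz, frames_queued):
    """Worst-case motion-to-photon estimate: each queued frame adds one
    refresh interval, plus one interval waiting for the next vblank."""
    frame_ms = 1000.0 / refresh_hz
    return frame_ms * (frames_queued + 1)

# 60 Hz display, v-sync on with a 2-deep frame queue:
print(motion_to_photon_ms(60, 2))               # -> 50.0 ms
# Same queue depth on a 144 Hz display:
print(round(motion_to_photon_ms(144, 2), 1))    # -> 20.8 ms
```

This is why the effect is so much more noticeable on 60 Hz panels: the same queue depth costs tens of milliseconds more per flick.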
     
  7. asturur

    asturur Maha Guru

    Messages:
    1,279
    Likes Received:
    460
    GPU:
    Geforce Gtx 1080TI
I still cannot explain how 1 frame can make that difference. With v-sync off, all you gain is not waiting for the next refresh, possibly drawing halves (or more pieces) of frames on the screen. Or is there a wait in v-sync that I do not understand?
     
  8. Astyanax

    Astyanax Ancient Guru

    Messages:
    14,450
    Likes Received:
    5,870
    GPU:
    GTX 1080ti
    depending on the pressure from the swap chain it can feel like 2 frames.
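The queue-depth effect can be seen in a toy model. A minimal sketch, assuming a v-synced swap chain where the renderer enqueues frames until it blocks and the display dequeues one per refresh (the function name and queue depths are illustrative):

```python
from collections import deque

def frame_age_at_steady_state(queue_depth, refreshes):
    """Simulate a v-synced swap chain: the renderer fills the queue until
    it blocks, the display scans out the oldest frame each refresh.
    Returns how many newer frames already exist when a frame is shown."""
    queue = deque()
    frame_id = 0
    shown_age = 0
    for _ in range(refreshes):
        # Renderer fills the queue until v-sync blocks it.
        while len(queue) < queue_depth:
            queue.append(frame_id)
            frame_id += 1
        shown = queue.popleft()            # display presents the oldest frame
        shown_age = frame_id - 1 - shown   # newer frames rendered but unseen
    return shown_age

print(frame_age_at_steady_state(2, 10))  # -> 1 (shown frame is 1 frame old)
print(frame_age_at_steady_state(3, 10))  # -> 2 (feels like "2 frames" behind)
```

The deeper the queue the GPU is allowed to run ahead into, the older the frame on screen is relative to the newest input the game has processed.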
     
  9. asturur

    asturur Maha Guru

    Messages:
    1,279
    Likes Received:
    460
    GPU:
    Geforce Gtx 1080TI
I think Fast Sync has nothing to do with triple buffering; it just keeps updating the back framebuffer unthrottled, and then at the right moment puts the newest one in the front buffer. Nvidia describes it as acting like v-sync off internally.
     
    JonasBeckman likes this.
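That presentation scheme can be sketched as a toy model: rendering runs unthrottled, and at each vblank only the newest fully rendered frame is flipped to the front buffer, with older back buffers discarded. The function and the 4 ms render time are illustrative assumptions, not Nvidia's actual implementation:

```python
def fast_sync_presented(render_finish_ms, refresh_hz, refreshes):
    """For each vblank, pick the newest frame whose rendering finished
    in time; earlier completed frames are simply dropped (no tearing,
    no throttling of the renderer)."""
    frame_ms = 1000.0 / refresh_hz
    presented = []
    for i in range(1, refreshes + 1):
        vblank = i * frame_ms
        done = [f for f, t in enumerate(render_finish_ms) if t <= vblank]
        presented.append(done[-1] if done else None)
    return presented

# Renderer finishes a frame every 4 ms (250 fps) on a 60 Hz display:
finish = [4.0 * (n + 1) for n in range(20)]
print(fast_sync_presented(finish, 60, 3))  # -> [3, 7, 11]
```

Note how frames 0–2, 4–6, and 8–10 are rendered but never shown: that dropped work is the difference from classic triple buffering, which queues frames instead of discarding them.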
  10. ObscureangelPT

    ObscureangelPT Master Guru

    Messages:
    552
    Likes Received:
    66
    GPU:
    Zotac GTX 1650 Supe
@asturur If that were the case, GPU usage would stay at 99%, giving everything it's got, but that's not what happens.
GPU usage drops to 60–70%, since that's all it needs to reach the framerate threshold.

Although Warzone also features two types of Reflex: On (the one that I use), and a higher mode which claims what you say.
     

  11. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,562
    Likes Received:
    2,958
    GPU:
    MSI 6800 "Ref"
They could maybe have the global setting apply to OpenGL here, the way AMD does, while keeping the override itself as a per-profile thing under advanced settings, should users want to set the state to disabled regardless of compatibility concerns.
That way it can't be left off globally when some software is entirely incompatible with immediate presentation mode, while keeping the option for those who want this setting on a per-profile basis.

Internal profiles would work too, I suppose; with a bit more work they could outright deny changes for software where the breakage is bad enough to require it. But a recommendation is far less work and testing while serving the same idea: reminding users to be mindful about setting it off globally. :)
     
  12. WalterDasTrevas

    WalterDasTrevas Master Guru

    Messages:
    220
    Likes Received:
    95
    GPU:
    Gigabyte GTX970 oc
It makes a difference, and the difference is obvious on less powerful GPUs. I was never able to get used to the delay in CS:GO when v-sync is on.
     
  13. Astyanax

    Astyanax Ancient Guru

    Messages:
    14,450
    Likes Received:
    5,870
    GPU:
    GTX 1080ti
That'd work; there's plenty of settings that don't do crap globally but do from the profile.
     
  14. Netherwind

    Netherwind Ancient Guru

    Messages:
    8,268
    Likes Received:
    1,944
    GPU:
    Gigabyte 4090 GOC
    For us G-sync users - is V-sync on in NVCPL and V-sync off in games still the best option?
     
    Solfaur likes this.
  15. Astyanax

    Astyanax Ancient Guru

    Messages:
    14,450
    Likes Received:
    5,870
    GPU:
    GTX 1080ti
    Yes.

The issue is specifically about applications that expect their v-sync-on state to be respected.
     
    Solfaur and Netherwind like this.

  16. BlindBison

    BlindBison Ancient Guru

    Messages:
    1,602
    Likes Received:
    594
    GPU:
    RTX 3070
    Do you know of some example games like that which "break" if you force the control panel's v-sync rather than using their in-game options?

    The primary reason I typically used the control panel V-Sync rather than in-game (before I had G-Sync) is because then you know precisely what "type" of V-Sync you're using (traditional double buffered / linear triple buffering / fast sync / Adaptive V-Sync).

Absolutely makes me irate when games don't even tell you what form of V-Sync they use when there's only one single option present. They also never communicate whether enabling the option did more than enable v-sync, such as automatically introducing a framerate limit in-game (I've read some games do this, but I haven't found any that do personally -- barring games which just limit their framerate internally to 60 for physics reasons). Because of this I always forced my preferred method via the control panel, then set FPS limits manually with RTSS or with in-engine limiters, as in the case of Overwatch for example, since its limiter has low input delay.

In general I really wish game settings were more verbose -- explaining exactly what type of v-sync the in-game setting uses (and any other changes/optimizations it "may" make). More games should also do what Gears 5 does, explaining how much impact various settings have on the CPU/GPU (so, in Gears 5 they'll say "X setting has a low CPU impact and high GPU impact", for example).
     
  17. Astyanax

    Astyanax Ancient Guru

    Messages:
    14,450
    Likes Received:
    5,870
    GPU:
    GTX 1080ti
Vsync-off issues range from input not working properly, to visual glitches and engine misbehaviors (though these are mostly the framerate exceeding engine tolerances, and have long been resolved with third-party frame limiters).

The primary issue Nvidia is addressing here is when global v-sync is off in applications that have accelerated interfaces but draw them separately rather than updating them as one whole surface. This issue is worsened now that applications draw into their own layers without relying on DWM at all (at the discretion of the OS), so you can get flickering in Application X while scrolling in Application Y.


Now, off the top of my head, games that are (or previously were) broken with v-sync off:

<Insert Gamebryo based title prior to frame limiters> - physics goes bananas
Divinity: Original Sin (since fixed) - crash
Homeworld (eventually fixed by frame limiters) - at 1000 fps, mouse input breaks
WarFrame (since fixed) - crash

I'm sure given enough games to test, and the time to test them, I could find more.
     
    BlindBison likes this.
  18. waltc3

    waltc3 Maha Guru

    Messages:
    1,427
    Likes Received:
    545
    GPU:
    AMD 50th Ann 5700XT
Interesting... my 5700XT/BenQ EW-3270 60 Hz shows no visible tearing in ~99% of my games -- I've left the global setting in my Adrenalin drivers set to v-sync off -- in both full screen and windowed borderless. No problems. I have maybe one game that looks and runs a bit better with v-sync on: Grim Dawn and its expansions. And that game doesn't page-tear -- it just stutters a bit unless I enable in-game v-sync, for some reason. As 60 Hz monitors are far more common than higher-Hz monitors, turning v-sync off is the only way to get above 60 fps. I have some older games and benchmarks that run at hundreds of frames per second with no visible tearing. I well remember what tearing looks like from years ago, but with the last two 4K monitors I've owned, v-sync off has not been a page-tearing issue. I chalked it up to the anti-flicker circuitry in both monitors. I'm now running an advanced beta version of Win10 and that hasn't changed.

This is also interesting because back in the days when I owned a TNT1 and a TNT2, and V2 SLI, a V5 5.5K, and a V3, many years ago, the Nvidia TNTs both had major problems with the v-sync off condition -- the 3dfx GPUs did not, so the difference was easy to see.
     
  19. smashmambo

    smashmambo Active Member

    Messages:
    52
    Likes Received:
    14
    GPU:
    GTX 970 4GB
    Thinking about it now I don't think I've ever forced anything globally in the CP. If there is ever anything that needs a change I just make / select a profile for it in CP or use Nvidia Profile Inspector. Simple.
     
  20. Mineria

    Mineria Ancient Guru

    Messages:
    5,505
    Likes Received:
    681
    GPU:
    Asus RTX 3080 Ti
Not quite sure what you're even trying to state.
With Off removed, you can globally go On, Fast, or let the 3D application decide; you can just pick the last option and control it from profiles or games. There's no reason to have an option that forces it off for everything.
Besides, you also have options that counter the input lag introduced by v-sync; they work quite well with most properly coded engines.
     
