Nvidia should implement their own version of Intel's "Smooth Sync"

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by BlindBison, Nov 29, 2022.

  1. BlindBison

    BlindBison Ancient Guru

    Messages:
    1,787
    Likes Received:
    718
    GPU:
    RTX 3070
    Though I haven't yet found any of the big tech YouTubers discussing or testing this feature, on paper it seems like an elegant solution -- or at least a piece of an elegant solution -- for the V-Sync problem on traditional refresh panels. From what I gather the idea is to let the frames tear (i.e. V-Sync OFF behavior) but then use dithering, or some other blend, along the tear line so the tear is less noticeable. My immediate thought is that in conjunction with other techniques this could be very useful. For example, suppose we do the following:

    1) Double buffered Adaptive V-Sync (where V-Sync acts like traditional double buffering until the framerate drops below refresh, at which point V-Sync is dropped and frames are allowed to tear).

    2) Instead of just letting frames tear the way Nvidia's current Adaptive V-Sync does, it could switch over to an Intel Smooth Sync style blend so the tearing is less noticeable -- once the framerate recovers back to the target refresh, traditional double buffering is re-enabled.

    3) We could combine this with Xbox's "improved" adaptive v-sync, which confines the tearing to the top or bottom of the screen only (no tears in the middle, so the tearing is less noticeable). I'm not sure why Nvidia's Adaptive V-Sync can't work like the Xbox version, since to my eye the Xbox version is noticeably better -- you don't get tears in the center of the screen.

    4) This one's not so big a deal I suppose, since users can simply test and set this value themselves via RTSS (as a side note, it would be nice if the Nvidia driver limiter had 2 decimal places of precision). But my thought is that the final step could be for this improved driver-level v-sync option to automatically enable the Nvidia driver FPS limiter so as to reduce input lag while v-sync is engaged. As per the BlurBusters low-lag v-sync guide, capping the framerate in conjunction with v-sync lowers input lag (I've tested this personally and it works -- I assume because we're filling the backbuffer "just in time" instead of filling it immediately and letting it sit until the buffer flip, but I'm just speculating there). Of course, with traditional v-sync, if this cap is too low you'll get stutters creeping in, and if it's too high you don't get all of the input lag reduction. In the case of adaptive v-sync the cap is also a bit wonky since it has to be set rather high or v-sync is dropped (e.g. in my local tests with Nvidia's adaptive half refresh rate v-sync I had to set the RTSS framerate limit to more like 30.5 FPS or tears would creep in and V-Sync would drop). I'm suggesting that when this theoretical driver-level v-sync is switched on, a driver-level FPS limit would automatically be set at an appropriate/safe value for some delay reduction -- some rough numbers on why the exact cap matters are sketched just below. While we're at it, perhaps ULLM/Reflex could be enabled by default for this form of V-Sync, since input lag is one of the primary criticisms levied against V-Sync solutions.
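    As a rough illustration of that last point (plain frame-time arithmetic, nothing Nvidia-specific; the 30.5 figure is just what my own RTSS test above landed on):

    REFRESH_HZ = 60.0
    TARGET_FPS = REFRESH_HZ / 2            # half refresh rate v-sync target: 30 fps
    BUDGET_MS = 1000.0 / TARGET_FPS        # ~33.333 ms between displayed frames

    for cap in (30.00, 30.50, 31.00):
        frame_time_ms = 1000.0 / cap
        slack_ms = BUDGET_MS - frame_time_ms   # headroom left before the next buffer flip
        print(f"cap {cap:5.2f} fps -> {frame_time_ms:6.3f} ms/frame, "
              f"{slack_ms:+6.3f} ms of slack vs the 30 Hz budget")

    A cap a hair above the target leaves a little slack every frame, so a finished frame is always waiting for the flip (adaptive v-sync stays engaged) while still cutting most of the time it would otherwise sit in the backbuffer -- at least that's my read on why the ~30.5 value worked for me.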

    All that to say, for an "idealized" traditional v-sync solution to work to its fullest it would need all of these parts combined together, it seems to me. Of course "true" adaptive sync (e.g. G-Sync/FreeSync) will always be superior, but many users -- probably most -- are using traditional 60 Hz panels, so if a driver-level v-sync toggle (one for half refresh rate should also be provided) could be made to work like this it would be miles ahead of any other existing solution. Most users could simply toggle it ON per game profile and it would be more or less the best we can do with a traditional panel to my knowledge. Even as a G-Sync user myself I'd love to have a driver-level V-Sync option like this for my laptop on the go. Seems like a dream come true on paper.
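    To make the blending idea a bit more concrete, here is a minimal sketch of what "blend the two frames along the tear line" could look like in principle. This is purely my own illustration (Python/numpy), not Intel's actual implementation, and the band width and dither pattern are made-up parameters:

    import numpy as np

    def blend_tear(old_frame, new_frame, tear_row, band=48):
        """Composite a torn present: old frame above tear_row, new frame below,
        with a dithered crossfade over `band` rows instead of a hard cut.
        Frames are (H, W, 3) uint8 arrays. Purely illustrative."""
        h, w, _ = old_frame.shape
        out = np.where(
            (np.arange(h) < tear_row)[:, None, None], old_frame, new_frame
        ).astype(np.float32)

        top = max(tear_row - band // 2, 0)
        bot = min(tear_row + band // 2, h)
        # Per-row weight ramps from 0 (all old) to 1 (all new) across the band, with a
        # simple 2x2 ordered-dither offset so the transition isn't a clean gradient.
        ramp = np.linspace(0.0, 1.0, bot - top)[:, None]
        bayer = np.array([[0.0, 0.5], [0.75, 0.25]])
        dither = np.tile(bayer, ((bot - top + 1) // 2, (w + 1) // 2))[: bot - top, :w]
        weight = np.clip(ramp + (dither - 0.5) * 0.5, 0.0, 1.0)[:, :, None]

        band_old = old_frame[top:bot].astype(np.float32)
        band_new = new_frame[top:bot].astype(np.float32)
        out[top:bot] = band_old * (1.0 - weight) + band_new * weight
        return out.astype(np.uint8)

    The real thing presumably happens at scanout time in the display pipeline rather than as a post-process, but the visual result -- a fuzzy, dithered seam instead of a hard line -- should be roughly what the marketing material shows.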
     
    Last edited: Dec 2, 2022
    EDK-Rise likes this.
  2. Astyanax

    Astyanax Ancient Guru

    Messages:
    14,948
    Likes Received:
    6,114
    GPU:
    GTX 1080ti
    "Smooth Sync is Intel's in-house display refresh-rate technology alongside support for VESA Adaptive Sync"

    so it's fast sync set as the v-sync mechanism, with G-Sync enabled.
     
  3. BlindBison

    BlindBison Ancient Guru

    Messages:
    1,787
    Likes Received:
    718
    GPU:
    RTX 3070
    @Astyanax Apologies if I'm mistaken, but I saw a few articles suggesting that it was a technology intended for fixed refresh rate displays and not an adaptive sync technology:

    https://gpuinsiders.com/intel-smooth-sync/
    > "A new dithering filter introduced by Intel is Smooth Sync, which helps to minimize screen tearing on displays without Adaptive-Sync or V-Sync active"

    My understanding is that the idea is simply to let the game tear (V-Sync OFF) and then dither/blend the two frames together along the tear line so the tear is less distracting. That would make the tech quite different from "Fast Sync", which just renders unconstrained and then, when it's time for the fixed refresh display to refresh, presents the most recent completed frame and tosses out the intermediary ones (leading to the weird jittery look associated with Fast Sync).
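    Put in rough pseudo-code terms (my own simplification, not Nvidia's actual logic), Fast Sync's per-refresh frame selection is basically this:

    from collections import deque

    def fast_sync_present(completed_frames: deque):
        """At each v-blank, show the newest completed frame and silently drop the
        older, never-shown ones -- the LIFO behaviour described above. Returns the
        frame to scan out plus whatever got discarded. Purely illustrative."""
        if not completed_frames:
            return None, []                  # nothing new finished: repeat the last frame
        newest = completed_frames.pop()      # most recently completed render wins
        dropped = list(completed_frames)     # everything older is thrown away unseen
        completed_frames.clear()
        return newest, dropped

    The uneven time gaps between the frames that survive that selection are what produce the jitter, whereas the Smooth Sync approach shows both frames and only tries to disguise the seam between them.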
     
    Chade, Cave Waverider and EDK-Rise like this.
  4. oneoulker

    oneoulker Active Member

    Messages:
    87
    Likes Received:
    68
    GPU:
    NVIDIA RTX 3070
    I really wonder what kind of v-sync some PS4 games employ. I'm literally cross-comparing Miles Morales on PC and PS4 back to back right now -- I have both devices -- and I'm trying to mimic/achieve the 30 FPS smoothness of the PS4. I've tried countless things and somehow I can't achieve it. I literally get mad over it.

    same screen, same refresh rate (60 hz)... similar framerates.

    I've tried:

    just half-refresh v-sync = huge input lag that DEFINITELY does not happen on PS4
    half-refresh v-sync + 29.99 fps limit (the BlurBusters way) = it reduces input lag, but there's still something off about how the camera movement starts; it's still a bit more laggy and overall the screen rotation is not as smooth
    half-refresh v-sync + 29 fps NVCP limit = this one looked a bit better but constantly stuttered
    half-refresh v-sync + 30 fps NVCP limit = same as half-refresh v-sync alone


    I've made sure I was getting 30 FPS 1% and 0.1% lows at all times -- I checked them. It wasn't a frametime issue either.

    It's not about the visual smoothness, you see. Yes, 30 FPS is 30 FPS and it still blurs everything once you move the camera, but somehow the PS4's input was smoother. It is not placebo, it is not a wrong assumption. I literally have 2 DualShocks at my disposal (one came with the PS4 I purchased, one I had before, personally bought). I've literally matched everything possible, yet I still can't achieve the PS4's input delay.

    The ONLY WAY to achieve the PS4's input delay is to... TURN OFF v-sync completely and simply lock it to 30. THAT's it. Then it somehow matches how the PS4 behaves in terms of input delay / input lag / input start. But then it tears all over. VRR + 30 fps also works. But I CANNOT REPLICATE what the PS4 is managing on a FIXED REFRESH.

    I've looked closely and I cannot see any tears in the PS4 version.

    So... WHAT THE HECK is going on? How can they employ such a low-input-lag v-sync implementation?
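    For reference, the "v-sync off + lock to 30" setup that does feel right is essentially just this pacing loop -- a bare-bones sketch, not what the PS4 or any particular limiter actually does:

    import time

    def run_at_30(poll_input, render_frame):
        """Minimal external 30 fps pacer with v-sync off: sample input, render, then
        sleep off the rest of the ~33.3 ms slot. The finished frame is shown as soon
        as it's ready (tearing included), which keeps input-to-photon delay short."""
        frame_budget = 1.0 / 30.0
        next_deadline = time.perf_counter() + frame_budget
        while True:
            poll_input()                 # sample input right before simulating/rendering
            render_frame()
            remaining = next_deadline - time.perf_counter()
            if remaining > 0:
                time.sleep(remaining)
                next_deadline += frame_budget
            else:
                # overran the slot (dropped below 30 fps): resync instead of piling up debt
                next_deadline = time.perf_counter() + frame_budget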
     
    BlindBison likes this.

  5. BlindBison

    BlindBison Ancient Guru

    Messages:
    1,787
    Likes Received:
    718
    GPU:
    RTX 3070
    I'm glad you brought this up because I've observed the same behavior. I have some ideas about this (I'm speculating so could be wrong of course) -- the first thing I've noticed is that on my 144 Hz 1 ms response G-Sync 1440p panel, if I enable G-Sync + Driver level V-Sync then set a driver level or RTSS level framerate cap to 30 fps, it does not appear to look that smooth compared to some console 30 fps games. I don't know why this is.

    However, on my laptop which is a fixed refresh LCD display, I can use half-refresh rate v-sync via Nvidia inspector and, in conjunction with motion blur, it actually looks half decent/pretty smooth (certainly smoother than the G-Sync 30 though input lag is higher). Still not quite as smooth looking as the console 30, but I wonder if that has something to do with their using a longer motion blur shutter speed, I'm unsure. I have heard John from Digital Foundry mention that some flat panel display technologies look blurrier vs sharper in motion so maybe the display type itself has something to do with the perceptual smoothness of 30 fps, I'm really unsure and can only speculate as to why I see the difference in smoothness between G-Sync 30, my laptop half refresh rate v-sync'd to 30, and console 30 (in games that do it "right" I mean, not your Elden Rings, etc).

    I do know that Playstation games typically employ a kind of triple buffered V-Sync (it seems to be the "first in first out/smooth" kind, not the LIFO fast sync kind) whereas Xbox typically employs the "improved" adaptive v-sync I mention in my initial post here where it drops v-sync when the framerate dips, but only shows tearing in the top or bottom of the screen not the center (this is also something DF has discussed for games like CoD for example).
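    My guess at how the Xbox-style version works (based only on the DF coverage, so treat this as an assumption) is a flip policy along these lines: only allow an immediate, tearing flip while scanout is in the top or bottom band of the screen, otherwise hold the flip.

    def may_flip_now(scanline, screen_height, band_fraction=0.15):
        """Allow an immediate (tearing) flip only while scanout is inside the top or
        bottom band of the screen, so any visible tear stays out of the centre.
        band_fraction is an assumed tuning value, not a known constant."""
        band = int(screen_height * band_fraction)
        return scanline < band or scanline > screen_height - band

    def present(frame_ready, in_vblank, scanline, screen_height):
        """Flip at v-blank as usual; when a frame is late, tear only inside the allowed
        bands, otherwise keep showing the previous frame until the next opportunity."""
        if not frame_ready:
            return "repeat previous frame"
        if in_vblank or may_flip_now(scanline, screen_height):
            return "flip now"
        return "hold the flip until scanout leaves the centre"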

    As for your notes on the BlurBusters low-lag guide, you definitely have to be careful about capping too low, it seems to me -- these days I even cap a hair over what the Hz testers recommend, since I don't like getting v-sync stutter when the framerate is too low. For example, Nvidia's half refresh rate adaptive v-sync would start tearing if I set the RTSS framerate limit lower than about 30.6 fps, so I just left it there; it was still noticeably less laggy than no cap, and I never saw stuttering in my tests. If I were to wager a guess as to what console games are doing, I'd say they're using a low CPU pre-render-ahead value in conjunction with a "proper" in-engine FPS limiter (for example Battlefield V and Overwatch have limiters which actually reduce input delay beyond what Nvidia's or RTSS's can), then combining that with proper half refresh rate v-sync and motion blur with a long shutter speed. The other thing it could be is a difference in the way input is read -- I recall the Uncharted 2 PS3 dev blog (or something like it) mentioning how the whole game was built from the ground up to output 30 unique frames every cycle, and some games seem to end up with weird camera animation lurches due to how they read input (of course a lot of this is speculation on my part; I'd like to know more about the inner workings of their games too).

    If anyone else has insight into why G-Sync 30, half-refresh rate v-sync 30, and console 30 can end up looking quite a lot different in terms of smoothness and input lag I'm all ears. In any case on PC, it would be amazing to have the "theoretical" improved adaptive sync + Intel smooth sync to dither tears + a proper fps cap for the V-Sync mode I describe in my original post. Would be pretty dang awesome if Nvidia spent real time implementing something like that seems to me -- after all, most people don't have a G-Sync/Freesync display.
     
    Last edited: Nov 29, 2022
  6. BlindBison

    BlindBison Ancient Guru

    Messages:
    1,787
    Likes Received:
    718
    GPU:
    RTX 3070

    The quality after YT compression is lower than I'd like, but there don't seem to be too many videos talking about this yet. In any case the tears seem a bit tougher to pick out on the left to my eye whereas on the right you can clearly see the hard tear line at times. Neither looks quite right, but assuming the effect is basically "free" in performance terms it seems to me like it's worth having -- especially if it's combined with other techniques like I described in the original post (e.g. Port whatever Xbox's adaptive v-sync solution is where tears are kept in the top and bottom of the screen only then do an Intel style "smooth sync" dither/blend along the tear so it's less noticeable).
     
    JiveTurkey likes this.
  7. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,400
    Likes Received:
    656
    GPU:
    RTX 4090
    I mean, they could, but all it does is hide the tear line somewhat with very simple dithering.
    Seems like a band-aid that doesn't really solve much -- chances are that those who are susceptible to seeing tears will see these dithered tears too.
    Also it's kind of a feature that's late to the party in the age of VRR displays.
     
  8. BlindBison

    BlindBison Ancient Guru

    Messages:
    1,787
    Likes Received:
    718
    GPU:
    RTX 3070
    I see where you're coming from -- of course "true" VRR is the clean winner here -- but the main reason I think this feature is clever and would be useful is that most people do not have VRR / G-Sync / FreeSync displays, and that likely won't change for many years, at least for most users. I get the impression Guru3D skews more towards upper midrange to high end systems, so it's probably less relevant to us specifically.

    I agree that you can definitely still notice the dithered/blended tear, but I do think it's preferable to a "hard" tear line. If you combined this technique with Xbox's adaptive v-sync (the recent CoD uses this on Xbox for example), which only shows tearing in the top and bottom of the screen (not the center, so it's less noticeable/distracting), I think it would pretty much be the best v-sync option to date for traditional refresh panels. Adaptive V-Sync is double buffered as well, which means one frame less input lag compared to FIFO triple buffering, and since it allows frames to tear below refresh it also looks less "juddery" during framedrops. A solution combining these things would be very useful for traditional fixed refresh panel users.
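    To put a number on the "one frame less input lag" part, the added delay is just the refresh period per extra queued frame -- rough arithmetic only, since real-world lag also depends on the engine's own queue depth:

    for hz in (60, 120, 144):
        print(f"{hz:3d} Hz: one additional buffered frame adds ~{1000.0 / hz:.1f} ms")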
     
    Last edited: Nov 30, 2022
  9. RealNC

    RealNC Ancient Guru

    Messages:
    3,920
    Likes Received:
    2,112
    GPU:
    EVGA GTX 980 Ti FTW
    This seems to be targeting the v-sync off and VRR off crowd (like Counter-Strike players). Most of them prefer tearing and never use VRR. I don't think @BlindBison is the target audience for this :p
     
    BlindBison likes this.
  10. BlindBison

    BlindBison Ancient Guru

    Messages:
    1,787
    Likes Received:
    718
    GPU:
    RTX 3070
    One of my buddies plays CSGO in 4:3 720p-ish upscaled lol -- I don't get it, but he's happy :p

    For myself I have a G-Sync panel so this wouldn't matter much for me, but if a technique like this were combined with the Xbox adaptive v-sync (double buffered v-sync til the framerate dips then it switches over to top and bottom screen tearing only -- my idea is to also combine that with Intel's solution so that those tears are also dithered with the smooth sync effect) I think it would have a lot of value for console players and/or fixed refresh rate users. I also own a fixed refresh rate laptop where something like this could come in handy.

    True VRR is the real solution and all that, but I'm the only person I know in my work/friend group with a VRR display, so it seems like this could be useful to a lot of people. I think it would be worth the effort for Nvidia to implement something like it and/or upgrade their existing Adaptive V-Sync option (for example, upgrade adaptive v-sync to match the Xbox's superior version, then offer smooth sync as an adjacent setting that could be enabled at the same time if the user wanted -- something in that vein).
     
    Last edited: Nov 30, 2022

  11. BlindBison

    BlindBison Ancient Guru

    Messages:
    1,787
    Likes Received:
    718
    GPU:
    RTX 3070


    Sort of tangential to this topic, but some of Intel's GPUs seem pretty good in terms of value in conjunction with DXVK Async. I'm happy to see another challenger since Nvidia's prices have gotten wild.
     
    endbase likes this.
  12. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    2,383
    Likes Received:
    858
    GPU:
    3060 TUF
    Yes, though Intel's driver quality is apparently still hit & miss, even with API wrappers. There hasn't been a driver release yet that brought a major improvement; instead they improve minor things and regress others at the same time. That really hinders my optimism towards their future cards.
     
    BlindBison likes this.
  13. BlindBison

    BlindBison Ancient Guru

    Messages:
    1,787
    Likes Received:
    718
    GPU:
    RTX 3070
    Sadly you're right, going off what I've read and the tests I've seen. My "hope" is that they keep at it and maybe within a few generations they'll be more competitive. I hope they don't give up on it -- I know the whole "smooth sync" thing doesn't much impress many people, but to me it seems like a creative idea (especially if it were used as a "bolt-on" to Adaptive V-Sync). It kind of reminds me of when AMD announced Anti-Lag/CAS, which prompted Nvidia to implement their own solutions -- I'm not sure Nvidia would've bothered otherwise (though Nvidia's sharpening seems just flat out worse than AMD's CAS, for what it's worth).
     
  14. BlindBison

    BlindBison Ancient Guru

    Messages:
    1,787
    Likes Received:
    718
    GPU:
    RTX 3070


    ^11:22 timestamp -- turns out Intel already combines the "Smooth Sync" approach with adaptive v-sync, which drops v-sync when the framerate dips below the monitor's refresh target. Rich demonstrates this in action at that timestamp.

    Sadly I don't think their solution does the Xbox CoD / Crysis thing where tears only show up in the top and bottom of the screen, but it's still an improvement over what Nvidia's Adaptive V-Sync can offer. I get that Nvidia doesn't care because "G-Sync is the future" and all, but again -- the vast majority of people do not have FreeSync/G-Sync panels, so improved v-sync options would help them a lot (as would people who have a non-VRR laptop alongside their VRR desktop, etc.).
     
    Last edited: Dec 12, 2022
    TheDigitalJedi and endbase like this.
  15. P_G19

    P_G19 Member

    Messages:
    26
    Likes Received:
    3
    GPU:
    GTX 1660 / 6GB
    Nvidia probably won't ever do that.
    I hope I'm wrong though, as it would benefit my 60 Hz monitors.
     
    BlindBison likes this.

  16. BlindBison

    BlindBison Ancient Guru

    Messages:
    1,787
    Likes Received:
    718
    GPU:
    RTX 3070
    I would be surprised if they did yeah — but it would be a genuinely great feature for anyone using traditional refresh panels so that’s a shame eh?
     
  17. BmB23

    BmB23 Active Member

    Messages:
    70
    Likes Received:
    23
    GPU:
    GTX 1660 6GB
    "Adaptive v-sync" was the widely recognized term for fixed refresh v-sync where v-sync is adaptively disabled when the framerate drops below the refresh rate, before the name was hijacked as a generic, non-branded term for FreeSync-style VRR.

    I would rather have SILK back anyway; that worked wonders when I was playing Vermintide.
     
    enkoo1 and BlindBison like this.
  18. BlindBison

    BlindBison Ancient Guru

    Messages:
    1,787
    Likes Received:
    718
    GPU:
    RTX 3070
    I'm not familiar with SILK -- will have to search around about that huh
     
