True Double Buffering

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by BmB23, Jun 7, 2022.

  1. BmB23

    BmB23 Active Member

    Messages:
    53
    Likes Received:
    15
    GPU:
    GTX 1660 6GB
For a long while now, I have been unable to get true double buffering to work with Windows 10/Nvidia drivers. I don't know exactly what caused it to stop working, but one day it did and I've never been able to get it back. Either Windows or the driver forces some kind of artificial triple buffering on the game. That may be desirable if you want triple buffering in games that don't otherwise support it, but it may not be if you can maintain a high enough framerate.

    In my experience vsync is the only truly smooth way to play a game.
    • Vsync off: juddering, tearing
    • Triple buffering (opengl): some juddering, smooth only at monitor cap
    • Triple buffering (directx): same as opengl but horrendous input lag
    • "Classic" borderless window: minimal juddering, but never smooth
    • "Flip model" borderless: same as vsync off/on
    • Fast sync: microstuttering
    • Nvidia 1/2 sync: microstuttering
    • scanline sync: microstuttering, frame drops
But true double buffering is absolutely smooth, not only at the monitor cap but also at all whole fractions of it, if you can maintain the fps. This is in my opinion the only way to play games at 30 or 40 fps, but it's also useful for 72 fps on a 144 Hz monitor. 1/2 sync should be identical to it, but for whatever reason it isn't and always has some minor frame pacing issues.
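To illustrate why whole fractions come out perfectly paced, here is a toy model (my own sketch, not any real graphics API): with strict double buffering a finished frame can only be shown at a vblank, and the game then blocks until scanout, so frame times quantize to whole multiples of the refresh period.

```python
import math

# Toy model of strict double-buffered vsync (illustrative only, not a
# real graphics API): a finished frame is shown at the next vblank, and
# the game blocks on the swap until that scanout happens.
def scanout_vblanks(render_ms, refresh_hz, frames=6):
    """Return the vblank index at which each frame is displayed."""
    period = 1000.0 / refresh_hz          # refresh interval in ms
    t = 0.0                               # current time
    shown = []
    for _ in range(frames):
        t += render_ms                    # render the next frame
        vblank = math.ceil(t / period)    # wait for the next vblank
        shown.append(vblank)
        t = vblank * period               # blocked until scanout
    return shown

# 20 ms frames on a 120 Hz display: every frame lands exactly 3 vblanks
# apart, i.e. a perfectly paced 40 fps; 10 ms frames land 2 apart (60 fps).
```

As long as the render time stays within the same number of refresh intervals, every frame displays exactly that many vblanks after the previous one, which is the "absolutely smooth" behavior described above.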

I've tried disabling fullscreen optimizations, disabling Game Bar and Game Mode, rolling back drivers, and setting __COMPAT_LAYER environment variables. Nothing works; double buffering is gone. The only thing left to try is rolling Windows back to an earlier version, but that isn't sustainable, as compatibility issues will pile up over time and I just don't want the hassle.

    So any clue as to what is doing this? I'd really like it back.
     
    DaRkL3AD3R and BlindBison like this.
  2. RealNC

    RealNC Ancient Guru

    Messages:
    3,671
    Likes Received:
    1,866
    GPU:
    EVGA GTX 980 Ti FTW
Set low latency mode to "ultra" in the Nvidia panel. What you're seeing is not necessarily triple buffering, but vsync backpressure. The presentation buffer length is only one part of the equation. The other is the pre-render buffer size, which also increases lag. The "ultra" setting minimizes that. Capping your FPS also minimizes it. See: Low latency vsync
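The backpressure point can be sketched with a toy queue model (my own illustration, not how the driver actually works): the display drains one frame per vblank, an uncapped game keeps the pre-render queue topped up, and every queue slot adds one refresh period of input-to-display latency.

```python
from collections import deque

# Toy model of vsync backpressure (illustrative, not real driver code):
# the display drains one frame per vblank; a game rendering faster than
# the refresh rate keeps the pre-render queue full, so each extra queue
# slot adds one refresh period of input-to-display latency.
def steady_state_latency_ms(queue_depth, refresh_hz, ticks=50):
    period = 1000.0 / refresh_hz
    queue = deque()
    latency = 0.0
    for tick in range(1, ticks + 1):
        now = tick * period
        if queue:
            latency = now - queue.popleft()   # display the oldest frame
        while len(queue) < queue_depth:
            queue.append(now)                 # input sampled at submit time
    return latency

# At 60 Hz, a 3-deep queue settles at ~50 ms of latency, while a
# 1-deep queue settles at ~16.7 ms -- the kind of difference the
# "ultra" low latency setting and FPS caps are meant to claw back.
```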
     
  3. BmB23

    BmB23 Active Member

    Messages:
    53
    Likes Received:
    15
    GPU:
    GTX 1660 6GB
    NULL has only to do with the render queue. It certainly does not cause double buffering to be unlocked from fractions of the refresh rate.

    And in case you were confused, this has nothing to do with lag. This is not a topic about latency or lag or any kind of delay.
     
  4. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    1,956
    Likes Received:
    557
    GPU:
    3060 TUF
    vsync in games = unhappiness
    It seems no one really knows where "Classic DB" vsync went, but I'd declare it officially missing and not expect it to come back. Time to live on (VRR with LFC).
     

  5. BmB23

    BmB23 Active Member

    Messages:
    53
    Likes Received:
    15
    GPU:
    GTX 1660 6GB
    Well it's the absolute only way for some games to be smooth, so I will not accept that answer. I'll happily take some input lag in exchange for real smoothness. VRR is good for unstable framerates but nothing will ever beat a stable, locked fps.

    It wouldn't be such a big issue for me if 1/2 sync actually worked properly, but it has some kind of frame pacing issue that makes it not look nearly as smooth as double buffering.
     
  6. Shadowdane

    Shadowdane Maha Guru

    Messages:
    1,445
    Likes Received:
    77
    GPU:
    MSI RTX 3080 Suprim
    VRR with a FPS cap works very well. I usually do that over just a cap near the VRR limit.
     
  7. BmB23

    BmB23 Active Member

    Messages:
    53
    Likes Received:
    15
    GPU:
    GTX 1660 6GB
    Well I'm not looking for alternatives or excuses, only answers. So that's kind of irrelevant. In my experience any kind of framerate capping that isn't driven by the video clock is never going to be perfect.
     
  8. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    1,956
    Likes Received:
    557
    GPU:
    3060 TUF
    Some games have "inexplicable" stutter/skipping with VRR that isn't visible in RTSS graph (but in the monitor's refresh rate OSD, if the game's input processing isn't at fault at least), but imho there are games that are completely smooth with VRR + fps cap. Vsync may help to filter out more fluctuations by adding more delay (also DB). Well, but yes, it is the only completely smooth result in at least some cases (e.g. Days Gone mouse input processing...).

The thing is, you can't enforce the application's buffer configuration via the driver, and D3DOverrider has stopped working. Either the behavior with vsync is what you want, or you are screwed (usually the latter).
    There is Gallium Nine on Linux that by default is "DBed" with vsync, but that's kind of a niche.
     
    BlindBison likes this.
  9. BmB23

    BmB23 Active Member

    Messages:
    53
    Likes Received:
    15
    GPU:
    GTX 1660 6GB
    And yet, that is exactly what is happening. Games that only support double buffering run as if they had triple buffering.
     
  10. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    1,956
    Likes Received:
    557
    GPU:
    3060 TUF
Or at least almost: with the GPU maxed out, and at least at specific frame rates, the result looks much more stuttery than with triple buffering, and this is also reflected in the RTSS graph. No idea why this idiotic behavior exists. It's probably also one of the reasons so many devs screw up triple buffering: they think it's already working "as intended"...
     

  11. BmB23

    BmB23 Active Member

    Messages:
    53
    Likes Received:
    15
    GPU:
    GTX 1660 6GB
No, exactly. We're not talking about VRR at all here. We're not talking about the game's implementation. Read the OP before posting.

    We are talking about Windows 10/Nvidia replacing double buffering, with no apparent way to turn off this replacement.
     
  12. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    1,956
    Likes Received:
    557
    GPU:
    3060 TUF
    Read my post again, it makes perfect sense in context of DB/TB vsync without VRR.
    When the games don't implement triple buffering, that "seemingly" TB result is much worse than the proper one.

    Anyway, no one can give you what you want.
     
  13. Martigen

    Martigen Master Guru

    Messages:
    488
    Likes Received:
    229
    GPU:
    GTX 1080Ti SLI
    Have you tried Special K? It gives full control over buffers, flip queue, render ahead, vsync modes, latent sync and more.

    This is the latest installer from the official Discord (or join that and go to Installers if you don't trust random internet links) -- https://sk-data.special-k.info/repository/SpecialK_22.6.2.exe
     
    aufkrawall2 and BlindBison like this.
  14. BmB23

    BmB23 Active Member

    Messages:
    53
    Likes Received:
    15
    GPU:
    GTX 1660 6GB
Special K doesn't actually seem to have any settings related to this. It can force borderless flip model, and verify that a game is in "legacy flip" mode. But it doesn't do anything about this issue. And it's not useful for games like, say, Apex Legends or other online shooters where you might want good frame pacing.
     
  15. Martigen

    Martigen Master Guru

    Messages:
    488
    Likes Received:
    229
    GPU:
    GTX 1080Ti SLI
    I'm certainly not an expert with the tool, but achieving good frame pacing is one of the core purposes of it.

    Are you seeing similar settings to these when you load it?:

    azCapture.PNG
You can directly set the backbuffers there ^^^, and it will clearly show where your latencies are. There's also a low-latency checkbox above this (though read the pop-up help), and of course Reflex control. Right-click on the FPS limit bar to get Latent Sync to pop up (it might help; I've never tried it; it works similarly to RTSS's scanline sync). Also, you can right-click on the Window Resolution bar at the top and force Fast Sync. There's lots to play with!

    If you open up the OSD options further down, you can get windows like the frametimes as a pop-up and the Threads one lets you change process-priority of threads on the fly. Read the wiki to get a full rundown on all the options. Somewhere in the wiki are its own recommendations for VRR, vsync on, and vsync off for what to set for Presentation Interval, Backbuffer Count, and Maximum Device Latency accordingly.

    Edit: I have no idea why the forum is stretching the image, simply used the 'upload file' function to add it. If anyone knows how to un-embiggen it, let me know.
     

  16. BmB23

    BmB23 Active Member

    Messages:
    53
    Likes Received:
    15
    GPU:
    GTX 1660 6GB
    It's not useful for online games because it's a very invasive tool that is not whitelisted by anticheat.
    I'm not sure which of those settings are supposed to do it because I don't see it.
     
  17. BmB23

    BmB23 Active Member

    Messages:
    53
    Likes Received:
    15
    GPU:
    GTX 1660 6GB
Ah, sorry. I didn't see the "Backbuffer Count" switch at first. It doesn't show up at all in DX9 games, so you have to set it with the config file.

Anyway, I can confirm that with the backbuffer count set to 2 in Special K, it still shows triple-buffered behavior with unlocked fps. So whatever is doing the patching is happening below Special K's interception, likely at the driver level.
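For anyone trying the config-file route, it looks something like this. The key names (BackBufferCount, PresentationInterval, PreRenderLimit) are the ones Special K's wiki uses, but treat the section name as my assumption and check the ini the tool generates (dxgi.ini for DX11/12, d3d9.ini for DX9) for the exact layout:

```ini
; Sketch of forcing a two-buffer swapchain in Special K's config.
; Section/key layout from memory -- verify against the generated file.
[Render.FrameRate]
BackBufferCount=2
PresentationInterval=1
PreRenderLimit=1
```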
     
  18. DaRkL3AD3R

    DaRkL3AD3R Active Member

    Messages:
    52
    Likes Received:
    9
    GPU:
    GTX 1080 Ti STRIX OC
    This is a topic extremely near and dear to me. I too have been left in frustrated wonder as to what the hell happened to true Double Buffering. I am pretty positive I had it on Windows 10 1607 at least in OpenGL, but in the later builds it seems to be completely MIA. I hope we can get to the bottom of this because it's really starting to piss me off.
     
    BlindBison likes this.
  19. BmB23

    BmB23 Active Member

    Messages:
    53
    Likes Received:
    15
    GPU:
    GTX 1660 6GB
    I just found out the damnedest thing. It seems that when forcing vsync with Special K on Jedi Fallen Order, it does enable double buffering without even setting the buffer counts. Why this game in particular is different I don't know. Could be some setting in the profile maybe? I see in inspector it has one unknown setting shared only with Shadow of the Tomb Raider. I don't have that one, but it would be interesting to see if it also worked in it.

    Anyway seeing that ultra smooth 60, or even 40 fps if I crank up the settings, is something else. This is on a 120hz monitor.
And another point: even though the frametime graph is wiggly, on the actual monitor there is not the slightest hint of a dropped frame or any kind of unsmoothness. It's the silkiest thing you've ever seen. So either the video clock is unstable (highly unlikely) or there is something wrong with the way frame times are measured in software. This is not unique to Special K.
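That mismatch is easy to reproduce in a toy model (again just an illustration, not a real API): timestamps taken when the game presents a frame wobble with render time, while the actual scanouts stay locked to vblanks.

```python
import math

# Toy model (illustrative only): with blocking double-buffered vsync,
# software timestamps taken at present time jitter with render time,
# yet every frame is still scanned out on an exact vblank.
def present_vs_scanout(render_times_ms, refresh_hz):
    period = 1000.0 / refresh_hz
    t = 0.0
    presents, scanouts = [], []
    for r in render_times_ms:
        t += r                              # frame finishes; timestamp here
        presents.append(t)
        t = math.ceil(t / period) * period  # blocked until the vblank
        scanouts.append(t)
    return presents, scanouts

# 40 fps on a 120 Hz monitor with jittery 19-24 ms render times:
# present-call intervals wobble (28, 21, 30, ... ms) while scanout
# intervals are all exactly 3 vblanks (25 ms) apart.
```

So a wiggly software graph and a perfectly even scanout cadence are not contradictory: the tool is measuring when frames were handed off, not when they hit the screen.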

[IMG]
     
  20. Martigen

    Martigen Master Guru

    Messages:
    488
    Likes Received:
    229
    GPU:
    GTX 1080Ti SLI
That's great to hear you found some success! How does it perform using the frame limiter? There's a guide on BlurBusters somewhere about using a web tool to find your exact monitor refresh rate and setting a cap just a little below it -- something like 59.6 or 59.7 based on your screenshot, but it's different for every system.

    I imagine '-1' for buffers defers to the game's default, i.e. unmodified by Special K. But if it works, it works!

    For reference, these are the recommended settings for the swapchain on the Special K wiki depending on how Vsync is configured:
Code:
VSYNC OFF
Setting                   Value      INI key
PresentInterval           0          PresentationInterval
BackBuffer Count          -1         BackBufferCount
Maximum Device Latency    -1         PreRenderLimit
Wait for VBLANK           Disabled   WaitForVBLANK
Waitable SwapChain        Disabled   SwapChainWait
Enable DWM Tearing        Enabled    AllowTearingInDWM

VSYNC ON
Setting                   Value      INI key
PresentInterval           1          PresentationInterval
BackBuffer Count          3          BackBufferCount
Maximum Device Latency    4          PreRenderLimit
Wait for VBLANK           Disabled   WaitForVBLANK
Waitable SwapChain        Enabled    SwapChainWait
Enable DWM Tearing        Enabled    AllowTearingInDWM

G-SYNC/VRR
Setting                   Value      INI key
PresentInterval           1          PresentationInterval
BackBuffer Count          3          BackBufferCount
Maximum Device Latency    2          PreRenderLimit
Wait for VBLANK           Disabled   WaitForVBLANK
Waitable SwapChain        Disabled   SwapChainWait
Enable DWM Tearing        Enabled    AllowTearingInDWM

    What is Vsync set to in your drivers? Some experimentation of that with the above may help for other games.

    I am just a casual user of Special K, you'll likely get better answers from more knowledgeable people in the discord (including from Kal the author of Special K himself): https://discord.gg/c6k6d98N
     
