Does SLI/Crossfire add a Frame(s) of Latency?

Discussion in 'General Hardware' started by EerieEgg, Sep 4, 2018.

  1. EerieEgg

    EerieEgg Member Guru

    Messages:
    155
    Likes Received:
    12
    GPU:
    GTX 1080 Ti
    Hi there guys,

    Regarding SLI potentially adding an extra frame(s) of latency, I was reading this previous Guru3D thread here:
    https://forums.guru3d.com/threads/does-sli-add-latency.388803/

    Unfortunately, reading through to the bottom, there seems to be some contention and disagreement about how much latency (if any meaningful amount at all) is added.

    Suppose one is running a game at 30 fps on a 60 Hz panel with Maximum Pre-rendered Frames set to 1, 1/2-refresh double-buffered Vsync enabled, and RivaTuner (RTSS) being used to limit the application to 29.95 fps (the low-lag traditional Vsync solution outlined by Blur Busters -- [59.935 / 2] rounded down to two decimals, minus 0.01).
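    To make that cap arithmetic concrete, here's a quick Python sketch of the calculation (the 59.935 Hz figure is just my panel's measured scanout rate, so treat it as an assumption):

    Code:
    import math

    measured_hz = 59.935                      # assumed measured refresh rate
    half = measured_hz / 2                    # 29.9675
    truncated = math.floor(half * 100) / 100  # rounded down to 29.96
    cap = truncated - 0.01                    # 29.95
    print(f"RTSS framerate limit: {cap:.2f} fps")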

    Would SLI add an additional frame (or frames) of latency in this scenario? I'm confused because I haven't been able to find a straight answer on this, or on how much latency SLI adds in general (I've been Googling around for some time now).

    If it does, that would explain why 30 fps on console feels so much less laggy than 30 fps on my PC (using a controller for both): in-game frame limiters usually give two frames less latency (while RivaTuner gives one less), and my SLI setup may be adding another frame of latency on top of that.

    Anyway, thanks for your time, I appreciate it.
     
    Last edited: Sep 4, 2018
  2. BrianG

    BrianG New Member

    Messages:
    3
    Likes Received:
    0
    GPU:
    GTX 760
    Last edited: Sep 7, 2018
  3. RealNC

    RealNC Ancient Guru

    Messages:
    3,076
    Likes Received:
    1,315
    GPU:
    EVGA GTX 980 Ti FTW
    I think somebody around here mentioned in the past that SLI automatically uses triple buffering (forced by the driver), but I don't actually know whether that's true. If it is, then there would indeed be one additional frame of latency when using vsync.
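    For scale (just arithmetic, not a measurement), one extra buffered frame under vsync costs one full refresh period:

    Code:
    refresh_hz = 60
    extra_ms = 1000.0 / refresh_hz   # one refresh period per extra buffered frame
    print(f"+{extra_ms:.1f} ms")     # ~16.7 ms at 60 Hz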
     
  4. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,360
    Likes Received:
    896
    GPU:
    1080Ti H20
    It depends on the setup.

    Compared to a single GPU set to the default of 3 pre-rendered frames (application-controlled), SLI will have the same latency/input lag.

    SLI requires 3 buffered frames to have good performance, which is also the default for a single GPU.
    In that case SLI will have the same input lag, just with lower inter-frame latency (a higher framerate).

    In a setup where a single GPU is set to 1 max pre-rendered frame, the single GPU will have up to two frames less input lag (see the sketch below).

    In short, out of the box, SLI will have the same input lag as a single GPU, but a single GPU can have less if tweaked.
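    To put rough numbers on that (a toy model I made up, not measurements): treat every pre-rendered frame sitting in the queue as adding one frame time of input age on top of the frame actually being rendered.

    Code:
    # Toy model: input lag ~= one frame time for the frame in
    # flight, plus one frame time per queued pre-rendered frame.
    def input_lag_ms(fps, queued_frames):
        frame_ms = 1000.0 / fps
        return frame_ms * (1 + queued_frames)

    for label, queued in [("MPRF 1, single GPU", 1),
                          ("MPRF 3 / SLI AFR default", 3)]:
        print(f"{label}: ~{input_lag_ms(30, queued):.0f} ms at 30 fps")

    At 30 fps that works out to roughly a 67 ms gap, i.e. the "up to two frames" above.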
     
    EerieEgg and Dragam1337 like this.

  5. RealNC

    RealNC Ancient Guru

    Messages:
    3,076
    Likes Received:
    1,315
    GPU:
    EVGA GTX 980 Ti FTW
    OK, that's not triple buffering then. MPRF is a different thing.

    The tricky question is: on a single GPU, MPRF doesn't matter if you cap your frame rate. If you cap to 60 and the game actually hits 60 FPS, MPRF behaves as if it were set to 0; the frame limiter prevents the CPU from pre-rendering anything. (Unless you're using NVIDIA's in-driver limiter, I guess, which might explain why it has more latency than in-game limiters or RTSS. But that's off-topic speculation.)

    So I wonder what happens here with SLI.
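    My mental model of why the cap empties the queue, as a sketch (assuming the limiter waits *before* input is sampled, which is how RTSS-style busy-wait limiting is usually described):

    Code:
    import time

    TARGET_FPS = 60
    FRAME_BUDGET = 1.0 / TARGET_FPS

    def sample_input(): pass    # placeholder
    def simulate(): pass        # placeholder
    def submit_to_gpu(): pass   # placeholder

    def frame(last_start):
        # Burn the leftover frame budget first; the GPU drains any
        # queued frames during this wait, so MPRF never fills up.
        while time.perf_counter() - last_start < FRAME_BUDGET:
            pass                # busy-wait for precision
        start = time.perf_counter()
        sample_input()          # input is now as fresh as possible
        simulate()
        submit_to_gpu()         # submitted into an empty queue
        return start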
     
  6. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,360
    Likes Received:
    896
    GPU:
    1080Ti H20
    My guess is that with both SLI and a single GPU capped, SLI will still have more input lag, as 3 buffered frames is inherent to how AFR works with SLI.

    I don't think there is a way to force it to buffer less, but if that's possible, then yes, such a case would reduce input lag.

    SLI in AFR mode would not work at all with only a single buffer (zero increase in performance), and having only 2 frames of buffer would be very detrimental to performance.

    Anyway, that would require testing to determine what exactly happens; the sketch below at least shows the scaling argument.
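    In the meantime, here's a toy AFR timeline to illustrate it (GPU-side overlap only; real drivers buffer deeper to absorb CPU-side variance, so don't read too much into the exact numbers):

    Code:
    # Toy AFR timeline: two GPUs alternate frames, each costing
    # 33.3 ms of GPU time. The CPU may only run `frames_in_flight`
    # frames ahead of the last finished frame.
    def avg_frame_ms(n_frames, frames_in_flight, gpu_ms=33.3, n_gpus=2):
        gpu_free = [0.0] * n_gpus   # when each GPU is next idle
        submit = 0.0                # earliest the CPU may submit
        done = []
        for i in range(n_frames):
            gpu = i % n_gpus        # AFR: round-robin frames across GPUs
            finish = max(submit, gpu_free[gpu]) + gpu_ms
            gpu_free[gpu] = finish
            done.append(finish)
            if i + 1 >= frames_in_flight:
                submit = done[i + 1 - frames_in_flight]
        return (done[-1] - done[0]) / (len(done) - 1)

    for fif in (1, 2):
        print(f"{fif} frame(s) in flight: ~{avg_frame_ms(50, fif):.1f} ms/frame")

    With a single frame in flight the two GPUs just serialize (no scaling at all); with two or more in flight they overlap and the average frame time roughly halves.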
     
  7. EerieEgg

    EerieEgg Member Guru

    Messages:
    155
    Likes Received:
    12
    GPU:
    GTX 1080 Ti
    Well, that might explain why I've been getting such poor performance in certain applications (DOOM (2016) and RAGE, for example).

    My current setup is SLI GTX 770s with max pre-rendered frames set to 1 in the NVIDIA Control Panel. I'll have to do some more tests using only one of the GPUs, or bumping the max pre-rendered frames count back up to the default, and see what happens.

    That said, in other games (Overwatch, for example), with max pre-rendered frames set to 1 and Reduce Buffering enabled in-game, I still manage to get solid performance (~70 fps at max settings @ 1080p).

    Thanks for all your replies, this is very helpful -- I don't expect I'll be doing SLI again this fall lol.
     
    Last edited: Sep 6, 2018
