Comprehensive benchmarking of NVIDIA's FPS limiters (V1, V2 & V3) vs RTSS vs In-engine

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by RodroG, Mar 1, 2020.

  1. RodroG

    RodroG Active Member

    Messages:
    55
    Likes Received:
    63
    GPU:
    RTX 2080 Ti / 11GB
    Hello everybody.

    I've just published the following comprehensive benchmark of the performance of all of NVIDIA's frame rate limiters (V1, V2 and V3), the RivaTuner Statistics Server limiter and in-engine limiters in 5 games (2 DX11, 2 DX12, 1 VK), through their built-in benchmarks and w/ G-Sync ON (NV V-Sync ON):


    I hope you like it and that it can be useful for the Guru3D community. :)

    As a summary, here are all the final notes of my full data article linked above. Feel free to comment here.

     
    Last edited: Mar 5, 2020
  2. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    38,996
    Likes Received:
    7,689
    GPU:
    AMD | NVIDIA
    Hi, thanks for your entries.

    However, if you have something to tell or report, please do that in these forums and stop forcing people to move to Reddit?
     
    GSDragoon, Nekrosleezer and xacid0 like this.
  3. RodroG

    RodroG Active Member

    Messages:
    55
    Likes Received:
    63
    GPU:
    RTX 2080 Ti / 11GB
    Sorry, I'm not forcing people to move to Reddit. I'm just informing people here about the release of my latest research, sharing a link to the full article that was published on that subreddit, and offering a summary here too.
    Of course, I intend to reply here to all the users who comment/reply in this thread. Regards.
     
    ZeroStrat and Smough like this.
  4. aufkrawall2

    aufkrawall2 Master Guru

    Messages:
    675
    Likes Received:
    80
    GPU:
    RX 480 8GB OC/UV
    Nvidia's limiter doesn't work well for me with D3D12/Vulkan: there are odd minor and less minor spikes in the RTSS graph in every game I tested. The RTSS limiter, on the other hand, always seems to be flawless, and I'd say the same is true for Nvidia's limiter with D3D11. There don't seem to be noteworthy differences between v1/v2/v3; it might all be the same limiter with minor configuration differences.
     
    Last edited: Mar 1, 2020
    RodroG likes this.

  5. hemla

    hemla Member Guru

    Messages:
    171
    Likes Received:
    10
    GPU:
    nvidia
    I'm not sure how accurate those "input lag" measurements are. In the past it's been shown that "input lag" is very app dependent, as in-app limiters aren't all the same. For example, for me CryEngine 3 is still much snappier with the in-game limiter than with anything else. It would be interesting to see how it performs in benchmarks like yours, just for comparison.
     
  6. RodroG

    RodroG Active Member

    Messages:
    55
    Likes Received:
    63
    GPU:
    RTX 2080 Ti / 11GB
    Precisely; my findings showed exactly how game/engine/API dependent the performance results of the different FPS limiters/methods are. The latency results are accurate, and they refer to the latest approximate, probabilistically "expected", approach that was mentioned and detailed in the full article linked above and in other sources.

    Anyway, if you really want the technical details of this approximate latency measurement approach, please read carefully ALL of the following relevant and related sources:
    I hope this helps you better understand the input lag approximation I used for my research. Regards.
     
    Last edited: Mar 2, 2020
    BlindBison likes this.
  7. cryohellinc

    cryohellinc Ancient Guru

    Messages:
    3,302
    Likes Received:
    2,582
    GPU:
    RX 5700 XT/GTX 1060
    Very interesting analysis. Thank you!
     
    BlindBison and RodroG like this.
  8. hemla

    hemla Member Guru

    Messages:
    171
    Likes Received:
    10
    GPU:
    nvidia
    So technically it's just guessing, more or less accurate. It's strange because it doesn't correspond with Battle(non)sense's "input lag" research, where he claims that in-app limiters are always best.
     
  9. aufkrawall2

    aufkrawall2 Master Guru

    Messages:
    675
    Likes Received:
    80
    GPU:
    RX 480 8GB OC/UV
    The software approximation might be wrong then, as only the method based on "physical evidence" used by Battle(non)sense to measure input latency is actually reliable.

    Though Battle(non)sense doesn't test with ULL set to ultra. If he changed this, external limiters should work much better in terms of input latency in DX11 titles.
    He also doesn't mention that you can't combine AMD's Chill with their Anti-Lag feature (which also works worse than Nvidia's ULL ultra regarding frame time consistency).
     
  10. hemla

    hemla Member Guru

    Messages:
    171
    Likes Received:
    10
    GPU:
    nvidia
    Honestly, I'm not 100% sure about Battle(non)sense's methods either. He measures just by pressing the mouse button once and recording the effect with a camera; good in theory, but very limited in practice, as there are other factors as well. For example, when you use the NVCPL(3)/RTSS limiters, the 0.1% fps lows are much better than with any in-app limiter, and if you get some fps drops below your target fps, then the "input lag" generated by those drops should technically also be lower. But the average "input lag" enforced by an in-app limiter should be better.
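    For reference, the 0.1% low metric mentioned above can be derived from a frame-time log roughly like this (a minimal sketch with made-up frame times, not any particular tool's exact implementation):

    ```python
    # Sketch: average fps and 0.1% low fps from per-frame times (ms).
    # The frame-time values below are invented for illustration only.

    def fps_stats(frametimes_ms):
        """Return (average fps, 0.1% low fps) for a frame-time log."""
        avg_ms = sum(frametimes_ms) / len(frametimes_ms)
        # The 0.1% low is the fps equivalent of the slowest 0.1% of frames:
        # take the 99.9th-percentile frame time and convert it back to fps.
        worst_ms = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.999)]
        return 1000.0 / avg_ms, 1000.0 / worst_ms

    # 2000 steady 10 ms frames with two 30 ms spikes: a capped ~100 fps run
    # with occasional drops, as described above.
    log = [10.0] * 1998 + [30.0, 30.0]
    avg_fps, low_01 = fps_stats(log)
    ```

    The spikes barely move the average (~99.8 fps) but dominate the 0.1% low (~33 fps), which is why the two metrics can rank limiters differently.
    
    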
     

  11. BlindBison

    BlindBison Master Guru

    Messages:
    384
    Likes Received:
    61
    GPU:
    RTX 2080 Super
    @hemla BattleNonSense notably only tested a handful of games -- HardwareUnboxed conducted their own tests and found that this is NOT always true. For example, Far Cry 5's in-engine limiter did not reduce input lag more than RTSS did while having worse framepacing.

    Overwatch/Battlefield V on the other hand seem to have "proper" in-engine limiters that do reduce input delay more than RTSS, but also appear to have worse frame pacing.

    HardwareUnboxed also found that capping framerate did not always reduce input delay like BattleNonSense found in his tests, but I'm not entirely certain how they approached testing there. My understanding is that you would normally have to be GPU limited/have your GPU maxed out to get the input lag reduction from fps limiters, though I could be missing something. Historically @RealNC has been a real wealth of knowledge on this sort of thing, so I'll tag you just in case I got anything wrong here :)

    EDIT:

    I don't remember if they produced more than one video on the subject, but I believe this is where they did some of those tests at least.
     
    RodroG likes this.
  12. BlindBison

    BlindBison Master Guru

    Messages:
    384
    Likes Received:
    61
    GPU:
    RTX 2080 Super
    @RodroG Thanks for all your work here, this is awesome :) I have read your write-up/the reddit thread and I am curious which external limiter you personally plan to use primarily going forward as it seems different limiters can have slight advantages in different scenarios. Anyhow, thanks for all your work and time on this.

    EDIT: I see you answered this in the Reddit thread here:

    > "It should be noted that, according to my tests and stability-wise, the NV CP limiter, and NVIDIA's limiters in general, can lead to significantly worse Lows avg numbers in some scenarios (DX12 and Vulkan), which means (minor) stuttering, especially noticeable in WolfYB. In my opinion, RTSS is still the best and most solid method in terms of frame time consistency with a good input lag level."
     
    Last edited: Mar 2, 2020
    RodroG likes this.
  13. RodroG

    RodroG Active Member

    Messages:
    55
    Likes Received:
    63
    GPU:
    RTX 2080 Ti / 11GB
    Hi. I've replied to other users on the same or similar question, so please let me quote here the answer I offered in other forums:

    Anyway, as I said above, you should understand the technical details that currently support this approximate and probabilistic approach for estimating the expected input lag by carefully reading all the sources I linked above.

    Of course, frame times can very well be measured with software. Consistency is a matter of analysis, not a matter of measurement. However, yes, the limiters work differently (different ordering of hook <-> sleep). To account for this fact, the PresentMon and CapFrameX developers have defined an expected approximate latency, which includes an upper bound to cover delays caused by the sleep calls in the calculation. Using only the lower bound would be critical in some cases. Additionally, CapFrameX adds an offset (6 ms in my research) for the peripheral latency (mouse and keyboard).
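    As a rough illustration of the idea only (these are not the exact PresentMon/CapFrameX equations, which are detailed in the sources linked above; the midpoint rule and all numbers here are my own simplification):

    ```python
    # Simplified illustration of a bounded input-lag approximation.
    # NOT the actual PresentMon/CapFrameX formulas; it only shows the
    # lower-bound / upper-bound / expected structure with a fixed
    # peripheral offset, using invented numbers.

    PERIPHERAL_OFFSET_MS = 6.0  # mouse/keyboard offset used in the article

    def approx_input_lag(frametime_ms, until_displayed_ms):
        """Return (lower, upper, expected) latency in ms for one frame.

        Assumes the input can arrive at any point during the previous
        frame, so the true latency lies between the two bounds; the
        'expected' value is the midpoint plus the peripheral offset.
        """
        lower = until_displayed_ms                 # input sampled just before present
        upper = frametime_ms + until_displayed_ms  # input sampled at start of prev frame
        expected = (lower + upper) / 2 + PERIPHERAL_OFFSET_MS
        return lower, upper, expected

    lo, hi, exp_ms = approx_input_lag(frametime_ms=10.0, until_displayed_ms=15.0)
    ```

    The point is only that a single measured number is replaced by a bounded range plus an expected value, which is what makes the approach probabilistic rather than a direct measurement.
    
    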

    I know that getting used to a new and alternative but valid software measurement approach is not easy at first, but we will need to get used to the fact that something like input lag can be approximated.

    I don't think it is a matter of misunderstanding or contradictions between my results and theirs, but of how we interpret and assess the reliability, validity and representativeness of the results and conclusions of any analysis of this type. That is, Blur Busters' or Battle(non)sense's results and mine should not be considered directly contrary but complementary. The reason is that there are significant methodological differences between our analyses that prevent one from directly invalidating the other. In fact, technically, that would only be possible if both reviewers/observers obtained contrary results with the same methodology (and this wasn't the case); and, if that happened, it would provisionally invalidate the superiority of the conclusions of both analyses.

    Just as an example, and without an aim of completeness, here are some of the more evident methodological differences to which I'm referring: number and selection of the sample of games / engines / 3D APIs used, resolution and graphic options applied, FPS limit values, methods and instruments for capturing, measuring and visualizing performance metrics ...

    On the other hand, and no less important, assertions of the type "X is always or will be ..." lack any technical or scientific accuracy or validity in any study or benchmark with minimal methodological rigour. Every useful conclusion/recommendation is always tentative and tied to a particular context and situation of analysis and application; that is, there are no fixed or invariable rules, only provisional and relative recommendations/conclusions that are more or less useful to us.

    Finally, on the supposed conclusion that in-engine limiters are "at least 1 frame ahead of the external" ones: I'd consider it a relative conclusion, based on their button-to-pixel studies and methodology. If we add the term "always" to the assertion, it becomes more of an ideal condition, or theoretical desideratum, since, according to my experience and analysis, it is only met when the in-engine limiter implementation is optimal (and not suboptimal, as it is in the vast majority of the cases I know and have been able to analyze so far).

    Sorry for the long answer but I think it's the best way to answer your questions. Regards.
     
    BlindBison likes this.
  14. aufkrawall2

    aufkrawall2 Master Guru

    Messages:
    675
    Likes Received:
    80
    GPU:
    RX 480 8GB OC/UV
    Though he repeats that several times and provides avg values, which should be safe.

    But yes, he tests only games with a somewhat proper in-game limiter while there are also other cases in the wild.
    And it's also not true that every in-game limiter has worse frame time variance than RTSS; e.g. the limiter in Serious Sam: Fusion produces a completely flat line in the RTSS graph.

    My 2 cents in a nutshell:
    D3D11: Set ULL to ultra and either use Nvidia limiter v3 or RTSS, probably doesn't matter at all.
    D3D12/Vulkan: Try internal limiter first and if it's crap, resort to RTSS.
     
  15. RodroG

    RodroG Active Member

    Messages:
    55
    Likes Received:
    63
    GPU:
    RTX 2080 Ti / 11GB
    First, thank you for your feedback and words. :)

    On the question: personally, I'm not an e-sports or competitive gamer, and I prefer single-player games and solo gameplay experiences overall. In that sense, the absolute lowest input lag over any other performance consideration is honestly not my main concern; the best frame time consistency or stability with an acceptable, good input lag level is. Therefore, in that particular context, and according to my findings/experience, I still consider the RTSS limiter the most consistent and balanced method to limit my frame rate with G-Sync in most engine/API scenarios.
     
    Last edited: Mar 2, 2020
    BlindBison likes this.

  16. BlindBison

    BlindBison Master Guru

    Messages:
    384
    Likes Received:
    61
    GPU:
    RTX 2080 Super
    @RodroG Thanks! I feel the same way personally.
     
  17. hemla

    hemla Member Guru

    Messages:
    171
    Likes Received:
    10
    GPU:
    nvidia
    I have tried running CapFrameX to see the results myself, but it seems unable to detect any processes on my computer.

    Anyway, I checked the GitHub links you've provided, and the statement is clear:
    https://github.com/GameTechDev/PresentMon/issues/73#issuecomment-589748837
     
  18. RodroG

    RodroG Active Member

    Messages:
    55
    Likes Received:
    63
    GPU:
    RTX 2080 Ti / 11GB
    The statement you quoted can't be clear on its own, because it isn't the author's full comment, doesn't include the rest of that author's comments, and isn't even properly contextualized without considering all the statements before and after the part you quoted, plus the other GitHub thread and sources I linked above.

    Your comment is clearly biased. Please let everyone read all the information I've provided so they can understand the subject without any biased preconceptions. Don't try to do it for them by quoting only part of a larger comment/thread, taken out of context.
     
  19. hemla

    hemla Member Guru

    Messages:
    171
    Likes Received:
    10
    GPU:
    nvidia
    I've read all of it and pointed to the most important piece, as you have clearly stated that the measurements are "accurate". They might be accurate relative to other measurements done the same way, but by no means are they an "input lag" panacea.
     
  20. RodroG

    RodroG Active Member

    Messages:
    55
    Likes Received:
    63
    GPU:
    RTX 2080 Ti / 11GB
    They are accurate with respect to the latest PresentMon approximate input lag equations (lower bound, upper bound and expected) that the devs implemented and that I used in my research. Of course, no one here talked about anything like an "input lag panacea".

    We (the PresentMon and CX dev teams and I) are just talking about the best approximate, probabilistically expected, input lag approach available through software tools. Just that.
     
    hemla likes this.
