Blur Busters Feature Request: Remove 300fps Cap in RTSS UI

Discussion in 'Rivatuner Statistics Server (RTSS) Forum' started by mdrejhon, Jun 30, 2020.

  1. mdrejhon

    mdrejhon Member Guru

    Messages:
    128
    Likes Received:
    136
    GPU:
    4 Flux Capacitors in SLI
    Feature Request

    Founder of Blur Busters / TestUFO here!

    Request to remove the "300 clamp" for the Framerate limit, pretty please. ;)

    I have to test multiple caps rapidly several times an hour, and that makes config-file editing a pain in the brand new 360Hz+ monitor era.

    P.S. If there are political/economic/legitimate reasons it cannot be done at the UI level by default, please add a "ClampLimit=300" to configuration file that I can manually edit to 1000 for my internal prototype displays!


    Synopsis

    Multiple 360 Hz monitors are currently coming to market, and ASUS has confirmed a roadmap to 1000 Hz displays. Is it possible to please remove the 300fps cap, since it is now beginning to interfere with Blur Busters testing? We would also rather not keep editing configuration files in this ultra-high-Hz era.

    1/300sec = 3ms microstutter is now human-visible thanks to the Vicious Cycle Effect, whereby:
    - Bigger / Wider screens
    - Higher resolutions
    - Higher refresh rates
    - Reduced motion blur
    - Increased brightness / HDR

    all co-amplify each other, making ever-smaller stutters human-visible, as seen in Amazing Human Visible Feats of The Milliseconds.

    At 4000 pixels/sec on a 2560x1440 240Hz display (about 1.5 screen widths per second panning/turn speed), a 3ms microstutter is a 13 pixel stutter jump (1/300th of 4000 pixels). That tiny microstutter is big enough to punch through the 240Hz frame granularity of 4.2ms. As motion-blur widths (in milliseconds) start to become smaller than stutter widths (in milliseconds), ever smaller stutters become human-visible. This is important as we march towards simultaneous retina-rez + retina-refresh. It's a Vicious Cycle Effect.

    The simultaneous combo of (high resolution) + (high refresh) amplifies the visibility of ever-tinier stutter errors. This stutter science is an important textbook exercise in the refresh rate race to retina refresh rates.
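    The stutter-jump arithmetic above is quick to verify in code (all figures are from this post; this is just an illustrative calculation):

```python
# Illustrative calculation of the microstutter numbers quoted above.
pan_speed_px_per_sec = 4000        # panning speed quoted for 2560x1440
stutter_sec = 1 / 300              # a 1/300 sec (~3.3 ms) timing error
frame_granularity_ms = 1000 / 240  # one 240 Hz refresh = ~4.2 ms

# A timing error displaces the panning image by (speed * error) pixels.
stutter_jump_px = pan_speed_px_per_sec * stutter_sec

print(f"stutter jump: {stutter_jump_px:.1f} px")                   # ~13.3 px
print(f"240 Hz frame granularity: {frame_granularity_ms:.1f} ms")  # ~4.2 ms
```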

    Useful Reads Of Refresh Rate Race

    Blur Busters being the famous refresh rate mythbusters, sometimes I have to explain the benefits of ultra-high refresh rates, since many still doubt 1000Hz. This occasionally necessitates a pre-emptive mic drop for other readers / game developers / etc. (who may not be familiar with high-Hz), so:

    1. Blur Busters Law: The Amazing Journey To Future 1000 Hz Displays
    2. The Stroboscopic Effect of Finite Framerate Displays
    3. Frame Rate Amplification Technologies (NVIDIA is also working towards cheaper 1000fps)
    4. GtG versus MPRT: Two Pixel Response Benchmarks
    5. PC Magazine: Confirmation of ASUS 1000Hz roadmap (scroll down to see "road to 1000Hz")

    I talked to ASUS at CES 2020, and they confirmed they have a 1000Hz roadmap spanning the 2020s decade. I created the 360 Hz demo for CES 2020 that was exhibited at both the NVIDIA and ASUS suites (see the NVIDIA announcement; scroll down to see "BlurBusters partnered"), and have been credited by NVIDIA in some papers (e.g. page 2 of Temporally Dense Raytracing), so my credentials should speak for themselves.

    Display motion blur on sample-and-hold displays behaves the same as in an SLR photograph: a 1/120sec versus 1/1000sec camera shutter is human-visible in the resulting photo. The blur of camera panning versus the blur of similar display panning is actually mathematically identical (assuming these scientific variables: sample-and-hold, non-impulsed, GtG=0). Measurements show that display motion blur behaves the same way, and laboratory experiments with experimental 360Hz, 480Hz, and 1000Hz+ displays confirm it.

    Doubling the Hz halves motion blur on sample-and-hold displays. Eventually achieving 1ms MPRT without the use of a strobe backlight requires 1000 unique 1ms frames consecutively, with no black periods in between. Basically, 1ms frame consecutivity instead of 1ms strobe flashes -- to create blurless sample-and-hold (aka full-brightness lagless / stutterless / strobeless ULMB).
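    The blur math stated above is simple to put in code. A minimal sketch of the sample-and-hold relationship, under the same assumptions as in this post (GtG=0, framerate=Hz, no strobing):

```python
def mprt_ms(refresh_hz: float) -> float:
    """Persistence (MPRT) of a sample-and-hold display at framerate == Hz.

    Each frame stays on screen for the full refresh period, so the
    moving-picture response time equals the frame duration.
    Assumes GtG = 0 and no black periods between frames.
    """
    return 1000.0 / refresh_hz

for hz in (60, 120, 240, 480, 1000):
    print(f"{hz:>4} Hz -> {mprt_ms(hz):.2f} ms MPRT")
# Doubling the refresh rate halves the persistence; 1000 Hz reaches
# 1 ms MPRT without a strobe backlight.
```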

    Thank you for adding >300 cap flexibility to the next version of RTSS! ;)

    P.P.S. I just recently helped a small indie software developer remove two layers of stutter from their game, successfully futureproofing it for ultra-high-Hz displays and making it VRR-compatible. Surprisingly, it took only a few lines of code changes to their Unity-engined game, since 99% of the stutter was timing-related (the need to keep gametime-vs-presenttime in better sync). It was also thanks to the help of RTSS frametime debugging.
     
    Last edited: Jun 30, 2020
    CrunchyBiscuit and Dan Longman like this.
  2. Andy_K

    Andy_K Master Guru

    Messages:
    845
    Likes Received:
    242
    GPU:
    RTX 3060
    In RTSS 7.3b4 it is at 360fps, or 2777 microseconds.

    Framerate-Frametime.gif
     
  3. mdrejhon

    mdrejhon Member Guru

    Messages:
    128
    Likes Received:
    136
    GPU:
    4 Flux Capacitors in SLI
    Thanks, I'll have to upgrade to 7.3b4!

    That'll solve 360 Hz monitors. (That said, I also have access to 480 Hz+ experimental displays.)

    I also help some game developers de-stutter; we've emailed them RTSS statistics screenshots (e.g. framerate-vs-physics lock effects, etc.).
     
  4. Dan Longman

    Dan Longman Master Guru

    Messages:
    225
    Likes Received:
    157
    GPU:
    4080 FE
    Sorta off topic, but I saw in the latest patch notes for Cloudpunk that you worked with them to help with the stuttering! Great work. The game has great frametiming now! Thanks!
     
    mdrejhon likes this.

  5. mdrejhon

    mdrejhon Member Guru

    Messages:
    128
    Likes Received:
    136
    GPU:
    4 Flux Capacitors in SLI
    Yes -- that's one of them.

    Cloudpunk is an amazing indie game if you like simple adventures in a cyberpunk atmosphere, a sort of mash-up between a Minecraft graphics look and a Blade Runner look.

    For a couple of days, I volunteer-playtested the stutters from a developer-knowledge perspective, and we chatted on Discord to stomp out all the stutter causes, including specific programming suggestions.

    They credit me for helping debug stutters, in the Steam Store release notes:
    https://steamcommunity.com/app/746850

    I also want to thank @Unwinder (et al.) for the public service of enabling end users to help developers stutter-debug their games -- RTSS makes that possible without needing access to original source code or Unity profiling tools.

    Stutter is a huge onion of many stutters, such as beat-frequencies (fps-vs-Hz, physics-vs-Hz, physics-vs-framerate, fps-vs-physics, pollrate-vs-framerate, performance, disk load, rendertime variances, etc.). Some are simple one-line game fixes and others are a lot of work.

    Note about stutters that aren't fixed by VRR
    An example is a beat-frequency effect between physics clocks (60Hz) and frametimes (which vary in VRR). Those will punch through VRR as visible stutter. Stutters that cannot be fixed via VRR are gametime-vs-photontime divergences: either the wrong gametime is rendered, or a frame is presented at the wrong time. VRR enables presenttime-vs-photontime sync, but stutters can still punch through VRR. De-stuttering VRR requires very consistent gametime-to-presenttime sync. For perfect VRR, the gametimes must float correctly and organically with frametimes: both should increment by exactly the same amount, with the presenttime incrementing by an exact amount too. This can be hard with fluctuating rendertimes, as 1ms divergences can still be visible stutter (the 1440p 240Hz situation, or the ULMB situation, where blur is low enough to amplify microstutter; I also expect 360Hz monitors to make 1ms stutters human-visible, too). Ever tinier milliseconds become more visible at ever higher resolutions + ever higher refresh rates, and even 0.5ms divergences are about to become human-visible. It's the Vicious Cycle Effect, a chapter in the Blur Busters 1000 Hz Journey article.

    The simultaneous convergence of retina rez + retina refresh creates a major Hz-requirements-amplification effect (sub-millisecond blur is visible, sub-millisecond stutter is visible, etc.). It's illustrated in the forum post, The Amazing Human Visible Feats of the Millisecond. Futureproofing a game engine requires decoupling from all clock granularity where possible (use microsecond clocks instead of millisecond clocks, use Update() instead of FixedUpdate() except where necessary, use interpolated physics positions for organic frame rates where needed). This helps an engine self-adapt to GPU progress and display progress. These are mostly common-sense game engine fixes that are easy in off-the-shelf game engines.
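    The "interpolated physics positions for organic frame rates" idea is the classic fixed-timestep-with-interpolation pattern (the pattern behind Unity's FixedUpdate plus interpolation). A minimal Python sketch of it, with illustrative names and numbers (not actual engine code):

```python
# Fixed-timestep physics with render-time interpolation: physics steps
# at a fixed rate, while the rendered position is blended between the
# last two physics states so it matches the (fluctuating, VRR) frame time.
PHYSICS_DT = 1.0 / 60.0  # fixed physics step, e.g. 60 Hz

def advance(position, velocity, frame_time, accumulator):
    """Run whole physics steps, then interpolate for this render frame.

    Returns (new_position, new_accumulator, render_position).
    """
    accumulator += frame_time
    prev = position
    while accumulator >= PHYSICS_DT:
        prev = position
        position += velocity * PHYSICS_DT  # one fixed physics step
        accumulator -= PHYSICS_DT
    # Blend between the last two physics states; alpha is how far the
    # render time sits between them. This removes the fps-vs-physics
    # beat-frequency stutter at any frame rate.
    alpha = accumulator / PHYSICS_DT
    render_pos = prev + (position - prev) * alpha
    return position, accumulator, render_pos
```

    The rendered state intentionally lags up to one physics step, trading a fixed sub-frame of latency for beat-frequency-free motion.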

    One useful tip for game developers is the RTSS 50fps or 55fps cap for stutter-debugging VRR, because it makes beat-frequency stutters from physics-vs-framerate issues easy to see. That's how Cloudpunk finally fixed a lot of stutters.
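    The beat-frequency arithmetic behind that tip is straightforward (an illustrative calculation using the framerates from this thread):

```python
def beat_frequency_hz(framerate: float, refresh_hz: float) -> float:
    """Visible stutter beats per second when a steady framerate is shown
    on a fixed-Hz (non-VRR) display: |Hz - fps| times per second the
    frame cadence slips by one refresh, repeating or dropping a frame."""
    return abs(refresh_hz - framerate)

print(beat_frequency_hz(55, 60))  # 5 stutter beats per second
print(beat_frequency_hz(50, 60))  # 10 stutter beats per second
```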

    I am writing a new article about this, which I'll release soon -- about "paying attention to milliseconds" (milliseconds matter much more than you think for enabling VRR support -- but fortunately, off-the-shelf engines like Unity and Unreal make it easy now).

    For now, I have developer instructions for de-stuttering Unity engine games here:
    https://forums.blurbusters.com/viewtopic.php?f=22&t=7158

    It takes only about 3 lines of code to begin to enable VRR support for a Unity-engined game. Though some other optimizing was needed, like fixing a beat-frequency effect from a fixed-updating camera that was locked to the physics rate rather than the organic framerate.

    That was fixed too, and it made Cloudpunk much more VRR-compatible. The developer and I took only 2 days to debug all the low-hanging-fruit stutters out, and there are sudden rave "butter smooth" reviews from 75Hz and 144Hz users, and VRR users.

    Just from the stutter fixes -- the game feels almost as if the framerate has doubled! Now it is arcade butter-smooth like NES Super Mario Brothers panning or Sega Model 3 arcade machines -- even during fluctuating framerates, since VRR keeps the fluctuating framerates stutterless. Just the minor graphics-cull/load stutters or disk-load stutters remain, but otherwise it's completely butter smooth.
     
    Last edited: Jul 4, 2020
    Dan Longman and CrunchyBiscuit like this.
  6. CrunchyBiscuit

    CrunchyBiscuit Master Guru

    Messages:
    343
    Likes Received:
    126
    GPU:
    MSI GTX 980 Ti
  7. mdrejhon

    mdrejhon Member Guru

    Messages:
    128
    Likes Received:
    136
    GPU:
    4 Flux Capacitors in SLI
    I am actually creating some new custom TestUFO animations to demonstrate beat-frequency stutters, because my article will be about visual stutter debugging.

    The 50fps/50Hz test is an excellent predictor of VRR stutter issues. Almost all 60Hz office monitors can do 50Hz (non-VRR) via a Custom Resolution, and it's a framerate usually within all known VRR ranges (40-60Hz, 48-75Hz). A bottom-barrel generic 60 Hz office monitor will usually be capable of a 50Hz or 55Hz mode (via Custom Resolution Utility), or a narrow VRR range, sufficient to allow visual stutter-debugging.

    This dramatically accelerates visual stutter debugging for indie developers with limited gaming equipment. Though this developer had a 120Hz VRR monitor, 120fps hid a lot of stutters that were visible at 144fps or 50fps.

    Hopefully my article will become go-to textbook reading for Unity developers who like visual stutter debugging, since beat-frequencies are easy to see (55fps-vs-60Hz = five clear stutters per second). Graphs are important, but seeing the stutter visually makes it dramatically easier; fixing 55fps stutter will usually Hz-future-proof a Unity game.

    Based on how much things got accelerated, I now think at least some developers and playtesters should configure their 60Hz monitor to 50Hz or 55Hz for stutter bug reports, since it's such a big beacon and a good predictor of high-Hz stutters and VRR-unfixable stutters.

    This can compress a 2-week debug into 2 days, as it did for me and the Cloudpunk developer.
     
    Last edited: Jul 4, 2020
    CrunchyBiscuit likes this.
  8. howiec

    howiec Member

    Messages:
    20
    Likes Received:
    4
    GPU:
    GTX 970 HOF
    Now that we have 360Hz monitors and some of us are overclocking them, could we please get an increased cap again? =)

    Thanks!
     
  9. Unwinder

    Unwinder Ancient Guru Staff Member

    Messages:
    17,194
    Likes Received:
    6,865
    It was extended to 360 a few versions ago.
     
  10. mdrejhon

    mdrejhon Member Guru

    Messages:
    128
    Likes Received:
    136
    GPU:
    4 Flux Capacitors in SLI
    He meant above 360.

    I've already overclocked my ASUS PG259QN too, so I agree, 360 is currently too low. ;)

    It can do 366 Hz without frameskipping and 433 Hz with frameskipping. Also, 480 Hz prototypes are now in various labs (ETA 2021-2022 is my wild guesstimate).
     

  11. Unwinder

    Unwinder Ancient Guru Staff Member

    Messages:
    17,194
    Likes Received:
    6,865
    He missed the train; the new version was just released and no updates are expected in the nearest weeks. ;) I’ll extend it to 480 in future versions.
     
    mdrejhon and Astyanax like this.
  12. howiec

    howiec Member

    Messages:
    20
    Likes Received:
    4
    GPU:
    GTX 970 HOF
    Awesome, thank you sir!
     
  13. Unwinder

    Unwinder Ancient Guru Staff Member

    Messages:
    17,194
    Likes Received:
    6,865
  14. mdrejhon

    mdrejhon Member Guru

    Messages:
    128
    Likes Received:
    136
    GPU:
    4 Flux Capacitors in SLI
    Thanks!

    Take your time, just want to keep the door open.

    Currently, the Windows hardcoded refresh rate limit is in the territory of 500 or 512Hz (there is conflicting information from multiple sources).

    P.S. By the way, I recently successfully convinced Microsoft to experimentally raise the refresh rate limit of Microsoft Windows to 1000Hz. Although the information is only available to Microsoft Partners at this time, there is now a clear path to 1000Hz experimentation for display researchers & manufacturers.

    P.P.S. On the topic of the refresh rate race: the recommendation is geometric refresh rate upgrades to punch through the diminishing curve of returns (60Hz -> 120Hz -> 240Hz -> 480Hz -> 960Hz). Doubling the Hz halves motion blur on a sample-and-hold display (assuming GtG=0 and framerate=Hz), which is why 120Hz is becoming mainstream nowadays in smartphones, VR, televisions, etc. Now that Apple has brought 120Hz to the iPhone 13, it will probably eventually become hard to buy any Apple device (laptop, mobile, etc.) below 120Hz by the mid-to-late 2020s. Ever since we stopped strobing (CRT) and went ergonomic flickerfree, a slow refresh rate race was triggered as the ergonomic blur reduction method. I heard indirectly that mainstream vendors such as Dell/HP intend to introduce 120Hz to generic office monitors at near-zero added cost by 2025-2029ish, so there will come a time, around year 2030, when it becomes increasingly hard to buy 60Hz-only displays, much like it's hard to buy a 720p HDTV or a non-retina phone today. In the parts channels, I'm seeing signs that 120Hz will cease to be a premium-cost refresh rate before the end of this decade, much like HD and retina are cheap now. Meanwhile, by then, esports will have moved to 1000Hz+ :)

    EDIT: Just realized you already posted the 480Hz change. Doh. Thanks! Don't worry about exceeding 480 quickly, but I'd like to see a 1000 limit earlier this decade rather than by 2030.
     
    Last edited: Nov 6, 2020
