A 500Hz refresh rate NVIDIA G-Sync compatible gaming LCD is in the works

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 24, 2022.

  1. MonstroMart

    MonstroMart Maha Guru

    Messages:
    1,397
    Likes Received:
    878
    GPU:
    RX 6800 Red Dragon
    Personally I can definitely see the difference between 100 fps and 60 fps. When my games drop into the 60s I can tell without checking the fps counter. I can't tell the difference between anything over 80-ish fps. If I get 120 fps and drop to 85 fps I won't notice, and not enough for me to care anyway. But if I drop from 100 to around 60 I'll definitely notice it in a bad way; it just doesn't feel as smooth.
     
  2. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    You are misrepresenting BlurBuster's claims and demonstrations. If you carefully went over the material and explanations on display there, you would see that all these aspects of display technology are addressed.
     
    mdrejhon likes this.
  3. Venix

    Venix Ancient Guru

    Messages:
    3,476
    Likes Received:
    1,975
    GPU:
    Rtx 4070 super
    I can instantly tell when I drop below 48 fps because of the FreeSync range .... Hard to explain; it's like everything was swimming in water and suddenly the water is as thick as mud and a bit framey .... From 48 to 75 Hz I can not tell, and mind you I had no issues with hitting Elden Ring's iframes! But that was the first thing in a long time that made me test my gaming reflexes .... In general I am not into shooters etc. anymore!
     
    tunejunky likes this.
  4. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    5,249
    Likes Received:
    1,612
    GPU:
    RTX 3060 12GB
    I can see the difference between infinity and nearly infinity.
     
    Undying, iNerd and tunejunky like this.

  5. iNerd

    iNerd Master Guru

    Messages:
    226
    Likes Received:
    69
    GPU:
    EVGA RTX 3080 Ti
    To be honest, I read your post and the first thing I did was google for ABX blind test and read what that is :) so the answer is: no. Just personal user feedback... no pro tests or similar. As I said, 99+% of the time I will not see any difference, but there are some moments where I could see it, like that one movement in that one game :) And I need to add: I would never replace this IPS monitor with a 500Hz TN panel. I upgraded from a 240Hz TN panel to a 360Hz IPS and would never go back to TN again... next upgrade would be 30+ inch 240Hz or 27 inch 360Hz, but it is too expensive since I don't use it for work.
     
  6. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,489
    Likes Received:
    3,113
    GPU:
    7900xtx/7900xt
    ^ THIS w/ mdrejhon's comment

    marketing people get their hands on numbers and get them twisted for a few reasons: A) they don't understand the subject matter's basic principles, B) they look for the most impressive-seeming numbers, and if there's a substantial difference they will flog it like hell regardless of applicability, OR C) all of the above.
    a few good (modern) examples are "GtG" (only applicable to LCDs, and all modern sets are "good enough")
    and brightness. yes, a monitor needs to put out light, but all that light needs to be is of a balanced spectrum, so quality over quantity.
    frame rate is similar (with marketing going wild over the spec). Neo Cyrus is entirely right. i bought a 144Hz monitor for my gaming system and have a 120Hz monitor with the htpc and workstation. i don't play quick-twitch shooters, but my nephew claims he can tell the 144Hz is better for C.O.D. for me there is no difference worth paying for.
    but for a reality check on the brightness thing - True Black 400 blows away HDR 600 unless you're in a very bright room. and the eye-searing specs on some (way cool) mini-leds are totally unnecessary (1200 nits+), even if you're in a very modern south-facing room with floor-to-ceiling glass.
    mind you, the mini-leds are self-emitting like OLED pixels, so they have True Black; that kind of number is marketing overkill, as for function in a bright room you wouldn't need more than 600(ish) nits.

    but my main point is IDK whether even competitive league gamers can gain an advantage, considering the latency at other points in the chain; at some point the frame rate isn't going to help with lag.

    but hey e-peen enhancement is a steady business
     
    Venix likes this.
  7. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    (I didn't really get it wrong)
     
  8. I set a custom refresh rate on an Acer S271HL 1080p monitor, from 60Hz to 75Hz, and can say the same. It's a minor improvement, but I can still tolerate 60Hz fine.

    Differences in refresh rate are a lot more noticeable in VR for me. Going from 90Hz to 72Hz, I can easily see the screen blinking at the lower refresh rate. Even 72Hz to 80Hz is a big enough difference to go from uncomfortable to tolerable.

    I saw a laptop with a 300Hz G-Sync display and it was amazing. I don't know how much flat screens can really benefit from higher rates than that, but I can see it being useful for VR displays.
     
    tunejunky and Venix like this.
  9. mdrejhon

    mdrejhon Member Guru

    Messages:
    128
    Likes Received:
    136
    GPU:
    4 Flux Capacitors in SLI
    Introduction
    My credentials (as Chief Blur Buster) have massively improved over the years, as I have discovered more and more weak links in current tests utilized by mainstream media.

    Firstly, are you aware I am already cited in more than 25 research papers? And have you seen this comment?
    Short Answer:
    Mainstream media are not following the scientific variables published here for maximizing human-visible tests, under which 90% of the mainstream can see 240Hz-vs-1000Hz.

    We recently discovered limitations with the CS:GO test (software jitter, mouse jitter, sync technology jitter, GPU limitations) that max out below currently-available max-Hz monitor refresh rates. In addition, these factors, along with nonzero LCD GtG, blot out more than 80% of the difference of 240Hz-vs-360Hz.

    Trained individuals can tell 240Hz-vs-360Hz, but it's ultra subtle, so not the majority of the population. However, we recently solved the limitations of that test, and discovered use cases where >90% of the population can tell 240Hz-vs-1000Hz in a blind test of well-designed forced-eye-tracking framerate=Hz material (like attempting to read sideways-scrolling text -- the nametags above RTS players, or the street name labels of an infinitely-scrolling map that never stops scrolling, etc). Also, VR exercises parallel use cases like that way more often than a stop-flick-stop-flick shooter like CS:GO.

    We additionally discovered that these weak links are solvable via new framerate=Hz technologies on 0ms-GtG displays, when fixing control-device jitter and software jitter. That's why we recently discovered 120Hz-vs-240Hz is much more human-visible on OLED than 120Hz-vs-240Hz on LCD, now that we've feasted our eyes on OLEDs at DisplayWeek.

    Also, 0ms-GtG 480Hz is available in some technologies like the Christie DLP E-Cinema projector, and there's other lab 0ms-GtG displays at quadruple-digit refresh rates already, which starts to linearly follow Blur Busters Law, unburdened by GtG, unburdened by mouse jitter, unburdened by software jitter, unburdened by VSYNC OFF jitter, etc, etc. This is hugely relevant to VR research.

    Testing forced-eyetracking material (like www.testufo.com/map) on 0ms-GtG displays amplifies refresh rate differences massively more than today's triple-digit-Hz LCDs do.

    We have access to 1000Hz+ prototypes. The bottom line is that many everyday use cases (and future use cases such as VR & Holodecks) show massively more visible improvements, capable of near-perfect blind-test passes at refresh rates far higher than many researchers originally expected. Improved/newer software makes a massive difference when utilizing the perfect-frame-pacing software technology that had to be invented for VR headsets, which further amplifies Hz differences (especially on near-0ms-GtG displays like OLED). Also, many esports players don't eye-track in crosshaired games, so Hz differences actually show up more in certain crosshairless games and other use cases (e.g. VR). Forced-eyetracking tests made a huge difference in Hz-vs-Hz blind tests.

    Given sufficient Hz differentials to punch the diminishing curve of returns (4x-8x Hz differences, like 240Hz-vs-1000Hz), sufficiently fast pixel response (0ms GtG) and sufficiently fast forced-eyetracked material (e.g. like trying to identify stuff inside a blur), passing blind tests among non-gamers reliably in the refresh rate stratospheres recently started to happen in early internal tests at many companies.


    Long Answer:
    Read onwards.

    It's fine to dispute claims, but the evidence is overwhelming now....

    In addition, there are already many TestUFO tests that prove the claim at 60, 120, 240, and 480, and thus it easily extrapolates.

    Also, quadruple-digit refresh rates already exist in laboratory prototypes.

    I also had temporary access to multiple 1000-2000 Hz prototypes, including 1440 Hz vision-research projectors (Viewpixx sells one already), and the science still correctly scales.

    It may be best to reply to my earlier comment, if you see any flaws.

    My high-Hz powerpoints often do a see-for-yourself mic drop for CEOs, project managers, engineers, etc., some of whom were formerly dubious of the refresh rate race.

    Mainstream media are not following the scientific variables published here, for maximizing human-visible tests.

    CS:GO blind tests are useful for CS:GO only, but are not the same thing as testing for "maximum Hz of humankind benefit" (e.g. VR etc) because of multiple CS:GO limitations such as game jitter, mouse jitter, VSYNC OFF jitter (even 1-pixel jitter is worse than the motion blur difference of 144Hz-vs-165Hz at moderate motion speeds!). Also you need to test large Hz differences to compensate for a lot of weak links -- like 60Hz-vs-240Hz or 120Hz-vs-360Hz, or 240Hz-vs-1000Hz (at framerate=Hz).

    Let's give an example of a different, simpler test. Trying to read the street map labels at www.testufo.com/map passes a lot more blind tests when comparing 120Hz-vs-360Hz (a 3x Hz difference), because the test avoids a lot of error margins that are specific to CS:GO. The great news is that use cases such as virtual reality reproduce these differences much better than CS:GO does.

    One great example is that refresh rate doesn't improve CS:GO nearly as much as it does a scrolling RTS game (like DOTA 2 tweaked for better scrolling fluidity matching TestUFO -- there are tweaks available) running motion at framerate=Hz, because of the www.testufo.com/map effect. It's easier to read nametags and identify enemies when doubling the Hz halves the motion blur (excluding the GtG error margin).

    Also, read below:

    In CS:GO, 240Hz-vs-360Hz is almost invisible (the 1.5x blur difference is throttled to roughly a 1.1x blur difference due to slow LCD GtG and high-frequency microjitter caused by software / mouse hardware / mousepad / the chosen sync technology).

    Remember, there are now over a million different TestUFO demos just from the combination of 30 tests multiplied by their customizable parameters -- so I can scientifically show off a lot of display concepts. For example, just look at the TestUFO stutter-to-blur continuum animation. High-frequency stutter (of sample-and-hold) or jitter (of erratic framerates or mouse jitter) can blend into motion blur.

    Very Interesting TestUFO: Stutter-To-Blur Continuum
    (Demo of high-frequency stutters blends to blur. Stare at bottom UFO for 15 seconds)


    It is already mathematically proven that if you perfectly framepace at framerate=Hz on a 0ms GtG display (like tests recently done on 60Hz OLED, 120Hz OLED, and 240Hz OLED), you get MPRT(100%) persistence of one refreshtime (= one frametime) worth of motion blur. The only way to reduce blur further without strobing is to raise the refresh rate and frame rate.
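
    To make that arithmetic concrete, here is a minimal sketch (illustrative Python of my own, not Blur Busters code; the function name and the example tracking speed are assumptions for illustration):

    ```python
    # Minimal sketch of the sample-and-hold blur arithmetic described above:
    # blur (pixels) = eye-tracked motion speed (px/sec) * persistence (sec),
    # where persistence = one frametime = 1 / refresh rate on a 0ms-GtG display.

    def sample_and_hold_blur_px(speed_px_per_sec: float, refresh_hz: float) -> float:
        """Motion blur in pixels for perfectly frame-paced framerate=Hz content."""
        persistence_sec = 1.0 / refresh_hz  # MPRT(100%) when GtG contributes nothing
        return speed_px_per_sec * persistence_sec

    if __name__ == "__main__":
        # Example eye-tracking speed: 4000 pixels/sec (easy to track on large 4K/VR displays)
        for hz in (240, 360, 500, 1000, 4000):
            print(f"{hz:>4} Hz -> {sample_and_hold_blur_px(4000, hz):5.2f} px of blur at 4000 px/sec")
    ```

    At 4000 Hz (0.25ms persistence) this works out to 1 pixel of blur at 4000 pixels/sec, matching the MPRT note further below.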

    Now if you can't framepace perfectly (e.g. erratic stutter or jitter), even 70 erratic microstutters per second on a 360Hz or 500Hz display vibrate so fast (like a fast-vibrating string -- see the TestUFO demo for scientific proof) that they become extra motion blur, worse than the maximum possible clarity afforded by the Hz. So 240Hz-vs-360Hz is diminished further by all kinds of jitter sources (the game itself, the mouse itself, etc.), in addition to slow GtG. So instead of the proper 1.5x blur difference, it's more like a 1.1x blur difference in many real-world games -- not visible.

    So to overcome this, we have to oversample (overkill) refresh rates even more geometrically, e.g. 4x refresh rate differences, especially if we're sticking to LCDs and 1000Hz mice (which have more jitter than 2000Hz+). There's a research paper on mouse jitter showing that 1000Hz is not enough, due to the jittering of mouse Hz versus display Hz:

    [​IMG]

    [​IMG]

    Research paper DOI: https://dl.acm.org/doi/10.1145/3472749.3474783

    *IMPORTANT NOTE: When using more stringent test variables, including display MPRT less than the mouse poll interval, the 4000 Hz boxes can become red squares. However, sufficiently bright 0.25ms-MPRT desktop displays are still a fair time away. 0.25ms = 1/4000sec persistence, translating to 1 pixel of motion blur at 4000 pixels/sec or 2 pixels of motion blur at 8000 pixels/sec. Very subtle, though -- not relevant to 1080p (too fast to eyetrack), but relevant for future 0.25ms-MPRT 4K, 8K, and VR displays where 4000 pixels/sec motion is easy to eyetrack. Also, it's worth noting ultralow MPRTs are already on the market -- the Oculus Quest is already 0.3ms MPRT, via strobing.

    That does not dismiss the fact that internal demonstrations of 240Hz-vs-1000Hz under parameters normally used for VR (VR is always VSYNC ON and perfect framerate=Hz, which amplifies Hz-vs-Hz) have silenced a lot of non-gamers.

    Also on OLED, 120Hz-vs-240Hz is much more visible than on LCD because of OLED's fast GtG. (Practically zeroing-out GtG proved Blur Busters Law even further!).

    Remember, a 240Hz OLED and 500Hz LCDs were shown off at DisplayWeek 2022 at one of the public booths, where I was present.

    [​IMG]

    In closing, it is common for many laypeople to miss the Hz forest for its refresh rate trees. So I will crosspost the different human-vision thresholds, since people are sometimes fixated on things like flicker-fusion thresholds (a low Hz).

    There are many different effects caused by the multiple weak links of the humankind invention of finite frame rates to simulate analog moving images; this science is relevant when using displays to try to perfectly match real life (e.g. VR).

    ---crosspost---

    Many people misunderstand the different sensitivity thresholds, such as "Humans can't see above 75Hz" -- but that is only a flicker threshold. The purpose of this post is to show that there are extremely different orders of magnitude that refresh rate upgrades do address.

    Even in a non-gaming context, one thing many people forget is that there’s many thresholds of detectable frequencies.

    These are approximate thresholds (they vary by human), rounded off to the nearest order of magnitude for reader simplicity in seeing how display imperfections scale.

    Threshold where slideshows become motion: 10
    This is a really low threshold, around 10 frames per second. Several research papers indicate 7 to 13 frames per second, such as this one. This doesn't mean stutter disappears (yet); it just means it now feels like motion rather than slideshow playback.
    Example order of magnitude: 10

    Threshold where things stop flickering: 100
    A common threshold is 85 Hz (for CRTs). Also known as the “flicker fusion threshold”. Variables such as duty cycle (pulse width) and whether there’s fade (e.g. phosphor fade) can shift this threshold. This also happens to be the rough threshold where stutter completely disappears on a perfect sample-and-hold display.
    Example order of magnitude: 100

    Thresholds where things stop motion blurring: 1000
    Flicker free displays (sample and hold) means there is always a guaranteed minimum display motion blur, even for instant 0ms GtG displays, due to eye tracking blur (animation demo). The higher the resolution and the larger FOV the display, the easier it is to see display motion blur as a difference in sharpness between static imagery and moving imagery, blurry motion despite blur free frames (e.g. rendered frames or fast-shutter frames).
    Example order of magnitude: 1000

    Threshold for detectable stroboscopic effects: 10,000
    Where the mouse pointer becomes continuous motion instead of gapped. This is where higher display Hz helps (reducing the distance between gaps) and higher mouse Hz helps (reducing the variance in the gaps). The mouse Hz needs to massively oversample the display Hz to avoid mouse jitter (aliasing effects). If you move a mouse pointer 4000 pixels per second, you need 4000Hz to turn the mouse pointer into a smooth blur (without adding an unwanted GPU blur effect) -- see the sketch after the stroboscopic example below.
    Example order of magnitude 10,000

    An example test of stroboscopic lights:
    [​IMG]
    (From a lighting industry paper, but this has also been shown to hold for stroboscopics on large displays, including VR displays intended to mimic the real world)
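
    As a rough illustration of the gap arithmetic in the stroboscopic threshold above, here is a hedged sketch (illustrative Python; the helper name and numbers are assumptions):

    ```python
    # Rough sketch of mouse-pointer stroboscopic gaps (illustrative only).
    # Without GPU-added motion blur, the pointer is drawn at discrete positions;
    # the gap between consecutive pointer images is speed / Hz, and the gaps stop
    # reading as separate images once they shrink to roughly one pixel.

    def pointer_gap_px(speed_px_per_sec: float, display_hz: float) -> float:
        """On-screen distance (pixels) between consecutive pointer positions."""
        return speed_px_per_sec / display_hz

    if __name__ == "__main__":
        speed = 4000  # pixels/sec, the example speed used above
        for hz in (120, 360, 1000, 4000):
            print(f"{hz:>4} Hz -> {pointer_gap_px(speed, hz):5.1f} px between pointer images")
    ```

    Higher mouse poll rates then reduce the variance in those gaps, which is why the mouse Hz needs to oversample the display Hz.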

    More information can be found in Research Section of Blur Busters.

    Please vet me, rebut me, and peer review me -- I am fully prepared with science, with Ph.D researchers working with me too. The integrity of the refresh rate race depends on people trying to find flaws in scientific theory, so that research can be further improved, to find out more genuine real use cases for Hz.

    Also, as a reminder, remember LCD-vs-LCD (120Hz vs 240Hz) is muddied by LCD GtG. 120Hz-vs-240Hz is much more visible on the OLEDs I've seen on the exhibit floor at DisplayWeek. Parroting self-experience on LCDs overlooks the fact that I've seen thousands of laboratory, prototype, and unreleased displays in various places.

    Remember -- due to limitations of the old-codebase flick-shooter CS:GO, which is otherwise the go-to benchmark for Hz -- the refresh rate race becomes more human-visible in stutterless, jitterless & crosshairless apps and games. That means software that forces you to eye-track ultrasmooth motion at framerate=Hz. That means many use cases other than CS:GO, such as virtual reality (where framerate=Hz is an absolute eye-health, headache-free necessity), map panning, and other ultrasmooth-pursuit motion use cases that amplify Hz visibility. There are great examples of both game and non-game use cases.

    New public papers will be coming (by the mid-2020s), especially as ultra-Hz near-0ms-GtG displays commercialize (witness the newly announced 240Hz OLEDs) and start reaching researchers worldwide for tests that move incrementally closer and closer to retina refresh rate testing.

    The goal by Blur Busters is not "highest Hz that benefits CS:GO on an LCD" (where the mainstream are doing their 240Hz-vs-360Hz tests, bottlenecked by GtG, game jitter, mouse jitter, and VSYNC OFF jitter, blotting out differences of small refresh-rate-difference multiples).

    The goal is "highest Hz of human-visible benefit for the most extreme use cases" -- like a Star Trek Holodeck (VR headsets) or other use cases that amplify non-strobed sample-and-hold Hz differences much more massively. Perfectly matching real life requires ultrafast GtG (0ms) combined with analog-like motion simulated by retina refresh rates. Then once we finally have 0ms GtG, then in addition, very far up the diminishing returns curve, one needs to compare 4x-8x Hz differences, before finally discovering the vanishing point of the diminishing curve of returns. That is the proper intent of Correct Proper Design of a Refresh Rate Blind Test For Such Use Cases.
     
    Last edited: May 28, 2022
    Catspaw, GoldenTiger, HandR and 11 others like this.
  10. Silva

    Silva Ancient Guru

    Messages:
    2,051
    Likes Received:
    1,201
    GPU:
    Asus Dual RX580 O4G
    Having a good CPU cooler and a good case with airflow will reduce that effect.
    Also, I always use a -0.1V voltage offset (undervolt) on my CPU, and my GPU is undervolted as well.
    If your PC is noisy, maybe the fans you use are poop or you have them badly configured.
    Also, don't sit next to the case; put it on the floor?

    PS: if you're more focused on the noise your PC makes than playing/enjoying content, what the hell are you playing?
    PS2: you don't have issues, yet.
     
    Catspaw and cucaulay malkin like this.

  11. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    Hi, could I get some monitor purchase advice from you? And also congratulations on your website & the work you do -- I've been using your discoveries in the "G-SYNC 101" articles since about 2015 (capping framerate below max refresh rate). Mostly I play fps multiplayer games, and I found upgrading from 75Hz to 144Hz an amazing difference (back in 2015), and I noticed an improvement overclocking to 180Hz, albeit a subtle one. I've got this old G-Sync monitor (G2460PG https://pcmonitors.info/reviews/aoc-g2460pg/ ).

    If I were considering a monitor upgrade in a quest for less motion blur, based on what you've been saying it seems important for me to focus on how quick the GtG transition of the monitor is as well as the Hz... and if I've understood the gist of your post, I should probably not settle for anything less than 360Hz to notice a proper difference. Companies don't really accurately state what their GtG is for their monitors, do they? What kind of GtG transition time would I need to look for on a 360Hz monitor to notice a proper improvement vs my current monitor when it comes to blur reduction, and how much blur reduction could I expect from that upgrade? OLED looks like it would be the best of all worlds, but that's gonna be super expensive even if it does exist at 360Hz... what advice would you have for me re other technologies like TN vs IPS vs VA? Traditionally my understanding is that TN screens offer the best blur-free experience of those 3 technologies (albeit with other weaknesses such as viewing angle / colour reproduction)? Perhaps my questions might help other gamers on here upgrade sensibly too.

    EDIT: it also seems, from the table you showed, that a 4000Hz mouse would reduce jitter when paired with a 360Hz monitor. Is that an applicable factor in fps-style games, or were you referring to that more in other usage scenarios?

    EDIT #2: is there a formula a person can use to combine refresh rate with GTG time when comparing two monitors to work out the motion blur reduction?
     
    Last edited: May 26, 2022
  12. mdrejhon

    mdrejhon Member Guru

    Messages:
    128
    Likes Received:
    136
    GPU:
    4 Flux Capacitors in SLI
    Unfortunately, LCD GtG is not simple:
    1. There is a VESA 10%-to-90% standard, due to oscilloscope noisefloor measurements. This excludes the still-human-visible artifacts below 10% and above 90%.
    2. On LCDs, different colors have different GtG speeds. This is worse on some panels (e.g. VA panels)
    3. GtG is a curve that can be of different shapes for the same GtG value;
    4. There are two pixel response benchmarks, GtG and MPRT. See Pixel Response FAQ: GtG versus MPRT

    [​IMG]

    Still, regardless, GtG is a major error margin in refresh-rate difference tests.

    That's why 120Hz-vs-240Hz is massively more noticeable on OLEDs than on LCDs.

    One option to reduce GtG from being human visible, is to use a strobe technology like ULMB, DyAc, ELMB, VRB, PureXP, etc.

    Then use additional tricks such as refresh rate headroom to hide even more of the LCD GtG in the blanking interval between refresh cycles -- hide LCD GtG in the total darkness of the dark period between strobe-backlight flashes -- e.g. 120Hz strobing on a 240Hz monitor looks much better than 120Hz strobing on a 144Hz monitor. This is because a 120Hz refresh cycle can be configured to refresh in 1/240sec, leaving a longer pause between refresh cycles to let pixels finish more of their GtG in the dark before the strobe backlight flashes.
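
    As a rough back-of-the-envelope of that refresh-rate-headroom trick, here is a hedged sketch (illustrative Python with made-up timings; real scanout and strobe behaviour varies per monitor):

    ```python
    # Rough sketch of how refresh rate headroom hides LCD GtG in the dark period
    # between strobe flashes (illustrative timings only).

    def dark_settle_time_ms(strobe_hz: float, scanout_hz: float, pulse_width_ms: float) -> float:
        """Time per strobe cycle left for GtG to finish in darkness before the backlight flash."""
        strobe_period_ms = 1000.0 / strobe_hz  # e.g. ~8.33 ms per cycle at 120Hz strobing
        scanout_ms = 1000.0 / scanout_hz       # the refresh is scanned out at the panel's native speed
        return strobe_period_ms - scanout_ms - pulse_width_ms

    if __name__ == "__main__":
        # 120Hz strobing with a 1ms pulse: 240Hz-capable panel vs 144Hz panel
        print(f"240Hz panel: {dark_settle_time_ms(120, 240, 1.0):.2f} ms of dark GtG-settling time")
        print(f"144Hz panel: {dark_settle_time_ms(120, 144, 1.0):.2f} ms of dark GtG-settling time")
    ```

    Under these illustrative numbers, the 240Hz-capable panel leaves several times more dark time for pixel transitions to finish before the flash, which is why 120Hz strobing tends to look cleaner on it.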

    One of the better ones is the XG2431 that I worked on for Blur Busters Approved 2.0 which supports the most advanced version of Strobe Utility:

    [​IMG]
    (see XG2431 strobe tuning HOWTO)

    *Disclosure: I worked with ViewSonic to implement highly-tunable motion blur reduction (strobe backlight), to meet the specifications of Blur Busters Approved 2.0. At the best tuned settings, it can produce clearer strobe quality than many other technologies such as NVIDIA ULMB.

    There are many fast-GtG options, such as the BenQ XL2546 series and others, although none as zero-GtG as an OLED. It's all a pick-your-poison situation that requires you to make compromises.

    All usage scenarios.

    I can see 1000Hz-vs-8000Hz easily at the Windows desktop.

    Real long-exposure photographs, representing what I see with my eyes (mouse arrow stroboscopic effect similar to www.testufo.com/mousearrow ...)

    1000Hz mouse poll rate on my 360Hz monitor:

    [​IMG]

    8000Hz mouse poll rate on my 360Hz monitor:

    [​IMG]

    So it creates human visible differences during things like:
    - Mouse pointer
    - Map panning
    - Photo panning
    - Dragging a window
    - Etc

    It's even pretty visible (to me) for photo editing -- the times I need to fast-pan around a large canvas! I noticed the difference there too, because of the lack of jitter between mouse Hz and display Hz. I'm very sensitive to jitter -- but not everyone is.

    ___________

    On a related topic: some sync technologies can amplify the visibility of mouse limitations...

    Note that the Windows desktop is VSYNC ON, which makes Hz-vs-Hz easier to tell -- that is an important part of the recommended scientific test variables for blind tests targeted at determining a refresh rate threshold.

    Also, 360Hz VRR makes 1000Hz mouse limitations more visible too, so VRR makes pollrate limitations more visible than VSYNC OFF does. You can still see 1000Hz mouse jitter during VSYNC OFF, but it is more visible in VRR, and then even more visible in VSYNC ON. So the choice of sync technology in scientific testing is a huge consideration in retina refresh rate testing for blind-test Hz-vs-Hz comparisons.

    This is because VSYNC OFF jitter is an error margin that affects small Hz-vs-Hz differences like 240Hz-vs-360Hz -- yet another different error margin than just LCD GtG that blots out non-highly-geometric differences in refresh rates.

    VSYNC OFF is great for esports latency but is fundamentally a humankind band-aid, because VSYNC ON latency is finite. The great news is that VSYNC ON latency approaches closer and closer to zero when you're hitting 1000Hz+.

    Also, VSYNC ON is mandatory in VR apps, since VSYNC OFF creates discomfort in virtual reality and any Holodeck/reality use cases (sometimes as bad as nausea/headaches).

    This is because you're trying to simulate real life, and the giant IMAX-sized FOV amplifies VSYNC OFF issues. Real life doesn't have jitter or tearing, and VR is attempting to simulate real life.

    VR can be very dizzying and motion-sickness inducing otherwise, and getting VR comfortable for average users required a lot of optimizations that are far ahead of CS:GO and other typical esports apps. The amazing software optimizations that VR developers put into their VR apps amplified Hz differences too (at least when displayed on sample-and-hold displays). Note that most VR headsets are pulsed/strobed, though. But the software skills are applicable to software that tests for retina refresh rate thresholds.
     
    Last edited: May 27, 2022
    GoldenTiger, Robbo9999, HandR and 4 others like this.
  13. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,050
    Likes Received:
    7,382
    GPU:
    GTX 1080ti
    it's actually not; the 600 marketing number refers to subfield calculations.

    your TV is 60Hz, and has 10 subfields.
     
    mdrejhon and tunejunky like this.
  14. mdrejhon

    mdrejhon Member Guru

    Messages:
    128
    Likes Received:
    136
    GPU:
    4 Flux Capacitors in SLI
    The refresh rate of the Christie Digital DLP projector at the local movie theater is 2880 Hz mirror pixel refresh.

    DLP uses tiny mirrors that flip on and off rapidly many hundreds of times per second (minimum) to generate greyscales.

    But it is 1-bit, on vs off. So it needs many mirror refresh cycles per image refresh cycle to temporally dither 1-bit monochrome into 30+ bit color, through three DLP chips (one per R, G, B color), all refreshing many times per "real" refresh cycle. This is a "temporal dithering" technique.

    Marketing often use subrefresh numbers which are higher, but are not true native frames at native refresh rate.

    Plasma had great motion quality (especially the Pioneer Kuro and similar) due to the way they clustered together all those subfield refreshes and then let the phosphor behavior blend it all together, with a phosphor fade-out between real image refresh cycles. Effectively, it blended into one big bright pulse to the human eyes per 60 Hz refresh, so 60 Hz flicker was visible.
     
    Last edited: May 27, 2022
  15. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,962
    Likes Received:
    1,246
    GPU:
    .
    make you pay more instead of fixing current LCD issues
     

  16. iNerd

    iNerd Master Guru

    Messages:
    226
    Likes Received:
    69
    GPU:
    EVGA RTX 3080 Ti
    PS: since my know-how on this subject is very limited, I just wanted to say thank you -- I'm really enjoying reading this thread. I learned so many new things, I really feel richer! :)
    Also, I was very happy to find, after changing my polling rate according to the example posted here, that what I thought might be a short system lag while something loaded actually came from bad polling, since I could not reproduce the issue afterwards. Thank you!
     
    Last edited: May 27, 2022
  17. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    Thanks for such a well thought out & detailed reply!

    So, based on what you're saying I think I've come to the following conclusions & questions:
    -gonna rule out VA panels due to GtG issues; are IPS and TN on an equal footing here?
    -use ULMB for the most motion blur reduction (I can read the street names on your UFO map scrolling test at 1440 pixels per second as an upper limit using ULMB on my current monitor (this test: https://www.testufo.com/photo#photo...ursuit=0&height=0&stutterfreq=0&stuttersize=0 ) - is that a good performance, and can I expect any better result with a better monitor?
    -have you got any more monitors coming out that are gonna be fine-tuned by yourself in terms of the ULMB (or other strobing implementation), and will they be available in UK/Europe?
    -I should consider a 360Hz or 240Hz monitor to use with ULMB at a lower Hz level, e.g. 120Hz, for best ULMB results - better enabling all pixel transitions to occur while the backlight is off.
    -If not using ULMB (or other strobe tech), then an OLED 240Hz monitor would be excellent due to low GtG... do they make OLED monitors that include ULMB? I think they wouldn't, because the GtG is already happening nearly instantly, so there's no need to hide pixel transitions by turning off a backlight?
    -Main overall conclusion: I think my next monitor would be a 240Hz OLED or 360Hz ULMB (or other strobe tech).

    -mice/jitter: ergonomics and feel are most important; you can't overclock 1000Hz mice to 4000Hz, or can you? Otherwise the mouse choice at 4000Hz is limited; last I heard, 2000Hz was the highest I remember being advertised. Last time I looked into mouse overclocking I vaguely remember a few drawbacks associated with it.
     
  18. mdrejhon

    mdrejhon Member Guru

    Messages:
    128
    Likes Received:
    136
    GPU:
    4 Flux Capacitors in SLI
    More or less similar nowadays (for post-2020 FastIPS panels).
    However, it's rumored that the 480Hz E-TN display has even faster GtG than current TN and FastIPS panels.

    The manufacturer claims 60% faster, but real-world GtG tests need to bear that out. Even if the real-world GtG of all colors is 60% faster than the real-world GtG of other monitors, that's a big boost to the quality achievable at 480 Hz.

    If you reduce the ULMB Pulse Width in your menus down to 30 or 50, you can read the map labels at 3000 pixels/sec. Give it a try!
    Also, there's an easier map test link: www.testufo.com/map because it's the most popular ULMB test.

    The main thing is to get more photons per backlight pulse, and very few monitors can do it well, e.g. BenQ DyAc can still achieve 300 nits while having less than 1ms MPRT.


    No news at this time, although if you're willing to take a risk with the Eve Spectrum, that 4K 144Hz monitor supports the free Eve Strobe Utility. They implemented the strobe tuning API that Blur Busters now recommends all monitor manufacturers add. Currently, that's the highest-resolution panel that supports a strobe tuning API.


    Yes, that is a general rule of thumb. Note that NVIDIA's flavour of ULMB often only offers you 2 or 3 refresh rates, rather than the "any refresh rate" capability of certain other brands of strobe tech. If this is a buying consideration, take heed of the specs of the strobe backlight. There are many good ones out there, and many not-so-good ones.


    No, since MPRT still produces motion blur.

    Even 0ms GtG produces display motion blur, as seen in Why Do Some OLEDs Have Motion Blur?

    There are two different pixel response benchmarks, GtG and MPRT, see Pixel Response FAQ: GtG versus MPRT.

    GtG is pixel transition time.
    MPRT is more akin to pixel visibility time.
    Briefer pixel visibility time (via impulsing or via higher frame rate) = less motion blur.

    As your eyes track the motion in analog fashion, they are in different positions at the beginning and end of the pixel visibility time. That smears the stationary pixels across your retinas (sample-and-hold refresh cycles are moments of briefly static images flipbooking into each other).

    Motion clarity for framerate=Hz for:
    ....impulsed displays (ULMB) is dictated by pulse width -- aka the length of the strobe flash
    ....sample-and-hold displays is dictated by frametime & refreshtime.

    Once you zero-out GtG error margin, the motion blur math of strobed and non-strobed is the same:
    - 0ms GtG with 1ms of frame flash (strobe) = 1 pixel of motion blur per 1000 pixels/sec
    - 0ms GtG with 1ms in consecutive frames (1000fps 1000Hz) = 1 pixel of motion blur per 1000 pixels/sec.

    [​IMG]

    [​IMG]

    Old strobe backlights like LightBoost were 2ms flashes (2ms MPRT). That can be equalled by 480fps 480Hz for LightBoost-clarity with zero strobing. Although LCD GtG is an error margin, I am expecting that the 500Hz monitors have the best motion clarity achievable without any form of strobing.

    Although a 240Hz LCD will have clearer motion than a 60Hz OLED (a huge refresh rate difference multiple), the new 240Hz OLEDs are clearer than 360Hz LCDs (a smaller refresh rate difference multiple). This is because the LCD GtG can overwhelm smaller refresh-rate-difference multiples.

    Remember doubling framerate=Hz to halve motion blur is only a perfect halving when GtG=0ms. GtG adds extra blur that reduces Hz-vs-Hz differences of the sample and hold effect.


    Sample-and-hold 240Hz OLED will never have less than 1/240sec = 4.2ms of motion blur unless strobing or BFI is added.

    The Oculus Quest 2 VR headset has 0.3ms of motion blur. And most adjustable-pulse monitors can be adjusted to 0.5ms of motion blur or less (the "ULMB Pulse Width" setting in your monitor menus is an example).

    Let's take TestUFO Map at 3000 pixels/sec as an example, www.testufo.com/map#pps=3000 ... This is easy to eye-track on a 1440p display. Now, reading the street name labels successfully is not easy at 1ms MPRT of default ULMB setting. You now need to adjust the setting called "ULMB Pulse Width". You need 0.5ms MPRT (or less) to more easily read the street name labels on this TestUFO panning map test at 3000 pixels/sec.

    Impulsed / Strobe Backlight:
    1ms ULMB pulse = 3 pixels of forced display motion blur at 3000 pixels/sec
    0.5ms ULMB pulse = 1.5 pixels of forced display motion blur at 3000 pixels/sec
    0.25ms ULMB pulse = 0.75 pixels of forced display motion blur at 3000 pixels/sec

    Sample-and-hold OLED or other "near 0ms GtG" non-strobed display:
    OLED matching 1ms pulse ULMB in map test = 1000fps 1000Hz strobeless
    OLED matching 0.5ms pulse ULMB in map test = 2000fps 2000Hz strobeless
    OLED matching 0.25ms pulse ULMB in map test = 4000fps 4000Hz strobeless
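
    A minimal sketch tying those two lists together (illustrative Python; the helper names are assumptions, and it assumes ~0ms GtG so persistence alone sets the blur):

    ```python
    # Minimal sketch of the persistence arithmetic in the two lists above
    # (illustrative helper names; assumes ~0ms GtG).

    def blur_px(pulse_width_ms: float, speed_px_per_sec: float) -> float:
        """Forced display motion blur in pixels for a given persistence (MPRT) and tracking speed."""
        return speed_px_per_sec * pulse_width_ms / 1000.0

    def equivalent_strobeless_hz(pulse_width_ms: float) -> float:
        """Strobeless framerate=Hz with the same persistence as a given strobe pulse width."""
        return 1000.0 / pulse_width_ms

    if __name__ == "__main__":
        for pulse_ms in (1.0, 0.5, 0.3, 0.25):
            hz = equivalent_strobeless_hz(pulse_ms)
            print(f"{pulse_ms:.2f} ms pulse -> {blur_px(pulse_ms, 3000):.2f} px blur at 3000 px/sec, "
                  f"~{hz:.0f} fps {hz:.0f} Hz strobeless equivalent")
    ```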

    For example, you'd need 3333fps 3333Hz OLED to match the motion clarity of an Oculus Quest 2 LCD -- it's quite apparent John Carmack did an excellent job on that VR LCD, and it's one of the best strobed LCDs since it can do a fairly bright job of strobing at 0.3ms pulse width (0.3ms MPRT100%).

    You can notice that the ULMB becomes dimmer with shorter pulse widths. Some monitors can strobe brighter at shorter pulse widths, but I haven't yet seen any monitor strobe as brightly as an Oculus Quest 2 VR headset can at 0.3ms MPRT -- so some VR LCD technologies are ahead (at the moment).

    However, OLEDs have amazing color, perfect blacks, and consistent pixel response for all color combinations, with no ghosting and no coronas. The motion blur is much more "comfortable", and it's possible to prefer 240fps 240Hz OLED over 480fps 480Hz LCD, because of sheer GtG purity.

    However, this does not mean a 240Hz OLED is capable of having less than 4 pixels of motion blur at 1000 pixels/sec if the OLED is unable to strobe (no BFI feature).


    There are already true 8000Hz mice available now, including the Razer Viper 8KHz and the new Corsair Sabre Pro. Both are true native non-overclocked 8000Hz, with selectable 4000, 2000, 1000, 500, and 125 Hz modes.

    Just underclock to 2000Hz so you don't have a too-demanding mouse (not all systems and games can handle 8000Hz from a mouse)

    I highly recommend them for any ULMB user that would like the most perfect jitter-free strobed motion possible, since mouse jitters are unusually amplified during ULMB-like modes.
     
    Last edited: May 28, 2022
  19. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    Good, that gives me more options then, and IPS are generally better in contrast / colour / & viewing angles, so I'll consider both IPS (newer panels) & TN for my next monitor, whilst definitely ignoring VA.

    I reduced the Pulse Width down to 30 and I still can't read 3000 pixels/sec - I mean, I've got a 1080p monitor so the street names spend less than a second shooting across the display, so I think my eyes are just finding it hard to latch onto a single item to track out of the initial blur when you first look at a fixed point on the screen. I can do 2400 pixels per second though (sometimes, but not totally reliably), so reducing the Pulse Width to 50 seems to have let me progress from 1440 to 2400.
    And in the TestUFO ghosting test (https://www.testufo.com/ghosting#ba...on=1920&pps=2400&graphics=bbufo.png&pursuit=1 ), the UFOs look pretty clear with 3 eyes at 2400 pixels per second if I reduce it down to just one UFO crossing the screen at a time; otherwise my eyes can't latch onto one. I can see some subtle ghosting in the middle track, and some stronger ghosting in the bottom track. I'd say my G2460PG is doing OK at Pulse Width 50, but it is quite dim if light is shining into the room at the wrong time of day, and the ghosting is not optimal... so I'd say this monitor's fine apart from those 2 things; if I upgrade I'd be looking to remove the ghosting & increase the brightness.

    144Hz isn't really a step up from my current monitor, and 4K is just too hard to drive to proper fps, so I want to stick with 1080p when shooting for these high fps. I don't want to be reducing gaming image quality to the lowest settings; I don't mind compromising some settings to below max, I just don't want to have to stick everything on minimum to try to get decent high frame rates on a 4K display.

    "Any refresh Rate" capability sounds great! Although I might miss G-sync if they're not offered with that capability. What are the good brands of strobe tech for "any refresh rate" capability? And about the backlight, how do you know if it's a good backlight from the specs, what should you look for? And I don't remember seeing specs for backlights when viewing monitors to buy, so how do you delve into finding out what backlight is being used & the specs for it?

    So strobing technology is really vastly superior in the current landscape of what's on offer for reducing motion blur. It just means trying to use Scanline Sync in RTSS and making sure game settings allow you to hit the monitor refresh rate 100% of the time. A monitor that has both G-Sync or FreeSync as well as strobing at least allows you to choose which games are best suited to each. Where would my G2460PG monitor sit in your motion blur table for impulsed displays when it's at Pulse Width 50 (or other Pulse Widths)? (I don't know if it's possible to work that out.) You mentioned the default ULMB setting (Pulse Width 100) was equal to 1ms MPRT and a 1ms ULMB pulse, so does that mean Pulse Width 50 is always equal to a 0.5ms ULMB pulse, and so on and so forth? Would that hold true for all ULMB monitors? If so, that's a very good way to know exactly how much motion blur you're getting from any ULMB monitor, assuming the GtG is happening quickly enough?

    The Razer Viper 8kHz looks pretty good in size & shape, might be OK for me, maybe a tad long. Something I'd have to look into on the Sabre too, as they don't seem to list the dimensions for it. I feel more confident that in the future there'll be one that's perfect for me... I hope they continue to release more models at these high Hz rates. Previously I had thought it nonsense, but I can see now from your explanations that there is a good reason to have these high-Hz mice. If I move on buying an upgraded monitor sooner rather than later, then the Razer Viper 8kHz might do it... but I'm not moving on it right now; I'll experiment a bit more with the reduced ULMB pulse width on my G2460PG, because I've not really used it at reduced Pulse Width settings... it might just be good enough, although I'm aware of the reduced brightness & ghosting (more so the reduced brightness), but I'll see how it goes.
     
  20. iNerd

    iNerd Master Guru

    Messages:
    226
    Likes Received:
    69
    GPU:
    EVGA RTX 3080 Ti
    This may sound funny to some of you, but YOU CAN OVERCLOCK A MOUSE?! Need to google that asap! :)
    PS: according to the picture provided, it would mean that with a 360Hz Asus PG259QN I should overclock the max 1000Hz to either 4 or 8K in order to be in the "green" area?
     
