Findings of RTSS on new 240Hz monitor

Discussion in 'MSI AfterBurner Application Development Forum' started by mdrejhon, Apr 11, 2017.

  1. mdrejhon

    mdrejhon Member Guru

    Messages:
    117
    Likes Received:
    79
    GPU:
    4 Flux Capacitors in SLI
    Hello,

    In the last several weeks alone, four 240Hz monitors have arrived on the market, two of them GSYNC: Acer Predator XB252Q (GSYNC 240Hz), ASUS ROG PG258Q (GSYNC 240Hz), AOC AGON AG251FZ (FreeSync 240Hz), and BenQ ZOWIE XL2540.

    For the last many months, Jorim has been testing RTSS+GSYNC input lag via high-speed camera and comparing it against other configurations, to avoid the lag-increase effect of suddenly slamming against the GSYNC limit (as measured by the old Blur Busters G-SYNC Preview #2 and reconfirmed by other sites).

    Some games, like CS:GO, allow you to cap the framerate internally (fps_max), but not all games do -- this is where RTSS usefully comes into play...

    Now, some preliminary findings observed with the new 240Hz monitors when combined with RTSS.

    It appears that a frame cap of approximately 220-230 avoids the lag-change effect (sudden changes in lag are more annoying than consistent lag). People have been using 138-142 caps on 144Hz GSYNC monitors to prevent the sudden lag increase (sudden VSYNC ON lag) upon hitting the VRR monitor's 144fps framerate limit. Not yet sure how far below 240 is needed, but 230 appears to work well. This probably also applies to other VRR technologies, including FreeSync, but that is currently untested.
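
    As a minimal sketch of the rule of thumb implied by those numbers (the ~4% margin is my own inference from the 138-142 and 220-230 ranges quoted above, not an official figure):

```python
def suggested_vrr_cap(max_hz: float, margin: float = 0.04) -> int:
    """Frame cap a few percent below the VRR ceiling, so the framerate
    never slams into the G-SYNC/FreeSync limit (sudden VSYNC ON lag).
    The 4% default margin is an assumption inferred from the post."""
    return int(max_hz * (1.0 - margin))

print(suggested_vrr_cap(144))  # 138 -- within the popular 138-142 range
print(suggested_vrr_cap(240))  # 230 -- within the observed 220-230 range
```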

    NOTE: This necessitates manually editing the RTSS config file to go beyond 200, as the UI only allows up to 200.
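
    For reference, a hedged sketch of what that manual edit might look like -- in the RTSS versions I've seen, the limit lives in a profile file (e.g. Profiles\Global) under the RTSS install folder; exact file names and keys may vary by version, so treat this as illustrative only:

```ini
; Profiles\Global in the RTSS install folder (illustrative; verify against your version)
[Framerate]
Limit=230
```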

    Just a heads up, for 240Hz+GSYNC+RTSS

    Cheers
    Mark
     
    Last edited: Apr 11, 2017
  2. dr_rus

    dr_rus Ancient Guru

    Messages:
    2,983
    Likes Received:
    360
    GPU:
    RTX 2080 OC
    I have a hard time believing that anyone in humanity would be able to notice any lag changes when running at 230 fps -- unless we're talking about something like 10 frames of input lag, which should not be the case with just vsync kicking in.
     
  3. jiminycricket

    jiminycricket Master Guru

    Messages:
    203
    Likes Received:
    4
    GPU:
    GTX 1080
    That's beside the point.
     
  4. dr_rus

    dr_rus Ancient Guru

    Messages:
    2,983
    Likes Received:
    360
    GPU:
    RTX 2080 OC
    No, it's not. If nobody can actually notice the lag then there's no issue with the inability to set the cap at higher than 200 fps.
     

  5. jiminycricket

    jiminycricket Master Guru

    Messages:
    203
    Likes Received:
    4
    GPU:
    GTX 1080
    Wow you are dense.
     
  6. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,849
    Likes Received:
    243
    GPU:
    EVGA GTX 1080Ti SC
    And what sort of evidence do you have?
     
  7. mdrejhon

    mdrejhon Member Guru

    Messages:
    117
    Likes Received:
    79
    GPU:
    4 Flux Capacitors in SLI
    Just 1 frame of input lag can make it much more likely that a game action (fire button) gets rounded off to the previous tick (e.g. 125Hz tickrate in CS:GO, 8ms granularity). Being half a tick early still leaves roughly a 50% chance of rounding off to the previous tick cycle (all other things being equal, though algorithms in certain games can attempt to partially compensate for latency advantages).
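
    That tick-rounding argument can be sanity-checked with a small Monte Carlo sketch (a minimal illustration; the 8ms tick and the uniformly random click phase are my assumptions, not anything from CS:GO's actual netcode):

```python
import random

TICK_MS = 8.0  # ~125 Hz tickrate, as mentioned above

def fraction_pushed_to_next_tick(extra_lag_ms, trials=100_000, seed=42):
    """Estimate how often an extra slice of input lag pushes a click
    past a tick boundary, i.e. onto the next server tick."""
    rng = random.Random(seed)
    pushed = 0
    for _ in range(trials):
        t = rng.uniform(0.0, 1000.0)  # click time within one second, in ms
        if int((t + extra_lag_ms) // TICK_MS) > int(t // TICK_MS):
            pushed += 1
    return pushed / trials

# 4 ms of extra lag (half a tick) flips roughly half of all clicks:
print(fraction_pushed_to_next_tick(4.0))  # ~0.5
```

    In other words, the expected fraction is simply extra_lag / tick_length, matching the "roughly 50% chance" figure above for half a tick of lag.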

    Like the Olympics, you don't need to feel the millisecond to cross the finish line 1ms or 2ms or 3ms ahead of your competitor.

    With the well-matched reaction times of top players (spreads of under 10ms difference in human reaction time), the shoot-first outcome of simultaneous-draw situations is tantamount to crossing the finish line in the 100 meter sprint...

    You both go around a corner in an FPS shooter, you see each other at the same time, you draw guns at the same time, your average human reaction times are less than 5ms apart (say, 151ms versus 152.5ms benchmarked reaction time) -- then who gets the frag? In this situation, milliseconds do matter. Especially when competing on kitted-out computers with 1080Ti's running old game engines, with their ability to hit ultra-high framerates (500fps+) with low jitter and low button-to-pixels latency.

    While it usually doesn't matter to enthusiast FPS gamers like me, I totally respect that a full refresh cycle (4ms) can definitely make or break things in eSports leagues, in the world of well-matched reaction times. Between well-matched players (near-equal hardware, low-jitter engines like old games, equal human reaction times), one can still notice their frag rate suddenly go down in 'draw' situations, etc. It doesn't show in just one cycle, but over many games you notice something seems off.

    Also, people train for a specific lag in a game. If the lag fluctuates, it is equivalent to your reaction time being off. And also in the world of cross-the-finish-line effects of simultaneous-draw situations -- it can be tantamount to randomly being suddenly teleported forward or backwards by 4 milliseconds during a 100 meter Olympics sprint, often the difference between 1st (win frag) and 2nd place (lose frag).

    Also -- regarding framerate capping: the frametime at a 200fps cap is 5ms, while at a 1000fps cap it is 1ms. That's a whole 4ms difference in frametime -- big enough to statistically affect the cross-the-finish-line outcome of simultaneous draws in the world of well-matched reaction times (e.g. 151ms vs 152ms vs 152.5ms vs 153ms on a specific human-reaction-time benchmark utility -- a very narrow spread).
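
    For concreteness, the frametime arithmetic behind those numbers (simple math, nothing game-specific):

```python
def cap_frametime_ms(fps_cap: float) -> float:
    """Time budget per frame at a given framerate cap."""
    return 1000.0 / fps_cap

print(cap_frametime_ms(200))   # 5.0 ms per frame
print(cap_frametime_ms(1000))  # 1.0 ms per frame -- a 4 ms frametime difference
```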

    Some games do malfunction (latency-wise) at high framerates (especially if you hit specific bottlenecks, such as the CPU), but this is no longer an issue in many eSports titles that can run at several hundred frames per second.

    Obviously, we're talking about the well-matched human reaction times of eSports, rather than random players off the street tested for a survey. But when you survey top leagues (like Olympic sprinters), the human-reaction-time spread between competitors is actually tight -- mere milliseconds apart.

    Generally, most eSports players do not use GSYNC or VRR technologies -- they prefer VSYNC OFF, as extreme framerates far beyond the refresh rate (e.g. 500fps @ 144Hz) do reduce input lag versus 144fps@144Hz, by reducing frame render times. GSYNC/FreeSync often forces framerates lower than the GPU's capability, and you have the added disadvantage of the latency-change effect of slamming against the GSYNC/FreeSync limit (unless you use the automatic VSYNC OFF mode found in newer drivers upon hitting the GSYNC limit). However, this might change with 240Hz monitors and higher GSYNC/FreeSync caps that approach the framerates used in eSports.

    Regardless --

    We can nitpick about whether milliseconds matter, but one also needs to understand the difference between actually seeing/feeling the millisecond, versus winning by the millisecond (without feeling it) -- the 'cross-the-finish-line' effect of 'simultaneous-draw' situations. Being a few milliseconds earlier, like crossing an Olympic 100m sprint finish line first, can win the race. As a result, you do not need to 'feel' the milliseconds in order to win by the milliseconds. Anyone who claims one millisecond doesn't matter in the 100 meter Olympic sprint is lying. Bottom line -- milliseconds matter to professional competitive gamers -- but not in the way many think.
     
    Last edited: Apr 13, 2017
  8. dr_rus

    dr_rus Ancient Guru

    Messages:
    2,983
    Likes Received:
    360
    GPU:
    RTX 2080 OC
    That's a great post, mdrejhon. But why are you suddenly talking about 1000 fps vs 200 when the OP was about vsync at 240 fps vs 230 fps with no vsync? Vsync at 240 fps will produce approximately 4 additional milliseconds of lag (8 with triple buffering). You claim that this is noticeable? I really don't think so.

    The fact that vsync kicking in may result in some game engine introducing additional lag into the pipeline, due to how the engine is designed, is an issue with that game, not vsync itself. It's also mostly irrelevant since, as you've said, such esports games are rarely played with any x-sync anyway.

    So - what's the problem?
     
  9. lexer98

    lexer98 Master Guru

    Messages:
    667
    Likes Received:
    1
    GPU:
    GTX 1070 - WC
    Sorry, not really "on-topic", but can you feel the difference between 144Hz and 240Hz?

    I mostly use "graphics designer" monitors at work and at home; between those monitors and a 144Hz one, the difference is night and day. But, for example, 144Hz vs 165Hz is the same to me (same monitor, overclocked).
     
  10. RealNC

    RealNC Ancient Guru

    Messages:
    3,309
    Likes Received:
    1,509
    GPU:
    EVGA GTX 980 Ti FTW
    240Hz has less motion blur for games where you can reach 240FPS (like Counter-Strike.)

    You can't feel a difference in input lag. You can see a difference in motion blur.

    It's the reason there are "240Hz" TVs. They take a 60Hz input and interpolate to 240Hz in order to lower motion blur.
     

  11. Andy_K

    Andy_K Master Guru

    Messages:
    596
    Likes Received:
    103
    GPU:
    MSI GTX 960 OC
    1. If this happened for real, both would be killed, not only one.
    2. How many of these top players are there?
    3. How many of these top players use MSI AB/RTSS during official tournaments, where it really counts?

    I'm not saying this is not an issue, and I'm not saying it is; what I am saying is: is it really an issue for a noticeable group of users to care about?

    ROFLMAO :heh: :funny:
    You really made my day
     
    Last edited: Apr 13, 2017
  12. Unwinder

    Unwinder Moderator Staff Member

    Messages:
    15,314
    Likes Received:
    2,612
    Np, I'll extend GUI limit to 300 in the next version.
     
  13. mdrejhon

    mdrejhon Member Guru

    Messages:
    117
    Likes Received:
    79
    GPU:
    4 Flux Capacitors in SLI
    It doesn't have to be noticeable in order to win draw battles. There's a difference between feeling/noticing the lag, versus winning with it.

    I only mention this for argument's sake, to explain the need to keep an open mind. To me, 4ms doesn't matter. But that doesn't mean I can categorically say it doesn't matter to the whole world.

    The example of 1000fps is mentioned merely because some eSports players use uncapped framerates with older engines (e.g. CS:GO); the higher the framerate, the lower the lag per frame -- even at the same refresh rate (the lag of 1000fps@144Hz is lower than 144fps@144Hz).

    GTX 1080's on CS:GO do that regularly, so it's another way of reducing lag without raising the refresh rate. How the tear slices come out, and how they reduce lag, is something few people mathematically understand. 1000fps means 1000 tear slices per second (each only 1ms fresh as scanned out onto the display), rather than being forced to wait, say, 1/144sec (6.9ms) for the next refresh cycle to begin. Whether these milliseconds matter or not is another story (but already explained).

    Getting 240Hz by interpolation is useless for games. It often adds too much lag.

    That said, interpolation is a method of reducing motion blur by reducing frame visibility time. 240Hz interpolation can have approximately 1/4th the blur of 60Hz -- but often with some annoying side effects (e.g. artificial-looking soap opera effect and lag) -- making it useless for video games.

    However, Oculus has come up with a nearly lagless interpolation method called "reprojection" -- to convert 45fps to 90fps@90Hz during VR.

    Anyway, this is a scientific topic for a completely different, offtopic discussion...
    ....but yes, I agree with you, 240Hz "fake-frames" TVs are a joke for games. :D


    I'll briefly get off topic, since it's all kinda under a shared umbrella of "Is 240Hz worth it?".

    144Hz vs 165Hz is only a ~15% motion-blur difference (165/144 ≈ 1.146). However, 144Hz vs 240Hz is a bigger difference (240/144 ≈ 1.667). But you definitely lose color quality (TN...) by moving away from the beautiful 165Hz IPS monitors, so you definitely want to stick with your monitor.

    Now, as a rule of thumb, motion blur is directly proportional to frame visibility time -- as seen in TestUFO Eye Tracking (I'm the inventor of TestUFO -- I authored this test). As you track with your eyes, they are in a different position at the beginning of a refresh cycle than at its end.

    Pixel response is part of it, but since pixel response time is a tiny fraction of a refresh cycle nowadays, persistence ("MPRT", the term used in Google Scholar) -- the blur caused by eye tracking, as seen at www.testufo.com/eyetracking -- is the dominating factor in motion blur today.

    At 144Hz or 165Hz you still can't read the street-name labels of the TestUFO Moving Map Test, because that pattern moves at 960 pixels per second. On a sample-and-hold 144Hz display (ULMB turned off), your eyes track about 7 pixels in 1/144sec while following a 960 pixels/second object. That's ~7 pixels of motion blur caused by eye tracking (as again explained in the TestUFO Eye Tracking animation).

    Reducing motion blur is done by shortening frame visibility time. You do that by turning on ULMB (Ultra Low Motion Blur) or LightBoost, which adds black periods between refresh cycles and briefly flashes each refresh cycle for about 1-2ms.

    If you turn on ULMB, you can successfully read the street name labels in TestUFO Moving Map Test.

    As an example: to make tiny text readable during fast motion, you need less than ~2 pixels of motion blur at ~1000 pixels per second. There are only two ways to achieve that -- flash each frame for only 2ms, or fill the full second with 2ms frames (i.e. 500fps@500Hz, to avoid strobing). So scientifically and mathematically (and in science papers), ULMB's ~2ms strobe flash per frame gives it motion-blur equivalence to a theoretical 500fps@500Hz sample-and-hold (non-strobed, non-flashed, non-impulsed, non-CRT) display. 500Hz and 1000Hz laboratory displays confirm this too: tracking-based motion blur is directly proportional to frame visibility time.

    (Likewise, a camera shutter of 1/1000sec will show less tracking motion blur in the photo than a camera shutter of 1/60sec, 1/120sec or 1/240sec -- the faster you track and the sharper the original image, the easier it is to see tracking-based motion blur.) Tracking-based blur with your eyes compares well to that photographic metaphor -- as seen in many TestUFO tests and science papers... and I'm also an author of a peer-reviewed conference paper now, so as you can surmise, I know my stuff.

    So, in other words: to halve motion blur, you either double the refresh rate (and framerate) or add 50%:50% ON:OFF strobing. To get one-quarter the motion blur, you either quadruple the refresh rate (and framerate) or use 25%:75% ON:OFF strobing (e.g. ULMB/LightBoost-style backlight strobing). And so on.
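
    That proportionality fits in one line of arithmetic -- a minimal sketch of the MPRT rule of thumb (blur in pixels = tracking speed x frame visibility time), using the 960 px/s figure from above:

```python
def blur_px(speed_px_per_s: float, visibility_ms: float) -> float:
    """Tracking-based motion blur (MPRT rule of thumb):
    blur is proportional to how long each frame stays visible."""
    return speed_px_per_s * visibility_ms / 1000.0

print(blur_px(960, 1000 / 144))  # ~6.7 px: 144Hz sample-and-hold
print(blur_px(960, 1000 / 288))  # ~3.3 px: doubling the Hz halves the blur
print(blur_px(960, 2.0))         # ~1.9 px: a 2 ms ULMB-style strobe
# Note: a 2 ms strobe and a 500fps@500Hz sample-and-hold display give
# identical blur -- the 500Hz equivalence mentioned above.
```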

    Please note, motion blur isn't the same topic as lag, although higher refresh rates affect both simultaneously -- so both are mentioned as effects of a higher Hz.

    Even OLEDs, which can respond in 0.1ms, can still have motion blur (persistence/MPRT) -- that's why Oculus/Vive strobe their VR headsets with a rolling scan. And certain OLED displays, such as the Sony TriMaster, have a rolling-scan strobe.

    Yeah, I'm getting off topic, but it all falls under the "X doesn't matter" umbrella (whether X is "240Hz" or "5.5ms" or whatever), which actually has lots of spinoff topics that many people are completely unaware of.

    To most people, it doesn't matter. But to the "trying to simulate a Holodeck" crowd, avoiding blur above and beyond the natural blur created by the human brain can matter a great deal (blur & lag science). Or to eSports players (lag science). Or to motion-blur-sensitive people who like ULMB/LightBoost (blur science). Or in forced eye-tracking situations, such as the blurring of small targets during a high-speed low-altitude helicopter flyby (blur science). Etc, etc, etc -- but I'll stop. A lot of legitimate edge cases. I can certainly respect them, even if they don't matter to me or to you... It's a fascinating world out there.

    Mark Rejhon
    Chief Blur Buster / TestUFO Creator
     
    Last edited: Apr 18, 2017
  14. mdrejhon

    mdrejhon Member Guru

    Messages:
    117
    Likes Received:
    79
    GPU:
    4 Flux Capacitors in SLI
    Multiple GSYNC testers & display reviewers worldwide thank you for that, in the world of emerging high-Hz displays.

    I see your RTSS stuff in many YouTube videos and reviews.
     
  15. RealNC

    RealNC Ancient Guru

    Messages:
    3,309
    Likes Received:
    1,509
    GPU:
    EVGA GTX 980 Ti FTW
    So you disagree with the fact that 240FPS on 240Hz has lower motion blur than 60FPS?

    Can you explain why you think that, and why you think it's so funny?
     

  16. RealNC

    RealNC Ancient Guru

    Messages:
    3,309
    Likes Received:
    1,509
    GPU:
    EVGA GTX 980 Ti FTW
    Who said anything about using interpolation for games? If you read more carefully next time, you'll see that I maintain my point that 240FPS on 240Hz has less motion blur than 60FPS/60Hz or 144FPS/144Hz.

    And to support my point, I said the reason TVs do interpolation to 240FPS is to lower motion blur.

    There is PROOF that higher frame rates on higher refresh rates provide less motion blur. But it seems most people here are completely clueless about these things.
     
  17. lexer98

    lexer98 Master Guru

    Messages:
    667
    Likes Received:
    1
    GPU:
    GTX 1070 - WC
    @mdrejhon (if I quote your comment I will break the forum lol)
    Great comment -- now 240Hz makes sense.
     
  18. Andy_K

    Andy_K Master Guru

    Messages:
    596
    Likes Received:
    103
    GPU:
    MSI GTX 960 OC
    I'll emphasize what makes me laugh so hard:
    If you have a 60Hz input, there is nothing a TV/monitor can do by interpolating it to 240Hz to reduce motion blur.
    In fact, if the TV/monitor interpolates by mixing image A and image B to generate the missing 4 frames until the next image C arrives, you'll get more motion blur in each interpolated image. The animation will be smoother, but that is due to the increased motion blur.

    If you have a 240Hz input, there is no need for interpolation, and the fps is high enough to see the animation flawlessly.
    But there is no way to reduce motion blur by interpolating from 60Hz to 240Hz.
    What you do reduce is stuttering, not motion blur.

    Also, what mdrejhon describes as motion blur is stuttering, not motion blur.
    Sorry about that, but motion blur is the streaking (smearing) within a single image (due to motion during the exposure time), not the optical effect our brains make out of different pictures displayed in rapid sequence.
     
    Last edited: Apr 17, 2017
  19. RealNC

    RealNC Ancient Guru

    Messages:
    3,309
    Likes Received:
    1,509
    GPU:
    EVGA GTX 980 Ti FTW
    Reducing frame persistence reduces motion blur. The interpolated frames being smudgy just means motion blur isn't as low as with a true 240FPS signal. But motion blur is still reduced simply due to the fact that each frame stays on the screen for a smaller amount of time.

    Downloading this video and watching it with frame rate interpolation makes the text appear clearer when it's moving. And that happens not because the in-between frames are sharp (they aren't), but because the text stays on the same position on the screen for a much smaller amount of time.

    This is why strobing reduces motion blur and why higher frame rates reduce motion blur. Both result in each frame being seen by our eyes for a smaller amount of time.
     
  20. mdrejhon

    mdrejhon Member Guru

    Messages:
    117
    Likes Received:
    79
    GPU:
    4 Flux Capacitors in SLI
    Yes, this is true, if response time isn't a limiting factor.

    For example, 240fps (4ms frames) interpolation on an 8ms-GtG IPS/VA LCD will be mostly useless at reducing motion blur. The LCD's GtG must not be a limiting factor at a given refresh rate in order for the blur-reducing benefits to truly be realized.

    Actually, it depends. Back when 240fps interpolation came out, it was being done at lower quality, on slower-responding HDTVs. And the original clarity of the video frames within the source material plays a role.

    Instructions On How To Use Interpolation To Reduce Motion Blur In Videos
    1. Get a 120Hz or 240Hz monitor ("1ms GtG" TN is particularly effective)
    2. Get a powerful GPU and CPU
    3. Download a copy of Smooth Video Project
    4. Configure it to interpolate (NOTE: may need to edit config file to unlock the 60Hz limit).
    5. Run a 1080p motion pattern (one with no built-in video blur), such as a 60fps horizontally scrolling test pattern. Preferably full-sharpness 4:4:4 chroma video (to avoid chroma blurring effects, etc). High-bitrate motion-test MP4 files are perfect for this, as the compression artifacts of low-bitrate video interfere with successful blur-reducing interpolation. If playing camera-recorded video, make sure the camera shutter per frame is far faster than the refresh period (e.g. 1/1000sec per-frame shutter, frequently found in fast pans of brightly-lit environments such as outdoor sports -- ski racing, car racing -- or GoPro HD camera views of smooth, predictable, linearly fast-scrolling scenery, etc).

    RESULT: Interpolation successfully reduces motion blur massively, in real time. You get 1/2 the motion blur at 120fps and 1/4 the motion blur at 240fps (compared to the original 60fps video file played at 60Hz).

    TL;DR: Both you and I are right, but for different interpolation algorithms and/or different displays. If neither is the weak link, newer advanced interpolation successfully avoids adding blur during interpolation. It can reduce motion blur, but only if configured as such, on a display where response time isn't a major factor in motion blur, and with non-preblurred video material.

    And that's even without enabling strobing/ULMB/LightBoost. You can go far beyond that by combining the two: you can get ~80-90% less motion blur on fast-shutter video (sharp, non-preblurred frames) if you combine strobing+SVP and play at the same framerate as the strobe rate (e.g. 120fps@120Hz for ULMB). That is, provided the interpolator successfully avoids adding blur while creating new sharp frames between the already-sharp video frames. If LightBoost or ULMB supported flickery 60Hz single-strobe (of flicker-painful CRT days), you wouldn't need the interpolation step before strobing. But alas, ULMB only works at certain refresh rates, including 120Hz. So the interpolation step is needed if you want to motion-blur-eliminate 60fps video with 120Hz strobing/ULMB/LightBoost without the double-image effect. One frame per strobe is the magic recipe for razor-sharp CRT-clarity video playback on LCD displays, with perfectly sharp fast scrolls. One strobe per frame (of non-preblurred video) is what lets you break the 50% motion-clarity improvement barrier of 60fps@60Hz -> 120fps@120Hz.

    Newer 1ms TN LCDs (the ones with well-optimized 256x256 overdrive tables for clean GtG across all color combos, needed for 3D and/or strobing) are a classic example of response time becoming a tiny fraction of the cause of motion blur, leaving frame visibility time (refresh cycle, persistence, MPRT -- 16.7ms for 1/60sec at 60Hz) from sample-and-hold as the dominating factor. Although the "1ms" of manufacturer hype is not always 1ms for full 0-100% GtG, modern 1ms panels still manage to complete ~90% of GtG within that 1ms -- so the remnants avoid dominating the motion blur. Most HDTVs use ~4ms-8ms IPS or VA panels, often with far longer full-GtG completion times, so response-time-related blurring will begin to occur on these 240Hz TVs before you've isolated down to tracking-based (sample-and-hold) blur alone. On many of these displays you will see smoother motion but lots of smearing -- much like the older overclocked QNIX QX2710 LCDs, or other panels where effective response time approaches or exceeds the length of a refresh cycle and starts to add to tracking-based blur.

    You can easily observe GtG imperfections via www.testufo.com/ghosting and www.testufo.com/eyetracking -- the cleaner the transitions, the less streaking behind objects (e.g. behind the stars in Eye Tracking, or behind the UFO in Ghosting) -- and the more strobe-friendly a display is, the less its LCD response interferes with 120fps or 240fps motion blur. A strobe-friendly display is a good testing platform for 'interpolation-reduces-motion-blur' testing: a display that supports ULMB/LightBoost, or one of the more expensive impulsed LED-backlit LCD HDTVs (one without the stupid strobe crosstalk and double-image artifacts of low-quality strobing). Such well-strobe-capable panels (including "3D"-capable panels, which by necessity need clean transitions between refresh cycles) have to be designed with ultra-precisely tuned overdrive, in order to complete LCD GtG sufficiently fast within the blanking interval between refresh cycles. The cleaner, more uniformly complete, and faster the majority of GtG transitions, the less they add to motion blur.

    Those displays are great examples of increasing refresh rate reducing motion blur (e.g. a gaming 120Hz TN display). Taking a 60fps source and interpolating to the display's 120Hz refresh rate, you can successfully (with a properly configured interpolator) halve the motion blur of a 60fps video via 60fps->120fps interpolation. With good newer interpolation algorithms (in newer fast-response HDTVs, or when run on TN gaming monitors), interpolation then has a huge sharpening effect instead of a blurry-smoothing effect.

    Now, if you turn on interpolation on a cheap 10-year-old non-3D-capable Vizio, Westinghouse, Walmart or RCA HDTV, you're right: its LCD response is usually a joke, and all its "120fps interpolation" or "240fps interpolation" mode does is make motion smoother, without less motion blur. Even many brand-name panels don't do much better. The lower-quality interpolation of cheaper (or older) HDTVs, combined with slower LCD response, means it won't deliver the motion-clarity upgrade of interpolating 60fps->120fps via SVP on a 1ms TN gaming monitor. At 1ms, the response time is tiny compared to the 8.3ms timescale of a 120Hz refresh cycle, so it adds only a few percent to the motion blur. Though, admittedly, the 10-year-old slower-responding Vizio IPS/VA panel probably still has far better, richer color than crappy-color 1ms TN panels. :D

    I've seen HDTVs do exactly what you describe (smoother, but not less blur), and I've seen HDTVs that resemble more the "SVP + 120Hz gaming monitor" (motion-sharpened interpolation) example.

    Slow camera exposure per frame can prevent blur-reducing interpolation
    For example, playing high-def sports (each video frame shot at 1/1000sec shutter, to prevent in-frame camera blur from interfering with blur-reducing interpolation), you easily get half the motion blur at 120fps interpolation and a quarter of the motion blur at 240fps interpolation -- on the best HDTVs I've ever seen, the kind of stuff that costs three or four times what I'd normally pay at a Walmart. The new OLED HDTVs are also a great example -- several of the ones available now have good newer 120fps interpolator chips, and they darn near exactly halve the motion blur of such sports videos (and other fast-shutter videos), since OLED response time doesn't interfere.

    Now if you're playing movies, many embed natural motion blur in each frame, and you will NOT see interpolation sharpen those frames. Remember, shutter speed per frame needs to be faster than the refresh cycle -- e.g. sports filmed at 1/1000sec per frame, like ski racing on a bright sunny day, benefits HUGELY from blur-reducing interpolation: 120fps@120Hz interpolated has half the motion blur of 60fps@60Hz non-strobed/flashed/impulsed (when viewing the same video on an OLED or fast LCD, with a good interpolator, and camera-blur-free video). But if shutter speed per frame is 1/60sec, you're baking in blur that an interpolator can't remove.

    Some interpolators also combine interpolation and strobing simultaneously: e.g. interpolate to 120fps, then strobe-flash at 120Hz (or use a scanning backlight) to reduce motion blur by 75%-90%+, instead of just the 50% of non-strobed 120Hz versus non-strobed 60Hz. 60Hz strobe flicker can be headachey, which is why some HDTVs interpolate to 120fps before strobe-flashing each frame at 120Hz -- this can be more effective than interpolating all the way to 240fps, because at 240Hz the LCD response time of IPS/VA easily starts to kill any further motion-clarity improvement.

    A common HDTV LCD's 8ms GtG on a "240Hz" 4ms refresh cycle is DEFINITELY going to smear/streak/blur refresh cycles into each other (even before factoring in crappy/cheaper/older interpolation chips). Even 8ms GtG on a 60Hz LCD adds roughly ~50% extra perceived blur above and beyond the 16.7ms (1/60sec) sample-and-hold blur (tracking-based, not GtG). At some point, raising the refresh rate starts to eat into LCD response time and you stop seeing further increases in motion sharpness.
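
    A back-of-envelope check on that ~50% figure (a deliberate simplification on my part that treats GtG time as simply stretching the frame visibility window; real GtG curves are nonlinear and differ per color pair):

```python
def extra_blur_fraction(gtg_ms: float, refresh_hz: float) -> float:
    """Crude estimate of the extra perceived blur that slow GtG adds on
    top of sample-and-hold persistence (simplified linear model)."""
    persistence_ms = 1000.0 / refresh_hz
    return gtg_ms / persistence_ms

print(round(extra_blur_fraction(8, 60), 2))   # 0.48 -- roughly the ~50% quoted
print(round(extra_blur_fraction(8, 240), 2))  # 1.92 -- GtG swamps a 4ms refresh
```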

    LCD response time limitations, poor interpolation, and blur baked into the video frames will all combine to create the non-sharpening interpolation effect Andy_K speaks of. But play sharp video (with sharp individual frames) on one of the newer OLED HDTVs (or expensive fast LED-LCD HDTVs) with the newer 120fps higher-quality interpolator chips, and BAM -- 50% less motion blur at 120fps interpolated non-strobed than at 60fps uninterpolated non-strobed. (The delta can be even bigger if strobing is added to the equation.) SOE (Soap Opera Effect) artifacts are a different topic altogether, but in simple panning backgrounds with no parallax effects (less occlusion guesswork needed by the interpolator), SOE problems mostly disappear in the newer chips, and all you see is the miraculously reduced motion blur in various types of fast-shutter/non-preblurred video material interpolated to double or quadruple framerate.

    ...Yes, TMI. Too Much Information. Repetitive, yes. But I'm just throwing more firelogs onto the semi-but-not-fully-offtopic "X doesn't matter" umbrella...
    (X = 240Hz, X = millisecond, X = interpolation, X = GtG -- pick a word)


    Mark Rejhon
    Chief Blur Buster / TestUFO Creator
     
    Last edited: Apr 18, 2017

Share This Page