480Hz Monitor Display Panel Prototype Spotted

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 14, 2017.

  1. RealNC

    RealNC Ancient Guru

    Messages:
    3,111
    Likes Received:
    1,339
    GPU:
    EVGA GTX 980 Ti FTW
    Don't forget another thing about ultra-high-Hz displays: they make VRR obsolete. 480Hz is not quite there yet. But at 1000Hz, you get 1ms worst-case frame delivery variance, and 0.5ms on average. Such a display would not need G-SYNC or FreeSync anymore. At such low frame times, FPS/Hz mismatch judder would probably be impossible to perceive.
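
    A minimal sketch of that arithmetic (my own illustration, not RealNC's code), assuming frames complete at random moments relative to the refresh, so each one waits at most one refresh period:

    Code:
    for hz in (60, 144, 240, 480, 1000):
        period_ms = 1000 / hz  # a finished frame waits at most one refresh period
        print(f"{hz:>5} Hz: worst-case wait {period_ms:.2f} ms, "
              f"average wait {period_ms / 2:.2f} ms")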
     
    mdrejhon likes this.
  2. mdrejhon

    mdrejhon Member Guru

    Messages:
    103
    Likes Received:
    61
    GPU:
    4 Flux Capacitors in SLI
    Correct, RealNC. I am going to mention that in the Holiday 2018 special follow-up.

    Refresh granularity becomes so fine that fixed 1000Hz virtually looks like VRR at any random framerate, whether 27fps, 93fps, or 218fps.

    All sync tech would have almost the same lag once you hit 1000Hz. VSYNC ON, VSYNC OFF, G-SYNC, FreeSync -- all would have darn near the same lag. So ultra-Hz also replaces all sync tech, which are simply workarounds for the lag or motion artifacts of coarse refresh granularity. All of that disappears with fine refresh granularity. Two-buffer VSYNC ON lag is 33ms at 60Hz, but only 1ms at 2000Hz. Fully buffer-backpressured VSYNC ON with practically no lag!
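
    A similar sketch for the two-buffer VSYNC ON figures above (again my own arithmetic illustration): with two queued frames, worst-case lag is roughly two refresh periods.

    Code:
    for hz in (60, 240, 1000, 2000):
        lag_ms = 2 * 1000 / hz  # two buffered frames = two refresh periods
        print(f"{hz:>5} Hz: ~{lag_ms:.1f} ms worst-case two-buffer VSYNC ON lag")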

    Who needs the legacy sync tech when you have >1000Hz refresh rates?

    No tearing, no blur, no lag, no flicker, no stutter, no stroboscopic effects.

    All the advantages of ULMB, G-SYNC, FreeSync, VSYNC OFF, and VSYNC ON, combined, all in one. Ultra frame rates at true ultra-Hz (not fake Hz) fix everything simultaneously!

    We will still need VRR for a long time though, until displays are well beyond 1000Hz. And even then, NVIDIA and AMD could still market plain 1000Hz+ as a new sync tech -- "FreeSync Extreme" or "G-SYNC Ultra" or whatever. A lot of engineering is still needed to make 1000Hz+ succeed.

    And yes, I've seen this tantalizing potential with my own eyes in experiments. 1000Hz DOES fix it all.

    Though game software and GPUs do have to catch up. Many games go wonky at ultra frame rates, and those need fixes. We also need various frame rate amplification technologies (FRAT -- the Blur Busters terminology I use), like improved 100fps->1000fps descendants of Oculus Timewarp's 45fps->90fps. Such frame rate amplification ratios should be doable by the time 1000Hz displays arrive, possibly via co-GPUs.

    Mind you, it may be a decade. While I may school the 480Hz-deniers and 1000Hz-deniers, 240Hz will be good for a long time. The first 1000Hz displays won't be perfect, just like the first 120Hz displays weren't, nor the first 240Hz. So even once it arrives, there will be a long period of optimization. Heck, some mouse manufacturers only do 500Hz and haven't moved on to 1000Hz. So don't hold off your 240Hz purchases this Christmas. ;)
     
    Last edited: Dec 20, 2018
  3. mdrejhon

    mdrejhon Member Guru

    Messages:
    103
    Likes Received:
    61
    GPU:
    4 Flux Capacitors in SLI
    About the common question of how GPUs will be able to keep up with 1000Hz...
    The technology is being developed: frame rate amplification technologies.
    Oculus Rift VR headsets use a variant to convert 45fps to 90fps in a perceptually lagless way. Eventually, 5:1 and 10:1 framerate-increase ratios will be achievable with upcoming frame rate amplification algorithms.
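
    A heavily simplified toy sketch of the general reprojection idea behind such 2:1 amplification (my own construction, not Timewarp's actual implementation, which warps per-pixel from head rotation; the function name and pixel-shift approximation here are hypothetical):

    Code:
    import numpy as np

    def amplified_frames(rendered, shift_px):
        """Yield two displayed frames per rendered frame: the real one,
        then a synthesized copy shifted by the predicted camera motion."""
        for frame in rendered:
            yield frame                              # real rendered frame
            yield np.roll(frame, -shift_px, axis=1)  # cheap reprojected frame

    # 45 rendered frames become 90 displayed frames (2:1 amplification).
    rendered = [np.random.rand(4, 8) for _ in range(45)]
    displayed = list(amplified_frames(rendered, shift_px=1))
    print(len(rendered), "->", len(displayed))  # 45 -> 90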
     
    Last edited: May 12, 2019
    Dragam1337 and JonasBeckman like this.
  4. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    15,757
    Likes Received:
    1,738
    GPU:
    Sapphire 5700XT P.
    Interesting. That would be a pretty neat solution that wouldn't require really enthusiast-class hardware (let alone hardware that keeps up with newer, more demanding titles). I suppose tech like G-Sync and FreeSync also helps somewhat, but this sounds like it could improve things further -- maybe even both working together, ensuring smooth responsiveness and feedback at refresh rates and framerates above the current 120 to 240 as technology improves over the next couple of years. :)

    Not that I have a deep understanding of this subject, but I like the sound of what I'm hearing from what little I do understand.

    EDIT: Well, display technology itself, but also the GPU and other hardware, plus the interfaces connecting them, such as DisplayPort and HDMI. Bandwidth matters, particularly for 10 to 16-bit color, for features such as HDR, and for resolutions pushing beyond "4K" (commonly 3840x2160) toward the 8K we're already starting to hear about.

    Though newer standards will probably have the capacity to drive the hardware as it actually comes to market and into the range of more regular consumer pricing.
    (Well, in time at least; early on this type of tech would probably be a bit of a premium feature, but when was that anything new?)
     
    Last edited: May 12, 2019

  5. TiePhiter

    TiePhiter New Member

    Messages:
    9
    Likes Received:
    4
    GPU:
    Vega 64 LC
    What I'm not seeing anyone say... is the latency we will get from 480Hz. ONE person I've seen here gets it.

    One. LOL.

    60Hz = 16.7ms of refresh lag. 120Hz = you guessed it, 8.3ms. 144Hz is about 7ms. 240Hz is NOT half the latency in practice, though: the best monitor tested was around 4-4.5ms of lag. That's the issue. If these 480Hz monitors ACTUALLY get closer to the 1ms that mice are at now... they will be INSANE.
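
    For reference, the raw refresh period at each rate (my own numbers; measured input lag sits above these floors because it also includes processing and pixel response):

    Code:
    for hz in (60, 120, 144, 240, 480):
        print(f"{hz:>3} Hz refresh period: {1000 / hz:.2f} ms")
    # Even 480 Hz has a 2.08 ms period, still above the 1 ms
    # polling interval of a 1000 Hz mouse.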

    I also see a few of you commenting on what you think is the case with 60Hz vs 120Hz+ monitors... Not sure why you think you get to make assumptions like that without owning any fast monitors. I can tell you without a doubt that: 1. I could NEVER use a 60Hz monitor again. Ever. My eyes don't hurt at ANY time, no matter how long I use my 144Hz and 165Hz monitors. Any amount of hours, no eye strain. Yet another point completely overlooked by everyone. 2. If you have proper eyesight, you start losing the ability to see FPS differences with your eyes around 90fps. Beyond that, nobody can tell the difference by LOOKING at it. It's still possible to tell the difference by playing.
     
    Last edited: May 13, 2019
  6. MonstroMart

    MonstroMart Master Guru

    Messages:
    598
    Likes Received:
    190
    GPU:
    GB 5700 XT GOC 8G
    I can just imagine the "pro" gamers playing at 540p with low settings to get the most fps possible on this panel...
     
  7. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,762
    Likes Received:
    2,204
    GPU:
    5700XT+AW@240Hz
    Not gonna play on that trash. There is no way to use predictive methods for missing geometry, so they have to use interpolation with one frame of lag. That works reasonably well, except for the lag itself and the artifacts in fast movement (many pixels moving in different directions) and motion blur.

    Have your UFOs flying around at random at high speed and this thing just fails. It will make them move in directions other than intended.
    Basically, the 1st UFO moves from A to B, and the 2nd moves from C to D. But when their paths intersect, the interpolator may decide the 1st moved onto C-D's path and the 2nd onto A-B's path.

    The higher the number of similar objects and the faster they move, the higher the risk of motion vector errors. Then there is the missing information for background geometry (the background revealed behind fast-moving foreground objects).
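
    A toy illustration of that crossing-paths ambiguity (my own construction, not Fox2232's code): a naive nearest-neighbour matcher pairs each old position with the closest new one and swaps the two UFOs' identities.

    Code:
    import math

    prev_pos = {"UFO1": (0, 0), "UFO2": (10, 1)}   # positions in frame N
    next_pos = [(10, 0), (0, 1)]                   # positions in frame N+1
    # Ground truth: UFO1 -> (10, 0), UFO2 -> (0, 1).

    for name, p in prev_pos.items():
        q = min(next_pos, key=lambda n: math.dist(p, n))  # naive matching
        print(f"{name}: {p} -> {q}, motion vector "
              f"({q[0] - p[0]}, {q[1] - p[1]})")
    # Prints UFO1 -> (0, 1) and UFO2 -> (10, 0): identities swapped, so an
    # interpolated in-between frame would move both UFOs the wrong way.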

    The day this interpolation can handle an old C64 game like Turrican 2, they can start asking for my approval.

    (Check the 59th minute. It is a nightmare even for efficient video compression.)
     
