ASUS ROG Strix XG248Q Adaptive Sync 240 Hz Monitor

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 31, 2018.

  1. Corrupt^

    Corrupt^ Ancient Guru

There are very few engines where even a TITAN could hold a constant 240 fps.

    Only the very efficient Doom 2016 engine comes to mind; it was putting out massive framerates even on my GTX 1080.

    Anyways I've grown more "casual" but I'm still very nitpicky about latency, so I've settled somewhat around 120 fps & 120 Hz... until 120 fps becomes mainstream and 240 becomes way more affordable.

    Personally I wish NVIDIA/ATI also shipped an integer scaling mechanism so that 720p looks crisp on a 1440p display (one 720p pixel would map exactly onto a 2x2 block of four 1440p pixels).

    That way I could play my casual games at 1440p and play my more serious stuff at 720p.
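What Corrupt^ is asking for is plain integer (nearest-neighbor) scaling. A minimal sketch in Python, purely illustrative -- real GPUs do this in the display scaler, and `integer_upscale` is a made-up helper, not any driver API:

```python
# Integer (nearest-neighbor) scaling: each source pixel becomes an exact
# factor x factor block in the output, so no blurry interpolation is needed.

def integer_upscale(image, factor):
    """Upscale a 2D grid of pixel values by an integer factor."""
    out = []
    for row in image:
        # Repeat each pixel 'factor' times horizontally...
        scaled_row = [px for px in row for _ in range(factor)]
        # ...and each row 'factor' times vertically.
        for _ in range(factor):
            out.append(list(scaled_row))
    return out

src = [[1, 2],
       [3, 4]]
print(integer_upscale(src, 2))
# Each source pixel now covers a 2x2 block:
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

1280x720 scaled by exactly 2 is 2560x1440, which is why this mapping is perfectly crisp, while non-integer ratios (e.g. 720p on a 1080p panel) force interpolation.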
  2. mdrejhon

    mdrejhon Member Guru

    4 Flux Capacitors in SLI
    When you double the refresh rate, you have to up your game everywhere else too. 240Hz can easily feel worthless in some situations (e.g. 30fps at 240Hz isn't going to be noticeably better than 30fps at 144Hz!) unless you do a lot to eliminate the weak links. Just replacing the monitor is not always enough.
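The 30fps point is quick arithmetic: raising the refresh rate without raising the frame rate just repeats each rendered frame across more refresh cycles. A sketch (the helper name is made up for illustration):

```python
# At a fixed frame rate, a higher refresh rate only changes how many times
# each rendered frame is repeated on screen -- the viewer still sees the
# same number of new images per second.

def repeats_per_frame(fps, refresh_hz):
    """Refresh cycles each rendered frame stays on screen."""
    return refresh_hz / fps

print(repeats_per_frame(30, 144))  # 4.8 cycles per frame
print(repeats_per_frame(30, 240))  # 8.0 cycles per frame
# Either way, only 30 new images reach the eye per second.
```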

    -- GPU upgrade for more framerate, or playing older games (e.g. CS:GO)

    -- Engine upgrade to handle higher framerates without other problems getting in the way (e.g. microstutters) -- i.e. an engine performing as smoothly as TestUFO

    -- Mouse upgrade, since mouse microstutters can become a huge weak link at high Hz (making 120fps indistinguishable from 240fps). Upgrade your mouse mat too -- make sure mouse turns left/right are as smooth as keyboard strafes left/right -- before judging monitor Hz.
    For VSYNC OFF and VRR operation, a mouse poll rate that is unsynchronized with the refresh rate should be at least 4x the display refresh rate. We've already noticed human-visible microstutter from 1000Hz gaming mice during strobed/ULMB operation that is only fixable at 2000Hz+ (or by other means, such as perfectly synchronizing the poll rate to the refresh rate).
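The 4x rule of thumb works out like this (the factor comes from the post above, not from any formal spec, and the helper is illustrative):

```python
# Poster's guideline: an unsynchronized mouse poll rate should be at least
# 4x the display refresh rate for VSYNC OFF / VRR operation.

def min_poll_rate_hz(refresh_hz, factor=4):
    """Minimum mouse poll rate under the 4x rule of thumb."""
    return refresh_hz * factor

for hz in (120, 144, 240):
    print(hz, "Hz display ->", min_poll_rate_hz(hz), "Hz minimum poll rate")
# 240 Hz needs 960 Hz, so a 1000 Hz mouse only just clears the bar --
# consistent with strobed 240 Hz starting to want 2000 Hz+ mice.
```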

    -- Panel upgrade, since some early 240Hz monitors had poor overdrive tuning -- pixel response limitations that prevent motion blur from halving relative to 120Hz.

    -- Impeccable framepacing. Framepacing errors should ideally be a tiny fraction of a refresh cycle. A framepacing error of 4ms doesn't matter at 60Hz, but it produces mega-microstutter at 240Hz (~4.2ms refresh cycles).

    At 8000 pixels per second panning motion (one screen width panned in half a second at 4K), a 1ms gametime error generates an 8-pixel jump -- still human-visible as a single microstutter in a TestUFO-style motion test! So gametimes and framepacing should ideally become sub-millisecond accurate during 240Hz+ operation. CS:GO is capable of that nowadays, but many games cannot achieve such framepacing accuracy. VRR operation reduces the need for framepacing accuracy to an extent, so 480Hz VRR will help. That said, the framepacing accuracy required to avoid a microstutter (which is still visible at 240Hz and 480Hz) jumps up a lot.
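The 8-pixel-jump arithmetic, made explicit (illustrative helper only):

```python
# A gametime error displaces a panning object by (speed x error) pixels
# from where the eye-tracked position expects it to be.

def stutter_jump_px(speed_px_per_s, gametime_error_s):
    """Pixel displacement caused by a gametime/framepacing error."""
    return speed_px_per_s * gametime_error_s

print(stutter_jump_px(8000, 0.001))   # 1 ms error at 8000 px/s -> 8.0 px jump
print(stutter_jump_px(8000, 0.0001))  # 0.1 ms error -> ~0.8 px, near-invisible

# The same 4 ms error, expressed as a fraction of a refresh cycle:
print(0.004 * 60)   # ~0.24 of a 60 Hz cycle -- easily absorbed
print(0.004 * 240)  # ~0.96 of a 240 Hz cycle -- nearly a whole frame late
```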

    Sure, the microstutter vibration amplitude halves at twice the Hz, but it's still human-visible (e.g. 479fps at 480Hz still produces one slightly visible TestUFO stutter per second). And everything has become so damn sensitive to microstutter, due to the huge demands on consistent frametimes, that bad microstutter harmonics can still become visible -- such as pileups of delays, where a 4ms pause means roughly 2 missed refresh cycles at 480Hz (1/480sec = ~2ms)! In fact, the motion problems of microstutters are even still (barely) visible at 1000fps @ 1000Hz under scientific tests, so we're in fact still not at the vanishing point of the diminishing-returns curve. Yet. The invention of Hertz (the human idea of using a series of static images to represent moving imagery) is still a royal pain, with artifacts such as tearing, microstuttering, and latency that only gradually diminish until the next weak link is hit (e.g. the GPU, the mouse, etc).
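Why the amplitude halves at double the Hz: one dropped frame makes the object jump by one extra frame-step, and a frame-step is speed divided by framerate. A sketch (helper name is made up):

```python
# Extra pixel jump caused by a single dropped frame during smooth panning:
# the object moves two frame-steps in one visible update instead of one.

def dropped_frame_jump_px(speed_px_per_s, fps):
    """Size of one frame-step, i.e. the extra jump from one dropped frame."""
    return speed_px_per_s / fps

print(dropped_frame_jump_px(8000, 240))  # ~33.3 px jump from one drop at 240 fps
print(dropped_frame_jump_px(8000, 480))  # ~16.7 px -- half the amplitude
# Half the amplitude, but at 8000 px/s still far above a ~1 px visibility
# threshold -- which is why 479fps at 480Hz still shows a stutter per second.
```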

    Either way:

    The leap to true 480Hz, 960Hz, and 1000Hz display refresh rates will probably take a couple of decades to mature (including the weak links in software and other computer accessories), but it is definitely worthwhile, with lots of solvable problems that engineers are currently working on.

    The GPU side of the equation is the hardest one. The "frame rate amplification technology" part will be the biggest issue -- getting extra framerate more cheaply is critical -- especially since framerates will be held down by things like real-time ray tracing. That does not preclude continued increases in framerate over the long term, though: average 3D framerates today are higher than they were 20 years ago, so there is progress on average.
    Last edited: Jun 5, 2018
