Is "Asynchronous Reprojection" worthwhile for Non-VR Gaming? Perhaps a DLSS3 Frame Generation combo?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by BlindBison, Dec 3, 2022.

  1. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    Linus video:

    2kliksphilip video:

    This is a technique I had never heard of until recently -- apparently it's been a thing in the VR space for quite some time now, and it has been refined quite a bit over the years. I think this sort of thing may be particularly relevant to Nvidia GPUs, as it seems like their AI magic (sort of in the vein of DLSS or DLSS3 Frame Generation) may be able to help mask some of the artifacts associated with this technique.

    In particular, this technique seems like it would pair very nicely with DLSS3 Frame Generation: 1) because reprojection would help offset the latency associated with DLSS3, and 2) because DLSS3 FG gives us more total frames, which helps make the async reprojection artifacts less noticeable. I'm actually surprised this technique hasn't been offered as an option in more standard PC titles (I don't know of any non-VR titles that do so). It seems very interesting in concept, especially for high refresh rate users.
     
    Last edited: Dec 4, 2022
  2. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    Some interesting comments from the Linus video:
    [attached screenshot: upload_2022-12-3_14-11-33.png]
    Also, just for context on what the whole "Asynchronous Reprojection" thing is for those who haven't watched the videos: it's a method of updating the image to the latest player input every display refresh, even if the game itself is only rendering at, say, 30 fps.

    > "Reprojection involves the headset's driver taking one or multiple previously rendered frames and using newer motion information from the headset's sensors to "reproject" or "warp" the previous frame into a prediction of what a normally rendered frame would look like. "Asynchronous" refers to this process being continuously performed in parallel with rendering, allowing reprojected frames to be displayed without delay in case a regular frame isn't rendered in time, and is used in all frames by default to reduce perceived latency.” -- https://www.avsim.com/forums/topic/580732-asynchronous-reprojection-aka-stuttering-ghosting-etc/

    As I understand it (I'm probably butchering this, so it's probably best to watch the videos where it's explained much better), they do this with a "trick": they take the on-screen image and "warp"/stretch it to where they think objects on screen "should" be given the newest input, and then, when the next real frame does come along, we get a true update. There are of course artifacts from doing this, but it seems like a clever technique in any case, and it's impressive to me how well they've gotten it to work in VR -- the idea behind the Linus/2kliks tests is to consider how this could also be used for standard flat panel/non-VR titles.
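
    To make that a bit more concrete, here's a minimal sketch of the rotation-only variant (what VR runtimes call "timewarp") in Python with numpy/OpenCV. The pinhole intrinsics, the pure-rotation assumption and all the numbers here are my own simplifications for illustration -- real implementations also use depth, positional reprojection and motion vectors:

    Code:
    # Minimal rotation-only reprojection ("timewarp") sketch -- illustrative only.
    # Assumes a pinhole camera with intrinsics K and that the camera motion
    # between render time and display time is pure rotation (mouse yaw/pitch).
    import numpy as np
    import cv2

    def yaw_pitch_to_rotation(yaw, pitch):
        # 3x3 rotation built from yaw (around Y) and pitch (around X), in radians.
        cy, sy = np.cos(yaw), np.sin(yaw)
        cp, sp = np.cos(pitch), np.sin(pitch)
        ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
        rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
        return rx @ ry

    def reproject(frame, K, rot_at_render, rot_at_display):
        # Warp the last rendered frame to the newest camera orientation.
        r_delta = rot_at_display @ rot_at_render.T   # rotation since the frame was rendered
        H = K @ r_delta @ np.linalg.inv(K)           # pure rotation maps pixels via H = K * R * K^-1
        h, w = frame.shape[:2]
        return cv2.warpPerspective(frame, H, (w, h))

    if __name__ == "__main__":
        w, h = 1920, 1080
        fx = (w / 2) / np.tan(np.deg2rad(90) / 2)    # 90 degree horizontal FOV
        K = np.array([[fx, 0, w / 2],
                      [0, fx, h / 2],
                      [0,  0,     1]], dtype=float)
        frame = np.zeros((h, w, 3), np.uint8)
        cv2.putText(frame, "rendered frame", (600, 540),
                    cv2.FONT_HERSHEY_SIMPLEX, 3, (255, 255, 255), 4)
        # Pretend the mouse moved ~2 degrees right since this frame was rendered:
        warped = reproject(frame, K,
                           yaw_pitch_to_rotation(0.0, 0.0),
                           yaw_pitch_to_rotation(np.deg2rad(2.0), 0.0))
        cv2.imwrite("reprojected.png", warped)

    The warped image is slightly "wrong" at the edges (there's no data to fill the revealed border), which is exactly the kind of peripheral artifact the videos show.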

    In the Linus tests, when it was configured appropriately with all the current tricks applied, users saw a dramatic improvement in "perceived" input lag, and the only noticeable artifacts (from how it looks to me, at least) happen around the periphery of the screen. This could be very interesting combined with something like DLSS3 FG, since it could help offset the input lag associated with that approach.
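
    And here's a toy sketch of the "asynchronous" part -- a render thread chugging along at ~30 fps while the display loop keeps re-warping the newest completed frame with the latest input every refresh. All the function names here (render_frame, warp_to_latest_input, present) are made-up placeholders, not any real engine or driver API:

    Code:
    # Toy sketch: rendering and reprojection running in parallel ("asynchronous").
    import threading
    import time

    REFRESH_HZ = 144          # display refresh rate
    RENDER_FPS = 30           # how fast the game can actually render

    latest = {"frame": None, "input_at_render": 0.0}
    lock = threading.Lock()
    current_input = 0.0       # e.g. accumulated mouse yaw, updated by the display loop below

    def render_frame(snapshot):
        # Placeholder for a full (slow) game render pass.
        time.sleep(1.0 / RENDER_FPS)
        return f"frame rendered at input={snapshot:.3f}"

    def warp_to_latest_input(frame, input_at_render, input_now):
        # Placeholder for the cheap reprojection warp (see the homography sketch above).
        return f"{frame} -> warped by {input_now - input_at_render:+.3f}"

    def present(image):
        print(image)

    def render_thread(stop):
        while not stop.is_set():
            snapshot = current_input          # input state this frame is rendered against
            frame = render_frame(snapshot)
            with lock:
                latest["frame"], latest["input_at_render"] = frame, snapshot

    def display_loop(seconds=1.0):
        global current_input
        stop = threading.Event()
        threading.Thread(target=render_thread, args=(stop,), daemon=True).start()
        end = time.time() + seconds
        while time.time() < end:
            current_input += 0.01             # simulate the mouse moving between refreshes
            with lock:
                frame, at_render = latest["frame"], latest["input_at_render"]
            if frame is not None:
                # Every single refresh shows the newest input, even though a
                # "real" frame only arrives every ~33 ms.
                present(warp_to_latest_input(frame, at_render, current_input))
            time.sleep(1.0 / REFRESH_HZ)
        stop.set()

    if __name__ == "__main__":
        display_loop()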
     
    Last edited: Dec 4, 2022
  3. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,732
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    In this HardOCP thread, from post 935 onward, the owner of Blur Busters has been pushing for this technology for a while.
    He explains more background detail.
    https://hardforum.com/threads/more-dlss.1990492/page-24
    He may have posted more info in other threads too, can't remember.

    He talks about not just regenerating frames but also reducing screen and mouse response latency through prediction.
    With current hardware it has been demonstrated that 1000 fps at UHD res is possible with little issue, only requiring around +20% GPU use in the worst case.
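
    Just to put rough numbers on how that +20% figure could shake out (the per-frame costs here are my own guesses for illustration, not measurements from his posts):

    Code:
    # Back-of-the-envelope arithmetic for the "1000 fps for roughly +20% GPU" idea.
    # The per-frame costs below are assumptions for illustration, not measurements.
    base_fps = 100                      # what the GPU can render conventionally
    full_render_ms = 1000 / base_fps    # 10 ms per fully rendered frame
    warp_ms = 0.2                       # assumed cost of one reprojection warp at UHD

    target_fps = 1000
    extra_warps = target_fps // base_fps - 1   # 9 warped frames per rendered frame

    extra_ms = extra_warps * warp_ms           # 1.8 ms of warping per rendered frame
    overhead = extra_ms / full_render_ms       # ~18% extra GPU time
    print(f"{extra_ms:.1f} ms of warping per rendered frame (~{overhead:.0%} overhead) "
          f"to display {target_fps} fps")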
     
  4. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    Thanks, that's great to know -- will read over this :)
     

  5. magic09

    magic09 Member

    Messages:
    40
    Likes Received:
    6
    GPU:
    2080 Super
    I have always thought this as well! From the first time I got an Oculus Rift-S, tried forcing ASW and saw how well it can work (yes, it has distortion issues which look identical to the DLSS3 artifacts I've seen), I thought: man, why don't they do this in flat games? I believe Optical Flow was developed for VR, by Oculus and NVIDIA together if I remember correctly, since the 20 series had that and the special USB port that was supposed to be for VR and that the Index was originally going to use. Anyways, it seems that is what DLSS3 actually is -- a refined version of ASW 2.0 for flat games.
     
    BlindBison likes this.
  6. Terepin

    Terepin Guest

    Messages:
    873
    Likes Received:
    129
    GPU:
    ASUS RTX 4070 Ti
    It wouldn't surprise me if AMD is taking this route with FSR3.
     
    BlindBison likes this.
  7. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,732
    Likes Received:
    2,701
    GPU:
    Aorus 3090 Xtreme
    It's quite a read covering all his posts and the additional clarifying info from others such as Trunks0 -- it went in better on the second read :)
     
    BlindBison likes this.
  8. TimmyP

    TimmyP Guest

    Messages:
    1,398
    Likes Received:
    250
    GPU:
    RTX 3070
    Nvidia OFA is refined ASW.
     
    BlindBison likes this.
