NVIDIA DLSS 2.1 Technology Will Feature Virtual Reality Support

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 9, 2020.

  1. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    No, it doesn't. That's my point. If the framerate with DLSS on is higher, then communication with the tensor cores is never adding to the render time versus the render time of a native 4K image. It's always decreasing it versus that native image. You can't have a higher framerate with a higher render latency; that's a contradiction. You can have a higher framerate with a higher total latency - because the screen got slower, or you have a terrible mouse, or your OS is lagging. But the "Render Latency", i.e. the amount of time the GPU takes to process a frame from start to finish, is tied to the framerate. If the framerate goes up, it's because the GPU is processing frames in less time.

    DLSS has a fixed "upres" time, probably associated with the render resolution.

    Let's go back to the slide:

    [Slide: NVIDIA end-to-end latency pipeline - mouse, CPU (OS/Game), Render Queue + GPU, composite, scanout/display]

    Let's say we have a GPU capable of doing 4K@60; it would look like this: mouse latency is 2ms + 4ms for CPU (OS/Game) + 16.7ms for render latency of the native 4K image (Render Queue + GPU) + 1ms for composite + 10ms for scanout/display. Total of 33.7ms from the time you move your mouse to the time the frame hits your eye.

    Now let's say we turn DLSS on, and assume communication with the tensor cores adds 4ms of delay, but we're now getting 90fps at 4K. It would look like this: mouse latency is 2ms + 4ms for CPU (OS/Game) + 7.11ms for render latency of the native 1080p image + 4ms for the DLSS/tensor upres to 4K (11.11ms total for Render Queue + GPU, i.e. 90fps) + 1ms for composite + 10ms for scanout/display. Total of 28.11ms from the time you move your mouse to the time the frame hits your eye.

    The only thing that's changing is the latency in the "Render Queue + GPU" stage, which is where the framerate comes from. So no matter what, if the framerate is increasing, that latency is decreasing (1000/60 ≈ 16.7ms vs 1000/90 ≈ 11.1ms).
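    If it helps make the arithmetic concrete, here's a minimal sketch of that pipeline sum in Python, using the example figures from this post (the 2ms/4ms/1ms/10ms stage costs and the 4ms tensor-core overhead are assumed numbers for illustration, not measurements):

    ```python
    # Rough sketch of the end-to-end latency math above, using the example
    # numbers from this post (the 4 ms tensor-core cost is assumed, not measured).

    def end_to_end_latency_ms(render_fps, extra_gpu_ms=0.0,
                              mouse_ms=2.0, cpu_ms=4.0,
                              composite_ms=1.0, scanout_ms=10.0):
        """Sum the pipeline stages: mouse -> CPU (OS/Game) ->
        Render Queue + GPU -> composite -> scanout/display."""
        render_ms = 1000.0 / render_fps          # GPU frame time at the render resolution
        gpu_total_ms = render_ms + extra_gpu_ms  # plus any fixed DLSS/tensor upres cost
        return mouse_ms + cpu_ms + gpu_total_ms + composite_ms + scanout_ms

    # Native 4K at 60 fps, no DLSS overhead -> ~33.7 ms
    native_4k = end_to_end_latency_ms(render_fps=60)

    # DLSS: 1080p render in 7.11 ms plus an assumed 4 ms tensor upres
    # -> 11.11 ms on the GPU, i.e. ~90 fps output -> ~28.1 ms
    dlss_4k = end_to_end_latency_ms(render_fps=1000 / 7.11, extra_gpu_ms=4.0)

    print(f"native 4K: {native_4k:.1f} ms, DLSS 4K: {dlss_4k:.1f} ms")
    ```

    Same point in code form: only the Render Queue + GPU term changes, and as long as the DLSS path's total GPU time is lower than the native frame time, the end-to-end latency drops with it.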

    Anyway, I feel like we're either talking past each other or have reached an impasse, so I'll leave it at that.
     
    Lex Luthor likes this.
  2. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    I understand that, but I have stated repeatedly that render time isn't what I'm referring to.
    I understand what you're saying there; I don't disagree with any of that. But like you said, we're talking past each other, so yeah - let's leave it at that.
     
    Lex Luthor and Denial like this.
  3. Lex Luthor

    Lex Luthor Master Guru

    Messages:
    781
    Likes Received:
    356
    GPU:
    RTX 3060
    I think the question is which will be the lesser of the two evils (less latency). Until we get hands-on and test it, it's a philosophical discussion. I'll stay optimistic. Going by the track record, that's a good bet ;)
     
  4. Prince Valiant

    Prince Valiant Master Guru

    Messages:
    819
    Likes Received:
    146
    GPU:
    EVGA GTX 1080 ti
    Why would anyone want to upscale content that's next to their eyes? I can only imagine how bad DLSS will look with a headset when it already looks poor with monitors.
     

  5. Dribble

    Dribble Master Guru

    Messages:
    369
    Likes Received:
    140
    GPU:
    Geforce 1070
    Well, you need AA, and most games' AA ends up looking blurry and has motion artefacts (sparkling edges). If DLSS is sharper and doesn't have the motion artefacts, which is generally the case for DLSS 2.0 on the quality setting, then it's a no-brainer.
     
