NVIDIA AI Powered DLDSR Unveiled

Discussion in 'Frontpage news' started by Rich_Guy, Jan 11, 2022.

  1. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    I don't think this fits.

    What does it matter? The technology utilizes hardware that GTX cards simply don't have. Is Nvidia not supposed to innovate? Or not utilize new hardware until some arbitrary amount of time passes, so owners of old cards can feel better?

    Like how come you're not complaining about AV1 decode not coming to GTX or RDNA1? Or Raytracing/Sampler Feedback/Mesh Shaders on older AMD cards, or Smart Access Memory - should all those features be cut from your card because older ones don't have access?
     
    mentor07825 likes this.
  2. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    8,637
    Likes Received:
    10,687
    GPU:
    RX 6800 XT
    You do realize that Pascal has support for DP4a instructions, so it could run DLSS in shaders. It would not be as efficient as tensor cores, and performance would not increase as much, but it would be a way to improve performance for many gamers.
    nVidia did use shaders to run DLSS 2.0 in a version of Control. And Intel's XeSS supports both its own tensor units and DP4a, so it will run on Pascal, RDNA2 and RTX cards.
    At a time when it's so difficult to upgrade your GPU, giving Pascal support for DLSS would be a welcome addition. Don't you think?
     
    Last edited: Jan 12, 2022
  3. mackintosh

    mackintosh Maha Guru

    Messages:
    1,162
    Likes Received:
    1,066
    GPU:
    .
    No one is saying that NVidia or AMD should not innovate. They can innovate to their heart's content. The issue is that we are no longer in a normal upgrade cycle. A statistically sizeable portion of PC gaming enthusiasts has been priced out of their hobby for the time being. The least either manufacturer could do is lend them a helping hand through driver software optimizations - even if it means looking back instead of forward for once. Even corporate greed ought to have its limits.
     
  4. Krizby

    Krizby Ancient Guru

    Messages:
    3,067
    Likes Received:
    1,742
    GPU:
    Asus RTX 4090 TUF
    I would rather the finite resources at Nvidia be spent on improving DLSS with tensor cores further, like what they are doing now. Pascal owners can sell their GPUs and buy newer ones; they have had plenty of opportunities to do so.
    Heck, even selling my old 1080Ti right now could get me a brand new 3060 (not much of an upgrade, I know, but the 3060 is still a better GPU than the 1080Ti feature-wise).
     
    Last edited: Jan 12, 2022
    mentor07825 likes this.

  5. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,414
    Likes Received:
    1,139
    GPU:
    RTX 3070
    Is it clear exactly how this tech works? Is it like DLSS but forceable from the driver? Is it DLSS but with a dynamic input resolution? I'm just a bit unclear on how this replaces DSR. Usually DLSS was used to improve performance, while DSR was used to just brute-force higher render resolutions, no? Thanks!
     
  6. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    Pascal has 1/4 the INT8 performance of Turing and cannot concurrently issue INT/FP - how do you know DLSS/DLDSR is even utilizing INT8?

    DLSS '1.9' is not an example. How do we know it wasn't running in FP16? Either way, it was worse quality than 2.0 and significantly worse than what we have now (2.3). Also, the performance advantage of DLSS isn't entirely from Tensor cores; it's mostly from rendering the image at a lower resolution. Run RSR or NIS on Control and compare it to DLSS 1.9 - $10 says the spatial upscalers will do a better job with the image, with the same performance benefit.
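    To illustrate the point that most of the speedup comes from shading fewer pixels, here is a minimal sketch. The per-axis render scales below are the commonly cited values for DLSS 2.x quality modes and are assumptions for illustration, not taken from this thread:

```python
# Rough sketch: most of DLSS's performance benefit comes from shading
# fewer pixels, regardless of how the upscaling itself is computed.
# Per-axis render scales are assumed values for the DLSS 2.x modes.

def pixels(width, height, scale=1.0):
    """Pixel count at a given per-axis render scale."""
    return int(width * scale) * int(height * scale)

native = pixels(2560, 1440)
modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

for name, scale in modes.items():
    rendered = pixels(2560, 1440, scale)
    print(f"{name}: {rendered} px rendered, {native / rendered:.2f}x fewer than native")
```

    At 1440p, Performance mode shades 4x fewer pixels than native; that reduction, not the reconstruction pass, is where the bulk of the frame-time saving comes from.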
     
    Last edited: Jan 12, 2022
    BlindBison likes this.
  7. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,414
    Likes Received:
    1,139
    GPU:
    RTX 3070
    For older games in particular I'm really excited to try out those features. I do hate that these have to be done via GeForce overlay and not the Nvidia Control Panel -- I much prefer enabling things there if I can.
     
    eGGroLLiO likes this.
  8. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,725
    Likes Received:
    1,854
    GPU:
    EVGA 1070Ti Black
    So it's done via GFE? Well, seeing as that's not installed and never will be, it's a useless feature to me :(
     
    BlindBison and eGGroLLiO like this.
  9. TimmyP

    TimmyP Guest

    Messages:
    1,398
    Likes Received:
    250
    GPU:
    RTX 3070
    There is a picture a page back with the setting in the control panel...
     
  10. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    8,637
    Likes Received:
    10,687
    GPU:
    RX 6800 XT
    DP4A is not the same as supporting generic packed 8-bit integer math; it's a specific instruction doing 8x8->16-bit multiplication with accumulation into a 32-bit integer, supporting various combinations of signed and unsigned operands. Presumably DP4A and DP2A are each counted as a single instruction, not a quad/double-rate operation, running at the same rate as 32-bit integer or float instructions.

    From nVidia itself:
    https://docs.nvidia.com/cuda/pascal-tuning-guide/index.html#int8
    GP104 provides specialized instructions for two-way and four-way integer dot products. These are well suited for accelerating Deep Learning inference workloads. The __dp4a intrinsic computes a dot product of four 8-bit integers with accumulation into a 32-bit integer. Similarly, __dp2a performs a two-element dot product between two 16-bit integers in one vector, and two 8-bit integers in another with accumulation into a 32-bit integer. Both instructions offer a throughput equal to that of FP32 arithmetic.
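    For anyone unsure what that dot product actually computes, here is a minimal emulation of the __dp4a semantics in plain Python (signed variant only; the hardware instruction also supports unsigned and mixed operands, packed into 32-bit registers):

```python
def dp4a(a_bytes, b_bytes, c):
    """Emulate the semantics of CUDA's __dp4a: a dot product of four
    signed 8-bit integers, accumulated into a 32-bit integer c.
    (The real instruction takes the four values packed into one
    32-bit register; 32-bit wraparound is omitted here for clarity.)"""
    assert len(a_bytes) == len(b_bytes) == 4
    return c + sum(a * b for a, b in zip(a_bytes, b_bytes))

# One call does 4 multiplies and 4 adds in a single instruction slot;
# a shader-based inference pass would issue many of these per pixel.
acc = dp4a([1, 2, 3, 4], [5, 6, 7, 8], 100)  # 5+12+21+32+100 = 170
```

    That's why DP4A matters for inference: it packs four multiply-accumulates into one instruction at FP32 throughput, even without dedicated tensor units.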

    DLSS 1.9 still had a decent performance boost, despite running on shaders. In terms of quality it was lacking, but I guess it wasn't using motion vectors and temporal accumulation for the final result. I think that only came with DLSS 2.0.
    And of course, with training, nVidia has made huge steps with more recent versions of DLSS 2.x.

    On the other hand, we can look at the Intel presentation where they compare render times of XeSS on DP4A and tensor units.
    As you can see, DP4A might be 2.5X slower than XMX, but it's still faster than native rendering.

    [Image: Intel slide comparing XeSS render times on XMX tensor units vs DP4a]
     
    BlindBison and Denial like this.

  11. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,725
    Likes Received:
    1,854
    GPU:
    EVGA 1070Ti Black
    I did not see that, must have missed it. I just saw a previous post saying it was done via the GFE overlay, but that might have been referencing something else; I will have to double check.

    If it's done via the Control Panel, then I'm most interested to see how it downscales 1440p/4K to 1080p, and whether it looks clearer and more readable, especially on the desktop. It would save me the trouble of buying a new monitor for the time being - IF it works how I think it would, anyway, and I could be wrong about that too.
     
  12. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    @BlindBison was presumably talking about the Freestyle filters also mentioned in the OP - which are enabled through GFE.
     
    BlindBison likes this.
  13. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,725
    Likes Received:
    1,854
    GPU:
    EVGA 1070Ti Black
    Probably. I know why I missed the screenshot of the DLDSR listing in the CP, though... I hadn't had coffee yet when I was looking through the thread.

    I still only install what I need for the drivers to work, which is the driver, PhysX, and HD audio. Hell, I'm still running the 466.11 drivers; I see no point in changing them when everything works and nothing I use needs newer drivers yet.
     
  14. geogan

    geogan Maha Guru

    Messages:
    1,267
    Likes Received:
    468
    GPU:
    4080 Gaming OC
    I just checked my own NVidia Control Panel settings there, and for some reason "DSR - Factors" was set to 2x and "DSR - Smoothness" to 33%!??

    Does that mean I was killing my 3070's framerate, running all my games at 2x 1440p resolution all this time without knowing it?? I did a reset to defaults, and DSR is now set to OFF!
     
  15. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    you have to select the resolution in game
    so no
     
    geogan, Dragam1337 and BlindBison like this.

  16. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    The GT 710 is still being produced (or it was as of last year); that doesn't mean anyone should expect it to have modern amenities.


    Couldn't have said it better myself.

    It's like people want innovation, they want new features, but they are unwilling to accept what that innovation requires.

    People who bought the last GTX cards knew what they were getting and what they were not getting. They knew Nvidia was putting its efforts and innovation heavily into the RTX series. And the people who didn't know that are also the people who are extremely unlikely to even understand what DLDSR is, let alone use it.

    So there's no point at which I can understand being upset that this doesn't come to the GTX cards we already knew were less featureful.

    In all honesty, I feel like people only complain here because they feel entitled as gamers, because this isn't abnormal for literally anything else.

    If you buy a Samsung Galaxy M series phone, you do not expect to get the support and features of the S series, and if new features arrive down the line through software updates, you do not expect them to trickle down to the M series.

    If you buy an LG OLED A2 series TV, you would not expect it to have the same support and features as the LG OLED G2 series, and if future software comes out that improves the TV's features, you would not expect it to come to the lower end.

    That's not to say that just because you would not EXPECT it, it's impossible in every situation; but you would not EXPECT it.

    This is life, there is nothing unique here just because it's gaming hardware and we are gamers.....
     
    Last edited: Jan 12, 2022
  17. Trunks0

    Trunks0 Maha Guru

    Messages:
    1,278
    Likes Received:
    791
    GPU:
    PC RedDevil 7900XTX
    RSR will likely get expanded support in later drivers. The same thing happened when they introduced Radeon Image Sharpening.
     
  18. Chert

    Chert Member Guru

    Messages:
    142
    Likes Received:
    44
    GPU:
    Gigabyte RTX 3060ti
    The accompanying photo in this article shows that DLDSR 2.25x is almost as fast fps-wise as native resolution. Does this mean if you use DLDSR 1.78x instead, you will get faster frame rates than native resolution?

    Can you also use DLDSR together with DLSS?
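    On stacking the two: a back-of-envelope sketch, assuming (this is my assumption, not something from the article) that the DLDSR factor multiplies the output pixel count and DLSS then renders that higher target at its usual reduced per-axis scale:

```python
# Back-of-envelope: combining DLDSR and DLSS scale factors.
# Assumption: DLDSR's factor (1.78x, 2.25x) is a pixel-count multiplier,
# so the per-axis multiplier is its square root; DLSS then renders the
# DLDSR target at a reduced per-axis scale (Quality mode ~0.667).

def combined_render_res(width, height, dldsr_factor, dlss_axis_scale):
    """Internal render resolution when DLDSR and DLSS stack."""
    axis = dldsr_factor ** 0.5  # per-axis DLDSR multiplier
    return (round(width * axis * dlss_axis_scale),
            round(height * axis * dlss_axis_scale))

# 1080p display, DLDSR 2.25x (per-axis 1.5x), DLSS Quality:
print(combined_render_res(1920, 1080, 2.25, 0.667))
```

    Under that assumption, DLDSR 2.25x plus DLSS Quality on a 1080p display renders internally at roughly native 1080p, which would explain frame rates close to native.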
     
  19. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,011
    Likes Received:
    7,352
    GPU:
    GTX 1080ti
    nobody wants opencl decoders.

    no it couldn't.
     
    cucaulay malkin likes this.
  20. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    8,637
    Likes Received:
    10,687
    GPU:
    RX 6800 XT
    Thanks for that in-depth explanation.
     
    HandR and cucaulay malkin like this.
