
NVIDIA Shows Comparison Benchmarks for DLSS Optimized 4K Rendering

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 16, 2018.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    33,718
    Likes Received:
    2,718
    GPU:
    AMD | NVIDIA
  2. Glottiz

    Glottiz Master Guru

    Messages:
    374
    Likes Received:
    52
    GPU:
    Strix 1080Ti
    Of course performance with DLSS is higher: DLSS is just a fancy name for upscaling from 1440p to 4K. This chart is essentially just showing the performance difference between 1440p and 2160p.
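    The pixel arithmetic behind that claim is easy to check. A back-of-the-envelope sketch (real frame times don't scale perfectly linearly with pixel count):

```python
# Rough pixel-count comparison between a 1440p internal render and native 4K.
# Shading cost scales roughly (not exactly) with the number of rendered pixels.
qhd = 2560 * 1440  # 3,686,400 pixels (1440p)
uhd = 3840 * 2160  # 8,294,400 pixels (2160p)

ratio = uhd / qhd
print(f"Native 4K shades {ratio:.2f}x the pixels of a 1440p internal render")
# -> Native 4K shades 2.25x the pixels of a 1440p internal render
```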
     
    scatman839 and Dragam1337 like this.
  3. Inquisitor

    Inquisitor Active Member

    Messages:
    85
    Likes Received:
    11
    GPU:
    EVGA 1080Ti SC Hybrid
    Wait... they're not even testing the same graphics cards. Why are there no like-for-like benchmarks, i.e. 2080 DLSS vs. 2080 TAA?!

    Those results just exaggerate the DLSS effect even more, making it look better than it is!
     
  4. nevcairiel

    nevcairiel Master Guru

    Messages:
    329
    Likes Received:
    78
    GPU:
    MSI 1080 Gaming X
    The performance is higher because you practically get anti-aliasing for free, which frees up a lot of GPU performance, especially at higher resolutions. TAA is actually quite expensive.
     

  5. Glottiz

    Glottiz Master Guru

    Messages:
    374
    Likes Received:
    52
    GPU:
    Strix 1080Ti
    "Free anti-aliasing method", hah, how naive are you? Nothing is free. I guess Nvidia's marketing campaign really worked on you.

    It's not free AA and it's not native 4K res. Digital Foundry did an in-depth DLSS analysis and pixel-counted it: DLSS is just running games at a 1440p internal resolution and upscaling to 4K.
     
    Dragam1337 likes this.
  6. Caesar

    Caesar Master Guru

    Messages:
    275
    Likes Received:
    59
    GPU:
    GTX 1070Ti Titanium
    It's a new upscaling technique. Just think about it: this is a new architecture, arriving after several years of the best one on the market, and we still can't get a GPU capable of pushing 4K 60 fps as standard. There's always something to trick you, like DLSS, checkerboarding, etc. 4K is simply not worth it.

    I "think" that DLSS blurs images. It's similar to FXAA, but with less computational cost (wattage?), fewer jaggies, and more blur. I personally prefer aliasing to blur.

    And it's only compatible with a handful of games that haven't been released yet, while 1800p + AA is possible with almost every game currently available and to come.
    Nvidia is bringing complex and overpriced new technology to achieve what was already possible for cheaper.

    Revolutionary indeed!
     
    HonoredShadow likes this.
  7. nevcairiel

    nevcairiel Master Guru

    Messages:
    329
    Likes Received:
    78
    GPU:
    MSI 1080 Gaming X
    It costs die space, and as such money, because it uses the Tensor Cores. But it frees up resources on the CUDA cores.

    PS:
    The Digital Foundry article hardly reads as very in-depth.
     
    carnivore, Maddness, Caesar and 2 others like this.
  8. AlmondMan

    AlmondMan Master Guru

    Messages:
    429
    Likes Received:
    25
    GPU:
    Sapphire 480 Nitro+
    "nVidia shows" - oh, like they showed everything else about this launch. With absolutely meaningless and misleading numbers and graphs that make things look good because they're in a vacuum.
     
  9. PolishRenegade

    PolishRenegade Member

    Messages:
    14
    Likes Received:
    7
    GPU:
    RTX 2070
    This.

    It is free in terms of raster computing power, which is why FPS are higher when DLSS is enabled. The real problem with DLSS is not the tech but the implementation: devs have to send their games to some AI compute centre at NVIDIA to even enable it. Dead end, IMHO.

    Until NVIDIA can make this an engine-wide implementation by licensing the algorithm (will never happen), the only added value of the 2000 series is the ray tracing cores.
     
  10. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    9,186
    Likes Received:
    1,421
    GPU:
    1080Ti @h2o
    These are not benchmarks, that's a PR slide that looks just like the ones we saw from the Turing announcement event in August...
     

  11. Denial

    Denial Ancient Guru

    Messages:
    11,613
    Likes Received:
    608
    GPU:
    EVGA 1080Ti
    They don't have to send it - Nvidia already stated they can train it themselves, it just costs a ton of compute power. On the flip side, most devs already send their game code to Nvidia, who will do the training for free - so it's a no-brainer, and it's the reason several indie games are pledging support for it. People should also realize that it's an iterative process, as all AI applications are: the more training data, the more accurate you can make the model, the better DLSS gets.

    I personally don't care for the idea of paying for "future" performance, but the tech itself as a value-add is pretty nifty and I don't think it's going to be a one-and-done thing. I think the use of deep learning to accelerate visual applications is only starting.
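    The "more training data, more accurate model" point holds for any fitted model. As a toy illustration only (a one-parameter least-squares fit with deterministic noise, nothing like NVIDIA's actual DLSS training pipeline), the estimate tightens as the training set grows:

```python
def fit_slope(samples):
    """Least-squares slope through the origin for (x, y) pairs."""
    num = sum(x * y for x, y in samples)
    den = sum(x * x for x, _ in samples)
    return num / den

TRUE_SLOPE = 2.0
# Deterministic alternating "noise" keeps the example reproducible.
data = [(x, TRUE_SLOPE * x + 0.5 * (-1) ** x) for x in range(1, 1001)]

err_small = abs(fit_slope(data[:10]) - TRUE_SLOPE)  # ~6.5e-3 with 10 samples
err_large = abs(fit_slope(data) - TRUE_SLOPE)       # ~7.5e-7 with 1000 samples
print(err_small, err_large)
```

    The larger training set pins the parameter down far more tightly, which is the generic reason an iteratively trained model like DLSS can keep improving as more game data is fed in.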
     
  12. Glottiz

    Glottiz Master Guru

    Messages:
    374
    Likes Received:
    52
    GPU:
    Strix 1080Ti
    Yes resources are shifted more towards Tensor Cores, but this is not the main secret behind DLSS performance. Upscaling from 1440p resolution is how Nvidia magically gets this performance boost.

    This is why Nvidia only compares DLSS vs TAA. Because if they showed 4K without AA vs DLSS everyone would realize that something doesn't add up. Why would native 4K without AA perform so much worse than DLSS?
     
  13. nizzen

    nizzen Master Guru

    Messages:
    523
    Likes Received:
    36
    GPU:
    2x1080ti /1080/1060
    We need the games with RT and DLSS, not demos and benchmarks...
     
  14. Barry J

    Barry J Ancient Guru

    Messages:
    2,688
    Likes Received:
    78
    GPU:
    MSI 1080ti GAMING X
    So DLSS gives you free AA at 4K? Then I don't need DLSS, as I don't use AA at 4K.
     
  15. tensai28

    tensai28 Maha Guru

    Messages:
    1,112
    Likes Received:
    255
    GPU:
    2080ti MSI X TRIO
    The 2080 Ti does 4K ultra @ 60 fps with no problems, with or without DLSS.
     

  16. Denial

    Denial Ancient Guru

    Messages:
    11,613
    Likes Received:
    608
    GPU:
    EVGA 1080Ti
    This isn't how it works, no.
     
    Anarion, carnivore, Maddness and 2 others like this.
  17. PolishRenegade

    PolishRenegade Member

    Messages:
    14
    Likes Received:
    7
    GPU:
    RTX 2070
    You need to contact NVIDIA, or they need to contact you, to "train" your game, so it ends up being a process of approval. And if it's not, I would be very surprised as a dev to "suddenly" see my game supported with a driver update.

    The worst part is that the SDK for it is STILL not available, a month after launch.

    The whole thing, from a dev's point of view, feels like a walled garden of proprietary tech... circa the 2005-ish era of closed gaming technologies and engines. Like having to buy a PhysX card for your machine for X-Y game.
     
  18. Andy Watson

    Andy Watson Member

    Messages:
    11
    Likes Received:
    1
    GPU:
    960
    Not the most useful of things. We need to see values for 4K without any DLSS or TAA, then with TAA, and then with DLSS, AS WELL AS full-res screenshots to compare the quality.
     
  19. Stefem

    Stefem Member

    Messages:
    28
    Likes Received:
    2
    GPU:
    16
    That was known from day one. DLSS works by reconstructing a 4K image from a lower-resolution one using AI, eliminating aliasing in the process; DLSS 2X starts from a 4K image and uses AI just to remove aliasing. That's how NVIDIA describes it.
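    For contrast, the crudest form of upscaling is plain pixel duplication, which invents no new detail; DLSS's learned reconstruction is supposed to improve on exactly this. A minimal nearest-neighbor sketch (an image modeled as a 2D list of values, purely illustrative):

```python
def upscale_nearest(img, factor):
    """Nearest-neighbor upscale: repeat each pixel `factor` times on both axes."""
    out = []
    for row in img:
        wide = [row[x // factor] for x in range(len(row) * factor)]
        out.extend(list(wide) for _ in range(factor))
    return out

small = [[1, 2],
         [3, 4]]
print(upscale_nearest(small, 2))
# -> [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

    Where nearest-neighbor or bilinear scaling can only redistribute the pixels it already has, a trained network infers plausible high-resolution detail, which is why NVIDIA pitches the reconstruction as close to native 4K.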
     
  20. Stefem

    Stefem Member

    Messages:
    28
    Likes Received:
    2
    GPU:
    16
    It's not even similar to FXAA, and it requires much, much higher computational resources, since it reconstructs the image using AI; the incredible computational density of the Tensor Cores allows it to look "free".
     
    Nima V likes this.
