NVIDIA Shows Comparison Benchmarks for DLSS Optimized 4K Rendering

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 16, 2018.

  1. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    We did. DLSS and TAA are the only methods that remove shimmering without dropping fps to half or less. So they are the ones to compare here in terms of performance and IQ. Deal with it, or don't, like you haven't until now.

    If you think that you are playing your Forza without shimmering, you consider shimmering to be something other than what it is.

    There is another possible explanation: playing at 4K on a 28'' screen and sitting far enough away. That way you no longer actually perceive the 4K image, but a downsampled one.
     
  2. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
    Dear lord, there are so many Nvidia fanboys who just accept whatever Nvidia's marketing department says as God's honest truth...

    DLSS is equivalent to using traditional upscaling together with TAA and a sharpening filter - nothing more, nothing less.
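    For what it's worth, the "traditional" pipeline being described here can be sketched in a few lines. This is a toy NumPy illustration of plain upscaling plus an unsharp-mask sharpening filter (the 2x2 frame and the 0.5 strength are made up for illustration; it says nothing about what DLSS does internally):

```python
import numpy as np

def upscale_nearest(img, factor):
    """Nearest-neighbour upscale of a 2D luminance image."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def unsharp_mask(img, strength=0.5):
    """Sharpen by adding back the difference from a 3x3 box blur."""
    padded = np.pad(img, 1, mode="edge")
    blurred = sum(padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                  for dy in range(3) for dx in range(3)) / 9.0
    return img + strength * (img - blurred)

low_res = np.array([[0.0, 1.0],
                    [1.0, 0.0]])          # toy 2x2 "frame"
frame = unsharp_mask(upscale_nearest(low_res, 2))
```

    Note the sharpened result overshoots past 1.0 at the block edges - sharpening adds contrast, it does not add detail.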
     
    Biały Wilk and Glottiz like this.
  3. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    You are the one who brought distorted images as proof for this stupid statement last time? Good one. Take another cake; you win by default, because I can't take another comparison like that. No reason to make my eyes bleed again.
     
  4. Biały Wilk

    Biały Wilk Guest

    Messages:
    159
    Likes Received:
    29
    GPU:
    Gigabyte Radeon HD 7850
    DLSS will live for this generation of NV cards; when AMD releases new ones, it will die. End of story. It's a feature for today (still almost no games), and that's all there is to it.
     

  5. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    9,633
    Likes Received:
    3,413
    GPU:
    NVIDIA RTX 4070 Ti
    Where is the comparison?
     
  6. Valken

    Valken Ancient Guru

    Messages:
    2,922
    Likes Received:
    903
    GPU:
    Forsa 1060 3GB Temp GPU
    Here is a thought - Nvidia owners with 4K screens, do a test: upscale 1440p to 4K with TAA so we can see it side by side with an FPS counter.

    I only have a 1080p screen.
     
  7. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    13,138
    Likes Received:
    1,091
    GPU:
    MSI 2070S X-Trio
    I bet those who have paid well over 2k for all those fancy Asus 4K HDR monitors etc. are well happy, having to drop them down to 1080p for ray tracing and upscale to their 4K for DLSS. :p
     
    jura11, -Tj-, carnivore and 4 others like this.
  8. PolishRenegade

    PolishRenegade Guest

    Messages:
    16
    Likes Received:
    7
    GPU:
    RTX 2070
    Free AA is free AA. It doesn't matter what method is used.

    That said, I am still pissed it requires NVIDIA to implement it into a game. Again, it -does not matter- if you can get their framework as open source. It's not the framework that's important, it's the accumulated results that the AI has been trained on for months or years. And there is no way NVIDIA will share that, which makes DLSS a 100% proprietary tech that only a subset of game devs (AAA, and smaller very well-known studios, "aka indie" - lol) will ever be able to implement.

    However, ray tracing is worth its compute power in gold. Just having any engine implement support for using those cores during game dev is incredible. Bye bye, low-quality real-time baking of lightmaps... hello, full lightmaps in real time during dev. Or moving all ray-tracing functions to those cores... you have no idea how much ray casting is used in gameplay: occlusion culling, AI, and path-finding, to name a few.
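    To make that last point concrete: a gameplay line-of-sight query is just a segment-vs-geometry intersection test. A toy NumPy sketch with a single sphere obstacle (the scene and all names are invented for illustration):

```python
import numpy as np

def segment_hits_sphere(origin, target, center, radius):
    """True if the segment origin->target intersects the sphere."""
    d = target - origin
    f = origin - center
    a = d @ d
    b = 2.0 * (f @ d)
    c = f @ f - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return False                      # ray line misses the sphere entirely
    t = (-b - np.sqrt(disc)) / (2.0 * a)  # nearest intersection parameter
    return 0.0 <= t <= 1.0                # hit must lie on the segment

player = np.array([0.0, 0.0, 0.0])
enemy = np.array([10.0, 0.0, 0.0])
pillar = np.array([5.0, 0.0, 0.0])        # obstacle centred between them
blocked = segment_hits_sphere(player, enemy, pillar, radius=1.0)
```

    An engine fires thousands of queries like this per frame for AI visibility and occlusion, which is why hardware that accelerates ray tests is interesting beyond graphics.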
     
  9. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,251
    Likes Received:
    232
    GPU:
    EVGA GTX 1080@2,025
    That's not what they said. You're grossly oversimplifying things. Also, the only way you could tell a difference was when comparing screenshots side by side. And keep in mind, they were using the low-quality 'performance' DLSS variant. There is a higher-quality version that doesn't use any lower-than-native-resolution imagery, but instead uses supersampling.

     
    Fox2232 likes this.
  10. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    No it's not.. I'm getting sick in general of people not understanding things so they go "well it must be this because it's easier to explain" "must just be marketing nonsense because I don't want to spend a few minutes reading anything about it". "Tensor cores and RT cores are the same because I didn't read the article I linked as proof that Nvidia engineers don't know what they are doing!!1!"

    If you said "it's not worth the effort" "there are issues with it" "the implementation sucks because it requires training" "the feature isn't worth the price/transistor premium on the card" etc, I'd be fine with all of that, but it's not the same as upscaling with a sharpening filter. It's not even remotely close to the same.

    DLSS is based on the work Marco Salvi presented at Siggraph 2017 on warped autoencoders in real time rendering:

    https://openproblems.realtimerendering.com/s2017/05-msalvi-dl-future-of-rendering.pptx

    He demonstrated a network trained against 16 spp, 8-image sequences, and utilized a warping mechanism on the hidden states to reduce blurring/temporal issues by allowing the network to follow features around the frame without increasing the size of the layers. DLSS is just a continuation of that work. It's a drastically different approach from just upscaling and sharpening the final image, which wouldn't carry any of the temporal anti-aliasing benefits of DLSS's approach and wouldn't even give as sharp an image as DLSS in many cases.
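    A rough sketch of the warping idea, under heavy simplifying assumptions (integer per-pixel motion vectors and a single-channel 2D hidden state; a real renderer supplies subpixel motion vectors and the state has feature channels): the previous frame's state is gathered along the motion vectors so features line up with their new screen positions.

```python
import numpy as np

def warp_state(prev_state, motion):
    """Gather prev_state[y - dy, x - dx] for integer motion (dy, dx)."""
    h, w = prev_state.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(ys - motion[..., 0], 0, h - 1)  # clamp at frame borders
    src_x = np.clip(xs - motion[..., 1], 0, w - 1)
    return prev_state[src_y, src_x]

state = np.arange(16.0).reshape(4, 4)   # toy 4x4 hidden state
motion = np.zeros((4, 4, 2), dtype=int)
motion[..., 1] = 1                      # everything moved 1 px right
warped = warp_state(state, motion)
```

    After warping, each pixel of the hidden state describes the same scene feature as the current frame's pixel, which is what lets the network reuse history without bigger layers.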
     
    fantaskarsef likes this.

  11. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
    I said equivalent - not technically the same ;)

    I think most people would be hard pressed to see the difference between DLSS and upscaling with TAA and a sharpening filter, and the performance gain is the same.
     
  12. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    It is simple. DLSS removes shimmering. Any sharpening filter makes shimmering and aliasing worse. You, personally, may not see it, but that's exactly what happens.
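    The overshoot is easy to show numerically. A toy 1D unsharp mask applied to a clean step edge (signal and strength are arbitrary) pushes values past the original range - that extra edge contrast is exactly what makes leftover aliasing flicker harder from frame to frame:

```python
import numpy as np

def sharpen_1d(signal, strength=1.0):
    """Unsharp mask: add back the difference from a 3-tap box blur."""
    padded = np.pad(signal, 1, mode="edge")
    blurred = (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0
    return signal + strength * (signal - blurred)

edge = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])  # a clean step edge
sharpened = sharpen_1d(edge)
```

    The sharpened edge overshoots above 1.0 and undershoots below 0.0 on either side of the step - higher contrast, not less aliasing.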

    You say most; I think a very limited few. You think most people are visually impaired; I think only a minority has that kind of problem with their sight.
     
    fry178 likes this.
  13. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    All this talk and yet we still don't have a single game with DLSS. Much ado about vaporware :p.

    Personally, I would love to try out DLSS, but I doubt I'll be able to anytime soon. Even if I buy an RTX card, I doubt many game developers will support it, seeing as only 1% of gamers would be able to use it (game devs aren't going to devote significant resources to implementing a feature that virtually nobody can use). The RTX 2070 may change this a bit, but I expect the majority will buy the cheaper and more sensible 10-series cards. Without mainstream support, DLSS will be a feature reserved for the elite few (the same goes for ray tracing).

    As with ray-tracing, I think we will need to wait for at least the 30-series before this feature becomes available to the mainstream. I think adoption will be slow and gradual, assuming it happens at all.
     
    Prince Valiant and Dragam1337 like this.
  14. Prince Valiant

    Prince Valiant Master Guru

    Messages:
    819
    Likes Received:
    146
    GPU:
    EVGA GTX 1080 ti
    That's kind of sad. I knew RT was going to be vaporware; I didn't expect DLSS would be vaporware too.
     
  15. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    Idk.. 25 games are already pledging support for it; that's way more than any previous Nvidia feature this early into release. Also, I don't know why you think game devs have to do anything significant: they ship Nvidia their game code (which most do anyway), Nvidia ships them back a DLSS library for free, and they add a checkbox in the menu that enables it as a post-process effect. They don't really have to do anything at the game-code level. It's why so many games are pledging support already.

    It will almost certainly be a feature that continues throughout Nvidia's hardware line, so all games that support it now will support it on the next series and so on. It will also improve as the training model improves - potentially a GTX3060 could "upscale" 720p to 1440p with little to no image quality loss, enabling higher framerates at lower price tiers. I don't think DLSS is going away - if anything, I think the idea of deep learning inside the rendering engine is going to be applied to many more ideas than just "upscaling".
     
    Maddness likes this.

  16. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    So it's just like what Huang said: it just works? :p

    It's not quite that simple. Not only would implementing DLSS introduce an additional delay, but there would also need to be additional testing and validation afterwards (they can't just assume it'll work correctly and release willy-nilly). In particular, this kind of feature would impact the entire game, so it would need end-to-end testing to make sure there are no anomalies. This requires additional time, money, and resources - when you're running on a tight timeline and need to meet that holiday deadline, such delays can prove costly (I should know - I'm a professional application developer). Patches, DLCs, and other updates might also require further training and/or testing, presenting an ongoing cost.

    This isn't to say that DLSS is a particularly difficult feature to implement, just that there are costs associated with it that need to be considered. The same goes for ray tracing - I know Huang likes to present it as an easy feature to implement, but that's mostly just marketing.
     
  17. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    I get that there will be some development cost, but it's extremely minimal. It's not like they are doing some extra validation pass specifically for DLSS; that pass is occurring regardless before the game ships. At best you're adding a DLSS toggle and testing various scenes - it would take an intern like 20 minutes to write a test case for that - and it's not like you can adjust it anyway, it's just on or off.. also, judging by FFXV's implementation creating some strange artifacts in various places, I'd be willing to bet most devs don't test it before release regardless.

    A good example of how cheap it is: Deliver Us The Moon: Fortuna is getting it, and that's a $100,000 Kickstarter game by an independent studio. I think if they can add it to that game, then literally anyone can add it to their game for minimal development cost.

    As for ray tracing, that's a bit different.. the idea there is that you're eliminating a lot of the hackery that goes on in the engine to create specific lighting conditions. For most devs all this work is already done - Unreal Engine, for example, makes use of all kinds of tricks to improve performance while maintaining realism. Things like shadow capsules need to be applied to every character asset in order to improve shadow performance by simplifying the geometry. These capsules need to be hand-placed by a character designer on each asset and bound to its animation. The work of programming/creating/implementing the shadow capsule itself was already done by Epic.. so all devs have to do is map them to character assets. Theoretically though, if that had never been implemented by Epic and they had instead just used an RT system to do shadowing, none of that work, or the work involved with placing them, would ever have had to be done. They wouldn't have had to spend multiple iterations perfecting shadow capsules, or area light shadows, soft shadows, etc.. it's all taken care of by one implementation that should behave exactly how real light behaves, for a predictable performance hit. Eventually that will include shadows, global illumination, caustics sim, second and third bounces, etc. - things that traditional renderers aren't even capable of - and it's all done by one implementation. Obviously right now it's in its infancy, but the idea is to start creating engines where RT is being implemented and tested, and eventually use RT as the only lighting system as performance improves.
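    That "one generic mechanism" point can be sketched: a ray-traced shadow query is just a visibility test toward the light, with no per-asset proxies. A toy NumPy version with sphere occluders (the scene and all names are invented; the ray direction is assumed normalised):

```python
import numpy as np

def ray_sphere_t(origin, direction, center, radius):
    """Nearest positive hit distance along a ray, or None."""
    f = origin - center
    b = 2.0 * (f @ direction)
    c = f @ f - radius * radius
    disc = b * b - 4.0 * c          # direction normalised, so a = 1
    if disc < 0.0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None  # small epsilon avoids self-intersection

def in_shadow(point, light_pos, occluders):
    """Shadow ray: shadowed if any occluder sits between point and light."""
    to_light = light_pos - point
    dist = np.linalg.norm(to_light)
    direction = to_light / dist
    return any(
        (t := ray_sphere_t(point, direction, c, r)) is not None and t < dist
        for c, r in occluders
    )

light = np.array([0.0, 10.0, 0.0])
ground_point = np.array([0.0, 0.0, 0.0])
shadowed = in_shadow(ground_point, light, [(np.array([0.0, 5.0, 0.0]), 1.0)])
```

    The same query handles any geometry the intersection function understands - that's the appeal versus hand-tuning a separate trick for each shadow type.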

    Also the metro dev had this to say about RT:

    You also have to remember they did the iteration testing on Volta and not Turing, similar to DICE. On Turing, RT is significantly faster, allowing for faster iterations, not to mention that I'm sure the RTX libraries, toolsets, etc. are all improving during this early process. Once those tools are up, implementing RT will be relatively easy compared to implementing 50 different hacks into your engine to achieve similar quality results (although perhaps at better performance, but again, within a gen or two that won't even matter). So yeah, obviously the jacket is going to spin it into "just works" - that's the marketing bit, but there is a lot of truth to it being a preferred, easy-to-implement solution in the future, and I think game developers understand that and the work involved.
     
    Maddness likes this.
  18. RealNC

    RealNC Ancient Guru

    Messages:
    4,945
    Likes Received:
    3,223
    GPU:
    4070 Ti Super
    Actually, it seems like DLSS might only remove shimmering if upscaling to a non-native res and then downscaling again. Like on a 1080p monitor: doing DLSS 4K, and then you see the result at 1080p.

    If you're using a 4K monitor, then DLSS 4K upscaling might not remove shimmering?
     
  19. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    That is a possibility. I have yet to see 4K DLSS in action, and I am not in the mood to download some XX GB video just to find out. But officially it is meant to do that, and that quite impressive level of AA is nothing to ignore.
     
  20. fry178

    fry178 Ancient Guru

    Messages:
    2,067
    Likes Received:
    377
    GPU:
    Aorus 2080S WB
    @Fox2232
    I don't see it?
    Neither is D3D free (again: licensed), nor did MS develop it and then give it away for free to another company, with that company being the only one making money with it.
    Also, MS is using it in Windows, as you stated - ergo, not an unselfish act.

    And G-Sync is really only needed for lower-grade GPUs or people playing at a competitive level, as any game I have runs at 1440p, maxed out incl. 32x SS / 8x transparency SS, and except for Siege (crappy code), even with DSR at 5K, at 60 fps (vsync) on my 1080.

    And most people run consoles on TVs and PC gaming on monitors, so FreeSync was a good move by AMD, but it only became a game changer with Samsung putting it on their TVs and MS using it on the Xbox. It wouldn't have been one without those things.

    And for me FreeSync will never happen anyway, as I hate Samsung TVs and the 50Hz-European-soap-opera look of their motion processing.
     
