NVIDIA confirmed that five titles will feature DLSS 3.0 within the next week.

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 12, 2022.

  1. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    DLSS 3 is three different technologies: Super Resolution, Frame Generation, and NVIDIA Reflex. Super Resolution and Reflex will work on older GPUs; only Frame Generation is locked to the new cards.
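    For what it's worth, here's a minimal sketch of how that split maps to GPU generations. The names and table are illustrative only, not an actual NVIDIA API; the grounding is just that Frame Generation needs Ada's optical flow hardware, Super Resolution needs Turing-or-newer Tensor cores, and Reflex reaches back to the GTX 900 series:

    ```python
    # Illustrative only -- not an NVIDIA API. Maps DLSS 3's three component
    # technologies to the oldest GPU architecture each one supports.
    DLSS3_COMPONENTS = {
        "super_resolution": "turing",   # RTX 20 series and newer (Tensor cores)
        "frame_generation": "ada",      # RTX 40 series only (optical flow accelerator)
        "reflex":           "maxwell",  # GTX 900 series and newer
    }

    ARCH_ORDER = ["maxwell", "pascal", "turing", "ampere", "ada"]  # oldest -> newest

    def available_components(gpu_arch: str) -> list[str]:
        """Return the DLSS 3 components a given architecture can use."""
        gen = ARCH_ORDER.index(gpu_arch)
        return [name for name, minimum in DLSS3_COMPONENTS.items()
                if gen >= ARCH_ORDER.index(minimum)]

    print(available_components("ampere"))  # ['super_resolution', 'reflex']
    print(available_components("ada"))     # all three
    ```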
     
    Deleted member 282649 likes this.
  2. Krizby

    Krizby Ancient Guru

    Messages:
    3,111
    Likes Received:
    1,793
    GPU:
    Asus RTX 4090 TUF
    The dedicated Gsync module brings additional features that AdaptiveSync can't replicate, like the Ultra Low Motion Blur mode it launched with back in 2014. Newer Gsync monitors also add the Reflex analyzer. These features (the Gsync Ultimate tier) are really only worth it for esports gamers.

    If you think AMD is playing nice, how about Smart Access Memory: did AMD release any source code for that tech? Nope. Nvidia and Intel figured out on their own that it's just Resizable BAR, which had existed in the PCIe spec all along. AMD only open-sources its tech after Nvidia has introduced the equivalent first (like FSR). In the end, all these decisions are made for the good of their own respective companies.

    Well, for proprietary software, DLSS sure turned out popular enough that almost all new AAA games come with DLSS support. Surely devs must have realized that the majority of their target audience has RTX GPUs.
     
  3. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,209
    GPU:
    AD102/Navi21
    meanwhile the free-for-all FSR 2.0 has most of its work done by modders, not developers, and needs the DLSS dll to work in the first place.
     
  4. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,023
    Likes Received:
    4,400
    GPU:
    Asrock 7700XT
    The concept behind ULMB is something that could be effortlessly implemented by the panel itself and doesn't need to be part of the AdaptiveSync spec.
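    (That concept is just backlight strobing: flash the backlight briefly once per refresh instead of leaving it lit, trading brightness for lower persistence. A rough sketch of the timing math, with illustrative numbers rather than any vendor's firmware values:)

    ```python
    def strobe_timing(refresh_hz: float, duty_cycle: float = 0.25) -> tuple[float, float]:
        """Backlight strobing: one pulse per refresh. A shorter pulse means
        less persistence blur but a proportionally dimmer image."""
        frame_ms = 1000.0 / refresh_hz
        pulse_ms = frame_ms * duty_cycle
        return frame_ms, pulse_ms

    frame_ms, pulse_ms = strobe_timing(120)
    # 120 Hz -> 8.33 ms per refresh; a 25% duty cycle strobes the backlight
    # for ~2.08 ms and cuts perceived brightness to roughly a quarter.
    print(f"{frame_ms:.2f} ms frame, {pulse_ms:.2f} ms pulse")
    ```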
    The Reflex analyzer is nice, but considering it revolves around 3rd party peripherals, that already suggests there's nothing about it that has to be locked to Nvidia.
    Uh... you know SAM is just a marketing name for rBAR, right (same goes for FreeSync)? What is AMD supposed to release for a feature that was already part of the PCIe spec? What would Intel and Nvidia have to "figure out" when they most likely contributed to the spec themselves? Clearly, AMD isn't doing anything special with it, seeing as Intel CPU platforms can support SAM and nothing prevented Nvidia GPUs from using rBAR on AMD platforms.
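    As an aside, you can see for yourself that rBAR is plain PCIe plumbing rather than vendor magic: on Linux, the BAR apertures of any GPU are exposed in sysfs. A rough sketch (the PCI address is a placeholder; with rBAR active, one BAR spans the full VRAM instead of the classic 256 MB window):

    ```python
    # Rough sketch: read a GPU's PCI BAR sizes from sysfs on Linux.
    # The device address below is a placeholder; find yours with `lspci`.
    from pathlib import Path

    def bar_sizes(pci_addr: str = "0000:01:00.0") -> list[int]:
        sizes = []
        resource = Path(f"/sys/bus/pci/devices/{pci_addr}/resource")
        for line in resource.read_text().splitlines():
            start, end, _flags = (int(field, 16) for field in line.split())
            if end > start:              # skip unused BAR slots (all zeros)
                sizes.append(end - start + 1)
        return sizes

    # With Resizable BAR active, one region covers all of VRAM (e.g. 16 GiB)
    # instead of the classic 256 MiB window.
    for size in bar_sizes():
        print(f"{size / 2**20:.0f} MiB")
    ```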

    So yeah, not really the best example.
    There are also a lot of new titles that don't appear to support it.
    As far as I understand (and maybe I'm mistaken), not just anyone can implement DLSS, because Nvidia does most of the work on their end with AI training. I assume they don't just freely give out that training to just anyone. That being said, it's relatively easy for studios to implement it since they don't have to do much themselves.
    Also, according to Steam surveys, RTX-capable GPUs make up about 30% of the PC market. And since you specified the target audience, the "majority" claim still isn't true, because consoles are part of that audience and they don't support DLSS. Why do I say any of this? Because Nvidia probably approaches the studios to implement DLSS rather than the other way around. Which, honestly, I'm in favor of - I see no problem with a vendor working with 3rd parties to make a better product. I'd rather it not be for a platform-exclusive feature, but the practice in general is good.

    Indeed it is, which honestly I think is a good thing - it's getting wide adoption that way, and we're not at the mercy of AMD's sub-par driver devs to make things work. I would rather have a feature that isn't quite as good but works on just about anything than a feature that is great (but still has room for improvement) yet limited to a select few titles. I guess the nice thing for Nvidia users is that they technically can have both, since FSR runs on Nvidia cards too.
     

  5. NoLikes

    NoLikes Member Guru

    Messages:
    163
    Likes Received:
    58
    GPU:
    1080TI
    DLSS frame interpolation and DLSS supersampling don't conjure up otherwise nonexistent data out of nothing: both implementations require engine input for their output, and neither can really be considered a trade secret.
    DLSS2 was ahead in terms of optimization, and NVIDIA did a wonderful job of blurring the line for the past 4 years, but... right now, the FSR2 and DLSS2 supersampling techniques are on par.
    XeSS is also doing very well on generic Shader Model 6.4 hardware.
    So, the point is:
    NVIDIA, unlike AMD and Intel, is the only one missing a generic implementation, even though the generic ones are netting the same quality and comparable performance to the locked-down DLSS2.

    See the bad value point yet?
     
  6. MonstroMart

    MonstroMart Maha Guru

    Messages:
    1,397
    Likes Received:
    878
    GPU:
    RX 6800 Red Dragon
    It will be interesting to see how DLSS 3 performs in non-sponsored titles next year.
     
    Why_Me likes this.
  7. H83

    H83 Ancient Guru

    Messages:
    5,515
    Likes Received:
    3,037
    GPU:
    XFX Black 6950XT
    Frame generation has the potential to become something very important in the future, although I have mixed feelings about it.
     
  8. loracle

    loracle Master Guru

    Messages:
    442
    Likes Received:
    212
    GPU:
    GTX 1660 super
    I meant the DLSS bullshit first of all, and the drivers before the last two releases were also bullshit; they only began to fix that after a lot of people complained about it. Also, cards are made with weaker components these days, not as strong as in the old days, when they lasted 5 years and sometimes even more. Now they last only 2-3 years, to make people buy new cards quickly, and they are pushing cards too far with factory OC for the same purpose, unfortunately for us.
     
    Last edited: Oct 13, 2022
  9. Agonist

    Agonist Ancient Guru

    Messages:
    4,287
    Likes Received:
    1,316
    GPU:
    XFX 7900xtx Black
    HOLY HELL FANBOY..................
     
  10. Krizby

    Krizby Ancient Guru

    Messages:
    3,111
    Likes Received:
    1,793
    GPU:
    Asus RTX 4090 TUF
    Nah, rBAR is like a tickbox for Nvidia/Intel; they don't dedicate any resources to it. Right now there are only 23 games whitelisted for rBAR in the latest driver. Same for Intel: their CPUs support it, but the older gens gain no performance benefit, or even see a perf degradation (like my 9900K).

    Like a horse race, Nvidia/RTG release some exclusive features and the other has to reverse engineer those features; that's the beauty of competition. I'll give you another example: RTG released Radeon Anti-Lag, which was marginally better than Nvidia's NULL (zero render queue depth); later, Nvidia released Reflex, which outdid Radeon Anti-Lag (but requires dev integration).
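    The common idea behind NULL/Anti-Lag/Reflex is the same: don't let the CPU queue frames ahead of a busy GPU, because every queued frame ages before it's drawn. A toy latency model under that assumption (numbers illustrative):

    ```python
    def input_to_photon_ms(cpu_ms: float, gpu_ms: float, queue_depth: int) -> float:
        """Toy model: in a GPU-bound game, each frame waiting in the render
        queue ages by one GPU frame time before it reaches the screen."""
        return cpu_ms + gpu_ms + queue_depth * gpu_ms

    # GPU-bound example: 4 ms of CPU work, 16 ms of GPU work per frame.
    for depth in (3, 1, 0):  # default queue vs. low-latency vs. just-in-time
        print(f"queue depth {depth}: {input_to_photon_ms(4, 16, depth):.0f} ms")
    # queue depth 3: 68 ms / queue depth 1: 36 ms / queue depth 0: 20 ms
    ```

    Reflex needing dev integration fits that picture: the engine itself has to delay input sampling and simulation until just before the GPU is ready, which a driver-only toggle can only approximate.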

    W1zzard from TPU mentioned that DLSS2 titles only need Reflex in order to support DLSS3, and Reflex takes a day or two to implement.

    DLSS/FSR2.0/XeSS require motion vectors and other data from the game engine; devs must first learn how to make their engine provide those, and Nvidia has done all the groundwork for FSR2.0/XeSS. That's why you'll see DLSS games supporting FSR2.0/XeSS: they all work the same way.
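    To illustrate why they're interchangeable once one is wired up, here's a hypothetical sketch of the per-frame contract all three upscalers share. The field names are illustrative, not any of the real SDK structs:

    ```python
    # Hypothetical illustration of the per-frame data that DLSS 2, FSR 2,
    # and XeSS all consume; names are illustrative, not the real SDK structs.
    from dataclasses import dataclass

    @dataclass
    class UpscalerFrameInputs:
        color: object                       # low-res, jittered color buffer
        depth: object                       # low-res depth buffer
        motion_vectors: object              # per-pixel screen-space motion
        jitter_offset: tuple[float, float]  # sub-pixel camera jitter this frame
        render_size: tuple[int, int]        # internal render resolution
        output_size: tuple[int, int]        # target display resolution
        frame_time_ms: float                # timing info for temporal heuristics
        reset_history: bool = False         # drop temporal history on scene cuts

    def dispatch(upscaler, inputs: UpscalerFrameInputs):
        """Once the engine fills this struct, swapping between the three
        vendors' upscalers is mostly API translation."""
        return upscaler.evaluate(inputs)
    ```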

    Sony exclusive titles, when ported to PC, all come with DLSS support: FF XV, Horizon Zero Dawn, Death Stranding, Ghostwire: Tokyo, God of War, Spider-Man Remastered, Uncharted, Returnal, etc... I guess Sony understands who their target audience is.
    Meanwhile, Xbox titles can't even implement basic TAA properly (Forza Horizon 5, Halo Infinite), let alone advanced upscalers.

    HELLO THERE FANBOY...........................
     
    Last edited: Oct 13, 2022

  11. Legacy-ZA

    Legacy-ZA Master Guru

    Messages:
    271
    Likes Received:
    203
    GPU:
    ASUS RTX 3070Ti TUF
    DLSS, in my opinion, isn't getting any better, and if it's going to be tied to a generational hardware limit, screw that.
     
  12. Meathelix1

    Meathelix1 Master Guru

    Messages:
    200
    Likes Received:
    101
    GPU:
    10x 3090ti 24G
    Now some AAA games are using only FSR; Overwatch 2 only uses FSR, and I expect more to follow. Since all cards can use FSR, why build just for Nvidia? FSR is just going to keep getting better and getting used more than DLSS, due to it being open source.

    I love DLSS, as I do have an Nvidia card; however, a lot of games have now adopted FSR, and it looks just as good as DLSS.
     
  13. Krizby

    Krizby Ancient Guru

    Messages:
    3,111
    Likes Received:
    1,793
    GPU:
    Asus RTX 4090 TUF
    FSR1.0 is trash, now go away bot.
     
  14. Meathelix1

    Meathelix1 Master Guru

    Messages:
    200
    Likes Received:
    101
    GPU:
    10x 3090ti 24G
    Wow, so compelling; you're really making me think I should just take your word that it's trash. You should go tell that AAA company called Blizzard that they are trash, along with all the other ones adopting it over DLSS.

    You sure do smell like an Nvidia FANBOY. Did you buy your overpriced 4090 yet? Still plenty in stock. :D

    OMG, look how TRASH FSR 2.0 is... I hope you do know Nvidia just takes open-source projects, modifies them, and calls them their own, right? Most big companies do this; Amazon AWS is another big one that takes open-source projects and slaps its own label on them. Like the new AV1 codec Nvidia has for their 4000 cards - that's open source. Intel has it as well, and so will AMD's new cards...

    GET OUT OF HERE FANNYBOY

     
    Last edited: Oct 13, 2022
  15. Krizby

    Krizby Ancient Guru

    Messages:
    3,111
    Likes Received:
    1,793
    GPU:
    Asus RTX 4090 TUF
    Dear Mr. Trash Bot,
    FSR1.0 is not FSR2.0; learn the difference.
    OW2 has FSR1.0, which is just pure trash.

    Very talkative for a 1-month-old acc, eh bot?
     

  16. Meathelix1

    Meathelix1 Master Guru

    Messages:
    200
    Likes Received:
    101
    GPU:
    10x 3090ti 24G
    FSR 1.0 still looks great in a game called Grounded. Open-source tech getting as good as DLSS, what a surprise. FSR 2.0 will be as good as DLSS, and it's open source.

    DLSS 3.0 is magic bullshit, putting in fake frames to make it look like it's running at a higher fps... here comes the latency... BOT, BOT, BOT
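    (There is a real point buried in here: interpolation can't display the in-between frame until the next real frame exists, so real frames get held back. A back-of-the-envelope sketch, assuming a penalty of roughly half to one base-frame interval and ignoring the cost of generating the frame itself:)

    ```python
    def framegen_latency_penalty_ms(base_fps: float) -> tuple[float, float]:
        """Interpolation needs the *next* real frame before it can show the
        in-between one, delaying real frames by roughly half to one
        base-frame interval (generation cost ignored)."""
        frame_ms = 1000.0 / base_fps
        return frame_ms / 2, frame_ms

    for fps in (30, 60, 120):
        low, high = framegen_latency_penalty_ms(fps)
        print(f"{fps} fps base -> ~{low:.1f}-{high:.1f} ms added latency")
    # The lower the base framerate, the bigger the penalty -- presumably
    # why Reflex is bundled into DLSS 3 to claw some of it back.
    ```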
     
  17. NoLikes

    NoLikes Member Guru

    Messages:
    163
    Likes Received:
    58
    GPU:
    1080TI

    Nvidia should have just named them NGX DL / NGX SS / NGX FI / NGX RTX, etc., etc.
    Personally, I don't much like either the deep-learning resolve of DLSS 1 or the frame interpolation of DLSS 3.
     
  18. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,023
    Likes Received:
    4,400
    GPU:
    Asrock 7700XT
    What do you mean by "dedicate any resource to it"? rBAR is a rather passive feature; either a game benefits from it being on or it doesn't. Considering what it does, Nvidia is smart to be cautious about it since even though it should never deteriorate performance, there could be unseen stability issues with it, hence the whitelist. As for Intel, seems to me they enable it across the board and it makes a big difference for their GPUs.
    Right but rBAR isn't something Nvidia would have to reverse engineer. I would be shocked if Nvidia did not partake in the spec for it. AMD was just antsy to get it out first because they're desperate for wins.
    I highly doubt Nvidia reverse engineered anything for Reflex. The concept is simple enough that they could easily implement it themselves without copying AMD's homework.
    In most cases where one vendor tries to match the features of another, reverse engineering isn't a viable option, because:
    A. There isn't enough time to do that. We're not in the 90s anymore where drivers were only a couple MB and architectures were simple enough that you could probably map them out in a matter of weeks.
    B. The architectures are so drastically different that the competitor won't get much out of it. That's like finding an ancient recipe written in a long dead language using an ingredient that went extinct - it's not that you couldn't potentially do it, but you won't get the same results so there isn't much value in the effort.
    C. The competitor's architecture might not be well optimized. For example, Nvidia has dedicated RT cores - AMD wouldn't really benefit much from reverse engineering something they have no time or interest to implement themselves, which is why they didn't.
    Yeah I do find that a bit backwards haha. But traditionally, Sony makes more money from the games than from the system. Regardless of the exclusive tech, their PC releases seem to tap into more potential, so looks like Sony's approach is "if we're gonna have PC support, might as well make the most of it".
     
  19. Krizby

    Krizby Ancient Guru

    Messages:
    3,111
    Likes Received:
    1,793
    GPU:
    Asus RTX 4090 TUF
    rBAR does deteriorate performance by default, but that's offset in games that benefit from it. On an older platform like my 9900K, having rBAR enabled in the BIOS reduces performance in non-whitelisted games; overall, rBAR is a mess on older Intel CPUs + RTX 3000, not sure about RTX 4000. AMD said their Ryzen + RX 6000 combos are optimized for SAM; there's probably some truth to it. You can check out the HUB video about rBAR.

    There is no need for reverse engineering when you have a bunch of talented software developers; all Nvidia/AMD need are the ideas. You can see that AMD can match any Nvidia exclusive feature like Gsync/DLSS just fine. So yeah, the decision to make something open source is not made for the good of consumers; it's made for the good of their own company.

    I do believe Nvidia saw the Radeon Anti-Lag slide and thought to themselves that they could develop it further; Reflex came out a few months later (but first Nvidia had to lie about Radeon Anti-Lag being the same as NULL :rolleyes:)
    [Attached image: anti lag.jpg]

    Game companies would love for gamers to buy their games at full price, not months later at a 50% discount. The true target audience is gamers with high-end PCs who buy games at launch.
    Don't believe me? A Days Gone dev said so.

    So yeah, the niche market of high-end PCs is actually the money pot for game companies. If you think DLSS 2/3 is irrelevant because it's niche, you might want to think again.
     
  20. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,023
    Likes Received:
    4,400
    GPU:
    Asrock 7700XT
    Shows how little you actually listen. I've already stated DLSS isn't niche. I didn't say it was irrelevant either.

    Agonist may be a belligerent fanboy, but he's right to call you one if you think I said DLSS is irrelevant and niche.
     
