NVIDIA confirmed that five titles will feature DLSS 3.0 within the next week.

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 12, 2022.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    45,496
    Likes Received:
    12,477
    GPU:
    AMD | NVIDIA
    Why_Me likes this.
  2. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    14,122
    Likes Received:
    7,328
    GPU:
    2080Ti @h2o
One thing that's interesting, just as an aside, is how Nvidia successfully captured what made AMD the pick a few years ago: the idea that you're buying something that "gets better in the long run, once drivers mature", and so on.
These days, it's actually DLSS that "keeps on getting better" with more games and more time.

Or that's the impression their marketing leaves on me, anyway, since DLSS 3 isn't available in a single game I'd actually like to play right now.
     
    Why_Me likes this.
  3. nizzen

    nizzen Ancient Guru

    Messages:
    2,271
    Likes Received:
    1,015
    GPU:
    3x3090/3060ti/2080t
    Do you even play games anymore? That's the question :p
     
  4. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    14,122
    Likes Received:
    7,328
    GPU:
    2080Ti @h2o
    Super People - yet another battle royale shooter
    Justice ‘Fuyun Court’ - a graphics demo...
    Loopmancer - a platformer game
    Flight simulator - what it says, nomen est omen
    F1 - racing game
    A Plague Tale: Requiem - story telling game

In this particular list of games that have DLSS 3, I'm pretty sure I'm only interested, if at all, in A Plague Tale's second game. I still need to play the first, but yeah, I'm not the target group for racing and simulation games, platformers, or the 16th battle royale game, and Fuyun Court isn't even a real game :D
     
    Solfaur and bobnewels like this.

  5. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,285
    Likes Received:
    2,141
    GPU:
    Sapphire 390
Fortunately, game studios don't need to hurry with DLSS 3.0. It's not like the 4090 would need it with all the horsepower it's got.
     
    Why_Me and moo100times like this.
  6. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    14,122
    Likes Received:
    7,328
    GPU:
    2080Ti @h2o
True, but Nvidia's portfolio is bigger than just the 4090, and I'm fairly sure they want to sell some 4060 cards down the road as well. Entry-level / midrange cards like those do make good use of DLSS, imho.
     
    Why_Me and Airbud like this.
  7. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,836
    Likes Received:
    438
    GPU:
    RTX3080ti Founders
    They need to fix the stuttering in F1 22 first before adding more stuff.
     
  8. Espionage724

    Espionage724 Master Guru

    Messages:
    966
    Likes Received:
    418
    GPU:
    XFX RX 6600 XT
Does DLSS work with VR games? It seems like that would be notably useful for MSFS.
     
    Why_Me likes this.
  9. loracle

    loracle Master Guru

    Messages:
    356
    Likes Received:
    155
    GPU:
    GTX 1660 super
Nvidia, just keep your bullshit, and start making strong cards and drivers like in the old days.
     
  10. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,143
    Likes Received:
    3,589
    GPU:
    HIS R9 290
Off to a bit of a rough start with DLSS 3. In general, it doesn't matter what a technology's potential is if it requires developers to put in extra work to implement it, especially when it only applies to a single vendor's hardware. I'd say the only truly successful Nvidia-specific technology was CUDA, and that's because they made it sooo much better than the alternative that it became the only obvious choice for most developers. So long as Nvidia locks technologies to their platform, we're never going to see a lot of use cases for them.
     

  11. Krizby

    Krizby Ancient Guru

    Messages:
    1,932
    Likes Received:
    844
    GPU:
    Asus RTX 4090 TUF
    You won't, but the rest do.
     
    Why_Me likes this.
  12. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,143
    Likes Received:
    3,589
    GPU:
    HIS R9 290
"The rest" meaning who, and based on what evidence? Look at all of Nvidia's exclusive technologies and tell me how many of them found wide adoption. Again, other than CUDA, I can't think of a single one that was implemented by the majority of the target audience. GPU-accelerated PhysX, OptiX, G-Sync, DLSS, SLI, etc. - they all barely made a dent. Obviously it's even worse for AMD, which is probably why they gave up trying to make their own exclusive features after TressFX, I think.
     
  13. Krizby

    Krizby Ancient Guru

    Messages:
    1,932
    Likes Received:
    844
    GPU:
    Asus RTX 4090 TUF
LOL, every time you want to downplay something new, you just call it "niche". It gets tiring, you know.

Well, whatever. Any new tech Nvidia introduces will see more adoption than the competitors', and that's all that matters. I couldn't care less what the majority of people in the world own or do anyway.
     
  14. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    8,467
    Likes Received:
    2,476
    GPU:
    Nvidia RTX 4070 Ti
    What are you talking about, drivers are rock solid.

    Maybe your OS needs a fresh install.
     
    Netherwind and Why_Me like this.
  15. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,143
    Likes Received:
    3,589
    GPU:
    HIS R9 290
Every time anything slightly negative is said about Nvidia, you get butthurt. It gets tiring, y'know.
These features, including DLSS 3.0, are not niche. To be niche means appealing to a narrow market. Nvidia's technologies are desirable even to the haters, and most of them would improve most games. The problem is how they're implemented, and in turn, that they aren't implemented. Each of the technologies I mentioned yields great results or is a gamechanger (literally), but that doesn't matter if only a select few titles implement them. Time is money, and game developers see more profit in optimizing a game for more platforms than in implementing vendor-exclusive features.
I really can't comprehend your logic there: in what way does it matter that their adoption rate is higher than the competitors' if it's still crappy? And you say that as though the competitors should see less adoption for their technologies, as if Nvidia monopolizing desirable features were an ideal world to you.
In any case, it does matter, because you're paying extra for these features and they distract from further optimizing the drivers. Nvidia could easily knock AMD out of the market if they lowered their prices by spending less time and money on exclusive features, especially ones that could take a more open approach and/or don't require developer/engineer intervention. A bit ironic, of course, because in such a world those features would be accessible to pretty much everyone.
     
    Denial likes this.

  16. Krizby

    Krizby Ancient Guru

    Messages:
    1,932
    Likes Received:
    844
    GPU:
    Asus RTX 4090 TUF
Let me ask you then: MS owns the majority of OS market share, yet only 25% of users have moved to Windows 11. How could Nvidia introduce a new technology and have everyone adopt it in a short time? You're asking for the impossible and acting disappointed when it doesn't happen.

Heck, even DX12 is still a niche API, and it came out 7 years ago. Vulkan is even more niche, despite being the "superior" API.

You don't understand what you're talking about. Nvidia doesn't want to kill off RTG, because that would put Nvidia under US antitrust scrutiny; instead, Nvidia and RTG will maintain a duopoly with healthy profit margins for both.

What you're advocating for is a monopoly, and monopoly kills off any incentive for technological advancement.
     
    Last edited: Oct 12, 2022
  17. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,143
    Likes Received:
    3,589
    GPU:
    HIS R9 290
Your logic is getting increasingly broken, because the two can't be compared... at all.
Games will run just fine on either W10 or W11. The vast majority of games will even work on W7, just maybe not to their full potential. Pretty much the only reason Windows has a large userbase is compatibility: of software, hardware, and features. MS actually deliberately breaks some of this compatibility by making certain features available only in newer versions of Windows. This is very different from what Nvidia is doing:
Unlike MS, Nvidia implements features people actually want. While some of Nvidia's features could be backported (and sometimes are), many can't be, because the hardware literally can't do it. I think the GTX 1000 series getting DXR support was Nvidia trying to prove a point about this.

    New technologies can be rather quickly adopted and Nvidia themselves have succeeded in that. The way to do that is to either:
    A. Make it so compelling and easy-to-implement that it becomes the obvious or perhaps the only viable option (like with CUDA).
    B. Integrate it with a widely used platform that other competitors can still use, if they can figure out how (like with DXR).
C. Make it either open source or an open standard (like with Freesync*).
    D. Make it so existing software can use the feature without further modifications or updates (like with FSR).
    And depending on what the thing is, you need more than one of these.
Nvidia tends to make their technologies proprietary, closed-source, or heavily dependent on their drivers/hardware. AMD tends to undermine Nvidia's coolest technologies with something that doesn't yield results as good but appeals to a wider market (whether by being platform-agnostic, royalty-free, or not requiring developers to do more work).

    * Adaptive sync monitors are a niche product, but Freesync was quickly adopted since it was compatible among different platforms and was royalty-free.

    TL;DR:
It's not impossible, but Nvidia can't have their cake and eat it too. I get it: they put a lot of R&D into their features, and they do seem to legitimately care about improving the user experience, so they deserve credit for that. But all they have to do is design their architecture to favor whatever the new feature is. That's often how the CPU market works.
DX12 isn't niche anymore. It took a while to be adopted because a lot of people refused to use W10, and DX12 was deliberately made incompatible with W7.
Vulkan is niche because, from what I understand, it's harder to implement (but still easier than OpenGL). Meanwhile, Apple used their own API (Metal), and it took a while for mobile chips to support Vulkan. Since DX12 is easier and applies to a much wider audience, Vulkan just didn't have much of a chance to grow.
    *sigh* I know that... My point is Nvidia could do that.
     
  18. Krizby

    Krizby Ancient Guru

    Messages:
    1,932
    Likes Received:
    844
    GPU:
    Asus RTX 4090 TUF
Nvidia wants to maintain premium brand recognition, which means their products are guaranteed to work as intended. G-Sync and G-Sync Compatible monitors are guaranteed to work with Nvidia GPUs, while Freesync branding on a monitor means nothing (since AMD doesn't do any testing).

For example, LG OLED TVs have Freesync VRR branding, yet VRR doesn't work on AMD RX 6000 cards (another thread here on guru3d). What a joke.

Let's say DLSS 3 could be made to work on current hardware, but either there's no benefit (no FPS gain) or the results are so bad they would tarnish the Nvidia brand. Why would Nvidia ever allow that?
     
    Last edited: Oct 12, 2022
  19. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,143
    Likes Received:
    3,589
    GPU:
    HIS R9 290
Except Nvidia didn't have to make G-Sync for that scenario to happen; at least, it didn't need special hardware. AMD does a poor job of guaranteeing results for FreeSync, but nothing prevents them from doing so other than laziness/cheapness/negligence. Nvidia could use generic VESA AdaptiveSync (with no stupid extra hardware) and still be the more appealing option, by running a rigorous certification program so consumers know with confidence which displays will work correctly with Nvidia products. It might cost a little extra, but at least it isn't vendor-locked. The evidence for this is that there are FreeSync displays that do have good results.
And that's where Nvidia can have their cake and eat it too: they have the means to push for a technology, make it an open or royalty-free standard, and make it easily adoptable, yet still be the best choice. Generally speaking, you can take away all the extra features and Nvidia still has the better overall platform, at least on release day. For years, they've been the preferred choice not because of all the seldom-used features, but because they offered a more stable and performant product. There is nothing preventing Nvidia from continuing to do this with other technologies.

I think Nvidia approached DXR perfectly: it's a cool technology that they managed to pioneer, and they didn't vendor-lock it. They didn't need to, because their implementation was so much better that it became the obvious choice. Letting a technology be platform-agnostic works in Nvidia's favor, because it increases the chances of it being adopted. When something becomes widely adopted but one vendor handles it consistently better, that makes their product look more appealing.
    In other words, which do you think is the better situation to be in:
    A. Having a list of exclusive but seldom-used features.
    B. Having a list of widely-used features but one vendor is a lot better at implementing them.
Ironically, the reality is the opposite situation, where things like the frame-generation feature make outdated products remain appealing a little while longer, thereby holding back new sales. But for the sake of argument, let's go with your example: as I previously mentioned, Nvidia already did this by allowing raytracing support on the GTX 1000 series. It performs so badly that you might as well turn it off. It didn't tarnish the brand at all*; it got people to realize that the RT cores aren't a gimmick and that there is a reason to upgrade.

* People hellbent on hating Nvidia would say it tarnished the brand, but such people also think raytracing is pointless. They're also the same people who exaggerate how much fidelity DLSS 2.0 loses. They don't count.
     
    Last edited: Oct 12, 2022
    Krizby likes this.
  20. Espionage724

    Espionage724 Master Guru

    Messages:
    966
    Likes Received:
    418
    GPU:
    XFX RX 6600 XT
    https://www.pcgamer.com/dlss-3-on-older-GPUs/

It sounds like DLSS 3 will work on older GPUs, but without the frame-interpolation feature?
     
    Why_Me likes this.
