Nvidia DLSS 3 Only Works With GeForce RTX 40-Series GPUs

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 22, 2022.

  1. Embra

    Embra Ancient Guru

    Messages:
    1,601
    Likes Received:
    956
    GPU:
    Red Devil 6950 XT
    This feature will separate the 4xxx series a bit from the 2xxx and 3xxx series, on top of the better raw performance.

    Sort of sucks for the 3xxx series.
     
  2. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,424
    Likes Received:
    1,150
    GPU:
    RTX 3070
    Several YouTube tech channels have stated that the 3000 series cards do indeed have the hardware required to run DLSS 3; however, Nvidia has said the 3000 series hardware is just slower at the task by some amount than the 4000 series (there's a dedicated hardware engine of some kind that it uses for that specific task, iirc).

    I'm speculating of course, what do I know, but yeah -- it seems to me that Nvidia probably "could" get DLSS 3 running on at least the 3000 series cards (albeit at a slight to moderately higher runtime cost), but they have a very bad/anti-consumer habit of just screwing over last-gen cards. For example, even though there's no real reason Smart Access Memory (ReBAR) couldn't work on the 2000 series GPUs, they just didn't bother.

    That kind of thing is a real blow imo, and this sort of thing will definitely push a lot of consumers to AMD, I'd argue, as they haven't been burning their customers like this in recent times (you can also get a reasonable amount of VRAM on the AMD side without spending ludicrous amounts of cash -- VRAM skimping has been Nvidia's flaw for a while now imo). Obviously DLSS 3 sounds amazing on paper, but imo they should support it on the 3000 series at least, if at all possible, with the caveat that you may gain somewhat less performance from it than on the 4000 series.

    Digital Foundry/Alex actually talked about this sort of thing some time back, in terms of giving people the option to run reconstruction tech on shader cores when possible, even if the gained perf is lower, iirc (I'm butchering what he said, and obviously he was referring to other reconstruction techniques, but it seems like they "could" run DLSS 3 on the 3k series with smaller performance gains -- it would just take effort, and they're using it as marketing for their new overpriced cards).

    I could be wrong though *shrug*
     
  3. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,424
    Likes Received:
    1,150
    GPU:
    RTX 3070
    I was under the impression most people on this forum were working adults -- this forum seems to skew older than, say, Reddit.

    From a business point of view I understand why Nvidia does what they do -- they're the market leader in terms of performance, so, like Intel with CPUs when they were on top, Nvidia can get away with a lot. But if they push too hard they will lose more customers to AMD -- they're probably fine with that, and I expect Nvidia will sell out of these cards anyway.

    Personally, if Nvidia keeps pulling these kinds of stunts where they introduce new tech every generation that isn't compatible with even one generation back, I will move to AMD, whose focus has been on technologies that everyone can use. It's just more cost-efficient for most people.

    I don't think there's anything at all wrong with criticizing a company for doing something anti-consumer or for not enabling a given feature when they technically could. It's a kick in the teeth for existing customers, and if you do it too much they will not buy your GPUs anymore; they'll move to a manufacturer that doesn't bite them in the rear so much.

    I don't think it's reasonable to say people want something for nothing in this case -- people paid for their new 3000 series GPU within the last year, most likely, and a GPU purchase like that comes, reasonably so I'd argue, with certain expectations of Nvidia for feature support (these GPUs are not cheap -- they typically cost more than an entire console). This is something of a unique case too, I'd argue, as several YT PC outlets have stated that the hardware engine required for DLSS 3 is technically present in the 3000 series cards; it's just somewhat slower than the one in the 4000 series GPUs. I'm speculating of course and could be wrong, but my bet is that they probably "could" get the feature working on those cards (albeit without as much of a perf uplift as the 4k series) but won't bother with the effort, as it would take some portion of resources and they want the advertising bullet for the 4k series, as you say. Still sucks if you own a 3k card though; nothing wrong with stating that imo.
     
  4. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,424
    Likes Received:
    1,150
    GPU:
    RTX 3070
    In my experience playing at 1440p, FSR (even 2.0) has noticeable quality loss compared to native resolution + TAA (though of course that tech is fairly new and I appreciate that it's hardware agnostic).

    Early DLSS looked weird and painterly in games like BFV, and it looked really "soft" until it was patched imo. The "hand-tuned" 1.9 version of DLSS in Control was very interesting, but it broke down a lot in motion and I didn't care for it very much. Lots of noticeable artifacts around hair, fan blades, and when moving the camera, etc.

    DLSS 2.0 was when it really started to get good to my perception, and the Quality mode with 1440p output looked pretty solid in games like Cyberpunk to my eye (though there were still some issues like ghosting trails and weird softness from time to time). In DOOM Eternal, though, I preferred native + TAA, as there was just something weird about how DLSS handled detailed surfaces in motion and the game came out looking sort of "soft" to my eye -- not bad, mind you, but I thought it looked noticeably worse than native.

    The latest iteration of DLSS (2.3 or 2.4? can't recall) looks amazing with the right sharpening settings to my eye; it even holds up pretty well in motion in the few games I've tested it on. This is all assuming Quality mode, however -- turn it lower and native + TAA starts looking better again to me. I wouldn't say DLSS 2.4 is "better" than native + TAA, but considering the performance uplift, for me it's now "worth it" -- assuming I'm GPU bound, that is (in some games, like Spider-Man, I'm more CPU bound).
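    For context on why Quality mode holds up so much better than the lower modes at 1440p, here's a quick sketch of the approximate internal render resolutions involved; the per-axis scale factors below are the commonly cited figures and should be treated as approximations rather than anything official:

    ```python
    # Rough internal render resolutions behind the DLSS 2 modes at a 2560x1440
    # output. The per-axis scale factors are commonly cited approximations,
    # not official specs.
    SCALE = {
        "Quality": 2 / 3,
        "Balanced": 0.58,
        "Performance": 0.5,
        "Ultra Performance": 1 / 3,
    }

    output_w, output_h = 2560, 1440
    for mode, s in SCALE.items():
        w, h = round(output_w * s), round(output_h * s)
        print(f"{mode:>17}: {w}x{h} internal -> {output_w}x{output_h} output")
    ```

    Quality at 1440p works out to roughly 1707x960, so the reconstruction simply has a lot more real detail to work with than Performance mode's 1280x720.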
     

  5. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,597
    Likes Received:
    1,929
    GPU:
    7800 XT Hellhound
    The FSR 2.1 mod looks better than native TAA in e.g. Doom Eternal or Dying Light 2, and the native 2.0 implementation also looks better than the old UE TAA in Wonderlands.
    But yes, there is still room for improvement with ghosting in some situations, particle effects, or disocclusion.
     
    BlindBison likes this.
  6. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    8,793
    Likes Received:
    10,918
    GPU:
    RX 6800 XT
  7. Spets

    Spets Guest

    Messages:
    3,500
    Likes Received:
    670
    GPU:
    RTX 4090
    I hope there's a way to distinguish between normal frames and interpolated frames; the render latency would be at whatever the normal frames are with DLSS 2.
    I'm more concerned by the strange interpolation motion -- I haven't seen a good implementation yet, but that's something that will have to be tested hands-on.

    I'm sure BlurBusters and Battle(non)sense will be testing that in depth.
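    As a rough illustration of that normal-vs-interpolated distinction, here's a toy calculation of presented frame rate versus added latency, under my own assumption that the interpolator has to hold back one rendered frame before it can generate the in-between one (an assumption, not a confirmed figure):

    ```python
    # Toy model: frame generation doubles the presented frames, but input latency
    # still tracks the rendered frames, plus whatever the interpolator buffers.
    def frame_gen_estimate(rendered_fps: float, held_frames: int = 1):
        """Return (presented_fps, added_latency_ms) for a simple interpolator."""
        frame_time_ms = 1000.0 / rendered_fps
        presented_fps = rendered_fps * 2                 # one generated frame per real frame
        added_latency_ms = held_frames * frame_time_ms   # delay from buffering the next real frame
        return presented_fps, added_latency_ms

    for fps in (40, 60, 90):
        presented, added = frame_gen_estimate(fps)
        print(f"{fps} rendered fps -> ~{presented:.0f} presented fps, "
              f"~{added:.1f} ms extra latency on top of the usual pipeline")
    ```

    The point being that the fps counter and the latency you feel can diverge, which is presumably exactly what BlurBusters-style testing would pick up.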
     
  8. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,228
    Likes Received:
    1,546
    GPU:
    NVIDIA RTX 4080 FE
    Most modern TVs, even cheap ones, do motion interpolation, so it is really nothing new and doesn't seem to require that much processing power, considering how limited the CPUs and graphics hardware in TVs tend to be.

    I suspect that this DLSS 3 feature is the excuse NVIDIA needed to increase the price of their cards. Boasting of 3X-4X increases in performance sounds impressive on paper, but if most of that comes from image upscaling and "fake" doubling of the framerate, then in my view it isn't really impressive at all. DLSS 2 already has obvious compromises to visual quality when upscaling, in the form of visual shimmering on fine details and so on, so I can only imagine what DLSS 3 looks like with those issues plus motion interpolation artefacts as well. These things are great for consoles in my opinion, where you typically sit much further away from the TV and are therefore less likely to notice the glitches, but I would wager that most PC gamers sit directly in front of a monitor, so the issues are far more obvious... and, in my experience, can be distracting/annoying.
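    To unpack where a headline 3X-4X number can come from, here's an illustrative multiplication with made-up but plausible factors (the specific gains below are my assumptions, not NVIDIA's figures):

    ```python
    # Illustrative decomposition of a headline "3-4x" claim: the gain compounds
    # from upscaling (fewer pixels actually rendered) and frame generation
    # (presented frames roughly doubled). All factors here are assumptions.
    native_fps = 30.0
    upscaling_gain = 1.8   # from rendering at a lower internal resolution
    frame_gen_gain = 1.9   # interpolation rarely yields a perfect 2x

    upscaled_fps = native_fps * upscaling_gain
    presented_fps = upscaled_fps * frame_gen_gain

    print(f"native:                {native_fps:.0f} fps")
    print(f"with upscaling:        {upscaled_fps:.0f} fps (real rendered frames)")
    print(f"with frame generation: {presented_fps:.0f} fps presented "
          f"(~{presented_fps / native_fps:.1f}x native)")
    ```

    Only the middle number reflects frames the game actually simulated, which is the distinction being made above.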
     
    Maddness and eGGroLLiO like this.
  9. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,053
    Likes Received:
    4,437
    GPU:
    Asrock 7700XT
    That's true, but the frame interpolation TVs do is hot garbage and (like hot garbage) usually really distracting. Seems to me what Nvidia has done is basically just crumpled-up paper - a sign of something on the right track but maybe not quite good enough. Since the GPU itself is producing the image, it can more intelligently figure out how to apply the sub-frames, and hopefully you can dial in how "aggressive" it is.
    I agree, though typically I've found these AI-based enhancers to only be distracting if you look where they weren't optimized. At least from what I've seen, Nvidia's frame interpolation is only distractingly bad if you look at it frame by frame, but at that point you're not really playing the game. When you just let it do its thing, it's "okay". I would be fine with the occasional distortion if it meant I could have a consistent 60 FPS experience. If you look at hand-drawn animation, you'd be surprised how weird everything can look between frames, but when fully animated, you either don't catch such details or you kind of get used to them.
    It's the same idea as playing games on an old CRT TV - it's unbearable at first but after about a half hour, your brain sorta adjusts to it and suddenly it's not so awful to look at anymore.
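    As a toy illustration of why GPU-side interpolation has more to work with than a TV, here's the difference between blindly averaging two frames and shifting along a known motion vector; a real interpolator (optical flow, occlusion handling, etc.) is far more involved, and the tiny arrays below are just stand-ins:

    ```python
    import numpy as np

    # Two tiny one-row "frames": a bright pixel that moves two columns to the right.
    frame_a = np.array([255, 0, 0, 0, 0], dtype=np.float32)
    frame_b = np.array([0, 0, 255, 0, 0], dtype=np.float32)

    # TV-style blind blend: with no idea where anything moved, the midpoint frame
    # is just an average, so the moving pixel shows up as two half-bright ghosts.
    blend = (frame_a + frame_b) / 2

    # Motion-aware midpoint: if the motion vector (+2 columns) is known, the
    # in-between frame can place the pixel halfway along its path at full
    # brightness instead of double-exposing it.
    motion_px = 2
    mid = np.roll(frame_a, motion_px // 2)

    print("blind blend:   ", blend.astype(int))  # [127   0 127   0   0]
    print("motion warped: ", mid.astype(int))    # [  0 255   0   0   0]
    ```

    The game engine already has per-pixel motion vectors and depth, which is presumably why in-engine interpolation can do noticeably better than a TV's guesswork, even before you get to any dedicated hardware.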
     
  10. Denial

    Denial Ancient Guru

    Messages:
    14,215
    Likes Received:
    4,132
    GPU:
    EVGA RTX 3080
    Idk - yeah, obviously value-add features are going to be used to increase prices, but there are other reasons that factor in:

    R&D cost for the G80 architecture (8800 GTX) was ~$450M over 4 years. The R&D cost for a modern card is easily in the billions. Nvidia is spending close to $5B per year on R&D costs now.

    The price of manufacturing is also increasing exponentially.


    Then you have general inflation, tariffs, the ban on chip exports to China, etc. This is all going to be passed on to the customer in some form.

    So you'd have to decouple the "justifiable" increases in price from Nvidia's value-add features and mining influences, and I'm not sure how you do that. I know the prices are really high and I think the 4080 is insanely priced, but I also think expecting a 4080 16GB to be $600-650 is probably equally far-fetched. There exists some price point that's probably fair to both customers and the companies. $1200 isn't it lol.

    __

    And I said this in the other thread, but the reality is that technologies like DLSS, AI-based image reconstruction, and interpolation are probably the future whether you like it or not. Physics is limiting manufacturing. There's no way to just brute-force performance increases anymore without massively increasing power requirements. You need to get fancy with the software, and that's the route every company is going. The best we can hope for is that Nvidia/AMD/Intel and whoever else start solving the issues with these technologies - I don't think we as gamers should be pushing against them.
     

  11. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,053
    Likes Received:
    4,437
    GPU:
    Asrock 7700XT
    While R&D has grown exponentially, their net income has also grown quite substantially. So, while a 4080 16GB at $650 is far-fetched, $750 would be totally justified given Nvidia's pricing history.
     
  12. Denial

    Denial Ancient Guru

    Messages:
    14,215
    Likes Received:
    4,132
    GPU:
    EVGA RTX 3080
    I agree -- I think $750-800 is a fair price for the card; $1200 is absurd.
     
  13. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,752
    Likes Received:
    1,870
    GPU:
    EVGA 1070Ti Black
    It will sell at those absurd prices nonetheless, because people, brains, and sense don't combine well. I'm more likely to RMA my 450 GTS back to EVGA (it's running way too hot at idle), get a new card even if it's a 3050, and use that till it dies, because prices are absurd.

    DLSS/FSR is slowly becoming a crutch for the industry, which I don't like. It's great if you're running, say, a 4K monitor and can't actually "push" 4K at your target of 60 fps, but native-res performance seems to be taking a back seat to DLSS, which I'm not a fan of. Useful, yes, but it cannot and will not ever replace native. There have been more advances in DLSS performance over the last three generations than in actual native performance.

    Don't get me wrong, I like the idea of DLSS for certain use cases, but Nvidia seems more concerned with DLSS advances at this point.
     
    Last edited: Sep 24, 2022
  14. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,115
    Likes Received:
    2,612
    GPU:
    3080TI iChill Black

    Yeah, it's that new RT thread reordering module; it speeds it up a lot.

     
    BlindBison likes this.
