NVIDIA keynote and GeForce RTX 40 GPU / DLSS 3.0 announcements

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 20, 2022.

  1. H83

    H83 Ancient Guru

    Messages:
    5,551
    Likes Received:
    3,065
    GPU:
    XFX Black 6950XT
    I expect them to be fast in every situation, not just on RT.

    Otherwise, this series is going to be a major disappointment, because the 3000 series was released two years ago!...

    I agree.

    The 4090 is the only justifiable purchase at this point. It's expensive as hell, but it's going to perform like a beast, and the 3090 Ti isn't much cheaper.

    The 4080s are a joke.

    The 16GB version is simply too expensive; much better to spend a "little" more and get the 4090.

    As for the 4070 pretending to be a 4080, what can I say that others have not said already... This is a $499 card going for almost double the price...
     
  2. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,597
    Likes Received:
    1,929
    GPU:
    7800 XT Hellhound
    AMD, give us open-source Reflex now. You chose to underestimate it for two years, and now it's part of DLSS 3. Fail.
     
  3. pegasus1

    pegasus1 Ancient Guru

    Messages:
    5,293
    Likes Received:
    3,715
    GPU:
    TUF 4090
    I'll be interested to see how the 4090 stacks up against the 3090 Ti and 6900 XT in non-RT, then I'll make my decision.
    For me, the price isn't a red line (relative to performance); it's more about the jump in performance.
     
  4. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,666
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
    I think an 8-pin can do more than 150W, and I think the PCIe slot can do more than 75W.

    I have a 2 x 8-pin 3090, which should mean 375W, but I've seen this card pull 420W, so either GPU-Z is wrong or it can pull more if needed.
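    As a quick sanity check on that arithmetic, here is a minimal sketch (my own illustration, not from the post): it uses the commonly cited spec limits of 75W from the slot and 150W per 8-pin connector, plus the 420W GPU-Z reading mentioned above.

```python
# Nominal PCIe power budget vs. an observed reading (illustrative numbers).
PCIE_SLOT_W = 75     # max the x16 slot is specified to deliver
EIGHT_PIN_W = 150    # max per 8-pin PCIe connector per spec

def nominal_budget_w(num_8pin: int) -> int:
    """Slot power plus the spec limit of each 8-pin connector."""
    return PCIE_SLOT_W + num_8pin * EIGHT_PIN_W

budget = nominal_budget_w(2)   # a 2 x 8-pin card like the 3090 above
observed = 420                 # the GPU-Z reading from the post

print(f"spec budget: {budget} W, observed: {observed} W, delta: {observed - budget} W")
# -> spec budget: 375 W, observed: 420 W, delta: 45 W
# So either the reading is off, or power is being drawn above the spec limits.
```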
     

  5. XenthorX

    XenthorX Ancient Guru

    Messages:
    5,074
    Likes Received:
    3,459
    GPU:
    MSI 4090 Suprim X
    I'm a bit afraid of the prices. The 3090 was announced with a $1,200 MSRP and I ended up paying 1,700 euros; if they announce $1,500 for the 4090... well...
     
    Embra likes this.
  6. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
    NVIDIA stated that the 4090 comes with a 4 x 8-pin adaptor if you do not have an ATX 3.0 / PCIe 5.0 PSU with the new GPU power plug.
     
  7. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,666
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
    So for the 4090 that's 675W. Holy crap.

    I also think Nvidia has gone all in on this new DLSS 3.0 tech. Well, at least it's not a DX13-style lock-out; I think 3000 owners will get DLSS 3.0, just not the Optical Flow Accelerator part.
     
  8. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,068
    Likes Received:
    7,397
    GPU:
    GTX 1080ti
    "Can" however both are only rated for 150 and 66 on 12v, higher is out of spec and risks failing pcie validation.
     
  9. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE

    Portal has broken DX9 even for Nvidia. That is why it has Vulkan support these days instead of DX9.
    Video decoding green screens are also an Nvidia issue, but I don't see you mentioning that one stopping people from working. (Have you looked at the NV forums?)

    And I could list the rest, but I'm not bothering.

    I will say only this: ONLY NVIDIA cripples a 3-month-old GPU with drivers to reduce performance so it can sell the same GPU with better memory.
    This is what happened with the GTX 1080 back in October 2016, when it was preparing the GTX 1080 11Gbps that launched alongside the 1080 Ti.

    I know that because I had the card back then and was involved in heavy overclocking (through the custom curve) and benching, and I saw a drop in bench numbers with all drivers after September 2016. Rolling back to the August drivers, the performance came back.

    Or how many FreeSync monitors like the XL2730Z couldn't run at 144Hz, only 100Hz, with Nvidia cards until 2020. It took Nvidia 5 years to fix this. The thread on the official forums was hundreds of pages long.
     
  10. XenthorX

    XenthorX Ancient Guru

    Messages:
    5,074
    Likes Received:
    3,459
    GPU:
    MSI 4090 Suprim X
    The only thing they would get out of this is degrading the "DLSS 3.0" brand with underwhelming quality on RTX 30xx and below.
     

  11. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,666
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
    Nvidia will update 2 of the 3 components of DLSS and call it DLSS 2.5 or something for 3000 owners. That will leave the 3.0 branding for the 4000 series.
     
    pegasus1 likes this.
  12. bballfreak6

    bballfreak6 Ancient Guru

    Messages:
    1,907
    Likes Received:
    470
    GPU:
    MSI RTX 4080
    But that IS the point; they want to confuse the average consumer into thinking "oh, a $100 increase in MSRP for upgrading from the 3080 12GB to the 4080 12GB is pretty reasonable", when in reality the 4080 12GB is an xx70-tier card and upgrading to the true successor of the 3080 is now an extra $400. Scummy marketing move.
     
    Last edited: Sep 21, 2022
  13. Crazy Serb

    Crazy Serb Master Guru

    Messages:
    270
    Likes Received:
    69
    GPU:
    270X Hawk 1200-1302
    I don't think Reflex will help that much if the real framerate is 30 fps, especially when we consider that DLSS (and FSR, obviously) so far just combine older tech in one package (TAA + upscaling + sharpening) and inherited all of its issues.
    While interpolation works, it is hard to expect that games will be playable, due to input lag. It would require a massive overhaul of the render pipeline (if that is even possible, considering the image needs to be rendered first) to make it feel like the displayed fps...
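    For a rough feel of why generated frames don't improve responsiveness, here is a back-of-the-envelope sketch (my own illustration; the "one extra real frame of delay" is a simplifying assumption, not a measured figure):

```python
# Displayed frame rate vs. input latency when a generated frame is inserted
# between two real frames. Numbers are illustrative, not measurements.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

real_fps = 30.0                          # what the GPU actually renders
render_ms = frame_time_ms(real_fps)      # ~33.3 ms per real frame

displayed_fps = real_fps * 2             # one generated frame per real frame

# To interpolate between frame N and N+1, frame N must be held back until
# N+1 exists, so the chain gains roughly one real frame of delay.
latency_without_ms = render_ms
latency_with_ms = render_ms + render_ms

print(f"displayed ~{displayed_fps:.0f} fps, latency ~{latency_with_ms:.0f} ms "
      f"vs ~{latency_without_ms:.0f} ms without interpolation")
# -> displayed ~60 fps, latency ~67 ms vs ~33 ms without interpolation
# It looks smoother, but it still *feels* like 30 fps (or slightly worse),
# which is the concern raised in the post above.
```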
     
  14. user1

    user1 Ancient Guru

    Messages:
    2,816
    Likes Received:
    1,325
    GPU:
    Mi25/IGP

    Nvidia can name their small dies p*$$yslayer 4080 BFG s*ck my nuts edition and charge $$$$ all they want, and it won't be confusing as long as the performance is as described; what matters is that people get what they are expecting from the product they are buying. Having 4GB of VRAM as the only immediately obvious difference is misleading when the 16GB card has 27% more GPU compute units.

    Case in point: the Kepler cards. They released the GK104 die (which is the successor to the GF104, aka the 460) as the 680 instead of the GK110.

    No one was tricked into buying a GK104; the product was as described. This whole 16GB/12GB fiasco can genuinely cause people to purchase a product that is different from what they expect, and that is far more scummy than hiking prices. It's basically false advertising.
     
    Fediuld likes this.
  15. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,597
    Likes Received:
    1,929
    GPU:
    7800 XT Hellhound
    Well, the point is that it always helps and can still help a lot even at 100 fps (very noticeable in e.g. Overwatch). You tick one checkbox and you have the lowest lag guaranteed at both capped and uncapped fps, including staying in the VRR range without additional lag when combined with vsync. I'd miss it dearly, and it seems many more games will get it via DLSS 3. AMD makes it way too easy for Nvidia...
     

  16. bballfreak6

    bballfreak6 Ancient Guru

    Messages:
    1,907
    Likes Received:
    470
    GPU:
    MSI RTX 4080
    So... we both agree then that NV did this intentionally to deceive the average consumer, and it's not confusing why they did this? Or do you mean it's confusing for the consumers?
     
  17. user1

    user1 Ancient Guru

    Messages:
    2,816
    Likes Received:
    1,325
    GPU:
    Mi25/IGP
    We both agree that they did it intentionally; however, simply marketing a smaller die as a 4080 and raising the prices isn't itself deceptive or confusing. Whether it was 4080 Ti (16GB) / 4080 (12GB) or 4080 (16GB) / 4070 Ti (12GB), there would be no issue for the consumer, since either way each product is clearly distinguishable.

    What Nvidia has done by naming them the same thing is open the door to conflation and confusion. In the past, Nvidia has gone out of their way to avoid this, due to backlash on older products.

    The question I have is why their marketing team would intentionally make such an obvious and blatant mistake. You can't really find a good answer, since adding a "Ti" to the upper model would instantly remedy this issue without impacting their plans for the RTX 4070 series. You could say that Nvidia intends to release a better Ti model later, which would take up that name slot, but Nvidia has made Super cards in the past, so clearly that isn't a good enough reason to give the 16GB and 12GB cards the same name. It's very strange; one can only imagine what kind of glue-sniffing sessions are going on internally at the Nvidia marketing department.
     
    Last edited: Sep 22, 2022
  18. Chert

    Chert Member Guru

    Messages:
    142
    Likes Received:
    44
    GPU:
    Gigabyte RTX 3060ti
    Then: RTX 2080 Ti ($1,199) = RTX 3070 ($499).
    Now: RTX 3080 Ti ($1,199) = ??

    So, what card in this new 4000 series will match the performance of that $1,199 card from the 3000 series, and how much more than $499 will it cost?
     
  19. Undying

    Undying Ancient Guru

    Messages:
    25,642
    Likes Received:
    13,040
    GPU:
    XFX RX6800XT 16GB
    The 4080 12GB is about as fast as, or a little slower than, the 3090 Ti, so let's say the 4070 will be as fast as the 3080 Ti for a 3070 Ti MSRP; that's $599 if we are lucky. I would not be surprised if it costs more.

    That leaves the 4060 Ti at $499, which will be around 3080 10GB level, and the 4060 at $399 at 3070 level of performance.
     
  20. Spets

    Spets Guest

    Messages:
    3,500
    Likes Received:
    670
    GPU:
    RTX 4090
    You still have the benefits of DLSS 2 + Reflex to improve input lag; DLSS 3 is just throwing frames on top, so I don't think the input lag will be a problem. I've yet to come across a game that can only do DLSS 2 in performance mode below 50 frames; the extra frames will, what, add let's say 15 frames on top of that, so the ms wouldn't feel that different.
    If the interpolation/"extra frames" ends up looking as strange as it does on TVs, I hope we have a way to turn it off :D
     
