NVIDIA Announces GeForce RTX 3070, 3080 and 3090 (With Even More Shader Processors)

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 1, 2020.

  1. Ok, gotcha - yeah, poor optics there.
    Same thought - I'll bet the yields are low & supply isn't going to be high.
     
  2. HybOj

    HybOj Master Guru

    Messages:
    230
    Likes Received:
    162
    GPU:
    ASUS GTX 970 DCmini
    Because the 3090 is niche... it's NOT the equivalent of the 2080 Ti. The 3090 will be produced in VERY limited quantities. It's just there so it appears that NVIDIA has a behemoth of a consumer card, but it's more like a paper launch for a very limited audience. Just watch... you won't be able to buy the card.
     
    Fediuld likes this.
  3. pharma

    pharma Ancient Guru

    Messages:
    1,858
    Likes Received:
    656
    GPU:
    Asus Strix GTX 1080
    There will be cards for a proper launch. The problem is that demand for the cards will be unprecedented, even for the RTX 3090. AIBs should have all models on the shelves on Sept. 17, at the same time as the FE cards.
     
    Maddness likes this.
  4. thesaiyan

    thesaiyan New Member

    Messages:
    6
    Likes Received:
    1
    GPU:
    EVGA GTX 1080
    I never get why people compare a $500 GPU as being equivalent or "the same price" as a PS5 or Xbox Series X. Where do you leave the SSD, RAM, CPU, motherboard and optical drive? Of course I would expect a $500 GPU to obliterate the unreleased consoles! With the rest of the components, you are comparing at least a $1,000 PC with a $500 console.
     

  5. Denial

    Denial Ancient Guru

    Messages:
    13,623
    Likes Received:
    3,182
    GPU:
    EVGA RTX 3080
    For me it's the same reason why you don't consider the TV as part of the console's cost.

    I need a relatively powerful desktop for work, so all those components are essentially a sunk cost - I'm going to have them whether I play video games or not. To turn it into a gaming PC I can simply slot a $500 graphics card in.

    It's definitely a skewed comparison though, and I'm willing to bet the majority of PC gamers (the ones buying $500 graphics cards) are spending far more money on hardware than console players are.
     
  6. Fediuld

    Fediuld Master Guru

    Messages:
    651
    Likes Received:
    346
    GPU:
    AMD 5700XT AE
    And let's not forget, the APU found in the Xbox Series X is at MINIMUM 30% faster than the 5700 XT.
    (52 RDNA2 CUs vs. 40 RDNA1 CUs)
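    For what it's worth, the "minimum 30%" figure falls straight out of the compute-unit counts quoted above; a quick sketch, assuming equal per-CU throughput (which ignores RDNA2's clock and architectural changes, hence "minimum"):

    ```python
    # CU-count scaling behind the "minimum 30%" claim above.
    # Assumes equal per-CU throughput; RDNA2's higher clocks and
    # architectural gains would only push the real gap higher.
    xbox_series_x_cus = 52   # Xbox Series X GPU (RDNA2)
    rx_5700_xt_cus = 40      # Radeon RX 5700 XT (RDNA1)

    speedup = xbox_series_x_cus / rx_5700_xt_cus
    print(f"{(speedup - 1) * 100:.0f}% more compute units")  # prints "30% more compute units"
    ```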
     
    PrMinisterGR likes this.
  7. nick0323

    nick0323 Maha Guru

    Messages:
    1,003
    Likes Received:
    55
    GPU:
    Asus DUAL RTX2060S
    As a Brit, I love Australian pricing, and Singaporean pricing even more.

    $809 = £445. Now subtract the tax I'll get back at the airport and that figure is even closer to £400. That's about £100 cheaper!
    THIS is why I always buy my electronics abroad.

    Any electronics that require a plug I get from Singapore, since they use the UK plug as the de facto standard.
     
  8. sbacchetta

    sbacchetta Member Guru

    Messages:
    140
    Likes Received:
    66
    GPU:
    GTX 1080ti
    The problem with the whole PC vs. console price argument is that everybody is applying their own use case to everyone else.

    Like almost everything in life there isn't a universal rule that applies to everybody.
     
    Last edited: Sep 3, 2020
  9. pharma

    pharma Ancient Guru

    Messages:
    1,858
    Likes Received:
    656
    GPU:
    Asus Strix GTX 1080
    On paper. You basically need benchmarks to determine what it actually is.
     
  10. k0vasz

    k0vasz Active Member

    Messages:
    71
    Likes Received:
    26
    GPU:
    nV GTX1060 6G
    Same here. To get a gaming rig, I just need to put a single GPU in my work rig, so a console and a (pricey) GPU are very comparable for me.
     

  11. Denial

    Denial Ancient Guru

    Messages:
    13,623
    Likes Received:
    3,182
    GPU:
    EVGA RTX 3080
    pharma, OnnA, Noisiv and 1 other person like this.
  12. pegasus1

    pegasus1 Maha Guru

    Messages:
    1,215
    Likes Received:
    190
    GPU:
    ROG 6900XT@2.3Ghz
    OCUK is showing prices on my phone but not when I access via laptop. The cheapest 3080 is £639, then a load of cards for £649. That's £45 less than I paid for my 1080 Ti on release day.
     
  13. Embra

    Embra Maha Guru

    Messages:
    1,215
    Likes Received:
    406
    GPU:
    Vega 64 Nitro+LE
    I think the 3090 will be in very low supply. Definitely a "halo" card.

    Grab one if you can, if it's what you want.
     
  14. sbacchetta

    sbacchetta Member Guru

    Messages:
    140
    Likes Received:
    66
    GPU:
    GTX 1080ti
    @haste Check this block diagram from HardwareLuxx (found at videocardz); it seems they have replaced the FP64 ALUs with 2x FP32 ALUs for gaming (RTX) Ampere
    https://cdn.videocardz.com/1/2020/09/NVIDIA-GeForce-Ampere-SM-Layout_HardwareLuxx.jpg

    Edit: this is a Turing diagram

    https://cdn.videocardz.com/1/2020/09/NVIDIA-Turing-SM.jpg

    Edit 2: this is the overall article from videocardz, providing further info on other topics as well

    https://videocardz.com/newz/nvidia-provides-further-details-on-geforce-rtx-30-series
     
    Last edited: Sep 3, 2020
  15. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,367
    GPU:
    6900XT+AW@240Hz
    A long time ago, @Hilbert Hagedoorn marveled over the fine details in Battlefield uniforms when he ran it on a 4K screen. I doubt that such fine details will be there if you upscale something to 8K from a 9-times-lower resolution, which is 2560x1440.
    Why the heck 9 times? It makes it look like it is not even a 4K GPU. A 4-times upscaling mode from 4K would do the trick too.
    This 9-times upscaling is like taking 640x360 and turning it into 1080p. Everyone can imagine the result. There is no benefit to going 8K if it does not improve image quality over 4K.
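    The pixel arithmetic in the post above checks out; a quick sketch (the resolutions are the standard 8K/4K/1440p/1080p figures, nothing assumed beyond that):

    ```python
    # Pixel-count ratios behind the "9 times" comparison above.
    def pixel_ratio(hi_res, lo_res):
        """How many times more pixels hi_res has than lo_res."""
        (w1, h1), (w2, h2) = hi_res, lo_res
        return (w1 * h1) / (w2 * h2)

    print(pixel_ratio((7680, 4320), (2560, 1440)))  # 8K from 1440p  -> 9.0
    print(pixel_ratio((1920, 1080), (640, 360)))    # 1080p from 640x360 -> 9.0
    print(pixel_ratio((7680, 4320), (3840, 2160)))  # 8K from 4K     -> 4.0
    ```

    So upscaling 1440p to 8K invents nine pixels for every rendered one, exactly the same ratio as stretching 640x360 to 1080p, while 4K-to-8K only has to invent four.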
     
    pegasus1 likes this.

  16. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,523
    Likes Received:
    2,927
    GPU:
    MSI 6800 "Vanilla"
    Though through DLSS and the AI scaling algorithm, the learning samples and training model could be built from a super-sampled 16K (15360x8640) source, which would be how some finer details are preserved, and why DLSS 2.0 titles are held up as examples of looking better than native resolution. :)

    In some specific cases, and depending on how it's trained, artifacts and loss of certain details still need work and improvement, going by what I've been reading up on, primarily from Death Stranding, as there are still only a few games using DLSS 2.0.

    EDIT: We'll see, though, especially going from the older scaling-mode maximum (4x?) up to 8x or higher here, and how that then affects things. It could still be an improvement in overall scaling quality and the algorithm, particularly for 4K and lower, even if 8K might see some loss of detail.

    A TV, and sitting a bit further from the display, also helps for such setups.
    (Potentially consoles too, but the hardware isn't there, unless a Switch 2 GPU were to feature DLSS, which would be interesting if NVIDIA and Nintendo's partnership continues.)
     
  17. Denial

    Denial Ancient Guru

    Messages:
    13,623
    Likes Received:
    3,182
    GPU:
    EVGA RTX 3080
    I would imagine it's an option. I'd think running 4K + DLSS up to 8K would result in lower framerates than just native 4K. So in games where the card is running close to 4K60, trying to play those titles at 8K DLSS (from 4K) would result in an experience below 60 fps.

    Dunno, we'd have to see comparisons on an 8K screen to really know where the performance actually falls. It might be a better experience upscaled from 1440p, and that's why the option is there.

    Fox's position is that DLSS on average doesn't look better than native. I might be inclined to agree, but I think it gets 90% of the way there, while significantly boosting performance and reducing the issues caused by most TAA methods.

    The biggest problem is that there is a range of TAA implementations; some are really bad, others aren't. Further, in games like F1, DLSS just seems to do a shitty job compared to other titles.
     
  18. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,523
    Likes Received:
    2,927
    GPU:
    MSI 6800 "Vanilla"
    Yeah, the training model likely plays a huge factor in this. Training the entire game multiple times over is a time-consuming affair, which DLSS 2.0 seemingly moves away from (but not entirely; it simplifies the requirements a bit, though I misunderstood this entirely at first), requiring less data for similar results in terms of quality. It can't quite do away with it, though, or I'd imagine a stand-alone implementation would be very rough with the current tech, much as it would be an impressive accomplishment too.

    Thus the mixed results, both within parts of a game and between games, including those on the current latest 2.0 version.
    Separating it from a per-game implementation would also allow upgrading and improving the tech over time through driver updates, instead of leaving a bunch of 1.x and 2.0 titles stranded once active support and further patches end.

    EDIT: Oh, and shaders, the current usage of temporal effects and screen space, and getting that right; probably a lot of complications around some of these elements too.

    EDIT: I think this can be a huge thing depending on how it goes, but it is limited by the per-game implementation and overall support in games. That's improving, especially if it can cover game engines, although even then it would be up to developers to use the tech, and to use it well.
    (It might also still require working directly with NVIDIA for some of it to get it implemented with the best possible quality.)

    The performance potential alone probably makes it an attractive feature for a lot of the user base, letting their GPU hit 1920x1080, 2560x1440 or 3840x2160 at a much higher framerate, with an acceptable reduction in image quality as the trade-off for the framerate gain.

    Which can only improve as development and research continue. :)

    Combined with other tech, whether game-implemented or from the driver or third-party tools, you'd possibly get the higher framerate plus the resulting improvements to overall smoothness, frame-time variance and stability, and, on high-refresh displays, a greater reduction in input latency, which is great.
    (Using the additional framerate for net gains overall and keeping it stable at a higher framerate target, so to speak.)

    EDIT: Depending on the game, engine optimizations and all kinds of other factors.
    Hit a framerate dip from 100+ fps, or a hitch and stutter, and yeah.

    Frack.

    Think that covers it. :D
     
    Last edited: Sep 3, 2020
  19. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,367
    GPU:
    6900XT+AW@240Hz
    In situations where a game gains no fine details going from native 1440p to native 4K, I do agree. But as with 640x360 => 1920x1080, absent details will not come out of thin air.
    And that means, if one can run native 4K @ 60 fps in a game that adds fine details past 1440p, one is better off running just native 4K than upscaling 1440p to 8K for the sake of bragging rights.

    In games that can't add detail via finer textures or more detailed shaders past a certain resolution, DLSS is not a bad idea, as it does a good job on AA and there are no details to be lost.
    But with most of the AAA games that implement DLSS/DXR, there is a lot to be seen at higher resolutions that's often not there due to upscaling.
     
  20. Exodite

    Exodite Ancient Guru

    Messages:
    2,053
    Likes Received:
    232
    GPU:
    Sapphire Vega 56
    It's hard to put into words how much statements like this drive me crazy.

    I was watching DF's coverage and got to the part where they were talking about DLSS being better than native before I had to stop.

    DLSS 2.0 is obviously a great technical achievement, but it's, quite literally, impossible for it to be better than native. The best it can do is be perfectly, pixel-for-pixel, equal.

    Unfortunately it's rarely compared to native rendering anyway, but rather to eye-cancer TAA, which helps immensely with the perception of greatness.

    Feel free to talk up the technology as much as you like; DLSS is something Nvidia should rightly be proud of. But please, please stop with the 'better than native' nonsense.

    For the sake of my blood pressure if nothing else. :p
     
    JonasBeckman and southamptonfc like this.
