Phoenix GeForce RTX 3080 and RTX 3090 From Gainward leaks (but also confirms specifications)

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 29, 2020.

  1. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    Isn't it always the same?
     
  2. jarablue

    jarablue Member Guru

    Messages:
    167
    Likes Received:
    52
    GPU:
    PNY 4080 Black
    Yup.
     
  3. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    People already accepted whatever nVidia brings this time around. (As they usually do.)
    The circle has closed twice by now: flaming hell when an AMD/ATi card eats more power, contented silence when nVidia's eats more.

    nVidia is yet again going to win by default thanks to mindshare.
     
    sbacchetta and itpro like this.
  4. Xuanzang

    Xuanzang Master Guru

    Messages:
    250
    Likes Received:
    21
    GPU:
    ASUS TUF 7900 XTX
    I want to see some 3090 benchmarks. ;)
     
    MegaFalloutFan likes this.

  5. itpro

    itpro Maha Guru

    Messages:
    1,364
    Likes Received:
    735
    GPU:
    AMD Testing
    I never accepted them. It all depends on whether RDNA 2.0 can become better than RTX 30. If AMD can't compete and win on both efficiency and pricing, I can see people choosing on performance alone. AMD should win 2 of the 3 battles: I don't demand absolute performance, but I do demand both perf/watt and perf/$.
     
    sbacchetta and Fox2232 like this.
  6. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,412
    Likes Received:
    3,078
    GPU:
    PNY RTX4090
    No one can say this yet; we don't have benchmarks or actual cards to test for ourselves. But you have to remember this is a new architecture, on a new, smaller node, with new memory, with a generational leap for RT and tensor cores, more CUs packed in there, and then you have to factor in any improvements in CU IPC as well.

    I expect at minimum 30% better rasterization performance compared to the 2080 Ti (even that seems very low considering all of the above), but I do think you are right that they will be pushing massive RT performance improvements, as this is where the 2000 series lacked big time, and it's probably the reason why those cards were also so expensive compared to previous generations. Imagine if they can claim 100% more performance when RT is enabled. That could be their justification to raise prices AGAIN, like they did with the 2000 series.

    What I can't get my head around is the massive difference in memory sizes: they are jumping from 10GB on the 3080 to 24GB on the 3090, a 14GB gap. That just seems like a massive jump up, though it does leave a lot of room to slot in a 3080 Ti or a 3080 Super later on with something like 14GB and 16GB respectively.
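
    A rough sketch of where those capacities come from, assuming the leaked bus widths (320-bit on the 3080, 384-bit on the 3090) and 1GB GDDR6X chips; the "hypothetical" configurations are illustrations only, not announced cards:

    [CODE]
    # Back-of-the-envelope VRAM sizing: GDDR6X capacity follows from bus width
    # and chip density. Each chip has a 32-bit interface, so chips = bus_width / 32,
    # doubled in clamshell mode where two chips share one channel.
    # Assumes the leaked bus widths; the "hypothetical" lines are not real SKUs.

    def vram_gb(bus_width_bits, gb_per_chip, clamshell=False):
        chips = bus_width_bits // 32        # one chip per 32-bit channel
        if clamshell:
            chips *= 2                      # front-and-back chip placement
        return chips * gb_per_chip

    print(vram_gb(320, 1))                  # 3080:  10 x 1GB = 10 GB
    print(vram_gb(384, 1, clamshell=True))  # 3090:  24 x 1GB = 24 GB
    print(vram_gb(384, 1))                  # hypothetical 384-bit card: 12 GB
    print(vram_gb(320, 1, clamshell=True))  # hypothetical 320-bit clamshell: 20 GB
    [/CODE]

    (That arithmetic also suggests in-between cards would land more naturally at 12GB or 20GB than at 14GB or 16GB, since capacity is tied to the bus width.)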

    Maybe they are waiting to see what AMD has with RDNA2, as internal rumours put the top AMD card either matching or sitting just above the 3080 in terms of performance. Nvidia seems to be leaving a massive gap so they can slot in more cards later on and maybe mess around with pricing if they need to, depending on how competitive AMD is.
     
    XenthorX likes this.
  7. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    More power than what? Because the complaints have come when AMD has needed more power than an equivalent nvidia product....

    If there's an nvidia product using more power for the same performance as an AMD card in the 2000 or 3000 series, I am unaware of it, so please enlighten the rest of us.
     
  8. no_1_dave

    no_1_dave Master Guru

    Messages:
    256
    Likes Received:
    30
    GPU:
    4090
    3090 is the Titan
    3080 is the 3080

    A 3080 Ti will come further down the line with performance very close to the 3090, at 70% of the cost and with less VRAM.

    Not sure whether to hold off for the 3080 Ti, and not sure how comfortably these will fit in my Corsair Air 240 mATX case :confused:
     
  9. pipes

    pipes Member Guru

    Messages:
    182
    Likes Received:
    0
    GPU:
    Rtx 4080 frostbite
    [QUOTE="Fox2232, post: 5823143, member: 243702"] People already accepted whatever nVidia brings this time around. (As they usually do.)
    The circle has closed twice by now: flaming hell when an AMD/ATi card eats more power, contented silence when nVidia's eats more.

    nVidia is yet again going to win by default thanks to mindshare. [/QUOTE]
    Or thanks to the stupidity of people who, like me, own nvidia o_O
     
  10. narukun

    narukun Master Guru

    Messages:
    228
    Likes Received:
    24
    GPU:
    EVGA GTX 970 1561/7700
    10GB SUCKS. If they don't release a 16GB version or more for the RTX 3080, it's going to be a really bad card in my opinion.
     

  11. DannyD

    DannyD Ancient Guru

    Messages:
    6,770
    Likes Received:
    3,783
    GPU:
    evga 2060
    Wait for nvidia's response to big navi then.
    I'm hoping memory compression will be a little better with the 3000 series, no worries about having enough RT cores, and 10GB will do me for 1440p.
    An 8GB 3070 is a little sketchy, but I'd be happy with one.
     
    CPC_RedDawn likes this.
  12. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    If you're talking about delta memory compression, it doesn't affect the total data stored in VRAM, just the effective bandwidth through the bus. Memory bandwidth doesn't seem like it's going to be an issue here, though.
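
    A minimal sketch of that distinction, with made-up numbers (the compression ratio below is purely illustrative): the compressed buffer still occupies its full worst-case allocation in VRAM, while the bytes that actually cross the bus shrink.

    [CODE]
    # Illustrative only; the ratio is made up. Lossless delta color compression
    # reduces bytes transferred over the memory bus, but the buffer is still
    # allocated at its full uncompressed size in VRAM.

    framebuffer_bytes = 3840 * 2160 * 4          # one 4K RGBA8 render target, ~33 MB
    compression_ratio = 1.5                      # hypothetical average ratio

    vram_allocated = framebuffer_bytes                        # capacity cost: unchanged
    bytes_per_pass = framebuffer_bytes / compression_ratio    # bandwidth cost: reduced

    print(f"allocated in VRAM  : {vram_allocated / 2**20:.1f} MiB")
    print(f"moved per full pass: {bytes_per_pass / 2**20:.1f} MiB")
    [/CODE]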

    Yah - I was trying to say this in the other thread. I personally don't care what the total power is as long as the performance scales and the temperature is kept in check (so as not to throttle). If Nvidia wants to launch a 700W card that's twice as fast as a 3090, by all means do so.
     
  13. H83

    H83 Ancient Guru

    Messages:
    5,465
    Likes Received:
    3,002
    GPU:
    XFX Black 6950XT
    In a certain way it makes some sense, but would Nvidia risk making their GPUs at different foundries? The only explanation for this would be a very limited wafer supply from TSMC; I don't know if that's the case.

    Another explanation is that maybe Nvidia is not being totally honest and the TDP of the 3090 is much higher than 350W. Maybe they are copying Intel and providing misleading TDP values, like quoting TDP at base clocks instead of the max power draw at boost speeds...

    I think Nvidia has been decreasing the value of the 70 series since the success of the 970, which was so good that it made the 980 almost useless... Now they make the performance difference between the 70 and 80 series bigger than ever, so there's a clear gap between them. And to make it worse, the price of the 70 series has been increasing like crazy...
     
  14. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    The 3090 is GA102 vs the 3080, which is GA104. So it's possible GA102 is TSMC and 104 isn't -- Nvidia's never split foundries at the high end, but it has definitely done so within the same architecture.
     
    Deleted member 213629 likes this.
  15. Hog54

    Hog54 Maha Guru

    Messages:
    1,246
    Likes Received:
    68
    GPU:
    Asus Tuf RTX 3070
    Who puts a $1500 graphics card in a $99 case? lol
     
    Deleted member 213629 likes this.

  16. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,011
    Likes Received:
    7,353
    GPU:
    GTX 1080ti
    Uh... no.

    Both are GA102.

    There was supposed to be a GA103 as a 320-bit design in between 102 and 104, but Nvidia decided not to use it for GeForce.
     
    sbacchetta likes this.
  17. "ExpertTool ll" - For frying your video card in record timing' :confused:
    But... but bb bbut... :eek:
     
    XenthorX likes this.
  18. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    True

    Coulda sworn they were different. Dunno then; I go back to what I said in the other thread, the whole thing seems weird. Just gonna buy a 3080 and call it a day. I want to go 3090 and 4K, but I can't find a 4K monitor that tickles my fancy, and for QHD a 3090 seems like overkill.
     
  19. MegaFalloutFan

    MegaFalloutFan Maha Guru

    Messages:
    1,048
    Likes Received:
    203
    GPU:
    RTX4090 24Gb
    This year has the most things to buy ever; the 3090 looks like a must-have.


    Try the LG OLED CX 48-inch. I use the 55-inch as my monitor and it's amazing: image quality, G-SYNC, size, low-latency gaming mode, HDMI 2.1 support, and a 4K/120Hz/10-bit/4:4:4 mode (rough bandwidth math after this post).
    I've been doing it since 2016. I first got a C6P and at the end of 2019 gave it to my parents for the living room; it still kicks ass, no burn-in, and it's an OLED that's already 4 years old and was (and still is) turned on all day long.
    Instead I got a C9, since it was the end of 2019 and the CX was still in the future.

    But if I was buying one now, I would get the 48-inch model; it's better suited for a desk and still covers my field of view
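
    For what it's worth, the rough bandwidth math mentioned above, showing why that 4K/120Hz/10-bit/4:4:4 mode needs HDMI 2.1 (active pixels only; blanking and encoding overhead push the real requirement higher):

    [CODE]
    # Rough data-rate check for 4K / 120Hz / 10-bit / 4:4:4.
    width, height, refresh = 3840, 2160, 120
    bits_per_pixel = 3 * 10                      # 10 bits per channel, no chroma subsampling

    gbps = width * height * refresh * bits_per_pixel / 1e9
    print(f"~{gbps:.1f} Gbit/s")                 # ~29.9 Gbit/s for active pixels alone

    # HDMI 2.0 tops out at 18 Gbit/s, so this mode needs HDMI 2.1's 48 Gbit/s link.
    [/CODE]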
     
    Last edited: Aug 30, 2020
  20. iicycube

    iicycube Member

    Messages:
    48
    Likes Received:
    3
    GPU:
    Nvidia 3080TI OC
    Just waiting for the "Ti" version; that will be the best buy.
     
