Rumor: NVIDIA GeForce RTX 3070 and 3080 Coming Q3 2020 + Specs

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 29, 2020.

  1. DesGaizu

    DesGaizu Ancient Guru

    Messages:
    3,712
    Likes Received:
    74
    GPU:
    AORUS 3060TI
    I have about £550 put aside for a GPU upgrade from my still-faithful 980 Ti; the way Nvidia prices are going, I might only be able to afford the 3050 ... ;_;
     
  2. EL1TE

    EL1TE Guest

    Messages:
    426
    Likes Received:
    102
    GPU:
    GTX1080 SH X 2.1GHz
    I rarely look at benchmarks beyond the first ones for a product, so normally the Founders Edition reviews; it's expected, though, that the cards perform better over time.

    I actually found a video comparing all three cards. The results seem mixed, but the GTX 1080 is indeed closer to a 2060.



    It would be sad if the formula stayed always the same. Either way, I want to see the prices first, and I'll have to become a console pleb if they're expensive :p (I'll keep the GTX 1080, of course.)
     
  3. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    Yeah, that gap between the 3080 and 3080 Ti is too big. But I guess if it has 1024 tensor cores, then it needs to be that big.

    Also, I saw some news a while ago saying NV will drop the xx70, xx80, and xx80 Ti prices a bit: about $100, and $50 for the xx70 model.



    I was looking at the normal 3080, and those specs put it at roughly a 2080 Ti or a little more, which imo is to be expected.
    The RT core count is good, 2x more. :D
     
  4. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    Historically it's been a 30-50% uplift per generation, and I don't see that changing. What was different this time was the pricing: we didn't get a 2080 Ti at $699.
     

  5. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,040
    Likes Received:
    7,381
    GPU:
    GTX 1080ti
    PS: No they didn't.

    GPUs are not CPUs.
     
    sykozis and Noisiv like this.
  6. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    I don't think it's far-fetched for the 3070 to be as fast as the 2080 Ti. The 1070 wasn't far off the 980 Ti (and I think it was better in some games, but I'm vague on this), and the 2080 Ti wasn't a massive leap over the 1080 Ti, while this time we're on a die shrink and a new architecture to boot. So I would actually be very disappointed if the 3070 weren't as fast as the 2080 Ti; in fact I want it to be faster, to make upgrading more worth my while, considering the 2080 Ti wasn't far above the 1080 Ti.
     
    jbscotchman and Dragam1337 like this.
  7. yeeeman

    yeeeman Member

    Messages:
    29
    Likes Received:
    12
    GPU:
    9600GT 512mb
    I agree. If we assume that CUDA cores keep the same per-core performance, then the 3070 has more cores than the RTX 2080, so it will be at least better than that. But I doubt Nvidia will leave the cores untouched, given this long span of development time. So I expect the RTX 3070 to be around RTX 2080 Ti level, which is just ... massive.
    AMD fanboys are happy that RDNA 2 will bring RTX 2080 Ti performance, but Nvidia is moving that performance tier to the mainstream, lol. AMD will not have the same luck with Nvidia as they had with Intel, which was basically stuck with the same product for five years. No wonder they managed to beat Intel, since Intel is still using five-year-old parts to compete with AMD's latest and greatest; and those parts still compete on performance, so I don't know. Call me a hater, but AMD's success on the CPU side is to a big degree related to Intel being stuck in the past.
    As for the 8000+ core behemoth, that will be reserved for HPC. The RTX 3080 Ti will probably have fewer cores, something in the region of 5000, which should bring at least 50% better performance than the RTX 2080 Ti. That sounds good.
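    The core-count reasoning in this post is just linear scaling. A quick sketch in Python, where the unchanged-per-core-throughput assumption and the 3072-core figure for the 3070 are the rumor's premises, not confirmed specs:

    ```python
    # Naive linear scaling of performance with CUDA core count, assuming
    # per-core throughput and clocks stay the same (a deliberate simplification).
    def scaled_perf(base_perf: float, base_cores: int, new_cores: int) -> float:
        return base_perf * new_cores / base_cores

    RTX_2080_CORES = 2944       # known Turing spec
    RUMORED_3070_CORES = 3072   # hypothetical figure from the rumor mill

    # Relative to RTX 2080 = 1.0, the rumored 3070 lands slightly above it:
    rel = scaled_perf(1.0, RTX_2080_CORES, RUMORED_3070_CORES)
    print(f"rumored 3070 vs 2080, cores only: {rel:.2f}x")
    ```

    By this yardstick alone the rumored 3070 only beats a 2080 by ~4%, which is why the post leans on per-core improvements to reach 2080 Ti territory.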
     
  8. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    rofl...

    Mate, look at the economics of GPU production. Big means expensive to make. And nVidia has more transistors than AMD in GPUs of the same performance.
    And they are about to cram in many more tensor cores and other blocks, making AMD's advantage even bigger when facing RDNA2.
     
  9. H83

    H83 Ancient Guru

    Messages:
    5,512
    Likes Received:
    3,036
    GPU:
    XFX Black 6950XT
    My next GPU has to offer a 50% performance increase over my 1080 Ti and can't cost more than 500€!
     
  10. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
    Sorry, my fault: the Xbox Series X die is indeed 405 mm², not 495, and the MI60 is 331 mm².
    Still, a direct shrink of Nvidia's 754 mm² 12 nm die comes out close to 460-500 mm² at 7 nm; doubling the cores, as this rumour suggests, nearly doubles that again to 800 mm²+ (I allow some slack because it isn't a straight 40% shrink).

    So regardless, no Nvidia GPU can be made at this size, which is my argument.
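    The die-area arithmetic above can be sketched as a back-of-envelope calculation. The 0.62x area-scaling factor for 12 nm to 7 nm and the 1.75x factor for doubling cores are illustrative assumptions chosen to match the post's own ballparks, not foundry figures:

    ```python
    # Back-of-envelope die-area estimate: direct node shrink, then core doubling.
    # Both scaling factors are assumptions, not official foundry numbers.
    def shrink_area(area_mm2: float, scale: float) -> float:
        """Die area after a direct optical shrink by the given factor."""
        return area_mm2 * scale

    TU102_12NM = 754.0         # 2080 Ti die size in mm^2 at 12 nm
    AREA_SCALE_12_TO_7 = 0.62  # assumed ~38% area reduction (not a straight 40% shrink)
    CORE_DOUBLING = 1.75       # assumed: cores double, but I/O etc. doesn't scale along

    shrunk = shrink_area(TU102_12NM, AREA_SCALE_12_TO_7)
    doubled = shrunk * CORE_DOUBLING
    print(f"direct shrink: ~{shrunk:.0f} mm^2, with doubled cores: ~{doubled:.0f} mm^2")
    ```

    With these inputs the shrink lands near 467 mm² and the doubled-core die above 800 mm², in line with the "460-500 mm²" and "800 mm²+" figures being argued.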
     

  11. Corrupt^

    Corrupt^ Ancient Guru

    Messages:
    7,270
    Likes Received:
    600
    GPU:
    Geforce RTX 3090 FE
    Well yeah, maybe now we'll get an upgrade worth the price, whereas the difference between the 1080 (Ti) and 2080 (Ti) really wasn't worth it.
     
    Solfaur and Dragam1337 like this.
  12. metagamer

    metagamer Ancient Guru

    Messages:
    2,596
    Likes Received:
    1,165
    GPU:
    Asus Dual 4070 OC
    3080ti looks like a beast. The 3080 also looks faster on paper than the 2080ti. I'm ready.
     
  13. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super

    haha, I love how you make Nvidia pay twice for their ray-tracing hardware while giving AMD a freebie for their future implementation of the same feature.

    Also, AMD's advantage(??) will be even bigger when they are on the same node... yet they can barely compete while holding a full node advantage... fascinating stuff :D

    Here is that "advantage":
    performance vs. transistor count vs. power, Turing vs. RDNA, on level ground, both without ray tracing (1660 Ti vs. RX 5500 XT):

    [image: 1660 Ti vs. RX 5500 XT comparison chart]

    https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#GeForce_20_series
    https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units#Radeon_RX_5000_series
    https://www.3dcenter.org/artikel/la...-rx-5500-xt/launch-analyse-geforce-gtx-1650-s
    https://www.3dcenter.org/artikel/la...ch-analyse-nvidia-geforce-gtx-1660-ti-seite-2
     
  14. Elfa-X

    Elfa-X Member Guru

    Messages:
    162
    Likes Received:
    21
    GPU:
    AMD RX 6800 XT
    If those specs are true, it's gonna wipe the floor with your bank account.
     
    DesGaizu, xIcarus, HandR and 2 others like this.
  15. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Now for a reality check.
    Are those GPUs, so praised as new mainstream cards with higher-than-2080 Ti performance, going to come without RT-related parts? No.
    Is AMD's implementation of RT-related tasks based on separate (and therefore extra) building blocks that require their own data path with in/out caches? No.
    Is the 1660 Ti, with 50% more ROPs and 50% higher memory bandwidth, expected to perform better than the 5500 XT? Yes.
    Is the 5500 XT, which compensates with higher clocks, expected to be less power efficient? Yes.

    Is the only really representative sample, the RX 5700 XT, outperforming its closest (and more expensive) competitor, the RTX 2070, while having fewer transistors? Yes.

    When people talk about big-ass GPUs, take the closest representative example, not the worst low-end card you can find.
    - - - -
    So what are the limitations here?
    AMD's statement that RDNA2 is going to deliver about the same performance-per-watt jump as RDNA1 made over the previous comparable generation, which meant 50%.
    Might nVidia deliver a more power-efficient GPU? Yes.
    Will it matter? Only for cards at the 300W limit.
    What will matter is performance per $, as always. And features. Maybe this time the lower-end (read: mainstream) RTX cards will not be half useless at RT, a feature people had to pay quite some premium for.
    - - - -
    If this rumor is to be taken seriously, then the 3070 (1950 MHz) will be able to deliver 90% of the RT performance the 2080 Ti has at 1750 MHz.
    But then you can look at the list and make an educated guess about which of those chips will be full and which will be cut-down GPUs. (Mind the cost.)
    And guess the relative prices of those cards.
    A nice example that touches nVidia's current GPUs is again the RTX 3070. In terms of building blocks it is around the middle of the road between the 2080 and the 2080 Ti, and may have some 16B transistors.
    Take the prices of those two existing cards, approximate how expensive a 3070 would be on the same node, then translate that to the new node and tell me how big a "leap" in performance per $ the 3070 will deliver over the 2080 Ti.
    Even if you account for the exponential growth of price with transistor count, make the old-node approximation for the 3070 quite cheap, and then add just 10% for the cutting-edge node, you still end up with only about 10% higher performance per $ than the 2080 Ti.
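    The perf-per-$ estimate above works out as simple arithmetic. Every input below (the relative performance figures, the $1100 price, the interpolated same-node price, the 10% node premium) is an assumption of the argument, not measured data:

    ```python
    # Rough perf-per-dollar comparison using the post's own assumed inputs.
    PERF_2080TI = 1.35            # assumed performance, with RTX 2080 = 1.00
    PRICE_2080TI = 1100.0         # assumed street price, USD
    PERF_3070 = 1.30              # assumed: "middle of the road" between 2080 and 2080 Ti
    PRICE_3070_OLD_NODE = 880.0   # assumed same-node price, interpolated and kept cheap
    NODE_PREMIUM = 1.10           # assumed +10% for the cutting-edge node

    price_3070 = PRICE_3070_OLD_NODE * NODE_PREMIUM
    ppd_2080ti = PERF_2080TI / PRICE_2080TI
    ppd_3070 = PERF_3070 / price_3070
    gain = ppd_3070 / ppd_2080ti - 1.0
    print(f"perf/$ gain of 3070 over 2080 Ti: {gain:.0%}")
    ```

    With these inputs the gain comes out just under 10%, matching the conclusion that the new node alone buys only a modest perf-per-$ improvement.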
    - - - -
    Then comes the harsh reality of this rumor: the 3080 will have some 2.5B more transistors than the 2080 Ti, and I doubt anyone expects it to be cheaper on a cutting-edge node.
    The 3080 Ti is almost a joke in terms of transistor count, and therefore price, unless it is chiplet-based.
     
    Embra and jura11 like this.

  16. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    A wall of text...
    Once again giving AMD a freebie: the 5700 (no ray tracing) vs. the 2070 (ray tracing), while pretending the 2070S does not exist.

    Insisting on comparing apples and oranges, while glossing over the perfectly level-ground example (1660 Ti vs. 5500 XT)...

    Yeah, right :D
     
  17. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    @Fox2232

    You're unfairly counting the transistor cost of dozens of Nvidia features over RDNA1 that you aren't factoring into your "performance" comparison of the two cards. How does NVENC quality measure up to VCE? VRS? DLSS? Ray tracing? Mesh shaders? These are all features Nvidia has over AMD (and probably more) that add to or change the transistor count but aren't being factored into your "performance" argument here.

    Until you have the actual concrete general performance, RT performance, feature set, and transistor count of a discrete RDNA2 chip, it's pretty meaningless.
     
  18. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    "You get what you pay for" is a lie here: people paid a premium and still had a hard time enjoying the RTX features anyway.

    The next generation of nVidia cards will not deliver any kind of price revolution. The lowest card, the RTX 3060, is practically a 2070S in terms of rasterization but has double the RT core count plus some 15% higher clocks, if we look only at the cheapest 2070S cards.
    That's a nice RT performance boost, one that will be gobbled up quite fast and without extra effort. And as that happens, even the 2080 Ti will become a worse performer in RT games developed for the next generation.
    But at least the RTX 3060 will have reasonable RT performance, and people may finally get what they paid for. (That's if the rumor is true.)

    If you look at the post I originally replied to, which @Noisiv kind of defends for no good reason, you may or may not agree that those "specs" (true or not) will not wipe the floor with anything other than wallets.
     
    Last edited: Apr 30, 2020
  19. DownwithEA

    DownwithEA Active Member

    Messages:
    64
    Likes Received:
    47
    GPU:
    1060 6GB
    True or not, good or bad, all I have to say is: if it costs more than $600, it doesn't exist for me. :(
     
  20. ruthan

    ruthan Master Guru

    Messages:
    573
    Likes Received:
    106
    GPU:
    G1070 MSI Gaming
    Yeah, it's a misspelling: it has to be "boobs clock".
     
    Rich_Guy likes this.

Share This Page