NVIDIA Announces GeForce RTX 3070, 3080 and 3090 (With Even More Shader Processors)

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 1, 2020.

  1. itpro

    itpro Maha Guru

    Messages:
    1,364
    Likes Received:
    735
    GPU:
    AMD Testing
    AMD will easily beat the 3070. You don't expect just a new 5700 XT, right? Unless AMD stays mid-range. Even the 5700 XT destroyed the 2070; the same can happen with the 3070. The question is how strong RDNA2 will really be. Perhaps it will surprise us!

    Price is a thing too. I am also wondering about Intel next year, and NVIDIA Super refreshes as well. I can hardly convince myself to buy a GPU in 2020. :rolleyes:
     
    sbacchetta and Undying like this.
  2. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
    Look. The 5700 XT is 40CU RDNA1. If AMD had a 54CU RDNA1 GPU, it would have been equivalent to the 2080 Ti. (It doesn't need to be 56CU because it would have a wider VRAM bus and would not be choking for bandwidth like the 5700 XT does.)

    AMD currently makes a 54CU RDNA2 APU with 2 CUs cut off (hence the advertised 52CU) for the Xbox Series X. That APU is more than 50% faster than the 5700 XT.

    So an 80CU RDNA2 GPU (the rumoured Big Navi) would probably beat the 3090, and it would still be about 2/3 the size of the 3090 die, so much cheaper (the 5700 XT is just 250mm²).
    A 60CU RDNA2 would land around the 3080.

    After that, it all depends on how AMD handles RT etc. We know it is using unified shaders that do all jobs together, and the 52CU Xbox Series X APU reportedly only needed 1/3 of its shaders doing RT to be on par with the 2080 Ti.

    So I would say any sensible consumer should wait until we see RDNA2 reviews.
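
    As a rough illustration of the linear CU scaling assumed above, here is a minimal sketch (the per-CU gain is an illustrative assumption; real scaling also depends on clocks and memory bandwidth):

    Code:
    # Naive CU-count scaling: assume performance scales linearly with CU count
    # at comparable clocks. The ipc_gain value is an assumed per-CU improvement
    # for RDNA2, purely for illustration.
    BASE_CUS = 40  # RX 5700 XT (RDNA1)

    def relative_perf(cus, ipc_gain=0.0):
        return (cus / BASE_CUS) * (1.0 + ipc_gain)

    for cus in (52, 60, 80):
        print(f"{cus} CU RDNA2: ~{relative_perf(cus, ipc_gain=0.15):.2f}x RX 5700 XT")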
     
    Loobyluggs and Undying like this.
  3. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    One year is typically a refresh period. Super came faster than that.

    You always get more down the road - why should that suddenly be a concern? Just don't buy a month before the launch.
     
  4. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    No, dog.

    1. The 5700 XT is not choking.
    2. If it were, you'd be making your goal even harder, because you would need an even bigger bandwidth increase than a simply proportional one.
    3. Your hypothetical RDNA1 card would need to be circa 50% faster, hence it would need at least ~60% more hardware, i.e. 64 CUs, not 54 (rough arithmetic sketched below the link).
    https://www.3dcenter.org/artikel/fullhd-ultrahd-performance-ueberblick-2012-bis-2019
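
    The sub-linear scaling behind point 3, roughly sketched (the scaling exponent is an illustrative assumption, not a measured value):

    Code:
    # If performance scales sub-linearly with CU count (perf ~ CUs**k, k < 1),
    # hitting +50% performance needs noticeably more than +50% more CUs.
    BASE_CUS = 40   # RX 5700 XT
    TARGET = 1.50   # ~2080 Ti level, per the discussion above
    K = 0.9         # assumed scaling exponent (illustration only)

    needed = BASE_CUS * TARGET ** (1 / K)
    print(f"CUs needed for +50%: ~{needed:.0f}")  # ~63 CUs, i.e. roughly +57% hardware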
     

  5. RamGuy

    RamGuy Master Guru

    Messages:
    252
    Likes Received:
    15
    GPU:
    nVIDIA GeForce GTX 580
    As an enthusiast, I'm once again left in this void that is the lack of an xx80 Ti release.

    The numbers are rather impressive; there is no denying that. But at the same time, I'm left feeling that going from a 2080 Ti to a 3080 is a step sideways. There will be a performance uplift, and the price of the 3080 is attractive.

    But at the same time, I can't shake the feeling that a 3080 Ti will show up in 6-12 months and make me regret the decision of getting a 3080.

    But the release of the 3090 has me confused. I know it's being branded as a consumer version of the Titan RTX, resulting in what looks like a really awkward price-to-performance ratio, as Titan cards have had for so long. But what does this entail for the 3080 Ti? Will this affect the release schedule for a 3080 Ti?

    The 2080 Ti was simply a consumerization of the Titan RTX. But the 3090 is already somewhat of a consumerization of an Ampere Titan, and the price of the 3090 is already lower than what we have come to expect from recent Titan cards. Might we end up seeing no 3080 Ti? Or might we end up seeing no Ampere Titan RTX?

    One would figure it's going to be one way or the other. The 3090 places itself between the 2080 Ti and the Titan RTX when it comes to pricing, and considering its 24GB of GDDR6X VRAM it looks more like a Titan card than an xx80 Ti card. Is this NVIDIA bringing the 3090 back down to being more relevant for consumers, or is this NVIDIA pretty much replacing their enthusiast offering with the 3090 branding and increasing its price in the process?

    1499 USD is a hard pill to swallow. But it would be much easier to consider if we actually knew whether we are supposed to look at the 3090 as the new Titan RTX, and should thus expect a 3080 Ti down the line that provides the same amount of performance at a 999 USD price point, or not.

    Considering how the RTX 2000 series had the 2080 and 2080 Ti releasing at the same time, it makes it even more confusing. Are we back to the previous logic where Ti cards would normally show up 9-18 months later, or shouldn't we be expecting any 3080 Ti at all?
     
    Loobyluggs likes this.
  6. TechEnthusiast

    TechEnthusiast Member

    Messages:
    18
    Likes Received:
    4
    GPU:
    RTX 2080ti
    Oh, I do believe the image. I saw the presentation. I may just have seen more charts than you did.
    There were comparisons without RT, right before he switched to showing RT as well.

    Not that I would care about non-RT performance, since I don't need more of that... but still. An almost 100% uplift in pure raster performance is quite nice, no matter how you slice it.
     
  7. user1

    user1 Ancient Guru

    Messages:
    2,782
    Likes Received:
    1,305
    GPU:
    Mi25/IGP
    The Titan isn't really a consumer card; it's kind of a poor man's Quadro feature-wise. So unless NVIDIA has unlocked those extra bits for professional software on the 3090, there will be another Titan. Here's hoping it will be a GA100-based card if we're lucky.
     
  8. alanm

    alanm Ancient Guru

    Messages:
    12,274
    Likes Received:
    4,477
    GPU:
    RTX 4080
    I can't understand NVIDIA's marketing sense re the 3090. They should have just called it a Titan; the price is easier to justify than for a 3090. Plus they could have compared it to the Turing Titan RTX, which was $2,500, and wowed people with the new price/perf vs. that. I have a feeling this was JHH's decision, and marketing is not his strong point.
     
  9. Undying

    Undying Ancient Guru

    Messages:
    25,478
    Likes Received:
    12,884
    GPU:
    XFX RX6800XT 16GB
    The Titan is not for gaming and the 3090 is, even though we all know it is a Titan that NVIDIA wants to sell as a gaming GPU.
     
  10. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    Naming is a subtle point in the GPU lineup. It won't save or ruin your business, right?

    That being said, I doubt they are dropping the Titan brand altogether. Down the road... who knows - maybe even A100-based.
     

  11. ACEB

    ACEB Member Guru

    Messages:
    129
    Likes Received:
    69
    GPU:
    2
    With regard to the PR all being RTX/DLSS on, remember this is how just about every big game release is going to be moving forwards, and the key is in the fact that the consoles have it, which means you know for sure the developers will have to use it; otherwise the casual console fanboy will see it as a negative if it's not in their game.
    I am not a game developer, so I have no idea what the pipeline is to get ray tracing into your game, but I can't see it being difficult when it's essentially plugged into an engine like Unreal Engine 4. This is going to be the norm moving forwards, just like having PhysX was back in the day, because developers are not going to have to fake illumination, just like they didn't have to fake physics.

    Right now there are few uses for ray tracing, but if the DF video and some other insights are correct, then the 3080 is still around 60% faster than the 2080 at the lower end in your traditional AAA games with no ray tracing. The 2080 Ti is nowhere near 60% faster than the 2080; I believe it was about 20%? Considering we know the CUDA core count has essentially doubled and the memory is faster, I don't see any way to be disappointed in 'traditional' performance.
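
    A quick sanity check of what those two figures would imply together (both percentages are the estimates quoted above, not measured numbers):

    Code:
    # If the 3080 is ~60% faster than the 2080, and the 2080 Ti is ~20% faster
    # than the 2080, the implied 3080 vs 2080 Ti margin follows directly.
    r_3080_vs_2080 = 1.60  # rough estimate quoted above
    r_ti_vs_2080 = 1.20    # rough 2080 Ti uplift over the 2080
    print(f"3080 vs 2080 Ti: ~{r_3080_vs_2080 / r_ti_vs_2080:.2f}x")  # ~1.33x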

    As for AMD, I don't see why they would be worried. They would have known all this and will be waiting for the dust to settle, and then you can expect a couple of leaked benchmarks to come out sometime next week. If anything, I think AMD would be happy that the 3090 really isn't that great compared to the 3080, which suggests NVIDIA is pushing hard at that point; maybe they have something secret beyond a normal Super or Ti version. With how powerful the consoles are, we already know what AMD is going to be capable of, and with NVIDIA being semi-reasonable with pricing, it should tell you that AMD is going to bring something good to the table around that 3080 performance point. You may have to wait till late October/November, but then it's not like you can plug in the 3080 until the end of September or the start of October anyway.

    Personally I'm excited for both brands, and I'm also excited about what AMD is bringing with the new Zen 3. It's a great time to be a fan of PC tech: performance has come a long way over the last few years with CPUs, now the GPUs have caught up, and PCIe SSDs are becoming affordable at large sizes. Hell, I even replaced my monitor with a 55-inch gaming TV, that's how crazy things have become lol
     
    sbacchetta and Undying like this.
  12. yeeeeman

    yeeeeman Active Member

    Messages:
    54
    Likes Received:
    18
    GPU:
    Geforce 9600GT 512MB
    Those "10000" cores are not similar to a Turing with 10000 cores...
     
    Fediuld likes this.
  13. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    The Digital Foundry people had access to the cards; they were not allowed to show numbers, but they could show percentages in games selected by NVIDIA.

    Of course NVIDIA wouldn't select games that would show the new architecture badly (as I suspect most old crap like GTA V will), but even in modern games without RT (like Doom Eternal), the new cards actually performed even better.

    DXR and the equivalent of DLSS, or DLSS itself, are important at this point. Buying a new card (even the "cheapest one" at $500) and "not caring" about them is, let's say, thoughtless.
     
  14. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    I would not worry that much. Look:
    The 3090 has 2.7 times as many transistors as the RX 5700 XT.
    The 3070 has 1.8 times as many transistors as the RX 5700 XT... but no extra ROPs and just 15% more TMUs. Sure, it has double the FP16/FP32 TFLOPS. But the 3070 is basically a 2080 with double the shaders and tensor cores; nothing else increased in count outside of the L1 cache and a 14% higher memory bandwidth.

    It is kind of absurd to give those cards an insane increase in shading power and then make them faster at DLSS (which is already a contradiction), and then to give them just 14% higher memory bandwidth compared to a card with half the shading ability.
    There will be scenarios where those cards excel, and some where they'll comparatively suck once the transistor increase is factored in.

    The 3070 is some 40% ahead of the 2070S, and that is some 10% ahead of the RX 5700 XT; therefore the 3070 is some 54% ahead of the RX 5700 XT (1.4 × 1.1 ≈ 1.54).
    I think AMD could manage to deliver a 54% performance uplift over the RX 5700 XT without spending 1.8 times as many transistors, and that would still be RDNA1. That's not a bad starting point. What matters now is:
    "Does RDNA2 need more transistors per SM/TMU/ROP/... than RDNA1?" (Production cost.)
    "Does RDNA2 improve again in scheduling, caching and data compression over RDNA1?" (Better use of memory bandwidth.)
    "Do the SIMDs in RDNA2 have even better utilization than in RDNA1?" (GCN used to sit on its arse, waiting. RDNA1 did not wait as much.)
    "Does RDNA2 improve energy efficiency over RDNA1?" (The 3070 currently has ~50% higher energy efficiency than the RX 5700 XT... at least on paper.)

    The XSX has a 10% lower TDP for the entire chip than the RX 5700 XT, while that TDP is shared with an 8C/16T CPU. The XSX also has some 30% more of everything in the GPU, at a comparable clock. And since it is guaranteed not to throttle the CPU or the GPU, both together will fit into its 200W limit.
    The important part is that the XSX GPU+CPU has 15.3B transistors in total. (A Zen 2 8C/16T core die has some 3.9B transistors.) That leaves some 11.4B transistors for the RDNA2 GPU, the IMC for 10x GDDR6, and everything extra the XSX has.

    In other words: RDNA2 with a 25% wider IMC and 30% more of everything in the GPU than the RX 5700 XT has just 11% more transistors. Doesn't add up, right? So is someone going to revisit the XSX transistor count? Well, the transistor density actually matches quite well, so the numbers may be right.
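
    The two bits of arithmetic above, spelled out as a small sketch (all inputs are the figures quoted in this post, plus Navi 10's publicly listed ~10.3B transistor count; none of it is independently verified):

    Code:
    # Compounding the performance ratios quoted above:
    r_3070_vs_2070s = 1.40
    r_2070s_vs_5700xt = 1.10
    print(f"3070 vs RX 5700 XT: ~{r_3070_vs_2070s * r_2070s_vs_5700xt:.2f}x")  # ~1.54x

    # XSX transistor budget from the quoted counts (in billions):
    xsx_total = 15.3   # whole SoC
    zen2_ccd = 3.9     # 8C/16T Zen 2 core die
    navi10 = 10.3      # RX 5700 XT (Navi 10), from public specs
    gpu_budget = xsx_total - zen2_ccd
    print(f"Left for GPU/IMC/etc.: ~{gpu_budget:.1f}B "
          f"(~{(gpu_budget / navi10 - 1) * 100:.0f}% more than Navi 10)")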

    'To all my Pascal gamer friends: "It is safe to upgrade now." '
     
    sbacchetta likes this.
  15. sbacchetta

    sbacchetta Member Guru

    Messages:
    141
    Likes Received:
    66
    GPU:
    GTX 1080ti

  16. xIcarus

    xIcarus Guest

    Messages:
    990
    Likes Received:
    142
    GPU:
    RTX 4080 Gamerock
    I believe RDNA2 will deliver close to 3080 levels of performance.
    I've noticed that NVIDIA usually 'accidentally' offers slightly better performance at the top end at about the same price, even though they launch earlier than AMD does. I don't think that's a coincidence, 'specially when last generation was so expensive for not much performance benefit outside of ray tracing. It later turned out that last-generation AMD wasn't very competitive.
     
  17. SniperX

    SniperX Member Guru

    Messages:
    149
    Likes Received:
    80
    GPU:
    MSI RTX2080S Trio
    I really like the fact that they state TGP figures and not TDP.

    Also, it feels wrong that there is no Ti version :confused:

    And, are all of these GPUs fully unlocked?
     
  18. AMD fanboys got annoyed with me for mentioning the possibility of RDNA2 not bringing the heat; tsk tsk. Man, if there ever were a time for the Radeon group to put up or ... *ugh* basically die, it's now :(
     
  19. TechEnthusiast

    TechEnthusiast Member

    Messages:
    18
    Likes Received:
    4
    GPU:
    RTX 2080ti
    I am still worried about Big Navi.
    Simply put:
    All the math and guesswork is sound, and it should be "easy" to do... but they are not doing it. Why?
    If all it takes is a few more cores for them to beat a 3080 AND they actually improved the cores as well, why are we still waiting? Why the silence? They could have done at least a third Navi GPU by now if it was that easy. Hell, make an entry-level GPU... a 5800 XT... anything, really.
    But they did not, and we are still not hearing anything.

    That is what worries me. It sounds so easy, yet they are just not doing it.
    It makes me wonder if it is actually as easy as we guess... or if Navi was actually borderline for them. If Navi was very expensive for them to make, that would explain why they don't release more performance brackets; they just needed to stay in the press with something.

    There has to be a reason why they took this long with just two Navi GPUs (which are not bad at all!) without taking the opportunity to scale them to the lower end and the higher end.
     
    Loobyluggs and Maddness like this.
  20. sbacchetta

    sbacchetta Member Guru

    Messages:
    141
    Likes Received:
    66
    GPU:
    GTX 1080ti
    Wafer supply is probably the main issue for AMD. Let's say they released Big Navi now, it's wonderful, and everyone wants one -> not enough supply, prices skyrocket -> bad PR.
    If AMD releases after NVIDIA starts selling the 3000 series by the bucket, AMD can build inventory and work on drivers; they will have less demand to answer and less chance of serious supply issues, and in the end better PR.
    If wafer supply really is a constraint for AMD, coming first or waiting a bit won't, in the end, affect the number of units they sell.

    Edit: in terms of wafer allocation for AMD, discrete GPUs aren't high on the list.
     
