NVIDIA Ampere cards well under way for a release in September? Flagship 50% faster?

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 3, 2020.

  1. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
    These are more expensive to make than Turing. Nvidia is not a charity, and they will pull an Intel.
    The top of the range, with atrocious yields due to the die size, will be expensive and hard to find, and the market will be flooded with mid-range cards.

    People forget the same rumour said that the chip is over 800mm² on Samsung's 8nm process. That chip is huge for any meaningfully profitable product at nodes smaller than 16nm. Even at 700mm² the chip is going to be almost twice as expensive as the one used in the 2080 Ti.

    AMD, on the other hand, has a much smaller chip (around 490mm², a bit less than twice the size of the 250mm² 5700 XT), so it would be easy to sell cheap if needed.

    So don't expect Ampere to be cheaper than Turing; it will probably be more expensive at the top of the range.
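    To put rough numbers on that, here is a minimal sketch of the usual dies-per-wafer and yield arithmetic; the 300mm wafer, the 0.1 defects/cm² defect density and the die areas below are illustrative assumptions, not known Samsung figures.

    Code:
    import math

    def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
        """Rough dies-per-wafer estimate (ignores scribe lines and reticle limits)."""
        r = wafer_diameter_mm / 2
        return int(math.pi * r**2 / die_area_mm2
                   - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

    def yield_fraction(die_area_mm2, defects_per_cm2=0.1):
        """Poisson yield model: fraction of dies that come out defect-free."""
        return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

    for name, area in [("~800mm² big Ampere (rumoured)", 800),
                       ("~750mm² TU102 (2080 Ti)", 750),
                       ("~250mm² Navi 10 (5700 XT)", 250)]:
        total = dies_per_wafer(area)
        good = total * yield_fraction(area)
        print(f"{name}: {total} dies/wafer, ~{good:.0f} good dies")

    Under those assumed numbers the small die gives several times more good chips per wafer than the ~800mm² one, which is the cost gap being described above.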
     
  2. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    No offense, but no sht. Who wouldn't want a 40% faster GPU for 40% more power? You get the same efficiency with more performance under the hood - where's the downside in that?
    Whether inadvertently or not, you're actually making a case for perf PER WATT in your post - twice. Despite starting with a call for more rendering power regardless of TDP - as long as we can "properly cool it".

    Agreed.
    350W would be somewhere between kinda bad and borderline acceptable, because it could just as well turn into ~300W real power draw, which is just... what... ~10% more power than a 2080 Ti. Which I guess is acceptable due to the VRAM inflation.

    But if the card really needs 350W, or god forbid 400W, that would spell trouble. Which is why I disagree with @Denial: IMHO a 400W GPU can never turn out to be "perfectly fine" in the real world.
    Because if you are so desperate to grab a measly 5% of performance by inflating your TDP by a hundred-ish watts, that means you have unacceptably low performance to begin with.
    There is no getting around that: something went wrong if you are that desperate to extract those last few percent of performance.
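    To illustrate the arithmetic behind that - using the hypothetical numbers from this thread, not leaked figures:

    Code:
    def perf_per_watt(perf, watts):
        return perf / watts

    base_perf, base_power = 1.00, 250  # normalised 2080 Ti-class baseline (assumed)

    # +40% performance for +40% power: efficiency is unchanged.
    same = perf_per_watt(1.40 * base_perf, 1.40 * base_power) / perf_per_watt(base_perf, base_power)
    print(f"{same:.2f}x perf/W")   # 1.00x

    # +5% performance for ~100W extra: efficiency drops by a quarter.
    worse = perf_per_watt(1.05 * base_perf, base_power + 100) / perf_per_watt(base_perf, base_power)
    print(f"{worse:.2f}x perf/W")  # 0.75x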
     
    carnivore and Fediuld like this.
  3. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
    Nvidia is using Samsung's 8nm mobile process (essentially the 10nm process with a few tweaks). A chip of that sheer size is going to need a lot of electricity, given the power-saving nature of the process.
     
  4. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    as I understand it there are two or three variations of this 8nm process
    why would they make this kind of chip on a mobile node? isn't a mobile node suited for like the 15-25-45W range?
     
    Silva likes this.

  5. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    The guy I'm quoting literally said he wouldn't want that - but he's saying that based on "ignoring scaling", which again is my problem... everyone is just ignoring scaling for whatever reason, and it makes no sense.

    Also where are you getting 5% from?

    We don't have performance numbers at all - people are just seeing 350W and going "it's too high", but if it scales fine, then like you said - no crap, everyone should want that. For some reason everyone is just assuming it won't scale properly, based on nothing.

    Like you're sitting here saying 350/400W is too high - can't turn out "perfectly fine" in the real world - but people were literally saying that about the GTX 480 in 2010, which was 250W, and now 250W is fine - because again, cards came out that scaled in performance to that wattage and cooling was improved upon.
     
    Last edited: Aug 4, 2020
    fry178 and Solfaur like this.
  6. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    the problem is it's another 100W
    I don't know where the limit is, but it has to be somewhere. and 250W is not 350W.

    anyway, I don't believe in 350W cards one bit.
     
  7. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    Why though? Again, why is the limit 250W and not 350W? Lol..

    What if I told you Nvidia could build an architecture that requires a minimum of 350W but is 500,000 times faster than what's out now. Would you say "nah, I don't want that, it's past the limit"? Obviously not. You can't have a meaningful discussion on this topic without the performance numbers. And now, to make it worse, we're just making up numbers - 100W for a 5% increase in performance - where is that even coming from?
     
    fry178, PrMinisterGR and Solfaur like this.
  8. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super

    From the way power scales with performance in the upper range. From my own experience with down/overclocking. From previous attempts at attacking the usual TDP standard (GTX 480, Vega, Fury), which turned out to be moves of desperation.
    IMHO 300-350W could turn out to be anything from meh to acceptable due to the inflation of memory. 350-400W would simply mean trouble.
     
    carnivore likes this.
  9. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,973
    Likes Received:
    4,341
    GPU:
    HIS R9 290
    That's fair; I get that, and I have no interest in telling you what your opinions/preferences should be. But you were speaking broadly about why people care about wattage, and there are valid reasons, regardless of efficiency or performance. Hence my point about comparing a 9920X vs a 3900X.
    If all you're looking at is just the wattage, yeah, I'd totally agree. But the wattage itself isn't the problem, it's all the side effects that go with it (how and where the heat is dissipated, noise, lifespan of the hardware, physical size, more expensive components, etc.).
     
    Noisiv likes this.
  10. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,547
    Likes Received:
    608
    GPU:
    6800 XT
    If it was 15% in optimized titles it would be even worse tbh.

    [IMG]

    Considering this is the best we can see in really optimized titles. And even in a lot of titles something like the 5700 XT is only 20% behind - while supposedly being half of what RDNA2 is going to be - or even less behind, like in these two titles. Normally it's 15-30%.

    [IMG]
     
    Fediuld likes this.

  11. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,547
    Likes Received:
    608
    GPU:
    6800 XT
    Tbh we can't expect wattage to just keep dropping as we move to smaller nodes, considering we are not making big jumps anymore - and haven't been for a while. It's nice that these are only 320W.
     
  12. slyphnier

    slyphnier Guest

    Messages:
    813
    Likes Received:
    71
    GPU:
    GTX1070
    For me personally, scaling or not, raising the wattage again and again is still no good.
    If we look at semiconductor trends, the whole nm race is all about power efficiency and lower thermals.

    If Nvidia is now raising the wattage to raise performance, then it's more like they are just brute-forcing it rather than really improving the architecture design. I mean, they should already get some benefit from the new process, going from 12nm Turing to 7/8nm Ampere, right?

    We have yet to see anything about the cooling, but the basic rule is usually that more wattage = more heat.

    Aside from that, for me personally the electric bill is already high here - a small studio apartment costs like $150/month.
    A bit off-topic, but how much do you guys pay for electricity? It seems some people don't really care about their electric bill, as it's cheap enough.
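    For a rough idea of what the extra draw alone would cost - the rate and hours here are assumptions, plug in your own tariff:

    Code:
    def monthly_cost(extra_watts, hours_per_day, price_per_kwh):
        """Extra monthly electricity cost from the added GPU power draw."""
        kwh = extra_watts / 1000 * hours_per_day * 30
        return kwh * price_per_kwh

    # Going from a ~250W to a ~350W card, gaming 4h/day at an assumed $0.20/kWh:
    print(f"${monthly_cost(100, 4, 0.20):.2f} per month")  # ~$2.40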
     
    Brasky likes this.
  13. NVIDIA is offering a "cards for clunkers" program
     
  14. Legacy-ZA

    Legacy-ZA Master Guru

    Messages:
    271
    Likes Received:
    203
    GPU:
    ASUS RTX 3070Ti TUF
    I have to say, I am disappointed with the amount of VRAM, I expected the lowest card to have at least 10GB.
     
  15. Brasky

    Brasky Ancient Guru

    Messages:
    2,602
    Likes Received:
    640
    GPU:
    Gigabyte 4070 Ti Su
    electricity is cheap, but i'd still like to see a greater increase in efficiency.
     

  16. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,212
    Likes Received:
    1,536
    GPU:
    NVIDIA RTX 4080 FE
    The RTX 3080 Ti seems to be missing from that list, so even if those cards do come out in September, I would be happy to wait for the inevitable Ti variant with 12 or 16 GB of VRAM in 2021. The 3080 might be faster than my GTX 1080 Ti, but it has less VRAM (10 GB vs. 11 GB), so I definitely would not buy that. And the Titan is likely to be £2,000-£3,000 knowing NVIDIA, so that would never be a consideration.

    My GTX 1080 Ti is still a damn fine card, and while I cannot use ray tracing or DLSS, I am still gaming at 1440p on a G-SYNC display and will be for years to come, so I have no need for a powerhouse 4K gaming card. All I want is an £800 (at most) ray-tracing card that offers a substantial performance increase over my 1080 Ti, something the 2080 Ti did not offer (sure, it was faster, but that £400-£500 was a massive price hike over previous high-end cards, and ray tracing is still poorly supported outside of a handful of key releases, making it not very good value in my book).

    Actually, I may even consider AMD's new cards if they turn out to be competitively priced and NVIDIA's cards end up being uber-expensive again. I will be buying a PS5 for 4K HDR gaming on my LG B9 OLED TV anyway.
     
  17. Great White Shark

    Great White Shark Active Member

    Messages:
    90
    Likes Received:
    41
    GPU:
    GTX 1070 Jetstream
    Don't get me wrong, I love using Nvidia products, but they have been a highly arrogant company since the bitcoin scandal, hiking the prices of their previous series of cards. With most of the world suffering through the COVID-19 pandemic, it will be very interesting to see how they price - or over-price - their new cards.

    Anyway, I'm confused about their specs: I understand the RTX 3070/3070 Ti are 8GB and the 3090 Titan is 24GB, but why the hell does the RTX 3080 only have 10GB??? Shouldn't it be 12GB, as memory normally rises in 4s, i.e. 8GB, 12GB, 16GB? Can anyone explain Nvidia's thinking please?

    • RTX 3090 (Titan): 5248 shaders | 24GB GDDR6X VRAM | 350W TDP
    • RTX 3080: 4352 shaders | 10GB GDDR6X VRAM | 320W TDP
    • RTX 3070 Ti: 3072 shaders | 8GB GDDR6X VRAM | 250W TDP
    • RTX 3070: 2944 shaders | 8GB GDDR6 VRAM | 220W TDP
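    For what it's worth, GDDR6/GDDR6X capacity usually tracks the memory bus width rather than rising in 4s: each chip sits on a 32-bit slice of the bus, so with 1GB chips a 320-bit bus gives 10GB. A quick sketch - the bus widths and chip configurations below are the rumoured figures, not confirmed specs:

    Code:
    def vram_gb(bus_width_bits, gb_per_chip=1, chips_per_channel=1):
        """Each GDDR6/GDDR6X chip occupies one 32-bit channel of the memory bus."""
        return (bus_width_bits // 32) * gb_per_chip * chips_per_channel

    print(vram_gb(320))                       # 10 GB - rumoured 320-bit bus (3080)
    print(vram_gb(256))                       #  8 GB - rumoured 256-bit bus (3070/3070 Ti)
    print(vram_gb(384, chips_per_channel=2))  # 24 GB - rumoured 384-bit bus, chips doubled up (3090)
    print(vram_gb(384))                       # 12 GB - what a 384-bit card with single 1GB chips would give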
     
  18. Silva

    Silva Ancient Guru

    Messages:
    2,048
    Likes Received:
    1,196
    GPU:
    Asus Dual RX580 O4G
    @Denial

    Obviously a well-designed product won't matter that much to the majority of buyers acquiring it, even if it's rated at over 9000 (joke). That said, you have to take two things into consideration: first, every generation has good and bad cards, someone is bound to f up; second, I do feel the heat coming out of my case with a Ryzen 2600 + RX 580 combo, and although it's not noticeable in winter, I do notice it in summer. How would a 600W GPU perform and feel in +30ºC weather? You can bet the temperatures will not be reasonable; it gets increasingly harder to cool a GPU efficiently with economically viable solutions, unless you don't mind asking Intel for their portable AC.
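    A quick back-of-the-envelope on the heat side - the wattages here are just the hypothetical figures thrown around in this thread:

    Code:
    def btu_per_hour(watts):
        """Essentially all power a card draws ends up as heat in the room (1 W is about 3.412 BTU/h)."""
        return watts * 3.412

    for watts in (250, 350, 600):
        print(f"{watts}W card -> ~{btu_per_hour(watts):.0f} BTU/h dumped into the room")
    # A 600W card alone puts out roughly as much heat as a small space heater.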
     
  19. Supertribble

    Supertribble Master Guru

    Messages:
    978
    Likes Received:
    173
    GPU:
    Noctua 3070/3080 FE
    Turing had a big performance upgrade in Horizon 4. I'm not sure if that graph reflects that. It might, or might not. I'm leaning towards not. Dunno.
     
  20. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
    I believe NV stated there will be no 3080 Ti; it would be the 3090 for the full-fat chip.

    Both Samsung processes are for mobile.

    Samsung 8LPP is the only process that can make the huge dies Nvidia requires; it is designed for slow-speed mobile SoCs and is 100% DUV lithography.
    Nvidia cannot use 8FDS (which is designed for RAM) nor 8LPU, which is for tiny, high-speed (3GHz) 45W dies, not 800mm² behemoths.
    And GPUs don't run at 4GHz; otherwise the GPU would require a 1000W PSU on its own.

    And rumours already state that the Ampere cards have a special power socket to deliver more power than 2x8-pin connectors can.
     

Share This Page