Nvidia Ampere GA100 GPU would get 8192 cores and a boost clock of up to 2200 MHz

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 7, 2020.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,531
    Likes Received:
    18,841
    GPU:
    AMD | NVIDIA
  2. Undying

    Undying Ancient Guru

    Messages:
    25,477
    Likes Received:
    12,883
    GPU:
    XFX RX6800XT 16GB
    Vegeta, what does the scouter say about the price? It's over 9000, ghaa!
     
  3. LEEc337

    LEEc337 Member Guru

    Messages:
    160
    Likes Received:
    49
    GPU:
    Team Green 960
    A doubling of the core count, like early Ryzen did to Intel's, makes me think Nvidia have always been able to release bigger, better GPUs but didn't need to, because nobody else was pushing them to. Just my humble opinion.
     
    386SX likes this.
  4. Hulk12

    Hulk12 Master Guru

    Messages:
    301
    Likes Received:
    78
    GPU:
    Zotac RTX 4090 :D
    This one, with the large VRAM size, probably won't be the gaming card - but yes, GA100 w/ 22GB VRAM. GA104 w/ 16GB will release first, somewhere between late spring and early summer, and GA100 w/ 22GB later. But then again, maybe GA100 will be the gaming card after all. :)
     

  5. Stairmand

    Stairmand Master Guru

    Messages:
    376
    Likes Received:
    188
    GPU:
    RTX3090 Aorus Maste
    The die size of the current TU102 is already pretty massive as it's on 12nm. It would be hard to make it any bigger without very low yields. The move to 7nm for the 3xxx series is needed.
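    To put a rough number on that yield point, here is a minimal sketch using the textbook Poisson yield model, Y ≈ exp(-A × D0). The defect density D0 below is an assumed placeholder, not a published TSMC or Nvidia figure, and the die areas are the ones quoted further down in this thread plus a smaller die for comparison.

        # Minimal Poisson yield sketch: the share of defect-free dies falls
        # exponentially with die area. D0 is an assumed placeholder value.
        import math

        D0 = 0.2  # assumed defect density in defects per cm^2 (illustrative only)

        for name, area_mm2 in [("mid-size die", 400),
                               ("TU102-class die", 754),
                               ("GA100-class die", 826)]:
            area_cm2 = area_mm2 / 100.0
            yield_frac = math.exp(-area_cm2 * D0)
            print(f"{name}: {area_mm2} mm^2 -> ~{yield_frac:.0%} defect-free dies")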
     
  6. barbacot

    barbacot Maha Guru

    Messages:
    1,002
    Likes Received:
    982
    GPU:
    MSI 4090 SuprimX
    By that logic we would still be using the 8800 GTS today... if I remember correctly, that was the last time anybody really "pushed" them...
    No, it's not like that. To their credit, they always pushed - and don't compare CPUs with GPUs, they are different beasts and develop differently.
    Their pricing policy is questionable at times, but even without a serious competitor for a long time, they kept developing and innovating.
     
    Maddness and 386SX like this.
  7. Petr V

    Petr V Master Guru

    Messages:
    358
    Likes Received:
    116
    GPU:
    Gtx over 9000
    I'd like to have this thing in my desktop, but I'd better add some more money and get a used C63 AMG instead.
     
  8. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,464
    Likes Received:
    2,574
    GPU:
    ROG RTX 6090 Ultra
    Looking forward to replacing my tired, overworked GTX 1080 with something that is double the performance without being double the price (preferably even cheaper than what I paid for the 1080 - one can dream...).
    I don't care which company it comes from...
     
  9. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,750
    Likes Received:
    1,868
    GPU:
    EVGA 1070Ti Black
    I need to stop looking at this stuff; it might give me the itch to replace my 1070 Ti, and I'm still not over the $500+ I paid for it.

    Something in the $250-300 range that's capable of pushing 4K @ 60 fps might make me more willing.
     
  10. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    If they're trying to pull another $$$ stunt like with the 2080 Ti, they can keep this ngreedia crap :p
     
    SplashDown likes this.

  11. Ricepudding

    Ricepudding Master Guru

    Messages:
    872
    Likes Received:
    279
    GPU:
    RTX 4090

    Ditto with my 1080 Ti. I just worry about the cost this is heading toward, but honestly, even if it came out at $1,200 like the 2080 Ti, considering the performance jump I would say it was worth it (of course I would love it cheaper). I reckon the chip must be huge for 8k cores, though. Something like a 3070/3080 might be more up your alley; looking at it, they might double your performance for a more reasonable price (depending on what you consider reasonable).
     
  12. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    That's just ignorant, to be honest.

    Do you see the size of the RTX 2080 Ti? Not the card, the actual GPU die? The thing is the largest consumer GPU die ever. It's massively costly to make, and making a larger GPU with more shader cores "just because they could" isn't a feasible or smart business plan. Making a GPU as large as the 2080 Ti was already pushing it.

    The only reason it's possible now is that they are going from 12(14?)nm to 7nm, something that wasn't possible when the 2080 Ti was released.
     
    Solfaur likes this.
  13. XenthorX

    XenthorX Ancient Guru

    Messages:
    5,059
    Likes Received:
    3,438
    GPU:
    MSI 4090 Suprim X
    *Looks at last remaining kidney* "Your days are numbered."
     
    chispy likes this.
  14. Ricepudding

    Ricepudding Master Guru

    Messages:
    872
    Likes Received:
    279
    GPU:
    RTX 4090
    Considering it's meant to double the RT cores and nearly double the shader cores from last gen, I still see this being a massive die; I wonder if it could be even bigger than the 2080 Ti's, even with the shrink. It would be nice if it's smaller, since that should mean it will be cheaper. But I totally agree: if Nvidia really wanted to, they could make an 8,000-core die using Turing, but it would be too stupidly expensive to even consider making.
     
  15. LEEc337

    LEEc337 Member Guru

    Messages:
    160
    Likes Received:
    49
    GPU:
    Team Green 960
    Wow, sorry, I wasn't trying to offend. I meant it in a resources-versus-competition way (why put out more than you need to in order to stay ahead), and now that more people want 4K powerhouse cards, the core counts have doubled. I suspect all three companies will bring mid-range cards aimed at 4K, with high-end cards going for high-fps 4K and better 8K support - again, just guesses. It will be a credit to all of them; I love watching the innovation. I've watched GPUs develop since DirectX 6 and earlier, and I was in the SNES vs. Mega Drive wars of old, so I have nothing but respect for the innovative things in the PC world, just like yourself, I'm sure. This is just a forum and these are just my opinions; you may be 100% right, but calm down before you pop, mate ✌
     

  16. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    People are allowed to have opinions, of course, but don't expect people not to comment on your opinion if you're going to post it on the internet, especially if you post ignorant posts. I'm not trying to be mean here; it's the only word I can think of to describe your comment, as it has no basis in reality, and even the tiniest bit of research would have stopped it from ever being written.
     
  17. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,035
    Likes Received:
    7,378
    GPU:
    GTX 1080ti
    It isn't meant to double the RT cores; Nvidia went with scaling up the SM groups and increasing the shader count in the GPU instead.

    More SM groups means an automatic increase in RT cores, as there is one per SM group.

    TU102 is only the larger die if you exclude the HBM (and you don't):

    GA100 = 826 mm²
    GV100 = 815 mm²
    TU102 = 754 mm²

    No, they really couldn't.
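    As a quick back-of-the-envelope check of that one-RT-core-per-SM scaling (the 128-SM / 8192-core configuration is the rumour from the article, not a confirmed spec):

        # Turing pairs one RT core with each SM, and each SM carries 64 FP32
        # CUDA cores, so shader and RT core counts scale together with SM count.
        CUDA_CORES_PER_SM = 64

        def counts(sm_count):
            """Return (cuda_cores, rt_cores) for a given SM count."""
            return sm_count * CUDA_CORES_PER_SM, sm_count

        print(counts(68))   # RTX 2080 Ti (cut-down TU102): (4352, 68)
        print(counts(72))   # full TU102:                   (4608, 72)
        print(counts(128))  # rumoured 128-SM Ampere part:  (8192, 128)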
     
  18. Ricepudding

    Ricepudding Master Guru

    Messages:
    872
    Likes Received:
    279
    GPU:
    RTX 4090
    I mean, the OP says they double the tensor cores (a rumour, mind you), though by that standard the 2080 Ti has 68 SMs and 68 RT cores, so this would be less than double, as it's 128 SMs, not 136.

    As for why they couldn't: RTX got to 4,608 cores (I assume that because they make a lot of these, there can still be some imperfections on the die; if I'm wrong about that then my bad). Maybe 8k might be pushing it a bit, but 6k in theory could be possible if they got a perfect die with no imperfections on it?
     
  19. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,035
    Likes Received:
    7,378
    GPU:
    GTX 1080ti
    Far, far too big a chip to deploy in nvmesh configs.
     
  20. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,516
    Likes Received:
    2,361
    GPU:
    Nvidia 4070 FE
    Will they be PCIe 4.0 cards or remain PCIe 3.0, I wonder.
     
