Nvidia Ampere GA100 GPU would get 8192 cores and boost speeds of up to 2200 MHz

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 7, 2020.

  1. Hulk12

    Hulk12 Master Guru

    Messages:
    301
    Likes Received:
    78
    GPU:
    Zotac RTX 4090 :D
    Maybe no PCIe 4.0 support for the top- or high-end Ampere GPUs, because Intel hasn't announced PCIe 4.0 for desktop this year. ;)
     
  2. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,793
    Likes Received:
    1,396
    GPU:
    黃仁勳 stole my 4090
    Even if it's "only" $1000 MSRP, it's absolutely hilarious that people here would think that's a reasonable or fair price, by any stretch of the imagination. In b4 someone comes in talking about how HBM2 costs more than an ocean of virgin blood, how their R&D costs more than 10 billion pure souls, and then links to nVidia's ballsack's reddit page stating they pay everyone on their staff, and the homeless people outside, $10K per minute, and therefore need to charge insulting amounts. :rolleyes:

    I'll drop to 30 fps gaming or buy a console before I pay nVidia's mafia monopoly prices. Or better yet, bust out my backlog of old games. Too bad, I was looking forward to Cyberpunk on PC (no, I won't play it on a console). Guess I'll play that in 2026 or so. I can wait.
     
    Last edited: Mar 9, 2020
    carnivore and -Tj- like this.
  3. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,045
    Likes Received:
    7,382
    GPU:
    GTX 1080ti
    Because you're market ignorant.
     
  4. EngEd

    EngEd Member Guru

    Messages:
    138
    Likes Received:
    40
    GPU:
    Gigabyte RTX 3080
    Expecting 50-70% more performance over a 1080 Ti and 40% more performance over a 2080 Ti? Expecting to reach 2.4 GHz with a stable overclock on air/water? Expecting much better RTX performance as well. Will it undervolt stably to bring the temps down? Find out more when Nvidia releases their new 3000 series graphics cards, which will kick your wallet to the outer reaches of the solar system.
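    A minimal sanity check on those two figures (Python), assuming a ~30% average 4K lead of the 2080 Ti over the 1080 Ti; that baseline is an assumption, not something stated in the thread:

    ```python
    # Hypothetical check: do "+50-70% over a 1080 Ti" and "+40% over a 2080 Ti" agree?
    lead_2080ti_over_1080ti = 1.30   # assumption: 2080 Ti ~30% faster at 4K on average
    expected_over_2080ti = 1.40      # "+40% over a 2080 Ti" from the post above

    implied_over_1080ti = expected_over_2080ti * lead_2080ti_over_1080ti
    print(f"Implied gain over a 1080 Ti: +{(implied_over_1080ti - 1) * 100:.0f}%")
    # ~ +82%, which sits above the 50-70% range quoted in the same post,
    # so at least one of the two expectations has to give.
    ```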
     

  5. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,552
    Likes Received:
    609
    GPU:
    6800 XT
    Why not, though? That's the only part where it would even make sense. Besides, you can put PCIe 4.0 cards in 3.0 slots anyway.
     
  6. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,766
    Likes Received:
    9,667
    GPU:
    4090@H2O
    I still have my doubts that this is the gaming card we will see; it looks more like a workstation card. And thus, I don't really expect those 8192 cores to hit the 3080 Ti either.
     
    Maddness likes this.
  7. Crazy Joe

    Crazy Joe Master Guru

    Messages:
    297
    Likes Received:
    129
    GPU:
    RTX 3090/24GB
    No, that would probably be the 3xxx-series Titan card. I guess for the Ti and lower cards they'll disable some SMs and go with GDDR6 instead of HBM2e, just to save costs on the memory.
     
  8. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,766
    Likes Received:
    9,667
    GPU:
    4090@H2O
    Indeed, like they usually do. Hence my post; I concur.
     
  9. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,107
    Likes Received:
    2,611
    GPU:
    3080TI iChill Black
    No, he is right. The only ignorance is your NV fanboyism and the need to defend it. :p
     
    Neo Cyrus likes this.
  10. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,045
    Likes Received:
    7,382
    GPU:
    GTX 1080ti
    You're both market ignorant.
     

  11. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,107
    Likes Received:
    2,611
    GPU:
    3080TI iChill Black
    Is that "market ignorant" a new word you like to toss around like your placebos?

    lol, fail. xD
     
  12. alanm

    alanm Ancient Guru

    Messages:
    12,277
    Likes Received:
    4,484
    GPU:
    RTX 4080
    Yeah, $1000 GPUs are a horrible thing if that holds into the next gen, but I doubt the massive dies used in them allow much room to come down in price to Pascal levels.
     
    Aura89 likes this.
  13. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,045
    Likes Received:
    7,382
    GPU:
    GTX 1080ti
    The failure is you.

    https://www.guru3d.com/news-story/gddr6-significantly-more-expensive-than-gddr5.html
    https://www.reddit.com/r/nvidia/comments/99r2x3/attempting_to_work_out_rtx_2080ti_die_cost_long/

    And that doesn't even include the costings for PCB complexity.


    nuff said :cool:

    Turing is priced exactly where it should be for the people who were passing over Pascal as an upgrade.

    The people who won't buy Turing weren't going to buy it regardless of the price.
     
  14. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,107
    Likes Received:
    2,611
    GPU:
    3080TI iChill Black
    So RAM was more expensive, and that made the GPU 50% more expensive... right... nice trolling.

    AMD's HBM costs even more, and in the end their prices got slashed back down to where they were before the mining craze and all...


    That said, the only price hike on NV's side was, yes, them taking advantage of the mining market. Greed has no limits... but it took a toll on them for sure.

    I saw some news a while ago that high-end GPUs won't be as expensive as they are now; the lower end like the xx60 and xx70 will stay roughly the same, while the top tier will get price cuts.
     
  15. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,045
    Likes Received:
    7,382
    GPU:
    GTX 1080ti
    You were given two links, and a third component to consider, and you only picked on the one whose cost directly scales with the amount used.

    https://www.reddit.com/r/nvidia/comments/9o6256/2080ti_allocation_problems/

    A good part of the freaking cost is the component shortage; another part is the price per wafer to buy TSMC fab time.

    If Ampere is cheaper I'll be surprised, since EUV takes longer to fab and backdrilling costs time and money.
     
    Last edited: Mar 10, 2020

  16. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,045
    Likes Received:
    7,382
    GPU:
    GTX 1080ti

    102 = 3080 Ti [5120/80], RTX Titan A, Quadro RTX A8000 [5376/84]
    103 = 3080 Full / 3070 [3584/56]
    104 = 3060 Ti Full / 3060 [2816/44, possibly even 2560/40]
    106 = 3050 Ti Full / 3050, 160-bit interface?
    107 = 3030 (no NVENC?)
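    For reference, a minimal sketch (Python) of the arithmetic implied by those bracketed core/SM pairs, assuming 64 FP32 cores per SM (which is what 5120/80, 3584/56, etc. all work out to) and the rumored 2200 MHz boost from the headline; the SKU names and SM counts are taken from the speculation above, not confirmed specs:

    ```python
    # Speculated Ampere SKUs from the list above: name -> enabled SM count.
    # 64 FP32 CUDA cores per SM is an assumption (it matches every pair listed).
    CORES_PER_SM = 64
    BOOST_CLOCK_GHZ = 2.2            # rumored 2200 MHz boost from the headline

    speculated_skus = {
        "GA102 cut (3080 Ti?)": 80,             # 5120 cores
        "GA102 full (Titan / RTX A8000?)": 84,  # 5376 cores
        "GA103 (3080 / 3070?)": 56,             # 3584 cores
        "GA104 (3060 Ti / 3060?)": 44,          # 2816 cores
        "GA100 full (headline figure)": 128,    # 8192 cores
    }

    for name, sms in speculated_skus.items():
        cores = sms * CORES_PER_SM
        # FP32 FMA counts as two floating-point operations per core per clock.
        tflops = cores * 2 * BOOST_CLOCK_GHZ / 1000
        print(f"{name}: {cores} CUDA cores, ~{tflops:.1f} TFLOPS FP32 @ {BOOST_CLOCK_GHZ} GHz")
    ```

    Purely theoretical shader throughput, of course; it says nothing about real-world scaling.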
     
    Last edited: Mar 12, 2020
    fantaskarsef likes this.
  17. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,251
    Likes Received:
    232
    GPU:
    EVGA GTX 1080@2,025
    Ok... so I'm a bit confused here. Why does everyone believe that nVidia is making these Ampere chips on 10nm? The leather jacket himself clearly stated that these chips will be made primarily on TSMC 7nm, with some others using Samsung's 7nm. Did nVidia make a statement in the past couple of months saying otherwise? I personally think they made the switch from Samsung to TSMC for the bulk of these upcoming chips because they plan to use TSMC's CoWoS technology.

    The availability of and consumer demand for an item dictate the price of said item. Anyone who doesn't understand that concept is "market ignorant" and/or what Bernie Sanders calls a "democratic socialist"....
     
  18. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Yeah - because the demand for a video card is similar to the demand for healthcare...

    You can make points without bringing politics into it.
     
    HandR likes this.
