NVIDIA to Announce Ampere at GTC in 2020?

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 13, 2020.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,388
    Likes Received:
    18,559
    GPU:
    AMD | NVIDIA
    Aside from fast 360 Hz monitors, things have been quiet for NVIDIA at CES on the desktop front. However, new industry reports indicate that NVIDIA has plotted an announcement date for its new architectur...

    NVIDIA to Announce Ampere at GTC in 2020
     
  2. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,693
    Likes Received:
    9,572
    GPU:
    4090@H2O
    Further discussion also HERE
     
  3. emperorsfist

    emperorsfist Ancient Guru

    Messages:
    1,972
    Likes Received:
    1,074
    GPU:
    AORUS RTX 3070 8Gb
    Well, it will be interesting to see how the "50% more performance at 50% less power consumption" pitch holds up in reviews once the cards hit the market. It sounds too good to be anything but a sales pitch. Prices will be another matter as well...
     
    schmidtbag likes this.
  4. barbacot

    barbacot Master Guru

    Messages:
    996
    Likes Received:
    981
    GPU:
    MSI 4090 SuprimX
    Let's hope this rumor is true:
    Because then maybe this rumor will be true:

     

  5. Backstabak

    Backstabak Master Guru

    Messages:
    860
    Likes Received:
    347
    GPU:
    Gigabyte Rx 5700xt
    If Ampere performs as the rumor claims, then I'm absolutely sure it will not be cheaper at all, and very likely far more expensive than the current lineup. NVIDIA produces huge chips compared to AMD, and while they are really good at optimizing the design to solve the problems that come with that, producing one large chip is still expensive, and on a smaller node even more so.
     
  6. cryohellinc

    cryohellinc Ancient Guru

    Messages:
    3,535
    Likes Received:
    2,974
    GPU:
    RX 6750XT/ MAC M1
    I doubt they will deliver 50% less power consumption and 50% more performance at once. More likely they mean 50% more performance at 50% less power at a specific clock; that doesn't mean the card will be equally efficient at 100% performance.

    If NVIDIA achieves such efficiency, we will have the same story as with the GTX 680, where NVIDIA achieved higher efficiency but, instead of offering a significantly better product to the consumer, offered a mid-range product for the price of a high-end product. Thus started the era of insane price increases in the GPU market.

    They did it once, they will do it again, UNLESS AMD comes up with a highly competitive product. Otherwise, next gen will be yet another 10% performance increase, with 50% more profitability for NVIDIA, as once again they will be able to sell lower-class silicon as the high-end version.
     
  7. Andy Watson

    Andy Watson Master Guru

    Messages:
    304
    Likes Received:
    177
    GPU:
    960
    Not sure about the performance or cost but the time frame seems reasonable.

    All those 1080 owners who have been hanging on might be happy .....
     
  8. HybOj

    HybOj Master Guru

    Messages:
    394
    Likes Received:
    325
    GPU:
    Gygabite RTX3080
    The market has never really recovered from the mining craze. Even the RX 5700 XT, which is a nice card by today's standards and offers the best price/perf ratio in its performance segment, is too expensive. Let alone ANYTHING above its level from Nvidia.

    Let's hope it gets better with the next generation of cards in 2020.

    I will hold on to my dear GTX 970 until this settles down somehow. It's still really good for 60fps gameplay.
     
    Maddness likes this.
  9. geogan

    geogan Maha Guru

    Messages:
    1,267
    Likes Received:
    468
    GPU:
    4080 Gaming OC
    I thought 7nm was so cutting-edge that wafer yields were not great. Other companies like AMD only have to produce relatively small chiplets, which gets around the yield issue, as it doesn't matter if a few of them fail per wafer.

    Nvidia has to produce a few great big GPU dies on each wafer, so the chance of many of them having small defects is much greater, yield will be lower, and the cost per GPU of those that do work perfectly will be much higher, I'd say.

    There are going to be an awful lot of partially broken GPU dies available to them, which they will probably be able to flog as lower-end GPUs like the 3050, 3060, 3070, etc., but I'd say fully working GPUs will be rare enough per wafer that the high-end 3080 Ti, Titan, etc. will be expensive.
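    The big-die vs. chiplet argument can be made concrete with the classic Poisson yield model, where the fraction of defect-free dice falls off exponentially with die area. The defect density and die areas below are made-up illustrative numbers, not real fab data:

    ```python
    import math

    def poisson_yield(die_area_mm2, defects_per_mm2):
        """Poisson yield model: fraction of dice expected to have zero defects."""
        return math.exp(-die_area_mm2 * defects_per_mm2)

    D0 = 0.002  # assumed defect density (defects per mm^2) -- illustrative only

    chiplet = poisson_yield(75, D0)    # AMD-style small chiplet (~75 mm^2)
    big_gpu = poisson_yield(600, D0)   # big monolithic GPU die (~600 mm^2)

    print(f"small chiplet yield: {chiplet:.1%}")  # 86.1%
    print(f"big GPU die yield:   {big_gpu:.1%}")  # 30.1%
    ```

    Same wafer, same defect density: the large die loses most of its candidates to defects, which is exactly why salvaged partially-broken dies end up as cut-down lower-end SKUs.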
     
  10. kings

    kings Member Guru

    Messages:
    158
    Likes Received:
    131
    GPU:
    GTX 980Ti / RX 580
    I very much doubt that we see anything related to gaming. GTC is a Deep Learning and AI conference.

    If they do show something, it is most likely to be for the professional market.
     
    Last edited: Jan 13, 2020
    cryohellinc likes this.

  11. Spider4423

    Spider4423 Active Member

    Messages:
    78
    Likes Received:
    34
    GPU:
    ASUS TUF 4080
    Imo there's nothing to hang on for... I mean, my 1070 works wonders at 2560x1440, no issues so far. RTX I can go without unless it matures a bit.
    We'll see when Cyberpunk releases whether there's a need for an upgrade :p
     
  12. Undying

    Undying Ancient Guru

    Messages:
    25,332
    Likes Received:
    12,743
    GPU:
    XFX RX6800XT 16GB
    I'm glad Nvidia continues to dominate; it makes AMD and Intel try their best to follow. I'm only scared Nvidia will bump the prices even more.
    If the 3080 beats the 2080 Ti with some additional RT cores/improvements, you can only imagine how much it will cost. Same goes for the 3070. I'm sure many of us won't even be able to afford a 3060. :D
     
    Last edited: Jan 13, 2020
    warlord likes this.
  13. Silva

    Silva Ancient Guru

    Messages:
    2,048
    Likes Received:
    1,196
    GPU:
    Asus Dual RX580 O4G
    @Undying Your comment is sad and reveals why the market has come to this state. You're glad Nvidia dominates, but you're scared they'll bump prices even more. What do you think will happen when a company has a monopoly? That they'll offer you a plane ticket and two weeks in Hawaii?
     
    Mesab67, cryohellinc and warlord like this.
  14. warlord

    warlord Guest

    Messages:
    2,760
    Likes Received:
    927
    GPU:
    Null
    I will never buy blue or green products. All red forever, plus consoles. Until Intel and Nvidia ask reasonable money, I have no reason to brag about PC hardware, show my friends anything about it, or beg for upgrade discounts and parts swapping. Real life dominates the digital one. AMD has shown the way for years now. They won the CPU battle; if only they could win the GPU front too.

    We hope Nvidia "learns" from Intel's mistakes: if you overspend on R&D, you should never recoup the costs through consumers, only through partners etc.
     
  15. Petr V

    Petr V Master Guru

    Messages:
    358
    Likes Received:
    116
    GPU:
    Gtx over 9000
    2K for a gaming GPU is nuts.
     
    jura11 and Mesab67 like this.

  16. cryohellinc

    cryohellinc Ancient Guru

    Messages:
    3,535
    Likes Received:
    2,974
    GPU:
    RX 6750XT/ MAC M1
    This so much.
     
    Silva likes this.
  17. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,973
    Likes Received:
    4,341
    GPU:
    HIS R9 290
    As said in another Ampere thread: it's likely a 50% total improvement.
    So if it's 50% faster without using more watts than the 2080 Ti, that's technically 50% more efficient, because it's basically more performance for free. But the phrasing is ambiguous: "50% more efficient" could also mean it uses half the power of the 2080 Ti while being 50% faster, which is absurd and not going to happen.
    Even a 50% total improvement sounds unlikely. Nvidia isn't going to pull an Intel and wait for AMD to catch up, but they don't need to try this hard either.
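    The ambiguity in the claim comes down to perf-per-watt arithmetic. A minimal sketch, with the 2080 Ti normalized to 1.0 in both performance and power (arbitrary units, not real measurements):

    ```python
    # Normalize the 2080 Ti baseline: perf = 1.0 at power = 1.0 (arbitrary units).
    base_perf, base_power = 1.0, 1.0

    # Reading 1: 50% faster at the same power draw.
    eff_same_power = (1.5 * base_perf) / base_power          # 1.5x perf/W

    # Reading 2: 50% faster AND at half the power draw.
    eff_half_power = (1.5 * base_perf) / (0.5 * base_power)  # 3.0x perf/W

    print(eff_same_power)  # 1.5
    print(eff_half_power)  # 3.0
    ```

    Reading 1 is a plausible generational gain; reading 2 implies tripling perf/W in one generation, which is why it reads as marketing rather than engineering.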
     
    Silva likes this.
  18. nevcairiel

    nevcairiel Master Guru

    Messages:
    875
    Likes Received:
    369
    GPU:
    4090
    That's not even a sales pitch, that's just an unfounded rumor.

    Or maybe 50% more isn't even trying that hard, if a better architecture is combined with a node shrink. And it's not like we could ever have enough graphics performance: between the strong push for 4K and RTX, effective performance keeps going down, not up.
     
  19. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    They kind of do, because they are falling behind in deep learning performance and they typically use the same architecture across their entire product stack. I predict at some point they'll split the server parts off completely and go MCM with those, but I don't know if that's happening with Ampere.

    Keep in mind that this is a double node shrink for Nvidia, and they are probably going straight to 7nm+ EUV. IIRC the Radeon VII was a ~30% improvement over Vega 64 with just a die shrink, and 7nm+ EUV adds another 10-15% over plain 7nm. So even if you ignore architecture improvements, Nvidia stands to gain something like 40% from the node alone.
     
    Maddness likes this.
  20. barbacot

    barbacot Master Guru

    Messages:
    996
    Likes Received:
    981
    GPU:
    MSI 4090 SuprimX
    Whenever Nvidia changed nodes, they always improved quite a lot - see the 1080 Ti, which wiped the floor with everything Nvidia had made before - so moving to 7nm now, it may well happen again, based on their history.

    Beware the leather jacket man!
     

Share This Page