
NVIDIA Next-Gen Ampere GPUs to arrive in 2020 - based on Samsung 7nm EUV process

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 5, 2019.

  1. MegaFalloutFan

    MegaFalloutFan Master Guru

    Messages:
    632
    Likes Received:
    79
    GPU:
    RTX 2080Ti 11Gb
    I NEED HDMI 2.1 ASAP
     
    fantaskarsef likes this.
  2. Mpampis

    Mpampis Active Member

    Messages:
    65
    Likes Received:
    37
    GPU:
    RX 5700 XT 8GB
    Well, the next release from nVidia is going to be a tricky one.
    Right now, an RTX 2080 Ti can only do 60+ fps with RT on @1080p (and decent 1440p), and that's paired with an i9-9900K.
    Since nVidia will certainly price Ampere way above Turing, they must deliver 60fps@4K. I don't think people would spend ~$1500 just to get 75fps@1440p instead of 60fps.
    Plus, Samsung's 7nm EUV process has not been tested yet (besides their own test samples), so that's a big question too.
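    To put that value argument in numbers, a quick Python sketch (the ~$1200 Turing price and both frame rates here are assumptions for illustration, not benchmarks):

    Code:
    # Hypothetical $-per-frame comparison; prices and fps are assumed figures.
    turing = {"name": "RTX 2080 Ti", "price_usd": 1200, "fps_1440p_rt": 60}
    ampere = {"name": "Ampere flagship (assumed)", "price_usd": 1500, "fps_1440p_rt": 75}

    for card in (turing, ampere):
        dollars_per_fps = card["price_usd"] / card["fps_1440p_rt"]
        print(f"{card['name']}: ${dollars_per_fps:.0f} per fps at 1440p with RT")
    # Both come out to $20 per fps -> no value gain, just a higher ceiling.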
     
  3. BuildeR2

    BuildeR2 Ancient Guru

    Messages:
    2,835
    Likes Received:
    129
    GPU:
    EVGA 1080 Ti FTW3
    My thoughts exactly.
     
  4. illrigger

    illrigger Member Guru

    Messages:
    139
    Likes Received:
    37
    GPU:
    GTX 1080 Ti
    Yeah, that's not going to happen until someone competes with them at the high end, and it doesn't look like that's happening anytime soon. The mid-range and lower is going to see a huge shake-up in the next year, but that 2080Ti is going to remain untouchable. There probably isn't enough market in the above $700 range for AMD to even bother trying (unless it also has some sort of datacenter application they can make real money from).

    As to Intel Xe, I'll believe it when it's in a reviewer's hands telling me it is faster than a 1080p@60-capable card when they ship in a year or two. My bet is that's about all they are going to manage, and even that depends on them getting a process shrink working that can actually turn out enough wafers. They already dedicate a pretty big chunk of silicon on their current CPUs to graphics, and their top-end Iris is an order of magnitude slower than even the lowest-end dedicated GPUs from AMD and nVidia. It's possible that they somehow have a miracle waiting in the wings, but it just seems... unlikely.
     

  5. shamus21

    shamus21 Active Member

    Messages:
    97
    Likes Received:
    10
    GPU:
    0
    20 to 30%, maybe more, is possible, but steering the chip design towards ray tracing improvements will cut that percentage down a lot. The real problem with this year's new RTX gen is pricing.
     
  6. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,724
    Likes Received:
    177
    GPU:
    EVGA GTX 1080Ti SC
    Why would that be the case?
     
  7. Jawnys

    Jawnys Member Guru

    Messages:
    129
    Likes Received:
    21
    GPU:
    zotac amp extreme 1080ti
    Honestly, if they had made a GTX 2080 Ti at the price of the RTX 2080, I probably would have upgraded from my 1080 Ti, but the pricing of the 2080 Ti makes it a no-go.
     
    HARDRESET likes this.
  8. HARDRESET

    HARDRESET Master Guru

    Messages:
    399
    Likes Received:
    151
    GPU:
    1080Ti G1 GAMING OC
    My Gigabyte 1080 Ti Gaming OC cost me $629.00 new. Double the price for a 30 to 40% gain with the 2080 Ti? No thanks.
     
  9. angelgraves13

    angelgraves13 Maha Guru

    Messages:
    1,347
    Likes Received:
    298
    GPU:
    RTX 2080 Ti FE
    I wonder if in the future they'll just have a dedicated ASIC that does ray tracing, for very fast RT performance.
     
    HARDRESET likes this.
  10. Petr V

    Petr V Master Guru

    Messages:
    263
    Likes Received:
    62
    GPU:
    Gtx over 9000
    If nVidia keeps this nonsense pricing, they will reliably kill their new-gen products.
    750 EUR for a Lightning is a good offer.
     

  11. Kaotik

    Kaotik Member Guru

    Messages:
    136
    Likes Received:
    3
    GPU:
    Radeon RX 5700 XT
    Just to nitpick, but wasn't IBM NVIDIA's main manufacturer for the infamous FX series back in 2003?
     
  12. Astyanax

    Astyanax Ancient Guru

    Messages:
    3,322
    Likes Received:
    879
    GPU:
    GTX 1080ti
    I wonder if this ridiculous, impossible idea will ever stop being postulated.
     
  13. Stefem

    Stefem Member

    Messages:
    40
    Likes Received:
    4
    GPU:
    16
    The laws of physics may suck, but they are there and can't be overcome; better to make peace with them...
     
  14. Stefem

    Stefem Member

    Messages:
    40
    Likes Received:
    4
    GPU:
    16
    Yep, and they got burned...
     
  15. DrKeo

    DrKeo Member

    Messages:
    35
    Likes Received:
    12
    GPU:
    Gigabyte G1 970GTX 4GB
    This is very exciting. Turing was a huge architectural change for NVIDIA; it made the die huge and expensive, so performance per $ wasn't great, but it's a very advanced piece of hardware that begs for a die shrink. The last time we got a real die shrink (TSMC's 12nm and 16nm are basically the same) was the 1000 series, and performance roughly doubled. For instance, the GTX 970, which did 3.49 TFLOP/s on a 398mm^2 die at 28nm, became the GTX 1070 on 16nm, doing 6.46 TFLOP/s on a smaller 314mm^2 die. That's 85% more performance on a 21% smaller die. So I have very high hopes for the 3000 series: the RTX 3070 could have 2080 Ti levels of performance, and I'm expecting lower prices considering the 2000 series hasn't sold well. So an RTX 3070 with 2080 Ti performance for $400-$500? I'm in.
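    A quick sanity check on that math, using only the figures quoted above:

    Code:
    # Verify the GTX 970 -> GTX 1070 shrink numbers (28nm -> 16nm).
    gtx970  = {"tflops": 3.49, "die_mm2": 398}
    gtx1070 = {"tflops": 6.46, "die_mm2": 314}

    perf_gain = gtx1070["tflops"] / gtx970["tflops"] - 1    # ~0.85 -> 85% faster
    area_cut  = 1 - gtx1070["die_mm2"] / gtx970["die_mm2"]  # ~0.21 -> 21% smaller
    print(f"{perf_gain:.0%} more compute on a {area_cut:.0%} smaller die")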
     
    Stefem likes this.

  16. Astyanax

    Astyanax Ancient Guru

    Messages:
    3,322
    Likes Received:
    879
    GPU:
    GTX 1080ti
    People claiming that RT made the die huge are wrong and need to stop speaking.
     
  17. Undying

    Undying Ancient Guru

    Messages:
    11,780
    Likes Received:
    1,499
    GPU:
    Aorus RX580 XTR 8GB
    Isn't the Turing die much larger than Pascal's? I wonder why, if it's not the RT.
     
  18. Astyanax

    Astyanax Ancient Guru

    Messages:
    3,322
    Likes Received:
    879
    GPU:
    GTX 1080ti
    Google it.
     
  19. Denial

    Denial Ancient Guru

    Messages:
    12,343
    Likes Received:
    1,529
    GPU:
    EVGA 1080Ti
    I did, and they said it's the RT.

    FWIW I don't think it's the RT, I think it's the Tensor cores, which aren't required by RT... but telling people to shut up and Google the solution when 99.9% of the internet thinks the answer is RT is pretty dumb. There have been multiple threads and posts; even AnandTech kind of concludes that a major part of the die difference between the 1660 and 2060 is RT/Tensor. So I'm not sure what you expect them to find other than RT.

    The shrink will help with yields, but the cost per transistor on 7nm EUV is higher than on 16/14/12nm - which will drive up the cost of chips regardless. I don't think 7nm is going to be as big of a jump as people think it is, not without the price significantly inflating.
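    To make that yield-versus-cost trade-off concrete, here's a minimal sketch of the usual dies-per-wafer times Poisson-yield model. Every number in it (wafer prices, defect densities, die sizes) is an illustrative assumption, not foundry data:

    Code:
    import math

    def die_cost(wafer_cost_usd, die_mm2, defects_per_cm2):
        """Rough per-die cost on a 300mm wafer; ignores edge loss and packaging."""
        wafer_area_mm2 = math.pi * (300 / 2) ** 2
        dies_per_wafer = wafer_area_mm2 / die_mm2
        yield_rate = math.exp(-defects_per_cm2 * die_mm2 / 100)  # Poisson yield
        return wafer_cost_usd / (dies_per_wafer * yield_rate)

    # Same transistor budget: a big 12nm die shrunk to roughly half the area on
    # 7nm, with an assumed pricier wafer and worse early defect density.
    print(f"12nm, mature process: ${die_cost(6000, 754, 0.1):.0f} per die")
    print(f"7nm EUV, new process: ${die_cost(9500, 380, 0.3):.0f} per die")
    # With these assumptions the smaller die still costs more per chip.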
     
    Last edited: Jun 6, 2019
    Maddness, yasamoka and Fox2232 like this.
  20. Astyanax

    Astyanax Ancient Guru

    Messages:
    3,322
    Likes Received:
    879
    GPU:
    GTX 1080ti
    RT cores are tiny fixed-function units.

    The size comes from the increased cache, the Tensor block (which doubles as the dedicated FP16 cores), and a hell of a lot more SMs per GPC.
     
    Last edited: Jun 6, 2019
