TSMC Ramping up 2nm Wafer Fabrication Development

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 8, 2020.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    38,966
    Likes Received:
    7,643
    GPU:
    AMD | NVIDIA
    Your girlfriend would disagree, but yeah, smaller is better, especially in the land of technology. While production-wise we're at 7nm you've already heard about 5nm. But yes, chip wafer production c...

    TSMC Ramping up 2nm Wafer Fabrication Development
     
    Maddness and BetA like this.
  2. Brasky

    Brasky Ancient Guru

    Messages:
    2,240
    Likes Received:
    191
    GPU:
    ASUS 1080 Strix
    Is there a size limit where it can't physically get any smaller?
     
  3. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    38,966
    Likes Received:
    7,643
    GPU:
    AMD | NVIDIA
    They're closing in on that limit and it remains fascinating to see and follow. There are a lot of studies on that and there is no clear answer, but I don't think we'll pass anything smaller than 2nm anytime soon. Crazy when you think about the sizes, I mean a strand of your DNA is 2.5nm.

    A 1nm prototype has been demonstrated, but transistors are getting ever closer to atomic dimensions, and carving features that small into a silicon wafer is nearly impossible (I think I read somewhere that around 70 atoms wide is currently the smallest design). Once that threshold is reached, the next step is not fabbing smaller wafers and further optimizations (e.g. FinFET++++); the future is computing technology like quantum computing.

    Then again, I've also heard that some researchers managed to make a transistor 167 picometres across, which is 0.167nm .. technology always evolves.
     
    m4dn355, Ricardo, carnivore and 4 others like this.
  4. wavetrex

    wavetrex Maha Guru

    Messages:
    1,129
    Likes Received:
    715
    GPU:
    Zotac GTX1080 AMP!
    There will be a point (probably under 1nm) where there is no benefit to shrinking any further, as quantum tunnelling effects counteract any gains from the reduced structure size.
    Even if it's possible to shrink transistors even more, they won't be faster or consume less power.

    My guess is that the next evolution in performance will be moving into 3D (like the stacked cells in an SSD), but for logic transistors, if a way is found to properly cool the inner layers.
     

  5. Fox2232

    Fox2232 Ancient Guru

    Messages:
    10,788
    Likes Received:
    2,721
    GPU:
    5700XT+AW@240Hz
    There was a time, not so long ago, when 5 nm was considered an impossible target. And now they have 3 nm and are talking about 2 nm.

    As @Hilbert Hagedoorn wrote, at this point it's being counted in atoms. Freaking atoms.

    Some day in the far future, we may have CPUs built atom by atom, like on a 3D printer. And maybe in a few thousand years even atoms will be manufactured through different energy fields.
    A low enough operating voltage and clock will take care of it.
     
  6. thesebastian

    thesebastian Member Guru

    Messages:
    130
    Likes Received:
    37
    GPU:
    GTX1080 + H90
    Last year I got a 7nm 3700X CPU and also a smartphone with a 7 nm SoC (although Google adds so many services to Android that I rarely see a real benefit compared to older Nexus phones with much less dense SoCs and the same battery capacity/mAh).

    I can't wait to have hardware with 2 nm CPUs.
     
    Last edited: Jun 8, 2020
  7. sverek

    sverek Ancient Guru

    Messages:
    6,099
    Likes Received:
    2,948
    GPU:
    NOVIDIA -0.5GB
    That's what she said.
     
    m4dn355, user1, cookieboyeli and 4 others like this.
  8. Astyanax

    Astyanax Ancient Guru

    Messages:
    7,706
    Likes Received:
    2,580
    GPU:
    GTX 1080ti
    Fun fact: 2nm won't technically be 2nm.
     
  9. Kaarme

    Kaarme Ancient Guru

    Messages:
    2,039
    Likes Received:
    700
    GPU:
    Sapphire 390
    I'd really like to see optical computing start taking more concrete steps. Maybe once traditional silicon tech reaches its nanometre endpoint, it will happen, probably first in hybrid solutions. Since PCIe 4.0 already gives developers trouble, I imagine replacing electrical PCIe with optical communication would be satisfying. But then again, I'm not an engineer.
     
  10. Noisiv

    Noisiv Ancient Guru

    Messages:
    7,133
    Likes Received:
    774
    GPU:
    2070 Super
    Neither was 7nm, or 14nm, or 22nm.

    The definition of process size is about as much marketing as it is technical.

    IOTW: It's fine to be anal. Just don't think you're special if you're living in ancient Greece :D
     

  11. Astyanax

    Astyanax Ancient Guru

    Messages:
    7,706
    Likes Received:
    2,580
    GPU:
    GTX 1080ti
    :D

    Graphene and carbon nanotubes ahoy
     
  12. Silva

    Silva Maha Guru

    Messages:
    1,208
    Likes Received:
    437
    GPU:
    Asus RX560 4G
    Every company calls their products whatever they wish, for marketing reasons.
    Intel's enhanced 14nm is much denser than TSMC's 12nm and even slightly denser than GF's 12nm. Source: https://en.wikipedia.org/wiki/14_nm_process
    Again, we could compare TSMC 7nm to Intel 10nm, which are very similar (apart from the fact that there are barely any Intel products using 10nm). Source: https://en.wikipedia.org/wiki/7_nm_process
    That means that if TSMC calls something 2nm, it doesn't mean it's actually 2nm.

    What really matters is containing the flow of electrons. We need clean 1s and 0s, and we can't have leakage or we get errors. Smaller means lower voltage, and probably a point at which we get lower frequency (maybe that's why Intel's 10nm struggled). I think we're already hitting an economic wall at 7nm and they're hammering at it with lots of research money. I don't know where the limit is, but economics will play a big part.

    As for what we do next, I think 3D is one option. Heat dissipation could be an issue, but if we put the low-power components at the bottom and the high-power ones on top, we could get away with a first generation. We could also lower voltages across the board in favour of a denser chip, or invent a better cooling solution to keep it cool.
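    The name-vs-density mismatch can be made concrete with published transistor-density figures. The numbers below are rough public estimates (the kind collected on WikiChip), not official TSMC/Intel/GF data, and actual density varies per chip and cell library; a minimal sketch:

    ```python
    # Approximate peak transistor densities in MTr/mm^2 (rough public
    # estimates; real chips rarely hit these peak figures).
    density = {
        "Intel 14nm":   37.5,
        "GF 12nm":      36.7,
        "TSMC 16/12nm": 28.9,
        "Intel 10nm":   100.8,
        "TSMC 7nm":     91.2,
    }

    # "14 nm" vs "12 nm": the node with the bigger marketing number is denser.
    ratio_14_vs_12 = density["Intel 14nm"] / density["TSMC 16/12nm"]
    print(f"Intel 14nm / TSMC 12nm density ratio: {ratio_14_vs_12:.2f}")

    # "10 nm" vs "7 nm": roughly comparable, despite the different names.
    ratio_10_vs_7 = density["Intel 10nm"] / density["TSMC 7nm"]
    print(f"Intel 10nm / TSMC 7nm density ratio: {ratio_10_vs_7:.2f}")
    ```

    With these estimates, Intel's "14nm" comes out roughly 30% denser than TSMC's "12nm", which is exactly why the node name alone tells you little.
    
    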
     
  13. Ricardo

    Ricardo Member Guru

    Messages:
    145
    Likes Received:
    82
    GPU:
    1050Ti 4GB
    Kudos for the unexpected joke, you got me there LOL :p
     
    Dragam1337 likes this.
  14. slyphnier

    slyphnier Master Guru

    Messages:
    770
    Likes Received:
    60
    GPU:
    GTX1070
    What I'm more curious about is transistor aging.
    Last year I read:
    https://semiengineering.com/transistor-aging-intensifies-10nm/
    https://semiengineering.com/transistor-options-beyond-3nm/

    So far there aren't many reports about it, other than from people who overclocked their CPUs and saw them degrade quite fast, and I know many of those people overclocked beyond the safe voltage range, so it doesn't necessarily mean smaller nm = faster degradation.

    But I believe there is some trade-off for a more efficient chip in some ways.
    I suppose they design the chip to work at least through the warranty period, so around 5 years, before it shows any degradation.
     
  15. Middleman

    Middleman New Member

    Messages:
    9
    Likes Received:
    5
    GPU:
    1070TI
    So back in the day, about 20 years ago, I read a leaked document about microchip development and a planned introduction into human beings. The report stated that the goal was to reach a 2nm manufacturing node, at which point they could start integrating the chips into people.
     

  16. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    5,284
    Likes Received:
    1,877
    GPU:
    HIS R9 290
    I think most of us are aware of this; however, I do think people put a little too much emphasis on transistor size for performance and efficiency. Performance improvements were noticeable back in the days when a shrink shaved off 15nm² per transistor. Now we're talking 2 or 3. That isn't going to make a big difference to the consumer. The reason manufacturers are pushing for it is to fit more product on a single wafer.
    I agree with this. Stacking appears to be the only sensible choice for the future, at least for GPUs. Heat won't necessarily be a problem if voltages are lowered. Think of it like this: imagine having the number of transistors found in something like an RTX 2080 Ti, then double it. Thermals do not rise linearly with voltage or clock speed. If you slow down each transistor (which you kind of need to do for these tiny nodes anyway), the bottom layer might be able to just barely run cool enough to offer some insane performance. I doubt we can achieve a triple-layer stack without serious thermal issues, though I'd love to be proven wrong.
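    The "double the transistors, lower the voltage" argument can be sketched with the standard dynamic-power approximation P ≈ C·V²·f. This is a simplification (it ignores static leakage, which matters a lot at small nodes), and the scaling factors below are made-up illustrative numbers, not real GPU figures:

    ```python
    # Dynamic switching power scales roughly as P ~ C * V^2 * f.
    # (Simplified model: static leakage power is ignored.)
    def dynamic_power(c_rel: float, v_rel: float, f_rel: float) -> float:
        """Relative dynamic power, where c_rel is relative switched
        capacitance (proxy for transistor count), v_rel relative voltage,
        and f_rel relative clock; 1.0 everywhere = the baseline chip."""
        return c_rel * v_rel**2 * f_rel

    baseline = dynamic_power(1.0, 1.0, 1.0)

    # Hypothetical stacked chip: double the transistors (2x capacitance),
    # but drop voltage and clock by 20% each.
    stacked = dynamic_power(2.0, 0.8, 0.8)

    print(f"baseline: {baseline:.3f}  stacked: {stacked:.3f}")
    ```

    Because voltage enters squared, a 20% voltage drop alone cuts dynamic power by 36%, which is why doubling the transistor count here lands at roughly 1.02x the baseline power rather than 2x.
    
    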
     
    Silva likes this.
  17. David Lake

    David Lake Master Guru

    Messages:
    716
    Likes Received:
    29
    GPU:
    Titan V watercooled
    What happened to the 10nm limit of silicon?
     
  18. Aura89

    Aura89 Ancient Guru

    Messages:
    8,027
    Likes Received:
    1,183
    GPU:
    -
    Can you stop pasting nonsense?
     
    RzrTrek likes this.
  19. Jespi

    Jespi Member

    Messages:
    22
    Likes Received:
    7
    GPU:
    8GB
    I think we will reach the limits of silicon rather than the limits of physics. Around 1nm - 0.8nm we will have to switch from silicon to other materials; I've already read something about graphene and other complex substances, which are too expensive. I think manufacturers are trying to squeeze the maximum out of silicon simply because the prices are "low" ... nobody would pay 5000€ for a mid-range desktop processor based on, let's say, graphene.

    People are mentioning quantum computing, but I think that's more like 15-20 years into the future (when we'll be able to build a quantum computer that can do general computing, not pre-defined specific tasks).
     
  20. lmimmfn

    lmimmfn Ancient Guru

    Messages:
    10,409
    Likes Received:
    82
    GPU:
    AorusXtreme 1080Ti
    It's wafer thin!!!
     
