Moore's law on Next Gen cards

Discussion in 'Videocards - NVIDIA GeForce' started by stefanovicho, Aug 26, 2008.

  1. stefanovicho

    stefanovicho Maha Guru

    Messages:
    1,461
    Likes Received:
    0
    GPU:
    gf 6800GT / M8600GT
    What do you guys think: is the fact that we see more and more dual CPUs and GPUs a sign that we are at the limit of what can be done with single CPUs/GPUs?

    If this is true, the performance increases we are used to will stagnate, because producing these kinds of multi-GPU products is more expensive.

    I'm very happy that NVIDIA has a new product based on a single chip, but will the next generation also be based on a single, more advanced chip? Or will NVIDIA also go the way ATI is going with dual GPUs?
     
  2. proFits

    proFits Guest

    Messages:
    5,866
    Likes Received:
    3
    GPU:
    RTX 2080
    They HAVE to make each chip better either way, but obviously two brains are better than one... almost (two Forrests don't make an Albert or a Hilbert).

    And don't you worry, Moore's law is being respected and will continue to be respected; new generations are just around the corner.
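
    For reference, Moore's law is just a doubling rule of thumb. A quick sketch (the two-year doubling period and the ~1.4 billion transistor starting point for a 2008 GT200-class chip are rough assumptions, not anything NVIDIA has promised):

        # Moore's law as a rule of thumb: transistor count doubles roughly every 2 years.
        transistors = 1.4e9          # approximate GT200-class figure (assumption)
        for year in (2010, 2012, 2014):
            transistors *= 2
            print(year, int(transistors), "transistors")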
     
  3. stefanovicho

    stefanovicho Maha Guru

    Messages:
    1,461
    Likes Received:
    0
    GPU:
    gf 6800GT / M8600GT
    I read about this subject a year ago. It seems that scientists run into problems each time they shrink the chip and pack more transistors into a smaller space.

    I don't know the English word for it, but some kind of electrical leakage appears to be the bottleneck.


    Excuse me for my bad English :p
     
  4. Dustpuppy

    Dustpuppy Ancient Guru

    Messages:
    4,146
    Likes Received:
    0
    GPU:
    integrated - fffffffuuuuu
    We're slated to get 40nm fab process chips from ATI and NVIDIA in late Q1 of next year.

    But yes, we're pushing the limit of current technology.

    We essentially have to invent a totally new kind of transistor between now and 2013-2018 (depending on how fast Intel hits the theoretical limit), or else fab processes will get stuck and there will be no more die shrinks (rough numbers below). At least that's what I read.


    The leaking electricity you're thinking of is probably 'barrier penetration' or 'quantum tunneling'. Most folks prefer 'quantum tunneling', because using the word quantum in everything makes them feel smarter.
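
    Rough back-of-the-envelope for where a range like 2013-2018 can come from, assuming each full node is about 0.7x the previous one and arrives roughly every two years (both numbers are assumptions, not anyone's actual roadmap):

        # Project process nodes forward from 40nm using the classic ~0.7x full-node
        # shrink (about 1/sqrt(2)) every ~2 years. Purely illustrative numbers.
        node_nm, year = 40.0, 2009
        while node_nm > 5:           # ~5nm is only a few dozen silicon atoms wide
            node_nm *= 0.7
            year += 2
            print(year, round(node_nm, 1), "nm")
        # Leakage gets ugly in the mid-teens of nm (mid-2010s here), and you start
        # running out of atoms to shrink not long after.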
     

  5. stefanovicho

    stefanovicho Maha Guru

    Messages:
    1,461
    Likes Received:
    0
    GPU:
    gf 6800GT / M8600GT
    Yes, I believe that’s the word I was looking for, quantum tunneling.

    How did you get those estimates (2013 - 2018)?

    You seem to know a lot about this. I for one think this is a very interesting subject.
     
  6. Dustpuppy

    Dustpuppy Ancient Guru

    Messages:
    4,146
    Likes Received:
    0
    GPU:
    integrated - fffffffuuuuu
    The problem becomes really bad below the 16nm mark. Intel is going to hit 22nm in 2012 with the 8-core Haswell; on the next shrink, what are they going to go down to? That's where the 2013 date comes from (quick math below); the 2018 date comes from the International Technology Roadmap for Semiconductors (ITRS).

    You can read all of that on Wikipedia.

    Intel's schedule

    Fabs
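
    Quick tick-tock arithmetic, assuming each shrink divides the node by about sqrt(2) and lands roughly every two years (my assumptions, not Intel's published roadmap):

        # Start from the 22nm / 2012 point above and keep dividing by sqrt(2).
        node, year = 22.0, 2012
        for _ in range(3):
            node /= 2 ** 0.5
            year += 2
            print(year, round(node), "nm")
        # -> ~16nm around 2014, ~11nm around 2016, ~8nm around 2018:
        #    right about where sub-16nm leakage/tunneling becomes the show-stopper.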
     
    Last edited: Aug 26, 2008
  7. stefanovicho

    stefanovicho Maha Guru

    Messages:
    1,461
    Likes Received:
    0
    GPU:
    gf 6800GT / M8600GT
    Hmm, very interesting. I wonder what we should be picturing when we talk about a 'new kind of transistor'.
     
  8. Tbone325

    Tbone325 Member Guru

    Messages:
    164
    Likes Received:
    0
    GPU:
    EVGA GTX570
    Nanotech, or maybe biological :nerd:
     
  9. BlackZero

    BlackZero Guest

    [image]
     
  10. Curtimus

    Curtimus Member Guru

    Messages:
    156
    Likes Received:
    0
    GPU:
    MSI HD4870 790/4400

  11. Black_ice_Spain

    Black_ice_Spain Guest

    Messages:
    4,584
    Likes Received:
    17
    GPU:
    5700XT
    Well... that limit is supposed to be reached around 2020, so we are not that close...

    Also, the problem with that limit is that the physics of quantum particles (it's not about sounding smart, it's scientific) is different from that of normal particles,

    so we just have to wait for quantum physics to be studied further, and keep building 16-core chips in the meantime =)


    BTW, we are in nanotech already @_@. I don't think anyone will release a brand-new kind of transistor; they'll just build quantum ones and test them over and over until they make them work. The problem is not "they won't work", the problem is "we don't know how they will work".
     
  12. Tyler Lowe

    Tyler Lowe Member

    Messages:
    31
    Likes Received:
    0
    GPU:
    BFG 9800GTX+ SLI
    Look, an AMD tri-core. :heh:

    Sorry to any AMD fans, it was just too hard to resist.:)
     
  13. Kolt

    Kolt Ancient Guru

    Messages:
    1,657
    Likes Received:
    509
    GPU:
    RTX 2080 OC
    Either they're at the point where they can't develop anything cutting-edge, or the economics have shifted to the point where adding extra cores is the cost-effective way of upgrading components. Or both.

    I think it's just easier for them to add in multiple cores. We all know there are new things coming. Just read some of the blogs and updates floating around.
     
  14. Phantomchess

    Phantomchess Active Member

    Messages:
    87
    Likes Received:
    0
    GPU:
    BFG GTX 280@700/1475/2660
    I read something about them switching to laser processors; supposedly around 10,000x faster than current silicon processors. Sun Microsystems, they said, already has a commercial prototype.
     
  15. dp100

    dp100 Member Guru

    Messages:
    118
    Likes Received:
    0
    GPU:
    EVGA 8800GT 512mb
    Don't say things you are going to regret!!!
     

  16. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    I don't get it, there are 4 mice there...

    Just a response to that specifically, not to Moore's law, but I don't think we're hitting a bottleneck that forces us to use dual GPUs. That's like saying "hey look, dual-core and quad-core CPUs, I guess single-core CPUs aren't able to do it anymore", when in fact CPUs keep getting better and faster in spite of the multiple cores. I think the reason we're seeing dual GPUs is that people always want more, and since the technology is there, why not use it? It gets them more money and gets their customers what they want (more performance).
     
  17. stefanovicho

    stefanovicho Maha Guru

    Messages:
    1,461
    Likes Received:
    0
    GPU:
    gf 6800GT / M8600GT

    Yes, but isn't it strange that AMD chose a dual-GPU high-end product rather than a single GPU that can do the same (it's been proven possible with NVIDIA's GTX 280)?

    Apparently it's easier to do that than to develop a 'real step forward'.

    The fact that, for example, the GeForce 9 series was based on the GeForce 8 series also comes to mind here.
     
  18. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    AMD said they would do that to lower the cost of their cards. They specifically said they were not going for a big "high end" chip anymore; rather, to make a high-end card, they would double up a smaller one.

    As to the 9 series, I don't think that is really relevant... the GT200 is built on the same 65nm process as the GeForce 9, so nothing changed between then and now other than trial and error on their part to make a better, faster GPU.
     
  19. Tyler Lowe

    Tyler Lowe Member

    Messages:
    31
    Likes Received:
    0
    GPU:
    BFG 9800GTX+ SLI
    The 4th one is physically there, but disabled. :giggle2:
     
  20. Tyler Lowe

    Tyler Lowe Member

    Messages:
    31
    Likes Received:
    0
    GPU:
    BFG 9800GTX+ SLI
    It's in good fun. :)
     
