Intel pledges 80 cores in five years

Discussion in 'Frontpage news' started by Infested Nexus, Sep 27, 2006.

  1. Infested Nexus

    Infested Nexus Ancient Guru

    Messages:
    4,782
    Likes Received:
    0
    GPU:
    AMD Radeon™ HD 8850M
     
    Last edited: Sep 28, 2006
  2. QuadCannons

    QuadCannons Ancient Guru

    Messages:
    2,536
    Likes Received:
    0
    GPU:
    GeForce 210
    Holy ****!!!
     
  3. MrFox

    MrFox Master Guru

    Messages:
    828
    Likes Received:
    0
    GPU:
    Gigabyte 580GTX SOC x2
    Mother F*cker! That's so awesome :bigsmile:
     
  4. AlecRyben

    AlecRyben Ancient Guru

    Messages:
    7,774
    Likes Received:
    0
    GPU:
    5x580 2x590 2x780Ti 1x970
    If that 80-core prototype is running Windows, there are so many cores idling all the time that it's not even funny! :D
     

  5. Phalkon30

    Phalkon30 Ancient Guru

    Messages:
    1,529
    Likes Received:
    0
    GPU:
    Leadtek 6800 GT @ 420x1175 /water
    I'm pretty sure heat dissipation is going to be a problem there. Frankly it just looks like a 200 or 300 mm wafer of chips, nothing that would actually work.
     
  6. Infested Nexus

    Infested Nexus Ancient Guru

    Messages:
    4,782
    Likes Received:
    0
    GPU:
    AMD Radeon™ HD 8850M
  7. scatman839

    scatman839 Ancient Guru

    Messages:
    14,126
    Likes Received:
    528
    GPU:
    1080, KD55XD800
    Well, about time CPUs caught up with GPUs.
     
  8. Alexstarfire

    Alexstarfire Ancient Guru

    Messages:
    8,316
    Likes Received:
    0
    GPU:
    GeForce 9800GTX+ @ stock
    And Intel gets dumber. More cores aren't the way to go forever, just like increasing clock speed didn't work forever. 80 is too many, and I doubt we'd ever need that many to begin with.
     
  9. Armoured

    Armoured Ancient Guru

    Messages:
    2,068
    Likes Received:
    0
  10. Netgamer

    Netgamer Ancient Guru

    Messages:
    2,096
    Likes Received:
    0
    GPU:
    6600LE 300/300
    What would be the point of 80 cores?
     

  11. Denial

    Denial Ancient Guru

    Messages:
    13,294
    Likes Received:
    2,777
    GPU:
    EVGA RTX 3080
    Uh, how so? I'm sure Intel has a team or two of engineers working on things like quantum computing, but it looks like the best way to go at the moment is more cores. I also fail to see how 80 is too many; if a threaded application can scale to use that many, then what downside is there?
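    A rough Amdahl's-law sketch makes the disagreement concrete (the serial fractions below are illustrative assumptions, not measurements): the payoff from 80 cores depends almost entirely on how much of the workload actually runs in parallel.

        # Amdahl's law: speedup = 1 / (serial + (1 - serial) / cores)
        # The serial fractions are hypothetical examples, not benchmark data.
        def amdahl_speedup(serial_fraction: float, cores: int) -> float:
            return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

        for serial in (0.50, 0.10, 0.01):
            print(f"{serial:.0%} serial -> {amdahl_speedup(serial, 80):.1f}x on 80 cores")
        # 50% serial -> ~2.0x, 10% serial -> ~9.0x, 1% serial -> ~44.7x

    Under those assumptions, a half-serial program barely doubles in speed on 80 cores, while a 99%-parallel one gets most of the benefit.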
     
  12. Armoured

    Armoured Ancient Guru

    Messages:
    2,068
    Likes Received:
    0
    You won't be saying that five years from now.
     
  13. Netgamer

    Netgamer Ancient Guru

    Messages:
    2,096
    Likes Received:
    0
    GPU:
    6600LE 300/300
    But honestly, we have quad core and we don't even need that many right now.

    Dual core is more than enough for the hardcore gamer.

    The only thing that truly benefits from a quad-core CPU is a server.


    After reading the piece about AMD I thought they were toast, but after reading this I just might think differently. Nvidia needs to start making CPUs.
     
  14. tsunami231

    tsunami231 Ancient Guru

    Messages:
    11,505
    Likes Received:
    795
    GPU:
    EVGA 1070Ti Black
    80 cores, eh? What, are we going to need a 5000W PSU to run our PCs? I sincerely hope they find a way to seriously reduce power consumption. I find it ridiculous that we have 1000W PSUs now, and as it is they produce too much heat because of all the power they draw. If they can find a way to greatly reduce power consumption and heat, I'm all for this 80-core chip =P

    As it is I don't need a heater in my room because my PC heats it for me :smoke:
     
  15. Denial

    Denial Ancient Guru

    Messages:
    13,294
    Likes Received:
    2,777
    GPU:
    EVGA RTX 3080
    Why do you think dual core offered such a huge performance increase? Nothing is made for quad core yet; when games do start supporting it, you will see huge gains on those machines. Look at physics-based games, for example, something like Alan Wake. In the Alan Wake tornado demo they push all four cores hard and get a huge benefit from it.

    You also talk about Nvidia making CPUs, but who do you think pioneered multi-core computing? Nvidia has always backed parallel processing. Their first heavily parallel graphics cards were the FX series; you might call that a blunder, but look at the road they've taken with the 8800. The 8800 has 128 processors working together to compute data, and I don't see anyone complaining that Nvidia should have stuck with 2.
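    As a minimal sketch of the kind of scaling being argued about here (the workload, chunk size, and worker count are made up for illustration), splitting an embarrassingly parallel floating-point job across every available core looks roughly like this:

        # Sketch: divide a floating-point workload across all available cores.
        # heavy_chunk is a stand-in; real games rarely split work this cleanly.
        import math
        import os
        from multiprocessing import Pool

        def heavy_chunk(bounds):
            start, stop = bounds
            return sum(math.sqrt(i) * math.sin(i) for i in range(start, stop))

        if __name__ == "__main__":
            n = 2_000_000
            workers = os.cpu_count() or 4
            step = n // workers
            chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
                      for i in range(workers)]
            with Pool(workers) as pool:           # one worker process per core
                total = sum(pool.map(heavy_chunk, chunks))
            print(f"{workers} workers, result {total:.2f}")

    Work that decomposes into independent chunks like this keeps scaling as core counts grow, which is exactly what GPUs already exploit.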
     

  16. Norvekh

    Norvekh Ancient Guru

    Messages:
    2,678
    Likes Received:
    12
    GPU:
    EVGA RTX 2080 Ti
    I remember, a mere 20 months ago when I was getting ready to get my Athlon 64 system, how many people were saying "dual-core is a foolish endeavor, we'll never need that kind of power," etc. Yet today there are no high-end single-core processors left on the market or in development. I think people are missing the big picture on this processor, and the fact that this dates from September of last year when it was released.

    It's designed for running extremely demanding floating-point applications in a massively parallel fashion. It's not even based on the x86 architecture, so it's hardly designed for the home user; it's more of a proof of concept. Power consumption is very low because any core that isn't being used is shut down, and it's clocked fairly modestly. If anything, it's a step towards Intel's move into the GPGPU market, as Nvidia and ATI have both been bragging about their floating-point prowess in recent generations and about the massively multi-core architectures they've developed.

    Anandtech did a rather informative article on the chip and how it's mainly a proof of concept rather than a finalized product, yet could still be produced even on today's technology.

    Intel's official statement on the "Era of Tera" can be read here.

    Intel and AMD are both going in very different directions to end up at the same place. What I mean is that while AMD is researching putting multiple different cores on a single die, Intel is researching massively parallel cores that can each do whatever they are asked to do. The end result is that both companies see a future where we no longer have a dedicated graphics card, but instead an upgradable chip (or chips) on the board that can do any calculation you want, whether it's physics, graphics, AI, or encoding. It will be interesting to see how they both progress in this respect, as either way computer architecture will change rather dramatically again.
     
  17. Infested Nexus

    Infested Nexus Ancient Guru

    Messages:
    4,782
    Likes Received:
    0
    GPU:
    AMD Radeon™ HD 8850M
  18. mike41

    mike41 Maha Guru

    Messages:
    1,208
    Likes Received:
    0
    GPU:
    Evga 560ti
    Can I run Half-Life 2 and Oblivion and C&C3 and Solitaire at the same time on max settings?
     
  19. llerenaprincipe

    llerenaprincipe Ancient Guru

    Messages:
    7,503
    Likes Received:
    0
    GPU:
    DIAMOND X1950XTX 512MB, 650/2000Mhz
    Get outta here :eek:
     
  20. Sash

    Sash Ancient Guru

    Messages:
    6,961
    Likes Received:
    0
    GPU:
    video
    How are they gonna cool that? It looks like Apollo's disk.
     
