Discussion in 'Frontpage news' started by Infested Nexus, Sep 27, 2006.
Read story at CNET News
Mother F*cker! That's so awesome :bigsmile:
If that 80-core prototype is running Windows, there are so many cores idling all the time it's not even funny!
I'm pretty sure heat dissipation is going to be a problem there. Frankly it just looks like a 200 or 300 mm wafer of chips, nothing that would actually work.
Intel readies massive multicore processors
Well, about time CPUs caught up with GPUs.
And Intel gets dumber. More cores isn't the way to go forever, just like increasing the CPU speed didn't work forever. 80 is too many and I doubt we'd ever need that many to begin with.
Link doesn't work
What would the point of 80 cores be?
Uh, how so? I'm sure Intel has a team or two of engineers working on things like quantum computing and such, but it looks like the best way to go atm is more cores. I also fail to see how 80 is too many; if a threaded application can scale to use that many, then what downside is there?
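The usual caveat here (not from the article, just the textbook formula) is Amdahl's law: whatever fraction of a program stays serial caps the speedup, no matter how many cores you throw at it. A quick Python sketch:

```python
def amdahl_speedup(parallel_fraction, cores):
    # Amdahl's law: speedup = 1 / ((1 - p) + p / n)
    # p = fraction of the work that can run in parallel, n = core count
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# even a 95%-parallel program tops out well short of 80x on 80 cores
print(round(amdahl_speedup(0.95, 80), 1))  # -> 16.2
```

So a program that's 95% parallel gets only about 16x out of 80 cores; whether 80 cores pays off depends entirely on how parallel the workload really is.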
You won't be saying that 5 years from now
But honestly, we have quad core, and we don't even need that many right now.
Dual core is more than enough for the hardcore gamer.
The only thing that truly benefits from a quad-core CPU is a server.
After reading the thing about AMD I thought they were toast, but now reading this I just might think differently. Nvidia needs to start making CPUs.
80 cores, eh? What, are we gonna need a 5000W PSU to run our PCs? I sincerely hope they find a way to seriously reduce power consumption. I find it ridiculous that we have 1000W PSUs now, and as it is they produce too much heat because of all the power they need. If they can find a way to greatly reduce power consumption and heat, I'm all for this 80-core chip =P
As it is I don't need a heater in my room, because my PC heats it for me :smoke:
Why do you think dual core offered such a huge performance increase? Nothing is made for quad core yet; when games do start supporting it, you will see huge gains on those machines. Look at physics-based games, for example, something like Alan Wake. In the Alan Wake tornado demo they utilize all four cores to the extreme and get a huge benefit from it.
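To make the scaling idea concrete, here's a minimal Python sketch (my own illustration, not from any actual game engine) of farming per-object physics work out to a pool of worker processes, one per core:

```python
from multiprocessing import Pool

def physics_step(obj):
    # stand-in for an expensive per-object update (collisions, forces, etc.)
    x, v = obj
    return (x + v, v * 0.99)

def update_world(objects, workers=4):
    # each worker process gets a chunk of the objects;
    # results come back in the original order
    with Pool(workers) as pool:
        return pool.map(physics_step, objects)

if __name__ == "__main__":
    world = [(float(i), 1.0) for i in range(1000)]
    world = update_world(world)
    print(world[0])  # -> (1.0, 0.99)
```

With truly independent objects like this, the work divides cleanly across however many cores you hand the pool; the catch in real games is that objects interact, which is where the hard engineering lives.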
You also talk about Nvidia making CPUs, but who do you think pioneered multi-core computing? Nvidia has always backed parallel processing; their first heavily parallel graphics card was the FX series. You might say that was a blunder, but look at the road they have taken with the 8800. The 8800 has 128 stream processors working together to compute data, and I don't see anyone complaining that Nvidia should have stuck with 2.
I remember, a mere 20 months ago when I was getting ready to buy my Athlon 64 system, how many people were saying "dual-core is a foolish endeavor, we'll never need that kind of power," etc. Yet today there are no high-end single-core processors left on the market or in development. I think people are missing the big picture on this processor, and the fact that this is from September of last year when it was released.
It's designed for running extremely floating-point-heavy applications in a massively parallel fashion. It's not even based on the x86 architecture, so it's hardly designed for the home user; it's more of a proof of concept. Power consumption is very low because any core that isn't being used is shut down, and it's clocked fairly modestly. If anything, it's a step in Intel's move toward the GPGPU market, as Nvidia and ATI have both been bragging about their floating-point prowess in recent generations and the massively multi-core architectures they've developed.
Anandtech did a rather informative article on the chip and how it's mainly a proof of concept, not a finalized product, yet could still be produced even with today's technology.
Intel's official statement on the "Era of Tera" can be read here.
Intel and AMD are both going in very different directions to end up at the same place. What I mean is, while AMD is researching putting multiple different cores on a single die, Intel is researching massively parallel cores that can each do whatever they are asked to do. The end result is that both companies see a future where we no longer have a dedicated graphics card but instead an upgradable chip (or chips) on the board that can do any calculation you want, whether it's physics, graphics, AI, or encoding. It will be interesting to see how they both progress in this respect, as either way computer architecture will change rather dramatically again.
Intel presentation reveals the future of the CPU-GPU war
Can I run Half-Life 2 and Oblivion and C&C3 and Solitaire at the same time on max settings?
Get outta here
How are they gonna cool that? It looks like Apollo's disk.