http://www.extremetech.com/extreme/...-speed-of-light-optical-computer-on-your-desk I don't think anyone is going to have this, as described, on their desk in 6 years, but it's interesting.
My bet is on graphene, or on fully programmable chips that would reorganize themselves based on the type of workload.
Somehow I doubt it. For the last decade we don't seem to get any newer tech that's a real change, just rehashes and updates. We must pay for all the R&D, and they will bleed us slowly for as long as possible, as long as we let them.
What about quantum computing? That's been in development for ages; I would guess that by 2020 something commercial will have come out of it. And then this technology will already be outdated.
"By 2020" actually means 2030... if ever. I am not very good at googling, but it would be really nice to see all the previous announcements that said we would have certain technologies by now, and how many of them actually came to market.
I was pointing to the fact that this light-based computer works somewhat like a quantum computer, since the light beams will have altered wavelengths too, in contrast to our standard 0/1 x86 computers. Even IBM's PS3 CPU (the Cell) was just different enough to be troublesome to emulate on x86. Basically, with x86 you can imagine logic, integer, and floating-point operations combining to get a result. Quantum computing is something altogether different, and this laser-based system will be somewhere in between. That raises the question of what kind of application would make users buy it. I personally don't believe you would buy a computer based on a laser that will degrade (fail) over time, as any light source we have today does.
If the power is there, a use for it will come, like with anything mankind has invented so far. As for quantum computers, I doubt they will be ready for any practical use in the next 15 years. Researchers are only just beginning to understand why some things happen the way they do, and not even most of them.
Every single light source changes in intensity and wavelength based on voltage, the time it has been running since a cold start, and years of service. On top of that, they are using liquid crystal to change/block/alter the light paths. That technology, even though it's old, is still prone to manufacturing errors; even a defect rate lower than 1 in 1,000,000,000 would be too big a risk, since a single defective "cell" would cost an entire "chip" (block). And even today, devices based on liquid crystal tech occasionally develop new malfunctions while running.
I hope to be married and off the grid by 2020, dunno about you folks. Spend thousands every couple of years for what? A handful of nice games a year, and half of this year's are delayed. Screw computing, it's nerdy and basically loserish.