"By 2020, you could have an exascale speed-of-light optical computer"

Discussion in 'Frontpage news' started by Veeshush, Aug 9, 2014.

  1. Veeshush

    Veeshush Maha Guru

    Messages:
    1,095
    Likes Received:
    2
    GPU:
    MSI GTX 680 Lightning
    http://www.extremetech.com/extreme/...-speed-of-light-optical-computer-on-your-desk

    I don't think anyone is literally going to have this on their desk in 6 years, but it's interesting.

     
  2. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    My bet is on graphene or fully programmable chips which would reorganize themselves based on workload type.
     
  3. FULMTL

    FULMTL Ancient Guru

    Messages:
    6,704
    Likes Received:
    2
    GPU:
    AOC 27"
    Can we overclock it? :)
     
  4. volkov956

    volkov956 Ancient Guru

    Messages:
    6,132
    Likes Received:
    16
    GPU:
    RTX 3080 12GB
    Somehow I doubt this. All the newer tech from the last decade that would be a real change never seems to reach us; we just get rehashes and updates. We must pay for all the R&D, and they will bleed us slowly for as long as possible, as long as we let them.
     

  5. expo

    expo Guest

    Messages:
    59
    Likes Received:
    0
    GPU:
    Gtx 970 / 3.5GB + 0.5GB
    What about quantum computing? That's been in development for ages; I would guess that by 2020 something commercial will have come out of that.
    And then this technology would already be outdated.
     
  6. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    And what kind of home application would you need it for?
     
  7. thatguy91

    thatguy91 Guest

    Play grossly unoptimised console ports.
     
  8. Clawedge

    Clawedge Guest

    Messages:
    2,599
    Likes Received:
    928
    GPU:
    Radeon 570
    "By 2020" actually means 2030... if ever?

    I am not very good at googling, but it would be really nice to see all the previous announcements that promised certain technologies by now actually come to market.
     
  9. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    I was pointing to the fact that this light-based computer works somewhat like a quantum computer, since the light beams will have altered wavelengths too.

    Compare that to our standard 0/1 x86 computers, where even IBM's PS3 CPU was only a bit different and it still has trouble being emulated on x86.

    Basically, with x86 you can picture logic, integer and floating-point operations combining to get a result. Quantum computing is something altogether different, and this laser-based system would sit somewhere in between.

    And that raises the question of what kind of application would make users buy one.

    I personally do not believe people would buy a computer based on lasers, which will degrade (fail) over time like any light source we have today.
     
    Last edited: Aug 13, 2014
  10. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    5,246
    Likes Received:
    1,608
    GPU:
    RTX 3060 12GB
    That last paragraph gets a big 'huh?' from me.
     

  11. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,767
    Likes Received:
    9,665
    GPU:
    4090@H2O
    If the power is there, a use for it will follow, like with anything mankind has invented so far ;)
    As for quantum computers, I doubt they will be ready for any practical use in the next 15 years. Researchers are only just starting to understand why some things happen the way they do, and not even that many of them yet.
     
  12. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Every single light source changes its intensity and wavelength to some degree depending on the voltage, the time it has been running since a cold start, and its years of service.

    Then again, they are using liquid crystals to change/block/alter the light paths.
    This technology, old as it is, is still prone to manufacturing errors. Even if the defect rate is lower than 1 in 1,000,000,000 per cell, that would be too big a risk, since one defective "cell" would cost the entire "chip" (block).
    And even today, devices based on liquid crystal tech develop new malfunctions from time to time while running.
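
    A quick back-of-the-envelope sketch of that yield argument (Python; the 10^8 cell count and the "one defect kills the block" model are assumptions for illustration, only the 1-in-1,000,000,000 defect rate is the figure mentioned above):

        # Rough yield estimate under assumed numbers: N independent cells per block,
        # each defective with probability p, and a single defect ruins the whole block.
        N = 100_000_000   # assumed number of cells in one block (hypothetical)
        p = 1e-9          # assumed per-cell defect probability ("1 in 1,000,000,000")

        block_yield = (1 - p) ** N                      # chance that no cell in the block is defective
        print(f"block yield ~ {block_yield:.1%}")       # ~90.5%
        print(f"blocks lost ~ {1 - block_yield:.1%}")   # ~9.5%, roughly 1 in 10 blocks scrapped

    Even at a one-in-a-billion per-cell defect rate, under these assumptions roughly one block in ten gets scrapped, which is the point about a single bad "cell" costing the whole "chip".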
     
  13. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    5,246
    Likes Received:
    1,608
    GPU:
    RTX 3060 12GB
    Most people upgrade their computers every 18 months to 3 years anyway...
     
  14. SLI-756

    SLI-756 Guest

    Messages:
    7,604
    Likes Received:
    0
    GPU:
    760 SLI 4gb 1215/ 6800
    I hope to be married and off the grid by 2020, dunno about you folks. Spend thousands every couple of years for what? A handful of nice games a year, and half of this year's are delayed. Screw computing, it's nerdy and basically loserish.
     
  15. Xendance

    Xendance Guest

    Messages:
    5,555
    Likes Received:
    12
    GPU:
    Nvidia Geforce 570
    So what have we learned from this post?
    That apparently computing == playing games.
     
