Intel Larrabee GPU designer rejoins Intel GPU Team

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 20, 2018.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,544
    Likes Received:
    18,856
    GPU:
    AMD | NVIDIA
    It's been roughly a decade, but Tom Forsyth was the man behind Larrabee, if you can remember it; you have to go back to 2007 for the first Larrabee rumors. Forsyth will be teaming up with ...

    Intel Larrabee GPU designer rejoins Intel GPU Team
     
  2. UnrealGaming

    UnrealGaming Ancient Guru

    Messages:
    3,454
    Likes Received:
    495
    GPU:
    -
    I really hope they release something interesting. The GPU market has been boring af for almost 10 years now.
     
  3. Koniakki

    Koniakki Guest

    Messages:
    2,843
    Likes Received:
    452
    GPU:
    ZOTAC GTX 1080Ti FE
    Would love to see them using the Iris name: "Hey, just got an Iris xxx" or something.. :p
     
  4. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,691
    Likes Received:
    962
    GPU:
    GTX 1070
    For the first damn time, Intel is gearing up to get serious about the GPU market. I don't know why it took this long, but I couldn't be more thrilled to have a third competitor.

    I read Tom's blog; he only called Larrabee a success in relation to what Intel asked his team to build. They built it, and it nailed what Intel wanted. That said, what Intel wanted was way off from what the market wanted, which has been the story of Intel GPUs.
     

  5. Solfaur

    Solfaur Ancient Guru

    Messages:
    8,013
    Likes Received:
    1,533
    GPU:
    GB 3080Ti Gaming OC
    I'm actually looking forward to this, since they seem to be serious about it. The GPU market sure needs the competition.
     
    fantaskarsef and HonoredShadow like this.
  6. ruthan

    ruthan Master Guru

    Messages:
    573
    Likes Received:
    106
    GPU:
    G1070 MSI Gaming
    Same mistakes again and again..
     
  7. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,020
    Likes Received:
    4,397
    GPU:
    Asrock 7700XT
    Kind of weird that he accepted a job without knowing what he'll be doing.
    Larrabee actually would've been a success if Intel hadn't kept shafting it. If they had funded it properly, Nvidia would be nowhere near as successful as they are in the server market.
     
  8. RooiKreef

    RooiKreef Guest

    Messages:
    410
    Likes Received:
    51
    GPU:
    MSI RTX3080 Ventus
    Well, for once it looks like Intel is actually pushing hard to enter the GPU market. Maybe it's because they can see AMD catching up on the CPU side quicker than they thought.
    In the end it will be great for all of us to have a third player around.

    Now Nvidia just has to make a CPU for the PC sector.
     
  9. asturur

    asturur Maha Guru

    Messages:
    1,376
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    From what I remember it performed badly, like 30% of the competition.
    It was a completely different architecture.
     
  10. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,020
    Likes Received:
    4,397
    GPU:
    Asrock 7700XT
    I was thinking something similar. Intel knows they won't compete with AMD or Nvidia in the gaming market, but the server market is a whole different beast. Intel knows AMD isn't doing so great in the server GPU market, and unlike Nvidia, they're willing to create a new architecture from the ground up specifically for server workloads (Nvidia's hardware still revolves heavily around the needs of gamers and workstations).
    That will never happen. To my recollection, they were explicitly denied a license to manufacture x86, which is why they went with ARM instead and made the Tegra series.
     

  11. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Yeah, they originally intended it to compete in the consumer space, but the performance wasn't up to the competition, so they reshuffled it as a compute card, with the added bonus of being easy to program for compared to CUDA at the time. Most of the work on Larrabee ended up going into Knights Landing, AVX-512 and such, so it wasn't all lost.
     
  12. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    They were serious before, burned quite some cash on it, and everyone knows what came out of it.
     
  13. Texter

    Texter Guest

    Messages:
    3,275
    Likes Received:
    332
    GPU:
    Club3d GF6800GT 256MB AGP
    I'm sure they'll just throw together a bunch of AMD iGPU dies this time around...and then proclaim it'll become thrice as powerful by 2023...
     
  14. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,691
    Likes Received:
    962
    GPU:
    GTX 1070
    Totally agree. Google the Tianhe-2 (TH-2) supercomputer. It was the fastest supercomputer at the time, and it was based on a ton of Xeon Phis, aka Larrabee.

    One of the coolest ideas in Larrabee was that it was x86-based, with software sitting on top of those tiny cores rather than the dedicated fixed-function hardware AMD and Nvidia use. You could do a driver update and go from, say, DirectX 9 compatible to 100% DirectX 12 compatible. In fact, Intel did just that with its prototypes, making them DirectX 11 compatible even though the hardware predated the DirectX 11 specification. The downside of software sitting on top of x86 cores is that Larrabee just wasn't that efficient compared to AMD and Nvidia GPUs, but as with any software, I suspect that could have been vastly improved.
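
    The idea is easy to picture in code. A purely conceptual sketch in C (none of this is actual Larrabee source; the stage names and types are made up for illustration): when every pipeline stage is ordinary software running on x86 cores, a driver update can swap in a new stage, so a "new" API level needs no new silicon.

        #include <stddef.h>

        /* Each pipeline stage is an ordinary function running on x86 cores. */
        typedef struct {
            void (*vertex_stage)(const void *in, void *out, size_t count);
            void (*raster_stage)(const void *tris, void *tiles, size_t count);
            void (*pixel_stage)(const void *tiles, void *fb, size_t count);
        } sw_pipeline;

        /* Hypothetical stage implementations shipped with the original driver. */
        static void vs_dx9(const void *in, void *out, size_t n)  { (void)in; (void)out; (void)n; }
        static void rs_dx9(const void *t, void *o, size_t n)     { (void)t;  (void)o;   (void)n; }
        static void ps_dx9(const void *t, void *fb, size_t n)    { (void)t;  (void)fb;  (void)n; }

        /* A later driver ships a vertex stage with newer API features. */
        static void vs_dx11(const void *in, void *out, size_t n) { (void)in; (void)out; (void)n; }

        int main(void) {
            sw_pipeline p = { vs_dx9, rs_dx9, ps_dx9 };
            /* "Upgrading the GPU" is just swapping a function pointer;
             * the silicon underneath stays the same x86 cores. */
            p.vertex_stage = vs_dx11;
            return 0;
        }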
     
    Solfaur likes this.
  15. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    The hiring of Forsyth reinforces my suspicion that Intel is making a compute product, as opposed to a gaming product. This might be a Larrabee 2 of sorts, to compete with Nvidia and AMD in the AI, machine learning and cloud computing markets.
     

  16. Moonbogg

    Moonbogg Master Guru

    Messages:
    306
    Likes Received:
    212
    GPU:
    GTX1080Ti@2.1GHz
    I think Intel wants to be part of the AI future. It's going to mean everything moving forward, and GPUs are what's used for it. I honestly think gaming will be very low on Intel's list of things to consider or care about. They might have a GPU in 2020, but it won't mean much for gamers, and it will likely be way underpowered anyway. I expect nothing but garbage from Intel, although I have to admit it's odd to see them assembling such a significant team... hmm... maybe we'll get lucky... or NAH.
     
  17. nz3777

    nz3777 Ancient Guru

    Messages:
    2,504
    Likes Received:
    215
    GPU:
    Gtx 980 Radeon 5500
    It would be cool if they could come up with something at least mid-range, if not at the high end of the gaming spectrum. Nvidia has things tied down, and I don't see that changing anytime soon unless Intel teams up with AMD and comes out with something kick-ass. If that works, they can improve their crappy iGPU side of things too. All in all, they're making the right steps in my opinion. Nvidia is the giant in the room, whahaha.
     
  18. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    Well, he was/is an "expert" in compute, and I guess Raja needed some help there, but I don't know why that would automatically make it a compute-only card.


    I still think Intel will deliver and stir up the GPU market a lot. 2020 is not far away, a good year and a half to go.
     
  19. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,793
    Likes Received:
    1,396
    GPU:
    Jensen Huang stole my 4090
    And daylight robbery. Don't forget daylight robbery.
     
    fantaskarsef and Solfaur like this.
  20. asturur

    asturur Maha Guru

    Messages:
    1,376
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    Well, at some point you need the frequency too. 150 fps is about 6.7 ms per frame. Give yourself some clock.
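
    To put numbers on it, a quick back-of-the-envelope in C (the 1080p resolution and 1 GHz clock are illustrative assumptions, not anything from the thread): at 150 fps a single slow core gets only a few cycles per pixel, which is why both clock speed and core count matter for a software renderer.

        #include <stdio.h>

        int main(void) {
            const double fps      = 150.0;
            const double frame_ms = 1000.0 / fps;             /* ~6.7 ms per frame  */

            const double pixels = 1920.0 * 1080.0;            /* assumed 1080p frame */
            const double hz     = 1.0e9;                      /* assumed 1 GHz core  */
            const double cycles = hz * (frame_ms / 1000.0);   /* cycles per frame    */

            printf("frame budget : %.2f ms\n", frame_ms);     /* 6.67 ms             */
            printf("cycles/pixel : %.1f\n", cycles / pixels); /* ~3.2 on one core    */
            return 0;
        }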
     
