Intel shows huge monolithic Xe GPU on Twitter, calls it "The Father of All"

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 2, 2020.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,531
    Likes Received:
    18,841
    GPU:
    AMD | NVIDIA
  2. ruthan

    ruthan Master Guru

    Messages:
    573
    Likes Received:
    106
    GPU:
    G1070 MSI Gaming
    Let's call it Larrabee 2 until proven otherwise - that is, until it's shown not to be dead on arrival.
     
    angelgraves13, EspHack and Silva like this.
  3. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    Well, it looks like it's made of 8 chiplets to me if you look at the back side of the package. The article calls it "monolithic" in the title; I thought that was supposed to mean the GPU would be one big piece of silicon rather than chiplets, but maybe I'm misunderstanding "monolithic" (the article then goes on to suggest chiplets too)? In my experience, monolithic has usually been used to describe a chip made of just one piece of silicon.
     
    Cave Waverider likes this.
  4. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    [IMG]
    [IMG]
     
    Last edited: May 2, 2020
    GSDragoon, BetA and HandR like this.

  5. mbk1969

    mbk1969 Ancient Guru

    Messages:
    15,604
    Likes Received:
    13,612
    GPU:
    GF RTX 4070
    Do they mean "ancient"?
     
  6. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    That or it's Temujin / Genghis Khan. :D

    Well, it's nice to see Intel showing AMD and NVIDIA who's the big daddy in the GPU department, possibly. Guess we'll see how it goes and what their push into graphics cards does for the current market situation. :)

    EDIT: Or they're paying NVIDIA back for those little comics from back then which painted them in a not-so-positive light...

    [IMG]

    (There are a fair few of these; no idea what it was about originally, but it's the internet, so the comics are preserved forever.)


    Still, it'll be interesting to see what this leads to: the server market, desktop, further developments on the mobile end, or perhaps a bit of everything. It's going to take a while though, as usual with new processes, from planning to engineering to whenever these are actually on the market, but it changes things up a bit and that can't be too bad.
     
    Last edited: May 2, 2020
    anticupidon likes this.
  7. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Father of all expenses?

    But you have to acknowledge something: it has quite a lot of I/O = no HBM in the package?
     
    anticupidon likes this.
  8. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,251
    Likes Received:
    232
    GPU:
    EVGA GTX 1080@2,025
    I had thought I'd die of old age before Intel brought a proper GPU to market. Heh... who knows though, I still might.
     
    jbscotchman likes this.
  9. Ne1l

    Ne1l Active Member

    Messages:
    69
    Likes Received:
    25
    GPU:
    465 Sli
    Screenshot_20200502-132613.jpg

    So many pins though... I bet motherboard manufacturers aren't looking forward to all the RMAs.

    *I remember Intel getting us to glue plastic guides on Ivy Bridges because bent pins or contaminated pads were sending RMA requests through the roof. (Found a CPU with a mount like the one we attached.) [IMG] [IMG]
     
    Last edited: May 2, 2020
  10. D1stRU3T0R

    D1stRU3T0R Master Guru

    Messages:
    681
    Likes Received:
    241
    GPU:
    8 GB
    AMD made a 3-slot GPU which *could have worked* as a 1-slot card, just to have better cooling.
    Intel is making a 3-slot GPU that really utilises every inch of that space to get some performance.
     

  11. Silva

    Silva Ancient Guru

    Messages:
    2,051
    Likes Received:
    1,200
    GPU:
    Asus Dual RX580 O4G
    Oh, look: Larrabee 2.0! Ahahah.
    Intel, you crazy! This is doomed to fail in a mass-market application.
    If they're targeting servers first, that means it will be years until there's a consumer-grade product.
     
  12. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
  13. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,952
    Likes Received:
    1,244
    GPU:
    .
    Too small.
     
  14. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,665
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
    At the time I thought Larrabee was an amazing idea, but it was just too early for that tech to even work.

    These days they might have cracked the multi-GPU system.
     
  15. anticupidon

    anticupidon Ancient Guru

    Messages:
    7,898
    Likes Received:
    4,149
    GPU:
    Polaris/Vega/Navi
    Until it's on the test bench in the Guru3D labs, I won't assume anything.
    Let's see real hardware and real benchmarks.
    Then we'll talk.
     

  16. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Oh neat, this reminds me of what 3DFX attempted with the Voodoo5, I think it was, but now on a single package, and I presume multi-GPU tech has advanced even if AMD and NVIDIA have scaled back their multi-GPU support.
    For gaming I'd imagine that must be kinda problematic to get working right unless the devs build the game engine from the beginning to scale properly and minimize issues and drawbacks, even if Vulkan and D3D12 support it better than before.

    For workstation programs or server environments, though, this could be really neat: instead of a whole array of GPUs you can do it like this, or possibly scale it up even further in the future depending on how many chips can be placed on each package. The shrinking fabrication process already brings drawbacks like yields and more, but that could improve, or other means or materials might be found that improve things. :)


    EDIT: So instead of 3-4 cards with one GPU each, this would make it 12-16 units total, and depending on scaling, if they could work together, that would also greatly improve speeds, but I suppose we'll see how this all works.

    Nice potential though, from what I imagine of how it could or would work, even without multi-GPU support on that level.


    AMD, I think, had some patents or plans, though maybe for later on or just patented for now; not sure about NVIDIA, and now Intel is actually doing it. We'll see how this all works when it's showcased. It's going to be interesting to see what this solution can do, but it must be pretty complicated stuff too, having multiple GPU chips like this.

    Oh well, the same happened for CPUs and scaling in the 2000s, and GPUs already have a ton of cores, so why not take it to multiple chips. It'll work out and improve things whether it's workstations, servers or possibly even desktop environments in the future. :)
    (A bit optimistic perhaps, and besides, since it's like 4x GPUs, what would that do to pricing, heh.)


    EDIT: Well, there's also the part where 4x cores on one chip would produce a bit more heat; you can't exactly go big-tower cooling on a slot-in GPU card, so AIO water or something else would be needed to dissipate that, ha ha.

    Still interested in seeing how this will be used though. Work- and program-oriented of course, but it's a nice little change-up of things, and if it could bring back multi-GPU support that'd be neat, though as far as gaming goes that's pretty far from being a thing for regular desktop systems, I would expect.
    (HBM and what can be done with memory stacking and such is already a hurdle; doing multi-GPU dies and even more complications would at the very least result in a very costly card, and initial scaling would be a problem too.)



    Well, it should shake up the server market a bit and other workstation-type environments, which can't be bad. Not sure about AMD and their position, but it might shake things up a bit for NVIDIA, although CUDA is still a strong incentive to use their hardware.

    What would I know; it's just fun to see something new. Well, sorta, but it's a neat little multi-GPU implementation built on new tech, hardware advancement, and years of research and general improvements, both hardware- and software-wise.
     
    Last edited: May 2, 2020
    Undying likes this.
  17. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Nvidia already has inference chips utilizing chiplets in testing. They have also published a number of papers arguing that chiplets are the only way to keep scaling going forward. In Bill Dally's (Nvidia) recent interview he says they consider chiplets on GPUs "de-risked" and ready for actual implementation. There have already been rumors that Hopper (the next gen after Ampere) is going to utilize chiplets.

    That being said, it's completely different from multi-GPU, and it doesn't seem like it's going to be good for gaming. It's highly likely we'll only see it in HPC GPUs for some time, across all three GPU designers.
     
    Last edited: May 3, 2020
  18. JamesSneed

    JamesSneed Ancient Guru

    Messages:
    1,691
    Likes Received:
    962
    GPU:
    GTX 1070
    Do people really get excited about the back of a chip? AMD's HPC CPU/GPU packages (1 CPU + 8 GPUs) are large too. Not sure I care much about any of these HPC chips from either camp.
     
  19. darkvader75

    darkvader75 Member

    Messages:
    22
    Likes Received:
    2
    GPU:
    7900XTX on Water
    Even if Intel makes a physical product, they have never proven they can make game-day drivers for our market. So it really doesn't matter if they make a product, as it will be ignored given its lack of driver support.
     
  20. xvince1

    xvince1 New Member

    Messages:
    6
    Likes Received:
    1
    GPU:
    CFX R9-290
    Wow, I hadn't noticed that Jim Keller joined Intel in 2018... That will make for some incredible pieces of hardware.
     
