
Intel Buys Indian Startup To Gain Discrete GPU Tech Expertise

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 18, 2019.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    34,669
    Likes Received:
    3,900
    GPU:
    AMD | NVIDIA
  2. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    3,717
    Likes Received:
    800
    GPU:
    HIS R9 290
    I'm not sure I understand what Intel is getting out of this. Isn't it a bit late for them to be looking for something like this now? Also, it doesn't seem like this startup makes GPUs that would be much more powerful than Intel's current offerings. So, unless they hold some patented tech that Intel wants or needs, I just can't see why they'd do this.
     
  3. rl66

    rl66 Ancient Guru

    Messages:
    1,926
    Likes Received:
    92
    GPU:
    quadro K6000+Tesla M2090
    To achieve that they need both vertical and horizontal investment; it helps to have a lot of money for it, and Intel has quite a bit.
     
  4. TieSKey

    TieSKey Active Member

    Messages:
    89
    Likes Received:
    14
    GPU:
    Gtx870m 3Gb
    You know, it's easier and a lot CHEAPER to buy current (and very underpaid) worker contracts in the form of a "company/startup" than to hire those people directly. I've seen this pattern happen a lot, even here in South America. Hell, some people found startups just to get a bunch of talented people on board and then sell the company before it even has a product, IP, or copyrights.
     

  5. waltc3

    waltc3 Master Guru

    Messages:
    807
    Likes Received:
    165
    GPU:
    XFX 590 8GB XFire
    Does anyone remember the last time Intel decided to do a discrete GPU product line, and what happened to it? IIRC, long ago and many moons past, a company called "Real3D" partnered with Intel, and Intel later bought it out to produce the "i740", a discrete but not really discrete GPU. This was back when 3dfx and nVidia were shipping products with 16 MB of local onboard VRAM and doing all of their texturing out of local RAM. Not the i740, though. Oh, no. Intel decided to market a discrete GPU with only 4-8 MB of onboard VRAM, so the GPU was forced to do AGP texturing from *system RAM*; yes, dog-slow system RAM. The i7xx was designed that way!

    Long story short: 3dfx's and nVidia's GPUs with 16 MB of local VRAM ran circles around Intel's, because they did not rely on the much, much slower AGP texturing with which Intel hobbled its own GPU right out of the gate. (nVidia advertising at the time claimed its TNT GPUs were using AGP texturing for gaming; that's yet *another* story, about nVidia's penchant for misleading and/or false advertising, for another time!) The pundit sites like Sharky Extreme and AnandTech went *nuts* over AGP texturing, as dog-slow as it was compared to texturing from onboard VRAM running some 20x faster than AGP system RAM even then. (That's why 3D cards even today sport massive amounts of VRAM; it's still many times faster than PCIe 3.0 x16 access to system RAM.) It wasn't even a contest. And so Intel gathered up all its marbles and went home, ceasing production of the i7xx series and closing down the company for good. Until now... or until sometime in the indeterminate future, that is... ;)
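    The "VRAM is still many times faster than the bus" point above can be sketched with rough arithmetic. The figures below are nominal peak bandwidths and only approximate; effective texturing throughput over a bus like AGP was lower still due to latency and contention:

```python
# Back-of-the-envelope: why texturing from local VRAM beats fetching
# textures over the system bus. Nominal peak figures, all approximate.
PCIE3_X16_GBPS = 15.75   # nominal PCIe 3.0 x16 (~16 GB/s)
MODERN_VRAM_GBPS = 616.0 # e.g. GDDR6 on an RTX 2080 Ti (~616 GB/s)

ratio = MODERN_VRAM_GBPS / PCIE3_X16_GBPS
print(f"local VRAM is ~{ratio:.0f}x faster than PCIe 3.0 x16")  # ~39x
```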

    I remember a lot of this because I bought, tested, and returned every i7xx GPU Intel marketed. I forget how many different models there were, 2 or 3, but I owned 'em all. And as I say, I returned them all; the 3D performance was just terrible, and AGP texturing was the reason why. Intel hobbled its own GPU on the drawing board. I always gave this stuff the benefit of the doubt, especially in those days when 3D was brand new. We were practically just out of the Matrox Millennium era (a fantastic 2D card that tried, and failed, at 3D too; the Matrox Mystique also graced my GPU graveyard, ugh).

    IIRC, 3dfx was the only 3D-card OEM at the time that was honest about AGP texturing, saying it wasn't interested because texturing out of local VRAM was ~20x faster. And, man, I still remember how those supposedly knowledgeable Internet pundits *turned on* 3dfx with a passion (I recall nothing of the sort from HH in those days!). AGP texturing was new and gimmicky, and it was an Intel tech, so *of course* it had to be *great*... so many of them made fools of themselves about that, and nary a one ever retracted the falsehoods and garbage they'd plastered all over the Internet. It was sort of like Larrabee: the same know-nothing people were crediting Larrabee with all kinds of superhuman 3D feats of daring, including *cough* real-time ray tracing, right up until Intel cancelled Larrabee without ever putting it into production. When money changes hands, beware of over-the-top marketing; chances are good it's totally false. Really, though, I doubt Intel paid them anything to hype Larrabee as they did; it was just a sort of *page hits* kind of thing, if you get my drift. Yours truly was on those forums trying valiantly to explain why 3dfx had an excellent point, but it was like being a voice crying in the wilderness; they did not care for the facts.

    I'm glad those days are *gone*; the people left standing, like HH, *know* what's what and are perfectly happy sticking with the facts!

    https://en.wikipedia.org/wiki/Intel740

    *Weird thing about Wikipedia: I first looked up "Real3D", and that article said nothing about the company being bought by Intel, but the i740 write-up at the link above does mention it, indeed... I'm not much of a Wiki fan as it is; it's hit and miss.
     
    fantaskarsef and Backstabak like this.
  6. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    1,634
    Likes Received:
    101
    GPU:
    Guru3d GTX 980 G1 Gaming
    Holy wall of text, Batman.

    I feel like I just read a book.
     
    fantaskarsef likes this.
  7. Michal Turlik 21

    Michal Turlik 21 Member

    Messages:
    26
    Likes Received:
    1
    GPU:
    Geforce gtx 1080
    It is almost certain that Mr. Koduri did not do the work alone. He probably wasn't the one who implemented the TBDR approach in Vega, nor all the hardware algorithms in charge of the rendering tasks. Designing GPU blocks and then assigning development tasks to build them is a different story from arranging the whole logic yourself, instruction set by instruction set.
    If Intel thinks this company has the knowledge to do what Imagination Technologies (PowerVR), Qualcomm, and the other players do with their tile-based deferred rendering chips, I think they have made a big mistake... not to mention ray tracing and pseudo-ray-tracing techniques. We have a good example at nvidia with their poor and blurry DLSS, and that's a company that has spent about 20 years fighting with computer graphics.
    If Koduri was left free to leave, or encouraged to do so, there must be a good reason behind it... most probably it is not he who did the hard work, and Intel has discovered that at its own expense. See what happens when you cheat on a Curriculum Vitae? :)
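    For readers unfamiliar with the TBDR chips mentioned above: the defining first step is "binning", sorting geometry into small fixed-size screen tiles so each tile can later be shaded entirely in fast on-chip memory. A toy, hypothetical sketch of that binning step (not any vendor's actual pipeline; `TILE` and the bounding-box test are illustrative simplifications):

```python
# Toy sketch of the binning pass at the heart of tile-based deferred
# rendering: map each triangle to the screen tiles it may cover.
TILE = 32  # tile edge in pixels; real hardware uses similarly small tiles

def bin_triangles(triangles, width, height):
    """Conservatively bin triangles by bounding box: (tx, ty) -> indices."""
    bins = {}
    for i, tri in enumerate(triangles):
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        x0 = max(min(xs) // TILE, 0)
        x1 = min(max(xs) // TILE, (width - 1) // TILE)
        y0 = max(min(ys) // TILE, 0)
        y1 = min(max(ys) // TILE, (height - 1) // TILE)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                bins.setdefault((tx, ty), []).append(i)
    return bins

# One triangle inside tile (0,0), one spanning tiles (0,0) and (1,0).
tris = [((2, 2), (10, 2), (2, 10)), ((20, 5), (40, 5), (30, 25))]
print(bin_triangles(tris, 64, 64))  # {(0, 0): [0, 1], (1, 0): [1]}
```

    Shading then walks the bins one tile at a time, which is why TBDR designs can keep the working set in on-chip memory instead of thrashing external bandwidth.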
     
  8. ruthan

    ruthan Master Guru

    Messages:
    217
    Likes Received:
    13
    GPU:
    G970/3.5G MSI
    I would be surprised if Koduri or his friends didn't have some shares in it...
     
  9. angelgraves13

    angelgraves13 Master Guru

    Messages:
    957
    Likes Received:
    137
    GPU:
    RTX 2080 Ti FE
    Well I guess in a year we'll know how it all turns out.
     
  10. rl66

    rl66 Ancient Guru

    Messages:
    1,926
    Likes Received:
    92
    GPU:
    quadro K6000+Tesla M2090
    It's like my old one: I was talking with his mother about "The Lord of the Rings" and he said, "yes, it's a very good movie, despite the FX sucking"...
    A book is active while a movie is passive for your mind: reading is good for training your brain.
    (And you haven't seen industry technical docs, where you fight not to fall asleep after the first page...)
     

  11. JamesSneed

    JamesSneed Master Guru

    Messages:
    439
    Likes Received:
    122
    GPU:
    GTX 1070
    @waltc3 Intel did have a neat idea using x86-like cores, but the reality is that a general-purpose GPU wasn't going to outperform a single-purpose GPU. I assume Intel learned that lesson and will have something competitive this go-around.
     
  12. K.S.

    K.S. Maha Guru

    Messages:
    1,234
    Likes Received:
    190
    GPU:
    1080 Ti SEA HAWK
    I'm glad Intel is making strides in discrete GPUs... hopefully we'll see something next year. And yet I thought this was why they hired Raja Koduri; then again, perhaps this is his influence at work. Still, I really hope we get more competition on the horizon...
     
