Intel Shows Off Working Larrabee, Set to Take on AMD, NVIDIA Next Year

Discussion in 'Frontpage news' started by LedHed, Sep 28, 2009.

  1. deltatux

    deltatux Guest

    Messages:
    19,040
    Likes Received:
    13
    GPU:
    GIGABYTE Radeon R9 280
    I guess the rebranding doesn't really matter as long as it works fine. However, I hope that with the added pressure NVIDIA will stop reusing the 8800 design.

    deltatux
     
  2. midweskid

    midweskid Guest

    Looking forward to this. Competition for our wallets will make prices competitive. Who knows, Intel took the lead in the SSD market coming out of nowhere; maybe we will see the same result here.
     
  3. Mike Z

    Mike Z Guest

    Messages:
    773
    Likes Received:
    4
    GPU:
    GTX 670
    I'm just saying that they seem to make the same cards twice, then change a few things and name the card something different just to sell 'em. I was implying they could shift their focus and POSSIBLY be more efficient, but what I just described seems similar to binning.

    My card is excellent for what I paid for it. I have no complaints. I'm just saying that it MIGHT be more efficient if they made 5 cards per series: a low end, a mid, a mid-high end, a high end, and an ultra high end card instead of like 10 or 11 different cards in each series.
     
  4. CronoGraal

    CronoGraal Ancient Guru

    Messages:
    4,194
    Likes Received:
    20
    GPU:
    XFX 6900XT Merc 319
    I don't see why it would matter if I'm getting really good bang for the buck, though. If they were to do what you want them to, they'd have to release a whole new set more frequently, which may end up costing them more. I want them to constantly release slight upgrades for killer deals; I love competition.
     

  5. salanos

    salanos Maha Guru

    Messages:
    1,301
    Likes Received:
    0
    GPU:
    GeForce GTX980 4GB (Ref.)

    I lol'd

    Anyhow, if Larrabee can't get pure performance equal to or better than the nVidia/ATi parts of the time, I kind of think it won't meet a good response. GTX285 speed next year is going to be rather bland.

    Sure, it'll be super-programmable as a bunch of x86 cores, but how many people really know how to do GPGPU programming?
     
  6. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    It's x86 architecture, so pretty much every developer. The SDK handles the parallelism.
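    Roughly what I mean, as a sketch only: plain standard C++ threads standing in for whatever the actual Larrabee SDK would do behind the scenes (I'm not showing real SDK calls here, just ordinary x86 code split across cores):

    #include <algorithm>
    #include <cstddef>
    #include <functional>
    #include <thread>
    #include <vector>

    // Shade a horizontal band of the frame; any ordinary C++ loop compiles
    // for the x86 cores just like regular CPU code does.
    void shade_rows(std::vector<float>& frame, std::size_t width,
                    std::size_t row_begin, std::size_t row_end)
    {
        for (std::size_t y = row_begin; y < row_end; ++y)
            for (std::size_t x = 0; x < width; ++x)
                frame[y * width + x] = 0.5f;  // placeholder "shading" work
    }

    int main()
    {
        const std::size_t width = 1024, height = 1024;
        std::vector<float> frame(width * height);

        // Split the frame across the available hardware threads, a stand-in
        // for the work a vendor SDK/runtime would do for you.
        const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
        std::vector<std::thread> pool;
        for (unsigned i = 0; i < workers; ++i) {
            const std::size_t begin = height * i / workers;
            const std::size_t end   = height * (i + 1) / workers;
            pool.emplace_back(shade_rows, std::ref(frame), width, begin, end);
        }
        for (auto& t : pool) t.join();
        return 0;
    }

    The point is just that the per-pixel function is normal C++; the runtime worries about spreading it over however many cores the chip has.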
     
  7. salanos

    salanos Maha Guru

    Messages:
    1,301
    Likes Received:
    0
    GPU:
    GeForce GTX980 4GB (Ref.)
    I meant, how many average people?
     
  8. Duke Nil

    Duke Nil Maha Guru

    Messages:
    1,243
    Likes Received:
    62
    GPU:
    GTX 1080
    Why? They make a chip and release a few differently-performing versions of it at correspondingly different prices. The consumer gets more choices, so they can get closer to exactly what they want according to how much they want to pay, and the company gets to make money off chips that aren't up to the standards of the majority, or at least the intended majority. Everybody wins, no?

    what?
     
  9. NeoElNino

    NeoElNino Master Guru

    Messages:
    978
    Likes Received:
    0
    GPU:
    Gigabyte GTX1080 8GB Soon
    OMG the water in that pic looks amazing! Ray-Tracing is the way to go!

    Intel brags about its efficiency, saying the water effects (reflections, transparency, etc.) were accomplished with only 10 lines of C++ code ---> a big slap in NVIDIA's and ATI's face.
     
  10. Xendance

    Xendance Guest

    Messages:
    5,555
    Likes Received:
    12
    GPU:
    Nvidia Geforce 570
    How many lines of code there are doesn't really matter; it depends on the functions that you use. And last I heard, making a raytracer isn't really that hard, as in you don't need complicated spaghetti code for it.
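    For what it's worth, the core of a toy raytracer really is tiny. Something like this ray-sphere hit test (just my own sketch, nothing to do with Intel's demo code):

    #include <cmath>

    struct Vec3 { float x, y, z; };

    float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    Vec3  sub(const Vec3& a, const Vec3& b) { Vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z }; return r; }

    // Returns true and writes the distance t along the ray if the ray hits the sphere.
    bool hit_sphere(const Vec3& origin, const Vec3& dir,
                    const Vec3& center, float radius, float& t)
    {
        const Vec3  oc   = sub(origin, center);
        const float a    = dot(dir, dir);
        const float b    = 2.0f * dot(oc, dir);
        const float c    = dot(oc, oc) - radius * radius;
        const float disc = b * b - 4.0f * a * c;   // quadratic discriminant
        if (disc < 0.0f) return false;             // ray misses the sphere
        t = (-b - std::sqrt(disc)) / (2.0f * a);   // nearest intersection
        return t > 0.0f;
    }

    The rest is firing one of those per pixel and shading whatever it hits; the hard part of a real raytracer is making it fast, not making it work.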
     

  11. alphaphotek

    alphaphotek Master Guru

    Messages:
    576
    Likes Received:
    0
    GPU:
    2x MSI GTX 260 SLI
    Can we use Larrabee with the new Hydra chip?! Just imagine: Larrabee, ATI, and NVIDIA GPUs all in one case working together!
     
  12. Jonp382

    Jonp382 Master Guru

    Messages:
    582
    Likes Received:
    0
    GPU:
    Sapphire HD 5770 Vapor-X
    As someone already said, it's x86, so it's very developer-friendly.
     
  14. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    Even if it is, which I doubt, that's not going to be good for 2010.

    Odd, I thought the water in those pictures looked absolutely horrible and at least 6 years old... and I'm being honest.

    It just looks like a texture with shine added, reflection with distortion, and very low-profile waves. All of those are the BASICS of water rendering, nothing spectacular done.
     
    Last edited: Sep 29, 2009
  15. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    No, not at all. The more the better; if they only had 3, that'd be sad and bad for the market.
     

  16. TyrantofJustice

    TyrantofJustice Ancient Guru

    Messages:
    5,011
    Likes Received:
    33
    GPU:
    RTX 4080
    I second that the water looked like crap.
     
  17. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    Obviously you either have no experience or very limited experience with programming. Every line of code increases the execution time, which decreases efficiency. A good example here would be C++ code compiled on a Pentium 2: it takes roughly 30 seconds to compile 100 lines of code, while 50 lines takes roughly 15 seconds. For an app that requires no user interaction, those 100 lines of code can take upwards of 30 seconds to fully execute, whereas the 50 lines can be fully executed in roughly 10-15 seconds.

    Given the two languages being used, C and C++, compile and execution times become very important when using GPUs to process the resulting instructions. The fewer lines of code necessary, the sooner the GPU will receive the instructions, and thus the sooner the task will complete. Since nVidia's GPUs are incapable of processing x86 instructions directly, the code has to be converted to x86 instructions and then converted BACK to a language that nVidia's GPUs can actually "understand". Intel has the luxury of converting the C++ code to x86 instructions and processing those instructions natively. Even at the rate that GPUs process data, the difference between how Larrabee functions and how nVidia's GPUs function will have a rather large impact on performance.

    Now, if Larrabee's first gen is as fast as a GTX285, then in GPGPU functions it will outperform it simply due to the efficiency of code execution. Also note that Intel could easily adapt Nehalem for Larrabee's second gen. Another advantage Larrabee has over GeForce and Radeon cards: no need to upgrade when MS decides to release DX12. Larrabee is already prepared and simply needs a driver update.
     
  18. salanos

    salanos Maha Guru

    Messages:
    1,301
    Likes Received:
    0
    GPU:
    GeForce GTX980 4GB (Ref.)
    My point is, how many of the average chaps you see every day know how to do programming?

    @Alphaphotek:
    I heard something about Intel attempting to partner with or buy out Lucid, so it's likely that Larrabee will work with Hydra.
     
  19. Xendance

    Xendance Guest

    Messages:
    5,555
    Likes Received:
    12
    GPU:
    Nvidia Geforce 570
    Well yeah, I'm just a rookie; I've only studied computer science at university for a month. But I was talking about coding in general, not just raytracing, and I just repeated what our lecturer said, though he was talking about Java.

    I don't understand your point. Of course people who know nothing about programming know nothing about programming for Larrabee. But people who know C or whatever you use for Larrabee know how to code for it.
     
    Last edited: Sep 30, 2009
  20. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    In any case, the fewer the lines, the faster the code can execute, whether it be for raytracing, sorting, etc.

    In the case of Larrabee, ASM (x86 assembly) and C++ are the languages known to be compatible, based on the released development info.
     
