Intel Lakefield CPU Combines fast and economical cores

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 8, 2019.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,531
    Likes Received:
    18,841
    GPU:
    AMD | NVIDIA
  2. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,516
    Likes Received:
    2,361
    GPU:
    Nvidia 4070 FE
    But one problem remains: Can it run Crysis?
     
    Solfaur and emperorsfist like this.
  3. BLEH!

    BLEH! Ancient Guru

    Messages:
    6,408
    Likes Received:
    423
    GPU:
    Sapphire Fury
    They're taking hints from ARM. I guess it's all down to how different architectures scale with respect to performance per unit power at each clock speed.
     
  4. RealNC

    RealNC Ancient Guru

    Messages:
    5,090
    Likes Received:
    3,374
    GPU:
    4070 Ti Super
    Wow. Can't wait.

    /s

    Intel is really worried about ARM CPUs. Some food for thought in this analysis:

     

  5. emperorsfist

    emperorsfist Ancient Guru

    Messages:
    1,977
    Likes Received:
    1,076
    GPU:
    AORUS RTX 3070 8Gb
    I mean, that is the only real question here!
     
    Kaarme likes this.
  6. Solfaur

    Solfaur Ancient Guru

    Messages:
    8,012
    Likes Received:
    1,532
    GPU:
    GB 3080Ti Gaming OC
So wait, now they're done with lakes and have started using fields in their naming, and the very first one will be called LAKEFIELD? :eek:
     
  7. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
What I don't understand is why Intel (or AMD, for that matter) didn't adopt this big.LITTLE-like architecture years ago. Every task is different. Some work best with long single-threaded pipelines. Some can easily make do with short pipelines at low clocks. Some don't need any advanced instruction sets at all. Others work best multi-threaded. Having a single CPU with a variety of cores that excel at different workloads would really maximize efficiency. Such a CPU wouldn't be of much interest to those with more constant workloads (like workstations or servers), but it'd be great for pretty much everything else.
Both AMD and Intel (but mostly AMD) are leading us to believe that what we need is more cores, but what we really need is specialized cores. Despite what a lot of people think, many CPU-bound tasks are never going to become multi-threaded, nor should they. That's not to say having more cores is a bad thing, but rather, it's not the only thing we should be focusing on.
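To make the "route each task to the core type that suits it" idea concrete, here's a toy sketch in Python. All of the names and the single `compute_heavy` hint are hypothetical simplifications; a real OS scheduler decides this from runtime utilization, not a flag.

```python
# Toy dispatcher: route tasks to a hypothetical "big" (long-pipeline)
# or "little" (short-pipeline, low-clock) core pool based on a workload hint.
from dataclasses import dataclass


@dataclass
class Task:
    name: str
    compute_heavy: bool  # True -> needs a wide, fast core; False -> a small core suffices


def assign_core_pool(task: Task) -> str:
    """Return which hypothetical core pool should run this task."""
    return "big" if task.compute_heavy else "little"


tasks = [
    Task("video encode", compute_heavy=True),     # churns data constantly
    Task("background sync", compute_heavy=False),  # mostly idle, latency-tolerant
    Task("shell script", compute_heavy=False),     # short bursts, no SIMD needed
]

for t in tasks:
    print(f"{t.name} -> {assign_core_pool(t)} core")
```

The point of the sketch is only that the mapping itself is trivial; the hard part (which the thread touches on later) is deciding the `compute_heavy` bit correctly at runtime.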

Kinda makes me wonder why they didn't take these hints the first time around. ARM is successful in their market for a reason, and Intel completely ignored all of those reasons during their first attempt at mobile processors. I'm skeptical they actually learned and understood what they did wrong the first time.

Are they done with lakes? This is a different product lineup.
Either way, funny observation.
     
    Last edited: Jan 8, 2019
  8. BLEH!

    BLEH! Ancient Guru

    Messages:
    6,408
    Likes Received:
    423
    GPU:
    Sapphire Fury
    Are the Atom cores any good, though?
     
  9. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
From Intel's first attempt? No, not at all. At least, they weren't any good at what they were supposed to do.
As for Lakefield, there's not enough info for anyone to make a judgment.
     
  10. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    I dunno, I think it would most likely use up a lot of precious die space and see very little use in desktop systems. People already complain about the amount of die space that Intel's iGPUs use, saying it could be used for more cores instead. The big.LITTLE design makes sense for smartphones and tablets, which need to maximize battery life. Not so much for desktops (laptops are a different story, but laptops need to cater to desktop tasks as well).
     

  11. rl66

    rl66 Ancient Guru

    Messages:
    3,931
    Likes Received:
    840
    GPU:
    Sapphire RX 6700 XT
If it's like the 8-core one we have at work: yes, it does the job, not as slowly as expected ( :D lol ), and with very friendly energy efficiency.

This should be nice, as most users don't use all the cores of their computer (for example, my wife uses 2 threads at most on her 4C/8T... what a waste of money lol)
     
  12. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
Intel probably thought they could just shove their modern architecture into a phone and abuse their process advantage to compete with ARM. It's way cheaper, and they came pretty close, but ARM clearly won. Also remember that Apple has had a massive performance/power advantage on the CPU side for a long time now, and they only recently moved to big.LITTLE with the A10 in 2016. So I think a lot of engineers were questioning whether it was even necessary at the time.

I think he's speaking in reference to Lakefield, which is a mobile processor.
     
  13. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
The difference here is the iGPU doesn't do anything for most desktop users. It's literally wasted space and wasted money. I think you somewhat misunderstand what I'm saying, though (to be fair, I didn't explain it very well):
In the hypothetical CPU I'm thinking of, there would be at least 2 different kinds of cores with roughly the same FLOPS but structured very differently from each other, so you can maximize performance for a given task. So for example, there could be a set of cores with a short pipeline, limited instructions, no SMT, where each core can adjust its clocks independently. I figure such cores would be able to clock pretty high, since they're not very complex, but they would also scale down to very low (sub-GHz) speeds efficiently. These cores would be ideal for background tasks, scripted languages, some games, and basic programs that don't constantly churn data.
Then, there would be another set of cores with complex instructions, SMT, long pipelines, a narrower frequency range, and maybe a difference in how cache works. These cores are pretty much for foreground tasks that handle a lot of advanced calculations, like encoding/decoding, compiling, rendering, etc.
Note how both of these roles are traditionally handled by the CPU, and should stay that way, so I'm not suggesting another separate processor in the way that a GPU functions.
     
    Last edited: Jan 8, 2019
  14. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    Yup pretty much what I was thinking.
Haha, nowadays macOS (not sure about iOS) is horrendously slow compared to Windows and Linux. I'm not entirely sure whether performance-per-watt is better or not, but I assume it's not when the exact same task runs much slower on macOS than it does on other OSes.
     
  15. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
Yeah, I'm speaking strictly of mobile, where Apple's mobile SoCs are significantly faster than the ARM competition:

    https://www.anandtech.com/show/13392/the-iphone-xs-xs-max-review-unveiling-the-silicon-secrets/6

Even before they moved to big.LITTLE, though, they almost always had a 50% performance advantage over competing ARM SoCs with half the core count. I also recall Ryan Shrout and the others from PC Perspective (who, ironically, all work at Intel now) talking with Qualcomm engineers about how difficult big.LITTLE was to implement on the scheduler/software side, and how it took them a few generations to even see an advantage from utilizing it. So I think someone like Intel, looking from the outside, was saying "hey, we need to compete in mobile, Apple doesn't need big.LITTLE to do it, and we have a process advantage, it should be no problem!" — then they failed massively and ended up pulling out completely until now.

I mean, they do already kind of do this on desktop - it's basically what AVX is... but I agree that going forward we'll see more specialized core designs, since simply scaling to 16-32 threads isn't doable in a lot of workloads.
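The scheduler-side difficulty mentioned above comes down to the OS having to guess, from recent utilization, which core type a thread belongs on, without thrashing it back and forth. A tiny hysteresis-based sketch (all thresholds invented):

```python
# Minimal sketch of big.LITTLE thread placement with hysteresis: migrate up
# only on sustained high load, down only on sustained low load, and otherwise
# stay put to avoid ping-ponging threads between core types.

def pick_core(utilization_history, up=0.8, down=0.3):
    """Given recent per-interval utilization samples (0.0-1.0), return
    'big', 'little', or 'stay' (keep the thread on its current core)."""
    avg = sum(utilization_history) / len(utilization_history)
    if avg >= up:
        return "big"      # sustained heavy load: worth the migration cost
    if avg <= down:
        return "little"   # mostly idle: save power on a small core
    return "stay"         # ambiguous: migrating would likely cost more than it saves


print(pick_core([0.9, 0.95, 0.85]))  # sustained load -> big core
print(pick_core([0.1, 0.05, 0.2]))   # mostly idle -> little core
print(pick_core([0.5, 0.6, 0.4]))    # ambiguous -> leave it where it is
```

Real schedulers (e.g. Linux's energy-aware scheduling) weigh per-core energy models and migration costs rather than a single average, which is a big part of why it took generations to get right.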
     
    schmidtbag likes this.

  16. BLEH!

    BLEH! Ancient Guru

    Messages:
    6,408
    Likes Received:
    423
    GPU:
    Sapphire Fury
It's a question of whether the user is going to be sitting in front of it yelling "hurry up, you stupid machine!" :p

Energy-efficient doesn't mean fast :p I went from an X58/980X @ 4.13 GHz, with a motherboard alone that drew 100-odd watts, to my new X99 system, where the power dropped by a factor of 3! I probably don't use it to its full potential, but some overhead is good!
     
  17. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    Well, that, and/or "this is burning my leg!" as well as "seriously, 40% battery life!? I just charged this an hour ago!"
     
  18. coth

    coth Master Guru

    Messages:
    561
    Likes Received:
    81
    GPU:
    KFA2 2060 Super EX
There were 2 Atom attempts.

The first, over a decade ago, was CPUs for notebooks. That attempt wasn't good at all.
The second attempt was smartphone/tablet Atom SoCs. Those were pretty good, better than the ARM competitors at the time, but they failed in marketing, so Intel suspended the model line. Morganfield and Willow Trail were cancelled. Now they are resuming SoCs on the P1275 process.
     
  19. coth

    coth Master Guru

    Messages:
    561
    Likes Received:
    81
    GPU:
    KFA2 2060 Super EX
Don't mix up most desktop users with gaming users. Most desktop users benefit a lot from a light integrated GPU.

As for gamers, once Intel releases a gaming dGPU, they could support DX12 explicit multi-adapter. It depends on game support, but by then DX12 will have gained popularity. So you would get your several-fps increase.
     
  20. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
Huh? The 2nd attempt was atrocious. They were horribly power-hungry (even when idle), and their performance-per-watt suffered when trying to improve battery life. Intel can shove itself into any market it wants and make decent sales so long as the product is "decent" or better. The reason this platform failed is that it was an inferior competitor to what ARM had to offer.

Fair enough, but that was a little beside the point anyway. I was more addressing D3M1G0D's comment about people who feel the iGPU is wasted die space.
     