Intel's 14th Generation Meteor Lake Processors: Emphasizing AI and Energy Efficiency

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 30, 2023.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    47,158
    Likes Received:
    15,868
    GPU:
    AMD | NVIDIA
    mbk1969 and fantaskarsef like this.
  2. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,017
    Likes Received:
    8,650
    GPU:
    2080Ti @h2o
    So... that VPU... are they putting it into all them CPUs now?
     
  3. nevcairiel

    nevcairiel Master Guru

    Messages:
    863
    Likes Received:
    362
    GPU:
    4090
    They at least say "All SKUs", so all Meteor Lake CPUs will have it. Whether it continues into the future depends on adoption, I guess, but at least this upcoming generation should be consistent.
     
  4. heffeque

    heffeque Ancient Guru

    Messages:
    4,374
    Likes Received:
    184
    GPU:
    nVidia MX150
    I'm assuming that this is Intel's response to AMD's Ryzen AI (Xilinx) on the Ryzen 7040 family (and assuming most future products).
     
    rl66 likes this.

  5. rl66

    rl66 Ancient Guru

    Messages:
    3,772
    Likes Received:
    785
    GPU:
    Sapphire RX 6700 XT
    Not only AMD: there's also Qualcomm, Apple... and more. Nearly everyone is on the train :) .
     
  6. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    7,771
    Likes Received:
    9,631
    GPU:
    RX 6800 XT
    At this moment, every company is fighting over Nvidia's leftovers in the AI market.
     
    H83 likes this.
  7. heffeque

    heffeque Ancient Guru

    Messages:
    4,374
    Likes Received:
    184
    GPU:
    nVidia MX150
    Well, Qualcomm and Apple are not in the x86 industry...
     
  8. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,205
    GPU:
    AD102/Navi21
    22 threads and 16 cores means 6 P-cores (12 threads) + 10 E-cores.


    Given how Volta was shown on a roadmap in early 2013, Jensen really had a plan for how to stay ahead of the pack.
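    A quick sanity check of that core/thread arithmetic, assuming the usual Intel hybrid layout (Hyper-Threaded P-cores at 2 threads each, E-cores at 1 thread each); this is just my own sketch, not from the article:

    ```python
    # Meteor Lake core/thread math from the post:
    # 6 P-cores with Hyper-Threading + 10 E-cores without it.
    p_cores, e_cores = 6, 10

    cores = p_cores + e_cores        # 6 + 10 = 16 physical cores
    threads = p_cores * 2 + e_cores  # 12 + 10 = 22 hardware threads

    print(cores, threads)  # 16 22
    ```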
     
    Last edited: May 30, 2023
  9. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,017
    Likes Received:
    8,650
    GPU:
    2080Ti @h2o
    Good to know, thanks. I guess I missed that one.
    Still have to figure out why I would want one of those VPUs, but at least I know you can't buy a CPU without one.
     
  10. heffeque

    heffeque Ancient Guru

    Messages:
    4,374
    Likes Received:
    184
    GPU:
    nVidia MX150
    More and more programs will use AI. Examples are programs that use your webcam (better background blurring, etc.) or your microphone (better background-noise removal)... plus the obvious ones like Photoshop, video editing, etc... and probably programs we can't even imagine right now will end up using it too.
     

  11. H83

    H83 Ancient Guru

    Messages:
    5,099
    Likes Received:
    2,609
    GPU:
    XFX Black 6950XT
    I'm getting so tired of hearing that everything has, or is going to use, AI...
     
  12. heffeque

    heffeque Ancient Guru

    Messages:
    4,374
    Likes Received:
    184
    GPU:
    nVidia MX150
    Well... get used to it.
    Years ago it was "cloud this, cloud that" and it has now been normalized.
    Before that it was "internet 2.0 this, internet 2.0 that" and now it's a given.
    AI is a breakthrough that is in its infancy (and thus is currently advancing rapidly) and will only get better, more powerful, and gain more use-cases as time goes on.
     
  13. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,017
    Likes Received:
    8,650
    GPU:
    2080Ti @h2o
    Webcam? Don't have one. Microphone? I'm well understood right now (as in, good mic with little background noise), so I don't need that. I do no Photoshop or video editing... so I still don't need one, please thanks mkay? I know it's just my use case, but I don't see the sense in paying for a VPU, cooling a VPU, having one in my power budget while gaming, etc.

    Yeah, I believe you are right. For better or worse, with little to no real gain in what you do with all that stuff, they will push it everywhere they can.
    I still don't upload my pictures to the cloud and don't need cloud space either. Thanks, I can afford my own hard drives.
    I don't need my own webpage, as much as they wanted me to believe it during myspace times.
    And I will live long enough to be the grandpa telling you "I told you we don't need it" just before the nukes from Terminator burn us down.
     
    CrazY_Milojko, Embra and icedman like this.
  14. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,690
    Likes Received:
    4,088
    GPU:
    HIS R9 290
    I too find it a little annoying how much of a buzzword AI has become, but it makes sense considering how rapidly it's growing. It'd be nice if the term were only used where machine learning is actually involved. What I don't get, though, is why the average person would want a dedicated processor for it. I think it makes sense in the server space, but I don't see why a home user couldn't just do FP16 on a GPU (though from what I recall, on Nvidia there's no performance advantage to doing that).
     
  15. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,289
    Likes Received:
    2,302
    GPU:
    Aorus 3090 Xtreme
    I've been waiting for Meteor Lake as my next upgrade, to replace a 10700K for gaming and my 6700K for browsing, light gaming, security etc.
    But the 6700K only uses 57 to 60W idle with a 1080 Ti and is on practically 24/7. I bet a Meteor Lake system won't be able to match that, especially with a modern GPU.
    When on 24/7, every 10W of difference is approx £25 a year in electricity; it adds up, to the point that the Titanium-efficiency PSU has probably paid for itself vs the Gold standard.
    But the PSU is still only around 90% efficient at 75W (10% of the 750W PSU).
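    For anyone who wants to check the £25-a-year figure, here's the arithmetic, assuming a UK tariff of roughly £0.28/kWh (my assumption, not stated in the post):

    ```python
    # Yearly cost of a constant extra power draw on a machine that's on 24/7.
    HOURS_PER_YEAR = 24 * 365   # 8760 hours
    PRICE_PER_KWH = 0.28        # GBP, assumed tariff

    def yearly_cost(extra_watts: float) -> float:
        """Extra electricity cost per year for a constant extra draw."""
        kwh_per_year = extra_watts / 1000 * HOURS_PER_YEAR  # 10 W -> 87.6 kWh
        return kwh_per_year * PRICE_PER_KWH

    print(round(yearly_cost(10), 2))  # ~24.53 GBP, close to the quoted £25
    ```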

    There needs to be a higher efficiency standard than Titanium covering 0 to 9% of max power use, given the size of PSUs we now have to fit and the minimum loads PCs achieve.
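    To put that low-load efficiency in numbers, a rough sketch using the figures in the post: at 90% efficiency, a 75W DC load pulls about 83W from the wall.

    ```python
    # Wall draw implied by a 90%-efficient PSU at a 75 W DC load
    # (10% load on a 750 W unit, as in the post).
    dc_load_w = 75.0
    efficiency = 0.90

    wall_draw_w = dc_load_w / efficiency  # ~83.3 W; ~8.3 W lost as heat

    print(round(wall_draw_w, 1))  # 83.3
    ```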

    I eagerly await Meteor Lake power use tests of motherboards and CPUs.
     
    cucaulay malkin likes this.

  16. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,690
    Likes Received:
    4,088
    GPU:
    HIS R9 290
    Why do you say that? While 60W idle for a 1080 Ti and a K-series CPU is decent, it's still pretty high if efficiency matters to you. Meteor Lake may have a lot more bells and whistles, but it's also much more refined.
     
    cucaulay malkin likes this.
  17. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,205
    GPU:
    AD102/Navi21
    It is quite high; a 10700F + 6800 use about 10-20W altogether. I'd rather not have it running at 60W when the PC is on almost 24/7.
     
    schmidtbag likes this.
  18. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,274
    Likes Received:
    1,644
    GPU:
    EVGA 1070Ti Black
    How is your 6700K + 1080 Ti using that much wattage at idle?
    My 6700K + 1070 Ti barely uses 30 watts combined at idle, and that's with Firefox open on these forums, unless HWiNFO64 and RTSS aren't showing me the full truth.

    Normally both are under 10 watts idle with nothing running other than the Windows desktop.

    I'm interested in this energy efficiency. Can they make a 95W TDP CPU again, one that actually runs around that at max load? Never mind PL2; as it stands, when and if I build a new PC, PL2 will be locked to 10-20% over PL1 or disabled outright.
     
    Last edited: May 30, 2023
  19. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,690
    Likes Received:
    4,088
    GPU:
    HIS R9 290
    That's why I prefer ARM for my home server. I don't know if the entire system has ever pulled more than 15W from the wall. Granted, it's not very fast, though the platform is several years old, so I could now get more than double the performance within the same power envelope.
    However, Mufflore's PC is used as a PC and not a server, so of course I'm not exactly comparing apples to apples here. In any case, it's not hard to make an x86 system sip power, even under load.
     
    rl66 likes this.
  20. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,274
    Likes Received:
    1,644
    GPU:
    EVGA 1070Ti Black
    I've seen other instances where idle power draw is off. My uncle has a 1050 Ti in an i5-8400 (that's a 65W CPU that idles around 10W too). The card should be pulling about 10 watts at idle, but in his system it draws 45 watts. I put the card in another system and it idled around 10 watts, and as far as I can tell both systems' power-saving settings are set up the same. Apparently, what one build pulls at idle doesn't mean a pretty much identical system will pull the same idle wattage.

    My whole idea behind new PC builds is: if consoles can do this (x) with 300 watts or less, I want to do 2x that, preferably for under 400 watts. It's why I like CPUs that max out at 95 watts and why I prefer GPUs under 200 watts maxed out (ideally around 150 watts).

    That was perfectly doable out of the box without messing with the BIOS, at least it was before PL2 became a thing.
     
    XP-200 likes this.
