Intel Alder Lake Core i9-12900K Overclocked to 5.2 GHz on P cores, uses 330 Watts

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 20, 2021.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    42,454
    Likes Received:
    10,269
    GPU:
    AMD | NVIDIA
    TheDigitalJedi and DannyD like this.
  2. Embra

    Embra Maha Guru

    Messages:
    1,206
    Likes Received:
    400
    GPU:
    Vega 64 Nitro+LE
Impressive wattage numbers; it will definitely be the king of power draw.

Add cooling, a motherboard, a hefty PSU, etc... this will be an expensive build.
     
    TheDigitalJedi and tunejunky like this.
  3. moab600

    moab600 Ancient Guru

    Messages:
    6,359
    Likes Received:
    268
    GPU:
    Galax 3080 SC
I bet DDR5 is going to cost more than the CPU itself.
     
  4. tunejunky

    tunejunky Ancient Guru

    Messages:
    1,758
    Likes Received:
    702
    GPU:
    RX 6800 RTX 2070
Only for 32 GB+.
     
    TheDigitalJedi likes this.

  5. FatBoyNL

    FatBoyNL Ancient Guru

    Messages:
    1,590
    Likes Received:
    47
    GPU:
    MSI 2070 Gaming Z
    Oh noes it uses more power when it's overclocked! Overclocking blerghhh :eek:
    Now let's wait for some proper reviews ;)

EDIT: I mean, older systems have been known to suck down more power to start with
     
    TheDigitalJedi likes this.
  6. SamuelL421

    SamuelL421 Master Guru

    Messages:
    221
    Likes Received:
    150
    GPU:
    RTX 5000 / 2080 Ti
Wondering this myself. Also, will first-gen DDR5 even be as good as top-end DDR4? In the past two transitions (DDR2 to DDR3, then DDR3 to DDR4), the top-spec previous gen matched or beat the newer gen at launch.
     
  7. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    755
    Likes Received:
    235
    GPU:
    KFA2 RTX 3090
What? Why and how? That's overclocked 24-core Threadripper territory, and those are 24 real cores.
This is an 8-core CPU; yeah, I don't count the idle (E) cores, why should they count when their only goal is to do very little?
My 9900K was at 190 W in Cinebench R23 at 5.0 GHz all 8 cores at 1.28 V, and 217 W @ 5.1 GHz @ 1.36 V.
     
    Last edited: Oct 20, 2021
  8. TLD LARS

    TLD LARS Master Guru

    Messages:
    255
    Likes Received:
    92
    GPU:
    AMD 6900XT
Actually not too bad. Hilbert's review of the 11900K showed 410 W system usage at 5200 MHz all-core; take away 50-70 W for the rest of the system and efficiency losses, and the 12900K would use a little less power than the 11900K at all-core load.

If the 12900K's all-core performance lands between a 5900X and a 5950X, like most of the leaks suggest, the 12900K is still 30-40% faster than an 11900K for the same amount of power. I do not think Intel has had a generational jump that big in the last 10 years.

With these leaks, everything still sounds too good to be true to me.
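
    A back-of-the-envelope version of that comparison (the 410 W system figure is from the 11900K review mentioned above; the 50-70 W overhead and the 30-40% performance delta are the poster's estimates, not measurements):

    # Rough perf-per-watt comparison based on the numbers in this post.
    # The 50-70 W system overhead and the 30-40% performance delta are
    # the poster's estimates, not measured values.

    system_power_11900k = 410                      # W, full system, 5.2 GHz all-core
    for overhead in (50, 70):                      # W, non-CPU draw + PSU losses
        print(f"11900K CPU alone: ~{system_power_11900k - overhead} W")
    # -> roughly 340-360 W, i.e. slightly MORE than the 12900K's reported 330 W

    # Equal power at 30-40% higher throughput raises perf/W by the same factor.
    for gain in (1.30, 1.40):
        print(f"{gain:.0%} throughput at equal power -> {gain:.2f}x perf/W")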
     
    tunejunky likes this.
  9. user1

    user1 Ancient Guru

    Messages:
    1,826
    Likes Received:
    663
    GPU:
    hd 6870
I'm not very surprised. At the frequencies they are targeting, the power consumption on "7nm" (aka 10nm++++) is about what you would expect: slightly worse than 14nm. It is also a 16-core chip; the 8 E cores might be clocked lower, but they are still going to pull numbers in the ballpark of a lower-frequency Skylake core in AVX workloads. That, together with the big cores at high frequency, is going to pull some serious juice.

The latency will be worse, pretty much guaranteed. The kits being teased aren't anywhere close to the best DDR4 kits you can buy: rough latency on a run-of-the-mill CL19 4000 MT/s kit is about 9.5 ns, while the fastest kit mentioned thus far, CL36 DDR5-6600, equates to about 10.9 ns. Latency isn't everything, but you're definitely not going to see improvements in non-bandwidth-bound workloads for the most part, especially on the early platforms: Alder Lake is basically going to be forced to run the IMC at 1/2 rate with DDR5, and possibly 1/4 rate for extreme frequencies, both of which carry an exceedingly severe latency hit.
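
    For reference, those first-word latency figures come straight from the CAS formula: CL cycles at the memory I/O clock, which runs at half the transfer rate. A quick sketch reproducing the numbers above:

    # First-word CAS latency in nanoseconds. The I/O clock is half the
    # transfer rate, since DDR moves two transfers per clock.
    def cas_latency_ns(cl: int, mt_s: int) -> float:
        return cl / (mt_s / 2) * 1000

    print(f"DDR4-4000 CL19: {cas_latency_ns(19, 4000):.1f} ns")  # 9.5 ns
    print(f"DDR5-6600 CL36: {cas_latency_ns(36, 6600):.1f} ns")  # ~10.9 ns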

However, it will be a big win for integrated graphics: at 6600 MT/s in dual channel, that gives you about 105.6 GB/s, not far off from something like a GTX 1650, especially since the latency is a lot lower than your typical GDDR.
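
    The 105.6 GB/s figure follows from the bus width: two 64-bit channels at 6600 MT/s. A minimal sketch (for scale, the GTX 1650 the poster compares against is rated at 128 GB/s in its GDDR5 version):

    # Peak theoretical bandwidth: transfers/s x bytes per transfer x channels.
    def peak_bandwidth_gb_s(mt_s: int, bus_bits: int = 64, channels: int = 2) -> float:
        return mt_s * 1e6 * (bus_bits / 8) * channels / 1e9

    print(f"DDR5-6600 dual channel: {peak_bandwidth_gb_s(6600):.1f} GB/s")  # 105.6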
     
    tunejunky, SamuelL421 and Keitosha like this.
  10. tty8k

    tty8k Master Guru

    Messages:
    638
    Likes Received:
    164
    GPU:
    3070
    Looks "normal" to me.
    While not ideal I didn't expect any less given the conditions.

    It's a 5.2GHz all cores (P cores) @ 1.385v
    Small cores running default at 3.7GHz
     

  11. Glottiz

    Glottiz Master Guru

    Messages:
    601
    Likes Received:
    198
    GPU:
    TUF 3080 OC
These are funny numbers to see now that Apple has announced 400 GB/s memory in the M1 Max and claims its "integrated" graphics will compete with an RTX 3070/3080.
     
  12. user1

    user1 Ancient Guru

    Messages:
    1,826
    Likes Received:
    663
    GPU:
    hd 6870
I wouldn't consider the M1 Max comparable, since it's going to cost like 3k+ and uses HBM memory. Like comparing a Lada to a Ferrari.
     
    tunejunky, Embra and butjer1010 like this.
  13. suty455

    suty455 Master Guru

    Messages:
    488
    Likes Received:
    183
    GPU:
    Nvidia 3090
Wow! Let's just digest those numbers: 330 W just for the CPU. That's a piece of silicon, say 4 cm x 4 cm (a guess), while a whole 3090 uses just 20 W more. And this is supposed to be the pinnacle of design right now, as it's the most current CPU about to be launched. What kind of cooling is that going to take in an average system? It's not much change from current Intel systems; yes, it's great for those headline figures, but c'mon, 330 W for 8 cores!
I think AMD has those stacked CPUs just sat waiting to spoil this party, at lower consumption and heat. But hey, let's see what happens on launch day; at least Intel is able to push AMD, which is great, otherwise they would stagnate.
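
    To put the cooling question in numbers: what matters is heat flux through the die, which is much smaller than the ~4 cm x 4 cm package guessed above. A sketch assuming the roughly 210 mm² die size reported for the 8+8 Alder Lake die (an assumption, not a figure from this thread):

    # Approximate heat flux through the CPU die.
    # The ~210 mm^2 die area is an assumed figure from early Alder Lake reports.
    power_w = 330
    die_area_cm2 = 210 / 100            # mm^2 -> cm^2

    print(f"~{power_w / die_area_cm2:.0f} W/cm^2")  # ~157 W/cm^2 through ~2.1 cm^2 of silicon

    A big GPU spreads a similar wattage over roughly three times the die area, which is part of why a 350 W 3090 is easier to cool than a 330 W CPU.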
     
  14. butjer1010

    butjer1010 Active Member

    Messages:
    67
    Likes Received:
    25
    GPU:
    RX6900XT Toxic EE
I don't think the majority here knows what a Lada is! :D:D:D I'm from Croatia, so I know (my father had a Samara in 1989)......
     
    user1 and tunejunky like this.
  15. alanm

    alanm Ancient Guru

    Messages:
    10,674
    Likes Received:
    2,772
    GPU:
    Asus 2080 Dual OC
Overclocking for practical purposes is dead and has been for the last few years. Both AMD and Intel have squeezed every last bit of headroom out of their chips as boost; beyond that, power draw goes through the roof for very little performance gain. The days of Sandy Bridge have long been over, folks, time to move on.
     
    tunejunky and Embra like this.

  16. kapu

    kapu Ancient Guru

    Messages:
    5,027
    Likes Received:
    538
    GPU:
    Radeon 6800
Actually, there are real gains on the AMD side without a wattage sacrifice.
You can get an extra 10% performance out of any Zen 3 right now.
     
  17. asturur

    asturur Maha Guru

    Messages:
    1,162
    Likes Received:
    374
    GPU:
    Geforce Gtx 1080TI
Apple announced 400 GB/s for the 32-core GPU and 200 GB/s for the 16-core GPU, with no mention of the 24-core option.
That makes me think that bandwidth is achieved with multiple memory controllers/channels, depending on the GPU.

I won't compare an ARM SoC with an x86 CPU in general; those are entirely different technologies.
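
    That reading matches the reported memory configs: LPDDR5-6400 on a 512-bit bus for the M1 Max and a 256-bit bus for the M1 Pro (reported specs, used here as assumptions). The same peak-bandwidth arithmetic as the DDR5 sketch earlier in the thread gives:

    # Peak bandwidth = transfer rate x bus width in bytes.
    # LPDDR5-6400 and the 512/256-bit widths are reported M1 specs (assumptions here).
    def peak_bw_gb_s(mt_s: int, bus_bits: int) -> float:
        return mt_s * 1e6 * bus_bits / 8 / 1e9

    print(f"512-bit bus: {peak_bw_gb_s(6400, 512):.1f} GB/s")  # 409.6 -> marketed 400 GB/s
    print(f"256-bit bus: {peak_bw_gb_s(6400, 256):.1f} GB/s")  # 204.8 -> marketed 200 GB/s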
     
    tunejunky likes this.
  18. Timur Born

    Timur Born Active Member

    Messages:
    57
    Likes Received:
    23
    GPU:
    Nvidia 2070S 8 GB
Well..... You can increase multi-core performance some by lifting clocks on the "worse" cores, and even then a -30 curve optimizer setting still does not lift the worst cores up to the best cores' stock levels. But headroom on the "best" cores is rather limited, which limits single-core boosts.

My best 5900X core allows a curve optimizer setting of -8, with -10 being definitely unstable, and even -8 might need more testing. And then you theoretically have to test each core for stability and still don't know for sure; it's a gamble for anyone not just interested in gaming.
     
  19. tty8k

    tty8k Master Guru

    Messages:
    638
    Likes Received:
    164
    GPU:
    3070
There is another test with the CPU pushed to 5.3 GHz:

    "The Intel Core i9-12900K CPU consumed an insane 400W of power in the AIDA64 AVX stress test at full load while running at 5.3 GHz (1.44V) and close to 300W in the standard SSE CPU-z test. It is also reported that the AIDA64 stress test only lasted for a minute before the system shuts itself off since the temperatures were out of control even when the radiator was submerged in ice to provide more cooling to the CPU."

It clearly doesn't scale well with voltage, which is to be expected.
I'd rather see an undervolted version running 5 GHz at less than 200 W.
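
    The "to be expected" part is the classic dynamic-power relation P ≈ C·V²·f: linear in frequency, quadratic in voltage, leakage ignored. A sketch checked against the 9900K figures posted earlier in this thread; the 1.25 V undervolt target is a hypothetical value for illustration:

    # Dynamic CPU power scales roughly as P ~ C * V^2 * f (leakage ignored).
    def scale_power(p0: float, f0: float, v0: float, f1: float, v1: float) -> float:
        return p0 * (f1 / f0) * (v1 / v0) ** 2

    # Sanity check against the 9900K numbers from post #7:
    # 190 W @ 5.0 GHz / 1.28 V -> predicted draw at 5.1 GHz / 1.36 V
    print(f"{scale_power(190, 5.0, 1.28, 5.1, 1.36):.0f} W")  # ~219 W vs 217 W measured

    # The undervolt wished for above: 400 W @ 5.3 GHz / 1.44 V down to 5.0 GHz
    # at a hypothetical 1.25 V.
    print(f"{scale_power(400, 5.3, 1.44, 5.0, 1.25):.0f} W")  # ~284 W, still well over 200 W

    By this model alone, sub-200 W at 5 GHz would also need a big cut in leakage, not just lower voltage.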
     
  20. Dimitrios1983

    Dimitrios1983 Master Guru

    Messages:
    294
    Likes Received:
    80
    GPU:
    AMD RX560 4GB
    Lada a car that is made like a soda can and under 5MPH the whole windshield pops out and if you crash that car at 30MPH you're dead. A car that handles like crap too.

    I'm from America and quickly learned about Lada's after watching a few crash video's on LEAKEDREALITY!
     
    butjer1010 likes this.
