Intel Core i9-13900K could get extreme performance mode at 350 Watt TDP

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 18, 2022.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    44,669
    Likes Received:
    11,345
    GPU:
    AMD | NVIDIA
  2. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    4,644
    Likes Received:
    5,438
    GPU:
    RTX 2070 Super
    Good thing energy prices are going down. Oh wait. It's the exact opposite.

     
    Ivrogne, Airbud, carnivore and 5 others like this.
  3. Embra

    Embra Maha Guru

    Messages:
    1,342
    Likes Received:
    487
    GPU:
    6800xt Nitro+
    That is certainly extreme.
     
  4. nizzen

    nizzen Ancient Guru

    Messages:
    2,178
    Likes Received:
    869
    GPU:
    3x3090/3060ti/2080t
     This isn't even extreme :)
     My old 3900x could draw 350w easily when overclocked on water. The 5900x and 5950x too.
     My 6-year-old 7980xe was drawing 700w+ on cold water :D (still going strong)

     I'm ready, whatever the power draw is :cool:
     

  5. Glottiz

    Glottiz Master Guru

    Messages:
    696
    Likes Received:
    311
    GPU:
    TUF 3080 OC
    People who are buying computer parts like 13900K and RTX4090 don't care about power usage and can afford it.
     
    Falkentyne, Airbud and nizzen like this.
  6. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    6,102
    Likes Received:
    3,341
    GPU:
    RTX 3060 Ti
    who cares about that abomination.
     
    fredgml7 and mohiuddin like this.
  7. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,382
    Likes Received:
    924
    GPU:
    .
  8. ocsystem

    ocsystem Member Guru

    Messages:
    185
    Likes Received:
    12
    GPU:
    MSI1070Ti
     Seems off topic, but does 13th gen fully support AVX-512, or is that missing on this one as well?

    edit: yep
     
    Last edited: Aug 19, 2022
  9. kanenas

    kanenas Master Guru

    Messages:
    411
    Likes Received:
    315
    GPU:
    rtx 2070 ,5700xt
     There is no comparison; the processors you mentioned have 12 cores or more. I can't imagine how much power a 13900k would draw if it had 12 or more "big" cores.
     
  10. nizzen

    nizzen Ancient Guru

    Messages:
    2,178
    Likes Received:
    869
    GPU:
    3x3090/3060ti/2080t
     Be prepared, guys! For gaming, people will run a "stock" CPU, but VERY fast memory ;)

    Hynix A-die is here :D:cool:
     

  11. Fediuld

    Fediuld Master Guru

    Messages:
    722
    Likes Received:
    413
    GPU:
    AMD 5700XT AE
     The 3900X's 350W is for the WHOLE SYSTEM, not just the CPU.
     Even a manually OC'd 5900X doesn't exceed 236W in a multi-core test, and at stock it stays under 186W.

     The 13900K hits 350W on its own.
     
    ~AngusHades~, HandR and Embra like this.
  12. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    6,815
    Likes Received:
    3,190
    GPU:
    HIS R9 290
    Well, the 13900K supposedly has 24 cores, though only 8 are P-cores. So, those E-cores sure must be working hard too because I don't get how else the power draw could be that high.

     Let's not forget the 13900K's TDP is stock. Very high power draw is to be expected from a manually OC'd chip, because you're pushing it to its limits and you have to give yourself some voltage headroom to retain stability. In some cases, people might be doing an all-core OC, which will also draw more power. So, it actually is a bit extreme for a desktop chip to consume that much power from the factory, when it most likely doesn't boost all cores at max speed, let alone for much longer than a minute.
     
    Fediuld likes this.
  13. Dazz

    Dazz Master Guru

    Messages:
    993
    Likes Received:
    125
    GPU:
    ASUS STRIX RTX 2080
     You do realize that's impossible. Even with 1.4V and an all-core overclock of 4.9GHz you are looking at about 270W, so where has the other 70W gone? Even Buildzoid, when he talks about the power phases on X570, puts power delivery at around 300W; no Ryzen, not even a 5950X, can ever pull that much power, even on LN2. So 350W on water is mythical at best.
     
    Fediuld likes this.
  14. nizzen

    nizzen Ancient Guru

    Messages:
    2,178
    Likes Received:
    869
    GPU:
    3x3090/3060ti/2080t
    Have you tested yourself? Try harder ;)
     
  15. TLD LARS

    TLD LARS Master Guru

    Messages:
    417
    Likes Received:
    163
    GPU:
    AMD 6900XT
     I am stuck at a max of 140W on my 5800x, because going higher simply doesn't make sense: the performance isn't there unless you're running an all-core workload.
     Performance/voltage scaling is terrible above those speeds.
     Doubling 8 cores at 140W gives 16 cores at 280W, so I am also missing 70W, just like Dazz notes.
     Skatterbencher gets around 270W.
     Buildzoid gets 270W.
     Gamers Nexus stops at 253W.
     Hardware Unboxed says it used 140W more than stock when overclocked, so around 260-300W (370W system power measured).
     Der8auer stops below 270W too.
     Should they all try harder?
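     The core-doubling arithmetic in the post above can be sketched in a few lines. This assumes power scales linearly with core count at fixed clocks and voltage, which is a deliberate simplification (real all-core scaling is worse); the 8-core/140W figure is the 5800x number from the post.

```python
# Naive linear estimate: power scales with core count at fixed
# clocks/voltage. A back-of-the-envelope check, not a real model.
def linear_scale(watts_base, cores_base, cores):
    """Watts-per-core from a baseline, multiplied out to a new core count."""
    return watts_base / cores_base * cores

estimate_16c = linear_scale(140, 8, 16)
print(estimate_16c)  # 280.0 -- roughly 70 W short of a 350 W figure
```

     Even this generous linear assumption lands at 280W, which is why the reviewers' ~270W ceilings make the 350W-on-water claim look implausible.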
     
    Dazz and Fediuld like this.

  16. nizzen

    nizzen Ancient Guru

    Messages:
    2,178
    Likes Received:
    869
    GPU:
    3x3090/3060ti/2080t
    Yes, they didn't go all in, but they didn't go in for it either ;)
     
  17. tty8k

    tty8k Master Guru

    Messages:
    856
    Likes Received:
    286
    GPU:
    3070 / 6900xt

     How many FPS in AIDA64?
     Because in games, it's more like this for those with that kind of budget:

     (attached benchmark screenshot)

     :p Nizzen
     
    ~AngusHades~ and TLD LARS like this.
  18. nizzen

    nizzen Ancient Guru

    Messages:
    2,178
    Likes Received:
    869
    GPU:
    3x3090/3060ti/2080t
    Overclockers understand that the clock manipulation techniques employed by Intel for the past several generations are overclocking: Our overclocks have employed power limits far beyond 350W for as long as this type of power management has existed.

     Given that Z690 voltage regulators support loads in the thousands of watts, the leak that “only Z790 will support the new power mode” appears designed to prepare critics to think uncritically when they see a Z790 motherboard working upcoming 13th-gen Intel Core CPUs harder than when that same processor is mounted on a Z690. Critical thinkers must remember to hold manufacturers of Z690 motherboards responsible for any performance shortcomings, rather than credit Intel for magically “enabling” a feature on Z790 that Z690 already had the hardware to support.

    https://www.pcinq.com/tech-press-fa...hwFzqd1Djmj8EafrsWz6FZ2QLsFhQIB2sZiEmA8r1Gb34
     
  19. nizzen

    nizzen Ancient Guru

    Messages:
    2,178
    Likes Received:
    869
    GPU:
    3x3090/3060ti/2080t
    I use 3440x1440 165hz and 1080p 360hz and 4k 120hz. It depends on the game ;)
     
  20. user1

    user1 Ancient Guru

    Messages:
    2,159
    Likes Received:
    857
    GPU:
    hd 6870
     That's quite easy to explain, actually: the big 8 cores are probably going to be running in excess of 5.2GHz all-core. This isn't on a die shrink, so more speed = exponentially more power. A reminder that this is still 10nm +++++(+?), despite what Intel's marketing department would have you believe. Alder Lake at >5.2GHz all-P-core is essentially uncoolable, even though there are chips that theoretically can do it, so I wonder what kind of tricks they have up their sleeves.

     The 12900KS has an all-P-core clock of 5.2GHz and the system draw exceeds 425W, so a 350W TDP is believable, especially since the new chip might have more cores. I should mention that the 12900KS is almost uncoolable: the guys at OC3D used a 360 rad with 3000 RPM Noctua fans at 100%, and the temps still hit 100C!

    https://www.overclock3d.net/reviews/cpu_mainboard/intel_core_i9-12900ks_review/16
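     The "more speed = exponentially more power" point follows from the usual dynamic-power relation P ≈ C·V²·f: frequency enters linearly, but chasing higher clocks also demands more voltage, which enters squared. A minimal sketch; the capacitance constant and the frequency/voltage pairs are illustrative guesses, not measured 13900K values.

```python
# Dynamic CMOS power scales roughly as C * V^2 * f. All numbers below
# are made up for illustration -- only the shape of the curve matters.
def dynamic_power(c_eff, volts, freq_ghz):
    """Relative dynamic power for a given effective capacitance, voltage, clock."""
    return c_eff * volts ** 2 * freq_ghz

C_EFF = 10.0  # arbitrary effective switched capacitance (illustrative)
for freq, volts in [(4.9, 1.20), (5.2, 1.30), (5.5, 1.42)]:
    rel = dynamic_power(C_EFF, volts, freq)
    print(f"{freq} GHz @ {volts:.2f} V -> relative power {rel:.1f}")
```

     A ~12% clock bump paired with a voltage bump costs far more than 12% extra power, which is why the last few hundred MHz of all-core boost are so expensive to cool.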
     
