Intel reportedly delayed 10th gen desktop CPUs due to high (300 W) power consumption

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 10, 2020.

  1. RooiKreef

    RooiKreef Guest

    Messages:
    410
    Likes Received:
    51
    GPU:
    MSI RTX3080 Ventus
    Intel should just stop chasing the GHz game and focus on how they can be more efficient. The 14 nm node is really biting them hard atm.
     
    angelgraves13 likes this.
  2. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    Honestly, using TDP to compare products of any kind is pointless. Gamers Nexus had an excellent video on it; AMD's TDP number doesn't even use electrical power as part of the calculation.
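    For reference, the formula in question is thermal, not electrical. A minimal sketch of it in Python, with illustrative inputs rather than official AMD specs:

    # AMD's TDP as described in the Gamers Nexus video:
    #   TDP (W) = (T_case_max - T_ambient) / HSF_theta_ca
    # where HSF_theta_ca is the cooler thermal resistance (C/W) AMD assumes.
    # Note there is no electrical power term anywhere in the equation.

    def amd_tdp(t_case_max_c, t_ambient_c, hsf_theta_ca):
        """Thermally-defined TDP in watts."""
        return (t_case_max_c - t_ambient_c) / hsf_theta_ca

    # Illustrative inputs (my assumptions): 61.8 C case limit, 42 C ambient,
    # 0.189 C/W cooler -> about 105 W.
    print(round(amd_tdp(61.8, 42.0, 0.189)))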
     
  3. isidore

    isidore Guest

    Messages:
    6,276
    Likes Received:
    58
    GPU:
    RTX 2080TI GamingOC
    Oh snap, I guess those high frequencies do come at a cost.
     
  4. BLEH!

    BLEH! Ancient Guru

    Messages:
    6,402
    Likes Received:
    421
    GPU:
    Sapphire Fury
    Nonetheless a brown trouser moment, yes xD
     

  5. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    Can I get a link to that video if you have one? Not trying to argue with you. I'm just interested in seeing it.

    Tomshardware has an obvious anti-AMD bias and always has. However, we must be looking at completely different reviews, since the one I'm looking at shows the 3700X consuming 82-123 watts. In fact, below are all of the power consumption charts from the 3700X review over at Tomshardware... They didn't even make it halfway to 300 watts...

    [Images: power consumption charts from the Tomshardware 3700X review]
     
  6. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    987
    Likes Received:
    370
    GPU:
    KFA2 RTX 3090
    I'm not at home, so I can't link you my screenshot from yesterday, but here's what an actual owner of both a 9900K and a 3960X TR can tell you:

    1) Processors' advertised TDPs are fantasy; if you even mention them, anything else you write becomes worthless.
    2) Don't mix up total system power with CPU package power; they are two very different things (see the sketch at the end of this post).
    3) My 9900K @ 5.1 GHz / 1.36 V used around 180 watts alone; total system power while stress-testing with the CPU and GPU (1080 Ti) maxed was around 600 watts.
    4) A 9900K downclocked to 4.2 GHz used only 60 watts in the same test!
    5) A 24-core TR 3960X, 100% stock, used 287 watts ALONE; total system power while stress-testing with the CPU and GPU (1080 Ti) maxed was 770-840 watts! (I had an 850 W PSU for the 9900K; I bought a 1200 W one for the TR.)
    6) My 9900K's CPU package at idle used around 20 watts; the TR uses around 80 watts (simply surfing the web, my PSU reports 250 watts on the AMD build, which is enough to activate the PSU fan, something I never saw on the Intel).
    7) The dual-fan Noctua TR4 cooler is rated for 180 watts and the be quiet! Dark Rock Pro TR4 for 250 watts; both are already insufficient for the base 24-core TR. Needless to say, you can FORGET using them on a 64-core 3990X, unless you think it's OK to have a 2400 RPM fan running at full speed in a 5-10k desktop computer. I'm not.
    8) AIOs are the same; none is rated for 287+ watts, so none of them is good for TR4.
    9) Open-bench review build temperatures do not represent a real-life closed case.
    10) A fully populated TRX40 motherboard (3 M.2 + 6 SATA drives, all memory slots occupied, etc.) generates a ton more heat inside your case than a "gamer" setup like my Z390 9900K; there's a reason AMD put fans on the chipset. My PCH is at 60°C most of the time, and I have 8-10 case fans.

    Being simply "OK" and being "good" or even "adequate" (working reliably and acceptably, with no throttling) are two different things.
    I was "OK" with an H115i RGB Platinum AIO: I could encode videos for 8 hours and the CPU temp wouldn't go above 90°C, but only with the fans and pump maxed, and of course the CPU clock dropped to around 4.0 GHz (I didn't really check, frankly). Sitting next to a full-fans machine encoding all day long isn't a pleasant experience.
    So... I ordered custom loop parts and am rebuilding my AMD case this weekend. I could have bought a 32-core but didn't, because I guessed the 24-core would already be a nuclear reactor, and I was right.

    p.s.
    I can run the Kombustor stress test while doing Cinebench R20 runs without the computer crashing (no way with the 9900K), and I still managed a 12k score, lol. Oh yeah, I was surfing the web at the same time, rofl. Threadripper is fun ^^
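    To make point 2 concrete, here's a quick sketch in Python of why total system power and CPU package power differ so much; all the numbers below are assumptions loosely based on the figures in this post:

    # Toy breakdown: CPU package power vs. power at the wall.
    cpu_package_w  = 287     # stock 3960X under a stress test (from above)
    gpu_w          = 250     # 1080 Ti at full load (assumption)
    rest_w         = 80      # board, RAM, drives, fans, pump (assumption)
    psu_efficiency = 0.90    # roughly Gold-rated at this load (assumption)

    dc_load_w   = cpu_package_w + gpu_w + rest_w
    wall_draw_w = dc_load_w / psu_efficiency
    print(dc_load_w, round(wall_draw_w))   # -> 617 (DC load), 686 (at the wall)

    That's why an 850 W PSU gets uncomfortably tight once the CPU alone sits near 300 W.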
     
    Last edited: Jan 11, 2020
    Mesab67 likes this.
  7. butjer1010

    butjer1010 Active Member

    Messages:
    93
    Likes Received:
    46
    GPU:
    RX6900XT Toxic EE
    I can confirm that; it's true!
    I've had terrible experiences on the Tomshardware forum, only because of being an "AMD fanboy". 13 years ago, there were only two of us :) (that's a joke of course, but it wasn't far from the truth).
     
  8. TLD LARS

    TLD LARS Master Guru

    Messages:
    767
    Likes Received:
    362
    GPU:
    AMD 6900XT
    To be fair, a 9900K @ 5.1 GHz / 1.36 V at 180 watts is silicon-lottery top spec, and your CPU is better than a lot of the 9900K chips you can buy today.
    Gamers Nexus measures 200 W on the 12 V power cables with a stock but power-unlimited 9900KS.
    In my opinion, a comparison with your CPU isn't really indicative of "normal" parts; if a similar 9900K were sold today, it would be branded a 9900KS.
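    For context on how those cable measurements relate to package power, here's a quick sketch of the arithmetic in Python (the current and efficiency figures below are my assumptions, not GN's actual readings):

    # Power at the EPS12V cables = rail voltage x clamped current.
    eps_voltage_v     = 12.0
    clamped_current_a = 16.7    # hypothetical clamp-meter reading
    vrm_efficiency    = 0.90    # assumed; VRM losses mean less reaches the die

    cable_power_w   = eps_voltage_v * clamped_current_a   # ~200 W in the cables
    package_power_w = cable_power_w * vrm_efficiency      # ~180 W into the CPU
    print(round(cable_power_w), round(package_power_w))   # -> 200 180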
     
  9. BReal85

    BReal85 Master Guru

    Messages:
    487
    Likes Received:
    180
    GPU:
    Sapph RX 570 4G ITX
    I'm just curious whether the remaining small number of hardcore Intel fans will admit Intel has made egg-frying CPUs like AMD did with the FX (tbh, this was already true of the Coffee Lake i7).
     
  10. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,450
    Likes Received:
    2,545
    GPU:
    TUF 6800XT OC
    I don't think there has been a dead-on-arrival CPU like this one in the last 10 years...
    Even the FX CPUs, as inefficient as they were, brought more cores (but unfortunately, software at the time was way behind the multi-core revolution).

    But these "10th gen" 10-cores are completely obsoleted by a product that launched 6 months ago (the 3900X).

    What Intel should do is accept defeat on this front and cut 9th gen prices to parity (so the 9900K would sit at the price level of the 3800X, ~$380); then they would sell decently well!
    And then focus all efforts and investments on the new architecture and their 7 nm process (or whatever will get them to cut power in half).

    This 300 W monstrosity has no place on the market, just like AMD's FX-9590, which should not have existed.
     
    Solfaur, Jagman and butjer1010 like this.

  11. Jagman

    Jagman Ancient Guru

    Messages:
    2,264
    Likes Received:
    328
    GPU:
    16GB RX6800
    ^ Couldn't agree more, mate. I think Intel are going all Pentium 4 on us again :(
     
  12. Mesab67

    Mesab67 Guest

    Messages:
    244
    Likes Received:
    85
    GPU:
    GTX 1080
    It would be interesting if you could let us know how the custom CPU block and loop compare to the AIO you were previously using, especially on a new Threadripper. Most AIO CPU block cooling fins (the critical component) don't cover the full CPU (irrespective of the block plate); custom Threadripper CPU blocks, however, are supposed to have full cooling-fin coverage, e.g. the Heatkiller IV PRO, XSPC Raystorm NEO, EK, etc.
     
    Last edited: Jan 12, 2020
  13. Pilz

    Pilz Guest

    Messages:
    8
    Likes Received:
    0
    GPU:
    1080 FTW @ 2.2Ghz w/EKWB
    Are you dumb? They don't pull 300 W+. I own both a Gigabyte X570 Aorus Xtreme with a 3900X and a Gigabyte X570 Master with a 3700X. The most I've ever pulled with a per-CCX OC was 240 W on the 3900X. Stop trying to deny that this 12C/24T CPU is more efficient than the garbage Intel has put out lately. For the record, I've owned two 9900Ks, and they routinely pulled ~250-300 W overclocked. Stock, my 9900K could pull 200 W, while my 3900X will only hit 150-170 W.
    Edit:

    Pictures proving the above.

    The 9900K was paired with an Asus Z390 Maximus XI Extreme, Bitspower monoblock, Alphacool NexXxoS 480 mm x 60 mm rad in push/pull with Corsair ML120 fans at 1900 RPM, and an Aquacomputer D5 Next pump.

    The 3900X is paired with a Gigabyte X570 Aorus Xtreme, EK Velocity block, HW Labs GTS 360 mm x 30 mm radiator in push only, Lian Li Bora BR Digital fans at 1700 RPM, and an Aquacomputer D5 Next pump.

    Temps on my 3900X will drop another 5-6°C once I get my Optimus Foundation block, since it's the only one designed properly for Ryzen 3000.
    https://photos.app.goo.gl/TJWnJWQSLU4mg11o6
     
    Last edited: Jan 12, 2020
  14. Pilz

    Pilz Guest

    Messages:
    8
    Likes Received:
    0
    GPU:
    1080 FTW @ 2.2Ghz w/EKWB
    Optimus makes the best CPU blocks for AMD. They're a new company based out of Chicago, but their products are game-changing compared to the crap EK has been making lately. I should know, since I own 7-8 EK blocks.
    Here's their website: https://optimuspc.com/


    Their claims are accurate; no-nonsense, functionality-first design really is game-changing.
     
  15. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,780
    Likes Received:
    1,393
    GPU:
    黃仁勳 stole my 4090
    How, when I've never managed to hit even 300 W with my 3900X, even with stupid amounts of voltage and current running through it when OC'd? That's absolutely made-up BS.

    Edit: I just ran Cinebench R15 multithreaded with the 3900X set to PBO, and it peaked at 130 W according to Ryzen Master.

    A 300 W draw on just the CPU, at stock, would be more than a problem. It'd be your PC overheating until it ionizes the gases around it, collapses into a star, and then goes mini-nova. I want whatever mushrooms you're on; don't be greedy, share.
     
    Last edited: Jan 12, 2020
    carnivore, wavetrex and bernek like this.

  16. bernek

    bernek Ancient Guru

    Messages:
    1,632
    Likes Received:
    93
    GPU:
    Sapphire 7900XT
    Or Greed Lake, where they've been bathing for years...

    Like someone said, Intel was killed in 2019.

    I would still buy a cheap 9900K just for the fun of building a Hackintosh machine with the Vega 64 I have. I'll just wait for prices to be slashed again with Ryzen 4xxx; then I'll get one for ~$200-250.
     
  17. bernek

    bernek Ancient Guru

    Messages:
    1,632
    Likes Received:
    93
    GPU:
    Sapphire 7900XT
    I kinda like these kinds of people; they keep the fire burning, like Intel :p
     
  18. Mesab67

    Mesab67 Guest

    Messages:
    244
    Likes Received:
    85
    GPU:
    GTX 1080
    Please link to the page, then quote the line that tells you a 3700X can pull 300 W... just to clarify any confusion.
     
    Last edited: Jan 12, 2020
  19. Mesab67

    Mesab67 Guest

    Messages:
    244
    Likes Received:
    85
    GPU:
    GTX 1080
    Just checked out the website, and their Optimus Threadripper 3 block looks very interesting, primarily in that the cooling fins look to extend well beyond the dies underneath (chiplets and IO), to almost the full physical CPU size. https://optimuspc.com/products/absolute-cpu-block-threadripper-3. It's on preorder though (release date? made to order?). This looks significant since, for any CPU, the full surface area (not just the die area) takes part in heat exchange; if current blocks can't capture 100% of the generated heat, optimal cooling will suffer, within reason (diminishing returns). This is in marked contrast to the competitors above, whose fin areas extend only to the die edge area underneath (though they're all still superior to non-custom alternatives). Interestingly, would the current custom blocks fully cover the extended die area of the 3990X, with its four additional chiplets? I have my suspicions.
    We do need independent, validated benchmark results, of course; hope Guru3D can test this out.
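    As a back-of-the-envelope illustration of why fin coverage matters, here's a Python sketch comparing heat flux when the exchange area is only the dies versus the full heatspreader; all areas are my rough guesses, not measured die sizes:

    # Heat flux = power / effective heat-exchange area.
    cpu_power_w  = 287.0            # stock 3960X package power, from earlier in the thread
    die_area_mm2 = 4 * 74 + 416     # 4 chiplets + IO die (rough guesses)
    ihs_area_mm2 = 68 * 51          # full sTRX4 heatspreader (rough guess)

    print(round(cpu_power_w / die_area_mm2, 2))   # ~0.40 W/mm^2 over the dies alone
    print(round(cpu_power_w / ihs_area_mm2, 2))   # ~0.08 W/mm^2 over the full IHS
    # The more of that area a block's fin array actually covers, the lower the
    # flux each fin channel has to carry.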
     
    Last edited: Jan 12, 2020
  20. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
    I did, thus my comment. I actually looked to see if you had edited the story or if I had missed a link. There's no 'source link', and your source says 'rumor/chatter'. I can read.
     
