Intel reportedly delayed 10th gen desktop CPUs due to high (300W) power consumption

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 10, 2020.

  1. bernek

    bernek Ancient Guru

    Messages:
    1,524
    Likes Received:
    48
    GPU:
    2080TI/1080TI/VEGA
    I kinda like these kinds of people; they keep the fire burning, like Intel :p
     
  2. Mesab67

    Mesab67 Master Guru

    Messages:
    211
    Likes Received:
    73
    GPU:
    GTX 1080
    Please link to the page, then quote the line that tells you a 3700X can pull 300W... just to clarify any confusion.
     
    Last edited: Jan 12, 2020
  3. Mesab67

    Mesab67 Master Guru

    Messages:
    211
    Likes Received:
    73
    GPU:
    GTX 1080
    Just checked out the website, and their Optimus Threadripper 3 block looks very interesting, primarily in that the cooling fins extend well beyond the dies underneath (chiplets and IO), to almost the full physical size of the CPU: https://optimuspc.com/products/absolute-cpu-block-threadripper-3. It's on preorder though (release date? made to order?). This looks significant because, for any CPU, the full surface area (not just the die area) takes part in heat exchange; if current blocks can't capture all the heat being generated across that surface, cooling will be less than optimal, within the limits of diminishing returns. That's in marked contrast to the competitors above, whose fin areas extend only to the edge of the die area underneath (though they're all still superior to non-custom alternatives). Interestingly, would the current custom blocks fully cover the extended die area seen in the 3990X, with its four additional chiplets? I have my suspicions.
    We do need independent, validated benchmark results of course - hope Guru3D can test this out.
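    Below is a minimal back-of-the-envelope sketch of the surface-area argument, assuming a simple Q = h·A·ΔT convection model; the coefficient and both areas are illustrative guesses, not measurements of any of the blocks mentioned.

```python
# Rough sketch: heat removed scales with effective exchange area
# (Q = h * A * dT), so fins spanning the full heat spreader rather
# than just the die footprint lower the block's thermal resistance.
# h and both areas below are assumed figures for illustration only.

h = 20_000.0        # assumed effective transfer coefficient, W/(m^2*K)
die_area = 8e-4     # ~800 mm^2 chiplet + IO footprint (assumed)
full_area = 24e-4   # ~2400 mm^2, closer to the full TR4 IHS (assumed)

for name, area in (("die-footprint fins", die_area),
                   ("full-IHS fins", full_area)):
    r = 1.0 / (h * area)  # convective thermal resistance, K/W
    print(f"{name}: {r:.4f} K/W -> {300 * r:.1f} K rise at 300 W")
```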
     
    Last edited: Jan 12, 2020
  4. HeavyHemi

    HeavyHemi Ancient Guru

    Messages:
    6,718
    Likes Received:
    806
    GPU:
    GTX1080Ti
    I did, thus my comment. I actually looked to see whether you had edited the story or I had missed a link. There's no 'source link', and your source says 'rumor/chatter'. I can read.
     

  5. warlord

    warlord Ancient Guru

    Messages:
    2,825
    Likes Received:
    945
    GPU:
    Null
    Intel knows better. Even if they said 600W, fanboys would blame incompetent motherboard makers. Mark my words: those who support the golden poop will never make proper purchases, not on Earth nor on other planets in the distant future. No hope for them.
     
  6. Mesab67

    Mesab67 Master Guru

    Messages:
    211
    Likes Received:
    73
    GPU:
    GTX 1080
    The source link is active (https://www.computerbase.de/2020-01/intel-comet-lake-s-mainboard-cpu/) and the page does state: "Several motherboard manufacturers revealed that the ten-core breaks the 300-watt mark at maximum load." Yes, it's correctly reported as chatter/rumour, so no issues with this at all.
     
    warlord likes this.
  7. nosirrahx

    nosirrahx Master Guru

    Messages:
    204
    Likes Received:
    61
    GPU:
    HD7700
    Anyone getting a familiar feeling here?

    If you have been around for a while, this feels a lot like when Intel was pushing for 4.0 GHz on the P4 platform while AMD was destroying them with their Socket 939 line of chips.

    Intel came roaring back with a new architecture after finally abandoning NetBurst. I wonder if that history will repeat itself?
     
  8. angelgraves13

    angelgraves13 Ancient Guru

    Messages:
    1,736
    Likes Received:
    443
    GPU:
    RTX 2080 Ti FE
    That was a different time. The game's changed now and everyone is ditching Intel. They no longer have the best fabs either.

    Intel is done. At least for PC.
     
  9. nosirrahx

    nosirrahx Master Guru

    Messages:
    204
    Likes Received:
    61
    GPU:
    HD7700
    Sure, it's different in some ways, but they still have insanely deep pockets. I don't expect 2020 to be Intel's year, but by 2022 I would be surprised if they don't have something far superior to their current offerings. Their choice to build monolithic chips out of fear of interconnect latency was soundly proven wrong by AMD. If Intel decides to go the chiplet route, they can finally break away from their yield issues: if Intel had a 16-core chip built as a network of four 4-core chiplets, I bet their prices could reach parity with AMD's.
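    As a rough illustration of the yield point, here's a sketch using the standard Poisson die-yield model Y = exp(-A·D); the defect density and die areas are assumptions picked for illustration, not Intel or AMD figures.

```python
import math

D = 0.1  # assumed defect density, defects per cm^2

def poisson_yield(area_cm2, defect_density=D):
    """Fraction of dies with zero defects under a Poisson model."""
    return math.exp(-area_cm2 * defect_density)

mono_area = 2.0     # one 16-core monolithic die, ~200 mm^2 (assumed)
chiplet_area = 0.5  # one 4-core chiplet, ~50 mm^2 (assumed)

y_mono = poisson_yield(mono_area)
y_chiplet = poisson_yield(chiplet_area)

# Chiplets are tested before packaging, so a 16-core part just needs
# four known-good chiplets; effective silicon cost ~ area / yield.
cost_mono = mono_area / y_mono
cost_chiplet = 4 * chiplet_area / y_chiplet

print(f"monolithic yield: {y_mono:.1%}, chiplet yield: {y_chiplet:.1%}")
print(f"relative silicon cost (monolithic / chiplet): "
      f"{cost_mono / cost_chiplet:.2f}x")
```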
     
  10. The Laughing Ma

    The Laughing Ma Ancient Guru

    Messages:
    4,022
    Likes Received:
    552
    GPU:
    Gigabyte 2070 Super
    If the 3700X really does pull 300W then I am going to be well annoyed. My new build has one, and I was hoping the Noctua D15 was going to be enough... the thing's freaking huge, much, much larger than anything my Core 2 Duo ever had strapped to it (usually a Zalman), back before I tried my hand at water cooling.

    Beyond that, does anyone see the irony in naming these hotplate CPUs after lakes? Is it a subtle hint at what you'll need to keep them cool?
     

  11. toyo

    toyo Member Guru

    Messages:
    183
    Likes Received:
    78
    GPU:
    Gigabyte 1070Ti 8G
    I don't get why people are so shocked about the 300W consumption. We already know how much Coffee Lake 6- and 8-core CPUs consume, and we already know it's still 14nm, with some improvements. CFL-R was a small improvement over CFL-S in the voltage required to reach higher frequencies (as can be seen in the Silicon Lottery charts). Add another small improvement and you're left with a 10-core/20-thread 14nm CPU that will require somewhere around 1.2V for 5GHz on all cores to be actually stable under AVX stress testing. Intel and the motherboard partners will, as usual, overvolt a bit at default to make up for silicon quality variance.

    So yes, considering the 20 threads, 300W under full load at 4.8GHz is not at all surprising. Overclock it and it will go even further through the roof.

    The "good" news? Menial tasks require less voltage, and gaming will not be nearly as demanding either. But if you run something like Blender, you'll come close to those 300W.
     
  12. Denial

    Denial Ancient Guru

    Messages:
    12,658
    Likes Received:
    1,879
    GPU:
    EVGA 1080Ti
    Is anyone here really shocked? Browsing through the thread, it appears most people aren't; in fact, most seem like they expected it. They just find it embarrassing that a ~$70B-a-year company is having trouble competing with a ~$7B one.
     
    Jagman and Solfaur like this.
  13. H83

    H83 Ancient Guru

    Messages:
    2,877
    Likes Received:
    465
    GPU:
    MSI Duke GTX1080Ti
    Maybe, who knows? One thing is for sure: when Intel finally moves to a smaller node and a new architecture, they are going to offer a big performance jump over their current products. The question is how big that jump will be.

    This is so absurd that I'm having a hard time replying to it...
     
  14. angelgraves13

    angelgraves13 Ancient Guru

    Messages:
    1,736
    Likes Received:
    443
    GPU:
    RTX 2080 Ti FE
    Until they get a new architecture... whenever that will be. In 3 years? AMD will be on Zen 4 or 5 by then. Sure, Intel can make a comeback, but they're officially dead to enthusiasts and budget builders. Only the fanboys who were ripped off by them for over a decade seem to defend them. Next-gen consoles use Zen 2, so games will be better optimized for Ryzen in the future.
     
    Jagman and EtherPhoenix like this.
  15. HeavyHemi

    HeavyHemi Ancient Guru

    Messages:
    6,718
    Likes Received:
    806
    GPU:
    GTX1080Ti
    Please stop trolling. Thanks.
     

  16. HeavyHemi

    HeavyHemi Ancient Guru

    Messages:
    6,718
    Likes Received:
    806
    GPU:
    GTX1080Ti
    Oh, sorry, I missed that teeny tiny grey source text, which is also a link. A link to a German site. How does anyone see that on anything less than a 50" 4K screen? I was looking for a link in the article, not some grey text that says "source". Wow... that needs fixing. :eek:
     
  17. angelgraves13

    angelgraves13 Ancient Guru

    Messages:
    1,736
    Likes Received:
    443
    GPU:
    RTX 2080 Ti FE
    I'm not. I'm 100% dead serious.

    A lot of my friends laughed at me 3 years back when I said Intel was losing the CPU game to AMD. They were diehard "Intel forever" and now they're all on Ryzen.
     
    Jagman and EtherPhoenix like this.
  18. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    11,371
    Likes Received:
    3,404
    GPU:
    2080Ti @h2o
    Sure, as an enthusiast, I can only say bring on the power; my custom loop can cool it, and there's a reason PSUs above 500W exist...

    ... but on the other hand, Intel really seems to have issues beyond the normal stumbling in introducing a new generation. From what little I can tell, they are seriously struggling to improve their architecture (still grappling with the security problems inherent in the basic design of their speculative execution), to bring it onto a new node, to offer it at the prices the competition does (or they just don't want to), or to offer any additional value features to make sure they stay on top no matter what.

    All that said, Intel still makes its money on data centers and OEMs, so they don't really care about DIY builders at all. Investors, data center, OEM; the rest is just white noise for Intel. I wonder how long investors will look on and nod it off before heads roll...
     
    Jagman and Mesab67 like this.
  19. NewTRUMP Order

    NewTRUMP Order Master Guru

    Messages:
    454
    Likes Received:
    110
    GPU:
    STRIX GTX 1080
    You are the one who is confused. The Tom's Hardware review you refer to managed to get a whopping 90W of power consumption out of its 3700X stress test, and the TechPowerUp article got a 140W draw in theirs. The "whole system" power draw was 300W; that means everything in the machine, motherboard included, at full load comes to 300W. And of course Guru3D's result was 165W. All far from 300W.
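    To make the distinction concrete, here's a minimal sketch of how CPU package power relates to a wall-meter reading; the PSU efficiency and platform overhead are assumed numbers, not figures from any of the cited reviews.

```python
def cpu_power_estimate(wall_watts, psu_efficiency, platform_watts):
    """Rough CPU package power implied by a wall-meter reading."""
    dc_watts = wall_watts * psu_efficiency  # power the PSU delivers
    return dc_watts - platform_watts        # minus GPU/board/RAM/drives

# Assumed: 300 W at the wall, 90%-efficient PSU, ~100 W for the rest
# of the system during a CPU-only stress test.
print(f"~{cpu_power_estimate(300, 0.90, 100):.0f} W at the CPU")  # ~170 W
```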
     
    Jagman likes this.
  20. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    2,106
    Likes Received:
    1,350
    GPU:
    2 x GeForce 1080 Ti
    Dude, a 3960X is rated for 280 watts - that's the advertised TDP. This is not a fantasy figure; the numbers AMD provides are pretty close to reality. The problem is mostly on the Intel side, where a CPU rated for 95 watts runs at 168 watts under full load.
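    For what it's worth, here's a minimal sketch of the Intel PL1/PL2/tau turbo-power mechanism that lets a "95 W" part draw far more; the PL2 and tau values below are assumptions matching the 168 W figure above, not Intel's published limits.

```python
def power_limit(t_seconds, pl1=95.0, pl2=168.0, tau=28.0):
    """Allowed package power (W) at time t into an all-core load."""
    return pl2 if t_seconds < tau else pl1

for t in (5, 27, 30, 120):
    print(f"t={t:>3}s: limit {power_limit(t):.0f} W")

# Many boards ship with tau effectively unlimited at stock settings,
# so the chip sits at PL2 for the whole run instead of dropping to PL1.
```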
     
    Jagman likes this.
