Intel Core i7-10700K Spotted in 3Dmark listing at 5.30 GHz Turbo Boost

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 12, 2020.

  1. Aura89

    Aura89 Ancient Guru

    Messages:
    7,747
    Likes Received:
    974
    GPU:
    -
    If you're going to argue against factual information, be my guest, but I'm not going to continue this pointless discussion about facts not being facts.
     
  2. Ridiric

    Ridiric Member Guru

    Messages:
    188
    Likes Received:
    106
    GPU:
    GTX 1080Ti@2068mhz
    Hey, I'm not arguing against facts. As I pointed out, in the graph you showed only two Intel outliers were over target by a considerable margin, and many were under. The graph also showed none of AMD's current lineup, which is what I was talking about. The fact still stands that the 9900K and the 3800X use roughly the same power under load.

    Both AMD and Intel currently publish inaccurate TDP figures. TDP itself is pretty much a made-up number anyway, as I said in some of my other posts; Gamers Nexus did some great videos on how it's actually calculated.

    I'm just trying to shine some light on the actual numbers for current products you can go and buy right now, because a lot of people on these forums seem to think everything AMD makes uses way less power than Intel, and that is just not true. In the current lineups, many Intel and AMD processors with the same core counts have very similar power draw. That changes somewhat with the higher-core-count parts, where AMD is definitely more power-efficient, but not so much at 8 cores or fewer.

    Also, as I said, that's not to say this new processor won't use a lot of power. It most definitely WILL use a lot of power, probably more than AMD's 12-core part.
    AMD's 3800X, 3900X, and 3950X all list a 105 W TDP, though, and since the 3800X is already over that, you know the 3900X and 3950X are going to be over it as well. So even if they are more efficient than Intel's new processor, it still doesn't change the fact that the listed TDP doesn't match actual power draw.

    As Astyanax pointed out, TDP doesn't actually mean power draw anyway, so I guess neither company really needs to make it match actual power draw. That still doesn't stop people comparing it to power draw, though, which is fair, because as the Gamers Nexus videos and articles pointed out, TDP numbers don't really mean anything.

    Both companies need to stop with the bullshit and just list idle wattage, single-core wattage, and max wattage on the spec sheet, maybe with an average of those as the default TDP, instead of the current TDP number, which means next to nothing for either company's current range of CPUs. Just make the numbers plain and visible to everyone so all these stupid arguments can stop.
     
    Last edited: Feb 13, 2020
  3. K.S.

    K.S. Ancient Guru

    Messages:
    2,083
    Likes Received:
    545
    GPU:
    EVGA RTX 2080 Ti XC
    @Hilbert Hagedoorn, any insight into how Intel is achieving these higher clocks, as far as the manufacturing process behind them goes? I'd be really interested to know... whether they're having to sacrifice more premium silicon for higher clocks while refining a smaller-nanometre process to replace the current architecture... *speculating here
     
  4. ToxicTaZ

    ToxicTaZ Member

    Messages:
    41
    Likes Received:
    2
    GPU:
    RTX 2080 NVlink
    I wonder how the 10700K fares against my 9900KS?

    My 9900KS at 5.2 GHz runs cooler than my 8700K at 5.1 GHz ever did in two years, with the same EK open-loop cooler.

    I'm really amazed at how well Intel can control heat on high-wattage CPUs... Intel CPUs will run until water boils, at 100 °C. That's crazy; you should never run a CPU beyond 75 °C, so it's just crazy that they can do that.
     

  5. Ridiric

    Ridiric Member Guru

    Messages:
    188
    Likes Received:
    106
    GPU:
    GTX 1080Ti@2068mhz
    The two PCs I built for my mates have 9900Ks in them, and they cool better than my 8700K did, with both CPUs at 5 GHz. But once I delidded my 8700K, it could hit 5.2 GHz (5.3 GHz if I push the voltage to 1.4 V, but that's a little higher than I wanted to go, and I get micro heat spikes when I do).

    So yes, the soldered 9900K definitely beats the stock Intel paste on the 8700K, but a delidded 8700K does better. Overall I like the soldered option they went with; you don't have to void the warranty to get extra performance anymore.
     
    ToxicTaZ likes this.
  6. Astyanax

    Astyanax Ancient Guru

    Messages:
    4,734
    Likes Received:
    1,403
    GPU:
    GTX 1080ti
    Steve from Gamers Nexus and anyone he has on his channel (excluding the times he's had EVGA engineers on) are not trained silicon engineers, JEDEC-qualified or otherwise, in a way that would permit them to understand what they were talking about.

    TDP is not a magic number, but it's simply not the case that power consumption translates directly into heat generation: within a processor there is not 100% efficiency in the conversion of power to heat, as there would be with a space heater.

    Your belief that power to heat is 1:1 in a chip has no basis in reality; the ratio is way off from where it actually is, because resistance is the primary factor in heat generation within a circuit, and a good amount of the power that runs through a chip does not finish up in the chip.
     
    Last edited: Feb 13, 2020
    K.S. and Evildead666 like this.
  7. Ridiric

    Ridiric Member Guru

    Messages:
    188
    Likes Received:
    106
    GPU:
    GTX 1080Ti@2068mhz
    ... wattage in vs. heat out in a CPU is close enough to 1:1 that it makes no difference... hell, you can even test it: hook up any processor you can find, measure wattage in vs. heat out, and it's always as close to 1:1 as makes no difference.
     
    ToxicTaZ likes this.
  8. Astyanax

    Astyanax Ancient Guru

    Messages:
    4,734
    Likes Received:
    1,403
    GPU:
    GTX 1080ti
    No, go back to school.
     
  9. Ridiric

    Ridiric Member Guru

    Messages:
    188
    Likes Received:
    106
    GPU:
    GTX 1080Ti@2068mhz
    Wow, how infantile...

    Seriously, do a little research; I'm not joking. It's easy to find out: wattage in vs. heat out is pretty much 1:1, and the only time that rule changes is when the energy is converted into something else, like light or movement.
    It's pretty basic stuff we're talking about here.

    If you can find a single example of a microprocessor that produces significantly less heat than the wattage going into it, I'll say sorry, admit I'm wrong, delete my account, and never post here again. I'm that serious.

    (Of course, discounting non-standard things like quantum processors or fibre-optic processors, etc.; they don't function the same way as a normal processor, and I have no idea if they have any unique thermodynamic properties.)
     
    Last edited: Feb 13, 2020
    HandR and ToxicTaZ like this.
  10. Denial

    Denial Ancient Guru

    Messages:
    12,563
    Likes Received:
    1,792
    GPU:
    EVGA 1080Ti
    Yes it does. You learn this in a basic circuits class.
     

  11. Astyanax

    Astyanax Ancient Guru

    Messages:
    4,734
    Likes Received:
    1,403
    GPU:
    GTX 1080ti
    A CPU is not a basic circuit.

    Power consumption is quadratic vs. clock speed.

    "Power consumed is proportional to the transition rate of the clock and the conduction losses from switching those effective capacitor gates. Temperature rise, however, is proportional to the power consumed times the effective thermal resistance, in degrees C per watt, and thus is independent of energy; rather, the chip may run cooler or hotter depending upon the power consumption and on spreading that power over a longer period of time. There may be a formula showing that temperature rise with clock speed is some fractional power of power greater than one."
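    For what it's worth, that quoted passage matches the usual first-order model for CMOS switching power, P ≈ α·C·V²·f: since higher clocks generally also require higher voltage, power grows faster than linearly with frequency. A minimal sketch of that model (every constant below is a made-up illustrative value, not a measurement of any real CPU):

```python
# First-order dynamic (switching) power model: P = alpha * C * V^2 * f.
# alpha: activity factor, c_eff: effective switched capacitance (farads),
# volts: core voltage, freq_hz: clock frequency in Hz.
def dynamic_power(alpha, c_eff, volts, freq_hz):
    return alpha * c_eff * volts ** 2 * freq_hz

# Illustrative (made-up) values: raising the clock 25% while also raising
# the voltage 10% increases power by 1.25 * 1.1^2 ≈ 51%, not just 25%.
base = dynamic_power(0.2, 1e-7, 1.20, 4.0e9)     # ~115 W at 4.0 GHz, 1.20 V
boosted = dynamic_power(0.2, 1e-7, 1.32, 5.0e9)  # ~174 W at 5.0 GHz, 1.32 V
print(round(boosted / base, 2))  # ≈ 1.51
```

    That superlinear scaling is why the last few hundred MHz of a factory-pushed part are so expensive in watts.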
     
    K.S. and Evildead666 like this.
  12. Ridiric

    Ridiric Member Guru

    Messages:
    188
    Likes Received:
    106
    GPU:
    GTX 1080Ti@2068mhz
    That's nice, but it still doesn't change the fact that wattage in = heat out, and in the case of microprocessors it's pretty much 1:1 wattage vs. heat. Again, show me an example where a microprocessor produces meaningfully less heat than should be expected from the wattage in.
    Clock speed has nothing to do with any of this, by the way; it's purely about wattage in vs. heat out. (For example, a 100 MHz CPU that uses 50 W is going to produce essentially the same amount of heat as a 1 GHz CPU that uses 50 W.)
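    Even the Stack Exchange passage quoted a few posts up agrees with this much: in the standard steady-state model, die temperature rise is dissipated power times thermal resistance, with no separate clock-speed term. A toy sketch (the 50 W load and 0.5 °C/W cooler are made-up illustrative numbers, not real parts):

```python
# Steady-state thermal model: T_die = T_ambient + P * R_theta,
# where r_theta is junction-to-ambient thermal resistance in °C per watt.
def die_temp(ambient_c, power_w, r_theta):
    return ambient_c + power_w * r_theta

# Two hypothetical CPUs dissipating the same 50 W through the same cooler
# settle at the same temperature, regardless of their clock speeds.
slow_cpu = die_temp(25.0, 50.0, 0.5)  # e.g. a 100 MHz part: 50.0 °C
fast_cpu = die_temp(25.0, 50.0, 0.5)  # e.g. a 1 GHz part:   50.0 °C
print(slow_cpu, fast_cpu)  # 50.0 50.0
```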
     
  13. Glottiz

    Glottiz Master Guru

    Messages:
    506
    Likes Received:
    130
    GPU:
    Strix 1080Ti
    Exactly. People act like they live in some alternative universe where the Noctua NH-D15 doesn't exist. I have a 9900K in one machine and I have never, ever had any overheating problems. All these exaggerations and fake news make me laugh.
     
  14. squalles

    squalles Master Guru

    Messages:
    754
    Likes Received:
    32
    GPU:
    Galax GTX 1080 EXOC OC
    AMD fanboys always think Intel loses 20% of its performance to security patches, that everyone runs Cinebench 24/7, that it overheats worse than hell, and that its power consumption is double, lol.
     
  15. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    6,791
    Likes Received:
    118
    GPU:
    5700 XT UV 1950~
    Mind you, though, Hilbert's full-load numbers include an X570 chipset, which alone eats 30-50 W more than the X470, so you have to remember to add that. Sure, the 3800X is going to look meh compared to the 3700X and not as accurate, but still.

    [image]

    But the 3800X is still better than the 9900K in this regard; it's not off by as much.

    [image]

    [image]
     

  16. Denial

    Denial Ancient Guru

    Messages:
    12,563
    Likes Received:
    1,792
    GPU:
    EVGA 1080Ti
    You don't think the fundamental properties of physics apply to both complex and basic circuits? You don't think the wire in a basic circuit has thermal resistance?

    I also like how you handwave away the guys Gamers Nexus has on his videos, but then quote some random person on Stack Exchange, lol.
     
    Ridiric and -Tj- like this.
  17. Astyanax

    Astyanax Ancient Guru

    Messages:
    4,734
    Likes Received:
    1,403
    GPU:
    GTX 1080ti
    The people on Stack Exchange are more qualified than Steve.

    The people on 4ch are more qualified than Steve.
     
  18. K.S.

    K.S. Ancient Guru

    Messages:
    2,083
    Likes Received:
    545
    GPU:
    EVGA RTX 2080 Ti XC
    Lol I post on stack wtf... *sniff sniff
     
  19. Ridiric

    Ridiric Member Guru

    Messages:
    188
    Likes Received:
    106
    GPU:
    GTX 1080Ti@2068mhz
    Still waiting on that single example of a microprocessor that magically produces less heat than it should for any given wattage.

    Judging from infantile remarks like "The people on 4ch are more qualified than Steve," maybe you belong there?

    Oh, and as a side note, since you seem to love industry-qualified sources: Steve spoke to many people in different parts of the chip-making industry as part of the research for his TDP articles and videos, to find out exactly what goes into their TDP calculations.

    That still doesn't change the fact that anyone who regularly builds or works with electronic devices understands that wattage in = heat out, and the only major variable is whether the energy going in is being transformed into something else, like light or movement.
     
    Last edited: Feb 14, 2020 at 2:54 PM
  20. Ridiric

    Ridiric Member Guru

    Messages:
    188
    Likes Received:
    106
    GPU:
    GTX 1080Ti@2068mhz
    Nice input on the discussion there, mate.

    I didn't realise the X570 was that much of a power jump compared to the X470; I guess I understand why the fans were such a requirement now. Though I still think they could have just gone with actually useful heatsinks instead of the stylised crap they whack on there, and done without the fans.

    So yes, both AMD and Intel still understate power usage on some of their processors (not all), but it appears to be only by a small amount and only under specific workloads, so I guess that's nice to know. There are still some weird inconsistencies in those charts, though, like the 3800X pulling less power than the 3700X, both with PBO enabled, in the Handbrake test.

    Also, I'm surprised by those graphs: Intel is actually using less power than I thought, as long as it's running stock and MCE and all that stuff isn't turned on. (Do motherboard manufacturers still leave that on by default on some boards? I've been building mostly AMD systems for customers lately, so I haven't built a new high-end Intel system for about a year, but back when I built mine I remember a bunch of boards had it enabled by default.)

    I guess the takeaway is that these pushed-to-the-limit parts Intel is releasing do use quite a bit more power than you'd expect from just looking at the TDP on the spec sheet (which I kind of already knew), even though they upped it to a 127 W TDP for the 9900KS, for example. Their more standard parts still seem fairly accurate, though, and AMD probably should have a separate TDP listing for its 12-core and 16-core parts instead of listing them all at 105 W.
     
    Ryu5uzaku likes this.
