
Review: Intel Core i9 7900X processor

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 19, 2017.

  1. redrumy3

    redrumy3 Master Guru

    Messages:
    433
    Likes Received:
    2
    GPU:
    Evga GTX 1080 FE
    Hmm, looks like I am going to do a Ryzen build for a buddy; I don't see any benefit in going with an X299 build over Ryzen. Saves a good penny as well. As much as I love Intel, I'm going to skip this new socket.

    Thanks for the review, I love reading through Guru3D reviews!
     
    Last edited: Jun 20, 2017
  2. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    4,430
    Likes Received:
    1,361
    GPU:
    HIS R9 290
    I probably should've elaborated on "Intel currently doesn't see AMD as a competitor", because you're right, AMD was a threat at one point. Intel had a contract with IBM to keep AMD around. By the time that contract lost relevance (in particular, because AMD stopped depending on Intel for designs), all other competitors were wiped out, but Intel couldn't wipe out AMD because then they'd be a monopoly.

    The release of Skylake-X was going to happen regardless of AMD's actions. Creating an i9 product series and an 18-core CPU, however, is clearly influenced by AMD's actions.

    I never said or implied that. I'm just saying Intel is still going to see oodles of revenue regardless of AMD's success. I expect a bright future for AMD. Keep in mind I'm not hating on AMD here; I own a Ryzen myself, and I don't think the 7900X deserved the "Recommended" award at Guru3D.

    Intel is aware; my point is they just don't care. The price point and arbitrarily disabled features of i9/X299 are enough proof of that.
     
  3. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,739
    Likes Received:
    2,199
    GPU:
    5700XT+AW@240Hz
    In a way, I can second it.
    Intel's mobile strategy gets on my nerves so badly. Raven Ridge can't come soon enough.
     
  4. Silva

    Silva Master Guru

    Messages:
    940
    Likes Received:
    313
    GPU:
    Asus RX560 4G
    Canada does have a low temperature climate so you can actually benefit from it! :banana:

    In the end, what you do with your money is your business alone.

    Intel hit a wall at 14nm. Up until then, they made die shrinks that made the chips more efficient and more overclockable. They did, however, charge a premium for unlocked models and didn't innovate on the design.

    AMD knew it couldn't compete on raw performance and that Moore's law would eventually die. So they chose the alternative path: to innovate again. They made the CCX, an efficient (good-yielding) quad-core block that they can connect over Infinity Fabric and scale an "infinite" number of times.

    The problem with Intel is that they have nothing to respond with other than the overclocking margin they've been charging a premium for. To make matters worse, their design is not modular: if a 20-core chip has 1 defective core, they're screwed. And the cherry on top is the ****ed-up toothpaste they're using below the IHS...

    I love AMD for the simple fact that the AM4 socket is planned to last until 2020. If I want to upgrade, the furthest I can go is an i7 3770K, which is selling second-hand for 250-300€ where I live (lol). So yeah, Intel lost 2017. Let's see what they do with Cannon Lake (and 10nm), because until then nothing Intel puts out will be worth buying.

    Even at stock it's having problems... the delta from the cores to the CPU cooler is 71°C at times...

    Well, my first computer was a P3 at 450MHz... the second was a P4 2.5GHz, then a P4 3.0GHz, then an E8400, and now an i5 2500K... so I was pretty happy with Intel...

    Any company will try to profit as much as it can, even AMD if they get the chance. In 2017 it doesn't make sense to buy Intel.

    I would love to be able to upgrade... DDR4 is forbiddingly expensive, and having to buy a new mobo on top of that is even worse...
     
    Last edited: Jun 19, 2017

  5. BangTail

    BangTail Ancient Guru

    Messages:
    3,590
    Likes Received:
    1,126
    GPU:
    EVGA 2080 Ti XC
    Could not agree more.

    :cheers:
     
  6. Denial

    Denial Ancient Guru

    Messages:
    12,343
    Likes Received:
    1,529
    GPU:
    EVGA 1080Ti
    You can bin chips without a modular design.
     
  7. Silva

    Silva Master Guru

    Messages:
    940
    Likes Received:
    313
    GPU:
    Asus RX560 4G
    Yes, binning is part of the trade.

    But think about this: the wafer has a standard size. If you make lots of small CCXs, you can afford to throw some away, or salvage partially defective dies as quad cores (with 2 cut-down CCXs). If you make bigger chips, the wafer cost is spread over fewer dies, which makes each chip more expensive for the consumer. Factor in the dies that fail to be 100% functional and it's obvious AMD has a clear advantage over Intel here.
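
    To put rough numbers on that argument, here is a minimal sketch using the classic Poisson yield model. The die areas and defect density below are made-up illustrative values, not actual Intel or AMD figures:

    ```c
    /* Illustrative sketch only: a simple Poisson yield model with made-up
     * numbers, showing why many small dies yield better than one big
     * monolithic die. Build with e.g. gcc -O2 yield.c -lm */
    #include <math.h>
    #include <stdio.h>

    /* Poisson yield model: fraction of dies that come out defect-free. */
    static double yield(double die_area_mm2, double defects_per_cm2)
    {
        double area_cm2 = die_area_mm2 / 100.0;
        return exp(-defects_per_cm2 * area_cm2);
    }

    int main(void)
    {
        double d0 = 0.2;            /* assumed defects per cm^2 (illustrative) */
        double small_die = 200.0;   /* e.g. a small multi-CCX die, in mm^2 (assumed) */
        double big_die = 600.0;     /* e.g. a large monolithic HEDT die, in mm^2 (assumed) */

        printf("small die yield: %.1f%%\n", 100.0 * yield(small_die, d0));
        printf("big die yield:   %.1f%%\n", 100.0 * yield(big_die, d0));
        /* With these numbers the small die yields about 67% versus about 30%
         * for the big one, and partially defective small dies can still be
         * salvaged as lower-core-count SKUs. */
        return 0;
    }
    ```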

    The Intel i7 7700K is a formidable CPU, but unless Intel comes up with something like Infinity Fabric, it doesn't stand a chance against the cost efficiency of the CCX approach.
     
  8. Denial

    Denial Ancient Guru

    Messages:
    12,343
    Likes Received:
    1,529
    GPU:
    EVGA 1080Ti
    Yeah, I definitely agree that AMD's solution is favorable; it's just that your post made it sound like if one core is dead Intel loses the entire chip, which isn't necessarily true.
     
  9. zer0_c0ol

    zer0_c0ol Ancient Guru

    Messages:
    2,976
    Likes Received:
    0
    GPU:
    FuryX cf
    BTW, I forgot to mention that in one review the i9 actually drew 346 watts and shut down the motherboard... and no, when it comes to temps the i9 is really, really bad.
     
  10. Denial

    Denial Ancient Guru

    Messages:
    12,343
    Likes Received:
    1,529
    GPU:
    EVGA 1080Ti
    ...

    It was the Tom's Hardware review you literally linked, and it was running at 4.8GHz on all 10 cores, with 20 instances of some year-long calculation alongside Prime95 with AVX enabled.

    I don't know how to stress this enough: the AVX units are by far the densest on the CPU, and Intel dedicates twice as many registers to them per core. When they come online they are going to significantly increase power consumption, and therefore heat.
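
    To illustrate why those wide vector units matter for power (a generic sketch, not anything from the Tom's Hardware test or the review): a single 256-bit AVX2 instruction does 8 single-precision operations at once, and Skylake-X's AVX-512 units are twice as wide again, so far more silicon switches every cycle than with scalar code. Build with something like gcc -O2 -mavx2:

    ```c
    /* Minimal sketch: one 256-bit AVX2 instruction performs 8 float adds. */
    #include <immintrin.h>
    #include <stdio.h>

    int main(void)
    {
        float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
        float c[8];

        __m256 va = _mm256_loadu_ps(a);     /* load 8 floats */
        __m256 vb = _mm256_loadu_ps(b);
        __m256 vc = _mm256_add_ps(va, vb);  /* 8 adds in one instruction */
        _mm256_storeu_ps(c, vc);

        for (int i = 0; i < 8; i++)
            printf("%.0f ", c[i]);          /* prints "9" eight times */
        printf("\n");
        return 0;
    }
    ```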

    That's not even to mention that Hilbert himself stated the newest BIOSes basically broke all the P-state stuff and sent power consumption way up. After the Ryzen reviews at 1080p, literally every person, including you, was in here saying "wait for benchmarks with new BIOSes", "wait for benchmarks with faster memory", "benchmark with X motherboard feature disabled", "benchmark with 2 cores parked and standing on one foot". Now it's Intel, and everyone immediately says "garbage processor because it uses a lot of power when running a feature AMD's chips do at half the performance", or whatever.

    I definitely agree with all the posts regarding the wonkiness of Kaby Lake-X, the pricing, the RAID features, etc. It's Intel being ****ty Intel. But the processor itself seems fine. The temps it's hitting at stock clocks are fine. The 6- and 8-core parts both seem perfectly reasonable.
     

  11. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    4,430
    Likes Received:
    1,361
    GPU:
    HIS R9 290
    I agree. I don't know why people are expecting things to be different. 4.8GHz on a 10-core is insane, in both a good and a bad way. Most people are happy to get 4.4GHz on a quad core. The power consumption and temperature aren't going to magically remain the same just because the voltage and frequency are the same. This is why I've been stressing that overclocking other i9s this high is going to be irrelevant once you get beyond 10 cores. It may be possible, but aside from bragging rights, it isn't practical.
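
    As a back-of-the-envelope sketch of that point (the voltages below are assumptions, not measurements): dynamic CPU power scales roughly with cores x V^2 x f, so the same voltage and frequency across more cores still means far more power:

    ```c
    /* Rough sketch, not measured data: dynamic power ~ cores * C * V^2 * f. */
    #include <stdio.h>

    /* relative dynamic power, with capacitance per core folded into a constant */
    static double rel_power(int cores, double volts, double ghz)
    {
        return cores * volts * volts * ghz;
    }

    int main(void)
    {
        double quad = rel_power(4, 1.25, 4.4);   /* overclocked quad core (assumed voltage) */
        double deca = rel_power(10, 1.25, 4.8);  /* 7900X-style 10-core OC (assumed voltage) */

        printf("10-core draws roughly %.1fx the dynamic power of the quad\n",
               deca / quad);   /* ~2.7x with these example numbers */
        return 0;
    }
    ```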

    This CPU doesn't have a functionality problem. Sure, maybe it'd have been better off being de-lidded, but the current thermals and wattage are just what happens when you push this much hardware to its limits.
     
  12. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,356
    Likes Received:
    836
    GPU:
    EVGA 1080ti SC
    Oh wow, that 8-core looks interesting. Though I may only really "need" a 6-core, which at $390 seems to be an okay deal.
     
  13. C-Power

    C-Power Member Guru

    Messages:
    109
    Likes Received:
    13
    GPU:
    Msi 2060 Gaming-Z
    Finally something to look forward to. Now that the "core wars" have started, I guess in a year or so there will finally be no more 4-core CPUs :p

    I wonder how fast it will go and where it will stop; will this be the next "GHz war" like the one that ran for a few years? I bet it will.

    Now all we need is for software developers to hop on too, and we will have a very nice future, hehe.

    For me, the X299 platform doesn't look all that great; I'll be sticking with my X99 at least until next gen. Especially looking at these power draws versus multi-core performance, I am still very happy with my "unlocked" 2683's multi-core performance :)

    And, it doesn't need LN2 to keep cool :banana:

    I love Intel, but I think they are making a bit of a mess with their new X299 platform.


    Edit:
    PS: Hilbert, nice review as always!
     
    Last edited: Jun 19, 2017
  14. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    4,430
    Likes Received:
    1,361
    GPU:
    HIS R9 290
    It's possible the "core wars" will replace the "GHz war". They're both equally dumb, and scary. They're both a sign of stifled innovation: pushing limits that no developer can take advantage of, at the expense of the majority of users who don't have such souped-up systems. We're heading to a point where processing power is going to heavily outpace efficiency. A decade ago, CPUs may have been inefficient, but at least very few of them needed more than the 4-pin 144W CPU power connector. Now we're at a point where we need 16 pins.

    I think now is a good time to remind people of Amdahl's Law. Adding more cores to things isn't going to magically make your FPS go up. Why do you think Intel has been avoiding more cores for so long?
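
    For anyone who hasn't run the numbers, here is a quick Amdahl's Law illustration; the 80% parallel fraction is an assumption picked for the example, not a measured game workload:

    ```c
    /* Amdahl's Law sketch: speedup is capped by the serial fraction of the work. */
    #include <stdio.h>

    /* speedup = 1 / ((1 - p) + p / n), where p is the parallel fraction */
    static double amdahl(double parallel_fraction, int cores)
    {
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores);
    }

    int main(void)
    {
        double p = 0.80;   /* assume 80% of the work can run in parallel */
        int counts[] = {2, 4, 10, 18};

        for (int i = 0; i < 4; i++)
            printf("%2d cores -> %.2fx speedup\n", counts[i], amdahl(p, counts[i]));
        /* Even with 18 cores the speedup is only about 4.1x, and it can never
         * exceed 5x (1 / 0.2) no matter how many cores you add. */
        return 0;
    }
    ```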
     
  15. Paulo Narciso

    Paulo Narciso Maha Guru

    Messages:
    1,215
    Likes Received:
    29
    GPU:
    ASUS Strix GTX 1080 Ti
    Ten years after the release of the Q6600, some games are now finally taking advantage of more than 4 cores :)
    I remember buying one for Crysis, which was announced to take advantage of quad cores, and in the end it used two cores at best.
     
    Last edited: Jun 19, 2017

  16. Evildead666

    Evildead666 Maha Guru

    Messages:
    1,231
    Likes Received:
    234
    GPU:
    Vega64/EKWB/Noctua
    heh, I had a really sh*t trip home this evening in 30+°C temps, and that made me laugh and chill a bit. :)

    @HH, great review, very objective. Thanks. It's getting rarer. ;)
     
  17. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,356
    Likes Received:
    836
    GPU:
    EVGA 1080ti SC
    Stop speaking logic. MOAR CORZ!!!!!!!

    At this point it would be nice if we had something with the multi-core performance of an 1800X (not necessarily with 16 threads, mind you) and the power consumption of a 7200 i3.
     
  18. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    4,430
    Likes Received:
    1,361
    GPU:
    HIS R9 290
    Well, we're pretty close to that with some GPUs. The problem is that devs don't want to look into utilizing OpenCL, CUDA, or Vulkan for everyday applications. CPUs are really only good at crunching big numbers, but we keep telling them to run things in parallel, which they're relatively bad at. Things like HSA remove a lot of the inefficiencies.

    In an ideal world, I think what we need is something like big.LITTLE, but for x86 and done in a less stupid way. Different processor architectures are good at different things. What I'd like to see is a CPU with a minimum of 6 cores (with SMT/HT) where two cores have the IPC of Bulldozer but can clock beyond 5GHz, another two cores have the IPC of Skylake with clocks around 4GHz, and another two cores have roughly double the pipeline length of Skylake but operate around 3GHz. Tasks would pick the cores that best suit them. Tasks that don't know what to pick would use the "Skylake" cores by default. </tangent>
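
    Purely as a sketch of that hypothetical scheduling idea (none of these core classes or hints exist in any real x86 product; the names just mirror the design described above), the task-to-core mapping could look something like this:

    ```c
    /* Hypothetical sketch: an OS-level hint picks which class of core a task lands on. */
    #include <stdio.h>

    enum core_class { HIGH_CLOCK, BALANCED, DEEP_PIPELINE };
    enum task_hint  { HINT_LATENCY, HINT_DEFAULT, HINT_THROUGHPUT };

    static enum core_class pick_core(enum task_hint hint)
    {
        switch (hint) {
        case HINT_LATENCY:    return HIGH_CLOCK;     /* few fast >5GHz cores */
        case HINT_THROUGHPUT: return DEEP_PIPELINE;  /* long-pipeline ~3GHz cores */
        default:              return BALANCED;       /* "Skylake-like" default cores */
        }
    }

    int main(void)
    {
        const char *names[] = {"high-clock", "balanced", "deep-pipeline"};
        printf("game thread      -> %s cores\n", names[pick_core(HINT_LATENCY)]);
        printf("unknown task     -> %s cores\n", names[pick_core(HINT_DEFAULT)]);
        printf("batch encode job -> %s cores\n", names[pick_core(HINT_THROUGHPUT)]);
        return 0;
    }
    ```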
     
  19. C-Power

    C-Power Member Guru

    Messages:
    109
    Likes Received:
    13
    GPU:
    Msi 2060 Gaming-Z
    That's true :p

    It depends on your workload, but yes, if we're talking about FPS, then (currently) for 99% of the games out there it's still better to have 4 speedy cores than "a hundred" slow ones.

    And then there's the question of what resolution you game at, etc. I play at 3440x1440 and the CPU really doesn't make (an actually perceivable) difference (unless I ran it on 1 core at 1GHz, obviously); the GPU is much more important at these resolutions.


    My main work with this rig isn't gaming though, but it certainly gets the job done even though it's only 3.1GHz on all cores.


    The GHz wars were insane; you could buy a new CPU nearly every month back then, lol. I hope this isn't going to be a repeat of that, just with cores.
     
  20. Silva

    Silva Master Guru

    Messages:
    940
    Likes Received:
    313
    GPU:
    Asus RX560 4G
    I'm sorry if my text wasn't clear.
    Of course both Intel and AMD can sell those lower-core-count chips for cheaper, but Intel will have yield problems because it's easier to get 4 good cores than 20.

    A chip should be capable of performing well under whatever load the user chooses to put on it. This would be possible with good thermal paste and maybe a more reasonable clock. Intel is pushing the design to its limits.

    Regarding comparisons between Intel and AMD: Intel has been releasing stable products for years, and that's what people who prefer Intel say: it's a stable platform. On the other hand, before Ryzen, AMD hadn't launched a new processor since FX. I do agree we could see optimization on Intel's part, but this is a rushed product to fight Threadripper, so I doubt any of it will have a meaningful effect on temps and power usage. Intel will just say: "water cool it and don't overclock".

    More cores allow for multitasking and for developers to get creative with AI and other stuff. We all benefit from more cores; it's not just about FPS.

    What are you talking about? Since I've had my i5 2500K, all games (except crappy ports or badly developed ones) have used 4 cores.
     
