Review: Intel Core i9 7900X processor

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 19, 2017.

  1. TieSKey

    TieSKey Master Guru

    Messages:
    226
    Likes Received:
    85
    GPU:
    Gtx870m 3Gb
Well, yes and no. We have two different kinds of parallelization here: same-thread parallel instructions (GPU-like) and multi-threading. As you state, CPUs are not designed for the former, yet a certain degree is useful and welcome (and we already have it), and compilers and languages are slowly adding support for it (parallel for statements, for instance).
    Now, multi-threading operates at a different level and is software-architecture dependent; this is where we need to work harder.
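A "parallel for" can be sketched in a few lines. A minimal illustration in Python using only the standard library; the `square` workload and worker count are just stand-ins, and a real workload would be heavy enough to justify the thread overhead:

```python
from concurrent.futures import ThreadPoolExecutor

def square(x: int) -> int:
    # Stand-in for the per-iteration work of the loop body.
    return x * x

def parallel_map(func, items, workers: int = 4):
    """A 'parallel for': apply func to every item, spreading the
    iterations across a pool of worker threads."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(func, items))

results = parallel_map(square, range(8))
```

Languages like C++ (OpenMP's `#pragma omp parallel for`) and Java (parallel streams) expose the same idea natively.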

About the FPGAs, I think the idea would be to have a bunch of small/mid-sized cores connected à la Infinity Fabric, so each program can have a number of cores configured and dedicated to it (imagine something like 64 cores, with 10 running apps using 5 cores each and the OS routing/sharing common tasks on the rest).


Yeah, I know they exist, but I wasn't aware of such hybrids. I really think this is the way to go, as CPUs are getting more and more specialized instruction sets to compensate for the lack of clock-speed gains.
    Unless Intel reveals some consumer x86 germanium-graphene chips by 2025, at which point we will be stuck not only at clock speed but also at density (do we even have roadmaps for 3nm silicon?).
     
    Last edited: Jun 20, 2017
  2. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
The bottom of the lineup, according to everything we seem to know, will start with a 10-core CPU. The interesting part is that AMD seems to be lowering the price of the 1800X too, which would indicate the possibility of an 8/16 TR. Still, even the 10/20 will probably end up at half the price of the 7900X.
     
  3. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    AMD likely lowered the price of the 1800X because people realized there was no point buying it when a 1700 is just as good for $200 less (or whatever the price difference was).

    Though I think an 8c/16t TR would be nice, I doubt we'll be seeing one, at least not until the other models are released first.
     
  4. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
Effectively, from what we know so far, they will only offer 12-, 14- and 16-core TRs. Maybe a more plausible question is whether AMD could later pull off a 16+ core part based on Naples.
     

  5. TieSKey

    TieSKey Master Guru

    Messages:
    226
    Likes Received:
    85
    GPU:
    Gtx870m 3Gb
Using four CCXs with two good cores each for an 8/16 TR, instead of two Ryzen quad-core dies, sounds interesting, especially if the quad-channel memory increases performance as expected/hyped.

    And the ridiculous core/size ratio would make them run cold :p
     
    Last edited: Jun 20, 2017
  6. Emille

    Emille Guest

    Messages:
    785
    Likes Received:
    27
    GPU:
    1080 Ti Aorus Extreme
Actually, historically that has been one of the worst things about AMD: whenever I was considering an upgrade, Intel always had the latest features while AMD had archaic RAM support, etc.

    Also, unless you buy two CPUs within a two-year span, the latest CPUs will by necessity demand a new socket and feature set. Anyone with any self-respect about spending the money they work for should want that, rather than a new CPU that has been held back for the sake of the one percent (or less) of the desktop market who buy two CPUs for a single chipset and motherboard.

    I think, and hope, that going forward AMD will release a new chipset every two years so their CPUs can advance as fast as possible, rather than being limited by the socket.
     
  7. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
So the reason for Intel's chip overheating is the software used? Really? That just shifts the blame for Intel's inadequate and unreasonable engineering choices onto software developers.

    The truth is, if a chip generates too much heat, it should be clocked lower, which in turn allows a lower working voltage. Intel is simply running the chip outside advisable specifications in the hope of attracting unsuspecting consumers.
     
  8. aKiss

    aKiss Guest

    Messages:
    33
    Likes Received:
    0
    GPU:
    Gigabyte 1050ti LP
    how much power will the i9-7980XE draw? 300W just the CPU at full and 150W at idle?
     
  9. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
Yeah, I agree. Let's not start making excuses for Intel's poor design decisions. It's obvious that they deliberately pushed the chip beyond what would normally be considered acceptable parameters, and their decision to use paste only amplifies the problem.

    At least one of my grid computing projects uses AVX, and I would be very worried about doing any sort of computing on the Core i9. I'd imagine the thermals would be through the roof even with a custom water loop, and it would be a constant struggle to keep it from throttling during the summer. Of course, if I don't use it for computing then I really have no need for it at all, as I don't do any rendering or programming work at home (and obviously don't need such a chip for web browsing or gaming).
     
  10. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
Even if one can cool it with an efficient cooling solution, there is another problem: all that power has to go through the VRMs. Running heavy compute operations for days, or in some cases weeks...
    Not a good idea for those poor VRMs on the motherboard.
     

  11. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
I'm guessing these boards don't have a traditional VRM arrangement; they likely have better-quality phases and/or more of them. Perhaps some boards have fewer phases, since the manufacturers likely expect you'll hit the CPU's thermal limits before voltage quality becomes an issue.
     
  12. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
I don't even remember my first processor. It would have been around the 1995-1999 era; the first one I remember the name of was an AMD Barton lol
     
  13. Ricepudding

    Ricepudding Master Guru

    Messages:
    872
    Likes Received:
    279
    GPU:
    RTX 4090
See, this has become the issue. Like others, I want more than 4 cores, since quads seem to be a dying breed and we hopefully should be moving on to more cores now. But what is the right pick?

    The 7800X/7820X, with the gimped PCIe lanes, plus only one core boosting high from what I understand? The 7800X also only boosts to 4GHz from what I've seen in the tables, though you can overclock it quite high. Or do you splash out on the 7900X, which I think costs a little too much compared to the other two chips?

    Or do you go to the AMD side with the 1800X, which only has 16 PCIe lanes (which limits some people like myself) and only dual-channel support... or Threadripper, which has loads of PCIe lanes but which I don't think will improve much on Ryzen beyond having more cores (which some programs use and others don't)?

    With Intel being dicks, it has become a very odd time to decide what to buy and what the best pick is. And if we are going into a core war, then maybe going for something with more cores right now will make it a little more future-proof...

    Sorry for rambling so much, just getting my thoughts out of my head. I should just wait and see what Threadripper shows, I guess; maybe it will improve on Ryzen.
     
  14. H83

    H83 Ancient Guru

    Messages:
    5,510
    Likes Received:
    3,036
    GPU:
    XFX Black 6950XT
At stock speeds you can use AVX instructions without any problem. Sure, the CPU is going to run hotter than usual, but that's it. If you overclock, then there's a high probability that the CPU will have thermal issues when running AVX. Fortunately, every motherboard has an AVX offset ratio that lets the user lower the AVX speed relative to the CPU's normal speed, to prevent the CPU from overheating or throttling when running AVX instructions. Using my case as an example: my CPU is overclocked to 5.0GHz, with the AVX offset set so that AVX runs at the same speed as the rest of the CPU, but if I want I can drop the AVX speed to 4.8 or 4.6GHz to prevent any problems.
    Intel has explained this AVX issue from the beginning, and how to prevent or minimize it.
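The offset arithmetic above is just multiplier math. A tiny sketch, assuming the usual 100 MHz BCLK and a BIOS that subtracts the AVX offset from the core multiplier (exact naming and behavior vary by motherboard vendor):

```python
def avx_clock_ghz(core_ratio: int, avx_offset: int, bclk_mhz: float = 100.0) -> float:
    """Effective clock under AVX load: the core multiplier minus the
    AVX offset, times the base clock."""
    return (core_ratio - avx_offset) * bclk_mhz / 1000.0

# A 5.0 GHz overclock (ratio 50) with an AVX offset of 2
# runs AVX-heavy code at 4.8 GHz.
print(avx_clock_ghz(50, 2))  # 4.8
```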

    Someone correct me in case i´m wrong about any detail.
     
  15. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    I believe mine was a Pentium Pro, had something like 16MB of RAM, and a 2GB HDD. Also had a tape drive.

    Quads are not a dying breed. I like to think of them like menopausal women - not as strong or attractive, no longer capable of supporting a new generation, and maybe can't handle many new tasks, but still have plenty of life left and are by no means useless or incapable of handling important things.

Personally, I don't find the decision all that difficult, because it ultimately comes down to what you need, not what you want. Ryzen 7, the i9-7980XE (but none of the other socket 2066 chips), Threadripper, and many 18+ core Xeons each have their own distinct advantages and disadvantages.

    But consider the PC in your sig - how well has that kept up with your workload? How often do you find any of your cores regularly getting maxed out? How long do you intend to keep your next PC? Are there things you would like to do that you deem your current PC inadequate for? These are all factors that decide what you should get.

    Just keep in mind: you're probably not going to notice a big difference in real-world applications going to quad-channel memory. Adding more cores will not improve performance if nothing is using them. I am not aware of any GPU that will saturate more than 8x PCIe 3.0 lanes.
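For reference, the bandwidth behind that PCIe claim can be worked out from the link rate: PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, so the theoretical one-direction throughput is:

```python
def pcie3_bandwidth_gbps(lanes: int) -> float:
    """Theoretical one-direction PCIe 3.0 bandwidth in GB/s:
    8 GT/s per lane, reduced by the 128b/130b encoding overhead."""
    bits_per_second = lanes * 8e9 * (128 / 130)
    return bits_per_second / 8 / 1e9  # bits -> bytes -> GB

print(round(pcie3_bandwidth_gbps(8), 2))   # ~7.88 GB/s for x8
print(round(pcie3_bandwidth_gbps(16), 2))  # ~15.75 GB/s for x16
```

So an x8 link still offers nearly 8 GB/s each way, which no 2017-era GPU comes close to saturating in games.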

    The way I see it, your options are:
A. If you want a rig to brag about that is theoretically and potentially the best, get a 7980XE.
    B. If you want a very solid step up from what you have, without worrying about it becoming prematurely obsolete, get a Ryzen 7.
    C. If you just want a crapload of cores and PCIe lanes for the hell of it, plus a system that will be very reliable and feature-rich, get an 18+ core Xeon.
    D. If you want an overkill rig where the hardware won't have wasted potential (due to TDP), at a good price, get Threadripper.
     
    Last edited: Jun 20, 2017

  16. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Here are my thoughts:

Hilbert's test hits 82°C on an AIO cooler at stock settings. It's an ES sample, but it seems roughly in line with the few other reviews where I've seen AIO setups. The problem is I can't find a single review that runs Prime95 small FFTs on air. I would assume if Hilbert put his on a Noctua or some other high-end air cooler, it would be closer to 86-90°C, which is definitely far above what we consider normal. But that brings me to another point...

    I have no idea how durable these chips are or what Intel's target longevity is for them. People keep saying 80°C+ is bad, but for all I know they can run like that 24/7 for 10 years. My GTX 480 is still being gamed on, relatively frequently, at 85°C+ 7 years later. Do I like temperatures that high? No. But I also can't remember seeing any real numbers on temperature vs. longevity to back up my fear of it.

    The other thing is that Hilbert, among other reviewers, basically said the last-minute BIOSes stripped out all the power-saving features because of issues in gaming. This bothers me for two reasons: one, Intel could easily tweak voltage tables and whatnot to bring those temperature/power numbers down; and two, performance could drop due to those tweaks and non-savvy consumers wouldn't know, because nobody really re-benches anything. People who followed the Ryzen story know what that's like.

    So yeah, the power situation could be good or bad, and I'm inclined to give Intel the benefit of the doubt. I don't feel like they needed uncapped AVX performance when the competition's architecture doesn't contain comparable hardware. So I feel like they either intend to fix it via BIOS updates, or they don't think it's an issue in the first place.
     
    Last edited: Jun 21, 2017
  17. Venix

    Venix Ancient Guru

    Messages:
    3,472
    Likes Received:
    1,972
    GPU:
    Rtx 4070 super
I got my first PC back in 2000, when I was 13: an Inhell Pentium 3 @ 666MHz! ((667 according to Inhell!)) with a Voodoo 3 2000... yes, that was the name back then, and I'm well aware why the seller looked at me weird when I asked for a Voodoo. Anyway, heaven's door has been well shut for me ever since... to the topic: OK, the thermals are abysmal because of the TIM, and then der8auer, or however he's called, goes out and says that TIM has its pros too... sure, says the guy who sells delidding tools worldwide...
    Now, about the PCI Express lanes: 16 should be enough for almost everything, and dual cards seem to be fine, although I understand people wanting more, even just for peace of mind. Right now the Ryzen 7/5 offering is enough for a high-end card at PCIe 16x, an NVMe drive, and SATA ports for more storage if needed, so that covers most people. Realistically, the 44 lanes from Intel should be fine too, and 64 is total overkill; yes, I know it very much depends on your needs... And one last thing about pricing: X299 motherboards are more expensive because they have to support, among other things, 44 PCIe lanes. With TR's 64 lanes, we might see a lot of the value of these chips getting equalized by the cost of their motherboards.
     
  18. H83

    H83 Ancient Guru

    Messages:
    5,510
    Likes Received:
    3,036
    GPU:
    XFX Black 6950XT
I think the problem here is the use of P95. P95 is a stress test that puts an unrealistic workload on CPUs to test their stability. This results in much higher temps than normal because all the cores are being stressed at 100%. That alone is enough to drive a normal CPU to extremely high temperatures, never mind a 10-core CPU... And to make things worse, this 10-core CPU has AVX.

    From what I've read and understood, AVX is a very powerful instruction set that can bring very big performance gains if software uses it properly, but it has two little/big problems. First, the die area needed is very big: the AVX part of the 7900X is the size of an Atom core! Second, it uses a lot of power, so much that Intel created the offset ratio for when the CPU can't run it at normal speeds. Knowing this, I think we can guess what happens when someone runs a stability test like P95 that also uses AVX: the CPU simply overheats because of AVX's high energy requirements.
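As an aside, software "uses AVX properly" mostly by expressing work in a vectorizable form, so a compiled loop can run on the wide SIMD units instead of one element at a time. A small illustrative sketch with NumPy; whether the underlying loop actually executes AVX instructions depends on the CPU and on how NumPy was built:

```python
import numpy as np

data = np.arange(100_000, dtype=np.float64)

def scalar_sum(a):
    """One element at a time in the Python interpreter: no SIMD."""
    total = 0.0
    for x in a:
        total += x
    return total

# NumPy's reduction runs a compiled C loop, which the compiler can
# auto-vectorize into SSE/AVX instructions when the CPU supports them.
vec = float(np.sum(data))
```

Both paths compute the same result; the vectorized one is typically orders of magnitude faster, which is exactly why the extra die area and power budget for AVX exist.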

Also, if I remember correctly, this behavior showed up when the 7700K was reviewed: some sites overclocked it and discovered that the CPU would throttle every time it ran AVX instructions.

    And Hilbert has already warned us about using software like P95 or FurMark to test hardware, saying they are basically useless and can ruin the parts being tested... Nvidia even implemented the famous driver lock for whenever someone runs FurMark...

    Conclusion: "torturing" an Intel CPU using AVX is wrong and stupid. Of course, things would be better if Intel used solder instead of ****ty TIM and hadn't rushed the release of the platform...

    Hope everything I wrote makes sense and is correct. If not, please feel free to correct me.
     
    Last edited: Jun 21, 2017
  19. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Well, it's announced for $400 :D

     
  20. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
I wouldn't say an AVX torture test is stupid; what is stupid is drawing conclusions from it.

    Keep in mind many of these tests are performed in a controlled environment, using settings the average user might well run. Though a synthetic stress test is definitely a worst-case scenario (in terms of what's being executed), the environment itself is ideal. Despite this, the results were still a bit troubling.

    What it is safe to draw conclusions from is a real-world benchmark that utilizes AVX. Only then is it OK to get worried about thermals and wattage, though I suspect the numbers would be much better.

    Back when the R9 290X was being reviewed, it was the same idea: people got concerned about how hot it ran due to things like FurMark, but the GPU's power consumption could drop by as much as 50W just by testing a game instead. I think it's a good idea to highlight the potential dangers processors (of any kind) may encounter, but it's a bad idea to criticize a product based on a synthetic benchmark (in fact, I think it's an equally dumb idea to praise a product based on one).

    That's the 8c/16t Epyc. Those clock speeds are pretty low and I doubt Epycs will be overclockable, but... you should be able to get all those PCIe lanes.
     
    Last edited: Jun 21, 2017
