Intel Shows 28-core processor die-shot

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 29, 2017.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,317
    Likes Received:
    18,405
    GPU:
    AMD | NVIDIA
    During its Technology and Manufacturing Day, Intel shared a slide of a server processor with 28 cores. The die-shot presumably shows the Skylake-SP series of products in the Xeon Platinum range....

    Intel Shows 28-core processor die-shot
     
  2. Silva

    Silva Ancient Guru

    Messages:
    2,048
    Likes Received:
    1,196
    GPU:
    Asus Dual RX580 O4G
    A new socket, why? Just because.
    Not buying another Intel product until they stop being silly.
     
  3. b101uk

    b101uk Guest

    Messages:
    222
    Likes Received:
    5
    GPU:
    Gigabyte GTX1070
    One would assume that, given the growing number of cores, at some point you will inevitably need a new socket, so that the socket itself is not holding back the CPU.

    That's assuming you don't utilise head-up-arse thinking.
     
  4. pato

    pato Member Guru

    Messages:
    185
    Likes Received:
    7
    GPU:
    MSI 3060TI
    On the other hand, these are server products. I don't know many companies that upgrade the CPUs in their servers. They would rather buy a new server with more current technology and a fresh warranty.
     

  5. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    So you were hoping to pop one of these into your Z68? I really don't understand the big knock on a new socket every other CPU generation. I've had my 2600k for 5-6 years now. When I upgrade, I want all the latest tech, and my Z77 does not provide that. I will have to buy a new motherboard anyway, so what does it matter at that point if it is LGA 1150, 1151, 1155, 2011 or whatever else?
     
  6. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,445
    Likes Received:
    2,538
    GPU:
    TUF 6800XT OC
    The only real reasons for a new CPU socket are new memory tech and width (channels).

    So, basically, we should have had one socket per DDR type (at the same number of channels).
    Of course, more channels (like the HEDT platform) - another socket.

    But 1156, 1155, 1151, 1150... yeah, that is completely silly. Yay for monopoly \o/
     
  7. genie

    genie Guest

    Messages:
    5
    Likes Received:
    5
    GPU:
    Vega 7
    If you were happy with the 2600k, why should you need to upgrade your processor to benefit from peripheral upgrades like M.2 slots and the like? Sure, you'd need to buy a new motherboard, but Intel's frequent socket changes are what ties the chipset upgrade to a CPU upgrade, a tie that doesn't need to exist. The last umpteen Intel processor updates have maintained the same DMI chipset connection, yet Intel's frequent socket swapping effectively forces CPU upgrades when all you might want is an upgrade to USB 3 or M.2.

    Sure, Skylake brought DDR4, and that likely warrants a socket change. Yet compare LGA1155 and LGA1150, and there is very little change in supported technologies. Ivy Bridge supported DDR3 and PCI-E 3.0, and so did Haswell. But if you wanted things like M.2, which started coming in with the Z97 chipset, you needed Haswell, even though M.2 functionality is independent of the processor, as both support the same chipset connection.

    The reality is, Intel most likely could have made the Z97 chipset support LGA1155, allowing you to keep your 2600k and still take advantage of newer technologies like M.2 and more USB 3 ports. But doing that wouldn't force people to buy a new processor, so less money for Intel.
     
  8. holystarlight

    holystarlight Master Guru

    Messages:
    792
    Likes Received:
    62
    GPU:
    Nvidia 4090
    Man, all these socket changes, can't they just make an oversized socket and utilise the pins that are needed, with the extra pins used at a later date when they change the architecture?

    Like, they must have inside knowledge of new memory bandwidth/tech and all that stuff waaaay before they even think about making a CPU.

    Hate having to buy a whole new system each time a high end CPU is released, granted the X99 has lasted for ages now.
     
  9. eclap

    eclap Banned

    Messages:
    31,468
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    Not for you then. Time to move on.
     
  10. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,955
    Likes Received:
    4,336
    GPU:
    HIS R9 290
    I'm finding it hilarious how people are whining about a new socket for a series of CPUs that they likely will never be able to afford for themselves. These aren't consumer-level products...

    Keep in mind, AMD is doing a new socket for its stupidly large server CPUs too.
     

  11. Matt26LFC

    Matt26LFC Ancient Guru

    Messages:
    3,123
    Likes Received:
    67
    GPU:
    RTX 2080Ti
    I find it funny too lol. I mean, these are enterprise-grade server parts, not something anyone here is buying for their little desktop computer at home lol.

    I see no problem with a new socket for consumer-grade processors every other gen either, tbh.
     
  12. warlord

    warlord Guest

    Messages:
    2,760
    Likes Received:
    927
    GPU:
    Null
    :) Only working idiots or spoiled brats care about Intel nowadays, tbh. An idiotic, overpriced company. They only want us to pay more for less each new generation. Their innovation isn't on par with the size of their R&D. But, whatever, move on...
     
  13. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,628
    Likes Received:
    1,119
    GPU:
    4090 FE H20
    You have a pretty closed mind.
    Basically you're saying anyone who has money is an idiot... Well, newsflash :stewpid:.

    Intel is still the best performing. Whether that's worth the cost to someone, that's another story.
     
  14. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,955
    Likes Received:
    4,336
    GPU:
    HIS R9 290
    I've stated in the past that the Guru3D reader base doesn't understand enterprise-grade hardware and that articles like this should probably be avoided for that very reason (I doubt Intel appreciates the negative publicity), though I've been told off for making such statements.


    Personally, I do see a problem with new consumer-grade sockets every other gen. If making a new socket is a necessity then by all means go for it. But I have a very hard time believing Intel needed a new socket for the CPUs that fit in 1155, 2011, and 1150. As for 1151, I'll give that a pass, since that was transitioning to DDR4.

    Meanwhile, look at what AMD accomplished - if you have an AM3 (non-plus) CPU, it can fit in AM2+, AM3, AM3+, and in some cases AM2 boards. These are not pin-identical, and this goes from DDR2 to DDR3. Based on the release dates of the sockets themselves, that's a range of roughly 5 years. If you take into account when AM3+ was superseded by AM4, that would be 11 years. That's impressive. Sure, AM3 CPUs were slower than their Intel counterparts, but not by a wide margin. From another perspective, AMD also had at least 2 completely different architectures compatible with socket AM3+.

    Intel has the funding to make a socket last, but it wouldn't surprise me if they have a deal with motherboard manufacturers where they're expected to break compatibility in order to increase sales. If Intel went AMD's route, motherboard manufacturers would likely lose tens of millions of dollars every year. Think of it this way: motherboard manufacturers maybe give Intel a small bribe for making a new socket, allowing them to make an entire set of sales when Intel releases a new CPU. As for Intel - they don't care. They have their own fabrication facilities, so it's not like they have to pay a 3rd party a hefty price for changing designs. They're going to make sales whether they change the socket or not, so the little bribe they get from the motherboard manufacturers likely pays for the expense of changing things up. In the end, it's a win-win for both companies. The consumer never knows the secrets behind these designs, so they'll never know if the change in socket was ever a necessity.
     
    Last edited: Mar 29, 2017
  15. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    So which is it? Is Intel making the killing, or the MB manufacturers? You two contradict each other.
     

  16. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,955
    Likes Received:
    4,336
    GPU:
    HIS R9 290
    There is no contradiction - like I said in my post, both Intel and the mobo companies will turn a profit. Both of them have to spend a little extra on design changes, but that's worth it if it ensures both sides make sales. Without changing the socket, only Intel will profit. If Intel had kept the same socket for most of their consumer-level DDR3 CPUs, mobo manufacturers would be at a massive loss, because nobody would need to buy a new board when upgrading the CPU. Since Intel isn't really in a position to anger mobo companies, they likely made a deal to change things up once in a while.
     
  17. RedSquirrel

    RedSquirrel Guest

    Messages:
    81
    Likes Received:
    6
    GPU:
    Intel Iris 6100
    All those pins are likely for the six-channel memory, the system buses (PCI-E and more) and copious amounts of ground/power :p Connecting billions upon billions of transistors to the outside world simply needs a lot of connections.

    Desktop-wise they've been messing about for years anyhow, probably because AMD Bulldozer was so utterly crap, though even Optane is utterly worthless for self-builders currently, unless you want to indulge Intel's shonky attempt to market an incomplete product to a non-existent market (people using the latest platform and CPU but not using an SSD...????!). Anyhow, I imagine most people changing their sockets right now are going from an Intel LGA to an AMD PGA :)
     
  18. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    There has to be more to the socket changes than you are making out. You used AM2/AM2+/AM3/AM3+ as an example. I would like to point out that those sockets held back AMD. They did not allow the PCIe controller to be moved onto the CPU die, they held back available memory bandwidth on the FX processors, and overall they hindered AMD from moving forward because they kept having to look back.

    And in your fantasy world, as new tech comes out, people would upgrade their motherboards more often and keep their CPU. Again, MB manufacturers do not gain anything from socket changes. And again, most people will want to upgrade their CPU to the latest available to match their new motherboard, so they not only get all the latest tech (new PCI-e specs, USB upgrades, M.2, Thunderbolt (may you rest in peace (no, seriously, don't come back)), updates to LAN and on-board WiFi/Bluetooth), they will also want the latest instruction sets that come with updated CPU architectures, as well as the bump in IPC.

    TL;DR: If you are that pissed about socket changes, you probably don't like Intel to begin with, and this is just something you latched onto as your go-to argument against the company.
     
  19. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,955
    Likes Received:
    4,336
    GPU:
    HIS R9 290
    "has to be" is as presumptuous as saying "must not be". Some of the CPU architectures (like Ivy Bridge) were supported on multiple sockets. Of the sockets I mentioned, I am not aware of any differences between them that warrant an entirely new backward-incompatible socket. It's one thing to prevent an old CPU being used in a new motherboard (much like what AMD did with AM2 CPUs on AM3 motherboards) but I can't find any valid excuse for what Intel did, other than profit.
    As for AMD, I do agree that them holding onto the same socket probably hurt them (in terms of performance) more than helped. Bulldozer likely would've been better if they did a fresh new socket.

    MB manufacturers absolutely do gain from socket changes, if it means they make a sale from a new CPU purchase. Think of it in my perspective: I have a socket AM3 motherboard. When I bought it, AM3+ wasn't a thing. When Piledriver came around, it was compatible with my chipset. All I had to do was update my BIOS and I could upgrade my CPU. I made that decision because I didn't need to buy another motherboard. In other words, a MB manufacturer lost out on a sale even though I still got a CPU upgrade. Intel makes a hell of a lot more sales than AMD. If everyone who owned a Sandy Bridge could upgrade to a Haswell without needing to upgrade their mobo, then these companies would lose millions. This is business - their income is more important than a convenience for customers that the customers don't have to know could exist.

    Most of the new tech you mentioned is totally irrelevant to most people. Using my mobo as example again, it is nearly 7 years old and yet it can still wholly handle a GTX 1080 (where the CPU would be the main bottleneck). There are PCIe converter cards for M.2 cards, and they're not expensive. Thunderbolt, as you have established, was not of anyone's interest. Gigabit LAN has been available to consumers for about a decade and is still pretty much all you'll find; most people don't have routers that support 4 or 10Gbps. Most motherboards don't come with wifi; if you want to upgrade, it's not hard to just buy a compatible replacement card. Most new instruction sets are rarely taken advantage of for most consumer-level applications. Most people aren't willing to spend hundreds of dollars on just the CPU alone for a 5-15% performance bump in IPC.

    Considering how seldom I upgrade, socket changes are the least of my problems when it comes to Intel. I think you are gravely overestimating how much this bothers me.
     
    Last edited: Mar 29, 2017
  20. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    You love speaking out of both sides of your mouth, don't you? In the process of trying to discredit my POV, you proved my point. Thank you.
     
