High-end Skylake processors to get yet another socket

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 20, 2016.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    36,779
    Likes Received:
    5,860
    GPU:
    AMD | NVIDIA
    Intel is a big fan of swapping out CPU sockets as often as it can. It seems that with the successors to Broadwell-E, Skylake-X and Kaby Lake-X (yes, the X is new), we'll move to Socket 2066. ...

    High-end Skylake processors to get yet another socket
     
  2. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    11,132
    Likes Received:
    3,219
    GPU:
    2080Ti @h2o
    Not surprised, somebody has to sell new mainboards :rolleyes:
     
  3. slyphnier

    slyphnier Master Guru

    Messages:
    688
    Likes Received:
    45
    GPU:
    GTX1070
    This time Intel's naming is kind of confusing.
    I thought -X was replacing -E for the enthusiast class.

    Then what about Kaby Lake-X (4 cores / 16 lanes)?
    Are they planning to merge the desktop sockets into just one type now?
     
  4. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,837
    Likes Received:
    2,236
    GPU:
    5700XT+AW@240Hz
    It is just to satisfy the upgrade itch, because there will be many people who may move to Zen just because it is a new and shiny platform, even if it is only a small upgrade over their few-years-old Intel system.

    By releasing a new socket, manufacturers will make new boards and flood the market with even more products for customers to pick from.
    (Decreasing the chance that someone will jump ship to AMD.)

    It is sound market practice. But unless it brings some new features which were not possible otherwise, it is bad for the customer.
     

  5. Dellers

    Dellers Active Member

    Messages:
    62
    Likes Received:
    1
    GPU:
    MSI 2080 Ti Trio
    That late? I was expecting Skylake-E in Q2, like Broadwell-E. After all, Skylake is already quite old, and more than two years after the mainstream CPUs is late. It may be too late for me then, but I really don't want to buy a new machine right before a new socket comes out either. As always, I assume the strong dollar won't make prices go down, but rather up, as usual. It can't be worse than graphics cards though, with a 140% price jump in 5 years because of both exchange rates and increased launch prices. I expect the GTX 1080 Ti to be almost 3x as expensive as the 580 was here in Norway, and I guess Intel will follow that up nicely given the lack of competition.
     
    Last edited: Jul 20, 2016
  6. slyphnier

    slyphnier Master Guru

    Messages:
    688
    Likes Received:
    45
    GPU:
    GTX1070
    It's been the trend for Intel to release the enthusiast class late.
    And rather than CPU performance, the differentiation has kind of shifted to the PCH, which gives "extra" compared to the mainstream PCH.

    The enthusiast CPUs are just more cores compared to mainstream rather than a big improvement; even where there is improvement it's only slight (looking at Ivy Bridge - Haswell - Broadwell), unless you fully use all the cores.
    So it's not really worth it on price/value.

    We can say the Intel enthusiast platform is more like a bundle (CPU + PCH/mobo).

    But maybe that will change with Kaby Lake-X... if Kaby Lake-X is intended for mainstream,
    then people can choose which CPU they want, either the enthusiast version or the mainstream one... and considering the socket is the same, the motherboard should support both.
     
  7. MainFrame Alpha

    MainFrame Alpha Member Guru

    Messages:
    139
    Likes Received:
    3
    GPU:
    Strix GTX 980 TI
    Was this on Intel's roadmap, or is it an answer to the Zen engineering samples surfacing?! Who knows, maybe Zen is the one to bring back balance to the force :nerd:
     
  8. BLEH!

    BLEH! Ancient Guru

    Messages:
    5,918
    Likes Received:
    91
    GPU:
    Sapphire Fury
    Intel generally supports a socket for 2 generations.
     
  9. Dazz

    Dazz Master Guru

    Messages:
    863
    Likes Received:
    89
    GPU:
    ASUS STRIX RTX 2080
    Hardly a surprise; they have to keep their chipset business going even if it's not really required.
     
  10. GALTARAUJO

    GALTARAUJO Active Member

    Messages:
    54
    Likes Received:
    0
    GPU:
    2 x GTX980 Strix
    This is impossible to do. Can you imagine a 2011-pin socket that is fully compatible with 1150, 1151 and 2011 (both versions)?
    Anyway, if all this new chipset has to offer is more SATA/USB ports and LAN, then adoption will be slow, unless the CPUs are overwhelmingly powerful, which I do not expect to be the case.
     

  11. EspHack

    EspHack Ancient Guru

    Messages:
    2,456
    Likes Received:
    39
    GPU:
    ATI/HD5770/1GB
    I got confused at the Kaby Lake-X part...
     
  12. Rentesh

    Rentesh New Member

    Messages:
    1
    Likes Received:
    0
    GPU:
    EVGA GTX 770 SSC
    I too am confused about the Kaby Lake-X part. Quad-core on HEDT in 2017?!?!?!
     
  13. Size_Mick

    Size_Mick Master Guru

    Messages:
    419
    Likes Received:
    211
    GPU:
    Asus GTX 1070 8GB
    I just purchased an i5 6600k last month on the advice that most games don't take advantage of hyperthreading and many don't take advantage of more than 2 cores. Soon they will have these new CPUs coming out and I'm wondering if anyone thinks things will have changed by then? Or will we (gamers) be just throwing good money away with this?
     
  14. sunnyp_343

    sunnyp_343 Master Guru

    Messages:
    501
    Likes Received:
    25
    GPU:
    Asus ROG GTX 1080
    Intel likes changes.
     
  15. thatguy91

    thatguy91 Ancient Guru

    Messages:
    6,648
    Likes Received:
    98
    GPU:
    XFX RX 480 RS 4 GB
    Hyperthreading is useful when there are threads that are not maximising the performance of the core. Normally the workload would switch between threads as required, with processing time lost on each switch. With hyperthreading, two threads can be processed in parallel, allowing previously unused capacity in the core to be put to good use.

    For a single application like a game to benefit, the workload needs to be effectively broken into different streams, without one stream having to wait for data from another stream to be completed. Graphics in a game is by nature very parallel; however, the non-graphical processing components of the game may not be.

    One aspect that could greatly benefit from parallelism is physics processing. This is why Nvidia PhysX is done on their GPUs, with the downside, of course, that it takes away processing power that would otherwise be used for other graphical elements. Nvidia PhysX is actually a bad example of CPU-based physics, as CPU PhysX is still very much not coded for peak performance. There was no incentive for Nvidia to do so, as they were promoting their GPUs through PhysX; if they highly optimised CPU-based PhysX, its performance would be much closer to GPU PhysX. If the GPU is already maxed out by game elements and CPU usage is relatively low, CPU PhysX could have the potential to be faster, if it were coded correctly, as it would not take processing time away from other graphical elements.

    For a long time their PhysX on CPU was plain x86 code. In theory the 32-bit version could run on an 80386... from 1985! That's why it performed so badly on CPU. Most likely as a result of the bad publicity when this became known, Nvidia added the use of SSE2, but even that is old, and much of the code could benefit greatly from more modern instruction sets like AVX. The code could also be made hugely parallel, so you could make use of 32 simultaneous threads for physics processing, if it were coded correctly (which it isn't).

    So the benefit of extra threads depends on the game. I think we may see more thread-friendly games in the future, especially if, for example, the upcoming Xbox 'Project Scorpio' uses a custom 8-core Zen-type CPU with HT.

    Unfortunately games will reflect what is happening in the console world. I'm sure games will make use of the extra performance of 'Project Scorpio' for things other than virtual reality. 4K gaming would be interesting on it, but even with the updated specs I highly suspect some game quality features will be reduced in order to draw at 4K resolution.
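    To make the idea of "breaking the workload into different streams" concrete, here's a minimal sketch (plain Python for illustration only, not PhysX or any real engine code; note that CPython's GIL limits CPU-bound thread scaling, so real engines use native threads). Each particle update is independent, so the list can be split into chunks and handed to separate workers with no chunk waiting on another:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def step_chunk(chunk, dt=0.1):
        # advance each particle independently: (pos, vel) -> (pos + vel*dt, vel)
        return [(pos + vel * dt, vel) for pos, vel in chunk]

    def step_parallel(particles, workers=4):
        # split the particle list into one chunk per worker; the chunks share
        # no data, so this is "embarrassingly parallel" physics-style work
        size = (len(particles) + workers - 1) // workers
        chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
        with ThreadPoolExecutor(max_workers=workers) as pool:
            results = pool.map(step_chunk, chunks)  # map preserves chunk order
        return [p for chunk in results for p in chunk]

    particles = [(float(i), 1.0) for i in range(8)]
    print(step_parallel(particles, workers=4))
    ```

    The same split-and-join shape is what lets such code scale to however many hardware threads (including hyperthreads) are available.
    
    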
     
    Last edited: Jul 23, 2016

  16. slyphnier

    slyphnier Master Guru

    Messages:
    688
    Likes Received:
    45
    GPU:
    GTX1070
    It simply depends on how you like to play your games.
    If you're fine as long as you get a stable 60+ FPS at 1440p, then your CPU will mostly be sufficient for a while... as long as you upgrade your GPU.

    But if you want the fully immersive experience, like 4K or VR with graphics settings at high-ultra, then your CPU might bottleneck your system.
    Then again, that CPU bottleneck might only happen if you're using something like 3-way or quad SLI.

    Either way, rather than the CPU, the GPU is the one giving the major improvement in gaming.
     
