Core i7-6700K and Core i5-6600K Skylake Specs?

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 22, 2015.

  1. xIcarus

    xIcarus Guest

    Messages:
    990
    Likes Received:
    142
    GPU:
    RTX 4080 Gamerock
    Where is this technological advancement on AMD's side that I'm not aware of? They have 3-year-old CPUs on the market, for crying out loud. Their architecture-to-architecture improvements have been close to null.
    At least Intel made its architectures more efficient, even if their performance hasn't increased a lot. But then again, WHY would Intel give us more performance? They don't have to; AMD cannot compete. They barely even focused on laptop CPUs, and those have actually seen decent performance improvements, purely as the fruit of more efficient desktop architectures.

    AMD's downfall in the CPU market is simply due to the inefficient nature of Bulldozer. You cannot expect people to buy a 220W CPU when its similarly performing counterparts draw almost 2.5x less power. Hell, my GPU doesn't eat that much power. And a byproduct of this inefficiency is that it's impossible to compete in the laptop market. You cannot throw a 50W CPU in a laptop and expect it to sell, not when the competition offers far better performance from the same 50W.
    Then come the stupid market decisions, like trying to force-feed us those APUs that nobody gives a crap about, because gamers will always prefer a dedicated card and you don't need that kind of GPU power for HTPC builds.
    The APU idea is great; I commend AMD for it. But APUs are and will continue to be bottlenecked by RAM throughput, and there is absolutely no way around that, not even with DDR4. They need to turn their attention away from those damn APUs.

    Saying that AMD's chips from 2012 are more advanced than today's Intel chips is the worst twaddle I've heard in years; have you been living under a rock? AMD began losing CPU market share starting with Intel's Core architecture, and that was 9 years ago.
     
    Last edited: Apr 22, 2015
  2. Fender178

    Fender178 Ancient Guru

    Messages:
    4,194
    Likes Received:
    213
    GPU:
    GTX 1070 | GTX 1060
    Since this will be a tock CPU, it makes me wonder whether Intel will go back to solder between the heat spreader and the CPU die. Or is that reserved only for the -E based CPUs with 6-8 cores or more?
     
  3. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    There was an article I posted before that I can't find now, but basically it explained why they moved away from solder. The gist was that the die sizes of modern desktop chips are shrinking faster than their power consumption is dropping, so the same amount of thermal energy is essentially spread across a smaller area. The fluxless solder they were typically using as TIM was causing issues over such a small area: when it was heated and cooled repeatedly, it would crack.

    There are different grades of solder, and I think that's what they ended up using for Devil's Canyon and the like, but it probably costs more, requires retooling at the manufacturing level, and probably isn't worth the extra cost on regular consumer chips (the vast majority of people don't overclock).

    I imagine the larger chips will still get solder, since their die area is bigger, but the cheaper stuff will probably stick with paste.
     
  4. nexus_711

    nexus_711 Master Guru

    Messages:
    221
    Likes Received:
    0
    GPU:
    Inno3D GTX 1080 Twin X2
    /\ /\
    Well said! :thumbup:
     

  5. Hughesy

    Hughesy Guest

    Messages:
    357
    Likes Received:
    1
    GPU:
    MSI Twin Frozr 980
    I thought Skylake would be the one; at the moment we know nothing, really. My 3570K is still going strong, but I do think it struggles a bit in some games (lowest framerate). GTA V really pushes my CPU, and I feel I'd get a consistent 60fps at all times with a better CPU. I really wanted an affordable 6-core Intel CPU. I'd probably plump for an i7, but I'll wait to see the performance before passing judgement.
     
  6. Visor

    Visor Guest

    Messages:
    1
    Likes Received:
    0
    GPU:
    GTX760
    I was looking for news on Skylake on Google and stumbled upon this thread. Since I'm extremely hyped for the upcoming CPUs, I took the trouble of registering here and replying to a few posts I don't agree with.

    I've shortened the links with bitly and replaced them with text since I don't have 5 posts yet.

    I don't think that's the strategy Intel plans to keep on following.

    According to a financial news report (bit.ly/1OevbmS), Intel's CEO believes people have less incentive to upgrade, because even 4-year-old platforms remain powerful enough for most PC users.

    At any other time Intel, as the de facto monopolist in the high-end market, would normally prefer to artificially slow down progress. But in the current situation, with dwindling PC sales, they have no choice but to get back on track and start rolling out real "tocks" capable of scratching the upgrade itch. We're talking about at least a 20% performance increase over last-gen. But I personally believe it will be far more: 30-40% on average, with up to 70% in specialised tasks like encryption, voice recognition, etc. (compared to Haswell/Devil's Canyon/Broadwell).

    I don't think being so categorical in your estimations is the right choice here.

    bit.ly/1HkQbUT : a SiSoft Sandra benchmark for an engineering sample of a quad-core Skylake @ 2.3GHz with HT, made at the end of last year. ~97 GOPS
    bit.ly/1JbBlka : 4790 @ 4GHz. ~120 GOPS
    bit.ly/1DhJRXJ : i7-4700MQ @ 2.4GHz. ~70 GOPS

    I suppose these results give us some good hints. Assuming ideal conditions, a similarly clocked Haswell is roughly 30% slower than an ES Skylake.
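    To make the comparison explicit, here is a quick sketch of the normalization behind that ~30% figure, using only the GOPS numbers and clock speeds quoted above (the Sandra figures are as posted; this is back-of-the-envelope scaling, not a proper IPC measurement):

```python
# Per-clock throughput comparison from the Sandra numbers quoted above.
# Assumes throughput scales roughly linearly with clock speed, which is
# only an approximation (memory bandwidth does not scale with core clock).

def gops_per_ghz(gops: float, ghz: float) -> float:
    """Normalize aggregate benchmark throughput by clock speed."""
    return gops / ghz

skylake_es = gops_per_ghz(97, 2.3)   # quad-core Skylake ES with HT
haswell_mq = gops_per_ghz(70, 2.4)   # i7-4700MQ, also quad-core with HT

deficit = 1 - haswell_mq / skylake_es
print(f"Skylake ES: {skylake_es:.1f} GOPS/GHz")   # ~42.2
print(f"Haswell:    {haswell_mq:.1f} GOPS/GHz")   # ~29.2
print(f"Haswell is ~{deficit:.0%} slower per clock")  # ~31%
```

    The i7-4790's 120 GOPS at 4GHz works out to ~30 GOPS/GHz, which is consistent with the mobile Haswell result, so the per-clock gap does not appear to be a fluke of the laptop part.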
     
  7. kenoh

    kenoh Active Member

    Messages:
    92
    Likes Received:
    0
    GPU:
    TitanXP 2.1GHz
    The prices for entry-model E chips have gone down so much that it wouldn't make sense to go with the K models unless you're really on a tight budget! If you're willing to spend over $300 on a CPU, why not go the full monty when the next model up is only a $30 price difference? It's stupid not to release a mainstream 6-core! Even without hyperthreading, it's still more beneficial than paying almost the same amount for a 4-core with hyperthreading!
     
  8. FranciscoCL

    FranciscoCL Master Guru

    Messages:
    267
    Likes Received:
    59
    GPU:
    RTX 3080 Ti
    Better wait for DX12 (CPU usage improvements) and then decide.
     
  9. Fender178

    Fender178 Ancient Guru

    Messages:
    4,194
    Likes Received:
    213
    GPU:
    GTX 1070 | GTX 1060
    Ok thanks for the information.
     
  10. kosh_neranek

    kosh_neranek Guest

    Messages:
    341
    Likes Received:
    0
    GPU:
    Palit GTX 1070@2101 Boost
    Year by year I feel better about cashing out all that money on a then brand-new 3930K. C'mon Intel, can you bring something worth upgrading to? DDR4 on its own is no reason to upgrade.
     

  11. icedman

    icedman Maha Guru

    Messages:
    1,300
    Likes Received:
    269
    GPU:
    MSI MECH RX 6750XT
    How many damn sockets do they need? I could understand changing sockets with the move to DDR4 and all, but now they're getting ridiculous.
     
  12. Cartman372

    Cartman372 Maha Guru

    Messages:
    1,469
    Likes Received:
    0
    GPU:
    EVGA 1660Ti
    They want to force people to have to upgrade motherboards as well. Intel makes more money that way.
     
  13. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Honestly, Intel is probably doing motherboard manufacturers a favor by changing the sockets; they're the ones suffering worst from the lack of performance improvements.
     
  14. Solfaur

    Solfaur Ancient Guru

    Messages:
    8,012
    Likes Received:
    1,532
    GPU:
    GB 3080Ti Gaming OC
    I'm not excited at all; maybe I'm wrong, but I doubt the performance boost will be significant. If DDR4 weren't so damn expensive I would rather go X99 (or its Skylake equivalent), since there I at least have the option of 6 or even 8 cores. I will certainly not upgrade to another quad core (unless something unexpected happens to my aging rig and I don't have the money for anything else).
     
  15. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    One thing I will say is that Intel always shows improvement in chipset performance with every new socket, whereas AMD does not, having stayed on the same sockets for years. But then you look at FM2 and chipset performance with the A78 and A88 chipsets, and those are flying compared to the 8 or 9 series on AM3.
     

  16. bemaniac

    bemaniac Master Guru

    Messages:
    341
    Likes Received:
    27
    GPU:
    Zotac 4090
    Urgh, we all say it every time. Go back on the forums to when we all had 2700Ks or 930s and everyone's like "holy crap, Skylake is when things are going to be amazing."

    Here we are a few years on and nothing has changed. Those same 930 and 2700K owners are experiencing the same gameplay and similar encoding performance to today's £1000 processor owners.

    Basically when a totally new tech comes out, buy the first mainstream affordable one and keep it for 8 years. It seems to work.

    I have no idea why I went from a 930 to a 3820K, or even why I went from the 3820K to a 4930K. I guess I was just bored and had money to burn, but I've had no improvements for gaming, and woohoo, I can create a WinRAR archive a few seconds faster or re-encode a movie with AC3 a minute quicker. But my god, I'd still be happy running 3 Titan Xs off an i7-930...

    In years gone by you'd have to totally change your PC every year because of the processor when certain games came out, but as we know, even the upcoming Windows 10 stuff runs on almost anything.
     
    Last edited: Apr 22, 2015
  17. red6joker

    red6joker Guest

    Messages:
    572
    Likes Received:
    0
    GPU:
    MSI GTX 980TI Gaming
    I agree. I went AMD when the Phenom IIs came out; the 1090T was truly an amazing CPU for AMD, but after that I only stuck with AMD because of the price: $139 for my FX-8320 versus $339 for a 4790K.

    I am not looking to upgrade until at least 2016 (unless something so amazing I have to upgrade or die comes out), and if AMD's "Zen" can't live up to the rumors, the way Bulldozer and Piledriver couldn't when they came out, I will be going back to Intel regardless of price.
     
  18. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,750
    Likes Received:
    1,868
    GPU:
    EVGA 1070Ti Black
    The i7-6700K might be my new CPU once those Skylake motherboards come out. A 4GHz stock clock; I've been waiting for those CPUs.
     
  19. Undying

    Undying Ancient Guru

    Messages:
    25,477
    Likes Received:
    12,883
    GPU:
    XFX RX6800XT 16GB
    You can get that now; the 4790K is a 4GHz stock CPU, nothing new there.
     
  20. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,516
    Likes Received:
    2,361
    GPU:
    Nvidia 4070 FE
    Just imagine how deafening the cheering would have been if the i7-6700 and Core i5-6600 had been six-core CPUs. Intel would most certainly have seen a spike in sales, with people finally feeling they'd get a real extra kick from the upgrade, no questions asked. It's plain strange that they stick to four cores after all these years. Undoubtedly, the reason software hasn't developed more broadly to put multiple cores to efficient use is that Intel has held back: it's the market leader and trendsetter, and the underdog AMD couldn't change the situation.
     