Review: Core i7 6700K processor

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 5, 2015.

  1. ScoobyDooby

    ScoobyDooby Guest

    Messages:
    7,112
    Likes Received:
    88
    GPU:
    1080Ti & Acer X34
    Guess I better get comfortable with my 2600k sticking around for the foreseeable future.
     
  2. Chillin

    Chillin Ancient Guru

    Messages:
    6,814
    Likes Received:
    1
    GPU:
    -
    I'm still hoping for a Skylake compute stick from Intel; would be amazing.
     
  3. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    What kind of symptoms? Just bought the same mb for a 2500k rig.
     
  4. Moegames

    Moegames Guest

You can upgrade your current hardware to Windows 10... then if your mobo goes bad or you build a new rig, you can "re-activate" that same Windows 10 that was on your old rig or old mobo. Microsoft will indeed allow you to use the same original Windows 10 key on a new rig or new hardware/mobo; you just do the simple phone activation. There are so many rumors floating around about this, but I already did it and it works... you gotta read the "fine print", ha! But for real, bro, you're fine to upgrade now or build a new rig and use the same original Win10 key.


Makes no sense to upgrade right now until Intel starts pushing out more cores on all their processors (like you mention), not just the uber high-end stuff, but the current and previous lineup of mainstream gaming chips: the Sandy, Ivy, Haswell and Skylake 2600K, 3770K, 4770K, 6700K, etc.

These current and previous gens of Intel chips are more than sufficient for gaming, and even the overclocked Sandys are still badass gaming chips! If you are a gamer, all you need to focus on for now and the foreseeable future is the GPU. The only thing I can see being crucial in the future is more cores. Why Intel hasn't jumped fully into adding more cores is baffling, but who cares; I'm seeing a new trend of games making better use of threads, so we are fine with previous and current Intel Core chips as far as gaming is concerned.

I currently have a two-year-old delidded Intel 3770K @ 5.0 GHz stable, and this Ivy is good to go for some years. The only thing I need to worry about is when to upgrade to a newer GPU.
     
    Last edited by a moderator: Aug 6, 2015

  5. E^vol

    E^vol Guest

    Messages:
    17
    Likes Received:
    3
    GPU:
    Gigabyte 670 OC
    I don't even see much reason to upgrade from my i7-2600k.....
     
  6. Ursopro

    Ursopro Guest

    Messages:
    52
    Likes Received:
    0
    GPU:
    Gigabyte 7950 WindForce3
    I'll stick with my 4770K :)
     
  7. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
Keep in mind that I've had my board since release. It's also spent most of its life running benchmarks, stress tests and Folding@Home for days, weeks or months at a time. This is the first motherboard I've had last longer than a year in my possession... lol

After a few hours, if I stress the CPU, the system will shut down. CPU temp is just fine. After 2-3 shutdowns, I have to flip the switch on the PSU for a few minutes before it will power on again. The last 4 boards I had did the same thing and had burn marks on the VRMs... lol

    Edit: It's down again....lol
     
    Last edited: Aug 6, 2015
  8. seaplane pilot

    seaplane pilot Guest

    Messages:
    1,295
    Likes Received:
    2
    GPU:
    2080Ti Strix
My golden 3930K @ 4.7 GHz is still sufficient. Until next time, Intel.
     
  9. Aethanite

    Aethanite Guest

    Messages:
    1
    Likes Received:
    0
    GPU:
    Geforce GTX 980 4gig
    First time poster, long time reader.

    I think I already know the answer, but thought I might throw it out there and see some opinions.
I'm currently running an i5 3570 (non-K) CPU, first of its gen, 8 GB DDR3-1600, a Z77 M/B that supports non-K OC, and a Gigabyte GeForce GTX 980 WindForce. Also running the OS off an SSD, with a second 250 GB SSD for gaming and multiple 1-2 TB platters for storage. I have a 750 W power supply.

Now, when I first built this PC a few years ago, I had a GeForce 660 Ti as the GPU, but due to it unfortunately packing in a couple of months back, I decided to future-proof myself a bit and put a GTX 980 in it. I now have the problem where I think the CPU is bottlenecking the 980, which leads me to upgrade my CPU, MB and RAM in the coming months.

My question is this: would a 6700K, M/B and DDR4 be enough of an upgrade from my current parts, or would I be better off paying approximately the same price for a 5820K, M/B and DDR4? I'm not interested in SLI at the moment or in the near future, because the 980 really does handle anything I throw at it (I game at 1080p only, so I know it's kind of overkill). What would be the preferred route to take, or should I perhaps hold off until next year?

    Thanks!

    Aethanite
     
  10. StrongForce

    StrongForce Guest

    Messages:
    17
    Likes Received:
    0
    GPU:
    r9 290x Windforce
Could you test at higher resolutions, like you do with graphics cards? I wonder if the gains might be bigger then.


What games do you play most? Do you get FPS drops (like I do with my FX... lol)? Does it bother you? If so, it's worth it. But yeah, in Europe the 6700K is way overpriced: 410 CHF at my local shop in Switzerland, and 395 euros in an online store... what the...

Also, I checked the prices of the 5820K, and remember, it's more like 600 euros now, where the 4930K was 500-ish if I remember correctly. We're suffering a massive milking!
     
    Last edited: Aug 6, 2015

  11. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,725
    Likes Received:
    1,855
    GPU:
    EVGA 1070Ti Black
Hmm, I've been reading a $350 MSRP for the 6700K. If so, Microcenter will probably sell it for under $300, I hope.
     
  12. Yoshpop

    Yoshpop Member Guru

    Messages:
    106
    Likes Received:
    0
    GPU:
    1080 SLI 2.05GHz
    These releases have been a little frustrating to be honest. I remember how much fun it was to drop in the 2600k when it came out. I wish something as game-changing would come around again. I'm curious to see what Skylake-E has to offer. But then again, maybe my 3960x will last a few more years. For me to jump ship, I'd like to see Intel offer an 8-core processor that isn't pushed as an extreme edition; otherwise I'll probably just add a third 290x and ride this CPU out.
     
    Last edited: Aug 6, 2015
  13. thatguy91

    thatguy91 Guest

I think my i5-3570K will hold out until AMD Zen is released :). Considering Zen will go up against this CPU, I think it may be a worthy adversary. The only question will be whether to get the 4-true-core, 8-thread APU with a GPU, seeing as there may be load balancing with a discrete card, or the 8-true-core monster with no GPU. The 8-core should, in theory, obliterate the current Intel offerings.

In the review, a comparison was done between 2133 RAM and 3200 RAM, which showed the 3200 RAM to be effectively pointless. One of the issues with faster RAM, though, is timings, so what would be interesting is how these affect the benchmark results.

For example, I looked on the G.Skill website and found the following:
DDR4-2133: 15-15-15-35 (pretty much all models, except for one at 15-15-15-36)
DDR4-2400: 14-14-14-34 ([Ripjaws 4] F4-2400C14Q-16GRK)

Now, I highly suspect the DDR4-2400 at those timings will be beneficial over the DDR4-2133. I also suspect the difference may become more noticeable once overclocking comes into play.

There's also the consideration of RAM tweaking. If you had the 2400 RAM, put slightly higher voltage through it, then tweaked things like the refresh interval, 1N vs 2N command rate, etc., how much difference would that make?

There were a lot of reviews done with Sandy and Ivy and RAM at 1333, 1600, 2133, etc.; it would be good to see a similar review done with Skylake. The key thing to point out about the Sandy and Ivy reviews was that the Sandy CPU and chipset showed anything faster than 1600 wasn't really worth it; from the Ivy CPU + Z77 onwards, however, 2133 proved to be the 'sweet spot'. People often then referred back to the Sandy results, which were of course no longer relevant.

The thing that was common in these reviews, though, was that beyond 2133 you got greatly diminishing returns, and I believe some even showed poorer performance with DDR3-2800 than with 2133. I believe the same thing is happening here: DDR4-2133 may be the equivalent of the 1600 RAM, and the 3200 could be the equivalent of the 2800 RAM.

    A DDR4-2400 RAM test with good timings like the module above will likely show definitively whether faster than 2133 is worthwhile, at least on the Skylake platform (things may change later as DDR4 matures and timings drop).

    Comparisons done should be:
    • base comparison (standard XMP profile RAM with standard CPU clock)
    • CPU overclocked performance comparison (seeing as most with the K processor will overclock)
    • CPU overclocked + tweaked RAM settings comparison (since this is what most enthusiasts buying the better RAM in the first place would do)

Of course, this would only make sense if the faster RAM is like the kit I suggested above (good timings with a small bump in speed). The reason for the small bump is the same as with DDR3: if you compared DDR3-1600 and DDR3-2800, you might conclude that anything faster than DDR3-1600 isn't worth it, because the benefit is very small. However, DDR3-2133 has been shown to be the 'sweet spot' for Ivy Bridge CPUs and later, since RAM speed and timings aren't a simple matter of scaling well.

    What do people think, would this be a worthwhile experiment?
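A quick way to see why the 2400 CL14 kit above could edge out 2133 CL15 is first-word latency: CAS cycles divided by the memory clock (half the data rate). A minimal sketch of that arithmetic; the 2133 CL15 and 2400 CL14 figures come from the kits listed above, while DDR4-3200 CL16 is an assumed example timing for comparison:

```python
def first_word_latency_ns(data_rate_mts: int, cas_latency: int) -> float:
    """Approximate time (ns) to the first word of a read burst.

    data_rate_mts: transfer rate in MT/s (e.g. 2133 for DDR4-2133)
    cas_latency:   CL in memory-clock cycles
    """
    clock_mhz = data_rate_mts / 2       # DDR transfers twice per clock cycle
    return cas_latency / clock_mhz * 1000  # cycles / MHz = us; x1000 -> ns

# Kits from the post above, plus an assumed DDR4-3200 CL16 for comparison
for rate, cl in [(2133, 15), (2400, 14), (3200, 16)]:
    print(f"DDR4-{rate} CL{cl}: {first_word_latency_ns(rate, cl):.2f} ns")
```

By this crude measure the 2400 CL14 kit reaches the first word roughly 2.4 ns sooner than 2133 CL15, which is exactly the kind of margin the tweaked-settings comparison proposed above would expose (real performance also depends on bandwidth and secondary timings, so this is only a rough indicator).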
     
    Last edited by a moderator: Aug 6, 2015
  14. moab600

    moab600 Ancient Guru

    Messages:
    6,658
    Likes Received:
    557
    GPU:
    PNY 4090 XLR8 24GB
It seems Intel can't, or doesn't want to, bring the wow factor we had with Sandy Bridge.

Skylake is a pointless upgrade if you have a high-clocked Sandy, Ivy or Haswell; for Sandy owners the draw won't be CPU performance but the whole new chipset.
     
  15. Lolcibolci

    Lolcibolci Guest

    Messages:
    18
    Likes Received:
    0
    GPU:
    ASUS GTX 1080 Strix
    +1. I would like some new features, but at this price/performance ratio, long live the 2600k! :D
     

  16. StrongForce

    StrongForce Guest

    Messages:
    17
    Likes Received:
    0
    GPU:
    r9 290x Windforce
Oh, I got mixed up on the prices: it's the 5930K that's 600 euros; the 5820K is indeed 400 euros-ish. Wow, I just don't get it...
     
  17. thatguy91

    thatguy91 Guest

Let's hope AMD Zen is as good as it seems. Even if you are anti-AMD, you should still be hoping that Zen is not only good but also provides real competition. You'd have to be pretty clueless not to, since the lack of competition is exactly why Intel hasn't really produced wow-factor performance at the moment.
     
  18. anticupidon

    anticupidon Ancient Guru

    Messages:
    7,878
    Likes Received:
    4,126
    GPU:
    Polaris/Vega/Navi
I will hold off on my upgrade until Zen is out.
My rig, as old as it may be, just works and has never skipped a beat.
The only thing that could motivate me to upgrade is power consumption.
     
  19. Fusion_XT

    Fusion_XT Master Guru

    Messages:
    852
    Likes Received:
    0
    GPU:
    MSI GTX1080 X 2139/5400
I'm in the same boat; I just upgraded to a 3930K/X79 from a 980X/X58. Got it really cheap, so it was a no-brainer, and sold my old X58 setup for a lot! (Still popular, I guess.)

Running it now at 4.3 GHz. I hoped Skylake would bring some real performance upgrades, but it didn't... I had my X58 for more than 6 years, so I'm guessing with DX12 incoming this 3930K will last me another 3 years. :infinity:
     
  20. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
How about giving that hardware away instead of throwing it in the garbage...

    It's still a very potent CPU...

    Of course it's a step back.
     