Core i7 3770K & 3750 review with Z77 DZ77GA-70K mobo [Guru3D.com]

Discussion in 'Frontpage news' started by Guru3D News, Apr 23, 2012.

  1. PhazeDelta1

    PhazeDelta1 Moderator Staff Member

    Messages:
    15,616
    Likes Received:
    13
    GPU:
    EVGA 1080 FTW
It's more of a side-grade if you already have a 2600K/2500K. But this is Guru3D, where we upgrade ur parts for teh lulz. :D
     
  2. Lane

    Lane Ancient Guru

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
It's what worries me a bit right now... If these CPUs easily hit 5.5-5.6GHz under H2O for benchmarking, that's good, but if you're constrained to clocks similar to the 2600K (say 5.2-5.3GHz in the average case), it won't be worth the investment (even from an overclocking and benching point of view).
     
  3. Mufflore

    Mufflore Ancient Guru

    Messages:
    11,667
    Likes Received:
    753
    GPU:
    1080Ti + Xtreme III
It's certainly not worth upgrading from an overclocked 2500K.
Especially when changing CPUs can damage the motherboard socket pins, and those have no warranty cover.
     
  4. PhazeDelta1

    PhazeDelta1 Moderator Staff Member

    Messages:
    15,616
    Likes Received:
    13
    GPU:
    EVGA 1080 FTW
True

But even if they could get the temps under control in a newer revision, is that extra 300MHz of OC really worth it? You're bound to reach a plateau with either CPU at some point. You'd be better off putting that extra cash towards something else.
     

  5. Lane

    Lane Ancient Guru

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
Absolutely not for gaming or the average user... I'm just trying to figure out whether, for H2O benching, it can be worth the investment (benching, because 300MHz more will give a pretty good score advantage without upgrading to a third GPU, provided of course you're not unlucky with a bad overclocker)...
     
  6. kapu

    kapu Ancient Guru

    Messages:
    3,745
    Likes Received:
    6
    GPU:
    MSI Geforce 1060 6gb
I'm on an i5 750 @ 4.0GHz (the OLD dog) and I still don't see the point of upgrading.

And I wanted to spend some money big time.
     
  7. Zboe

    Zboe Master Guru

    Messages:
    533
    Likes Received:
    0
    GPU:
    GALAX GTX 970
No excuse for that kind of negligence. If you pay even half attention, this should never be an issue. The fact that you even say that on a site like this actually worries me. That is not a valid reason for deciding not to change a CPU.
     
  8. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    15,849
    Likes Received:
    394
    GPU:
    EVGA GTX 1080 Ti SC2
    I'm not impressed by these new chips at all, particularly the high temperatures when overclocking. Seems to me that the i7-2600K is better for overclocking and just as fast for most tasks at the expense of slightly higher power consumption.

    And to think I had my mind set on upgrading to Ivy Bridge at the start of this year. I think I'm more than happy to wait for something better to come along, whether that's Haswell or whatever.
     
  9. Mufflore

    Mufflore Ancient Guru

    Messages:
    11,667
    Likes Received:
    753
    GPU:
    1080Ti + Xtreme III
Well, I'm no newb to PC building, and my first two P67 boards got bent pins in the socket in different places; one was from Asus, the other Gigabyte.
Both were with different 2500K CPUs.

The Asus bit it first, then I got the Gigabyte and was super, super careful. Same bloody thing, I couldn't believe it!

Despite following the instruction leaflet, both were dead, and with no warranty cover.
These sockets are very fragile, and there will inevitably be some that are easier to damage.
The manufacturers know there is a rate of failure, but none of it is covered under warranty.

You can bend the pins taking the CPU out as well as fitting it.
I recommend turning the board upside down to extract the CPU.

FYI:
Gigabyte replaced the board without giving a reason.
I couldn't get past Overclockers UK's support to even try to RMA the Asus board, so I took a £150 hit on that board and learned where not to shop (again, sucker for punishment).
     
    Last edited: Apr 24, 2012
  10. slickric21

    slickric21 Ancient Guru

    Messages:
    2,458
    Likes Received:
    4
    GPU:
    eVGA 1080ti SC / Gsync
I'm going to get the 3770K, definitely.

I only got the 2500K six months ago as a cheap tide-over until Ivy Bridge; an i7 is really what I want in my rig.

PCI-E 3.0 means I can SLI/CrossFire with no performance penalty in future, plus the new Lucid Virtu MVP seems interesting (HyperFormance mode especially).
     

  11. IPlayNaked

    IPlayNaked Banned

    Messages:
    6,559
    Likes Received:
    0
    GPU:
    XFire 7950 1200/1850
I don't know, I tend to agree with him. Pins are fickle things, and while I wouldn't put off an upgrade because of that, I can't say I would blame someone for saying it.

God knows the horror I've felt bending pins in the past.


Honestly, PCI-E 3.0 will offer no benefit in speed, even with CrossFire/SLI.

And, as someone who has tried out HyperFormance... it's crap. It doesn't offer any performance boost in most games, a marginal one in others, and some games don't even start up, they just crash on launch.

There's the no-tearing, no-VSync part of it... but I've found it's not worth the headache of games crashing all over the place.
     
    Last edited: Apr 24, 2012
  12. RamGuy

    RamGuy Master Guru

    Messages:
    231
    Likes Received:
    11
    GPU:
    nVIDIA GeForce GTX 580
Looking at Ivy Bridge from an HTPC perspective, it seems the integrated Intel HD 4000 graphics should be plenty for all kinds of 1080p playback, no matter if it's MKV or M2TS encoded with H.264, MPEG-4 or whatever.

It seems able to cope with a fair number of game titles as well, if you are willing to accept 30-40 FPS without any anti-aliasing and don't play on the highest settings.


But my biggest gripe with Intel and their integrated graphics solutions in the past has always been their drivers and the lack of optimisation and hardware-acceleration support in quite a lot of software out there. Will this still be the case with Ivy Bridge and Intel HD 4000, I wonder? How well supported will hardware acceleration be in software like Corel WinDVD, CyberLink PowerDVD, XBMC etc., not to mention games in general?


I still feel that solutions like the Mac Mini and others featuring the AMD Radeon HD 6630M and Catalyst drivers are a safer bet, even though their raw performance shouldn't be much better than the Intel HD 4000's, because there you are almost certain to get hardware-acceleration support from all kinds of software.
     
  13. IPlayNaked

    IPlayNaked Banned

    Messages:
    6,559
    Likes Received:
    0
    GPU:
    XFire 7950 1200/1850
Indeed. Hilbert here is using a big Noctua cooler and cranked the volts up to 1.4... and you can see that's not nearly enough to keep it cool for such an overclock.

On the overclock page the processor package maxes out at 93°C. It must be throttling heavily.
     
  14. slickric21

    slickric21 Ancient Guru

    Messages:
    2,458
    Likes Received:
    4
    GPU:
    eVGA 1080ti SC / Gsync
Oh okay, I had my suspicions about the new Lucid Virtu MVP, e.g. why no one was talking about it if it really did offer benefits in games! Guess it's still just a gimmick at the mo, then.
The tweaktxxn and TomsH guides on MVP show some improvements at resolutions up to 1080p but performance degradation at higher resolutions, and both mentioned some problems with games crashing etc.

I'm still inclined to think that PCI-E 3.0 is a good choice for a Z68 owner who wants to go SLI/CrossFire though; seems like AnandTech agree.
     
  15. IPlayNaked

    IPlayNaked Banned

    Messages:
    6,559
    Likes Received:
    0
    GPU:
    XFire 7950 1200/1850
Tom's Hardware does a review on this every so often, and every single time they come to the same conclusion: at x8 or more lanes, it really doesn't matter.

But with the 7970, maybe you'll see a little benefit from having two 3.0 slots. Personally, if I already had Sandy (I do), I wouldn't bother... you'd probably be better off just getting the 7990/690 when it comes around and using your x16 slot. Your call though.
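As a rough sanity check on the lane-count argument, per-lane PCIe throughput can be worked out from the signaling rate and line encoding (5 GT/s with 8b/10b for gen 2, 8 GT/s with 128b/130b for gen 3). A minimal sketch of that arithmetic (theoretical link bandwidth only, ignoring protocol overhead):

```python
# Theoretical per-direction PCIe bandwidth from signaling rate and encoding.
# Gen 2: 5 GT/s, 8b/10b encoding (80% efficient).
# Gen 3: 8 GT/s, 128b/130b encoding (~98.5% efficient).

def pcie_bandwidth_gbs(gen, lanes):
    """Approximate usable bandwidth in GB/s for a PCIe link."""
    rates = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}  # (GT/s, encoding efficiency)
    gt_per_s, efficiency = rates[gen]
    # One bit per transfer per lane; divide by 8 to get bytes.
    return gt_per_s * efficiency * lanes / 8

for gen, lanes in [(2, 8), (2, 16), (3, 8), (3, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: {pcie_bandwidth_gbs(gen, lanes):.1f} GB/s")
```

On these numbers a 3.0 x8 link (~7.9 GB/s) is almost identical to a 2.0 x16 link (8.0 GB/s), which lines up with the "x8 or more rarely matters for a single card" finding above.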
     

  16. TechFreaK

    TechFreaK Banned

    Messages:
    155
    Likes Received:
    0
    GPU:
    GeForce GTX570 Custom OC
Depends on the resolution and how many GPUs.


For 4x GTX 680 @ 4500x1600-something resolution it is a bottleneck, and PCIe 3.0 is the only way to go.

    part1
    http://www.youtube.com/watch?v=S0-xcxAvu54

    part2
    http://www.youtube.com/watch?v=tkZzssm-kWs
     
  17. RamGuy

    RamGuy Master Guru

    Messages:
    231
    Likes Received:
    11
    GPU:
    nVIDIA GeForce GTX 580
Another thing about HTPC usage: how does the Intel HD 4000 graphics work in combination with a dedicated card like an HD 7870? Is there some smart driver that lets you use the integrated graphics while idling, surfing the web and doing other things that don't need the added performance of the dedicated card, like on recent notebooks?

Or would you be stuck using the dedicated card only and relying on its 2D and 3D clock states for power savings?
     
  18. Mufflore

    Mufflore Ancient Guru

    Messages:
    11,667
    Likes Received:
    753
    GPU:
    1080Ti + Xtreme III
    Have you any proof that it is better than 4 x PCI-e 2.0 x16?
     
  19. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,702
    Likes Received:
    515
    GPU:
    2070 Super
There are some benchies on the NV forums.

Even with 680 SLI at 1920x1080 there is a 5-10% difference,
and about a 50% difference at 5760x1080.

(The bars are a bit weird: they represent min/max/avg over two runs.)

     
  20. teleguy

    teleguy Maha Guru

    Messages:
    1,282
    Likes Received:
    81
    GPU:
    GTX 1070/Vega 56
    Only on mainboards that support Lucid Virtu.

    http://www.lucidlogix.com/product-virtu-gpu.html
     
