Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 1, 2013.
Not really; Sandy Bridge was a substantially reworked design with a mobile emphasis, still based on the P6 (Pentium Pro) lineage.
I have a feeling this is as much due to these "pre-overclocked" system makers just upping the voltage and never using offset mode.
I doubt they will be that lax.
Looks like the temp difference is the major cause.
Higher clocks need more voltage, and more voltage needs better cooling (as long as temps are still in check).
If the temps can't be kept in check, the overclock has to be reduced.
They usually are, most "pre-overclocked" systems just come with a steady voltage and not offset.
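The fixed-versus-offset distinction above can be sketched in a few lines. This is a hedged illustration with hypothetical voltage figures (the stock VID points and the 1.25 V / 0.05 V values are assumptions for the example, not real BIOS settings): a fixed Vcore holds the same voltage at idle and under load, while offset mode adds a constant delta to the stock VID, so voltage still drops with the clocks at idle.

```python
# Hypothetical stock VID points in volts -- illustrative only.
STOCK_VID = {"idle": 0.80, "load": 1.10}

def fixed_vcore(state, vcore=1.25):
    """Fixed mode: the same voltage regardless of load state."""
    return vcore

def offset_vcore(state, offset=0.05):
    """Offset mode: the stock VID for that state plus a constant offset,
    so idle voltage stays low."""
    return STOCK_VID[state] + offset

for state in ("idle", "load"):
    print(f"{state}: fixed={fixed_vcore(state):.2f} V, "
          f"offset={offset_vcore(state):.2f} V")
```

This is why a fixed-voltage "pre-overclocked" box sits at full Vcore even on the desktop, while an offset overclock idles nearly as cool as stock.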
After NetBurst, all of Intel's architectures can be traced back to the P6 architecture, which debuted with the Pentium Pro and later powered the Pentium III. Obviously there have been a lot of changes since then, but it was by no means an entirely new microarchitecture. I think AMD is also realizing that a radically new microarchitecture isn't always ideal.
So far my 4770K @ 4.4 GHz is not passing 55 °C in gaming sessions. That's with the fan profile set to Silent in the BIOS.
I'm running it at 1.155 V in the BIOS. It passed 1 hour of OCCT Linpack.
How far can you push it before hitting near 100C?
Gaming doesn't really stress CPUs much. What are the temps under OCCT?
But as I said, it passes the test, and this is a gaming PC, so I will NEVER ever see such high temps.
That's not bad at all, unless you want to stress test 24/7. I've seen higher temps than that in Prime95 on this rig.
The more I read about Haswell, the happier I am that I went with X79 and a 3930K. It will get even better with IB-E as my next upgrade.
Never stress tested my 950 for 24/7 either. It ran at 3.8 GHz (1.25 V) for over two years without issues.
Call me stubborn, but 1 hour of OCCT Linpack and 1.5 hours of Heaven is enough for me to know my rig is stable for gaming.
A 57.5% normalized increase in battery life while still improving performance in the ultrabook category. That's the largest single-generation battery life increase in Intel's history.
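To put that percentage in concrete terms, here is a quick worked example. The 6-hour baseline is a hypothetical figure chosen for illustration, not a number from Intel; only the 57.5% improvement comes from the post above.

```python
# Hedged illustration: what a 57.5% normalized battery-life increase means.
# The baseline runtime is an assumption for the example.
baseline_hours = 6.0      # hypothetical previous-gen ultrabook runtime
increase = 0.575          # the 57.5% normalized improvement quoted above

new_hours = baseline_hours * (1 + increase)
print(f"{baseline_hours:.1f} h -> {new_hours:.2f} h")
```

So a machine that ran about 6 hours would run roughly 9.5 hours at the same workload, which is why the mobile side is the headline story here.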
And people still have the nerve to bitch about Haswell? This is exactly where the market is going, and so far Intel is nailing it.
I agree. And even if you do see such temps, no problem. I think Haswell's max temp threshold (TJMax) is 105 °C, like Ivy Bridge's?
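The headroom argument above is simple arithmetic: margin before throttling is just TJMax minus the current core temperature. A minimal sketch, using the 105 °C TJMax figure quoted in the thread (throttling behavior is simplified; real CPUs modulate clocks near TJMax rather than at a single hard cutoff):

```python
TJMAX_C = 105  # Haswell TJMax as quoted above, in degrees Celsius

def thermal_headroom(core_temp_c, tjmax_c=TJMAX_C):
    """Degrees of margin before the core would reach TJMax and throttle."""
    return tjmax_c - core_temp_c

# The 55 C gaming temp reported earlier in the thread:
print(thermal_headroom(55))
```

At 55 °C under gaming load, that's a full 50 °C of margin, which is why those temps are nothing to worry about.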
Well, enthusiasts think the world revolves around them.
How many of us are using low-power, mobile processors? Unless you're looking to buy an "ultrabook", the power consumption of those processors is meaningless. I don't really care how much power my desktop uses...I only care that it will perform the tasks I need with a reasonable level of performance.
^ True, I don't give a crap about energy either. Enthusiasts have valid gripes about Haswell, but those who say Intel 'failed' (quite a few here) because of Haswell's meager raw performance gains miss the point. Intel has a massive win with Haswell and the energy/IGP improvements it's achieved for its larger target market. They aren't in it for you/us (enthusiasts); we are a piddling minority compared to the larger picture they are focused on.
Yet, STILL, even as an enthusiast, if I were looking for a new CPU to buy today, I'd still go with Haswell. So I don't think Intel is even losing its whiny enthusiast base; where else would they turn?
I care about energy, but at the same time I don't want CPUs that run hotter because Intel decided to save a bit of money on the CPU-to-heatspreader interface.
Would those temperatures be considered hotter or cooler compared to a 1st-gen i7?
Not that it matters; I'll probably sit on my current build for another year or two.
But doesn't it already do that? I really can't think of a single task where I think "damn, I wish the processor was faster," with the exception of encoding, and Intel has been improving there constantly. I play mostly CPU-bound games like SC2 and Counter-Strike: GO, among others, but I never really notice my CPU being a problem. And now with 4K displays, new consoles, and so on, the bottleneck is going to shift heavily back to the GPU, especially because the console processors are nowhere near what we have, so developers will optimize appropriately.
Would I love it if Intel suddenly came out with a processor 200% faster in every benchmark? Sure, but I don't think I'd want it at the expense of battery life and onboard GPU performance. Lately I'm finding those far more valuable, as no software would really take advantage of the 200% anyway.