Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 16, 2019.
So, let's stop that before it gets out of hand guys...
This is getting even funnier. The 2003 P4 was a single core with HT (which had almost no use at the time). It had lower clock and IPC than the FX-8150 while drawing around the same amount of power.
That means having that FX chip would be like having eight P4s in one package, which would do better at anything... including gaming.
Your memory of the past seems to be drastically skewed.
As time has gone on, year-over-year performance improvements have decreased, especially on the processor side. On top of that, the number of programs that require, or simply work much better with, newer and faster hardware has also dropped.
The 90s to the mid or late 2000s was the period where upgrading your CPU every year was more understandable than it is today.
AMD-wise, for instance, take your 10-year idea. People were not using a 1991 Am386 at 20MHz in 2001, trying to run XP or 2000 on it; that would have been horrible.
Same thing for AMD in 2001: people would not be trying to run a 2011 operating system on a 1.3GHz Athlon XP.
Now, for 2011, going Intel side since AMD wasn't competitive for a long stretch of this period... unless something drastically changes on the software-requirements side of things, I can completely see someone using an i7-3870 4-core / 8-thread 3.6GHz processor, presumably on Windows 10, in 2021 with no problem.
Now it seems trivial, but even as late as 2005, would you have imagined moving TBs around your system as a matter of course? Just resuming from sleep, for example, is GBs.
I get that UserBenchmark isn't the MOST reliable place for EXACT performance differences, but it isn't THAT bad.
I just compared the most powerful Pentium 4 (to my knowledge) to the least powerful, first-generation FX processor, and you'd really rather use the Pentium 4? Even if it's not fully 182% faster, even if it were only 100% faster (twice as fast rather than nearly three times), you'd really still use the Pentium?
I wouldn't say that. I remember playing Warcraft 3 on a pre-HT P4 and the antivirus scanner would start in the background, turning the game into an unplayable, stuttery mess. I then upgraded to a P4 with HT and it was buttery smooth (the HDD light was flashing crazily so I knew it was scanning, but there was no impact on gameplay whatsoever). Needless to say, it made quite an impression on me, and I've been a fan of HT/SMT ever since.
The thing I most remember about HT was the confusion it caused, as people thought only half of their CPU was being used. I found it almost impossible to explain that the CPU was actually running on full load.
That's true, but software itself rarely extracted additional performance from HT. UE 2.5+ benefited nicely from HT, but its time was over before it could take off. That's why the second time Intel brought HT back it did much better, since that HT-aware software from the past was already around.
(I did not mean it in terms of the total system; it's apparent that there are multiple threads and they'll be executed as the CPU is able. What I meant is the gain for a single application. And I was not "brave" enough to keep AV background scans enabled.)
As for that load-reporting thing, I saw it in the past, but it was "solved" later. I hated HT on the Atom x5-Z8500. It proved to be a fake quad core; in reality it was like 2C/4T inside. The CPU under full load delivered only around 2.5x the performance of one thread, even while it could hold its clock. And that's on the variant with the dual-channel IMC.
And when I ran threading tests with two threads assigned to different "cores", it behaved like HT/SMT too.
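The kind of scaling test described above can be sketched roughly like this: time a fixed CPU-bound workload on one worker, then on two workers pinned to different logical CPUs, and compare. This is only a hedged illustration, not the poster's actual test; the pinning uses the Linux-only `os.sched_setaffinity`, and whether logical CPUs 0 and 1 are SMT siblings depends on how the kernel enumerates them on your machine. If the two "cores" share execution resources, the two-worker run scales well below 2x.

```python
# Hypothetical sketch of an SMT/HT scaling test (not the original poster's code).
import os
import time
from multiprocessing import Process, Queue

def burn(n: int) -> int:
    """Simple integer-heavy loop to keep the ALUs busy."""
    acc = 0
    for i in range(n):
        acc += i * i
    return acc

def worker(cpu: int, n: int, q: Queue) -> None:
    if hasattr(os, "sched_setaffinity"):  # Linux-only; skipped elsewhere
        try:
            os.sched_setaffinity(0, {cpu})  # pin this process to one logical CPU
        except OSError:
            pass  # requested CPU not available; run unpinned
    t0 = time.perf_counter()
    burn(n)
    q.put(time.perf_counter() - t0)

def timed_run(cpus, n=2_000_000) -> float:
    """Run the workload once per listed CPU, in parallel; return wall time."""
    q: Queue = Queue()
    procs = [Process(target=worker, args=(c, n, q)) for c in cpus]
    t0 = time.perf_counter()
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return time.perf_counter() - t0

if __name__ == "__main__":
    solo = timed_run([0])
    duo = timed_run([0, 1])   # assumes 0 and 1 may be SMT siblings on this box
    scaling = 2 * solo / duo  # ~2.0x suggests real cores; much lower suggests shared resources
    print(f"solo {solo:.3f}s  duo {duo:.3f}s  scaling {scaling:.2f}x")
```

A scaling figure near the ~2.5x-for-four-threads number mentioned above is exactly what this sort of test would surface on a part whose "cores" share pipelines.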