Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 19, 2018.
if the 6700k already meets your needs, wait.
if you need a faster cpu get a new cpu.
Hilbert, why did you remove them? Those charts are useful datapoints. You include MANY synthetic non-gaming benchmarks that mean nothing to the vast majority of your members.
I find it hilarious that no one (Intel fanboys included) is complaining about those benchmarks that are literally meaningless "use cases" for the vast majority of your members, yet so many AMD fanboys are complaining about data points that DO have value, simply because they make AMD processors look bad.
Hilbert, if you stop posting 720p CPU benches purely because AMD fanboys complain that they make AMD processors look bad, you've lost me. I won't trust your site going forward, because you will have made it clear that you are curating your content to appease the most vocal complainers on this site instead of doing what drew me here and kept me here: giving us as much unbiased data as you can.
My two cents, very upset to read this post from you.
R5 1600 user here, and I agree the 720p charts should remain to show the difference in CPU performance, even if it's a resolution that no one would use today. Thanks for the hard work, Hilbert.
Agree, the more results the better. Having results but not publishing them is just wasted effort. There will always be haters and complainers, but complaining about parts of an article that, in your opinion, shouldn't be there is just stupid. If a reader isn't interested in part of the article, skip it and read only what you want. Now even the people who did want to see the 720p results can't, thanks to the whiners.
The argument that benching a CPU at 720p or even 1080p is useless because no one with a high-end GPU would realistically use those resolutions is one of the stupidest ideas floating around.
More often than not a benchmark is just a benchmark, and does not necessarily represent a practical or even real-world workload.
What's next? Banning synthetics? How about banning theoretical throughput tests, DGEMM/SGEMM, texture/pixel fillrates, bandwidth tests?
Shall we ban 3DMark, because it's neither a game nor a productive application?
720p benchmarks give us an estimate of how powerful a particular CPU is at pushing frames.
That does not mean we are expected to run our games at 720p.
Rather, these benchmarks are metrics telling us how suited a particular CPU is for high frame rates, for future GPUs, and for pushing frames in less than ideally coded games.
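The reasoning in the post above can be sketched as a toy model: each frame costs the CPU a roughly fixed amount of work, while the GPU's cost scales with pixel count, and whichever side is slower caps the fps. The numbers below (4 ms of CPU work per frame, 3 ms per megapixel on the GPU) are invented for illustration, not measurements from any real hardware:

```python
# Toy bottleneck model. The per-frame costs are assumptions for
# illustration only, not measured data from any CPU or GPU.
CPU_MS_PER_FRAME = 4.0        # assumed CPU cost per frame (ms), resolution-independent
GPU_MS_PER_MEGAPIXEL = 3.0    # assumed GPU cost per megapixel rendered (ms)

def fps(width: int, height: int) -> float:
    """Frame rate is limited by the slower of the CPU and GPU per-frame costs."""
    gpu_ms = GPU_MS_PER_MEGAPIXEL * (width * height) / 1e6
    return 1000.0 / max(CPU_MS_PER_FRAME, gpu_ms)

for w, h in [(1280, 720), (1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: {fps(w, h):.0f} fps")
# At 720p the CPU term dominates (250 fps, the CPU's true ceiling);
# at every higher resolution the GPU term dominates, hiding the CPU.
```

In this model, only the 720p run reveals the CPU's ceiling; at 1080p and above, every CPU faster than the GPU would score identically, which is exactly why reviewers drop the resolution for CPU tests.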
I demand a 1024 × 768 benchmark for my new 2080 Ti.....
As usual a few ruin it for the many...
I found the 720p results helpful.
Wait, did Hilbert remove 720p results cause of triggered users?
457.17€ today at a very good Swiss shop (owned by a major company in the country; it's my go-to webshop, and they also have real luxury stores you can visit if you want to buy a 4000€ TV, lol, it's not just a website with a warehouse).
I don't know if linking is allowed, so google it: digitec .ch
At that price I ordered one.
No idea how they handle out-of-country orders, but if you live in Italy, Germany or France you can try.
Availability is unknown (10 CPUs already on order), but that's because, unlike other shops, they don't lie about availability: if they say 2 weeks, it's 2 weeks, and if they say unknown, it's because they don't have a 100% certain delivery date. You can also easily cancel orders with long/unknown delivery times.
Edit: yesterday it was 547.73€, so it seems to be some kind of launch price/offer.
I found them helpful two generations ago. They told me how high an fps one could get with a particular CPU at 1080p with a GPU that did not yet exist.
Today, I'd say the 2080 Ti is that once non-existent GPU, and it is already CPU-limited at 1080p. And since it is limited by the CPU at 1080p, 720p becomes redundant. I don't mind Hilbert doing those tests, but their value per hour of his time is nowhere near that of other tests.
As for those who think nobody would play at 1080p with a 2080 Ti for high fps, that's a flawed way of thinking. 1080p screens are the only ones available at 240Hz right now, so someone who prefers high fps is more likely to own one of those than a 1440p or 4K screen, even if the card is marketed as a 4K card.
Then there is raytracing: 1080p seems to be the resolution where one can have comfortable image quality at reasonable fps. nVidia apparently saw the end of the tunnel, so they changed direction and started digging toward raytracing, which will take another decade to become satisfying. That's a decade in which they can release new hardware and stay in business.
As to the why, I think I've explained that really well in the explanation linked to.
I haven't given up on 720p testing just yet, as some users find it useful; however, how relevant it really is with an RTX 2080 Ti is up for debate. I am, however, getting tired of continuously being verbally attacked, as no matter what I do, a lot of you are always going to disagree or get offended by it. Over the years I have gotten used to verbal attacks and can deal with them (which is a weird, unfortunate fact when you think about it), but at some point enough is enough. Decency these days simply seems to be gone.
In the case of the 720p results, this was requested by you guys. The removal is not based on one comment; browse through the complete thread and be amazed. I'll think about what we'll do with the 720p results. I am considering separating article chapters not by game but by resolution, e.g. one page of 720p results, another of 1080p, then another of 1440p. That way users can more easily browse (or skip) what they find relevant.
And trust me, I'll still get verbally attacked when I do that.
Why you stop CRT monitor benchmarks? ....J/K
Love you Boss!.....
hey man ,640x480 is clearly high-rez
1080p at ultra settings can still drag a high-end GPU down when the engine is poorly optimized.
I think running games at medium/high video settings would give more plausible results for CPU benchmarking. I assume the current benchmarks are done at maximum video settings.
1440p/4K at maximum settings are useless as a CPU performance measurement if the GPU is choking.
No details, no draw calls, no extra CPU utilization.
Clearly some people feel bad for expensive high-tier GPUs when they are evaluated at lower resolutions.
They clearly do not understand the basics of bottlenecking: the CPU is the only game-breaker when paired with enthusiast GPUs. It is not your fault. GPUs are to blame for still not having the power to run 1080p like 720p, which would let us check a CPU's limit. Or better still, once all CPUs from low to top tier are strong enough, you can skip 720p. But not all CPUs tested are i7s; there is a demand to fully utilize all models.
You are doing the right thing, keep it up. You have been at this far too long and are far too good at it; we cannot tell you what to do after all these years, when you have built credibility and a reputation with your Guru3D stamp. Keep it up. Ignore the noise. As long as technology cannot help you avoid it, low resolutions are a necessary evil.
100% extra GPU utilization for 5% extra CPU utilization. Next thing, let's turn on 16x MSAA to make sure we're stressing that CPU nicely.
If we had a monster GPU that removed every GPU cap, this wouldn't be an issue,
since we would actually see what the 720p benchmark shows: a pure CPU cap.
Why test with a 2080 Ti? Because it's the best GPU right now? And yet it will hit a cap at 1440p highest settings way before the CPU does. Inconsistency.
Some games have CPU-exclusive settings, like The Talos Principle, but most have them merged.
From the games I've played, high settings are good enough to offload the GPU without affecting CPU performance.
EDIT: When we see a gaming CPU benchmark, we can't see how far the GPU was utilized, so we're kind of guessing where and how much the GPU was used and capped.
Why bother with gaming benchmarks if they're inconsistent? We just rely on the highest-end GPU to deliver frames, so the CPU being tested will hopefully start capping and show us THE RIGHT NUMBERS. THE NUMBERS THAT WE SEE AT 720P.
"BUT NOOOOOOOOOOOOOOOOOO, NOBODY GAMES AT 720P. WE JUST WANT TO SEE A GPU FRAMEFEST, CAUSE LOWER RESOLUTIONS MAKE AMD LOOK BAD and OTHER BULLSHIT, and EVERYBODY GAMES ON ULTRA SETTINGS, CAUSE HIGH/MEDIUM SETTINGS ARE FOR NUBS. ME WANT ULTRA SETTINGS WITH GPU BURNING AND CPU IDLING."
I am done venting. Ty.
If you're gaming on a 60Hz screen, then you absolutely don't need to buy an expensive 9000-series Intel CPU; the AMD CPUs with more cores for less money are probably a better choice.
As an idea, to keep it realistic in terms of usage scenarios: the pinnacle real-world situation for high-fps gaming is a 240Hz 1080p screen, so testing at 1080p with reduced graphical details to hit a 240fps target is something gamers actually do. Perhaps 1080p could be tested at, say, Medium graphical settings as well as Ultra. That way you're targeting the real-world crowd who reduce graphical settings to hit 240fps on their 240Hz monitors; after all, this scenario is really where these gaming CPUs are designed to operate. It also gets rid of the argument that no one games at 720p, which is pretty much true, while still keeping the tests grounded in real-world usage.
I don't see how any of these new Intel CPUs can be recommended over Ryzen for any workload other than gaming. Ryzen and Threadripper simply destroy Intel in anything CPU-intensive.