http://www.overclock.net/t/1362591/gamegpu-crysis-3-final-gpu-cpu-scaling With all 8 cores being used at over 60% minimum in a very GPU-bound game, how does what you're saying make any sense? This is what I mean by game designers utilizing AMD hardware more and more in the future, especially with the console platforms being made by them as well.
Good utilization still doesn't make up for the 8350's weaknesses. It's not even a true 8-core like the Jaguar chip the consoles are getting (or at least the PS4; the next Xbox's specs still aren't confirmed, I don't think).
Crysis 3 is an example, and it's going to be as good as it gets. The 8350 will still perform below the i7, and the console chips are still going to be terribly slow, low-clocked, low-power Atom equivalents... meh
It's only the first example of a game that uses 8 cores. As game consoles come out with 8 cores, we will see more utilization from game designers.
How can you say that's as good as it's going to get? Was the first game ever written to fully utilize its processor, and that was as good as it got? Games have been tweaked to better utilize the available hardware ever since they started making them.
It's not the first game to properly use 8 threads. Threads are threads; if games utilize 8 threads better, then BOTH the i7 and AMD are going to get better performance. BF3 is another example. The 8350 sits between an i5 and an i7. Benchmarks and other heavily threaded apps show the 8350 between an i5 and an i7, yet you have this wishful thinking that the 8-core Piledrivers are going to soar past i7s.
There's also the odd presumption that all future console games will be multi-thread aware/optimized to take advantage of the new hardware, but I don't think that will necessarily be the case. If devs can make their game run well on fewer cores, cutting out the added expense and complexity of such coding, they may opt for that. A good game will still sell without it. I think the main point of multi-threading was to reduce the time to complete tasks, something which doesn't necessarily carry over as well to gaming (depending on the game). The PS4 and future consoles just offer devs the option to utilize it, but whether they all will is another thing imo.
Another thing that distinguishes games from all other applications, and which adds tremendous complexity imo, is the constant user interaction, which results in interrupting, re-calculating and re-assigning threads and tasks in multiple areas. This goes on on a near-constant basis in gaming, which imo makes multi-threaded coding for games so much more complex than for usual applications. That's why I don't see it as an attractive proposition to game devs. Although if they can afford the time/expense to do it and market that aspect as well, why not.
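The pattern described above, where everything must re-synchronize over and over rather than just once at the end, can be sketched as a per-frame fan-out/join loop. This is a toy illustration, not real engine code, and the task names are made-up placeholders:

```python
# Toy sketch of per-frame task fan-out/join. Each frame, work is split
# across worker threads, but all of them must finish before the next
# frame can start -- a synchronization point that recurs every frame,
# unlike a one-shot batch job. Task names here are placeholders.
from concurrent.futures import ThreadPoolExecutor

def simulate_frame(pool, tasks):
    # Fan out this frame's tasks to the worker pool...
    futures = [pool.submit(task) for task in tasks]
    # ...then block until ALL of them complete before returning.
    return [f.result() for f in futures]

with ThreadPoolExecutor(max_workers=4) as pool:
    tasks = [lambda: "physics", lambda: "AI", lambda: "audio", lambda: "particles"]
    for frame in range(3):
        results = simulate_frame(pool, tasks)
```

The join at the end of every frame is exactly the recurring synchronization cost the post is talking about.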
Utilization does not equal performance gain! Even with things like video encoding, where synchronizing multiple threads is much simpler since they only really need to be synchronized at the end, you don't get a 1:1 performance gain from additional threads. With games this gain is much, much lower, and with each additional thread, synchronization becomes more complex and eats more CPU time, so you end up with even less performance gain. Theoretically, at some point the extra load created by adding more threads that need to be synchronized should exceed what you gain in performance.
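That diminishing-returns point can be sketched with Amdahl's law plus a per-thread synchronization cost: speedup rises, peaks, then actually falls as threads are added. The parallel fraction and overhead values below are made-up assumptions for illustration, not measurements of any game:

```python
# Illustrative sketch: Amdahl's law with a synchronization cost that
# grows with the thread count. p = parallel fraction of the workload,
# sync = overhead added per extra thread. Both values are assumptions.

def speedup(threads, p=0.6, sync=0.02):
    serial = 1.0 - p
    # Classic Amdahl term plus a linearly growing synchronization term.
    return 1.0 / (serial + p / threads + sync * (threads - 1))

for n in (1, 2, 4, 8, 16):
    print(n, round(speedup(n), 2))
```

With these numbers the curve peaks around 4 threads and declines by 16, which is the "more threads eventually cost more than they gain" scenario described above.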
S! Registered here as I've been a long-time reader. Had to add some points on the Intel vs AMD debate. Both are good hardware; I've been using both brands over the years. Looking at the Crysis 3 graphs, I drew the following conclusions based on component pricing where I live, in northern Europe. 1) To get 67fps with Intel I have to pay 1100€ for the CPU. 2) To get 64fps with Intel I have to pay 340€ for the CPU. 3) To get 61fps with AMD I have to pay 210€ for the CPU. So 3fps more in a game costs me 130€, and a whopping 6fps costs me 890€. And that's not even including motherboards or memory yet. Really cost effective, right? If your rig can run a game above 60fps you are good to go IMHO. And by the looks of it, both brands can do it quite nicely. So it's a matter of preference, without wearing any brand-specific goggles
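The price gap is even starker as cost per frame. A quick sketch using the prices and Crysis 3 fps figures quoted in the post above (the CPU labels follow the three tiers being compared):

```python
# Cost-per-fps comparison, using the euro prices and Crysis 3 fps
# figures quoted in the post above.
cpus = [
    ("i7 3960X tier", 1100, 67),
    ("i7 3770K tier", 340, 64),
    ("FX-8350", 210, 61),
]

for name, price_eur, fps in cpus:
    print(f"{name}: {price_eur / fps:.2f} eur/fps")
```

Roughly 16.4€ per frame at the top versus about 3.4€ per frame for the FX-8350, which is the point being made about cost effectiveness.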
You're correct, utilization doesn't ALWAYS mean improved performance, and no, there won't be a 1:1 gain. However, if there is the ability to tap more of the available processor power, or to use more cores when the others are already fully loaded, then doing so will add performance.
Good post. You are correct that the price/performance ratio in Crysis 3 favors AMD, but there are other games (whose graphs I've shown) that see greater gains on other CPU architectures. Also, the OP asked about two different builds; in this case we just pointed out the better of the two.
Agreed, the games you are alluding to happen to be heavy on single-thread performance. But most of those games still get over 60fps minimum. For the money the FX is a good CPU; couple that with motherboards that tend to be very feature-rich for a lot less money than their 1155 counterparts.
S! Thanks for the welcome, this board has been a very good read over the years. XBeast, good add, the 40€ difference there. Again, I compared top-of-the-line CPUs: the FX8350, i7 3770K and i7 3960X. I am assembling an FX8350 system and can run straight comparisons against a similar 3770K build this week in the games I play: Mechwarrior Online, War Thunder, EVE Online, IL-2 Sturmovik Cliffs Of Dover, the Dead Space series, etc. I do not play Crysis or CoD or BF in any form, as my reflexes for those are too darn slow. Again, it's a matter of preference; I'm not saying one brand's hardware is better than the other's. It just boils down to how much you are ready to invest for some increase in performance. Would I buy an nVidia GeForce Titan for the 1100€ it costs here when my AMD HD 7970 GHz Edition still delivers good performance in the games I mentioned? Is the huge price difference really worth it? I bet not, but for hardcore enthusiasts it matters for the bragging rights. Totally understandable and acceptable, as they have the moolah to spend on their hardware. Not envious, just happy someone can do that
Hey guys, another question: what about the Intel Xeon E3-1230V2? A workmate told me it's about the same price as the Intel Core i5-3570K but gets better scores in benchmarks, and is supposed to be a sleeper pick, albeit a server CPU.