Discussion in 'Frontpage news' started by Guru3D News, Apr 27, 2010.
Great review as always, Hilbert. I know that CPU will run 4.1GHz steady...
I hope I get a BIOS update.
Great review as always! There's one thing I don't understand, though. Why would you leave gaming aside? It's a perfect CPU for gaming and I can't see how i7s are better. Triple-channel memory and HT don't have an impact on gaming; you've said it yourselves. Again, as you said, the X6 1095T performs as well as the i7-965/975 (290 USD vs. 800/1000 USD) and, when overclocked, exceeds everything except the i7-980X. As far as I can read benchmarks, they prove this as well. Also, six cores sound very future-proof. There are still games that can't utilize quad cores.
I'm not an expert, but I really can't understand how i7s are better at gaming. In my eyes the 1095T is even overpowered for gaming, as I feel the 955BE/i5-750 are more than sufficient.
Note: I'm strictly speaking for gaming performance.
Speaking of gaming performance, check out my post.
And other reviews that actually benchmark this CPU in games.
This particular CPU doesn't seem like a good choice for gaming; even the i5 750 is faster.
Truly, it's not like you're going to game at 800x600 or 1024x768! So c'mon, we all know that a modern X4 or X6 or i5/i7 can handle all current games, and any future games for another 2-3 years, maybe more. Personally I think this X6 or an i7 930 can handle games for the next 5 years. If you always want the newest CPU on the market, that's another problem!
This is true. I did some BC2 benches a while back with my Phenom II at x2, x3, and x4 cores... there was a difference between x2 and x3 only at 1680x1050... at my normal 2048x1152 there was zero difference whatsoever.
I reckon as long as you have a Core 2 above 4GHz, or a Core 2 Quad, Phenom II, or i5/i7 above 3.4-3.6GHz, then you are absolutely fine for games, unless you are a nutter and run at 800x600 or something.
There is a 40% performance increase going from 2 to 4 cores on my system in Bad Company 2 at 1920x1080, and that's only with an HD4870.
Imagine the CPU bottleneck with an HD5970 or GTX 480 SLI.
If you want to review CPU performance in games, you need to know where to start.
And lowering the resolution is not the solution.
Testing with a much more powerful GPU is.
I saw exactly the same performance jump going from an E8400 @ 3.5GHz to an i5 750 @ 3.2GHz.
Why do they choose a lower resolution to benchmark?
Surely a better benchmark is to use CrossFire graphics and see which CPU limits first?
16:9 1080p is standard now, let alone people running higher resolutions like 1600p or 1152p, or Eyefinity, Nvidia multi-screen, etc.
Exactly, it is, but only a few websites have decided to do so.
I wish Hilbert would do so too, as I trust his benchmarks more.
But I have little doubt when several websites show similar results.
They used HD5870s in CF, which isn't the most powerful setup you can get...
WOW thanks heaps Hilbert and on time!!!:banana:!! Switching very soon!!
OK then, another point: do you notice once you pass from 180 FPS to 200? Because I really can't.
Bang for buck, for sure.
The thing is, if you're an average PC gamer you may have an HD4850 or GTX 260 lower/mid-range card with a relatively average dual core and 2GB of RAM, and you can still run your mainstream console-port games fine on default settings at medium to high.
But if you're on Guru3D or any other hardware-based forum/review site, then you are almost 100% a PC enthusiast, someone who likes full or custom graphics settings (and audio, I guess). In which case it makes no sense to benchmark a state-of-the-art overclocked 6-core processor at lower resolutions on a single card, does it?
This last part is my thinking as well. For example in GPU reviews I only really pay attention to 1920x1200 benches with 4xAA as they are the only relevant ones to my system really.
I agree. This is a review for enthusiasts, so I'm surprised we're seeing a single GPU and ultra-low resolutions.
Actually, it would be totally awesome if Hilbert could do an SLI/CF CPU scaling review. GTX 480 SLI would produce a huge CPU bottleneck.
So far the only website that has done that is Legion Hardware,
plus some minor tests on other websites.
You would love to see the charts from Bad Company 2.
And from what I see right now, the AMD X6 isn't the champ in games, no idea why.
Meh, 1090T != i7 930 IMO. Either way you'll be happy, although the Thuban does seem to lag behind in some games.
:/ Doesn't look like CPU speed matters much at all, really... going from 3.2 to 4.1GHz does nothing for performance in BC2.
It doesn't matter when you are already on a quad core with one GPU.
It does matter when you run CrossFire or SLI, or when you have a dual core.
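The back-and-forth above boils down to a simple bottleneck model: each frame takes roughly as long as the slower of the CPU work and the GPU work, so a faster CPU only shows up once the GPU stops being the limit. A minimal sketch in Python, with made-up frame times purely for illustration (none of these numbers come from the review):

```python
# Toy bottleneck model (my own simplification, not from the review or thread):
# per-frame time is roughly limited by whichever of CPU or GPU takes longer.
def fps(cpu_ms, gpu_ms):
    """Approximate frames per second given per-frame CPU and GPU times in ms."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical per-frame times, chosen only to illustrate the argument.
cpu_stock = 6.0    # ms of CPU work per frame at stock clocks
cpu_oc = 4.7       # same work after a ~25% overclock
gpu_single = 10.0  # ms of GPU work per frame at 1920x1080, one card
gpu_sli = 5.5      # two cards, assuming imperfect scaling

print(fps(cpu_stock, gpu_single))  # GPU-bound: 100 fps
print(fps(cpu_oc, gpu_single))     # still 100 fps; the overclock is invisible
print(fps(cpu_stock, gpu_sli))     # second card makes it CPU-bound: ~167 fps
print(fps(cpu_oc, gpu_sli))        # now the overclock finally shows: ~182 fps
```

This is why a single-GPU, high-resolution benchmark can make every CPU look equal, while an SLI/CF setup (or a low resolution) pushes the limit back onto the CPU.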
I don't like repeating myself, but here it goes again:
check out Phenom II X4 vs. i5 750, clock for clock.
It's a slaughter.
If Hilbert used a second HD5870 you would see a completely different result.
Actually, showing graphs with one GPU might be misleading for the many people who think the CPU doesn't matter.
The CPU does matter, more than you think.
The CPU doesn't matter in real gaming, since ALL graphics cards, in CrossFire/SLI or not, will limit your fps before the CPU does. In real gaming you will use AA, AF and a really high resolution. It's like removing your car's seats, windows, doors and everything that makes it comfortable just to make it faster.
Make a bit of sense, dude; no one takes game CPU benchmarking seriously anymore. With a real game environment setup the GPU is the only bottleneck; it doesn't matter if you have a C2D or an i7. I have a Q6600, and with a high-end graphics card I'm sure I can get an fps result very close to any other CPU's.
I find it funny that people want CF and SLI reviews when they don't even have one card; maybe 90% of the people who read this website don't have HD5870 CF, let alone GTX 480 SLI. It's kind of a waste :/
I mean, come on... "Hilbert, where's our GTX 480 quad-SLI review?!" As if that would prove this $200 CPU is crap compared to a $1000 CPU, since everyone has quad-SLI.
@Kapu, stop defending your CPU so much. If you get the $200 6-core AMD it will be an upgrade for you, period.