Discussion in 'Benchmark Mayhem' started by BlackZero, Dec 19, 2012.
OC'ed my CPU to 4.75GHz, GPU at 1080/1450 + Cat 13.2 Beta (I guess there's a CF fix in this driver)
Lol! It took 3x 7970s to bump me down from the #1 spot in the Catzilla results with my 2x 680s.
What can I say? This benchmark really favors nVidia cards......:3eyes:
Exactly, and I wonder why, since according to the author Catzilla doesn't use PhysX.
The benchmark measures physics with the CPU I think?
Nope, GPU. BlackZero proved that near the beginning of this thread somewhere.
You can test it yourself as well by setting PhysX to CPU in the NVCP.
So that means it uses PhysX? It's really weird that a heavily overclocked 7970 can't get close to a GTX 670.
Could someone explain the logic of this to me:
My 7970 @ 1240/1700 + 2700K @ 4.70GHz, HT on: 9076 Tiger
Random GTX 670 @ 1200/1500 + 3570K @ 4.40GHz: 9327 Tiger
Yea...no PhysX there....
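Just to put a number on that gap (my own quick arithmetic using the two Tiger scores posted above, nothing measured from the benchmark itself):

```python
# Rough size of the gap between the two posted Tiger scores.
r7970_score = 9076   # 7970 @ 1240/1700 + 2700K @ 4.70GHz, HT on
gtx670_score = 9327  # GTX 670 @ 1200/1500 + 3570K @ 4.40GHz

gap = gtx670_score - r7970_score
gap_pct = 100 * gap / r7970_score
print(f"GTX 670 leads by {gap} points ({gap_pct:.1f}%)")
# -> GTX 670 leads by 251 points (2.8%)
```

So the 670 is only ~3% ahead, but that's with the 7970 clocked well past its stock speeds, which is what makes the result feel lopsided.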
I believe that it, perhaps, does not use PhysX, but I find it odd that Catzilla seems to favor Kepler over an AMD GPU. Ceteris paribus, a GTX 6xx tends to outscore an equivalent AMD card. I have no hard evidence, but that's the impression I've gotten after checking out the various scores here. Take mine, for example: I have three HD 7970s in TriFire with a 3960X @ 4.75GHz, and I barely outscored seaplane pilot's 2x GTX 680 + 3930K @ 4.6GHz. It takes a higher-clocked CPU and three AMD cards to barely beat a 3930K @ 4.6GHz + 2x GTX 680.......yes, perhaps the Cat driver I'm using isn't that well optimized, but I can't shake the feeling that Catzilla favors nV. Mind you, I ain't making a fuss over this. Nor am I an AMD fanboy, though I do prefer AMD over nV, but that hasn't stopped me from building a dual GTX 670 rig.
Actually, I found out it does use PhysX (CPU), at least according to the software creator:
I'm not making a fuss about it either; I just thought this was a neutral benchmark, but apparently it is NOT. If it's designed to favor NVIDIA cards, in my opinion that should be stated in the first post. I only found out after reading all eleven pages.
Here is mine.
A 3960X @ 4.75GHz paired with TriFire 7970s (I drooled, btw) barely beating a 3930K @ 4.6GHz paired with two GTX 680s, assuming good scaling on both setups? Obvious is obvious here.
LMAO!:roll: Dang dude, you had me in stitches!
BTW, leopr, I hope you understand that I was in no way, shape or form implying that you were making a fuss over this. If you felt I had.....I'm sorry I didn't make myself plain enough.
But like I stated, looking at the results here, Catzilla isn't a neutral benchmark (I think Heaven is; it seems to scale well with both AMD and nVidia cards), as it seems to heavily favor nV cards, especially Keplers. Again, let me reiterate, this is just my opinion or impression, and I'm NOT condemning this benchmark, which I find to be quite fun.
There I am at the #4 spot. Yay! :nerd:
Nvidia did send the developer 4x GTX 680s for testing and development, so it's no surprise that it performs better on their hardware.
Post your best 3D Mark11 score with url so we can compare the two benchmarks.
P19362 with NVIDIA GeForce GTX 680(2x) and Intel Core i7-3930K @ 4.725GHz
seaplane pilot, can you run one with PhysX set to CPU in the NVCP, please?
I believe I can squeeze a little more out of my rig, but it will have to wait, as my R4E has developed a SATA connection issue: my HDDs disappeared on me. So I sent it off for RMA today. I'll very likely get a new board in 3-4 weeks' time. Till then, I'll have to limp along with my FX-8120 @ 4GHz + 2x GTX 670.:nerd:
Make one what? A Catzilla or 3DMark11 benchie with PhysX on CPU only? I currently have it set to Auto. I will when I get home from work. Phooey on work.
Yeah, a Catzilla run with PhysX on just the CPU, please. Thanks!
This is getting really annoying, btw. I installed the latest beta (19), and the key isn't working anymore (unlocking to Basic); it just says Basic is unlocked. Can't run the Tiger one now!!!!
I've had this happen a few times already.
I'm not a fan of this benchy; it should never have been released without being fair to both GPU makers. The creator should have waited for that AMD shipment first.
I also dislike the part at the end that kills GPUs.