It's not cheating compared to the programs that mask your CPU/GPU to some ungodly MHz. By default AQM3 will test your graphics card at a resolution of 1024x768 pixels in 32-bit color mode. Anti-Aliasing (AA) is disabled and 4x Anisotropic Filtering (AF) is enabled.
Hate to tell you (not really), but everything is at default...everything. V-sync isn't running. I haven't touched it...don't have a need to. I overclock my CPU/mem, AGP bus, and video card...that's it. Get an AMD64 with an nForce3 250 mobo, and you'll find it not so hard to believe. I "just" ran it once this time. Everything is set "as is" from the factory Catalyst 4.3 driver install. On that run, I got: 48,206 overall, GFX 6,324, CPU 10,131. I'm telling you, AMD's 64 line was built for gaming.
LOL ... my P4 is quite fast enough even @ stock. So tell me, just what do the default settings for an ATI have running by default? Is the Image set @ Low, Medium or High? I would tend to believe that your default Texture and Mipmap detail are set @ High Quality. If you are benchmarking with those on High ... no way you score those results. Now if you set them to Low, yes, your scores are more in line. If that is what you did while benchmarking, you did basically the same as me with my nVIDIA image quality ... so come clean.
P4s can't compete with the new line of A64s. You're getting a whopping 2GHz HTT and around 14GB/s of bandwidth. Anything else won't really be able to compete effectively IMO.
That is correct...they are on High. Sorry Blue...that's my score on high IQ. I have never run this benchmark with low or medium settings with this current CPU/video combo. Maybe I should, just to see how high...but that isn't the way the bench should be run for comparison to other people's rigs. I'm 35 years old. I'm not 13, trying to impress people with high scores (not saying you are, just making a point). I'm not going to lie about a benchmark. There's no reason to. I post scores from default settings, and I want people to do the same. That way, people can gauge whether their rig is running smooth or not by comparing it to similar rigs. Or maybe they'll want to go out and spend some money to upgrade. BTW, I had a BFG 5900nu (clocked at ultra) and replaced it with a 9800 Pro. My scores went up all the way across the board. Yes it is. You have a great rig.
I got you beat @ 41 years old. You really should try it, 'cause I've seen a lot of ATIs run AQM3 and they don't score what yours has unless the driver is turned to Low or Off for those settings, even overclocked. Post your AQM3 results before and after making the runs for a comparison, with direct links like the ones below my sig picture. Mine @ default settings and overclocked drops to around 46,000.
This is going to sound strange, but I put the settings to "optimal performance" (keeping 4x AF on by default, as is set in the bench, application preference in the driver) and the score went down about 200 points. Has anyone ever heard of that happening??? The picture quality went down a bit too, as I expected it to. But the lower score? :shrugs: I know in nVIDIA drivers you can force 4x AF off for everything. Are you running it with 4x AF forced off? In the ATI driver panel, you either have application preference or 2x AF as the minimum.
I think the problem comes in when AA and AF are enabled in the driver while they are already being set by the benchmark. That's why just about everybody turns them to Off in the driver. Just try the lowest setting with the ATI driver. Kinda like with Far Cry: you get problems running AA and AF in both the driver and the game. They need to be set in one or the other. I usually set mine to application control in the driver, then adjust AA and AF in the game settings.
I just ran it again, going into the drivers and forcing 2x AF instead of 4x. I scored 48,820...that's with IQ set to High. Let me figure out how this online submit thing works. I'll post 'em.
I did. I put AA and AF to application preference and set the overall slider to "optimal performance"...which puts texture and mipmap to Performance (which is the low setting). When I ran it like that, the score dropped from 48,206 to barely over 48k.
Here's a result at default (this is with driver settings at default...high IQ): http://arc.aquamark3.com/arc/arc_view.php?run=1084558345
That's a very good score and an excellent score on the CPU. It just seems the graphics score looks like one where the driver is turned down. I don't know...I won't doubt you anymore, since I'm no ATI expert, but maybe do one again with the driver on Low or Application Control.
Blue, my GFX score is 6,300. Your GFX score is 7,185. That's almost a 1k difference. You're running with IQ on Low, so it would make sense that your GFX score is higher. I just think the 9800 Pro is a faster card than a 5900nu. I have a BFG 5900nu. It was my main card before I bought the 9800 Pro. I had it flashed to a 5950, and the 9800 Pro still beat it (both being overclocked, of course). I ran the test above with everything set in the driver to application preference, and High Quality on mipmap and texture. The benchmark runs at its default image quality of High, with 4x AF on. Here is a score with everything the same, except mip and texture set to Performance: http://arc.aquamark3.com/arc/arc_view.php?run=413479231 It went down just a little. I don't know what it's doing, but it runs and looks best on high IQ.
That is odd, 'cause using it @ the default setting enables 4x AF, which will look better but should slow down the frame rate. Putting it to Application Control, if I'm not mistaken, sets it to 2x, meaning a higher frame rate @ a lower IQ, and thus a higher score.
Isn't that kinda like cheating? Of course you are gonna get a high score if you set everything to look like crap. I could do that too and gain 400 pts in 3DMark03. You have to benchmark the way everybody else does.
No, it's not cheating, and the image doesn't look like crap. I'll get some screenshots if you want 'em to prove it. Like I said before, anybody who knows uses this method with an nVIDIA card. If it was considered cheating, AQM3/Massive Development wouldn't allow it. Go ask 'em for yourself: Cheated Am3 Scores. By default AQM3 will test your graphics card at a resolution of 1024x768 pixels in 32-bit color mode. Anti-Aliasing (AA) is disabled and 4x Anisotropic Filtering (AF) is enabled. If you go setting your card's AA and AF to something different from what the benchmark is doing, you're going to be slowed down. Application Preference would be the proper setting for an ATI.