I wasn't going to make a whole thread about this, but I changed my mind after I realized how long my post was getting. I've seen a lot of posts here at Guru3D and elsewhere online comparing 3DMark06 scores. I've even been flamed here at Guru3D for discussing how (if at all) 3DMark scores on certain cards relate to actual game performance. This issue has irritated me since the day my GTX 285 arrived in the mail.

I bought my 285 to replace my 8800GTS 512MB. Of course, the first thing I did was fire up 3DMark06 and Vantage to see what my new scores would be. I honestly can't remember what my 8800 scored in Vantage, but I think the 285 was about 4,000 points higher. Much to my surprise, my 3DMark06 score was almost the same. Which brings me to my irritation: IMO, 3DMark06 has become completely unreliable in its scores, yet it is still widely referenced when comparing new GFX cards.

In the past six months I've run 3DMark06 on a 4870 512MB, my 8800GTS 512MB, and of course my GTX 285. While not exactly a huge upgrade from an 8800GTS 512MB, the 4870 scores just under 13,000 on the VGA charts. (I would reference my own tests, but I hated the 4870 so much I didn't save any.) The lowest score I ever got with my 8800 was 13,596; the highest after OC'ing it was 15,232. Since the 4870 didn't OC well, it got destroyed in 3DMark06 despite the fact that we all know it's about 10-15% faster than an 8800GTS 512MB.

After those tests I didn't really think much about it until I ran 3DMark06 on my GTX 285, and those scores demonstrate what I mean about 3DMark06 no longer being reliable. The highest I have been able to get with my 285 is 16,531, even though my CPU was clocked at 3.0GHz for the 8800 tests and 3.2GHz for the 285 tests. Anywho...
While I understand why my GTX 285 scores are only about 8% higher than my 8800 scores, it's obvious from what I read in this forum and elsewhere that the vast majority of people do not. There are many reasons the scores come out this way, but without going into the issues too much, the main problem lies with the default settings of 3DMark06, which almost everyone uses because that's how you get the best score to brag about to all your nerdy friends. :nerd:

Unfortunately, the majority of us do not play games at 1280x1024 with AA and AF turned off. Since a card like the 285 doesn't really begin to shine until you run games with 16x AF and 8xQ AA, the default settings of 3DMark06 make it look like a huge waste of money. That being said, anything over 4x AA would slow my 8800 down significantly, while it takes 16xQ with Supersampling Transparency AA to slow down my 285.

While I firmly believe Futuremark benchmarks are a great way to get an idea of how a card will perform, I think they are relied on a lot more than they should be. This is especially true when comparing newer cards with older GPUs, and GPUs with more than 512MB of memory. I just think we need to shift the conversation away from benchmarking programs and focus more on real-world performance, a.k.a. in-game FPS.
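For anyone who wants to double-check where that ~8% figure comes from, here's a quick Python sketch using the scores I quoted above (the variable names are just my own labels for those runs):

```python
# Quick sanity check on the 3DMark06 score deltas quoted in this thread.

def pct_gain(old, new):
    """Percentage improvement of `new` over `old`."""
    return (new - old) / old * 100

gts_stock = 13596   # lowest 8800GTS 512MB score
gts_oc    = 15232   # highest score after OC'ing the 8800GTS
gtx285    = 16531   # best GTX 285 score

print(f"285 vs stock 8800GTS: {pct_gain(gts_stock, gtx285):.1f}%")  # ~21.6%
print(f"285 vs OC'd 8800GTS:  {pct_gain(gts_oc, gtx285):.1f}%")     # ~8.5%
```

So even against a stock 8800GTS the default-settings gap is only ~22%, and against the overclocked card it shrinks to the ~8% I mentioned, which is nowhere near the real-world difference you see at high resolutions with AA/AF cranked.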