I'm still a novice when it comes to technical computer knowledge, so I rely on forums like this and charts like Tom's Hardware's to research components before a purchase. Are Tom's charts accurate? The reason I ask is that I can't understand why some of the results jump around when you change from one benchmark to the next. For example: http://www23.tomshardware.com/graphics.html?modelx=33&model1=585&model2=606&chart=198 The 8800 GTX clearly outperforms all the other cards at extreme resolutions. But drop down to a lower resolution... http://www23.tomshardware.com/graphics.html?modelx=33&model1=585&model2=606&chart=197 ...and the GTX falls behind. On most of the gaming benchmarks, if you change the parameters like this, the result repeats. Are the charts flawed? Or is there a reason the GTX can't handle the lower resolutions as well? My monitor won't do the higher resolutions where the GTX shines, so would the less expensive cards that perform better there be the obvious way to go? P.S. Any suggestions for good benchmarking charts from other sites? Thanks
Yes, but I'm actually quite interested in the reason why a powerful card won't do lower resolutions as well. I must know or my brain will continue to hurt :look:
Because it's more powerful than the other cards, it gets badly bottlenecked by the CPU. Raising the resolution shifts the load back onto the GPU and frees it from the CPU dependency.
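The bottleneck idea can be sketched with a toy model (all numbers below are made up for illustration, not real benchmark data): each frame costs a roughly fixed amount of CPU time (game logic, draw-call submission) plus a GPU time that scales with pixel count, and whichever stage is slower caps the frame rate.

```python
# Toy model of CPU vs GPU bottlenecking. The per-frame CPU cost and the
# per-megapixel GPU costs are hypothetical illustrative values.

def fps(cpu_ms, gpu_ms_per_mpixel, width, height):
    """Frame rate when the slower of the CPU and GPU stages sets the pace."""
    mpixels = width * height / 1e6
    gpu_ms = gpu_ms_per_mpixel * mpixels
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 8.0     # hypothetical per-frame CPU cost
FAST_GPU = 3.0   # hypothetical ms per megapixel, 8800 GTX class
SLOW_GPU = 6.0   # hypothetical ms per megapixel, older card

for w, h in [(1024, 768), (2560, 1600)]:
    print(f"{w}x{h}: fast card {fps(CPU_MS, FAST_GPU, w, h):.0f} fps, "
          f"slow card {fps(CPU_MS, SLOW_GPU, w, h):.0f} fps")
```

At the low resolution both cards finish their GPU work well inside the CPU's 8 ms, so they post the same CPU-limited score; at the high resolution the GPU time dominates and the faster card pulls away. Small measured differences at low res between cards then come down to driver overhead and noise rather than GPU power.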
So you're saying that when the test system's CPU and memory are pushed to their limit, the GTX operates flawlessly, but when the CPU and memory can easily handle the load, the GTX somehow gets bottlenecked, even though everything else is handling the stress and could take more if called upon? I would have thought a high-res setting would cause the GTX to be slowed down by an overloaded CPU and memory. Weird.

This was the test system for the T.H. chart:
Processor(s): AMD Athlon 64 FX-60, 2.6 GHz, 1.0 GHz HT-Link, 1 MB L2 cache
Platform (Nvidia): Asus AN832-SLI Premium, Nvidia nForce4 SLI, BIOS version 1205
RAM: Corsair CMX1024-4400Pro, 2x 1024 MB @ DDR400 (CL3.0-4-4-8)
Hard Drive: Western Digital Raptor WD1500ADFD, 150 GB, 10,000 rpm, 16 MB cache, SATA150
Networking: On-board nForce4 Gigabit Ethernet
Power Supply: PC Power & Cooling Turbo-Cool 1,000 W
CPU Cooler: Zalman CNPS9700 LED
OS: Microsoft Windows XP Professional 5.10.2600, Service Pack 2
DirectX Version: 9.0c (4.09.0000.0904)
Graphics Driver(s): ATI Catalyst 6.10 WHQL; Nvidia Forceware 96.94 Beta
Does it really matter? You're not going to notice the difference between 133 FPS and 137 FPS. It's gonna be sh!t fast no matter what. If the card can still do over 100 FPS when you raise the resolution and quality then it's all good.
Reason 1: $449.99 vs. $649.99 is roughly a 44% higher price. Reason 2: I'm just more curious than anything else. Once Vista and DX10 games come out, the GTX will make those other cards seem very dated. But for now, I would just like to know why the performance drops at lower resolutions.
Yesterday I installed two XFX 8800 GTXs in SLI. On first firing them up I ran 3DMark05, and to tell you the truth I was horrified: the score was hardly any higher than with my previous two 7800 GTXs! But then I ran 3DMark06 at 2560x1600 with 4xAA and 8xAF, and the score was almost double my previous one! All indications are that the CPU does hold back low-res performance; mine is an FX-60, as in the tests. I'm going to overclock the CPU to see what difference it makes, but just for curiosity's sake. Believe me, the minute you run any game at 2560x1600 with everything set to max detail, any doubts fly straight out of the window. The 8800 GTX is an incredible card. My guess is that high-res, high-detail gaming is on the way, and that is what this card was built for and what it excels at.