I am sure your current setup will still look better than the Xbox 720... http://guru3d.com/news/xbox-720-will-get-gpu-based-on-radeon-6670/
This. I don't get how AMD owners of 6970s were railing on about how good it is; it's a garbage card for the price. The only people who would get good performance per dollar are people upgrading from a GTX 260/4870 etc., and I doubt they'd be buying the fastest card given that they've been fine with average performance for years. Unless there's a Kepler card that matches the 7970 for $300, I won't be upgrading. $600 for a high-end card just so I can play Deus Ex HR, AVP, Just Cause 2 and Crysis 2 with a 60-frame minimum?
But will it play better??? Specs and actual play are very different... If the 720 is getting a 6670 I can't wait to get one; that's a nice jump, even though I'm still happy with my 360.
Buy a console now, and the new ones will arrive in 2 years. Nice advice. And there was no such thing as "slightly" back in the day. And in some cases there still isn't. Look at Metro 2033. Crysis 1 is still far better looking on PC. Someday we will see the graphics gap widen again, and hopefully soon. But it's all worth it. The fact that game devs are just making console ports now is actually in PC gaming's favor, since you don't HAVE to upgrade to a new GPU every year like back in the day. Take my 8800 as an example: I can play games and get A LOT better graphics than the 360, and better performance with it. Sure, I could spend $500 on a new GPU, but right now, to me, the top-of-the-line GPU is not worth $500 more than what I have.
"back in the day" you didn't have to upgrade graphics cards every year....lol. "back in the day" a graphics card lasted several years. In fact, an FX5900 Ultra was still viable until the 8800 series was released.... No offense, but you're too young to talk about "back in the day". I've got PC's and PC games that are older than you are.... It's only been the last few years that graphics cards have gained accelerated upgrade needs....and that's due to DirectX being updated. Most DX9 cards were viable throughout the DX9 API life-cycle. Same with high-end DX10 cards. DX11 has made DX10 cards "obsolete" faster and with developers starting to use more effects and higher polygon counts, upgrading is becoming more necessary.
Yes you did, if you wanted to play the next games anyway. The ENTIRE FX series of GPUs was a joke. The FX5900 Ultra was hardly viable upon release, and as soon as the 6800 Ultra came out the FX series was useless, as the 6 series was easily 3x faster. You do realize that even the GeForce 4 Ti was a lot faster than most of the FX series? It may have even been faster than the 5900, can't remember. But it showed that the FX series was very poor.

Good for you. I have video games older than myself as well. I started building computers when I was about 13, so I got into this a lot earlier than most people, thanks to my stepdad who helped me with my first one. You think I don't remember the 90s? I very much remember the 90s. I remember my PlayStation, I remember my 13-inch lucky-it's-color TV. Used to play Gran Turismo and Final Fantasy 7 all the time. I remember when the GF2 launched, as I had one. I've been in the computer scene for a very long time; I may only be 23, but I've spent more than half my life with computers. I also remember having dial-up in 1997 on a computer my dad got from work that had a Pentium 133 in it, back when AOL was pretty much required to surf the web.

And what makes now any different from DX 7, 8, 9? I don't see a difference. Upgrading has ALWAYS been necessary if you wanted to play the next set of games. Until recently. You have it backwards: upgrading has not really been necessary at all in the past 5 years, as game developers are not making a lot of use of DX11 and are more worried about optimizing DX9 for consoles. Do you know how many graphics cards I went through before my 8800? Every GeForce series, including one ATi. And now that my 8800 is still running games, I have not needed to upgrade it. I did have a thread on its performance. You can also check the screenshot thread.
I have games going back into the '70s....lol. By the time you were 13, I was making a living selling custom systems. I built my first computer in 1992. I got my first computer in 1988. The first computer I built supported 16MB of EDO RAM.

I ran a GeForce FX5700LE from launch until 2006, when it was replaced by a 7600GT. It had no problem running any DX9 game released. The card before it was a Radeon 9200SE that I got in 2001 when Windows XP was released. Prior to that was a Diamond Stealth II that I got in 1998. Prior to that was a 4MB Trident graphics card that I got in 1993. I also had a Diamond Stealth III S540 Extreme, GeForce 6800, GeForce 6200SE, GeForce FX5600XT, Radeon 9600XT, and Radeon X700 Pro during that same period from 2001 to 2006. At no point was I ever "forced" to "upgrade" due to being unable to game. The FX5700LE played Far Cry just fine.

You might want to look at reviews. The 6800 Ultra wasn't "3x faster"....lol. In most cases, it wasn't even 2x faster at common resolutions. GeForce 6800 Ultra Preview
Except it did not. The 6800 Ultra even struggled with Far Cry at times. I struggled to keep 30FPS at max settings with full AA and AF at 1600x1200 with my 6800, so there's no way an FX5700 is going to play it just fine, at least not at anything above medium. And I'm supposed to trust a review from here? Nah, I'll use my real-world experience. And based on that, the 6 series was a very dramatic improvement over the FX. Although after reading the review you linked, it pretty much does have the 6800 being 3x faster than the 5950, which was the best the FX series had to offer. So how did that support your case at all? Yes we are. Would you like to join in?
Modern Warfair. Haha. Sorry. Had to point that out. Yea, DX10 cards are not obsolete at all. The benefits of DX11 have really not even been shown enough to warrant a DX11 card.
So, you looked at 1 benchmark from the entire review..... In most benchmarks, the difference was less than 100%. Here's a real nice example....

UT04: 5900: 85fps / 6800 Ultra: 118fps
Halo: Combat Evolved: 5900: 35fps / 6800 Ultra: 55fps
Return to Castle Wolfenstein: 5950: 120fps / 6800 Ultra: 140fps
Call of Duty: 5950: 110fps / 6800 Ultra: 140fps
Unreal II: 5950: 77fps / 6800 Ultra: 101fps

Where is this 3x performance gain??? I'm not even seeing a 100% gain.... According to you, in Call of Duty the 6800 Ultra should have been around 400fps.....in UT04 it should have been around 260fps.....in Halo, it would have been around 130fps...

As I said in another thread, you make up more bull**** than anyone else on this forum. Not a single review shows your claim of 3x the performance gain. nVidia never even made that claim. The average difference from 5900 Ultra to 6800 Ultra was only 40-60% at common resolutions.

Obsolete, in this case, is a matter of opinion.....which is why I had it in quotations.
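For reference, here's a quick sketch (Python, purely illustrative and not from the original posts) that turns the fps figures quoted above into percentage gains; only the game names and frame rates from the post are used, nothing else is assumed.

```python
# Illustrative only: percentage gain of the 6800 Ultra over the FX 5900/5950,
# using the fps figures quoted in the post above.
benchmarks = {
    "UT04": (85, 118),
    "Halo: Combat Evolved": (35, 55),
    "Return to Castle Wolfenstein": (120, 140),
    "Call of Duty": (110, 140),
    "Unreal II": (77, 101),
}

for game, (fx_fps, ultra_fps) in benchmarks.items():
    gain = (ultra_fps / fx_fps - 1) * 100  # percent faster than the FX card
    print(f"{game}: {fx_fps} -> {ultra_fps} fps (+{gain:.0f}%)")
```

Running it gives gains in roughly the +17% to +57% range for these titles, which is the point being made: well short of a 3x (+200%) difference.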
OK then... here: http://www.youtube.com/watch?v=Ysd3...xt=C3ae9a4dUDOEgsToPDskLy7I1gZkPoLaP1loukv4xh
Where's your link? And what do you consider a common resolution in 2004? In 2004 I was gaming between 1280x960 and 1600x1200. I never went below 1280x960.
Those framerates came directly from Hilbert's review. In 2004, the most common resolutions were 800x600 - 1280x1024 Given that you claim to know everything there is to know about computers....even after being proven wrong repeatedly.....you'd know how the resolutions are chosen for reviews.
If DX10 cards were truly obsolete, I wouldn't have a GTX 275 as a "backup" card. It struggles with Diablo 3 and F@H, but it's still usable.
800x600? Who the hell used that in 2004? You? You can't use what was considered a really low resolution even in 2004 to argue that the 6800 was not significantly faster. You fail to realize that the FX GPUs were horribly designed and inefficient, and were pretty much useless in DX9, which is what they were supposedly designed for.
Actually, most of my gaming was done at 1024x768, as that was the native resolution of my monitor. But to say that the FX series was "useless in DX9" is proven wrong by the benchmarks.... The benchmark results I posted were from 1280x1024.... 1600x1200 wasn't a common resolution due to the cost of monitors that supported it. Most 13-17" monitors used a native resolution between 800x600 and 1280x1024. Try discussing a topic you actually know....cuz this isn't it.