^ The game is not badly coded at all; it has far draw distances and detail that make it hard to run. You can drop the settings and run it fine. Just because not everyone can run it at max doesn't make it badly coded.
At last, a person that agrees with my opinion about Crysis. I wrote it is badly coded because most people say that and I didn't want to get ''flames''. +1000
Yep. Too bad adding another GPU also introduces a lot of microstutter, and the framebuffer needs to be 1.5 GB or more to keep it well over 30 fps without weird **** happening when you turn on AA. Basically, a 2GB 5870 or GTX470 should already get some interesting results. We'll have to wait and see.
^ Yeah, that is true. I seem to be able to run it very well at 1680x1050; I will test at 1920x1200 when I get a new screen. I don't think you need to use AA with this game, and as far as microstutter goes, it hasn't really been an issue.
If the GTX470 is clocked at 650/1300 it should be slightly faster than the 5870's average. Also, do not expect the 480 below $499.
Crysis is not badly coded, but Crytek wanted to establish it as a standard; that's why they added ultra-high levels of detail, and that's why it's hard to run.
The 4870X2 isn't supposed to display microstutter, because of its shared memory architecture, but two of them will. I saw Efaco5's Crysis review on Youtube with two 5970s, on which I based my statement. 8xAA at your native res, btw. >100 fps a lot of times, but 30-40 during the final battle. 9.12 hotfix drivers, so maybe it's been fixed by now.
I have a 4870 X2 and haven't ever experienced "microstutter", but it does not have shared memory; it has 1GB x2 for a total of 2GB... each 4870 chip has its own 1GB frame buffer.
I know...I was hoping the term architecture had some hidden meaning for X2 owners that I'd forgotten about. I just picked it up somewhere, so my guess is the guy meant the on-pcb pci-e bus.
My GTX260 has probably been the best card I have ever bought, or maybe my 8800GTS 320MB. That said, I do miss my old ATI 9250SE and the days of CS 1.6, or my NV 6800XT which I pipe-modded and OC'ed into a GT, lol. I don't get all this fuss about Fermi... I really don't see what's so special. I mean, what games are there to take advantage of the card? I just can't see what would make me buy this card.
I think the MS is a lot of BS and I haven't seen any hard evidence of it existing... Also, the 5870/5850/5970 can all play Crysis quite well (granted, the 5970 isn't a single GPU, but it's still a single card).
I recommend these (GTX260, 275, 280, 285, HD4870, HD4890) users wait till the next-generation cards in 2011; besides, these cards can max out every game except the two that everyone knows about.
No, it's not a single GPU just because it only takes one PCI-Express slot. It remains a dual-GPU solution with dual-GPU problems...
Lol you changed that nicely. I'm sure the 5870 would give a smoother gaming experience with a 2 GB framebuffer, regardless of that microstutter BS.
Hey all, let's wait and see what Hilbert has to show us when the GPU is in his possession for testing; then we can clearly see which one will reign. 60% increase over the 5870? My personal opinion: I doubt it. 15-20% maybe, 60% no. But like I said, let's wait and see, then argue later. Peace out.

That's why I am holding off on buying a brand-new GPU because of Fermi. Once Hilbert has fully tested the 480 and the lesser version, the 470, I will decide which one, whether it's the 5870/5970 or the NVIDIA 470/480. After the benchmarks I will make my decision real quick, so I am definitely waiting for Fermi's initial release and testing.
I changed it because I really don't want to start a huge debate on MS. I can link you to the post I was referring to, where one site tested what it called "MS" and forum people then showed it happens just as much with a single GPU. However, that's people like you and me, not a site I'd say is 100% accurate, so instead of starting a huge MS debate in here I'll just avoid it.
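For what it's worth, the usual way frame-time testing frames this debate is that average fps alone can't show microstutter: two runs with identical averages can pace frames very differently. A minimal sketch with made-up numbers (the frame-time values are hypothetical, not from any card mentioned here):

```python
# Toy illustration: identical average fps, different frame pacing.
# The numbers are invented for illustration, not measured data.

def stats(frame_times_ms):
    """Return (average fps, worst single frame time in ms)."""
    avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = max(frame_times_ms)
    return round(avg_fps, 1), worst

smooth = [25.0] * 8          # steady 25 ms per frame
stutter = [10.0, 40.0] * 4   # alternating short/long frames, AFR-style

print(stats(smooth))   # (40.0, 25.0) -> even pacing at 40 fps
print(stats(stutter))  # (40.0, 40.0) -> same 40 fps average, but 40 ms spikes
```

Both logs average 40 fps, yet the second alternates 10 ms and 40 ms frames, which is exactly the pattern a plain fps counter hides.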
Even if the 5870 had a 2GB frame buffer and used every ounce of it, how would the fps keep up? It wouldn't be 60 fps; probably more like the 20s, possibly the 30s, but not the 60s. Now, if it was the 5970 using a full 2GB frame buffer, I could see that; on a single 5870, nah, it's a waste. 1GB or 1.25GB is good for the card; anything higher than that and the GPU will slow down. Don't get me wrong, it is a fast card, but could the GPU fully use a 2GB frame buffer and keep the frame rate high at the same time? The card is fast, but not that fast. Think about it: fully utilizing a 2GB frame buffer at blistering speeds above 60 fps? Wow, just wow. Not going to happen anytime soon, not right now, that is.
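As a rough back-of-envelope on why AA at high resolution eats VRAM so fast: an MSAA render target stores one color and one depth value per sample, per pixel. This is a simplified sketch (assuming 32-bit color and 32-bit depth, and ignoring textures, geometry, and the extra targets real engines allocate), not what Crysis actually does:

```python
# Hypothetical estimate of one MSAA color+depth render target's VRAM use.
# Assumes 4 bytes color + 4 bytes depth per sample; real engines allocate
# several such targets plus textures, so totals climb much higher.

def render_target_mb(width, height, msaa_samples,
                     color_bytes=4, depth_bytes=4):
    """Approximate size in MB of one MSAA color + depth render target."""
    pixels = width * height
    total_bytes = pixels * msaa_samples * (color_bytes + depth_bytes)
    return total_bytes / (1024 ** 2)

# 1920x1200 with 8x MSAA:
print(render_target_mb(1920, 1200, 8))  # 140.625 MB for a single target
```

So a single 8x MSAA target at 1920x1200 already costs about 140 MB before any textures are loaded, which is why a 1GB card runs out of room long before a 2GB one would.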
I used to own both cards. Don't laugh about the 2XXX cards; I think the drivers were holding them back badly. In some games, like Oblivion, the 2900XT 1GB version completely pissed all over the 8800GTX, over double the performance in some cases, YES. In other games I personally tested, the 8800GTX spanked the 2900XT 1GB version. Great card, but overall I would choose the 8800GTX because it was faster in most games I tested. Early beta testing of Crysis multiplayer showed about the same fps on both cards; if I remember correctly, the 8800GTX was only slightly faster than the 2900XT 1GB version.

I tested a whole bunch of games. I can't remember them all, it's been quite some time, but there was one game where the 8800GTX took a major lead against the 2900XT; I think it was some kind of Call of Duty game. I used both cards with their early drivers to be more fair on performance; it wouldn't make any sense to pit fully matured 8800 drivers against completely unmatured 2900 drivers, so I put them head to head with their early drivers. However, this was a long time ago; I don't know how it stands now with the newer drivers, but yes, I owned both cards in the past.

But yes, in ES IV: Oblivion, the 2900XT completely destroyed the 8800GTX in some cases. I can't remember which driver it was; I think it was the 2nd or 3rd driver that came out after the card, not the one that shipped with it, hell no. There was a major, drastic increase in performance from the early drivers to the ones that showed the 2900XT spanking the 8800GTX. ES IV: Oblivion ran like s-h-i-t with the very early drivers; then I updated them later and there was a major-league performance difference. I had a video on this on YouTube with my original account, but a lot of accounts are being pulled down no matter what subject they cover, so I never reposted them. If someone asks, then yes.
Oh, this one I remember clearly: the benchmark demo called Call of Juarez. I think the settings were blasted up when I ran it, and the 2900XT ran extremely slowly; this I remember correctly, it ran like s-h-i-t. If anyone still has a 2900XT 1GB version in their possession, give me a heads-up on how much performance difference the current drivers have made all this time; I'm very, very curious. I don't still own the card; I sold it a long time ago and moved to 8800GTX SLI, but then I sold those two and got a single GTX 216, and I can run just about any MMORPG with blasted-up settings, 16x AAQ, supersampling, etc. I can make an old game like Lineage II or Final Fantasy XI Online utilize over 600-700MB of texture RAM by doing a lot of modifying of its registry settings and overriding some settings. I mostly play MMOs now, trying various ones, hopping from one game to another to form my own opinions. FFXIV: already signed up for beta. But yes, still waiting for Fermi testing; till then, peace out. Sorry about the off-topic here; if you all want to see Hilbert's 2900XT 512 test, here's the link: http://www.guru3d.com/article/ati-radeon-hd-2900-xt-512mb-review/