Hi, I just got a new 22" ViewSonic VX2235wm and I'm running it at 1680x1050. I didn't notice much difference on my computer until I started playing World of Warcraft (yes, I'm an addict). I realized I was getting 25-60 fps, and it was making me sick. I currently have a Sapphire X850 XT (non-CrossFire). Am I getting low fps because of my graphics card? I'm really lost. My setup:
ASUS A8R-MVP Socket 939
AMD Opteron 148 Venus 2.2GHz Socket 939 (OC'd to 2.4)
OCZ Gold Edition 2GB RAM (2 x 1GB)
ATI X850 XT
500W PSU
So do I need to upgrade my graphics card for this resolution, or is something wrong with my LCD? I also tried overclocking, but still no luck at all; I was getting around the same fps. Any help or advice would be great. Thanks!
What kind of budget are you looking at? I believe the 8800GTS 320MB version was just released; it should run just under $300. I'm not exactly a fanboi of either ATI or nVidia, but that's a pretty good deal imo.
There can't be anything wrong with the actual monitor; that wouldn't make sense. The increase in resolution, however, would slow things down. When I got a 1680x1050 monitor I had an X800 XT; games were slower than before, of course, but playable.
Yes, you are. That resolution will choke any X8xx series card, even at low detail settings....and nobody wants to game like that.
Just so you know...anything over 30 fps will seem the same to your eyes...so if you don't see much difference below 30, it could just be the refresh rate. Set it to 85 if you can.
I play games at 1680x1050, and I used to play at 1280x1024, where I was getting around 60-70 fps. Would an X1950 XT do the trick? I like Nvidia cards, but I've got an ATI motherboard... so I'm sticking with ATI. And yes, the low fps is making me really, really sick; I feel like throwing up. I also set the game to 75Hz, but I was still getting the same refresh rate.
It'll be faster, no doubt, but don't expect miracles from an LCD display. It'll never be as smooth as a CRT, and ghosting, even if it seems unnoticeable, usually puts a certain amount of strain on the eyes that can get tiring after a while. Right now I'm typing this in front of a reasonably good quality and supposedly fast LCD (NEC 90GX2) I bought two days ago to fill the void while my CRT gets fixed, and I can barely stand it anymore.
That's not exactly true. It depends on the game type. In something like F.E.A.R., you will easily notice the difference between 30 and 60 fps. In something like C&C, you most likely won't. Go check Wikipedia if you don't believe me.
The eyes won't see a difference; it might just be a little less smooth. And not to be rude, but Wikipedia isn't a good source on its own. If it's used in addition to another source then it's fine, but the info there has no guaranteed reliability.
I had a whole explanation typed out, but I deleted it because I have no interest in sharing information with someone who refuses to accept something new. Long story short: my mom is a licensed optician, and I worked at an eye doctor's office for several years, during which I read and learned a lot. Enough to know that your explanation is just way too simple. There is no point at which everything hands-down becomes fluid. Like I said, it depends entirely on what you are viewing. I refuse to play any first-person shooter below around 50 fps. Under that, it's hard to have the quickest reaction time in an intense multiplayer game.
Here's my deal: if you prove to me that the eye can see more than 30 fps, then I will listen; I just personally don't trust Wikipedia. From what I've been looking at, most movies run at 18 fps... does that make you sick with the low frame rates? If you SHOW ME proof that the eye can see more, I will listen, but saying you have proof and then not showing me doesn't do much for me...
Wikipedia's reliability varies considerably from one article to the next, but the articles that cite verifiable sources, and a whole lot of them do, are pretty accurate. Many of those with dubious content actually say so; too bad the media won't follow that example, lol.
Actually, the brain can be tricked with frame rates as low as 10 or even 5 fps. It's all done with movement speed and motion blending. The slower the motion from one point to another, the lower the frame rate you can get away with and still have fluid motion. This is why cutting-edge tech demos always pan around slowly: fewer frames are needed to produce the same effect, at the cost of motion speed. In films, 24 fps looks fine because the exposure timing of the film catches the blur from movement, effectively blending two frames together; that simple ingeniousness means lots of blur from fast motion and little blur from slow motion, all blending together seamlessly. If this effect could be reproduced (and perfected) in a 3D game, cut scenes could run at 24 fps and would have the motion of film, with no choppiness, since it would be masked by the correct amount of blur. Make a little more sense now?
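The blending idea above can be sketched in a few lines of code. This is just an illustration, not how any actual game engine does it: treat each frame as a row of pixel brightness values, and average groups of fast "source" frames into slower output frames, the way a film exposure smears motion across the shutter interval. All names and numbers here are made up for the example.

```python
# Rough sketch of frame blending: average each consecutive group of
# high-rate source frames into one lower-rate output frame, so fast
# motion leaves a long smear (lots of blur) and slow motion leaves
# little blur. Purely illustrative, not a real renderer.

def blend_frames(source_frames, group_size):
    """Average each consecutive group of frames into one blurred frame."""
    blended = []
    for i in range(0, len(source_frames), group_size):
        group = source_frames[i:i + group_size]
        # Per-pixel average acts like film exposure catching the blur.
        avg = [sum(px) / len(group) for px in zip(*group)]
        blended.append(avg)
    return blended

def frame_with_object(pos, width=8):
    """A 1-D 'scene': brightness 255 at the moving object's position."""
    return [255 if x == pos else 0 for x in range(width)]

# High-rate source: the object moves one pixel every frame.
source = [frame_with_object(p % 8) for p in range(8)]

# Blend down so every 4 source frames become 1 output frame.
out = blend_frames(source, 4)
# Each output frame smears the object across 4 pixels at 1/4 brightness:
# that smear is the blur which masks the lower output frame rate.
```

The point of the toy example is the trade-off the post describes: the blended output has a quarter of the frames, but each frame carries the motion information of four, so playback still reads as continuous movement.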
That's the problem with getting a high quality flat screen, if you don't want things to look like arse you need to run at native res and if you don't have a card with enough grunt to support such a res you lose out on frames.
Turn down the quality settings. You could also go into the .ini or .cfg files, if there are any, and tell the game to use more system memory, or try other optimizations that may give you a boost in the FPS department.