Discussion in 'The Guru's Pub' started by Saeid, Jun 6, 2004.
my eyes are so 1337, that movies lag
lol, i suppose you are a very frustrated person, right? without being able to see movies.
The greater the frames-per-second on a monitor, the less chance you're going to see flicker.
In a perfect world, our monitors (or games) would only need to be set at 40 frames-per-second, but the issue is that sometimes they're out of sync with our eyes. This causes eye strain, so the higher the frame rate, the easier it is to sync up with our eyes.
A very good question though.
Holy Resurrection thread Batman......
I think the eyes process way differently than an LCD.
7 year Necro = Wow! or = Huh?
well at least he used search!
welcome to the forum Metalrasputian :thumbup:
Depends on what you've been smoking.
The topic seems to have gone down the route of describing the eye's abilities in terms of monitor fps. The problem with that is that monitor fps, and in fact most photographic and cinematic imagery, is displayed as a series of discrete images in time, i.e. x frames per second.
The human eye doesn't break what it sees down into discrete time-sliced images; it registers everything continuously. The topic at hand is really how long it takes the eye to process an image, and the measured time seems to be about 1/25th of a second; the phrase used is 'persistence of vision'.
If it must be given in terms that are totally wrong for describing the human eye, then it would be 25fps.
Wow, edit: just spotted how old this thread is... um, move on, nothing to see here!
After a plate of hash brownies, I think it's about once every ten minutes.
I think I heard somewhere that if any piece of film, in cinema or on TV, runs at anything under 23fps for over 1 second, it's considered to potentially contain subliminal material and must be changed. So I guess that's the lowest threshold of clear eye-to-brain understanding. So any games that dip under 23fps could be doing all kinds of things you'll never notice.
I might be totally wrong though, I was told a long time ago.
You're close, it's actually 23.976, or rounded to the nearest whole number, 24fps...
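For anyone curious where the odd 23.976 comes from: it's 24fps slowed by the NTSC factor of 1000/1001. A quick sketch of the arithmetic (using `fractions` just to keep it exact):

```python
from fractions import Fraction

# NTSC-compatible film rate: 24 fps slowed down by a factor of 1000/1001
ntsc_film = Fraction(24) * Fraction(1000, 1001)  # = 24000/1001 exactly

print(float(ntsc_film))           # ~23.976023976...
print(round(float(ntsc_film), 3))  # 23.976
```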
With regards to gaming, I'm happy with 60fps or more.
Oh FFS, not this thread again.
The human eye does not have a refresh rate, and does not have "FPS".
And anyone who disagrees is wrong. And of course I wouldn't say anything without backing it up.
The human eye absorbs light using rods and cones; they are activated based on the color and intensity of light entering the eye, and the information is sent straight to the brain for processing in micro-pulses.
Here's the deal though: it's not a single data line over which visual data is sent. There are tens, even hundreds of thousands of these data lines, and they don't all pulse at once.
Think of it this way, since people love using the terms "refresh rate", "resolution", or "FPS" when referring to organic vision:
Imagine an LCD monitor where each pixel has its OWN refresh rate. Furthermore, it only refreshes when the data changes. So if a red pixel is highlighted, it'll remain on, without refreshing, until told to disable.
So each light sensitive cone/rod in your eye, will pulse the signal to the brain.
Why do you see ghosting? Because rods and cones warm up and cool down, which is why if you're outside during the day and then enter a poorly lit house, it'll seem darker than it really is, as your rods need time to cool down.
Why does the optic nerve pulse? Is the pulse a refresh rate? On the latter: no.
The eye pulses simply so as not to fry the optic nerve.
Resolution. Yes your eyes have a type of "resolution", but not like an LCD resolution.
The center of your retina is the most dense with cones and rods, so it has a higher "resolution", whereas your peripheral vision is less dense.
The resolution varies from person to person, and is simply how many cones you have in the back of your eyes.
Center density: Stare at a single word and, without moving your eyes, try reading the surrounding words. You'll find it hard, as they are most likely blurry, since the surrounding cone density is lower.
So what have we learned today?
Refresh rate: none. Each cone sends its own signal based on change.
Resolution: sort of: density of cones.
Frames per second: none. The brain is constantly processing the data; like with refresh rate, each cone is separate, so it doesn't update the entire frame/image at once, but rather each cone individually. And the actual rate of change varies from person to person and by age.
We clear? Can we put these threads to rest now?
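The "each pixel has its own refresh" analogy above can be sketched in a few lines of code. This is purely a toy model (the `Cell` class and the signal values are made up for illustration, not biology): each cell only fires a signal down its own "data line" when its input changes, so there's no global frame tick at all:

```python
# Toy model of change-driven sensing: a cell "pulses" only when its
# stimulus changes, instead of refreshing on a global frame clock.
class Cell:
    def __init__(self):
        self.last = None  # nothing sensed yet

    def sense(self, value):
        """Return a signal only if the stimulus changed, else None."""
        if value != self.last:
            self.last = value
            return value  # pulse down this cell's own data line
        return None       # steady input -> no new signal

cells = [Cell() for _ in range(4)]
frame1 = ["red", "red", "blue", "blue"]
frame2 = ["red", "green", "blue", "blue"]  # only one "pixel" changed

signals1 = [c.sense(v) for c, v in zip(cells, frame1)]
signals2 = [c.sense(v) for c, v in zip(cells, frame2)]
print(signals1)  # every cell fires on first exposure
print(signals2)  # only the changed cell fires
```

Note there's no loop ticking at some fixed Hz: nothing here has a refresh rate, which is the whole point of the post above.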
Fighter pilots can detect up to 220fps, which is to say that if a single differing frame is displayed for 1/220th of a second, they can spot it.
I sure would like a legit 100hz lcd tv. Not one of those pixel overclocked piss poor TN panel, 3D, overpriced pieces of crap.
But they would likely cost a fortune and for me 60 frames is good enough for most games.
Less than 60 sucks though.
anything lower than 85Hz hurt my eyes with my CRT, but I don't know if that has to do with "eye fps".
this is directed simultaneously at your response, and the necro itself.
Make sure you guys read this.
Well said, I like that idea. Our eyes adapt to our way of living, so there is no set frame rate for every person. I was always sceptical about the whole 60fps thing. I like to run my games above that as much as possible; when I turn V-sync on in a racing game it feels laggy and mushy, like Pole Position, versus running at between 80-100 frames per second. It's a must in iRacing: you have to deal with split-second decisions and it's just not possible at 60 or fewer frames. The problem with my opinion is I am only running a 60Hz monitor. My thought is maybe V-sync causes micro stutter (something less than 60)?
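One possible explanation for that mushy v-sync feel, sketched as arithmetic (this is a simplified double-buffering model, not a claim about what iRacing actually does): with v-sync on a 60Hz monitor, a frame is only shown on a vblank boundary, so a frame that just misses the ~16.7ms deadline waits for the next refresh and effectively takes ~33.3ms, which could read as micro stutter:

```python
import math

REFRESH_HZ = 60
vblank = 1000 / REFRESH_HZ  # ~16.67 ms between refreshes

def vsynced_frame_time(render_ms):
    """With v-sync, a frame is only displayed on the next vblank boundary."""
    intervals = math.ceil(render_ms / vblank)
    return intervals * vblank

print(f"{vsynced_frame_time(15.0):.2f} ms")  # fits in one interval
print(f"{vsynced_frame_time(17.0):.2f} ms")  # just misses -> waits a full extra refresh
```

That jump between ~16.7ms and ~33.3ms frames (instead of, say, a steady 17ms uncapped) is one common story for why capped 60fps can feel worse than an uncapped 80-100fps.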