FPS - how much do they matter?

Verve

New Member
Simple question: If the human eye can only detect 24 fps, why do monitors/games need to exceed this number? Is there really any difference in the way it looks? It doesn't seem to make sense. :confused:
 
Not too sure...

I'm running FEAR with my 6600GT (Ol' Faithful is her name :)) and I'm happy with my FPS rates considering my specs :D

The only reason for me is bragging rights :D, which apparently I don't have :rolleyes:
 
Very true, didn't think about that.

But if it's online, it might be your internet that's making things "jumpy".
 
That's the point: the human eye can't tell that things are "jumpy" once they reach a speed of 24 fps. The way I see it, anything above 24 fps means nothing.
 
Wait, if the eye can only detect 24 FPS, then why do I see a huge difference when my FPS rate (during CS:S) jumps from 25-30 to 65+?

I type cl_showfps 1 in the console while playing... that's how I get my FPS rates
 
If you play a game at 30 FPS and then at 60 FPS, you can notice a huge difference in gameplay. Movies play at 29.97 FPS.

And as angel said, when I'm playing at around 50+ FPS and it dips to around 25-30 FPS, it looks like it's lagging a lot.
 
[-0MEGA-] said:
Movies play at 29.97FPS.
Movies are filmed at 24; it's been that way for years.

Ok, so can anyone explain to me how a person can detect a change in frame rate over 24 fps? I believe you guys when you say you can notice it, but my question is: why? Technically, you shouldn't.
 
framerate said:
It's important when talking about FPS to realize that your maximum frame rate is not really very important. Far more important is your minimum frame rate. When we say we want more FPS, what we really mean is we want more FPS at the bottom of the range. It's no good to have a system which draws games at 100 FPS when you are cruising around, but when you get into heavy combat, it plunges to 30 FPS.

The human eye can easily detect the flicker of a 60Hz refresh rate, and any time your video card produces less than 60 FPS it becomes obviously jerky. At rates above 60 FPS the situation changes: if your PC is delivering 60 FPS or more, then your eye is fully fooled into seeing smooth, non-flickering motion.

Many people believe that frame rates above 60 per second are a waste of time and money as the eye can't detect all those extra frames. However, when frame rates climb into the triple digits the human eye can detect differences in quality, not quantity. Ultimately, only you can decide how many FPS is enough...

Frames Per Second are important because each "frame" drawn on your screen is a rock solid, stand-alone image - fully detailed, with no "motion blur". You can confirm this by taking a screen shot when things on the screen are going fast. Television can survive on 30 fps because each image on a TV screen is blurry.

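The min-versus-max point in the quote above is easy to show with numbers. Here's a quick sketch using made-up frame-time captures (all values hypothetical, just for illustration): two runs with nearly identical average FPS can have wildly different minimum FPS, and it's the slow frames you feel.

```python
# Hypothetical frame-time captures (milliseconds per frame), invented for
# illustration: both runs average roughly 60 FPS, but one has stutter spikes.
smooth = [16.7] * 60                  # steady ~60 FPS
spiky = [12.0] * 54 + [58.7] * 6      # same average, but six slow frames

def avg_fps(frame_times_ms):
    """Average FPS over the capture: frames drawn / seconds elapsed."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def min_fps(frame_times_ms):
    """Instantaneous FPS of the single slowest frame."""
    return 1000.0 / max(frame_times_ms)

print(round(avg_fps(smooth)), round(avg_fps(spiky)))  # 60 60 - same average
print(round(min_fps(smooth)), round(min_fps(spiky)))  # 60 17 - very different lows
```

An FPS counter that only shows the average would report both runs as "60 FPS", yet the spiky one would look like it's lagging every time one of those 58.7 ms frames hits.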
 
http://www.100fps.com/how_many_frames_can_humans_see.htm

I thought this was a decent article explaining it:

How many frames per second can the human eye see?
This is a tricky question. And much confusion about it is related to the fact that this question is NOT the same as:


How many frames per second do I have to have to make motions look fluid?
And it's not the same as


How many frames per second makes the movie stop flickering?
And it's not the same as


What is the shortest frame a human eye would notice?

These are all very different questions.
 
I'd like to also mention that in certain games the physics actually change with higher FPS. The Quake 3 engine was one such engine: at 24 FPS you could play the game decently, but at higher FPS you'd be able to jump higher, run faster, jump further, etc. So I put FPS (Frames Per Second) pretty high on my list for FPS (First Person Shooter) style games. Also, I feel that you can notice the difference between, say, 24 FPS and 60 FPS. There is just an all-around smoother feel.
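The jump-height effect comes from the physics step being tied to the frame time. Here's a sketch of that idea (not Quake 3's actual code; the velocity and gravity numbers are made-up "Quake-like" units): a jump integrated with one Euler step per frame reaches a different apex depending on the frame rate.

```python
# Sketch of frame-rate-dependent physics (hypothetical, not Quake 3's real
# code): one physics step per rendered frame, so the timestep is 1/fps.
# v0 and gravity are invented "Quake-like" units for illustration.

def jump_apex(fps, v0=270.0, gravity=800.0):
    """Peak height of a jump when the integration step equals the frame time."""
    dt = 1.0 / fps
    y, v, peak = 0.0, v0, 0.0
    while True:
        v -= gravity * dt        # gravity applied once per frame
        y += v * dt              # move by this frame's velocity
        peak = max(peak, y)
        if y <= 0.0:             # back on the ground
            return peak

print(round(jump_apex(24), 1))   # ~40.0 - lower apex at 24 FPS
print(round(jump_apex(125), 1))  # ~44.5 - higher apex at 125 FPS
```

Because gravity is applied in bigger chunks at low FPS, the simulated jump comes out shorter; at high FPS it converges toward the "true" apex. This is why modern engines usually run physics at a fixed timestep decoupled from rendering.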
 