Frumyfrenzy wrote...
herwin1 wrote...
vania z wrote...
"Just a quick note to say that any frame rate above 30 is impossible for the human eye to detect. So 40, 50, hell 120 is"
The eye is a lot faster.
Actually, any frame rate above about 30 fps is too fast for the visual system to respond to. Audition can sense 100 nanosecond jitter, but the visual system simply doesn't perceive more than 25-30 frames per second. The neurone dynamics in the system smooth things out on a 30-40 msec time scale.
Here is a nice visual comparison of 15 vs 30 vs 60 fps: www.boallen.com/fps-compare.html. 60 fps looks much smoother than 30 fps there, and in my gaming experience too.
First, why is this discussion in the patch update thread?
Anyway, this is sort of a loaded question. The human eye can't detect the changing of the frames themselves past something like 24 FPS; that is, you can't detect flickering as the frames are flipped. However, the eye can easily detect motion stuttering in games at 30 or 60 FPS because of the lack of motion blur. The eye can't see the frames switching, but without motion blur it can easily see that there's an empty gap left behind by a fast-moving object that went from one side of the screen to the other in a frame. Movies get away with a lower frame rate because the film captures that blur when the shot is taken, so a movie frame has a smear across the screen where the object went, and the eye interprets that as motion. Without the blur, the eye sees an object teleport from point A to point B.

But the higher the frame rate in games, the less the object moves in between frames, so the teleport effect gets less and less noticeable. Conversely, the faster objects move in games, the higher the frame rate needs to be to convey smooth motion.
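To put rough numbers on that teleport effect, here's a quick back-of-the-envelope sketch in Python; the screen width and object speed are made-up figures purely for illustration, not from any particular game:

# Rough illustration of per-frame displacement: how far an object "jumps"
# between consecutive frames at different frame rates. The resolution and
# speed below are assumed values, chosen only to make the numbers concrete.

SCREEN_WIDTH_PX = 1920            # assumed 1920-pixel-wide screen
OBJECT_SPEED_PX_PER_S = 3840.0    # assumed speed: crosses the screen in 0.5 s

def displacement_per_frame(fps: float) -> float:
    """Pixels the object moves between two consecutive frames."""
    return OBJECT_SPEED_PX_PER_S / fps

for fps in (24, 30, 60, 120):
    gap = displacement_per_frame(fps)
    print(f"{fps:>3} fps: {gap:6.1f} px jump per frame "
          f"({gap / SCREEN_WIDTH_PX:.1%} of screen width)")

For those hypothetical numbers, going from 30 to 60 fps halves the jump (128 px down to 64 px), and doubling the object's speed undoes that gain again, which is why faster motion needs a higher frame rate (or real motion blur) to look smooth.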