That's plain wrong.
Games don't render anything, and GPUs don't render at Hertz intervals; they output each frame as soon as it's finished processing, as fast as physically possible. It's the monitor that periodically refreshes its display (as it has to, otherwise you'd only see a black screen), at intervals measured in Hertz.
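To put it in code terms, here is a rough sketch (Python, purely illustrative, nothing to do with real driver code, and all the numbers are made up): the "GPU" loop finishes frames as fast as it can with no fixed schedule, while the "monitor" loop wakes up exactly refresh-rate times per second and shows whatever frame happens to be newest.

import threading
import time

REFRESH_HZ = 60              # assumed monitor refresh rate
FRAME_RENDER_TIME = 1 / 120  # pretend the GPU needs ~8.3 ms per frame

state = {"latest_frame": 0, "running": True}

def gpu_loop():
    # Render continuously: no fixed interval, just "as soon as the frame is done".
    frame = 0
    while state["running"]:
        time.sleep(FRAME_RENDER_TIME)   # stand-in for the actual rendering work
        frame += 1
        state["latest_frame"] = frame   # newest finished frame

def monitor_loop(seconds=1):
    # Refresh at a fixed interval (1/Hz) and scan out whatever frame is newest.
    displayed = set()
    for _ in range(REFRESH_HZ * seconds):
        time.sleep(1 / REFRESH_HZ)
        displayed.add(state["latest_frame"])
    state["running"] = False
    print(f"frames the GPU finished: ~{state['latest_frame']}, "
          f"distinct frames the monitor actually showed: {len(displayed)}")

threading.Thread(target=gpu_loop).start()
monitor_loop()

Run it and the GPU side finishes roughly twice as many frames as the 60 Hz monitor ever gets to display.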
And as I've said, the difference is significant. Double the refresh rate means you see double the frames. So if you have a GPU that can render a game at 120 fps, on a 60 Hz monitor you only see half of those frames.
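The arithmetic is simple (idealized, ignoring tearing and uneven frame times); here is the same point as a quick sketch:

def frames_actually_seen(render_fps, refresh_hz):
    # The monitor can never show more distinct frames than it refreshes.
    return min(render_fps, refresh_hz)

for hz in (60, 120):
    shown = frames_actually_seen(120, hz)
    print(f"{hz} Hz monitor, 120 fps game: {shown} frames shown per second "
          f"({shown / 120:.0%} of what the GPU produced)")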
You can compare it to animated comics. Draw a motion in a few crude successive pictures and you see movement. Draw the same sequence with more individual pictures and speed up the playback, and the motion looks smoother because the difference between successive frames is smaller.
A 120 Hz monitor won't make a still image any prettier, but it will significantly improve the visual quality of motion.
When I said games, I meant the hardware rendering the games. My bad. And what do you think Hertz is? It just means per second. You say the GPU renders games at a certain number of frames per second and then say it's not Hertz; they are two ways of saying the same thing.
And no, there isn't a significantly discernible difference between 60 and 120. Why do you think developers would rather spend GPU power on increasing resolution from 720p to 1080p, or on any other effect, than on raising the FPS from 30 to 60? I have a 120 Hz monitor and I don't notice any difference from my earlier monitor, let alone a significant one.
And lastly, double the frame rate doesn't mean you "see" double the frames per second; it just means twice as many screens are rendered. The human eye can't process anywhere near 120 frames in a second. Almost all movies in theaters use 24 fps (The Hobbit used 48 fps, with disastrous results).