ZootCadillac wrote...
Now, the frame rate was 35fps and many GPU 'wizards' will laugh and say that's not good enough. But here's a little secret I learned in my first university degree (TV, video and audio electronic engineering).
The human brain cannot process images faster than 25 frames per second. It's the reason that when Britain was establishing its first TV system they settled on PAL at 50Hz (25 frames per second, interlaced).
It does not matter how many frames your super card throws out, your screen only shows however many your monitor is set to (75Hz shows you up to 75fps and no more, ever), but even so, you can't tell the difference between 30fps and 60fps, no matter what you may convince yourself of otherwise. Your brain just can't do it.
You are totally, totally incorrect about framerate and how the human brain interprets it.
First, let's start with non-interactive media (movies, TV shows): 24fps is NOT the maximum number of frames per second the brain can process! 24fps is roughly the minimum framerate at which the brain stops seeing a sequence of frames as a sequence of frames and starts seeing it as animation. That does NOT mean the brain can't tell the difference between 24fps and 60fps - on the contrary, the latter will look smoother, especially with fast movement on the screen (which is exactly the case in all video games - you see lots of animation up close).
But that's not really the main problem - you forgot one very important detail: control lag. A game is not a movie - in a movie you watch a predefined set of frames, but in a game the frames on screen are rendered in response to your actions. Control lag is basically how much time passes between player input (you press a button) and the in-game response (the player character raises his sword). Divide 1000ms by this time (in ms) and you get an input refresh rate, which matters much more than the visual framerate.

You can have 30fps but a very slow response, say 100ms from pressing a button to the in-game action - that means 1000/100 == 10, which is more like the real "fps". With that kind of input lag and 30fps visuals, you click the left mouse button and the weapon fires 3 frames later. To get 30fps-equivalent input you need 33ms of input lag, which is practically non-existent in current 30fps games, so you have to run the game at 60fps or even more to get a fast enough response. Anyway, interaction is the most important thing in video games, and here 30fps is not enough, as the lag is too high in most games at that framerate.
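If you want to put numbers on it, here's a rough back-of-the-envelope sketch (plain Python, lag values made up for illustration, not measured from any real game):

def effective_input_rate(control_lag_ms):
    # how many times per second the game actually reacts to your input
    return 1000.0 / control_lag_ms

def lag_in_frames(control_lag_ms, fps):
    # how many rendered frames pass before your click shows up on screen
    return control_lag_ms * fps / 1000.0

print(effective_input_rate(100))      # 10.0 -> a 30fps game with 100ms lag reacts like "10fps"
print(lag_in_frames(100, 30))         # 3.0  -> the weapon fires 3 frames after the click
print(effective_input_rate(1000/30))  # ~30  -> you'd need ~33ms of lag for 30fps-equivalent input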
Another problem is that you measure those 30fps as an average value in some not very taxing (performance-wise) area of the game. That means you'll get a much lower framerate in heavy combat, and especially during heavy combat in visually taxing areas like the Denerim market. And even that is an average value - the minimum framerate is usually lower, so at any given moment you may hit a slowdown, which results in higher control lag (yet again), and that is especially nasty during combat.
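To make the average-vs-minimum point concrete (the frame times below are invented, just to show the shape of the problem):

# why an "average 30fps" reading hides the dips you actually feel
frame_times_ms = [30, 31, 29, 30, 90, 32, 30, 85, 31, 30]  # two combat spikes

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
worst_fps = 1000.0 / max(frame_times_ms)

print(round(avg_fps, 1))    # ~23.9 - looks "close to 30" on a benchmark graph
print(round(worst_fps, 1))  # ~11.1 - the moment where the control lag suddenly triples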
Dragon Age is an RPG, so it can get away with 30fps (and still, believe me, 60+ is much better), but that's not the case with shooters and vehicle sims.
By the way, there may be a reason you think 60fps isn't much better than 30. You use a GTX 295, right? It's basically two GTX 275s, with each chip rendering its own frame at any given moment. The problem is that both frames have to be prepared by the CPU beforehand, so even though you can get up to a 2x performance boost (in ideal conditions) from the two GPUs (say, 60fps vs 30fps with a single GTX 275), the input lag is EXACTLY THE SAME. Your video system cranks out twice as many frames to display, but the controls may still not be as tight as they would be at 60fps on a single-GPU card.
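A very simplified model of why that happens (ignoring CPU prep time and driver queues, numbers purely illustrative):

# alternate-frame rendering (AFR): the two GPUs take turns on whole frames
render_time_ms = 33.0                       # one GPU needs ~33ms per frame

single_gpu_fps = 1000.0 / render_time_ms    # ~30fps
dual_gpu_fps = 2 * single_gpu_fps           # ~60fps - twice the throughput...

# ...but any single frame still takes the full 33ms to render, so the delay
# between your input and the frame that shows it does not get any shorter
single_gpu_frame_latency_ms = render_time_ms  # ~33ms
dual_gpu_frame_latency_ms = render_time_ms    # still ~33ms

print(single_gpu_fps, dual_gpu_fps)                            # 30.3 60.6
print(single_gpu_frame_latency_ms, dual_gpu_frame_latency_ms)  # 33.0 33.0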
That's the reason I always prefer high-end single-GPU boards and not dual-GPU ones.
P.S. You can call the above text whatever you like, but somehow both players and developers want a higher framerate whenever it is possible. Somehow Infinity Ward makes Call of Duty run at 60fps at the cost of lower resolution and worse graphics (than it could have at 30fps), somehow the multiplayer shooter crowd always scales visuals down to achieve the maximum possible framerate and the lowest possible input lag, somehow the 3D-chip vendors managed to sell poor stupid people all these video cards to play games at 60fps+ (if only these people knew that 30fps is the same!), and for some reason Sony is even trying to implement 60fps video in the newest Blu-ray revision... maybe we are all wrong here?