
Ideal Graphics Card for Max Settings...


43 replies to this topic

#26
Gorath Alpha
  • Members
  • 10,605 posts

Ginasue wrote...

Spectre_Moncy wrote...

Avoid the 8600 GT or below if you want to play at max settings.


I'm running an 8600 GT and have no problems at all. I tried a 9600 before and took it back; I got better performance with my 8600.

Well, while I cannot agree with Spectre's opinion (assuming one understands that graphics settings are far less important than resolution, so an 8600 at 1024 by 768 can really have the settings cranked way up), I'm afraid I've got to say that you erred on two fronts there. You set up the 9600 wrong, so it didn't perform up to its capability, and then you got rid of it. I've got some issues with the corporate attitude at nVIDIA, but the 9600 wasn't a bad card for the money when it was new, while the 8600 GT was always just slightly behind where it should have been! From a frames-per-dollar point of view, the 9600 was the better value.

Gorath
-

Edited by Gorath Alpha, 14 January 2010 - 11:35.


#27
Ginasue
  • Members
  • 246 posts
I'm nearsighted, and I don't wear my glasses at my computer. If the resolution is too high, I can't see it, so I play and do things at a lower res.

#28
Gorath Alpha
  • Members
  • 10,605 posts

Ginasue wrote...

I'm nearsighted, and I don't wear my glasses at my computer. If the resolution is too high, I can't see it, so I play and do things at a lower res.

Ah, that partly explains some things, anyway. I didn't wear glasses for using computers until I was supposed to get reading glasses as well as distance glasses (I do have a pair of bifocals, but seldom use them). I have a special "mid-range" pair of glasses for computing now, and just hold my books slightly closer to my face these days.

I've been using large displays for quite a number of years, which let me put off using glasses at the computer until age finally defeated me in that regard.

G

#29
DIONightmare
  • Members
  • 7 posts

ZootCadillac wrote...
Now, the frame rate was 35fps, and many GPU 'wizards' will laugh and say that's not good enough. But here's a little secret I learned in my first university degree (TV, video and audio electronic engineering).
The human brain cannot process images faster than 25 frames per second. It's the reason that when Britain was establishing its first TV system, they settled on PAL at 50Hz (25 frames per second, interlaced).
It doesn't matter how many frames your super card throws out; your screen only shows however many your monitor is set to (75Hz shows you up to 75fps and no more, ever). But even so, you can't tell the difference between 30fps and 60fps, no matter how much you convince yourself otherwise. Your brain just can't do it.


You are totally, totally incorrect about framerate and how the human brain interprets it.
First, let's start with non-interactive media (movies, TV shows): 24fps is NOT the maximum number of frames per second the brain can process! 24fps is the minimum framerate at which the brain stops seeing a sequence of frames as a sequence of frames and starts seeing it as animation. That does NOT mean the brain can't tell the difference between 24fps and 60fps - on the contrary, the latter will be smoother, especially with fast movement on the screen (which is exactly the case in all video games - you see lots of animation close up).

But that's not really the main problem - you forgot one very important detail: control lag. A game is not a movie. In a movie you see a predefined set of frames, but in a game the frames on the screen are rendered in response to your actions. Control lag is basically how much time passes between player input (you press a button) and the in-game response (the player character raises his sword). You can divide 1000ms by this time (in ms) to get an input refresh rate, which is much more important than the visual framerate. You can have 30fps but a very slow response, where 100ms passes from pressing a button to the in-game action - that means 1000/100 = 10, which is more like the real "fps". With that much input lag and 30fps visuals, you click the left mouse button and the weapon fires 3 frames later. To have 30fps input you need 33ms of input lag, which is just non-existent in current 30fps games, so you have to run the game at 60fps or even more to get a fast enough response. Anyway, interaction is the most important thing in video games, and here 30fps is not enough, as lag is too high in most games at that framerate.
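A rough sketch of that arithmetic in code, with purely illustrative lag values (not measurements from any particular game):

```python
# Effective input rate = 1000 ms divided by the input lag in ms (illustrative numbers only).
for lag_ms in (100, 33, 16):
    print(f"{lag_ms:>3} ms input lag -> ~{1000 / lag_ms:.0f} responses per second")
# 100 ms -> ~10/s, 33 ms -> ~30/s, 16 ms -> ~62/s
```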

Another problem is that you measure those 30fps as an average value in some not very taxing (performance-wise) area of the game. That means you'll get a much lower framerate in heavy combat, especially in visually taxing areas like the Denerim market. And it's still an average value - the minimum framerate is usually lower, so at any given moment you may get a slowdown that results in higher control lag (yet again), which is especially nasty during combat.
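To illustrate the average-versus-minimum point with some made-up frame times (not a real capture):

```python
# Hypothetical frame-time capture (ms): mostly ~30 fps, with two heavy spikes.
frame_times_ms = [30, 28, 31, 29, 30, 50, 30, 45, 29, 31]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
worst_fps = 1000 / max(frame_times_ms)
print(f"average: {avg_fps:.0f} fps, worst frame: {worst_fps:.0f} fps")
# average: 30 fps, worst frame: 20 fps -- the average hides the dips you actually feel
```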

Dragon Age is an RPG, so it can get away with 30fps (and still, believe me, 60+ is much better), but that's not the case with shooters and vehicle sims.

By the way, there may be a reason you think 60fps isn't much better than 30. You use a GTX 295, right? It's basically a dual GTX 275, so each chip renders its own frame at any given moment. The problem is that both frames have to be prepared by the CPU beforehand, so even though you can get up to a 2x performance boost (in ideal conditions) from the two GPUs (say, 60fps vs 30fps with a single GTX 275), the input lag is EXACTLY THE SAME. Your video system cranks out twice as many frames to display, but the controls may still not be as tight as they would be at 60fps on a single-GPU card.
That's the reason I always prefer high-end single-GPU boards over dual-GPU ones.
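A toy model of that dual-GPU (alternate frame rendering) argument, purely as an illustration and assuming a hypothetical 33 ms render time per GPU:

```python
# Toy AFR model (illustration only): frames are handed out round-robin to the GPUs;
# each GPU still needs the full render time per frame, so adding a second GPU roughly
# doubles throughput without shrinking per-frame latency.
RENDER_MS = 33.3  # assumed single-GPU render time per frame (~30 fps alone)

def simulate(num_gpus, frames=8):
    interval = RENDER_MS / num_gpus        # CPU submits frames at the output rate
    gpu_free = [0.0] * num_gpus
    timeline = []
    for n in range(frames):
        submit = n * interval
        gpu = n % num_gpus                 # alternate frame rendering assignment
        start = max(submit, gpu_free[gpu])
        done = start + RENDER_MS
        gpu_free[gpu] = done
        timeline.append((done, done - submit))   # (display time, input-to-display latency)
    return timeline

for gpus in (1, 2):
    t = simulate(gpus)
    gaps = [b[0] - a[0] for a, b in zip(t, t[1:])]
    fps = 1000 / (sum(gaps) / len(gaps))
    lat = sum(latency for _, latency in t) / len(t)
    print(f"{gpus} GPU(s): ~{fps:.0f} fps, ~{lat:.0f} ms latency")
# 1 GPU(s): ~30 fps, ~33 ms latency
# 2 GPU(s): ~60 fps, ~33 ms latency
```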

P.S. You can call the above text whatever you like, but somehow both players and developers want a higher framerate whenever possible. Somehow Infinity Ward makes Call of Duty run at 60fps at the cost of lower resolution and worse graphics (than it could have at 30fps), somehow the multiplayer shooter crowd always scales visuals down to get the maximum possible framerate and the lowest possible input lag, somehow 3D-chip vendors managed to sell poor, stupid people all these video cards to play games at 60fps+ (if only these people knew that 30fps is the same!), and for some reason Sony is even trying to implement 60fps video in the newest Blu-ray revision... maybe we are all wrong here?

#30
Gorath Alpha
  • Members
  • 10,605 posts
PLEASE STAY CLOSER TO "ON-TOPIC". Even the detour into wearing spectacles has more relevance to this subject than arguing one way or the other about the human sensorium's hugely variable perception of things like FPS.

Edited by Gorath Alpha, 15 January 2010 - 12:35.


#31
JironGhrad
  • Members
  • 1,657 posts
This will run DA completely maxed out on my rig at around 80 fps at 1680x1050, which is as close to "true" high-def as you'll likely see on a 22" LCD. I was "only" getting 65-70, but I picked up a cheap PhysX card on eBay, and despite the claims about it not supporting hardware acceleration, I saw a jump in fps and overall performance.

#32
Livanniah
  • Members
  • 38 posts
You're using an AGP 4670!?



And that's an interesting comment on the PhysX card - I never even put any thought into bothering with one. If they're cheap by my standards, I might toy with one now.

#33
ZootCadillac
  • Members
  • 247 posts

DIONightmare wrote...

ZootCadillac wrote...
Now, the frame rate was 35fps, and many GPU 'wizards' will laugh and say that's not good enough. But here's a little secret I learned in my first university degree (TV, video and audio electronic engineering).
The human brain cannot process images faster than 25 frames per second. It's the reason that when Britain was establishing its first TV system, they settled on PAL at 50Hz (25 frames per second, interlaced).
It doesn't matter how many frames your super card throws out; your screen only shows however many your monitor is set to (75Hz shows you up to 75fps and no more, ever). But even so, you can't tell the difference between 30fps and 60fps, no matter how much you convince yourself otherwise. Your brain just can't do it.


You are totally, totally incorrect about framerate and how the human brain interprets it.
First, let's start with non-interactive media (movies, TV shows): 24fps is NOT the maximum number of frames per second the brain can process! 24fps is the minimum framerate at which the brain stops seeing a sequence of frames as a sequence of frames and starts seeing it as animation. That does NOT mean the brain can't tell the difference between 24fps and 60fps - on the contrary, the latter will be smoother, especially with fast movement on the screen (which is exactly the case in all video games - you see lots of animation close up).

But that's not really the main problem - you forgot one very important detail: control lag. A game is not a movie. In a movie you see a predefined set of frames, but in a game the frames on the screen are rendered in response to your actions. Control lag is basically how much time passes between player input (you press a button) and the in-game response (the player character raises his sword). You can divide 1000ms by this time (in ms) to get an input refresh rate, which is much more important than the visual framerate. You can have 30fps but a very slow response, where 100ms passes from pressing a button to the in-game action - that means 1000/100 = 10, which is more like the real "fps". With that much input lag and 30fps visuals, you click the left mouse button and the weapon fires 3 frames later. To have 30fps input you need 33ms of input lag, which is just non-existent in current 30fps games, so you have to run the game at 60fps or even more to get a fast enough response. Anyway, interaction is the most important thing in video games, and here 30fps is not enough, as lag is too high in most games at that framerate.

Another problem is that you measure those 30fps as an average value in some not very taxing (performance-wise) area of the game. That means you'll get a much lower framerate in heavy combat, especially in visually taxing areas like the Denerim market. And it's still an average value - the minimum framerate is usually lower, so at any given moment you may get a slowdown that results in higher control lag (yet again), which is especially nasty during combat.

Dragon Age is an RPG, so it can get away with 30fps (and still, believe me, 60+ is much better), but that's not the case with shooters and vehicle sims.

By the way, there may be a reason you think 60fps isn't much better than 30. You use a GTX 295, right? It's basically a dual GTX 275, so each chip renders its own frame at any given moment. The problem is that both frames have to be prepared by the CPU beforehand, so even though you can get up to a 2x performance boost (in ideal conditions) from the two GPUs (say, 60fps vs 30fps with a single GTX 275), the input lag is EXACTLY THE SAME. Your video system cranks out twice as many frames to display, but the controls may still not be as tight as they would be at 60fps on a single-GPU card.
That's the reason I always prefer high-end single-GPU boards over dual-GPU ones.

P.S. You can call the above text whatever you like, but somehow both players and developers want a higher framerate whenever possible. Somehow Infinity Ward makes Call of Duty run at 60fps at the cost of lower resolution and worse graphics (than it could have at 30fps), somehow the multiplayer shooter crowd always scales visuals down to get the maximum possible framerate and the lowest possible input lag, somehow 3D-chip vendors managed to sell poor, stupid people all these video cards to play games at 60fps+ (if only these people knew that 30fps is the same!), and for some reason Sony is even trying to implement 60fps video in the newest Blu-ray revision... maybe we are all wrong here?


Actually, I find it quite insulting that you'd tell me what I learned at university is wrong and imply my degree is not worth ****. So I'm just not going to debate it. I don't think you read what I wrote, especially as you tell me I use a 295 and then go on to tell me what's wrong with it, when I clearly stated that I have one (it runs my folding rig) but that I game with my HD 4890 on a 52" 1080p LCD at 120Hz via HDMI.

Geeks like numbers. People can sell fancy numbers to anyone who is already sold on the idea that bigger is better. Your brain cannot process images faster than 25 frames per second, no matter how fast you feed them to the eye. It's the premise of television. Sure, TV is a constant framerate and games are not, and there will always be lag at any resolution, but to say your brain knows the difference between a constant, non-variable framerate of 40fps and 60fps is the kind of wishful thinking that means I make a lot of money selling high-end cards to geeks who think the biggest number means the best.

And I thank you for it.

#34
JironGhrad
  • Members
  • 1,657 posts

Livanniah wrote...

You're using an AGP 4670!?

And that's an interesting comment on the PhysX card - I never even put any thought into bothering with one. If they're cheap by my standards, I might toy with one now.


Actually, I am.

I paid $40 for the PhysX card. I went ahead and picked up that 4670 because it was that or a new motherboard, new RAM and a new GPU, and as a short-term solution this way was around $600 cheaper than my alternative.

#35
DIONightmare
  • Members
  • 7 posts
@ZootCadillac

Somehow you're not the only one with a university degree here, and something tells me this topic wasn't your specialization (if it was, then shame on you).



My facts are:

1) You are not correct that the brain can't tell the difference between a 40fps and a 60fps game.

2) You make the wrong assumption that the brain has to process interactive video as a series of clearly recognised individual frames to tell the difference between framerates - it doesn't have to! The brain looks at the picture as a whole - as video, as animation.

3) You "forgot" to read the part about input lag, so no further explanation here. Just ****ing remember that a video game is interactive and not something you watch in the cinema.



Try reading other people's arguments next time - this technique (reading and listening to your opponent, I mean) is actually taught at my university. It's very fun and useful.

#36
Althernai
  • Members
  • 143 posts

DIONightmare wrote...

1) You are not correct that the brain can't tell the difference between a 40fps and a 60fps game.


It would be interesting to have an experimental answer to this question. I am fairly confident that in a game like DA:O my brain can't tell the difference between 30 FPS and anything higher, so I'm curious whether there are actually people who can, or whether they look at the numerical frame rates and convince themselves that it's running slower or faster accordingly.

3) You "forgot" to read the part about input lag, so no further explanation here. Just ****ing remember that a video game is interactive and not something you watch in the cinema.


Again, we're talking about Dragon Age: Origins. The difference in input lag due to frame rate between 20 FPS and 100 FPS is 40ms, and there is absolutely nothing in the game for which such a difference would be relevant.
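That 40ms figure is just the difference in frame time; a quick check of the arithmetic, for anyone who wants the numbers:

```python
# Frame time at a given rate is 1000/fps milliseconds.
for fps in (20, 30, 60, 100):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
# 20 fps -> 50.0 ms, 100 fps -> 10.0 ms; the gap between them is the 40 ms mentioned above
```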

If you're going to play FPS games (particularly multiplayer ones), then yes, you may want to get a graphics card that is significantly more powerful, but for games like Dragon Age (or even like Mass Effect), you really don't need anything more than a mid-range card of the latest generation (e.g. the Radeon HD 5670 at the true middle, or the 5750 if you want something a bit more powerful).

#37
Valaskjalf
  • Members
  • 283 posts
Microstuttering only occurs with the Alternate Frame Rendering mode; supertiling and scissoring don't have this problem.
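To illustrate what AFR microstutter looks like in numbers (completely hypothetical frame timings, not a measurement):

```python
# Hypothetical AFR frame-completion times (ms): frames arrive in uneven pairs,
# so the average fps looks fine while the pacing does not.
completion_ms = [0, 5, 33, 38, 66, 71, 100, 105]
gaps = [b - a for a, b in zip(completion_ms, completion_ms[1:])]
avg_fps = 1000 * len(gaps) / (completion_ms[-1] - completion_ms[0])
print("frame-to-frame gaps (ms):", gaps)   # [5, 28, 5, 28, 5, 29, 5] -- uneven pacing
print(f"average: ~{avg_fps:.0f} fps")      # ~67 fps on average, but the long gaps set how smooth it feels
```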

#38
Tommy6860
  • Members
  • 2,488 posts

ZootCadillac wrote...


Just an addition to help out the OP.

I had a machine here with two 9800 GTs (standard Sapphires) in it, so I whipped one out and ran DA:O.

I maxed all the settings at 1600x1024, which, whilst not quite HD and not the highest resolution available, should be considered near enough to HD that you'd have to be churlish to argue it's not a high enough resolution.


[screenshot - click image for full size]

The image result is as good as it can be:

[screenshot - click image for full size]

Now, the frame rate was 35fps, and many GPU 'wizards' will laugh and say that's not good enough. But here's a little secret I learned in my first university degree (TV, video and audio electronic engineering).
The human brain cannot process images faster than 25 frames per second. It's the reason that when Britain was establishing its first TV system, they settled on PAL at 50Hz (25 frames per second, interlaced).
It doesn't matter how many frames your super card throws out; your screen only shows however many your monitor is set to (75Hz shows you up to 75fps and no more, ever). But even so, you can't tell the difference between 30fps and 60fps, no matter how much you convince yourself otherwise. Your brain just can't do it.

So there you have it: a card you can pick up used, under 12 months old, for less than $70. It outperforms an HD 5670 in the majority of graphics-intensive games at higher resolutions, except on the occasions when the 5670 is able to use its DX11, or in OpenGL-intensive applications (which has always been the case with Nvidia and ATI).

Having said that, for a new build on a budget it is a bit old. As value for performance I'd suggest the GTS 250, but really the GTX 260 if you can afford it.

The choice is yours. Other graphics cards are available ;)


One problem with your theory that 25FPS is the fastest the human brain can process (actually, children can perceive much faster frame rates): video cards don't always hold that speed when data rates and resolutions increase. Do you think games will "always" run at those minimum speeds, no matter the card? So sometimes one needs a top-of-the-line card just to make sure the brain gets its maximum input.

Anyway, from what I've read, 30FPS is about the average maximum the human brain can process.

#39
BDbd99
  • Members
  • 7 posts
I just bought an Nvidia 7900 GTX. Do you think it will run the game well?

#40
MingWolf
  • Members
  • 857 posts
I'm guessing a 7900 GTX will run reasonably well with your settings maxed, depending on how high you want to set your resolution.

As for the whole question of FPS and how much the human brain can process, as discussed above, I tend to think 30 is about the average max. I've read somewhere that higher fps can account for effects like motion blur and stuff like that (whatever that means), but for the most part, anything higher is hardly noticeable. I normally run Flight Simulator on my computer, which, of course, tries to simulate real-life flying on a computer screen. Above 25 fps, it's almost exactly like the experience I get in the cockpit of a real airplane. Anything higher than that is probably just icing on the cake, and I would hardly notice.

Edited by MingWolf, 01 February 2010 - 06:32.


#41
ShinsFortress
  • Members
  • 1,159 posts
I still use a CRT monitor, which has the brightness, colour and performance I need. The resolution I use for games is usually 1280x1024. On paper I have a mid-range PC with an ATI Radeon 4850 (I recently moved away from Nvidia). A very mediocre card to some, but it runs games like this one, Mass Effect and the rest I play quite well, and it's good value.



I frequently outperform other online gamers with slightly better systems since I optimise and maintain mine very thoroughly.



I was a solid Nvidia user up to the 8800 GT Super+, so I can understand where the 8600 GT/9600 GT comments come from, but I think they should be qualified. An 8600 GT can perform slightly better, but likely only under very specific circumstances! E.g., which OS, and thus which DirectX, are you expecting it to cope with?

#42
dawrogue
  • Members
  • 2 posts

ZootCadillac wrote...

DIONightmare wrote...

ZootCadillac wrote...
Now, the frame rate was 35fps, and many GPU 'wizards' will laugh and say that's not good enough. But here's a little secret I learned in my first university degree (TV, video and audio electronic engineering).
The human brain cannot process images faster than 25 frames per second. It's the reason that when Britain was establishing its first TV system, they settled on PAL at 50Hz (25 frames per second, interlaced).
It doesn't matter how many frames your super card throws out; your screen only shows however many your monitor is set to (75Hz shows you up to 75fps and no more, ever). But even so, you can't tell the difference between 30fps and 60fps, no matter how much you convince yourself otherwise. Your brain just can't do it.


You are totally, totally incorrect about framerate and how the human brain interprets it.
First, let's start with non-interactive media (movies, TV shows): 24fps is NOT the maximum number of frames per second the brain can process! 24fps is the minimum framerate at which the brain stops seeing a sequence of frames as a sequence of frames and starts seeing it as animation. That does NOT mean the brain can't tell the difference between 24fps and 60fps - on the contrary, the latter will be smoother, especially with fast movement on the screen (which is exactly the case in all video games - you see lots of animation close up).

But that's not really the main problem - you forgot one very important detail: control lag. A game is not a movie. In a movie you see a predefined set of frames, but in a game the frames on the screen are rendered in response to your actions. Control lag is basically how much time passes between player input (you press a button) and the in-game response (the player character raises his sword). You can divide 1000ms by this time (in ms) to get an input refresh rate, which is much more important than the visual framerate. You can have 30fps but a very slow response, where 100ms passes from pressing a button to the in-game action - that means 1000/100 = 10, which is more like the real "fps". With that much input lag and 30fps visuals, you click the left mouse button and the weapon fires 3 frames later. To have 30fps input you need 33ms of input lag, which is just non-existent in current 30fps games, so you have to run the game at 60fps or even more to get a fast enough response. Anyway, interaction is the most important thing in video games, and here 30fps is not enough, as lag is too high in most games at that framerate.

Another problem is that you measure those 30fps as an average value in some not very taxing (performance-wise) area of the game. That means you'll get a much lower framerate in heavy combat, especially in visually taxing areas like the Denerim market. And it's still an average value - the minimum framerate is usually lower, so at any given moment you may get a slowdown that results in higher control lag (yet again), which is especially nasty during combat.

Dragon Age is an RPG, so it can get away with 30fps (and still, believe me, 60+ is much better), but that's not the case with shooters and vehicle sims.

By the way, there may be a reason you think 60fps isn't much better than 30. You use a GTX 295, right? It's basically a dual GTX 275, so each chip renders its own frame at any given moment. The problem is that both frames have to be prepared by the CPU beforehand, so even though you can get up to a 2x performance boost (in ideal conditions) from the two GPUs (say, 60fps vs 30fps with a single GTX 275), the input lag is EXACTLY THE SAME. Your video system cranks out twice as many frames to display, but the controls may still not be as tight as they would be at 60fps on a single-GPU card.
That's the reason I always prefer high-end single-GPU boards over dual-GPU ones.

P.S. You can call the above text whatever you like, but somehow both players and developers want a higher framerate whenever possible. Somehow Infinity Ward makes Call of Duty run at 60fps at the cost of lower resolution and worse graphics (than it could have at 30fps), somehow the multiplayer shooter crowd always scales visuals down to get the maximum possible framerate and the lowest possible input lag, somehow 3D-chip vendors managed to sell poor, stupid people all these video cards to play games at 60fps+ (if only these people knew that 30fps is the same!), and for some reason Sony is even trying to implement 60fps video in the newest Blu-ray revision... maybe we are all wrong here?


Actually, I find it quite insulting that you'd tell me what I learned at university is wrong and imply my degree is not worth ****. So I'm just not going to debate it. I don't think you read what I wrote, especially as you tell me I use a 295 and then go on to tell me what's wrong with it, when I clearly stated that I have one (it runs my folding rig) but that I game with my HD 4890 on a 52" 1080p LCD at 120Hz via HDMI.

Geeks like numbers. People can sell fancy numbers to anyone who is already sold on the idea that bigger is better. Your brain cannot process images faster than 25 frames per second, no matter how fast you feed them to the eye. It's the premise of television. Sure, TV is a constant framerate and games are not, and there will always be lag at any resolution, but to say your brain knows the difference between a constant, non-variable framerate of 40fps and 60fps is the kind of wishful thinking that means I make a lot of money selling high-end cards to geeks who think the biggest number means the best.

And I thank you for it.


@ZootCadillac:
I don't know which uni you did (or didn't) graduate from, what your major was, which part of your fantastic system with a 52" 1080p @120Hz display and a GTX 295 is real, or whether you're really 44 years old, given your manners. Frankly, I don't know anything about you at all. But what I am sure of is this: your idea of capping human vision, and in the process my vision, at something like 25 (or even 30) frames per second is utterly and absolutely rubbish. Anybody with an eye or two can tell you how a CRT monitor at an 85Hz refresh rate is different from one at 60Hz - I can't even imagine what 25Hz would look like. And while we're at it, why don't you spare a few bucks, go buy a used CRT monitor and see with your own eyes, mate. But maybe it's not your fault; maybe 20 years ago some idiots really believed the human visual cortex (which is composed of hundreds of millions of neurons, each of which is like a mini computer in its own right) is just a very primitive, digital-like system with a processing capability of no more than 25 (or 30) FPS. Time to take a long, hard look at your degree, mate.

Edited by dawrogue, 18 February 2010 - 09:43.


#43
Gorath Alpha
  • Members
  • 10,605 posts
I thought at the time that the business of frames per second and human vision was outside the scope of the OP's subject, and getting somewhat overheated, and now this thread has grown somewhat elderly, so why bring it back? I really think that if you want to argue about it, either PMs or the Off Topic forum would be far more appropriate.

#44
dawrogue
  • Members
  • 2 posts
Oh, really sorry, I didn't notice the timestamp. Guess I've just had enough of those experts.