
Ideal Graphics Card for Max Settings...


43 replies to this topic

#1
Menthi44
  • Members
  • 56 messages
Hey there! :D

Title speaks for itself: what would be the lowest GeForce model that could play Dragon Age: Origins on max settings? I know the newest models have only just been released, but they cost a fortune. I'm thinking of getting a new computer, so I need to decide which card to get.

I'm also looking at this performance-wise: I want the game to run smoothly, i.e. at a high FPS.

Thanks in advance!

#2
Guest_Eridhan_*
  • Guests
I'd recommend the GTX 260. It's not high-end and there are many better cards now, but DA runs perfectly smoothly on max settings, and it should still be useful for at least the next two years.

#3
Menthi44
  • Members
  • 56 messages
Thanks! ^_^ Much appreciated.

#4
Althernai
  • Members
  • 143 messages
Practically any high-end card (i.e. one whose model number has a second digit of 8 or greater) from the past 3 years will do, as will mid-range cards of the latest generation. DA:O is not the most GPU-intensive of games; if you're buying something new, just avoid cards whose second digit is 5 or lower and you'll be fine. If you want high FPS, make sure to get a decent processor too, as DA:O is among the better-threaded games out there (although if you're buying a new machine, chances are you'll be fine on that score as well).
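As a toy illustration of that naming rule of thumb (it only fits the older four-digit GeForce names like "8800 GTX" or "9600 GT", and the helper below is made up for the example), a few lines of Python:

# Toy sketch of the "second digit" tier rule for older four-digit
# GeForce names; the tier labels and function are illustrative only.
def tier_from_model(model: str) -> str:
    digits = [c for c in model if c.isdigit()]
    second = int(digits[1])  # e.g. "9600 GT" -> 6
    if second >= 8:
        return "high-end: fine for max settings"
    if second <= 5:
        return "low-end: avoid for max settings"
    return "mid-range: OK if it's a recent generation"

for name in ("8800 GTX", "9600 GT", "9500 GT"):
    print(name, "->", tier_from_model(name))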

#5
Spectre_Moncy
  • Members
  • 330 messages
Avoid the 8600GT or below if you want to play at max settings.

#6
Livanniah
  • Members
  • 38 messages
For questions like this, keep in mind that it's important to say what RESOLUTION you intend to play at - what maxes out a 1280x1024 screen will choke at 1920x1200.
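To put numbers on that point, here's a quick back-of-the-envelope Python sketch (pure arithmetic, nothing vendor-specific) of the per-frame pixel load at common resolutions:

# Per-frame pixel counts; illustrative arithmetic only.
resolutions = [(1280, 1024), (1680, 1050), (1920, 1200)]
base = 1280 * 1024
for w, h in resolutions:
    pixels = w * h
    print(f"{w}x{h}: {pixels:,} pixels ({pixels / base:.2f}x the 1280x1024 load)")

1920x1200 pushes roughly 1.76x the pixels of 1280x1024 every frame, before you even account for AA.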



Tom's Hardware does a great monthly piece on the best cards for the money, including notes on the expected performance at common resolutions; the link for the current one is: http://www.tomshardw...-card,2521.html



And a personal note: restricting yourself to one half of the aisle is pretty foolish right now, especially with only one graphics card manufacturer shipping DX11 cards before summer - buying any card without support for the most recent DX incarnation seems unwise to me. [In fact, I'm still kicking myself for missing that ATI's were coming out so soon; I'd purchased a GTX280 for this machine 2 months earlier, and would've gotten a performance boost plus DX11 for $20 less if I'd been patient.]

#7
Gorath Alpha
  • Members
  • 10 605 messages
Has Fermi's general release slipped from Q1 to late Q2 now? I hadn't seen any announcement to that effect, although one would think that, as close to 3-31 as we now are, there would have been some official word from nVIDIA one way or the other.

I somewhat doubt that Dx11 will become a major player very soon, however. Vista and Dx10 are four years old, and only now is that version of Direct3D making much of a showing in games. I've been looking at the early reviews of the HD 5670 (oops! sorry about the typo, now corrected; particularly AnandTech's), and I am unimpressed with what that $99 card is able to do in Dx11 in those tests so far.

Back on track with the overall topic: nVIDIA had a lock on the high end for single-core, single-card setups from the release of the 8800s until the Radeon 5800 / 5900 cards appeared this fall. That was three years' worth, but they practically gave up the middle-medium segment after the 8600 GT was beaten by the HD 3690 two years ago. The high middle was theirs a little longer, with the 9600 GT, until the HD 4670 beat it, most especially on price.

I see the HD 5700 cards as the natural long-term successors in the Medium-High border zone (edited here - I'd used an odd term that wasn't properly descriptive), not the HD 5670.

As per the reminders above, without knowing the resolution, various relatively current mainline cards, such as the 9600 GT and HD 4670, can run the high image-quality settings on the "usual" medium-resolution displays. But by its very name, the word "max" suggests something beyond a mainline card's realm: the very large, very high resolution display that only high-end cards can handle.

Gorath
-

Edited by Gorath Alpha, 18 February 2010 - 09:25.


#8
DIONightmare
  • Members
  • 7 messages
I play DA on a Radeon 5850 1GB (with the GPU overclocked from 725MHz to 900) and it runs fine (a steady v-locked 60fps in most areas, 40-50fps in the Denerim market - never lower). Settings are 1920x1080, 16x AF, 8x MSAA, adaptive AA on, everything at max. Performance is MUCH better than on the Xbox 360.

With this in mind, I'd guess you won't be able to play the game "on max settings" at an acceptable framerate on anything below a GeForce GTX275 / Radeon 4890, at least not at this resolution and not v-locked. If that's still too expensive, maybe a GeForce 9800GTX/GTS250 or Radeon 4850 will do - just forget the "all max" part and you'll get a good framerate with acceptable quality. Plus, the Radeon 4850 is quite cheap now.

P.S. Just don't forget about the CPU - Dragon Age really favours quad-cores, no matter the manufacturer. If price matters most, try the AMD Athlon II X4 620 - it's quite a fast quad-core processor at a great price. The Phenom II X4 925 is 10-15% faster, but more expensive too (there's no point buying the 945/955/965 models, as you can easily overclock the 925 to the same speed). If you want something even better, try the Intel Core i5-750, but it's much more expensive (not just the CPU, but the compatible motherboard).
And you should buy 4 gigs of RAM, or maybe more.

Edited by DIONightmare, 14 January 2010 - 05:15.

  • txchimama likes this

#9
DIONightmare
  • Members
  • 7 messages
To Gorath Alpha:
The 5700 is not high-middle. A 128-bit bus (even with GDDR5) won't allow good performance at high resolutions, so the older 4870/4890 are faster: they have the same number of execution cores on a 256-bit GDDR5 memory bus (see the bandwidth arithmetic below). The 5850 will be high-middle as soon as NVIDIA releases its Fermi - I believe AMD will lower prices to more acceptable levels then.
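For anyone who wants to check the bus-width arithmetic, here is a minimal Python sketch; the per-pin data rates below are the commonly quoted reference specs and should be treated as assumptions rather than guarantees for any particular board:

# Peak theoretical bandwidth = (bus width in bits / 8) * per-pin data rate.
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

cards = [
    ("Radeon 5770 (128-bit GDDR5, ~4.8 Gbps)", 128, 4.8),
    ("Radeon 4870 (256-bit GDDR5, ~3.6 Gbps)", 256, 3.6),
    ("Radeon 4890 (256-bit GDDR5, ~3.9 Gbps)", 256, 3.9),
]
for name, bus, rate in cards:
    print(f"{name}: {bandwidth_gb_s(bus, rate):.1f} GB/s")

On those assumed clocks, the 128-bit 5770 tops out around 76.8 GB/s against roughly 115-125 GB/s for the 256-bit 4870/4890, which is the point being made above.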

Edited by DIONightmare, 14 January 2010 - 06:29.


#10
Livanniah
  • Members
  • 38 messages

Gorath Alpha wrote...

Has Fermi's general release slipped from Q1 to late Q2 now? I hadn't seen any announcement to that effect, although one would think that, as close to 3-31 as we now are, there would have been some official word from nVIDIA one way or the other.


Nvidia's quarters always seem to run one behind these past few years - so that was mostly my assumption there, although with no reference boards handed out to reviewers yet (much less boards shown running without reviewers getting one to keep), it does seem very likely that their timetable has slipped, as you said yourself.

Still better than ATI's old trend of running 1.5 to 2 quarters behind schedule for a while there - although recently they've been running ahead of schedule, when the insane demand for their cards hasn't been stifling them (i.e. the 5890 delay).

#11
ZootCadillac
  • Members
  • 247 messages
There's some very confusing information in here, especially the claim that the 4670 beat the 9600GT on price and performance. The 9600GT is a faster card offering usually equal performance (better in some benchmarks, but worse in the OpenGL-based ones, where ATI always wins out), and in real-world gaming the 9500GT has proven to be close to the 4670 (and whips the 4650), with the 9600GT showing at least equal performance, and better performance at higher resolutions.

The truth of the matter is that the 46xx, 47xx, and 57xx are all mid-range cards, not to be considered by any serious gamer with a big LCD.

As for the initial question, the answer is quite simple: the lowest Nvidia card you can buy new that will run DA:O on full settings at higher resolutions is the 9800GT.

You might consider the GTS250 if you want a newer range, but it's a rebadged 9800GTX+ (and will cost you more because of that).

It's all subjective, and people can have their own opinions, even if they are misleading.

All I can say is that I build a great many systems yearly and spend around £5000 a year on graphics cards alone. Based on my experience and real-world benchmarks, the 9800GT is the lowest; the GTS250 is your bang-for-buck option.

#12
Gorath Alpha
  • Members
  • 10 605 messages

DIONightmare wrote...

To Gorath Alpha:
The 5700 is not high-middle. A 128-bit bus (even with GDDR5) won't allow good performance at high resolutions, so the older 4870/4890 are faster: they have the same number of execution cores on a 256-bit GDDR5 memory bus. The 5850 will be high-middle as soon as NVIDIA releases its Fermi - I believe AMD will lower prices to more acceptable levels then.

Mea culpa! I meant Medium-High, of course - the borderline zone between Mainline / Medium and High End - and I will make an appropriate edit.

G

#13
DIONightmare
  • Members
  • 7 messages
To ZootCadillac:
The 9500GT is not a 3D accelerator; it's something you insert into a PCI-E slot to get a picture on the display. It is two times slower than the 9600GT and the 4670. And yes, the 4670 and 9600GT are roughly equal if you average across all games. I repeat: the 9500GT isn't even comparable with those two - it has 32 processing units and is basically an 8600GT on 55nm.

I wonder how you can rule out the 57x0 cards for any serious gaming while proposing the 9800GT and GTS250, because the 5770 is FASTER than the 9800GT and GTS250. The 4770 is on par with the 9800GT; the 4850 competes with the GTS250 (which is a rebranded 9800GTX, itself a slightly overclocked 8800GTS 512); and the 5770 sits between the 4870 1GB and the 4890 performance-wise. If you want an NVIDIA equivalent, go for the GTX260 (vs the 4870 1GB) or the GTX275 (vs the 4890).

As you said, "It's all subjective and people can have their own opinions even if they are misleading," so I agree on the 9800GT as the absolute minimum for any modern game including Dragon Age; but NVIDIA is not the only 3D-chip developer, and the Radeon 4770 is just as good.

So here's a table of modern 3D chips. Faster ones are lower in the table; direct competitors from each manufacturer share a row.

GeForce        | Radeon
9800GT         | 4770, 4830
9800GTX/GTS250 | 4850, 5750
GTX260         | 4870, 5770
GTX275         | 4890
GTX285         | --
--             | 5850
GTX295         | 5870, 4870x2
--             | 5970

Edited by DIONightmare, 14 January 2010 - 06:18.


#14
Gorath Alpha
  • Members
  • 10 605 messages
I believe the OP's question has now been covered adequately, and further opinion might become argumentative, but I wonder how many of our fellow gamers have actually considered what the AMD-ATI merger meant in the long run for nVIDIA? Intel has already released their new i5s with an uprated IGP included in the package with the CPU cores. It's still just an Intel chip, so it has no bearing on real gaming, but that is coming, eventually.

AMD will be showing working combos of a CPU plus relatively powerful actual GPU silicon, both in the same core, within two years or less. They will beat Intel to market with a game-capable combo of that sort, adding graphics to the CPU much less expensively than on discrete circuit boards. If the FTC rides herd on Intel's proclivities and keeps them from their usual illegal trade practices, AMD will have a financial windfall for at least a year.

Intel will eventually have something of somewhat similar capability to sell. Beginning about three years from now, the market for discrete video cards will start to really shrink. Before then, nVIDIA needs an alternative product to stay in business. Fermi is as much a GPU as it is a multi-processor computer CPU - at least that's a major part of its designed-in basis, and the likely reason they have skipped over so much of the VGA market lately and given ATI such a large window of opportunity.

Gorath
-

#15
ZootCadillac
  • Members
  • 247 messages

DIONightmare wrote...

To ZootCadillac:
The 9500GT is not a 3D accelerator; it's something you insert into a PCI-E slot to get a picture on the display. It is two times slower than the 9600GT and the 4670. And yes, the 4670 and 9600GT are roughly equal if you average across all games. I repeat: the 9500GT isn't even comparable with those two - it has 32 processing units and is basically an 8600GT on 55nm.

I wonder how you can rule out the 57x0 cards for any serious gaming while proposing the 9800GT and GTS250, because the 5770 is FASTER than the 9800GT and GTS250. The 4770 is on par with the 9800GT; the 4850 competes with the GTS250 (which is a rebranded 9800GTX, itself a slightly overclocked 8800GTS 512); and the 5770 sits between the 4870 1GB and the 4890 performance-wise. If you want an NVIDIA equivalent, go for the GTX260 (vs the 4870 1GB) or the GTX275 (vs the 4890).

As you said, "It's all subjective and people can have their own opinions even if they are misleading," so I agree on the 9800GT as the absolute minimum for any modern game including Dragon Age; but NVIDIA is not the only 3D-chip developer, and the Radeon 4770 is just as good.

So here's a table of modern 3D chips. Faster ones are lower in the table; direct competitors from each manufacturer share a row.

GeForce        | Radeon
9800GT         | 4770, 4830
9800GTX/GTS250 | 4850, 5750
GTX260         | 4870, 5770
GTX275         | 4890
GTX285         | --
--             | 5850
GTX295         | 5870, 4870x2
--             | 5970


Wow, a lot of things to answer. First off, I'm not a fanboy (or even a boy; I'm 44 years old). I buy all kinds of cards, including the HIS 4890 whose empty box I'm looking at right now.

OK, to address your points: if the 9500 is not a 3D accelerator, then the 8600GT wasn't either, yet I played a full run-through of DA:O on an OC'd 8600GT at high settings at 1280x1024. Not maximum, but perfectly playable for something that's "not a 3D accelerator".

Yes, I know all about the differences between the 9500 and the 9600, and the 4650 and the 4670. I also know exactly where these cards sit in relation to each other in real-world gaming benchmarks.
The bottom line is that the 4670 has no benefits other than the useless addition of DX10.1, and when you consider that you can get a 9800GT for a little more money (or less, if you want to grab a used bargain until the DX11 cards become mainstream and have some games that use them), it's a card not worth considering.

As for the 57x0, I did not propose a 9800GT or GTS250 instead of a 57x0 model anywhere in what I said. I proposed the Nvidia cards in response to the OP's question, because that's what he wanted to know: "what is the lowest NVIDIA card" he could run the game fully on. I thought that's what we were here for - to answer the questions rather than have fanboy arguments.

Earlier in my comment I said that the 46xx, 47xx, and 57xx should not be considered high-end cards for gaming, and I stand by that. ATI make great cards - I've had many over the years - but for serious big-screen gaming they are not in the game until you get up to the X870 series. Incidentally, I'd say the same for Nvidia. If you want serious gaming capability, you need to be looking at the single-core 285 or the dual-core 295 (I have the 295 in this machine now).

I'm well aware of ATI and how good they are, but for the serious gamer, whichever you choose, you won't get the benefit from either company unless you buy top-end and have a rig and display to make use of it. I game on a 1080p 52" Toshiba Regza @ 120Hz with my 4890, if that makes you feel any better.

So I was not pushing Nvidia over ATI; I was simply answering his question in relation to Nvidia, as he asked.

I have sold 6 and bought 14 graphics cards in the last 5 weeks. I base my opinion - and opinion it is - on real-world, first-hand experience, not some figures I read on a blog.

I hope that clears it up.

Edited by ZootCadillac, 14 January 2010 - 09:18.


#16
ZootCadillac
  • Members
  • 247 messages

Gorath Alpha wrote...

I believe the OP's question has now been covered adequately, and further opinion might become argumentative, but I wonder how many of our fellow gamers have actually considered what the AMD-ATI merger meant in the long run for nVIDIA? Intel has already released their new i5s with an uprated IGP included in the package with the CPU cores. It's still just an Intel chip, so it has no bearing on real gaming, but that is coming, eventually.

AMD will be showing working combos of a CPU plus relatively powerful actual GPU silicon, both in the same core, within two years or less. They will beat Intel to market with a game-capable combo of that sort, adding graphics to the CPU much less expensively than on discrete circuit boards. If the FTC rides herd on Intel's proclivities and keeps them from their usual illegal trade practices, AMD will have a financial windfall for at least a year.

Intel will eventually have something of somewhat similar capability to sell. Beginning about three years from now, the market for discrete video cards will start to really shrink. Before then, nVIDIA needs an alternative product to stay in business. Fermi is as much a GPU as it is a multi-processor computer CPU - at least that's a major part of its designed-in basis, and the likely reason they have skipped over so much of the VGA market lately and given ATI such a large window of opportunity.

Gorath
-


I do see a great future for integrated GPU/CPU configurations, but I don't see it having any bearing on the gaming market for the next 5 years at least. I can't see Intel ever producing a gaming configuration in this area (not now that they have fallen out with Nvidia over licensing).

I'd like to see AMD get something going in that area, since if I'm a fanboy for anything, it's AMD. But again, I think a gaming solution is too far down the line for us to worry about yet. In fact, I'm more interested in the Nvidia timeline, with more processing power going to the GPU card rather than the other way around.

Interesting times ahead for sure.

#17
DIONightmare
  • Members
  • 7 messages
To ZootCadillac:

I haven't tried Dragon Age on my laptop with a GT130 (more or less a desktop 8600GT/GTS), but all the other modern games lag a lot on medium settings; some aren't even playable. Of course, one can argue about what a playable framerate is, but for me it's 60fps average and 30-40fps minimum. It may be worth trying DA if you already have a 9500GT, but I see absolutely no reason to buy a new 9500GT now as an upgrade. Is it even possible to buy a slower card now? (OK, I know there are the 9400 and 4350, but that's integrated-level performance.)

The 4670/9600GT are not just better - those cards at least let you play modern games at high settings, at least at 1280x1024. If one is low on money, a used card can be a good choice while waiting for something better (and more expensive).



I base my opinion on a decade and a half of PC and console gaming, and on building every single PC I've ever owned except the first one - so no "figures I read on a blog" here.



As for the other issues, I wasn't trying to start a flame war; I merely presented the topic starter with AMD alternatives to the NVIDIA cards - hope that clears it up.

#18
ZootCadillac
  • Members
  • 247 messages

DIONightmare wrote...

To ZootCadillac:
I haven't tried Dragon Age on my laptop with a GT130 (more or less a desktop 8600GT/GTS), but all the other modern games lag a lot on medium settings; some aren't even playable. Of course, one can argue about what a playable framerate is, but for me it's 60fps average and 30-40fps minimum. It may be worth trying DA if you already have a 9500GT, but I see absolutely no reason to buy a new 9500GT now as an upgrade. Is it even possible to buy a slower card now? (OK, I know there are the 9400 and 4350, but that's integrated-level performance.)
The 4670/9600GT are not just better - those cards at least let you play modern games at high settings, at least at 1280x1024. If one is low on money, a used card can be a good choice while waiting for something better (and more expensive).

I base my opinion on a decade and a half of PC and console gaming, and on building every single PC I've ever owned except the first one - so no "figures I read on a blog" here.

As for the other issues, I wasn't trying to start a flame war; I merely presented the topic starter with AMD alternatives to the NVIDIA cards - hope that clears it up.


Sure, I agree, and no argument or hostility is meant here. I'm certainly not suggesting that the 9500GT is a gaming option, only that it will perform better than was being represented in this thread.

And when I say something along the lines of "I do this, I know this because of this, and I don't do this", it's not to imply that you or anyone else does anything different or knows any less. It's simply to establish with those reading that I have experience, and that my opinion is, in my eyes at least, a valid one.

The problem for me with fanboy arguments, when they do arise, is that I'm devil's advocate for both sides, given that I have used just about every card on the market at one time or another. I see the merits of both ATI and Nvidia, and if I ever square them off against each other, it will only be on a card-by-card basis, where they are marketed against each other. When someone says "I have this much cash, what should I buy?", then the budget and best value determine the choice, not the manufacturer of the GPU.

As an aside, one reason I will always have a high-end Nvidia in the current market is CUDA for my SETI@home and Folding@home projects, because in that area even an 8600GT will wipe the floor with any ATI card on the market when folding on the GPU.

#19
Gorath Alpha
  • Members
  • 10 605 messages
nVIDIA has always had a very competition-driven industrial "personality", if we can call it that, matching its CEO's own nature. Perhaps too much so, which led to my own disaffection with them.

In order for either company to sell high-end cards for $300 to $500, they must sell a ton of sub-$100 cards. That's the area (business and basic gamer / MMO player) Intel will shoot for at first, when they finally have something to work with, whatever it is. AMD will aim higher, at the very bracket where they have done so well in the past year - where I see the 5700s for the long term, and probably the HD 5670 for the short term.

Neither ATI nor nVIDIA will be able to sell the same numbers of low-end and mainline cards that they sell now, so either high-end pricing will have to escalate, or the pace of performance increases will have to slow. There will be high-end cards after the combo-processor products become mainstream, but the nature of their importance in video development and generational improvements will have to change. However, it will be three years, plus or minus, before that next stage in graphics gets going.

Things will definitely be getting interesting in that arena starting about two years from now, though.

G

Edited by Gorath Alpha, 14 January 2010 - 09:57.


#20
Livanniah
  • Members
  • 38 messages
I wouldn't be so sure about the 5670. As much as I love what I've seen of the 5xxx series (albeit from a distance - I'm not buying my first for another few months, when I rebuild the lesser of our "serious" machines), I can't be too impressed with the benchmarking Tom's did on it recently, especially the DX11 test. Sure, the DiRT 2 game they used for the bench might be terribly optimized - to my knowledge the only other DX11 title out is BattleForge, which really isn't a benchmark style of game - but pulling sub-30 frames at 1280x1024 (in DX11) means that, assuming DiRT 2's engine isn't complete trash optimization-wise, the card can't really bring DX11 into the equation at all. (Which IMO defeats the entire purpose behind the 5xxx line.)



Contrast that with the 5750, which dips only slightly below 30 at 1920x1200 in DX11...



A DX11 card that can't really handle DX11 seems like a depressing joke to me - as much as I love the rest of the line (and the 4770 in my lesser rig as well).

#21
DIONightmare
  • Members
  • 7 messages
To Livanniah:

The 5670 is just a 40nm version of the 4670 with DX11 support - the same 400 processing units (5x80 cores, against 20x80 in the 5870), just with lower power consumption and 3-display support. Don't expect anything interesting from it. It's not the card for 1920x1200, and especially not the card for D3D11 (it's like the 8600GT was for DX10 - did anyone expect decent performance from it back then?).



IMO you've got the purpose of the 5xx0 line wrong. It's not about DX11 (we won't see DX11 games for a couple of years, maybe - and by then there'll be a next generation of 3D chips); it's about doubling the processing speed of the 4x00 at every tier. The architecture hasn't changed much, but the performance of the 5870 is much higher than the 4870's, thanks to its 1600 processing units. The 5770 is now the mid-range, with 800 units and GDDR5; the 5670 is pretty much the gaming low end; and the incoming 53x0 (or 55x0?) are mainly HTPC cards.



My overclocked 5850 runs all games at 1080p with 8x MSAA and 16x AF on maximum at a stable 60fps, and that's exactly the entire purpose behind the 5xx0 line as I see it =).

Well, everything but Crysis - it will be interesting to see whether Fermi finally manages it.

#22
Tommy6860
  • Members
  • 2 488 messages

Livanniah wrote...

For questions like this, keep in mind that it's important to say what RESOLUTION you intend to play at - what maxes out a 1280x1024 screen will choke at 1920x1200.

Tom's Hardware does a great monthly piece on the best cards for the money, including notes on the expected performance at common resolutions; the link for the current one is: http://www.tomshardw...-card,2521.html

And a personal note: restricting yourself to one half of the aisle is pretty foolish right now, especially with only one graphics card manufacturer shipping DX11 cards before summer - buying any card without support for the most recent DX incarnation seems unwise to me. [In fact, I'm still kicking myself for missing that ATI's were coming out so soon; I'd purchased a GTX280 for this machine 2 months earlier, and would've gotten a performance boost plus DX11 for $20 less if I'd been patient.]


I hear you. I waited for those DX11 cards, then spent big money (and saved a bit) building my own PC. This puppy has a Radeon 5870x2 setup, and Crysis whimpers under its max settings at 1920x1200.

#23
ZootCadillac
  • Members
  • 247 messages
Just an addition to help out the OP.

I had a machine here with two 9800GTs in it (standard Sapphires), so I whipped one out and ran DA:O.

I maxed all the settings at 1600x1024, which, while not quite HD and not the highest resolution available, should be considered near enough to HD that you'd have to be churlish to argue it isn't high enough.


[screenshot]

The image result is as good as it can be:

[screenshot]

Now, the frame rate was 35fps, and many GPU "wizards" will laugh and say that's not good enough. But here's a little secret I learned in my first university degree (TV, video, and audio electronic engineering):
The human brain cannot process images faster than 25 frames per second. It's the reason that, when Britain was establishing its first TV system, it settled on PAL at 50Hz (25 interlaced frames per second).
It doesn't matter how many frames your super card throws out; your screen only shows however many your monitor is set to (75Hz shows you up to 75fps and no more, ever). But even so, you can't tell the difference between 30fps and 60fps, no matter how much you may convince yourself otherwise. Your brain just can't do it.
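A minimal Python sketch of that refresh-cap point, deliberately simplified (real v-sync with double buffering actually snaps the rate to integer divisors of the refresh rate, e.g. 75, 37.5, 25 on a 75Hz screen):

# Simplified model: you see the lesser of what the GPU renders and
# what the monitor refreshes; v-sync quantization is ignored here.
def displayed_fps(rendered_fps, refresh_hz):
    return min(rendered_fps, refresh_hz)

for rendered in (35, 60, 120):
    print(f"GPU at {rendered} fps on a 75 Hz monitor -> {displayed_fps(rendered, 75)} fps shown")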

So there you have it: a card you can pick up used, under 12 months old, for less than $70. It outperforms an HD5670 in the majority of graphics-intensive games at higher resolutions, except on the odd occasion where the 5670 can use its DX11, or in OpenGL-intensive applications (which has always been the way between Nvidia and ATI).

Having said that, for a new build on a budget it is a bit old. As value for performance I'd suggest the GTS250, but really the GTX260 if you can afford it.

The choice is yours. Other graphics cards are available ;)

Edited by ZootCadillac, 14 January 2010 - 10:27.


#24
Livanniah
  • Members
  • 38 messages

IMO you've got the purpose of the 5xx0 line wrong. It's not about DX11 (we won't see DX11 games for a couple of years, maybe - and by then there'll be a next generation of 3D chips);


I understand the point of the various tiers - I was merely stating my disagreement with Gorath's enthusiasm for the lower cards of the 5xxx line. As you said yourself, it's basically a slightly amped-up 4670 with DX11 and Eyefinity (which it won't be able to handle) for nearly double the cost of a 4670 these days. (In fact, a friend got one for $45 post-rebate with free shipping over the holidays.)
Hell, its current pricing is just shy of the 4770's, which dusts it even non-OC'd (and we know 4770s overclock amazingly).
Also, when it comes to DX11 games, there are already 4 major titles scheduled to support it this year, in addition to the already-available DiRT 2: AvP, Crysis 2, Bad Company 2, and S.T.A.L.K.E.R.: Call of Pripyat. (Additionally, the Turbine MMOs will supposedly be adding support.)
DX10 was picked up very slowly - DX11 does not seem to be following the same trend.

And Zoot - the 25fps statement is actually somewhat incorrect, as it depends on the display type...
For a television screen, 25-27 fps is sufficient (barring projection TVs, which can actually get away with lower FPS).
For film, it's 22-23 fps, if memory serves.
For PC monitors I'd heard figures of 30-35 fps in the pre-LCD era; I'm not sure whether that's changed with the conversion to LCD.

And it's important to keep in mind that the MINIMUM fps is actually the key number to watch for framerate loss (i.e. for Dragon Age, Denerim - especially the big fights towards the end). If you have 60 FPS 90% of the time but a 2 FPS slideshow the other 10%, you're not going to enjoy your game very much on those settings. (Although generally, if you're getting 60 FPS, your minimum framerate will be around 40 and thus solid.)
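A quick Python sketch of that minimum-vs-average point, on made-up frame-time data (note this uses a naive per-frame average; real benchmark tools usually time-weight or report percentile lows instead):

# Made-up frame times in milliseconds for two hypothetical runs.
smooth   = [16.7] * 90 + [25.0] * 10   # steady ~60 fps with mild dips to 40
stuttery = [12.0] * 90 + [500.0] * 10  # very fast, then a 2 fps slideshow

def naive_avg_and_min_fps(frame_times_ms):
    fps = [1000.0 / t for t in frame_times_ms]
    return sum(fps) / len(fps), min(fps)

for label, times in (("smooth", smooth), ("stuttery", stuttery)):
    avg, worst = naive_avg_and_min_fps(times)
    print(f"{label}: average {avg:.0f} fps, minimum {worst:.0f} fps")

The stuttery run "averages" about 75 fps yet bottoms out at 2 fps - exactly the slideshow case the average alone would hide.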

[Not arguing the cards you used gave the numbers you quoted, or trying to state you gave the numbers in the wrong situation - just clarifying part of the myths of framerate]

Edited by Livanniah, 14 January 2010 - 11:22.


#25
Ginasue
  • Members
  • 246 messages

Spectre_Moncy wrote...

Avoid the 8600GT or below if you want to play at max settings.


I'm running an 8600 GT and have no problems at all. I tried a 9600 before and took it back; I got better performance with my 8600.