Video Card Rankings and Basics
#76
Posted 19 December 2009 - 05:08
Max settings, and most of the time I have it locked at 75 fps. The lowest would be around 55 during heavy battle scenes, which is really not noticeable.
#77
Posted 19 December 2009 - 05:15
#78
Posted 24 December 2009 - 04:02
Thanks in advance,
Bro' A.
#79
Posted 24 December 2009 - 05:15
If you want a recommendation for good value on a case, I have used these Coolermaster "Elite" cases in two builds, and you get a lot of value for the low price:
www.newegg.com/Product/Product.aspx
That one is only $42 right now.
The various 9600s are based on a somewhat modified 8800 chip that is now getting up in age and unable to compete with the comparable Radeons on either performance or price. The Radeons have lower power draw and thus require less cooling, so they are smaller overall.
Here's a nice HD 4670 with free shipping, for $80: www.newegg.com/Product/Product.aspx
Gorath
-
Edited by Gorath Alpha, 24 December 2009 - 05:54.
#80
Posted 24 December 2009 - 08:09
#81
Posted 24 December 2009 - 02:53
Unfortunately, I do not have the time or budget to get a new PC or upgrade this one beyond memory and graphics card.
#82
Posted 24 December 2009 - 04:38
www.newegg.com/Product/Product.aspx (XFX)
www.newegg.com/Product/Product.aspx (Gigabyte)
Edited by Brother Abner, 24 December 2009 - 04:39.
#83
Posted 25 December 2009 - 05:54
Gorath
-
#84
Posted 26 December 2009 - 12:01
#85
Posted 27 December 2009 - 07:38
The vanilla 6600 is well below the practical bottom end for video. If the CPU is closer to recommended, you can set the screen to 1024 by 768 and all image quality settings to low, and that should speed it up for you.
Majyx wrote...
Merry Holidays from the Arctic! Got the game for Xmas but have NVidia GeForce NX 6600. Verra choppy and am wondering (blush - huge noob) what resolution wud b best settings? Mine goes up to 1280x1024 - is best? and what settings in the 'options' of the game? the same? The goal is to be a good grandmother and put in fer a complete new comp from Santa next year!
Gorath
-
Edited by Gorath Alpha, 30 December 2009 - 05:27.
#86
Posted 30 December 2009 - 05:21
Thanks again. It's been a long time since I've been this excited over a new game (since BG I came out).
#87
Posted 04 January 2010 - 03:51
Edited by The_Chibi, 04 January 2010 - 03:53.
#88
Posted 07 January 2010 - 04:02
I sincerely hope I'm not getting annoying, but for any who don't know, I'm a puder padawan in the middle of a new puder build. I really, seriously DON'T wanna screw up, but at the same time, I'm not rich. I already had a close call with a forumite graphics card recommendation that looked buff, until Gorath Alpha's sweet puder wisdom saved me from the bad idea of settling for a 128-bit memory interface. Now that I know how important it is, I'm trying to find a card with a 1GB memory size and a 256-bit memory interface without ripping my funds supply apart. I'll settle for a 512MB memory size if I must, and I've found one possible card if I don't find better.
Many thanks for your time.
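For anyone wondering why the memory interface width matters so much here: at a given memory clock, peak bandwidth scales directly with the bus width, so a 256-bit card can move roughly twice the data per second of a 128-bit card using the same memory. A minimal sketch of the arithmetic, using an assumed 1800 MHz effective memory clock purely for illustration (not the spec of any particular card):

def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    # Peak bandwidth = bytes moved per transfer * transfers per second
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_second = effective_clock_mhz * 1_000_000
    return bytes_per_transfer * transfers_per_second / 1e9

print(bandwidth_gb_s(128, 1800))   # ~28.8 GB/s on a 128-bit bus
print(bandwidth_gb_s(256, 1800))   # ~57.6 GB/s on a 256-bit bus, double the headroom

The factor of two is the whole point; actual cards also differ in memory clock, so the bandwidth figure on the spec sheet is worth checking as well as the bus width.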
#89
Posted 07 January 2010 - 04:12
http://www.newegg.co...N82E16814102837
Anyone got intel on it?
#90
Posted 07 January 2010 - 04:21
#91
Posted 07 January 2010 - 01:23
So, Lightning, does my choice encourage you? At retail, that's a good price for a stellar performer (a hair slower than the 5770, but leaving a future option for higher resolutions open).
P. S. I just took the HD 4850 out of its bag (no box -- it was the OEM version, so no driver CD and no DVI adapter), and it looks like the image I'll link in a moment, from GPU Review's HD 4830 feature. The shroud is 3/4 length -- it stops short of both ends of the card itself.
www.gpureview.com/database/images/cards/586/medium/ati-radeon-hd-4830-1.jpg
Gorath
-
Edited by Gorath Alpha, 07 January 2010 - 03:37.
#92
Posté 07 janvier 2010 - 05:59
Yes!
Gorath Alpha wrote...
Actually, Zethell, it is only certain of the 3870s that are having trouble. My HD 3850 hasn't skipped a beat yet, although it might not know I have an HD 4850 ready to pop in should it ever do so (incidentally, it's a 512 MB Sapphire, with the older-style full-length fan shroud, from eBay, still new but OEM, that I paid slightly over $100 for with shipping).
So, Lightning, does my choice encourage you? At retail, that's a good price for a stellar performer (a hair slower than the 5770, but leaving a future option for higher resolutions open).
P. S. I just took the HD 4850 out of its bag (no box -- it was the OEM version, so no driver CD and no DVI adapter), and it looks like the image I'll link in a moment, from GPU Review's HD 4830 feature. The shroud is 3/4 length -- it stops short of both ends of the card itself.
www.gpureview.com/database/images/cards/586/medium/ati-radeon-hd-4830-1.jpg
Gorath
-
Many thanks for the sweet info in this thread!
#93
Posted 07 January 2010 - 10:24
#94
Posted 07 January 2010 - 11:02
I still don't understand why they named the GeForce 6600 GT for XP; that was borderline when new (as was the Radeon X1600 Pro, of course, with very little difference between the X1600 and the X1300). I've been thinking about the two cards in that sentence's parenthetical, and decided that the X1300 was really a "1450", and the X1600 was the one that should've been named "1550", instead of the renamed X1300 Vanilla becoming the X1550.
While getting ready to add an update, I noticed a potentially controversial statement that I suspect I've already covered closer to the end here, so I added this edit. All of the FX / 6n00 / 7n00 GeForces were "weak" in shader unit functionality, and when shaders are used heavily, they fall well behind the various Radeon equivalents. The X1600 was not very fast overall, but had good shader capability for its class at the time. The 6600 GT was faster when shaders were absent from a game's code, but far slower when they were used in quantity (Oblivion, NWN2, The Witcher, Fallout 3, Dragon Age).
Gorath
-
Edited by Gorath Alpha, 25 January 2010 - 12:33.
#95
Posted 08 January 2010 - 12:14
==========================================================================
Ignore my post; the game arrived today and it runs great.
Edited by wobblydave, 08 January 2010 - 11:25.
#96
Posted 11 January 2010 - 06:49
(Ended up getting the graphics card 2nd instead of 3rd because my graphics card decision came together faster than my CPU/heatsink research. The CPU/heatsink is 3rd now, but that's another story.)
#97
Posted 14 January 2010 - 09:21
I guess the Nvidia GeForce 8400 GS is not up to the task of running this game. It did great for Neverwinter Nights 2, but not so much for DA:O.
I am on the lookout for a better low-profile, power-conserving card.
Edited by Magnus of the Moon, 15 January 2010 - 05:54.
#98
Posted 15 January 2010 - 06:38
#99
Posted 16 January 2010 - 04:26
#100
Posted 16 January 2010 - 04:45