
Beam us down a Game Card, Scotty!


30 replies to this topic

#1
Gorath Alpha
  • Members
  • 10,605 posts
                         Elementary, my Dear Doc McCoy (err, Watson) . .  

An onlooker might reasonably assume that, with an M (for Mature) game rating, everyone playing Dragon Age: Origins would have completed some sort of computer literacy course at some time, somewhere.  Apparently that isn't a valid assumption after all.  Just in the past few days, another pair of game owners have wanted to "download" a replacement video card because the one they have now isn't good enough. 

(Created about 3 days ago -> "is this e card something i need to go buy or can i download it sorry pretty dumb about stuff like this")

Basic info:  www.encyclopedia.com/doc/1O11-addincard.html

Some added video details:  pcsupport.about.com/od/componentprofiles/p/p_video.htm

While it would be very interesting, indeed, if the matter transporter technology from Star Trek existed, that remains fiction in this day and age (I am writing this near the end of 2009 AD).  These add-in cards cannot be "downloaded"; they must be purchased, shipped, delivered, and installed. 

Although confusing a "card" with software is the more extreme case, we also see far too many newbies, and even some moderately experienced PC users, getting onboard video solutions confused with the real thing: actual video cards.

Here is a great long list of the many video cards that have been available: 

www.gamespot.com/pages/forums/show_msgs.php

I don't necessarily agree with the individual ranking placements, and have previously posted my own simplified rankings list based on the one compiled by NotTheKing. 

Here's how to identify your current system's hardware:

social.bioware.com/forum/1/topic/58/index/509580

All modern PC game packaging includes a System Requirements label describing the hardware needed to run the game.  When it comes to the video cards, their names include a performance marker in the form "n600" for those I refer to as Mainline Gaming cards.  An example of such a card is the late 2008 released Radeon HD 4670.  The HD 4 is the generation, and the "670" is the performance level.  Ordinary business graphics cards have the markers n300, n400, and n500.  They are much slower and less capable than the gaming cards. 

compreviews.about.com/od/video/a/DeskVidSpec.htm
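As a rough sketch of the naming scheme described above, the three-digit performance marker can be pulled out of a model name and bucketed by tier.  The cutoff values below follow the post's informal scheme and are illustrative guesses, not an official classification:

```python
import re

def classify(model: str) -> str:
    """Rough tier guess from the performance marker in a model name such as
    "Radeon HD 4670" (generation 4, marker 670).  Cutoffs are illustrative."""
    m = re.search(r"\d(\d\d0)\b", model)
    if not m:
        return "unknown"        # e.g. GTX-era names that dropped the scheme
    marker = int(m.group(1))
    if marker >= 800:
        return "high end"           # n800 / n900 cards
    if marker >= 600:
        return "mainline gaming"    # n600 / n700 cards
    return "business graphics"      # n300 - n500 cards

print(classify("Radeon HD 4670"))   # mainline gaming
print(classify("Radeon HD 4350"))   # business graphics
```

Names like "GTX 260" fall through to "unknown", which matches the post's later point that nVIDIA abandoned this numbering.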

Modern games do not support the low quality video devices built into the computer's mainboard chipsets.  These onboard (or "IGP") devices are not intended for game use. 

Peripherally related to the errors made purchasing PCs that have no video cards at all, we also see a fairly frequent complaint from people who believe the newness of their machine should exempt it from System Requirements problems.  There always have been, and always will be, multiple GRADES of computer quality being sold new at the same time by the same retailer. 

The market for low quality, inexpensive PCs continues to exist, particularly for laptops, few of which are adequate game systems.  The same goes for computer PARTS.  Just because a video card is "new" doesn't make it better than an older, higher quality part.

For the very best in game playing, there are High End video cards with performance markers such as n800, n900 (the nVIDIA company also has some GTX260/275/280/285 numbered high end cards). 

Shopping:   www.ehow.com/how_5743276_choose-gaming-graphics-card.html

For anyone living stateside, I have had excellent experience buying my video cards online from the Newegg site, and their pricing is very difficult to beat.  Newegg now is also selling in Canada and in China. 

http://www.newegg.com

Gorath
-

Edited by Gorath Alpha, 02 February 2010 - 12:40.


#2
Gorath Alpha
  • Members
  • 10,605 posts
This was previously a relatively old comment in terms of this thread's development, but I wanted the added part here to be close to the top in the sequence.  Beginning with their GTX280, nVIDIA mostly abandoned the previous performance ranking numbers in their cards' names.  They use new numbers, but once again with far less attention to being particularly truthful (here I'm thinking of their FXes). 

The "80" cards are mostly still equivalent to an "850", and the "70" is roughly like an "800".  A "60" card is about where the "750" cards once were, while the "50s" are more variable, ranging from "600" up to "750" equivalences.  There have only been two "40s" so far, and the original comes in two varieties, one of which is like an old "500", while the faster of the two is more like a "570".  I think that the 440 is doing the same thing, splitting its proper performance grade in two . .

The "30" looked at first as though it would be the same as a "500", but the latest one is much more of a "300", actually.  The "20" is basically a "400" or "450", and there have seldom been equivalents to the ones with "10" in their names, which are worse than some onboard chips, being like a "100". 
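The rough equivalences above can be collected into a small lookup table.  This is only a sketch of the post's own estimates; the ranges mark the suffixes the post describes as variable:

```python
# Post-GTX280 nVIDIA suffix -> approximate old-style performance marker,
# per the estimates above (illustrative, not an official mapping).
gtx_to_old_marker = {
    "80": "850",
    "70": "800",
    "60": "750",
    "50": "600-750",   # varies by model
    "40": "500-570",   # the original "40" came in two varieties
    "30": "300-500",   # first looked like a "500", the latest is more a "300"
    "20": "400-450",
    "10": "100",       # worse than some onboard chips
}

print(gtx_to_old_marker["70"])   # 800
```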

~~~~~~~~~~~~~~

Here's my original content from way back when:

Between the PC Gaming System Basics article and the Video Card Rankings list, this forms a third article for Game Play hardware reference purposes.  I need to add at least one link to a tutorial about installing an upgrade video card; I have an old bookmark for one that I'll check to see whether it's still there, and I'll also repeat the local URLs for those other two reference articles here in this followup. 

The Basics: 

social.bioware.com/forum/1/topic/58/index/509580

My own interpretation of the video card rankings:

social.bioware.com/forum/1/topic/58/index/128343

The article at Sharkey Extreme is still there, and still looks like a good tutorial today:

www.sharkyextreme.com/hardware/articles/video_card_install_guide/index.shtml


Gorath
-

Edited by Gorath Alpha, 18 May 2011 - 06:41.


#3
Gorath Alpha
  • Members
  • 10,605 posts
While editing the primary PC Hardware Basics article to add a warning (as below), I created this P. S. answering a comment about how well one member's X1650 XT was working for him in Dragon Age.  This is it. 

P. S. (In edit)  The closest thing that I have to that 1650 is the X1600 Pro in the primary loaner box here (I could probably have left it without a card, since its MB came with an HD 3200 IGP, one of the better onboard chips at the time; that system was built for a grandkid, whose mother got all high-handed and weird, so he ended up not being able to keep it). 

This next is IMPORTANT.

It's the only card I have with the so-common "Big RAM Scam" layout.  Like any Mainline card (admittedly very low on that scale when new), it has only a 128-bit memory system, and a resulting narrow memory bandwidth.  It is limited to a max of 256 MBs of VRAM to pass to today's games, so half of the 512 MBs onboard is literally wasted.  The big RAM numbers appeal to the noobs, but they have almost no real meaning at all.  

The important shopping criteria are core speed, RAM speed, memory bandwidth, and shader unit count.
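To illustrate why the bus width matters so much more than the RAM amount, peak memory bandwidth is simply the bus width in bytes times the effective memory clock.  This is a back-of-the-envelope sketch; the 800 MHz figure is a hypothetical example, not a verified spec for any particular card:

```python
def peak_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (effective clock)."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# A 128-bit bus at a hypothetical 800 MHz effective clock:
print(peak_bandwidth_gb_s(128, 800))   # 12.8 (GB/s)
# Note that doubling the RAM on the card changes neither input,
# which is why "Big RAM" alone buys nothing.
```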


Gorath
-

Edited by Gorath Alpha, 25 August 2010 - 11:49.


#4
Gorath Alpha
  • Members
  • 10,605 posts
Someone brand new to the forum wanted to equate a software "update" to an upgrade today, saying that his video "upgrades all the time"; I believe that was the way he wrote it.  That exchange, in the long run, ended up with the following thoughts being added to this article. 

                                But is it a worthwhile upgrade after all?  Really?

This subject shouldn't come up very often any more.  It's been five years since PCIe became the successor video bus, and four years since nVIDIA stopped selling any AGP cards newer than the five year old Geforce 6200A.  ATI stopped selling any AGP cards with its own brand on them about three years ago, but does continue to offer its partners the GPU chips for AGP video cards as recent as the HD 4670. 

However, the question keeps coming back into focus.  For the most part, at this date (January, 2010), buying a new video card to upgrade the average PC that needs an AGP card is just not a good monetary investment, no matter that there are new cards of good quality like the HD 4670 readily available at a good price ($75 to $90 from the online sellers, such as Newegg). 

In the case of the average machine still using such old video hardware, the rest of the system is almost certainly equally out of date, with too little RAM, too slow a CPU, and very limited upgrade paths for such things as replacement CPUs.  Between the cost of older types of RAM (not as reasonably priced as current types), collector-level prices for rare NIB processors, and the AGP video card itself, the total is such that replacing the motherboard with something newer is hardly much of a step up in cost, given the lower-priced RAM and the less expensive video cards for PCIe (extremely reasonably priced just now). 

The only AGP-card systems I consider worth spending money on are the various AsRock "Dual" mainboard PCs, which can handle both AGP and PCIe, and have an upgrade slot allowing a move to both the AM2 processor socket and DDR2 memory.  I don't think that the Pentium versions of the AsRock Duals are equally admirable. 

P. S.  In the intervening time since I wrote this comment to add to this article, I decided to voluntarily upgrade the PC that my sister has in her bedroom, for which the main thing she does is look at old movies on DVDs, using a flat panel display that she already had when her older model DVD player broke down.  On eBay, I found a rather interesting AsRock MB that I'd not seen before.  It has the ULi chipset, an AM-2 socket for an AMD processor, and DDR2 memory.  The odd thing is it has an AGP video bus, so the X1650 from her prior bedroom PC will work for her, just fine.  

Gorath
-

Edited by Gorath Alpha, 18 April 2010 - 03:41.


#5
renegade1982
  • Members
  • 3 posts
helpfull thank you gorath :)

#6
Gorath Alpha
  • Members
  • 10,605 posts
This article and thread is more concerned with the lowest of low-end video.  I was discussing this subject in a thread on the Mass Effect tech forum a couple of days ago, and at least one Bioware community member was getting very angry about not being able to play the game using an ordinary Intel onboard video chip, complaining that the game's system requirements were too difficult to understand (when in fact, INTEL, as a maker of video devices, is specifically named in the ME2 requirements as substandard). 

The next material, below, was in answer to a complaint about how expensive an upgrade might be, and about how a different Intel product might be working: 

Mainline PCIe video cards start from around $65-70 (Radeons -- most Geforces at this level, IMO, seem to be priced too high to be good values compared to the HD 4650 and HD 4670).  The 8400 (in a desktop) could let the game operate, at a cost of maybe $35 (any pricing is strictly stateside and strictly online), but at a very similar price, an HD 4350 is much better, although still technically unsupported. 

igorv2002 wrote...

Actually  MOST games I tried were handled well by Clarkdale's IGP.

Depending on the i5 version, it's not what would be considered an IGP.  Just as already commented, when the CPU and video are combined, everyone is entering brand new territory.  The chip is inside the package with the CPU, although not integrated into it (much like the cache RAM for the Pentium P2s).  Intel did upgrade their design before adding it, and as already noted, the close communication with the CPU makes a huge difference (in fact, on some benchmarks, it can overmatch the competition, which is still merely the old style IGP). 

With regard to any 8400 Geforce: first off, it is specifically NAMED as unsupported.  It may technically have all of the required features and functions, and so should eventually manage to operate with the game's code, but it won't "run", it will only "walk", because of its very low performance. 

www.gpureview.com/show_cards.php

What's been omitted here, and what led into the answer to Igor, is mention of the APUs coming from ATI and Intel in the next couple of years, which will totally redraw the landscape for the add-on video card industry. 

G

Edited by Gorath Alpha, 13 February 2010 - 08:49.


#7
Gorath Alpha
  • Members
  • 10,605 posts
                      What's wrong with this computer? 
               
This is a trick question of sorts, and the message thread is the base-level video card data information article published here in the forum: 

www.newegg.com/Product/Product.aspx

It has an Intel Core Quad processor, a Geforce GTX260 video card, six GBs of RAM, and a 0.6 Terabyte hard drive, which seems to cover all the game's basics, but does it do so well enough? 

I have to tell you, I really do not know, and cannot tell from the advertising whether it can replace a desktop gaming system.  Right, if you read on through here before clicking on the link, it's a laptop PC, and thus, unsupported for games by the developer (by any developer for any high intensity 3D game that I know about, in fact). 

The missing information is whether or not it has adequate cooling to run that Geforce GTX 260 for a long gaming session, without either throwing artifacts, or slowing itself way down to be able to cool off better.  It is only 2" thick, and it weighs less than nine pounds.  That isn't "light" for a laptop, overall, but compared to a replacement for a gaming desktop, it *is* relatively light in weight. 

I'm not able to determine from Newegg's web page whether there is enough heat sink hardware, or sufficient ventilation to remove the heat.  AFAIK, the Intel Quad it is using is a low-power device, so it's going to be the heat from the hard drive, RAM, and Video that must be dealt with, to a greater extent than the CPU's heat signature.  

That's not going to help you decide if a laptop will work for you or not, but in my personal opinion, if you are neither a student, a business traveller, nor a shared-space dweller (back to a student there), you surely have the space to set up a desktop, and then you have more confidence it can cool itself! 

Finally, there is this factor: a desktop system is relatively flexible; almost everything inside the tower can be upgraded.  Laptops typically are limited to upgrades of storage and memory, usually not including the CPU, and almost never the GPU.  The worst laptops of all are the cheapies with onboard video chips, and NONE of the laptops made that way have any video upgrade path at all. 

Gorath
-

#8
Gorath Alpha
  • Members
  • 10,605 posts
                                 Video Card Competition

The release dates for video card generations haven't been on a regular cycle in about seven years now.  In nVIDIA's case, the complexity of their chips kept on growing rapidly in that period, while ATI has been shrinking their own chips by comparison (past three years in their case). 

However, until just the past year and a half or so, nVIDIA almost always had a marketing cycle advantage, with mid-summer releases compared to Radeon releases in the fall.  That was while it was ATI whose chips were increasingly complex (now, I wonder how many cycles of edit / re-edit I'll need to line up this table I've drawn up). 

Geforce                 Time       Radeon                  Time
Flagship     Date       delay      Flagship     Date       delay

GF3          2-27-01               8500         8-14-01
Ti-4600      2-06-02    11 Mo.     9700 Pro     7-18-02    11 Mo.
FX 5800      1-30-03    12 Mo.     (Delayed)
6800 GT      7-1-04     17 Mo.     X800 XT      6-1-04     23 Mo.
7800 GT      8-11-05    13 Mo.     X1800 XT     1-5-05     15 Mo.
(Delayed)                          (Delayed)
8800 GT      10-29-07   26 Mo.     HD2900 XT    5-14-07    28 Mo.
                                   HD3870       10-15-07   6 Mo.
GTX280       6-16-08    8 Mo.
9n00 Clones             9 Mo.      HD4870       9-10-08    10 Mo.
(Delayed)                          HD5970       9-20-09    12 Mo.
GTX480       4-12-10    22 Mo.
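The "delay" columns above can be reproduced from the release dates with simple month arithmetic.  A minimal sketch (whole months, rounded down):

```python
from datetime import date

def months_between(earlier: date, later: date) -> int:
    """Whole months elapsed, rounded down, matching the table's 'delay' style."""
    months = (later.year - earlier.year) * 12 + (later.month - earlier.month)
    if later.day < earlier.day:
        months -= 1   # not yet a full month past the anniversary day
    return months

# GF3 (2-27-01) to Ti-4600 (2-06-02):
print(months_between(date(2001, 2, 27), date(2002, 2, 6)))   # 11
# 6800 GT (7-1-04) to 7800 GT (8-11-05):
print(months_between(date(2004, 7, 1), date(2005, 8, 11)))   # 13
```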

Edited by Gorath Alpha, 13 April 2010 - 03:15.


#9
Gorath Alpha
  • Members
  • 10,605 posts

Gorath Alpha wrote...

...  at least one Bioware community member was getting very angry about not being able to play the game using an ordinary Intel onboard video chip, complaining that the game's system requirements were too difficult to understand (when in fact, INTEL as a maker of video devices, is specifically named in the ME2 requirements as substandard). 

Mainline PCIe video cards start from around $65-70 (Radeons -- most Geforces at this level, IMO, seem to be priced too high to be good values compared to the HD 4650 and HD 4670). 

Onboard video chip supporters continue to be contentious, and I continue to denigrate Intel's Low Quality junk. 

Edited by Gorath Alpha, 10 June 2010 - 12:43.


#10
Gorath Alpha
  • Members
  • 10,605 posts
I think the time is here to bring this to the top again.

#11
Gorath Alpha
  • Members
  • 10,605 posts
We have a new arrival with only a Radeon 9800, wanting to play the game on his eight-year-old antique video device today . .

#12
SSV Enterprise
  • Members
  • 1,668 posts
Huh... I was just reading this when I came upon the revelation that cards with ATi chips as recent as the 4670 are still made for AGP ports. I have a rather old computer that lacks PCIe ports, with only PCI and AGP. It's been running an nVidia GeForce 7300 GT for a few years now, as that was the only AGP-compatible card my dad and I could find at the time. Until now.



So, my question is: would it be worth buying something like an AGP Radeon HD 4670 for this old computer? Here are the other specs for it:



Windows XP Professional

Intel Pentium 4 processor, 2.6 GHz

1 GB DDR RAM



Would I get decent performance out of a game like Dragon Age, or would it be rather a waste of money as the processor and RAM can't keep up?

#13
Gorath Alpha
  • Members
  • 10,605 posts
My opinion is that the only AGP mainboards worth considering an upgrade of the video card for are the more recent of those designed for the AMD X2 -- primarily from AsRock and Biostar, which have far greater potential than any pre-DDR2 hardware based on an Intel processor and chipset. 

I just sent one of those off to Austin to replace a system similar to what you are asking about.  My sister is amazingly backward with PCs, practically techno-negative, in fact.  She literally fits the famous stereotype about "dumb and dumber" computer noobs expecting the systems to plug themselves into the surge protector.  The replacement has an AM-2 socket, DDR2, a Via chipset, and it still has an AGP slot for its video bus. 

Now, I should probably try to figure out how she managed to get her old system so badly screwed up that she could hardly get any use from it any more. 

Anyway, I could bump that mainboard's graphics to an HD 4670 without bottlenecking at all, but in your case, an HD 3650 is probably the most recent card that won't be throttled by the rest of an old and slow system, if you can find one.  Of course, even choked down the way that P4 will do to the HD 4n00 cards, there are games it could play, such as TES: Oblivion, and The Witcher, and probably Fallout 3 (I'd have to look that one up, however). 

I definitely wouldn't want to try either of the Mass Effect games, or Dragon Age, with that slow of a P4, though.

G

Edited by Gorath Alpha, 25 August 2010 - 10:52.


#14
Gorath Alpha
  • Members
  • 10,605 posts
I wasn't sure which of the three (DA: O) graphics add-on card articles was the right one to include a comment about PhysX in, and ended up leaving it out.  Like Havok, it's a game-supporting utility that helps game developers to include real-world physics in their products.  Like Havok, it has been around for several years now.  Originally created by Ageia, it was based on CPU processing until about a year before nVIDIA bought their company. 

They had released an add-on card that worked with the CPU and the GPU to speed up the implementation of the effects, and it is the functions from that card that were eventually added to various of nVIDIA's graphics chips.  They didn't stop support for the CPU-based PhysX, and it is that version of the program that is included in EA's games.  It works as well with Radeons as it does with Geforces, and it is needed in the game, no matter which brand of 3D card is used. 

As of the end of October, 2010 (my current edit here), some potentially very influential changes in PC Hardware are coming that will affect PhysX, because it will seriously impact nVIDIA, if both AMD and Intel pull this off equally well. 

The market for add-on video cards, other than at the absolute top-performance, very high-dollar range, is going to be limited by the addition of integrated high quality graphics (leaving Intel out of that for the immediate present) into the main processor itself.  AFAIK, the Fusion CPUs for laptops were already supposed to be in production, and the first laptops based on the new technology were expected within about two months (around New Year's).

Intel's own Sandy Bridge is more of the same low quality graphics (better, yes, but still not good enough) that Intel is famous for, although they are promising improved functionality in the future.  Sandy Bridge is also somewhat further out than Fusion, maybe late spring of 2011.  The Fusion for the desktop CPU market is less than three months away now. 

(Newer Edit, Yet):  We can expect to see what the Mobile Fusion products will look like at CES (Consumer Electronics Show) 2011, which opens on January 6th in Las Vegas.  Until mid-December, it had been my understanding that products would actually be shipping by January 6th.

The current expectations put the shipping date closer to the end of January, 2011.  Desktop versions should follow about 30 days later.

** (Added in Edit Jan 3rd:  Previews of the Intel Sandy Bridge are appearing now, with the mobile versions (of the integrated graphics) being compared to the ATI HD 4200 and 4250, and it's supposed to be better than those.  According to Anand Tech's reviewer, one of the Sandy Bridge laptops was even able to run Mass Effect 2 as well as the Mobile HD 5450, which surprises me, given the disparity in shader processor counts.)

If Intel is never able to fulfill its promise, only then does nVIDIA have a long-term future as a graphics add-on company; however, betting against the 900-pound gorilla of PC hardware being able to eventually accomplish its aims is not usually a good bet.  If the add-on graphics market shifts to only the upper performance levels, PhysX support at the "normal performance / game-ready" hardware levels disappears.  (nVIDIA has been hedging its bets for a viable stake in the future of computer technology by redesigning its high-end GPU chips into massively parallel CPUs.)

Another month later, and a fresh edit is needed.  Intel and nVIDIA have pursued a legal argument for six years now, about contracts they had with each other that nVIDIA says gave it rights to participate in the development of system chipsets for recent Intel CPUs.  Intel has said no, and the two have sued one another.  The suits were about to reach the courthouse when the two agreed to ask for a continuance in order to meet at the bargaining table. 

Previews of the reference systems designed for Netbooks using the AMD Fusion APUs have been very favorable, and those will be on the market in about a month now, along with laptops of both the regular types and the less-expensive, better-than-a-Netbook, in-between type, and AMD seems to have clear advantages over the Intel competition.  The conjecture going around about the legal situation is that Intel wants to be able to sell Atom CPUs for Netbooks with the nVIDIA ION chipset video onboard, in order not to lose so much of its market share (it totally dominates in Netbooks).

OK, back to the last line that was in this installment before the first of the PhysX material was added here:

This game is admittedly somewhat more forgiving about video at the lower end, if your personal standards are equally minimal.  ME-2, however, literally chokes and fails at the 15-16 FPS point. 

Edited by Gorath Alpha, 03 January 2011 - 08:38.


#15
Gorath Alpha
  • Members
  • 10,605 posts

Gorath Alpha wrote...

This game is somewhat more forgiving about video, if your personal standards are equally minimal.  ME-2 literally chokes and fails at the 15-16 FPS point. 

The 4550, for instance, would likely manage to "stroll" through Dragon Age if the resolution was low enough, then fail in ME-2, but I wouldn't enjoy the mob scenes at all with that one, when everything slowed to a crawl. 

http://www.gpureview...1=555&card2=584

http://www.newegg.co...N82E16814102814

www.gpureview.com/show_cards.php

www.gpureview.com/show_cards.php

Edited by Gorath Alpha, 25 August 2010 - 11:17.


#16
flagondotcom
  • Members
  • 543 posts
Unfortunately, I doubt that the real intended audience of this thread has read this far--I agree with Gorath that the issues need to be raised, but wish there were a way to better communicate them to the needy.

Anyway, I'm writing to note that for anyone reading this, even if you previously were able to play the game on a system with a video card near the "minimum" stats, your luck may have run out. Due to a set of MS hotfixes to WinXP that forced updates to DirectX and my videocard drivers, my NVidia 8600GTS has gone from "reasonably good overall" to "only tolerable in small spaces". In particular from a new playthrough starting just after Ishal, the entire outdoor portions of Lothering and Denerim are almost entirely unplayable (turning the character goes from "real time" to "thinking about it" and actual movement slows down by a factor of 5 or 10)...but as soon as I step into a building all is well. DA didn't change, so this issue is due to either MS or NVidia's drivers.

Bottom line: Know what video cards are actually worth a d__n. Gorath has posted useful links to comparison sites. And even if you think your video card *should* work, that might not be the case as MS and your vendor change drivers.

Edit: removed redundant redundancy.

Edited by flagondotcom, 26 August 2010 - 03:09.


#17
Mattcool
  • Members
  • 4 posts
A very helpful thread and a very good observation, flagon.  Although it doesn't entirely account for the choppiness of my game, it does cover a good portion of it.  I guess I'll just have to wait for the birthday money to roll in, haha.

#18
Gorath Alpha
  • Members
  • 10,605 posts
Based on many, MANY years of past experience with sites such as this one, for every message that I see complaining about this game not working correctly with a minimum card such as an HD 3450 or 9400 GT, probably another six never write, but think the same thing to themselves.  Someone yesterday didn't want to accept my word that his / her 3450 wasn't going to stand back up again in any high intensity situations.  I decided to bring this one back to the top again because of that, although at least a 3450 is an ACTUAL discrete CARD. 

Meanwhile, Flagon, can you look into rolling back any driver version change to see if you can regain a part of the lost performance?  I've been suggesting for at least a year that the owners of Geforce 7800, 7900, and 7950 cards not update drivers past about 190 or so; maybe we can find a "last usable" stopping place for the 8600 / 8800 cards.


G.

Edited by Gorath Alpha, 06 December 2010 - 01:28.


#19
Gorath Alpha
  • Members
  • 10,605 posts

Gorath Alpha wrote...

The 4550 ... would likely manage to "stroll" through Dragon Age if the resolution was low enough, then fail in ME-2, but I wouldn't enjoy the mob scenes at all with that one, when everything slowed to a crawl. 

http://www.gpureview...1=555&card2=584

http://www.newegg.co...N82E16814102814

www.gpureview.com/show_cards.php

www.gpureview.com/show_cards.php

                          How Terrible *IS* that video card? 

It appears that many HD 3450s were sold in several branded PC companies' bargain-priced systems a few years ago, and their owners are loath to follow through on upgrades.  Games such as this one demand decent Mainline graphics performance, which this device cannot deliver.  Here it is next to the card I consider a Minimum for the normal Medium Textures: 

http://www.gpureview...1=555&card2=472

As it reveals there, the Memory Bandwidth is only 36% as good, the Pixel Fill Rate is only 52% as good, and the Texture Fill Rate is only 52% as good.

Compared to Bioware's officially named minimum Radeon for SMALL Textures, the 3450 shows up far worse for speed:

http://www.gpureview...d1=555&card2=54

This time, those three measurements can only manage to reach 24%, 29%, and 39% as good of a score as the Radeon X850 Pro. 

www.videocardbenchmark.net/video_lookup.php

(That's an interesting comparison I don't have a lot of faith in.  I can imagine that even the onboard Radeon chip called the HD 4200 is better than a 3450, but I can't agree that the eight year old Radeon 9200 is any better.)
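Those percentage figures are straightforward ratios of one card's spec number to the reference card's.  A minimal sketch of the comparison (the 6.4 and 17.6 GB/s figures below are hypothetical stand-ins for illustration, not verified specs of either card):

```python
def percent_of(card_spec: float, reference_spec: float) -> int:
    """One card's spec as a rounded percentage of a reference card's spec."""
    return round(card_spec / reference_spec * 100)

# Hypothetical memory-bandwidth figures, for illustration only:
print(percent_of(6.4, 17.6))   # 36  (i.e. "only 36% as good")
```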

Take note of the common definition for what "Minimum" is.  Very typically, it is defined by screen resolution first, frame rates next, and Image Quality last.  The developers are aiming at Medium Resolutions and smooth animation; they want the frames to stay above 25 FPS, and preferably not fall below 30 through most of a game.  In order to do that with their version of what the minimum is, the Image Quality settings are all Low Quality. 

For game players with much lower standards, it certainly is possible to "stroll through" games using less capable hardware, but it won't be smooth, and it shouldn't be Medium Resolution.  It's not going to "run" the way that the game was designed to operate. 
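That 25-30 FPS floor translates directly into a per-frame time budget, which is a handy way to think about where a game stops being "smooth":

```python
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available to render one frame at a target frame rate."""
    return 1000.0 / target_fps

print(round(frame_budget_ms(30), 1))   # 33.3 ms per frame at 30 FPS
print(round(frame_budget_ms(15), 1))   # 66.7 ms -- around where ME-2 chokes
```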

Gorath
-

Edited by Gorath Alpha, 27 August 2010 - 04:55.


#20
Gorath Alpha
  • Members
  • 10,605 posts
This one is now a month out of immediacy, so I think it's about time for it to cycle to the top.


#21
Gorath Alpha
  • Members
  • 10,605 posts

Gorath Alpha wrote...

Based on many, MANY years of past experience with sites such as this one, for every message that I see complaining about this game not working correctly with a minimum card such as an HD 3450, or 9400 GT, probably another six never write, but think the same thing to themselves.  Someone yesterday didn't want to accept my word that his / her 3450 wasn't going to stand back up again in any high intensity situations.  I decided to bring this one back to the top again because of that, although at least a 3450 is an ACTUAL discrete CARD.. 

Yesterday, the question involved a routine, ordinary IGP, not a real graphics card.  Once again, the author can't accept the truth. 

Gorath

Edited by Gorath Alpha, 26 December 2010 - 08:21.


#22
TJPags
  • Members
  • 5,694 posts
Very informative.



Also very condescending.



You should work on that second part.

#23
Gorath Alpha
  • Members
  • 10,605 posts

Gorath Alpha wrote...

Yesterday, the question involved a routine, ordinary IGP, not a real graphics card.  Once again, the author can't accept the truth. 

Although yesterday it was a Mass Effect forum author, the question has now cycled back to another person hoping he can "download" a graphics card.  I always feel that for every person who posts a given question, anywhere from 5 to 10 came through thinking the same thing but were unwilling to post it.  Beyond that number are perhaps several hundred more who never visited any forum, but were thinking exactly the same thing yesterday as what did get posted. 

#24
Gorath Alpha

Gorath Alpha
  • Members
  • 10 605 messages

I have fully installed the game,  But . . When I open up the game from my desktop it comes to a black screen (the screen that appears when it's loading into the game) and it goes straight back to my desktop I do not understand why it does this.

*UPDATE*  I found a program called Can I RUN IT? It told me my computer could not run it, My video card is not good enough, I thought it would be because I have successfully played WoW before,

My video card is called NVIDIA G series,

Can someone link me to a site where I can update/upgrade my video/graphics card, I don't want to buy a new one and I heard you can update them. My parents hired a computer guy and he is coming tomorrow so he might help but any help will be appreaciated THANKS!

Video cards are actual, physical devices.  In a desktop PC they plug into an add-in slot on the mainboard; in a laptop the graphics chip is assembled separately and then soldered permanently onto the mainboard before the whole laptop is sealed up into a single unit that is not meant to be opened by the user. 

It is not possible to alter the physical characteristics of the card with software of any kind.  Only desktop PCs, plus the higher-priced versions of the Sager desktop-replacement machines, have replaceable video cards. 

P. S.  The message thread that may have inspired this article showed up again while this one was near the top for current reference.  Given the total number of messages on these forums, that sort of coincidence, over a year later, is extraordinary. 

Edited by Gorath Alpha, 26 January 2011 - 06:34.


#25
Gorath Alpha

Gorath Alpha
  • Members
  • 10 605 messages
When the last BioWare fantasy was still fairly new, nVIDIA stumbled and almost fell, just the way 3dfx had fallen a few years earlier.  They rushed out the replacements for the disastrous FX generation and finally had a workable shader system, but ATI hadn't been sleeping, and had more shader units in its cards that year than nVIDIA did.  The year was 2005, so a 6200 is now six years old.  That is a veritable antique.  The game mentioned was Neverwinter Nights, the original classic AD&D CRPG.

nVIDIA's GeForce 6200 started out too expensive to replace the terrible FX 5200, let alone the even worse MX 4000, at those old clunkers' price points, while ATI was selling X300s for less than a 6200 actually cost, and still making a profit.  That was a major mistake by nVIDIA.  Neither card was much good for anything beyond the most basic business graphics, but the 6200 had Dx9.0"c" while the X300 only had Dx9.0"a".  Neither really needed shaders, but both advertised having them.  Within weeks, the original 6200 card was pulled from the market, reflashed, and renamed to become a "6500" (about which there's more to tell, but it's not exactly pertinent here).  

During most of the run of the GeForce 6n00 generation, nVIDIA sold the "6200A", which had half the bandwidth of the "6500" and much cheaper components, so it could be priced below ATI's X300.  It was terrible compared to the original 6200, which hadn't been particularly good either, just accidentally far too expensive in the face of ATI's competitive pricing.

It even had defective shader functions that turned a cloudy sky into a checkerboard, and that flaw wasn't fixed in the 7100, 7200, or 7300 GS that descended from the 6200A, although a special bit of code was eventually added to the drivers to control the odd sky rendering to some degree. 

The "good" 6200 wouldn't have played DAO in anything like an enjoyable manner, if there were even a new one around to try; as for the 6200A, I shudder to imagine how awful that would be for this game!  

Edited by Gorath Alpha, 26 January 2011 - 12:42.