Ignore Warning Labels at Your Wallet's Peril
For two years, possibly three, during roughly 2004 to 2006, about 70% of the desktop PCs sold did not include a dedicated 3D video expansion slot, and the ratio jumped to roughly 90% for laptops. Without that capability, those PCs are not game-capable machines. Several of my past posts here at the BioWare Community (the original of this comment dates back a couple of years) cover various aspects of this. The minimum qualifying video card (a discrete add-in device) is one built around a GPU from ATI or nVidia. Nothing from Intel qualifies.
The old AGP video bus was a complicated system, and costly to include on an economy-priced mainboard; the much more recent PCI-e video bus is far simpler, and we should all be pleased that AGP faded from favor so quickly. However, the manufacturing approach that laptops standardized on amounts to a similar limitation: they are almost all sealed, single-board designs, with no path to upgrade the CPU or the graphics.
Current laptops continue to be sold with mostly unusable video systems. Every recent 3D game carries a warning label on the back, bottom flap, or side panel of its box, and you should never ignore it! The official minimums for DA:O, IMO, aren't really practical choices for that designation. Nevertheless, they are at least real video cards, while Intel hasn't even tried to produce one of those since its single disastrous attempt (the i740) about a dozen years ago.
Far too often, owners of below-minimum hardware complain about problems while avoiding the truth that trouble was to be expected from the cheap or obsolete hardware they have. I choose not to answer questions that don't include at least a basic list of the important components.
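For anyone on Windows who isn't sure what those components are, the DirectX diagnostic tool will generate exactly such a list (a minimal example; the output filename here is arbitrary). From Start > Run or a command prompt:

dxdiag /t my_specs.txt

The saved text file reports the OS, CPU, RAM, and the exact display device with its driver version, which is everything a help request needs up front.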
Elsewhere, the differences between what a game developer chooses to name as the minimum and what the cheapskate route of playing on junk graphics amounts to have been spelled out. Some onboard chips from AMD and nVIDIA can satisfy that latter crowd, but I personally believe those players are doing a disservice to the work BioWare's people put into creating this game when they choose to do so.
P.S. Another thing I have seen fairly frequently during the past couple of years, as the cachet of owning a laptop has gone up and desktops have become too low-status to hold onto, is that ordinary systems forced into playing games they were never designed for deteriorate over time. I don't know what goes bad, or how; I can only offer my suspicion that it is probably heat-related.
Both AMD and nVIDIA have been offering various chipset (onboard) video chips for three or four years now that have all of the needed functions and features other than dedicated VRAM. Fairly recently, Intel has managed to offer some chipset graphics that challenge the 3D companies' onboard superiority, which has in turn encouraged owners of older Intel chipset video to try to join in; there are software add-ons for them that disguise the worst of the Intel chips from the games' configuration tests.
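Those add-ons work by sitting between the game and Direct3D and lying about what the hardware is. As a rough illustration only (my own sketch, not any game's actual test, and assuming a Direct3D 9 title such as DA:O on Windows), this is the kind of adapter query a configuration check typically makes; a spoofing wrapper intercepts it and reports a different vendor than the real chip:

// Build with the DirectX SDK on Windows; link against d3d9.lib.
// A hypothetical stand-in for a game's GPU check, not any game's real code.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) {
        printf("Direct3D 9 is not available on this system.\n");
        return 1;
    }

    D3DADAPTER_IDENTIFIER9 id;
    if (SUCCEEDED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id))) {
        // Well-known PCI vendor IDs: 0x10DE = nVidia, 0x1002 = ATI/AMD,
        // 0x8086 = Intel. A wrapper that fakes the VendorId/DeviceId here
        // can make an Intel chip pass a test written to reject it.
        printf("Adapter: %s (vendor 0x%04lX, device 0x%04lX)\n",
               id.Description, id.VendorId, id.DeviceId);
    }

    d3d->Release();
    return 0;
}

Of course, fooling the test does nothing about the missing horsepower and VRAM, which is the whole point of the warning label.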
It is systems with those onboard solutions primarily, but also some with low-end business graphics cards, that I think are wearing out prematurely under the demands of game playing.
Gorath
Edited by Gorath Alpha, 30 January 2011 - 04:55.