Don't buy the game if all you have is Intel graphics, really. Here's a practical look at the minimums:
Windows XP Minimum Specifications
OS: Windows XP with SP3
CPU: Intel Core 2 Single (or equivalent) running at 1.6 GHz or greater
AMD 64 (or equivalent) running at 2.0 GHz or greater
RAM: 1 GB or more
DVD ROM (Physical copy)
20 GB HD space
Video: ATI Radeon X850 256 MB or greater (this is clearly wrong)
NVIDIA GeForce "6600 GT" 128 MB or greater (and this one is even more wrong)
(Note: IMO, the practical choices for the two video cards above should be the Radeon X800 Pro and the GeForce 6800 GS, at the least, for small textures; it will take a Radeon X1650 XT (or an X1800 GTO, which is almost the same thing) for medium or better textures.)
INTEL'S video (Graphics Chip) IS NOT SUPPORTED
Windows Vista/Windows 7 Minimum Specifications
OS: Windows Vista with SP1, Windows 7
CPU: Intel Core 2 Single (or equivalent) running at 1.6 GHz or greater
AMD 64 (or equivalent) running at 2.0 GHz or greater
RAM: 1.5 GB or more
Video: ATI Radeon "X1550" 256 MB or greater (this was OBVIOUSLY wrong)
(*Very* wrong here; it should be the X1650 XT, since the X1550 is only a renamed X1300)
as per this ~ http://www.gpureview...1=472&card2=385
NVIDIA GeForce 7600 GT 256 MB or greater
DVD ROM (Physical copy)
20 GB HD space
OK, with that out of the way,
" Ignore Warning Labels at Your Wallet's Peril "
For two years, possibly three, roughly 2004 to 2006, about 70% of the desktop PCs sold did not include any dedicated 3D video device, and the figure was over 90% for laptops. Without that capability, those PCs are not game-capable machines. A number of my (past) posts here at the BioWare Community (the original of this comment dates back a couple of years) cover various aspects of this. The minimum video card, a discrete add-in device, carries a GPU made by ATI or nVIDIA. Nothing from Intel qualifies.
I used a generic warning about typical laptops / mobile PCs as my basis for this, since I couldn't find an original piece on the Intel video chip subject dating from the pre-release run-up to Dragon Age: Origins. Recent events have prompted some edits to the original, and those belong in here as well.
Most current laptops continue to be sold with mostly unusable video systems, and we are once again seeing a growing number of desktop PCs sold without even a business-quality graphics card in them. Every recent 3D game carries a warning label on the back, bottom flap, or side panel of its box that you should never ignore! The official minimums published for Dragon Age aren't, IMO, really good (practical) choices for that designation. Nevertheless, they are real video cards, while Intel hasn't even tried to produce one of those since their disastrous singleton about a dozen years ago.
(I suppose I should amend that statement to some extent. Intel put together a new video project group about five years ago, planning to take a different route to video than ATI and nVIDIA, specifically using something called "Ray Tracing" to handle graphics. They announced a plan for an actual video card that supposedly would have been released this past winter (March, 2010), and they demonstrated some of their work here and there along the way. The project was named "Larrabee", and they ran into all kinds of snags with it, so it was cancelled around the end of summer / start of fall of 2009. There is no announced replacement program for an "end user version" discrete video card from Intel.)
For that matter, when the official requirements for the Mass Effect 2 game were published, Intel's video was named very specifically as inadequate for that game (and it should have been named the same way for DA: O). Not even Intel's combined on-package video chip plus CPU parts, the various i3s / i5s / i7s (technically not IGPs), qualify as full-power mainline devices.
The engineers at nVIDIA have been looking at Netbook and Smartphone devices as a better place to compete than the general PC market, where AMD has a serious advantage in being able to integrate GPUs into the designs of its coming "Fusion" line of CPUs. Their influence, if they earn a sizable share of the video in those markets, can only be good. Almost anything other than what Intel is doing has to be better.
That isn't to say that Intel has been going backward graphically, but their low standard has been legendary, and any improvement at all gets noticed. Over the past year, many of their i3 / i5 video chip systems have been able to perform almost as well as the nVIDIA onboard graphics currently available, but right at this moment the GeForce mobile generation of Fermi GPUs is setting some amazing standards (although demanding better cooling than ever before).
AMD will soon be releasing mobile versions of their "Fusion" series (January, 2011), which should go a long way toward improving laptop standards. Intel has a new processor family called "Sandy Bridge" in the wings, due out slightly after the AMD Fusion desktop APUs, which are expected in February. Instead of merely being a separate video chip riding inside the CPU package, Sandy Bridge's graphics will be at least basically integrated into the CPU itself, sharing its cache. It promises to be as fast as the current AMD onboard chip, the HD 4200, but AMD will already have replaced that one by then (the HD 4250 has already arrived, although it's not yet the one to look for as a chipset video chip).
It is now December, 2010, and the new mobile Fusion APUs are already shipping to Netbook, Notebook, and Laptop manufacturers, with PCs built around them expected around the turn of the year or even sooner. The desktop Fusion APUs are expected in February.
The latest news on the long-standing feud between Intel and nVIDIA, over whether nVIDIA's contract with Intel included the right to design chipsets for the newer Intel CPUs, is that the case was finally about to go to court after six years of wrangling (dating back to when the Core Duo was a mere embryo), but both sides agreed to ask for a continuance because they are back in negotiations.
The conjecture going around is that Intel wants to protect its dominant lock on the Netbook market, which Fusion threatens with its many-times-superior graphics. nVIDIA's ION is a low-power device that offers graphics performance almost competitive with Fusion for games, and superior performance in some other areas. Intel may hope to hold onto a bigger part of its Netbook market if it can pair up ION and Atom at a good price, which right now it cannot.
Whereas Sandy Bridge won't make much of an impact on the poor quality of Intel's own video chips, nVIDIA potentially could make a difference in the Notebook and Laptop markets, displacing more of Intel's junk with an appropriately newer chipset video of their own (the current 9100 / 9200 onboard option is pretty dismal, and Intel has already matched its poor performance with the i3 / i5 / i7 piggyback graphics).
I've refreshed this article again, although I'd still have preferred to add the changes onto what I wrote five or six months before adapting this material for this purpose. I will always feel that the first effort was probably the better one.
Gorath
-
Edited by Gorath Alpha, 18 August 2011 - 06:14.




