
DA's System requirements versus assorted (mostly Intel) video chips


5 replies to this topic

#1
Gorath Alpha
  • Members
  • 10,605 posts
Ignore Warning Labels at Your Wallet's Peril

Don't buy the game if Intel graphics is all you have, really.  Here's a practical look at the minimum:

Windows XP Minimum Specifications
OS: Windows XP with SP3
CPU: Intel Core 2 Single (or equivalent) running at 1.6 GHz or greater
AMD 64 (or equivalent) running at 2.0 GHz or greater
RAM: 1 GB or more
DVD-ROM drive (physical copy)
20 GB HD space
Video: ATI Radeon X850 256 MB or greater (this is clearly wrong)
NVIDIA GeForce "6600 GT" 128 MB or greater (and this one is more wrong)

(Note: IMO, the practical choices for the two video cards above should be the Radeon X800 Pro and the GeForce 6800 GS, at the least, for small textures; it will take a Radeon X1650 XT (or X1800 GTO, almost the same thing) for medium or better textures.)

INTEL'S video (Graphics Chip) IS NOT SUPPORTED

Windows Vista/Windows 7 Minimum Specifications
OS: Windows Vista with SP1, Windows 7
CPU: Intel Core 2 Single (or equivalent) running at 1.6 GHz or greater
AMD 64 (or equivalent) running at 2.0 GHz or greater
RAM: 1.5 GB or more
Video: ATI Radeon "X1550" 256 MB or greater (this was OBVIOUSLY wrong)
(*VERY* stupid here -- it should be the X1650 XT, since the X1550 is only a renamed X1300)
as per this ~ http://www.gpureview...1=472&card2=385
NVIDIA GeForce 7600 GT 256 MB or greater
DVD-ROM drive (physical copy)
20 GB HD space
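(If you aren't sure which graphics chip your PC actually has, Windows can tell you: run "dxdiag" and read the Display tab.  The little sketch below is only an illustration of that same check done from a script; it asks Windows for the video controller names and flags a machine whose only adapter is an Intel chip.  It assumes a Windows machine where the standard "wmic" command is available and Python is installed -- neither of which the game itself needs.)

# Hypothetical helper, not part of the game: list the video controller(s)
# Windows reports and warn when the only adapter found is an Intel chip.
import subprocess

def list_video_controllers():
    # Ask Windows Management Instrumentation for the video controller names.
    output = subprocess.check_output(
        ["wmic", "path", "Win32_VideoController", "get", "Name"],
        universal_newlines=True,
    )
    # The first line of the output is the "Name" column header; skip it
    # along with any blank lines.
    return [line.strip() for line in output.splitlines()[1:] if line.strip()]

if __name__ == "__main__":
    controllers = list_video_controllers()
    for name in controllers:
        print("Found video controller:", name)
    if controllers and all("intel" in name.lower() for name in controllers):
        print("Only Intel graphics detected -- below the game's supported minimum.")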

OK, with that out of the way,

            " Ignore Warning Labels at Your Wallet's Peril "

For two years, possibly three, from roughly 2004 to 2006, about 70% of the desktop PCs sold did not include any dedicated 3D video device, and the figure was over 90% for laptops.  Without such capability, those PCs are not game-capable machines.  Many of my (past) posts here at the BioWare Community (the original of this comment dates back a couple of years) cover various aspects of this.  The minimum (discrete add-in) video card includes a GPU made by ATI or nVIDIA.  Nothing from Intel qualifies. 

I used a generic warning about typical laptops / mobile PCs as my basis for this, since I couldn't find an original piece on the Intel video chip subject dating from the pre-release run-up to Dragon Age: Origins.  Recent events have resulted in some edits to the original, which also belong in here. 

Most current laptops continue to be sold with mostly unusable video systems.  We are also seeing a growing number of desktop PCs again being sold without even a business-quality graphics card in them.  Every recent 3D game has a warning label on the back, bottom flap, or side panel of the box that you should never ignore!  The official minimums published for Dragon Age aren't, IMO, really good (practical) choices for that designation.  Nevertheless, they are real video cards, while Intel hasn't even tried to produce one of those since their disastrous singleton about a dozen years ago. 

(I suppose that I should amend that statement to some extent.  Intel put together a new video project group about five years ago, with plans to follow a different route to video than ATI and nVIDIA, specifically using something called "ray tracing" to handle graphics.  They announced a plan for an actual video card that supposedly would have been released this past winter (March, 2010), and they demonstrated some of their work here and there since then.  The project name was "Larrabee", and they ran into all kinds of snags with it, so it was cancelled at the end of summer, start of fall, in 2009.  There is no announced replacement program for an "end user version" discrete video card from Intel.)

For that matter, when the Mass Effect 2 game's official requirements were published, Intel's video was named very specifically as inadequate for that game (and it should have been named the same way for DA:O).  Not even the combined on-package video chip plus CPU in Intel's several i3s / i5s / i7s (technically not IGPs) qualifies as a full-power mainline device. 

The engineers at nVIDIA have been looking at Netbook and smartphone devices as a better place to compete than the general PC market, where AMD has a serious advantage in being able to integrate GPUs into the designs of its coming "Fusion" line of CPUs.  Their influence, if they earn a sizable share of the video in those markets, can only be good; almost anything other than what Intel is doing nearly has to be better. 

That isn't to say that Intel has been going backward graphically, but their low standard has been legendary, and any improvement at all is noticed.  Over the past year, many of their i3 / i5 video chip systems have been able to perform almost as well as the nVIDIA onboard graphics currently available, but right at this moment, the GeForce Mobile generation of Fermi GPUs is setting some amazing standards (although demanding better cooling than ever before).

AMD will soon be releasing Mobile versions of their "Fusion" series (January, 2011), which should be extremely helpful in improving laptop standards.  Intel has a new processor family they call "Sandy Bridge" in the wings, due out slightly after the release of the AMD Fusion desktop APUs, expected in February.  Instead of merely being a separate video chip riding inside the CPU package, the Sandy Bridge video will be at least basically integrated into the CPU, sharing its cache.  It promises to be as fast as the current AMD onboard chip, the HD 4200, but AMD will already have replaced that one before then (the HD 4250 has already arrived, although it's not yet the chipset video chip to look for). 
 
It is now December, 2010, and the new mobile Fusion APUs are already shipping to Netbook, Notebook, and Laptop manufacturers, with the PCs using them expected about the turn of the year or even sooner.  The desktop Fusion APUs are expected in February.

The latest news on the long-standing feud between Intel and nVIDIA, over whether nVIDIA's contract with Intel included the right to design chipsets for the newer Intel CPUs, is that the case was about to go to court after six years of wrangling (dating back to when the Core Duo was a mere embryo), but both sides agreed to ask for a continuance because they are back in negotiations now.

The conjecture going around is that Intel wants to protect its dominant lock on the Netbook market, which Fusion threatens, given its many-times-superior graphics.  nVIDIA's ION is a low-power device that offers graphics performance nearly competitive with Fusion for games, and superior performance in some other areas.  Intel may hope to save a bigger part of their Netbook market if they can pair up ION and Atom at a good price, which right now they cannot.

Whereas Sandy Bridge won't make much of an impact on the poor quality of Intel's own video chips, nVIDIA potentially could make a difference in the Notebook and Laptop markets, displacing more of Intel's junk with an appropriate newer chipset video of their own (the current 9100 / 9200 onboard option is pretty dismal, and Intel has already matched its poor performance with the i3 / i5 / i7 piggyback graphics).

I've refreshed this article again, although I'd still have preferred to add the changes onto what I wrote five or six months before adapting this material for this purpose.  I will always feel that the first effort was probably the better one. 

Gorath
-

Edited by Gorath Alpha, 18 August 2011 - 06:14.


#2
Gorath Alpha
  • Members
  • 10,605 posts
If anyone in our forum is presently planning to purchase any kind of micro-mini computing device, whether Netbook or Nettop, for any use, do not buy one with an Intel video solution now. Wait just two more weeks, perhaps a little more.

This post was first written three weeks ago, and CES has gone by; we saw previews of both Fusion and Sandy Bridge there.  Intel's chipset video chips have been produced in a wide variety lately; however, until about three years ago, I think, all of them lacked several very basic functions that gaming CARDS began featuring ten or twelve years ago, after nVIDIA's first "Riva" cards arrived on the scene and GPUs began to include a Transform and Lighting unit internally.

Even when the Intel X3100 chipset chip appeared, it took Intel eighteen months or so to activate all of its functions in drivers.  And many of the laptop producers, in particular, have continued buying the older and less expensive chipset pair instead, so that brand new mobile computer devices still have the same stone-age video (and the Atom has not been matched up with any recent Intel video chips, so Intel's Netbooks have remained in the same primitive state).

When the "i" series of Intel Core Duo Multicore CPUs began to appear, they had an upgraded version of the very latest chipset chip riding along piggyback inside of the processor's packaging, where it shared some of the large RAM cache, and for the first time in all history, was competitive in raw speed with Chipset chips from real graphics engineers at AMD and nVIDIA.  That does not mean the i-CPUs can be used for gaming without truly HUGE compromises, however, as is also true of the AMD and nVIDIA Chipset video chips.

With Sandy Bridge, most of the improvement has gone into the actual CPU side, where some 10-20% added efficiency has been achieved.  However, instead of merely being a separate device riding along, the video support in Sandy Bridge is supposed to have been fully integrated into CPU functioning, giving it new advantages it didn't have while piggy-backing (it's still essentially the same relatively crude video device, however).

Therefore, this does *NOT* mean it is a game-capable option, unless the game settings are seriously crippled to allow it to be used.  According to AnandTech's tests, it is as fast for some things as the Radeon HD 4200 / 4250 pair of chipset video chips that formerly held the top rank among the onboard video crowd, and it even matches AMD's least capable real card in the HD 5000 series (a poor card for certain), the 5450, on some benchmarks.

The biggest news out of CES for game players (IMO) is that Microsoft will support ARM, and that nVIDIA is building its own ARM processor, so it won't be left behind by AMD's Fusion (which blows past Sandy Bridge, with better battery life, less waste heat, and better video graphics).

Gorath

Edited by Gorath Alpha, 18 August 2011 - 06:00.


#3
Gorath Alpha
  • Members
  • 10,605 posts
The usability of non-Fusion mobile computers for gaming remains tied to real graphics from real gaming companies. Only the Llano, which is a lower priced system than comparable Intel chipsets for the in-between mobile class (bigger and faster than Tablets and NetBooks, but smaller and less speedy than full-power mobile gaming machines), will offer game-level graphics without a dedicated CARD. A card is a separate circuit board. Intel still doesn't make CARDS.

If it's not already recognized, the fact is that BioWare does not offer official support for laptop PCs.  This is sort of a rock-and-a-hard-place situation, since the market these days involves two mobile computers being sold for every desktop, and that statistic includes business purchases.  Among private purchases, mobile units probably outsell desktops by seven or eight to one.  It's also a fact that the graphics in mobile systems now average far lower performance than at almost any other time. 

Mobile versions of graphics cards start off at a disadvantage of 10-20% detuning to save power.  Then there is the common situation of using the "wrong" desktop card names and / or the wrong family.  The standards for mobile graphics are practically non-existent at the source level (AMD, Intel, nVIDIA, VIA / S3, SiS).  Add the fast-and-loose adjustments that the laptop makers make to the suggested specifications, and you have chaos, and game developers can't deal with that. 

A desktop Radeon should be an n600-plus for gaming (HD 5670, HD 6670, etc.), and a desktop GeForce should be either a GTS or a GTX (GTS 450, GTX 550), and upward from those.  For a laptop, assume you need the Radeon HD 5770 mobile or the GeForce GTX 560 mobile.  

P.S.  I know that the core material of the first comment in the discussion that says "Hold onto your money" is the same as the first comment in this thread.  I lost track of the older version of this particular discussion and borrowed from that one. 

Edited by Gorath Alpha, 18 August 2011 - 09:44.


#4
yiffy99
  • Members
  • 7 posts
Thanks for the info, although there are also 'integrated' ATI and GeForce chips around.  I was once tricked by those, as I thought I was buying discrete graphics.

#5
Gorath Alpha
  • Members
  • 10,605 posts
Outside of the smartphone, Netbook, and tablet market, nVIDIA's chipsets are no longer a factor in x86 computing (full-power laptops and desktops, including Apple's better products at Apple's outrageous prices). AMD still produces a variety of chipset video systems, with the HD 4200 / HD 4250 having gotten pretty long in the tooth by now.

They have concentrated on the various Fusion APUs for the past three years, and then had their two fabs fall down on the job a year ago, so they had to really scramble to have something still relatively new, but still on the older 40 nm wafers, which threw Bulldozer a year behind schedule, both with and without integrated video in the package.

While AMD was working hard on Fusion, Intel's Sandy Bridge video (the better of the two chips) caught up with the HD 4200 on almost all features / functions, and even matches their HD 5450 on the salient points (although a 5450 is really a poor thing, hardly a step above the older HD 4350). All of the really low-end stuff, like the GeForce 205 / 210 / 310 and the HD 5450, is really bad for games and often ends up being damaged, it seems, when used for games anyway.

#6
Gorath Alpha
  • Members
  • 10,605 posts
Intel still does not sell any real video CARD for 3D game playing, and only the Llano APU offers game capability among integrated-type hardware. It is within the realm of possibility to tweak and twiddle until an Intel chip "sorta" works with DA:O, but there is a very real chance of cooking the entire system by doing so.