Way below minimum video devices = any Chipset video chips


9 replies to this topic

#1
Gorath Alpha

  • Members
  • 10,605 posts
Ignore Warning Labels at Your Wallet's Peril

For two years, possibly three, during roughly 2002 to 2005, about 70% of the desktop PCs sold did not include any dedicated 3D video device, and that figure jumped to 90% for laptops. Without that capability, those PCs are not game-capable machines. Many of my posts here at the BioWare Community (the original of this comment dates back a couple of years) cover various aspects of this. The minimum acceptable video card is a discrete add-in device with a GPU made by ATI or nVidia. Nothing from Intel qualifies.

Current laptops continue to be sold with mostly unusable video systems. All recent 3D games carry a warning label on the back, bottom flap, or side panel of the box that you should never ignore! The official minimums (especially the foolish ones for ME-1, but also ME-2) aren't, IMO, really good (practical) choices for that designation. Nevertheless, they are real video cards, while Intel hasn't even tried to produce one of those since its disastrous singleton about a dozen years ago.

For that matter, when the Mass Effect 2 game's official requirements were published, Intel's video was named very specifically as inadequate for that game (and it should have been named the same way for DA: O). Not even the Intel i5s / i7s with their combined on-package video chip (technically not IGPs) qualify as full-power mainline devices.
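For anyone who isn't sure what video device their machine actually reports, here is a rough little sketch (just an illustration of the idea, assuming a Windows PC with Python installed; none of it comes from any official requirements list) that asks Windows for the adapter names and flags the chipset-video vendors discussed above:

import subprocess

# Ask Windows (via the stock "wmic" tool) for the names of the video
# adapters it knows about.  This is only a hint; the requirements list
# printed on the game box is still the final word.
output = subprocess.run(
    ["wmic", "path", "Win32_VideoController", "get", "Name"],
    capture_output=True, text=True, check=True,
).stdout

# Vendors whose chipset video falls below the minimums discussed above.
below_minimum = ("intel", "sis", "s3")

for line in output.splitlines():
    name = line.strip()
    if not name or name.lower() == "name":
        continue  # skip blank lines and the column header
    if any(vendor in name.lower() for vendor in below_minimum):
        print(name + "  ->  chipset video, below the game's minimum")
    else:
        print(name + "  ->  compare it against the list on the game box")

On Linux the same idea amounts to reading the output of "lspci | grep -i vga" and checking the vendor name.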

P. S.  For the past three years there had been a major project at Intel to create a ray-tracing based real video card for end users ("Larrabee"). It ended up being cancelled because Intel could not match the earliest hardware prototypes with drivers (Intel's biggest weakness) and supporting software while the hardware was still more or less in the same ballpark as the ATI / nVIDIA mainline video cards, although something similar is still planned for specialty use in the scientific world, without the consumer graphics software after all. At present, IMO, Intel is something on the order of two years behind AMD's progress toward a real APU.

social.bioware.com/forum/1/topic/58/index/79841

arstechnica.com/hardware/news/2009/12/intels-larrabee-gpu-put-on-ice-more-news-to-come-in-2010.ars

Gorath
-

Edited by Gorath Alpha, 10 January 2011 - 12:16.


#2
helixrain

  • Members
  • 1 post
I know that an Intel 4 Series chipset won't run the game, but is there any way to physically change the processor?

#3
Gorath Alpha

  • Members
  • 10,605 posts

helixrain wrote...

I know that an Intel 4 Series chipset won't run the game, but is there any way to physically change the processor?

First off, you have to understand that the chipset video is totally enclosed inside what is called an "ASIC", of which there are two on an Intel-based mainboard, the "Northbridge" and the "Southbridge". You can't peel those ASICs apart. Second, the typical laptop is sold with only onboard video, and when it is assembled there is no "empty" place inside it where a real video card could go. Such laptops are effectively disposables, with no upgrade path for anything other than RAM and storage.

When the situation involves a desktop machine in a "standard" tower, not some mini or slim box, if it has components new enough to meet the game requirements (other than video), it will also have an empty add-in video card slot waiting for the upgrade. Also, most often, the presence of the add-in card is handled automatically by the computer; only the oldest PCs with onboard chips require any changes in the BIOS Setup, and those most likely wouldn't meet the other requirements anyway.
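If you are not sure whether the tower even has that empty slot, here is another rough sketch (again just an illustration, assuming Windows and Python; how much detail you get back depends entirely on what the BIOS chooses to report) that asks Windows to list the board's expansion slots:

import subprocess

# List the expansion slots the mainboard reports, so you can see whether
# an x16 graphics slot is sitting empty.  Per the WMI documentation, a
# CurrentUsage of 3 generally means "Available" and 4 means "In Use".
output = subprocess.run(
    ["wmic", "path", "Win32_SystemSlot", "get", "SlotDesignation,CurrentUsage"],
    capture_output=True, text=True, check=True,
).stdout

for line in output.splitlines():
    if line.strip():
        print(line.rstrip())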

G

Edited by Gorath Alpha, 18 April 2010 - 10:30.


#4
jakenou

  • Members
  • 3,856 posts
That all being said, it seems that hypothetically the brand new MacBook Pros should be able to rock a lot of modern games with Windows Boot Camped on it.

#5
Recnamoken

  • Members
  • 757 posts
Nothing a good laptop can't do for less money. Apple stuff isn't bad, it's just way too pricey. It's way too easy to get similar or better hardware for a lot less money. And that includes the iPods and other stuff.

#6
Gorath Alpha

  • Members
  • 10,605 posts

jkthunder wrote...

That all being said, it seems that hypothetically the brand new MacBook Pros should be able to rock a lot of modern games with Windows Boot Camped on it.

Actually, Apple has a tendency to dilute its product line with less powerful video hardware that doesn't compare well to the (more or less similarly priced) Windows gaming products such as those from Digital Storm. What I am curious about with regard to the current AMD / Apple flirtation is whether it was prompted by the imminent (early 2011) release of the AMD "APU" products.

#7
Gorath Alpha

  • Members
  • 10,605 posts
I like keeping this sort of article somewhat closer to the top, and it's overdue for its bump. 

Incidentally, if anyone wondered why I used the subject title I did (which I might edit; it currently ends with "all Intel onboard") instead of simply saying "all Intel video", it's because I recreated the message after forgetting the title of the original, and at the time Intel hadn't yet given up on every aspect of its ray-tracing graphics development idea. They had admitted that they didn't stand a chance competing with ATI and nVIDIA for a discrete high-performance card, but they were still talking about a similar proposal to match ATI's "Fusion" project.

The closest thing Intel has to ATI's current HD 4200 onboard video is an "on-package" device, with the silicon of one of their ordinary IGP designs placed inside the package alongside the i3 and i5. For the very first time it can just about match the raw speed of an nVIDIA IGP such as the 9200 / 9300, although both are slower than the ATI chip. They were saying that they would put a relatively competitive graphics chip in the CPU package, or even fully integrate it into a CPU, similar to the ATI Fusion ("APU"), but they have given up on that as well.

The only thing they are still working on is a bottom-end video chip integrated into a CPU.

http://www.tomshardw...ics,2560-5.html

Edited by Gorath Alpha, 01 July 2010 - 08:49.


#8
Gorath Alpha

  • Members
  • 10,605 posts
I have recently updated a message thread in here that included a question about an older model of the Intel Chipset Video Chip, and I thought that this "reference type" article needed a copy of the update: 

Intel's chipset video chips have been produced in a wide variety lately; however, until about three years ago (I think), all of them lacked several very basic functions that gaming CARDS began featuring ten or twelve years ago, after nVIDIA's first "Riva" cards arrived on the scene, most notably an internal hardware Transform and Lighting (T&L) unit.

Even when the Intel X3100 chipset chip appeared, it took Intel eighteen months or so to activate all of its functions in drivers. And many producers of laptops in particular have continued buying the older and less expensive chipset pair instead, so brand new mobile computers still have the same stone-age video (and the Atom has not been paired with any recent Intel video chip, so netbooks have remained in the same primitive state).

When the "i" series of Intel Core multicore CPUs began to appear, they had an upgraded version of the very latest chipset video chip riding piggyback inside the processor's packaging, where it shared some of the large RAM cache and, for the first time, was competitive in raw speed with the chipset chips from the real graphics engineers at AMD and nVIDIA. That does not mean the i-CPUs can be used for gaming without truly HUGE compromises, however, as is also true of the AMD and nVIDIA chipset video chips.

With Sandy Bridge, most of the improvement has gone into the actual CPU side, where some 10-20% added efficiency has been achieved. However, instead of merely being a separate device riding along, the video support in Sandy Bridge is supposed to have been fully integrated into CPU functioning, giving it new advantages it didn't have while piggybacking (it is still essentially the same relatively crude video device, however).

Therefore, this does *NOT* mean it is a game-capable option, unless the game settings are seriously crippled to allow it to be used. According to AnandTech's tests, it is as fast for some things as the Radeon HD 4200 / 4250 pair of chipset video chips that formerly held the top rank among the onboard video crowd, and it even matches AMD's least capable real card in the HD 5000 series (a poor card for certain), the 5450, on some benchmarks.

The biggest news out of CES for game players (IMO) is that Microsoft will support ARM, and that nVIDIA is building its own ARM processor, so it won't be left behind by AMD's Fusion (which blows past Sandy Bridge, with better battery life, less waste heat, and better video graphics).

P. S. The primary target here is Intel, SiS, and S3 onboard video solutions. It should be acknowledged that the "Brazos" low-end Fusion APUs are not meant for gaming either, and most of the "Llano" APUs do not include game-capable video power built in, although they do boost the power of less expensive add-on dedicated graphics cards, and at least one of the Llano APUs does come with borderline game capability on a par with a Radeon HD 5570.

The "Trinity" APUs contain the "FX" (Bulldozer) cores that have not been compatible with quite a few games, but they do have a great deal more potential performance.


Gorath

Edited by Gorath Alpha, 19 May 2012 - 11:27.


#9
Gorath Alpha

  • Members
  • 10,605 posts
ME-3 will be here in roughly six weeks, so new members with tinkertoy video in their laptops are wasting money on ME-1 and ME-2. ME-3 won't run right on Intel's quarter power video chips, either. The System Requirements will not include official support for them.

Given the poor quality of the average laptop and notebook PCs being sold, I would imagine that the official notes will include disclaimers about that class of computer. It is my personal opinion that laptops are still inadequate as gaming platforms, whether or not a gaming-class graphics card is included, because of their low cooling capacity. Nevertheless, I will probably still assist where I can when anyone makes what I consider to be the "mistake" of believing the hype around one "Gaming Laptop" or another.

(Edited three days later) I forgot to check the pinned threads at Social's ME-3 forum. The demo's requirements have been announced and, as usual, BioWare mangled the lists of below-requirement graphics cards.

I will not pretend to offer any support as a volunteer here for Intel, S3, or SiS video chips. If you are silly enough to buy a bad laptop, that's your mistake, not mine. Finally, this message thread is a discussion of the requirements, not a debating society invite to argue about how big a piece of junk the video solution in an Intel device is.

Edited by Gorath Alpha, 23 January 2012 - 06:38.


#10
Gorath Alpha

  • Members
  • 10,605 posts
In spite of the nudge upward that Ivy Bridge offers beyond Sandy Bridge, Intel is still offering trash for video. Laptops equipped with only Intel video are still going to overheat and be damaged when playing games. The Intel video isn't designed for the stress, and the laptops without real, dedicated video CARDS in them (98% or so of those being sold, probably more) do not have the needed cooling capacity.

This is not debatable; it is straight fact. This message thread is not for Intel owners to whine and complain in. It's to help explain why games have published requirements. The only subject open for discussion is what the requirements are (they were at least partly in error and never corrected).