As long as there are some nits to pick, let's include an "s" in Per Snickety.
www.merriam-webster.com/dictionary/persnickety

JackM wrote...
@Gorath
You're really pernickety. OK, sorry, I'll not do it again. I'll write down "graphic chipset" 1000 times in my text editor.
There is a new proposal in here, once again, about babying these things along. I reiterate that the Intel chip fanatics haven't a leg to stand on.
It's not here yet, but according to Anand Lal Shimpi, the namesake of the AnandTech web site, there will soon be a further version of the i5 / i7 series of processors in which the included onboard video gets ratcheted up another notch by giving it access to the cache RAM used by the CPU cores. It's still not going to be a gaming-capable device, but it may just finally reach the standard set by the ATI Radeon HD 4200 onboard chip (which I had expected would have been replaced by now, although it has not been, after all -- edited February, 2012).
Intel's chipset video chips have lately been produced in a wide variety; however, until about three years ago (I think), all of them lacked several very basic functions that gaming CARDS began featuring ten or twelve years ago, after nVIDIA's first "Riva" cards arrived on the scene and a hardware Transform and Lighting (T&L) unit became a standard inclusion.
Even when Intel's GMA X3100 chipset chip appeared, it took Intel eighteen months or so to enable all of its functions in their drivers.
And many laptop producers in particular have continued buying the older, less expensive chipset pairs instead, so that brand-new mobile computers still ship with the same stone-age video (and the Atom has never been paired with any recent Intel video chip, so Netbooks have remained in the same primitive state).
When the "i" series of Intel Core Duo Multicore CPUs began to appear, they had an upgraded version of the very latest chipset chip riding along piggyback inside of the processor's packaging, where it did not share any of the large RAM cache, but for the first time in all history, was competitive in raw speed with Chipset chips from real graphics engineers at AMD and nVIDIA. That does not mean the i-CPUs can be used for gaming without truly HUGE compromises, however, as is also true of the AMD and nVIDIA Chipset video chips.
With Sandy Bridge, most of the improvement has gone into the actual CPU side, where some 10-20% added efficiency has been achieved. But instead of merely being a separate device riding along, the video support in Sandy Bridge is supposed to have been fully integrated into CPU functioning, giving it new advantages it didn't have while piggy-backing (it's still essentially the same relatively crude video device, however).
Therefore, this does *NOT* mean it is a game-capable option unless the game settings are seriously crippled to allow it to be used. According to AnandTech's tests, it is as fast for some things as the Radeon HD 4200 / 4250 pair of chipset video chips that formerly held the top rank among the onboard video crowd, and it even matches AMD's least capable real HD 5000-series card, the 5450 (a poor card, for certain), on some benchmarks.
The biggest news out of CES 2011 for game players (IMO) is that Microsoft will support ARM, and that nVIDIA is building its own ARM processor, so it won't be left behind by AMD's Fusion (which blows past Sandy Bridge, with better battery life, less waste heat, and better graphics).
Gorath
-
Edited by Gorath Alpha, 09 February 2012 - 06:47.