
Don't waste money on a laptop with an Intel video chip; Fusion blows them all away


  • Please log in to reply
25 replies to this topic

#1
Gorath Alpha
  • Members
  • 10,605 posts
Don't buy a laptop with an Intel video chip: Llano is here, and Trinity is in the wings!!

Ignore Warning Labels at Your Wallet's Peril

For two years, possibly three, roughly 2004 to 2006, about 70% of the desktop PCs sold did not include any dedicated 3D video expansion slot, and the ratio was 90% for laptops (and even worse soon after).  Without that capability, those PCs are not game-capable machines.  Many of my (past) reference threads here at the BioWare Communities covered various aspects of this.  The minimum (discrete add-in) video card includes a GPU made by ATI or nVIDIA.  Nothing from Intel has ever qualified.

Once again, the brand-name PC makers are doing the same thing, leaving the real graphics parts out of many, many new PCs being sold.

Technically, although AMD's onboard video chips have been a great deal better than Intel's poor excuses for parts, they haven't been supported for gaming either.  In early 2011 that changed for the better, much better (Intel's Sandy Bridge merely brought their IGPs into the same class as the AMD and nVIDIA onboard solutions).

Current laptops continue to be sold with mostly unusable (for games) video systems.  Every recent 3D game has a warning label on the back, bottom flap, or side panel of the game's box that you should never ignore!  The official minimums for ME2, IMO, aren't really good (practical) choices for that designation.  Nevertheless, they are real video cards, while Intel hasn't even tried to produce one of those since their disastrous singleton (the i740) about a dozen years ago (getting close to fourteen years now).

For that matter, when Mass Effect 2's official requirements were published, Intel video was named very specifically as inadequate for the game (and it should have been named the same way for DA:O / DA2).  The combined on-package video chip plus CPU, Intel's i3s / i5s (technically not IGPs, just piggyback parts), also failed to qualify as full-power mainline devices.

As of the second week of January 2011, the AMD Fusion APUs were already shipping to laptop PC makers for production, and could have been in stores on or about the end of February (or at CES), so choose gift certificates if you were going to buy mobile computing devices as holiday gifts (the original of this post was written in mid-December), and include a copy of my message with the certificate.  There were going to be AMD Netbooks, budget Notebooks and, supposedly, some Laptops, all coming with decent graphics included in their APUs (full-power Laptops ended up being postponed for a year).

Intel's Sandy Bridge was rushed out in a hurry and actually beat Llano onto store shelves, but because of that rush it shipped with a flawed chipset (the "Cougar Point" SATA bug, which forced a recall) that didn't handle all of its SATA ports correctly.

For 2012, the Fusion series is being expanded from the low-power Netbook and Laptop systems into the mainline segment, with medium-quality graphics included.

Gorath
-

Edited by Gorath Alpha, 09 February 2012 - 03:11 .


#2
Gorath Alpha
  • Members
  • 10,605 posts
Game buyers continue wearing blinders when making purchases.  If you don't have a discrete video card (an entirely separate circuit board from the mainboard) with a GPU from either ATI or nVIDIA on it, you aren't playing the games the way they were designed.  The System Requirements labels on the boxes and on the digital download pages are supposed to be a warning, but you have to actually read them.

As mentioned in the opening comment of this thread, there were desktop PCs being sold by Acer, Dell, HP, and Sony without any video upgrade path five years ago.  The old AGP video bus was both complicated and expensive compared to PCIe, and leaving it out saved them plenty.  Laptop pricing was already falling a couple of years after that, accelerating the popularity of the smaller mobile machines, whether or not the portability was really needed.

But laptops never did have a great deal going for them graphically, and never were good game platforms.  The popularity of the latest mobile devices, such as slates / tablets and Netbooks, pushed Laptop pricing down once again, and with that price drop there was literally a BACKWARD move technologically, to cheaper versions of Intel's chipset video chips because they cost less.

While games slowly increased in complexity (there's a DX9 "cap" on the graphics because of the consoles), laptop video was headed the opposite direction at the lower, more popular end of the range.


Gorath

Edited by Gorath Alpha, 09 December 2010 - 05:27 .


#3
LeoInterVir
  • Members
  • 184 posts
Luckily Intel is starting to get their act together, and hopefully it will spur more productivity from both nVidia and ATI.

I personally would like to see more work on having low graphics run from an integrated source and high graphics run on a dedicated source. That saves energy and also extends the life of your graphics card. You don't need a top-of-the-line card running solitaire...
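What's described here eventually shipped as "switchable" hybrid graphics (nVIDIA's Optimus and AMD's switchable graphics do exactly this). As a rough illustration of the concept only, here is a minimal C++ sketch using the much later DXGI 1.6 call EnumAdapterByGpuPreference, a real Windows API that postdates this thread by several years; it is a sketch of the idea, not something available on the laptops being discussed.

// Sketch only: pick the integrated (low-power) GPU for light work and
// the discrete (high-performance) GPU for games, via DXGI 1.6.
// This API arrived years after this thread; it illustrates the concept.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

ComPtr<IDXGIAdapter4> PickAdapter(bool heavyWorkload)
{
    ComPtr<IDXGIFactory6> factory;
    if (FAILED(CreateDXGIFactory2(0, IID_PPV_ARGS(&factory))))
        return nullptr;

    // MINIMUM_POWER usually maps to the integrated chip,
    // HIGH_PERFORMANCE to the discrete card.
    const DXGI_GPU_PREFERENCE pref = heavyWorkload
        ? DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE
        : DXGI_GPU_PREFERENCE_MINIMUM_POWER;

    ComPtr<IDXGIAdapter4> adapter;
    if (SUCCEEDED(factory->EnumAdapterByGpuPreference(
            0, pref, IID_PPV_ARGS(&adapter))))
    {
        DXGI_ADAPTER_DESC3 desc{};
        adapter->GetDesc3(&desc);
        wprintf(L"Chosen adapter: %s\n", desc.Description);
    }
    return adapter;  // e.g. solitaire -> integrated; ME2 -> discrete
}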

#4
Gorath Alpha
  • Members
  • 10,605 posts
Intel cancelled the last vestiges of their "Larrabee" ray-tracing graphics engine a few months ago (roughly February 2010), and has nothing underway that compares with AMD's "APU" developments.  The only integration they are still working on is the same weak onboard-type design they have had for two years now, very slightly retuned when it was tucked inside the package with the i3 and i5 cores.

They announced that they would integrate their "quarter-power" device into future Core developments ("Sandy Bridge").  I call it quarter-power in relation to the mainline graphics designs from ATI and nVIDIA (which makes the business-grade cards "half power", and the high-performance cards "double power").  AMD's APU was intended to offer two options, one of them equivalent to whatever the year-old mainline card was (HD 5670, I suppose) when it was ready for production in the first quarter of 2011 (TSMC's abandonment of its 32 nm process forced the early APUs back onto 40 nm, and the higher-quality APUs simply couldn't work correctly when held back that way).

(As of early December 2010, I was editing a variety of reference articles I've written that mention the AMD "Fusion" integrated multi-core processors with planned good-quality graphics.  From some preview articles, as limited by non-disclosure agreements, I believe I'm seeing some alterations from what I'd been expecting.  Previously, I hadn't considered Netbooks as anything much for me to pay attention to, but they are part of what AMD wants to break into, and while AMD has upped the ante graphically by a great deal over what Intel's Atom chipset has, nVIDIA's "ION" video is fairly close to as good, and I think that bodes well.)

They had also planned a higher-performance APU, like the HD 5650.  Whether they will continue the dual branches with Intel effectively out of the running at a competitive level, I haven't seen any contradiction from them.  (Right now, it appears that the two graphics options for budget PCs will be equivalent to an HD 5450, which was last year's business-level card, and the HD 5570, which was last year's budget gaming card, at the bottom of the mainline bracket, on the borderline with the top end of business graphics.)

No specific release dates for Fusion had been announced in December.  About a month after the Netbooks went on sale, the Fusion APUs for Laptops were supposed to be officially in play, but Brazos had to be released on TSMC's old 40 nm process (and Llano, built on GlobalFoundries' 32 nm instead, slipped well into 2011), and they were as good as AMD was able to offer under those constraints.

Intel's "Sandy Bridge" was then (December) maybe a month, maybe two months, further out in 2011 than the Netbook Fusions, but it is still essentially the same Intel graphics as always, but with a nice speed bump, and they beat their original release date with a rush production that cost them a ton because of the defective chipsets.  .

Gorath
-

Edited by Gorath Alpha, 05 November 2011 - 05:27 .


#5
Guest_NewMessageN00b_*
  • Guests
Intel has widened SM4.0 support on the most recent integrated solutions.

http://software.inte...rated-graphics/

Even though it "kinda" supports the features, falling for these solutions is not a good idea. If you value eye-candy, that is.
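For anyone who would rather check what the driver actually exposes than trust a marketing page, here is a minimal hedged C++ sketch using Direct3D 11's feature-level negotiation (a real API; feature level 10_0 corresponds to Shader Model 4.0):

// Sketch: query the installed GPU/driver for its Direct3D feature level.
// D3D_FEATURE_LEVEL_10_0 corresponds to Shader Model 4.0.
#include <d3d11.h>
#include <cstdio>

int main()
{
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3,
    };
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;
    // Passing null device/context pointers is allowed: we only want the level.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        wanted, sizeof(wanted) / sizeof(wanted[0]), D3D11_SDK_VERSION,
        nullptr, &got, nullptr);
    if (FAILED(hr))
        printf("No hardware Direct3D 11 device available.\n");
    else if (got >= D3D_FEATURE_LEVEL_10_0)
        printf("SM 4.0 (or better) exposed: feature level 0x%x\n", got);
    else
        printf("Below SM 4.0: feature level 0x%x\n", got);
    return 0;
}

Run on one of these Intel chips, this reports whether the driver really exposes a 10_0 device or quietly falls back to a 9_x level.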

Edited by NewMessageN00b, 13 June 2010 - 03:08 .


#6
RGFrog
  • Members
  • 2,011 posts
Until game designers go back to the far better OpenGL engine, you have to stick with DirectX and Shader Model version support.

Since ME3 is Unreal-based, BW can't really do this. They're stuck with whatever the engine requires. That means graphics chips (not always discrete add-on cards) that support the spec. Intel never has, as their business model is about low-cost 2D support for the mass market that doesn't even think about playing graphics-intensive games.

What is needed is a x16 PCIe 2.0 add-on slot for laptops and/or user-removable/upgradable GPUs in laptops. The ExpressCard slot almost had it right, except it is a x1 PCIe link, which just doesn't have the bandwidth. Which is sad, as it could have been made x16 and opened up a market for external video upgrades.
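To put numbers on that bandwidth point (lane figures from the PCIe 1.x spec; the arithmetic is mine): PCIe 1.x signals at 2.5 GT/s per lane with 8b/10b encoding, so

\[
\text{x1: } \frac{2.5\ \text{GT/s} \times \tfrac{8}{10}}{8\ \text{bits/byte}} = 250\ \text{MB/s},
\qquad
\text{x16: } 16 \times 250\ \text{MB/s} = 4\ \text{GB/s},
\]

meaning an ExpressCard's single lane carries 1/16 of what a desktop graphics slot provides, before any external-cabling overhead.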

Imagine a box unit with its own power supply that upgraded your laptop to an ATI 5870. Or purchasing a monitor that included a slot for whatever PCIe video card you wanted to use, then hooked up to your laptop via the ExpressCard slot and a cable...

But, as with most things laptop, cost wins over function every time.

#7
Gorath Alpha
  • Members
  • 10,605 posts
Because of the demo (new edit: released last summer), we have more noobs with tinker toys instead of the good stuff for video, trying to run it on their too-cheap PCs (mostly laptops that cannot be upgraded), and it seems that some of these cheapskates are feeling argumentative about their crap systems.  Thus, this should be recycled toward the top again.

Edited by Gorath Alpha, 09 February 2012 - 03:15 .


#8
jbaudrand
  • Members
  • 12 posts
I just can't stop thinking how good it was to buy a computer game back when the Internet was not what it is now; it forced the devs to polish games. Nowadays they ask us for Internet access to prevent piracy, but it's just a way to cut QA testing and patch it later, once they have the money...

I'm not bashing ME in particular; the whole business runs like this now. And I'm sad about it...

#9
Gorath Alpha
  • Members
  • 10,605 posts
Daggerfall predated the popularity of the 'Net and was buggier than a hill of termites.  In those days, you used a rather slow dial-up modem to contact a company "bulletin board" and searched there for answers.  Patches took multiple hours to download, if you could manage it at all over the commonly "dirty" old copper wires that the phones still depended on.

Further back, modems cost as much as $350 - $400 and only managed "300 baud".  I bought my first Hayes modem as part of a complete (used) Commodore 64 system when I gave up on my Apple ][ in favor of IBM's PC-XT, which was a terrible gaming platform {4 colors, 320 by 200 pixel resolution, no audio (until the AdLib), UGHH!}
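To make those speeds concrete (a worked example, not figures from the post): 8N1 framing spends 10 line bits per data byte, so a 300 baud modem moves roughly 30 bytes per second, and a later 28.8 kbps modem roughly 2,880:

\[
t_{300} \approx \frac{100 \times 1024}{30} \approx 3{,}400\ \text{s} \approx 57\ \text{min for a 100 KB file},
\qquad
t_{28.8\text{k}} \approx \frac{20 \times 2^{20}}{2880} \approx 7{,}300\ \text{s} \approx 2\ \text{h for a 20 MB patch}.
\]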

Gorath
-

Edited by Gorath Alpha, 03 July 2010 - 02:43 .


#10
jbaudrand
  • Members
  • 12 posts
^^ I meant compared to my Amstrad 6128 and my Amiga 500.

#11
Gorath Alpha
  • Members
  • 10,605 posts
Hmmm? Never having even SEEN an Amstrad anywhere stateside, I have no idea what segment of the Apple-to-PC-to-Mac-to-PC-AT (etc.) evolution it fits into. I was aware of the Atari and Commodore 16-bit competition for the ATs and 386s, but given the ridiculous antics of the primary private owner (Tramiel), who swapped his ownership from one company to the other and screwed things over thoroughly at both places, I never felt I could trust their products.

Meanwhile, even though this thread was still practically on the front page, yet another new member has arrived wanting to use an Intel tinker toy in place of the real thing (and where there is one such message of complaint, the way these things work, there are 8-10 more people who never post one but pass through here with similar opinions nonetheless).

P.S. (Edit)  It had been many years since I last thought of Jack Tramiel and the damage he did to the two companies he bought into and then sold out of, and I completely missed the connection when Bethesda's Elder Scrolls series used Tamriel as one of its geographical names.  I wonder if that meant anything to the folks working at Bethsoft when Arena was created?

Edited by Gorath Alpha, 09 February 2012 - 03:18 .


#12
jbaudrand
  • Members
  • 12 posts
Gorath Alpha: the Amstrad CPCs were only available in Europe. Compared to the C64, you didn't miss a lot; the C64's audio was far superior, but the Amstrad 6128 was cheaper and, more importantly, very well documented. It was the best computer to start programming on.






#13
Gorath Alpha
  • Members
  • 10,605 posts
AMD upgraded their own onboard graphics one notch.  In addition to their HD 4200, there is also an HD 4250 now, but it's still not adequate for ME2.  Intel's Sandy Bridge had been announced as due this coming April or May, when the same basic low-quality Intel video becomes entirely integrated into the next series of CPUs.  But "Fusion" was practically already here a year ago now; then TSMC gave up on 32 nm and decided to skip to 28 nm, forcing its customers to scramble.

The AMD device integrates far more capable graphics, closely related to the Radeon HD 5n00 generation, with its multi-core CPU, and the mobile versions were already supposed to be in the (figurative) hands of Netbook and budget Notebook manufacturers, with PCs using those APUs expected about the first of the year.  The Laptop Fusion APUs were (then, last December) expected in February of 2011.

The latest news: the long-standing feud between Intel and nVIDIA, over whether nVIDIA's contract with Intel included the right to design chipsets for the newer Intel CPUs, was about to go to court after six years of wrangling (dating from when Core Duo was a mere embryo), but both sides agreed to ask for a continuance because they are back in negotiations.

The conjecture going around is that Intel wants to protect its dominant lock on the Netbook market, which Fusion threatens, given its many-times-superior graphics.  nVIDIA's ION is a low-current device that offers almost-competitive graphics performance to Fusion for games, and even superior performance in some other areas.  Intel may hope to save a bigger part of its Netbook market if it can pair ION with Atom at a good price, which right now it cannot.

Why is this pertinent here?  Fusion was going to be available for standard laptops at a much smaller cost than a discrete GPU plus a CPU, and no one else was going to have anything competitive for laptops.  Private ownership of PCs is concentrated in laptops, not desktops.  Initially, the business grade, like an HD 5450, will probably be priced at almost what similar AMD processors without graphics have cost.  That could really put a dent in nVIDIA's sales of chipsets for AMD processors, and of discrete cards like the GeForce 210.

Although pricing isn't being discussed yet, the presumption is that the difference between an APU with business graphics and one with the equivalent of HD 5570 graphics integrated into it will be relatively small compared to a card, probably less than $15 to the OEMs, translating to maybe $25 at retail (my guess there).

THOSE APUs will run games such as ME2 without any separate GPU card, which is why this matters to this thread.

Gorath

Edited by Gorath Alpha, 05 November 2011 - 05:36 .


#14
Gorath Alpha
  • Members
  • 10,605 posts
A year ago, we still didn't know that the loss of 32 nm capacity was going to affect AMD's APU and CPU plans as much as it did.  I still thought we'd see the Netbooks and budget Notebooks in January, with the Notebook APUs coming a month later, and was still saying to give out gift certificates for Laptops over the holidays.

Edited by Gorath Alpha, 05 November 2011 - 05:41 .


#15
Ryzaki
  • Members
  • 34,422 posts
Hey Gorath. Should someone wait for one of the newer computers before buying something like this: http://www.dell.com/...id=inspiron-15r

#16
Gorath Alpha
  • Members
  • 10,605 posts
1. As per the new subject line, when you don't get a real video card, it's likely to be less of a bargain compared to Fusion. That particular selection does have a Radeon card.

2. Visit Notebookcheck.net to look up the 1024 MB ATI Mobility Radeon HD 550v and find out which desktop card it is related to. The shader performance ranking chart only has desktop cards on it. When there are benches for the mainline Fusion desktop APUs, I'll add them.


#17
Ryzaki
  • Members
  • 34,422 posts
Thanks Gorath.



See, my friend was looking for a good laptop for gaming, around $1000.



Could you give any recommendations? She can't get a desktop because she doesn't have the space. She mostly wants it for playing ME/ME2 and DAO.

#18
Gorath Alpha
  • Members
  • 10,605 posts

Gorath Alpha wrote...

2. (snip) The shader performance ranking chart only has desktop cards on it. When there are benches for the mainline Fusion desktop APUs, I'll add them.

While I do own a personal laptop, it is left over from when I traveled a lot more, several years ago, and while it wasn't really as bad in its own day as the average modern laptop with an Intel chipset video chip is, it was never, IMO, worth any time investment trying to play the kinds of games I prefer on it.

It's true that there are laptops that can play games; I just don't need one.  I use desktops for game play, and believe they are the correct platform for it now.  I'm very interested in whether AMD and nVIDIA, between them, can and will remake the mobile computing market into something I will have more respect for.

There are other folks here I hear from who are really into laptops, whose opinions should get more weight than mine.  All that being said, I always look to Newegg first, at least to get a handle on what the pricing should amount to. 

(Back soon!)  OK, it's a few minutes later, and I picked one, but looking at the picture, it looks so slender and fragile, much like the MacBook Air does, that I can't imagine it can cool itself well enough for long gaming sessions!

www.newegg.com/Product/Product.aspx

Edited by Gorath Alpha, 10 December 2010 - 10:25 .


#19
Ryzaki
  • Members
  • 34,422 posts
Ooh very nice! Thanks Gorath!



Though you're right about the cooling. I'll be sure to warn her to buy a cooling pad.

#20
SSV Enterprise
  • Members
  • 1,668 posts

Gorath Alpha wrote...

(Back soon!)  OK, it's a few minutes later, and I picked one, but looking at the picture, it looks so slender and fragile, much like the MacBook Air does, that I can't imagine it can cool itself well enough for long gaming sessions!

www.newegg.com/Product/Product.aspx


:blink: That's not nearly as slender as a MacBook Air.  The Air is so thin that there's not even room for a DVD drive.

But yes, pretty much any laptop with a serious graphics card will have cooling problems.

#21
Ryzaki
  • Members
  • 34,422 posts
Yeah. I was going to suggest she save up some more and get the same one I have, because of its cooling system, but she insists that she needs one now.



:/ That's going to bite her in the ass later.

#22
Gorath Alpha
  • Members
  • 10,605 posts

Gorath Alpha wrote...

 . . . The shader performance ranking chart only has desktop cards on it. When there are benches for the mainline Fusion desktop APUs, I'll add them.

I've had to edit the original post because all we're going to get at CES in Las Vegas seems to be "previews", with actual (mobile) product shipments following later in January.  Instead of an early February release for the desktop Fusion APUs, those are now apparently due later in that month.  (Later, after CES ended, came a major edit.)

Intel's chipset video chips have been produced in a wide variety lately; however, until about three years ago, I think, all of them lacked several very basic functions that gaming CARDS began featuring ten or twelve years ago, when nVIDIA followed its early "Riva" cards with the GeForce 256 and its built-in hardware Transform and Lighting (T&L) unit.
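For context, the hardware T&L stage those cards introduced takes over the per-vertex work the CPU previously did for every vertex in every frame, roughly (a notation sketch of the fixed-function pipeline, not any vendor's documentation):

\[
v_{\text{clip}} = P \, V \, M \, v_{\text{model}},
\]

where M, V, and P are the model, view, and projection matrices, followed by the fixed-function per-vertex lighting evaluation.  A chip without this unit pushes all of that work back onto the CPU.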

Even when the Intel GMA X3100 chipset chip appeared, it took Intel eighteen months or so to activate all of its functions in drivers.  And many of the producers of laptops in particular have continued buying the older and less expensive chipset pair instead, so that brand new mobile devices still have the same stone-age video (and the Atom has not been matched up with any recent Intel video chips, so Netbooks have remained in the same primitive state).

When the "i" series of Intel Core Duo Multicore CPUs began to appear, they had an upgraded version of the very latest chipset chip riding along piggyback inside of the processor's packaging, where it shared some of the large RAM cache, and for the first time in all history, was competitive in raw speed with Chipset chips from real graphics engineers at AMD and nVIDIA.  That does not mean the i-CPUs can be used for gaming without truly HUGE compromises, however, as is also true of the AMD and nVIDIA Chipset video chips.

With Sandy Bridge, most of the improvement has gone into the actual CPU side, where some 10-20% of added efficiency has been achieved.  However, instead of merely being a separate device riding along, the video support in Sandy Bridge is supposed to be fully integrated into CPU functioning, giving it advantages it didn't have while piggybacking (it's still essentially the same relatively crude video device, however).

Therefore, this does *NOT* mean it is a game-capable option, unless the game settings are seriously crippled to allow it to be used.  According to AnandTech's tests, it is as fast for some things as the Radeon HD 4200 / 4250 pair of chipset video chips that formerly held the top rank among the onboard crowd, and it even matches AMD's least capable real HD 5n00 card (a poor card for certain), the 5450, on some benchmarks.

The biggest news out of CES for game players (IMO) is that Microsoft will support ARM, and that nVIDIA is building its own ARM processor, so it won't be left behind by AMD's Fusion (which blows past Sandy Bridge, with better battery life, less waste heat, and better video graphics).

Gorath

Edited by Gorath Alpha, 10 January 2011 - 12:07 .


#23
Gorath Alpha
  • Members
  • 10,605 posts
Here are the latest updates:

www.dailytech.com/AMD+Ships+Llano+ASeries+Looks+to+Punish+Intel+on+the+Budget+End/article21898.htm

social.bioware.com/forum/1/topic/58/index/79841/1#7421268

Edited by Gorath Alpha, 14 June 2011 - 05:49 .


#24
Gorath Alpha
  • Members
  • 10,605 posts
We've had still another new arrival complaining that ME2 doesn't run correctly on his / her Intel onboard video chip...

#25
SSV Enterprise
  • Members
  • 1,668 posts
I talked my friend into buying an HP laptop with the best mobile Llano chip (an A8-3500M with Radeon HD 6620G). It plays Mass Effect 2 pretty well at 1366x768.