Graphics card issues
#1
Posted 04 November 2009 - 12:45
Can anyone help me? Thank you.
Regards,
Gene Saika
#2
Posted 14 February 2010 - 04:08
ATI's Xpress 1100 (and 1150) were both IGPs based on the same X300 core as the Xpress 200, and none of them offered the SM3 pixel shader functions that Dragon Age at least partly requires (the high-end Xn00 cards did have the "first notch" of SM3, but not the full level of it). Incidentally, Mass Effect 1 & 2 both require a Radeon newer than the X850, with every part of the SM3 pixel shader functionality.
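If you want to check what your own card reports, here's a minimal sketch of a Direct3D 9 caps query (untested here, and assuming the DirectX 9 SDK headers and a Windows/MSVC build) that tells you whether the driver exposes full SM3 pixel shaders:

```cpp
#include <cstdio>
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

int main()
{
    // No device or window is needed just to read the adapter's capabilities.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("Direct3D 9 is not available.\n"); return 1; }

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        // PixelShaderVersion packs major/minor; D3DPS_VERSION builds a comparable value.
        if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
            std::printf("Pixel Shader 3.0 or better: the SM3 requirement is met.\n");
        else
            std::printf("No full SM3 pixel shaders: expect SM3 games to refuse to run.\n");
    }
    d3d->Release();
    return 0;
}
```

On the X800-class cards mentioned above, that query should come back as 2.x (the "first notch"), which is exactly why they fall short.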
Like almost any other "IGP" (Integrated Graphics Processor), even if the OP's onboard chip had been the Radeon X2100 / X2200 device based on the X1300, it would have been too slow, whether it had the full range of SM3 shader functions or not. Ordinary IGPs are basically the bare minimum, meant only for limited graphics requirements.
The reason the word "ordinary" was used just now is (was) the comparatively imminent arrival of a variety of new types of video devices that differ both from ordinary onboard chips and from the current add-in graphics "packages". Intel has been first to market with those, but isn't really offering that much yet.
www.xbitlabs.com/articles/video/display/intel-hd-graphics.html
Both AMD and Intel have been planning to eventually sell what AMD refers to as an APU (Accelerated Processing Unit), which will eventually integrate a full power GPU into the CPU itself. The laptop versions were forecast to be available for AMD end-user purchase around February 2011. Intel currently sells i3s, i5s, and i7s with a different onboard arrangement: the video chip isn't integrated at all; it shares the packaging with the CPU. It was upgraded somewhat, and with the improvement in CPU-to-video-device communication, is able to match the nVIDIA IGPs on some benchmarks, for the very first time.
But it's not at all a full power GPU. That's still in the future now, for AMD. For the sake of our gaming community here, I chose to consider the Mainline Game Level ("Medium") as the full power level, making High End "Performance Grade" video into "Double Power" for here and now.
Edited: About three months had passed since this short article was added to the Dragon Age Tech database, and Intel ended up surrendering most aspects of their long-range ("Larrabee") plan to compete with AMD in this regard. They will go on with what they have now, a quarter-power video device that doesn't compare with the real thing, and integrate it fully into some future processor designs ("Sandy Bridge", forecast as due next April or May).
That is far different from their original plan, and from what AMD planned to be doing. I have recently updated several reference-article posts here in the Dragon Age PC Tech Forum, and a couple of threads of a similar nature in the ME2 Tech Forum, about the Netbook, Notebook, and Laptop APUs that we should be able to see in various mobile devices around the turn of the year (it was December 8th as I made that last edit).
Gorath
-
Edited by Gorath Alpha, 20 May 2011 - 11:13.
#3
Posted 14 February 2010 - 06:41
#4
Posted 14 February 2010 - 07:13
From a budget-restrained buyer's point of view, the latest IGPs are miles out in front of where that class had always been, and both of the 3D players' offerings are an amazing value. The extra cost to the end user of a decent (non-Intel) IGP over total crap is pocket change! Nevertheless, I would never favor seeing game developers cater to that lowest common denominator when planning their next game!
Trying to guess what the eventual fallout from the release of APUs might be will be great fun over the next couple of years. Right now, AMD says it will be putting the equivalent of the HD 5750 / 5770 in one of the APUs, and the 5550 / 5570 in another. They claim that they aren't doing a Clarkdale and using anything like a "5350", or worse. There will still be some market for high-dollar video cards, because I can't visualize (given current socket and edge connector designs) swapping APUs with quite the same lack of concern as now attends swapping video cards.
I hope that missing reply doesn't turn up now, as lengthy as I've made this followup!
P.S. It's ten months later, and for the mobile segment at least, the information I had then isn't quite what we are going to see in less than a month now. Supposedly, it's much more like an HD 5450 than an HD 5570.
Gorath
-
Edited by Gorath Alpha, 08 December 2010 - 07:16.
#5
Posted 14 February 2010 - 07:43
#6
Posted 16 February 2010 - 04:08
Here's an interesting, although getting old now, discussion of what nVIDIA's CEO was saying publicly about the subject two years ago:
www.techreport.com/discussions.x/14538
Now, while that's getting pretty "old", this guy is very vocal in public about everything else going on and has something to say about almost every aspect of the PC business. Yet Google didn't find another mention of this subject from him before or since, while both AMD and Intel have issued fairly frequent updates during that period.
Gorath
-
Edited by Gorath Alpha, 16 February 2010 - 04:46.
#7
Posted 16 February 2010 - 04:24
Lol, in Techno-Time, 2 years = 2 centuries.

Gorath Alpha wrote...
It's my feeling that the high end graphics will end up being more expensive as a result, and the middle will be where the APUs dominate. IGPs will still go on as bottom feeder hardware.
Here's an interesting, although getting old now, discussion of what nVIDIA's CEO was saying publicly about the subject two years ago:
http://www.techrepor...ussions.x/14538
APUs will probably be lucky if they manage to be useful to even businesses.
#8
Posted 17 April 2010 - 02:10
The author is less optimistic than I am about how well ATI will pull it off, and pretty much in parallel with me about Intel's chances of doing as well any time in the near future. It's still my thought that ATI has an excellent chance of dominating the most profitable GPU segment 3-5 years from now, by including actual Mainline-quality video within the CPU at a cost that will cut into every GPU level other than the topmost.
The article was written six weeks ago, which is why it mentions Fermi not being here yet:
www.tomshardware.com/reviews/future-3d-graphics,2560.html
I'd been checking to see if Tom's had an April "best VGA" article ready yet, and whether that article included Fermi benchmarks. The answer was no on both counts.
The biggest reason I see for a potential ATI win at the Mainline level is that, ever since the HD 4n00 generation, the Mainline level is what ATI designs for first and then scales UP from to reach High End. That gives them much smaller chips and a much higher ratio of good chips from each wafer, while nVIDIA went the other way (the way ATI had previously followed), to a truly huge and complicated primary High End chip that everything has to scale down from.
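To put rough numbers on that wafer math, here's a toy sketch. Every figure in it (die sizes, defect rate) is an illustrative assumption of mine, not real ATI or nVIDIA data, and it uses the simple exponential defect-yield approximation:

```cpp
#include <cmath>
#include <cstdio>

// Toy yield comparison: a mainline-sized die vs. a huge high-end die.
// All numbers below are illustrative assumptions, not real foundry data.
int main()
{
    const double waferArea  = 3.14159 * 150.0 * 150.0; // 300 mm wafer, in mm^2
    const double defectRate = 0.004;                   // assumed defects per mm^2

    const double dieAreas[] = { 170.0, 570.0 };        // "small" vs "huge" die, mm^2
    for (double a : dieAreas)
    {
        double grossDies = waferArea / a;              // candidates per wafer (ignores edge loss)
        double yield     = std::exp(-defectRate * a);  // simple Poisson defect model
        std::printf("%.0f mm^2 die: ~%.0f candidates/wafer, ~%.0f%% yield, ~%.0f good dies\n",
                    a, grossDies, yield * 100.0, grossDies * yield);
    }
    return 0;
}
```

With those made-up numbers, the small die nets roughly 210 sellable chips per wafer against roughly 13 for the huge one: about sixteen times the good parts from a chip only 3.4 times smaller. That is the whole economic case for scaling up from Mainline rather than down from High End.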
Gorath Alpha wrote...
Both AMD and Intel will eventually be selling what AMD refers to as an APU (Accelerated Processing Unit), which integrates a full power GPU (my definition) into the CPU itself. Those will be available for end-user purchase in about two years. Intel currently sells some i3s and i5s with a different onboard arrangement. The video chip isn't integrated at all; it shares the packaging with the CPU. It was upgraded somewhat, and with the improvement in CPU to video device communication, is able to match the ATI and nVIDIA IGPs on some benchmarks, for the very first time.
But it's not at all a full power GPU. That's still some time in the future now, for both companies. For the sake of our gaming community here, I chose to consider the Mainline Game Level ("Medium") as the full power level, making High End "Performance Grade" video into "Double Power" for here and now.
Gorath
-
Edited by Gorath Alpha, 17 April 2010 - 02:46.
#9
Posted 06 June 2010 - 06:38
Dragon Age Tech forum's database, thus the previous parenthetical is included for them).
Since every major aspect of Intel's plans for full-power video integration into their processors has been cancelled, I've edited the thread here to suit the new situation.
#10
Posted 25 June 2010 - 06:53
social.bioware.com/forum/1/topic/58/index/79841
#11
Posted 08 December 2010 - 07:31
What if nVIDIA convinced Microsoft to allow them to engineer a version of Windows that wasn't based on X86 code? Would they:
1. Have a chance of selling the idea to Microsoft?
2. Be able to compete with Intel and AMD on the CPU end, and with AMD on the combo APU package?
Their boss is an extremely competitive guy. You might not believe what he went through on his arrival in this country as a pre-adolescent, and he came out of it a winner. He tries to hire people with that same kind of drive. The next couple of years might be even more interesting than I'd been thinking they would be! And along the lines this (above) comment was exploring comes the single most interesting (to me) announcement at CES 2011:
Microsoft will support the ARM processor with a Windows version. nVIDIA already had an ARM license, and has already engineered its own version.
In other regards, for Sandy Bridge and Fusion, CES was about as I expected.
Intel's chipset video chips have been produced in a wide variety lately; however, until about three years ago (I think), all of them lacked several very basic functions that gaming CARDS began featuring ten or twelve years ago, when nVIDIA, fresh off its first "Riva" cards, built a hardware Transform and Lighting (T&L) unit into the GeForce.
Even when the Intel GMA X3100 chipset chip appeared, it took Intel eighteen months or so to activate all of its functions in drivers. And many laptop producers in particular have continued buying the older and less expensive chipset pair instead, so that brand-new mobile computers still ship with the same stone-age video (and the Atom has not been paired with any recent Intel video chips, so Netbooks have remained in the same primitive state).
When the "i" series of Intel multicore CPUs began to appear, they had an upgraded version of the very latest chipset chip riding along piggyback inside the processor's packaging, where it shared some of the large RAM cache and, for the first time, was competitive in raw speed with chipset chips from the real graphics engineers at AMD and nVIDIA. That does not mean the i-CPUs can be used for gaming without truly HUGE compromises, however, as is also true of the AMD and nVIDIA chipset video chips.
With Sandy Bridge, most of the improvement has gone into the actual CPU side, where some 10-20% of added efficiency has been achieved. However, instead of merely being a separate device riding along, the video support in Sandy Bridge is supposed to have been fully integrated into CPU functioning, giving it new advantages it didn't have while piggy-backing (it's still essentially the same relatively crude video device, however).
Therefore, this does *NOT* mean it is a game-capable option, unless the game settings are seriously crippled to allow it to be used. According to AnandTech's tests, it is as fast for some things as the Radeon HD 4200 / 4250 pair of chipset video chips that formerly held the top rank among onboard video, and it even matches AMD's least capable HD 5n00 real card (a poor card for certain), the 5450, on some benchmarks.
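For anyone in this thread who isn't sure whether their machine is running on one of these onboard devices, here's a similar quick sketch (same caveats as the earlier one: untested here, DirectX 9 SDK assumed) that prints what the default adapter reports about itself:

```cpp
#include <cstdio>
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Ask the default adapter to identify itself; integrated parts show up
    // with names like "Intel(R) HD Graphics" or "ATI Radeon HD 4200".
    D3DADAPTER_IDENTIFIER9 id;
    if (SUCCEEDED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        std::printf("Adapter: %s\nDriver:  %s\n", id.Description, id.Driver);

    d3d->Release();
    return 0;
}
```

If the description names one of the chipset or on-package devices discussed above, expect to turn game settings way down.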
The biggest news out of CES for game players (IMO) is that Microsoft will support ARM, and that nVIDIA is building its own ARM processor, so it won't be left behind by AMD's Fusion (which blows past Sandy Bridge, with better battery life, less waste heat, and better video graphics).
Gorath
Edited by Gorath Alpha, 10 January 2011 - 11:54.
#12
Posted 19 May 2011 - 12:31
Neither of the fabs that AMD was relying on could make the next step, from a 40 nm to a 32 nm process, on the predicted timeline. AMD had to totally redesign their APUs, CPUs, and GPUs to suit the older 40 nm fabs, and it took a lot more time than they expected; the current CPU generation is still not released, and over half of the APUs are still hanging fire today (May 18, 2011). OK, make that about a third now, June 14.
The Netbook and NetTop APU Fusion devices ended up with a (relatively) minimalist integrated class of graphics device after all. It's better than what Intel has been offering, but AMD couldn't do what they planned with the "thick" dies and their greater power requirements, so it was back to the drawing board for those. This is the prediction that DailyTech has come out with about what they call the "Notebook Mid-Market":
www.dailytech.com/AMD+Fusion+Emerges+as+Serious+Threat+to+Intel+in+the+Notebook+MidMarket/article21763.htm
I don't think that I've seen a prediction for when / if Desktop APUs are expected to be available at retail.
GlobalFoundries has their 32 nm plant running. Here's the followup article to the link above:
www.dailytech.com/AMD+Ships+Llano+ASeries+Looks+to+Punish+Intel+on+the+Budget+End/article21898.htm
Edited by Gorath Alpha, 14 June 2011 - 05:48.