PC Hardware Basics for Game Playing


86 replies to this topic

#51
Gorath Alpha
  • Members
  • 10,605 posts

Gorath Alpha wrote...

If it were a matter of catering to the majority, the games would never have advanced past the level of 4-color CGA graphics at 320 by 200 pixels (the standard in 1981).  Throughout the 29 years that x86 personal computing has been the standard, something on the order of 80 to 90% of the systems in service have always been below the level owned by the game-oriented people, the very people the developers have looked to for sales. 

That's always been true, and a constant stream of raw noobs keeps making the same mistakes.  In today's case, it's back to ignoring the stickies and not being willing to exert any effort to figure out how to include the standard information. 

G

Edited by Gorath Alpha, 06 December 2010 - 01:52.


#52
Gorath Alpha
  • Members
  • 10,605 posts

ohupthis wrote...

. . . I think you Moderate on a forum specifically for processor issues?

No, only on video issues, and on general tech subjects.  Meanwhile, I wanted to recycle this. 

Last week, the noob arguing with me and wanting to ignore having a bad video card was using a Radeon HD 2400, a business-only card from four years ago.  This week, it is a much worse POS, the defectively designed Geforce 7300 GS, being promoted as something worth more than the thirty cents' worth of precious metals it can be melted down to yield. 

I've never included any of the junk like the 2400 and 7300 in the performance lists. 

For some reason, DA seems to be attracting a larger share than usual of total noobs wanting to ask about low-quality, semi-garbage video cards.  Here's the lineup of those "No-Go" parts, ranked in reverse order of ineptitude (best is first, worst is last, and Intel, naturally, isn't in the running at all):

HD 4550 > HD 3470 > HD 4350 > 9400 GT > 8500 GT > HD 2400XT > X1300 Pro > X1550 > 7300 GT > 6600 Vanilla > HD 3450 > 9300 GS > 8400 GS > 7300 GS > HD 2400Pro > 8300 GS > X1300 > 7300 LE > X550 > 7200 GS > X300 > 9550 > X1050 > X300 SE > Xpress200 (IGP) > 7100 GS > 6200A.
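If you aren't sure which chip is in your machine, here is a minimal sketch (my own illustration, not part of the original lists) of one way to look up the adapter name on Windows and check it against the lineup above. It assumes a stock Windows install where the wmic tool is on the PATH, and the NO_GO_PARTS set below is only an abbreviated sample of the full list:

```python
# Illustrative helper only: query Windows Management Instrumentation for the
# installed display adapter(s) and flag any that match an (abbreviated)
# sample of the "No-Go" list above.
import subprocess

def get_video_card_names():
    """Return the names of the installed display adapters, per WMI."""
    output = subprocess.check_output(
        ["wmic", "path", "win32_VideoController", "get", "name"],
        text=True,
    )
    # The first non-blank line is the column header "Name"; skip it.
    lines = [line.strip() for line in output.splitlines() if line.strip()]
    return lines[1:]

# Abbreviated sample; extend it with the full lineup above for real use.
NO_GO_PARTS = {"HD 4550", "HD 3470", "HD 4350", "9400 GT", "8500 GT",
               "7300 GS", "8400 GS", "7300 LE", "7200 GS", "6200A"}

for name in get_video_card_names():
    flagged = any(part in name for part in NO_GO_PARTS)
    print(f"{name}: {'on the No-Go list' if flagged else 'not on the No-Go list'}")
```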

Edited by Gorath Alpha, 06 December 2010 - 01:54.


#53
Gorath Alpha
  • Members
  • 10,605 posts
Today, the bad-idea graphics complaint involved one of the nVIDIA IGPs that were based on roughly half of the 6200 / 6200A (the worst actual card ever made that more or less handles pixel shader SM-3, if we make allowance for the various defective aspects it shares with its brethren at the low end of the 6n00 and 7n00 generations). The 6100 and 6150, and the "SE" versions of those, have only half of the shader and texture units that the 6200s had.

OK, it's half an hour later, I've just had a chicken salad sandwich and a fruit cup for lunch, and I decided to be sure I'm clear with the readers when I aver that the Geforce 6n00 generation was in fact the first one with "native" SM-2 and SM-3 pixel shaders, even if those were screwed up on the 6200s and some of the 6600s that year. 

The infamous FX 5n00 generation was going to be the "better idea" that sidestepped Dx9's shader processing in favor of a more complex, and potentially enriched, set of shaders that would be "SM-4" before there was any SM-4, more or less (the head of nVIDIA and Microsoft were having a spat and suing each other just then). 

The GF3s and GF4s, other than the MXes, had already included Dx8's pixel shader functions (SM 1.1 and 1.3), but the FXes didn't start out even able to do those as native functions.  All Dx8 / Dx9 shaders had to run in the conspicuously slow advanced shader system, making the FXes a great deal slower at Dx8 than the GF3s had been, and much, MUCH slower than ATI's Radeon 9500 / 9700 pair was in Dx8.  The FX 5800 was so loud, and looked so large with the first-ever "two-slot" cooling shroud attached, that it got the nickname "Dust Buster" right away. 

The entire series appeared to be a disaster, with the exception of the FX 5600, which did have a native Dx8 shader function included, but still wasn't competitive with the Radeon 9500 Pro.  nVIDIA worked extra shifts to recover from the compounded errors, withdrew the FX 5800, and revamped the FX 5200 / FX 5500 / FX 5600 to improve their Dx8 and Dx9 capabilities as much as possible.  For the FX 5700 and FX 5900 / 5950, that meant literally grafting part of the GF4 silicon onto the FX design, resulting in a very large chip that ran very hot and still wasn't competitive at running Dx9's pixel shaders. 

Edited by Gorath Alpha, 16 August 2010 - 05:17.


#54
XX Morrigan XX
  • Members
  • 2 posts
*I'm a bad dog and will need a vacation if I spam any more threads*

Edited by Eurypterid, 19 August 2010 - 09:03.


#55
Gorath Alpha
  • Members
  • 10,605 posts
The awaited Fermi-based Geforce GTS 430 and 450 cards that had been rumored to be coming this month seem to have been anticipated too soon.  Those two may be released in September. 

(Edited a month later: the GTS450 did show up, aimed at the higher end of the Mainline range, but the GTS430 is an entirely different kettle of fish than its name indicates, about which more will appear here later.) 

ATI had been planning to release only a new range of laptop VPUs in October, with their desktop generation to come out in early 2011, along with the "Fusion" APUs.  Their planned move from 40 nm at TSMC to a smaller process node, however, had to be reset further into 2011.  TSMC's progress with 32 nm hadn't been satisfactory, and the foundry elected to skip 32 nm and go straight from 40 to 28 nm. 

The current AMD plan is to issue an upgrade series for Evergreen (HD 5n00) on the existing 40 nm process, to be called the "Southern Islands" family, with the "Northern Islands" full-scale new generation in the first quarter of 2011, or at such time as TSMC's 28 nm foundry is ready to go.  (Edit here: two different foundries -- TSMC isn't doing the Fusion.) 

The Southern Islands chips are now supposed to be ready in late October or early November (the first two are refreshed versions of the HD 5750 and 5770, but named a bit confusingly, as the HD 6850 and 6870).

Gorath
-

Edited by Gorath Alpha, 06 December 2010 - 01:59.


#56
Tyrax Lightning
  • Members
  • 2,725 posts
A reminder for those of you who, like me, have crudtacular Memory: Remember to keep up on your routine Air Can Maintenance! :wizard:

#57
Gorath Alpha
  • Members
  • 10,605 posts
Very interestingly, nVIDIA also chose the Mobile market to release their Fermi-descended Low End and Mainline parts into first, as AMD had been planning to do (I'll have to get used to using "AMD" for both CPUs and GPUs -- the old ATI brand name is being dropped soon). 

When we eliminate business buying from the statistics, private individuals have been buying more "Mobile" types of computers than desktops for several years now, although only a very few of the producers have invested much effort into developing laptops that can play games through long, grueling play sessions.  Most just don't have the cooling capacity to do that. 

AND, let us never forget, the current production standards for laptops totally ignore any option for either CPU or GPU upgrades.  It is simply far too expensive to assemble them any other way.  Everything electronic is permanently soldered together; everything else is glued or welded together.  There is no longer any accessible "inside" area.  Trying to separate the mainboard from the chassis simply destroys both. 

Intel is the 900-pound gorilla of laptops and netbooks, getting about 98% of the video device coverage, with both AMD and nVIDIA getting almost nothing.  nVIDIA is putting a great deal of effort into winning part of the handheld smartphone video market, the netbook market, and most especially, it would seem, the laptop market.  I think that AMD is also looking hard at Intel's dominance in laptops, and making moves, particularly with Fusion, in that direction. 

The only question here and now is whether one or the other, or both, can convince the producers to follow their lead.

(And in closing, let me add my own "Amen" to the comment about regular air can dust blowouts!) 


Gorath
-

Edited by Gorath Alpha, 16 January 2011 - 01:25.


#58
Tyrax Lightning
  • Members
  • 2,725 posts
Weird, why is ATI changing into AMD? Was having 2 Names for their stuff confusing people?

I can't help but wonder if AMD really has anything to worry about from Nvidia. AMD is more affordable, and in times like these we need cost effectiveness more than power. I'm happy with the performance I had years ago from my Nvidia GeForce 6800, but my current ATI Radeon HD 4850 is rocking the house too! It seems to me like neither is a bad brand, but ATI is currently more affordable. Nvidia's probably just panicking & trying to out-attention ATI.

(LOL, it looks like it's not gonna be easy for me to say 'AMD' instead of 'ATI' either! :P)

Edited by Tyrax Lightning, 09 September 2010 - 03:19.


#59
Gorath Alpha
  • Members
  • 10,605 posts
The second paragraph of the main article in the opening message here already includes a reminder that the "Newness" factor is NOT a criterion in determining the performance level of any PC.  We had a new arrival yesterday who seemed to be misinformed about what it means to buy a laptop that runs any version of the Windows 7 operating system, as if that were some way of determining that it performs well. 

Along those lines, I had promised to mention the Geforce GTS 430 again, and it is being released this week, so here goes.  Ever since the GTX280 was named, nVIDIA has used the middle digit to "sort of" function the way the second digit did when they used four numerals in their card names (the several "100"-named cards that were renamed from the 9n00 repeat generation fell well outside the pattern the 200 and 300 naming followed).  They haven't been disciplined about it, so in many cases it's been misleading, but a "10" has been roughly equivalent to a "300", a "20" to a "400", and a "30" to a "500".
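Put as a rough lookup (my paraphrase of the pattern above, not anything official from nVIDIA, and deliberately leaky since they haven't applied it consistently):

```python
# Rough correspondence between the middle digit of a three-digit name
# (GT430 -> "3") and the second digit of the old four-numeral names
# (9500 -> "5"), per the pattern described above. Unofficial and leaky.
MIDDLE_TO_OLD_SECOND_DIGIT = {"1": "3", "2": "4", "3": "5"}

def old_style_tier(new_name: str) -> str:
    """E.g. 'GT430' -> 'x500-class', following the rough pattern."""
    digits = [ch for ch in new_name if ch.isdigit()]
    if len(digits) != 3:
        return "no clear equivalent"
    old = MIDDLE_TO_OLD_SECOND_DIGIT.get(digits[1])
    return f"x{old}00-class" if old else "no clear equivalent"

print(old_style_tier("GT430"))  # x500-class, i.e. Low End territory
```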

Some of the cards with "30" numbers have been usable for gaming, although others have not.  The GT240 is particularly misleading, with a foot on each side of the Low End boundary.  The primary GT240 comes with the GDDR5 VRAM used in most recently produced Mainline cards, but the budget card with the same name, and 20% poorer performance, has DDR3 instead. 
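To see roughly why the DDR3 twin falls behind, run the standard peak-bandwidth arithmetic: effective memory clock times bus width in bytes. The clock figures below are approximate reference-card values from memory, not numbers from this thread, so treat this as a sketch:

```python
# Back-of-the-envelope peak memory bandwidth; clock figures are approximate
# reference-card values, not exact specs.
def bandwidth_gb_s(effective_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s: clock (MHz) x bus width (bytes)."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

gddr5 = bandwidth_gb_s(3400, 128)  # GDDR5 GT240: ~54.4 GB/s
ddr3 = bandwidth_gb_s(1800, 128)   # DDR3 GT240:  ~28.8 GB/s
print(f"GDDR5: {gddr5:.1f} GB/s, DDR3: {ddr3:.1f} GB/s")
print(f"The DDR3 card gives up {1 - ddr3 / gddr5:.0%} of the raw bandwidth")
```

Raw bandwidth is nearly halved; the in-game penalty is closer to the 20% figure above because not every frame is bandwidth-bound.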

The GTS430 is misnumbered, and should be a "GT420" instead, at least when it comes to speed.  It is about the same level of card as a Radeon HD 5550 (a little below the borderline between Low End and Mainline) or an HD 5570 (a little above it). 

Edited by Gorath Alpha, 12 October 2010 - 08:11.


#60
Tyrax Lightning
  • Members
  • 2,725 posts
Yeah, it's sad to see the computer illiterate ripped off by Stores. & the Store Personnel can sleep at night afterwards...

#61
Gorath Alpha
  • Members
  • 10,605 posts
There are retailers whose employees' actions are unforgivable; however, I'm afraid that the primary reason we see so many ignorant new game buyers is the combination of a badly failed education system and societal shortcomings.  Computers and PCs have been part of our culture for so long that they are taken for granted by the masses, who really should know better, but don't even try to think about the difference between an appliance such as a toaster or a radio and the extremely complicated machine that a PC actually is. 

When it comes to the single most important part of a PC for game play, you can look in Intel's direction.  Intel just doesn't pay much attention to games, and since they dominate the laptop industry, and laptops dominate home buying, there just isn't much bragging about or highlighting of graphics in the promotional materials. 

Both AMD and Intel are heading toward integrating some kind of graphics into the CPUs used for laptops, although Sandy Bridge will amount to a mere shadow of AMD's Fusion processors where graphics are concerned.  I've already mentioned that nVIDIA dove into mobile versions of their Fermi before expanding into the 450 last month and the 430 this month, and has hyped those mobile designs with enthusiasm.  They are more likely to be able to compete with Intel for graphics share on systems smaller than laptops (cellphones, tablets, or Netbooks, using an ARM processor) than with anything they can offer for laptops. 


Gorath

Edited by Gorath Alpha, 16 January 2011 - 01:20.


#62
Gorath Alpha
  • Members
  • 10,605 posts
I covered the important ground in the long comment immediately above.  I want to point out a potentially contributory factor that probably isn't terribly important, since it involves pretending things are different from reality at all kinds of levels.  During the past 4-5 years, more and more popular TV shows have characterized one or more of the regulars in the cast as computer whiz geeks, and without any exception I've seen, have shown those characters using laptops everywhere, not just on the road. 

That's plainly wrong.  Business uses desktops in the office, and Hollywood ignores that.  The real-life highly skilled hackers use desktops, not laptops, except when out in the field.  High-speed typing on laptop keyboards is just not practical; the keyboard feel and response rate are not conducive to speedy data entry.  Yet practically on a daily basis we see Penelope of Criminal Minds doing amazing things with a laptop, and Abby in the lab on NCIS is supposed to be using laptops exclusively. 

None of it is real, folks.  It's entertainment, just fiction.  I would like to believe that our education system isn't actually so bad that the great mass of Americans don't know it's a huge exaggeration! 


G.

Edited by Gorath Alpha, 12 October 2010 - 08:35.


#63
SSV Enterprise
  • Members
  • 1,668 posts
The computer engineering major in my college dorm uses a netbook and a laptop he bought a couple of years ago (with a 2.5 GHz Core Duo processor and a Mobility Radeon HD 3650 video card), but doesn't own a desktop. Just sayin'.

But yes, in professions where you have a stationary workplace, you will be using a desktop rather than a laptop. It's less expensive, both in the short-term purchase price and the long-term upgradeability of the platform. Of course, businesses are all about minimizing costs and maximizing profits.

Edited by SSV Enterprise, 12 October 2010 - 06:43.


#64
flagondotcom
  • Members
  • 543 posts

Gorath Alpha wrote...
The real-life highly skilled hackers use desktops, not laptops.  High-speed typing on laptop keyboards is just not practical; the keyboard feel and response rate are not conducive to speedy data entry.  Yet practically on a daily basis we see Penelope of Criminal Minds doing amazing things with a laptop, and Abby in the lab on NCIS is supposed to be using laptops exclusively. 

None of it is real, folks.  It's entertainment, just fiction.  I would like to believe that our education system isn't actually so bad that the great mass of Americans don't know it's a huge exaggeration! 

Real-life highly skilled hackers use the most appropriate tools they have available.  For initial penetration of a system or network, any laptop made in the last three years has enough horsepower to run exploits and tools (Metasploit works fine from my much less capable Dell Mini9, since graphics are not a big need).  Most hackers I know, whether white-, black-, or grey-hat, have at least one or two laptops plus two or three desktop/server systems "local", plus access to remote hardware (a system in colo, a compromised system in another country, etc.).

Sometimes it's useful to have a very small system.  If I'm doing a vulnerability and penetration test of a customer network, I may want a larger screen so that I can go onsite and show them (replicate) exactly what I've found, and also have enough screen to start building the outbrief/deliverables.  It can also be very useful to have the same environment available at work, at home, and on travel without needing remote/network access.  In the multi-floor building where I work, about 95% of the technical/management staff use laptops.

I agree with some of your generalizations, Gorath, but they're still generalizations and not all founded in reality.

#65
Tyrax Lightning
  • Members
  • 2,725 posts
I don't auto-believe what I see on TV, & I seriously hope I'm not alone in this...

#66
Gorath Alpha
  • Members
  • 10,605 posts
The real question, though, about whether that is fully pertinent right now, is whether your attitudes and preferences are still typical for your age group and educational level after the enlightening experience of learning enough to make good choices buying the components for your own first game-system build this past year.

#67
Tyrax Lightning
  • Members
  • 2,725 posts

Gorath Alpha wrote...

The real question, though, about whether that is fully pertinent right now, is whether your attitudes and preferences are still typical for your age group and educational level after the enlightening experience of learning enough to make good choices buying the components for your own first game-system build this past year.

Good point. I hadn't thought of that. I definitely cannot call myself a 'normal' person.

Before my Computer Build, I was a Computer Novice, & all the effort & learning I did propelled me up to Computer Padawan.

#68
Gorath Alpha
  • Members
  • 10,605 posts
Because of events in California, in Silicon Valley, I've been editing my several reference articles here in these forums, and did bring a couple of them nearer the top of the cyclic pile recently.  This particular thread hadn't been updated in a while, and it won't hurt for it to bump its way upward tonight. 

Two different approaches to the integration of graphics into the actual central processor functions are appearing soon.  One is a bare-minimum effort**, from the champion of bad video, Intel.  The other is AMD's far more creative and, in my opinion, more useful contribution.  Intel's Sandy Bridge is due this coming April or May, when the same basic low-quality Intel video now riding piggyback inside the processor package of the i3, i5, and i7 becomes entirely integrated into the next series of CPUs.  But "Fusion" is practically already here.

The AMD device integrates a far more capable graphics core, closely related to the Radeon HD 5n00 generation, into their multi-core CPU, and the mobile versions are already in the (figurative) hands of Netbook, Notebook, and laptop manufacturers, with the PCs using them originally expected around the turn of the year or perhaps at CES 2011.  The desktop Fusion APUs were expected in mid-February.

The latest news on the long-standing feud between Intel and nVIDIA, over whether nVIDIA's contract with Intel included the right to design chipsets for the newer Intel CPUs, was that it was about to go to court after six years of wrangling (Core Duo was a mere embryo when it started), but both sides agreed to ask for a continuance because they are back in negotiations now.

The conjecture going around is that Intel wants to protect its dominant lock on the Netbook market, which the Fusion threatens, given its many times superior graphics.  nVIDIA's ION is a low-power device that offers graphics performance almost competitive with the Fusion for games, and superior performance in some other areas.  Intel may hope to save a bigger part of their Netbook market if they can pair up ION and Atom at a good price, which right now they cannot.

** (Added in edit, Jan 3rd:  Previews of the Intel Sandy Bridge are appearing now, and the mobile versions are being compared to the ATI HD 4200 and 4250; it's supposed to be better than those.  According to AnandTech's reviewer, one of the Sandy Bridge laptops was even able to run Mass Effect 2 as well as the Mobile HD 5450 does, which surprises me, given the disparity in shader processors.)

Why is this pertinent here?  Fusion is going to be available for standard laptops at a much lower cost than a discrete GPU plus a CPU without graphics of similar quality, and no one else is going to have anything for laptops that competes.  Private ownership of PCs is concentrated in laptops, not desktops.  Initially, the business grade, like an HD 5450, will probably be priced at almost what similar AMD processors without graphics have cost.  That could really put a dent in nVIDIA's sales of chipsets for AMD processors, and of discrete cards like the Geforce G210.

Although pricing isn't being discussed yet, the presumption is that the difference between an APU with business graphics and one with the equivalent of HD 5570 graphics integrated will be relatively small compared to a card, probably less than $15 to the OEMs, translating to maybe $25 retail (my guess there).

THOSE APUs will run games such as DAO & ME2 without any separate GPU card, which is why this is significant to this article.  We can expect to see what the Mobile Fusion products will look like at CES (Consumer Electronics Show) 2011, which opens on January 6th.  Until mid-December, it had been my understanding that products would actually be shipping by January 6th. 

The current expectations put the shipping date closer to the end of January, 2011.


Gorath

Edited by Gorath Alpha, 03 January 2011 - 07:57.


#69
Tyrax Lightning
  • Members
  • 2,725 posts
Many thanks for the news, Gorath! :happy:

#70
Gorath Alpha
  • Members
  • 10,605 posts
In the bad old days, before ATI got its act together, they were often late getting product onto retailers' shelves, and gave away a lot of potential advantage to nVIDIA that way.  They were also once in bad repute for the length of time it took them to match new video drivers to their latest graphics generation.  AMD hasn't had that problem to nearly the same extent. 

But the plans for their 32 nm products have been delayed by both GlobalFoundries' (recently) and TSMC's problems making that next upgrade from 40 nm.  Intel has gotten Sandy Bridge out the door, but so far the Fusion mobile devices are only available for previews, with nothing yet for the desktop APUs. 

#71
Gorath Alpha
  • Members
  • 10,605 posts

Lucien the Red wrote...

Upon having a friend of mine look at the specs of the computer and the game, I have come to the realization that the computer does not fulfill the requirements of Dragon Age for its video driver. Off to the store I go.

Actually, you don't "buy" drivers, as such; those are free. There will be a CD in the box with your new graphics card, and as per these posts, we all hope you don't go off half-cocked and come back with something almost as unusable as what you have now.

social.bioware.com/forum/1/topic/58/index/621378/2#5733233

social.bioware.com/forum/1/topic/58/index/5600000

Edited by Gorath Alpha, 16 January 2011 - 09:36.


#72
Gorath Alpha
  • Members
  • 10,605 posts
Laptops are still outselling desktops, and the percentage of the total sold with actual game-capable graphics in them keeps falling lower and lower.  Someone brand new to the forum just wrote that a laptop with an nVIDIA 8200 chip handling video was purchased for playing games on.  An 8200! 

I hope it was really, really cheap! 

(Not quite an hour after I posted this comment, I want to add that I know, and Moondoggie does, and various of our game-playing readers here know, that there certainly are a few laptops sold with most of the right bells and whistles, but not necessarily all of the grunt and groan that should go with them.  The average laptop doesn't need much in the way of cooling for ordinary day-to-day activity, and all too frequently the factory is still selling the same inadequate heat sink and ineffectual fan that was used for the laptop with no graphics card at all, yet blithely advertises it as a "gaming" machine because it has this or that graphics part added, with a name similar to those used by desktop video cards. 

And it will keep on stalling, slowing down, and shutting off, because it overheats badly.  Toshiba, I believe, is alone in the laptop market (at least at the "usual" prices seen in the retail stores) with a policy of upgrading the cooling to match graphics upgraded at the factory during production.  Sager sells several excellent laptops that have the power, cooling, and warranty that go with a high-quality PC, and theirs are worth what they cost, unlike most of the Dell versions of Alienware laptops.) 
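For readers wondering whether their own laptop is in that overheating category, here is a minimal sketch (my own illustration, assuming an nVIDIA chip and that the vendor's nvidia-smi tool is installed and on the PATH) for logging the GPU core temperature during a play session:

```python
# Illustrative monitoring loop: poll the GPU core temperature via nvidia-smi
# every 30 seconds. Sustained readings in the 90+ C range under load are a
# sign the cooling can't keep up.
import subprocess
import time

def gpu_temperature_c():
    """Return the current GPU core temperature in degrees Celsius."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

if __name__ == "__main__":
    while True:
        print(f"GPU core: {gpu_temperature_c()} C")
        time.sleep(30)
```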

Take note of the sentence I included above about video card naming.  nVIDIA in particular has frequently created confusion and greatly displeased their customers with deceptive naming.  The mobile cards with names similar to popular desktop cards are very frequently based on entirely different silicon! 


Gorath

Edited by Gorath Alpha, 31 January 2011 - 07:42.


#73
Moondoggie
  • Members
  • 3,742 posts
It worries me if people are being advised that something like that is good for gaming, and even more so that convenience is being thought of as a more important factor than the system's ability to handle games at decent quality. I hope more people read about all those with laptops who can't play games and think, "Hmm, I'd like to play games; maybe I should get a desktop instead of that fashionable-looking notebook that is completely useless for playing games on."

#74
CrustyCat
  • Members
  • 290 posts
More than likely, people are going to Best Buy and listening to whatever idiot is there telling them what to buy.

#75
Tyrax Lightning
  • Members
  • 2,725 posts

CrustyCat wrote...

More than likely, people are going to Best Buy and listening to whatever idiot is there telling them what to buy.

& that kinda thing is why I built my own Computer with aid from the Forumites here! :D


My apologies if this is too off-topic, but I just found a BBC News article about defective Intel CPU chips that I figured would be of interest to us all: http://www.bbc.co.uk...nology-12354263