The Video Card Generational Ladder Analogy


30 replies to this topic

#1
Gorath Alpha
  • Members
  • 10 605 posts
                      Climbing up the Ladders

This article isn't properly part of the three-part series on PC Performance Basics, Video Card Basic Information, and the abbreviated Video Card Ranking List for everyday use. It is supplemental, and its listing is split apart into individual cards, where my shorter list combined generational cards into series groupings, both for simplicity's sake and because that seems to be the way game developers were listing the cards they supported.

                    Multiple generational ladders

The fact is that too many people still fail to consider that the numbers in video device names are not all on a single ladder. There are actually four separate ladders. The n100 / n200 / n300 trio are almost always onboard chips, and they progress up their generational ladder at the very bottom end; none of them has ever even reached the (current) Low Quality business graphics level. An HD 4200 is at the very top of its own extremely short little ladder (in case this is unclear in any way: onboard = inadequate). In the past, some n300s were actual cards, and more recently a few cards have again carried that number.

Next come the business-class parts, with the rest of the n300s and the n400 / n500 performance numbering in their names. They are too slow and too limited for games, but are superior to onboard chips. As time passes and new generations arrive, the Low End does move, but much more slowly than the High End does. That 4200 IGP does equal the four-year-old Low End cards (7300 GT, X1300 Pro), and surpasses the five-year-old Low End (X300, 6200A -- note the "n200" / "n300" there; the IGPs of four years ago weren't anywhere close to Low End, and "Low End" was a bit broader then). nVIDIA's G210 is a business card. These cards were never meant for games, although current ones can usually handle older games from five years ago or so.

A Mainline Gaming card has an "n600" in its name and a Medium-sized ladder. These cards do improve somewhat more with each generational "year" than the Low End cards do; some years there are also "n700" cards that sit near the border between Medium and High End. The better Mainline cards tend to stay about two generations behind the High End cards' baseline performance, but with a reduced screen resolution capability due to their narrower memory bandwidth.

The GeForce GT 220 is near the bottom of Mainline, the GeForce GT 230 and GT 240 are somewhat above the 220, and the GeForce GTS 250 is on the borderline between Mainline and High End.

The tallest ladder belongs to the High End cards, which can leap up several rungs at a time in some years. They have an "n800" or "n900" performance code in their names, or, in the case of the GeForce 200s, a GTX 260, GTX 275, GTX 280, and so on. It is only when we are dealing with these most expensive, fastest cards that the "newness" of one compared against another means anything.
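
To make the four ladders concrete, here is a minimal Python sketch of the idea (the function name, tier labels, and boundaries are my own illustrative choices for the classic four-digit names, not any official scheme; nVIDIA's three-digit GeForce 200-series names break the pattern, as noted above):

    # Illustrative only: decode the "performance digit" from a classic
    # four-digit model number such as 4670 or 8800. Assumes the pre-2009
    # naming scheme; three-digit names (G210, GT 240, GTX 280) don't fit.
    def ladder_for(model_number: int) -> str:
        digit = (model_number // 100) % 10   # 4670 -> 6, 8800 -> 8
        if digit <= 3:
            return "onboard / IGP ladder (n100-n300): inadequate for games"
        if digit <= 5:
            return "business ladder (n400-n500): too slow for games"
        if digit == 6:
            return "Mainline gaming ladder (n600)"
        if digit == 7:
            return "upper Mainline, near the High End border (n700)"
        return "High End ladder (n800-n900)"

    for card in (4200, 5450, 4670, 5770, 8800):
        print(card, "->", ladder_for(card))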

In the listing that follows, you will see that the different generations' members scatter out and intermix. The mere fact that a card is "new" has no real meaning by itself.

I will add a lengthier version of the video card rankings list, one that hasn't been simplified / compressed at the high end to keep it short.

Continued, Next Message. 

Edited by Gorath Alpha, 20 May 2010 - 07:51.


#2
Gorath Alpha
  • Members
  • 10 605 posts
Not the King compiled the most comprehensive video card list ever made up for any game when he created his VGA rankings for Oblivion. Here is one of his last such lists, for your edification, with my minor edits added. I am not fully in agreement with every one of his placings, and I have had to insert a great many newer cards, but for the most part the placement disagreements are not major ones.

The reason this has a bearing on Dragon Age is the heavy pixel shader coding that both games share. As other new 3D games followed with a similar reliance on advanced pixel shaders, what worked well for Oblivion basically worked well for NWN2 and The Witcher. Now, in DA: O, the old lists, with new names added, are still useful. Please note that the brackets he used were based on Oblivion, a 2006 game, though one whose hardware demands ran a year ahead of almost anything else.

Thus, his "Medium" cards have now slipped to the bottom of that class, if not down into the Low, Business Quality class; many so-called High End cards are now only Medium; and some that NTK had suggested were real barn burners are now somewhat mundane.

"Oblivion" Ranking List

Ultra Extra High End

          o Radeon HD 5900s
          o GeForce GTX 480
          o Radeon HD 5800s
          o GeForce GTX 295
          o GeForce GTX 470
          o Radeon HD 4870X2
          o GeForce 9800GX2
          o GeForce GTX 285
          o GeForce GTX 460
          o GeForce GTX 275
          o Radeon HD 5770  
          o GeForce GTX 280
          o GeForce GTX 260
          o Radeon HD 4870
          o Radeon HD 5750  
          o Radeon HD 4850

(Edit) From this point upward are Dragon Age: Origins' Recommended Level & better VGAs
     
      Extreme (these can finally achieve a rather "maxed out" Oblivion and run the way you expect games to run with a high-end card; quality-enhancing mods will still bring these down, though)
     
          o GeForce GTS 250
          o GeForce 9800GTX+
          o Radeon HD 4770
          o GeForce 8800 Ultra
          o GeForce 9800GTX
          o Radeon HD 4830
          o Radeon HD 3870 X2
          o GeForce 8800GTX
          o GeForce 8800GTS (512MB version)
          o Radeon HD 5670
          o GeForce 9800GT
          o GeForce 8800GT
          o GeForce 9600GT
          o Radeon HD 3870
          o Radeon HD 2900 XT
          o Radeon HD 4670
          o GeForce 8800GTS (640MB & 320MB versions)
          o Radeon HD 3850
          o Radeon HD 2900 Pro
          o GeForce 9600 GSO & GeForce 8800GS
          o Radeon HD 2900 GT

      Very High (again, in terms of Oblivion's era; these are still high end or top of medium, rather powerful cards; any of them will max out most games, but Oblivion can make them cry)

          o Radeon X1950 XTX
          o GeForce 7950GX2
          o Radeon X1950 XT
          o Radeon X1900 XTX
          o Radeon X1900 XT
          o Radeon X1800 XT Platinum Edition
          o Radeon HD 3690
          o GeForce 7800GT Dual (two GPUs on one board)
          o Radeon X1950 Pro
          o Radeon X1900 GT
          o Radeon X1900 All-in-Wonder
          o GeForce GT 240
          o Radeon HD 5570 (GDDR3 version)
          o GeForce 7900GTX
          o Radeon HD 4650
          o Radeon X1950 GT
          o Radeon X850 XT Platinum Edition
          o Radeon X800 XT Platinum Edition
          o GeForce GT 220 & 230
          o GeForce 8600GTS
          o GeForce 7900GTO
          o GeForce 7800GTX 512
          o GeForce 7950GT
          o Radeon X1800 XT
          o Radeon X850 XT
          o Radeon X800 XT
          o Radeon X1800 XL & Radeon X1800 GTO²
          o GeForce 7900GT & GeForce 7800GS+ (the latter is an AGP, Europe-only card that is no longer made, replaced by different cards with an identical name)

          o GeForce 7800GTX
          o GeForce 7900GS & GeForce 7800GS+ (the former is the desktop version; the latter is the CURRENT, new version of the Europe-only AGP card. Older ones are listed separately)

          o Radeon X1800 GTO         

      High-End (Oblivion's-era high end cards are mediums today: some of these cards may be "outdated," but all are rather good, and will definitely be able to play Oblivion and look nice at the same time)

          o GeForce 8600GT & GeForce 9500GT
          o Radeon HD2600 XT
          o GeForce 8600GS
          o Geforce 7900GS (Notebook version)
          o Radeon X850 Pro  (Dragon Age's requirements make no Suffix distinction)
          o Radeon X1650XT
          o GeForce 7800GS (Gainward "bliss" version)         
          o Radeon X1600XT & Radeon X1650 Pro
          o Xbox 360 & Playstation 3 (rough placement for Oblivion comparison)
          o Radeon X800 XL & Radeon X800 GTO 16
          o Radeon HD2600 Pro
          o GeForce 7800GT
          o GeForce 7800GS (superclock version; 460MHz core, 1350MHz memory)
          o Radeon HD 4550 
          o GeForce 7600GT
          o GeForce 7800GS (overclocked version; 400-430MHz core)
          o Radeon X800 Pro
          o GeForce 6800 Ultra
          o GeForce 7800GS (stock clock; 375MHz core)
          o Radeon X800 GTO & Radeon X800 GTO² (256MB version for either)
          o GeForce 6800GT
          o GeForce 7600GS
          o GeForce 6800GS (128 MB PCI-express version)  -- this is what I consider the practical minimum card

          Below this point are cards in the Dragon Age: Origins below-minimum area

          o GeForce 6800GS (AGP version)
          o Gigabyte 3D1 (two 6600GTs on one board)
          o Radeon X800 & Radeon X800 GTO 128MB
          o Radeon X1600 Pro & Radeon X1300 XT
          o GeForce 8500GT
          o Radeon HD 4350
          o GeForce 6800 (AGP version) 

      Mid-Range for Oblivion (mostly cards from years past; they've still got some power in them, and will let gamers truly enjoy Oblivion)

          o Radeon X800 GT & Radeon X800 RX
          o Radeon 9800 XT  **
          o Radeon X800 SE
          o Radeon X700 XT
          o Radeon X700 Pro
          o Radeon 9800 Pro **
          o Radeon X1300 Pro OC
          o GeForce 7300 GT Xtreme GDDR3 (modified card)
          o Radeon HD2400 XT
          o Radeon 9700 Pro **
          o GeForce 6600 GT
          o Radeon X1300 Pro
          o GeForce 8400 GS
          o GeForce G210
          o Radeon HD3450
          o Radeon X700
          o GeForce 6800 (PCI-express version)
          o GeForce 6800 XT
          o Radeon HD3100 & Radeon HD3200 (Integrated graphics)
          o Radeon 9800 Vanilla **
          o Radeon X1550
          o GeForce 6800 LE
          o GeForce 7300 GT
          o Radeon 9700 **
          o GeForce 6600

NTK continued on down with Low End, Very Low End, and Ultra Low End listings, all of which I decided to cut away. I debated with myself about the Radeon 9700s and 9800s, and elected to double-asterisk them. They can't do any kind of Dx9.0c / SM3 pixel shading.

P. S. The simplified Rankings list is easier to use. This one is basically informational, to supplement the main articles, with the list opened up to better illustrate the opening message: the several entirely separate ladders that the variously ranked cards move up as the newer generations appear.

Gorath
-

Edited by Gorath Alpha, 11 September 2010 - 02:47.


#3
Valaskjalf
  • Members
  • 283 posts
Edit: formatting is off.. 

Check this out:
http://www.tomshardw...970,2491-7.html

Edited by Valaskjalf, 09 January 2010 - 04:56.


#4
phordicus
  • Members
  • 640 posts
My thanks. I like to keep up with these things to determine when an upgrade would be worth it.

#5
mrjaeger
  • Members
  • 7 posts
Where does the Geforce GTS 250 fall on this list?

#6
Gorath Alpha
  • Members
  • 10 605 posts
It's the same card as the 9800 GTX+, although I showed it "above" that one, in 13th place. But I see that I hadn't opened up the HD 5800 / 5900 entries into individual items after all, so when I get around to doing that, it'll move downward some.

This long version is, in my opinion, semi-superfluous next to the shorter, easier-to-use list; it exists specifically to illustrate how the members of each generation spread out across the rankings as they move up their own ladders. Also, don't forget that not all games are equally liberal with their use of advanced pixel shaders. When Tom's Hardware was including Oblivion as one of their standard benchmarks, their results were still somewhat skewed compared to the original NTK lists. Here's their Fallout 3 "Medium" (Mainline) chart:

www.tomshardware.com/charts/gaming-graphics-cards-charts-q3-2009-mainstream-quality/Fallout-3,1474.html

That should track with NTK's basic test marks to a greater extent. Be advised that I don't have NTK's level of patience and devotion. I'll do a better job of keeping my original Video Rankings as up to date as I can than I will with this longer list, so for the newest results, stick with that one:

social.bioware.com/forum/1/topic/58/index/128343

Incidentally, I've noticed that none of the reference articles contained any warnings about the marketing gimmick that I prefer to call the "Big RAM Scam".

Pardon my possibly incorrect choice of words for a marketing trick, but I don't consider the feelings of marketing executives to be worthy of PC protection (Political Correctness BS). 

This next is IMPORTANT.

My X1600 Pro is the only card I have with the all-too-common "Big RAM Scam" layout. Like any Mainline card (admittedly very low on that scale when new), it has only a 128-bit memory system, and a resulting narrow memory bandwidth. That bandwidth limits it to making effective use of roughly 256 MB of VRAM in today's games, so half of the 512 MB onboard is literally wasted. The big RAM numbers appeal to the noobs, but they have almost no real meaning at all.

The important shopping criteria are core speed, RAM speed, memory bandwidth, and shader unit count.
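
To put numbers on that bottleneck, here is the arithmetic as a minimal Python sketch (the X1600 Pro memory clock below is approximate, quoted from memory, and is for illustration only):

    # Memory bandwidth = (bus width in bits / 8) * effective memory clock.
    # With MHz in, the result is MB/s; divide by 1000 for GB/s.
    def bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
        return bus_bits / 8 * effective_mhz / 1000.0

    # X1600 Pro: 128-bit bus, roughly 780 MHz effective memory clock.
    print(bandwidth_gbs(128, 780))   # ~12.5 GB/s -- the real bottleneck
    # Doubling the VRAM from 256 MB to 512 MB changes nothing in this
    # formula, which is why the "Big RAM Scam" cards are no faster than
    # their smaller-memory twins.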


Gorath
-




Edited by Gorath Alpha, 20 February 2010 - 04:31.


#7
qimmiq
  • Members
  • 5 posts
I own an ATI Radeon HD 4890 1GB with standard clock speeds. At a cost of around 160 euro, I consider it a medium-end gaming card, if 100 euro is low budget and 300+ euro is high end. I can run DA: O at highest settings @ 1920x1200 at between 40-60 fps, very smooth. Processor is a Core 2 Duo 6550 2.4GHz overclocked to 2.8GHz, with 2 GB memory.

#8
Gorath Alpha
  • Members
  • 10 605 posts
DA: O was started in roughly the same hardware environment as The Witcher, and that game used an established game engine. Creating first the new engine, or at least the design for it, and then the game took longer to get to market -- yet DA: O's video requirements are only slightly advanced over those in The Witcher (SM2 in the other, so Radeon 9800s can be used). I have no idea why the developers named such high-level cards as "Recommended", but the minimums (at least the "practical" ones, X800 Pro and 6800 GS) track closely.

#9
amckelvey16
  • Members
  • 2 posts
I am getting a laptop and was wondering if a Radeon HD 4200 would be able to play this game, even at the lowest settings?

#10
Gorath Alpha
  • Members
  • 10 605 posts

amckelvey16 wrote...

I am getting a laptop and was wondering if a Radeon HD 4200 would be able to play this game, even at the lowest settings?

There is no onboard video from any source strong enough for this game to run properly. Laptops tend not to offer a decent image at the low resolutions that an IGP has to be set to in order to deliver even halfway acceptable animation (yes, that's personal opinion). Sorry, pick a higher quality laptop!

You need to look at cooling (few laptops' cooling is adequate for gaming) and default screen resolutions (don't go with a high resolution screen without a High End video card). Be prepared to trade up the entire laptop every 2 to 3 years, because very few offer any upgrade path for video (and the ones with onboard chips NEVER do).

P. S. I am editing the various reference articles today to be sure that all of them mention the gimmick I've dubbed the "Big RAM Scam". Some people have suggested that the term "scam" is too politically incorrect for a semi-harmless marketing trick. It isn't totally harmless, though: it depletes rare earth minerals and wastes electricity, which in turn wastes power generation fuel (and is thus ecologically unclean).

That comment will appear elsewhere here. For now, I want to differentiate between onboard video chips buried in the chipset ASIC and the combined CPU-plus-video devices now appearing. Eventually, both AMD and Intel will market combined "APU" chips that literally integrate a full-power GPU into the CPU. The cost savings will be huge, and the effect on the entire add-on video card industry will be catastrophic.

Intel has already attached one of their low quality video chips to the same package as their i3, i5, and i7 CPUs. The processors are still separate devices, but the video chip has been improved somewhat over the best Intel previously had, so that, with the great advantage of a direct connection to the CPU, it meets and beats the performance of nVIDIA's current IGPs. The fact is, though, this kind of video no longer fits the previous definition of an "IGP" as such.

This particular article hasn't needed much recycling back to the top, and hasn't gotten a lot of attention. The comment I am re-editing here was last edited about a month after it was written; now it is December 2010, and interesting things are happening. AMD's "Fusion" APUs are already in the figurative hands of the Netbook, Notebook, and Laptop manufacturers. These devices combine far more capable graphics, as a fully integrated function within the CPU design, than anything ever offered as either a chipset video device or a "passenger" inside the packaging of a CPU.

Intel's own answer goes by the name "Sandy Bridge", and engineering samples seem to offer video performance that equals that of the old AMD chipset video chips for the first time (it is due out in late spring 2011). But where the Fusion chips really look competitive with Intel (who never competed in either Mainline or High End graphics after one disastrous try a dozen years ago) is the Netbook market, where Intel's Atom has been the market leader. AMD has a low-amperage CPU-and-GPU combo that blows the current Atom chipset graphics not just beyond the weeds, but clear into the next county.

As a result, the long-standing feud between Intel and nVIDIA may finally have to end, so that Intel can sell Atom Netbooks with nVIDIA's ION chipset video in them, and not lose quite so much market share to AMD after all.


Gorath
-

Edited by Gorath Alpha, 03 December 2010 - 04:00.


#11
Gorath Alpha
  • Members
  • 10 605 posts
I've added this summary to the end of the Video Card Rankings List, and I think it also belongs here:

I wanted to add a chronological list for comparing the two active 3D video companies' generation names.

Generational steps on the ladders

            ATI           nVIDIA
2003        9n00          Ti-4n00
2004        Xn00          FX 5n00
2005        X1n00         6n00
2006        HD 2n00       7n00
2007        HD 3n00       8n00
2008        HD 4n00       9n00 + 2n0
2009        HD 5n00       1n0 + low-end 2n0
2010        --            Fermi (March, forecast)

P. S. Added long afterward: the actual release dates aren't really as neatly spaced as I have shown here, as if each generation were a year apart; in fact, for both companies, the gestation time varied from six months to nearly two years for several of the generations.

Gorath
-

Edited by Gorath Alpha, 05 August 2010 - 08:37.


#12
Gorath Alpha
  • Members
  • 10 605 posts
Note: the release date for Fermi is supposed to be March 26th.

Also, as long as I have this in front of me: that list of competitive video card generations gives no months in its dates, but the fact is that in most years nVIDIA had new cards available for sale a couple of months earlier than ATI did, until 2008, when ATI was first to market, and again in 2009.

#13
Gorath Alpha
  • Members
  • 10 605 posts
Not one of the resident Green Side fanboys has found any leaked specs on the GTX 470 and GTX 480?

Not even something like this? 

www.legitreviews.com/news/7519/

Edited by Gorath Alpha, 24 March 2010 - 05:26.


#14
Gorath Alpha
  • Members
  • 10 605 posts
There seems to be an upsurge in the number of noobs and near-noobs with no concept of what the Mainline Gaming Video card definition is, nor of what it is not. Newness isn't as important as specifications, folks. The G 210 and the HD 5450 (or 4350) are for business, not for games. They were designed for charts, graphs, presentations, and spreadsheets, and that's all they do well.

Like many recent games, Dragon Age's official System Requirements are misleading, but they still do not allow any support for Low End hardware such as those clunkers.

"Video: ATI Radeon X850 256MB or greater (either this is wrong)
NVIDIA GeForce 6600 GT 128MB or greater" (or this one is wrong)

(Note: IMO, the practical choices for the two video cards above should be the Radeon X800 Pro and the GeForce 6800 GS, at least for small textures and Dx9.0"b". In order to use Medium Textures, the Radeon named should be the X1650 XT.)

Here is how the HD 5450 falls behind the X1650 XT (it's worse for the X800 Pro, which is a faster card)
www.gpureview.com/show_cards.php

Here is how the G210 falls behind the 6800 GS
www.gpureview.com/show_cards.php

Here is how the HD 4350 falls behind the X1650 XT (it's worse for the X800 Pro, which is a faster card)
www.gpureview.com/show_cards.php

Gorath
-

#15
Gorath Alpha
  • Members
  • 10 605 posts

Gorath Alpha wrote...

There seems to still be an ongoing upsurge in the number of noobs and near-noobs with no concept of what the Mainline Gaming Video card definition is, nor of what it is not. Newness isn't as important as specifications, folks. The G 210 and the HD 5450 (or 4350) are for business, not for games. They were designed for charts, graphs, presentations, and spreadsheets, and that's all they do well.

Here is how the HD 5450 falls behind the X1650 XT (it's worse for the X800 Pro, which is a faster card)
www.gpureview.com/show_cards.php

Here is how the G210 falls behind the 6800 GS
www.gpureview.com/show_cards.php

Here is how the HD 4350 falls behind the X1650 XT (it's worse for the X800 Pro, which is a faster card)
www.gpureview.com/show_cards.php

Today, we have noobs wanting to argue the merits of onboard chips versus the Minimum Video Card Needed.

Maybe for WoW, with its simple requirements, one would be just fine. Not for this game, and not for Mass Effect.

#16
Gorath Alpha
  • Members
  • 10 605 posts
It seems to me that another cycle of new arrivals wanting to play games on the cheap is appearing, and the reference articles can be popped back up again, including this one.

Edited by Gorath Alpha, 26 January 2011 - 02:28.


#17
basdoorn
  • Members
  • 154 posts
Happy to help with the bumping. I looked at your comparisons and was surprised to see the ATI Radeon 5550 has about the same performance as a mere ATI Radeon X1800 GTO. These are great cards for viewing Blu-ray and such, but it seems even 4 years is not enough to bring n800-card performance down to the n500 cards of the latest generation.
http://www.gpureview...1=630&card2=386

An even bigger surprise was that the ATI Radeon 5670 is only on par with the X1950 XTX (which was not even a dual-GPU card, as the current n900 series tend to be).
http://www.gpureview...1=623&card2=442

I am currently looking for a fully passive card for a home theater PC, but it seems waiting for a passive 5670 is the right thing to do. A 5570 or anything less is definitely not a gaming card by my standards, with these specifications. I must admit I too had expected current generation cards to be faster than the comparisons show.

At Tom's Hardware they have some benchmarks for Dragon Age covering the NVidia 9800 / NVidia 240 / ATI 4830 / ATI 5750 and up. They clearly show these are the minimum cards for playing Dragon Age at 1680x1050 at very high quality settings.
http://www.tomshardw...igins,2114.html

These more general numbers from techpowerup will let people see where the lower end cards land at various resolutions when it comes to gaming. The ATI 5670 comes in a bit higher than the NVidia 240, which should be good enough for the system in the living room:
http://www.techpower...rozr_II/29.html

Suffice to say I fully agree with your posts; most people who want to play games on the PC have nowhere near enough knowledge of video card performance, so keep up your good work educating the masses.

Edited by basdoorn, 10 June 2010 - 03:58.


#18
Gorath Alpha
  • Members
  • 10 605 posts

Gorath Alpha wrote...

It seems to me that another cycle of new arrivals wanting to play games on the cheap is appearing, and the reference articles can be popped back up again, including this one.


And today some more

G

Edited by Gorath Alpha, 26 January 2011 - 02:30.


#19
mousestalker
  • Members
  • 16 945 posts
What's great about the OP is that you clearly differentiate between G, GT and GTX. If I had a dime for every time I've read "I have a GT 260 so I should be able to play ____ at max res, right?"



Thanks!

#20
Gorath Alpha
  • Members
  • 10 605 posts

mousestalker wrote...

What's great about the OP is that you clearly differentiate between G, GT and GTX. If I had a dime for every time I've read "I have a GT 260 so I should be able to play ____ at max res, right?"

Because Atari, Bethesda, and Bioware all began lumping entire card families together, that is how the "main" list is organized; here, though, while staying close to the cards' official positions, I did break them out. I always thought that the suffix business was purposely misleading.

An "XT" was number two in the Radeon hierarchy, but on the bottom for a Geforce!  The "GT" was number two (or three when an Ultra was around) for a Geforce, and little different from a vanilla level for a Radeon!  I've left NTK's original categorization, even though after four years, it's way out of date, but I don't consider this the main list. 

#21
basdoorn
  • Members
  • 154 posts
You could also add the new NVidia GTX 460 to the list. It's a really great card at only $200, and I expect quite a few people will be getting one, as they run relatively cool and generally overclock well. They can get quite close to GTX 470 performance, especially the 1GB version, which has more memory bandwidth and more raster units than the 768 MB version.

http://www.anandtech...60-the-200-king
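
The same bandwidth arithmetic from post #6 shows why the 1GB version pulls ahead; a quick Python sketch (the bus widths and effective memory clocks below are the published reference figures, as best I recall them):

    def bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
        return bus_bits / 8 * effective_mhz / 1000.0

    # GTX 460 1 GB:   256-bit bus, GDDR5 at ~3600 MHz effective
    # GTX 460 768 MB: 192-bit bus, GDDR5 at ~3600 MHz effective
    print(bandwidth_gbs(256, 3600))   # ~115.2 GB/s
    print(bandwidth_gbs(192, 3600))   # ~86.4 GB/s -- a quarter less bandwidth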

Edited by basdoorn, 04 August 2010 - 07:58.


#22
Gorath Alpha
  • Members
  • 10 605 posts
I've not visited Tom's Hardware lately. I prefer to wait until a card has been added to their VGA charts, then look for a benchmark of some game similar to Fallout 3 / NWN2 / Oblivion but more recent; what I want is heavy graphical dependence on pixel shader code. That may send me to a bargain bin for a game of a type (RTS or shooter) I don't buy to play, only to compare its graphics to the last game from their charts that I was using.

I give the game away to whichever grandkid speaks up for it first (they usually have a copy of their own already anyway, as generous as my ex-daughters-in-law are).

Edited by Gorath Alpha, 04 August 2010 - 08:15.


#23
Gorath Alpha
  • Members
  • 10 605 posts

basdoorn wrote...

You could also add the new NVidia GTX 460 to the list. It's a really great card at only $200, and I expect quite a few people will be getting one, as they run relatively cool and generally overclock well. They can get quite close to GTX 470 performance, especially the 1GB version, which has more memory bandwidth and more raster units than the 768 MB version.

http://www.anandtech...60-the-200-king

OK, it's done now. But there will be some recycled Radeon HD 5n00s with HD 6n00 names to add in another couple of months, along with a few more Fermi 400 series cards, and at least one recycled Fermi 400, all of which will need to be benchmarked first.

Intel's Sandy Bridge is due this coming April or May, when the same basic low quality Intel video that now rides piggyback inside the processor package of the i3, i5, and i7 becomes entirely integrated into the next series of CPUs. But "Fusion" is practically already here.

The AMD device combines far more capable graphics, closely related to the Radeon HD 5n00 generation, as an integrated function of their multi-core CPU, and the mobile versions are already in the (figurative) hands of Netbook, Notebook, and laptop manufacturers, with the PCs using them expected around the turn of the year or even sooner. The desktop Fusion APUs are expected in February. (Edited, again, here): We can expect to see what the Mobile Fusion products will look like at CES (the Consumer Electronics Show) 2011, which opens on January 6th in Las Vegas. Until mid-December, it had been my understanding that products would actually be shipping by January 6th.

The current expectations put the shipping date closer to the end of January, 2011.

The latest news on the long-standing feud between Intel and nVIDIA, over whether nVIDIA's contract with Intel included the right to design chipsets for the newer Intel CPUs, was that it was about to go to court after six years of wrangling (dating back to when the Core Duo was a mere embryo), but both sides agreed to ask for a continuance because they are back in negotiations.

The conjecture going around is that Intel wants to protect its dominant lock on the Netbook market, which Fusion threatens, given its many-times-superior graphics. nVIDIA's ION is a low-current device that offers almost-competitive graphics performance against Fusion for games, and superior performance in some other areas. Intel may hope to save a bigger part of its Netbook market if it can pair up ION and Atom at a good price, which right now it cannot.

(Added in edit, Jan 3rd: Previews of Intel's Sandy Bridge are appearing now, with the mobile versions (of the integrated graphics) being compared to the ATI HD 4200 and 4250, and it's supposed to be better than those. According to AnandTech's reviewer, one of the Sandy Bridge laptops was even able to run Mass Effect 2 as well as the Mobile HD 5450 does, which surprises me, given the disparity in shader processor counts.)

Why is this pertinent here? Fusion is going to be available for standard laptops at a much smaller cost than discrete GPUs plus a CPU, and no one else is going to have anything for laptops that competes. Private ownership of PCs is concentrated in laptops, not desktops. Initially, the business grade, like an HD 5450, will probably be priced at almost what similar AMD processors without graphics have cost. That will potentially put a real dent in nVIDIA's sales of chipsets for AMD processors, and of discrete cards like the GeForce G210.

Although pricing isn't being discussed yet, the presumption is that the difference between an APU with business graphics and one with the equivalent of HD 5570 graphics integrated will be relatively small compared to a card: probably less than $15 to the OEMs, translating to maybe $25 retail (my guess there).

THOSE APUs will run games such as ME2 without any separate GPU card, which is why this is significant to this article.

(I'm adding this sort of expansion to all of the video graphics hardware reference articles in early December of 2010.)

Gorath

Edited by Gorath Alpha, 03 January 2011 - 08:32.


#24
DragonAddict
  • Members
  • 441 posts
I very recently upgraded my video card to the Zotac GTX 480 AMP! Edition, which is the fastest GTX 480 on the market. When patch 1.04a is released on Monday, I will play DA-O all over again, but with every setting in the NVidia Control Panel at best video quality, and the same for DA-O: best video quality.



With my GTX 280, I did experience artifacting with everything set for max video quality, so my guess would be that the video card's stock cooler wasn't good enough, or that its memory was over-clocked a bit too high by default.

#25
Gorath Alpha
  • Members
  • 10 605 posts
This article is overdue for some upgrades from the past few months. I am kinda lazy when it comes to placing individual video cards in ranks, instead of the simplified version, with a card's siblings from its own class added for a series. Interestingly, both AMD and nVIDIA announced a new High End option yesterday, on the same day: the 570 joins the 580 for the Green Team, and the 1 GB versions of the HD 6950 join the Red Team (two new ones in the mid-$200 price range).

Edited by Gorath Alpha, 26 January 2011 - 01:29.