
Gaming Graphics Card Rankings and Video Game Card Basics


55 replies to this topic

#26
SSV Enterprise
  • Members
  • 1,668 posts
The maximum resolution on the monitor this computer would use appears to be 1440x900. Also, I just found out that the power supply of the computer to be upgraded is 375 watts. I know that definitely restricts usage of cards in the Radeon 57n0 series and above, but would that affect the usage of a 5670?

Ok, I didn't know that AMD was doing this "refresh" of the 5000s. I was planning on buying a card on Black Friday, but because of this refresh, would it be a good idea to delay the purchase? This is going to be a Christmas gift.

Also, I prefer to get a Geforce 400 or Radeon HD 5000 series card in order to get DirectX 11. However, it would seem that most good 400s demand too much power, so I'll probably be getting a 5000.

Edited by SSV Enterprise, 21 November 2010 - 10:18.


#27
Gorath Alpha
  • Members
  • 10,605 posts
TTBOMK, an HD 5670 will be fine with a relatively recent 375 watt supply delivered as part of an OEM rig. But if you are basing any game-related choices on Dx11 rather than the much sooner-to-be-useful Dx10, whatever anyone buys in 2010 is going to be out of date before Dx11 is a factor. As noted, the only Fermi that will run on less than a 450-500 watt supply is the 430, and it's just not a real gaming card, no matter its numbering.

The 450 and 460 do suck down sizable amounts of juice; you are correct there. Don't forget that nVIDIA is working with handicaps on its future, since it isn't partnered with any CPU producer, and the low end discrete card market is already shrinking rapidly, with AMD close to releasing a desktop Fusion APU (the Netbook and low end Notebook Fusions are already in production). nVIDIA is planning on a future as a massively parallel CPU producer. Those tend to be intensely voracious regarding current, and the GPUs for graphics are essentially now byproducts of the other market.

P. S. Intel's Sandy Bridge is due this coming April or May, when the same basic low-quality Intel video that now rides along piggyback inside the processor package of the i3, i5, and i7 becomes entirely integrated into the next series of CPUs. But "Fusion" is practically already here. IMO, Sandy Bridge will not have any serious impact on gaming.

The AMD device integrates far more capable graphics, closely related to the Radeon HD 5n00 generation, into their multi-core CPU. The mobile versions are already in the (figurative) hands of Netbook, Notebook, and laptop manufacturers, with the PCs using them expected around the turn of the year or even sooner. The desktop Fusion APUs are expected in February. Those are the ones likely to be of concern for gamers.

The latest news on the long-standing feud between Intel and nVIDIA, over whether nVIDIA's contract with Intel included the right to design chipsets for the newer Intel CPUs, is that the dispute was about to go to court after six years of wrangling (dating back to when the Core Duo was a mere embryo), but both parties agreed to ask for a continuance because they are back in negotiations now.

http://www.insidehw....t-chipsets.html

The conjecture going around is that Intel wants to protect its dominant lock on the Netbook market, which the Fusion threatens, given its many-times-superior graphics. nVIDIA's ION is a low-current device that offers graphics performance for games that is almost competitive with the Fusion, and superior performance in some other areas. Intel may hope to save a bigger part of their Netbook market if they can pair up ION and Atom at a good price, which right now, they cannot.

Initially, the business-grade Fusion APU, with graphics roughly on the level of an HD 5450, will probably be priced at almost what similar AMD processors without graphics have been. That could really put a dent in nVIDIA's sales of chipsets for AMD processors, and of discrete cards like the Geforce G.210. Those aren't "serious" game parts, not really, although they may help raise what is presently a very dismal average level of graphics performance among mobile devices.

Although pricing isn't being discussed yet, the presumption is that the difference between an APU with business graphics and one with the equivalent of HD 5570 graphics integrated will be relatively small compared to a card, probably less than $15 to the OEMs, translating to maybe $25 retail (my guess there).

THOSE better APUs will run games such as ME2 without any separate GPU card, which is why they are significant to this article.

Gorath

Edited by Gorath Alpha, 04 December 2010 - 08:41.


#28
SSV Enterprise
  • Members
  • 1,668 posts
I understand, but even if Dx11 doesn't become a factor for a while, I still want to buy the more recent 5000 series in order to have more longevity.



Thank you for answering my questions!

#29
Mr.Ph11
  • Members
  • 133 posts
Does "NVIDIA GeForce GTS 220" stand for the Nvidia GeForce "Verto" GT 220?

#30
SSV Enterprise
  • Members
  • 1,668 posts
A quick Google search indicates that Verto is simply a card manufacturing brand associated with the PNY company, not the designation of the card itself.

The two big graphics card companies (Nvidia and ATI) don't actually make graphics cards. What they do is make the graphics chips that are central to the cards, and send them to various manufacturers -- Sapphire, EVGA, Asus, PNY -- who then build actual cards around the chips, with DDR memory, PCI-E interface, cooling fans, etc. "Geforce GTS 220" indicates the kind of chip used inside a card, and that designation cannot be changed just depending on who manufactured the card. So the GTS 220 and the GT 220 are different cards -- they are likely very similar, but not the exact same thing.

Edited by SSV Enterprise, 23 November 2010 - 01:34.


#31
Gorath Alpha
  • Members
  • 10,605 posts
Inverted P. S. of sorts here. I brought up this thread because of the question about an extra low end device that isn't even listed in these rankings, and more or less overlooked those last two comments above, but somehow, something was percolating . . . and I thought about it some more. There was only ever one retail graphics card in the Geforce GTX 200 generation named a "GTS", and that was the 250. The 220 was a "plain" GT, that's all. If some OEM had something different, it never escaped into the retail channel. OK, back to why the thread was recycled:

We have a new member with a need to learn about real 3D graphics versus anything "onboard". We also have new things going on between Intel and nVIDIA:

http://www.dailytech...rticle20305.htm

(Added in a P. S. three weeks later): We were expecting to see what the Mobile AMD Fusion products would look like at CES (Consumer Electronics Show) 2011, which opened on January 6th in Las Vegas. Until mid-December, it had been my understanding that some kind of Fusion products would actually be shipping by January 6th.

The current expectations put the shipping date closer to the end of February, 2011. Previews of the Intel Sandy Bridge are appearing now, and the mobile versions of its integrated graphics are being compared to the ATI HD 4200 and 4250; Sandy Bridge is supposed to be better than those. According to AnandTech's reviewer, one of the Sandy Bridge laptops was even able to run Mass Effect 2 as well as the Mobile HD 5450 does, which surprises me, given the disparity in shader processor counts.

Intel apparently felt a need to jump the gun with Sandy Bridge, to try and forestall AMD. It wasn't necessary: the 40 nm versions of the Fusion designs (redesigns, because the original designs needed the greater efficiency of 32 nm) weren't ready for review until February, and didn't reach the OEMs until the end of that month. But when Intel did send Sandy Bridge out the door, it was without adequate testing, and the matching chipset was faulty.

Gorath

Edited by Gorath Alpha, 09 June 2011 - 08:35.


#32
flyingfalcon
  • Members
  • 20 posts
Take a look at a mobile Sandy Bridge on AnandTech, with some benchmarks.

http://www.anandtech...ile-landscape/5

#33
Gorath Alpha
  • Members
  • 10,605 posts
As noted, even the HD 5450, although slow, has far more shader processing power (many more shader processors), so it will still win the beauty contest at displaying the full range of graphics. Fusion for the desktop was expected to be better than a 5450, even for its base graphics (not referring to the Netbook Fusion APUs, in the news at CES today, but they also should be better).

IMO, over the long haul, the news about nVIDIA's own version of ARM is likely to be more of a milestone than Sandy Bridge. 

The 40 nm versions of Fusion couldn't include the originally planned graphics without both power drain and waste heat becoming excessive, so those had to be redesigned. The desktop CPUs (Bulldozer) require 32 nm wafers or better; they can't be produced at 40 nm. In early June 2011, AMD was showing an eight-core Bulldozer at E3, so perhaps the fabs are going to be online soon.

Edited by Gorath Alpha, 09 June 2011 - 08:40.


#34
flyingfalcon
  • Members
  • 20 posts
The 320M and 5450 do have more shader processors, but you can't directly compare graphics by counting them. How would you benchmark shader processing power? I'd try 3DMark or just running games; Mass Effect 2 looked the same on a SB IGP as on discrete graphics.
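
As a rough back-of-the-envelope illustration of why raw shader counts alone don't settle it, here's a minimal Python sketch; the counts and clocks below are made-up example figures, not the specs of any particular part, and real game performance also hinges on architecture, drivers, and memory bandwidth:

# Theoretical shader throughput in GFLOPS, assuming each shader can issue
# one fused multiply-add (2 ops) per clock. Illustrative only; this is why
# actual benchmarks (3DMark, in-game FPS) matter more than counting shaders.
def gflops(shader_count: int, shader_clock_mhz: float, ops_per_clock: int = 2) -> float:
    return shader_count * shader_clock_mhz * ops_per_clock / 1000.0

# Hypothetical example parts:
print(gflops(80, 650))    # 104.0 GFLOPS -- many slower shaders
print(gflops(12, 1300))   # 31.2 GFLOPS  -- few shaders at a much higher clock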



Oh, and the netbook Fusion can't get close to a 320M, so those aren't as fast as Sandy Bridge graphics; it sounded like you were saying otherwise.



Desktop fusion will be interesting to see, though the hype machine on fusion is starting to wear at my sanity.



Nvidia's own version of ARM could be good; whatever drives more competition, better prices, and more work is good. I wonder how long that will take, though. So eventually there are going to be CPU/GPU combos of Intel/Intel, Nvidia/Nvidia, and AMD/AMD.



Reminds me of how Microsoft essentially invaded gaming with the original Xbox; everyone wants to expand into as much as they can. All the mobile products are really taking off.

#35
Gorath Alpha
  • Members
  • 10,605 posts
To shop for graphics most logically, go here, and then choose a price point:

http://www.tomshardw...-6990,2879.html

#36
Gorath Alpha
  • Members
  • 10,605 posts

Gorath Alpha wrote...

As noted, even the HD 5450, although slow, has far more shader processing power (many more shader processors), so it will still win the beauty contest at displaying the full range of graphics. Fusion is expected to be better than a 5450, even for its base graphics (not referring to the Netbook Fusion APUs, in the news at CES today, but they also should be better).

IMO, over the long haul, the news about nVIDIA's own version of ARM is likely to be more of a milestone than Sandy Bridge. 

When gaming is any kind of factor at all in selecting a personal computing system, the least powerful Radeon worth considering, for desktop or laptop, is the HD 5570. The equivalent Geforce is the 440.

#37
Gorath Alpha
  • Members
  • 10,605 posts

Gorath Alpha wrote...

To shop for graphics most logically, go here, and then choose a price point:

www.tomshardware.com/reviews/best-gaming-graphics-card-geforce-gtx-590-radeon-hd-6990,2879.html

Well, shoot fire!  I grabbed the wrong URL, so this wasn't really needed here and now. Can't hurt, though. (Edited again, ten hours later) Interestingly enough, the selfsame cheap chipset video chip came up in here also (the HD 4250, way down below the minimum); it's not even fast enough to display the mouse pointer. The various VGA charts never mention chips like that because they are too slow to bother with.

Dragon Age: Origins is the "Senior" set of forums on this site. Before there ever was any ME-2, a series of reference articles was prepared in that game's Self-Help Tech forum. There is a version of this thread there, but most of the other references do not have matching copies edited for Mass Effect 1 or 2.

PC Hardware* Basics for Gaming (and inventory of Components):
http://social.biowar...58/index/509580

Getting the most value out of the Graphics Budget dollar (DAO)
http://social.biowar...8/index/7196223

Video Card Shader Performance Rankings* (ME-1):        
http://social.biowar...1/index/3117442

Generational Ladders* (and NTK-based shaders ranking list - "old" class markers)
http://social.biowar...58/index/575571

Very basic discussion* of video cards, video chips, PhysX, and even of laptops' limits: 
http://social.biowar...58/index/519461

Those are the articles about graphics performance.

Edited by Gorath Alpha, 07 May 2011 - 04:03.


#38
Gorath Alpha
  • Members
  • 10,605 posts
The latest new arrival may be trying to run ME-2 using an eight-year-old video graphics card, which cannot even come close to doing that.

#39
Gorath Alpha
  • Members
  • 10,605 posts
The Radeon Mobility 5145 is not included because it is below requirements. It is another version of the HD 4350 / HD 4550 pair, and therefore, still a Dx10 card, in spite of the renaming and slight tweaking of specifications. It is just too slow and too weak.

#40
Gorath Alpha
  • Members
  • 10,605 posts
Right or wrong, I have this very strong recollection that most cases of ME1 / ME2 characters showing up with just "holes" where the eye sockets belong were seen on low-end and/or elderly graphics devices, such as various Intel chips. Those aren't supported.

#41
Gorath Alpha
  • Members
  • 10,605 posts
Since the 8400 GS has never been good enough for any of the games, even when it was new, it has never been part of the performance listing. It was designed strictly for ordinary business charts, graphs, presentations, and spreadsheets, not for games. The mobile card is still very low performance:

www.notebookcheck.net/NVidia-GeForce-8400M-GT.3708.0.html

Edited by Gorath Alpha, 10 June 2011 - 05:24.


#42
Gorath Alpha
  • Members
  • 10,605 posts
When shopping for a replacement graphics card, gaming quality starts with "600" for the Radeons (the first digit is the generation, not anything seriously impacting overall performance), and an HD 5570 is "real close" to being a 600 card.

For the Fermi Geforces (the 4n0 / 5n0 series), a "50" is about the same as a "650", and a "40" is about the same as an HD 5570; once again, the first digit is only the generation, not the performance. Far better to look for the GTS / GTX instead.
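
To make that digit convention concrete, here's a tiny illustrative Python sketch (my own throwaway helper, nothing official from AMD or nVIDIA): it splits a Radeon HD model number into the generation digit and the performance-tier digits that actually matter when comparing cards for gaming:

# Hypothetical helper, not an official naming spec: per the convention above,
# the first digit of a Radeon HD model number is the generation, and the
# remaining digits are the performance tier.
def split_radeon_model(model: str) -> tuple[str, str]:
    digits = "".join(ch for ch in model if ch.isdigit())
    return digits[0], digits[1:]

for card in ("HD 5570", "HD 5670", "HD 4650"):
    generation, tier = split_radeon_model(card)
    print(f"{card}: generation {generation}, tier {tier}")
# HD 5570: generation 5, tier 570
# HD 5670: generation 5, tier 670
# HD 4650: generation 4, tier 650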

Edited by Gorath Alpha, 16 July 2011 - 03:38.


#43
SSV Enterprise
  • Members
  • 1,668 posts
It's my perception that Geforce *50 level cards match up with Radeon *700 level cards, while Geforce *40 matches Radeon *600 and Geforce *30 matches Radeon *500. At least that's the case with the first-gen DirectX 11 cards like the Radeon HD 5750 and the Geforce GTS 450.
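
For what it's worth, that rule of thumb can be written down as a simple lookup table (my own rough sketch of the mapping described above, not an official equivalence from either vendor, and individual cards and benchmarks can deviate from it):

# Rough rule-of-thumb mapping from first-generation DirectX 11 Geforce tiers
# to roughly comparable Radeon tiers, as described above. Illustrative only.
GEFORCE_TO_RADEON_TIER = {
    "*50": "*700",   # e.g. the GTS 450 vs HD 5750 pairing mentioned above
    "*40": "*600",
    "*30": "*500",
}

for geforce_tier, radeon_tier in GEFORCE_TO_RADEON_TIER.items():
    print(f"Geforce {geforce_tier} is roughly a Radeon {radeon_tier} class card")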

I will note that recently my friend bought a Radeon HD 5570 for his low-profile Dell PC and started playing Mass Effect on it, and in his opinion the game works great at 1600x900. But then, anything would feel great compared to the Intel 4 Series Express Chipset he was using before...

#44
Gorath Alpha
  • Members
  • 10,605 posts
You can't trust nVIDIA to stick with a fixed definition any further than you can throw a grown cow. They change what all the numbers below "60" mean from week to week and day to day (that's the way it seems, anyway). What was "OK" is lousy the next time you see them use the number.

What is usually true right this minute is that whatever the number is, if it's a "GT" it's no better than an HD 5570, and if it's a "GTS", it's no better than an HD 5670. The 420 and 520 are nearly as awful as a "10" is, and the "440" is not at all as good as the HD 5670, being hardly any better than the GT 240 was.

Edited by Gorath Alpha, 16 July 2011 - 03:35.


#45
Gorath Alpha
  • Members
  • 10,605 posts
Just for grins and giggles, here is the expanded video card part of the ME-2 System Requirements (there have been new releases of low end cards, and they had managed to overlook quite a few):

Video Card = 256 MB (with Pixel Shader 3.0 support). Supported GPU Chips: NVIDIA GeForce 6800 or greater(**); ATI Radeon X1600 Pro or greater. Please note that NVIDIA GeForce G.205, G.210, 310, GT 520, 7100, 7200, 7300, 7400, 7500, 8100, 8200, 8300, 8400, 9100, 9200, and 9300; ATI Radeon X1300, X1550, HD 2400, 3100, 3200, HD 3450, HD 3470, HD 4200, HD 4250, HD 4350, HD 4550, and (probably) HD 5450 are below minimum system requirements. Updates to your video and sound card drivers may be required. Intel and S3 video devices are not officially supported in Mass Effect 2.

(**)Two of the Geforce 6800s are worse than the next-lower Geforce game card, the 6600 GT, and should be avoided (6800 SE, 6800 XT).

#46
Gorath Alpha
  • Members
  • 10,605 posts
Do not allow yourself to fall for the marketing gimmick that I refer to as the "Big RAM Scam". Large amounts of cheap VRAM hung onto a slow graphics device are simply wasted there, when games are what the RAM is needed for. If the video device has a 64-bit memory system, the maximum it can use is 128 MB.
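
A bit of illustrative arithmetic on why the memory bus, not the VRAM number on the box, is the real ceiling (a minimal sketch with hypothetical clock figures, not the specs of any specific card):

# Memory bandwidth = bytes per transfer * effective transfers per second.
# The bus widths and effective clocks below are hypothetical examples.
def bandwidth_gb_per_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

print(bandwidth_gb_per_s(64, 1600))   # 12.8 GB/s  -- narrow 64-bit bus, cheap DDR3
print(bandwidth_gb_per_s(256, 4000))  # 128.0 GB/s -- 256-bit bus with fast GDDR5
# Hanging 1 GB of slow VRAM on the 64-bit card does nothing about that
# 12.8 GB/s ceiling, which is what actually throttles games.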

#47
Gorath Alpha
  • Members
  • 10,605 posts
Here's a more detailed discussion of that marketing ploy:

http://social.biowar...6/index/1186486

#48
Gorath Alpha
  • Members
  • 10,605 posts

Gorath Alpha wrote...

Dragon Age: Origins is the "Senior" set of forums on this site. Before there ever was any ME-2, a series of reference articles was prepared in that game's Self-Help Tech forum. There is a version of this thread there, but most of the other references do not have matching copies edited for Mass Effect 1 or 2.

PC Hardware* Basics for Gaming (and inventory of Components):
http://social.biowar...58/index/509580

Getting the most value out of the Graphics Budget dollar (DAO)
http://social.biowar...8/index/7196223

Video Card Shader Performance Rankings* (ME-1):        
http://social.biowar...1/index/3117442

Generational Ladders* (and NTK-based shaders ranking list - "old" class markers)
http://social.biowar...58/index/575571

Very basic discussion* of video cards, video chips, PhysX, and even of laptops' limits: 
http://social.biowar...58/index/519461

Those are the articles about graphics performance.

The HD 5450 falls off the bottom somewhere, so it was never included.

#49
Baramon
  • Members
  • 375 posts
Thanks for the obviously phenomenal amount of work/research/data-gathering that has gone into this and other threads you've created/helped foment over the year(s), Gorath Alpha. I've tried to read them all, but admit to skimming over quite a few if I didn't find a particular pertinence to one or the other. Still, I've bookmarked quite a few.

What do you think of this website, Gorath? http://www.cpubenchmark.net/ . It lists lots of CPUs, video cards, and hard drives in rankings of performance. I reference it quite frequently to get an idea of how one card or CPU compares to another. Do you think it is unbiased enough to be helpful/believable? So far it hasn't let me down, but I don't have enough experience with a wide enough range of components to make a qualified judgment on it. Is there a better site for that sort of stuff that you know of?

#50
Gorath Alpha
  • Members
  • 10,605 posts
Without other sources, Passmark isn't necessarily "bad" for comparing some hardware performance levels, although it isn't gamer-oriented. For that, Tom's Hardware's benchmarking is superior, particularly for GPUs. Beginning with the very first "shader-heavy" game, Oblivion, my predecessor for the lists, Not the King, sought out benchmarks that reflected the demands of such games. When it's time to expand the current GPU lists, I try to interpolate from Tom's Hardware benches that I consider pertinent to the shader-heavy emphasis I have tried to continue (and I am overdue).