TTBOMK, an HD 5670 will be fine with a relatively recent 375-watt PSU delivered as part of an OEM rig. But if you're basing any game-related choices on DX11 rather than the much-sooner-to-be-useful DX10, whatever anyone buys in 2010 will be out of date before DX11 is a factor. As noted, the only Fermi that will run on less than a 450-500 watt supply is the GT 430, and it's just not a real gaming card, no matter its numbering.
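For anyone wanting to sanity-check their own rig, here's a back-of-the-envelope sketch of the wattage reasoning above. The TDP figures and the 200 W rest-of-system estimate are my own rough assumptions, not official specs:

```python
# Rough PSU headroom check -- all numbers here are illustrative assumptions.
def psu_ok(psu_watts, gpu_tdp, rest_of_system=200, headroom=0.2):
    """True if the PSU covers GPU + rest of system plus a 20% safety margin."""
    required = (gpu_tdp + rest_of_system) * (1 + headroom)
    return psu_watts >= required

print(psu_ok(375, 61))    # HD 5670 at a ~61 W TDP -> True
print(psu_ok(375, 160))   # GTX 460 at a ~160 W TDP -> False
```

With a 20% margin, a 375 W OEM supply clears the HD 5670 easily but falls short for the hungrier Fermi parts, which matches the point above.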
The 450 and 460 do suck down sizable amounts of juice, you are correct there. Don't forget that nVIDIA is working with handicaps regarding its future, since it isn't partnered with any CPU producer, and the low-end discrete card market is already shrinking rapidly, with AMD close to releasing a desktop Fusion APU (the Netbook and low-end notebook Fusions are already in production). nVIDIA is planning a future as a massively parallel CPU producer. Those chips tend to be intensely voracious regarding current, and its graphics GPUs are essentially now byproducts of that other market.
P.S. Intel's Sandy Bridge is due in early 2011; in it, the same basic low-quality Intel video that now rides piggyback inside the processor package of the i3, i5, and i7 is entirely integrated into the CPU itself. But "Fusion" is practically already here. IMO, Sandy Bridge will not have any serious impact on gaming.
The AMD device integrates far more capable graphics, closely related to the Radeon HD 5x00 generation, into its multi-core CPU. The mobile versions are already in the (figurative) hands of Netbook, notebook, and laptop manufacturers, with the PCs using them expected around the turn of the year or even sooner. The desktop Fusion APUs are expected in February. Those are the parts likely to matter to gamers.
The latest news is that the long-standing feud between Intel and nVIDIA, over whether nVIDIA's contract with Intel included the right to design chipsets for the newer Intel CPUs, was about to go to court after six years of wrangling (dating back to when Core Duo was a mere embryo). But both sides agreed to ask for a continuance because they are back in negotiations now.
http://www.insidehw....t-chipsets.html
The conjecture going around is that Intel wants to protect its dominant lock on the Netbook market, which Fusion threatens, given its many-times-superior graphics. nVIDIA's ION is a low-current device that offers graphics performance almost competitive with Fusion for games, and superior performance in some other areas. Intel may hope to save a bigger part of its Netbook market if it can pair up ION and Atom at a good price, which right now, it cannot.
Initially, the business-grade Fusion APU, with graphics roughly like an HD 5450's, will probably be priced at almost what similar AMD processors without graphics have been. That could really put a dent in nVIDIA's sales of chipsets for AMD processors, and of discrete cards like the GeForce 210. But those aren't "serious" game parts, not really, although they may help raise what is presently a very dismal average of graphics performance among mobile devices.
Although pricing isn't being discussed yet, the presumption is that the difference between an APU with business graphics and one with the equivalent of HD 5570 graphics integrated will be small compared to the cost of a card: probably less than $15 to the OEMs, translating to maybe $25 retail (my guess there).
THOSE better APUs will run games such as ME2 without any separate GPU card, which is why they're significant to this article.
Gorath
Edited by Gorath Alpha, 04 December 2010 - 08:41.