Failed to detect a supported video card
#1
Posted 23 February 2011 - 04:34
Has anyone found a solution to this error? The game starts then abruptly stops saying, "Failed to detect a supported video card."
This is my video card: ATI Radeon HD 5570
Do I need a driver update or is there a different issue?
Appreciate any help, thanks.
#2
Posted 23 February 2011 - 04:47
I am having the same issue.
Mobile Intel® 945GM Express Chipset Family
ialmrnt5.dll [6.14.10.4421]
#3
Posted 23 February 2011 - 04:53
Mr.Sprinkles wrote...
Has anyone found a solution to this error? The game starts then abruptly stops saying, "Failed to detect a supported video card."
This is my video card: ATI Radeon HD 5570
Do I need a driver update or is there a different issue?
Appreciate any help, thanks.
A driver update wouldn't hurt.
BTW, I'm not sure if the 5570 is actually supported. But it would take extra effort to make the game specifically not run on compatible ATI or Nvidia hardware, and BW has never done that in the past, so I'm guessing this is indeed some kind of bug or driver incompatibility.
Daidoji Ryushi wrote...
I am having the same issue.
Mobile Intel® 945GM Express Chipset Family
ialmrnt5.dll [6.14.10.4421]
Probably because Intel graphics aren't supported?
I know DAO was known for working on Intel graphics chips in a surprising number of cases, so I suppose you could try updating drivers. But I wouldn't hold your breath.
#4
Posted 23 February 2011 - 05:54
#5
Posted 23 February 2011 - 09:27
Run searches on your subject title using Google or Yahoo, adding a site: operator pointed back at the Social Forums, to find the many older DAO threads on the same issue.
{Half an hour later, I ran that search, and Google reported: "About 149 results (0.14 seconds)"}
Here are two of those whose URLs I'd put aside some time ago:
"Failed to detect a supported video card"
http://social.biowar...8/index/81411/1
http://social.biowar...58/index/402605
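If you have never used the site: trick, here is a hypothetical example of the kind of query meant above (the forum's host name is cut off in the links, so treat the exact address as my assumption):

site:social.bioware.com "Failed to detect a supported video card"

Swap the quoted phrase for your own thread title or error text, and the search engine will only return hits from inside the Social Forums.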
Edited by Gorath Alpha, 26 February 2011 - 07:30.
#6
Posted 23 February 2011 - 10:05
So check your screen ratio setting =)
#7
Posted 23 February 2011 - 03:27
Gorath Alpha wrote...
Run searches on your subject title using Google or Yahoo, adding a site: operator pointed back at the Social Forums, to find the many older DAO threads on the same issue.
{Half an hour later, I ran that search, and Google reported: "About 149 results (0.14 seconds)"}
Here are two of those whose URLs I'd put aside some time ago:
"Failed to detect a supported video card"
http://social.biowar...8/index/81411/1
http://social.biowar...58/index/402605
That's exactly what I did beforehand, but I never found a clear solution to the problem. I also searched the title of this thread in this forum and it came up with 0 results. Now there are numerous results; when I posted, I couldn't find any.
And yes, adjusting the screen resolution works perfectly! Thanks.
#8
Posted 26 February 2011 - 07:02
#9
Posted 26 February 2011 - 07:22
Simple.
Gorath Alpha wrote...
If you are getting the error message from the Demo that it cannot detect a supported video graphics card, it probably means that you overlooked the requirements (video in boldface emphasis):
Minimum:
OS: Windows XP 32-bit with SP3
OS: Windows Vista 32-bit with SP2
OS: Windows 7
CPU: Intel Core 2 Duo (or equivalent) running at 1.8 GHz or greater
CPU: AMD Athlon 64 X2 (or equivalent) running at 1.8 GHz or greater
RAM: 1 GB (1.5 GB Vista and Windows 7)
Video: Radeon HD 2600 Pro 256 MB (should be 2600 XT or X1800 GTO)
Video: NVIDIA GeForce 7900 GS 256 MB cards (unless this should be 7800 GS)
Disc Drive: DVD ROM drive required
Hard Drive: 7 GB
Sound: DirectX 9.0c Compatible Sound Card
Windows Experience Index: 4.5
You need at least a Mainline Gaming card ("n600" for Radeons and older Geforces, or "n50" for newer Geforces), newer than a 2004 design. The older minimum card only offers the Dx9 version of Direct3D.
You also need a modern full-powered CPU, not a Celeron / Pentium Dual / P4 / Sempron.
Only Desktop style PCs provide a convenient upgrade route for video graphics upgrades. For all practical purposes, you upgrade a laptop by replacing it entirely.
The official minimum Radeon is the HD 2600 Pro, a year newer than the Geforce minimum card, with Dx10 shader capability. The Geforce only offers Dx9 functions. Neither of those are available new, so the effective minimum Radeon becomes the three year old HD 3850, which is already in very short supply. A Geforce 9600 GT is probably nVIDIA's oldest available equivalent to a 7900 GS. To be most practical, you probably have to purchase at least a Radeon HD 4770 or a Geforce GTS 250, although once again, the Radeon will have Dx11, while the Geforce only offered Dx10 then.
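If you are not sure exactly which adapter and driver version Windows itself sees, here is a minimal sketch of one way to ask it (this assumes Windows XP/Vista/7 with the built-in wmic tool available and a Python install on the machine; it is only an illustration from me, not anything shipped with the game):

import subprocess

# Ask Windows, through WMI, which display adapters it reports and their driver versions.
info = subprocess.check_output(
    ["wmic", "path", "Win32_VideoController", "get", "Name,DriverVersion"]
)
print(info.decode(errors="ignore"))

Running dxdiag and checking the Display tab gives you the same information without any scripting.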
Edited by Gorath Alpha, 26 February 2011 - 07:28.
#10
Posted 26 February 2011 - 07:41
#11
Posted 26 February 2011 - 08:05
I imagine that if you had one of the Low End Geforce 7n00 or Radeon X1n00 cards instead, like a 7600 or an X1600, the Demo would do just that: load itself, and then really struggle.
(P.S. Added here in edit. Yes, to any kibitzers out there, when the 7600 and the X1600 were brand new, back in 2005, they were supposed to be Mainline Gaming cards, and the 7600 GT most certainly was that. The X1600 had much better shader capability, with many more shader processors, but was really slow, as was the 7600 GS. By mid-2005, the X1600 was replaced by several new X1650 cards, so that Radeons no longer compared so poorly on speed to the 7600 GT. That was all four and five years ago, and only one of those can still manage to reach the "budget game" borderline level in between Business and Mainline any more -- that would be the X1650 "XT".)
Edited by Gorath Alpha, 27 February 2011 - 11:43.
#12
Posted 26 February 2011 - 08:18
Edited by Solarain, 27 February 2011 - 08:56.
#13
Posted 26 February 2011 - 08:33
All of those had Dx9.0 "a", which added SM-2 shaders, but not any part of the SM-3 shaders. The reason the 9600 and the Low End Xn00s were the same was the X360 video component, the design of which overlapped at that point (and required the efforts of the majority of the technical staff), so the X700, X800, and X850 were only slightly warmed over from the 9700 / 9800 designs, adding only half of the SM-3 pixel shader functions in Dx9.0 "b". nVIDIA's Geforce 6n00s came with Dx9.0 "c", and the full SM-3 shader functionality, although those were rushed out the door after the FX card screwup.
Edited by Gorath Alpha, 28 February 2011 - 12:26.
#14
Posted 27 February 2011 - 08:53
What I found most interesting about your post was that the older nVIDIA cards had the full SM-3 shader functionality. Back then, they might have been a bit better than the ATI cards, but that's not the case anymore. Fortunately.
#15
Posted 27 February 2011 - 09:43
Microsoft wanted to renegotiate their contract with nVIDIA, and the boss at nVIDIA refused to even consider the idea.
Things got out of hand, and became intense and personal. It nearly put nVIDIA out of business. The Xbox360 project was fast-tracked, ATI got that video contract, and Microsoft cut nVIDIA out of the loop while developing Dx9. ATI was in on Dx9 from the beginning, and designed their Radeon 9500 / 9700 pair around the new SM-2 shaders in it, with the very first shader unit processors in the architecture. It was truly a highly advanced "VPU" (that was an ATI choice for the products, rather than "GPU") at the time.
Instead of either Dx8 or Dx9, the Geforce FX 5n00 cards had something totally independent, created by nVIDIA, that was supposed to "know" all about all Direct3D functions, plus many more of its own. But it required the game developers to use a much more complicated set of tools to implement, and ended up being ignored. The FXes just didn't handle DirectX very well at all. nVIDIA lost several tens of millions of dollars that year, but luckily had some huge cash reserves.
nVIDIA had a temporary over-staffed situation, having absorbed 3dFX's personnel in the merger, so they not only ramped up the Geforce 6n00 schedule, they also pulled their FX 5600 and FX 5800 off the production lines, and replaced them with the FX 5600 Ultra, the FX 5700s and the FX 5900 / 5950s.
Their Geforce 6n00s ended up being comparatively slow, except for the 6800 GT and 6800 Ultra, and they had very few shader processor units, so that the Radeons were faster overall, especially the less expensive ones, and the image quality of the Radeons was far superior (at least it was, when the drivers were not screwed up).
ATI thought that their lead with the Radeon 9n00 generation was so large that they didn't need to hire temporary people for the Xbox360 video design work, and warmed over the 9n00s in various ways, such that the Xn00s from X700 to X850 were faster than the Geforces once again, although they did shortchange them on the next two steps in Dx9, going only for half of the SM-3 shader functions, while increasing the number of shader units in them.
Had they matched the X800s in particular with good drivers at every update point, they would never have allowed nVIDIA a chance to recover from the FX 5n00 debacle.
Gorath
Edited by Gorath Alpha, 28 February 2011 - 12:49.
#16
Posted 28 February 2011 - 09:34
#17
Posted 28 February 2011 - 09:56
The Voodoo range of cards, back when they were still 3dfx Interactive, outmatched Nvidia, ATI and even Matrox.
The Voodoo 3 3000 series outmatched Nvidia's flagship, though only marginally, but that was in general down to a wrong marketing strategy by 3dfx.
The FX series you described was created by Nvidia's in-house developers together with those of 3dfx, however it never produced the results they had hoped for. Which is logical, as 3dfx was bound to Nvidia's rules at the time.
3dfx also created the first SLI function shortly after the development of their Voodoo 2 range of cards. Nvidia copied this in 2004 and re-released it, albeit in a botched version. Up to today, Nvidia's SLI and even ATI's Crossfire are botched-up versions of the original.
Scan-Line Interleave as designed by 3dfx allowed close to a 100% increase in processing speed, whereas Nvidia's Scalable Link Interface hardly achieves 50%.
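To put rough, purely illustrative numbers on that comparison (simple arithmetic, not measured benchmarks): if a single card manages 30 frames per second, near-100% scaling from a second card would mean roughly 30 x 2 = 60 fps, while 50% scaling from the second card only gets you to about 30 x 1.5 = 45 fps.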
Which, all in all, is a shame if you ask me.
Edited by Syrellaris, 28 February 2011 - 10:20.
#18
Posted 28 February 2011 - 01:26
If my memory were still the really useful talent I once had, I could have gone on in greater depth and bored everyone ever so much more, but I've grown relatively old now, and much of that has slipped away. Very briefly, I was much more into the software side of IT between 1970 and 1990, when I was working in the Property Casualty Insurance industry at various jobs, all of which tied one way or the other into the premiums (money) and the computer programs used to do the accounting of that money.
Solarain wrote...
You know so much about some of the history of the companies involved, that perhaps you worked for one of them in that industry? Or you just like reading all the latest on graphics cards? That's a lot of info, thanks. =) What graphics card do you use & why?
There was a business reversal for quite a few of the companies in the city I worked in then, and my mother became ill with bone cancer. I returned here, took a nothing job to allow me the freedom to take a lot of (unpaid) time off when she was getting radiation or chemotherapy treatments, and made the funeral arrangements she insisted be put in place early. I tried to mend fences with my father, practically a lost cause, but again, she insisted.
Before leaving Houston, while my financial situation was relatively generous, I had started owning desktop computers for both my home and my work, where only the commercial computers (mainframes and minis) were considered correct, but there were things I could do at my desk, in my office, with a "Microcomputer" that I could not have done on a workstation (now my memory is doing its new & bad thing; that's the wrong term and the actual one is eluding me. It means "console" -- OK, I remember, "Terminal").
When I was equipping my home after 1982, I usually had an X86 PC for business type activity, and a separate smaller gaming computer such as an Apple ][ or Commodore 64 to play games with. I experimented some with assembling my own "Clones" of the IBM PC-AT 286 and 386 systems for MS-DOS as early as 1987, but typically ordered custom PCs built for me by a local "White Box" builder. Here in this city after 1990, I was in much more restricted financial straits and went back to building my own machines, but that time it was based on how inexpensively I could get by.
My emphasis in computing was shifting from software to hardware already, even then. After my mother passed away, when I hoped to get back into a better, more responsible job, my non-technical (Accounting) degree got in the way. Already middle-aged, I returned to college, obtained a BS degree in "Business Technology", and ended up teaching my subject at the local college I had attended. But without the MS it was what was referred to as "full-time temporary", not qualified for tenure, and although I enrolled in graduate school, I couldn't afford the costs to finish, and didn't qualify for a great deal of the financial help that younger students then obtained very easily (my credit had been badly savaged by the period of very low pay in the early '90s).
Water under the bridge, though; I really enjoyed teaching, I just wish it had both paid better and allowed me a better way to put something aside for retirement. In 1999, I built a Pentium P1 MMX 233 PC for Windows 98, and about a year and a half later, a P2/400 for Windows 2000. Baldur's Gate annoyed the devil out of me in countless ways, and I really didn't get my money's worth from it. Icewind Dale flatly wouldn't run for me on either of those PCs, no matter what I did to get that game to work.
I blamed ATI, Creative, and Intel, my anger in reverse order to alphabetical, for being so screwy at the time. I still don't use Intel Chipsets for anything twelve years later, and Creative is practically the only audio option you can acquire readily from every eTailer. I bought several of nVIDIA's FXes in 2004 / 2005, to equip my "Family Room" and my own computer room with game playing PCs. The FXes failed me because they were so incompatible with DirectX, and I was very upset with nVIDIA. I put in a lot of time studying graphics then, to avoid being screwed over like that ever again.
Gorath, the Long-winded
#19
Posted 02 March 2011 - 05:37
I couldn't take hi-tech work; it had no human interaction that made any real difference. I went to work for my parents' company and became the Operations Manager, overseeing all departments, including the IT dept. I ordered all the new equipment, saving costs for the company. We had several servers and equipped medical transcriptionists to do their work online. They eventually laid me off - typical of family and business not mixing - and I had unemployment and the ability to have the state pay for my schooling. So I left the world of hi-tech behind & became a massage therapist. A huge reduction in pay & benefits, but it was worth it for my sanity. So this is why I've fallen behind on technology.
I've stayed in touch, somewhat, with hi-tech by volunteering my time to test some of the new Microsoft Xbox games. It's fun, I get a small token for taking part. And they love it when a woman my age takes an interest in giving them feedback. I never grew up. =) I've always wanted to be able to give input into games like Dragon Age - the writing is sometimes excellent, sometimes could use some real improvement in how it makes a player feel. But ya know, you need specific education for that stuff.
I've never bought an nVIDIA card, only Radeon, finding them a better deal. That's why I asked you which cards you prefer.
Back when I was playing Asheron's Call, a couple of friends & I were lying on the ground, looking up at the sky (they had such awesome graphics for the sky) and it was night. One of them was commenting on how cool it was to see the Milky Way & stars. I said, "What stars?" That's when they found out I had an outdated graphics card. One of them actually sent me money (never met the person in person) to buy a new card. I got a GeForce something or other, can't remember which, because he sent me money for that particular card, saying it was the best. But when I saw all the graphics I'd been missing, I swore I'd never let my graphics card get outdated again. And yet, here I am... not able to play the Dragon Age 2 Beta, let alone the new release coming out. Ha! So I needed a full new system, ordered the individual parts & am building it from scratch. But this time I might have my (current) spouse or son build it for me.
I can also be long winded, like Wynne, but not her age yet! =)
#20
Posted 02 March 2011 - 12:19
*sigh* Sorry if I'm sounding a bit crazy here, I'm low on sleep, and I'm freaking out. Any help would be much appreciated, guys.
#21
Posted 02 March 2011 - 02:17
I'm more knowledgeable about graphics than about any other area of PC tech; and if you don't know any more about tech than you do, the logical thing from the start would have been to go with a console version. But I'm a game player; I can't tell you what any retailer will or won't do with a pre-order. Sorry.
For Solarain, my single-parent days were from 1966 to 1984 or so; the ex had no more interest in parenting than a feral cat, and my oldest seemed to never want to give up his room here at home. He kept bouncing from place to place, job to job, and relationship to relationship, and spent a few months back here again in between many of his changes in direction.
I have an HD 4850 in one PC, an HD 3870 in another, and a Geforce 9800 GT in a third one. The HD 3870 is in my "favorite" PC, but that card is getting pretty old, even by my loose standards. If I felt flush, I'd get an HD 5830 or HD 6850 right about now, but my budget is always pretty tight.
#22
Posted 02 March 2011 - 02:22
This was my first time posting here on the forums; this seemed like the sort of place to ask about stuff. I've calmed down now, and started thinking about how to deal with it. Sorry to interrupt.
#23
Posted 11 March 2011 - 08:04
would it run?
#24
Posted 11 March 2011 - 08:28
Edited by Gorath Alpha, 11 March 2011 - 08:34.
#25
Posted 12 March 2011 - 06:53
I have a Toshiba laptop... can I even change my graphics card?