
UPDATED - DX11 known issues and drivers


1560 replies to this topic

#1026
Kayden SiNafey
  • Members
  • 131 messages

moop1167 wrote...
I am on 1920x1200 with all settings max, including 8xAA.  If I drop it down to 4xAA, yeah, I get 50+ at all times. 


I was bottoming out at about 27fps, usually around 32 in combat, but walking around I had about 37 to 45 at 8x. This was with the stupid shadows in the upper left; I figured out that if I went to 4x I got that huge jump as well and the shadows were gone. It sounds like you had the same problem but couldn't see those shadows, and they caused you to have lower fps. Well, I think that narrows down the problem.

If anyone is using Very High with everything enabled at 8x AA and 16x AF and wants better FPS, drop it down to 4x AA and 16x AF, because there is a bug with how the game uses 8x AA.

Really wish there was someone at Bioware we could send this info to directly.
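
For what it's worth, here is roughly the check a game is supposed to make before it even offers 8x in the options menu. This is just a bare D3D11 sketch I put together to illustrate the API call, not anything from DA2's code: it asks the driver which MSAA sample counts are actually usable for a given render-target format.

// Minimal D3D11 sketch: ask the driver which MSAA sample counts are usable
// for a render-target format. An engine would normally do something like this
// before exposing 2x/4x/8x AA in its options. Illustration only, not DA2 code.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device* device = nullptr;
    // Create a plain hardware device on the default adapter (no swap chain needed).
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, nullptr)))
        return 1;

    for (UINT samples = 2; samples <= 8; samples *= 2)
    {
        UINT quality = 0;
        // quality == 0 means this sample count is not supported for the format.
        device->CheckMultisampleQualityLevels(DXGI_FORMAT_R8G8B8A8_UNORM,
                                              samples, &quality);
        std::printf("%ux MSAA: %s (%u quality levels)\n",
                    samples, quality ? "supported" : "NOT supported", quality);
    }

    device->Release();
    return 0;
}

If 8x comes back as supported but the game still does the shadow-smear thing at 8x and not at 4x, that points at the game or the driver rather than the hardware.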

Edited by Kayden SiNafey, 19 March 2011 - 08:16.


#1027
TallBearNC
  • Members
  • 986 messages
I'm using 267.59 @ 2560x1600 in very high, no detail boxes checked, AFx16 and no AA, and I get somewhat good performance, but my cards aren't fully used in SLI mode... oddly, using 2-4X AA, I get a performance BOOST (by 15fps), and the cards go to nearly 100% usage.

Either the game (or the driver) isn't fully using SLI. However the use of AA seems to force the cards into fully being used. It's the only explanation I can see in this case.

If I DISABLE SLI, then turning on AA in any way, at any level, kills performance.

Specs are in my sig.

Edited by TallBearNC, 19 March 2011 - 08:28.


#1028
foil-
  • Members
  • 550 messages

TallBearNC wrote...

I'm using 267.59 @ 2560x1600 in very high, no detail boxes checked, AFx16 and no AA, and I get somewhat good performance, but my cards aren't fully used in SLI mode... oddly, using 2-4X AA, I get a performance BOOST (by 15fps), and the cards go to nearly 100% usage.

Either the game (or the driver) isn't fully using SLI. However the use of AA seems to force the cards into fully being used. It's the only explanation I can see in this case.

If I DISABLE SLI, then turning on AA in any way, at any level, kills performance.

Specs are in my sig.


Hmmm, want to upload a photo of that setup for bragging rights?  Still, think you need to get a third 30inch monitor (and maybe 3way sli to support it) ;)

#1029
Kayden SiNafey
  • Members
  • 131 messages

TallBearNC wrote...

I'm using 267.59 @ 2560x1600 in very high, no detail boxes checked, AFx16 and no AA, and I get somewhat good performance, but my cards aren't fully used in SLI mode... oddly, using 2-4X AA, I get a performance BOOST (by 15fps), and the cards go to nearly 100% usage.

Either the game (or the driver) isn't fully using SLI. However the use of AA seems to force the cards into fully being used. It's the only explanation I can see in this case.

If I DISABLE SLI, then turning on AA in any way, at any level, kills performance.

Specs are in my sig.


Have you tried the modified drivers? I have heard of people getting great fps with SLI enabled after they did that. Also, if you're using Nvidia's beta drivers, there is one specifically for the 580s with a multi-monitor bug fix; I'm not sure which version it is, but it may also fix the problem.

foil- wrote...

Hmmm, want to upload a photo of that setup for bragging rights?  Still, think you need to get a third 30inch monitor (and maybe 3way sli to support it) ;)


You don't need 3-way SLI to link 3 monitors together, you just need 2-way, but 3-way would help with resolutions that big. If you plan to run 3 independent monitors, however, you do need a third card not in SLI to accommodate that. I plan to go with three 24 inch monitors on a single stand here in a few months and really push this thing; I just might have to OC my CPU and further push my factory OC video cards.

Edit: This is for the 4xx and 5xx series though. With the 2xx series, however, you do need to go 3-way, if that is what you were referring to; I apologize I didn't think of that sooner.

Edited by Kayden SiNafey, 19 March 2011 - 09:18.


#1030
TallBearNC
  • Members
  • 986 messages
Yes I stated that I use 267.59, and they are modded.

Keep in mind I run at a VERY high resolution. I refuse to drop to 1920x1200. I have done it, and then I get AWESOME FPS, even on very high with everything checked, but as most of you know, if you run an LCD outside its native res, things look very fuzzy as the monitor (and/or card) must add/remove extra pixels to compensate, resulting in a blurry or fuzzy-looking screen.

Times like this are when I wish I had a backup 20" 1920x1200 monitor to play on :)

TBH, I sort of shot myself in the foot with my rig. 2560x1600 is a LOT to ask of any game/GPU combination. I should have gotten two 20" 1920x1200 monitors. I'll be honest, 30" is HUGE, especially when you sit only a few feet from it.

Edited by TallBearNC, 19 March 2011 - 09:16.


#1031
Kayden SiNafey
  • Members
  • 131 messages

TallBearNC wrote...

Yes I stated that I use 267.59, and they are modded.

Keep in mind I run at a VERY high resolution. I refuse to drop to 1920x1200. I have done it, and then I get AWESOME FPS, even on very high with everything checked, but as most of you know, if you run an LCD outside its native res, things look very fuzzy as the monitor (and/or card) must add/remove extra pixels to compensate, resulting in a blurry or fuzzy-looking screen.

Times like this are when I wish I had a backup 20" 1920x1200 monitor to play on :)

TBH, I sort of shot myself in the foot with my rig. 2560x1600 is a LOT to ask of any game/GPU combination. I should have gotten two 20" 1920x1200 monitors. I'll be honest, 30" is HUGE, especially when you sit only a few feet from it.


Oh yeah, sorry I missed that; I thought you were referring to the beta drivers, not the modified ones. I was holding 3 conversations so I got a little distracted. :blink:  Try those 580-specific beta drivers and see if that fixes it; it's possible the hacked drivers don't fix that issue. Or just remove one of the monitors, but then you have to put all the icons back, and I would hate to do that myself.

I know what you mean about dropping the res, though; I have the same problem, and it isn't something I like to compromise on either. You put good money into something new, it should be able to handle it; if it doesn't, then I get upset myself.

Well, at least you got more than what you needed, but I'm sure you made the best decision at the time.

Edited by Kayden SiNafey, 19 March 2011 - 09:25.


#1032
TallBearNC
  • Members
  • 986 messages
A quick note: I yanked one of the monitors, and that gave me about an 8 FPS boost... not much, but I'll take it :)

#1033
moop1167
  • Members
  • 22 messages

Kayden SiNafey wrote...

moop1167 wrote...
I am on 1920x1200 with all settings max, including 8xAA.  If I drop it down to 4xAA, yeah, I get 50+ at all times. 


I was bottoming out at about 27fps, usually around 32 in combat, but walking around I had about 37 to 45 at 8x. This was with the stupid shadows in the upper left; I figured out that if I went to 4x I got that huge jump as well and the shadows were gone. It sounds like you had the same problem but couldn't see those shadows, and they caused you to have lower fps. Well, I think that narrows down the problem.

If anyone is using Very High with everything enabled at 8x AA and 16x AF and wants better FPS, drop it down to 4x AA and 16x AF, because there is a bug with how the game uses 8x AA.

Really wish there was someone at Bioware we could send this info to directly.


Yeah, I get the shadows in the top corner sometimes but if I move my camera around a bit they go away eventually.  But 8xAA REALLY kills performance.  I could drop it down to 4xAA I guess until it's fixed totally.

#1034
Baramon
  • Members
  • 375 messages
Yep, I said I'd post back if I got the stones to try it (the 267.59 version drivers) and they work beautifully.

I get very generous framerates (~60fps) in most indoor places and some outdoor places, with a few (brief) dips here and there down to the ~25-30 range in very high-detail areas.  Very nice indeedy!!

In case anyone forgot (or is interested):
AMD X6 1055T
PNY NVidia GTX 470 (1280MB)
4GB DDR3 RAM
Windows 7 Home Premium 64-bit
(the rest is just cheese...)

Thanks, everyone.:)

#1035
Silvanend
  • Members
  • 236 messages
My problem, and basically to sum up everyone else's problem, is that the game is not at all optimized for the PC.  Please fix it, Bioware.

#1036
Kayden SiNafey
  • Members
  • 131 messages

moop1167 wrote...
Yeah, I get the shadows in the top corner sometimes but if I move my camera around a bit they go away eventually.  But 8xAA REALLY kills performance.  I could drop it down to 4xAA I guess until it's fixed totally.


Yeah, I wanted to run this at 8x AA, but the shadows are just too much for me, so I figured why not try just changing the AA, and that fixed it on the first try. I would guess this is a problem with the game and how it renders at 8x. I have not tried forcing it in Nvidia's CP, but I've seen other issues come up with doing that in other games, so I try not to do it. They should have this fixed in the next patch, because I've reported it; more than 3 people have told me it fixed their issue, so I have a little info to back it up.


Baramon wrote...

Yep, I said I'd post back if I got the stones to try it (the 267.59 version drivers) and they work beautifully.

I get very generous framerates (~60fps) in most indoor places and some outdoor places, with a few (brief) dips here and there down to the ~25-30 range in very high-detail areas.  Very nice indeedy!!

In case anyone forgot (or is interested):
AMD X6 1055T
PNY NVidia GTX 470 (1280MB)
4GB DDR3 RAM
Windows 7 Home Premium 64-bit
(the rest is just cheese...)

Thanks, everyone. :)


Glad to hear it! I wish you happy gaming now dude!


Silvanend wrote...

My problem, and basically to sum up everyone else's problem, is that the game is not at all optimized for the PC.  Please fix it, Bioware.


This seems to be either a video card driver issue or a settings issue with the game. What are your specs and what is your problem? There may be a workable solution for you here in the short term.

#1037
JJDrakken
  • Members
  • 800 messages
The new Patch just causes the game to flat out hard freeze now & stop responding.

Before, I had slowdowns; now I get one slowdown, and then at some point the game just stops working, period.

-sighs-

70 bucks for a beta, I am never doing this again.  I should have waited for the "Ultimate Edition" of this game, just like DA1.


JJ

#1038
sav86
  • Members
  • 4 messages

TallBearNC wrote...

Yes I stated that I use 267.59, and they are modded.

Keep in mind I run at a VERY high resolution. I refuse to drop to 1920x1200. I have done it, and then I get AWESOME FPS, even on very high with everything checked, but as most of you know, if you run an LCD outside its native res, things look very fuzzy as the monitor (and/or card) must add/remove extra pixels to compensate, resulting in a blurry or fuzzy-looking screen.

Times like this are when I wish I had a backup 20" 1920x1200 monitor to play on :)

TBH, I sort of shot myself in the foot with my rig. 2560x1600 is a LOT to ask of any game/GPU combination. I should have gotten two 20" 1920x1200 monitors. I'll be honest, 30" is HUGE, especially when you sit only a few feet from it.



Don't say such a thing! 30-inch monitors are the shiz =p I'll never go back now, with all my workspace (granted I do a lot of editing besides gaming; anything smaller seems insufficient). While I agree with you that it is asking a lot for any GPU/CPU to render and process a game at that resolution, our NVIDIA 480/580s were designed to render at higher-than-average resolutions. Something is at fault here for the dramatic performance decrease, and I refuse to believe this title has already brought us to a point where gaming truly tests our systems.

#1039
Kayden SiNafey
  • Members
  • 131 messages

sav86 wrote...
Don't say such a thing! 30-inch monitors are the shiz =p I'll never go back now, with all my workspace (granted I do a lot of editing besides gaming; anything smaller seems insufficient). While I agree with you that it is asking a lot for any GPU/CPU to render and process a game at that resolution, our NVIDIA 480/580s were designed to render at higher-than-average resolutions. Something is at fault here for the dramatic performance decrease, and I refuse to believe this title has already brought us to a point where gaming truly tests our systems.


Well, a lot of people are having major performance issues with this game, and it isn't the hardware's fault. I say it's Bioware's fault, because they rushed this out so fast that Nvidia and ATI had no time to get the right kind of drivers out, and now all parties are scrambling to get it resolved. I think when Crysis 2 is released we will see new WHQL drivers from Nvidia, but I am not too confident that they will have had enough time to fix the performance problems plaguing DA2.

Haha on the 30" comment. I would like to go with triple 30s in a few months, but the cost is just way too high for a stand that supports all 3 at once. I have to have the stand because I needed to remove the addition I put on my desk for the 3rd monitor, to make room for my case and its new 4-fan radiator system with riser. The fans were less than an 8th of an inch from the bottom of the desk, and that isn't going to be enough to push all that hot air out from the radiator come summer, so off she came. Thus 24" is what I need to get, because it is at least reasonable for a stand at $400, not $800+ for the 30". I know what you mean about workspace: when I had to go to one monitor at 1920x1080 instead of 1920x1200, it killed my efficiency in Supreme Commander 2 and then in all my 2D & 3D apps; I just can't see as much of my projects anymore and it SUCKS!!!! I do prefer 16:10; I know it isn't much more, but you do miss it when it is gone.

EDIT: Also, I don't think we're going to see anything really challenge this hardware for at least another 2 years, because everything is being developed for consoles, which is a MAJOR mistake IMO. The PC has led the way for GFX on consoles because we pushed the boundaries; if it isn't done on this platform, it will never happen on a console, plain and simple. Crysis IMO was the last game to truly challenge the hardware of its time, because it wasn't a driver problem or in-game features not being used correctly, it was the engine being so advanced for its time. If it wasn't for them, we wouldn't have seen the pushes made in the Unreal ED or in PS3 GFX, because many still point to Crysis and say they are making it look better than that. *sigh*

Edited by Kayden SiNafey, 20 March 2011 - 08:22.


#1040
MaxPayne37
  • Members
  • 149 messages

Kayden SiNafey wrote...

MaxPayne37 wrote...

There is a physical difference, not a software difference between DX versions. DX11 just happens to be backwards compatible with DX10 providing DX10 effects, but not DX11 ones. No emulation happens, try and get a DX10 card to run tessellation in a game, or a benchmark such as Unigine. Good luck. I don't mean to sound "rude" but look up your facts man. Get a DX10 card and try and run DX11 effects, you can't, lower performance or not. Ask anyone playing Dragon Age 2 right now with a DX10 only card and STOP MAKING ASSUMPTIONS.

Look at the High selectable graphics settings. Anyone with a DX10 card will know that Very High is not selectable. You're only given what you can use, not what you can't at a "performance hit", cause it's just not there on the hardware.

http://www.techspot....test/page3.html


This is my last comment because your not listening or understanding the fundamentals of what is being discussed here. There is a software difference, the code used in games to support those features are software they have to be put in by the game developers thus the requirements for DX10 to DX11 are different from a hardware and software point of view. I did not say anyone with a DX10 card should have to put up with crappy performance at all I am saying is that ppl are using a mode that is not supported by their card for a 100% and thus should go to DX9 and if they are unhappy with that well it is the nature of the beast with PC's either lower your expectations or upgrade. There are MANY ppl here with problems that do not correspond to hardware compatibility issue and those should be taken more seriously like low fps in SLI mode, low fps with a top end card (for me that's a 470,480, 570 & 580 don't know ATI that well so stick with what I know) with any texture mode and etc, I could keep going but there is no point.

Also by your own article you post it says

We also tested using the Very High, High and Medium quality presets. The “Very High” preset allows for ambient occlusion, depth of field and blur effects. The “High” preset disables depth of field and blur, while “Medium” removes all the customizable DX11 features.


When the word customizable is used it tells me it's the feature YOU can control, but other DX11 specific features may still be in use that a DX10 card can not perform well with and that's what I have been saying all along. Please understand I see your trying to help and you do have good grasp of technology I will give you that but your are either confusing facts or allowing your judgment to be clouded because you want to be right. There is no assumption worse then to think you are right when you have no facts to back it up.

Again I hate sounding rude especially to people who are trying to help but this is just one time where you are wrong, if I was wrong and you could point out to me where I am I would admit it but this has gone on long enough in this thread so I am going to help those who need help with the game and not understanding how direct x works, there is another forum that at Microsoft I'm sure. Hope this helps. Stay frosty.


No, DX10 cards support DX10 features, not DX11 ones; otherwise they wouldn't make DX11 cards. They call it backwards compatible because it's compatible; if it wasn't, it wouldn't be, get it? DX10 cards are given High, not Very High. THOSE are the features that are not supported and wouldn't perform well, hence they are not offered. Unfortunately, you have no facts to prove you are right either; you're just writing what you think.

And no, it's a driver problem for most. Look at other threads, you will see people playing on High with DX10 cards comfortably just with an updated driver. Telling them to run DX9 because they should lower their expectations isn't solving the problem, it's a workaround that doesn't provide the same results. I was unhappy with my DX10 performance, but I didn't settle with DX9, I updated my driver and made it playable for DX10. Try and fix the issue before you resort to what should be the worst case scenario.

Also, LOOK AT THE HIGH SETTING IN THE GAME, IT SAYS IT ONLY REQUIRES DX10-CAPABLE HARDWARE. THAT KIND OF HARDWARE DOES NOT SUPPORT TESSELLATION FOR EXAMPLE, AND IT'S NOT EVEN USED ON HIGH, GET SOME DX10 HARDWARE TO TRY IT ON AND STOP... SPOUTING... BULL****!! Even if it is used on DX11 hardware on High, that does NOT automatically make it usable on DX10 hardware on High. Once again, TRY IT ON DX10 HARDWARE!!

I am upset because you are saying I'm ignorant when you are the ignorant one. Look at the option in-game: yes, it's using the DX11 renderer to provide DX10 backwards compatibility, but that doesn't mean it's using DX11 effects, because it can't. Maybe stop testing it with a DX11 card and test it using DX10 hardware, you know, LIKE I AM AND SOME OTHERS ARE!! You can't compare your findings from a DX11 card when we are arguing a DX10 issue.
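
To put the hardware side of this in concrete terms, below is roughly how any D3D11 game finds out at startup what the installed card can actually do. It's a bare-bones sketch written for illustration, not anything from DA2: a GTX 260-class card comes back as feature level 10_0, and the 11_0-only pipeline stages (hull/domain shaders for tessellation, for example) simply do not exist on it, so the game has to pick its rendering path accordingly.

// Minimal D3D11 sketch: query the highest feature level the installed GPU supports.
// DX10-class hardware reports 10_0/10_1 and cannot run 11_0-only features such as
// hardware tessellation, no matter which renderer the game itself uses.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3,
    };
    const UINT count = sizeof(wanted) / sizeof(wanted[0]);

    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_3;
    ID3D11Device* device = nullptr;

    // The D3D11 runtime happily creates a device on DX10 hardware; it just hands
    // back the highest feature level that hardware can do.
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 wanted, count, D3D11_SDK_VERSION,
                                 &device, &got, nullptr)))
        return 1;

    std::printf("Max feature level: 0x%x\n", (unsigned)got);   // 0xb000 = 11_0, 0xa000 = 10_0
    std::printf("Tessellation (11_0-only) available: %s\n",
                got >= D3D_FEATURE_LEVEL_11_0 ? "yes" : "no");

    device->Release();
    return 0;
}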

Kudos to BTCentral, at least he understands and makes some sense.

Edited by MaxPayne37, 20 March 2011 - 12:29.


#1041
BTCentral
  • Members
  • 1 684 messages

MaxPayne37 wrote...

Kudos to BTCentral, at least he understands and makes some sense.

Thanks :) - but you're fighting a losing battle there, he'll just refuse to listen and state it's "not a fact".
He just does not understand the fact that DX10 hardware cannot process DX11-only API features :P

I know for a fact that it works just fine on DX10 cards on High Mode with the 267.59 drivers, as before I got my shiny new MSI GTX 460 Hawk a few days ago, I had a XFX GTX 260 Black Edition - which was a DX10 only card.

Until nVidia/ATI release new drivers I have no doubt that the 267.59 drivers are as good for performance as we are going to get.

Edited by BTCentral, 20 March 2011 - 02:22.


#1042
Baramon
  • Members
  • 375 messages
From what I know about video cards and DX (since about version 3 or so, when I {briefly} thought about learning the API for something I wanted to do) there is just NO WAY a DX card of ANY version can "adjust upwards" to make use of a higher number's functions and calls and whatevers...they're just not designed that way. You can't make a DX9 or DX10 card access anything in the respective higher-number version (DX10 or DX11) by any trick, magic, hack, workaround, or whatever. Granted, it's been awhile since I've done anything remotely close to "advanced power user"-type stuff with any of the DX versions, but I doubt anyone's broken through that limitation.

Shutting up now on the subject, because I know I'm not qualified to offer anything similar to reliable information on the subject...(just my inflation-adjusted $0.27c worth).

#1043
MaxPayne37
  • Members
  • 149 messages

BTCentral wrote...

MaxPayne37 wrote...

Kudos to BTCentral, at least he understands and makes some sense.

Thanks :) - but you're fighting a losing battle there, he'll just refuse to listen and state it's "not a fact".
He just does not understand the fact that DX10 hardware cannot process DX11-only API features :P

I know for a fact that it works just fine on DX10 cards on High Mode with the 267.59 drivers, as before I got my shiny new MSI GTX 460 Hawk a few days ago, I had a XFX GTX 260 Black Edition - which was a DX10 only card.

Until nVidia/ATI release new drivers I have no doubt that the 267.59 drivers are as good for performance as we are going to get.


QFT. Maybe I'd let it go, but I don't want him giving others false information. At least you had some DX10 hardware to test it on before stating your DX11 experience as "fact". I agree, and at least I have some other people that agree with me too now.


Baramon wrote...

From what I know about video cards and DX (since about version 3 or so, when I {briefly} thought about learning the API for something I wanted to do) there is just NO WAY a DX card of ANY version can "adjust upwards" to make use of a higher number's functions and calls and whatevers...they're just not designed that way. You can't make a DX9 or DX10 card access anything in the respective higher-number version (DX10 or DX11) by any trick, magic, hack, workaround, or whatever. Granted, it's been awhile since I've done anything remotely close to "advanced power user"-type stuff with any of the DX versions, but I doubt anyone's broken through that limitation.

Shutting up now on the subject, because I know I'm not qualified to offer anything similar to reliable information on the subject...(just my inflation-adjusted $0.27c worth).


Exactly, and the only way to do so would be through emulation, which gets messy even where it is possible, and you can't use conventional methods to do it, like 3D Analyzer a long time ago.

Edited by MaxPayne37, 20 March 2011 - 03:14.


#1044
foil-
  • Members
  • 550 messages

Kayden SiNafey wrote...

foil- wrote...

Hmmm, want to upload a photo of that setup for bragging rights?  Still, think you need to get a third 30inch monitor (and maybe 3way sli to support it) ;)


You don't need 3-way SLI to link 3 monitors together, you just need 2-way, but 3-way would help with resolutions that big. If you plan to run 3 independent monitors, however, you do need a third card not in SLI to accommodate that. I plan to go with three 24 inch monitors on a single stand here in a few months and really push this thing; I just might have to OC my CPU and further push my factory OC video cards.

Edit: This is for the 4xx and 5xx series though. With the 2xx series, however, you do need to go 3-way, if that is what you were referring to; I apologize I didn't think of that sooner.


I meant support pushing that many pixels.  That's why I said maybe 3way sli to support it.  I should have said "to push that many pixels".  Performance on 3way isn't 1:1 improvement like you see going from 1 to 2 cards, but it would probably be necessary to push three monitors at that resolution for some of the newer games.

Dragon Age 2, however: I don't think they are exactly cutting-edge on graphics, even with their use of tessellation.  This is a game that should run better than Metro 2033 (DX11) or Crysis (DX9/10).  The game doesn't look bad compared to other games in the genre, but it's hardly using all the power of PhysX and DX11.

Edited by foil-, 20 March 2011 - 03:41.


#1045
BTCentral
  • Members
  • 1 684 messages

Edited by BTCentral, 20 March 2011 - 04:31.


#1046
TallBearNC
  • Members
  • 986 messages

MaxPayne37 wrote...

Kayden SiNafey wrote...

MaxPayne37 wrote...

There is a physical difference, not a software difference between DX versions. DX11 just happens to be backwards compatible with DX10 providing DX10 effects, but not DX11 ones. No emulation happens, try and get a DX10 card to run tessellation in a game, or a benchmark such as Unigine. Good luck. I don't mean to sound "rude" but look up your facts man. Get a DX10 card and try and run DX11 effects, you can't, lower performance or not. Ask anyone playing Dragon Age 2 right now with a DX10 only card and STOP MAKING ASSUMPTIONS.

Look at the High selectable graphics settings. Anyone with a DX10 card will know that Very High is not selectable. You're only given what you can use, not what you can't at a "performance hit", cause it's just not there on the hardware.

http://www.techspot....test/page3.html


This is my last comment because your not listening or understanding the fundamentals of what is being discussed here. There is a software difference, the code used in games to support those features are software they have to be put in by the game developers thus the requirements for DX10 to DX11 are different from a hardware and software point of view. I did not say anyone with a DX10 card should have to put up with crappy performance at all I am saying is that ppl are using a mode that is not supported by their card for a 100% and thus should go to DX9 and if they are unhappy with that well it is the nature of the beast with PC's either lower your expectations or upgrade. There are MANY ppl here with problems that do not correspond to hardware compatibility issue and those should be taken more seriously like low fps in SLI mode, low fps with a top end card (for me that's a 470,480, 570 & 580 don't know ATI that well so stick with what I know) with any texture mode and etc, I could keep going but there is no point.

Also by your own article you post it says

We also tested using the Very High, High and Medium quality presets. The “Very High” preset allows for ambient occlusion, depth of field and blur effects. The “High” preset disables depth of field and blur, while “Medium” removes all the customizable DX11 features.


When the word customizable is used it tells me it's the feature YOU can control, but other DX11 specific features may still be in use that a DX10 card can not perform well with and that's what I have been saying all along. Please understand I see your trying to help and you do have good grasp of technology I will give you that but your are either confusing facts or allowing your judgment to be clouded because you want to be right. There is no assumption worse then to think you are right when you have no facts to back it up.

Again I hate sounding rude especially to people who are trying to help but this is just one time where you are wrong, if I was wrong and you could point out to me where I am I would admit it but this has gone on long enough in this thread so I am going to help those who need help with the game and not understanding how direct x works, there is another forum that at Microsoft I'm sure. Hope this helps. Stay frosty.


No, DX10 cards support DX10 features, not DX11 ones, otherwise they wouldn't make DX11 cards. They call it backwards compatible, because it's compatible, if it wasn't, it wouldn't be, get it? DX10 cards are given High, not Very High, THOSE are the features that are not supported, and wouldn't perform well, hence they are not given. Unfortunately, you have no facts to prove you are right either, you're just writing what you think.

And no, it's a driver problem for most. Look at other threads, you will see people playing on High with DX10 cards comfortably just with an updated driver. Telling them to run DX9 because they should lower their expectations isn't solving the problem, it's a workaround that doesn't provide the same results. I was unhappy with my DX10 performance, but I didn't settle with DX9, I updated my driver and made it playable for DX10. Try and fix the issue before you resort to what should be the worst case scenario.

Also, LOOK AT THE HIGH SETTING IN THE GAME, IT SAYS IT ONLY REQURIES DX10 CAPABLE HARDWARE. THAT KIND OF HARDWARE DOES NOT SUPPORT TESSELLATION FOR EXAMPLE, AND IT'S NOT EVEN USED ON HIGH, GET SOME DX10 HARDWARE TO TRY IT ON AND STOP... SPOUTING... BULL****!! Even if it is used on DX11 hardware on High, does NOT automatically make it usable on DX10 hardware on High. Once again, TRY IT ON DX10 HARDWARE!!

I am upset, cause you are saying I'm ignorant, when you are the ignorant one. Look at the option ingame, it's using the DX11 renderer because it is, to provide DX10 backwards compatibility, doesn't mean it's using DX11 effects, cause it can't. Stop testing it with a DX11 card maybe, and test it using DX10 hardware, you know, LIKE I AM AND SOME OTHERS ARE!! You can't compare your findings on anything with a DX11 card, when we are arguing a DX10 issue.

Kudos to BTCentral, at least he understands and makes some sense.


social.bioware.com/forum/1/topic/300/index/6657347/ pretty much sums it all up in one neat post :)

#1047
TallBearNC
  • Members
  • 986 messages

BTCentral wrote...

foil- wrote...

The game doesn't look bad compared to other games in the genre, but it's hardly using all the power of PhysX and DX11.

I'm pretty sure it doesn't make use of the PhysX engine at all - I am sure I read that somewhere.


Actually, it does.

If I look in C:\Program Files (x86)\Electronic Arts\Dragon Age™ II\bin_ship I find:
physxcooking.dll
physxcore.dll
physxloader.dll

If the game didn't need or use Physx, then those 3 files would not be there.

Granted, there are no ground items you kick around, etc., but PhysX can be used for water and particle effects... or even fire :)

Also, the game uses Nvidia CUDA tech to offload work from the CPU to the GPU:
cudart32_30_9.dll
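
If anyone wants to check their own install, a throwaway sketch like the one below will tell you whether those DLLs are actually there. The install path is just my guess at the default location (mine has the ™ in the folder name), so point it at wherever your copy lives.

// Quick sketch: check whether the PhysX/CUDA runtime DLLs mentioned above are
// sitting in the game's bin_ship folder. The directory is an assumption -- edit
// it to match your own install.
#include <windows.h>
#include <cstdio>

int main()
{
    const char* dir = "C:\\Program Files (x86)\\Electronic Arts\\Dragon Age II\\bin_ship\\";
    const char* dlls[] = {
        "physxcooking.dll", "physxcore.dll", "physxloader.dll", "cudart32_30_9.dll"
    };

    char path[MAX_PATH];
    for (const char* dll : dlls)
    {
        std::snprintf(path, sizeof(path), "%s%s", dir, dll);
        // GetFileAttributesA returns INVALID_FILE_ATTRIBUTES when the file is missing.
        bool present = GetFileAttributesA(path) != INVALID_FILE_ATTRIBUTES;
        std::printf("%-20s %s\n", dll, present ? "found" : "missing");
    }
    return 0;
}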

Edited by TallBearNC, 20 March 2011 - 04:32.


#1048
BTCentral
  • Members
  • 1 684 messages

TallBearNC wrote...

social.bioware.com/forum/1/topic/300/index/6657347/ pretty much sums it all up in one neat post :)

Good work, and factually correct :)

"DISABLE DESKTOP COMPOSITION" is a good tip too, I forgot you can do that - whenever I ran the game with dual monitors enabled that warning always popped up.

TallBearNC wrote...

Actually, it does.
Granted, there are no ground items you kick around, etc., but PhysX can be used for water and particle effects... or even fire :)

Ah, my mistake - must have been thinking of another game!

Edited by BTCentral, 20 March 2011 - 04:38.


#1049
TallBearNC
  • Members
  • 986 messages

BTCentral wrote...

TallBearNC wrote...

Actually, it does.
Granted, there are no ground items you kick around, etc., but PhysX can be used for water and particle effects... or even fire :)

Ah, my mistake - must have been thinking of another game!


No, you were right that people keep saying the game doesn't use PhysX :) There are a few threads on it, but they are not correct. The game uses it. Granted, it's an OLD version of PhysX from 2008 (v2.8), but it uses it to a small degree... and it uses Nvidia CUDA technology as well.

#1050
TallBearNC
  • Members
  • 986 messages
Hmm, I wonder if this has to do with some of the major performance hits when SSAO is used... I noticed umbra32.dll in the game's bin directory, a file that I have never seen before in a game.

Umbra Occlusion Booster Runtime Library
© 2006-2010 Umbra Software

I looked them up, and it's basically a library made by the above company to assist games with occlusion
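
For anyone wondering what that actually buys you: occlusion culling means skipping the draw calls for objects that are completely hidden behind other geometry, which is exactly the kind of thing that makes or breaks framerate in busy scenes. Umbra does this with its own CPU-side system (I haven't seen their API), but D3D11 itself exposes a basic GPU-side version through occlusion queries. Rough sketch below; the device, context and the draw call are placeholders for whatever a renderer already has.

// Minimal D3D11 occlusion-query sketch: ask the GPU how many samples of a draw
// passed the depth test. Zero means the object was fully hidden and could have
// been skipped. Illustration of the raw API only -- nothing to do with Umbra's
// own (CPU-side) implementation.
#include <d3d11.h>

UINT64 VisibleSamples(ID3D11Device* device, ID3D11DeviceContext* context)
{
    D3D11_QUERY_DESC desc = {};
    desc.Query = D3D11_QUERY_OCCLUSION;

    ID3D11Query* query = nullptr;
    if (FAILED(device->CreateQuery(&desc, &query)))
        return 0;

    context->Begin(query);
    // ... issue the draw call for the object being tested here ...
    context->End(query);

    // Poll until the GPU has the result. A real engine would read this back a
    // frame later instead of spinning on it.
    UINT64 samplesPassed = 0;
    while (context->GetData(query, &samplesPassed, sizeof(samplesPassed), 0) == S_FALSE)
    {
    }

    query->Release();
    return samplesPassed;
}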