/// ME3 MOD: HighRes textures + Next-Gen illumination + 3D Fix.


6,497 replies to this topic

#626
Dead_Meat357

  • Members
  • 1,122 posts

Elessar79 wrote...

I've been having trouble adding multiple mods: the only mod that sticks is the one at the top of the list. They're all mods that smarteck has made. Any help and/or ideas would be appreciated.

My PC specs are:
OS: Windows XP Home Edition 32bit
CPU: P4 3.40GHz
RAM: 2GB
NVIDIA GeForce GT 240 1GB DDR RAM
5.1 Dolby Digital Surround Cambridge SoundWorks
22in Acer x223w widescreen Flat Panel Monitor

I can play this game with all the settings maxed without any loss of frame rate.


I'm going to bet it's not really maxed out. The in-game menu is one thing, the configuration menu is another. Also, I'm wondering what resolution you are using. Even at the native 1680x1050 resolution of that monitor, I doubt you've got it all maxed out. Are you using 16xAF? FXAA?

These textures are huge, and you don't have a lot of usable RAM as far as system or video card memory is concerned. As a result, I doubt you have sufficient memory to load all the textures; that is most likely your issue. And frankly, your machine isn't likely to be able to handle the performance hit even if you could load everything. Even at 1680x1050 or 1920x1080, you're going to have trouble.

I went ahead and took some memory usage readings from my system. This is with the high-resolution Normandy textures, high-resolution N7 armor, Liara, Garrus, M-8 Avenger, M-92 Mantis, M-3 Predator, Alliance uniforms, door holograms, and probably a couple of things I'm forgetting. Almost all of these are Smarteck's textures. I monitored memory usage on the Normandy, but that's actually not where it peaked despite having the Normandy textures loaded; it was the opening of the Grissom Academy level where I hit the highest VRAM usage. Now, 1920x1080 or less won't be as demanding as the resolution I'm using, but I wanted to illustrate the point that these textures can get out of hand size-wise.


Max usage was 1.77GB, but it hovered around 1.4GB most of the time.
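
If anyone wants to pull a peak number out of their own sensor logs instead of eyeballing a graph, here's a rough sketch of the idea. It assumes a CSV-style log with a "Memory Used" column in MB; the file name and column header are just placeholders, not necessarily what your monitoring tool writes out:

```python
# Sketch: find the peak "Memory Used" value in a monitoring tool's CSV log.
# LOG_PATH and COLUMN are placeholders -- adjust them to whatever your tool writes.
import csv

LOG_PATH = "gpu_sensor_log.csv"   # hypothetical log file
COLUMN = "Memory Used"            # hypothetical column header, value in MB

peak_mb = 0.0
with open(LOG_PATH, newline="") as f:
    for row in csv.DictReader(f):
        try:
            peak_mb = max(peak_mb, float(row[COLUMN]))
        except (KeyError, ValueError):
            continue  # skip malformed rows or a missing column

print(f"Peak VRAM usage: {peak_mb / 1024:.2f} GB")
```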

Edited by Dead_Meat357, 01 May 2012 - 03:35.


#627
neilthecellist

  • Members
  • 450 posts
Dude, you don't have enough RAM, that's why you can't load every texture.

I have 8 GB of RAM.

Mass Effect 3 uses less than 1 GB of RAM without Texmod, over 3.3 GB with Texmod.

Your video card is also very weak. GT 240 is not a gaming card.

You're also using a 32-bit OS; go download an LAA (Large Address Aware) patcher to increase the maximum amount of RAM Mass Effect 3 can use.
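
(For what it's worth, "LAA" is just a flag in the game executable's header, and the patchers flip it for you. Purely as an illustration of what they do under the hood, here's a rough sketch; the file name is only an example, and back up the exe before touching anything:)

```python
# Illustration only: set the IMAGE_FILE_LARGE_ADDRESS_AWARE (0x0020) bit in a
# Windows PE executable's COFF header. This is essentially what LAA patchers do.
import struct

EXE = "MassEffect3.exe"  # example path -- back the file up first

with open(EXE, "r+b") as f:
    f.seek(0x3C)                                  # e_lfanew: offset of the PE header
    pe_offset = struct.unpack("<I", f.read(4))[0]
    f.seek(pe_offset)
    assert f.read(4) == b"PE\x00\x00", "not a PE executable"
    char_offset = pe_offset + 4 + 18              # Characteristics field in the COFF header
    f.seek(char_offset)
    flags = struct.unpack("<H", f.read(2))[0]
    f.seek(char_offset)
    f.write(struct.pack("<H", flags | 0x0020))    # mark the image Large Address Aware
```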

FXAA is a crap solution. I still don't understand why Smarteck uses it; 2x MSAA + 2x SGSAA with a LOD bias of -1.5 creates a crisper image WITHOUT causing any pixelation in the text. Download NVIDIA Inspector, apply 2x MSAA and 2x SGSAA, and set the negative LOD bias to -1.5.

The formula for determining the correct LOD bias is y = -0.5 * log2(n), where n is the number of samples and y is the correct LOD bias.
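
If you don't want to do the log by hand, here's the same calculation as a couple of lines of Python (the helper name is just mine):

```python
# LOD bias for n AA samples: y = -0.5 * log2(n)
import math

def lod_bias(samples: int) -> float:
    return -0.5 * math.log2(samples)

for n in (2, 4, 8, 16):
    print(f"{n}x samples -> LOD bias {lod_bias(n):+.1f}")
# 2x -> -0.5, 4x -> -1.0, 8x -> -1.5, 16x -> -2.0
```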


http://naturalviolen....com/sgssaa.htm

FXAA is just an on-screen fix. Everything white-colored gets some ugly HDR bloom in the game which isn't realistic at all.

#628
Dead_Meat357

  • Members
  • 1,122 posts

neilthecellist wrote...

Dude, you don't have enough RAM, that's why you can't load every texture.

I have 8 GB of RAM.

Mass Effect 3 uses less than 1 GB of RAM without Texmod, over 3.3 GB with Texmod.

Your video card is also very weak. GT 240 is not a gaming card.

You're also using a 32-bit OS; go download an LAA (Large Address Aware) patcher to increase the maximum amount of RAM Mass Effect 3 can use.

FXAA is a crap solution. I still don't understand why Smarteck uses it; 2x MSAA + 2x SGSAA with a LOD bias of -1.5 creates a crisper image WITHOUT causing any pixelation in the text. Download NVIDIA Inspector, apply 2x MSAA and 2x SGSAA, and set the negative LOD bias to -1.5.

The formula for determining the correct LOD bias is y = -0.5 * log2(n), where n is the number of samples and y is the correct LOD bias.


http://naturalviolen....com/sgssaa.htm

FXAA is just an on-screen fix. Everything white-colored gets some ugly HDR bloom in the game which isn't realistic at all.


I'm using an AMD card and I've had zero luck forcing AA with Mass Effect 3. As I understand it, the in-game AA for Mass Effect 3 is FXAA. Also, he recommends the FXAA Tool to add some post-processing effects to the game, which in my opinion improve the look of the game quite a bit, though you have to play with it to get the desired effects.

#629
neilthecellist

  • Members
  • 450 posts
I've already played with the FXAA Tool for several weeks. Trust me, NVIDIA Inspector looks way better. I force a negative LOD bias, true AF rendering, NVIDIA 3D, and Ambient Occlusion (NOT the "Screen Space" version like FXAA_Tool; again, FXAA is crap, its AO solution is screen-based, not engine-based). FXAA modifies the screen itself, meaning white text can get HDR bloom because it's... well, white.

FXAA is such a limited and outdated program; I don't know why the programmer keeps updating it. Inspector surpasses FXAA on features, quality, and performance efficiency.

You're not understanding me either. I don't give a rat's ass about ME3's in-game AA. Yes, I know it's FXAA. I already said it's crap. That's why I force MSAA+SGSAA through Nvidia Inspector.

#630
Dead_Meat357

  • Members
  • 1,122 posts

neilthecellist wrote...

I've already played with the FXAA Tool for several weeks. Trust me, NVIDIA Inspector looks way better. I force a negative LOD bias, true AF rendering, NVIDIA 3D, and Ambient Occlusion (NOT the "Screen Space" version like FXAA_Tool; again, FXAA is crap, its AO solution is screen-based, not engine-based). FXAA modifies the screen itself, meaning white text can get HDR bloom because it's... well, white.

FXAA is such a limited and outdated program; I don't know why the programmer keeps updating it. Inspector surpasses FXAA on features, quality, and performance efficiency.

You're not understanding me either. I don't give a rat's ass about ME3's in-game AA. Yes, I know it's FXAA. I already said it's crap. That's why I force MSAA+SGSAA through Nvidia Inspector.


I just tried forcing AA in the Catalyst Control Center since updating my driver (I hadn't tried with the Catalyst 12.4 drivers), and it's working now. It looks great with 8xAA. I'd try NVIDIA Inspector, but I'm using an AMD card; I guess I can see how well it works on my GTX 580, which I've got in my girlfriend's PC. And yeah, I know FXAA is crap. The in-game AA looks basically like no AA at all, and running the FXAA Tool while forcing AA through the Catalyst Control Center results in a really grainy image, so I won't be doing that anymore.

Oh, and after forcing 8xMSAA in the CCC my GPU memory usage hit 2.29GB.

Edited by Dead_Meat357, 01 May 2012 - 04:25.


#631
MACharlie1

  • Members
  • 3,437 posts

smarteck wrote...

 ///

MACharlie1??  :whistle:

Come on dude:D

//////

Since you guys asked so nicely...

Garrus Vakarian - Default Armor

Liara T'Soni - Default Armor

EDI w/ Cerberus Logo and EVA sign - Default

EDI w/ Alliance Logo and EDI sign - Default 

I also created one for Wrex/Wreav but haven't tested it yet. 

Edited by MACharlie1, 01 May 2012 - 04:30.


#632
neilthecellist

  • Members
  • 450 posts
Sell your AMD card on eBay and get an NVIDIA card. I used to run a repair shop, and AMD cards might be cheap to buy, but they're outdated and the result of poor engineering, which makes them a bad long-term investment. NVIDIA cards are expensive to buy first-hand, but the number of features you get from an NVIDIA card more than justifies your initial cost (Ambient Occlusion, SGSAA+MSAA at the same time, realistic hair, PhysX, NVIDIA 3D Vision, true AF... AMD cards can do the same, but almost at an emulation level).

I am from Taiwan (which is where both AMD and NVIDIA cards are made) and I can tell you firsthand why AMD cards are so badly designed, but you probably wouldn't understand without an electrical engineering degree, and you might not even care. The point is, NVIDIA is better than AMD. I am not a fanboy of either; I just like the option that makes more sense.

Edited by neilthecellist, 01 May 2012 - 04:32.


#633
Elessar79

  • Members
  • 35 posts
Yikes! Wasn't my aim to start a spat in here!

I realize my specs are really low; it's just going to take a while to save up for upgrades. Gonna need a new motherboard to handle the 8GB+ of RAM I want (ideally I'm thinking 16GB), not sure how high to go with the CPU, and I guess I'm gonna have to bite the bullet and get a newer GPU.

#634
Dead_Meat357

  • Members
  • 1,122 posts

neilthecellist wrote...

Sell your AMD card on eBay and get an NVIDIA card. I used to run a repair shop, and AMD cards might be cheap to buy, but they're outdated and the result of poor engineering, which makes them a bad long-term investment. NVIDIA cards are expensive to buy first-hand, but the number of features you get from an NVIDIA card more than justifies your initial cost (Ambient Occlusion, SGSAA+MSAA at the same time, realistic hair, PhysX, NVIDIA 3D Vision, true AF... AMD cards can do the same, but almost at an emulation level).

I am from Taiwan (which is where both AMD and NVIDIA cards are made) and I can tell you firsthand why AMD cards are so badly designed, but you probably wouldn't understand without an electrical engineering degree, and you might not even care. The point is, NVIDIA is better than AMD. I am not a fanboy of either; I just like the option that makes more sense.


This is what happens when you assume too much. I've been working in the computer hardware and information technology fields for almost 18 years. AMD's drivers are lacking a lot of the time, I won't deny that, but hardware-wise their designs are different from NVIDIA's, not necessarily less advanced. There are pros and cons to each. Hardware-wise, neither AMD nor NVIDIA builds anything; both are design firms. They contract out construction of the ASICs, GPU silicon, and board manufacturing to other vendors, sometimes even the same ones. Flextronics, Foxconn, etc. make boards for both companies. And it's hard to argue for superior NVIDIA quality when there was the G80 debacle, VRMs burning out on GeForce 4 Ti4200s, and most recently, GeForce GTX 590s catching fire due to bad firmware.

They are constantly leapfrogging each other with new types of antialiasing, so that argument doesn't really hold water. NVIDIA 3D Vision is of no consequence, as my 30" monitors don't support it, and at 2560x1600 (or 7680x1600 with Eyefinity / NV Surround) I can't afford the performance hit of 3D. Realistic hair? That is a result of techniques applied to achieve the effect; it isn't a feature by itself. Both vendors' cards can do this, and how the game or application achieves it determines which card does it better. It isn't as if this is one feature with only one way to achieve it. PhysX is a proprietary API which has to be licensed, and few games really make meaningful use of it aside from Batman: Arkham Asylum / City. And there is a workaround to use an NVIDIA card for PhysX while using AMD cards as your primary gaming cards. Sure, lots of games support it, but few do anything worth noting.

And technically, AMD cards could do physics effects processing through DirectCompute. However, almost no one is doing that either by choice or because NVIDIA pays top dollar to ensure that game developers do it their way.

And as for NVIDIA being better, this is hit or miss. I've had products from both companies dating back to the early days of 3D acceleration and their entries into that market. I've had good experiences with both, and bad experiences with both. Most recently, I pulled my GTX 580s out of my system because I had issues with SLI and random artifacting which wasn't related to heat, the motherboard, or seemingly anything else. Both cards tested fine, but for whatever reason I had tons of driver issues with them. Since installing the Radeon HD 7970, I've had a few issues with SWTOR which were resolved quickly, and I couldn't force MSAA in the CCC with Mass Effect 3. That's it; I've had zero complaints aside from that. So far this card has been less problematic than my GTX 580s ever were, though one of my GTX 580s is working perfectly in my girlfriend's machine, and the other is sitting in my test rig.

I'm no fanboy of either vendor, but right now, the Radeon HD 7970 is serving me well. I have no plans on replacing it until the retail availability of the GTX 690. When that comes out, I'll make a decision on that.

Elessar79 wrote...

Yikes! Wasn't my aim to start a spat in here!

I realize my specs are really low; it's just going to take a while to save up for upgrades. Gonna need a new motherboard to handle the 8GB+ of RAM I want (ideally I'm thinking 16GB), not sure how high to go with the CPU, and I guess I'm gonna have to bite the bullet and get a newer GPU.


16GB of RAM won't really do anything for you gaming-wise. As cheap as it is, I would probably go for it myself, but it won't be of any real benefit for simple gaming tasks. As for the CPU, I'd recommend shopping for a deal on either the Core i5 2500K or the Core i7 2600K; they can be had for a fairly reasonable sum right now. The newer Ivy Bridge-based Core i7 3770K replaces the Core i7 2600K in Intel's lineup, but availability isn't great and no one is really offering deals on them right now.

As for motherboards, Z77 doesn't offer much over Z68, but it's always wise to get the newer product unless you find an amazing deal on the older one. Also, Z77 will support Ivy Bridge-based CPUs out of the box, whereas Z68 boards will usually require a BIOS update for compatibility.

Edited by Dead_Meat357, 01 May 2012 - 05:04.


#635
Elessar79

  • Members
  • 35 posts
How much is the GTX 580 going for now? I assume it'll be a more suitable card to run. The hard part is making sure I have enough room in my tower to fit everything and still have room for decent air movement, even though I'm using three fans for that reason.

#636
neilthecellist

  • Members
  • 450 posts
You're assuming I said NVIDIA and AMD make their own cards. I never said this. As I said, I used to work in a computer repair shop and I'm working towards an EE degree. What makes you think I wouldn't notice the little Foxconn engraving on the boards while doing repairs? -_-

Foxconn is exactly what I was referring to. Yes, I've seen bad NVIDIA boards; the GTX 480 (the card that I have) overheats like hell, but with the right aftermarket cooler (which I end up upselling to customers all the time) heat is never really an issue. I've benchmarked the GTX 590 and know what you mean by heating issues, but the actual cooler, as you probably know, is not necessarily a Foxconn design (sometimes it is) but is more often made by the reseller (EVGA, Gigabyte, etc.).

Nonetheless, the design of AMD cards lately has not been that good. You may disagree with me and have some examples, but like you, I've been around since integrated graphics were just a part of the mobo. My first non-integrated video card was an ATI Rage 64 (or was it 128?). I don't even think they're made anymore.

Regardless of the technical explanation, the fact remains that NVIDIA cards have those features natively, which you acknowledge AMD cards can also do, but only through DirectCompute. DC is not efficient. You know this if you work in the computer hardware industry, or if you've even tried it; that's another way to know.

And honestly, no offense, but having a degree in computer information is really irrelevant. CIS people don't really know engineering as extensively as an EE major.

#637
Dead_Meat357

  • Members
  • 1,122 posts

neilthecellist wrote...

You're assuming I said NVIDIA and AMD make their own cards. I never said this. As I said, I used to work in a computer repair shop and I'm working towards an EE degree. What makes you think I wouldn't notice the little Foxconn engraving on the boards while doing repairs? -_-

Foxconn is exactly what I was referring to. Yes, I've seen bad NVIDIA boards; the GTX 480 (the card that I have) overheats like hell, but with the right aftermarket cooler (which I end up upselling to customers all the time) heat is never really an issue. I've benchmarked the GTX 590 and know what you mean by heating issues, but the actual cooler, as you probably know, is not necessarily a Foxconn design (sometimes it is) but is more often made by the reseller (EVGA, Gigabyte, etc.).

Nonetheless, the design of AMD cards lately has not been that good. You may disagree with me and have some examples, but like you, I've been around since integrated graphics were just a part of the mobo. My first non-integrated video card was an ATI Rage 64 (or was it 128?). I don't even think they're made anymore.

Regardless of the technical explanation, the fact remains that NVIDIA cards have those features natively, which you acknowledge AMD cards can also do, but only through DirectCompute. DC is not efficient. You know this if you work in the computer hardware industry, or if you've even tried it; that's another way to know.

And honestly, no offense, but having a degree in computer information is really irrelevant. CIS people don't really know engineering as extensively as an EE major.


I don't have a degree in CIS. I have worked professionally in these industries for 18 years; there is a difference. I also don't claim to be an electrical engineer. I'm not going to say that I understand things as well as they do, but I do know computer hardware. Don't assume I wouldn't understand something just because I am posting on the BioWare forum. For the most part, the features that AMD doesn't do that NVIDIA does are irrelevant to me at this point. As for DirectCompute vs. CUDA, that's coding stuff and I'm not an expert in that. PhysX is a capable API, but many games have done physics work that is just as impressive via the CPU or through other more general means. Aside from the implementation of PhysX in the Batman games, I've never really seen anything all that impressive outside of tech demos.

And again on quality, I've seen many NVIDIA boards with sloppy soldering, warping, thin-ass PCBs, VRMs that burn up far faster than they should, etc. I've seen bad AMD boards too. I'm not saying NVIDIA is worse than AMD, far from it; this varies by manufacturer. It's as much about who cut corners on the design as it is about adherence to the design. If NVIDIA specs crap, then Foxconn will build crap; the NVIDIA 680i SLI reference boards are the only example of that you need. If a company specs something good, then what you'll get from Foxconn is a good product. (ASUS, Intel, etc. use Foxconn for board manufacturing.)

Anyway, we are derailing the thread big time. I'm more interested in the texture mods. I just wanted to illustrate to the poster above that the problem with loading all these textures is the lack of available system/video memory in his or her machine. I wanted to show that Mass Effect 3 can use a ton of video RAM. System RAM usage varies wildly, so I didn't want to get into that, but in case you wanted to know, Mass Effect 3 is using 2.27GB of RAM in my machine right now. Video RAM usage on the Grissom Academy level has increased to 2.66GB in the courtyard where you fight the first Atlas. So yeah, these textures are brutal even on higher-end machines. Being limited by the consoles and not wanting to do extra work on the PC port is just part of the reason why the textures are so bad. The other part has to do with the fact that significantly higher-resolution textures, or moderate increases in texture quality globally, really can kill system performance. Companies like BioWare and EA will always pander to lower-end and mid-range systems rather than the high end, as there is more money to be made there.

If you want to really see what this game CAN look like, you need to upgrade.

Edited by Dead_Meat357, 01 May 2012 - 05:28.


#638
Guest_PDesign_*

  • Guests
Can you guys use hi-res textures from other games for your mods?

#639
Fredy AG

  • Members
  • 1,355 posts

neilthecellist wrote...

I still don't understand why Smarteck uses it; 2x MSAA + 2x SGSAA with a LOD bias of -1.5 creates a crisper image WITHOUT causing any pixelation in the text. Download NVIDIA Inspector, apply 2x MSAA and 2x SGSAA, and set the negative LOD bias to -1.5.

FXAA is just an on-screen fix. Everything white-colored gets some ugly HDR bloom in the game which isn't realistic at all.


Although it may sound ridiculous, I haven't recommended FXAA in this thread for its AA technology, but rather for the dramatic changes it can make to the game's lighting, colors, and sharpness.

I understand that it hasn't worked for everyone the way it has for me, but I do know that it has worked for others, and like me they've fallen for the application. I get the impression that the results depend on the graphics hardware being used (especially the type of monitor / TV).

I have tested ALL versions of the ENB mods, and what I get with FXAA TOOL remains much better in terms of visual realism. I use it on every game that I can... all DX9 titles, and it also works on some DX11 ones. They all look amazing.

A simple example I found out there: a person downloaded this FXAA mod for ME3, but discovered that it worked with the rest of their games too and was amazed.

webcache.googleusercontent.com/search

Does it have an impact on computer performance? YES. But if your PC can handle it, go ahead.

Salu2.;)

//////

Edited by smarteck, 01 May 2012 - 06:42.


#640
Dead_Meat357

  • Members
  • 1,122 posts

smarteck wrote...

neilthecellist wrote...

I still don't understand why Smarteck uses it; 2x MSAA + 2x SGSAA with a LOD bias of -1.5 creates a crisper image WITHOUT causing any pixelation in the text. Download NVIDIA Inspector, apply 2x MSAA and 2x SGSAA, and set the negative LOD bias to -1.5.

FXAA is just an on-screen fix. Everything white-colored gets some ugly HDR bloom in the game which isn't realistic at all.


Although it may sound ridiculous, I haven't recommended FXAA in this thread for its AA technology, but rather for the dramatic changes it can make to the game's lighting, colors, and sharpness.

I understand that it hasn't worked for everyone the way it has for me, but I do know that it has worked for others, and like me they've fallen for the application. I get the impression that the results depend on the graphics hardware being used (especially the type of monitor / TV).

I have tested ALL versions of the ENB mods, and what I get with FXAA TOOL remains much better in terms of visual realism. I use it on every game that I can... all DX9 titles, and it also works on some DX11 ones. They all look amazing.

A simple example I found out there: a person downloaded this FXAA mod for ME3, but discovered that it worked with the rest of their games too and was amazed.

webcache.googleusercontent.com/search

Does it have an impact on computer performance? YES. But if your PC can handle it, go ahead.

Salu2.;)

//////


Yeah, I'm quite fond of what it does for the lighting in the game. That's definitely its strong point.

#641
InBleedingRapture

  • Members
  • 138 posts
FXAA and the like are becoming more popular because they have little to no impact on performance. MSAA + SSAA will soften hard lines better and give a crisper image, but it can really hurt performance.

FXAA is actually just a post-processing effect -- same as a film grain filter, etc. -- while MSAA and SSAA are rendering methods. Basically, this means that instead of your graphics card taking the heat from rendering each frame with AA, it just overlays a simple filter that gives a comparable boost to image quality without sacrificing performance.

As for the other features included with FXAA tool, they also "simulate" what the actual effect looks like. I personally don't like the HDR filter (I use it on VERY low settings to give the game a little more contrast). Real effects like HDR and bloom lighting are actually rendered frame-by-frame by the game engine, so they look better than what FXAA Tool can do.

I hope this makes sense.

Maybe I can use an analogy... rendering methods vs. post-processing is like having a great quality photo of something vs. having an OK quality photo of something and then touching it up in Photoshop. Does that make any sense?
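
If it helps to see it in code, here's a toy sketch of the idea -- not actual FXAA code, just my own illustration of a screen-space pass that finds high-contrast pixels in the finished frame and blurs only those:

```python
# Toy "post-process AA" pass in the spirit of FXAA: detect edges from luminance
# contrast and blend those pixels with a local blur. Real FXAA is a much more
# careful shader; this only demonstrates the "filter the finished frame" idea.
import numpy as np

def toy_post_aa(frame: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """frame: H x W x 3 float array with values in [0, 1]."""
    h, w = frame.shape[:2]
    luma = frame @ np.array([0.299, 0.587, 0.114])        # per-pixel luminance
    lpad = np.pad(luma, 1, mode="edge")
    # local contrast = max minus min luminance over the 4-neighbourhood
    neighbours = np.stack([lpad[:-2, 1:-1], lpad[2:, 1:-1],
                           lpad[1:-1, :-2], lpad[1:-1, 2:]])
    contrast = neighbours.max(axis=0) - neighbours.min(axis=0)
    edge = contrast > threshold                            # only smooth visible edges
    # 3x3 box blur of the frame (crude stand-in for FXAA's directional filter)
    fpad = np.pad(frame, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = sum(fpad[dy:dy + h, dx:dx + w]
                  for dy in range(3) for dx in range(3)) / 9.0
    out = frame.copy()
    out[edge] = 0.5 * frame[edge] + 0.5 * blurred[edge]    # blend edge pixels only
    return out
```

The game engine never sees any of this; it only touches the finished frame, which is why the cost is so low compared to rendering extra samples.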

Edited by InBleedingRapture, 01 May 2012 - 08:39.


#642
Dead_Meat357

  • Members
  • 1,122 posts

InBleedingRapture wrote...

FXAA and the like are becoming more popular because they have little to no impact on performance. MSAA + SSAA will soften hard lines better and give a crisper image, but it can really hurt performance.

FXAA is actually just a post-processing effect -- same as a film grain filter, etc. -- while MSAA and SSAA are rendering methods. Basically, this means that instead of your graphics card taking the heat from rendering each frame with AA, it just overlays a simple filter that gives a comparable boost to image quality without sacrificing performance.

As for the other features included with FXAA tool, they also "simulate" what the actual effect looks like. I personally don't like the HDR filter (I use it on VERY low settings to give the game a little more contrast). Real effects like HDR and bloom lighting are actually rendered frame-by-frame by the game engine, so they look better than what FXAA Tool can do.

I hope this makes sense.

Maybe I can use an analogy... rendering methods vs. post-processing is like having a great quality photo of something vs. having an OK quality photo of something and then touching it up in Photoshop. Does that make any sense?


It makes perfect sense and I understood that about it anyway. I'm amazed it does what it does so well despite how it works. That being said, you have to really play with the settings to get a decent result. I don't care for how grainy things look, but I do like the contrast it gives the image. Performance wise I hadn't had any real issues with it.

#643
InBleedingRapture

  • Members
  • 138 posts

Dead_Meat357 wrote...

It makes perfect sense and I understood that about it anyway. I'm amazed it does what it does so well despite how it works. That being said, you have to really play with the settings to get a decent result. I don't care for how grainy things look, but I do like the contrast it gives the image. Performance wise I hadn't had any real issues with it.


I agree totally. I used to tweak my settings in nVidia Inspector for hours to get the right blend between performance and image quality. Now, I can just mess with FXAA Tool and not worry about how it will affect my performance. FXAA is the future :)

#644
Dead_Meat357

  • Members
  • 1,122 posts

InBleedingRapture wrote...

Dead_Meat357 wrote...

It makes perfect sense and I understood that about it anyway. I'm amazed it does what it does so well despite how it works. That being said, you have to really play with the settings to get a decent result. I don't care for how grainy things look, but I do like the contrast it gives the image. Performance wise I hadn't had any real issues with it.


I agree totally. I used to tweak my settings in nVidia Inspector for hours to get the right blend between performance and image quality. Now, I can just mess with FXAA Tool and not worry about how it will affect my performance. FXAA is the future :)


I don't think FXAA is working at all for me. I can't tell the difference between that and no AA. Fortunately I'm able to force MSAA or SSAA through the Catalyst Control Center. It runs fine like that and is better than FXAA so I'm not worried about it.

Edited by Dead_Meat357, 01 May 2012 - 09:13.


#645
InBleedingRapture

  • Members
  • 138 posts
It's not as noticeable in ME3 as it is in other games, but it still works. You have to crank your blur amount and thresholds pretty high to see any difference. An alternative is just to enable AA in the game, since ME3 uses FXAA already. The vanilla AA in ME3 is actually quite good when used in combination with FXAA Tool's sharpening features.

Unfortunately for me, for whatever reason, MSAA/SSAA kills my performance on my nVidia system. I don't know if it's because I'm running an SLI setup or what, but I pretty much have to use FXAA, unless I enjoy 12 FPS :P

#646
Dead_Meat357

  • Members
  • 1,122 posts

InBleedingRapture wrote...

It's not as noticeable in ME3 as it is in other games, but it still works. You have to crank your blur amount and thresholds pretty high to see any difference. An alternative is just to enable AA in the game, since ME3 uses FXAA already. The vanilla AA in ME3 is actually quite good when used in combination with FXAA Tool's sharpening features.

Unfortunately for me, for whatever reason, MSAA/SSAA kills my performance on my nVidia system. I don't know if it's because I'm running an SLI setup or what, but I pretty much have to use FXAA, unless I enjoy 12 FPS :P


I was talking about the in-game FXAA not working. The FXAA Tool doesn't seem to do anything for the AA either. While NVIDIA's FXAA implementation generally works on AMD hardware, it may not always work, or it could be an issue with Mass Effect 3; I'm not certain. And MSAA works fine on my system in this game, even with the high-resolution textures loaded, so I don't have a problem there.

As for AA not working well on your setup, it would depend on what your system configuration is like, I guess. I can run 4xAA up to about 2560x1600 on GTX 580s / 580 SLI without issue. In many games I can go higher without any trouble, but it depends on the game's graphics quality and texture sizes as to whether or not I run into VRAM limitations with 1.5GB cards. I definitely can't go higher than 4xAA in BF3.

#647
neilthecellist

  • Members
  • 450 posts

Dead_Meat357 wrote...

neilthecellist wrote...

You're assuming I said NVIDIA and AMD make their own cards. I never said this. As I said, I used to work in a computer repair shop and I'm working towards an EE degree. What makes you think I wouldn't notice the little Foxconn engraving on the boards while doing repairs? -_-

Foxconn is exactly what I was referring to. Yes, I've seen bad NVIDIA boards; the GTX 480 (the card that I have) overheats like hell, but with the right aftermarket cooler (which I end up upselling to customers all the time) heat is never really an issue. I've benchmarked the GTX 590 and know what you mean by heating issues, but the actual cooler, as you probably know, is not necessarily a Foxconn design (sometimes it is) but is more often made by the reseller (EVGA, Gigabyte, etc.).

Nonetheless, the design of AMD cards lately has not been that good. You may disagree with me and have some examples, but like you, I've been around since integrated graphics were just a part of the mobo. My first non-integrated video card was an ATI Rage 64 (or was it 128?). I don't even think they're made anymore.

Regardless of the technical explanation, the fact remains that NVIDIA cards have those features natively, which you acknowledge AMD cards can also do, but only through DirectCompute. DC is not efficient. You know this if you work in the computer hardware industry, or if you've even tried it; that's another way to know.

And honestly, no offense, but having a degree in computer information is really irrelevant. CIS people don't really know engineering as extensively as an EE major.


I don't have a degree in CIS. I have worked professionally in these industries for 18 years; there is a difference. I also don't claim to be an electrical engineer. I'm not going to say that I understand things as well as they do, but I do know computer hardware. Don't assume I wouldn't understand something just because I am posting on the BioWare forum. For the most part, the features that AMD doesn't do that NVIDIA does are irrelevant to me at this point. As for DirectCompute vs. CUDA, that's coding stuff and I'm not an expert in that. PhysX is a capable API, but many games have done physics work that is just as impressive via the CPU or through other more general means. Aside from the implementation of PhysX in the Batman games, I've never really seen anything all that impressive outside of tech demos.

And again on quality, I've seen many NVIDIA boards with sloppy soldering, warping, thin-ass PCBs, VRMs that burn up far faster than they should, etc. I've seen bad AMD boards too. I'm not saying NVIDIA is worse than AMD, far from it; this varies by manufacturer. It's as much about who cut corners on the design as it is about adherence to the design. If NVIDIA specs crap, then Foxconn will build crap; the NVIDIA 680i SLI reference boards are the only example of that you need. If a company specs something good, then what you'll get from Foxconn is a good product. (ASUS, Intel, etc. use Foxconn for board manufacturing.)

Anyway, we are derailing the thread big time. I'm more interested in the texture mods. I just wanted to illustrate to the poster above that the problem with loading all these textures is the lack of available system/video memory in his or her machine. I wanted to show that Mass Effect 3 can use a ton of video RAM. System RAM usage varies wildly, so I didn't want to get into that, but in case you wanted to know, Mass Effect 3 is using 2.27GB of RAM in my machine right now. Video RAM usage on the Grissom Academy level has increased to 2.66GB in the courtyard where you fight the first Atlas. So yeah, these textures are brutal even on higher-end machines. Being limited by the consoles and not wanting to do extra work on the PC port is just part of the reason why the textures are so bad. The other part has to do with the fact that significantly higher-resolution textures, or moderate increases in texture quality globally, really can kill system performance. Companies like BioWare and EA will always pander to lower-end and mid-range systems rather than the high end, as there is more money to be made there.

If you want to really see what this game CAN look like, you need to upgrade.


Really... I need to upgrade... Here are my specs:

AMD x6 1100t six core processor with Zalman coolers
8 GB of RAM active (16 GB total, but disabled 8 GB to save power since I live in California)
300 GB SSD
3 TB SATA II (currently disabled)
blu ray drive
3x GTX 480 with Zalman coolers (only 1 currently active, disabled other 2 to save power)
2000 watt Corsair power supply
Case has 10 fans, variety of sizes.

Lol.... You really think I need to upgrade.... I have the game set anywhere from 2x MSAA + 2x SGSAA (the visual equivalent of 8x supersampling) with a negative LOD bias computed to -1.5, up to 8x MSAA + 8x SGSAA + 4x supersampling, a -2.0 LOD bias, Ambient Occlusion forced on at High Quality, and NVIDIA 3D Vision, running this game at 2560x1600 resolution... Really, now?

The lack of a CIS degree (whose requirements are a joke already) in your portfolio further suggests your lack of actual expertise in this area. You don't even have an engineering degree. Great, you've worked in the computer hardware industry for 18 years. So you can only make deductions based on the limited inventory that you are exposed to; you aren't exposed to the theoretical-to-practical design work that an engineer is. Have you ever even walked into a Foxconn building?

Is a dietician entitled to more authority on human organs than a medical surgeon? I don't think so. One of them has an M.D.; the other is a junior college flake, if even that.

@InBleedingRapture: For every AA setting you make in NVIDIA Inspector you must use a logarithmic equation to compute the right negative LOD bias, or your image will look funky, performance will be shoddy, etc. I am currently running 4x MSAA + 4x SGSAA and my framerate is well above 45 on a single GTX 480 at a LOD bias of -1.5. If I were to make the wrong calculation and set my LOD bias to -2.0, my framerate would drop below 30.

See my posts above for the formula you have to use to get the right LOD bias. Trust me, Inspector WILL look better than FXAA. FXAA is just an on-screen modifier like ENB; that explains the "great" performance you're getting.

Edited by neilthecellist, 01 May 2012 - 09:53.


#648
InBleedingRapture

  • Members
  • 138 posts
Yeah, I'm not sure what's going on. I run two OC'd GTX 460's and a 9800 for PhysX/extra monitor. I can usually run MSAA without issue in most new games.

Monitoring my GPU usage, ME3 doesn't usually break 50% on each card, and it doesn't change when I enable MSAA. Probably just one of those things. Silly hardware conflicts.

#649
neilthecellist

  • Members
  • 450 posts
Sorry, I edited my post earlier but I'm not sure if you saw it, so here it is again:

@InBleedingRapture: For every AA setting you make in NVIDIA Inspector you must use a logarithmic equation to compute the right negative LOD bias, or your image will look funky, performance will be shoddy, etc. I am currently running 4x MSAA + 4x SGSAA and my framerate is well above 45 on a single GTX 480 at a LOD bias of -1.5. If I were to make the wrong calculation and set my LOD bias to -2.0, my framerate would drop below 30.

See my posts above for the formula you have to use to get the right LOD bias. Trust me, Inspector WILL look better than FXAA. FXAA is just an on-screen modifier like ENB; that explains the "great" performance you're getting.

#650
InBleedingRapture

  • Members
  • 138 posts
Apologies, neil and deadmeat... I should really learn to read upward. Didn't mean to come off like I know a whole bunch about this stuff. I'm self-taught when it comes to computer software/hardware and I really don't keep up with tech forums and such.

@neilthecellist
First, you need to upgrade your POS rig before I'll even dignify you with a response :P Kidding, of course.

Inspector looks much better and I will try your logarithm to smooth out the hiccups. I've always been more of a "solid 60" guy when it comes to performance, sacrificing some quality for smooth gameplay, which is why I favor post-processing methods in general. I've also talked to Boris (of ENB) and donated to him a few times, so I know a little bit about HLSL code, but not a whole lot.

I went to college for English Lit., so I lay no claim to advanced knowledge of CIS. I'm an EVGA loyalist, mainly because they produce the best quality nVidia hardware (IMO) on the market and their lifetime warranties are great. I've RMA'd parts at least a half dozen times without issue.

Again, apologies. I had no intention of inflaming any tensions.