/// ME3 MOD: HighRes textures + Next-Gen illumination + 3D Fix.


6,497 replies to this topic

#1126
Fredy AG
  • Members
  • 1,355 posts

SliPaladin wrote...

So basically at 1080p my 1gig GTX560Ti wont be enough... damn :/


That depends; perhaps shared system memory will be your salvation.

If I fill my video memory completely and the game still works, it's because shared system memory is helping the video card (I think).

You just need to test it. I found this:

"Shared System Memory: This usually makes up the bulk of system memory available to the GPU and is “allocated” on demand when needed. Shared system memory is really just regular application VADs which are probed and locked and made visible to the GPU. VidMm will only allow up to N bytes to be pinned down simultaneously where N is: ((Total System Memory – 512)/2 – Dedicated System Memory)"
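That quoted formula is easy to sanity-check with a quick back-of-envelope calculation (values in MB; the 512 MB constant comes straight from the quote, and this is only a sketch of the quoted cap, not a measurement of any real machine):

```python
def shared_system_memory_cap(total_system_mb, dedicated_system_mb=0):
    """Upper bound on system memory VidMm will pin for the GPU,
    per the quoted formula: ((Total System Memory - 512) / 2) - Dedicated."""
    return (total_system_mb - 512) / 2 - dedicated_system_mb

# A machine with 8 GB of RAM and no dedicated system-memory carve-out:
print(shared_system_memory_cap(8192))  # 3840.0 MB, i.e. roughly 3.75 GB
```

That lines up with the roughly 3 GB of shared memory that people with 8 GB of RAM report.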

Posted Image


//////

Edited by smarteck, 17 May 2012 - 10:22.


#1127
InBleedingRapture
  • Members
  • 138 posts
I run two GTX 460's in SLI and I have no trouble running all smarteck's mods at once. The only thing you might notice is a slight stutter when first entering or looking around a new area. Even when my vid mem is maxed out, there is no decrease in performance.

#1128
Thiagobsbr
  • Members
  • 31 posts
Thanks folks!

I've worked with images and graphics in the past, but only 2D images...
I tried to give it a go and retexture the first casual appearance (the one Shepard is wearing at the very beginning of the game), but damn, it's harder than it looks.. haha
I think it's because of the organic nature of clothing/armor, I don't know, but the result was very unsatisfactory.
I then tried to redo it using bits of Smarteck's standard Alliance suit to keep the same "feel", and the result was better, but still far from good-looking.. :(

As for pulling textures with TexMod, I've noticed that the largest textures (characters, weapons and large chunks of the map) are usually at the end, so I cycle through them backwards.
When I pulled Liara's armor, the overlay showed over 10,000 textures loaded, and hers were among the last, around 8,900 or so.

#1129
InBleedingRapture
  • Members
  • 138 posts

Thiagobsbr wrote...

Thanks folks!

I've worked with images and graphics in the past, but only 2D images...
I tried to give it a go and retexture the first casual appearance (the one Shepard is wearing at the very beginning of the game), but damn, it's harder than it looks.. haha
I think it's because of the organic nature of clothing/armor, I don't know, but the result was very unsatisfactory.
I then tried to redo it using bits of Smarteck's standard Alliance suit to keep the same "feel", and the result was better, but still far from good-looking.. :(

As for pulling textures with TexMod, I've noticed that the largest textures (characters, weapons and large chunks of the map) are usually at the end, so I cycle through them backwards.
When I pulled Liara's armor, the overlay showed over 10,000 textures loaded, and hers were among the last, around 8,900 or so.


Eeek... I'm actually working on that same outfit. Not finished, but...

Posted Image

...getting closer.

No matter, though. I plan to release my textures in a separate "pack," since we all have different styles and ways of doing things. So you're not stepping on my toes if you decide to make your own version of something I've already done.

I think smarteck mentioned it already, but if you press * on the numpad while TexMod is running, it will filter out textures that aren't currently being rendered. This can knock things down from a couple thousand textures to about one hundred.

#1130
Desuke
  • Members
  • 8 posts
Well, thank God I have a 3GB video card.

#1131
Thiagobsbr
  • Members
  • 31 posts

InBleedingRapture wrote...

Eeek... I'm actually working on that same outfit. Not finished, but...


...getting closer.

No matter, though. I plan to release my textures in a separate "pack," since we all have different styles and ways of doing things. So you're not stepping on my toes if you decide to make your own version of something I've already done.

I think smarteck mentioned it already, but if you press * on the numpad while TexMod is running, it will filter out textures that aren't currently being rendered. This can knock things down from a couple thousand textures to about one hundred.



That was another reason I stopped trying to remake that texture :)
I saw your post in another thread a few weeks ago saying you were about to start that retex, so I gave up trying.. not that I could make anything good anyway :D

#1132
Shajar
  • Members
  • 1,115 posts

SliPaladin wrote...

So basically at 1080p my 1gig GTX560Ti wont be enough... damn :/


I use all those textures (just an awesome job, guys)

1920×1080 resolution
HD5870 1GB
8GB RAM

No problems running them; I noticed a nice difference in-game, so I guess they're working well :)

#1133
InBleedingRapture
  • Members
  • 138 posts

Thiagobsbr wrote...
That was another reason I stopped trying to remake that texture :)
I saw your post in another thread a few weeks ago saying you were about to start that retex, so I gave up trying.. not that I could make anything good anyway :D


Don't sell yourself short. My freehand art (pencil & paper, painting, etc.) is about on par with a 12-year-old. 3D modeling and 2D texturing must use a different part of the brain or something. I'd be happy to see anything you come up with. I'm sure others would as well.

As some others may know, I'm not exactly secretive about how I work. That said, if you run into any trouble, I'm more than willing to offer what limited help I can give.

Don't give up ; )

#1134
Fredy AG
  • Members
  • 1,355 posts

Desuke wrote...

Well, thank God I have a 3GB video card.


Just out of curiosity, how does Windows report "Shared System Memory" on your system?

I started researching it and found the topic of video memory management interesting. :P

////// 

Edited by smarteck, 17 May 2012 - 11:25.


#1135
InBleedingRapture
  • Members
  • 138 posts
Open the Start Menu and type "system information" in the search box. This should bring up a report. It lists hardware memory and virtual memory, as well as other useful information. I'm not sure what's meant by "shared system memory."

#1136
Fredy AG
  • Members
  • 1,355 posts

Desuke wrote...

Well, thank God I have a 3GB video card.



Hahaha.. sorry.. I need to improve my English :P .... What I meant was: what value does @Desuke's system report for Shared System Memory, given that 3GB of dedicated video memory?
Sorry for the bad English ... :lol:


//////

Edited by smarteck, 17 May 2012 - 11:53.


#1137
InBleedingRapture
  • Members
  • 138 posts
...of course, you might be better off waiting for a response from one of the computer geniuses on this forum.

I know how to access information and do what I want with my PC, but just like working on cars: I know how to do it, but I don't know what it's called, hehe.

#1138
InBleedingRapture
  • Members
  • 138 posts
Oh, I see. You should be able to access that information by typing "dxdiag" in the search box. Under "Display 1" it should show your "Approx. Total Memory" for the display.

#1139
InBleedingRapture
  • Members
  • 138 posts
Nevermind, I understand you now, haha.

**EDIT**

Me = Idiot : )

Edited by InBleedingRapture, 18 May 2012 - 12:03.


#1140
Fredy AG
  • Members
  • 1,355 posts
Hahahaha....

Do not worry man. ;)

//////

Edited by smarteck, 18 May 2012 - 12:40.


#1141
Desuke
  • Members
  • 8 posts

smarteck wrote...

Desuke wrote...

Well, thank God I have a 3GB video card.



Hahaha.. sorry.. I need to improve my English :P .... What I meant was: what value does @Desuke's system report for Shared System Memory, given that 3GB of dedicated video memory?
Sorry for the bad English ... :lol:


//////


Close to 3gig I think. My main memory is 8gig.

#1142
SliPaladin
  • Members
  • 578 posts

smarteck wrote...

SliPaladin wrote...

So basically at 1080p my 1gig GTX560Ti wont be enough... damn :/


That depends; perhaps shared system memory will be your salvation.

If I fill my video memory completely and the game still works, it's because shared system memory is helping the video card (I think).

You just need to test it. I found this:

"Shared System Memory: This usually makes up the bulk of system memory available to the GPU and is “allocated” on demand when needed. Shared system memory is really just regular application VADs which are probed and locked and made visible to the GPU. VidMm will only allow up to N bytes to be pinned down simultaneously where N is: ((Total System Memory – 512)/2 – Dedicated System Memory)"




//////


I have 4GB of shared memory. 

#1143
InBleedingRapture
  • Members
  • 138 posts
I've got about 3GB shared, for a total of 4GB available VMEM. Anyone with 8GB of RAM should have about 3GB shared system memory, if that equation is accurate.

Even so, performance with high-res textures also depends on your RAM and VMEM clocks, as these control how quickly (or slowly) your computer can cycle textures in and out to be rendered. The CPU factors into it as well, but I won't go into that.

#1144
Dead_Meat357
  • Members
  • 1,122 posts
I'll try and address some of the points and keep it simple. Shared video memory isn't what you think it is. Typically shared memory comes from integrated graphics cards stealing physical RAM for use. This is NOT what you want for games. Integrated graphics are almost universally too slow to be very useful for playing anything but the oldest games. Even if a dedicated video card has been added to your system, some memory may be listed as shared as there may be an integrated graphics adapter as well. It is also worth noting that many Intel processors from the Core i3 / i5 / i7 families have integrated GPUs and some boards make use of the integrated GPU for Intel's QuickSync feature even if there are no onboard video ports. So you could have some shared memory dedicated to that, without realizing it.

Shared memory usage, depending on where you see that number, may refer to various methods of memory caching. It's not necessarily controllable, and it doesn't relate to gaming performance, so ignore it. In fact, the Windows pagefile is a form of "shared memory". Windows does report a shared memory figure, which is your physical RAM plus your current pagefile size added together; the number is normally around one third to half the size of your physical RAM added to the physical RAM total. Just make sure you have your dedicated graphics card enabled and that you are using it. If you have integrated graphics and you aren't using it, disable it.

This is what happens when someone buys a regular computer from Wal-Mart, Sam's Club, Best Buy or places like that, slaps a real video card in it and calls it done. It works, but not optimally. Additionally, 32bit OSes, or motherboards with memory remapping above 4GB disabled (even with a 64bit OS) may show less physical RAM available (even when the correct total is displayed) because device memory addresses are mapped below the 4GB boundary. 

Do not use Windows built in functions for determining video memory. It almost always reports weird and inaccurate results. I'm not sure why, but this seems to almost always be true. It shows my Radeon HD 7970 as 4 devices with 730MB of RAM each. This is NOT how it's built. The Catalyst Control Center shows 3.0GB of VRAM. Programs like GPU-Z will also provide accurate information. Do not rely on Windows to find out anything. GPU-Z is free. It also measures VRAM in use while playing games. You can set a maximum counter so that when you exit the game you can see how much memory was used. Or if you have multiple monitors as I do, you can simply watch it like a gauge in real-time.

VRAM usage also VARIES a lot by settings and system configuration. Anti-aliasing and anisotropic filtering take up a lot of it, with anti-aliasing taking the bulk. So running at 1920x1080, or "1080p", your VRAM usage may vary quite a bit from one machine to the next based on its configuration. Running at that resolution with no AA uses very little VRAM compared to running with 8x multisample anti-aliasing, and even more with supersampling. Adaptive AA and FXAA use even less. So again, your memory usage varies wildly. Mass Effect 3 uses FXAA by default, and you can't control the level of it, so very little memory is used this way. If you want larger quality increases you have to force AA through the control panel for your graphics card; I am forcing 8x MSAA on mine. So just because smarteck is running at 1080p and doesn't max out 1.5GB of VRAM doesn't mean your settings won't require more, or that a card with 1GB of RAM is insufficient. What other people are doing isn't necessarily an accurate barometer for what's going on with your machine.
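To put rough numbers on how much the frame buffer alone grows with MSAA, here's a simplified estimate. It assumes 4 bytes of color per sample, 4 bytes of depth/stencil per sample, and double-buffered color; real drivers allocate more (compression metadata, extra surfaces), so treat it as a floor, not a measurement:

```python
def framebuffer_mb(width, height, msaa_samples=1, color_buffers=2):
    """Rough frame buffer cost in MB: `color_buffers` color surfaces plus
    one depth/stencil surface, all multisampled. A floor, not a measurement."""
    bytes_per_pixel = (4 * color_buffers + 4) * msaa_samples
    return width * height * bytes_per_pixel / (1024 * 1024)

print(round(framebuffer_mb(1920, 1080, msaa_samples=1), 1))  # no AA
print(round(framebuffer_mb(1920, 1080, msaa_samples=8), 1))  # 8x MSAA
```

The 8x MSAA figure comes out roughly eight times the no-AA figure before a single texture is counted, which is exactly why two machines at the same resolution can show wildly different VRAM usage.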

Shared RAM is inconsequential to a video card with dedicated RAM, so that doesn't help you. If anything, it hurts you. If you have an onboard graphics card or GPU and it's eating system RAM, disable that crap. It's not doing you any favors if you aren't using it.

And many people do not understand their own hardware. Many of you who think you have a 2GB or 3GB video card may not. For example, few 3GB cards actually exist in a realistic sense; the Radeon HD 7970 and the 3GB GeForce GTX 580 are two such examples. The regular GeForce GTX 580 has 1.5GB of RAM, so all cards that have more are non-reference designs and cost more. These cards are relatively rare. The GeForce GTX 590 is one example of a 3GB card that doesn't actually give you 3GB of RAM in a practical sense. Sure, the card physically has that much, but you only get 1.5GB effectively. Some cards rely on internal SLI or CrossFire between dual GPUs for their performance. These cards have two separate frame buffers (your video card's RAM), and everything must be duplicated in each frame buffer; that's simply how the technology works. So while you may think you have 3GB of RAM on your video card, you probably don't. Again, such cards are relatively rare, and typically you had to spend well north of $500 on the video card alone to get that.

Hopefully all that mess helps clear some things up. This forum doesn't have multiquotes, so it's hard to highlight all the topics and address them one by one.

Edited by Dead_Meat357, 18 May 2012 - 03:15.


#1145
Dead_Meat357
  • Members
  • 1,122 posts

InBleedingRapture wrote...

I've got about 3GB shared, for a total of 4GB available VMEM. Anyone with 8GB of RAM should have about 3GB shared system memory, if that equation is accurate.


Doesn't matter. Ignore that.

InBleedingRapture wrote...
Even so, performance with high-res textures also depends on your RAM and VMEM clocks, as these control how quickly (or slowly) your computer can cycle textures in and out to be rendered. The CPU factors into it as well, but I won't go into that.


Sort of. While higher video memory clock speeds do increase fill rates and theoretical memory bandwidth, at some point you hit a wall of diminishing returns; it doesn't do you any good if the GPU isn't powerful enough to leverage it. Some GPU architectures are more sensitive to this than others. Sometimes your GPU clocks make a world of difference and other times less so: on the Radeon HD 7970 they make a huge difference; on the GTX 580 they made little. Also, system RAM clocks up to DDR3-1600 are one thing; past that, they will not help you in games. The differences are usually minute and only show up in certain benchmarks. Nothing more.
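For reference, the theoretical peak bandwidth that memory clocks buy is just effective clock times bus width; a quick sketch (the example numbers are the Radeon HD 7970's reference specs):

```python
def mem_bandwidth_gbs(effective_clock_mhz, bus_width_bits):
    """Theoretical peak memory bandwidth in GB/s:
    effective clock (MHz) x bus width (bits) / 8 bits-per-byte."""
    return effective_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

# Reference Radeon HD 7970: 5500 MHz effective GDDR5 on a 384-bit bus
print(mem_bandwidth_gbs(5500, 384))  # 264.0 GB/s
```

As the post says, raising that theoretical number past what the GPU can actually consume buys you nothing in games.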

#1146
neilthecellist
  • Members
  • 450 posts
Damn, you beat me to this post. I was going to call major bull**** on a lot of the points being made on shared RAM on this thread. Dead Meat is right, use GPU-Z, any gauging software built into Windows is likely going to be inaccurate as f***.

I had a GTX 590 (or was it 595? Lost track, it was last year) and although it was listed on the box as a 3 GB card, it really only operated at 1.5 GB, which in reality is close to about 1.42 GB. What did Windows report? 3 GB. What did my games prove to only use up to? 1.42 GB. Actually, less than 1.42 GB, because some of it was being used for background processes (like my Aero themes at the time).

I also had a 9800GTX+ back in the day, and while it was a 1 GB card, it was like the GTX 590: two GPUs crammed into one card. So each "sub-card" had only 512 MB of RAM. It made gaming difficult at the time. Visually expensive games like Crysis 1 (NOT Crysis 2) sputtered like hell in framerate on the 9800GTX+.

What did NVIDIA Inspector (similar functions to GPU-Z) tell me for my GTX 590? 1.42 GB. What did Windows claim I had? 3 GB

What did GPU-Z report for my 9800GTX+? 494 MB. What did Windows say? 1.2 GB (wtf?)

I also want to add that unless you know how to match power specs from a power supply, you shouldn't overclock. Overclocking without forcibly amping up your PSU's power output is like trying to drive a stick-shift car without rev-matching the gears (think driving a Honda Civic in first gear at 60 MPH. Your engine will smoke, unless you match 60 MPH to the proper gear, usually fifth gear).

EDIT: The FXAA level of Mass Effect 3 can actually be overridden using NVIDIA Inspector if you have an NVIDIA GPU. There is a parameter you can set in NVIDIA Inspector that controls FXAA usage in most DirectX 9-11 games (Allow/Disallow).

EDIT 2: For those of you on NVIDIA, Dead Meat's right about how expensive AA functionality is for your video card's performance. If you have an NVIDIA card that is a GTX 260 or above, and you want visual quality without a lot of jagginess but don't want your framerate to crumble (like with 8x multisampling, or worse, supersampling), then just use SGSSAA. SGSSAA is an anti-aliasing method available in NVIDIA drivers released after version 280.xx.

You can enable SGSSAA through NVIDIA Inspector.

Use SGSSAA in combination with MSAA, and match them correctly (think stick-shift driving). So if you set your AA to 2x multisampling, set SGSSAA to 2x as well. The visual quality will be similar to that of 2x supersampling, minus the framerate hit. SGSSAA causes some visual blurring, but nowhere near as bad as FXAA.

Set your negative LOD bias to match the level of AA samples you are using. A negative LOD bias will counteract most of the blurriness that SGSSAA produces (I know it's not perfect; Dead Meat can correct me on this). Remember, SGSSAA doesn't blur the f*** out of your image like FXAA does.

The formula to calculate the correct negative LOD bias is y = -0.5 × log2(n), where n is the number of samples you're using and y is the negative LOD bias you'll get.

For instance, if you are running multisampling/SGSSAA with 4 samples, punch into your calculator:

LOG 4 ÷ LOG 2 × -0.5 = -1

Therefore, set the negative LOD bias to -1.00.
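Those calculator steps are just evaluating y = -0.5 · log2(n); as a quick sketch:

```python
import math

def negative_lod_bias(samples):
    """Negative LOD bias to pair with SGSSAA: y = -0.5 * log2(n),
    where n is the number of AA samples."""
    return -0.5 * math.log2(samples)

for n in (2, 4, 8):
    print(f"{n}x -> LOD bias {negative_lod_bias(n)}")  # -0.5, -1.0, -1.5
```

So 4 samples gives -1.0, matching the calculator example above.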

I would post images showing the difference between FXAA (crappy AA), no AA (which looks better than FXAA, though that's a matter of personal opinion; I'd rather not blur out textures that people have worked tirelessly to improve over BioWare's stock s***ty texture quality), SSAA, and SGSSAA. But I'm in accounting class right now, surfing BioWare Social when I should be paying attention :x

Edited by neilthecellist, 18 May 2012 - 03:36.


#1147
Dead_Meat357
  • Members
  • 1,122 posts

neilthecellist wrote...

Damn, you beat me to this post. I was going to call major bull**** on a lot of the points being made on shared RAM on this thread. Dead Meat is right, use GPU-Z, any gauging software built into Windows is likely going to be inaccurate as f***.

I had a GTX 590 (or was it 595? Lost track, it was last year) and although it was listed on the box as a 3 GB card, it really only operated at 1.5 GB, which in reality is close to about 1.42 GB. What did Windows report? 3 GB. What did my games prove to only use up to? 1.42 GB. Actually, less than 1.42 GB, because some of it was being used for background processes (like my Aero themes at the time).

What did NVIDIA Inspector (similar functions to GPU-Z) tell me? 1.42 GB.

I also want to add that unless you know how to match power specs from a power supply, you shouldn't overclock. Overclocking without forcibly amping up your PSU's power output is like trying to drive a stick-shift car without rev-matching the gears (think driving a Honda Civic in first gear at 60 MPH. Your engine will smoke, unless you match 60 MPH to the proper gear, usually fifth gear).

EDIT: The FXAA level of Mass Effect 3 can actually be overridden using NVIDIA Inspector if you have an NVIDIA GPU. There is a parameter you can set in NVIDIA Inspector that controls FXAA usage in most DirectX 9-11 games (Allow/Disallow).


Both the Catalyst Control Center and the NVIDIA Control Panel can override the in-game AA settings as well, though NVIDIA does tend to override user profiles if it has a "better" one, which I guess NVIDIA Inspector can override in turn. I've never used it. I'm not using my GTX 580s at present, though I will be upgrading to NVIDIA cards soon; NVIDIA's GeForce GTX 680 and 690 have simply left AMD behind in terms of performance. (This was not the case when I purchased the Radeon HD 7970.)

As for overclocking, yeah, the normal TDP values in your hardware's specifications go out the WINDOW when you overclock. Your 90-watt CPU can easily hit 120 watts. Your 300-watt GPU normally uses no more than about 270 watts in reality, but when you overclock you can push it to 400 watts. If you have multiple GPUs, things can get out of hand real quick.
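A rough way to see why TDP goes out the window: dynamic power scales roughly with frequency times voltage squared. This sketch models only that dynamic-power term (leakage is ignored, so it understates the real increase), and the example clocks and voltages are made-up illustrative numbers, not any particular card's:

```python
def overclocked_power_w(base_watts, base_mhz, oc_mhz, base_v, oc_v):
    """Rough dynamic-power scaling: P ~ f * V^2.
    Ignores leakage current, so real draw rises even faster."""
    return base_watts * (oc_mhz / base_mhz) * (oc_v / base_v) ** 2

# Hypothetical 250 W GPU pushed from 925 MHz @ 1.175 V to 1125 MHz @ 1.250 V:
print(round(overclocked_power_w(250, 925, 1125, 1.175, 1.250)))
```

Even that modest-looking overclock lands well above the stock figure, which is why PSU headroom matters before you start turning dials.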

#1148
neilthecellist
  • Members
  • 450 posts
What I mean is that NVIDIA Inspector actually lets you control FXAA usage directly, in addition to just overriding or enhancing the in-game AA. If you override the in-game AA with whatever setting you want (i.e. SGSSAA, MSAA, SSAA, whatever) but don't touch FXAA's setting in Inspector, then Mass Effect 3 will run FXAA in addition to the AA you set in Inspector. So if you're trying to free up video memory for Mass Effect 3, it's important to force FXAA off first, before applying additional AA methods.

EDIT: I think FXAA is an NVIDIA creation. It was listed in a press release a while back before Battlefield 3 was released.

Edited by neilthecellist, 18 May 2012 - 03:40.


#1149
Dead_Meat357
  • Members
  • 1,122 posts

neilthecellist wrote...

What I mean is that NVIDIA Inspector actually lets you control FXAA usage directly, in addition to just overriding or enhancing the in-game AA. If you override the in-game AA with whatever setting you want (i.e. SGSSAA, MSAA, SSAA, whatever) but don't touch FXAA's setting in Inspector, then Mass Effect 3 will run FXAA in addition to the AA you set in Inspector. So if you're trying to free up video memory for Mass Effect 3, it's important to force FXAA off first, before applying additional AA methods.

EDIT: I think FXAA is an NVIDIA creation. It was listed in a press release a while back before Battlefield 3 was released.


Yes, FXAA is NVIDIA's implementation, but it is compatible with AMD cards as well. The way it works in-game isn't hardware-specific.

#1150
neilthecellist
  • Members
  • 450 posts
Screenshots showing the difference between No AA, 4x MSAA, 4xMSAA plus MSTRAA, and 4xMSAA plus 4x SGSSAA. 

http://www.geforce.c...liasing-Off.png 

http://www.geforce.c...sing-4xMSAA.png 

http://www.geforce.c...-And-MSTRAA.png 

http://www.geforce.c...nd-4xSGSSAA.png 

No choice but to just post the URLs, since BioWare Social's BBCode system is broken.

EDIT: Note, the person who posted these images did not apply an LOD bias, so the SGSSAA image is a little blurry. Setting the negative LOD bias to -1 would have made it much less blurry. Still, even SGSSAA alone, without the -1 LOD bias, looks better than any form of FXAA.

EDIT 2: I've emailed the person who posted that article; hopefully he will add the information I've detailed here in this thread. As a bonus, I asked if he could link to this thread from his article on Mass Effect 3 tweaks!

Edited by neilthecellist, 18 May 2012 - 04:17.