Damn, you beat me to this post. I was going to call major bull**** on a lot of the points being made about shared RAM in this thread. Dead Meat is right: use GPU-Z; any gauging software built into Windows is likely going to be inaccurate as f***.
I had a GTX 590 (or was it 595? Lost track, it was last year) and although it was listed on the box as a 3 GB card, it really only operated with 1.5 GB (per GPU), which in practice works out to about 1.42 GB usable. What did Windows report? 3 GB. What did my games actually use at most? 1.42 GB. Actually, less than 1.42 GB, because some of it was taken up by background processes (like my Aero theme at the time).
I also had a 9800GTX+ back in the day, and while it was a 1 GB card, it was like the GTX 590: two GPUs crammed into one card. So each "sub-card" had only 512 MB of RAM, which made gaming difficult at the time. Visually demanding games like Crysis 1 (NOT Crysis 2) stuttered like hell in framerate on the 9800GTX+.
What did NVIDIA Inspector (similar functionality to GPU-Z) tell me for my GTX 590? 1.42 GB. What did Windows claim I had? 3 GB.
What did GPU-Z report for my 9800GTX+? 494 MB. What did Windows say? 1.2 GB (wtf?)
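If you'd rather script the check than eyeball GPU-Z, here's a rough sketch using NVIDIA's NVML Python bindings (I'm assuming you have the nvidia-ml-py package installed; very old pre-Fermi cards like the 9800GTX+ may not be supported by NVML at all). It asks the driver directly what each GPU has, which is the same kind of number GPU-Z shows instead of Windows' inflated total:
[code]
# Rough sketch: ask the driver itself how much memory each GPU has,
# instead of trusting what Windows reports.
# Assumes the nvidia-ml-py package (pynvml) is installed.
import pynvml

pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    # A dual-GPU card like the GTX 590 shows up as two devices here,
    # each with its own ~1.5 GB pool -- there is no single 3 GB pool.
    print("GPU %d: %.2f GB total, %.2f GB in use"
          % (i, mem.total / 1024.0 ** 3, mem.used / 1024.0 ** 3))
pynvml.nvmlShutdown()
[/code]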
I also want to add that unless you know how to match your components' power draw against your power supply's specs, you shouldn't overclock. Overclocking without making sure your PSU has the headroom to deliver the extra power is like trying to drive a stick-shift car without rev-matching the gears (think driving a Honda Civic in first gear at 60 MPH. Your engine will smoke, unless you match 60 MPH to the proper gear, usually fifth gear).
EDIT: The FXAA level of Mass Effect 3 can actually be overridden using NVIDIA Inspector if you have an NVIDIA GPU. There is a parameter you can set in NVIDIA Inspector that controls FXAA usage in most DirectX 9-11 games (Allow/Disallow).
EDIT 2: For those of you on NVIDIA, Dead Meat's right about how expensive AA is for your video card's performance. If you have an NVIDIA card that is a GTX 260 or above, and you want visual quality without a lot of jagginess, but you don't want your framerate to crumble (like it does with 8x multisampling, or worse, supersampling), then just use SGSSAA. SGSSAA (sparse grid supersampled anti-aliasing) is an anti-aliasing method exposed in NVIDIA drivers released after version 280.xx.
You can enable SGSSAA through NVIDIA Inspector.
Use SGSSAA in combination with MSAA, and match them correctly (think stick-shift driving). So if you set your AA to 2x multisampling, set SGSSAA to 2x as well. The visual quality will be similar to that of 2x supersampling minus the framerate hit. SGSSAA causes some visual blurring, but nowhere near as bad as FXAA.
Set your negative LOD bias to match the level of AA samples you are using. A negative LOD bias will counteract most of the blurriness that SGSSAA produces (I know it's not perfect; Dead Meat can correct me on this). Remember, SGSSAA doesn't blur the f*** out of your image like FXAA does.
The formula to calculate the correct negative LOD bias is
y = -0.5 x log2(n), where n is the number of AA samples you're using and y is the negative LOD bias you should set.
For instance, if you are running 4x multisampling/SGSSAA, punch this into your calculator:
[b]LOG 4 (DIVIDE) LOG 2 (MULTIPLY) -0.5 (ENTER)
result: -1[/b]
Therefore, set your negative LOD bias to -1.00.
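If you don't have a calculator handy, here's the same math in a few lines of Python (purely illustrative, nothing NVIDIA-specific about it):
[code]
# Negative LOD bias for SGSSAA: y = -0.5 * log2(n), where n = AA sample count
import math

for n in (2, 4, 8):
    bias = -0.5 * math.log(n, 2)
    print("%dx samples -> LOD bias %.2f" % (n, bias))
# 2x -> -0.50, 4x -> -1.00, 8x -> -1.50
[/code]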
I would post images showing the difference between FXAA (crappy AA), no AA (looks better than FXAA, though that's subject to personal opinion; I'd rather not blur out textures that people have worked tirelessly to improve over BioWare's stock s***ty texture quality), SSAA, and SGSSAA. But I'm in accounting class right now, browsing BioWare Social when I should be paying attention :x
Edited by neilthecellist, 18 May 2012 - 03:36.