

A much needed technicality: VRAM stacking


18 replies to this topic

#1
UniformGreyColor

  • Members
  • 1,455 posts

I really do hope this game supports VRAM stacking, because if it does not, I will not be able to play this game on max settings, and I have a hell of a PC at this point.

 

Does anyone know whether VRAM will stack for ME:A?



#2
ArabianIGoggles

  • Members
  • 478 posts

As far as I know, VRAM stacking doesn't work with Nvidia cards, so I assume you're using an AMD setup. Either way, I doubt it will be included.



#3
DaemionMoadrin

  • Members
  • 5,864 posts

As far as I know, VRAM stacking doesn't work with Nvidia cards, so I assume you're using an AMD setup. Either way, I doubt it will be included.

 

DA:I was better optimized for AMD, so...



#4
UniformGreyColor

  • Members
  • 1,455 posts

DirectX 12 (which requires Windows 10 for 900-series Nvidia cards to fully support it) supposedly supports VRAM stacking for Nvidia cards, but only if the game supports it. That is what I am wondering about. Will ME:A support VRAM stacking?
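For what it's worth, a game can actually ask DirectX 12 whether the linked cards are exposed for this. A minimal sketch (GetNodeCount() is the real D3D12 call; the rest is just my illustration, and it needs SLI enabled plus a driver that exposes linked nodes):

    #include <d3d12.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    int main() {
        // Create a D3D12 device on the default adapter (Windows 10 only).
        ID3D12Device* device = nullptr;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device)))) {
            std::printf("No DirectX 12 device available.\n");
            return 1;
        }
        // With SLI enabled, the driver can expose the linked cards as
        // multiple "nodes" on one device. More than one node is the
        // prerequisite for a game to manage each card's VRAM separately.
        std::printf("Linked GPU nodes: %u\n", device->GetNodeCount());
        device->Release();
        return 0;
    }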



#5
ArabianIGoggles

  • Members
  • 478 posts

What GPUs are you using that you'd actually need VRAM stacking?



#6
UniformGreyColor

  • Members
  • 1,455 posts

I'm using 2x ASUS STRIX GTX 980s with 4GB of VRAM each. The CPU is not a problem, as I have a 5820K (at least I think...?).



#7
ArabianIGoggles

  • Members
  • 478 posts

I'm using 2x ASUS STRIX GTX 980s with 4GB of VRAM each. The CPU is not a problem, as I have a 5820K (at least I think...?).

What resolution are you playing at that you'd require more than 4GB?



#8
Novak

  • Members
  • 370 posts

VRAM stacking is kind of difficult, and it's not just some magic the devs have to implement. While it is true that such features have to be implemented on a game-by-game basis, that's not the whole story. Drivers have to be pushed out, and DirectX 12 or the new Vulkan API (depending on how development goes) also have to be updated, etc.

 

Now, this is still on a game-by-game basis, so devs have to implement their specific features through the API in order for it to work properly. Why, you might ask? Well, there are many reasons, but some are more important than others.

First off: PCI-E bandwidth limitations. Even though PCI-E is a pretty fast bus, it's not VRAM-fast, not even slightly. So if you just said "okay, that's one big fat pool of memory which the chip can access," you would be bound to have congestion issues. Say one stack of textures is written to the VRAM of GPU 1, but GPU 2 needs one of those textures; GPU 2 can't simply access that VRAM and grab the texture to use. It first has to fetch it over the lower-bandwidth bus.

And now comes the reason why stacked VRAM wasn't actually possible before DX12 and its new features. The key here is asynchronous shaders. To explain this, you need to know (in case you don't; if you do already, feel free to ignore this) how SLI/Crossfire actually works without DX12. Each GPU renders every other frame, so GPU 1 gives you frames 1, 3, 5, 7 and so on, while GPU 2 gives you frames 2, 4, 6, 8 and so on. One GPU serves as the output module, and since chips aren't always exactly alike, one is faster than the other; that is one of the reasons (there are more, but they're not important right now) why SLI/Crossfire sometimes stutters even at high frame rates.
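To put ballpark numbers on that bus gap (the bandwidth figures are rough spec-sheet values, and the 64 MB texture is just an example):

    #include <cstdio>

    int main() {
        const double pcie_gbps = 16.0;   // PCI-E 3.0 x16, ~16 GB/s (ballpark)
        const double vram_gbps = 224.0;  // GTX 980 GDDR5, ~224 GB/s
        const double texture_gb = 64.0 / 1024.0;  // one 64 MB texture

        // Time for GPU 2 to fetch a texture parked in GPU 1's VRAM,
        // versus reading it from its own VRAM.
        std::printf("over PCI-E: %.2f ms, from local VRAM: %.2f ms\n",
                    texture_gb / pcie_gbps * 1000.0,
                    texture_gb / vram_gbps * 1000.0);
        // ~3.91 ms vs ~0.28 ms -- and a whole frame at 60 fps is only
        // ~16.7 ms, so remote fetches eat your frame budget fast.
        return 0;
    }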

Anyway, with asynchronous shaders the two GPUs are actually allowed to carry out completely different tasks: say one does the physics calculations and texturing while the other does all the shading and calculates the light sources (an oversimplification, but that's basically how the splitting works). This is unlike before, where each GPU had to do everything for each frame.
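In DX12 terms, that task splitting looks roughly like this. A sketch assuming device is a linked two-node ID3D12Device (the geometry/lighting assignment is just an example):

    // One command queue per GPU node, so each chip can be fed a
    // different kind of work. Node masks are bits: 0x1 = first GPU,
    // 0x2 = second GPU.
    D3D12_COMMAND_QUEUE_DESC descGpu1 = {};
    descGpu1.Type     = D3D12_COMMAND_LIST_TYPE_DIRECT;
    descGpu1.NodeMask = 0x1;                 // first GPU: geometry/physics
    D3D12_COMMAND_QUEUE_DESC descGpu2 = descGpu1;
    descGpu2.NodeMask = 0x2;                 // second GPU: shading/lighting

    ID3D12CommandQueue* queueGpu1 = nullptr;
    ID3D12CommandQueue* queueGpu2 = nullptr;
    device->CreateCommandQueue(&descGpu1, IID_PPV_ARGS(&queueGpu1));
    device->CreateCommandQueue(&descGpu2, IID_PPV_ARGS(&queueGpu2));
    // Command lists recorded for node 1 and node 2 now run concurrently,
    // instead of both cards duplicating every frame's work (AFR).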

 

With that, you can effectively stack your VRAM and make it work like it's just one big pile, since the data needed to calculate one thing can be loaded into each GPU's VRAM according to what the respective chip actually needs for its task. It's still not a true stack, since you can't just dump all your data into your VRAM pile, call it done, and let the chips sort it out.
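And loading data into each GPU's VRAM according to its task looks like this, again as a sketch against a two-node device (the texture size and format are made up):

    // Commit a texture to the VRAM of one specific GPU node. Making it
    // visible to both nodes (0x3) is allowed, but then reads from the
    // other node go over the bridge/PCI-E -- the congestion problem above.
    ID3D12Resource* CreateTextureOnNode(ID3D12Device* device, UINT nodeMask) {
        D3D12_HEAP_PROPERTIES heap = {};
        heap.Type             = D3D12_HEAP_TYPE_DEFAULT;
        heap.CreationNodeMask = nodeMask;  // where the memory physically lives
        heap.VisibleNodeMask  = nodeMask;  // which GPUs may touch it

        D3D12_RESOURCE_DESC desc = {};
        desc.Dimension        = D3D12_RESOURCE_DIMENSION_TEXTURE2D;
        desc.Width            = 2048;
        desc.Height           = 2048;
        desc.DepthOrArraySize = 1;
        desc.MipLevels        = 1;
        desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
        desc.SampleDesc.Count = 1;

        ID3D12Resource* texture = nullptr;
        device->CreateCommittedResource(&heap, D3D12_HEAP_FLAG_NONE, &desc,
            D3D12_RESOURCE_STATE_COPY_DEST, nullptr, IID_PPV_ARGS(&texture));
        return texture;
    }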

 

Now we get to the fun part: do you actually need all this crap? Well, asynchronous shading actually makes for better performance even on single GPUs, since it's a much more effective way to let the GPU work (provided the hardware supports it); it's simply a new hardware feature in the GPU's architecture.

As for the VRAM advantages, it's kind of tricky. SLI/Crossfire will work better regardless, since that too benefits enormously from the way tasks are handled, but the seemingly gained VRAM is very often not really the reason for it. A gain in VRAM is only really useful if you've hit the cap of your current VRAM, so that the GPU has to evict and rewrite parts of the data sets it actually still needs; that takes a toll. Say you have two GPUs with 4 GB of VRAM each. If you play a game that only ever uses 3 GB at most, expanding the VRAM to 8 GB doesn't give you an advantage, since all the data can already be stored. If your game actually needs 6 GB, then the stacked VRAM gives you an advantage. My GPU has 4 GB of VRAM, and outside of 4K I only managed to hit the cap once, and I play at 1440p. In the 4K range you have higher VRAM requirements, and it might actually become useful in the future.

 

So if you specifically care about VRAM stacking, I can only tell you this: if you're playing at 4K and mod the **** out of your games, then it's probably useful; if not, then VRAM is not your limiting factor. If you want to know whether your current games suffer from VRAM overflow problems, you can check while playing: if the VRAM is full, then DX12 and VRAM stacking might give you an advantage.
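If you want raw numbers instead of an overlay, Windows 10's DXGI can report usage versus budget directly. A sketch using QueryVideoMemoryInfo (note it reports the calling process's usage, so in practice this lives inside the game or an overlay tool):

    #include <dxgi1_4.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    int main() {
        // Grab the first (primary) graphics adapter.
        IDXGIFactory4* factory  = nullptr;
        IDXGIAdapter1* adapter  = nullptr;
        IDXGIAdapter3* adapter3 = nullptr;
        CreateDXGIFactory1(IID_PPV_ARGS(&factory));
        factory->EnumAdapters1(0, &adapter);
        adapter->QueryInterface(IID_PPV_ARGS(&adapter3));

        // "Local" segment group = the card's own VRAM. Budget is what the
        // OS currently grants this process; CurrentUsage is what's taken.
        DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);
        std::printf("VRAM: %.0f MB used of %.0f MB budget\n",
                    info.CurrentUsage / (1024.0 * 1024.0),
                    info.Budget / (1024.0 * 1024.0));
        adapter3->Release(); adapter->Release(); factory->Release();
        return 0;
    }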


  • Joseph Warrick and UniformGreyColor like this

#9
Novak

  • Members
  • 370 posts

Why do I even bother with those texts?



#10
Novak

  • Members
  • 370 posts

I'm using 2x ASUS STRIX GTX 980s with 4GB of VRAM each. The CPU is not a problem, as I have a 5820K (at least I think...?).

 

No, the CPU is probably not your problem; everything gets tasked to the GPU nowadays anyway (I'm exaggerating, of course, but the reasons are complicated).



#11
UniformGreyColor

  • Members
  • 1,455 posts

@Novak,

 

ME:A is going to have sky-high graphical potential. I likely will not be playing at 4K, simply because I prefer high graphics settings over high resolution. That said, with DA:I I saw my VRAM usage hit over 3.2 GB at 1080p. Granted, this was just on a single 980 with the graphics maxed, but still. If Andromeda is going to be what I expect, then playing at my native resolution of 1440p and hitting the 4 GB VRAM ceiling is likely going to be a problem even with 2x 980s and DX12 (not to mention that I won't be able to play at max settings at that resolution).
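Back-of-the-envelope for why I think so (the split between fixed texture memory and resolution-scaled buffers is pure guesswork on my part):

    #include <cstdio>

    int main() {
        // Assumed split of the 3.2 GB I saw in DA:I at 1080p: most of it
        // is textures (resolution-independent), the rest is render targets
        // that scale with pixel count. The 1.0 GB figure is just my guess.
        const double fixedGB   = 2.2;
        const double scalingGB = 1.0;
        const double ratio = (2560.0 * 1440.0) / (1920.0 * 1080.0); // ~1.78x

        std::printf("Estimated 1440p usage: %.1f GB (on a 4 GB card)\n",
                    fixedGB + scalingGB * ratio);
        // ~4.0 GB -- right at the ceiling, which is why I'm worried.
        return 0;
    }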



#12
Novak

  • Members
  • 370 posts

@Novak,

 

ME:A is going to have sky-high graphical potential. I likely will not be playing at 4K, simply because I prefer high graphics settings over high resolution. That said, with DA:I I saw my VRAM usage hit over 3.2 GB at 1080p. Granted, this was just on a single 980 with the graphics maxed, but still. If Andromeda is going to be what I expect, then playing at my native resolution of 1440p and hitting the 4 GB VRAM ceiling is likely going to be a problem even with 2x 980s and DX12 (not to mention that I won't be able to play at max settings at that resolution).

 

Not that sky-high. The thing that fills up VRAM the most is textures, and since those new shitty consoles have poor texture processing and only play at 900p and 720p, it's not that likely the textures will be super high-res, so they won't fill up the VRAM. That being said, since the consoles use a unified memory system and have a little more than 6 GB to work with (some of the RAM is allocated to the OS and cannot be used to dump game assets into), you might have the potential to actually use all of it. Which comes back to the fact that GPUs handle most of the tasks associated with gaming these days, so the GPU has to do the majority of the workload and not that much gets dumped into system RAM anymore.

Add to this that most games these days are developed with the console in mind, so resource allocation and memory usage get optimized for that. Which is just a fancy way of saying devs are too lazy (to be fair, not too lazy, but they don't get the budget for a proper PC port) to make a good PC port and just reuse the same resource management, with the attitude that PC gamers will put more money into their PCs to power through poor optimization.


  • UniformGreyColor likes this

#13
UniformGreyColor

  • Members
  • 1,455 posts

I was under the impression that the PS4 goes up to 1080p, at least for some games (not a huge difference, but of noted consequence). I will also say that I still want to play at 40-50 fps (which is my preference), gaming at 1440p with at least high settings. Are you saying I can likely still do that (of course we don't know that much yet)? Here is what I'm seeing: this game will likely not launch until at least late this year, and possibly not until the same time next year (I have a suspicion that BW is gunning for another GOTY game, and coming out early in the year never seems to work for anyone), so I can't help but feel that this game is going to be close to nothing we have seen graphically on PC. I know devs don't make games for PC anymore, but I can't help but think that they will be catering to a PC audience as well.



#14
Novak

  • Members
  • 370 posts

Anyway, that was refreshing. We need more topics like this :D Usually I don't spew out all that information (however not in-depth it might be in reality), but I feel like if people ask about specific features or whatever, they need to know the bigger picture of how graphics computing works (or, respectively, what impacts what).



#15
Novak

  • Members
  • 370 posts

I was under the impression that the PS4 goes up to 1080p, at least for some games (not a huge difference, but of noted consequence). I will also say that I still want to play at 40-50 fps (which is my preference), gaming at 1440p with at least high settings. Are you saying I can likely still do that (of course we don't know that much yet)? Here is what I'm seeing: this game will likely not launch until at least late this year, and possibly not until the same time next year (I have a suspicion that BW is gunning for another GOTY game, and coming out early in the year never seems to work for anyone), so I can't help but feel that this game is going to be close to nothing we have seen graphically on PC. I know devs don't make games for PC anymore, but I can't help but think that they will be catering to a PC audience as well.

 

Yes, of course they will, but I doubt that Bioware will be able to handle top-tier graphics, especially since Frostbite is relatively new and so is DX12, and only DICE really knows how to get everything out of them. But they have learned from DA:I, so they might get pretty close.

Nobody knows anything about the game yet, so I can't give you the answer you seek. I can only make guesses based on how consoles work and how Frostbite works as it is right now. I think ME:A will be on the level of Battlefront. You also have to look at development time, especially at when the development of world building etc. started; that's about the level of graphics you're gonna get.

 

As for the console resolution: they do output 1080p, but so did most last-gen games, and that's with upscaling. Many games actually are rendered at 1080p, but the games I'm talking about (BF, Battlefront, etc.) are the heavy hitters in terms of graphics, and many of them are already too much for the consoles to handle. I wrote another text in another thread about why exactly that is. If you like, you can check it out.


  • UniformGreyColor likes this

#16
Novak

  • Members
  • 370 posts

Oh yeah, your question: with 2x 980s and by sticking to 1080p, I would say most definitely (again, to the best of my knowledge, but I'd say that's a safe assumption). As for 1440p, I'm not sure; everything above 1080p is still pretty unpredictable performance-wise. Some games do really well (this mostly comes down to how much the dev cares, since above 1080p is still more of a niche market, and again outside the console range, so resource management is most likely not optimized for it; you have to shift some things to make that work) and others don't. You can also expect your GPUs to get better next year if the current trend continues: Nvidia GPUs had 20% better performance overall at the end of their life cycle than at the beginning, achieved simply through better drivers pushed later on.


  • UniformGreyColor likes this

#17
UniformGreyColor

  • Members
  • 1,455 posts

Yes, of course they will, but I doubt that Bioware will be able to handle top-tier graphics, especially since Frostbite is relatively new and so is DX12, and only DICE really knows how to get everything out of them. But they have learned from DA:I, so they might get pretty close.

Nobody knows anything about the game yet, so I can't give you the answer you seek. I can only make guesses based on how consoles work and how Frostbite works as it is right now. I think ME:A will be on the level of Battlefront. You also have to look at development time, especially at when the development of world building etc. started; that's about the level of graphics you're gonna get.

 

As for the console resolution: they do output 1080p, but so did most last-gen games, and that's with upscaling. Many games actually are rendered at 1080p, but the games I'm talking about (BF, Battlefront, etc.) are the heavy hitters in terms of graphics, and many of them are already too much for the consoles to handle. I wrote another text in another thread about why exactly that is. If you like, you can check it out.

 

I'd like to check it out. Can you link me?



#18
Novak

  • Members
  • 370 posts

http://forum.bioware...-pc/?p=20030660



#19
Novak

  • Members
  • 370 posts

Oh yeah, in case you're interested: I'm in the process of buying a 4K monitor (cine 4K, not that shitty UHD :D). I can tell you how much VRAM that eats up.