
Dragon Age: Inquisition PC Screenshots, System Requirements and Hands-On!


1427 replies to this topic

#1151
Fidite Nemini
  • Members
  • 5,738 posts

970 and 980 cards can render the game at higher resolutions and shrink it back down to 1080p. Like getting 4K on 1080p.

 

That also means you take the same performance hit, since the GPU IS rendering the game at the higher resolution PLUS downsampling it to your display's native resolution. So unless your GPU has the muscle to render those higher resolutions, the "new" DSR is worth a wet fart.



#1152
SilentCid
  • Members
  • 338 posts

That also means you take the same performance hit, since the GPU IS rendering the game at the higher resolution PLUS downsampling it to your display's native resolution. So unless your GPU has the muscle to render those higher resolutions, the "new" DSR is worth a wet fart.

 

DSR scales, and you can choose what resolution it downsamples from. It doesn't necessarily have to be downsampled from 4K; it can use factors of 1.20x, 1.50x, 1.78x, 2.0x, 2.5x, 3.0x, and 4.0x of your native resolution. Right now only 970 and 980 cards can use this feature.
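For anyone doing the math on those factors: they multiply the total pixel count, so each axis scales by the square root of the factor. A quick sketch (my own numbers, assuming a 1920x1080 display):

```python
import math

NATIVE_W, NATIVE_H = 1920, 1080

# DSR factors multiply total pixel count, so each axis grows
# by the square root of the factor (4.0x of 1080p = 3840x2160).
for factor in (1.20, 1.50, 1.78, 2.00, 2.50, 3.00, 4.00):
    w = round(NATIVE_W * math.sqrt(factor))
    h = round(NATIVE_H * math.sqrt(factor))
    print(f"{factor:.2f}x -> {w}x{h}")
```

Note the oddball 1.78x lands right around 2560x1440, which is presumably why it's in the list.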



#1153
Gill Kaiser
  • Members
  • 6,061 posts

Yeah. I'm just wondering if I shouldn't get the 970 now and plan for a second 970 in the future for an SLI set-up, as future-proofing. But I hear running cards in SLI is a pain and can lead to problems, esp. if it isn't supported by the software you're running. Or at least it was when I was looking into it for the 8800 GTX.


That's what I'm doing. I'm getting one 970 now with the option to upgrade later. SLI 970s have been shown to have much better performance than one 980, and together they're not much more expensive. The only question is SLI compatibility. But even one 970 is more than enough for 1080p.

#1154
The Night Haunter
  • Members
  • 2,968 posts

I personally prefer the Radeon 290. Cheaper than the GTX and almost as good ($80 cheaper!)



#1155
naughty99
  • Members
  • 5,801 posts

That's what I'm doing. I'm getting one 970 now with the option to upgrade later. SLI 970s have been shown to have much better performance than one 980, and together they're not much more expensive. The only question is SLI compatibility. But even one 970 is more than enough for 1080p.

 

Problem is they haven't released the models with 8GB VRAM yet, so if you buy two GTX 970s now you will always be limited to a max of 4GB VRAM, starting out of the gate already below the recommended specs for Shadow of Mordor.

 

Not really a problem at all with most games, but there are a few starting to ask for more than 4GB. It would blow to have a beastly setup limited by VRAM, stuttering due to texture issues, etc., in future games.



#1156
Ryzaki
  • Members
  • 34,423 posts

Problem is they haven't released the models with 8GB VRAM yet, so if you buy two GTX 970s now you will always be limited to a max of 4GB VRAM, starting out of the gate already below the recommended specs for Shadow of Mordor.

 

Not really a problem at all with most games, but there are a few starting to ask for more than 4GB. It would blow to have a beastly setup limited by VRAM, stuttering due to texture issues, etc., in future games.

 

? I heard those requirements were wildly blown out of proportion. (Certainly wouldn't be the first time.)



#1157
naughty99
  • Members
  • 5,801 posts

? I heard those requirements were wildly blown out of proportion. (Certainly wouldn't be the first time.)

 

Usage reportedly up to 5.9GB VRAM on Titan Blacks.



#1158
Gill Kaiser
  • Members
  • 6,061 posts

Problem is they haven't released the models with 8GB VRAM yet, so if you buy two GTX 970s now you will always be limited to a max of 4GB VRAM, starting out of the gate already below the recommended specs for Shadow of Mordor.

 

Not really a problem at all with most games, but there are a few starting to ask for more than 4GB. It would blow to have a beastly setup limited by VRAM, stuttering due to texture issues, etc., in future games.

 

Aye. That's why I'm just getting one 970 now and seeing how it goes. If I gamed at a higher resolution than 1920x1200 I would wait for the 8GB versions, but I think 4GB should be more than enough for the moment, and if I need to upgrade to something else in 2-3 years I'd rather have only spent £280 on one card.



#1159
Maliken
  • Members
  • 234 posts

Shadow of Mordor isn't even a graphically impressive game, and I doubt you're going to be needing more than 4GB VRAM for the vast majority of titles in the next few years. Especially if you're gaming on a 1080p display. 



#1160
naughty99
  • Members
  • 5,801 posts

Shadow of Mordor isn't even a graphically impressive game, and I doubt you're going to be needing more than 4GB VRAM for the vast majority of titles in the next few years. Especially if you're gaming on a 1080p display. 

 

Certainly not many titles to date can make use of it, and the majority of titles in the next year or two won't either.

 

However, if 8GB versions are coming very soon, I'd rather wait a few weeks than be stuck with a 970 SLI 4GB setup that is quite powerful but possibly suffers stuttering/hitching on one of those future titles that eats VRAM for streaming high-res texture data. It seems maxing out 1080p settings now involves using DSR and similar options to render at 3K or 4K for a 1080p display.



#1161
Ryzaki
  • Members
  • 34,423 posts

Usage reportedly up to 5.9GB VRAM on Titan Blacks.

 

Yet people are playing at 60fps at 1080p on a GTX 970 just fine on ultra.

 

I mean, I could see needing the 6GB if you were going 4K, but at 1080p I'm not seeing it.

 

Well, worst comes to worst, I guess I still have another week to return it. *shrug*
 



#1162
Fidite Nemini
  • Members
  • 5,738 posts

Usage reportedly up to 5.9GB VRAM on Titan Blacks.

 

 

Most of that is preloaded and leftover data that's unnecessary. Allocation is not the same as usage. If a video game truly uses more than 2.5 GB VRAM at any given point, it's a telltale sign of horrible optimization and most likely a memory leak.

 

VRAM amount is mostly just a big number for manufacturers to fool people into buying higher-VRAM SKUs at a higher price. Decently optimized games up to WQHD resolution do not require more than 2 GB VRAM; anything more is either reserved for extreme resolutions (surround vision/4K) or simply developers having coded a mess.
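To put a number on that: the render targets themselves are a small slice of any modern card's VRAM. A back-of-the-envelope sketch (my own assumptions: 4 bytes per pixel and a handful of full-screen targets; the rest of what monitoring tools report is mostly texture data and allocation):

```python
def render_target_mb(width, height, targets=5, bytes_per_pixel=4):
    """Rough VRAM taken by full-screen render targets
    (color, depth, and a few G-buffer surfaces)."""
    return width * height * targets * bytes_per_pixel / 1024 ** 2

print(round(render_target_mb(1920, 1080)))  # about 40 MB at 1080p
print(round(render_target_mb(3840, 2160)))  # about 158 MB at 4K
```

So even quadrupling the resolution only adds on the order of a hundred megabytes of render targets; the gigabytes come from elsewhere.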



#1163
naughty99
  • Members
  • 5,801 posts

Yet people are playing at 60fps at 1080p on a GTX 970 just fine on ultra.
 
I mean, I could see needing the 6GB if you were going 4K, but at 1080p I'm not seeing it.
 
Well, worst comes to worst, I guess I still have another week to return it. *shrug*

 
Some have complained about experiencing minor stuttering despite high fps, because there is hitching when the VRAM reaches its limit.
 
Also, if you want to max out this game, you have to increase the resolution setting above your native display to enable SSAA. It renders at 150% or 200% of your display resolution and downsamples; that's what Mordor uses for anti-aliasing.

Unless you want to run the game without anti-aliasing, or force a different type from your graphics driver menu, a 1080p display needs to render at 1440p, 3K or 4K in order to enable SSAA. I'm guessing there will be more upcoming games that make use of DSR/supersampling this way, Witcher 3 most likely.
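Assuming that percentage applies per axis (as with typical resolution-scale sliders; that's my reading, not confirmed), the pixel cost grows with its square, which is why 200% is so brutal:

```python
def supersample_cost(scale_pct):
    """Relative pixel count (and roughly shading cost) for a
    per-axis resolution-scale percentage."""
    return (scale_pct / 100) ** 2

for pct in (100, 150, 200):
    w, h = 1920 * pct // 100, 1080 * pct // 100
    print(f"{pct}% -> {w}x{h}, {supersample_cost(pct):.2f}x the pixels")
```

So a 1080p display at 200% is pushing the same pixel count as native 4K.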
  

Most of that is preloaded and leftover data that's unnecessary. Allocation is not the same as usage. If a video game truly uses more than 2.5 GB VRAM at any given point, it's a telltale sign of horrible optimization and most likely a memory leak.
 
VRAM amount is mostly just a big number for manufacturers to fool people into buying higher-VRAM SKUs at a higher price. Decently optimized games up to WQHD resolution do not require more than 2 GB VRAM; anything more is either reserved for extreme resolutions (surround vision/4K) or simply developers having coded a mess.


That certainly used to be the case, but it's not just a marketing tactic anymore. There are a few games actually making use of higher amounts than we used to see a few years back, and looking down the road I imagine we will see more titles like this. DOOM 4 and Witcher 3 are probably going to be among them.
 
For example, you cannot even activate the highest settings in Wolfenstein: The New Order unless you have at least 3GB VRAM. I have no idea about the optimization of Shadow of Mordor or Watch Dogs, but Wolfenstein: The New Order is very well optimized; it makes efficient use of both CPU and GPU. It eats VRAM, however, since for best performance texture data streams between your SSD, system memory and VRAM.

#1164
Ryzaki
  • Members
  • 34,423 posts

 
Some have complained about experiencing minor stuttering despite high fps, because there is hitching when the VRAM reaches its limit.
 
Also, if you want to max out this game, you have to increase the resolution setting above your native display to enable SSAA. It renders at 150% or 200% of your display resolution and downsamples; that's what Mordor uses for anti-aliasing.

Unless you want to run the game without anti-aliasing, or force a different type from your graphics driver menu, a 1080p display needs to render at 1440p, 3K or 4K in order to enable SSAA. I'm guessing there will be more upcoming games that make use of DSR/supersampling this way, Witcher 3 most likely.
 

 

I can live with minor stuttering myself

 

Ah I see.

 

I guess I'm lucky cause I tend to prefer games on the lower requirements side of the spectrum.



#1165
Mathegs
  • Members
  • 6 posts

Guys, I'm very hyped for this game, and with the launch of the system requirements came a question that maybe you can help me with. I have an R9 270X 2GB. Do you think Mantle support is interesting enough for me to stick with my card, or can I go for a GTX 970 without fear? I'm playing on a 1080p monitor. I appreciate everyone's help.



#1166
Fredvdp
  • Members
  • 6,186 posts

Guys, I'm very hyped for this game, and with the launch of the system requirements came a question that maybe you can help me with. I have an R9 270X 2GB. Do you think Mantle support is interesting enough for me to stick with my card, or can I go for a GTX 970 without fear? I'm playing on a 1080p monitor. I appreciate everyone's help.

Mantle is useful if your CPU is a bottleneck. I have a Phenom II X4 965 BE at 3.4 GHz paired with a Radeon R9 270X 2GB, and Mantle has been very useful to me in Frostbite 3 games (BF4 and PvZ: Garden Warfare). Paired with a good CPU, Mantle GPUs offer a small framerate increase, but nothing spectacular.



#1167
Fidite Nemini
  • Members
  • 5,738 posts

Mantle is useful if your CPU is a bottleneck. I have a Phenom II X4 965 BE at 3.4 GHz paired with a Radeon R9 270X 2GB, and Mantle has been very useful to me in Frostbite 3 games (BF4 and PvZ: Garden Warfare). Paired with a good CPU, Mantle GPUs offer a small framerate increase, but nothing spectacular.

 

All the above having been said, the R9 270X is still a very good GPU on its own, so unless you really want to upgrade, it'll see you through the next couple of games without problems, and you can wait to see how AMD's Pirate Islands GPUs compare to Nvidia's Maxwell.


  • Fredvdp likes this

#1168
naughty99
  • Members
  • 5,801 posts

In case anyone is interested, a guy is currently streaming on Twitch as he assembles a $10,000 gaming PC; kinda fun to watch: http://www.twitch.tv/seriousgaming

Specs

  • Case: Corsair Obsidian 900D
  • Motherboard: ASUS Rampage V
  • CPU: Intel i7-5960X Haswell-E
  • RAM: Corsair Dominator Platinum 3200MHz 32GB
  • SSD: 2x Samsung 850 Pro-Series 1TB SSD
  • PSU: EVGA 1600 Watt Supernova
  • GPU: 2x AMD R9 295 (Custom cooled with EK water blocks)
  • Fans: 8x Corsair AF140 LED Reds
  • Lighting: Custom LEDS
  • Radiator: 2x Black Ice GTX 420 Extreme
  • Cooling: Custom Liquid Cooling by Alan


#1169
Gill Kaiser
  • Members
  • 6,061 posts

That is awesome but absurd!



#1170
Brogan
  • Members
  • 2,190 posts
All on a 24" monitor.

#1171
Enad
  • Members
  • 686 posts

All on a 24" monitor.

 

and with AMD cards, haha.

If he'd chosen Nvidia, at least he wouldn't have turned his PC into a heating unit.



#1172
Brogan
  • Members
  • 2,190 posts

and with AMD cards, haha.

If he'd chosen Nvidia, at least he wouldn't have turned his PC into a heating unit.


Yeah, even with their closed-loop cooling those things get HOT.

And he's doubling up on them.

#1173
Enad
  • Members
  • 686 posts

Regardless of the GPU brand, it seems like a huge waste to be playing on a 24" screen at 1080p (I'm assuming). If you're building a $10k PC, you'd better be doing 30" 4K... Seriously.


  • naughty99 likes this

#1174
Brogan
  • Members
  • 2,190 posts
I was joking about that. :)

I'm sure he's going to go 4K dual-screen or something.

#1175
Dark Helmet
  • Banned
  • 1,686 posts

Gods I wish I was better at all this.

 

After running through everything I know I meet the recommended specs in almost every category.

 

Don't know if my graphics card is, though. It's an Nvidia GeForce GTX 745.