
Graphics cards.


19 replies to this topic

#1
Naughty Bear
  • Members
  • 5,209 posts

Why do they keep releasing new ones every month or so? Why don't they just make a super powerful graphics card that can survive for 5-10 years?

 

I heard that a graphics card's potential has a peak of strength it can reach and can't go any further? Is this true? And why so?



#2
Kaiser Arian XVII
  • Members
  • 17,283 posts

I don't know. I had a case size limit and had to settle for a GTX 760, which is still good.

Right now:

Bigger cards = Stronger cards

 

and they only grow in length. I suppose we will need 1 or 1.5 meter cases to fit these super powerful graphics cards!



#3
Naughty Bear
  • Members
  • 5,209 posts

So why don't they do that then? Why is The Witcher one of the most demanding games? Can't they make a game with Pixar-level graphics and simulation and release a graphics card that can run it?

 

Or make a game that looks like CGI? What is stopping the game industry from making a game like this? And if graphics cards are not good enough, why don't they make one powerful enough to run it?



#4
Fidite Nemini
  • Members
  • 5,738 posts

Why do they keep releasing new ones every month or so?

 

They don't.

 

 

Why don't they just make a super powerful graphics card that can survive for 5-10 years?

 

Because technology marches on. That doesn't mean, however, that a top-of-the-line GPU can't run future games. It's just that as system requirements increase, the relative performance of your GPU decreases over time. You can still run pretty much every game with years-old GPUs if there aren't software compatibility chasms (say, a game requires a new API version your GPU doesn't support), just at ever lower graphical settings as games get more demanding.

 

 

I heard that a graphics card's potential has a peak of strength it can reach and can't go any further? Is this true? And why so?

 

Not sure I understand that question. Obviously any sort of hardware is performance-limited; please elaborate.



#5
Kaiser Arian XVII
  • Members
  • 17,283 posts

...

 

Even the best graphics cards right now can't run The Witcher 3 on the highest graphics settings at 4K resolution better than 35 FPS (on average)

 

They're not that powerful.



#6
Naughty Bear
  • Members
  • 5,209 posts

I don't know how to put my question into words. Hard it seems.

 

Why doesn't a company just release a super-powerful graphics card capable of running a game with CGI-looking graphics?

 

And why are The Witcher's and Arkham Knight's graphics the most we can achieve so far (unsure if this is true)? Does this make sense? Is it that it would be so expensive nobody could afford such a demanding game, or do we lack the technology to produce such a game?

 

Do graphics cards have some peak? Once it's met, they can't go any further?



#7
Fidite Nemini
  • Members
  • 5,738 posts

...

 

Even the best graphics cards right now can't run The Witcher 3 on the highest graphics settings at 4K resolution better than 35 FPS (on average)

 

They're not that powerful.

 

4K isn't the gaming standard, however. That, by and large, is still 1080p, with 1440p slowly gaining a larger market share. And there are lots of GPUs, both old and new, that have little problem running The Witcher 3 at those resolutions at varying degrees of graphical fidelity.

 

 

 

I don't know how to put my question into words. Hard it seems.

 

Why doesn't a company just release a super-powerful graphics card capable of running a game with CGI-looking graphics?

 

And why are The Witcher's and Arkham Knight's graphics the most we can achieve so far (unsure if this is true)? Does this make sense?

 

They do it all the time. Just today the new AMD R9 Fury X was released, and that kind of enthusiast GPU has plenty of power to drive most games at 4K resolution while maintaining around 60 fps. It's just that some games are more demanding than others, the same way the original Crysis was impossible to max out at high framerates until years later.

 

It is a complex interaction between GPU performance and game performance demands that drives the evolution of GPUs. New, more powerful GPUs allow developers to go crazy with visuals, and some developers go crazy enough that currently available GPUs aren't powerful enough to handle it, forcing the manufacturers to keep innovating and making their GPUs better.


  • A Crusty Knight Of Colour and Kaiser Arian XVII like this

#8
Deathangel008
  • Members
  • 4,444 posts

and they only grow in length. I suppose we will need 1 or 1.5 meter cases to fit these super powerful graphics cards!

looooool. sorry.
 

Why do they keep releasing new ones every month or so?

to cover different price ranges...? would you like it if there were only one graphics card, at $500? i wouldn't.
there is usually at least one year between new GPU series.
 

Why don't they just make a super powerful graphics card that can survive for 5-10 years?

10 years ago we had the GeForce 7XXX series, which was manufactured on a 110 nm process and only supported DX9. now we have DX11 (since 2009) and 28 nm (since 2011). this sector of technology changes and evolves way too fast to create a microprocessor that is still up-to-date in 5 years.
 

Even the best graphics cards right now can't run The Witcher 3 on the highest graphics settings at 4K resolution better than 35 FPS (on average)
 
They're not that powerful.

"4K" aka UHD has 4(!) times as many pixels as FHD, which still is by far the most common resolution.



#9
BroBear Berbil
  • Members
  • 1,516 posts

My GTX 480 lasted about 5 years in a case with poor airflow in a dusty environment. It managed to run games at 2560x1600 that entire time and heat my room during the winter. I'm satisfied with its lifespan and performance.

 

Going to get a 980 in my next build and I'll be expecting to use it for at least 5 years. I believe a 980 is overkill for current games and I'm not going to be running it at 4k. Just because they keep iterating on the tech doesn't mean you have to buy it constantly.


  • Obadiah, A Crusty Knight Of Colour and geth47 like this

#10
Fishy
  • Members
  • 5,819 posts

If you compare them to consoles, yup, video cards tend to have a lot of untapped potential.



#11
Fishy
  • Members
  • 5,819 posts

...

 

Even the best graphics cards right now can't run The Witcher 3 on the highest graphics settings at 4K resolution better than 35 FPS (on average)

 

They're not that powerful.

 

 

Or maybe they simply don't get the same ''OPTIMIZATION'' treatment that consoles get. An 8800 GT was like 4-5 times more powerful than a PS3, yet a game like Uncharted would never run on such a card. Untapped potential is real.



#12
Kaiser Arian XVII
  • Members
  • 17,283 posts

lol you people!

Never expect me to change my monitor till 2020! So my resolution will be 1600x900 till then. Hopefully the GTX 760 will be enough for a few years at "High" (not Very High or Ultimate) graphics settings.



#13
Deathangel008
  • Members
  • 4,444 posts

Or maybe they simply don't get the same ''OPTIMIZATION'' treatment that consoles get. An 8800 GT was like 4-5 times more powerful than a PS3, yet a game like Uncharted would never run on such a card. Untapped potential is real.

well, every PS3/Xbox 360 (current consoles as well, of course) had the same hardware, so it's very easy to optimize a game for it. now try to imagine how many combinations of CPU and GPU are possible with only the PC hardware of the last 4 years. it's impossible to optimize a PC game as well as its console version.



#14
A Crusty Knight Of Colour
  • Members
  • 7,466 posts

Or maybe they simply don't get the same ''OPTIMIZATION'' treatment that consoles get. An 8800 GT was like 4-5 times more powerful than a PS3, yet a game like Uncharted would never run on such a card. Untapped potential is real.


well, every PS3/Xbox 360 (current consoles as well, of course) had the same hardware, so it's very easy to optimize a game for it. now try to imagine how many combinations of CPU and GPU are possible with only the PC hardware of the last 4 years. it's impossible to optimize a PC game as well as its console version.


The truth is somewhere in between. While it's much harder to optimise for PC due to the large number of potential configurations, many developers also don't have the time, money or desire to put in the effort.

While it might suffer issues at the higher end of the scale, GTA V is remarkably well optimised for low/medium-range cards. Console GPU equivalents like the R9 270(X) and the GTX 760 can easily achieve console-level fidelity with much higher framerates. Metal Gear Solid: Ground Zeroes is another example of a well optimised game, capable of 1080p/60 FPS on even very old configurations.

A game like Dark Souls II is also light on graphical demands for its fidelity, with older flagships like the GTX 680 perfectly capable of 4K gaming.
  • Kaiser Arian XVII likes this

#15
Guest_TrillClinton_*
  • Guests

Yeah, optimizing for all the different graphics cards wouldn't make any sense from a cost-benefit standpoint. You could selectively choose the cards to optimize for, but why do that when PC gaming culture is readily prepared to customize its own hardware? It's not a decision I would make.

 

Now, on optimization itself, consoles will always be much easier and much more accessible. When optimizing for a console you are operating under a single architecture and a single graphics card. This typically means code can run close to bare metal, eliminating the abstractions that cost time and executing at the lowest level possible. It is one of the advantages of code that runs on a single model. This is why malware usually operates at a very low level.
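
Here's a rough sketch of what I mean (hypothetical C, not any real console SDK or graphics API; the names and numbers are made up purely for illustration):

    #include <stdio.h>

    /* Console: one fixed machine, so limits can be compile-time constants
       and every budget can be hand-tuned once for that exact hardware. */
    #define CONSOLE_VRAM_MB 256

    /* PC: limits differ on every user's machine, so they have to be discovered
       at runtime through an abstraction layer (stubbed out here). */
    struct gpu_caps { int vram_mb; };

    static struct gpu_caps query_gpu(void)  /* stand-in for a driver/API query */
    {
        struct gpu_caps c = { 2048 };        /* pretend this user has 2 GB of VRAM */
        return c;
    }

    int main(void)
    {
        struct gpu_caps pc = query_gpu();    /* different on every machine */
        printf("console texture budget: %d MB (fixed, hand-tuned)\n", CONSOLE_VRAM_MB / 2);
        printf("PC texture budget: %d MB (must adapt to whatever is found)\n", pc.vram_mb / 2);
        return 0;
    }

The point is just that a single known target removes whole layers of runtime checks and generic code paths, which is where a lot of that "optimization" headroom comes from.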



#16
Obadiah
  • Members
  • 5,735 posts
I once had a video card for 5 years. You just have to not give in to the hype, and not expect exceptional graphics after year 2.
  • A Crusty Knight Of Colour, Kaiser Arian XVII and goofyomnivore like this

#17
Kaiser Arian XVII
  • Members
  • 17,283 posts

I once had a video card for 5 years. You just have to not give in to the hype, and not expect exceptional graphics after year 2.

 

My last card was in use from 2011 to 2015, but only in the last two years did framerates become annoying: if I wanted something decent in graphics settings I had to settle for below 25 FPS, while at low/medium settings 40 FPS was possible (though not in games like Assassin's Creed 4 with its very rich engine).



#18
A Crusty Knight Of Colour
  • Members
  • 7,466 posts
A high-end GPU will last you 4-5 years at reasonable quality settings for most games. The flagships from late 2010, the GTX 580 and the HD 6970, can still power through a lot of games.

But people buy flagship cards because they want the best performance and best graphical settings for every game. So that often entails upgrading every 1-2 generations when they could conceivably last longer. That said, I'm not a whole lot different. I tend to upgrade every 2-3 years. I'm just too poor to go for the real flagship cards. :P

#19
Hellamarian
  • Members
  • 114 posts

That's the tech world for you. Two weeks after the newest technology is revealed, it's already outdated.



#20
Fidite Nemini
  • Members
  • 5,738 posts

That's the tech world for you. Two weeks after the newest technology is revealed, it's already outdated.

 

Technically it's already outdated before it hits the market, because R&D resumes work on more advanced iterations/new technologies as soon as a product has been approved for mass production.