
MEA:A on console or PC?


  • Please log in to reply
150 replies to this topic

#126
Novak
  • Members
  • 370 posts

Get a pretty cheap mainboard, some 8-core AMD chip, 8 GB of RAM (maybe 12, but we'll see) and a kick-ass GPU and you'll be golden. More and more of the actual calculations are done by the GPU, and that's pretty much all you ever need to upgrade.



#127
Zatche
  • Members
  • 1,222 posts
While I have a PS4, I prefer the keyboard-and-mouse setup for shooters, and I recently installed a new graphics card and an SSD. So, PC with a wireless keyboard and mouse, hooked up to the TV, it is.

#128
Osena109
  • Members
  • 2,557 posts

Yeah, I would build my own, as I know how, but because of my disability I can't. My current PC I got from Cyberpower, where I used their configurator to choose the components I wanted and had them assemble it for me. Glad I did, though, as it's proven itself a great machine since then.

I am not disabled, but I have massive hands and I can't get them into a case, so I'm in the same boat. I see nothing wrong with buying a PC from a respectable company that offers you a warranty.



#129
caradoc2000
  • Members
  • 7,550 posts

I haven't played my games on PC since 2006, so X1 for me.



#130
Spectr61
  • Members
  • 720 posts

Right, but the GPU wars are almost over. 
 
With SW Battlefront coming out, it should prove that to you.  I guess not.


Huh?
  • pdusen likes this

#131
straykat
  • Members
  • 9,196 posts

Consoles have to deal with sub-par porting just like PCs do; Bayonetta on PS3, for example, had half the framerate and longer loading times than the 360 version.

 

That's probably less of a problem now... they're practically the same architecture. The PS3 was a little odd.

 

 

Anyway, I just buy laptops these days, so I don't buy too many intensive games on them. I have the two consoles for that. Works for me... but I do miss the option sometimes.



#132
shodiswe
  • Members
  • 4,999 posts

PC



#133
BennyORWO
  • Members
  • 87 posts

PC



#134
Ahglock
  • Members
  • 3,660 posts

I am not disabled, but I have massive hands and I can't get them into a case, so I'm in the same boat. I see nothing wrong with buying a PC from a respectable company that offers you a warranty.


I bought my most recent one. Shopping for parts and doing it myself would have saved $350 or so. I'll gladly pay that for labor, a warranty, and a lack of stress. No matter how many times I do it, it stresses me out knowing I'm shoving a $500 part hard into another $500 part, and they all look kind of fragile. I've never broken anything, but I'm always thinking about it during the build. Then there's the testing and troubleshooting. Yeah, I'll bump it up to a $1,600 machine instead of $1,250. And while I may not be able to run ultra for 5 years, I can run better than a console for longer than that. So I don't feel the need to replace it that often.

Nothing against consoles; I own all the current-gen ones now, and when the new Nintendo comes out I'll get that as well. But people are vastly inflating the cost of PCs. Yeah, I could have dropped $5,000 or whatever, but you can get out for well under $2,000 and still run games on ultra for the next 2-3 years and on high for 5+.

#135
rspanther
  • Members
  • 1,037 posts

Since I don't have a console I guess I'll have to play it on my PC.



#136
Bizantura
  • Members
  • 990 posts

Never owned a console, and I like keyboard + mouse. The drawback of the PC can be getting a game to run, whereas console games always run immediately out of the box.

Even cheap PC components can be far better than the ones used in consoles, even more so if you build the PC yourself.

The more you like to tinker, hardware- and software-wise, the more you will like the PC. If not, go for a console.



#137
goishen
  • Members
  • 2,426 posts

Huh?

 

The GPU wars are almost over, much in the same way that the gigahertz wars were over.  I'm saying that the photorealistic look of a game like SW Battlefront can't prove that to you; nothing can.



#138
Novak
  • Members
  • 370 posts

The GPU wars are almost over, much in the same way that the gigahertz wars were over.  I'm saying that the photorealistic look of a game like SW Battlefront can't prove that to you; nothing can.

 

Yeah, not really. It's not even close to photorealistic in many respects. Particle emitter counts are still low, as are all the physics calculations. Subsurface scattering still doesn't exist in the world of real-time rendering, texture mapping still isn't up to the standard of ray tracers, demolition effects and structural breaking points still aren't really a thing (not the way they are in real animation), and try putting loads of light sources in a scene that then has to be rendered realistically. If you look in a mirror in a game, you don't even get a true reflection. Polygon meshes are not that detailed, and God forbid you want to add tissue properties or hair that isn't just a surface map. Whenever something comes out that makes a leap, everyone says we're so close to real-life graphics, until they see the next leap and realize how many problems the last one had.

 

Honestly, they haven't even achieved fully realistic effects in movies, and there they can use sophisticated ray tracers and composite everything together by hand for each scene. ILM has a render farm of over 5,000 CPUs, and even in that industry a single frame can take as much as 250 hours to render.
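To put those figures in perspective, here is a rough back-of-the-envelope calculation using the numbers quoted above (treating 250 hours as the worst-case render time for one frame on one node, which is an assumption, since the post doesn't say how the work is split):

```python
cpus = 5000              # size of the render farm quoted above
hours_per_frame = 250.0  # assumed worst-case render time for one frame on one node
fps = 24                 # standard film frame rate

# If every node renders one frame independently, the farm finishes
# `cpus` frames every `hours_per_frame` hours of wall-clock time.
frames_per_hour = cpus / hours_per_frame        # 20 finished frames per hour
film_seconds_per_hour = frames_per_hour / fps   # under one second of footage per hour

print(frames_per_hour, round(film_seconds_per_hour, 2))  # 20.0 0.83
```

Even under these generous assumptions, the whole farm produces less than a second of worst-case footage per hour, which is the gap the post is pointing at between offline and real-time rendering.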

 

And by the way, the gigahertz wars only ended because cooling became an issue that at some point couldn't be solved anymore, so they implemented multiple threads and cores to calculate more things at any given time, which is now preferred since there actually aren't that many strictly serial tasks to handle. And processing power continues to rise, since we need more powerful hardware to run things every year. You might say "But the CPU doesn't matter anymore, everything is run by the GPU anyway," which is true, but only for games. Complex calculation tasks are still heavily handled by the CPU, and render farms take advantage of that, since CPU architecture lends itself better to a software pipeline.
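The trade-off the post describes between serial work and extra cores is usually quantified with Amdahl's law; a minimal sketch:

```python
def amdahl_speedup(parallel_fraction: float, n_cores: int) -> float:
    """Upper bound on speedup when only `parallel_fraction` of the work
    can be spread across `n_cores`; the rest stays serial (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# A task that is 95% parallelizable tops out well below the core count:
print(round(amdahl_speedup(0.95, 8), 2))     # 5.93 (not 8x on 8 cores)
print(round(amdahl_speedup(0.95, 1000), 2))  # 19.63 (even 1000 cores cap out)
```

This is why adding cores was only "preferred" once workloads with small serial fractions (like shading pixels) became dominant.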


  • pdusen likes this

#139
I Am Robot
  • Members
  • 443 posts

PC, but I might opt for a controller because I think third-person shooters are generally better that way; their pedigree is pretty strongly linked to GeOW, so it makes sense. Maybe the RPG management aspects will make mouse and keyboard the better option, but I doubt they'll stray too far from ME 2 & 3 when it comes to the mix of gameplay styles.



#140
goishen
  • Members
  • 2,426 posts

What do you mean? Because it looks as good on the console as it does on PC? Well, SW Battlefront runs sub-1080p on consoles, with 900p on PS4 and 720p on the Xbone, though both push it up to 60 fps, which is good. I haven't seen a good graphics comparison between the two, so I don't know if the effects load is the same on PC as it is on PS4.

 

Putting that aside, it clearly shows that consoles are still not fully 1080p-capable. This in itself wouldn't be a big deal if game development stood still. The GPU wars will never end as long as there is advancement in the field of computer graphics.
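For reference, the resolutions quoted above work out to these pixel throughputs (standard 16:9 resolutions assumed):

```python
# Pixel counts implied by the resolutions discussed above (16:9 assumed).
resolutions = {
    "PS4 (900p)": (1600, 900),
    "Xbox One (720p)": (1280, 720),
    "Full HD (1080p)": (1920, 1080),
}

full_hd = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} px/frame ({px / full_hd:.0%} of 1080p)")

# At 60 fps, 900p means ~86.4 million pixels shaded per second,
# versus ~124.4 million for true 1080p60.
```

So 900p is only about 69% of the 1080p pixel count and 720p about 44%, which is the shortfall being argued about.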

 

That's hooey. 

 

The GPU wars will end when the last developer finally stands down and develops his game for consoles only.  Imagine, if you will, that you are a game developer.  Now imagine that you had a game that took five years to write.

 

Okay, now imagine that you had graphics cards coming out during all that time.  Four years into development, you had a totally game-changing card.  But you had a console that 1) didn't have changing hardware, 2) had a software kit that wasn't that easy to work with, but once you got it, you had it, and 3) had an easy profit motive.  I mean, I know which one I'd go for.  I'd start writing console games and wouldn't look back.  And this is coming from a lifelong PC player.  I don't mind if they port it over from consoles.  Just don't make it feel consolized.

 

Yeah, I'm sure Intel and AMD both closed up their doors when the gigahertz wars were over.  Yep, they're over.  Everybody go home.  What I'm trying to say is that the gains are becoming so incrementally small that they're almost negligible.  Unless either of them comes out with a game changer, the GPU wars are almost over.



#141
Novak
  • Members
  • 370 posts

That's hooey. 

 

The GPU wars will end when the last developer finally stands down and develops his game for consoles only.  Imagine, if you will, that you are a game developer.  Now imagine that you had a game that took five years to write.

 

Okay, now imagine that you had graphics cards coming out during all that time.  Four years into development, you had a totally game-changing card.  But you had a console that 1) didn't have changing hardware, 2) had a software kit that wasn't that easy to work with, but once you got it, you had it, and 3) had an easy profit motive.  I mean, I know which one I'd go for.  I'd start writing console games and wouldn't look back.  And this is coming from a lifelong PC player.  I don't mind if they port it over from consoles.  Just don't make it feel consolized.

 

Yeah, I'm sure Intel and AMD both closed up their doors when the gigahertz wars were over.  Yep, they're over.  Everybody go home.  What I'm trying to say is that the gains are becoming so incrementally small that they're almost negligible.  Unless either of them comes out with a game changer, the GPU wars are almost over.

 

The quest for more computing power still isn't over; it just shifted, since simply increasing clock cycles hit a brick wall. Oh yeah, because developers program their games specifically for a given set of hardware and not through the given API and drivers... oh wait, they do! GPU manufacturers take care of most of the problems you mentioned, and they're not going to stop; they have a specific interest in keeping that going. They don't make that much money designing a chip for the console market, as AMD's financial problems clearly show.



#142
goishen
  • Members
  • 2,426 posts

The quest for more computing power still isn't over; it just shifted, since simply increasing clock cycles hit a brick wall. Oh yeah, because developers program their games specifically for a given set of hardware and not through the given API and drivers... oh wait, they do! GPU manufacturers take care of most of the problems you mentioned, and they're not going to stop; they have a specific interest in keeping that going. They don't make that much money designing a chip for the console market, as AMD's financial problems clearly show.

 

 

I believe, as I'm not sure...   But I believe that all consoles are powered by AMD hardware. 

 

AMD knows the score.   NVidia is off in a race with itself.



#143
Novak
  • Members
  • 370 posts

I believe, as I'm not sure...   But I believe that all consoles are powered by AMD hardware. 

 

AMD knows the score.   NVidia is off in a race with itself.

 

Yes, they both are; the design of both consoles is based on the Jaguar chip for the CPU and the R9 series for the GPU. But both Nvidia and AMD basically only design their chips, which is why they don't really make that much money from consoles; the manufacturing is up to someone else. Only Intel still produces its own chips, which allowed them to stay ahead of the curve for years in terms of transistor size and gave them an edge over AMD. In the GPU department, Nvidia is ahead, but not really in a race with itself; they had a better, more user-friendly software package for a couple of years, so more people buy them. But in terms of raw power, they could trade punches all night long.



#144
goishen
  • Members
  • 2,426 posts

Yes, they both are; the design of both consoles is based on the Jaguar chip for the CPU and the R9 series for the GPU. But both Nvidia and AMD basically only design their chips, which is why they don't really make that much money from consoles; the manufacturing is up to someone else. Only Intel still produces its own chips, which allowed them to stay ahead of the curve for years in terms of transistor size and gave them an edge over AMD. In the GPU department, Nvidia is ahead, but not really in a race with itself; they had a better, more user-friendly software package for a couple of years, so more people buy them. But in terms of raw power, they could trade punches all night long.

 

 

So you're saying that ATI video cards could trade punches with Nvidia's cards all night long, just like AMD's CPUs could trade punches with Intel's CPUs all night long for the home market (read: not video editing or some other crazy nonsense).  I'll agree; for power users and heavy CPU-intensive operations, Intel is the clear and definite winner in that category.

 

I'm not seeing a clear connection between what you're saying and what I was saying.



#145
slimgrin
  • Members
  • 12,461 posts

Given that they're using Frostbite, PC all the way. Sadly, that also means little to no modding opportunities. 



#146
Novak
  • Members
  • 370 posts

So you're saying that ATI video cards could trade punches with Nvidia's cards all night long, just like AMD's CPUs could trade punches with Intel's CPUs all night long for the home market (read: not video editing or some other crazy nonsense).  I'll agree; for power users and heavy CPU-intensive operations, Intel is the clear and definite winner in that category.

 

I'm not seeing a clear connection between what you're saying and what I was saying.

 

No, AMD is way behind in the CPU market. They were ahead once (remember 64-bit? AMD made that happen), but they haven't invented much in the last couple of years, and Intel overtook them. Sadly, a lot of those benchmarks are not real; that's a difficult subject, and it would take forever to explain everything here, since it has to do with the instruction sets and calculation methods that get used when benchmarks run (they cheated in a very smart way).

Theoretically, yes, AMD and Intel CPUs could trade punches all night long, but as it stands right now, not so much. And AMD vs. Nvidia is basically the same story. But just because they can compete doesn't mean that advancement has halted. If you scroll up a bit, you can read why real-time rendering still has a long way to go before getting anywhere close to photorealism.

 

As for your argument about the game dev who's really only interested in developing the game for one set of hardware: I believe I addressed earlier that it isn't really his problem on the PC side, since others take care of that, and they're the ones making money off selling GPUs for PC, which earns a ton more than selling console chips. There are other interest groups here, not just game devs and console manufacturers. Again, the game dev doesn't care -> API/drivers.

 

EDIT: 

I don't know if you've ever programmed yourself, but if you have, then you know that even for consumer-grade products, CPU power upgrades are still needed.



#147
goishen
  • Members
  • 2,426 posts

No, AMD is way behind in the CPU market. They were ahead once (remember 64-bit? AMD made that happen), but they haven't invented much in the last couple of years, and Intel overtook them. Sadly, a lot of those benchmarks are not real; that's a difficult subject, and it would take forever to explain everything here, since it has to do with the instruction sets and calculation methods that get used when benchmarks run (they cheated in a very smart way).

Theoretically, yes, AMD and Intel CPUs could trade punches all night long, but as it stands right now, not so much. And AMD vs. Nvidia is basically the same story. But just because they can compete doesn't mean that advancement has halted. If you scroll up a bit, you can read why real-time rendering still has a long way to go before getting anywhere close to photorealism.

 

As for your argument about the game dev who's really only interested in developing the game for one set of hardware: I believe I addressed earlier that it isn't really his problem on the PC side, since others take care of that, and they're the ones making money off selling GPUs for PC, which earns a ton more than selling console chips. There are other interest groups here, not just game devs and console manufacturers. Again, the game dev doesn't care -> API/drivers.

 

EDIT: 

I don't know if you've ever programmed yourself, but if you have, then you know that even for consumer-grade products, CPU power upgrades are still needed.

 

 

Look, I'm not gonna argue with you any longer.  I've said my piece. 

 

TL;DR: Both Nvidia and AMD are arguing over things that are so small that the average consumer shouldn't care.



#148
rashie
  • Members
  • 910 posts

The GPU wars are almost over, much in the same way that the gigahertz wars were over.  I'm saying that the photorealistic look of a game like SW Battlefront can't prove that to you; nothing can.

Yeah no, this isn't true.

 

There's still a lot of room left for improvement in video game visuals. It isn't about polygon counts anymore, but things can still get much better with techniques like real-time ray tracing, which is still a major hardware hurdle.
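For context, the per-ray math behind ray-traced reflections is trivial; the hardware hurdle is tracing millions of such rays per frame, every frame. A minimal sketch of the mirror-reflection formula r = d - 2(d.n)n:

```python
def reflect(d, n):
    """Reflect direction d about a unit surface normal n: r = d - 2(d.n)n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

# A ray heading straight down at a floor (normal pointing up) bounces straight
# up, giving the "true reflection" that rasterizers can only approximate.
print(reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0)))  # (0.0, 1.0, 0.0)
```

Each reflected ray then has to be intersected against the whole scene, which is where the per-frame cost explodes.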



#149
Novak
  • Members
  • 370 posts

Yeah no, this isn't true.

 

There's still a lot of room left for improvement in video game visuals. It isn't about polygon counts anymore, but things can still get much better with techniques like real-time ray tracing, which is still a major hardware hurdle.

 

Even poly count still matters; have you ever looked at the fingers? They look like complete crap even in the newest games. (And I don't mean the ones you see in an FPS on the character's weapons.) I've ported enough objects from games and then retouched them with a high poly count to know what kind of difference that makes.



#150
straykat
  • Members
  • 9,196 posts

 

 

The more you like to tinker, hardware- and software-wise, the more you will like the PC. If not, go for a console.

 

I used to. It got old after 20 years. I'm lazy now. I don't even want to fix my damn doorknob. :P