Next gen question for Bioware developers
#1
Posted 23 August 2013 - 06:14
I'm not looking for a PS vs. MS argument, more for impressions from them on how they feel about the general quality of the new tech, and whether either console shows much of a quality difference. Is it truly next gen, or essentially just an upgrade over this gen? To me it looks a lot like the 720p vs. 1080p argument: both look good and one is a bit better, but the difference is negligible. It's not the leap we had between older generations (PSX/PS2, or VHS/DVD/Blu-ray).
#2
Guest_Aotearas_*
Posted 23 August 2013 - 06:35
As for the system specs, compared to the previous generation the new one is a serious step up (then again, what did you expect after five years?).
And I don't know about any developer comments from BioWare, sorry.
#3
Posted 23 August 2013 - 06:38
I have to imagine that the increase in available RAM is going to be a pretty big deal. Currently they only have 512MB to play with on the Xbox 360, which is pretty limiting (I believe the PS3 has the same amount).
They actually hinted that Mass Effect 3 MP already pushes the current consoles to the limit to the point where they couldn't handle much more than the 8 enemies at once that spawn.
#4
Posted 23 August 2013 - 07:19
#5
Posted 23 August 2013 - 08:18
I agree about the difference between the first 360 games and current releases. Even with Mass Effect you can see a real improvement over the course of the trilogy as developers got better at getting the most out of the systems. The RAM question still bothers me though, given the differences between the PS4 and Xbox One; I'm not that tech savvy, so the added eSRAM issue only confuses me more.
I guess I'll just have to wait and see in November, but thanks for the replies.
#6
Posted 24 August 2013 - 01:46
Cyonan wrote...
I think we're just starting to hit a cap on what our hardware can do for graphical quality. It's getting to the point where you need bigger and bigger increases in hardware just to get a smaller increase in graphical quality.
I have to imagine that the increase in available RAM is going to be a pretty big deal. Currently they only have 512MB to play with on the Xbox 360, which is pretty limiting(I believe PS3 has the same amount).
They actually hinted that Mass Effect 3 MP already pushes the current consoles to the limit to the point where they couldn't handle much more than the 8 enemies at once that spawn.
More or less, this.
We're hitting caps on a number of fronts...
First and foremost, we're hitting a cap on processors. There's a point where we cannot shrink the components of a CPU any further, and we're rapidly approaching it. Worse, each shrink makes it harder to increase speed: higher speed requires more power, and shrinking causes current leakage, which generates heat, which in turn limits speed.
That affects both CPUs and GPUs. We're hitting a point where it's increasingly difficult to gain more performance by shrinking the die and increasing clock speed. (There's a lot more to this, but the details would bore many.)
So we turned to parallelization and tuning efficiency. The problem is, you can only break a task into so many pieces before tasks either can't be divided any further, or the overhead of dividing outweighs the benefit. We still have some room here, but it's a problem.
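The parallelization ceiling described above is usually formalized as Amdahl's law: the serial fraction of a task puts a hard limit on how much extra cores can help. Here's a minimal sketch (the function name and the 95% figure are just illustrative assumptions, not from the post):

```python
# Amdahl's law: speedup from running the parallelizable fraction p
# of a task on n workers, while the rest stays serial.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Even with 95% of the work parallelizable, 16 cores give ~9.1x, not 16x.
print(round(amdahl_speedup(0.95, 16), 2))
```

This ignores the coordination overhead the post mentions, which in practice makes the real curve even flatter.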
Tuning is another problem area. It was great when optimizing our slowest case yielded a huge benefit, but we rapidly run out of easy targets and the returns start dropping. If we have an operation that takes 10 cycles to complete and we cut it to 5 cycles, we get a great benefit. Cutting it from 5 to 3 isn't as great.
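The arithmetic behind those diminishing returns is worth spelling out (a trivial sketch; the helper name is mine):

```python
# Relative speedup from cutting an operation's cycle count.
def speedup(cycles_before, cycles_after):
    return cycles_before / cycles_after

print(speedup(10, 5))  # first cut: 2.0x
print(speedup(5, 3))   # second, harder cut: ~1.67x
```

Each successive optimization is harder to find and buys a smaller multiple, which is exactly the "easy targets run out" point.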
Then we have a looming memory problem. The PS4 suffers from this: the ability to transfer vast amounts of data is pointless if it comes at the cost of high latency. The problem is only going to get worse, because as memory sizes grow, latency tends to grow with them. Without corresponding increases in bus speeds and access times, we will hit a point of diminishing returns. We have a last-ditch option of moving the memory onto the die, but after that we need new technology.
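A back-of-envelope sketch of why raw bandwidth doesn't rescue latency-bound access patterns. The numbers below are made up for illustration, not actual PS4 specs:

```python
# Total cost of one memory access = fixed latency + transfer time.
# 1 GB/s is roughly 1 byte/ns, so transfer time in ns is size/bandwidth.
def access_time_ns(size_bytes, latency_ns, bandwidth_gb_s):
    return latency_ns + size_bytes / bandwidth_gb_s

# A 64-byte cache line over a very wide bus: latency is nearly all of it.
print(round(access_time_ns(64, 100, 176), 2))  # ~100.36 ns
```

For large streaming transfers the bandwidth term dominates instead, which is why wide buses help textures far more than they help pointer-chasing game logic.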
Then we have a development problem. Being able to generate photorealistic graphics is great, but if it takes 8 months to create a barrel, it isn't usable. Being able to parallelize our code is great, except when the resulting code is so fragmented that it's unreadable and no one can figure out what it does.
On the console side, we're pretty much done here. Without some earth-shattering discovery, we can go no further. Heat and power are the problem: a console has to generate little enough heat that its tiny, poorly ventilated case isn't overwhelmed, and it has to plug into an average wall socket, so it cannot consume 1,000W to operate. That's why both consoles use a mid-range mobile CPU instead of a full-fledged desktop part.
On the PC side, we fare a little better. We can build a full-size, well-ventilated case and have no real restrictions on power. The PC can already operate in a variety of power modes depending on the application, and will be able to power cores on and off as needed. We could easily build a 16-core PC that sits in the basement next to the furnace and drives all the screens in the house in parallel, with an adaptive OS that alters its appearance based on the input device for each screen (think gamepad in the living room, mouse/keyboard in the den, touch in the bedroom/kitchen).
Regardless, we have hit a wall. Look at PC CPUs: a two-year-old Sandy Bridge is only about 5%-15% slower than a Haswell, usually closer to 5%. In gaming, that's a few FPS.
The next few years will be about convergence, not divergence, and since the new consoles are essentially low-end PCs, we now have "one platform to rule them all".
Over the next 5 years you'll see Sony/MS's message change from hardware to delivery.
#7
Posted 24 August 2013 - 05:42
#8
Posted 24 August 2013 - 11:07
Gatt9 wrote...
[snip]
That's not the real problem. Anything involving energy transfer means some loss; that's a fact of the laws of thermodynamics. The real problem is the laziness of software developers in making games that actually scale up to faster, more powerful hardware. Even today, many released games are written to run on two processor cores, even though many gaming systems out there have four cores or more. I could list them for you: though you may run them on a quad-core (for example), they will only use two cores. The Xbox 360 uses 3 cores and the PS3 uses 6 (technically 7, but I won't get into that), so shame on the PC devs.
Having said that, the reason gaming consoles work as well as they do is that the money is made developing games for them; the problem is they lack RAM, so they tire out rather quickly. It wasn't until Battlefield 3 and Guild Wars 2 that PC devs ventured into using more CPU processing power (quad-cores).
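For what it's worth, spreading work across every available core is straightforward with standard tooling; here's a minimal sketch using Python's standard library (the `work` function is a stand-in for real per-frame or per-chunk game work):

```python
import os
from multiprocessing import Pool

# Stand-in for CPU-bound work a game might shard across cores.
def work(n):
    return n * n

if __name__ == "__main__":
    # One worker per logical core, instead of leaving cores idle.
    with Pool(os.cpu_count()) as pool:
        results = pool.map(work, range(8))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The hard part, as the earlier post notes, isn't spawning workers; it's carving game logic into pieces that are actually independent.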