If you liked the ending can you please explain why


371 replies to this topic

#301
Reorte
  • Members
  • 6,595 posts

o Ventus wrote...

dreamgazer wrote...

Because you would know in advance that the scenario would kill your LI, and you could plan accordingly by forcing them to stay away from Shepard in those final moments? That's pretty darn contrived, especially for the conclusion of the series---at the very least on following play-throughs.

It gets worse when you know for a fact that Shepard will, in fact, survive.


"You" wouldn't know anything without meta gaming. 

And that still isn't contrived.

Indeed, by that argument anything that can be affected by decisions made during the game is contrived, and since you can metagame just about anything, that's not good.

I don't feel that following play-throughs should be used as an excuse to cripple the first one.

#302
dreamgazer
  • Members
  • 15,742 posts
I take it you guys would never take your loyal, capable LI on the beam run (if they were on the squad)?

#303
Applepie_Svk
  • Members
  • 5,469 posts

dreamgazer wrote...

I take it you guys would never take your loyal, capable LI on the beam run (if they were on the squad)?


I would take Walters and Hudson instead, with low EMS :3

#304
AlanC9
  • Members
  • 35,649 posts

dreamgazer wrote...

I take it you guys would never take your loyal, capable LI on the beam run (if they were on the squad)?


First playthrough? I would have, since I wouldn't have thought that Bio would do such a thing, assuming I had a decent playthrough.

Metagaming, yep. That's why I wouldn't have had a problem with the squadmates getting killed in all EMS states. Better still, let it be random.

#305
dreamgazer
  • Members
  • 15,742 posts

AlanC9 wrote...

dreamgazer wrote...

I take it you guys would never take your loyal, capable LI on the beam run (if they were on the squad)?


First playthrough? I would have, since I wouldn't have thought that Bio would do such a thing, assuming I had a decent playthrough.

Metagaming, yep. That's why I wouldn't have had a problem with the squadmates getting killed in all EMS states. Better still, let it be random.


If the situation fits, I can definitely get behind that.

#306
AlanC9
  • Members
  • 35,649 posts

MassivelyEffective0730 wrote...

And you probably were distorting our position. ;)

Above everything, I want it to be narratively consistent, both on its own and with the rest of the trilogy in mind. As I said earlier, I have themes and a narrative that I'm writing to make it fit those criteria, and have Shepard live.


Hey, sometimes it's just me not understanding the position. But sometimes, yeah, it's rhetoric.

#307
dreamgazer
  • Members
  • 15,742 posts

Reorte wrote...

o Ventus wrote...

dreamgazer wrote...

Because you would know in advance that the scenario would kill your LI, and you could plan accordingly by forcing them to stay away from Shepard in those final moments? That's pretty darn contrived, especially for the conclusion of the series---at the very least on following play-throughs.

It gets worse when you know for a fact that Shepard will, in fact, survive.


"You" wouldn't know anything without meta gaming.

And that still isn't contrived.

Indeed, by that argument anything that can be affected by decisions made during the game is contrived, and since you can metagame just about anything, that's not good.

I don't feel that following play-throughs should be used as an excuse to cripple the first one.


I agree, in a way, but think about what you'd do in the first run and see how it meshes with what's being asserted here. The "good" scenario would involve taking characters with you that you don't mind losing, and leaving your LI---more often than not, a capable squad member---at the base in what could be Shepard's final moments.

That rings very false.

#308
AresKeith
  • Members
  • 34,128 posts

dreamgazer wrote...

I take it you guys would never take your loyal, capable LI on the beam run (if they were on the squad)?


What if they handled it like they did Cortez?

#309
Guest_Guest12345_*
  • Guests
Yeah, I don't have any ethical hang-ups about the endings. I also didn't have any ethical hang-ups about all the other choices in the series. I committed genocide against the Rachni and I brainwashed a bunch of synthetic lifeforms. So why would I suddenly get my panties in a bunch about Synthesizing the galaxy or any of the other ending choices?

Frankly, I think all 3 endings are quite good. Refuse is the only ending I refuse, because it is such a complete failure.

Forcibly evolving people with Synthesis? Hell, if I could do this right now, today, for real people, I would. We desperately need to evolve; we've got some real Neanderthals holding us back.

As for Control, this is great because it means Shep becomes a synthetic space-god. Which means everyone lives in peace and society progresses without the fear of synthetic vs. organic war, as Shep will put down any uprisings with overwhelming Reaper force before they become a problem.

And Destroy, the classic destroy-the-Reapers ending. I don't like killing EDI; she is one of the best characters in the series. But seeing the Reapers tip over and die is a very gratifying sequence.

Frankly, all 3 endings are emotionally gratifying, and that is all I care about. I got mine, Jack; good luck gettin' yours.

#310
DDK
  • Members
  • 352 posts

Mastone wrote...

But stating that this ending was a logical outcome of all three installments is just not true; in fact, in hindsight the three installments look more like three alternative ME stories with three blokes called Shepard than a sequential storyline.


Then I guess I'm just psychic since my predictions, pretty much every single one of them from the beginning of the series, came true.

#311
dreamgazer
  • Members
  • 15,742 posts

Malisin wrote...

Mastone wrote...

But stating that this ending was a logical outcome of all three installments is just not true; in fact, in hindsight the three installments look more like three alternative ME stories with three blokes called Shepard than a sequential storyline.


Then I guess I'm just psychic since my predictions, pretty much every single one of them from the beginning of the series, came true.


What are Tuesday's winning lottery numbers?

#312
KaiserShep
  • Members
  • 23,829 posts

scyphozoa wrote...

Forcibly evolving people with Synthesis? Hell, if I could do this right now, today, for real people, I would. We desperately need to evolve; we've got some real Neanderthals holding us back.


Applied to real life, it would mean the end of just about everything that makes us individuals, the death of diversity of all sorts. Beneath its idyllic, hokey exterior, such a fate would be akin to extinction, but rather than simply killing everything off, it would keep what remains of organic life on permanent life support.

#313
AlanC9
  • Members
  • 35,649 posts

KaiserShep wrote...


Applied to real life, it would mean the end of just about everything that makes us individuals, the death of diversity of all sorts. Beneath its idyllic, hokey exterior, such a fate would be akin to extinction, but rather than simply killing everything off, it would keep what remains of organic life on permanent life support.


As written, this doesn't make much sense. How do you figure these things to be true?

#314
AlanC9
  • Members
  • 35,649 posts

scyphozoa wrote...

Yeah, I don't have any ethical hang-ups about the endings. I also didn't have any ethical hang-ups about all the other choices in the series. I committed genocide against the Rachni and I brainwashed a bunch of synthetic lifeforms. So why would I suddenly get my panties in a bunch about Synthesizing the galaxy or any of the other ending choices?


I remember someone arguing that the worst thing about ME3 is that players like you aren't punished.

#315
KingZayd
  • Members
  • 5,344 posts

dreamgazer wrote...

I take it you guys would never take your loyal, capable LI on the beam run (if they were on the squad)?


I did. I'm a bit new to the role-playing thing, but I find it makes games far more enjoyable than trying to rig the system. When I thought they were in danger, I considered not taking my LI and best friend, but then I decided that when the fate of the galaxy depended on this mission, their lives were relatively unimportant.

#316
KingZayd
  • Members
  • 5,344 posts

dreamgazer wrote...

MassivelyEffective0730 wrote...

But I want a happy ending for Shepard. I want to see him and his LI, and his crew, and the Normandy alive, and surviving. Hell, I don't care if some of the squad were forced deaths. Make the people you take to the beam with you die. 


Do you really want the game so contrived that you can know better than to take your LI on the beam run with you, just so they survive?


I don't think sensible storytelling needs to account for metagaming. If you know how the story goes, of course you're going to be able to mess with it. Personally, I don't care about whether the endings are happy or not, so long as they make sense.

When Harby shot Shepard, I felt as if I'd be happy with that ending if indeed it was one. I understood why people might not like it, but it was enough for me.
When Shepard and Anderson were talking and "watching the show", I felt as if that ending was okay. I understood why people might not like it, but it didn't ruin the game for me.
When Shepard passed out after the Crucible didn't fire, I felt as if that ending was decent. I understood why people wouldn't like it, but I was still happy.
When the Starchild appeared and spoke, I knew why people hated the ending and I agreed.

#317
KaiserShep
  • Members
  • 23,829 posts

AlanC9 wrote...

KaiserShep wrote...


Applied to real life, it would mean the end of just about everything that makes us individuals, the death of diversity of all sorts. Beneath its idyllic, hokey exterior, such a fate would be akin to extinction, but rather than simply killing everything off, it would keep what remains of organic life on permanent life support.


As written, this doesn't make much sense. How do you figure these things to be true?


Really? Individuality is a major part of what drives conflict, as well as what gives us unique perspectives. There is absolutely no way to remove conflict from humanity without eradicating our individuality. The concept of something like a geth consensus works for sci-fi, but it in no way applies well to sentient life as we know it. It's essentially an attempt to create a kind of heaven on earth. Problem is, one person's heaven is very probably another person's hell. How do you reconcile this without getting rid of pesky things like difference of opinion? The ability to opt out would instantly undermine any utopia anyone tries to construct. Perpetual peace will never exist so long as life does.

Edited by KaiserShep, 05 May 2013 - 11:57.


#318
TheRealJayDee
  • Members
  • 2,950 posts

dreamgazer wrote...

I take it you guys would never take your loyal, capable LI on the beam run (if they were on the squad)?


Never? I wouldn't say that; some of my Shepards likely would. My main Shepard did leave his LI (Ash) and his old and best friends (Garrus & Tali, Liara) behind, though, and took Javik and James instead.

I still think it's strange that you leave anyone behind in the first place, but that's a different topic...

#319
AlanC9
  • Members
  • 35,649 posts

KaiserShep wrote...

Really? Individuality is a major part of what drives conflict, as well as what gives us unique perspectives. There is absolutely no way to remove conflict from humanity without eradicating our individuality. The concept of something like a geth consensus works for sci-fi, but it in no way applies well to sentient life as we know it. It's essentially an attempt to create a kind of heaven on earth. Problem is, one person's heaven is very probably another person's hell. How do you reconcile this without getting rid of pesky things like difference of opinion? The ability to opt out would instantly undermine any utopia anyone tries to construct. Perpetual peace will never exist so long as life does.


Sure. But what that means is that Synthesis wouldn't be a utopia... even if there was such a thing.

#320
Morlath
  • Members
  • 579 posts
Game World:

- Stupendously powerful AI construct (Catalyst) is created with one goal and no constraints in achieving it: "Preserve life from the uprising of synthetics."

- Catalyst comes to the conclusion that "transcendence" of organic life is the only option. So it goes about creating a way to capture the essence of a species into a techno-organic (synthesis) creature for preservation.

- Catalyst comes up with the idea of evolution "cycles". At some point (either before or after this) the Catalyst is placed within the Crucible. It can be inferred/can be assumed that the Catalyst is able to control the Keepers.

- Throughout the many cycles the Catalyst comes to believe that its solution is valid but its method is flawed. Attempts to artificially lift organics via synthesis into a higher life form fail.

- It can be assumed that the Reapers/Catalyst become aware of the Crucible design. The Reapers believe that they can stop such a machine being built. It can be inferred that the Catalyst uses the indoctrinated information learned about the Crucible to begin a new solution.

- The Catalyst, using the Keepers (inferred), designs a rudimentary system it can explain to "lesser" organic species. The Catalyst has come to the conclusion that while the Reaper/cycle solution works, it is not an optimal one. It comes up with three new potential solutions: its synthetic thought processes and control of the Reapers are the problem, and a new, more organic "thought matrix" is required to take control; it has potentially failed at the goal its creators gave it; or, by combining the primary Mass Relay and all its energy and focusing that energy through the Crucible, it can perform a galaxy-wide synthesis of all life and so complete its programming.

- The Catalyst waits for an organic "worthy" enough to be given the ability to choose.

Shepard:

- Is involved in a galaxy-wide war with untold amounts of casualties.

- Is the front-line "hero" and is being thrown into one bad situation after another.

- The likelihood of survival is slim-to-none and requires a number of possibilities to come together in order for him to "get back home".

- Emotionally, Shepard deserves to live a good life once the war is over. Emotionally, he deserves to see his love interest at least one more time. Intellectually this probably isn't likely and does not happen.

#321
KaiserShep
  • Members
  • 23,829 posts

AlanC9 wrote...

KaiserShep wrote...

Really? Individuality is a major part of what drives conflict, as well as what gives us unique perspectives. There is absolutely no way to remove conflict from humanity without eradicating our individuality. The concept of something like a geth consensus works for sci-fi, but it in no way applies well to sentient life as we know it. It's essentially an attempt to create a kind of heaven on earth. Problem is, one person's heaven is very probably another person's hell. How do you reconcile this without getting rid of pesky things like difference of opinion? The ability to opt out would instantly undermine any utopia anyone tries to construct. Perpetual peace will never exist so long as life does.


Sure. But what that means is that Synthesis wouldn't be a utopia... even if there was such a thing.


Then basically it offers nothing but a weird melding of organic life to synthetic. If anything, this makes a better case for Control, since it would permit organic life to learn to accept synthetics the way our protagonist did throughout the entire series.

#322
Iakus
  • Members
  • 30,309 posts

KaiserShep wrote...

AlanC9 wrote...

KaiserShep wrote...


Applied to real life, it would mean the end of just about everything that makes us individuals, the death of diversity of all sorts. Beneath its idyllic, hokey exterior, such a fate would be akin to extinction, but rather than simply killing everything off, it would keep what remains of organic life on permanent life support.


As written, this doesn't make much sense. How do you figure these things to be true?


Really? Individuality is a major part of what drives conflict, as well as what gives us unique perspectives. There is absolutely no way to remove conflict from humanity without eradicating our individuality. The concept of something like a geth consensus works for sci-fi, but it in no way applies well to sentient life as we know it. It's essentially an attempt to create a kind of heaven on earth. Problem is, one person's heaven is very probably another person's hell. How do you reconcile this without getting rid of pesky things like difference of opinion? The ability to opt out would instantly undermine any utopia anyone tries to construct. Perpetual peace will never exist so long as life does.


Personally, I hardly see how imposing the Qun on the entire galaxy is a "good" ending <_<

#323
KaiserShep
  • Members
  • 23,829 posts

Morlath wrote...
- Emotionally, Shepard deserves to live a good life once the war is over. Emotionally, he deserves to see his love interest at least one more time. Intellectually this probably isn't likely and does not happen.


Ironically, the same should've been true of the suicide mission, yet here we are. The funny thing is that Shepard is right out in the open under some kind of ME field, yet no one sees him, despite this being the most important device in the galaxy. Intellectually, a shuttle could land right there with no problem.

#324
Morlath
  • Members
  • 579 posts

KaiserShep wrote...

Morlath wrote...
- Emotionally, Shepard deserves to live a good life once the war is over. Emotionally, he deserves to see his love interest at least one more time. Intellectually this probably isn't likely and does not happen.


Ironically, the same should've been true of the suicide mission, yet here we are. The funny thing is that Shepard is right out in the open under some kind of ME field, yet no one sees him, despite this being the most important device in the galaxy. Intellectually, a shuttle could land right there with no problem.


Don't forget, it's very possible for Shepard to die at the end of ME2. However, because the story is a trilogy, ME3 proceeds on the assumption that enough was done that he, at the very least, survived.

If you're talking about the last part once Harbinger fires, I've never understood why people are up in arms over the "They're all dead" line. How many times does something horrific happen (a bomb explodes, a building collapses), people's initial thought is that no one could have survived it, and yet people do? Shepard and Anderson are knocked down and are out long enough for this initial reaction to happen.

#325
Auld Wulf
  • Members
  • 1,284 posts
@KaiserShep

That's all that you understand, anyway. If you lack the ability to understand, then your options are limited, and that's how you become part of the problem. Instead, you could have greater understanding, and then you are part of the solution. This is the key to our future and the Singularity as a whole: Understanding. Wisdom is simply the ability to grasp greater understanding, to become less small in the grand scheme of things by assimilating alternate perspectives into yourself but separating them from the gestalt of your being. You are you, and you understand. Right now, you are you, but you do not understand.

Not many people are very good at ethics and philosophy, they do not understand. They also do not understand other people, they cannot empathise, they cannot grasp the hopes and dreams of another individual. They see other people as a threat, because they do not understand, and this creates conflict. So long as everything is a threat, you will have conflict, conflict driven by primal, animal fears which are in direct contradiction of the intellectuality which makes us human. If we are driven by primal fear, we are small animals, and nothing more.

Intellectuality leads to understanding, and understanding leads to wisdom. And of wisdom, I have plenty. I look at the world around me and I realise that when one person understands another, the barriers between them diminish, and the fear diminishes as well; thus conflict becomes a non-issue. Instead, where once you would have had people tearing at each other's eyes, now you have them collaborating on grand works of art or science, because they understand each other, they admire each other, they respect each other.

Collaboration is the opposite of conflict, and it's collaboration through intellect that's brought us to where we are -- the apex race of our planet. Conflict is old hat, it's something evolution did to get us this far, but now we have to take things into our own hands. It's up to us to stop being small animals, to stop falling prey to the animal fears. To be human is to actually use the ability to think and perceive on levels far beyond other creatures.

Any creature can feel, but it is when feeling is met with intellect that we gain empathy and ethics. When we feel for another, or other groups of people, we turn what once would have been conflict into collaboration. And this is our strength, as human beings. Not as base, mindless barbaric drones who tear at each other's flesh, but as thinking, clever, self-modifying things. To a degree, we are the closest thing to a perfect machine that nature has created, and yet due to our lack of understanding, we are still so flawed.

So we must then create something even more perfect than us so that we might all understand, something that can share understanding with everyone, and put people on equal footing. There have been many examples of this in television, literature, films, and games. But let me use games, as I'm sure they are more common examples. In Fallout Tactics, one merges with the machine in order to bring peace to the wastes. In Deus Ex, one merges with the machine in order to create Helios, to network all of humankind. In Mass Effect 3, one merges with the machine in order to create Synthesis.

The want is there. There is wisdom in the desire. It's subconscious and close to the surface for many humans, but you have to be aware of it. You have to understand it. Conflict is not the way forward, as conflict leads only to extinction. We fear that which takes us away from nature because we don't want to understand; not understanding allows us to be comfortable, small animals, obeying our primal fears. Being good little rabbits and wolves, just being a part of the cycle of nature.

But already we augment ourselves. Whenever you put on a pair of spectacles you're doing something that doesn't exist within the animal kingdom. When you use an iPhone to twitter, you're networking on a level that cannot exist within nature. This is the way forward, this is what we're all progressing toward, and this is what Synthesis is ultimately the symbolism of. Open your mind to this knowledge, understand. Synthesis isn't to be taken literally, Synthesis is symbolism. It's an idealistic crystal ball of what could be.

So it's up to us to decide. Do we kill each other off, or do we put all of our efforts into creating something more perfect than us, something that could help us toward perfection, put all of humanity on equal footing, and help all of us understand each other? Which is the more worthy goal? Ultimately, conflict leads to nothing more than a barren earth, as weapons become more dangerous, war becomes more frivolous, and reasons to fight become more common as we fall back on our animal fear.

I think, at the moment, we stand on something of a precipice. I hope we choose wisely.