
"All Were Thematically Revolting". My Lit Professor's take on the Endings. (UPDATED)


5,087 replies to this topic

#2801
Seijin8
  • Members
  • 339 posts
Okay, abstraction (while not bad) is making the points murky. Agreed. Partly I think I am sounding this out for myself for the first time and finding the implications (beyond the horrendous homogenization) to be really fundamentally disturbing. I will grab a few things from your post to act as counterpoints so as to get real specific:

"I have no gorram idea which benefit he's talking about... [snip]... and it absolutely does not matter."

For me (and this may reflect my own conceptual stumbling block), it absolutely *does* matter. Without that middle part, the equation doesn't work for me. Synthesis = No more creator vs. created genocide is on offer, but I cannot see how that equation works without stripping away something (possibly everything) essential.

Let me aim at a specific problem I see, and this is as much with the narrative presentation as with the concept itself: The solution's basis is individual and small-scale, while the problem being "solved" is both fundamental and civilization-scale.

A single person or small group cannot commit genocide. Civilizations can. Civilizations are a grinding Darwinistic enterprise that will run over the lesser entities in their way, either absorbing or eliminating them. This is a side-effect of the social dynamics that push them forward. If synthesis indeed doesn't change the fundamentals of who we (the characters) are, then genocide hasn't been averted, just modified. The means have changed; the buildup to war will allow for more diplomatic options before genocide eventuates. Perhaps genocide is no longer necessary - a weakness that has been eliminated. The geth "rewrite" supplants the dreadnought with Thanix cannons.

For me, the synthesis solution doesn't solve anything *unless* it seriously alters the nature and essential character of every living thing in the universe, and by definition that doesn't leave people as "still us, just with a few new minor superpowers." It leaves superpowered civilizations a new host of ways to harm one another.

To put it as simply as possible: I can accept that some means of ending the Catalyst's feared superwar must exist. I cannot perceive a way that synthesis accomplishes it without erasing the fundamental attributes that define human existence.

#2802
Seijin8
  • Members
  • 339 posts
@Ieldra2: "Just because our knowledge is too limited to imagine what those challenges might be doesn't mean they don't exist."

Okay, and I understand your viewpoint. I do not doubt that many great challenges float beyond human understanding. But it won't be humans that meet those challenges. Synthesis is not a tweak or a gentle nudge toward greatness and understanding. To escape the realities of biological, competitive, evolution-dominated existence requires a kick in the brain stem that puts us into orbit.

And then, we won't be human (or quarian or krogan) anymore.

I can't believe that such a fundamental change leaves my squadmates or the rest of the organics out there as anything more than vague resemblances of their previous selves. I argue that it cannot be simultaneously as great and uplifting as it seems while leaving people's personalities and souls unchanged.

#2803
CulturalGeekGirl
  • Members
  • 3,280 posts
And also, what I've listed above can be used not just as a support for synthesis, but as a reinforcement of why the endings are so terrible as far as I'm concerned. (sorry Ieldra2 ^_^)

In the end, no matter how you slice it, whether or not the Crucible is Reaper designed, whether or not the Catalyst is lying, whether or not transhumanism is desirable... the Reapers are stupid. Mind-bendingly stupid. Ridiculously, incoherently, unfathomably stupid... and you're losing to them.

It doesn't matter what choice they favor. If they favor any choice and can't convince you to take it, whether by lying about what the buttons do or by understanding human nature enough to know what might make a compelling argument (Note: "Hello! All this is my fault, now listen to me" is not the answer.), they are stupid.

If they designed the Crucible they're stupid, because they made the Destroy bit the easiest to build unless you happen to have a Reaper brain, in which case Control is the easiest to build.

If they didn't design the Crucible they're stupid, because they let out a bunch of information about themselves and the Citadel, with absolutely no gorram idea what this would lead to.

If IT (indoctrination theory) is true, they're stupid and really bad at indoctrinating people.

No matter what, they're incompetent, socially dysfunctional, petulant screw-ups... and we're losing to them.

#2804
Seijin8
  • Members
  • 339 posts
Yeah, it was easier to be afraid of them before they tried to explain everything. I don't recall if it was on this thread or elsewhere, but I outlined what I thought was a plausible rationale for the Reapers' apparent activities and decision-making/priorities during Mass Effect.

There *are* plausible reasons for them to behave the way they have. But it reflects two key attributes: a human level of logic and limited warfighting resources, two things both Sovereign and Harbinger implied were not restrictions they possessed.

And if they were lying... well, why trust their creator?

So, yeah, agreed on the Reapers having to be dumb.

#2805
Ieldra
  • Members
  • 25,190 posts
@Seijin8:
(in answer to your post three slots above this one)
Ah...I see your point. That's why, in my thread A different ascension, I have posited that Synthesis provides "the tools for self-improvement", without enforcing any particular direction beyond enabling communication and collective processing power in a way similar to what the geth already have, with some kind of mental networking, and giving synthetics an empathy-analogue. I see this echoed in CGG's posts as well as in Siduri's Unofficial Epilogue Slides, so I guess it's an aspect that occurs to many of us.

That won't make conflict impossible, it just makes reasonably sure (total certainty can't be had) that eventual conflict - which can't be avoided because it's just the nature of life to come into conflict with other life - will not end in extinction.

But yes, I concede that even though individuals start out very much themselves, ultimately, all that *will* make us other than we were. But that's where we're headed anyway, and realizing the potential is still determined by choice and circumstances. I'm just speeding the process up.

Edited by Ieldra2, 30 May 2012 - 10:07.


#2806
Ieldra
  • Members
  • 25,190 posts
@CulturalGeekGirl:
Perhaps I should clarify what I like and do not like about the endings: I like the final choice and its primary effects - destroy or control the Reapers or effect Synthesis - or rather I like the underlying idea of those options. That's about it. The presentation is abysmal, the descriptions of the problem and of Control and Synthesis are insulting to players' intelligence, there isn't enough build-up (most specifically for Synthesis), the side effects are soul-crushingly depressing, and roleplaying is completely done away with.

So while the actual writing is abysmal, thematically I don't have big issues with the endings that couldn't be repaired with better execution - i.e., the EC. The endings appear "thematically revolting" because of sloppy writing and exposition simplified to the point where it ceases to make sense.

#2807
CulturalGeekGirl
  • Members
  • 3,280 posts

Seijin8 wrote...
Let me aim at a specific problem I see, and this is as much with the narrative presentation as with the concept itself: The solution's basis is individual and small-scale, while the problem being "solved" is both fundamental and civilization-scale.

A single person or small group cannot commit genocide.


First, let me stop you right here. In this game, Shepard (a single person) can commit genocide. Shepard has not one, but several, opportunities to do so. Any premise that is entirely predicated upon the idea that a single person cannot commit genocide is invalid in the Mass Effect universe. Still, I'm going to press on for now.

Seijin8 wrote...
Civilizations can. Civilizations are a grinding Darwinistic enterprise that will run over the lesser entities in their way, either absorbing or eliminating them. This is a side-effect of the social dynamics that push them forward. If synthesis indeed doesn't change the fundamentals of who we (the characters) are, then genocide hasn't been averted, just modified. The means have changed; the buildup to war will allow for more diplomatic options before genocide eventuates. Perhaps genocide is no longer necessary - a weakness that has been eliminated. The geth "rewrite" supplants the dreadnought with Thanix cannons.

For me, the synthesis solution doesn't solve anything *unless* it seriously alters the nature and essential character of every living thing in the universe, and by definition that doesn't leave people as "still us, just with a few new minor superpowers." It leaves superpowered civilizations a new host of ways to harm one another.

To put it as simply as possible: I can accept that some means of ending the Catalyst's feared superwar must exist. I cannot perceive a way that synthesis accomplishes it without erasing the fundamental attributes that define human existence.


I could literally list thousands of ways that the Catalyst could think that synthesis could solve the problem of organic competitiveness, the vast majority of which don't change human existence all that much. I'm going to try to go with an even five for now.

1. Backups. Right now, you kill a synthetic, maybe they have a backup. As data, they can be stored plenty of ways. Say we can do that with humans now - that we have the ability to back up brainstamps and download them into a cloned body if a person is killed. Does this eliminate humanity? Not really - backups can still be destroyed or malfunction. It's the functional equivalent of modern medicine - it decreases the chance of involuntary or accidental death, but doesn't confer true immortality. Did boosting the average lifespan from 28 to 70 make us not human anymore? Not really. See: Down and Out in the Magic Kingdom.

2. Sturdier, interchangeable frames with the option of eliminating organic components - digitized organic consciousnesses piloting synthetic bodies. Does that make us not human? I don't know; I think Major Kusanagi is a pretty nice lady, more human than a lot of folks I see every day. See: Ghost in the Shell: Stand Alone Complex.

3. Less limited processing power - research shows we may be getting close to reaching the limits of the human brain. Say everyone has the ability to remember 200 books' worth of facts. What if the next scientific advance requires 500 books' worth of knowledge in one brain? Will a few skillsofts render us not human anymore? Hardly. See: The Matrix, Shadowrun, etc.

4. Portability. Organics are limited in how they can travel, whereas Synthetics can travel at the speed of data. Will constant access to near-instantaneous travel give us a significant survival advantage? Sure. Will it make us less human? Not really. See: Star Trek

5. Fewer snap judgements based on perceived inherent differences. If this makes us less human, I'm willing to let this one go. See: The Sneetches and Other Stories, by Dr. Seuss.

We don't even need to go all the way to an ascension. I mean, sure, eventually a full-on transhumanism for those who want it is fine, but small changes that improve our survivability? Authors have been writing stories of humans staying human with that level of enhancement forever.

Now would you consider any or all of these specific listed changes genocide?

Do you consider giving everyone slight superpowers, with a door that allows them to choose true ascension if they want it, to be genocide?

#2808
Seijin8
  • Members
  • 339 posts
Thank you for the links, Ieldra2. I popped through the epilogue slides (which I had done for Destroy, but never for Synthesis, thanks for reminding me about them), and will work my way through your threads as the week goes on.

Gradual transhumanism is fine with me. As you say, that's where we are headed anyway. It's the abrupt stuff that scares the hell out of me. Each generation successively reaching beyond the last while maintaining a sense of connected unity, I get, and that wouldn't be so bad in the (multi-generational) short-term.

I remain unconvinced of its long-term efficacy, and since by its definition it exceeds our comprehension, it'll have to stay that way, I suppose. The Catalyst - a being with millions of years of experience - believes this is the solution, but offers no proof. After all, if it had been done before, things would have been different. I find the notion of placing my trust in the primogenitor of galactic xenocide problematic. Going with a solution that amounts to a hunch he has had, or a moment of clarity (Reaper brainfart) that the Crucible provided... also problematic.

It will have to remain an article of faith. I thank you for sharing your viewpoint with me and the other regulars here, and I look forward to reading more of your posts.

#2809
CulturalGeekGirl
  • Members
  • 3,280 posts

Seijin8 wrote...

It will have to remain an article of faith. I thank you for sharing your viewpoint with me and the other regulars here, and I look forward to reading more of your posts.


OK, I'm not sure if I've solved this or if it's just so late that I think that I've solved it.

This is where we disagree about Synthesis and transhumanism and perhaps the endings as a whole.

For me, synthesis isn't an article of faith. It's an article of admitting that we have absolutely no idea whatsoever. It's not about believing you know what's happening or what's going to happen, it's about letting go of the need for faith or certainty.

I don't have faith that it'll work out. It's potentially a disaster. Oh no, I plugged in the overlord, etc. But I can embrace it philosophically because the whole of the possibility space for synthesis is vaster and more unknown than any of the other possibility spaces.

When confronted with possibility R, my thoughts were: "this will probably be bad."
When confronted with B, my thoughts were: "this sounds even worse"
When confronted with G, my thoughts were "I have absolutely  no way to assess what will happen after this at all. There is literally no way for me to predict whether it will be good or bad.  There is no rational way for me to form anything remotely resembling an opinion about this one... I'll take it." 

Not an article of faith.

An article of uncertainty.

Edited by CulturalGeekGirl, 30 May 2012 - 10:53.


#2810
Seijin8
  • Members
  • 339 posts
@CGG: Oooooh, okay, it's on now! Haha.

Regarding point zero about single-person genocide: You are absolutely right. Within ME, the opportunity has existed multiple times (even if later events somewhat trampled the concept, as with the Rachni queen). +1 to you, and that one left a big scar in my overall argument.

However... if these points are meant to reduce competitiveness... well... tools aren't inherently moral, it is their use that matters. These five things are tools.

1. This will have two side-effects. First, violence and killing will likely become a more casual affair, since it isn't for keeps anymore. Until it is. Second, military strategy would target the "resurrection centers" where the backups were stored, and the net social/emotional effect would be far more devastating than we short-lived mortals can comprehend. See: Battlestar Galactica reboot.

2. Better war machines, functionally not dissimilar to the transition of bronze age weaponry to the modern tanks, bombers and assorted other military toys. As everyone would have them, the balance of power wouldn't shift. See: Human history (written by those who adapted fastest).

3. Little effect on balance of power. Remember, the basis you are using is changes that are minimal and don't erase our humanity. Aggression as a negotiation mechanic in social hierarchical structures and as a means of resource competition remains, however much it is altered to fit societal norms.

4. We already have this in the Mass Effect universe. Its viability is measured in how many dreadnoughts a species can bring to bear. Nuff said.

5. I would argue that it absolutely does make us less human... from a point of view. On the flipside, it allows more humane practices to arise. In my mind, humanity is a balance of biological factors (evolutionary and resource-driven) and our higher altruistic thought processes (able to comprehend the value of differing viewpoints, strive toward understanding and peace). Various cultures have accepted different balance points at different times in their history, so this would inarguably shift the balance farther toward altruism, but still (to my thinking) maintain the fundamentals of humanity. But it is a balance, not an absolute.

As far as people having slight superpowers in any way curbing violent/competitive tendencies, see: every superhero film of the last decade and most of the thousands of comic issues they were based on.

(Gotta go, be back later to continue.)

#2811
CulturalGeekGirl
  • Members
  • 3,280 posts
The point isn't to reduce competitiveness or war-likeness. Bah, I don't know how I'm failing to convey this!

The entire point of synthesis is to make it so that robots don't have fundamental strategic benefits over us that would result in them inevitably winning any conflict we have.

That is it. Full stop. No reducing competitiveness. No reducing warlikeness. No reducing conflict or killing. No reducing striving.

The only stated aim of synthesis is reducing the odds of complete, total, and irrevocable extinction of non-robots in the case of a robot war, through introduction of technology that may improve the survivability of non-synthetics.

I'm saying it is possible to do this without removing anything that makes us human. Your numbered examples are all descriptions of how we would retain our human fighting and striving even in light of my suggested enhancements. So... thank you for proving my point?

Well, except for number five. If being racist is part of being human, I'm willing to let that part go. I can't get too choked up about potentially losing that bit.

Edited by CulturalGeekGirl, 30 May 2012 - 11:06.


#2812
SimonTheFrog
  • Members
  • 1,656 posts
The geth need to defend themselves from aggressors.

Since the green beam hits the whole galaxy, including plants etc., it affects everything.
And since the premise of the green beam is to end the basis of conflicts (will there be peace?), everything will be affected by this.

No one will have to fear anyone.


Oops, should have updated. Post can be ignored now.

Edited by SimonTheFrog, 30 May 2012 - 11:14.


#2813
Seijin8
  • Members
  • 339 posts
@CGG: Quote...

"The entire point of synthesis is to make it so that robots don't have fundamental strategic benefits over us that would result in them inevitably winning any conflict we have."

I understand that viewpoint, and that is what I am arguing about. Synthetics aren't/weren't created in a vacuum. Nobody but Geppetto makes synthetics for giggles. The synthetics arose as a next step toward achieving a survival/resource-driven goal. Because they had to be better, they were able to adapt faster, etc., etc., tech singularity, organic extinction, blah blah.

Without eliminating the basis for overall conflict, more efficient/effective creations will arise. The fact that everyone shares hybrid DNA now (intentionally ignoring the wtf of that) in some ultimate evolution doesn't fundamentally change the competitive side, it just adds new avenues for exploiting those advantages.

To actually remove any chance of hyper-evolving constructs (be they synthetic or hybridized) wiping everyone out requires that the very reason for making them be removed. I.e., no more resource competition initiated by the organics, and some type of artificially inseminated empathy added to the synthetics.

Not being a synthetic, I don't know how that change would affect me, but as a biological organism, that resource-driven survival rationale underpins *every interaction I have*, and every decision and consequence enjoyed by my species - and by extension - all other similar species, of which the ME universe has no short supply.

My whole point on this has been: For synthesis to work as advertised, organics must change so fundamentally as to be unrecognizable from their beginnings, and therefore it isn't Liara stepping off the wreck of the Normandy. It's just something that started as Liara and has changed in unfathomable ways. For the changes to be small enough for Liara to still be Liara, the emotional, instinctual underpinnings must remain, and those same fundamentals of personality are driven by the very competitive processes that will lead someone, somewhere to create their destroyers.

So synthesis is no solution to the stated problem... unless that really isn't anything I knew as Liara. Nothing deeper than a VI construct inhabiting her new half-tech body.

And lest you think I am missing your point again: the strategic benefits of synthetics are in no way offset by synthesis. Unless synthesis means nothing will ever again be upgraded or kept secret, nothing can be manufactured/reproduced faster than anything else or in greater/sufficiently varied quantities, etc, which is mind-numbingly absurd on its face.

There will always be a way to make something just a smidge better and start this whole "creator/created" chain again. This only stops if the reason to create it in the first place is gone.

EDIT:  I think I have said everything on this subject that I reasonably can.  If we are not seeing one another's points, it is likely my poor communication of the topic or me perceiving some connection that others do not.  I have done a lot of research into aggressive behaviors, what causes them, etc, so this concept may carry a lot more psychological weight with me than it does with others.  This may unduly influence my perceptions of both the topic and the arguing points. 

Thanks to both CulturalGeekGirl and Ieldra2 for their points of view and the debate.  If any of this has been unduly confrontational, I apologize with the deepest respect.

Edited by Seijin8, 30 May 2012 - 11:57.


#2814
Ieldra
  • Members
  • 25,190 posts
Seijin8 wrote...

Nobody but Geppetto makes synthetics for giggles. The synthetics arose as a next step toward achieving a survival/resource-driven goal.

I challenge this assumption. I'm pretty sure the first true general artificial intelligence will come into being as nothing more than an answer to the question "Can we do it?" with little, if any, thought of utility. The only survival/resource-driven motivation I can see is meaningless to the act itself - you might want more fame in order to get more money on your next job, or to attract a better mate.

Also, if you just want to create useful stuff, you're much better off creating specialized agents with little intelligence which you can control. To say nothing of the thorny human rights issues. No, I'm pretty sure "we create true AI because they're useful" will not be on anyone's mind.

Edited by Ieldra2, 30 May 2012 - 01:01.


#2815
delta_vee
  • Members
  • 393 posts
I go to bed, the place explodes. Such is life.

CulturalGeekGirl wrote...

For me, synthesis isn't an article of faith. It's an article of admitting that we have absolutely no idea whatsoever. It's not about believing you know what's happening or what's going to happen, it's about letting go of the need for faith or certainty.

[...]

Not an article of faith.

An article of uncertainty.

That uncertainty is exactly the problem for many. I can't blame anyone who chooses to reject that vast abyss given how little we know. And while neither faith nor certainty is always achievable, having something to reason about is essential.

CulturalGeekGirl wrote...

The only stated aim of synthesis is reducing the odds of complete, total, and irrevocable extinction of non-robots in the case of a robot war, through introduction of technology that may improve the survivability of non-synthetics.

I don't think we even have that much to go on. All we get is the Starbrat talking nonsense about frameworks, "new DNA", and the created turning on the creators. We don't even get a direct reference to the singularity (yes, Ieldra2, I know it was in the cut script - all we can be certain of is the text at hand). There is much we can suppose, assume, or conjecture, but like the surface of Rannoch we get only what we bring to it.

Also, assuming the threat is indeed a future singularity (and I'll pretend to accept the idea of the singularity for a moment), a few upgrades aren't going to cut it in the face of something inherently much smarter than us. We're likely talking about the Green Change being something on the same order of magnitude as the very singularity which the Catalyst considers dangerous. See: Accelerando by Charles Stross, the Altered Carbon series by Richard K. Morgan, and the tabletop RPG Eclipse Phase.

Ieldra2 wrote...

I challenge this assumption. I'm pretty sure the first true general artificial intelligence will come into being as nothing more than an answer to the question "Can we do it?" with little, if any, thought of utility. The only survival/resource driven motivation I can see is meaningless to the act itself - you might want more fame in order to get more money on your next job, or to attract a better mate.

And the only reason to hook up said general AI to anything in the real world (which is the only avenue for such a thing to become dangerous) is if we deemed it useful for something. The question of utility may not be a factor in its creation, but it certainly will be in its potential expansion (and thus its potential threat).

Edited by delta_vee, 30 May 2012 - 02:18.


#2816
Devil Mingy
  • Members
  • 431 posts

delta_vee wrote...

CulturalGeekGirl wrote...

The only stated aim of synthesis is reducing the odds of complete, total, and irrevocable extinction of non-robots in the case of a robot war, through introduction of technology that may improve the survivability of non-synthetics.

I don't think we even have that much to go on. All we get is the Starbrat talking nonsense about frameworks, "new DNA", and the created turning on the creators. We don't even get a direct reference to the singularity (yes, Ieldra2, I know it was in the cut script - all we can be certain of is the text at hand). There is much we can suppose, assume, or conjecture, but like the surface of Rannoch we get only what we bring to it.

Also, assuming the threat is indeed a future singularity (and I'll pretend to accept the idea of the singularity for a moment), a few upgrades aren't going to cut it in the face of something inherently much smarter than us. We're likely talking about the Green Change being something on the same order of magnitude as the very singularity which the Catalyst considers dangerous. See: Accelerando by Charles Stross, the Altered Carbon series by Richard K. Morgan, and the tabletop RPG Eclipse Phase.


It really amazes me that BioWare even thought what was in the leaked script would be a sufficient explanation, let alone the anemic pseudo-cryptic explanation we get in the proper game. That said, I think it would have been acceptable if we were at least shown that the results were worthwhile. As it is, the only thing we see is a green light, the soldiers being slightly less cheerful in comparison to Control, and Joker and EDI looking happy together. The last scene, in particular, might have had some sort of symbolic meaning to me if there had been any moment in the last 30 hours of the game where they weren't happy together. I suppose it works to an extent if you discouraged their relationship, but why not have something that worked for the supportive Shepards? How about a scene near the end where, if you encouraged them to go for it, they're having some problems caused by just being too different from one another? It'd give their Adam and Eve moment at the end a bit more weight.

Since I don't know either the cause or the effect of synthesis, it's just a half-done scribble of notes in a "finished" story. BioWare's writers are better than that. What happened to the people who would willingly give up pacing to make sure that what they wanted to say was clear and understood?

#2817
CronoDragoon
  • Members
  • 10,413 posts

Devil Mingy wrote...

It really amazes me that BioWare even thought what was in the leaked script would be a sufficient explanation, let alone the anemic pseudo-cryptic explanation we get in the proper game. That said, I think it would have been acceptable if we were at least shown that the results were worthwhile. As it is, the only thing we see is a green light, the soldiers being slightly less cheerful in comparison to Control, and Joker and EDI looking happy together. The last scene, in particular, might have had some sort of symbolic meaning to me if there had been any moment in the last 30 hours of the game where they weren't happy together. I suppose it works to an extent if you discouraged their relationship, but why not have something that worked for the supportive Shepards? How about a scene near the end where, if you encouraged them to go for it, they're having some problems caused by just being too different from one another? It'd give their Adam and Eve moment at the end a bit more weight.

Since I don't know either the cause or the effect of synthesis, it's just a half-done scribble of notes in a "finished" story. BioWare's writers are better than that. What happened to the people who would willingly give up pacing to make sure that what they wanted to say was clear and understood?


Well, if there was any ending that did not need explanation, I actually think it was synthesis. I think that maybe the whole point behind that ending was making a leap into some future the details of which you do not know. After all, BioWare would need a dissertation to explain the creation of a new DNA, how DNA becomes implemented into synthetics, etc etc. It's an ending so out there that I don't believe any amount of explanation would satisfy people who believe that explanation was necessary for it to be a valid choice. It seems to me an ending designed more for people who don't need to know the scientific basis for it, or even whether or not it will work out well for the galaxy. It's a gamble, a roll of the dice, and a thorough explanation of the aftermath would somehow diminish the potency of synthesis. It's clear in this topic that there are people who love the uncertainty of possibility that synthesis represents. I don't, but I also don't think an explanation would help my opinion of it.

In order for me to accept synthesis, it would have needed to be a choice presented to everyone individually, similar to the cure for mutants in X-Men 3 or whether or not you return from the primordial soup in the End of Evangelion.

Edited by CronoDragoon, 30 May 2012 - 03:48.


#2818
Devil Mingy
  • Members
  • 431 posts
That explanation, likely true given that speculation was their intent, really hurts my outlook on the vision BioWare was going for. The attention to detail seen in the codex and the planet descriptions is a sight to behold, even if it's not entirely accurate, and that attention to detail was one of the things that made Mass Effect so immersive to me. It's something that the ending seems desperate to escape from. There's certainly some appeal to the uncertainty of a strange, alien future, but I need more to go on than a few big words from a starchild.

I disagree that an explanation is impossible, though. Ieldra2 has done a fantastic job working around the synthesis, even if it doesn't make it more palatable for me personally. The only really bad thing I can say about Ieldra2's synthesis topics is that he most certainly put more thought into it than the writers did.

I'm very curious to see how much they clarify synthesis in the EC.

#2819
CronoDragoon

CronoDragoon
  • Members
  • 10 413 messages
Well, actually, thinking about it, I don't think it was the writers' intent to make synthesis specifically ambiguous, or else we would have gotten more explanation for the other endings. I think the sad fact is just that they skimped on all of them. So maybe I'll reword it by saying that although vagueness might not have been the point of the ending, I think it is still the option that most appeals to people who like unexplained, vague things. (I don't mean that condescendingly at all; sometimes ambiguity, things being left unexplained, is good.)

#2820
SimonTheFrog

SimonTheFrog
  • Members
  • 1 656 messages
I'm sorry, I know this is nothing that adds to the thread directly.
But why... why did BioWare think that adding synthesis as an option the way they did was a good idea? Making a choice with so little information about what's going on is indeed a gigantic (and obviously forced) leap of faith. Why would anyone think this is something people want to do at the end of the trilogy?

I know that there are several films that try to reach an abstract new level at the end. There's even the first Star Trek movie, which ends in a fusion of a human and an AI, and nobody knows what it means. But didn't all these films earn the same kind of critique, that their endings were considered awful? It was certainly the case for that Star Trek movie.

It makes no sense to me.

But I agree, explaining synthesis would not be the right choice. Especially not the technology behind it.

#2821
CronoDragoon

CronoDragoon
  • Members
  • 10 413 messages

Devil Mingy wrote...

I disagree that an explanation is impossible, though. Ieldra2 has done a fantastic job working around the synthesis, even if it doesn't make it more palatable for me personally.


Yeah, but that's the point, right? The ending is left unexplained, so players fill in the blanks. I think that this is very different from BioWare offering an explanation, which must then be taken as gospel. The former is the purpose of the ending, while the latter, I think, may defeat the purpose of the ending. Any explanation BIOWARE offers is going to be ruthlessly picked apart and savaged. It's different when a fan offers his/her version. Other synthesis enders can say, "Well, that's an interesting way to look at it" or "I disagree, here's how I think synthesis would work." BioWare offering an explanation eliminates that back and forth, which I think is one of the two defining attractions of synthesis, the other being the idea of transhumanism.

#2822
edisnooM

edisnooM
  • Members
  • 748 messages
I was thinking about the ending choices, and I just realized that control doesn't really address the organics vs. synthetics issue. In destroy, synthetics are wiped out with the Reapers, and in synthesis, organics and synthetics merge, which will help with the conflict somehow. But control does.....what, exactly? Synthetics aren't even mentioned in this choice, and it doesn't seem like anything really happens to them, or at least we aren't told anything.

It probably doesn't mean anything, but I thought it was kind of odd; it made me think of that Sesame Street song "One of These Things Is Not Like the Others".

#2823
delta_vee

delta_vee
  • Members
  • 393 messages

SimonTheFrog wrote...

But why... why did BioWare think that adding the synthesis as option the way they did was a good idea? Making a choice with so little information about what's going on is indeed a gigantic (and obviously forced) leap of faith. Why would anyone think this is something people want to do at the end of the trilogy?

Judging by the imagery attached (quite literally a leap of faith, plus the obvious Adam and Eve shot at the end of the sequence), I think it was supposed to evoke a hopeful step into the unknown future. Take a Third Option, and all that.

Then again, the Aperture Science We-Don't-Know-What-It-Does is an only slightly more empty shell of an idea than the Blue Levers of Suspicion and the Red Tube of Doom; the former is described in the roughest of sketches, and the latter's boundaries are rather vague. All the options are ciphers to some degree or another.

#2824
CulturalGeekGirl

CulturalGeekGirl
  • Members
  • 3 280 messages

edisnooM wrote...

I was thinking about the ending choices, and I just realized that control doesn't really address the organic vs synthetics issue. In destroy synthetics are wiped out with the Reapers, and in synthesis organics and synthetics merge, which will help with the conflict somehow. But control does.....what exactly? Synthetics aren't even mentioned in this choice, and it doesn't seem like anything really happens to them, or at least we aren't told anything.

It probably doesn't mean anything but I thought it was kind of odd, it made me think of that Sesame Street song "One of These Things is Not Like the Others".


The implication of Control is that Shepard will hold the Reapers in reserve in case of crisis. If there's a Synthetic war that looks like it might wipe out organic civilization, Shepard will sweep in and stop it. Which is... pretty dumb.

I have a longer post brewing, but this ties directly into an assumption I frequently see cropping up when analyzing the ending: the assumption that the Starkid has an accurate assessment of the problems of the universe, and that he has any sane idea of what might mitigate these problems.

Starkid assumes Control will offer a solution because he assumes that if Shepard sees organics almost wiped out a few times, Shepard will make the decision to restart the cycles. Whether or not this is accurate doesn't matter... the Starkid is a moron with no sense of reality. Nothing he believes has any relation to what will actually happen... it's just that, for him, this little assumption is enough to feel like he's done his job.

This relates to the "The Reapers are inexhaustibly stupid" point I made recently.

It is also why I'm able to embrace a broader view of synthesis. A lot of people analyzing synthesis assume it has to satisfy some rational expectation the Starkid has, it has to solve the supposed problem that he thinks he has some way to solve. I see no reason why it must relate to him at all. He could decide it's sufficient for no reason other than that Green is his favorite color. At this point in the narrative, what he thinks is literally completely irrelevant.

Edited by CulturalGeekGirl, 30 May 2012 - 06:20.


#2825
delta_vee

delta_vee
  • Members
  • 393 messages

CulturalGeekGirl wrote...

This relates to the "The Reapers are inexhaustibly stupid" point I made recently.

With regards to that: a) you're right, and b) if they weren't stupid we'd have no game, which I think leads to c) someone somewhere failed with regards to the core premise of the third game.