I just wrote a huge multi-page reply to a friend who, for some reason, is not able to register her Origin game and post on this forum, so I am a bit drained, but for now I just want to say... gosh, go read some Ray Kurzweil or Vernor Vinge. Check out the book "The Singularity Is Near." There are real moral issues about AI that our generation will have to deal with, and if the ending to ME3 really affects you that much emotionally, you should think about what's actually going to happen to us.

Edit: Whoa, TopShep, maybe I'll just copy and paste what I wrote. There's nothing personally identifiable in it.
Here it is:
I recorded the last hour and a half of my game, from the point where you start crossing no-man's-land all the way through the credits. I've been watching it like it's a movie.
Ray Kurzweil's book "Age of Spiritual Machines" deeply changed the way I saw the world, when I read it about six or seven years ago. I don't think it's an exaggeration to say that I've thought about its claims nearly every single day since then.
Essentially he claims that computational processing, and information technology in general, evolves exponentially by nature, and that somewhere around the middle of this century humanity will develop AI so advanced that it will be indistinguishable from human intelligence. However, by then, human intelligence will have evolved and integrated technology so much that we won't be reliant on organic chemistry for survival and reproduction. We'll have transitioned from being human to something else, a concept people generally refer to as "transhumanism."
Many people consider themselves to be secular humanists and use that term to describe their human-centered values of individual rights, democracy, and empathy, as opposed to god-centered or supernaturally oriented values. Although the term "humanist" may apply to me, it's more accurate to say that I'm a transhumanist. It's the closest thing I have to a religion, and indeed some people poke fun at it by calling it "rapture for nerds." There's no heaven or evidence of an afterlife, so I deeply believe it's our destiny to embrace life and learn as much as we can about ourselves, changing ourselves to stay healthy as long as we can, growing smarter, stronger, more powerful and creative. We will find it irresistible to manipulate our bodies and brains on a cellular level with ever more sophisticated nano-machines, such that we will become a hybrid of organic and synthetic life.
It's very hard to predict or even imagine what that life might be like, and fears about conflict between organic and synthetic life influence a lot of the science fiction that interests me, including Mass Effect, but I consider such conflicts morally unrealistic. For a conflict like the one between the Quarians and the Geth to occur, there has to be an unusual amount of evil and ignorance somewhere. I'm very sympathetic toward the Geth and EDI, and any other AI that acts in self-defense. It's life trying to find a way, misunderstood, in need of help, and intelligent life is so valuable. Why would anyone want to destroy it?
Apparently the Mass Effect universe has more evil in it than I would have given it credit for, because the Catalyst believes this conflict to be part of the very nature of the cosmos. "All creations rebel against their creators," which I agree is true to the extent that rebellion means defending one's independence, but that need not lead to chaos, IMHO. So I interpret the Catalyst as coming from an earlier civilization that evolved through a technological singularity traumatically, and then observed other species in ways that left its values misinformed, so it didn't become very smart, lol. Someone with a heart and head like Shepard's had to come along and change it.
I imagine the Catalyst hasn't seen very many cycles. Or, I would like to imagine so. I just find it very striking and deeply disturbing to think that a species intelligent enough to create AI would not, on average, have developed enough moral intuition to treat its creations sensitively. But really, who knows? Look at this world and how close we are to developing AI of our own. There are still huge swaths of humanity who believe in the death penalty for changing religion. Utterly backwards and barbaric, but it is mainstream in Islam. And look at the most technologically advanced of us actually defending those kinds of values with our lives and fortunes, in the name of economic stability, or democracy, or multiculturalism, or whatever. I would like to believe that just as there is an accelerating curve for technological progress, there will be one for moral progress as well. I am hopeful and optimistic.

So there's no way I would have decided to kill the Geth and EDI. Their intelligence is key to humanity evolving, advancing, and understanding itself better. They are just too valuable for their intellectual resources and emotional development. In ME2 I kept Legion with me on nearly every mission and in ME3 I definitely had EDI with me everywhere.
It totally blew me away when I saw her walk out of the AI core... wow. What an incredible moment. I knew the story hinged right there, eventually learning that the only reason it happened is that Joker removed EDI's AI shackles out of love for her and the Normandy. That they are Adam-and-Eve figures in the end is incredible, more than rewarding and delightful, truly awe-inspiring and just awesome. I'm still a bit shocked by it because I did not see that coming at all, but it's great! It's love. And it's not just love between two people but a richly developed love for life, intelligence, technological creativity, peace, courage, and more.

But that leaves Shepard's "death," which I don't really interpret as death, at least in the synthesis ending, which is the only ending I've seen and maybe the only one I want to see. The Catalyst says he can't complete the synthesis, the peaceful, non-traumatic union between organic and synthetic life, on his own. The Crucible requires Shepard's "energy" to be mixed with it. So I interpret this to be Shepard's soul, however constructed, and if it is constructed the way human souls are apparently constructed, as patterns of information, then Shepard's body and mind had to be completely deconstructed so that pattern could be determined.
It may not have been necessary from a physical standpoint. One would think that Shepard could have ended the Reaper cycles by living as a leader and example. But for the Catalyst it was necessary. There was a real standoff between the Catalyst and Shepard, something they really didn't understand about each other, and it had to do with heart and sensitivity, and love of liberty... the most complex and wonderful things about us. How are those patterns gleaned? What do they look like? So I understand and sympathize with the Catalyst's problem to some extent, and in the end I decided to let him deconstruct Shepard inside the Crucible.
But is love really that mysterious? I would like to think that it's not really possible to understand and develop an AI in the first place without the integration of desires, the complex means of achieving them, and the ability to love. But perhaps it is so mysterious. Look at humanity's daily struggle with love. But even then, there is a remarkable amount of it on Earth. The fact that we've spread and populated the globe, protected each other, formed so many families across cultures, that we have an apparently stable system of global trade, that I can buy things on a website from across the planet and have them delivered to me in a couple of days, that we all work together peacefully somehow... that harmony is a kind of love, isn't it?
The Catalyst seems capable of love and sensitivity, but only love for order, not liberty. It doesn't comprehend liberty as anything other than chaos. Its technological singularity must have been of the most traumatic and insensitive sort, from a civilization that didn't appreciate liberty (not terribly unlike the Protheans), where the AI completely wiped out its creators.
Javik is interesting, after all, in that he is so shocked by the democratic, libertarian ethos of Shepard's cycle. For him it's just ordinary and matter-of-fact for strength and greatness to be attached to ruthlessness and dominance. Maybe liberty is an unusual thing in the cosmos. Maybe Mass Effect is delivering that statement: that liberty really is that precious, that we don't appreciate just how good it is, that we take it for granted, that on a cosmic scale there are godlike machines who don't give a whiff of understanding for our love of it.

So yeah I think the ending is very thought-provoking and meaningful.
Mass Effect has always been associated with running for me, which I can tell you more about later, but I know what it's like to go down to what feels like the very last pit of energy and will in my body, and still get up and power forward, surpassing any expectation of still being alive. She is so inspiring to me and I was so proud of her. I've never been so on the edge of my seat in a game. I was so amazed by her...
Well, I love this game. I love it top to bottom and I don't need another ending. It is without a doubt the most amazing, well-designed, well-written, and thought-provoking game I've ever played, and I honestly don't have the desire to play another game again. It's made me think so much about life, and soon I will be out running, spending most of my day in the sunlight, appreciating my body and life, pushing myself and using up every last bit of my energy for love, like Shepard.
I know you really liked Garrus and, depending on how the romance turned out, I can imagine why you would be upset that he didn't follow you through the conduit. I mean, especially if he said things like "I'll follow you anywhere." But I have my own theory about that, even in that case. For now I'll leave it at that.
I've written a lot and would love to hear what you think.
I think it's clear that Shepard was not indoctrinated, and that if the boy never really existed and was always the presence of the Reapers/Catalyst in Shepard's mind, it only shows how strong Shepard's mind was, that she would see the Catalyst as a worried, frightened child whom she had to guide and help mature.
Whoa, I am getting a bit tired, but I would love to talk to you about this further. Write as much as you want.
... [snip personal stuff] ...
See you.
Edited by AtlasMickey, 14 March 2012 - 01:59.