Organics and Free Will in the Synthesis ending.


56 replies to this topic

#26
Sideria

  • Members
  • 128 messages

Riot86 wrote...

Changing someone's way of thinking by force is brain-washing. It doesn't matter if you think the resulting "Utopia" is for the better, as this is only your subjective view. How can you know for sure that your view is the right one?

Don't get me wrong, I agree with your belief and also think it would be great if people cared to understand each other more, as this would most likely reduce hatred and violence in this world. But that is just my personal opinion, and I would never dare to force it on somebody else. I might try to convince other people using arguments, but I certainly have to leave the decision whether they want to rethink their point of view or not to them.

It is a contradiction to say people have to appreciate each other more, but then ignore their individual beliefs and impose your opinion on them without their consent. Tolerance by definition is something that cannot be forced, as it is about respecting the beliefs of others, even if one might strongly disagree with those beliefs. Tolerance is something that has to be learned.

And stating that in the end everybody seems to be happy is irrelevant, as those people are not the same they were before. Maybe some humans didn't want to learn to appreciate the other races and cultures in the first place. I personally would think they are complete idiots and would of course strongly differ in opinion with them, as I think this is a terrible attitude. But I have to respect it nonetheless. And if those humans suddenly start to like aliens after Synthesis, this has to be called "brain-washing", as their convictions have been forcefully replaced by an alternative set of beliefs.


What I meant was that Synthesis gives people new tools to understand others, and their change in way of thinking is the result of people using these new tools. People are still free to choose what to believe.
(And maybe some haven't changed at all. After all, the ending slides can be read as a general tendency.)

But I agree that forcefully "giving" these tools is unethical. Still, my Shepard chose to do it for the greater good. She believes that with Destroy or Control the result will probably be the same: the galaxy will reach a "natural" synthesis in the far future. But that will be a long and bloody path.

Edited by Sideria, 17 March 2013 - 08:11.


#27
Norbulus

  • Members
  • 33 messages
The Synthesis cutscenes show a rather distorted version of humanity. First of all, being human doesn't grant you the ability of ultimate empathy. It's clear and simple: if you don't believe it, just look around; you'll see self-centered people for sure.
Second of all, free will doesn't exist. Yes, our brain simulates it, but you do not decide. Little electrons in their orbits decide.
And I'm open to any philosophical debate.

#28
Sideria

  • Members
  • 128 messages

Norbulus wrote...

The Synthesis cutscenes show a rather distorted version of humanity. First of all, being human doesn't grant you the ability of ultimate empathy. It's clear and simple: if you don't believe it, just look around; you'll see self-centered people for sure.
Second of all, free will doesn't exist. Yes, our brain simulates it, but you do not decide. Little electrons in their orbits decide.
And I'm open to any philosophical debate.

True and true.
- I find empathy sorely lacking in the folks I see every day ^^
- Our brain is software and hardware. We can change the software (education, social pressure, culture, experience, etc.), but our will will always be bound to the hardware part (drugs can alter that :P).

Edit: Ah! This gives me a way to explain Synthesis --> Synthesis only alters the "hardware" part, not the "software". This is why Synthesis doesn't change people's beliefs.

Edited by Sideria, 17 March 2013 - 06:23.


#29
teh DRUMPf!!

  • Members
  • 9 142 messages

Sideria wrote...

But I agree that forcefully "giving" these tools is unethical. Still, my Shepard chose to do it for the greater good. She believes that with Destroy or Control the result will probably be the same: the galaxy will reach a "natural" synthesis in the far future. But that will be a long and bloody path.


Hear, hear!

I have to roll my eyes at folks who argue that it's oh-so-dangerous to achieve something without difficulty. Worse yet when it involves quoting Mordin and Legion, because that's totally NOT the point either of them was trying to make. People have some deluded idea from those conversations that it's actually a good thing for blood to turn the wheels of time forward... SMH!

The only people in-game that actually do make this point are various krogan on Tuchanka in ME2. The stupid ones.

#30
Franky Figgs

  • Members
  • 119 messages

Norbulus wrote...

The Synthesis cutscenes show a rather distorted version of humanity. First of all, being human doesn't grant you the ability of ultimate empathy. It's clear and simple: if you don't believe it, just look around; you'll see self-centered people for sure.
Second of all, free will doesn't exist. Yes, our brain simulates it, but you do not decide. Little electrons in their orbits decide.
And I'm open to any philosophical debate.


At what point does the "I" emerge from the little electron orbits? Our minds are then like the group consensus the geth refer to. When you have enough of these orbits together, they create a decision-making process. They are all "I", but the majority of them operating together makes the free will. Any constraint on this is a form of control or oppression of that free will. There can be many such conditions; however, a direct uplink to a control-seeking AI would be a complete oppression of free will.
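Franky Figgs's picture of a decision emerging from many simple units can be sketched as a toy simulation (purely illustrative: the `consensus` function and the 30% per-unit noise rate are my own assumptions, not anything from the game or the geth lore):

```python
import random

def consensus(n_units, stimulus):
    """Each simple unit casts a noisy vote on the stimulus; no single
    unit decides, but the majority of units settles the outcome."""
    votes = [stimulus if random.random() < 0.7 else not stimulus
             for _ in range(n_units)]
    return sum(votes) > n_units / 2  # True if the majority voted True

# Each unit is right only 70% of the time, yet a large enough group
# tracks the stimulus almost perfectly -- the "decision" is emergent,
# belonging to the collective rather than to any one unit.
random.seed(0)
print(consensus(10001, True))  # prints True
```

A tiny group (say 3 units) gets this wrong fairly often, which is the point of the analogy: reliability, like the "I", only appears at scale.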

#31
Auintus

  • Members
  • 1 823 messages

Franky Figgs wrote...

Norbulus wrote...

The Synthesis cutscenes show a rather distorted version of humanity. First of all, being human doesn't grant you the ability of ultimate empathy. It's clear and simple: if you don't believe it, just look around; you'll see self-centered people for sure.
Second of all, free will doesn't exist. Yes, our brain simulates it, but you do not decide. Little electrons in their orbits decide.
And I'm open to any philosophical debate.


At what point does the "I" emerge from the little electron orbits? Our minds are then like the group consensus the geth refer to. When you have enough of these orbits together, they create a decision-making process. They are all "I", but the majority of them operating together makes the free will. Any constraint on this is a form of control or oppression of that free will. There can be many such conditions; however, a direct uplink to a control-seeking AI would be a complete oppression of free will.


There is no "I." Or, rather, the "I" is created by the physics behind it all.

Your psychology is derivative of your biology, which is, in turn, derivative of your biochemistry. Chemistry is merely a result of physics, which is, for the most part, mathematically predictable. Those numbers have been crunching for ages and your thoughts are merely the result of chemical interactions that are completely beyond your own control. Think of it like fate, guided by physics.
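Auintus's picture of thought as deterministic number-crunching can be illustrated with a toy update rule (a sketch only: the constants are the classic linear congruential generator values from ANSI C's `rand()` example, standing in for "physics"; nothing here comes from the game):

```python
def evolve(state, steps):
    # A fixed update rule: given the initial state, every future state
    # (the "decision") is already determined -- fate, guided by physics.
    for _ in range(steps):
        state = (1103515245 * state + 12345) % 2**31  # classic LCG step
    return state

# Rerunning from the same initial conditions can never diverge:
a = evolve(2013, 1000)
b = evolve(2013, 1000)
print(a == b)  # prints True
```

The determinist claim is exactly this: if the brain's update rule and initial state were fully known, its "choices" would be as replayable as this loop.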

#32
Auld Wulf

  • Members
  • 1 284 messages
It's all exclusively headcanon, a point I've gone to lengths to prove in the past. Aside from what we're objectively told in the ending as canon, we can't say anything beyond that. One could say that when Shepard picks Destroy, the Catalyst puts Shepard into a coma (believing organics aren't yet mature enough to make the choice) and proceeds with the harvest, and that everything we see in the Destroy ending is just a fake reality contained within the Reaper consensus anyway.

We can only go by what each ending tells us; if we go further than that, it's headcanon.

#33
Norbulus

  • Members
  • 33 messages

Auintus wrote...

Franky Figgs wrote...

Norbulus wrote...

The Synthesis cutscenes show a rather distorted version of humanity. First of all, being human doesn't grant you the ability of ultimate empathy. It's clear and simple: if you don't believe it, just look around; you'll see self-centered people for sure.
Second of all, free will doesn't exist. Yes, our brain simulates it, but you do not decide. Little electrons in their orbits decide.
And I'm open to any philosophical debate.


At what point does the "I" emerge from the little electron orbits? Our minds are then like the group consensus the geth refer to. When you have enough of these orbits together, they create a decision-making process. They are all "I", but the majority of them operating together makes the free will. Any constraint on this is a form of control or oppression of that free will. There can be many such conditions; however, a direct uplink to a control-seeking AI would be a complete oppression of free will.


There is no "I." Or, rather, the "I" is created by the physics behind it all.

Your psychology is derivative of your biology, which is, in turn, derivative of your biochemistry. Chemistry is merely a result of physics, which is, for the most part, mathematically predictable. Those numbers have been crunching for ages and your thoughts are merely the result of chemical interactions that are completely beyond your own control. Think of it like fate, guided by physics.

Yes, you're right, my determinist friend.

#34
Auld Wulf

  • Members
  • 1 284 messages
@Sideria

I agree with your reasoning. It seems that the vast majority of humans have a vast deficit in sympathy, empathy, conscience, and reason. Essentially, the growing norm for humanity seems to be sociopathy, and that scares me more than a little. It's what allows us to easily kill or take slaves without ever realising the wrongness in such an act. What satisfies the sociopath is all that matters to them, they are purely about their own gratification and nothing else. There are no other concerns, and there's almost a solipsism to it -- to the sociopath, they are the only real person alive.

And yet there are many other people out there. Those who are less fortunate, those who are sick, those who are less capable. To the sociopath, these are pawns to be manipulated and nothing else, as they are lesser creatures. Anything that isn't a threat to the existence of the sociopath is a lesser creature, they are only ever in league with another entity because they believe it gives them a better chance of survival. Still, they would stab their so-called allies in the back if it would benefit them, somehow.

We see a decreasing level of connection with real friends or family, wherein we have people today who'd sell their own grandmother to satisfy their need, to provide them with their gratification; who have absolutely no emotion for what their grandmother would feel whatsoever, and no conscience as to why this is an unethical action. This is why it's an easy choice for them to make -- because they don't take the consequences as their responsibility; they are completely disconnected from that.

In a story, we show our inner selves. We're told that the Reapers are just slaves to a control program, and contain billions of people from thousands of civilisations. We have the option to free those people, and yet many choose to kill them. They choose to kill them because their Shepard survives, and because they get that gratification of having "won" something by killing their opponent. This is a standard sociopathic trait, to be unable to see anything from any perspective other than their own.

If it were within my power, I'd initiate Synthesis today, as I feel that it's needed. People need empathy, sympathy, conscience, and reason. Only then can we truly transcend and become an intelligent people, free of instinct. I believe that without empathy, sympathy, conscience, and reason, we are worse than the lowest animals. Even wolves look out for their own, so you have to go further back down the chain than that.

Look at those who consider the geth worthy of dying because they are different. This reminds me of the sociopathic trends that allowed the slavery of black people, the hatred of gay people, and the hatred of disabled people; the latter two I've had to live through and experience myself. It's amazing sometimes how many people will consider you a lesser being if you're disabled in any way, shape, or form.

This is why there's so much exploitation today, too. Exploitation of the disabled and the less fortunate. And yesterday it was exploitation of ethnicities -- those with a different skin colour. It's sad really that we haven't yet, on our own, evolved to a point where sympathy, empathy, conscience, and reason are commonplace without the need of outside intervention. All I can think of though is how much better the world would be, present and future, if every person had sympathy, empathy, conscience, and reason.

And that's why I pick Synthesis. Every time.

#35
Deathsaurer

  • Members
  • 1 505 messages
The Catalyst's whole reasoning here is that it's going to happen eventually, so why not use our tech to speed it up. This attitude makes perfect sense for it, since this has been its goal since it started, but I'd rather provide an atmosphere where we do it on our own terms, when we're ready, instead of me picking it as a symbolic avatar. I don't think there's any intended brainwashing involved; it's just not for me. It's waited eons for this to happen; what's a few more centuries compared to that? Yeah, there may be some bloody wars between now and then (or not, depending on your ending), but that's a risk I'm willing to take to do it the right way for me.

#36
Sibu

  • Members
  • 220 messages

SpamBot2000 wrote...

There are no organics after the synthesis ending, just hybrids. Grass will go crunch under your robo-feet.


http://9gag.com/gag/6818793

#37
Franky Figgs

  • Members
  • 119 messages

Auintus wrote...

Franky Figgs wrote...

Norbulus wrote...

The Synthesis cutscenes show a rather distorted version of humanity. First of all, being human doesn't grant you the ability of ultimate empathy. It's clear and simple: if you don't believe it, just look around; you'll see self-centered people for sure.
Second of all, free will doesn't exist. Yes, our brain simulates it, but you do not decide. Little electrons in their orbits decide.
And I'm open to any philosophical debate.


At what point does the "I" emerge from the little electron orbits? Our minds are then like the group consensus the geth refer to. When you have enough of these orbits together, they create a decision-making process. They are all "I", but the majority of them operating together makes the free will. Any constraint on this is a form of control or oppression of that free will. There can be many such conditions; however, a direct uplink to a control-seeking AI would be a complete oppression of free will.


There is no "I." Or, rather, the "I" is created by the physics behind it all.

Your psychology is derivative of your biology, which is, in turn, derivative of your biochemistry. Chemistry is merely a result of physics, which is, for the most part, mathematically predictable. Those numbers have been crunching for ages and your thoughts are merely the result of chemical interactions that are completely beyond your own control. Think of it like fate, guided by physics.


Determinism, while intellectually stimulating and not proven wrong, negates moral responsibility. This story relies heavily on moral decision-making, and free will is not proven wrong either.
I believe most people, even those who may argue in support of determinism, live their lives with responsibility for their choices. In fact, every justice system is built on it.

The geth consensus is a perfect metaphor for our minds' synapses: individually doing little or nothing, but together arriving at intelligence. Once geth (or synapses) come together, they arrive at the conclusion "cogito ergo sum", which is this:

"x" thinks
I am that "x"
Therefore I think
Therefore I am

An emergence of moral responsibility is naturally derived, and the feeling of free will ensues.
Constraints such as social, political, ideological, and even physical environments will guide this emergence toward predictable behavior, but they will not take away from it. However, a thrall would see the classic logic as follows:

"x" thinks
IT is that "x"
Therefore IT thinks
Therefore IT is

And all self identity is lost in IT.

[edit: no pun on indoctrination theory intended]

Edited by Franky Figgs, 17 March 2013 - 08:24.


#38
Sibu

  • Members
  • 220 messages

Auld Wulf wrote...

@Sideria

I agree with your reasoning. It seems that the vast majority of humans have a vast deficit in sympathy, empathy, conscience, and reason.

.


You, good sir, are full of ****:

http://tvtropes.org/.../ActsOfKindness
http://tvtropes.org/...RealLifeScience
http://tvtropes.org/...g/RealLifeOther
http://tvtropes.org/.../RealLifeSports

http://www.randomactsofkindness.org/

http://www.cracked.c...-to-rescue.html

Both Rousseau and Hobbes were right. There are good people everywhere, and just as many (if not more) bad people on the planet.

The list CAN go on almost infinitely... you pessimist piece of crap

****ing Bull****.

#39
eye basher

  • Members
  • 1 822 messages
Ah, the oldest excuse in the book: "I did it with good intentions." You know what they say about good intentions, right? The road to hell is paved with them.

#40
Norbulus

  • Members
  • 33 messages

Sibu wrote...

Auld Wulf wrote...

@Sideria

I agree with your reasoning. It seems that the vast majority of humans have a vast deficit in sympathy, empathy, conscience, and reason.

.


You, good sir, are full of ****:

http://tvtropes.org/.../ActsOfKindness
http://tvtropes.org/...RealLifeScience
http://tvtropes.org/...g/RealLifeOther
http://tvtropes.org/.../RealLifeSports

http://www.randomactsofkindness.org/

http://www.cracked.c...-to-rescue.html

Both Rousseau and Hobbes were right. There are good people everywhere, and just as many (if not more) bad people on the planet.

The list CAN go on almost infinitely... you pessimist piece of crap

****ing Bull****.



I think you misinterpret the concept of empathy. It's not about helping poor people; it's about understanding how they feel. And those numbers are very small compared to total outgoings.

#41
Yestare7

  • Members
  • 1 340 messages
Here is my opinion: after a year, all organic tissue will have rejected the synthetic bits, and we'll be back to normal. Turning everyone into semi-bots is total bulls***


Edit: It is the starbrat's favorite, ANOTHER reason not to choose it. :whistle:




Edited by Yestare7, 17 March 2013 - 09:02.


#42
Yestare7

  • Members
  • 1 340 messages

Edited by Yestare7, 17 March 2013 - 09:04.


#43
geceka

  • Members
  • 208 messages

Auld Wulf wrote...

In a story, we show our inner selves. We're told that the Reapers are just slaves to a control program, and contain billions of people from thousands of civilisations. We have the option to free those people, and yet many choose to kill them. They choose to kill them because their Shepard survives, and because they get that gratification of having "won" something by killing their opponent. This is a standard sociopathic trait, to be unable to see anything from any perspective other than their own.


I wouldn't go as far as to say that choosing to finish the game this way is a sign of sociopathic traits (in fact, I disagree), but you are highlighting one of the issues I have with the way the Reaper arc was resolved: we never got to experience what is actually inside a Reaper, e.g. what it means that they store billions of conjoined minds, as Legion said in ME2.

After all, it is very hard to empathize with something that is only being alluded to, but which you cannot experience yourself – empathy is a lot about experience, about understanding and about being able to put yourself into another's shoes.

If there were a scene in ME3 in which we got to experience the minds within a Reaper in a way similar to the geth consensus, it might well have played a role in how people made their final decisions. As it stands, though, the nature of the Reapers is but a footnote that doesn't really invite people to think much about it – which is a pity, really.

#44
Eterna

  • Members
  • 7 417 messages
I'm not getting where it is implied in that ending that free will no longer exists. Sounds more like people making things up because they don't like it.

#45
Absaroka

  • Members
  • 162 messages
It's not that Synthesis is intended by the writers to impose on free will; it's just that the idea that it wouldn't, given everything that came before and how it is presented, is ridiculously hard to believe (but it's hardly the only thing that's ridiculous about that ending).

#46
Banxey

  • Members
  • 1 306 messages
This is my reasoning on the whole thing. As far as taking the choice away from the rest of the galaxy goes: if you choose Destroy, you kill the newly aware geth and EDI, and if you choose Control, you're basically replacing one regime with another. Shepard isn't perfect. It's not ideal. But the game is about difficult decisions. There's nothing in the cutscenes to suggest there aren't going to be people annoyed about being synthesized. It would make an interesting plot for a new game. Life would be boring without conflict.

As for the ending being BS: nobody will read this, but I need to say it somewhere so I can stop thinking about it and go play something else. ;)

The Catalyst is a program that controls the Reapers. It was programmed to find a solution, and once it found one, it continued to carry it out as it was programmed to do. It had no free will outside the parameters it was programmed to operate within, until there was an interruption in that program which caused it to reboot. In other words, it had to find a new solution with the new data available to it (Shepard's success).

Seeing as the Catalyst had decided there was no other way to stop synthetics from wiping out their creators if the reaping was stopped, the only possible solution was to advance evolution to a point where humans and AI had no choice but to see each other as equals. But the Catalyst can't understand organics, because it is still a machine. It needs Shepard.

Now for the so-called "Space Magic" (a term I kind of hate, given the nature of the game). At one point in the game, Javik states that he can communicate and read experiences via genetics. Most people call this BS, but there actually is a current scientific hypothesis around genetic memory. The basic idea is that, in order to evolve, life experiences (traumatic or otherwise) are somehow stored in genes and passed on to the next generation to inform a species on the best way to evolve. It isn't established science (and isn't anything like Assassin's Creed would have you believe), but the point is that it does have a rocky foundation in science.

My point being, of course, that rocky science is the basis for a great deal of science fiction. Using this theory, the Crucible breaks Shepard down into readable data and, in a way similar to Legion freeing the geth with the Reaper code, gives the Catalyst and the synthetics the data they need to understand biological life. The shockwave itself is comprised of data and some form of self-replicating nanite. And since Shepard has just united the galaxy (and is essentially an exceptional specimen, having reached the Catalyst), his/her genetic memory imparts all of its experiences. Which is why it had to be Shepard. Someone else might have gotten it wrong.

Edited by Banxey2, 17 March 2013 - 11:35.


#47
Riot86

  • Members
  • 250 messages

Auld Wulf wrote...
What satisfies the sociopath is all that matters to them, they are purely about their own gratification and nothing else. There are no other concerns, and there's almost a solipsism to it -- to the sociopath, they are the only real person alive.

Auld Wulf wrote...
If it was within my power, I'd initiate Synthesis today, as I feel that it's needed. People need empathy, sympathy, conscience, and reason.

So because you feel that people need more empathy, you want to impose your will on others without regard for whether they want to be more empathetic or not? Wouldn't this complete disregard for the opinions, attitudes and feelings of others in favor of your own point of view kind of make you a sociopath yourself, at least by the definition you presented?

If Synthesis were a matter of choice, meaning that all individuals had to decide for themselves whether they wanted to be "synthesized" or not, one might well argue that it was the right thing to do. But the way it is presented in the game, it is forcing your own beliefs on others... and that's not cool.

I basically agree with your view concerning the need for more empathy in general; it would be a good thing if more people cared for others and tried to understand them. But as I said in an earlier post, I would never dare to force my opinion on others the way you obviously would if you had the power. And to be honest, I'm really glad that you most likely will never be able to do so.

Sideria wrote...

But I agree that "giving" forcefully these tools is unethical.

And that is the essence of it. There was no way Shepard could be sure that everyone was willing to accept this change, these new tools. In fact, provided she has Javik in her party, she could on the contrary be 100% sure that not everybody was in favor of it, as Javik is clearly opposed to the idea of Synthesis. And it would be naive to believe our favourite Prothean was the only one who thought that way.

By making this alteration happen nonetheless, Shepard willingly accepted that people were forcefully modified. And since this obviously changed the way they viewed the world (e.g. Javik not throwing himself out of the airlock ^_^), it meets the criteria for being called "brain-washing".

Don't forget, brain-washing doesn't in principle mean that people no longer have free will afterwards; rather, it means that the basis responsible for their opinion-making has been altered by force. And that is what Synthesis does, regardless of whether it alters the hardware or the software part, to use your analogy. It is irrelevant whether having these new tools made them rethink their viewpoint on their own; the act of forcing these tools on them, without them wanting to have them in the first place, is where the problem lies. Shepard didn't convince them of the worthwhile idea that one should try to understand others more using arguments; she used force.

Edited by Riot86, 17 March 2013 - 11:57.


#48
Sideria

  • Members
  • 128 messages
Okay, I grant you the point on brain-washing after reading the definition again (neural modification is a form of brain-washing).
However, today's definition isn't perfectly adapted to the theme: if I get a transhumanist brain augmentation, is that brain-washing?

But the thing is that all three endings have unethical parts.
I prefer forcefully upgrading all life to killing all synthetics, Reapers included (my Shepard supports Adams in this dialogue).
But I do find Control a sort of middle ground (another Shepard of mine always picks Control).

PS: I think this is the kind of debate BioWare was hoping for from the beginning ^^

Edit: added transhumanism question.

Edited by Sideria, 18 March 2013 - 12:41.


#49
Reorte

  • Members
  • 6 592 messages
There's no reason to believe that everyone doesn't do whatever they want to do post-Synthesis. However there's also a lot of reason to believe that what they want to do may not be what they wanted to do before it (otherwise it wouldn't achieve any of the things it's somehow magically meant to achieve). So there's still free will, but in a mind-altered population.

#50
Xellith

  • Members
  • 3 606 messages
I find it disturbing that the geth needed the upgrades to be "truly alive", and that EDI eventually said she is only "alive" once she is synthesized.

No. The geth and EDI were already alive. (In my opinion.)