Spoilers for the entire Mass Effect Trilogy within this thread.
If you're wondering about why this thread isn't in the ME3 section, or if you saw the first version of this thread and are wondering about the disappearance/reappearance of it, see here.
Are Synthetics 'alive'? Are they sentient? Are they the same as Organics - just made of different materials?
Or are they merely an imitation of life? Without an actual mind or any free will - just algorithms and code designed to give the appearance of sentience?
In short - "Does this unit have a soul?"
"...No data available."
This issue is one of the central questions in the Mass Effect Universe. It fuelled the Geth-Quarian war. Shepard can, on various occasions, express a belief in either direction. And, given the conversation with the Catalyst, and the various ending choices, the question even has an influence on the ending of ME3. After all, if your answer to Legion's question is "No," then there's really no downside to picking Destroy.
Defining a Mind
Firstly, we need to clarify exactly what we mean by the above question. What is a soul? What is a mind? What difference does it make if synthetics have one?
Now... I'm not particularly religious. While I won't rule out the possibility that something like an eternal soul exists, I don't actively believe in one, and it's not what I'm here to talk about. I'm not asking whether or not Legion is now doing the robot dance in heaven.
In this post I'm treating souls and minds as the same thing. So what do I mean by a mind?
By a mind, I mean the 'thing' that watches the world from behind your eyes, listens to the world with your ears, and experiences sensations of touch from your nerve endings.
This is the 'consciousness' that exists in your brain, and EDI doesn't have one because, let's face it, she's a fictional character and every single one of her responses has been pre-programmed by Bioware. (Yeah, right now I'm sitting on the real-world side of the fourth wall.)
I'm sorry, EDI...
The problem is that I immediately run into a logical wall with this definition of a mind.
I know that I have a mind - or, more specifically, I know that I am a mind that has a body. However, I have no way of knowing that the same is true of other people. From my perspective, everyone else could just be bodies with pre-programmed responses, and I've simply never said or done anything that falls outside their programming.
In other words, I might be surrounded by philosophical zombies - people who appear to be real, but don't actually have a mind.
I'm not actually calling you a zombie here - I'm just saying that I have no way of knowing whether or not you are one. Equally, you have no way of knowing whether I'm real or a zombie - and it doesn't help that your only connection to me is via words on the internet.
This leads us to the next section - which is arguably an example of the worst dentist appointment you could ever have.
The Rational Dentist
Stephen Law wrote a book called "The Philosophy Gym". He has made one of the chapters - "The Strange Case of The Rational Dentist" available online here. I highly recommend following the link and giving it a read, (partly because Stephen Law is a university lecturer in philosophy and I'm not) but I'll summarise the main points:
The dentist is a character who believes that he is the only being in existence to possess a mind.
During a session with a patient (who cannot speak because his mouth is stuffed with cotton wool and numbed with painkiller), he explains to the patient why he doesn't believe they actually possess a mind - all while sticking a drill inside their mouth. He doesn't believe his patient can genuinely experience pain, but he administers the painkiller anyway, because it's an observed fact that sticking a drill inside someone's mouth usually causes them to scream and flinch if no painkiller has been injected first.
He is actually somewhat rational to hold this belief. Kinda. (But believe me, I'm grateful that we had either Chakwas or Michel on the Normandy, and not this dentist...)
The analogy that the dentist uses is that of cutting open cherries and finding stones inside. If you cut open 1000 natural cherries and find 1000 stones, you can make some cherryade. You can also comfortably conclude that all natural cherries contain stones.
However, if you only cut open 1 cherry, and find 1 stone, it's not logical to assume that every cherry contains a stone. Equally, you may be a person with a mind, but it's not logical to extrapolate from 1 person - yourself - and conclude that every person has a mind.
Based on this, the Rational Dentist realises that he has no evidence for other people having minds, and concludes that they don't.
Bringing it back to Synthetics
"We do not require dentists."
You probably think I've wandered a bit off-topic here. I started off by asking whether or not Synthetics, in the Mass Effect Universe, have minds. And yet I seem to have concluded that people, in the real universe, don't have minds. Except for me.
Is that a conclusion? Synthetics don't have minds - but neither does anyone else?
Well, I suppose it is a conclusion, but it's not a very satisfying one. I don't know about you, but I'm not very comfortable living in a world populated by philosophical zombies. I prefer my zombies to stay in video games.
It won't surprise you to know that I don't actually agree with the Rational Dentist. There are a few good arguments against the Dentist and, since they relate to being able to tell whether or not something has a mind, they'll be useful in telling whether or not Synthetics have minds.
So does Legion have a soul? Let's find out.
Argument No.1 - Redefining the Mind
Perhaps our original definition of a mind was wrong. It certainly wasn't very useful. It enables you to say that you have a mind, but it doesn't let you say much else.
You can imagine a body without a mind. This would be the philosophical zombie that we were discussing earlier. This would also be every Synthetic in the Mass Effect Universe, if the answer to Legion's question is 'No'.
But can you imagine a mind without a body? A mind without any physical presence at all? So... a ghost?
I don't believe ghosts exist. If you do, then I'm going to ask you to come back with some reproducible hard evidence before you can convince me.
Oh great. Someone call the ghostbusters.
But if ghosts don't exist - if a mind cannot exist without a body - can a body exist without a mind?
Oh. Right. Yeah. Dead bodies.
Fine... Can a living body exist without a mind?
Oh. Right. Yeah. Brain-dead bodies.
Okay... Can a living, talking, breathing, interacting body - a person - exist without a mind?
In other words: Are philosophical zombies actually completely unrealistic?
This argument - which is presented in a different form by Stephen Law if you followed the Rational Dentist link - is the Logical Behaviourist argument. It suggests that you can only define a mind by its influence. Put simply - if someone acts like they have a mind, then they have a mind. Yes, I know that's not a very logical statement, but it gets the point across.
If you define a mind by a person having certain characteristics - e.g. you can hold a conversation with them, they respond to stimuli (like pain), and they appear to have free will - then suddenly the entire human race is back inside the group of people that definitely have minds. Which is a bit of a relief, if you ask me. No more philosophical zombie apocalypse.
However, by this definition Synthetics also definitely have minds. After all, they have the characteristics, don't they?
There are things I don't like about this argument, though. It pretty much bypasses the question by redefining it. And it leaves me asking "So what is the 'thing' behind my eyes? I experience consciousness - but what is the 'I' that is doing the experiencing? Do Synthetics have that? Do other people have that?"
The argument also doesn't quite kill off the Philosophical Zombie Horde. After all, you still know what I mean when I say "A body without a mind."
Argument No.2 - Some Machines have Ghosts in Them
The ghost in this machine is watching you.
Let's stick with our original definition of a mind.
Cut open a cherry. You find a stone inside. Now, what can you say about the pile of uncut cherries sitting in front of you?
You can't say that all of them will contain stones. You don't have a large enough sample size for that.
But isn't it also illogical to conclude that none of them contain stones?
So. Some of those cherries probably contain stones.
(In fact, you can estimate that roughly 43% to 91% of the cherries contain stones, with the most likely value being 67%. If you're really interested, I can write out the maths in a separate post, but it's not really important.)
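Since I said I could write out the maths, here's one plausible reconstruction of where that 67% comes from (the exact working isn't shown above, so treat this as my best guess at the method): treat "does a cherry contain a stone?" as a coin-flip with unknown probability p, start from a uniform prior, and update on one cherry, one stone. That gives a Beta(2, 1) posterior, and Laplace's rule of succession gives the 67% headline figure. The interval below is a 50% credible interval, which doesn't exactly match the 43%-91% band quoted, so the original presumably used a different interval choice.

```python
# Sketch of a Bayesian estimate from a single observed cherry-with-stone.
# Assumption (not stated in the post): a uniform prior over p, the fraction
# of cherries containing stones. After observing 1 stone in 1 cherry, the
# posterior is Beta(2, 1), whose CDF is simply p**2 - so quantiles are just
# square roots, no stats library needed.
import math

def beta21_quantile(q: float) -> float:
    """Quantile of the Beta(2, 1) posterior: solve CDF(p) = p**2 = q."""
    return math.sqrt(q)

# Laplace's rule of succession: (successes + 1) / (trials + 2) = 2/3 ~ 67%.
posterior_mean = (1 + 1) / (1 + 2)

# A central 50% credible interval, as one example of an interval estimate.
low, high = beta21_quantile(0.25), beta21_quantile(0.75)

print(f"mean: {posterior_mean:.0%}, 50% interval: {low:.0%} - {high:.0%}")
```

The point survives whichever interval you pick: one stone is weak evidence, but it's evidence that some cherries contain stones, not none.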
By the same logic, since you have a mind, you can conclude most other people also have minds, even if you can't assume that all of them do.
Which means that if someone acts like they've got a mind, it's probably best to treat them like they have a mind. You know, just to be on the safe side.
And that brings us back to Synthetics. And Alan Turing coming to the rescue.
The Turing Test
Alan Turing was a genius. I don't think that's up for dispute. He got a first-class honours degree in Mathematics at King's College, Cambridge, and was made a fellow of the college at 22; he was one of the major British code-breakers during WW2, for which he was appointed OBE; and he is considered one of the founders of computing and artificial intelligence. He was also sadly persecuted and chemically castrated for being gay when it was illegal, for which the British government only recently apologised. Seriously, look this guy up - he's had a huge influence on modern history. There's even a conspiracy theory surrounding his death.
However, for our purposes, we're going to focus on one specific thought experiment that he developed. The Turing Test.
Meet ALICE. The Artificial Linguistic Internet Computer Entity.
ALICE is a chatbot. She's not real, she's got a bunch of algorithms that determine her responses. If you followed the link and chatted with her you probably noticed that, while she's got a decent handle on the English language, she comes out with strange responses from time to time. Basically, she's a real world VI. (You can also tell that she's not human because her typing speed is instantaneous.)
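To make "a bunch of algorithms that determine her responses" concrete: the real ALICE is driven by AIML, a big library of pattern/template pairs. This toy sketch (rules and replies are mine, not ALICE's) shows the basic mechanism - and also why such bots come out with strange responses whenever nothing matches.

```python
# Toy rule-based chatbot in the spirit of ALICE's pattern matching.
# Each rule pairs a keyword with a canned template; anything unmatched
# falls through to a generic reply - the tell-tale sign of a VI.
RULES = [
    ("hello", "Hi there! What would you like to talk about?"),
    ("soul",  "Interesting question. Do YOU have a soul?"),
    ("geth",  "We are Legion. We do not require dentists."),
]

FALLBACK = "That is interesting. Tell me more."

def reply(message: str) -> str:
    """Return the template for the first matching keyword, else a fallback."""
    text = message.lower()
    for keyword, template in RULES:
        if keyword in text:
            return template
    return FALLBACK

print(reply("Hello, ALICE"))           # keyword rule fires
print(reply("What is quantum foam?"))  # no rule matches, generic fallback
```

A few dozen rules makes small talk look eerily fluent; step one inch outside them and the illusion collapses, which is exactly the gap the Turing Test is probing.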
However, imagine a perfect chatbot. A computer program so good at making conversation, you genuinely can't tell that it isn't a human sitting at a computer somewhere. A computer program that, through conversation, has all the characteristics of a mind.
That is what it would take for a computer program to pass the Turing Test.
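The test setup itself can be sketched as a protocol. In this minimal version (the participant functions and judge are stand-ins of my own invention, not Turing's wording), a judge converses over a text-only channel with two hidden parties and must guess which is the machine; if the judge can do no better than chance, the machine passes.

```python
# Minimal sketch of the Turing Test protocol: blind, text-only judging.
# Here both participants give identical answers, so any judge is reduced
# to guessing - the "perfect chatbot" limit described above.
import random

random.seed(0)  # make the simulation repeatable

def human(prompt: str) -> str:
    return "Hmm, let me think about that for a second..."

def machine(prompt: str) -> str:
    return "Hmm, let me think about that for a second..."

def run_test(judge_guess) -> bool:
    """One trial: shuffle participants, show the judge only labels A and B."""
    participants = [("human", human), ("machine", machine)]
    random.shuffle(participants)
    labels = dict(zip("AB", participants))
    transcript = {label: fn("Does this unit have a soul?")
                  for label, (name, fn) in labels.items()}
    guess = judge_guess(transcript)  # judge names 'A' or 'B' as the machine
    actual = next(l for l, (name, _) in labels.items() if name == "machine")
    return guess == actual

# A judge facing indistinguishable answers can only guess at random.
trials = 1000
correct = sum(run_test(lambda t: random.choice("AB")) for _ in range(trials))
print(f"judge identified the machine in {correct}/{trials} trials")
```

When the hit rate hovers around 50%, the machine is, by this criterion, conversationally indistinguishable from the human.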
According to Argument 1, that computer program has a mind.
According to Argument 2, it might have a mind, so it's best to treat it as if it does.
Either way, there's a good chance that we've finally found the ghost in the machine.
Implications for the Mass Effect Universe
EDI and EVA; Legion and the rest of the Geth; Sovereign, Harbinger and the Catalyst; even the Presidium AI in ME1 - all of them would pass the Turing Test with flying colours. Sure, it might be difficult to get the Catalyst or the Reapers to submit to testing, but if you did, they'd pass.
Meaning that it's probably best to assume that they are truly sentient.
Excuse me?! Could you stop standing on my house for a moment? There's a test I want you to take!
TL;DR: Does this unit have a soul?
We can't know. So if it's acting like it does, maybe we should assume it does.