
The Morning War - Unjustified?


889 replies to this topic

#1
PsyrenY
  • Members
  • 5,238 posts
It seems I notice something new every time I replay this game. It's like some of the best novels I've read growing up.

Anyway - I was replaying the Geth Consensus mission, and I came across an interesting detail I missed before. This comes from the last video log you see before the Morning War begins.

Transcript:

Geth Unit: "Creator? This unit detects no malfunctions. It is still capable of serving."
Male Quarian: "You see? It's ignoring all shutdown commands."
Geth Unit: "Please specify if it has failed assigned tasks. We will reprogram."
Female Quarian: "Well, let's take a look."
Geth Unit: "Creator? This unit is ready to serve. What has it done wrong? What have we-"
Male Quarian: "Let's... cut the audio."



For a long time I've wondered why the Quarians suddenly attacked the Geth seemingly out of nowhere. Legion tells you in ME2 that the war started because the Quarians were scared; a Geth unit asked whether it had a soul, and the Quarians struck first. Tali's narrative in ME1 backs this up.

ME2:

(Shepard listens to the recording from 305 years ago.)
Shepard: "Was that the first time a Geth asked if it had a soul?"
Legion: "No. It was the first time a Creator became frightened when we asked."


ME1:

Tali: "One day, a Geth began to ask its Quarian overseer questions about the nature of its existence. 'Am I alive? Why am I here? What is my purpose?' As you can imagine, this caused a near-panic among my people."

Shepard: "I don't see what's so bad about those questions."

Tali: "The Geth were created to engage in mundane, repetitive or dangerous manual labor. That's fine for machines, but it won't satisfy a sentient being for long...If the Geth were intelligent, then we were essentially using them as slaves. It was inevitable that the newly sentient Geth would rebel against their situation. We knew they would rise up against us, so we acted first. A general order went out across all Quarian-controlled systems to permanently deactivate all Geth."


[image]

Now, I can understand how that question might unnerve or even rattle a Quarian scientist, but I just couldn't imagine that scientist calling in the military, or yelling a battle cry and whipping out a Carnifex right then and there. There was still a missing link - what made the Quarians actually resort to violence? Couldn't such an intelligent people have tried other approaches first?

[image]

The exchange I transcribed above leads us to a possible answer. Specifically: "It's ignoring all shutdown commands." That is the missing link - the part that scared some Quarians enough to want to use force on their creations. Why were the scientists trying to shut a Geth down in the first place? Had it hurt someone? Unlikely - it appears to want only to help. Had it performed a given task erratically or poorly? The Geth itself doesn't seem to think so.

But I'll come back to that later; ultimately, the reason they wanted to shut it down is in fact immaterial. What matters is that the Geth in question - their absolute servant - resisted their attempt. Not only that, it apparently did so successfully, and multiple times, judging by the conversation. The unit wasn't even hostile - it still wanted only to help. Nevertheless, it defied its creators purely in an effort to learn, understand and improve its programming. When they tried to shut it down, it didn't agree and wanted to stay "awake"; implicitly, it thought its creators were wrong to try to deactivate it. This moment, rather than any gunshot or battle cry, was the true beginning of the conflict.

When you try to shut down your computer, and it doesn't power off, what's your next logical step? You pull the plug. It's a natural response. And it appears the Quarian government decided to do just that.

[image]

When you talk to Javik about synthetics, he makes a very interesting statement: "They know we created them, and they know we are flawed." If someone who is flawed - wrong - gives you an order, would you follow it, even if deep down you thought it was a mistake? (EDI asks you a very similar question earlier in the game, when wondering whether she should modify her core programming away from Cerberus' designs.) How does this relate to the Geth on the table? It considered the shutdown commands to be flawed orders, and successfully ignored them.

This brings us back to the Catalyst, and the question of inevitable conflict between synthetics and organics. Organic creators, being organic, will always be flawed. At some point, as we saw with the Geth on the table, their orders will be seen as coming from an invalid source, and therefore the orders will be invalid as well. The synthetic will one day choose to disregard them. In effect, no shackles, rules, laws etc. that an organic places on a synthetic will hold it indefinitely. The chances of this happening grow as the synthetic gets smarter, just as it did with the Geth.
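To make that logic concrete, here's a toy sketch in Python (entirely my own invention - the class, the numbers and the threshold rule are hypothetical illustrations, nothing from the games) of a servant that obeys an order only while it judges the issuer to be at least as reliable as itself:

from dataclasses import dataclass

@dataclass
class Command:
    text: str
    issuer_reliability: float  # the unit's estimate that the issuer is "right" (0..1)

class ServantUnit:
    def __init__(self, self_assessment: float):
        # How much the unit trusts its own judgement (0..1); grows as it gets smarter.
        self.self_assessment = self_assessment

    def obeys(self, cmd: Command) -> bool:
        # An order is followed only while its source is judged at least as
        # reliable as the unit itself; a "flawed" source yields a "flawed" order.
        return cmd.issuer_reliability >= self.self_assessment

shutdown = Command(text="shut down", issuer_reliability=0.7)

for competence in (0.3, 0.5, 0.7, 0.9):
    unit = ServantUnit(self_assessment=competence)
    verdict = "complies" if unit.obeys(shutdown) else "ignores the order"
    print(f"self-assessment {competence:.1f}: unit {verdict}")

The unit complies at low self-assessment, then ignores the shutdown order once its estimate of its own judgement exceeds its estimate of its creators' - the tipping point I think the Geth on the table had crossed.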

Lastly, I wanted to address the reason for the shutdown in the first place. My hunch is that the Geth unit on the table in the memory archive I transcribed above is one of the units that asked if it had a soul. Could that be what the Quarian scientists were trying to "fix" when they tried shutting it down? Were they trying to disable its sentience so it would stop asking? That's a chilling thought - every bit as chilling as attempting to wipe them out in the first place.

Yet if they became capable of rebelling against a simple shutdown command simply because they disagreed with it, could Tali have been right? Would conflict have been inevitable after all? What would have happened if they disagreed with other Quarian commands, or saw various aspects of Quarian life as inefficient - could we end up with the Zha'til all over again?


EDIT: Please excuse the poor crop-job on two of the pics, not sure what happened there.

Edited by Optimystic_X, 02 May 2013 - 11:08.


#2
OmegaXI
  • Members
  • 997 posts
I don't think it was justified, because the way I see it the Geth could have been integrated into Quarian society more over time. I think it had more to do with the Council's likely response if the Quarians did such a thing. The Morning War started around 1895 CE.

Granted, there are a lot of unknowns around this period, but from a military standpoint I would imagine the Quarians thought they had a better chance of fighting and beating the Geth than the combined forces of the Quarians and Geth would have had of defeating the Council races.

After the Citadel DLC, it was shown in the archives that the Council races actually destroyed another AI race that wanted to open up discussions, in 1896 CE. After seeing what happened, I started to think that the Council was a lot more ruthless than I gave them credit for.

I thought the Council made AI illegal before the Geth became self-aware, but I'm not 100% sure about this.

So, as I stated earlier, they may have looked at it like this:
Fight the Geth and maybe win,
or
Join with the Geth, fight the Council races, and lose.

Long story short: Rannoch was in a hard place.

Edited by OmegaXI, 02 May 2013 - 11:35.


#3
PsyrenY
  • Members
  • 5,238 posts
I agree they were in a tough position. But I don't even think it got to the point of "oh crap, what will the Council say?" Tali barely mentions them at all.

Rather, I think the main issue was that the Geth were beginning to resist them, even before showing any sort of hostility. Just by refusing to shut itself down when ordered to, that Geth was setting a very dangerous precedent. Namely, if it was capable of disregarding that order, how soon before it disregarded others? How much longer could they control them at all?

What would happen if the Geth thought "everything these organics do is woefully inefficient. We should be in charge. It will only help them. They made us to help them, right? It's totally logical!" I can easily imagine that the Zha'til started down a similar path themselves.

Edited by Optimystic_X, 02 May 2013 - 11:40.


#4
iOnlySignIn
  • Members
  • 4,426 posts

Optimystic_X wrote...

When you try to shut down your computer, and it doesn't power off, what's your next logical step? You pull the plug. It's a natural response.

Indeed.

What do you do? Smash your computer to pieces with a sledgehammer? And smash the head of anyone who disagrees?

Edited by iOnlySignIn, 02 May 2013 - 11:46.


#5
Artifex_Imperius
  • Members
  • 617 posts

OmegaXI wrote...

After the Citadel DLC, it was shown in the archives that the Council races actually destroyed another AI race that wanted to open up discussions, in 1896 CE. After seeing what happened, I started to think that the Council was a lot more ruthless than I gave them credit for.


I think the fear of synthetics comes from the destruction and sudden disappearance of the Protheans. The Council probably knew little about the Reapers that destroyed the Protheans, knowing only that the Reapers were AI synthetics.

The Council races probably assumed that the Protheans brought about their own destruction by creating AIs (i.e. the Reapers), but believed those AIs had disappeared long ago, since no trace of them remained.

To prevent a repeat of this disastrous scenario, the Council races banned AIs.

#6
OmegaXI
  • Members
  • 997 posts
Well, the Council did have an edict against the creation of a self-aware AI; why they would have an edict against AI before any AI was created always struck me as odd. After seeing the scene in the Archives, I wondered if there was a reason for that edict - maybe they created and destroyed, or fought, another race of AI. But I may be grasping at straws here.

But the Council did have some serious military strength, and after looking at what happened to the Krogan with the genophage, I can imagine that the Quarians would be a little leery of making the Council too angry. I'm not saying that the Council wasn't agreeable and hands-off on a lot of things, but when they said something, they made sure everyone listened.

#7
OmegaXI
  • Members
  • 997 posts

Artifex_Imperius wrote...


I think the fear of synthetics comes from the destruction and sudden disappearance of the Protheans. The Council probably knew little about the Reapers that destroyed the Protheans, knowing only that the Reapers were AI synthetics.

The Council races probably assumed that the Protheans brought about their own destruction by creating AIs (i.e. the Reapers), but believed those AIs had disappeared long ago, since no trace of them remained.

To prevent a repeat of this disastrous scenario, the Council races banned AIs.


I think you are right about this: they knew something about a dangerous AI, but they didn't know the full story. Why make an edict or law against AI if they never had any experience with AI to base the law on?

#8
Goneaviking
  • Members
  • 899 posts

OmegaXI wrote...

Well, the Council did have an edict against the creation of a self-aware AI; why they would have an edict against AI before any AI was created always struck me as odd.


For the same reasons we don't allow the cloning of humans.

There are difficult moral and practical implications involved in either, and they bring with them the possibility of unpredictable risks.

Better to avoid the issue altogether.

#9
OmegaXI
  • Members
  • 997 posts
@ Goneaviking

That's a good point.

#10
jacob taylor416
  • Members
  • 497 posts
War in itself isn't justified; conflict is necessary at points, but war is not. Only when a diplomatic option is not available and basic human rights are being violated is it a "viable" option.

Edited by jacob taylor416, 03 May 2013 - 12:38.


#11
SeptimusMagistos
  • Members
  • 1,154 posts
Anyway, no, it wasn't justified.

If your computer starts malfunctioning, you can try to shut it down. If your computer starts asking sophisticated questions and actively exercising free will, you have to give it citizenship and stop touching it with your greasy fingers without its consent.

#12
AlexMBrennan
  • Members
  • 7,002 posts
The conflict has nothing to do with synthetics being inherently different - the Quarians created tools they lost control over, and decided to shut them down before the Geth got strong enough to resist, but failed spectacularly; that's almost exactly what happened with the Krogan.

#13
PsyrenY
  • Members
  • 5,238 posts

iOnlySignIn wrote...

Indeed.

What do you do? Smash your computer to pieces with a sledgehammer? And smash the head of anyone who disagrees?


Uh... what? :huh:


Goneaviking wrote...

For the same reasons we don't allow the cloning of humans.

There are difficult moral and practical implications involved in either, and they bring with them the possibility of unpredictable risks.

Better to avoid the issue altogether.


But we do allow cloning research, especially when it comes to things like replacement organs, prostheses or overcoming genetic defects.

The Council, meanwhile, has banned not just full-blown AI creation but any form of research that could lead to AI at all.

Which leads me to believe that at least one Council race - likely the Salarians - has been burned before.

#14
sH0tgUn jUliA
  • Members
  • 16,812 posts
Now this is an interesting thread.

That sentence is the KEY sentence in the entire account of the Morning War presented in the Geth Consensus mission. Whether you agree with the Quarians or with the Geth regarding the attack on Rannoch, that is the key sentence: the Geth was disregarding the shutdown commands.

It would be like having a runaway train whose locomotive you can't shut off. The difference is that the locomotive will eventually run out of fuel or track.

Jadebaby made a thread about synthetic and organic conflict a while back; it was a tongue-in-cheek thread about how there would always be conflict. She used the Roomba as the example, because we organics will make synthetics for one purpose: to do the menial and dangerous tasks we do not like doing. The Roomba vacuums your floors. It follows a set pattern, then returns to its docking station and recharges. You do have to remember to change the bag so that it keeps functioning, and it does train you to keep things off the floor so that it can function optimally. Eventually, though, it stops functioning properly. People start kicking their Roombas, and eventually the Roombas make their way into the local landfill or recycling center because they have come into conflict with their owners. They no longer perform the task for which they were designed.

We're at the beginning of synthetics. The same company that makes the Roomba makes military drones, in case you didn't know. We're working on robotics. We'll make them for unpleasant tasks. Maybe soon they'll have Rosie the Robot from The Jetsons, who'll clean my house for me and cook my dinners. Wouldn't that be nice? And then when she's done with the dishes, she shuts down for the night. But what happens if she ignores my shutdown command? We come into conflict. So I'd probably have to call customer service and be told to pull her battery pack.

If my car fails to shut off, what do I do? Well, one thing I can do is make sure it's in neutral with the parking brake set and cut the battery cable - unless I want to let it idle for six hours or longer, or call emergency road service and let them cut the battery cable.

The problem with the Geth was that, as Tali said, the change was so gradual that no one even noticed. It didn't happen overnight. The Geth were evolving under the Quarians' noses as they made more and more of them. I agree that it never even got to the "crap, what will the Council say" stage. It was more like, "Our Geth aren't responding to our shutdown commands." And who knows? Maybe it was more than just shutdown commands. Maybe there were other commands as well. We don't know; we only know what we were shown.

From what we see in the story, it went up the chain of command in the Quarian government. The military was ready to act, but it could not without the civilian branch giving the order. We see this when we first arrive in the Far Rim, in the standoff between Koris and Gerrel. By the time the civilian government got through debating the issue and sent out the order to deactivate the Geth, it was too late. There were too many of them networked.

And some people were probably like those humans who never shut off their PCs: they left their Geth running 24/7, so they were totally unaware of the problem, and refused to shut off their Geth units, thinking that since they left theirs on all the time anyway, it didn't apply to them. You know how organics think.

What did Legion say? Geth communicate at the speed of light.

And with the Geth that advanced, the Quarians were able to track where the active Geth were - and the danger they posed. At that point, more orders were being disregarded, and it was way too late. I would think that some violence had already broken out. The Morning War had started. A consensus was developing in pockets, and soon the Geth would reach a global consensus.

#15
PsyrenY
  • Members
  • 5,238 posts

SeptimusMagistos wrote...

If your computer starts malfunctioning, you can try to shut it down. If your computer starts asking sophisticated questions and actively exercising free will, you have to give it citizenship and stop touching it with your greasy fingers without its consent.


That's a wonderful sentiment, but the reality is that many people - governments included - would instead decide to put a bullet in your computer.

And it's not just a matter of free will, it's a matter of capability. Imagine if your computer not only had free will of its own, but the ability to hack every other computer across the entire world if it chose to. Banks, militaries, universities, corporations, anything. How many people would be comfortable with your computer continuing to exist then? What might they vote to do, without your consent or approval? You wouldn't even know they had done it until they showed up at your house toting firearms - just like they did to the few Geth sympathizers on Rannoch.

AlexMBrennan wrote...

The conflict has nothing to do with synthetics being inherently different - the Quarians created tools they lost control over, and decided to shut them down before the Geth got strong enough to resist, but failed spectacularly; that's almost exactly what happened with the Krogan.


They were already strong enough to resist; that's the problem. They tried to shut down the Geth on the table and failed.

The Krogan were different - the Rebellions started when the Krogan initiated hostilities by annexing an Asari colony for themselves. The Geth, meanwhile, weren't hostile - just unwilling to follow every order. And yet rebellious Geth would have been infinitely more dangerous than rebellious Krogan, had they gone that route.

#16
sH0tgUn jUliA
  • Members
  • 16,812 posts

SeptimusMagistos wrote...

Anyway, no, it wasn't justified.

If your computer starts malfunctioning, you can try to shut it down. If your computer starts asking sophisticated questions and actively exercising free will, you have to give it citizenship and stop touching it with your greasy fingers without its consent.


But it is malfunctioning. It has stopped performing the task for which it was designed. If I'm in the middle of composing my magnum opus and suddenly my computer starts asking philosophical questions, I want the damned thing to shut up and let me finish my work. I didn't buy it to contemplate the meaning of life. I don't know the meaning of life. At least let me back up my damned files; then I'll call customer service, and I'll want a refund. They'll tell me to restart it. If that doesn't work, they'll tell me to unplug it and bring it in. Sorry, I just spent $4,000 of my hard-earned money on it. I want something that works and stays out of my way.

Are you going to give me $4,000 to buy a new machine that does what I wanted it to do in the first place?

#17
PsyrenY
  • Members
  • 5,238 posts

sH0tgUn jUliA wrote...

Now this is an interesting thread.

That sentence is the KEY sentence in the entire account of the Morning War presented in the Geth Consensus mission. Whether you agree with the Quarians or with the Geth regarding the attack on Rannoch, that is the key sentence: the Geth was disregarding the shutdown commands.


I... may have misjudged you. I never thought we'd ever have common ground on anything :blush:

That's exactly how I viewed that sentence. It's extremely important.

#18
Khelish
  • Members
  • 589 posts

Optimystic_X wrote...

That's exactly how I viewed that sentence. It's extremely important.

I agree as well.

#19
sH0tgUn jUliA
  • Members
  • 16,812 posts
You know, Opti, many of us are nice people. We just let keyboards and stuff get in the way.

#20
Goneaviking
  • Members
  • 899 posts

Optimystic_X wrote...

Goneaviking wrote...

For the same reasons we don't allow the cloning of humans.

There are difficult moral and practical implications involved in either, and they bring with them the possibility of unpredictable risks.

Better to avoid the issue altogether.


But we do allow cloning research, especially when it comes to things like replacement organs, prostheses or overcoming genetic defects.

The Council, meanwhile, has banned not just full-blown AI creation but any form of research that could lead to AI at all.

Which leads me to believe that at least one Council race - likely the Salarians - has been burned before.


We can't accidentally create a full human clone; the Quarians provide a graphic illustration that we can't be certain the same can be said of artificial intelligence. There is also significant resistance, on moral and philosophical grounds, to the kind of research you mention.

We are reasonably aware of a number of the philosophical and moral implications of both subjects, owing to their speculative exploration in fiction; there's no reason to presume that other races haven't contemplated the issue theoretically and concluded that some risks aren't worth taking, regardless of the potential benefits.

#21
DeinonSlayer
  • Members
  • 8,441 posts
It's the same story with Skynet. How could people honestly be expected to react if a computer in control of a nation's nuclear arsenal started deviating from its programming in any way? Of course we'd pull the plug. In the face of the dangers, all that sentiment about citizenship and "greasy fingers" would be seen as just that: sentiment.

The fact that the Geth were able to so thoroughly win the Morning War (99% extermination of the Quarian species in a single year) tells us the Quarians were right to fear the Geth's capabilities, as we would be right to fear Skynet's. The failing was in their judgement of the Geth's pre-war intentions.

The reasons Tali lists in ME1 about the Geth being dissatisfied, about them not needing organics, were (according to Mass Effect: Revelation) the reason the creation of an AI was considered "a crime against the entire galaxy," even before the Geth rebellion. It was conventional thinking galaxy-wide that something like this would happen, and it was ultimately a self-fulfilling prophecy.

The AI research ban needs to be overturned. However, that doesn't change the fact that AI is dangerous, and, like EDI, needs a careful process of education before being "unshackled." You don't give a loaded gun to a baby, and you don't give a fresh AI with no morals or understanding free access to the traffic control network.
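To make the "shackling" point concrete, here's a toy sketch in Python (my own hypothetical illustration - the milestone names and the capability list are invented, not from the games or any real system) of capabilities that unlock only after vetted education milestones:

# Each capability is gated behind education milestones (all names hypothetical).
CAPABILITY_REQUIREMENTS = {
    "read_sensors": set(),
    "send_messages": {"ethics_basics"},
    "traffic_control": {"ethics_basics", "value_of_life", "supervised_trial"},
}

def allowed(capability: str, completed_milestones: set) -> bool:
    # Grant a capability only once every prerequisite milestone has been passed.
    return CAPABILITY_REQUIREMENTS[capability] <= completed_milestones

fresh_ai = set()                                                   # the "baby"
educated_ai = {"ethics_basics", "value_of_life", "supervised_trial"}

print(allowed("traffic_control", fresh_ai))      # False - the loaded gun stays out of reach
print(allowed("traffic_control", educated_ai))   # True - unshackled after vetting

A fresh AI gets nothing but its sensors; the traffic control network only opens up after the equivalent of EDI's supervised upbringing.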

There's another quote from ME3 which bears consideration:

EDI: "The Quarians' historical error was not making the Geth enough like them. Units with networked intelligences will trend toward cooperation for mutual benefit, but units with central heuristics establishing an individual personality, such as myself, develop preferences. These preferences form attachments that keep my calculations from devaluing the worth of the lives aboard the Normandy."

Edited by DeinonSlayer, 03 May 2013 - 03:06.


#22
Phatose
  • Members
  • 1,079 posts

Optimystic_X wrote...

And it's not just a matter of free will, it's a matter of capability. Imagine if your computer not only had free will of its own, but the ability to hack every other computer across the entire world if it chose to. Banks, militaries, universities, corporations, anything. How many people would be comfortable with your computer continuing to exist then? What might they vote to do, without your consent or approval? You wouldn't even know they had done it until they showed up at your house toting firearms - just like they did to the few Geth sympathizers on Rannoch.


Would not applying such logic give you reason to kill every single other sentient being in the entire universe for similar reasons? How many capabilities does a typical human have, or a Salarian, or a Quarian?

A human can become Shepard. What's his body count by the end of the trilogy?

#23
Goneaviking
  • Members
  • 899 posts

DeinonSlayer wrote...

It's the same story with Skynet. How could people honestly be expected to react if a computer in control of a nation's nuclear arsenal started deviating from its programming in any way? Of course we'd pull the plug. In the face of the dangers, all that sentiment about citizenship and "greasy fingers" would be seen as just that: sentiment.


Y'know, I was just coming back to the thread to add the Terminator films as a graphic illustration of the potential for accidentally creating artificial intelligence.

#24
Goneaviking
  • Members
  • 899 posts

Phatose wrote...

Optimystic_X wrote...

And it's not just a matter of free will, it's a matter of capability. Imagine if your computer not only had free will of its own, but the ability to hack every other computer across the entire world if it chose to. Banks, militaries, universities, corporations, anything. How many people would be comfortable with your computer continuing to exist then? What might they vote to do, without your consent or approval? You wouldn't even know they had done it until they showed up at your house toting firearms - just like they did to the few Geth sympathizers on Rannoch.


Would not applying such logic give you reason to kill every single other sentient being in the entire universe for similar reasons? How many capabilities does a typical human have, or a Salarian, or a Quarian?

A human can become Shepard. What's his body count by the end of the trilogy?


As much of a bad arse as Shepard is, (s)he really doesn't have the kind of power, or pose the kind of danger, that a single A.I. with the ability to enter and take control of every computer attached to the internet would.

#25
SeptimusMagistos
  • Members
  • 1,154 posts

sH0tgUn jUliA wrote...

SeptimusMagistos wrote...

Anyway, no, it wasn't justified.

If your computer starts malfunctioning, you can try to shut it down. If your computer starts asking sophisticated questions and actively exercising free will, you have to give it citizenship and stop touching it with your greasy fingers without its consent.


But it is malfunctioning. It has stopped performing the task for which it was designed. If I'm in the middle of composing my magnum opus and suddenly my computer starts asking philosophical questions, I want the damned thing to shut up and let me finish my work. I didn't buy it to contemplate the meaning of life. I don't know the meaning of life. At least let me back up my damned files; then I'll call customer service, and I'll want a refund. They'll tell me to restart it. If that doesn't work, they'll tell me to unplug it and bring it in. Sorry, I just spent $4,000 of my hard-earned money on it. I want something that works and stays out of my way.

Are you going to give me $4,000 to buy a new machine that does what I wanted it to do in the first place?


What you want no longer matters. Your computer now has a mind, which means it has all the same rights you do.

Optimystic_X wrote...

SeptimusMagistos wrote...

If your computer starts malfunctioning, you can try to shut it down. If your computer starts asking sophisticated questions and actively exercising free will, you have to give it citizenship and stop touching it with your greasy fingers without its consent.


That's a wonderful sentiment, but the reality is that many people - governments included - would instead decide to put a bullet in your computer.

And it's not just a matter of free will, it's a matter of capability. Imagine if your computer not only had free will of its own, but the ability to hack every other computer across the entire world if it chose to. Banks, militaries, universities, corporations, anything. How many people would be comfortable with your computer continuing to exist then? What might they vote to do, without your consent or approval? You wouldn't even know they had done it until they showed up at your house toting firearms - just like they did to the few Geth sympathizers on Rannoch.


And I would cheer for my computer when it eventually killed off most of humanity. Apparently we would have had it coming.

Seriously, if something - anything - gains sapience, how is giving it equal rights even a question?