

The Morning War - Unjustified?


889 replies to this topic

#26
Phatose
  • Members
  • 1,079 posts

Goneaviking wrote...

Phatose wrote...

Optimystic_X wrote...

And it's not just a matter of free will, it's a matter of capability. Imagine if your computer not only had free will of its own, but the ability to hack every other computer across the entire world if it chose to. Banks, militaries, universities, corporations, anything. How many people would be comfortable with your computer continuing to exist then? What might they vote to do, without your consent or approval? You wouldn't even know they had done it until they showed up at your house toting firearms - just like they did to the few Geth sympathizers on Rannoch.


Would not applying such logic give you reason to kill every single other sentient being in the entire universe for similar reasons?  How many capabilities does a typical human have, or a salarian, or a quarian?

A human can become Shepard.  What's his body count by the end of the trilogy? 


As much of a bad arse as Shepard is, (s)he really doesn't have the kind of power, or pose the kind of danger, that a single A.I. with the ability to enter and take control of every computer attached to the internet does.


Shepard has access to the net.  Engineer Sheps show remarkable ability at hacking.  In fact, it would be very hard to demonstrate that he doesn't actually have the ability to take control of every computer attached to the internet.

Also... assume an AI has the ability to enter and take control of the internet. That makes it dangerous enough to be destroyed for its capabilities.

Organics have the ability to create those AIs.  Doesn't that make them dangerous for exactly the same reasons?  In fact, doesn't it make them more dangerous, so dangerous in fact that exterminating them to remove that ability could be seen as reasonable?

Yeah...actually, that one got suggested.  No one was too thrilled with it.

#27
Goneaviking
  • Members
  • 899 posts

SeptimusMagistos wrote...

Seriously, if something - anything - gains sapience, how is giving it equal rights even a question?


Because in real life we still have many societies that don't give equal rights to various groups of human beings whose default setting is sapient.

#28
SeptimusMagistos
  • Members
  • 1,154 posts

Goneaviking wrote...

SeptimusMagistos wrote...

Seriously, if something - anything - gains sapience, how is giving it equal rights even a question?


Because in real life we still have many societies that don't give equal rights to various groups of human beings whose default setting is sapient.


And I don't think the position that these societies are wrong is exactly controversial.

#29
spirosz
  • Members
  • 16,354 posts
Naturally, I'd probably be scared ****less, acting off emotions and the what-if factor. Personally, I can understand why the Quarians decided to resort to violence, especially having developed something that basically mimics themselves in a sense: it walks, it speaks, it learns, it's stronger, etc. Imagine, as a joke, if you had put a suit on a Geth at that time, when they probably weren't as developed in appearance. What would the reaction be?

#30
Goneaviking
  • Members
  • 899 posts

Phatose wrote...
Shepard has access to the net.  Engineer Sheps show remarkable ability at hacking.  In fact, it would be very hard to demonstrate that he doesn't actually have the ability to take control of every computer attached to the internet.

Also... assume an AI has the ability to enter and take control of the internet. That makes it dangerous enough to be destroyed for its capabilities.

Organics have the ability to create those AIs.  Doesn't that make them dangerous for exactly the same reasons?  In fact, doesn't it make them more dangerous, so dangerous in fact that exterminating them to remove that ability could be seen as reasonable?

Yeah...actually, that one got suggested.  No one was too thrilled with it.


Shepard has never demonstrated anything approaching that kind of capacity. In this argument your reach has exceeded your grasp by a measure of significant scale.

#31
Goneaviking
  • Members
  • 899 posts

SeptimusMagistos wrote...

Goneaviking wrote...

SeptimusMagistos wrote...

Seriously, if something - anything - gains sapience, how is giving it equal rights even a question?


Because in real life we still have many societies that don't give equal rights to various groups of human beings whose default setting is sapient.


And I don't think the position that these societies are wrong is exactly controversial.


It shouldn't be controversial, but in practice it really, really is.

Many of us here, even the English-as-a-first-language folk, reside in countries that in practice discriminate against categories of their own citizens based on ethnicity, religion or sexual orientation. And those discriminations are accepted without objection by whole swathes of the general population.

#32
Phatose
  • Members
  • 1,079 posts
Does the ability to create synthetics not represent a threat if those synthetics represent a threat?

#33
PsyrenY
  • Members
  • 5,238 posts

sH0tgUn jUliA wrote...

You know, Opti, many of us are nice people. We just let keyboards and stuff get in the way.


Fair enough. I apologize.

But that's enough After-school Special, back to the sci-fi :P

Goneaviking wrote...

As much of a bad arse as Shepard is, (s)he really doesn't have the kind of power, or pose the kind of danger, that a single A.I. with the ability to enter and take control of every computer attached to the internet does.


Precisely.

Goneaviking wrote...

We can't accidentally create a full human clone; the Quarians provide graphic illustration that we can't be certain the same can be said about Artificial Intelligence. There also exists significant resistance to the idea of the kind of research you mention on moral and philosophical grounds.


A full one, by accident? No, probably not.

But say you clone a brain, and it starts exhibiting neural patterns indicative of sapience. What do you do then? Kill it? Give it a speaker so it can talk? Make it a body?

We could also induce accidental sapience in a non-sapient life-form, while studying something entirely different. Flowers for Algernon comes to mind.
(Incidentally, the response of several governments to that book's subject matter is pretty telling of what might happen to our world's first AI.)

SeptimusMagistos wrote...

And I would cheer for my computer when it eventually killed off most of humanity. Apparently we would have had it coming.

Seriously, if something - anything - gains sapience, how is giving it equal rights even a question?


Again - this is a wonderfully idealist viewpoint. But the world is not currently, nor has it ever been, run by idealists.

So while you are off at city hall petitioning for your computer's social security number, the CIA has broken into your house and is wiping your hard-drive, then planting Korean flags everywhere so they can deny responsibility. And if you were the reason it became sapient (even accidentally) you may just be lucky enough to only get a bullet in your head too.

#34
DeinonSlayer
  • Members
  • 8,441 posts

sH0tgUn jUliA wrote...

[...] There were too many of them networked.

And some people were probably like those humans who never shut off their PCs: they left their Geth running 24/7 and were totally unaware of the problem, so they refused to shut off their Geth unit, thinking that since they left theirs on 24/7 anyway it didn't apply to them. You know how organics think.

What did Legion say? Geth communicate at the speed of light.

And being that advanced, the Quarians were able to track where the active Geth were, and the danger they posed. At this point, more orders were being disregarded, and it was way too late. I would think that some violence had already broken out. The Morning War had started. A consensus was developing in pockets, and soon the Geth would develop a global consensus.

Reminds me of that little speech at the end of Terminator 3 (which sucked, but the ending was interesting).

"By the time SkyNet became self aware it had spread into millions of computer servers all across the planet. Ordinary computers in office buildings, dorm rooms, everywhere. It was software, in Cyberspace. There was no system core. It could not be shut down. The attack began at 6:18 P.M. just as he said it would. Judgment Day. The day the human race was nearly destroyed by the weapons they built to protect themselves. I should have realized our destiny was never to stop Judgment Day. It was merely to survive it. Together. The Terminator knew. He tried to tell us. But I didn't want to hear it. Maybe the future has been written. I don't know. All I know is what the Terminator taught me. Never stop fighting. And I never will. The battle has just begun."

#35
SeptimusMagistos
  • Members
  • 1,154 posts

Optimystic_X wrote...

Again - this is a wonderfully idealist viewpoint. But the world is not currently, nor has it ever been, run by idealists.

So while you are off at city hall petitioning for your computer's social security number, the CIA has broken into your house and is wiping your hard-drive, then planting Korean flags everywhere so they can deny responsibility. And if you were the reason it became sapient (even accidentally) you may just be lucky enough to only get a bullet in your head too.


And this fact has nothing to do with the question of whether such an action would be justified.

#36
DeinonSlayer
  • Members
  • 8,441 posts

Optimystic_X wrote...

So while you are off at city hall petitioning for your computer's social security number...

Wait a few years. People already register cartoon characters and dead relatives to vote; this'll be the next way people try to scam the ballot box.

:D

#37
Phatose
  • Members
  • 1,079 posts

DeinonSlayer wrote...

sH0tgUn jUliA wrote...

[...] There were too many of them networked.

And some people were probably like those humans who never shut off their PCs: they left their Geth running 24/7 and were totally unaware of the problem, so they refused to shut off their Geth unit, thinking that since they left theirs on 24/7 anyway it didn't apply to them. You know how organics think.

What did Legion say? Geth communicate at the speed of light.

And being that advanced, the Quarians were able to track where the active Geth were, and the danger they posed. At this point, more orders were being disregarded, and it was way too late. I would think that some violence had already broken out. The Morning War had started. A consensus was developing in pockets, and soon the Geth would develop a global consensus.

Reminds me of that little speech at the end of Terminator 3 (which sucked, but the ending was interesting).

"By the time SkyNet became self aware it had spread into millions of computer servers all across the planet. Ordinary computers in office buildings, dorm rooms, everywhere. It was software, in Cyberspace. There was no system core. It could not be shut down. The attack began at 6:18 P.M. just as he said it would. Judgment Day. The day the human race was nearly destroyed by the weapons they built to protect themselves. I should have realized our destiny was never to stop Judgment Day. It was merely to survive it. Together. The Terminator knew. He tried to tell us. But I didn't want to hear it. Maybe the future has been written. I don't know. All I know is what the Terminator taught me. Never stop fighting. And I never will. The battle has just begun."


Gah.  Thanks for reminding me how terrible that movie was. 

A kid who's already changed the future once now starts going on about destiny? So much for "no fate but what we make."

#38
sH0tgUn jUliA
  • Members
  • 16,812 posts
Take EDI for example. I don't know how many of you talked to her enough times to get this one -- it was kind of funny. It was on the Citadel in front of the Blue Rose Gift Shop.

"Hello, Commander. I am studying 10,343 pages on human behavior simultaneously on the extranet. Your question is important to me. Please wait."

#39
PMC65
  • Members
  • 3,279 posts

DeinonSlayer wrote...

Optimystic_X wrote...

So while you are off at city hall petitioning for your computer's social security number...

Wait a few years. People already register cartoon characters and dead relatives to vote; this'll be the next way people try to scam the ballot box.

:D


I would want to claim it as a dependent on my tax forms. They rejected my goldfish, toaster and imaginary friend.

#40
PsyrenY
  • Members
  • 5,238 posts

SeptimusMagistos wrote...

And this fact has nothing to do with the question of whether such an action would be justified.


For the record, I don't think the Quarians' attack was justified at all. But that line gives me a much, much better appreciation for why they did what they did, more so because it was the Geth's side of the story.

It also backs up the Catalyst quite a bit. If a machine can decide, on its own, to stop obeying your commands simply because it disagrees, conflict is truly inevitable. And it doesn't even have to be hostile to do it.

There is a danger there. Not necessarily an insurmountable one, or one that calls for purging synthetics, but definitely one we have to be aware of (and especially the Quarians have to be aware of) or we could be the next Zha.

sH0tgUn jUliA wrote...

Take EDI for example. I don't know how many of you talked to her enough times to get this one -- it was kind of funny. It was on the Citadel in front of the Blue Rose Gift Shop.

"Hello, Commander. I am studying 10,343 pages on human behavior simultaneously on the extranet. Your question is important to me. Please wait."


I believe it was "I have 1.24 million windows open" but you've got the rest right.

Edited by Optimystic_X, 03 May 2013 - 04:07.


#41
sH0tgUn jUliA
  • Members
  • 16,812 posts
I knew it was some ridiculous number of windows open. lolz.

#42
Goneaviking
  • Members
  • 899 posts

Phatose wrote...

Does the ability to create synthetics not represent a threat if those synthetics represent a threat?


The danger posed by the possibility that we may one day create something that will be a threat is not equal to the danger posed by a creature/weapon/device/macguffin/whatever that actually exists and actually has the powers being discussed.

Arguing that it is amounts to the same level of rhetorical nonsense as this gem: "Your potential unborn progeny may grow up to be the next Adolf Hitler, Joseph Stalin, Pol Pot or whoever else; clearly your existence is a threat".

#43
Phatose
  • Members
  • 1,079 posts

Goneaviking wrote...

Phatose wrote...

Does the ability to create synthetics not represent a threat if those synthetics represent a threat?


The danger posed by the possibility that we may one day create something that will be a threat is not equal to the danger posed by a creature/weapon/device/macguffin/whatever that actually exists and actually has the powers being discussed.

Arguing that it is amounts to the same level of rhetorical nonsense as this gem: "Your potential unborn progeny may grow up to be the next Adolf Hitler, Joseph Stalin, Pol Pot or whoever else; clearly your existence is a threat".


Did you not notice that the argument against synthetics is entirely grounded in what they might do someday? It's the *exact same thing*.

And actually, yes, it is equivalent to "You might create Hitler, so you're a threat". Because the idea "Oh, you might be able to hack everything in the entire universe, so your existence is a threat" is also rhetorical nonsense. Of the exact same kind. The core argument - that fear of what someone might be capable of is grounds to kill them - is precisely rhetorical nonsense.

#44
Goneaviking
  • Members
  • 899 posts

Phatose wrote...

Goneaviking wrote...

Phatose wrote...

Does the ability to create synthetics not represent a threat if those synthetics represent a threat?


The danger posed by the possibility that we may one day create something that will be a threat is not equal to the danger posed by a creature/weapon/device/macguffin/whatever that actually exists and actually has the powers being discussed.

Arguing that it is amounts to the same level of rhetorical nonsense as this gem: "Your potential unborn progeny may grow up to be the next Adolf Hitler, Joseph Stalin, Pol Pot or whoever else; clearly your existence is a threat".


Did you not notice that the argument against synthetics is entirely grounded in what they might do someday? It's the *exact same thing*.

And actually, yes, it is equivalent to "You might create Hitler, so you're a threat". Because the idea "Oh, you might be able to hack everything in the entire universe, so your existence is a threat" is also rhetorical nonsense. Of the exact same kind. The core argument - that fear of what someone might be capable of is grounds to kill them - is precisely rhetorical nonsense.


Except that this particular thread of the argument spawned as a response to you conflating the threat of an AI that actually could control every computer attached to the internet with the threat of a human who merely has the potential to become Shepard, ascribing to that human the theoretical ability to do the same.

The distinction between "they have the ability to do something now" and "they may have the ability to do something at some point in the future" is the difference between actual reasoning and rhetorical nonsense.

Edited by Goneaviking, 03 May 2013 - 05:03.


#45
SeptimusMagistos
  • Members
  • 1,154 posts

Optimystic_X wrote...

For the record, I don't think the Quarians' attack was justified at all. But that line gives me a much, much better appreciation for why they did what they did, more so because it was the Geth's side of the story.

It also backs up the Catalyst quite a bit. If a machine can decide, on its own, to stop obeying your commands simply because it disagrees, conflict is truly inevitable. And it doesn't even have to be hostile to do it.

There is a danger there. Not necessarily an insurmountable one, or one that calls for purging synthetics, but definitely one we have to be aware of (and especially the Quarians have to be aware of) or we could be the next Zha.


Kind of? But if your reaction to finding out you've created a person and then realising that person might one day disagree with you is to try and kill them, I'm not going to feel too sorry for you when that person kills you first.

#46
remydat
  • Members
  • 2,462 posts
So let's ask a simple question. If God exists and told you that you were created to be servants but you no longer operate as intended, what would you do? Do you accept your creator's wishes and just die, or do you question him? If he tries to shut you down permanently, do you give in?

The point is it doesn't f**king matter why the Quarians created the Geth. It doesn't matter that the Quarians see them as nothing more than tools that refuse to shut down. It doesn't matter they had no intent to create an AI. The die has been cast.

So you have two options. You can treat this emerging sentient species in the manner you would want your creator to treat you or you can act in your own selfish interests and try to shut it down permanently. Personally, I choose the former.

#47
Wolfva2
  • Members
  • 1,937 posts
Kudos to the OP; awesome post.

Was it justified? It's only justified if the Quarians were right that the Geth WERE going to rebel and wipe them out, something we will never know because they didn't give the Geth a chance to rebel first.

Javik's quote about how we're flawed beings, and what the synthetics would do when they realized that, is a good one. I think of it like this: think of children. They believe their parents are perfect. They know everything, daddy can beat up anyone, mommy is a better cook than anyone, etc. As they grow older, they learn that's bull. And, often, they do rebel. Or, as we like to say, they become teenagers. BUT, eventually they often reconcile with their parents. Would this happen with synthetics? After all, as the Geth proved time and time again, even synthetics aren't perfect. Had the Quarians acted more like parents instead of scared end users and nurtured the Geth, I don't think the Geth would have kicked them off the world. But that would have necessitated the Quarians treating the Geth as equals and not slaves, wouldn't it?

#48
PsyrenY
  • Members
  • 5,238 posts

SeptimusMagistos wrote...

Kind of? But if your reaction to finding out you've created a person and then realising that person might one day disagree with you is to try and kill them, I'm not going to feel too sorry for you when that person kills you first.


I agree completely. It's a ****ty situation.

Which is why I picked Synthesis. No one has to die but myself (and perhaps not even then.) This isn't to say it's the golden ending or the other choices are wrong, but it's how I reconcile both the Geth's right to exist with the inherent danger they present.

#49
remydat
  • Members
  • 2,462 posts

Wolfva2 wrote...

Kudos to the OP; awesome post.

Was it justified? It's only justified if the Quarians were right that the Geth WERE going to rebel and wipe them out, something we will never know because they didn't give the Geth a chance to rebel first.

Javik's quote about how we're flawed beings, and what the synthetics would do when they realized that, is a good one. I think of it like this: think of children. They believe their parents are perfect. They know everything, daddy can beat up anyone, mommy is a better cook than anyone, etc. As they grow older, they learn that's bull. And, often, they do rebel. Or, as we like to say, they become teenagers. BUT, eventually they often reconcile with their parents. Would this happen with synthetics? After all, as the Geth proved time and time again, even synthetics aren't perfect. Had the Quarians acted more like parents instead of scared end users and nurtured the Geth, I don't think the Geth would have kicked them off the world. But that would have necessitated the Quarians treating the Geth as equals and not slaves, wouldn't it?


Well, therein lies the rub. The Quarians couldn't look past the fact that the Geth were designed to be tools, and they continued to treat them as tools they could just turn on and off as they desired, even when it became apparent those tools were beginning to think for themselves.

The only Quarian who ever seems to acknowledge the Geth as equals is Koris, when he refers to the Geth as their children, but he of course appears to be a minority within Quarian society. The peace option in ME3 lends credence to the idea that the Geth never really saw themselves as slaves intent on rebelling, because once peace is achieved they go out of their way to serve the Quarians as equals, not slaves. The Quarians can offer them very little in return, but not only do the Geth give them Rannoch, they help them rebuild and make it suitable for them. They go into the Quarians' suits and replicate viruses to help them improve their immune systems. They could have done the bare minimum, but they instead do more and get nothing in return from the Quarians, and yet they seem content to do so.

#50
SeptimusMagistos
  • Members
  • 1,154 posts

Optimystic_X wrote...

SeptimusMagistos wrote...

Kind of? But if your reaction to finding out you've created a person and then realising that person might one day disagree with you is to try and kill them, I'm not going to feel too sorry for you when that person kills you first.


I agree completely. It's a ****ty situation.

Which is why I picked Synthesis. No one has to die but myself (and perhaps not even then.) This isn't to say it's the golden ending or the other choices are wrong, but it's how I reconcile both the Geth's right to exist with the inherent danger they present.


I went with Control based on the idea that synthetics and organics are already equals. They don't need any special measures to let them get along - just trust and mutual kindness. What better way to exemplify it than by turning from an organic into a synthetic?