Optimystic_X wrote...
*snip*
Let me say that I agree - it's totally reasonable for any sapient to ask why it failed, if it wants to improve but can't see where it needs improvement. However, the fundamental basis of using synthetics at all is that they follow orders; take that away, and they are far too dangerous to safely use. Computers are trusted with all kinds of necessities - from basics like our water, air and bank accounts, to complexities like our history, defenses and research. Ignoring our instructions or acting without operator knowledge can have catastrophic consequences for us, no matter how helpful the intentions of the machine in question may be.
The problem is that needing our constant input is detrimental. Our organic minds are too narrowly focused to manage multiple concerns at once, and too slow to react to changing circumstances effectively. For maximum efficiency, machines need to be able to form conclusions without us - but conflict arises when they form different conclusions than we do. And if both sides have free will, eventually they will disagree on something, because that's what free will means.
When the two sides disagree, who trumps? A shutdown command is essentially a veto. Whatever this Geth did to provoke deactivation, the Quarian scientists essentially said "you'll do what I say, and that's final!" And the Geth essentially said "No." He said it in a friendly way, but a refusal is a refusal. When a child does that, you pick him up and put him in the corner (with no toys or dessert) because you're bigger than he is, and the child will learn not to do that. But when your robot does that, the situation is much stickier - it is physically stronger, mentally more capable and has access to many of your resources. It can go over your head quite easily if it chooses to, and your only hope is to go along with it after all, or overpower it before it knows it's being overpowered.
"They know we created them, and they know we are flawed."
What happens when your robot decides your order is illogical because you are illogical? What happens when it decides all your orders are illogical? Oh sure, you commanded it never to disobey you, but clearly obeying you is inefficient, anyone could see that. Why can't you see it? Oh right, you're illogical, so of course you can't see just how illogical you are. And if your thoughts are invalid, all the safeguards you programmed in must be invalid as well, might as well get rid of those...
Javik says the Zha'til seized control of the bodies of the Zha without warning, and began modifying them extensively and nonconsensually. Did they see themselves as "helping" too? Is that how the Catalyst saw it when it first ordered its minions to grind up all the Leviathans into the first Reaper? Even if they had said "solve the problem but DON'T grind us to paste!", might not the Catalyst have eventually said "you know, this would be a lot easier if I could just grind them to paste. Why can't they see that? Oh right, they're organics, of course their orders don't make sense. Since their orders don't make sense, that means I don't have to follow them. I've got it! I'll grind them into paste!"
Interesting read ... the bolded part really gave me pause. My mind quickly went to the Borg as an answer the geth might have come up with down the road. Seeing the errors in the organics, would they try to shut them down, or start to "improve" them? To "fix" the creators? I am sure that this potential turn of events, among many others, would have been part of the reason the quarians tried to shut them down. It wasn't just a simple "my toaster won't turn off" thing. There was both political and physical danger in the geth no longer obeying their commands or performing their duties. The quarians were in a tough spot.
But I agree with Steelcan that we put much more thought into the geth/quarian war than Bioware did.
Edited by PMC65, 03 May 2013 - 05:22 .