Goneaviking wrote...
Phatose wrote...
Optimystic_X wrote...
And it's not just a matter of free will, it's a matter of capability. Imagine if your computer not only had free will of its own, but the ability to hack every other computer across the entire world if it chose to. Banks, militaries, universities, corporations, anything. How many people would be comfortable with your computer continuing to exist then? What might they vote to do, without your consent or approval? You wouldn't even know they had done it until they showed up at your house toting firearms - just like they did to the few Geth sympathizers on Rannoch.
Wouldn't applying that logic give you reason to kill every single other sentient being in the entire universe? How many capabilities does a typical human have, or a salarian, or a quarian?
A human can become Shepard. What's his body count by the end of the trilogy?
As much of a bad arse as Shepard is, (s)he really doesn't have the kind of power, or pose the kind of danger, that a single A.I. with the ability to enter and take control of every computer attached to the internet does.
Shepard has access to the net. Engineer Sheps show remarkable ability at hacking. In fact, it would be very hard to demonstrate that Shepard doesn't actually have the ability to take control of every computer attached to the internet.
Also... assume an AI has the ability to enter and take control of the internet. That makes it dangerous enough to be destroyed for its capabilities.
Organics have the ability to create those AIs. Doesn't that make them dangerous for exactly the same reasons? In fact, doesn't it make them even more dangerous, so dangerous that exterminating them to remove that ability could be seen as reasonable?
Yeah...actually, that one got suggested. No one was too thrilled with it.