Some of the quotes have been cut for the sake of brevity.
[quote]llbountyhunter wrote...
[quote]JShepppp wrote...
let me use this example a friend brought up:
that's like me destroying your TV because I said it will attract flesh-eating zombies.
You don't know if it's true, or if I'm lying,
but you can't prove me wrong either, because the TV doesn't exist anymore.
Just because it can't be disproven doesn't mean it's right.
[/quote]
An analogy would need something that could "evolve" repeatedly to the point where it can become an unstoppable threat.
An example came up earlier in the thread along the lines of: kill a certain animal because it will create a disease that will poison all humans. That example only becomes relevant if (a) you believe that you "save" the animal in such a form that it still exists fully, and (b) the disease will grow/change/evolve into a global pandemic and kill off all animals. The latter would be the singularity. The disease, like synthetics, exists pre-singularity (before it becomes an epidemic) and post-singularity (the point of no return).

The animal is the organic, who we assume to be important as a baseline assumption. We also assume that we're not killing them but "saving" them by whatever process we use. Obviously the animals don't like it, but they don't know that if we don't do it they'll all eventually die from this disease anyway. Some animals may fight back and we may have to kill them. But the rest we can "ascend" into whatever form we deemed fully acceptable beforehand, one that, from our point of view, preserves them fully. We view this as a win/win situation with regrettable losses for the pigs (just throwing out an animal) who disagree. The pigs will all die anyway, so if we kill them, we're just doing what would have happened later.
But the key point is that stopping the disease is so important that we are prepared to kill ALL pigs to save all OTHER animals. Stopping the disease (the singularity) is the primary goal. Saving the pigs (the organics) is secondary but preferable. As TIM and EDI said, if they wanted to destroy us utterly, they could. Maybe there were more advanced cycles some time back (e.g. races that actually added to the Crucible and understood it) that the Reapers simply killed off because they were too powerful and "costly" to try to subjugate.[/quote]
BUT YOU DON'T KNOW!!![/quote]
Would you be willing to take the chance, given the stakes? If you are, feel free to pick destroy. It's the manifestation of the ultimate disagreement with the Catalyst's arguments.
[quote]You can't compare synthetics to a virus just because they evolve. A virus has been shown in the past to be hostile; synthetics have not. The Starchild's logic is rooted in a "maybe synthetics will revolt, even though they never have in the past."[/quote]
It doesn't matter who starts the war. If organics start it (basically saying "synthetics, turn yourselves off or we'll do it for you") and the synthetics say "no," it's technically a form of rebellion, even though they're really acting in self-defense.
In the past, the Geth and Quarians fought. We know that war is very possible; synthetics need not be the aggressors, but they will not passively accept organics' demands if those demands threaten their survival (assuming they're advanced enough to have such a concept).
The virus analogy was meant to show how out of control the problem can become if left unchecked. It doesn't really represent the rebellion part.
Rebellion will occur whenever there is conflict between synthetics and organics, because the synthetics are disobeying their creators' orders, no matter how horrible those orders may be. The "initiation" of the rebellion is not the attack; it is the defiance, which leads to the conflict and the war.
[quote]It's making up an imaginary threat, just like the "TV attracts flesh-eating zombies" analogy.
[/quote]
We see rogue AIs all the time in ME; there's even a law in Citadel space banning their creation. As for the singularity, we haven't seen that. If you take the singularity as inevitable - that AIs will eventually surpass organics - then it becomes a problem. Once an AI reaches that level of power, by the definition of the singularity, it would keep it forever.