RiouHotaru wrote...
IanPolaris wrote...
Absolutely false. You are forgetting that the Reapers are sentient and able to make moral choices, as is the Catalyst (that's what it means to be an AI and sentient). Just because you refuse to do what the Catalyst wants does NOT obligate the Reapers to destroy all advanced civilizations in the galaxy. They CHOOSE to do it anyway, and you are not responsible for that choice.
In short, I utterly reject the notion of 'negative responsibility'. OTOH, with Synthesis, you are CHOOSING to change everyone at the most fundamental and intimate level without their consent. That means you ARE responsible. That's why Synthesis is a human rights violation while Refusal is not.
-Polaris
The Reapers are programmed, as is the Catalyst. The only fault of the Catalyst is that it's following its own programming.
In Refusal, Shepard chooses to do nothing on moral grounds. In response, the Catalyst ALSO does nothing (since the Catalyst cannot act on the choices given to Shepard, only Shepard can enact them) and the Cycle continues.
Your argument assumes the Catalyst is capable of taking action outside of its programming, which the game states it can't.
You can't argue around the fact that SHEPARD IS AT FAULT.
So basically you are admitting that you are morally responsible for not trying to help every homeless person you see?
You are stretching the morality-from-inaction notion very far here. In fact, you cannot stretch the idea that far, because Shepard had no idea what would happen when he refused. Refusal was an unknown quantity to Shepard when he made his decision; Synthesis was completely known. How can you be morally guilty of something you genuinely don't know will happen?
Edited by Grimwick, 02 July 2012 - 07:41.