But won't the choice that we evaluate as carrying the least risk of an undesirable outcome also "feel right"? If so, then what we count as an undesirable outcome, how probable we deem it to be, and how we weigh the risk-reward ratio are how we arrive at the conclusion. How we make these calculations is what matters here, and it's there that our morals and our personal and cultural biases lie. I'd say that because of this, deep down, none of our decisions is independent of our personal ethics and biases. Our rational decision-making is almost never free of individual and cultural bias, and our intuition is almost never free of rational thought. They are so dependent on each other that I find it problematic to try to differentiate between the two.
I don't know about you, but I felt like sh*t after killing Mordin while being completely convinced that sabotaging the cure was the only sane and rational way to act. With other Shepards, I actually looked for a way to rationalize the cure so I could make that decision without feeling stupid. I couldn't. Eventually I ignored the problem and chose whichever outcome felt right for that character, but every single time I resented that Bioware pushed a stupid choice just because it was the intuitively good one.
The conflict is ultimately between what you would prefer to work (the cure - who wouldn't want that to work, really? It feels like the right thing to want) and what you rationally evaluate will work (the sabotage). If you tell me you never experienced such a conflict, I'll voice my suspicion that you've been deceiving yourself. What you feel is about who you are; what you rationally evaluate is about how the world is. The whole point of such an evaluation is to be as detached from what you feel as you can manage, or there would be no point to it at all, and that is where the potential for conflict lies.
Bias comes in when you decide whether the outcome is worth it. There can be no objective way to decide that; in some way it's always personal. Say you recognize an equal risk of things going wrong or right. Do you take the risk, or do you avoid it? That's personal. The evaluation of the risk itself, however, can be done in a reasonably objective way. Many people are biased in that as well, because they want their "right" choice to also be the rational one, but more often than not that's easily recognizable as self-deception.
Here's another example, from DAO: the Anvil of the Void decision. I can strongly suspect someone will abuse the Anvil if I save it. Do I think it's worth saving anyway? I know my personal bias tells me to always save it, for unrelated reasons, and I often do save it, but I also know quite well there will be very undesirable side effects. That's the result of rationally evaluating the outcome; to come to a different conclusion and say there will be no abuse would mean denying human nature. It's as objective as things can be, and if no abuse ever materialized, that would be a very big surprise. It's all a matter of going into the decision with open eyes, without attempting to deny things you might not like.
Rational evaluation, to me, means that people can recognize the risks inherent in their preferred choices as easily as those in choices they intuitively reject, and that they can do so regardless of what they feel about either. Nobody's perfect at that, but I think we can and should aspire to become better at it. Personally, and with regard to RL, I consider such aspiration a moral imperative.