Fast Jimmy wrote...
I'll cut out the rest, if you don't mind
I don't mind.

Not at all. I'm simply asking for moral clarification.
Are you making this choice to help these innocents/broker a peace/save the village/what have you because you believe it is the right thing, or because you believe it will result in the best outcomes?
If it is because it has the best outcomes, then does it matter as much? Should "best outcomes" even be an objective result? One player in another thread today said they would just pull up a guide online to figure out how to get the "best outcomes."
What I suggest is a game where that term doesn't even apply. I'm not talking about making things dark and suffering just because, but to offer choices, REAL choices, where people have to say what they want... at the possible expense of other things, which they may also want.
Again, I'm not disagreeing with you on the basic idea in the bolded portion of your post. The problem is where we start discussing "best outcomes" and what we think of them. The outcomes usually break down into this: (i) everyone lives (ii) some people live, but life is bad for others and (iii) crapsack ****hole.
Not all choices have to be between paradise, being fisted and being gored.
Even if choices do amount to a choice between those three positions, the problem lies in how the actual choice is crafted. As I tried to illustrate with the series of leading questions I first asked when wading into this topic, players will fight your hypothetical hard when you try to create the kind of dichotomy you want. And they're right to do so, because these "hard choices" are entirely artificial.
Does the strength of Bioware's character-crafting lead people to choose the well-being of those NPCs over the general well-being of other entire groups? Do the already-established battle lines that can be seen here on the BSN, such as "pro-Mage," "pro-Dalish," "anti-Andrastian," lead people to take those stances mindlessly, without thought, opinions already formed before they even know the context of the choice? Or will players have to consider that they may have to work with people they wouldn't like in order to ensure that the groups they DO care about (Dalish, Apostates, nugs, whatever) all live to see the end of the conflict somewhat intact?
If there's anything the werewolf choice illustrates, it's that players are "pro-reducing suffering" and "pro-helping-the-people-in-front-of-them." And then the choice you want to craft is (i) kill and torture people now or (ii) kill and torture more, different, people later. That's not a meaningful moral choice. That's just throwing it in people's faces that they'll have to kill others.
Video games have gotten into a very nasty habit of saying they are offering choices, but really just encouraging pre-made templates to be followed, such that players already know what they are going to do before they even know what doing said choice entails. That, to me, is a real fallacy with people in general in today's world and I would love to see a fictional, interactive setting work to make people question what they really believe and how they really represent their ideals.
As I illustrated above, what you're advocating is no different. I mean, the choice I invented with the Couslands follows exactly the same template. It's a simple morality and a simple binary choice.
"I want to do X or support Y... but can I afford to? Will the cost of doing so be worth it?"
Again, in the abstract, I agree. But in practice all the choices suggested seem to amount to (i) kill and torture people now or (ii) kill and torture more, different, people later.
Would such a system reveal people to be fanatics about their beliefs, or truly devout? Fair weather fan of an idea, or pragmatist compromiser? Doing what needs to be done to get the job done, or sacrificing lives so that the actual blood won't appear on YOUR hands?
Pragmatist compromise just means killing the least important people now, typically.
These are what I am striving for. TRUE morality questions. Not just "side with whatever team you want, complete all the side and loyalty quests and run this game on auto-pilot." That's not a riveting experience. That doesn't make the player look at anything with a discerning eye, or walk away with any experience that can't be found in a Mario game.
As Maria explains, moral problems aren't difficult. All those problems you see in philosophy aren't suggested because the answer is hard; they're suggested because they undercut a moral theory. The interesting question is whether or not the theory is still logically sound in the face of the example. But the moral choice itself is not usually hard.