
A matter of consequences


237 replies to this topic

#176
Fast Jimmy

  • Members
  • 17 939 messages

Sylvius the Mad wrote...

Now, you're examining the choice from the player's perspective, rather than the Warden's perspective, which changes the arithmetic considerably.


Which brings us back to what I said earlier - many players don't look at it from a role-playing perspective or consider the choice from their character's perspective. They approach it only as players who want to get the best endings. Which is why Bioware should work to create situations that encourage players to make hard choices with no clear, optimal outcomes. That way, those who play self-inserts will have a real moral dilemma to struggle with. They may then, for the first time, be inclined to ask "what kind of character would choose X instead of Y?" - which may push those players into role-playing different characters instead of just doing self-inserts or characters with no sense of complexity.

And, for the record, I think the train car thought experiment is also very interesting, because a person's answer can tell us quite a bit about how they view action vs. inaction. I, personally, don't see how anyone could possibly offer a moral justification for pulling the switch to kill the smaller group.


Unless, of course, the people aren't all strangers. If by flipping the switch you could kill the smaller group and save the larger group, which consisted of your family, friends and/or loved ones, I think the question becomes much different for many people. And the same goes for the choice between action and inaction if the smaller group contains people important to the decision maker.

But that aside, even if all parties are random strangers... I'm surprised at this conclusion, especially since it runs counter to nearly every RPG, ever. The role the PC plays involves killing dozens, hundreds, possibly even thousands of humanoids or other sentient beings who are, in most cases, the "bad guys," in order to save the world at large. If the PC took no action to save the world, many people would die or face categorically bad fates... but through their action, a smaller group will, undoubtedly, die.

Is all of that okay because bad guys are intrinsically bad and, therefore, expendable? Because that is an incredibly over-simplified set of results.

#177
Gwydden

  • Members
  • 2 815 messages

In Exile wrote...
I don't see why it would be natural at all. It would have to be the case that the Dalish are so petty and spiteful that they would risk the eradication of all life in Thedas to... punish the Warden for not saving Zathrian's life in events that you don't even have to disclose to them? 

The straightforward story that Zathrian had to make a heroic sacrifice to end the curse is perfectly believable, and frankly wouldn't even involve telling them the truth about their liar of a leader.


First, the Blight would only have consumed Ferelden, which is a rather small and unimportant kingdom. Had it succeeded there, the darkspawn would have proceeded to Orlais, a powerful empire with hundreds of Grey Wardens. It would have been even easier for the Dalish to just run to Rivain and let the humans take care of themselves. In character, too, since that's essentially what they did in the Second Blight - no spite required.

Second, I'm just describing what I thought would happen my first time doing that quest, because I didn't think the Dalish would believe the Warden at all.


You say this, but again, all of the examples I see come down to exactly that. Your description of what should happen in the Dalish quest is all about punishing the player - unless you kill the werewolves or the Dalish (punishment by killing perceived innocents), then I, the game developer, will kill innocents in the endgame. So, trolololol, no matter what you do, innocents have to die!


Here's the thing. Every time I play Nature of the Beast, if I choose to save only the werewolves or only the Dalish, I feel like I'm sabotaging myself. If I save both, I feel like I didn't make a choice at all. Also, you're assuming every player/PC cares about the innocents who hypothetically die in the endgame as much as about the Dalish and the werewolves.

Again, I agree with you. The idea of a "best" ending is stupid. But when the entire moral dilemma is "let more people die" or "sacrifice your principles", then you haven't crafted a particularly deep conflict or moral dilemma. You've just asked: how personally ****ty are you willing to feel, player, in return for the obviously better outcome where the majority of innocents survive?


I agree that it shouldn't be the entire moral dilemma. I'm aware that it would get old quickly. Actually, the endgame choice in DAO is not about more or fewer or different innocents dying but strictly personal, and I loved it. Would you say that choice punished the player, since the PC couldn't survive without letting someone else die in their place or getting their hands dirty?

Look at TW. It doesn't create hard choices by taking two innocent puppies and making you murder one to bathe in its blood. It gives you ****ty people. Unlikeable people. There's no way your hands are clean, and frankly most of the time it just makes you want to walk away. The interesting question it asks is what you do when almost everyone is rotten. And even TW1 and TW2 both have their rousing and resounding hero moments.


However, most relevant characters in the Dragon Age series, from companions to antagonists, are intended to be either loved (or at least liked) or hated. They're rarely repellent or disgusting. How would you create hard choices then?

I just don't flounder like that. If there's a greater good, then it's the obvious choice.


I must insist that times of crisis really put your principles to the test. I don't think you can be certain of what you would do in an extreme situation without having been through one. Now, maybe in-game those choices come easily for you, but the point stands that morals are a rather flexible thing.

Also, you don't need to reevaluate your ideals for a choice to be hard. The Virmire choice in the first ME is not really a moral conflict, but it still is a hard choice. Of course, if you considerably prefer one character over the other, you might be a little biased when making it. Is the Virmire choice a punishment, since there's no way you can save both?

And even if you do consider decisions like that "punishments", do you really believe that those, in small doses, are a bad thing in an RPG?

#178
Jorji Costava

  • Members
  • 2 584 messages

Sylvius the Mad wrote...

And, for the record, I think the train car thought experiment is also very interesting, because a person's answer can tell us quite a bit about how they view action vs. inaction.  I, personally, don't see how anyone could possibly offer a moral justification for pulling the switch to kill the smaller group.


Well, there's straight-ahead utilitarianism, the doctrine of double effect, killing vs. letting die, etc. Are these good justifications? I'm not so sure about that, but they're out there and they have their defenders.

On hard moral choices:

Think I'm mostly in agreement with Fast Jimmy on this one, although for somewhat different reasons. In general, I don't think the Big Moral Choices should have clearly 'correct' outcomes, since these choices tend to reflect the player's (or the character's) values. Having one choice turn out clearly better than the other seems like a tacit way of passing judgment on those values, and I don't think it's a great idea for the game to be in the business of doing that. I'm thinking of how frustrating it was for many Renegade players of ME to see their choices come to naught. So it's not strictly a matter of encouraging role-playing so much as of permitting it, in a way.

That's not to say that there can't be any choices (not necessarily the Moral ones) in which one outcome is clearly better. For instance, a high score in a specific ability might give you access to dialogue options other PCs wouldn't have, or completing certain prior quests might open up a desirable 'third' option in a choice later on. That seems little different to me than structuring the game so that players who spent more time level grinding or collecting loot have an easier time with dungeon crawls, etc. So I don't think you need a game which consists of Sophie's Choice after Sophie's Choice, but when the PC's most deeply held values are at stake, it's probably best to keep things a bit more grey when possible.

EDIT: Fixed quote

Edited by osbornep, 22 September 2013 - 02:31.


#179
Sylvius the Mad

  • Members
  • 24 117 messages

Fast Jimmy wrote...

Which brings us back to what I said earlier - many players don't look at it from a role-playing perspective or consider the choice from their character's perspective. They approach it only as players who want to get the best endings. Which is why Bioware should work to create situations that encourage players to make hard choices with no clear, optimal outcomes. That way, those who play self-inserts will have a real moral dilemma to struggle with. They may then, for the first time, be inclined to ask "what kind of character would choose X instead of Y?" - which may push those players into role-playing different characters instead of just doing self-inserts or characters with no sense of complexity.

Again, why should BioWare be in the social engineering business?  All we really need to care about is whether the games allow us to roleplay, not whether they encourage (or require) others to do so.

Unless, of course, the people aren't all strangers. If by flipping the switch, you could kill the smaller group and save the larger group, which consisted of your family, friends and/or loved ones, I think the question becomes much different for many people. And the same goes for the action/inaction of the small group, if it contains people important to the decision maker.

That would be a selfish justification, not a moral one.

#180
Angrywolves

  • Members
  • 4 644 messages
Bioware doesn't care, nor should they care, about how players play the game.
They've been warned about the consequences.
For example, if an evil player commits genocide in the game, gets a "bad ending" and is unhappy with that ending, it's not Bioware's fault.
It's the player's fault.
They've been warned.
Bioware has done their duty and now just needs to sell the game.

#181
Vaeliorin

  • Members
  • 1 170 messages

Sylvius the Mad wrote...

In Exile wrote...
Look at the discussion in this forum. The complaint about the Connor and Dalish questlines is that the choice is too obvious for the player - save everyone - and no one hesitates. Because, obviously, saving everyone is the evidently moral thing to do that we all want.

I'd just like to reiterate my position that I love the Connor choice.  I think a Warden needs to be unbelievably reckless in order to choose the "save everyone" option.  This is even foreshadowed by the initial decision whether to defend Redcliffe.

I disagree that the Warden must be reckless in order to choose the save-everyone option. The Warden and 3 companions just wiped out Connor's forces, potentially without anyone additional dying. There are potentially 6 more companions who could sit in Redcliffe keeping Connor contained while the Warden goes off to the Circle. Bioware may not have scripted it that way, but I think that is a failure on Bioware's part, not evidence of a reckless Warden.

I guess what that all boils down to is that I don't object to difficult choices, but I object to being forced to pick among difficult choices when there's a completely obvious better option that I'm inexplicably not allowed to choose.

#182
Guest_Guest12345_*

  • Guests
Fearing and anticipating consequences is what makes RPGs so good. Hell, fearing and anticipating consequences is WHY the data-import feature is so beloved.

People spent years in between ME1 and ME2, and between ME2 and ME3 fantasizing about the consequences of their choices. The potential of what may be is never fully realized, but it is a huge motivator for replayability. I know I replayed ME1 and 2 in prep for ME3 numerous times because I expected a lot of radical and meaningful divergence in ME3. 

But I think fearing and anticipating consequences is incredibly important for every single choice too. This is another reason I love the Imperial Agent class story in SWTOR. I would spend 30-60 minutes scratching my head, trying to weigh variables and morality on every major choice in the story. I did this because I knew that the consequences would be important and meaningful. 

The best thing DAI can do is show the player very early on that consequences will be meaningful. The earlier this is established, the more weight difficult choices will carry throughout the rest of the game.

#183
Angrywolves

  • Members
  • 4 644 messages
Expect to be disappointed occasionally in a video game that doesn't always give you a choice you want.

#184
Fast Jimmy

  • Members
  • 17 939 messages

Sylvius the Mad wrote...

Again, why should BioWare be in the social engineering business? All we really need to care about is whether the games allow us to roleplay, not whether they encourage (or require) others to do so.


Democracy needs an educated and well-informed populace to work. I'm of the mind that the free market is very much like democracy, except you vote with dollars and you elect not representatives on policy ideals but products on design ones. It is not a true democracy, since the impact of your vote is directly proportional to how many dollars you spend, but the underlying principles are the same.

So, in that light, the video game industry is becoming increasingly uneducated about different forms of playing video games, mostly because developers are creating ever more homogenized experiences, where variation in both game elements and play styles is decreasing. In that market, Bioware is a niche commodity. Even if it were to introduce every element possible to make its games as mass-appeal as possible, fantasy WRPGs don't net big market shares, Skyrim aside.

Bioware's games are based around a combination of player agency and story-crafting, usually through character writing. For their market to continue to exist, they need to make sure more players are finding value in these qualities; otherwise they are wasting tons of money compared to just making a set character with few decisions to make. Since they have, however, committed to systems that allow for a high level of customization, not just in appearance but also in character control, it would behoove them to do their best to encourage and even "trick" their players into taking full advantage of these systems.

It is no different than developing a trap system and having encounters that reward players who utilize it, or having a crafting system and giving players options they wouldn't have had without using said system... except, in this case, it is prodding a player towards appreciating the full complexity a game with high player agency and wide choice and consequence can offer, rather than just beating the game to get the "best" ending. Not doing so seems like a monumental waste of Bioware's time, when they could have gone a much easier route (which some of their competitors in the industry are taking) and made large swaths of their players equally satisfied.

That would be a selfish justification, not a moral one.


Morals are nothing BUT selfish justifications. By creating and adhering to a set of morals, humans are able to have a clear conscience for their actions (or inactions), which inherently is a selfish goal.

Edited by Fast Jimmy, 23 September 2013 - 02:53.


#185
Guest_EntropicAngel_*

  • Guests

Sylvius the Mad wrote...

And, for the record, I think the train car thought experiment is also very interesting, because a person's answer can tell us quite a bit about how they view action vs. inaction.  I, personally, don't see how anyone could possibly offer a moral justification for pulling the switch to kill the smaller group.


I personally found In Exile's argument about having the capability to induce change enlightening. You're in a position to save lives, and you have the ability; thus you're morally bound to act.

I value inaction highly as well, but also believe we are in some ways (voluntarily) morally bound.

#186
Xilizhra

  • Members
  • 30 873 messages

EntropicAngel wrote...

Sylvius the Mad wrote...

And, for the record, I think the train car thought experiment is also very interesting, because a person's answer can tell us quite a bit about how they view action vs. inaction.  I, personally, don't see how anyone could possibly offer a moral justification for pulling the switch to kill the smaller group.


I personally found In Exile's argument about having the capability to induce change enlightening. You're in a position to save life, and you even have the ability, thus you're morally bound to act.

I value inaction highly as well, but also believe we are in some ways (voluntarily) morally bound.

But that's absurd, because aren't you also killing more people than you're saving? Not even for long-term benefits.

#187
Guest_EntropicAngel_*

  • Guests

Xilizhra wrote...

But that's absurd, because aren't you also killing more people than you're saving? Not even for long-term benefits.


What? Not at all. You're saving the larger group of people in the larger traincar, while effectively killing the smaller group in the smaller traincar.

#188
Xilizhra

  • Members
  • 30 873 messages

EntropicAngel wrote...

Xilizhra wrote...

But that's absurd, because aren't you also killing more people than you're saving? Not even for long-term benefits.


What? Not at all. You're saving the larger group of people in the larger traincar, while effectively killing the smaller group in the smaller traincar.

Oh, it was presented in the other order previously.

#189
Guest_EntropicAngel_*

  • Guests
Alright.

#190
Fast Jimmy

  • Members
  • 17 939 messages
Well, I think it just goes to show you shouldn't travel by train.

#191
Guest_EntropicAngel_*

  • Guests
Trains are awesome.

#192
Xilizhra

  • Members
  • 30 873 messages

Fast Jimmy wrote...

Well, I think it just goes to show you shouldn't travel by train.

Cheaper than planes.

#193
Fast Jimmy

  • Members
  • 17 939 messages

Xilizhra wrote...

Fast Jimmy wrote...

Well, I think it just goes to show you shouldn't travel by train.

Cheaper than planes.


Depending on the departure and arrival points.

But is money worth having situations where untrained individuals who don't even work for the train company are deciding if switches are thrown that can save or end your life? This all sounds HIGHLY unprofessional, not to mention unsafe. 

#194
Jorji Costava

  • Members
  • 2 584 messages

Fast Jimmy wrote...

Morals are nothing BUT selfish justifications. By creating and adhering to a set of morals, humans are able to have a clear conscience for their actions (or inactions), which inherently is a selfish goal.


I'm not persuaded of this. From the fact that acting morally generally leaves one with a clear conscience, it doesn't follow that one's motivation for acting morally all along must have been to have that clear conscience.

On the other hand, I don't think that saving people because they're family members is inherently selfish either. It's plausible to suppose that we have special obligations to friends, family members, etc. that we don't have to strangers (which isn't to say we have no obligations to strangers). For instance, we may be obligated to do things for our children that we're not obligated to do for a stranger's kids.

EntropicAngel wrote...

Xilizhra wrote...

But that's absurd, because aren't you also killing more people than you're saving? Not even for long-term benefits.


What? Not at all. You're saving the larger group of people in the larger traincar, while effectively killing the smaller group in the smaller traincar.


That's not quite how I heard it. I'm pretty sure the standard case involves five people being tied down to train tracks with a trolley speeding towards them. You have the opportunity to throw a switch which will divert the train towards a different set of tracks which will cause it to kill a single person who has been tied to that set of tracks. Another variation involves the possibility of throwing an obese man off of an overpass; his weight will stop a trolley from killing five people tied to the track but this will kill him. Then again, I'm pretty sure there are well over a thousand variations on these cases by now (no joke).

EDIT: Fixed formatting

Edited by osbornep, 23 September 2013 - 11:29.


#195
Guest_EntropicAngel_*

  • Guests

osbornep wrote...

That's not quite how I heard it. I'm pretty sure the standard case involves five people being tied down to train tracks with a trolley speeding towards them. You have the opportunity to throw a switch which will divert the train towards a different set of tracks which will cause it to kill a single person who has been tied to that set of tracks. Another variation involves the possibility of throwing an obese man off of an overpass; his weight will stop a trolley from killing five people tied to the track but this will kill him. Then again, I'm pretty sure there are well over a thousand variations on these cases by now (no joke).

EDIT: Fixed formatting


Yeah. Thus throwing the switch saves five lives at the expense of one, for a net of four lives saved. That's what I said.

#196
Sylvius the Mad

  • Members
  • 24 117 messages

EntropicAngel wrote...

I personally found In Exile's argument about having the capability to induce change enlightening. You're in a position to save life, and you even have the ability, thus you're morally bound to act.

I need those dots connected.  What about that opportunity creates the obligation?

Certainly, if you act, you are responsible for that action.  But how can something that makes no material difference (your presence and ability) have moral relevance?

#197
Sylvius the Mad

  • Members
  • 24 117 messages

Fast Jimmy wrote...

Morals are nothing BUT selfish justifications. By creating and adhering to a set of morals, humans are able to have a clear conscience for their actions (or inactions), which inherently is a selfish goal.

I like the cut of your jib.

#198
Fast Jimmy

  • Members
  • 17 939 messages

Sylvius the Mad wrote...

Fast Jimmy wrote...

Morals are nothing BUT selfish justifications. By creating and adhering to a set of morals, humans are able to have a clear conscience for their actions (or inactions), which inherently is a selfish goal.

I like the cut of your jib.


Life is nothing but the actions of an individual. The actions each person (or self) takes are, inherently, selfish. Whether one realizes it or does it consciously is a consideration, but it does not change the fact that everything someone does has an inherent self-interest to it.

So condemning something for being selfish is just a matter of ethical or moral degrees and relativism. 

Edited by Fast Jimmy, 24 September 2013 - 07:01.


#199
Sylvius the Mad

  • Members
  • 24 117 messages
I completely agree with you on individualism and selfishness, but from that I conclude that morality is meaningless. If our motives are all ultimately selfish regardless of any possible moral justifications, how do those justifications matter?

#200
Fast Jimmy

  • Members
  • 17 939 messages

Sylvius the Mad wrote...

I completely agree with you on individualism and selfishness, but from that I conclude that morality is meaningless. If our motives are all ultimately selfish regardless of any possible moral justifications, how do those justifications matter?


Because maintaining cooperation is vitally important for the individual. While a single act of selfishness can benefit the individual, it is worth FAR more to be able to predict with strong accuracy the behavior of others in most social settings. Whether that is being able to trade valueless money for valuable goods, not physically attacking someone because they have something you desire, or helping an old lady cross the street, it is all part of the common thread of an overall moral code. The majority of humans share and believe in these things, to the point where laws are passed to enforce them, religions are founded to teach them, and identification is sought with those who share similar viewpoints.

And, of course, there is additionally a not-insignificant body of evidence suggesting that the human brain is hard-wired for both acts of altruism and what could be called spiritual activity. If we are hard-wired to crave, if not outright enjoy, doing good, then choosing to engage in such behaviors would be beneficial to any given human "for its own reward," which would, of course, be selfish in nature.