The most dire title the Reapers deserve is "Terrible Natural Disaster".


883 replies to this topic

#126
Xilizhra
  • Members
  • 30,873 posts

Indeed. In fact, down that path lies the Reapers' own logic. They were, after all, killing everyone for the 'greater good'...

If one really subscribed to such abstracted, objective thinking, then why get rid of the Reapers in the first place? It makes the final act of the game (whatever is chosen) just a cowardly act of selfish self-preservation.

Because there was a better solution.

#127
Kel Riever
  • Members
  • 7,065 posts
I was going to recommend that the title of this thread by Seival be 'Terrible Natural Disaster'....

#128
GreyLycanTrope
  • Members
  • 12,708 posts

Auld Wulf wrote...
Because it isn't morally repugnant. BioWare actually did something clever. For the Mass Effect series they were playing up to the popcorn flick audience most of the time, and dropping anti-Reaper propaganda wherever they could. Now, this actually had me raise an eyebrow at multiple occasions, but others (like yourself) seem to have had your thinking successfully altered.

The ending presents a clever paradigm shift where everything you know is wrong. That's a really fun literary device and there's absolutely nothing wrong with it. To pull the rug out from underneath the player and then force on them a choice which they have unlimited time (though limited, from the perspective of the game) to think about is absolute genius. It's the kind of thing you'd only expect from Deus Ex, which has done similar things. The disconnect here is that the mainstream were ready to believe in absolutes: that the Reapers were an absolute evil, that the player was an absolute good.

Except it turns out that it wasn't as simple-minded as that. The endings represented different ideas and ideologies. Basic black & white classifications such as 'it be good' or 'it be evil' just don't fit. It's like trying to fit a square peg in a tesseract-shaped hole. To a degree it might work, but ultimately the ending is meant to be parsed differently and actually requires some reflection on personal philosophies. Each ending is an ideology. Destroy is very much a mix of Nietzschean and Darwinian beliefs, Control represents a 1984-esque fascist ideology of absolute control over everyone (Shepard is essentially Big Brother), and Synthesis is more representative of ideologies like transhumanism and the Singularity.

I can't help but hear Bush's words echoing in my ears when someone condemns Synthesis, because I'm reminded of his 'special little snowflakes' rants, and of the claim that abortion should be illegal because it tampers with the natural order. This is silly. Everything we do tampers with the natural order, as I've pointed out. Read a book? You grow intellectually. That's tampering with yourself beyond what nature intended, since you now know something you otherwise couldn't have known, and you're familiar with viewpoints and perspectives you otherwise wouldn't have been. Had a successful operation? Your life has now been prolonged further than nature would have intended; you are now essentially an unnatural creature because you didn't die when you were supposed to.

Using synthetic medicines or supplements? That's also unnatural. Technology for global interconnectivity? We weren't born with it, so... unnatural! And it goes on. See, this always happens. You'll always have people, like you, who'll scream at any kind of advancement that they don't actually understand. I've pointed this out before, and I've also pointed out that this problem even exists on the consumer level. Basically, your argument is grouping you in with the likes of 'Stop the Cyborgs!'

Google Glass is a cool bit of consumerist tech that will likely be helpful to a lot of people. It's got the potential to be the new iPhone in a way, since it allows for hands-free connectivity. Yet, as I mentioned, you have people reacting to this as though it's somehow interfering with the natural order. That life will be irrevocably changed, that this device shouldn't even be allowed in many public venues because you're 'spying' or whatever else. You have very luddite-minded people who feel threatened by it. You have people outright claiming that just because someone has a Google Glass device, that privacy will be outright impossible. GASP!

I'm not making this up.

In the UK, we have lots of cameras everywhere for our protection anyway, so that this is an issue in the UK is absolutely ridiculous. Provided that you obey the law outside of your home, how can this be a threat to privacy? By that logic, stepping outside your door every morning is a threat to privacy. But you can't talk to people like that, because they see it as repugnant, as unnatural, and as a threat to their way of life. Everything we do brings us one step closer to a technological Singularity, and transhumanism drives many scientific minds. I have a number of engineer friends who're all big fans of transhumanism, Deus Ex, and Synthesis.

There's nothing repugnant about Synthesis. It's just an idea, an ideology, and a potentiality for our future. It's symbolism. To see it as anything else is to group yourself in with the 'Stop the Cyborgs!' people.

But some people will always be smarter than others. I'm guessing that people who find Synthesis so very repugnant aren't actually doctors, scientists, or engineers. It would be very hard to find something that is essentially a cure for the organic condition repugnant, when the suffering of so many would end. Under the Hippocratic Oath, you'd actually be loath to choose anything other than Synthesis, as doing otherwise would make you a hypocrite.

So there you go.

I honestly lack the words to adequately describe how flawed these assertions and comparisons are.

#129
Mangalores
  • Members
  • 468 posts

There's nothing repugnant about Synthesis. It's just an idea, an ideology, and a potentiality for our future. It's symbolism. To see it as anything else is to group yourself in with the 'Stop the Cyborgs!' people.


In my view you are missing the point entirely. The repugnant act is the terror of a malicious divine being forcing everyone to adhere to its mad ideas and ideology.

Transhumanism? Probably inescapable. AI? Possibly the evolutionary future of mankind. Virtual universes, cybernetic implants, genetically engineered chimeras: all possible and not necessarily immoral.

It's how all these things will be used that will make them moral or immoral. The Synthesis ending is fundamentally immoral, and it can only be accepted under the duress of a mass-murdering space dictator, which quite frankly is a good hint that it is immoral. Otherwise you could end the cycle, hold a referendum, convince people that it is a good idea, and have volunteers change if they like it.

So you can stuff as many cybernetics into your brain as you like; the second you want to force that on someone else against their will, or on those who depend on you to represent their best interests, it becomes highly questionable.



But some people will always be smarter than others. I'm guessing that people who find Synthesis so very repugnant aren't actually doctors, scientists, or engineers. It would be very hard to find something that is essentially a cure for the organic condition repugnant, when the suffering of so many would end. Under the Hippocratic Oath, you'd actually be loath to choose anything other than Synthesis, as doing otherwise would make you a hypocrite.

...


*lol* I reject Synthesis because it's scientific nonsense. It's not like science fiction stretching or explaining away scientific impossibilities; it rejects basic foundations of science and makes baseless metaphysical claims => the worst kind of pseudoscience. I think not knowing science would make it far easier for me to accept it as a metaphysical device. Since I know science, I'm highly critical of metaphysics to begin with. As it is, it kind of pisses all over physics, biology, and probably information theory.

Edited by Mangalores, 24 April 2013 - 02:34.


#130
Guest_Fandango_*
  • Guests

Xilizhra wrote...

Fandango9641 wrote...

Xilizhra wrote...

It's theoretically possible that one could justify torturing a child. For example, the procedures in The Exorcist. However, doing so for fun is doing harm without any good coming out of it at all, and is thus clearly immoral.


Sure, so if the catalyst demanded you torture a small, innocent child to stop the Reapers, you could do it with a clear conscience?

If it was between that or let the Reapers overrun the rest of the galaxy, performing their own torturous harvesting procedures on trillions of others? The choice seems clear, unless I could find a different method of activating the Crucible.


Perfect, thank you.

#131
Xilizhra
  • Members
  • 30,873 posts

Fandango9641 wrote...

Xilizhra wrote...

Fandango9641 wrote...

Xilizhra wrote...

It's theoretically possible that one could justify torturing a child. For example, the procedures in The Exorcist. However, doing so for fun is doing harm without any good coming out of it at all, and is thus clearly immoral.


Sure, so if the catalyst demanded you torture a small, innocent child to stop the Reapers, you could do it with a clear conscience?

If it was between that or let the Reapers overrun the rest of the galaxy, performing their own torturous harvesting procedures on trillions of others? The choice seems clear, unless I could find a different method of activating the Crucible.


Perfect, thank you.

And what would you do?

#132
Argolas
  • Members
  • 4,255 posts

Xilizhra wrote...

I am not saying your view is wrong, but it should not be followed too strictly; that way, it has negative side effects, such as encouraging terrorism. If you are ready to do anything in order to prevent greater harm, tactics such as taking hostages become extremely effective.


Only in the short term. In the long term, such actions reduce your reputation with your peers and make them less likely to actually want to help you, in addition to convincing them that you don't actually work for the greater good; hence, such things should be avoided.


It's a balancing act. Morality can be complicated, and usually both consequence and principle need to be taken into account. Which one is more important can differ from case to case.

#133
Xilizhra
  • Members
  • 30,873 posts

Argolas wrote...

Xilizhra wrote...

I am not saying your view is wrong, but it should not be followed too strictly; that way, it has negative side effects, such as encouraging terrorism. If you are ready to do anything in order to prevent greater harm, tactics such as taking hostages become extremely effective.


Only in the short term. In the long term, such actions reduce your reputation with your peers and make them less likely to actually want to help you, in addition to convincing them that you don't actually work for the greater good; hence, such things should be avoided.


It's a balancing act. Morality can be complicated, and usually both consequence and principle need to be taken into account. Which one is more important can differ from case to case.

Generally, principles are there because of consequences. In the situations where principles are divorced from consequence, they're generally useless.

#134
Guest_Fandango_*
  • Guests

Xilizhra wrote...

Fandango9641 wrote...

Xilizhra wrote...

Fandango9641 wrote...

Xilizhra wrote...

It's theoretically possible that one could justify torturing a child. For example, the procedures in The Exorcist. However, doing so for fun is doing harm without any good coming out of it at all, and is thus clearly immoral.


Sure, so if the catalyst demanded you torture a small, innocent child to stop the Reapers, you could do it with a clear conscience?

If it was between that or let the Reapers overrun the rest of the galaxy, performing their own torturous harvesting procedures on trillions of others? The choice seems clear, unless I could find a different method of activating the Crucible.


Perfect, thank you.

And what would you do?


I would fight, which is totally beside the point (you were asked whether you felt one could ever torture an innocent child with a clear conscience, remember?). In any case, what say you about a game that sets things up in such a way as to first demand, then celebrate, such a choice? Oh hang on, you've already answered that one too!

Edited by Fandango9641, 24 April 2013 - 02:47.


#135
Xilizhra
  • Members
  • 30,873 posts

I would fight, which is totally beside the point (you were asked whether you felt one could ever torture an innocent child with a clear conscience, remember?). In any case, what say you about a game that sets things up in such a way as to first demand, then celebrate, such a choice? Oh hang on, you've already answered that one too!

Fight what? What is there to fight? All you have is a hologram. And certainly I'd feel bad about it, but it would remain a necessity.

#136
Guest_Fandango_*
  • Guests
nvm

Edited by Fandango9641, 24 April 2013 - 03:01.


#137
drayfish
  • Members
  • 1,211 posts

Xilizhra wrote...

drayfish wrote...

I'd rather cling to the beauty I saw in Mass Effect before it turned into an ad for complete moral relativity. (Which, of course, is why I do like Marauder Shields.)

But that's even more absurd, not to mention being incredibly self-righteous in its tone.

Aside from being needlessly aggressive, you've not actually made an argument here. Is it 'absurd' for me to personally appreciate one theme in a fiction more than another? ...I didn't realise I had to ask your permission to have a preference.

Unless, of course, you are now decreeing that everyone has to embrace the ending's final relativity, to force themselves to 'change' their mind as you did:

Xilizhra wrote...

It's easier to change one's own mind about the endings to accept them than to change the actual endings.

But, of course, this rationale is precisely what I was talking about when I wrote that first post...

I really do wonder if BioWare takes pride in creating a fiction that has encouraged someone such as yourself to 'change' your mind in such a way? To be brought to the point of advocating the torture of a child for the 'greater good'. To happily bargain away the freedoms of others because you apparently know better than they how they should be 'protected'? To ignore the hypocrisy in stopping countless figures who sought to dominate others, only to gladly do it yourself, and to rationalise away the heavy, necessary contradiction in such an act?

Between you, Seival's original post, and whatever was going on in Auld Wulf's screed, almost all of the bases of BioWare's triumvirate are covered, and they all seem to advocate the oppression of people's basic freedoms for 'their own good'. ...All of which might not even be so bad were you all not desperately trying to convince yourselves that there is no hypocrisy, and no loaded potential danger, in embracing such principles.

(I am most certainly not saying that everyone who admires the endings exhibits this kind of self-justifying relativism (or embrace of the transhuman fantastic), but it appears to be a feature of all three of your commentaries.)

Writers have a profoundly intimate occupation. They delight, challenge, and inspire their audience. But I feel legitimately sad to think about what BioWare has encouraged its players to ignore or bargain away for such an asinine, arbitrary narrative 'twist'.

Edited by drayfish, 24 April 2013 - 03:13.


#138
Xilizhra
  • Members
  • 30,873 posts

Aside from being needlessly aggressive, you've not actually made an argument here. Is it 'absurd' for me to personally appreciate one theme in a fiction more than another? ...I didn't realise I had to ask your permission to have a preference.

Not you, the comic.

And in Control, I don't even want to maintain it. It's a stopgap because it's the least burdensome choice. I'm only maintaining control until the galaxy as a whole gets around to Synthesis.

#139
Argolas
  • Members
  • 4,255 posts

Xilizhra wrote...

Argolas wrote...

Xilizhra wrote...

I am not saying your view is wrong, but it should not be followed too strictly; that way, it has negative side effects, such as encouraging terrorism. If you are ready to do anything in order to prevent greater harm, tactics such as taking hostages become extremely effective.


Only in the short term. In the long term, such actions reduce your reputation with your peers and make them less likely to actually want to help you, in addition to convincing them that you don't actually work for the greater good; hence, such things should be avoided.


It's a balancing act. Morality can be complicated, and usually both consequence and principle need to be taken into account. Which one is more important can differ from case to case.

Generally, principles are there because of consequences. In the situations where principles are divorced from consequence, they're generally useless.


Utilitarianism is rather a morality of immediate consequences. If you look at it the way you described above, a morality focused on consequences can turn out to be completely useless. For example, if a terrorist takes hostages and demands something less valuable than their lives in return for their release, your morality tells you to give them that.

However, if you consider the encouragement of terrorism a consequence as well, it results in even more acts of terrorism, and your morality tells you to do the direct opposite of what it would have told you before. Just an example: terrorists take 1000 innocent hostages. They agree to release those hostages unharmed if you kill 999 innocent people yourself. 1000 lives are worth more than 999; would you still accept the bargain? Or wouldn't the right thing to do in this case rather be to reject this deal with the terrorists, in order to show them that they have no power over you, so you don't encourage them to do more? Otherwise, you might end up completely serving those terrorists, always doing whatever they want.

If you don't accept the bargain, you are still looking at the consequences; however, you no longer follow utilitarianism, which would oblige you to look at the immediate consequences and accept the bargain. Instead, you would have acted, as Kant would have, according to the 'categorical imperative', a position that opposes utilitarianism: you followed the principle 'I do not kill innocents' even though the immediate consequences looked worse that way. You did that because some principles are worth upholding even at a cost, and 'I do not kill innocents' is one of them.

Edited by Argolas, 24 April 2013 - 03:17.


#140
Xilizhra
  • Members
  • 30,873 posts

If you don't accept the bargain, you are still looking at the consequences; however, you no longer follow utilitarianism, which would oblige you to look at the immediate consequences and accept the bargain. Instead, you would have acted, as Kant would have, according to the 'categorical imperative', a position that opposes utilitarianism: you followed the principle 'I do not kill innocents' even though the immediate consequences looked worse that way. You did that because some principles are worth upholding even at a cost, and 'I do not kill innocents' is one of them.

That doesn't follow, and I think you're inventing things about utilitarianism, which I look at as encompassing all consequences, not just short-term. In any case, if I did hold to that principle, it'd be yet another thing that would keep me from picking Destroy.

#141
Argolas
  • Members
  • 4,255 posts

Xilizhra wrote...

If you don't accept the bargain, you are still looking at the consequences; however, you no longer follow utilitarianism, which would oblige you to look at the immediate consequences and accept the bargain. Instead, you would have acted, as Kant would have, according to the 'categorical imperative', a position that opposes utilitarianism: you followed the principle 'I do not kill innocents' even though the immediate consequences looked worse that way. You did that because some principles are worth upholding even at a cost, and 'I do not kill innocents' is one of them.

That doesn't follow, and I think you're inventing things about utilitarianism, which I look at as encompassing all consequences, not just short-term. In any case, if I did hold to that principle, it'd be yet another thing that would keep me from picking Destroy.


It is impossible to take all consequences into consideration because you can't possibly predict them. Only immediate consequences are predictable and can be considered if you want to apply utilitarianism to actual problems. A morality that involves all consequences of an action looks nice on paper, but it is an utterly useless exercise of thought and not fit to solve ethical questions.

Edited by Argolas, 24 April 2013 - 03:33.


#142
Xilizhra
  • Members
  • 30,873 posts

Argolas wrote...

Xilizhra wrote...

If you don't accept the bargain, you are still looking at the consequences; however, you no longer follow utilitarianism, which would oblige you to look at the immediate consequences and accept the bargain. Instead, you would have acted, as Kant would have, according to the 'categorical imperative', a position that opposes utilitarianism: you followed the principle 'I do not kill innocents' even though the immediate consequences looked worse that way. You did that because some principles are worth upholding even at a cost, and 'I do not kill innocents' is one of them.

That doesn't follow, and I think you're inventing things about utilitarianism, which I look at as encompassing all consequences, not just short-term. In any case, if I did hold to that principle, it'd be yet another thing that would keep me from picking Destroy.


It is impossible to take all consequences into consideration because you can't possibly predict them. Only immediate consequences are predictable and can be considered if you want to apply utilitarianism to actual problems. A morality that involves all consequences of an action looks nice on paper, but it is an utterly useless exercise of thought and not fit to solve ethical questions.

But the consequences of bargaining with terrorists, namely the encouragement of terrorism, are completely predictable. Not all long-term consequences are, but enough are that it renders your definition suspect.

#143
Argolas
  • Members
  • 4,255 posts

Xilizhra wrote...

Argolas wrote...

Xilizhra wrote...

If you don't accept the bargain, you are still looking at the consequences; however, you no longer follow utilitarianism, which would oblige you to look at the immediate consequences and accept the bargain. Instead, you would have acted, as Kant would have, according to the 'categorical imperative', a position that opposes utilitarianism: you followed the principle 'I do not kill innocents' even though the immediate consequences looked worse that way. You did that because some principles are worth upholding even at a cost, and 'I do not kill innocents' is one of them.

That doesn't follow, and I think you're inventing things about utilitarianism, which I look at as encompassing all consequences, not just short-term. In any case, if I did hold to that principle, it'd be yet another thing that would keep me from picking Destroy.


It is impossible to take all consequences into consideration because you can't possibly predict them. Only immediate consequences are predictable and can be considered if you want to apply utilitarianism to actual problems. A morality that involves all consequences of an action looks nice on paper, but it is an utterly useless exercise of thought and not fit to solve ethical questions.

But the consequences of bargaining with terrorists, namely the encouragement of terrorism, are completely predictable. Not all long-term consequences are, but enough are that it renders your definition suspect.


No, they are not. There is no way of telling whether, or how many, acts of terrorism you encouraged with that particular action, or exactly how much damage you avoided. What if every future act of terrorism would have been committed even without you refusing the bargain? Terrorists are usually quite committed and may have tried anyway. The answer is that you don't know. Predicting the indirect consequences of your actions soon gets out of hand and produces conflicting results.

By refusing that bargain, you weren't fully utilitarian; you chose a deontological approach and upheld a principle because it was worth it in that situation. That is not a bad thing. There are many positions that favor one side more than the other, and you seem to favor utilitarianism, but none of them alone is an adequate answer to actual ethical problems.

#144
Xilizhra
  • Members
  • 30,873 posts

No, they are not. There is no way of telling whether, or how many, acts of terrorism you encouraged with that particular action, or exactly how much damage you avoided. What if every future act of terrorism would have been committed even without you refusing the bargain? Terrorists are usually quite committed and may have tried anyway. The answer is that you don't know. Predicting the indirect consequences of your actions soon gets out of hand and produces conflicting results.

By refusing that bargain, you weren't fully utilitarian; you chose a deontological approach and upheld a principle because it was worth it in that situation. That is not a bad thing. There are many positions that favor one side more than the other, and you seem to favor utilitarianism, but none of them alone is an adequate answer to actual ethical problems.

And... if this is true anyway, about which I have doubts, it just creates more reasons not to pick Destroy.

#145
3DandBeyond
  • Members
  • 7,579 posts
The story is fundamentally flawed. The science behind a lot of its assertions is wrong. The kid is not an avatar of evolution; he is the destroyer of it. He specifically sees chaos as evil and order as good. He's a machine, and order is needed for machines; chaos is ruinous to them. But for organics there is more nuance to those words. Chaos and order are neutral but can be bad or good based upon their purpose or what they achieve. And actions are related to outcomes: well-intended actions may lead to bad outcomes, or vice versa. But a machine bereft of the nuanced understanding that is the ken of organic brains and minds cannot conceive of such concepts. Chaos is antithetical to the "life" of a synthetic mind that is not truly alive.

The kid, by harvesting the so-called most advanced organics, is actually creating less advanced organics in succeeding cycles. The game has it wrong. The kid interrupts evolution, so organics cannot evolve to become advanced. And his use of seeded tech (all tech is based upon Reaper tech) should have the effect of causing organics to devolve, to atrophy. The game does not show that, and the game has it wrong.

This is complete idiocy: the notion that some program created by the moronic, arrogant Leviathans is somehow far better at evolution (a process he thinks is wrong) than natural evolution. So, at the core, the Leviathans have created a program to control evolution, and they think it's still doing its job. Goody. That's all the confirmation I need that things are going as they should and that I should go along with it. The Leviathans have made such good decisions all along the way, so they must be the experts.

The catalyst and reapers made the ideal solution possible. Ha ha ha ha ha ha ha. Right. OK, so what you're saying is the catalyst and reapers made Synthesis possible, which means they created the thing that makes it possible (the Crucible), right? OK, yes, I think it's a good thing to use something these idiots made in order to do what they want. They've only been turning people into goo for millions of years. No big thing. And the ideal solution: right, Synthesis. Tech fully integrated into organics by this Crucible/Citadel/kid thing. Tech that comes from where? Don't know; so what, it's got to be good. Tech to do what? Don't know; so what, it's got to be good. Because why? Slide shows, and because Synthesis is inevitable in evolution... and then it stops evolution. Yeah, that is so contrary to evolution as to be laughable. Evolution will not make synthesis happen; only someone putting tech into our DNA will do that. And what a great concept. Tech fails; bodies reject artificial things inserted into them. And we have no idea where the tech is from or what it is and does. Great idea.

I don't care whether or not they were interested in war. They created it. Anyone with a brain can see that. And it's only creatures with lesser understanding that do not grasp the nature of the pain they are causing, well intended or not. A fire may cleanse and allow new life to take hold where it burned. But the life that dies in its path is more concerned with living on. It would stop the fire if it could. By the OP's estimation, that cleansing fire should be allowed to overtake homes, kill people, anything, just because it must and it can.

The idiocy of the whole thing is that somehow this program created by the idiot Leviathans is so brilliant that it can see the future. Yet all tech is flawed. This is a timeless truth. Computers cannot predict the future with certainty; they can predict possibilities and even probabilities, but never inevitable events, because other things can happen: chaotic, random events that they cannot predict and plan for. No one, not even the best computer, can say with certainty that something will happen. And certainly no computer can say that something is inevitable when events have proven that it need not be. People understand this.

Synthesis may happen, but the laws of probability say it may also never happen. And it will never be evolutionary. It will be by some mistake or some intent that it happens.

And since it's unknown what that tech may do or what might happen with it, no one can say that it would be the ideal thing, or even a good thing at all. Even the game uses the example of the zha'til to point out the problems, and there are lots of examples of bad synthesis in the game: the Reapers and all.

#146
Argolas
  • Members
  • 4,255 posts

Xilizhra wrote...

No, they are not. There is no way of telling whether, or how many, acts of terrorism you encouraged with that particular action, or exactly how much damage you avoided. What if every future act of terrorism would have been committed even without you refusing the bargain? Terrorists are usually quite committed and may have tried anyway. The answer is that you don't know. Predicting the indirect consequences of your actions soon gets out of hand and produces conflicting results.

By refusing that bargain, you weren't fully utilitarian; you chose a deontological approach and upheld a principle because it was worth it in that situation. That is not a bad thing. There are many positions that favor one side more than the other, and you seem to favor utilitarianism, but none of them alone is an adequate answer to actual ethical problems.

And... if this is true anyway, about which I have doubts, it just creates more reasons not to pick Destroy.


Applying those principles to the ME3 ending is really difficult. The consequences are impossible to predict; many people hardly understand them even after seeing the ending, and it is outright impossible to know them before choosing. On the other hand, principles become rather small and irrelevant compared to the consequences of that decision.

As a last example, I am not a philosopher, so I will throw in a critical case against pure utilitarianism from someone who is:

“Suppose that a sheriff were faced with the choice either of framing a ****** for a rape that had aroused hostility to the Negroes (a particular ****** generally being believed to be guilty but whom the sheriff knows not to be guilty)—and thus preventing serious anti-****** riots which would probably lead to some loss of life and increased hatred of each other by whites and Negroes—or of hunting for the guilty person and thereby allowing the anti-****** riots to occur, while doing the best he can to combat them. In such a case the sheriff, if he were an extreme utilitarian, would appear to be committed to framing the ******.”

- H. J. McCloskey -

You see, there is no perfect answer to ethical problems; if there were, it would have been established by now. You have to consider each case on its own and decide what morality to apply there, and that is what makes it so difficult.

#147
3DandBeyond

3DandBeyond
  • Members
  • 7 579 messages
And calling the Reapers a natural disaster is a real joke. There's nothing natural about them. Terrible, yes. A disaster, to be sure. It's like calling nuclear bombs a terrible natural disaster. I'd get rid of any ability to make them if I could. I'd get rid of the bombs themselves if I could. I wouldn't want some nice person having control of them, and I wouldn't want to have my body altered just so we could keep them around as paperweights. Nor would I want to be forced to torture babies or kill my friends because some nuclear-bomb god says I must.

#148
robertthebard

robertthebard
  • Members
  • 6 108 messages

Fandango9641 wrote...

Xilizhra wrote...

I think this has to do with a fundamental difference in our ethical systems. You're deontological, I'm more teleological and utilitarian. I believe that the best action you can undertake in a given situation cannot be immoral, because I believe that outcomes are more important than the intrinsic morality of actions.


Splendid. Would you like to reveal to us all, then, the set of circumstances under which you consider it perfectly fine to torture a child for fun... without compromising your morals or ethics, I mean?

Would you kindly point out what this has to do with Mass Effect?  Are you referencing Pragia, or simply trying to find a justification for torturing children?  I have to say, I'm refreshed to find the "you people are monsters" turned on an ending choice other than destroy, but frankly, I'm appalled at the gall it requires to postulate that torturing children is equivalent to picking an ending in a video game.  Yet you have the audacity to accuse someone else of being a monster?  As a parent, and grandparent, it is my sincere wish that you are not in a profession that requires you to work around children.

#149
Argolas

Argolas
  • Members
  • 4 255 messages

robertthebard wrote...

Fandango9641 wrote...

Xilizhra wrote...

I think this has to do with a fundamental difference in our ethical systems. You're deontological, I'm more teleological and utilitarian. I believe that the best action you can undertake in a given situation cannot be immoral, because I believe that outcomes are more important than the intrinsic morality of actions.


Splendid. Would you like to reveal to us all, then, the set of circumstances under which you consider it perfectly fine to torture a child for fun... without compromising your morals or ethics, I mean?

Would you kindly point out what this has to do with Mass Effect?  Are you referencing Pragia, or simply trying to find a justification for torturing children?  I have to say, I'm refreshed to find the "you people are monsters" turned on an ending choice other than destroy, but frankly, I'm appalled at the gall it requires to postulate that torturing children is equivalent to picking an ending in a video game.  Yet you have the audacity to accuse someone else of being a monster?  As a parent, and grandparent, it is my sincere wish that you are not in a profession that requires you to work around children.


To be fair, this turned into a rather general ethical discussion a while ago, without any direct relation to the ME3 endings...

#150
Guest_Fandango_*

Guest_Fandango_*
  • Guests
EDIT: why waste my time?

Edited by Fandango9641, 24 April 2013 - 04:38.