A different ascension - the Synthesis compendium (now with EC material integrated)
#8801
Posted 09 June 2014 - 01:50
Miranda is an above-average biotic. Jack is probably the strongest human biotic alive.
- sH0tgUn jUliA likes this
#8802
Posted 09 June 2014 - 01:50
It wasn't 'torture' that gave Jack her ability. It was the experimentation and development of physical alterations and upgrades that gave Jack her power. You're taking a misinformed and inaccurate view of the circumstances and presenting them in an untrue manner.
I don't think this explanation will hold up in Human Rights court.
#8803
Posted 09 June 2014 - 01:54
I don't think this explanation will hold up in Human Rights court.
I don't think the opinion of a human rights court is relevant. The U.S. didn't think so when it spirited von Braun and his team, and the researchers of Unit 731, to the U.S. to continue their work. Anyhow, torture didn't produce the results. It was the experimentation and implanting that did.
Human rights should never get in the way of science. What 'rights' do humans have anyway?
#8804
Posted 09 June 2014 - 01:56
I just played Jack's LM. Those experiments constitute medical torture. I want the decent people like Miranda and Brynn (and Oleg, who isn't really good but I like him) gone so I can dismantle Cerberus.
Miranda is an above-average biotic. Jack is probably the strongest human biotic alive.
I did too. They do constitute medical torture. And if they produce results, they're perfectly viable. I want to keep the decent people in, but I want the people willing to take extreme methods in as well. I want to advance the cause, not be the good guy. It's why I tell Jack to get over herself for whining about Cerberus.
#8805
Posted 09 June 2014 - 02:07
I did too. They do constitute medical torture. And if they produce results, they're perfectly viable. I want to keep the decent people in, but I want the people willing to take extreme methods in as well. I want to advance the cause, not be the good guy. It's why I tell Jack to get over herself for whining about Cerberus.
No, they aren't viable. Science must always bow to morality. No benefit is worth it.
#8806
Posted 09 June 2014 - 02:10
Well, I'm going to sit this one out
- SwobyJ likes this
#8807
Posted 09 June 2014 - 02:15
No, they aren't viable. Science must always bow to morality. No benefit is worth it.
I completely disagree. Morality is useless if it gets in the way of benefits. The way I see it, if morality gets in the way of benefiting the whole or the greater good, then that morality is immoral. The only 'lines' that should be drawn are for practical and economic considerations. Though I think we'll disagree on the inherent subjectivity of morality, and on my view that there is no morality to the universe; it's inherently amoral.
#8808
Posted 09 June 2014 - 02:40
I completely disagree. Morality is useless if it gets in the way of benefits. The way I see it, if morality gets in the way of benefiting the whole or the greater good, then that morality is immoral. The only 'lines' that should be drawn are for practical and economic considerations. Though I think we'll disagree on the inherent subjectivity of morality, and on my view that there is no morality to the universe; it's inherently amoral.
Arguing for subjective morality is all fine and dandy... but it would be helpful if you defined your own personal priority within your own personal morality. In other words - to you, why is the suffering of one young girl, causing years' worth of psychological scarring, worth the benefit of her becoming a ridiculously powerful biotic? Why does one outweigh the other?
And I want to note - if the only way to reproduce the results of Subject Zero is to put more people through the same levels of torture, then the only progress being made is towards psychologically unstable super-soldiers who have every reason to hate you and are therefore less useful to you than ordinary soldiers. On the other hand, if the torture is not intrinsically necessary to the process, why was it there in the first place?!
- jtav likes this
#8809
Posted 09 June 2014 - 03:12
And I want to note - if the only way to reproduce the results of Subject Zero is to put more people through the same levels of torture, then the only progress being made is towards psychologically unstable super-soldiers who have every reason to hate you and are therefore less useful to you than ordinary soldiers.
Good point. Maybe it would be useful to settle the question of whether we think these methods work in the first place before moving on to discussing their morality?
#8810
Posted 09 June 2014 - 03:14
Good point. Maybe it would be useful to settle the question of whether we think these methods work in the first place before moving on to discussing their morality?
I think the method should include some sort of controlling device implanted in the person receiving it. I agree with the experiments, but it is hard to argue with the point that the personal effects on a subject might make them counterproductive to the goal.
#8811
Posted 09 June 2014 - 03:30
I completely disagree. Morality is useless if it gets in the way of benefits. The way I see it, if morality gets in the way of benefiting the whole or the greater good, then that morality is immoral. The only 'lines' that should be drawn are for practical and economic considerations. Though I think we'll disagree on the inherent subjectivity of morality, and on my view that there is no morality to the universe; it's inherently amoral.
Bolded text: That's a no-brainer
But morality almost always gets in the way of benefits; that is the whole point. In your post, are you saying a moral standard is good only when it bends to your will? That is like having no morals at all.
I don't know if the universe is amoral, since morality is a human construct. However, saying that because the universe is amoral it is okay to be amoral sounds conformist, which I reject. I do recognize that sometimes in life one must set morals aside or die, but being amoral should, in my opinion, be done in moderation and only when circumstance requires it.
#8812
Posted 09 June 2014 - 03:39
Whenever morality passes through BSN....

#8813
Posted 09 June 2014 - 03:57
I completely disagree. Morality is useless if it gets in the way of benefits. The way I see it, if morality gets in the way of benefiting the whole or the greater good, then that morality is immoral. The only 'lines' that should be drawn are for practical and economic considerations. Though I think we'll disagree on the inherent subjectivity of morality, and on my view that there is no morality to the universe; it's inherently amoral.
Cerberus shows us what happens when you don't let morality get in the way of benefits.
Your experiments get loose and start killing all your guys
- DoomsdayDevice likes this
#8814
Posted 09 June 2014 - 04:01
It wasn't 'torture' that gave Jack her ability. It was the experimentation and development of physical alterations and upgrades that gave Jack her power. You're taking a misinformed and inaccurate view of the circumstances and presenting them in an untrue manner.
It was done to a child who was literally stolen from her mother. The process was painful and done without her consent.
It was torture.
If you want to upgrade human biotics, either find genuine volunteers or try it on yourself.
#8815
Posted 09 June 2014 - 04:04
But morality almost always gets in the way of benefits; that is the whole point. In your post, are you saying a moral standard is good only when it bends to your will? That is like having no morals at all.
I disagree with the bolded text.
Morality is a decision-making system. It measures the amount of 'benefit' of each option - just not necessarily the amount of benefit for you. The subjectivity comes in determining what should be considered a 'benefit'. Since I personally believe that helping other people promotes co-operation, which ultimately achieves more for everyone in the long run, I have no problem with making a choice which benefits other people but has no immediately obvious benefit for myself.
Your 'will' determines what you consider to be your ultimate priority. Moral relativism then consists of developing a moral paradigm around promoting that priority. The priority can be something as selfish as your own continued survival, or as selfless as the continued survival of everything (I'll note that the latter is essentially impossible), or even something as simple as 'must eat more doughnuts' (Homer).
If your morality is getting in the way of what you personally consider to be a benefit, then you need to reassess your moral system and how it relates to your own set of priorities.
- MassivelyEffective0730 likes this
#8816
Posted 09 June 2014 - 04:19
It was done to a child who was literally stolen from her mother. The process was painful and done without her consent.
It was torture.
If you want to upgrade human biotics, either find genuine volunteers or try it on yourself.
And if it works, said 'torture' is 100% justified in my opinion. I don't care how painful it was. It got me to where I needed to be.
Genuine volunteers might be limited, and I might not get the same results. If a truly worthwhile potential subject appears, I'm going to authorize their use in the project, whether they agree with it or not.
#8817
Posted 09 June 2014 - 04:20
Cerberus shows us what happens when you don't let morality get in the way of benefits.
Your experiments get loose and start killing all your guys
That has no correlation to morality at all.
#8818
Posted 09 June 2014 - 04:20
And if it works, said 'torture' is 100% justified in my opinion. I don't care how painful it was. It got me to where I needed to be.
Genuine volunteers might be limited, and I might not get the same results. If a truly worthwhile potential subject appears, I'm going to authorize their use in the project, whether they agree with it or not.
At least until said subject gets loose and starts killing all your guys...
#8819
Posted 09 June 2014 - 04:22
That has no correlation to morality at all.
Not morality so much as a callous disregard for the views of the test subjects.
Call it enlightened self-interest, at the very least
#8820
Posted 09 June 2014 - 04:22
I disagree with the bolded text.
Morality is a decision-making system. It measures the amount of 'benefit' of each option - just not necessarily the amount of benefit for you. The subjectivity comes in determining what should be considered a 'benefit'. Since I personally believe that helping other people promotes co-operation, which ultimately achieves more for everyone in the long run, I have no problem with making a choice which benefits other people but has no immediately obvious benefit for myself.
Your 'will' determines what you consider to be your ultimate priority. Moral relativism then consists of developing a moral paradigm around promoting that priority. The priority can be something as selfish as your own continued survival, or as selfless as the continued survival of everything (I'll note that the latter is essentially impossible), or even something as simple as 'must eat more doughnuts' (Homer).
If your morality is getting in the way of what you personally consider to be a benefit, then you need to reassess your moral system and how it relates to your own set of priorities.
I believe in my own goals, selfish though they may be, and I will see them accomplished no matter what. This is basically a summation of my morality.
It's technically impossible to not have any sense of morality. I have my own version of it.
#8821
Posted 09 June 2014 - 04:23
At least until said subject gets loose and starts killing all your guys...
Said subject won't get loose. Safety standards will be in place.
#8822
Posted 09 June 2014 - 04:23
Not morality so much as a callous disregard for the views of the test subjects.
Call it enlightened self-interest, at the very least
Well, their well-being runs counter to my goals, and I'm not going to let it interfere with them.
#8823
Posted 09 June 2014 - 04:24
Said subject won't get loose. Safety standards will be in place.
A good foundation for safety standards would probably be "Avoid giving the subject a reason to kill me"
#8824
Posted 09 June 2014 - 04:25
A good foundation for safety standards would probably be "Avoid giving the subject a reason to kill me"
Brain implants, or displays of overwhelming power that clearly show them I am a god over their lives, would be necessary.
#8825
Posted 09 June 2014 - 04:29
Brain implants, or displays of overwhelming power that clearly show them I am a god over their lives, would be necessary.
And this is one reason why I'd generally argue for co-operation over coercion - it's cheaper. Brain implants and power demonstrations are expensive.
(Not to mention that brain-control implants risk seriously damaging your subject, simply because implanting them requires brain surgery.)




