
In real life, I would probably choose Destroy


127 replies to this topic

#76
Ironhandjustice
  • Members
  • 1 093 messages

Auld Wulf wrote...

Finn the Jakey wrote...

That's more or less why I chose Destroy, the safe option.

Genocide is a 'safe' option? It isn't too 'safe' for all the people who'd die. But hey, don't mind me, I'm just a guy with a decent set of ethics and an understanding of why too much binary opposition in one's thinking is more dangerous than anything.

Edit: For those who might not understand...

By picking Destroy, you choose to slaughter in cold blood:

- The geth and any other synthetics that we're not aware of.
- EDI. (Sorry Joker, Shepard wasn't ready to allow you your happiness.)
- Every species preserved within the Reaper consensus.

You'd kind of make Hitler look like John Lennon.


Wrong. You will think about this while dying on the rubble. Meanwhile, you blow up the Reapers on impulse. You don't think you'll be angry? Full of hatred?

Your emotions would give you no choice.

Next.

#77
Auld Wulf
  • Members
  • 1 284 messages
@Ironhandjustice

Again, that's the difference between a person with a sense of ethics, and one who doesn't come equipped with such. Sometimes it amazes me how many humans don't come with a good understanding of ethics and empathy as standard. This is something I believe should be taught in school, as perhaps then we'd have fewer sociopaths? I dunno.

#78
KyreneZA
  • Members
  • 1 882 messages

Auld Wulf wrote...

@Ironhandjustice

Again, that's the difference between a person with a sense of ethics, and one who doesn't come equipped with such. Sometimes it amazes me how many humans don't come with a good understanding of ethics and empathy as standard. This is something I believe should be taught in school, as perhaps then we'd have fewer sociopaths? I dunno.

Clearly your finely honed sense of ethics fails you every time you are disrespectful and downright rude or condescending to complete strangers on an internet forum. Just food for thought, Old Wolf!

#79
Uncle Jo
  • Members
  • 2 161 messages

Kyrene wrote...

Clearly your finely honed sense of ethics fails you every time you are disrespectful and downright rude or condescending to complete strangers on an internet forum. Just food for thought, Old Wolf!

Never take an obvious troll seriously or personally. He's a blight and an amusement on the BSN. Nothing more.

#80
ElSuperGecko
  • Members
  • 2 314 messages
Good job some people around here are capable of distinguishing the wood from the trees. Great post, OP.

#81
MegaSovereign
  • Members
  • 10 794 messages

AdmiralCheez wrote...

The-Biotic-God wrote...

That analogy is flawed because you had to trade physical traits for others. Your analogy is actually closer to the Catalyst's Reaper solution, since it's a forced physical change rather than an acquisition of additional abilities.

With synthesis you retain the same physical form and abilities, but with upgrades (that the game doesn't quite specify).

Okay, say I get man-parts in addition to my natural set of gear. Everything is the same about me except the "upgrade" which gives me "additional abilities." Despite whatever benefits may come, was it okay for some jerk to break into my house and perform unwanted surgery on me?

Imagine the same guy broke into your house and gave you a vagina...


Lol. I still don't think it's the same thing. You're still altering their gender.

#82
Fawx9
  • Members
  • 1 134 messages
Wait since when does control make YOU a god?

It makes a god in your image. You die.

#83
Steelcan
  • Members
  • 23 290 messages

Fawx9 wrote...

Wait since when does control make YOU a god?

It makes a god in your image. You die.

. "I do not look forward to being replaced by you". "Your thoughts and even your memories will continue"

Edited by Steelcan, 15 February 2013 - 03:14.


#84
Fawx9
  • Members
  • 1 134 messages

Steelcan wrote...

Fawx9 wrote...

Wait since when does control make YOU a god?

It makes a god in your image. You die.

. "I do not look forward to being replaced by you". "Your thoughts and even your memories will continue"


But you still die.

It simply uploads your memories, it doesn't do a brain transfer.

You are no more. You have ceased to be. You have expired and have gone to meet your maker.

All that's left is a machine with thoughts and memories that were uploaded from you.

#85
CR121691
  • Members
  • 550 messages
In real life, humanity won't even stand a chance.

#86
DirtyPhoenix
  • Members
  • 3 938 messages

Fawx9 wrote...

Steelcan wrote...

Fawx9 wrote...

Wait since when does control make YOU a god?

It makes a god in your image. You die.

. "I do not look forward to being replaced by you". "Your thoughts and even your memories will continue"


But you still die.

It simply uploads your memories, it doesn't do a brain transfer.

You are no more. You have ceased to be. You have expired and have gone to meet your maker.

All that's left is a machine with thoughts and memories that were uploaded from you.


Some viewpoints hold that if you copy my thoughts, memories and morals, you've effectively created another me. Physically transplanting the brain isn't important. What's inside the brain, is.

#87
Fawx9
  • Members
  • 1 134 messages

pirate1802 wrote...

Fawx9 wrote...

Steelcan wrote...

Fawx9 wrote...

Wait since when does control make YOU a god?

It makes a god in your image. You die.

. "I do not look forward to being replaced by you". "Your thoughts and even your memories will continue"


But you still die.

It simply uploads your memories, it doesn't do a brain transfer.

You are no more. You have ceased to be. You have expired and have gone to meet your maker.

All that's left is a machine with thoughts and memories that were uploaded from you.


Some viewpoints hold that if you copy my thoughts, memories and morals, you've effectively created another me. Physically transplanting the brain isn't important. What's inside the brain, is.


Another you, that you have no control over. It is made up of more things than just your memories, and will act differently because of that.

You create a 'God' in your image. There is no further connection, your existence has ended.

#88
DirtyPhoenix
  • Members
  • 3 938 messages

Fawx9 wrote...

pirate1802 wrote...

Fawx9 wrote...

Steelcan wrote...

Fawx9 wrote...

Wait since when does control make YOU a god?

It makes a god in your image. You die.

. "I do not look forward to being replaced by you". "Your thoughts and even your memories will continue"


But you still die.

It simply uploads your memories, it doesn't do a brain transfer.

You are no more. You have ceased to be. You have expired and have gone to meet your maker.

All that's left is a machine with thoughts and memories that were uploaded from you.


Some viewpoints hold that if you copy my thoughts, memories and morals, you've effectively created another me. Physically transplanting the brain isn't important. What's inside the brain, is.


Another you, that you have no control over. It is made up of more things than just your memories, and will act differently because of that.

You create a 'God' in your image. There is no further connection, your existence has ended.


If it is indeed another "me", then I need not control or connect to it. It will behave as I would. And what more is there beyond my memories?

#89
Fawx9
  • Members
  • 1 134 messages

pirate1802 wrote...

Fawx9 wrote...

pirate1802 wrote...

Fawx9 wrote...

Steelcan wrote...

Fawx9 wrote...

Wait since when does control make YOU a god?

It makes a god in your image. You die.

. "I do not look forward to being replaced by you". "Your thoughts and even your memories will continue"


But you still die.

It simply uploads your memories, it doesn't do a brain transfer.

You are no more. You have ceased to be. You have expired and have gone to meet your maker.

All that's left is a machine with thoughts and memories that were uploaded from you.


Some viewpoints hold that if you copy my thoughts, memories and morals, you've effectively created another me. Physically transplanting the brain isn't important. What's inside the brain, is.


Another you, that you have no control over. It is made up of more things than just your memories, and will act differently because of that.

You create a 'God' in your image. There is no further connection, your existence has ended.


If it is indeed another "me", then I need not control or connect to it. It will behave as I would. And what more is there beyond my memories?


Its programming, its outlook, its logic. Just because it contains Shepard's memories does not mean it will react the same way you would in a new situation. It has the memory of the experiences, but does not have the actual experience. It will not fire the same chemically induced emotional response when looking back on them. It will simply know that you were happy, angry or sad; it won't actually feel anything from it.

#90
CronoDragoon
  • Members
  • 10 411 messages

CosmicGnosis wrote...

In real life, Control would freaking scare me. I wouldn't want to become a synthetic super-intelligence controlling the most powerful beings in the galaxy.

Synthesis would sound utterly insane and unbelievable.

So I would choose Destroy. Yeah. Afterward, I would become an advocate for synthetic rights and attempt to explain their perspective to the rest of the galaxy.


Hi, Ender.

Yeah, I think IRL I would choose Destroy.

#91
DirtyPhoenix
  • Members
  • 3 938 messages

Fawx9 wrote...

pirate1802 wrote...

Fawx9 wrote...

pirate1802 wrote...

Fawx9 wrote...

Steelcan wrote...

Fawx9 wrote...

Wait since when does control make YOU a god?

It makes a god in your image. You die.

. "I do not look forward to being replaced by you". "Your thoughts and even your memories will continue"


But you still die.

It simply uploads your memories, it doesn't do a brain transfer.

You are no more. You have ceased to be. You have expired and have gone to meet your maker.

All that's left is a machine with thoughts and memories that were uploaded from you.


Some viewpoints hold that if you copy my thoughts, memories and morals, you've effectively created another me. Physically transplanting the brain isn't important. What's inside the brain, is.


Another you, that you have no control over. It is made up of more things than just your memories, and will act differently because of that.

You create a 'God' in your image. There is no further connection, your existence has ended.


If it is indeed another "me", then I need not control or connect to it. It will behave as I would. And what more is there beyond my memories?


Its programming, its outlook, its logic. Just because it contains Shepard's memories does not mean it will react the same way you would in a new situation. It has the memory of the experiences, but does not have the actual experience. It will not fire the same chemically induced emotional response when looking back on them. It will simply know that you were happy, angry or sad; it won't actually feel anything from it.


Fair enough. Being transferred from an organic platform to a synthetic one brings about changes, and it's clear Shepard isn't human anymore. I just consider the "thing" to be as close to the real Shepard as we'll get. Like Keiji's Greybox, if animated. One thing, though: just because the thing doesn't feel emotions doesn't necessarily mean its moral compass is broken. It is still bound by Shepard's goals and beliefs, however illogical it may view them as.

#92
Elista
  • Members
  • 900 messages
In real life, I would consider that the Starchild is probably lying because he's the enemy, and I would not listen to him. He could very well say: just take THIS option, it will save everybody, the other choices will kill you. And, surprise, when you jump into the flow of energy you just die, or when you shoot the cable, the Crucible collapses and the Reapers win. It's nonsense and madness to believe an AI that claims to be the chief of that army of monsters when it gives you advice and explanations... how in the hell could I trust this thing??? I don't want to be indoctrinated. Even his appearance looks like an attempt to manipulate me.

I may even believe that he's just a hallucination of my traumatized mind.

So I would go my own way and search for a terminal or a damned red button that makes the weapon do its job. Not SHOOT a cable in the engine that is supposed to save us (what a stupid way to activate it...), nor RUN happily into the flow of energy... So I think I would choose Control... but my purpose is to destroy the Reapers :P Epic fail. And at no point would I imagine that I would die by doing this. So stupid, a weapon that can only be activated by someone on board who will die when it fires on the enemy... the engineers could have planned a countdown, at least.

Edited by Elista, 15 February 2013 - 04:20.


#93
CronoDragoon
  • Members
  • 10 411 messages

pirate1802 wrote...
Fair enough. Being transferred from an organic platform to a synthetic one brings about changes, and it's clear Shepard isn't human anymore. I just consider the "thing" to be as close to the real Shepard as we'll get. Like Keiji's Greybox, if animated. One thing, though: just because the thing doesn't feel emotions doesn't necessarily mean its moral compass is broken. It is still bound by Shepard's goals and beliefs, however illogical it may view them as.


I love the discussion about how the Shepard-AI would work. It's really fascinating because a lot of people, including me, feel that there will be situations where the fact that it is an AI instead of a human will manifest itself.

But what may cause this different reaction? Well, for one, brains aren't encoded with specific instructions in the way that an AI is encoded. If we look at the Catalyst, so much of where it went wrong is based in the semantics of the instructions it was given. "Preserving" organic life gets turned into the Reaper cycle. This brings up the question of what specific form Shepard's morals will take when they are transferred into lines of code. A single word could make all the difference.

This brings me to what appears to be the Shepard-AI's "primary" directive in the Control endings, which for Paragon takes the form of "Protect or defend the many." Ohhhh boy, is there a lot of wiggle room for things to go wrong there.

One important way that humans decide on complex issues is to converse with people whose opinions they trust, in order to receive a different perspective that they can then measure against their conclusions. Will Shepard-AI know to do this? Is Shepard-AI capable of trust? When the people that Shepard trusted in his life die, how will he forge new bonds of trust? Is that even possible, or will he be reduced to a single perspective and never seek to augment it?

Like I said, very interesting.

#94
CrutchCricket
  • Members
  • 7 735 messages

iOnlySignIn wrote...

In real life there is no StarBrat.

A fact for which I'm eternally grateful.



Interesting how a lot of the posts turned (yet again) into jabs at Control (it's not really Shepard, power corrupts lololol).

As always, check the sig.

And I'd pick control in a second. So long, piddly organics! I'm off to explore the cosmos and observe and figure out all the things I never could as a planet-bound meatbag.

Try not to kill each other too fast eh?

Edited by CrutchCricket, 15 February 2013 - 04:14.


#95
DirtyPhoenix
  • Members
  • 3 938 messages

CronoDragoon wrote...

pirate1802 wrote...
Fair enough. Being transferred from an organic platform to a synthetic one brings about changes, and it's clear Shepard isn't human anymore. I just consider the "thing" to be as close to the real Shepard as we'll get. Like Keiji's Greybox, if animated. One thing, though: just because the thing doesn't feel emotions doesn't necessarily mean its moral compass is broken. It is still bound by Shepard's goals and beliefs, however illogical it may view them as.


I love the discussion about how the Shepard-AI would work. It's really fascinating because a lot of people, including me, feel that there will be situations where the fact that it is an AI instead of a human will manifest itself.

But what may cause this different reaction? Well, for one, brains aren't encoded with specific instructions in the way that an AI is encoded. If we look at the Catalyst, so much of where it went wrong is based in the semantics of the instructions it was given. "Preserving" organic life gets turned into the Reaper cycle. This brings up the question of what specific form Shepard's morals will take when they are transferred into lines of code. A single word could make all the difference.

This brings me to what appears to be the Shepard-AI's "primary" directive in the Control endings, which for Paragon takes the form of "Protect or defend the many." Ohhhh boy, is there a lot of wiggle room for things to go wrong there.

One important way that humans decide on complex issues is to converse with people whose opinions they trust, in order to receive a different perspective that they can then measure against their conclusions. Will Shepard-AI know to do this? Is Shepard-AI capable of trust? When the people that Shepard trusted in his life die, how will he forge new bonds of trust? Is that even possible, or will he be reduced to a single perspective and never seek to augment it?

Like I said, very interesting.


Well, in my headcanon Shep-AI will remain in contact with her closest squadmates, at least for the time being. So there's that, different viewpoints and all :D Also, I think the fact that this new AI was previously an organic, and that it already has a set of Dos and Don'ts to guide itself, does lend it a more favourable position compared to the Catalyst. For example: restarting the cycles... would my Paragon Shepard ever have been involved in it, or even thought about it? Would her morals allow it? The AI would come up with No.

The Catalyst never had a yardstick like this against which to compare its actions, constraints that would limit its choices. It was basically created from a vacuum with no previous memory and given a task to solve with infinite freedom and no constraints. Things are a little different with Shepalyst.

CrutchCricket wrote...

And I'd pick control in a second. So long, piddly organics! I'm off to explore the cosmos and observe and figure out all the things I never could as a planet-bound meatbag.

Try not to kill each other too fast eh?


Same. :wizard:

Edited by pirate1802, 15 February 2013 - 04:23.


#96
CronoDragoon
  • Members
  • 10 411 messages

pirate1802 wrote...


Well, in my headcanon Shep-AI will remain in contact with her closest squadmates, at least for the time being. So there's that, different viewpoints and all :D Also, I think the fact that this new AI was previously an organic, and that it already has a set of Dos and Don'ts to guide itself, does lend it a more favourable position compared to the Catalyst. For example: restarting the cycles... would my Paragon Shepard ever do it? The AI would come up with No.


Right, but one of my points is: what happens when those squadmates die? What happens when all the people that were "grandfathered" into Shepard-AI's "take this viewpoint into account" programming die? How will it determine which new people to trust, or will it simply trust no one anymore?

#97
ruggly
  • Members
  • 7 561 messages

Auld Wulf wrote...

@Ironhandjustice

Again, that's the difference between a person with a sense of ethics, and one who doesn't come equipped with such. Sometimes it amazes me how many humans don't come with a good understanding of ethics and empathy as standard. This is something I believe should be taught in school, as perhaps then we'd have fewer sociopaths? I dunno.


I know I shouldn't be wasting energy on this. But I like to think that I'm a very decent person in life, not a sociopath at all. I have ethics, empathy, and emotions. A good life, a job, a house, a nice car, friends and a social life. You keep trying to judge me as some robot. The fact that you're trying to base people's true personalities on what they did in a video game is... well, not a very reliable way to judge anyone.

#98
DirtyPhoenix
  • Members
  • 3 938 messages

CronoDragoon wrote...

pirate1802 wrote...


Well, in my headcanon Shep-AI will remain in contact with her closest squadmates, at least for the time being. So there's that, different viewpoints and all :D Also, I think the fact that this new AI was previously an organic, and that it already has a set of Dos and Don'ts to guide itself, does lend it a more favourable position compared to the Catalyst. For example: restarting the cycles... would my Paragon Shepard ever do it? The AI would come up with No.


Right but one of my points is, what happens when those squadmates die? What happens when all the people that were "grandfathered" into Shepard-AI's "take this viewpoint into account" programming die? How will it determine which new people to trust, or will it simply trust no one anymore?


Well, Shepard's memories will live on for eternity; they'll guide her... also in those memories will be interactions with those squadmates, their viewpoints, etc. For example, Shepalyst doesn't physically need to interact with Javik to know he'll probably hate her.

But truth be told, I don't altogether hate a scenario where Shep-AI slowly descends into madness. Would certainly be interesting! The demise of a God, brought about by its own immortality.

#99
themikefest
  • Members
  • 21 607 messages
Well, in real life:

I would ask the AI to prove it's the Catalyst.

I would ask the AI why it looks like the kid.

I would ask it to prove it controls the Reapers.

I would ask it to prove that each choice does what it says it does.

I would ask it why it says Synthesis can't be forced, yet expects me to force it on everyone.

I would ask it to prove robots will always attack their creators.

I would ask it why it's talking with 3 different voices at the same time.

I would try to make a deal with it. Ask the AI if it would spare Earth and just harvest the rest of the galaxy.

I would go back to get Anderson's body and throw it in the beam while shooting the pipe/tube.

If I can't get back down the elevator, I would take a pi** in the beam while shooting the tube/pipe.

If the above can't be done, then I'll just shoot the pipe/tube from a distance, then go home and have a beer.

#100
CronoDragoon
  • Members
  • 10 411 messages

pirate1802 wrote...
Well, Shepard's memories will live on for eternity; they'll guide her... also in those memories will be interactions with those squadmates, their viewpoints, etc. For example, Shepalyst doesn't physically need to interact with Javik to know he'll probably hate her.

But truth be told, I don't altogether hate a scenario where Shep-AI slowly descends into madness. Would certainly be interesting! The demise of a God, brought about by its own immortality.


Oh, absolutely. Though Destroy is my canon ending, Control is the most interesting ending for a future ME game, in my opinion. An automatic, overpowering villain, but with a big dose of poignancy, knowing the history behind what you are fighting.