
Why are those who choose Control and Synthesis so much happier with the ending?


1010 replies to this topic

#226
Xilizhra
  • Members
  • 30 873 messages

General TSAR wrote...

There's no such thing as AI rights.

You're correct insofar as there's no such thing as AI now. Rights will come when the AIs do.

#227
Br3admax
  • Members
  • 12 316 messages

Xilizhra wrote...

General TSAR wrote...

There's no such thing as AI rights.

You're correct insofar as there's no such thing as AI now. Rights will come when the AIs do.

So never. 

#228
Xilizhra
  • Members
  • 30 873 messages

KaiserShep wrote...

Xilizhra wrote...
You're interfering with the bodily autonomy of a sapient being. Claiming to own one as property, in whole or in part, constitutes slavery.


That's not really how slavery works. First and foremost, the "body" in question is a piece of equipment manufactured by a corporation, and regardless of what anyone else may think, its design is the intellectual property of that corporation (or corporations, as computers tend to be an amalgam of components from several), and the equipment itself is the private property of whoever buys it. The software installed does not change that.

By your logic, I should be charged with criminal neglect if the computer begins to malfunction and I just don't care to get it fixed anytime soon, or simply forget to charge it. Basically, it would be a ghost on life support that the state would be forcing me to keep alive. Never gonna happen lol.

This is akin to saying that a human clone would be considered a nonperson who could be owned by whatever organization cloned them, and I hardly see that holding up very well legally.

#229
General TSAR
  • Members
  • 4 384 messages

Xilizhra wrote...
The toaster does not, to my knowledge, possess sapience. If it did and expressed a desire to stop toasting, I would allow it.

It's my toaster and I'm hungry for some buttered toast; what are you gonna do now?

#230
AresKeith
  • Members
  • 34 128 messages

Xilizhra wrote...

General TSAR wrote...

There's no such thing as AI rights.

You're correct insofar as there's no such thing as AI now. Rights will come when the AIs do.


I doubt that

#231
The Heretic of Time
  • Members
  • 5 612 messages

Xilizhra wrote...

General TSAR wrote...

There's no such thing as AI rights.

You're correct insofar as there's no such thing as AI now. Rights will come when the AIs do.


There already are AIs; they're simply not what fantasy RPGs like Mass Effect try to make you think they're "supposed" to be like, nor will they ever become like that.

AIs are not people; an AI is not a person and never will be. Thus the very idea of "AI rights" is illogical and doesn't make sense.

#232
General TSAR
  • Members
  • 4 384 messages

Xilizhra wrote...
You're correct insofar as there's no such thing as AI now. Rights will come when the AIs do.

I can imagine you being a Skynet apologist.

#233
DirtySHISN0
  • Members
  • 2 278 messages

jtav wrote...

This is something I've noticed since the EC's release. Those who choose Destroy and Refuse tend to speak of their ending as the best of a bad lot. Those who choose Control and Synthesis tend to be more enthusiastic about their ending choice and more accepting of the ending scenario overall. Why is that? Is it a case of the minority opinion having to be able to defend itself more ably? Or do Synth/Control people want something different from the ending than Destroy/Refuse?


As a synthesiser, or rather as someone who hates synthesis the least, I would say it comes down to preference and belief.

Do you like and believe in the idea of artificial life?

Yes - Congratulations, you just gambled the galaxy on Control or Synthesis.

No - Congratulations, you just destroyed a lot of what you were working to save.

Refuse - Congratulations, you just went full retard.

Edited by DirtySHISN0, 20 October 2013 - 01:56.


#234
Xilizhra
  • Members
  • 30 873 messages

General TSAR wrote...

Xilizhra wrote...
The toaster does not, to my knowledge, possess sapience. If it did and expressed a desire to stop toasting, I would allow it.

It's my toaster and I'm hungry for some buttered toast; what are you gonna do now?

That depends on whether unplugging it would cause it to lose mental integrity.

#235
Br3admax
  • Members
  • 12 316 messages

Xilizhra wrote...

KaiserShep wrote...

Xilizhra wrote...
You're interfering with the bodily autonomy of a sapient being. Claiming to own one as property, in whole or in part, constitutes slavery.


That's not really how slavery works. First and foremost, the "body" in question is a piece of equipment manufactured by a corporation, and regardless of what anyone else may think, its design is the intellectual property of that corporation (or corporations, as computers tend to be an amalgam of components from several), and the equipment itself is the private property of whoever buys it. The software installed does not change that.

By your logic, I should be charged with criminal neglect if the computer begins to malfunction and I just don't care to get it fixed anytime soon, or simply forget to charge it. Basically, it would be a ghost on life support that the state would be forcing me to keep alive. Never gonna happen lol.

This is akin to saying that a human clone would be considered a nonperson who could be owned by whatever organization cloned them, and I hardly see that holding up very well legally.

It's not like this in any way, especially since the human cells are not manufactured.

Edited by Br3ad, 20 October 2013 - 01:56.


#236
Guest_StreetMagic_*
  • Guests

Mcfly616 wrote...

StreetMagic wrote...

You have to admit though how funny it is that the most mundane, simple type of ending is the most fantastical and difficult to imagine.

How do you figure it was difficult to imagine? The more probable explanation is that they didn't want to be "mundane and simple".


I mean difficult to imagine for me. I'm only given limited tools to imagine. I have to operate within the parameters of their story, right? Those limitations make it difficult. I have no idea what to think of the Destroy ending. Kind of draw a blank on what happens to Shepard.

I don't know why they wouldn't want to be "mundane and simple" when they give you the means to be mundane and simple (Destroy). Synthesis and Control seem more realized in their results. That's why Control/Synthesis players are happier, I think. They have a more fleshed out story to work with.

Edited by StreetMagic, 20 October 2013 - 01:56.


#237
Steelcan
  • Members
  • 23 291 messages
People can legally own new species that are created by genetic engineering; I'm willing to bet AIs will fall into the same category.

#238
KaiserShep
  • Members
  • 23 835 messages

Xilizhra wrote...
This is akin to saying that a human clone would be considered a nonperson who could be owned by whatever organization cloned them, and I hardly see that holding up very well legally.


No. Firstly, a human clone is in no way similar to a manufactured piece of machinery. It doesn't matter if the human in question was grown in a tube. You can't own the rights to the human body like you would to an SSD or microprocessor in a laptop. If my laptop were to spontaneously develop self-awareness, I am within my rights to dismantle a piece of equipment that is well exceeding its operating parameters. This isn't true of the human body at all.

#239
General TSAR
  • Members
  • 4 384 messages

Xilizhra wrote...

That depends on whether unplugging it would cause it to lose mental integrity.

Mental Integrity? LOL!

OK, it loses its ability to form coherent thoughts; your move.

#240
jtav
  • Members
  • 13 965 messages
I treat AIs as persons within the ME verse for roughly the same reason I accept a variety of things that break science: the fictional world's rules are different. The game answers the question of EDI's personhood with a resounding yes, whatever I might think in the real world.

#241
FlamingBoy
  • Members
  • 3 064 messages
It will be a long time, my friends, before artificial intelligence becomes a reality (probably never). Yes, computers have grown exponentially in the last decade, and they will continue to do so.

However, a complex creation such as EDI, which exhibits actual emotions and emotional attachment that could constitute "love" (which was one of BioWare's poorer decisions), is beyond any technology currently possible.

What I am trying to say is: avoid the "Democrats for AI Rights" convention :P

#242
AresKeith
  • Members
  • 34 128 messages

General TSAR wrote...

Xilizhra wrote...
You're correct insofar as there's no such thing as AI now. Rights will come when the AIs do.

I can imagine you being a Skynet apologist.


Wouldn't be surprised if Xil already is

#243
General TSAR
  • Members
  • 4 384 messages

jtav wrote...
 The game answers the question of EDI's personhood with a resounding yes.

It does?

Weird, I could have sworn Oleg called the AI equipment, not crew.

#244
FlamingBoy
  • Members
  • 3 064 messages

Steelcan wrote...

People can legally own new species that are created by genetic engineering; I'm willing to bet AIs will fall into the same category.


I assume you're talking about Monsanto; those are patents, hence rights over the subject for a limited time. Not technically ownership of, say, a specific sentient plant :P

Also, such laws are limited to certain countries (thankfully!!!)

#245
Xilizhra
  • Members
  • 30 873 messages

Steelcan wrote...

People can legally own new species that are created by genetic engineering; I'm willing to bet AIs will fall into the same category.

They can patent the creation process and the genes used, but that's not the same thing as controlling a sapient being.

KaiserShep wrote...

No. Firstly, a human clone is in no way similar to a manufactured piece of machinery. It doesn't matter if the human in question was grown in a tube. You can't own the rights to the human body like you would to an SSD or microprocessor in a laptop. If my laptop were to spontaneously develop self-awareness, I am within my rights to dismantle a piece of equipment that is well exceeding its operating parameters. This isn't true of the human body at all.

Semantic bull****. What if the company in question installed some kind of microcomputer within the cloned human body that somehow keeps the body working, and then ripped it out? Would that be within their rights?

General TSAR wrote...

Mental Integrity? LOL!

OK, it loses its ability to form coherent thoughts; your move.

Neutralize you until such time as I can get it mobile somehow.

#246
Br3admax
  • Members
  • 12 316 messages
"This should be legal because what if all of this illegal stuff happened!"

#247
Guest_Cthulhu42_*
  • Guests

General TSAR wrote...

jtav wrote...
 The game answers the question of EDI's personhood with a resounding yes.

It does?

Weird, I could have sworn Oleg called the AI equipment, not crew.

I also remember siding with Chakwas over Adams regarding machine personhood.

#248
Steelcan
  • Members
  • 23 291 messages
I hate to go Jurassic Park on this but...

If, say, the woolly mammoth is brought back from extinction, and there are groups attempting this, its legal status would probably be property, as it was created for a function by a human. AIs will likely be similar.

#249
Steelcan
  • Members
  • 23 291 messages

Xilizhra wrote...

General TSAR wrote...

Mental Integrity? LOL!

OK, it loses its ability to form coherent thoughts; your move.

Neutralize you until such time as I can get it mobile somehow.

"I swear your honor his computer was alive so I had to restrain him until I could save it from his oppresion"

#250
Xilizhra
  • Members
  • 30 873 messages

Steelcan wrote...

I hate to go Jurassic Park on this but...

If, say, the woolly mammoth is brought back from extinction, and there are groups attempting this, its legal status would probably be property, as it was created for a function by a human. AIs will likely be similar.

Possibly, though of course they're not sapient. And they could still be legally removed if they were being abused, or the like.