Do Geth Have "Souls"? On the relativity of "life".


This topic has had 123 replies

#26
Monica21
  • Members
  • 5603 posts

Person is human or human like. Ha!

So again, why should an artificially intelligent life form desire to be like us or like a person?

 

Have you ever made a "gut" decision that is completely irrational and illogical, but you turned out to be right? Well, that's a tiny piece of the difference between human and machine. Garry Kasparov beat Deep Blue until one time he didn't, and then IBM dismantled Deep Blue and it never played chess again. Kasparov didn't win because he was smarter than the machine. He beat Deep Blue because he thought like a human. Just in the realm of decision-making alone, there's more to making the "right" decision than just how smart you are. And that's leaving out all that stuff like art, love, music, and all those other things that make us stronger (or weaker, depending on your point of view) than machines.



#27
Monica21
  • Members
  • 5603 posts

It's not political correctness or picking the people I care about above others; to me it's a matter of justice. The question is, why would I do that? I never really understood the severe stigma around A.I.s, and I would never have committed genocide without having all the info required and knowing both sides of the story. I actually let the Quarians die in my first playthrough because I didn't have enough peace points, thanks to ME2's stupid Paragon/Renegade system. It was deeply unpleasant, since the Civilian Fleet didn't really want this war and the quarian civilians suffered heavily from the Geth/Quarian war, but I had to make a choice, and the Quarians started the war. After that, I replayed the Rannoch arc because I couldn't live with that choice. But that's beside the point. The point is, when it comes to people I know nothing about, I'm a neutral force. I hold neither good nor bad feelings toward them, but I always seek to know the truth behind everything, and to choose the right thing, to choose justice. And I have been like this ever since I can remember. Even when I was 7 and didn't yet have all the info I needed to form my opinion on the matter, it was simple to me: why would I want to hurt innocent people?

 

It doesn't have anything to do with wanting to hurt innocent people. Sane people don't want to hurt innocent people. But if I had to choose between saving my family and saving yours, I'd save mine.



#28
Guest_AugmentedAssassin_*
  • Guests

Have you ever made a "gut" decision that is completely irrational and illogical, but you turned out to be right? Well, that's a tiny piece of the difference between human and machine. Garry Kasparov beat Deep Blue until one time he didn't, and then IBM dismantled Deep Blue and it never played chess again. Kasparov didn't win because he was smarter than the machine. He beat Deep Blue because he thought like a human. Just in the realm of decision-making alone, there's more to making the "right" decision than just how smart you are. And that's leaving out all that stuff like art, love, music, and all those other things that make us stronger (or weaker, depending on your point of view) than machines.

 

This whole concept of "machines" is stereotypical and wrong. Of course an algorithm-based computer would never defeat a human brain, but I think there are other ways by which we can achieve artificial sentience.

 

 

It doesn't have anything to do with wanting to hurt innocent people. Sane people don't want to hurt innocent people. But if I had to choose between saving my family and saving yours, I'd save mine.

 

I don't see it that way. If I were actually put in that situation, I'd do the best I could to save as many people as possible, and then save my own ass. And since I'm very resourceful, that approach hasn't failed me in my life yet.



#29
Iakus

Iakus
  • Members
  • 30200 Messaggi:

Absolutely. The same way alien life comes second to human life, the same way lives from my country are more important than lives from your country, and the same way family and friends are more important than strangers. In the end, it comes down to the very human nature of "the closer something is to me, the more I care about it." Nobody really cares about batarians, because we never had a batarian squadmate. No one here would give a **** about the geth if it wasn't for Legion.

The Bear and the Dog analogy.

 

That wasn't too popular here not so long ago  ;)



#30
Laughing_Man
  • Members
  • 3607 posts

The Bear and the Dog analogy.

 

That wasn't too popular here not so long ago  ;)

 

Here you will mostly find scorn for Dude!Shep the bumbling paragon.

If only BioWare didn't feel the need to equate goodness and fairness with naivety and idiocy throughout the trilogy...



#31
Guest_AugmentedAssassin_*
  • Guests

Here you will mostly find scorn for Dude!Shep the bumbling paragon.

If only BioWare didn't feel the need to equate goodness and fairness with naivety and idiocy throughout the trilogy...

 

I don't really think believing in justice is naive.


  • Iakus likes this

#32
Laughing_Man
  • Members
  • 3607 posts

I don't really think believing in justice is naive.

 

The paragon is all about emotional decisions, hoping for the best and staking the entire galaxy on gut feelings. Not about justice.

 

It could have been presented differently - a paragon could have made decisions in a more calculated responsible way, but in game that is the impression I got.

 

It is only through the force majeure of a writer's intervention that some of Paragon Shepard's decisions didn't come back to bite him.

 

Doing good shouldn't be something you do because you see the reward dangling at the end of the road, you should do it even when it has negative consequences, especially then. Making Paragon the compassionate AND rewarded choice in most cases is downright idiotic. The appeal of ruthlessness is precisely because without morals your road is easier and more rewarding in many cases.

 

Finally, the dialogue for paragon is painfully naive and almost childish at points.



#33
SuperJogi
  • Members
  • 175 posts

I never really understood the severe stigma about  A.I.s

 

There isn't a stigma. You know that Pixar movie WALL-E? Most people liked that robot more than they did the humans in that film. You know why? Because it was cute. It had human emotions and followed human behavior. People were able to empathize with it. The reason why A.I.s usually aren't as highly regarded as organics is simply that a lot of people have greater difficulty empathizing with them. It's a lot more difficult to project human emotions onto them, so for a lot of people they are just machines. It's the same reason why people go berserk when you kill a dog and then go to McDonald's to eat a hamburger. Dogs are cute and people are able to project human emotions onto them, while cows are just some farm animal that they never even saw getting slaughtered. There is no logical difference, but a huge emotional one for a lot of people.

 

Why would i want to hurt innocent people?

 

Never said you would, and most people wouldn't either. But sometimes you have to, and if there is no logical difference, you will always prefer to kill faceless strangers over people that you empathize with, because it has less of an emotional effect on you. Or are you really trying to tell me that a stranger dying in a car crash has the same emotional effect on you as your mother dying in a car crash? You simply don't care about faceless strangers. That doesn't mean that you wish them harm; it simply means that their fate has no great emotional impact on you. Or to finish with a Joseph Stalin quote:

"The death of one man is a tragedy. The death of millions is a statistic."
 


#34
SuperJogi
  • Members
  • 175 posts

I don't really think believing in justice is naive.

 

Believing in justice isn't naive; it's a good ideal to have. But believing you can bring justice to everyone is.


  • Laughing_Man and Monica21 like this

#35
Guest_AugmentedAssassin_*
  • Guests

The paragon is all about emotional decisions, hoping for the best and staking the entire galaxy on gut feelings. Not about justice.

 

It could have been presented differently - a paragon could have made decisions in a more calculated responsible way, but in game that is the impression I got.

 

It is only through the force majeure of a writer's intervention that some of Paragon Shepard's decisions didn't come back to bite him.

 

Doing good shouldn't be something you do because you see the reward dangling at the end of the road, you should do it even when it has negative consequences, especially then. Making Paragon the compassionate AND rewarded choice in most cases is downright idiotic. The appeal of ruthlessness is precisely because without morals your road is easier and more rewarding in many cases.

 

Finally, the dialogue for paragon is painfully naive and almost childish at points.

 

And I agree. That's why I'm a Paragade. I've made Renegade choices before, a bunch of them really. I go with what I feel is right.



#36
Guest_AugmentedAssassin_*
  • Guests

 

There isn't a stigma. You know that Pixar movie WALL-E? Most people liked that robot more than they did the humans in that film. You know why? Because it was cute. It had human emotions and followed human behavior. People were able to empathize with it. The reason why A.I.s usually aren't as highly regarded as organics is simply that a lot of people have greater difficulty empathizing with them. It's a lot more difficult to project human emotions onto them, so for a lot of people they are just machines. It's the same reason why people go berserk when you kill a dog and then go to McDonald's to eat a hamburger. Dogs are cute and people are able to project human emotions onto them, while cows are just some farm animal that they never even saw getting slaughtered. There is no logical difference, but a huge emotional one for a lot of people.

 

 

Never said you would, and most people wouldn't either. But sometimes you have to, and if there is no logical difference, you will always prefer to kill faceless strangers over people that you empathize with, because it has less of an emotional effect on you. Or are you really trying to tell me that a stranger dying in a car crash has the same emotional effect on you as your mother dying in a car crash? You simply don't care about faceless strangers. That doesn't mean that you wish them harm; it simply means that their fate has no great emotional impact on you. Or to finish with a Joseph Stalin quote:

"The death of one man is a tragedy. The death of millions is a statistic."
 

 

 

Quoting Stalin doesn't really help your case. :D Just saying.

 

Seriously though, you're right to some extent. When I hear that a stranger died in a car accident, I get angry at the rules that allowed such an accident to happen; I think it shouldn't have happened, and it surely would have made me sad. But when it comes to people I care about, which are very close to zero at this point, I care about their well-being and therefore I don't want them harmed. The case is different. That doesn't mean I put anyone at a second priority. Your example of dogs and cows is actually quite accurate, but that's not the case with me, as I have explained.

 

 

Believing in justice isn't naive, it's a good ideal to have. But believing you can bring justice to everyone, is.

 

Actually, you can, but you just have to think outside the box and change the laws that allowed innocent people to be hurt. Now that I'm mentioning it, I don't see what's so wrong with utopia: a society where everyone does what they want and can be whomever they want without anyone to hurt them or judge them. But to achieve that, sometimes you have to cross the line a bit. For example, in ME3, I sabotaged the genophage cure because the krogan are still the same krogan, even with Wrex. I don't see the difference.



#37
SuperJogi
  • Members
  • 175 posts
Actually, you can, but you just have to think outside the box and change the laws that allowed innocent people to be hurt. Now that I'm mentioning it, I don't see what's so wrong with utopia: a society where everyone does what they want and can be whomever they want without anyone to hurt them or judge them.

 

Do you want a utopia where everyone is safe, or do you want a utopia where everyone is free? If you think both are achievable, you are naive. Every working society is necessarily a compromise between the two, at which point it stops being a utopia.


  • Domar likes this

#38
Guest_AugmentedAssassin_*
  • Guests

Do you want a utopia where everyone is safe, or do you want a utopia where everyone is free? If you think both are achievable, you are naive. Every working society is necessarily a compromise between the two, at which point it stops being a utopia.

 

It's a much larger issue than that. The whole safety-versus-freedom thing is another big discussion that I don't think I should derail this thread with. But if you want to talk about it in PM, you're more than welcome.



#39
Treacherous J Slither
  • Members
  • 1338 posts

Have you ever made a "gut" decision that is completely irrational and illogical, but you turned out to be right? Well, that's a tiny piece of the difference between human and machine. Garry Kasparov beat Deep Blue until one time he didn't, and then IBM dismantled Deep Blue and it never played chess again. Kasparov didn't win because he was smarter than the machine. He beat Deep Blue because he thought like a human. Just in the realm of decision-making alone, there's more to making the "right" decision than just how smart you are. And that's leaving out all that stuff like art, love, music, and all those other things that make us stronger (or weaker, depending on your point of view) than machines.


That's all well and good, but it doesn't answer the question I posed.

#40
Laughing_Man
  • Members
  • 3607 posts

Garry Kasparov beat Deep Blue until one time he didn't, and then IBM dismantled Deep Blue and it never played chess again.

 

Why did they dismantle it? Was there a real fear of it becoming a true AI?



#41
Vazgen
  • Members
  • 4961 posts

I'm with Shepard on this: "Synthetics emulate life. But they aren't truly alive." 



#42
Monica21
  • Members
  • 5603 posts

That's all well and good, but it doesn't answer the question I posed.

 

Sure it does. A human can do more than make logic-based decisions. Logic can be part of it, but not the whole of it. Humans can get feelings about things or sense that something is off, and a machine can't.

 

Why did they dismantle it? Was there a real fear of it becoming a true AI?

 

I don't think there was fear of it becoming an AI; I think they didn't want it to be beaten by Kasparov or any other chess player. It's likely it was just pride on the part of IBM, and they didn't want to give up the title.



#43
Guest_AugmentedAssassin_*
  • Guests

"Organics fear us. We wish to understand, Not incite." - Legion



#44
Treacherous J Slither
  • Members
  • 1338 posts

Sure it does. A human can do more than make logic-based decisions. Logic can be part of it, but not the whole of it. Humans can get feelings about things or sense that something is off, and a machine can't.


I don't think there was fear of it becoming an AI, I think they didn't want it to be beaten by Kasparov or any other chess player. It's likely it was just pride on the part of IBM and they didn't want to give up the title.


The question was not answered. Why do you think human senses and emotions would be desirable to an A.I.? Why should it emulate us?

#45
Domar
  • Members
  • 469 posts

The Mass Effect story was mostly written to appeal to the lowest common denominator (as opposed to the ending which was written to appeal only to the writers themselves), and therefore when the Geth tackle the question of life, they do it by asking about "souls".

 

Now, because of ME3's new "cinematic" direction, Shepard told Legion that "yes, you do have a soul" without waiting for my input.

 

I of course would have preferred: "I have no idea why you think that souls are an actual thing, but if the lowest of organic scum are considered to have souls, I don't see why a sentient toaster can't have one."

 

 

But this is not the real question though, is it?

 

The real question is, do you consider a Synthetic Intelligence to be "alive" like you would consider your fellow humans / aliens?

 

And then, are they somehow less "alive" than an organic? More?

 

Or is it all simply a question of relativity and point of view, where we try to apply limited organic concepts to a form of "life" that is simply too alien for us to properly understand?

 

Your thoughts and reasoning on this issue?

 

Edit:

 

Assuming you consider synthetics "alive", if you had to choose between saving a synthetic "life" and saving an organic life, would you still consider them equal? (without bringing into the question the specific case of Quarians vs. Geth)

 

Now this is very much the realm of philosophy at the highest level! As someone who has taken a course on the philosophy of consciousness at university, I should be able to provide you with a good answer or, rather, an answer capturing the academic views on the matter. However, the question is way too big and complicated to be handled properly in a place like this. So we cannot hope to clarify things as much as needed here, only touch upon the issues a little.

 

In case someone earlier in this thread has already said something along the lines I'm about to say, I apologize for not reading the lot before posting. I will merely point out a couple of the key issues on the subject.

 

To answer a question like the one stated in the title for this thread, here's the first thing you need to do: define "soul".

If you don't have a clear conception of what a "soul" is, then you cannot answer the question.

 

Suppose your answer to the above question is: whatever we refer to when we say "I". That sounds pretty clear. But what does it mean, really? If you know the answer to THAT question, then I'm pretty sure you can answer the first question, too.

 

Semantically, a "machine" is a complex type of construction with a certain function (usually powered by electricity), the whole of which is dead. Period. Such things definitely exist, so we need a name for them - machines. If there ever could be a "machine-like" entity that is "alive" (a body in which there resides an "I"), then we would need to call it something else - not a machine. A "sentient machine" or similar is essentially a contradiction in terms.

 

 

Edit: On an other note...

 

Do you want a utopia where everyone is safe, or do you want a utopia where everyone is free? If you think both are achievable, you are naive. Every working society is necessarily a compromise between the two, at which point it stops being a utopia.

 

I think this is basically true, for the simple reason that some people want things other people don't - like hurting them. Therefore, absolute freedom for everybody is impossible.



#46
Emissary of the Collectors
  • Members
  • 834 posts

Absolutely. The same way alien life comes second to human life, the same way lives from my country are more important than lives from your country, and the same way family and friends are more important than strangers. In the end, it comes down to the very human nature of "the closer something is to me, the more I care about it." Nobody really cares about batarians, because we never had a batarian squadmate. No one here would give a **** about the geth if it wasn't for Legion.

I liked the geth before Legion existed... your move.



#47
sH0tgUn jUliA
  • Members
  • 16812 posts

The Mass Effect story was mostly written to appeal to the lowest common denominator (as opposed to the ending which was written to appeal only to the writers themselves), and therefore when the Geth tackle the question of life, they do it by asking about "souls".

 

Now, because of ME3's new "cinematic" direction, Shepard told Legion that "yes, you do have a soul" without waiting for my input.

 

I of course would have preferred: "I have no idea why you think that souls are an actual thing, but if the lowest of organic scum are considered to have souls, I don't see why a sentient toaster can't have one."

 

 

But this is not the real question though, is it?

 

The real question is, do you consider a Synthetic Intelligence to be "alive" like you would consider your fellow humans / aliens?

 

And then, are they somehow less "alive" than an organic? More?

 

Or is it all simply a question of relativity and point of view, where we try to apply limited organic concepts to a form of "life" that is simply too alien for us to properly understand?

 

Your thoughts and reasoning on this issue?

 

Edit:

 

Assuming you consider synthetics "alive", if you had to choose between saving a synthetic "life" and saving an organic life, would you still consider them equal?

 

This brings into judgment the experiences I have had with said organic life and said synthetic life. If I had had nothing but trouble with said organic life, but said synthetic life had been nothing but helpful to me, I would save the synthetic being. And if the situation were the opposite, I would save the organic life. Because chances are that the patterns of the past would continue. It is logical.



#48
Kurt M.
  • Banned
  • 3051 posts

Don't you worry... the way AI is being developed, we'll have our answer sooner rather than later...

 

http://www.livescien...ion-puzzle.html

 

http://nvidianews.nv...-neural-network

 

...neural networks already exist... I'm afraid D:



#49
Domar
  • Members
  • 469 posts

Don't you worry... the way AI is being developed, we'll have our answer sooner rather than later...

 

http://www.livescien...ion-puzzle.html

 

http://nvidianews.nv...-neural-network

 

...neural networks already exist... I'm afraid D:

 

So an AI or neural network can acquire immense computational power and find patterns in "mountains of data" that human brains have difficulty finding. It does not follow from that that an AI or neural network can ever acquire a soul - unless "soul" is defined as "something with immense computational power", etc. That would be a rather novel definition, I think.



#50
Kurt M.
  • Banned
  • 3051 posts

So an AI or neural network can acquire immense computational power and find patterns in "mountains of data" that human brains have difficulty finding. It does not follow from that that an AI or neural network can ever acquire a soul - unless "soul" is defined as "something with immense computational power", etc. That would be a rather novel definition, I think.

 

Oh, come on. Not even we humans can agree on whether we have a 'soul' or not :P

 

I'd rather drop the 'soul' term in favor of 'self-awareness'. From what I've read, it seems quite likely that in the future a computer will be able to be smarter than humans. Although whether it will be self-aware or not... that's another matter.

 

P.S.: Scary bonus: http://www.livescien...-obstacles.html

 
