Destroy is NOT genocide.


1304 replies to this topic

#226
Fixers0
  • Members
  • 4,434 posts

Kabooooom wrote...
I already did. That definition is the scientific definition, and it is used to explore questions of sentience in non-human animals. It is objectively defined, and objectively useful. Do you disagree that the definition of sentience is the one that I presented? If so, then present an equally valid and equally useful definition. I'd love to hear it.


And did the people who came up with that definition have a perfect understanding of sentience? I think nobody does, as the human brain remains one of the most fascinating objects in science.

#227
Kabooooom
  • Members
  • 3,996 posts

The Geth's idea of "sentience" is only a limited representation of what we organics perceive as sentience. It also helps to know that their "sentience" was achieved in a laboratory.


a) The sentience of the Geth that was presented in the story was simply different - that's all. For example, the Geth do not perceive a sense of self. All Geth are Legion, and Legion is all Geth. In many ways the Geth are the most alien species in all of Mass Effect.

b) What does a laboratory have to do with it? That doesn't change the definition of sentience, does it? If I cloned a goat in a laboratory, that wouldn't make the goat any less sentient.

And did the people who came up with that definition have a perfect understanding of sentience? I think nobody does, as the human brain remains one of the most fascinating objects in science.


I agree, we do not fully understand consciousness. But we do understand a great deal about it. And consequently, we can define certain things at their most rudimentary level. Sentience is defined merely as being conscious, in simplest terms. You can't have one without the other. 

And the reason why this is a useful definition is because consciousness is inseparable from brain function. You cannot scientifically investigate one without the other - they are inextricably tied together, because they are one and the same thing. It is useful to define such things because we can, even if we don't have a complete understanding.

Similarly, we don't have a complete understanding of gravity - and yet we can still define it.

Edited by Kabooooom, 08 October 2012 - 06:19.


#228
Eterna
  • Members
  • 7,417 posts

ghost9191 wrote...

@Eterna5

Your headcanon is no more valid than mine, and that is what I mean. When you have someone watching over you like that, how would you grow, with them solving your problems and rebuilding for you? Is it not better to have the races learn and rebuild on their own? Eh, this is off topic though.


The Asari and the Protheans had the same relationship; it seemed to work well for them.

#229
AngryFrozenWater
  • Members
  • 9,081 posts

Iconoclaste wrote...

AngryFrozenWater wrote...

No. You do not like the answer. You gave information that the AI will use as a fact. It seems to me that you want to hear an answer that I did not give you. Let me put it another way: If your brain is isolated without any input from the outside and is kept alive then whatever it thinks depends at least on its history. There are lots of studies on the subject. What do you want to hear?

No interaction, no "need" to satisfy, no danger around, what would an AI think about? Yet you purposefully put this AI in a context forcing it to make a decision. I did not say the AI was running on batteries or felt rust attacking its circuitry, but you seem to take the subject onto any grounds to avoid the question, which is simple. Calling for "studies" is another way to try to gain credit, but that still doesn't answer the question. The AI did not sustain any "brain damage"; it is perfectly healthy. Just try to set aside everything that could come to mind to divert you from answering the question I asked, if you can.

Isolated brain in vitro. Pick what you like there. Or try brain in a vat. From philosophy to fiction. ;)

Edited by AngryFrozenWater, 08 October 2012 - 06:18.


#230
Jamie9
  • Members
  • 4,172 posts

Eterna5 wrote...
The Asari and the Protheans had the same relationship; it seemed to work well for them.


It didn't work well for the Y chromosome. :lol:

#231
ghost9191
  • Members
  • 2,287 posts

Eterna5 wrote...

ghost9191 wrote...

@Eterna5

Your headcanon is no more valid than mine, and that is what I mean. When you have someone watching over you like that, how would you grow, with them solving your problems and rebuilding for you? Is it not better to have the races learn and rebuild on their own? Eh, this is off topic though.


The Asari and the Protheans had the same relationship; it seemed to work well for them.


The Protheans were wiped out, which left the Asari alone, and that is my point. The Protheans were wiped out, leaving the Asari to learn and grow on their own. The Reapers get wiped out, leaving the races to grow on their own rather than having a future handed to them, blinding them to other alternatives. But that is more of an argument against Synthesis.

#232
dreman9999
  • Members
  • 19,067 posts

ghost9191 wrote...

Eterna5 wrote...

ghost9191 wrote...

Eterna5 wrote...

ghost9191 wrote...

@dreman9999


Wait wait wait wait. It has no emotion. Nowhere is it crying over the loss. It knew of the loss but could not experience true emotion, only emulate it. Same as the Catalyst, which would probably be partly why the Catalyst saw no problem with its solution, unlike we organics.

Face it, Shepalyst is going to turn into an evil tyrannical thing with an overwhelming synthetic force at its hands to do its bidding, wiping out anything that gets uppity and forcing everyone to worship it as some kind of AI god.


Holy imagination and headcanon, bro.

But seriously, you have no proof; you're making wild claims with no evidence and it's silly.


Glad BSN has yet to find its sense of humor. Thought we already settled this a bit back. Although I do stand by that they emulate emotions but don't have them.


Synthetics emulate emotions by using positive and negative feedback. EDI says as much. They can program themselves to receive positive feedback for doing certain actions.

It's logical that if Shepard rewrote the Catalyst, it would receive positive feedback from doing actions that your Shepard would have endorsed, such as being a Galactic Peacekeeper if you're Paragon.


Point missed. That is all fine, but the whole argument was that they do not have emotions. And being a galactic peacekeeper sounds all well and good in theory, but nice having that whole free will stripped away.

And yes, Shepard will use them to enforce peace, which is great, until you start controlling everyone's lives and they get fed up with it. There is only so far you can go with force; the Reapers won't do what the Shep-Catalyst wants until it uses them once, and that is to strike fear into everyone in order to maintain peace, otherwise people would provoke attacks.

Tried to word that right.

But yeah, I get it, but they still don't truly have emotions like organics. That's the point of Synthesis, and the point I was making. Simple.

One conversation with EDI and Legion proves you wrong. Even the Catalyst has emotion. Even the Reapers.
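The feedback-based emotion emulation described above (an AI assigning itself positive or negative feedback for certain actions and preferring the positive ones) can be sketched as a toy illustration. This is purely hypothetical: the action names and reward values are invented, and nothing here reflects how the game or any real AI system is actually implemented.

```python
# Toy sketch of "emotion via positive/negative feedback":
# the agent tags actions with feedback values and prefers high-value ones.
# All names and numbers below are invented for illustration.

REWARDS = {
    "protect_civilians": 1.0,   # actions the agent has tagged as positive
    "keep_peace": 0.5,
    "harm_civilians": -1.0,     # actions tagged as negative
}

def preference(action: str) -> float:
    """Return the learned feedback value for an action (0.0 if unknown)."""
    return REWARDS.get(action, 0.0)

def choose(actions):
    """Pick the action with the highest feedback value."""
    return max(actions, key=preference)

print(choose(["harm_civilians", "keep_peace", "protect_civilians"]))
# prints "protect_civilians"
```

A "paragon" reward table like this one makes the agent favor peacekeeping actions; rewriting the table would change what the agent "wants", which is the sense in which such an AI could reprogram its own feedback.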

#233
Jamie9
  • Members
  • 4,172 posts

dreman9999 wrote...
One conversation with EDI and Legion proves you wrong. Even the Catalyst has emotion. Even the Reapers.


Mmm hmm (Adam Jensen style).

Talk to Avina in ME1 and you can clearly see the difference between a VI and a full AI with emotional capabilities.

#234
Iconoclaste
  • Members
  • 1,469 posts

Fixers0 wrote...

Kabooooom wrote...
I already did. That definition is the scientific definition, and it is used to explore questions of sentience in non-human animals. It is objectively defined, and objectively useful. Do you disagree that the definition of sentience is the one that I presented? If so, then present an equally valid and equally useful definition. I'd love to hear it.


And did the people who came up with that definition have a perfect understanding of sentience? I think nobody does, as the human brain remains one of the most fascinating objects in science.

It's funny to read here that some definitive consensus is deemed "achieved" on the terms "thinking" and "sentience" simply because a group wrote down some definitions. Actually, very bright people are holding public sessions to get feedback and questions from anyone interested in these subjects, because they all agree that there is so much left to be learned that it would be presumptuous to rely on the present state of knowledge as a definitive basis for development. The fact that this is called an "area of research" should be a sufficient hint about the value of "definitive answers" or "definitions".

#235
dreman9999
  • Members
  • 19,067 posts

ghost9191 wrote...

Eterna5 wrote...

ghost9191 wrote...

@Eterna5

Your headcanon is no more valid than mine, and that is what I mean. When you have someone watching over you like that, how would you grow, with them solving your problems and rebuilding for you? Is it not better to have the races learn and rebuild on their own? Eh, this is off topic though.


The Asari and the Protheans had the same relationship; it seemed to work well for them.


The Protheans were wiped out, which left the Asari alone, and that is my point. The Protheans were wiped out, leaving the Asari to learn and grow on their own. The Reapers get wiped out, leaving the races to grow on their own rather than having a future handed to them, blinding them to other alternatives. But that is more of an argument against Synthesis.



I'm sorry, but it's the same case. The Protheans left their tech with them and they advanced. Heck, without the Protheans the Asari would not be here.

#236
Mcfly616
  • Members
  • 8,988 posts

Eterna5 wrote...

Mcfly616 wrote...

RadicalDisconnect wrote...

Mcfly616 wrote...

I don't care if it had a soul or not. It's still not genocide.


Care to explain?

sure

My Shepard's goal since ME1 was to destroy the Reapers. I did not waver or second-guess myself at the very last minute when my one and only chance of victory was at hand. I knew from the beginning that there would be tremendous sacrifice. However, I didn't install Reaper upgrades in the Geth. I had no way of knowing Destroy would destroy all Reaper tech until it was time to make a decision. It's not like I threw all the Geth in prison camps and proceeded to methodically exterminate them. They were collateral damage from my decision to wipe out the Reapers. By a set of unforeseen circumstances, they were affected by the Crucible. Victory at any cost... even though synthetic life will occur again. So, YAY!!! lol


That's akin to saying all Humans being killed is okay because somewhere in the galaxy other organic life will eventually evolve. 

Your point? I wrote all of that and all you can focus on is my last line? If said humans were to be collateral damage of some chain reaction of events, well, so be it. I didn't systematically proceed to round them up and exterminate the entire species.

#237
Hanako Ikezawa
  • Members
  • 29,692 posts

Mcfly616 wrote...

Eterna5 wrote...

Mcfly616 wrote...

RadicalDisconnect wrote...

Mcfly616 wrote...

I don't care if it had a soul or not. It's still not genocide.


Care to explain?

sure

My Shepard's goal since ME1 was to destroy the Reapers. I did not waver or second-guess myself at the very last minute when my one and only chance of victory was at hand. I knew from the beginning that there would be tremendous sacrifice. However, I didn't install Reaper upgrades in the Geth. I had no way of knowing Destroy would destroy all Reaper tech until it was time to make a decision. It's not like I threw all the Geth in prison camps and proceeded to methodically exterminate them. They were collateral damage from my decision to wipe out the Reapers. By a set of unforeseen circumstances, they were affected by the Crucible. Victory at any cost... even though synthetic life will occur again. So, YAY!!! lol


That's akin to saying all Humans being killed is okay because somewhere in the galaxy other organic life will eventually evolve. 

Your point? I wrote all of that and all you can focus on is my last line? If said humans were to be collateral damage of some chain reaction of events, well, so be it. I didn't systematically proceed to round them up and exterminate the entire species.

You shot the tube, so technically yes you did.

#238
dreman9999
  • Members
  • 19,067 posts

Fixers0 wrote...

Kabooooom wrote...
I already did. That definition is the scientific definition, and it is used to explore questions of sentience in non-human animals. It is objectively defined, and objectively useful. Do you disagree that the definition of sentience is the one that I presented? If so, then present an equally valid and equally useful definition. I'd love to hear it.


And did the people who came up with that definition have a perfect understanding of sentience? I think nobody does, as the human brain remains one of the most fascinating objects in science.

Of course they do, because they themselves are sentient. The issue is deciding what we can say is not sentient. The Geth meet every detail under the definition of sentience... thus they are.

#239
ghost9191
  • Members
  • 2,287 posts

dreman9999 wrote...

ghost9191 wrote...

Eterna5 wrote...

ghost9191 wrote...

@Eterna5

Your headcanon is no more valid than mine, and that is what I mean. When you have someone watching over you like that, how would you grow, with them solving your problems and rebuilding for you? Is it not better to have the races learn and rebuild on their own? Eh, this is off topic though.


The Asari and the Protheans had the same relationship; it seemed to work well for them.


The Protheans were wiped out, which left the Asari alone, and that is my point. The Protheans were wiped out, leaving the Asari to learn and grow on their own. The Reapers get wiped out, leaving the races to grow on their own rather than having a future handed to them, blinding them to other alternatives. But that is more of an argument against Synthesis.



I'm sorry, but it's the same case. The Protheans left their tech with them and they advanced. Heck, without the Protheans the Asari would not be here.


Yes, which they had to figure out on their own. The Reapers give knowledge and rebuild; the races do not have to achieve an understanding of it or anything. Just as with the Citadel: the Keepers were always there, so they did not need to dig deeper to truly figure out what it is.

Without the Reapers, the races will have to rebuild on their own and overcome on their own, just as the Asari had to.

#240
Iconoclaste
  • Members
  • 1,469 posts

AngryFrozenWater wrote...

Iconoclaste wrote...

AngryFrozenWater wrote...

No. You do not like the answer. You gave information that the AI will use as a fact. It seems to me that you want to hear an answer that I did not give you. Let me put it another way: If your brain is isolated without any input from the outside and is kept alive then whatever it thinks depends at least on its history. There are lots of studies on the subject. What do you want to hear?

No interaction, no "need" to satisfy, no danger around, what would an AI think about? Yet you purposefully put this AI in a context forcing it to make a decision. I did not say the AI was running on batteries or felt rust attacking its circuitry, but you seem to take the subject onto any grounds to avoid the question, which is simple. Calling for "studies" is another way to try to gain credit, but that still doesn't answer the question. The AI did not sustain any "brain damage"; it is perfectly healthy. Just try to set aside everything that could come to mind to divert you from answering the question I asked, if you can.

Isolated brain in vitro. Pick what you like there. Or try brain in a vat. From philosophy to fiction. ;)

If you really understood the simple question I asked, you wouldn't have put up these links. You are failing over and again to acknowledge my use of the term "no interaction". In both links there is "outside interaction", hence "stimuli". That is not my question. The technique you are using reveals the area you are trying to avoid: a "sentience" with no "contact" or "outside stimuli". Put the AI-thing in a closed room with no windows and no noise, turn the thing "on", and watch the "thinking" it will perform. Are you already inferring that the AI needs a "history" to be "sentient"?

Anyway, don't waste more time on this; it's not going to change a thing. The story tells something in a neutral context; it never asks the "players" to doubt that "AI is sentient", so the players just take it for granted as needed. Is it realistic that such "sentience" would mimic "organic sentience" to perfection? I doubt it, because there would be no logical reason for powerful synthetics to have their logic biased by the randomness of emotions.

Edited by Iconoclaste, 08 October 2012 - 06:43.


#241
dreman9999
  • Members
  • 19,067 posts

LDS Darth Revan wrote...

Mcfly616 wrote...

Eterna5 wrote...

Mcfly616 wrote...

RadicalDisconnect wrote...

Mcfly616 wrote...

I don't care if it had a soul or not. It's still not genocide.


Care to explain?

sure

My Shepard's goal since ME1 was to destroy the Reapers. I did not waver or second-guess myself at the very last minute when my one and only chance of victory was at hand. I knew from the beginning that there would be tremendous sacrifice. However, I didn't install Reaper upgrades in the Geth. I had no way of knowing Destroy would destroy all Reaper tech until it was time to make a decision. It's not like I threw all the Geth in prison camps and proceeded to methodically exterminate them. They were collateral damage from my decision to wipe out the Reapers. By a set of unforeseen circumstances, they were affected by the Crucible. Victory at any cost... even though synthetic life will occur again. So, YAY!!! lol


That's akin to saying all Humans being killed is okay because somewhere in the galaxy other organic life will eventually evolve. 

Your point? I wrote all of that and all you can focus on is my last line? If said humans were to be collateral damage of some chain reaction of events, well, so be it. I didn't systematically proceed to round them up and exterminate the entire species.

You shot the tube, so technically yes you did.

The thing you're missing here is the full equation. The act of genocide was done because of the extreme events around Shepard. The only time you can say the choice is the will of the person making it is when the choice is made in an event that does not force it. If you had no reason to and did it for some political/social reason, then the person is at full fault.

This is a choose-or-everyone-dies event.

#242
Hanako Ikezawa
  • Members
  • 29,692 posts

dreman9999 wrote...
The thing you're missing here is the full equation. The act of genocide was done because of the extreme events around Shepard. The only time you can say the choice is the will of the person making it is when the choice is made in an event that does not force it. If you had no reason to and did it for some political/social reason, then the person is at full fault.

This is a choose-or-everyone-dies event.

But it was a choice because the Catalyst gave you other options. I'm not saying nobody should choose Destroy, but they must accept that genocide is a result of that choice.

#243
AngryFrozenWater
  • Members
  • 9,081 posts

Iconoclaste wrote...

AngryFrozenWater wrote...

Iconoclaste wrote...

AngryFrozenWater wrote...

No. You do not like the answer. You gave information that the AI will use as a fact. It seems to me that you want to hear an answer that I did not give you. Let me put it another way: If your brain is isolated without any input from the outside and is kept alive then whatever it thinks depends at least on its history. There are lots of studies on the subject. What do you want to hear?

No interaction, no "need" to satisfy, no danger around, what would an AI think about? Yet you purposefully put this AI in a context forcing it to make a decision. I did not say the AI was running on batteries or felt rust attacking its circuitry, but you seem to take the subject onto any grounds to avoid the question, which is simple. Calling for "studies" is another way to try to gain credit, but that still doesn't answer the question. The AI did not sustain any "brain damage"; it is perfectly healthy. Just try to set aside everything that could come to mind to divert you from answering the question I asked, if you can.

Isolated brain in vitro. Pick what you like there. Or try brain in a vat. From philosophy to fiction. ;)

If you really understood the simple question I asked, you wouldn't have put up these links. You are failing over and again to acknowledge my use of the term "no interaction". In both links there is "outside interaction", hence "stimuli". That is not my question. The technique you are using reveals the area you are trying to avoid: a "sentience" with no "contact" or "outside stimuli". Put the AI-thing in a closed room with no windows and no noise, turn the thing "on", and watch the "thinking" it will perform. Are you already inferring that the AI needs a "history" to be "sentient"?

If you understood the basic idea that started this "discussion" then we wouldn't have it at all.

Edited by AngryFrozenWater, 08 October 2012 - 06:43.


#244
dreman9999
  • Members
  • 19,067 posts

Iconoclaste wrote...

AngryFrozenWater wrote...

Iconoclaste wrote...

AngryFrozenWater wrote...

No. You do not like the answer. You gave information that the AI will use as a fact. It seems to me that you want to hear an answer that I did not give you. Let me put it another way: If your brain is isolated without any input from the outside and is kept alive then whatever it thinks depends at least on its history. There are lots of studies on the subject. What do you want to hear?

No interaction, no "need" to satisfy, no danger around, what would an AI think about? Yet you purposefully put this AI in a context forcing it to make a decision. I did not say the AI was running on batteries or felt rust attacking its circuitry, but you seem to take the subject onto any grounds to avoid the question, which is simple. Calling for "studies" is another way to try to gain credit, but that still doesn't answer the question. The AI did not sustain any "brain damage"; it is perfectly healthy. Just try to set aside everything that could come to mind to divert you from answering the question I asked, if you can.

Isolated brain in vitro. Pick what you like there. Or try brain in a vat. From philosophy to fiction. ;)

If you really understood the simple question I asked, you wouldn't have put up these links. You are failing over and again to acknowledge my use of the term "no interaction". In both links there is "outside interaction", hence "stimuli". That is not my question. The technique you are using reveals the area you are trying to avoid: a "sentience" with no "contact" or "outside stimuli". Put the AI-thing in a closed room with no windows and no noise, turn the thing "on", and watch the "thinking" it will perform. Are you already inferring that the AI needs a "history" to be "sentient"?

What you're missing here is a case of the form and instincts of said beings. A human has inborn desires to interact. An AI has to have that programmed or learned. An AI also has other ways to move and interact with things. The AI stuck in a room has no problem with it because it was made that way. An AI can be made and can develop in contrast to that.

EDI in her original form would have no problems... a Geth, on the other hand, which is made to be collaborative and learned to be so on its own, would not. Also, an AI can develop to a point where it would have a problem being alone.

#245
dreman9999
  • Members
  • 19,067 posts

LDS Darth Revan wrote...

dreman9999 wrote...
The thing you're missing here is the full equation. The act of genocide was done because of the extreme events around Shepard. The only time you can say the choice is the will of the person making it is when the choice is made in an event that does not force it. If you had no reason to and did it for some political/social reason, then the person is at full fault.

This is a choose-or-everyone-dies event.

But it was a choice because the Catalyst gave you other options. I'm not saying nobody should choose Destroy, but they must accept that genocide is a result of that choice.

I'm not saying it's not genocide. 

#246
dreman9999
  • Members
  • 19,067 posts

ghost9191 wrote...

dreman9999 wrote...

ghost9191 wrote...

Eterna5 wrote...

ghost9191 wrote...

@Eterna5

Your headcanon is no more valid than mine, and that is what I mean. When you have someone watching over you like that, how would you grow, with them solving your problems and rebuilding for you? Is it not better to have the races learn and rebuild on their own? Eh, this is off topic though.


The Asari and the Protheans had the same relationship; it seemed to work well for them.


The Protheans were wiped out, which left the Asari alone, and that is my point. The Protheans were wiped out, leaving the Asari to learn and grow on their own. The Reapers get wiped out, leaving the races to grow on their own rather than having a future handed to them, blinding them to other alternatives. But that is more of an argument against Synthesis.



I'm sorry, but it's the same case. The Protheans left their tech with them and they advanced. Heck, without the Protheans the Asari would not be here.


Yes, which they had to figure out on their own. The Reapers give knowledge and rebuild; the races do not have to achieve an understanding of it or anything. Just as with the Citadel: the Keepers were always there, so they did not need to dig deeper to truly figure out what it is.

Without the Reapers, the races will have to rebuild on their own and overcome on their own, just as the Asari had to.

But that does not deter the choice of Control. It does not make it better, just different.

#247
Iconoclaste
  • Members
  • 1,469 posts

AngryFrozenWater wrote...

If you understood the basic idea that started this "discussion" then we wouldn't have it at all.

You are simply not answering the question, and forcing the issue on "interaction" to try to prove that an AI would "think" even if it is not required to.

#248
Hanako Ikezawa
  • Members
  • 29,692 posts

dreman9999 wrote...

LDS Darth Revan wrote...

dreman9999 wrote...
The thing you're missing here is the full equation. The act of genocide was done because of the extreme events around Shepard. The only time you can say the choice is the will of the person making it is when the choice is made in an event that does not force it. If you had no reason to and did it for some political/social reason, then the person is at full fault.

This is a choose-or-everyone-dies event.

But it was a choice because the Catalyst gave you other options. I'm not saying nobody should choose Destroy, but they must accept that genocide is a result of that choice.

I'm not saying it's not genocide. 

Sorry, I misunderstood where you were going with this part.

#249
dreman9999
  • Members
  • 19,067 posts

Iconoclaste wrote...

AngryFrozenWater wrote...

If you understood the basic idea that started this "discussion" then we wouldn't have it at all.

You are simply not answering the question, and forcing the issue on "interaction" to try to prove that an AI would "think" even if it is not required to.

What you're missing here is a case of the form and instincts of said beings. A human has inborn desires to interact. An AI has to have that programmed or learned. An AI also has other ways to move and interact with things. The AI stuck in a room has no problem with it because it was made that way. An AI can be made and can develop in contrast to that.

EDI in her original form would have no problems... a Geth, on the other hand, which is made to be collaborative and learned to be so on its own, would not. Also, an AI can develop to a point where it would have a problem being alone.

Edited by dreman9999, 08 October 2012 - 06:50.


#250
Mcfly616
  • Members
  • 8,988 posts

LDS Darth Revan wrote...

dreman9999 wrote...
The thing your missing here is the full equation. The act of genocide was done becasue of the extreme events around Shepard. The only time  you can  say the choice is the will of the person doing it is when the choice is choosen in an event not forcing it. If you had no reason to and did it for some political /social reason, then the person is at full fault.

This a choise or everyone dies event.

But it was a choice because the Catalyst gave you other options. I'm not saying nobody should choose Destroy, but they must accept that genocide is a result of that choice.

That's not genocide. And there is nothing "systematic" or "methodical" about shooting the tube. That's like saying a guy who chooses to save an entire train of people while sacrificing his 2-year-old in the wreck is a "murderer". Well, view it however you like, but I don't think he set out that day to kill his son... I think an unforeseen set of circumstances led to a point where the man was met with a choice he had to make, and he made a decision for the greater good. I wouldn't label him a murderer. Just as I don't label Destroy genocide.