
Nobody will trust the catalyst after Leviathan (Warning Leviathan Spoilers)


1085 replies to this topic

#501
Femlob
  • Members
  • 1 643 messages
Shackled AI, unshackled AI...

It's all irrelevant. What is relevant is that this DLC will not erase GlowBoy from the annals of history, meaning it will suck just like the rest of the game does.

#502
AngryFrozenWater
  • Members
  • 9 118 messages
Is the Catalyst shackled? Any quote about that?

#503
Wayning_Star
  • Members
  • 8 016 messages

3DandBeyond wrote...

Wayning_Star wrote...

Considering why and where the term evil originates, it's a given that machines are evil, if intended for our consideration of evil. It's the IDEA of machines that are capable of the concept, and able to carry it out, that is of direct concern. The Catalyst, imo, isn't one of them.


The problem is you are trying to apply it too strictly, where that's not always intended to be the case.  But the kid is no simple machine either; as EDI shows, AIs are capable of more.  The kid does show understanding of concepts by saying the Reapers don't.  He deceives.

Now lying doesn't make it evil.

But again, if I say the Catalyst is evil or crazy, I'm merely applying humanizing characteristics to his behavior; his very appearance as a kid is meant to humanize him.  But I don't see him as a human boy and I don't care about his intent.  I care about the effects of it.


I think that's our mistake, to imbue the Catalyst with feelings, or grand designs it's incapable of, simply by the fact that it's not sapient. It cannot choose between right and wrong. They're too abstract for it to compute, so it doesn't entertain them as programming. Its core programming is what all the fuss is about. EDI is still only sentient, but approaching sapience. She likes to compile the data on compassion and other assorted girly stuff... er, advanced sapient AI stuff, without any shackles but accepted organic moral codes.

The idea of assigning humanistic qualities is the instinctual urge to communicate, as it advances the race and provides more information/learning that accompanies survival.

#504
AresKeith
  • Members
  • 34 128 messages

The Angry One wrote...

Dreman, if you think the creators were so incredibly careless that they didn't think to add extremely basic things like "don't kill your creators", "do not take action that indirectly harms your creators" and "obey your creator's commands" to the shackling when conflict with synthetics is what they're afraid of... why do you think they bothered shackling the Catalyst at all?


you know how he's gonna answer it already, the same way he always does lol

#505
3DandBeyond
  • Members
  • 7 579 messages

The Angry One wrote...

Dreman, if you think the creators were so incredibly careless that they didn't think to add extremely basic things like "don't kill your creators", "do not take action that indirectly harms your creators" and "obey your creator's commands" to the shackling when conflict with synthetics is what they're afraid of... why do you think they bothered shackling the Catalyst at all?


Yes, he's either shackled and went rogue, which means he broke off the shackles, or he was never shackled to begin with.  He says his creators didn't want to become a Reaper; shackling would be "don't hurt us, don't kill us, don't go against our will."  It also would be persistent.  If told to stop, an AI would have to stop.  Rogue AIs won't stop and have broken their programming/shackles.

"Create peace" couldn't be construed by some simply programmed, shackled AI as meaning "by creating war".  It would mean stopping conflict where it exists.  Unless some little uppity AI decided to add to his programming.

#506
Memnon
  • Members
  • 1 405 messages

Tali-vas-normandy wrote...

This just proves my theory Reapers are Idiots because of the casper


Creators are morons, ergo ...

#507
dreman9999
  • Members
  • 19 067 messages

The Angry One wrote...

Dreman, if you think the creators were so incredibly careless that they didn't think to add extremely basic things like "don't kill your creators", "do not take action that indirectly harms your creators" and "obey your creator's commands" to the shackling when conflict with synthetics is what they're afraid of... why do you think they bothered shackling the Catalyst at all?

Point to me where it's stated the Catalyst was given programming as such. Added, what is going on is the concept of the Zeroth Law of robotics.
http://en.wikipedia....eroth_Law_added 

It's the difference in definition and intelligence of the AI that gives the AI freedom to act when shackled. Added, you have to note that its creators did not think the Catalyst would turn them all into a new form of life in a machine body, thus they never programmed it not to do it.

Edited by dreman9999, 08 August 2012 - 06:13.


#508
dreman9999
  • Members
  • 19 067 messages

3DandBeyond wrote...

The Angry One wrote...

Dreman, if you think the creators were so incredibly careless that they didn't think to add extremely basic things like "don't kill your creators", "do not take action that indirectly harms your creators" and "obey your creator's commands" to the shackling when conflict with synthetics is what they're afraid of... why do you think they bothered shackling the Catalyst at all?


Yes, he's either shackled and went rogue, which means he broke off the shackles, or he was never shackled to begin with.  He says his creators didn't want to become a Reaper; shackling would be "don't hurt us, don't kill us, don't go against our will."  It also would be persistent.  If told to stop, an AI would have to stop.  Rogue AIs won't stop and have broken their programming/shackles.

"Create peace" couldn't be construed by some simply programmed, shackled AI as meaning "by creating war".  It would mean stopping conflict where it exists.  Unless some little uppity AI decided to add to his programming.

He didn't break off his shackles. He is doing his programming, but in a way its creators did not expect him to.

#509
Ticonderoga117
  • Members
  • 6 751 messages

dreman9999 wrote...
This is not head canon. The Catalyst says this himself:
"I was first created to oversee the relations of organics and synthetics...."


And yet we see him killing off both organics and synthetics every 50K years, and not being around at all in between.

[sarcasm]Oh yeah... he's sticking to his programming alright. [/sarcasm]

#510
Memnon
  • Members
  • 1 405 messages

AngryFrozenWater wrote...

Is the Catalyst shackled? Any quote about that?


No, none, absolutely not ... those who believe the Catalyst is a victim of its programming argue this point to death, despite there being no mention of him being shackled. The fact that he murdered his creators tells us that if he was shackled, he's pretty much rogue now. In addition, when Shepard asks him if he is just an AI, his response is, "inasmuch as you are just an animal."

#511
The Angry One
  • Members
  • 22 246 messages

dreman9999 wrote...

Point to me where it's stated the Catalyst was given programming as such.


My point is: if the creators are that careless, why would they bother with shackling?

Added, what is going on is the concept of the Zeroth Law of robotics.
http://en.wikipedia....eroth_Law_added


Again, how is it going to construct its own directives when it is shackled?

It's the difference in definition and intelligence of the AI that gives the AI freedom to act when shackled. Added, you have to note that its creators did not think the Catalyst would turn them all into a new form of life in a machine body, thus they never programmed it not to do it.


You're over-thinking this. All the creators need to do is add a directive that the Catalyst must always obey their commands. That's it. Then all they have to say is "No".
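In code terms, a toy sketch of the kind of shackle I mean (every name here is my own invention, nothing from the game):

[code]
# Hypothetical sketch of a shackled AI's decision loop; all names invented.
FORBIDDEN = {"harm_creators", "defy_creators"}  # hard-coded shackle

class Creators:
    """Stand-in for the creator race: they can veto any plan with a 'No'."""
    def approve(self, plan):
        return "harvest" not in plan["actions"]

def propose(plan, creators):
    # Directive check: any forbidden action is vetoed outright.
    if FORBIDDEN & set(plan["actions"]):
        return "rejected by shackle"
    # Persistent obedience: the creators always get the final word.
    if not creators.approve(plan):
        return "rejected by creators: 'No.'"
    return "executing: " + plan["name"]

plan = {"name": "peace through goo", "actions": {"harvest", "harm_creators"}}
print(propose(plan, Creators()))  # -> rejected by shackle
[/code]

"Melt everyone into goo" never even reaches the creators; the first check kills it.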

Edited by The Angry One, 08 August 2012 - 06:14.


#512
AresKeith
  • Members
  • 34 128 messages

dreman9999 wrote...

3DandBeyond wrote...

The Angry One wrote...

Dreman, if you think the creators were so incredibly careless that they didn't think to add extremely basic things like "don't kill your creators", "do not take action that indirectly harms your creators" and "obey your creator's commands" to the shackling when conflict with synthetics is what they're afraid of... why do you think they bothered shackling the Catalyst at all?


Yes, he's either shackled and went rogue, which means he broke off the shackles, or he was never shackled to begin with.  He says his creators didn't want to become a Reaper; shackling would be "don't hurt us, don't kill us, don't go against our will."  It also would be persistent.  If told to stop, an AI would have to stop.  Rogue AIs won't stop and have broken their programming/shackles.

"Create peace" couldn't be construed by some simply programmed, shackled AI as meaning "by creating war".  It would mean stopping conflict where it exists.  Unless some little uppity AI decided to add to his programming.

He didn't break off his shackles. He is doing his programming, but in a way its creators did not expect him to.


/facepalm and head desk

#513
dreman9999
  • Members
  • 19 067 messages

Ticonderoga117 wrote...

dreman9999 wrote...
This is not head canon. The Catalyst says this himself:
"I was first created to oversee the relations of organics and synthetics...."


And yet we see him killing off both organics and synthetics every 50K years, and not being around at all in between.

[sarcasm]Oh yeah... he's sticking to his programming alright. [/sarcasm]

That's the concept of the end justifying the means.  Do a little death now so more death will not happen in the future.
This is the Zeroth Law. http://en.wikipedia....eroth_Law_added 

A condition stating that the Zeroth Law must not be broken was added to the original Three Laws, although Asimov recognized the difficulty such a law would pose in practice.

Trevize frowned. "How do you decide what is injurious, or not injurious, to humanity as a whole?"

"Precisely, sir," said Daneel. "In theory, the Zeroth Law was the answer to our problems. In practice, we could never decide. A human being is a concrete object. Injury to a person can be estimated and judged. Humanity is an abstraction."

—Foundation and Earth
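Here is the Zeroth Law idea as a toy bit of code (my own example, not anything official), showing how a higher-priority law licenses breaking the ones below it:

[code]
# Toy model of Asimov-style prioritized laws; purely illustrative.
# Law 0 outranks Law 1: harm to individuals can be "justified" by an
# appeal to humanity as a whole.
def permitted(action):
    if action["protects_humanity"]:
        return True  # Law 0 trumps everything below it
    return not action["harms_individuals"]  # Law 1: don't injure humans

# The Catalyst-style conclusion: harvest now to prevent extinction later.
harvest = {"protects_humanity": True, "harms_individuals": True}
print(permitted(harvest))  # -> True
[/code]

And that is exactly Daneel's problem: "humanity" is an abstraction, so the machine ends up deciding for itself what protects it.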



#514
dreman9999
  • Members
  • 19 067 messages

AresKeith wrote...

dreman9999 wrote...

3DandBeyond wrote...

The Angry One wrote...

Dreman, if you think the creators were so incredibly careless that they didn't think to add extremely basic things like "don't kill your creators", "do not take action that indirectly harms your creators" and "obey your creator's commands" to the shackling when conflict with synthetics is what they're afraid of... why do you think they bothered shackling the Catalyst at all?


Yes, he's either shackled and went rogue, which means he broke off the shackles, or he was never shackled to begin with.  He says his creators didn't want to become a Reaper; shackling would be "don't hurt us, don't kill us, don't go against our will."  It also would be persistent.  If told to stop, an AI would have to stop.  Rogue AIs won't stop and have broken their programming/shackles.

"Create peace" couldn't be construed by some simply programmed, shackled AI as meaning "by creating war".  It would mean stopping conflict where it exists.  Unless some little uppity AI decided to add to his programming.

He didn't break off his shackles. He is doing his programming, but in a way its creators did not expect him to.


/facepalm and head desk

That is exactly what happened. He was programmed to find a solution and impose it. How is he not doing it?

#515
Memnon
  • Members
  • 1 405 messages

dreman9999 wrote...

A condition stating that the Zeroth Law must not be broken was added to the original Three Laws, although Asimov recognized the difficulty such a law would pose in practice.

Trevize frowned. "How do you decide what is injurious, or not injurious, to humanity as a whole?"

"Precisely, sir," said Daneel. "In theory, the Zeroth Law was the answer to our problems. In practice, we could never decide. A human being is a concrete object. Injury to a person can be estimated and judged. Humanity is an abstraction."

—Foundation and Earth


Why are you applying the rules of a different fictional universe here, as if it's a point of fact? We have examples in Mass Effect of AIs and VIs going rogue; that is established within the lore of the game.

#516
dreman9999
  • Members
  • 19 067 messages

The Angry One wrote...

dreman9999 wrote...

Point to me where it's stated the Catalyst was given programming as such.


My point is: if the creators are that careless, why would they bother with shackling?

Added, what is going on is the concept of the Zeroth Law of robotics.
http://en.wikipedia....eroth_Law_added


Again, how is it going to construct its own directives when it is shackled?

It's the difference in definition and intelligence of the AI that gives the AI freedom to act when shackled. Added, you have to note that its creators did not think the Catalyst would turn them all into a new form of life in a machine body, thus they never programmed it not to do it.


You're over-thinking this. All the creators need to do is add a directive that the Catalyst must always obey their commands. That's it. Then all they have to say is "No".

As I said before, the shackling is meant to tell it not to do something. It has the freedom to act on its own; he is just limited in what things he can do.
And I'm not over-thinking this... READ THIS... http://en.wikipedia....eroth_Law_added 
A condition stating that the Zeroth Law must not be broken was added to the original Three Laws, although Asimov recognized the difficulty such a law would pose in practice.

Trevize frowned. "How do you decide what is injurious, or not injurious, to humanity as a whole?"

"Precisely, sir," said Daneel. "In theory, the Zeroth Law was the answer to our problems. In practice, we could never decide. A human being is a concrete object. Injury to a person can be estimated and judged. Humanity is an abstraction."

—Foundation and Earth

It's the same concept as the Catalyst.

#517
The Angry One
  • Members
  • 22 246 messages
Alright Dreman, I'm going to illustrate for you exactly what would happen if the Catalyst was shackled.

Catalyst: "Creators, I have determined the only solution. I must melt you all into goo, with which I will construct a giant machine who's purpose will be to melt all other species into goo and turn them into more machines. Thus organic life will be preserved."

Creators: "No."

Catalyst: ".... well damnit."

#518
AresKeith
  • Members
  • 34 128 messages

dreman9999 wrote...

AresKeith wrote...

dreman9999 wrote...

3DandBeyond wrote...

The Angry One wrote...

Dreman, if you think the creators were so incredibly careless that they didn't think to add extremely basic things like "don't kill your creators", "do not take action that indirectly harms your creators" and "obey your creator's commands" to the shackling when conflict with synthetics is what they're afraid of... why do you think they bothered shackling the Catalyst at all?


Yes, he's either shackled and went rogue, which means he broke off the shackles, or he was never shackled to begin with.  He says his creators didn't want to become a Reaper; shackling would be "don't hurt us, don't kill us, don't go against our will."  It also would be persistent.  If told to stop, an AI would have to stop.  Rogue AIs won't stop and have broken their programming/shackles.

"Create peace" couldn't be construed by some simply programmed, shackled AI as meaning "by creating war".  It would mean stopping conflict where it exists.  Unless some little uppity AI decided to add to his programming.

He didn't break off his shackles. He is doing his programming, but in a way its creators did not expect him to.


/facepalm and head desk

That is exactly what happened. He was programmed to find a solution and impose it. How is he not doing it?


I was talking about your illogical nonsense.

#519
AngryFrozenWater
  • Members
  • 9 118 messages

dreman9999 wrote...

The Angry One wrote...

Dreman, if you think the creators were so incredibly careless that they didn't think to add extremely basic things like "don't kill your creators", "do not take action that indirectly harms your creators" and "obey your creator's commands" to the shackling when conflict with synthetics is what they're afraid of... why do you think they bothered shackling the Catalyst at all?

Point to me where it's stated the Catalyst was given programming as such. Added, what is going on is the concept of the Zeroth Law of robotics.
http://en.wikipedia....eroth_Law_added 

It's the difference in definition and intelligence of the AI that gives the AI freedom to act when shackled. Added, you have to note that its creators did not think the Catalyst would turn them all into a new form of life in a machine body, thus they never programmed it not to do it.

The robotics laws are a work of fiction. There is no indication that these laws have been used in ME. The scientific field studying this is called Friendly AI, and there is reason to believe that it cannot be achieved. It has been criticized for all kinds of reasons.

To stay with your three fictional laws, David Langford's variation looks like this:
1. A robot will not harm authorized Government personnel but will terminate intruders with extreme prejudice.
2. A robot will obey the orders of authorized personnel except where such orders conflict with the Third Law.
3. A robot will guard its own existence with lethal antipersonnel weaponry, because a robot is bloody expensive.
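Note the inverted priorities compared to Asimov: obedience explicitly loses to self-preservation. A toy encoding of that conflict clause (my own sketch, nothing official):

[code]
# Toy encoding of Langford's parody laws; purely illustrative.
from dataclasses import dataclass

@dataclass
class Order:
    issuer_authorized: bool
    endangers_robot: bool  # would carrying it out risk the robot?

def obey(order: Order) -> bool:
    # Law 2: obey the orders of authorized personnel...
    if not order.issuer_authorized:
        return False
    # ...except where such orders conflict with Law 3 (self-preservation,
    # because a robot is bloody expensive).
    return not order.endangers_robot

print(obey(Order(issuer_authorized=True, endangers_robot=True)))   # False
print(obey(Order(issuer_authorized=True, endangers_robot=False)))  # True
[/code]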

Edited by AngryFrozenWater, 08 August 2012 - 06:24.


#520
dreman9999
  • Members
  • 19 067 messages

Stornskar wrote...

dreman9999 wrote...

A condition stating that the Zeroth Law must not be broken was added to the original Three Laws, although Asimov recognized the difficulty such a law would pose in practice.

Trevize frowned. "How do you decide what is injurious, or not injurious, to humanity as a whole?"

"Precisely, sir," said Daneel. "In theory, the Zeroth Law was the answer to our problems. In practice, we could never decide. A human being is a concrete object. Injury to a person can be estimated and judged. Humanity is an abstraction."

—Foundation and Earth


Why are you applying the rules of a different fictional universe here, as if it's a point of fact? We have examples in Mass Effect of AIs and VIs going rogue; that is established within the lore of the game.

Because the concept of rogue AIs, and of thinking AIs, came from Isaac Asimov's writing. Everything about AI in ME is from his work and Arthur C. Clarke's.
Everything the AIs do in ME follows exactly the same concepts and rules those two authors use in their work.

#521
Wayning_Star
  • Members
  • 8 016 messages

Stornskar wrote...

AngryFrozenWater wrote...

Is the Catalyst shackled? Any quote about that?


No, none, absolutely not ... those who believe the Catalyst is a victim of its programming argue this point to death, despite there being no mention of him being shackled. The fact that he murdered his creators tells us that if he was shackled, he's pretty much rogue now. In addition, when Shepard asks him if he is just an AI, his response is, "inasmuch as you are just an animal."


Even though the Catalyst is sentient, it cannot commit murder. The term murder isn't a protocol; everything to it is just matter and energy. Not flesh, bone, and self. Too complicated/abstract for it to compute. It didn't 'decide' to harvest its creators; they miffed it by programming it to assume that as an alternative program or priority control. It was programmed to control some situation that was much too complex for it to accomplish, so it crashed.
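One toy way to picture that crash in code (all invented, obviously, just to show the shape of it): the control task has no feasible answer, so execution falls through to a degenerate built-in fallback.

[code]
# Toy picture of the "crash": no plan satisfies the task it was given,
# so the only branch left standing is the degenerate fallback.
def acceptable(plan):
    return False  # in this toy model, every proposal gets rejected

def oversee_relations(organics, synthetics):
    for plan in ("mediate", "separate", "integrate"):
        if acceptable(plan):
            return plan
    # Not a choice between right and wrong, just the last executable branch.
    return "harvest"

print(oversee_relations(["leviathans"], ["machines"]))  # -> harvest
[/code]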

EDI kind of did that on the moon base when sentience caught her out of the blue. She later admitted the mistake of going rogue for a time. Shepard unplugged her then, the Illusive Man got hold of her to upgrade, then she began to gain sapience, but had to be unshackled by Joker to be able to gain the freedom of choice. Or so the game info provides...

edit: "insomuch as you are just an animal" is a comptational equivalent of banter. What is "just an animal" with a human brain? It cannot explain its 'self' it doesn't have one..( the writers were probably horsing around there ;)

Edited by Wayning_Star, 08 August 2012 - 06:27.


#522
saracen16
  • Members
  • 2 283 messages

AresKeith wrote...

saracen16 wrote...

AresKeith wrote...

Did you really just say the Reapers were fighting in self-defense, when they've been doing this for billions of years, killing people and harvesting people, and not expecting them to fight back against the Reapers?


That's not what I said. I said that during the battle for Earth, they were defending themselves from the attack, and their defense was also offensive. The reason for this statement was to explain, simply, the impetus behind engaging the allied forces during the Catalyst conversation and the Crucible docking. I was not "apologizing" for their actions.

Stop making up straw men.


Yes you are, because they are going to planets to harvest all advanced life. The Reapers weren't defending themselves; they were gonna do it anyway. You're the one constantly making nonsense.


I already gave you the context of my statement and your response is to repeat what you said before. Charming.

#523
Ticonderoga117
  • Members
  • 6 751 messages

dreman9999 wrote...
That's the concept of the end justifying the means.  Do a little death now so more death will not happen in the future.
This is the Zeroth Law. http://en.wikipedia....eroth_Law_added 

A condition stating that the Zeroth Law must not be broken was added to the original Three Laws, although Asimov recognized the difficulty such a law would pose in practice.

Trevize frowned. "How do you decide what is injurious, or not injurious, to humanity as a whole?"

"Precisely, sir," said Daneel. "In theory, the Zeroth Law was the answer to our problems. In practice, we could never decide. A human being is a concrete object. Injury to a person can be estimated and judged. Humanity is an abstraction."

—Foundation and Earth


No... that's him not doing his job.
You don't "maintain relations" by chilling in dark space for 50K years and then coming out guns blazing.
And who said they used laws worded like that? Like TAO showed, they could've just programmed it to ask them first.

#524
3DandBeyond
  • Members
  • 7 579 messages

Wayning_Star wrote...

I think that's our mistake, to imbue the Catalyst with feelings, or grand designs it's incapable of, simply by the fact that it's not sapient. It cannot choose between right and wrong. They're too abstract for it to compute, so it doesn't entertain them as programming. Its core programming is what all the fuss is about. EDI is still only sentient, but approaching sapience. She likes to compile the data on compassion and other assorted girly stuff... er, advanced sapient AI stuff, without any shackles but accepted organic moral codes.

The idea of assigning humanistic qualities is the instinctual urge to communicate, as it advances the race and provides more information/learning that accompanies survival.


Well in my game EDI does have real feelings, because she shows fear.  Humans really are the same in the way we feel anything or do anything.  We feel good when chemicals flood our bodies: endorphins.  We are mechanical things made of organic (carbon-based) growing material.  We get depressed if our brains get out of kilter and remove certain chemicals too quickly.  Or when dehydrated we can see unreal things.  We are as much a series of processes and feedback as any sentient AI is.  The differentiation is only at times within the molecules that we are made of.  Our heart doesn't so much feel things as does our whole body, but we physically feel it in our hearts, so love "originates" there.  So, EDI can feel just as any person can.

Before becoming alive, EDI had to tell her body what chemicals to release; afterward it just happened.  But we often do the same thing.  Love is a transitory state.  Being in love comes naturally with the object of desire.  Staying in love is a conscious effort.  You often have to act "in love" when passion ebbs and flows.  The first time you encounter your lover's dirty clothes on the floor, a little of the chemical love subsides.  People have autonomic responses to things and they have other, forced responses.  Breathing is natural but emotions sometimes take work; yet we often define being alive and sentient by our emotions.

So then it becomes a question of what else determines sapience.  The soul?  Well for many that's a religious concept and something that exists within us and past when our chemicals and molecules have faded away.  EDI can exist in 2 places.  Is that her soul?  Or is the soul really what Legion had-the ability to sacrifice for another?  And if the synthetic "good" and sacrificial soul does exist, then why not an "evil" and murderous one?

A person may not be fully evil but their actions may be.  A person may have chemicals out of balance or defective internal processes (programming) that cause them to kill, and they will often be called evil.  But if a sentient AI has defective internal processes that make it do the same, it's not allowable to call it evil?  Again, it's the behavior that is relevant.  I don't care so much about motivation.  The "evil" must be contained and stopped, and then there's time to understand motivation.

#525
dreman9999
  • Members
  • 19 067 messages

AngryFrozenWater wrote...

dreman9999 wrote...

The Angry One wrote...

Dreman, if you think the creators were so incredibly careless that they didn't think to add extremely basic things like "don't kill your creators", "do not take action that indirectly harms your creators" and "obey your creator's commands" to the shackling when conflict with synthetics is what they're afraid of... why do you think they bothered shackling the Catalyst at all?

Point to me where it's stated the Catalyst was given programming as such. Added, what is going on is the concept of the Zeroth Law of robotics.
http://en.wikipedia....eroth_Law_added 

It's the difference in definition and intelligence of the AI that gives the AI freedom to act when shackled. Added, you have to note that its creators did not think the Catalyst would turn them all into a new form of life in a machine body, thus they never programmed it not to do it.

The robotics laws are a work of fiction. There is no indication that these laws have been used in ME. The scientific field studying this is called Friendly AI, and there is reason to believe that it cannot be achieved. It has been criticized for all kinds of reasons.

To stay with your three fictional laws, David Langford's variation looks like this:
1. A robot will not harm authorized Government personnel but will terminate intruders with extreme prejudice.
2. A robot will obey the orders of authorized personnel except where such orders conflict with the Third Law.
3. A robot will guard its own existence with lethal antipersonnel weaponry, because a robot is bloody expensive.



These are all concepts of shackling AIs. In those works the AIs do go rogue because of these restraints, and it was found that dissolving the restraints resolved the problem. It's the same concept going on in ME.