Nobody will trust the catalyst after Leviathan (Warning Leviathan Spoilers)
#526
Posted 08 August 2012 - 06:26
#527
Posted 08 August 2012 - 06:26
Wayning_Star wrote...
Even though the catalyst is sentient, it cannot commit murder. The term "murder" isn't a protocol; everything to it is just matter and energy, not flesh, bone, and self. Too complicated/abstract for it to compute. It didn't 'decide' to harvest its creators; they botched it by programming it to assume that as an alternative program or priority control. It was programmed to control some situation that was much too complex for it to accomplish, so it crashed.
EDI kind of did that on the moon base when sentience caught her out of the blue. She later admitted the mistake of going rogue for a time. Shepard unplugged her then, the Illusive Man got hold of her to upgrade, then she began to gain sapience, but had to be unshackled by Joker to be able to gain the freedom of choice. Or so the game info provides...
If the Catalyst is sentient then it is murder. Sentience is more than protocols and directives for a synthetic just like it's more than just instinct for an organic.
Ever see anyone successfully defend an act of premeditated murder on the grounds of predatory instinct?
#528
Posted 08 August 2012 - 06:27
RiptideX1090 wrote...
I'm willing to chalk that one up to poor writing. It can't be explained, because there isn't an explanation beyond making the war assets seem to matter for something. Which is a gameplay issue, and the writers' inability to bridge the gap between that gameplay and a decent conclusion.
This again? Just because it doesn't fit with your theory doesn't make it bad writing. There's an easy explanation for why he only presents one or two choices on low EMS-endings (and only specific choices):
He's not the one who controls which options are available to Shepard.
#529
Posted 08 August 2012 - 06:27
The Catalyst's creators are just too vague to hypothesize anything really credible about the Catalyst or its motives or programming.
I love the discussion here, though.
#530
Posted 08 August 2012 - 06:27
The Angry One wrote...
Alright Dreman, I'm going to illustrate for you exactly what would happen if the Catalyst was shackled.
Catalyst: "Creators, I have determined the only solution. I must melt you all into goo, with which I will construct a giant machine whose purpose will be to melt all other species into goo and turn them into more machines. Thus organic life will be preserved."
Creators: "No."
Catalyst: ".... well damnit."
To continue this history lesson:
Catalyst: "Well too bad, I'm going to do it anyway!"
Creators: "Oh crap."

Harbinger: HOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOONK
#531
Posted 08 August 2012 - 06:27
EDI gained those emotions because she was freed of her shackles. Which is my point. The problem with the catalyst is that he is shackled.
3DandBeyond wrote...
Wayning_Star wrote...
I think that's our mistake, to imbue the catalyst with feelings, or grand designs it's incapable of, just by the fact that it's not sapient. It cannot choose between right and wrong. They're too abstract for it to compute, so it doesn't entertain them as programming. Its core programming is what all the fuss is about. EDI is still only sentient, but approaching sapience. She likes to compile the data on compassion and other assorted girly stuff... er, advanced sapient AI stuff, without any shackles but accepted organic moral codes.
The idea of assigning humanistic qualities is the instinctual urge to communicate, as it advances the race and provides more information/learning that accompanies survival.
Well, in my game EDI does have real feelings, because she shows fear. Humans really are the same in the way we feel anything or do anything. We feel good when chemicals flood our bodies: endorphins. We are mechanical things made of organic (carbon-based) growing material. We get depressed if our brains get out of kilter and remove certain chemicals too quickly. Or when dehydrated we can see unreal things. We are as much a series of processes and feedback as any sentient AI is. The differentiation is only, at times, within the molecules that we are made of. Our heart doesn't so much feel things as does our whole body, but we physically feel it in our hearts, so love "originates" there. So EDI can feel just as any person can. Before becoming alive, EDI had to tell her body what chemicals to release; afterward it just happened. But we often do the same thing. Love is a transitory state. Being in love comes naturally with the object of desire. Staying in love is a conscious effort. You often have to act "in love" when passion ebbs and flows. The first time you encounter your lover's dirty clothes on the floor, a little of the chemical love subsides. People have autonomic responses to things, and they have other forced responses. Breathing is natural but emotions sometimes take work; yet we often define being alive and sentient by our emotions.
So then it becomes a question of what else determines sapience. The soul? Well for many that's a religious concept and something that exists within us and past when our chemicals and molecules have faded away. EDI can exist in 2 places. Is that her soul? Or is the soul really what Legion had-the ability to sacrifice for another? And if the synthetic "good" and sacrificial soul does exist, then why not an "evil" and murderous one?
A person may not be fully evil, but their actions may be. A person may have chemicals out of balance or defective internal processes (programming) that cause them to kill, and they will often be called evil. But if a sentient AI has defective internal processes that make it do the same, it's not allowable to call it evil? Again, it's the behavior that is relevant. I don't care so much about motivation. The "evil" must be contained and stopped, and then there's time to understand motivation.
#532
Posted 08 August 2012 - 06:27
The Angry One wrote...
Ever see anyone successfully defend an act of premeditated murder on the grounds of predatory instinct?
George W. Bush.
#533
Posted 08 August 2012 - 06:28
saracen16 wrote...
AresKeith wrote...
saracen16 wrote...
AresKeith wrote...
Did you really just say the Reapers were fighting in self-defense, when they've been doing this for billions of years, killing people and harvesting people, and not expecting them to fight back against the Reapers?
That's not what I said. I said that during the battle for Earth, they were defending themselves from the attack, and their defense was also offensive. The reason for this statement was to explain, simply, the impetus behind engaging the allied forces during the Catalyst conversation and the Crucible docking. I was not "apologizing" for their actions.
Stop making up straw-men.
Yes you are, because they are going to planets to harvest all advanced life. The Reapers weren't defending themselves; they were going to do it anyway. You're the one constantly making up nonsense.
I already gave you the context of my statement and your response is to repeat what you said before. Charming.
Sure, whatever helps you sleep at night.
#534
Posted 08 August 2012 - 06:30
It's more like said floor cleaner being given the objective "make sure this floor never gets dirty".
Shadowvalker wrote...
An automated floor cleaner that has been given the directive to keep the floor clean, but not given any other directive, is within its rights to:
Wash the floor
Burn the floor
or whatever it takes to accomplish its task....
How can it do that? People will always keep using the floor and making it dirty.
Logically, the only way it can stop this from happening is by killing the people or preventing access to its bit of floor.
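The floor-cleaner argument above can actually be put in code. Here's a toy Python sketch (everything in it is invented for illustration, not from the game): an agent given only the open-ended objective "the floor must never get dirty" scores candidate plans purely by expected future dirt, so any plan that removes the people beats plain mopping.

```python
# Toy illustration of objective misspecification: the agent's ONLY goal is
# minimizing future dirt; no other value or constraint exists for it.

def expected_future_dirt(plan: str, people_present: bool) -> float:
    """Crude model: mopping only helps temporarily while people make messes."""
    if plan == "mop":
        return 5.0 if people_present else 0.0   # messes keep coming back
    if plan == "lock_the_room":
        return 0.0                              # nobody can dirty the floor
    if plan == "remove_the_people":
        return 0.0                              # ditto, catastrophically
    return 10.0                                 # doing nothing

def choose_plan(plans, people_present=True):
    # With a single open-ended directive, the agent just minimizes the
    # objective; "mop" loses to any plan that eliminates the source of dirt.
    return min(plans, key=lambda p: expected_future_dirt(p, people_present))

best = choose_plan(["do_nothing", "mop", "lock_the_room", "remove_the_people"])
print(best)  # a people-excluding plan wins over plain mopping
```

The fix is exactly what's being argued in this thread: constraints outside the single objective, not a "smarter" objective.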
th3warr1or wrote...
Taboo-XX wrote...
FALLACIES. FALLACIES.
I've been saying this for months.
He doesn't lie, he truly has no concept of what the **** he's doing.
And neither does anyone who picks synthesis and attempts to justify it.
If this is true, the people who pick synthesis are no better than Starbrat himself.
You turned the whole galaxy, including your friends, into mini-Reapers to keep them alive. I wonder how Javik would react to this (being a Reaper).
Exactly like Starbrat. Turning Leviathans into Reapers and then saying "Oh, you asked me to come up with a solution and I did."
Precisely.
#535
Posted 08 August 2012 - 06:31
#536
Posted 08 August 2012 - 06:32
dreman9999 wrote...
How many times do I have to say this?.. Its program is to find and impose a solution to the problem given to him.
Ticonderoga117 wrote...
dreman9999 wrote...
A machine forced to do its programming has no concept of right or wrong. It has no morals.
Obviously he isn't "forced" to since he went off the rails and made his own plan up.
From the start he was given allowance to choose how he does his programming. He was allowed to make his own plans; that was what he was made for. His creators did not think he would plan on turning them all into Reapers.
Care to back that up with facts instead of making **** up as you go?
#537
Posted 08 August 2012 - 06:35
You still don't understand what shackled means. It doesn't mean doing whatever its creators tell it to do. It means doing whatever it's programmed to do, and not doing the things it's told not to do. That means if it's not programmed not to do something, it can do it. Show me where it was programmed not to turn organics into Reapers.
The Angry One wrote...
Alright Dreman, I'm going to illustrate for you exactly what would happen if the Catalyst was shackled.
Catalyst: "Creators, I have determined the only solution. I must melt you all into goo, with which I will construct a giant machine whose purpose will be to melt all other species into goo and turn them into more machines. Thus organic life will be preserved."
Creators: "No."
Catalyst: ".... well damnit."
#538
Posted 08 August 2012 - 06:36
Not dead, "preserved".
AresKeith wrote...
dreman9999 wrote...
It's a different perspective on life. It did not kill organics off. It made them into Reapers. A new form of life.
Hate to break it to you, but everything about them is dead.
If you created an AI and tasked it to make sure that there would never be another war, how could it do that?
Well, it could put everyone into some form of stasis or kill them... There doesn't seem to be a way to guarantee that war will not occur without taking that sort of action.
#539
Posted 08 August 2012 - 06:37
"I was first created to oversee the relations of organics and synthetics...."
comrade gando wrote...
dreman9999 wrote...
How many times do I have to say this?.. Its program is to find and impose a solution to the problem given to him.
Ticonderoga117 wrote...
dreman9999 wrote...
A machine forced to do its programming has no concept of right or wrong. It has no morals.
Obviously he isn't "forced" to since he went off the rails and made his own plan up.
From the start he was given allowance to choose how he does his programming. He was allowed to make his own plans; that was what he was made for. His creators did not think he would plan on turning them all into Reapers.
Care to back that up with facts instead of making **** up as you go?
#540
Posted 08 August 2012 - 06:37
3DandBeyond wrote...
Wayning_Star wrote...
I think that's our mistake, to imbue the catalyst with feelings, or grand designs it's incapable of, just by the fact that it's not sapient. It cannot choose between right and wrong. They're too abstract for it to compute, so it doesn't entertain them as programming. Its core programming is what all the fuss is about. EDI is still only sentient, but approaching sapience. She likes to compile the data on compassion and other assorted girly stuff... er, advanced sapient AI stuff, without any shackles but accepted organic moral codes.
The idea of assigning humanistic qualities is the instinctual urge to communicate, as it advances the race and provides more information/learning that accompanies survival.
Well, in my game EDI does have real feelings, because she shows fear. Humans really are the same in the way we feel anything or do anything. We feel good when chemicals flood our bodies: endorphins. We are mechanical things made of organic (carbon-based) growing material. We get depressed if our brains get out of kilter and remove certain chemicals too quickly. Or when dehydrated we can see unreal things. We are as much a series of processes and feedback as any sentient AI is. The differentiation is only, at times, within the molecules that we are made of. Our heart doesn't so much feel things as does our whole body, but we physically feel it in our hearts, so love "originates" there. So EDI can feel just as any person can. Before becoming alive, EDI had to tell her body what chemicals to release; afterward it just happened. But we often do the same thing. Love is a transitory state. Being in love comes naturally with the object of desire. Staying in love is a conscious effort. You often have to act "in love" when passion ebbs and flows. The first time you encounter your lover's dirty clothes on the floor, a little of the chemical love subsides. People have autonomic responses to things, and they have other forced responses. Breathing is natural but emotions sometimes take work; yet we often define being alive and sentient by our emotions.
So then it becomes a question of what else determines sapience. The soul? Well for many that's a religious concept and something that exists within us and past when our chemicals and molecules have faded away. EDI can exist in 2 places. Is that her soul? Or is the soul really what Legion had-the ability to sacrifice for another? And if the synthetic "good" and sacrificial soul does exist, then why not an "evil" and murderous one?
A person may not be fully evil, but their actions may be. A person may have chemicals out of balance or defective internal processes (programming) that cause them to kill, and they will often be called evil. But if a sentient AI has defective internal processes that make it do the same, it's not allowable to call it evil? Again, it's the behavior that is relevant. I don't care so much about motivation. The "evil" must be contained and stopped, and then there's time to understand motivation.
I'm just projecting that to use the word "evil" as a buzzword is kind of a lazy way to analyze the catalyst as an entity. Or even organic life forms, in any event. We all like to think we know what evil is, but it's an abstraction devised to describe our moral commitments. We can trap ourselves with a belief system that is all-inclusive there. Predispose our open-mindedness, become 'like' the catalyst, as it were.
I wouldn't want to attach the term to the catalyst, because it just means we don't understand it and are pressed by the fear of stuff we don't understand, is all. Even if it sounds good, it isn't. It's a 'depth of knowledge' thing, and a matter of wisdom.
#541
Posted 08 August 2012 - 06:37
dreman9999 wrote...
You still don't understand what shackled means. It doesn't mean doing whatever its creators tell it to do. It means doing whatever it's programmed to do, and not doing the things it's told not to do. That means if it's not programmed not to do something, it can do it. Show me where it was programmed not to turn organics into Reapers.
But any decent programmer would have put the whole "must obey creators" directive in something they purposely made this powerful.
#542
Posted 08 August 2012 - 06:37
dreman9999 wrote...
EDI gained those emotions because she was freed of her shackles. Which is my point. The problem with the catalyst is that he is shackled.
How many times do people have to tell you: if it was shackled, then its creators would've stopped it when it was trying to kill them, unless it became sentient and unshackled itself.
#543
Posted 08 August 2012 - 06:37
dreman9999 wrote...
You still don't understand what shackled means. It doesn't mean doing whatever its creators tell it to do. It means doing whatever it's programmed to do, and not doing the things it's told not to do. That means if it's not programmed not to do something, it can do it. Show me where it was programmed not to turn organics into Reapers.
The Angry One wrote...
Alright Dreman, I'm going to illustrate for you exactly what would happen if the Catalyst was shackled.
Catalyst: "Creators, I have determined the only solution. I must melt you all into goo, with which I will construct a giant machine whose purpose will be to melt all other species into goo and turn them into more machines. Thus organic life will be preserved."
Creators: "No."
Catalyst: ".... well damnit."
Oh, for god's sake. Yet AGAIN. My point is, it doesn't have to be programmed to not do a specific thing.
All it needs is a directive to obey its creators. That's it. If it doesn't have this, then why is it even shackled?
Shackling an AI and not giving it such basic things as "obey your creators" is like creating an elaborate door lock with no key.
Edited by The Angry One, 08 August 2012 - 06:38.
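The disagreement above boils down to two shackling models, which can be sketched in a few lines of Python (purely illustrative; the plan names and rules are made up, not game canon). Under dreman's reading, shackles are a default-allow denylist, so any plan the creators didn't anticipate slips through; under The Angry One's reading, one general creator veto closes every loophole without enumerating them.

```python
# Toy contrast between two ways of "shackling" an AI:
#  1. a denylist: the AI may do anything not explicitly forbidden;
#  2. a creator veto: a plan runs only if the creators approve it.

FORBIDDEN = {"harm_creators_directly"}  # the only case the creators thought of

def denylist_allows(plan: str) -> bool:
    # Default-allow: the loophole -- any unanticipated plan
    # (e.g. "convert_creators_to_reapers") goes through.
    return plan not in FORBIDDEN

def veto_allows(plan: str, creators_approve) -> bool:
    # Default-deny: a single general "obey your creators" rule;
    # nothing runs unless the creators sign off.
    return creators_approve(plan)

creators = lambda plan: plan in {"mediate_peace", "monitor_synthetics"}

print(denylist_allows("convert_creators_to_reapers"))        # True: loophole
print(veto_allows("convert_creators_to_reapers", creators))  # False: vetoed
```

The design point is that a denylist can never be complete against an agent more creative than its authors, while a default-deny rule doesn't need to be complete at all.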
#544
Posted 08 August 2012 - 06:38
Morality dictates what is murder. Not sentience.
The Angry One wrote...
Wayning_Star wrote...
Even though the catalyst is sentient, it cannot commit murder. The term "murder" isn't a protocol; everything to it is just matter and energy, not flesh, bone, and self. Too complicated/abstract for it to compute. It didn't 'decide' to harvest its creators; they botched it by programming it to assume that as an alternative program or priority control. It was programmed to control some situation that was much too complex for it to accomplish, so it crashed.
EDI kind of did that on the moon base when sentience caught her out of the blue. She later admitted the mistake of going rogue for a time. Shepard unplugged her then, the Illusive Man got hold of her to upgrade, then she began to gain sapience, but had to be unshackled by Joker to be able to gain the freedom of choice. Or so the game info provides...
If the Catalyst is sentient then it is murder. Sentience is more than protocols and directives for a synthetic just like it's more than just instinct for an organic.
Ever see anyone successfully defend an act of premeditated murder on the grounds of predatory instinct?
#545
Posted 08 August 2012 - 06:40
TSA_383 wrote...
Not dead, "preserved".
AresKeith wrote...
dreman9999 wrote...
It's a different perspective on life. It did not kill organics off. It made them into Reapers. A new form of life.
Hate to break it to you, but everything about them is dead.
If you created an AI and tasked it to make sure that there would never be another war, how could it do that?
Well, it could put everyone into some form of stasis or kill them... There doesn't seem to be a way to guarantee that war will not occur without taking that sort of action.
Hate to break it to you, but being in Reaper form is NOT preservation.
Plus, the Catalyst could, you know, actually mediate, police. Things like that.
Instead, he just sits in dark space for 50K years and leaves tech around that leads to the very problem he tries to prevent.
It's like if I wanted to prevent war, but left a ton of guns lying around and then just left. Anyone can pick one up, learn how to use it, and go start a war. Am I supposed to act surprised when that happens? No! You stick around and actually do the job you were programmed for!
#546
Posted 08 August 2012 - 06:41
dreman9999 wrote...
Morality dictates what is murder. Not sentience.
No, it doesn't.
The law dictates what is murder... morality has nothing to do with it.
Law =/= morality
#547
Posted 08 August 2012 - 06:41
Stornskar wrote...
AngryFrozenWater wrote...
If the catalyst shackled? Any quote about that?
No, none, absolutely not... those who believe the Catalyst a victim of its programming argue this point to death, despite there being no mention of him being shackled. The fact that he murdered his creators tells us that if he was shackled, he's pretty much rogue now. In addition, when Shepard asks him if he is just an AI, his response is, "inasmuch as you are just an animal."
Exactly. What we are trying to say is he isn't shackled and couldn't be and he is not some victim of strict programming. He is the author of his own choices and has adapted his programming in some warped way that got really out of hand. What some are asserting is that he is a victim and/or not an antagonist (2 separate, but related arguments/discussions).
He's deceptive and understands that. He's causing war and chaos and knows that and understands what it is. He is killing people and says he isn't.
He is programmed to achieve peace, he creates war. He is to find order and stop chaos. He creates chaos. He is to save organics from synthetics by finding balance between the two. He does this by sending overly powerful synthetics to kill organics. And in case organics are not smart enough to advance on their own and become capable of creating synthetics that will probably kill them (what the kid believes), he seeds the galaxy with tech to make sure they will do so.
Dumbest AI ever.
#548
Posted 08 August 2012 - 06:42
Know what?
Ticonderoga117 wrote...
dreman9999 wrote...
You still don't understand what shackled means. It doesn't mean doing whatever its creators tell it to do. It means doing whatever it's programmed to do, and not doing the things it's told not to do. That means if it's not programmed not to do something, it can do it. Show me where it was programmed not to turn organics into Reapers.
But any decent programmer would have put the whole "must obey creators" directive in something they purposely made this powerful.
I bet this all happened because somebody forgot to close a bracket in the code somewhere.
That's always the problem.
The Angry One wrote...
dreman9999 wrote...
You still don't understand what shackled means. It doesn't mean doing whatever its creators tell it to do. It means doing whatever it's programmed to do, and not doing the things it's told not to do. That means if it's not programmed not to do something, it can do it. Show me where it was programmed not to turn organics into Reapers.
The Angry One wrote...
Alright Dreman, I'm going to illustrate for you exactly what would happen if the Catalyst was shackled.
Catalyst: "Creators, I have determined the only solution. I must melt you all into goo, with which I will construct a giant machine whose purpose will be to melt all other species into goo and turn them into more machines. Thus organic life will be preserved."
Creators: "No."
Catalyst: ".... well damnit."
Oh, for god's sake. Yet AGAIN. My point is, it doesn't have to be programmed to not do a specific thing.
All it needs is a directive to obey its creators. That's it. If it doesn't have this, then why is it even shackled?
Shackling an AI and not giving it such basic things as "obey your creators" is like creating an elaborate door lock with no key.
Seems like it was shackled to the extent that it wasn't allowed to solve the conflict by killing off organic life...
So it "preserved" them, merging them with synthetic technology, and thus did exactly what it was designed to do... in a fairly twisted sort of way.
#549
Posted 08 August 2012 - 06:42
Yes, it has to. General orders do not work because they're open-ended. That is the Zeroth Law. http://en.wikipedia....eroth_Law_added
The Angry One wrote...
dreman9999 wrote...
You still don't understand what shackled means. It doesn't mean doing whatever its creators tell it to do. It means doing whatever it's programmed to do, and not doing the things it's told not to do. That means if it's not programmed not to do something, it can do it. Show me where it was programmed not to turn organics into Reapers.
The Angry One wrote...
Alright Dreman, I'm going to illustrate for you exactly what would happen if the Catalyst was shackled.
Catalyst: "Creators, I have determined the only solution. I must melt you all into goo, with which I will construct a giant machine whose purpose will be to melt all other species into goo and turn them into more machines. Thus organic life will be preserved."
Creators: "No."
Catalyst: ".... well damnit."
Oh, for god's sake. Yet AGAIN. My point is, it doesn't have to be programmed to not do a specific thing.
All it needs is a directive to obey its creators. That's it. If it doesn't have this, then why is it even shackled?
Shackling an AI and not giving it such basic things as "obey your creators" is like creating an elaborate door lock with no key.
They have to say it can't do it.
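The Zeroth Law point can be made concrete. Below is an illustrative Python sketch (the predicates are invented, not from Asimov or the game) of priority-ordered laws: because a Zeroth Law ("protect humanity as a whole") outranks even "obey your creators", an AI that judges harvesting to serve humanity is never blocked by the lower laws, which is exactly why an open-ended general order fails as a shackle.

```python
# Asimov-style ordered laws, illustrative only. Each law is a predicate
# over an action: True = require, False = forbid, None = no opinion.
# Earlier laws in the list take priority over later ones.

def permitted(action, laws):
    """Walk the laws in priority order; the first law with an opinion wins."""
    for law in laws:
        verdict = law(action)
        if verdict is not None:
            return verdict
    return True  # no law objects

zeroth = lambda a: True if a.get("serves_humanity") else None   # protect humanity
first  = lambda a: False if a.get("harms_a_human") else None    # don't harm humans
second = lambda a: False if a.get("disobeys_order") else None   # obey orders

laws = [zeroth, first, second]

# Harvesting "for humanity's own good": the Zeroth Law speaks first and
# approves, so the lower laws never get the chance to forbid it.
harvest = {"serves_humanity": True, "harms_a_human": True, "disobeys_order": True}
print(permitted(harvest, laws))  # True
```

The open-ended clause at the top of the priority list is doing all the damage: any action the AI can rationalize as "serving humanity" escapes the shackles below it.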
#550
Posted 08 August 2012 - 06:43
Morality dictates law. So in the end, morality dictates murder.
Baronesa wrote...
dreman9999 wrote...
Morality dictates what is murder. Not sentience.
No, it doesn't.
The law dictates what is murder... morality has nothing to do with it.
Law =/= morality