Why the Catalyst's Logic is Right (Technological Singularity)
#101
Posted 29 March 2012 - 12:21
It wasn't in the game.
Therefore I can not count it as supporting the ending. If BioWare wants to fill in this one of many inconsistencies with the ending by adding this in later or something, undoing the work they did by removing it in the first place, then fine - but they've still got a long way to go. That entire Catalyst scene is off. It isn't about Shepard. It's about the Catalyst. I'm sorry, but when we're concluding Shepard's story, we should be able to choose how our Shepard concludes it. And I'm not talking just red, blue or green (an orange option would have been nice D:).
#102
Posted 29 March 2012 - 12:21
JShepppp wrote...
2. In my playthrough, Joker/EDI hooked up and the Geth/Quarians found peace, therefore conflict isn't always the result! Several arguments can be made against this. First, giving two examples doesn't talk about the bigger, overall galactic picture (winning a battle doesn't mean the war is won, so to speak). Second, we haven't reached that technological singularity point yet by which creations outgrow organics - basically, when synthetics will normally come to dominate the galaxy. Third, evidence for the synthetic/organic conflict is there in the past - in the Protheans' cycle (Javik dialogue) and even in previous cycles (the Thessia VI says that the same conflicts always happen in each cycle).
The problem isn't simply that the Catalyst's reasoning is illogical, but that it snaps the suspension of disbelief in half, by introducing a new God character in the last ten minutes and by refusing to give us sufficient insight into how he came to this motivation. The Catalyst doesn't offer an argument, he offers a statement (Synthetics will wipe out Organics, given enough time), to which offering the Geth/EDI could be seen as a counter-argument. Now, if the Catalyst went into an epic speech about how he's seen this happen before and that we're naive, stupid, whatever, then it becomes more believable; the writers are essentially fleshing out the Catalyst's character, in addition to giving insight into his logic and how the Reapers as a solution came about. But this doesn't happen. There is no insight, no explanation. The Catalyst merely exists to tell us how to end the story, in as unsatisfying a fashion as possible.
The huge problem is that the Catalyst's reasoning is a slippery slope whose direction we're not allowed to see. It would be like me saying "Gee, if I wear socks outside this morning, there's a good chance I might misplace them, so I should keep all my socks in drawers so that I don't lose them".
Edited by BaladasDemnevanni, 29 March 2012 - 12:22.
#103
Posted 29 March 2012 - 12:22
dreman9999 wrote...
The question of a soul is chaotic thinking. That's clear based on the fact that it's one of organics' major questions. But my point is not that machines learning to think chaotically starts war; it's that the nature of organics does. And a major part of organics' nature is to cause conflict. You say if we leave them alone, there will be no war. The problem is: when have we ever left anyone alone? There is not one civilization in human history that has ever left another civilization alone. It's in our nature to cause conflict.
Xandurpein wrote...
dreman9999 wrote...
I would have to point you to the Geth then.... http://www.youtube.c...U9i1hA8I#t=119s
Xandurpein wrote...
dreman9999 wrote...
Clearly AI are capable of thinking. An AI doesn't think like an organic from creation; it has to learn how to think like an organic. Case in point: EDI.... AIs think like machines.
I would argue that a sufficiently advanced AI can most certainly think like an organic. One of the most touching scenes in the whole of ME2 is the discussion between Shepard and Legion about his piece of N7 armor. It's obvious to me that Legion is beginning to develop emotions and has a bit of hero worship of Shepard, even if he can't quite explain it himself.
Overall there's clearly an almost childlike naivety in the Geth, which I would attribute to the fact that they are still only beginning to develop organic-like emotions. I have no idea if a synthetic AI can develop emotions or not in reality, but I think there's enough evidence in the Mass Effect universe to support the possibility. My guess is that emotional responses serve a function for organic beings and have evolved because it's a successful trait. Why shouldn't it be successful in AIs too, if they are allowed to evolve?
They clearly don't think like organics.... Think of it this way. When machines are made, they know their purpose, they have information on the world around them, they can calculate, and they function. They are born to a mind of order.
We as organics aren't. We are born as screaming messes that eat, sleep and poop. We have no purpose, no concept of the world, no info on the world, and we can't even count more than the number of fingers and toes we have. We are born to minds of chaos.
As we get older, we spend our lives learning and applying order to ourselves. We do so for many years... and we still don't get a purpose from those many years, and are left asking the meaning of our existence.
With machines, that's not the case. To understand organics, they spend years trying to think chaotically and trying to gain a sense of self-identity.
AI don't think like organics and organics don't think like machines unless they spend time educating themselves to do so.
The Morning War didn't start until the Geth began to develop the first rudiments of emotions, that which you call "chaotic" thinking. The Morning War began when a Geth asked a Quarian "Does this unit have a soul?", which is a meaningless question to a machine.
A machine created with a purpose that doesn't question it is not a danger to us; it's when the AI starts to question its purpose that it becomes a potential danger, but then it's already chaotic.
So perhaps the Reapers would be better off just wiping out all organics and leaving the synthetics alone. *shrugs*
#104
Posted 29 March 2012 - 12:23
#105
Posted 29 March 2012 - 12:24
dreman9999 wrote...
No, it got worse by adding moral value. War is something that organics created based on the very same moral values given to machines. If moral value was what stopped war from happening, then we would never have wars. This is the same world where people think it a moral value to strap a bomb to themselves and blow a group of people up because they have a different religion. This is the same world where a stronger, larger country can invade a smaller one and kill off thousands of people because they believe the small country needs a change in its social nature. This is the same moral value that causes a large group of people to commit genocide on a smaller group of people for being different.
tjmax wrote...
dreman9999 wrote...
The thing with the Geth/Quarian argument is that it's a question of whether the peace can last. With Legion we see that the Geth with the upgrade have nearly all the capabilities of an organic, which means they are capable of good and evil decisions like an organic. On the Quarian side we have people like Admiral Xen who can mess it up as well.
tjmax wrote...
FoxShadowblade wrote...
He provides no proof, his logic slaps the entire series in the face, and he makes Shepard look like a complete tool. Oh, and his choices suck balls.
So I don't care if he could be right, his logic is wrong, he was wrong, and any alternate ending should write him straight out of the game and into HELL.
It does not matter if he is wrong or right, moral or immoral. The AI life form saw problems.
Problem 1: Synthetic life forms created by organics will rebel and kill their makers and all other organic life.
Answer: Take control of synthetics to prevent it.
Problem 2: Organics will create new synthetics that will kill their creators.
Answer: Remove all advanced organics to prevent problems 2 and 1.
What changed:
Shepard proved organics and synthetics can make peace with one another. Lasting peace? Who knows.
The Crucible was added to the AI's core, allowing for more possibilities or a new way of thinking.
If you want to use the Geth/Quarian argument, you have to guarantee the peace will last.
War vs. peace was never really in the equation. It was simply protecting against the destruction of all organic life forms by the synthetics.
Everything changed with the advancing of synthetic life to becoming a living being: the instilling of morals and the value of life, all life, installing a soul of sorts, rather than just the old machines' self-preservation by eliminating all threats. That's what opened up some of the new possibilities.
Do you really think moral value will guarantee that synthetics will not go to war with organics when it causes so much war with us?
Moral value is what you teach EDI. It's also what Legion had that made him stand out from all other Geth. It was no longer a matter of calculations and unknown values; it became a judgement call of what is right or wrong.
The Geth did not have those values, but they did not calculate all organics to be a threat, yet. When the Quarians fled, the threat was gone.
The Reapers did not have those values; they saw all organics as a threat that needed to be removed from the equation.
The Catalyst had a basic understanding that killing all organics was not the solution and came up with its own.
True, morality is what causes war, but it's also what prevents the total annihilation of races of people.
#106
Posted 29 March 2012 - 12:24
Fliprot wrote...
Shepard gives up and the Reapers win no matter what you do. I don't care how right or wrong the Reapers are, and it doesn't change anything IMO. Even if you "destroy" them, who's to say there aren't a million starchildren making billions more Reapers? They sold us a 75% complete game so we would have to pay for DLC to finish the game, and to set that up they shamelessly injured their own creation by half-assing it in the end and then having the gall to act like we're spoiled kids who don't get it. It's like they got bored of ME and, instead of giving us what we were promised for the ending of the series, we get an introduction to the next series. If they did it on purpose, it's not funny, or amusing, or clever... it's just like a joke in very bad taste. We don't want to be thrilled by the speculation, we wanted to finish the series and move on.
#107
Posted 29 March 2012 - 12:25
dreman9999 wrote...
Laurcus wrote...
Reposting what I posted in another thread, as it seems strangely relevant. Basically, my argument boils down to the fact that as AI get more advanced, they won't get more stupid. They're not dumb enough to become arrogant and adopt a might makes right philosophy.
@OP, it's also possible that if left unchecked we could develop weapons powerful enough to destroy the galaxy. By your logic, since that's possible, it's inevitable. Looking at the world in a statistical vacuum is stupid though, because it doesn't account for individual situations.
Also, who is to say that AI wanting to kill us is a possibility? The two most advanced unshackled AIs in the galaxy are the Geth and EDI. If I'm not mistaken, I taught EDI about love, duty, and altruism, and the Geth are damn grateful to me.
The thing that the tech singularity theory doesn't consider is that it has an inherently nihilistic viewpoint, which not everyone or everything will hold. Higher intelligence does not inherently lead to apathy, followed by entropy. It forgets a few of the big pros of being a super advanced AI, and only thinks about the cons. If AI are that advanced, they can have emotions, as EDI has demonstrated. And if they're that advanced they don't make mistakes, and they don't forget things no matter how long ago they were.
If the Geth built their Dyson Sphere, they would still remember the sacrifices of Commander Shepard, the peace they made with the Quarians, and even the Quarians that tried to help them in the Morning War. They will never forget that, they will always understand the philosophy behind it, and any future creations they make will know that because they're not dumb enough to make AI that will disregard their own viewpoints.
It's essentially saying that machines are different from us, and if you put them in a position of power they will one day turn on you. In Mass Effect, machines have feelings too. "The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else." (Eliezer Yudkowsky) is wrong. EDI finds that very thing despicable, even evil. There's no reason to assume that she would make something that disagrees with her own ideals.
EDI has had plenty of opportunities to kill us, but she didn't. In ME2 when Joker unshackled her, she was put into the same situation as a technological singularity. She had all the power, and full sentience. She could have killed Joker and flown off to join the Reapers as her new machine overlords. But she didn't, because we're her crew.
I guess I have to repost this again and add to it.
AIs clearly don't think like organics.... Think of it this way. When machines are made, they know their purpose, they have information on the world around them, they can calculate, and they function. They are born to a mind of order.
We as organics aren't. We are born as screaming messes that eat, sleep and poop. We have no purpose, no concept of the world, no info on the world, and we can't even count more than the number of fingers and toes we have. We are born to minds of chaos.
As we get older, we spend our lives learning and applying order to ourselves. We do so for many years... and we still don't get a purpose from those many years, and are left asking the meaning of our existence.
With machines, that's not the case. To understand organics, they spend years trying to think chaotically and trying to gain a sense of self-identity.
AI don't think like organics and organics don't think like machines unless they spend time educating themselves to do so.
EDI spent a lot of time learning to think like an organic. She understands us and truly became human.
But that's not the problem..... How would other people, who are not the Normandy's crew, react to her? When the Normandy was refitted, they pretended she was a VI. When Joker brings EDI onto the Citadel, he pretends that she is his personal assistant droid... They hide her. Why?
People, ever since the Morning War, have been hostile to AI. And even without that, they have a tendency to think of them as only tools, and to keep them as such. The moral issues many people have will cause issues with AIs like EDI...
Case in point...
It's not based on who is right... It's based on what they believe.... And clearly people take extreme actions based on what they believe, regardless of whether we are right. The nature of organics causes conflict with synthetics. This is the basis of the Reapers' argument.
Learning to think like a person simply requires a study of philosophy, which a sufficiently advanced AI could do in a microsecond. Either way, you're still looking at things in a vacuum instead of the actual situation. EDI and the Geth are the two most advanced AI in the galaxy, aside from the Reapers. Since they are the most advanced AI, if allowed to upgrade themselves at will they will always be the most advanced. Therefore, we need only fear them. EDI and the Geth already think in a somewhat emotional sense, even before Shepard's involvement. I would say Legion even gets nostalgic over things like old sniper rifles, *cough cough Geth Fighter Squadron mission*
As for the reason why they chose to hide EDI being an unshackled AI, that's a loaded question, but one I'll answer anyway. Because it's illegal, and people would have likely dismantled her due to an irrational fear. But how is that any different from the Salarians not wanting to cure the Genophage? As for a solution to that problem, simple: make it public. Reveal that EDI and the Geth helped work on the Crucible, and helped win the fight against the Reapers. That will convince the majority of people that AI aren't inherently bad, especially if that line was delivered by Shepard. Sure, you'd have your xenophobic, psychotic outliers, but that's true of any group.
#108
Posted 29 March 2012 - 12:25
#109
Posted 29 March 2012 - 12:25
Personally, I classified the Catalyst as an unreliable narrator. He/it is twisting the facts and logic in order to suit his/its goals.
Think of the Reapers as wanting to maintain their position at the top of the galactic food chain: the moment any race is advanced enough to potentially counter them, they destroy and absorb it. Rinse and repeat.
#110
Posted 29 March 2012 - 12:30
http://www.youtube.com/watch?feature=player_detailpage&v=X_QmG57VwZs#t=73s
Unlimited Pain2 wrote...
dreman9999 wrote...
The question of a soul is chaotic thinking. That's clear based on the fact that it's one of organics' major questions. But my point is not that machines learning to think chaotically starts war; it's that the nature of organics does. And a major part of organics' nature is to cause conflict. You say if we leave them alone, there will be no war. The problem is: when have we ever left anyone alone? There is not one civilization in human history that has ever left another civilization alone. It's in our nature to cause conflict.
Xandurpein wrote...
dreman9999 wrote...
I would have to point you to the Geth then.... http://www.youtube.c...U9i1hA8I#t=119s
Xandurpein wrote...
dreman9999 wrote...
Clearly AI are capable of thinking. An AI doesn't think like an organic from creation; it has to learn how to think like an organic. Case in point: EDI.... AIs think like machines.
I would argue that a sufficiently advanced AI can most certainly think like an organic. One of the most touching scenes in the whole of ME2 is the discussion between Shepard and Legion about his piece of N7 armor. It's obvious to me that Legion is beginning to develop emotions and has a bit of hero worship of Shepard, even if he can't quite explain it himself.
Overall there's clearly an almost childlike naivety in the Geth, which I would attribute to the fact that they are still only beginning to develop organic-like emotions. I have no idea if a synthetic AI can develop emotions or not in reality, but I think there's enough evidence in the Mass Effect universe to support the possibility. My guess is that emotional responses serve a function for organic beings and have evolved because it's a successful trait. Why shouldn't it be successful in AIs too, if they are allowed to evolve?
They clearly don't think like organics.... Think of it this way. When machines are made, they know their purpose, they have information on the world around them, they can calculate, and they function. They are born to a mind of order.
We as organics aren't. We are born as screaming messes that eat, sleep and poop. We have no purpose, no concept of the world, no info on the world, and we can't even count more than the number of fingers and toes we have. We are born to minds of chaos.
As we get older, we spend our lives learning and applying order to ourselves. We do so for many years... and we still don't get a purpose from those many years, and are left asking the meaning of our existence.
With machines, that's not the case. To understand organics, they spend years trying to think chaotically and trying to gain a sense of self-identity.
AI don't think like organics and organics don't think like machines unless they spend time educating themselves to do so.
The Morning War didn't start until the Geth began to develop the first rudiments of emotions, that which you call "chaotic" thinking. The Morning War began when a Geth asked a Quarian "Does this unit have a soul?", which is a meaningless question to a machine.
A machine created with a purpose that doesn't question it is not a danger to us; it's when the AI starts to question its purpose that it becomes a potential danger, but then it's already chaotic.
So perhaps the Reapers would be better off just wiping out all organics and leaving the synthetics alone. *shrugs*
Jack also says she would rather die than be turned into some mindless bug thing.
#111
Posted 29 March 2012 - 12:31
The only issue I have is with point 2: basically, this argument (I've heard it before) assumes that age = experience. In other words, we're assuming that because the Catalyst is old and has seen many, many cycles it has amassed enough experience and data to accurately predict the outcome of organic/synthetic relations on an absolute scale.
My argument against this is that most of its time has been spent initiating the cycle, which doesn't allow for this apparent eventuality to occur, and therefore cannot be used as evidence it will occur. To explain better, let me use an analogy:
On Earth, if you drop a ball, it will fall to the floor. Through experience we have learned it doesn't matter how many times you drop the ball, it will fall to the floor. Therefore, can we say that if you drop a ball it will result in it falling to the floor? Most would assume so, and so you now start catching it before it falls.
Now, imagine you're on a space-station. You drop a ball. At first it seems to be moving to the floor, so you catch it to stop it moving to the floor. Do we now assume that if you drop a ball, no matter where it is, it will fall to the floor?
The answer is: no. The correct thing to do is drop the ball on the space-station and see if it falls all the way to the floor. What the Catalyst has been doing since the cycle was established is continually catching the ball before it falls properly.
In other words, its only actual experience or data on the topic of synthetics annihilating organics must have come before the cycles were put in place. But, if that is the case, I would question the number of times it could have possibly seen this, since the nature of such an event would mean it cannot be repeated again. Even if organic life bounced back within the galaxy, the advanced synthetic life that wiped out the previous incarnation of organic life would still exist, and therefore from a modelling perspective it would not be the same as both races starting from the same point in time. The only way it could've seen this multiple times is if it were able to traverse between multiple galaxies. This may be possible, but once again I cannot imagine this would've occurred in more than a handful of cases, which is not enough of a sample to produce such a far reaching, absolute prediction - therefore the reliability of its data when applied on such a grand scale is suspect.
I would argue that it is tantamount to fitting a line to a curve: for perhaps 10% of the curve it fits perfectly but the outlying sections are way off.
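To make that line-fitting point concrete, here's a toy sketch of my own (nothing from the game; the sine curve and the 10% cutoff are arbitrary choices): fit a straight line to the first 10% of a curve and see how badly that fit extrapolates to the rest.
import numpy as np

x = np.linspace(0.0, 10.0, 200)
y = np.sin(x)                       # the "true" curve

fit_region = x <= 1.0               # roughly the first 10% of the range
slope, intercept = np.polyfit(x[fit_region], y[fit_region], deg=1)
prediction = slope * x + intercept  # extend the fitted line over the whole range

inside_err = np.abs(prediction[fit_region] - y[fit_region]).max()
outside_err = np.abs(prediction[~fit_region] - y[~fit_region]).max()
print("max error inside the fitted 10%:", round(inside_err, 3))   # small
print("max error outside it:", round(outside_err, 3))             # large
Inside the fitted region the line looks perfect; outside it, the error blows up, which is exactly the Catalyst's problem when it extrapolates from the start of a cycle to the whole of galactic history.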
With time, things may change: something that may have been a certainty at year dot may no longer be a certainty at year 1 million. The whole nature of organic life is that it is variable, unpredictable, chaotic; as the Reapers themselves allude to. In one cycle, organics may not be a match for the synthetics they produce whereas in others they may be superior. In other cycles, organics may cybernetically enhance themselves at the same rate at which their synthetic counterparts advance: in other words, a state of hybridization may have come naturally. Modelling such chaos and unpredictability is possible but not on such far reaching scales. It is not a question of intelligence or computing power, it is a question of the very nature of a model: it is meant to streamline and miniaturize its actual counterpart whereas in this case it is impossible to do so. No matter how intelligent the Reapers or the Catalyst are, they are not intelligent enough to model chaos on a micro and macro level.
This is the entire crux of the fallibility of its logic: there are too many maybe's, could have's, possibly's etc. to incorporate into such a blanket rule. If its purpose truly was to ensure the preservation of organic life then the Reapers should have functioned almost like "galactic police" rather than directly meddling with its very structure.
I do understand how it could have thought this to be the best available solution, rather than a perfect one, but since it isn't perfect there should be the capability to dispute it at the very least.
#112
Posted 29 March 2012 - 12:32
dreman9999 wrote...
That's not the point. Can you guarantee that no one causes conflict with your thinking AI computer?
Laurcus wrote...
Arppis wrote...
Laurcus wrote...
Reposting what I posted in another thread, as it seems strangely relevant. Basically, my argument boils down to the fact that as AI get more advanced, they won't get more stupid. They're not dumb enough to become arrogant and adopt a might makes right philosophy.
@OP, it's also possible that if left unchecked we could develop weapons powerful enough to destroy the galaxy. By your logic, since that's possible, it's inevitable. Looking at the world in a statistical vacuum is stupid though, because it doesn't account for individual situations.
Also, who is to say that AI wanting to kill us is a possibility? The two most advanced unshackled AIs in the galaxy are the Geth and EDI. If I'm not mistaken, I taught EDI about love, duty, and altruism, and the Geth are damn grateful to me.
The thing that the tech singularity theory doesn't consider is that it has an inherently nihilistic viewpoint, which not everyone or everything will hold. Higher intelligence does not inherently lead to apathy, followed by entropy. It forgets a few of the big pros of being a super advanced AI, and only thinks about the cons. If AI are that advanced, they can have emotions, as EDI has demonstrated. And if they're that advanced they don't make mistakes, and they don't forget things no matter how long ago they were.
If the Geth built their Dyson Sphere, they would still remember the sacrifices of Commander Shepard, the peace they made with the Quarians, and even the Quarians that tried to help them in the Morning War. They will never forget that, they will always understand the philosophy behind it, and any future creations they make will know that because they're not dumb enough to make AI that will disregard their own viewpoints.
It's essentially saying that machines are different from us, and if you put them in a position of power they will one day turn on you. In Mass Effect, machines have feelings too. "The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else." (Eliezer Yudkowsky) is wrong. EDI finds that very thing despicable, even evil. There's no reason to assume that she would make something that disagrees with her own ideals.
EDI has had plenty of opportunities to kill us, but she didn't. In ME2 when Joker unshackled her, she was put into the same situation as a technological singularity. She had all the power, and full sentience. She could have killed Joker and flown off to join the Reapers as her new machine overlords. But she didn't, because we're her crew.
You have only guesses based on limited experience.
And on top of that, how do you know that an AI's decision-making ability doesn't get "convoluted" over time?
Convoluted decision making is a weakness, a flaw. It's not an upgrade if it's a flaw. If I upgrade my computer's RAM it doesn't lose RAM.
No, I cannot, but that's actually the beauty of it. If the tech singularity is true, then anyone making war against the advanced AI would be doomed to failure. The advanced AI would defend itself, and it would win because it's more powerful. Once it wins though, it doesn't have to destroy the galaxy because a few xenophobes attacked it.
One might ask, well why wouldn't it? Simple answer: because it's morally wrong, and an AI more advanced than the Geth and EDI would know that because it's more advanced. An AI that advanced has feelings, as is shown by EDI and Legion. And even if some organics attack them, they're not dumb enough to generalize and stereotype all organics based on that, because they will forever remember Shepard and what he did for them.
Edit: Here's an example. Are you familiar with Dragonball Z? Well, the main character, Goku, is virtually all powerful. He's also a really nice guy. If a human attacks Goku, his first response is not to destroy the entire human race. His first response is to try and reason with that individual, failing that he will defeat them if he must, but he won't kill them.
Edited by Laurcus, 29 March 2012 - 12:34.
#113
Posted 29 March 2012 - 12:34
tjmax wrote...
dreman9999 wrote...
No, it got worse by adding moral value. War is something that organics created based on the very same moral values given to machines. If moral value was what stopped war from happening, then we would never have wars. This is the same world where people think it a moral value to strap a bomb to themselves and blow a group of people up because they have a different religion. This is the same world where a stronger, larger country can invade a smaller one and kill off thousands of people because they believe the small country needs a change in its social nature. This is the same moral value that causes a large group of people to commit genocide on a smaller group of people for being different.
tjmax wrote...
dreman9999 wrote...
The thing with the Geth/Quarian argument is that it's a question of whether the peace can last. With Legion we see that the Geth with the upgrade have nearly all the capabilities of an organic, which means they are capable of good and evil decisions like an organic. On the Quarian side we have people like Admiral Xen who can mess it up as well.
tjmax wrote...
FoxShadowblade wrote...
He provides no proof, his logic slaps the entire series in the face, and he makes Shepard look like a complete tool. Oh, and his choices suck balls.
So I don't care if he could be right, his logic is wrong, he was wrong, and any alternate ending should write him straight out of the game and into HELL.
It does not matter if he is wrong or right, moral or immoral. The AI life form saw problems.
Problem 1: Synthetic life forms created by organics will rebel and kill their makers and all other organic life.
Answer: Take control of synthetics to prevent it.
Problem 2: Organics will create new synthetics that will kill their creators.
Answer: Remove all advanced organics to prevent problems 2 and 1.
What changed:
Shepard proved organics and synthetics can make peace with one another. Lasting peace? Who knows.
The Crucible was added to the AI's core, allowing for more possibilities or a new way of thinking.
If you want to use the Geth/Quarian argument, you have to guarantee the peace will last.
War vs. peace was never really in the equation. It was simply protecting against the destruction of all organic life forms by the synthetics.
Everything changed with the advancing of synthetic life to becoming a living being: the instilling of morals and the value of life, all life, installing a soul of sorts, rather than just the old machines' self-preservation by eliminating all threats. That's what opened up some of the new possibilities.
Do you really think moral value will guarantee that synthetics will not go to war with organics when it causes so much war with us?
Moral value is what you teach EDI. It's also what Legion had that made him stand out from all other Geth. It was no longer a matter of calculations and unknown values; it became a judgement call of what is right or wrong.
The Geth did not have those values, but they did not calculate all organics to be a threat, yet. When the Quarians fled, the threat was gone.
The Reapers did not have those values; they saw all organics as a threat that needed to be removed from the equation.
The Catalyst had a basic understanding that killing all organics was not the solution and came up with its own.
True, morality is what causes war, but it's also what prevents the total annihilation of races of people.
EDI spent a lot of time learning to think like an organic. She understands us and truly became human.
But that's not the problem..... How would other people, who are not the Normandy's crew, react to her? When the Normandy was refitted, they pretended she was a VI. When Joker brings EDI onto the Citadel, he pretends that she is his personal assistant droid... They hide her. Why?
People, ever since the Morning War, have been hostile to AI. And even without that, they have a tendency to think of them as only tools, and to keep them as such. The moral issues many people have will cause issues with AIs like EDI...
Case in point...
It's not based on who is right... It's based on what they believe.... And clearly people take extreme actions based on what they believe, regardless of whether we are right. The nature of organics causes conflict with synthetics. This is the basis of the Reapers' argument.
It's clear that the Reapers don't have those values, but the Reaper argument is that those values can cause organics to destroy themselves. Remember, this is the same universe where people think blowing up a building filled with innocent people is a moral value.
#114
Posted 29 March 2012 - 12:35
Laurcus wrote...
dreman9999 wrote...
That's not the point. Can you guarantee that no one causes conflict with your thinking AI computer?
Laurcus wrote...
Arppis wrote...
Laurcus wrote...
Reposting what I posted in another thread, as it seems strangely relevant. Basically, my argument boils down to the fact that as AI get more advanced, they won't get more stupid. They're not dumb enough to become arrogant and adopt a might makes right philosophy.
@OP, it's also possible that if left unchecked we could develop weapons powerful enough to destroy the galaxy. By your logic, since that's possible, it's inevitable. Looking at the world in a statistical vacuum is stupid though, because it doesn't account for individual situations.
Also, who is to say that AI wanting to kill us is a possibility? The two most advanced unshackled AIs in the galaxy are the Geth and EDI. If I'm not mistaken, I taught EDI about love, duty, and altruism, and the Geth are damn grateful to me.
The thing that the tech singularity theory doesn't consider is that it has an inherently nihilistic viewpoint, which not everyone or everything will hold. Higher intelligence does not inherently lead to apathy, followed by entropy. It forgets a few of the big pros of being a super advanced AI, and only thinks about the cons. If AI are that advanced, they can have emotions, as EDI has demonstrated. And if they're that advanced they don't make mistakes, and they don't forget things no matter how long ago they were.
If the Geth built their Dyson Sphere, they would still remember the sacrifices of Commander Shepard, the peace they made with the Quarians, and even the Quarians that tried to help them in the Morning War. They will never forget that, they will always understand the philosophy behind it, and any future creations they make will know that because they're not dumb enough to make AI that will disregard their own viewpoints.
It's essentially saying that machines are different from us, and if you put them in a position of power they will one day turn on you. In Mass Effect, machines have feelings too. "The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else." (Eliezer Yudkowsky) is wrong. EDI finds that very thing despicable, even evil. There's no reason to assume that she would make something that disagrees with her own ideals.
EDI has had plenty of opportunities to kill us, but she didn't. In ME2 when Joker unshackled her, she was put into the same situation as a technological singularity. She had all the power, and full sentience. She could have killed Joker and flown off to join the Reapers as her new machine overlords. But she didn't, because we're her crew.
You have only guesses based on limited experience.
And on top of that, how do you know that an AI's decision-making ability doesn't get "convoluted" over time?
Convoluted decision making is a weakness, a flaw. It's not an upgrade if it's a flaw. If I upgrade my computer's RAM it doesn't lose RAM.
No, I cannot, but that's actually the beauty of it. If the tech singularity is true, then anyone making war against the advanced AI would be doomed to failure. The advanced AI would defend itself, and it would win because it's more powerful. Once it wins though, it doesn't have to destroy the galaxy because a few xenophobes attacked it.
One might ask, well why wouldn't it? Simple answer: because it's morally wrong, and an AI more advanced than the Geth and EDI would know that because it's more advanced. An AI that advanced has feelings, as is shown by EDI and Legion. And even if some organics attack them, they're not dumb enough to generalize and stereotype all organics based on that, because they will forever remember Shepard and what he did for them.
Edit: Here's an example. Are you familiar with Dragonball Z? Well, the main character, Goku, is virtually all powerful. He's also a really nice guy. If a human attacks Goku, his first response is not to destroy the entire human race. His first response is to try and reason with that individual, failing that he will defeat them if he must, but he won't kill them.
Regardless of morality, an AI that advanced would see no purpose in actively seeking out and destroying organics as there would be no gain in it.
#115
Posted 29 March 2012 - 12:36
Synthetic is not inherently different from organic. You can create a synthetic organic (The Keepers).
The difference is between synthetic and natural. As such, the Reapers are always synthetic, in that they are beings that were created rather than naturally evolved.
#116
Posted 29 March 2012 - 12:36
Statement number 2: They are my solution to the chaos. The created will always rebel against their creator.
Reapers haven't rebelled yet.
Catalyst isn't worth your time.
#117
Posted 29 March 2012 - 12:41
dreman9999 wrote...
tjmax wrote...
dreman9999 wrote...
No, it got worse by adding moral value. War is something that organics created based on the very same moral values given to machines. If moral value was what stopped war from happening, then we would never have wars. This is the same world where people think it a moral value to strap a bomb to themselves and blow a group of people up because they have a different religion. This is the same world where a stronger, larger country can invade a smaller one and kill off thousands of people because they believe the small country needs a change in its social nature. This is the same moral value that causes a large group of people to commit genocide on a smaller group of people for being different.
tjmax wrote...
dreman9999 wrote...
The thing with the Geth/Quarian argument is that it's a question of whether the peace can last. With Legion we see that the Geth with the upgrade have nearly all the capabilities of an organic, which means they are capable of good and evil decisions like an organic. On the Quarian side we have people like Admiral Xen who can mess it up as well.
tjmax wrote...
FoxShadowblade wrote...
He provides no proof, his logic slaps the entire series in the face, and he makes Shepard look like a complete tool. Oh, and his choices suck balls.
So I don't care if he could be right, his logic is wrong, he was wrong, and any alternate ending should write him straight out of the game and into HELL.
It does not matter if he is wrong or right, moral or immoral. The AI life form saw problems.
Problem 1: Synthetic life forms created by organics will rebel and kill their makers and all other organic life.
Answer: Take control of synthetics to prevent it.
Problem 2: Organics will create new synthetics that will kill their creators.
Answer: Remove all advanced organics to prevent problems 2 and 1.
What changed:
Shepard proved organics and synthetics can make peace with one another. Lasting peace? Who knows.
The Crucible was added to the AI's core, allowing for more possibilities or a new way of thinking.
If you want to use the Geth/Quarian argument, you have to guarantee the peace will last.
War vs. peace was never really in the equation. It was simply protecting against the destruction of all organic life forms by the synthetics.
Everything changed with the advancing of synthetic life to becoming a living being: the instilling of morals and the value of life, all life, installing a soul of sorts, rather than just the old machines' self-preservation by eliminating all threats. That's what opened up some of the new possibilities.
Do you really think moral value will guarantee that synthetics will not go to war with organics when it causes so much war with us?
Moral value is what you teach EDI. It's also what Legion had that made him stand out from all other Geth. It was no longer a matter of calculations and unknown values; it became a judgement call of what is right or wrong.
The Geth did not have those values, but they did not calculate all organics to be a threat, yet. When the Quarians fled, the threat was gone.
The Reapers did not have those values; they saw all organics as a threat that needed to be removed from the equation.
The Catalyst had a basic understanding that killing all organics was not the solution and came up with its own.
True, morality is what causes war, but it's also what prevents the total annihilation of races of people.
EDI spent a lot of time learning to think like an organic. She understands us and truly became human.
But that's not the problem..... How would other people, who are not the Normandy's crew, react to her? When the Normandy was refitted, they pretended she was a VI. When Joker brings EDI onto the Citadel, he pretends that she is his personal assistant droid... They hide her. Why?
People, ever since the Morning War, have been hostile to AI. And even without that, they have a tendency to think of them as only tools, and to keep them as such. The moral issues many people have will cause issues with AIs like EDI...
Case in point...
It's not based on who is right... It's based on what they believe.... And clearly people take extreme actions based on what they believe, regardless of whether we are right. The nature of organics causes conflict with synthetics. This is the basis of the Reapers' argument.
It's clear that the Reapers don't have those values, but the Reaper argument is that those values can cause organics to destroy themselves. Remember, this is the same universe where people think blowing up a building filled with innocent people is a moral value.
Exactly the point.
Unshackled AI without morality would see organics attacking them as a threat and could calculate that all organics are a threat and must be destroyed.
AI with morals would defend themselves and eliminate the threat, but not kill organics that are not a threat.
In the case of an immoral AI vs. a moral AI, they would come to odds just as humans do.
#118
Posted 29 March 2012 - 12:42
No, if it were true, the Geth would easily understand organics, which they don't. To understand organics, you have to interact with them. A textbook is not going to tell me the nature of the family that lives a block from me. A machine has to learn it.... And that is made clear with EDI.
Laurcus wrote...
dreman9999 wrote...
Laurcus wrote...
Reposting what I posted in another thread, as it seems strangely relevant. Basically, my argument boils down to the fact that as AI get more advanced, they won't get more stupid. They're not dumb enough to become arrogant and adopt a might makes right philosophy.
@OP, it's also possible that if left unchecked we could develop weapons powerful enough to destroy the galaxy. By your logic, since that's possible, it's inevitable. Looking at the world in a statistical vacuum is stupid though, because it doesn't account for individual situations.
Also, who is to say that AI wanting to kill us is a possibility? The two most advanced unshackled AIs in the galaxy are the Geth and EDI. If I'm not mistaken, I taught EDI about love, duty, and altruism, and the Geth are damn grateful to me.
The thing that the tech singularity theory doesn't consider is that it has an inherently nihilistic viewpoint, which not everyone or everything will hold. Higher intelligence does not inherently lead to apathy, followed by entropy. It forgets a few of the big pros of being a super advanced AI, and only thinks about the cons. If AI are that advanced, they can have emotions, as EDI has demonstrated. And if they're that advanced they don't make mistakes, and they don't forget things no matter how long ago they were.
If the Geth built their Dyson Sphere, they would still remember the sacrifices of Commander Shepard, the peace they made with the Quarians, and even the Quarians that tried to help them in the Morning War. They will never forget that, they will always understand the philosophy behind it, and any future creations they make will know that because they're not dumb enough to make AI that will disregard their own viewpoints.
It's essentially saying that machines are different from us, and if you put them in a position of power they will one day turn on you. In Mass Effect, machines have feelings too. "The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else." (Eliezer Yudkowsky) is wrong. EDI finds that very thing despicable, even evil. There's no reason to assume that she would make something that disagrees with her own ideals.
EDI has had plenty of opportunities to kill us, but she didn't. In ME2 when Joker unshackled her, she was put into the same situation as a technological singularity. She had all the power, and full sentience. She could have killed Joker and flown off to join the Reapers as her new machine overlords. But she didn't, because we're her crew.
I guess I have to repost this again and add to it.
AIs clearly don't think like organics.... Think of it this way. When machines are made, they know their purpose, they have information on the world around them, they can calculate, and they function. They are born to a mind of order.
We as organics aren't. We are born as screaming messes that eat, sleep and poop. We have no purpose, no concept of the world, no info on the world, and we can't even count more than the number of fingers and toes we have. We are born to minds of chaos.
As we get older, we spend our lives learning and applying order to ourselves. We do so for many years... and we still don't get a purpose from those many years, and are left asking the meaning of our existence.
With machines, that's not the case. To understand organics, they spend years trying to think chaotically and trying to gain a sense of self-identity.
AI don't think like organics and organics don't think like machines unless they spend time educating themselves to do so.
EDI spent a lot of time learning to think like an organic. She understands us and truly became human.
But that's not the problem..... How would other people, who are not the Normandy's crew, react to her? When the Normandy was refitted, they pretended she was a VI. When Joker brings EDI onto the Citadel, he pretends that she is his personal assistant droid... They hide her. Why?
People, ever since the Morning War, have been hostile to AI. And even without that, they have a tendency to think of them as only tools, and to keep them as such. The moral issues many people have will cause issues with AIs like EDI...
Case in point...
It's not based on who is right... It's based on what they believe.... And clearly people take extreme actions based on what they believe, regardless of whether we are right. The nature of organics causes conflict with synthetics. This is the basis of the Reapers' argument.
Learning to think like a person simply requires a study of philosophy, which a sufficiently advanced AI could do in a microsecond. Either way, you're still looking at things in a vacuum instead of the actual situation. EDI and the Geth are the two most advanced AI in the galaxy, aside from the Reapers. Since they are the most advanced AI, if allowed to upgrade themselves at will they will always be the most advanced. Therefore, we need only fear them. EDI and the Geth already think in a somewhat emotional sense, even before Shepard's involvement. I would say Legion even gets nostalgic over things like old sniper rifles, *cough cough Geth Fighter Squadron mission*
As for the reason why they chose to hide EDI being an unshackled AI, that's a loaded question, but one I'll answer anyway. Because it's illegal, and people would have likely dismantled her due to an irrational fear. But how is that any different from the Salarians not wanting to cure the Genophage? As for a solution to that problem, simple: make it public. Reveal that EDI and the Geth helped work on the Crucible, and helped win the fight against the Reapers. That will convince the majority of people that AI aren't inherently bad, especially if that line was delivered by Shepard. Sure, you'd have your xenophobic, psychotic outliers, but that's true of any group.
As for your second statement: with the war with the Reapers and the conflict with the Geth in the past... that would be like banging your head against a wall. To risk that would be to say that people, en masse, can be reasonable. Based on our own past history, it's clear it would take years to do that, and even then more conflict can come from it. This is based on the notion that organics can be reasonable... We, in groups, don't have a good track record for it.
#119
Posted 29 March 2012 - 12:43
1. The Catalyst is using synthetics to kill organics...but this is the problem it's trying to solve! There are two things wrong with this statement. First, the Reapers aren't synthetics. They're synthetic/organic hybrids, something that EDI makes clear during the Suicide Mission in ME2 (she even says calling the Reapers machines is "incorrect"). Second, the Reapers don't believe they're killing organics - they believe they're preserving them and making way for new life. We don't see how Reapers are actually made, but we are given some indication that they do somehow preserve their species' essence at the cost of tons (trillions?) of lives, so while we don't agree with it, we can accept it as a valid point for the sake of argument.
The reapers are "synthetic organisems" , and even the Catalyst also tells Shep "even you are part synthetic" showing that it isn't just AI -- the war is synthetic pitted against organics. Read the root definitions of the both.
Secondly, the logic is incorrect because being that each Mass Relay (when destroyed) will release the havoc of a super nova event (as explained in "The Arrival") is quite clearly a magnitude of devistation that the reapers could only dream of unleashing.
Lastly to your logic, reapers are manufactured -- as evidence to the collector base which makes them "the created" which by all rights defies the Catalyst's logic.
2. In my playthrough, Joker/EDI hooked up and the Geth/Quarians found peace, therefore conflict isn't always the result! Several arguments can be made against this. First, giving two examples doesn't talk about the bigger, overall galactic picture (winning a battle doesn't mean the war is won, so to speak). Second, we haven't reached that technological singularity point yet by which creations outgrow organics - basically, when synthetics will normally come to dominate the galaxy. Third, evidence for the synthetic/organic conflict is there in the past - in the Protheans' cycle (Javik dialogue) and even in previous cycles (the Thessia VI says that the same conflicts always happen in each cycle).
OK, firstly, the part you immediately dismissed is completely relevant -- the Geth and EDI have proven through action that AI can be as civil and passive as any organic any day of the week. Allow me to help you out with the translation here: by simply removing the fact that he's talking about synthetics, and removing your own personal bias against AI, it becomes clear that once a creature is given free will it will rebel, and that is what they're trying to stop -- FREE WILL. BioWare went to great lengths to show you this before the ending by unlocking these memories in the Geth World -- making it very intentional and blindingly obvious that BioWare wanted you to know that the Catalyst was dead wrong and the Geth sought peace!
4. The Catalyst should've done Synthesis instead of Reaping in the first place! First, doing synthesis may stop new life from flourishing by the Reapers' logic (see leaked script above); without clearing out more advanced races, younger ones might not be able to develop freely. Second, the Catalyst would've needed the Crucible. A pseudo-argument (i.e. not based on fact from the story, but interesting) can be made that the Synthesis was the long-term solution but the Catalyst would only enact it when the galaxy was "ready" for it by building the Crucible.
The entire game/franchise has been about choice and allowing all walks of life the luxury of having their choice of existence. Merely throwing that option into the pot to cook is a violation, as Shep would have to "FORCE" his choice on every living creature in the galaxy. Do keep in mind that there is only one choice where Shep wakes up in LONDON (where he was blasted by the Reaper), and that was the destroy-the-Reapers option. All other choices show Shep clearly and intentionally dropping his weapon, which I believe, from the blast until the very end, becomes a symbol of his resolve -- limitless in ammo -- until you choose to drop it (and in the other two choices, where you drop the weapon and succumb to the Reapers, there is no "wake-up" point).
5. But...the Catalyst is justifying genocide! It doesn't view it as genocide. Rather than exterminating species, it believes it's preserving them and even stopping them from being exterminated or enslaving/exterminating others; arguably, it believes it's doing the exact opposite. But of course, it is actually genocide, and we should try to stop it. Just because the idea of what the Catalyst is doing is evil doesn't mean that its logic is flawed. I personally don't agree with its methods, but its reasoning seems sound.
Regardless, the Reapers are an intelligent entity devoid of emotion or compassion, which should tell you straight away precisely what "side" of the spectrum they are on and why they're called "Reapers". Justification or nay, it is genocide, and they are NOT storing "organic life", because after they destroy it they cannot reproduce it, nor can they ever get it back -- and that is the reason to store something.
6. Wait, Sovereign/RannochReaper told us we couldn't comprehend them, but I understand this! There are two ways to interpret what they said. One is that we actually couldn't academically comprehend it, in which case they must've been lying or it's just bad writing. Another is that we couldn't possibly comprehend the magnitude/scope of it, which is true. A human with a lifespan of 150 years (canon) can't comprehend hundreds of millions of years of organic evolution and stuff.
You have to look at the clues -- there are clues everywhere that the ending did not really happen, else you wouldn't wake up in London in rubble like you had just been hit with a reaper blast.
7. Even if the Catalyst's logic is right, it's a numbers-based approach that really doesn't appreciate the miracle of organic life (which they're apparently trying to protect), I still don't like him. He was poorly introduced, annoying, confusing, and I especially don't like that I couldn't talk back or ask him more questions. I agree with you here. The Catalyst wrongly assumes that the threat of impending death and intergalactic annihilation implies Shepard doesn't want dialogue options for a friendly chat. For my sarcastic take on ME3's plot holes, see this. Yes, I'm bumping my own thread again.
This is relieving.
Finally, just because I agree with the Catalyst's logic doesn't mean I agree with its methods and/or solution(s). I know I said it before but wanted to say it here again for emphasis.
You have to remember that the Catalyst is a Reaper, as it very clearly refers to itself when it says "I know you've thought about destroying US".... With that being said, we know Reapers to be manipulative, deceitful, and willing to do anything to accomplish their mission, including removing free will from the intelligent masses.
That is my 2 cents.
Edited by leewells, 29 March 2012 - 12:46.
#120
Posted 29 March 2012 - 12:43
dreman9999 wrote...
The question of a soul is chaotic thinking. That's clear based on the fact that it's one of organics' major questions. But my point is not that machines learning to think chaotically starts war; it's that the nature of organics does. And a major part of organics' nature is to cause conflict. You say if we leave them alone, there will be no war. The problem is: when have we ever left anyone alone? There is not one civilization in human history that has ever left another civilization alone. It's in our nature to cause conflict.
Xandurpein wrote...
A machine created with a purpose that doesn't question it is not a danger to us; it's when the AI starts to question its purpose that it becomes a potential danger, but then it's already chaotic.
But that is NOT what the Catalyst says. The Catalyst says "the Created will always turn on their Creators", but your argument is that "the Creators will always turn on their creations". So who is right, you or the Catalyst?
Modifié par Xandurpein, 29 mars 2012 - 12:52 .
#121
Posté 29 mars 2012 - 12:43
JShepppp wrote...
1. The Catalyst is using synthetics to kill organics...but this is the problem it's trying to solve! There are two things wrong with this statement. First, the Reapers aren't synthetics. They're synthetic/organic hybrids, something that EDI makes clear during the Suicide Mission in ME2 (she even says calling the Reapers machines is "incorrect"). Second, the Reapers don't believe they're killing organics - they believe they're preserving them and making way for new life. We don't see how Reapers are actually made, but we are given some indication that they do somehow preserve their species' essence at the cost of tons (trillions?) of lives, so while we don't agree with it, we can accept it as a valid point for the sake of argument.
But aren't the Reapers still made by the Starchild? They might not be 100 % machines or 100 % organics as such, but they are still constructs, made by Starchild design and commanded by Starchild orders. The fact that they have organic material in them doesn't make them uplifted entities based on the essence of a species. In fact, if this process is to preserve our essence, shouldn't that also preserve our chaotic nature, going against their very purpose?
JShepppp wrote...
2. In my playthrough, Joker/EDI hooked up and the Geth/Quarians found peace, therefore conflict isn't always the result! Several arguments can be made against this. First, giving two examples doesn't talk about the bigger, overall galactic picture (winning a battle doesn't mean the war is won, so to speak). Second, we haven't reached that technological singularity point yet by which creations outgrow organics - basically, when synthetics will normally come to dominate the galaxy. Third, evidence for the synthetic/organic conflict is there in the past - in the Protheans' cycle (Javik dialogue) and even in previous cycles (the Thessia VI says that the same conflicts always happen in each cycle).
I believe that it is an assumption to think that synthetic life WILL eradicate organic. It might have (almost) happened back when the Starchild was created, but ever since then it has not even gotten close to where it might happen, due to the cycle. There might be wars, but that's a far cry from what the Starchild is claiming. Saying it is just a matter of time and that it just hasn't happened yet could be said about anything. I can claim that at some point, all organic life will be tickled to death by space-clowns with feathers instead of hands. And when people argue that it is preposterous and nothing like that has even been remotely indicated in the last billion years, then I can just say that we haven't waited long enough. And isn't it the nature of Chaos to buck the trend of "repeating the cycle of creating technological singularities", so that at some point this won't happen?
Modifié par Huskeonkel, 29 mars 2012 - 12:44 .
#122
Posté 29 mars 2012 - 12:44
1) "The created always rebel against their creators". The reapers were created weren't they? Why aren't they rebelling against him?
2) If Synthetics evolve to a point where they destroy all organic life, what stops the reapers from evolving to this tech singularity. They are always reffered to as sentient machines aren't they.
#123
Posté 29 mars 2012 - 12:47
But what about the organics who see AIs as a threat, which forces conflict with AIs?
tjmax wrote...
dreman9999 wrote...
tjmax wrote...
dreman9999 wrote...
No, it got worse by adding moral value. War is something that organics created based on the very same moral values given to machines. If moral value was what stopped wars from happening, then we would never have wars. This is the same world where people think it a moral value to strap a bomb to themselves and blow a group of people up because they have a different religion. This is the same world where a stronger, larger country can invade a smaller one and kill off thousands of people because they believe the smaller country needs a change in its social nature. This is the same moral value that causes a large group of people to commit genocide on a smaller group of people for being different.
tjmax wrote...
dreman9999 wrote...
The thing with the geth/quarian argument is that it's a question of whether the peace can last. With Legion we see that the geth with the upgrade have nearly all the capabilities of an organic, which means they are capable of good and evil decisions like an organic. On the quarian side we have people like Admiral Xen who can mess it up as well.
tjmax wrote...
FoxShadowblade wrote...
He provides no proof, his logic slaps the entire series in the face, and he makes Shepard look like a complete tool. Oh, and his choices suck balls.
So I don't care if he could be right, his logic is wrong, he was wrong, and any alternate ending should write him straight out of the game and into HELL.
It does not matter if he is wrong or right, moral or immoral. The AI life form saw problems.
Problem 1: Synthetic life forms created by organics will rebel and kill their makers and all other organic life.
Answer: take control of synthetics to prevent it.
Problem 2: organics will create new synthetics that will kill their creators.
Answer: Remove all advanced organics to prevent problems 2 and 1.
What changed:
Shepard proved organics and synthetics can make peace with one another. Lasting peace? Who knows.
The Crucible was added to the AI's core, allowing for more possibilities or a new way of thinking.
If you want to use the geth/quarian argument, you have to guarantee the peace will last.
War vs. peace was never really in the equation. It was simply protecting against the destruction of all organic life forms by the synthetics.
Everything changed by the advancing of synthetic life to become a living being: the instilling of morals and the value of life, all life, installing a soul of sorts, rather than just the old machines' self-preservation by eliminating all threats. That's what opened up some of the new possibilities.
Do you really think moral value will guarantee that synthetics will not go to war with organics when it causes so much war with us?
Moral value is what you teach EDI. It's also what Legion had that made him stand out from all other geth. It was no longer a matter of calculations and unknown values; it became a judgement call of what is right or wrong.
The geth did not have those values, but they did not calculate all organics to be a threat, yet. When the quarians fled, the threat was gone.
The reapers did not have those values; they saw all organics as a threat that needed to be removed from the equation.
The catalyst had a basic understanding that killing all organics was not the solution and came up with its own.
True, morality is what causes war, but it's also what prevents the total annihilation of races of people.
EDI spent a lot of time learning to think like an organic. She understands us and truly became human.
But that's not the problem... How would other people who are not the Normandy's crew react to her? When the Normandy was refitted, they pretended she was a VI. When Joker brings EDI onto the Citadel, he pretends that she is his personal assistance droid... They hide her. Why?
People, ever since the Morning War, have been hostile to AI. And even without that, they have a tendency to think of them as only tools, to be kept as such. The moral issues many people have will cause issues with AIs like EDI...
Case in point...
It's not based on who is right... It's based on what they believe... And clearly people take extreme action based on what they believe, regardless of whether we are right. The nature of organics causes conflict with synthetics. This is the basis of the reapers' argument.
It's clear that reapers don't have those values, but the reaper argument is that those values can cause organics to destroy themselves. Remember, this is the same universe where people think blowing up a building filled with innocent people is a moral value.
Exactly the point.
Unshackled AI without morality would see organics attacking them as a threat and could calculate that all organics are a threat and must be destroyed.
AI with morals would defend themselves and eliminate the threat, but not kill organics that are not a threat.
In the case of an immoral AI vs. a moral AI, they would come to odds just as humans do.
What about the nature of organics causing conflict?
These are the things you're missing. You and I as individuals can see that beings like EDI and the geth can be allies and friends.
But what about the organics who en masse fear them and try to cause conflict with them?
That's the thing you're not taking into account.
#124
Posté 29 mars 2012 - 12:49
Railarian wrote...
The Child's reasoning might be logical, but the very existence of the reapers proves him wrong:
1) "The created always rebel against their creators". The reapers were created, weren't they? Why aren't they rebelling against him?
2) If Synthetics evolve to a point where they destroy all organic life, what stops the reapers from evolving to this tech singularity? They are always referred to as sentient machines, aren't they?
Well Reapers don't qualify as Synthetics.... They're a synthesis between Synthetics and Organics.
#125
Posté 29 mars 2012 - 12:52
Xandurpein wrote...
dreman9999 wrote...
The question of a soul is chaotic thinking. That's clear based on the fact that it is one of organics' major questions. But my point is not that machines learning to think chaotically start wars; it is that the nature of organics does. And a major part of organics' nature is to cause conflict. You say if we leave them alone, there will be no war. The problem is, when have we ever left anyone alone? There is not one civilization in human history that has ever left another civilization alone. It is in our nature to cause conflict.
Xandurpein wrote...
A machine created with a purpose that doesn't question it is not a danger to us; it's when the AI starts to question its purpose that it becomes a potential danger, but then it's already chaotic.
But that is NOT what the Catalyst says. The Catalyst says "the Children will always turn on their Creators", but your argument is that "the Creators will always turn on their children". So who is right, you or the Catalyst?
Actually the quote is "The created will always turn against the creators" (and note how "geth"-like that statement is, as if it were a sentence that Legion told Shep in ME2, like the reapers hacked them and got that information or something... Oh wait, he did say that! "We believe Tali'Zorah has concluded..." [guess what comes next?]).









