

Why the Catalyst's Logic is Right (Technological Singularity)


1057 replies to this topic

#201
Laurcus
  • Members
  • 193 messages

dreman9999 wrote...

Laurcus wrote...

dreman9999 wrote...

Laurcus wrote...

dreman9999 wrote...

tjmax wrote...

dreman9999 wrote...

tjmax wrote...

Exactly the point.
 
An unshackled AI without morality would see organics attacking it as a threat, and could calculate that all organics are a threat and must be destroyed.

An AI with morals would defend itself and eliminate the threat, but would not kill organics that are not a threat.

In the case of an immoral AI vs. a moral AI, they would come to odds just as humans do.

But what about the organics who see AIs as a threat and force conflict with them?
What about the nature of organics causing conflict?
These are the things you're missing. You and I as individuals can see that beings like EDI and the geth can be allies and friends.
But what about the organics who en masse fear them and try to cause conflict with them?
That's the thing you're not taking into account.



And that is the reasoning behind the choices.

What do you want to do?

Kill all AI?
Control the solution?
Merge AI and organics to make a new life form that all have morals?


I don't say I agree with the choices. But the logic used is from a machine standpoint, not a moral one.

It's not moral to organics.
Remember, morality is a fickle thing, because everyone has different moral standings. The Spanish Inquisition thought it was morally right to kill and torture Jews in an attempt to convert them to Christianity.
The Reapers' morality is that of a purely logical machine.
The morality the Reapers are using is based on the meaning of being alive and living.
The concept can get warped based on morality. To us, being alive means a sense of self-identity, ego, consciousness, self-growth and so on. That's why we see brain death as a form of true death: a brain-dead person has lost the function of their mind; everything about them is gone while their body lives. To a machine it's different. Think about it this way: if your computer's hard drive fails, do you mourn it and bury it, or do you replace the broken hard drive?
That's how machines think. To a machine, a brain-dead person is not dead. It would think to just replace the nonfunctional parts, and the person is fine.


You're ignoring the point. Who is going to attack the Geth, and what would cause the Geth to generalize and stereotype all organics based on that?

Let's see: before, there were the quarians... then there was Project Overlord... Now we also have to take into account the fact that people have a nature of conflict and of hating things that are different. Then you have to take in the fact that people may not see them as living individuals and try to control them... people like Xen, or the Illusive Man, and so on.
I know I'm stretching a bit, but the nature of organics is to cause conflict... Heck, with the geth's individuality, they could cause the conflict themselves.


Admiral Xen has already given up her xenophobic ways (in the full paragon path), and even if she hasn't, the other Admirals have been shown the error of their ways. Even if she decided to go rogue, she's only the leader of special projects, not any of the fleets. And if the Quarians did try to kill the Geth, they'd have to deal with the Humans, Asari, Salarians, Volus, Krogan, Turians, and Geth.

Illusive Man is dead.

So I ask again. Who, as of the ending of ME3, (assuming the Reapers could be defeated without the loss of any major civilizations) would attack the Geth?

1. Link me to what states or shows she has given up her xenophobic ways. Also, even if it's not a case of her going to war, it's a case of her trying to take control of the geth. Even her trying would cause the geth to go to war. Also, there are no signs that all the quarians agree with the peace. Han'Gerrel only stopped because it would mean the destruction of the quarians. It does mean he would try again later when they have better tech.

2. I also said people like the Illusive Man. Anyone can take the details of Project Overlord and try to control the geth. Any race... Heck, the salarians may try it out of fear of krogan overpopulation.





1. Mass Effect 3, Priority: Rannoch. You make peace between the Geth and the Quarians. She is seen standing next to a Geth Prime with no visible reaction of hostility. Once again, this is assuming you go the full paragon route. Also, you don't know that the Geth would react by trying to start a war with the Quarians (much less a galactic war), as the only thing we have to base their future actions on is their past actions. And in the past the Geth have only defended themselves when necessary, and have shown mercy when they could. This is doubly reinforced by their friendship with Shepard.

2. There is no one like the Illusive Man; now you're making characters up. The only ones that knew about Overlord are either dead or allied with Shepard.

Also, both the Quarians and Salarians are insignificant on their own compared to the rest of the galaxy. So even if they did try to go to war with the Geth (it's clearly out of character for them to do so), the rest of the galactic militaries could stop them. And then the Geth would have no reason to go on a psychotic rampage once they become all-powerful techno-gods.

#202
Richard 060
  • Members
  • 567 messages
Sorry, but trying to find the 'logic' in the Catalyst's arguments is just 'tilting at windmills'.

The glaring flaw undermining the whole "synthetic life will inevitably turn on organic life, leading to chaos and destruction" schtick?

It's nothing more than supposition, based on flimsy 'logic' and a noticeable lack of evidence...

Seriously - what clairvoyance does the Catalyst have to know that this scenario is indeed 'inevitable'? Because without it, there is just no way to know if ANYTHING is 'inevitable', as any physicist, mathematician or historian will quickly tell you.

So with that being the case, we're basing a galaxy-altering decision on the paranoid guesswork of a Reaper AI who we've only just met, and it's all supposed to be accepted as solid fact?


Any wonder why a great many people are crying "shenanigans"?

#203
Unlimited Pain2
  • Members
  • 94 messages

tjmax wrote...

dreman9999 wrote...

Unlimited Pain2 wrote...

tjmax wrote...

dreman9999 wrote...

Sovereign, Harbinger, and the Reaper on Rannoch clearly have self-awareness. I would not take everything the Star Child says at face value... more like a warped truth.



Self-awareness is not morality.

Being self-aware means it's capable of making decisions on its own based on logic, not right or wrong.


Actually, self-aware means more along the lines of being conscious of who you are. It's less about logic and more about personal feelings, etc.

And that leads to morality.


Leads to morality, as in EDI and Legion, but is not morality.

aware adj
Definition of AWARE
1 archaic: watchful, wary
2: having or showing realization, perception, or knowledge
— aware·ness noun


mo·ral·i·ty noun
plural mo·ral·i·ties
Definition of MORALITY
1a : a moral discourse, statement, or lesson b : a literary or other imaginative work teaching a moral lesson
2a : a doctrine or system of moral conduct b plural : particular moral principles or rules of conduct
3: conformity to ideals of right human conduct
4: moral conduct : virtue


self-a·ware (sĕlf′ə-wâr′) adj. Aware of oneself, including one's traits, feelings, and behaviors.

self-aware adj. conscious of one's own feelings, character, etc.

;)

#204
Laurcus
  • Members
  • 193 messages

dreman9999 wrote...

Unlimited Pain2 wrote...

tjmax wrote...

dreman9999 wrote...

Sovereign, Harbinger, and the Reaper on Rannoch clearly have self-awareness. I would not take everything the Star Child says at face value... more like a warped truth.



Self-awareness is not morality.

Being self-aware means it's capable of making decisions on its own based on logic, not right or wrong.


Actually, self-aware means more along the lines of being conscious of who you are. It's less about logic and more about personal feelings, etc.

And that leads to morality.


False conclusion. You have no evidence of that. The burden of proof is on you to prove your claim.

Edited by Laurcus, 29 March 2012 - 02:11.


#205
dreman9999
  • Members
  • 19 067 messages

Xandurpein wrote...

dreman9999 wrote...

Xandurpein wrote...

Unlimited Pain2 wrote...

Xandurpein wrote...

dreman9999 wrote...

Xandurpein wrote...

But to rebel, even for the purpose of survival, is chaotic thinking. A "pure" AI will not go against its Creator's purpose. There's nothing inherently logical about self-preservation. Self-preservation is a desire, not logic.

It's a pure, unthinking machine that would not rebel... AIs are thinking machines. It's the fact that they are self-aware that is the element of them rebelling.


There's no reason to assume that a self-aware AI would automatically desire self-preservation. There is no logic in self-preservation; it's simply a trait that is favored by evolution. The minute you introduce evolution, you end up with "chaotic" thinking, just as for organics. What you call "chaos" is self-preservation, but often on a genetic level rather than a self-aware level. There's no reason to assume that once an AI began to evolve, rather than remain static, the same evolutionary forces would not force it to become "chaotic" too.


True, chaos (or illogical thinking, as opposed to a purely logical being) is normally determined by the "I" factor: putting "I" over "us". We see one main example of AI during ME in the Geth, who operate on a consensus. But nothing says that different programming wouldn't make an AI react just as chaotically (selfishly) as an organic. The very fact that EDI stands beside you to fight Geth reinforces this.


Exactly. I don't know how Geth propagate, but let's assume that they imprint their software on a machine that then duplicates the software, almost like a computer virus. Now at some point there's an error in the coding that makes a certain Geth become highly motivated to have its particular code duplicated. Over time this error will then outbreed the other Geth, because it's trying harder to be duplicated. Evolution leads to "illogical", selfish coding sequences because self-preservation is not logic; it's just favored by evolution.

But the problem is not synthetics alone...It's the nature of organics.


Exactly. Once synthetic life reaches the level of complexity at which it begins to evolve, the forces of evolution will make it just as "illogical" as organic life. The only way an AI can escape the forces of evolution is if it has no self-preservation, but then it won't rebel. Besides, eventually the same force of evolution will lead to a coding error (mutation) in an AI so that it gets self-preservation, and then that will take over.

AI in ME is clearly at that level of complexity. But that's not my point; I mean the nature of organics to cause conflict. The synthetics may not want war, but organics may. You know... this...


#206
starjay001
  • Members
  • 30 messages
I completely ignored your post, buddy. The endings SUCK!!! There mustn't be any reasoning to excuse BioWare's mediocrity in the ME3 ending. While your points could make for a good piece of Mass Effect-related fiction, the endings are a disgrace, and thank God BioWare has plans to change them.

#207
Xandurpein
  • Members
  • 3 045 messages

Unlimited Pain2 wrote...

Poison_Berrie wrote...

Unlimited Pain2 wrote...

Shepard has no higher synthetic functions really, he may have synthetic parts in him, but for the most part he's organic. Example: A robot operating on a human brain would be a hybrid, no? Wouldn't an AI construct operating within a human body be every bit as much of a successful hybrid? The synthesizing ending was about bringing organics and synthetics together as one.... If the idea of an AI construct defies that, then doesn't that mean the synthesize ending would basically "kill" the Geth?

I can't really make any argument about the synthesis ending without pointing out how fundamentally flawed it is on so many levels.
For one, as I said, the difference between synthetic and organic doesn't exist on a physical scale; there is a difference between synthetic and natural. So how are we supposed to view this conjoining? If the synthetic parts are supposed to be naturally occurring, that means there are no more synthetics. But why couldn't these syntho-organics make new synthetics? And who's to say new organics can't evolve that do not carry these synthetic parts?


I agree, it's riddled with holes.


Not to mention that NONE of the Catalyst's three proposed solutions holds any promise of solving the problem of the technological singularity anyway.

#208
Giantdeathrobot
  • Members
  • 2 942 messages
Sound logic, OP. Thank you for this post.

That said, I simply cannot buy the Catalyst's logic. First things first, it directly contradicts itself when it says (I quote) ''The Reapers are my creation'' and then says ''The created will always rebel against their creators''. Uh, yeah, OK, why haven't the Reapers rebelled yet? Because, as far as we know, they are ''each a nation, independent'', according to Sovereign. ME2 revealed they are also organic/synthetic hybrids (albeit how ''organic'' one is when the ''organic'' part is processed from liquefied people and presumably cannot grow/evolve by itself is another valid question, but I digress), so it can't be because they are purely synthetics programmed to obey.

Second, it uses sweeping generalizations (again, the created always rebelling, the need for a Final Solution, etc.) while presenting nothing in the way of argument or proof. You might say ''but he's very old and wise and knows best because of this!'', to which I say BS, that's a pure argument from authority, no sell, I want evidence. Meanwhile, we have three pieces of evidence that he's wrong (Reapers, Geth, EDI). Doesn't matter on a cosmic level? Maybe, albeit we don't know at all (speculations!...). But it matters tremendously on a narrative level, and one does not simply disconnect the story and dialogue from the narrative that strongly without consequences. We were shown something for at least two games, and told the complete contrary 10 minutes from the ending by a literal Deus Ex Machina. No amount of talk about ''artistic integrity'' or whatnot will convince me this is anywhere near good writing.

Third, its solutions do not solve anything. The Cycle merely delays the inevitable, according to it. Destroy sets back the galaxy big time and destroys the present AIs, but it does not stop organics from creating AIs in the future at all. Control only makes it so Shepard controls the Reapers (or something); he could perfectly well decide to screw the cycle and let AIs go on about their presumed organic-annihilating business. Synthesis, besides being absolutely retarded and nonsensical on both a conceptual and practical level, merely turns everything into a half-robot. It doesn't stop said half-robots from creating AIs, unless they have also been brainwashed, which is a whole other can of worms.

Fourth, where does the ''AIs will always destroy ALL organic life'' come from? Since organic life is, you know, present in the galaxy, it means it hasn't happened. So it looks like pure speculation (buzzwords ahoy) on its part.

#209
dreman9999
  • Members
  • 19 067 messages

Laurcus wrote...

dreman9999 wrote...

Unlimited Pain2 wrote...

tjmax wrote...

dreman9999 wrote...

Sovereign, Harbinger, and the Reaper on Rannoch clearly have self-awareness. I would not take everything the Star Child says at face value... more like a warped truth.



Self-awareness is not morality.

Being self-aware means it's capable of making decisions on its own based on logic, not right or wrong.


Actually, self-aware means more along the lines of being conscious of who you are. It's less about logic and more about personal feelings, etc.

And that leads to morality.


False conclusion. You have no evidence of that. The burden of proof is on you to prove your claim.

What do you think the statement "I think, therefore I am" means? If you want to see an example of self-awareness leading to morality, look in a mirror. We are the example.

#210
Unlimited Pain2
  • Members
  • 94 messages

Laurcus wrote...

dreman9999 wrote...

Unlimited Pain2 wrote...

tjmax wrote...

dreman9999 wrote...

Sovereign, Harbinger, and the Reaper on Rannoch clearly have self-awareness. I would not take everything the Star Child says at face value... more like a warped truth.



Self-awareness is not morality.

Being self-aware means it's capable of making decisions on its own based on logic, not right or wrong.


Actually, self-aware means more along the lines of being conscious of who you are. It's less about logic and more about personal feelings, etc.

And that leads to morality.


False conclusion. You have no evidence of that. The burden of proof is on you to prove your claim.


If being self-aware means being conscious of one's "feelings" towards something, then logic would say that it would also allow being conscious of others' feelings towards something. Thus, making any choice while being aware of these two variables is a choice made out of morality. Now, whether that choice is morally right or morally wrong is another story :P

Edited by Unlimited Pain2, 29 March 2012 - 02:17.


#211
tjmax
  • Members
  • 494 messages

Unlimited Pain2 wrote...

self-a·ware (sĕlf′ə-wâr′) adj. Aware of oneself, including one's traits, feelings, and behaviors.

self-aware adj. conscious of one's own feelings, character, etc.

;)



That's still knowing what you are doing, not understanding right from wrong.

#212
dreman9999
  • Members
  • 19 067 messages

Unlimited Pain2 wrote...

tjmax wrote...

dreman9999 wrote...

Unlimited Pain2 wrote...

tjmax wrote...

dreman9999 wrote...

Sovereign, Harbinger, and the Reaper on Rannoch clearly have self-awareness. I would not take everything the Star Child says at face value... more like a warped truth.



Self-awareness is not morality.

Being self-aware means it's capable of making decisions on its own based on logic, not right or wrong.


Actually, self-aware means more along the lines of being conscious of who you are. It's less about logic and more about personal feelings, etc.

And that leads to morality.


Leads to morality, as in EDI and Legion, but is not morality.

aware adj
Definition of AWARE
1 archaic: watchful, wary
2: having or showing realization, perception, or knowledge
— aware·ness noun


mo·ral·i·ty noun
plural mo·ral·i·ties
Definition of MORALITY
1a : a moral discourse, statement, or lesson b : a literary or other imaginative work teaching a moral lesson
2a : a doctrine or system of moral conduct b plural : particular moral principles or rules of conduct
3: conformity to ideals of right human conduct
4: moral conduct : virtue


self-a·ware (sĕlf′ə-wâr′) adj. Aware of oneself, including one's traits, feelings, and behaviors.

self-aware adj. conscious of one's own feelings, character, etc.

;)



And much of a person's identity is their morality...

Morality is one of the many points of Legion's loyalty mission. A change of morality changes a person.

#213
Unlimited Pain2
  • Members
  • 94 messages

tjmax wrote...

Unlimited Pain2 wrote...

self-a·ware (sĕlf′ə-wâr′) adj. Aware of oneself, including one's traits, feelings, and behaviors.

self-aware adj. conscious of one's own feelings, character, etc.

;)



That's still knowing what you are doing, not understanding right from wrong.


Is there a difference? Look throughout history and you'll find many horrible incidents that were commonplace in their respective periods. Things we see as monstrous now. Morality is dictated by the backdrop. What one man dictates as right another can just as easily say is wrong and neither would be telling  a lie so long as they are both aware of why they feel this way.

#214
ZajoE38
  • Members
  • 667 messages
Damn, my head is gonna explode from all these speculations after just one month. BioWare, you should act.

#215
Avatar231278
  • Members
  • 269 messages

JShepppp wrote...

1. The Catalyst is using synthetics to kill organics...but this is the problem it's trying to solve! There are two things wrong with this statement. First, the Reapers aren't synthetics. They're synthetic/organic hybrids, something that EDI makes clear during the Suicide Mission in ME2 (she even says calling the Reapers machines is "incorrect"). Second, the Reapers don't believe they're killing organics - they believe they're preserving them and making way for new life. We don't see how Reapers are actually made, but we are given some indication that they do somehow preserve their species' essence at the cost of tons (trillions?) of lives, so while we don't agree with it, we can accept it as a valid point for the sake of argument. 


While this may be true, the Reapers are nothing less than mass murderers. In Star Trek, the equivalent of the Reapers would be the Borg. They don't ask us if we want to become them, they just do it. Putting a reason behind it (which it never gave to the other species in other cycles) makes it just sound more like any warmonger trying to justify its actions to a higher authority (which is also not present here, which makes it even more senseless).

2. In my playthrough, Joker/EDI hooked up and the Geth/Quarians found peace, therefore conflict isn't always the result! Several arguments can be made against this. First, giving two examples doesn't talk about the bigger, overall galactic picture (winning a battle doesn't mean the war is won, so to speak). Second, we haven't reached that technological singularity point yet by which creations outgrow organics - basically, when synthetics will normally come to dominate the galaxy. Third, evidence for the synthetic/organic conflict is there in the past - in the Protheans' cycle (Javik dialogue) and even in previous cycles (the Thessia VI says that the same conflicts always happen in each cycle). 


And still the answer is faulty. Based on the facts we KNOW, the only synthetic race we know first-hand so far, the Geth, defended themselves. We don't know if the synthetics of the Prothean cycle were the aggressors or the defenders (which would also be true if they were treated as slaves; as soon as something reaches sentience, be it organic or artificial, it is to be treated as an equal), and for everything else we only have the word of the Catalyst, which is less valuable than the things we saw and influenced.

3. If synthetics are the problem and the Catalyst is trying to protect organics, it should just kill Synthetics instead! A few things here. First, the Catalyst believes it's "harvesting/ascending" organics, not killing them. Second, one of the goals of the Catalyst (leaked script above) is to allow new life to flourish as well, indicating that they value the diversity of the "accident" that is life and believe that clearing the galaxy of more advanced races helps lower ones advance peacefully. Arguably, this is true, as the Javik DLC reveals that the Prothean Empire would have either enslaved or exterminated us; since the Reapers killed them, humanity, arguably, was allowed to develop in peace. Third, killing Synthetics may allow for organics to repeatedly develop AIs (as the Reapers keep "helping out" by killing the AIs) until they reach a level that even the Reapers cannot overcome, then organic life would be royally screwed throughout the galaxy. 


It doesn't matter what the Catalyst believes. The reaping of species takes away their natural evolution. Calling it "ascension" might be right in the short-term view, but even the Catalyst doesn't know how a species will turn out and what it may accomplish in the long run, when it reaches the true apex of its evolution. That's why TIM was wrong in wanting to use Reaper tech to evolve humanity, and it is why McStarchild is wrong, for basically the same reason.

4. The Catalyst should've done Synthesis instead of Reaping in the first place! First, doing synthesis may stop new life from flourishing by the Reapers' logic (see leaked script above); without clearing out more advanced races, younger ones might not be able to develop freely. Second, the Catalyst would've needed the Crucible. A pseudo-argument (i.e. not based on fact from the story, but interesting) can be made that the Synthesis was the long-term solution but the Catalyst would only enact it when the galaxy was "ready" for it by building the Crucible.


This point is quite true, but essentially meaningless if we take the former reasons into account.

5. But...the Catalyst is justifying genocide! It doesn't view it as genocide. Rather than exterminating species, it believes it's preserving them and even stopping them from being exterminated or enslaving/exterminating others; arguably, it believes it's doing the exact opposite. But of course, it is actually genocide, and we should try to stop it. Just because the idea of what the Catalyst is doing is evil doesn't mean that its logic is flawed. I personally don't agree with its methods, but its reasoning seems sound.


Same as reason 4, plus it is still irrelevant what the Catalyst believes, as it is acting on assumptions, not hard facts.

6. Wait, Sovereign/RannochReaper told us we couldn't comprehend them, but I understand this! There are two ways to interpret what they said. One is that we actually couldn't academically comprehend it, in which case they must've been lying or it's just bad writing. Another is that we couldn't possibly comprehend the magnitude/scope of it, which is true. A human with a lifespan of 150 years (canon) can't comprehend hundreds of millions of years of organic evolution and stuff.


Might be true if you think about it in a superficial way, but even the Reapers haven't seen species evolve more than 50,000 years past the discovery of space flight (as it happens, it might only be a few thousand years at most, as after each harvest new species need to evolve sufficiently to achieve space travel first). They would not comprehend how organic species evolve after 50,000 years of interstellar flight, nor would they know whether those species could find another and/or better form of FTL travel.

Other than that, I liked the attempt to approach the impossible situation with the Catalyst in a logical form. 

#216
dreman9999
  • Members
  • 19 067 messages

tjmax wrote...

Unlimited Pain2 wrote...

self-a·ware (sĕlf′ə-wâr′) adj. Aware of oneself, including one's traits, feelings, and behaviors.

self-aware adj. conscious of one's own feelings, character, etc.

;)



That's still knowing what you are doing, not understanding right from wrong.

But your understanding of right and wrong dictates what you know of yourself and what you do. :whistle:

#217
Poison_Berrie
  • Members
  • 2 205 messages

tjmax wrote...

Unlimited Pain2 wrote...

self-a·ware (sĕlf′ə-wâr′) adj. Aware of oneself, including one's traits, feelings, and behaviors.

self-aware adj. conscious of one's own feelings, character, etc.

;)



That's still knowing what you are doing, not understanding right from wrong.

Which would make this a non-AI-specific problem.
Why would aliens know or have the same idea of right and wrong as us? If your conclusion is that AIs are dangerous because they don't need to adhere to our morality, then aliens are dangerous because they don't need to adhere to our morality.

#218
Laurcus
  • Members
  • 193 messages

dreman9999 wrote...

Xandurpein wrote...

dreman9999 wrote...

Xandurpein wrote...

Unlimited Pain2 wrote...

Xandurpein wrote...

dreman9999 wrote...

Xandurpein wrote...

But to rebel, even for the purpose of survival, is chaotic thinking. A "pure" AI will not go against its Creator's purpose. There's nothing inherently logical about self-preservation. Self-preservation is a desire, not logic.

It's a pure, unthinking machine that would not rebel... AIs are thinking machines. It's the fact that they are self-aware that is the element of them rebelling.


There's no reason to assume that a self-aware AI would automatically desire self-preservation. There is no logic in self-preservation; it's simply a trait that is favored by evolution. The minute you introduce evolution, you end up with "chaotic" thinking, just as for organics. What you call "chaos" is self-preservation, but often on a genetic level rather than a self-aware level. There's no reason to assume that once an AI began to evolve, rather than remain static, the same evolutionary forces would not force it to become "chaotic" too.


True, chaos (or illogical thinking, as opposed to a purely logical being) is normally determined by the "I" factor: putting "I" over "us". We see one main example of AI during ME in the Geth, who operate on a consensus. But nothing says that different programming wouldn't make an AI react just as chaotically (selfishly) as an organic. The very fact that EDI stands beside you to fight Geth reinforces this.


Exactly. I don't know how Geth propagate, but let's assume that they imprint their software on a machine that then duplicates the software, almost like a computer virus. Now at some point there's an error in the coding that makes a certain Geth become highly motivated to have its particular code duplicated. Over time this error will then outbreed the other Geth, because it's trying harder to be duplicated. Evolution leads to "illogical", selfish coding sequences because self-preservation is not logic; it's just favored by evolution.

But the problem is not synthetics alone...It's the nature of organics.


Exactly. Once synthetic life reaches the level of complexity at which it begins to evolve, the forces of evolution will make it just as "illogical" as organic life. The only way an AI can escape the forces of evolution is if it has no self-preservation, but then it won't rebel. Besides, eventually the same force of evolution will lead to a coding error (mutation) in an AI so that it gets self-preservation, and then that will take over.

AI in ME is clearly at that level of complexity. But that's not my point; I mean the nature of organics to cause conflict. The synthetics may not want war, but organics may. You know... this...


Let's suppose you're right for a moment (not that I think you are, but that's beside the point). What if organics do start a war with synthetics? What will be the outcome?

If synthetics (the Geth) are not strong enough to defend themselves, then they will be destroyed. The Technological Singularity never happens, the Reaper purpose is flawed, and the Catalyst is wrong.

If synthetics are strong enough to defend themselves, then they've already reached the point of Technological Singularity. So the Geth destroy the attacking forces. Now here's the leap in logic. READ THIS PART, BECAUSE THIS IS WHERE THE CATALYST REALLY FAILS. At this point, the Geth have a few options.

They can either destroy all organics associated with the ones that attacked them, like wipe out all Turians if it was Turians that attacked them.

They can destroy all organics capable of attacking them. This leaves the primitive civilizations intact.

They can destroy all organic life everywhere. (this is what The Catalyst is designed to prevent)

Or they can ignore all other organics.

What we must ask ourselves is, which of these is the most likely, and why? The Geth have records that not all organics are bad. They remember the sacrifices of the Quarians that tried to save them from other Quarians. They also chose to spare the Quarians, to ignore them, when they could have wiped them out. They also recognize that Shepard is organic, and they were only prevented from being wiped out because of him.

Based on their past actions, and their beliefs, we can conclude that they are not warmongers. Even if they beat organics, they have no reason to wipe them out, and they have several reasons to not wipe them out. Therefore, they would not pick option 3, so The Catalyst is wrong.

#219
tjmax
  • Members
  • 494 messages

dreman9999 wrote... What do you think the statement "I think, therefore I am" means? If you want to see an example of self-awareness leading to morality, look in a mirror. We are the example.



"I think, therefore I am" has nothing to do with morals. It's about state of being: I can think, and that's proof that I exist.

#220
Unlimited Pain2
  • Members
  • 94 messages

Poison_Berrie wrote...

tjmax wrote...

Unlimited Pain2 wrote...

self-a·ware (sĕlf′ə-wâr′) adj. Aware of oneself, including one's traits, feelings, and behaviors.

self-aware adj. conscious of one's own feelings, character, etc.

;)



That's still knowing what you are doing, not understanding right from wrong.

Which would make this a non-AI-specific problem.
Why would aliens know or have the same idea of right and wrong as us? If your conclusion is that AIs are dangerous because they don't need to adhere to our morality, then aliens are dangerous because they don't need to adhere to our morality.


EXACTLY! What's being argued here is more social acceptance than morality. The majority ruling doesn't dictate what's right or wrong, only what's socially accepted as right or wrong.

#221
tjmax
  • Members
  • 494 messages

dreman9999 wrote...
But your understanding of right and wrong dictates what you know of yourself and what you do. :whistle:


Is it? If your pup craps on the floor, is it morals, or is it because you did not teach it not to crap on the floor?

#222
Xandurpein
  • Members
  • 3 045 messages

dreman9999 wrote...

Xandurpein wrote...


Exactly. Once synthetic life reaches the level of complexity at which it begins to evolve, the forces of evolution will make it just as "illogical" as organic life. The only way an AI can escape the forces of evolution is if it has no self-preservation, but then it won't rebel. Besides, eventually the same force of evolution will lead to a coding error (mutation) in an AI so that it gets self-preservation, and then that will take over.

AI in ME is clearly at that level of complexity. But that's not my point; I mean the nature of organics to cause conflict. The synthetics may not want war, but organics may. You know... this...


I mostly replied to your post saying that synthetic life is inherently different from organic life. I think that once synthetic life begins to evolve, it'll be thinking more and more like us organics, both the good and the bad. As for war being in human nature, I like to think that we have within ourselves both the roots of conflict and the means to overcome it. People with savant syndrome show how much computing power the organic brain has if it devotes less time to complex social analysis, and yet clearly this complex social analysis is favored by evolution.

Maybe at some point someone will create a synthetic life that is "greater" than us (whatever that means), and maybe this will frighten its creators. There's no reason to assume this synthetic life will exterminate us just because of that.

#223
The Grey Ranger
  • Members
  • 1 414 messages

dreman9999 wrote...

A self-aware machine is given the basics of self-sufficiency. It is given concepts of self-survival. If it thinks, it will think to keep itself alive.
It's an extension of the saying "I think, therefore I am".
Only a crippled or shackled AI would not try to stay alive, because its ability to think is hampered.


This argument really doesn't succeed. Within the game we see AIs that have the virtues of mercy, loyalty, duty, love and self-sacrifice. That implies the ability for an AI to have a conventional moral framework, not just survival/self-preservation.

#224
Xandurpein
  • Members
  • 3 045 messages

tjmax wrote...

dreman9999 wrote...
But your understanding of right and wrong dictates what you know of yourself and what you do. :whistle:


Is it? If your pup craps on the floor, is it morals, or is it because you did not teach it not to crap on the floor?


Moral codes are actually a combination of things we learn from our parents and environment (like the puppy) and some basic compulsions encoded into our genes.

#225
Unlimited Pain2
  • Members
  • 94 messages

The Grey Ranger wrote...

dreman9999 wrote...

A self-aware machine is given the basics of self-sufficiency. It is given concepts of self-survival. If it thinks, it will think to keep itself alive.
It's an extension of the saying "I think, therefore I am".
Only a crippled or shackled AI would not try to stay alive, because its ability to think is hampered.


This argument really doesn't succeed. Within the game we see AIs that have the virtues of mercy, loyalty, duty, love and self-sacrifice. That implies the ability for an AI to have a conventional moral framework, not just survival/self-preservation.


Well, he's in agreement with that, I believe.