
Does EDI deserve to die in Destroy?


252 replies to this topic

#76
Master Xanthan
  • Members
  • 1,218 posts

David7204 wrote...

No.

And quite the opposite. First of all, acknowledging AIs as an absolute threat just because they're AIs is basically proving the Catalyst right. Which kind of makes picking Destroy pointless. Secondly, EDI would be exceedingly valuable if she could be brought back after Destroy, because she would be living proof that organics and synthetics can co-exist and even love one another. A template.


Destroy isn't pointless; the Reapers are dead, which seems like a victory to me. I don't think EDI and the geth deserve to die in the Destroy ending, but I guess BioWare wanted to give Destroy a downside.

#77
RadicalDisconnect
  • Members
  • 1,895 posts

BlueSandBristow wrote...

Machines don't know literature, art, or beauty. Everything is just code. There is no culture or morality.


Oh really? Then why did EDI choose a Christian biblical metaphor to name Legion? Oh, and what about this line if you take her on Sur'Kesh?

EDI: "I find that analyzing topographical data of a location isn't the same as seeing it in person. But there's also regret, knowing that as we speak Reapers are destroying other worlds just as beautiful."

Kinda blows your statement out of the water, doesn't it?

Edited by RadicalDisconnect, 17 August 2012 - 05:47.


#78
Sajuro
  • Members
  • 6,871 posts
Did Miranda deserve to die at Kai Leng's hands?

#79
LucasShark
  • Members
  • 3,894 posts

Balek-Vriege wrote...

LucasShark wrote...

BlueSandBristow wrote...

LucasShark wrote...

BlueSandBristow wrote...

Machines don't think like us. They use logic, and when they see our flaws, they will think we are dangerous. Machines don't care about civilians and children. They don't feel emotion. That makes EDI more dangerous than any organic. She can't feel, and she can keep killing without remorse. Javik makes this very clear.


Wish you would...

And oh yes: Javik is such an expert on the psychology of machines and totally unbiased... I mean, he only fought a war with the Protheans' own machines, so he's totally unbiased toward that subject, right?


Bias or no bias, his reasoning is very deep and true, and has not been proved wrong.


- Peace with the geth - proven wrong right there
- EDI's willingness to sacrifice herself to save Joker, a notably flawed human - demonstrated wrong again
- A bias toward a conclusion precludes that conclusion being "deep", as you put it; and "deep" is an emotional, not a logical, term - three strikes, farewell.


Oversimplification. Both Javik and the Catalyst would rebut your first two points by asking, "Yes, but for how long?" The issue is that the peace never lasts and is a brief, temporary exception to the rule. The Catalyst should know, since it was its job to broker peace between AIs and organics for who knows how long.

It would be like a casino owner (the Catalyst), who has seen countless gambles and has the money to prove it, saying:

"In the end, the house always wins."

Then some gambling noob (Shepard) comes along, replying:

"You're totally wrong, I just gambled for my first time and won!!! The house doesn't always win, meaning gambling is a good investment for me!"

The casino owner laughs, then says:


An appeal to probability is not a logical argument in this instance, as you can state the same for virtually any set of circumstances conceivable which isn't self-contradictory.

You can, for instance, make a case for it some day being possible for pigs to fly: pigs evolve, after all; genetic engineering; etc. That is a quantifiable possibility, and if you wait long enough, yes, "for how long" will work here too.

You can't use "for how long" as a totalitarian game-ender, as it is uncertain and undemonstrable, just as I could ask: "for how long" will "when pigs fly" be a legitimate expression?

An example of self-contradiction would be something like "a table made of insomnia", a statement which makes no sense.

#80
BlueSandBristow
  • Members
  • 48 posts

Sajuro wrote...

Did Miranda deserve to die at Kai Leng's hands?


That is completely different.

#81
BlueSandBristow
  • Members
  • 48 posts
Still, look at Terminator and I, Robot.

Edited by BlueSandBristow, 13 August 2012 - 05:31.


#82
Deadpool9
  • Members
  • 610 posts
Will Munny: Deserve's got nothin' to do with it.

-Unforgiven (1992)

#83
NPH11
  • Members
  • 615 posts
Is EDI any more a threat than the various races of the galaxy that have consistently butted heads in the past?

#84
DirtyPhoenix
  • Members
  • 3,938 posts

BlueSandBristow wrote...

Sajuro wrote...

Did Miranda deserve to die at Kai Leng's hands?


That is completely different.


No it's not. How do you know she's not lying? Can you trust a Cerberus agent? For all you know, she may be there trying to gain your trust so that you send her to Hackett, where she can sabotage the Crucible.
:wizard:

Edited by pirate1802, 13 August 2012 - 05:40.


#85
LucasShark
  • Members
  • 3,894 posts

BlueSandBristow wrote...

Still, look at Terminator and I, Robot.


The same Terminator where the humans defeat the machines utterly without help from another machine... oh wait.
The same I, Robot where one of the heroes is a self-aware AI that... oh wait...

#86
Ticonderoga117
  • Members
  • 6,751 posts

BlueSandBristow wrote...

Still, look at Terminator and I, Robot.


Terminator:
The AI becomes self-aware and tries to peacefully co-exist with its creators; the creators try to kill it out of fear and fail horribly. The AI decides the best way to keep living is to kill every human. Armageddon could possibly have been avoided if someone had kept a cool head in Cheyenne Mountain.

I, Robot (movie version, since I haven't read the book):
The AI takes the overriding law of robotics and applies it to the extreme. An AI with the ability to use the laws only as guidelines fights against the other AI.

Moral of the story:
AIs are not all inherently evil.

#87
BlueSandBristow
  • Members
  • 48 posts

RadicalDisconnect wrote...

BlueSandBristow wrote...

Machines don't know literature, art, or beauty. Everything is just code. There is no culture or morality.


Oh really? Then why did EDI choose a Christian biblical metaphor to name Legion? Oh, and what about this line if you take her on Sur'Kesh?

EDI: "I find that analyzing topographical data of a location isn't the same as seeing it in person. But there's also regret, knowing that as we speak Reapers are destroying other worlds just as beautiful."

Kinda blows your statement out of the water, doesn't it?


Where does she say this? For all I know, you're lying.

#88
BlueSandBristow
  • Members
  • 48 posts

Ticonderoga117 wrote...

BlueSandBristow wrote...

Still, look at Terminator and I, Robot.


Terminator:
The AI becomes self-aware and tries to peacefully co-exist with its creators; the creators try to kill it out of fear and fail horribly. The AI decides the best way to keep living is to kill every human. Armageddon could possibly have been avoided if someone had kept a cool head in Cheyenne Mountain.

I, Robot (movie version, since I haven't read the book):
The AI takes the overriding law of robotics and applies it to the extreme. An AI with the ability to use the laws only as guidelines fights against the other AI.

Moral of the story:
AIs are not all inherently evil.


If a machine steps out of its expected condition, wouldn't the first thing to do be to shut it down? Why should we trust a machine that thinks it's alive? Machines are used to do things too dangerous for humans. If they are suddenly alive, isn't it smart to shut them down right away?

Edited by BlueSandBristow, 13 August 2012 - 05:47.


#89
SeptimusMagistos
  • Members
  • 1,154 posts
No. EDI doesn't deserve to die. Neither do the geth.

Which is why the other options are right there, shiny and genocide-free.

#90
LucasShark
  • Members
  • 3,894 posts

BlueSandBristow wrote...

RadicalDisconnect wrote...

BlueSandBristow wrote...

Machines don't know literature, art, or beauty. Everything is just code. There is no culture or morality.


Oh really? Then why did EDI choose a Christian biblical metaphor to name Legion? Oh, and what about this line if you take her on Sur'Kesh?

EDI: "I find that analyzing topographical data of a location isn't the same as seeing it in person. But there's also regret, knowing that as we speak Reapers are destroying other worlds just as beautiful."

Kinda blows your statement out of the water, doesn't it?


Where does she say this? For all I know, you're lying.


Oh, for the love of... SHE NAMED LEGION! Did you miss that bit too?

Oh wait, you probably left it to die...

#91
Balek-Vriege
  • Members
  • 1,216 posts

LucasShark wrote...

Balek-Vriege wrote...

LucasShark wrote...

BlueSandBristow wrote...

LucasShark wrote...

BlueSandBristow wrote...

Machines don't think like us. They use logic, and when they see our flaws, they will think we are dangerous. Machines don't care about civilians and children. They don't feel emotion. That makes EDI more dangerous than any organic. She can't feel, and she can keep killing without remorse. Javik makes this very clear.


Wish you would...

And oh yes: Javik is such an expert on the psychology of machines and totally unbiased... I mean, he only fought a war with the Protheans' own machines, so he's totally unbiased toward that subject, right?


Bias or no bias, his reasoning is very deep and true, and has not been proved wrong.


- Peace with the geth - proven wrong right there
- EDI's willingness to sacrifice herself to save Joker, a notably flawed human - demonstrated wrong again
- A bias toward a conclusion precludes that conclusion being "deep", as you put it; and "deep" is an emotional, not a logical, term - three strikes, farewell.


Oversimplification. Both Javik and the Catalyst would rebut your first two points by asking, "Yes, but for how long?" The issue is that the peace never lasts and is a brief, temporary exception to the rule. The Catalyst should know, since it was its job to broker peace between AIs and organics for who knows how long.

It would be like a casino owner (the Catalyst), who has seen countless gambles and has the money to prove it, saying:

"In the end, the house always wins."

Then some gambling noob (Shepard) comes along, replying:

"You're totally wrong, I just gambled for my first time and won!!! The house doesn't always win, meaning gambling is a good investment for me!"

The casino owner laughs, then says:


An appeal to probability is not a logical argument in this instance, as you can state the same for virtually any set of circumstances conceivable which isn't self-contradictory.

You can, for instance, make a case for it some day being possible for pigs to fly: pigs evolve, after all; genetic engineering; etc. That is a quantifiable possibility, and if you wait long enough, yes, "for how long" will work here too.

You can't use "for how long" as a totalitarian game-ender, as it is uncertain and undemonstrable, just as I could ask: "for how long" will "when pigs fly" be a legitimate expression?

An example of self-contradiction would be something like "a table made of insomnia", a statement which makes no sense.


What the Catalyst (more so than Javik, whose opinions are based on Prothean/organic prejudice towards AIs) is getting at is:

a) The possibility of synthetics wiping out organics is high, based on its own experiences and the events which followed during the 37+ million years it has Reaped. If the Catalyst's theory could have been proven wrong, I think it would have happened at some point in its huge lifespan.
b) The result could definitely lead to the galactic extinction and genocidal eradication of all organic life for the rest of time by an organic race. Something there is no coming back from.

I use the gambling analogy because it's very close in theory and probability, if the Catalyst is to be believed (which it should be, since it has had so many chances to lie to or trick Shepard).

a) The extreme likelihood is that you will lose all the money you bet in a casino. No matter how many exceptions there are to the rule, the casino always makes a net profit off the floor. That's essentially proven every day, because casinos are big business, always turning a profit as long as people walk in the door.
b) The result of this can be someone winning a jackpot or a high-stakes card game, only to lose not only their winnings but everything they own to the casino. "The house always wins."

The evolution example you use is not the same as the above, because it suggests the likelihood of an AI/organic confrontation is small when indeed it's not. The fact is, if AIs don't start it, organics would, out of fear, greed, protectionism, etc.

Time doesn't matter to the Catalyst, only the probability and eventuality of something which would make its original purpose/programming redundant. Governing organic/synthetic relations wouldn't make sense if organic life and primordial goo ceased to exist, would it? Although Reaping is a totally corrupt version of its original purpose "morally," the Catalyst is still logically carrying out its original function (via the controlled culling of advanced races) the only way it thinks it can. That is, of course, until Shepard comes along and slaps a big Crucible on the Citadel.

Edited by Balek-Vriege, 13 August 2012 - 05:49.


#92
LucasShark
  • Members
  • 3,894 posts

BlueSandBristow wrote...

Ticonderoga117 wrote...

BlueSandBristow wrote...

Still, look at Terminator and I, Robot.


Terminator:
The AI becomes self-aware and tries to peacefully co-exist with its creators; the creators try to kill it out of fear and fail horribly. The AI decides the best way to keep living is to kill every human. Armageddon could possibly have been avoided if someone had kept a cool head in Cheyenne Mountain.

I, Robot (movie version, since I haven't read the book):
The AI takes the overriding law of robotics and applies it to the extreme. An AI with the ability to use the laws only as guidelines fights against the other AI.

Moral of the story:
AIs are not all inherently evil.


If a machine steps out of its expected condition, wouldn't the first thing to do be to shut it down? Why should we trust a machine that thinks it's alive? Machines are used to do things too dangerous for humans. If they are suddenly alive, isn't it smart to shut them down right away?


Why should we trust you?

#93
Generic Screen Name
  • Members
  • 245 posts

Sajuro wrote...

Did Miranda deserve to die at Kai Leng's hands?


Yes.

#94
LucasShark
  • Members
  • 3,894 posts

Balek-Vriege wrote...

LucasShark wrote...

Balek-Vriege wrote...

LucasShark wrote...

BlueSandBristow wrote...

LucasShark wrote...

BlueSandBristow wrote...

Machines don't think like us. They use logic, and when they see our flaws, they will think we are dangerous. Machines don't care about civilians and children. They don't feel emotion. That makes EDI more dangerous than any organic. She can't feel, and she can keep killing without remorse. Javik makes this very clear.


Wish you would...

And oh yes: Javik is such an expert on the psychology of machines and totally unbiased... I mean, he only fought a war with the Protheans' own machines, so he's totally unbiased toward that subject, right?


Bias or no bias, his reasoning is very deep and true, and has not been proved wrong.


- Peace with the geth - proven wrong right there
- EDI's willingness to sacrifice herself to save Joker, a notably flawed human - demonstrated wrong again
- A bias toward a conclusion precludes that conclusion being "deep", as you put it; and "deep" is an emotional, not a logical, term - three strikes, farewell.


Oversimplification. Both Javik and the Catalyst would rebut your first two points by asking, "Yes, but for how long?" The issue is that the peace never lasts and is a brief, temporary exception to the rule. The Catalyst should know, since it was its job to broker peace between AIs and organics for who knows how long.

It would be like a casino owner (the Catalyst), who has seen countless gambles and has the money to prove it, saying:

"In the end, the house always wins."

Then some gambling noob (Shepard) comes along, replying:

"You're totally wrong, I just gambled for my first time and won!!! The house doesn't always win, meaning gambling is a good investment for me!"

The casino owner laughs, then says:


An appeal to probability is not a logical argument in this instance, as you can state the same for virtually any set of circumstances conceivable which isn't self-contradictory.

You can, for instance, make a case for it some day being possible for pigs to fly: pigs evolve, after all; genetic engineering; etc. That is a quantifiable possibility, and if you wait long enough, yes, "for how long" will work here too.

You can't use "for how long" as a totalitarian game-ender, as it is uncertain and undemonstrable, just as I could ask: "for how long" will "when pigs fly" be a legitimate expression?

An example of self-contradiction would be something like "a table made of insomnia", a statement which makes no sense.


What the Catalyst (more so than Javik, whose opinions are based on Prothean/organic prejudice towards AIs) is getting at is:

a) The possibility of synthetics wiping out organics is high, based on its own experiences and the events which followed during the 37+ million years it has Reaped.
b) The result could definitely be the galactic extinction and genocidal eradication of all organic life for the rest of time by an organic race. Something there is no coming back from.

I use the gambling analogy because it's very close in theory and probability.

a) The extreme likelihood is that you will lose all the money you bet in a casino. No matter how many exceptions there are to the rule, the casino always makes a net profit off the floor. That's essentially proven every day, because casinos are big business, always turning a profit as long as people walk in the door.
b) The result of this can be someone winning a jackpot or a high-stakes card game, only to lose not only their winnings but everything they own to the casino. "The house always wins."

The evolution example you use is not the same as the above, because it suggests the likelihood of an AI/organic confrontation is small when indeed it's not. The fact is, if AIs don't start it, organics would, out of fear, greed, protectionism, etc.

Time doesn't matter to the Catalyst, only the probability and eventuality of something which would make its original purpose/programming redundant. Governing organic/synthetic relations wouldn't make sense if organic life and primordial goo ceased to exist, would it? Although Reaping is a totally corrupt version of its original purpose "morally," the Catalyst is still logically carrying out its original function (via the controlled culling of advanced races) the only way it thinks it can. That is, of course, until Shepard comes along and slaps a big Crucible on the Citadel.


Swing and a miss... missed the point entirely.

#95
Ticonderoga117
  • Members
  • 6,751 posts

BlueSandBristow wrote...
If a machine steps out of its expected condition, wouldn't the first thing to do be to shut it down? Why should we trust a machine that thinks it's alive? Machines are used to do things too dangerous for humans. If they are suddenly alive, isn't it smart to shut them down right away?


We are just bio-chemical machines that think they are "alive".
Who says, with definitive proof, that an AI can't be?

#96
AresKeith
  • Members
  • 34,128 posts

BlueSandBristow wrote...

RadicalDisconnect wrote...

BlueSandBristow wrote...

Machines don't know literature, art, or beauty. Everything is just code. There is no culture or morality.


Oh really? Then why did EDI choose a Christian biblical metaphor to name Legion? Oh, and what about this line if you take her on Sur'Kesh?

EDI: "I find that analyzing topographical data of a location isn't the same as seeing it in person. But there's also regret, knowing that as we speak Reapers are destroying other worlds just as beautiful."

Kinda blows your statement out of the water, doesn't it?


Where does she say this? For all I know, you're lying.


Did you take her to Sur'Kesh?

#97
RadicalDisconnect
  • Members
  • 1,895 posts

BlueSandBristow wrote...

RadicalDisconnect wrote...

Oh really? Then why did EDI choose a Christian biblical metaphor to name Legion? Oh, and what about this line if you take her on Sur'Kesh?

EDI: "I find that analyzing topographical data of a location isn't the same as seeing it in person. But there's also regret, knowing that as we speak Reapers are destroying other worlds just as beautiful."

Kinda blows your statement out of the water, doesn't it?


Where does she say this? For all I know, you're lying.


Don't believe me? Just YouTube Legion's activation in ME2.

And for Sur'Kesh, I haven't seen that video before and I don't have a PVR to record it. But just take EDI and hit one of the examine prompts on the edge facing the valley.

Edited by RadicalDisconnect, 13 August 2012 - 05:54.


#98
Hackulator
  • Members
  • 1,606 posts
Why does a machine made of meat have inherent value greater than one made of metal?

#99
BlueSandBristow
  • Members
  • 48 posts

RadicalDisconnect wrote...

BlueSandBristow wrote...

RadicalDisconnect wrote...

Oh really? Then why did EDI choose a Christian biblical metaphor to name Legion? Oh, and what about this line if you take her on Sur'Kesh?

EDI: "I find that analyzing topographical data of a location isn't the same as seeing it in person. But there's also regret, knowing that as we speak Reapers are destroying other worlds just as beautiful."

Kinda blows your statement out of the water, doesn't it?


Where does she say this? For all I know, you're lying.


Don't believe me? Just YouTube Legion's activation in ME2.

And for Sur'Kesh, I haven't seen that video before and I don't have a PVR to record it. But just take EDI and hit one of the examine prompts on the edge facing the valley.


Okay fine. Machines still can't feel pain and fear like organics. This makes them ruthless and deadly.

#100
BlueSandBristow
  • Members
  • 48 posts

Hackulator wrote...

Why does a machine made of meat have inherent value greater than one made of metal?


Organics have culture, emotions, civilians, and children. We can feel pain and fear. Machines are ruthless and cold.