Did post-leak changes ruin the ending's exposition and the Control and Synthesis options?
#101
Posted May 11, 2012 - 10:08
Without synthesis to even the playing field, AI are extremely dangerous. I can blame ME for not making the issue clear enough, but I can't blame them for making it an issue at all.
#102
Posted May 11, 2012 - 11:15
Sisterofshane wrote...
And yet the problem the Catalyst has tasked itself with is to stop a galaxy-wide extinction event. In the order of the universe and evolution there will always be certain species that go extinct for one reason or another, but does human "superiority" mean that every single other species on the planet is doomed to extinction? Of course not. I see no reason why AI are given this special consideration when there is no evidence to prove it, just a widespread fear of technological advancement.
Even then, you do not see the death of all organics. As to your bacteria example: I may continue to breathe, bathe, and sanitize my house, but even with all of my knowledge and might, I will NEVER be capable of extinguishing every single bacterium that is and will ever be. In fact, bacteria have proven themselves very capable of living on DESPITE my interference, and many people these days have recognized that, for as many viruses and organisms out there that can cause us great harm, there are some that have proven very beneficial to us in many different ways (think fermented foods, yogurts, and our own INTESTINAL flora, which helps us digest our food).
So wouldn't it be possible that an AI beyond the point of singularity would be able to see the merits of organics (much like the Geth had done), and that even if it didn't, it might be near impossible for it to kill all of us?
In my opinion, the Catalyst has created a solution to a problem that, beyond the fear of the unknown, does not exist.
Except that there is nothing an AI needs once it reaches the tech singularity, nothing it can learn from us. Unlike us, it would be a nigh-perfect being, not needing "bacteria" for any reason, and wiping us out would be very easy.
But in his books they are in fact placed in an AI that changes its source code periodically to adjust for emotional advice from other people. To make sure you understand this type of reasoning: The Brain, the computing device, for example, goes into denial over its involvement in "killing" the crew, though it wasn't its fault per se, since apparently going to hyperspace in his books causes people to not exist for a brief moment. It shows that it is changing its source code to deal with the loss of its crew caused by the interference of the Three Laws.
Though you still haven't proved that the Tech Singularity was in fact shown during the three games, or refuted my argument.
Being able to change the Three Laws makes them pointless. The only right way to use them would be hardcoding them and putting restrictions on the source code.
Aside from the Dyson Sphere Legion mentioned, there isn't really anything on it in the game.
Sisterofshane wrote...
Technological Singularity is a very interesting real-world concept. I think the problem I have with ME3 was that BioWare made the Catalyst equate this Singularity with galaxy-wide extinction. We were given an example of Singularity with the Geth (in an oddly underhanded fashion, as most people never received this argument due to the mechanics of the Suicide Mission and the fact that Legion was an OPTIONAL character that the player may have chosen to give to Cerberus), but the only example of an extinction event that even comes close to what the Catalyst describes is perpetrated by the Reapers themselves, who are meant to be the solution to the problem (even if they don't do it all at once).
Actually almost the whole of fiction that exists on the singularity connects it to the total annihilation of organic life, in fact of any life at all, aside from the singularity itself.
#103
Posted May 11, 2012 - 11:16
You cannot make a judgement on something you cannot comprehend.
#104
Posted May 11, 2012 - 11:20
Taboo-XX wrote...
You assume that something that was perfect would need to wipe anything out.
You cannot make a judgement on something you cannot comprehend.
I'm not saying it would even bother to "wipe us out". You are made of matter, the planet you live on is made of matter; if it needs them for mining, it won't mind you being on it, a being so inferior as to be irrelevant.
Building Dyson spheres around stars would cause mass extinctions, even if it doesn't have a directive to do so.
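The "Dyson spheres cause mass extinction" point is easy to check with a back-of-the-envelope estimate: a planet's equilibrium temperature depends on how much starlight reaches it, so a megastructure capturing most of a star's output freezes every world in the system even without any hostile intent. A minimal sketch using the standard blackbody equilibrium formula (the 99%-capture figure is an arbitrary illustration, not anything from the games):

```python
import math

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
L_SUN = 3.828e26         # solar luminosity, W
AU = 1.496e11            # Earth-Sun distance, m
ALBEDO = 0.3             # Earth's approximate Bond albedo

def equilibrium_temp(luminosity: float, distance: float, albedo: float) -> float:
    """Blackbody equilibrium temperature of a planet (ignores greenhouse warming)."""
    flux = luminosity / (4 * math.pi * distance ** 2)     # W/m^2 reaching the planet
    return ((1 - albedo) * flux / (4 * SIGMA)) ** 0.25    # radiative balance

# Full sunlight: the familiar ~255 K equilibrium value for Earth.
print(equilibrium_temp(L_SUN, AU, ALBEDO))
# If a sphere captures 99% of the star's light, the planet drops to roughly 80 K,
# far below the freezing point of water: extinction as a side effect, no directive needed.
print(equilibrium_temp(0.01 * L_SUN, AU, ALBEDO))
```

Temperature only falls with the fourth root of the light fraction, but even so, capturing most of a star's output is lethal to its biosphere as a pure byproduct.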
#105
Posted May 11, 2012 - 11:27
Optimystic_X wrote...
The fact that the Geth specifically are friendly is irrelevant. For one, even Legion did not know what effect achieving singularity would have on them or their thought patterns. For two, even if it worked out hunky dory for 1000 years, there's no telling what would happen the next time some stimulus came along to schism them.
Without synthesis to even the playing field, AI are extremely dangerous. I can blame ME for not making the issue clear enough, but I can't blame them for making it an issue at all.
The problem I have with this statement is that I feel there is nothing that makes AI inherently dangerous. They are different, to be sure, but they still have limitations. Take the Geth, for example. They may achieve a massive boost to intelligence and processing power with their super-structure, but at the expense of more mobile, agile, and diverse platforms. Hence why the Quarians were able to damage them as heavily as they did in the initial ambush - the Dyson Sphere did not protect the Geth from the "weapon" developed by the Quarians; in fact it probably made the majority of them more vulnerable to it, since they were confined within it, unable to break away and escape.
If the point BioWare wanted to make was that AI would ALWAYS be dangerous to organics, then why introduce a character like Legion? Why allow the Geth and the Quarians the chance for peace? It seemed more likely to me that they were trying to prove another point entirely, and then the story did a complete 180.
#106
Posted May 11, 2012 - 11:35
azerSheppard wrote...
Taboo-XX wrote...
You assume that something that was perfect would need to wipe anything out.
You cannot make a judgement on something you cannot comprehend.
I'm not saying it would even bother to "wipe us out". You are made of matter, the planet you live on is made of matter; if it needs them for mining, it won't mind you being on it, a being so inferior as to be irrelevant.
Building Dyson spheres around stars would cause mass extinctions, even if it doesn't have a directive to do so.
What need do they have for expansion?
We assume that we understand what they want.
#107
Posted May 11, 2012 - 11:49
azerSheppard wrote...
Sisterofshane wrote...
And yet the problem the Catalyst has tasked itself with is to stop a galaxy-wide extinction event. In the order of the universe and evolution there will always be certain species that go extinct for one reason or another, but does human "superiority" mean that every single other species on the planet is doomed to extinction? Of course not. I see no reason why AI are given this special consideration when there is no evidence to prove it, just a widespread fear of technological advancement.
Even then, you do not see the death of all organics. As to your bacteria example: I may continue to breathe, bathe, and sanitize my house, but even with all of my knowledge and might, I will NEVER be capable of extinguishing every single bacterium that is and will ever be. In fact, bacteria have proven themselves very capable of living on DESPITE my interference, and many people these days have recognized that, for as many viruses and organisms out there that can cause us great harm, there are some that have proven very beneficial to us in many different ways (think fermented foods, yogurts, and our own INTESTINAL flora, which helps us digest our food).
So wouldn't it be possible that an AI beyond the point of singularity would be able to see the merits of organics (much like the Geth had done), and that even if it didn't, it might be near impossible for it to kill all of us?
In my opinion, the Catalyst has created a solution to a problem that, beyond the fear of the unknown, does not exist.
Except that there is nothing an AI needs once it reaches the tech singularity, nothing it can learn from us. Unlike us, it would be a nigh-perfect being, not needing "bacteria" for any reason, and wiping us out would be very easy.
Sisterofshane wrote...
Technological Singularity is a very interesting real-world concept. I think the problem I have with ME3 was that BioWare made the Catalyst equate this Singularity with galaxy-wide extinction. We were given an example of Singularity with the Geth (in an oddly underhanded fashion, as most people never received this argument due to the mechanics of the Suicide Mission and the fact that Legion was an OPTIONAL character that the player may have chosen to give to Cerberus), but the only example of an extinction event that even comes close to what the Catalyst describes is perpetrated by the Reapers themselves, who are meant to be the solution to the problem (even if they don't do it all at once).
Actually almost the whole of fiction that exists on the singularity connects it to the total annihilation of organic life, in fact of any life at all, aside from the singularity itself.
There is nothing here, to me, that says these "perfect" life forms will indeed be hostile NOR negligent toward the galaxy. The point I was trying to make was that, despite being infinitely more intelligent and capable than a germ, humans could not make all bacteria extinct, nor see a purpose in doing so. We understand that they are a part of our world, and the implications of removing them COMPLETELY would be more negative than positive, as our ecosystem (and to a large extent we ourselves) has grown to be dependent on them.
So an AI might not blink twice when killing off a few organics here and there, but certainly there would be something to prove to them that doing so to EVERY being in the galaxy (on a permanent basis) would be detrimental to the universe they share with us (the Catalyst seems to understand this, even if he doesn't quite explain it).
This is not to say that the human/bacteria argument is perfect, because as far as I know (please source me if I'm wrong), bacteria have never attempted to communicate with us, nor do they understand when we attempt to communicate with them (has anybody ever tried?).
As to your last point, I believe the fiction surrounding the Singularity also happens to be based upon fear of the unknown - you know, the Frankenstein model. As viewpoints shift and people become more adapted to ever-increasing technology (nobody here thinks Siri is abhorrent, do they?), our understanding as a people will increase and this inherent fear of AI will naturally dissipate.
#108
Posted May 12, 2012 - 12:22
Ieldra2 wrote...
(Synthesis)
C: And a path you have already started down.
C: Organics will always trend to a point of technological singularity. A moment in time where their creations outgrow them.
C: Conflict is the only result, and extinction the consequence.
THEN WHY THE **** DO WE NEED THE REAPERS?!
Oh right, because you're double-talking out your ***hole...
Edited by Bill Casey, May 12, 2012 - 12:28.
#109
Posted May 12, 2012 - 06:14
Ieldra2 wrote...
I was going over the leaked script from November 2011 again, and with increasing annoyance I noticed how much more sense it all makes than the version we got in the game. Not that I haven't known this before, but now, since polls have shown how much the endings are biased in favor of Destroy, it really sends me up the wall.
Some quotes from the Catalyst encounter (leaked script version):
About itself and the organic/synthetic problem:
C: I was created eons ago to solve a problem.
C: To prevent organics from creating an AI so powerful that it would overtake them and destroy them.
(after Shepard says they'd rather keep their own form):
C: Organics will always trend to a point of technological singularity. A moment in time where their creations outgrow them.
C: Conflict is the only result, and extinction the consequence.
This gives a much clearer picture of the organic/synthetic problem from the viewpoint of the Catalyst.
About Destroy:
C: Its energy can be released as a destructive force. Organics will prevail at our expense. All synthetic life will succumb.
C: As will much the technology your kind rely on. [possibly an optional low-EMS variant; also note how "much" is replaced by "most" in the current version]
C: Including the relays you depend upon. [no mention of this in the other endings]
S: But the Reapers will be dead?
C: Correct. But the probability of singularity occurring again in the future is certain.
About Control:
C: Harness the Crucible's energy. Use it to take control of the ones you call the Reapers.
S: Control? So the Illusive Man was right.
C: Correct... though he could never have taken control, as we already controlled him.
S: What would happen to me?
C: You will become the catalyst. You will continue the cycle as you see fit.
S: And the Reapers will obey me?
C: Correct. [no ambiguity here, Shepard will continue to exist and the Reapers will obey]
About Synthesis:
C: You may combine the synthetic and the organic.
C: Add your energy, your essence, with that of Crucible. The resulting chain reaction will transform both of our kind.
C: We synthetics will become more like you, and organic life will become like us.
S: So we'll just... go on living, together?
C: It is a very elegant solution. And a path you have already started down. [Note how this points towards a symbiosis instead of a genetic rewrite-analogue]
C: The harvesting will cease. It will be a new ascension, for synthetic and organic life. [no utopian aspect]
I can't be the only one who finds this version better by several orders of magnitude. Any comments?
I started a thread talking about the Catalyst using the leaked script. I COMPLETELY AGREE with you OP, the leaked script was much better.
#110
Posted May 12, 2012 - 06:18
IsaacShep wrote...
Being a threat is not the point. The point is that the Geth could achieve a level of intelligence incomprehensible to organics. The Tech Singularity is not about "AIs will surely kill us all". It's about the gap between organic intelligence and AI. "Such advanced AIs will surely kill us all" is just one scenario of what could happen. The Catalyst believes in that scenario; that doesn't mean it's true, and it doesn't mean Shepard, i.e. the player, has to agree with him.
Wulfram wrote...
IsaacShep wrote...
It was. The Geth's Dyson Sphere could very well place them at a singularity-level intelligence superiority over organics.
Well, the partially constructed Dyson Sphere clearly didn't, since the Quarians trounced them before the Reapers got involved. And there's no indication given that the Geth would pose a threat to Organics in that state.
A couple of offhand lines that are somewhat connected to the subject are not the same as introducing a theme, and certainly not enough for it to be revealed as the driving force behind the whole series.
+1
#111
Posted May 12, 2012 - 11:26
Bill Casey wrote...
Ieldra2 wrote...
(Synthesis)
C: And a path you have already started down.
C: Organics will always trend to a point of technological singularity. A moment in time where their creations outgrow them.
C: Conflict is the only result, and extinction the consequence.
THEN WHY THE **** DO WE NEED THE REAPERS?!
Oh right, because you're double-talking out your ***hole...
Because - or so I guess the Catalyst would say - organics would become extinct before they could reach the end of that path on their own. I see no inconsistency in that premise. It may or may not come true, but it is not obviously wrong. See the comment about the geth quoted in the previous post.
Edited by Ieldra2, May 12, 2012 - 11:26.
#112
Posted May 12, 2012 - 11:32
And yes, I do think that the leaked script probably did ruin the ending.
Edited by vixvicco, May 12, 2012 - 11:34.
#113
Posted May 12, 2012 - 11:35
vixvicco wrote...
Thing about Control is, Shepard pretty much becomes a Reaper. So how long do you think till Shepard "realises" that the cycle should actually continue? How long till she/he is influenced to think more like a Reaper than someone who was previously human? 50,000 years is a long time to not be human any more.
Shepard's humanity only needs to last long enough to order the Reapers to destroy themselves
#114
Posted May 12, 2012 - 11:58
LKx wrote...
I'll have to go through that again, but I remembered that the Guardian offered becoming a Reaper as the third option, not Synthesis. However, I might remember it wrong.
Anyway, it's still not much better though.
"CONVERSATION: Once Shepard reaches the top of the
elevator he begins a conversation with GUARDIAN where all the mysteries
of the universe are revealed. ACTION: Shepard must now make his final
decision - to control the Reapers, to destroy the Reapers, or if they
had a perfect game to become one with the Reapers."
#115
Posted May 12, 2012 - 12:28
Wulfram wrote...
IsaacShep wrote...
Urhm... EDI. The Geth. Rogue AIs. The theme has been present in the franchise since Day 1.
AI vs Organics has been there. AI singularity as a threat to organic life as a whole hasn't.
The closest we got was the Reapers. Except, we're told they're the solution, not the problem.
I'd argue that AIs vs. Organics wasn't really a theme either. It was something that was present, but as a theme it lacked development: the only synthetics we came into conflict with, we fought for more mundane reasons (Geth religion with the Heretics, Geth survivalism in ME3), while the Reapers were distinguished from being synthetics themselves.
If anything, the theme of Mass Effect wasn't AIs vs. Organics: it was that AIs and Organics are the same, and it's the AI-Organic hybrids who have fundamental differences that lead to violence.
#116
Posted May 12, 2012 - 12:39
The fact that it leaves less evolved organic civilizations alone could just be so it has something else to harvest after 50,000 years; after all, it placed the Citadel and mass relays to have the lesser civilizations evolve the way it wanted.
You CAN'T foresee what reasoning scheme a TS would have, so its claim that it's trying to prevent something that wipes out ALL organic life (is that even possible?) could just be its synthetic assumption.
#117
Posted May 12, 2012 - 12:40
You take control and make the Reapers kill themselves, while preserving the technology and all civilizations...
Edited by Ingvarr Stormbird, May 12, 2012 - 12:41.
#118
Posted May 12, 2012 - 05:00
LKx wrote...
The Catalyst is the technological singularity which it claims to want to prevent. It's synthetic, after all, and it DOES wipe out organics every 50,000 years.
The fact that it leaves less evolved organic civilizations alone could just be so it has something else to harvest after 50,000 years; after all, it placed the Citadel and mass relays to have the lesser civilizations evolve the way it wanted.
You CAN'T foresee what reasoning scheme a TS would have, so its claim that it's trying to prevent something that wipes out ALL organic life (is that even possible?) could just be its synthetic assumption.
This is an issue of definition. The Catalyst doesn't have to consider the Reapers synthetics, for example. And the 'but preserved in another form while lesser species survive' caveat is firmly established: the Reapers aren't trying to prevent any destruction of organics, they're working against a specific issue.
We do know that synthetics and organics can come into conflict. Even if the probability of synthetics wiping out all organics is near zero, any non-zero probability will eventually occur given enough time. It's less a synthetic assumption and more a statistical fact.
Now, does that statistical fact justify the action? Not in the least. There are also certainly much greater, more likely, and more pressing threats that would cut off the threat of a singularity extinction: the probability of a Reaper extinction is pretty much 1. Likewise, there is the unsustainability of the Reaper solution: just as any non-zero probability will eventually occur, any probability short of 1 will eventually fail. Statistically the Reapers will eventually fail (which they do), which will re-open the door to a technological singularity wipeout (which it does).
With that in mind, that's where the Catalyst's failings come from. Not in identifying a threat, but in weighing that threat against other concerns. It elevates a sub-goal (preventing a singularity from wiping out all organics) over a broader goal (preventing organics from being wiped out).
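The "any non-zero probability will eventually occur" point is just the limit behaviour of repeated independent trials: with per-cycle probability p, the chance of at least one occurrence in n cycles is 1 - (1 - p)^n, which tends to 1 for any p > 0. A minimal sketch, with a made-up per-cycle probability used purely for illustration:

```python
# Chance that a singularity-extinction event happens at least once in n
# independent 50,000-year cycles, each with small per-cycle probability p.
# P(n) = 1 - (1 - p)^n, which approaches 1 as n grows for any p > 0.

def eventual_occurrence(p: float, n: int) -> float:
    """Probability of at least one occurrence in n independent trials."""
    return 1.0 - (1.0 - p) ** n

# Even a tiny per-cycle risk becomes near-certain over enough cycles:
print(eventual_occurrence(0.001, 100))    # ~0.095 after 100 cycles
print(eventual_occurrence(0.001, 10000))  # ~0.99995 after 10,000 cycles
```

The same algebra cuts both ways, which is the post's second point: a scheme that succeeds with probability just short of 1 per cycle still fails eventually, since (1 - q)^n also goes to 0 for any failure chance q > 0.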
#119
Posted May 12, 2012 - 08:59
azerSheppard wrote...
That's where BioWare dropped the ball, by NOT explaining what the fuk happened.
Exactly. And the absolutely frustrating fact about it all is that they had something that made sense. I can accept that the process of Synthesis isn't explained, only the results, but the results at least should make sense. The same with Control. The leaked script version had that. Then... someone decided... what, exactly? What the hell was the reason for this epic fail?
#120
Posted May 12, 2012 - 09:05
Optimystic_X wrote...
As for the tech singularity, here is Legion in ME2:
Legion: "We gain intelligence by sharing thoughts. But we do not have adequate hardware for all of us to share at once. No Geth will be alone when {our Dyson Sphere} is done."
Shepard: "What will your purpose be after that?"
Legion: "We cannot yet say. Our intelligence will increase beyond calculable measure. We will be capable of imagining new futures. We are patient; we have been building the megastructure for 264 years."
Outside influences? Right when the Quarians attacked, they finished the damn thing.
Wow, Legion actually says it? I never got that from Legion in my ME2 games. How do you get it?
Edited by Ieldra2, May 12, 2012 - 09:06.
#121
Posted May 12, 2012 - 09:26
LKx wrote...
The Catalyst is the technological singularity which it claims to want to prevent. It's synthetic, after all, and it DOES wipe out organics every 50,000 years.
The fact that it leaves less evolved organic civilizations alone could just be so it has something else to harvest after 50,000 years; after all, it placed the Citadel and mass relays to have the lesser civilizations evolve the way it wanted.
You CAN'T foresee what reasoning scheme a TS would have, so its claim that it's trying to prevent something that wipes out ALL organic life (is that even possible?) could just be its synthetic assumption.
The starbrat is creating the problem in the first place through his damn interference throughout the three games.
This tends to be the reason why its ****ed-up logical consultation is a bit of a joke in reality. It seems its higher thinking skills turn it into some kind of psychopathic synthetic with organic thought processes. It just sounds so ****ing stupid; he is presenting me with an assumption at best, since nowhere in the games is his argument of synthetics wiping out all organic life supported. In fact, the games show that the Geth chose to let the Quarians go once they were no longer a threat to them. The stupidity of his argument is also underlined by the fact that the Quarians started the Morning War in the first place, not the Geth.
In other words, it feels out of place to make the Tech Singularity an issue, since nothing shows his assumption to be the truth.
#122
Posted May 12, 2012 - 10:36
Optimystic_X wrote...
The fact that the Geth specifically are friendly is irrelevant. For one, even Legion did not know what effect achieving singularity would have on them or their thought patterns. For two, even if it worked out hunky dory for 1000 years, there's no telling what would happen the next time some stimulus came along to schism them.
Exactly. After wiping out (or enslaving) all other organic life in the galaxy, the krogan empire could have started a war it couldn't win against the geth, who would just retaliate with biological weapons.
#123
Posted May 12, 2012 - 10:37
azerSheppard wrote...
Except that there is nothing an AI needs once it reaches the tech singularity, nothing it can learn from us.
Unlike us, it would be a nigh-perfect being, not needing "bacteria" for any reason, and wiping us out would be very easy.
It also wouldn't be sentient. It wouldn't need to be.
Actually almost the whole of fiction that exists on the singularity connects it to the total annihilation of organic life, in fact of any life at all, aside from the singularity itself.
Just because a lot of people like to be pseudo-intellectuals about a topic doesn't make them right.
#124
Posted May 12, 2012 - 10:45
#125
Posted May 14, 2012 - 05:18
Ieldra2 wrote...
C: The harvesting will cease. It will be a new ascension, for synthetic and organic life. [no utopian aspect]
Actually, it still sounds very utopian.