***SPOILER*** The Origin of the Reapers is silly (and a paradox). ***SPOILER***
#101
Posted 04 March 2012 - 04:14
#102
Posted 04 March 2012 - 04:34
I get that they wouldn't actually be "Reapers" without harvesting organic species, so they'd be something else... whatever, they'd make it work. This just seems like Bioware decided to settle on one thing and when they did, they didn't overthink it because second-guessing would be a waste of development time.
Edited by RogueBot, 04 March 2012 - 04:37.
#103
Posted 04 March 2012 - 04:48
Breaking the rule.
No matter how strongly the Reapers warn you, there will still be many people who love breaking promises and rules for their own satisfaction.
(I think organic satisfaction is limitless and will cause chaos by itself.) That's the reason the Reapers were created.
Think of it as Alpha and Omega in your mind.
Giving a warning doesn't mean everybody will stay quiet or stay out of it.
#104
Posted 04 March 2012 - 04:54
Edited by RogueBot, 04 March 2012 - 04:55.
#105
Posted 04 March 2012 - 05:04
We can't understand them, because they are effing nuts. If you think, for a second, that they aren't silly...well you just might be crazy. I think fighting over Israel is silly. Land is land, who gives a crap? Let the Palestinians have their land and let it go. Despite that, people still fight over that land because it's "holy" to them. Seems mighty silly to me. Makes total sense to others.
#106
Posted 04 March 2012 - 05:14
#107
Posted 04 March 2012 - 05:16
It's about an intelligence beyond our capability that doesn't see a need for organics and continues to expand.
No idea how you misunderstood the definition.
#108
Posted 04 March 2012 - 05:25
Aesieru wrote...
No it's not Deleran...
It's about an intelligence beyond our capability that doesn't see a need for organics and continues to expand.
No idea how you misunderstood the definition.
Why does such a thing exist? Think about that for a while. Why does it "continue to expand"? We expand because of biological imperatives. But already with our level of intelligence, we understand the need to check this against our available resources, and to consider the needs of other organisms, etc in our environment. Intelligence has made us more moral and considerate of others, not less. Supposing that this trend reverses at some point is baseless at best.
#109
Posted 04 March 2012 - 05:29
Deleran wrote...
Aesieru wrote...
No it's not Deleran...
It's about an intelligence beyond our capability that doesn't see a need for organics and continues to expand.
No idea how you misunderstood the definition.
Why does such a thing exist? Think about that for a while. Why does it "continue to expand"? We expand because of biological imperatives. But already with our level of intelligence, we understand the need to check this against our available resources, and to consider the needs of other organisms, etc in our environment. Intelligence has made us more moral and considerate of others, not less. Supposing that this trend reverses at some point is baseless at best.
That's not intelligence, that's restraint based on instinct.
An AI can only grow stronger and more capable through expansion; there is no implied or possible negative. Humanity, on the other hand, can overextend itself.
#110
Posted 04 March 2012 - 05:39
#111
Posted 04 March 2012 - 09:02
AztecChieftain wrote...
I don't see what prevents the Reapers from warning developed civilizations about the dangers of AI should it become too advanced to control, if the existence of organic life really bears importance for them. They could help establish measures and take precautions to prevent synthetics from becoming too powerful, instead of completely eradicating tens of millennia's worth of evolution. Also I feel as if the part the child plays is shoehorned in to give a human element to the Reapers. They came off in the first two games as being machine and machine above all.
And what virtues of the human race (since they are the only real race posited in ME) make you think that they would not continue to forge a path to their own destruction even if the Reapers were benevolent and schooled them in why AI is bad? We see it right now as a matter of fact. A lot of people think that green energy is a load of bunk not worth pursuing and that instead we should continue to trawl the Earth for oil. We pollute our surroundings and kill off the natural inhabitants of the land in order to build our cities and the things we use every day.
Humanity is self-destructive; it is one of our instinctual creeds. If you were a sentient machine, would you want to put up with a bunch of base emotional creatures who constantly take actions that are harmful to their own well-being, and be required to be there every day to keep them from making those mistakes, or would you just destroy all life and let it rebuild itself anew until your services were once again required?
AI would be efficient; it would seek the path of least wasted resources, because that would be the logical end of such a question. In this case, destroying all life in the galaxy takes fewer resources than babying a group of destructive races would. Plus you get those 50,000 years of R&R in Dark Space.
#112
Posted 04 March 2012 - 09:04
#113
Posted 04 March 2012 - 09:08
Marstor, do not just say such juvenile thoughts without actually looking at the story; the people who don't come to your conclusion actually understand it... you obviously do not.
Edited by Aesieru, 04 March 2012 - 09:09.
#114
Posted 04 March 2012 - 09:11
Shall we continue our discussion here then? I'll repost my response to Balek.
"First of all, it's insane to claim every clue of organic life will eventually evolve the same way.
Secondly, if you dare to use that argument and EVERY ORGANIC LIFE is doomed to destroy itself no matter what, why don't you let it do so? What will you achieve? Why don't you destroy every sign of organic life yourself?
Why not let it go? Why not let something else take its place now instead of giving the doomed organic life one more chance to do the inevitable?
It doesn't make sense."
Edited by teh_619, 04 March 2012 - 09:11.
#115
Posted 04 March 2012 - 09:11
Balek-Vriege wrote...
Muskau wrote...
One more question. How does Synthesis stop AIs from being built? Doesn't it just make everyone like Shepard with robot parts? I'm pretty sure a cyborg creating AIs is no different from a human making AI.
I thought they said in the interviews they did for ME3 that they wouldn't leave us with questions about the ending.
Good point, and again I'm trying to stay away from revisiting leaks etc. so as not to spoil myself too much. I guess the only means of preventing the tech singularity through Synthesis are the following:
- Make everyone super intelligent, strong, etc. along the lines of advanced AI so we don't need them.
- Program an instinctive warning into each being that (hopefully) prevents them from developing AI and the possibility of tech singularities, while limiting races from achieving this through cybernetics.
Essentially, programming out some of our free will and forcefully keeping us from achieving technological singularity. We end up with the benefits of both organic and synthetic lifeforms, but really being neither in the end.
I suppose we could 'assume' that. But almost none of the endings seem to provide closure to the tech singularity problem.
Destroy - Reapers and Relays destroyed - tech singularity will happen in the distant future
Synergy - People become cyborgs - tech singularity possible unless somehow they take away free will, which I thought was the point of being organic?
Control - Reapers go away - tech singularity possible and/or imminent due to the Geth Dyson Sphere
Plus the "AI is EVIL" writing really isn't consistent:
Heretics - Think Sovereign is a god, does as he commands.
Geth - Attacked by Quarians after questioning Quarians
Rogue Moon AI - Self-defense after they try to deactivate it.
Rogue computer on Citadel - Threatens to kill when cornered so it can get away.
etc etc...
#116
Posted 04 March 2012 - 09:12
Please refer to the previous pages on this thread for logical insight before posting.
Edited by Aesieru, 04 March 2012 - 09:16.
#117
Posted 04 March 2012 - 09:20
#118
Posted 04 March 2012 - 09:31
teh_619 wrote...
I say, this relocation was indeed quite convenient.
I fail to see how; this has less information, confuses more people, and doesn't show much support for the community.
#119
Posted 04 March 2012 - 09:37
teh_619 wrote...
I don't even know.
Shall we continue our discussion here then? I'll repost my response to Balek.
"First of all, it's insane to claim every clue of organic life will eventually evolve the same way.
Secondly, if you dare to use that argument and EVERY ORGANIC LIFE is doomed to destroy itself no matter what, why don't you let it do so? What will you achieve? Why don't you destroy every sign of organic life yourself?
Why not let it go? Why not let something else take its place now instead of giving the doomed organic life one more chance to do the inevitable?
It doesn't make sense."
Hehe thanks. I went away for a bit and came back to post my reply only to find the thread locked and lost my post. Retype time.
I'm guessing this would be a question for the Guardian or its creators. Religious, cultural or ideological factors may have come into play, not only in why they try to save organic life but in the means by which they do so (which is the most logical one imo). I personally believe, as much as it hurts, that a technological singularity would just be the natural next step in the evolution of intellect in the universe and the end of ours. It would also be the only type of intelligence with a chance to survive the ending of the universe, unless some species managed to evolve itself (or forcefully evolve itself) into ascended, multi-dimensional beings. If it ain't bombs, wars or viruses, it's the tech singularity; if not that, it's overpopulation; if not that, space travel; if not that, supernovas, the ending of the galaxy, etc. At some point it all ends, since our universe is based on entropy. Makes you not want to get up in the morning. Also, if we take this into account there would be no plots with super-intelligences, since they may very well come to the conclusion "Life sucks, deal with it."
In ME3 the Reapers don't kill all organic life, just the races that prove they will progress to a technological singularity (of course they leave primitives behind to give them a chance). That's basically any race that uses advanced tools and progresses to fill the gaps in its capabilities and more. Tools lead to computers, AI and technological singularity. Any race that uses the relays, starships and computers already proves it's going along that path. Again, evolution is not the problem; it's technological progress. For the sake of every organic species, intelligent or not, the creators of the Reapers designed a plan that was the most efficient and foolproof they could come up with, while trying to justify their automated genocide by making races into memorials (Reapers). It comes down to this: better to ensure organic life in the galaxy by ending advanced lifeforms every 50K years than for it to be gone forever.
Now that I think about it, for all we know it's possible the creators of the Reapers didn't think like us and knew full well how to avoid it, but didn't trust other organics who think differently to do the same.
Edited by Balek-Vriege, 04 March 2012 - 09:37.
#120
Posted 04 March 2012 - 09:39
Aesieru wrote...
Did we honestly just get relocated to an older thread that was dead?
Marstor, do not just say such juvenile thoughts without actually looking at the story; the people who don't come to your conclusion actually understand it... you obviously do not.
No, but you need to learn how to distinguish a joke from fact. It might make life a little easier for you.
#121
Posted 04 March 2012 - 09:42
#122
Posted 04 March 2012 - 09:47
Muskau wrote...
Guess I'll continue it here...
Balek-Vriege wrote...
Muskau wrote...
One more question. How does Synthesis stop AIs from being built? Doesn't it just make everyone like Shepard with robot parts? I'm pretty sure a cyborg creating AIs is no different from a human making AI.
I thought they said in the interviews they did for ME3 that they wouldn't leave us with questions about the ending.
Good point, and again I'm trying to stay away from revisiting leaks etc. so as not to spoil myself too much. I guess the only means of preventing the tech singularity through Synthesis are the following:
- Make everyone super intelligent, strong, etc. along the lines of advanced AI so we don't need them.
- Program an instinctive warning into each being that (hopefully) prevents them from developing AI and the possibility of tech singularities, while limiting races from achieving this through cybernetics.
Essentially, programming out some of our free will and forcefully keeping us from achieving technological singularity. We end up with the benefits of both organic and synthetic lifeforms, but really being neither in the end.
I suppose we could 'assume' that. But almost none of the endings seem to provide closure to the tech singularity problem.
Destroy - Reapers and Relays destroyed - tech singularity will happen in the distant future
Synergy - People become cyborgs - tech singularity possible unless somehow they take away free will, which I thought was the point of being organic?
Control - Reapers go away - tech singularity possible and/or imminent due to the Geth Dyson Sphere
Plus the "AI is EVIL" writing really isn't consistent:
Heretics - Think Sovereign is a god, does as he commands.
Geth - Attacked by Quarians after questioning Quarians
Rogue Moon AI - Self-defense after they try to deactivate it.
Rogue computer on Citadel - Threatens to kill when cornered so it can get away.
etc etc...
Agree with just about everything there. I would only say that, to my understanding, Control is more "wait and see." If technological singularity becomes imminent, the cycle begins again. I could be wrong.
As for Synthesis, the problem is exactly what you say: you sacrifice part of every organic lifeform's free will in order to save them from the tech singularity, while making them "perfect" at the same time. It's not along the same lines as the Reapers (based on convos with the Guardian), but it does take away a big chunk of free thought, or more specifically, what makes races unique and dangerous to themselves.
Destroy is the freedom choice. Reset technology, refuse to accept that your fate is doomed to be a technological singularity, and allow races to achieve their own destinies. Only time will tell what happens.
Edited by Balek-Vriege, 04 March 2012 - 09:48.
#123
Posted 04 March 2012 - 09:48
AztecChieftain wrote...
I don't see what prevents the Reapers from warning developed civilizations about the dangers of AI should it become too advanced to control, if the existence of organic life really bears importance for them. They could help establish measures and take precautions to prevent synthetics from becoming too powerful, instead of completely eradicating tens of millennia's worth of evolution. Also I feel as if the part the child plays is shoehorned in to give a human element to the Reapers. They came off in the first two games as being machine and machine above all.
They don't eradicate millennia's worth of evolution. They archive it within Reaper shells in order to make room for new life to grow and prosper. It's quite possible that had they not begun the cycle, humanity would never have existed, or that Earth may have been devastated in a war between organic empires, colonized by a race of advanced organic life, strip-mined by a growing synthetic race, or become the site of a major battle between organic and synthetic enemies.
Each Reaper represents the collective existence of a species. The conflict the Guardian seeks to avert obviously stems from experimentation leading to a reinforced conclusion, and from a solution biased by groupthink.
Imagine that Sovereign never meets Saren and, instead of trying to initiate the cycle by attacking the Citadel, lets things play out. Imagine that Sovereign never contacts the geth. I'd wager that the quarians would have eventually attacked the geth in a bid to retake Rannoch and confirmed what seemed to be a growing conclusion among the geth: that organics could not be trusted. As Legion mentions in ME2, when the quarians feel they have the upper hand they attack the geth 100% of the time.
We've also seen that the Alliance was conducting its own illegal AI experiments in secret, even though that was forbidden after the quarian/geth conflict began. Imagine that they succeed in creating an AI which likewise begins to achieve sentience and then seeks out the assistance of the geth to ensure its survival. The logical conclusion being that organic life will seek to create synthetics, to control them, and to destroy them if synthetic life threatens their control.
I think that the concept they're going for makes sense because we can already see instances where these things have happened in the ME universe.
1) The creation of the geth and subsequent rebellion for survival
2) The disdain that exists for AI and the fear of them which the Citadel races harbor
3) The extreme isolation that the geth put themselves into
4) The creation of rogue AI on Luna, among other locations
5) The AI on the Citadel that attained sentience and sought out the geth.
6) The ease with which an outside force was able to convince a small but significant percentage of the geth population to launch a war on organics in exchange for the tools to achieve the future they desire.
7) The ease with which even the true geth will turn to the Reapers to ensure their survival and fight back against their creators in another assault on their existence.
It sounds to me like the logic of the Guardian isn't that ridiculous. A galaxy-wide war w/ the geth seems reasonable given the circumstances.
It simply needs an example to prove that its ultimate conclusion was wrong. That's observation-based science. And they have millions of years' worth of observations that I'm sure reinforce those conclusions. So when Shepard arrives in that chamber at the end of the game, the first time this has ever happened... and if the geth and quarians are able to broker peace, that's not just one but two examples which prove the conclusion is false, rather than hypothetical situations which never panned out.
Partly because the Guardian and the Reapers assumed the outcome and then looked for the evidence which supported their conclusion, rather than letting things play out and seeing if it actually happened. Possibly because they judged the risk too great if a synthetic race were ever allowed to develop sufficiently to a level where they created divergent technologies for which the Reapers had no defenses. It seems that threat is their driving force.
#124
Posted 04 March 2012 - 09:55
Taleroth wrote...
The Reapers aren't AI. They're a hybrid, but I'm nitpicking you...
Sapient constructs, actually. Talk about nitpicking.
#125
Posted 04 March 2012 - 09:55
So civilizations don't grow according to the guidelines set by the Reapers and their creators, which is apparently a one-way highway towards a singularity event...
Fair enough...
But what guarantee is there that future civilizations won't create AI that will lead to a tech singularity anyway?
Seems like the whole plot is nothing but a stop-gap... a short-term solution to a long-term problem...
...which, when you think about it, is what the Reapers were too...