Synthesis is serving the reapers, because they get off the hook, unpunished for the trillions of murders they have committed. It's like: "Thanks for all the genocides. All those races you exterminated must have been glad you did so. If only they knew. The horrors you inflicted on them were not in vain, because we are now happy in la-la land."

Ieldra2 wrote...
@AngryFrozenWater
I should really make a list of the ridiculous claims about Synthesis. It's "serving the Reapers", it "makes everyone the same", it "destroys free will"... Where the hell does the nonsense end? To what lengths will you go to hammer your propaganda into everyone's brain?
Yes, Shepard doesn't ask people if they want to be synthesized. He also doesn't ask if they want to live under the guardianship of a synthetic overlord with the mind of an ascended human (aka Control-Shepard), or whether they want to live with the risk that their descendants, and their entire species, will be killed by post-singularity synthetics. And as I said, results matter: I am approaching this from a consequentialist viewpoint. If I can reasonably expect the results to be beneficial to the great majority, and I don't have the means to apply the decision on an individual level, I might be justified in making that decision for everyone. Public decision-making often works that way, even today. Also, in my interpretation the change is reversible on an individual basis; a few billion dropouts won't matter for the bigger objective.
Control is serving the reapers, because they again get off the hook, and Shepard becomes the undead dictator of the cyclically genocidal, maniacal reapers. The reapers are so brilliant and so far beyond our comprehension that, in their infinite wisdom, they have decided to entrust their leadership to an inferior human who can fire guns.
Destroy is serving the reapers once again, albeit for the last time, by destroying the geth, EDI, their technology and interstellar space travel. And maybe even Shepard.
I think by "makes everyone the same" you mean my point that, with synthesis, the races lose their features and their identity. Look at the synthesis ending: everyone now has that new, beautiful glow and techie feel about them. But that's not all.
In a couple of thousand years there will be new races. Maybe they create synthetics, maybe they don't. Who knows? What are we supposed to do: exterminate those new organics, or send our saviors the reapers after them? After all, they are still out there. Our saviors can help us with another genocide, right?
If synthesis really prevents the creation of new synthetics, then how does that work? If we still had free will, then maybe we would create some new synthetics. But no, synthesis is designed to prevent it. Does the space magic destroy free will? It must.
You may call the above whatever you want, but synthesis as an option to solve the synthetics problem is pretty unbelievable, especially because it is a solution to a non-existent problem. We already went into that, didn't we?
As many have tried to explain to you, the consequences are unknown, because the reapers exterminated the races before a singularity could even occur.
To deflect any suspicion, I went as far as writing propaganda into a Wikipedia article many years ago (anonymously, of course), because I knew we would be having this discussion today...
A date given for the singularity is 2045. According to the above article, that prediction had to do with Moore's Law. Even Moore himself didn't believe it. Shepard lives many years past that date, and it didn't happen. From the article on the Technological Singularity:
The term "technological singularity" reflects the idea that such change may happen suddenly, and that it is difficult to predict how such a new world would operate.[22][23] It is unclear whether an intelligence explosion of this kind would be beneficial or harmful, or even an existential threat,[24][25] as the issue has not been dealt with by most artificial general intelligence researchers, although the topic of friendly artificial intelligence is investigated by the Singularity Institute for Artificial Intelligence and the Future of Humanity Institute.[22] Many prominent technologists and academics dispute the plausibility of a technological singularity, including Jeff Hawkins, John Holland, Jaron Lanier, and Gordon Moore, whose Moore's Law is often cited in support of the concept.[26][27]
Edited by AngryFrozenWater, 03 June 2012 - 12:31.




