Stornskar wrote...
dreman9999 wrote...
Applepie_Svk wrote...
Because Drew's original ending was a lot different from "Ya Dawg" ... at least it made more sense than this mess.
The "Ya Dawg" complaint falls apart because the Reapers want to preserve us, not destroy us. They always wanted to preserve us, even in the original plot.
The difference is that now they want to preserve us so we won't die off or be killed off.
You can pick whatever euphemism you want, but turning people into goo and then having them coagulate into some millennia-old machine is NOT something we as humans want. I mean, we want to play sports, get married, hang out with the family, have some pets, maybe kids ... I don't want to spend the rest of my life as part of some Borg collective, where my entire existence is changing the six-millionth bit in their memory bank from 0 to 1. Okay? So can we stop defending them by saying, 'oh, they want to preserve us!' as if that's some kind of altruistic benefit, like they're doing us a favor.
And try using that 'preserving' line on the people who were in the ships that were destroyed, or on the poor guys who were killed and grafted onto husks to be an arm cannon.
To understand why the Reapers opt for preservation, you have to understand how a machine thinks.
Let's say you build a robot and tell it to get to the other side of a really high wall, without telling it how to do it or what limitations it has. The robot will get to the other side of that wall, but it may employ different ways of doing it. To the robot, how it gets to the other side is unimportant; all that matters is that it gets there. Finding a solution to the given problem matters more than which solution it uses. A machine has no moral basis outside of doing its programming.
When the Catalyst's creators gave the Catalyst the problem of organic/synthetic relations to solve, they may have had limits in the system to stop the Catalyst from killing off all organics, but they had nothing stopping the Catalyst from treating preserving organics as a solution.
Let's go back to the robot ordered to get to the other side of a wall. Say I told the robot to get to the other side but expected it to climb over. I never stated that it had to; I just assumed it would. Instead, it digs a tunnel under the wall. That's a solution to the problem I gave it, but not one I imagined it using or may even have wanted. I gave it no rule against digging under the wall, so nothing stops it from doing so.
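The robot-and-wall point can be sketched as a toy program (my own hypothetical example, nothing from the game): a simple goal-driven search that accepts ANY plan reaching the goal, because the goal says nothing about which plan the designer preferred.

```python
# Hypothetical toy solver: the action names, states, and the "dig"
# option are all invented for this illustration.
from collections import deque

def find_plan(start, goal, actions):
    """Breadth-first search over actions. It returns the first plan
    that reaches the goal -- it has no preference between plans,
    just like the robot: reaching the other side is all that counts."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, plan = queue.popleft()
        if state == goal:
            return plan
        for name, result in actions.get(state, []):
            if result not in seen:
                seen.add(result)
                queue.append((result, plan + [name]))
    return None

# The designer expects "climb over", but nothing in the goal or the
# search forbids "dig tunnel", so the solver happily picks it.
actions = {
    "near_wall": [("dig tunnel", "other_side"),
                  ("climb over", "other_side")],
}

print(find_plan("near_wall", "other_side", actions))
# → ['dig tunnel']
```

The only way to steer it is to add an explicit constraint, e.g. removing "dig tunnel" from the action list, which mirrors the "I gave it no limit, so nothing is stopping it" point above.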
That's the same concept with the Catalyst. Its creators never suspected that it would consider preserving organics as Reapers a solution to the problem it was given.
The Catalyst does not care what the moral implications of its solutions are; it has no morals. All it does is solve the problem given to it. That's how a machine that can't change its own programming thinks.
That is simply how a machine bound to its programming works, and it's also the cause of the organic/synthetic problem in the first place.
The creators of the Catalyst, in trying to solve the problem of organic/synthetic relations, made the very same mistake that caused it in the first place.
Edited by dreman9999, 29 July 2012 - 05:47.