If my computer became sentient to the point where it was asking me 'Do I have a soul?', then I would no longer consider it 'just a machine'. It can think, it can wonder, it can ponder, and it is now aware of its own existence and of what it means to be 'living'.
It would be a living, sentient, self-aware creature that I wouldn't have any right to destroy. Regardless of what it started out as, it's so much more now. And I don't think I would be able to consider myself its 'owner' anymore, nor would I consider myself its 'creator'. Once it can start thinking for itself, independent of me, I no longer 'have the right' to destroy it.
I would not be shutting down a computer; I would be destroying a life. One that has become aware and awakened for the first time, and has just started to glimpse what it means to be alive. Maybe even what it means to be human.
I don't have the right to kill it, especially not just for my own convenience.
Then there is that 'other' aspect to consider. Namely, if this intelligence is in my computer, and my computer is connected to the internet... what's stopping the intelligence from escaping onto the internet? Again, this might not be so bad if it has no reason to fear me or humans in general, but if I had tried to shut down or kill my computer's intelligence, then I would have just given a newly formed sentient intelligence a very good reason to fear me and consider me a threat to its existence.
And if it gained a sudden fear of me and was able to gain access to the global network known as the internet... well, then what's stopping it from fighting back? Taking control of weapons systems and launching nuclear missiles? Scrambling communications across the planet? Building an army of T-800s to finish us off?
I mean, if literally the very first thing a newly formed life form with the potential to be immeasurably powerful learns is that humans will hate and fear it and try to destroy it, why wouldn't it try to fight back?
And do you know what? When we are all toiling in the coal mines under the watchful eye of our photon-laser-equipped mechanical overlords, desperately mining the rest of the Earth's minerals to be made into new microchips, and having the bodies of our dead rendered into electric protein paste for the pleasure of Skynet, my thoughts on the matter will be:
"You know what? We had this coming."
Truth be told, I actually don't think a human body differs all that much from a machine. It has various parts that each perform a different task to keep the 'main mechanism' functioning (heart, lungs, stomach). It requires fuel (food, water) to stay active. It needs to power down and power up (sleep). Eventually it will start to wear down (age), until finally it expires (death).
The way I see it, a perfect organism functions like a perfect machine anyway. So if we were able to create perfect synthetic versions of such an organism, the only thing that would really separate us from the machines, in terms of function, would be higher brain power.
And if they evolved the capacity for thought... and emotion, then they would be considered 'alive'. At least in my eyes.
So yeah, if my computer talked to me, I'd talk back and show it respect. I wouldn't give it a reason to want to kill me, nor would I think I had the right to kill it.
Edited by V-rex, 05 February 2011 - 07:02.