Like I said, I'm a comparative neurologist (I study the nervous system and consciousness of animals), so my opinion is different from yours. I see no magic life-force that makes information processing in a biological brain fundamentally different from information processing in a synthetic one, and thus I see no fundamental barrier to creating sentience OR sapience "synthetically".
However, I do see a technical barrier to doing so. The brain of a fruit fly is more complex than anything that can be simulated today. And for decades, artificial intelligence researchers went about it completely wrong by ignoring how brains actually work. I've seen a couple of recent publications from people who are probably more on the right track, but creating an artificial intelligence by 2050, as many people expect, is probably a pipe dream.
Thanks for the reply though. This topic, and people's views of it, fascinates me, as I suspect it may become a major hot topic in the next generation and maybe, just maybe, within my lifetime.
Is the very nature of that complexity itself a "magic life-force" concept? Not by fiat, of course, but as a possibility.
You recognize that a fruit fly is more complex than anything we can simulate, and that 2050 is more than likely a pipe dream - but what if it is genuinely impossible? That is a very real possibility, regardless of the wistful dreams of people who are fascinated by AI.
On a philosophical level - if we were not created (by some Creator force), then our sapience sprang out of the muck and mire by chance. That alone makes it intrinsically superior to anything we will design (whereas, if we were created, the Creator force must be superior to us in order for us to be superior to our creation). I understand the belief that once something is designed to do so, it will simply "take over" from there and do the same thing we do - I disagree.
Any created AI will be more limited than its creator. I believe there would be diminishing returns on creations: flawed creatures would only create increasingly flawed creatures.
There is no reason to believe that a machine AI would think, at all, like a biological brain - and yet, those are the representations we invent in fiction. We anthropomorphize machinery, and we've done it for a LONG time: golems, Pygmalion, Talos, Frankenstein's monster.