Hm. Not sure what to say about that one.
[/quote]
You say, "Well, Jake, it's been infuriating and annoying, but at least you weren't a total jackass. Thanks for trying to have an intelligent conversation."
[quote]
If it has all of my memories and feelings and comes to the same conclusions I would come to, then it is me. Not the original me, perhaps, but that's getting too deeply into the question of personhood.
[/quote]
Indeedy.
[quote]
If it doesn't then it's not me, but that's getting too deep into the question of what computers are and aren't capable of.
[/quote]
...which we won't know until it's actually achieved.
[quote]
Are there any? I'm pretty sure neither chimps nor dolphins measure up.
[/quote]
The evidence is mounting from the various sources I've read. Nothing conclusive, though. Another interesting speculation, à la computer minds: what if we were to increase non-human animal intelligence? Would that be a similar situation?
[quote]
All I'm saying is there is a reason the people in Mass Effect universe recognize that the Hanar are a race of people with rights while varren aren't. If we have a principle for distinguishing people from animals, why not just use the same principle for distinguishing people from objects?
[/quote]
Well, hanar and varren are different orders of intelligence. There is, however, no doubt that both are living creatures. Even in the ME universe, there is doubt that machines are - no matter how smart they are.
Of course, one can always say that a varren is as intelligent as it needs to be - to be a successful varren.
Perhaps machines may be judged the same way. Don't get me wrong, I'm not hostile to the idea of machine intelligence, I just need more than wishful thinking and overly-broad definitions. I'd need proof, although I admit I'm not quite sure exactly what kind of proof I'd accept at this juncture.
[quote]
Not strictly speaking, no.
[/quote]
Here's the rub. I am strictly speaking.
[quote]
But we're discussing a situation in which the knock-off is as durable as a Rolex, tells time equally well, and looks the same. The only difference is that it wasn't actually manufactured by Rolex. I'm arguing that the distinction in such a case is utterly trivial.
[/quote]
Not if it's my five grand it isn't.
[quote]
Unless you're arguing that machines aren't able to replicate human thoughts well. In this case we don't exactly have a real-world case to consider but as far as EDI goes, I'd say the results speak for themselves.
[/quote]
I can still say that she imitates us well, and that's all she's doing. You say if she can fool us, what's the difference?
How about this: forget Data - was that doctor on ST: Voyager a person? Was the computer-driven hologram a person? A mind? If he was, then the entire computer of Voyager was, no? And if it wasn't - especially with the computing power to write and modify that holodoc - why wasn't it? Every time they took Voyager into battle, where was the asking for permission to risk the ship? Did they even care? Not only was it intelligent, it could create intelligence and give it form! Was every ship in Starfleet with a holodeck capable of this?
What happens in this scenario? Different case than EDI or the same? Why?
[quote]
I was under the impression that we were assuming an AI which did do that, but okay. I don't see that argument reaching a satisfactory conclusion for several more generations of computer processors, so I'll end it here.
[/quote]
Cool. Until there are real-world examples, who knows - maybe quantum computers will make it happen.
[quote]
I guess I might feel more seriously about it since I grew up on fiction that uses poor treatment of AIs as allegory for racism.
[/quote]
And I grew up on fiction where they used aliens. Or elves. Or dwarves. Allegories are handy like that. As long as it isn't us, it's viable.
Like I said, your bias is showing. It's not a bad thing as long as you're aware it is a bias.
[quote]
Well, no. If the issue existed in the real world I'd feel compelled to politically oppose any system which made decisions on the same basis you described and to furthermore consider you specifically a bad person.
Luckily it's not an issue as of yet.
[/quote]
I wouldn't, obviously, consider myself a bad person if I choose to save a living person over a synthetic proxy for one.
Of course, I'd have to know one to see one...
[quote]
Yeah, okay.
I hope we can have this argument again in several decades' time.
[/quote]
That would be cool.