Let's take a step back.
You said, regarding an A.I., that it would not necessarily need to be self-aware in the same way a human is; rather, what matters is the output of its algorithms.
Now, I'm not trying to claim anything new here. All I'm saying is this: we know the brain today to be a highly complex biological machine, governed by certain chemicals. A lack of or an excess in one of these chemicals could cause major problems, sometimes even changing one's perception of self.
Now, knowing that what we have in our heads is, in some sense, a biological computer that uses chemicals and electricity to manage data, why should it be so dissimilar from the other computers we know of?
Fun fact: telling people that their most cherished emotions are merely the result of a chemical reaction or imbalance in their brain usually makes them uncomfortable. Do you think a person's sense of self would survive intact, in most cases, if they were completely aware of the chemical reactions or the algorithms that cause them to think of themselves as "I"?
Thinking of oneself as "I" may simply be the most efficient way to go about things. That doesn't necessarily mean it's as accurate as you think it is,
or that you are not merely "feeling" the output of several algorithms in your brain that tell you to look out for the interests of that "I".
I'm not trying to claim anything, merely raising questions based on known data.
I'm certainly not trying to claim that there's an invisible old man in the sky; I'm already a believer in the Flying Spaghetti Monster.
Not sure how to interpret that last statement. Are you just joking? Or are you saying that you believe in God, just not as an invisible old man in the sky, and that the reference to the Spaghetti Monster is your way of admitting that this belief is not the most rational stance, given the circumstances?
If the latter, then you admit that rational arguments will only take you so far. So I wonder how open you really are to rational arguments on other subjects, which bears on whether it's worth continuing the discussion.
With your points about what's known about the brain, you're also moving into territory to which whole textbooks in the philosophy of consciousness are dedicated. I'm not inclined to go through all of that here; I don't have time for it. So I will just say the following.
In this area of philosophy, there are two main competing theories about the relation between brain and consciousness: the materialist approach and the dualist alternative. The former says they're one and the same; the latter, that they're separate things. So if the materialists are correct, then when we die, the brain dies and our consciousness with it. But if the dualists are correct, then there's a chance that when the brain dies, our consciousness can still survive - and this may be the origin of our belief in the existence of a soul (a belief so prevalent in our religions).
I should perhaps mention here that dualism in the traditional sense - the Cartesian version, which says the brain is something material and consciousness something immaterial - creates problems regarding the exchange of information between the two. But dualism need not be interpreted that way. It can simply stipulate that the brain is some material X and consciousness some material Y, with Y being of a kind we haven't yet discovered in our scientific development - unless it's actually of the type we call "dark matter" or "dark energy", which would certainly have some explanatory value. The relation between brain and consciousness could then, in principle, be likened to the relation between a sponge and the water it is soaked with.
Now, how should the facts about the brain be interpreted from the perspective of each theory? And the other relevant facts, like people genuinely believing they have had "out-of-body" experiences? I leave that up to you. A true philosopher does not rule out any possibility about anything until logically forced to.
As for the problems involved, I can offer an analogy: if a working radio suddenly goes dead and we know nothing more, where does the fault lie - on the signal-receiving end (the brain) or the signal-transmitting end (consciousness)? If the dualists are correct, then the causal relation between chemicals in the brain and emotions etc. may have to be reconsidered.
That's it for me in this thread. If PM'd a specific question, I might make another comment or two. But now I gotta go. Long journey tomorrow.
Good luck with your philosophy about souls, folks. 
Cheers,
D