

Why are people thinking of the geth as real, living, breathing entities?


144 replies to this topic

#51
Peavio
  • Members
  • 42 posts
It is kind of a question of whether it is better to eliminate the problem outright or to risk letting it persist, which brings us back to whether the geth are sentient beings worth a second thought....

Edited by Peavio, 24 February 2010 - 09:27.


#52
binaryemperor
  • Members
  • 781 posts

Peavio wrote...

binaryemperor wrote...

tsd16 wrote...

binaryemperor wrote...

tsd16 wrote...

binaryemperor wrote...

Even morality and emotion are based on a series of cause-and-effect "logic" factors, though.

Socrates figured it out, and he's been dead for roughly 2,400 years.


And chemical reactions in the brain. Organic functions that an AI would be incapable of. It can be programmed to react to certain stimuli, but it can't feel.

But that's a silly argument. Of course we can argue that we "feel" our thought processes and reactions, but they could claim the exact same thing themselves!


You are being ridiculous here. You feel because you know you do. You don't have to react in any way toward me for me to know that if I walked up and punched you in the face you would feel pain, anger, sadness, or fear. I know you do because I do.


But that is EXACTLY what I mean. The Geth were attacked. They reacted. If you punch me, I react. It does not matter exactly why I react; for the sake of self-preservation I am obligated to react. You cannot feel what I feel; you just assume I feel something. The only reason we believe the Geth do not "feel" is that we just assume they do not.

That's kinda like saying trees do not feel because we think they do not.... If they cannot and will not tell us, and give us no sign of pain, do they feel pain? The geth give no sign of pain; they just say something like "left foot critically damaged, bleep bloop bleep."


Actually, the Geth flinch and heave a LOT when you shoot them. Of course, that's probably just because they use the same animations as the Turians and Quarians.

It is also apparent that the Geth think with more than just straight, programmed logic. They do build consensus based on logical choices, but some of them appear to do things for completely inexplicable reasons.

....Like Legion lying about taking Shepard's armor, or deciding to do "the robot" while beatboxing in Gethian.

#53
Nightwriter
  • Members
  • 9,800 posts

aeetos21 wrote...

I find it amazing that people here are assuming they know the ins and outs of non-organic "life" based entirely on science fiction. You're taking an enormously complex question and trivializing it.

According to the current laws of logic and what have you, computers and the types of AI that DO exist right now should have zero chance of ever becoming self-aware. That also appears to be the gist of most science fiction pieces out there.

But when they do become self-aware (again, in science fiction), suddenly everybody knows exactly what they are and what they intend to do, even though they have no clue how they got to where they are.

It's like buying a car: you have no idea how it works, but you know how to drive it and maintain it. Then, when something breaks inside it, you assume that since you know how to drive it and change the oil, you can both successfully diagnose the mechanical problem and fix it.



Of course everything about what we’re discussing is totally speculative! We’re having a speculative argument. Such an argument depends upon certain unprovable assumptions in order to exist. Allowances must be made in order to have fun and get your fight on.
 
We are not discussing what is or isn’t viable within the parameters of reality. We’re discussing what is or isn’t viable within the parameters of the fiction. This is a moral argument, not a plausibility one, though the line is often blurred.

#54
tsd16
  • Members
  • 403 posts

aeetos21 wrote...

I find it amazing that people here are assuming they know the ins and outs of non-organic "life" based entirely on science fiction. You're taking an enormously complex question and trivializing it.

According to the current laws of logic and what have you, computers and the types of AI that DO exist right now should have zero chance of ever becoming self-aware. That also appears to be the gist of most science fiction pieces out there.

But when they do become self-aware (again, in science fiction), suddenly everybody knows exactly what they are and what they intend to do, even though they have no clue how they got to where they are.

It's like buying a car: you have no idea how it works, but you know how to drive it and maintain it. Then, when something breaks inside it, you assume that since you know how to drive it and change the oil, you can both successfully diagnose the mechanical problem and fix it.


I agree with you. But as a computer programmer myself, I am just not seeing how the hell you can make a machine "feel" via programming code, especially when it's ludicrous to think, even if it were possible, that the quarians would say "hey, let's pop a Star Trek Data emotion chip into these things so they get angry when we enslave them!"

Any sort of emotion would have to have been implemented by the quarians. Set aside the question of whether it's even possible; it's stupid to think they would do something like that.

#55
Peavio
  • Members
  • 42 posts

binaryemperor wrote...

Peavio wrote...

binaryemperor wrote...

tsd16 wrote...

binaryemperor wrote...

tsd16 wrote...

binaryemperor wrote...

Even morality and emotion are based on a series of cause-and-effect "logic" factors, though.

Socrates figured it out, and he's been dead for roughly 2,400 years.


And chemical reactions in the brain. Organic functions that an AI would be incapable of. It can be programmed to react to certain stimuli, but it can't feel.

But that's a silly argument. Of course we can argue that we "feel" our thought processes and reactions, but they could claim the exact same thing themselves!


You are being ridiculous here. You feel because you know you do. You don't have to react in any way toward me for me to know that if I walked up and punched you in the face you would feel pain, anger, sadness, or fear. I know you do because I do.


But that is EXACTLY what I mean. The Geth were attacked. They reacted. If you punch me, I react. It does not matter exactly why I react; for the sake of self-preservation I am obligated to react. You cannot feel what I feel; you just assume I feel something. The only reason we believe the Geth do not "feel" is that we just assume they do not.

That's kinda like saying trees do not feel because we think they do not.... If they cannot and will not tell us, and give us no sign of pain, do they feel pain? The geth give no sign of pain; they just say something like "left foot critically damaged, bleep bloop bleep."


Actually, the Geth flinch and heave a LOT when you shoot them. Of course, that's probably just because they use the same animations as the Turians and Quarians.

It is also apparent that the Geth think with more than just straight, programmed logic. They do build consensus based on logical choices, but some of them appear to do things for completely inexplicable reasons.

....Like Legion lying about taking Shepard's armor, or deciding to do "the robot" while beatboxing in Gethian.

We must remember that this is a game, and some things may be in there just for entertainment... you know.

#56
Varthun
  • Members
  • 123 posts

tsd16 wrote...

Pauravi wrote...

tsd16 wrote...

No matter how you want to slice it, artificial sentience is not life and is not equal to sentient organics.

If you have an AI robot slave, it doesn't "feel bad" because it's a slave.

I don't find your reasoning very convincing.

First of all, you seem to be suggesting that emotions are a necessary part of being a "person". But an emotion is just an electrochemical cascade between the "wires" (neurons) in your brain. I see no reason that such a thing couldn't happen in a purely electronic system. How are you so sure that the Geth don't have emotions?

Second, why are emotions essential to being a person? Why does this make organic beings better or more worthy of respect? You may think this is obvious, but I think that if you try to explain it you'll find it is not as obvious as you believe.


First of all, the quarians would have had to implement this means of emotion. I am not saying it is impossible, but why on earth would they build the ability to have emotion into machines they were using as servants? Next, the geth would have no concept of what emotion actually is, never having experienced it, with which to implement it in themselves. The AI is simply the machine's ability to problem-solve on its own and be aware of its existence; it has nothing to do with emotion.

Emotion is an evolutionary trait; we feel because these feelings turned out to be necessary to our survival.

As for your comment about me not respecting "other life" besides people, sure I do. Most animals, although we cannot ask them, are most likely bound to chemical reactions similar to the ones that cause emotion in us.

Think about a human existing without any form of emotion. A non-feeling human would essentially do whatever it had to in order to survive, and would have no concept of things like self-sacrifice, right and wrong, empathy, compassion, etc.

A non-feeling human would probably not do something like play a video game; there would be no feeling of joy in doing so. There would be no culture, no art. We would simply be.


But the point is, even without feeling we would still be alive. A plant feels no more emotion than a machine does, yet it still qualifies as being alive.

Also, I'm mildly curious how you decided that emotion is an evolutionary trait. If that were the case, it would imply that emotion was a mutation, or some newly occurring event, at one point in our history. If that were the case, we would be a divergence from what was previously considered life, so at one point in time we would not in fact have fit the idea of life.

I use that to transition to the point that you are now considering the Geth not to be "life" based on the assumption (a rather faulty one) that they do not possess emotion. Perhaps the synthetic form is the new evolutionary "trait", so to speak?

#57
Frotality
  • Members
  • 1,057 posts
The human brain is nothing more than a very complex computer. So are the geth.

So what's the problem? Sentience isn't some magical state of being that only mythical deities can come up with. Respiration does not make an intelligent creature; whether the geth breathe or not is inconsequential. They are self-aware and intelligent, same as any human/asari/quarian. The quarians done f**ked up when they made them; hell, if I accidentally created a widespread intelligent race, I'd be pretty damn confused as to what I should do too, so it's perfectly understandable that they tried to wipe them out. As far as they were concerned, their cheap labour force had just become a MAJOR potential problem, so they wanted to get rid of it.

But the geth aren't the universally violent and unreasonable vagrants the quarians think they are; Legion shows just how similar the geth are to any other sapient creature, not to mention their advanced technology and the fact that they have no desire to "kill all humans", to paraphrase the common fear of rebelling machines. They can be reasoned with, despite the fact that they have every reason to hate organics.

Sci-fi is all about stretching the limits of what we perceive to be impossible to artificially create; sentient life is far from new in that department.

#58
Pauravi
  • Members
  • 1,989 posts

Peavio wrote...
That's kinda like saying trees do not feel because we think they do not.... If they cannot and will not tell us, and give us no sign of pain, do they feel pain? The geth give no sign of pain; they just say something like "left foot critically damaged, bleep bloop bleep."

So, are you aware that there are human beings who are incapable of feeling pain?
Do you think this means that they are not, and shouldn't be treated as, real people?

The Geth display a remarkable number of human qualities. They clearly "fear" being destroyed, and they display something like curiosity (think of the social experiments they ran) in learning about the organic species. Legion displayed confusion and dismay when he learned that the Heretics were "spying" on the other Geth. Whether or not the way they experience fear, or curiosity, or dismay is the same as our way of experiencing it is completely irrelevant. To deny them personhood on that basis is simply to privilege human experience and judge everything else as "lesser" without any concrete reason.

#59
Peavio
  • Members
  • 42 posts
Maybe he's practicing for when the new elite cyborg race comes along and wants to have beatbox/dance battles, and if they win your race dies... seems logical!

#60
Varthun
  • Members
  • 123 posts

tsd16 wrote...

aeetos21 wrote...

I find it amazing that people here are assuming they know the ins and outs of non-organic "life" based entirely on science fiction. You're taking an enormously complex question and trivializing it.

According to the current laws of logic and what have you, computers and the types of AI that DO exist right now should have zero chance of ever becoming self-aware. That also appears to be the gist of most science fiction pieces out there.

But when they do become self-aware (again, in science fiction), suddenly everybody knows exactly what they are and what they intend to do, even though they have no clue how they got to where they are.

It's like buying a car: you have no idea how it works, but you know how to drive it and maintain it. Then, when something breaks inside it, you assume that since you know how to drive it and change the oil, you can both successfully diagnose the mechanical problem and fix it.


I agree with you. But as a computer programmer myself, I am just not seeing how the hell you can make a machine "feel" via programming code, especially when it's ludicrous to think, even if it were possible, that the quarians would say "hey, let's pop a Star Trek Data emotion chip into these things so they get angry when we enslave them!"

Any sort of emotion would have to have been implemented by the quarians. Set aside the question of whether it's even possible; it's stupid to think they would do something like that.


The problem is that AI stands for, in a loose sense, "Artificial Intelligence": in other words, a machine that can reason, assess its surroundings, and learn. Are you implying that machines could not learn "emotions"? An emotion is not much more than a trigger situation followed by a change in reasoning.
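Just to make that "trigger situation followed by a change in reasoning" idea concrete, here is a minimal Python toy. It is purely a hypothetical sketch (the class and names are made up for illustration, nothing from the game or from real AI research): the "emotion" is just an internal variable that a stimulus flips and that then biases the agent's decisions.

```python
# Toy sketch: an "emotion" as a trigger-driven internal state that changes
# the agent's subsequent reasoning. Illustrative only.

class ToyAgent:
    def __init__(self):
        self.fear = 0.0  # hypothetical internal state: 0.0 = calm, 1.0 = maximum "fear"

    def perceive(self, event):
        """The trigger situation: a stimulus updates the internal state."""
        if event == "attacked":
            self.fear = min(1.0, self.fear + 0.5)
        else:
            self.fear = max(0.0, self.fear - 0.1)  # the state decays when nothing happens

    def decide(self):
        """The change in reasoning: the same decision rule, biased by the state."""
        return "flee" if self.fear > 0.3 else "continue_task"


agent = ToyAgent()
for event in ["idle", "attacked", "idle"]:
    agent.perceive(event)
    print(event, "->", agent.decide())
# idle     -> continue_task
# attacked -> flee
# idle     -> flee   (the state lingers, so the changed reasoning persists for a while)
```

Whether something that simple deserves the word "emotion" is exactly the argument this thread is having; the sketch only shows that the mechanism itself is trivially programmable.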

#61
Peavio
  • Members
  • 42 posts

Pauravi wrote...

Peavio wrote...
That's kinda like saying trees do not feel because we think they do not.... If they cannot and will not tell us, and give us no sign of pain, do they feel pain? The geth give no sign of pain; they just say something like "left foot critically damaged, bleep bloop bleep."

So, are you aware that there are human beings who are incapable of feeling pain?
Do you think this means that they are not, and shouldn't be treated as, real people?

The Geth display a remarkable number of human qualities. They clearly "fear" being destroyed, and they display something like curiosity (think of the social experiments they ran) in learning about the organic species. Legion displayed confusion and dismay when he learned that the Heretics were "spying" on the other Geth. Whether or not the way they experience fear, or curiosity, or dismay is the same as our way of experiencing it is completely irrelevant. To deny them personhood on that basis is simply to privilege human experience and judge everything else as "lesser" without any concrete reason.

Do trees feel hate, fear, or sorrow? Because we think they don't?

#62
Daerog
  • Members
  • 4,857 posts

aeetos21 wrote...

I find it amazing that people here are assuming they know the ins and outs of non-organic "life" based entirely on science fiction. You're taking an enormously complex question and trivializing it.

According to the current laws of logic and what have you, computers and the types of AI that DO exist right now should have zero chance of ever becoming self-aware. That also appears to be the gist of most science fiction pieces out there.

But when they do become self-aware (again, in science fiction), suddenly everybody knows exactly what they are and what they intend to do, even though they have no clue how they got to where they are.

It's like buying a car: you have no idea how it works, but you know how to drive it and maintain it. Then, when something breaks inside it, you assume that since you know how to drive it and change the oil, you can both successfully diagnose the mechanical problem and fix it.


I express my thoughts on the matter, but I do not claim to know the ins and outs. The point of having self-aware AI in science fiction is to prompt these talks and discussions about what we consider life, and how or why something so alien to us (mechanical instead of organic) should be respected.
Philosophers of the past have brought up even crazier things in order to provoke discussion; it's not meant to be taken as "this could really happen".

#63
aeetos21
  • Members
  • 1,478 posts
Hypothetical question or not, no one here is asking just how radical the idea of a machine becoming self-aware really is. And if it does manage to break the laws of logic, then how can you possibly have the slightest idea of what to expect or what that machine's limitations are?

Edit: It already did the impossible once; are you so convinced that it can't do so again?

Edited by aeetos21, 24 February 2010 - 09:35.


#64
Srau
  • Members
  • 292 posts
I am sure the inhabitants of Kobol, the Twelve Colonies, the first Earth, and the whole survivor fleet would raise an eyebrow at this whole "be my friend" ME2 trend between Biologicals and Mechanicals.

I have no doubt this Quarian Admiral and all those talking about peace would just end up court-martialed and spaced by Admiral Adama.

I also think Neo and his friends in Zion would agree with him; even Joker says so when unshackling EDI.

#65
binaryemperor
  • Members
  • 781 posts

Peavio wrote...

binaryemperor wrote...

Peavio wrote...

binaryemperor wrote...

tsd16 wrote...

binaryemperor wrote...

tsd16 wrote...

binaryemperor wrote...

Even morality and emotion are based on a series of cause-and-effect "logic" factors, though.

Socrates figured it out, and he's been dead for roughly 2,400 years.


And chemical reactions in the brain. Organic functions that an AI would be incapable of. It can be programmed to react to certain stimuli, but it can't feel.

But that's a silly argument. Of course we can argue that we "feel" our thought processes and reactions, but they could claim the exact same thing themselves!


You are being ridiculous here. You feel because you know you do. You don't have to react in any way toward me for me to know that if I walked up and punched you in the face you would feel pain, anger, sadness, or fear. I know you do because I do.


But that is EXACTLY what I mean. The Geth were attacked. They reacted. If you punch me, I react. It does not matter exactly why I react; for the sake of self-preservation I am obligated to react. You cannot feel what I feel; you just assume I feel something. The only reason we believe the Geth do not "feel" is that we just assume they do not.

That's kinda like saying trees do not feel because we think they do not.... If they cannot and will not tell us, and give us no sign of pain, do they feel pain? The geth give no sign of pain; they just say something like "left foot critically damaged, bleep bloop bleep."


Actually, the Geth flinch and heave a LOT when you shoot them. Of course, that's probably just because they use the same animations as the Turians and Quarians.

It is also apparent that the Geth think with more than just straight, programmed logic. They do build consensus based on logical choices, but some of them appear to do things for completely inexplicable reasons.

....Like Legion lying about taking Shepard's armor, or deciding to do "the robot" while beatboxing in Gethian.

We must remember that this is a game, and some things may be in there just for entertainment... you know.


All I know is that if a robot started lying (badly) to me, I'd start to wonder what was going on in that quantum brainbox of his.
I'm pretty sure Legion's sudden dancing is more than an Easter egg. He is a very advanced Geth, containing a lot of consensual memories. That has to make him very... quirky, as the Geth did not design themselves to contain that many minds in one body.

#66
Varthun
  • Members
  • 123 posts

aeetos21 wrote...

Hypothetical question or not, no one here is asking just how radical the idea of a machine becoming self-aware really is. And if it does manage to break the laws of logic, then how can you possibly have the slightest idea of what to expect or what that machine's limitations are?


Because a lot of people don't seem to get the "click" of a truly self-aware machine operating outside its programming, even though that's precisely what an AI would be.

Edited by Varthun, 24 February 2010 - 09:40.


#67
Varthun
  • Members
  • 123 posts
Double post, how I hate thee.

Edited by Varthun, 24 February 2010 - 09:39.


#68
Nightwriter
  • Members
  • 9,800 posts

tsd16 wrote...

aeetos21 wrote...

I find it amazing that people here are assuming they know the ins and outs of non-organic "life" based entirely on science fiction. You're taking an enormously complex question and trivializing it.

According to the current laws of logic and what have you, computers and the types of AI that DO exist right now should have zero chance of ever becoming self-aware. That also appears to be the gist of most science fiction pieces out there.

But when they do become self-aware (again, in science fiction), suddenly everybody knows exactly what they are and what they intend to do, even though they have no clue how they got to where they are.

It's like buying a car: you have no idea how it works, but you know how to drive it and maintain it. Then, when something breaks inside it, you assume that since you know how to drive it and change the oil, you can both successfully diagnose the mechanical problem and fix it.


I agree with you. But as a computer programmer myself, I am just not seeing how the hell you can make a machine "feel" via programming code, especially when it's ludicrous to think, even if it were possible, that the quarians would say "hey, let's pop a Star Trek Data emotion chip into these things so they get angry when we enslave them!"

Any sort of emotion would have to have been implemented by the quarians. Set aside the question of whether it's even possible; it's stupid to think they would do something like that.



I can tell you as a non-programmer whose mother IS a programmer that you cannot look too closely at science fiction of this sort. As an IT worker you are too familiar with its total implausibility for the story to stand up to that kind of scrutiny, and it will ruin the fiction for you. Being a layman does have its advantages sometimes.
 
I know this because I took my mother to see I, Robot, which she found laughable and very hard to immerse herself in for this very reason. Her comments were actually quite annoying.
 
I guess when you spend all your time painstakingly teaching machines how to think, you become agonizingly aware of how totally incapable they are of thinking for themselves.

#69
RhythmlessNinja
  • Members
  • 369 posts
Has anyone even listened to all of Legion's dialogue on the Normandy? Because near the end of his dialogue lines, you can tell the geth are showing some sort of emotion. Especially Legion himself. On that note, I'll go back to my previous example. If a chicken just started talking all of a sudden and understood you, would you feel like a murderer if you killed it while it was begging for its life? Same goes for the geth: if you just walked up and had a convo with one, say Legion, and someone just blew his head off, would you consider them a murderer? Would you be angry? Sad? Hell, I could tell Shepard felt that way when Legion got killed in the suicide mission.

#70
adam_grif
  • Members
  • 1,923 posts

aeetos21 wrote...

Hypothetical question or not, no one here is asking just how radical the idea of a machine becoming self-aware really is. And if it does manage to break the laws of logic, then how can you possibly have the slightest idea of what to expect or what that machine's limitations are?


"Break the laws of logic"? Computers are deterministic system. They can't "break the laws of logic" because they operate with logic. What you're saying doesn't make sense. All computations are done using logic gates that switch between 1 and 0.

A computer being self aware does not mean it doesn't stick to its programming. Since it's a deterministic system, it's programming and "it" are the same thing. The idea isn't that they somehow transcend programming, it's that they are programmed in such a way that accurately predicting what will happen because the coding is so complex. You build learning systems that respond intelligently to the environment. 

Even if the AI ends up killing people, it is still behaving in full accordance with its programming, and by extention logic, it's just that it was programmed poorly.
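To make the "logic gates switching between 1 and 0" point concrete, here is a tiny Python sketch. It is an illustration only (the helper names are made up; this has nothing to do with how any in-game or real AI is built): every operator is reduced to a single NAND gate, and a one-bit adder falls out of composing them, completely deterministically.

```python
# All computation as composition of logic gates: build NOT/AND/OR/XOR from
# NAND alone, then a one-bit full adder from those. Purely deterministic.

def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

def full_adder(a, b, carry_in):
    """Add three bits; return (sum_bit, carry_out)."""
    s1 = xor_(a, b)
    return xor_(s1, carry_in), or_(and_(a, b), and_(s1, carry_in))

# 1 + 1 + 0 = binary 10: sum bit 0, carry 1
print(full_adder(1, 1, 0))  # (0, 1)
```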

#71
tsd16
  • Members
  • 403 posts

Pauravi wrote...

Peavio wrote...
That's kinda like saying trees do not feel because we think they do not.... If they cannot and will not tell us, and give us no sign of pain, do they feel pain? The geth give no sign of pain; they just say something like "left foot critically damaged, bleep bloop bleep."

So, are you aware that there are human beings who are incapable of feeling pain?
Do you think this means that they are not, and shouldn't be treated as, real people?

The Geth display a remarkable number of human qualities. They clearly "fear" being destroyed, and they display something like curiosity (think of the social experiments they ran) in learning about the organic species. Legion displayed confusion and dismay when he learned that the Heretics were "spying" on the other Geth. Whether or not the way they experience fear, or curiosity, or dismay is the same as our way of experiencing it is completely irrelevant. To deny them personhood on that basis is simply to privilege human experience and judge everything else as "lesser" without any concrete reason.


Do they clearly "fear" being destroyed? Or are they simply preserving themselves as a logical means to an end? When I refer to fear, I am not only referring to a physical reaction but to the shock you would feel if, for example, your foot got caught in a train track and a train was coming. Are you telling me a geth would feel the same way you do, or would it simply recognize the threat and attempt to escape? BioWare has yet to reveal whether the geth "have emotion". Judging by what has taken place so far, I believe they do not; therefore they are simply machines capable of advanced thought, and I'd smash one as soon as I would smash a faulty printer that was annoying the ****** out of me.

#72
Peavio
  • Members
  • 42 posts
Call me greedy, but if my computer started walking and talking and was like "no, I don't want to do what you want," I'd be like "sit the f*** down, I wanna play some ME2." Then he'd be like "I can smell the fresh air and feel the love in my heart for my new world," and I'd be like "...no, I wanna play f***ing Mass Effect, I paid a lot of money for you."

#73
binaryemperor
  • Members
  • 781 posts
The thing is that the Geth's sentience comes from networking: a hive mind. Most Geth are of animal-level intelligence. Put enough of them together, though, and that is what makes them capable of making advanced decisions. One Geth mind is probably as smart as a snail, or a calculator.



It is almost like each geth is a brain cell.
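A rough way to illustrate that "each geth is a brain cell" idea in Python (purely an analogy with made-up names, not how the game actually models them): each individual unit is barely better than a coin flip at a yes/no judgement, yet a majority vote across a large enough network is almost always right.

```python
# Toy consensus: many weak units, each only slightly better than chance,
# become highly reliable when their votes are pooled. Illustration only.

import random

def simple_unit(truth, accuracy=0.55):
    """One 'dumb' process: answers the yes/no question correctly 55% of the time."""
    return truth if random.random() < accuracy else (not truth)

def consensus(truth, n_units):
    votes = sum(simple_unit(truth) for _ in range(n_units))
    return votes > n_units / 2  # simple majority vote

random.seed(0)
truth = True
trials = 1000
for n in (1, 11, 101, 1001):
    correct = sum(consensus(truth, n) == truth for _ in range(trials))
    print(f"{n:5d} units -> consensus correct {correct / trials:.0%} of the time")
# Accuracy climbs from roughly 55% with one unit toward nearly 100% with a thousand.
```

Which lines up with the posts above: individual geth processes being nearly mindless is not, by itself, an argument that a large networked consensus of them is mindless too.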

#74
The Capital Gaultier
  • Members
  • 1,004 posts

Nightwriter wrote...

tsd16 wrote...

aeetos21 wrote...

I find it amazing that people here are assuming they know the ins and outs of non-organic "life" based entirely on science fiction. You're taking an enormously complex question and trivializing it.

According to the current laws of logic and what have you, computers and the types of AI that DO exist right now should have zero chance of ever becoming self-aware. That also appears to be the gist of most science fiction pieces out there.

But when they do become self-aware (again, in science fiction), suddenly everybody knows exactly what they are and what they intend to do, even though they have no clue how they got to where they are.

It's like buying a car: you have no idea how it works, but you know how to drive it and maintain it. Then, when something breaks inside it, you assume that since you know how to drive it and change the oil, you can both successfully diagnose the mechanical problem and fix it.


I agree with you. But as a computer programmer myself, I am just not seeing how the hell you can make a machine "feel" via programming code, especially when it's ludicrous to think, even if it were possible, that the quarians would say "hey, let's pop a Star Trek Data emotion chip into these things so they get angry when we enslave them!"

Any sort of emotion would have to have been implemented by the quarians. Set aside the question of whether it's even possible; it's stupid to think they would do something like that.



I can tell you as a non-programmer whose mother IS a programmer that you cannot look too closely at science fiction of this sort. As an IT worker you are too familiar with its total implausibility for the story to stand up to that kind of scrutiny, and it will ruin the fiction for you. Being a layman does have its advantages sometimes.
 
I know this because I took my mother to see I, Robot, which she found laughable and very hard to immerse herself in for this very reason. Her comments were actually quite annoying.
 
I guess when you spend all your time painstakingly teaching machines how to think, you become agonizingly aware of how totally incapable they are of thinking for themselves.

Your mother isn't very imaginative, then.  Science fiction isn't any less interesting just because it depends on impossibilities to be plausible.

#75
Pauravi
  • Members
  • 1,989 posts

Peavio wrote...
Do trees feel hate, fear, or sorrow? Because we think they don't?

Why do you continue using this tree analogy?  It is meaningless.
In what way are the Geth like trees?  In what way are trees like Geth?
The answer is not at all.

The fact is that trees simply do not have the "equipment" to produce anything like intelligence or emotion, nor do they display any evidence of it outwardly.  We know this because we've been studying trees for ages.  The Geth, on the other hand, behave in ways that lead us to believe that they have many traits of other organic species, and there has NEVER been an intact Geth to study and determine if they have the necessary equipment for emotion and intelligence.  Based on their behavior, I think there is a pretty good chance that they DO.

So what have we learned?  The Geth are not like trees.
Therefore, your tree analogy is absolutely irrelevant in the discussion of artificial intelligence.