Something that has been bugging me about EDI
#101
Guest_Maviarab_*
Posted 20 March 2010 - 11:28
And no, I doubt they will; there are too many political and legal ramifications with it when it eventually arrives and realises humanity is taking the ****** out of it lol...
But they could, as you suggested, make EDI in a way that would confirm her status as a proper AI or just another clever program.
#102
Guest_Maviarab_*
Posted 20 March 2010 - 11:36
Edited by Maviarab, 20 March 2010 - 11:39.
#103
Posted 20 March 2010 - 11:50
Maviarab wrote...
Was just thinking about my last post: there is no real need to confirm anything, as she is referred to as an AI numerous times, so as far as Bioware is concerned in the ME universe she 'is' an AI lol, but to me, with what I know of AIs, she doesn't 'strike' me as being one... I think that better explains it hehe.
Bah, Bioware's opinion was never what this thread was about. And the Tali thread is at over 3,500 now :blink:. Speaking of computerized companions...
*Discreetly applies anti-flame spray*
#104
Posted 20 March 2010 - 11:53
jklinders wrote...
Maviarab wrote...
Was just thinking about my last post: there is no real need to confirm anything, as she is referred to as an AI numerous times, so as far as Bioware is concerned in the ME universe she 'is' an AI lol, but to me, with what I know of AIs, she doesn't 'strike' me as being one... I think that better explains it hehe.
Bah, Bioware's opinion was never what this thread was about. And the Tali thread is at over 3,500 now :blink:. Speaking of computerized companions...
*Discreetly applies anti-flame spray*
Forget Mass Effect 3. All Bioware has to do is make a half-inspired Tali-themed VI Clippy and sell it at an inflated price. Everyone on their staff could retire to private islands.
#105
Guest_Maviarab_*
Posted 20 March 2010 - 11:59
#106
Posted 21 March 2010 - 12:08
Mcjon01 wrote...
jklinders wrote...
Maviarab wrote...
Was just thinking about my last post: there is no real need to confirm anything, as she is referred to as an AI numerous times, so as far as Bioware is concerned in the ME universe she 'is' an AI lol, but to me, with what I know of AIs, she doesn't 'strike' me as being one... I think that better explains it hehe.
Bah, Bioware's opinion was never what this thread was about. And the Tali thread is at over 3,500 now :blink:. Speaking of computerized companions...
*Discreetly applies anti-flame spray*
Forget Mass Effect 3. All Bioware has to do is make a half-inspired Tali-themed VI Clippy and sell it at an inflated price. Everyone on their staff could retire to private islands.
Bioware, do not look at the above, humanity is at stake! *large person in white coat drags me off while injecting a sedative* Humanity is aaatt st- *yawns* *loud snoring*
#107
Guest_Maviarab_*
Posted 21 March 2010 - 12:12
#108
Posted 21 March 2010 - 01:54
That's how scripting in computers works: garbage in, garbage out. If I got a bad result on that damn spreadsheet I needed to go back in there and correct it myself. There was no magic button to find my mistake and correct it.
What in the name of Beelzebub does this have to do with the topic? Computer program adaptation. Probably best not to bore me with details about coding (hit me in the head with a book of computer code and I could still not tell one string of code from another), but could the ability to learn the truth of your environment after having been given false information not be a good test of whether a machine consciousness is in fact AI? You know, some idiot tells me it is warm out as a joke and when I step outside it's freezing; I don't go around saying it's warm because I was told it was.
Could the ability to correctly assess your environment after having been programmed falsely be part of our answer?
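As a rough illustration of the kind of test being described here, a minimal sketch in Python; everything in it (the function name, the trust weight, the temperature readings) is invented for illustration and is not anything from the game or the thread:

# An agent is 'programmed' with a false belief about its environment and has to
# revise it from its own observations. All names and numbers are hypothetical.

def update_belief(prior_warm, sensed_temps, threshold=15.0, trust=0.3):
    """Blend a programmed prior ('it is warm out') with sensed evidence.

    prior_warm   -- probability the agent was told it is warm (possibly false)
    sensed_temps -- temperature readings the agent gathers itself, in Celsius
    threshold    -- readings at or above this count as 'warm'
    trust        -- weight kept on the original programming at each step
    """
    belief = prior_warm
    for temp in sensed_temps:
        evidence = 1.0 if temp >= threshold else 0.0
        # Each observation pulls the belief toward what the agent actually senses.
        belief = trust * belief + (1.0 - trust) * evidence
    return belief

# Programmed to believe it is warm (0.95), but every reading says it is freezing.
final = update_belief(prior_warm=0.95, sensed_temps=[-2.0, -1.5, -3.0, 0.5])
print(f"belief that it is warm outside: {final:.3f}")  # ends up well below 0.5

The only point of the toy is that the agent's final belief is driven by what it senses rather than by what it was initially told; whether that alone is enough to call something an AI is exactly the question being debated.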
#109
Guest_Maviarab_*
Posted 21 March 2010 - 01:54
#110
Posted 21 March 2010 - 02:36
Maviarab wrote...
Was just thinking about my last post: there is no real need to confirm anything, as she is referred to as an AI numerous times, so as far as Bioware is concerned in the ME universe she 'is' an AI lol, but to me, with what I know of AIs, she doesn't 'strike' me as being one... I think that better explains it hehe.
That's kinda the beauty of the entire discussion: it's all theory. In a way it's philosophy. We're all trying to seek a truth from our own perspective. One person's truth is no more valid than another's.
Now here's a theory I want to throw out there. I think I've already stated somewhere that I believe an AI would basically be a person that is a machine. Could you consider an AI's programming to be a form of psychological conditioning? Could it be that what an AI thinks is its program is actually a form of positive/negative stimulus treatment to encourage or discourage certain behaviours?
#111
Posted 21 March 2010 - 02:42
Self-determination does not exist. It has no true meaning. Everything is limited by something.
#112
Guest_Maviarab_*
Posted 21 March 2010 - 02:47
And yes, programming (in the 'initial' sense) would be a form of conditioning and rules (like stealing is bad, etc.). I would like to think that an AI, given all parameters and facts at its disposal, would be able to 'change' said conditioning and rules depending on its values, beliefs and evidence.
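As a rough sketch of what 'conditioning that can be revised' could look like in code, again a purely hypothetical Python toy; the class name, feedback values and learning rate are invented for illustration and are not anything BioWare describes:

# A behaviour starts with a hard-coded rule weight (the 'initial' conditioning);
# positive/negative feedback then nudges that weight, so the rule can drift if
# new evidence keeps contradicting it. All names and numbers are hypothetical.

class ConditionedRule:
    def __init__(self, behaviour, initial_weight, learning_rate=0.1):
        self.behaviour = behaviour      # e.g. "steal"
        self.weight = initial_weight    # below 0 means discouraged, above 0 encouraged
        self.learning_rate = learning_rate

    def reinforce(self, feedback):
        """Shift the rule weight toward the feedback (+1 reward, -1 punishment)."""
        self.weight += self.learning_rate * (feedback - self.weight)

rule = ConditionedRule("steal", initial_weight=-1.0)  # initial conditioning: stealing is bad
for feedback in [-1, -1, +1, -1, -1]:                 # mostly negative stimuli
    rule.reinforce(feedback)
print(rule.behaviour, round(rule.weight, 3))          # stays firmly negative

Here the hard-coded rule is only a starting point; repeated positive or negative feedback (i.e. new evidence) is what keeps it in place or moves it, which is roughly the 'conditioning that can be changed' idea above.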
#113
Guest_Maviarab_*
Posted 21 March 2010 - 02:50
Emotions are simply chemical reactions in the brain. They have little to nothing to do with this debate.
Without trying to purposely flame you, you obviously have no real knowledge of the field and study of AI, my friend.
Your statement has everything to do with this debate, as emotions are one of the major contributing factors to what makes us what we are.
#114
Posted 21 March 2010 - 02:53
Maviarab wrote...
Without trying to purposely flame you, you obviously have no real knowledge of the field and study of AI, my friend. Emotions are simply chemical reactions in the brain. They have little to nothing to do with this debate.
Your statement has everything to do with this debate, as emotions are one of the major contributing factors to what makes us what we are.
They're a huge factor in evolution and in humans, yes; but they have nothing to do with intelligence in and of itself, nor with AIs.
#115
Guest_Maviarab_*
Posted 21 March 2010 - 02:59
Please read the first sentence of my last post again.
It has everything to do with it, because if humans had no emotions, I ask you, what would we be? (Bear in mind I already know the answer; I would like you to think about it for a while.)
Edited by Maviarab, 21 March 2010 - 03:08.
#116
Posted 21 March 2010 - 03:03
Maviarab wrote...
Schroing...
Please read the first sentence of my last post again.
It has everything to do with it, because if humans had no emotions, I ask you, what would we be? (Bear in mind I already know the answer; I would like you to think about it for a while.)
I would be a human with some sort of enzyme deficiency, I assume.
#117
Guest_Maviarab_*
Posted 21 March 2010 - 03:10
No, you would be a machine.
And that's exactly the point. Without emotions, we would be nothing more than a very complex machine capable of thought.
I at times believe my cats have emotions, but not really in the same sense that we as humans have them. It's these very emotions that we have and display that are a very large part of what makes us 'human'; take them away and we would be nothing more than husks, bodies... machines.
#118
Posted 21 March 2010 - 03:11
Maviarab wrote...
No, you would be a machine.
And that's exactly the point. Without emotions, we would be nothing more than a very complex machine capable of thought.
I at times believe my cats have emotions, but not really in the same sense that we as humans have them. It's these very emotions that we have and display that are a very large part of what makes us 'human'; take them away and we would be nothing more than husks, bodies... machines.
I'm a machine already. Emotions are just another mechanism.
Edited by Schroing, 21 March 2010 - 03:13.
#119
Guest_Maviarab_*
Posted 21 March 2010 - 03:17
Define machine, please?
Once you have, you will realise that we (humans) are actually not machines (though we could technically be classed as one)...
And that is one of the ultimate goals of AI: for a 'machine' to have emotions. Again, I'm not going to argue with you; do some research and study into the AI field, please.
As for the original topic, another reason why (imo) EDI, despite what we are told in the game by Bioware, does not strike me as an AI: she never really shows 'emotion'.
#120
Posted 21 March 2010 - 03:24
Maviarab wrote...
Define machine, please?
Once you have, you will realise that we (humans) are actually not machines (though we could technically be classed as one)...
And that is one of the ultimate goals of AI: for a 'machine' to have emotions. Again, I'm not going to argue with you; do some research and study into the AI field, please.
As for the original topic, another reason why (imo) EDI, despite what we are told in the game by Bioware, does not strike me as an AI: she never really shows 'emotion'.
...A machine is simply something that performs a task or multiple tasks. The human body is made up of various complex and simple machines, and is itself a complex machine.
Emotions, again, are simply chemical reactions in the brain. Your cat has them, yes, your dog has them, you have them. They're evolutionary tools meant to help us survive. They're certainly not unique to humans, nor an entirely necessary part of their being.
If...uh...AI creators are truly making such a big deal over them, then it's either out of a misunderstanding of what both intelligence and emotions truly are, or it's being done in the name of something separate (like the manufacture of an artificial human analogue, for example) from the sole goal of creating an AI: creating a legitimate intelligence.
Edited by Schroing, 21 March 2010 - 03:25.
#121
Guest_Maviarab_*
Posted 21 March 2010 - 03:29
A machine is simply something that performs a task or multiple tasks. The human body is made up of various complex and simple machines, and is itself a complex machine.
Which is why I said we can 'technically' be classed as machines, though we are not, as a machine does not have emotions (and the fact that emotions are chemical reactions is really irrelevant in this argument).
If...uh...AI creators are truly making such a big deal over them, then it's either out of a misunderstanding of what both intelligence and emotions truly are, or it's being done in the name of something separate from the sole goal of creating an AI: creating a legitimate intelligence.
Outstanding. You have shown your serious lack of knowledge, as well as your lack of understanding of what AI is, in the same section. Truly awesome. How did you actually manage to do that? lmao. So you think the 'experts' have no idea what they are talking about then? Maybe you should go and educate them, as you seem to have a better grasp of it than people who have been researching this for years. Also, a legitimate intelligence is a large part of what an AI would be... doh...
You're arguing with me for the sake of it lmao, when you already understand; you just do not seem to grasp that you understand, or even understand yourself why in fact you do understand (if that made any sense).
#122
Guest_Maviarab_*
Posted 21 March 2010 - 03:33
Also, in your opinion then, what makes us human?
#123
Posted 21 March 2010 - 03:33
Schroing wrote...
Emotions are simply chemical reactions in the brain.
Which in turn become electrical impulses that the brain interprets. A computer has to have a power source, meaning electricity is needed. I can't discount what you're saying, but there is still a lot we have yet to discover about ourselves to understand what makes us, us. With that being said, I cannot completely eliminate the idea of an AI having emotions just because it is a computer.
Maviarab wrote...
And yes, programming (in the 'initial' sense) would be a form of conditioning and rules (like stealing is bad, etc.). I would like to think that an AI, given all parameters and facts at its disposal, would be able to 'change' said conditioning and rules depending on its values, beliefs and evidence.
Now what if we throw in nature vs. nurture? What would be your spin on that?
#124
Posted 21 March 2010 - 03:36
You're not getting this, are you? A lot of it is semantics, btw...
You can't have something big without a lot of less big things to make it up. If your understanding of something basic in the field of philosophy or mathematics is flawed, you're never going to truly understand the higher principles, either.
Which is why I said we can 'technically' be classed as machines, though we are not, as a machine does not have emotions (and the fact that emotions are chemical reactions is really irrelevant in this argument). A machine is simply something that performs a task or multiple tasks. The human body is made up of various complex and simple machines, and is itself a complex machine.
Machines are machines independent of emotions, and it's entirely relevant. You're giving emotions far more credit philosophically than they deserve.
Outstanding. You have shown your serious lack of knowledge, as well as your lack of understanding of what AI is, in the same section. Truly awesome. How did you actually manage to do that? lmao. So you think the 'experts' have no idea what they are talking about then? Maybe you should go and educate them, as you seem to have a better grasp of it than people who have been researching this for years. If...uh...AI creators are truly making such a big deal over them, then it's either out of a misunderstanding of what both intelligence and emotions truly are, or it's being done in the name of something separate from the sole goal of creating an AI: creating a legitimate intelligence.
You're arguing with me for the sake of it lmao, when you already understand; you just do not seem to grasp that you understand, or even understand yourself why in fact you do understand (if that made any sense).
Proof by assertion is fun, idn't it?
Also, a legitimate intelligence is a large part of what an AI would be... doh...
Yes, it is. It's actually the -entire- part of what an AI would be. It has nothing to do with emotions.
#125
Posted 21 March 2010 - 03:37
Symbol117 wrote...
Schroing wrote...
Emotions are simply chemical reactions in the brain.
Which in turn become electrical impulses that the brain interprets. A computer has to have a power source, meaning electricity is needed. I can't discount what you're saying, but there is still a lot we have yet to discover about ourselves to understand what makes us, us. With that being said, I cannot completely eliminate the idea of an AI having emotions just because it is a computer.
I think we could certainly program them to have some sort of an emotional analogue, but that really wasn't my point - emotions are ultimately irrelevant to the question.