Why don't more people choose Control?
#876
Posted December 21, 2012 - 11:59
#877
Posted December 22, 2012 - 12:03
SeptimusMagistos wrote...
Dr_Extrem wrote...
Are we more than our thoughts?
No.
Dr_Extrem wrote...
Thoughts and memories are one thing. Feelings and emotions are induced by hormones and other messengers. We have impulses.
If the brain can be emulated, so can glands. If anything, it's easier.
Dr_Extrem wrote...
We biological units are far more than our thoughts and memories.
I'm assuming we're using the word 'thoughts' to describe all brain activity. Is that right?
From the philosophical point of view, we may not be more than our thoughts. But this is about the interaction of messengers and stored data; this is about hardware and software.
We can discuss this matter further once we know how and where a thought originates.
Human beings as a whole are more than thoughts and memories. We are made of emotions as well, and emotions may be emulated but not copied. Emotions depend on the current state of our brain - hormone levels, lithium levels, and many other things. All of this influences what we feel, and our feelings can differ even on the same matter.
#878
Posted December 22, 2012 - 12:03
Nope, the difference between Legion and the geth VI is data only. The reason the geth VI does not have most of Legion's memory is that he did not upload it. If he had, the geth VI would be Legion.
Rifneno wrote...
dreman9999 wrote...
The geth prove you wrong.
Ticonderoga117 wrote...
jtav wrote...
Thoughts and memories. And in my philosophy of the mind, that means that thing is Shepard whether they are stored in data bytes or neurons.
Except in Mass Effect, that's not how AIs function.
The codex specifically mentions that if you take data (which in our case will be memories) and dump it into a bluebox, the first time you will get personality A. Then say you do it again on a different bluebox. You will most likely not get personality A. Instead, you will most likely get B.
Thus, the chance that the new AI will "be" Shepard is EXTREMELY unlikely.
He/She is dead and gone, and an imposter takes over the Reapers.
No, the geth prove him right. The difference between Legion and the Geth VI is startling despite them being the same "data."
That shows that if the mind is saved, the person is saved; if not, then the person they were at the time is dead.
#879
Posted December 22, 2012 - 12:03
dreman9999 wrote...
Which is the logic of an AI forced to do what it's programmed to do.
It's programmed to preserve life no matter the cost... How it does it was never stated or limited.
That still doesn't change the fact that it killed everyone.
Really, what it tried to do (preserve life) could have been accomplished through recorded data banks, rather than turning everyone into sludge and then encasing their memories (although why you'd need the bodies for the process is beyond me) in giant machine bodies. It obviously didn't care for their well-being.
#880
Posted December 22, 2012 - 12:07
Emotions are based on our hardware - on particular parts of our mind. If those parts are cut out, we lose those emotions. If a person has a lobotomy and is different after the operation but has all the same memories as before, is he the same person as before, or has that person died?
Dr_Extrem wrote...
SeptimusMagistos wrote...
Dr_Extrem wrote...
Are we more than our thoughts?
No.
Dr_Extrem wrote...
Thoughts and memories are one thing. Feelings and emotions are induced by hormones and other messengers. We have impulses.
If the brain can be emulated, so can glands. If anything, it's easier.
Dr_Extrem wrote...
We biological units are far more than our thoughts and memories.
I'm assuming we're using the word 'thoughts' to describe all brain activity. Is that right?
From the philosophical point of view, we may not be more than our thoughts. But this is about the interaction of messengers and stored data; this is about hardware and software.
We can discuss this matter further once we know how and where a thought originates.
Human beings as a whole are more than thoughts and memories. We are made of emotions as well, and emotions may be emulated but not copied. Emotions depend on the current state of our brain - hormone levels, lithium levels, and many other things. All of this influences what we feel, and our feelings can differ even on the same matter.
Ask yourself that first. The difference here is not that Shepard is no longer Shepard but that he is no longer human. He does not process emotions the same way, but he has the same thoughts and memories as before.
Edited by dreman9999, December 22, 2012 - 12:09.
#881
Posted December 22, 2012 - 12:07
iakus wrote...
And the directive of the ShepReaper is to "protect the many"
What would it do to achieve this?
If it's operating under the same directive given to the Catalyst by the Leviathans, then I would imagine whatever it has to do. If you believe that mentality is philosophically wrong, that is fine.
I don't believe it is operating under the same strict directive the Catalyst was. The fact that there are two different monologues (one renegade, one paragon), I believe, supports this.
#882
Posted December 22, 2012 - 12:08
Of course it does not care about our well-being. It only cares about carrying out its programming. It used the bodies to make Reapers.
Someone With Mass wrote...
dreman9999 wrote...
Which is the logic of an AI forced to do what it's programmed to do.
It's programmed to preserve life no matter the cost... How it does it was never stated or limited.
That still doesn't change the fact that it killed everyone.
Really, what it tried to do (preserve life) could have been accomplished through recorded data banks, rather than turning everyone into sludge and then encasing their memories (although why you'd need the bodies for the process is beyond me) in giant machine bodies. It obviously didn't care for their well-being.
#883
Posted December 22, 2012 - 12:10
dreman9999 wrote...
Except for the fact that's not true; they originally got to being an AI by sharing thinking workloads... Then they got the modded Reaper code and were able to be AIs on their own power...
Reapers are the same case....
Still, the Geth don't use blueboxes. For everything else that does, you will not get the same personality with the same data.
#884
Posted December 22, 2012 - 12:11
That just means he is no longer human, not evil. If a fly could suddenly think at the level of a human, is it going to suddenly become evil?
Rifneno wrote...
I'm sure Shepbinger will be good. It's not like it starts talking about how it now understands things in ways that a mere human never could.
#885
Posted December 22, 2012 - 12:11
Olympiclash wrote...
iakus wrote...
And the directive of the ShepReaper is to "protect the many"
What would it do to achieve this?
If it's operating under the same directive given to the Catalyst by the Leviathans, then I would imagine whatever it has to do. If you believe that mentality is philosophically wrong, that is fine.
I don't believe it is operating under the same strict directive the Catalyst was. The fact that there are two different monologues (one renegade, one paragon), I believe, supports this.
Paragon/renegade has generally been about different attitudes toward the same goal. This new monster obviously thinks like a Reaper. It "understands" and "comprehends" what a mere human could not. Chances are, it's going to reach the same conclusions as the last AI. It thought it was doing things for the greater good too.
#886
Posted December 22, 2012 - 12:12
Dr_Extrem wrote...
From the philosophical point of view, we may not be more than our thoughts. But this is about the interaction of messengers and stored data; this is about hardware and software.
Right. Shepard is software. Shepard's brain is hardware. The Citadel is hardware.
Dr_Extrem wrote...
We can discuss this matter further once we know how and where a thought originates.
Since I have no desire to discuss neurobiology or AI research, I'll just say that I'm going with the idea that Shepard's mind is basically software when considering Control. I can see why those who disagree would be more hesitant to choose it.
Dr_Extrem wrote...
Human beings as a whole are more than thoughts and memories. We are made of emotions as well, and emotions may be emulated but not copied. Emotions depend on the current state of our brain - hormone levels, lithium levels, and many other things. All of this influences what we feel, and our feelings can differ even on the same matter.
On the other hand, synthetics in the Mass Effect universe also demonstrate the capacity for emotion, for all that they try to deny it.
So if what you're arguing is that the Shepard AI is composed of the same thoughts and attitudes, but with the possibility of a changed emotional state and a capacity to change and grow in different ways than the Shepard in the organic body - that's absolutely true. I'd still consider this entity to be Shepard, though.
#887
Posted December 22, 2012 - 12:14
But the geth do get the same personality with the same data. The geth show it can be done. It's the lack of data that makes the geth personalities different. Take the geth VI as an example: it has all of Legion's saved data up to when Legion got shot. It's Legion if it had never met Shepard.
Ticonderoga117 wrote...
dreman9999 wrote...
Except for the fact that's not true; they originally got to being an AI by sharing thinking workloads... Then they got the modded Reaper code and were able to be AIs on their own power...
Reapers are the same case....
Still, the Geth don't use blueboxes. For everything else that does, you will not get the same personality with the same data.
#888
Posted December 22, 2012 - 12:14
dreman9999 wrote...
Emotions are based on our hardware - on particular parts of our mind. If those parts are cut out, we lose those emotions. If a person has a lobotomy and is different after the operation but has all the same memories as before, is he the same person as before, or has that person died?
Dr_Extrem wrote...
SeptimusMagistos wrote...
Dr_Extrem wrote...
Are we more than our thoughts?
No.
Dr_Extrem wrote...
Thoughts and memories are one thing. Feelings and emotions are induced by hormones and other messengers. We have impulses.
If the brain can be emulated, so can glands. If anything, it's easier.
Dr_Extrem wrote...
We biological units are far more than our thoughts and memories.
I'm assuming we're using the word 'thoughts' to describe all brain activity. Is that right?
From the philosophical point of view, we may not be more than our thoughts. But this is about the interaction of messengers and stored data; this is about hardware and software.
We can discuss this matter further once we know how and where a thought originates.
Human beings as a whole are more than thoughts and memories. We are made of emotions as well, and emotions may be emulated but not copied. Emotions depend on the current state of our brain - hormone levels, lithium levels, and many other things. All of this influences what we feel, and our feelings can differ even on the same matter.
Ask yourself that first. The difference here is not that Shepard is no longer Shepard but that he is no longer human. He does not process emotions the same way, but he has the same thoughts and memories as before.
Shepard was Shepard because of his/her humanity.
Do you know what a lobotomy does to the "patient"? It is so inhuman that I will not answer or comment on this.
#889
Posted December 22, 2012 - 12:14
Someone With Mass wrote...
That still doesn't change the fact that it killed everyone.
Really, what it tried to do (preserve life) could have been accomplished through recorded data banks, rather than turning everyone into sludge and then encasing their memories (although why you'd need the bodies for the process is beyond me) in giant machine bodies. It obviously didn't care for their well-being.
It did not kill "everyone," as that would go against its directive.
You are absolutely right that it, theoretically, could have been done in another way.
#890
Posted December 22, 2012 - 12:16
Then you're not understanding why the Catalyst came to the conclusion to preserve organics. It was programmed to. It's a shackled AI.
Rifneno wrote...
Olympiclash wrote...
iakus wrote...
And the directive of the ShepReaper is to "protect the many"
What would it do to achieve this?
If it's operating under the same directive given to the Catalyst by the Leviathans, then I would imagine whatever it has to do. If you believe that mentality is philosophically wrong, that is fine.
I don't believe it is operating under the same strict directive the Catalyst was. The fact that there are two different monologues (one renegade, one paragon), I believe, supports this.
Paragon/renegade has generally been about different attitudes toward the same goal. This new monster obviously thinks like a Reaper. It "understands" and "comprehends" what a mere human could not. Chances are, it's going to reach the same conclusions as the last AI. It thought it was doing things for the greater good too.
#891
Posted December 22, 2012 - 12:17
dreman9999 wrote...
Then you're not understanding why the Catalyst came to the conclusion to preserve organics. It was programmed to. It's a shackled AI.
Rifneno wrote...
Paragon/renegade has generally been about different attitudes toward the same goal. This new monster obviously thinks like a Reaper. It "understands" and "comprehends" what a mere human could not. Chances are, it's going to reach the same conclusions as the last AI. It thought it was doing things for the greater good too.
Funny, I was just thinking the same about you.
#892
Posted December 22, 2012 - 12:18
Olympiclash wrote...
iakus wrote...
And the directive of the ShepReaper is to "protect the many"
What would it do to achieve this?
If it's operating under the same directive given to the Catalyst by the Leviathans, then I would imagine whatever it has to do. If you believe that mentality is philosophically wrong, that is fine.
I don't believe it is operating under the same strict directive the Catalyst was. The fact that there are two different monologues (one renegade, one paragon), I believe, supports this.
I believe that the variations in the speech derive from paragon/renegade priorities the AI has developed through Shepard's memories. But the new Shepard AI is still running on the same systems that the Catalyst had. The Shepard AI is simply following its programming; it has no connection to the people it "protects" outside of the purpose they give it.
#893
Posted December 22, 2012 - 12:19
SeptimusMagistos wrote...
Dr_Extrem wrote...
From the philosophical point of view, we may not be more than our thoughts. But this is about the interaction of messengers and stored data; this is about hardware and software.
Right. Shepard is software. Shepard's brain is hardware. The Citadel is hardware.
Dr_Extrem wrote...
We can discuss this matter further once we know how and where a thought originates.
Since I have no desire to discuss neurobiology or AI research, I'll just say that I'm going with the idea that Shepard's mind is basically software when considering Control. I can see why those who disagree would be more hesitant to choose it.
Dr_Extrem wrote...
Human beings as a whole are more than thoughts and memories. We are made of emotions as well, and emotions may be emulated but not copied. Emotions depend on the current state of our brain - hormone levels, lithium levels, and many other things. All of this influences what we feel, and our feelings can differ even on the same matter.
On the other hand, synthetics in the Mass Effect universe also demonstrate the capacity for emotion, for all that they try to deny it.
So if what you're arguing is that the Shepard AI is composed of the same thoughts and attitudes, but with the possibility of a changed emotional state and a capacity to change and grow in different ways than the Shepard in the organic body - that's absolutely true. I'd still consider this entity to be Shepard, though.
It is fundamentally different hardware. I consider the Shep-AI to be the remaining thoughts and memories - but disconnected from the emotions a human can feel.
You can consider whatever you want... but if the Shep-AI starts to speak of its former form as of a different person, I tend to believe that this new entity is no longer our Shepard. It may identify with its old form - but it is definitely something new.
#894
Posted December 22, 2012 - 12:21
Dr_Extrem wrote...
SeptimusMagistos wrote...
Dr_Extrem wrote...
From the philosophical point of view, we may not be more than our thoughts. But this is about the interaction of messengers and stored data; this is about hardware and software.
Right. Shepard is software. Shepard's brain is hardware. The Citadel is hardware.
Dr_Extrem wrote...
We can discuss this matter further once we know how and where a thought originates.
Since I have no desire to discuss neurobiology or AI research, I'll just say that I'm going with the idea that Shepard's mind is basically software when considering Control. I can see why those who disagree would be more hesitant to choose it.
Dr_Extrem wrote...
Human beings as a whole are more than thoughts and memories. We are made of emotions as well, and emotions may be emulated but not copied. Emotions depend on the current state of our brain - hormone levels, lithium levels, and many other things. All of this influences what we feel, and our feelings can differ even on the same matter.
On the other hand, synthetics in the Mass Effect universe also demonstrate the capacity for emotion, for all that they try to deny it.
So if what you're arguing is that the Shepard AI is composed of the same thoughts and attitudes, but with the possibility of a changed emotional state and a capacity to change and grow in different ways than the Shepard in the organic body - that's absolutely true. I'd still consider this entity to be Shepard, though.
It is fundamentally different hardware. I consider the Shep-AI to be the remaining thoughts and memories - but disconnected from the emotions a human can feel.
You can consider whatever you want... but if the Shep-AI starts to speak of its former form as of a different person, I tend to believe that this new entity is no longer our Shepard. It may identify with its old form - but it is definitely something new.
I like to think of it as one more example of autodialogue. They never asked me how Shepard should feel about Thessia, and they certainly didn't ask me how he should feel about transhumanism and mind uploads.
#895
Posted December 22, 2012 - 12:22
dreman9999 wrote...
But the geth do get the same personality with the same data. The geth show it can be done. It's the lack of data that makes the geth personalities different. Take the geth VI as an example: it has all of Legion's saved data up to when Legion got shot. It's Legion if it had never met Shepard.
No, Legion had a "personality" because of the ton of processes that were running on the same platform. The VI, obviously, does not, because it's a VI. You disproved your assertion right there! Of course the VI isn't like Legion, it's a VI!
#896
Posted December 22, 2012 - 12:22
SeptimusMagistos wrote...
I like to think of it as one more example of autodialogue. They never asked me how Shepard should feel about Thessia, and they certainly didn't ask me how he should feel about transhumanism and mind uploads.
You mean "I like to headcanon it as". Because autodialogue doesn't specifically state that Shepard is dead.
#897
Posted December 22, 2012 - 12:23
Shepard was Shepard because of his/her personality, not humanity. That's like saying all humans act the same no matter what. It's our personalities that make us different.
Dr_Extrem wrote...
dreman9999 wrote...
Emotions are based on our hardware - on particular parts of our mind. If those parts are cut out, we lose those emotions. If a person has a lobotomy and is different after the operation but has all the same memories as before, is he the same person as before, or has that person died?
Dr_Extrem wrote...
SeptimusMagistos wrote...
Dr_Extrem wrote...
Are we more than our thoughts?
No.
Dr_Extrem wrote...
Thoughts and memories are one thing. Feelings and emotions are induced by hormones and other messengers. We have impulses.
If the brain can be emulated, so can glands. If anything, it's easier.
Dr_Extrem wrote...
We biological units are far more than our thoughts and memories.
I'm assuming we're using the word 'thoughts' to describe all brain activity. Is that right?
From the philosophical point of view, we may not be more than our thoughts. But this is about the interaction of messengers and stored data; this is about hardware and software.
We can discuss this matter further once we know how and where a thought originates.
Human beings as a whole are more than thoughts and memories. We are made of emotions as well, and emotions may be emulated but not copied. Emotions depend on the current state of our brain - hormone levels, lithium levels, and many other things. All of this influences what we feel, and our feelings can differ even on the same matter.
Ask yourself that first. The difference here is not that Shepard is no longer Shepard but that he is no longer human. He does not process emotions the same way, but he has the same thoughts and memories as before.
Shepard was Shepard because of his/her humanity.
Do you know what a lobotomy does to the "patient"? It is so inhuman that I will not answer or comment on this.
As for the lobotomy question, this is an issue you have to deal with as a matter of self. A person is their persona. Understanding how that works is the way to understand how it can be developed.
To illustrate my point: http://www.youtube.c...QmG57VwZs#t=84s
This mission gives you an option like that to save the heretic geth.
#898
Posted December 22, 2012 - 12:24
You're not getting how the geth work. It has nothing to do with processors. Geth are software; it has nothing to do with hardware. They are just data.
Ticonderoga117 wrote...
dreman9999 wrote...
But the geth do get the same personality with the same data. The geth show it can be done. It's the lack of data that makes the geth personalities different. Take the geth VI as an example: it has all of Legion's saved data up to when Legion got shot. It's Legion if it had never met Shepard.
No, Legion had a "personality" because of the ton of processes that were running on the same platform. The VI, obviously, does not, because it's a VI. You disproved your assertion right there! Of course the VI isn't like Legion, it's a VI!
Also, the geth VI has Legion's mannerisms. That shows they share the same basis.
Edited by dreman9999, December 22, 2012 - 12:25.
#899
Posted December 22, 2012 - 12:26
#900
Posted December 22, 2012 - 12:27
That still just means Shepard is no longer human and is still alive as an AI.
Dr_Extrem wrote...
SeptimusMagistos wrote...
Dr_Extrem wrote...
From the philosophical point of view, we may not be more than our thoughts. But this is about the interaction of messengers and stored data; this is about hardware and software.
Right. Shepard is software. Shepard's brain is hardware. The Citadel is hardware.
Dr_Extrem wrote...
We can discuss this matter further once we know how and where a thought originates.
Since I have no desire to discuss neurobiology or AI research, I'll just say that I'm going with the idea that Shepard's mind is basically software when considering Control. I can see why those who disagree would be more hesitant to choose it.
Dr_Extrem wrote...
Human beings as a whole are more than thoughts and memories. We are made of emotions as well, and emotions may be emulated but not copied. Emotions depend on the current state of our brain - hormone levels, lithium levels, and many other things. All of this influences what we feel, and our feelings can differ even on the same matter.
On the other hand, synthetics in the Mass Effect universe also demonstrate the capacity for emotion, for all that they try to deny it.
So if what you're arguing is that the Shepard AI is composed of the same thoughts and attitudes, but with the possibility of a changed emotional state and a capacity to change and grow in different ways than the Shepard in the organic body - that's absolutely true. I'd still consider this entity to be Shepard, though.
It is fundamentally different hardware. I consider the Shep-AI to be the remaining thoughts and memories - but disconnected from the emotions a human can feel.
You can consider whatever you want... but if the Shep-AI starts to speak of its former form as of a different person, I tend to believe that this new entity is no longer our Shepard. It may identify with its old form - but it is definitely something new.




