
Destroy is NOT genocide.


1304 replies to this topic

#176
Iconoclaste
  • Members
  • 1 469 messages

dreman9999 wrote...

Iconoclaste wrote...
So, having no emotions means that nothing drives the "thinking", is that correct?

No, we would have instincts left to drive us. Emotion is another way to drive us, as is intelligence.
Emotion is only part of what drives us.

That is unrelated to the subject: the sentence I was questioning was "The emotions drive the thinking"; you are saying that "instinct" could replace "emotions" in this "thinking drive" function. You are basically promoting "instinct" to the same definition as "emotions". Is that correct?

Edited by Iconoclaste, 08 October 2012 - 05:22.


#177
AngryFrozenWater
  • Members
  • 9 071 messages

Iconoclaste wrote...

AngryFrozenWater wrote...

Emotions are nothing to be proud of and they are nothing special. Emotions in humans are not only learned and triggered by impressions; they are also influenced by hormones and neurotransmitters such as dopamine, noradrenaline, serotonin, oxytocin and cortisol. They control our thinking and behavior. Emotions are usually very functional. Aggression is such a handy emotion: it readies the body for an upcoming fight. Adrenaline (epinephrine) is responsible for that. Whether we like it or not, in that way we are nothing more than biochemical computers. Emotions just drive our thinking.

So, having no emotions means that nothing drives the "thinking", is that correct?

Yes. In some recent AI research (over the last decade), progress has been made by introducing emotion-like triggers into artificial thought.
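
To make that concrete, here is a toy sketch in Python of what an "emotion-like trigger" could look like. Everything below is invented purely for illustration, not taken from any specific paper: internal drive signals bias which action the agent picks, i.e. the "emotions" steer the "thinking".

class EmotiveAgent:
    # Toy agent: drive levels in [0, 1] act like crude "emotions".
    # Drive names, decay rates and thresholds are made up for the example.
    def __init__(self):
        self.drives = {"fear": 0.0, "curiosity": 0.5, "fatigue": 0.0}

    def perceive(self, threat_level, novelty):
        # Stimuli push the drives up; old values decay toward baseline.
        self.drives["fear"] = min(1.0, 0.7 * self.drives["fear"] + threat_level)
        self.drives["curiosity"] = min(1.0, 0.9 * self.drives["curiosity"] + novelty)
        self.drives["fatigue"] = min(1.0, self.drives["fatigue"] + 0.05)

    def choose_action(self):
        # The "thinking" (action selection) is weighted by the current drives.
        scores = {
            "flee": self.drives["fear"],
            "explore": self.drives["curiosity"] * (1.0 - self.drives["fear"]),
            "rest": self.drives["fatigue"],
        }
        return max(scores, key=scores.get)

agent = EmotiveAgent()
agent.perceive(threat_level=0.8, novelty=0.1)
print(agent.choose_action())  # prints "flee": fear dominates the choice

Strip out the drives and choose_action has nothing left to rank the options with; that is the sense in which the "emotions" drive the "thinking".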

#178
Laotar
  • Members
  • 244 messages

The Mad Hanar wrote...

ghost9191 wrote...

The Mad Hanar wrote...

So if you accidentally run over your dog Pete and then buy a new dog named Pete, does that mean you didn't kill Pete?


No, but if you break your toaster and go buy a new one, it is still a toaster. Dogs are organics, Hanar. Geth are synthetics; they are different. :devil:


I don't think your toaster can think, talk or form opinions.


Have you seen Battlestar Galactica? Dude, it is time to watch those toasters.

They are plotting our demise right now. :bandit:

Edited by Laotar, 08 October 2012 - 05:24.


#179
Guest_Logan Cloud_*
  • Guests
I remember a time when I liked BSN.

#180
Iconoclaste
  • Members
  • 1 469 messages

AngryFrozenWater wrote...

Iconoclaste wrote...

AngryFrozenWater wrote...

Emotions are nothing to be proud of and they are nothing special. Emotions in humans are not only learned and triggered by impressions; they are also influenced by hormones and neurotransmitters such as dopamine, noradrenaline, serotonin, oxytocin and cortisol. They control our thinking and behavior. Emotions are usually very functional. Aggression is such a handy emotion: it readies the body for an upcoming fight. Adrenaline (epinephrine) is responsible for that. Whether we like it or not, in that way we are nothing more than biochemical computers. Emotions just drive our thinking.

So, having no emotions means that nothing drives the "thinking", is that correct?

Yes. In some recent AI research (over the last decade), progress has been made by introducing emotion-like triggers into artificial thought.

I don't think you got my question right. Let me ask it differently:

Is the act of "thinking" an abstract "action" or a "reaction to stimuli"? In other words, what would an AI "think" about if nobody interacts with it?

You put an AI in the middle of the desert. What will it "think" about?

Edited by Iconoclaste, 08 October 2012 - 05:28.


#181
Guest_The Mad Hanar_*
  • Guests

ghost9191 wrote...

The Mad Hanar wrote...

ghost9191 wrote...

The Mad Hanar wrote...

So if you accidentally run over your dog Pete and then buy a new dog named Pete, does that mean you didn't kill Pete?


No, but if you break your toaster and go buy a new one, it is still a toaster. Dogs are organics, Hanar. Geth are synthetics; they are different. :devil:


I don't think your toaster can think, talk or form opinions.


Mine in Fallout can :D


Touché.

#182
ghost9191
  • Members
  • 2 287 messages
@ The Mad Hanar

I was joking, btw. I personally think of the geth as alive (as much as a video game character can be, mind you), but I still pick Destroy, and I don't think they should be rebuilt. They proved to be a threat multiple times. Then they side with Shepard, in order to what? To save themselves: they saw they were going to be destroyed, so they switched sides. Self-preservation ftw. Anyway, I just mean they were a threat; who knows what would happen after the Reapers are gone.

The only evidence we have is metagaming Control and Synthesis, but they both leave Reapers alive to put the geth down if they get uppity, and that is if the geth don't surpass the Reapers thanks to the upgrades they got.

Or something like that. Most of that post was addressed to the OP topic.

Gots to stays on topics. :devil:

Edited by ghost9191, 08 October 2012 - 05:31.


#183
Mcfly616
  • Members
  • 8 988 messages
I don't care if it had a soul or not. It's still not genocide.

#184
AresKeith
  • Members
  • 34 128 messages

Mcfly616 wrote...

I don't care if it had a soul or not. It's still not genocide.


Still genocide.

#185
Iconoclaste
  • Members
  • 1 469 messages

Mcfly616 wrote...

I don't care if it had a soul or not. It's still not genocide.

Just for the sake of "language", there need to be some "genes" involved in a "genocide". ;)

#186
Kabooooom
  • Members
  • 3 996 messages

Nope. You can think without emotions. Emotions more or less stand in the way of thinking. Ever tried to think straight while crying?


This is actually incorrect. Patients with emotional impairment due to neurological lesions often exhibit impaired decision making and impaired logical deduction. Evidence from both modern neurology and psychology demonstrates that it is largely our emotions, or "gut instinct", that influence the decisions that we make. This should not be surprising, considering that emotions are largely associated with an ancient area of the brain and are widely shared across the animal kingdom. Here is a good paper that addresses that subject:

http://cercor.oxford...t/10/3/295.full

Edited by Kabooooom, 08 October 2012 - 05:35.


#187
darthnick427
  • Members
  • 3 785 messages

Logan Cloud wrote...

I remember a time when I liked BSN.


I know what ya mean.... What the hell happened?

#188
Eterna
  • Members
  • 7 417 messages
Geth are Synthetic life. It doesn't matter if you like it or not, they are alive, just not in the same sense that Organic life is. And no, you could rebuild the Geth, but you could not recreate the individuals they were without Reaper upgrades.

They have evolved past simple robots. A robot is something like Glyph.

#189
Kabooooom
  • Members
  • 3 996 messages

Mcfly616 wrote...

I don't care if it had a soul or not. It's still not genocide.


Why not? I do not believe that human beings have souls, and yet I fully recognize that wiping out an entire race of conscious individuals equals genocide.

#190
AngryFrozenWater
  • Members
  • 9 071 messages

Iconoclaste wrote...

AngryFrozenWater wrote...

Iconoclaste wrote...

AngryFrozenWater wrote...

Emotions are nothing to be proud of and they are nothing special. Emotions in humans are not only learned and triggered by impressions; they are also influenced by hormones and neurotransmitters such as dopamine, noradrenaline, serotonin, oxytocin and cortisol. They control our thinking and behavior. Emotions are usually very functional. Aggression is such a handy emotion: it readies the body for an upcoming fight. Adrenaline (epinephrine) is responsible for that. Whether we like it or not, in that way we are nothing more than biochemical computers. Emotions just drive our thinking.

So, having no emotions means that nothing drives the "thinking", is that correct?

Yes. In some recent AI research (over the last decade), progress has been made by introducing emotion-like triggers into artificial thought.

I don't think you got my question right. Let me ask it differently:

Is the act of "thinking" an abstract "action" or a "reaction to stimuli"? In other words, what would an AI "think" about if nobody interacts with it?

You put an AI in the middle of the desert. What will it "think" about?

Why do you think an AI should act differently than you? The middle of the desert is most likely a very unhealthy location for any being not adapted to it. I can imagine that such a being does not prefer ("like" in human terms) that situation. Chances are that it tries to go "home".

Edited by AngryFrozenWater, 08 October 2012 - 05:39.


#191
Fixers0
  • Members
  • 4 434 messages
The Geth are not sentient, no.

#192
Mr.House
  • Members
  • 23 338 messages
Genocide: the deliberate and systematic extermination of a national, racial, political, or cultural group.

It's genocide... for the Reapers, and there's nothing wrong with wiping those abominations out of the galaxy. The death of EDI and the geth is a sacrifice, a costly one, but still a sacrifice.

#193
Iconoclaste
  • Members
  • 1 469 messages

Kabooooom wrote...

Nope. You can think without emotions. Emotions more or less stand in the way of thinking. Ever tried to think straight while crying?


This is actually incorrect. Patients with emotional impairment due to neurological lesions often exhibit impaired decision making and impaired logical deduction. Evidence from both modern neurology and psychology demonstrates that it is largely our emotions, or "gut instinct", that influence the decisions that we make. This should not be surprising, considering that emotions are largely associated with an ancient area of the brain and are widely shared across the animal kingdom. Here is a good paper that addresses that subject:

http://cercor.oxford...t/10/3/295.full

I believe you are both addressing different aspects of the same thing.

"Decision making", good or bad, is not "thinking" as such: it is the end point of some of it. For example, one could "think of a solution" and be distracted before reaching any, but even without a "decision" or "conclusion", the "process of thought" still happened. "Delirium" is a form of "thinking"; so is "calculating". Emotions can induce bias in the thinking process, but I don't think (!) they are the "root" of the thinking process.

#194
Kabooooom
  • Members
  • 3 996 messages

Fixers0 wrote...

The Geth are not sentient, no.


lol, what? You do know the definition of sentient, right? I had to point it out to someone else here the other day:

Sentient-

1: responsive to or conscious of sense impressions <sentient beings>
2: aware
3: finely sensitive in perception or feeling

You are sentient, your dog is sentient, and the geth are sentient. How do we know the geth are sentient? Because the story tells us they are.

Edited by Kabooooom, 08 October 2012 - 05:43.


#195
ghost9191
  • Members
  • 2 287 messages
In war there are casualties; the geth were such. In war there are sacrifices. It can be looked at as genocide, or as one man sacrificing soldiers to achieve an objective.

The geth might find Destroy the right choice. With Control, the Reapers are still a threat. Synthesis is something forced, and not their future, so going by ME2 anyway the geth might oppose that.

Sacrifice the geth and remember them as such. Shepard committed genocide before, as well as sacrificing a crew member to achieve the goal, and wasn't hated for that before, well, besides by the batarians. But there won't be any geth left, so yeah, that problem fixed itself.

#196
Iconoclaste
  • Members
  • 1 469 messages

AngryFrozenWater wrote...

(Iconoclaste: You put an AI in the middle of the desert. What will it "think" about?)

Why do you think an AI should act differently than you? The middle of the desert is most likely a very unhealthy location for any being not adapted to it. I can imagine that such a being does not prefer ("like" in human terms) that situation. Chances are that it tries to go "home".

Of course, if you add biased parameters to the question, it changes the sense of it. But still, what would the AI, adapted or not, "think" about with no interaction?

Edited by Iconoclaste, 08 October 2012 - 05:46.


#197
RadicalDisconnect
  • Members
  • 1 895 messages

Mcfly616 wrote...

I don't care if it had a soul or not. It's still not genocide.


Care to explain?

#198
AngryFrozenWater
  • Members
  • 9 071 messages

Iconoclaste wrote...

AngryFrozenWater wrote...
(You put an AI in the middle of the desert. What will it "think" about?)
Why do you think an AI should act differently than you? The middle of the desert is most likely a very unhealthy location for any being not adapted to it. I can imagine that such a being does not prefer ("like" in human terms) that situation. Chances are that it tries to go "home".

Of course, if you add biased parameters to the question, it changes the sense of it. But still, what would the AI, adapted or not, "think" about with no interaction?

It's bloody hot there during the day and freaking cold during the night. There is no maintenance. Fear of non-functionality ("dying" in human terms) takes over. It needs to get the hell out of there. And it will try to do so.
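
Since you keep asking what it would "think" about with nobody around, here is a made-up toy loop in Python (names, numbers and thresholds are invented purely for illustration) showing that self-monitoring alone already gives an idle agent something to "think" about and act on:

def desert_loop(core_temp=70.0, maintenance=1.0, steps=5):
    # No external input at all: the only "thoughts" come from the agent
    # watching its own internal state. All numbers are arbitrary examples.
    thoughts = []
    for _ in range(steps):
        core_temp += 2.0      # the desert heats the hardware up
        maintenance -= 0.1    # wear accumulates with no upkeep
        if core_temp > 75.0:
            thoughts.append("overheating -> seek shade, leave")
        elif maintenance < 0.5:
            thoughts.append("degrading -> find repairs")
        else:
            thoughts.append("nominal -> idle self-check")
    return thoughts

for thought in desert_loop():
    print(thought)

Nobody interacts with it, yet its internal sensors keep generating things to evaluate and goals to pursue.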

Edited by AngryFrozenWater, 08 October 2012 - 05:50.


#199
Eterna
  • Members
  • 7 417 messages

ghost9191 wrote...

@dreman9999


Wait wait wait wait. It has no emotion. Nowhere is it crying due to the loss. It knew of the loss but could not experience true emotion, only emulate it. Same as the Catalyst, which would probably be partly why the Catalyst saw no problem with its solution, unlike us organics.

Face it, Shepalyst is going to turn into an evil, tyrannical evildoer thing with an overwhelming synthetic force at its hands to do its bidding, wiping out anything that gets uppity and forcing everyone to worship it as some kind of AI god.


Holy imagination and headcanon, bro.

But seriously, you have no proof; you're making wild claims with no evidence, and it's silly.

#200
Iconoclaste
  • Members
  • 1 469 messages

AngryFrozenWater wrote...

Iconoclaste wrote...

AngryFrozenWater wrote...
(You put an AI in the middle of the desert. What will it "think" about?)
Why do you think an AI should act differently than you? The middle of the desert is most likely a very unhealthy location for any being not adapted to it. I can imagine that such a being does not prefer ("like" in human terms) that situation. Chances are that it tries to go "home".

Of course, if you add biased parameters to the question, it changes the sense of it. But still, what would the AI, adapted or not, "think" about with no interaction?

It's bloody hot there. There is no maintenance. Fear of non-functionality ("dying" in human terms) takes over. It needs to get the hell out of there. And it will try to do so.

What you are doing is "avoiding the subject". What would an AI in a flat / safe / temperate but deserted area think, without interaction?