How long until ControlShep goes nuts?


200 replies to this topic

#1
DeinonSlayer
  • Members
  • 8,441 posts
Old poll, but worthy of discussion:

Link

Think for a moment about how long eternity is. The Shepalyst will still be around long after everyone (s)he knew and loved has died. Long after the human race has evolved into something unrecognizable or died out. Long after the sun enters its red giant phase and char-broils the Earth. Nothing will remain of the world (s)he knew. Will there be anyone to talk to? Any connection?

How long until the madness sets in? And what would the Shepalyst do to stave off eternal boredom?

Discuss!

^_^

#2
AlexMBrennan
  • Members
  • 7,002 posts
Never - what you say only applies to humans but the AI entity based on Shepard is obviously not human. Empirically, I'd suggest you look at Godchild - whilst he has strange ideas and morality (mostly valuing potential life over extant life), he still seems quite sane.

#3
AlanC9
  • Members
  • 35,580 posts
What's the definition of "nuts" here?

#4
IntelligentME3Fanboy
  • Members
  • 1,983 posts

AlexMBrennan wrote...

Never - what you say only applies to humans but the AI entity based on Shepard is obviously not human. Empirically, I'd suggest you look at Godchild - whilst he has strange ideas and morality (mostly valuing potential life over extant life), he still seems quite sane.

Sane? Sanity doesn't apply to synthetics.

#5
Hadeedak
  • Members
  • 3,623 posts
The programming in a computer my parents own, one older than I am, is still good.

I boot it up, it beeps at me, we're friends. I can play Number Crunchers! Now, it's ancient, and if it was a dog, it'd be dead. Twice. But also very senile. The machine, however, keeps on rolling. And this is not some form of awesome computer descended from the heavens. It was pretty much junk then, too.

I'm going to go way out on a limb and say the preservation of core programming might be a bit better in the FUTURE. For one thing, I'd like to think the basic parameters of the Shepard operating system would be archived and checked against periodically.

That being said, we honestly don't know. The most I can say is that Deep Blue and Deep Fritz never went nuts and stopped playing chess. So there's that. But how AI will really function, preserve its identity and learn... Who knows?

#6
Ruadh
  • Members
  • 399 posts
Five.

#7
Samtheman63
  • Members
  • 2,916 posts
Four.

#8
AlanC9
  • Members
  • 35,580 posts
Three.

#9
Guest_Finn the Jakey_*
  • Guests
Two.

#10
The Night Mammoth
  • Members
  • 7,476 posts
Ten.

#11
TheRealJayDee
  • Members
  • 2,950 posts
One. Eleven!

Edited by TheRealJayDee, 26 May 2013 - 12:48.


#12
GreyLycanTrope
  • Members
  • 12,686 posts
Zero.

Edited by Greylycantrope, 26 May 2013 - 12:46.


#13
Hadeedak
  • Members
  • 3,623 posts
Eternity is six. Jeeze. Bunch of noobs.

FOUR! No! Three!

#14
ruggly
  • Members
  • 7,548 posts
minus one

#15
thehomeworld
  • Members
  • 1,562 posts
Shep already shows leanings toward the Reapers' persuasions, or at least Para Shep does (I play Paragon, not Ren Shep). Para Shep shoots TIM because he's nuts and his idea is crazy, not to mention that TIM tried to make the Reapers do as he said, and so did Saren; both proved you can't. He kills TIM saying you're crazy to want control, we can't control them, and we're not ready for such power... only to then say, "I knew I had to be more to get the job done"? No, all you had to do was kill them like you were supposed to.

Shep, immediately after gaining the reins to the Reapers' minds, suddenly goes all max space police, implying by his speech that if you threaten the galaxy in any way, if you try to silence the few or the many, there will be consequences. A huge Reaper force bombing you from space is a big incentive to do as they wish.

We've seen this time and time again in the story. David plus the AI didn't work; he was going insane, and that was with an AI and a man gifted with the ability to work with them. Shep isn't like David in any way. Grayson didn't fare any better. We also have, again, TIM and Saren, who both tried to seize control; it didn't work. Legion couldn't do it. So why would Shep's mental imprint be able to do it? Because it's an impression? It copied Shep's mental state, then immediately went about twisting it into "I'll protect the galaxy by protecting it from itself." The Reapers were already trying this method by killing us so that we couldn't make high-tech robots to kill us. We obviously didn't like that and fought them back.

Para Shep says he'll protect the galaxy so both the strong and the weak can flourish? Has he never heard of survival of the fittest? One side must always devour the weaker so that it has maximum resources; see any species you'd like. Making both ends of the scale flourish means nothing ever advances; it grows stagnant at each end of the spectrum, and those in the middle will not be able to change their place on the scale because it would unbalance it.

At worst, Reaper Shep starts destroying entire civilizations because they refuse to stay in their place on the scale, and Reaper Shep will then have to decide who will be the winner and loser on the scale next.

Shep was barely handling the war; he won't be able to handle being space overseer either without becoming corrupt again.

Edited by thehomeworld, 26 May 2013 - 01:11.


#16
Hadeedak
  • Members
  • 3,623 posts

ruggly wrote...

minus one


Shepard was crazy the whole time!

#17
ruggly
  • Members
  • 7,548 posts

Hadeedak wrote...

ruggly wrote...

minus one


Shepard was crazy the whole time!


Hackett finally lost it as well!

[Posted image: Hackett comic]

#18
Hadeedak
  • Members
  • 3,623 posts
Homeworld: It's not Shepard. It's not a human. It's a machine based on a human's personality.

#19
ruggly
  • Members
  • 7,548 posts

Hadeedak wrote...

Homeworld: It's not Shepard. It's not a human. It's a machine based on a human's personality.


For all we know, Shepard could have turned into a newt...But she got better.

#20
Hadeedak
  • Members
  • 3,623 posts

ruggly wrote...

Hadeedak wrote...

Homeworld: It's not Shepard. It's not a human. It's a machine based on a human's personality.


For all we know, Shepard could have turned into a newt...But she got better.


That Hackett comic made me laugh like a total moron, so thanks for that. Shepard is a newt for theory of BSN 2014, I guess.

#21
ruggly
  • Members
  • 7,548 posts

Hadeedak wrote...

ruggly wrote...

Hadeedak wrote...

Homeworld: It's not Shepard. It's not a human. It's a machine based on a human's personality.


For all we know, Shepard could have turned into a newt...But she got better.


That Hackett comic made me laugh like a total moron, so thanks for that. Shepard is a newt for theory of BSN 2014, I guess.


You're welcome. And there can never be enough theories.

edit: I swear I can grammar.

Edited by ruggly, 26 May 2013 - 02:46.


#22
carrmatt91
  • Members
  • 468 posts
Wouldn't an AI based on a human essentially think like a human, though? Almost like Lore from Star Trek: TNG; he was designed to be as human as possible and went a bit loopy.

#23
Iakus
  • Members
  • 30,253 posts
I imagine it would take a few centuries at least, but yeah...

Doing the exact same thing the Leviathans tried and expecting "this time things will be different" is...optimistic...to say the least...

#24
Indy_S
  • Members
  • 2,092 posts
One of the functions of any sapient mind is finding shortcuts. Just cut out a step in the process. Doing this can theoretically improve efficiency. I'm sure that Shepard, unbound by rigorous programming, would try to cut out a step. That step could be just asking for someone's permission before intervening in a flashpoint situation. And if it worked better once, it could work better again. And eventually, that step is just removed. And if that step could be removed for the better, others could be too.

Eventually, I'd say one of the removed steps is going to cut into others' personal freedoms. And then it will cut deeper and deeper still. It's around this point that I'd call him nuts, even if each step toward it made perfect sense.

#25
AlanC9
  • Members
  • 35,580 posts
But for this to happen, cutting out the freedoms would have to actually produce acceptable results, and keep producing them.

Edited by AlanC9, 26 May 2013 - 03:44.