
***SPOILER*** The Origin of the Reapers is silly (and a paradox). ***SPOILER***


232 replies to this topic

#176
Sylvanpyxie
  • Members
  • 1,036 posts
I've said it on multiple threads, but I was honestly hoping the Reapers had no real grand purpose for the universe. I wanted to believe they were merely harvesting organics as a means to continue their existence.

I'm lying; I originally wanted to believe they were a machine designed and created by an extremely old race that liked to collect vast amounts of knowledge and store it away for all time.

I wanted the very first Reaper to be some kind of ancient archive, created for the sole purpose of documenting knowledge by absorbing the organic matter of those who were on the brink of death in the Creators' society, so that their knowledge, and all their life experiences, were stored for all time.

Then the Reaper had a programming fault, or the absorbed knowledge somehow corrupted its way of thinking, and it began to absorb the knowledge of those that created it. All the knowledge. Only to discover ways to create new Reapers, to store more knowledge.

To store every single life experience, every belief, every single moment of an entire galactic culture. Only to expand more and hunger for more. Shutting themselves down periodically and only returning when cultures had once again risen up, offering more knowledge for the Reapers' insatiable hunger.

Killer Galactic Library, if you will.

Seriously... Missed opportunity, guys.

Edited by Sylvanpyxie, 05 March 2012 - 02:03.


#177
Treopod
  • Members
  • 81 posts

GracefulChicken wrote...

A "real theory" as I was using it was meant to mean one backed up by science, information, and verifiable proof. Either way, not an argument against my point. If you take into account, like I said, the exponential growth of technology, and extrapolate that out over time, a tech singularity IS inevitable. People say you can't prove the future of anything, but strangely enough the growth of technology and information is very easy to predict, given how much growth has already been made at such a consistent pace (that is, exponentially. SEE BELOW for graphs and ****)

Second, AI does need to happen for a tech singularity. The big parts of the tech singularity theory are summed up by the acronym "G.N.R.": Genetics, Nanotechnology, Robotics. Genetics covers all of neuroscience, meaning mapping the brain. Mix this with robotics, which covers AI engineering right now (let alone how far it will have come by the time ME happens) and a conscious machine (the Geth, for instance), and AI is very much a key part of a tech singularity happening. Technology is an evolutionary process; that's fairly well established, and a pretty overwhelming consensus among scientists in the field.

Third, I'm not saying a tech singularity has to be a bad thing. It is, I believe, an inevitable part of the evolutionary process. In the ME universe it is treated as one, which is the premise I agreed to when I played the game. My own personal belief is that it is inevitable, and a key part of our (humanity's) evolution. Overcoming biology is one of the key parts of our evolution, and a major part of the tech singularity theory.

The evidence of it is all over Mass Effect, in the opening 20 minutes of every game at that. ME1: the Geth, the "R" in GNR. ME2: nanotechnology; we see Shep injected with nanotech in the opening credits, which could easily overlap with genetics, the final part of it. The whole idea of the advance and exponential growth of tech is, as I view it, what drives the entire ME franchise, idea-wise. I could really go on for hours about all the examples in the series so far; it's truly amazing.

I suggest you watch these YouTube clips of Ray Kurzweil, a well-respected inventor, scientist, "futurist," and someone who has predicted many of the accomplishments of technology with alarming accuracy (the first computer to beat a human at chess, correct to the year; the internet as it stands today, within two years; Google's rise; etc.). I'd suggest these two myself to basically sum up the theory as a whole:

 


[Two embedded YouTube clips of Ray Kurzweil on the singularity]


Wrong. Sentient AI is not necessary at all for reaching the tech singularity; it's one of many paths. Organics can achieve it themselves by improving themselves with technology, be it nanotechnology, implants, or gene manipulation.

At most, that would make them cyborgs, but a cyborg is still an organic intelligence and keeps its free will; they remain the same species but enhance their intelligence exponentially.

AIs, on the other hand, like the Geth, are completely synthetic and a different type of existence.

As for the Reapers, I'm still unsure what they are. If they are the uploaded minds of an ancient race, then that would make them some kind of hybrid, because the programming strips their free will; without the programming they would be considered cyborgs.

If the Reapers are just AIs created by a race that died out, then they are just like the Geth but with organic material inside them.

And I have no idea if the Guardian is an AI, or an uploaded collection of consciousnesses from that original race.

Edited by Treopod, 05 March 2012 - 01:35.


#178
karlpopper
  • Members
  • 8 posts
Free will? Is that like magic? Because it sure sounds like it, with natural-law-defying capabilities and all that.

#179
WizenSlinky0
  • Members
  • 3,032 posts
Yes, sentient AI is necessary. The whole point of the singularity is that they fundamentally approach things differently than organics, so much so that we can't possibly comprehend it in terms of our own minds. There is no understanding, because it's impossible for an organic to understand the thought process of a machine, and likewise impossible for an AI to fully understand an organic's thought process. They can build models of likely outcomes but not figure out how they get there.

A cyborg retains the human mind. That is the difference. It's not intelligence that is the problem. It's the approach.

#180
Vaenier
  • Members
  • 2,815 posts

Sylvanpyxie wrote...

I've said it on multiple threads, but I was honestly hoping the Reapers had no real grand purpose for the universe. I wanted to believe they were merely harvesting organics as a means to continue their existence.

I'm lying; I originally wanted to believe they were a machine designed and created by an extremely old race that liked to collect vast amounts of knowledge and store it away for all time.

I wanted the very first Reaper to be some kind of ancient archive, created for the sole purpose of documenting knowledge by absorbing the organic matter of those who were on the brink of death in the Creators' society, so that their knowledge, and all their life experiences, were stored for all time.

Then the Reaper had a programming fault, or the absorbed knowledge somehow corrupted its way of thinking, and it began to absorb the knowledge of those that created it. All the knowledge. Only to discover ways to create new Reapers, to store more knowledge.

To store every single life experience, every belief, every single moment of an entire galactic culture. Only to expand more and hunger for more. Shutting themselves down periodically and only returning when cultures had once again risen up, offering more knowledge for the Reapers' insatiable hunger.

Killer Galactic Library, if you will.

Seriously... Missed opportunity, guys.

That is awesome. Well done.

#181
SovereignWillReturn
  • Members
  • 1,183 posts
I miss the Dark Energy plotline. I liked that one a TON more than this one. A TON more.

Gah. BioWare, you had good ideas at first...

#182
GracefulChicken
  • Members
  • 556 posts

WizenSlinky0 wrote...

Yes, sentient AI is necessary. The whole point of the singularity is that they fundamentally approach things differently than organics, so much so that we can't possibly comprehend it in terms of our own minds. There is no understanding, because it's impossible for an organic to understand the thought process of a machine, and likewise impossible for an AI to fully understand an organic's thought process. They can build models of likely outcomes but not figure out how they get there.

A cyborg retains the human mind. That is the difference. It's not intelligence that is the problem. It's the approach.


Basically this, to the guy who quoted my post. Sentient AIs are a fundamental part of a tech singularity happening. It's the entire part that takes mankind's understanding out of the equation; that's the event horizon. Tech needs to be able to improve on itself without human intervention for a tech singularity to happen in the traditional sense, which means it does need to be self-aware (even today, some spy drones are considered semi-autonomous because they "know" where their power source is; it was the reason the US Navy commissioned studies into the future of autonomous robots, their moral use in warfare, and the precautions to be taken in the process). Sentient AI is one of the biggest precursors to a tech singularity, if not the biggest, along with nanotech and everything else we already agree on.
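(Tangent for anyone curious what that "event horizon" looks like in numbers: here's a toy Python sketch of recursive self-improvement. The improvement constant k and the function name are completely made up for illustration; this is a cartoon of the argument, not a forecast.)

# Toy model of recursive self-improvement: if a system's improvement
# rate scales with the square of its current capability, growth is
# hyperbolic and runs away in finite time. k is arbitrary.

def simulate(capability=1.0, k=0.05, max_years=200):
    history = []
    for year in range(max_years):
        history.append((year, capability))
        # Self-improvement step: a smarter system upgrades itself faster.
        capability += k * capability ** 2
        if capability > 1e12:  # past any hope of organic comprehension
            break
    return history

for year, cap in simulate()[-5:]:
    print(f"year {year:3d}: capability {cap:.3g}")

Run it and the first twenty or so "years" look almost flat; then the curve goes vertical within two or three steps. That cliff is the unknowable part being argued about here.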

Edited by GracefulChicken, 05 March 2012 - 01:43.


#183
Treopod
  • Members
  • 81 posts

WizenSlinky0 wrote...

Yes, sentient AI is necessary. The whole point of the singularity is that they fundamentally approach things differently than organics, so much so that we can't possibly comprehend it in terms of our own minds. There is no understanding, because it's impossible for an organic to understand the thought process of a machine, and likewise impossible for an AI to fully understand an organic's thought process. They can build models of likely outcomes but not figure out how they get there.

A cyborg retains the human mind. That is the difference. It's not intelligence that is the problem. It's the approach.


No, sentient AI is not necessary. Look up the definition: it clearly says that a tech singularity can be achieved by organics enhancing their own minds through technology.

The tech singularity with sentient AIs happens because their intelligence would improve at such a fast rate that we would not be able to keep up and comprehend it, but if we chose to improve our own intelligence, there would be no limit to our intelligence either.

The approach differs between sentient AIs and organics, but the only essential difference is that sentient AIs improve at a faster rate, and that difference is not important enough to justify creating AIs. If organics chose to achieve the tech singularity by improving their own intelligence through technology, there would be no need for them to ever create sentient AIs.

#184
Nachtritter76
  • Members
  • 206 posts
Guys, we can't even entirely think from the point of view of a dog. The thought process is entirely different. The brains are not the same. The experiences are not the same from one species to another. Organics don't all think alike. An immortal cybernetic human would not think the same way as a normal, mortal human. To write the Reapers as anything other than completely unknowable is to dumb them down. BioWare just dropped the ball on this whole series.

Oh, and I'll find the link to the "we make it up as we go along" bit.

#185
RogueBot
  • Members
  • 830 posts

Sylvanpyxie wrote...

I've said it on multiple threads, but I was honestly hoping the Reapers had no real grand purpose for the universe. I wanted to believe they were merely harvesting organics as a means to continue their existence.

I'm lying; I originally wanted to believe they were a machine designed and created by an extremely old race that liked to collect vast amounts of knowledge and store it away for all time.

I wanted the very first Reaper to be some kind of ancient archive, created for the sole purpose of documenting knowledge by absorbing the organic matter of those who were on the brink of death in the Creators' society, so that their knowledge, and all their life experiences, were stored for all time.

Then the Reaper had a programming fault, or the absorbed knowledge somehow corrupted its way of thinking, and it began to absorb the knowledge of those that created it. All the knowledge. Only to discover ways to create new Reapers, to store more knowledge.

To store every single life experience, every belief, every single moment of an entire galactic culture. Only to expand more and hunger for more. Shutting themselves down periodically and only returning when cultures had once again risen up, offering more knowledge for the Reapers' insatiable hunger.

Killer Galactic Library, if you will.

Seriously... Missed opportunity, guys.


Neat theory, although I would have been fine if the Reapers' purpose was simply to (a) procreate by using the genetic material of a worthy species to create a new Reaper, and (b) consume their resources to keep themselves going for another 50,000 years, as well as take any new science and technology as their own.

Guess I'm just a simple guy; I don't mind simple explanations for my villains. Hell, they'd pretty much just be like humans that way, conquering and feeding off those weaker than them.

#186
Treopod
  • Members
  • 81 posts

GracefulChicken wrote...

WizenSlinky0 wrote...

Yes, sentient AI is necessary. The whole point of the singularity is that they fundamentally approach things differently than organics, so much so that we can't possibly comprehend it in terms of our own minds. There is no understanding, because it's impossible for an organic to understand the thought process of a machine, and likewise impossible for an AI to fully understand an organic's thought process. They can build models of likely outcomes but not figure out how they get there.

A cyborg retains the human mind. That is the difference. It's not intelligence that is the problem. It's the approach.


Basically this, to the guy who quoted my post. Sentient AIs are a fundamental part of a tech singularity happening. It's the entire part that takes mankind's understanding out of the equation; that's the event horizon. Tech needs to be able to improve on itself without human intervention for a tech singularity to happen in the traditional sense, which means it does need to be self-aware (even today, some spy drones are considered semi-autonomous because they "know" where their power source is; it was the reason the US Navy commissioned studies into the future of autonomous robots, their moral use in warfare, and the precautions to be taken in the process). Sentient AI is one of the biggest precursors to a tech singularity, if not the biggest, along with nanotech and everything else we already agree on.


Nope. The singularity is based on the fact that, in our current state, we cannot predict how our intelligence will evolve, because of the huge, unpredictable leap in intelligence it rests on, regardless of whether that means enhanced organic intelligence or the evolution of sentient AIs.

Here is the quote:

"The term was coined by science fiction writer Vernor Vinge, who argues that artificial intelligence, human biological enhancement or brain-computer interfaces could be possible causes of the singularity."

http://en.wikipedia....cal_singularity

#187
Solduri
  • Members
  • 198 posts

GracefulChicken wrote...

WizenSlinky0 wrote...

Yes, sentient AI is necessary. The whole point of the singularity is that they fundamentally approach things differently than organics, so much so that we can't possibly comprehend it in terms of our own minds. There is no understanding, because it's impossible for an organic to understand the thought process of a machine, and likewise impossible for an AI to fully understand an organic's thought process. They can build models of likely outcomes but not figure out how they get there.

A cyborg retains the human mind. That is the difference. It's not intelligence that is the problem. It's the approach.


Basically this, to the guy who quoted my post. Sentient AIs are a fundamental part of a tech singularity happening. It's the entire part that takes mankind's understanding out of the equation; that's the event horizon. Tech needs to be able to improve on itself without human intervention for a tech singularity to happen in the traditional sense, which means it does need to be self-aware (even today, some spy drones are considered semi-autonomous because they "know" where their power source is; it was the reason the US Navy commissioned studies into the future of autonomous robots, their moral use in warfare, and the precautions to be taken in the process). Sentient AI is one of the biggest precursors to a tech singularity, if not the biggest, along with nanotech and everything else we already agree on.


What about mind uploading? Or nootropic drugs (smart drugs)? Or biotechnology and genetics? There are also brain-computer interfaces. All of these could produce an intelligence explosion, which is the main key to a tech singularity.

Edited by Solduri, 05 March 2012 - 01:50.


#188
Aesieru
  • Members
  • 4,201 posts
I am so far behind the new curve this thread took that I can't even try to get into it.

#189
GracefulChicken
  • Members
  • 556 posts
OK, that's fine Wikipedia research, but it barely scratches the surface. The fact is, by the time we're augmenting our own intelligence through the typical "GNR" route of the tech singularity theory, AIs will already be in existence. The knowledge to create AIs would come before augmenting our own systems. If anything, AIs would be used as a research tool for augmenting ourselves. It may not be necessary, that I'll reluctantly admit, but AI would most likely come before the singularity event itself.

#190
Sylvanpyxie
  • Members
  • 1,036 posts

Hell, they'd pretty much just be like humans that way, conquering and feeding off those weaker than them.

Honestly, I'd be content with anything that had them harvesting for the sake of harvesting, and not for some grand purpose of the universe, but that's too simple for most people these days.

That is awesome. Well done.

You know you've been on the BSN too long when you can't tell if you're actually getting complimented...

Edited by Sylvanpyxie, 05 March 2012 - 01:55.


#191
Solduri
  • Members
  • 198 posts

Aesieru wrote...

I am so far behind the new curve this thread took that I can't even try to get into it.


lol, I like the curve it's taken; it's fun to debate things like this :lol:

#192
GracefulChicken
  • Members
  • 556 posts

Solduri wrote...

Aesieru wrote...

I am so far behind the new curve this thread took that I can't even try to get into it.


lol, I like the curve it's taken; it's fun to debate things like this :lol:


Definitely. The science and philosophy behind the ME story is what's kept me this far. Especially when debates like these don't turn ad hominem like a lot seem to now.

#193
WizenSlinky0
  • Members
  • 3,032 posts

Treopod wrote...

WizenSlinky0 wrote...

Yes, sentient AI is necessary. The whole point of the singularity is that they fundamentally approach things differently than organics, so much so that we can't possibly comprehend it in terms of our own minds. There is no understanding, because it's impossible for an organic to understand the thought process of a machine, and likewise impossible for an AI to fully understand an organic's thought process. They can build models of likely outcomes but not figure out how they get there.

A cyborg retains the human mind. That is the difference. It's not intelligence that is the problem. It's the approach.


No, sentient AI is not necessary. Look up the definition: it clearly says that a tech singularity can be achieved by organics enhancing their own minds through technology.

The tech singularity with sentient AIs happens because their intelligence would improve at such a fast rate that we would not be able to keep up and comprehend it, but if we chose to improve our own intelligence, there would be no limit to our intelligence either.

The approach differs between sentient AIs and organics, but the only essential difference is that sentient AIs improve at a faster rate, and that difference is not important enough to justify creating AIs. If organics chose to achieve the tech singularity by improving their own intelligence through technology, there would be no need for them to ever create sentient AIs.


Absolutely. Positively. Not. You're misinterpreting it. If an organic enhances their mind with enough technology, they cease to think like an organic. Therefore, they have literally become an AI with a semi-organic shell. It is still a sentient AI, just approached in a new way. Much like an advanced form of Overlord, where they tried to create an AI using a human mind.

It's not about the intelligence. It's about the needs and the approach. Organics and synthetics require different things, which partially influences how differently they approach everything.

The two thought processes cannot mix without conflict. It's a two-part split: the mental and the substance. Unless you have two things that differ in BOTH, you will not reach a singularity, because there are still common needs and understanding.

#194
Solduri
  • Members
  • 198 posts

GracefulChicken wrote...

Solduri wrote...

Aesieru wrote...

I am so far behind the new curve this thread took that I can't even try to get into it.


lol, I like the curve it's taken; it's fun to debate things like this :lol:


Definitely. The science and philosophy behind the ME story is what's kept me this far. Especially when debates like these don't turn ad hominem like a lot seem to now.


Glad to hear it :)

#195
Russalka
  • Members
  • 3,867 posts
Is 'paradox' the right term in this case? I keep forgetting how it is misused.

#196
Sylvanpyxie
  • Members
  • 1,036 posts

Is 'paradox' the right term in this case? I keep forgetting how it is misused.

A paradox is a logical statement, or group of statements, that leads to a contradiction or to a situation which (if true) defies logic or reason.

The Reapers kill organics to protect organics.
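(If you want the contradiction spelled out formally, here's a tiny Lean sketch. The reading of "kill organics" as implying "don't protect organics" is my assumption, and it's exactly the premise the reply below denies.)

-- Minimal formalization of the claimed paradox. The second premise,
-- that killing counts as not protecting, is an assumption, not canon.
example (Kill Protect : Prop)
    (goal : Kill → Protect)       -- the Reapers kill in order to protect
    (effect : Kill → ¬ Protect)   -- but killing organics fails to protect them
    (h : Kill) : False :=
  effect h (goal h)

Both premises together give you False; the only escape is to reject one of them.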

#197
Aesieru
  • Members
  • 4,201 posts

Sylvanpyxie wrote...

Is 'paradox' the right term in this case? I keep forgetting how it is misused.

A paradox is a logical statement, or group of statements, that leads to a contradiction or to a situation which (if true) defies logic or reason.

The Reapers kill organics to protect organics.


The Reapers prevent organic races from creating artificial intelligences capable of running away, and they step in when it happens, or when the Vanguard's recon shows that it is happening, has happened, or is dangerously close to happening. To prevent the exhaustion of resources through constant expansion, and to keep runaway intelligences from deciding organics are unneeded and exterminating them into extinction, they eradicate everything related to them and allow the galaxy to repopulate anew.

It is not a paradox.

The Reapers aren't really AIs; they are sapient constructs that maintain many programs, forming a nation inside an embryo made of billions of organics of the species they Reaped.

#198
Treopod
  • Members
  • 81 posts

GracefulChicken wrote...

OK, that's fine Wikipedia research, but it barely scratches the surface. The fact is, by the time we're augmenting our own intelligence through the typical "GNR" route of the tech singularity theory, AIs will already be in existence. The knowledge to create AIs would come before augmenting our own systems. If anything, AIs would be used as a research tool for augmenting ourselves. It may not be necessary, that I'll reluctantly admit, but AI would most likely come before the singularity event itself.


AIs need to be created by us, and if we know their dangers and have been warned, and we know there exists an alternative with just as much potential for intelligence as sentient AIs have, then I don't see why we would ever pursue the development of such AIs any longer. So I would say that, in that case, the likelihood of such an AI existing before we achieve an organic tech singularity is small.

Edited by Treopod, 05 March 2012 - 02:05.


#199
Nachtritter76
  • Members
  • 206 posts
http://www.oxmonline...they-went-along

#200
Treopod
  • Members
  • 81 posts

WizenSlinky0 wrote...

Treopod wrote...

WizenSlinky0 wrote...

Yes, sentient AI is necessary. The whole point of the singularity is that they fundamentally approach things differently than organics, so much so that we can't possibly comprehend it in terms of our own minds. There is no understanding, because it's impossible for an organic to understand the thought process of a machine, and likewise impossible for an AI to fully understand an organic's thought process. They can build models of likely outcomes but not figure out how they get there.

A cyborg retains the human mind. That is the difference. It's not intelligence that is the problem. It's the approach.


No, sentient AI is not necessary. Look up the definition: it clearly says that a tech singularity can be achieved by organics enhancing their own minds through technology.

The tech singularity with sentient AIs happens because their intelligence would improve at such a fast rate that we would not be able to keep up and comprehend it, but if we chose to improve our own intelligence, there would be no limit to our intelligence either.

The approach differs between sentient AIs and organics, but the only essential difference is that sentient AIs improve at a faster rate, and that difference is not important enough to justify creating AIs. If organics chose to achieve the tech singularity by improving their own intelligence through technology, there would be no need for them to ever create sentient AIs.


Absolutely. Positively. Not. You're misinterpreting it. If an organic enhances their mind with enough technology, they cease to think like an organic. Therefore, they have literally become an AI with a semi-organic shell. It is still a sentient AI, just approached in a new way. Much like an advanced form of Overlord, where they tried to create an AI using a human mind.

It's not about the intelligence. It's about the needs and the approach. Organics and synthetics require different things, which partially influences how differently they approach everything.

The two thought processes cannot mix without conflict. It's a two-part split: the mental and the substance. Unless you have two things that differ in BOTH, you will not reach a singularity, because there are still common needs and understanding.


No, that's false. As long as free will is retained and the evolutionary process of our organic intelligence is stable and kept under control, it will develop just the way we want it to. We will still be organics, not AIs. We might think differently, but we still won't be AIs, and we won't need them either, so that seems to be the natural and optimal evolution for organics.

The definition of a true AI is that it is created from synthetic matter. If we enhance our own intelligence, we would still be organic, still require the things organics need, and still have free will; by definition alone we would not be considered the same as sentient AIs, who also have free will but function in a different way.

Edited by Treopod, 05 March 2012 - 02:12.