On the Universality of the Creator-Created Conflict


80 replies to this topic

#51
MegaSovereign
  • Members
  • 10,794 posts

Jukaga wrote...

You know what scares me sometimes? The near certainty that somewhere in the universe (hopefully not our galaxy) there are Von Neumann probes running amok, sterilizing planets and reproducing themselves, or even worse: a grey goo nanohorde devouring everything in its way.

Given the immensity of space, both are near certainties.


If they're being transported on an asteroid, we could divert their path into the sun. Assuming the nanohorde could be killed by the heat.

I don't know, I haven't really thought about it. This makes me a little more wary about space travel, lol.
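For a sense of scale, here's a quick back-of-envelope sketch of why self-replication makes this a numbers game. Every parameter is invented purely for illustration:

```python
# Rough sketch of self-replicating probe growth; every number here is an
# invented assumption for illustration, not lore or established science.

GALAXY_STARS = 4e11   # order-of-magnitude star count for the Milky Way
HOP_YEARS = 10_000    # assumed years per generation (travel + replication)
OFFSPRING = 2         # assumed new probes built at each visited system

probes, generations = 1, 0
while probes < GALAXY_STARS:
    probes *= OFFSPRING   # each generation, every probe spawns OFFSPRING more
    generations += 1

print(f"~{generations} generations, ~{generations * HOP_YEARS:,} years")
# -> ~39 generations, ~390,000 years: a blink on galactic timescales.
```

Even with far less generous assumptions, the doubling does all the work, which is why "near certainty" arguments about runaway replicators are hard to shrug off.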

#52
Mangalores
  • Members
  • 468 posts

remydat wrote...

...

Yes, but the point is that collectively, patterns emerge. Religion, for example, is a pattern that emerges through human history. Individuals may not believe, but collectively, every major civilization develops some form of religion. The Catalyst is not making observations at an individual level. He is saying that collectively, organics will continue to create technology, eventually creating an AI that will destroy them. Collectively, humans conflict with each other. Whether those reasons dwindle doesn't change the fact that we still kill each other over stupid things.


I question what you consider the collective patterns. Human violence has decreased relative to population size over the past millennia. Known organized religion is ~20,000 years old vs. 250,000 years of human history vs. 3 billion years of life. You are actually not looking at the collective patterns but at the outliers.

And the claim is not that synthetics are incapable of it. It is that both are incapable of it. Synthetics can destroy organics either because synthetics provoke the organics or because organics provoke synthetics. If the Geth were born with Reaper Code then they would possibly have simply exterminated the Quarians completely. They did not because they were primitive and did not possess the processing power to contemplate killing an entire species.


That makes no sense. Choosing to stop killing takes more processing power, since you'd need the capacity to question your own and others' motivations.

And again, you assume a monolithic set of behaviour, when over long periods of time we see complex sets of behaviour, pretty much all of which reduce both the reasons for violence and the accepted norms for applying it.


The point though is the conflict does not have to start because of synthetics.  It can start because organics fear synthetics despite having no reason to simply because they exist just like what happened with the Quarians or why the Council has laws forbidding synthetic life.


Hudson: "We should nuke the galaxy from orbit. It's the only way to be sure."

The point is that there is no indication that the synthetics vs. organics conflict shows any unexpected features you don't see in organics vs. organics conflicts or synthetics vs. synthetics conflicts. All of these resolve in a multitude of ways, from mass murder to amicable peace. If we go by the Catalyst's reasoning, he should not draw the line at synthetic vs. organic conflict but at life itself.

#53
Auld Wulf
  • Members
  • 1,284 posts
@OP

A very refreshing perspective. Not unlike a cold shower on a hot, sweaty day. There's been a bad smell about the BSN, lately, and this washes it away.

I've often presented the Catalyst as being, in essence, an immortal child, so I couldn't agree with you more there. I've also pointed out that the perceptions and understanding of the Catalyst are tied to the way in which it was created, the views it had at the time of creation, and the 'evidence' it's observed over aeons from the perspective of those views.

There is, however, one problem with a sub-section of humanity: The belief that nothing changes.

It's a weird problem, really. You'll have some people (like me!) who'll embrace any new idea, but then you have a certain breed of person who believes that their lifespan dictates whether something is natural or an abomination. (I will say that it's mildly amusing that I'm more open to new things and generally more open-minded than most of the kids on BSN. That's really funny.) See, the issue tends to be that some people are weak of character, so they cling to familiarity because they see the unknown as a danger.

They hold onto familiarity for dear life, and they don't want the world to change. Anything that existed before they were born is natural; anything that comes after goes from unusual to outright abomination. I remember when, for example, abortion was a big issue and how you had some kids who were freaking out over how unnatural and strange it was. These days it's just a fact of life, but you'll always have those who cling to their familiarity. (See Bush and his 'special snowflakes.')

To me, anything new and beneficial that happens is cool and interesting. I read science journals and sites on a regular basis because I love staying on top of current events. I'd rather read about the latest breakthroughs than celebrity gossip, but that's just me. I mean, earlier today I was reading about how a vaccine had been created to help deal with the symptoms of Autism -- that is amazing. It's also an advancement that we didn't have before. This is why I see Synthesis as almost the symbolism of science itself. The ideal of science, if you will.

I also read an article which talked about how sociopathic/psychopathic people physically lack the neurophysiological architecture required for empathy. I see a lack of empathy a lot. But I can imagine a day when that's cured and all people have the capacity for empathy. I'm sure that a lot of people around here would flip out about that, freak out, have a riot. Why? Because a cure to that would be essentially creating a new human condition. If we could cure this at birth, we'd essentially be creating a posthuman condition, one where that flaw never existed.

That's what I see Synthesis as being representative of.

In contrast to me, though, you have people who cling to familiarity and believe Earth will be the same forever. They take comfort in it, and believe that nothing will ever change. If anything does change, or if anything is contrary to the beliefs they were taught in their youth, then they balk at the changes and consider them wrong, or abominable. What's funny is that, as you pointed out, with new generations these views of what is and isn't abominable can change, thus creating conflict.

I think, outside of roleplaying a Shepard, whether you'd choose Destroy, Control, or Synthesis is based upon a sliding scale of how much you desperately want familiarity. If you're okay with diving off the deep end into the unknown (as I am), and see that as exciting (as I do), then you're going to pick Synthesis. If you're almost there, but you're still a little cautious, you can pick Control. If you cling to familiarity and want the Universe to stay the same forever, you pick Destroy.

That's why Destroy is essentially Luddism at the core. Luddism for simple people, with simple desires, and a need for familiarity. It might not be that they can't comprehend Synthesis, but maybe more that they wilfully don't want to, because they're scared of any changes threatening their sense of familiarity. They basically want every tomorrow to be like every yesterday, a position of the purest stagnation. And that's a very dangerous thing to want.

Of course, these people would be the first to fire weapons/raise torches at aliens or artificial life. They're the mob mentality that arises whenever the world changes, they are the will of base animal fear at its most primal level. Safety. See, Destroy is pure security, it's safety, it's a warm, cozy blanket where the big, bad Reaper men are gone, and everything stays juuuust as it was. It's stagnation, it's living in a moment forever.

This is essentially the view espoused by the Templars in Deus Ex. They want to create a new dark age where people are tribal and everything is small and easy to understand, where nothing is beyond the comprehension of the weakest of character. Contrary to that, you have Helios. The Helios solution is simply to improve all humankind to the point where everyone is equal in their strength of character, and able to face the changes ahead.

It's funny: the Templars, the Illuminati, and Helios show exactly the same sliding scale as Mass Effect. If you aren't familiar with this, then I urge you to go and either watch or play the Deus Ex games. Especially the culmination of the story in Deus Ex: Invisible War (whilst admittedly not a great game, it did have a brilliant storyline). In Deus Ex, the perceptions of all elements of the scale are laid out by various individuals who all have their own perceptions. Perhaps this is why my perceptions are so refined: because I've played Deus Ex.

Have I mentioned that more people need to play Deus Ex?

More people need to play Deus Ex. :I

But what you see in Deus Ex is basically the sliding scale of humanity. At one end you have those who fetishise familiarity and are terrified of any kind of change. At the very far end of that scale you have those who'll violently, vociferously, and aggressively fight all manner of change in any way they can (even if it means killing us all, or killing whomever they must to make it happen). On the other end of the scale, you have those who believe that a world can exist for everyone; they are creatures of novelty and pacifism who believe that no one has to die to achieve any given dream. Occupying the centre you have the Illuminati, who possess qualities of both.

It's represented in other games, too!

Let's look at Fallout: New Vegas...

Caesar's Legion: A very tribal world where everything is small and easy to understand. The obvious similar ideologies here are the Templars and Destroy.

NCR/Mr. House: These are middle-of-the-road variations, taking baby steps towards change so as not to upset the rabble. These groups represent the Illuminati and Control.

Independence: The most radical change for New Vegas, it changes everything but in a positive light, leaving nothing as it was before. This is reminiscent of Helios and Synthesis.

I'm using the most common examples I can here, so that more people will grasp where I'm coming from. This scale is very prevalent in human thinking, as it's basically an allegory of perspectives on progress: those who hate it, those who fear it, those who want to granularly control it, and those who embrace it.

Progress is the enemy of stagnation. As such, progress is the enemy of familiarity. It's always surprising to me just what lengths people will go to to fight progress.

So there you go, that's my take on it.

Thanks for having me think about this, OP. It was indeed refreshing and very welcome.

Edit: Heh, I'm reminded now of that one Charr in Guild Wars 2. The one that talks of how he thinks progress is a negative thing, how he hates it, and how he finds the current peace unsettling. That he preferred the days when everyone was killing each other with swords.

Edited by Auld Wulf, 25 April 2013 - 01:09.


#54
MassivelyEffective0730
  • Members
  • 9,230 posts

MegaSovereign wrote...

Jukaga wrote...

You know what scares me sometimes? The near certainty that somewhere in the universe (hopefully not our galaxy) there are Von Neumann probes running amok, sterilizing planets and reproducing themselves, or even worse: a grey goo nanohorde devouring everything in its way.

Given the immensity of space, both are near certainties.


If they're being transported on an asteroid, we could divert their path into the sun. Assuming the nanohorde could be killed by the heat.

I don't know, I haven't really thought about it. This makes me a little more wary about space travel, lol.


Just remember the immense size of space and where we're at. 

#55
MassivelyEffective0730
  • Members
  • 9,230 posts

Auld Wulf wrote...
*Condescending/Strawman/Black-and-White/Ad Hominem/Pro-Synthesis/Anti-Destroy/Otherwise logically unsound and common sense deprived statement*

Not all change is progress. New Coke in 1985 proved this.

Lack of change isn't always stagnation. The regular Coke formula - why change what works well?

Not every new idea is a good idea. A football bat, for example.

You're basically condemning everyone who doesn't want things to change. That's not correct.

You're assuming that all change is for the better. It is not. 

Edited by MassivelyEffective0730, 25 April 2013 - 01:15.


#56
Auld Wulf
  • Members
  • 1,284 posts
I was going to put this in as an edit to my prior post, but I recognise that it might not get seen by the people I'd like to see it, that way. As such, this should be considered a direct addendum to my prior post.

So, onto... more thinking!

Okay, so I think another side to this problem is that the human who clings to familiarity can see no better creation than himself and humanity. They are the creation of some deity or other of their own creation, after all. ...which is just patting your own back, really, but I digress. See, this notion that the current human condition is perfect (rather than flawed and cruel) is part of the problem. It also means that they cannot accept that something could be better than us, and that only things familiar to us are equal to us.

In Mass Effect, Deus Ex, and New Vegas (common themes!) it took an AI in all cases to help us achieve perfection, because we were too imperfect to do it by ourselves. We'd first need to acknowledge that there are always going to be things that an AI is better at. Humans, for example, are horrendous at micro-management. An AI would be far, far more capable of pulling that off. So there are tasks which an AI could perform which we couldn't. An AI could, for example, manage a consensus and allow us all to remain connected. An AI could look at the needs of the individual, rather than the generality.

Also consider New Vegas and the town of Primm -- they need a sheriff. But most of the options you'll pick will actually doom them. Why? Humans are biased towards their own and act in favour of the generality. What turned out to be the best option? An impartial machine which could look at the needs of each individual and act accordingly and logically. Essentially, machines have a wisdom that humans lack (even me): objective fairness. No human can be objectively fair; you can come fairly close (and I strive to), but not one of us can be objectively fair. We're all biased in our own ways, even without realising it.

A machine isn't biased. A machine can look at us and understand the needs, necessities, and desires of the individual and figure out how to implement those without actually impeding the needs, necessities, and desires of any other individual. This is largely what we use machines for today: their impartiality. When a machine runs a simulation, it simply crunches the math; it isn't biased towards any particular outcome unless the program in question was weighted (but most programmers know better than that, at least).

The problem, however, is that it's hard for a certain mindset of humanity to accept that anything could be better at any task than a human, because humanity is seen by those people as the peak of evolution. Because we have gone unchallenged. To challenge us, to those people, is essentially an act of war, an affront, an offense, an insult to the deity they created, or what have you. So if a machine could prove its superiority in regard to any one task, then you'll have those who'll see it as a threat. An abomination, if you will. Essentially, an abomination is anything that's able to do anything better than we can, or do something in a way we can't understand. That's the way the word is most commonly used these days.

The reason I think that Synthesis works in Mass Effect is that these issues are easily fixed. I mentioned in my prior post that a lack of empathy can be caused by physical damage to the brain -- this, in time, can be repaired. This could result in more emotionally sound people, less insecure people, and thus people who could understand that there are beings out there who are superior to them in certain ways. They might be able to accept that intelligent machines could help them attain a perfection they'd never even considered. In fact, this is stated outright in the Deus Ex universe, but I imagine it's true of Mass Effect as well.

Here's the thing: something that's superior to you is unknown, because it's superior in a way you can't understand, because you can't achieve that level of superiority yourself. An emotionally insecure person today would see that as a challenge to their perfection, to the familiarity and stability of the human race. They'd want to destroy it just so things can go back to the way they were.

An emotionally sound person recognises that if something is superior to them in some ways and friendly, then a mutual partnership is a good idea, and that eradication is the furthest thing from ideal that anything could be.

So that's another angle. Superiority is unfamiliarity, and unfamiliarity to the subset of people I've described is scary and undesirable. So the solution for them is to remove the unfamiliarity rather than seeing the benefits that it could present. Artificial life could provide us with many benefits and could help to make humanity a better race overall. Though I worry that we'll achieve artificial life before we fix ourselves. If that happens, then things are going to get very ugly. And I hope the emotionally insecure ones won't be the death of us all by throwing the first stone.

Edited by Auld Wulf, 25 April 2013 - 01:30.


#57
Guest_Cthulhu42_*
  • Guests
Auld Wulf's posts sure are long, considering that virtually all his posts can be summed up as follows:

"People who love the geth and Reapers and pick Synthesis are enlightened and intelligent; those who don't are mentally unstable racist Luddites."

Edited by Cthulhu42, 25 April 2013 - 01:54.


#58
MassivelyEffective0730
  • Members
  • 9,230 posts

Auld Wulf wrote...
Crap, crap, and more crap


I can't believe I'm going to waste my time so completely but....

You have many a fallacy.

In your view, familiarity = bad, and unfamiliar = good. That's a black-and-white fallacy. It's either one or the other.

AI can improve life. It can also be the death of us. Conversely, it may have no significant advantage or disadvantage. But you claim to know what is not knowable. And you're begging the question in all of your supports: you draw your supports from your own conclusions, and you draw your conclusions from your supports.

You're affirming the consequent and denying the antecedent.

You're claiming that everyone who is emotionally secure will accept something superior (which you fail to define), and if a person accepts something that is by your definition superior, they must be emotionally secure. Vice versa, you're claiming that everyone who is not emotionally secure will not accept something superior, and if a person rejects something that is by your definition superior, they must be emotionally insecure.

That's also a black-and-white fallacy and a hasty generalization.

And going by a prior statement of yours claiming that people who don't choose Synthesis did not go to science class, I will extend the same fallacy to you. You obviously didn't go to a sociology, philosophy, or ethics class.

And by the sheer number of fallacies in all of your arguments (most of the arguments being fallacies themselves), I'm going to make the hasty assumption that you never bothered with a persuasion and argument class.

Or any writing or reading comprehension class for that matter.

Edited by MassivelyEffective0730, 25 April 2013 - 02:09.


#59
MassivelyEffective0730
  • Members
  • 9,230 posts

Cthulhu42 wrote...

Auld Wulf's posts sure are long, considering that virtually all his posts can be summed up as follows:

"People who love the geth and Reapers and pick Synthesis are enlightened and intelligent; those who don't are mentally unstable racist Luddites."


He has a big propensity for the mind projection fallacy. His entire outlook basically consists of it.

"Because I think this way, it must be true. Anyone and everyone who disagree's is false"

Seival and David7204 (most of the time) have the same issues.

Edited by MassivelyEffective0730, 25 April 2013 - 02:11.


#60
The Heretic of Time
  • Members
  • 5,612 posts

Auld Wulf wrote...

*snip*


Cool story bro.

#61
dreamgazer
  • Members
  • 15,765 posts

MassivelyEffective0730 wrote...

"Because I think this way, it must be true. Anyone and everyone who disagree's is false"

Seival and David7204 (most of the time) have the same issues.


Oh, a far larger portion of the BSN has this problem, and they're from all camps.

#62
remydat
  • Members
  • 2,462 posts
Mangalore,

War is still relatively common. The fact that it has declined relative to population is largely irrelevant. Conflict is a constant in human society.

And ancestor worship and burial of the dead, which were the first signs of religion, date back much further than organized religion and largely coincide with our becoming more human and less like our ape-like cousins. Religion is not an outlier. It appears to be part of the path of sentient development. For most of our 250 thousand years we could hardly be distinguished from other animals. The time in our history where we broke off and truly became unique is also the time where religion developed. Even in the MEU, most of the sentient species had some form of religion. So no, it is not an outlier. It is part of the progression and hence can be predicted.

Killing something that threatens you does not require more processing power; it is the basic instinct of any living organism. If something attacks you, the instinct is kill or be killed. It is completely illogical to ponder the reasons why they want to kill you when survival is the objective. A deer does not see a lion and wonder if this lion is different. It sees a lion and, based on all the other lions that tried to kill it, runs away (or, if it had the power, would kill the lion).

The difference, once again, is the fact that in its opinion a synthetic allowed to evolve will soon surpass organics and never relinquish that superiority. Within a few years of their birth, the Geth were technologically capable of exterminating their creators completely. They chose isolation. If they had instead spent the next 300 years preparing for a war with organics, organics would be in serious trouble. From the moment of its birth, a synthetic race is largely already on par with its creators in terms of tech. The only question then is whether they choose conflict or peace.

So I'm not sure what you see as being so complex. Humans still kill for the same basic reasons. All that has changed is that we have found more efficient ways to control each other. However, when **** hits the fan and resources become scarce, we will go right back to killing each other to survive, based on ethnicity or race or nation. We basically have enough food to feed the world, but we are quite content to let kids starve and die while I probably throw out a few pounds of food a week. We speak of liberty and freedom for all, but when a terror attack happens we turn over our rights and the rights of others to the government so we can feel safe. So I think that is where we disagree. Humans are largely the same. It is just that, as we developed and gained control over the environment and resources, we became more willing to tolerate people we consider different. Once those resources become strained, we will see how long we hold onto this illusion of togetherness. The point is, anyone can play nice when times are relatively good.

Edited by remydat, 25 April 2013 - 06:52.


#63
MyChemicalBromance
  • Members
  • 2,020 posts
Posted Image

Image sums up thread.

#64
MyChemicalBromance
  • Members
  • 2,020 posts

Mangalores wrote...

MyChemicalBromance wrote...
I'm led to wonder if the Catalyst even sees things in terms of probability. If the scale shown in the Extended Cut of Synthesis is accurate, it may be implying that the Catalyst's technology asserts control or at least influence over quantum fluctuations. Such capabilities defy explanation, but there's a chance it sees all of existence the way we see classical physics.

That said, it still works if the Catalyst is only going off of probabilities, so for simplicity (and the fact that the alternative I just mentioned doesn't really add anything) I'll just stick with that.


Which is a false understanding of quantum fluctuations. There is a reason we describe less and less with classical physics: things don't work that way on a fundamental, reality-bending level.




I wasn't suggesting that quantum fluctuations can be explained by classical physics, more that the Catalyst may know a set of rules that aren't dependent on probability (unlike our current models). Given that it's had billions of years to tackle the problem, along with mass effect fields and Eezo, there's very little I would rule out.

#65
TheRealJayDee
  • Members
  • 2,954 posts
Interesting read, OP, kudos for providing some fresh food for thought to the BSN - it's much appreciated!


@Auld Wulf

I know it is too much to ask of you to consider taking an example from the OP and trying to present your intellectually superior interpretations of Mass Effect in a respectful and constructive manner, so I'm not even trying. Do what you do best; apparently you have nothing to fear from the moderators. To give you some positive feedback: the truly awe-inspiring amount of smug self-satisfaction and condescension your posts ooze with just challenged my whole view not only of ME3, but of life itself and... no, wait, my bad - actually all I feel is the strong desire to take a long, hot shower...

Edited by TheRealJayDee, 30 April 2013 - 03:51.


#66
His Name was HYR!!
  • Members
  • 9,145 posts
Somehow I managed to miss this thread entirely. =\

In any event, this was a spirited read. A couple thoughts...

-- I had never heard of the Fermi paradox before, though I have debated with a friend who believes humans are alone in the universe. This definitely made me think a bit. Nonetheless, my simple explanation is this: space is huge. Even for a species potentially much more physically and mentally able than ourselves, there are all sorts of potential issues that make space travel over any significant distance a very tough thing to establish. Hell, Mass Effect created a new element to make it quick and convenient.

-- I mostly agree with your take on the conflict itself, apart from one thing: I don't think there are truly "malevolent" AI, not really. If anything, organics are the ones who facilitate the conflict with them, whereas synthetics have little to no motive to ever really attack us. In Low-EMS runs, the Catalyst says this quite explicitly: "you (organics) bring it (chaos) on yourselves."

-- I've noticed the examples of created-v-creator between squadmates ("Miranda's ass"), myself. Kudos!

#67
MyChemicalBromance
  • Members
  • 2,020 posts

HYR 2.0 wrote...


-- I mostly agree with your take on the conflict itself, apart from one thing: I don't think there are truly "malevolent" AI, not really. If anything, organics are the ones who facilitate the conflict with them, whereas synthetics have little to no motive to ever really attack us. In Low-EMS runs, the Catalyst says this quite explicitly: "you (organics) bring it (chaos) on yourselves."


Even if organics initiate the conflict, there is still a conflict. I don't think it's impossible for an AI to determine that the "chaos" should be removed. If the Catalyst wasn't taught to value organic life, it seems likely this is the solution it would come to (it's the one the Citadel AI came to).

I'll concede that "malevolent" wasn't the best word to use; what I'm trying to say is that if an AI that didn't value organic life took control, things would be pretty grim for any civilizations it encountered after that point. 

#68
Shermos
  • Members
  • 672 posts
Excellent thread OP. I think you'd enjoy reading the link in my sig.

The writers made a mistake in not explaining the cosmological hypotheses used to build the Mass Effect universe. I guess they expected players to know something about it or to go off and do some reading.

On a side note, I've been spending a lot of time thinking about the Fermi paradox, and I still have one question I haven't found an answer for yet. Is it possible the human race is among the first intelligent lifeforms to appear in the universe? That would explain why we can't see obvious signs of galactic civilisation. Working from our current model of the universe, what's the minimum amount of time needed for a planet capable of supporting life like ours to form? Could it have happened millions or billions of years before Earth?

Edited by Shermos, 30 April 2013 - 04:27.


#69
MyChemicalBromance
  • Members
  • 2,020 posts

Shermos wrote...

On a side note, I've been spending a lot of time thinking about the Fermi paradox, and I still have one question I haven't found an answer for yet. Is it possible the human race is among the first intelligent lifeforms to appear in the universe? That would explain why we can't see obvious signs of galactic civilisation. Working from our current model of the universe, what's the minimum amount of time needed for a planet capable of supporting life like ours to form? Could it have happened millions or billions of years before Earth?

There are billions of stars that formed billions of years before ours in this galaxy alone. The Fermi paradox is predicated on the observation that none of these stars appears to have spawned a space-faring imperial civilization.

To be more specific, hundreds of thousands of planets similar to ours should have existed long before ours. Extra Credits has a good video series on the subject.
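To put rough numbers on that head start, here's a toy Drake-style estimate. Every factor is an assumption chosen only to illustrate the argument, not a measured value:

```python
# Toy Fermi-paradox arithmetic: how many worlds could have had a
# billion-year head start on Earth? All factors are illustrative guesses.

MILKY_WAY_STARS = 4e11       # order-of-magnitude star count
SUNLIKE = 0.10               # assumed fraction of roughly Sun-like stars
HABITABLE_PLANET = 0.20      # assumed fraction with a habitable-zone planet
GIGAYEAR_OLDER = 0.50        # assumed fraction formed >1 Gyr before the Sun

head_start_worlds = MILKY_WAY_STARS * SUNLIKE * HABITABLE_PLANET * GIGAYEAR_OLDER
print(f"~{head_start_worlds:.0e} candidate worlds with a billion-year head start")
# -> ~4e+09; cutting each fraction by 100x still leaves ~4,000 candidates.
```

The exact values hardly matter; the point is that the product stays enormous under almost any plausible inputs, which is what makes the silence puzzling.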

#70
MyChemicalBromance
  • Members
  • 2,020 posts
One more thing I wanted to add (that I assumed some of you were going to):


Posted Image

This may help explain this scene.

#71
Eterna
  • Members
  • 7,417 posts
You know, for all the bad the ending debacle brought, I'm also happy for it. It brought forth interesting theories from the community that really make you reflect.

Thanks for posting this, it was interesting to read and made me think.

#72
Sleepingviking
  • Members
  • 47 posts
Interesting read, but in the end, it just makes me sad that we are still alone (in the sense that we don't have any contact with other races) despite overwhelming evidence that we shouldn't be.

Oh well, I guess the reason aliens stay away from us is that bad news travels fast, and they don't want any of it.

#73
Tonymac
  • Members
  • 4,313 posts
It's a good read - but the ending of the game still sucks balls.

#74
Linkenski
  • Members
  • 3,452 posts
I'm gonna have to be completely ignorant and not read anything from the OP. If the intention was that we should read this deeply into what the story meant, then they lured me with the wrong bait. I expected a simple storyline with some great depth to it, but this is way too much for my brain to handle. I'm outta here!

#75
Skvindt
  • Members
  • 236 posts

Eterna5 wrote...

You know, for all the bad the ending debacle brought I'm also happy for it. It brought forth interesting theories from the community that really makes you reflect.

Thanks for posting this, it was interesting to read and made me think.


+1, the OP and JShepp have some really great threads.