
Mass Effect 3's ending is absolutely brilliant!


3,598 replies to this topic

#1226
MrFob
  • Members
  • 5,413 posts

When you're an advanced life form, you already know what will happen technology-wise. When something is created can vary depending on a lot of factors, but the fact that it will exist is already known. There are only so many paths in technology. Once you have walked them, you already know what will and will not happen. Chemical reactions stay the same. Physics stays the same. Quantum mechanics stays the same regardless of what planet you are on.

The Reapers do not know what will happen. The whole point of the relays and the Citadel trap is to keep organics developing along the paths they desire. It is all part of the plan to ensure that none of the organics goes another route and does something the Reapers didn't predict, thereby ruining the cycles. In this way the Reapers already admit they don't know. And again, what about the Leviathans? They certainly don't fall into the pattern that the Reapers predict. They are their n of 1, and they show the opposite result of what they say will happen.
 

And your solution to the conflict between organic and synthetic life that has resulted in entire species being removed from existence is? They already tried group counseling. Trillions die by the Reapers or trillions die in a war against synthetics. The difference is that one at least serves a purpose of sorts while preserving all their information. The other is just wanton slaughter.

My solution is to cross that bridge when we come to it, as we did with the geth, as we did with EDI. I don't deny that it's dangerous, but I'd rather find out what may happen in the future - even if it has the potential to be bad - than not have a future.
 

When we create tools that are capable of thinking for themselves, we create tools capable of hurting us. That is the culmination of what AIs are. They are synthetic creations that are capable of independent thought. Their minds would work faster than ours, their reaction times limited only by whatever physical hardware they are currently using, which can always be upgraded. Because an AI has its own mind, we cannot predict what will happen. We cannot assume that every AI will be the same. We have no idea if one AI would be able to convince another that maybe listening to a bunch of slow idiots is a good thing.
 
But the point is that they have the ability to hurt us by their own choice, unlike any other tool or creation we make. Players like to use the Geth as an example of why the claim is bogus. The fact is the Geth don't fulfill the criteria the AI states will cause conflict. The Geth are not true AIs at all. They are at best a single AI fragmented into thousands of pieces, and due to their fragmented nature they never advanced beyond organics.

Their ships and weapons look better than organics' but only because they can ignore important things organics usually need. Remove the need for food, sleeping quarters, toilets and life support systems and they can use that extra space and power for more armor or more weapons. They could make use of 90% of the space in a ship thanks to their minimal need for physical accommodation, while organics would need 80% of the space inside the ship just to fit their organic bodies. None of this makes them technologically superior to us. It is simply that they don't have to deal with the same limitations organic bodies inherently come with.

They may be able to hurt us, but they are also intelligent. They may choose not to. I think it will very much depend on how we deal with the problem: whether it is with fear and preemptive aggression or with an open mind (we had a discussion that might be relevant here). As I wrote in my answer to Ieldra, making all organic life in the galaxy extinct and then maintaining this state is an enormous task. Why would any intelligence do this?
But even if we assume they do, even if we take the worst-case scenario and assume that they sterilize the galaxy and keep it that way: it would be the end of our civilization, but they would still have our knowledge in their data banks, just as if we were reaperfied. And even if they deleted all that, they would still be our creation, the horrible legacy of our kind, and because they are AIs, they would also continue to develop, to evolve. It would be synthetic and it would be very different from us in terms of hardware, but it would still be life.
But as I said, this is the absolute worst-case scenario.
 
Besides, the dangers posed by AI in the Milky Way are not the only ones. What if someone comes from another galaxy to wipe us out and, because of the Reapers, we couldn't develop enough to oppose them? Every course of action poses unforeseeable dangers. The Reapers chose one possible (maybe even likely) scenario and used it to justify their cycles.
 

Climate change usually kick-starts a whole other argument, so I avoid it when I can. The Flint water issue is one that was created when industry and citizens advanced without worrying about the after-effects on the planet, dumping into and polluting the river until the water isn't even vaguely drinkable anymore.

Only it doesn't fit in this context: we are not talking about something that has happened, where we can say that a certain course of action might have prevented it. We are talking about prevention without knowing the result of not doing it (and about a prevention that would cause the death of every sapient everywhere).
 

Why bother with what? Keeping organic life intact?

Yes. If we assume that in a couple of billion years life goes extinct through heat death anyway, then what's the point of keeping it in the static form of the cycles? I am not asking what the Catalyst thinks, I am asking what you think, since you also prefer stagnation to progress.
 

Because that was the purpose of its creation. It was created initially to be the catalyst of peace between organics and synthetics. When that was found to be impossible, it created the Reaper solution. It keeps doing what it is doing because the Leviathans created an overriding priority to preserve organic life, its version of a heartbeat, something it can't stop or alter on its own because it is built into its programming. But even then the AI acknowledges that the Reaper solution is flawed; it is simply the only working solution at the time. The second the variables alter enough, the AI jumps at the chance to let Shep change what happens: destroying all it has done and worked for, giving up control of the Reapers to a new intelligence to deal with the problem in another way, or pushing for what it sees as the ultimate solution to the problem by breaking down all the barriers that cause the conflict to begin with.

You see, an AI would not be programmed in the sense of directives. The programming of a true AI would only be there to provide an adaptive network that makes the AI capable of experiencing and learning. Unfortunately, in some of its terminology the ME games also paint a wrong picture, but in their definition of AI they actually go with the current theories. There is no "programming it to do xyz". You can and have to educate it like a child, but ultimately it will be able to form its own opinions and thoughts. That's the whole idea. Now, it may be that the Leviathans educated it to feel very strongly about preserving organic life and not to care about individuals, and they may also have hammered this idea about synthetics into its mind. But saying that it sat around for a billion years, doing the same thing over and over without any change, without thinking about the things even we puny humans discuss here, without questioning the wisdom of stagnantly holding all organic life in the cycles until literally the end of time when the only hard evidence shows the opposite? That I find hard to believe, especially when AIs like EDI, who has been around for 2 years, or the geth, who have been around for 300 years, are already way more flexible in the way they think.

 

In any case, if you are saying that it got programmed, then it follows that the Catalyst would by definition qualify as a VI, and I have said what I think about that above.


  • KrrKs likes this

#1227
Dantriges
  • Members
  • 1,288 posts

Considering the Leviathans and their particular perspective on other species, their arrogant "apex species" mindset and "we know better" attitude, I have an idea why the Catalyst turned out the way it did.


  • HurraFTP, KrrKs and Abedsbrother like this

#1228
angol fear
  • Members
  • 831 posts

Seems the player is supposed to agree; you aren't allowed to question that line of reasoning. You have to swallow it all, bait, hook and line.

There is no "are you sure?", "WTF?", or "when's the last time you did a system integrity check? Ever checked your billion-year-old RAM?" :P

 

So Shepard doesn't say: "By wiping out organic life?", "But you killed the rest?" (and this one shows that Shepard can question the method if he says it:) "I think we’d rather keep our own form."

The problem isn't that Shepard can't question that line of reasoning (he actually does); it's that people want to convince an A.I., they want to convince the oldest existing A.I., they want to convince what can't be convinced...

The partial picture from a limited point of view is probably right, and the full picture from the highest/most distant point of view must be wrong.

 

So no, the player isn't supposed to agree with the method; he is supposed to understand the problem, and he is given a choice to end this method. And if the player really doesn't want to understand, he can think that Destroy is the solution that puts an end to the Reapers and to the Catalyst's problem.


  • Eckswhyzed likes this

#1229
Dantriges
  • Members
  • 1,288 posts

You call that being allowed to question or disagree? Yeah, perhaps Shep should have added an "if you wouldn't mind." We don't want to be considered rude when talking about the annihilation of galactic civilisation, after all.

 

 

The partial picture from a limited point of view is probably right, and the full picture from the highest/most distant point of view must be wrong.

 

That's a good one.


  • Monica21 and KrrKs like this

#1230
Ieldra
  • Members
  • 25,188 posts

The idea of allowing Shepard to question the Catalyst is actually non-trivial. What exactly would you have Shepard say? What would you have the Catalyst answer, that would be meaningful and allow you to accept the situation with a measure of dignity and self-respect? All the Catalyst could do is - yet again - to assert its own superior knowledge in some way. Mordin in ME2 asserted the necessity of the genophage referring to simulations that always resulted in war. Would you accept the same from the Catalyst?

 

I'm somewhat torn on this issue. On one hand, I wish Shepard to question the scenario and have an extended debate on it, on the other, I don't see how it could be written in a satisfying way if you end up being stuck with the choices anyway. Also, I can easily see how such a scenario could be plausible on my own, in spite of the narrative dissonance caused by being able to make peace on Rannoch. Maybe not being able to question the scenario would've been acceptable if it weren't for that.    


  • Vanilka likes this

#1231
Vanilka
  • Members
  • 1,193 posts

Mordin in ME2 asserted the necessity of the genophage referring to simulations that always resulted in war. 

 

This. This was so satisfying and gave Mordin so much depth. If you push him a lot, you see how strongly he feels about it, how that's what he thinks is best, despite being torn about the morality of it. He really engages the player in a dialogue about the issue. It would be so much emptier and flatter if Mordin were simply allowed to overlook ethics and say that the genophage is for sure the one correct solution because krogan overpopulation and aggression is completely inevitable and you should shut up because he knows a lot more about it than you do anyway, and then the game would just give it a silent nod and have you move on without being able to have an opinion.


  • KrrKs likes this

#1232
Dantriges
  • Members
  • 1,288 posts

The idea of allowing Shepard to question the Catalyst is actually non-trivial. What exactly would you have Shepard say? What would you have the Catalyst answer, that would be meaningful and allow you to accept the situation with a measure of dignity and self-respect? All the Catalyst could do is - yet again - to assert its own superior knowledge in some way.


There is probably no Q&A that would result in that.
 

Mordin in ME2 asserted the necessity of the genophage referring to simulations that always resulted in war. Would you accept the same from the Catalyst?

 
He didn't try to force you to accept it, and he changed his own opinion later anyway.
 

I'm somewhat torn on this issue. On one hand, I wish Shepard to question the scenario and have an extended debate on it, on the other, I don't see how it could be written in a satisfying way if you end up being stuck with the choices anyway. Also, I can easily see how such a scenario could be plausible on my own, in spite of the narrative dissonance caused by being able to make peace on Rannoch. Maybe not being able to question the scenario would've been acceptable if it weren't for that.


The Reapers were bumbling idiots able to pull off what they did only because of completely superior firepower, tech which they probably leeched from somewhere else. Also, as for showing that the enemy can outthink you in every way, the role of the dudes who know everything was already taken... by Cerberus. And that was already a bad, bad performance.

So the answer to the question "how can we pull off the Catalyst as this all-knowing AI whose knowledge is superior, besides some RL predictions about the development of ASI in our future?" is: you can't. That ship has sailed. If you wanna pull off a Sun Li, it's too late.

#1233
Reorte
  • Members
  • 6,601 posts

I'm somewhat torn on this issue. On one hand, I wish Shepard to question the scenario and have an extended debate on it, on the other, I don't see how it could be written in a satisfying way if you end up being stuck with the choices anyway.

Then that's all part of the problem. How can we be expected to accept what the Catalyst says if it can't make even a half-hearted effort to explain and justify itself? And if the reason it can't is because the writers themselves can't do it then they've created a poor character and plot.
  • MrFob, Iakus, wright1978 and 3 others like this

#1234
themikefest
  • Members
  • 21,613 posts

Yes, there were questions I would have liked to have my Shepard ask the thing about this, that and the other stuff. Of course, there were questions I would have liked my Shepard to ask in ME1 and ME2 as well, but I was never given the opportunity. Hopefully that changes in Andromeda.



#1235
Monica21
  • Members
  • 5,603 posts

So Shepard doesn't say: "By wiping out organic life?", "But you killed the rest?" (and this one shows that Shepard can question the method if he says it:) "I think we’d rather keep our own form."
The problem isn't that Shepard can't question that line of reasoning (he actually does); it's that people want to convince an A.I., they want to convince the oldest existing A.I., they want to convince what can't be convinced...
The partial picture from a limited point of view is probably right, and the full picture from the highest/most distant point of view must be wrong.

So no, the player isn't supposed to agree with the method; he is supposed to understand the problem, and he is given a choice to end this method. And if the player really doesn't want to understand, he can think that Destroy is the solution that puts an end to the Reapers and to the Catalyst's problem.


If it can't be convinced then how is it an AI? If it can't even be questioned and if it can't give thought to questions you pose, then in what way is it an AI? It doesn't seem like much more than a high-tech VI whose programming hasn't changed in billions of years.

As for the player understanding the problem, that's not what the problem is. I think we all understand what the Catalyst claims the problem is, but experience doesn't match what the Catalyst considers to be the problem. And that may not even be the root of the problem. Killing all organics to make way for new organics seems to be a rather extreme method of making sure synthetics don't eventually kill all organics. Even if the Catalyst is right and the end result of all organic/synthetic interaction results in the destruction of organics, killing organics anyway seems to be a poor solution to that problem.
  • Natureguy85, Get Magna Carter, Eryri and 4 others like this

#1236
Vanilka
  • Members
  • 1,193 posts

That's one of the most maddening things about the Catalyst. The entire time we are told that AI is capable of independent thought and development, adjusting to circumstances, learning. The Catalyst should be advanced far above these things and yet it is not capable of any of them and remains static for millions of years, performing one half-assed "solution" in an endless loop. Because it would be so difficult to check and see that the galaxy is currently not facing any life-shattering event.

 

The only explanation for that sort of assholery would be that this glowy bastard was in dire need of being restored to its factory settings.


  • fchopin, Iakus, Monica21 and 3 others like this

#1237
Natureguy85
  • Members
  • 3,262 posts

Time for a clip show!

 


So that idea that development, even if it leads to our doom, is better than stagnation is stupid.

 

To a point, but it depends on the story you are trying to tell. There are stories that show the view you describe, but there are others that show the opposite.

 

 

In Mass Effect, Mordin describes a balance. He rails against the Collectors' lack of culture and advancement but also notes that advancement can't come before the society is ready. I liked this because it has applicability to the real world, like the conflicts in the Middle East.

 

 

So Shepard doesn't say: "By wiping out organic life?", "But you killed the rest?" (and this one shows that Shepard can question the method if he says it:) "I think we’d rather keep our own form."

 

That's weak questioning, not really arguing. I wanted something more like this (how many times have I posted this?)

 

 

 

If it can't be convinced then how is it an AI? If it can't even be questioned and if it can't give thought to questions you pose, then in what way is it an AI? It doesn't seem like much more than a high-tech VI whose programming hasn't changed in billions of years.

 

Yep. I've been saying that for years.

 

 

The idea of allowing Shepard to question the Catalyst is actually non-trivial. What exactly would you have Shepard say? What would you have the Catalyst answer, that would be meaningful and allow you to accept the situation with a measure of dignity and self-respect?

 

Well, there is the above Babylon 5 clip or this.


  • Monica21 and KrrKs like this

#1238
Ieldra
  • Members
  • 25,188 posts

If it can't be convinced then how is it an AI? If it can't even be questioned and if it can't give thought to questions you pose, then in what way is it an AI? It doesn't seem like much more than a high-tech VI whose programming hasn't changed in billions of years.

As for the player understanding the problem, that's not what the problem is. I think we all understand what the Catalyst claims the problem is, but experience doesn't match what the Catalyst considers to be the problem. And that may not even be the root of the problem. Killing all organics to make way for new organics seems to be a rather extreme method of making sure synthetics don't eventually kill all organics. Even if the Catalyst is right and the end result of all organic/synthetic interaction results in the destruction of organics, killing organics anyway seems to be a poor solution to that problem.

That it can't be convinced doesn't mean it isn't intelligent. It just means we don't have the data to convince it. I'm sure there are things you can't be convinced of without a great deal of evidence....

 

The problem is indeed that *our* experience doesn't match the Catalyst's claims, but then that's a storytelling problem. In-world, the Catalyst has a longer perspective than we do, and we can't reasonably claim it's wrong without checking its data against ours. "We don't like it" does not equal "It must be wrong".

 

That's one of the most maddening things about the Catalyst. The entire time we are told that AI is capable of independent thought and development, adjusting to circumstances, learning. The Catalyst should be advanced far above these things and yet it is not capable of any of them and remains static for millions of years, performing one half-assed "solution" in an endless loop. Because it would be so difficult to check and see that the galaxy is currently not facing any life-shattering event.

Remaining static is not an invalid result of rational thought. The idea of advancement is attractive to us, but we don't know if it's feasible for long. It's quite possible that civilizations static at some level of technology would be able to survive for very much longer without destroying themselves than those that advance at a more than infinitesimal rate. In fact, that appears rather likely to me. Keeping things static is rational if your simulations show that advancing beyond a certain point leads to disaster and you don't know how to prevent it. 

 

Yet again, the Catalyst's scenario is not at all implausible as such. I could even make a RL argument for the same thing, using some common assumptions about what AI will be capable of. The problem is the narrative dissonance with the story that came before, which tells us we can co-exist peacefully with synthetic civilizations (or at least as peacefully as with other organic civilizations).



#1239
Monica21
  • Members
  • 5,603 posts

Yep. I've been saying that for years.


It's so stupid to keep calling it an AI. It can't think for itself and it's been following the same program for a billion years. Even Shepard has done nothing but "alter variables." Which means that those options were programmed in at some point and whatever Shepard did met those conditions.
  • Iakus, Natureguy85, KrrKs and 1 other like this

#1240
Monica21
  • Members
  • 5,603 posts

That it can't be convinced doesn't mean it isn't intelligent. It just means we don't have the data to convince it. I'm sure there are things you can't be convinced of without a great deal of evidence....


That's why that paragraph has other sentences....



#1241
themikefest
  • Members
  • 21,613 posts

Shepard could've walked up to the thing with a 1,000-page report on why it's wrong, but because of its programming, neither Shepard nor anyone else can convince the thing it's wrong.

 

I do the next best thing. I install Windows Shepard. Meaning shoot the tube.


  • Monica21 likes this

#1242
Natureguy85
  • Members
  • 3,262 posts

Shepard could've walked up to the thing with a 1,000-page report on why it's wrong, but because of its programming, neither Shepard nor anyone else can convince the thing it's wrong.

 

I do the next best thing. I install Windows Shepard. Meaning shoot the tube.

 

Actually, that would be Control. Shoot the tube is magnetizing the hard drive :)



#1243
MrFob
  • Members
  • 5,413 posts

The idea of allowing Shepard to question the Catalyst is actually non-trivial. What exactly would you have Shepard say? What would you have the Catalyst answer, that would be meaningful and allow you to accept the situation with a measure of dignity and self-respect? All the Catalyst could do is - yet again - to assert its own superior knowledge in some way. Mordin in ME2 asserted the necessity of the genophage referring to simulations that always resulted in war. Would you accept the same from the Catalyst?

 

I'm somewhat torn on this issue. On one hand, I wish Shepard to question the scenario and have an extended debate on it, on the other, I don't see how it could be written in a satisfying way if you end up being stuck with the choices anyway. Also, I can easily see how such a scenario could be plausible on my own, in spite of the narrative dissonance caused by being able to make peace on Rannoch. Maybe not being able to question the scenario would've been acceptable if it weren't for that.    

 

I would like to ask the catalyst questions about all the stuff that I discussed above, the practical and the philosophical:

- What is your basis for claiming the inevitability of the destruction of organics by AI? Show me the data, please, and also discuss the multiple examples we do have that don't fit. (My guess is the whole thing will already fall apart there, for the reasons I stated above.)

- What is the point of keeping organics in this state of the cycle forever? What's the benefit? We can have a philosophical discussion about that.

- If your goal is to preserve organic life, then what is it about organic life that is so important that it needs to be preserved? What is the unique characteristic that organic life offers as opposed to all other forms of matter and energy in the universe? Apparently you have transcended the value of individual lives, so you need a reason to save the whole.

- As an AI, you should be familiar with the concept of life in terms of information theory. If you discard the idea, why? If not, are you aware that in this context you are doing exactly what you are trying to prevent?

- Do you know what happens in other galaxies? What is your plan for possible dangers that come from the outside? (This will be interesting to discuss once Andromeda is out, I assume. :))

 

Don't ask me to answer those questions; I am the one who challenges the whole scenario. I do agree though that the writers were in a bit of a bind because they did set up the Reapers as "beyond our comprehension". Maybe this was their attempt to give us a conundrum we couldn't logically or philosophically untangle. However, if that is the case, they chose the wrong exposition device in giving us the Catalyst, with whom we can - by definition - not agree. I know you wrote something very similar yourself a couple of times, but this is what it all feeds back to. You have a problem that cannot be explained logically, that we have no evidence for ourselves, and whose solution demands the death of everybody. And then we are exposed to this problem by an entity that by definition cannot explain it because it's beyond our comprehension (but that doesn't actually matter because we can't even ask in the first place). And then we are supposed to go ahead and make a decision that will always require us to commit a crime against sapient life in order to move forward from this problem. So I have to betray my own ethics AND my own common sense because my enemy told me that I am just too dumb to get it.

Before the ending was out, the writers could have written anything they wanted, and they chose to write this. And people say that this is absolutely brilliant storytelling! :lol: (Not you, just going by the thread title.)

 

Also, @angol_fear: these questions you mentioned are only exposition questions. Shepard needs to know what's going on first, and he asks about that. That's all great, but once he does know, he doesn't go on to challenge the Catalyst's underlying reasons or to at least give him his own opinion on the matter (other than that one question I posted before, which gets deflected). I don't expect to convince the Catalyst after he's been doing this for a billion years. I do expect him to have a damn good reason, and I myself have indulged in many theories as to what it might be. But in the end, the story ends at a point where I can't convince the Catalyst, the Catalyst can't convince me, and we are stuck in this place where I have to make a choice on a basis that I neither understand nor can even believe in good faith. That's where the writing team took us, and you'll excuse me if I find this a horrible way to resolve a story.


  • Monica21, Callidus Thorn, Get Magna Carter and 5 others like this

#1244
Vanilka
  • Members
  • 1,193 posts

The problem is indeed that *our* experience doesn't match the Catalyst's claims, but then that's a storytelling problem. In-world, the Catalyst has a longer perspective than we do, and we can't reasonably claim it's wrong without checking its data against ours. "We don't like it" does not equal "It must be wrong".

 

Remaining static is not an invalid result of rational thought. The idea of advancement is attractive to us, but we don't know if it's feasible for long. It's quite possible that civilizations static at some level of technology would be able to survive for very much longer without destroying themselves than those that advance at a more than infinitesimal rate. In fact, that appears rather likely to me. Keeping things static is rational if your simulations show that advancing beyond a certain point leads to disaster and you don't know how to prevent it. 

 

Yet again, the Catalyst's scenario is not at all implausible as such. I could even make a RL argument for the same thing, using some common assumptions about what AI will be capable of. The problem is the narrative dissonance with the story that came before, which tells us we can co-exist peacefully with synthetic civilizations (or at least as peacefully as with other organic civilizations).

 

I don't disagree with you in general, but that's also exactly it. The narrative shows us this isn't the case. The Catalyst may claim that it is "inevitable" and that it "always" happens all it wants, but it still looks like hogwash when we have proof against it. On one side we have hard evidence that goes against what the Catalyst says, and on the other we have the enemy claiming implausible things without any proof. The Catalyst is also shown to be fallible by completely missing the fact that the Crucible plans still existed (despite people dealing with them on the Citadel), by the Leviathan escaping its grasp, and by other things. It does not know everything.

 

Another thing entirely is that if wiping out the very thing it was programmed to protect, and setting that solution on repeat until the end of time, is the best idea it could come up with, it looks incredibly stupid at the very least. And that's why being static is undesirable in this case: you'd expect it would eventually come up with a better solution that doesn't include destroying what it is supposed to watch over. It was supposed to prevent total extinction, but it largely causes extinction itself.

 

Let's say there's room for improvement. When there's room for improvement, it means being static is unwelcome.


  • Eryri and KrrKs like this

#1245
rossler
  • Members
  • 648 posts

Reapers are kind of like this.

 



#1246
themikefest
  • Members
  • 21,613 posts

Actually, that would be Control. Shoot the tube is magnetizing the hard drive :)

Call it control if you want. I call it destroy.



#1247
Quarian Master Race
  • Members
  • 5,440 posts

I'm somewhat torn on this issue. On one hand, I wish Shepard to question the scenario and have an extended debate on it, on the other, I don't see how it could be written in a satisfying way if you end up being stuck with the choices anyway. Also, I can easily see how such a scenario could be plausible on my own, in spite of the narrative dissonance caused by being able to make peace on Rannoch. Maybe not being able to question the scenario would've been acceptable if it weren't for that.    

As dumb a choice as it was, there wasn't really any narrative dissonance caused by the "peace" on Rannoch. The quarians are threatened with extinction and forced to submit or die. They willingly give up self-determination over even so much as their own planet and are left little better than slaves to their creations, à la the Zha'til, dependent playthings for whatever it is the geth decide to do with them (which could easily include simply getting rid of them when the Reapers are gone and they've thus outlived their usefulness, something the geth seem to have no problem doing anyway under less than ideal conditions). The Catalyst could shut you down in two seconds by pointing this out. It isn't "peace", it's forced submission, repression and enslavement, albeit no worse than what the Reapers plan to do.

The problem is that, like with all the transhumanist garbage in this setting, smiles are painted on it and the negatives are whitewashed or ignored. Only token skepticism is displayed by Javik and potentially Shepard (if you take the Renegade option when Tali starts to talk about the geth acquiring control of the quarians' envirosuits), and that the picture isn't so rosy is only hinted at in a couple of places, namely Admiral Raan's email describing how geth and quarian forces are being segregated to avoid violent incidents, as well as what happens in the Control ending (there is no cooperation to be seen, the quarians are still wearing envirosuits, and presumably the only thing stopping more conflict is Reaper Stalin). The fact that maintaining this imperial "peace" is going to require mass repression and the elimination of huge numbers of people who hold viewpoints like Xen's, Gerrel's or Rael'Zorah's until the end of time, and likewise for those with "heretic" viewpoints among the geth - a compromise that is neither practical nor ethical - is entirely ignored.

It seems the writers were at least aware that the option could too easily be perceived through rose-tinted specs. Cut content surrounding Daro'Xen trying to rebel against geth enslavement suggests this:
http://forum.bioware...ntly-cut-quest/

but I'm guessing it was cut because the alternative when the geth are eliminated (where Xen reactivates destroyed geth platforms to use as weapons rather than hacking "allies") makes little sense for a player such as myself to oppose, and good thing too. It'd have been dumb to be forced to arrest or kill someone doing extremely useful work for the war effort (she even manages to gain limited control of previously untouchable Citadel systems) that I have zero ethical problems with.



#1248
KrrKs
  • Members
  • 863 posts

Who told you they are saving humanity, huh? They are saving organic life. To them organic life is the same whether it's turians, humans, batarians, etc.

Obviously it is not the same, as humans again get the special treatment of being goo-ified, while turians, batarians, and whatnot are merely eradicated or turned into disposable shocktroops. There is no ingame evidence that any of those species are harvested/'preserved'. Instead the statements are 'one species per Reaper'/'one Reaper per cycle'. There is the (slight) probability that they are turned into (lesser) reaper destroyers after the majority of their species has died, but this still goes against the original statement that all life is the same and preserved. (And again, I'd like to point out those planetary descriptions, stating that bronze age civilizations were systematically eradicated)

 

True, but the problem is that organics will always build synthetics, so if you allow organic civilizations to continue, you'd need a permanent watch rather than an intervention every 50k years, maybe even some kind of AI god to keep those civilizations in line. Better than Reaperization, yes, but this scenario has its own set of problems.

While this does have some problems, the (current cycle) Reapers seem remarkably well equipped for that task...

 

The AI was created to solve the problem that kept happening with its thralls. To that point, the AI on its own studied the problem and found the repeating pattern. It created the Reapers because all other methods to solve the problem had failed. Thus the Reapers were created to protect organic life.

Several misconceptions.

- The AI was created to solve the problem of missing tribute for the Leviathans.

- The AI obviously did not find a pattern of all life in the galaxy being killed by synthetics. IF there were such a pattern (before), there would be no thralls, no Leviathans and so no AI.

- The AI states that it tried synthesis attempts before (Shepard's arrival), not that it tried other solutions before building the Reapers.

 

>>Claiming development even if it leads to the death of trillions of lives is an easy claim to make now. [...]
Fallout's universe [...] Pre-War they were more obsessed with "can I make this" than "should I make this". Pushing on without regard [...] led to conflict [...] over resources to keep their society going. The end result was worldwide nuclear holocaust<<

 

I fail to see what a conflict about resources has to do with development/progress vs. stagnation or caution. (However you want to put it)

A resource conflict is a conflict about limited resources! (Well, d'oh) Nothing more, nothing less. It can/will happen regardless of the mindset of the groups involved.

 

>>If we apply this logic to use in the present, continuing development regardless of possible consequences, the only thing we on this planet will be able to do is poison it to the point where it can't support life anymore. [...]<<

 

This has nothing to do with development or not, but with the mindset of 'profit now' (vs. 'profit later/longer').

 

>>Do you know why everyone isn't running full steam ahead with nanomachine development, even though it has great potential to do good? The grey goo apocalypse is why. They advance slowly with it because of the same great potential for disaster.<<

 

Wow, no. Actually:

Spoiler

 

>>And in the end I do think society and scientists would prefer stagnation to advancing without thought or care for the consequences and creating and releasing something that could kill everyone.<<

No one is advocating not thinking about consequences. But if you really are this pessimistic, you should probably join the Amish.

 

The idea of allowing Shepard to question the Catalyst is actually non-trivial. What exactly would you have Shepard say?

The childish 'why?' questions are usually a pretty good start: "Why do what you do, specifically?" "Why do it that way, and not differently?" etc.

But we'd likely get again: "There is no time to explain!"

I mean, really?! The thing was sitting there for ~50k years doing nothing. It didn't even do anything when Sovereign got its remotely controlled Saren-ass handed to it.

And now it can't stop the killing spree for 5 minutes and answer some questions? :angry:

 

Also: What Reorte said above!

 

I'm going to read what was posted while typing this later...


  • Vanilka likes this

#1249
Vanilka
  • Members
  • 1,193 posts

Obviously it is not the same, as humans again get the special treatment of being goo-ified, while turians, batarians, and whatnot are merely eradicated or turned into disposable shocktroops. There is no ingame evidence that any of those species are harvested/'preserved'. Instead the statements are 'one species per Reaper'/'one Reaper per cycle'. There is the (slight) probability that they are turned into (lesser) reaper destroyers after the majority of their species has died, but this still goes against the original statement that all life is the same and preserved. (And again, I'd like to point out those planetary descriptions, stating that bronze age civilizations were systematically eradicated)

 

I'd like to quickly point out that while the Catalyst claims that they "leave the younger ones be", they also "reaperify" these...

 

tumblr_nsjazignNH1sqq5cyo8_1280.png

 

... into these...

 

tumblr_ns0suv0SvM1sqq5cyo5_1280.png

 

Harvesters do not communicate. They don't use tools. They have no technology. They display no signs of anything we could understand as civilisation. They seem to be just animals that clearly don't come anywhere near even primitive humans. Yet they get used by the Reapers to create more thralls, and that bugs me so much.

 

But the Reapers are not interested in war. They just slaughter animals and put guns on their mutated corpses.  ^_^ Isn't all that life preservation fun?


  • Natureguy85, KrrKs, Ithurael and 1 other like this

#1250
MrFob
  • Members
  • 5,413 posts

Remaining static is not an invalid result of rational thought. The idea of advancement is attractive to us, but we don't know if it's feasible for long. It's quite possible that civilizations static at some level of technology would be able to survive for very much longer without destroying themselves than those that advance at a more than infinitesimal rate. In fact, that appears rather likely to me. Keeping things static is rational if your simulations show that advancing beyond a certain point leads to disaster and you don't know how to prevent it. 

I question this statement. There are lots of definitions of life, but one of the most fundamental characteristics that you will find in every one of them (at least every one I know of and could find) is the ability to change. Once you remove that characteristic in favor of a static existence, it's not life anymore (actually, a completely static existence is not possible for anything in our universe as far as I know, but that's on a different scale).

Also, in a society of individuals like ours, the question is moot in the first place because the suppression of progress would require absolute external control (like the Reapers). It cannot come from within. The reason is actually very well summarized by Mordin when he states that development is overcoming problems. As long as we are confronted with problems - and it doesn't matter if they are practical (e.g. someone has the drive to cure cancer because his daughter suffers from it) or theoretical (someone has the problem of not knowing what happened at/before the Big Bang, so they advance the field of cosmology) - we as a civilization will strive to solve them. Our whole existence is based on the concept of promoting change to fulfill our needs; our brains are literally hardwired for this. Of course, we as a society can try to manage it, steer it in some limited form, but we cannot prevent it.

The same goes for life as a whole: change for the better within the current environment is what describes evolution.

 

The idea of advancement may be rejected by individuals but not by life as a whole. We don't know where it will lead us, but one thing is for sure: a static existence is the very definition of death.


  • KrrKs and Vanilka like this