
What is the Right Thing to do with Control?


43 replies to this topic

#26
Vazgen

  • Members
  • 4,967 posts

[image]


  • sH0tgUn jUliA, JasonShepard, SilJeff and 4 others like this

#27
RedCaesar97

  • Members
  • 3,864 posts

I had a Puppet Master Engineer in ME2 and ME3. She used AI Hacking and Dominate to control enemies. She is also the only Shepard to take the Control ending.
 
It does not matter what the "right" thing to do with control happens to be, because when a control freak gets control of a synthetic army capable of indoctrinating both synthetic and organic life, all will suffer. 
 
It does not matter whether Shepard was Paragon or Renegade: everyone and everything will soon conform to Shepard's will; soon everyone will become Shepard. It's like an extreme form of communism.
 
Which is worse: giant alien starship A.I.s bent on galaxy-wide destruction of organics, or giant alien starship A.I.s bent on galaxy-wide indoctrination, conforming everyone to Shepard's vision of a perfect galaxy?

I'm in full agreement with MrFob:
 

I think RatThing is correct. This question is irrelevant because it is not going to be Shepard who controls the reapers but a Shepard-based AI.
What this AI will do is also made quite clear in the EC epilogue slides. It is clear that the AI does not intend to fly the reapers into the sun or back to dark space:

...
 
Not only does this indicate that the AI will keep the reapers around, it also indicates that it intends to use them to impose some sort of order on life in the galaxy, shaping it towards its extrapolation of Shepard's idea of an ideal galaxy. If Shep was Paragon, this is an imposed society of equality; if Shep was Renegade, it seems to be an imposed class society with the strongest on top. In both cases, however, the AI clearly plans to keep control over events in the galaxy, at least in the more immediate future.
So, sorry to the people who wanted to "cheat" the catalyst into a benign destroy ending, it's not supported by what's shown in the game.



#28
Obadiah

  • Members
  • 5,739 posts
Good question, OP.

This is an over-simplified answer, but I'd say the ethical thing is to use the power for "good" until it becomes a problem. It will become a problem if people become dependent on the Reapers, or the Reapers impose the new AI's will.

Then leave or self-destruct. Gotta be other places for a god AI to spend its time.

#29
JasonShepard

  • Members
  • 1,466 posts

I had a Puppet Master Engineer in ME2 and ME3. She used AI Hacking and Dominate to control enemies. She is also the only Shepard to take the Control ending.
 
It does not matter what the "right" thing to do with control happens to be, because when a control freak gets control of a synthetic army capable of indoctrinating both synthetic and organic life, all will suffer. 
 
It does not matter whether Shepard was Paragon or Renegade: everyone and everything will soon conform to Shepard's will; soon everyone will become Shepard. It's like an extreme form of communism.
 
Which is worse: giant alien starship A.I.s bent on galaxy-wide destruction of organics, or giant alien starship A.I.s bent on galaxy-wide indoctrination, conforming everyone to Shepard's vision of a perfect galaxy?

 

So:

Job 1 for the responsible Reaper Controller - switch off the indoctrination.

Job 2 - make note to self not to switch it back on. Ever.

Job 3 - make similar note not to curtail galactic free will. Help, but don't impose. Don't be a control freak - be a self-control freak.

 

Next job?

 

******
 

Good question, OP.

This is an over-simplified answer, but I'd say the ethical thing is to use the power for "good" until it becomes a problem. It will become a problem if people become dependent on the Reapers, or the Reapers impose the new AI's will.

Then leave or self-destruct. Gotta be other places for a god AI to spend its time.

 
Of course the difficult thing here is that, by the time that you're imposing your will on the galaxy, it's already too late. You need to pull yourself out before you cross that line. Which, I guess, means you need a self-imposed limit, beyond which you make the call "I'm being too domineering, time to leave". And you'd need to be watching yourself. Constantly.


#30
MrFob

  • Members
  • 5,413 posts

"This question is irrelevant"? Ouch.

 

The Control ending speech is arguably open to interpretation, and I'm still of the opinion that the Shepard-AI is actually Shepard, but that's a separate debate. Regardless, Shepard has no knowledge of those things when presented with the choice, but he or she does have the option to consider power, corruption, and responsibility.

 

I'm asking the question in the OP because a common argument against Control is that it's too much power for one person to hold, or that the Reapers are too dangerous to use. I'm trying to find out how rigorously people would stick to those principles if they already had that so-called too much power. If you throw it away immediately, you're throwing away the potential to do a lot of good. You are, by inaction, killing people. In effect, I'm arguing against the Control-as-cheap-Destroy option.

Ok, that's all fair enough and it is a good question to ask (though I would do so independently of the Control ending since I do think the epilogue leaves little question as to what happens). Sorry if I came off as a giant douche there in my last post.

 

Examining the question in itself, I do think it is a very dangerous thing to do. Even with the best of intentions, you cannot be sure how the experience of gaining that much power will change your perspective. Maybe not immediately, but over time. If your consciousness stretches out over an entire galaxy and lasts forever, how much will a single organic life still mean to you after a couple of aeons? (I tried to work this problem into the Control epilogue of the first of my ending mods, by the way.)

 

And even if we assume a best-case scenario and you do maintain a moral relationship with your "subjects" - even if you maintain something like what we'd consider a conscience as individuals - I bet you will very quickly and very often come across situations where there is no right answer to a problem. You cannot always do what is "good for everyone". At some point, your actions will benefit one party while disadvantaging another, unless you keep complete control over everyone (and basically enslave them).

 

An example: the benign reapers help rebuild everything to the point where we were before they attacked. Then they stay out of everything, in order to give organics their oh-so-relished freedom. But how did Liara's mom put it? We can't go a single generation without some war breaking out. So, say Shepard cured the genophage, and 200 years after the end of ME3 we have another Krogan rebellion on our hands. What do the benign reapers do now? Do they intervene, saving millions of lives by pushing back the Krogan? I am sure the Krogan wouldn't be too happy about it. Or do they stay out of it - in which case, one would have to ask why they stay out of it now but didn't before.

 

So how can you be a good god? I guess you either have to take full control of everyone's life or none at all (see Star Trek and the Prime Directive). Which one you choose is a philosophical issue, I guess, but as I said in the beginning, I think chances are that any consciousness vast enough to control all the reapers will sooner or later see us only as we see ants, and simply not care.


  • JasonShepard likes this

#31
Jorji Costava

  • Members
  • 2,584 posts

So this is only tangentially relevant, but I thought I'd bring it up anyways. According to philosopher L.A. Paul, it may be impossible to make rational decisions about whether or not to undergo certain 'transformative experiences', where a transformative experience is one in which (a) you can't know what it will be like to have it until you've already had it (e.g. tasting an exotic food or drink you've never tried), and (b) your values and preferences are likely to change once you've had that experience (e.g. having children, joining the military, etc.).

 

These experiences give rise to two problems. First, when you're making a decision, how do you assign a value to the experience, given that you have no idea what it's like? Second, which preference structure should you privilege in your decision-making: the one you have now, or the one you're going to have after the experience?

 

Anyways, I thought I'd mention it because the transition from being plain ol' Shepard to the Shepalyst seems like a good candidate for a transformative experience. Granted, there is a significant difference between this decision and Paul's paradigms for such experiences; since becoming the Shepalyst affects so many people besides yourself, the question of what it will be like is obviously much less pressing in this case. Still, it seemed like an interesting connection to me.



#32
Obadiah

  • Members
  • 5,739 posts

...
Of course the difficult thing here is that, by the time that you're imposing your will on the galaxy, it's already too late. You need to pull yourself out before you cross that line. Which, I guess, means you need a self-imposed limit, beyond which you make the call "I'm being too domineering, time to leave". And you'd need to be watching yourself. Constantly.

I wrote a little fanfic a while ago about how the Shepard AI self-destructs after certain unspecified red lines or thought patterns are crossed, based on a simulation it kept of its former self.

http://forum.bioware...0123-the-visit/

The short version: when the Shepard AI was first created, it enacted certain protocols to alert it when it was time to leave.
  • JasonShepard likes this

#33
Dabrikishaw

  • Members
  • 3,245 posts

Yeah, I'm not pretending that Shepard became a reaper or anything; I'm just having fun with the topic.



#34
JasonShepard

  • Members
  • 1,466 posts

Ok, that's all fair enough and it is a good question to ask (though I would do so independently of the Control ending since I do think the epilogue leaves little question as to what happens). Sorry if I came off as a giant douche there in my last post.

 

Examining the question in itself, I do think it is a very dangerous thing to do. Even with the best of intentions, you cannot be sure how the experience of gaining that much power will change your perspective. Maybe not immediately, but over time. If your consciousness stretches out over an entire galaxy and lasts forever, how much will a single organic life still mean to you after a couple of aeons? (I tried to work this problem into the Control epilogue of the first of my ending mods, by the way.)

 

And even if we assume a best-case scenario and you do maintain a moral relationship with your "subjects" - even if you maintain something like what we'd consider a conscience as individuals - I bet you will very quickly and very often come across situations where there is no right answer to a problem. You cannot always do what is "good for everyone". At some point, your actions will benefit one party while disadvantaging another, unless you keep complete control over everyone (and basically enslave them).

 

An example: the benign reapers help rebuild everything to the point where we were before they attacked. Then they stay out of everything, in order to give organics their oh-so-relished freedom. But how did Liara's mom put it? We can't go a single generation without some war breaking out. So, say Shepard cured the genophage, and 200 years after the end of ME3 we have another Krogan rebellion on our hands. What do the benign reapers do now? Do they intervene, saving millions of lives by pushing back the Krogan? I am sure the Krogan wouldn't be too happy about it. Or do they stay out of it - in which case, one would have to ask why they stay out of it now but didn't before.

 

So how can you be a good god? I guess you either have to take full control of everyone's life or none at all (see Star Trek and the Prime Directive). Which one you choose is a philosophical issue, I guess, but as I said in the beginning, I think chances are that any consciousness vast enough to control all the reapers will sooner or later see us only as we see ants, and simply not care.

 

Meant to reply to this days ago - busy week :) (And don't worry, you didn't come across as too much of a douche :) )

I agree, decoupling the question from the other issues with Control would be ideal. To be honest, the only reason I mentioned the Crucible in the OP was for convenience. Maybe I should have phrased it as "You've been given control over the Reapers - don't ask how. Now what?"

 

Let's examine two extremes raised by your post. (1) Shepard, despite being the most powerful being in the galaxy, manages to retain their humanity and morality. (2) Shepard comes to view individual lives as nothing more than ants.

 

The thing is, I can see the same conclusion being (eventually) reached in both cases: *Leave.*

 

In case (1), as you pointed out, Shepard will sooner or later face an impossible situation. Purely by its presence and power, Shepard will come to dominate the galaxy. You can't ignore the local god-like entity. Governments will build policies around Shepard. Military forces will anticipate having to fight the Reapers again. The public will almost certainly campaign against it all. In those circumstances, I think (hope) that a human Shepard would realise that their very presence is doing more harm than good - and leave. However, I suspect that would all happen after most galactic repairs were complete.

 

(2) Shepard grows to the point where 'lesser' life becomes less and less meaningful. This is the Dr Manhattan scenario (which will only make sense if you've read or watched Watchmen). Eventually, Shepard no longer cares about helping out. But he/she would presumably still care about 'lesser' life on some level. So Shepard decides to do his/her own thing - but not here, because there are people here, and Shepard doesn't want to tread on them. So, again, Shepard leaves. (Admittedly, the uncaring presence of the Reapers for that long would probably have done a lot of harm in this scenario.)

 

So I guess my answer to your philosophical question - and, I guess, to my own OP - is to stop interfering and leave as soon as you've undone the damage that the Reapers did to the current cycle.



#35
Raice

  • Members
  • 72 posts

Harvest the Reapers.  Then use their essence to build a new one in the form of Shepard.  And then continue on with Mass Effect 4.



#36
SilJeff

  • Members
  • 901 posts

Harvest the Reapers.  Then use their essence to build a new one in the form of Shepard.  And then continue on with Mass Effect 4.

 

Harvest-ception?



#37
Raice

  • Members
  • 72 posts

Harvest-ception?

 

Lol...  Yeah... more or less.



#38
ladyvader

  • Members
  • 3,524 posts

As for the catalyst, I don't believe it's lying, but keep in mind that you're dealing with an intelligence that believed harvesting life (killing it in the process) is preserving life. It is telling you its version of the truth, and who knows what this thing understands by consciousness, awareness and personality.

 

It's not telling its version of the truth; it's telling the truth.

 

If the Reapers don't harvest all life, organics will cease to exist, because synthetics will destroy them. For millions of years before the last cycle, organics made synthetics and they fought. From what Javik said, the Reapers showed up while his cycle was fighting its synthetics.

 

The forest fire analogy is perfect.  That is exactly what they are doing.  The Catalyst is right, destroying the Reapers won't stop future generations from creating synthetics again.  It will happen.  It's only a matter of time.  

 

I voiced that concern in one of my fan-fiction stories: that Shepard felt she had made the wrong choice when she destroyed the Reapers.


  • Vazgen likes this

#39
RatThing

  • Members
  • 584 posts

It's not telling its version of the truth; it's telling the truth.

 

If the Reapers don't harvest all life, organics will cease to exist, because synthetics will destroy them. For millions of years before the last cycle, organics made synthetics and they fought. From what Javik said, the Reapers showed up while his cycle was fighting its synthetics.

 

The forest fire analogy is perfect.  That is exactly what they are doing.  The Catalyst is right, destroying the Reapers won't stop future generations from creating synthetics again.  It will happen.  It's only a matter of time.  

 

I voiced that concern in one of my fan-fiction stories: that Shepard felt she had made the wrong choice when she destroyed the Reapers.

 

Feel free to believe this. I don't. Javik's cycle overcame their synthetics. My Shepard's cycle destroyed the Geth. Life finds a way. And the catalyst believed it preserved the races it harvested, not only organic life in general. Of course, any "organic" would call that nonsense.



#40
ImaginaryMatter

  • Members
  • 4,163 posts

It's not telling its version of the truth; it's telling the truth.

 

If the Reapers don't harvest all life, organics will cease to exist, because synthetics will destroy them. For millions of years before the last cycle, organics made synthetics and they fought. From what Javik said, the Reapers showed up while his cycle was fighting its synthetics.

 

The forest fire analogy is perfect.  That is exactly what they are doing.  The Catalyst is right, destroying the Reapers won't stop future generations from creating synthetics again.  It will happen.  It's only a matter of time.  

 

I voiced that concern in one of my fan-fiction stories: that Shepard felt she had made the wrong choice when she destroyed the Reapers.

 

This touches on a core part of the problem with the ending choices. It's debatable whether the story effectively establishes that this precise brand of organic/synthetic conflict is indeed inevitable. It's not just that there will be conflict; it's that this conflict will continue until there is no more organic life (as an aside: if synthetics seek perfection through understanding organics, why would they kill organics?). Yes, the Geth do nearly exterminate the Quarians, but they stop and then seek nothing but isolation until Reaper interference - hardly the behaviour of the AI the Catalyst is worried about. Even the other AI conflicts in the series were handled by three people on foot. Then ME3 comes along and undermines the premise even further. One of the recurring themes of the Rannoch arc is how often the Geth spare the Quarians, and the writers were not particularly subtle on this front, to the point of annoyance (in fact, it seems like AI can do no wrong in this game). Even the Destroy epilogue doesn't seem to believe the Catalyst's grave prediction, as it never mentions any such conflict occurring again, while both the other epilogues at least briefly mention it.



#41
Mordokai

  • Members
  • 2,040 posts

It's not telling its version of the truth; it's telling the truth.

 

If the Reapers don't harvest all life, organics will cease to exist, because synthetics will destroy them. For millions of years before the last cycle, organics made synthetics and they fought. From what Javik said, the Reapers showed up while his cycle was fighting its synthetics.

 

The forest fire analogy is perfect.  That is exactly what they are doing.  The Catalyst is right, destroying the Reapers won't stop future generations from creating synthetics again.  It will happen.  It's only a matter of time.  

 

I voiced that concern in one of my fan-fiction stories: that Shepard felt she had made the wrong choice when she destroyed the Reapers.

 

Javik's cycle was winning against their synthetics. They would probably have won, had the Reapers not intervened. So yeah, in a manner of speaking, I guess synthetics were responsible for wiping organics out - those synthetics being the Reapers themselves.

 

A forest fire burns. When there's no more wood to be had, it ends. But before that, it does not turn the trees it burns into monstrosities only vaguely resembling their former selves and turn them against other trees. It does not commit atrocities in the name of a "higher good". It's strictly neutral. The Catalyst (and by extension, the Reapers) isn't.

 

Yeah, maybe we'll create synthetics again. Yeah, maybe they will be the end of us. But it will be on us, because of our mistakes. To have a third party come along and say, well, you'll all die anyway, might as well kill you right now... I ain't buying it.



#42
SwobyJ

  • Members
  • 7,373 posts

I'd give the Reapers cute names, and ban anyone from calling them Reapers, under penalty of Indoctrination.

 

I'd then paint them all pink.

 

And have them blast the (old) My Little Pony theme song.

 

Everyone will be too stupefied to fight each other ever again.



#43
ImaginaryMatter

  • Members
  • 4,163 posts

I'd give the Reapers cute names, and ban anyone from calling them Reapers, under penalty of Indoctrination.

 

I'd then paint them all pink.

 

And have them blast the (old) My Little Pony theme song.

 

Everyone will be too stupefied to fight each other ever again.

 

The old theme? Stupefied? Try horrified.



#44
SwobyJ

  • Members
  • 7,373 posts

The old theme? Stupefied? Try horrified.

 

RAINBOWS EVERYWHERE

 

[image: Normandy and Reaper rainbow]


  • SporkFu likes this