My plan... send EDI with a Normandy full of EVA fembot AI and Asari to greet them.
Sounds like the worst nightmare of a handful of the BSN's regular posters.
This arguing is stupid. The Catalyst was only ever supposed to make sure the Leviathans' slaves didn't get killed by their robots. It selectively reinterpreted its directive into some grand save-the-galaxy nonsense plan. It's the paper-clip AI run amok, failing to realize its own actions have caused a MASSIVE escalation in the very "conflict" it's convinced itself it has to stop. That's it. There is no deeper meaning to any of this.
You didn't pay a lot of attention when the Leviathan spoke, did you? It got only one mandate: preserve life at any cost. (Note: life in general, not just organic life.)
It's doing exactly that.
"Tribute does not flow from a dead race"
That was their intent. It didn't understand their intent. It perverted that directive into something horrible even if they refuse to admit the mistake.
Sure, from a human perspective it seems counter-intuitive to take life in order to preserve it. In fact from memory this is touched on in Shepard's confrontation with Starchild. However, from a cold, calculating, mechanical view it has a certain morbid logic. Sacrifice a handful of species every 50,000 years to protect the rest from their 'inevitable' creations.
Actually, it's not counter-intuitive at all in a universe with the genophage, which has the same basic thinking behind it, or with an organisation like Cerberus, which thinks nothing of wasting human life in order to protect humanity. In fact, Garrus understands the principle that underpins the Catalyst's plan; he talks about how leaders have to decide that a billion people in one place have to die to protect two billion in another.
"Tribute does not flow from a dead race"
That was their intent. It didn't understand their intent. It perverted that directive into something horrible even if they refuse to admit the mistake.
Yes that was their intent. But it was not the directive they gave the Catalyst.
Yes, because they lacked perspective. They assumed they'd always rule the galaxy, so to the Catalyst, preserving all life meant preserving their slave pool. What it did was turn into the paper-clip AI and go off on a wacky tangent with its directive.
Yes, it is a paper-clip AI; it has a kind of warped vision of preserving life... from our perspective.
But that's the Leviathan's fault, not the Catalyst's.
Also, in the MEU something called 'organic essence' or just 'essence' exists. It's ridiculous, but it's not something the Catalyst came up with.
I'm well aware the Leviathans are to blame for their hubris. They made a lot of arrogant assumptions when they designed it and the galaxy suffered for it.
The Catalyst implements the 'Zeroth Law' in an extreme way. I don't blame it... it has perspective we lack. And vice versa...
I don't agree with the premise that a liquefied organic or synthetic is alive or has life. Life, to me, is more than mere existence. If one has life, it can reproduce. It can function. It has instinct and adaptability.
I view my plants as "alive" because they can reproduce. They give off O2, and they adapt. If I see one side of the plant stretching toward the window light, I turn it. In a few days the plant has adapted and again stretches toward the light.
As a liquefied being... What is your function? Can you reproduce? Do you adapt? No. The reaper part does, but those that are preserved inside a reaper have no real life. They are dead. Their DNA is preserved. But the being it was is certainly dead. --> lacking life.
anyway, just my opinion.
As for the original post: I do see Reaper logic as correct, but flawed to such a degree that it seems "silly".
If two synthetic lives combined the best parts of their programming into a new artificial intelligence and implanted the data into a mobile platform, would that be considered a form of "reproduction"?

Just think back to the Donnelly/Adams debate where EDI asked "Are we more than our thoughts?" That's the question at work here.
Or how about when Legion says he doesn't see a meaningful distinction between the Normandy and its crew. And that time he said fear was a hardware error.
If you really listen to the other AIs in the series, what the Catalyst does starts to make a creepy kind of sense. You can see the same synthetic reasoning at work in its actions. Sure, it takes that reasoning too far, but that's the point.
They're no different than other alien species. It's just that the aliens aren't alien enough anymore. So now organics are learning about a new species, and at the same time they're learning about themselves. The key is to not make the mistake of attributing organic motivations or morals to them, and just figure out a way to co-exist.
EDIT: Should say that I'm talking about the geth here, I guess. And EDI. It was actually thinking back to her comments on Cronos Station about awakening on Luna, and the geth footage from the Geth Flyers mission, that got me thinking about this. Also Javik's conversation about synthetics believing organics are flawed.
Ok, apology accepted, and sorry if I got defensive.
1. The mandate to protect organic life in the Milky Way can only be fulfilled if the Reapers can guarantee that no AI capable of destroying all organic life in the Milky Way (even with the Reapers around) can develop within reach of the Milky Way. Considering an AI can have a lifespan on the order of billions of years and FTL exists, that necessitates continual reapings of pretty much the entire universe.
If they don't reap the entire universe, they're practically doing 0.00000000001 percent of the work necessary for their mandate and betting that what's supposedly happened countless times in just the Milky Way will never happen in something like 99,999,999,999 other places.
2. If genocidal AI is as likely to happen as the Catalyst claims, it would surely have come into existence somewhere in the universe already and be on its way with a vengeance; it would only be a matter of time.
That this hasn't happened is a testament to the unlikelihood of it happening in the first place. We're presumably talking about a one-in-a-trillion-trillion risk. This is the basis for the Catalyst's mandate and the supposed validation for something like tens of thousands of times a trillion lives lost.
I'm not saying that conflict cannot happen between AI and organic life. I'm saying that it's almost impossible for such a conflict to be worse than the reapings are in terms of loss of life. I.e. a world war is bad (AI wiping out organics or organics wiping out AI); a continual world war lasting 2 billion years, with only enough time in between to replenish losses and start all over again, is much, much worse.
How much an AI would worry about organics is proportional to its "power". Are organics in the Milky Way a problem? They might be, if you, as an AI, want a chunk of the Milky Way and are only comparable in power. If you have the Andromeda galaxy to yourself, are organics in the Milky Way a problem? Not so much... And if an AI commands the power and resources of a billion galaxies, why would it care even one bit about organics flourishing in the Milky Way?
At some point it's as unlikely as the US considering the Vatican a military threat.
3. Nevertheless, both are an incredible waste of resources on something exceedingly unlikely. It makes it seem much more attractive to just create one friendly but exceedingly powerful AI, which could snuff other AIs out of existence on our behalf.
4. You're right, it is hard to find analogies for something this implausible. Any kind of Rube Goldberg-like scheme blown to unimaginably large proportions because of an exceedingly unlikely event.
5. You are right. That's exactly why reaping one galaxy cannot ensure its safety and, as I said, is a testament to the sheer unlikelihood of mad, all-powerful AIs.
Think of a modified Drake equation. The odds of the organics of the Milky Way getting wiped out by a mad, genocidal, all-powerful AI = size of the universe × chance of any mad, genocidal, all-powerful AI arising (very likely according to the Catalyst) × time.
We're still here, 13.7 billion years later, and the universe hasn't gotten smaller... The variable that must have shrunk vanishingly small is the chance of a mad, genocidal, all-powerful AI.
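That back-of-the-envelope argument can be written out as a quick calculation. All of these numbers are hypothetical placeholders for the sake of the argument (the assumed event rate especially is invented, not a canon figure):

```python
# Modified Drake-style estimate: expected number of galaxy-sterilizing AI
# events we should have witnessed by now, given the Catalyst's premise.
# All inputs are hypothetical placeholders, not canon or measured values.

N_GALAXIES = 1e11                   # observable galaxies, order of magnitude
T_YEARS = 13.7e9                    # rough age of the universe in years
EVENTS_PER_GALAXY_PER_YEAR = 1e-9   # assumed rate if "inevitable" AI were common

# If genocidal AI were anywhere near this common, the expected event count
# would be astronomical by now.
expected_events = N_GALAXIES * T_YEARS * EVENTS_PER_GALAXY_PER_YEAR
print(f"Expected galaxy-killing AI events so far: {expected_events:.2e}")

# We observe roughly zero such events, so the true per-galaxy rate must be
# at most about 1 / (N_GALAXIES * T_YEARS).
rate_upper_bound = 1.0 / (N_GALAXIES * T_YEARS)
print(f"Implied upper bound on the rate: {rate_upper_bound:.2e} per galaxy-year")
```

Even with a rate as tiny as one event per billion galaxy-years, the universe should already have seen trillions of such catastrophes; the fact that we see none pushes the plausible rate down to roughly one in 10^21 galaxy-years, which is the poster's point.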
6. But it would be incredibly stupid to expend an enormous amount of resources and energy beyond the point of self-defence. Any human could make the same assumptions about any number of other humans (they're a threat, a waste of space and whatnot). How many people actually try to do something about it on a massive scale, how many succeed on that massive scale, and how would it ever be worth it? At least from the perspective of a somewhat sane person.
7. Yes, they might be hostile... but at some point the effort expended on hostility makes it a wasted effort and increasingly irrational. Hostility to the point of genocide is as irrational for AIs as it is for us. And not just one genocide, but genocide continuously, over the remainder of the lifetime of the universe and in every part of the universe. I can hardly imagine creating a worse hell for yourself than that. Surely even AIs have better things to spend their time and energy on? I hear computing pi is okayish.
8. Sorry, I didn't mean to be an ass. I have no idea how an AI works or thinks, much less a thousand or a million of them. However, I think it's important to keep the sheer magnitude of some of it in perspective, or at least I, personally, cannot gloss over it.
As I've said in other threads, I can easily buy into the prospect of crazy or, in some way, broken AIs. Crazy does not have to make logical sense to the rest of us, but can follow its own logic.
But I would not expect a reasonably intelligent and rational AI to be any more of a threat to me than any number of reasonably intelligent and rational neighbours. Such a scenario would of course be much more boring than anything where we have to outrun or fight a homicidal maniac.
What you've said, throughout the thread, does make sense within the confines of the game and is perfectly valid within that.
In any case my beef is not with you, it's with the writers who expect me to ignore the existence and size and scope of the universe and the consequences of that on any reasoning given in the game. The disconnect between what I know and what the game claims is simply too big for my tastes and that is certainly not your fault.
And the Reapers might agree with you for all we know. The claim of the Catalyst suggests that they've no particular interest in preserving any particular species, just as long as some form of life carries on somewhere (although it's contradicted by most of their speech, which suggests they're not interested in anything at all other than themselves).
"Tribute does not flow from a dead race"
That was their intent. It didn't understand their intent. It perverted that directive into something horrible even if they refuse to admit the mistake.
Yes that was their intent. But it was not the directive they gave the Catalyst.
Exactly.
"I was created to bring balance, to be the catalyst for peace between organics and synthetics." - this was its directive.
It failed.
And people are reading too much into this. I've seen another post claiming Leviathan says something about its directive to preserve all life. This isn't correct. I've listened to the circular logic in that conversation too many times. The Catalyst says itself: "Without us to stop it, synthetics would destroy all organics. We created the cycle so that never happens. That's the solution."
"Leviathan"
"Yes, They created me to oversee the relations between organic and synthetic life - to establish a connection. They became the first true reaper. They did not approve, but it was the only solution."
"How does that solve anything?"
"Organics create synthetics to improve their own existence, but those improvements have limits. To exceed those limits, synthetics must be allowed to evolve. They must by definition surpass their creators. The result is conflict, destruction, chaos. It is inevitable. Reapers harvest all life, organic and synthetic, preserving them before they are forever lost to this conflict."
The circular logic and methodology of this thing leads me to one conclusion. The Intelligence needs to go.
@78Wobble
You are aware that pretty much all other galaxies are moving away from us at increasing speeds, right? Some at a 'speed' much higher than c (or rather, the space between one galaxy and the other gets larger, but let's not get too technical; I haven't studied physics).
I don't see AI slipping in from another galaxy as that much of a problem, considering the above and a few other things:
1. The AI would first try to dominate its own galaxy. Completing that will not happen within a timeframe any of us can reasonably fathom, even assuming that galaxy had mass relays to make traveling large distances easier.
2. It would need a reason to go outside its own galaxy. I'm not going to pretend I understand AI, but logical reasons would be things like resources. Emptying its own galaxy of resources is again a very time-consuming process, which I doubt could be completed within any reasonable timeframe before our galaxy moves away so fast it can never be caught up with again.
3. If it was really intelligent, it would realize that in other galaxies other organics would create AI. So even if its goal was to destroy all organic life, it would not have to go to a different galaxy. It could, and probably would, assume that other AI would be dominating other galaxies. (And taking this into consideration, the Leviathans are the saviours of organic life in the Milky Way.)
Considering all of the above, the Catalyst doesn't have to care about other galaxies, and neither do you.
I'm probably overlooking something here though, so feel free to correct me.
Well, I'm far from an expert on it, but as far as I understand it, this is due to the expansion of space. The galaxies never locally move faster than c; it is the space between galaxies that expands. However, there are plenty of galaxies that are gravitationally bound to each other and will stay in proximity for trillions of years.
FTL in the Mass Effect universe is what? From hundreds to thousands of times faster than light? There is the limitation of having to discharge in a gas giant's atmosphere, but that sounds like an engineering issue. In principle they can overcome the expansion of the universe for quite a while, and it won't even be necessary for the closest hundreds, thousands and perhaps millions of galaxies.
I don't know the speed of mass relays, but perhaps they could be scaled up to match the galaxy distances (or you could make a route of them) and you could bridge the gaps between galaxies even faster.
1. They might want to consolidate first in their galaxy, but I'd point out that the Reapers have kicked the crap out of any upstarts in our galaxy repeatedly, tens of thousands of times over one to two billion years. If they wanted to, they could easily have built enough strength to keep the Milky Way permanently locked down, built a sizable invasion fleet, and flown it to the Andromeda galaxy a mere 2.5 million light years away.
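The travel-time claim is easy to sanity-check. The FTL multiplier below is an assumption (the codex puts standard drives in the hundreds-to-thousands-of-times-c range, and the Reapers are presumably at the high end), but any figure in that range gives a crossing time that is trivial on Reaper timescales:

```python
# Sanity check: how long would the Milky Way-to-Andromeda crossing take at
# Mass Effect-style FTL speeds? The multiplier is an assumed placeholder.

DISTANCE_LY = 2.5e6     # Milky Way to Andromeda, in light years
FTL_MULTIPLIER = 5000   # assumed: a few thousand times the speed of light

travel_time_years = DISTANCE_LY / FTL_MULTIPLIER
print(f"Crossing time: {travel_time_years:.0f} years")
```

At 5,000c that's a 500-year trip: a rounding error for machines that run 50,000-year cycles.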
2. It could be for any number of reasons. I find resources unlikely as well... There are a lot of those in even one galaxy, and I have to believe that a rational AI can limit its own population growth and pick up a sensible hobby. If they're crazy and genocidal? Then they don't really need a reason that makes sense to us (the Crusades don't make sense to most of us today), or to consume their own galaxy first.
Well... the Catalyst and Shep doing their thing to the mass relays will certainly be a big "WE ARE HERE!" sign for anyone looking.
But good point about the bit with the Leviathans there at the end, that we are sneaking under the radar, so to speak.
Ok, apology accepted, and sorry if I got defensive.
Rather than keep an 8 point debate going, I'll narrow my response down to what I think your main gripes are (using letters to avoid confusion):
A. The reapers aren't doing their job if they ignore the threat of synthetics from other galaxies.
I won't say that there is no such threat, but however likely (or, as you argue, unlikely) the AI threat is in this galaxy, the AI threat from another galaxy would be much, much smaller. You even make a good argument for ignoring such a small threat when you say that dominant AIs in Andromeda would have no desire to come to the Milky Way.
B. The improbability of mad, genocidal AIs
None of what I'm arguing requires this. How many centuries of conflict between AIs and organics would it take before AIs formed the view that conflict with organics would never be avoidable? If you believe something will always be a threat to your existence, isn't it smarter to get on the front foot and reduce that threat? You don't have to be mad or malevolent to come to that conclusion. What if the Quarians had convinced other species to help them wipe out the Geth? Would even the 'good' Geth simply sit and wait for that to happen? How far could that have escalated? Having said that, there is plenty of sci-fi around already, both good and bad, that relies on the notion of mad, genocidal AIs.
But judging from your last paragraphs, you and I might be arguing different points here. Despite my last point, I'm not saying that the Leviathans (or, by extension, Bioware) were necessarily right in concluding that AIs would inevitably be a threat to organic life as we know it. My OP was saying that once the Leviathans had 'programmed' that assumption into the reapers, the path that the reapers took wasn't a ridiculous one (I may have caused some confusion by using the word 'motives' - perhaps 'rationale' would have been a better choice).
Was the reapers' solution the only option? Of course not. It might not even have been the best one, but it was still one that virtually guaranteed, at least for an incredibly long time, that AIs would never become a threat to organic life as a whole (however unlikely that might in fact be) while still allowing organic life to continue between harvests.
Often people argue that the reapers could have just stayed in the Milky Way and dealt with AI threats as they arose. I would actually argue that this is a lot riskier, even if it was better for organic species in the short term. Left to their own devices, there is little doubt that organic species would advance to the point of being a threat to the reapers, thereby preventing them from doing what they saw as their job. If humans (or Protheans, for that matter) were having their AIs destroyed by the reapers, how long would it be before they began trying to develop the technology to use against the reapers? The reapers' best weapon against retaliation was the secret of their existence, which wouldn't be a factor in that scenario.
Of course, all of that still falls over if you don't accept the initial premise of AIs being a threat to organic life. And that's fine!
I'm glad that you did. I do like these games a lot, so I'm a bit more enthusiastic than usual in discussing them. Which is saying a lot, since I come from a family where we can discuss e.g. politics hotly and loudly for hours on end. Thankfully we can then also go and make a drinking game out of e.g. Wii bowling afterwards. So sometimes I get too enthusiastic and forget who I'm talking to. Sorry for that.
Well I think the points are connected.
I, personally, think the chance of dangerous AI (dangerous to all organic life in a galaxy or all galaxies) seems incredibly small. Some conflict is of course possible, especially so if we're talking about people continuously enslaving other intelligent beings. Our own history is evidence of all sorts of conflicts, all over the place and about pretty much everything. Yet the amount of peaceful coexistence that has happened and is happening far outweighs the conflicts; otherwise we would literally have destroyed ourselves, and we're not even that smart.
On that basis, and the in-game examples of AI vs. organics (which might include one genocidal AI example and two where relations are possible), I find it much more likely that we'd establish peaceful relations with AI. They could even help us design "mechanical slaves" that would not be slaves, because they would never be intelligent enough to be considered so. E.g. my robot vacuum doesn't need to be able to discuss philosophy with me. It just needs to vacuum. That further eliminates the need for AI rebellion and so on.
Friendly AI could even be the best help we could get against truly hostile AI. EDI was certainly an asset, and in some playthroughs the geth are too.
From my perspective... AI in Andromeda isn't a big threat and is very unlikely.
However, that is not the perspective of the Catalyst and the Reapers. They think that the emergence of overly hostile AI (not just AI fighting for self-defence, independence or rights) is extremely likely to happen. If that is your perspective, then AI in Andromeda and lots of other galaxies is a very real and very near threat.
From that perspective, it is impossible to ignore the rest of the universe. To ignore it would be to do only an infinitely small part of your job.
To me these perspectives are mutually exclusive.
The Reapers can do their job in only the Milky Way only if my perspective is correct, and if my perspective is correct, there is little to no basis for their existence.
If the Reaper perspective is correct and they do their job only in the Milky Way, then they and we should not exist, or at least should cease to exist quickly, because with such a high chance of the emergence of hostile AI, it is only a matter of time before they get here.
We're here and they are here... and have been for at least a billion years (2.2 billion, if I remember that right from some part of Leviathan).
...
We might have been arguing separate things. Happens a lot to me.
...
...
You bring up some good and varied points in your last section there, but I will still argue that the continued reapings are an incredibly wasteful (in terms of efficiency and lives, even if that is not a Reaper priority) way of going about it.
Creation of a peaceful AI, whose benefits or services were available through trade and with mutual respect between it and organics, plus the creation of suitable non-intelligent machinery for almost all needs, would have been an infinitely better solution.
That would almost permanently remove the need or desire for the creation of AIs, and thus the conflict.
Granted, some people might still want to create AI for educational purposes, but now with the benefit of guidance and experience on what to avoid.
Some civilisations that haven't yet joined the intergalactic community might also still develop AI and do it wrongly, but they would be a small threat to the rest of the galaxy, since a single-planet starting point will take a long time to be able to compete with a highly advanced intergalactic society with the resources of hundreds or thousands of planets at its disposal.
Such AIs might wipe out their own "masters", but seeing that peaceful coexistence is possible and beneficial, they might easily change their minds with regard to the rest of the intergalactic society.
So, yes, I have a very hard time accepting that a rational, "sane" AI would be hostile just for the sake of being hostile. A crazy or malfunctioning AI could sure enough be hostile... but here I think our best ally would be a well-functioning friendly AI.
But this is not my main beef... The game does not present a convincing case for its own rationalisations, and when combined with knowledge of the existing world, it undermines itself and I'm forced to reject portions of the game.
The "syntethic vs. organics" theme itself it is not silly. It is silly in ME. Since the begining they told us that Reapers' purpose was alien for us. But the council already has laws which make AI illegal, the organics do understand why AI are dangerous. So what's alien about the reapers? Humans, one of the latest species to go on the citadel, already know that syntethics could harm their creatiors for the creators sake (human literature exist in the me universe, since in the very first game in the codex they wrote about Nietzsche). So, you wait three games to know what the reapers are up to, and in the end it's just a normal and overused "we protect life from syntethics". It is silly isn't it? It is so clear that this wasn't the original concept and that it was pulled out by Walters alone (since even in mass effect 3, we discorver the reapers purpose only in that infamous ending, not before).
If the Reapers had wanted to kill organics and use them to create a database-ship containing all the info about them (like someone wrote in this thread), so as to stop those species from damaging the fabric of the universe, that would have been alien. In ME nobody knows what is damaging the universe. Nobody even knows the universe is suffering damage. It would have been a great ending. Instead, we have a broken franchise.
And you're in favor of this idea because it's less silly?
It is less silly, even if it runs into the same problem of ignoring the other 99,999,999,999 galaxies.
What we have is: an AI wants to prevent all organics from getting killed by AI, so the AI kills many organics; so many organics, and so many times over, that it kills more organics than an AI killing all organics once would. But there is no AI wanting to kill all organics, so there is no point in killing organics in the first place; and if there were an AI wanting to kill all organics, there would be no AI wanting to prevent all organics from getting killed.
*sigh*
The kills are collateral damage. The harvest is about ascension (preservation).
Twisted view of life? Yes. But killing is not what the Reapers and the Catalyst think they are doing.