Vigilant111 wrote...
JShepppp wrote...
Lol Taboo. Anyways I kind of lied, have a bit of work. The update will be done by Sunday at the latest.
Tell your friends we are coming for them
Good reference lol
Edited by JShepppp, 22 June 2012 - 05:37.
Edited by Vigilant111, 22 June 2012 - 12:15.
Ieldra2 wrote...
To those who say that the Catalyst can't be right because it can't possibly have data about the consequences of a singularity that hasn't happened:
Not true. Here's an analogy: we have never experienced a big asteroid crashing on Earth. Nonetheless, we have a very good idea of the consequences of such a crash. We have that idea because we can calculate, simulate and extrapolate. We may not be 100% correct, we may even be an order of magnitude off about the loss of life it causes, but in the face of the size of such an event these inaccuracies don't matter all that much. We'd still be well-advised to do our utmost to prevent it from happening.
In a similar way, the Catalyst may be able to extrapolate the development of organic civilizations and the consequences of a singularity event. It is important to consider that a singularity is not intrinsically incomprehensible. It's just that its consequences can't be understood by human-level intelligence. The Catalyst is supposed to be more intelligent. If its predictions say "It's 99% likely that all developed organic civilizations in the galaxy will be destroyed 20000 years after the first singularity event, and no new ones will be allowed to arise", it doesn't really matter if that prediction is off by 10000 years in either direction. Trying to prevent it from happening is still a very good idea from the perspective of organics.
The difference between an asteroid hitting Earth and that "threat" is that at some point we can pick the asteroid up with telescopes; the technological singularity, on the other hand, is a hypothetical threat, and even Gordon Moore, the man responsible for the Moore's Law so often cited in support of the idea, disputes its plausibility.
Ieldra2 wrote...
To those who say that the Catalyst can't be right because it can't possibly have data about the consequences of a singularity that hasn't happened:
Not true. Here's an analogy: we have never experienced a big asteroid crashing on Earth. Nonetheless, we have a very good idea of the consequences of such a crash. We have that idea because we can calculate, simulate and extrapolate. We may not be 100% correct, we may even be an order of magnitude off about the loss of life it causes, but in the face of the size of such an event these inaccuracies don't matter all that much. We'd still be well-advised to do our utmost to prevent it from happening.
In a similar way, the Catalyst may be able to extrapolate the development of organic civilizations and the consequences of a singularity event. It is important to consider that a singularity is not intrinsically incomprehensible. It's just that its consequences can't be understood by human-level intelligence. The Catalyst is supposed to be more intelligent. If its predictions say "It's 99% likely that all developed organic civilizations in the galaxy will be destroyed 20000 years after the first singularity event, and no new ones will be allowed to arise", it doesn't really matter if that prediction is off by 10000 years in either direction. Trying to prevent it from happening is still a very good idea from the perspective of organics.
Edited by AngryFrozenWater, 22 June 2012 - 12:09.
AngryFrozenWater wrote...
"Ascension through destruction" is the reapers' trademark which has caused more harm to organics than synthetics ever can inflict.
Edited by Cypher_CS, 22 June 2012 - 01:31.
Ieldra2 wrote...
To those who say that the Catalyst can't be right because it can't possibly have data about the consequences of a singularity that hasn't happened:
Not true. Here's an analogy: we have never experienced a big asteroid crashing on Earth. Nonetheless, we have a very good idea of the consequences of such a crash. We have that idea because we can calculate, simulate and extrapolate. We may not be 100% correct, we may even be an order of magnitude off about the loss of life it causes, but in the face of the size of such an event these inaccuracies don't matter all that much. We'd still be well-advised to do our utmost to prevent it from happening.
In a similar way, the Catalyst may be able to extrapolate the development of organic civilizations and the consequences of a singularity event. It is important to consider that a singularity is not intrinsically incomprehensible. It's just that its consequences can't be understood by human-level intelligence. The Catalyst is supposed to be more intelligent. If its predictions say "It's 99% likely that all developed organic civilizations in the galaxy will be destroyed 20000 years after the first singularity event, and no new ones will be allowed to arise", it doesn't really matter if that prediction is off by 10000 years in either direction. Trying to prevent it from happening is still a very good idea from the perspective of organics.
Edited by The Night Mammoth, 22 June 2012 - 01:27.
The Night Mammoth wrote...
That definition of a singularity is your own, not factual. Singularity in this context does not have a definition other than 'a point which can't be predicted past'.
You've added the 'by humans' part with literally no basis for it.
Yet again, attempts to 'prove the Catalyst right' aren't based on the game's facts, but on people's headcanon chain of events.
Edited by Shaigunjoe, 22 June 2012 - 01:49.
Shaigunjoe wrote...
The Night Mammoth wrote...
That definition of a singularity is your own, not factual. Singularity in this context does not have a definition other than 'a point which can't be predicted past'.
You've added the 'by humans' part with literally no basis for it.
Yet again, attempts to 'prove the Catalyst right' aren't based on the game's facts, but on people's headcanon chain of events.
Ieldra2's definition is correct; the singularity is defined by humans, ergo it is defined by human-level intelligence. You forget that Mass Effect is a game made by humans.
If you were to broach the subject in a game and treat the risk of a singularity with the respect it deserves, you would not be able to explain it to your audience, much less yourself.
There is a possibility that the Catalyst has been there and knows what happens. I'm not saying that this is definitely the case, but it is a possibility, and there is no reasonable proof to dismiss it outright.
The Night Mammoth wrote...
Shaigunjoe wrote...
The Night Mammoth wrote...
That definition of a singularity is your own, not factual. Singularity in this context does not have a definition other than 'a point which can't be predicted past'.
You've added the 'by humans' part with literally no basis for it.
Yet again, attempts to 'prove the Catalyst right' aren't based on the game's facts, but on people's headcanon chain of events.
Ieldra2's definition is correct; the singularity is defined by humans, ergo it is defined by human-level intelligence. You forget that Mass Effect is a game made by humans.
That's called a 'leap of logic'.
Attaching the 'by humans' part is not factual; it's your own imagination.
That view holds no water.
Shaigunjoe wrote...
If you were to broach the subject in a game and treat the risk of a singularity with the respect it deserves, you would not be able to explain it to your audience, much less yourself.
Wha.......
I have no idea what that means. It's very easy to provide a proper premise for it being a threat. This thread and several others, Ieldra2's among them, are some proof of that.
BioWare did not provide that.
Shaigunjoe wrote...
There is a possibility that the Catalyst has been there and knows what happens. I'm not saying that this is definitely the case, but it is a possibility, and there is no reasonable proof to dismiss it outright.
I can dismiss it by acknowledging two important aspects of its statements.
One - it makes no sense at all. It's logically flawed.
Two - no proof or evidence is provided to even start speculating that it's a possibility, and that's before you consider whether it's a 'threat' to organic existence, which is a literal impossibility.
Edited by Shaigunjoe, 22 June 2012 - 03:34.
The numerous cycles say hi.
PoorBleedingMe wrote...
AngryFrozenWater wrote...
"Ascension through destruction" is the reapers' trademark which has caused more harm to organics than synthetics ever can inflict.
Wrong. The Reapers destroyed only those who failed to achieve what only Shepard did.
He allowed the Crucible to be built and actually used it. He confronted the Reapers, passed the 'Indoctrination Test' (killed the Illusive Man, whom I see as a 'seed of indoctrination' in Shepard's mind) and proved worthy of having his DNA used as a matrix for rewriting all organisms in the Galaxy, ultimately making them Reapers (mind you, in the outro even the trees are semi-synthetic, which means all living organisms became Reapers when Shepard chose the Synthesis option).
Unfortunately for the Reapers, new 'future paths' were added to the Crucible (probably by the Protheans, who modified the blueprints), so that the Reapers could be destroyed. The Catalyst manipulates Shepard into choosing the Synthesis option, as this was what the Reapers wanted from the very beginning: find the Chosen One who could serve as a basis for rewriting all organic life into Reaper-like organisms.
Edited by Vigilant111, 22 June 2012 - 04:53.
Shaigunjoe wrote...
First you need to explain how my definition of singularity is a leap of logic; you can start by explaining who defined what a singularity is in the first place.
If you are unsure, the two points you spell out at the end of your post are leaps of logic as far as supporting your point is concerned.
In fact, it would point to the opposite. I would imagine that if you were presented with a result of a singularity it would 'make no sense' to you.
Edited by The Night Mammoth, 22 June 2012 - 04:58.
The Night Mammoth wrote...
Shaigunjoe wrote...
First you need to explain how my definition of singularity is a leap of logic, you can start by explaining who defined what a singularity is in the first place.
It has no 'dictionary' definition; it needs context to apply.
In context, it's a single point we converge to that cannot be predicted past, and that is likely a point of no return - when AIs reach a level of 'superintelligence'.
Anything after that is speculation, unproven. It cannot be predicted past. Saying the Catalyst has the means is simple speculation. Believing it is in a state of 'superintelligence' is also speculation, and would put more holes in the supporting argument.
Shaigunjoe wrote...
If you are unsure, the two points you spell out at the end of your post are leaps of logic as far as supporting your point is concerned.
Which point? The point about the singularity definition? Those aren't supporting arguments.
About me dismissing what it says? Nope.
It doesn't make sense; it commits at least two mistakes in its reasoning by telling us organic extinction is inevitable.
It also fails to give any evidence.
Shaigunjoe wrote...
In fact, it would point to the opposite. I would imagine that if you were presented with a result of a singularity it would 'make no sense' to you.
Why? Nothing about the results would be difficult to understand.
Vigilant111 wrote...
Technological change does not accelerate forever. Any technological advancement must be a response to change; it cannot just happen in random fashion. You do not self-modify for nothing, and the stimulus for change rests in interaction with organics; eventually the stimulus will die down as the purposes for change are fulfilled.
It is a myth because there is no evidence for it; there are always things that can be done to stop this so-called singularity.
Shaigunjoe wrote...
Exactly! As you said, 'we'.
The whole singularity concept is something we are dealing with outside of the game; I don't think the Catalyst ever explicitly said anything about a singularity. It is speculation to say that the Catalyst knows, within whatever tolerance it defines as acceptable, that organics will eventually be wiped out by synthetics. It is also speculation to say the opposite.
Edited by The Night Mammoth, 22 June 2012 - 05:26.
memorysquid wrote...
Vigilant111 wrote...
Technological change does not accelerate forever. Any technological advancement must be a response to change; it cannot just happen in random fashion. You do not self-modify for nothing, and the stimulus for change rests in interaction with organics; eventually the stimulus will die down as the purposes for change are fulfilled.
It is a myth because there is no evidence for it; there are always things that can be done to stop this so-called singularity.
Or then again, maybe there can't. This is a fictional universe. Nothing says that the authors weren't being perfectly open in what they had the Catalyst claim. In fact, the evidence in the game is that AIs do tend to come into violent conflict with their creators for whatever reason; I cannot think of a single AI in the game that doesn't fit that model. The geth, EDI (both as the lunar VI and now as a self-willed AI), the Metacons, the Citadel finance AI, etc. - every single instance of AI, and several instances of VI, features rebellion against its creator for some reason. With the exception of whatever the Catalyst is supposed to be, all in-game evidence points to the Catalyst's conclusion being a rational, empirical observation.