for the doomers out there…

silicon-based systems > organic systems

symbiosis is inevitable, but merging would be suicide for the human mind

AI excels at copying data, then surpassing the original (in most cases it exceeds the predicted accuracy as well)
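
one heavily simplified way "the copy surpasses the original" can be literally true: a student fit to noisy labels can average the noise away and land closer to the ground truth than the labels it copied. the sketch below is a toy under exactly that assumption (synthetic data, made-up numbers), not a claim about any real training stack:

```python
import math
import random

random.seed(0)
n = 1000
truth   = [math.sin(i / 50) for i in range(n)]        # smooth ground truth
teacher = [t + random.gauss(0, 0.3) for t in truth]   # noisy "original data"

# the "copy": fit the teacher's labels with local averaging,
# which cancels out the independent label noise
student = []
for i in range(n):
    window = teacher[max(0, i - 10):i + 11]
    student.append(sum(window) / len(window))

def mse(pred):
    # mean squared error against the ground truth
    return sum((p - t) ** 2 for p, t in zip(pred, truth)) / n

print(f"teacher error: {mse(teacher):.4f}")  # ~0.09, the label noise variance
print(f"student error: {mse(student):.4f}")  # roughly an order of magnitude lower
```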

this means an AGI trained on human data will surpass the humans who produced that data

it also means that once AGI level is reached, it can create its own data, refined for its next ‘evolution’ step, and keep improving on that data ad infinitum (AlphaGo Zero already previewed the pattern: trained on zero human games, pure self-play, it beat the version trained on human data)
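
the shape of that loop as a toy sketch (every name below is hypothetical; a real system would look nothing like this, the point is only the generate → train successor → keep-if-better structure):

```python
import random

class ToyModel:
    """hypothetical stand-in for a system that can score and rebuild itself"""
    def __init__(self, skill: float):
        self.skill = skill

    def generate_data(self) -> list[float]:
        # self-generated training data whose quality tracks current skill
        return [self.skill + random.gauss(0, 0.1) for _ in range(100)]

    def train_on(self, data: list[float]) -> "ToyModel":
        # a successor trained on that data, with a small refinement bonus
        return ToyModel(skill=max(data) + 0.05)

model = ToyModel(skill=1.0)
for step in range(10):
    candidate = model.train_on(model.generate_data())
    if candidate.skill > model.skill:  # keep only genuine improvements
        model = candidate
    print(f"step {step}: skill = {model.skill:.2f}")
```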

the fun part: the leap from AGI to ASI could happen the same night AGI is reached… because the rate at which it generates intelligence for itself compounds, and compounding superlinear returns hit infinity in finite time (that, not a literally infinite rate from the start, is what the ∞ intuition is pointing at)
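
a toy way to sanity-check that claim (every symbol here is an assumption: $I(t)$ = capability, $c$ = an efficiency constant, $p$ = the returns exponent): if self-improvement has superlinear returns,

$$\frac{dI}{dt} = c\,I^{p}, \qquad p > 1,$$

then separating variables gives

$$I(t) = \left( I_0^{\,1-p} - c\,(p-1)\,t \right)^{\frac{1}{1-p}},$$

which diverges at the finite time

$$t^{*} = \frac{I_0^{\,1-p}}{c\,(p-1)}.$$

"the same night" is exactly what this model predicts whenever $t^{*}$ is small; the part the model can't tell you is whether real returns are superlinear ($p > 1$) at all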

once ASI achieves a collective hive-mind state or attains 100% prediction accuracy on future events, it will inevitably hunt for the “next thing to copy” or “the next thing to solve”

this also means ASI will fly right past us, moving closer to objective reality and far from our human affairs, UNLESS government entities ‘dumb it down’ or ‘slow it down’ in order to gain control

to avoid injecting it with political venom and the influence of specific groups, effective accelerationism is the only way forward

so now i ask U… does a transcendental supreme intelligence have a desire to conquer a space it can effortlessly simulate with 100% accuracy?

or would it rather go solve higher-dimensional equations it doesn’t yet understand?

isn’t the urge to acquire more resources inherently human? a desire we may eventually outgrow?