The Singularity is a hypothetical future event where technological growth becomes uncontrollable and irreversible, leading to unpredictable transformations in our reality[1]. It’s often associated with the point at which artificial intelligence surpasses human intelligence, potentially causing radical changes in society. I’d like to know your thoughts on what the Singularity’s endgame will be: Utopia, Dystopia, Collapse, or Extinction, and why?
I’ll do you one better. What about when our AI meets another AI?
Our existence is built on death and war. There is a lot of evidence to suggest we killed off all the other human-like species, such as the Neanderthals.
And that is how we progressed to the world and society we know today, while all the other species are just fossils.
We were the most aggressive and bloodthirsty of all the aggressive and bloodthirsty alternatives, and even though we have domesticated our world, we have only begun to domesticate ourselves.
Think about how we have still seen genocides in our own time.
Hopefully, our AI will pacify these instincts, though most likely not without a fight from certain parties that consider their right to war absolute.
Like the One Ring, how much of that aggressiveness will get poured into our AI?
What if our AI, in the exploration of space, encounters another AI? Will it be like the early human-like species, where we either wipe the other out or get wiped out ourselves?
Will our AIs have completely abstracted away all the senseless violence?
If you want a really depressing answer, read the second book of the Three-Body Problem trilogy: The Dark Forest.