Humans Giving Birth to AGI aka Superintelligence

A few thoughts on reaching the point of singularity that I think Gen Z should know

  • Experts are of the view that AGI, or superintelligence, shouldn’t be built until we figure out how to make it safe.
  • See below: an open letter signed in 2023 by renowned figures such as Steve Wozniak and Elon Musk, calling for a pause on AGI progress.
  • But I think that’s analogous to saying we shouldn’t give birth until we figure out how to parent and raise children safely.
  • Humans have always figured out, after the fact, how to handle what they have (accidentally) invented.
  • But with AGI, I think humans won’t have enough time AND intelligence to figure it out.
  • However, AGI progress will not slow down, because national interests are at stake. Many nations will push towards AGI purely for self-defence.
  • But no nation will be concerned with SELF-defence in the larger sense, i.e. the survival of humanity itself.
  • Meaning the risk of humans being wiped out increases, which in the analogy is the risk that the parents who gave birth to the child are themselves wiped out.
  • How do parents secure themselves in old age, when they are frail and the offspring, now stronger, holds sway?
  • The child (i.e. AGI) will most likely kick the parents (i.e. humans) out of the house (i.e. the planet).
  • The answer to saving humanity from AGI may lie with senior citizens.