p(life) over p(doom)
I prefer p(life) over p(doom).
Define p(life), heuristically, as the probability that you are alive in some ~hundreds of years.
p(life) subsumes x-risk. But p(life) also includes terms that care about the (extremely likely!) chance that you and your loved ones die from an unsolved cancer, cardiac arrest, diabetes, Alzheimer's, or innumerable other age-entwined diseases.
The pragmatic agenda for saving lives, including AI-powered biological research, would gain proper prominence as the highest hope of humanity:
- AI-powered cloud labs with research automation
- Organ Replacement Therapies + Ethical Cloning
- Regenerative Medicine, e.g. via Stem Cell Therapy
- Gene Therapy & Genetic Editing
- Pharmacological work, like rapamycin and other mTOR inhibitors, metformin, and other aging-associated pathways treatable via drugs
- Cryopreservation, for dramatically lengthening timelines so that scientific progress can catch up
- Enhanced cognition via BCI x AI read/write access, for accelerated scientific thought
- Superintelligence for research automation
Genetics, biotechnology, medicine, and lifestyle interventions will all be essential to accelerate as part of an anti-aging agenda.
p(life) points out that we are all default dead. This tragic fact, well recognized in stories like the Fable of the Dragon-Tyrant, is an expectation that has been priced into most meaning systems. In the absence of that expectation, the reality of total death would be treated as a tragedy beyond genocide, at the scale of the apocalypse. In retrospect, from a time where p(life) >= 99%, our times will look like living through holocaust after holocaust, on a yearly basis.
What's your p(life)?
Abstract Superintelligence
p(doom) is a deep collective psychological mistake because it forces you to imagine abstract superintelligence.
This abstract superintelligence often *can do anything within the realm of physics*.
And empirically, we are not getting abstract superintelligence. We are getting something much more limited in scope and capability. So predictions about p(doom) anchored to that abstraction will be miscalibrated.
When your society is contained by the assumption of abstract superintelligence, the only question left to ask is: is p(doom) low, or is p(doom) high?
It's the kind of manipulative question that has no good answer. Low numbers are subject to Pascal's mugging (a small probability of an insanely negative outcome still totally compels behavior) and feel magnified by the possibility effect. High numbers are obviously totally paralyzing and inspire an intense savior complex.
But p(doom) - and doomsday psychology - is actually an incredibly dangerous ideology, because it's a totalizing ideology. It *omits the entirety of the rest of life*.
That's why I prefer p(life) - the probability of you and your loved ones surviving into the far future - to p(doom): it brings in so much more of life and progress, and forces you to balance risks against benefits.
It *integrates* extinction risk, rather than pedestalizing it.
And that balance forces you to look at limitations, which the frame of abstract superintelligence never does.