Predictions of extinction are not like other predictions for at least two reasons:
- You can’t reason based on track record in the same way you can with normal predictions.
- The stakes are extremely high. Being wrong on normal predictions rarely matters as much.
Why?
Regarding point one, reasoning based on track record:
Normally, a type of prediction being wrong again and again will lead you to dismiss that type of prediction. For instance, if experts predict every year (for some reason) that your country will soon have the highest math scores in the world, and yet each year it remains ranked 50th, eventually you (rightly) ignore the experts.
However, with extinction risks, this kind of reasoning doesn’t quite work. In all possible universes, those who predict the extinction of their species will be wrong right up until extinction happens. The predictions can, at most, be right only once.
Consider two worlds: one where humans go extinct in 2030 and one where they don’t ever go extinct (or go extinct only much later). What would you observe in 2029 regarding past predictions of extinction in these two worlds?
Well, in both worlds you’d observe that all past extinction predictions had failed up until that point. (If anything, I’d anticipate having MORE past extinction predictions fail in the world where extinction happens in 2030 since there would be more evidence of potential extinction in that world, all else equal.)
Therefore, the reasoning that “we’ve had a lot of past extinction predictions and they’ve always failed, therefore extinction is unlikely” is not a good argument – you’d witness these failed predictions in both such worlds (and perhaps even more of them in the world where extinction happens soon).
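To make this concrete, here's a minimal Bayesian sketch in Python, with made-up illustrative numbers (the prior and likelihoods are assumptions, not real estimates). The point is that when an observation is equally likely under both hypotheses, the posterior equals the prior, whereas a "normal" prediction type can genuinely lose credibility from repeated failure:

```python
# Minimal Bayesian sketch (illustrative numbers only, not real estimates).
# Observation D: "every past extinction prediction has failed so far."
# Hypothesis E: "extinction happens soon (e.g., in 2030)."

prior_E = 0.05  # assumed prior probability that E is true

# Survivors looking back from 2029 would see only failed predictions
# in BOTH worlds, so D is (roughly) certain under each hypothesis.
p_D_given_E = 1.0
p_D_given_not_E = 1.0

# Bayes' rule: P(E | D) = P(D | E) * P(E) / P(D)
p_D = p_D_given_E * prior_E + p_D_given_not_E * (1 - prior_E)
posterior_E = p_D_given_E * prior_E / p_D
print(f"prior:     {prior_E:.4f}")
print(f"posterior: {posterior_E:.4f}")  # unchanged: D carries no evidence

# Contrast: a hypothesis with the same 0.05 prior, but where a string of
# failures would be unlikely if the hypothesis were true (i.e., successes
# could have been observed instead). Here repeated failure IS informative.
p_D_given_H = 0.2
p_D_normal = p_D_given_H * prior_E + 1.0 * (1 - prior_E)
posterior_normal = p_D_given_H * prior_E / p_D_normal
print(f"'normal' posterior: {posterior_normal:.4f}")  # falls well below the prior
```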
This makes predictions of extinction a special class of prediction.
To dismiss arguments about extinction risk, it’s necessary to engage with the actual arguments themselves, as they can’t be dismissed as a group due to past failed predictions. While near misses can tell you something about the probability of some extinction risks (e.g., times when nuclear war nearly broke out or asteroids nearly struck), failed predictions are not very informative.
Regarding point two, the enormous stakes:
Extinction, most people will agree, would be incredibly bad. For that reason, extinction risks don’t have to be very likely to be worth taking very, very seriously.
In a world where there were millions of distinct, plausible extinction risks, the sheer number of them would suggest that each one is (a priori) not that likely to end the species. In such a world, it might be silly to invest much in these kinds of concerns (unless a smaller number of much more likely ones could be identified).
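As a rough back-of-the-envelope illustration of that a priori reasoning (assumed numbers, and assuming for simplicity that a fixed total risk is spread evenly across the candidates), the average per-risk probability differs by about five orders of magnitude between those two worlds:

```python
# Back-of-the-envelope arithmetic (assumed, purely illustrative numbers).
# Suppose some fixed total extinction probability were spread evenly
# across N distinct, equally plausible risks.

total_risk = 0.10  # assumed overall extinction probability, for illustration

for n_risks in (1_000_000, 13):
    per_risk = total_risk / n_risks
    print(f"{n_risks:>9,} risks -> about {per_risk:.5%} each")

# With a million candidates, each averages ~0.00001% and might be
# ignorable; with only 13, each averages ~0.8% -- hard to dismiss
# given the stakes.
```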
But that’s not the world we live in. There are only around 13 commonly proposed human extinction risks – and even on this short list, some aren’t really plausible (when considered as a potential cause of literally ALL humans dying out). Here’s the list, in no particular order (if I missed any, let me know):
- Advanced AI technology
- Nuclear war or the invention of new destructive weapons
- Pathogens (e.g., human-engineered viruses)
- Asteroids, extreme solar flares, supernovae, gamma-ray bursts, or other astronomical events
- A mega volcano, mega earthquake, dramatic change in the earth’s magnetic field, or another major geological event
- Advanced nanotech (e.g., grey goo) or synthetic biology
- The second coming of various figures according to different religions, or god(s) or demons ending the world or terminating our species
- Simulators (if we’re living in a simulation) ending the world or our species
- Aliens from other planets
- Runaway climate change/extreme climate shifts, or sudden ecosystem collapse
- Physics experiments gone wrong (e.g., ones related to vacuum stability) or purposely carried out to end humanity
- Universal environmental contaminants turning out to be deadly or to cause infertility
- Extreme population decline until no reproduction takes place (e.g., following an event that greatly reduces the world population)
Obviously, some of these are much less probable than others. And maybe you think some of these are ridiculous. Okay, cross those out. What about the others?
Given the incredible stakes, the shortness of the list, and humanity’s (in my view, bizarre and irrational) unwillingness to protect its own future, all of these are worth investing much more in than humanity currently does. Obviously, it could be stupid for humanity to invest so much in preventing extinction that it’s seriously impaired. But we currently invest so little that it’s almost absurd.
I don’t think that predictions of extinction can be easily dismissed, despite all prior such predictions being wrong – they don’t work like other predictions do, and the stakes are much higher.
This piece was first written on February 13, 2025, and first appeared on my website on April 9, 2025.