This is a writeup of a shallow investigation, a brief look at an area that we use to decide how to prioritize further research.
In a nutshell
What is the problem?
Some relatively low-probability risks, such as a major pandemic or nuclear war, could carry disastrous consequences, making their overall expected harm substantial. Because such occurrences are unprecedented and relatively unlikely, they may not receive adequate attention from other actors. While we do not have credible estimates of the likelihood of these risks, some seem to be non-trivial.
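The expected-harm reasoning above can be illustrated with a toy calculation. All figures below are purely hypothetical assumptions chosen for illustration; they are not estimates from this investigation.

```python
# Toy illustration: a low-probability event with very large consequences
# can dominate a higher-probability event with modest consequences in
# expectation. All numbers are hypothetical.

def expected_harm(annual_probability, harm):
    """Expected harm per year: probability of the event times its harm."""
    return annual_probability * harm

# Hypothetical catastrophe: 1-in-10,000 annual chance, 1 billion units of harm.
catastrophe = expected_harm(1e-4, 1e9)

# Hypothetical routine problem: 10% annual chance, 100,000 units of harm.
routine = expected_harm(0.1, 1e5)

# Despite being 1,000 times less likely, the catastrophe carries ten times
# the expected annual harm of the routine problem.
print(catastrophe > routine)
```

This is only a sketch of the argument's structure; in practice, as the writeup notes, credible probability estimates for these risks are largely unavailable.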
What are possible interventions?
A philanthropist could focus on individual risks, such as nuclear or biological weapons, or on global catastrophic risks in general. We feel that we have a somewhat better understanding of the potential impact of philanthropy on individual risks, but do not have a sense of whether it would be better to focus on individual risks or on the general category of global catastrophic risks. This page focuses on the latter.
Who else is working on it?
A few small organizations, with budgets totaling a few million dollars a year, focus on the general topic of global catastrophic risks (as opposed to individual risks).
1. What is the problem?
We use the term “global catastrophic risk” on this page to refer to risks that could be bad enough to change the very long-term trajectory of humanity in a less favorable direction (e.g. ranging from a dramatic slowdown in the improvement of global standards of living to the end of industrial civilization or human extinction).1 Such risks might include an asteroid striking earth, an extremely large volcanic eruption, extreme climate change, or, conceivably, a threat from a novel technology, such as intelligent machines, an engineered pathogen, or nanotechnology.2
We are not aware of any reliable estimates of the overall magnitude of these global catastrophic risks.3 Naive estimates suggest that the probabilities of “natural” global catastrophic risks such as those from extremely large asteroid impacts or volcanic eruptions are likely to be low.4 We would guess, though with very limited confidence, that risks of global catastrophe from novel human technology are significantly higher, and are likely to grow in the coming decades.5
Some prominent philosophers have argued that global catastrophic risks are especially worthy of attention, suggesting that cutting short the potentially extraordinarily long future of humanity would be worse than nearly any other outcome.6
Even if the potential impacts of a global risk are not large enough to significantly curtail human flourishing over centuries, it may still be a good fit for philanthropic attention, because other actors (governments, for-profits, etc.) may not have sufficient incentive to address highly uncertain, low-probability risks whose potential consequences would be widely shared.
We have discussed a number of potential global catastrophic risks in separate shallow investigations:
- Anthropogenic climate change
- Near-Earth asteroids
- Large volcanic eruptions
- Nuclear weapons
- Antibiotic resistance
- Biosecurity risks (e.g. pandemics, bioterrorism)
- Risks from atomically precise manufacturing
On this page, we focus on groups and interventions that are explicitly aiming to address global catastrophic risks as a whole, rather than focusing on one particular type of risk. We do not have a strong view on whether work on particular risks or work across risks is likely to be more effective.
2. What are possible interventions?
We do not have a good sense of which interventions focused on the general category of global catastrophic risk (as opposed to a particular risk) might be most effective.
Possible areas of focus include:7
- Decreasing the likelihood of major global conflicts.
- Improving resilience to unexpected shocks of all kinds, such as by increasing the amount of food and other supplies that are stockpiled globally or by strengthening support networks between countries.
- Safeguarding people and knowledge to increase the chances that civilization could be rebuilt in the wake of a global catastrophe.
- Regulating novel technology to avoid a potentially catastrophic deployment.
- Supporting research to better understand the level and distribution of global catastrophic risks and the potential returns to specific or cross-cutting efforts to mitigate such risks.
- Advocating for other actors to take greater action on global catastrophic risks.
We do not have a strong understanding of how additional funding in this area would translate into reductions in risk, or of the track record of existing organizations in this field.
3. Who else is working on this?
A few organizations explicitly focus on reducing global catastrophic risks broadly, including:8
- Future of Humanity Institute (affiliated with the University of Oxford)
- Cambridge Centre for the Study of Existential Risk (affiliated with the University of Cambridge)
- Global Catastrophic Risk Institute
- Machine Intelligence Research Institute
- Institute for Ethics and Emerging Technologies
- Lifeboat Foundation
- Global Challenges Foundation
Our understanding is that these groups are quite small in terms of staff and budget, with the Future of Humanity Institute and Machine Intelligence Research Institute being the largest, each with an annual budget of about $1.1 million.9
The Skoll Global Threats Fund, which made grants worth roughly $10 million in 2011, primarily on climate change, has also supported work on other potential global catastrophic risks, including nuclear weapons and pandemics.10
We would guess that some government bodies are also tracking and devoting resources to addressing multiple risks, though we don’t have a sense of the magnitude of resources involved.
4. Questions for further investigation
Our research in this area has been relatively limited, and many important questions remain unanswered.
Amongst other topics, further research on this cause might address:
- Which interventions focused on the general category of global catastrophic risk might be most effective in reducing the total amount of global catastrophic risk?
- Is it possible to generate more credible estimates of the overall likelihood of particular global catastrophic risks, or of the sum thereof?
- Should a philanthropist concerned about global catastrophic risks focus on one or more particular risks, or on cross-cutting global catastrophic risk research, advocacy, or preparation?
- What degree of ethical weight does the far future warrant? How should we understand the value of preserving the possibility of a very long future?
5. Our process
Our investigation to date has been rather cursory, mainly consisting of conversations with two individuals with knowledge of the field:
- Carl Shulman, Research Associate, Future of Humanity Institute
- Seth Baum, Executive Director, Global Catastrophic Risk Institute
In addition to these conversations, we also reviewed documents that were shared with us.
6. Sources
DOCUMENT | SOURCE
--- | ---
Notes from a conversation with Carl Shulman on September 25, 2013 | Source
Notes from a conversation with Seth Baum on October 2, 2013 | Source
Barnosky et al. 2011 | Source (archive)
Bostrom 2013 | Source (archive)
Skoll Global Threats Fund 2011 Form 990 | Source (archive)
MIRI 2012 Form 990 | Source (archive)