Global Catastrophic Risks

We believe that ongoing economic, social, and technological progress will likely lead to an extraordinarily bright future. At the same time, as the world becomes more interconnected, the magnitude and implications of the worst-case scenarios may be rising. Governments and corporations aren’t necessarily incentivized to focus on preparing for potentially globally disruptive events, so we’re seeking opportunities to help civilization become more robust. Our most recent summary of the causes we’ve investigated in this category — including how we are prioritizing work on them — is here. More explanation and context are given below.

Our basic framework

For work in this category, we place substantial weight on the value of the far future. Accordingly, we use the term “global catastrophic risks” to refer to risks that could be globally destabilizing enough to permanently worsen humanity’s future or lead to human extinction.

In choosing focus areas, we’ve looked for causes that are strong on some combination of the following criteria:

  • Importance: How damaging and destabilizing could a catastrophe be, and how likely are particularly dangerous scenarios to occur over the next century?
  • Neglectedness: Are there opportunities to make a difference, or important aspects of the risk, that receive relatively little attention and support? When investigating a cause, we tend to consider multiple different kinds of activities that might make a difference, looking for major gaps. Even if a risk gets major attention from governments, if it gets little attention from philanthropy, there may be an important role for us to play.
  • Tractability: What sorts of activities could a philanthropist undertake today to reduce the risk?

Focus areas

Currently, our top two priorities are Biosecurity and Pandemic Preparedness and Potential Risks from Advanced Artificial Intelligence. We feel that these are the two risks most likely to lead to globally destabilizing scenarios that could permanently worsen humanity’s future. We also feel that both risks receive little attention from philanthropy and present reasonable opportunities for action. For more detail on how we investigate and prioritize causes, see our process.

More on this topic, from the blog

Here are some posts from our blog that explain our thinking in this category: