If you want to help prevent very large problems, ones that could affect not just those living today but the entire trajectory of humanity, you should consider supporting efforts to reduce global catastrophic risks. Most of the work in this area involves preventing global catastrophes and making our world more resilient to them.
Please note that this page was formerly titled "Safeguarding the long-term future" — we have since changed the name to better reflect the causes and organisations we now recommend in this area.
We all care about the future. We want the people who come after us (our children, our children's children, and so on) to have good lives. But this might not happen. Nuclear catastrophes, uncontrollable AI, and engineered pandemics are just some of the threats humanity faces. The scale of these threats is hard to imagine: they endanger not only everyone alive today, but everyone who could ever live.
Relative to this enormous scale, the problem is neglected. One of the world's leading experts on the risks facing humanity is Toby Ord, a philosopher at Oxford and a co-founder of Giving What We Can. To illustrate how neglected safeguarding the long-term future against global catastrophic risks is, he notes in his book The Precipice that (at least as of 2016) the international body responsible for the continued prohibition of bioweapons had an annual budget of just $1.4 million USD, less than that of the average McDonald's restaurant. This is especially concerning given that he argues one of the biggest risks to humanity comes from engineered pandemics: think COVID, but designed to be far worse.
Addressing risks to our future is the least tractable of the high-priority cause areas we recommend supporting. Though there are ways of making progress, it is difficult to know how effective our actions will be. Nonetheless, given the scale of the problem and the relative scarcity of projects addressing it, we think it should be a high priority to support work in this area.
We recommend some of the best charities and organisations working on this problem.
There are several promising focus areas aimed at reducing global catastrophic risks. These include:
COVID-19 has made us all familiar with how devastating a pandemic can be, but a future pandemic could be far worse, particularly one engineered to be more lethal or more infectious. Preparing for future pandemics, especially those that pose a risk to humanity's survival or long-term potential, is one of the best ways we have to safeguard the long-term future.
Learn more about improving biosecurity and pandemic preparedness.
Artificial intelligence might be the most important technology we ever develop, so it's vital that it's developed safely and used equitably. What's particularly concerning is that unsafely developed artificial intelligence could pose an existential risk: it could permanently limit humanity's potential, and maybe even end our civilisation. To address this concern and other issues around artificial intelligence, we recommend supporting technical work on ensuring AI systems reliably behave in ways we want and expect, as well as policy and political work to ensure artificial intelligence is developed and used safely.
Learn more about promoting beneficial artificial intelligence.
Nuclear weapons are a recent development in human history, and only since their invention has humanity possessed a technology that could lead to its own destruction. Improving nuclear security involves lowering the risk of nuclear war and mitigating its consequences, such as nuclear fallout, if it occurs. Our nuclear security page outlines several plausible ways to do this, but due to the inherently complex nature of the problem, it is difficult to know which is best to support.
Learn more about improving nuclear security.
Events in recent years have clearly illustrated the negative effects of climate change on humans, animals, and our potential futures. While we've previously included charities and funds working in this space in our list of charity recommendations, we've recently updated that list to reflect the outcomes of our "evaluating the evaluators" research. Since this project is ongoing, we don't (yet) have recommendations related to climate change, though you can support several promising climate-related organisations via our donation platform, some of which were included in our previous years' recommendations.
Learn more about addressing climate change.
Though we think reducing global catastrophic risks is among the highest-priority cause areas you could support, there are reasons you might choose not to focus on it.
Compared to the other high-priority cause areas we recommend supporting, there is a weaker evidence base for interventions that could safeguard the long-term future from global catastrophic risks. This is partly due to the nature of the threats humanity faces and the solutions they require. For example, we can never directly observe an intervention succeeding at preventing an existential catastrophe: if it works, the catastrophe simply never happens, and if it fails, it is too late to learn from the mistake. This leaves us relying heavily on theory and reason. Similarly, the nature of new (and more neglected) risks, like powerful but out-of-control artificial intelligence, makes it especially difficult to know how likely any given intervention is to make progress on the problem, though there are certainly organisations with promising theories for mitigating these risks.
Depending on your values and worldview, you may choose to prioritise areas where you feel confident that your actions will make a concrete difference. This might mean prioritising the better-understood risks humanity faces, like climate change, or focusing on highly tractable interventions, like those improving global health and wellbeing.1 Conversely, the lack of evidence in this area might make you more inclined to support it, on the grounds that the best response to uncertainty is to fund further research.
While some global catastrophic risks could affect us in our lifetimes, they are largest in scale when we consider how they could affect future generations. Thus, if you think the welfare of future people matters less than that of people alive today, you might be less interested in supporting work in this area. While we think future generations matter and that their welfare deserves equal consideration to that of people living today, this is ultimately a question about your values.
Alternatively, you may value future people but believe that the future isn't likely to be particularly big or especially good, in which case there would be less at stake in safeguarding it.
We think these are important considerations, each requiring judgement calls that depend on your worldview.2 Read more about how to choose between causes.
You can help further work that aims to reduce global catastrophic risks by supporting promising projects working on the focus areas we've outlined above. We generally think donating to funds (rather than individual organisations) is the best way to support promising projects in any cause area (here's why). This is particularly true for global catastrophic risk reduction: the field is developing rapidly and the impact of interventions is harder to measure than in other areas, making the expertise and context of specialised grantmakers even more valuable.
These are our recommended funds for reducing global catastrophic risks:
You can also donate to Giving What We Can's Risks and Resilience Fund, which pools your money with other donors' contributions and allocates it based on our research team's latest findings on which evaluators and grantmakers are best suited to help donors maximise their impact. Read more about how this fund works and why you might donate to it.
Since our list of recommendations is shorter than in recent years (see why), you may also be interested in checking out some of our other supported programs working on reducing global catastrophic risks.
To learn more about safeguarding the long-term future and global catastrophic risks, we recommend the following resources:
This page was written by Alana Horowitz Friedman and Michael Townsend.
Please help us improve our work — let us know what you thought of this page and suggest improvements using our content feedback form.