The Long-Term Future Fund aims to positively influence the long-term trajectory of civilisation by enabling individuals and groups to work on relevant, high-priority projects that address global catastrophic risks, especially risks from advanced AI.
What problem is the Long-Term Future Fund working on?
The Long-Term Future Fund aims to positively influence the long-term trajectory of civilisation by making grants to fund projects working on global catastrophic risks, especially potential risks from advanced artificial intelligence. In addition, the Fund seeks to promote, implement, and advocate for longtermist ideas, and to otherwise increase the likelihood that future generations will flourish.
What projects does the Long-Term Future Fund support?
The Fund has a broad remit to make grants that promote, implement, and advocate for longtermist ideas. It supports:
Projects that directly contribute to reducing existential risks through technical research, policy analysis, advocacy, and/or demonstration projects.
Training for researchers or practitioners who work to mitigate existential risks.
Recruitment efforts and infrastructure that support people working on longtermist projects.
Projects that promote long-term thinking.
Recent grant recipients include:
Alex Cloud, Jacob Goldman-Wetzler, Evžen Wybitul, Joseph Miller — 6-month stipends to develop and apply a novel method for localizing information and computation in neural networks.
Einar Urdshals — Mentored independent research and upskilling to transition from a theoretical physics PhD to AI safety research.
SERI ML Alignment & Theory Scholars — 4-month salaries and expenses for two people to create a benchmark for evaluating goal-directedness in language models.
Constantin Weisser — 3-month stipend for a MATS extension establishing a benchmark for LLMs’ tendency to influence human preferences.
The Long-Term Future Fund makes many grants to individuals: for example, to conduct independent research, to switch into a potentially more impactful career, or to run smaller projects such as advocating for ideas relevant to improving the long-term future. For more information about how donations are allocated, see the list of past recipients and the frequently asked questions on the EA Funds website.
Why do we include the Long-Term Future Fund on our list of recommended programs?
We investigated the Long-Term Future Fund as part of our 2023 evaluator investigations and determined that it is a great choice for donors wishing to maximise the impact of their donations to global catastrophic risk reduction. See the full report here.
Please note that GWWC does not evaluate individual charities. Our recommendations are based on the research of third-party, impact-focused charity evaluators that our research team has found to be particularly well-suited to help donors do the most good per dollar, according to our recent evaluator investigations. Our other supported programs are those that align with our charitable purpose: they work on a high-impact problem and take a reasonably promising approach (based on publicly available information).
At Giving What We Can, we focus on the effectiveness of an organisation's work: what the organisation is actually doing and whether its programs are making a big difference. Some others in the charity recommendation space focus instead on the ratio of admin costs to program spending, part of what we’ve termed the “overhead myth.” See why overhead isn’t the full story and learn more about our approach to charity evaluation.