We want to maximize the impact of our portfolio.

We’re open to supporting safe bets, like direct cash transfers to the world’s poorest people, as well as high-risk, high-reward projects, like minimizing risks from potentially world-changing science and technology. Read more about how we choose what to fund here.

  1. Berkeley Existential Risk Initiative — David Krueger Collaboration
     Award Date: 04/2022 | Amount: $40,000 | Focus Area: Potential Risks from Advanced AI
  2. Paris Peace Forum — Global Commission on Governing Risks from Climate Overshoot
     Award Date: 03/2022 | Amount: $616,305 | Focus Area: Scientific Research
  3. Duke University — COVID-19 Antiviral Studies (Timothy Haystead)
     Award Date: 03/2022 | Amount: $14,355 | Focus Area: Scientific Research
  4. Carnegie Endowment for International Peace — AI Governance Research
     Award Date: 03/2022 | Amount: $597,717 | Focus Area: Potential Risks from Advanced AI
  5. Impact Global Health — General Support
     Award Date: 03/2022 | Amount: $1,550,000 | Focus Area: Global Health & Development
  6. Life Sciences Research Foundation — Young Investigators (2022)
     Award Date: 03/2022 | Amount: $751,000 | Focus Area: Scientific Research
  7. Dezernat Zukunft — Re-Granting (2022)
     Award Date: 03/2022 | Amount: $2,400,000 | Focus Area: Macroeconomic Policy
  8. Animal Ask — General Support
     Award Date: 03/2022 | Amount: $130,000 | Focus Area: Farm Animal Welfare
  9. Atlas Fellowship — Scholarships and Summer Program for Students
     Award Date: 03/2022 | Amount: $5,000,000 | Focus Area: Effective Altruism
  10. Hofvarpnir Studios — Compute Cluster for AI Safety Research
      Award Date: 03/2022 | Amount: $1,443,540 | Focus Area: Potential Risks from Advanced AI