We want to maximize the impact of our portfolio.

We’re open to supporting safe bets, like direct cash transfers to the world’s poorest people, as well as high-risk, high-reward projects, like minimizing risks from potentially world-changing science and technology. Read more about how we choose what to fund here.

  1. SecureBio — Biosecurity Research
     Award Date: 11/2022 | Amount: $1,420,937 | Focus Area: Biosecurity and Pandemic Preparedness
  2. AI Safety Support — SERI MATS Program
     Award Date: 11/2022 | Amount: $1,538,000 | Focus Area: Potential Risks from Advanced AI
  3. London Effective Altruism Hub — EA Organizing in London
     Award Date: 11/2022 | Amount: $445,000 | Focus Area: Effective Altruism
  4. Seton Hall University — Longtermism Course Development
     Award Date: 11/2022 | Amount: $39,780 | Focus Area: Effective Altruism
  5. Alignment Research Center — General Support (November 2022)
     Award Date: 11/2022 | Amount: $1,250,000 | Focus Area: Potential Risks from Advanced AI
  6. Helena — Health Security Policy
     Award Date: 11/2022 | Amount: $500,000 | Focus Area: Biosecurity and Pandemic Preparedness
  7. George Mason University — Michael Clemens Migration Research
     Award Date: 11/2022 | Amount: $450,000 | Focus Area: Immigration Policy
  8. Indian Institute of Technology Bombay — PAVITRA Pollution Modeling Tool
     Award Date: 11/2022 | Amount: $647,618 | Focus Area: South Asian Air Quality
  9. Center for AI Safety — General Support
     Award Date: 11/2022 | Amount: $5,160,000 | Focus Area: Potential Risks from Advanced AI
  10. Equalia — Corporate Campaigns (2022)
      Award Date: 11/2022 | Amount: $2,409,106 | Focus Area: Farm Animal Welfare