CEA is seeking a leader for a new version of the Long-Term Future Fund. This is a rare opportunity to shape millions of dollars in funding toward critical work on existential risks and the broader implications of transformative AI. You'll have substantial autonomy to define the fund’s strategy and scope (and the fund’s new name), build your own team, and establish the fund as a go-to resource for some of the most promising work shaping our future. This is a senior management position on our EA Funds team, reporting to the Director of EA Funds.
The Centre for Effective Altruism (CEA) stewards the movement of people putting effective altruism principles into practice to solve the world's most pressing problems. We’re working to build a flourishing future by applying evidence, reason, and compassion to challenges like global poverty, animal suffering, and existential risks.
Our work centers on growing and supporting a global community of people who rigorously analyze where they can do the most good—and take action on those insights. Current strategic priorities include increasing understanding of effective altruism and its principles, growing the number of people motivated by those principles to take significant action on pressing problems, and diversifying funding sources for high-impact work.
We had significant success in 2025, building momentum within CEA. Our headcount grew from 42 to 66 core staff. Program participation (e.g., events, courses, groups) grew by 20-25% year over year. We merged with EA Funds and are rapidly scaling up our capacity for grantmaking and associated fundraising: our first Fund staffed with full-time employees (EA Animal Welfare Fund) raised almost as much as the previous three years combined.
In 2026, we’re sustaining that momentum while building the foundations for a step-change in the wider EA ecosystem’s growth trajectory from 2027 onwards.
The Long-Term Future Fund (LTFF) is part of Effective Altruism Funds (EA Funds), the grantmaking arm of CEA. EA Funds aims to increase the amount of funding dedicated to particularly cost-effective and altruistically impactful projects across animal welfare, the long-term future, global health and development, and EA infrastructure.
It’s a particularly exciting time to join EA Funds. EA Funds’ recent integration with CEA allows the team to benefit from shared infrastructure and opens up significant opportunities for programmatic synergies. We’ve also seen EA Funds’ Animal Welfare Fund scale from ~$3M to $10M in a single year under full-time leadership, with tens of millions more expected this year, and we believe LTFF has similar potential.
The Long-Term Future Fund’s current scope is wide. It addresses global catastrophic risks—especially those from advanced artificial intelligence—while also supporting work on other existential risks such as pandemics, broader longtermist research, and efforts to help future generations flourish. We expect the right candidate for this role to help redefine the fund’s strategic vision.
Since 2017, LTFF has distributed over $20 million across hundreds of grants. Giving What We Can recommends LTFF as a top choice for donors wishing to support global catastrophic risk reduction. Past grants have supported researchers and seeded organizations that became central to the AI safety ecosystem, and have funded work that shifted how the field thinks about key technical and governance problems. The fund has also contributed to biosecurity infrastructure, forecasting, nuclear risk research, and civilizational resilience projects, among other areas. Notable past grants include early funding for SERI MATS (now a leading AI safety research mentorship program), Robert Miles' AI safety video content reaching millions, and major expansions of the Metaculus forecasting platform, which now hosts hundreds of thousands of predictions.
AI development is accelerating, and the next few years may be unusually important for shaping how advanced AI systems are built and governed, and ultimately the broader trajectory of civilization. LTFF exists to get resources to the people and projects best positioned to make this go well.
With the evolving landscape, the fund's integration into CEA and this new leadership hire, we're looking to reposition the fund with a fresh strategic direction (and by default, a new name). As Head, you'll have significant latitude to define what this new version of LTFF should be: which problems to prioritize, what grants to make, how to reach new donors, and how to build a team that can execute at scale.
Inspired by the Animal Welfare Fund’s example, we're building toward growth. With dedicated support from CEA's communications, fundraising, and operations teams, and in coordination with other funders in the ecosystem, you'll work to grow LTFF's impact while building on what has made the fund effective.
You will be responsible for high-quality grantmaking, strategic leadership of the fund, fundraising, and building a team. You will make grants yourself—particularly (but not exclusively) in the early stages—while also recruiting and managing other grantmakers to expand the fund’s capacity.
We are genuinely open to your perspective on the fund’s future direction. While the majority of our current grantmaking relates to transformative AI, which we expect will remain central, we're interested in candidates with thoughtful views on questions like:
How should the fund position itself relative to other funders looking at AI safety and other implications of AI progress? Where and with which types of grants can we add the most value?
What role, if any, should catastrophic risks not directly related to AI (e.g., biosecurity, nuclear risk) play in the portfolio?
How do we reach donors who care about these issues but aren't deeply embedded in EA?
We don't expect you to have definitive answers to these questions, though we think strong candidates for the role will have good strategic intuitions and hypotheses to test. We expect you to have the judgment and intellectual honesty to develop the strategy over time, and to communicate your reasoning transparently to donors, grantees, and the broader community.
This role reports to the Director of EA Funds. Your transition will be supported by existing EA Funds staff and by CEA teams in communications, fundraising, and operations.
Grantmaking
Investigate and evaluate grant opportunities rigorously, assessing theories of change, team strength, expected impact, and other relevant factors
Communicate your grantmaking reasoning transparently through grant reports, feedback to applicants, and public writing
Source high-quality applications based on your strategic insights and the fund's priorities
Contribute to improvements in the grantmaking processes of EA Funds
Strategic leadership
Develop a compelling strategy for LTFF's focus areas, grant types, and role in the funding ecosystem
Represent the fund (and occasionally EA Funds as a whole) within the EA community, the AI safety community, and the broader AI community
Team building
Hire and manage additional grantmakers as the fund grows
Build a team culture of rigorous evaluation, intellectual honesty, and responsiveness to applicants
Fundraising
Grow LTFF’s donor base in collaboration with others at CEA (e.g., EA Funds leadership, the incoming fundraising staff, and the Marketing and Communications team)
Build and cultivate relationships with potential major individual donors
Communicate LTFF's impact and strategy in ways that are honest and compelling
Ecosystem development
Collaborate with, and where valuable build partnerships with, other funders and organizations working in relevant areas
Support grantees to succeed and maximize their impact
Contribute to the broader EA, AI safety, and longtermist communities' thinking on funding and other strategic priorities
Below is a list of characteristics we're seeking. We expect strong candidates to demonstrate depth in some of these areas while actively developing in others. If you’re genuinely excited about contributing to the fund, we encourage you to apply even if you aren't sure you're fully qualified.
At a minimum, we want a candidate who has:
Strong familiarity with AI safety and/or other potential implications of transformative AI: key research directions, major organizations, ongoing debates, and the people doing important work. We're particularly interested in candidates with relatively broad knowledge across the field, though we're also open to those with deep expertise in specific areas, provided they're able to explore new domains as the landscape and needs evolve.
Value alignment with EA principles. This does not necessarily require deep familiarity with the EA community, but it does require genuine commitment to careful reasoning about impact, willingness to change your mind based on strong reasoning and evidence, and concern for doing the most good.
An entrepreneurial mindset to lead the re-establishment of the fund, and to adapt to potential shifts that could occur in the space in the coming years.
Strong interpersonal skills to professionally represent CEA, build networks, and navigate difficult conversations with potential grantees.
In addition, it will be valuable if you:
Have pre-existing networks in the AI and AI safety ecosystems
Have grantmaking experience, or experience in roles requiring similar judgment (e.g. investment, policy advising, or research/technical leadership roles where you've had to evaluate complex projects and allocate resources under uncertainty).
Can navigate nuanced conversations about AI progress with people who hold a wide range of views on timelines, implications, risks, and potential solutions
Have strong written and verbal communication skills, with the ability to explain complex reasoning clearly and build trust with very diverse stakeholders
Have strong project management skills
Have experience building and managing teams
We expect to hire at the Head of Fund level, with room for growth to the Director level as the fund grows. However, candidates with exceptional qualifications across nearly all criteria (both minimum and additional criteria) who demonstrate strong potential to rapidly scale the fund may be considered directly for a Director role.
This is a full-time, remote position. We prefer applicants who are able to work in time zones between US Pacific Time and CET.
We have an office in Oxford, UK, that you would have access to.
Start date: We'd ideally have you start as soon as possible, but would consider a later start for the right candidate.
Reports to: Director of EA Funds
Compensation
Head of LTFF level
US: total compensation package of $157,435, comprising a base salary of $143,123, and a 10% unconditional 401k contribution.
UK: total compensation package of £94,711, comprising a base salary of £86,101, and a 10% pension contribution.
For exceptional candidates meeting the criteria for the Director of LTFF level
US: total compensation package of $186,665, comprising a base salary of $169,696, and a 10% unconditional 401k contribution.
UK: total compensation package of £142,500, comprising a base salary of £129,545, and a 10% pension contribution.
Note that for candidates at the Director level, we offer US market rate compensation regardless of location; the GBP figure matches our US benchmark, converted at our current standard exchange rate.
Other locations: For candidates outside the US and UK, we base compensation on our UK salary structure and adjust for the cost of employment and fixed local benefit costs to create an equivalent package.
Benefits in the US/UK include private insurance, flexible work hours, a $6,000 / £5,000 annual professional development allowance, a $6,000 / £5,000 mental health support allowance, extended parental leave, ergonomic equipment, unconditional 10% pension / 401k contribution, 25 days of paid vacation, and more.
This role will involve travel. Depending on the location, you should expect roughly 4 to 7 trips annually to attend team retreats and other events, including several international trips.
We are committed to fostering a culture of inclusion and encourage individuals with diverse backgrounds and experiences to apply. We especially encourage applications from self-identified women and people of colour who are excited about contributing to our mission. The Centre for Effective Altruism is an equal opportunity employer. If you need assistance or an accommodation due to a disability, or have any other questions about applying, please contact jobs@centreforeffectivealtruism.org.
CEA participates in E-Verify for US employees.
We are committed to protecting your data. See our privacy policy for more information.
We expect the interview process to include the following steps, subject to minor changes:
Application
Test task 1
Short interview
Test task 2
Final interviews (3-4, non-sequential)
1-2 day work trial
Reference checks