Diversify Funding for AI Safety

Science & technologyTechnical AI safetyAI governanceEA communityGlobal catastrophic risks
Angie Normandale

Not funded · Grant

$0 raised

Project summary

This is a meta-level AI Safety project with potential for outsized impact.

The goal is to secure new sources of funding for AI Safety nonprofits through connections with non-EA grant programs and High Net Worth Individuals.

AI Safety work is funding-constrained. Many organisations rely on a small cluster of income sources, such as the Long-Term Future Fund and Open Philanthropy. We need to diversify our funding sources to increase resilience and reduce bottlenecks.

What are this project's goals? How will you achieve them?


Phase 1: 100% complete

  • Research non-EA funds and map opportunity space

  • Connect with grant experts beyond EA

Phase 2: 70% complete

  • Create a high-quality searchable database of opportunities, including dates, timelines, amounts, and key contacts

Phase 3: in progress

  • Individual consultancy with founders

  • Providing a list of opportunities which fit their needs and time frames

  • Support to secure funds, including grant writing, marketing, outreach, accounting, and legal

Phase 4: Expanding the pool of High Net Worth Individuals funding AI Safety

  • Generating leads through research and networking

  • Working with current funders & experts

  • Strategic relationship building to secure new top donors

How will this funding be used?


$5k – Searchable funding database of nonprofit grants and government opportunities, plus free grant writing for three AI Safety organisations at risk of closure.

$7.5k – Pays for access to High Net Worth Individual (HNWI) and donor lists, so that I can include prospective HNWI donors in the main funding database. I'll also send out monthly updates on key deadlines for the next year.

$10k – Grant templates to reduce application times, free grant writing for five AI Safety organisations, and a searchable web app so you can easily find the right grant.

$50k – Sets up an HNWI outreach program in London and the Bay Area, including an international donor funding circle to bring high-value AI Safety donors together.

$120k – HNWI outreach program runs for 12 months, giving the highest chance of success.

Who is on your team? What's your track record on similar projects?

Angie Normandale – Oxford's all-time top alumni fundraiser, with 10 years' experience; lawyer, project manager, and founder with a six-figure seed round; part-time CompSci MSc student; previously did this work at PIBBSS.

Advisors:

Chris Akin – COO, Apollo Research

Professor Mike X Cohen – seasoned principal investigator with large academic fund experience

Will Portnof – Bay Area philanthropic consultant


Organisations that have expressed interest:

Apart Labs

Pause AI

Far AI

LISA

Athena

PIBBSS

Apollo

MATS

Epoch

Impact Academy

BlueDot

& others.

What are the most likely causes and outcomes if this project fails?

Research suggests that US-based nonprofits spend between 10% and 30% of their annual budgets on fundraising.

The likely alternative is paying external consultants to fundraise.

In August 2024 I met with grant consultants from the US, UK, and Australia to look at funding for PIBBSS.

Consultants charged up to $6k per organization per month, or up to 15% of the fundraise in commission.

Despite their fundraising expertise, they struggled to comprehend our niche and value-add. The AI Safety space appears unusual compared to other research fields and requires an inside view.

Alternatively, someone else in EA might step forward to do this work. Nobody has volunteered thus far, but there's certainly a market for it!

What other funding are you or your project getting?


None applied for. This Manifund grant should seed the setup costs for a self-sustaining project. The lifespan of the work depends on the success rate and the wider field.

—

Please contact Angie for further questions about the project: g.normandale@gmail.com
