Project summary
This is a meta-AI Safety project with potential for outsize impact.
The goal is to secure new sources of funding for AI Safety nonprofits through connections with non-EA grant programs and High Net Worth Individuals.
AI Safety work is funding constrained. Many organisations rely on a small cluster of income sources such as the Long Term Future Fund and Open Philanthropy. We need to diversify our funding sources to increase resilience and reduce bottlenecks.
What are this project's goals? How will you achieve them?
Phase 1: 100% complete
Phase 2: 70% complete
Create a high quality searchable database of opportunities, including dates, timelines, amounts, key contacts
Phase 3: in progress
Individual consultancy with founders
Providing a list of opportunities which fit their needs and time frames
Support to secure funds including grant writing, marketing, outreach, accounting, legal
Phase 4: Expanding the pool of High Net Worth Individuals funding AI Safety
Generating leads through research, networking
Working with current funders & experts
Strategic relationship building to secure new top donors
How will this funding be used?
$5k: Searchable funding database of nonprofit grants and government opportunities; free grantwriting for three AI Safety organisations at risk of closure.
$7.5k: Pays for access to High Net Worth Individuals and donor lists, so that I can include prospective High Net Worth donors in the main funding database. I'll also send out monthly updates on key deadlines for the next year.
$10k: Grant templates to reduce application times; free grantwriting for five AI Safety organisations; and a searchable web app so you can easily find the right grant.
$50k: Sets up an HNWI outreach program in London and the Bay Area, including an international donor funding circle to bring high-value AI Safety donors together.
$120k: HNWI outreach program runs for 12 months, giving the highest chance of success.
Who is on your team? What's your track record on similar projects?
Angie Normandale - Oxford's all-time top alumni fundraiser; 10 years' experience; lawyer; project manager; founder with a six-figure seed round; part-time CompSci MSc; previously did this work at PIBBSS.
Advisors:
Chris Akin - COO, Apollo Research
Professor Mike X Cohen - seasoned principal investigator with experience of large academic grants
Will Portnof - Bay Area philanthropic consultant
Organisations that have expressed interest:
Apart Labs
Pause AI
Far AI
LISA
Athena
PIBBSS
Apollo
MATS
Epoch
Impact Academy
BlueDot
& others.
What are the most likely causes and outcomes if this project fails?
Research suggests that US-based nonprofits spend between 10% and 30% of their annual budgets on fundraising.
The likely alternative is paying external consultants to fundraise.
In August 2024 I met with grant consultants from the US, UK, and Australia to look at funding for PIBBSS.
Consultants charged up to $6k per organization per month, or up to 15% of the fundraise in commission.
Despite their fundraising expertise, they struggled to understand our niche and value-add. The AI Safety space appears unusual compared to other research fields and requires an inside view.
Alternatively, someone else in EA might step forward to do this work. Nobody has volunteered so far, but there's certainly a market for this work!
What other funding are you or your project getting?
None applied for. This Manifund grant should seed the setup costs of a self-sustaining project. The lifespan of the work depends on the success rate and the wider field.
—
Please contact Angie for further questions about the project: g.normandale@gmail.com