Manifund

Funding requirements:
- Sign grant agreement
- Reach minimum funding
- Get Manifund approval
Give free books

AI governance · EA community
Aidar Toktargazin

Proposal · Grant
Closes July 26th, 2025
$0 raised
$501 minimum funding
$4,013 funding goal

Offer to donate

29 days left to contribute

You're pledging to donate if the project hits its minimum goal and gets approved. If not, your funds will be returned.

Please delete this project, it is a duplicate

Similar projects (6)
Aidar Toktargazin

Give free books

Give free AI safety books to potentially high-impact individuals

AI governance · EA community
1
0
$0 / $4.01K
Aidar Toktargazin

Giving free AI safety books to potentially high-impact individuals

EA community
1
0
$0 / $4.01K
Johan Fredrikzon

Book: A History of AI x-risk (open access)

Building on a course I've been designing, I'm writing a short book for a general audience on the history of existential risk from AI.

Science & technology · Technical AI safety · AI governance · Global catastrophic risks
1
0
$0 raised
AI Safety and Governance Fund

Testing and spreading messages to reduce AI x-risk

Educating the general public about AI and its risks in the most efficient ways, and leveraging this to achieve good policy outcomes

AI governance · EA Community Choice
4
17
$12.6K raised
Hein de Haan

Write and publish an e-book advocating for longtermism and sentientism

How can we build an awesome civilization for all sentient life, and ensure it will be (even more) awesome in the future?

ACX Grants 2024
1
5
$0 raised
Jaeson Booker

The AI Safety Research Fund

Creating a fund exclusively focused on supporting AI Safety Research

Technical AI safety
1
16
$100 / $100K