Manifund

Funding requirements

Sign grant agreement
Reach min funding
Get Manifund approval
Giving free AI safety books to potentially high-impact individuals

EA community
Aidar Toktargazin

Proposal · Grant
Closes July 26th, 2025
$0 raised
$500 minimum funding
$4,012 funding goal

Offer to donate

29 days left to contribute

You're pledging to donate if the project hits its minimum goal and gets approved. If not, your funds will be returned.

Project summary

Step 1:

With the minimum funding, physical copies of these books will be purchased:

9 copies of If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All by Eliezer Yudkowsky and Nate Soares (30 dollars each, 270 dollars in total)

2 copies of The Scaling Era: An Oral History of AI, 2019-2025 by Dwarkesh Patel and Gavin Leech (35 dollars each, 70 dollars in total)

2 copies of The Alignment Problem: Machine Learning and Human Values by Brian Christian (20 dollars each, 40 dollars in total)

Total estimated cost of the books, including taxes and shipping: 500.35 USD

If the funding goal is fully covered, physical copies of these books will be purchased:

50 copies of If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All by Eliezer Yudkowsky and Nate Soares (30 dollars each, 1500 dollars in total)

10 copies of The Scaling Era: An Oral History of AI, 2019-2025 by Dwarkesh Patel and Gavin Leech (35 dollars each, 350 dollars in total)

15 copies of The Alignment Problem: Machine Learning and Human Values by Brian Christian (20 dollars each, 300 dollars in total)

10 copies of Introduction to AI Safety, Ethics, and Society by Dan Hendrycks (110 dollars each, 1100 dollars in total)

Total estimated cost of the books, including taxes and shipping: 4012.96 USD
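The two budget tiers above can be sanity-checked with a short script (an illustrative sketch; the gap between each book subtotal and the stated total is the estimated taxes-and-shipping overhead for that tier):

```python
# Budget sketch: (title, copies, unit price in USD) for each funding tier.
MIN_TIER = [
    ("If Anyone Builds It, Everyone Dies", 9, 30),
    ("The Scaling Era", 2, 35),
    ("The Alignment Problem", 2, 20),
]
FULL_TIER = [
    ("If Anyone Builds It, Everyone Dies", 50, 30),
    ("The Scaling Era", 10, 35),
    ("The Alignment Problem", 15, 20),
    ("Introduction to AI Safety, Ethics, and Society", 10, 110),
]

def subtotal(tier):
    """Sum of copies * unit price, before taxes and shipping."""
    return sum(copies * price for _, copies, price in tier)

# Book subtotals are 380 USD (minimum tier) and 3250 USD (full goal).
# Against the stated totals of 500.35 and 4012.96 USD, that implies
# roughly 120 and 763 USD of combined taxes and shipping per tier.
print(subtotal(MIN_TIER))   # 380
print(subtotal(FULL_TIER))  # 3250
```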

Step 2:
Books will be given for free to heads of research labs, professors, and researchers in Astana, Kazakhstan. The recipients include highly motivated Master's and PhD students and professors at Nazarbayev University (NU). Books will also be given to the teams of the NU Internet of Things lab, the Applications of Signal Processing lab, and the Institute of Smart Systems and Artificial Intelligence. I can share the names and achievements of the recipients if you agree to keep this information confidential from the general public.

What are this project's goals? How will you achieve them?

Goal: convince people to pursue high-impact research on preventing existential risks (x-risks).

How will this funding be used?

Funds will be used to purchase the books, pay the shipping fees, and pay the import taxes.

Who is on your team? What's your track record on similar projects?

Only me; I have no track record on similar projects.

What are the most likely causes and outcomes if this project fails?

Main potential cause of failure:

Recipients do not read the books, or read them but do not go on to pursue AI risk reduction.

Outcome in case of failure: the money is wasted.

How much money have you raised in the last 12 months, and from where?

0 dollars.
