
Shallow review of cost-effectiveness of technical AI safety orgs

Technical AI safety · AI governance

Mikolaj Kniejski

Not funded · Grant
$0 raised

Project summary

This project aims to develop Animal Charity Evaluators-style cost-effectiveness estimates for AI safety organizations. I want to gather and analyze data on key metrics such as people impacted, research output (e.g., papers, citations), and funding received.

What are this project's goals? How will you achieve them?

As a side product, I will also produce a list of papers published by AI safety organizations and of grants distributed by SFF and EAIF. While EAIF already provides an easy-to-use interface for browsing its grants database, SFF does not, and as far as I know there is no database of AIS papers.

My plan:

  1. Data Collection:

    • Gathering publicly available data from websites and impact analyses

    • Scraping websites of organizations listed on the AI Safety Map to compile a comprehensive list of research papers.

    • Using the Semantic Scholar API to gather citation counts for these papers (see the sketch after this plan).

    • Scraping grant databases (SFF, EAIF, ACX) to include grant information.

    • Publishing the collected data as a separate, searchable website (hosted as a static website on Vercel, so free of charge).

  2. Engagement with Organizations:

    • Emailing organizations to request data on people impacted (e.g., participants in their programs).

    • Collecting additional relevant data where feasible, such as social media interactions.

  3. Analysis:

    • Comparing metrics across organizations operating under similar theories of change (a toy example of one such comparison follows this plan).

  4. Dissemination:

    • Publishing findings on LessWrong and the EA Forum, engaging with the comments.
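
A minimal sketch of the citation-lookup step, assuming the public Semantic Scholar Graph API paper-search endpoint (exact endpoint, field names, and rate limits should be checked against the current API docs):

```python
# Minimal sketch: look up a paper's citation count via the Semantic Scholar
# Graph API. Endpoint and field names are assumptions to verify against the docs.
import requests

SEARCH_URL = "https://api.semanticscholar.org/graph/v1/paper/search"

def citation_count(title: str) -> int | None:
    """Return the citation count of the best title match, or None if not found."""
    resp = requests.get(
        SEARCH_URL,
        params={"query": title, "fields": "title,citationCount", "limit": 1},
        timeout=30,
    )
    resp.raise_for_status()
    hits = resp.json().get("data", [])
    return hits[0].get("citationCount") if hits else None

if __name__ == "__main__":
    print(citation_count("Concrete Problems in AI Safety"))
```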
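
A toy example of the comparison step, assuming a hypothetical orgs.csv with columns org, theory_of_change, citations, and funding_usd (the real analysis would use more metrics and more careful normalization):

```python
# Toy comparison: rank organizations within each theory of change by
# citations per $100K of funding. File name and column names are hypothetical.
import csv
from collections import defaultdict

def rank_by_citations_per_100k(path: str) -> dict[str, list[tuple[str, float]]]:
    """Group orgs by theory of change and sort each group by citations per $100K."""
    groups: dict[str, list[tuple[str, float]]] = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            ratio = float(row["citations"]) / (float(row["funding_usd"]) / 100_000)
            groups[row["theory_of_change"]].append((row["org"], round(ratio, 2)))
    return {
        toc: sorted(orgs, key=lambda pair: pair[1], reverse=True)
        for toc, orgs in groups.items()
    }

if __name__ == "__main__":
    for toc, ranked in rank_by_citations_per_100k("orgs.csv").items():
        print(toc, ranked)
```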

How will this funding be used?

$500: Create a rough CSV file compiling publicly available data, including research papers, citations, and grants. Minimal post with analysis and no outreach to organizations for additional data.

$2,000: Publish the database in an accessible format and conduct outreach to organizations to gather data on participants and additional metrics.

$3,000: Go beyond the 80/20 principle, with heavy interaction in the comments and taking requests...

Who is on your team? What's your track record on similar projects?

Re: data analysis: I helped Condor Camp analyze their pre-camp and post-camp surveys and EA Denmark analyze their yearly community survey.

Re: writing posts: I have one post on the EA Forum with 100 karma.

What are the most likely causes and outcomes if this project fails?

N/A

How much money have you raised in the last 12 months, and from where?

$0

Comments (10) · Similar projects (8)

Gavin Leech

Shallow review of AI safety 2024

$20.9K raised

Piotr Zaborszczyk

AI safety fieldbuilding in Warsaw, Poland (funding for 1 semester)

Reach the university that trained close to 20% of OpenAI early employees

Technical AI safety · Global catastrophic risks
$10K raised

Angie Normandale

Diversify Funding for AI Safety

Seeding a business which finds grants and High Net Worth Individuals beyond EA

Science & technology · Technical AI safety · AI governance · EA community · Global catastrophic risks
$0 raised

Kabir Kumar

AI-Plans.com

Science & technology · Technical AI safety · AI governance
$5.37K raised

Jaeson Booker

The AI Safety Research Fund

Creating a fund exclusively focused on supporting AI Safety Research

Technical AI safety
$100 / $100K

Zachary Brown

Create ‘Responsible AI Investing’ recommendations for institutional investors

Four months salary to draft and promote the recommendations, helping investors advocate for specific safety and governance practices at labs and chipmakers.

$0 raised

SaferAI

General support for SaferAI

Support for SaferAI’s technical and governance research and education programs to enable responsible and safe AI.

AI governance
$100K raised

Tyler Johnston

The Midas Project

AI-focused corporate campaigns and industry watchdog

AI governance · Global catastrophic risks
$0 raised