Manifund

Funding requirements

- Sign grant agreement
- Reach min funding
- Get Manifund approval
Scaling Public Understanding of AI Alignment and Existential Risk

Technical AI safety · AI governance

Ethan Nelson

Proposal · Grant
Closes November 10th, 2025
$0 raised · $2,500 minimum funding · $20,000 funding goal

28 days left to contribute

1. What are this project's goals? How will you achieve them?

Goal: Make AI Safety concepts accessible and engaging to a broader audience through high-quality YouTube content, bridging the gap between academic research (Bostrom, Yudkowsky) and public understanding.

Achievement Strategy:

- Pivot the Ethan Nelson YouTube channel from AI business content to AI Safety education

- Produce visually compelling videos that translate complex safety concepts (alignment, existential risk, AGI governance) into digestible content

- Leverage professional video editing to increase watch time and algorithmic reach

- Build a content pipeline covering fundamental AI Safety topics with progressive complexity

- Target 100K+ views per video to maximize impact on public discourse

2. How will this funding be used?

Budget Breakdown:

- Video Production Team (70%): Scale existing editing team to full-time capacity and hire specialized motion graphics/animation editors for complex AI Safety visualizations

- Production Quality (20%): Improved equipment, software licenses (After Effects, Premiere Pro), stock footage, audio production, and specialized graphics assets

- Research & Scripting (10%): Time allocation for deep research into AI Safety literature and expert consultation

Expected Output: 2-4 high-quality videos per month for 12 months, each 15-25 minutes long

3. Who is on your team? What's your track record on similar projects?

Team:

- Ethan Nelson (Creator/Host): Content creator with established YouTube presence on AI topics

- 2 Part-time Video Editors: Experienced in producing AI business content

- 2 Part-time Thumbnail Designers: Proven track record creating click-worthy, engaging thumbnails

- [To be hired] Specialized Motion Graphics Editor: For complex AI Safety concept visualizations

Track Record:

- Channel: https://www.youtube.com/@EthanNelsonScalewithAI

- Growth: Built the Ethan Nelson channel to 23,000 subscribers in 1 year

- Reach: Generated 883,697 views and 32,900 hours of watch time

- Proven audience: Demonstrated ability to attract and retain viewers interested in AI topics

- Content consistency: Established production workflow with existing team capable of sustaining regular output

- Foundation for pivot: Existing audience and production infrastructure provide launching point for AI Safety content transition

4. What are the most likely causes and outcomes if this project fails?

Most Likely Failure Causes:

1. Audience rejection of pivot: Existing AI business audience doesn't follow to safety content; new safety-focused audience doesn't materialize

2. Content complexity barrier: Unable to make safety concepts accessible without oversimplification

3. Algorithm suppression: YouTube doesn't promote longer-form educational content on niche topics

4. Production scaling challenges: Difficulty scaling team capacity or finding specialized animation talent for complex concepts

Mitigation Strategies:

- A/B test content formats early; gradual pivot rather than abrupt shift

- Collaborate with AI Safety researchers for accuracy and credibility

- Diversify platforms (clips to Instagram, long-form videos to YouTube, short-form to YouTube Shorts)

- Leverage existing team's expertise while adding specialized skills incrementally

Failure Outcomes:

- Limited reach (< 50K views/video) reduces impact on public understanding

- Funding doesn't translate to sustainable content production

- Minimal contribution to AI Safety discourse

5. How much money have you raised in the last 12 months, and from where?

This is my first external funding application. The Ethan Nelson YouTube channel has been entirely self-funded to date, which has limited production capacity and content quality. The transition to AI Safety content represents a strategic pivot that requires professional editing resources beyond what self-funding can sustainably provide. This grant would enable the leap from hobbyist-quality to professional-grade educational content capable of competing for attention in the AI discourse space.

Similar projects

Centre pour la Sécurité de l'IA
Scaling AI safety awareness via content creators
4M+ views on AI safety: Help us replicate and scale this success with more creators
Technical AI safety · AI governance · Global catastrophic risks
$21.3K raised

Lindsay Langenhoven
Amplifying AI extinction risk awareness before it's too late
Support our mission to educate millions through podcasts and videos before unsafe AI development outruns human control.
Science & technology · Technical AI safety · AI governance · EA community · Global catastrophic risks
$129 / $50K raised

Michaël Rubens Trazzi
Making 52 AI Alignment Video Explainers and Podcasts
EA Community Choice
$15.3K raised

Gaurav Yadav
AI Governance YouTube Channel
Explainable videos for ideas in AI Governance
AI governance · EA Community Choice
$732 raised

Michaël Rubens Trazzi
Grow An AI Safety Tiktok Channel To Reach Ten Million People
20 Weeks Salary to reach a neglected audience of 10M viewers
Technical AI safety · AI governance
$29.2K raised

Michel Justen
Video essay on risks from AI accelerating AI R&D
Help turn the video from an amateur side-project into an exceptional, animated distillation
AI governance · Global catastrophic risks
$0 raised

Liron Shapira
Doom Debates - Podcast & debate show to help AI x-risk discourse go mainstream
Let's warn millions of people about the near-term AI extinction threat by directly & proactively explaining the issue in every context where it belongs
Technical AI safety · AI governance · EA Community Choice · Global catastrophic risks
$1.44K raised