
Funding requirements
  • Sign grant agreement
  • Reach minimum funding
  • Get Manifund approval

Drive Narrative Change To Reduce Global Catastrophic Risk: Reach 1.3M people

Science & technology · AI governance · EA community · Global catastrophic risks

Akhil Puri

Proposal · Grant
Closes October 17th, 2025
$215 raised
$50,000 minimum funding
$150,000 funding goal


34 days left to contribute


Project Summary

We live in the age of the Metacrisis — multiple global catastrophic risks intersecting and amplifying each other. EA efforts often address individual crises — AI safety, climate change, mental health. But they rarely act on the root causes: misaligned incentives across cultural, economic, political, and spiritual domains.

Without tackling these upstream drivers, efforts risk becoming an endless game of “chopping heads off the hydra.”

This project targets those upstream causes by driving narrative change through original essays, podcasts, and short-form videos that make systemic alternatives visible and emotionally resonant. By telling better stories of hope, resilience, cooperation, and stewardship around systemic alternatives that reduce externalities and long-term risk, I aim to shift culture in ways that unlock durable systems change. This will widen the Overton window for policy and make downstream solutions viable.

I am requesting $150K for one year (minimum $50K) to sustain a professionalized cadence of outputs and allow me to dedicate myself to the project full-time.


Problem

Most global problems persist not because we lack technical solutions, but because of misaligned incentives. Multi-polar traps make cooperation difficult: even when everyone recognizes harm, no single actor can move without being punished.

  • Climate: CEOs know fossil fuels drive collapse, but markets punish companies that cut emissions at the expense of profits.

  • AI: Companies race forward despite knowing the safety risks.

Where incentives align, solutions can emerge quickly — as shown by the record-speed development of COVID vaccines. But decades of failed global coordination on climate and other risks demonstrate the limits of relying on top-down agreements alone.

To unlock systemic change, we need to shift incentives so that it makes sense for individual players to act responsibly, even unilaterally.


Solution

The lever that makes this possible is culture. Humans are meaning-making creatures: beyond survival, we ask what it means to live well. The answers we absorb from stories — in religion, spirituality, media, and culture — shape our values, which then shape the systems and incentives we live under.

Today, most interventions target the downstream layers: technologies, policies, and institutions. These are essential, but fragile if cultural narratives don’t support them. Without cultural legitimacy, policies struggle to pass and technologies struggle to spread.

This project works at the upstream layer — telling better stories that inspire hope, imagination, care, and cooperation, so that new alternatives gain legitimacy and systemic change becomes viable.

Theory of Change

Today’s dominant cultural script equates success with wealth, power, and status. The consequences are visible everywhere: extractive economics that degrade ecosystems, captured political systems, and social dynamics that reward competition over cooperation. This script traps most people in a cycle of mimetic rivalry — pursuing symbols of success that ultimately undermine collective well-being.

Most people, constrained by economic scarcity, will not deviate until they see cultural and economic momentum shifting. But those with relative security can step outside the dominant script and begin building pro-social alternatives: economic policy that focuses on flourishing within planetary boundaries, cooperatives, bioregional initiatives, regenerative agriculture, and participatory governance.

These alternatives offer a different vision of what it means to “succeed”: stewardship instead of extraction, long-term resilience instead of short-term gain, collective wellbeing over zero-sum competition, and belonging instead of status. Their successes provide proof points, which — when amplified through stories — create a virtuous flywheel:

In the flywheel diagram, on the left are the systemic alternatives emerging in technology, economics, governance, and community life; on the right are the kinds of stories that can make these alternatives visible, emotionally compelling, and actionable. This is how the flywheel operates:

  1. Stories of hope and alternatives shift culture.

  2. Culture legitimizes experimentation with aligned systems.

  3. Working alternatives attract more people and weaken incumbent power structures.

  4. Politicians respond to new Overton windows.

  5. Momentum builds until pro-social systems dominate.

This flywheel makes downstream solutions — in governance, technology, and policy — more effective and sustainable. This will also drive impact in the short-term through mechanisms elaborated on in the appendix.

Examples of Alternatives & Type of Stories

The alternatives I will highlight are not isolated fixes, but living demonstrations of another logic of society: cooperation over competition, stewardship over extraction, resilience over short-term gain.

For example:

  1. Doughnut economics — a framework for flourishing for all within planetary boundaries.

  2. Costa Rica’s renewable grid — already running at 98% clean energy.

  3. Mondragón — the world’s largest cooperative, with $11B in revenue.

  4. Ireland’s Citizens’ Assembly — a model for participatory governance on divisive issues.

Most “good news” projects spotlight isolated wins. My work connects them into systemic narratives — exposing cracks in dominant scripts, highlighting courage to defy systemic pressures, and cultivating imagination for a different future. Together, they add up to cultural change that makes policy and technical solutions viable. That’s the missing piece.

The kinds of stories I will tell include:

  1. Big-picture thinking: how incentives drive bad outcomes and what principles alternatives must follow.

  2. Proof of success: UBI pilots in Kenya showing measurable improvements in health and education.

  3. Courage against systemic pressures: Tech workers who risk careers by blowing the whistle on unsafe AI.

  4. Pathways for participation: Practical guides to joining co-ops, bioregional projects, or civic assemblies — making it clear how ordinary people can plug in.

An expanded list of alternatives and types of stories can be found towards the end of this application.

Production Plan/How this funding will be used

  • Long form essays on Substack: 2/month

  • Podcasts: 1/month (conversations with systems-change practitioners)

  • Short videos on YouTube, Instagram, TikTok: 3/week

Total: 24 essays, 12 podcasts, 156 explainer videos

All outputs in a given month will align around one theme, ensuring efficiency and cross-pollination across formats.

Who am I?

I, Akhil Puri, am an ex-climate tech entrepreneur turned full-time writer and systems storyteller. From 2017–2024, I built a crowdfunding platform for green projects, developed solar rooftop projects, and led decarbonization planning at SINAI, a US-based climate tech startup. Earlier, I held leadership roles at Unilever and Practo, India's top health-tech firm. I have experience leading large teams (25+ people) and managing businesses with up to $70 million in turnover.

I hold degrees from BITS Pilani (Computer Science) and IIM Bangalore (MBA), both top institutions in India.

I am now based in Toronto. In 2024, I pivoted from climate tech to spreading awareness about the Metacrisis and its solutions, primarily through my newsletter on Substack. I have demonstrated my commitment to this work by self-funding both my climate tech start-up and last year's exploration to refine my theory of change; I am only requesting funding to scale it up. I hope this reassures supporting EAs of my commitment to the cause and that their dollars will be put to good use.

What's your track record on similar projects?

Over the past year, I’ve been in a period of self-funded exploration — studying the Metacrisis in depth, experimenting with formats, and refining my theory of change. Since this was an exploration phase, I did not have a consistent publishing schedule. But even without that, this work has already begun to reach and resonate with an engaged audience.

  • Reach: 970 followers, 772 subscribers and ~50,000 cumulative views across Substack, LinkedIn, Reddit and other platforms in the past year.

  • Financial validation: 3 paid subscribers on Substack and 8 supporters on Ko-Fi, for a cumulative total of USD 450, despite all content being freely available. This suggests genuine resonance and demand for this kind of work.

  • Engagement: Readers and listeners consistently report that the work helps them see systemic risks more clearly and leaves them feeling more hopeful, motivated, and informed.

  • Amplification potential: followers include climate tech CEOs, academics, civic tech leaders, and film and television producers and writers, several with tens of thousands of followers of their own.

  • Testimonials showing emotional resonance:

Funding Ask

I intend to continue this work regardless of funding. With my own savings, I have roughly one year of runway left. Without external support, I will need to draw down assets or spend time seeking alternate income streams, which would reduce my ability to focus on producing high-quality content at scale.

This grant determines whether the work continues in a lean, survival mode — or whether it can be fully professionalized and scaled. For reference, my market salary would be $250–300K. I have already self-funded the past year’s work of experimentation, continuously refining my theory of change. This significantly reduces risk for funders, who would be supporting not an untested idea but the scaling up of a validated direction.

Target Metrics/Impact

Because culture change is harder to measure than technical interventions, I will focus on a combination of quantitative reach, qualitative depth, and influence proxies.

Reach Projections (12 months):

Benchmarks: comparable Open Philanthropy-funded projects range from $0.07 to $2.33 per view, estimated from their YouTube view counts and the funding they received.

My base-case ($0.11/view) is well within this range, and upside scenarios are far more cost-effective.
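As a quick sanity check (a rough sketch only, assuming the full $150K ask maps onto the 1.3M reach target; the actual modelling is in the Google Sheet below):

```python
# Back-of-envelope check using the two headline numbers from this proposal.
# The $0.11/view base case comes from the reach model in the linked Google Sheet;
# this simple division only shows that figure sits in the right ballpark.

funding_goal = 150_000      # full funding ask, USD
reach_target = 1_300_000    # "Reach 1.3M people" target from the project title

cost_per_view = funding_goal / reach_target
print(f"${cost_per_view:.3f} per view")  # ~$0.115, vs. the $0.07–$2.33 benchmark range above
```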

Details of my reach modelling and other grants can be found in this Google Sheet.

Additional indicators:

  • Subscriber/follower base: grow 5-10x (to 5–10K).

  • Mindset shifts: people reporting that they have moved toward working on systemic alternatives, feel more hopeful, or better understand root causes, similar to the comments I have already received.

  • Engagement: ≥30% open rates, ≥5% video engagement, ≥30% repeat audience.

  • Influence: ≥10 invitations to collaborate/publish/speak; ≥5 citations in EA/civic/policy communities.

  • Sustainability signals: audience willingness to pay (subscriptions, donations, partnerships).

Why EAs should fund this

  1. Amplifies EA work: Many of the projects EAs fund are themselves systemic alternatives in AI, climate, and governance. By telling their stories, I make them visible and culturally resonant.

  2. Upstream leverage: Without culture change, technical solutions remain fragile. Narrative work is an investment in long-term resilience.

  3. Neglected lever: Billions of dollars fund technology and policy, but almost none funds cultural narrative change. This is high-impact, underleveraged work. Philanthropy is not competing with markets here; it fills a critical gap.

Risks & Mitigation

  • Noise & AI slop drown out my content

    • Counter with authenticity and systemic sensemaking; people are tired of rage bait and superficial solutions, which creates an opportunity to stand out in the age of AI slop.

    • Culture change is driven by early adopters and depth of influence, not just reach

  • Direct culture change outcomes are hard to attribute

    • Attribution is inherently challenging in cultural change, so I track influence proxies (citations, collaborations, invitations, comments indicating a shift in mindset or understanding) alongside reach.

  • Platform volatility

    • Diversify across Substack, YouTube, LinkedIn, TikTok, Instagram, Reddit

  • Content being too abstract/preachy/complex

    • Emphasize stories of hope and concrete alternatives

    • Use a range of tones (personal reflection, case studies, explainers) to meet audiences where they are

    • Incorporate feedback loops to avoid insularity.

  • Burnout from high production cadence:

    • Batch workflows

    • Cover one theme/topic across formats each month. This reduces research overhead and meets audiences where they are

    • Repurpose content across formats, use AI to speed up and streamline production

    • Modest editing and distribution support if I get the full $150K

Appendix

Short-term impact pathways

While cultural shifts play out over many years, there are immediate pathways through which this project complements and amplifies existing EA efforts. A few examples:

  • AI Governance

    Public understanding shapes whether AI regulations gain legitimacy. By making AI risk legible and emotionally resonant, stories counter accelerationist hype and broaden the Overton window for pro-safety policy.

  • Climate Coordination

    Stories of local successes — like cooperatives or bioregional stewardship projects — make carbon pricing, adaptation, and just transition policies more culturally acceptable, reducing resistance and denial.

  • Movement-building & Talent Flows

    By inspiring hope and showing concrete pathways for participation, stories help people redirect careers, resources, and energy toward systemic alternatives — amplifying the reach of other EA-aligned interventions.

These are shorter feedback loops: content today can shape discourse, legitimacy, and recruitment tomorrow — even while the deeper cultural flywheel spins over longer time horizons.

Examples of alternatives

Examples of types of stories


Comments: 10 · Offers: 5 · Similar projects: 7

Jeroen Willems

A Happier World (YouTube channel promoting EA ideas)

A Happier World explores exciting ideas with the potential to radically improve the world. It discusses the most pressing problems and how we can solve them.

EA community
$2.79K raised

Michaël Rubens Trazzi

Grow An AI Safety Tiktok Channel To Reach Ten Million People

20 Weeks Salary to reach a neglected audience of 10M viewers

Technical AI safety · AI governance
$28.2K raised
MichelJusten avatar

Michel Justen

Video essay on risks from AI accelerating AI R&D

Help turn the video from an amateur side-project into an exceptional, animated distillation

AI governance · Global catastrophic risks
$0 raised

Shawn Kulasingham

‘Build Responsibly’ | A Documentary Making AI Safety Engaging for the Public.

Creating a cinematic AI safety documentary with entertainment value for the public. Need 5k to create trailer & foundational interviews.

Science & technology · Technical AI safety · AI governance · EA community · Global catastrophic risks
$0 / $150K

Tyler Johnston

The Midas Project

AI-focused corporate campaigns and industry watchdog

AI governance · Global catastrophic risks
$0 raised

Centre pour la Sécurité de l'IA

AI Safety Atlas

Distilling AI safety research into a complete learning ecosystem: textbook, courses, guides, videos, and more.

Technical AI safety · AI governance · Global catastrophic risks
$0 raised

Reed Shafer-Ray

Educating World Class Leaders From Developing World on AI/Pandemics

AI governance · Biosecurity · Global catastrophic risks
$0 raised