Manifund

Funding requirements

Sign grant agreement
Reach min funding
Get Manifund approval

Advancing AI Safety: A Structured Path to Impact

Science & technology · Technical AI safety

Lara Nguyen

Proposal · Grant
Closes June 12th, 2025
$0 raised
$500 minimum funding
$20,000 funding goal


34 days left to contribute


Project summary

AI's rapid advancement demands urgent safety work. With your funding, I will transition to working full-time on AI safety through:

  • Technical training via AI Safety ANZ's structured TARA program

  • Self-directed study in machine learning, causal inference, and alignment research

  • Development of practical safety projects and technical portfolio

This grant will let me dedicate myself fully to acquiring the technical skills needed to help steer AI toward beneficial outcomes.

What are this project's goals? How will you achieve them?

This project will systematically build my technical capabilities in AI Safety through:

Technical Foundation

  • Complete Deep Learning Specialisation and AI Safety Fundamentals courses

  • Develop machine learning expertise

  • Master core alignment methodologies

Field Experience

  • Contribute to open-source AI safety projects

  • Develop proof-of-concept safety implementations

Success Metrics

  • Technical portfolio demonstrating AI safety implementations

  • Contributions to safety documentation or research

  • Secured position in AI safety research/engineering

How will this funding be used?

This funding will support my dedicated transition by covering:

  • Education: $7,236 for technical courses, study materials, and online subscriptions

  • Technology: $1,273 for computing resources essential for AI research

  • Living expenses: $11,256 for rent and necessities during study

  • Professional development: $905 for conference attendance and networking events

Each dollar directly enables my full focus on developing technical skills to contribute to AI alignment research.

Who is on your team? What's your track record on similar projects?

Though transitioning solo, I have demonstrated the capability to master technical fields:

  • Self-taught programming skills leading to a professional support engineering role at SAP

  • Completed Machine Learning Specialisation and Responsible AI courses

  • Participant in the TARA program with structured mentorship from field experts

  • Founded a non-profit tech startup, showing entrepreneurial execution

These experiences demonstrate my ability to acquire complex technical skills rapidly and apply them effectively.

What are the most likely causes and outcomes if this project fails?

Primary risks include:

  • Technical knowledge gaps in specialised alignment areas

  • The competitive job market for AI safety positions

  • Funding limitations affecting full-time dedication

If unsuccessful, consequences include delayed transition, reduced technical contributions, and missed impact opportunities when safety work is most urgent.

Mitigation strategies include leveraging free resources, seeking partial funding alternatives, and building stronger connections with safety organisations.

How much money have you raised in the last 12 months, and from where?

I have not raised funds in the past 12 months. My transition has progressed through self-funding and participation in programs like TARA. This grant represents the critical next step: enabling my full-time focus on developing technical AI safety skills when they are most urgently needed.
