You're pledging to donate if the project hits its minimum goal and gets approved. If not, your funds will be returned.
AI's rapid advancement demands urgent safety work. With your funding, I will transition to full-time work in AI Safety through:
· Technical training via AI Safety ANZ's structured TARA program
· Self-directed study in machine learning, causal inference, and alignment research
· Development of practical safety projects and a technical portfolio
This grant will enable me to dedicate myself to acquiring the technical skills needed to help steer AI toward beneficial outcomes.
This project will systematically build my technical capabilities in AI Safety through:
Technical Foundation
· Complete the Deep Learning Specialisation and AI Safety Fundamentals courses
· Develop machine learning expertise
· Master core alignment methodologies
Field Experience
· Contribute to open-source AI safety projects
· Develop proof-of-concept safety implementations
Success Metrics
· A technical portfolio demonstrating AI safety implementations
· Contributions to safety documentation or research
· A secured position in AI safety research or engineering
This funding will support my dedicated transition by covering:
· Education: $7,236 for technical courses, study materials, and online subscriptions
· Technology: $1,273 for computing resources essential for AI research
· Living Expenses: $11,256 for rent and necessities during the study period
· Professional Development: $905 for conference attendance and networking events
Each dollar directly enables my full focus on developing technical skills to contribute to AI alignment research.
Though I am transitioning solo, I have demonstrated the ability to master technical fields:
· Self-taught programming skills leading to a professional support engineering role at SAP
· Completed Machine Learning Specialisation and Responsible AI courses
· Participant in the TARA program with structured mentorship from field experts
· Founded a non-profit tech startup, showing entrepreneurial execution
These experiences show that I can acquire complex technical skills rapidly and apply them effectively.
Primary risks include:
· Technical knowledge gaps in specialised alignment areas
· A competitive job market for AI safety positions
· Funding limitations that prevent full-time dedication
If the transition fails, the consequences include delayed entry into the field, reduced technical contributions, and missed opportunities for impact at a time when safety work is most urgent.
Mitigation strategies include leveraging free resources, seeking partial funding alternatives, and building stronger connections with safety organisations.
I have not raised funds in the past 12 months. My transition has progressed through self-funding and participation in programs like TARA. This grant represents the critical next step: enabling my full-time focus on developing technical AI safety skills when they are most urgently needed.