Manifund
Alexandra Bos

@AlexandraBos

$0 total balance
$0 charity balance
$0 cash balance
$0 in pending offers

Projects

AI Safety Research Organization Incubator - Pilot Program

Comments

AI Safety Research Organization Incubator - Pilot Program
Alexandra Bos

about 24 hours ago

Final report

Description of subprojects and results, including major changes from the original proposal

Participants rated the program highly: they estimated it accelerated their founding journeys by ~11 months on average. At the end of Phase 1 of the program (online), 66% of participants indicated that the time spent in Phase 1 was 3-10x or 10x+ as valuable as how they would otherwise have spent that time. At the end of Phase 2 (in-person), 85% of participants indicated this.

Please find an overview of the organizations incubated in the program here: https://www.catalyze-impact.org/post/introducing-11-new-ai-safety-organizations-catalyze-incubation-program-cohort-winter-2024-25

To highlight some examples, these are three promising organizations that came out of our Nov-Feb '25 incubation program pilot:

• Luthien: Developing Redwood's AI Control approach into a production-ready solution. Founded by Jai Dhyani, an experienced ML engineer (Meta, Amazon) and a MATS 6.0 graduate who worked with METR. Within two months of its existence, Luthien has already secured nearly $190k through our Seed Funding Circle.

• Wiser Human: a non-profit modeling AI threats for agentic use cases, producing compelling demos to hold AI devs accountable to safety commitments. Co-founded by Francesca Gomez, who worked in digital risk management for many years and has a background in AI, and Sebastien Ben M'Barek, an experienced digital risk management professional with a software engineering and product management background. Wiser Human has received $15k in donations from our Seed Funding Circle.

• Coordinal Research: a non-profit accelerating technical AIS agendas with research automation. Co-founded by Ronak Mehta, a CS postdoc and MATS 6.0 graduate, and Jacques Thibodeau, a former data scientist, MATS graduate, previous founder, and independent alignment researcher focused on automating alignment research. Coordinal has secured $110k in seed funding through members of our Seed Funding Circle.

Please find a few of the testimonials from program graduates below:

  • Jai Dhyani (Luthien): “Catalyze gave me the structure, information, and connections I needed to make Luthien a reality. When I started I had no idea how to build a company or a non-profit, but by the end of Catalyze I not only felt confident in my ability to get started, I was (and remain) optimistic that I will actually succeed in making a meaningful difference. Within three months of the end of the program I had over a year of runway and was well on my way to deploying an MVP.”

  • Cecilia Callas (AI safety comms organization): “Participating in Catalyze Impact was completely transformational for my career journey into AI Safety. (...) being immersed in a community of like-minded AI safety entrepreneurs and having access to advisors helped my co-founder and I to be much more successful, and much more quickly. (...) Within a few months of the Catalyze program concluding, we have secured seed funding for our AI safety communications project, have a clear direction for our organization and, perhaps most importantly, we have affirmed that we could build careers in AI Safety”

  • Francesca Gomez (Wiser Human): “The Catalyze Impact AI Safety Incubator really helped get our AI Safety work off the ground. Weekly sessions with the team and Catalyze’s group of mentors, domain experts in AI Safety, gave us first‑hand, candid feedback that really sharpened our thinking, which would not have been possible to do outside of the programme. By the time the cohort wrapped up, we had mapped a roadmap, secured initial seed funding, and produced the materials that later underpinned our larger grant applications. Another big benefit for us was how Catalyze plugged us straight into the London AI Safety ecosystem. (...) the sense of accountability and the ongoing flow of expertise continue to be invaluable as we grow.”

  • Ronak Mehta (Coordinal Research): “The Catalyze program was integral to the foundation of Coordinal Research. The mentorship, networking, and co-founder matching all directly contributed to the organization's founding. Having a dedicated, full-time commitment and space for 1) learning how to build an organization, 2) building out proofs of concept, and 3) networking with AI safety researchers, funders, and other founders was necessary, valuable, and fun, and I cannot imagine a scenario where Coordinal would exist without Catalyze. Learning what it takes to build a new organization alongside like-minded founders dedicated to AI safety was so valuable, in a way that typical startup incubators couldn't provide. The accountability felt extremely genuine, with everyone seriously considering how their organization could effectively contribute to AI safety.”

Spending breakdown

We spent the ~$16k we raised here primarily on salaries and runway before getting the pilot program funded, as outlined in the comments to this grant.

AI Safety Research Organization Incubator - Pilot Program

Alexandra Bos

about 1 year ago

Progress update

What progress have you made since your last update?

  • Fundraising: We have raised ~$130K, which enables us to run an adjusted version of the incubation program we originally proposed.

  • Supporting new AI Safety organizations: We executed on a number of support interventions for young AI safety research organizations. This includes a 1-week product design sprint with a new evals organization and ongoing consultancy sessions with various new AI Safety research org founders. In these sessions we help clients tackle the main challenges they face and connect them to potential co-founders, funders, or others. We also organized a number of networking events to support these founders (incl. an event with many of the main evals organizations around EAG Bay Area 2024).

  • Finding promising founders: We have looked for and found very promising founders to support, and received ~130 expressions of interest for our upcoming program. Part of our strategy has been experimenting with AI Safety Entrepreneurship community-building, such as hosting a dinner with this theme around EAG London 2024 (~25 attendees) and hosting a Q&A event with three AI safety org founders (~40 attendees).

  • Preparing to launch upcoming programs: We have prepared for launching the upcoming incubation programs (incl. setting up an applicant selection pipeline, headhunting, gathering advisors for incubatees, designing program content, and putting together resources) and plan to launch applications this June, as soon as we wrap up our current hiring round. The program itself will likely start around August.

If either of our funders would like to hear more details about these activities, people, or clients in a de-anonymized manner, we're happy to share this with them directly.

What are your next steps?

  • Wrapping up our hiring round & launching our upcoming programs.

Is there anything others could help you with?

  • Seed funding circle members: we'd like to get in touch with additional people interested in providing seed funding to new AI Safety research organizations (either for- or non-profit). Please let us know if you are interested in this or know someone who might be.

AI Safety Research Organization Incubator - Pilot Program

Alexandra Bos

over 1 year ago

To elaborate on this a bit and make it more precise:
The first $15-20K would go mostly towards runway and very cheap MVPs which don't require resources from us apart from our time (i.e. things we can run online).

Funding above the $15-20K, and up to around $60K, would go towards somewhat less cheap MVPs, such as hosting an in-person mini version of the program for a few individuals who are in the early phases of starting their AI safety research organization.

AI Safety Research Organization Incubator - Pilot Program

Alexandra Bos

over 1 year ago

@RyanKidd thanks for asking!

Receiving the first $5K-$10K in funding would give us very useful runway and resources to put towards better MVPs. We're pretty cheap right now; around $5K gives us one extra month of runway.

The resources would enable us to:

1) Continue to fully dedicate our attention to making this program a reality, and

2) Create space to execute on more MVPs that help us learn how to best shape the program, build a proof of concept, and build our track record in this niche.

In other words, this funding would increase the odds of us being able to run a more comprehensive pilot later in 2024.

The MVPs we are planning, and partially already running, consist of 1) supporting and coaching very early AI safety research organizations and 2) enabling people to find a co-founder for the AI safety research organization they want to set up.

Transactions

For | Date | Type | Amount
Manifund Bank | over 1 year ago | withdraw | -$15,977
AI Safety Research Organization Incubator - Pilot Program | over 1 year ago | project donation | +$200
AI Safety Research Organization Incubator - Pilot Program | over 1 year ago | project donation | +$277
AI Safety Research Organization Incubator - Pilot Program | over 1 year ago | project donation | +$500
AI Safety Research Organization Incubator - Pilot Program | over 1 year ago | project donation | +$15,000