
Funding requirements

  • Sign grant agreement
  • Reach min funding
  • Get Manifund approval

Keep Apart Research Going: Global AI Safety Research & Talent Pipeline

Technical AI safety · AI governance · EA community

Apart Research

Proposal · Grant
Closes June 30th, 2025
$936 raised
$10,000 minimum funding
$954,800 funding goal

29 days left to contribute

You're pledging to donate if the project hits its minimum goal and gets approved. If not, your funds will be returned.

Project summary

Apart Research is at a pivotal moment. In the past 2.5 years, we've built a global pipeline for AI safety research and talent that has produced 22 peer-reviewed publications in venues like ICLR, NeurIPS, ICML, and ACL, engaged 3,500+ participants in 42 research sprints across 50+ global locations, and helped launch top talent into AI safety careers.

Our impact spans research excellence, talent development, and policy influence. Two of our recent publications received Oral Spotlights at ICLR 2025 (top 1.8% of accepted papers), and our research has been cited by leading AI and AI safety labs. Our participants have landed jobs at METR, Oxford, and Far.ai, and have taken on impactful founder roles, while our policy engagement includes presenting at premier forums like IASEAI and serving as expert consultants to the EU AI Act Code of Practice. Major tech publications have featured our work, extending our influence beyond academic circles. Without immediate funding, this momentum will stop in June 2025.

Read our Impact Report here: https://apartresearch.com/donate

What are this project's goals? How will you achieve them?

Our primary goals are to:

  1. Convert untapped technical talent into AI safety researchers: Via our global research sprints (200+ monthly participants), identify individuals with exceptional potential from tech and science backgrounds and enable them to contribute to AI safety immediately

  2. Produce high-impact technical AI safety research: Publish 10-15 new peer-reviewed papers on critical challenges including interpretability, evaluation methodologies for critically dangerous capabilities, and AGI security and control; enable horizon-scanning for important research topics in AI safety via our open-ended research hackathons

  3. Place trained researchers at key organizations: Support 30+ Lab Fellows at any given time, preparing them for roles at leading AI safety institutions, non-profits, and startups

We'll achieve these through our proven three-part model:

  • Global Research Sprints: Weekend-long events across 50+ locations identifying promising researchers and novel approaches

  • Studio Program: 4-week accelerator developing the best sprint ideas into substantive research proposals

  • Lab Fellowship: 3-6 month intensive global program for publication-quality work with compute resources, project management, and mentorship

Our model excels at rapidly identifying and developing talent with significant counterfactual impact. For example, one participant in our March 2024 METR x Apart hackathon, a serial entrepreneur with a physics and robotics background, joined METR as a full-time member of technical staff largely because of our event. Shortly afterward, he also contributed to a research project in our lab, which he presented at ICLR 2025 (and which received an Oral Spotlight). Other fellows have landed jobs at Oxford and Far.ai, founded impactful AI safety startups, or established new AI safety teams at high-growth organizations.

How will this funding be used?

This funding will directly support our talent and research acceleration pipeline. Our budget breakdown for 12 months is as follows (partial funding scales down accordingly):

Staff Compensation for 8 FTE ($691,200, 73%):

  • Research Project Management ensuring fellows produce publication-quality work

  • Research Engineering providing technical support and automation across projects and talent pipelines

  • Sprint & core operations ensuring program effectiveness, follow-up, and impact

Program-Related Costs ($156,000, 16%):

  • Direct Program Expenses ($54,000): Lab & Studio infrastructure, research software, fellow conference travel and attendance

  • Travel Costs ($102,000): Travel, conference attendance, meals, and accommodations for the Apart team and Lab Fellows presenting their work

Indirect Costs & Fiscal Sponsorship ($107,600, 11%):

  • Indirect Expenses ($60,000): Software & subscription costs, office rental and other necessary operational expenses

  • Fiscal Sponsorship ($47,600): Costs incurred through our agreement with Ashgro, which provides accounting, legal support, and non-profit status

Our full ask of $954,800 represents a 12-month budget. Toward that amount, we have multiple funding milestones:

  • $120,000 will be enough to keep our position in AI safety and expand our automated field-building and research tooling, though we would need to cut staff and all programs.

  • $238,700 is the minimum amount we need to continue our research and events work for three months, providing continued opportunities for the Apart community.

  • $477,400 will carry us through the end of the year, giving hundreds of people a chance to participate in and contribute to AI safety.

$954,800 will enable us to continue into 2026 with our research and events work, creating impact for thousands of people.

How much is your donation worth?

  • $100,000 enables the publication of 4 peer-reviewed AI safety papers

  • $50,000 enables 3 global research sprints identifying new safety approaches

  • $5,000 supports a Lab Fellow producing publication-quality research (including conference attendance)

  • $200 enables 3 people to participate in a hackathon

Who is on your team? What's your track record on similar projects?

Our team combines research expertise and operational excellence, with the following key members:

  • Jason Hoelscher-Obermaier (Research Director): PhD in quantum optics, AI engineer at multiple startups, and PIBBSS fellow

  • Natalia Pérez-Campanero (Research Project Manager): PhD in Bioengineering, former program manager at the Royal Society's talent accelerator

  • Archana Vaidheeswaran (Community Program Manager): Board member at Women in ML, experienced in organizing workshops with 2,000+ participants co-located with major ML conferences

  • Jaime Raldúa (Research Engineer): 8+ years of ML engineering experience, with multiple key contributions to software stacks at impactful EA orgs

  • Jacob Haimes (Research Assistant): MS from CU Boulder, AI Safety Specialist at the Odyssean Institute, and founder of the Into AI Safety podcast

  • Clement Neo (Research Assistant): Research Engineer at the Singapore AISI and former research intern at Oxford, supporting Apart researchers part-time

Advisors:

  • Esben Kran: Co-founder and advisor

  • Finn Metz: Operations and funding advisor

  • Christian Schroeder de Witt: Research advisor

  • Eric Ries: Strategic advisor

  • Nick Fitz: Organizational development advisor

Track Record:

  • 22 peer-reviewed AI safety publications, including at ICLR, NeurIPS, and ACL

  • Two papers receiving Oral Spotlights at ICLR 2025 (top 1.8% of accepted papers)

  • 42 global research sprints engaging 3,500+ participants

  • 105 researchers incubated through our Lab Fellowship, plus 40 through our Studio Program since December

  • 26 placements at 20+ organizations including METR, Oxford, & Far.ai.

  • Research cited by OpenAI's Superalignment team and other major AI labs

What are the most likely causes and outcomes if this project fails?

The most likely failure modes are:

Insufficient funding: Without adequate resources, we would be forced to disband a high-functioning team built over 2.5 years, losing a proven talent pipeline at a critical time for AI safety and abandoning valuable talent and research projects. Mitigation: We have already diversified our funding drastically, including partnerships and sponsorships.

Research relevance and impact: Our research may not keep up with rapidly evolving field priorities and we could face diminishing returns on novelty for our research hackathon model. Mitigation: We maintain close collaboration with leading AI labs and safety organizations to continuously align our research priorities, while our model allows for rapid adaptation to emerging safety concerns and to previously neglected topic areas. 

Opportunity cost: With AI capabilities advancing rapidly, moving fast now is necessary to keep critical momentum at precisely the time when safety research is most needed. Mitigation: Our model is designed for efficiency and rapid adaptation, allowing us to maximize impact per dollar invested while prioritizing time-sensitive work on urgent and impactful research areas, such as by prioritizing research inputs for the General-Purpose AI Code of Practice.

Talent pipeline execution risk: Challenges in maintaining consistent quality across a pipeline that spans global mid-career talent and early-career researchers, and in avoiding overlap with other programs. Mitigation: We have systematic evaluation metrics for participants and a strategic focus on technical backgrounds and locations where we complement rather than compete with existing programs. Examples of differentiation include being remote-first and part-time, essential for helping mid-career individuals transition, and focusing on strong research management, helping non-academics succeed in research.

Industry and partnership challenges: Difficulties in launching new programs, ensuring partner alignment, and continuously facilitating high-quality connections between stakeholders. Mitigation: We've built strong connections with leaders and researchers at key organizations, established formal partnership agreements with clear expectations, and designed our talent pipeline to align with the needs of the AI safety field. We are also expanding our sponsorship setup, through which, for example, Lambda Labs provides $5k in compute to every team for free.

Broader ecosystem risks: Public skepticism of AI safety work could negatively impact donor perception and fundraising efforts. Mitigation: We maintain transparent operations, publish our research openly, engage constructively with diverse perspectives, and focus our messaging on concrete technical contributions.

If we fail to maintain Apart Research, the field would lose:

  • A proven pipeline for identifying and developing global technical talent in AI safety

  • An efficient mechanism for exploring novel research directions at scale

  • A bridge between diverse technical communities and established AI safety organizations

How much money have you raised in the last 12 months, and from where?

In the past 12 months, Apart Research has raised approximately $680,000 from:

  • Survival and Flourishing Fund (SFF)

  • AI Safety Tactical Opportunities Fund (AISTOF)

  • Open Philanthropy

We've also previously received support from the Long-Term Future Fund (LTFF), Foresight, and ACX. This funding has enabled us to build our team and infrastructure, but our current funding expires in June 2025, necessitating this fundraising round to maintain our operations.

View our donations page, read our impact report and find more testimonials here:

https://apartresearch.com/donate

Comments (20) · Offers (16)
offering $10
🌷

1 day ago

Apart Research publishes multiple papers in AI safety. I don't like the entrepreneurial vibe, but they have results.

Anton Makiievskyi

1 day ago

With such an amazing track record and previous support from big funders, I wonder why they (the big funders) are letting you run out of money. Did they all refuse additional support?

Apart Research

about 20 hours ago

@AntonMakiievskyi

TL;DR: No big funders have retracted funding; we have undersold our impact in grant applications by not focusing on the metrics that we now know work best; our network within SF isn't as strong as other orgs'; and the AIS funding ecosystem faces general problems. See a longer response to this question here.

offering $10

Charbel-Raphael Segerie

2 days ago

Apart has been useful for me to quickly experiment with ideas and improve through fast iteration. I've organised multiple hackathons before knowing Apart, and their format is vastly more effective at converting talent per unit of effort. While I was head of EffiSciences’ AI Safety Unit, this was one of my favorite event formats, and it is one of the formats I encourage alumni of ML4Good to run. Empirically, each Apart hackathon that I organized in Paris enabled the long-term careers of 0.6 people in AI safety (see the table); that is, on average, 0.6 new full-time people started working on AI safety after each Apart hackathon event in Paris.

offering $20

Bart Bussmann

3 days ago

During an Apart Research hackathon I got my first hands-on experience with mechanistic interpretability and fell in love with it. Now I'm working on mechinterp full-time and have published mechinterp papers at ICLR and ICML. Generally their hackathons are very accessible, well-organized, and a great entry-point for people interested in AI safety.

I believe this is a great funding opportunity for any funders interested in getting more people into working on AI safety!

Brian Tan

4 days ago

I've been impressed with Apart Research for a while now, and I think more funders should consider filling their funding gap. Their Apart Sprints are a great global on-ramp for many people into AI safety research, and some of our fellows at WhiteBox Research have participated in them. I also got to listen to both of their ICLR orals in-person, which were insightful.

offering $400

Jim Chapman

4 days ago

I think Apart Research's hackathons are a great place for community and skill building. I want your work to continue.

offering $10

Felix Michalak

8 days ago

AI safety is suffering from elitism, induced by scarce financial resources that also have to be used to fund high-paying jobs to make career switches more attractive. Even though interest in AI safety has grown tremendously, these dynamics are still in place, and almost no one offers low entry points for early-career individuals (e.g. students in undergraduate studies).

Apart is the only organization I know that goes against this elitist trend by offering support and guidance to virtually anyone who is seriously interested in AI Safety research, no matter their previous qualifications. The (AI Safety) world needs more research and field-building organizations like Apart, as otherwise the field of AI Safety will not manage to develop fast enough to solve fundamental issues before it's too late.

Apart had a tremendous impact on the development of my research skills and interests, and I am grateful for their existence. This organization cannot cease to exist, as it would be detrimental to the AI Safety landscape.

offering $26

Evelyn Ciara

9 days ago

I participated in Apart's security evaluations hackathon with a friend last May, and it helped me get my foot in the door and start exploring the AI safety technical research space. I believe that Apart's hackathons and other programs are a critical part of the talent pipeline for the AI safety community, and it would be a shame if they had to shut down due to lack of funding.

offering $50

Devina Jain

10 days ago

Apart Research offers an accessible entry point to AI safety research, and it would be a tremendous loss if it didn't exist anymore. I don't believe there's another organization that does what they do!

🦄

10 days ago

I first got into mech interp by going to an Apart Hackathon. I have enjoyed every Apart Hackathon that I've participated in and found them to be extremely productive.

Abby Lupi

10 days ago

Apart Research forms a critical entry point to contributing meaningfully to research into the powerful technologies poised to shape our future. It is absolutely essential that we expand participation in AI ethics and safety research beyond the institutions with the most money and power. As a widely recognized and increasingly accessible nonprofit, Apart is uniquely positioned to elevate global, diverse, and underrepresented voices in this critical conversation. I'm personally thankful for Apart as an early-career researcher without a graduate degree who is now equipped to contribute to the research community, something I previously thought was out of reach.

offering $60

Jeremias Ferrao

10 days ago

Apart has been instrumental in my early career as an AI researcher. Its hackathons and Lab program significantly boosted my research skills and confidence, culminating in my first publication at a workshop of a top-tier conference. This pivotal achievement, in particular, has profoundly shaped my future aspirations and motivated me to aim much higher. Beyond the technical growth, I greatly valued engaging with like-minded individuals and becoming aware of the broader AI safety community and opportunities. Moreover, Apart's highly accommodating programs provided essential flexibility, a key factor that makes participation feasible for many members. I am very optimistic about the organization and wholeheartedly believe in its continued ability to produce world-class, socially impactful research.

Mindy Ng

10 days ago

Apart Research has immensely helped me in my AI safety/alignment journey. Coming from industry, I knew about AI development but not necessarily its consequences, so the experience has been enlightening. Through Apart, I have gained a deeper understanding of AI safety through their Hackathon, Studio, and Fellowship. Not only am I able to connect with others in the lab through a shared mission, but I can also help make an impact through rigorous research. The lab has helped develop my research skills so that I can contribute to issues that urgently need to be explored yet have few people working on them. Apart Research helps build up the talent needed so that AI safety labs can continue to ensure safe AI. If Apart Research shuts down, it would be a huge loss not just to the research field, but to society.

offering $30

Jord Nguyen

10 days ago

Love the hackathons and the Apart community! They were very useful when I first started safety research.

Eitan Sprejer

10 days ago

Apart makes a huge impact by giving early-career aspiring AI safety researchers the opportunity to do a research project over a weekend and have something to show when applying to research internships.

Markela Zeneli

10 days ago

Apart Research is the most accessible AI safety research group I have encountered. Their pipeline of hackathons -> Studio -> Lab Fellowship is unique, and given the number of successful papers that come out of the lab, the pipeline works. Apart also acts as a catalyst for people who want to career-switch into AI safety, and the mentorship I have received from them has been the most impactful and valuable contribution to my personal journey.

Balancing accessibility with ops can be incredibly tricky, but Apart have a proven track record of being able to balance it. Every member is dedicated to producing -exceptional- research, whilst bringing people up to speed with the latest developments and unbiased viewpoints. This alone is priceless, but without funding, they will not be able to continue fostering such an amazing community, and that would be a massive loss to the AI Safety field.

offering $30

Nyasha Duri

10 days ago

Apart Research is an exceptional, vital, and impactful community, to say the least. Without it, I would not have been able to connect with such a transformative, inclusive, and unique network. Having experienced and/or evaluated many other formats, I believe that their model is the most effective and efficient.

The experiences I gained have been very helpful for my career overall, and instrumental in focusing more on AI safety. To name just one example of the many ways it has been extremely valuable: it has led to inbound opportunities with multinationals, national firms, and a leading local education institution.

I 100% feel this is one of the best things I have ever been a part of, where so much talent abounds from fellow participants, mentors, speakers and so on. As for the great minds behind the scenes powering things, I have had the privilege of collaborating with many incredible teams over the past decade: they truly stand out as among the best of the best.

Also, I know how meaningful it is to other people, including those who tell me they have yet to take part in a research sprint but aspire to do so. Not being able to continue this essential work would be a huge loss, not to mention at such a crucial time for the field.

What I have written above in yet another failed attempt to summarise (should have gotten an LLM involved but didn't think of it until now) does not do Apart justice; I would be happy to expand anytime.

offering $20

Auguste Baum

10 days ago

Apart is a great community to get up to speed in AI safety

offering $20

Lucie Philippon

10 days ago

Apart Sprints were useful in bootstrapping my AI safety career. The hackathon reports were my first publications. They also gave me concrete experience thinking about LLMs and AI strategy.