Seldon Labs runs a for-profit accelerator for AI security and assurance startups.
In our first SF batch, we worked with companies like Andon Labs, Lucid Computing, and Workshop Labs, which went on to raise significant follow-on funding, close contracts with labs such as xAI and Anthropic, and receive coverage in outlets like TIME.
With this round, we're running Batch #2 in SF for 5–10 teams working on AI security, assurance, and related infrastructure. The funding enables us to directly support founders relocating to and building in SF, and run a focused, in-person program aimed at turning technically strong, safety-motivated teams into durable AI security companies.
Structure note: This is structured as a dilutive investment (YC SAFE) into Seldon Labs PBC, routed through Manifund as fiscal sponsor. Returns, if any, accrue to the funder's Manifund balance.
Goals: Increase the number and quality of AI security/assurance startups that materially reduce catastrophic AI risk, and help them reach product-market fit and funding.
Plan: We'll run an in-person SF accelerator (Jan–Apr 2026) for 5–10 teams focused on AI security infrastructure, AI assurance and governance tooling, and adjacent safety-relevant infrastructure. We provide upfront capital, work with founders on problem selection, go-to-market, and governance, and leverage our network to help secure early pilots and raise follow-on funding.
Use of funds: Direct founder support (upfront investments into batch companies) and program operations (team, space, events, legal/admin).
Team:
Finn Metz: Co-founder, Seldon Labs. Background in VC/PE. Co-founder of the AI Safety Founders community.
Esben Kran: Co-founder, Seldon Labs. Co-founder of Apart Research. Deep connections across AI safety labs, researchers, and funders.
Track record (Batch #1): Our first batch worked with Andon Labs, Lucid Computing, Workshop Labs, and others. Those teams raised >$10m collectively, closed contracts with major labs (xAI and Anthropic), and were featured multiple times in TIME.
We're advised by Nick Fitz (Juniper Ventures) and Eric Ries (LTSE, The Lean Startup).
Most likely ways this could fail:
We can’t convert enough top-tier applicants into a strong batch (e.g. key founders decide to join labs or move into other roles instead).
We under-resource the program, providing less support than in Batch #1 (e.g. not enough staff or time for hands-on help).
Some companies drift away from safety-critical problems despite our theory-of-change guidance and governance work.
Outcomes if that happens:
You still get several competent teams building AI-security-adjacent tools, but with lower impact density.
Over the last 12 months, Seldon Labs PBC has received $53k in non-dilutive funding from the Survival and Flourishing Fund (SFF). Seldon has also received in-kind contributions and investments from private investors.
Esben Kran
9 days ago
If you're interested in joining us, go read our Request for Startups: https://seldonlab.com/rfs
Charbel-Raphael Segerie
17 days ago
In our first SF batch, we worked with companies like Andon Labs, Lucid Computing, and Workshop Labs, --> What was the counterfactual impact?
Esben Kran
9 days ago
Blocked a predatory investor that may have had a 40% chance of destroying an impactful mission 2 years down the line (the biggest risk)
Counterfactually led to Workshop Labs' successful round, as reported independently by them
Established and supported their re-incorporation as public benefit corporations, embedding the AI safety mission in their legal registration
Was counterfactually responsible for at least $565k direct investments into their rounds
Direct design and product decisions supported by us, including pivots into more viable business models to support their mission
Gave them a much better understanding of investors' perspectives and how to navigate them
Hiring decisions and support through the Apart network, including running a 1,000-person hackathon
New friends (pretty invaluable to have a community within which you can support each other with full trust - this keeps paying dividends now)
The counterfactual impact seems radically higher than that of an equivalent non-profit accelerator, given the scale of these companies' downstream impact; our direct impact on their strategic direction was lower, though, due to the founder demographic's strong (and most often good) opinions.
Akshyae Singh
21 days ago
Hard vouch for the Seldon team; Finn and Esben are extremely fast-moving, with an in-depth understanding of the for-profit AI safety space.
Austin Chen
21 days ago
Approving this project; we've been happy to host the first cohort of Seldon at Mox, and I'm excited for more incubation and support in the space of AI safety startups. Finn and Esben are doing good work and I'm looking forward to seeing what's next!