plex

@plex


Comments

Ambitious AI Alignment Seminar

plex

7 days ago

@Richard We have good leads on how to get good leads (three people with good contacts/recruitment skills in relevant areas), and some interested mentors, but we haven't started mass outreach yet; we're waiting until funding is locked in, since I'd expect outreach to spoil more leads than it generates if we're not confident the event is happening.

  • Persuading them that it's hard isn't the angle I'm hoping for, but I imagine they'll naturally conclude that by looking at a bunch of the info and topics. Agreed that interest/curiosity is great as a motivator.

  • Yeah, it's definitely possible to select out the best candidates if you apply a "non-disruptive" filter wrongly. I mostly want to avoid people who are something like recklessly/incorrigibly disruptive, or the closed/incurious kind of overconfident, in a way that blocks good conversation and intellectual progress, while keeping the truth-seeking disagreeable people and the weird geniuses with odd social norms.

  • I want to stand by selecting for people who would do something about it if they thought the world was ending. It doesn't have to be EA/altruistic motivation; selfish motivation, or caring about their friends, is basically fine. But I think having ~everyone bought into a certain kind of ambition and taking this seriously, rather than having a bunch of people with the missing mood, is pretty cruxy for the atmosphere and momentum that make great things happen.

The people we've talked to for marketing seem reasonably confident they can get us high-quality candidates, and have done similar-ish things before. This is probably still the least certain part of the chain, and it's possible our deadline is too ambitious: if we notice we're not on track for a sufficiently good crop by March, we'll move to the later dates the venue is free, in May, to improve participant quality.

Ambitious AI Alignment Seminar

plex

8 days ago

@Linda Yeah, the vision I described in our call was not to have strongly stable pods but rather flexible-ish working groups, plus ongoing seminar-style mixed interaction, and the option for at least mentors to be around part-time.

I'll be exploring this with Attila, who wrote that section. He brings a ton of experience with relevant events and will be doing a lot of the event design, and he's excited to make this awesome, but we've only partly synced on models of how best to do that. My guess is we'll end up with a more working-group-style layout rather than strongly stable pods, but I'll examine his reasons for putting this into the draft plan.

My main reason for preferring flexibility over the usual benefits of more fixed pods is that this seminar, unlike most similar events, is focused much more on absorbing existing knowledge and helping people grow rapidly than on producing novel outputs. That makes intermixing unusually beneficial: people can switch groups and tutor newcomers on what they've collected, as opposed to the usual setup where you want to sync deeply with a few people so you can push the boundaries of knowledge and do a project together.

In general, we're planning to iterate on details like this a fair amount as we approach the date, and we've been keen to get this out ASAP so that we have longer, with funding confirmed, to start collecting candidates.

Ambitious AI Alignment Seminar

plex

10 days ago

@tsvibt
> Sounds cool, but do keep in mind that this could also create a social pressure to "publish or perish" so to speak, leading to goodharting.
Clarification: learnings will by default take the form "I picked one of the topics listed as possibly important and read stuff/talked to people until I deeply get it, understand why it's a thing, and can teach it", not necessarily novel research.

Building and maintaining the Alignment Ecosystem

plex

over 2 years ago

AI safety careers mostly needs content writing and editing. We've looked for volunteers, and there has been some progress, but things would move much faster if we could take on a contractor to focus on this full-time, perhaps by sponsoring an AI safety info distillation fellow, since the content is automatically used by both sites.