Project summary
The e/acc movement has been getting a lot of coverage recently, including a Forbes article and a Lex Fridman podcast. Many people seem unaware of what exactly the movement is, and I want to write up a summary of its beliefs and goals for easy reference. In particular, the movement claims to have noble goals, but it regularly engages in misdirection about exactly what those goals are, in ways that may not be obvious to outsiders. I think it would be valuable to have a resource that lays out both the positive ideals of the movement and the deceptive practices its members regularly engage in. While still a very niche community, e/acc is becoming an influential movement within Silicon Valley, and as such can have an outsized impact (likely negative) on the rest of the world.
(I have in mind something similar to SSC's libertarianism and reactionary FAQs.)
What are this project's goals and how will you achieve them?
Goal: Write a summary of the movement and share it with others.
Methods: Write the summary and share it. (I have no particular advertising channel beyond social media and word of mouth; I'm hoping that others find it useful and share it themselves.)
How will this funding be used?
To cover cost of living. It will be a long article: I expect it will take me several days of research to write, and I'll then need to keep it updated as the movement changes over time.
(The cost also factors in the significant amount of prior time I've spent learning about and debating this topic, which has all been unpaid.)
Who is on your team and what's your track record on similar projects?
I'm the sole writer, and I plan to solicit feedback on the article from several members of the e/acc and AI safety communities before publishing. I have a strong understanding of the arguments behind accelerationism and AI risk, having discussed them at length before, such as here and here. I've written articles on related topics in the past, such as here.
What are the most likely causes and outcomes if this project fails? (premortem)
Most likely failure mode: the article doesn't get shared widely. Manifold currently puts this at about a 50% chance, though I expect that number to go down if this project is funded, since funding would indicate significant interest and I could probably get funders to help share it. Still, even if it remains niche, I and a few others will be able to use it as a resource whenever people ask about the movement. (I frequently engage in AI discourse on Twitter, usually productively, I think.)
What other funding are you or your project getting?
None for this project. I've received funding for other, unrelated projects on Manifund; see my profile. I've applied for an ACX grant for this project, but I think it's very unlikely to be funded.