Write an e/acc FAQ

Science & technology · Global catastrophic risks

Isaac King

Not funded · Grant
$0 raised

Project summary

The e/acc movement has been getting a lot of coverage recently, including a Forbes article and a Lex Fridman podcast. Many people seem unaware of what exactly the movement is, and I want to write up a summary of its beliefs and goals for easy reference. In particular, the movement claims to have noble goals, but regularly engages in misdirection about exactly what those goals are, in ways that may not be obvious to outsiders. I think it would be valuable to have a resource that points out both the positive ideals of the movement and the deceptive practices its members regularly engage in. While e/acc is still a very niche community, it's becoming an influential movement within Silicon Valley, and as such can have an outsized impact (likely negative) on the rest of the world.

(I have in mind something similar to SSC's libertarianism and reactionary FAQs.)

What are this project's goals and how will you achieve them?

Goal: Write a summary of the movement and share it with others.

Methods: By writing such a summary and sharing it with others. (I have no particular advertisement channel beyond social media and word of mouth; I'm hoping that others find it useful and share it themselves.)

How will this funding be used?

To cover cost of living. It will be a long article: I expect it will take me several days of research to write, and I'll then need to keep it updated as the movement changes over time.

(The cost also factors in the significant amount of prior time I've spent learning about and debating this topic, which has all been unpaid.)

Who is on your team and what's your track record on similar projects?

I'm the sole writer, and I plan to solicit feedback on the article from several members of the e/acc and AI safety communities before publishing. I have a strong understanding of the arguments behind accelerationism and AI risk, having discussed them at length before, such as here and here. I've written articles on related topics in the past, such as here.

What are the most likely causes and outcomes if this project fails? (premortem)

The most likely failure mode is that it doesn't get shared widely. Manifold currently puts this at about a 50% chance of occurring, though I expect that number to go down if this project is funded, since funding would indicate significant interest and I could probably get funders to help share it. Still, even if it remains niche, I and a few others will be able to use it as a resource whenever people ask about the movement. (I frequently engage in AI discourse on Twitter, usually productively, I think.)

What other funding are you or your project getting?

None for this project. I've gotten funding for other, unrelated projects on Manifund; see my profile. I've applied for an ACX grant for this project, but I think it's very unlikely to be funded.
