Manifund
Greg Colbourn

@Greg_Colbourn

Global moratorium on AGI, now. Founder of CEEALAR.

https://twitter.com/gcolbourn
$0 total balance
$0 charity balance
$0 cash balance

$0 in pending offers

Outgoing donations

PauseAI local communities - volunteer stipends
$85,500
5 months ago
PauseAI US 2025 through Q2
$90,000
6 months ago
AI-Plans.com
$5,000
over 1 year ago
Alignment Is Hard
$3,800
almost 2 years ago

Comments

Dads Against AGI Inc.

Greg Colbourn

17 days ago

>"AI safety is a very complex issue and it's really not that straightforward at all in terms of what needs to be done."

I think at this stage it is actually pretty straightforward. No one knows how to control or align ASI, so we should make sure it doesn't get built for at least a few years. And to make this happen, we need massive public engagement.

@casebash

Dads Against AGI Inc.

Greg Colbourn

17 days ago

I'm interested in why you think we can make AI go well (beyond optimism, what is the mechanism?) @Austin

Creating 'Making God': a Feature Documentary on risks from AGI

Greg Colbourn

about 1 month ago

I've provided some seed funding for this (outside of Manifund). We really need broad public communication on AI risk to get it further up the political agenda. Something like Seaspiracy -- where they go down the rabbit hole -- but for AI, would be amazing.

CEEALAR

Greg Colbourn

about 1 month ago

(I'm stepping down as ED in order to focus more of my efforts on slowing down/pausing/stopping AGI/ASI, which for some time now I've thought of as the most important, neglected and urgent cause. I will remain a Trustee, but take more of a back seat role.)

CEEALAR

Greg Colbourn

about 1 month ago

CEEALAR is hiring for a new Executive Director, full time, to begin in May. Applications close at the end of this month (sorry for not posting here earlier!). Please share with anyone you think might be interested / a good fit.

PauseAI US 2025 through Q2

Greg Colbourn

5 months ago

PauseAI Global donation: https://manifund.org//projects/pauseai-local-communities---volunteer-stipends?tab=comments#6fa14ebc-2d4a-4946-b730-941e8d5e1501

PauseAI local communities - volunteer stipends

Greg Colbourn

5 months ago

Note that this is $90k less Manifund fees (same as my donation to PauseAI US - https://manifund.org//projects/pauseai-us-2025-through-q2?tab=comments#2d85cbfd-d392-447c-ad7f-da056aa77928 - just the fees are taken out first here)

PauseAI local communities - volunteer stipends

Greg Colbourn

5 months ago

It's more important than ever that PauseAI is funded. Pretty much the only way we're going to survive the next 5-10 years is by such efforts being successful to the point of getting a global moratorium on further AGI/ASI development. There's no point being rich when the world ends. I encourage others with 7 figures or more of net worth to donate similar amounts. And I'm disappointed that all the big funders in the AI Safety space are still overwhelmingly focused on Alignment/Safety/Control when it seems pretty clear that those aren't going to save us in time (if ever), given the lack of even theoretical progress, let alone practical implementation.

Note that this is to be considered general funding to PauseAI Global, maxing out the volunteer stipends fundraiser and funding additional hires (from OP: "If we surpass our goal, we will use that money to fund additional hires for PauseAI Global (e.g. a Social Media Director).")

PauseAI US 2025 through Q2

Greg Colbourn

6 months ago

(This was 1 Bitcoin btw. Austin helped me with the process of routing it to Manifund, allowing me to donate ~32% more, factoring in avoiding capital gains tax in the UK).

PauseAI US 2025 through Q2

Greg Colbourn

6 months ago

I've been impressed with both Holly and Pause AI US, and Joep and Pause AI Global, and intend to donate a similar amount to Pause AI Global.

PauseAI US 2025 through Q2

Greg Colbourn

6 months ago

It's more important than ever that PauseAI is funded. Pretty much the only way we're going to survive the next 5-10 years is by such efforts being successful to the point of getting a global moratorium on further AGI/ASI development. There's no point being rich when the world ends. I encourage others with 7 figures or more of net worth to donate similar amounts. And I'm disappointed that all the big funders in the AI Safety space are still overwhelmingly focused on Alignment/Safety/Control when it seems pretty clear that those aren't going to save us in time (if ever), given the lack of even theoretical progress, let alone practical implementation.

AI-Plans.com

Greg Colbourn

over 1 year ago

Supporting this because it is useful to illustrate how there are basically no viable AI Alignment plans for avoiding doom with short timelines (which is why I think we need a Pause/moratorium). Impressed by how much progress Kabir and team have made in the last few months, and look forward to seeing the project grow in the next few months.

Alignment Is Hard

Greg Colbourn

almost 2 years ago

This research seems promising. I'm pledging enough to get it to proceed. In general we need more of this kind of research to establish consensus on LLMs (foundation models) basically being fundamentally uncontrollable black boxes (that are dangerous at the frontier scale). I think this can lead - in conjunction with laws about recalls for rule breaking / interpretability - to a de facto global moratorium on this kind of dangerous (proto-)AGI. (See: https://twitter.com/gcolbourn/status/1684702488530759680)

Transactions

For | Date | Type | Amount
PauseAI local communities - volunteer stipends | 5 months ago | project donation | 85,500
Manifund Bank | 6 months ago | deposit | +85,500
PauseAI US 2025 through Q2 | 6 months ago | project donation | 90,000
Manifund Bank | 6 months ago | deposit | +90,000
AI-Plans.com | over 1 year ago | project donation | 5,000
Manifund Bank | over 1 year ago | deposit | +3,800
Alignment Is Hard | almost 2 years ago | project donation | 3,800
Manifund Bank | almost 2 years ago | deposit | +5,000