I want to donate some money (not much, just what I can afford) to AGI Alignment research, to whatever organization has the best chance of making sure that AGI goes well and doesn't kill us all. What are my best options, and where can I make the most difference per dollar?

post by lumenwrites · 2022-08-02T12:08:46.674Z · 3 comments

This is a question post.


I don't understand this field well, I'm hoping you guys can help me out.

Answers

answer by Richard_Ngo · 2022-08-02T18:11:04.196Z

In general, for donating to alignment work, I think the best approach is to focus on local grants, because those are the ones that won't be picked up by bigger funders, who have a lot of money right now. By "local" I mean things like: if you meet someone who seems promising, fund their flights to visit an alignment hub, their textbooks, their ML tutoring, and so on.

comment by Quadratic Reciprocity · 2022-08-03T01:29:02.295Z

Couldn't these people just apply to the Long-Term Future Fund (LTFF) for those kinds of things, and couldn't the LTFF get better at recognising who is promising among the people who apply?

comment by Zach Stein-Perlman · 2022-08-03T02:49:55.727Z

Grantmakers aren't always "better at recognising who is promising," mostly because you sometimes have important information they don't, like knowing that someone is smart and knowledgeable and altruistic beyond what that person can make legible to time-constrained grantmakers. If you don't have relevant information, e.g. because you don't know any promising people-who-need-funding, donating to LTFF is great.

(I have donated to LTFF. I have never made a "local grant," but I would consider it if I had more money and I knew promising people-who-need-funding.)

comment by Richard_Ngo (ricraz) · 2022-08-03T06:18:38.228Z

I agree with this, but I think the main bottleneck here is just that those people often don't apply to the LTFF.

answer by Buck · 2022-08-02T18:37:02.405Z

My guess is that the Long-Term Future Fund is the best you can do. (I'm a fund manager on a different EA fund.)

answer by Brian Goodrich · 2022-08-02T15:17:51.447Z

Larks' 2021 AI Alignment Literature Review and Charity Comparison is a good summary of the organizations and their funding situations.

3 comments


comment by Chris_Leong · 2022-08-02T17:24:54.260Z

I'm actually about to announce an AI Safety microgrant initiative for people who are looking to commit at least $1,000 USD for every year they choose to be involved. The post will be out in the next few days; let me know if you want me to link you when it's ready.

comment by Cedar (xida-ren) · 2022-08-03T06:38:20.906Z

Could you please link me when grant applications are open instead?

I'm considering AI Safety as a career and would go for it if I could have some time where I'm not worried about rent.

comment by Chris_Leong · 2022-08-03T17:40:50.686Z

There may not be an open round, as we may find connections through our networks. However, if there is, we will post it on the forum.