Where might I direct promising-to-me researchers to apply for alignment jobs/grants?

post by abramdemski · 2023-09-18T16:20:03.452Z · LW · GW · 10 comments

This is a question post.

It sometimes happens that people I've talked to or worked with ask me where they should apply for financial support for their research. I haven't developed a standard list of answers to this question. It seems to me like there have been a lot of new orgs recently, and I'm losing track!

If you are looking for such applicants or know someone who is looking, consider replying as an answer (or sending me a PM if that makes more sense for whatever reason).


answer by mruwnik · 2023-09-19T00:49:22.786Z · LW(p) · GW(p)

There is a Stampy answer to that which should stay up to date here.

comment by Keenan Pepper (keenan-pepper) · 2023-09-21T00:11:00.513Z · LW(p) · GW(p)

Upvoted because this mentions Nonlinear Network.

answer by LawrenceC · 2023-09-18T21:45:54.207Z · LW(p) · GW(p)

The main funders are LTFF, SFF/Lightspeed/other S-process stuff from Jaan Tallinn, and Open Phil. LTFF is the main one that solicits independent researcher grant applications.

There are a lot of orgs. Off the top of my head, there are Anthropic/OpenAI/GDM as the scaling labs with decent-sized alignment teams, and then a bunch of smaller/independent orgs:

  • Alignment Research Center
  • Apollo Research
  • CAIS
  • CLR
  • Conjecture
  • FAR
  • Orthogonal
  • Redwood Research

And there's always academia.

(I'm sure I'm missing a few though!)

(EDIT: added in RR and CLR)

comment by DanielFilan · 2023-09-19T01:33:17.874Z · LW(p) · GW(p)

Redwood Research?

Replies from: LawChan
comment by LawrenceC (LawChan) · 2023-09-19T15:56:15.874Z · LW(p) · GW(p)

I don't think they're hiring, but added. 

comment by Martín Soto (martinsq) · 2023-09-19T11:42:05.035Z · LW(p) · GW(p)

Center on Long-term Risk (CLR)

comment by amaury lorin (amaury-lorin) · 2023-09-19T21:14:36.421Z · LW(p) · GW(p)

In France, EffiSciences is looking for new members and interns.

answer by plex · 2023-09-19T14:10:27.935Z · LW(p) · GW(p)

https://aisafety.world/tiles/ has a bunch.

answer by Chipmonk · 2023-09-19T22:32:10.899Z · LW(p) · GW(p)

Very surprised that you don't have a regranting budget! I don't know which funder I would expect to do that, but I would've expected this to be more common.

I guess Jaan Tallinn does this, and Manifund does this. Hmm.

answer by stavros · 2023-09-19T11:35:38.872Z · LW(p) · GW(p)

Depending on the kind of support they're looking for, https://ceealar.org could be an option. At any one time there are a handful of people staying there working independently on AI Safety stuff.
