How to get into AI safety research

post by Stuart_Armstrong · 2022-05-18T18:05:06.526Z · LW · GW · 7 comments

Recently, I had a conversation with someone from a math background who asked how they could get into AI safety research. Based on my own path from mathematics to AI alignment, I recommended the following sources. They may prove useful to others contemplating a similar career change:

Your mileage may vary, but these are the sources I would recommend. And I encourage you to post any sources you'd recommend in the comments.

7 comments

Comments sorted by top scores.

comment by JanB (JanBrauner) · 2022-05-19T07:33:42.415Z · LW(p) · GW(p)

I guess I'd recommend the AGI safety fundamentals course: https://www.eacambridge.org/technical-alignment-curriculum

On Stuart's list: I think this list might be suitable for some types of conceptual alignment research. But you'd certainly want to read more ML for other types of alignment research.

comment by Gunnar_Zarncke · 2022-05-19T21:39:55.162Z · LW(p) · GW(p)

This is nice from a "what do I need to study" perspective, but it does help less with the "how do I pay the bills" perspective. Do you have pointers there too?

Replies from: conor-sullivan
comment by Lone Pine (conor-sullivan) · 2022-05-19T22:21:55.427Z · LW(p) · GW(p)

AI Safety Support

https://www.aisafetysupport.org/resources/career-coaching

Replies from: Gunnar_Zarncke
comment by Gunnar_Zarncke · 2022-05-19T22:59:09.559Z · LW(p) · GW(p)

Thank you! I have scheduled a call.

comment by Joel Burget (joel-burget) · 2022-06-03T21:10:55.475Z · LW(p) · GW(p)

Thank you for mentioning Gödel Without Too Many Tears, which I bought based on this recommendation. It's a lovely little book. I didn't expect it to be nearly so engrossing.

Replies from: Stuart_Armstrong
comment by Stuart_Armstrong · 2022-06-05T03:09:35.959Z · LW(p) · GW(p)

Glad you liked it :-)