Self-studying to develop an inside-view model of AI alignment; co-studiers welcome!

post by Vael Gates · 2021-11-30


tl;dr: It's hard for me to develop inside views of AI alignment, but I feel like I'm approximately ready for it now. So I'm developing a curriculum for myself, and I'd welcome people who want to join me in creating their own curricula and discussing progress regularly!

I’m one of those people who find developing an "inside view" hard.

In any case, after many years of orienting around this, I now feel like I'm approximately ready to develop an inside view of AI alignment. I've consumed a lot of AI content by this point and feel like it's about time, so I'm psyched that my psyche finally feels like it's in the right place for this.

So, because I'm one of those people who like courses and structure, I'm developing a curriculum for myself, and I'd welcome anyone who wants to join me in creating their own and discussing progress regularly!

Vael's curriculum (optimized for me, by me)

STRUCTURE

CONTENT


If you're excited about doing something similar with me, send me a message here or email me! We'll see how this experiment goes.
