Survey for alignment researchers!
post by Cameron Berg (cameron-berg), Judd Rosenblatt (judd), AE Studio (AEStudio) · 2024-02-02T20:41:44.323Z
UPDATE 3/9: Thanks to broad participation from the community, this and associated surveys have raised approximately $10,000 for high-impact alignment organizations. Given the reasonable sample size we now have, we are pausing donations for any subsequent responses. (However, we will preserve the charity voting question, and if anyone wants to sponsor donations for surveys taken after March 9th, please ping us at alignment@ae.studio.)
AE Studio is launching a short, anonymous survey for alignment researchers to develop a stronger model of various field-level dynamics in alignment.
This appears to be an interestingly neglected research direction that we believe will yield specific, actionable insights into the community's technical views and broader characteristics.
The survey is a straightforward 10-15 minute Google Form with simple multiple-choice questions.
For every alignment researcher who completes the survey, we will donate $40 to a high-impact AI safety organization of your choosing (see specific options on the survey). We will also send each alignment researcher who wants one a customized report that compares their personal results to those of the field.
Together, we hope not only to raise money for some great AI safety organizations, but also to develop a better field-level model of the ideas and people that make up alignment research.
We will open-source all data and analyses when we publish the results. Thanks in advance for participating and for sharing this around with other alignment researchers!
Survey full link: https://forms.gle/d2fJhWfierRYvzam8
11 comments
comment by Linda Linsefors · 2024-02-09T13:17:51.008Z
I timed how long it took me to fill in the survey: 30 minutes. I could probably have done it in 15 if I had skipped the optional text questions. This is to be expected, however; every time I've seen someone guess how long it will take to respond to their survey, the estimate has been off by a factor of 2-5.
↑ comment by Cameron Berg (cameron-berg) · 2024-02-09T14:08:25.208Z
Thanks for taking the survey! Our time estimate didn't include the optional open-ended questions, since we figured that respondents time-constrained enough to care about the estimate wouldn't spend the additional time writing in responses.
In general, the survey does seem to take respondents approximately 10-20 minutes to complete. As noted in another comment below,
this still works out to donating $120-240 per researcher-hour to high-impact alignment orgs (plus whatever value there is in comparing one's individual results to those of the community), which hopefully is worth the time investment :)
comment by Esben Kran (esben-kran) · 2024-02-07T01:34:12.294Z
This seems like a great effort. We made a small survey called the pain points in AI safety survey back in 2022, which received quite a few answers; you can see the final results here. Beware that this has not been updated in ~2 years.
↑ comment by Cameron Berg (cameron-berg) · 2024-02-07T14:10:08.062Z
Thanks for sharing this! We'll definitely take a look at it in the context of what we find and see whether we're capturing any similar sentiment.
comment by Kajus · 2024-02-23T09:04:29.738Z
What do you mean by an alignment researcher? Is somebody who did AI Safety Fundamentals an alignment researcher? Somebody participating in MATS, AISC, or SPAR? Somebody who has never posted anything on LW?
↑ comment by Cameron Berg (cameron-berg) · 2024-02-23T18:10:21.871Z
There will be places on the form to indicate exactly this sort of information :) We'd encourage anyone who is associated with alignment to take the survey.
comment by Michael Tontchev (michael-tontchev-1) · 2024-02-08T00:43:55.439Z
When do you expect to publish results?
↑ comment by Cameron Berg (cameron-berg) · 2024-02-08T15:25:50.243Z
Ideally within the next month or so. We still have a few other control populations to sample, as well as all of the analysis to do.
comment by Vivek Hebbar (Vivek) · 2024-02-07T02:07:35.249Z
Note: The survey took me 20 mins (but also note selection effects on leaving this comment)
↑ comment by Cameron Berg (cameron-berg-1) · 2024-02-07T14:35:58.778Z
Definitely good to know that it might take a bit longer than we had estimated from earlier respondents (with the well-taken selection effect caveat).
Note that if it takes 10-20 minutes to fill out, this still works out to donating $120-240 per researcher-hour to high-impact alignment orgs (plus whatever value there is in comparing one's individual results to those of the community), which hopefully is worth the time investment :)
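(A quick check of that arithmetic, using the $40-per-response donation described in the post:

\[
\frac{\$40}{20/60\ \text{hr}} = \$120/\text{hr}, \qquad \frac{\$40}{10/60\ \text{hr}} = \$240/\text{hr}.
\]

So the $120-240 range corresponds to the 20-minute and 10-minute completion times respectively.)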