Apply to the Constellation Visiting Researcher Program and Astra Fellowship, in Berkeley this Winter

post by Nate Thomas (nate-thomas) · 2023-10-26T03:07:34.118Z · LW · GW · 10 comments

This is a link post for two AI safety programs we’ve just opened applications for: https://www.constellation.org/programs/astra-fellowship and https://www.constellation.org/programs/researcher-program

Constellation is a research center dedicated to safely navigating the development of transformative AI. We’ve previously helped run the ML for Alignment Bootcamp (MLAB) series [EA · GW] and Redwood’s month-long research program on model internals (REMIX) in addition to a variety of other field-building programs & events.[1]

This winter, we are running two programs aimed at growing and supporting the ecosystem of people working on AI safety: the Astra Fellowship and the Visiting Researcher Program. 

Applications for both are due November 10, 11:59pm anywhere on Earth. You can apply to the Astra Fellowship here and the Visiting Researcher Program here. If you are unsure about your fit, please err on the side of applying. We especially encourage women and underrepresented minorities to apply. You can refer others who you think might be a good fit through this form.

Logistics: Housing and travel expenses are covered for both programs, and Astra fellows will receive an additional monetary stipend. The start and end dates for both programs are flexible. 

Questions? Email programs@constellation.org or ask them below. 

  1. ^

    Over 15 participants from these past programs are now working on AI safety at Anthropic, ARC Evals, ARC Theory, Google DeepMind, OpenAI, Open Philanthropy, and Redwood Research.

10 comments

Comments sorted by top scores.

comment by Jay Bailey · 2023-10-31T01:30:17.989Z · LW(p) · GW(p)

For the Astra Fellowship, what considerations do you think people should be thinking about when deciding to apply for SERI MATS, Astra Fellowship, or both? Why would someone prefer one over the other, given they're both happening at similar times?

Replies from: Alexandra Bates, ryankidd44
comment by Alexandra Bates · 2023-11-01T01:39:33.588Z · LW(p) · GW(p)

Good question! In my opinion, the main differences between the programs are the advisors and the location. Astra and MATS share a few advisors, but most are different. Additionally, Astra will take place at the Constellation office, and fellows will have opportunities to talk with researchers who regularly work from Constellation.

comment by Ryan Kidd (ryankidd44) · 2023-11-08T00:10:58.900Z · LW(p) · GW(p)

MATS has the following features that might be worth considering:

  1. Empowerment: Emphasis on empowering scholars to develop as future "research leads" (think accelerated PhD-style program rather than a traditional internship), including research strategy workshops, significant opportunities for scholar project ownership (though the extent of this varies between mentors), and a 4-month extension program;
  2. Diversity: Emphasis on a broad portfolio of AI safety research agendas and perspectives with a large, diverse cohort (50-60) and comprehensive seminar program;
  3. Support: Dedicated and experienced scholar support + research coach/manager staff and infrastructure;
  4. Network: Large and supportive alumni network that regularly sparks research collaborations and AI safety start-ups (e.g., Apollo, Leap Labs, Timaeus, Cadenza, CAIP);
  5. Experience: Has run successful research cohorts of 30, 58, and 60 scholars, plus three extension programs with about half as many participants.
Replies from: ryankidd44
comment by Ryan Kidd (ryankidd44) · 2023-11-08T01:14:51.374Z · LW(p) · GW(p)

Buck Shlegeris, Ethan Perez, Evan Hubinger, and Owain Evans are mentoring in both programs. The links show their MATS projects, "personal fit" for applicants, and (where applicable) applicant selection questions, designed to mimic the research experience.

Astra seems like an obviously better choice for applicants principally interested in:

  • AI governance: MATS has no AI governance mentors in the Winter 2023-24 Program, whereas Astra has Daniel Kokotajlo, Richard Ngo, and associated staff at ARC Evals and Open Phil;
  • Worldview investigations: Astra has Ajeya Cotra, Tom Davidson, and Lukas Finnveden, whereas MATS has no Open Phil mentors;
  • ARC Evals: While both programs feature mentors working on evals, only Astra is working with ARC Evals;
  • AI ethics: Astra is working with Rob Long.
comment by Chipmonk · 2023-10-26T09:21:56.141Z · LW(p) · GW(p)

Will more advisors be added later? The subareas covered by the advisors aren't as broad as I expected.

Replies from: Alexandra Bates
comment by Alexandra Bates · 2023-10-27T01:46:42.320Z · LW(p) · GW(p)

We might add a few more advisors over the next few weeks, and there are a few advisors who chose not to be listed on the website. Are there specific subareas you'd like to see more of? 

Replies from: mesaoptimizer
comment by mesaoptimizer · 2023-10-27T19:35:02.658Z · LW(p) · GW(p)

Specific assumptions seem to have shaped the selection of advisors listed for the Astra Fellowship. I may be wrong, but it seems to me that the majority of researchers listed work on interpretability or evals-and-demonstrations, or have models of the alignment problem (or research taste and agendas) that are strongly Paul-Christiano-like.

I assume Chipmonk was gesturing at the absence of advisors whose work isn't downstream of Paul Christiano's models and research agenda. Agent foundations (John Wentworth, Scott Garrabrant, Abram Demski) and formal world-models (Davidad) are two examples that come to mind.

Note that I don't entirely share this belief (I notice that there are advisors who seem to be interested in s-risk-focused research), but I get the sentiment. Also, as far as I can tell, there are very few researchers like the ones I listed, and they may not be in a position to be an advisor for this program.

Replies from: Chipmonk
comment by Chipmonk · 2023-10-30T18:31:24.851Z · LW(p) · GW(p)

Yes this. And more agent foundations, especially. Thanks mesa

comment by Neel Nanda (neel-nanda-1) · 2023-10-26T09:16:46.721Z · LW(p) · GW(p)

Note that the Astra Fellowship link in italics at the top goes to the researcher program, not the Astra Fellowship.

Replies from: nate-thomas
comment by Nate Thomas (nate-thomas) · 2023-10-26T14:50:00.131Z · LW(p) · GW(p)

Thanks, Neel! It should be fixed now.