One week left for CSER researcher applications

post by RyanCarey · 2015-04-17T00:40:00.901Z · LW · GW · Legacy · 5 comments

This is the last week to apply for one of four postdoctoral research positions at the Centre for the Study of Existential Risk. We are seeking researchers in disciplines including economics, science and technology studies, science policy, arms control policy, expert elicitation and aggregation, conservation studies, and philosophy.

The application requires a research proposal of no more than 1500 words from an individual with a relevant doctorate.

"We are looking for outstanding and highly-committed researchers, interested in working as part of a growing research community, with research projects relevant to any aspect of the project. We invite applicants to explain their project to us, and to demonstrate their commitment to the study of extreme technological risks.

We have several shovel-ready projects for which we are looking for suitable postdoctoral researchers. These include:

1. Ethics and evaluation of extreme technological risk (ETR) (with Sir Partha Dasgupta);

2. Horizon-scanning and foresight for extreme technological risks (with Professor William Sutherland);

3. Responsible innovation and extreme technological risk (with Dr Robert Doubleday and the Centre for Science and Policy).

However, recruitment will not necessarily be limited to these subprojects, and our main selection criterion is the suitability of candidates and their proposed research projects to CSER’s broad aims."

More details are available here. Applications close on April 24th.

- Sean OH and Ryan

5 comments

Comments sorted by top scores.

comment by John_Maxwell (John_Maxwell_IV) · 2015-06-11T14:16:55.285Z · LW(p) · GW(p)

I was interested to read Nick Beckstead write that x-risk reduction jobs are "very competitive". Do you guys want to share how pleased you were about the set of applicants you received for these jobs? And what strategies worked best for advertising them? (Interesting because: I'm curious whether x-risk reduction is more capital or talent-limited, and also how well the x-risk reduction movement is communicating internally.)

Replies from: Sean_o_h, AlexMennen
comment by Sean_o_h · 2015-06-15T10:09:50.301Z · LW(p) · GW(p)

A few comments. I was working with Nick when he wrote that, and I fully endorsed it as advice at the time. Since then, the x-risk funding situation, and the number of locations at which you can do good work, have improved dramatically. It would be worth checking with him how he feels now. My view is that jobs are certainly still competitive, though.

In that piece he wrote "I find the idea of doing technical research in AI or synthetic biology while thinking about x-risk/GCR promising." I also strongly endorse this line of thinking. My view is that in addition to centres specifically doing Xrisk, having people who are Xrisk-motivated working in all the standard mainstream fields that are relevant to Xrisk would be a big win. Not just AI or synthetic biology (although obviously directly valuable here) - I'd include areas like governance, international relations, science & technology studies, and so on. There will come a point (in my view) when having these concerns diffusing across a range of fields and geographic locations will be more important than increasing the size of dedicated thought bubbles at e.g. Oxford.

"Do you guys want to share how pleased you were about the set of applicants you received for these jobs?" I can't say too much about this, because hires are not yet finalised, but yes, pleased. The hires we made are stellar. There were a number of people not hired whom at most other times I would have thought to be excellent, but for various reasons the panel didn't think they were right at this time. You will understand if I can't say more about this (and my very sincere apologies to everyone I can't give individual feedback to; I'm carrying a very heavy workload at the moment with minimal support).

That said, I wouldn't be willing to stand up and say x-risk reduction is not talent-limited, as I don't think there's enough data for that. Our field of applicants was large, and top talent was deep enough on this occasion, but it could have been deeper. Both CSER and FHI have more hires coming up, so that will deplete the talent pool further.

Another consideration: I do feel that many of the most brilliant people the x-risk field needs are out there already, finishing their PhDs in relevant areas but not currently part of the field. I think organisations like ours need to make concerted efforts to reach out to these people.

Recruitment strategies: reaching out through our advisors' networks; standard academic job boards; emails to the top 10-20 departments in the most relevant fields; getting in touch with members of different x-risk organisations and asking them to spread the word through their networks; posting online in various x-risk/EA-related places. I also got in touch with a wide range of the smaller, more specific centres (and authors) producing the best work outside of the x-risk community - e.g. in risk, foresight, horizon-scanning, security, international relations, DURC, STS and so on - and asked them for recommendations and to distribute the listing among their networks. And I iterated a few times through the contacts I made this way. E.g. I got in touch with Tetlock and others on expertise elicitation and aggregation, who put me in touch with people at the Good Judgment Project and others, who put me in touch with other centres. Eventually I got some very good applicants in this space, including one from Australia's Centre of Excellence for Biosecurity Risk Analysis, a centre I hadn't previously heard of, whose director I was put in touch with through this method.

This was all very labour-intensive, and I expect I won't have time to recruit so heavily in future. But I hope that going forward we will have a bigger academic footprint. I also had tremendous help from a number of people in the x-risk community, including Ryan Carey, Seth Baum, and FHI folks, to whom I'm very grateful. Also, a huge thanks to Scott Alexander for plugging our positions on his excellent blog!

I think our top 10 were pretty evenly split between the "x-risk community", "standard academic job boards/university department emails", and "outreach to more specific non-x-risk networks". I think all our hires are new introductions to existential risk, which is encouraging.

Re: communicating internally, I think we're doing pretty well. E.g. on recruitment, I've been communicating closely with FHI, as they also have positions to fill at present and coming up, and I will recommend that some excellent people who applied to us apply to them. (Note that this isn't always just about quality - we have both had excellent applicants who weren't quite a fit at one organisation at this time but would be a top prospect at the other, going in both directions.)

More generally, internal communication within x-risk has been good in my view. Project managers and researchers at FHI, MIRI, and other orgs make a point of regular meetings with the other organisations, and this has made up a decent chunk of my time too over the past couple of years and has been very important. However, I'm likely to have to cut back personally for a couple of years due to an increasing Cambridge-internal workload (early days of a new, unusual centre in an old, traditional university). I expect our researchers will play an important role in communicating between centres, however.

One further apology: I don't expect to have much time to comment or post on LW going forward, so I apologise that I won't always be able to reply to questions like this. But I'm very grateful for all the useful support, advice, and research input I've received from LW members over the years.

Replies from: John_Maxwell_IV
comment by John_Maxwell (John_Maxwell_IV) · 2015-06-15T11:36:36.598Z · LW(p) · GW(p)

Sounds like you're doing a lot; thanks so much!

comment by AlexMennen · 2015-06-11T18:15:21.924Z · LW(p) · GW(p)

... to read Nick Beckstead write that ...

Wrong link.

Replies from: John_Maxwell_IV
comment by John_Maxwell (John_Maxwell_IV) · 2015-06-11T18:17:26.781Z · LW(p) · GW(p)

Thanks, fixed.