Postdoctoral research positions at CSER (Cambridge, UK)

post by Sean_o_h · 2015-03-26T17:59:53.828Z · LW · GW · Legacy · 8 comments


[To be cross-posted at Effective Altruism Forum, FLI news page]

I'm delighted to announce that the Centre for the Study of Existential Risk has had considerable recent success in grantwriting and fundraising, among other activities (full update coming shortly). As a result, we are now in a position to advance to CSER's next stage of development: full research operations. Over the course of this year, we will be recruiting for a full team of postdoctoral researchers to work on a combination of general methodologies for extreme technological (and existential) risk analysis and mitigation, alongside specific technology/risk-specific projects.

Our first round of recruitment has just opened - we will be aiming to hire up to 4 postdoctoral researchers; details below. A second recruitment round will take place in the Autumn. We have a slightly unusual opportunity in that we get to cast our net reasonably wide. We have a number of planned research projects (listed below) that we hope to recruit for. However, we also have the flexibility to hire one or more postdoctoral researchers to work on additional projects relevant to CSER's aims. Information about CSER's aims and core research areas is available on our website. We request that, as part of the application process, potential postholders send us a research proposal of no more than 1500 words, explaining what their research skills could contribute to CSER. At this point in time, we are looking for people who will have obtained a doctorate in a relevant discipline by their start date.

We would also humbly ask that the LessWrong community aid us in spreading the word far and wide about these positions. There are many brilliant people working within the existential risk community. However, there are academic disciplines and communities that have had less exposure to existential risk as a research priority than others (due to founder effect and other factors), but where there may be people with very relevant skills and great insights. With new centres and new positions becoming available, we have a wonderful opportunity to grow the field, and to embed existential risk as a crucial consideration in all relevant fields and disciplines.

Thanks very much,

Seán Ó hÉigeartaigh (Executive Director, CSER)

 

"The Centre for the Study of Existential Risk (University of Cambridge, UK) is recruiting for up to four full-time postdoctoral research associates to work on the project Towards a Science of Extreme Technological Risk.

We are looking for outstanding and highly-committed researchers, interested in working as part of a growing research community, with research projects relevant to any aspect of the project. We invite applicants to explain their project to us, and to demonstrate their commitment to the study of extreme technological risks.

We have several shovel-ready projects for which we are looking for suitable postdoctoral researchers. These include: (i) Ethics of extreme technological risk; (ii) Horizon-scanning and foresight for extreme technological risk; (iii) Responsible innovation and extreme technological risk.

However, recruitment will not necessarily be limited to these subprojects, and our main selection criterion is suitability of candidates and their proposed research projects to CSER’s broad aims.

Details are available here. Closing date: April 24th."

8 comments


comment by leplen · 2015-03-27T00:52:38.814Z · LW(p) · GW(p)

Candidates should have a PhD in a relevant field

I'm really curious as to what constitutes a relevant field. The three people you list are an economist, a conservation biologist, and someone with a doctorate in geography. Presumably those are relevant fields, but I don't know what they have in common exactly.

I don't know what to think about this. You're new and you have sort of unconventional funding and a really broad mission statement. I'm not really sure what sort of research you're looking for or what journals it would be published in. I can't tell how much of this is science and how much of this is economics or political science and your institute is under the umbrella of the Arts and Humanities Research Center. What sorts of positions do you envision your post-doctoral fellows taking two years down the road?

This is definitely interesting, but I'm not sure that I have any actual idea who you're looking for and having read your website and downloaded the job listing and read the bios of the people involved, I'm still not really sure. I can't figure out whether this seems sort of vague and confusing because it isn't directed at me or because you're still sort of figuring out the shape of the group yourself.

comment by Sean_o_h · 2015-03-29T15:42:31.817Z · LW(p) · GW(p)

Leplen, thank you for your comments, and for taking the time to articulate a number of the challenges associated with interdisciplinary research – and in particular, setting up a new interdisciplinary research centre in a subfield (global catastrophic and existential risk) that is in itself quite young and still taking shape. While we don’t have definitive answers to everything you raise, they are things we are thinking a lot about, and seeking a lot of advice on. While there will be some trial and error, given the quality and pooled experience of the academics most involved I’m confident that things will work out well.

Firstly, re: your first post, a few words from our Academic Director and co-founder Huw Price (who doesn’t have a LW account).

“Thanks for your questions! What the three people mentioned have in common is that they are all interested in applying their expertise to the challenges of managing extreme risks arising from new technologies. That's CSER's goal, and we're looking for brilliant early-career researchers interested in working on these issues, with their own ideas about how their skills are relevant. We don't want to try to list all the possible fields these people might come from, because we know that some of you will have ideas we haven't thought of yet. The study of technological xrisk is a new interdisciplinary subfield, still taking shape. We're looking for brilliant and committed people, to help us design it.

We expect that the people we appoint will publish mainly in the journals in their home field, thus helping to raise awareness of these important issues within those fields – but there will also be opportunities for inter-field collaborations, so you may find yourself publishing in places you wouldn't have expected. We anticipate that most of our postdocs will go on to distinguished careers in their home fields, too, though hopefully in a way which maintains their links with the interdisciplinary xrisk community. We anticipate that there will also be some opportunities for more specialised career paths, as the field and funding expand."

A few words of my own to expand: As you and Ryan have discussed, we have a number of specific, quite well-defined subprojects that we have secured grant funding for (two more will be announced later on). But we are also in the lucky position of having some more unconstrained postdoctoral position funding – and now, as Huw says, seems like an opportune time to see what people, and ideas, are out there, and what we haven’t considered. Future calls are likely to be a lot more constrained – as the centre’s ongoing projects and goals get more locked in, and as we need to hire for very specific people to work on specific grants.

Some disciplines seem very obviously relevant to me – e.g. if the existential risk community is to do work on AI, synthetic biology, pandemic risk, geoengineering, it needs people with qualifications in CS/math, biology/informatics, epidemiology, climate modelling/physics. Disciplines relevant to risk modelling and assessment seem obvious, as does science & technology studies, philosophy of science, and policy/governance. In aiming to develop implementable strategies for safe technology development and x-risk reduction, economics, law and international relations seem like fields that might produce people with necessary insights. Some are a little less clear-cut: insights into horizon-scanning and foresight/technological prediction could come from a range of areas. And I’m sure there are disciplines we are simply missing. Obviously we can’t hire people with all of these backgrounds now (although, over the course of the centre we would aim to have all these disciplines pass through and make their mark). But we don’t necessarily need to; we have enough strong academic connections that we will usually be able to provide relevant advisors and collaborators to complement what we have ‘in house’. E.g. if a policy/law-background person seems like an excellent fit for biosecurity work or biotech policy/regulation, we would aim to make sure there’s both a senior person in policy/law to provide guidance, and collaborators in biology to make sure the science is there. And vice versa.

With all that said, from my time at FHI and CSER, a lot of the biggest progress and ideas have come from people whose backgrounds might not have immediately seemed obvious to x-risk, at least to me – cosmologists, philosophers, neuroscientists. We want to make sure we get the people, and the ideas, wherever they may be.

With regards to your second post:

You again raise good questions. For the people who don’t fall squarely into the ‘shovel-ready’ projects (although the majority of our hires this year will), I expect we will set up senior support structures on a case by case basis depending on what the project/person needs.

One model is co-supervision or supervisor+advisor. For one example, last year I worked with a CSER postdoctoral candidate on a grant proposal for a postdoc project that would have combined technical modelling/assessment of extreme risks from sulphate aerosol geoengineering with the broader socio/policy challenges the postdoc also wanted to explore. We felt we had the in-house expertise for the latter but not the former. We set up an arrangement whereby he would be advised by a climate specialist in this area, and spend a period of the postdoc with the specialist’s group in Germany. (The proposal was unfortunately unsuccessful with the granting body.)

As we expect AI to be a continuing focus, we’re developing good connections with AI specialist groups in academia and industry in Cambridge, and would similarly expect that a postdoc with a CS background might split their time between CSER’s interdisciplinary group and a technical group working in this area and interested in long-term safe/responsible AI development. The plan is to develop similar relations in bio and other key areas. If we feel like we’re really not set up to support someone as seems necessary and can’t figure out how to get around that, then yes, that may be a good reason not to proceed at a given time. That said, during my time at FHI, a lot of good research has been done without these kinds of setups – and incidentally I don’t think being at FHI has ever harmed anyone’s long-term career prospects - so they won’t always be necessary.

"And overly-broad job listings are par for the course, but before I personally would want to put together a 3 page project proposal or hunt down a 10 page writing sample relevant or even comprehensible to people outside of my field, I'd like to have some sense of whether anyone would even read them or whether they'd just be confused as to why I applied."

An offer: if you (or anyone else) have these kinds of concerns and wish to send me something short (say a 1/3-1/2 page proposal/info about yourself) before investing the effort in a full application, I’ll be happy to read it and say whether it’s worth applying (warning: it may take me until the weekend on any given week).

comment by leplen · 2015-03-29T16:59:00.337Z · LW(p) · GW(p)

Thanks so much for your thoughtful response. This clarifies the position dramatically and makes it sound much more attractive. If I have any further questions related to my application specifically, I'll certainly let you know.

comment by Sean_o_h · 2015-03-27T11:42:31.025Z · LW(p) · GW(p)

Placeholder: this is a good comment and good questions, which I will respond to by tomorrow or Sunday.

comment by RyanCarey · 2015-03-27T16:21:32.560Z · LW(p) · GW(p)

This should help a little:

Projects: (i) Ethics of extreme technological risk (working with Professor Partha Dasgupta) This subproject aims to examine the limitations of standard cost-benefit analysis (CBA) as a means of assessing the importance of mitigating extreme technological risk (ETR); to develop a version of CBA more suitable to this context; and to derive conclusions about the importance of mitigating ETR compared to other global priorities. Relevant disciplines include: Philosophy (especially moral philosophy, applied ethics, and formal decision theory) and Economics (e.g., the economics of sustainability, the theory of future discounting).

(ii) Horizon-scanning and foresight for extreme technological risk (working with Professor William Sutherland) Successful management of ETR is likely to require early detection. This subproject aims to optimise the horizon-scanning and foresight techniques available for this task, and to understand the similarities and differences between the case of ETR and other horizon-scanning applications. Relevant disciplines include: Zoology and Ecology, Conservation Studies, Science and Technology Studies, Psychology.

(iii) Responsible innovation and extreme technological risk (working with Dr Robert Doubleday and Professor Martin Rees) This subproject asks what can be done to encourage risk-awareness and societal responsibility, without discouraging innovation, within the communities developing future technologies with transformative potential. Relevant disciplines include: Science and Technology Studies, Geography, Philosophy of Science, plus relevant technological fields (e.g., AI, Virology, Synthetic biology).

Further information about the positions

comment by leplen · 2015-03-28T03:43:01.768Z · LW(p) · GW(p)

It's sort of not that useful though. This is a description of the "shovel-ready" projects, and those are actually pretty straightforward. If you fit into one of those categories, you'd basically be under a single person with a well-defined discipline, and you can get a pretty good sense of who you'd be working for by scanning a half-dozen paper abstracts if you're not already familiar with them. There's a decent chance you're actually funded directly out of the individual professor's research grant. It's pretty business as usual.

But being a post-doc for an interdisciplinary center can be a lot more confusing. If the center has someone who is an expert in your field then they're semi-qualified to supervise your work and they sort of become your boss by default. If there isn't an expert in your field, the standard academic mentor-apprentice model starts to break down and it's not always clear what will replace it. Sometimes you become predominantly a lackey/domain expert/flex researcher for existing projects. Sometimes the center recruits someone to mentor you. Sometimes you are expected to develop a novel focus for the group. And if the group has been around for a while you can estimate a lot of these answers just from publication history, but with something brand new it's much harder.

And this is a stupid hard problem to even describe. It isn't clear what department "All the things that might possibly go wrong that would make us all die" belongs in. On some level I understand why the "Specialist knowledge and skills" are super vague general things like "good level of scientific literacy" and "strong quantitative reasoning skills." And overly-broad job listings are par for the course, but before I personally would want to put together a 3 page project proposal or hunt down a 10 page writing sample relevant or even comprehensible to people outside of my field, I'd like to have some sense of whether anyone would even read them or whether they'd just be confused as to why I applied.

comment by RyanCarey · 2015-03-28T14:52:56.335Z · LW(p) · GW(p)

Hi Leplen,

I'm only assisting on CSER on a casual basis but here are some rough notes that should at least be helpful.

As you point out, the job description is general because the enterprise is interdisciplinary and there are a lot of ways that people could contribute to it. Projects apart from those specified would be significantly designed to match the available personnel and their expertise. If someone wanted to contribute on a specific technology, such as nanotech (which you've previously written about on this forum), and had a credible background relevant to that risk, then we wouldn't be left wondering why they applied. Still, I agree that we should make future job postings more specific, and expect that we will do this.

In relation to who would be available to supervise applicants in areas other than those advertised, it can be helpful to look at CSER's Cambridge-based advisory board. In policy, for example, there is not only Robert Doubleday from the Centre for Science and Policy but also others who are advising, so this would obviously be a strong area. Another example is that Huw Price, who is a founder, is significantly interested in the application of decision theory to AI safety, and so opportunities may arise in that area over time.

It doesn't seem immediately likely that domain experts would simply be passed around existing projects, because CSER is actively interested in performing thorough and ongoing analysis of relevant risks, and in how to promote the safe development of relevant technologies.

If you have a question about whether CSER is interested in performing research in, and has the capability to supervise, a particular area of research, it is worth asking directly.

comment by Stuart_Armstrong · 2015-03-27T11:12:32.637Z · LW(p) · GW(p)

As a minor "argument from authority", I'd like to state that Sean has done really good work at the FHI before moving across to found CSER, and that CSER has the full support of the FHI as something that is worthwhile and doing important work. So if you trust the FHI's judgement in this area, then trust that CSER is a very positive development.