[AMA] Announcing Open Phil’s University Group Organizer and Century Fellowships [x-post]
post by abergal, ClaireZabel · 2022-08-06T21:48:02.134Z
This is a link post for https://forum.effectivealtruism.org/posts/txmwtQehGZDYk4reN/ama-announcing-open-phil-s-university-group-organizer-and
[Crossposted from the EA Forum. Particularly relevant to LessWrong: I’m interested in funding more rationality groups at universities.]
Open Philanthropy recently launched two new fellowships intended to provide funding for university group organizers: the University Group Organizer Fellowship (apply here), and the Century Fellowship (apply here). This post is intended to give some more color on the two programs, and our reasoning behind them. In the post, we cover:
- What these programs are and are not
- Current thoughts on who we should be funding
- What we’d like to achieve with these programs
- Some of our concerns about this work
We’d also like this post to function as an AMA for our university student-focused work. We’ll be answering questions August 4 - August 6, and will try to get through most of the highest-voted questions, though we may not get to everything. We welcome questions at any level, including questions about our future plans, criticisms of our funding strategy, logistical questions about the programs, etc.
If you’re a university group organizer or potential university group organizer and want to talk to us, I (Asya) will also be hosting virtual office hours August 5th and 6th (edit: added some time August 7th) – sign up to talk to me for 10 minutes here.
[Note about this post: The first two sections of this post were written by Asya Bergal, and the latter two were written by Claire Zabel; confusingly, both use the first-person pronouns “I” and “my”. We indicate in the post when the author switches.]
What these programs are and are not
[The following sections of this post were written by Asya Bergal.]
The University Group Organizer Fellowship
The University Group Organizer Fellowship provides funding for part-time and full-time organizers of student groups focused on effective altruism, longtermism, rationality, or other relevant topics at any university, covering both organizer compensation and group expenses.
Unlike CEA’s previous program in this space:
- We’re interested in funding a much wider set of student groups that we think are on the path to making sure the long-term future is as good as possible, e.g. groups about longtermism, existential risk reduction, rationality, AI safety, forecasting, etc.
- For reasons discussed below, we think it’s possible that groups not focused on effective altruism (particularly those focused directly on risks from transformative technology) could also be highly effective in terms of getting promising people to consider careers working on the long-term future, by appealing more to people with a slightly different set of interests and motivations. As far as we know, there are relatively few efforts of this kind in the student group space right now, so there’s particular information value in experimenting. (And we’re excited about some new efforts in this space, e.g. the Harvard AI Safety Team.)
- We also continue to be interested in funding effective altruism groups, including those with a primarily neartermist focus.
- We’re interested in funding groups at any college or university, including those outside the US or UK.[1]
- We’re not providing mentorship, retreats, or other kinds of hands-on support for groups as part of these programs (at least for now).
More on the last point: while we ourselves are not providing this kind of support, we have referred, and expect to continue referring, groups we fund to others who do, including CEA’s University Group Accelerator Program, the Global Challenges Project, EA Cambridge, and other one-off programs and retreats.
Our current aim is to provide a smooth funding experience for strong group organizers who want to do this work, while mitigating potential negative impact from organizers who we don’t think are a good fit. We don’t plan or intend to “cover” the groups space, and actively encourage others in the space to consider projects supporting university groups, including projects that involve running groups, or providing more hands-on support and funding themselves. That being said, while we think there is room for many actors, we also think the space is sensitive, and it’s easy to do harm with this kind of work by giving bad advice to organizers, using organizer time poorly, or supporting organizers who we think will have a negative impact. We expect to have a high bar for supporting these projects, and to encourage the relevant teams to start by proving themselves on small scales.
There are a number of projects that we think could be valuable but that have no clear “owner” in the student groups space right now, including:
- Creating high-quality resources for non-EA groups
- Running residency programs that provide in-person guidance to new group organizers at the beginning of the year
- Running events for group organizers
- Running events where promising new group members recommended by group organizers can meet professionals
- Finding individuals who could become strong organizers of new groups
Individuals looking to start projects supporting university groups can apply for funding from our team at Open Phil through our general application form.
The Century Fellowship
The Century Fellowship is a selective 2-year program that gives resources and support (including $100K+/year in funding) to particularly promising people early in their careers who want to work to improve the long-term future. We hope to use it (in part) to support exceptionally strong full-time group organizers, and, more broadly, to make community building a more compelling career path (see below).
Current thoughts on who we should be funding
These criteria are the ones that I think are most important in a group organizer (and consider most strongly when making funding decisions):
- Being truth-seeking and open-minded
- Having a strong understanding of whatever topic their group is about, and/or being self-aware about gaps in understanding
- Being socially skilled enough that people won’t find them highly off-putting (note that this is a much lower bar than being actively friendly, extroverted, etc.)
Secondary “nice-to-have” desiderata include:
- Taking ideas seriously
- Being conscientious
- Being ambitious / entrepreneurial
- Being friendly / outgoing
- Having good strategic judgment in what activities their group should be doing
- Actively coming off as sharp in conversation, such that others find them fun to have object-level discussions with
Notably (and I think I may feel more strongly about this than others in the space), I’m generally less excited about organizers who are ambitious or entrepreneurial but less truth-seeking, or who have a weak understanding of the content that their group covers. In fact, I think organizers like this can be riskier than less entrepreneurial organizers, as they have the potential to misrepresent important ideas to larger numbers of people, putting off promising individuals or negatively affecting community culture in disproportionate ways.
Overall, I think student organizers have an outsized effect on the community and its culture, both by engaging particular individuals and, more broadly, by acting as representatives on college campuses, and we should accordingly have high standards for them. I similarly encourage student leaders to have high standards for the core members of their groups. I think groups will generally do better by my lights if they aim to have a small core membership with lots of high-quality object-level discussions, rather than focusing most of their attention on actively trying to attract more people. (I think it’s still worth spending substantial time doing outreach, especially at the beginning of the year.)
All of the above being said, my current policy is to be somewhat laxer on the “truth-seeking” and “strong understanding” criteria for a given organizer when:
- There are one or more other core organizers who do well on these criteria, and those organizers are excited about having this organizer on board;
- The organizer is working in a primarily operational, non-student-facing capacity for their group; or
- The group is located in an area that is geographically and culturally remote (e.g. is in a country or region with little or no activity aimed at improving the far future, is physically distant from any EA hub, has few English speakers). I think it might make sense to be laxer here because:
- Relevant ideas are less likely to have spread in those areas, so the upside of just making people aware of them is higher;
- It’s less likely that a better organizer would come along in the next few years; and
- Since these organizers are in areas that are culturally distant, they have a weaker effect on community culture.
So far, most of the organizers who have applied to us have met the conditions above — we’ve offered funding to 47 out of the 78 organizers who we’ve evaluated directly as part of the University Group Organizer Fellowship program.
What we’d like to achieve with these programs
[The following sections of this post were written by Claire Zabel.]
Experimenting with other kinds of groups
In addition to growing and expanding EA groups, we’re excited to see student groups experiment with other longtermist-relevant formats, such as AI safety reading groups or groups on existential risk or applied rationality. I think there are a couple reasons this is promising:
- I think most people working on longtermist projects got interested in doing so via EA-ish philosophical reasoning, and have poured a lot of time, effort, and money into building out that pathway (from being interested in this kind of reasoning to working on a longtermist priority project); that chain of reasoning probably seems particularly salient and compelling to them/us, and they/we are well-equipped to reiterate it to others.
- However, I think that if most of us had been presented afresh with a convincing empirical narrative about the risks from potentially imminent transformative AI (or other longtermist concerns), without having “gone through” EA first, I wouldn’t expect us to independently converge on the view that this is the best recruiting strategy; in fact, I think it’d seem pretty niche and overly conjunctive, and it’s more likely that most people would just focus on the specific cause and try to raise awareness about it among relevant groups.
- People might be very interested in reducing existential risks for reasons other than the sort of consequentialist-leaning philosophical reasoning that has often underlain interest in EA-longtermism (e.g. see here and here for arguments for this approach), and we want to support and encourage folks with those motivations (and styles, beliefs about comparative advantage, etc.) to join longtermist priority projects.
- Or, people might sign on to the philosophical reasoning, but be more interested in exploring particular cause areas or areas of skill development, and feel they wouldn’t get much value from more general EA groups.
- E.g. I think a large fraction of people (though certainly not all) already feel that human extinction, or an entity that doesn’t care about the values and preferences of humans and other sentient life on Earth gaining total, irreversible power, would obviously be extremely bad, and that if either were plausibly going to happen in the next hundred years or so, it would be a good and noble thing to try to prevent, no fancy philosophy needed. (This is supported by the fact that a huge number of fantasy and sci-fi books and movies centrally involve attempts to prevent these outcomes.)
- Analogously, though EA has been active in non-longtermist cause areas like farm animal welfare and global health in recent years, those cause areas have historically drawn in many people who are not hardcore EAs and who add a ton of value in the space.
- Our sense is that groups are generally strongest when the organizers are particularly knowledgeable about and interested in the subject matter. Organizer interests vary, so we hope that support for more varied kinds of groups will allow a larger number of very strong groups led by passionate and highly-engaged organizers to emerge.
- I think effective altruism has strong associations with cost-effectiveness analyses and with helping the global poor (or with earning to give). But these associations don’t seem like obviously the best ones for longtermist priority projects.
- We’re funding this from Open Philanthropy’s longtermist budget (funding to support projects motivated by the longtermist view). Given the abundance of funding in the space right now, the potentially short timelines on which it must be spent to have the desired impact, and the limited number of people motivated to work on longtermist priority projects, we think cost-effectiveness isn’t always the most useful framing, though we nonetheless do attempt cost-effectiveness analyses of common or large uses of funds.
- Other kinds of groups, with different associations, might be less confusing and avoid some optics concerns. For example, I think it intuitively makes a lot more sense to people that an AI safety-focused group (compared to an EA group) would pay organizers well, since AI safety is associated with the high-paying tech sector and doesn’t make an implicit claim about being the best use of money.
Making community-building a highly compelling career path
I want to make community-building a highly compelling career path, commensurate with the impact I think it’s had historically.
- Evidence from some research we’ve done suggests that a fairly large fraction of people working on the projects we prioritize most highly attribute a lot of credit to a group they were in while in college/university for being on their current path. By the metric I think is best-suited to this kind of question, our survey respondents allocate to EA groups 6–7% of the total credit they give to all EA/EA-adjacent sources (meta orgs, pieces of content, etc.) for being on the paths they’re on, and about ⅔ to ¾ of that credit goes to student groups specifically (i.e. roughly 4–5% of total credit).
- The work that group organizers do is often very demanding, both intellectually and emotionally. They’re often pushed to, at a young age, master a variety of challenging topics well enough to skillfully lead conversations about them with very bright young people, mentor peers from a variety of backgrounds and with a variety of different interests and personalities, manage other peers, organize large and complex events, and deal with tricky interpersonal issues. They are asked to do all of this without formal full-time managers or mentors, often while balancing other important responsibilities and life priorities. Strong organizers are very talented people, often with a variety of other opportunities to do projects that might offer more prestige and job security. We have high expectations for these organizers, and want to compensate them fairly for that, as well as support them and their groups to try different kinds of projects and events.
- We think that, in addition to compensation, providing longer-term support will draw more people who would be good fits to this path, and encourage people to stay in the space when it’s a good fit for them.
With the Century Fellowship especially, we hope to support particularly promising people (organizers and others) to flexibly explore and build towards different ambitious longtermist projects, with the security of longer-term support for themselves and collaborators.
Some of our concerns about this work
Downsides to funding
I worry (and I think others have expressed concerns along these lines too) about potential downsides of funding in the student group space. There are at least a few different potential issues:
- Attracting unaligned or less-aligned people
- Funding packages, especially more generous ones, increase the risk of attracting people who are less aligned with our goals. (We expect the funding we offer organizers to be somewhat higher than it has been historically, especially for the very most promising organizers, though not vastly so.) We’re happy to support anyone doing work that, upon close scrutiny, seems helpful for longtermist projects; we’re funding work, not feelings, so it’s theoretically completely okay if their motivations are mercenary in nature. But practically, having people be emotionally bought into the same goals correlates with long-term good outcomes.
- Group-organizing projects often have relatively difficult-to-measure outputs, and we have limited capacity for vetting to ensure that high-EV work is being attempted. (In other roles, where people are better positioned to measure outputs, I think it’s less important to think about how aligned our goals are with our grantees’.) Also, people who are more closely aligned are more likely to stay focused on their work (they are unlikely to leave abruptly if they get a more lucrative offer elsewhere, or to start subtly using the group for another purpose), and they contribute to a community with more trust and camaraderie from shared goals.
- Despite these risks, we’ve also heard a lot of anecdotes of promising people being put off or demotivated by being offered relatively low pay; in some cases, they pursued different paths that seem less valuable to us.
- We hope that our increase in capacity will allow us to maintain a fairly low “false positive” rate (accidentally funding organizers who are pretending to share our goals).
- The funding not being worth it, or being better spent elsewhere
- Our research suggests that many of the people we think are doing promising longtermist work credit a relevant student group at university as one of the top influences that led them to their current path, and that there’s substantial variation in how successful groups have been (even among the top schools).
- We currently value the work these people are doing very highly. It’s challenging to share our internal cost-effectiveness estimates, which include sensitive information both about our assessments of the value of particular kinds of work and about how we expect funding to be distributed between cause areas, in addition to being quite rough and potentially difficult to understand.
- But, per the above, we do see a lot of variation between schools, suggesting that moderate differences in the quality of the organizers can greatly affect how many promising people say the group helped them. I’m pretty confident that if, e.g., changing the hourly rate for an organizer’s time from $20/hr to $35/hr leads to a 5 percentage point higher chance of a very strong group rather than a median-quality group (which might have less than half as many strong members), that will be cost-effective from our perspective, barring the other risks mentioned in this section (see the rough worked sketch after this list). We aren’t sure we will achieve that level of impact, but we think it’s an experiment worth trying for a few years.
- Negative optics/PR concerns
- Even if this is a cost-effective use of funding, it might be “bad optics”, i.e. it might appear like a frivolous use of money to others.
- This seems worth thinking through carefully, but in general, we prefer to try to share our reasoning for unconventional choices we make (though this is often challenging, given capacity constraints and the difficulty of communicating work we’ve done internally), rather than avoiding decisions that otherwise seem good out of fear of social censure. We think that so far this strategy has been fairly effective, especially among the people whose opinions seem especially important for our goals.
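To make the $20/hr vs. $35/hr reasoning above concrete, here is a minimal back-of-envelope sketch. Only the rate change, the 5-percentage-point probability shift, and the “less than half as many strong members” ratio come from the text; the hours-per-year and members-per-group figures below are purely illustrative assumptions, not Open Phil’s internal estimates.

```python
# Rough back-of-envelope sketch of the pay-raise example above.
# The $20 -> $35/hr change, the 5-percentage-point probability shift, and the
# "less than half as many strong members" ratio come from the post; the hours
# worked per year and the members-per-group counts are illustrative
# assumptions, not Open Phil figures.

hours_per_year = 300      # assumption: ~10 hrs/week over a ~30-week academic year
extra_cost = (35 - 20) * hours_per_year      # added cost of the raise: $4,500/year

p_shift = 0.05            # 5 pp higher chance of a very strong group
members_very_strong = 10  # assumption: strong members produced by a very strong group
members_median = 5        # "less than half as many" strong members in a median group

# Expected extra strong members per group-year attributable to the raise
extra_members = p_shift * (members_very_strong - members_median)   # 0.25

cost_per_extra_member = extra_cost / extra_members                 # $18,000
print(f"~${cost_per_extra_member:,.0f} per additional strong member in expectation")
```

Under these made-up inputs, the raise works out to roughly $18,000 per additional strong member in expectation; whether that clears the bar depends on how much a counterfactual strong member is worth, which is exactly the kind of internal estimate described above as difficult to share.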
Providing insufficient support
- On a different tack, here’s another way we could do harm: I think when an organization moves into a new space, there will almost always be some initial mistakes and confusions, including some pretty costly ones. We’ve recently increased our capacity a lot, but we’re still capacity-constrained, which I think heightens this risk, and we have less designated capacity for this right now than CEA did previously. In general, and especially in the beginning, I worry about us not making decisions and providing support as quickly and as well as I want us to, and about this causing some important projects to proceed more slowly and without access to helpful resources.
- I don’t think I have a great “response” to this concern, other than having confidence in my team and our ability to catch up over time, or to empower others who can.
- Also, we are very glad about, and grateful for, other actors in this space, such as CEA and the Global Challenges Project. In the short term, we are mostly focusing on providing monetary support; we hope other groups provide other kinds of help. Generally, we think it’s actively good for there to be multiple strong organizations working in important spaces; it leads to greater robustness and diversity of perspectives, as well as some healthy competitive pressure.
We could be wrong
[Added by Asya:] We have our own views on who should and shouldn’t be funded to do this work, but those views could of course be mistaken. I don’t think it’s implausible that we come to believe we’ve made a mistake in either direction, e.g.:
- We realize our fears about putting off good people and negatively affecting the community are overblown (e.g., maybe interacting with weaker representatives of the community doesn’t have much of a negative effect on people’s likelihood of getting involved later compared to the counterfactual; maybe people who we would be less excited to have around bounce off later in the pipeline reliably enough that it doesn’t matter as much who student groups initially attract), and from a hits-based perspective we should have been funding more organizers without worrying about these downsides.
- We realize our bar for funding organizers has been too low, and we’ve made it slower or less likely for the most promising people to get involved, made the community less motivating and useful to be a part of for the most impactful people, or substantially damaged the trust network that historically made it easier for people to coordinate and make progress quickly.
Overall, we’re excited to support strong university groups, and to be able to offer more and more help to group organizers in the future! Thanks for bearing with us, and please share your questions and thoughts.