The Value of Those in Effective Altruism
post by Gleb_Tsipursky · 2016-02-17T00:59:34.180Z · LW · GW · Legacy · 14 comments
Summary/TL;DR: this piece offers Fermi Estimates of the value of those in EA, focusing on the distinctions between typical EA members and dedicated members (defined below). These estimates suggest that, compared to the current movement baseline, we should prioritize increasing the number of “typical” EA members and getting more non-EA people to behave like typical EA members, rather than getting typical EAs to become dedicated ones.
[Acknowledgments: Thanks to Tom Ash, Jon Behar, Ryan Carey, Denis Drescher, Michael Dickens, Stefan Schubert, Claire Zabel, Owen Cotton-Barratt, Ozzie Gooen, Linchuan Zheng, Chris Watkins, Julia Wise, Kyle Bogosian, Max Chapnick, Kaj Sotaja, Taryn East, Kathy Forth, Scott Weathers, Hunter Glenn, Alfredo Parra, William Kiely, Jay Quigley, and others who prefer to remain anonymous for looking at various draft versions of this post. Thanks to their feedback, the post underwent heavy revisions. Any remaining oversights, as well as all opinions expressed, are my responsibility.]
This article is a follow-up to "Celebrating All Who Are In Effective Altruism"
Introduction
There has been some discussion recently of whether the EA movement is excessively or insufficiently oriented toward getting typical EA members to become dedicated ones. The crux of this discussion, from a mechanistic movement building perspective, is whether, compared to the current baseline:
1) The EA movement should put more effort into attracting as many value-aligned people to the EA movement as possible and keeping them in the movement, as well as getting non-EAs to behave more like typical EAs, or
2) The EA movement should put more effort into getting all of those in the EA movement to become as engaged as possible, at the expense of some people becoming disengaged due to this pressure and others not wanting to join because of the perception of high demands.
Which one will contribute most to global flourishing?
Semantics
A sub-branch of this discussion has focused on the terminology to use in describing those more or less engaged in the EA movement. My position is that the level of engagement is not binary, but lies on a spectrum, as noted here. For the sake of clarity, and without claiming these are the optimal terms to use broadly, this piece will use “typical” for those closer to the casual end of the engagement spectrum, and “dedicated” for those closer to the highly engaged pole.
Fermi Estimates
While there’s no reliable hard data available, we can do a Fermi estimate to start putting some approximate numbers on the actual resources, of time and money, each of these EA cohorts contributes. I will use numbers expressed publicly by others to avoid weighting the numbers with my own perspective.
One highly-upvoted comment stated: “I expect 80%+ of EAs and rising to be 'softcore' for the foreseeable future.” This correlates with the 90/9/1% generalization, and makes prima facie sense based on the distribution of EA group organizers versus participants, etc. So let’s say that anywhere from 10 to 20 out of 100 EAs are closer to the dedicated pole, and that this will continue to be the case as the movement grows.
Another highly-upvoted comment suggests that “a fully committed altruist usually accomplishes about as much as three to six people who do little beyond pledging.” This might be a more controversial claim than the previous one.
First, the comment itself was strongly advocating for focusing more on getting more dedicated EA members rather than typical ones, and thus might have exaggerated the impact of dedicated EA participants. In fact, the comment was in response to a post I made and went against the sentiments I expressed in that post, which is one reason I chose these numbers: to avoid going with numbers that matched my own intuitions.
Nonetheless, I think that the 3-6 times impact of dedicated EA members may not be an exaggeration, for several reasons:
1) Those closer to the dedicated pole would likely provide more resources, of time and money, to advance global flourishing than those closer to the typical end of the spectrum.
2) Those more dedicated would be more exposed to and informed about the complex issues related to EA, such as the challenges of cause prioritization and of evaluating QALYs for systemic interventions and existential risk, and can contribute more to creating the memetic and organizational infrastructure for the most effective EA movement.
3) On a related note, dedicated EA participants would also likely search harder for the most impactful and cost-effective places to give rather than just go with GiveWell’s top picks, multiplying the impact of their giving by supporting “weird” charities and meta-charities, etc.
4) Those more dedicated would channel their resources of time and skills more effectively into advancing global flourishing than typical EAs, ranging from volunteer work to choosing and switching careers based on a desire to optimize global flourishing and fill talent gaps.
The attention of some readers might be drawn to EA notables such as Peter Singer, William MacAskill, Tom Ash, Jon Behar, Ryan Carey, Brian Tomasik, Kerry Vaughn, Tyler Alterman, Julia Wise, Owen Cotton-Barratt, Ozzie Gooen, and others, including frequent EA Forum participants, when evaluating dedicated EA members. Indeed, they do more good, much more good, than 3-6 times the good done by typical EA participants, through a combination of convincing many more people to do EA-aligned activities and building the infrastructure of the EA movement. Yet we should remember that such notables are atypical and do not represent the vast majority of dedicated EA members, and their contributions fold into the overall 3-6X contributions of dedicated people. Separately, one reader of the draft version suggested we should come up with an additional term for such EA notables, such as “rock stars,” who do more than 100X as much good as a typical EA participant, and I will leave that for readers to discuss in the comments.
I hope this shows why I think the 3-6 times impact of dedicated EA members is plausible, and I plan to use it in this analysis. Using the handy Guesstimate app created by Ozzie Gooen, here is a link to a model that shows the results of this comparison. You can also see a screenshot of the model below.
Now, these are Fermi estimates, and I invite you to use this model to put in your own estimates and see what you come up with. After all, the upvoting of the comments above does not indicate that these numbers are correct in any objective sense. For instance, you might believe that the numbers for the impact of dedicated EA members are in fact exaggerated, despite the defence I provided above, or understated. It would be helpful if whatever numbers you choose come from a source outside of yourself to minimize letting personal intuitions weigh the numbers in favor of your preferred position, but use your own judgment, of course.
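For readers who would rather tinker with a plain script than with the Guesstimate interface, here is a minimal sketch in Python of the same cohort comparison under the estimates above (10-20% of EAs dedicated, each dedicated member contributing 3-6 times as much as a typical one); the function and scenario names are my own illustrative choices, not part of the original model.

```python
# Fermi sketch of the cohort comparison. The dedicated share (10-20%) and
# the dedicated-member multiplier (3-6x a typical member) come from the
# estimates above; the scenario labels and units are illustrative.
def cohort_totals(frac_dedicated, dedicated_multiplier, population=100):
    """Return (typical_total, dedicated_total) in 'typical-member units'."""
    n_dedicated = population * frac_dedicated
    n_typical = population - n_dedicated
    return n_typical * 1.0, n_dedicated * dedicated_multiplier

scenarios = {
    "low share, low multiplier": (0.10, 3.0),
    "midpoint": (0.15, 4.5),
    "high share, high multiplier": (0.20, 6.0),
}

for name, (frac, mult) in scenarios.items():
    typical_total, dedicated_total = cohort_totals(frac, mult)
    print(f"{name}: typical cohort {typical_total:.0f}, "
          f"dedicated cohort {dedicated_total:.1f}, "
          f"ratio {typical_total / dedicated_total:.2f}")
# Midpoint case: 85 vs. 67.5 units, so typical members as a cohort
# contribute roughly 25% more in total (see the Implications section).
```

Note that at the midpoint the typical cohort comes out ahead, while at the high end of both ranges the dedicated cohort does - another reason to plug in your own numbers.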
We should also compare the resource contributions of typical and dedicated EA participants to those of an ordinary member of the general public who is not value-aligned with the EA movement, to get a better grasp of the value EA members provide toward improving the world. Let’s say a typical member of the general public contributes 3.5% of her/his resources, of time and money, to charitable causes. By comparison, let’s say a typical EA member contributes around 10% of her/his resources, in various combinations of time and money, to charity. Being generous, we can estimate that the resources provided by non-EAs are ~100 times less impactful than those of EA participants, due to the higher effectiveness of EA-endorsed charities.
Let’s compare the impact of such giving. Here is a link to a model that does so, and here is a screenshot of the model.
As you can see, the impact of a typical EA participant is ~28500% more than that of an ordinary member of the public, and the impact of a dedicated one is ~450% more than that of a typical one (the likelihood of dedicated EA participants selecting better charities is included in the 3-6x greater impact). You’re welcome to plug in your own numbers to get your own Fermi estimates, of course.
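As a cross-check on those percentages, here is a small sketch of the per-person arithmetic; the constants are the estimates used above (3.5% and 10% resource shares, a ~100x effectiveness multiplier for EA-endorsed charities, and a 3-6x dedication multiplier), while the variable names are mine.

```python
# Per-person Fermi arithmetic from the estimates above: a non-EA gives
# ~3.5% of resources at baseline effectiveness; a typical EA gives ~10%
# of resources to charities assumed ~100x as effective; a dedicated EA
# accomplishes 3-6x as much as a typical EA.
NON_EA_UTILONS = 3.5                 # ~3.5% of resources, baseline effectiveness
TYPICAL_EA_UTILONS = 10.0 * 100      # ~10% of resources * ~100x effectiveness
DEDICATED_MULTIPLIER_RANGE = (3.0, 6.0)

dedicated_low = TYPICAL_EA_UTILONS * DEDICATED_MULTIPLIER_RANGE[0]
dedicated_high = TYPICAL_EA_UTILONS * DEDICATED_MULTIPLIER_RANGE[1]
dedicated_mid = TYPICAL_EA_UTILONS * 4.5

print(f"non-EA:       ~{NON_EA_UTILONS} utilons")
print(f"typical EA:   ~{TYPICAL_EA_UTILONS:.0f} utilons "
      f"(~{TYPICAL_EA_UTILONS / NON_EA_UTILONS:.0f}x a non-EA)")
print(f"dedicated EA: ~{dedicated_low:.0f}-{dedicated_high:.0f} utilons "
      f"(midpoint ~{dedicated_mid:.0f}, 4.5x a typical EA)")
# These multiples are what the post rounds to ~28500% and ~450%.
```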
Implications
1) Getting a non-EA to behave like a typical EA member yields an increase in global flourishing per individual from ~3.5 to ~1000 utilons. This is a huge increase, one that is worthwhile to think about in percentage terms to address scope insensitivity - ~28500%. Losing a typical EA member, or not gaining one in the first place, reduces global flourishing by decreasing ~1000 utilons to ~3.5, so a ~28500% reduction per person lost.
2) Moving a typical EA to a dedicated EA results in a ~450% increase in global flourishing, from ~1000 to ~4500, and a dedicated EA becoming a typical EA results in a ~450% decrease in global flourishing, from ~4500 to ~1000.
3) Since the number of dedicated EA members is capped by the number of typical ones, in order to get more dedicated ones we have to get substantially more typical ones - roughly 4-9 typical EA members for each dedicated one, given the 10-20% estimate above.
4) Typical EA members as a cohort provide substantially more total value - ~25% more - in resources of time and money contributed to EA efforts to advance global flourishing than dedicated EA members do.
4A) This Fermi estimate suggests that the EA movement does not function according to the Pareto principle, with 80% of the output produced by 20% of the input. For the sake of epistemic honesty and for the purpose of movement-building, it would be beneficial to publicly acknowledge and recognize the contributions of typical EA movement members.
4B) Additionally, for the sake of epistemic honesty, it’s important to acknowledge that other ways of calculating the impact of typical and dedicated EA participants might result in different estimates, depending on what you optimize for. For example, if you personally optimize for reducing existential risk or animal suffering, it might be that dedicated EA members spend more of their resources on those areas of EA activism as opposed to the more mainstream focus of addressing global poverty.
Discussion
What other numbers would be useful?
While the numbers on the benefits of focusing on getting non-EAs to behave more like typical EA members are prima facie convincing, we need to acknowledge that there are salient numbers we don’t have, and that are currently too vague even for a Fermi estimate.
For example, it would be great to know how many resources it takes to get a non-EA to behave like an EA member. This is an area I have some knowledge about. Let’s take exposure to an EA-themed article as an example. I published this article on The Huffington Post, which was shared widely on social media. As you'll see from this Facebook comment on my personal page, it helped convince someone to donate to effective charities. Furthermore, this comment is from the leader of a large secular group in Houston, and he thus has an impact on a number of other people. Since people rarely leave actual comments, and far from all are fans of my Facebook page, we can estimate that many more made similar decisions but did not comment about it.
Another piece of evidence is that people who clicked over to the GiveDirectly website from this article I wrote donated $500, according to internal GiveDirectly stats. It is highly likely that people who clicked through and immediately donated as a result of reading the article had just found out about GiveDirectly and made a test donation, as is typical for initial nonprofit donations - their lifetime donations will be worth much more to GiveDirectly. Likewise, the vast majority of people who found out about GiveDirectly from the article will take a while to donate, as they research the topic and consider their donations. After all, an immediate donation from an article is pretty rare; usually people proceed much more slowly through the 4 steps of the nonprofit sales funnel and take time to consider and evaluate the nonprofit before donating.
Now, each of these articles took about 15 hours of my own labor to write and edit, about 10 hours from other EA participants collaborating with me on edits, and about 5 hours per article to place and promote.
Yet these numbers represent my labor in particular, and I specialize in promoting EA-themed effective giving to broad audiences. We need more numbers and data to get a better estimate of the average impact of such articles, and of how much utility comes from people other than me working in this area. We also need numbers on the multitude of other activities that can get ordinary people to behave like EA members, such as Giving Games. I would invite readers who have more familiarity with these areas of activity to provide their thoughts in the comments, along with their own models in the Guesstimate app.
Another set of numbers that we don’t have, and that it would be great to have, concerns how many resources it takes to get a typical EA participant to become a dedicated one. In doing so, we also need to estimate how much risk there is of causing a typical EA member to leave the movement due to the perception of high demands, and calculate that downside.
This is not an area I specialize in, and I hope some readers who do will leave their thoughts in comments, and also consider creating Guesstimate models of their own.
What are specific steps we can take to advance getting non-EAs to behave more like typical EAs?
1) One way of doing so is promoting EA to those who are value-aligned. We should be wary of promoting the EA movement to those who are not value-aligned, due to the downside of flooding the EA movement with non-value-aligned people.
2) Another way is getting non-EAs to behave more like typical EAs without bringing non-value-aligned people into the EA movement. Promoting EA-themed effective giving ideas is one way of getting ordinary people to behave more like EAs without the downside of flooding the EA movement with non-value-aligned members. For instance, if the ~3.5% of resources that non-EAs give to improving the world can be redirected to EA causes, this would result in a ~100x increase in impact - ~350 utilons instead of ~3.5. This is a ~10000% increase in EA-aligned efforts to advance global flourishing, regardless of whether or not the non-EA becomes value-aligned and joins the EA movement. Moreover, it is likely that if people are persuaded to act more like EA members, they will shift their values, grow more value-aligned, and eventually be ready to join the EA movement.
2A) Promoting effective giving includes publishing articles for a broad audience. Such articles have the promise of hitting a lot of people at once, but their impact is varied.
2B) Promoting effective giving to social elites who have lots of money to give. This can include a variety of strategies, from promoting effective giving to niche well-off audiences, to selectively promoting pre-existing elite-giving strategies that are aligned with EA. An example of the latter might be to write articles promoting well-known funders that engage in EA-like giving - the Gates Foundation, Good Ventures, or whichever major philanthropist is currently in the news - while also promoting EA concepts such as data-driven giving, solving the drowning child problem, etc., choosing only examples that promote EA-like giving.
2B1) Getting elites to change their giving strategies takes more effort than getting non-elites to do so, since there are many competing for their attention and wealth, but we currently don’t have sufficient evidence to compare the trade-offs of focusing on either group. I suspect in any case we should pursue promoting EA-themed effective giving both to elites and non-elites.
2C) Providing materials and resources to local EA groups to enable them to promote effective giving on a local level, as well as teaching them how to do so. One project relevant to the former might be to create a marketing resource bank for all EAs to use, with materials to promote effective giving to broad audiences, and the EA movement to value-aligned people. One point relevant to the latter might be to hold workathons training EA participants on marketing and promotion.
2D) Promoting effective giving to gatekeepers and influencers who would then promote effective giving to their own audience due to mutually compatible incentives. For example, a number of organizations with affiliates who don’t have an inherent interest in EA might want to run Giving Games for other reasons, such as because they believe their affiliate members would benefit from them.
Are the negative consequences of high expectations and pressure that bad?
Now, some people might doubt that we lose EA members due to the pressure of high expectations and burnout. Yet there are many people who leave the EA movement because of the perception that they are only really welcome if they do as much as they can to contribute to EA causes.
Doing so can be exhausting and lead to burnout, as it did for me. While I did not choose to leave, many do leave the movement because of burnout. I spoke to many about this after I started sharing my story publicly a while ago.
Others leave because their circumstances change. I spoke to people who were donating 50% of their income, and then their circumstances changed – job loss, moving, other transitions – and they could not afford to do so anymore. Rather than suffer what they perceived as the stigma of not being good enough anymore, they disengaged from the movement. Others were contributing a huge amount of time to the movement during their college years, but then graduated, moved, and lost the community support that kept their activism going and gave them a strong sense of purpose. Because of survivorship bias, most of those in the movement don’t see them as they participate less and less and fade quietly into the background, resulting in huge losses of money and time/skills for the movement.
Others choose not to join in the first place because of the high expectations, even those who are otherwise value-aligned. For instance, Taryn describes how she is value-aligned with EA and does EA-themed activities. Yet she is reluctant to identify with the EA movement due to the "general unspoken feeling of 'you're not doing enough unless you meet our high expectations,'" as expressed in her Facebook comments here. Or take the comments of Kaj here (who permitted me to cite him): “Datapoint - I too have felt unsure whether I'm doing enough to justifiably call myself EA. (I have both worked for and donated to MIRI, ran a birthday fundraiser for EA causes, organized an introductory EA event where I was the main speaker, and organized a few EA meetups. But my regular donations are pretty tiny and I'm not sure of how much impact the-stuff-that-I've-done-so-far will have in the end, so I still have occasional emotional doubts about claiming the label.)”
How many of you think Kaj should not identify with the EA movement? However, the only role models in the EA movement right now are those who are highly involved and committed. There are no real steps taken to acknowledge and celebrate typical EA members for the benefits they bring to the movement. Some easy steps to celebrate typical EA participants, and to ease off the pressure on them to do whatever they can, will likely result in a significant overall gain for the EA movement.
What are steps we can take to address these problems?
1) We can encourage the publication of articles that give typical EA members the recognition they deserve for doing so much for the movement - in aggregate, more than dedicated ones, according to the calculations above.
2) We can invite typical EA members to speak about their experiences at the 2016 EA Global.
3) We can publish interviews with typical EAs.
4) We can feature a few typical EA members on the redesigned version of effectivealtruism.org.
5) We can pay more attention to burnout and self-care than we currently do, and highlight the importance for those trying to change the world to orient toward the long term, thinking of their civic engagement as a marathon and not a sprint.
6) We can also celebrate the people who make small steps that carry them up the spectrum of engagement: those who go from 1% to 5%, and from 5% to 10%, of their income in monetary donations; those who take the TLYCS or GWWC pledge; those who increase their level of volunteering from being EA group members to EA group organizers, who start to give EA talks or host Giving Games, who start to write blog posts about their EA engagement, who start to volunteer for EA meta-charities or effective direct-action charities. The key is to encourage and praise multiple small steps and multiple paths - for instance, for people who don't want to give talks but are okay with writing blog posts, and vice versa - so that people don't get discouraged by thinking we only support a person if they choose a particular activity. It's also good to have ways of acknowledging people who come up with new ways of advancing global flourishing. The underlying logic here is to pay attention to people’s emotions, social signaling, and group dynamics, and then provide appropriate rewards and positive reinforcement for higher engagement, without making people feel excluded if they do not choose to engage more, or need to drop out and re-engage later.
7) On a related note, we can publicly promote the idea that people will shift their commitment levels of time and money to the movement as their circumstances change, and that’s ok! Heidi Overbeek is one example of a person who did so. People like Heidi should be welcomed to commit as much as they are willing and able, and not asked to keep committing at their previous level despite their change in circumstances. It’s especially easy for us dedicated EA participants to make the mistake of demanding that people stick to prior commitments due to the human brain’s vulnerability to loss aversion, but knowing about this tendency, we can avoid it.
Now, I’m not suggesting we should make most speakers typical EAs, or write most articles about or conduct most interviews with them. Overall, my take is that it’s appropriate to celebrate individual EA members in proportion to their labors, and as the numbers above show, dedicated EA participants individually contribute quite a bit more than typical ones. Yet we as a movement need to go against the current norm of not acknowledging typical EA members: give them the recognition they deserve for contributing, in aggregate, more than dedicated EA participants, and avoid making excessive demands on them that will cause them to leave the movement. These are just some specific steps that would help us achieve this goal.
Conclusion
For the sake of global flourishing, in comparison to the current baseline of the EA movement, it’s more valuable to:
1) Focus on attracting value-aligned people and retaining typical EA members, and decreasing the emphasis on transforming typical EA members into dedicated ones. This would both increase global flourishing by ensuring high numbers of typical EA members, and also provide the baseline population needed to get more dedicated EA participants.
2) Focus on getting members of the general public to behave more like typical EA members by promoting effective giving broadly, thus changing people’s giving behaviors and channeling their existing resources into EA-aligned causes. Over time, for some people this would result in changing values to make them EA-aligned and ready to join the movement.
(Cross-posted on the EA Forum)
14 comments
Comments sorted by top scores.
comment by Gunnar_Zarncke · 2016-02-17T19:13:22.580Z · LW(p) · GW(p)
I'd like to see the calculation updated with a significantly wider range of effectiveness for non-EAs (it need not be the Gaussian distribution of Guesstimate) - maybe even one that includes a realistic percentage with EA-level effectiveness as a subset at the upper end of the distribution (because there are likely people thinking and acting like EAs without being part of the movement, or without ever having heard of it).
comment by Gleb_Tsipursky · 2016-02-17T23:25:13.367Z · LW(p) · GW(p)
Agreed that non-EAs have a wide range of effectiveness for improving the world, so 3.5 utilons is a rough guesstimate. However, we have to keep in mind that many non-EAs have negative effectiveness, for example people who produce cigarettes. So it would have to be a complex calculation. We can certainly try to get at this number, but it would take a lot of work to account for all factors appropriately.
comment by Gunnar_Zarncke · 2016-02-18T08:43:17.649Z · LW(p) · GW(p)
I don't suggest accounting for 'all factors appropriately', but rather not modeling non-EAs as 'close to zero'. Why not be honest and model them as zero on average? That would net you literally infinitely better effectiveness from converting non-EAs - which suggests that there is something wrong with the calculation. The difficulty of converting people to EA depends on how EA-affine they are to begin with, and that has to be taken into account somehow.
comment by Gleb_Tsipursky · 2016-02-18T18:58:05.872Z · LW(p) · GW(p)
I think on average non-EA people are making the world slightly better, guided by various incentive structures - from common sense, to empathy, to efficient markets. But on average people are not committed to making the world as good as it can get through their actions. I think this intentionality on the part of EA participants, their willingness to devote sizable resources to this area, and their willingness to update based on evidence justifies the huge multiple for how much better EAs make the world compared to non-EA people.
However, this is only on average. I certainly would think that some non-EA people have as much of a positive impact as EA participants, if they happen to do things that are EA-aligned, such as support GiveDirectly, MIRI, etc. Or they could be helping the world in other ways, such as pushing for limiting nuclear risk, preventing pandemic risk, etc.
comment by Vaniver · 2016-02-19T02:53:22.579Z · LW(p) · GW(p)
FYI: I edited a summary break into this post, right before "Introduction."
comment by Gleb_Tsipursky · 2016-02-19T03:20:13.526Z · LW(p) · GW(p)
Sounds fine.
comment by EngineerofScience · 2016-05-04T00:16:38.864Z · LW(p) · GW(p)
What is a Fermi Estimate? If you could provide a link to an article talking about that I would be thankful.
comment by Gleb_Tsipursky · 2016-05-04T23:14:12.426Z · LW(p) · GW(p)
comment by JohnC2015_duplicate0.34499772964045405 · 2016-04-18T03:59:17.174Z · LW(p) · GW(p)
I am new to EA, and with this article, together with the input I am getting from Gleb, everything seems clearer.
For now, I can personally say that I have been transformed from a typical EA member (just someone who listens to and reads EA-related articles) into a dedicated EA member (active and participative).
Thank you all, guys! :)
comment by dvasya · 2016-02-18T00:07:23.281Z · LW(p) · GW(p)
Losing a typical EA ... decreasing ~1000 utilons to ~3.5, so a ~28500% reduction per person lost.
You seem to be exaggerating a bit here: that's a 99.65% reduction. Hope it's the only inaccuracy in your estimates!
comment by Gleb_Tsipursky · 2016-02-18T02:18:08.259Z · LW(p) · GW(p)
As the comment below indicates, I think we don't disagree on the math; it's a semantics issue. When I talk about the reduction per person lost, I compare the utilons from a typical person to those of a typical EA, which is a 996.5-utilon difference. So 996.5/3.5 * 100% = 28471%.
comment by Sarginlove · 2016-04-19T06:35:20.606Z · LW(p) · GW(p)
A lovely piece. With this idea, I can know how a typical EA member should behave, and also how to encourage others to become good, typical EA members - and not just encourage them, but help them remain so. A nice write-up which can help improve giving behavior and change other people's giving behavior, which can have an impact on society. It's really a good thing.
comment by avwenceslao · 2016-04-19T03:56:01.527Z · LW(p) · GW(p)
I am truly grateful to the author of this post who has shared this great piece of writing at this time.