“Be A Superdonor!”: Promoting Effective Altruism by Appealing to the Heart

post by Gleb_Tsipursky · 2015-11-09T18:20:56.586Z · LW · GW · Legacy · 82 comments

(Cross-posted on The Life You Can Save blog, the Intentional Insights blog, and the Effective Altruism Forum).


This will be mainly of interest to Effective Altruists.



Effective Altruism does a terrific job of appealing to the head. There is no finer example than GiveWell’s meticulously researched and carefully detailed reports laying out the impact per dollar of giving to various charities. As a movement, we are at the cutting edge of evaluating how effectively charities convert donations into QALYs, although of course much work remains to be done.


However, as seen in Tom Davidson’s recent piece, “EA’s Image Problem,” and my “Making Effective Altruism More Emotionally Appealing,” we currently do not do a very good job of appealing to the heart. We tend to forget Peter Singer’s famous line that Effective Altruism “combines both the heart and the head.” When we pitch the EA movement to non-EAs, we focus on the head, not the heart.


Now, I can really empathize with this head-focused perspective. I am much more analytically oriented than the baseline, and I find this to be the case for EAs in general. Yet if we want to expand the EA movement, we can't fall into the typical mind fallacy and assume that what convinced us will convince others who are less analytical and more emotionally oriented thinkers.


Otherwise, we leave huge sums of money on the table that could have gone to effective charities. For this reason, I and several others have started a nonprofit organization, Intentional Insights, dedicated to spreading rational thinking and effective altruism to a wide audience using effective marketing techniques. Exploring the field of EA organizations, I saw that The Life You Can Save already reaches out to a broad audience, through its Charity Impact Calculator and its Giving Games, and actively promotes these efforts.


I was excited when Jon Behar, the COO & Director of Philanthropy Education at TLYCS, reached out to me and suggested collaborating on promoting EA to a broad audience using contemporary marketing methods that appeal to the heart. In a way, this is not surprising, as Peter Singer’s drowning child problem is essentially an effort to appeal to people’s hearts in a classroom setting. Using marketing methods that aim to reach a broad audience is a natural evolution of this insight.


Jon and I problem-solved how to spread Effective Altruism effectively, and came up with a catchphrase that we thought would appeal well to people’s emotions: “Be a Superdonor!” This catchphrase conveys, in a short burst, crucial information about Effective Altruism: that one’s donations have the most powerful impact when given to the charities that produce the most QALYs per dollar.


More importantly, it appeals to the heart. Superdonor conveys a feeling of power: you can be super in your donations! Superdonor conveys an especially strong degree of generosity. Superdonor conveys a feeling of superiority, of being better than other donors. In other words, even if you donate less, you can still outdo other donors by donating more effectively. This appeals to the “Keeping Up With the Joneses” effect, a powerful force in guiding our spending.


Just as importantly, “Be a Superdonor!” is easily shareable on social media, where sharing provides social proof, a vital component of modern marketing. People get to show their pride and increase their social status by posting on Facebook or Twitter about how they are a Superdonor. This makes their friends curious about what it means to be a Superdonor, since it is an appealing and emotionally resonant phrase. Their friends check out the links, and get to find out about Effective Altruism. Of course, it is important that the link go to a very clear and emotionally exciting description of how one can become a Superdonor through donating.


Likewise, people should get credit for being a Superdonor by getting others to donate: sharing about it on social media, talking about it with friends, and bringing friends to their local EA groups. Thus, we get the power of social affiliation, a crucial aspect of motivation, working on behalf of Effective Altruism. A particularly effective strategy here might be to combine “Be A Superdonor” with Giving Games, both the in-person version that TLYCS runs now and perhaps a web app version that helps create a virtual community setting conducive to social affiliation.


Now, some EAs might be concerned that the EA movement would lose its focus on the head through these efforts. I think that is a valid concern, and we need to be aware of the dangers here. We still need to put energy into the excellent efforts of GiveWell and other effective charity evaluators. We still need to be concerned with existential risk, even if it does not present us in the best light to external audiences.


Therefore, as part of the Superdonor efforts, we should develop compassionate strategies to educate emotionally-oriented newcomers about the more esoteric aspects of Effective Altruism. For example, EA groups can assign specific members as mentors for newcomers, to help guide their intellectual and emotional development alike. At the same time, we need to accept that some of those emotionally-oriented thinkers will not be interested in going deeper.


This is quite fine, as long as we remember our goal: making the strongest impact on the world, optimizing QALYs by not leaving huge sums of money on the table. Consider the kind of benefit you can bring to the EA movement if you can channel the giving of emotionally-oriented thinkers toward effective charities. Moreover, think of the positive network effect of them getting their friends to donate to effective charities. Think of whether you could do more good per unit of effort by focusing more of your own volunteering and giving on EA outreach, in comparison to other EA-related activities. This is what inspired my own activities at Intentional Insights, and the recent shifts at TLYCS toward effective outreach.


What are your thoughts about reaching out to more emotionally-oriented thinkers using these and other modern marketing strategies? If you support doing so, what do you think you can do personally to promote Effective Altruism effectively? I would love to hear your thoughts in the comments below, and I am happy to talk to anyone who wants to engage with the Intentional Insights project: my email is gleb@intentionalinsights.org.



Comments sorted by top scores.

comment by OrphanWilde · 2015-11-09T21:20:01.424Z · LW(p) · GW(p)

Once you start trying to appeal to "emotionally-oriented" thinkers (what a euphemism), you start bringing them into your group. Once you bring them into your group, they start participating in creating your group policy. Once they start participating in creating your group policy - you stop being effective, because they don't care about effective, and they outnumber you.

Don't court the Iron Law of Oligarchy so directly. Keep your focus on your organization's purpose, rather than your organization. It will last slightly longer that way.

Replies from: John_Maxwell_IV, Viliam, Gleb_Tsipursky, MarsColony_in10years, bogus
comment by John_Maxwell (John_Maxwell_IV) · 2015-11-11T23:06:16.970Z · LW(p) · GW(p)

Do you mean the Iron Law of Bureaucracy?

...in any bureaucratic organization there will be two kinds of people: those who work to further the actual goals of the organization, and those who work for the organization itself. Examples in education would be teachers who work and sacrifice to teach children, vs. union representatives who work to protect any teacher including the most incompetent. The Iron Law states that in all cases, the second type of person will always gain control of the organization, and will always write the rules under which the organization functions.

Replies from: OrphanWilde
comment by OrphanWilde · 2015-11-11T23:13:32.630Z · LW(p) · GW(p)

So I did, yes. I suspect I've mixed those mentally more than once.

comment by Viliam · 2015-11-10T09:04:28.210Z · LW(p) · GW(p)

Once you bring them into your group, they start participating in creating your group policy.

What exactly is the "group" here, and how exactly will they "participate in the policy"? Are we going to put the emotionally oriented people into research positions at GiveWell? Or do you believe the risk is that at some moment they will say "fuck GiveWell, let's donate to the Cute Puppies Foundation instead"?

The latter seems like a real risk to me, the former doesn't.

Replies from: OrphanWilde, Gleb_Tsipursky
comment by OrphanWilde · 2015-11-10T15:23:07.096Z · LW(p) · GW(p)

The latter is a part of the risk.

But yes, the former is a part of the risk too.

The issue is that the "We" you reference is going to change. And it will be, step by step, a series of positive moves, all culminating in a collapse of everything you care about. First, to court the new, "emotionally oriented" members of EA, you start hiring better marketers. Executives give way to industry-proven fundraisers. At every step, you get more effective at your purpose - and at each step, your purpose changes slightly. Until Effective Altruism becomes yet another Effective Fundraiser - and then, yes, people are put into research positions based on their ability to improve fundraising, rather than their ability to research charitable efforts.

All organizations are doomed, and that part will happen regardless. It's just a matter of timing.

comment by Gleb_Tsipursky · 2015-11-10T23:59:38.564Z · LW(p) · GW(p)

Agreed that putting emotionally-oriented people into research positions would be a risk, but let's be honest, they won't want to go there.

Regarding the second point, the whole goal of the post I was making above is to appeal to people's emotions to cause them to care about effectiveness.

The more emotionally-oriented people will not be good at determining effectiveness. But if we can get them to care about effectiveness, not cute puppies, that's where we can make a huge difference in their spending decisions. They would be highly unlikely to become leaders within EA, but their donations can then be powerfully shaped by EA recommendations.

comment by Gleb_Tsipursky · 2015-11-10T23:55:24.424Z · LW(p) · GW(p)

I hear your concern about bringing more emotionally-oriented people into leadership positions. However, I am not at all convinced by the point of them not caring about being effective. The whole point I was making above is to appeal to people's emotions to cause them to care about effectiveness.

The more emotionally-oriented people will not be good at determining effectiveness. But if we can get them to care about effectiveness, not cute puppies, that's where we can make a huge difference in their spending decisions. They would be highly unlikely to become leaders within EA, but their donations can then be powerfully shaped by EA recommendations.

Replies from: OrphanWilde, ChristianKl
comment by OrphanWilde · 2015-11-11T16:03:09.980Z · LW(p) · GW(p)

They are highly likely to become leaders within EA, because your advertising is based on appealing to social status, and people who can be appealed to on that basis are better at social status games than you are. You assume your intelligence puts you at an advantage; you're, bluntly, horribly wrong. Charisma is not a dump stat.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-11T18:16:19.027Z · LW(p) · GW(p)

I agree that charisma is important, but within the EA movement in particular, you won't get far without intelligence. Intelligence is a necessary qualifier for leadership in EA, in other words.

I have a pretty strong confidence level on that one. I'm ready to make a $100 bet that if you ask EA people whether intelligence is a necessary qualifier for leadership in the EA movement, 9 out of 10 will say yes. Want to take me up on this?

Replies from: OrphanWilde, ChristianKl
comment by OrphanWilde · 2015-11-11T18:40:00.335Z · LW(p) · GW(p)

Within the current EA movement, or within the EA movement you propose to create by filling your ranks with people who don't share your culture or values?

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-12T01:05:32.032Z · LW(p) · GW(p)

Within the EA movement currently, but I disagree with the second presumption. So are you taking that bet?

Replies from: OrphanWilde
comment by OrphanWilde · 2015-11-12T12:38:22.538Z · LW(p) · GW(p)

I don't think you understand how bets work in staking out certainty, but $100 against $100 implies your certainty that you won't destroy the EA movement is ~50%.

Replies from: ChristianKl, Gleb_Tsipursky
comment by ChristianKl · 2015-11-12T13:03:08.889Z · LW(p) · GW(p)

The bet is not on the question of whether he destroys the EA movement but about whether people say intelligence is important.

Replies from: OrphanWilde
comment by OrphanWilde · 2015-11-12T13:36:02.761Z · LW(p) · GW(p)

And he'll find little disagreement from me over what the current group of EA members will say, which says nothing at all about what the group of EA members -after- a successful advertising campaign aimed at increasing membership substantially will say, and at which point the EA movement is, by my view, effectively destroyed, even if it doesn't know it yet.

comment by Gleb_Tsipursky · 2015-11-12T21:11:45.600Z · LW(p) · GW(p)

My certainty is against your claim of emotionally oriented people becoming EA leaders, not the destruction of the EA movement. Please avoid shifting goalposts :-) So taking the bet, or taking back your claim?

Replies from: OrphanWilde
comment by OrphanWilde · 2015-11-12T21:40:02.028Z · LW(p) · GW(p)

My unwillingness to accept a strawman in place of the positions I have actually stated does not constitute shifting goalposts. But that's irrelevant, compared to the larger mistake you're making, in trying to utilize this technique.

A lesson in Dark Arts: I am Nobody. I could delete this account right now, start over from scratch, and lose nothing but some karma points I don't care about. You, however, are a Somebody. Your account is linked to your identity. Anybody who cares to know who you are, can know who you are. Anybody who already knows who you are can find out what you say here.

As a Somebody, you have credibility. As a Nobody, I have none. So in a war of discrediting - you discredit me, I discredit you - I lose nothing. What do you lose?

Your identity gives you credibility. But it also gives you something to lose. My lack of identity means any credit I gain or lose here is totally meaningless. But it means I have nothing to lose. That means that our credibility disparity is precisely mirrored by a power disparity; one in your favor, the other in mine. But the credibility disparity lasts only until you let yourself be mired in credibility-destroying tactics.

You really shouldn't engage anybody, much less me, in a Dark Arts competition. Indeed, it's vaguely foolish of you to have admitted to the practice of Dark Arts in the first place.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-12T21:58:56.559Z · LW(p) · GW(p)

I agree that I have something significant to lose, as my account is tied to my public identity.

However, I do not share your belief that me having acknowledged engaging in Dark Arts is foolish. I am comfortable with being publicly identified as someone who is comfortable with using light forms of Dark Arts, stuff that Less Wrongers generally do not perceive as crossing into real Dark Arts, to promote rationality and Effective Altruism. In fact, I explored this question in a Less Wrong discussion post earlier. I want to be open and transparent, to help myself and Intentional Insights make good decisions and update beliefs.

comment by ChristianKl · 2015-11-12T13:16:30.711Z · LW(p) · GW(p)

I think you go wrong if you assume that "emotionally-oriented" people are automatically stupid.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-12T21:10:21.714Z · LW(p) · GW(p)

I agree that emotionally-oriented people are not automatically stupid, my point was about what EAs value. If an emotionally-oriented person happens to be also intelligent, then that has certain benefits for the EA movement, of course.

Replies from: ChristianKl
comment by ChristianKl · 2015-11-13T14:24:54.708Z · LW(p) · GW(p)

A person who cares about playing status games might be intelligent but still harmful to the EA movement.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-13T16:13:21.617Z · LW(p) · GW(p)

I have a strong probabilistic estimate that there are currently a substantial number of people in the EA movement who care about status games. I'm willing to take a bet on that.

comment by ChristianKl · 2015-11-11T12:19:35.082Z · LW(p) · GW(p)

When it comes to straight money donations their origin doesn't matter. When it comes to people who actually want to spend time being involved "in the community" their background matters.

Replies from: Gleb_Tsipursky, VoiceOfRa
comment by Gleb_Tsipursky · 2015-11-11T18:09:51.081Z · LW(p) · GW(p)

Yup, agreed.

comment by VoiceOfRa · 2015-11-19T05:13:00.817Z · LW(p) · GW(p)

Yes, they do. Straight money donations exert evolutionary pressure on charities.

comment by MarsColony_in10years · 2015-11-10T16:28:31.445Z · LW(p) · GW(p)

This is a good point. Perhaps an alternative target audience to "emotionally oriented donors" would be "Geeks". Currently, EA is heavily focused on the Nerd demographic. However, I don't see any major problems with branching out from scientists to science fans. There are plenty of people who would endorse and encourage effectiveness in charities, even if they suck at math. If EA became 99.9% non-math people, it would obviously be difficult to maintain a number-crunching focus on effectiveness. However, this seems unlikely, and compared to recruiting "emotionally-oriented" newbies there would be much less risk of losing our core values.

Maybe "Better Giving Through SCIENCE!" would make a better slogan than "Be A Superdonor"? I've only given this a few minutes of thought, so feel free to improve on or correct any of these ideas.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-10T23:57:28.934Z · LW(p) · GW(p)

I think orienting toward geeks would be good, but insufficient. The whole point I was making above is to appeal to people's emotions to cause them to care about effectiveness. The more emotionally-oriented people will not be good at determining effectiveness. But if we can get them to care about effectiveness, not cute puppies, that's where we can make a huge difference in their spending decisions. They would be highly unlikely to become leaders within EA, but their donations can then be powerfully shaped by EA recommendations.

comment by bogus · 2015-11-10T10:44:21.333Z · LW(p) · GW(p)

I'm not sure that there are many people who are exclusively "emotionally-oriented" - or "analytically-oriented", for that matter. Rather, the idea is that by appealing to both the "head" and the "heart" we can convey a fuller message about EA, and that this will amplify our reach among people who otherwise might not know about it or take it seriously.

Replies from: Lumifer, OrphanWilde, Gleb_Tsipursky
comment by Lumifer · 2015-11-10T16:04:01.561Z · LW(p) · GW(p)

I'm not sure that there are many people who are exclusively "emotionally-oriented" - or "analytically-oriented", for that matter.

Hint: "emotionally oriented" is a code word for "stupid and easily led".

Replies from: bogus, Gleb_Tsipursky
comment by bogus · 2015-11-11T01:28:57.144Z · LW(p) · GW(p)

Sure, but that's a spectrum too. I don't know of many people who are so vulnerable to the "dark arts" that the "head" would play no role in their decisions. EA will always appeal to the most analytical, that's a given - but if you want to broaden your reach you need to make the effort.

comment by Gleb_Tsipursky · 2015-11-11T00:07:01.625Z · LW(p) · GW(p)

I was being a little excessive in the post by using that term. I wouldn't necessarily call them stupid, just not well educated and savvy. If we can shape them in the right direction, and get them to care about effectiveness, it would be a huge boon to the EA movement and put a lot of money into effective charities. Thus, we can be agentive about meeting our goals.

Replies from: Lumifer
comment by Lumifer · 2015-11-11T16:38:27.656Z · LW(p) · GW(p)

If we can shape them in the right direction, and get them to care about effectiveness, it would be a huge boon to the EA movement and put a lot of money into effective charities.

As you said in the OP, "we leave huge sums of money on the table". Shaping people so that you could get at their money easier is what marketing scum does.

Of course, that shouldn't bother dedicated consequentialists, should it? :-/

Replies from: None, Gleb_Tsipursky
comment by [deleted] · 2015-11-11T20:41:32.670Z · LW(p) · GW(p)

As you said in the OP, "we leave huge sums of money on the table". Shaping people so that you could get at their money easier is what marketing scum does.

I think that's a bit of an extreme way to put it... people who are emotionally driven see emotional appeals as the proper way to convince people. "You think too much" and "sometimes you just have to go with your gut" are inherently appealing things to them - regardless of intelligence level.

Essentially, they WANT emotional appeals like this one - I saw several emotionally driven (and smart) friends who shared this commercial and basically said (I'm translating now) "It's nice to see an emotional appeal that actually has a good message/purpose".

That's what effective altruism can offer, marketing that has a good message and leads to good outcomes. Convincing people by logic is no more "inherently good" than convincing people by emotion (at least, I haven't seen anyone provide a convincing proof of either's inherent goodness or badness), it just depends on their preferred method of thinking.

Replies from: Lumifer, OrphanWilde
comment by Lumifer · 2015-11-11T21:14:07.820Z · LW(p) · GW(p)

people who are emotionally driven see emotional appeals as the proper way to convince people

But why would "emotionally driven" people be interested in EA? It doesn't offer them the required emotional appeal (note: I'm talking about EA activities, not EA marketing). If the marketing promises them rescuing cute puppies in distress, EA won't be able to deliver. And even if such people stick around, OrphanWilde's considerations come into play: these people have different goals and different culture, recruit enough of them and they'll take over.

Essentially, they WANT emotional appeal

People WANT to be on the receiving end of advertising for a charity unknown to them? Not anyone I know, but sure, mankind is very diverse... :-/

Besides, are you quite sure you want to compete on the emotional-appeal basis? You become a very small fish in a big pond with some pretty large megalodons swimming around. I don't doubt that the Sierra Club, Susan G. Komen, and ASPCA will handily beat you in the cuteness sweepstakes (not to mention advertising budgets). What's your edge?

Replies from: None
comment by [deleted] · 2015-11-11T22:21:23.688Z · LW(p) · GW(p)

But why would "emotionally driven" people be interested in EA? It doesn't offer them the required emotional appeal (note: I'm talking about EA activities, not EA marketing). If the marketing promises them rescuing cute puppies in distress, EA won't be able to deliver.

Well, it depends on what you're advertising. If you're advertising deworming, you talk about the suffering of children in these countries and you show some heartbreaking images (I'm being deliberately vague here, but you get the idea).

If you're advertising animal welfare, then yes, you can show cute puppies.

Besides, are you quite sure you want to compete on the emotional-appeal basis? You become a very small fish in a big pond with some pretty large megalodons swimming around.

You wouldn't consider "people who are emotionally driven" as a target market. That's far too big a market for a small movement like EA (probably containing somewhere between 40%-95% of the global population). Instead, you would start out with a smaller market that you expect contains many emotionally driven people. You move to the bigger ponds once you have the capital to compete in them.

comment by OrphanWilde · 2015-11-11T21:01:53.189Z · LW(p) · GW(p)

That's what effective altruism can offer, marketing that has a good message and leads to good outcomes. Convincing people by logic is no more "inherently good" than convincing people by emotion (at least, I haven't seen anyone provide a convincing proof of either's inherent goodness or badness), it just depends on their preferred method of thinking.

That's not what the original poster is proposing, however. The original poster is proposing convincing people, through social status bonuses, to donate money. The original poster isn't proposing appealing to people's better natures, but rather encouraging their baser natures.

That's where the morality enters into it, I believe.

comment by Gleb_Tsipursky · 2015-11-11T18:19:05.439Z · LW(p) · GW(p)

Using the term marketing scum is a bit pejorative, I hope we can agree on that :-) Let's avoid emotionally-loaded terms when having rational discourse - I suggest tabooing that term.

Regardless of the term used, yes, I am a dedicated consequentialist, and my goal is to get people to care about effective giving, to avoid leaving huge sums on the table.

Replies from: Dagon, Lumifer
comment by Dagon · 2015-11-12T15:39:00.864Z · LW(p) · GW(p)

Wait. Is your goal to get people to care about effective giving, or is your goal to get people to give effectively? "to avoid leaving huge sums on the table" implies the latter.

This question seems to be the crux of the discussion. Whether EA as a movement has an important identity and mission that's not just "improve the measured state of being of many people on a relatively short timeframe".

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-12T21:14:45.145Z · LW(p) · GW(p)

My goal is to get people to care about effective giving. This will then lead to people giving effectively. However, the first is the goal I am pursuing most directly.

comment by Lumifer · 2015-11-11T18:24:48.845Z · LW(p) · GW(p)

Let's avoid emotionally-loaded terms

But... but... but... what about "appealing to the heart"? :-P

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-11T18:40:01.087Z · LW(p) · GW(p)

I'd be happy to taboo that as well, if you'd like ;-)

comment by OrphanWilde · 2015-11-10T15:13:47.930Z · LW(p) · GW(p)

Rather, the idea is that by appealing to both the "head" and the "heart" we can convey a fuller message about EA, and that this will amplify our reach among people who otherwise might not know about it or take it seriously.

Coddletrop. This post is talking about dark arts, about bypassing the head entirely. "Superdonor" indeed.

Replies from: MarsColony_in10years, bogus, Gleb_Tsipursky
comment by MarsColony_in10years · 2015-11-10T21:28:15.540Z · LW(p) · GW(p)

I agree that OP was leaning a bit heavy on the advertising methods, and that advertising is almost 100% appeal to emotion. However, I'm not sure that 0% emotional content is quite right either. (For reasons besides argument to moderation.) Occasionally it is necessary to ground things in emotion, to some degree. If I were to argue that dust specks in 3^^^3 people's eyes are a huge amount of suffering, I'd likely wind up appealing to empathy for that vastly huge, unfathomable amount of suffering. The argument relies almost exclusively on logic, but the emotional content drives the point home.

However, maybe a more concrete example of the sorts of methods EAs might employ will make it clearer whether or not they are a good idea. If we do decide to use some emotional content, this seems to be an effective science-based way to do it: http://blog.ncase.me/the-science-of-social-change/

Aside from just outlining some methods, the author deals briefly with the ethics. They note that children who read George Washington's Cherry Tree were inspired to be more truthful, while the threats implicit in Pinocchio and Boy Who Cried Wolf didn’t motivate them to lie less than the control group. I have no moral problem with showing someone a good role model, and setting a good example, even if that evokes emotions which influence their decisions. That’s still similar to an appeal to emotion, although the Aristotelian scheme the author mentions would classify it as Ethos rather than Pathos. I’m not sure I’d classify it under Dark Arts. (This feels like it could quickly turn into a confusing mess of different definitions for terms. My only claim is that this is a counterexample, where a small non-rational component of a message seems to be permissible.)

It seems worth noting that EAs are already doing this, to some degree. Here are a couple EA and LW superheroes, off of the top of my head:

One could argue that we should discuss these sorts of people purely for how their stories inform the present. However, if their stories have an aspirational impact, then it seems reasonable to share that. I'd have a big problem if EA turned into a click-maximizing advertising campaign, or launched infomercials. I agree with you there. There are some techniques which we definitely shouldn't employ. But some methods besides pure reason legitimately do seem advisable. And guilting someone out of pocket change is significantly different from acquiring new members by encouraging them to aspire to something, and then giving them the tools to work toward that common goal. It's not all framing.

Replies from: OrphanWilde, Gleb_Tsipursky
comment by OrphanWilde · 2015-11-10T21:59:11.665Z · LW(p) · GW(p)

The issue with advertising isn't just the ethics. Set the ethics of advertising aside. The issue with advertising is that you're bringing people in on the basis of something other than Effective Altruism.

How many new people could EA successfully culturally inculcate each month? Because that's the maximum number of people you should be successfully reaching each month. EA is fundamentally a rationalist culture; if you introduce non-rationalists faster than you can teach them rationalism, you are destroying your own culture.

How do you foresee this going for EA's culture?

comment by Gleb_Tsipursky · 2015-11-11T00:23:28.548Z · LW(p) · GW(p)

I very much agree. My post was leaning more toward the heart to go against the mainstream. There are plenty of tactics I would not endorse. We shouldn't lie to people, or tell them that Jesus will cry if they don't give to effective charities. However, I think it's important to acknowledge and be ok with using some moderate dark arts to promote rationality and effective altruism. If we can motivate people to engage with the EA movement and put their money toward effective charities by getting them to truly care about effective donations, I think that is a quite justifiable use of moderate dark arts. We can be agentive about meeting our goals.

P.S. Nice username!

comment by bogus · 2015-11-11T01:38:49.629Z · LW(p) · GW(p)

This post is talking about dark arts, about bypassing the head entirely.

Guess what, that's what the 'heart' responds to. It doesn't mean you can't appeal to the head too, it's just saying that a mixed message won't work very well. The appeals do have to be largely distinct, though they would probably work best if presented together.

Replies from: OrphanWilde
comment by OrphanWilde · 2015-11-11T15:53:44.079Z · LW(p) · GW(p)

Emotions are not in opposition to rationality, and you do not have to bypass rational processes in order to reach the heart. That is the flawed presumption underlying the Spock mentality.

Replies from: Lumifer, bogus
comment by Lumifer · 2015-11-11T16:53:39.287Z · LW(p) · GW(p)

Emotions are not in opposition to rationality, and you do not have to bypass rational processes in order to reach the heart.

This is very true. You only need to bypass the head if you want to sabotage the rational process and manipulate people into something their head would have rejected.

comment by bogus · 2015-11-11T21:53:04.085Z · LW(p) · GW(p)

The Spock mentality is about personal decision making, not communication or even influence. The notion that 'reaching' System 1 is not something you can do with ordinary, factual communication is quite widely accepted. Even some recent CFAR materials - with their goal factoring approach - are clearly based on this principle.

comment by Gleb_Tsipursky · 2015-11-11T00:11:29.903Z · LW(p) · GW(p)

I think I was clear that we should still use the current EA tactics of appealing to the head, but enrich them by appealing to the heart, to emotions. I think it's important to acknowledge and be ok with using some moderate dark arts to promote rationality and effective altruism. If we can motivate people to engage with the EA movement and put their money toward effective charities by getting them to truly care about effective donations, I think that is a quite justifiable use of moderate dark arts.

Replies from: OrphanWilde
comment by OrphanWilde · 2015-11-11T15:56:00.438Z · LW(p) · GW(p)

First, emotions != dark arts, or EA would be meaningless as an enterprise.

Second, you're not getting anybody to care about effective donations, you're getting them to care about the social status they would attain by being a part of your organization. People who care about social status in this way are going to want more, and they're better at it than you are. You will lose control.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-11T18:13:24.907Z · LW(p) · GW(p)

Sure, emotions = dark arts, but there are shades of darkness, I think we can all agree on that. For example, the statement "emotions != dark arts" relies on a certain emotional tonality to the word "dark arts."

I'm getting people to care about social status to the extent that they care about effective donations. There is nothing about Intentional Insights itself that they should care about, the organization is just a tool to get them to care about effective giving. The key is to tie people's caring to effective giving :-)

Replies from: OrphanWilde
comment by OrphanWilde · 2015-11-11T18:57:05.786Z · LW(p) · GW(p)

Sure, emotions = dark arts, but there are shades of darkness, I think we can all agree on that.

No. We can't. Emotions != dark arts. I say that as somebody who killed his emotions and experienced an emotion-free existence for over a decade in the pursuit of pure rationality. You have no idea what you're talking about.

the statement "emotions != dark arts" relies on a certain emotional tonality to the word "dark arts."

No, it does not. It is a statement that there are ways of interacting with emotions that are non-manipulative. Emotions are not in opposition to rationality, and indeed are necessary to it. Emotions are the fundamental drive to our purpose; rationality is fundamentally instrumental. Emotions tell us what we should achieve; rationality tells us how. What makes your approach "dark arts" is that you seek to make people achieve something different from the achievement you are appealing to in them.

I'm getting people to care about social status to the extent that they care about effective donations. There is nothing about Intentional Insights itself that they should care about, the organization is just a tool to get them to care about effective giving. The key is to tie people's caring to effective giving :-)

You lure people in with one goal, and hope to change the goal they pursue. Have a notion of your own human fallibility, and consider what will happen if you fail. They won't leave. They will take over, and remake your shining institution in their own image.

Because if you do possess the ability to change people's goals, you should start there. Convince people that Effective Altruism is worth doing for its own sake. If you can manage that, you don't need the dark arts in the first place. If you need the dark arts, then you can't do what you'd need to be able to do to make the results favorable, and shouldn't use them.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-12T01:33:57.192Z · LW(p) · GW(p)

I accept that you believe you killed your emotions. However, I think statements like "you have no idea what you are talking about" indicate a presence of emotions, as that's a pretty extreme statement. So I think it might be best to avoid further continuing this discussion.

Replies from: gjm, OrphanWilde
comment by gjm · 2015-11-12T14:21:59.365Z · LW(p) · GW(p)

OrphanWilde has told his emotion-killing story elsewhere on LW, and isn't claiming to have no emotions now but to have spent some time in the past without emotions (having deliberately got rid of them) and found the results very unsatisfactory.

Whether that makes any difference to your willingness to continue the conversation is of course up to you.

comment by OrphanWilde · 2015-11-12T14:21:53.397Z · LW(p) · GW(p)

I'll repeat:

If you do possess the ability to change people's goals, you should start there. Convince people that Effective Altruism is worth doing for its own sake. If you can manage that, you don't need the dark arts in the first place. If you need the dark arts, then you can't do what you'd need to be able to do to make the results favorable, and shouldn't use them.

comment by Gleb_Tsipursky · 2015-11-11T00:03:45.629Z · LW(p) · GW(p)

Yup, agreed that no one is exclusively emotional or analytical - this is a spectrum.

And I made sure in the post above to emphasize that we should keep the current EA outreach oriented toward the head, but also enrich it with an orientation toward the heart. Let's be real: people who are really emotionally oriented will still not care much about EA. But we can reach much further along the analytical/emotional spectrum toward the emotional end than we are currently doing.

comment by Lumifer · 2015-11-09T21:06:51.960Z · LW(p) · GW(p)

What are your thoughts about reaching out to more emotionally-oriented thinkers using these and other modern marketing strategies?

My thoughts?

The product of "modern marketing strategies" will go into the spam bin, and the marketers will go into the scum bin.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-11T00:15:03.199Z · LW(p) · GW(p)

Guess we have different takes on the matter, then. I'm an educator at heart, and it seems that one needs to reach people where they are at in order to have a beneficial impact on the world through raising the sanity waterline, whether regarding donations or other areas of life.

Replies from: Lumifer
comment by Lumifer · 2015-11-11T16:33:20.758Z · LW(p) · GW(p)

I'm an educator at heart, and it seems that one needs to reach people where they are

I'm sorry, outside of mandatory schooling educators teach people who ask to be taught. Those who "reach people where they are" to further their own aims are called other, less complimentary names.

I have no wish to be shaped or "reached" by various people who might think it useful, especially if it mostly involves making my pockets easier to pick.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-11T18:17:22.752Z · LW(p) · GW(p)

I hear that's not your preference. The project of raising the sanity waterline is what I am dedicated to, and that is a project of education.

Replies from: Lumifer
comment by Lumifer · 2015-11-11T18:27:12.720Z · LW(p) · GW(p)

The project of raising the sanity waterline is what I am dedicated to

Actually, you've been talking about raising money for EA. Raising the sanity waterline in the rich West is not an EA goal because there is no efficient $/QALY ratio there.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-11T18:41:10.797Z · LW(p) · GW(p)

Actually, I'd disagree - raising the sanity waterline in the rich West is an EA goal, to the extent that raising the sanity waterline causes people to be more rational about their donations, and thus give to more effective charities. This is very much a meta-goal, of course, and one that has a great deal of impact. In the EA movement, this is called movement-building.

Replies from: Lumifer
comment by Lumifer · 2015-11-11T18:50:56.183Z · LW(p) · GW(p)

raising the sanity waterline in the rich West is an EA goal

An explicit EA goal to which it is willing to commit money?

I don't mean it in the sense of "it would be nice if X happened", I mean it in the sense of "we will spend resources to move this forward". EA movement-building has a narrow focus and is not concerned with the general raising of the sanity waterline.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-12T01:18:04.192Z · LW(p) · GW(p)

Yes, it is an explicit goal of some EA efforts, such as EA Outreach, The Life You Can Save, Intentional Insights, Giving What We Can.

comment by LessRightToo · 2015-11-09T20:54:30.390Z · LW(p) · GW(p)

Superdonor conveys a feeling of superiority, as in better than other donors. In other words, even if you donate less, you can still be better than other donors by donating more effectively.

My personal preference is that you promote honorable reasons for donating, while recognizing that dishonorable reasons exist. Donating so that I can feel superior to other donors who give less or give differently does not strike me as particularly honorable. I admit that I am using the term honor without ever having given much thought as to what it means.

Replies from: bogus, Gleb_Tsipursky
comment by bogus · 2015-11-10T10:48:10.554Z · LW(p) · GW(p)

Whether donating to a super-effective charity should make you feel "superior" to other donors is largely a matter of personal choice. But I don't think that pointedly conveying the message that charities vary widely in effectiveness is per se dishonorable.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-11T00:26:41.480Z · LW(p) · GW(p)

Yup, agreed!

comment by Gleb_Tsipursky · 2015-11-11T00:26:15.255Z · LW(p) · GW(p)

I'm confused by your use of the term "honor." Let's taboo that term. Can you explain what's wrong with desiring to be better than others?

Replies from: LessRightToo
comment by LessRightToo · 2015-11-13T02:46:28.998Z · LW(p) · GW(p)

In its purest form, giving is intentionally impoverishing yourself in order to enrich another (the terms impoverish, enrich, and another can be defined as narrowly or as broadly as you'd like). A person who makes some gesture for the sole purpose of self-elevation is not actually giving, no matter how generous the gesture may appear to casual observers. The most effective campaigns I've seen in the charitable giving domain emphasize positive outcomes for others rather than appealing to a donor's vanity or encouraging narcissism.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-13T05:35:34.715Z · LW(p) · GW(p)

Ah, thanks for clarifying. So it's a matter of purity of motivations. As a consequentialist I am mainly concerned with the outcome of people caring about effective giving and therefore giving to effective, evidence-based charities, and if getting them to desire self-elevation will motivate donors, then I'm happy to use that to achieve the outcome.

comment by Dagon · 2015-11-09T22:53:17.785Z · LW(p) · GW(p)

I worry about hidden costs in the case of multiple levels of charity each putting a portion of their effort into outreach (aka development).

In exactly the same way that I'd rather give to a direct-action effective charity than to an EA aggregator, I'd rather have end-result charities make the advertising/effect tradeoff than an upstream organization.

Note: I'm not actually all that effective nor altruistic myself, so you should have appropriate skepticism about my opinions on the topic.

Replies from: Gleb_Tsipursky, ChristianKl
comment by Gleb_Tsipursky · 2015-11-11T00:30:00.787Z · LW(p) · GW(p)

I think this is a matter of specialization. We should let direct-action charities specialize in their work, evaluation charities specialize in evaluating them, and outreach charities specialize in popularizing the EA movement and effective giving. Each level deserves funding, as the system would not operate without all of them.

Replies from: Dagon
comment by Dagon · 2015-11-11T00:56:49.307Z · LW(p) · GW(p)

I like that framing.

How does the EA movement generally feel about the current choice between donating to evaluation agencies vs direct action? Is each donor encouraged to decide what the split should be?

Replies from: ChristianKl, Gleb_Tsipursky
comment by ChristianKl · 2015-11-11T12:15:35.438Z · LW(p) · GW(p)

As far as I understand, GiveWell gets enough money by asking people privately that there's no need to publicly encourage new people to give to it. It's okay if a new donor simply gives to direct action.

comment by Gleb_Tsipursky · 2015-11-11T01:55:38.000Z · LW(p) · GW(p)

There's no centralized apparatus for directing funding, unfortunately. Most ordinary EAs give to direct action, as that's where it is intuitive to give. Wiser EAs specifically donate to evaluation and outreach organizations, directing funding against the general trend to where they can make the most effective impact, even though it doesn't feel as warm-and-fuzzy as direct action.

Replies from: Dagon
comment by Dagon · 2015-11-11T18:06:30.895Z · LW(p) · GW(p)

So, what's really needed is a meta-evaluation organization, which can help donors choose how much effort to direct toward direct results and how much toward evaluation, and how much toward outreach (and to evaluate the evaluators). And then a meta-meta-evaluation to figure out how to rate and value the evaluator-evaluators. And so on.

My guess is each level should get handwave-exponentially-fewer resources, and that it converges to zero people working seriously on meta-meta-meta-evaluation, and only fractional people in ad-hoc ways even on meta-meta-evaluation. But the overall topic might be big enough now to have a university group doing studies on relative effectiveness of EA aggregators compared to each other and to direct action groups.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-11T18:09:17.602Z · LW(p) · GW(p)

I think one level of meta-evaluation should be sufficient :-) Namely, one organization that would help donors decide how much efforts to put into nonprofits dedicated to promoting EA, nonprofits dedicated to evaluating charities, and direct action nonprofits.

comment by ChristianKl · 2015-11-11T01:03:16.671Z · LW(p) · GW(p)

In exactly the same way that I'd rather give to a direct-action effective charity than to an EA aggregator, I'd rather have end-result charities make the advertising/effect tradeoff than an upstream organization.

The kind of people who want to give to direct-action effective charities generally want to give to charities with low administrative overhead. As a result it makes sense to have other organisations focus more on the marketing.

comment by ChristianKl · 2015-11-10T23:12:25.798Z · LW(p) · GW(p)

More importantly, it appeals to the heart well. Superdonor conveys the feeling of power – you can be super in your donations! Superdonor conveys an especially strong degree of generosity. Superdonor conveys a feeling of superiority, as in better than other donors.

That sounds like appealing to the gut and not the heart.

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-11T00:32:07.083Z · LW(p) · GW(p)

Appealing to the heart through the gut might be an apt metaphor.

comment by ChristianKl · 2015-11-10T23:10:16.433Z · LW(p) · GW(p)

Jon and I problem-solved how to spread Effective Altruism effectively, and came up with the idea of a catchphrase that we thought would appeal to people’s emotions well: “Be a Superdonor!”

What does "problem-solved" mean in that sentence? Something besides "brainstormed"?

Replies from: Gleb_Tsipursky
comment by Gleb_Tsipursky · 2015-11-11T00:31:09.825Z · LW(p) · GW(p)

We brainstormed using effective decision-making strategies, i.e., ones informed by psychological research on decision making.