The value of doing one's own philanthropic research?

post by Capla · 2014-11-10T19:15:00.013Z · score: 5 (5 votes) · LW · GW · Legacy · 8 comments

I want to be informed and to act because I have evaluated the evidence, not just go "with the herd." There's a stigma against simply taking the word of an authority, and rightly so; on net, the world would be better if more people stopped to think for themselves (does anyone disagree?). But it is also the case that there are many fields in which I have to defer to experts, because I simply am not equipped to deal with or consider the problems.

I wonder, is it even worth my doing research on charities, when there exist resources like GiveWell, which will almost certainly be able to do a more thorough and more accurate analysis than I would be able to do? Should I just defer to GiveWell when giving my effective charity?

I'll note that there is a difference between values and facts: I might decide for myself that I care more about some issues than others, due to variations in my personal moral calculus (for instance, I may value the well-being of non-human mammals, relative to human mammals, more than others do, and so choose to support animal rights groups instead of poverty elimination), but might still defer to the experts with regards to how to most efficiently accomplish my stated goals.

Also, do I have good evidence to defer to the expertise of GiveWell? I like the idea, their analysis seems insightful, and people on this forum often speak highly of them. But these are all relatively superficial reasons and don't seem sufficient to let them dictate my giving (that's just lazily submitting to the slick-looking authority). How do I evaluate the expertise of experts?


Comments sorted by top scores.

comment by tog · 2014-11-11T16:10:45.507Z · score: 3 (3 votes) · LW(p) · GW(p)

I'd vote for just deferring to GiveWell.

comment by Ben_LandauTaylor · 2014-11-11T16:31:40.760Z · score: 2 (2 votes) · LW(p) · GW(p)

How do I evaluate the expertise of experts?

This is a difficult problem whose implications go well beyond evaluating charities. Many people seem to defer their evaluation of experts to the experts, but then you have to figure out how to qualify those experts, and I haven't yet seen a good solution to that.

Some heuristics that I use instead:

—Does the expert produce powerful, visible effects in their domain of expertise which non-experts can't duplicate? If so, they're probably reliable within their domain. (For example, engineers can build bridges and authors can make compelling stories, so they're probably reliable in those fields.) This is only useful in cases where a non-expert can evaluate the product's quality; it won't help a non-mathematician evaluate theoretical physics.
—Are the domain experts split into many disagreeing camps? If so, at most one camp is right, which means most of the experts are wrong, and the field isn't reliable. (So this rules out, e.g., experts on nutrition.) This one is a tool for assessing domains of expertise, and won't tell you much about individual experts.

comment by Salemicus · 2014-11-11T12:31:43.894Z · score: 2 (2 votes) · LW(p) · GW(p)

There's a stigma against simply taking the word of an authority, and rightly so; on net, the world would be better if more people stopped to think for themselves (does anyone disagree?).

Yes, I disagree. Remember that if people "stop and think for themselves," they have to stop what they were doing otherwise. If their comparative advantage is doing, rather than thinking, this may well be a poor choice.

It seems to me that there are good reasons to think for yourself:

  • The agency problems of trusting an expert
  • How can you choose the right expert to trust if you don't understand the subject itself?

Versus equally good reasons to delegate thinking:

  • Division of labour
  • Error-checking
  • If your understanding is so poor that you can't choose the right expert, how will you understand the subject?

Therefore, most people will do a mixture of thinking for themselves and trusting experts. In particular, people should most think for themselves when agency problems are most severe, when good experts are hard to identify, and where their own thinking will get rapid and plentiful feedback (classic example - household finances). People should defer to experts most readily when agency problems are small, when good experts are easy to identify, and where their own thinking will get very little feedback (classic example - history).

Unfortunately, claiming to think for yourself is also a brag - it is a way of signalling that you are an expert, or nearly an expert, or that you are capable of understanding a domain. For these signalling reasons, people think for themselves more than is genuinely wise, and as a result we see many autodidacts with obviously wrong beliefs. Even more widespread are people who think that they have "thought for themselves" about a subject, but in fact have just read a small selection of popular books on it - in reality, this is just deference to an expert. Because they do not consider it as such, they have wasted their time on this study; worse, because they believe they are thinking for themselves, they do not consider the degree to which they are trusting an expert, and so frequently choose their sources unwisely, based on e.g. literary qualities rather than accuracy.

I have heard plenty of claims that people defer to authority too much, but never one that had any weight behind it. Mostly, they are mere assertion/mood affiliation.

comment by Capla · 2014-11-11T15:31:45.951Z · score: 0 (0 votes) · LW(p) · GW(p)

That being the case, it sounds like the problem is the particular "experts" to which one defers. From my perspective, many people deferring to the pope (or their local preacher) causes more harm than good (e.g. opposition to important causes, and encouragement of intolerance). I look at that and think the cached thought, "If only those people would think for themselves, they would see how ridiculous many of the religious claims are."

If most people shouldn't actually think for themselves, what is the alternative? If we have a choice between encouraging a culture that enshrines thinking for oneself, or one that values submission to experts and leaders, but we don't get to choose who the leaders are going to be, which is preferable?

I'm more-or-less opposed to democracy, but I think it produces better outcomes than dictatorships (averaging across all the dictatorships that have existed, not just ones that happened to be fairly good).

comment by Fluttershy · 2014-11-10T20:52:56.668Z · score: 2 (2 votes) · LW(p) · GW(p)

I wonder, is it even worth my doing research on charities, when there exist resources like GiveWell, that will almost certainly be able to do a more thorough and more accurate analysis than I would be able to do?

If you have very similar values to the folks at GiveWell, then I would advise you to simply review their research and donate to their top-ranked charities, rather than conducting your own research, given that your time is valuable. If you have somewhat different values from the folks at GiveWell, you might look into organizations (MIRI, Animal Charity Evaluators, etc.) who have already conducted effective altruism-relevant research in other fields, before branching off and starting your own research.

I may value the well-being of non-human mammals, relative to human mammals, more than others, and so choose to support animal rights groups

Animal Charity Evaluators is a GiveWell-like charity evaluator which has its roots in the EA movement, and focuses on evaluating organizations which seek to reduce non-human suffering. I'm not as familiar with them as I am with GiveWell, but the fact that Animal Charity Evaluators branched off of 80,000 Hours is a good signal of their credibility.

comment by John_Maxwell (John_Maxwell_IV) · 2014-11-11T00:26:06.330Z · score: 1 (1 votes) · LW(p) · GW(p)

One nice thing about GiveWell is their commitment to transparency. If you think your time is well spent thinking about optimal giving, you can start with what they've written and try to critique it publicly. (Alternatively, first deliberately ignore their recommendations in order to fight priming effects, brainstorm your own independent list of effective giving opportunities, then diff it with GiveWell's.)

I think in the long run, the ideal model for effective giving research will be something like the ideal model for science research: publish results, critique the results of others, and try to build on our collective understanding. In the same way that all the world's scientific research isn't done by a single organization, I suspect in the long run we will want multiple organizations doing serious thinking about how to give effectively. The big difference with science is that the laws of physics are the same for everyone, but not everyone has the same values motivating their giving. So the ideal output of effective giving research will be something that makes it easy for you to plug your utility function in and learn what organization best achieves your values.

comment by Metus · 2014-11-10T20:52:27.496Z · score: 1 (1 votes) · LW(p) · GW(p)

This should be an interesting discussion. Not only because some people have different values than plain utilitarianism (of any flavour) but because there are other causes the community around LW tends to care about, such as FHI, CFAR and MIRI.

My dream would be an interface that allows me to specify to some degree what I care about and then it suggests a mix of donations catering to my personal beliefs.
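The interface sketched above could, in its simplest form, take user-supplied weights over cause areas and allocate a budget across charities in proportion to (user weight x estimated impact). A minimal sketch, with entirely hypothetical charity names and made-up impact numbers:

```python
def suggest_mix(budget, cause_weights, charities):
    """Allocate `budget` across charities in proportion to the user's
    cause-area weight times each charity's estimated impact per dollar.

    charities: list of (name, cause_area, impact_per_dollar) tuples.
    Returns a dict mapping charity name to suggested donation amount.
    """
    scores = {name: cause_weights.get(cause, 0.0) * impact
              for name, cause, impact in charities}
    total = sum(scores.values())
    if total == 0:
        return {}
    return {name: budget * score / total for name, score in scores.items()}

# Illustrative inputs only; the weights and impact estimates are invented.
weights = {"global_poverty": 0.5, "animal_welfare": 0.3, "x_risk": 0.2}
charities = [
    ("CharityA", "global_poverty", 1.0),
    ("CharityB", "animal_welfare", 0.8),
    ("CharityC", "x_risk", 0.6),
]
mix = suggest_mix(1000.0, weights, charities)
```

A real version would of course need defensible impact estimates per cause area, which is exactly the research output the thread is discussing.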

comment by Capla · 2014-11-11T00:34:25.253Z · score: 0 (0 votes) · LW(p) · GW(p)

then it suggests a mix of donations

I replied, but since it's something I've been thinking about for a while, I simply made it a post.