Is scope insensitivity really a brain error?

post by Kaarlo Tuomi · 2020-09-28T18:37:37.715Z · LW · GW · No comments

This is a question post.

Contents

  Answers
    5 Unnamed
    5 jimrandomh
    3 areiamus
    0 PatrickDFarley

I am reading a post called The Martial Art of Rationality [? · GW] by Eliezer Yudkowsky, in which he makes the following claim:

 

"If you’re a fast learner, you might learn faster—but the art of rationality isn’t about that; it’s about training brain machinery we all have in common. And where there are systematic errors human brains tend to make—like an insensitivity to scope—rationality is about fixing those mistakes, or finding work-arounds."

 

This post follows one in which he explained the concept of scope insensitivity by discussing a study which found that people asked to contribute towards dealing with the consequences of a fuel spill on wildlife habitat did not contribute more money to save more birds. The contributors were deemed to be insensitive to the scope of the suffering.

 

The point I want to discuss is whether it is entirely fair to describe scope insensitivity, as defined in this way, as a "systematic human brain error".

 

It seems to me that this borders on saying that people who made a different choice from yours are not just wrong but suffering from something: their brains are not working properly, and they need to be taught how to make better choices, where "better" obviously means more in line with the choice you would make.

 

Scope insensitivity would only be irrational if saving birds were the only criterion in play: to save more birds, give more money. But this is almost never the case. People are more complex than that; they have to weigh more criteria than this, and each person may consider different criteria and weight them differently. To label those differences a "systematic human brain error" seems a very one-dimensional response.

 

I think we need to bear in mind that the original study did not allow for the possibility that people did not pay more because they were unable to afford more, or because they would prefer to allocate their charitable spending to alleviating human suffering rather than animal suffering. In fact, the study's authors explicitly said that they were unable to account for the lack of sensitivity to scope. It seems wrong, indeed plainly anti-intellectual, for Yudkowsky to claim that this scope insensitivity is a "systematic human brain error".

 

Please discuss.

Answers

answer by Unnamed · 2020-09-29T21:01:44.612Z · LW(p) · GW(p)

One way to look at this is to pick questions where you're really sure that the two versions of the question should have different answers. For example, questions where the answer is a probability rather than a subjective value. One study some years ago asked some people for the probability that Assad's regime would fall in the next 3 months, and others for the probability that Assad's regime would fall in the next 6 months. As described in the book Superforecasting, non-superforecasters gave essentially identical answers to these two questions (40% and 41%, respectively). So it seems like they were making some sort of error by not taking into account the size of the duration. (Superforecasters gave different answers, 15% and 24%, which did take the duration into account pretty well.)
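One way to see why essentially identical answers count as an error here: the event "the regime falls within 3 months" is contained in the event "the regime falls within 6 months", so the 6-month probability can never be lower, and unless all of the risk is concentrated in the first three months it should be noticeably higher. As a minimal sketch, assuming purely for illustration a constant per-quarter probability p that the regime falls:

    P(fall within 3 months) = p
    P(fall within 6 months) = 1 - (1 - p)^2 = 2p - p^2

Under that illustrative assumption, a 3-month estimate of 40% would imply a 6-month estimate of 64%, not 41%; the superforecasters' 15% and 24% at least move in the right direction.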

answer by jimrandomh · 2020-09-28T23:18:45.980Z · LW(p) · GW(p)

It seems to me that this borders on saying that people who made a different choice from yours are not just wrong but suffering from something: their brains are not working properly, and they need to be taught how to make better choices, where "better" obviously means more in line with the choice you would make.

It's not about the choice in isolation; it's about the mismatch between stated goals and actions. If someone says they want to save money, and they spend tens of hours of their time to avoid a $5 expense when there was a $500 expense they could have avoided with the same effort, then they aren't doing the best thing for their stated goal. Scope-insensitivity problems like this are very common, because quantifying and comparing things is a skill that not everyone has; this causes a huge amount of wasted resources and effort. That doesn't mean everything that looks like an example of scope insensitivity actually is one; people may have other, unstated goals. In the classic study with birds and oil ponds, for example, people might spend a little money to make themselves look good to the experimenter.

(I would also note that, while the classic birds-and-oil-ponds study is often used as an illustrative example, most people's belief that scope insensitivity exists and is a problem does not rely on that example, and other examples are easy to find.)

comment by Kaarlo Tuomi · 2020-09-29T06:55:44.996Z · LW(p) · GW(p)

In the classic study with birds and oil ponds, for example, people might spend a little money to make themselves look good to the experimenter.

 

So you agree with me that there may be a rational reason for them not to donate more money, which implies that it is not logical or rational of Eliezer Yudkowsky to ascribe their behaviour to a human brain error.

Thank you.

answer by areiamus · 2020-09-29T07:46:52.273Z · LW(p) · GW(p)

This article (open access) provides a useful summary of scope insensitivity as a phenomenon that is well researched and seems robust:

https://www.sciencedirect.com/science/article/pii/S2211368114000795

I would caveat that the primary data reported has almost no evidentiary value because of the small sample size (n = 41).

I feel that you have a separate issue beyond the existence of scope insensitivity as a phenomenon, and that is that Yudkowsky made a value judgement when he labelled the phenomenon a product of systematic error. The article linked above describes how scope insensitivity differs from an unbiased utilitarian perspective on aid and concern (it is this latter approach that Yudkowsky would presumably consider correct):

In the specific case of valuations underlying public policy decisions, one would expect that each individual life at risk should be given the same consideration and value, which is a moral principle to which most individuals in western countries would probably agree to. Nonetheless, intuitive tradeoffs and the limits of moral intuitions underlying scope insensitivity in lifesaving contexts can often lead to non-normative and irrational valuations (Reyna & Casillas, 2009).

comment by Stefan_Schubert · 2020-09-29T10:52:07.831Z · LW(p) · GW(p)

The link doesn't seem to work.

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2020-09-29T18:21:12.521Z · LW(p) · GW(p)

Probably meant to be this: "Scope insensitivity: The limits of intuitive valuation of human lives in public policy", Dickert et al.

comment by Kaarlo Tuomi · 2020-09-29T10:49:07.115Z · LW(p) · GW(p)

I feel that you have a separate issue beyond the existence of scope insensitivity as a phenomenon...

The existence of scope insensitivity is not in doubt. In my original post I quite specifically said: "The point I want to discuss is whether it is entirely fair to describe scope insensitivity, as defined in this way, as a 'systematic human brain error'."

It isn't obvious to me what I should or could have done to make that point any clearer.

and that is that Yudkowsky made a value judgement when he labelled the phenomenon a product of systematic error.

I think what he did was to claim that he knew the one specific reason why people did not donate more money, and, with no data whatsoever, he attributed that reason to everyone in the study. This is knowledge he could not possibly possess; his claim is therefore false.

And not only that: he wants to extend this to all cases of scope insensitivity, so that he can say of everyone who is insensitive to scope that they have, or display, a "systematic human brain error."

I think it is obvious that his claim cannot possibly be correct.

Thank you.

answer by PatrickDFarley · 2020-09-29T00:21:20.500Z · LW(p) · GW(p)

Scope insensitivity would only be irrational if saving birds were the only criterion in play: to save more birds, give more money. But this is almost never the case.

The question was designed to isolate those two factors. You can claim the respondents all had secret, rational reasons to answer the way they did, but there's no evidence of that, and you haven't even proposed what those reasons could be.

comment by Kaarlo Tuomi · 2020-09-29T04:59:26.694Z · LW(p) · GW(p)

You can claim the respondents all had secret, rational reasons to answer the way they did, but there's no evidence of that, and you haven't even proposed what those reasons could be.

 

Maybe you missed the last paragraph in my post, where I gave two possible reasons: "the original study did not allow for the possibility that people did not pay more because they were unable to afford more, or because they would prefer to allocate their charitable spending to alleviating human suffering rather than animal suffering."

The first reply, from jimrandomh, suggested that they might have donated a little to make themselves look good to the researcher, a possibility I had not considered. In that case donating more achieves no additional purpose, and it is therefore entirely rational to stick with the lower amount.

But Eliezer Yudkowsky claims to know that it was due to a brain error. Is it rational for him to claim to know that?

Thank you.

Replies from: PatrickDFarley
comment by PatrickDFarley · 2020-09-29T06:42:17.883Z · LW(p) · GW(p)

You're right, I did miss that in your last paragraph, my bad.

It shouldn't matter if they care more about human suffering: as long as bird-lives have nonzero value to them (and they revealed this by pledging any money at all), then the money donated should scale with the lives saved.

If they couldn't afford more, then they already made a mistake in donating their maximum to the first arbitrary opportunity presented. That's like a broader kind of scope insensitivity - valuing all large-sounding benefits exactly the same.

And, if they only pledged money to make themselves look good, they still failed due to scope insensitivity, because it looks bad to value 200,000 lives as little as 2,000.

Anyway, as jimrandomh said, other examples are easy to find. I wouldn't believe in scope insensitivity if I'd never heard anything like the bird example, but I have.

Replies from: Kaarlo Tuomi
comment by Kaarlo Tuomi · 2020-09-29T06:58:53.604Z · LW(p) · GW(p)

If they couldn't afford more, then they already made a mistake in donating their maximum to the first arbitrary opportunity presented.

If I understand you correctly, your position is: unless a person is able to donate more money to save more birds, it is a mistake to donate anything at all.

So a little old lady who has $60 to spare gives it to a wildlife charity, but according to you that is a human brain error because she doesn't have another sixty bucks to spare. Interesting, if true.

 

Thank you.

Replies from: PatrickDFarley
comment by PatrickDFarley · 2020-09-29T16:50:10.625Z · LW(p) · GW(p)

No, that's not my position. Read it again and see if there's a nuanced view that better fits my words.

Replies from: Kaarlo Tuomi, Kaarlo Tuomi
comment by Kaarlo Tuomi · 2020-09-30T01:50:20.371Z · LW(p) · GW(p)

I have already replied once, but my post seems to have been deleted, so I will try again.

I am unable to detect any nuance in your post. Perhaps it would be simpler if you were to say explicitly exactly what you mean.

Thank you.

comment by Kaarlo Tuomi · 2020-09-29T18:36:33.297Z · LW(p) · GW(p)

I think the problem is that you are trying to say too much and have ended up with a post that says nothing of any value. Your first and third paragraphs are demonstrably false, which is why I ignored them. In the second paragraph you literally said: "If they couldn't afford more, then they already made a mistake." I don't see that as being susceptible to nuance.

Thank you.

Replies from: habryka4
comment by habryka (habryka4) · 2020-09-29T20:08:41.226Z · LW(p) · GW(p)

I... don't think that's the way discourse works here on LessWrong. Take this as your first moderator warning: if you continue posting and commenting at this level of quality, you will be banned.
