Caring less

post by eukaryote · 2018-03-13T22:53:22.288Z · LW · GW · 24 comments

Contents

  Why don't we see this more?
24 comments

Why don't more attempts at persuasion take the form "care less about ABC", rather than the popular "care more about XYZ"?

People, in general, can only do so much caring. We can only spend so many resources and so much effort and brainpower on the things we value.

For instance: Avery spends 40 hours a week working at a homeless shelter, and a substantial amount of their free time researching issues and lobbying for better policy for the homeless. Avery learns about existential risk and decides that it's much more important than homelessness, say 100 times more, and is able to pivot their career into working on existential risk instead.

But nobody expects Avery to work 100 times harder on existential risk, or feel 100 times more strongly about it. That's ridiculous. There literally isn't enough time in the day, and thinking like that is a good way to burn out like a meteor in orbit.

Avery also doesn't stop caring about homelessness - not at all. But as a result of caring so much more about existential risk, they do have to care less about homelessness (in any meaningful or practical sense).

And this is totally normal. It would be kind of nice if we could put energy into everything we care about in proportion to how much we value it, but we only have so much emotional and physical energy and time, and caring about different things over time is a natural part of learning and life.

When we talk about what we should care about, where we should focus more of our time and energy, we really only have one kludgey tool to do so: "care more". Society, people, and companies are constantly telling you to "care more" about certain things. Your brain will take some of these, and through a complicated process, reallocate your priorities such that each gets an amount of attention that fits into your actual stores of time and emotional and physical energy.

But since what we value, and how much, is often considered, literally, the most important thing on this dismal earth, I want more nuance and more accuracy in this process. Introducing "consider caring less" into the conversation does this. It describes an important mental action and lets you describe what you want more accurately. Caring less already happens in people's beliefs and it affects the world, so let's talk about it.

On top of that, the constant chorus of "care more" is also exhausting. It creates a societal backdrop of guilt and anxiety. And some of this is good - the world is filled with problems and it's important to care about fixing them. But you can't actually do everything, and having the mental affordance to care less about something, without disregarding it entirely or feeling like an awful human, makes it easier to prioritize in accordance with your values.

I've been talking loosely about cause areas, but this applies everywhere. A friend describes how, in work meetings, the only conversational attitude ever used is "this is so important", "we need to work hard on that", "this part is crucial", "let's put more effort here". Are these employees going to work three times harder because you gave them more things to focus on, and didn't tell them to focus on anything else less? No.

I suspect that more "care less" messaging would do wonders for creating a life or a society with more yin, more slack, and a more relaxed and sensible attitude towards priorities and values.

It also implies a style of thinking we're less used to than "finding reasons people should care", but it's one that can be done and it reflects actual mental processes that already exist.


Why don't we see this more?

(Or "why couldn't we care less"?)

Some suggestions:

Brains can create connections easily but, unlike computers, can't erase them. You can learn a fact by practicing it with notecards or phone reminders, but you can't un-learn a fact except through disuse. "Care less" requests an action from you that's harder to implement than "care more".

This might be a cultural thing, though. Ways to care less about something include: mindfulness, devoting fewer resources towards a thing, allowing yourself to put more time into your other interests, and reconsidering when you're taking an action based on the thing and deciding if you want to do something else.

I suspect people feel that if you assert "care more about this", you're just sharing your point of view and information that might be useful, and working in good faith. But if you say "care less about that", it feels like you're claiming to know their values and their point of view, and declaring that you understand their priorities better than they do and that their priorities are wrong.

Actually, I think either "care more" or "care less" can have both of those nuances. At its best, "maybe care less" is a helpful and friendly suggestion made in your best interests. There are plenty of times I could use advice along the lines of "care less".

At its worst, "care more" means "I know your values better than you, I know you're not taking them seriously, and I'm so sure I'm right that I feel entitled to take up your valuable time explaining why."

If you treat the things you care about as cherished parts of your identity, you may react badly to people telling you to care less about them. If so, "care less about something you already care about" has a negative emotional effect compared to "care more about something you don't already care about".

(On the other hand, being told you don't have to worry about something can be a relief. It might depend on if you see the thought in question as a treasured gift or as a burden. I'm not sure.)

"Care more about X" sounds more exciting and engaging than "care less about Y", so people are more likely to remember and spread it.

Maybe? Maybe by telling people to "care less" you'll remove their motivations and drive them into an unrelenting apathy. But if you stop caring about something major, you can care more about other things.

Also, if this happens and harms people, it already happens when you tell people to "care more" and thus radically change their feelings and values. Unfortunately, a process exists by which other people can insert potentially-hostile memes into your brain without permission, and it's called communication. "Care less" doesn't seem obviously more risky than the reverse.

Buddhism has a lot to say on relinquishing attachment and desires.

Self-help-type things often say "don't worry about what other people think of you" or "peer pressure isn't worth your attention", although they rarely come with strategies.

Criticism implicitly says "care less about X", though this is rarely explicitly turned into suggestions for the reader.

Effective Altruism is an example of this when it criticizes ineffective cause areas or charities. The image below implicitly says "...So maybe care more about animals on farms and less about pets," which seems like a correct message for them to be sending.

Image from Animal Charity Evaluators.


Anyway, maybe "care less" messaging doesn't work well for some reason, but existing messaging is homogeneous in this way and I'd love to see people at least try for some variation.


Image from the 2016 Bay Area Secular Solstice. During an intermission, sticky notes and markers were passed around, and we were given the prompt: "If someone you knew and loved was suffering in a really bad situation, and was on the verge of giving up, what would you tell them?" Most of them were beautiful messages of encouragement and hope and support, but this was my favorite.

Crossposted to my personal blog.

24 comments

Comments sorted by top scores.

comment by query · 2018-03-14T00:15:21.724Z · LW(p) · GW(p)

If you choose to "care more" about something, and as a result other things get less of your energy, you are socially less liable for the outcome than if you intentionally choose to "care less" about a thing directly. For instance, "I've been really busy" is a common and somewhat socially acceptable excuse for not spending time with someone; "I chose to care less about you" is not. So even if your one and only goal was to spend less time on X, it may be more socially acceptable to do that by adding Y as cover.

Social excusability is often reused as internal excusability.

comment by Qiaochu_Yuan · 2018-03-14T05:12:57.517Z · LW(p) · GW(p)

From my perspective, the simplest and best explanation by far is the Hansonian one: it's just bad signaling to talk about caring less about things as opposed to more. You can't form a social group / movement / alliance around it, it makes you look heartless, etc.

(It's perfectly fine to pair caring less about a thing with caring more about another thing, e.g. the parts of the "manosphere" that combine caring more about having sex with women and caring less about feminism.)

The good news for rationalists is that, as a corollary, there's money on the ground to be picked up by deliberately caring less about things, especially things it would be bad signaling to publicly advocate for caring less about. Some plausible candidates off the top of my head:

  1. Recycling
  2. The news (except these days there's mission-critical stuff like possible increased threat of nuclear war)
  3. Formal education, maybe
  4. Offending people
  5. Infrequent tragic events that kill small numbers of people
  6. Answering emails
Replies from: Celer, ESRogs
comment by Celer · 2018-03-15T15:41:36.959Z · LW(p) · GW(p)

I think you're right, but I'll start by assuming that you're wrong, because I have an alternative explanation for those who disagree with you (one which I think is the most convincing if we assume the signalling explanation isn't correct). I think Eukaryote is missing one important cause. Assume that most writers arguing for caring more or less about a cause are doing so because they believe that this is an important way to serve that cause. Particularly outside our community, people rarely write about causes just for intellectual entertainment. "Everything is signalling" is a valid response, but I'll first reply to the rational-actor case, since that informs the limits and types of signalling. If I am writing to people who are "broadly value aligned", an admittedly imprecise term, I tend to expect that they are not opposed to me on the topics I think are most important. I expect most (85%, if I'm optimistic) instances of reading to involve a reader who is broadly value aligned with the author, at least with respect to the topic of the piece.

If someone cares less about something, I might value that directly (because I dislike the results of them caring), and I might value that indirectly (because I expect effort they take away from the target of my writing to go towards other causes that I value). However, conditional on broad value alignment, the causes that my readers care passionately about are not causes I'm opposed to, and the causes that I care passionately about are not ones that they're opposed to. So direct benefit, except in writing that is explicitly trying to convince people "from the other side", will rarely motivate me to try to make people care less.

Most communities have more than 3-4 potential cause areas. One specific friend of mine will physically go to events to support gun control, gender equality, fighting racism, homelessness prevention, Palestine, Pride, her church, abortion rights, and other topics. If I make her less confident that gun control is an effective way of reducing violence, the effort she frees up will be split fairly broadly. It is unlikely that whatever topic I find most important, or even whatever bundle of topics I find most important, will receive much of that marginal support. EAs are relatively unusual in that deactivating someone along one cause area has a high expected effect on specific other cause areas.

comment by Jacob Falkovich (Jacobian) · 2018-03-15T17:08:59.607Z · LW(p) · GW(p)

A lot of the comments so far have said "I agree that telling people to care less is good, but we can't do it for X,Y,Z reasons". Why the hell can't we? If anyone can overcome X,Y,Z reasons for the purpose of doing something good, it should be us.

How about starting a thread with good examples of calls to care less? You don't have to 100% agree with or endorse them, just demonstrate that it can be and is being done.

I'll start with some of mine:

1. We should care less about global warming.

2. We should care less about empathizing.

3. We should care less about elections.

comment by eukaryote · 2020-01-10T01:46:17.935Z · LW(p) · GW(p)

Hi, I'm pleased to see that this has been nominated and has made a lasting impact.

Do I have any updates? I think it aged well. I'm not making any particular specific claims here, but I still endorse this and think it's an important concept.

I've done very little further thinking on this. I was quietly hoping that others might pick up the mantle and write more on strategies for caring less, as well as cases where it should be argued for. I haven't seen this, but I'd love to see more of it.

I've referred to it myself when talking about values that I think people are over-invested in (see https://eukaryotewritesblog.com/2018/05/27/biodiversity-for-heretics/), but not extensively.

Finally, while I'm generally pleased with this post's reception, I think nobody appreciated my "why couldn't we care less" joke enough.

comment by Caleb Withers (caleb-withers) · 2018-03-14T02:31:05.364Z · LW(p) · GW(p)

An additional possible reason: to be receptive to "caring less" about a cause, one generally has to engage with the fact that, given scarcity, we have to make trade-offs between the many opportunities to do good in the world. Many would agree this is true if asked, but don't fully internalize this, as discussed in Nate Soares' See the Dark World.

comment by Jacob Falkovich (Jacobian) · 2019-12-23T22:45:31.762Z · LW(p) · GW(p)

"Caring less" was in the air. People were noticing the phenomenon. People were trying to explain it. In a comment [LW(p) · GW(p)], I realized that I was in effect telling people to care less about things without realizing what I was doing. All we needed was a concise post to crystallize the concept, and eukaryote obliged.

The post, especially the beginning, gets straight to the point. It asks why we don't hear more persuasion in the form of "care less", offers a realistic example and a memorable graphic, and ends with a call to action. This is the part that was most useful to me - it gave me a clear handle on something that I've been thinking about for a while. I'm a big fan of telling people to care less, and once I realized that this is what I was doing I learned to expect more psychological resistance from people. I'm less direct now when encouraging people to care less, and often phrase it in terms of trade-offs by telling people that caring less about something (usually, national politics and culture wars) will free up energy to care more about things they already endorse as more important (usually, communities and relationships).

The post talks about the guilt and anxiety induced by ubiquitous "care more" messaging, and I think it takes this too much for granted. An alternative explanation is that people who are not scrupulous utilitarian Effective Altruists are quite good at not feeling guilt and anxiety, which leaves room for "care more" messaging to proliferate. I wish the post drew more of a distinction between the narrow world of EA and the broader cultural landscape; I fear that it may be typical-minding somewhat.

Finally, eukaryote throws out some hypotheses to explain the asymmetry. This part seems somewhat rushed and not fully thought out; as a quick brainstorming exercise, it might have worked better as a plain series of bullet points, since the 1-2 paragraph explanations don't really add much. As some commenters pointed out, and as I wrote in an essay [LW · GW] inspired by this post, eukaryote doesn't quite suggest the "Hansonian" explanation that seems obviously central to me. Namely: "care more about X" is a claim to status on behalf of the speaker, who is usually someone with strong opinions and status tied up with X. This is more natural and more tolerable to people than "care less about Y", which reads as an attack on someone else's status and identity - often the listener's, since they presumably care about Y.

Instead of theorizing about the cause of the phenomenon, I think that the most useful follow ups to this post would be figuring out ways to better communicate "care less" messages and observing what actually happens if such messages are received. Even if one does not buy the premise that "care less" messaging is relaxing and therapeutic, it is important to have that in one's repertoire. And the first step towards that is having the concept clearly explained in a public way that one can point to, and that is the value of this post.


comment by Benquo · 2018-03-14T12:07:58.250Z · LW(p) · GW(p)

If you think about policy recommendations as attempts to move resources, there's a straightforward reason why actors with sufficiently unaligned preferences might prefer to send "care more" messages over "care less" ones.

Let's say you have coalition A, which contains potential "care more" target B, and does not contain potential "care less" target C. Then you have everything in neither C nor A, which we can call D. The resources freed up by a "care less" message about C are divided in some proportion between A and D, while a "care more" message about B benefits A exclusively. This dilution effect means that even if agents are constrained to sending true messages, if they can prioritize selfishly, they're often going to favor "care more" over "care less" at similar levels of effectiveness.

If nearly all careabouting territory is defended, the dynamics are somewhat different, since "care more" mostly stops working. If a dominant coalition owns most of the resources, then the main source of variation is the occasional competing claim, and there's a strong incentive for the large player to shut down smaller ones as they pop up. The very large coalition stands to reap the majority of the gains from "care less" attacks against outsiders. Cf. monotheism. On the other hand, small players have a comparatively strong shared incentive to attack the largest one, since even a fairly small shrinkage of its resource claims in percentage terms may free up what is to them quite a lot, for ensuing "care more" claims.
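
To make the dilution effect concrete, here is a minimal numeric sketch of the argument above. This is purely a hypothetical illustration: the variable names, the resource amount, and the split fraction are assumptions chosen for the example, not figures from the comment.

    # Illustrative sketch of the dilution argument, with made-up numbers.
    # A is the messaging coalition; B is a cause inside A; C is a cause outside A;
    # D is everything outside both A and C.

    freed_resources = 100.0  # resources one persuasion attempt can shift (assumed)
    share_to_A = 0.3         # assumed fraction of "care less about C" resources that
                             # land in A; the remaining 0.7 leaks to D

    # "Care more about B": everything the message moves goes to A.
    gain_care_more = freed_resources

    # "Care less about C": the freed resources are split between A and D.
    gain_care_less = freed_resources * share_to_A

    print(f"A's gain from 'care more about B': {gain_care_more:.0f}")  # 100
    print(f"A's gain from 'care less about C': {gain_care_less:.0f}")  # 30

Under these assumed numbers, A captures everything a "care more" message moves but only a fraction of what a "care less" message frees up, so at equal persuasive effectiveness a selfishly prioritizing A favors "care more".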

Replies from: Raemon, jameson-quinn
comment by Raemon · 2020-01-11T01:28:53.978Z · LW(p) · GW(p)

I found this slightly hard to parse, and would be interested in someone writing this again... maybe just in slightly different words, maybe with real examples instead of A/B/C/D.

comment by Jameson Quinn (jameson-quinn) · 2020-01-10T22:42:24.570Z · LW(p) · GW(p)

This is a rigorous version of my intuitive dissatisfaction with the OP.

comment by Stuart_Armstrong · 2018-03-14T12:03:57.210Z · LW(p) · GW(p)

Somehow, it seems that messages like "prioritise less" resonate more than "care less". Maybe because it sounds more tactical, and less like a change in who you are?

Replies from: ninjafetus
comment by ninjafetus · 2018-03-14T16:54:40.980Z · LW(p) · GW(p)

Bingo. When people say what they care about, they're treating it as a statement of values. When they say what they prioritize, they say what they're actually doing.

Ideally, people would do things that match their values, or at least be honest about what their values are, but it's a rare person that will say "I don't care about that" to some tragedy, even when they plan to do nothing.

From what I've seen, the way to publicly "care less" about X in favor of Y without it threatening your ego is to say you're adjusting priorities (usually to something unimpeachable, like family). And the way to "care less" about X because you only have finite time to do things and stay healthy is to say you're focusing on self-care. It's not that X is any less worthy a cause, or that you're any less good of a person for spending less time on it. It's a tactical, zero-sum shift.

comment by crybx · 2018-03-14T14:38:59.423Z · LW(p) · GW(p)

Intuitively seeing things as being like the pie graph is why the birds example for scope insensitivity doesn't feel like a case where I should try to do anything differently. Maybe I only have an ~$80 budget to care about birds because I can't smash a bigger slice into my pie of caring.

comment by chkno · 2018-03-23T05:19:29.410Z · LW(p) · GW(p)

How to do it with your brain:

A specific mental movement, trainable as a habit, that I learned in the context of limiting in-flight work: label non-sanctioned work "bullshit". Internally declare it emphatically and emotively: "That's bullshit!!" This simple, charged internal speech is accessible to both system 2 and system 1 and can flow between them in either direction.

This is an extremely local notion of "bullshit"! It is a judgement of the triple (you, <work item>, this moment). It is not an overall judgement of the work / cause / etc. When slicing causes, it is the dedication of others to these obviously important causes that allows them to be bullshit to you in this moment (emote: gratitude!). When slicing your own time, what is bullshit to you in this moment, while you finish composing this email or whatever, may become your top priority 30 seconds later, after the current task is completed.

comment by Chris_Leong · 2018-03-14T03:53:47.970Z · LW(p) · GW(p)

Attempting to change the social messaging is unrealistic because of the reasons given by other commentators, but I think it is valuable for you to have noticed this pattern. I am also worried about how it pushes us towards guilt as there are always more things to care about than we can handle. I think the solution is to avoid asking yourself how much you should care about a particular thing in isolation. If you are going to decide that you should care more about a particular thing, then you should ask yourself what thing you think you should do less of instead. And then you should ask yourself if that is realistic (zero time procrastinating is not). Perhaps you decide that you are actually allocating your time well and you shouldn't feel bad or perhaps you realise you ought to make a change. But either way, you avoid feeling bad uselessly.

comment by Jameson Quinn (jameson-quinn) · 2020-01-10T22:45:01.239Z · LW(p) · GW(p)

I think the primary value of this post is in prompting Benquo's response. That's not nothing, but I don't think it's top-shelf, because it doesn't really explore the game theory of "care more" vs. "care less" attempts between two agents whose root values don't necessarily align.

comment by tragedyofthecomments · 2018-03-13T23:01:35.898Z · LW(p) · GW(p)

You mentioned self-help. I think "care less" arguments exist, but because "caring less" sounds kind of like "be a shittier person", they get phrased as "take care of yourself" or "what do we prioritize here?" - in favor of caring less about some things so we can do more of the things we want to care more about.

comment by orthonormal · 2019-12-07T22:06:36.836Z · LW(p) · GW(p)

"Caring less" singles out one thing in such a way as to indicate that you might actually care zero about it or care in the opposite direction from your audience.

Analogously, I embarrassed myself when watching the last World Cup with coworkers; when asked who I was cheering for (since the USA didn't qualify), I responded "I just hope it's not Russia". This did not sit well at all with the Russian coworker who overheard. They wouldn't have been upset if I'd instead expressed a positive interest in whoever was playing Russia at the time.

comment by Jiro · 2018-03-19T09:51:38.869Z · LW(p) · GW(p)

Some things are central examples of caring (caring about the homeless), and other things are noncentral examples of caring (caring about sleeping late, caring about leisure activities). Whether a speaker describes something as "you should care more about X" or "you should care less about Y" does communicate information--it depends partly on how central an example of caring he considers X and Y to be.

(It also depends on how broad X and Y are. If you want to tell someone "you should care less about the entire range of activities that includes everything except climate change", you would probably describe it as "you should care more about climate change". So it doesn't follow that any "care more" can be reasonably phrased as a "care less".)

comment by Dagon · 2018-03-14T20:08:33.025Z · LW(p) · GW(p)

This is probably a near/far thing. "Caring" is far-mode, and mostly about intent and image, without tradeoffs. I don't think you'll find universal agreement that "one can only care a fixed total amount about things".

If you say "spend more resources on X", whether that's money, time, or or whatnot, I guarantee you'll have the discussion of "what do I not do instead".

In other words, you should care less about caring, and put more energy into doing.

comment by Hazard · 2018-03-13T23:47:08.081Z · LW(p) · GW(p)

Your picture at the top is great and it nicely hammered in what I felt was the main point (Real caring works like the pie, even if it "should" work like the ideal scenario).

comment by Ben Pace (Benito) · 2019-12-02T19:35:02.771Z · LW(p) · GW(p)

Seconding Habryka.

comment by habryka (habryka4) · 2019-11-29T23:08:00.685Z · LW(p) · GW(p)

I think this post summarizes a really key phenomenon when thinking about how collective reasoning works, and the discussion around it provides some good explanations. 

I'd explained this observation many times before this post even came out, but with it I finally had a pretty concrete and concise reference, and I've used it a few times for that purpose.