How alienated should you be?

post by Vaniver · 2020-06-14T15:55:24.043Z · LW · GW · 4 comments

Epistemic status: a post about values, where I’m confident the question is interesting but not confident my answer is correct. Factual statements are intended to be correct summaries, but I won’t be careful about citations. This post was written in January, and edited and published today.

The “correct” level and type of alienation, like a “balanced” posture, is a dynamic function of the individual and environment. Nevertheless, we can say useful things about it, as we can about posture and balance.

We often talk on LessWrong about the importance of human intelligence. Often we mean individual human intelligence rather than collective human intelligence, and for many topics the difference is immaterial. But when we talk about alienation, or the separation between the individual and the collective, the distinction is paramount.

For the strength of the pack is the wolf, and the strength of the wolf is the pack. --Kipling, The Law For Wolves

The evolutionary history of humans is inextricably tied up with their existence in bands. The drive to imitate allowed for cultural accumulation, and the drive to teach further accelerated that growth. Language gave structure to thoughts and, more importantly, allowed easier transmission of them. Trade allowed for specialization and the fragmentation of knowledge.

We see some evidence that improving collective ability was worth individual costs. Much has been made of how early farmers showed clear signs of malnutrition compared to hunter-gatherers, such as shorter skeletons. Note that malnutrition is bad for individual intelligence, and that farming benefits collective capability primarily by increasing the population carrying capacity. (It also allows wealth to be stored, which lets society be arranged differently, and makes trade more reliable because people are more stationary, but I think those effects are small for our purposes here.)

To live with an immense and proud composure: always beyond. - To have and not have one's feelings, one's for and against, voluntarily, to condescend to them for hours, to sit on them, as if on a horse, often as if on a donkey: - for one needs to know how to use their stupidity as well as their fire. To preserve one's three hundred foregrounds, as well as one's dark glasses: for there are occasions when no one should be allowed to look into our eyes, even less into our "reasons." And to select for company that mischievous and cheerful vice, courtesy. And to remain master of one's four virtues: courage, insight, sympathy, and loneliness. For solitude is a virtue with us, as a sublime tendency and impulse for cleanliness, which senses how contact between one person and another - "in society" - must inevitably bring impurity with it. Every community somehow, somewhere, sometime makes people - "common." --Nietzsche, Beyond Good and Evil #284

In No Safe Defense, Not Even Science [LW · GW], Eliezer observes that many rationalists had an early experience that broke their emotional trust in the sanity of the people around them, and that this made it necessary for them to form independent judgments.

Selection on collectives operates on collective, not individual, survival and popularity. This means that the incentives on the collective for the accuracy of individual beliefs and the satisfaction of individual preferences are weak at best. When collective benefit and individual benefit conflict, we should expect significant pressure to be pro-social; that is, to put the collective first.

Many advances in rationality come from rejecting epistemically invalid pressures to be pro-social. From the collective’s point of view, belief in the local religion is generally not a question about the supernatural, but instead a question of “are you one of us?”, and the rare loud atheist who took the question literally instead of seriously was correctly seen as “not one of us.” Silly rules are also a more effective test of desire to be a member than sensible rules; even sinners do sensible things.

But, from the individual’s perspective, what an evil trick for society to pull! It, in its bigness, decides that your epistemology should be censored and constrained, in an opaque way, for its own benefit. It might claim that this is for your benefit as well, but that claim can only be evaluated with the very epistemology it has crippled.

Many advances in social technology seem like they manage this balancing act more delicately. Capitalism is often characterized as trying to harness individual ambition for pro-social ends. Liberal humanist democracy increases the incentive for the collective to take some sorts of individual benefits more seriously, but more importantly gives individuals more of a moral license to be sovereign.

>u ever sit on a train and listen to a good song and sunlight is pouring into the carriage as u pull into the city and u just,…feel an overwhelming awe and love for the human race? like we built this train!!!! we built this city!!!!! billions of hands and millions of ideas and thousands of years and now I’m here!!! sitting on this train!!! listening to music that was written between all the infrastructure and progress just because!!! human beings are clever and loving and creative and that passion moves in all directions and has inevitably lead me here!!! to this train!! going to uni!!! the most mundane thing in the world and it’s so utterly remarkable it makes me feel tiny and also enormous --bactii

Scott Alexander writes that nerds can be bees, too. Often the capacity and the desire to belong are there; what is missing is a group to belong to.

So the question at the end is, what should one do with one’s time, and sense of belonging, and sense of alienation? Bryan Caplan recommends building a beautiful bubble. This seems obviously correct in many ways, but I think it fails to grapple with the ways in which one’s values are flexible and socially constructed. ‘Doing what you think is right’ and ‘figuring out what you should think is right’ are different sorts of work.


I’ve been thinking about this general topic for years, not with this specific name, but this post solidified while I was thinking about effective altruism, youth movements, recruiting for work on existential risk reduction, and Kolmogorov Complicity.

For x-risk reduction, it often is the case that people think someone else has it handled. And in a world that’s often adequate, that’s not a crazy thing to think!

For effective altruism, it seems like society’s general pro-social pressure is to collude in pretending not to notice the various ways in which the world is on fire, or the child at the center of Omelas. On the one hand, this is one of the main reasons humanity has nice things at all.

Typical of this attitude is the comment that, "If we can send a man to the moon, why can't we--" followed by whatever project the speaker favors. The fact that we sent a man to the moon is part of the reason why many other things could not be done. --Thomas Sowell, Knowledge and Decisions

On the other hand, this is how mortality becomes deliberately ignored, despite the obvious individual interest in continuing to live. [And a society that ignores individual mortality through tweaks to epistemics instead of values seems like the sort of society that might end up ignoring collective mortality too!]

But it’s one thing to convince people that society is predatory, or misleading them, or generally worthy of being alien to, and another thing to create a collective that is still able to do good in the world, and worth belonging to.

For example, when I look at my psychology and values, I see both a deep individualism and a deep cosmopolitanism. That is, I want to be able to do my own thing, make my own choices, and generally be weird, and I also respect other people’s ability to do their own thing, make their own choices, and generally be weird. You can have one without the other! One could be a would-be tyrant, accepting no limit on their own behavior while cruelly limiting others, or a formless follower, believing the Other is correct regardless of its shape and seeking to become whatever will fit in.

Given that those aspects need to be paired to be good, the challenge of creating a culture, and of convincing others to join it, is more difficult than it might seem.

4 comments

Comments sorted by top scores.

comment by Viliam · 2020-06-15T20:42:42.039Z · LW(p) · GW(p)

I think it is easier to feel like part of a group if you feel that the other people in the group are similar to you. That can happen for various reasons, such as:

  • the people actually are similar to you;
  • typical mind fallacy makes you believe in similarity by default, without evidence;
  • the environment makes similarities salient, and differences invisible;
  • you notice both the similarities and the differences, but decide that the similarities are important and the differences are unimportant.

So. The first option doesn't work for you if you really are atypical in some sense; this applies to all LW readers, I suppose. The second option, well, we are trying to overcome our biases, aren't we? That leaves options three and four -- the latter is about you as an individual, and the former is about the (sub)culture you want to fit into.

If we take the rationalist community, then "being aspiring rationalists" is something we have in common, though there can be other things that divide us. Some people care deeply about math and AI, others want life hacks, yet others want to do effective altruism; it may be easier to feel like part of a subgroup. The culture of nitpicking definitely puts the emphasis on differences; the question is how to mitigate this without giving up our shared value of truth-seeking.

(Of course there are other possible groups one could want to join, with other advantages and disadvantages.)

Oh, by the way, option four is not symmetric! You can think that what you have in common with person X is important and the differences are unimportant... but person X may have the opposite opinion.

More complicated: option four is not transitive. Suppose there are two traits you have in common with person X; you think the first one is important, they think the second one is important. Anyway, you both feel you could be members of the same group, and that's nice, right? Except you disagree about other candidates for your group...

I suppose the lesson here is that you should make your values explicit. Which is easier said than done, because of the illusion of transparency, insufficient introspection, or even not having words for some concepts. For many years I had a LessWrong-shaped hole in my heart, but my attempts to explain it... "intelligent people", "thinking about important things", "actually doing stuff", "trying to improve the world"... made others point me towards Mensa, philosophy, entrepreneurs, and non-profits respectively. But Mensa only did puzzles, philosophy was about status signaling, entrepreneurs often had horrible epistemology outside their domain of expertise, and most people in non-profits were hopelessly mindkilled. For lack of better options I hung out with them, but still felt alone. Heck, even today I probably couldn't explain the essence of Less Wrong. (Also: the ideology is not the movement, but to start a movement, you need to clearly point towards a new point in thingspace, otherwise the existing attractors will swallow you before you take your first step.) I still don't know how to do this properly.

Plus there is the obvious trade-off between the quality and the size of the bubble. The more you expect, the fewer people are able to fulfill those expectations. Maybe the solution is a system of overlapping bubbles of various sizes and qualities.

comment by Vaniver · 2020-06-16T05:09:28.701Z · LW(p) · GW(p)

Also, a thing worth mentioning: in January when I wrote this, most of my attention was on how to convince people that current institutions were inadequate, while not making them give up on humanity. And now the situation seems reversed, where people need to be reminded of the overwhelming awe and love for the human race.

comment by Pongo · 2020-06-14T23:10:51.578Z · LW(p) · GW(p)

I like the title question. It seems like a large(r?) focus of the essay is also on "how to have culture that doesn't require alienation". That's what I imagine when I read this:

But it’s one thing to convince people that society is predatory, or misleading them, or generally worthy of being alien to, and another thing to create a collective that is still able to do good in the world, and worth belonging to.

Have I understood rightly that this is a question you're thinking about?

Replies from: Vaniver
comment by Vaniver · 2020-06-15T00:20:33.113Z · LW(p) · GW(p)

Yes, tho in the spirit of "less wrong" instead of "not wrong" I think of "less alien" instead of "not alien." (I think in order to actually get rid of alienation, you would need to get rid of individuality.)


The practical consideration is from the x-risk angle. If you're pro-humanity and buy the basic scientific picture of the universe, almost all of the value looks like it's in the future, when there can be many more humans than there are now experiencing more joy and less misery. Even if you look at it from a selfish perspective, almost all of your expected lifeyears come from the chance of living an extremely long time, tho you have to be patient for discounting to not wipe that out.

But if you buy the basic social picture of humanity, almost all of your contribution comes from 'staying in your lane' and being a cog in a complicated machine much more subtle than you could have designed on your own. Perhaps you could become an expert in a narrow field of study and slightly shift things, but asking the big questions is 'above your pay grade,' and you should mostly expect to make things worse instead of better by looking at those questions or taking actions in response.

And so it seems like people need to let go of many parts of that picture to be highly effective; but also, I think the basic social picture is one of the main reasons many people have for being pro-humanity in the first place. [As opposed to being in favor of themselves or their specific friends, in opposition to a hostile world.]