Entropy and social groups
post by NancyLebovitz · 2011-04-27T13:59:32.012Z · LW · GW · Legacy · 14 comments
I suggest that there are default patterns for social groups, and they could be viewed as high entropy-- what you'd expect without knowing more than that there was a social group of a certain size, possibly with some modifications for tech level and status.
For example, I think that authoritarianism is the default for government-- "we're in charge because we're in charge, and it would be dangerous for anyone who tries to change that". Totalitarianism is lower entropy-- it's surprising for the people in charge to have an ideology which requires them to make drastic changes.
The recent Elitist Jerks: A Well-kept Garden describes an effort to fight one sort of entropy (the repetition of the same questions and answers) which resulted in another sort of entropy (an excessively stable and eventually fragile core group).
Maintaining fun is another challenge in the keeping-things-alive category. Pleasant is relatively easy. Fun (which I'd say requires novelty) is harder, and I'm interested in comments on what it takes to keep the fun going.
There's a theory that life exists as chaos on the border between order and randomness-- I find this plausible, and it's a different angle for looking at the Friendliness problem. How can a system be built which continues to permit (or even encourage) interesting sorts of change, without permitting change so drastic that we as we are now wouldn't recognize the outcome as still related to us?
Comments sorted by top scores.
comment by cousin_it · 2011-04-27T15:05:05.337Z · LW(p) · GW(p)
Sorry, I want to make a comment that's not related to the substance of your post.
You seem to use the word "entropy" to mean "bad things", especially in the paragraph about "fighting one sort of entropy" etc. Using technical terms as metaphors is uncomfortably close to woo. So is the meme that "life exists on the border between order and randomness", which as far as I can tell arose from poor science reporting on the subject of self-organized criticality.
To clarify the difference: this is woo, and this isn't.
ETA: Turns out Shalizi has a note about the above-mentioned meme. (If someone is unfamiliar with Shalizi's notebooks, run don't walk and read as many of them as you can.)
Replies from: NancyLebovitz, MrMind, rhollerith_dot_com
↑ comment by NancyLebovitz · 2011-04-28T13:04:58.172Z · LW(p) · GW(p)
I may not have my idea of entropy clear enough, but when I say that totalitarianism is lower entropy than authoritarianism, I'm not implying that entropy is equivalent to "bad things". What I've got in mind is the idea that entropy is what happens unless an effort is made to get something else to happen.
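[The technical sense behind this usage can be made concrete: over a fixed set of outcomes, the uniform distribution maximizes Shannon entropy, so "high entropy" is what you predict knowing nothing beyond how many outcomes there are -- the default. A minimal illustrative sketch, not part of the original thread:]

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The "default" state: knowing nothing beyond the number of outcomes,
# every outcome is equally likely -- this maximizes entropy.
uniform = [0.25, 0.25, 0.25, 0.25]

# A "low-entropy" state: one outcome strongly preferred, which takes
# extra information (or effort) to specify and maintain.
peaked = [0.85, 0.05, 0.05, 0.05]

print(shannon_entropy(uniform))  # 2.0 -- the maximum for 4 outcomes
print(shannon_entropy(peaked))   # smaller: the peaked state is "surprising"
```

In this reading, authoritarianism-as-default is the uniform-like state the system drifts toward, and totalitarianism is a peaked, lower-entropy state that takes ongoing effort to hold in place.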
Replies from: cousin_it
↑ comment by RHollerith (rhollerith_dot_com) · 2011-04-27T16:54:50.118Z · LW(p) · GW(p)
My problem with it is not that it is uncomfortably close to woo. My problem is that even when it is being used in a nontechnical sense, when I see the word, it is hard for me to suppress the habit of winding up the part of my brain needed to understand technical uses of the concept of entropy.
Replies from: NancyLebovitz
↑ comment by NancyLebovitz · 2011-04-28T13:02:03.247Z · LW(p) · GW(p)
Would "easy defaults" work better for you?
comment by JenniferRM · 2011-04-29T04:50:22.750Z · LW(p) · GW(p)
Eliezer's old post Every Cause Wants To Be A Cult is probably relevant here:
A Noble Cause doesn't need a deep hidden flaw for its adherents to form a cultish in-group. It is sufficient that the adherents be human. Everything else follows naturally, decay by default, like food spoiling in a refrigerator after the electricity goes off.
In the same sense that every thermal differential wants to equalize itself, and every computer program wants to become a collection of ad-hoc patches, every Cause wants to be a cult. It's a high-entropy state into which the system trends, an attractor in human psychology. It may have nothing to do with whether the Cause is truly Noble. You might think that a Good Cause would rub off its goodness on every aspect of the people associated with it - that the Cause's followers would also be less susceptible to status games, ingroup-outgroup bias, affective spirals, leader-gods. But believing one true idea won't switch off the halo effect. A noble cause won't make its adherents something other than human. There are plenty of bad ideas that can do plenty of damage - but that's not necessarily what's going on.
Every group of people with an unusual goal - good, bad, or silly - will trend toward the cult attractor unless they make a constant effort to resist it. You can keep your house cooler than the outdoors, but you have to run the air conditioner constantly, and as soon as you turn off the electricity - give up the fight against entropy - things will go back to "normal".
Replies from: NancyLebovitz
↑ comment by NancyLebovitz · 2011-04-30T21:08:22.564Z · LW(p) · GW(p)
I think it's more that every cause wants to become a cult or a habit. The one thing causes don't want is to become reliable ways of achieving their stated goals.
Part of it is that people need social networks, but at least until the net, it was hard to make those networks happen just because people needed them. Either there had to be a cause, or the networks happened by geographical default.
When the goal of an organization is no longer feasible, the organization may look for another goal rather than dissolve. I'm not cynical about disease-fighting organizations which don't go away just because the disease has been eradicated. It's simply too hard to build substantial organizations.
Replies from: JenniferRM
↑ comment by JenniferRM · 2011-05-01T23:45:09.043Z · LW(p) · GW(p)
Thinking about your original post some more, I'm wondering if perhaps fun groups tend to involve flow on the part of many participants?
If that's an element of it, it would seem an inherently delicate arrangement, because flow requires people to be challenged, but not too challenged, and practice in those circumstances frequently produces growth in skills and a need to increase the level of challenge. If people grow at different speeds they might need different challenges and no longer find association productive. Or you might have a cohort with an initial common understanding that stays roughly synced who "use up all their challenge" and end up either becoming bored or trying to figure out a new mission that will actually be interesting to work on.
If this is true, then it suggests that really hard games with effective handicapping systems might be a good thing to build into a community? If some people get better fast, they can just aspire to winning with a higher handicap. If the game is really deep, there's room to improve for a long time. This makes me wonder if maybe golf clubs or go clubs tend to be long-lived?
I feel like I'm groping here... Like a better conversation on this topic might have more data points in the form of stories about organizations and their tendencies. Then a good theory (a theory I don't feel anywhere close to proposing or justifying at the present time) would be able to give some kind of causal/mechanistic summary of all the data, and there would be parts of the theory that spoke to "fun" and how it related to all the rest of what happens in different organizations.
comment by Eugine_Nier · 2011-04-27T20:59:51.476Z · LW(p) · GW(p)
This reminds me of the iron law of oligarchy.
comment by AlphaOmega · 2011-04-27T17:29:59.599Z · LW(p) · GW(p)
Entropy may always be increasing in the universe, but I would argue that so is something else, which is not well-understood scientifically but may be called complexity, life or intelligence. Intelligence seems to be the one "force" capable of overcoming entropy, and since it's growing exponentially I conclude that it will overwhelm entropy and produce something quite different in our region of spacetime in short order -- i.e. a "singularity". If, as I believe, we are a transitional species and a means to a universal singularity, why would I want a system which restricts changes to those which are comprehensible or related to us?
Replies from: wedrifid, timtyler
↑ comment by wedrifid · 2011-04-27T17:32:35.564Z · LW(p) · GW(p)
Intelligence seems to be the one "force" capable of overcoming entropy
Intelligence burns entropy to function. It just burns it in a far more efficient way in terms of awesomeness per entropy unit than anything else does. It can also concentrate the negentropy, mining a lot of it to use for its own ends. But in the end we are still entropy's bitch. Intelligence would need to find a way to work around the apparent loss of negentropy and find a new source if it wants to survive forever.