On "Geeks, MOPs, and Sociopaths"

post by alkjash, Gordon Seidoh Worley (gworley) · 2024-01-19T21:04:48.525Z · LW · GW · 35 comments

Gordon Seidoh Worley

Hey, alkjash! I'm excited to talk about some of David Chapman's work with you. Full disclosure, I'm a big fan of Chapman's in general and also a creator within the meta/post-rationality scene with him (to use some jargon to be introduced very shortly).

You mentioned being superficially convinced by a post he wrote a while ago about how subcultures collapse, "Geeks, MOPs, and sociopaths in subculture evolution". In it he makes a few key claims that, together, give a model of how subcultures grow and decline:

  1. Subcultures come into existence when a small group of creators start a scene (people making things for each other) and then draw a group of fanatics who support the scene. Creators and fanatics are the "geeks".
  2. A subculture comes into existence around the scene when it gets big and popular enough to attract MOPs (members of the public). These people are fans but not fanatics. They don't contribute much other than showing up and having a good time.
  3. If a subculture persists long enough, it attracts sociopaths who prey on the MOPs to exploit them for money, sex, etc.
  4. Although MOPs sometimes accidentally destroy subcultures by diluting the scene too much, sociopaths reliably kill subcultures by converting what was cool about the scene into something that can be packaged and sold to MOPs as a commodity devoid of everything that made it unique and meaningful.
  5. The main way to fight this pattern is to defend against too many MOPs overwhelming the geeks (Chapman suggests a 6:1 MOP to geek ratio) and to aggressively keep out the sociopaths.

There's also a 6th claim that we can skip for now, which is about what Chapman calls the fluid mode and the complete stance, as talking about it would require importing a lot of concepts from his hypertext book Meaningness.

To get us started, I'd be interested to know what you find convincing about his claims, and what, if anything, makes you think other models may better explain how subcultures evolve.

alkjash

In my head I'm running this model against these examples: academic subfields, gaming subreddits and discords, fandoms, internet communities, and startups. Do tell me which of these count as "subcultures" in Chapman's framing. Let me start with the parts of the model I find convincing.

  1. When subcultures grow (too) rapidly, there is an influx of casual members that dilutes the culture and some tension between the old guard and the new fans. This agrees with what I know about startups, gaming subcultures, and fandoms. It also explains the longevity of academic cultures, which are known for our extreme gatekeeping.
  2. In Chinese there is a saying/meme 有人的地方就是江湖, which I would loosely translate as "where there are people there is politics." It seems obvious to me that in the initial stage a subculture will be focused on object reality (e.g. a fandom focused on an anime, a subreddit focused on a video game, etc.), but as people join, politics and social reality will play a larger and larger role (competition over leadership positions, over power and influence, over abstractions like community values not directly tied to the original thing).
  3. As the low-hanging fruit of innovation in object reality (e.g. geeks coming up with new build orders in StarCraft, bloggers coming up with new rationality techniques) dries up, there is a tendency for those good at playing social reality games to gain progressively more influence.
alkjash

Here are some parts that I'm not sure about, or find suspicious, or disagree with:

  1. At least on a superficial reading, there seems to be an essentialist pigeonholing of people into the Geek/MOP/Sociopath trichotomy. It seems more plausible to me that all members of a scene have the capacity for all 3 roles, and on average the "meta" shifts as the evolution of the scene dictates. The most savvy/charismatic Geeks find it prudent to turn to Sociopathy when it is their competitive advantage. The MOPs are always a recruiting ground for both Geeks and Sociopaths. Etc.
  2. I recall reading a comment that was very negative about Chapman's model; I can't find it now, but the gist was that the whole choice of the names Geeks/MOPs/Sociopaths is enormously (and intentionally) biased and colors the rest of the discussion in an unhelpful way. I think names are quite important and would at the very least relabel Sociopaths as "Savvy people" or something like this.
  3. I am curious to what extent this model aims to explain the rise and fall of subcultures versus explanations grounded in the object reality of the interest itself. For example an old technology might fall out of fashion and the interest groups around it dry up, or a competitive video game becomes too well-studied and the game space essentially solved, or the low-hanging fruit in a scientific subfield is all picked and the area stagnates. It's not obvious to me that the "social reality" explanations of Chapman outcompete such considerations.
Gordon Seidoh Worley

I'm not sure what Chapman would say counts as a subculture. The definition seems fuzzy. I think the archetypical examples would be the subcultures that grew up around music styles, like the punk or metal or grunge or house subcultures. Gaming discords and internet communities seem a lot like these to me, academic fields and startups less so, and fandoms a lot less so, because fandoms start out with a huge gulf between the creators and the consumer fans. Goths are maybe a good example of a subculture, as are rationalists, in the sense of the community that grew up around Less Wrong and related sites, and similarly Effective Altruism is a subculture.

I think it's worth making a distinction between two kinds of subcultures. Some subcultures are inherently time-limited. Think about the subculture around a music genre or an academic niche: over time, all the new, interesting stuff that can be done will be done. Eventually new things just sound like old things or rehash already well-explored ideas, and the subculture naturally dies out from diminishing returns.

And then there are subcultures with more longevity, like goths, who have been a thing for 40 years or so, even though there are no major innovations left to explore in goth music, dance, or fashion. What seems to help is that goths are united around a central idea of being part of a community of people who have the same vibe, and then each person who joins the goth subculture gets to go through their own journey of discovering for themselves new things that others have already discovered. This acts as a sort of initiation into the subculture, after which they are a member in good standing.

Chapman's theory can apply to both types of subcultures, but it seems most interesting to consider it for subcultures with longevity, since all MOPs and sociopaths can do to time-limited subcultures is hasten their inevitable demise. I'm probably most interested in what it can tell us about rationality and EA.

So let's assume for now that we're talking about a subculture with longevity that could probably survive indefinitely if it doesn't succumb to a failure mode, like, say, letting in so many MOPs that it gets too diluted, or letting sociopaths/savvy people exploit the subculture until it no longer resembles itself.

As a start, it'd be nice to backtest the model to see if it can adequately explain the fall of any past subculture that could not be explained well by another theory. My guess is that Chapman had hippies in mind when he wrote this post, so let's talk about them.

Wikipedia tells a rough story of what happened to the hippies. After a rise through the 1960s, bad stuff started happening in 1969. There was the Altamont Free Concert, where four people died and four babies were born, which apparently was too shocking for the time. Charles Manson was, to all appearances, a hippie, but then it turned out he and his followers had killed a bunch of people. And then through the 70s, as America stopped drafting teenagers to fight in Vietnam, people stopped having a motivation to live outside the system. Wikipedia also says that hippie culture went mainstream, so it was no longer really a subculture.

Charles Manson and Altamont sound like "sociopaths" exploiting hippie MOPs. That last line could be interpreted as MOPs taking over and diluting hippie culture into oblivion, though that feels like a bit of a stretch. So it feels like half marks here: Chapman's model basically explains what happened to the hippies, but it also seems like more explanation than was needed, and I can imagine other explanations that account for the same events.

alkjash

I didn't realize goth had such a long history, huh.

I'm relatively convinced that Chapman's model is a good fit for musical subcultures, and I'm most interested in the extent to which it generalizes to other spaces. I'd hoped that it would provide a glimpse into some "grand unified theory of social dynamics" and extend all the way out to the rise and fall of nations, but perhaps that's too ambitious.

As you pointed out, some subcultures are naturally time-limited, and there it's not necessary for Chapman's model to explain anything. Are there other necessary conditions? It seems like these dynamics have a hard time taking hold in "normie-land", where the rigid constraints of credentialism, professionalism, legible competence, etc. drive both Geeks and Sociopaths away. Would it be fair to say that Chapman's "subcultures" are exactly those social groups most susceptible to becoming cults?

Gordon Seidoh Worley

Oh, hmm, I don't know, but it seems plausible. I think it certainly would be fair to say that subcultures are the kind of thing people worry will become cults or accuse of being cults. It would be interesting to have more data here, but I'm not sure where we can get it. I'm sure there are places where people keep track of cults, but there are also lots of minicults that pop up and fizzle out without anything bad enough happening to get the attention of someone who would track cults (we've had our share of these within the rationalist community! and a couple that went sideways enough that folks got arrested!).

One of the challenges is that normies sometimes call anything where people are too earnest about a thing a cult. I don't think they would say that all geeks are in cults, but certainly I could imagine a normie offhandedly calling any conglomeration of geeks a cult.

I guess a really interesting question might be how well the model applies in academia, where there are certainly a lot of geeks, but also a lot of structures like credentialism and professionalism that might prevent subculture formation in a way that would meet the conditions for Chapman's model to apply.

So if I were to start putting together a list of criteria for applying Chapman's model, it might include:

  • not naturally time-limited
  • not subject to strong credentialism, professionalism, etc. forces

I'm not sure what else, but we probably have enough examples that we could figure out whether there are other necessary conditions for subculture creation such that, if any one of them is missing, the model fails to apply because the ground in which a subculture could arise doesn't exist.

alkjash

I suppose the big question is: what is one to do about this? Suppose we're geek creators in a new scene and would like things not to turn sour/exploitative through this dynamic. Chapman has some prescriptions that (afaict) round to "titrate the growth of your subculture" and "if you can't beat the sociopaths, join them."

Are there examples where such precautions were taken successfully? Are these prescriptions you would recommend/would have recommended to the rationalist community?

Gordon Seidoh Worley

Let's start with rationalist successes. Some time ago Eliezer wrote "Well-Kept Gardens Die By Pacifism [LW · GW]", and I think it's done a lot to instill in rationalist culture a strong streak of, not exactly gatekeeping, but setting a high bar and expecting people in rationalist spaces to meet it. When there have been bad actors (e.g. Brent Dill, Ziz, etc.), they've been ostracized, so rationalists have a demonstrated willingness to kick "sociopaths" out of the subculture.

There've been some other, marginal cases where there's no consensus on whether action should be taken now or should have been taken earlier (e.g. MAPLE, Nonlinear, maybe just all of EA, etc.). A good metaphor is probably that a healthy immune system sometimes overreacts, and it's more dangerous to accept more false negatives (missing a deadly pathogen) in order to reduce false positives (sneezing at some dust motes). It's probably right that some folks have pushed back in cases where there's no consensus, and we end up with an ambiguous state where orgs/members are still part of the subculture but are also clearly not 100% in good standing.

Setting a high bar also keeps out normies. People regularly join Less Wrong and then leave because they get downvoted for failing to post things that meet the standards of the subculture. There's also lots of jargon and other things to keep them out. Now, you will find plenty of rationalist-adjacent folks, who are maybe the closest thing to rationalist MOPs, but in my experience most of them would be rationalists except that they're wary of identifying as a "geek" within the rationalist scene.

This all seems good for keeping rationality functioning as a subculture.

I think Chapman would ask, okay, but can we do more? This is where I'm not sure. Will rationalists fade into the general intellectual milieu like our nearest predecessor movement, general semantics? Will we become ossified around the writings of one or two key authors, like Objectivists and Communists? Perhaps some third fate? I don't have a clear read on where rationalists are going or should go. I'm doubtful that rationality can go mainstream, but I'm also not sure if there's a desirable option beyond maintaining the subculture.

alkjash

This is illuminating but I'm not sure where to go from here. Suggestions?

Gordon Seidoh Worley

I don't have a clear next direction to take us, either. Perhaps we can invite readers to comment with their thoughts on Chapman's post to continue the discussion.

alkjash

Sure, I don't mind stopping the official dialogue short and opening up for comments.

35 comments


comment by FireStormOOO · 2024-01-20T20:01:26.841Z · LW(p) · GW(p)

For less loaded terms, maybe Create, Consume, Exploit or Create, Enjoy, Exploit as the set of actions available. Looks loosely like what was settled on above.

"Exploit" more naturally captures things like soulless commercialization and others low-key taking advantage of those enjoying the scene.

"Consume", in the context of rationalists, would be more like people who read the best techniques on offer and then go try to use them for things that aren't "advancing the art" itself, like addressing x-risk.

Replies from: FireStormOOO
comment by FireStormOOO · 2024-01-20T20:07:45.623Z · LW(p) · GW(p)

Related: how do spin-off subcultures fit into this model? E.g. in music you have people who consume an innovation in one genre, then reinvent it in their own scene, where they're a creator. I think there are similar dynamics in various LW-adjacent subcultures, though I'm not up enough on the detailed histories to comment.

Replies from: Viliam
comment by Viliam · 2024-01-21T13:54:31.729Z · LW(p) · GW(p)

If the spin-off group identifies differently and meets at different places, that is okay, because it does not prevent the original group from keeping their original ways.

I think there are similar dynamics in various LW-adjacent subcultures

I agree. Less Wrong is a well-defended fortress. The spin-off subcultures have their own online places, such as Astral Codex Ten, Effective Altruism Forum, the places where post-rats meet, etc.

(Even if we copy or link each other's articles, it is always selected articles, which are then discussed by a different audience. In music, I guess an analogy would be a song that is halfway between two genres, being played at different festivals for different audiences.)

comment by Elizabeth (pktechgirl) · 2024-01-20T06:16:15.088Z · LW(p) · GW(p)

I think it's helpful to think of sociopaths as sociopathic towards the original topic of a group, not necessarily towards people. The most giving, prosocial people can end up the most sociopathic-in-that-sense, because they want to make whatever they're involved in as accessible as possible to bring joy to as many people as possible, and don't care about the topic for its own sake.

comment by Viliam · 2024-01-20T22:40:42.511Z · LW(p) · GW(p)

Alternative names: Creators, True Fans, Fake Fans, Fake Creators.

Without Creators, the subculture would not exist; there would be nothing to unite around.

True Fans are those who recognize and appreciate the value the Creators are trying to produce. Their presence provides social and/or material support for the Creators, which is how they also contribute to the art.

Fake Fans are people who are fans for the wrong reasons. They do not care about the same values as the Creators and True Fans, but they come because they care about something else, for example a nice community. The danger is, because they do not care about the original values, they may try to change the community towards their values.

(They may succeed, because it's not like the True Fans are necessarily opposed to their values. Like, maybe the True Fans value both "heavy metal" and "having fun together", but they come to the community because that is the only place where they can listen to the heavy metal. Fake Fans care about "having fun together", but they don't actually care about the metal. They can still wear metal-themed shirts, because it allows them to have more fun with the true fans. But they will push towards more fun even when it comes at the expense of the metal. And the True Fans will not oppose them too hard, because they like having fun, too. It's just, there are other places where people can have fun, but only a few where they can listen to the heavy metal, and now one such place is being actively undermined.)

Fake Creators are alternative Creators, but they produce the values of the Fake Fans, not of the True Fans. They succeed because they have the support of the Fake Fans, who by then may be a majority of the community. This is a problem because they compete for resources with the Creators; but without the Creators, the values of the original community are lost.

(The problem is not that Fake Creators are bad people; they may or may not be, but the bad people can also be among the Creators, True Fans, and Fake Fans. The problem is that by pushing out the Creators, they directly reduce the amount of the value that was originally produced. If Fake Creators fully succeed, the Fake Fans will be completely okay with that; only the True Fans and the Creators will feel that something of value was lost. The new community will celebrate friendship and ponies, but there will be little or no heavy metal played.)

When there have been bad actors (e.g. Brent Dill, Ziz, etc.)

Brent Dill is an abuser and generally a horrible person, but from the perspective of the "geeks, mops, sociopaths" framework this is irrelevant. Any of Creators, True Fans, Fake Fans, or Fake Creators can be abusive and horrible people. (Another reason why using the word "Sociopaths" is wrong.) I would classify Brent as a Fake Fan -- as far as I know, he never pretended to be a rationality guru (as a Fake Creator would), nor does it seem like he cared about rationality itself (as a Creator or a True Fan would). He joined the rationalist community for the wrong reasons (as a Fake Fan would), specifically as yet another source of young women he could abuse.

Ziz, on the other hand, is a Fake Creator -- a fake rationality guru. Actually, they fall partly outside the framework, because it seems like they preyed on the True Fans, not the Fake Fans.

...ok, this is a weird idea, but from a certain angle Chapman himself fits the role of a Fake Creator with regards to the rationalist community. First you have the rationality guru (Eliezer), then come his fans (rationalists), then come people who enjoy the community but don't really buy that stuff that Eliezer teaches (post-rationalists), and then comes a new guru bringing new wisdom optimized for the post-rationalists (Chapman). Interesting.

Replies from: gworley, SaidAchmiz, TAG
comment by Gordon Seidoh Worley (gworley) · 2024-01-22T18:19:29.149Z · LW(p) · GW(p)

I agree with some of your points here, disagree with others. I'll just focus on one that seems worth discussing:

...ok, this is a weird idea, but from a certain angle Chapman himself fits the role of a Fake Creator with regards to the rationalist community. First you have the rationality guru (Eliezer), then come his fans (rationalists), then come people who enjoy the community but don't really buy that stuff that Eliezer teaches (post-rationalists), and then comes a new guru bringing new wisdom optimized for the post-rationalists (Chapman). Interesting.

I think this is not right. Chapman isn't really part of the rationalist community, and was working in parallel on something different that has now intersected because both Chapman and Eliezer are trying to reach similar audiences. I also think it's wrong to say that post-rationalists "don't buy that stuff that Eliezer teaches", as this misunderstands what post-rationality is about, although there are fake fans of post-rationalists, to borrow your terminology, who do reject rationality, and it's pretty annoying that they've co-opted the term.

Replies from: SaidAchmiz
comment by Said Achmiz (SaidAchmiz) · 2024-01-23T00:32:41.467Z · LW(p) · GW(p)

Chapman isn’t really part of the rationalist community, and was working in parallel on something different that has now intersected because both Chapman and Eliezer are trying to reach similar audiences.

Probably the strangest thing about Chapman’s writing has always been the way that he would rail against “rationalists” and “rationality”, and then, when it was pointed out that his characterization doesn’t match the actual beliefs and behavior of Less Wrong style rationality and its adherents, would respond along the lines of “oh, I didn’t mean those ‘rationalists’ and that ‘rationality’; heck, I don’t even know what those Less Wrong guys believe”. To my knowledge, he’s never made it clear just who he is against, then. Where, in 2024 (or 2014), is Chapman encountering these people who have no connection to Less Wrong, but self-describe as “rationalists”, talk of “rationality” as their philosophy, etc.? Who even are these mythical people?

Replies from: kave, xpym, Viliam
comment by kave · 2024-01-23T00:43:15.367Z · LW(p) · GW(p)

In the sidebar of his website, the section "Positive and Logical" under "Part One: Taking Rationalism Seriously" says

Early 20th-century logical positivism was the last serious rationalism. Better understandings of rationality learn from its mistakes.

Replies from: SaidAchmiz
comment by Said Achmiz (SaidAchmiz) · 2024-01-23T01:05:10.313Z · LW(p) · GW(p)

It seems very weird to write a whole website-book, in the 21st century, arguing against early 20th-century logical positivism, of all things. Besides, Chapman often writes as if the “rationalists” he takes as foils are, you know, still around!

Replies from: gworley
comment by Gordon Seidoh Worley (gworley) · 2024-01-23T01:31:13.007Z · LW(p) · GW(p)

I'd say they very much are; they just aren't as prevalent on Less Wrong (and I think there are still plenty of them on LW!). My experience is that you can't throw a stone without hitting a logical positivist in any STEM university department, engineering company, etc. (even if they don't know that they are one, talking to them makes it clear those are their beliefs).

Replies from: SaidAchmiz
comment by Said Achmiz (SaidAchmiz) · 2024-01-23T02:26:40.621Z · LW(p) · GW(p)

But what are those beliefs exactly? I mean, the actual, historical, early-20th-century positivists had some pretty specific beliefs, some of which were (now-)clearly wrong in their strongest forms… but do very many people believe those strongest forms now? Or are they logical positivists in the same way that Scott Alexander is a logical positivist?

This is why I find David Chapman’s “vagueblogging” so annoying. This whole conversation doesn’t need to be happening; it could all be avoided if he just, like, linked to specific people saying specific things.

Indeed, even just explicitly saying “logical positivists” instead of “rationalists” would make his writing more clear. Why say the latter if what you actually mean is the former…?

comment by xpym · 2024-02-04T12:44:35.501Z · LW(p) · GW(p)

Here's Chapman's characterization of LW:

Assuming by “the modern rationality movement” you mean the LessWrong-adjacent subculture, some of what they write is unambiguously meta-rational. The center of gravity is more-or-less rationalism as I use the term, but the subculture is not exclusively that.

Among the (arguably) core LW beliefs that he has criticized over the years are Bayesianism as a complete approach to epistemology, utilitarianism as a workable approach to ethics, and the map/territory metaphor as a particularly apt way to think about the relationship between belief and reality.

Replies from: SaidAchmiz
comment by Said Achmiz (SaidAchmiz) · 2024-02-04T13:05:54.241Z · LW(p) · GW(p)

Yes, I’ve seen that quote; but what it means is that Chapman’s use of the terms “rational”, “rationality”, etc., are so different from ours (on LW) that we have to translate anything he writes before we can understand it.

As for the criticized beliefs—well, I also reject utilitarianism as a workable approach to ethics. So do many people here, I think (though probably not most). Bayesianism as a complete approach to epistemology seems like at least a bit of a strawman.

The map/territory one is interesting; I can’t easily predict what that criticism consists of. Do you have any links handy, by chance?

Replies from: xpym
comment by xpym · 2024-02-04T13:31:21.198Z · LW(p) · GW(p)

Well, I blame Yudkowsky for the terminology issue: he took a term with hundreds of years of history and used it mostly in place of another established term, one that was traditionally sort of in opposition to the former, no less (rationalism vs empiricism).

As I understand it, Chapman's main target audience wasn't LW, but normal STEM-educated people unsophisticated in the philosophy of science-related issues. Pretty much what Yudkowsky called "traditional rationality".

The map/territory essay: https://metarationality.com/maps-and-territory

Replies from: SaidAchmiz, Richard_Kennaway
comment by Said Achmiz (SaidAchmiz) · 2024-02-04T15:42:38.854Z · LW(p) · GW(p)

The map/territory essay: https://metarationality.com/maps-and-territory

Thanks for the link!

I have to agree with @Richard_Kennaway’s evaluation of the essay. Also, Chapman here exhibits his very common tendency to, as far as I can tell, invent strawman “mistakes” that his targets supposedly make, in order to then knock them down. For example:

Taking maps as prototypes gives the mistaken impression that simply correcting factual errors, or improving quantitative accuracy, is the whole task of rationality.

Maybe someone somewhere has made this sort of mistake at some point, but I can’t recall ever encountering such a person. And to claim that such a mistake arises, specifically, from the map-territory metaphor, seems to me to be entirely groundless.

But of course that’s fine; if I haven’t encountered a thing, it does not follow that the thing doesn’t exist. And surely Chapman has examples to point to, of people making this sort of error…? I mean, I haven’t found any examples, at least not in this essay, but he has them somewhere… right?

Replies from: gworley
comment by Gordon Seidoh Worley (gworley) · 2024-02-04T20:55:56.179Z · LW(p) · GW(p)

Maybe someone somewhere has made this sort of mistake at some point, but I can’t recall ever encountering such a person. And to claim that such a mistake arises, specifically, from the map-territory metaphor, seems to me to be entirely groundless.

I think you should seriously consider that you live in a bubble where you are less likely to encounter the vast valley of half-baked rationality. I regularly meet and engage with people who make exactly this class of error, especially in practice, even if they say they understand in theory that this is not the whole task of LW-style rationality.

Replies from: SaidAchmiz
comment by Said Achmiz (SaidAchmiz) · 2024-02-04T21:07:03.123Z · LW(p) · GW(p)

Sure, that’s possible. Do you have any links to examples?

comment by Richard_Kennaway · 2024-02-04T14:14:20.472Z · LW(p) · GW(p)

The map/territory essay: https://metarationality.com/maps-and-territory

Every example Chapman gives there to illustrate the supposed deficiencies of "the map is not the territory" is of actual maps of actual territories, showing many different ways in which an actual map can fail to correspond to the actual territory, and corresponding situations of metaphorical maps of metaphorical territories. The metaphor passes with flying colours even as Chapman claims to have knocked it down.

Replies from: xpym
comment by xpym · 2024-02-04T14:43:53.769Z · LW(p) · GW(p)

To me, the main deficiency is that it doesn't make explicit the possibility, indeed the eventual inevitability, of ontological remodeling. The map is a definite concept: everybody knows what maps look like, that you can always compare them, etc. But you can't readily compare Newtonian and quantum mechanics; they mostly aren't even speaking about the same things.

Replies from: SaidAchmiz
comment by Said Achmiz (SaidAchmiz) · 2024-02-04T16:02:32.968Z · LW(p) · GW(p)

Switching from a flat map drawn on paper (parchment?), to a globe, would be an example of ontological remodeling.

comment by Viliam · 2024-01-23T08:42:24.977Z · LW(p) · GW(p)

As I understand it, Chapman is promoting some form of Buddhism. (I think he might even be a leader of some small sect? Not sure.) The bottom line [LW · GW] is already written; now he is adding the previous lines to make it seem like this is something that a sufficiently smart modern thinker would discover independently.

Here he is using an ancient Dark Arts technique, which in our culture is known as Hegel's dialectic, but it was already used by Buddha -- to win a debate, create two opposed strawmen, classify all your competitors as belonging to one or the other, and then you are the only smart person in the room who can transcend the strawmen and find the golden middle way of "it is actually the reasonable parts of this, plus the reasonable parts of that, minus all the unreasonable parts". Congratulations, you win!

Buddha classified his philosophical/religious competitors into two groups, and Chapman translated one of those words as "rationalists". (The reference to early 20th-century logical positivism is just another nice trick, where Chapman is promoting an ancient belief, but he is rebranding it as a cool modern perspective, as opposed to the outdated and therefore low-status ideas of positivism.)

Replies from: gworley
comment by Gordon Seidoh Worley (gworley) · 2024-01-23T17:54:03.464Z · LW(p) · GW(p)

I don't have the energy to get into it in depth, but I think you're being pretty uncharitable here and it feels to me like you're trying to weaponize rationalist applause lights. Some quick thoughts on what I think is insufficient about your comment:

  • You claim he already wrote the bottom line, but you provide no evidence to substantiate that.
  • You claim that Hegelian dialectic is a dark art technique with no justification.
  • You make some claims about what's written in Buddhist texts, but offer no reference to the specific arguments that were made to justify the claim that he set up strawmen, which would also require proving that they were strawmen at the time, not just now with 2500 years of philosophical progress.
  • You offer what I guess I can best interpret as an attempt to dunk on Chapman for promoting "ancient" ideas, as if ancient ideas were inherently bad (lots of math is just as old and we still use it every day, so being old is obviously not the problem; would you dunk on someone for promoting the "ancient" belief in the Pythagorean theorem?).
Replies from: Viliam
comment by Viliam · 2024-01-24T09:04:13.793Z · LW(p) · GW(p)

You claim he already wrote the bottom line, but you provide no evidence to substantiate that.

Prediction: No matter how many books or web articles Chapman writes, their conclusions will always support Buddhism. He will not conclude anything fundamentally incompatible with Buddhism.

You claim that Hegelian dialectic is a dark art technique with no justification.

Yes, and I have just explained how exactly it works. Once you see the pattern, it is obvious. Let me show you how it works in practice:

"There are people who believe that Hegelian dialectic is a useful method to transcend our limited beliefs by transcending the thesis and anti-thesis by creating a new and better syn-thesis.

There are also people who believe that Hegelian dialectic is a dark art technique, where people who disagree with the author's conclusion are sorted into two opposing groups, and then the author's solution is presented as the middle way superior to both.

My opinion is that both of these people are correct in some aspect, but wrong in some other aspect. The actual deep understanding of Hegelian dialectic is that in some situations it can be used to transcend two existing contradictory beliefs, while in other situations it can be used as a dark arts technique."

You make some claims about what's written in Buddhist texts, but offer no reference

True; that would be too much work. (Probably enough for someone to write a master's thesis.)

You offer what I guess I can best interpret as an attempt to dunk on Chapman for promoting "ancient" ideas, as if ancient ideas were inherently bad

I don't mind the ancient ideas, but the rebranding feels a bit dishonest. To use your analogy, it would be like teaching the Pythagorean theorem under a new name, pretending that it was my invention.

If Chapman said plainly that he was repeating an ancient argument about pre-Buddhist "rationalists", we could have avoided the confusion. But he made it sound like he was making a fresh observation based on current data.

Notice how Chapman reacts to finding out that there are actual rationalists out there who do not fit his definition of "rationalists". He simply says "those are not the rationalists I was talking about". And that's great! But does this new knowledge make him somehow revise his existing conclusions?

Replies from: localdeity, pktechgirl
comment by localdeity · 2024-01-24T20:34:26.299Z · LW(p) · GW(p)

I haven't read the relevant Chapman stuff, but, to be sure, if we look up Rationalism on Wikipedia, it lists Descartes, Spinoza, Leibniz, and Kant, devoting many paragraphs to their views.  Only in one sentence at the end does it mention Less Wrong: "Outside of academic philosophy, some participants in the internet communities surrounding Less Wrong and Slate Star Codex have described themselves as "rationalists.""  It doesn't even mention Yudkowsky, Hanson, or Scott Alexander by name.

The point is, there exists some academic context in which "rationalism" has a preexisting meaning that refers to those 1600s-1800s people, and probably not to Less Wrong people.  So, when Chapman writes about "rationalists", is it possible he's acting like he's in that context, and talking about the pre-1900 people?

Doing Google searches with site:meaningness.com, I get "descartes" -> 8 results, "spinoza" -> 2 results, "leibniz" -> 1 result [though it's about calculus], "kant" -> 21 results; meanwhile, "yudkowsky OR eliezer" -> 10 results, "less wrong" -> 5 results, "hanson" -> 3 results... and "scott alexander" OR "slate star codex" OR "astral codex ten" -> 31 results!

Hmmph.  I guess he talks about both.  It would require actually reading the blog to judge what he means when he says "rationalist" and whether he's consistent about it.  I'll let Viliam report on this.

... And I see a further result, from what seems to be a book Chapman wrote, bold added by me:

  • The book uses “rationality” to refer to systematic, formal methods for thinking and acting; not in the broader sense of “any sensible way of thinking or acting,” as opposed to irrationality.
  • “Rationalism” refers to any belief system that makes exaggerated claims about the power of rationality, usually involving a formal guarantee of correctness.

Oh, dear.  You define a term like that based on whether the claims are exaggerated vs accurate?  That seems like a recipe for generating arguments about whether something qualifies as "rationalism".  (If Descartes writes an essay about the power of rationality, and some of the claims are exaggerated while others are correct, does that mean the essay is partly rationalism and partly not?  And, obviously, if we disagree about something's correctness, then that means we'd disagree about whether it's "rationalism".  I have the impression that people often don't try to label philosophical ideas beyond who wrote them and when, and any voluntary self-labeling the author did; this type of thing is probably why.)

I've now updated to find Said and Viliam's complaints very plausible.

comment by Elizabeth (pktechgirl) · 2024-01-24T18:47:57.860Z · LW(p) · GW(p)

I've only read some of his work, but I didn't walk away with a sense he wants everyone to be Buddhist. My sense was more that he was pushing back against things he didn't like within Buddhism, including changes it made to become more memetically fit.

comment by Said Achmiz (SaidAchmiz) · 2024-01-23T00:25:55.813Z · LW(p) · GW(p)

The problem is not that Fake Creators are bad people

I think that this is wrong, actually; Fake Creators are bad people, precisely in virtue of being Fake Creators. Value for Creators and True Fans is real value; value for Fake Fans is fake value. Destroying value is bad.

comment by TAG · 2024-01-22T22:04:06.057Z · LW(p) · GW(p)

First you have the rationality guru Socrates...

comment by trevor (TrevorWiesinger) · 2024-01-20T00:47:41.588Z · LW(p) · GW(p)

Strong upvoted; I'm glad to see people thinking about this particular problem. Looking forward to a continuation of this dialogue, which I will also use to help research game-changing community dynamics and defense from external threats.

I think that a big element here, one that puts things OOD in interesting ways, is that a lot is at stake: money, for example (billions have been moved into OpenAI and Anthropic alone).

Normal subcultures don't have infosec requirements, let alone infosec requirements effective enough for intelligence agencies [LW · GW], let alone warding off megacorporations with unprecedented technological capabilities and incredible financial incentives to coopt or hijack large portions of the scene [LW · GW] (yes, I did in fact write that 5 days before the OpenAI incident, although 5 days is like 30 days in 2019 time).

Furthermore, rationalists at least aspire to be at the cutting edge of epistemics and/or empowering people, with the Sequences [? · GW], the first chapter of the CFAR handbook [? · GW], prediction markets, and projectlawful [LW · GW] being notable probably-successes.

As a side note, I think that a sufficiently intelligent person is often flexible enough to succeed in the role of the creator or the sociopath, whereas other people are effectively "locked in" to being sociopaths/impostors, fanatics, or mops. The ratio of flexible to inflexible people should also be far higher here, in addition to intelligence.

However, rare math skill, e.g. intuitive multidimensional thinking, places hard constraints on who can become alignment rockstars [LW · GW] and who is stuck with routinely stealing ideas (or cutting reasonable deals with ghostwriters), similar to how guitar-playing fine motor skills and music composition skill are required for someone to be a creator instead of settling for one of the other roles.

Replies from: AspiringRationalist
comment by NoSignalNoNoise (AspiringRationalist) · 2024-01-20T20:11:11.076Z · LW(p) · GW(p)

Normal subcultures don't have infosec requirements, let alone infosec requirements effective enough for intelligence agencies

This link is broken

Replies from: TrevorWiesinger
comment by trevor (TrevorWiesinger) · 2024-01-20T20:22:21.806Z · LW(p) · GW(p)

Fixed, thank you.

comment by Garrett Baker (D0TheMath) · 2024-01-20T21:17:05.389Z · LW(p) · GW(p)

From afar, at least, academia seems absolutely brimming with mops, sociopaths, and geeks. The question should be: how does it still function? Several answers, which I don't have enough information to differentiate between:

  1. It doesn't. It sucks, and we should expect most intellectual advancement to happen elsewhere.
  2. 1000 shit papers do nothing to lessen a single great work. The lesson? Set up your subculture so that it prioritizes strong-link problems.
  3. There is a (slow) ground truth signal in the form of replicable experimental evidence that acts to cull the excesses of sociopaths. Psychology had a replication crisis, but after 50 years it was able to realize it had a replication crisis, and get better, despite the fraction of mops and sociopaths in the field likely at an all-time high.
  4. Academia is a(n) (un?)holy alliance between mops, geeks, and sociopaths, where mops get legibility & credentialism, geeks get to work on neat technical problems, and sociopaths get to game grant money & status. Everyone mostly knows who everyone else is, but doesn't talk very loudly about it.

    You can tell who are the mops by their milquetoast papers, you can tell who are the sociopaths by their money & unnaturally high h-index, and you can tell who are the geeks by their quality work.
  5. [edit] Academia is not unified, and has many subcultures within it and outside of it. These subcultures compete among each other for the small number of geeks in order to maintain their institutional status, which puts a limit on how charmed they can be by their sociopaths, or how deluded they can be by their mops.
  6. Probably more hypotheses.
Replies from: Mo Nastri
comment by Mo Putera (Mo Nastri) · 2024-01-22T04:16:52.821Z · LW(p) · GW(p)

you can tell who are the sociopaths by their money & unnaturally high h-index, and you can tell who are the geeks by their quality work

Tangential to your comment's main point, but for non-insiders maybe PaperRank, AuthorRank and Citation-Coins are harder to game than the h-index: 

Since different papers and different fields have largely different average number of co-authors and of references we replace citations with individual citations, shared among co-authors. Next, we improve on citation counting applying the PageRank algorithm to citations among papers. Being time-ordered, this reduces to a weighted counting of citation descendants that we call PaperRank. Similarly, we compute an AuthorRank applying the PageRank algorithm to citations among authors. These metrics quantify the impact of an author or paper taking into account the impact of those authors that cite it. Finally, we show how self- and circular- citations can be eliminated by defining a closed market of citation-coins. 

They still can't be compared between subfields though, only within.

comment by Seth Herd · 2024-01-23T01:08:03.983Z · LW(p) · GW(p)

How about this distillation of the theory:

Sociopaths are a problem, and you should aggressively guard against them.

Now this applies outside of subcultures, to business and personal affairs as well, and seems to capture the most original and important part of this theory.

Replies from: Seth Herd
comment by Seth Herd · 2024-01-23T01:22:19.769Z · LW(p) · GW(p)

We should probably add: people making money off of something have (at least somewhat) misaligned incentives with those trying to either create or enjoy that thing. Those misaligned incentives will sometimes encourage them to act like sociopaths.

comment by Mo Putera (Mo Nastri) · 2024-01-22T04:41:13.075Z · LW(p) · GW(p)

Curious what you think of Scott Alexander's Peter Turchin-inspired 'cyclic model' alternative to Chapman's model, which he argues better matches his experience, summarizable as precycle → growth (forward + upward + outward) → involution → postcycle: 

Either through good luck or poor observational skills, I’ve never seen a lot of sociopath takeovers. Instead, I’ve seen a gradual process of declining asabiyyah. Good people start out working together, then work together a little less, then turn on each other, all while staying good people and thinking they alone embody the true spirit of the movement.