You're Calling *Who* A Cult Leader?

post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-22T06:57:46.809Z · LW · GW · Legacy · 121 comments

Followup to: Why Our Kind Can't Cooperate, Cultish Countercultishness

I used to be a lot more worried that I was a cult leader before I started reading Hacker News.  (WARNING:  Do not click that link if you do not want another addictive Internet habit.)

From time to time, on a mailing list or IRC channel or blog which I ran, someone would start talking about "cults" and "echo chambers" and "coteries".  And it was a scary accusation, because no matter what kind of epistemic hygiene I try to practice myself, I can't look into other people's minds.  I don't know if my long-time readers are agreeing with me because I'm making sense, or because I've developed creepy mind-control powers.  My readers are drawn from the nonconformist crowd—the atheist/libertarian/technophile/sf-reader/Silicon-Valley/early-adopter cluster—and so they certainly wouldn't admit to worshipping me even if they were.

And then I ran into Hacker News, where accusations in exactly the same tone were aimed at the site owner, Paul Graham.

Hold on.  Paul Graham gets the same flak I do?

I've never heard of Paul Graham saying or doing a single thing that smacks of cultishness.  Not one.

He just wrote some great essays (that appeal especially to the nonconformist crowd), and started an online forum where some people who liked those essays hang out (among others who just wandered into that corner of the Internet).

So when I read someone:

  1. Comparing the long hours worked by Y Combinator startup founders to the sleep-deprivation tactic used in cults;
  2. Claiming that founders were asked to move to the Bay Area startup hub as a cult tactic of separation from friends and family;

...well, that outright broke my suspension of disbelief.

Something is going on here which has more to do with the behavior of nonconformists in packs than whether or not you can make a plausible case for cultishness or even cultishness risk factors.

But there are aspects of this phenomenon that I don't understand, because I'm not feeling what they're feeling.

Behold the following, which is my true opinion:

"Gödel, Escher, Bach" by Douglas R. Hofstadter is the most awesome book that I have ever read.  If there is one book that emphasizes the tragedy of Death, it is this book, because it's terrible that so many people have died without reading it.

I know people who would never say anything like that, or even think it: admiring anything that much would mean they'd joined a cult (note: Hofstadter does not have a cult).  And I'm pretty sure that this negative reaction to strong admiration is what's going on with Paul Graham and his essays, and I begin to suspect that not a single thing more is going on with me.

But I'm having trouble understanding this phenomenon, because I myself feel no barrier against admiring Gödel, Escher, Bach that highly.

In fact, I would say that by far the most cultish-looking behavior on Hacker News is people trying to show off how willing they are to disagree with Paul Graham.  Let me try to explain how this feels when you're the target of it:

It's like going to a library, and when you walk in the doors, everyone looks at you, staring.  Then you walk over to a certain row of bookcases—say, you're looking for books on writing—and at once several others, walking with stiff, exaggerated movements, select a different stack to read in.  When you reach the bookshelves for Dewey decimal 808, there are several other people present, taking quick glances out of the corner of their eye while pretending not to look at you.  You take out a copy of The Poem's Heartbeat: A Manual of Prosody.

At once one of the others present reaches toward a different bookcase and proclaims, "I'm not reading The Poem's Heartbeat!  In fact, I'm not reading anything about poetry!  I'm reading The Elements of Style, which is much more widely recommended by many mainstream writers."  Another steps in your direction and nonchalantly takes out a second copy of The Poem's Heartbeat, saying, "I'm not reading this book just because you're reading it, you know; I think it's a genuinely good book, myself."

Meanwhile, a teenager who just happens to be there glances over at the book.  "Oh, poetry," he says.

"Not exactly," you say.  "I just thought that if I knew more about how words sound—the rhythm—it might make me a better writer."

"Oh!" he says, "You're a writer?"

You pause, trying to calculate whether the term does you too much credit, and finally say, "Well, I have a lot of readers, so I must be a writer."

"I plan on being a writer," he says.  "Got any tips?"

"Start writing now," you say immediately.  "I once read that every writer has a million words of bad writing inside them, and you have to get it out before you can write anything good.  Yes, one million.  The sooner you start, the sooner you finish."

The teenager nods, looking very serious.  "Any of these books," gesturing around, "that you'd recommend?"

"If you're interested in fiction, then definitely Jack Bickham's Scene and Structure," you say, "though I'm still struggling with the form myself.  I need to get better at description."

"Thanks," he says, and takes a copy of Scene and Structure.

"Hold on!" says the holder of The Elements of Style in a tone of shock.  "You're going to read that book just because he told you to?"

The teenager furrows his brow.  "Well, sure."

There's an audible gasp, coming not just from the local stacks but from several other stacks nearby.

"Well," says the one who took the other copy of The Poem's Heartbeat, "of course you mean that you're taking into account his advice about which books to read, but really, you're perfectly capable of deciding for yourself which books to read, and would never allow yourself to be swayed by arguments without adequate support.  Why, I bet you can think of several book recommendations that you've rejected, thus showing your independence.  Certainly, you would never go so far as to lose yourself in following someone else's book recommendations—"

"What?" says the teenager.

If there's an aspect of the whole thing that annoys me, it's that it's hard to get that innocence back, once you even start thinking about whether you're independent of someone.  I recently downvoted one of PG's comments on HN (for the first time—a respondent had pointed out that the comment was wrong, and it was).  And I couldn't help thinking, "Gosh, I'm downvoting one of PG's comments"—no matter how silly that is in context—because the cached thought had been planted in my mind from reading other people arguing over whether or not HN was a "cult" and defending their own freedom to disagree with PG.

You know, there might be some other things that I admire highly besides Gödel, Escher, Bach, and I might or might not disagree with some things Douglas Hofstadter once said, but I'm not even going to list them, because GEB doesn't need that kind of moderation.  It is okay for GEB to be awesome.  In this world there are people who have created awesome things and it is okay to admire them highly!  Let this Earth have at least a little of its pride!

I've been flipping through ideas that might explain the anti-admiration phenomenon.  One of my first thoughts was that I evaluate my own potential so highly (rightly or wrongly is not relevant here) that praising Gödel, Escher, Bach to the stars doesn't feel like making myself inferior to Douglas Hofstadter.  But upon reflection, I strongly suspect that I would feel no barrier to praising GEB even if I weren't doing anything much interesting with my life.  There's some fear I don't feel, or some norm I haven't acquired.

So rather than guess any further, I'm going to turn this over to my readers.  I'm hoping in particular that someone used to feel this way—shutting down an impulse to praise someone else highly, or feeling that it was cultish to praise someone else highly—and then had some kind of epiphany after which it felt, not allowed, but rather, quite normal.

 

Part of the sequence The Craft and the Community

Next post: "On Things that are Awesome"

Previous post: "Tolerate Tolerance"

121 comments

Comments sorted by top scores.

comment by Scott Alexander (Yvain) · 2009-03-22T12:42:03.440Z · LW(p) · GW(p)

I recently read an article on charitable giving which mentioned how people split up their money among many different charities to, as they put it, "maximize the effect", even though someone with this goal should donate everything to the single highest-utility charity. And this seems a bit like the example you cited where, if blue cards came up randomly 75% of the time and red cards came up 25% of the time, people would bet on blue 75% of the time even though the optimal strategy is to bet blue 100% of the time. All this seems to come from concepts like "Don't put all your eggs in one basket", which is a good general rule for things like investing but can easily break down.
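
A minimal sketch of that card example (the 75/25 split is the one from the example; the trial count is arbitrary): probability matching wins only 0.75² + 0.25² = 62.5% of the time, versus 75% for always betting blue.

    # Sketch: probability matching vs. always betting on the majority color.
    import random

    random.seed(0)
    TRIALS = 100_000
    P_BLUE = 0.75  # chance the card is blue, as in the example

    def win_rate(bet_blue_prob):
        """Fraction of correct guesses when betting blue with probability bet_blue_prob."""
        wins = 0
        for _ in range(TRIALS):
            card_is_blue = random.random() < P_BLUE
            bet_is_blue = random.random() < bet_blue_prob
            wins += (card_is_blue == bet_is_blue)
        return wins / TRIALS

    print("always bet blue:      ", win_rate(1.0))   # ~0.75
    print("probability matching: ", win_rate(0.75))  # ~0.625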

I find myself having to fight this rule for a lot of things, and one of them is beliefs. If all of my opinions are Eliezer-ish, I feel like I'm "putting all my eggs in one basket", and I need to "diversify". You use book recommendations as a reductio, but I remember reading about half the books on your recommended reading list, thinking "Does reading everything off of one guy's reading list make me a follower?" and then thinking "Eh, as soon as he stops recommending such good books, I'll stop reading them."

The other thing is the Outside View summed up by the proverb "If two people think alike, one of them isn't thinking." In the majority of cases I observe where a person conforms to all of the beliefs held by a charismatic leader of a cohesive in-group, and keeps praising that leader's incredible insight, that person is a sheeple and that leader has a cult (see: religion, Objectivism, various political movements). I respect the Outside View enough that I have trouble replacing it with the Inside View that although I agree with Eliezer about nearly everything and am willing to say arbitrarily good things about him, I'm certainly not a cultist because I'm coming to my opinions based on Independent Logic and Reason. I don't know any way of solving this problem except the hard way.

"note: Hofstadter does not have a cult"

I tried to start a Hofstadter cult once. The first commandment was "Thou shalt follow the first commandment." The second commandment was "Thou shalt follow only those even-numbered commandments that do not exhort thee to follow themselves." I forget the other eight. Needless to say it didn't catch on.

Replies from: SoullessAutomaton, Nebu, Anatoly_Vorobey, None, AnnaSalamon, pnkflyd831
comment by SoullessAutomaton · 2009-03-22T13:04:11.520Z · LW(p) · GW(p)

I tried to start a Hofstadter cult once. The first commandment was "Thou shalt follow the first commandment." The second commandment was "Thou shalt follow only those even-numbered commandments that do not exhort thee to follow themselves." I forget the other eight. Needless to say it didn't catch on.

You just didn't give it enough time. Remember, it always takes longer than you expect!

comment by Nebu · 2009-03-25T16:52:55.176Z · LW(p) · GW(p)

I find myself having to fight this rule for a lot of things, and one of them is beliefs. If all of my opinions are Eliezer-ish, I feel like I'm "putting all my eggs in one basket", and I need to "diversify"

See also Robin Hanson's post on Echo Chamber Confidence.

You use book recommendations as a reductio, but I remember reading about half the books on your recommended reading list, thinking "Does reading everything off of one guy's reading list make me a follower?"

I think of all the people who have ever recommended books to me, Eliezer has the most recommendations which I've actually followed. In most of my social circles, I'm the "smart one", but I'm nowhere near as smart as Eliezer (or most other people on LessWrong, it seems). So I do admire EY a lot. I want to be as smart as he is, and so I try reading all the books he has read.

And it kills me, because I also remember his post about novice editors copying the surface behavior of master editors, without integrating the deep insight, and I know that by reading the same science fiction novels EY has read, I'm committing exactly the same sin. But I don't know what else I can do to try to improve myself.

comment by Anatoly_Vorobey · 2009-03-22T13:42:56.565Z · LW(p) · GW(p)

how people split up their money among many different charities to, as they put it, "maximize the effect", even though someone with this goal should donate everything to the single highest-utility charity.

If I have complete or near-complete trust in the information available to me about the charity's utility, as well as its short-term sustainability, that seems like the right decision to make.

But if I don't - if I'm inclined to treat data on overhead and estimates of utility as very noisy sources of data, out of skepticism or experience - is it irrational to prefer several baskets?

Similarly with knowledge and following reading lists, ideologies and the like.

Replies from: RobinHanson, steven0461
comment by RobinHanson · 2009-03-22T14:25:46.124Z · LW(p) · GW(p)

Yes, even with great uncertainty, you should still put all your eggs into your best basket.

Replies from: thomblake, ciphergoth, private_messaging
comment by thomblake · 2009-04-02T14:27:31.295Z · LW(p) · GW(p)

Did you mean this as a general rule, or specifically about this topic?

The literal example of eggs seems to indeed work well with multiple baskets, especially if they're all equally good.

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2009-04-02T15:53:21.372Z · LW(p) · GW(p)

Specifically on this topic.

The expected number of eggs lost is least if you choose the best basket and put all your eggs in it, but because of diminishing returns, you're better off sacrificing a few eggs to reduce the variance. However, your charitable donations are such a drop in the ocean that the utility curve is locally pretty much flat, so you just optimise for maximum expected gain.
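
A rough numerical sketch of that trade-off, with made-up failure probabilities and each basket treated as an independent, all-or-nothing risk: the single safest basket minimizes the expected loss, while splitting trades a worse expectation for a lower variance.

    # Sketch: expected eggs lost vs. variance for concentrated and split allocations.
    fail_probs = [0.10, 0.15, 0.20]  # hypothetical per-basket failure probabilities
    EGGS = 12

    def loss_stats(allocation):
        """Mean and variance of eggs lost, with independent all-or-nothing basket failures."""
        mean = sum(n * p for n, p in zip(allocation, fail_probs))
        var = sum((n ** 2) * p * (1 - p) for n, p in zip(allocation, fail_probs))
        return mean, var

    print(loss_stats([EGGS, 0, 0]))  # all eggs in the best basket: mean 1.2, variance ~12.96
    print(loss_stats([4, 4, 4]))     # split evenly: mean 1.8, variance ~6.0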

comment by Paul Crowley (ciphergoth) · 2009-03-22T15:10:28.324Z · LW(p) · GW(p)

This follows from the expected utility of the sum being the sum of the expected utility?

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-22T16:41:15.499Z · LW(p) · GW(p)

It follows from the assumption that you're not Bill Gates, don't have enough money to actually shift the marginal expected utilities of the charitable investment, and that charities themselves do not operate in an efficient market for expected utilons, so that the two top charities do not already have marginal expected utilities in perfect balance.

Replies from: CarlShulman
comment by CarlShulman · 2009-03-22T16:44:02.892Z · LW(p) · GW(p)

And that you care only about the benefits you confer, not the log of the benefits, or your ability to visualize someone benefited by your action, etc.

Replies from: ciphergoth, Anatoly_Vorobey
comment by Paul Crowley (ciphergoth) · 2009-03-24T13:19:37.134Z · LW(p) · GW(p)

I don't see how either of these affect this result - unless you're saying it's easier to visualise one person with clean water and another with a malaria net than it is two people with clean water?

Replies from: Eliezer_Yudkowsky, ciphergoth
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-08-09T05:57:15.354Z · LW(p) · GW(p)

it's easier to visualise one person with clean water and another with a malaria net than it is two people with clean water?

The sum of the affect raised is greater.

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2009-08-10T08:47:41.021Z · LW(p) · GW(p)

I don't understand I'm afraid, can you unpack that a bit please? Thanks.

Replies from: SoullessAutomaton, arundelo
comment by SoullessAutomaton · 2009-08-10T12:03:15.542Z · LW(p) · GW(p)

Consider scope insensitivity. The amount of "warm fuzzies" one gets from helping X individuals with a given problem does not scale even remotely linearly with X. Distinct actions to help with distinct problems, however, sum in a much more nearly linear fashion (at least up to some point).

Ergo, "one person with clean water and another with a malaria net" feels intuitively like you're doing more than "two people with clean water".

Replies from: orthonormal
comment by orthonormal · 2009-08-10T20:57:22.075Z · LW(p) · GW(p)

Ergo, "one person with clean water and another with a malaria net" feels intuitively like you're doing more than "two people with clean water".

Well, not when you compare them against each other, but only when each is considered on its own: it's like this phenomenon.

comment by arundelo · 2009-08-10T12:30:43.064Z · LW(p) · GW(p)

I think it means: the sum of the feel-good points of giving one person clean water and another a malaria net will, for most people, be higher than the feel-good points of giving two people clean water.

comment by Paul Crowley (ciphergoth) · 2009-04-02T15:49:32.987Z · LW(p) · GW(p)

I'd like to get right whatever it is I'm doing wrong here, so if anyone would like to comment on any problems they see with this or the parent comment (which are both scored 0) I'd be grateful for your input.

EDIT: since this was voted down, but I didn't receive an explanation, I'm assuming it's just an attack, and so I don't need to modify what I do - thanks!

comment by Anatoly_Vorobey · 2009-03-22T17:14:45.263Z · LW(p) · GW(p)

I suspect that the ability to visualize someone benefited by your action is often a proxy for being certain that your action actually helped someone, and that people often place additional value on that certainty. They might not be acting as perfectly rational economic agents in such cases, but I'm not sure I'd call such behavior irrational.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2012-05-31T03:05:23.594Z · LW(p) · GW(p)

but I'm not sure I'd call such behavior irrational

It doesn't matter what we call a behavior. If it can be improved, it should be.

comment by private_messaging · 2012-06-22T14:28:06.755Z · LW(p) · GW(p)

Not when baskets are sapient and trying to exploit you. Utilitarians seriously need more social strategic thinking under uncertainty and input subversion.

Replies from: wedrifid
comment by wedrifid · 2012-06-22T14:42:10.420Z · LW(p) · GW(p)

Not when baskets are sapient and trying to exploit you. Utilitarians seriously need more social strategic thinking under uncertainty and input subversion.

Robin is right, you are wrong. Robin is an economist explaining a trivial application of his field.

Replies from: private_messaging
comment by private_messaging · 2012-06-22T14:48:47.364Z · LW(p) · GW(p)

Robin is wrong (or actually, correct about inanimate baskets but not about agent baskets) and you are simply wrong.

When there is a possibility that your decision method is flawed in such a way that it can be exploited (at some expense), you have to diversify or introduce randomness, to minimize the payoff for developing an exploit against your decision method, thus lowering the exploitation. Basic game theory. Commonly applied in e.g. software security.

Replies from: wedrifid
comment by wedrifid · 2012-06-22T14:56:55.860Z · LW(p) · GW(p)

to minimize the payoff for developing an exploit against your decision method,

No, you are still failing to comprehend this point (which applies here too).

Basic game theory.

Robin is applying said game theory correctly. You are not. More precisely Robin applied game theory correctly 3 years ago.

Replies from: private_messaging
comment by private_messaging · 2012-06-22T15:06:40.572Z · LW(p) · GW(p)

No, you are still failing to comprehend this point (which applies here too).

I comprehend that point. I also comprehend other issues:

Evaluation of the top charity is incredibly inaccurate (low probability of correctness), and taking that into account, the difference in expected payoff between the good charities should be quite small.

Meanwhile, if there exists a population sharing a flaw in the charity evaluation method (the flaw that you have), the payoff for finding a method of exploitation of this particular flaw is inversely proportional to how much they diversify.

Robin is applying said game theory correctly. You are not. More precisely Robin applied game theory correctly 3 years ago.

Geez, a shouting match. Once again: you're wrong, and from what I know, you may well be on something that you think boosts your sanity, but it really doesn't.

Replies from: gwern, None, None
comment by gwern · 2012-06-22T15:30:06.040Z · LW(p) · GW(p)

from what I know, you may well be on something that you think boosts your sanity, but it really doesn't.

Stay classy, Dmytry!

Replies from: wedrifid, private_messaging
comment by wedrifid · 2012-06-22T17:05:52.480Z · LW(p) · GW(p)

Stay classy, Dmytry!

Oh, that explains a lot. While the two accounts had displayed similar behavioral red-flags and been relegated to the same reference class I hadn't made the connection.

Thanks Gwern.

comment by private_messaging · 2012-06-22T15:34:29.916Z · LW(p) · GW(p)

Well, I thought that giving this feedback could help. I'm about as liberal as it gets when it comes to drug use, but it must be recognized that there are considerable side effects to what he may be taking. You are studying the effects, right? You should take into account that I called you and him (out of all the people) pathological before ever knowing that any of you did this experimentation; this ought to serve as some form of evidence of side effects that are visible from outside.

Replies from: gwern
comment by gwern · 2012-06-22T15:37:04.573Z · LW(p) · GW(p)

You are studying the effects, right?

And none of them so far bear on game theoretic minimaxing vs expected value maximizing.

You should take into account that I called you and him (out of all the people) pathological before ever knowing that any of you did this experimentation

You insult everyone here. Don't go claiming this represents special insight on your part, even if one were to grant the other claims!

comment by [deleted] · 2012-06-22T15:26:08.978Z · LW(p) · GW(p)

If you're so confident you're right, prove it rigorously (with, like, math). Otherwise, I'll side with the domain expert over the guy claiming his interlocutor is on drugs any day of the week.

Replies from: private_messaging
comment by private_messaging · 2012-06-22T15:45:39.624Z · LW(p) · GW(p)

Posted on this before:

http://lesswrong.com/lw/aid/heuristics_and_biases_in_charity/5y64

The payoff-for-exploit calculation is trivially simple: if everyone with a flaw diversifies among 5 charities, then the payoff for finding and using the exploit is 1/5 of what it would be if everyone paid into their 'top' one. Of course there are some things that can go wrong with this, for instance it may be easier to exploit to the extent sufficient to get into the top 5, which is why it is hard to do applied mathematics on this kind of topic; there is not a lot of data.

What I believe would happen if people were to adopt the 'choose top charity, donate everything to it' strategy is that, since people are pretty bad at determining top charities, do so using various proxies of performance, and have systematic errors in the evaluation, most people would just end up donating to some sort of super-stimulus of caring with which no one with truly the best intentions can compete (or to compete with which a lot of effort has to be expended on imitation of the superstimulus).

I made a turret in a game that would shoot precisely where it expects you to be. Unfortunately, you can easily outsmart this turret's model of where you could be. Adding random noise to the bullet velocity dramatically increases the lethality of this turret, even though under the turret's model of your behaviour it is now not shooting at the point with the highest expected damage. It is very common to add noise or fuzzy spread to eliminate the undesirable effects of a predictable systematic error. I believe that one should diversify among several of the subjectively 'best' charities, within a range from the best comparable to the size of the systematic error in the process of determining the best charity.
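
A toy version of that turret argument (hypothetical numbers, not the actual game code): against a target that has learned the turret's deterministic prediction and always sidesteps it, firing exactly at the prediction never hits, while adding random spread does.

    # Sketch: deterministic aim vs. noisy aim against a target that knows your prediction.
    import random

    random.seed(1)
    SHOTS = 10_000
    DODGE = 1.0        # how far the target sidesteps from the shot it expects
    HIT_RADIUS = 0.6   # a shot landing within this distance counts as a hit

    def hit_rate(noise_sd):
        hits = 0
        for _ in range(SHOTS):
            predicted = 0.0                                      # turret's point prediction
            target = predicted + random.choice([-DODGE, DODGE])  # target evades the known aim point
            shot = predicted + random.gauss(0, noise_sd)         # turret fires with optional spread
            hits += abs(shot - target) < HIT_RADIUS
        return hits / SHOTS

    print("deterministic aim:", hit_rate(0.0))  # 0.0 -- the perfect dodge always works
    print("noisy aim:        ", hit_rate(1.0))  # ~0.29 -- randomization defeats the dodge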

Replies from: amit, TheOtherDave, None, private_messaging
comment by amit · 2012-06-22T17:11:29.414Z · LW(p) · GW(p)

From this list

It follows from the assumption that you're not Bill Gates, don't have enough money to actually shift the marginal expected utilities of the charitable investment, and that charities themselves do not operate in an efficient market for expected utilons, so that the two top charities do not already have marginal expected utilities in perfect balance.

the assumption whose violation your argument relies on is that you don't have enough money to shift the marginal expected utilities, when "you" are considered to be controlling the choices of all the donors who choose in a sufficiently similar way. I would agree that, given the right assumptions about the initial marginal expected utilities and about how more money would change the marginal utilities and marginal expected utilities, the point that this assumption might sometimes be violated doesn't look like an entirely frivolous objection to a naively construed strategy of "give everything to your top charity".

(BTW, it's not clear to me why mistrust in your ability to evaluate the utility of donations to different charities should end up balancing out to produce very close expected utilities. It would seem to have to involve something like Holden's normal distribution for charity effectiveness, or something else that would make it so that whenever large utilities are involved, the corresponding probabilities will necessarily be requisitely small.)

(edit: quickly fixed some errors)

Replies from: private_messaging
comment by private_messaging · 2012-06-22T20:55:51.023Z · LW(p) · GW(p)

It's not about the marginal expected utilities of the charities as much as it is about the expected utilities for exploitation/manipulation of whatever proxies you, and those like you, have used for arriving at your number, which you insist on calling 'expected utility'.

Let's first get the gun turret example sorted out, shall we? The gun is trying to hit some manoeuvrable spacecraft at considerable distance; it is shooting predictively. If you get an expected damage function over the angles of the turret, and shoot at the maximum of that function, what will happen is that your expected damage function will suddenly acquire a dip at that point, because the target will learn to evade being hit. Do you fully understand the logic behind randomization of the shots there? Behind not shooting at the maximum of whatever function you approximate the expected utility with? The optimum targeting strategy looks like shooting into the region of possible target positions with some sort of pattern. The best pattern may be some random distribution, or it may be some criss-cross pattern, or the like.

Note also that it has nothing to do with saturation; it works the same if there's no 'ship destroyed' limit and you are trying to get target maximally wet with a water hose.

The same situation arises in general when you cannot calculate expected utility properly. I have no objection to the claim that you should pay to the charity with the highest expected utility. But you do not know the highest expected utility; you are practically unable to estimate it. Which charity looks best to you is not expected utility. What you think is expected utility relates to expected utility as much as how strong a beam you think a bridge requires relates to the actual requirements set by the building code. Go read on equilibrium strategies and such.

comment by TheOtherDave · 2012-06-22T16:37:07.002Z · LW(p) · GW(p)

for instance it may be easier to exploit to the extent sufficient to get into the top 5

This seems sort of important.

Sure, if I have two algorithms A1 and A2, and A1 spits out a single charity, and A2 spits out an unsorted list of 5 charities, and A1 is easy for people to exploit but A2 is much more difficult for people to exploit, it's entirely plausible that I'll do better using A2, even if that means spreading my resources among five charities.

OTOH, if A2 is just as easy for people to exploit as A1, it's not clear that this gets me any benefit at all.
And if A2 is easier to exploit, it leaves me actively worse off.

Granted, if, as in your turret example, A2 is simply (A1 plus some random noise), A2 cannot be easier to game than A1. And, sure, if (as in your turret example) all I care about is that I've hit the best charity with some of my money, random diversification of the sort you recommend works well.

I suspect that some people donating to charities have different goals.

comment by [deleted] · 2012-06-22T16:00:45.279Z · LW(p) · GW(p)

As expected, you ignored the assumption that "charities themselves do not operate in an efficient market for expected utilons, so that the two top charities do not already have marginal expected utilities in perfect balance."

Replies from: private_messaging
comment by private_messaging · 2012-06-22T16:27:43.903Z · LW(p) · GW(p)

No, I am not. I am expecting that the mechanism you may use to determine expected utilities has a low probability of validity (a low external probability of argument, if you wish), and thus you should end up assigning very close expected utilities to the top charities, simply due to discounting for your method's imprecision. It has nothing to do with some true frequentist expected utilities that the charities have.

Replies from: None
comment by [deleted] · 2012-06-22T17:38:05.495Z · LW(p) · GW(p)

You're essentially assuming that the variance of whatever prior you place on the utilities is very large in comparison to the differences between the expected utilities, which directly contradicts the assumption. Solve a different problem, get a different answer -- how is that a surprise?

It has nothing to do with some true frequentist expected utilities that charities have.

Well at least you didn't accuse me of rationalizing, being high on drugs, having a love affair with Hanson, etc...

Replies from: private_messaging
comment by private_messaging · 2012-06-22T21:16:52.651Z · LW(p) · GW(p)

You're essentially assuming that the variance of whatever prior you place on the utilities is very large in comparison to the differences between the expected utilities, which directly contradicts the assumption. Solve a different problem, get a different answer -- how is that a surprise?

What assumption? I am considering the real-world donation case. People are pretty bad at choosing top charities, meaning there is very poor correlation between people's idea of the top charity and actual charity quality.

Well at least you didn't accuse me of rationalizing, being high on drugs, having a love affair with Hanson, etc...

Well, I am not aware of a post by you where you say that you take drugs to improve sanity, and describe the side effects of those drugs in some detail that is reminiscent of the very behaviour you display. And if you were to make such a post, and if I were to read it, and if I were to see you exhibiting something matching the side effects you described, I would probably mention it.

comment by private_messaging · 2012-06-22T22:36:57.501Z · LW(p) · GW(p)

To clarify a few points that may have been lost behind abstractions:

Suppose there is a sub-population of donors, people who do not understand physics very well, and do not understand how one could just claim that a device won't work without thorough analysis of a blueprint. Those people may be inclined to donate to the research charity working on magnetic free energy devices, if such charity exists; a high payoff low probability scenario.

Suppose you have N such people willing to donate, on average, $M to cause or causes.

Two strategies are considered: donating to 1 subjectively best charity, or 5 subjectively top charities.

Under the strategy of donating to the 1 'best' charity, the payoff for a magnetic perpetual motion device charity, if it is created, is 5 times larger than under the strategy of dividing between the top 5. There is five times the reward for exploitation of this particular insecurity in the choice process; for sufficiently large M and N, single-charity donating will cross the threshold at which such a charity becomes economically viable, and some semi-cranks semi-frauds will jump on it.

But what about the people donating to normal charities, like water and mosquito nets and the like? The differences between the top normal charities boil down to fairly inaccurate value judgements about which most people do not feel particularly certain.

Ultimately, the issue is that the correlation of your selection of charity with the charity's actual efficacy is affected by your choice. It is similar to the gun turret example.

There are two types of uncertainty here: the probabilistic uncertainty, from which expected utility can be straightforwardly evaluated, and the systematic bias which is unknown to the agent but may be known to other agents (e.g. inferred from observations).

comment by [deleted] · 2012-06-22T15:40:22.333Z · LW(p) · GW(p)

Evaluation of the top charity is incredibly inaccurate (low probability of correctness), and taking that into account, the difference in expected payoff between the good charities should be quite small. Meanwhile, if there exists a population sharing a flaw in the charity evaluation method (the flaw that you have), the payoff for finding a method of exploitation of this particular flaw is inversely proportional to how much they diversify.

Doesn't follow. If you have a bunch of charities with the same expected payoff, donating to any one of them has the same expected value as splitting your donation among all of them. If you have a charity with an even slightly higher expected payoff, you should donate all of your money to that one, since the expected value will be higher.

E.g.: Say that Charity A, Charity B, ..., Charity J can each create 10 utilons per dollar. Ergo, if you have $100, donating $100 to any of the ten charities will have an expected value of 1000 utilons. Donating $10 to each charity will also have an expected value of 1000 utilons. Now suppose Charity K comes on the scene, with an expected payoff of 12 utilons per dollar. Donating your $100 to Charity K is the optimal choice, as the expected value is 1200 utilons.
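
The same arithmetic as a one-screen sketch (the charity names and utilon figures are just the assumed numbers from the example above):

    # Sketch: with a linear utility scale, concentrating on the highest-payoff charity wins.
    budget = 100
    payoff_per_dollar = {"A": 10, "B": 10, "J": 10, "K": 12}  # utilons per dollar, as assumed above

    def expected_utilons(allocation):
        return sum(dollars * payoff_per_dollar[name] for name, dollars in allocation.items())

    print(expected_utilons({"A": budget}))                         # 1000
    print(expected_utilons({"A": 25, "B": 25, "J": 25, "K": 25}))  # 1050 -- splitting still loses to...
    print(expected_utilons({"K": budget}))                         # 1200 -- ...giving everything to K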

comment by steven0461 · 2009-03-22T14:27:41.467Z · LW(p) · GW(p)

But if I don't - if I'm inclined to treat data on overhead and estimates of utility as very noisy sources of data, out of skepticism or experience - is it irrational to prefer several baskets?

Very much so. Rational behavior is to maximize expected utility. When rational agents are risk-averse, they are risk-averse with respect to something that suffers from diminishing returns in utility, so that the possibility of negative surprises outweighs the possibility of positive surprises. "Time spent reading material from good sources" is a plausible example of something that has diminishing returns in utility so you want to spread it among baskets. Utility itself does not suffer from diminishing returns in utility. (Support to a charity might, but only if it's large relative to the charity. Or large relative to the things the charity might be doing to solve the problem it's trying to solve, I guess.)

comment by [deleted] · 2009-03-22T17:18:11.967Z · LW(p) · GW(p)

In the case of reading, I can see the benefit of not putting all of your eggs in one basket. All of us have biases, however hard we try not to, and by reading the same books you are perhaps allowing your biases to be shaped along the same lines as Eliezer's. By having more of your formative reading come from outside of this, you increase your chance of being able to challenge these biases.

This is especially true if you want to write in the same area as Eliezer as it increases your ability to contribute in a different way.

comment by AnnaSalamon · 2009-03-22T21:03:23.195Z · LW(p) · GW(p)

The other thing is the Outside View summed up by the proverb "If two people think alike, one of them isn't thinking." In the majority of cases I observe ... that person is a sheeple and that leader has a cult.

Do you have a mechanistic unpacking (even a guess would be helpful) of what it is to be a "sheeple" or a "cult", and of what harms come from being a "sheeple"? Given Aumann, I'm more inclined to say that if two people have different beliefs, at least one of them isn't thinking.

That said, your point about respecting outside views is reasonable. Are you trying to avoid replacing the outside-presumed "badness" of cults/sheeple with understood mechanisms, so as to retain any usefulness that might be in the received heuristics and that you might not understand the mechanisms behind?

Replies from: Yvain
comment by Scott Alexander (Yvain) · 2009-03-22T21:15:45.718Z · LW(p) · GW(p)

By sheeple and cult, I mean people whose good judgment is clouded by the mechanisms described in the Affective Death Spiral sequence.

comment by pnkflyd831 · 2009-03-24T12:35:42.689Z · LW(p) · GW(p)

It would be great to add a link to the article on charitable giving you refer to, to see if it already concludes or dismisses my idea on the issue. From observations of those around me, I tend to see the reason behind charitable giving as something other than maximizing the utility of the charitable gift. I postulate that people give to many different charities as a social signal. The contributor is signaling to those who are receiving the gift that they sympathize with the cause. The contributor is also signaling to those around them that they are a caring and compassionate person. The quantity of the gift has an almost negligible effect on this signaling. So the more times someone gives, and the more charities they give to, the more often and to a larger audience they can signal positive social mores, raising their social status higher than if they gave all their expendable money to one charity a limited number of times.

comment by Anatoly_Vorobey · 2009-03-22T10:16:55.892Z · LW(p) · GW(p)

PG runs a discussion site. He's using it as a sort of wide-flung net to catch worthy candidates for the "inner circle" - startup founders who get into his YC program - and is quite open about it (e.g. he explicitly says that YC submissions will among other things be judged on how well their authors are known as HN commenters and how worthy their comments have been judged to be). Why is it surprising that this creates a cult atmosphere of sorts?

Before Hacker News, PG was already famous in the relevant community for his essays, which are often credited, among other things, for the modern revival of interest in Lisp (this is probably an exaggeration). Nobody called him a cult leader back then.

Joel Spolsky is a famous blogger in the programming/CS/IT niche; he has an active discussion forum on his site. Lots of people respect him, lots of other people look down on his posts. Nobody calls him a cult leader.

RMS doesn't even have a discussion forum, and doesn't write a blog. He browses the web through an email-mediated wget; that's not even Web 1.0, it's Web -0.5 or something. He's widely considered to be a cult leader.

I'd guess that to make people think you're behaving like a cult leader, you need some or all of the following:

  1. An ideological commitment that is seen as overriding most other priorities. Something that no matter what other things you're talking about, much of the time you're still really talking about that. Something that, from the perspective of someone not as committed as you are, you won't shut up about.

    Paul Graham won't shut up about startups and how they're the natural way of existing for a talented programmer or entrepreneur. Stallman won't shut up about free software and how you're ethically bound to call your OS GNU/Linux. You won't shut up about Topics that Won't Be Named and a few other things.

  2. Actually being a leader or being thought a leader; having a real or widely imagined amount of influence. PG determines who gets into the very prestigious - in the relevant community - YC program. RMS controls GNU and has huge mindshare among free software enthusiasts. Within the admittedly smaller community at OB, you're seen as the most active blogger/proprietor, and the one most involved in its community formation. Unlike Hanson, who's opinionated but detached, you're opinionated and very attached. After lurking at OB for a year or so, I couldn't possibly tell who among the commenters are Hanson's friends, colleagues or fierce antagonists.

  3. You need to be seen as molding the community, or your audience, to your liking - either by filtering the undesirables, or boosting the voices of the desirable. In other words, you need to be seen as growing "the cult", sometimes with active choices, sometimes simply by choosing the rhetoric or the content that'll repel the unfaithful.

    PG acts, actively and passively, to limit the total audience of even the outwardly inclusive HN. The theme of keeping HN 'small' so it doesn't deteriorate to the level of Reddit is reiterated by PG and widely shared by the audience. RMS is famous for his attempts to enforce ideological purity. You're explicitly engaged in conscious community-building, which you sometimes describe as leading to a new generation of rationalists which will embrace the Topics that Shall Not be Named. That is, you can be seen (I'm not saying that must necessarily be the case) as not merely hoping to draw an audience of people interested in rational thinking, but actually filtering that audience to a subset that substantially shares your commitment to the Cause.

  4. This is an anti-property of being considered a cult leader: actively inviting and nurturing disagreement with yourself. In a blog format, that can work by explicitly encouraging dissent through various stylistic and content-based clues, by being especially mindful of dissenting voices in comments, etc. PG, as far as I could notice, never discouraged criticism and handled it superbly, so he possesses this anti-property (and is consequently much less of a cult leader than he could be otherwise). I hesitate to say I've never seen RMS change his opinion as a result of an argument - I guess this happened a few times on very technical issues - but it's a rare exception. You, while not discouraging criticism at all, are prone to ignore criticism (not mere trolling, but serious criticism) in comments and talk over it with people who mostly agree with you; you're also prone to present criticism against you as a result of a trendy choice to stand up to a perceived cult leader (this is a dangerous stance for oneself to adopt, even when true).

Replies from: Nominull, John_Maxwell_IV, CarlShulman, ciphergoth
comment by Nominull · 2009-03-22T15:26:51.084Z · LW(p) · GW(p)

Are you aware of the irony in saying Eliezer "won't shut up" about a topic he has demanded everybody shut up about?

Replies from: Anatoly_Vorobey
comment by Anatoly_Vorobey · 2009-03-22T17:32:15.023Z · LW(p) · GW(p)

I am. I view it as evidence that he recognizes the filtering effect these topics have brought to OB, and intends LW to build a community diverse and independent enough to not let itself be dominated by these topics, unless it so chooses. It's a smart decision.

comment by John_Maxwell (John_Maxwell_IV) · 2009-03-23T04:16:19.243Z · LW(p) · GW(p)

One small step that Eliezer could take with regard to (4), I think, would be to renounce his right to decide which posts are featured and make it entirely dependent on post score.

Replies from: AnnaSalamon
comment by AnnaSalamon · 2009-03-23T05:03:32.667Z · LW(p) · GW(p)

The "top" page is already entirely dependent on post score. I'd strongly prefer that there stay some kind of editorial filter on some aspect of LW; we're doing great right now as a community, but many online communities start out high-quality and then change as their increased popularity changes the crowd and the content.

Replies from: CarlShulman
comment by CarlShulman · 2009-03-23T05:35:00.810Z · LW(p) · GW(p)

IAWYC, no 'but.'

comment by CarlShulman · 2009-03-22T16:15:34.535Z · LW(p) · GW(p)

I generally agree with your points, and draw special attention to the last sentence:

"you're also prone to present criticism against you as a result of a trendy choice to stand up to a perceived cult leader (this is a dangerous stance for oneself to adopt, even when true)."

I'm not sure to what extent this is a double instance of the recency effect (Anatoly's last sentence, and referring to Eliezer's most recent post), but it's something to be avoided regardless.

comment by Paul Crowley (ciphergoth) · 2009-03-22T11:38:29.161Z · LW(p) · GW(p)

you're also prone to present criticism against you as a result of a trendy choice to stand up to a perceived cult leader

Can you give an example where EY has been the first to bring up the whole cult thing?

Replies from: Anatoly_Vorobey
comment by Anatoly_Vorobey · 2009-03-22T12:13:38.955Z · LW(p) · GW(p)

I don't know if that ever happened, and I didn't mean to imply he had been. Suppose someone tells you that you've been acting like a cult leader. Even if you don't agree with the claim, you've just obtained a convenient meta-explanation of why people disagree with you: they're consciously standing up to the cult that isn't there; they're being extra contrarian on purpose to affirm their cherished independence. What I was trying to say is that it's generally dangerous to adopt this meta-explanation; you're better off refusing to employ it altogether or at least guard its use with very stringent empirical criteria.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-22T16:23:43.243Z · LW(p) · GW(p)

I wish I could agree with that, but you can't actually refuse to employ explanations. You might be able to refuse to talk about it, but you don't get a choice of which of several causal explanations gets to be true.

Replies from: CarlShulman, Anatoly_Vorobey, CarlShulman
comment by CarlShulman · 2009-03-22T19:05:51.654Z · LW(p) · GW(p)

You can try to correct for the self-serving temptation to overapply a certain explanation.

comment by Anatoly_Vorobey · 2009-03-22T17:24:24.228Z · LW(p) · GW(p)

Why not? Sometimes I manage to refuse to employ as many as five explanations before breakfast.

You can't pretend that the explanation doesn't exist if it occurred to you. But you certainly can refuse to act upon it, not just talk about it. Which among competing explanations for human behavior is true is almost never certain; it's perfectly possible to bias yourself against one common explanation and by doing so avoid the more harmful, and very probable, outcome of oversubscribing to it.

comment by CarlShulman · 2009-03-22T16:26:38.761Z · LW(p) · GW(p)

You can try to correct for the temptation for the self-serving application to overapply a certain explanation.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-22T10:38:14.881Z · LW(p) · GW(p)

So... just for the record... this post got up to #1 on HN, and then HN crashed, and is, so far as I can tell, still down a couple of hours later.

When you consider that the Less Wrong site format was inspired by HN, that LW is based on Reddit source code, and that Reddit is a Y Combinator company, I guess that writing about Paul Graham and then getting voted up on Hacker News exceeded the maximum recursion depth of the blogosphere.

Replies from: SoullessAutomaton, dfranke, MichaelGR
comment by SoullessAutomaton · 2009-03-22T12:56:53.611Z · LW(p) · GW(p)

This would be an excellent time for a "stack overflow" joke, if only Spolsky could be worked in somehow.

comment by dfranke · 2009-03-23T01:08:16.886Z · LW(p) · GW(p)

And here you are commenting on HN going down, and here's the guy who submitted this to HN replying to your comment.

comment by MichaelGR · 2009-03-22T17:42:25.330Z · LW(p) · GW(p)

"I guess that writing about Paul Graham and then getting voted up on Hacker News exceeded the maximum recursion depth of the blogosphere."

Just wait until PG writes an essay about all this...

comment by kurige · 2009-03-22T20:33:10.698Z · LW(p) · GW(p)

Picture of Eliezer in monk's robes (That is you, right?), stories about freemason-esque rituals, specific vocabulary with terms like "the Bayesian conspiracy".

It's all tongue in cheek, and I enjoy it. But if you're trying to not look like a cult, then you're doing it wrong.

Replies from: PrometheanFaun
comment by PrometheanFaun · 2013-10-05T23:12:47.783Z · LW(p) · GW(p)

if you're trying to not look like a cult, then you're doing it wrong

I disagree. I think it's so easy for a community with widespread, genuine conviction as to their shared radicles to look like a cult that, well, anyone willing to go through the rather extreme rigors of preventing anyone from seeing you as cult-like... methinks they protest too much. I say we are, though far from being a cult, cultlike. We are weird, and passionate, and that's all it takes.

comment by RobinHanson · 2009-03-22T13:14:07.390Z · LW(p) · GW(p)

It seems to me that only a few groups get the label "cultish", so it's not like people put the label on any group with an apparent leader. Such selective labels probably contain a lot of info, so it seems worth figuring out just what that info is. It is not wise to just find one group that gets the label which you think is fine, and then decide to ignore the label.

The straightforward approach would be to collect a dataset of groups, described by various characteristics, including how often folks call them "cultish." Then one would be in a position to figure out what this label actually indicates about a group.

comment by MichaelHoward · 2009-03-22T12:37:52.358Z · LW(p) · GW(p)

I find myself moved to break possibly the greatest taboo amongst our kind, but if this act of status suicide moves just one reader to action, the sacrifice is worth it.

OK, here goes...

"Gödel, Escher, Bach" by Douglas R. Hofstadter is the most awesome book that I have ever read.

Me too!

comment by Cassandra · 2009-03-22T17:51:39.765Z · LW(p) · GW(p)

This whole concept is confusing to me. I enjoy Eliezer's writing because it makes sense and is useful so it becomes part of my identity. I haven't found as many of his newer posts to be useful so a lower number of them are drafted into my identity. My 'self' is largely a collection of ideas and thoughts transmitted to me from other people and I don't find anything wrong with this. I do hope to produce useful knowledge myself but for right now I am educating myself to that point.

If I find a useful tool lying on the ground then I pick it up and use it, I do not try to recreate the tool from scratch in order to make it 'mine', which I feel is a meaningless concept. As long as my beliefs and skills pay for themselves in terms of useful benefits to my life I don't see the point in throwing them away because they came from someone else. I don't care who I am and I am not attached to any specific view of my self other than to try to pick the most effective tools to accomplish some core goals and values.

Replies from: thomblake
comment by thomblake · 2009-04-02T14:19:30.396Z · LW(p) · GW(p)

I don't care who I am

Then you're a very odd person.

I don't care who I am and I am not attached to any specific view of my self other than to try to pick the most effective tools to accomplish some core goals and values.

So you regard your 'core goals and values' to not be part of your self?

And you're really implying that you can just pick things about yourself simply as tools to accomplish your goals? I don't think I've ever met a person that can do that. Usually, there are facts about who we are and we need to work very hard to do anything about them.

comment by MichaelGR · 2009-03-22T07:50:53.163Z · LW(p) · GW(p)

IAWYC but I think you forgot to include something about jealousy in your analysis, even if few people would admit it's part of it.

I think it's very possible to greatly admire someone and at the same time feel some form of jealousy that inhibits the clear expression of that admiration. By saying that someone else is better (much better) than you are - especially at something that you value - you are in effect admitting to a lower status.

So all the forced disagreements and claims of independence are in effect just trying to signal that your status is high and you're not submissive, or something like that.

comment by [deleted] · 2009-03-22T17:09:07.548Z · LW(p) · GW(p)

When you read someone's writings or follow the things they do but don't actually KNOW them, it's very easy to get sucked into a sort of 'larger-than-life' belief about them.

Because they're famous (and they must be famous because you've heard of them), they're obviously different and special and above regular, normal people. I've found it takes conscious effort to remember that no matter how famous or smart or talented they are, in the end they're just some guy or girl, with the same flaws as everyone else.

And when you think someone's larger-than-life, it's easy to praise them highly, because you're not thinking of them as a normal person, you're thinking of them as ABOVE normal people. That they are special. In light of this, it's easy to see why praise for someone or something, no matter what it is, can be seen as cultish, and how you can fall into the trap of believing praise for anything is cultish.

Replies from: Nominull
comment by Nominull · 2009-03-22T19:57:50.592Z · LW(p) · GW(p)

Regarding this, it's really helpful when Eliezer mentions that he borrowed this or that part of his philosophy from a piece of anime fanfiction. It helps humanize him, or worse.

comment by AnneC · 2009-03-22T16:06:55.917Z · LW(p) · GW(p)

For what it's worth I don't think you've deliberately set out to become a "cult leader" -- you seem like a sincere person who just happens to be going about life in a rather nonstandard fashion. You've got some issues with unacknowledged privilege and such, and I've gotten impressions from you of an undercritical attractance to power and people who have power, but that's hardly unique.

I think mostly it's that you confuse people by sending off a lot of signals they don't expect -- like they think you must have some weird ulterior motive for not having gone to college, and instead of seeing your public discussion of your own intellect as merely the result of somewhat atypical social skills, they see it as inexcusable arrogance.

That said, because of my own negative experience(s) with people who've seemed, shall I say, rather "sparkly" at first, but who HAVE turned out to be seeking puppy-dog supplicants (or worse), I tend to be very very cautious these days when I encounter someone who seems to attract a fan club.

With you I've gone back and forth in my head many times as to whether you are what you first struck me as (a sincere, if a bit arrogant, highly ambitious guy) or something more sinister. It's been difficult to tell as you're sort of surrounded by this buzzing cloud of subcultural interference, but at this point I've sort of determined that if there's anything sinister there it's not a special sort above and beyond what you'd find in any given random middle/upper class American geek.

I think you get called out as a symbol of "smartypants white boys obsessed with trying to save the world from their basements" because you've ended up more visible than most. But, no, that doesn't make you a cult leader, it just makes you someone who would (like many of us living in wealthy, industrialized nations) benefit from making a greater effort to understand the effects of power and privilege.

Replies from: HughRistik, AnnaSalamon
comment by HughRistik · 2009-03-23T21:13:15.805Z · LW(p) · GW(p)

You've got some issues with unacknowledged privilege and such, and I've gotten impressions from you of an undercritical attractance to power and people who have power, but that's hardly unique.

I'm interested in what you mean here. Could you give examples?

comment by AnnaSalamon · 2009-03-22T21:17:44.832Z · LW(p) · GW(p)

someone who would (like many of us living in wealthy, industrialized nations) benefit from making a greater effort to understand the effects of power and privilege.

What benefits might you expect?

Replies from: AnneC
comment by AnneC · 2009-03-26T16:18:23.270Z · LW(p) · GW(p)

Well, for one thing, privilege is a major source of bias, and when a person doesn't even realize they (or those they admire) have particular types/levels of privilege, they're going to have a harder time seeing reality accurately.

E.g., when I was younger, I used to think that racism didn't exist anymore (that it had been vanquished by Martin Luther King, or something, before I was even born) and didn't affect anyone, and that if someone didn't have a job, they were probably just lazy. Learning about my own areas of privilege made it possible for me to see that things were a lot more complicated than that.

Of course it's possible for people to go too far the other way, and end up totally discounting individual effort and ability, but that would fall under the category of "reversed stupidity" and hence isn't what I'm advocating.

(And that's all I'm going to say in this thread for now - need to spend some more time languaging my thoughts on this subject.)

comment by pjeby · 2009-03-22T20:30:18.436Z · LW(p) · GW(p)

IAWY, and I actually already replied to your question about this in a comment, but:

One of the prime issues for me as a rationalist trying to learn about marketing (especially direct/internet marketing) was having to get over the fear of being a "dupe" pulled into a "scam" and "cult" situation. Essentially, if you have learned that some group you scorn (e.g. "suckers" or "fools" or whatever you call them) exhibit joining behavior, then you will compulsively avoid that behavior yourself.

I got over it, of course, but you have to actually be self-aware enough to realize that you chose this attitude/behavior for yourself... although it usually happens at a young enough age and under stressful enough conditions that you weren't thinking very clearly at the time.

But once you've examined the actual evidence used, it's possible to let go of the judgments involved, and then the feelings go away.

In other words, persons who have this issue (like me, before) have had one or more negative social experiences linking these behaviors to a disidentified group -- a group the person views negatively and doesn't want to be a part of. It's a powerfully irrational, compulsive motivator.

Hell, I had the same issue about exercise: I didn't want to be one of those shallow, vain jerks that likes to exercise!

It doesn't matter what the group is or what the behavior is, your brain picks up on the behaviors and attributes that signal participation in the groups you're around. And if you've decided you don't like the group... well, there you go.

Getting rid of it, however, is a matter of consciously re-evaluating the evidence and dropping your grudge against the target group...

Which means you're never going to talk people out of it directly -- trying to do so just makes people raise shields... "you're trying to get me to be one of THEM, aren't you?"

People are full of irrational compulsions that follow this pattern... and you're probably not immune, as pointed out by your Tolerating Tolerance article. Most likely, the reason you have to fight yourself to tolerate the tolerators of fools is your own intolerance of fools. If you let go of your emotional grudge(s) against fools (whatever those grudges might be), you'd find it to be a lot less of a problem.

(Then, you'd also be in a lot better position to ask others to give up their grudges... including the ones you wrote this post about.)

comment by Patrick · 2009-03-22T10:01:54.888Z · LW(p) · GW(p)

Alright! A few points that I can sort of disagree on, or feel were omitted from the essay. I'm being skeptical, not a cultist at all!

My fears aren't really that you're trying to foster a cult, or that it's cultish to agree with you. I got worried when you said that you wanted more people to vocalize their agreement with you and actually work towards having a unified rationalist front. For some reason, I had this mental picture of you as a supervillain declaring your intention to take over the world. So I reflected that I was doing things, somewhat unconventional things (which I focus on more), because of your advice -- but hey, it's good advice and I should probably take it (btw it's good to hear that cryonics is less expensive than I thought it was, sorry for making your life difficult by propagating false information). I mean, I followed similar patterns when I decided to learn Lisp as a first programming language.

I think I'm worried because you're charismatic, and that makes you much more persuasive than an ineloquent and unimpressive philosopher/AI hacker would be. Combine that with the fact that I get really happy and a little self-righteous when an eloquent speaker makes a really persuasive argument for something I agree with, and reading you, and other charismatic people in the atheist/revolutionary/technophile cluster, becomes a rather deep experience with uncomfortable parallels to religion.

I've thought it over, though, and this particular pattern probably won't cause too many problems. The reason is that Eliezer Yudkowsky isn't the only eloquent speaker in the world. I'm betting on something similar to "Three Stooges syndrome": I get shaped by so many intelligent and charismatic people that none of them can influence me into making large mistakes, because they'll probably call each other out on the more contentious claims and my bullshit detector will be reactivated.

I'm not even sure I agree with you more than average, but it does feel better than usual to agree with you, so that might be the source of the worry. As for your trip to the library of convenient rhetorical metaphors: it might be that the reason people are so anxious to say they're not copying you is your deep, piercing stare and badass coffee metaphors.

So, other than that, you've totally persuaded away my fear of your ability to totally persuade me. Well, it'll probably be gone in a week, once my subconscious stops thinking you're a supervillain.

Replies from: topynate
comment by topynate · 2009-03-22T11:11:48.198Z · LW(p) · GW(p)

For some reason, I had this mental picture of you as a supervillain

Maybe you've been primed? (see the end of the post)

comment by AnnaSalamon · 2009-03-22T08:05:31.661Z · LW(p) · GW(p)

I agree with your conclusion, and I love your library allegory. It's pretty clear that America fears strong emotions in general, and also that "our type" learns cached patterns of ritually approved nonconformity.

That said, some may be balking, not at admiring someone hugely, but at forming nearly their entire manner of evaluating ideas from a single person, without independent sources of evidence that can label that person "trustworthy". Anne Corwin reports fearing networks of abstractions that distance people from their own concrete experiences and root-level knowledge. Carl's recent post is in part an attempt to say that many of the ideas you've expressed are supported by other, labeled-as-trustworthy sources, and folks who believe those ideas needn't feel as though they're putting all their error-catching eggs in one basket.

Other people may just fear changing their beliefs and actions, especially changing those beliefs and actions in a manner affected by an unknown person's intent. When I ran into some pamphlets on critical thinking seven years ago, and I heard arguments that I should (gasp) only believe claims supported by evidence... the prospect of only believing what was supported by evidence felt like a frightening loss of control. "It's a cult! It could take over my mind!" may just express fear at shifting which mental system (the "your choice" system, or this scary foreign "rationality/Eliezer" system) will control one's future beliefs and actions.

These fears may also apply to people's relationships to Paul Graham, I'm not sure.

comment by CarlShulman · 2009-03-22T08:16:50.840Z · LW(p) · GW(p)

"But upon reflection, I strongly suspect that I would feel no barrier to praising Gödel, Escher, Bach even if I weren't doing anything much interesting with my life."

You don't feel yourself to be in status competition with Hofstadter do you? Or E.T. Jaynes, for that matter. Think about effusively praising Nick Bostrom as the last best hope for the survival of humane values, instead.

"I'm hoping in particular that someone used to feel this way - shutting down an impulse to praise someone else highly, or feeling that it was cultish to praise someone else highly - and then had some kind of epiphany after which it felt, not allowed, but rather, perfectly normal."

I've always disliked generic praise, of the form 'X is great,' but find it easy enough to talk about objective standing within a population (e.g. 'in the Y percentile for math capabilities'), including the population of people I know (top Z% of my close acquaintances for social skills). The first leaves a bad taste of uncriticality, but I'm fine with saying that someone is probably even one in a million or one in a hundred million with respect to some formula aggregating across several features. I can't recall an epiphany in which I radically changed in this regard.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-22T08:25:36.142Z · LW(p) · GW(p)

You don't feel yourself to be in status competition with Hofstadter do you? Or E.T. Jaynes, for that matter. Think about effusively praising Nick Bostrom as the last best hope for the survival of humane values, instead.

Point taken. But that example isn't generic status competition, it's role competition. Are so many people calling PG a cult leader really in role competition with him? For what? Are there so many commenters at this site in role competition with me? I think you have a valid point here about a factor that would make admiration directed at another seem bad, but can it plausibly be that particular factor which is at work here?

(Edited to make clear the difference between status competition and role competition.)

Replies from: CarlShulman
comment by CarlShulman · 2009-03-22T08:31:44.515Z · LW(p) · GW(p)

They're socially engaged with him and his web community. Status competition doesn't have to mean preparation for direct overthrow, it can also mean efforts to reduce the size of status gaps relative to current superiors. Demonstrating or admitting inferiority to someone in the immediate social hierarchy pushes the low lower, while successfully tearing down a superior raises the tearer while lowering the torn, even if only marginally.

Replies from: topynate
comment by topynate · 2009-03-22T08:35:52.985Z · LW(p) · GW(p)

Great point, and I think that the "competition", if there is competition, isn't with PG or EY but with everyone else.

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2009-03-22T11:04:03.428Z · LW(p) · GW(p)

There is an essay about this by Pavel Curtis, creator of LambdaMOO - he would frequently find that newcomers would respond to his perceived status by being exaggeratedly rude to his character, showing off that they were prepared to stand up to the Man, so long as of course they could do it in perfect safety under the cover of anonymity.

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2011-08-03T11:21:56.505Z · LW(p) · GW(p)

Found it: Mudding: Social Phenomena in Text-Based Virtual Realities

Most players on LambdaMOO, for example, upon first encountering my wizard player, treat me with almost exaggerated deference and respect. I am frequently called 'sir' and players often apologize for 'wasting' my time. A significant minority, however, appear to go to great lengths to prove that they are not impressed by my office or power, speaking to me quite bluntly and making demands that I assist them with their problems using the system, sometimes to the point of rudeness.

Replies from: Rain
comment by Rain · 2011-08-03T16:31:38.448Z · LW(p) · GW(p)

'Exaggerated rudeness' could also be a product of the greater internet dickwad theory.

Attitudes toward MUD wiz teams are also part politics, since the leaders are often dictators of the local environment.

comment by Psy-Kosh · 2009-10-30T16:12:10.891Z · LW(p) · GW(p)

Figured, since this was linked to again, that I might as well say some of what I think on this:

My reaction is more, well, a couple of things, but part of it could be described like this: Yes, I do indeed admire you and think you're cool... and my natural instinctive reaction to you is kinda, well, fanboyish, I guess. Hence I try to moderate that... TO AVOID BEING ANNOYING... that is, to avoid, say, annoying you, for instance.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-10-30T16:56:48.812Z · LW(p) · GW(p)

If you can do that quietly without anyone noticing, you're doing it right. If you make a big deal out of it to prove something to other people, you're doing it wrong. Should be obvious, really.

Replies from: Psy-Kosh
comment by Psy-Kosh · 2009-10-30T17:00:37.131Z · LW(p) · GW(p)

Well, yes. :)

comment by Annoyance · 2009-03-23T15:47:16.571Z · LW(p) · GW(p)

If you have 'teachings' rather than suggestions or opinions, and you can't support those claims in a systematic and explicit way, then it doesn't much matter whether you intended to propagate a cult - that's precisely what you're doing.

comment by Ken · 2009-03-22T16:46:41.125Z · LW(p) · GW(p)

I'm afraid to read GEB now. It's been built up so high that the only reactions I could possibly have are "as good as everybody else thinks it is" or "didn't live up to expectations", with the latter being far more likely.

Replies from: Anatoly_Vorobey, CronoDAS, thomblake
comment by Anatoly_Vorobey · 2009-03-22T17:48:58.216Z · LW(p) · GW(p)

Let me try to help you. Many people who praise GEB in the highest terms and recommend that everyone read it never finished it. Many read all the dialogues, but only some of the chapters. I have absolutely no data to support turning either of the previous "many" to "most", but wouldn't be surprised by either possibility.

GEB's most important strength, by far, is in giving you a diverse set of metaphors, thought-patterns, paradoxes and ways to resolve them, unexpected connections between heretofore different domains. It enlarges your mental vocabulary - quite forcefully and wonderfully if you haven't encountered these ideas and metaphors before. It's like a very, very entertaining and funny dictionary of ideas.

The exposition of various topics in theory of computation, AI, etc. that it also contains is not as important by comparison, and isn't the best introduction to these topics (it's still good and may well be very enjoyable, depending on your background and interest).

So there's no reason to fear reading GEB. You'll chuckle with recognition at the jokes, metaphors, notions that you've already learned elsewhere, and will be delighted at those you've never seen before. Read all the dialogues; if some of the chapters bore you, resist guilt tripping and skip a few - you'll come back to them later if you need them.

Replies from: scotherns
comment by scotherns · 2009-03-23T11:59:21.356Z · LW(p) · GW(p)

I'm about halfway through. I am finding the chapters to be much more interesting than the dialogues. The style the dialogues are written in seems to be rather stilted/forced and grates somewhat. They do seem to be useful metaphors for understanding some of the trickier chapters, so I can see the merit in them.

Replies from: gwern
comment by gwern · 2009-08-10T12:53:09.509Z · LW(p) · GW(p)

The style the dialogues are written in seems to be rather stilted/forced and grates somewhat.

/me looks up from the 'Crab Canon'.

Wait, what?

comment by CronoDAS · 2009-03-23T03:40:24.706Z · LW(p) · GW(p)

I thought GEB was a very good book, but much of it consisted of things I had already learned about from several other sources (formal logic, Gödel's incompleteness theorem, and some others), and I also found reading it to be slow and effortful -- not the kind of book that keeps me compulsively turning pages, hungry for more. I also don't know how well it conveys its concepts to someone who has never heard them before, because I already understood many of the concepts that GEB tries to painstakingly explain to an audience of people who haven't already taken three university courses on formal logic. (And I did read every last word of it!)

GEB has been described as "an entire humanistic education in one volume", and that's not far from the truth; I could easily imagine using it as the textbook for several consecutive semesters' worth of university classes. It starts with some basic concepts that the proverbial smart 12-year-old could understand, and builds on them, step by step, until the book is covering concepts and applications at the level of graduate school. The book is obscenely ambitious in its scope -- imagine a single book that assumes the reader has never taken a course in algebra and aims to teach that reader enough that, by the time he or she reaches the end, he or she can understand and discuss university-level calculus, and you'll picture something much like GEB.

So, yeah, GEB really is amazing, but it's also a headache-inducing monstrosity that will try to cram your head with concepts until it explodes.

comment by thomblake · 2009-04-02T14:21:30.595Z · LW(p) · GW(p)

I agree with some of the sentiments below. Also, I became a philosopher before reading GEB, and didn't really find anything particularly enlightening in it. I still recommend "The Future and Its Enemies" much more highly, and it's more of a fun read (even though it's a bit dated now).

comment by topynate · 2009-03-22T08:00:33.692Z · LW(p) · GW(p)

IMO being accused of wanting to be a cult leader is a pure double bind. You either say "yes, I do" and then you're a cult leader, or you say "what? that's crazy because of X, Y, Z..." and then people point at your protestations as evidence that their arguments have some minimal credibility (I am sure someone will do this to EY at some point). It is, prima facie, evident to me that talking to people on the internet about rationality is a poor method of getting acolytes (and even if it were a good one for some people, the Objectivists already picked up those guys) so if you are, in fact, a prospective cult leader, Eliezer, it seems to me you've had a serious failure of rationality and no-one should join. Everyone else, please note that this is exactly what I would say if I were trying to recruit you into my own, competing cult. Including the previous sentence.

In answer to the end of your post, even though this site is very young, I can already think of occasions where I've hesitated to speak up on your behalf because I anticipated being branded a sycophant. I've now decided that I would rather let otherwise interesting people exclude themselves from LW by adopting these "it's a cult!" beliefs than attempt to persuade them to stay or join in those circumstances.

Replies from: gwern
comment by gwern · 2009-08-10T13:02:08.790Z · LW(p) · GW(p)

Of course, being accused of being a cult is itself weak Bayesian evidence of being a cult (much like being accused of child abuse). Being accused would raise people's estimates, and the only way to lower them is a good defense; that their estimates might not go all the way back to the originals is not enough for the choice to deny or not deny cultishness to be a double-bind, I think.

comment by Steve_Rayhawk · 2009-03-23T11:02:47.367Z · LW(p) · GW(p)

If there's an aspect of the whole thing that annoys me, it's that it's hard to get that innocence back, once you even start thinking about whether you're independent of someone.

Cross-referencing my comment on a different post for a related idea:

Your brain remembers which "simple" predictor best described your decision [. . .]

Your brain learns to predict other peoples' judgments by learning which systems of predictive categories other people count as "natural". If you have to predict other peoples' judgments a lot, your brain starts to count their predictive categories as "natural". The effect can be viral [. . .] and it can change how you think about yourself.

comment by roland · 2009-03-22T20:38:12.547Z · LW(p) · GW(p)

Ok, I'm coming out and will admit that I admire you, Eliezer, very highly. I think you are the one who taught me the most about rationality and what intelligence is all about.

Now, I admit that in my past I have fallen into the "adore the guru" trap, so I still have this fear in my head and am careful not to make the same mistake again. The cult threads here are helping me to evaluate my position carefully.

But I like what you wrote about that innocence of being able to experience real admiration and excitement. I think if you let your critical thinking stifle that, you are losing some of the important driving forces in your life. We should be happy if we experience those strong positive emotions that pull us in the right direction, provided they are justified by reality:

http://www.overcomingbias.com/2007/04/feeling_rationa.html

comment by MichaelHoward · 2009-03-22T13:49:31.563Z · LW(p) · GW(p)

You've changed my beliefs and thinking more than anyone outside my family, by a pretty huge margin. This makes me far more likely to raise something to the level of being worth paying attention to just because you've recommended it (as it should), but it also makes me careful on a gut level every time I consider adopting yet another belief from you.

I think this is partly because of what you describe in this post, but partly because I know that a lot of the existing beliefs inclining me to accept the new one came from you in the first place. I'm not sure how much I should compensate for this, as it's mostly happening at an emotional level and is working counter to other emotions.

comment by alex_zag_al · 2013-12-09T01:41:45.303Z · LW(p) · GW(p)

my theory about myself:

I don't think people will believe me if they recognize my views as the typical LW-cluster views. They'll just dismiss them.

Which is really rational of them, actually. I think I use the same heuristic. Once I see that someone's beliefs come from a political affiliation, they're weaker evidence to me.

Like... if someone's trying to convince me out of global warming, but then I learn that she's also against affirmative action and immigration and regulation on finance. At first I might have thought she read convincing scientific arguments, now I think she's just a Mark Levin-cluster Republican.

See, now I can explain why she has those ideas without the hypothesis that she knows anything I don't about climate science. And if people recognize me as part of the LW cluster, they can explain my beliefs about cryonics without supposing I know anything more about cryonics than they do.

It's a good heuristic, and it's based on how political affiliations really work. I dodge it with a self-conscious, non-culty image.

comment by taw · 2009-03-22T09:12:26.226Z · LW(p) · GW(p)

Word "cult" seems to be used in very vague sense by everyone, and people have different definitions. Here's something I wrote about Paul Graham's and a few other "cults". It's only vaguely relevant, as I used the label "cult" differently.

If you are not into Paul Graham's cult / meme complex, and you hear people who really are -- talking about how working 100 hours a week on a built-to-sell startup is the best way to prove your worth as a hacker and a human being -- they really sound like "cult" members.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-22T09:16:54.405Z · LW(p) · GW(p)

Can you link to an example of someone who sounds like a Paul Graham cult victim?

Replies from: Anatoly_Vorobey
comment by Epiphany · 2012-10-03T03:01:07.344Z · LW(p) · GW(p)

Explanation: Emotional overexcitability, a trait common to gifted people (and yes, there is good reason to believe that most LessWrongers are gifted), may cause LW and Hacker News fans to be extra excitable and intense. You've probably heard that gifted people tend to be more emotional? Well, on your LessWrong survey your respondents claimed an average IQ in the 140s, well beyond the minimum for every IQ-based definition of giftedness. If these readers are unusually emotionally intense, as gifted people tend to be, it's likely their unusual "electricity" sets off less excitable people's cult radar, just as intensity can set off so many other radars, getting one judged as everything from a crybaby to a drama queen.

I think it's totally normal for excitable people to have passionate feelings about all kinds of things, including praising other people highly. I think of it as "normal for gifted people" - this is a different kind of normal. Check out this page from the Davidson Institute (a school for gifted children) which describes emotional overexcitability as "heightened, intense feelings, extremes of complex emotions, identification with others' feelings, and strong affective expression" and explains that we should accept them:

"Accept all feelings, regardless of intensity. For people who are not highly emotional, this seems particularly odd. They feel that those high in Emotional OE are just being melodramatic. Though we are all melodramatic on occasion, people with high Emotional OE really do feel their emotions with remarkable or atypical strength. If we accept their emotional intensity and help them work through any problems that might result, we will facilitate healthy growth."

From:

Overexcitability and the highly gifted child

Replies from: None
comment by [deleted] · 2012-10-03T03:25:57.495Z · LW(p) · GW(p)

The IQ survey is obviously flawed.

Replies from: Epiphany
comment by Epiphany · 2012-10-03T03:35:35.613Z · LW(p) · GW(p)

We all know that a survey is not proof. It was intended as a thought-provoking clue, not as proof. Everyone here is capable of considering whether LW and Hacker News are full of gifted people. This is niggling.

Replies from: None
comment by [deleted] · 2012-10-03T03:51:02.177Z · LW(p) · GW(p)

Who said anything about absolute proof? The IQ survey means less than nothing about the respondents' IQ distribution. Mentioning it at all is a red herring at best.

Replies from: wedrifid
comment by wedrifid · 2012-10-03T04:29:45.476Z · LW(p) · GW(p)

Who said anything about absolute proof? The IQ survey means less than nothing about the respondents' IQ distribution.

Less than nothing? I don't believe you. In fact, I don't even believe you believe you. Not only that, the claim doesn't even make sense: if the 'meaning' falls into the negatives, that seems to be a meaning all of its own.

Replies from: None
comment by [deleted] · 2012-10-03T04:46:10.251Z · LW(p) · GW(p)

I do enjoy a good round of taking someone too literally for comedic value, too.

Replies from: wedrifid
comment by VAuroch · 2013-12-09T10:02:49.559Z · LW(p) · GW(p)

(note: Hofstadter does not have a cult)

Douglas Hofstadter's research group is apparently quite cultish. It's close-knit, dominated by a single person, intolerant of disagreement, and has little intellectual interaction with the remainder of the field.

This doesn't make GEB less than excellent. It merely partially explains why they haven't made much progress since.

comment by handoflixue · 2011-07-18T20:21:05.535Z · LW(p) · GW(p)

My personal test for whether someone is my "cult leader" or just a good teacher is how I react when I think they're wrong. If they are merely a teacher, then I will sit down and work out exactly why they're right from base principles, and I'll admit it if I'm confused or if I think they are genuinely wrong. Given how many times in the sequences I've spent a few hours working things out, I feel safe here.

A good teacher says "here is something worth understanding" rather than "here is the teacher's password" - it is a willingness to see one's ideas tested, and a willingness to accept when they are found lacking.

A really great teacher is one where the student can reasonably trust that the teacher is probably right, because working it out themselves usually produces the same answer. I respect you because this has thus far been the case with you :)

comment by Cameron_Taylor · 2009-03-23T14:25:48.809Z · LW(p) · GW(p)

I know the opposite of stupidity is still stupidity, but every time I see some idiotic attempt to gain status by pointing out how "everyone else except me seems to revere Eliezer too much", I have to restrain myself from reacting in the other direction and worshiping the guy.

comment by Alicorn · 2009-03-22T16:39:38.985Z · LW(p) · GW(p)

I used to have the idea that finding flaws in something (a piece of writing or entertainment or an idea or a person) made me better than the person or the creator of the thing I was criticizing. Then I realized two things which got me to stop: 1) Critics are parasites; they don't generally produce anything that valuable and entertaining themselves, and even beautifully written reviews are pretty low on my list of things to read for edification or fun. 2) When I go around finding flaws in everything, I stop enjoying it, and living a life where I can't enjoy anything I read or hear or see is not pleasant.

So now my strategy is to like things and people I'm inclined to like, but remain confident in my so-far-unfailing ability to find fault with them if I decide I need or want to do that. Being accused of slavish devotion to something is one of the things that can make me want to turn critic-mode back on, even though that only winds up proving a lack of slavish devotion after the fact (since I will tend not to notice the sorts of flaws I point out in critic-mode until I actually turn on critic-mode).

All of that having been said, Paul Graham is awesome, so is Eliezer, and GEB is overrated (either that or it was ruined for me when I took a class on it with non-philosophers teaching it; I do have a history of hating anything I'm obliged to read for school).

Replies from: Kaj_Sotala, Yvain
comment by Kaj_Sotala · 2009-03-22T21:46:51.504Z · LW(p) · GW(p)

Critics are parasites; they don't generally produce anything that valuable and entertaining themselves

Debunking mistaken hypotheses is just as important as coming up with new ones. Otherwise our heads would be so filled with confused theories that we could never develop the correct ones.

comment by Scott Alexander (Yvain) · 2009-03-22T21:51:50.657Z · LW(p) · GW(p)

When I go around finding flaws in everything, I stop enjoying it, and living a life where I can't enjoy anything I read or hear or see is not pleasant.

Having recently posted on the relevance of Pope's poetry to rationalism, I can't help quoting him one more time here:

Avoid extremes, and shun the faults of such
Who still are pleased too little or too much;
Those minds, like stomachs, are not always best
That nauseate all, and nothing can digest.

comment by agola · 2009-03-22T08:17:23.310Z · LW(p) · GW(p)

Leaving aside the valid points about overrating particular experts when you have limited exposure to opposing viewpoints on the subject matter: cult-like behavior doesn't even require an intentional cult leader. Paul Graham doesn't have to willfully cultivate that type of following for some of it to arise spontaneously as a function of the social structures and participants around him.

Frequently agreeing with someone who has a lot of good ideas, and who also has high status in a community that you're a member of, is not inherently bad. But once you get caught up in the social/community aspects of the group, there can be a profound and not-entirely-conscious motivation to value opinions of high-status members more, and to be less skeptical of their ideas, than you would of someone lower status.

Of course it doesn't follow that you should automatically reject what they do, purely for the sake of disagreement with a popular (or unpopular) figure. But if you find yourself motivated to agree with them an overwhelming percentage of the time, it's good to be alert to the possibility that you may unconsciously be maintaining a cognitive blind spot for yourself, or at least a blurry spot. Social instincts can be very powerful, and sometimes entirely automatic.

comment by Marshall · 2009-03-22T10:49:09.250Z · LW(p) · GW(p)

It seems like you are in the middle of a "father-murder", Eliezer. The hounds are baying and you are being critiqued. I predict that your lament here will not stop the flow.

I would like to suggest that you let it happen. Being leader of the pack is not a rational place to be, and the more you loosen your grasp, the greater the chances of a broader rationality in this, our community.