Mind Control and Me
post by Patrick · 2009-03-21T17:31:33.340Z · LW · GW · Legacy · 34 comments
Reading Eliezer Yudkowsky's works has always inspired an insidious feeling in me, sort of a cross between righteousness, contempt, the fun of understanding something new, and gravitas. It's a feeling I have found pleasurable, or at least addictive enough to carry me through all of his OB posts, and it makes me less skeptical and more obedient than I normally would be. For instance, in an act of uncharacteristic generosity, I decided to make a charitable donation on Eliezer's advice.
Now this is probably a good idea, because the charity will probably help guys like me later in life, and of course it's the Right Thing to Do. But the bottom line is that I did something I normally wouldn't have because Eliezer told me to. My sociopathic selfishness was acting as the canary in the mine of my psyche.
Now this could be because Eliezer has creepy mind control powers, but I get similar feelings when reading other people, such as George Orwell, Richard Stallman or Paul Graham. I even have a friend who can inspire that insidious feeling in me. So it's a personal problem, one that I'm not sure I want to remove, but I would like to understand it better.
The style and the sort of ideas in the writing are probably pushing buttons that help create the feeling, and I'll probably go over an essay or two and dissect it. However, I'd like to know who, if anyone at all, I should let create such feelings in me, and when. Can I trust anyone that much, even if they aren't aware they're doing it?
I don't know if anyone else here has similar brain overrides, or if I'm just crazy, but it's possible that such brain overrides could be understood much more thoroughly and induced in more people. So what are the ethics of mind control (for want of a better term), and how much effort should we put into stopping such feelings from occurring?
Edit Mar 22: Decided to remove the cryonics example due to factual inaccuracies.
34 comments
Comments sorted by top scores.
comment by PhilGoetz · 2009-03-21T20:12:05.645Z · LW(p) · GW(p)
Fiction is such a brain override. Nobody would have heard of Ayn Rand if she had published her ideas purely as non-fiction.
I don't mean this as a criticism of Ayn Rand's ideas. Just today, I looked on a filesharing network for works by Ayn Rand. Copies of "Atlas Shrugged" outnumbered copies of her other, more interesting works at least 10 to 1.
Eliezer is one of the more rational people out there, yet I think he's given more examples of fiction that influenced him than of non-fiction that did.
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-21T19:21:36.904Z · LW(p) · GW(p)
The idea that cryonics costs "hundreds of thousands of dollars" would appear to be one of those ideas that sounds so truthy that you cannot eradicate it no matter how many times you refute it. If you're young and you go with the cheapest option, you're going to pay maybe $150/year for way more life insurance than you need, plus $120/year or $1250 upfront for Cryonics Institute membership; and of the life insurance, $50,000 goes to CI when you actually die.
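To put the quoted figures together, here is a minimal back-of-the-envelope sketch; it only restates the 2009 numbers from the comment above, and actual costs will vary by age, insurer, and year:

```python
# Back-of-the-envelope yearly cost of the "cheapest option" described above.
# All figures are the 2009 numbers quoted in the comment, not current prices.

life_insurance_per_year = 150    # rough term life insurance premium for a young buyer
ci_membership_per_year = 120     # Cryonics Institute yearly membership option
ci_membership_lifetime = 1250    # alternative one-time lifetime membership fee
ci_payout_at_death = 50_000      # portion of the insurance benefit assigned to CI

yearly_option = life_insurance_per_year + ci_membership_per_year
print(f"Yearly-membership route: ~${yearly_option}/year")  # ~$270/year
print(f"Lifetime-membership route: ~${life_insurance_per_year}/year "
      f"plus ${ci_membership_lifetime} upfront")
print(f"Benefit assigned to CI at death: ${ci_payout_at_death:,}")
```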
Replies from: PhilGoetz
↑ comment by PhilGoetz · 2009-03-21T20:11:13.006Z · LW(p) · GW(p)
I've spoken to several people about their cryonics insurance lately, and most (those around 40 years old) are paying more than $100/month. Those in their 20s pay a lot less.
Insurance companies are not going to give you a life insurance policy that costs you less than their inevitable payout. Unlike other insurance policies, they have to pay up on all these policies eventually. The only way that signing up for cryonics this way beats investing your money now and using the proceeds to pay for cryonics later is that you are covered between now and then.
So it makes sense to defer signing up for cryonics until you think the preservation process is good enough to work.
Replies from: JulianMorrison
↑ comment by JulianMorrison · 2009-03-23T20:51:51.117Z · LW(p) · GW(p)
Life insurance is usually their bet that you won't die early, versus your bet that you will. It times out and pays nothing if you're still alive at some agreed age. They don't always have to pay up eventually; otherwise there would be no business model. They sell you a policy whose expected payout, in the probabilistic sense, is worth a shade less than what you pay for it, with the difference being their margin.
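To make the expected-value point concrete, here is a toy sketch with purely hypothetical numbers (the death probability, payout, and margin below are invented for illustration, not actuarial figures):

```python
# Toy expected-value model of term life insurance pricing.
# All numbers below are made up for illustration; real pricing uses mortality
# tables, interest rates, and per-policy overhead.

p_death_during_term = 0.02   # hypothetical probability the insured dies before the term ends
payout = 50_000              # hypothetical death benefit
margin = 200                 # hypothetical profit/overhead loading per policy

expected_payout = p_death_during_term * payout   # 0.02 * 50,000 = 1,000
total_premiums = expected_payout + margin        # insurer charges ~1,200 over the term

print(f"Expected payout to the policyholder: ${expected_payout:,.0f}")
print(f"Total premiums charged over the term: ${total_premiums:,.0f}")
# If the insured outlives the term, the policy pays nothing and the premiums are kept.
```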
comment by anonym · 2009-03-22T19:05:59.210Z · LW(p) · GW(p)
I think a large part of what you are describing is nothing more than profound (justifiable) respect, and there is nothing wrong with it unless it goes beyond what your experience and the evidence supports.
It is perfectly rational to be somewhat less skeptical of and somewhat more obedient to someone that you have that sort of respect for, as long as the respect was well earned (and must continue to be earned) and your updating is accurate and in proportion to past experience. [I use Marcello's Anti-Kibitzer script as one way of not being overly influenced in this way.]
I'm not saying that there aren't other buttons being pushed that might contribute to your ill feeling. Perhaps Eliezer's sometimes oracular tone or his evoking of samurai and zen metaphors affects you in ways that cause you to be suspicious of your rationality.
comment by loqi · 2009-03-21T18:50:47.459Z · LW(p) · GW(p)
Here's a good opportunity to test your motivations. Think about your current rationale for cryonics. Think specifically about which of the doubts you currently harbor, if any, would need to be settled before parting with your money. Roughly how much time and energy would you devote to this?
Once you've done this, read this comment on OB (if you've done so already, my comment is probably a waste). The issue isn't so much the accuracy of those claims, but the nature of them. Did you have a good reason to believe that they're handling their money wisely prior to being forced to consider it? Given the above exercise in settling your doubts, do you think you would have done the necessary homework to find out prior to signing up? If not, then I'd be inclined to treat that as evidence that your decision was at least partially the product of irrational persuasion.
Replies from: Eliezer_Yudkowsky
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-21T19:42:09.706Z · LW(p) · GW(p)
Loqi, this is a good illustration of the "rationalist" coordination failure mode where people demand that the group be perfect before joining, while disabling the natural human tendency to think the group is perfect. This is not like investing money in a hedge fund where there are hundreds of competing possibilities. Cryonics Institute or Alcor, pick one. CI, which is the only option I know, gives a pretty good impression of transparency (I could look up their cost of liquid nitrogen if I liked). But lack of the information you're demanding just isn't something that should stop you from signing up with Alcor or CI.
I'll have to do a post about this at some point - "Join the Flawed Herd".
Replies from: loqi
↑ comment by loqi · 2009-03-21T20:13:52.808Z · LW(p) · GW(p)
This is not like investing money in a hedge fund where there are hundreds of competing possibilities. Cryonics Institute or Alcor, pick one.
That only holds if you assume there exists a viable cryonics option. When making a decision like this, an important question to ask is: "Am I being had?" I tend to think both organizations are probably legitimate, but their legitimacy or lack thereof is fundamental to the rationale behind signing up for cryonics in the first place, and not easily dismissed as a "flaw".
But lack of the information you're demanding just isn't something that should stop you from signing up with Alcor or CI.
The above reservations aside, my point wasn't really that it should stop you from signing up (hence "The issue isn't so much the accuracy of those claims, but the nature of them"); my point was that it seems very rational to obtain that information in the course of evaluating your options, and that if you can predict you wouldn't bother to do so, you've possibly identified an irrational operator acting on your decisions.
comment by jimrandomh · 2009-03-21T18:01:13.986Z · LW(p) · GW(p)
That feeling is peer pressure, and it's found in everyone, as shown by Asch's Conformity Experiment. When we hear a group of people express an opinion, or a figurehead who presumably represents a group of people, we are biased towards agreement. That's because when someone disagrees with a group, it is more likely that they have made a mistake than that everyone else in the group has.
But don't mistake being convinced by valid arguments for being convinced by conformity bias! Properly compensating for conformity bias means not letting groups convince you of things that are false, but if an argument is valid and its conclusion is true, then changing your mind to conform with it is the right thing to do. So trust, but verify; let authors who tell you true things influence you, avoid authors who tell you false things, and sanity-check everything you read, no matter who wrote it.
Replies from: talisman, Richard_Kennaway
↑ comment by talisman · 2009-03-21T19:08:19.403Z · LW(p) · GW(p)
It's much more than peer pressure. Eliezer, along with the other authors, uses a confident, rhythmic, almost biblical style, which is very entertaining and compelling. You don't just learn deep things with EY; you feel like you're learning deep things. Robin Hanson's thought is incredibly deep, but his style is much more open, and I would guess you find his writings not to have this property.
Robin and Eliezer have debated writing style over at OB, and I highly recommend you read that debate, Patrick.
You should also, in my opinion, be very cautious about this feeling; there's a reason that religious writings have this style, and I would bet you would be less able to find logical gaps in something written in this style. I had a similar set of experiences as an adolescent Randian.
Replies from: Eliezer_Yudkowsky, Roko
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-21T19:44:33.283Z · LW(p) · GW(p)
I should note that if I'm teaching deep things, then I view it as important to make people feel like they're learning deep things, because otherwise, they will still have a hole in their mind for "deep truths" that needs filling, and they will go off and fill their heads with complete nonsense that has been written in a more satisfying style.
Replies from: Yvain, SoullessAutomaton, cousin_it, pjeby
↑ comment by Scott Alexander (Yvain) · 2009-03-22T13:00:24.720Z · LW(p) · GW(p)
Please make a top-level post on this. Not because it needs any more explanation, but because everyone needs to see it, and I need a detailed and official-looking version of it to link all of my friends to (especially those who are teachers).
↑ comment by SoullessAutomaton · 2009-03-22T00:23:46.576Z · LW(p) · GW(p)
This is a significant point. Even granting as accurate every charge levelled at Eliezer's writing (and at Yvain, who has adopted much the same style, and many other people outside this community), it's not obvious that there's anything inherently wrong with it.
In particular, I think Robin often does his arguments a disservice by deliberately presenting them in a way that hinders their uptake.
Replies from: talisman
↑ comment by talisman · 2009-03-23T01:42:46.676Z · LW(p) · GW(p)
To be clear, my comment above isn't meant to be a "charge"! Among other things, Eliezer is exceptionally gifted at making ideas interesting and accessible in a way that Robin isn't at all. I'm looking forward to his book coming out and changing the world.
I personally love his stuff, and think it's great 1) for people that are completely new to these ideas; 2) for people that are fairly advanced and have the ideas deep in their bones.
For people in between, I sometimes feel like his writing presents too much of a glide path: it answers too many questions for the student, guides the reader too unerringly to the answers, and presents a polished surface that makes it hard for inexperienced learners to understand the components of the thought process and learn to do the same themselves.
↑ comment by Roko · 2009-03-21T20:04:48.640Z · LW(p) · GW(p)
Eliezer's writing is clearly not absolutely persuasive, because it didn't persuade me, even when it was correct!
↑ comment by Richard_Kennaway · 2009-03-21T18:52:43.646Z · LW(p) · GW(p)
Your link says that three quarters of Asch's subjects made at least one conforming answer. That is a long way short of "everyone".
comment by topynate · 2009-03-21T17:41:02.768Z · LW(p) · GW(p)
Maybe you're particularly aware of what it feels like from the inside to be convinced of something?
comment by taw · 2009-03-21T18:10:03.728Z · LW(p) · GW(p)
I blame social status. Well, I blame social status and other primate tribal psychology for most of the biases people have. You're basically accepting Eliezer as your personal guru and tribal leader, and following him mindlessly, especially when others seem to be doing so too. That worked great when you were trying to get your group into power in a tribe; it's a pretty stupid thing to do these days.
Replies from: Eliezer_Yudkowsky, pjeby
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-21T20:33:50.152Z · LW(p) · GW(p)
I wish I understood why some rationalists find so terrifying the prospect that they might be part of a cult. Or why they so desperately (and publicly!) reject the prospect that anyone might see them as having a leader. I mean, I understand that there's a normative component to both, but I don't get where the sheer power of this fear comes from. It's probably an important aspect of our lack of coordination, but I don't understand it.
Replies from: taw, AnneC, pjeby, Nick_Tarleton, CarlShulman, rwallace
↑ comment by taw · 2009-03-22T03:06:51.441Z · LW(p) · GW(p)
There are some good reasons for being terrified. We are tribal animals. We don't really care about the truth as such, but we care a lot about tribal politics. We can pursue truth when we have a very high degree of disinterest in what the truth will be, but that's a really exceptional situation. When we care about the shape of the truth, we lose a lot of rationality points, and tribal political forces make us care a lot about things other than truth. It might be our strongest instinct, even stronger than individual survival or the sex drive.
I agree with you that it has its downsides, but I really don't see how you can accept the politics and stay rational. I cannot think of many examples of that.
I'm also really disappointed by so many status indicators all over lesswrong: top contributors on every page, your social status points (karma) on every page, user names and points on absolutely everything, vote up / vote down. You might think we're doing fine, but so was reddit when it was tiny; let's see how it scales up. I think we should get rid of as much of that as we can. reddit's quality of discussion is a lot lower than 4chan's, even though it's much smaller.
And this is a great example of what you once posted about: different people are annoyed by different biases. You seem to think social status and politics are mostly harmless and may even be useful; I think they're the worst poison for clear rational thinking, and I haven't seen many convincing examples of them being useful.
↑ comment by AnneC · 2009-03-22T06:17:16.806Z · LW(p) · GW(p)
Well I don't know that I've got any "rationalist" cred, but as someone who at least attempts to approach life rationally, I am personally terrified by the prospect of being part of a cult because of the way cults seem to warp people's capacity for thinking straightforwardly about reality. (And I could easily lump "religion" in with "cult" here in that regard).
Basically, I don't like the way things I'd call "cultish" seem to disconnect people from concrete reality in favor of abstractions. I've seen some truly awful things happen as a result of that sort of mindset, and have also myself experienced an attempt at "indoctrination" into a sort of cult, and it was one of the worst experiences of my life. A person I knew and thought I could trust, and who seemed smart and reasonable enough, one day managed to trap me in an office under false pretenses and basically sat there berating me and telling me all kinds of horrible things about my character for two hours. And by the end of it, I was halfway ready to believe it, and my confidence and ability to do my schoolwork (this was in college) suffered for months afterward.
So I'm terrified of cults because I know how normal and reasonable their agents can seem at first, and how perfectly horrendous it is to find out what's actually going on, and how difficult it can be afterward to pick up the pieces of your brain and go forward with your life. I don't give a crap about the social-status stuff (well, beyond not wanting to be harassed, if that counts), I just don't want anyone messing with my mind.
Replies from: CarlShulman
↑ comment by CarlShulman · 2009-03-22T06:53:09.725Z · LW(p) · GW(p)
"a sort of cult," But not a cult, full stop. Multi-level-marketers? I have seen some hideous zombification in that context.
Replies from: AnneC
↑ comment by AnneC · 2009-03-22T15:38:15.823Z · LW(p) · GW(p)
It was some kind of "neurolinguistic programming" thing. This particular incarnation of it entailed my first being yelled at until "[my] defenses were stripped away", at which point I was supposed to accept this guy as a "master". Later on it supposedly involved attending weird summer-camp type sessions where I was told people would undergo things that "felt like torture" but which they'd "come to appreciate".
I didn't go to any camp sessions and probably wouldn't have attended them anyway for sheer lack of logistical finesse, but I am glad I had a co-worker point out to me that what was happening to me was emotional abuse at the very least.
Replies from: pjeby
↑ comment by pjeby · 2009-03-22T21:01:04.411Z · LW(p) · GW(p)
That sounds more like est or Landmark/Forum or even Scientology... but nonetheless an LGAT (large-group awareness training, basically a synonym for cult indoctrination).
Legitimate NLP training doesn't involve students getting yelled at, even offhandedly, let alone in any sort of systematic way. Anybody who claims to be teaching NLP in such a fashion needs to be reported to the organization that issued their certification, and then to the Society of NLP (so the organization's trainer-training certification can be revoked, if they don't revoke the trainer's cert).
(That link goes to a particular training organization, but I don't have any connection to them or offer any particular endorsement; it's just a page with good buyers' guidelines for ANY sort of training, let alone NLP. I'd also add that a legitimate NLP trainer will generally have enough work teaching paying customers, to have neither time nor reason to subject people to unsolicited "training".)
↑ comment by pjeby · 2009-03-21T21:46:26.304Z · LW(p) · GW(p)
I don't get where the sheer power of this fear comes from.
Status/self-image fears are among the most powerful human fears... and the status-behavior link is learned. (In my work, I routinely help people shed these sorts of fears, as they're a prominent source of irrationality, stress, procrastination... you name it.)
Basically, you experience one or more situations (most often just one) where a particular behavior pattern is linked to shaming, ridicule, rejection, or some other basic social negative reinforcer. It doesn't even have to happen to the person directly; it can just be an observation of the response to someone else's behavior. Under stress, the person then makes a snap judgment as to what the causes of the situation were, and learns to do TWO things:
1) To internalize the same response to themselves if they express that behavior, and
2) To have the same response to others having that behavior.
It also works in reverse -- if somebody does something bad to you, you learn to direct anger or attempts at ridicule towards that behavior, and also against yourself, as a result of "judging" the behavior itself to be bad, and the marker of a specific social group or class of people.
This can then manifest in odd ways, like not wanting to exhibit behaviors that would mark you as a member of the group you dislike.
One of the prime issues for me as a rationalist trying to learn about marketing (especially direct/internet marketing) was having to get over the fear of being a "dupe" pulled into a "scam" or "cult" situation. Essentially, if you have learned that some group you scorn (e.g. "suckers" or "fools" or whatever you call them) exhibits joining behavior, then you will compulsively avoid that behavior yourself.
I got over it, of course, but you have to actually be self-aware enough to realize that you chose this attitude/behavior for yourself... although it usually happens at a young enough age and under stressful enough conditions that you weren't thinking very clearly at the time.
But once you've examined the actual evidence used, it's possible to let go of the judgments involved, and then the feelings go away.
↑ comment by Nick_Tarleton · 2009-03-21T21:37:45.947Z · LW(p) · GW(p)
I wish I understood why some rationalists find so terrifying the prospect that they might be part of a cult.
For one thing, it would mean that they've been wearing a clown suit for years – and a sort of clown suit that a large part of their identity is defined in opposition to. How humiliating is that?
Ditto fear of being scammed by cryonics, which people seem to regularly treat as the worst thing that could possibly happen. Bad not to conform in belief, worse to be (exposed as) a nonconforming exploitable moron.
Note that hindsight bias can be expected to make being scammed/joining a cult look more moronic than it actually was, and the fundamental attribution error can be expected to make this reflect more badly on the actor than it should.
This still leaves your point that "the possibility of humanity being wiped out seems to have less psychological force than the opportunity to lose five pounds", but near/far probably accounts sufficiently for that.
↑ comment by CarlShulman · 2009-03-22T00:22:31.588Z · LW(p) · GW(p)
For Americans (and the cryonics organizations are American) some special factors apply. David Brin has some nice discussion of the ubiquitous pro-individualism propaganda permeating American print and electronic media. Religion is unusually common and powerful in the U.S. so rationalists have more negative affect towards it and anything that resembles it even slightly.
↑ comment by pjeby · 2009-03-21T20:02:13.952Z · LW(p) · GW(p)
Meh. Paul Graham's blog doesn't allow comments. Neither does Stallman's. And if you read OB via an RSS feed, there is no indication anywhere that other people are following along. And believing Eliezer is smart and right about a bunch of things doesn't mean mindlessly following him on everything.
Replies from: taw
↑ comment by taw · 2009-03-21T20:37:17.316Z · LW(p) · GW(p)
It doesn't matter that Paul Graham and Stallman don't allow comments. People know them, they have very high reputations and plenty of fanboys, and that all makes them high-social-status individuals. Mindlessly following the leader is not the same as mindlessly following the group; both are real and distinct behaviours.
People feel differently reading something by Paul Graham and something by a blogger they've never heard of. You might have gotten so used to social status indicators that you don't consciously see them. Go to 4chan (not /b/) and see what discussion is like without them. It is actually surprisingly good.
Replies from: pjeby
↑ comment by pjeby · 2009-03-21T21:23:56.683Z · LW(p) · GW(p)
People feel differently reading something by Paul Graham and something by a blogger they've never heard of.
Which people, and how do you know? The first time I read a PG essay, I'd never heard of him. I think you're confusing the cause and effect about people following -- at least where some people are concerned. PG, RMS, and EY aren't convincing because they have followers, they have followers because they're convincing.
Now, if you're saying the status indicators are in their writing, then that's another story. It's arguably a status symbol merely to speak possibly-unpopular and/or weird opinions in an authoritative voice without weaseling or implying that you're a persecuted minority, or even so much as dignifying the possibility that people might disagree with you.
Replies from: Michelle
↑ comment by Michelle · 2009-03-23T10:43:18.582Z · LW(p) · GW(p)
This is mostly agreeing with the same point, but I'm going to say it anyway because I think it's important.
I stumbled on Eliezer's writing fairly randomly (a link to OB as an interesting blog). I was immediately sucked in. In fact, I was discussing the subject of modern-day genius with a friend, and after having read two or three of his posts, I sent my friend a link saying something like "this Eliezer guy seems like a pretty legit modern genius." [He replied with "psshhh... he's just working in a hyped-up field." (I don't think he really read the posts)]. I had absolutely no idea of the depth of his ideas nor any of the broader social context at the time. I just knew it was making sense.
Same with Paul Graham. I stumbled on his website even more randomly. I did a google search for "procrastination" while procrastinating one night. And I was hooked. Again, I had no idea about his accomplishments or social status or associations, I just knew that his writing resonated with me.
What it is for me is a deep connection with the ideas in the writing. It's not just a matter of "hmm... interesting idea," but rather "WOW. That's EXACTLY how I feel. But explained so much more clearly."
I could lump Ayn Rand into the same group to an extent.
I agree that the "cultishness" is somewhat disconcerting. But I think there's much more to it than that. I think the fact that the names of three of the writers whose writing has deeply resonated with me philosophically, writers who I have come across through completely different means, have been mentioned in the comments in this post, is very telling. I suspect that people are predisposed to a certain way of understanding the world, and when they find ideas that resonate with that understanding, they latch on. It's just that some people are much better at communicating, or make the effort to communicate, these ideas.
(This comment opens a can of worms, as it could imply that there are various correct ways of understanding the world, and that rationalism is not necessarily THE way. But perhaps certain people are more predisposed to the idea of rationalism? And perhaps it is THE way, but certain people can just never overcome the views of the world imposed by their upbringing enough to have the ideas resonate with them?)
Either way, my main point is that it's not just a matter of blind worship.