How I infiltrated the Raëlians (and was hugged by their leader)

post by SquirrelInHell · 2016-03-16T05:45:03.387Z · LW · GW · Legacy · 32 comments

I was invited by a stranger I met on a plane, and actually went to a meeting of Raëlians (known in some LW circles as "the flying saucer cult") in Okinawa, Japan. It was right next to Claude Vorilhon's home, and he came himself for the "ceremony" (?), dressed in a theatrical, space-y white uniform, complete with a Jewish-style white cap on his head. When delivering his "sermon" (?) he spoke in English, and his words were translated into Japanese for the benefit of those who didn't understand. And yes, it's true he talked with me briefly and then hugged me (I understand he does this with all newcomers, and it felt 100% fake to me). I then went on to eat lunch in an izakaya (Japanese pub) with a group of around 15 members, who were all really friendly and pleasant people. I was actually treated to lunch by them, and afterwards someone gave me a ~20 minute ride to the town I wanted to be in, despite knowing they would never see me again.

If you have ever wondered how it is possible that a flying saucer cult has more members than EA, now it's time to learn something.

Note: I hope it's clear that I do not endorse creating cults, nor do I proclaim the EA community's inferiority. It didn't even cross my mind when I wrote the above line that any LW'er would take it as a stab they needed to defend against. I'm merely pointing out that we can learn from anything, whether good or bad, and encouraging a fresh discussion on this now that I've gathered some new data.

Let's do this as a Q&A session (I'm at work now so I can't write a long post).

Please ask questions in comments.

32 comments

Comments sorted by top scores.

comment by Evan_Gaensbauer · 2016-03-16T07:31:12.418Z · LW(p) · GW(p)

Upvoted for sharing unique experiences for their learning potential. I recall Luke Muehlhauser attended a Toastmasters meetup run by Scientologists several years ago, when he first moved to California. This was unrelated to the article, but as an aside he discouraged other LessWrong users from attending any meeting run by Scientologists just because he did: they are friendly, they will hack people's System 1s into making them want to come back, and even being enticed to join Scientology is not a worthwhile risk; in the best case you just waste your time with them anyway. I mean, IIRC, this was after Luke himself had left evangelical Christianity and read the LessWrong Sequences, so I guess he was very confident he wouldn't be pulled in.

It's interesting that you went, but if you were invited by a stranger on a plane to this meeting, I hardly think you "infiltrated"; rather, you were invited by a Raëlian as the first step toward joining them. I'm not saying you'll be fooled into joining, but I caution against going back, as you could at least use the time to find other friendly communities to join, like any number of meetups which aren't cults. It's sad that others are in this cult, but it's difficult enough to pull people out that I'm not confident it's worth sticking around to try, even if you think they're good people. When you get back Stateside, or wherever you're from, I figure there are skeptics' associations you can get involved with which do good work on helping people believe less crazy things.

Replies from: SquirrelInHell
comment by SquirrelInHell · 2016-03-16T07:47:06.967Z · LW(p) · GW(p)

I hardly think you "infiltrated"

This was tongue-in-cheek of course.

but I caution against going back

I'm not planning to, even though I'm very, very, very sure about not getting fooled (I've been through Christianity too in my youth; I escaped on my own strength, without external help or inspiration, and have since converted a few people to atheism).

I don't plan to go back because it would be a waste of my emotional energy that I could use to work on rationality communities (that I actually care about). Pretty much what you are trying to tell me, I guess. Thanks for worrying :) I'm fine :)

Note: this is a neat example of how the economy/"investing a limited resource" viewpoint can generate better life decisions than asking "does this seem like a good idea?" about individual things.

Replies from: ChristianKl
comment by ChristianKl · 2016-03-16T14:09:04.237Z · LW(p) · GW(p)

This was tongue-in-cheek of course.

The word "infiltrated" implies entering without the knowledge or approval of the group. For the purposes of epistemic hygiene, it's worthwhile to use language that accurately reflects reality.

Replies from: Lumifer
comment by Lumifer · 2016-03-16T15:09:27.906Z · LW(p) · GW(p)

For the purposes of epistemic hygiene, it's worthwhile to use language that accurately reflects reality.

Humor. It's a thing. You should try it sometime.

Replies from: ChristianKl
comment by ChristianKl · 2016-03-16T21:01:28.295Z · LW(p) · GW(p)

Did you laugh while reading the headline?

Replies from: SquirrelInHell, Lumifer
comment by SquirrelInHell · 2016-03-17T00:58:39.332Z · LW(p) · GW(p)

See "microhumor" at http://slatestarcodex.com/2016/02/20/writing-advice/

But in any case, thanks for the feedback. It's useful for me to know what styles of writing are interpreted in what ways around here. Actually what I'm doing right now is experimenting to increase my chances of successfully communicating some of my important ideas in the future.

Replies from: ChristianKl
comment by ChristianKl · 2016-03-17T12:07:15.779Z · LW(p) · GW(p)

I don't think humor is in general a good defense for writing misleading headlines.

I don't think "I engaged in a hostile action against outgroup X" is a good way to start humor. Cheering on humor like that produces bad dynamics.

Replies from: Lumifer
comment by Lumifer · 2016-03-17T14:53:20.181Z · LW(p) · GW(p)

I don't think humor is in general a good defense for writing misleading headlines.

Humor is not a defense, but a good idea. The headline is misleading only to people who can't parse language properly.

Replies from: buybuydandavis
comment by buybuydandavis · 2016-03-18T03:20:56.956Z · LW(p) · GW(p)

I doubt that too many people felt they were "misled" by the headline. I think most people got the joke.

I was trying to come up with a literary term for the device used. It seems like it's a thing.

Someone pointed out "dysphemism" as a term to me here. But that wasn't right. Just a little hyperbole for dramatic effect? For contrast with the hug?

Replies from: Lumifer
comment by Lumifer · 2016-03-18T14:57:25.656Z · LW(p) · GW(p)

I think of it as a "wink". It's not quite hyperbole, but, let's say, a literary device that sets the mood and expectations. Basically it says "I'm not being entirely serious here".

comment by Lumifer · 2016-03-17T14:54:01.415Z · LW(p) · GW(p)

I smiled. The "was hugged" expression was a hint :-)

comment by Dagon · 2016-03-16T15:23:45.397Z · LW(p) · GW(p)

Do they have goals aside from membership? How effective are they at those goals?

I'm concerned with comparisons based on number of members or tactics about recruiting that ignore the question of WHY the group is recruiting.

Replies from: SquirrelInHell
comment by SquirrelInHell · 2016-03-17T03:03:35.871Z · LW(p) · GW(p)

Good question!

As you rightly suspect, one of their strongly professed goals is the "viral" aspect, i.e. "we have been given a mission to spread the wonderful/enlightening message from the benevolent aliens that suspiciously look like a bunch of French dudes in jumpsuits".

Besides that, they care a lot about:

  • Science

This may sound funny, but yes, they think of themselves as supporters of "science" and "scientific" views (versus religion). And in cheering for and repeating phrases heard from the scientific crowd, they even got a few things right, especially those that were well established at the time their "message" book was written.

  • World peace

They consistently put emotional effort into thinking about this, but I don't know of any instance of them achieving a concrete outcome in the world.

  • Polyamory

This one seems to be a pretty real positive effect, especially in the context of Japan, with its overly rigid social rules that cause a lot of anxiety and unhappiness all around.

  • Emotional well-being of humans

While their epistemology is abhorrent, the methods they use for this are pretty efficient, and this aspect provides clear value to members.

I personally use "take a deep breath and feel grateful" as a primitive action, which is perfectly compatible with rationality, and has a measurable effect on quality of life (especially for countering sympathetic nervous system (SNS) responses). Most cults and religions know that it works, and use it in a "package deal" with brainwashing. But regardless of how often they are misused, techniques like this are valuable, and so is spreading their practice in communities.

Note: I kind of sound like I'm defending them, but please keep in mind that I only take this stance because it's opposed to the default LW reaction, and therefore more likely to produce non-tautological conversations.

comment by Evan_Gaensbauer · 2016-03-16T07:22:33.811Z · LW(p) · GW(p)

If you have ever wondered how it is possible that a flying saucer cult has more members than EA, now it's time to learn something.

One sentiment from a friend of mine that I don't completely agree with, but believe is worth keeping in mind, is that effective altruism (EA) is about helping others and isn't meant to become a "country club for saints". What does that have to do with Raëlianism, or Scientology, or some other cult? Well, they tend to treat their members like saints, and their members aren't effective. I mean, these organizations may be effective by one metric, in that they're able to efficiently funnel capital/wealth (e.g., financial, social, material, sexual, etc.) to their leaders. I'm aware of Raëlianism, but I don't know much about it. From what I've read about Scientology, it's able to get quite a lot done. However, it gets away with that by not following rules, bullying everyone from its detractors to whole governments, and brainwashing people into becoming its menial slaves. The epistemic hygiene in these groups is abysmal.

I think there are many onlookers from LessWrong who are hoping much of effective altruism develops better epistemics than it has now, and would be utterly aghast if it sold that out, using whatever tools from the dark arts, to make gains in the raw number of self-identified adherents who cannot think or act for themselves. Being someone quite involved in EA, I can tell you that the idea that EA should grow as fast as possible, or that the priority is to make anyone willing to become passionate about it feel as welcome as possible, isn't worth it if the expense is the quality culture of the movement, to the extent it has a quality culture of epistemic hygiene. So, sure, we could learn lessons from UFO cults, but they would be the wrong lessons. Having as many people in EA as possible isn't the most important thing for EA to do.

Replies from: buybuydandavis, SquirrelInHell
comment by buybuydandavis · 2016-03-18T03:35:20.521Z · LW(p) · GW(p)

Being someone quite involved in EA, I can tell you that the idea that EA should grow as fast as possible, or that the priority is to make anyone willing to become passionate about it feel as welcome as possible, isn't worth it if the expense is the quality culture of the movement, to the extent it has a quality culture of epistemic hygiene.

Sounds like "effective" is being cast aside for "epistemically pure". EPA? Epistemically Pure Altruism.

comment by SquirrelInHell · 2016-03-16T07:53:38.921Z · LW(p) · GW(p)

If we look at mistakes of other groups and learn not to repeat them, it's not a "wrong lesson".

Also, I think you too easily assume that they don't do anything that belongs to the light side. The whole trick in creating a group like that is, I guess, mixing the dark side stuff in between enough of the good stuff that it becomes hard to notice.

This may not be the best example, but if you can get better PR for EA by choosing more attractive people to do the public speaking, where's the harm in that? That's likely how human psychology works.

comment by MrMind · 2016-03-16T16:12:49.837Z · LW(p) · GW(p)

you have ever wondered how it is possible that a flying saucer cult has more members than EA

What do you think of this sentence: a flying saucer cult is bound to have more members than an EA movement, because it's far easier to delegate responsibility to the leader, it's status-enhancing to have "secret knowledge" that others don't, and you're shielded from the hardships of the outside world.
All things that are emotionally appealing but epistemically toxic.

Replies from: SquirrelInHell
comment by SquirrelInHell · 2016-03-17T02:32:29.193Z · LW(p) · GW(p)

On the contrary, I assign a higher prior probability to movements based on sensible ideas rising to prominence, versus movements based on total nonsense.

(How many failed/successful flying saucer cults? How many failed/successful nonprofits and charities? I don't have real data, but my gut feeling says saucer cults are worse off on average.)

This means that for a movement based on total nonsense to succeed, it has to clear a higher bar in the quality of its organisation: good leadership, a consistent message, effort put into PR, and so on.

The factors you mention might make it easier in some respects to run a cult, but I think they do not, on average, outweigh the costs. If they did, we should see this reflected in the number of functioning cults versus charities.

Quite the opposite view from the sentence you quoted!

Replies from: MrMind, DanArmak
comment by MrMind · 2016-03-17T09:00:55.983Z · LW(p) · GW(p)

I think the discussion is getting too nuanced to proceed only from raw hypotheticals.
I'll just add that if you consider religions to be "crazy cults", I would say they're much more successful than charities (when the charity is real, of course, and not some legal way to evade taxes), but also that flying saucer cults are less successful than religions, because weirdness is almost by definition low status.
That said, I feel we've arrived at a point where, if we want to further the discussion, we would really need to start writing equations and assigning probabilities...

comment by DanArmak · 2016-03-20T14:27:08.624Z · LW(p) · GW(p)

When talking about success rates, we have to consider base rates. Are more or fewer nonsense movements founded than sensible ones?

Replies from: SquirrelInHell
comment by SquirrelInHell · 2016-03-21T05:57:24.203Z · LW(p) · GW(p)

Thank you very much for pointing out the base rate fallacy.

Even though I wasn't claiming anything concrete, for lack of data, I would like to have had that reflex in place.
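To make that concrete, here is a minimal sketch in Python, with made-up numbers purely for illustration (not real data): the counts of surviving groups we observe depend on both how many of each kind get founded and how often each survives.

```python
# Purely illustrative, hypothetical numbers -- not real data.
# Observed counts of surviving groups confound two quantities:
# how many of each kind get founded (base rate) and how often each survives (success rate).

founded = {"saucer_cults": 100, "charities": 10_000}      # hypothetical founding counts
success_rate = {"saucer_cults": 0.10, "charities": 0.02}  # hypothetical survival rates

# Expected number of surviving groups of each kind.
surviving = {kind: founded[kind] * success_rate[kind] for kind in founded}
print(surviving)  # {'saucer_cults': 10.0, 'charities': 200.0}

# Even though the (made-up) cult success rate is 5x higher than the charity one,
# we would still observe 20x more surviving charities, simply because
# far more charities get founded in the first place.
frac_cults_among_survivors = surviving["saucer_cults"] / sum(surviving.values())
print(round(frac_cults_among_survivors, 3))  # ~0.048
```

So the count of functioning cults versus charities that I appealed to above only speaks to success rates if we also have a rough idea of how many of each get founded in the first place.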

comment by niceguyanon · 2016-03-18T19:02:12.106Z · LW(p) · GW(p)

Did you get a feel of the education/income/social status of the group you saw? I wonder if they try to recruit a certain type of person.

Replies from: SquirrelInHell
comment by SquirrelInHell · 2016-03-19T06:11:36.869Z · LW(p) · GW(p)

I learned that some of the people there were really high-profile, but I don't know the percentages. On the other hand, some others were perfectly ordinary folk, so if there is a preference it's not exclusive.

For example, I remember that one guy (they said) owned a company selling electronics, and gave others lots of expensive presents ("presents"?), like Alienware gaming hardware and so on.

comment by TheAltar · 2016-03-17T16:06:56.883Z · LW(p) · GW(p)

Why did the hug feel 100% fake to you? Do you think the other Japanese people give less fake hugs?

I generally know that Japan isn't too big on hugging as a culture, so I wonder whether very many Japanese people would be very skilled at this.

Replies from: Lumifer
comment by Lumifer · 2016-03-17T16:39:15.662Z · LW(p) · GW(p)

the other Japanese people

Rael is French, not Japanese.

Replies from: SquirrelInHell
comment by SquirrelInHell · 2016-03-18T01:26:15.797Z · LW(p) · GW(p)

Rael is French, not Japanese.

Yes. And for the record, I've lived in Japan for half a year now and I can't remember a single case of Japanese people casually hugging each other or me. I've had other pleasant things done to me, so maybe it's just that they really don't have hugging in their standard set of social responses.

comment by Viliam · 2016-03-16T06:58:56.260Z · LW(p) · GW(p)

So, is it mostly uniforms, love bombing, and sharing food, or are there other major secret ingredients?

Replies from: SquirrelInHell
comment by SquirrelInHell · 2016-03-16T07:27:11.434Z · LW(p) · GW(p)

Of these three, only love bombing was prominently present, though I think it was actually genuine feeling in the "low rank" members (and I've talked to quite a few). As for uniforms, I think they all had unassuming trinkets/medallions hung from their necks. Food sharing was not present, and the lunch I mentioned was no more special than any other lunch I've gone to with other groups of people.

I think there were quite a few other interesting factors at play, and that's why I suggested we could all learn something from a discussion about it. I don't have anything like a complete analysis ready myself, but I detected significant amounts of:

  • casual, friendly touch - there's nothing wrong with it and I would like to see more of it in other communities,

  • making good use of good-looking people and sexual attractiveness - subtle enough to not be pretentious,

  • telling people to think for themselves (but I guess on meta-level, you already know the conclusion you are supposed to think of),

  • scripted/planned/predictable ways to evoke positive emotions,

  • honesty about emotions and subjective experience, and full acceptance of this by other members,

  • generally positive atmosphere of friendliness and helpfulness that carries over to the rest of the world, not just the in-group,

  • excellent stage/public speaking/PR skills of the top ranking members,

  • mixing in genuinely worthy/morally sound causes, like stopping wars (though from an LW point of view it looks more like cheering than taking action).

Replies from: Viliam, Gleb_Tsipursky
comment by Viliam · 2016-03-16T22:42:06.743Z · LW(p) · GW(p)

I guess it's not easy to find a balance between manipulation, and the "reverse manipulation" where people sabotage themselves to signal that they are not manipulating; between focusing on impressions instead of substance, and not being aware of the impressions; between blindness towards biases, and ignoring human nature. Especially in a group where different people will have wildly different expectations for what is acceptable and what is cultish.

Sometimes it feels like a choice between losing rationality and losing momentum. Optimizing to never do anything stupid can make one never get anything done. (And there is of course the opposite risk, but that is much less likely among our kind, although we obsess about it much more.)

What you described here seems like a solid strategy for getting new members. But then the question is how to prevent diluting the original goals of the group. I mean, instead of people who care deeply about X, you succeed in recruiting many random strangers because they feel good in your group. So how do you make sure that the group as a whole (now with a majority of people who came for the good feelings, not for X) will continue to focus on X, instead of just making its members feel good?

I think the usual answer is strict hierarchy: the people at the top of the organization, who decided that X is the official goal, remain at the top; there is no democracy, so even if most people actually care about feeling good, they are told by the bosses to do X as a condition of staying in the group where they feel good. And only carefully vetted new members, who really contribute to X, are later added to the elite.

So, if rationalists or effective altruists were to embrace this strategy, they would need a hierarchy, instead of being just a disorganized mob. So instead of "rationalists in general" or "effective altruists in general", there would have to be a specific organization, with defined membership and leadership, which would organize the events. Anyone could participate in the events, but that wouldn't make them equal to the leaders.

For example, for rationalists, CFAR could play this role. You could have thousands of people who identify as "rationalists", but they would have no impact on the official speeches by CFAR. But CFAR is an organization specialized in teaching rationality, so it would be better to have some other organization serve as an umbrella for the rationalist movement -- one containing people who are mutually believed to be rationalists, even if they don't participate in developing a curriculum.

Similarly for effective altruism. You need a network of people who share the values of the movement, who provide "credentials" to each other, and who only accept new people who also credibly demonstrated that they share the values.

Replies from: SquirrelInHell
comment by SquirrelInHell · 2016-03-17T03:25:21.730Z · LW(p) · GW(p)

You are not wrong, of course, but on the scale between "blindness towards biases" and "ignoring human nature", your views fall 80-90% towards "ignoring human nature".

Just to give you a more complete image, here's a thought:

People are not consequentialists, and they don't know clearly what they want.

In fact, there is nothing "absolute" that tells us what we "should" want.

And the happier you make people, the happier they tend to be with the goals you give them to work on.

If you also teach people rationality, you will get more scrutiny of your goals, but you will never get 100% scrutiny. As a human, you are never allowed to fully know what your goals are.

Looking at the "human nature" side, people who "care deeply" about EA-style things are just people who were in the right situation to "unpack" their motivations in a direction that is more self-consistent than average, not people who fundamentally had different motivations.

So my "human nature" side of this argument says: you can attract people who are in it "just for feeling good", give them opportunity to grow and unpack their inner motivations, and you'll end up with people who "care deeply" about your cause.

comment by Gleb_Tsipursky · 2016-03-16T15:17:00.018Z · LW(p) · GW(p)

Yup, agreed on the benefit of expressing strong positive emotions toward rank-and-file EA members. I wrote about this earlier; I don't know if you saw the piece: http://lesswrong.com/lw/n7b/celebrating_all_who_are_in_effective_altruism/

Intentional Insights is actually working actively on a project to make the EA movement more welcoming; let me know if you are interested in collaborating on it.

comment by [deleted] · 2016-03-16T12:44:28.618Z · LW(p) · GW(p)

Clitoraid! Now that's an effective charity.