The Value (and Danger) of Ritual

post by Raemon · 2011-12-30T06:52:09.685Z · LW · GW · Legacy · 68 comments

Contents

  Art and Belief
  Art and Tribe
  Signaling Issues
  Dangers of Reinforcement
  Aspiring Rationalist Culture

This is the second part of my Winter Solstice Ritual mini-sequence. The introduction post is here.

-

Ritual is an interesting phenomenon to me. It can be fun, beautiful, profound, useful... and potentially dangerous.

Commenters from the previous article fell into two main camps - those who assumed I knew what I was doing and gave me the benefit of the doubt, and those who were afraid I was naively meddling with forces beyond my comprehension. This was a reasonable fear. In this article, I'll outline why I think ritual is important, why it's dangerous, and why it's relevant to an aspiring rationalist culture.

Before I start arguing how meaningful and transformative ritual can be, I want to argue something simpler:

It can be really fun.

This is not to be discounted. For whatever reason, humans tend to appreciate songs, stories and activities that they share with their tribe. Hedons from ritual can take the form of jovial fun as well as intense, profound experiences.

Not everything we evolved to do is good. If we feel an urge to hit the enemy tribesman with a huge rock and take their land, we can and should say “No, there are important game theoretic and moral reasons why this is a bad idea” and suppress the urge. But we can also devise new activities, like knocking the enemy tribesman over and taking their ball, satisfying that urge without the negative consequences of war. I'd like access to the experience that ritual uniquely offers, if it can be done safely.

Ritual covers a range of experience. One subset of that is a kind of art. To give you some sense of what I mean here, here are a few clusters of activities.

And here are a few songs I like:

Art and Belief

I like Silent Night because it is a simple, tranquil song, often sung with skillful harmonies.

I like Carol of the Bells because it is a powerful, awe-inspiring song that is performed with immense complexity and skill.

I like Do You Hear What I Hear partly for the same reasons I like Silent Night - it begins with simple tranquility. But I also like it for ideological reasons - it showcases the power of a meme growing over time, magnifying, evolving and changing the lives of its hosts as they come to believe it. As an artist hoping to craft powerful memes, this is very important to me. I also like the imagery of the proud king, willing to listen to the words of a shepherd boy, acknowledging the importance of a small infant born into poverty far away.

And the king is able to command the attention of an entire nation: Take a moment, stop what you are doing, and pay attention to this child.

But Do You Hear What I Hear also bothers me slightly - it lies in the uncanny valley of ideological identification. The song strikes very close to home, and I want to give myself over to it, not just to sing the words but to truly feel them in my heart. And I can’t, because there is a slight snag when we get to the proud king. The king is valuing the child for all the wrong reasons. I want the child to be important because all children are important. But this king would not have given the child a second thought if it hadn’t been the son of God. I don’t believe in Jesus, so the intended message of the song clashes with what I want it to be about.

For the most part I sing the song without thinking about this, but that little snag is there, and it prevents the song from being one of my favorites ever.

Contrast this with Silent Night, where the message is largely irrelevant to me. Or Carol of the Bells, whose message is “Bells are pretty and people like them.” I appreciate them aesthetically and I respect skilled performers. Their messages don't bother me, but neither do I feel as strongly about them.

Art and Tribe

The Word of God is beautiful because the world is an incredible place, and humans have discovered millions of beautiful true things about it. There is exactly one thing I dislike about this song, and it is not a disagreement with its ideology. It’s just the use of the word “God.” I don’t think it was the wrong word to use - it’s a nice, simple word and I read it purely as a metaphor for “the universe.”

Like Do You Hear, there is some uncanny-valley effect here. But here it’s about tribal identification. (I draw a distinction between tribal identity and ideology - tribe is about identifying with a group of people, ideology is about identifying with a belief-structure.)

My mind snags because “God” is a word I normally associate with other cultures. This isn’t as big a deal as in Do You Hear. I don’t actually consider the goddists to be my enemy, I just don’t feel connected to them, and the word takes me out of the beauty of the song and reminds me of this disconnection. I did go ahead and include Word of God, verbatim, in the Winter Solstice Celebration. I just want to note that there are different reasons to be moved by (or fail to be moved by) a song.

[Edit in 2018: we've since re-written Word of God (after touching base with the original songwriter) to focus more purely on scientific progress rather than God. This was less because "God" was problematic and more because it kept the focus on a political conflict that didn't seem good for Solstice long-term.]

Finally, we have Singularity, which I like for all kinds of reasons.

The music begins whimsical and fun, but grows more powerful and exciting over time. If you have good speakers, there's a heavy but subtle bass line that drives the sound through your bones. It was refreshing to hear an unapologetic vision of how amazing the future could be. And when the sound abruptly cuts out and the song resets, there's another image I really like - that humanity is not special because we were created by some God for a grand purpose. Instead, we are special precisely because we were shaped by random forces in an un-extraordinary corner of the universe, and all of our potential power and greatness comes from our own desires, intellect and drive.

So it's ideologically moving to me. But I didn't really realize until I sang in a group how tribally moving it could be. I wasn't sure people would like the song. The chorus in particular sounds silly when you sing it by yourself. But as a group, everyone got really into it, and yes, the chorus was still a little silly, but we got up and waved our arms around and belted it out, and it felt really good to be part of a group who believed that this weird, outlandish idea was in fact very plausible and important.

So that was cool.

I also thought it slightly terrifying.

Songs like Singularity are what give me the most pause about encouraging Less Wrong culture and rituals.

Signaling Issues

There are two big issues with ritual. The first is how it makes other people perceive us.

Rituals are, almost by definition, symbolic actions that look a little weird from the outside. They normally seem okay, because they are ancient and timeless (or at least were created a few years before people started paying attention). But any Less Wrong ritual is going to have all the normal weirdness of a "fresh" tradition, and it's going to look extra strange because we're Less Wrong, and we're going to be using words like "ritual" and "tribal identification" to matter-of-factly describe what we're doing.

Some people may be turned off. Skeptics who specifically turned to rationality to escape mindless ritual that was forced upon them may find this all scary. Quality, intelligent individuals may come to our website, see an article about a night of ritual and then tune out and leave.

I think this is an acceptable cost to pay. Because for good or for ill, most humans like emotional things that aren’t strictly rational. Many people are drawn to the Sequences not just because they say important things, but because Eliezer crafted an emotional narrative around his ideas. He included litanies and parables, which move us in a way that pure logic often can’t.

There are smart cynics who will be turned off, but there are also smart idealists who will be drawn to recognizable human emotional arcs. I don't think ritual should be the FIRST thing potential newcomers see, but I think it is something that will get them fully involved with our community and the important things we believe. I think it may particularly help former theists, who have built their entire lives around a community and ritual infrastructure, make the transition into atheists who are proud of their new beliefs and do productive things.

It may even help current theists make the transition, if they can see that they WON'T have to give up that community and ritual infrastructure, and all the hedons that went along with it.

But there’s another cost to ritual, one that can’t be resolved quickly with a cost-benefit analysis.

[Update from 2016 - I want to clarify that while I think Less Wrong as a community is a reasonable place for ritual and culture-building, there are other related communities and organizations that are more "PR sensitive", and that I don't think should be connected to ritual.]

Dangers of Reinforcement

Ritual taps into a lot of regions of our brain that are explicitly irrational, or at least a-rational. I don’t think we can afford to ignore those regions - they are too essential to our existence as humans to simply write off. We didn’t evolve to use pure logic to hold together communities and inspire decisions. Some people may be able to do this, but not most.

I think we need ritual, but I would be a fool to deny that we’re dealing with a dangerous force. Ritual is a self-reinforcing carrier wave for ideas. Those ideas can turn out to be wrong, and the art that was once beautiful and important can turn hollow or even dangerous. Even true ideas can be magnified until you ignite a happy death spiral, giving them far more of your time than they deserve.

Some of this can be mitigated by making traditions explicitly about the rational process, and building evaluation into the ritual itself. We can review individual elements and remove them if necessary. We can even plan to rewrite them as new parodies that refute the old idea, ceremonially discarding it. But this will be a meaningless process unless we are putting in genuine effort - not just doing a dutiful review as good rationalists should.

We can recite the Litany of Tarski, but unless we are truly considering both possibilities and preparing ourselves for them, the words are useless. No amount of pre-planning will change the fact that using ritual requires deliberate effort to protect yourself from the possibility of insanity.

You should be doing this anyway. There are plenty of ways to fall into a happy death spiral that don’t involve candle-lit gatherings and weird songs. When you’re dealing with ideas as powerful as the Singularity - a meme that provides a nice, sound-bite word that suggests a solution to all of the most terrible problems humanity faces - you should already be on guard against wishful thinking. When you're talking about those ideas in a group, you should already be working hard - genuinely hard, not just performing a dutiful search - to overcome groupthink and evaporative cooling and maintain your objectivity.

You should be doing that no matter what. Every cause wants to be a cult, whether or not it has a nice word - one that sounds way simpler than the idea actually is and promises to solve all the world’s problems.

Ritual does make this harder. I’m particularly wary of songs like Singularity, which build up a particular idea that still has a LOT of unknown factors. An anonymous commenter from the Solstice celebration told me they were concerned about the song because it felt like they were “worshipping” the Singularity. I agree - this is concerning, both for our own sanity and for the signal it sends to newcomers who stumble upon this discussion.

I’d go ahead and exclude the song, and any meme that got too specific with too many uncertainties.... except that a lot of our most powerful, beautiful images come from specific ideas about the future. A generic rallying cry of “Science!”, “Humanism!” or “Rationality!” is not a satisfying answer to the problems of Death and Global Suffering and Existential Risk. It’s not satisfying on an artistic level, an intellectual level or a tribal level. Having specific ideas about how to steer the future is what gives our group a unique identity. Caring too much about that identity is dangerous, but it can also be extremely motivational.

As of now, I’m not sure what I think about this particular problem. I look forward to commenters weighing in.

With all this dire warning, it may seem like a slam-dunk case for abandoning the idea of ritual. Obviously, I disagree, for a few reasons.

The first is that honestly, gathering a few times a year to sing “Singularity! Singularity!”, even without all the preventative measures, simply pales in significance compared to... well... the entire Less Wrong community-memeplex doing what it does on a regular basis.

If we were genuinely concerned about making bad decisions due to reinforcement rituals, I’d start by worrying about much more mundane rituals, like discussing the Singularity all the time. Constantly talking about an idea trains your brain to think of it as important. Hanging out on forums with a constant stream of news about it creates confirmation and availability bias. If you’re concerned about irrationality, as opposed to weird ceremonies that might signal low status, you should already be putting a lot of effort into protecting yourself against a happy death spiral, and the extra effort you need to expend for a few nights of jubilant celebration shouldn’t be that significant.

The danger of ceremonial ritual in particular is real, but overestimating it isn’t much better than underestimating it - even if we’re just talking about ritual as a source of hedons that we’ve previously denied ourselves. Families across the world gather to sing songs about ideas they like, and while this may be a human behavior we need to sacrifice, I’m not going to do so out of fear of what *might* happen without a decent understanding of why.

But there’s more to it than that. And this is why I’ve worked so hard on this, and why I think the potential upsides are so important.

Aspiring Rationalist Culture

I had two major motivations for the Solstice celebration. One of them was to produce a fun event for my community, and to inspire similar events for people across the world who share my memes.

The other was personal: Rationality training has made me better at identifying good solutions, but it hasn't made those solutions emotionally salient. This is particularly important when it comes to optimal philanthropy - a million starving people across the world simply can't compete with a single smiling orphan I get to personally deliver a Christmas present to. And those people have an even harder time if they live in the distant future.

Scope insensitivity and time discounting can be hard to overcome. Worst of all is when the best solution might not work, and I may not even live to see it work, and I can never get the emotional satisfaction of knowing I actually helped anyone at all.

I constructed the Solstice celebration around a narrative, based around the interplay between past, present and future. The process of crafting that narrative was extremely valuable to me, and has helped me to finally give Existential Risk the weight it deserves. I haven't committed to helping SIAI in particular, but I feel like I'm at a place where, if I got better information on how effective SIAI is, I'd be emotionally able to act on that information.

I don't think the first execution of the Solstice celebration successfully provided other people with that experience, but I think there is tremendous potential in the idea. I'd like to see the development of a culture that doesn't glorify any particular solution, but which takes important rationality concepts and helps people modify their emotions to match the actual expected values of actions that would otherwise seem cold, hard and calculating.

I think this may turn out to be very important.

In some ways this has me far more scared than ritual-as-hedons. People can gather for a night of jovial fun and come away mostly unchanged. Using ritual *deliberately* to modify yourself is risky, and it is perhaps riskiest if you think you have a good reason for doing so.

I don't think this is that dangerous on the individual level. It was useful to me, and I think others may find value in it. Actually allowing yourself to be changed in a meaningful way requires effort beyond an initial, inspiring ceremony. (It took me several weeks of intense work *preparing* the Solstice celebration, cemented by a final moment when a few ideas clicked into place and I came up with a metaphor I could use to alter my emotions. I don't know for sure whether this can be distilled into a process that others can use with less effort.)

Next year, I expect the people who come to Solstice for the comfort and hedons will get what they need, and if anyone wants to use it as a springboard for self-modification, they will be able to as well.

The possibility that most concerns me is a chain of events going something like this:

  1. Someone (possibly a future version of me, possibly any random person perusing these articles) will decide that this is important enough to deliberately propagate on a mass scale.
  2. Said person will become good enough at ritual-craft (or use additional dark arts techniques) to make this much more effective than I currently anticipate.
  3. The result is an effective but low-status self-propagating organization that ends up corrupting the original goal of "help people be better at following through on correct but emotionally unsatisfying choices."

This scenario requires someone to put a lot of work in, and they would probably fail uneventfully even if they did. Even if events transpire this way, the problem is less that a hollow self-propagating memeplex exists (it's not like we don't already have plenty of them; one more won't hurt that much) than that its association with Less Wrong and related things may tarnish our reputation.

I'd like to think this is a bad thing, although truth be told I think it assumes a level of importance that Less Wrong hasn't really carved out for itself yet in the greater world. But we are trying to gain status, and out of respect for the community I should acknowledge this risk and sincerely solicit feedback.

My current assessment is that a) this is unlikely, and b) any organization that's trying to accomplish things on a large scale WILL have to accept the risk that it transforms into a hollow, self-perpetuating memeplex. If you don't want that risk at all, you're probably not going to affect the world in a noticeable way. Ritual-driven memeplexes tend to be religions, which many of us consider a rival tribe, so they carry more emotional weight in our risk assessment. But this can also happen to corporations, unions, non-profits and political movements that may have been genuinely valuable at some point.

I do plan to study this issue in more detail over the coming year. If anyone has specific literature or examples that I should be aware of, I'd appreciate it. But my priors are that the few negative reactions I've gotten to this are based more on emotion than on a clear understanding of the risks.

My final concern is that this simply isn't a topic that Less Wrong should devote too much space to, partly because some people find it annoying and partly because we really should be spending our time developing tools for rational thinking and studying scientific literature. I have another article or two that I think would be valuable enough for the main page, and after that I'll be inviting people to a separate mailing list if they want to collaborate.

This is the second post of the ritual mini-sequence. The next post is Designing Ritual.

68 comments

comment by bryjnar · 2011-12-31T13:42:49.751Z · LW(p) · GW(p)

Okay, here are a couple of things that bother me about this whole enterprise:

  • Maybe this is a British thing, but it smacks a bit of Taking Oneself Too Seriously, which is a capital crime over here. For some reason the idea of a bunch of earnest, self-ascribed "rationalists" coming up with their own rituals just makes me cringe. I'm struggling to pinpoint the source of the cringing, but it's definitely there. It's probably personal, but put it down as evidence that you're going to cause adverse emotional reactions in some people.

  • I'm a bit wary of the idea of trying to form LW communities. Why do we want to do that? I think of LW as a great forum for discussing a bunch of intellectually interesting ideas, and I'd almost certainly be interested in meetups if there were any near me, just because discussion in person has its own perks. But actually trying to form meetup groups into communities? That feels a bit like mission creep to me.

  • Related to the above: I'm wary of setting off down a path that is likely to make LWers identify (more) strongly as being LWers. I'm with Paul Graham on this. Bundling an ideology of any kind with fun community-type stuff seems like a recipe for producing unwarranted attachment to said ideology.

I'm glad you had fun with it, but I think you could just, you know, have a party, at which you might venture to read some texts you like, or sing some songs, or not if people don't want to. Rather than a ritual, with all the gunk associated with that.

Replies from: Raemon, Mitchell_Porter, Kaj_Sotala, buybuydandavis
comment by Raemon · 2011-12-31T17:56:49.629Z · LW(p) · GW(p)

Maybe this is a British thing, but it smacks a bit of Taking Oneself Too Seriously

There is a small part of me that cringes at the Self-Importance of it all, so you're not alone there. But I think that part of me is silly and I rebel against it the way I rebel against all my other biases. If something is enjoyable and possibly even useful, I shouldn't have to feel bad about doing it, as long as it's not also dangerous.

Forming Communities

My issue is this: 90% of the benefit I got from Less Wrong had nothing to do with rationality, or discussions on this website. It had to do with me finding a real life community of people, period. I think community is absolutely essential to healthy development as a human. This may be slightly less true for certain types of introverts, but I believe even most introverts enjoy (and benefit from) extroverted activities if they have a group of people they can feel comfortable with.

That benefit didn't even necessarily require an explicitly "aspiring-rationalist" community. [Aside: I think a lot of the self-importance comes from the word "rationalist." "Aspiring rationalist" isn't much better and is a mouthful, but at least acknowledges that we aren't there yet.]

I could have integrated myself into a group of gamers or other random friends, and gotten maybe 60% of the same benefit. In the NYC Less Wrong group, that 60% (i.e. general friendship and shared memes) is filled instead by having friends with a shared desire to think about ideas critically and avoid a lot of common brain failures.

But the other 30% of that benefit comes specifically from a community that values achievement in conjunction with rationality. If I were hanging around in a gaming community, I'd be having a lot of fun playing games, and seeing people be particularly successful at playing games would encourage me to play more. In NYC Less Wrong, I'm having a lot of fun developing an array of skills that are useful in different ways outside of a very narrow fun-niche, and I'm constantly exposed to people who are trying to launch startups or film movies or otherwise embark on big projects.

Not all of us are actually that motivated, but exposure to that motivation and the rationalist-discipline that accompanies it is, I think, incredibly valuable.

Does that community have to have any branding attachment to Less Wrong? No. In fact I don't like calling ourselves LessWrongers or even "Rationalists" (and "Aspiring Rationalist" is just so clunky that it doesn't really work either). So far we haven't come up with anything better that sticks. I think there IS value in using some kind of word to describe ourselves tribally, but it probably would be better to keep that separate from the Less Wrong label.

comment by Mitchell_Porter · 2011-12-31T14:51:11.616Z · LW(p) · GW(p)

Taking Oneself Too Seriously, which is a capital crime over here

Celia Green - a British author - once wrote "[Children] seek intensity of experience. They do not have much experience of life and they may seek it clumsily. As they grow older and saner they learn not to seek it at all."

The injunction not to take oneself seriously, if taken seriously, would lead to self-sabotage and guaranteed mediocrity. Much better that people try and fail, and still keep trying. Even better to get it right, but I do not see how an attitude of unseriousness will help there.

Green is a brilliant thinker who somehow ended up shut out of academic life, and her best works (The Human Evasion and Advice to Clever Children) are full of insights into why taking yourself seriously is the only reasonable thing to do, how and why ordinary consciousness works to suppress such an attitude, and how to see past the traps that your own mind or the attitudes of others may set for you. They are also full of imprecations against the human race for not supporting her work, but perhaps she has a right to that attitude. There ought to be a shelf full of commentary on her works by now (and surely she would have filled a shelf by herself if her life had worked properly), but somehow she remains unknown.

Here is another salient quote from Green: (After scorning the fact that most poetry is about "Love and Death":) "If the human race were a little more advanced, it would want to write poems about infinity and the inconceivable..., and if it were more advanced still it would have more urgent things to do than write poetry."

Substitute "hold a ritual" for "write poetry", and you'll get my message. I do believe that this is what functioning higher intelligence looks like: extremely urgent and directed in its activities, when it is empowered to be so. Life is short and existence is radically uncertain. "Ritual" may be something of a proxy for purpose, but at least it is reaching after empowerment.

Replies from: Raemon
comment by Raemon · 2011-12-31T17:53:44.392Z · LW(p) · GW(p)

To the extent that we did ritual "because it was fun", I think it was a proxy purpose approximately to the same degree that watching a good movie was. Entertainment may change but it's not going away - it's valuable for its own sake.

But some of our group are actively doing important things, and I don't think those things would ever "replace" the function that ritual serves. To the extent that we're doing ritual as a source of community building and inspiration, it basically serves the purpose that it's meant to serve, which is to give you a foundation to go do those things that are more important than writing poetry.

comment by Kaj_Sotala · 2012-01-02T09:42:48.410Z · LW(p) · GW(p)

I'm a bit wary of the idea of trying to form LW communities. Why do we want to do that?

I suspect that the problem that most prevents LWers from getting things done is akrasia, and communities seem like one of the best tools for fighting it. It's a lot easier to get things done if you can do them together with someone.

See also the Good News of Situationist Psychology - being rational, too, is far more effective if you have people around you encouraging the habit. It's easy to slack off and fall into sloppy thought if nobody around you is serious about rationality.

Related to the above: I'm wary of setting off down a path that is likely to make LWers identify (more) strongly as being LWers.

Identifying more strongly as rationalists, however, is probably a good thing.

ETA: Graham said it himself:

There may be some things it's a net win to include in your identity. For example, being a scientist. But arguably that is more of a placeholder than an actual label—like putting NMI on a form that asks for your middle initial—because it doesn't commit you to believing anything in particular. A scientist isn't committed to believing in natural selection in the same way a biblical literalist is committed to rejecting it. All he's committed to is following the evidence wherever it leads.

Considering yourself a scientist is equivalent to putting a sign in a cupboard saying "this cupboard must be kept empty." Yes, strictly speaking, you're putting something in the cupboard, but not in the ordinary sense.

comment by buybuydandavis · 2014-09-01T01:57:26.922Z · LW(p) · GW(p)

Maybe this is a British thing, but it smacks a bit of Taking Oneself Too Seriously, which is a capital crime over here. For some reason the the idea of a bunch of earnest, self-ascribed "rationalists" coming up with their own rituals just makes me cringe.

Probably because the result of people taking their lives seriously is often cringeworthy.

But the lives of people who don't take their lives seriously are generally cringeworthy as well; it's just that we've ceased cringing because we've become inured to the usual ways people misspend their lives.

I think the latter is the much more certain and prevalent waste of human lives.

comment by Armok_GoB · 2011-12-30T20:39:41.835Z · LW(p) · GW(p)

WANTWANTWANTWANTWANTWANTWANTWANTWANTWANTWANTWANTWANT WANTWANTWANTWANTWANTWANTWANTWANTWANTWANTWANTWANTNEEDNEEDNEED NEEEEEEEEEEEEEEEEEEEEEEED

Ok, I'm done now. D:

At times like this, I tend to think I was genetically "meant" to be The Tribe's Shaman, every aspect of my psychology and maybe even my physiology fine-tuned for that specific role by some ancient cluster of genes that got combined in just the right way to synergize and get their activation turned up to eleven. I even start to suspect the inability to act on this at all might contribute to my crippling mental illness. Right now, a picture of a beaver with bloody stumps for paws trying to dig into a concrete floor refuses to stop looping in my head. Not that I trust any of these suspicions or intuitions - it's probably nonsense - but revealing them seems like the best way to communicate what I'm feeling.

I have no idea how much I'd give for a Gothenburg-based meetup, but it's certainly something absurdly desperate.

This post resonates so deeply within me... it's like some clichéd "calling", like this is what I was meant to spend my life doing, and I'd spend it all in a creepy wireheaded stupor of ecstasy because of how fun it'd be. Everything I've ever managed to get myself to study or practice seems optimized for this one task.

Yeah, it's ridiculous - maybe I'm just sleep deprived, or maybe you're just really, really good at what you're doing.

In unrelated news: if you haven't decided on what ritual to work on next, what about some kind of UDT-based unbreakable oath type thing?

Replies from: shminux, Nisan
comment by Shmi (shminux) · 2011-12-30T21:07:03.250Z · LW(p) · GW(p)

Downvoted for breaking text wrapping.

Replies from: Armok_GoB, Will_Newsome
comment by Armok_GoB · 2011-12-30T23:57:33.683Z · LW(p) · GW(p)

fixed

comment by Will_Newsome · 2011-12-30T22:48:32.512Z · LW(p) · GW(p)

I went with -1 for breaking text wrapping, +1 for genuine expression of an interesting feeling, for a total of 0.

comment by Nisan · 2012-01-04T03:54:07.131Z · LW(p) · GW(p)

If that is really how you feel, maybe you should do just that. Find a small egalitarian church or pagan group or occult circle or metal scene or art scene and start guiding people through rituals. After you level up your shaman skills, you can take your most impressionable, emotionally vulnerable acolytes and instill them with the virtues of critical thinking and emotional self-reliance. If you're successful, people will love your rituals celebrating sane decision-making, and you can share the fruits of your labor with the global Less Wrong community.

(I'm assuming there are no LWers in Gothenburg.)

Replies from: Armok_GoB, PrometheanFaun
comment by Armok_GoB · 2012-01-04T14:15:32.647Z · LW(p) · GW(p)

I'd rather die [graphic description of particularly gruesome way to die].

Replies from: wallowinmaya
comment by David Althaus (wallowinmaya) · 2012-06-03T09:03:10.734Z · LW(p) · GW(p)

Why, if I may ask?

comment by PrometheanFaun · 2013-12-25T22:25:24.467Z · LW(p) · GW(p)

Sometimes I will stand and look at the church and wonder if today is the day I get desperate enough to go full sociopath, pretend to join the flock, and use the network to start a deviant christianity offshoot.

comment by atucker · 2011-12-30T18:59:28.001Z · LW(p) · GW(p)

I like the idea of having rituals, if only because they're fun. Primates saying things together is awesome, and primates saying things about things I think are cool together sounds even more awesome.

However, I also agree that it's possible for this to go horribly wrong.

What does it going horribly wrong look like? What failsafes can we put in place? I think it would be helpful to have some "if you're doing this, something went wrong" heuristics so that we can notice ahead of time that something undesirable is happening, and stop it.

For instance, if we're ever sacrificing virgins, I think we messed up somewhere and should probably stop.

Replies from: jimrandomh, philh, FiftyTwo
comment by jimrandomh · 2011-12-31T07:57:59.669Z · LW(p) · GW(p)

I like the idea of having rituals, if only because they're fun. ... However, I also agree that it's possible for this to go horribly wrong. What does it going horribly wrong look like? What failsafes can we put in place? I think it would be helpful to have some "if you're doing this, something went wrong" heuristics so that we can notice ahead of time that something undesirable is happening, and stop it.

Good idea. I don't think the people in this community would actually let things go badly awry, but demonstrating that we've thought about it and having a list of well-thought-out meta-rules should reassure people and head off objections. There should eventually be a formal document somewhere, blessed by symbolic proceedings of some sort.

Here are my proposed safeguards (with catchy summaries), all inspired by things that have actually gone wrong in other communities. Certainly not an exhaustive list of failure modes, but it hits some big ones.

Truth above all. People should never be pressured into saying, singing, or symbolically supporting things that they suspect are wrong or untrue. When someone is speaking, they always have the right to break script to avoid speaking a falsehood, or to insert an important truth. When people sing, chant or speak in unison, they always have the right to stay silent through some or all parts, if necessary to avoid falsehood. Statements about good and bad are falsehoods if they contradict the speaker's values. When people break script or refuse chants this way, it shouldn't be held against them, and they are not obligated to defend their statements or their silence. If a script is broken frequently, it should be changed.

Beliefs can change. While a ceremony can be used to mark your present state of belief, that does not mean you stop updating on evidence, and it does not mean you can't renounce the belief on new evidence or on reanalysis of old evidence. It is not possible to use ceremony to lock a belief in place. If beliefs affirmed in ceremonies are maintained, it should only be because they are still true.

No abuse of power. And no excuses for it.

No mandatory intoxicants. Especially alcohol. It should be easy to refuse anything psychoactive without calling attention to having done so. Any meeting that has a drug or intoxicant as a central theme should be irregularly scheduled, so that attendance is not the default for people who come on a regular schedule.

Reality over symbolism. Do not blur the line between acts with real-world consequences, and acts taken symbolically. The direct consequences of a symbolic act should be small. A stupid act remains stupid, no matter how symbolically appropriate. As a corollary, rituals may not contain significantly dangerous acts (but illusion of danger is fine.) If a symbolic act suddenly takes on consequences - for example, if someone shows up with an allergy to the symbolic food - then reality wins.

Replies from: Raemon
comment by Raemon · 2011-12-31T18:06:48.140Z · LW(p) · GW(p)

This is a good starting point.

My biggest flag is going to be if we find ourselves saying something like "we need to acquire more members" for a purpose OTHER than improving the community (e.g. to get their money, or because we have decided "get more members" is a metric we just arbitrarily care about).

Replies from: TheOtherDave
comment by TheOtherDave · 2011-12-31T18:24:11.152Z · LW(p) · GW(p)

Um?

It seems to me that "get more members" is already a metric some people have decided they care about. A great deal of discussion of what the emotional tone of this site ought to be (including, but not limited to, some discussions about why group rituals are good to have) is predicated on the assumption that getting more people to participate here is good in and of itself.

Another way of saying this is that some of us are allowing our attention to be focused on community for the sake of community, rather than focusing our attention on something external and allowing community to develop (or not) based on shared interest in that external thing.

And I agree with you: this is a big flag indicating a problem exists.

Replies from: Raemon
comment by Raemon · 2012-01-02T16:01:11.331Z · LW(p) · GW(p)

This is a complex issue. It probably deserves its own post, but I'll give it a go here.

Here are a few relevant beliefs which have traction in the community, and which I don't necessarily agree with. Related ideas are vaguely clustered together:

1) The world would be better if people were more rational.
2) People can be taught to be more rational.
3) Less Wrong's brand of rationality is (at least among) the best examples we have.

4) Less Wrong, as a website, benefits from having more quality members posting more (and more varied along certain axes) quality content.
5) There are people on the internet who already have something close to the Less Wrong mindset, but who don't know about Less Wrong.

6) Real life communities (of some sort) are in general valuable for increasing quality of life.
7) Communities are also useful for fighting akrasia, and rationalist communities in particular are valuable for encouraging people to maintain rational practices in their daily lives.
8) Putting together a rationalist community in your area can be hard if you don't know any rationalists there.

9) Acquiring more members online is one method by which to allow real life rationality communities to form, as members from given areas start to reach critical mass. This works especially well for large cities.
10) Less Wrong, both the online and associated meat-space (meet-space?) communities, suffer somewhat from groupthink (and related issues stemming from a narrow target demographic). One way to fix that is to acquire new members with a more diverse range of opinions and interests.
11) Individuals in both the online and meet-space communities benefit from being able to find other individuals with similar interests - partly because they can just be friends, but also so they can begin working together on bigger projects.

I agree with most of these. The only two I take issue with are 2 and 3. There's a range in how much people can be taught to be rational. I know people whose brains are simply wired differently than ours. Maybe they could have been shaped differently during childhood, but I suspect there are fundamental biological differences as well.

I also think there are plenty of smart, left-brained people for whom the Less Wrong brand is not well suited.

The upshot to all of this is that I DO think there are lots of good reasons for us to strive for the following:

1) Less Wrong (the blog) should make an effort to reach people who would particularly benefit from it, or might rapidly self-modify into the sort of person who would particularly benefit from it. But Less Wrong's value as a community also depends on a certain focus and quality of discourse. We'd prefer people who can contribute to that focus, or at least don't detract from it too much.

(I'm a little on the fence about how to target people like me, in particular. I do not have the interest nor particular aptitude to contribute meaningfully to decision-theory posts, although I derive value from them. I've ended up posting about art-related things a lot, partly because I think there are useful, rationality-related things to say about them and partly because they are a heretofore low-supply, medium-demand subject that I'm able to contribute to. I'd like there to be more artist-types on Less Wrong, but if we all continued posting in the volume that I've been recently, it would drown out the more traditional content.)

2) Meet-space rationality communities should be encouraged, and these can have a wider range of members. Meetspace communities need inspirational organizers, a category of person that the blog doesn't need as much, and there's a much greater benefit there if individuals sharing a particular interest (art, politics, programming, etc.) can meet up and start collaborating.

3) In both cases, we should not be trying to convert EVERYONE to our cause, we should be identifying people who would particularly benefit from our community and who we would benefit from including. This is harder to do in a targeted fashion online - you put advertisements on websites that are similar and you get whoever naturally shows up. In meet-space you can find particular people and just invite them.

"Get more members" is a crude metric that doesn't address the nuance of what we want. This is particularly dangerous because we DO want more members, and it's hard to do so with the nuance required to do so safely and productively. So we need to be hovering near the razor-edge that separates hollow-self-perpetuating organizations and actual good quality organizations, and actively remaining on that edge requires a lot of diligence and effort.

I have more thoughts but they're less fully formed.

Replies from: TheOtherDave, arundelo
comment by TheOtherDave · 2012-01-02T16:41:47.243Z · LW(p) · GW(p)

I agree with most of what you say here.

One exception: I disagree that identifying individuals whose association with LW would be mutually beneficial and encouraging that association is particularly hard to do in a targeted fashion online... I can find particular people and invite them to an online community, just as I can to a "meetspace" community. You seem to have jumped from "online" to "advertisements on websites" and I don't really understand why.

Another exception: you say "we need to be hovering near the razor-edge that separates hollow-self-perpetuating organizations and actual good quality organizations, and actively remaining on that edge requires a lot of diligence and effort". I'm not sure we actually do need to be hovering near that edge. It might be OK for us to simply seek to be an actual good quality organization, and not devote much time or attention to self-perpetuation at all.

Regardless... in particular, I agree that it's very easy for "we should behave in ways that cause people we want to be associated with to want to associate with us" to turn into "we should behave in ways that cause people to want to associate with us" -- for nuance to get lost, as you say. Indeed, I think it's happening: a non-negligible amount of recent discussion on related topics seems to me to fall in the latter category.

Replies from: Raemon
comment by Raemon · 2012-01-02T22:29:19.666Z · LW(p) · GW(p)

I was sort of rushing my conclusion and yeah, I agree with your assessment of my assessment. I think.

For the online thing, my brain leapt immediately to "most cost-effective ways to recruit large numbers of people," which wasn't necessary. However, I didn't just mean advertisements. I found Less Wrong through HP:MoR, but I was originally linked to THAT by a public discussion on a community forum. And I would have taken longer to make the transition if there hadn't been additional discussion of Less Wrong itself on that forum.

This is more targeted than advertisements but less targeted than an individual recommendation. I also post particularly good articles on social media, which is in some ways even closer to advertising. I don't think a dedicated effort is required here; I think this kind of word-of-mouth is what would happen naturally.

It might be OK for us to simply seek to be an actual good quality organization, and not devote much time or attention to self-perpetuation at all.

Up until recently this is what I've been in favor of, and I think it's a good default position. (In particular for the online community, with whatever linking people are motivated to do on their own.) But I did just list several advantages of having more people in meetspace, and the NYC group at least has hit the upper limit on how many people it is practical to get together regularly.

There is still more advantage to be had by getting a wider variety of members, but we can't do that unless we make a double-pronged effort: finding larger meeting spaces (which cost money) and ensuring there are enough people that the larger meeting spaces are justified. Either one by itself doesn't really work.

"we should behave in ways that cause people we want to be associated with to want to associate with us" to turn into "we should behave in ways that cause people to want to associate with us"

The additional issue is that there are people that we wouldn't necessarily explicitly want to come join us, but who would be totally willing to come to our blog/club if we weren't doing [X random easily changeable thing that we didn't mean to do or don't care that much about], and who would turn out to be valuable if we weren't doing X thing. And I think that's what a lot of the discussion has been about.

The "We Look Like a Cult" issue is contentious because people disagree on how many people are actually turned off by it, and how easily changed or valuable the characteristics that look cult-like are.

comment by arundelo · 2012-01-02T23:51:22.135Z · LW(p) · GW(p)

I don't know if you're intentionally introducing a new spelling here, but the standard one is "meatspace".

Replies from: Raemon
comment by Raemon · 2012-01-03T00:10:14.749Z · LW(p) · GW(p)

The spelling was intentional. (I initially used both of them to clarify that I knew what I was doing, but perhaps it was still not obvious enough. I'm not sure I consider that a tragedy though)

comment by philh · 2011-12-31T12:33:04.471Z · LW(p) · GW(p)

I think it would be helpful to have some "if you're doing this, something went wrong" heuristics so that we can notice ahead of time that something undesirable is happening, and stop it.

A possible danger with this is that we think, "oh, we can't sacrifice virgins, that's on the List Of Bad Things, let's sacrifice grandmothers instead". The fact that there is a list might encourage us to think the list is exhaustive. (Here "think" means "think/reason subconsciously/act like/whatever".)

Perhaps we could help avoid this with ritual-moderators who examine rituals for signs that things have gone horribly wrong, but explicitly without advising the ritual-smiths in the process of constructing them. If you can't avoid the moderator's List Of Bad Things without being told what's on it, you also can't avoid the list of Bad Things That The Moderator Didn't Think Of.

Replies from: Raemon
comment by Raemon · 2011-12-31T18:05:16.446Z · LW(p) · GW(p)

The Rationalist Confessor is not allowed to partake in ritual.... :)

Replies from: wedrifid, homunq
comment by wedrifid · 2011-12-31T18:12:18.179Z · LW(p) · GW(p)

The Rationalist Confessor is not allowed to partake in ritual.... :)

Which sounds like a ritualistic tradition of its own.

Replies from: Raemon
comment by Raemon · 2011-12-31T18:14:59.774Z · LW(p) · GW(p)

I wouldn't call that "ritual", but I would call it deliberate community building along a similar axis.

comment by homunq · 2012-04-19T15:24:18.567Z · LW(p) · GW(p)

It is probably a good idea to include space for a "devil's advocate" inside the ritual, rather than having them somehow sit out (without ritualistically turning their back, or NOT turning their back, or ritualistically considering the issue and Making their Own Decision? How is this even possible?) or sing "la la la I can't hear this" the entire time.

comment by FiftyTwo · 2011-12-30T19:20:13.546Z · LW(p) · GW(p)

At [prominent UK university] we have a lot of rituals in our societies. The heuristic I use is that traditions should be perpetuated as long as they are enjoyable, but perpetuating tradition should not be seen as a good in itself.

Replies from: None
comment by [deleted] · 2012-01-05T12:11:12.238Z · LW(p) · GW(p)

What if a tradition or ritual isn't fun but serves some other purpose that is adaptive to the institution in question?

comment by Solvent · 2011-12-30T09:53:51.292Z · LW(p) · GW(p)

I mostly agree. However, I think that people looking at us funny is a far bigger problem than any rationality impediment rituals may cause.

I think that LessWrong is a bit too young a community to sensibly start having this kind of ritual. It's been around for like, three years at most? A bit more if you include the OB days? We're currently way smaller than we presumably hope to become.

So I think that people coming together and singing songs, while I agree with you about their risks and benefits, is something that we should emphasize more after LW has been around for longer and has grown more.

Replies from: Raemon
comment by Raemon · 2011-12-30T16:03:52.009Z · LW(p) · GW(p)

This statement sounds sensible at first glance. But I don't think it actually is supported by anything.

Crafting rituals from scratch is hard (the next article is about that), and it's easier to craft rituals if you have more history and source material to work with. But I think Less Wrong already has an array of powerful, interesting ideas to support songs and stories. I don't think there's any special amount of time you need to wait before you're suddenly allowed to have ceremonies.

Was there another reason why writing songs and stuff should wait till later?

Replies from: JoachimSchipper, Craig_Heldreth, Spurlock
comment by JoachimSchipper · 2011-12-30T17:41:23.790Z · LW(p) · GW(p)

A possible argument: rituals will help build a cohesive tribe, but harm recruiting efforts. LW is more valuable for its articles than for its community, so tribe-building is not that valuable. But there are lots of people who are not yet aware of LW's ideas/community who would be turned off by cultishness.

(I don't feel qualified to have a position on this issue myself.)

Replies from: Vaniver
comment by Vaniver · 2011-12-30T18:47:01.842Z · LW(p) · GW(p)

A possible argument: rituals will help build a cohesive tribe, but harm recruiting efforts. LW is more valuable for its articles than for its community, so tribe-building is not that valuable.

Cohesiveness generally helps recruiting more than it hurts it, so I find this unlikely.

comment by Craig_Heldreth · 2011-12-31T00:48:56.197Z · LW(p) · GW(p)

I am reminded of an essay by the anthropologist Edmund Leach, 'Once a Knight is Quite Enough' (p. 194ff in The Essential Edmund Leach, Volume I, Yale University Press, 2000), where he details the parallels between his initiation into British knighthood by Queen Elizabeth II and a Borneo headhunter ceremony which he saw at the end of WWII. Headhunting was illegal at that time in Sarawak, but they got special permission as the two victims were Japanese soldiers. Anyway, the idea was that if you watched a silent movie of the two ceremonies and ignored the costumes, the two rituals would be nearly indistinguishable. He also mentioned that the Sarawak ceremonial grounds were laid out like a typical English village church.

Here is a link to the headhunter ritual map from Google Books.

comment by Spurlock · 2011-12-30T18:23:29.898Z · LW(p) · GW(p)

Less Wrong already has an array of powerful, interesting ideas to support songs and stories

Funny you should mention that.

comment by AShepard · 2011-12-30T16:07:12.770Z · LW(p) · GW(p)

Off-topic: in a number of places where you've used italics, the spaces separating the italicized words from the rest of the text seem to have been lost (e.g. "helpedanyone at all.") Might just be me though?

Replies from: AShepard, army1987, Raemon
comment by AShepard · 2011-12-31T20:47:08.482Z · LW(p) · GW(p)

Addendum: This is apparently a known issue with the LW website.

comment by A1987dM (army1987) · 2011-12-30T17:11:17.819Z · LW(p) · GW(p)

I couldn't see the spaces before logging in but now I can see them. What gives?

comment by Raemon · 2011-12-30T16:15:10.367Z · LW(p) · GW(p)

Bleah. I fixed a bunch of those, thought I had got them all. Thanks.

Replies from: philh
comment by philh · 2011-12-31T12:06:07.915Z · LW(p) · GW(p)

There's a bunch left. Possibly incomplete list:

Iwantit OrCarol includeWord andit's aboutthe alreadybe anice Ritualdoesmake evenwithoutall time.Constantly work,and requireseffortbeyond weare riskat encouragethat

Replies from: Raemon
comment by Raemon · 2011-12-31T18:15:51.026Z · LW(p) · GW(p)

Thanks. Fixed these ones.

comment by falenas108 · 2011-12-30T15:15:38.435Z · LW(p) · GW(p)

Another thing is that allowing this greatly hinders the "show me you're not a cult" argument for someone coming to this site. Yes, having these rituals doesn't actually make organizations more likely to be cults, but most people coming to this site for the first time won't be thinking about that.

Replies from: Raemon, undermind
comment by Raemon · 2011-12-30T16:18:44.706Z · LW(p) · GW(p)

Definitely a concern. But I'm actually less worried about it than I would have been a year or so ago.

People at the NYC group, despite trying not to, almost continually end up introducing new people by saying "Oh we're totally not a cult!" the moment something slightly weird happens. And people... seem to uniformly laugh, perhaps slightly nervously, but keep coming to meetings, and then a few months later they're talking about the Singularity and passing the chocolate and maybe even wearing Vibrams.

At work, when I first mentioned "I'm going to a Rationality group," the assumption was that I was going to an Ayn Randian meetup for young 20-somethings who've discovered Rationality™ and now like to talk obnoxiously about their political opinions but will grow out of it in another few years. (I've discovered that saying "Rationality Group" almost always makes people assume you are part of a political group that is the opposite of whatever the listener believes in.)

For a while, I felt like I had to treat the NYC meetup as a joke that I was self-aware of. But in the past month I've been talking more seriously at work about optimal philanthropy and commitment contracts and a few other things, and this has coincided with me also saying "So I'm planning this Lovecraftian Solstice Ceremony...."

And oddly enough, this didn't make anyone freak out. Some people in the office assumed I was throwing a harmless party. But the ones I'd have pegged as our target demographic were interested enough to ask follow-up questions ("Wait, seriously? Your rationalist group is throwing a Cthulhu party? What?"), and then I explained the story of Stonehenge and how it tied into the world being dark and terrible, but how humanity may be able to overcome that. And they responded "Oh, huh. That's actually kinda cool."

I don't expect either of them to attend meetups anytime soon because they already have pretty full lives, but their respect for what I was doing grew over the past month rather than decreased.

Frankly, the "are you a cult?" phenomenon may be a feature, not a bug. It grabs attention, and then the people who were actually going to be interested in the first place tend to stick around in spite of it.

comment by undermind · 2011-12-31T21:22:38.137Z · LW(p) · GW(p)

I think that worrying about being a cult (as distinct from worrying about being seen as a cult) is a pretty good indication that thus far, LessWrong is not a cult. Yes, Raemon's use of ritual does push us slightly closer to cult territory, but not enough that I see any reasonable grounds for concern, given the benefits. Actual cults (more accurately, groups with a very high number of cult-indicators doing things that are demonstrably harmful) may worry about being seen as cults, but are probably protected by a bias shield or similar effect from seeing any problems with their own cultishness, and are certainly not likely to start an open discussion about it. This is very strong evidence that LessWrong is not a dangerous group of people propagating one ideology above all others and suppressing dissent.

In summary, I think this whole "cult/not cult" business is silly, and a disguised query for nothing. Yes, by developing rituals, LessWrong now has one more cult indicator, but it doesn't have any particularly bad indicators, and as such is pretty okay.

Replies from: None, falenas108
comment by [deleted] · 2012-01-05T12:15:42.762Z · LW(p) · GW(p)

I think that worrying about being a cult (as distinct from worrying about being seen as a cult) is a pretty good indication that thus far, LessWrong is not a cult.

Ayn Rand's group of proto-Objectivist friends and acquaintances, despite a strong theoretical dedication to individualism, jokingly called their group "The Collective".

comment by falenas108 · 2012-01-01T16:09:36.949Z · LW(p) · GW(p)

Agreed. I wasn't saying that we are a cult, just that this makes us seem like more of a cult to outsiders.

comment by [deleted] · 2012-01-08T18:45:20.651Z · LW(p) · GW(p)

Emotional salience is a very important aspect.

When I first read about the Singularity about five years ago, it turned my worldview around completely - for a short time, until the reality shock faded and I returned to normal reality. The Singularity just doesn't feel "real" anymore.

I started reading LW some months ago, and already many of the things that made a strong impression on me in the beginning don't feel real anymore.

I cannot be a rationalist while being emotionally stuck in the Unthinking Deeps.

comment by MaoShan · 2012-02-01T03:50:15.770Z · LW(p) · GW(p)

One way I think that rituals and folklore gain their power is by personifying an abstract idea, giving it something for a social creature to fixate on. This can be done for the Singularity by giving it a name. This may sound strange, but it's not that far from reality to begin with. The Singularity is an event, that is true, but I also link it in my mind to what comes from that event. That is, FAI or uFAI.

Most general conceptions of an AI include it having a personality, or at least the ability to communicate with humans. There are lots of shared characteristics exclusive to either FAI or uFAI, allowing us to imagine this future personality regardless of the engineering details. Most who are willing to accept the idea of either type of AI resulting from the Singularity should see a friend in one, an enemy in the other. History has shown that many people are willing to worship and fear ultrapowerful beings who don't actually exist; it doesn't seem too far a leap to worship and fear ultrapowerful beings who just don't happen to exist yet.

Evolution, nature, the universe, and statistics can have qualitatively positive or negative personal consequences, but calling them God and Devil gives them focus. They have their own nebulous sets of characteristics, and when enough of those are present, behold, the Devil did it, or behold, God's mercy hath done this. Is God a computer? No. Is the devil a hyperintelligent self-aware program? No.

We have two new gods in need of names.

Oh, I meant, this is a possibility of the misuse of ritual. Yeah.

comment by wedrifid · 2011-12-31T02:38:44.230Z · LW(p) · GW(p)

If we feel an urge to hit the enemy tribesman with a huge rock and take their land, we can and should say “No, there are complex game theoretical reasons why this is a bad idea” and suppress the urge.

Paper beats rock! (Whether you are playing hand games or calling your solicitor!)

Replies from: DSimon
comment by DSimon · 2012-01-08T01:05:10.946Z · LW(p) · GW(p)

So what do scissors represent here?

comment by Will_Newsome · 2011-12-30T22:16:18.246Z · LW(p) · GW(p)

Perhaps null hypothesis testing is a good ritual example that highlights both the upsides and downsides: http://educationgroup.mit.edu/HHMIEducationGroup/wp-content/uploads/2011/04/12-Gigerenzer-etal-2004.pdf .
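For concreteness, here's a minimal sketch of the "null ritual" the paper describes: set up a nil hypothesis, apply the fixed 5% significance level, and report the verdict, all without asking about effect size, power, or prior plausibility. The data are made up and purely illustrative, and I'm assuming scipy is available:

```python
# A sketch of Gigerenzer's "null ritual": test a nil hypothesis at the
# fixed 5% level and report the verdict, mechanically.
from scipy import stats

group_a = [2.1, 2.5, 2.3, 2.8, 2.6]  # made-up illustrative data
group_b = [2.9, 3.1, 2.7, 3.3, 3.0]

t_stat, p_value = stats.ttest_ind(group_a, group_b)

# The ritual step: one fixed threshold, applied without regard to
# effect size, power, or the plausibility of the hypotheses.
if p_value < 0.05:
    print(f"p = {p_value:.3f}: 'significant', reject the nil hypothesis")
else:
    print(f"p = {p_value:.3f}: 'not significant', fail to reject")
```

The upside and the downside are the same thing: the procedure runs to completion without requiring any statistical judgment from the person performing it.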

Replies from: None
comment by [deleted] · 2011-12-31T22:30:12.652Z · LW(p) · GW(p)

Excellent link! This probably deserves its own discussion post.

comment by JRMayne · 2011-12-31T01:09:14.367Z · LW(p) · GW(p)

If we feel an urge to hit the enemy tribesman with a huge rock and take their land, we can and should say “No, there are complex game theoretical reasons why this is a bad idea” and suppress the urge.

I may be misreading this, but I don't see it that way. There aren't complex reasons not to do that; there are relatively simple reasons not to kill people and take their stuff. The phrase sounds, to me, like, "Something bad may happen to me by engaging in this warlike behavior," but I think this is wrong both practically and normatively. Practically, whomping people has been successful for those with superior whomping power. Normatively, it's a utilitarian net loss to whomp people and take their stuff.

It's surely possible that I've misread this in some important way.

Replies from: Raemon, None
comment by Raemon · 2011-12-31T09:08:14.275Z · LW(p) · GW(p)

Practically, whomping people has been successful for those with superior whomping power. Normatively, it's a utilitarian net loss to whomp people and take their stuff.

I don't think we're disagreeing on anything important. "Normatively, it's a utilitarian net loss to X" seems relatively complex to me, but the statement wasn't hinging on how complicated the reason was.

comment by [deleted] · 2012-01-05T12:12:26.590Z · LW(p) · GW(p)

Normatively, it's a utilitarian net loss to whomp people and take their stuff.

Not really, if you are better at using their stuff.

Replies from: homunq
comment by homunq · 2012-04-19T15:26:57.955Z · LW(p) · GW(p)

By more than a whomp-worth. Or two whomp-worths, if they whomp back. Or maybe more, if there are multiple retaliations.

comment by [deleted] · 2012-01-05T11:50:00.015Z · LW(p) · GW(p)

There are smart cynics who will be turned off, but there are also smart idealists who will be drawn to recognizable human emotional arcs.

Quick question: When it comes to instrumental and epistemic rationality, which of these two groups on average has better performance?

Replies from: pjeby, Raemon
comment by pjeby · 2012-01-05T14:59:03.946Z · LW(p) · GW(p)

When it comes to instrumental and epistemic rationality, which of these two groups on average has better performance?

I would expect the cynics to do better epistemically, and the idealists to do better instrumentally. That's what current science tells us to expect, anyway. (i.e., the pessimistic cynics will have more accurate beliefs, but the optimistic idealists will actually accomplish more of their goals.)

Replies from: army1987
comment by A1987dM (army1987) · 2012-01-05T18:26:18.847Z · LW(p) · GW(p)

That's what current science tells us to expect.

[citation needed] (but without the connotation of “I don't believe that” -- more like “I'd like to read more about that”).

Replies from: pjeby
comment by pjeby · 2012-01-08T01:55:43.760Z · LW(p) · GW(p)

See Seligman's book, "Learned Optimism". Lots of references there about realism of pessimists vs. success of optimists.

comment by Raemon · 2012-01-05T14:46:13.770Z · LW(p) · GW(p)

Quick answer: I don't know.

comment by [deleted] · 2012-01-05T07:59:48.160Z · LW(p) · GW(p)

(I draw a distinction between tribal identity and ideology - tribe is about identifying with a group of people, ideology is identifying with a belief-structure).

I can't think of an ideology that doesn't also create tribalism. I can think of some kinds of tribalism that don't create ideology (except if you take the implicit pan-human norm of in-group vs. out-group and self-interest as an implicit ideology).

Replies from: Raemon
comment by Raemon · 2012-01-05T14:48:15.488Z · LW(p) · GW(p)

That's probably a fair assessment. The two phenomena are absolutely related (I mean, I'm deliberately working with them both for a reason) but I don't think they're completely identical and I think it's useful to be able to talk about how the individual effects interact.

comment by Nisan · 2012-01-04T05:34:36.598Z · LW(p) · GW(p)

I had the unexpected privilege of taking part in a ritual on New Year's Eve. We went into the back yard, sat in a close circle around a fire, and chanted in Sanskrit.

We chanted each mantra 108 times while the host counted on a string of beads. After the first dozen repetitions the mantra becomes automatic and you zone out. Staring at the fire has the same mesmerizing effect. As a result I kept an empty mind for half an hour, and that always feels healthy.

We also threw spices into the fire, which was supposed to symbolize something, but it didn't do much for me.

After the chanting we brightened the atmosphere by singing a song. Then we went around the circle, sharing our hopes and goals for the coming year. I think if we knew each other better we would have shared more.

Replies from: TheOtherDave, wedrifid, Raemon
comment by TheOtherDave · 2012-01-04T14:27:08.354Z · LW(p) · GW(p)

It's also likely that, had you shared more, you'd have known each other better.

comment by wedrifid · 2012-01-04T06:54:32.230Z · LW(p) · GW(p)

We also threw spices into the fire, which was supposed to symbolize something, but it didn't do much for me.

Wrong spice. ;)

comment by Raemon · 2012-01-04T05:49:02.895Z · LW(p) · GW(p)

I have been meaning to learn to meditate. I don't think it'd end up working on a large scale but I'd be interested in smaller scale stuff (presumably similar to what you describe).

I did want to include some kind of goal-setting ritual with the Solstice - it would have tied in well - but it would have drained focus in unpredictable ways. I may try for it next year if I can figure out a good way to do so.