Ritual Report: Schelling Day

post by ModusPonies · 2013-04-17T03:46:53.809Z · LW · GW · Legacy · 99 comments


On Sunday, April 14th, the Boston group held our first Schelling Day celebration. The idea was to open up and share our private selves. It was a rousing success.

That doesn't do it justice. Let me try again.

By all the stars, you guys. This was beautiful.

About fifteen people showed up. Most of us were from the hard core of Boston's rationalist community. Two of us were new to the group. (I'm hopeful this will convince them to start attending our regular meetups.) There was a brief explanation and a few vital clarifying questions before we began the ritual, which went for maybe 90-120 minutes, including a couple of short breaks. All of us spoke at least once.

I don't want to go into specifics about what people said, but it was powerful. I learned about sides of my friends I would never have guessed at. People went into depth about issues I had only seen from the surface. I heard things that will make me change my behavior towards my friends. I saw angst and guilt and hope and pain and wild joy. I saw compassion and uncertainty and courage. People said things they had never said before, things I might not have been brave enough even to think in their position. I had tears in my eyes more than once.

Speaking went remarkably smoothly. I set a timer for five minutes for each speaker, but it never ran out. (Five minutes is a surprisingly long time.) Partway through, Julia suggested we leave a long moment of silence between speakers, which was a very good idea and I wish I'd done a better job of enforcing it.

Afterwards, we had a potluck and mingled in small groups. At first we talked about our revelations, but over time our conversation started drifting towards our usual topics. Next time, in order to keep us on topic, I'll probably try adding more structure to this stage.

The other area I wanted to improve was the ritual with the snacks. We had five categories: Struggles, Confessions, Hopes, Joys, and Other. There weren't many Hopes, and there wasn't much distinction between Struggles and Confessions. I'll change this for next time, possibly to Hardships, Joys, Histories, and Other. There's room for improvement in the specific snacks I picked, too.

This celebration was the most powerful thing I've experienced since the Solstice megameetup. I don't think I want to do this again soon—it was one of the most exhausting things I've ever done, even if I didn't notice until after I'd left—but I know I want to do it again sometime.

To everyone who came: I'm so proud of what you did and who you are. Thank you for your courage and sincerity.

99 comments

Comments sorted by top scores.

comment by Raemon · 2013-04-18T23:59:10.481Z · LW(p) · GW(p)

(In the interest of attempting to counterbalance the peer pressure here, I would like to officially solicit feedback [perhaps privately, or perhaps ModusPonies may want to set up an anonymous comment box] from people who attended the event who had criticisms, or who just were not quite as awed as the people who've commented so far.)

This is not in any way intended to imply there was a problem worth examining. But one of the legitimate criticisms of ritual is that it creates something difficult to criticize, and I think all rationalist rituals should come with a built-in time (a few days afterwards) to evaluate the event and try to counterbalance the social pressure to conform.

Replies from: ModusPonies, jkaufman
comment by ModusPonies · 2013-04-19T00:38:30.356Z · LW(p) · GW(p)

I agree completely. Everyone, feel free to leave me anonymous feedback.* Let me emphasize that, if anyone had a bad experience, I really want to know.

*This address is not just for Schelling Day. Anyone is welcome to give me any sort of feedback.

comment by jefftk (jkaufman) · 2013-04-23T19:06:10.840Z · LW(p) · GW(p)

just were not quite as awed as the people who've commented so far

Positive:

  • I feel closer to the people who were there than I did before.
  • Some people said things that felt like missing puzzle pieces, explaining something about them that had always seemed strange to me.

Negative:

  • There were parts where I was bored.
  • People coming in in the middle, especially while someone was speaking, was disruptive.
  • The "eating the combined food" was a little awkward, as people had preferences and didn't really like the combinations.

Overall I'm glad I went. The negatives were minor compared to the positives.

comment by TheOtherDave · 2013-04-21T18:39:34.353Z · LW(p) · GW(p)

The group is focused on a living leader to whom members seem to display excessively zealous, unquestioning commitment.

If you count karma penalties as "punishment" as you do later, this adds up to the claim that LW both displays unquestioning commitment to and routinely punishes (~150 times in the last 30 days) its leader. I suspect that's an unusual behavior pattern for a cult; I wonder if there's any useful conclusion we could draw from it.

Members' subservience to the group causes them to cut ties with family and friends, and to give up personal goals and activities that were of interest before joining the group. (not sure about this)

At the level of analysis you're doing here, you could probably force anti-akrasia techniques into this mold. That is, someone spends all their time playing video games and then gets caught up in all of the productivity/mindhacking stuff that is popular on LW and then gives up playing video games! Eek!

Members are encouraged or required to live and/or socialize only with other group members. (not sure about the 'only' part)

Well, without the 'only' part, you get "Members are encouraged or required to live and/or socialize with other group members," which is certainly true of LW. Again: eek!

Agreed that semantic discussions are rarely productive, and that the important thing is to honestly evaluate these various potentially harmful conditions and determine whether they apply... and, insofar as they do, attend to them and work out how to mitigate whatever harm they potentially cause.

comment by Viliam_Bur · 2013-04-22T20:21:25.571Z · LW(p) · GW(p)

In real life you sometimes get people who write, using different words, under a dozen different articles: "I suspect that this all is just Eliezer's cult designed to extract money from naive people". How much of that is acceptable criticism, and how much is just annoying? Discussing that thing once, thoroughly? Yes, definitely. Dropping the idea around all the time? That's just polluting the space. The problem is that by the time some people are already deeply annoyed, other people go meta and say we need criticism.

Democracy does not work well online. In real life, one person cannot be at more than one place. Online, one person is enough to be everywhere (within one website). In real life, you can avoid an annoying person by simply going elsewhere and taking your friends with you. Online, you must somehow stop people from doing annoying things, otherwise there is no way to avoid them, except by avoiding the whole website.

I don't have a problem with criticism. I have a problem with boring repetitive criticism. Someone says that having a ceremony is cultish. Okay. Let's discuss that. Someone says again that having a ceremony is cultish. Okay, here is some explanation, here are the differences. Someone says yet again that having a ceremony is cultish. Okay, I heard that already; give me new information or stop repeating yourself. -- I would have no problem if someone wrote a critical article explaining the dangers of having a ceremony even in its LessWrongian variant, and proposed alternative ways to create personal connections. But dropping hostile comments into other people's articles is so much easier. Well, I am not impressed.

People who try hard to appear smart typically have a problem cooperating on anything. For a textbook example, visit Mensa. It is a miracle that Mensa ever gets anything done, because every time anyone proposes an idea, all people loudly disagree, to signal that they are not sheep. Okay, I get it, they are not sheep; but it is still funny how an organization consisting purely of highly intelligent people became such a laughing stock for the rest of the world. Probably they were too busy signalling that they are not sheep, so they missed the forest for the trees.

There is a time to disagree, and there is also a time to agree. If someone has a policy of e.g. never singing a song together with other people (because that might irrationally modify their feelings towards them), I accept if they decide to never sing a song together with fellow rationalists. Yes, they are consistent. I respect that. But if someone is willing to sing a song with random strangers, but would never sing a song with rationalists, that would be reversing stupidity. It means sabotaging yourself and your goals; trying to get some nonexistent points for doing things the hard way.

The proper moment for criticism is when something worth criticising happens. Not when someone merely pattern-matches something to something, and cannot stop obsessing about that. Here is a group of people who all voluntarily decided to share some powerful emotional moments together. Did they commit suicide later? No! Did they donate all their money to Eliezer? No! Did they send disconnection letters to their relatives? No! Did they refuse to talk with their friends who didn't participate in the ritual? No! Did they kill someone or send death threats? No! Are they reduced to mindless zombies? No! Are they keeping the details secret from the rest of the world, threatening to punish whistleblowers? No! -- So why the hell does someone keep repeating that it essentially is the same thing; and why should I pay any attention at all to that kind of criticism?

Replies from: TheOtherDave
comment by TheOtherDave · 2013-04-22T20:44:29.974Z · LW(p) · GW(p)

Perhaps the thing to do is write a single post capturing essentially this argument, and additionally maintain in that post a list of topics which have come up so often in comments that "we" (whatever "we" means in this context) have decided they've passed the "stop making this point in isolated comments!" threshold, and encourage the community standard of responding to Yet Another Instance of Discussion X with some variant of "Discussions of this topic belong here; see topic #17" rather than repeating the same substantive discussion over and over.

It won't reduce the total noise, but it might keep it localized on a single thread.

comment by Luke_A_Somers · 2013-04-17T12:11:48.667Z · LW(p) · GW(p)

Wait, the 14th? Oh crud. I... I meant to be there but I remembered it as 'my second weekend in Boston' instead of an absolute date. So when my arrival was delayed by a week...

Oops.

Replies from: Matt_Simpson
comment by Matt_Simpson · 2013-04-17T16:31:19.833Z · LW(p) · GW(p)

This is simultaneously hilarious and weak evidence that the holiday isn't working as intended (though I think repeating the holiday every year will do the trick).

Replies from: Luke_A_Somers
comment by Luke_A_Somers · 2013-04-17T19:19:08.806Z · LW(p) · GW(p)

It has a lot more to do with how crazy my schedule has been lately.

And what the heck people, upvoting that to +4?

HEAD SCRATCH

Replies from: Matt_Simpson
comment by Matt_Simpson · 2013-04-18T06:36:45.860Z · LW(p) · GW(p)

It's a Schelling point, er, joke isn't the right word, but it's funny because the day was supposed to be a Schelling point. And you forgot about it.

comment by gwern · 2013-04-22T00:26:08.524Z · LW(p) · GW(p)

Anyway, I think that mentioning this on RationalWiki is appropriate as a public service to other readers. Do you disagree with that?

I do. Provoking people only to get it as fodder to use against them is the epitome of 'gotcha'; it is dishonest, misleading, poisons discussion, and you should be ashamed of yourself for doing this.

You should also read up on the modern sociological literature of 'cults', because from your comments earlier, you seem to be going on some vague popular pejorative conception of 'cults', while they've deprecated the term entirely based largely on grounds reflected in some of the comments made to you, that 'cults' turn out to meet needs of their participants, do useful things, mirror established organizations closely, have high attrition rates and fail at anything which could be described as 'brainwashing', and in general, there is no apparent substantive content to the term 'cult' beyond indicating 'the speaker dislikes a particular group'.

Replies from: None
comment by [deleted] · 2013-04-22T10:01:26.022Z · LW(p) · GW(p)

Of course, the RationalWiki version of reality doesn't mention any of this; he repeats the "relatively well received" lie, and of course Dmytry chimes in with his usual litany of abuse.

At some point the RW talk page will look so ridiculous that trolls threatening to report there "as a public service to other readers" will cease being a credible threat. (Their moderators are even less effective than ours, after all.) Perhaps that time has already come?

Replies from: gwern
comment by gwern · 2013-04-22T16:39:12.532Z · LW(p) · GW(p)

Perhaps that time has already come?

I doubt it. Do you see any non-LWers linking to RW and saying 'man, what are these guys on?'

comment by Vaniver · 2013-04-21T18:30:53.127Z · LW(p) · GW(p)

Anyway, the point is not to argue whether your group fits the details of some definition of the word "cult" (discussions about the semantics of a word tend to be intellectually unproductive). The point is why it is generally considered harmful to be part of a cult and whether these reasons apply to your group as well.

I'm much happier to have the discussion on that level, but I think the primary argument you've put forward is the definition. If the definition isn't important enough to contest, it's not important enough to rest your argument on, and so you need to identify the causal mechanisms by which these behaviors are harmful.

The elements on the list that pertain to these rituals are "Use X to Y": "use chants to suppress doubt" and "use confessions to control by guilt." You notice the chants and confessions and are worried, but it's the Y that makes cults harmful. If there's a group activity that includes confessions as a guilt-reducing measure, then that should be evidence against cultishness and harm, but it's not clear that you would see it that way. Notice the "if": I don't think the ritual as practiced is designed to reduce guilt for all participants, though it may do so for some participants.

Saying "I would be worried about sharing private information in a setting like this, and I think others should be worried as well" is a valuable contribution to the discussion, but "This smells like cultishness" is not. (In particular, it seems wise to include a warning that, while the group is composed of friends, three can keep a secret only if two are dead, in order to discourage people from sharing things it would be unwise to share even semi-privately.)

promote unearned loyalty

What would earned loyalty look like?

Replies from: Viliam_Bur
comment by Viliam_Bur · 2013-04-22T12:07:25.737Z · LW(p) · GW(p)

Also there is a huge difference whether the information sharing is voluntary or involuntary.

(Where "involuntary" includes also stuff like: "you have a right to remain silent, but then you will burn in hell for eternity, mwahaha!" or "we are not going to pressure you into telling anything, this is just a friendly talk, am I right? but your unwillingness to cooperate could reflect negatively on your assessment report, so why don't you think about it again and then tell us your decision".)

Without pressure, sharing information in group is just sharing information on a group scale.

comment by Vaniver · 2013-04-22T00:03:50.385Z · LW(p) · GW(p)

You're welcome!

I do try to think in terms of fallacies, and I think that warning signs are indeed important heuristics, though they can be spurious. They should make you update your beliefs to some extent.

At the very least, when you notice them you should take a formal outside-view perspective, to compare against your inside-view perspective. You can often learn a bit about how to present things this way.

What do you mean by causal models in this context?

I think that the discussion would have gone more productively if you had narrowed your original comment to the feature of the ritual that worried you, the effect it could have, and how that feature could reasonably lead to that effect. Then, people can focus on the individual components of the specific worry, rather than the amorphous charge of cultishness. Even if you only noticed the danger because it sounded your cult alarm, you don't need to use that as part of your explanation of why you think it's dangerous.

Indeed, if you can't come up with an independent reason for why it's dangerous, that's moderate evidence that it's not dangerous, but you could still say something like "I'm worried about group confessions as a component of this ritual; what could go wrong?", which will get the contrarians to do your imagining for you.

Even if in the majority of these meetings all or most members are long-time friends, there can be concerns about sharing very personal information.

I think these concerns are worth informing attendees about ("hey, remember, there's no oath of secrecy here, but also please don't spread stories without permission"), but because attendees can choose to share whatever they like, there's no element of coercion to be worried about. (There might be a reciprocity concern, but that seems minimal and could be ameliorated with the addition of a targeted "X is verboten" rule.)

comment by Morendil · 2013-04-21T18:50:20.110Z · LW(p) · GW(p)

For fun, try applying the above list to a "group" such as Yahoo!, Google, Apple, etc. You might come to the surprising conclusion that most hi-tech businesses are actually cults. (What went wrong?)

Replies from: gwern, V_V, PrawnOfFate
comment by gwern · 2013-04-22T18:24:54.808Z · LW(p) · GW(p)

One of my favorite versions of this: is the National Institutes of Health a cult? The answer may surprise you!

comment by V_V · 2013-04-22T14:47:19.382Z · LW(p) · GW(p)

Actually, other than the preoccupation with making money obviously, I don't see much in common.

Notably, legitimate for-profit companies pay their employees for their work, they don't solicit donations or unpaid work for the cause. In fact, they don't require their employees to believe in a greater cause, or that their CEO is some sort of super-human being, or that their group is better than everybody else, and all the stuff that cults are about.

Sure, just like any human organization, companies can develop a culture of groupthink, ingroup-outgroup bias and excessive reverence towards authority. Functional companies recognize this as a problem and take steps to mitigate it. Cults, on the other hand, encourage it.

comment by PrawnOfFate · 2013-04-21T19:49:21.949Z · LW(p) · GW(p)

You might come to the surprising conclusion that most hi-tech businesses are actually cults. (What went wrong?)

And that's a reductio? Or maybe not-actual-cult organisations exploit foibles that everyone has, in a comprehensive way, and other social organisations do so in a lesser way. Maybe it's a spectrum. Glass half-full, glass half-empty.

For my money, a successful rationalist organisation should be right up at the zero end of the scale. I don't think anyone has ever done this. I think it's an interesting idea to design it. I'm pretty sure EY has zero interest. He thinks he is succeeding when people become deconverted from formal religion, and doesn't check that they have become reconverted to lesswrongianity. I don't think that is evil on his part. I think most cult leaders slip into it. If someone wants to design a rationalist organisation that is free from all the group-level, sociological forces towards irrationality, they are going to have to study some (yech!) social psychology... I know, soft sciences!

Edited for clarity

Replies from: Viliam_Bur, Vaniver, hairyfigment, Morendil
comment by Viliam_Bur · 2013-04-22T12:53:04.782Z · LW(p) · GW(p)

a successful rationalist organisation should be right up at the zero end of the scale

Because everyone knows that reversed stupidity is the best form of rationality ever.

Here are some guidelines for the new ultra-rational community to follow:

  • Don't have any leadership. If someone tries to organize something, make sure you criticize them loudly and question their motives, until they crawl away crying.
  • Prevent new people from joining you.
  • Money making or any success in real life should be considered shameful.
  • Emphasise that there is no truth, no reality, ever. You are an intolerant bigot if you think that 2+2=4.
  • You shall never: sing, dance, read poetry, give someone a cookie, smile, etc.
  • You should invest a lot of energy into offending your group members, and especially anyone who tries to do something admirable.
  • You should never feel guilty for being an asshole to other members of your group.
  • Preferably, you should not even speak with other group members. Or meet them.

If you break any of these rules, I can give you a hyperlink to a cultish behavior.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2013-04-23T03:13:36.711Z · LW(p) · GW(p)

Nice response so I'm keeping it, but killthread beyond this point, or take it to the LW uncensored thread on Reddit. Attention equals reward for trolls.

Discussion of this action is also meta and should also go to the uncensored thread (posting a link there will be tolerated so long as the body text is not offensive).

comment by Vaniver · 2013-04-21T19:56:41.448Z · LW(p) · GW(p)

He thinks he is succeeding when people become deconverted from formal religion, and doesn't check that they have become reconverted to lesswrongianity.

Really? I get much more of the "have a deliberately built, internally consistent, and concordant with reality worldview" vibe from EY and LW than I do from most of the new atheist movement.

If someone wants to design a rationalist organistation that is free from all the group-level, sociological forces towards irrationality, they are going to have to study some (yech!) social psyhchoogy...I know, soft sciences!

Have you read the Death Spirals and the Cult Attractor sequence?

Replies from: PrawnOfFate
comment by PrawnOfFate · 2013-04-22T12:45:16.283Z · LW(p) · GW(p)

Really? I get much more of the "have a deliberately built, internally consistent, and concordant with reality worldview" vibe from EY and LW than I do from most of the new atheist movement.

I don't see what you are getting at. Are you saying the psychological basis of a belief (groupthink versus rational appraisal) doesn't matter, so long as the belief is correct?

comment by hairyfigment · 2013-04-22T03:33:21.802Z · LW(p) · GW(p)

If you tell me that all successful organizations do X, and then advise me not to do X, I'll start to doubt that you have my best interests at heart. At least if I can think of several defunct clubs/ideas from my own experience that did not do X (which I think I can).

Replies from: PrawnOfFate
comment by PrawnOfFate · 2013-04-22T09:20:40.467Z · LW(p) · GW(p)

Successful at what? There isn't an organisation on the planet that's successful at Overcoming Bias.

Edit: Oh, and the original comment (examining whether LW meets the criteria for culthood) has been deleted. Hmm....

comment by Morendil · 2013-04-21T20:51:09.286Z · LW(p) · GW(p)

And that's a reductio?

Insofar as many corporations would check more items from that list than I suspect the Boston LW group would, yes.

Insofar as many of the items are vague enough to apply to any social group that elicits loyalty from its members, yes.

One problem is relative terms like "excessive" in "excessively zealous, unquestioning commitment". What observations, precisely, count as indications of "excessive" behavior in this regard?

Or "preoccupied with making money" - well who isn't? Again, what's a cult-indicative level of preoccupation? It's going to be hard to beat, e.g. the startup community in terms of being obsessed with money, so this indicator totally fails to discriminate cults in any useful manner. If you said "cults assert and enforce an exclusive and all-encompassing claim to members' or prospective members' income and wealth", that would be more diagnostic. (But then you couldn't arbitrarily designate any group you didn't like as being a cult. Oh well.)

comment by RolfAndreassen · 2013-04-22T00:11:18.608Z · LW(p) · GW(p)

So we just ran this ritual at the Cincinnati meetup. We had nine participants, and went three rounds; everybody spoke at least once, and some three times. It was clearly possible to continue with more rounds, but I think we were roughly at the limit of our attention span; more would have been counterproductive. In accordance with the insight about confessions and struggles being much the same thing, we combined these into one category, which seemed to work well. We had a mix of all four categories, with some being, as our resident theorist put it, superpositions.

Everyone seemed pleased with the results; it was even suggested that we might run the ritual more often than once a year. In truth, any organised activity that leads to someone shouting, in full seriousness, "Sweden shall be CLEANSED with FIRE and FLAME" cannot be all bad.

Replies from: taelor
comment by taelor · 2013-04-22T10:13:58.440Z · LW(p) · GW(p)

In truth, any organised activity that leads to someone shouting, in full seriousness, "Sweden shall be CLEANSED with FIRE and FLAME" cannot be all bad.

I think that the Swedes might disagree about that.

Replies from: RolfAndreassen
comment by RolfAndreassen · 2013-04-22T19:15:01.090Z · LW(p) · GW(p)

The context is that the Swedes had just spent the better part of two decades in revolt against my humane and enlightened rule, and all the jarls and most of the chiefs were in my dungeon awaiting my decision on their fate. Moreover, most of them had ransomed themselves at least once and then rebelled again; so I was not particularly inclined to mercy. Their opinion, then, was not actually very important. :)

The category for this was "other".

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2013-04-21T21:20:45.932Z · LW(p) · GW(p)

I've previously marked V_V as a probable troll. It seems a lot of feeding is going on. This post in particular is not an appropriate place for it. I'm thinking of adding a term to the Deletion Policy for, well, this sort of thing on any post that reports a positive community effort - see Why Our Kind Can't Cooperate for the rationale.

When I was doing OB and the Sequences, I realized at one point that Caledonian was making it un-fun for me since each post was followed by antihedons from him, and that if I didn't start deleting his comments, I would probably stop continuing (though I certainly didn't know as much then about reinforcement psychology, I still appreciated this on some instinctive level). I'm not going to tolerate that kind of negative stimulus being applied to community organizers.

I think it might actually be a good idea to give any poster the power to delete replies in their post's comments thread - Facebook does this automatically and I don't think it's a problem in real life, except of course for the trolls themselves - but that would require development resources, and as ever, we have none.

Replies from: wedrifid, Vaniver, Richard_Kennaway, None, Dorikka, someonewrongonthenet, Viliam_Bur, PrawnOfFate, V_V
comment by wedrifid · 2013-04-22T03:09:05.886Z · LW(p) · GW(p)

I think it might actually be a good idea to give any poster the power to delete replies in their post's comments thread - Facebook does this automatically and I don't think it's a problem in real life, except of course for the trolls themselves - but that would require development resources, and as ever, we have none.

This is a terrible idea. People already try to bully people out of disagreement with their point. Giving everyone the power to delete dissenters in their threads introduces drastically undesirable incentives. It means that people would, and indeed should, systematically downvote every comment in a thread if they believe the local PostDictator has or will abuse their local dictatorial powers. That is the only way to eliminate the bias in the conversation.

comment by Vaniver · 2013-04-21T23:06:53.160Z · LW(p) · GW(p)

I've previously marked V_V as a probable troll. It seems a lot of feeding is going on.

I agree that it's possible that V_V is trolling. I think it's more likely that they're just educated enough to cut themselves, thinking in terms of fallacies and warning signals, rather than causal models.

But I responded to V_V because you have the critics you have, not the critics you want, and because they do sometimes raise concerns that are worth considering. It is a questionable idea to share secrets in a public setting, but I suspect that V_V and other observers overestimate the social distance between the attendees; I know I would be comfortable telling the regulars at my LW meetup quite a bit about myself, because I've been friends with them for quite some time now. When you cast it as "we're friends that would like to deliberately be friendlier, and that includes targeted attempts to get to know each other better," it loses much of its danger.

(It still has the awkwardness of "how dare you be deliberate in your dealings with other humans!", but I don't think it's possible for that awkwardness to go away, and that's something that most posts on social issues seem to be open about.)

Responding positively demonstrates open-mindedness, encourages superior criticism, and gives me an opportunity to improve the thing criticized.

When I was doing OB and the Sequences, I realized at one point that Caledonian was making it un-fun for me since each post was followed by antihedons from him, and that if I didn't start deleting his comments, I would probably stop continuing (though I certainly didn't know as much then about reinforcement psychology, I still appreciated this on some instinctive level). I'm not going to tolerate that kind of negative stimulus being applied to community organizers.

Deleting people's comments because of your negative emotional reaction is a strategy I strongly recommend against, and admitting to that in response to deleting someone's accusation of cultishness is a mistake. Your refrigerator is unplugged, and you should plug it back in before the ice melts and the food starts to spoil.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2013-04-21T23:12:00.489Z · LW(p) · GW(p)

Deleting people's comments because of your negative emotion reaction is a strategy I strongly recommend against

I suppose I should've used my free will to ignore the negative conditioning being applied to me? I'll go do that as soon as I acquire free will.

Replies from: Vaniver, None
comment by Vaniver · 2013-04-21T23:27:27.896Z · LW(p) · GW(p)

I suppose I should've used my free will to ignore the negative conditioning being applied to me? I'll go do that as soon as I acquire free will.

This isn't a goal you automatically succeed at; responding appropriately to criticism is a skill that takes development. I've put quite a bit of effort into training my skill at this, and am pleased with how far I have gotten, but recognize I still have a ways to go. In particular, I'm afraid I haven't put much effort into developing my ability to train others; I'd recommend talking to Val about it; he should be able to teach you much more effectively than I can.

The primary technique that I use that's communicable is to try and use defensiveness as a trigger for curiosity. That association is very useful, but I'm not sure what sort of practice would help teach it. Perhaps a helpful visualization is to try and 'slide' down from combativeness into curiosity.

Perspective alteration is also useful. People aren't responding to you, but to what you created; Julia has a neat visualization trick of seeing people's positions (including her own) as somewhat displaced from them during arguments. If Caledonian has something mean to say about one of your posts, well, it's attacking your post, not you. (And even if he says something along the lines of "Eliezer is a big meanie head," well, it could easily be the case that the Eliezer model in Caledonian's mind is a big meanie head, but you don't have to interpret that as an attack.)

And once you have distance from it, you can remove the tone and focus on the substance, and see whether or not you can use the substance to make yourself stronger.

Replies from: Eliezer_Yudkowsky, None
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2013-04-22T00:13:01.386Z · LW(p) · GW(p)

Been there. Done that. Got tired. Try being a D-level Internet celebrity sometime. It will rapidly exceed reserves of patience you didn't know you had.

Replies from: Rain, Vaniver
comment by Rain · 2013-04-22T00:29:10.815Z · LW(p) · GW(p)

I continue to support your decisions on heavier moderation, and once again thank you for your efforts to keep Less Wrong a well-tended garden.

comment by Vaniver · 2013-04-22T00:29:31.488Z · LW(p) · GW(p)

I empathize. Looking back, I also realize I was unclear; in the grandparent I talked mostly about how to respond positively to criticism, when my original comment of responding appropriately to criticism was closer to the mark.

I don't expect you to respond positively to all criticism; one of the benefits of being a celebrity is that there are other people who will do that for you. But if it takes patience for you to be indifferent to criticism, then I think you would see significant gains from further skill development. Deleting critical comments reduces your public effectiveness, and putting emotional satisfaction above that is not something I recommend. This is especially important for you, because you have pinned your public image so closely to rationality.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2013-04-22T00:47:58.159Z · LW(p) · GW(p)

I don't think you understand the concept here. I'm not deleting comments because it gives me a satisfying feeling. I deleted Caledonian's comments because he was successfully shifting OB to troll comments and discussion of troll comments, and this was giving me an 'ouch' feeling each time I posted. I tried talking myself out of the ouch feeling but it didn't actually work. I asked people to stop feeding the troll and that also didn't work. So I started deleting comments because I don't live in a world of things that ought to work.

Banning all meta discussion on LW of any kind seems like an increasingly good idea - in terms of it being healthy for the community, or rather, meta of any kind being unhealthy.

/r/science occasionally vaporizes half the comments in their threads now and it hasn't seemed to hurt them any. I don't think censorship actually hurts reputation very much, certainly not enough to make up for the degree to which meta blather hurts community.

Replies from: None, Vaniver, shminux, shminux, PrawnOfFate
comment by [deleted] · 2013-04-22T00:57:25.924Z · LW(p) · GW(p)

I don't think censorship actually hurts reputation very much, certainly not enough to make up for the degree to which meta blather hurts community.

Censorship of off-topic posts and idiots is very much appreciated and not usually regarded as the squicky kind of censorship, except in places like r/anarchism, which I wouldn't worry about.

As always, I encourage you to do more public executions. (keyword "public". The masses must know that there is a benevolent moderator delivering them from evil trolls).

Banning all meta discussion on LW of any kind seems like an increasingly good idea - in terms of it being healthy for the community, or rather, meta of any kind being unhealthy.

+1. Even those of us who participate in meta discussions don't necessarily appreciate their existence. Start with this thread.

comment by Vaniver · 2013-04-22T01:04:02.282Z · LW(p) · GW(p)

I'm not deleting comments because it gives me a satisfying feeling.

What would it look like if you were?

I deleted Caledonian's comments because he was successfully shifting OB to troll comments and discussion of troll comments, and this was giving me an 'ouch' feeling each time I posted. I tried talking myself out of the ouch feeling but it didn't actually work, so there you go.

It's not clear to me how to interpret this "and." If he were successfully shifting OB to troll comments, and this was giving you a pleasant feeling every time you posted, you wouldn't have deleted his comments? If lowering the discourse was reason enough to delete his comments, why not just list that as your primary reason, rather than your internal emotional response to him lowering the discourse?

Banning all meta discussion on LW of any kind seems like an increasingly good idea - in terms of it being healthy for the community, or rather, meta of any kind being unhealthy.

It seems to me that there are several kinds of healthy meta discussions. I am worried that a ban on meta discussion will accelerate the departure of dissatisfied members of the community, because they have no outlet to process their dissatisfactions, and that this will decrease the quality and intellectual breadth of the community.

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2013-04-22T13:44:12.190Z · LW(p) · GW(p)

I'm not deleting comments because it gives me a satisfying feeling.

What would it look like if you were?

SMITE!

comment by shminux · 2013-04-22T18:12:48.296Z · LW(p) · GW(p)

Banning all meta discussion on LW of any kind seems like an increasingly good idea - in terms of it being healthy for the community, or rather, meta of any kind being unhealthy.

Just voicing support for this, together with an outlet in terms of a periodic meta thread or that LW subreddit.

comment by shminux · 2013-04-22T18:09:47.720Z · LW(p) · GW(p)

Banning all meta discussion on LW of any kind seems like an increasingly good idea - in terms of it being healthy for the community, or rather, meta of any kind being unhealthy.

It is a good idea, provided you also give people an explicit outlet to blow off steam, like http://www.reddit.com/r/LessWrong. Seems to have worked for the basilisk discussions. Alternatively, a periodic "Rules and Regulations" meta thread could help keep meta discussions away from other threads. Anyway, something like this works great for a few subject-specific IRC channels I frequent or moderate.

comment by PrawnOfFate · 2013-04-22T13:11:52.988Z · LW(p) · GW(p)

Banning all meta discussion on LW of any kind seems like an increasingly good idea - in terms of it being healthy for the community, or rather, meta of any kind being unhealthy.

Have you considered having a separate "place" for it?

Replies from: None
comment by [deleted] · 2013-04-22T13:22:22.101Z · LW(p) · GW(p)

http://lesswrong.com/lw/gkv/official_lw_uncensored_thread_on_reddit/

Replies from: PrawnOfFate
comment by PrawnOfFate · 2013-04-22T13:29:19.580Z · LW(p) · GW(p)

I haven't seen anything to say that it is for meta discussion, it mostly isn't de facto, and I haven't seen a 'take it elsewhere' notice anywhere as an alternative to downvote and delete.

comment by [deleted] · 2013-04-21T23:30:16.644Z · LW(p) · GW(p)

If Caledonian has something mean to say about one of your posts, well, it's attacking your post, not you.

Were you around back then? Caledonian was attacking posts because it knew it was getting under people's skin.

Replies from: Vaniver
comment by Vaniver · 2013-04-21T23:41:15.435Z · LW(p) · GW(p)

Were you around back then?

Nope; I only saw his comments when reading through the sequences, and thought they were often sharp (in both senses of the word). There are no doubt selection effects at play in which ones still existed for me to read them.

Caledonian was attacking posts because it knew it was getting under people's skin.

To which the obvious response is to not let it get under your skin, and if you lack that level of control over your skin, to deliberately develop it.

To quote ShannonFriedman from another post:

Today I know that if an agenty person has to write bylaws and they don't have experience, they go off and read about how to write bylaws.

(Replacing 'write bylaws', of course, with 'respond positively to criticism.')

Replies from: None
comment by [deleted] · 2013-04-22T00:44:30.719Z · LW(p) · GW(p)

To which the obvious response is to not let it get under your skin, and if you lack that level of control over your skin, to deliberately develop it.

Willpower isn't an infinite resource.

Replies from: PrawnOfFate
comment by PrawnOfFate · 2013-04-22T12:59:47.471Z · LW(p) · GW(p)

But being able to handle criticism properly is a very important rational skill. Those who feel they cannot do it need to adjust their levels of self-advertisement as rationalists accordingly.

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2013-04-22T14:14:50.585Z · LW(p) · GW(p)

being able to handle criticism properly is a very important rational skill

You are absolutely right. Some parts of this very important rational skill are: properly discerning genuine criticism from trolling; properly discerning whether the person posting it is a useful or a harmful presence in the forum; properly deciding a useful course of action.

I think that Eliezer has indeed demonstrated possession of this very important rational skill in his handling of V_V's criticism.

comment by [deleted] · 2013-04-21T23:28:14.981Z · LW(p) · GW(p)

It's not just yours; it's also negative for the people trying to put together these events. Vaniver was wrong to single your reaction out in this instance.

For what it's worth, I agree with your moderation decision in this circumstance.

Replies from: Vaniver
comment by Vaniver · 2013-04-22T00:49:03.924Z · LW(p) · GW(p)

Vaniver was wrong to single your reaction out in this instance.

I am also opposed to deleting comments because they cause antihedons for community organizers. In general, I am opposed to the exercise of institutional power to achieve hedons or avoid antihedons instead of to achieve institutional goals, and am particularly opposed in the case that doing so damages institutional goals.

It seems to me that the deletion of criticism, even ill-intended criticism, damages several key goals of the LW community.

Replies from: Viliam_Bur
comment by Viliam_Bur · 2013-04-22T12:32:02.173Z · LW(p) · GW(p)

Seems to me you misunderstand this aspect of trolling: someone systematically working to create an ugh field about some topic, person, or a blog. Pavlovian conditioning through online communication.

Imagine a situation where every time you speak about a topic X, someone kicks you in the foot. Not too painfully, but unpleasantly enough. Imagine that there is no way for you to avoid this feeling (except for not speaking about X ever again). Do you expect that in the long term it would influence your emotions about X, and your ability to think about X clearly? If yes, why would you want to give anyone this kind of power over you?

This is an art some people are very successful at. I don't know why exactly they do that; maybe it is deliberate on their part, or maybe they have some bad emotions related to the topic or person, and they can't resist sharing those emotions with a larger audience.

In the past I have left one website I participated on for a few years, just because one crazy person got angry at me over some specific disagreement (I criticized their favorite politician once), and then for the following months, wherever I posted a comment about whatever topic, that person made sure to reply to me, negatively. Each specific instance, viewed individually, could be interpreted as an honest disagreement. The problem was the pattern. After a few months, I was perfectly conditioned... I merely thought about writing a comment, and immediately I saw myself reading another negative response by the given person, other people reacting to that negative response, and... I stopped writing comments, because it felt bad.

I am not the only person who left that specific website because of this specific person. I tried to have a meta conversation about this kind of behavior, but the administrators made their values obvious: censorship is evil and completely unacceptable (unless swear words or personal threats are used). Recently they have acquired another website, whose previous owner agreed to work as a moderator for them. I happen to know the moderator personally, and a few days ago he said to me he is considering quitting the job he used to love, because in a similar way most of his valuable contributors were driven away by a single dedicated person, whom the site owners refuse to censor.

If you have a sufficiently persistent person and inflexible moderation policy, one person really is enough to destroy a website.

Replies from: Vaniver, shminux
comment by Vaniver · 2013-04-22T16:24:53.429Z · LW(p) · GW(p)

If you have a sufficiently persistent person and inflexible moderation policy, one person really is enough to destroy a website.

I agree that destructive people can do a lot of damage, and that removing them is a good idea. I also agree that destructiveness doesn't even require maliciousness.

The strategy I'd like to see is "cultivate dissent." If someone is being critical in an unproductive way, then show them the productive way to be critical, and if they fail to shape up, then remove them from the community, through a ban or deletion/hiding of comments. Documenting the steps along the way, and linking to previous warnings, makes it clear to observers that dissent is carefully managed, not suppressed.

Tying the moderator reaction to whether or not the criticism is fun to receive, rather than if it is useful to receive, is a recipe for receiving fun but useless criticisms and not receiving unfun but useful criticisms.

Receiving and processing unfun but useful criticisms is a core part of rationality, to the point that there are litanies about it.

Replies from: Raemon, Viliam_Bur, Richard_Kennaway
comment by Raemon · 2013-04-22T17:02:41.909Z · LW(p) · GW(p)

Very much agree with this.

The most unsuccessful thing about the message deletion is that now I am insatiably curious about what the message said and am thinking way more about that, and having to spend cognitive effort worrying about whether Eliezer overstepped his bounds or not, in a way that (I suspect) is at least as bad as whatever the original comment was. (This remains the case whether or not the message was truly awful.)

comment by Viliam_Bur · 2013-04-22T19:11:53.639Z · LW(p) · GW(p)

If someone is being critical in an unproductive way, then show them the productive way to be critical, and if they fail to shape up, then remove them from the community, through a ban or deletion/hiding of comments.

How specifically? I imagine it would be good to tell certain people: "you have already written twenty comments with almost the same content, so either write a full article about it, or shut up".

The idea is that writing an article requires more work, better thinking, and now you are a person who must defend an idea instead of just attacking people who have different ideas. Also an article focuses the discussion of one topic on one place.

Even if someone e.g. thinks that the whole LessWrong community is Eliezer's suicidal cult, I would prefer if the person collected all their best evidence at one place, so people can focus on one topic and discuss it thoroughly, instead of posting dozens of sarcastic remarks in various, often unrelated places.

Replies from: Vaniver
comment by Vaniver · 2013-04-22T19:35:55.069Z · LW(p) · GW(p)

I imagine it would be good to tell certain people: "you have already written twenty comments with almost the same content, so either write a full article about it, or shut up".

I like this idea quite a bit, though I would word it more politely.

I also imagine that many posters would benefit from suggestions on how to alter their commenting style in general, as well as specific suggestions about how to apply those communication principles to this situation.

comment by Richard_Kennaway · 2013-04-22T20:18:14.231Z · LW(p) · GW(p)

Tying the moderator reaction to whether or not the criticism is fun to receive, rather than if it is useful to receive, is a recipe for receiving fun but useless criticisms and not receiving unfun but useful criticisms.

Useless criticisms are no fun at all.

comment by shminux · 2013-04-22T17:35:39.470Z · LW(p) · GW(p)

Retaliatory sniping like the one you described is common, both online and IRL, and is not easy to moderate against. It is present on this forum, as well, to some degree, and occasionally complained about. The problem is that it is hard to prevent, since each specific instance usually does not break the usual ground rules. A couple of places I know have an informal "no sniping" rule, but it is quite subjective and the violations are hard to prove, except in extreme cases. An enforcement attempt by the mods is often costly, as it often evokes the ire of the egalitarian rank and file, who only see the tip of the iceberg.

Interestingly, on karma-supporting forums it often takes the form of downvoting with impunity everything (or almost everything) written by a poster you don't like. Because of its zero cost it is hard to resist, and because of its anonymity it is hard to guard against. Fortunately, it is not as destructive as explicit sniping, since the hate-on downvotes tend to get overwhelmed by the relevant feedback, whether positive or negative.

comment by Richard_Kennaway · 2013-04-22T12:21:55.454Z · LW(p) · GW(p)

I'm not going to tolerate that kind of negative stimulus being applied to community organizers.

Yes.

I think it might actually be a good idea to give any poster the power to delete replies in their post's comments thread - Facebook does this automatically

No.

A Facebook page is a personal fiefdom within which one has absolute power (within the limits of what one's feudal superiors, i.e. the owners of Facebook, and beyond them the state, allow). The same applies to personal blogs. Making a post or a comment here puts it up for grabs by anyone. That is what a discussion forum (which this is) is for. I specifically do not want that power over replies to my posts here.

comment by [deleted] · 2013-04-21T21:43:04.195Z · LW(p) · GW(p)

but that would require development resources, and as ever, we have none.

This intrigues me. You (and others) have said this multiple times, and I wonder what it means.

Presumably it would only take a few thousand dollars to round up a list of the highest value/cost ratio programming improvements on LW, and then pay someone to implement them. Do I underestimate the cost here?

So the fact that you (generalized you, in your role as LW sponsor) are not doing this implies that improvements to LW have low marginal value compared to other projects (presumably MIRI stuff). LW improvements look high value from out here.

It's interesting, then, that you take the time to delete things and write up these deletion reports. A few thousand dollars applied to some brave volunteer could save you a lot of time added up over the years. This needs calculation, of course. Also, you're probably doing this LW janitor stuff during your recharge time, between stretches when you're able to do actual work.

I can't say I disagree with the revealed preference; most of the value of LW seems to be the archives, meetups, and existence, which is secured for now. I'd rather you spent my money on saving the world (which I tentatively infer is much further along than external communications claim).

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2013-04-21T23:07:54.369Z · LW(p) · GW(p)

If we were bid $5K for the top dozen improvements by a credible source, we'd take it, but no such bid has ever occurred. I think you underestimate the cost.

Replies from: Nova_Division, Vaniver
comment by Nova_Division · 2013-04-22T01:17:16.635Z · LW(p) · GW(p)

Could I get a quick list of those top dozen improvements, so I can estimate the hourly rate for a $5k pay, and then forward that on to my extremely talented programmer fiance (who is also a LWer)?

Replies from: Eliezer_Yudkowsky, lukeprog, gwern
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2013-04-22T02:16:19.454Z · LW(p) · GW(p)

Off the top of my head:

  • Stop showing user page edits in the LW wiki sidebar.
  • Also in the sidebar: Don't show reverted edits to pages. Show the last real pages with non-reverted edits.
  • When a moderator deletes a comment, if it has no subcomments or if all of its subcomments have also been deleted, don't show any leftover traces. If there are subcomments, require a deliberate user click to expand them and only allow logged-in users with >=1 karma to do so. Apply the -5 karma troll toll to comments with a deleted ancestor.
  • Show parent comments in user comment feeds (the list of a user's most recent comments). If the user has less than 80% upvotes, enable downvoting in the comment feed for users with over 1000 karma.
  • Cause new comments on a post to stand out more than they do currently (the small green aura does not enable super-easy scanning).
  • Implement Reddit's "highlight comments since..." feature.
  • Show the most recent Rational Quote comment in the main sidebar, the most recent Open Thread and Rationality Diary comment in the Discussion sidebar (i.e., the latest comment from the latest post with the appropriate tag).
  • Automatically strip crap (the sort of stuff e.g. Word generates, but all text editors seem to do it now) from the post editor.

More ambitious projects:

  • Easier tracking of ongoing discussions - "subscribe" to a post and see new comments on it forever, possibly in a separate inbox.
  • Subreddits besides Discussion.

Replies from: Desrtopa, Eliezer_Yudkowsky, army1987
comment by Desrtopa · 2013-04-22T02:40:48.908Z · LW(p) · GW(p)

Cause new comments on a post to stand out more than they do currently (the small green aura does not enable super-easy scanning).

You know, I've been here for years now, and I never noticed the green aura around new posts at all before.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2013-04-22T23:06:32.700Z · LW(p) · GW(p)

EDIT: Also in the Ambitious column: Give the moderators the ability to move a whole comment thread between posts, preferably to another subreddit (so we can create /r/meta and dump a bunch of this old stuff there), preferably with a replacement comment that automatically links to the new location and has space for a moderator comment explaining the reason for moving. This would be better than deletion in a lot of cases.

comment by A1987dM (army1987) · 2013-04-22T11:24:59.317Z · LW(p) · GW(p)

Easier tracking of ongoing discussions - "subscribe" to a post and see new comments on it forever, possibly in a separate inbox.

Isn't that what the RSS feed for each comment is for? (I've never used RSS myself, so I dunno.)

comment by lukeprog · 2013-04-22T01:46:06.646Z · LW(p) · GW(p)

Here are some of the LW issues sitting in the queue because the previous odesk programmer collaborating with Trike Apps on LW development went MIA a couple weeks ago: 373, 370, 367, 358, 323, 203.

BTW, that's one of the reasons development is so expensive. You can invest in training people, but they might disappear.

comment by gwern · 2013-04-22T01:42:00.901Z · LW(p) · GW(p)

Look at the bugtracker and sort by priority? https://code.google.com/p/lesswrong/issues/list?can=2&q=priority=High&colspec=ID%20Estimate%20Type%20Status%20Priority%20Milestone%20Owner%20Summary%20Contributions

Replies from: Vaniver
comment by Vaniver · 2013-04-22T01:48:28.912Z · LW(p) · GW(p)

Even sorting by enhancements, I don't see a lot of things that people have been asking for.

Replies from: gwern
comment by gwern · 2013-04-22T01:58:01.124Z · LW(p) · GW(p)

It's a quick list. If it doesn't include what some people want, perhaps they should be filing requests instead of leaving their wishes languishing in threads no one will read again.

Replies from: Vaniver
comment by Vaniver · 2013-04-22T02:02:40.843Z · LW(p) · GW(p)

Agreed that improvements to LW will be more likely if people are incentivized to resolve issues in the tracker and to post them there in the first place (and that the second could be accomplished just through advertising and the visibility of site improvements).

comment by Vaniver · 2013-04-21T23:10:48.211Z · LW(p) · GW(p)

Have you asked, or put together a list of those improvements? My current expectation is that unsolicited bids for unscoped projects are infrequent, but I don't have professional experience in that area.

Replies from: lukeprog
comment by Dorikka · 2013-04-21T21:42:50.915Z · LW(p) · GW(p)

Relevant: I did not actually read the thread before it was deleted.

Even if they're being fed, troll posts are probably going to end up below -3 and hidden by default. A solution that would not require changing the deletion policy might be for people to stop reading (or re-hide) a post if they know it's likely to affect them negatively without serving a useful purpose. (Yes, this would inhibit feedback, but deleting the feedback entirely would do the same, only more so.)

As a note, I likely won't respond to comments on this -- the topic isn't important enough to me for me to keep up with the discussion; I just wanted to drop in a potentially useful suggestion.

comment by someonewrongonthenet · 2013-04-23T06:55:21.697Z · LW(p) · GW(p)

I think it might actually be a good idea to give any poster the power to delete replies in their post's comments thread - Facebook does this automatically and I don't think it's a problem in real life, except of course for the trolls themselves - but that would require development resources, and as ever, we have none.

If you feel that going to those lengths is necessary, here is a more palatable censorship technique which might achieve the intended effect: instead of the current troll-feeding penalty, simply delete heavily downvoted comments automatically. That way the community collectively decides what is inappropriate and what is merely discussion, rather than the OP.
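
A minimal sketch of that rule, assuming an invented score threshold and function name (the suggestion above doesn't specify either):

    DELETE_THRESHOLD = -5  # invented value, purely for illustration

    def enforce_community_moderation(comment):
        """Remove a comment once the community has downvoted it far enough."""
        if comment.score <= DELETE_THRESHOLD:
            comment.deleted = True
        return comment
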

comment by Viliam_Bur · 2013-04-22T13:01:57.893Z · LW(p) · GW(p)

I agree about the seriousness of the problem, but disagree about the solution. Giving anyone the power to delete any comments from their articles could be abused easily.

How about a compromise solution: a special "extra vote down" button, available only to the article's author, which gives -5 karma to a comment? That is enough to collapse a new comment, but it can be reversed by enough votes from the readers. The probability of readers upvoting a trolling comment en masse is, in my opinion, no greater than the probability of an author downvoting a comment for the wrong reasons. (Also, the number -5 can be adjusted later if the value seems wrong.)
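
A sketch of how this might interact with the site's existing collapse threshold of -3 (mentioned elsewhere in this thread); the names and structure are invented:

    AUTHOR_PENALTY = 5        # the adjustable "-5" from the proposal
    COLLAPSE_THRESHOLD = -3   # comments at or below this are collapsed by default

    def author_extra_downvote(comment, voter, article_author):
        """Apply the extra penalty only when the voter wrote the article."""
        if voter == article_author:
            comment.score -= AUTHOR_PENALTY
        return comment

    def is_collapsed(comment):
        """Readers' ordinary upvotes can still lift the comment back out."""
        return comment.score <= COLLAPSE_THRESHOLD
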

comment by PrawnOfFate · 2013-04-22T12:47:15.454Z · LW(p) · GW(p)

Is unpleasantness the only criterion? Nobody much likes criticism, but it is hardly rational to disregard it because you don't like it.

comment by V_V · 2013-04-21T23:40:12.984Z · LW(p) · GW(p)

Q.E.D.

un-fun for me since each post was followed by antihedons from him

Well, speaking of anti-hedons, this instance of censorship is obviously going to RationalWiki. Good job!

I still appreciated this on some instinctive level). I'm not going to tolerate that kind of negative stimulus being applied to community organizers.

So now rational discussion about the potential harm of a certain practice is an intolerable "negative stimulus", or trolling? Nice to know.

I think it might actually be a good idea to give any poster the power to delete replies in their post's comments thread - Facebook does this automatically and I don't think it's a problem in real life, except of course for the trolls themselves - but that would require development resources, and as ever, we have none.

Actually, the post you deleted wasn't even downvoted. In fact, we were having a polite discussion.

Replies from: gwern
comment by gwern · 2013-04-22T00:00:18.948Z · LW(p) · GW(p)

Well, speaking of anti-hedons, this instance of censorship is obviously going to RationalWiki. Good job!...Actually, the post you deleted wasn't even downvoted. In fact, we were having a polite discussion.

I was one of the people who upvoted your comments and was partially responsible for their being 'not even downvoted'. However, now you're playing gotcha, demonstrating that you weren't even arguing in good faith to begin with, and so I regret encouraging you at all - especially since I should've known better ("You knew I was a troll when you upvoted me...").

comment by Nornagest · 2013-04-22T21:18:32.434Z · LW(p) · GW(p)

Don't hold weird ceremonies.

This brings up some interesting history, actually. Synthetic subcultures -- fraternal orders, service organizations, social clubs not associated with a particular hobby -- used to be a lot more common than they are now; the Freemasons are probably the best-known example, but far from the only one. Ritual was quite common among them, basically as a group-cohesion hack. Now, the lines between an initiatory society and a mystery religion are pretty hard to draw, but generally these organizations didn't claim religious status and I'm inclined to believe them.

For reasons I don't fully understand, these started falling out of fashion sometime during the mid-20th century. The main survivors these days, at least in the United States, are college frats and sororities, service clubs along animal-club lines (Lions Club, etc.), and certain branches of Masonry; generally these are in decline, have substantially changed their model, or both. You could also make a case for the Scouting movement, which survived thanks to its educational niche.

I'm starting to think of the social branch of the LW community as something like a stab at reviving this sort of organization, and I think rituals like the OP's are best viewed in that context. Unfortunately that's not a prototype that's likely to occur to most people, and so I do see ritual as a potential PR difficulty, maybe something that should be offloaded onto local communities rather than promoted globally. Raemon seems to be thinking along similar lines, to his credit.

But in any case I wouldn't consider it positive evidence for anything untoward. It's just a social tool, and not even an intrinsically Dark Artsy one -- though the existing rituals are not my cup of tea, personally speaking.

comment by juliawise · 2013-04-19T01:28:33.915Z · LW(p) · GW(p)

More thoughts on this on the earlier post.

comment by chronophasiac · 2013-04-18T15:19:59.626Z · LW(p) · GW(p)

As one of the participants, I can honestly say Schelling Day was a highlight of the past year. The experience was every bit as powerful as described. Afterwards, I felt a sense of friendship and goodwill towards my friends (old and new) that was nearly overwhelming.

Thank you so much for organizing this event. Here's to next year's Schelling Day!

comment by Raemon · 2013-04-17T15:50:42.010Z · LW(p) · GW(p)

Glad this went really well!

We've done some non-ritualized group therapy sessions at the New York meetup, which seem similar. I'm interested in running something closer to what you describe here and seeing whether it feels noticeably different.

Replies from: juliawise
comment by juliawise · 2013-04-19T00:28:59.151Z · LW(p) · GW(p)

We also recently started something partly based on the New York group therapy sessions. (Huh, we should do a writeup of that, too. I'd love to see a writeup of what you guys are doing.)

This felt very different. There's a difference between something that you want to spend 90 minutes talking about with people and something that you just kind of want your friends to know about you. So people were willing to say a wider variety of things, and more people were willing to speak, than would probably do so in a longer and more specific session.

Replies from: Raemon
comment by Raemon · 2013-04-19T14:42:06.491Z · LW(p) · GW(p)

Gotcha.

Do you anticipate it would work well again if you repeated it with the same people? (I suppose with a year in between there may be new things to bring up. I'm curious how well it'll work as a permanent yearly thing.)

Replies from: juliawise, ModusPonies
comment by juliawise · 2013-04-21T13:00:46.241Z · LW(p) · GW(p)

Yes, I would expect it to work fine. People said between one and three things about themselves. This session was probably where people said the things they most wanted to get off their chests, but I would expect people to have more than three things they might want their friends to know about themselves.

comment by ModusPonies · 2013-04-19T18:11:08.171Z · LW(p) · GW(p)

In a year? Probably, although that's mostly a guess. If I did it again now, I wouldn't have much to say.

comment by ArisKatsaris · 2013-04-21T18:17:29.828Z · LW(p) · GW(p)

Yeah, out of all those things you put a "check" on, I'd probably not put a "check" on a single one of them (or at the very most one or two), since almost all of what you just said is a gross misrepresentation of the truth, bordering on libelous.

comment by V_V · 2013-04-19T15:18:05.134Z · LW(p) · GW(p)

The solstice rituals didn't look quite right: according to the reports, it seemed that people were taking them too seriously. But at least you could have given them the benefit of the doubt: these people came from Christian or Jewish backgrounds and missed their traditional holidays, so they invented a replacement.

But group confession sessions go way beyond the rituals of mainstream religions; they are outright cult practices: http://www.prem-rawat-talk.org/forum/uploads/CultCharacteristics.htm

Replies from: ModusPonies, BlazeOrangeDeer, Vaniver
comment by ModusPonies · 2013-04-19T15:46:54.002Z · LW(p) · GW(p)

I have seen some convincing arguments that this sort of deliberate bonding is dangerous. Shouting "Cult!" is not one of them.

Separately, the link you've provided doesn't support the claim that we are a cult. I think you're trying to draw support from UC Berkeley's list of 19 characteristics on that page and claiming we have the Guilt and Fear characteristic. I think we've also got Intense Study and arguably Striving for the Unreachable. Three out of nineteen doesn't seem worrying. The page also has several independent lists of cult characteristics which do not include group confession.

Replies from: cadac, Kindly
comment by cadac · 2013-04-19T21:51:48.273Z · LW(p) · GW(p)

I have seen some convincing arguments that this sort of deliberate bonding is dangerous.

I’d be interested in hearing them.

comment by Kindly · 2013-04-19T16:39:56.117Z · LW(p) · GW(p)

The fact that you're doing things which make someone want to shout "Cult!" is in itself a warning sign.

comment by BlazeOrangeDeer · 2013-04-21T07:35:38.104Z · LW(p) · GW(p)

The fact that they are practiced by existing cults does not mean they are not beneficial. The main cultish aspect is the fear of exploitation, which hopefully is not present.

edit: if it is, please say so.

Replies from: V_V
comment by V_V · 2013-04-21T16:43:29.135Z · LW(p) · GW(p)

The fact that they are practiced by existing cults does not mean they are not beneficial.

But certainly it's not evidence for them being beneficial.

The main cultish aspect is the fear of exploitation, which hopefully is not present.

How do you know?

comment by Vaniver · 2013-04-19T15:31:43.340Z · LW(p) · GW(p)

I don't know how much these are "confession" sessions (the link you provided focused on having people admit things they feel guilty about or ashamed of, which could then be used to control them) as opposed to "get to know each other much better very quickly" sessions. Can anyone who attended give us a sense of the vibe along that dimension, or quantify how many of the things people shared were confessions as opposed to struggles, joys, hopes, or something else?

Replies from: ModusPonies
comment by ModusPonies · 2013-04-19T18:08:32.000Z · LW(p) · GW(p)

I'd estimate that something like a third of what people had to say was confessions. These weren't usually "I shoplifted" type of confessions, but closer to the category containing "I often feel like I'm obligated to do X, but can't because of Y" or "I don't fit social construct Z and I'm afraid I would be ostracized if my friends and family knew."

I think this was common enough that, if someone did want to use this sort of information to control us, they could get some serious mileage out of it. I just don't expect this to happen because there's not much evidence of any unethical manipulation.