Political topics attract participants inclined to use the norms of mainstream political debate, risking a tipping point to lower quality discussion

post by emr · 2015-03-26T00:14:18.806Z · score: 42 (43 votes) · LW · GW · Legacy · 71 comments

(I hope that is the least click-baity title ever.)

Political topics elicit lower quality participation, holding the set of participants fixed. This is the thesis of "politics is the mind-killer".

Here's a separate effect: Political topics attract mind-killed participants. This can happen even when the initial participants are not mind-killed by the topic. 

Since outreach is important, this could be a good thing. Raise the sanity waterline! But the sea of people eager to enter political discussions is vast, and the epistemic problems can run deep. Of course not everyone needs to come perfectly prealigned with community norms, but any community will be limited in how robustly it can handle an influx of participants expecting a different set of norms. If you look at other forums, it seems to take very little overt contemporary political discussion before the whole place is swamped, and politics becomes endemic. As appealing as "LW, but with slightly more contemporary politics" sounds, it's probably not even an option. You have "LW, with politics in every thread", and "LW, with as little politics as we can manage".

That said, most of the problems are avoided by just not saying anything that pattern-matches too easily to current political issues. From what I can tell, LW has always had tons of meta-political content, which doesn't seem to cause problems, as well as standard political points presented in unusual ways, and contrarian political opinions that are too marginal to raise concern. Frankly, if you have a "no politics" norm, people will still talk about politics, but to a limited degree. But if you don't even half-heartedly (or even hypocritically) discourage politics, then an open-entry site that accepts general topics will risk spiraling too far in a political direction.

As an aside, I'm not apolitical. Although some people advance a more sweeping dismissal of the importance or utility of political debate, this isn't required to justify restricting politics in certain contexts. The sort of argument I've sketched (I don't want LW to be swamped by the worst sorts of people who can be attracted to political debate) is enough. There's no hypocrisy in not wanting politics on LW while accepting political talk (and the warts it entails) elsewhere. Off the top of my head, Yvain is one LW affiliate who now largely writes about more politically charged topics on their own blog (SlateStarCodex), and there are some other progressive blogs in that direction. There are libertarian and right-leaning (reactionary? NRx-lbgt?) connections. I would love a grand unification as much as anyone (provided, of course, that we all realize I've been right all along), but please let's not tell the generals to bring their armies here for the negotiations.

71 comments

Comments sorted by top scores.

comment by ChristianKl · 2015-03-26T00:39:04.491Z · score: 7 (9 votes) · LW · GW

Political topics elicit lower quality participation, holding the set of participants fixed. This is the thesis of "politics is the mind-killer".

I don't think that's a good description. The article argues that using a political example when you can make the same point with a nonpolitical example is bad, because the political aspect prevents people from using their usual reasoning abilities.

comment by Lumifer · 2015-03-26T14:37:25.175Z · score: 6 (6 votes) · LW · GW

So, has there been an influx of new participants into LW who only want to argue politics? I haven't noticed any.

It's also worth pointing out that we mostly debate political philosophy and not politics. Politics debates look like "Should Obama just ignore Congress and ram through whatever regulations he can?" or "Is Ted Cruz the greatest guy ever?" or "Shall we just tell the Greeks to go jump into the Aegean sea?" and we do NOT have them.

comment by Zubon · 2015-03-27T13:30:13.011Z · score: 7 (7 votes) · LW · GW

I'm not sure if it says more about me or the context I'm used to seeing here at Less Wrong, but when I read, "Shall we just tell the Greeks to go jump into the Aegean sea?" I thought "Iliad" before "ongoing economic crisis." If that order ever flips, we may have gotten too much into current events and lost our endearing weirdness and classicism.

comment by Lumifer · 2015-03-27T14:42:49.792Z · score: 2 (2 votes) · LW · GW

when I read, "Shall we just tell the Greeks to go jump into the Aegean sea?" I thought "Iliad".

:-D Yep. That's a good thing.

comment by ChristianKl · 2015-03-26T14:59:51.611Z · score: 4 (4 votes) · LW · GW

So, has there been an influx of new participants into LW who only want to argue politics? I haven't noticed any.

There seems to be a member who mainly came to LW to argue for "pragmatarianism", whose karma is, at the time of this writing, negative.

comment by Lumifer · 2015-03-26T15:07:46.076Z · score: 5 (5 votes) · LW · GW

I think this member is mostly in the business of marketing his blog and isn't interested in LW otherwise.

comment by emr · 2015-03-27T16:04:13.276Z · score: 1 (1 votes) · LW · GW

I agree that this isn't happening to LW. (To avoid repetition, I talk a bit more about motivation in this comment.)

comment by JoshuaZ · 2015-03-26T00:17:29.999Z · score: 6 (6 votes) · LW · GW

This is a plausible and worrying point. Is there some evidence for the basic thesis beyond simple intuitiveness?

comment by passive_fist · 2015-03-26T03:18:57.588Z · score: 5 (5 votes) · LW · GW

As I've mentioned in the other recent political thread, it's not just that political topics elicit lower quality participation. Even if you have the best of intentions and can keep your mind-killing mechanisms in check, it's extremely hard to have a rational debate about politics.

comment by Lumifer · 2015-03-26T14:32:42.651Z · score: 4 (10 votes) · LW · GW

Even if you have the best of intentions and can keep your mind-killing mechanisms in check, it's extremely hard to have a rational debate about politics.

This is the classic keys-under-a-streetlamp argument. We should debate what needs to be debated, not what's easy to debate.

comment by satt · 2015-03-27T01:33:42.777Z · score: 4 (4 votes) · LW · GW

We should debate what needs to be debated, not what's easy to debate.

Dunno whether "We" means "someone in general" or "LWers" there. If the former, I agree.

If the latter, ehhhh, I'm doubtful. Some things need to be debated, but from a consequentialist perspective it doesn't automatically follow that we should be debating those things. Political debates on LW tend to be better than political debates elsewhere, but they might not be a net gain.

comment by Lumifer · 2015-03-27T14:38:44.849Z · score: -1 (1 votes) · LW · GW

How do you calculate what might or might not be "a net gain"?

comment by satt · 2015-03-28T18:43:40.416Z · score: 1 (1 votes) · LW · GW

By iterating over individual things and evaluating whether each of those might or might not be a net gain.

More general & idealized method for evaluating whether a specific thing might be a net gain: list the concrete consequences of the thing, then assign a number to each consequence representing how much utility each consequence adds or subtracts, according to your normative preferences. Sum the numbers and look at the sign of the result.

More specific & common method for evaluating whether a specific thing might be a net gain: try to think of the consequences of the thing, picking out those which seem like they have a non-negligible utility or disutility, then use your gut to try weighing those together to come up with the net utility of the thing.

I recognize my answer is general, but that's a side effect of the question's generality. I'm not the person who downvoted your question, but I'm not sure what its point is.

comment by Lumifer · 2015-03-30T17:09:27.391Z · score: 0 (0 votes) · LW · GW

I think I phrased my question incorrectly -- I am interested not so much in "how", but in "what" and "for whom". By what kind of criteria do you estimate whether something is a "gain" or not and whose gain is it? And if the answer is "look at utility", the question just chains to how do you estimate whether something is positive-utility, especially with non-obvious issues like having or banning certain kinds of debates on a forum.

comment by satt · 2015-03-31T03:39:16.731Z · score: 1 (1 votes) · LW · GW

By what kind of criteria do you estimate whether something is a "gain" or not and whose gain is it?

What kind of criteria? Depends on the something. (Again a general answer but again the question is general. The criteria for evaluating whether to take an aspirin are different to those for evaluating an online debate, which are different again to those for evaluating some bookshelves, which are different again to...)

Whose gain? Whoever the person doing the evaluating cares about.

And if the answer is "look at utility", the question just chains to how do you estimate whether something is positive-utility,

Well, you hit rock bottom eventually; you translate things into consequences about which you have reasonably clean-cut "this is good" or "this is bad" intuitions. Or, if you're doing it in a more explicit cost-benefit-analysis kind of way, you can pin rough conversion factors on each of the final consequences which re-express those consequences in terms of a single numeraire for comparison.

especially with non-obvious issues like having or banning certain kinds of debates on a forum

Here I think the estimating is relatively easy, because I'm weighing up "We should debate what needs to be debated", apparently in the context of LW specifically, and the impression I got from your phrasing was that you were implicitly excluding broad classes of consequences like warm fuzzy hedons. If so, considering the issue on your terms (as I understand them), I can simplify the calculation by leaving out hedonic and similarly self-centered aspects.

Elaborating on why I interpreted you like that: when people use the language of duty or obligation, as in "We should [X] what needs to be [X]ed", they normally imply something like "we need to do that, even if through gritted teeth, for prudential reasons", rather than e.g. "that would be fun, we should do that". If that's what you meant here (perhaps I misunderstood you?), that excludes consequences like the little glow we might get from signalling how clever we are by debating things, the potential pleasure of learning things that're new to us, or even the epistemic virtue of inching closer to the right answer to a knotty, politically polarized empirical question.

Once one rules out those kinds of consequences, the main kind that's left, I reckon, is how those debates lead to resolutions, or at least lessening, of political problems in the wider world. (If our debates didn't lead to such improvements, then what would be obliging us to "debate what needs to be debated"?*) And I'm sceptical political debates on LW would accomplish that, at least on average.

I'm pretty sure some people would disagree with me. I'm also pretty sure it's at least debatable (haw) whether political debates on LW would improve actually existing politics, and whether effort spent on those debates would be better spent on something else (like arguing politics with people known to have influence) and that's enough for my point to go through. In fact, I'm now a little more surprised by your original comment, since your questions suggest you have difficulty working out whether "having or banning certain kinds of debates on a forum" is on balance a good thing or not, which I'm not sure how to square with your confident judgement that "We should debate what needs to be debated".


* One way I could be misunderstanding you: perhaps you do take utility-maximizing consequentialism seriously enough that you actually think the e.g. entertainment value of arguing outweighs the other consequences of arguing here, and so we're morally obliged to debate politics here for the entertainment value. I don't have the impression you're of that view, though.

comment by Lumifer · 2015-03-31T14:25:57.224Z · score: 0 (2 votes) · LW · GW

Thanks for the serious answer.

perhaps you do take utility-maximizing consequentialism seriously enough

No, I do not.

entertainment value of arguing outweighs the other consequences of arguing here

...but that is a very interesting idea :-D

comment by buybuydandavis · 2015-03-26T19:54:03.415Z · score: 3 (3 votes) · LW · GW

I'd say we should debate what's hard to debate.

IMO, LW provides a very interesting forum. Progressives and Libertarians actually talking, and moderately civilly. What I like is the window into different priors and different values. Really, that's what is going on in your head? Who knew? Not me.

comment by passive_fist · 2015-03-26T22:35:13.859Z · score: 2 (2 votes) · LW · GW

I'd say we should debate what's hard to debate.

Difficulty is certainly no reason not to attempt debate, as long as all sides acknowledge that debates about politics are necessarily difficult, ill-informed, and far from optimally rational.

comment by Gram_Stone · 2015-03-26T08:14:38.313Z · score: 3 (3 votes) · LW · GW

I think the fact that the public is less well-informed about politics than world leaders is an issue separate from the ability to have rational discussions about politics. From the fact that we have less information than world leaders it doesn't follow that we necessarily use that information less rationally. I've always considered the ability to reason under ignorance a Great Rationalist Superpower. Less information does seem to make mind-killing more likely, but you're assuming that mind-killing mechanisms are being kept in check.

comment by buybuydandavis · 2015-03-26T19:58:58.422Z · score: 0 (0 votes) · LW · GW

is an issue separate from the ability to have rational discussions about politics.

Or the desire to have rational discussions about politics.

People in power get that way by using discussion to get power, not illuminate the truth.

comment by bogus · 2015-03-26T16:36:39.603Z · score: 3 (3 votes) · LW · GW

Political participants do not just have different norms of community participation: by definition, they have very different motivations as well. This is the real take-away of Politics is the mind-killer. Keep in mind that politics is a kind of conflict: it's about things that people actually fight over, in the real world. So the difference in norms may well be a consequence of these motivations: as the potential for real strife increases sharply, good deliberation becomes less relevant and "fairness" concerns become far more important.

This is why I very much agree with the spirit of this post: we definitely don't want to attract people who are just looking to win by diligently adding more recruits to their preferred side. At the same time, we can hardly afford to disregard the norms of political debate entirely; that's a recipe for being perceived as somehow "biased" and "unfair" by the general public when political issues do become relevant, despite our best efforts. IOW, the norms aren't really the problem. The best compromise, AIUI, is to cautiously encourage (1) the rise of factional blogs like More Right and Slate Star Codex as venues where folks can apply the skills of rationality to their favorite political perspective, while interacting "at arms length" with other sides, and (2) developing political mediation and conflict reduction skills at LessWrong itself, while keeping ground-level political disputes mostly off limits.

comment by Jiro · 2015-03-26T22:26:43.275Z · score: 0 (0 votes) · LW · GW

More Right doesn't allow comments, so how does that work?

comment by bogus · 2015-03-27T00:03:44.839Z · score: 1 (3 votes) · LW · GW

AIUI, they host open threads where comments are allowed. Alternatively, they do take e-mails, and will consider posting these if sufficiently relevant and high-quality (by their standards). Slate Star Codex allows comments, but with no karma system to provide a "currency", they're not exactly helpful.

comment by Raemon · 2015-03-26T15:12:07.658Z · score: 3 (3 votes) · LW · GW

I'm a bit curious what prompted you to post this?

What I've been noticing is that right now, Slatestarcodex is sort of the place people go to talk about politics in a rationality-infused setting, and the comments there have been trending in the direction you'd caution about. (I'm not sure whether to be sad about that or glad that there's a designated place for political fighting)

comment by emr · 2015-03-27T15:59:23.419Z · score: 5 (5 votes) · LW · GW

I'm a bit curious what prompted you to post this?

Well, I think it's true, interesting, and useful :)

The argument is a specific case of a more general form (explaining changing group dynamics by selection into the group, driven by the norms of the group, but without the norms necessarily causing a direct change to any individual's behavior) which I think is a powerful pattern to understand. But like a lot of social dynamics, explicitly pointing it out can be tricky, because it can make the speaker seem snooty or Machiavellian or tactless, and because it can insult large classes of people, possibly including current group members. I felt that LW is one of the few places where I could voice this type of argument and get a charitable reception (after all, I'm indirectly insulting everyone who likes to talk politics, which is most people, including me :P)

To be clear: I don't think LessWrong is currently being hurt by this dynamic. But I do see periodic comments criticizing the use of only internal risks (mind-killing ourselves) as the justification for avoiding political topics. I'm sympathetic to some of these critiques, and I wanted to promote a reason for avoiding political topics that didn't imply that mind-killing susceptibility was somehow an insurmountable problem for individuals.

comment by Lumifer · 2015-03-27T16:14:29.543Z · score: 2 (2 votes) · LW · GW

Homeostasis of social communities is a very interesting topic. Let me just point out that there are dangers on all sides -- you don't want to be at the mercy of every wandering band of barbarians, but you also don't want to become an inbred group locked up high in an ivory tower.

comment by Viliam_Bur · 2015-03-30T07:34:40.432Z · score: 2 (2 votes) · LW · GW

SSC is a one-person dictatorship with a benevolent dictator. It would be much worse there if people could play voting games in the comments: upvoting everyone on their "side" and downvoting everyone on the opposing "side".

Also, on SSC people are banned more often than on LW, although most of the bans are temporary.

comment by Vaniver · 2015-03-27T13:38:27.199Z · score: 1 (1 votes) · LW · GW

What I've been noticing is that right now, Slatestarcodex is sort of the place people go to talk about politics in a rationality-infused setting, and the comments there have been trending in the direction you'd caution about. (I'm not sure whether to be sad about that or glad that there's a designated place for political fighting)

I would be very sad if LW comments went the way of Slate Star Codex comments.

comment by Raemon · 2015-03-27T13:54:21.666Z · score: 1 (1 votes) · LW · GW

I hadn't noticed a trend of political posts on LW, so hadn't been worried about this specific phenomenon.

comment by casebash · 2015-03-26T04:23:18.383Z · score: 3 (3 votes) · LW · GW

I think that there needs to be somewhere to discuss politics related to Less Wrong, but somewhere away from the main site. Ideally somewhere hard to find so as to keep the quality high.

comment by Stuart_Armstrong · 2015-03-26T10:10:35.025Z · score: 6 (6 votes) · LW · GW

A kind of would-have-banned-store for political discussion...

comment by [deleted] · 2015-03-26T10:57:34.637Z · score: 17 (17 votes) · LW · GW

"MAIN" "DISCUSSION" "QUARANTINE"

would be a good site layout.

comment by ChristianKl · 2015-03-26T12:09:46.553Z · score: 15 (15 votes) · LW · GW

And only visible to people who log in.

comment by [deleted] · 2015-03-26T12:13:29.053Z · score: 3 (3 votes) · LW · GW

I agree.

comment by dxu · 2015-03-28T02:19:20.012Z · score: 1 (1 votes) · LW · GW

Upvoted for the purpose of tolerating tolerance.

comment by Transfuturist · 2015-03-27T02:16:55.526Z · score: 4 (4 votes) · LW · GW

I think usernames would have to be anonymized, as well.

comment by satt · 2015-03-27T01:11:34.581Z · score: 4 (4 votes) · LW · GW

I've had the idea before that a group of LWers keen to start an inflammatory political argument could help keep LW cool by having the argument on an unrelated, pre-existing, general politics forum. They could link the argument on the Open Thread so the rest of us know it's happening.

Possible advantages & disadvantages:

  • fewer political flame-outs on LW...
  • ...more political flame-outs on Unnamed Other Forum
  • other forum posters would likely have worse argumentative norms...
  • ...but you could look for a forum with relatively good norms to minimize this (a pre-existing LW-affiliated blog/network, or a traditional rationalist/sceptic forum with a politics subforum?)
  • LWers modelling good argumentative norms to strangers might get the strangers to up their own game...
  • ...or social contagion might happen in the other direction, with LWers regressing towards the mean for online political arguments
  • has the trivial inconvenience of requiring LWers to register on another forum and post there, even as they continue to post other stuff here...
  • ...but maybe a trivial inconvenience is what you want if you think the marginal LW political argument has net negative value
  • could be interpreted as a forum invasion...
  • ...but it's not like LWers are trolls, and only a few LWers would likely bother with this anyway, so they'd probably blend into a bigger on-topic forum without much fuss
  • might entrench misinterpretations of "Politics is the Mindkiller"
  • in the unlikely event this became a firmly established norm, LWers might start demanding threads be taken elsewhere at the least scent of politics

comment by 9eB1 · 2015-03-27T03:47:40.509Z · score: 3 (3 votes) · LW · GW

We could easily use the LessWrong subreddit for that purpose, or create a LWPolitics subreddit.

comment by casebash · 2015-03-27T14:00:19.003Z · score: 1 (1 votes) · LW · GW

Interesting idea. There doesn't seem to be much traffic there, I wonder if the mods would be open to it?

comment by Zubon · 2015-03-27T13:36:21.068Z · score: 1 (1 votes) · LW · GW

Rationalist Tumblr discusses politics and culture, but it is definitely not hard to find; the quality of discussion may be higher than the Tumblr average but probably not what you are looking for. On the plus side, most of us have different usernames there, so you can consider ideas apart from author until you re-learn everyone. Which happens pretty quickly, so not a big plus.

The Tumblr folks seem to mostly agree that Tumblr is not the optimal solution for this, but it has the advantage of currently existing.

comment by casebash · 2015-03-27T13:56:42.692Z · score: 2 (2 votes) · LW · GW

The problem is that to contribute to that I would have to follow like 50 tumblrs and try to convince people to follow me as well.

comment by SanguineEmpiricist · 2015-03-28T06:26:46.607Z · score: 2 (2 votes) · LW · GW

We need to be able to sort which participants perceive parts of Haidt's moral taste spectrum before we have any discussion on anything.

comment by Epictetus · 2015-03-27T08:05:24.248Z · score: 2 (2 votes) · LW · GW

There are certain questions that have several possible answers, where people decide that a certain answer is obviously true and have trouble seeing the appeal of the other answers. If everyone settles on the same answer, all is well. If different people arrive at different answers and each believes that his answer is the obvious one, then the stage is set for a flame war. When you think the other guy's position is obviously false, it's that much harder to take him seriously.

comment by seer · 2015-03-28T02:26:00.440Z · score: 0 (2 votes) · LW · GW

If everyone settles on the same answer, all is well.

No, all seems well. Except people develop massive over-confidence in that answer.

comment by ChristianKl · 2015-03-29T15:17:42.557Z · score: -1 (1 votes) · LW · GW

How many flame wars do you see on LW when we do discuss politics?

I don't see that as a major problem.

comment by Kindly · 2015-03-29T15:26:07.389Z · score: 0 (2 votes) · LW · GW

We don't have flame wars of the calling-each-other-names kind, because of norms that say that if you see a comment with the substring "you're an idiot" outside of quotation, you downvote it regardless of anything else. (Or at least this is my strategy. If the comment is otherwise brilliant, I retract the downvote instead of upvoting, but this doesn't happen.)

We do still have discussions about politics in which everyone says unproductive and/or stupid things. At the very least, all the stupidest comments that I've made on LW have been related to politics. And I'd go back and apologize for them if other people involved in the discussion weren't so obviously wrong.

comment by ChristianKl · 2015-03-29T15:31:14.295Z · score: 0 (0 votes) · LW · GW

We do still have discussions about politics in which everyone says unproductive and/or stupid things.

Yes. That's very much in line with the position I argue in this thread. Epictetus, on the other hand, did argue that flame wars are an issue.

comment by Epictetus · 2015-03-29T21:55:35.729Z · score: 0 (0 votes) · LW · GW

I was being figurative. I meant to imply that when two people both think the other person is obviously wrong, then productive, civil discourse is unlikely.

The short time I've been on LW I noticed that the community is very much averse to actual flame wars and would probably down-vote a thread into oblivion before things got out of hand.

comment by Viliam_Bur · 2015-03-30T07:39:22.152Z · score: 1 (1 votes) · LW · GW

In recent months there were a few comments with flame-war potential which were quickly "downvoted into oblivion", but the next day their karma was above zero.

Either it means we have a group of people who prevent their "side" from being downvoted below zero (although they don't bother to upvote it highly when it already is above zero), or we have a group of people who believe in something like "no comment should be downvoted just because it has a flame-war potential" who prevent downvoting below zero in principle regardless of the side. I don't know which one of these options it is, since all comments where I have seen this happen were from one "side" (maybe even from one user, I am not sure).

comment by [deleted] · 2015-03-26T10:46:38.173Z · score: 1 (11 votes) · LW · GW

This is why I write about political philosophy, not politics. E.g. I disagree with John Rawls's veil-of-ignorance theory and even find it borderline disgusting (he just assumes everybody is a risk-averse coward), but I don't see either myself or anyone else getting mind-killingly tribal over it. After all, it is not about a party. It is not about an election program. It is not about power. It is about ideas.

I see the opposite of the norm you mention: I think when I write about political philosophy on LW it gets a negative reaction because it is too political and may invite mind-killing. Yet I have not seen this actually happen.

I think LW needs to be far more tolerant about political philosophy, and freely discuss the whole spectrum from Marx to Bonald, because where else? After you get a taste of LW, every other internet forum feels stupid, playing ego-games, un-self-critical and unhelpful. (By every other I mean Reddit and the 10-15 blogs I read, I am not very good at googling up interesting websites...)

This may be done in parallel with being less tolerant of partisan politics, but that may be a tad tricky.

I think we can just taboo partisan or emotional monikers out of political philosophy and do it easily. For example never refer to John Kekes as a conservative, refer to him as a pluralist skeptic - he identifies with both actually. Rawls may be defined as a theorist of distributive justice, not a liberal. And so on.

comment by Viliam_Bur · 2015-03-26T15:57:03.446Z · score: 10 (10 votes) · LW · GW

After you get a taste of LW, every other internet forum feels stupid

And why do you think this is so? Are all participants on this forum genetically superior, and they have to prove it by giving a DNA sample before registering the user account? Or could it be because some topics and some norms of the debate attract some kind of people (and the few exceptions are then removed by downvoting)? Any other hypothesis?

If you propose another hypothesis, please don't just say "well, it is because you are (for example) more intelligent or more reasonable" without adding an explanation about how specifically this website succeeds in attracting the intelligent and/or reasonable people, without attracting the other kinds of people, so the newcomers who don't fit the norm are not able to simply outvote the old members and change the nature of the website dramatically. (Especially considering that this is a community blog, not one person's personal blog such as Slate Star Codex.)

comment by [deleted] · 2015-03-27T08:39:57.418Z · score: 6 (6 votes) · LW · GW

And why do you think this is so?

Well, as for me, reading half the sequences changed my attitude a lot by simply convincing me to dare to be rational, that it is not socially disapproved of, at least here. I would not call it norms, as I understand the term "norms" to mean "do this or else". And it is not the specific techniques in the sequences, but the attitudes. Not trying to be too clever, not showing off, not trying to use arguments as soldiers, not trying to score points, not being tribal: something I always liked, but on e.g. Reddit there was quite a pressure not to do so.

So it is not that these things are norms, but simply that they are allowed.

A good parallel: throughout my life, I have seen a lot of tough-guy posturing in high school, in playgrounds, bars, locker rooms etc. And when I went to learn some boxing, paradoxically, that was the place where it felt most approved to be weak or timid. Because the attitude is that we are all there to develop, and therefore being yet underdeveloped is OK. One way to look at it is that most people out in life tend to see human characteristics as fixed: you are smart or dumb, tough or puny, and you are just that; no change, no development. Or, putting it differently, it is more of a testing, exam-taking attitude than a learning attitude: on the test, the exam, you are supposed to prove you already have whatever virtue is valued there, and it is too late to say you are working on it. But in the boxing gym, where everybody is there to get tougher, there is no such testing attitude; you can be upfront about your weakness or timidity, and as long as you are working on it you get respect, because the learning attitude kills the testing attitude, because in learning circumstances nobody considers such traits too innate.

Similarly on LW, the rationality-learning attitude kills the rationality-testing attitude, and thus the smarter-than-thou posturing, points-scoring attitude gets killed by it, because showing off inborn IQ is less important than learning the optimal use of whatever amount of IQ there is. Thus, there is no shame in admitting ignorance or using wrong reasoning, as long as there is an effort to improve.

I think this is why. And this has little to do with topics and little to do with enforced norms.

comment by Viliam_Bur · 2015-03-27T10:22:27.347Z · score: 3 (3 votes) · LW · GW

I like your example and "learning environment" vs "testing environment".

However, I am afraid that LW is also attractive for people who, instead of improving their rationality, want to do other things, such as winning yet another website for their political faction. Some people use the word "rationality" simply as a slogan meaning "my tribe is better than your tribe".

There were a few situations where people wrote (on their blogs) something like: "first I liked LW because they are so rational, but then I was disappointed to find out they don't fully support my political faction, which proves they are actually evil". (I am exaggerating to make a point here.) And that's the better case. The worse case is people participating in LW debates and abusing the voting system to downvote comments not because those comments are bad from the epistemic rationality point of view, but because they were written by people who disagree (or are merely suspected of disagreeing) with their political tribe.

comment by [deleted] · 2015-03-27T10:45:54.782Z · score: 2 (2 votes) · LW · GW

This is all fine, but what is missing for me is the reasoning behind something like "... and this is bad enough to taboo it completely and forfeit all potential benefits, instead of taking these risks" -- at least if I understand you right. The potential benefit is coming up with ways to seriously improve the world. The potential risk is, if I get it right, that some people will behave irrationally and that will make some other people angry.

Idea: let's try to convince the webmaster to make a third "quarantine" tab, to the right from the discussion tab, visible only to people logged in. That would cut down negative reflections from blogs, and also downvotes could be turned off there.

An alternative without programming changes would be biweekly "incisive open threads", similar to Ozy's race-and-gender open threads, with downvoting customarily tabooed in them. Try at least one?

comment by Viliam_Bur · 2015-03-27T12:30:07.533Z · score: 2 (8 votes) · LW · GW

An alternative without programming changes would be biweekly "incisive open threads", similar to Ozy's race-and-gender open threads

Feel free to start a "political thread". Worst case: the thread gets downvoted.

However, there were already such threads in the past. Maybe you should google them, look at the debate and see what happened back then -- because it is likely to happen again.

and downvoting customarily tabooed in them.

Not downvoting also has its own problems: genuinely stupid arguments remain visible (or can even get upvotes from their faction), and people can try to win the debate by flooding the opponent with many replies.

Another danger is that political debates will attract users like Eugine Nier / Azathoth123.

Okay, I do not know how to write it diplomatically, so I will be very blunt here to make it obvious what I mean: The current largest threat to the political debate on LW is a group called "neoreactionaries". They are something like "reinventing Nazis for clever contrarians"; kind of a cult around Michael Anissimov who formerly worked at MIRI. (You can recognize them by quoting Moldbug and writing slogans like "Cthulhu always swims left".) They do not give a fuck about politics being the mindkiller, but they like posting on LessWrong, because they like the company of clever people here, and they were recruited here, so they probably expect to recruit more people here. Also, LessWrong is pretty much the only debate forum on the whole internet that will not delete them immediately. If you start a political debate, you will find them all there; and they will not be there to learn anything, but to write about how "Cthulhu always swims left", and trying to recruit some LW readers. -- Eugine Nier was one of them, and he was systematically downvoting all comments, including completely innocent comments outside of any political debate, of people who dared to disagree with him once somewhere. Which means that if a new user happened to disagree with him once, they usually soon found themselves with negative karma, and left LessWrong. No one knows how many potential users we may have lost this way.

I am afraid that if you start a political thread, you will get many comments about how "Cthulhu always swims left", and anyone who reacts negatively will be accused of being a "progressive" (which in their language means: not a neoreactionary). If you ask for further explanation, you will either receive none, or a link to some long and obscurely written article by Moldbug. If you downvote them, they will create sockpuppets and upvote their comments back; if you disagree with them in debate, expect your total karma to magically drop by 100 points overnight.

Therefore I would prefer simply not doing this. But if you have to do it, give it a try and see for yourself. But please read the older political threads first.

comment by Vaniver · 2015-03-27T17:35:27.036Z · score: 6 (6 votes) · LW · GW

I upvoted for this:

However, there were already such threads in the past. Maybe you should google them, look at the debate and see what happened back then -- because it is likely to happen again.

And, to further drive home the point, I'll link to the ones I could easily find: Jan 2012, Aug 2012, Dec 2012, Jan 2013, Feb 2013, more Feb 2013, Oct 2013, Jun 2014, Nov 2014.

comment by seer · 2015-03-28T05:05:23.234Z · score: 2 (6 votes) · LW · GW

I am afraid that if you start a political thread, you will get many comments about how "Cthulhu always swims left"

Just out of curiosity, I looked at the latest politics thread in Vaniver's list. Despite being explicitly about NRx, it contains only two references to "Cthulhu", both by people arguing against NRx.

and anyone who reacts negatively will be accused of being a "progressive" (which in their language means: not a neoreactionary).

Rather anyone who isn't sufficiently progressive gets called a neoreactionary.

comment by Lumifer · 2015-03-27T14:57:12.843Z · score: 2 (8 votes) · LW · GW

Y'know, you do sound mindkilled about NRx...

comment by Vaniver · 2015-03-27T17:36:04.506Z · score: 6 (6 votes) · LW · GW

Viliam_Bur is the person who gets messages asking him to deal with mass downvotes, so I am sympathetic to him not wanting us to attract more mass downvoters.

comment by Viliam_Bur · 2015-03-30T09:27:33.757Z · score: 0 (0 votes) · LW · GW

Not anymore, but yeah, this is where my frustration is coming from. Also, for every obvious example of voting manipulation, there are more examples of "something seems fishy, but there is no clear definition of 'voting manipulation' and if I go down this slippery slope, I might end up punishing people for genuine votes that I just don't agree with, so I am letting it go". But most of these voting games seem to come from one faction of LW users, which according to the surveys is just a tiny minority.

(When the "progressives" try to push their political agenda on LW -- and I don't remember them doing this recently -- at least they do it by writing accusatory articles, and by complaining about LW and rationality on other websites, not by playing voting games. So their disruptions do not require moderator oversight.)

comment by hairyfigment · 2015-03-30T01:32:25.795Z · score: 1 (1 votes) · LW · GW

I don't understand this word "was" - I just lost another 9+ karma paperclips to Eugine Nier.

Not to put too fine a point on it, but this seems less like a problem with political threads and more like a problem with someone driving most of the world's population (especially the educated western population) away from existential risk prevention in general and FAI theory in particular.

comment by ChristianKl · 2015-03-26T16:47:24.913Z · score: 5 (5 votes) · LW · GW

E.g. I disagree with John Rawls's veil-of-ignorance theory and even find it borderline disgusting (he is just assuming everybody is a risk-averse coward), but I don't see either myself or anyone else getting mind-killingly tribal over it

It's usually very hard to recognize when one gets mindkilled.

I disagree with John Rawls's veil-of-ignorance theory and even find it borderline disgusting (he is just assuming everybody is a risk-averse coward), but I don't see either myself or anyone else getting mind-killingly tribal over it. After all, it is not about a party. It is not about an election program. It is not about power. It is about ideas.

Empirical evidence from studies suggests that it takes very little to get people who can use Bayes' rule on abstract textbook problems to avoid using it when faced with a political subject where they care about one side winning. That is what "mind-killing" is about. People on LW aren't immune in that regard. I have plenty of times seen someone on LW make an argument on a political subject that they surely wouldn't make on a less charged subject, because the argument structure doesn't work.
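(The "abstract textbook problems" in question are things like the classic base-rate exercise, which people handle fine when the subject is emotionally neutral. A minimal sketch, using the standard illustrative numbers rather than figures from any particular study:)

```python
from fractions import Fraction

# The classic medical-test base-rate exercise: the kind of abstract
# problem people solve correctly when the subject is neutral.
# All numbers are the usual textbook illustrations, not real data.

prior = Fraction(1, 100)          # P(condition): 1% base rate
sensitivity = Fraction(80, 100)   # P(positive | condition)
false_pos = Fraction(96, 1000)    # P(positive | no condition)

# Bayes' rule: P(condition | positive test)
numerator = sensitivity * prior
evidence = numerator + false_pos * (1 - prior)
posterior = numerator / evidence

print(float(posterior))  # ≈ 0.0776: a positive test still means the condition is unlikely
```

The same person who computes this correctly on an exam may, on a politically charged topic, treat one striking "positive test" (a vivid anecdote) as near-proof, ignoring the base rate entirely.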

comment by [deleted] · 2015-03-27T09:34:05.803Z · score: 0 (0 votes) · LW · GW

Yes, but Bayesian rules are about predictions, e.g. would a policy do what it is expected to do, e.g. does raising the minimum wage lead to unemployment or not; political philosophy is one meta-level higher than that, e.g. is unemployment bad or not, or is it unjust or not. While it is perhaps possible, and perhaps preferable, to turn all questions of political philosophy into predictive models, with the remaining questions simply dissolved (i.e. is X fair?) if they cannot be, that has not been done yet, and that is precisely what could be done here. Because where else?

comment by ChristianKl · 2015-03-27T11:53:44.458Z · score: 0 (0 votes) · LW · GW

When talking about issues of political philosophy, people often talk quite vaguely, and are too vague to be wrong. That's not being mind-killed, but it's also not productive.

If you want to decide whether unemployment is bad or not, then factual questions about unemployment matter a great deal. How does unemployment affect the happiness of the unemployed? To what extent do the unemployed use their time to do something useful for society, like volunteering?

comment by Transfuturist · 2015-03-27T02:22:26.864Z · score: 1 (1 votes) · LW · GW

I disagree with John Rawls's veil-of-ignorance theory and even find it borderline disgusting (he is just assuming everybody is a risk-averse coward)

Um, what? What's wrong with risk-aversion? And what's wrong with the Veil of Ignorance? How does that assumption make the concept disgusting?

comment by [deleted] · 2015-03-27T09:29:14.027Z · score: 6 (6 votes) · LW · GW

First of all, there is the meta-level issue of whether to engage the original version or the pop version, as the first is better but the second is far, far more influential. This is an unresolved dilemma (same logic: should an atheist debate Ed Feser, or what religious folks actually believe?) and I'll just try to hover in between.

A theory of justice does not simply describe a nice-to-have world. It describes ethical norms that are strong enough to warrant coercive enforcement. (I'm not even a libertarian; I just don't like pretending democratic coercion is somehow not coercion.)

Rawls is asking us to imagine, e.g., what if we are born with a disability that requires a really large investment from society for such members to live an okay life; let's call the hypothetical remedy Golden Wheelchair Ramps.

Depending on how rigorously we look at it: in the more "pop" version, Rawls is saying our pre-born self would want GWRs built everywhere, even when it means that if we are born able and rich we are taxed through the nose to pay for them; in the more rigorous version, a 1% chance of being born with this disability would mean we want 1% of the GWRs built.

Now, this is all well if it is simply understood as the preferences of risk-averse people. After all, we have a real, true veil of ignorance after birth: we could become poor, disabled, etc. at any time. It is easy to lose birth privileges -- well, many of them at least. More risk-taking people will say: I don't really want to pay for GWRs, I am taking my gamble that I will be born rich and able, in which case I won't need them and I would rather keep that tax money. (This is a horribly selfish move, but Rawls set up the game so that it is only about fairness emerging out of rational selfishness; altruism is not required in this game, so I am just following the rules.)
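The choice being described can be sketched with numbers. A risk-neutral chooser behind the veil maximizes expected welfare; Rawls's maximin rule instead judges each policy by its worst possible outcome. All figures below are made up purely for illustration:

```python
# Hypothetical welfare numbers for the Golden Wheelchair Ramps choice.
# Every figure here is invented for illustration only.

P_DISABLED = 0.01  # chance of being born needing the ramps

# Lifetime welfare in each birth outcome, under each policy.
# With the ramp policy everyone pays a tax; without it,
# those born disabled are far worse off.
outcomes = {
    "build_ramps": {"able": 90, "disabled": 70},
    "no_ramps":    {"able": 100, "disabled": 10},
}

def expected_welfare(policy):
    """Risk-neutral evaluation: plain expected value behind the veil."""
    o = outcomes[policy]
    return (1 - P_DISABLED) * o["able"] + P_DISABLED * o["disabled"]

def maximin_welfare(policy):
    """Rawls's maximin rule: judge a policy by its worst outcome."""
    return min(outcomes[policy].values())

for policy in outcomes:
    print(policy, round(expected_welfare(policy), 2), maximin_welfare(policy))

# The risk-neutral chooser prefers "no_ramps" (99.1 > 89.8 expected),
# while the maximin chooser prefers "build_ramps" (worst case 70 > 10).
```

The disagreement in the comment above is exactly this: whether the maximin evaluation is a preference some people happen to hold, or a norm everyone can be coerced into.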

However, since it is a theory of justice, the preferences of risk-averse people are made mandatory, turned into social policy, and enforced with coercion. And that is the issue.

Now, how could Rawls (or pop-Rawlsians) get away with that? By assuming that all reasonable people are risk-averse anyway. In other words, by turning risk aversion into a tacit norm. Instead of seeing it negatively as a vice, or neutrally as a preference, it is basically a virtue here. Now, we have a perfect name for turning timidity into a norm: it is called cowardice.

And I think my argument managed to demonstrate avoiding mind-killing in politics, up to the last sentence, when I used a connotationally loaded word (cowardice). But at that point I had to, since I casually remarked earlier that I feel this way about it and now had to explain why. The last sentence refers only to my feelings and is not an integral part of the argument; for the argument itself, just stop reading at "risk aversion should not be made into a norm and coercively enforced under the name of justice".

Again, this is not part of the argument, but an explanation of my feelings: when I try to improve one of my vices or weaknesses, and I see others treat it almost as a norm, I feel disgust. For example, willful stupidity disgusts me; I think this feeling may be common around here. But as I am also trying to work on my own cowardice, being too accepting of it also disgusts me.

comment by Transfuturist · 2015-03-27T19:32:43.922Z · score: 0 (0 votes) · LW · GW

Thanks for the explanation. Do you have any alternatives?

comment by [deleted] · 2015-03-30T08:00:37.744Z · score: 2 (2 votes) · LW · GW
  1. How about no theory of justice? :) Philosophers should learn from scientists here: if you have no good explanation, none at all is more honest than a bad but seductive one. As a working hypothesis, we could consider our hunger for justice and fairness an evolved instinct -- a need, an emotion, a strong preference, something similar to the desire for social life or romantic love. It is simply one of the many needs a social engineer would aim to satisfy. The goal, then, is to make things "feel just" enough to check that checkbox.

  2. "To each his own." Reading Rawls and Rawlsians, I tend to sense a certain, how to put it, overly collective feeling: that there is one heavily interconnected world, that it is the property of all humankind, and that there is collective, democratic decision-making on how to make it suitable for all. In this kind of world nothing is exempt from politics; nothing is like "it is mine and mine alone and not to be touched by others". The question is: is this a hard reality derived from the necessities of a high-tech era, or just a preference? My preferences are far more individualistic than that. The attitude that everything is collective, to be shaped and formed in a democratic way, is IMHO far too often a power play by "sophists" who have a glib tongue, are good at rhetoric, and can easily shape democratic opinion. I am an atheist, but "culturally Catholic" enough to find the parable of the snake offering the fruit useful: it is not only through violence, but also through glib, seductive persuasion -- through 100% consent -- that a lot of damage can be done.

This is something not properly understood in the modern world: we understand how violence, oppression, or outright fraud can be bad, but we do not really realize how much harm a silver tongue can cause without even outright lying. Because we already live in societies where silver-tongued intellectuals are the ruling class, they underplay their own power by lionizing consent and freedom of speech as institutions that can reasonably be considered to lead to good results.

I mean, for example, a truly realistic society would censor arguments that feel good. This sounds super weird: we are used to either complete freedom of speech, or to censorship based on imputed harm or untruth -- but censoring even true and useful ideas if they feel too good? Yes, as long as we understand censorship as a cost, not an impenetrable barrier: putting a cost on ideas that feel good would neutralize that feeling and thus enable us to judge the idea on a rational basis, without an affective bias.

Compare that to the real world and realize we are living in a sophists' paradise, where feel-good ideas have power through democratic consent.

I would want a far more autist-friendly world than that, and the way I would imagine it is with some clear fences, Schelling points, whatnot: some kind of "this is mine, this is yours, and these things are not subject to the political process or democratic-collective consensus; only those and those things are subject to it". This would be my own risk aversion: to have some minimal insurance against the losses "sophists" can inflict on me by persuading public opinion.

comment by seer · 2015-03-27T02:56:24.990Z · score: 0 (4 votes) · LW · GW

The problem is that Rawls asserts that everyone is maximally risk-averse.

comment by JoshuaZ · 2015-03-30T17:31:28.632Z · score: 1 (1 votes) · LW · GW

I don't think Rawls makes that assertion. Rawls does presume some amount of risk aversion, but it seems highly inaccurate to say that Rawls asserts that "everyone is maximally risk-averse."

comment by Transfuturist · 2015-03-27T05:06:12.324Z · score: -4 (4 votes) · LW · GW

.