"Politics is the mind-killer" is the mind-killer
post by thomblake · 2012-01-26T15:55:22.746Z · LW · GW · Legacy · 99 comments
Summary: I propose we somewhat relax our stance on political speech on Less Wrong.
Related: The mind-killer, Mind-killer
A recent series of posts by a well-meaning troll (example) has caused me to re-examine our "no-politics" norm. I believe there has been some unintentional creep from the original intent of Politics is the Mind-Killer. In that article, Eliezer is arguing that discussions here (actually on Overcoming Bias) should not use examples from politics in discussions that are not about politics, since they distract from the lesson. Note the final paragraph:
I'm not saying that I think Overcoming Bias should be apolitical, or even that we should adopt Wikipedia's ideal of the Neutral Point of View. But try to resist getting in those good, solid digs if you can possibly avoid it. If your topic legitimately relates to attempts to ban evolution in school curricula, then go ahead and talk about it - but don't blame it explicitly on the whole Republican Party; some of your readers may be Republicans, and they may feel that the problem is a few rogues, not the entire party. As with Wikipedia's NPOV, it doesn't matter whether (you think) the Republican Party really is at fault. It's just better for the spiritual growth of the community to discuss the issue without invoking color politics.
So, the original intent was not to ban political speech altogether, but to encourage us to come up with less-charged examples where possible. If the subject you're really talking about is politics, and it relates directly to rationality, then you should be able to post about it without getting downvotes strictly because "politics is the mind-killer".
It could be that this drift is less of a community norm than I perceive, and there are just a few folks (myself included) that have taken the original message too far. If so, consider this a message just to those folks such as myself.
Of course, politics would still be off-topic in the comment threads of most posts. There should probably be a special open thread (or another forum) to which drive-by political activists can be directed, instead of simply saying "We don't talk about politics here".
David_Gerard makes a similar point here (though FWIW, I came up with this title independently).
99 comments
comment by Raemon · 2012-01-26T17:14:06.201Z · LW(p) · GW(p)
I think the politics taboo is one of the best things about Less Wrong.
Yes, it's also a frustrating thing, because politics is important and full of relevant examples about rationality. But if you think you have an insightful, rational point to make about politics that will not degenerate into a sprawling discussion with negative utility... you are probably wrong.
Replies from: Curiouskid
↑ comment by Curiouskid · 2012-01-30T22:57:28.872Z · LW(p) · GW(p)
There are actually some of these sprawling discussions of negative utility on LW already if you look at some of the seasteading threads.
comment by cousin_it · 2012-01-26T17:59:49.811Z · LW(p) · GW(p)
I like the idea of starting a Politics Open Thread if it means I won't see any more political comments elsewhere on LW. Also it would work as a nice experiment to convince libertines like you that encouraging political discussion isn't a good idea, or convince curmudgeons like me that it is.
Replies from: komponisto, TimS
↑ comment by komponisto · 2012-01-26T20:09:44.052Z · LW(p) · GW(p)
I like the idea of starting a Politics Open Thread if it means I won't see any more political comments elsewhere on LW.
It won't. Instead, what will happen is that people will start attaching the mental labels of "Blue" and "Green" to other commenters, based on encounters in such a thread, and these labels will apply everywhere, and consequently distort the discussions and the voting on all topics.
I agree with thomblake that the original intent of the "Politics is the Mind-Killer" doctrine wasn't to ban politics (and even that post itself wasn't intended as official Overcoming Bias policy, just advice from Eliezer!), but I am also 100% with Raemon in endorsing the anti-politics norm that has subsequently developed.
But note that the norm itself, like most human norms, is not an absolute or rigid one, just a scale of increasing costs or penalties with increasing severity of "violations". It's always been okay to mention politics in a way that shows you "know what you're doing" (proof: I have); high-status people are allowed more leeway than the lower-status (except for the very highest-status individuals, on whom norms are often strictly enforced for symbolic reasons); etc.
Theoretically, if we really needed to discuss politics (e.g. if there were pending legislation before the U.S. Congress to regulate FAI research; if Obama had criticized Republicans by invoking LW concepts in his State of the Union speech; if Putin had promised to make cryonics mandatory for everyone in Russia; you get the idea), we could.
Replies from: None, Eugine_Nier, TimS
↑ comment by [deleted] · 2012-01-26T22:49:33.078Z · LW(p) · GW(p)
It won't. Instead, what will happen is that people will start attaching the mental labels of "Blue" and "Green" to other commenters, based on encounters in such a thread, and these labels will apply everywhere, and consequently distort the discussions and the voting on all topics.
This. I can't tell you how grateful I am that I have no idea about the politics of most posters I'm familiar with.
Replies from: lavalamp
↑ comment by Eugine_Nier · 2012-01-26T23:48:23.158Z · LW(p) · GW(p)
It won't. Instead, what will happen is that people will start attaching the mental labels of "Blue" and "Green" to other commenters, based on encounters in such a thread
I can already do this to many commenters based on their comments in the existing threads.
↑ comment by TimS · 2012-01-27T03:34:15.810Z · LW(p) · GW(p)
The problem is that the norm of the politics-ban is quite broad. Basically everything that the "Personal is Political" crowd would label political is swept in.
Not only is discussion of the latest maneuverings of Newt vs. Mitt prohibited, but discussion of democracy vs. authoritarianism, feminism, the purpose of juries, etc. is considered off limits by a vocal portion of LessWrong. I have no desire to debate whether Obama's State of the Union was good policy or good politics, but the broadness of the negative reaction excludes a lot of conceptspace, to the point that there are real world problems it's very difficult to discuss here.
In short, there's a reason why I was talking about a Political Theory Open Thread, not a Politics Open Thread.
↑ comment by TimS · 2012-01-26T18:33:06.237Z · LW(p) · GW(p)
I like the idea of a political theory thread, but before I do it, I think it's worthwhile to think about some ground rules in order for it to be productive.
- Arguments still aren't soldiers. Being mindkilled is still bad.
- Read posts charitably, even if you intend to steelman
- Don't say "Your position requires you to kick puppies" unless you genuinely believe the poster is unaware of that fact.
- What happens in Political Theory Open Thread stays in Political Theory Open Thread. Edit: In short, beware the halo effect.
Any other points I should add (particularly about voting/karma)?
Edit:
- Distrust your impulse to vote on something. Particularly if you are emotionally engaged. Politics is the mindkiller.
- Extreme contrarianism for its own sake is probably not valuable.
↑ comment by roystgnr · 2012-01-26T19:46:51.882Z · LW(p) · GW(p)
"Arguments are soldiers" is practically the definition of democracy. In theory, if my arguments are persuasive enough, they will determine whether or not my neighbors or I can continue doing X or start doing Y without being fined, jailed, or killed for it. Depending on what great things I like to do or what horrible things I want to prevent my neighbors from doing, that's an awfully powerful incentive for me to risk a few minds being killed.
Now, in practice we mostly live in near-megaperson cities in multi-megaperson districts of near-gigaperson countries, whereas my above theory mostly applies to hectoperson and kiloperson tribes. But my ape brain can't quite internalize that, so the subconscious incentive remains.
But that's not even the worst of it! I try to read a range of liberal, conservative, libertarian, populist etc. news and commentary, just so that the gaps in each don't overlap so much... but it requires a conscious effort. Judging by the groupthink in reader comments on these sites, most people's behavior is the opposite of mine. Why not? Reading about how right you are is fun; reading about how wrong you are is not.
It would be very easy for new would-be LessWrong readers to see the politics threads, jump to conclusions like "Oh, these people think they're so smart but they're actually a bunch of Blues! A wise Green like me should look elsewhere for rationality." Repeat for a few years and the average LessWrong biases really do start to skew Blue, even bad Blue-associated ideas start going unchallenged, etc.
I think I would still love to read what LessWrong users have to say about politics. Probably on a different site. With unconnected karma and preferably unconnected pseudonyms.
Replies from: TimS, Viliam_Bur, AlexanderRM
↑ comment by TimS · 2012-01-26T19:53:17.052Z · LW(p) · GW(p)
"Arguments are soldiers" is practically the definition of democracy.
Respectfully, that's not a correct use of the metaphor. The point is that unwillingness to disagree with other positions simply because those positions reach the desired conclusion is evidence of being mindkilled. You don't shoot soldiers on your side, but for those thinking rationally, arguments are not soldiers, so bad ideas should always be challenged.
It would be very easy for new would-be LessWrong readers to see the politics threads, jump to conclusions like "Oh, these people think they're so smart but they're actually a bunch of Blues! A wise Green like me should look elsewhere for rationality." Repeat for a few years and the average LessWrong biases really do start to skew Blue, even bad Blue-associated ideas start going unchallenged, etc.
This is a real risk, but it's worth assessing (and figuring out how to assess) how likely it is to occur.
Replies from: roystgnr
↑ comment by roystgnr · 2012-01-26T23:51:23.802Z · LW(p) · GW(p)
By "thinking rationally", you must mean epistemically, not instrumentally.
If (to use as Less-Wrong-politically-neutral an allegory as I can) you are vastly outnumbered by citizens who are wondering if maybe those birds were an omen telling us that Jupiter doesn't want heretics thrown to the lions anymore, I agree that the epistemically rational thing to do is point out that we don't have much evidence for the efficacy of augury or the existence of Jupiter, but the instrumentally rational thing to do is to smile, nod, and point out that eagles are well-known to convey the most urgent of omens. In more poetic words: you don't shoot soldiers on your side.
The metaphor seems to be as correct as any mere metaphor can get. Is it such a stretch to call an argument a "soldier" for you when it's responsible for helping defend your life, liberty, or property?
Replies from: TimS, Strange7
↑ comment by TimS · 2012-01-27T00:13:44.735Z · LW(p) · GW(p)
First, that's not the metaphor we were discussing. Second, the metaphor you are using allows arguments to be soldiers of any ideology, not simply democracy.
Replies from: roystgnr, bio_logical
↑ comment by roystgnr · 2012-01-27T19:51:12.392Z · LW(p) · GW(p)
I have read "Politics is the mindkiller" and am discussing the same metaphor. For that matter, I'm practically recapitulating the same metaphor, to make an even stronger point: not only can politics provoke irrational impulses to support poor arguments on your "side", politics can create instrumentally rational incentives to (publicly, visibly, not internally) support poor arguments. Sometimes you support a morally dubious soldier because of jingoism, sometimes you support him because he's the best defense in between you and an even worse soldier.
Would you be more specific about how you think my use of the metaphor is different and/or invalid?
I do think I've given a compelling counterexample to "bad ideas should always be [publicly] challenged". (my apologies if the implicit [publicly] here was not your intended claim, but the context is that of a proposed public discussion) Have you changed your mind about that claim, or do you see a problem with my reasoning? For that matter, in my hypothetical political forum would you be arguing for atheism or for more compassionate augury yourself?
The preposition of your second sentence suggests a miscommunication of my initial claim. I didn't intend to say "arguments are soldiers of democracy", but rather "arguments are soldiers in a democracy". You're still right that this also applies to non-democracies: in any state where public opinion affects political policy, incentives exist to try and steer opinion towards instrumentally rational ends even if this is done via epistemically irrational means. Unlimited democracy is just an abstract maximum of this effect, not the only case where it applies.
Replies from: TimS
↑ comment by TimS · 2012-01-27T22:09:16.472Z · LW(p) · GW(p)
In brief, I think my interpretation is right because it is consistent with the intended lesson, which is "Don't talk about Politics on LessWrong." In other words, I understood the point of the story to be that treating arguments as soldiers interferes with believing true things.
I agree that "bad ideas should be publicly challenged" is only true if what I'm trying to do is believe true theories and not believe false theories. If I'm trying to change society (i.e. do politics), I shouldn't antagonize my allies. The risk is that I will go from disingenuously defending my allies' wrong claims to sincerely believing my allies' wrong claims, even in the face of the evidence. That's being mindkilled. In short, engaging in the coalition-building necessary to do politics is claimed to cause belief in empirically false things. I.e. "Politics is the Mindkiller."
Replies from: roystgnr, bio_logical
↑ comment by roystgnr · 2012-01-30T18:37:41.474Z · LW(p) · GW(p)
My interpretation could be summarized in similar fashion as "really, really, don't talk about politics on LessWrong" - whether this is "consistent" or not depends on your definition of that word.
I agree with your interpretation of the point of the story... and with pretty much everything else you wrote in this comment, which I guess leaves me with little else to say.
Although, that's an example of another issue with political forums, isn't it? In an academic setting, if a speaker elicits informed agreement from the audience about their subject, that means we've all got more shared foundational material with which to build the discussion of a closely related subsequent topic. Difficult questions without obvious unanimous answers do get reached eventually, but only after enough simpler related problems have been solved to make the hard questions tractable.
Politics instead turns into debates, where discussions shut down once agreement occurs, then derail onto the less tractable topics where disagreement is most heated. Where would we be if Newton had decided "Yeah, Kepler's laws seem accurate; let me just write 'me too' and then we're on to weather prediction!"
↑ comment by bio_logical · 2013-10-28T02:57:26.446Z · LW(p) · GW(p)
In short, engaging in the coalition-building necessary to do politics is claimed to cause belief in empirically false things. I.e. "Politics is the Mindkiller."
To me, this just shows that a ban on political argumentation is the very last thing that Lesswrong needs. The accusation of being "mind-killed" is levied by those whose minds are too emotionally dysfunctional for them to tell the difference between abolition and slave ownership (after all, one is blue and the other is green, and there couldn't very well be an objective reason for either side holding their position, could there?).
The ability to stifle debate with an ad hominem and a karmic downgrade is the mark of a totalitarian (objectively unintelligent) forum, not a democratic (more intelligent than totalitarian) one. Now a libertarian and democratic forum with smart filters? That's smarter still. In hindsight, everyone agrees so, but in present scenarios, many people are corrupted or uneducated, and lack comprehension.
This is one of the primary reasons posts are labeled as mind-killed --because those posts are actually higher-level comprehension, and what people don't understand, they often attempt to destroy (especially where force is involved --people generally hate to be held accountable for possessing evil beliefs, and politics is the domain of force).
Every argument I make, no matter how seemingly mind-killed it is, is always able to be defended by direct appeal to the evidence. Many people don't understand the evidence, though, or they deny it. Evidence that places sociopaths and their conformists on the wrong side of morality will always be fought, tooth and nail. To test this out, tell your entire family that they're all thieves, no better than the Nazis who watched train cars of Jews go by in the distance, at your next Thanksgiving meal. (Don't actually try this. LOL.)
Still, at some point, there was a family gathering prior to Nazi Germany, where all hope hadn't been lost, and someone told their family that they should all buy rifles and join the resistance. That person was right. He was reported, sent to prison, and murdered by the prevailing "consensus view." ...So the Warsaw Jews had to figure it out later, and resist with a far smaller chance of success.
As John Ross wrote in "Unintended Consequences" if you wait to stand up for what's right until you're 98 pounds and being herded onto a cattle car with only the clothes on your back, it's too late for you to have a chance at winning. You need to deploy soldiers when you'll be hooted down for deploying soldiers. And, you need to be certain you're in the right, while deploying soldiers.
The best thing possible is to make sure that your soldiers are defending something defensible at its core. The best way to do this is to quickly show that such soldiers are not in the wrong, and clearly aren't in the wrong. If you're defending Democrats, Republicans, most Libertarians, Greens, or Constitution Party candidates, you have a difficult row to hoe if this is your goal.
Far less difficult is an issue-based stance, and philosophical stance, on any given political subject. So yes, soldiers can be deployed, and here at LW, one would ideally wish to distance oneself from identification with bad arguments or poor defenses of an idea. ...So just refrain from up-voting it. Not difficult.
Of course, once someone is tarred with "bad Karma" that's a scarlet letter that prevents anything useful from that account from ever being considered --an ad hominem attack on all ideas from that account, no matter how valid they are.
Replies from: fubarobfusco, Lumifer
↑ comment by fubarobfusco · 2013-10-28T03:51:49.265Z · LW(p) · GW(p)
If you cannot speak without insulting your audience, you probably aren't going to convince anyone.
This comment would be much better, therefore, without the insults — the "emotional dysfunction", the "totalitarian (objectively unintelligent)", the "corrupted or uneducated", the "sociopaths and their conformists", and so on, and so on, ad nauseam.
↑ comment by Lumifer · 2013-10-28T03:30:14.437Z · LW(p) · GW(p)
Still, at some point, there was a family gathering prior to Nazi Germany, where all hope hadn't been lost, and someone told their family that they should all buy rifles and join the resistance. That person was right.
No, he was wrong. The right thing to buy was tickets overseas.
You need to deploy soldiers when you'll be hooted down for deploying soldiers. And, you need to be certain you're in the right, while deploying soldiers.
I see a certain... tension between these two sentences.
↑ comment by bio_logical · 2013-10-28T02:39:09.197Z · LW(p) · GW(p)
Are some ideologies more objectively correct than others? (Abolitionists used ostracism and violence to prevail against those who would return fugitive slaves south. Up until the point of violence, many of their arguments were "soldiers." One such "soldier" was Spooner's "The Unconstitutionality of Slavery" --from the same man who later wrote "the Constitution of No Authority." He personally believed that the Constitution had no authority, but since it was revered by many conformists, he used a reference to it to show them that they should alter their position to support of abolitionism. Good for him!)
If some ideologies are more correct than others, then those arguments which are actually soldiers for those ideologies have strategic utility, but only as strategic "talking points," "soldiers," or "sticky" memes. Then, everyone who agrees with using those soldiers can identify them as such (strategy), and decide whether it's a good strategic or philosophical, argument, or both, or neither.
↑ comment by Strange7 · 2013-03-22T07:34:09.320Z · LW(p) · GW(p)
You seem to have excluded a middle option, namely "I am in favor of heretics not being thrown to the lions, and no amount of bird-related omen interpretation will sway my opinion on the subject one way or another."
Replies from: bio_logical
↑ comment by bio_logical · 2013-10-28T04:17:49.361Z · LW(p) · GW(p)
Here on Lesswrong, I'd favor such an argument. However, what happens when you look at a giant crowd of people with their bird masks on, and all of them are looking at you for an answer, and they're about to throw the heretic to the lions, because they lack moral consciences of their own? It's hard to argue against a "dishonest" strategic argument that still allows the heretic to live, when logic is out-gunned. Even so, I think that such a thing could be stated here, especially with an alias, in case you're called for jury duty in the future and want to "Survive Voir Dire."
...This is an old political question. There are a lot of people who were forced to answer it in times when right and wrong suddenly came into clear focus because it became "life or death." Anne Frank is hiding in the attic: you have to be "dishonest" to the Nazis who are looking for her. In that case, dishonesty is not only "legitimate" it's the ONLY moral course of action. If you tell the truth to the Nazis, you are then morally reprehensible. You are morally reprehensible if you don't even lie convincingly.
Here's another example where the status quo is morally wrong, and (narrow, short-term, non-systemic, low-hierarchical-level) dishonesty is the only morally acceptable pathway: A fugitive slave has escaped, and is being pursued by Southern bounty-hunters and also Northern judges, cops, and prosecutors. He can be forcibly returned on your ex parte testimony, and you'll even get reward money. Yet, if you don't make up a lie, you're an immoral part of a system of slavery, and an intellectual coward.
Here's an example that is less clear to the bootlickers and tyrant-conformists among the Lesswrong crowd: You're called for jury duty. The judge is trying to stack the jury full of people who will agree to "apply the law as he gives it to them." Since the other veniremen are simpletons who have no curiosity about the system they live under that goes beyond the platitudes they learned in their government school, the judge is likely to succeed. You however, are an adherent to the philosophy of Eliezer Yudkowsky, and you have read about jury rights on a severely down-ranked "mind-killed" comment at Lesswrong. You know the defendant's only hope is someone who knows that the judge is legally allowed to lie to the jury, the same way the police are, by bad Supreme Court precedent. You know that the victimless crime defendant's only hope is an independent thinker who will get seated on the jury and then refuse to convict. The defendant will be sent to a rape room for 20 years, to have his young life stolen from him, and have his hopes and dreams destroyed, if you fail to answer the "voir dire" questions like the other conformists, and fail to get seated. So, you get seated, and then, knowing that you are superior in power to the judge once seated, you vote to acquit, exercising your right to nullify the evil laws the defendant is charged with breaking.
All three of the prior lessons reference the same principle: lying to an illegitimate system is proper and moral. Yet, the powerful status quo derides this course of action as "immoral." Thus, it is the domain of proper philosophy to address the issue, and provide guidance to those who lack emotional intelligence.
(Ideally, if Lesswrongers are actually "less wrong" about a subject, the uprank and downrank features could begin to indicate real political intelligence, or how closely one's argument mirrors reality, or a viable philosophical position. --regardless of whether advocates in one direction or another are "mind-killed" or not. Even a "brain-dead" or "mind-killed" person can say 2+2=4. So maybe that truth doesn't get upvoted, because it's obvious. It's still true. And in times of universal deceit, telling the truth is a revolutionary act.)
Ultimately, political arguments decide policy. Policy will then decide which innocents will live or die, and whether those innocents will be killed by you, or defended by you.
That's what politics is. That most people lack any kind of a political philosophy and simply "root for their color" is a tangential aside that has now superseded the legitimacy of the debate.
I prefer to have arguments act as soldiers, because that's still preferable to actual soldiers acting as soldiers. That's still debate. We're all adults here. My feelings won't be hurt when this is downvoted into oblivion and I need to create another profile in order to down-vote somebody's stupid (unwittingly self-destructive) comment.
Which, by the way, should be the criterion for judging all political arguments: what is the predicted outcome? What is the utility? What is the moral course of action based on a common moral standard? How do the good guys win?
Good guys: abolitionists, allies in WWII, people who sheltered Shin Dong-Hyuk and didn't report him to the secret police in his Escape from Camp 14, the Warsaw ghetto uprising's marksmen (not the ones who tried to inform on them, or who counseled putting faith in "god")
Worthless: The people hooting down debate as "mind-killing," those who counseled faith in god in the warsaw ghetto, the people who turn anti-government meetings into prayer sessions, those who gave up their friends to avoid being killed by the KGB, etc., those who suggest silencing political debate about ending the drug war because "it's a downer" (as much of a "downer" as living 14 years or more behind bars like Gary Fannon? --you callous, uncaring pukes!)
Bad guys: the slave owners, the plantation owners who politically opposed abolition, the Nazis, the KGB, the teacher who beat the little girl to death for hiding a few kernels of corn in her pocket inside North Korea's Camp 14, those who want the drug war to continue because they profit from it, people with a lot of private property who vote for the state to control all private property, etc.
Being dim witted and shutting down debate is not being the opposite of mind-killed. It's not being philosophical. It's being brain-dead far worse than being mind-killed --it's being "inanimate to begin with," or "still-born."
That would be a good comeback for those accused of being "mind-killed." "Tell me how I'm mindlessly taking a side of an irrational argument, or bear the true appellation of 'still-born' or philosophically absent, follower, conformist."
And isn't that the most damning charge anyway? "Conformist." Someone who adopts a philosophical position without any reason, simply because there's safety in numbers, and someone in authority gave the command. Big strong men who don't dare to defend a logical principle, physically brave, but intellectually weak. ...The core of the Nuremberg defense, which was universally ruled illegitimate by western philosophy, law, and civilization.
I suspect that the real fear on this board is that narrow logic divorced from reality is no longer adequate to defend one's reputation as a thinker.
"Going with the flow" might work in an uncorrupted, civilized regime. Now, show me one! This is really why people don't want to have to reference reality. Reality implies a bare minimum standard in terms of moral responsibility, and that's the most terrifying idea known to the majority of men and women, worldwide.
How else do you explain the very low moral standards and corresponding bad results of the majority?
The majority are conformists, guided by power-seeking sociopaths. This isn't just a fringe theory, it's a truth referenced by all great political thinkers. To deny this omnipresent truth is to indicate an internal problem with moral comprehension, or basic philosophy.
Those who want to kill or punish anyone should be highly suspect, and a natural question follows: would that be retaliatory force, or "preemptive" force? There is always a path to the truth for those who know how to ask the right questions. Rather than point at someone like Donald Sutherland in "Invasion of the Body Snatchers" while typing out "mind-killed," perhaps it would make sense learning a little bit of economics, law, and libertarian philosophy, and asking some questions about it to try to see where it's fundamentally mistaken. The same goes for supporting arguments of a political position.
I'm always willing to tell you why I think I'm right, and offer evidence for it that meets you on your own terms and your own comprehension of reality, and individual facts and evidence within it. I can drill down as far as anyone wishes to go.
What I can't do is respond in any meaningful way to a crowd of people yelling "mind-killed" as a thrown bottle bounces off my lectern and my mic is turned off. That's what Karma does to political conversations. It lets those who feel intelligent kill the debate, and kill the emergence of the Lesswrong cybernetic mind.
“Political tags — such as royalist, communist, democrat, populist, fascist, liberal, conservative, and so forth — are never basic criteria. The human race divides politically into those who want people to be controlled and those who have no such desire. The former are idealists acting from highest motives for the greatest good of the greatest number. The latter are surly curmudgeons, suspicious and lacking in altruism. But they are more comfortable neighbors than the other sort.”
― Robert A. Heinlein
“Delusions are often functional. A mother's opinions about her children's beauty, intelligence, goodness, et cetera ad nauseam, keep her from drowning them at birth.”
― Robert A. Heinlein, Time Enough for Love
“Goodness without wisdom always accomplishes evil.”
― Robert A. Heinlein
Replies from: Lumifer
↑ comment by Viliam_Bur · 2012-01-27T13:45:13.129Z · LW(p) · GW(p)
Reading about how right you are is fun; reading about how wrong you are is not.
I don't read about how I am wrong. I only read about how other people (sometimes including my former selves) are wrong, and that's fun too.
↑ comment by AlexanderRM · 2015-03-21T03:34:52.388Z · LW(p) · GW(p)
Seconded on the different site, unconnected karma and unconnected pseudonyms. Also, it'd be nice if it could somehow be somewhat dissociated from LW... might be useful to have a link to it easily visible, actually, but if there is one it should be right next to a specification explaining the idea and linking to "politics is the mind-killer".
Separately, the idea of retaining a taboo on things like discussing politicians or the like, and restricting it to mostly issue discussions, also sounds useful.
↑ comment by thomblake · 2012-01-26T18:40:00.873Z · LW(p) · GW(p)
Any other points I should add (particularly about voting/karma)?
Downvote spam, but otherwise avoid voting up or down - we're likely to be voting for biased reasons.
Replies from: cousin_it, TimS, lessdazed↑ comment by cousin_it · 2012-01-26T18:44:03.592Z · LW(p) · GW(p)
That's an awesome idea. Maybe amend it to "downvote spam, otherwise vote everything toward 0" so a minority of politically-motivated voters can't spoil the game for everyone else?
Replies from: TimS, TimS↑ comment by TimS · 2012-01-26T21:10:55.468Z · LW(p) · GW(p)
In addition to my other comment, I think it will be hard to enforce a voting norm that is so inconsistent with the voting norms on the rest of the site.
Replies from: erratio↑ comment by erratio · 2012-01-27T00:08:00.337Z · LW(p) · GW(p)
Disagree; there are successful instances of using karma in ways inconsistent with the rest of the site.
The most important counterexample here is Will Newsome's Irrationality Game post, where voting norms were reversed: the weirdest/most irrational beliefs were upvoted the most, and the most sensible/agreeable beliefs were downvoted into invisibility. Many of the comments in that thread, especially the highest-voted, have disclaimers indicating that they operate according to a different voting metric. There is no obvious indication that anyone was confused or malicious with regard to the changed local norm.
↑ comment by TimS · 2012-01-26T18:41:57.357Z · LW(p) · GW(p)
Hmm. How about:
Please reserve downvotes for failure to engage, not simply disagreement. Consider not upvoting at all.
Spam is not engagement, and the poster whose posts prompted this discussion post was not really interested in a discussion.
Replies from: thomblake, thomblake↑ comment by thomblake · 2012-01-26T18:45:01.606Z · LW(p) · GW(p)
Consider not upvoting at all.
Sounds good. It has the side effect of adding a perceived cost to posting in the thread: you're more likely to be downvoted.
Please reserve downvotes for failure to engage, not simply disagreement.
I generally counsel not downvoting for disagreement anywhere on the site. I think this needs to be stronger.
Replies from: TheOtherDave, TimS↑ comment by TheOtherDave · 2012-01-26T19:10:30.761Z · LW(p) · GW(p)
Mm. I sometimes upvote for things I think are good ideas, as an efficient alternative to a comment saying "Yes, that's right." I sometimes downvote for things I think are bad ideas, as an alternative to a comment saying "Nope, that's wrong." While I would agree that in the latter case a downvote isn't as good as a more detailed comment explaining why something is wrong, I do think it's better than nothing.
So, consider this an opportunity to convince someone to your position on downvotes, if you want to: why ought I change my behavior?
Replies from: thomblake↑ comment by thomblake · 2012-01-26T19:23:15.120Z · LW(p) · GW(p)
Voting is there to encourage/discourage some kinds of comments. We don't want people to not make comments just because we disagree with their contents, so we shouldn't downvote comments for disagreement.
If someone makes a good, well-reasoned comment in favor of a position I disagree with, that merits an upvote and a response.
It might be nice to have a mechanism for voting "agree/disagree" in addition to "high quality / low quality" (as I proposed 3 years ago), but in the absence of such a mechanism we should avoid mixing our signals.
The comments that float to the top should be the highest-quality, not the ones most in line with the LW party line.
And people should be rewarded for making high-quality comments and punished for making low-quality comments, not rewarded for expressing popular opinions and punished for expressing unpopular opinions.
Replies from: TheOtherDave↑ comment by TheOtherDave · 2012-01-27T01:19:47.202Z · LW(p) · GW(p)
I agree that good, well-reasoned comments don't merit downvotes, even if I disagree with the position they support. I agree that merely unpopular opinions don't merit downvotes. I agree that low-quality comments in line with the LW party line don't merit upvotes. I agree that merely popular opinions don't merit upvotes. I agree that voting is there to encourage and discourage some kinds of comments.
What's your position on downvoting a neither-spectacularly-well-or-poorly-written comment expressing an idea that's simply false?
Replies from: saturn, thomblake, bio_logical↑ comment by saturn · 2012-01-27T10:13:59.673Z · LW(p) · GW(p)
I don't think that type of comment should be downvoted except when the author can't take a hint and continues posting the same false idea repeatedly. Downvoting false ideas won't prevent well-intentioned people from making mistakes or failing to understand things, mostly it would just discourage them from posting at all to whatever extent they are bothered by the possibility of downvotes.
↑ comment by bio_logical · 2013-10-28T05:02:23.301Z · LW(p) · GW(p)
An idea that's false but "spectacularly well-written" should be downvoted to the extent of its destructiveness. Stupidity (the tendency toward unwitting self-destruction) is what we're trying to avoid here, right? We're trying to avoid losing. Willful ignorance of the truth is an especially damaging form of stupidity.
Two reasonably intelligent people are unlikely to arrive at completely different and antithetical viewpoints. Thus, the very well-written but false viewpoint is far more damaging than the clearly stupid false viewpoint. If this site helps people avoid damaging their property (their brain, their bodies, their material possessions), or minimizes systemic damage to those things, then it's more highly functional, and the value is apparent even to casual observers.
Such a value is sure to be adopted and become "market standard." That seems like the best possible outcome, to me.
So, if a comment is seemingly very well-reasoned, but false, it will actually help to expand irrationality. Moreover, it's more costly to address the idea, because it "seems legit." Thus, to not sound like a jerk, you have to expend energy on politeness and form that could normally be spent on addressing substance.
HIV tricks the body into believing it's harmless by continually changing and "living to fight another day." If it was a more obvious threat, it would be identified and killed. I'd rather have a sudden flu that makes me clearly sick, but that my body successfully kills, than HIV that allows me to seem fine, but slowly kills me in 10 years. The well-worded but false argument is like a virus that slips past your body's defenses or neutralizes them. That's worse than a clearly dangerous poison because it isn't obviously dangerous.
False ideas are most dangerous when they seem to be true. Moreover, such ideas don't need to seem true to smart people; it's enough for them to seem true to 51% of voters.
If 51% of voters can't find fault with a false idea, it can be as damaging as "the state should own and control all property." Result: millions murdered (and we still dare not talk about it, lest we be accused of being "mind killed" or "rooting for team A to the detriment of team B" --as if avoiding mass murder weren't enough of a reason for rooting for a properly-identified "right team").
Now, what if there's a reasonable disagreement, from people who know different things? Then evidence should be presented, and the winner should become clear, or a vital area where further study is needed can be identified.
If reality is objective, but humans are highly subjective creatures due to limited brain (neocortex) size, then argument is a good way to make progress toward a Lesswrong site that exhibits emergent intelligence.
I think that's a good way to use the site. I would prefer to have my interactions with this site lead me to undiscovered truths. If absolutely everyone here believes in the "zero universes" theory, then I'll watch more "Google tech talks" and read more white papers on the subject, allocating more of my time to comprehending it. If everyone here says it's a toss-up between that and the multiverse theory, or "NOTA," I might allocate my time to an entirely different and "more likely to yield results" subject.
In any case, there is an objective reality that all of us share "common ground" with. Thus, false arguments that appear well reasoned are always poorly-reasoned, to some extent. They are always a combination of thousands of variables. Upranking or downranking is a means for indicating which variables we think are more important, and which ones we think are true or false.
The goal should always be an optimal outcome, including an optimal prioritization.
If you have the best recipe ever for a stevia-sweetened milkshake, your argument for it is true, valid, and good, and I make the milkshake, think it's the best thing ever, and it contains other healthy ingredients that I think will help me live longer, then sharing it serves a rational goal. I'm drinking something tasty, and living longer, etc. However, if I downvote the comment because I don't want Lesswrong to turn into a recipe-posting board, that might be more rational.
What's the greatest purpose to which a tool can be used? True, I can use my pistol to hammer in nails, but if I do that, and I eventually need a pistol to defend my life, I might not have it, due to years of abuse or "sub-optimal use." Also, if I survive attacks against me, I can buy a hammer.
A Lesswrong "upvote" contains an approximation of all of that. Truth, utility, optimality, prioritization, importance, relevance to community, etc. Truth is a kind of utility. If we didn't care about utility, we might discuss purely provincial interests. However: Lesswrong is interested in eliminating bad thinking, and it thus makes sense to start with the worst of thinking around which there is the least "wiggle room."
If I have facial hair (or am gay), Ayn Rand followers might not like me. Ayn Rand often defended capitalism. By choosing to distance herself from people over their facial hair, she failed to prioritize her views rationally, and to perceive how others would shape her views into a cult through their extended lack of proper prioritization. So, in some ways, Rand (like the still worse Reagan) helped to delegitimize capitalism. Still, if you read what she wrote about capitalism, she was 100% right, and if you read what she wrote about facial hair, she was 100% superficial and doltish. So, on an Ayn Rand forum, if someone begins defending Rand's disapproval of facial hair, I might point out that in 2006 the USA experienced a systemic shock to its fiat currency system, and try to direct the conversation to more important matters.
I might also suggest leaving the discussions of facial hair to Western wear discussion boards.
It's vital to ALWAYS include an indication of how important a subject is. That's how marketplaces of ideas focus their trading.
Replies from: TheOtherDave↑ comment by TheOtherDave · 2013-10-28T13:54:21.239Z · LW(p) · GW(p)
An idea that's false but "spectacularly well-written" should be downvoted to the extent of its destructiveness.
Well, to the extent of its net destructiveness... that is, the difference between the destructiveness of the idea as it manifests in the specific comment, and the destructiveness of downvoting it.
But with that caveat, sure, I expect that's true.
That said, the net destructiveness of most of the false ideas I see here is pretty low, so this isn't a rule that is often relevant to my voting behavior. Other considerations generally swamp it.
That said, I have to admit I did not read this comment all the way through. Were it not a response to me, which I make a habit of not voting on, I would have downvoted it for its incoherent wall-of-text nature.
↑ comment by TimS · 2012-01-26T18:50:37.828Z · LW(p) · GW(p)
I think the norm is pretty strong. I tend to downvote for stupid, not just wrong. But it will need to be explicitly reinforced.
Edit: The norm on the site is also different if you are participating in the conversation (try not to downvote at all) or simply observing.
Replies from: TheOtherDave↑ comment by TheOtherDave · 2012-01-26T19:06:03.312Z · LW(p) · GW(p)
To call "don't downvote if I'm in the conversation" a local norm might be overstating the case. I've heard several people assert this about their own behavior, and there are good reasons for it (and equally good reasons for not upvoting if I'm in the conversation), but my own position is more "distrust the impulse to vote on something I'm emotionally engaged with."
Replies from: TimS↑ comment by thomblake · 2012-01-26T19:28:28.991Z · LW(p) · GW(p)
To echo Alejandro1, downvotes should also go to comments which break the rules.
Replies from: bio_logical↑ comment by bio_logical · 2013-10-28T05:12:26.061Z · LW(p) · GW(p)
“I am free, no matter what rules surround me. If I find them tolerable, I tolerate them; if I find them too obnoxious, I break them. I am free because I know that I alone am morally responsible for everything I do.”
― Robert A. Heinlein
(There's no way to break the rule on posting too fast. That's one I'd break. Because yeah, we ought not to be able to come close to thinking as fast as our hands can type. What a shame that would be. ...Or can a well-filtered internet forum --which prides itself on being well-filtered-- have "too much information"?)
↑ comment by lessdazed · 2012-01-26T18:57:09.897Z · LW(p) · GW(p)
we're likely to be voting for biased reasons.
Downvoted for fallacy of gray, and because I'm feeling ornery today.
Replies from: thomblake↑ comment by thomblake · 2012-01-26T19:17:57.424Z · LW(p) · GW(p)
There's no fallacy of gray in there. Since votes count just as much in the thread, and our votes will be much more noisy, it would often be best to refrain from voting there. If anything, I might have expected to be accused of the opposite fallacy.
Replies from: lessdazed↑ comment by lessdazed · 2012-01-27T23:10:40.427Z · LW(p) · GW(p)
and our votes will be much more noisy
This qualification makes it not the fallacy of gray. If that qualifier was implicit from context above, I simply missed it.
Replies from: thomblake↑ comment by thomblake · 2012-01-29T19:05:26.418Z · LW(p) · GW(p)
I still don't see how that would relate to the fallacy of gray:
"The world isn't black and white. No one does pure good or pure bad. It's all gray. Therefore, no one is better than anyone else."
↑ comment by [deleted] · 2012-01-26T18:42:38.755Z · LW(p) · GW(p)
Perhaps a norm of using the anti-kibitzer for the thread?
Replies from: thomblake↑ comment by thomblake · 2012-01-26T18:47:11.252Z · LW(p) · GW(p)
I'm not sure that's a help for biased voting patterns (which would probably come from the views being expressed), but it might help prevent local mind-killing from spilling out onto the rest of the site.
But I don't think there's an easy mechanism for that, and comments will still show up in 'recent comments' under discussion.
Replies from: bio_logical↑ comment by bio_logical · 2013-10-28T05:22:37.900Z · LW(p) · GW(p)
If your forum has a lot of smart people, and they read the recommended readings, then the more people who participate in the forum, the smarter the forum will be. If the forum doesn't have a lot of stupid, belligerent rules that make participation difficult, then it will attract people who like to post. If those people aren't discouraged from posting, but are discouraged from posting stupid things, your forum will trend toward intelligence (law of large numbers, emergence with many brains) and away from being an echo chamber (law of small numbers, emergence with few brains).
I wouldn't stay up late at night worrying about how to get people to up-vote or down-vote things. They won't listen anyway, but even so, their posts might contain a significant amount of the wisdom found in "the Sequences," and wisdom from other places, too. They might even contain wisdom from the personal experiences of people on the blue and green teams, who can then contribute to the experiential wisdom of the Lesswrong crowd, even without being philosophically-aware participants, and even with their comments being disdained and down-voted.
Replies from: Desrtopa↑ comment by Desrtopa · 2013-10-28T06:28:16.090Z · LW(p) · GW(p)
If your forum has a lot of smart people, and they read the recommended readings, then the more people who participate in the forum, the smarter the forum will be.
If the forum can be said to have an intelligence which is equal to the sum of its parts, or even just some additive function of its parts, then yes. But this is not reliably the case; agents within a group can produce antagonistic effects on each others' output, leading to the group collectively being "dumber" than its individual members.
If those people aren't discouraged from posting, but are discouraged from posting stupid things, your forum will trend toward intelligence (law of large numbers, emergence with many brains) and away from being an echo chamber (law of small numbers, emergence with few brains).
This is true in much the same sense that it's true that you can effectively govern a country by encouraging the populace to contribute to social institutions and discouraging antisocial behavior. It might be true in a theoretical sense, but it's too vague to be meaningful as a prescription let alone useful, and a system which implements those goals perfectly may not even be possible.
comment by shokwave · 2012-01-26T16:41:47.763Z · LW(p) · GW(p)
Data point: during the Melbourne LessWrong meetups, discussion of politics proved a net negative (a large fraction of attendees would say a significant one).
For what it's worth, I read Politics is the Mind-Killer as almost the opposite of your interpretation: politics is a mind-killer, so why would you want to drag that awful mess into examples that could otherwise be clean? That is, avoid politics even at significant cost, and this includes in otherwise sterile examples.
To some extent I wonder why we'd need to avoid politically-charged examples if we were capable of actually talking about politics; I feel like if that were the case it would be Politics is the Comment-Thread-Exploder, and we'd only avoid it because a throwaway example would cause a huge, well-reasoned, rational but off-topic discussion.
Replies from: Alicorn, duckduckMOO, thomblake↑ comment by duckduckMOO · 2012-01-28T10:26:19.491Z · LW(p) · GW(p)
shouldn't that be the make-comment-threads-explode-er?
Replies from: shokwave↑ comment by shokwave · 2012-01-29T04:27:20.511Z · LW(p) · GW(p)
It's Politics is the Mind-Killer, i.e., politics kills the mind, politics is a killer of minds. So it should be Comment-Thread-Exploder, because politics would explode the thread, politics would be an exploder of threads. Good catch.
For reference, the grandparent originally used the phrase "Politics is the Make-Comment-Threads-Exploder".
Replies from: bio_logical↑ comment by bio_logical · 2013-10-28T05:37:50.707Z · LW(p) · GW(p)
I'm interested to know what rational people should have done in 1930s Germany to prevent politics from killing minds there. Is there a general consensus here on that issue?
I mean, if ever there were an issue worthy of rational prioritization, I would think that the construction of deathcamps and the herding of people into them should be prioritized. How might one rationally prioritize one's actions in that type of situation?
I honestly would like to know if there's a "non-mind-killing" approach possible in such a situation.
If the answer is not "political engagement" or "attempting to exert influence at the ballot box," and the answer is not "urge people you love to leave Germany," and the answer is not "buy black market firearms and join the resistance," and the answer is not "roll over on your back and bare your belly in submission," and the answer is not "mind-killing political discussion," then I'd like to know what a rational course of action is in that type of situation.
I ask this question for purely narrow, purely selfish reasons. I am now holding approximately equal numbers of federal reserve notes and one-ounce gold pieces and silver pieces, and I can't help but notice that every year I hold the notes, they are worth less and less, in relation to the gold and silver. Since 1970, I've lost money on the notes, and gained money on the gold and silver. Is there any rational principle at work here? Am I being stolen from, or am I simply not lucky? Is there any sort of system I should adopt?
What course of action is most rational? And how can I decide without engaging in mind-killing thought? I'm really trying to minimize the mind-killing thoughts, and other crime-think. The last thing I'd like to be is a filthy mind-killed (brain dead?) crime-thinker.
Also, for those not wanting to dirty themselves by replying to political threads (presumably because they're building strong AGI, which is a seriously better use of their time), how and why would ANY thread other than a recruitment thread for computer scientists and engineers be a good use of one's time?
Sorry for exploding this thread. Mea culpa!
↑ comment by thomblake · 2012-01-26T17:04:05.832Z · LW(p) · GW(p)
To be clear, I'm not advocating starting a thread on how the Greens will triumph over our hated enemies the Blues. Even when posting about a political issue, it would be best to steer clear of color politics. But if someone wants to make a post about a relevant political issue, there should be a place for that.
Possibly with a clear warning at the top that the post is about politics, and comments should remember to avoid color politics. Maybe a standard infobox or something.
comment by lessdazed · 2012-01-26T19:17:07.528Z · LW(p) · GW(p)
I downvoted those comments because they sucked. They were wrong in systematic ways indicative of a killed mind.
People who err on the side of shutting down discussion and debate are commonly known as authoritarian in nature. I don't think that's a good thing. I would expect lesswrong to err more on the side of preservation of information, and free speech absolutism, designed for ease of reading and information preservation.
Just look at that snippet. The first sentence is awkwardly worded such that I can't tell whether he's committing the bandwagon fallacy, the fallacy of appeal to nature and arguing by definition, or the bandwagon fallacy and the fundamental attribution error. The second sentence is a crude rhetorical appeal. The third sentence wraps the usual total failure to understand that policy debates should not appear one-sided within cringe-worthy phrases pretending the position advocated is nuanced and pragmatic.
I don't have a policy of downvoting political pieces. I have policy of downvoting crap, and downvoting political comments is just what tends to happen.
Replies from: bio_logical↑ comment by bio_logical · 2013-10-28T07:09:34.815Z · LW(p) · GW(p)
I downvoted those comments because they sucked.
Specifics? Or just ad-hominem attacks against a filthy blue?
They were wrong in systematic ways indicative of a killed mind.
More ad-hominems from a dazed fool.
People who err on the side of shutting down discussion and debate are commonly known as authoritarian in nature.
Just look at that snippet. The first sentence is awkwardly worded such that I can't tell whether he's committing the bandwagon fallacy, the fallacy of appeal to nature and arguing by definition, or the bandwagon fallacy and the fundamental attribution error.
Seems pretty clear to me that there's an honest point being made. Let's analyze your allegations of bias.
The bandwagon fallacy: You labeled the comment as a possible "bandwagon fallacy," or "appeal to the popularity of an idea, so as to avoid the merits of the idea." (Wikipedia defines this fallacy as a mixture of "red herring," which distracts from the point under discussion, and "genetic fallacy," in which the origins of an idea are argued against rather than the idea itself.) In order for this to be the case, he'd need to be trying to unite those on Lesswrong against those who possess an authoritarian philosophy simply because authoritarians are unpopular (not because something in the common authoritarian method is fallacious). However, that would mean the poster had no point in observing that "the tactic of silencing debate" is itself an authoritarian tactic: the fallacy of "the appeal to force." (Appeals to force are the worst form of argumentative fallacy: the drowning out of legitimate criticism with noise.) If there are those who dispute that appeals to force are authoritarian, or that the karma-based diminishing of voting rights, etc., amounts to "shutting down discussion" and is therefore indicative of authoritarian philosophy, or at least authoritarian behavior, then I don't know what can be said to offer rational evidence here.
the fallacy of appeal to nature and arguing by definition,
Authoritarian in nature should clearly be shortened to "authoritarian," period. Nowhere is the appeal to nature indicated by an honest reading of the above quote. The nature of the tactic "shutting down discussion and debate" is authoritarian. By definition, it is. Look up the word authoritarian, if you don't know what it means. At http://www.lp.org there's a link to a political quiz that indicates it's the philosophy that is antithetical to libertarianism. It would be proper to reason that authoritarians have an indefensible view of reality, and that that viewpoint relies on shutting down debate --the idea cluster of "authoritarianism" doesn't make claims about the legitimacy of the authority. The "idea cluster" of authoritarianism is rich with evidence that illegitimate authorities frequently silence debate when they don't have a rational leg to stand on. I don't think that this should be a stretch for most rational people to understand. "Silencing debate" is a part of the authoritarian "cluster of facts" in reality. http://lesswrong.com/lw/nz/arguing_by_definition/ --If I were to argue that "because authoritarians are evil, by definition, I should be allowed to say whatever I want" then I'd be arguing against authoritarians, but this time I'd be arguing by definition, assuming an evil for which I'd given no evidence except the definition. Here, the poster references "the evil" or "the wrong" as "the shutting down of debate" and associates it with a popularly-understood philosophy, known as authoritarianism. I'm assuming that most Lesswrongers have seen an example of someone's microphone being turned off in a political debate, or perhaps Sean Hannity or Bill O'Reilly, turning off someone's video feed just as they were destroying his arguments in a well-reasoned manner. (This is a commonly-understood "fact cluster" in reality. 
In fact, the entire North Korean authoritarian regime is founded on the consistent application of this practice: censorship, silencing of criticism, silencing of potential critics.)
Now, clearly, authoritarians without power are nowhere near as scary as authoritarians with power. But the very fact that they attempt similarly dishonest tactics to win adherents to their position should indicate that they have no respect for those they are attempting to convince.
So, the poster decries attempts to silence debate, and thereby decries attempts to silence appeals to objective measures of value. He also declares free speech absolutism to be of high value to himself (and himself only). In doing so, he appeals to consensus, because consensus confers political victory, and determines whether there's a place for his value structures --backed up with argument and evidence-- on this board.
Fundamental attribution error: are there "people who err on the side of shutting down discussion" so they don't have to address a favorable interpretation of someone else's legitimate position? Yeah, there are, and they can broadly be referred to as "authoritarians." Keep in mind that he didn't name anyone here. He just posited the existence of bellicose and belittling authoritarians like yourself.
Now, it's possible he's wrong, and every person legitimately interpreted his position as false, and it is false (or otherwise erroneous) in reality. In order for that to be true, it's possible all such judges of his interpretation are also making the "fundamental attribution error," which posits an unknowable knowledge of others' motivations based on an incorrect interpretation of their actions. Of course, you yourself could be making the fundamental attribution error, without comprehension of his truly greater understanding, since you, yourself, are actually an authoritarian. In order for any questions about fundamental attribution error to be resolved, a third party needs to measure each of the parties in a quantifiable way that eliminates motive for bias.
In the Wikipedia example, the speeding driver has good reason to be speeding, yet all the other drivers attribute the speeding to wanton disregard for human life (without knowing that the driver is taking a calculated risk to save a passenger who is bleeding out, trying to get them to a hospital). The fundamental attribution error can almost always be a legitimate interpretation of the events, based on the available information. The more information there is, the better the interpretation. Perhaps the poster has spent a lot of time interacting with authoritarian jerks, and observing their callous conformity. You fit the bill. Also, keep in mind that analysis of attribution is said to have created the whole field of sociology --meaning: sociologists try to attribute motives to observed actions.
If that weren't possible, then there would be only fundamental attribution error; there would be no sociology. Collective actions can often be correctly analyzed from behaviorism alone. In fact, when dealing with low-level thought (such as your own), behaviorism is a highly reliable indicator of proper attribution. John Douglas has caught hundreds of serial killers by properly attributing their behavior to predictable, low-level motives.
From wikipedia: "bias arises when the inference drawn is incorrect, e.g., dispositional inference when the actual cause is situational." I suspect I'm correct in my attribution. Why not offer some evidence that I'm not correct? And what of the defense from sociopaths that their actions are always legitimate, and that others are simply trying to "tar and feather" them with the brush of sociopathy? Wouldn't an authoritarian sociopath always respond as you have? (Generally, yes, they do.) I'm not saying you're a sociopath. There might be other reasons why you're a whimpering sycophant with low moral character. Or, you might genuinely believe that I'm mistaken in my attribution.
In order to test this out, let's imagine that there's a situation somewhere where a bunch of conformists down-vote an individualist because they are authoritarians who disagree with individual freedom, and the social norms of their group disfavor individualism. Such as in the case of "the unlikeliest cult," as reported on this site.
The third sentence wraps the usual total failure to understand that policy debates should not appear one-sided within cringe-worthy phrases pretending the position advocated is nuanced and pragmatic.
So why shouldn't a policy that's "mind-killing" be attacked? There's no valid reason why anyone should argue in favor of a position they disbelieve, simply to feign the appearance of "even-handedness." If this were legitimate, then Holocaust survivors would have to qualify their denunciations of Nazism, perhaps by saying "Well, even if the Jews sometimes deserve to get killed, genocide was taking it a bit too far..." Absurd! Illegitimate arguments and policies should be eliminated, if they confer no benefit. If there are defenders of such bad ideas, then let them "man up" and argue their position with valid arguments, as the poster did.
preservation of information, and free speech absolutism, designed for ease of reading and information preservation.
OK, so the poster is redundant, at worst. Still, the points he made were valid, and you didn't attack the merits of a single one of them. You called them names, and derided the arguer's style, which unfortunately appears hurried.
Attack one "cringe-worthy phrase" or attack the position he made. Specifically. Unless you're slinging nothing but pissy ad hominem attacks against a righteous libertarian. If that's the case, feel free to help yourself to another shot so you can be less dazed, and we'll talk about it when you've slept off your drunk.
comment by Alicorn · 2012-01-26T18:47:13.542Z · LW(p) · GW(p)
I think it would be interesting if we had a politics thread where we held off on proposing solutions and spoke only in facts/questions. I'm not sure it's sustainable.
Replies from: APMason↑ comment by APMason · 2012-01-26T18:59:08.549Z · LW(p) · GW(p)
"Here's a question that I'm sure you all think you know the answer to but which you're not allowed to answer," is probably a good way of making some heads explode.
Replies from: Normal_Anomaly↑ comment by Normal_Anomaly · 2012-01-27T00:52:27.097Z · LW(p) · GW(p)
I think Alicorn's idea is that one could answer questions with lists of what you consider the relevant facts, possibly stating a conclusion at the end, possibly not.
comment by fubarobfusco · 2012-01-26T22:14:43.232Z · LW(p) · GW(p)
I'm curious what you mean by "well-meaning troll". The way I use the word, a "troll" is someone who posts for the enjoyment of disrupting discussions, pissing people off, or wasting people's time and making them look foolish. As such, "well-meaning troll" is an oxymoron.
Replies from: Nornagest, thomblake, lessdazed↑ comment by Nornagest · 2012-01-27T00:05:28.538Z · LW(p) · GW(p)
This probably isn't what the OP means by it, but I've encountered a number of trolls who justified, or rationalized, their trolling by claiming to act as a counter to groupthink, or as predators in the ecosystem of ideas, or as some kind of Socratic gadfly. It's up to you how much you want to trust those claims, but they are arguably altruistic and do seem consistent with the definition you offer.
↑ comment by thomblake · 2012-01-27T16:02:51.193Z · LW(p) · GW(p)
I'm curious what you mean by "well-meaning troll".
Basically, the posts were exactly what a troll would post, but I get the impression they were not posted with trollish intent. Since consequences are what matters, 'troll' is still a good description.
If it looks like a duck and quacks like a duck, it is useful to refer to "that duck" even if you think it's not a duck but actually a cleverly-disguised kitten with a speech impediment.
↑ comment by lessdazed · 2012-01-26T22:35:15.158Z · LW(p) · GW(p)
As such, "well-meaning troll" is an oxymoron.
Would the sentence "As such, 'well-meaning troll' is an oxymoron by definition" have a different meaning than what you wrote?
Replies from: fubarobfusco↑ comment by fubarobfusco · 2012-01-27T00:00:15.287Z · LW(p) · GW(p)
Yes, it would. My intent was to present my definition and ask what one the poster was using, since it clearly differed.
comment by jimrandomh · 2012-01-26T21:33:12.043Z · LW(p) · GW(p)
The no-politics norm isn't just on LW; it's widespread. But these norms are a defensive adaptation, and I don't think they can be dropped safely.
Instead, I think we should have a designated politics day, on which all no-discussing-politics taboos are lifted in all contexts, and people who normally avoid politics are encouraged to post position papers. I think this would produce most of the benefits of talking about politics, while limiting the damage.
comment by Alejandro1 · 2012-01-26T20:39:26.545Z · LW(p) · GW(p)
Several people have agreed with the idea of a politics thread and jumped to discuss implementation details, while others have expressed opposition, with both stances receiving upvotes. I think we need a poll. Response comments to this one include for, against, and karma balance.
Replies from: Alejandro1, Alejandro1, Alejandro1↑ comment by Alejandro1 · 2012-01-26T20:41:22.449Z · LW(p) · GW(p)
AGAINST
Vote this up if you disagree with creating a politics thread.
↑ comment by Alejandro1 · 2012-01-26T20:40:32.267Z · LW(p) · GW(p)
FOR
Vote this up if you agree with creating a politics thread.
↑ comment by Alejandro1 · 2012-01-26T20:42:40.741Z · LW(p) · GW(p)
KARMA BALANCE
Vote this down if you have voted up either the FOR or the AGAINST comments.
comment by Rob Bensinger (RobbBB) · 2013-05-26T19:18:47.093Z · LW(p) · GW(p)
'Politics' is a massive category, and has a disproportionate share of the important issues (relative to, say, randomly selected academic topics). In the long run (assuming there will be a long run), reinforcing the intellectual norm that politics is low-status and impossible to productively discuss is surely a bad thing if we think that it's at all important to get political questions right. It will function to make politics increasingly intellectually impoverished and divisive, as we keep seeing more and more of the calmest and sanest thinkers avert their eyes from politics and from political theory.
Because politics is so dangerous to talk about, especially high-level rationalists should be encouraged to practice their craft on it sometimes, to improve the state of the discourse, contribute important new ideas to it, and further hone their own knowledge and anti-mindkill skills.
That said, I agree that at this moment the risks of a politics open thread on LW probably outweigh the benefits. I would suggest instead an off-site politics discussion forum maintained by passionately dispassionate LWers, intended for discussants and posts with LW-like quality levels and topics. (If there already is such a thing, do let me know!) Since it would be off-site, there would be less risk of bleed-over, particularly since we'd have flexibility to implement extreme measures like:
- there are no public usernames, and users are discouraged from giving identifying information in their posts. So posters will not be easily identified with specific LWers, and the forum itself won't tend to coalesce around clearly defined personalities, making tribes relatively amorphous and individual posts difficult to ad hominem.
- there are private user accounts, i.e., the forum won't be open to unregistered users. This makes it possible to implement a karma system, and to strongly restrict the posting privileges of new visitors until they've repeatedly proved their lack of mindkill.
- to make it possible (though not too easy) to prove that you're the same person as a previous poster, we can introduce a special tag that, e.g., makes you able to type #4F33301 in red iff you are the user who made post 4F33301. So in special circumstances identity can be maintained without risk of impersonation. You can also refer back to 4F33301 in black if you want to clarify which post you're responding to.
- but instead of serial numbers like 4F33301, let's use random dictionary words like 'vial' or 'fittingly' to mark individual posts, because that would be way cuter and easier to remember.
- in fact, optimize for cuteness, quirkiness, and friendliness in general, as much as is possible while maintaining anonymity. The friendlier and funnier the site looks and feels, the more light-hearted and collaborative the posts will be. Professionalism and silly benign emoticons are totally compatible.
- to make the karma system more useful, we can introduce community guidelines to the effect that you should upvote for good methods, more so than for Correct Beliefs. (To encourage this we can use framing like 'Useful? Not Useful?' as opposed to 'Vote Up? Vote Down?' or 'Like? Dislike?'.)
- karma will determine how visible your post is, but the actual karma number of the post itself won't be visible to anyone. So you'll get a general sense that you're doing a good job if you see your posts rising to prominence (or, perhaps, a private aggregate per-user karma number increase), but there won't be as much of a temptation to fixate on points as there is on LW.
- the politics forum will rely largely on top-down moderation, probably even more so than on karma. Moreover, getting your post readily visible on the site will be a mark of privilege and of exceptional poster quality, not the norm. Mindkill posts will be deleted without mercy or hesitation, and borderline/mediocre posts will be more readily hidden on the site than they are on LW. (Possibly all posts will require moderator approval, or moderators will have an easy shortcut to hiding the posts, e.g., giving massive downvotes.)
- users will be strongly encouraged to routinely report all site abuses, and will be strongly and swiftly punished for feeding trolls even in cursory ways. (Part of becoming a user with full site privileges might even include a trial run of proving you will actively report problem posts without replying to them.)
- users who don't use the karma system in a way that overall improves the site will have their voting ability taken away and all their votes annulled. So the ability to downvote or upvote itself might become a privilege rather than a base-level expectation for users. If the user base is amazing enough, high-level poster privileges might smoothly transition into moderator-style privileges.
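The post-tag identity proof described above (being able to type #4F33301 iff you made post 4F33301) could be implemented in several ways; here is one minimal sketch, purely illustrative, assuming the server holds a per-user secret and all names are hypothetical:

```python
import hmac
import hashlib

# Hypothetical sketch: the server stores a secret per user account. A user
# proves authorship of an earlier post by presenting a token that only the
# original author's secret could have produced for that post id.

def authorship_token(user_secret: bytes, post_id: str) -> str:
    """Token the author can later present to claim post `post_id`."""
    return hmac.new(user_secret, post_id.encode(), hashlib.sha256).hexdigest()[:12]

def verify_claim(user_secret: bytes, post_id: str, token: str) -> bool:
    """Server-side check that the claimant's secret matches the post's token."""
    expected = authorship_token(user_secret, post_id)
    return hmac.compare_digest(expected, token)

alice = b"alice-secret"
tok = authorship_token(alice, "4F33301")
print(verify_claim(alice, "4F33301", tok))               # the real author
print(verify_claim(b"mallory-secret", "4F33301", tok))   # an impersonator
```

Because the token is an HMAC rather than a public serial number, an impersonator who only sees the post (and its id) cannot forge the claim without the author's secret.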
What do you think?
Replies from: BerryPick6, elharo↑ comment by BerryPick6 · 2013-05-31T22:09:29.427Z · LW(p) · GW(p)
I think yes. Are the chances of this happening >20%?
Replies from: RobbBB↑ comment by Rob Bensinger (RobbBB) · 2013-06-01T04:40:01.728Z · LW(p) · GW(p)
Probably not in the near future, but I need more feedback to have confident estimates.
It sounds like there's some interest in the idea, so I think I'll start a new Discussion page to drum up more ideas and concerns.
↑ comment by elharo · 2013-05-26T20:51:15.551Z · LW(p) · GW(p)
Fascinating idea, especially hiding usernames and identity. I do think it's important to track and expose identity within a thread, but that may be all. Then again maybe not even that.
Replies from: RobbBB↑ comment by Rob Bensinger (RobbBB) · 2013-05-26T21:41:14.025Z · LW(p) · GW(p)
We could assign random usernames to each person in each thread, which are held constant within but not between threads. I'm not sure how useful that would be, or how much it would erode the general anonymity benefits. It depends on how the forum is structured; if individual threads tend to be very long then preserving identity within threads will carry similar risks to nonymity in general.
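One way to get "held constant within but not between threads" without storing a per-thread assignment table is to derive the pseudonym deterministically from a server-side key. A minimal sketch, with all names (including the word list) hypothetical:

```python
import hmac
import hashlib

# Hypothetical word list in the spirit of the "cute random dictionary words"
# suggestion upthread.
WORDS = ["vial", "fittingly", "lantern", "quince", "burrow", "tideline"]

def thread_pseudonym(server_key: bytes, user_id: str, thread_id: str) -> str:
    """Deterministic pseudonym: constant for a given user within a thread,
    but unlinkable across threads without the server key."""
    digest = hmac.new(server_key, f"{user_id}:{thread_id}".encode(),
                      hashlib.sha256).digest()
    return WORDS[digest[0] % len(WORDS)] + "-" + digest.hex()[:4]

key = b"forum-secret"
# Same user, same thread: always the same name.
assert thread_pseudonym(key, "alice", "thread-1") == \
       thread_pseudonym(key, "alice", "thread-1")
```

Since the mapping is keyed, readers cannot link "alice" across threads, but the server can still resolve abuse reports back to the real account.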
comment by irrationalist · 2012-01-27T05:31:13.834Z · LW(p) · GW(p)
I only have two kinds of political discussions now:
- Pure trolling for emotional catharsis
- Finding a way to evade the political part of the issue (in other words, if you're concerned about making medical care cheaper, can I think of a way to help you achieve your goal that doesn't require anyone to vote a particular way?)
The second is, I sincerely believe, the best way for us non-politicians to solve problems. The first is something I just kind of like doing. It's pure hate and I don't pretend it's anything else.
comment by Grognor · 2012-01-27T00:01:03.482Z · LW(p) · GW(p)
In general, I am extremely suspicious of claims that things are just fine the way they are. But this is one of those cases where I'm in that color.
A recent series of posts by a well-meaning troll
Doesn't this summarize lots of good reasons to keep imposing sharp costs on politics talk at Less Wrong? Looking at that guy's comment history makes me want to be even more aggressive at keeping it elsewhere.
Replies from: Solvent↑ comment by Solvent · 2012-01-27T00:47:58.356Z · LW(p) · GW(p)
Yeah, but the guy was obviously an idiot, and everyone could have easily just ignored him, and he got downvoted into oblivion.
If the experienced members of the site generally don't post about politics, then the people who post about politics are less likely to be sensible, coherent, and/or intelligent.
Replies from: JGWeissman↑ comment by JGWeissman · 2012-01-27T01:07:01.307Z · LW(p) · GW(p)
Yeah, but the guy was obviously an idiot, and everyone could have easily just ignored him, and he got downvoted into oblivion.
In my experience, when everyone could ignore an obvious idiot/troll who gets downvoted into oblivion, some people don't actually ignore him, and low-quality discussion ensues.
Unless a group is highly coordinated, it doesn't seem useful to ask what the group could do, what actions are available to the group. The group is not a coherent agent that considers its available actions and chooses one. You could usefully ask what a group member could do, but then the actions of the other group members are a fact about the environment; there is no "could" about them.
Replies from: Vladimir_Nesov↑ comment by Vladimir_Nesov · 2012-01-28T01:18:36.822Z · LW(p) · GW(p)
I follow the policy of downvoting replies to worthless comments, regardless of replies' quality. More people following this policy would quench the worthless discussions of that kind.
comment by Alejandro1 · 2012-01-26T19:11:58.914Z · LW(p) · GW(p)
I had been thinking about making the same suggestion. Some pros of a politics thread include:
Having a place to take the long subthreads on politically-charged topics that inevitably arise from time to time through topic drift on other posts, making LW a more pleasant experience for politics-allergic readers.
A place to test whether our rationalist skills are up to the task of discussing mind-killing topics in a non-mind-killing way.
Some would enjoy the possibility of discussing political topics in a "rational" atmosphere (truth-searching, not us-vs-them, aware of biases and fallibility, etc.) that other politics forums do not have.
Cons:
Mind-killing of a certain degree might (some may say, would) nevertheless occur. This is not so tragic if the thread remains a self-contained experiment and the rest of LW proceeds as usual. The danger is that there might be spillover. If you were in a heated argument in the politics thread, and then saw an unrelated post on AI or game theory by your rival, would you be able to avoid the halo effect and not judge it negatively as the post of a commie/libertardian/racist/PC thought cop/whatever? Even worse, if the arguments in the politics thread happened to usually have the same participants taking the same sides in opposite political clusters, we would be in danger of a Robbers Cave situation.
Some might be concerned that having extreme/contrarian political views openly expressed may bring bad reputation to LW, but without these views represented the discussion would be bland and uninteresting. Personally I don't see why things would be necessarily worse with a politics thread than with already existing threads like this one. One could even argue that it would be good for appearances to have all the "extremism" concentrated in one or a few politics thread/s, and not scattered all over the site.
I think the experiment is worth trying, with rules like Tim's clearly stated in the post, and ruthlessly enforced by downvoting offenders even if they share your "side".
comment by JenniferRM · 2012-01-27T00:05:00.405Z · LW(p) · GW(p)
I would rather see politics at LW done in a way that playfully respects the complications that are obvious, and ends up doing something surprising and hopefully awesome. Let me see if I can develop this a bit...
Imagine starting with a pool of people who think their brains are turbo-charged and who "enjoy the possibility of discussing political topics in a 'rational' atmosphere (truth-searching, not us-vs-them, aware of biases and fallibility, etc.)". If they're really actually rational, you'd kind of expect them to be able to do things like Aumann-update with people with similar capacities and inclinations but different conclusions on a given hot-button political subject. So the trick would be to find two people whose starting points on a political topic were radically different and have them be each other's chavruta, and discuss the subject with each other until they either agree on the relevant and substantive facts or agree to disagree.
Now, maybe this is just me, but it seems to me that having chavruta discussions in a public forum would introduce all kinds of signaling complications and potentially cause negative externalities if other people run into it without adequate background. To avoid both problems, it seems like it would be better to do this via the phone, or IM, or email. IM and email would be easier to log, but voice would probably be helpful for issues of tone and hashing out a common vocabulary fast.
I would expect this to be quite educational. Also I think it would be neat to read a joint write-up of how it went afterwards, where the reader finds out about the initial dramatically different opinions, and hears about the sort of higher-level surprises that came up in the process itself: how long it took, what was helpful, what was learned in general.
I'd personally prefer not to hear the details of the final conclusion other than the single yes/no bit about whether agreement was reached or not, because I would expect it would re-introduce signaling issues into the discussion itself, make future updates harder, and sort of implicitly suggest to the community and the wider world that these two people's conclusion is "endorsed by all reasonable people in general". (Which suggests a second order thing to try: have two pairs of people update on the same subject and then compare each pair's agreements...)
It would be pretty awesome if LW had a thread every so often where people broadcast interest in finding an "Aumann chavruta" for specific topics, including political topics. This might help with people's specific "dialogue cravings". It might eventually start clogging up the site itself (the way meetup posts clogged things up for a while) but that seems like it would be a good problem to have to solve :-)
Replies from: Vaniver↑ comment by Vaniver · 2012-01-27T02:45:01.275Z · LW(p) · GW(p)
The reason I am not optimistic about this sort of thing is that many people know someone clever who holds radically different political opinions, and they often talk politics together quite a bit. So those sorts of Aumann updates often happen, but they often end at a stance like "we both understand each other's opinions of the facts, but have different value systems, and so disagree," or something like "we both assign the same likelihood ratio to evidence, but have very different priors."
Replies from: JenniferRM↑ comment by JenniferRM · 2012-01-27T03:22:25.438Z · LW(p) · GW(p)
I guess my thought was that LWers are likely to think that it's possible to implement values incoherently (i.e., correctably), and so might have much more to say (and learn) than your average "clever person". Scope neglect, cognitive dissonance, etc., etc.
My guess would be that really solid rationalists might turn out to disagree with each other over really deep values, like one being primarily selfish and sadistic while another has lots of empathy and each can see that each has built a personal narrative around such tendencies, but I wouldn't expect them to disagree, for example, over whether someone was really experiencing pain or not. I wouldn't expect them to get bogged down in a hairsplitting semantic claim about whether a particular physical entity "counts as a person" for the sake of a given moral code.
And "we just have different priors" usually actually means "that would take too long to explain" from what I can tell. Pretty much all of us started out as babies, and most of us have more or less the same sensory apparatus and went through Piaget's stages and so on and so forth. Taking that common starting point and "all of life" as the evidence, it seems likely that differences in opinion could take days or weeks or months of discussion to resolve, rather than 10 minutes of rhetorical hand waving. I saw an evangelical creationist argued into direct admission that creationism is formally irrational once, but it took the rationalist about 15 hours over the course of several days to do (and that topic is basically a slam dunk). I wouldn't expect issues that are legitimately fuzzy and emotionally fraught to be dramatically easier than that was.
...spelling this out, it seems likely to me that being someone's Aumann chavruta could involve substantially more intellectual intimacy than most people are up for. Perhaps it would be good to have some kind of formal non-disclosure contract or something like that first, as with a therapist, confessor, or lawyer?
Replies from: torekp↑ comment by torekp · 2012-02-05T03:43:51.914Z · LW(p) · GW(p)
Taking that common starting point and "all of life" as the evidence, it seems likely that differences in opinion could take days or weeks or months of discussion to resolve, rather than 10 minutes of rhetorical hand waving.
All of our lives, or even a month of it, probably imparted to us far more evidence than we could explain to each other in a month of discussion. The trouble is that much of the learning got lodged in memory regions that are practically inaccessible to the verbal parts of our brains. I can't define Xs and you can't define Ys, but we know them when we see them.
"We just have different priors" is probably not the best way to describe these cognitive differences - I agree with you there. But we could still be at a loss to verbally reason our way through them.
Replies from: JenniferRM↑ comment by JenniferRM · 2012-02-06T20:10:27.934Z · LW(p) · GW(p)
I don't think people have any sort of capacity to fully describe their entire audio/video experience in full resolution, but if you think about the real barriers to more limited communication I predict that you'll be able to imagine plausible attempts to circumvent these barriers for the specific purpose of developing a model of a particular real world domain in common with someone with enough precision to derive similar strategic conclusions in limited domains.
I can't define Xs and you can't define Ys, but we know them when we see them.
Maybe I'm misunderstanding you, but my impression is that this is what extensive definitions and rationalist taboo are for: the first to inspire words and the second to trim away confusing connotations that already adhere to the words people have started to use. The procedure for handling the apparently incommensurable "know it when I see it" concepts of each party is thus to coin new words in private for the sake of the conversation, master the common vocabulary, and then communicate while using these new terms and see if the reasonable predictions of the novel common understanding square with observable reality.
A lot of times I expect that each person will turn out to have been somewhat confused, perhaps by committing a kind of fallacy of equivocation due to lumping genuinely distinct things under the same "know it when I see it" concept, which (in the course of the conversation) could be converted to a single word and explored thoroughly enough to detect the confusion, perhaps suggesting the need for more refined sub-concepts that "cut reality at the joints".
When I think of having a conversation with a skilled rationalist, I expect them to be able to deploy these sorts of skills on the most important-seeming source of disagreement, rather than having to fall back to "agreeing to disagree". They might still do so if the estimated cost of the time in conversation is lower than the expected benefit of agreement, but they wouldn't be forced to it out of raw incapacity. That is, it wouldn't be a matter of incapacity, but a matter of a pragmatically reasonable lack of interest. In some sense, one or both of us would be too materially, intellectually, or relationally impoverished to be able to afford thinking clearly together on that subject.
However, notice how far the proposal has come from "talking about politics in a web forum". It starts to appear as though it would be a feat of communication for two relatively richly endowed people, in private, to rationally update with each other on a single conceptually tricky and politically contentious point. If that conversational accomplishment seems difficult for many people here, does it seem easier or more likely to work for many people at different levels of skill, to individually spend fewer hours, in public, writing for a wide and heterogeneously knowledgeable audience, who can provide no meaningful feedback, on that same conceptually tricky and politically contentious point?
comment by Cthulhoo · 2012-01-26T19:34:22.517Z · LW(p) · GW(p)
It might be that my current opinion is skewed by the present political situation in my country (Italy). I don't have enough knowledge of foreign internal politics to judge whether the Italian situation is typical; brief conversations with foreigners on the subject suggest it's worse than in the average developed country, but not by that much. To the point.
There's one main problem with talking about politics: it doesn't work the way it should, and there's little way to collect enough information to produce a good model of the reality. In practice: politicians, on average, don't try to do what they believe to be in the best interest of the country. They try to maintain power and, if possible, gain more advantages* from it. They have to pay debts to whoever supported them: the voter base, the powerful industrial groups, the trade unions, the banks, and very likely the mafia. Sometimes these webs of interests are in plain sight, but very often they're not. Discussing politics is therefore very often frustrating at best, since you have to work with partial (and very often wrong) information.
*E.g., last year the government didn't fall mostly because more than 50% of the congressmen wanted to reach the pension threshold, and therefore needed to sit in the parliament for another six months ( Link - in Italian, unfortunately )
Replies from: thomblake↑ comment by thomblake · 2012-01-26T19:59:46.546Z · LW(p) · GW(p)
Certainly skewed by looking at Italy, but most of what you're talking about would be familiar to cynics anywhere.
But "politics is the mind-killer" is more about the tribal affiliation aspects of associating oneself with a party.
Replies from: Cthulhoo↑ comment by Cthulhoo · 2012-01-26T22:52:57.643Z · LW(p) · GW(p)
But "politics is the mind-killer" is more about the tribal affiliation aspects of associating oneself with a party.
This is, in my opinion, a consequence of what I wrote above. You can't properly evaluate politicians and parties and you can't reliably predict their behavior, so you end up reacting emotionally and attaching yourself to one of the "teams". If you could reliably predict what a party will do if it wins the election, then you could evaluate and discuss the precise program without being "mind-killed". Since you generally cannot, you end up cheering for one of the teams that you somewhat feel is more or less aligned with your position.
comment by John_Maxwell (John_Maxwell_IV) · 2012-01-26T22:46:39.049Z · LW(p) · GW(p)
I don't see what the win from more discussion of politics is. Your vote doesn't count. Get over it. We have higher return things to attend to.
Replies from: J_Taylor
comment by Peter Wildeford (peter_hurford) · 2012-01-27T03:12:23.470Z · LW(p) · GW(p)
For what it's worth, I think that the realm of politics could be a great place for discussions that need to be driven by truth-seeking rather than tribal loyalties; it gives good opportunities to watch for and guard against bias, and provides an opportunity to carefully calibrate confidence. I can see all the reasons why people (including me) couldn't handle it because we're not ideally rational, but if there were any discussion that gave Less Wrong the ability to walk the talk and raise the sanity waterline, that discussion would be politics.
comment by drethelin · 2012-01-26T21:31:23.339Z · LW(p) · GW(p)
I would like there to be a politics open thread. I don't know how it would be for the health of the forum in general, but I think I would enjoy reading it.
Replies from: Nectanebo↑ comment by Nectanebo · 2012-01-27T19:48:15.666Z · LW(p) · GW(p)
Because of the issues with the health of the forum, I am not sure if I support a politics open thread, but I definitely think that I would enjoy reading it, and that latter thought only materialized into words after reading your comment. Thanks.
comment by Craig_Heldreth · 2012-01-27T20:43:33.517Z · LW(p) · GW(p)
This is not likely to be implemented easily here. When I looked at the poll it was around 20 for and 20 against having a politics open thread.
What could easily be done is to start a subreddit, lesswrongpoliticsbeta, and if there did happen to be a great ongoing discussion on some topic there, put a pointer to it in the discussion section here.