LINK: In favor of niceness, community, and civilisation

post by Solvent · 2014-02-24T04:13:10.620Z · LW · GW · Legacy · 137 comments

Scott, known on LessWrong as Yvain, recently wrote a post complaining about an inaccurate rape statistic.

Arthur Chu, who is notable for recently winning money on Jeopardy!, argued against Scott's stance that we should be honest in arguments, in a comment thread on Jeff Kaufman's Facebook profile, which can be read here.

Scott just responded here, with a number of points relevant to the topic of rationalist communities.

I am interested in what LW thinks of this.

Obviously, at some point being polite in our arguments is silly. I'd be interested in people's opinions of how dire the real world consequences have to be before it's worthwhile debating dishonestly.

Comments sorted by top scores.

comment by Viliam_Bur · 2014-02-25T14:33:14.188Z · LW(p) · GW(p)

A part which seems missing in the discourse -- probably because of politeness or strategy -- is that there are more than two sides, and that people on your side don't necessarily share all your values. When someone tells you: "Harry, look how rational I am; now do the rational thing and follow me in my quest to maximize my utility function!" it may be appropriate to respond: "Professor Quirrell, I have no doubts about your superb rationalist skills, but I'd rather use my own strategy to maximize my utility function." Your partner doesn't have to be literally Voldemort; mere corrupted hardware will do the job.

On the battlefield, some people share the common goal, and some people just enjoy fighting. Attacking the enemy makes both of them happy, but not for the same reasons. The latter will always advocate violence as the best strategy for reaching the goal. (The same thing happens on the other side, too.)

And an important part of the civilizing process Scott described is recognizing that both your side and the other side are at constant risk of being hijacked by people who derive their benefits from fighting itself, and who may actually be more similar to their counterparts than they are to you. And that miraculous behavior which shouldn't happen and seems like a losing strategy is actually the civilized people from both sides half-knowingly forging a fragile treaty with each other against their militant allies and leaders.

Which feels like treason... because it is! It is recognizing that there is some important value other than the official axis of the conflict, and that this value should be preserved, sometimes even at the cost of some losses on the battlefield! -- This is what it means to have more than one value in your utility function. If you are not willing to sacrifice even epsilon of one value for a huge amount of the other value, then the other value simply does not exist in your utility function.

So, officially there is a battle between X and Y, and secretly there is a battle between X1 and X2 (and Y1 and Y2 on the other side). And people from X1 and X2 keep rationalizing about why their approach is the best strategy for the true victory of X against Y (and vice versa on the other side).

Civilization is a tacit conspiracy of decent people against psychopaths and otherwise defective or corrupted people. Whenever we try to make it explicit, it's too easy for someone to come and start yelling that X is the side of all decent people, and Y is the side of psychopaths, and this is why we from X have to fight dirty, silence the heretics in our own ranks, and then crush the opponents. So we stay quiet amidst the yelling, and then we ignore it and secretly do the right thing, hoping that the part of the conspiracy on the other side is still alive and ready to reciprocate. Sometimes it works, sometimes it doesn't; but on average we seem to be winning. And I wouldn't trade it for a "rationalist" pat on the shoulder from someone I don't trust.

Replies from: Benito, shokwave
comment by Ben Pace (Benito) · 2014-02-28T11:11:45.349Z · LW(p) · GW(p)

"Professor Quirrell, I have no doubts about your superb rationalist skills, but I'd rather use my own strategy to maximize my utility function." Your partner doesn't have to be literally Voldermort

cough cough

Replies from: Viliam_Bur
comment by Viliam_Bur · 2014-03-01T15:04:51.612Z · LW(p) · GW(p)

'kay, fixed

Replies from: Benito
comment by Ben Pace (Benito) · 2014-03-01T18:44:56.920Z · LW(p) · GW(p)

Tbh, I just found it funny that you said that when your example actually was Voldemort.

comment by shokwave · 2014-02-26T15:56:44.631Z · LW(p) · GW(p)

So, officially there is a battle between X and Y, and secretly there is a battle between X1 and X2 (and Y1 and Y2 on the other side). And people from X1 and X2 keep rationalizing about why their approach is the best strategy for the true victory of X against Y (and vice versa on the other side).

This part doesn't make clear enough the observation that X2 and Y2 are cooperating, across enemy lines, to weaken X1 and Y1. 2 being politeness and community, and 1 being psychopathy and violence.

Replies from: Viliam_Bur
comment by Viliam_Bur · 2014-02-26T17:21:35.837Z · LW(p) · GW(p)

Disclaimer: I mentioned psychopaths and violent people, but that's in the context of an actual war and actual killing. If we only speak about "fighting" metaphorically, we need to appropriately redefine what it means to be "violent". In the context of verbal internet wars, the analogy of psychopaths would be trolls, and the analogy of people who enjoy violence would be people who enjoy winning debates.

For the internet version of Genghis Khan, the greatest joy is to defeat his enemies in a public discourse, make them unpopular, destroy their websites, and take over their followers. The important thing is to win the popularity contest, having a better model of reality is only incidental. The thing to protect is the pleasure of winning, but other people's applause lights can be used strategically.

A person from X1 has friends only in X1 and X2. A person from X2 has friends in X1, X2, and Y2. Assuming that having more friends is an advantage, the mutual politeness creates an advantage for people from X2 and Y2, and this is why they are doing it. I'd call that cooperation. In their case, cooperation is both a strategy and a goal.

In a way, people from X1 and Y1 also cooperate, but this cooperation is purely instrumental, as they hate each other. However, any act that successfully increases the mutual hate between groups X and Y helps them both, because it reduces their relative disadvantage against the 2s.

comment by Eugine_Nier · 2014-02-25T04:32:52.839Z · LW(p) · GW(p)

The problem with Yvain's reply is that he omits the main reason why lying is a bad idea. Yvain compares lying to violence. I don't think this is a good comparison. It's acceptable to respond to violence with violence. It's not a good idea to respond to lies with lies.

Eliezer touched on this issue in his post here, where he pointed out that one problem with lying to support a cause is that you'd better be absolutely sure that all your beliefs about the cause, and what to do for it, are in fact correct. However, the problem is even worse: there is a vicious cycle here, since a cause that frequently lies is much more likely to acquire incorrect beliefs.

Think about it this way: suppose you believe that your cause justifies lying, so you lie about it. Your lies attract people to your cause who believe those lies. They in turn make up further lies (that they think are justified based on the lies they believe to be true). And so on, until your cause's belief system is full of falsehoods and anti-epistemology. Your cause may ultimately "win" in the sense that its followers acquire power, but by that point said followers may no longer care about your original goal. Even if they do, they're likely to have so many false beliefs that what they do to accomplish it is likely to be counter-productive and probably to have other unpleasant side effects.

Note that the above argument applies to lying but not to violence. Thus in some sense lying for your cause is in fact worse than committing violence for it.

Replies from: Eugine_Nier, Eugine_Nier
comment by Eugine_Nier · 2014-02-25T04:50:21.570Z · LW(p) · GW(p)

A pithy way of summarizing the above comment:

If someone tells you his cause is so important that lying for it is justified, assume he's lying.

Replies from: Viliam_Bur
comment by Viliam_Bur · 2014-02-25T13:02:32.121Z · LW(p) · GW(p)

If someone tells you his cause is so important that lying for it is justified, assume he's lying.

This wins my personal "rationality quote of the decade" award.

comment by Eugine_Nier · 2014-02-26T06:00:36.018Z · LW(p) · GW(p)

One implication of this is that we can develop heuristics for how bad different lies are. The basic idea is that lies that are likely to spread (especially if their effectiveness depends on them spreading) are particularly bad -- especially if they're likely to spread within your movement (note that lies used to increase support for your movement count here, since they'll bring in new recruits who believe them).

Note that using these heuristics, we can see that the classic example used to justify lying -- "There are no Jews in my basement" -- is in fact much less bad than Yvain's example: "A man is more likely to be struck by lightning than be falsely accused of rape."

Replies from: roystgnr
comment by roystgnr · 2014-02-26T20:21:10.165Z · LW(p) · GW(p)

Would you elaborate?

First, I'm not sure what it means to say that "There are no Jews in my basement" is unlikely to spread. In a sense it's a "pre-spread" lie, since the lack of Gestapo breaking down your doors implies that they are all already fairly confident of the falsehood; you're just lying to decrease the probability that they'll stop believing it.

Second, to add my own hypothetical: I can see an isomorphism (in terms of how the lie spreads) between "There are no Jews in my basement" and "There are no embezzled charity funds in my basement". Obviously this isomorphism doesn't extend to the morality of the lies, which makes it hard for me to see a connection between spreadability and immorality.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2014-02-27T01:54:51.238Z · LW(p) · GW(p)

First, I'm not sure what it means to say that "There are no Jews in my basement" is unlikely to spread.

The Gestapo member is likely to have forgotten all about that specific lie by the time he finishes asking everyone on the block.

I can see an isomorphism (in terms of how the lie spreads) between "There are no Jews in my basement" and "There are no embezzled charity funds in my basement". Obviously this isomorphism doesn't extend to the morality of the lies, which makes it hard for me to see a connection between spreadability and immorality.

Disagree: the lies themselves are comparable; the difference in morality comes from the difference between the goals the lies are being used for.

comment by Protagoras · 2014-02-24T18:05:13.140Z · LW(p) · GW(p)

I'm with Scott. It's so natural to think that if your enemies are as ruthless as the Tsars and their goons, you need to be as ruthless as the Bolsheviks to fight them. But we all know how that worked out, and it hardly seems to be an outlier; rather, it seems to be the norm for those willing to sink to their opponents' level. If the goal is victory for our cause, and not just victory for some people who find it convenient to claim to be cheerleaders for our cause, we need to be very careful that our tactics are not training up Stalins within our ranks. Not that I'm advocating total purity at all times and in all respects, but I think before playing dirty you need to make sure you have a much better reason to think it's a good idea than "the other guys are doing it."

Replies from: blacktrance, buybuydandavis
comment by blacktrance · 2014-02-24T21:56:59.698Z · LW(p) · GW(p)

If the goal is victory for our cause, and not just victory for some people who find it convenient to claim to be cheerleaders for our cause, we need to be very careful that our tactics are not training up Stalins within our ranks.

Well said. Also, an additional benefit of rational discussion is that it promotes truthseeking - people may discover that the cause that they're supporting is not the cause that they should be supporting. Under a "win at all costs" paradigm, arguments against your position are enemy soldiers, so if you win, it'll be without seriously considering the arguments of the opposition. That increases the likelihood of you being wrong. If your goal is something beyond personal power - if it's something like "the correct thing should win and become dominant" and not "I, as I am now, should win and become dominant" - then honest discussion is even more useful.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2014-02-25T05:34:18.158Z · LW(p) · GW(p)

Also, as I mentioned here even if your initial cause was right, by lying about it you'll attract people who believe your lies. Thus, eventually your cause is likely to morph into something that is a bad idea.

Replies from: ChristianKl
comment by ChristianKl · 2014-02-25T14:04:35.859Z · LW(p) · GW(p)

Every revolution eats its own children.

Replies from: Benito
comment by Ben Pace (Benito) · 2014-02-26T19:34:46.491Z · LW(p) · GW(p)

Other than the annihilation of the baby-eaters... But otherwise a really cool quote.

comment by buybuydandavis · 2014-02-24T20:34:29.201Z · LW(p) · GW(p)

Of course.

The goal isn't to match the opponent, the goal is an effective strategy to further your own ends. Complete pacifism in the face of abuse is probably not it.

Replies from: Protagoras, Eugine_Nier
comment by Protagoras · 2014-02-24T21:07:23.585Z · LW(p) · GW(p)

People seem to overestimate the effectiveness of playing dirty, though. Perhaps willingness to play dirty signals commitment, and I expect some of the time people are more interested in showing off their commitment than actually making progress toward the putative goal. But in any event, playing dirty has all sorts of costs (some discussed in this thread) which people seem to ignore or underestimate, and my only point is that it's a strategy to be employed only when it still seems like the best option even after all the costs and risks have been considered.

Replies from: TheOtherDave, polymathwannabe
comment by TheOtherDave · 2014-02-24T21:50:40.312Z · LW(p) · GW(p)

Perhaps willingness to play dirty signals commitment, and I expect some of the time people are more interested in showing off their commitment than actually making progress toward the putative goal.

Yeah, that's pretty much my take. Often, signalling the willingness to play dirty without actually doing so gets us the collective benefits of "niceness, community, and civilization" while also getting us some extra individual benefits on top of that. And asserting that playing dirty is effective and that rational agents should be willing to play dirty can be an effective way of signalling that willingness.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2014-02-25T03:54:45.622Z · LW(p) · GW(p)

Until someone comes along, reads all the stuff you wrote about the importance of playing dirty, and believes you.

Replies from: ChristianKl
comment by ChristianKl · 2014-02-25T14:06:29.084Z · LW(p) · GW(p)

Or alternatively uses it to argue that you aren't trustworthy because you are willing to play dirty.

comment by polymathwannabe · 2014-02-24T21:17:40.203Z · LW(p) · GW(p)

I've been considering precommitting to this: if someone in a group I'm a part of plays dirty or uses blackmail, I'll delete all of his/her reputation points in my head, and impose a moratorium on when he/she can start earning reputation points with me again. I would do this regardless of the success of what he/she did to the group.

Is this wise?

Replies from: Scott Garrabrant
comment by Scott Garrabrant · 2014-02-25T04:17:18.840Z · LW(p) · GW(p)

It is perhaps not wise to have such an all or nothing reaction to something that is as hard to define as "plays dirty" or "uses blackmail."

comment by Eugine_Nier · 2014-02-25T05:22:43.100Z · LW(p) · GW(p)

What do you mean by "complete pacifism"?

The way to fight someone who spreads lies about you is not to spread lies about them; it's to spread the truth about them.

Replies from: buybuydandavis
comment by buybuydandavis · 2014-02-25T09:47:10.444Z · LW(p) · GW(p)

When I speak of fighting back, I'm talking about making them pay a cost, and not feeling constrained to play fair for their sake. They've forfeited that consideration.

If you have overriding reasons to tell the truth, do so. But not to preserve value for them. When someone attacks you, it's time to destroy values for them.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2014-02-26T01:46:38.229Z · LW(p) · GW(p)

When I speak of fighting back, I'm talking about making them pay a cost, and not feeling constrained to play fair for their sake.

Agreed, however, as I argue here the biggest reason for not lying for your cause isn't for their sake, it's for yours.

comment by acadigia · 2014-02-24T09:23:44.499Z · LW(p) · GW(p)

I'm not entirely convinced that the relationship between crafting a rational argument and crafting a persuasive argument is nearly as inverse-correlational as implied. On average, lies have a higher manufacturing cost (because you have to tread carefully and be more creative), a greater risk (since getting caught will lower your overall persuasiveness), and a smaller qualitative gain (while lies probably persuade more people, I suspect that they persuade fewer rationalists than civil debate does and are therefore less qualitative overall). There are other means of persuading people without making deliberately irrational arguments. If sound reasoning alone isn't tasteful enough for you, why not season your truth with charm instead of coating it in sophistry? Why not leverage charisma or cordiality? You know - the dark art of sucking up?

While fear is often heralded in psychological communities as the most effective mechanism of persuasion, that doesn't mean it's the mechanism of persuasion with the greatest utility. A well-beaten child might obey best, but obedience isn't the only goal of discipline - nor agreement the only goal of argumentation. Personally, I'd rather treat every worthy cause as an opportunity for non-rationalists to exercise rationality than as an excuse for rationalists to manipulate non-rationalists. This tactic might not win every argument now, but it lays a surer foundation on which to build our arguments in the future.

Replies from: Error, buybuydandavis, NancyLebovitz
comment by Error · 2014-02-25T15:22:55.677Z · LW(p) · GW(p)

A well-beaten child might obey best, but obedience isn't the only goal of discipline - nor agreement the only goal of argumentation.

I've heard this referenced somewhere as the difference between persuading someone and convincing them. You can apply rhetoric or logic until someone verbally accepts your arguments, but that is not the same as getting them to genuinely believe that what you are saying is true.

Sometimes people will say "Okay, you're right" just to get you to shut up.

comment by buybuydandavis · 2014-02-25T20:29:52.464Z · LW(p) · GW(p)

On average, lies have a higher manufacturing cost (because you have to tread carefully and be more creative)

Hardly. It's much easier to throw bullshit at the wall than to clean it off.

In many public debates, people shovel outright lies again and again. In the time it takes for you to properly evaluate their lie, they've shoveled 50 more.

Also, there are lies, and then there is conceptual muddle that's not even false. Try cleaning that up. Conceptual muddle takes centuries to clean up.

, a greater risk (since getting caught will lower your overall persuasiveness)

Since when? With whom? Who has paid enough attention to keep track? This is one of the fundamental problems with public debates - nobody is keeping score.

If the people who already agree with you even notice, they'll likely shrug it off as a tactic, or just shift their attention to the next piece of bullshit supporting their views that they haven't yet seen through.

, and a smaller qualitative gain (while lies probably persuade more people, I suspect that they persuade fewer rationalists than civil debate does and are therefore less qualitative overall)

It looks like you're suggesting that rationalists count more - somehow? Even if they do, they don't have the numbers.

Rationalists are good if you want someone to produce useful epistemic truths. If you want to persuade masses of people, probably not so good. They don't persuade, and their numbers are so small persuading them doesn't take you very far in the aggregate.

People use the Dark Arts because they're effective. Otherwise they'd be called the Dark Incompetencies.

comment by NancyLebovitz · 2014-03-04T18:45:17.522Z · LW(p) · GW(p)

A well-beaten child might obey best, but obedience isn't the only goal of discipline - nor agreement the only goal of argumentation.

You've got the wrong kind of fear there-- the effective use of fear is to make your listener afraid of some third party or event, not to make them afraid of you.

If you make people afraid of you, they might give in, especially if you have physical power over them. You might get useful compliance that way. However, you're also likely to get people to avoid you if they can, or to push back compulsively.

comment by Mestroyer · 2014-02-25T02:53:33.849Z · LW(p) · GW(p)

Whether or not the lawful-goods of the world like Yvain are right, they are common. There are tons of people who want to side with good causes, but who are repulsed by the dark side even when it's used in favor of those causes. Maybe they aren't playing to win, but you don't play to win by saying you hate them for following their lawful code.

For many people, the lawful code of "I'm siding with the truth" comes before the good code of "I'm going to press whatever issue." When these people see a movement playing dirty -- advocating arguments as soldiers, where you decide whether to argue against an argument based on whether it's for your side rather than whether it's a good argument, and getting mad at people for pointing out bad arguments from their side -- they begin to suspect that your side is not the "Side of Truth". So you lose potential recruits. And the real Sith lords, not the ones who are trying to use the dark side for good, will have much less trouble hijacking your movement with the lawful-goods and their annoying code and the social standards they impose gone.

Leaving aside the honor among foes idea, and the "what if you're really the villain" idea, if your cause is really just, then although the lawful-goods are less effective than you, their existence is good for you. Not everything they do is good, but on balance they are a positive influence. You're not going to convince them to attempt to be dark side users for good like you are attempting to be, so stop giving them reasons to dislike you.

Even if you can convince them, the lawful-evils who think they are lawful-goods are listening to your arguments. Most people think they are good. It is hard to tell when you're not good. So the idea that only truly good people are bound by the lawful code is crazy. Lots of lawful evil is an unintentional corruption of lawful good, and this corruption doesn't unilaterally affect your goodness and your lawfulness. They could tell (or at least convince themselves) they weren't really good, if they didn't follow the lawful code, because they think like lawful good people in that respect. The lawful evil people who see you, and know you are opposed to them on the good/evil axis, think they see evil people saying "Forget this honor among enemies thing. We have no honor. Watch me put on this 'I am defectbot' shirt". And that is a much stronger argument to abandon the lawful code of rational argument and become the much more dangerous chaotic evil than what the lawful-goods hear, which is their chaotic good allies telling them to defect.

But in real modern human politics, it's more complicated because although there is one lawful/chaotic axis, there are many good/evil axes. Because there are many separate issues that people can get right or wrong. Arthur Chu thinks that the issue of overriding importance is social justice. So he demands that we drop all cooperation with people who are evil on that axis. He says we aren't playing to win. I can think of 3 issues (2 of them are actually broad categories of issues) that I am confident are more important than social justice, and which are easier to improve than the problems social justice wants to counter. In order of decreasing importance, existential risk, near-term animal suffering including factory farming and wild animals, and mortality/aging.

In real life, you don't demand that your allies be on the same end of every good/evil axis as you. That is not playing to win. A better strategy (and the one Chu is employing) is to pick the most important axis, and try and form a coalition based on that axis. Chu accuses LW of not playing to win, well, I'm just not playing to win along the social justice axis at the cost of everything else. I think different axes are more important.

And there's also the fact that for some causes, "lawful" people (people who play by the rules of rational discourse) are much better to have as allies. If we use bad statistics and dark arts to convince the masses to fund FAI research, they may as well fund Novamente as MIRI. Not all causes can benefit from irrational masses. Something like MIRI can't afford to even take one step down the path to the dark side. When you want to convince academics and experts of your cause, they will smell the dark arts on you and conclude you are a cult. And with the people you will attract by using dark arts, your organization will soon become one. The kind of people who you absolutely need to do object-level work for you are the kind of people who will never join you if you use the dark arts.

If you take a pluralistic "which axes are important" approach instead of the one that Chu takes, then there is a lot to be said for lawfulness, because it tends to promote goodness*, a little. And when you get a bunch of lawful-goods and lawful-evils together and you nudge them all a little toward good through rational discussion (on different axes), that is pretty valuable. Because almost everyone is evil on at least one axis. And such a community needs a policy like "we ask that you be lawful [follow standards of rational discourse], not that you be good [have gotten object-level questions of policy right]," because it is the only defensible Schelling point.

*If you haven't caught on to how I'm using the "law vs chaos" and "good vs evil" axes here by now, this may sound like moral realism, but what I mean by "law" is upholding Yvain-style standards of discourse. What I mean by "good" is not just being moral, but being moral and, given that morality, right about questions of ethics.

Replies from: anon895, None
comment by anon895 · 2015-02-24T03:19:21.422Z · LW(p) · GW(p)

Could you post a screenshot or archived version of your Facebook link?

comment by [deleted] · 2014-02-25T03:28:06.162Z · LW(p) · GW(p)

According to the vast majority of social justice types, you have just signaled yourself as a quite serious enemy. At the very least you would get banned from any site you posted this on. At worst you would probably be blackballed and quite roundly thrashed in social media.

Given your argument for how they should interact with "lawful goods" -- and please taboo your applause lights of "dark arts" and "D&D alignments" in general -- you are unironically making a much larger mistake with respect to them. That is, this post will put them off you, and possibly people who interact positively with you, far more than Arthur's posts would put Yvain off.

Can you explain the difference here? Why is it rational for you to make this post but not for Arthur to make his?

For reference, saying that existential risk and animal cruelty outweigh social justice is going to be extremely offensive to them. I'm not sure I could state how much mortality/aging, especially in LW terms, being more important than social justice would make them hate you without getting banned, even on a site that prides itself on free and open discourse. Well, I suppose I could try, but I would probably fail.

I wouldn't react the same way, but I also wouldn't fault them for their reaction.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2014-02-25T05:03:50.868Z · LW(p) · GW(p)

According to the vast majority of social justice types, you have just signaled yourself as a quite serious enemy. At the very least you would get banned from any site you posted this on. At worst you would probably be blackballed and quite roundly thrashed in social media.

Is this supposed to be an argument against Mestroyer or against the Social Justice Types?

Replies from: None
comment by [deleted] · 2014-02-25T05:08:03.709Z · LW(p) · GW(p)

It's an inconsistency in Mestroyer's logic. He talks about how social justice types could attract "lawful good" people. But his response, typical of a certain class of rationalist, is so offensive to social justice types that one wonders whether it makes sense to attract a group that places social justice so low on their to-do list. It seems to me that it would be counterproductive to integrate people such as Mestroyer into the social justice community.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2014-02-25T05:29:07.251Z · LW(p) · GW(p)

But his response, typical of a certain class of rationalist, is so offensive to social justice types

This is a problem with the social justice types being too mindkilled, not a problem with Mestroyer's logic.

Replies from: None
comment by [deleted] · 2014-02-25T05:42:46.105Z · LW(p) · GW(p)

Disagree. Social justice is a set of dozens of axes and deals with issues like the prison-industrial complex. But somehow people caught up in that are being unreasonable when Mestroyer says that existential risk is more important and they take offense? That's ridiculous.

The net harm done by any number of social justice issues far outweighs the issues Mestroyer considers important based on his comment.

Are you arguing that it's merely the intensity of the response that makes them mindkilled?

Replies from: Eugine_Nier
comment by Eugine_Nier · 2014-02-25T06:05:36.009Z · LW(p) · GW(p)

Are you arguing that it's merely the intensity of the response that makes them mindkilled?

The intensity of the response is what makes them mindkilled. Of course, if the social justice people were actually willing to listen to people who disagreed with them, they might realize that Mestroyer is in fact correct about existential risk being more important than their issues.

Edit: In fact, if they listened to more criticism, they might realize that the net harm from most of their issues is at worst negligible and at best negative, i.e., it is the social justice movement itself that is doing net harm.

Replies from: None
comment by [deleted] · 2014-02-25T06:11:26.612Z · LW(p) · GW(p)

No it's not?

It might be more important to white upper middle class rationalists, especially in somewhere like the Bay Area. Can't argue with that.

You'd be hard pressed to convince me that cryonics is more beneficial to people of color than dismantling the systematic bias against people of color inherent in western society. Most existential risks are similarly unconvincing.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2014-02-25T06:22:35.189Z · LW(p) · GW(p)

You'd be hard pressed to convince me that cryonics is more beneficial to people of color than dismantling the systematic bias against people of color inherent in western society.

Well, there is a very simple reason cryonics is more beneficial than "dismantling the systematic bias against people of color inherent in western society", namely, that the "systematic bias against people of color inherent in western society" doesn't actually exist. If anything modern western society has a systematic bias in favor of people of color.

Here's a hint: the people who told you that there exists "a systematic bias against people of color inherent in western society" believe that lying is justified for the cause, and they were either lying to you or repeating someone else's lie.

Replies from: None
comment by [deleted] · 2014-02-25T06:28:43.993Z · LW(p) · GW(p)

You're really serious aren't you? Affirmative action? That's your argument? And I thought this was a rationalist website. You probably think there isn't a systematic bias against women either.

I'm tapping out, now that you've revealed your true nature.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2014-02-25T06:34:31.334Z · LW(p) · GW(p)

You're really serious aren't you? Affirmative action? That's your argument?

Yes, and I notice a distinct lack of counter-argument on your part.

And I thought this was a rationalist website.

Yes, and that means we are expected to provide arguments for our claims here.

You probably think there isn't a systematic bias against women either.

There isn't.

now that you've revealed your true nature.

From my experience, that's social-justice-speak for "I don't actually have any rational arguments against your position so I'm going to resort to name calling".

comment by JQuinton · 2014-02-24T22:43:19.973Z · LW(p) · GW(p)

Chu's position -- at least, as presented at Yvain's blog -- seems to dip into the realm of being a guardian of truth. To me, that position is always scary... even if it comes from the "good guys".

Replies from: None
comment by [deleted] · 2014-02-24T23:42:07.733Z · LW(p) · GW(p)

I would personally suggest that anyone considering this issue seriously read the multitude of comments Chu made on Jeff Kaufman's Facebook post. It may or may not change your mind, but it's solid evidence, and it's easily accessible. If you took the time to read Yvain's post, the 2-10 minutes, depending on your reading speed, to read all of Chu's comments in their original context is time well spent.

Replies from: Pfft
comment by Pfft · 2014-02-25T03:22:44.677Z · LW(p) · GW(p)

Done. I think the most significant point Chu made which didn't come across in the other summaries was that "some ideas are inherently dangerous and must not be allowed to spread", and that neoreaction is among those.

So I guess that a lot of the disagreement comes down to how dangerous you believe the ideas are. A big reason I feel comfortable reading Moldbug looking for interesting points of view is that his ideas have lost so thoroughly: regardless of his feelings that black people would be better off as slaves, the probability that slavery will be reinstated in America is basically zero (except perhaps in a complete collapse of civilization). If I believed that discussing Moldbug carried an appreciable risk of destroying modern liberal society, then I wouldn't.

(Indeed, since the pseudo-nazi revival in Greece in recent years, I have felt a bit less comfortable about Moldbug too. Suddenly, liberal democracy seems slightly less secure).

Replies from: None, Eugine_Nier, ikrase
comment by [deleted] · 2014-02-25T03:43:42.894Z · LW(p) · GW(p)

Upvoted for follow-through.

How do you feel about less intense negatives, such as social regression? Things like how American conservatives essentially export their bullshit all over the world, such as the rise of American anti-abortion tactics in Britain, and the role American conservatives played in the anti-gay movement in Russia? Or just certain anti-gay, anti-woman, anti-POC stances in America? For instance, Arizona passed, or tried to pass, a law allowing businesses to discriminate against LGBT and racial minorities on the grounds of religious freedom.

At what point does it become problematic enough that we should stop debating and crack down on these behaviors? Although the crackdowns may be of varying levels. No need to send in the army to round up Arizona legislators.

Replies from: Eugine_Nier, Viliam_Bur, Eugine_Nier
comment by Eugine_Nier · 2014-02-26T02:02:14.019Z · LW(p) · GW(p)

such as social regression

Care to define what you mean by "social regression"? Also explain why it's a bad thing.

For instance Arizona passed, or tried to pass, a law allowing for discrimination

Why should discrimination be illegal? Also while we're on the subject, should churches be forbidden to discriminate on religion?

comment by Viliam_Bur · 2014-02-25T12:56:07.338Z · LW(p) · GW(p)

What happens in America significantly influences the rest of the world. Yes. But it's almost orthogonal to the question of how to win the battle within America.

If the hypothesis "if you play nice, you are more likely to win (because people will enjoy joining your side), and if you play dirty, you are more likely to lose (because neutral people will hate you, and you will also have a lot of internal fighting)" is true -- which is the thing being debated -- then the fact that the outcome in America will strongly influence the rest of the world, just makes it more important to play nice in America.

More meta: If you believe some strategy is the winning strategy, increasing stakes should make you follow the strategy more carefully, not abandon it.

comment by Eugine_Nier · 2014-02-26T04:42:00.869Z · LW(p) · GW(p)

For instance, Arizona passed, or tried to pass, a law allowing businesses to discriminate against LGBT and racial minorities on the grounds of religious freedom.

This is not even an accurate summary of the law in question.

comment by Eugine_Nier · 2014-02-25T04:01:29.097Z · LW(p) · GW(p)

If I believed that discussing Moldbug carried an appreciable risk of destroying modern liberal society, then I wouldn't.

How sure are you that "modern liberal society" is in fact a good thing? What evidence convinced you to believe this? How sure are you that evidence wasn't fabricated by someone who also thought lying was justified to protect modern liberal society?

Replies from: polymathwannabe
comment by polymathwannabe · 2014-02-25T04:03:45.126Z · LW(p) · GW(p)

Upvoted, not because I have anything against modern liberal society, but because we should routinely question our beliefs.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2014-02-25T04:42:00.880Z · LW(p) · GW(p)

My point is more than that. It is that by lying for a cause you've made it much harder to properly question any of its beliefs. After all, properly questioning something requires getting accurate data, which is much harder if you're also spreading false data about the subject.

comment by ikrase · 2014-03-01T12:15:34.810Z · LW(p) · GW(p)

Does Moldbug actually believe that?

Replies from: bramflakes
comment by ArisKatsaris · 2014-02-24T17:18:14.052Z · LW(p) · GW(p)

I'd be interested in people's opinions of how dire the real world consequences have to be before it's worthwhile debating dishonestly

I, for one, have the impression that the more dire the consequences, the more important honesty in arguments becomes. So I don't really get your dilemma.

Replies from: James_Miller
comment by James_Miller · 2014-02-24T19:15:11.945Z · LW(p) · GW(p)

What if you are Jewish and are trying to stop a Hitler from coming to power, and the best means would be to spread a deliberate lie about him? Are you saying that the worse the outcome would be, the less likely you would be to lie?

Replies from: asr, Yvain, ArisKatsaris, polymathwannabe
comment by asr · 2014-02-25T03:37:08.043Z · LW(p) · GW(p)

What if you are Jewish and are trying to stop a Hitler from coming to power, and the best means would be to spread a deliberate lie about him? Are you saying that the worse the outcome would be, the less likely you would be to lie?

Nobody in this discussion is confronting a present or potential totalitarian state bent on murder so this feels like a tangent. In fact, this is a hypothetical that very few people are ever confronted with and therefore it isn't relevant to a question of practical ethics. Very few people are skilled enough at predicting the future to know when the situation is dire or whether dishonesty will work; very few people are skilled enough manipulators to pull it off.

For the range of social issues the participants in this conversation are likely to confront, I think it's a good policy to be more careful and honest the higher the stakes. Among other things, the higher the stakes, the likelier a lie or mistake is to be caught. And being caught lying doesn't generally achieve any goal of the liar.

Replies from: James_Miller
comment by James_Miller · 2014-02-25T03:48:24.278Z · LW(p) · GW(p)

Obamacare only became law because Obama lied by saying that under the law "If you like your health care plan, you can keep it." PolitiFact made this their lie of the year.

I suspect that many on the left knew at the time Obama was lying about this but kept quiet because they really wanted the law to pass. They won.

Replies from: asr, ChristianKl
comment by asr · 2014-02-25T03:59:50.261Z · LW(p) · GW(p)

I suspect that many on the left knew at the time Obama was lying about this but kept quiet because they really wanted the law to pass. They won.

[upvoted for giving a crisp, recent, and plausible example of people getting away, at least in the short term, with dishonesty. I was a little squeamish about the politicization of the topic but I think it's hard to avoid giving a real political example in a conversation about political dishonesty]

I take the point that there's a complicated collective-action problem here where if enough people repeat something they wish were true, it can become relatively accepted, at least for a while.

The catch is that, as happened here, people often get caught having been dishonest. And we will see how painful the consequences are for those people personally and politically.

comment by ChristianKl · 2014-02-25T14:34:48.073Z · LW(p) · GW(p)

Obamacare only became law because Obama lied by saying that under the law "If you like your health care plan, you can keep it."

Obama doesn't use truth as a strategy, but that doesn't change the fact that Cato was a very successful politician when it comes to people respecting his positions.

I suspect that many on the left knew at the time Obama was lying about this but kept quiet because they really wanted the law to pass. They won.

They didn't lose, but they also didn't get the single-payer health care they wanted.

I think US politics is ready for someone like Cato to come up and take it over. You don't win in politics by telling slightly fewer lies than your opponents. On the other hand, actually being honest has its advantages.

comment by Scott Alexander (Yvain) · 2014-02-28T09:35:40.829Z · LW(p) · GW(p)

I think the relevant axis may be short-term/specific vs. long-term/broader consequences rather than unimportant vs. important. I think defecting is usually a long-term bad strategy but a short-term good one. If you're pretty sure there's not going to be a long-term unless you fix your short-term problems immediately, defecting might be a good idea for you or your chosen cause - not sure about for the world at large.

comment by ArisKatsaris · 2014-02-25T01:30:34.328Z · LW(p) · GW(p)

What if you are Jewish and are trying to stop a Hitler from coming to power, and the best means would be to spread a deliberate lie about him?

If I lie about him, then the most likely consequence is that Hitler will have verified proof that "Jews are lying about me". So the consequence is that I would end up helping cause the holocaust, not stopping it.

More generally what's the point of using a hypothetical scenario where the assumption is that the best means would be to spread a lie, when that's exactly what I'm contesting (that lying is the best means)? That's begging the question. Tell me in what exact way I'd be in an epistemic position to know that lying is the best means?

Replies from: James_Miller
comment by James_Miller · 2014-02-25T04:26:43.044Z · LW(p) · GW(p)

The set of things you could say is vastly larger than the set of true things you could say, so unless lying is observed and punished, you should assume that you are probably better off at least occasionally lying.

I'm a game theorist and think that wearing makeup or acting more confident than you really are, are forms of lying that frequently benefit individuals.

Replies from: ChristianKl
comment by ChristianKl · 2014-02-25T14:36:21.561Z · LW(p) · GW(p)

I think I spent too much attention optimizing things like the clothing I was wearing and the way the background was arranged at my first TV interview.

Being busy with tactics takes mental resources and builds anxiety. I would probably have done better if I had spoken from a more relaxed state of mind that doesn't worry that much about the background of the image.

I'm a game theorist and think that wearing makeup or acting more confident than you really are, are forms of lying that frequently benefit individuals.

Do you in fact wear makeup on a regular basis?

Replies from: James_Miller
comment by James_Miller · 2014-02-25T15:12:50.247Z · LW(p) · GW(p)

No makeup, but I do fake confidence.

Replies from: ChristianKl
comment by ChristianKl · 2014-02-25T15:14:42.753Z · LW(p) · GW(p)

Why no makeup? It's possible to use makeup as a man in a way that accentuates manly features.

Replies from: James_Miller
comment by James_Miller · 2014-02-25T15:19:40.198Z · LW(p) · GW(p)

I'm open to the idea, I do dye my hair.

Replies from: ChristianKl
comment by ChristianKl · 2014-02-25T16:10:26.502Z · LW(p) · GW(p)

I think that if you really look at the makeup question, you will find that it's not cost-effective.

A quick googling gives me the number that women spend on average 91 hours per year applying makeup ( http://www.dailymail.co.uk/femail/article-2175077/Women-spend-43-weeks-life-applying-make-perfecting-face-night-out.html ).

I think that a woman who instead spends the same amount of time in daily meditation sessions will get a higher return on her time investment.

In a world full of superficial people there's not much comparative advantage at trying to be better at being more superficial than everyone else. I think it's a better strategy to compete based on personal depth.

If you are open about who you are, that will make you more confident than if you walk around all the time with a mask.

Replies from: gwern, D_Malik
comment by gwern · 2014-02-25T19:01:13.496Z · LW(p) · GW(p)

In a world full of superficial people there's not much comparative advantage at trying to be better at being more superficial than everyone else. I think it's a better strategy to compete based on personal depth.

Comparative advantage doesn't mean you can neglect something entirely. Personal attractiveness has large consequences on how people evaluate & treat you, and equally so for men and women, it looks like (Langlois et al 2000 (excerpts) claim gender is not a large moderator of beauty effects). Even if Miller goes beyond just dying his hair, he could still be well below optimal.

Replies from: ChristianKl
comment by ChristianKl · 2014-02-25T21:11:07.722Z · LW(p) · GW(p)

You can get personal attractiveness through different ways.

Within three years I went from being told that I never smile while dancing to being asked why I smile while dancing, without being able to give a reason.

It's not because I specifically worked on my smile but because I did emotional work on a deep level.

At a family event yesterday someone told me that I look taller than when we last met a while ago, and I probably do appear taller than I did a year ago because my body language changed as a result of deeper work.

If you become a happier person who doesn't get anxiety over all sorts of things happening around you, you will appear more attractive in any face-to-face encounter and even in photos.

If I want to connect with another person, I care about perceiving the reactions that the words I speak have on the other person. If the woman with whom I'm talking doesn't show any facial reactions because she's on botox, that makes it a lot harder for me to connect with her.

A good quote in the CBT book "The feeling good handbook" is "You can never be loved for your successes-only for your vulnerabilities. People may be attracted to you and may admire you if you are a great success. They may also resent and envy you. But they can never love you for your success."

Being vulnerable is useful. If all of your body language is fake and further signals are hidden by makeup, then you aren't vulnerable and you make it hard for other people to love you.

gwern, in my mind you are one of the few individuals who usually walk their talk. Do you think it's useful to use makeup? Do you use it yourself? Especially since you cite a paper saying that gender isn't very important when it comes to the effects of beauty.

Given the nature of the subject it might be hard to speak openly*, but do you do other black hat stuff to manipulate the people you interact with into finding you more attractive?

*While I do promote openness, I'm also willing to treat information that's marked as private privately, and my commitment to openness doesn't mean that I have a problem protecting the secrets of other people.

Replies from: gwern
comment by gwern · 2014-02-25T23:45:48.176Z · LW(p) · GW(p)

gwern, in my mind you are one of the few individuals who usually walk their talk. Do you think it's useful to use makeup? Do you use it yourself? Especially since you cite a paper saying that gender isn't very important when it comes to the effects of beauty.

I don't use makeup at the moment, but I have two main reasons for this: I interact with few people so I expect my gains to be less than average, and I am revolted by the very idea of using cosmetics or working on my appearance. (I think it's a mix of dislike of deception, laziness, and gender norms.)

The former is fine as far as it goes, but as far as the latter is concerned... I admit it is a bad reason; I've been trying to improve matters by small compromising steps which don't trigger my dislike: purchasing better-looking glasses, improving my shaving routine, more regular exercise, throwing out the worst of my clothes.

Replies from: ChristianKl
comment by ChristianKl · 2014-02-26T09:35:08.825Z · LW(p) · GW(p)

I am revolted by the very idea of using cosmetics or working on my appearance. (I think it's a mix of dislike of deception, laziness, and gender norms.)

It's indeed a problem when you simultaneously revolt against the idea of working on your appearance and think it's a high-benefit activity.

I think the solution is either to work out that working on your appearance goes against your own values, or to resolve the emotional issues and work on your appearance.

If you walk around and it's clear that your appearance isn't optimized because you don't believe in doing so, but could if you wanted to, that's respectable and can be a high-status move. If you appear to be trying hard to work on your appearance and fail because you revolt against the idea of working on it, that's no sign of social status.

For myself, the time when I put the most attention on my appearance was about a year into dancing Salsa. The activity gave me a new perception of my body, and after that I had internal motivation to improve. At the time I was also trying to optimize how I affect other people, which I'm not really doing anymore, but I'm still not badly dressed. I'm no Zizek ;)

comment by D_Malik · 2014-02-26T07:45:24.399Z · LW(p) · GW(p)

I think that a woman who rather spends the same amount of time in daily meditation sessions will get a higher return on her time investment.

I'm not so sure. Women who don't wear makeup are much less attractive, which significantly reduces their social status and their dating market value. These are things people greatly value.

If you are open about who you are, that will make you more confident than if you walk around all the time with a mask.

I think confidence depends mostly on practice and genetics and situational factors. If anything, I think the superficiality-confidence connection is the other way round - being confident makes people see you as more genuine, because of the halo effect, i.e. because everybody loves to hate low-status people. People without masks are weirdos, because what people call "being normal" is a learned behavior, a mask.

Replies from: polymathwannabe, ChristianKl, Lumifer, therufs
comment by polymathwannabe · 2014-03-27T18:49:03.421Z · LW(p) · GW(p)

Women who don't wear makeup are much less attractive...

... depending on the eye of the beholder.

comment by ChristianKl · 2014-02-26T08:55:44.714Z · LW(p) · GW(p)

I think confidence depends mostly on practice and genetics and situational factors.

You will get less real practice if you are walking around with a mask. If you worry what other people think about your looks to the extent that you spend 30 minutes to look presentable, that will affect your confidence.

Women who don't wear makeup are much less attractive, which significantly reduces their social status and their dating market value.

Why do you believe that there's a difference between men and women in that regard? I think the fact that you separate the genders has a lot to do with status quo bias.

People without masks are weirdos

That depends a lot on the environment in which you are moving. There are corporate environments where you are expected to wear a mask and where you can't drop it completely. Yet Steve Jobs, who was a Buddhist who meditated a lot, did very well while wearing a sweater instead of dressing in a suit.

Steve Jobs wore no makeup, which is not typical for people who go in front of the camera and on big stages and have the budget for makeup stylists.

Jobs: I have never worn makeup.
Shriver: Really?
Jobs: I don’t give a s**t what I look like.

For all the talk about game theory, strategy matters. If you want to play Steve Jobs's strategy, that's not compatible with spending a lot of effort on looking attractive; it means sitting a lot and meditating instead.

Getting bogged down in tactics isn't good.

Replies from: D_Malik
comment by D_Malik · 2014-02-26T09:38:52.780Z · LW(p) · GW(p)

You will get less real practice if you are walking around with a mask. If you worry what other people think about your looks to the extent that you spend 30 minutes to look presentable, that will affect your confidence.

Then again, you'll be more confident wearing a mask the more practice you have with it. I think we mean different things by "masks" though.

I think that a woman who rather spends the same amount of time in daily meditation sessions will get a higher return on her time investment.

Women who don't wear makeup are much less attractive, which significantly reduces their social status and their dating market value.

Why do you believe that there's a difference between men and women in that regard? I think the fact that you separate the genders has a lot to do with status quo bias.

The quote I was replying to dealt exclusively with women. That said, there is a big difference, especially if by "makeup" you mean what everyone else means by "makeup". Men are not respected more if they wear eyeliner and blusher every day. I do think that men would benefit from optimizing their personal appearance, e.g. by getting rid of acne, whitening teeth, dressing better, wearing heel lifts, etc.

Re Steve Jobs: Giving one outlier as a counterexample does not undermine the general principle. Anyway, Jobs seems to be countersignalling: "I'm so awesome I don't even need to dress up for you to know that I'm awesome." It wouldn't have worked if people didn't already think highly of him, and I'm not even sure it worked at all. (We don't know what would have happened if he had cared more about his appearance.)

Replies from: ChristianKl
comment by ChristianKl · 2014-02-26T16:03:35.157Z · LW(p) · GW(p)

That said, I do think there is a big difference, especially if by "makeup" you mean what everyone else means by "makeup". Men are not respected more if they wear eyeliner and blusher every day.

The kind of makeup that male actors on TV use isn't about wearing eyeliner.

There are women who use makeup to look artificial and there are women who go for a "natural look". I don't think that a woman who goes for an artificial look will get more status in a Yoga class than a woman who uses less makeup. The benefits of looking artificial for a woman depend on the social circle in which she moves.

While Angela Merkel does use makeup, she didn't get in a position of political power by being good looking. Using makeup to make her look very feminine wouldn't help her.

I do think that men would benefit from optimizing their personal appearance, eg by getting rid of acne, whitening teeth, dressing better, wearing heel lifts, etc.

A lot of men have acne because of hormonal issues. If I try to avoid showing emotions on my face, it gets tense. That reduces blood flow in my face. Less blood flow means that it's more difficult for my immune system to clear my face of the bacteria that cause acne.

I don't think it's an accident that you see acne more often in asocial nerds than you see it when you look at bodybuilders. The difference also isn't that the bodybuilders went to more dermatologists.

Then again, you'll be more confident wearing a mask the more practice you have with it.

Your brain will always know that you are afraid to be open if you wear a mask for the sake of impressing other people. I don't have an issue with a woman putting on makeup because she enjoys putting on makeup, and I don't think that will reduce confidence.

Re Steve Jobs: Giving one outlier as a counterexample does not undermine the general principle.

I'm talking about making strategic choices. Steve Jobs is someone who went to Buddhist meditation retreats before he was famous. That distinguishes him. As a result, he wasn't obsessed with optimizing his appearance on the makeup level.

It's not a choice that's in the cards for someone who has done that level of deep internal work to put on makeup to impress other people. Yes, there are women on that level who put on makeup because they like the activity, but it comes from a very different place than putting on makeup to make people like them.

Replies from: D_Malik
comment by D_Malik · 2014-03-05T09:36:40.352Z · LW(p) · GW(p)

If I try to avoid to show emotions on my face, it gets tense. That reduces blood flow in my face. Less bloodflow means that it's more difficult for my immune system to clear my face of bacteria that cause acne.

This is very interesting; is this a significant cause of acne, and if so how do you know? If this were true, we would expect that other things that decrease blood flow in the face (such as cold weather, maybe?) would also increase acne.

Here are other hypotheses on acne, not sure whether they're true:

  • Acne is a defense mechanism employed when the body detects that one is low-status. That is, it's a way of making yourself less threatening to the rest of the tribe so that they won't slaughter you. If true, this could be mediated by status hormones like testosterone and cortisol.
  • Sunlight and/or shortage of vitamin D causes acne. (I have anecdotal evidence that tanning reduces acne.)
  • Acne is caused by weird foods, such as dairy or sugar.
  • Acne is caused by excessive face-washing, which screws up homeostatic processes controlling the amount of oil and water on the face.
  • Acne is due to evolutionary inertia: after our ancestors became hairless, they didn't have enough time or enough evolutionary pressure to evolve to excrete less oil on overly oily areas.

Several of these would also explain the nerd-acne connection. Or, that connection could go the other way round, because acne could cause people to stay inside, have lower status, etc.

Replies from: ChristianKl
comment by ChristianKl · 2014-03-05T11:22:31.397Z · LW(p) · GW(p)

If this were true, we would expect that other things that decrease blood flow in the face (such as cold weather, maybe?) would also increase acne.

Cold weather might reduce your blood flow for a few hours but it will come back to normal once you are again in a warm environment.

If I were to run an experiment, I would attempt to measure how tense the muscles in the face happen to be and how warm the skin happens to be, and see whether those ratings correlate with the amount of acne.

This is very interesting; is this a significant cause of acne, and if so how do you know?

It's a working theory of myself at the moment.

The background is that there are techniques in the hypnosis realm for resolving "trapped emotions" which often work to help solve physical ailments.

My perception of the bodies of other people is also getting better, and I'm getting better at perceiving when a certain part of the person I'm interacting with is colder and tenser than it should be.

Acne is caused by excessive face-washing, which screws up homeostatic processes controlling the amount of oil and water on the face.

As far as self reports go, some people improve their acne by washing their faces less and others by washing it more.

Acne is a defense mechanism employed when the body detects that one is low-status. That is, it's a way of making yourself less threatening to the rest of the tribe so that they won't slaughter you. If true, this could be mediated by status hormones like testosterone and cortisol.

Hunter-gatherer tribes have nearly no acne. I would be wary of an explanation that focuses on the utility of getting acne in a hunter-gatherer tribe to explain the acne we have in Western civilisation.

comment by Lumifer · 2014-03-27T19:02:59.641Z · LW(p) · GW(p)

Women who don't wear makeup are much less attractive, which significantly reduces their social status and their dating market value. These are things people greatly value.

This is true for certain subcultures. It is NOT true for other subcultures. And this is, of course, before we go into individual differences -- some girls are very pretty with a freshly-scrubbed look.

comment by therufs · 2014-03-27T18:07:38.144Z · LW(p) · GW(p)

Women who don't wear makeup are much less attractive, which significantly reduces their social status and their dating market value.

This crossed my mind as well, but for me spending 3-5 minutes on makeup in the morning is enough to make a substantial difference in my appearance. One has probably reached a point of diminishing returns by the time one makes it to 91 hours per year.

comment by polymathwannabe · 2014-02-24T19:38:46.754Z · LW(p) · GW(p)

The best means to stop a Hitler would be to show the actual, ugly truth of where he'll lead us. Very few lies about Hitler could match the real horror.

Replies from: Nornagest, James_Miller
comment by Nornagest · 2014-02-24T20:03:13.198Z · LW(p) · GW(p)

To credibly show the truth. Claims of Hitler-equivalent societal doom are a dime a dozen. Almost all of them are false.

Replies from: James_Miller
comment by James_Miller · 2014-02-24T20:39:12.407Z · LW(p) · GW(p)

Almost all isn't that reassuring given the scope of the potential harm. Hitler democratically acquired power in an advanced civilized Western Christian nation while being fairly open about his terminal values. Fear of this pattern repeating is worth continually emphasizing.

Replies from: Nornagest, ChristianKl
comment by Nornagest · 2014-02-24T21:09:14.847Z · LW(p) · GW(p)

The analogy isn't effective (outside the ingroup where it originates) unless it's credible; throwing it around in situations where it isn't in no way guards against the possibility of a recurrence of Nazism, or one of its less famous but often equally nasty companions in 20th-century totalitarianism. In fact, I'd say it's probably actively detrimental, as it makes the accusation less punchy when and if we do start seeing a totalizing popular movement that openly preaches extreme prejudice against an unpopular group of scapegoats.

That's not to say that these kinds of mass movements aren't worth studying or analogies to modern movements can't be made; they absolutely are and can. But crying Nazi without commensurately serious justification can only cheapen the term once everyone catches on. Who cares about having one more political slur?

comment by ChristianKl · 2014-02-25T14:44:43.782Z · LW(p) · GW(p)

Hitler democratically acquired power in an advanced civilized Western Christian nation while being fairly open about his terminal values.

I think it's somewhere in Sun Tzu's Art of War. Often things are well hidden in plain sight.

Hitler's biggest advantage was that nobody took him seriously.

Replies from: James_Miller
comment by James_Miller · 2014-02-25T15:14:49.936Z · LW(p) · GW(p)

And yet the German military didn't overthrow Hitler when he started messing up military strategy in Russia.

Replies from: ChristianKl, roystgnr
comment by ChristianKl · 2014-02-25T15:21:50.422Z · LW(p) · GW(p)

By that time Hitler had put people he trusted into central positions of military power. Everybody whom Hitler considered untrustworthy had already been removed from power.

Nobody succeeded in running a coup against him, but people did try, as on the 20th of July. The military didn't follow Hitler's orders on subjects such as burning bridges in Germany.

comment by roystgnr · 2014-02-25T21:26:24.541Z · LW(p) · GW(p)

A few tried, even specifically operating under the theory that the failures in Russia would make a post-assassination coup politically possible, in Operation Spark.

I don't think this much affects your point, though; by the time a sufficiently evil person and/or group is in power, there doesn't seem to be any shortage of political and psychological mechanisms they can use to entrench there.

comment by James_Miller · 2014-02-24T20:33:31.653Z · LW(p) · GW(p)

In a world with rational voters, yes. In our world you might want to start a false rumor, such as that Hitler's Jew-hating is just a cover for his true desire to reduce social welfare payments.

Replies from: ChristianKl, polymathwannabe
comment by ChristianKl · 2014-02-25T14:59:14.653Z · LW(p) · GW(p)

In our world you might want to start a false rumor, such as that Hitler's Jew-hating is just a cover for his true desire to reduce social welfare payments.

That rumor wouldn't spread. It's too complicated to be a good story, believable to the average person of that time period. I think Bruce Sterling's novel Distraction is quite brilliant at illustrating how such principles work.

Replies from: James_Miller
comment by James_Miller · 2014-02-25T15:17:28.320Z · LW(p) · GW(p)

I was making an analogy to Bill Clinton's false claim that Bob Dole wanted to cut medical benefits to senior citizens. When confronted with his lie by Dole, Clinton reportedly said "You gotta do what you gotta do."

Replies from: ChristianKl
comment by ChristianKl · 2014-02-25T15:25:21.085Z · LW(p) · GW(p)

It's confusing to talk about the history of the 1930s using examples that come from the 1990s without marking them as such.

It prevents you from learning the historical lessons that the 1930s do provide.

comment by polymathwannabe · 2014-02-24T20:37:17.318Z · LW(p) · GW(p)

In Tea-Party constituencies, that'd be an argument in his favor.

Replies from: James_Miller
comment by James_Miller · 2014-02-24T20:51:37.758Z · LW(p) · GW(p)

No, smarter voters would see the purpose of the lie and vote against Hitler. (As a tea party person, I'm disassociating myself from Hitler.)

Replies from: polymathwannabe
comment by polymathwannabe · 2014-02-24T20:55:10.480Z · LW(p) · GW(p)

The Tea Party would probably support a candidate who they had reason to think wants to cut down welfare programs, even if there are some unnerving rumors about him.

comment by Shmi (shminux) · 2014-02-24T09:02:02.209Z · LW(p) · GW(p)

Scott's examples have a fair amount of selection bias. If you take Chile, Russia, NK, or Zimbabwe, those who play dirty prevail. However, I agree that building a walled garden and making it attractive to join is a far better strategy whenever feasible.

Replies from: None, ChristianKl
comment by [deleted] · 2014-02-24T10:41:17.976Z · LW(p) · GW(p)

This is class bias, though. Some people are not in a position to create and live in a walled garden, either in real life or on the internet, especially in places without an internet connection. Sure, it's great for the kind of people who use LessWrong: educated, white, middle-class, mainly male (though I know there are several women who post regularly), obviously with internet access and lots of free time to write intense blog posts.

I suppose you may have cancelled this out with "whenever feasible", but I also suspect the average LessWrong poster would not be a good judge of what's feasible.

I've put a lot of thought into the politics of anger and I see value and problems on both sides. Arthur Chu appears to have expressed a much more intense version than I've seen in say, Feministe or Atheism+. Although one Feministe commentator expressed her belief that she would be morally correct to murder or burn down the house of people doing things which don't justify such a response in my personal opinion.

I've never come across the idea of deception in precisely the way that Scott framed Arthur Chu's comments. Is there a link to the actual comment made by Chu?

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2014-02-24T11:53:07.762Z · LW(p) · GW(p)

Is there a link to the actual comment made by Chu?

Yes.

This is class bias though. Some people are not in a position to create and live in a walled garden, either in real life or on the internet, especially in places without an internet connection.

Sucks to be them. You seem to be arguing something I've seen elsewhere, and which for the sake of definiteness I'll state in the strongest form.

No-one ever achieves anything through their own efforts. All "success" is due to privilege, which is oppression. So-called self-help is privilege used as an excuse to blame the unprivileged for their situation, which situation is in reality due to oppression by the privileged. Showing people what they can do for themselves is disempowerment. Blaming their troubles on the forces of privilege is empowerment. Individual action is vice. Collective action is virtue.

Do your thoughts point in that direction?

I've put a lot of thought into the politics of anger and I see value and problems on both sides. Arthur Chu appears to have expressed a much more intense version than I've seen in say, Feministe or Atheism+. Although one Feministe commentator expressed her belief that she would be morally correct to murder or burn down the house of people doing things which don't justify such a response in my personal opinion.

"A lot of thought" is not a phrase that leaps to my mind on reading that. Does any of this seeing value and problems, and having personal beliefs and opinions lead to anything but more words on blogs? I'm glad you don't think murder and arson should be freely employed, but what does it mean to say that judgement is your personal opinion? That it doesn't matter?

Replies from: None
comment by [deleted] · 2014-02-24T12:34:38.112Z · LW(p) · GW(p)

That paragraph sounds awful. No, I don't think that. I'll be lazy and point to John Scalzi I guess: http://whatever.scalzi.com/2012/05/15/straight-white-male-the-lowest-difficulty-setting-there-is/

I don't think that individual advice is useless. I'm skeptical that certain people are giving useful advice. Useful here involves a criterion of novelty. Giving someone advice they have heard 1000 times is not helpful. I guess necessary but not sufficient is a good description of personal effort in this context.

Working three jobs doesn't leave a lot of time to get educated and then use that education to post large amounts of philosophical text on a Reddit like rationalist site. And being poor and black in the real world isn't the optimal condition for creating meat space walled gardens.

As far as individual action, its my opinion that individual charity is helpful but not sufficient. And often comes with coercion I don't approve of. Religious charity would be a good example here. Strings attached? More like ropes.

For the second thing you quoted, I wouldn't say that I am the most productive person in helping the less fortunate. Although that's somewhat for psychological and/or financial reasons. This is orthogonal to the efficacy of certain strategies to promote social change. Just because I don't turn on the faucet doesn't mean that if I did water wouldn't come out.

Replies from: ikrase
comment by ikrase · 2014-03-01T12:25:17.694Z · LW(p) · GW(p)

I think that the answer to this problem is that it will simply be necessary for class oppression to be ended, then.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2014-03-01T19:58:04.216Z · LW(p) · GW(p)

Could you taboo "oppression"? SJ types (and Marxists in general) love throwing that word around, but I've never seen a coherent definition beyond connoting something they disapprove of.

comment by ChristianKl · 2014-02-24T10:34:57.188Z · LW(p) · GW(p)

How do you know? Because the news you read about Chile, Russia, NK and Zimbabwe tells you so?

comment by Lumifer · 2014-02-24T16:34:07.954Z · LW(p) · GW(p)

It is rather amusing to treat this exchange as a debate on consequentialism.

Replies from: None
comment by [deleted] · 2014-02-24T20:03:50.913Z · LW(p) · GW(p)

It looks like all the participants are consequentialists in good standing. The argument is over whose model of the world more accurately predicts consequences.

Replies from: shokwave
comment by shokwave · 2014-02-26T16:00:30.326Z · LW(p) · GW(p)

As I mentioned on Slate Star Codex as well: if you let consequentialists predict the second-order consequences of their actions, they strike violence and deceit off the list of useful tactics, in much the same way that a consequentialist doctor doesn't slaughter the healthy traveler for organ transplants to save five patients -- the consequentialist doctor knows that destroying trust in the medical establishment is a worse consequence.

comment by buybuydandavis · 2014-02-24T11:00:22.458Z · LW(p) · GW(p)

Nice. This is another confirmation of something that's becoming increasingly apparent to me, and it raises the same issue I've been thinking about.

I'm of the rationalist libertarian persuasion. We value truth, honesty, and a lack of coercion in human interaction. When you argue, you argue honestly. You don't lie, you admit when the other side scores points, etc. Politically, you respect the freedom and equal rights of others, and don't use force to violate those rights. But we live in a world of people who do not share those values. By our lights, these people are engaged in war against us. Yet we don't respond in kind. We're in a war, we're being shot at, but we never shoot back.

For all the talk of guns and self defense, rationalist libertarians are basically social pacifists. You could add the nerd persuasion to the list of pacifisms - in this case pacifism in the near politics of negative sum social status wars.

All this pacifism, from people who generally think it's moronic when actual bullets are involved. It's rather peculiar.

Whatever we might think of the best way for humans to interact, surely that is contextual, and the fact that another person is shooting at you should count as a relevant part of that context. If we never make such people pay a price in violating what we consider the ceasefire of the war of all against all, why would we ever expect them to stop?

Libertarians in the US are fond of the quote (variously attributed) that "The price of liberty is eternal vigilance". Vigilance is just the start. The true price is making violators of that liberty pay.

It always makes me happy when my ideological opponents come out and say eloquently and openly what I’ve always secretly suspected them of believing.

It's nice to get those occasional windows into an alien mind. Probably worthwhile to go to various ideological walled gardens on the internet disguised as a native.

Replies from: Viliam_Bur, Lumifer
comment by Viliam_Bur · 2014-02-24T16:40:27.135Z · LW(p) · GW(p)

The fact that we don't shoot each other literally and verbally is one thing that allows a website like LessWrong to exist.

The alternative would be splitting the website into dozen subsites: More Right, More Left, More Free, More Feminist, More Vegetarian, etc., which I suspect wouldn't remain rational for too long, although some of them might keep the word "rationality" as their local applause light.

Would that improve the world? My first guess is that these diverse websites would mostly cancel each other out, so the net result would be zero. As for the impact on their personal lives, the participants would probably spend less time studying and more time inventing smart-sounding political arguments. Which other big parts of the internet are already doing, so they would be just another drop in the ocean.

Replies from: buybuydandavis
comment by buybuydandavis · 2014-02-24T20:26:10.095Z · LW(p) · GW(p)

Yes, here, where there is a sizable libertarian contingent, libertarians and progressives manage to be civil. And from my rationalist libertarian perspective, that's a good thing, and I wouldn't want it to change.

I don't think it's good to initiate force. I think peaceful truces are good things.

But I wasn't addressing the situation at LW; I was addressing the broader context, where rationalist libertarians are taking bullets but not returning fire. I consider pacifism a losing strategy. A better strategy, IMO, is some kind of proportionate tit for tat. But the first step is to realize that pacifism is the current strategy, and that it's probably a losing one.
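The "proportionate tit for tat" point can be sketched with a toy iterated prisoner's dilemma (my own illustration, using the standard payoff matrix; none of these names come from the thread). Against an unconditional defector, the pacifist is exploited every round, while tit-for-tat limits its losses to the first round:

```python
# Standard prisoner's dilemma payoffs: (my move, their move) -> my score.
# C = cooperate, D = defect.
PAYOFF = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def play(strategy_a, strategy_b, rounds=10):
    """Run an iterated game; each strategy sees the opponent's last move."""
    score_a = score_b = 0
    last_a = last_b = None
    for _ in range(rounds):
        move_a = strategy_a(last_b)
        move_b = strategy_b(last_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        last_a, last_b = move_a, move_b
    return score_a, score_b

pacifist = lambda opp_last: "C"          # always cooperate, no matter what
defector = lambda opp_last: "D"          # always defect
tit_for_tat = lambda opp_last: opp_last or "C"  # start nice, then echo opponent

pacifist_score, _ = play(pacifist, defector)
tft_score, _ = play(tit_for_tat, defector)
print(pacifist_score, tft_score)  # -> 0 9
```

Over ten rounds the always-cooperator scores 0 to the defector's 50, while tit-for-tat scores 9 to the defector's 14: it pays a one-round cost and then stops being exploitable, which is the sense in which pacifism is the losing strategy here.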

I'd say the same with Nerd near social pacifism.

"The great are great only because we are on our knees: Let us rise."

comment by Lumifer · 2014-02-24T18:35:30.626Z · LW(p) · GW(p)

For all the talk of guns and self defense, rationalist libertarians are basically social pacifists.

Well, two points come to mind.

First, libertarians are by definition social pacifists, if "social pacifism" is defined as refusal to use coercion to propagate your own memes and values.

Second, rational libertarians who happen to be upper-middle-class college kids living in big coastal cities -- these might be social pacifists. But I bet I can find some pretty rational, pretty libertarian guys somewhere in Wyoming, and they won't be pacifist at all.

Replies from: buybuydandavis
comment by buybuydandavis · 2014-02-24T18:57:50.722Z · LW(p) · GW(p)

Libertarians are supposed to refrain from initiating force, while pacifists refrain from using any force. That's theoretically the distinction between pacifists and libertarians. In practice politically, very little difference.

As for the boys in Wyoming, I don't think they're much better. Maybe worse. Sure, if you show up with an actual gun and shoot at them, they're likely to shoot back. But for all the huffy talk about how government initiates force against them, what do they actually do to retaliate? At least City Beta Boy Snowden actually did something.

If all their "eternal vigilance" amounts to is bitching and moaning when their liberty is infringed, what good are they?

Replies from: Lumifer
comment by Lumifer · 2014-02-24T19:18:58.621Z · LW(p) · GW(p)

Remember the context -- we're talking about persuasion in the social setting, about meme and value propagation, basically. In this context "pacifism" means "tolerance" in the sense of "you don't believe the same things as I do and that's fine".

Replies from: buybuydandavis
comment by buybuydandavis · 2014-02-24T20:12:23.356Z · LW(p) · GW(p)

And my point was that not only in that context, but in other contexts as well, rationalist libertarians are pacifists until the bullets flying at them are actual bullets.

Replies from: Lumifer
comment by Lumifer · 2014-02-24T21:18:00.997Z · LW(p) · GW(p)

not only in that context, but in other contexts as well, rationalist libertarians are pacifists until the bullets flying at them are actual bullets.

I don't believe this to be true. At least according to my understanding of rationalist libertarians.

comment by rocurley · 2014-02-24T04:35:44.332Z · LW(p) · GW(p)

Scott just responded here, with a number of points relevant to the topic of rationalist communities.

I would assume there was supposed to be a link there?

Replies from: Vladimir_Nesov, seez
comment by Vladimir_Nesov · 2014-02-24T08:31:38.005Z · LW(p) · GW(p)

Fixed.

comment by seez · 2014-02-24T06:54:17.385Z · LW(p) · GW(p)

Link here.

Replies from: komponisto
comment by komponisto · 2014-02-24T06:55:48.779Z · LW(p) · GW(p)

...and on the sidebar ("Recent on Rationality Blogs")....

comment by ChristianKl · 2014-02-24T10:31:14.609Z · LW(p) · GW(p)

Obviously, at some point being polite in our arguments is silly.

I think you seldom convince someone to change his opinion by name calling.

I once went to a talk about the implications of neurology for economics. Unfortunately for the professor who gave it, there was a badly dressed conspiracy theorist in the audience who was upset about the professor providing a new way to justify the economic status quo. That talk would have benefited from throwing the conspiracy theorist out instead of being nice to him. The reason isn't that the conspiracy theorist was dangerous, but that he wasted the valuable time of a talk about neurology and economics.

Most situations however aren't like that. While I would have wanted the person thrown out of that talk I would happily talk with someone who holds the same position face to face or at a forum like LW.

I personally don't attempt to manipulate people against their will, but if I wanted to, dishonesty wouldn't be the most straightforward way. If you interact with a person that way, you bring up resistance.

If you create rapport, you can ask them whether they are happy with how their life is going, and hit much deeper. That means both being able to do real psychological damage that goes deeper than a cheap insult, and having a lever that's strong enough to change opinions.

Replies from: None
comment by [deleted] · 2014-02-24T23:03:38.789Z · LW(p) · GW(p)

You probably won't convince anyone, but you can probably discourage uncommitted/future people from taking the scorned position.

Replies from: Nornagest
comment by Nornagest · 2014-02-24T23:16:54.101Z · LW(p) · GW(p)

The problem with using rhetoric to push people off the fence is that it's pretty hard to tell which way they'll fall.

comment by ikrase · 2014-03-01T12:35:44.989Z · LW(p) · GW(p)

I think that a conception of heroic morality (basically, whether or not to use TDT, or choosing between act and rule utilitarianism) may be at the heart of many of the choices to be cooperative/nice or not. Many people seem to assume that they should always play the hero, and those more virtuous ones who don't seem to think that you would never be able to play the hero.

As an example, consider assassinating Hitler. It's not clear how Hitler could retaliate in kind -- he is already killing people who disagree with him, and he is a single tyrant while you are an invisible individual. This does not apply, however, if you are in equal factions, say Fascists and Communists.

comment by Sieben · 2014-02-26T00:12:15.807Z · LW(p) · GW(p)

I don't understand all the consequentialist arguments against playing dirty. If your only objections are practical, then you're open to subtle dirty maneuvers that have very high payoffs.

A really simple example of this would be to ignore articulate opponents and spend most of your energy publicly destroying the opposition's overzealous lowest-common-denominators. This is actually how most of politics works...

... and also how this conversation seems to be working, since the Scott Alexander side seems more intent on arguing through hyperbole than addressing the actual spirit of what is being suggested. A simple example could be to deliberately misinterpret what the other side is saying, and force them to clarify themselves ad nauseam until they run out of energy, "conceding" the issue by default.

AND JUST LIKE THAT WE'RE BACK TO THE IRONY

Replies from: Viliam_Bur
comment by Viliam_Bur · 2014-02-26T12:03:24.492Z · LW(p) · GW(p)

Addressing the most stupid of the opposition's arguments is not an enlightened way of discussion, but it's still far better than manufacturing and widely spreading false statistics.

If the other side played equally dirty, we would see articles like: "Did you know that 95% of violent crimes are committed by Social Justice Warriors?" or "Woman is most likely to get raped at the feminist meeting (therefore, ladies, you should avoid those meetings, and preferably try to ban them at your campus)".

[EDIT: After some thought, removed a realistic example of a specific form of attack against a specific person, because that kind of thing should not appear in LW discussions. Just leaving a hint: Imagine how a successful support for a false statistics could be used to design an ironic revenge at the very person who supported it.]

I hope this sufficiently illustrates that the belief that the other side already is fighting as dirty as they can, and you cannot give them ideas by fighting dirty yourself, is completely false.

Replies from: Sieben
comment by Sieben · 2014-02-26T14:48:52.691Z · LW(p) · GW(p)

Addressing the most stupid of the opposition's arguments is not an enlightened way of discussion, but it's still far better than manufacturing and widely spreading false statistics.

You seem to be confused. Both of the things you mentioned are examples of "playing dirty".

If the other side played equally dirty, we would see articles like: "Did you know that 95% of violent crimes are committed by Social Justice Warriors?" or "Woman is most likely to get raped at the feminist meeting (therefore, ladies, you should avoid those meetings, and preferably try to ban them at your campus)".

But this is a very stupid way to play dirty because it is transparent and can backfire. Making a public example of the other side's inarticulate idiots is extremely unlikely to backfire.

Just leaving a hint: Imagine how a successful support for a false statistics could be used to design an ironic revenge at the very person who supported it.]

Just a hint: If you are using consequentialist arguments against playing dirty, then you are open to playing dirty if you can be shown it works. I submit to you that you have a failure of imagination.

I hope this sufficiently illustrates that the belief that the other side already is fighting as dirty as they can, and you cannot give them ideas by fighting dirty yourself, is completely false.

Strategic mimicry is not one of my arguments. You seem to be arguing with someone else. Regardless, see the "consequentialist" point above.

Replies from: Sieben
comment by Sieben · 2014-02-26T15:04:11.644Z · LW(p) · GW(p)

Simple examples of playing dirty:

  • Someone links a URL but it is broken in an obvious way. If you were truly interested in arguing for the sake of argument, you could fix the URL and go to their link. But you could also take the opportunity to complain that they are just wasting your time and aren't really serious.

  • Sometimes, there is a finite amount of time or space for your opponents to reply to you in. You can pick arguments whose articulation is economic, but whose rebuttal is not. This puts a huge volumetric burden on them such that they will be unlikely to be able to reply to all your points. Later you can point out that they "ignored many of your best arguments". This is an old debater's trick.

  • You're going to have a live debate online for a public audience. 45 minutes beforehand, you receive an e-mail from your opponent indicating that they are having difficulty connecting to Skype and suggest the debate be moved to Omegle. You can play nice and get the debate to happen, or you can pretend that you didn't see the e-mail in time and then gloat that your opponent didn't show up because of "technical difficulties" har har har.

  • Abuse the last word. If you're in the final stretch of a debate, bring up new issues that your opponent cannot address because they are out of time. This technique is actually heavily penalized in high school debate competitions, but people get away with it regularly because adults are more biased than teenagers.

comment by Sieben · 2014-02-26T00:13:53.629Z · LW(p) · GW(p)

Just think about how much more persuasive fighting dirty sounds if the whole fate of the human race hangs in the balance. As is, there is an underlying assumption that we have infinite time to grind down our opposition with passive logical superiority.

Replies from: Salemicus
comment by Salemicus · 2014-02-27T00:21:32.376Z · LW(p) · GW(p)

If the fate of the whole human race hangs in the balance, then it is particularly important that the correct decision is taken, not just the one most driven by tribal feeling, loose rhetoric, etc. Therefore it is particularly important that we are able to evaluate all ideas as accurately as we can, and particularly important not to spread lies, etc.

Of course, if you assume going in that your ideas are infallible, then fighting dirty can look appealing. But if the fate of the human race hangs in the balance, then you can afford the luxury of that assumption.

Replies from: Sieben
comment by Sieben · 2014-02-28T00:31:41.518Z · LW(p) · GW(p)

Therefore it is particularly important that we are able to evaluate all ideas as accurately as we can, and particularly important not to spread lies, etc.

Okay, so, we don't know what the right answer is. But we know what the right answer ISN'T, right? We know that Westboro Baptist Church isn't going to lead the human race into a new golden age. Why not try to limit their influence?

And even if there were some seemingly bad ideas that could, through some twist, actually be good ideas, there are still nonzero costs to considering them. If there is a 0.00001% chance an idea is "the answer", but a 99.99999% chance it wastes everyone's time and makes some people angry, we should probably discard it. Why waste time when we can pursue the handful of ideas that have a much higher chance of improving the world?

But if the fate of the human race hangs in the balance, then you can afford the luxury of that assumption.

I'm going to assume you meant that you can't afford the luxury of that assumption, and actually yes I can. In fact, I have no choice. I have a finite amount of computational power and if I go through all possible permutations of ideas then the probability of me coming out with The Right Answer becomes vanishingly small. Instead, I can apply some very defensible heuristics to write off huge sections of thought wholesale. I should focus my efforts on ideas that are not obviously wrong.
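The time-allocation argument here is just an expected-value threshold; a toy sketch (my own illustration, with made-up numbers -- nothing below comes from the thread):

```python
# Engage an idea only if its expected payoff beats the certain cost of
# everyone's time spent considering it.

def worth_engaging(p_correct, payoff_if_correct, cost_of_consideration):
    """Return True when the expected benefit exceeds the engagement cost."""
    return p_correct * payoff_if_correct > cost_of_consideration

# An idea with a 0.00001% (= 1e-7) chance of being "the answer", even with
# a huge payoff, fails the filter: expected benefit is only 0.01.
print(worth_engaging(1e-7, 1e5, 1.0))   # -> False

# A merely promising idea clears the bar easily: expected benefit is 50.
print(worth_engaging(0.05, 1e3, 1.0))   # -> True
```

The heuristic is only as good as the probability estimates fed into it, which is exactly where the two sides of this exchange disagree.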

Replies from: Eugine_Nier
comment by Eugine_Nier · 2014-02-28T04:42:25.744Z · LW(p) · GW(p)

Okay, so, we don't know what the right answer is. But we know what the right answer ISN'T, right?

See Yvain's post on Schelling Fences on Slippery Slopes.

Like if there is a 0.00001% chance it is "the answer", but a 99.99999% chance to waste everyone's time and making some people angry, we should probably discard it. Why waste time when we can pursue that handful of ideas that have a much higher chance of improving the world?

You do realize that most people have the same opinion about the Singularity?

Replies from: Sieben, ikrase
comment by Sieben · 2014-03-04T18:35:31.923Z · LW(p) · GW(p)

See Yvain's post on Schelling Fences on Slippery Slopes.

This is not a blanket reason to defend all ideologies against censorship. The analysis of many religions also implicitly assumes that there is no cost to tolerating competing religions, whereas there is a definite cost to hearing out many of the worst political ideologies.

It's almost as if the slippery slope works both ways. If you can't filter anything, your energy is drained by a thousand paper cuts.

You do realize that most people have the same opinion about the Singularity?

I wasn't aware that the general public was angry at Singularity nerds. I was talking more about, say, teenage neo-Nazis: extremely high probability of contributing nothing, pissing a bunch of people off, and wasting everyone's time.

comment by ikrase · 2014-03-01T12:28:18.022Z · LW(p) · GW(p)

In the case of the Singularity, I'd say that most people don't consider probability and very large payoffs.

comment by Ilverin the Stupid and Offensive (Ilverin) · 2014-02-24T17:25:53.598Z · LW(p) · GW(p)

"How dire [do] the real world consequences have to be before it's worthwhile debating dishonestly"?

M̶y̶ ̶b̶r̶i̶e̶f̶ ̶a̶n̶s̶w̶e̶r̶ ̶i̶s̶:̶

One lower bound is:

If the amount that rationality affects humanity and the universe is decreasing over the long term. (Note that if humanity is destroyed, the amount that rationality affects the universe probably decreases).

T̶h̶i̶s̶ ̶i̶s̶ ̶a̶l̶s̶o̶ ̶m̶y̶ ̶a̶n̶s̶w̶e̶r̶ ̶t̶o̶ ̶t̶h̶e̶ ̶q̶u̶e̶s̶t̶i̶o̶n̶ ̶"̶w̶h̶a̶t̶ ̶i̶s̶ ̶w̶i̶n̶n̶i̶n̶g̶ ̶f̶o̶r̶ ̶t̶h̶e̶ ̶r̶a̶t̶i̶o̶n̶a̶l̶i̶s̶t̶ ̶c̶o̶m̶m̶u̶n̶i̶t̶y̶"̶?̶

R̶a̶t̶i̶o̶n̶a̶l̶i̶t̶y̶ ̶i̶s̶ ̶w̶i̶n̶n̶i̶n̶g̶ ̶i̶f̶,̶ ̶o̶v̶e̶r̶ ̶t̶h̶e̶ ̶l̶o̶n̶g̶ ̶t̶e̶r̶m̶,̶ ̶r̶a̶t̶i̶o̶n̶a̶l̶i̶t̶y̶ ̶i̶n̶c̶r̶e̶a̶s̶i̶n̶g̶l̶y̶ ̶a̶f̶f̶e̶c̶t̶s̶ ̶h̶u̶m̶a̶n̶i̶t̶y̶ ̶a̶n̶d̶ ̶t̶h̶e̶ ̶u̶n̶i̶v̶e̶r̶s̶e̶.̶

Replies from: Mestroyer
comment by Mestroyer · 2014-02-25T19:07:39.843Z · LW(p) · GW(p)

Downvoted for the fake utility function.

"I won't let the world be destroyed because then rationality can't influence the future" is an attempt to avoid weighing your love of rationality against anything else.

Think about it. Is it really that rationality isn't in control any more that bugs you, not everyone dying, or the astronomical number of worthwhile lives that will never be lived?

If humanity dies to a paperclip maximizer, which goes on to spread copies of itself through the universe to oversee paperclip production, each of those copies being rational beyond what any human can achieve, is that okay with you?

Replies from: Ilverin
comment by Ilverin the Stupid and Offensive (Ilverin) · 2014-02-26T16:54:49.689Z · LW(p) · GW(p)

Thank you, I initially wrote my function with the idea of making it one (of many) "lower bound"(s) of how bad things could possibly get before debating dishonestly becomes necessary. Later, I mistakenly thought that "this works fine as a general theory, not just a lower bound".

Thank you for helping me think more clearly.