You have a set amount of "weirdness points". Spend them wisely.

post by Peter Wildeford (peter_hurford) · 2014-11-27T21:09:22.804Z · LW · GW · Legacy · 98 comments

I've heard of the concept of "weirdness points" many times before, but after a bit of searching I can't find a definitive post describing the concept, so I've decided to make one.  As a disclaimer, I don't think the evidence backing this post is all that strong and I am skeptical, but I do think it's strong enough to be worth considering, and I'm probably going to make some minor life changes based on it.

-

Chances are that if you're reading this post, you're probably a bit weird in some way.

No offense, of course.  In fact, I actually mean it as a compliment.  Weirdness is incredibly important.  If people weren't willing to deviate from society and hold weird beliefs, we wouldn't have had the important social movements that ended slavery and pushed back against racism, that created democracy, that expanded social roles for women, and that made the world a better place in numerous other ways.

Many things we now take for granted as reasons our current society is great were once... weird.

 

Joseph Overton theorized that policy develops through six stages: unthinkable, then radical, then acceptable, then sensible, then popular, then actual policy.  We can see this happening with many policies -- currently same-sex marriage is making its way from popular to actual policy, but not too long ago it was merely acceptable, and not too long before that it was pretty radical.

Some good ideas are currently in the radical range.  Effective altruism itself is one such collection of beliefs that typical people would consider pretty radical.  Many people think donating 3% of their income is a lot, let alone the 10% that Giving What We Can asks for, or the 50%+ that some people in the community donate.

And that's not all.  Others would suggest that everyone become vegetarian, or advocate for open borders and/or universal basic income, the abolishment of gendered language, putting more resources into mitigating existential risk, research into Friendly AI, cryonics, curing death, etc.

While many of these ideas might make the world a better place if made into policy, all of these ideas are pretty weird.

 

Weirdness, of course, is a drawback.  People take weird opinions less seriously.

The absurdity heuristic is a real bias that people -- even you -- have.  If an idea sounds weird to you, you're less likely to try to believe it, even if there's overwhelming evidence.  And social proof matters -- if fewer people believe something, others will be less likely to believe it.  Lastly, don't forget the halo effect -- if one part of you seems weird, the rest of you will seem weird too!

(Update: apparently this concept is, itself, already known to social psychology as idiosyncrasy credits.  Thanks, Mr. Commenter!)

...But we can use this knowledge to our advantage.  The halo effect can work in reverse -- if we're normal in many ways, our weird beliefs will seem more normal too.  If we have a notion of weirdness as a kind of currency that we have a limited supply of, we can spend it wisely, without looking like a crank.

 

All of this leads to the following actionable principles:

Recognize you only have a few "weirdness points" to spend.  Trying to convince all your friends to donate 50% of their income to MIRI, become a vegan, get a cryonics plan, and demand open borders will be met with a lot of resistance.  But -- I hypothesize -- if you pick just one of these ideas and push it, you'll have a lot more success.

Spend your weirdness points effectively.  Perhaps it's really important that people advocate for open borders.  But, perhaps, getting people to donate to developing world health would overall do more good.  In that case, I'd focus on moving donations to the developing world and leave open borders alone, even though it is really important.  You should triage your weirdness effectively the same way you would triage your donations.

Clean up and look good.  Lookism is a problem in society, and I wish people could look "weird" and still be socially acceptable.  But if you're a guy wearing a dress in public, or some punk-rocker vegan advocate, recognize that you're spending your weirdness points fighting lookism, which means fewer weirdness points left to spend promoting veganism or something else.

Advocate for more "normal" policies that are almost as good.   Of course, allocating your "weirdness points" on a few issues doesn't mean you have to stop advocating for other important issues -- just consider being less weird about it.  Perhaps universal basic income truly would be a very effective policy to help the poor in the United States.  But reforming the earned income tax credit and relaxing zoning laws would also both do a lot to help the poor in the US, and such suggestions aren't weird.

Use the foot-in-door technique and the door-in-face technique.  The foot-in-door technique involves starting with a small ask and gradually building it up, such as suggesting people donate a little bit effectively and then gradually getting them to take the Giving What We Can Pledge.  The door-in-face technique involves making a big ask (e.g., join Giving What We Can) and then retreating to a smaller ask, like the Life You Can Save pledge or Try Out Giving.

Reconsider effective altruism's clustering of beliefs.  Right now, effective altruism is associated strongly with donating a lot of money and donating effectively, and less strongly with impact in career choice, veganism, and existential risk.  Of course, I'm not saying that we should drop some of these memes completely.  But maybe EA should disconnect a bit more and compartmentalize -- for example, leaving AI risk to MIRI and not talking about it much on, say, 80,000 Hours.  And maybe instead of asking people to both give more AND give more effectively, we could focus more exclusively on asking people to donate what they already do more effectively.

Evaluate the above with more research.  While I think the evidence base behind this is decent, it's not great and I haven't spent that much time developing it.  I think we should look into this more with a review of the relevant literature and some careful, targeted, market research on the individual beliefs within effective altruism (how weird are they?) and how they should be connected or left disconnected.  Maybe this has already been done some?

-

Also discussed on the EA Forum and EA Facebook group.

98 comments


comment by complexmeme · 2014-11-29T05:25:52.512Z · LW(p) · GW(p)

after a bit of searching I can't find a definitive post describing the concept

The idiom used to describe that concept in social psychology is "idiosyncrasy credits", so searching for that phrase produces more relevant material (though as far as I can tell nothing on Less Wrong specifically).

Replies from: peter_hurford
comment by Peter Wildeford (peter_hurford) · 2014-11-29T17:35:38.679Z · LW(p) · GW(p)

Wow, that's amazing! Thanks!

comment by Salemicus · 2014-11-27T23:48:15.493Z · LW(p) · GW(p)

This post makes some great points. As G.K. Chesterton said:

A man must be orthodox upon most things, or he will never even have time to preach his own heresy.

Fundamentally, other people's attention is a scarce resource, and you have to optimise whatever use of it you can get. Dealing with someone with a large inferential gap can be exhausting and you are liable to be tuned out if you make too many different radical points.

I would also add that part of being persuasive is being persuadable. People do not want to be lectured, and will quickly pick up if you see them as just an audience to be manipulated rather than as equals.

Replies from: Metus
comment by Metus · 2014-11-29T19:29:48.844Z · LW(p) · GW(p)

Personally, I find that people who do plenty of things weirdly come off as trying way too hard. Depending on your environment, choose one thing that you care about and do that. I wouldn't be caught dead at a formal dinner in washed-out jeans and a dirty t-shirt, but I would be willing to experiment with neck pieces different from formal ties.

comment by Kaj_Sotala · 2014-11-28T10:23:58.135Z · LW(p) · GW(p)

I agree with the general gist of the post, but I would point out that different groups consider different things weird, and have differing opinions about which kinds of weirdness are a bad thing.

To use your "a guy wearing a dress in public" example - I do this occasionally, and gauging from the reactions I've seen so far, it seems to earn me points among the liberal, socially progressive crowd. My general opinions and values are such that this is the group that would already be the most likely to listen to me, while the people who are turned off by such a thing would be disinclined to listen to me anyway.

I would thus suggest not trying to limit your weirdness across the board, but rather choosing a target audience and limiting only the kind of weirdness that this group would consider freakish or negative, while being less concerned about the kind of weirdness that your target audience considers positive. Weirdness that's considered positive by your target audience may even help your case.

Replies from: Philip_W, Metus, None
comment by Philip_W · 2015-02-09T00:24:15.143Z · LW(p) · GW(p)

I think I might have been a datapoint in your assessment here, so I feel the need to share my thoughts on this. I would consider myself socially progressive and liberal, and I would hate not being included in your target audience, but for me your wearing cat ears to the CFAR workshop cost you weirdness points that you later earned back by appearing smart and sane in conversations, by acceptance by the peer group, acclimatisation, etc.

I responded positively because it fell within the 'quirky and interesting' range, but I don't think I would have taken you as seriously on subjectively weird political or social opinions. It is true that the cat ears are probably a lot less expensive for me than cultural/political out-group weirdness signals, like a military haircut. It might be a good way to buy other points, so positive overall, but that depends on the circumstances.

Replies from: Kaj_Sotala
comment by Kaj_Sotala · 2015-02-09T08:30:58.438Z · LW(p) · GW(p)

Thank you! I appreciate the datapoint.

comment by Metus · 2014-11-29T19:27:16.511Z · LW(p) · GW(p)

To make this picture a bit more colourful: I love suits; they look great on me. But I will be damned if I wear suits to university, because people will laugh at me and not take me seriously, since to the untrained eye all suits are business suits. On the other hand, hanging around in a coffee place at any odd time of the day is completely normal to the same group.

Contrast this with the average person working in an environment where they wear a suit: there, the suit could help me signal that I am on their side, and being in a coffee place at any odd time would then become the cause I'd need to win acceptance for.

The lesson then is to pick the tribe you are in, as you will know its norms best and adhere to them anyway, and then a cause that will produce the most utility within that tribe. It just so happens that there is the extremely large tribe "the public", which sometimes leads people to ignore that they can also influence other really big tribes -- Europeans, the British, Londoners, and then the members of their borough, to divide by region.

comment by [deleted] · 2014-11-28T15:44:31.101Z · LW(p) · GW(p)

This carries the slight problem that people tend to get offended when they realize you're explicitly catering to an audience. If I talked about the plight of the poor and meritocracy to liberals and about responsibility and family to conservatives, advocating the exact same position to each, and then each group found out about the speech I gave to the other, they would both start thinking of me as a duplicitous snake. They might start yelling about "Eli Sennesh's conspiracy to pass a basic income guarantee" or something like that: my policy would seem "eviler" for being able to be upheld from seemingly disjoint perspectives.

Replies from: Kaj_Sotala
comment by Kaj_Sotala · 2014-11-28T16:35:51.895Z · LW(p) · GW(p)

Right, I wouldn't advocate having contradictory presentations, but rather choosing a target audience that best fits your personality and strengths, and then sticking to that.

comment by plex (ete) · 2014-11-28T00:58:22.553Z · LW(p) · GW(p)

I believe the effect you describe exists, but I think there are two effects which make it unclear whether implementing your suggestions is an overall benefit to the average reader. Firstly, to summarize your position:

Each extra weird belief you have detracts from your ability to spread other, perhaps more important, weird memes. Therefore normal beliefs should be preferred to some extent, even when you expect them to be less correct or less locally useful on an issue, in order to improve your overall effectiveness at spreading your most highly valued memes.

  1. If you have a cluster of beliefs which seem odd in general, then you are more likely to share a "bridge" belief with someone. When you meet someone who shares at least one strange belief with you, you are much more likely to seriously consider their other beliefs, because you share some common ground and are aware of their ability to find truth against social pressure. For example, an EA vegan may be vastly more able to introduce the other EA memes to a non-EA vegan than an EA non-vegan would be. Since almost all people have at least some weird beliefs, and those who have weird beliefs with literally no overlap with yours are likely not good targets for you to spread positive memes to, increasing your collection of useful and justifiable weird memes may well give you more opportunities to usefully spread the memes you consider most important.

  2. Losing the absolute focus on forming an accurate map by making concessions to popularity/not standing out in too many ways seems epistemologically risky and borderline dark arts. I do agree that in some situations not advertising all your weirdness at once may be a useful strategic choice, but I am very wary of the effect putting too much focus on this could have on your actual beliefs. You don't want to strengthen your own absurdity heuristic by accident and miss out on more weird but correct and important things.

While I can imagine situations where the advice given is correct (especially for interacting with domain-limited policymakers, or with people whose likely reactions to extra weirdness you have a good read on), recommending it in general seems insufficiently justified, and I believe it would have significant drawbacks.

Replies from: Raiden
comment by Raiden · 2014-11-28T04:44:56.295Z · LW(p) · GW(p)

Regarding point 2, while it would be epistemologically risky and borderline dark arts, I think the idea is more about what to emphasize and openly signal, not what to actually believe.

Replies from: ete
comment by plex (ete) · 2014-11-28T13:30:38.059Z · LW(p) · GW(p)

True, perhaps I should have been more clear in my dealing with the two, and explained how I think they can blur together unintentionally. I do think being selective with signals can be instrumentally effective, but I think it's important to be intentionally aware when you're doing that and not allow your current mask to bleed over and influence your true beliefs unduly.

Essentially I'd like this post to come with a "Do this sometimes, but be careful and mindful of the possible changes to your beliefs caused by signaling as if you have different beliefs." warning.

Replies from: Nepene
comment by Nepene · 2014-11-30T04:34:24.498Z · LW(p) · GW(p)

There is a real likelihood that acting out a belief will cause you to believe it, due to your brain poorly distinguishing between signalling and true beliefs.

That can be advantageous at times. Some beliefs may be less important to you, and worthy of being sacrificed for the greater good. If you, say, believe that forcing people to wear suits is immoral and that eating animals is immoral, then it may be worth sacrificing your belief in the unethical nature of suits so you can better stop people eating animals.

A willingness to do this is beneficial for most people who want to join organizations. Organizations normally have a set of arbitrary rules on social conduct, dress, who to respect and who to respect less, how to deal with sickness and weakness, what media to watch, and who to escalate issues to in the event of a conflict. If you don't go along with these rules you'll find it tricky to gain much power, because people can spot those who fake these things.

Replies from: Capla
comment by Capla · 2014-12-09T19:12:02.428Z · LW(p) · GW(p)

Some beliefs may be less important to you, and worthy of being sacrificed for the greater good. If you, say, believe that forcing people to wear suits is immoral and that eating animals is immoral, then it may be worth sacrificing your belief in the unethical nature of suits so you can better stop people eating animals.

No. I will make concessions about which beliefs to act on in order to optimize for "Goodness", but I'm highly concerned about sacrificing beliefs about the world themselves. Doing this may be beneficial in a specific situation, but at a cost to your overall effectiveness in other situations across domains. Since the range of possible situations that you might find yourself in is infinite, there is no way to know whether you've made a change to your model with catastrophic consequences down the line. Furthermore, we evaluate the effectiveness of strategies on the basis of the model we have, so every time your model becomes less accurate, your estimate of what is the best option in a given situation becomes less accurate. (Note that your confidence in your estimate may rise, fall, or stay the same, but I doubt that having a less accurate model is going to lead to better credence calibration.)

Allowing your beliefs to change for any reason other than to better reflect the world, only serves to make you worse at knowing how best to deal with the world.

Now, changing your values - that's another story.

Replies from: Nepene
comment by Nepene · 2014-12-13T02:20:32.001Z · LW(p) · GW(p)

You can easily model beliefs and work out if they're likely to have good or bad results. They could theoretically have a variety of infinite impacts, but most probably have a fairly small and limited effect. Humans have lots of beliefs, they can't all have a major impact.

For the catastrophic consequences issue, have you read this?

http://lesswrong.com/lw/ase/schelling_fences_on_slippery_slopes/

The slippery slope issue of potentially catastrophic consequences from a model can be limited by establishing arbitrary lines beforehand that you refuse to cross. Whether you should sacrifice your beliefs, as with Gandhi, depends on what the value given for said sacrifice is, how valuable your sacrifice is to your models, and what the likelihood of catastrophic failure is. You can swear an oath not to cross those lines, or give valuable possessions to people to destroy if you cross those lines, so you can heavily limit the chance of catastrophic failure.

Allowing your beliefs to change for any reason other than to better reflect the world, only serves to make you worse at knowing how best to deal with the world.

Yeah, your success rate drops, but your ability to socialize can rise, since irrational beliefs are how many people think. If your irrational beliefs are of low importance, not likely to cause major issues, and unlikely to cause catastrophic failure, they could be helpful.

comment by kilobug · 2014-11-29T11:37:51.085Z · LW(p) · GW(p)

Interesting post (upvoted), but I would add one "correction": the amount of "weirdness points" isn't completely fixed; there are ways to get more of them, especially by being famous, doing something positive, or helping people. For example, by writing a very popular fanfiction (HPMOR), Eliezer earned additional weirdness points to spend.

Or on my own level, I noticed that by being efficient in my job and helpful with my workmates, I'm allowed a higher number of "weirdness points" before having my workmates start considering me as a loonie. But then you have to be very careful, because weirdness points earned within a group (say, my workmates) don't extend outside of the group.

Replies from: ike, peter_hurford
comment by ike · 2014-11-30T05:18:49.798Z · LW(p) · GW(p)

For example, by writing a very popular fanfiction (HPMOR)

For anyone who hasn't read HP and thinks fantasy is weird, he lost points for that.

One way to get more points is to listen to other people's weird ideas. In fact, if someone else proposes a weird idea that you already agree with, it may be a good idea not to let on, but publicly "get convinced", to gain points. (Does that count as Dark Arts?)

Replies from: dxu
comment by dxu · 2014-11-30T05:38:08.024Z · LW(p) · GW(p)

I have actually thought of that, but in relation to a different problem: not that of seeming less "weird", but that of convincing someone of an unpopular idea. It seems like the best way to convince people of something is to act like you're still in the process of being convinced yourself; for instance, I don't remember where, but I do remember reading an anecdote on how someone was able to convince his girlfriend of atheism while in a genuine crisis of faith himself. Incidentally, I should emphasize that his crisis of faith was genuine at the time--but it should work even if it's not genuine, as long as the facade is convincing. I theorize that this may be due to in-group affiliation, i.e. if you're already sure of something and trying to convince me, then you're an outsider pushing an agenda, but if you yourself are unsure and are coming to me for advice, you're on "my side", etc. It's easy to become entangled in just-so stories, so obviously take all of this speculation with a generous helping of salt, but it seems at least worth a try. (I do agree, however, that this seems borderline Dark Arts, so maybe not that great of an idea, especially if you value your relationship with that person enough to care if you're found out.)

Replies from: Richard_Kennaway, ChristianKl
comment by Richard_Kennaway · 2014-11-30T08:02:40.886Z · LW(p) · GW(p)

I should emphasize that his crisis of faith was genuine at the time--but it should work even if it's not genuine, as long as the facade is convincing.

This is called "concern trolling".

I do agree, however, that this seems borderline Dark Arts

It isn't "borderline Dark Arts", it's straight-out lying.

It should work ... as long as the facade is convincing

This imagines the plan working, and uses that as argument for the plan working.

Replies from: dxu
comment by dxu · 2014-11-30T17:11:51.825Z · LW(p) · GW(p)

This is called "concern trolling".

I was not aware that it had a name; thank you for telling me.

It isn't "borderline Dark Arts", it's straight-out lying.

Agreed. The question, however, is whether or not this is sometimes justified.

This imagines the plan working, and uses that as argument for the plan working.

Well, no. It assumes that the plan doesn't fall prey to an obvious failure mode, and suggests that if it does not, it has a high likelihood of success. (The idea being that if failure mode X is avoided, then the plan should work, so we should be careful to avoid failure mode X when/if enacting the plan.)

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2014-11-30T18:55:55.826Z · LW(p) · GW(p)

This imagines the plan working, and uses that as argument for the plan working.

Well, no. It assumes that the plan doesn't fall prey to an obvious failure mode

The failure mode (people detecting the lie) is what it would be for this plan to fail. It's like the empty sort of sports commentary that says "if our opponents don't get any more goals than us, we can't lose", or the marketing plan that amounts to "if we get just 0.001% of this huge market, we'll be rich."

See also. Lying is hard, and likely beyond the capability of anyone who has just discovered the idea "I know, why not just lie!"

Replies from: dxu
comment by dxu · 2014-11-30T19:16:21.356Z · LW(p) · GW(p)

That the plan would fail if the lie is detected is not under contest, I think. However, it is, in my opinion, a relatively trivial failure mode, where "trivial" is meant to be taken in the sense that it is obvious, not that it is necessarily easy to avoid. For instance, equations of the form a^n + b^n = c^n have trivial solutions in the form (a,b,c) = (0,0,0), but those are not interesting. My original statement was meant to be applied more as a disclaimer than anything else, i.e. "Well obviously this is an easy way for the plan to fail, but getting past that..." The reason for this was because there might be more intricate/subtle failure modes that I've not yet thought of, and my statement was intended more as an invitation to think of some of these less trivial failure modes than as an argument for the plan's success. This, incidentally, is why I think your analogies don't apply; the failure modes that you mention in those cases are so broad as to be considered blanket statements, which prevents the existence of more interesting failure modes. A better statement in your sports analogy, for example, might be, "Well, if our star player isn't sick, we stand a decent chance of winning," with the unstated implication being that of course there might be other complications independent of the star player being sick. (Unless, of course, you think the possibility of the lie being detected is the only failure mode, in which case I'd say you're being unrealistically optimistic.)

Also, it tends to be my experience that lies of omission are much easier to cover up than explicit lies, and the sort suggested in the original scenario seems closer to the former than to the latter. Any comments here?

(I also think that the main problem with lying from a moral perspective is not just that it causes epistemic inaccuracy on the part of the person being lied to, but that it causes inaccuracies in such a way that it interferes with them instrumentally. Lying omissively about one's mental state, which is unlikely to be instrumentally important anyway, in an attempt to improve the other person's epistemic accuracy with regard to the world around them, a far more instrumentally useful task, seems like it might actually be morally justifiable.)

Replies from: Lumifer
comment by Lumifer · 2014-11-30T23:56:03.622Z · LW(p) · GW(p)

Lying also does heavy damage to one's credibility. The binary classification of other people into "honest folk" and "liars" is quite widespread in the real world. Once you get classified into "liars", it's pretty hard to get out of there.

Replies from: dxu
comment by dxu · 2014-12-01T04:58:25.001Z · LW(p) · GW(p)

Well, you never actually say anything untrue; you're just acting uncertain in order to have a better chance of getting through to the other person. It seems intuitively plausible that the reputational effects from that might not be as bad as the reputational effects that would come from, say, straight-out lying; I accept that this may be untrue, but if it is, I'd want to know why. Moreover, all of this is contingent upon you being found out. In a scenario like this, is that really that likely? How is the other person going to confirm your mental state?

Replies from: Lumifer, Richard_Kennaway
comment by Lumifer · 2014-12-01T07:59:52.281Z · LW(p) · GW(p)

but if it is, I'd want to know why

YMMV, of course, but I think what matters is the intent to deceive. Once it manifests itself, the specific forms the deception takes do not matter much (though their "level" or magnitude does).

How is the other person going to confirm your mental state?

This is not a court of law, no proof required -- "it looks like" is often sufficient, if only for direct questions which will put you on the spot.

Replies from: dxu
comment by dxu · 2014-12-01T21:04:48.075Z · LW(p) · GW(p)

This is not a court of law, no proof required -- "it looks like" is often sufficient, if only for direct questions which will put you on the spot.

Well, yes, but are they really going to jump right to "it looks like" without any prior evidence? That seems like a major case of privileging the hypothesis. I mean, if you weren't already primed by this conversation, would you automatically think "They might be lying about being unconvinced" if someone starts saying something skeptical about, say, cryonics? The only way I could see that happening is if the other person lets something slip, and when the topic in question is your own mental state, it doesn't sound too hard to keep the fact that you already believe something concealed. It's just like passing the Ideological Turing Test, in a way.

Replies from: Lumifer
comment by Lumifer · 2014-12-01T21:25:21.404Z · LW(p) · GW(p)

but are they really going to jump right to "it looks like" without any prior evidence?

Humans, in particular neurotypical humans, are pretty good at picking up clues (e.g. nonverbal) that something in a social situation is not quite on the up-and-up. That doesn't necessarily rise to the conscious level of a verbalized thought "They might be lying...", but manifests itself as a discomfort and unease.

it doesn't sound too hard

It's certainly possible and is easy for a certain type of people. I expect it to be not so easy for a different type of people, like ones who tend to hang out at LW... You need not just conceal your mental state, you need to actively pretend to have a different mental state.

Replies from: dxu
comment by dxu · 2014-12-02T04:04:44.422Z · LW(p) · GW(p)

Fair enough. How about online discourse, then? I doubt you'd be able to pick up much nonverbal content there.

Replies from: Lumifer
comment by Lumifer · 2014-12-02T05:41:31.943Z · LW(p) · GW(p)

It is much easier to pretend online, but it's also harder to convince somebody of something.

Replies from: dxu
comment by dxu · 2014-12-04T04:41:28.610Z · LW(p) · GW(p)

Would you say the difficulty of convincing someone scales proportionally with the ease of pretending?

Replies from: Lumifer
comment by Lumifer · 2014-12-04T05:55:27.415Z · LW(p) · GW(p)

Hm. I don't know. I think it's true when comparing a face-to-face conversation with an online one, but I have no idea whether that can be extended to a general rule.

comment by Richard_Kennaway · 2014-12-01T12:29:43.331Z · LW(p) · GW(p)

Moreover, all of this is contingent upon you being found out. In a scenario like this, is that really that likely?

Yes. It is.

Replies from: dxu, milindsmart
comment by dxu · 2014-12-01T20:58:38.448Z · LW(p) · GW(p)

Yes. It is.

That's not very helpful, though. Could you go into specifics?

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2014-12-02T10:05:07.465Z · LW(p) · GW(p)

Moreover, all of this is contingent upon you being found out. In a scenario like this, is that really that likely?

Yes. It is.

That's not very helpful, though. Could you go into specifics?

In general, any argument for the success of a plan that sounds like "how likely is it that it could go wrong?" is a planning fallacy waiting to bite you.

Specifically, people can be quite good at detecting lies. On one theory, that's what we've evolved these huge brains for: an arms race of lying vs. detecting lies. If you lie as well as you possibly can, you're only keeping up with everyone else detecting lies as well as they can. On internet forums, I see concern trolls and fake friends being unmasked pretty quickly. Face to face, when person A tells me something about person B not present, I have sometimes had occasion to think, "ok, that's your story, but just how much do I actually believe it?", or "that was the most inept attempt to plant a rumour I've ever heard; I shall be sure to do exactly what you ask and not breathe a word of this to anyone, especially not to the people you're probably hoping I'll pass this on to." If it's a matter that does not much concern me, I won't even let person A know they've been rumbled.

In the present case, the result of being found out is not only that your relationship ends with the person whose religion you were trying to undermine, but they will think that an atheist tried to subvert their religion with lies, and they will be completely right. "As do all atheists", their co-religionists will be happy to tell them afterwards, in conversations you will not be present at.

Replies from: dxu
comment by dxu · 2014-12-04T05:50:43.148Z · LW(p) · GW(p)

On internet forums, I see concern trolls and fake friends being unmasked pretty quickly.

In what manner do you think it is most likely for this to occur?

Face to face, when person A tells me something about person B not present, I have sometimes had occasion to think, "ok, that's your story, but just how much do I actually believe it?", or "that was the most inept attempt to plant a rumour I've ever heard; I shall be sure to do exactly what you ask and not breathe a word of this to anyone, especially not to the people you're probably hoping I'll pass this on to." If it's a matter that does not much concern me, I won't even let person A know they've been rumbled.

If possible, could you outline some contributing factors that led to you spotting the lie?

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2014-12-04T10:32:14.525Z · LW(p) · GW(p)

If possible, could you outline some contributing factors that led to you spotting the lie?

That's a bit like asking how I recognise someone's face, or how I manage to walk in a straight line. Sometimes things just "sound a bit off", as one says, which of course is not an explanation, just a description of what it feels like. That brings to my attention the distinction between what has been said and whether it is true, and then I can consider what other ways there are of joining up the dots.

Of course, that possibility is always present when one person speaks to another, and having cultivated consciousness of abstraction, it requires little activation energy to engage. In fact, that's my default attitude whenever person A tells me anything negatively charged about B: not to immediately think "what a bad person B is!", although they may be, but "this is the story that A has told me; how much of it seems to me likely to be true?"

Replies from: dxu
comment by dxu · 2014-12-04T16:52:37.670Z · LW(p) · GW(p)

Well, based on that description, would I be accurate in saying that it seems as though your "method" would generate a lot of false positives?

Replies from: ChristianKl, Richard_Kennaway
comment by ChristianKl · 2014-12-07T12:23:19.088Z · LW(p) · GW(p)

Well, based on that description, would I be accurate in saying that it seems as though your "method" would generate a lot of false positives?

You can always trade off specificity for sensitivity. It is also possible to ask additional questions when you are suspicious.

comment by Richard_Kennaway · 2014-12-04T17:18:55.880Z · LW(p) · GW(p)

Suspending judgement is not a false positive. And even from such a limited interaction as seeing the name and subject line of an email, I am almost never wrong in detecting spam, and that's the spam that got past the automatic filters. I don't think I'm exceptional; people are good at this sort of thing.

My hobby: looking at the section of the sidebar called "Recent on rationality blogs", and predicting before mousing over the links whether the source is SlateStarCodex, Overcoming Bias, an EA blog, or other. I get above 90% there, and while "Donor coordination" is obviously an EA subject, I can't explain what makes "One in a Billion?" and "On Stossel Tonight" clearly OB titles, while "Framing for Light Instead of Heat" could only be SSC.

Replies from: gjm, dxu
comment by gjm · 2014-12-07T18:05:57.482Z · LW(p) · GW(p)

One in a Billion?

Deliberately uninformative title. Robin Hanson does this fairly often, Scott much less so. Very short, which is highly characteristic of OB. Very large number is suggestive of "large-scale" concerns, more characteristic of OB than of Scott. Nothing that obviously suggests EAism.

On Stossel Tonight

Self-promoting (RH frequently puts up things about his public appearances; other sidebarry folks don't). Very short. Assumes you know what "Stossel" is; if you don't this reads as "deliberately uninformative" (somewhat typical of OB), and if you do it reads as "right-wing and businessy connections" (very typical of OB).

(As you may gather, I share your hobby.)

comment by dxu · 2014-12-06T23:34:08.834Z · LW(p) · GW(p)

I don't think I'm exceptional; people are good at this sort of thing.

Huh. I must just be unusually stupid with respect to "this sort of thing", then, as I'm rarely able to discern a plausible-sounding lie from the truth based on nonverbal cues. (As a result, my compensation heuristic is "ignore any and all rumors, especially negative ones".) Ah, well. It looks like I implicitly committed the typical mind fallacy in assuming that everyone would have a similar level of difficulty as I do when detecting "off-ness".

My hobby: looking at the section of the sidebar called "Recent on rationality blogs", and predicting before mousing over the links whether the source is SlateStarCodex, Overcoming Bias, an EA blog, or other. I get above 90% there, and while "Donor coordination" is obviously an EA subject, I can't explain what makes "One in a Billion?" and "On Stossel Tonight" clearly OB titles, while "Framing for Light Instead of Heat" could only be SSC.

That sounds like an awesome hobby, and one that I feel like I should start trying. Would you say you've improved at doing this over time, or do you think your level of skill has remained relatively constant?

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2014-12-07T10:02:32.557Z · LW(p) · GW(p)

Would you say you've improved at doing this over time, or do you think your level of skill has remained relatively constant?

I couldn't really say. Back when I read OB, I'd often think, "Yes, that's a typical OB title", but of course I knew I was looking at OB. When the sidebar blogroll was introduced here, I realised that I could still tell the OB titles from the rest. The "X is not about Y" template is a giveaway, of course, but Hanson hasn't used that for some time. SSC tends to use more auxiliary words, OB leaves them out. Where Scott writes "Framing For Light Instead Of Heat", Hanson would have written "Light Not Heat", or perhaps "Light Or Heat?".

comment by milindsmart · 2015-01-09T18:35:28.936Z · LW(p) · GW(p)

It sounds like you're implying that most lies are easily found, and consequently, most unchallenged statements are truths.

That's really, really, really stretching my capacity to believe. Either you're unique in this ability, or you're also committing the typical mind fallacy, w.r.t. thinking all people are only as good at lying (at max) as you are at sniffing them out.

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2015-01-09T19:04:52.406Z · LW(p) · GW(p)

Emphasis added:

Moreover, all of this is contingent upon you being found out. In a scenario like this, is that really that likely?

Yes. It is.

It sounds like you're implying that most lies are easily found

In a scenario like this, i.e. pretending to be undergoing a deep crisis of faith in order to undermine someone else's. My observation is that in practice, concern trolling is rapidly found out, and the bigger the audience, the shorter the time to being nailed.

thinking all people are only as good at lying (at max) as you are at sniffing them out.

On the whole, people are as good at lying as, on the whole, people are at finding them out, because it's an arms race. Some will do better, some worse; anyone to whom the idea, "why not just lie!" has only just occurred is unlikely to be in the former class.

comment by ChristianKl · 2014-12-01T13:17:11.636Z · LW(p) · GW(p)

I should emphasize that his crisis of faith was genuine at the time--but it should work even if it's not genuine, as long as the facade is convincing.

Most people are not able to summon the kind of strength of emotions that come with a genuine crisis of faith via conscious choice. Pretending to have them might come off as creepy even if the other person can't exactly pinpoint what's wrong.

Replies from: dxu
comment by dxu · 2014-12-01T20:57:13.602Z · LW(p) · GW(p)

Fair enough. Are there any subjects about which there might not be as high an emotional backlash? Cryonics, maybe? Start off acting unconvinced and then visibly think about it over a period of time, coming to accept it later on. That doesn't seem like a lot of emotion is involved; it seems entirely intellectual, and the main factor against cryonics is the "weirdness factor", so if there's someone alongside you getting convinced, it might make it easier, especially due to conformity effects.

Replies from: ChristianKl
comment by ChristianKl · 2014-12-01T21:53:58.031Z · LW(p) · GW(p)

The topic of cryonics is about dealing with death. There's a lot of emotion involved for most people.

Replies from: dxu
comment by dxu · 2014-12-02T04:12:35.090Z · LW(p) · GW(p)

It's true that cryonics is about death, but I don't think that necessarily means there's "a lot of emotion involved". Most rejections of cryonics that I've seen seem to be pretty intellectual, actually; there's a bunch of things like cost-benefit analysis and probability estimates going on, etc. I personally think it's likely that there is some motivated cognition going on, but I don't think it's due to heavy emotions. As I said in my earlier comment, I think that the main factor against cryonics is the fact that it seems "weird", and therefore the people who are signed up for it also seem "weird". If that's the case, then it may be to the advantage of cryonics advocates to place themselves in the "normal" category first by acting skeptical of a crankish-sounding idea, before slowly getting "convinced". Compare that approach to the usual approach: "Hey, death sucks, wanna sign up to get your head frozen so you'll have a chance at getting thawed in the future?" Comparatively speaking, I think that the "usual" approach is significantly more likely to get you landed in the "crackpot" category.

Replies from: ChristianKl
comment by ChristianKl · 2014-12-02T12:05:14.068Z · LW(p) · GW(p)

Most rejections of cryonics that I've seen seem to be pretty intellectual, actually; there's a bunch of things like cost-benefit analysis and probability estimates going on, etc.

That's really not how most people make their decisions.

Compare that approach to the usual approach: "Hey, death sucks, wanna sign up to get your head frozen so you'll have a chance at getting thawed in the future?"

There are plenty of ways to tell someone about cryonics that don't involve a direct plea for them to take action.

Replies from: dxu
comment by dxu · 2014-12-04T17:05:35.109Z · LW(p) · GW(p)

That's really not how most people make their decisions.

Maybe it's not how most people make their decisions, but I have seen a significant number of people who do reject cryonics on a firmly intellectual basis, both online and in real life. I suppose you could argue that it's not their true rejection (in fact, it almost certainly isn't), but even so, that's evidence against heavy emotions playing a significant part in their decision process.

There are plenty of ways to tell someone about cryonics that don't involve a direct plea for them to take action.

Yes, but most of them still suffer from the "weirdness factor".

comment by Peter Wildeford (peter_hurford) · 2014-11-29T17:34:22.720Z · LW(p) · GW(p)

Or on my own level, I noticed that by being efficient in my job and helpful with my workmates, I'm allowed a higher number of "weirdness points" before having my workmates start considering me as a loonie.

Seems like another way you're taking advantage of a positive halo effect!

comment by ChristianKl · 2014-11-28T16:35:41.280Z · LW(p) · GW(p)

Nerds are very often too shy. They are not willing to go to the extreme. Radical feminism has a lot of influence on our society and plenty of members of that community don't hold back at all.

Bending your own views to avoid offending other people leads to being perceived as unconfident. It's not authentic. That's bad for movement building.

I think you are making a mistake if you treat the goal of every project as being about affecting public policy. Quite often you don't need a majority. It's much better to have a small group of strongly committed people than a large group that's only lukewarmly committed.

Mormons who spent 2 years doing their mission are extreme. Mormonism is growing really fast while less extreme Christian groups don't grow. Groups that advocate extreme positions give their members a feeling that they are special. They are not boring.

In the scarce attention economy of the 21st century being boring is one of the worst things you can do if you want to speak to a lot of people.

Replies from: Jiro, peter_hurford
comment by Jiro · 2014-11-29T18:43:47.693Z · LW(p) · GW(p)

Mormon missions are not primarily there to gain converts. They are there to force the Mormon to make a commitment of time and resources to Mormonism, so that the sunk costs psychologically tie him to the religion.

(Of course, it wasn't necessarily consciously designed for this purpose, but that doesn't prevent the purpose from being served.)

Replies from: ChristianKl
comment by ChristianKl · 2014-11-29T19:45:15.712Z · LW(p) · GW(p)

That's part of the point. If you want strong changes in society then you need to do movement building. That means you don't focus on outsiders but on strengthening the commitment inside the movement.

comment by Peter Wildeford (peter_hurford) · 2014-11-29T06:43:39.543Z · LW(p) · GW(p)

Mormons who spent 2 years doing their mission are extreme.

Though they're only really extreme about a few things -- their Mormonism and some personal restraint (e.g., no alcohol, etc.) that serves religious purposes. They're otherwise quite normal people.

And I think religious weirdness is one of the kinds of weirdness that people see past the most easily.

I'm not saying that one shouldn't try to be extreme, but that one should (if one aims at public advocacy) try to be extreme in only a few things.

Replies from: Capla, ChristianKl
comment by Capla · 2014-12-09T19:17:37.939Z · LW(p) · GW(p)

personal restraint (e.g., no alcohol, etc.)

It seems borderline-literally insane to me that "personal restraint" is "extreme" and marks one as a radical.

Replies from: TheOtherDave, Lumifer
comment by TheOtherDave · 2014-12-09T19:31:54.926Z · LW(p) · GW(p)

It's pretty common for groups to treat individual restraint in the context of group lack-of-restraint as a violation of group norms, though "radical" is rarely the word used. Does that seem insane to you more generally (and if so, can you say more about why)?

If not, I suspect "extreme" has multiple definitions in this discussion and would be best dropped in favor of more precise phrases.

Replies from: Capla
comment by Capla · 2014-12-09T20:39:58.454Z · LW(p) · GW(p)

It's pretty common for groups to treat individual restraint in the context of group lack-of-restraint as a violation of group norms, though "radical" is rarely the word used. Does that seem insane to you more generally (and if so, can you say more about why)?

Yes. That seems insane to me.

Self-restraint is applied self-control. It is a virtue and is something to be admired, so long as one is restraining oneself for some benefit, not needlessly (though personally, I have respect for all forms of restraint, even if they are needless, e.g. religiously motivated celibacy, in the same way I have respect for the courage of suicide bombers).

Is restraining from alcohol consumption a restraint without benefit? No. Alcohol is a poison that limits one's faculties in small amounts and has detrimental health effects in large doses.

A friend was sharing with me the other day that he doesn't like the culture of... I'm not sure what to call it... knowing overindulgence? He gave the example of the half-joking veneration of bacon as something that everyone loves and always wants more of, as if to say "I know it's unhealthy, but that's why we love it so much."

I hear people say, "I don't eat healthy food", and in the culture we live in that is an acceptable thing to say, where to me it sounds like an admission that you lack self-control -- but instead of acknowledging it as a problem and working on it, they gloss over it with a laugh.

I am a vegetarian. I once sat down for a meal with a friend and my sister. The friend asked my sister if she was a vegetarian. My sister said she wasn't. The friend said (again, half joking), "Good", as if vegetarianism is a character flaw: real people love meat. I confronted her about it later, and said that that bothered me. I know not everyone is a vegetarian, and it is each person's own choice to weigh the costs and benefits and decide for themselves, but there are many, many good reasons to practice some kind of meat restriction, from the ecological, to the moral, to simple health. I won't tolerate my friend acting as if not eating meat means there is something wrong with you.

It feels to me, and maybe I'm projecting, that not everyone is up for making hard choices*, but instead of owning up to that, we have built a culture that revels in overindulgence. The social pressure pushes in the wrong direction.

It's weird to not drink. It's weird to not eat meat. It's weird to put too much effort into staying healthy. It's weird to give a significant portion of your income to save lives. Those are just obviously (to me) the right things to do.

It seems to me that the way we treat smoking is about right. Mostly, we let smokers make their own choices, and don't hold those choices against them as individuals. However, there is also a social undercurrent of, "smoking is disgusting" or at least "smoking is stupid; if you don't smoke, don't start." There is a mild social pressure for people to stop smoking, as opposed to someone getting weird looks if they turn down a cigarette (the way I do now, if I turn down a cookie).

This is a subjective, semi-rant and I'm expressing my opinion. Consider this an elaboration on the off-hand comment above, and feel free to challenge me if I'm wrong.

  • I'm self-conscious about the fact that I'm implicitly saying that I'm strong enough to make those hard choices, but I'm saying it anyway.
Replies from: TheOtherDave
comment by TheOtherDave · 2014-12-09T21:20:51.891Z · LW(p) · GW(p)

Consider this an elaboration on the off-hand comment above,

(nods) Which is exactly what I asked for; thank you.

feel free to challenge me if I'm wrong

I think you're using a non-standard definition of "insane," but not an indefensible one.

comment by Lumifer · 2014-12-09T19:39:31.201Z · LW(p) · GW(p)

It seems borderline-literally insane to me that "personal restraint" is "extreme" and marks one as a radical.

Depends on what kind. The one that runs counter to the prevailing social norms does mark one as a radical.

You can treat incluses as people who practice "personal restraint" :-/

Replies from: Capla
comment by Capla · 2014-12-09T20:41:57.449Z · LW(p) · GW(p)

I think these fall under the group that I admire the way I admire the courage of suicide bombers. I admire the dedication, but I think they are insane for other reasons.

comment by ChristianKl · 2014-11-29T16:47:48.371Z · LW(p) · GW(p)

Mormon polygamy is not normal. Mormons donating 10% of their income also isn't normal. Mormonism has enough impact on a person that some Mormons can identify other Mormons.

And I think religious weirdness is one of the kinds of weirdness that people see past the most easily.

The thing that distinguishes religious weirdness is that it comes from a highly motivated place and isn't a random whim.

if one aims at public advocacy

I'm not exactly sure what you mean with "public advocacy".

Replies from: Adele_L, peter_hurford
comment by Adele_L · 2014-11-30T00:03:06.087Z · LW(p) · GW(p)

Mormons don't practice polygamy anymore, and they haven't for a long time (except for small 'unofficial' groups). Most Mormons I know feel pretty weird about it themselves.

comment by Peter Wildeford (peter_hurford) · 2014-11-29T17:47:00.410Z · LW(p) · GW(p)

Mormon polygamy is not normal. Mormons donating 10% of their income also isn't normal.

Good point. But, if I recall correctly, don't they go to some lengths not to talk about these things much?

-

The thing that distinguishes religious weirdness is that it comes from a highly motivated place and isn't a random whim.

I don't think it's just a highly motivated place, but rather a highly motivated place that other people can easily verify as highly motivated and relate to.

-

I'm not exactly sure what you mean with "public advocacy".

Bringing up an ingroup idea with people outside your ingroup.

For example, I'd love it if people ate less meat. So I might bring that up with people, as the topic arises, and advocate for it (i.e., tell them why I think not eating meat is better). I still envision it as a two-way discussion where I'm open to the idea of being wrong, but I'd like them to be less affected by certain biases (like weirdness) if possible.

Replies from: ChristianKl
comment by ChristianKl · 2014-11-29T20:14:05.717Z · LW(p) · GW(p)

I don't think a conversation at a birthday of a friend qualifies as "public" in the traditional sense.

So I might bring that up with people, as the topic arises, and advocate for it (i.e., tell them why I think not eating meat is better).

I think that's seldom the most straightforward way to change people through personal conversation. It makes much more sense to ask a lot of questions and target your communication at the other person.

Status also matters. Sometimes doing something weird lowers your status; other times it raises it. It always makes sense to look at the individual situation.

Replies from: peter_hurford
comment by Peter Wildeford (peter_hurford) · 2014-11-29T22:12:53.265Z · LW(p) · GW(p)

I don't think a conversation at a birthday of a friend qualifies as "public" in the traditional sense.

What did you have in mind? I think this advice applies even more so to "public" venues in the traditional sense (e.g., blogging for general audiences).

comment by Brian_Tomasik · 2014-12-13T07:17:15.264Z · LW(p) · GW(p)

Thanks, Peter. :) I agree about appearing normal when the issue is trivial. I'm not convinced about minimizing weirdness on important topics. Some counter-considerations:

  • People like Nick Bostrom seem to acquire prestige by taking on many controversial ideas at once. If Bostrom's only schtick were anthropic bias, he probably wouldn't have reached FP's top 100 thinkers.
  • Focusing on only one controversial issue may make you appear single-minded, like "Oh, that guy only cares about X and can't see that Y and Z are also important topics."
  • If you advocate many things, people can choose the one they agree with most or find easiest to do.
comment by someonewrongonthenet · 2014-11-29T20:49:49.175Z · LW(p) · GW(p)

Clean up and look good. Lookism is a problem in society, and I wish people could look "weird" and still be socially acceptable. But if you're a guy wearing a dress in public, or some punk-rocker vegan advocate, recognize that you're spending your weirdness points fighting lookism, which means fewer weirdness points left to spend promoting veganism or something else.

Caveat - if people already know you are well liked and popular, the weirdness actually functions as counter-signalling, which makes you more popular - similar to how teasing strengthens friendships. You're signalling, "look, I'm so well liked that I can afford to be weird." If you're surrounded by chattering friends, are a straight-A student, or are the most skilled person in your field, people see the fact that you go out in your pajamas, suffer from Einstein hair, or are covered in tattoos as a sign that you are unconcerned about social status, which in turn raises your status.

This is something I learned by initially not caring due to social ineptitude, and then slowly starting to care with age... and then noticing that caring about appearance strengthened my reception among total strangers, but not caring about appearance strengthened my reception among friends and among strangers who could plainly see that I had tons of friends.

In another situation, if I was doing poorly in class, dressing like a slob made me look bad... but if I was an impeccable student, it came off as an eccentric-genius sort of thing. Weirdness seems to just magnify whatever you're already seen as.

I think this applies more to neutral stuff like appearance than to deep stuff like beliefs - but it does generally hold true for weirdness in social behavior. Even with ideas, though - I suspect the ideas that made Kurzweil sound kooky would make Einstein seem even more visionary.

comment by 27chaos · 2014-11-27T22:05:52.917Z · LW(p) · GW(p)

Notions of weirdness vary a lot. Also, individual instances of weirdness will be visible to different people. Both of these challenge the idea that we should bother having an aggregated measurement of weirdness at all. People's sensitivity to weirdness also varies, sometimes in complicated ways. Some people are actually more receptive to ideas that sound weird. Other people will believe that if someone is both successful and weird, they must know something others don't. Others are willing to ignore weirdness when they're allied with it. This is all very complex.

I think our social brains already do a good job of keeping track of all these important details. Trying to consciously tally different traits, beliefs, and actions into an aggregated weirdness score seems like a recipe for anxiety and disaster to me. I don't think it's worth worrying about. Inauthentic strategies like this are too hard and unpleasant to sustain.

comment by Lumifer · 2014-11-29T04:38:07.171Z · LW(p) · GW(p)

It would be helpful to point out that your post is within the context of trying to convince other people, aka memetic warfare. Your "actionable principles" serve a specific goal which you do not identify.

Replies from: peter_hurford
comment by Peter Wildeford (peter_hurford) · 2014-11-29T06:44:27.979Z · LW(p) · GW(p)

Fair point. I concede I'm only writing here in the context of public advocacy.

comment by Daniel Campagnoli (daniel-campagnoli) · 2020-02-05T01:19:23.682Z · LW(p) · GW(p)

This reminds me of a quote by George Bernard Shaw:

“The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.”

comment by ilzolende · 2014-11-28T01:26:23.191Z · LW(p) · GW(p)

I think it's important to consider the varying exchange rates, as well as the possible exchange options, when choosing how to spend your weirdness points.

Real example: Like enough other people on this website to make it an even less representative sample of the general population, I'm autistic, so spending a weirdness point on being openly autistic is useful, not because it's the best possible way to promote disability rights, but rather because I can save a lot of willpower that I need for other tasks that way.

Fake example: The Exemplars, a band popular with the teenage set, are rationalists who want to promote a political cause. However, none of the causes they care about are standard causes for bands. As teenagers are generally more interested in the end of the world than, for example, cryonics, they decide to sing about x-risk on their East Examplestan tour. This still makes them look just about as weird as if they sang about cryonics, but teenagers are more interested in x-risk, so they get better results. (Until the Church of Driving 50 In A 30 Mile Per Hour Zone gets them convicted of blasphemy, that is.)

comment by jwray · 2022-01-16T03:27:06.570Z · LW(p) · GW(p)

This seems like a subset of point #7 here (https://slatestarcodex.com/2016/02/20/writing-advice/)

7. Figure out who you’re trying to convince, then use the right tribal signals

I would define weirdness as emitting signals that the tribe recognizes as "other" but not "enemy".  Emitting enough of the in-group signals may counteract that.

This is also reminiscent of John Gottman's empirical research on married couples, where he found that they were much more likely to split if the ratio of positive to negative interactions was less than 5 to 1.

comment by dspeyer · 2014-11-28T15:56:21.474Z · LW(p) · GW(p)

Have we worked through the game theory here? It feels like negotiating with terrorists.

Replies from: Strange7
comment by Strange7 · 2014-11-28T16:34:16.656Z · LW(p) · GW(p)

My objection is to the 'set amount.' What about the Bunny Ears Lawyer trope, where someone purchases additional weirdness points with a track record of outstanding competence?

Replies from: VAuroch
comment by VAuroch · 2014-12-01T05:56:00.741Z · LW(p) · GW(p)

The Bunny Ears Lawyer trope is one of those that never show up in real life.

Replies from: Lumifer, dxu
comment by Lumifer · 2014-12-01T20:16:24.186Z · LW(p) · GW(p)

Sure they do; you just need to look in the right places. Speaking of lawyers, one place would be the tax department of a top-tier New York law firm. Another place would be sysadmins of small companies.

comment by dxu · 2014-12-01T06:00:17.770Z · LW(p) · GW(p)

Not to the same extent, maybe, but I was under the impression that it does occur. I'm not willing to go check right now due to memetic hazard, but doesn't TV Tropes have a page full of real life examples?

Replies from: VAuroch
comment by VAuroch · 2014-12-01T20:07:15.827Z · LW(p) · GW(p)

Nope. No Real Life entry for that trope exists.

Replies from: ike
comment by ike · 2014-12-01T20:22:05.586Z · LW(p) · GW(p)

It appears it used to, as it is referenced in the text, and it can be found on a fork of TV Tropes here.

Replies from: arundelo
comment by arundelo · 2014-12-02T15:40:54.830Z · LW(p) · GW(p)

My favorite example from that page is Paul Erdős, who spent his life couch-surfing from one mathematical collaborator to the next.

comment by Will_Lugar · 2015-04-16T01:16:07.375Z · LW(p) · GW(p)

Ozy Frantz wrote a thoughtful response to the idea of weirdness points. Not necessarily disagreeing, but pointing out serious limitations in the idea. Peter Hurford, I think you'll appreciate their insights whether you agree or not.

https://thingofthings.wordpress.com/2015/04/14/on-weird-points/

comment by [deleted] · 2014-11-28T15:41:22.229Z · LW(p) · GW(p)

Weirdness is incredibly important. If people weren't willing to deviate from society and hold weird beliefs, we wouldn't have had the important social movements that ended slavery and pushed back against racism, that created democracy, that expanded social roles for women, and that made the world a better place in numerous other ways.

Well, there is that. But there's also just the fact that being weird is what makes people interesting and fun, the sort of people I want to hang out with.

While many of these ideas might make the world a better place if made into policy, all of these ideas are pretty weird.

I think we need another adjective for the category of ideas that make people uncomfortable due to being substantially morally superior to the status quo. (And I don't even do all those things or believe all those particular ideas are good ones.)

...But we can use this knowledge to our advantage. The halo effect can work in reverse -- if we're normal in many ways, our weird beliefs will seem more normal too. If we have a notion of weirdness as a kind of currency that we have a limited supply of, we can spend it wisely, without looking like a crank.

Well, when propagandizing, yes.

But maybe EA should disconnect a bit more and compartmentalize -- for example, leaving AI risk to MIRI, for example, and not talk about it much, say, on 80,000 Hours.

Yes, this is a very good example of allocating scarce weirdness points in one's propaganda.

comment by hawkice · 2014-11-28T02:48:23.496Z · LW(p) · GW(p)

It might be worth emphasizing the difference between persuading people and being right. The kind of people who care about weirdness points are seldom the ones contributing good new data to any question of fact, nor those posing the best reasoning for judgments of value. I appreciate the impulse to try to convince people of things, but convincing people is extremely hard. I'm not Noam Chomsky; therefore, I have other things to do aside from thinking and arguing with people. And if I have to do one of those two worse in order to save time, I choose to dump the 'convince people' stat and load up on clear thinking.

Replies from: Unknowns
comment by Unknowns · 2014-11-28T04:01:32.103Z · LW(p) · GW(p)

Weirdness points are evidence of being wrong, since someone who holds positions different from everyone else's on almost every point is probably being willfully contrarian. So people who care about truth will also care about weirdness points; if someone is too weird (e.g., Time Cube), it is probably not worth your time listening to them.

Replies from: dxu
comment by dxu · 2014-11-28T06:27:11.891Z · LW(p) · GW(p)

I at least somewhat disagree with this. Weirdness is not a reliable measure of truth; in fact, I'd argue that it may even slightly anti-correlate with truth (but only slightly--it's not like it anti-correlates so well that you'd be able to get a good picture of reality out of it by reversing it, mind you). After all, not every change is an improvement, but every improvement is a change. Every position that seems like common sense to us nowadays was once considered "weird" or "unusual". So yeah, dismissing positions on the basis of weirdness alone doesn't seem like that great of an idea; see the absurdity heuristic for further details.

Also, people often have reasons for discrediting things outside of striving for epistemic accuracy. Good people/causes can often be cast in a bad light by anyone who doesn't like them; for instance*, RationalWiki's article on Eliezer makes him so weird-sounding as to be absolutely cringeworthy to anyone who actually knows him, and yet plenty of people might read it and be turned off by the claims, just like they're turned off from stuff like Time Cube.

*It is not my intention to start a flame-war or to cause a thread derailment by bringing up RW. I am aware that this sort of thing happens semi-frequently on LW, which is why I am stating my intentions here in advance. I would ask that anyone replying to this comment not stray too far from the main point, and in particular please do not bring up any RW vendettas. I am not a moderator, so obviously I have no power to enforce this request, but I do think that my request would prevent any derailment or hostility if acceded to. Thank you.

Replies from: hawkice
comment by hawkice · 2014-11-29T00:20:02.989Z · LW(p) · GW(p)

I think all three of us are right and secretly all agree.

(1) that weirdness points are Bayesian evidence of being wrong (surely Time Cube doesn't seem more accurate because no one believes it). Normal stuff is wrong quite a lot, but not more wrong than guessing.

(2) weirdness points can never give you enough certainty to dismiss an issue completely. Time Cube is wrong because it is Time Cube (read: insane ramblings), not because it's unpopular. Of course we don't have a duty to research all unlikely things, but if we already are thinking about it, "it's weird" isn't a good/rational place to stop, unless you want to just do something else, like eat a banana or go to the park or something.

and, critically, (3) if you don't have enough evidence to completely swamp and replace the Bayesian update from weirdness points, you really don't have enough evidence to contribute a whole lot to any search for truth (see the toy sketch below). That's what I was getting at. It's also pretty unlikely that the weirdness that "weirdness points" refer to would be unknown to someone you're talking with.
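Purely as an illustration of this exchange, here is a minimal sketch in Python with entirely made-up numbers; the prior odds and the likelihood ratios are hypothetical, not drawn from any data. It just shows the odds form of Bayes' rule: weirdness supplies a mild update against a claim, and even moderately strong object-level evidence swamps it.

```python
# Toy sketch: combining a weak "weirdness" update with object-level evidence.
# All numbers below are made up for illustration only.

def posterior_odds(prior_odds, *likelihood_ratios):
    """Odds form of Bayes' rule: multiply prior odds by each likelihood ratio."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

prior = 1.0         # hypothetical: the claim starts at 1:1 odds of being true
weirdness_lr = 0.5  # hypothetical: sounding weird makes a claim somewhat less likely to be true
evidence_lr = 20.0  # hypothetical: object-level evidence strongly favors the claim

odds = posterior_odds(prior, weirdness_lr, evidence_lr)
prob = odds / (1 + odds)
print(f"posterior odds: {odds:.1f}, probability: {prob:.2f}")  # 10.0, 0.91

# The weirdness penalty (x0.5) is real but small next to the evidence (x20),
# which is point (3): evidence that can't swamp the weirdness update probably
# wasn't going to move the search for truth much anyway.
```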

comment by Arkanj3l · 2014-12-04T02:33:55.508Z · LW(p) · GW(p)

Weirdness is a scarce resource with respect to ourselves? Great! Does that mean that we'd benefit from cooperating such that we all take on different facets of the weirder whole, like different faces of a PR operation?

Replies from: Nornagest
comment by Nornagest · 2014-12-04T03:32:36.063Z · LW(p) · GW(p)

People tend to model organizations as agents, and I expect weirdness in an org's public-facing representatives would be more salient than normality. That implies that representatives' weirdness would be taken as cumulative rather than exclusive.

So, no.

comment by [deleted] · 2015-08-03T07:03:14.371Z · LW(p) · GW(p)

Lots I agree with here. I was surprised to see basic income in your clustering above. As much as I think the Cubans are the ones doing socialism wrong, and that everyone doing socialism less, like Venezuela, isn't socialist enough, I'm right-wing and mindkilled enough to have rejected basic income using general right-wing arguments and assumptions, until I read the consistently positive examples on the Wikipedia page. The straw that broke the camel's back was that there is right-wing support for basic income. That being said, I'm confident that I would pass ideological Turing tests.

Replies from: Jiro, LawChan
comment by Jiro · 2015-08-03T15:07:15.114Z · LW(p) · GW(p)

It is generally a bad idea to change your views based on a Wikipedia page. Particularly a Wikipedia page on a politically charged subject. What you see may only mean that nobody happened to stop by the page who was willing to add the negative examples.

Also, be careful that you don't read the article as saying more than it is actually saying. It says that "several people" on the right supported it. Great, at least two, and both of them from far enough in the past that "right wing" doesn't mean what it means today.

Replies from: Good_Burning_Plastic
comment by Good_Burning_Plastic · 2015-08-03T20:48:49.434Z · LW(p) · GW(p)

It is generally a bad idea to change your views based on a Wikipedia page.

Depends on how much you knew about the topic to begin with.

comment by LawrenceC (LawChan) · 2015-08-03T15:43:41.338Z · LW(p) · GW(p)

That being said, I'm confident that I would pass ideological turing tests.

Cool! You can try taking them here: http://blacker.caltech.edu/itt/

comment by 27chaos · 2014-11-27T22:00:41.491Z · LW(p) · GW(p)

I agree that people who want to influence others should avoid having others discount their opinions. I don't see what your analysis here offers beyond noticing that, plus the simple fact that people generally discount the opinions of weirdos, though. Notions of weirdness vary a lot. Also, individual instances of weirdness will be visible to different people. Both of these strain the idea of having any aggregated measurement of weirdness at all. People's sensitivity to weirdness also varies, sometimes in complicated ways. Some people are actually more receptive to ideas that sound weird. Other people will believe that if someone is both successful and weird, they must know something others don't.

I think our social brains already do a good job of keeping track of all these important details. Trying to consciously score different traits, beliefs, and actions for weirdness seems like a recipe for anxiety and disaster to me. I don't think it's worth worrying about. Inauthentic strategies like this are too hard and unpleasant to sustain.

comment by Matthew Smith (matthew-smith) · 2020-02-04T20:52:07.547Z · LW(p) · GW(p)

Trevino's degrees of acceptance, not Overton's window

comment by Jiro · 2014-11-29T18:46:49.433Z · LW(p) · GW(p)

Use the foot-in-door technique and the door-in-face technique

Using tactics intentionally designed to appeal to people's biases is dark arts. If you try these, you completely deserve having rationalists tell you "Sorry, I've been trying to remove my biases, not encourage them. Go away until you can be more honest."