Can you gain weirdness points?

post by Nicholas / Heather Kross (NicholasKross) · 2020-07-31T03:41:47.050Z · LW · GW · 2 comments

This is a question post.


A useful idea I've been looking into more lately is "weirdness points". Basically, some good ideas are really unusual. People are often biased against unusual ideas [LW · GW]. It's often seen as easier to fight for one weird thing than for multiple weird things. Therefore, we ought to prioritize what we fight for, so the most important things get our weirdness points, and tamp down on the weirdness [LW · GW] in other areas of our lives/work.

Typical social explanations of weirdness points aren't completely helpful. Power, status, and wealth would seem to bestow weirdness points. But politicians, celebrities, and wealthy people aren't always as free from the weirdness constraint as one would guess.

Maybe communities and media are fracturing so much that weirdness points are more dependent on your community than your actions. (The social psych idea, "idiosyncrasy credits", is defined in terms of a group's expectations, not those of society-at-large or people-who-are-reachable-but-not-already-on-your-side.)

Weirdness points seem like a valuable (and limited) resource, especially if you are promoting or enacting multiple ideas (A.I. safety and improving rationality and open borders, for example). As with anything valuable to our goals, we ought to figure out if we can get it, and at what cost.

So, the question for discussion: can you gain weirdness points? If so, how, and at what cost?

Answers

answer by lsusr · 2020-07-31T08:14:53.078Z · LW(p) · GW(p)

People do not punish nonconformity per se. People punish nonconformity iff it is a problem. If someone punishes you for being weird then that means your weirdness has caused a problem. If you can stop causing problems for other people then you can get away with being weird.

I walk around barefoot outside where there is broken glass. Instead of hosting my personal website on WordPress, I created my own content management system…in Lisp. I wrote this answer in Vim through i3 on a Linux machine. I am a heretical savant high on cocaine [LW · GW]. I wrote a series of posts [? · GW] on how to become even weirder. Yesterday, I stared at a grass field for so long my eyes malfunctioned.

I get away with being weird because I do not cause problems for other people. The value of keeping me around outweighs the cost.

Weirdness points seem like a valuable (and limited) resource, especially if you are promoting or enacting multiple ideas…

Promoting unpopular ideas turns you into a problem.

It's often seen as easier to fight for one weird thing than for multiple weird things. Therefore, we ought to prioritize what we fight for…

Fighting the ordinary people around you turns you into a problem. The simplest way to preserve weirdness points is to not fight for things.

Promoting unpopular ideas costs social capital. How much you can influence other people is a good definition of social capital. If you want to get away with disruptive activities then you can increase your social capital or minimize the disruption you cause.

comment by ChristianKl · 2020-07-31T10:39:31.761Z · LW(p) · GW(p)

It's very hard to know to what extent one gets punished for nonconformity. You don't know about the event invitations that you didn't get because you were seen as being too weird.

comment by Nicholas / Heather Kross (NicholasKross) · 2020-07-31T21:44:35.608Z · LW(p) · GW(p)

This is way clearer thinking than I previously had about this topic. Thank you!

comment by Matt Goldenberg (mr-hire) · 2020-07-31T21:08:56.925Z · LW(p) · GW(p)

People do not punish nonconformity per se. People punish nonconformity if it is a problem

People are more likely to be vocal about punishing non-conformity if it is a problem. But I think there's a thing very much like an anti-halo effect that surrounds people who are perceived as weird.

Replies from: Viliam
comment by Viliam · 2020-08-01T21:17:52.384Z · LW(p) · GW(p)

After observing too many cases of non-conforming people causing problems, people may update and start punishing non-conformity directly.

answer by Viliam · 2020-08-01T22:07:03.665Z · LW(p) · GW(p)

"Weirdness" is probably a bad abstraction, because it includes things with opposite effect. From statistical perspective, being a king is weird, being a billionaire is weird, being a movie star is weird. Yet this is obviously not what we mean when talking about carefully spending our weirdness points.

Here is a hypothesis I just made up, with no time to test it: Maybe people instinctively try to classify everyone else into three buckets: "higher-status than me", "the same as me", and "lower-status than me". The middle bucket is defined by similarity: if you are sufficiently similar to me, you are there. If you are dissimilar, the choices are only "higher" and "lower". (In other words, the hypothesis is that the instinctive classification does not support the notion of "different but equal".) Because you do not assign people higher status for no reason, it follows that if you are different from me, and there is no evidence of you being higher-status than me, then I will perceive and treat you as lower-status. And if you refuse to be treated as lower-status, I will punish you for acting above your status.

From this model, it follows that for people visibly above you, weirdness is not a problem. You expect the king to have unusual manners and opinions compared to the peasants. It is the weird peasant everyone makes fun of.

The answer then is that you must achieve superior status first, and show your weirdness later. Then people will assume that these things are related.

comment by ChristianKl · 2020-08-02T20:39:04.138Z · LW(p) · GW(p)

From a statistical perspective, being a king is weird, being a billionaire is weird, being a movie star is weird.

Weird is not a statistical term, and saying that some statistical notion of weirdness is a bad abstraction has little to do with whether the concept in its usual sense is a good abstraction.

answer by Matt Goldenberg (mr-hire) · 2020-07-31T21:19:16.020Z · LW(p) · GW(p)

I think it's best to view weirdness points as a fake framework [LW · GW].

I don't think there is, at any level of abstraction, an accurate gears level model that includes weirdness points as a gear. But, if you're just trying to make quick and dirty heuristics about what you can get away with, it's an excellent heuristic.

When you're looking at the gears of this phenomenon, I think you start looking at signaling and countersignaling, which will give you more accurate answers than trying to count weirdness points.

comment by TAG · 2020-07-31T21:54:28.065Z · LW(p) · GW(p)

Given that you can't have a quark level model, what counts as a gears level model?

Replies from: mr-hire
comment by Matt Goldenberg (mr-hire) · 2020-07-31T21:58:09.836Z · LW(p) · GW(p)

A model that makes accurate predictions at a given level of abstraction and can handle many cases at that level. E.g. if the level of abstraction is "human behavior" (rather than, say, quarks), it should give accurate predictions about the human behavior abstraction.

Replies from: TAG
comment by TAG · 2020-07-31T22:40:18.627Z · LW(p) · GW(p)

What you are talking about is a function of given-level-of-description, not an absolute. So there is a level of abstraction where "weirdness points" works.

Replies from: mr-hire
comment by Matt Goldenberg (mr-hire) · 2020-07-31T22:50:39.621Z · LW(p) · GW(p)

Probably, but not a very useful one. It's better to just use natural levels of abstraction like "human behavior" and recognize that this is not a gears level model for that level, but a heuristic. I can't really think of a natural abstraction where weirdness points is usefully gearsy, rather than a heuristic.

Replies from: TAG
comment by TAG · 2020-08-01T02:28:32.327Z · LW(p) · GW(p)

"gears level" is defined in terms of usefulness, and so is "heuristic"

Replies from: mr-hire
comment by Matt Goldenberg (mr-hire) · 2020-08-01T02:32:51.512Z · LW(p) · GW(p)

Gears level is defined by prediction power at a given level of abstraction; heuristic is defined by something like... "speed/prediction power" at a typical level of abstraction, or something. Whether you want gears or heuristics really depends on how much time you have and how much time you're going to spend with the model (typically heuristics).

Replies from: TAG
comment by TAG · 2020-08-03T16:07:01.637Z · LW(p) · GW(p)

Are they really different? Why would you want to use a high abstraction level if not to save compute?

Replies from: mr-hire
comment by Matt Goldenberg (mr-hire) · 2020-08-03T16:23:08.533Z · LW(p) · GW(p)

Are they really different?

Yes.

Why would you want to use a high abstraction level if not to save compute?

You do want to use it to save compute.

Replies from: TAG
comment by TAG · 2020-08-03T16:27:11.005Z · LW(p) · GW(p)

So why are they different?

answer by sapphire (deluks917) · 2020-07-31T17:02:48.290Z · LW(p) · GW(p)

I loudly promote a large number of rather contentious ideas. In particular, I am an animal rights hardliner (an active member of Direct Action Everywhere) and a socialist, on top of the big rationalist stereotypes (the singularity is near, poly, etc.). I certainly annoy a lot of people, but socially I am doing well. I have many friends, an amazing long-term relationship, and am doing well financially. You can read my blog to see the sort of beliefs I promote.

It is unclear why this works out for me. I look rather average, which might help? Plausibly I have some sort of social skills that help me smooth things over if they get too hot. I handle conflict fairly well. It seems empirically true that many people are socially successful despite having extremely controversial views. In some cases it seems to help them?

2 comments


comment by Linch · 2020-07-31T09:00:21.310Z · LW(p) · GW(p)

Many of the thinker-heroes that we revere now, like Jeremy Bentham, Isaac Newton, Florence Nightingale, and Benjamin Franklin, among others, had ideas that were considered deeply weird in their time. Yet many of them were nevertheless quite popular even within their own time.