Not all signalling/status behaviors are bad

post by Stabilizer · 2012-03-25T10:06:39.795Z · LW · GW · Legacy · 78 comments


As I've come to understand the signalling/status behaviors common among humans and how they can cloud reality, I've had a tendency to automatically think of these behaviors as necessarily bad. But it seems to me that signalling behaviors make up much of what we do during our waking life. If you or I have abstract goals (become better at physics, learn to play the guitar, become fit, and so forth), these goals may fundamentally derive from evolutionary drives, and therefore their pursuit in real life probably makes heavy use of signalling/status urges as primary motivators. But that does not necessarily reduce the usefulness of these behaviors in achieving those abstract goals.[1][2]

I suppose what we need to be cautious about is inefficiency. Signalling/status behaviors may not be the optimal way to achieve these goals. We would have to weigh the cost of actively ignoring our existing motivators and cultivating new ones against the benefit of having motivations more aligned with our abstract goals.

Any common examples of behaviors that assist and/or thwart goal-achievement? I've got one: health. Abstract goal: we want to be healthy and fit. Status/signalling urge: the desire to look good. The urge sometimes assists, as when people exercise to look good, which also makes them healthier. Sometimes it thwarts, as in the extreme example of anorexia. Has anybody made personal trade-offs?


Notes:

1) I realize that this theme underlies many LW posts.

2) I'm not trying to talk about whether abstract goals are more important than signalling/status goals.


78 comments

Comments sorted by top scores.

comment by Kaj_Sotala · 2012-03-25T16:05:23.805Z · LW(p) · GW(p)

Suppose that I were a hard-working person, and you wanted to hire somebody who was hard-working. Now I do things which signal my hard-workingness, and you see this and hire me because of it. As a result, I got a job and you got a hard-working employee.

(Honest) signaling is about communicating the fact that you have (positive) qualities which aren't immediately obvious. To the extent that other people care about knowing whether you have such qualities, signaling is a fantastic thing, and we should all be doing it. It's only wasteful or dishonest signaling that's a problem.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-03-27T00:07:16.807Z · LW(p) · GW(p)

(Obligatory caveat: If everyone is already engaged in "dishonest" signaling and the market knows this and adjusts for it, then not engaging in "dishonest" signaling yourself is itself dishonest—it misleads the market into underestimating you. So perhaps the immoral ones are those who refuse to adapt to the market and instead take a "moral" stand against negative-sum signaling games.)

Replies from: army1987, TimS
comment by A1987dM (army1987) · 2012-10-28T23:45:00.283Z · LW(p) · GW(p)

Yes. If everyone who says "I'm X" is actually Y and everybody knows that and people still say that, then essentially X has come to actually mean Y, whatever its literal meaning was.

comment by TimS · 2012-03-27T00:19:02.736Z · LW(p) · GW(p)

Child: But everyone's doing it.

Parent: If everyone was jumping off a bridge, you'd want to as well?

Replies from: Will_Newsome
comment by Will_Newsome · 2012-03-27T00:29:06.793Z · LW(p) · GW(p)

Or, "If everyone was not jumping off a bridge, you'd want to not jump off a bridge as well?"

comment by Viliam_Bur · 2012-03-26T13:14:19.874Z · LW(p) · GW(p)

I think there is a pattern here: being good at something (socially accepted) is good signalling... but trying to improve at something (and thus showing that you think you are not yet good enough) is bad signalling.

I noticed this pattern years ago, when I was trying to write sci-fi stories and was meeting with other people who tried the same. I found somewhere on the internet that many famous writers had attended some kind of writers' workshop. So I suggested to my friends that we should find out whether such a workshop existed near us, and if not, try to create our own. Most of them were horrified by the idea. When I asked why, they told me that a person either has a talent for writing or not. The former cannot learn anything at workshops, because true art cannot be taught; only the latter could learn to become a more skilled art-less graphomaniac. I thought such reasoning was stupid, and asked some literary critics about it: but they confirmed that they would perceive a person known to have attended such workshops as an art-less wannabe, because true talent must be born. I refused to accept their reasoning too (because I have read a few autobiographies of famous writers, and many of them had some kind of writing education), but I learned that admitting to systematic self-improvement can be a huge status loss. (So a smart thing to do is to attend the workshops secretly, and to pretend you were just born with the ability; at least until your status becomes unshakeable.)

In short: to fix a problem, first you have to admit it. Admitting the problem = status loss.

Recently I was thinking about how exactly this is possible: how can improving your skills seem like a status loss? Is there any way to have good skills other than improving them? How else can highly skilled people become highly skilled, if not by gradual learning? But then I realized that perhaps it's the life-long learning aspect that goes against our instincts. In the ancestral environment, life was short and rather simple -- people learned their skills as children (naturally low-status), and as adults they just used them, improving further through use, but not starting from near-zero. Today, life is so complex that we cannot learn everything as children, but our brains still see the first steps in mastering any skill as childish. We still feel learning is natural for children (skills learned in childhood we perceive as inborn), but retarded (literally: too late) for adults.

Replies from: Crux, Multiheaded, army1987, army1987
comment by Crux · 2012-03-26T21:10:09.694Z · LW(p) · GW(p)

Great comment.

It's not just improving your skills when you're already an adult that's a status loss. Nobody faults a top 100 professional tennis player for trying to get better, and in fact the commentators often lavish praise upon the most hard-working ones as being mature, ambitious, high-quality individuals.

It's starting from ground zero that's the problem. I learned this the hard way when I tried basketball. I started tennis from a young age, and spent my incompetent years as a child. With basketball though, I attempted playing it for the first time when I was already an adult, and it was a serious social shock.

I was used to being treated with respect on the tennis court, but none of the people I played basketball with knew who I was, or knew anything about me (because it was just at some large gym with a full basketball court and a ton of people who came to play). They didn't just treat me like I was a newbie; they treated me like I deserved no respect at all (or rather many did--there were at least a few nice people).

As you said, it's OK to be low status as a child (because they're just naturally low status), but it gets socially intractable when you reach adulthood. We may literally be wired to "see the first steps in mastering any skill as childish". I cringe when I imagine learning a new skill from the ground up and being watched and judged while doing so. This is certainly an area where our status hardware is dangerously mis-aligned for our current environment.

In the ancestral environment, one could imagine that it would have been counter-productive for an individual to decide to do a "career change". Learning a new skill that didn't have sufficient micro-skill carry-over from one's old specialty to allow one to excel at a sufficiently adult-like level right away would have been a waste of one's prior skill, and a detriment to the tribe. Perhaps it may have been optimal for the children to find their comparative advantage at a young age, and then simply stick to it.

Replies from: Viliam_Bur, Alicorn
comment by Viliam_Bur · 2012-03-27T09:29:12.291Z · LW(p) · GW(p)

It's not just improving your skills when you're already an adult that's a status loss. Nobody faults a top 100 professional tennis player for trying to get better...

Yes, the exact rules are a bit more complex. It seems to me that it's OK to practice if you already have much better results. Good results are high-status... and whatever else the person does is colored by the halo effect. If they work diligently, we praise them for their work. But I guess that even if they did nothing and still delivered superior results, we would praise them for their talent. Whatever a cool person does automatically becomes cool, although the same thing might become uncool if someone else did it (and the corresponding rationalization would be: no, it's not really the same thing; you are doing it wrong).

On the other hand, the image of "working hard" could be better than the image of "just having luck" because it reduces envy. The envy-reducing factor could also be something other than hard work, for example "being crazy": something that says these people are superior to the average Joe, but that for some reason Joe probably wouldn't want to be in their place. Or maybe this is not a counter-example to status loss... maybe it actually is a small status sacrifice designed to reduce the envy of less successful people. Status games are complicated: if you get too much status, someone could get angry and kill you (either literally, or just by working hard to ruin your career).

It could be interesting to find out whether top players get status loss among their peers if they practice visibly more than their peers but don't deliver better results (yet).

And by the way, laughing at people who are trying to learn something also makes good sense as a zero-sum-game strategy. By threatening status loss you eliminate future competition. If someone is already far ahead of you, it's too late to stop them, but you can still stop people at your level from improving and leaving you behind. This sounds horrible, but it can be done unconsciously: you really feel they don't have a chance and are only making fools of themselves, so you have to give them helpful feedback.

Also, there are some exceptional situations where learning from zero as an adult does not cause status loss. For example, when personal computers were new, older people did not lose status for learning the basics; they actually gained status even for minimal knowledge, because they were obviously superior within their age group... and nobody expected them to really compete against 20-year-olds (trying that seriously, they would become very low-status).

Generally, if something is known to be new (not necessarily new technology, but also new fashion, for example Zumba dancing in my country), then even being a beginner increases your status among people who are at zero level. Probably because the zero level is perceived as average, so by being above zero you automatically become elite.

comment by Alicorn · 2012-03-27T00:33:56.522Z · LW(p) · GW(p)

Yesterday I went ice skating for the first time in years (and I was never any good at it). I did very poorly. Small children zipped by me on the ice. It occurred to me that this situation could have been embarrassing, but I didn't happen to feel embarrassed. I vaguely remember consciously editing out that reaction to that sort of situation, and think it was in response to my dad reacting badly to expressions of such embarrassment when I was years younger than I am now (maybe 12-15) but still older than others who would have been in beginner-classes-of-things I could have joined.

Replies from: Crux, Vladimir_Nesov
comment by Crux · 2012-03-27T00:53:50.679Z · LW(p) · GW(p)

Interesting. It must be a nice cognitive situation to be in, or rather I guess I do remember what it was like. I spent many years almost utterly asocial and unaffected by what most people thought of me (at least in most ways), and it was certainly instrumental in allowing me to change life paths and develop new skills from the ground up, especially when it made me sacrifice a lot of perceived competence in the short term.

But since that time, or more specifically since I started being social again, this strategy has become defunct, and now I've re-acquired the standard, crippling fear of embarrassing myself in front of others. With anything I'm not already adult-level competent in, I have a hell of a time getting myself to go out there and not be afraid of screwing up and being judged by people.

This really sucks because there are at least a few things I'm unusually bad at that are highly important to me, and where the path to competence requires being around other people. I need to figure out a new way to avoid the fear of making mistakes, and specifically one that doesn't require staying away from people (which is how I used to handle it).

comment by Vladimir_Nesov · 2012-03-27T01:04:46.507Z · LW(p) · GW(p)

Oddly, I feel slightly embarrassed when I'm reading a textbook printout (in English, which most Russians can't read) during a commute, and it's only undergraduate or first-year graduate level pure math, and not something more advanced...

comment by Multiheaded · 2012-03-26T18:30:27.255Z · LW(p) · GW(p)

Excellent comment. All of this is indeed, as some writer put it, "a tragedy for those who feel and a comedy for those who think".

(Which extends to the fact that expressing either particular sadness or particular amusement at these everyday facts is in itself looked down upon, while the supposedly "sane" and "balanced" attitude is the one that's inconsistent. In other words, a naively rational response to the realities of our life and society is one that would get you admitted to a hospital.)

comment by A1987dM (army1987) · 2012-10-29T00:17:36.801Z · LW(p) · GW(p)

That doesn't seem to be the case where I am: I often hear "Have you been practicing? You've gotten much better than last time" in (what sounds to me like) a complimentary tone, whereas replying to "Where did you learn that?" with "I didn't, I'm just improvising" is often met with (what looks to me like) disappointment/disenchantment. (EDIT: But I'm 25. What age did you have in mind as "adult"?)

Replies from: Viliam_Bur
comment by Viliam_Bur · 2012-10-29T08:44:57.192Z · LW(p) · GW(p)

In your example, people first notice you being better (which is high status) and only then become curious about the causes. My examples were about people noticing someone practicing, or just discussing hypothetical future practice, with no improvement yet. That's not the same situation.

Simply put, you get -1 point for trying and +5 points for succeeding. The problem is that trying comes first and succeeding later. So there is that unpleasant phase of "already trying, not yet succeeding", which you cannot avoid (though you can keep it secret). During this phase you have low status. Only later, when the success comes, does your status become higher than it was originally.

The high-status answer to "Where did you learn that?" is "I am just naturally good at it". Of course that works only if it is credible, which depends on the audience.

For example, if I wanted to gain status for my programming skills, to a totally computer-illiterate person I could say: "I just naturally understand computers; I've been like this since my childhood." No details necessary. To them, any computer skill is probably magic, accessible only to a special kind of person, and I have just confirmed that hypothesis.

To a fellow programmer I could say: "I played with computers since I was a child; then I participated in programming competitions and won them; then I went to university, which was rather easy for me; and now I just read some tutorial on the web or google a few examples, and I get it; anyway, most of the stuff is easy if you already know a lot." I cannot pretend that one can learn programming magically without learning; but I can still move my magic one level more meta and pretend that it's not my programming skills per se, but my learning-programming skills that are magical. Yes, I had to learn programming, but the learning was always easy and quick -- I never failed, never got stuck, never had to ask another person for help, never doubted my success for a moment. (Which is psychologically almost as unlikely as being born with magical programming abilities, but I would expect an average programmer to believe it and to feel inferior by comparison.)

If I can take your answer literally, perhaps the word "improvising" had some bad vibe: it contains a possibility of failure, of uncertainty. Also, trying is low status, but teaching institutions can be high status, so maybe people expected an answer like: "I had an internship at Google and that's where I learned that."

(Age is context-dependent. If you are 25 in a job, and you are the youngest of your colleagues, they see you as a child.)

Replies from: army1987
comment by A1987dM (army1987) · 2012-10-29T21:09:52.704Z · LW(p) · GW(p)

which depends on the audience.

Yeah, probably that's it. While I'm positive that among musicians just having a decent sense of rhythm and melody and improvising on the E flat minor pentatonic scale (AKA “only playing the black keys”) is lower status than having spent hundreds of hours taking piano lessons and rehearsing, I'm not at all sure whether it'd also be lower status among other people, and indeed now that I think about it, my model of non-musicians says it wouldn't.

If I can take your answer literally,

I only normally use the word “improvising” about playing an instrument, or occasionally about vernacular dance (just discovered this term, BTW).

comment by A1987dM (army1987) · 2012-10-28T23:54:59.824Z · LW(p) · GW(p)

Not where I am. ISTM that here, being better than you used to be until recently is received pretty favourably.

comment by Grognor · 2012-03-25T23:04:36.895Z · LW(p) · GW(p)

The important thing isn't to try to not signal things (which is of course impossible) but to be aware of the nature of one's own signaling and how it can impact the exchange of information and belief.

Look at this post by Robin Hanson: Smiles Signal. Nobody is arguing that smiling is bad. But if you think about it, you realize that if smiling weren't a signaling behavior, it probably wouldn't be visible to others. Awareness of the ten thousand ways these adaptations influence our behavior is a tremendous component of what it means to be a rationalist.

One should also beware of using "signaling!" as a fully general counterargument of the form:

 My opponent argues for position X. But in doing so, he is only signaling high status. Therefore, not-X.
comment by RolfAndreassen · 2012-03-25T17:59:42.054Z · LW(p) · GW(p)

Please observe the following distinction:

All X are not Y

is not the same as

Not all X are Y

In your case, you are claiming that no signalling behaviours are bad. You probably intended to say that at least some signalling behaviours are not bad.

Replies from: erratio, Stabilizer, army1987
comment by erratio · 2012-03-25T19:08:20.371Z · LW(p) · GW(p)

you are claiming that no signalling behaviours are bad

Actually, Stabilizer may not be making any such claim. There's a linguistic phenomenon where the population can basically be split into people who can take a sentence like "All X are not Y" and only get the interpretation "No X are Y", and people who can get both that interpretation and also "[not all] X are Y". I would be willing to wager that Stabilizer is in the latter group, since it's pretty clear from the post that they're not trying to claim that no signalling behaviour is bad.

Replies from: RolfAndreassen
comment by RolfAndreassen · 2012-03-25T19:30:38.114Z · LW(p) · GW(p)

Well yes, and the latter group is just mistaken, which is what I'm pointing out.

Replies from: erratio, None, army1987
comment by erratio · 2012-03-25T20:39:46.365Z · LW(p) · GW(p)

They're not, that's not how language works. I can agree that there are better ways to express oneself that are not ambiguous, but calling an interpretation "mistaken" which is perfectly fine for a decent chunk of the population is pointlessly prescriptivist.

Replies from: RolfAndreassen, RolfAndreassen
comment by RolfAndreassen · 2012-03-25T22:45:47.209Z · LW(p) · GW(p)

It is not pointless at all. When there is one way that is unambiguous, and another that creates an unnecessary ambiguity, then the ambiguous way may reasonably be considered wrong, and people who use it corrected as a way to improve the language.

Replies from: David_Gerard
comment by David_Gerard · 2012-03-25T23:03:28.351Z · LW(p) · GW(p)

In practice, human language isn't precision-oriented technical jargon.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2012-03-25T23:31:05.159Z · LW(p) · GW(p)

That's a bug, not a feature. ;)

Replies from: None, David_Gerard
comment by [deleted] · 2012-03-27T15:47:44.255Z · LW(p) · GW(p)

Actually, it just might be a feature.

You know, just between you and me, I sometimes worry that there is a naive view loose out there — most students come to linguistics believing it, and there appear to be some professional linguists who regard it as central and explanatory — that language has something to do with purposes of efficiently conveying information from a speaker to a hearer. What a load of nonsense. I'm sorry, I don't want to sound cynical and jaded, but language is not for informing. Language is for accusing, adumbrating, attacking, attracting, blustering, bossing, bullying, burbling, challenging, concealing, confusing, deceiving, defending, defocusing, deluding, denying, detracting, discomfiting, discouraging, dissembling, distracting, embarassing, embellishing, encouraging, enticing, evading, flattering, hinting, humiliating, insulting, interrogating, intimidating, inveigling, muddling, musing, needling, obfuscating, obscuring, persuading, protecting, rebutting, retorting, ridiculing, scaring, seducing, stroking, wondering, ... Oh, you fools who think languages are vehicles for permitting a person who is aware of some fact to convey it clearly and accurately to some other person. You simply have no idea.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2012-03-28T04:06:52.584Z · LW(p) · GW(p)

Very well, I will thus ignore any information in your comment.

comment by David_Gerard · 2012-03-26T00:02:59.884Z · LW(p) · GW(p)

Bah. Joseph Conrad picked English for its interesting ambiguities!

comment by RolfAndreassen · 2012-03-26T19:44:11.914Z · LW(p) · GW(p)

Perhaps you'll find this interesting; it touches on how language works and corrects your apparent misconception that it's all about usage:

http://esr.ibiblio.org/?p=737

Replies from: jmmcd
comment by jmmcd · 2012-03-27T19:28:28.086Z · LW(p) · GW(p)

That is an interesting essay. For me, Raymond's arguments don't really stand up. This is the core of his argument that the "popular usage" position, apparently common among linguists, is not well-grounded:

At the bottom of it, for most people, is the belief that popular usage always wins in the end, so why fight it? But this isn’t actually even remotely true; as far back as Middle English, academic grammarians imported Latin and French words into English wholesale, and they often displaced more popular “native” words. The anti-populist effect of class stratification has been taken over in our time by mass media, especially television and movies, which have enormous power to ratify minority usages and pronunciations and make them normative.

It misses the fact that the academic grammarians and mass media he mentions are an influential part of the popular-usage process, not in opposition to it. News networks don't announce that their usages are normatively correct: it is (a certain segment of) the population who make that argument.

Replies from: RolfAndreassen
comment by RolfAndreassen · 2012-03-27T19:39:21.599Z · LW(p) · GW(p)

If there's no distinction to be made between elite prescription and mass usage, then what is the point of appealing to "common usage via Google" at all? By your argument, I'm just as much an "influential part of the popular-usage process" as the Google results that were being used as an argument against my position. Either there is a distinction between common and elite usage, or there isn't. If there is, we can argue about which is more important in what circumstances. If not, then we're back to arguing about function and ambiguity.

Replies from: jmmcd
comment by jmmcd · 2012-03-27T19:52:48.230Z · LW(p) · GW(p)

No, there is a distinction here, but it's not between common and elite usage. It's about whether the authority is normatively correct even when the people disagree. If an authority is against a usage and most people continue using it, most linguists (holding the "popular usage" position) will be for that usage. If an authority is against a usage and most people are also against it (whether influenced by the authority or not), most linguists will be against it.

I'm just as much an "influential part of the popular-usage process"

Yes! If you're influential, that is. Google certainly is.

Replies from: RolfAndreassen
comment by RolfAndreassen · 2012-03-27T23:22:23.652Z · LW(p) · GW(p)

If an authority is against a usage and most people continue using it, most linguists (holding the "popular usage" position) will be for that usage.

ESR's argument is precisely that this is not true; that linguists will generally not approve of a popular usage as against an elite usage when, and only when, the elite usage is less ambiguous. Do you have any data (anecdotes will do, that's what he's basing his assertion on) that shows otherwise?

Replies from: erratio
comment by erratio · 2012-03-28T02:03:54.230Z · LW(p) · GW(p)

If I told you that most linguists think that Strunk and White is a load of crap, would that help? Or how about that most linguists I know will happily admit that these days there's little or no difference between "fewer" and "less", or complementiser usage of "which" and "that", because the vast majority of people don't make a principled distinction between the two? I'm pretty sure I've also heard at least one of them using "less" in a classroom context that prescriptively ought to have been "fewer".

(Actually, I'm not sure if there was ever a really principled distinction between fewer and less - it seems like one of those things that teachers have always been complaining about our misuse of)

Replies from: RolfAndreassen
comment by RolfAndreassen · 2012-03-28T04:04:27.349Z · LW(p) · GW(p)

Well, "most linguists" is a phrase that really cries out for some Wiki tags. "Citation needed", "who", and "weasel words" come to mind. That aside, I do not see what Strunk and White has to do with it; they were giving advice on writing style, not on how to express yourself un-ambiguously. As for fewer and less, and which and that, I don't see where these gave rise to any actual precision of language. Saying 'fewer people' is not actually needed to inform you that people are countable; you already know that. So the alleged additional information is redundant. Which is, indeed, why people don't bother with the distinction, and why linguists merely catalog the usage. Your examples are quite different both from the original "not all are/all are not" distinction, and from the ones in the essay, and thus don't actually carry your point.

Replies from: erratio, jmmcd
comment by erratio · 2012-03-28T12:21:35.930Z · LW(p) · GW(p)

You ask for anecdotal evidence and then demand citations when given some? I'm tapping out of this conversation for good.

Replies from: RolfAndreassen
comment by RolfAndreassen · 2012-03-28T17:17:19.931Z · LW(p) · GW(p)

You did not give anecdotes; you made assertions. There's a difference. If I say "Person such-and-such, who is a linguist, told me this-and-that", this is anecdotal evidence that linguists hold such a position. If I say "Most linguists think", that is assertion.

Replies from: erratio
comment by erratio · 2012-03-28T23:43:10.461Z · LW(p) · GW(p)

I see. In that case, let me rephrase: every member of the class of linguists that I am aware of, including but not limited to the ones on Language Log, the ones at my old department and the ones at my current department, think that Strunk and White and other similar prescriptivism is a load of crap and are in favour of usage-based grammaticality.

I also request that any subsequent comments I make in this thread be downvoted, because I am clearly having problems disengaging.

Replies from: RolfAndreassen
comment by RolfAndreassen · 2012-03-29T00:24:49.762Z · LW(p) · GW(p)

Again, what do Strunk and White have to do with it? They were giving advice on writing style, saying "If you say it this way your readers will like your writing better", not "This is the correct way to say it". Now perhaps they gave bad advice, it is a point on which reasonable men might differ, but what of it? To beat up on Strunk and White may be popular, but it has nothing to do with prescriptivism in linguistics.

comment by jmmcd · 2012-03-28T08:29:34.855Z · LW(p) · GW(p)

As for the wiki tags, Language Log provides some examples. I don't think these examples have to include the precision aspect of the question to support the claim that Raymond is over-reaching in his attack on the "popular usage" position.

comment by [deleted] · 2012-03-26T02:01:45.802Z · LW(p) · GW(p)

In common usage (based on a Google search for "all * are not *") you are wrong: in fact, most usages of the phrase seem to mean "not all X are Y". Probably the phrase is ambiguous, but then we should not use it at all, and either say "No X are Y" or "Not all X are Y". And in that case it is silly to criticize a use of the phrase which you admit that you have correctly parsed.

Replies from: RolfAndreassen
comment by RolfAndreassen · 2012-03-26T19:38:04.717Z · LW(p) · GW(p)

Most people also understand "if" to mean "if and only if"; it does not follow that we ought not to correct such ambiguous and context-dependent use. I'm down with common usage in most cases, but not when it comes to making logical distinctions in writing. There is a place for prescriptivist precision in language, and this is it.

comment by A1987dM (army1987) · 2012-10-29T00:41:06.294Z · LW(p) · GW(p)

Nope. "All that glisters is not gold." You are probably implicitly assuming that "not" in English only negates what's after it and not what's in front of it, but English isn't that simple -- cf "You must not do X" (where "not" negates "do") and "You need not do X" (where "not" negates "need").

comment by Stabilizer · 2012-03-25T20:59:01.281Z · LW(p) · GW(p)

Fixed. Thanks.

comment by A1987dM (army1987) · 2012-10-29T00:34:11.362Z · LW(p) · GW(p)

A quirk of the English language means that the former is often interpreted as though it were the latter (IOW, the scope of a negation in English isn't always everything after it in the sentence and nothing in front of it -- e.g. "All that glisters is not gold"). To unambiguously express the former meaning you have to say "No X is Y."

(Hadn't seen this had already been discussed -- never mind.)

comment by TimS · 2012-03-25T12:56:24.950Z · LW(p) · GW(p)

There's nothing wrong with signaling (turn signals are wonderful). But signals can be faked. (What precisely does a college degree mean?) Signaling is socially problematic when effort is spent on the signal instead of what it represents. (i.e. Cheap talk)

Tribal membership declaration and dominance games occur via signaling, but it is analytically clearer to treat those topics separately from signaling itself.

comment by Crux · 2012-03-26T20:39:00.435Z · LW(p) · GW(p)

It's true that not all signaling and status posturing is bad. What happens is that we recognize how destructive so many of our signaling and social status tendencies are, but then apply an overly general heuristic of, "Avoid doing anything we could refer to as status posturing or signaling."

It's certainly the case that a lot of the status behaviors are dangerously mal-adapted for our current environment (especially when it comes to our epistemic rationality), but not all of them are, or rather most of them probably aren't--it's just that the destructive ones are most visible.

This also happens with food. People notice that the output from our taste buds is alarmingly mal-adapted to our present food selection, and that (at least for most people's utility functions) just because something tastes good doesn't mean you should eat it, but then they overly generalize and end up with a heuristic that completely dismisses taste as a useful indicator. It's not that you should never listen to your taste buds; it's just that you need to know when to do so.

Just as removing the junk from your food selection and replacing it with a wide variety of traditional food helps re-align the indicators and allow your taste buds to become trustworthy again, putting yourself in a better social situation will lead to something similar. I certainly trust my status posturing tendencies more when I'm on this website than when I'm hanging around with a bunch of comparatively irrational, incompetent people.

If you want to know whether a signaling or status behavior is rational or good, just ask yourself whether getting social props in the situation at hand is well-aligned with being successful in the other ways you care about (just like considering whether some food tasting good is well-aligned with it being healthy and providing the right nutrients). If the answer is no, then ask yourself how you can re-align them, or else avoid the situation--unless of course the social props at stake are what's most important to you.

Not all signaling and status behaviors are bad, just as you don't need to stick to just bland, uninteresting food to lead a healthy lifestyle.

comment by Jade · 2012-08-20T15:41:13.848Z · LW(p) · GW(p)

Looking "good" is still based on priors. In anorexics, vegans, and ascetics, those priors usually involve perceptions of costs that their brains subconsciously figure would be reduced if people ate less, ate less meat, or consumed less of everything.
Some vegans feel disgust when thinking of meat, even lab meat
"Disgust as embodied moral judgment"

Generally, all signaling is good from the perspective of the signaler's brain, which may be updated, like when Buddha left groups of ascetics to continue optimizing.

comment by falenas108 · 2012-03-26T12:53:06.087Z · LW(p) · GW(p)

The thought process for signaling should usually be something like this: I recognize that although X is associated with Y, X is actually just signaling. But, is my life better off if I do X anyway?

For example, making my bed is associated with being a clean person, even though making a bed doesn't actually clear up any space. But I find it more aesthetically pleasing to make my bed every day, even though I don't get any other benefits from it. Plus, others coming into my room may also make judgements about me based on an unmade bed.

Replies from: Alex_Altair
comment by Alex_Altair · 2012-03-26T14:32:06.477Z · LW(p) · GW(p)

I attribute bed-making and similar things to reducing the cognitive cost of visual processing. If you enter a clean room, it's easy to assess what few things are present. But if there is a mess, there are all those extra visual objects which must be sorted through by your visual attention circuits.

Having said that, I think avoiding cognitive cost is something we acquired from evolution because thought was very costly in terms of calories. So it might not be valid to continue avoiding, especially when it comes to questions more important than bed-making. This is one reason we rely on cached thoughts and so forth. Does anyone remember if there was a sequence post on the caloric cost of thinking?

Replies from: wedrifid, Vladimir_Nesov, Crux
comment by wedrifid · 2012-03-27T00:58:14.948Z · LW(p) · GW(p)

I attribute bed-making and similar things to reducing the cognitive cost of visual processing. If you enter a clean room, it's easy to assess what few things are present. But if there is a mess, there are all those extra visual objects which must be sorted through by your visual attention circuits.

This fits with the studies that I have read (the abstracts of) pertaining to the effect of clutter on both productivity and indicators of stress.

Having said that, I think avoiding cognitive cost is something we acquired from evolution because thought was very costly in terms of calories. So it might not be valid to continue avoiding, especially when it comes to questions more important than bed-making. This is one reason we rely on cached thoughts and so forth. Does anyone remember if there was a sequence post on the caloric cost of thinking?

This strikes me as the opposite conclusion to the right one (and so I question the strength of the reasoning). See previously alluded to studies that can be paraphrased as "mess bad". While I agree that thinking on net is probably desirable I rather confidently assert that we are not best off doing so by making less effort to clear up clutter - be it mental or physical. Most people would be best served by reducing the cognitive load from mess, not letting it build up more. (After all, even once the bed is all nice and neat we still have more stuff lying around to process than, well, back before we learned how to build stuff to keep lying around.)

Replies from: Alex_Altair, gwern
comment by Alex_Altair · 2012-03-27T01:57:32.260Z · LW(p) · GW(p)

Most people would be best served by reducing the cognitive load from mess

That's a good point. I think I was confusing two ideas here. 1) How difficult it is to process certain information. 2) How I feel when considering whether to think about something.

Cleaning messes falls under the first category. It is unchangeably difficult to process certain kinds of information. There is probably some information theory demonstrating this.

As an example of the second, I once figured out that I don't like doing dishes because I feel like it would take a lot of concentration and thought to make sure I got them clean. But all the thought costs me is willpower. I think this is an instance where the evolved reluctance to spend glucose on thinking (and I'm pretty sure I read something about that here) is no longer valid, because I have more glucose than I know what to do with.

This is the kind of thing that I would like to make an explicit skill in catching. I think it is the instrumental rationality analog to the epistemic rationality skill of noticing when you flinch away from a thought.

Replies from: wedrifid
comment by wedrifid · 2012-03-27T02:04:48.971Z · LW(p) · GW(p)

This is the kind of thing that I would like to make an explicit skill in catching. I think it is the instrumental rationality analog to the epistemic rationality skill of noticing when you flinch away from a thought.

It's certainly a worthwhile skill. (Probably more important for most practical purposes than all that 'epistemic' stuff.) It may be best to develop the skill in a somewhat original-cause agnostic fashion. It is somewhat hard to trace the exact cause of a particular instance of aversion to "aversion to spending glucose on thought" vs "aversion to spending glucose on doing stuff in general". Yet often the reasoning we use to bypass those biases and do the smart thing anyway is the same regardless.

(If I don't base my skills entirely upon my just-so stories it means I don't necessarily have to abandon them if it turns out my history was wrong but practical psychology was not.)

comment by gwern · 2012-03-27T01:41:49.728Z · LW(p) · GW(p)

Matches the studies that I have read (the abstracts of) pertaining to the effect of clutter on both productivity and indicators of stress.

This is relevant to my interests.

comment by Vladimir_Nesov · 2012-03-26T22:43:42.687Z · LW(p) · GW(p)

(I downvoted the above, because I found the ratio of convincing explanation to wild theorizing too low, which is bad epistemic hygiene, especially when there are impressionable people around.)

Replies from: Crux
comment by Crux · 2012-03-27T00:24:14.160Z · LW(p) · GW(p)

You don't seem to have anywhere near enough information about me to responsibly pass that sort of judgment. I understand the epistemological status and limitations of evolutionary psychology (phrased very concisely in the first two sentences HughRistik wrote in this post).

In the spirit of trying to figure out why exactly your comment annoyed me and activated my status-posturing hardware to such a great extent, I'd say it was probably the presumptuous, subtle, passive-aggressive nature of indicting me so offhandedly in a comment not even replying to me, but to the OP of this subthread.

To avoid coming off as so condescending and turning the discussion into a status game (which this surely has become), I would recommend instead replying directly and doing so in a much more charitable, thoughtful way.

Replies from: Vladimir_Nesov, wedrifid
comment by Vladimir_Nesov · 2012-03-27T00:54:35.879Z · LW(p) · GW(p)

What I'm curious about is whether it'll work, whether it'll be more memorable than other things I could've done quickly. I do believe it was a clear-cut case of overvaluing an unsubstantiated assertion ("highly insightful ... example of ... answering some esoteric question I've had for years"), which is a serious problem that might let all sorts of cobwebs clutter one's mind if left unchecked... The comment was also directed at Alex_Altair.

Replies from: wedrifid, Crux
comment by wedrifid · 2012-03-27T01:11:43.456Z · LW(p) · GW(p)

What I'm curious about is whether it'll work, be more memorable than other things I could've done quickly.

Another thing that you could have done quickly is write the same message but with the passive-aggressive status game truncated. Finish with "hygiene.)". That would have got your point across at least as well, and even the act of lending support to a challenged downvote that way already does a lot to undermine Crux without adding any gratuitous insults.

No, the above wouldn't have been quite as 'memorable' as what you chose to do but at least people would have remembered your desired message regarding epistemic hygiene. This way the lesson that people will take - and that people should take - from your move is that petty passive aggressive status assassination is frowned upon here.

Replies from: Vladimir_Nesov, Vladimir_Nesov
comment by Vladimir_Nesov · 2012-03-27T01:29:38.576Z · LW(p) · GW(p)

That would have got your point across at least as well

Emotional experiences are remembered better, that much I think is true. This seems to be my real reason for adding that remark; the problem is that I don't sympathize enough to automatically notice the downside, so this event repeats the lesson once more.

comment by Vladimir_Nesov · 2012-03-27T01:14:42.088Z · LW(p) · GW(p)

This way the lesson that people will take - and that people should take - from your move is that petty passive aggressive status assassination is frowned upon here.

A separate concern.

Edit: Yup, status defense talking, please disregard this.

Replies from: wedrifid
comment by wedrifid · 2012-03-27T01:30:42.137Z · LW(p) · GW(p)

A separate concern.

No Vladimir, you miss the point. This isn't just a side effect. You actively undermine the memorability of the position you claim that you are trying to make memorable.

To the extent that you truly are unfettered from all other concerns like maintaining a non-hostile community, basic courtesy, and not undermining your own reputation, you have still failed at the rudimentary "memorability maximisation" goal you attribute to yourself.

Your point not being remembered is exactly the concern that was mentioned. And it will indeed be remembered less because you decided to obfuscate your point behind personal insults (insults of a different user, no less!) This is only magnified by attempts to justify the move as though it is an optimized support of some higher ideal of epistemic purity.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2012-03-27T01:38:06.485Z · LW(p) · GW(p)

(As I added in an edit to that now-removed comment, I've noticed that the comment was a status defense response on my part, which permitted that statement to be posted past its relevance. A rationalization, finally! I agree that different impressions compete for memorability, and intended one can be displaced by something undesirable.)

comment by Crux · 2012-03-27T01:45:02.947Z · LW(p) · GW(p)

I certainly see what you're saying, and I may in fact try to phrase myself differently next time in order to prevent this sort of situation from happening again, but the problem here isn't his insight or how I valued it. It's the common tendency to overvalue evolutionary psychology itself and misunderstand its epistemological limitations.

I try to gather as many of those sorts of insights as possible and organize them into a system, and I do so simply because of how useful of a hypothesis generator it is, and not because I believe them directly. I read his comment, and I incorporated it into my thinking, but I didn't do so as a standalone belief (because that would be a misunderstanding of the epistemological status of that sort of insight).

I don't know. I'm certainly not explaining this very well, and that's because I'm leaving out an absolute ton of information because I don't want to turn this comment into a lengthy exposition of the epistemology of this sort of reasoning, but hopefully at least you see enough of what I mean to get my basic point here.

Let me sum this up. I don't think there's absolutely anything wrong with his comment, nor do I think there's anything wrong with how highly I valued it (as a highly insightful point on a random esoteric topic), but I certainly see how conversations like this may be epistemically hazardous to those who take evolutionary reasoning far too seriously, or rather to those who don't understand the epistemology.

But this seems like a difficult problem when people are posting on such a large public forum. Inferential distance is always a factor, and one that changes depending on who you're talking to, and it would certainly be impractical to expect every comment to close the entire inferential distance for everybody who may read it, or even for the majority if it's a thorny or difficult subject.

Sometimes inferential distance gaps are more dangerous than others, and perhaps this is a case you identified as being especially epistemically hazardous, but then I guess your course of action should have been to make a comment in an attempt to close that inferential distance, put the comment in its proper place, and make it explicit what it's limitations are.

You could have said something like, "This is an interesting insight as far as it goes, but keep in mind X," where X is what I've just been talking about: the epistemological limitations of that sort of reasoning. That is, if you even agree that much. Maybe you just think the comment is useless, and you don't even agree with what I'm saying about the inferential distance problem or whatever. I don't know.

In any case though, there was no reason to indict me specifically, and do so with such presumption. If you thought there was something wrong with my comment, you should have just engaged me about it, and done so charitably and thoughtfully.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2012-03-27T01:59:41.170Z · LW(p) · GW(p)

I do think the comment is useless, but simple qualifiers indicating the hypothetical nature of its statements would've made it less hazardous. I agree that attacking you was incorrect, for reasons that I failed to pay attention to due to a lack of skill in emulating empathy. I didn't even think of the comment as primarily addressing you; that was a secondary motivation, so you see how poorly I understood its effect.

Replies from: Blueberry, Crux
comment by Blueberry · 2012-03-27T02:59:32.693Z · LW(p) · GW(p)

"emulating empathy?" What?

Replies from: Vladimir_Nesov, Antisuji
comment by Vladimir_Nesov · 2012-03-27T06:24:24.505Z · LW(p) · GW(p)

The skill of estimating others' emotional responses to various stimuli, which compensates for the flaws of my own native circuitry responsible for the task. What would you call that?

comment by Antisuji · 2012-03-27T05:38:27.708Z · LW(p) · GW(p)

The charitable way of reading that term is to treat "emulating" as a modifier of "empathy", as in empathy implemented through emulation of the other. I'm inclined to think this is also the intended meaning, if only because the non-charitable sense would be better expressed as "simulated empathy".

comment by Crux · 2012-03-27T02:12:22.702Z · LW(p) · GW(p)

I see. Seems like this discussion has run its course (unless you have more to say). See you elsewhere on the forum, and hopefully this exchange will have no bad social effects.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2012-03-27T02:14:50.911Z · LW(p) · GW(p)

Bad social effects teach us valuable lessons.

comment by wedrifid · 2012-03-27T00:42:09.075Z · LW(p) · GW(p)

In the spirit of trying to figure out why exactly your comment annoyed me and activated my status-posturing hardware to such a great extent, I'd say it was probably the presumptuous, subtle, passive-aggressive nature of indicting me so offhandedly in a comment not even replying to me, but to the OP of this subthread.

That is what I saw when I read it. I applaud you for responding calmly to what I judge to be a rather blatant social violation.

Replies from: Crux
comment by Crux · 2012-03-27T01:45:37.646Z · LW(p) · GW(p)

Thank you.

Replies from: Crux
comment by Crux · 2012-03-28T04:32:02.697Z · LW(p) · GW(p)

To whoever downvoted this: What do you want me to do, ignore him?

As a meta point, perhaps it would be useful to have a quick acronym for requesting that nobody upvote or downvote your comment because it's not supposed to function as anything other than a quick acknowledgement or something. Maybe we could do "KF" for "karma freeze". Like, "Thank you. KF"

I don't know if this would catch on or anything, but one of the annoying things about the current karma system is that it creates this atmosphere where everything must be "all business" or something. What if I just want to signal a brief acknowledgement or whatever? Am I supposed to just deal with the inevitable downvotes?

A norm for marking your comment as being "not for karma appraisal" may be useful because you could use it to signal that you're not looking for karma, and just trying to engage in some social nicety or whatever. I suspect that part of the reason why the sort of comment I'm replying to here often gets downvoted is because it may almost seem like the person writing the comment is hoping for some extra karma or something.

I don't know. Even if this wouldn't be a good way to solve it, I nevertheless think it's a problem that I always have to expect to get downvoted when I acknowledge somebody without adding anything substantial, or whatever. Sometimes there's really nothing else to say besides a quick positive acknowledgement, and sometimes not doing that quick signal would be socially suboptimal.

Replies from: pedanterrific
comment by pedanterrific · 2012-03-28T04:47:47.271Z · LW(p) · GW(p)

Can we have a community norm against obsessing over karma?

Replies from: Crux
comment by Crux · 2012-03-28T04:52:51.784Z · LW(p) · GW(p)

Do you mean to say that I shouldn't have written the comment you're replying to?

Replies from: pedanterrific
comment by pedanterrific · 2012-03-28T04:54:07.676Z · LW(p) · GW(p)

Am I supposed to just deal with the inevitable downvotes?

Yes. People downvote for bad reasons, and you can't control what they do.

Replies from: Crux
comment by Crux · 2012-03-28T05:08:55.555Z · LW(p) · GW(p)

It's just that it's a trend that I've noticed, and one that may have a corrosive effect on this community by essentially disincentivizing social niceties and the like. Despite the consistent downvotes, I personally plan on continuing forth in my effort to acknowledge those who address me even if I have nothing else to say, and also never leave anybody hanging, but you can probably see why many would not.

I do agree that yes, I should deal with the inevitable downvotes in these sorts of situations because plenty of people downvote for bad reasons. But I don't agree that I should just give up trying to change the trend for a reason like, "You can't control what they do." Well, why can't I? Sure, I can't hope to influence everybody, but this isn't an isolated event--it's been a trend for a long time.

I'm going to continue posting quick acknowledgments when they're appropriate whether or not I get downvoted anonymously each time, but I don't see why I shouldn't also respond to them by defending my comments and engaging in meta discussion about what sorts of voting patterns would be optimal in this community.

Replies from: pedanterrific
comment by pedanterrific · 2012-03-28T05:10:38.916Z · LW(p) · GW(p)

Okay.

comment by Crux · 2012-03-26T21:18:20.541Z · LW(p) · GW(p)

I just balanced a -1 to 0. No idea who downvoted you, or why. I found it highly insightful, and yet another example of a random comment on Less Wrong answering some esoteric question I've had for years.

Replies from: Alex_Altair
comment by Alex_Altair · 2012-03-26T21:42:26.398Z · LW(p) · GW(p)

Haha thanks. LW is big enough now that I'm not surprised by random up or down votes.