Maps vs Buttons; Nerds vs Normies
post by Bound_up · 2017-12-28T21:11:40.577Z · LW · GW · 23 comments
It took me the longest time to see through the illusion that was "rational" discussion.
The setting: A friend and I exchange inquiries about each other's beliefs about X.
The result: the friend would give a number of answers that allowed me to piece together their view of X. An astounding percentage of the time (perhaps even a majority), one of their answers would contradict another. So I would point this out. My friend, after hearing an explanation, would agree it was a contradiction. We would move on, me convinced that I had contributed to their work (the work I assumed that others, not just myself, were engaged in: namely, piecing together as accurate a model of reality as possible), only to find, weeks or months later, that on discussing X again, they still gave the same contradictory answers as before, with apparently no memory of our past discussion.
Or discussions. This pattern has repeated up to four times (as far as I've bothered to keep track), my interlocutor agreeing with my corrections and then showing no sign of having even heard them down the road.
My whole above description of "the result" is faulty in a subtle way, one that undermines the entirety of it. Here's what was really happening, expressed from my friend's perspective.
The (real) result: A friend and I were discussing X. They showed respect for me by asking me to express myself on it several times. At some point, they dared to point out, rather like a pedantic schoolboy, that there was a sort of inconsistency between my expressions and the forms we studied in textbooks on logic. They were not rude, though, so I graciously acknowledged the rather dry, uninteresting observation, after which we happily continued our conversation. I had thought for a moment that they wished to challenge me, but all seemed to be rather well resolved, so I forgot the particulars of the incident, only taking with me the general, updated state of our relationship, including, for example, that my expressions were generally respected, but that my friend was willing to make some small challenges if they felt like it, though also willing to move on after I was gracious in response.
Or something like this, perhaps. And this key difference is what I call the maps vs. buttons idea.
I, like other nerds, am a conceptual cartographer, attempting to piece together a map of reality. I've often wondered how I seem to update my map so consistently, since I make no specific effort to remember any alterations or additions I deem worth making. Shouldn't I be forgetting them? Maybe I should dedicate some time to writing down and reviewing these precious insights!
But I've since realized that my map is very personal to me. I am well-acquainted with its pieces, which are interconnected in many ways. So long as I swap out one piece at a time, while maintaining mostly the same interconnections with the replacement piece, I seem to remember such changes very easily.
When you ask me what my beliefs on some subject are, I consult my mental map and produce for you, on the fly(!), a description of what the map says. How do I answer your inquiry? By deducing the answer from my map.
Two points:
- This produces conceptually consistent answers, as consistent as my map is, which is to say, very, since I constantly compare its pieces against each other.
- My expressions, my answers to questions, are often clunky and unwieldy, since I've constructed them on the fly. If you're like me, you might wonder why this should make my answers any clunkier than anyone else's, for surely this is what everyone does, right? Herein lies a clue to the great insight.
Now the contrast. Others are not making maps. I think of them rather as a big wall of buttons. Push a button, and a slip of paper comes out. When I ask such a button-type person a question, I am pushing a certain button, and they, without even thinking, spit out the corresponding piece of paper (there is some relatedness here to the idea of "cached thoughts" as Eliezer Yudkowsky spoke of them. You could well say that these people are mostly just vast collections of cached thoughts). A toy code sketch of this contrast follows the short list below.
Where do they spend their effort, if not on map-making? On the following:
- Reading your reaction to their slips of paper (their answers).
- Re-writing and refining the writing on the papers. The higher your status, the more your reaction to their slips of paper compels them to change them.
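For readers who think in code, here is a minimal toy sketch of the two strategies as data structures (my own illustrative caricature; the class names, questions, and canned answers are all made up): a button person does a dictionary lookup of polished, cached strings and spends their effort rewriting entries that drew bad reactions, while a map person derives each answer, clunkily, from one interconnected model.

```python
class ButtonPerson:
    """Cached-answer strategy: a wall of buttons, each wired to a slip of paper."""

    def __init__(self):
        # Polished, audience-tested answers, keyed by the question that triggers them.
        self.slips = {"views on the economy?": "Our plan will bring prosperity for all."}

    def answer(self, question):
        # Push the button; out comes the cached slip, or the nearest rehearsed line.
        # No slip is ever checked against the others for consistency.
        return self.slips.get(question, "Well, it's complicated, isn't it?")

    def refine(self, question, better_slip):
        # Effort goes into rewriting slips that drew a bad audience reaction.
        self.slips[question] = better_slip


class MapPerson:
    """Model-based strategy: answers are derived on the fly from one shared map."""

    def __init__(self):
        # Beliefs live in a single structure whose pieces constrain each other.
        self.map = {"it rained overnight": True, "rain makes grass wet": True}

    def answer(self, question):
        # Consult the map and construct a (possibly clunky) answer from scratch.
        if question == "is the grass wet?":
            return self.map["it rained overnight"] and self.map["rain makes grass wet"]
        return "Hmm, let me work out how that follows from what I know..."

    def update(self, belief, value):
        # Swap out one piece while keeping the interconnections, so the
        # change is easy to remember and consistency is preserved globally.
        self.map[belief] = value
```

The key design difference: ButtonPerson.answer never compares one slip against another, which is why contradictions can persist across conversations, while every MapPerson answer is derived from the same shared structure, which is why it comes out consistent but unpolished.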
Some important differences that flow naturally out from these two different methods and focuses:
- Mapmaking nerds give more consistent answers than button people, since they view their map as a whole, each part being compared against the others, whereas the slips of paper are not judged (even in part) by how well they match other slips of paper, but rather they are judged according to the reactions that they evoke in the audience.
- Button people give much more eloquent, refined, and most of all, socially advantageous answers than mapmakers, and do so with less apparent difficulty or searching for words, analogies, descriptions. This improvement in delivery comes from the fact that they don't have any thinking to do on the fly; they're just repeating the same thing as always (have you ever heard people who always give, word-for-word, the same commentary or response to some particular issue which may be brought up?), so their answer comes out smoothly and immediately. (Indeed, if a question does not exactly ask something they have a prepared response for, they may well give one of their rehearsed lines anyway, considering the cost of imprecisely answering less than the cost of having to stop and, you know, think about how to answer. Less chance to err, you see.) Also, since the answer has, effectively speaking, been "practiced" many times in the past, and has been refined over time as ideas come to them, or as they seek to ameliorate negative reactions, there is greater eloquence and ease of presentation. (It occurs to me that this may also explain why the "best" answers according to this kind of system, especially when addressing highly controversial issues, are sometimes as unclear and meaningless as possible. Suppose a piece of your answer offends people A, so you take it out. Then, another piece offends people B, so you transform it into something inoffensive, and so on. Eventually, you end up with an answer that gives everyone generally positive feelings, without pissing anybody off, but which is so devoid of content that not only was it not deduced from a map of reality, but the whole method of deducing replies by consulting models could never produce it, so incoherent and meaningless is the expression! That is, a nerd, looking at their map and reporting what they saw, would never produce such a statement, because it is not the kind of statement that comes from trying to describe how something works or how it is. Rather, these meaningless niceties are the kind of statement that comes from trying to appease and impress people with the content and presentation of your social signals. Statements which embody these social goals to this extreme degree sound so completely different from the ones that nerds use to communicate actual ideas that the illusion is broken, and instead of thinking their interlocutor is reporting on their personal map when they're not (as was my common error; see the first paragraph), a nerd is likely to just give a blank stare and ask for clarification (I feel this whole aside is somewhat unclear, but if you're me, or like me, I hope it has a ring of familiarity), receiving only more incoherent niceties in response (since there is no content to be brought into focus) until one or the other tires and abandons the project.)
Belief(1) shall refer to the kind of thing a nerd says when asked what they believe. It is as accurate a report of their model of reality as they can give on the fly.
Belief(2) shall refer to the kind of answer a normal, social person gives when asked. It is as nice (to their in-group, anyway) and as impressive a statement on the subject as they so far know how to give. It's very much like whipping out the verbal version of a pretty bauble to gain oohs and aahs (attention and status).
And so I say, belief(1) =/= belief(2).
In a very real sense, normal people just don't even have beliefs(1)! More precisely, about things where they don't need to be factually right in order to prosper (politics, religion, philosophy, etc.), they don't have beliefs(1) at all.
At the same time, they sustain a convincing illusion of having beliefs(1) (not that that's on purpose or anything; thinking in terms of beliefs(2) is their natural instinct), because when you say "what's your belief on gender equality", they immediately produce a nice shiny answer that definitely sounds like it's saying something about the nature of reality.
The natural but mistaken thing to do is treat what is an automatic, unthinking, knee-jerk, instinctive attempt to impress and to signal which groups they owe fealty to as instead an attempt to describe some part of reality, merely because all of the words in their response sound like a description of a part of reality. That's the error.
So, nerds talk in beliefs(1), normal people talk in beliefs(2), and the result is that both sides commit mutual faux pas and talk past each other. Nerds are constantly embarrassing themselves according to social rules, and normal people don't have any ideas worth listening to according to nerd rules.
The specifics of these errors can be deduced once you realize that each side sees their own methodology as so natural that they assume the other side is also using it, just incompetently. So nerds think normal people are failed nerds, rather than successful normal people, while normal people think nerds are failed normal people, rather than successful nerds.
Normal people think nerds are trying to signal and impress (and failing pathetically and amusingly) and nerds think normal people are trying to make models of the world, and epic fail so hard as to be incapable of discussing any subject beyond four sentences without contradicting themselves. And so on.
If you're a nerd, you might read all this and think I'm being hard on normal people (how can you say such awful things about them as that they're not logically consistent and that they don't ponder before answering questions?), while if you're a normal person reading this (haha, jk), you might think I'm awful hard on nerds (how can you say such mean things as that they don't care what others think and are incapable of properly expressing themselves?). This phenomenon occurs again because both sides are judging everybody by their own standard, not recognizing that others have other standards and succeed very well by them.
One last note. I've spoken as if people are in one camp or the other, but it's really more of a spectrum. Even more precisely, it might be correct to say that people are on a spectrum of nerdy/normal for each specific topic they care about. Frankly, very few humans manage to be nerdy about politics and religion, even "nerds." People who are nerdy in many ways suddenly turn into political animals, social thinkers, when you bring up anything controversial. And a normal person might well become nerdy and actually try to learn how something really works if they happen to be interested in it for some reason.
23 comments
Comments sorted by top scores.
comment by Benquo · 2017-12-29T17:05:20.918Z · LW(p) · GW(p)
It seems to me like "normal" here is an amalgam of (a) the use of narratives to build coalitions, and (b) a defensive response to authoritarian rules of cognition.
In the OP's first story, the friend seems to be interpreting the rules of logic as a way of being good at school, to make your statements more defensible against social attacks, but not a way to be good at life. This is distinct from the "arguments as soldiers" intuition, or more broadly, beliefs as expressions of affiliation.
Related:
comment by Hazard · 2017-12-29T14:07:32.101Z · LW(p) · GW(p)
I think your buttons-and-maps model is a useful first approximation of the difference between "nerds" and "normals" (I don't like those names, but I mostly get the distinction you're making). I like the mental image of one pushing a button and a printer shooting out a slip of paper.
(meta comment) I'm glad you wrote this, because even though one could say it's "just a rehashing of things that have been said," it seems like you found a way of framing it that made it click for you, and it might click for others.
↑ comment by Bound_up · 2017-12-29T14:09:07.995Z · LW(p) · GW(p)
Thank you. I do feel it's a work in progress and am hoping to produce at some point a much more understandable and thorough version.
↑ comment by Hazard · 2017-12-29T19:35:36.123Z · LW(p) · GW(p)
I think one thing to explore/expand is how "normals" view their own thought process. I'm guessing that the number of people who operate using beliefs(2) and have an understanding of what they are doing is very limited. It seems like most people would consider their beliefs to be some attempt at something they consider truth. What happens if you pump them for predictions? Are they willing to extrapolate other things from their slips of paper?
↑ comment by Bound_up · 2017-12-29T21:57:30.528Z · LW(p) · GW(p)
I think you're right that it's really rare. I mean, we're sort of looking for beliefs(1) about how beliefs(2) feel from the inside. They'd have to turn to the nerd side, at least for this one area.
My first thought is that trying to get them to dig deeper by asking them about their responses is likely to lead mostly to:
- Free-associating off of your words (rather than your concepts), which will randomly push their buttons, ejecting the slips of paper stored therein.
- Blurring the lines between what you really asked and the nearest equivalent that they already have a good answer for, and answering that instead.
- Shutting down the conversation once you persist long enough that it's difficult to avoid having to resort to producing sayings on the fly.
- Or, if they're really smart, they might welcome the challenge and just amuse themselves trying to come up with cool new sayings, using your words (rather than your concepts) as inspiration, sort of like the conversational equivalent of a fun impromptu jam session.
Asking them to think and talk like nerds ("Come on, just for a second, please? Have you tried just not thinking like a political animal?")...is a tricky thing.
I think a better avenue might be to narrow the group down to those people who are nerdy in at least one area of thought, namely, the area of thought dedicated to analyzing how nerds and normal people think. Then, those people can think nerdy about:
- The other areas of their life, which they don't think nerdy about. Like politics or religion (the tricky part here is that they're likely to lose their detached, nerdy analysis as soon as the topic switches to something like that...)
- The way they used to be. That's what I'm doing, trying to remember how I used to think and distilling out the key changes which have occurred. It's tricky because my thinking has changed so much that it's hard to imagine thinking how I used to. For example, I do have an embarrassing draft of a post from when I first found LW and tried to write a justification for believing in God that avoided any errors the LW audience would pick at. I've noticed that it's very dense, unclear, yet concept-sparse. It takes forever to explain a few simple concepts, because I'm using all the rest of the space to signal my intelligence and sophistication, probably in a subconscious attempt to argue along the lines of "smart people believe in God, too, so it should be considered a respectable option among the rationalist crowd."
↑ comment by Florentine · 2018-01-10T23:04:35.696Z · LW(p) · GW(p)
I pretty much only give answers based on beliefs(2), but I'm not sure I'm the kind of person you're looking for feedback from. Mostly it comes down to: Mapping is costly and usually not personally beneficial, and if you wanted a real judgment based on a real map, a judgment I could base predictions on... you'd have to wait a couple of weeks while I figured out how to get sources on the subject to even begin to do research on it to form a useful position (which assumes I care so much about this topic that I'm willing to do days or weeks of research on it), and then when you challenged me on it I'd assume you had more background and more information, would ask you to explain your position, would find it convincing in the moment, and would only really know hours or days later, after thinking about what you said some more, whether I actually found it convincing enough to update my map.
That's... very difficult and time-intensive, especially if I'm not going to interact with that conversation partner a lot. Signaling allegiance with the shibboleth of the hour is by contrast low-cost, easy to fake, and good enough for most interactions.
Do I have beliefs(1)? Sure, on some things, but I bet I’ll have to look something up to remember why.
I have no idea how well anything like that generalizes to other people who give mostly/exclusively beliefs(2) based answers (I feel like most people are actually using such answers for coalition building or self-defense; the self-defense in the sense of keeping one’s job or not being cussed at or punched makes sense to me; the objectives of the coalitions are generally opaque to me); I hope it’s useful to someone, but take with several grains of salt.
↑ comment by Florentine · 2018-01-10T23:08:01.585Z · LW(p) · GW(p)
Addendum: Even among groups that call themselves nerds, a pause long enough to usefully trace the origin of an idea tends to bring the conversation to a screeching trainwreck halt, or else everyone politely ignores your silence and moves the conversation on past whatever you were thinking about; but that might be a sample-specific thing.
comment by iridium · 2017-12-29T06:30:13.576Z · LW(p) · GW(p)
If you're a nerd, you might read all this and think I'm being hard on normal people (how can you say such awful things about them as that they're not logically consistent and that they don't ponder before answering questions?)...
Well, yes. This isn't just saying normal people aren't logically consistent, or that they don't put much effort into logical consistency. It's saying that they have no concept of truth. (Gut reaction: how could anyone ever trust such a person?)
while if you're a normal person reading this (haha, jk), you might think I'm awful hard on nerds (how can you say such mean things as that they don't care what others think and are incapable of properly expressing themselves?)
This is testable. Does anyone here know a non-nerd who could be persuaded to give some feedback? These are thoughts I've had before, and they do a good job of explaining some nasty experiences I've had, but I want to be cautious about what feels like dehumanizing the outgroup.
↑ comment by Benquo · 2017-12-29T21:21:59.624Z · LW(p) · GW(p)
Gut reaction: how could anyone ever trust such a person?
You can't trust(1) = count on them to accurately report their likely future behavior or give you an accurate account of what they know. You can trust(2) = believe that they're part of your coalition and expect them to act to favor that coalition over outsiders, by observing their costly signals of commitment, e.g. adherence to the narrative even when it's inconvenient to do so, hard-to-fake signals of affection and valuing your well-being.
Related: Authenticity and instant readouts, Authenticity vs factual accuracy, Bindings and assurances
↑ comment by Bound_up · 2017-12-29T22:01:41.703Z · LW(p) · GW(p)
You seem to have already covered much of this in your own way. Thanks for the links; I'm going to look them over.
By the way, this focus on social stuff and so on... is this what they call metarationality? I've never quite understood the term, but you seem like you might be one who'd know.
↑ comment by Benquo · 2017-12-29T22:16:30.372Z · LW(p) · GW(p)
I figured better to merge into a shared discourse, especially since we seem to have independently arrived here. A lot of Robin Hanson's old stuff is pretty explicitly about this, but I somehow failed to really get it until I thought it through myself.
This is definitely within the broad strain of thought sometimes called "postrationality" (I don't recall hearing "metarationality" but this post links them), which as far as I can tell amounts to serious engagement with our nature as evolved, social beings with many strategies that generate "beliefs," not all of which are epistemic. My angle - and apparently yours as well - is on the fact that if most people are persistently "irrational," there might be some way in which "irrationality" is a coherent and powerful strategy, and "rationality" practice needs to seriously engage with that fact.
↑ comment by estelendur · 2018-01-02T16:17:20.191Z · LW(p) · GW(p)
Hi! I'm a nerd in many respects but I am enough of a normal person to have noticed a sense of offense while reading this. Perhaps I am an unrepresentative sample: I was offended because I want to believe that I am engaging in real thought, and that I have a map of the world that I refer to when constructing my answers, but I suspect that secretly I am just a button-presser when you get right down to it. I did feel pretty dehumanized, actually.
I actually feel as though I am not able to adequately participate in either form of communication, the nerd or the normal. This may be the result of inadequate social training teaming up with inadequate intellectual rigor. But because I cannot do "nerd" well enough, that (to my subconscious) must mean I am "normal" and hence inferior.
↑ comment by Bound_up · 2017-12-29T14:08:11.319Z · LW(p) · GW(p)
It's dehumanizing according to nerd standards, but, then again, we're familiar with the kind of social status afforded nerds.
Which I don't mean in a "we can hit them 'cuz they hit us" sense, but merely to say that they don't think it's dehumanizing to think their way.
Normal people are the majority, after all; the ones who are political through and through are both the mob and its leaders, whereas nerds tend to be political about only... most things? A lot of things? Even if they're not very political by human standards, they're still quite political.
Rationalists are nerdier than nerds, in this sense, since they try to take the nerd mindset into everything, essentially undoing their political instincts (politics is the mind-killer?). Does that make them more or less human?
Well, I remember how political I used to get. I feel like I've improved; I feel that undoing my political instincts and overwriting them with cold nerd reason has been good, so, according to these standards, normal people who have even more political drive than I ever did are indeed worse off.
At the same time, nerds are often called cold and... inhuman, aren't they? Which is more human, nerdiness or politics? Well, normal people think it's that political nature that defines them as humans. In a sense, they're right. Nerds are trying to do instrumentally what would be correct for any species. Truth and (non-social) power would work as well for aliens as for humans, so you can't really say that nerdiness is an especially human quality; quite the opposite.
Why, then, do I feel better having nerdified myself? Well, it may be less human, but I feel it's more alive, more aware, more powerful. It's only since I've become a stronger kind of nerd that I've become powerful enough to understand why nerds have the disadvantages they do, how they come about, and (I'm working on it), how to overcome them and give nerds the best of both worlds. Learning this required becoming more nerdy, not less. Ironically, the result is that I now appear less nerdy to those normal people who only ever judged nerdiness according to social terms, because I'm undoing the tell-tale social signs of nerdiness by understanding and correcting them. So, I think it's great!
But, a lot of this learning has come from banging my head on the wall that is trying to communicate truth to normal people and realizing that they really don't care. Well, that makes us different. C'est la vie. They're not going to grant me any more social power on the basis of any of this stuff, but will do so only insofar as I learn to swim in their waters and speak their language. They, in contrast to me, think it's great to not do these things I'm so obsessed with.
As for them having no concept of truth, it's not quite as bad as all that; it's just that when we say "beliefs" or "truth," those are about social signals to them. They do have real nerd-beliefs about the weather and traffic and their jobs and so on; they have them wherever they need them to properly navigate the world, which is, to them, a social world. On the other hand, they don't have them about (almost) anything if having them would decrease their social power. They don't really care about economics (that shouldn't come as all that much of a surprise, should it? And it shouldn't sound like any great insult, either; I assure you they don't think it does; they might even take pride in not being interested in such an obviously dry, weird, nerdy subject), for all that it sounds like they do as they assure us that their ingroup's economic plan will produce well-being for all and can recite the party script as to how that should function (even if it contradicts itself).
↑ comment by Lukas Finnveden (Lanrian) · 2018-01-04T21:28:07.344Z · LW(p) · GW(p)
>while if you're a normal person reading this (haha, jk), you might think I'm awful hard on nerds (how can you say such mean things as that they don't care what others think and are incapable of properly expressing themselves?)
Testing this sounds worth doing. Intuitively, I think it's false. Caring too much about what other people think is in general a low status thing, while caring about the truth is a high status thing (if not particularly important).
↑ comment by Bound_up · 2017-12-30T09:34:34.438Z · LW(p) · GW(p)
Have you ever heard someone say "Don't you trust me?" And maybe you think "What's that supposed to mean? I basically trust you to act like you've acted in the past; in your case, that means I expect you to display behaviors X and Y with great consistency, and behavior Z with moderate consistency..."
I've done that a lot. "I trust you to do XYZ," I would say. But...even at the time, I had a nagging feeling that this wasn't really what they meant. This is what I (and other nerds) mean by trust, not what they mean.
What they mean by "trust" is, roughly, an expectation that someone will model your interests with pretty good accuracy and generally work to fulfill them. They will act as an agent seeking your fully general benefit.
So, "don't you trust me?" is basically asking "don't you think I more or less know what you want and will avoid hurting you, and also will help you as it is convenient, or sometimes even inconvenient for me to do so?"
They think of trust differently, and in their sense of the word, they can be perfectly trustworthy even while displaying the political behaviors that would make them, for example, poor scientists.
Now, you've probably always felt, as I did, that this question is never asked except when the expected answer is "yes." I have a sense that there is some significance here, probably revolving around the idea that any answer other than "yes" is an insult, for which you will be in their debt, while getting that "yes" is a way of getting your commitment, and thus your compliance... but I have a definite sense that I don't quite understand this completely yet.
At the same time, I think I see why my old "Well, I trust you to do XYZ" actually worked pretty well for me, even if it was by accident, out of obliviousness. It's not insulting at all, but it does get me out of committing to follow their lead generally, and thus, in the specific instance that they're probably trying to get me to help them out with.
↑ comment by iridium · 2017-12-30T20:05:39.582Z · LW(p) · GW(p)
What they mean by "trust" is, roughly, an expectation that someone will model your interests with pretty good accuracy and generally work to fulfill them. They will act as an agent seeking your fully general benefit.
That's what I was referring to. Not "do I trust you to do X?" but "are you my ally?"
In this model, if the set of things I need to get right is not the same as the set of things a given normal needs to get right, they may give me dangerously bad advice and have no notion that there's anything wrong with this. Someone who will happily mislead me is not a good ally.
↑ comment by Viliam · 2018-01-05T16:03:33.406Z · LW(p) · GW(p)
Someone who will happily mislead me is not a good ally.
They don't see it as "misleading". They are teaching you the socially approved reaction to a stimulus (but they obviously wouldn't use these words), which is exactly what a good ally in their world is supposed to do. Unfortunately, such precious gifts are wasted on nerds, who try to translate them into maps of territory instead of memorizing and repeating them as a part of social performance. From their point of view, they are cooperating with you... it's just that they play a completely different game.
I have a few friends among normies, but I usually don't go to them asking for advice about the real world (unless they happen to be domain experts at something). Normies can be a wonderful source of warm emotions; that's what their world is mostly about. Any factual statement needs to be triple checked (without telling them about it), though.
Note that I am not dismissing friendship with normies here. Warm emotions are important. And so is domain expertise, because I can't always go and ask a fellow rationalist about some obscure detail (also, nerds are prone to overconfidence in domains they lack expertise in; no, you can't replace tons of data with mere high IQ). But trying to bring normies along on your travels exploring the real world is an exercise in frustration.
comment by Kenny · 2018-01-31T12:25:19.216Z · LW(p) · GW(p)
This very much reminded me of this anecdote (which I'm pretty sure has been referenced in at least one LW post previously):
- Richard Feynman on education in Brazil
Key excerpt:
After a lot of investigation, I finally figured out that the students had memorized everything, but they didn’t know what anything meant. When they heard “light that is reflected from a medium with an index,” they didn’t know that it meant a material such as water. They didn’t know that the “direction of the light” is the direction in which you see something when you’re looking at it, and so on. Everything was entirely memorized, yet nothing had been translated into meaningful words. So if I asked, “What is Brewster’s Angle?” I’m going into the computer with the right keywords. But if I say, “Look at the water,” nothing happens – they don’t have anything under “Look at the water”!
comment by Chris_Leong · 2017-12-30T06:26:04.400Z · LW(p) · GW(p)
This seems very similar to Paul Graham's Why Nerds Are Unpopular. However, instead of discussing the conflict between truth and popularity, it discusses the (very similar) conflict between intelligence and popularity. It also explains how spending a lot of your time trying to learn one means that you have less time to learn the other.
comment by Elizabeth (pktechgirl) · 2017-12-29T15:36:49.916Z · LW(p) · GW(p)
Moved to front page
comment by alkjash · 2018-01-02T01:14:06.589Z · LW(p) · GW(p)
I think the problem isn't exactly that nerds care about reality and normals care about status games, but rather that different data structures are called for in different applications, and the reality vs. status games dichotomy is just one dimension of "different," and a secondary one at that. See my post Data Structures.