What is Ra?
post by Chris_Leong · 2020-06-06T04:29:23.413Z · LW · GW

This is a question post.
Sarah Constantin coined "Ra" in this blog post, and it has become a frequently referenced concept within the rationalist sphere. However, I don't feel she provided a clear definition. How would you define it?
Answers
answer by pjeby

I thought the article provided a pretty clear definition: a preference for Mysterious, Prestigious, collective Authority over known, functional, individual capability.
Thank you for posting this, btw, because I hadn't actually heard of it before, and reading the article allowed me to finally make sense of a way that my mother treated me as a child, which I couldn't get my head around before. (It just seemed like she was demeaning me and my abilities personally, rather than simply having a semi-religious belief that no mere imperfect individual human could ever do something meaningful through their own actions, as opposed to through the divine authority of proper institutions.)
Oddly enough, I was actually trying to change a belief I'd picked up from her (that I can't do anything meaningful or important "for real") when I had the impulse to go look at LW and spotted your question, then read the article. It was just what I needed to wrap my head around the belief and remove it so that I don't get nervous when I get close to doing something meaningful "for real".
Indeed, what I found was that while I hadn't fully internalized her belief in Ra, I effectively picked up as a background assumption the idea that only certain blessed people are allowed to market themselves successfully or succeed in business in a big way, or write Proper Books... and that I'm not one of them.
So even though I am about as anti-Ra in philosophy as they get, I still had a Ra-like alief that made me feel inadequate compared to the Mysterious Authorities when I tried writing books or promoting my work too effectively. (Versus the very ineffective methods of doing both that I've been using for the past 14 years.) I'm very much looking forward to seeing what I can do when I don't have Ra-induced feelings of inadequacy dogging my steps.
↑ comment by Chris_Leong · 2020-06-06T21:55:05.662Z · LW(p) · GW(p)
Great to hear that this article helped you.
answer by Viliam

Ra is an emotional drive to idealize vagueness and despise clarity. It is a psychological mindset rather than rational self-interest; from inside, this cognitive corruption feels inherently desirable rather than merely useful.
Institutions become corrupted this way, as a result of people in positions of power exhibiting the same kind of bias. It is not a conspiracy, just a natural outcome of many people having the same preferences. It is not conformity, because those preferences already pointed in the specific direction. (The people would have the same preference even if it were a minority preference, although social approval probably makes them indulge in it more than they would have otherwise.)
This attitude is culturally coded as upper-class, probably because working-class people need to do specific tasks and receive direct feedback if they get an important detail wrong, while upper-class people can afford to be vague and delegate all details to their inferiors. (Also, people higher in the hierarchy are often shielded from the consequences of mistakes, which further reduces their incentive to understand the details. Thus the mistakes can freely grow to the level where they start interfering with the primary purpose of the institution. Even then the behavior is difficult to stop, because it is so distributed that firing a few key people would achieve no substantial change. And the people in a position to do the firing usually share the same attitude, so they couldn't correctly diagnose it as a source of the problem. But Ra is not limited to the domain of business.)
From inside, Ra means perceiving a mysterious perfection, which is awesome by being awesome. It has the generic markers of success, but nothing knowable beyond that. (If you can say that some thing is awesome because it does some specific X, that makes the thing less Ra.)
For example, an archetypally Ra corporation would be perceived as having lots of money and influence, and as hiring the smartest and most competent people in the world, but you wouldn't know what it actually does, other than that it is an important player in finance or technology or something similar. (Obviously, there must be someone in the corporation, perhaps the CEO, who has a better picture of what the corporation is actually doing. But that is only possible because the person is also Ra. It is not possible for an average mortal such as you to fully comprehend it.)
The famous Ra advertising template is: "X1. More than X." (It is important that you don't know how specifically it is "more" than the competing X's, which implies it contains more Ra.)
The Virtue of Narrowness [LW · GW] was written as an antidote against our natural tendencies towards Ra.
When people become attached to something that in their eyes embodies Ra, they are very frustrated by those who challenge their attitude. ("What horrible mental flaw could make this evil person criticize the awesomeness itself?" To them, disrespecting Ra does not feel like kicking a puppy, but rather like an attempt to remove all the puppy-ness from the universe, forever.) The frustrating behaviors include not only actively opposing the thing, but also ignoring it (an attack on its omni-importance) or trying to analyze it (an attack on its mysteriousness).
People under the strong influence of Ra hate being specific, communicating clearly, being authentic, exposing their preferences, and generally exposing anything about themselves. (If specific things about you are known, you cannot become Ra. You are stupid for throwing away this opportunity, and you are hostile if you try to make me do the same.) From the opposite perspective, authenticity and specificity are antidotes to Ra.
Seems to me that Ra is a desire to "become stronger" without any respect for the "merely real" and lots of wishful thinking. A superstimulus that makes the actual good feel like a pathetic failure.
(Tried to summarize the key parts of the original article, and add my own interpretation. It is not exactly a definition -- maybe the first paragraph could be considered one -- but at least it's shorter.)
answer by Rob Bensinger

From a January 2017 Facebook conversation:
Rob B: I gather Ra is to a first approximation just 'the sense that things are impersonally respectable / objective / authoritative / credible / prestigious / etc. based only on superficial indirect indicators of excellence.'
Ruby B: I too feel like I do not understand Ra. [...] Moloch, in my mind, was very clearly defined. For any given thing, I could tell you confidently whether it was Moloch or not. I can't do that with Ra. Also, Moloch is a single clear concept while Ra seems to be a vague cluster if it's anything. [...]
Rob B: Is there anything confusing or off about the idea that Ra is 'respectability and prestige maintained via surface-level correlates of useful/valuable things that are not themselves useful/valuable (in the context at hand)'? Either for making sense of Sarah's post or for applying the concept to real-world phenomena?
Ruby B: Yes, there is something off about that summary since the original post seems to contain a lot more than "seeking prestige via optimizing for correlates of value than actual value". [...] If your summary is at the heart of it, there are some links missing to the "hates introspection", "defends itself with vagueness, confusion, incoherence." [...]
Rob B: There are two ideas here:
(1) "a drive to seek prestige by optimizing for correlates of value that aren't themselves valuable"
(2) "a (particular) drive toward inconsistency / confusion / vagueness / ambiguity"
The connection between these two ideas is this paragraph in Sarah's essay:
"'Respectability' turns out to be incoherent quite often — i.e. if you have any consistent model of the world you often have to take extreme or novel positions as a logical conclusion from your assumptions. To Ra, disrespectability is damnation, and thus consistent thought is suspect."
(1) is the core idea that Sarah wants to point to when she says "Ra". (2) is a particular phenomenon that Sarah claims Ra tends to cause (though obviously lots of other things can cause fuzzy/inconsistent thinking too, and a drive toward such). Specifically, Sarah is defining Ra as (1), and then making the empirical claim that this is a commonplace drive, that pursuing any practical or intellectual project sufficiently consistently will at least occasionally require one to either sacrifice epistemics or sacrifice prestige, and that the drive is powerful enough that a lot of people do end up sacrificing epistemics when that conflict arises.

Ruby B: Okay, yeah, I can start to see that. Thanks for making it clearer to me, Rob!
Rob B: I think Sarah's essay is useful and coherent, but weirdly structured: she writes a bunch of poetry and mentions a bunch of accidental (and metaphorical, synesthetic, etc.) properties of Ra before she starts to delve into Ra's essential properties. I think part of why I didn't find it confusing was that I skimmed the early sections and got to the later parts of the essay that were more speaking-to-the-heart-of-the-issue, then read it back in reverse order. :P So I got to relatively clear things like the Horus (/ manifest usefulness / value / prestige-for-good-reasons) vs. Ra (empty respectability / shallow indicators of value / prestige-based-on-superficial-correlates-of-excellence) contrast first:
"Horus likes organization, clarity, intelligence, money, excellence, and power — and these things are genuinely valuable. If you want to accomplish big goals, it is perfectly rational to seek them, because they’re force multipliers. Pursuit of force multipliers — that is, pursuit of power — is not inherently Ra. There is nothing Ra-like, for instance, about noticing that software is a fully general force multiplier and trying to invest in or make better software. Ra comes in when you start admiring force multipliers for no specific goal, just because they’re shiny."
And:
"When someone is willing to work for prestige, but not merely for money or intrinsic interest, they’re being influenced by Ra. The love of prestige is not only about seeking 'status' (as it cashes out to things like high quality of life, admiration, sex), but about trying to be an insider within a prestigious institution."
(One of the key claims Sarah makes about respectability and prestige maintained via surface-level correlates of useful/valuable things that are not themselves useful/valuable (/ Ra) is that this kind of respectability accrues much more readily to institutions, organizations, and abstractions than to individuals. Thus a lot of the post is about how idealized abstractions and austere institutions trigger this lost-purposes [LW · GW]-of-prestige mindset more readily, which I gather is because it's harder to idealize something concrete and tangible and weak, like an individual person. Or maybe it has to do with the fact that it's harder to concretely visualize the proper function and work of something that's more abstract and large-scale, so it's easier to lose sight of the rationale for what you're seeing?)
"Seen through Ra-goggles, giving money to some particular man to spend on the causes he thinks best is weird and disturbing; putting money into a foundation, to exist in perpetuity, is respectable and appropriate. The impression that it is run collectively, by 'the institution' rather than any individual persons, makes it seem more Ra-like, and therefore more appealing."
All of that stuff makes sense. The earlier stuff from the first 2 sections of the post doesn't illuminate much, I think, unless you already have a more specific sense of what Sarah means by "Ra" from the later sections.

Ruby B: Your restructuring and rephrasing is vastly more comprehensible. That said, poetry and poetic imagery is nice and I don't begrudge Sarah her attempt.
And given your explanation, perhaps your summary description could be made slightly more comprehensive (though less comprehensible) like so: "Ra is a drive to seek prestige by optimizing for correlates of value that aren't themselves valuable because you have forgotten the point of the correlates was to attain actual value." [...]
Rob B: Maybe "Ra is a drive to seek prestige by optimizing for correlates of value, in contexts where the correlates are not themselves valuable but this fact is made non-obvious by the correlate's abstract/impersonal/far-mode-evoking nature"?
answer by Chris_Leong

I formed my own opinion at the start, but I didn't post it right away since I didn't want to bias other people into agreeing with me. I guess the way I'll answer this will be slightly different from the other answers, since I think the dynamics of the situation are more complex than an idealisation of vagueness. Pjeby seems hotter (closer) when they say it's a preference for mysterious, prestigious authority, but again I think we have to dive deeper.
I see Ra as a dynamic which tends to occur once an organisation has obtained a certain amount of status. At that point there is an incentive and a temptation to use that status to defend itself against criticism. One way of doing that is providing vague but extremely positive-sounding non-justifications for the things that it does, and using the status to prevent people from digging too deep. This works since there are often social reasons not to ask too many questions. If someone gives a talk, to keep asking follow-ups is to crowd out other people. People will often assume that someone who keeps hammering a point is an ideologue, or will simply lose interest. In any case, such questions can usually be answered with additional layers of vagueness.
This also reminds me of the concept of the hyperreal, or "realer than real". Organisations that utilise Ra become a simulation of a great organisation instead of the great organisation that they might once have been. By projecting this image of perfection they feel realer than any real great organisation, which will inevitably have its faults and hence inspire doubt.
↑ comment by pjeby · 2020-06-07T02:06:39.380Z · LW(p) · GW(p)
ISTM that's a result of worshipping Ra, rather than Ra-worship itself. Perhaps I am biased by my mother's example, but she was not a part of any mysterious organizations or their status incentives. She merely believed that Church, State, Schools, Companies, or other such Capitalized Entities had mystical powers to which mere human individuals could not aspire, unless they were assimilated into those institutions and thereby earned the blessing of said mystical powers.
AFAICT, this did not come from the type of organizational evolution and incentives that you're talking about; rather, this was simply a widely-held belief of hers that was largely independent of what competencies or institutions were being discussed. In her mind, ordinary humans couldn't do jack squat; anything an ordinary human did without an appropriate institutional blessing was merely an exception that didn't count the same as doing the thing "for real". It was, in her mind, the same as an actor pretending to be a priest not being able to actually forgive your sins or perform a marriage ceremony... just extended to everything that institutions or some sort of orthodoxy existed for.
So ISTM that the primary dynamic is that deification of the abstract offers a superstimulus that can't be matched by real, concrete, imperfect individuals, leading to worship of the abstraction in place of critical thinking or analysis. In effect, my mother was just doing the organizational/societal equivalent of people preferring their anime waifus or surgically-altered pornstars over real-life people. (IOW, removing details that imply imperfection or excess complexity is already a standard route to superstimulus in humans.)
↑ comment by Chris_Leong · 2020-06-08T08:38:06.572Z · LW(p) · GW(p)
Maybe I should have said that there are two sides to Ra - the institutional incentive, and the reason why people fall for this or (stronger) want it.
↑ comment by Viliam · 2020-06-07T14:33:26.419Z · LW(p) · GW(p)
Establishing an institution is a costly signal that there is a group of people committed to spending years of their lives working on some issue.
For example, the Machine Intelligence Research Institute gives me hope that if tomorrow Eliezer gets hit by a car, converts to Mormonism, or decides to spend the rest of his life writing fan fiction, the research will go on regardless. Which is a valuable thing.
But if you go too far along this direction, you get superstimuli. If MIRI is better than Eliezer's blog, then a Global Institute For Everything Important must be a million times better, and MIRI should be ashamed of competing with them for scarce resources.
Another problem is that creating an institution is a signal of commitment to the agenda, but prolonged existence of the institution is often just a signal of commitment to salaries.
Maybe you should just play along and rename Mind Hackers' Guild to, dunno, Institute for Mental Modification. Or something less Orwellian. :D
↑ comment by Jay · 2020-06-07T02:09:31.294Z · LW(p) · GW(p)
If you want to see Ra in its purest form, look to advertising. It's positive affect free of information. Olive Garden is not your family; not all who eat Doritos are bold. Ra is a tale told by an idiot, full of sound and fury, signifying nothing. It is also often encountered in celebrities and politics (what is Kim Kardashian famous for, exactly?).
The opposite of Ra is the question "What have you done for me lately?".
answer by mako yass

Over time, the concept of Ra settled in my head as... the spirit of collective narcissism, where narcissism must be understood as a delusional striving towards the impossible social security of being completely beyond criticism: to be flawless, perfect, unimprovable; to pursue Good Optics with such abandon as to mostly lose sight of whatever it was you were running from.
It leads to an org not being able to admit to most of its imperfections even internally; or, when they do admit to an imperfection internally, doing so resigns them to it, and they submit to it.
I don't like to define it as the celebration of vagueness; in my definition, that's just an entailment, something narcissism tends to do in order to hide.