Should we stop using the term 'Rationalist'?

post by Bob Jacobs · 2020-05-29T15:11:18.329Z · LW · GW · 11 comments

This is a question post.

Contents

  Answers
    Richard_Kennaway
    romeostevensit
    Dagon
    lejuletre
    Rob Bensinger
    Viliam
    George
None
11 comments

This might be an unpopular opinion, but I really dislike the term 'rationalist' for four reasons:

1) It makes you sound self-aggrandizing. The term gives the impression that you think you are already rational and therefore smarter-than-thou.

2) It's already a term used to describe a different group. In fact, that way of using the term is not only way older, but also way more common.

3) The people it describes are not only interested in 'rationality', but also talk at length about things like AI, Bayes' theorem, Utilitarianism, etc.

4) We don't even agree on what it means [LW(p) · GW(p)] anymore. I'm not sure we ever did, but post-rationality has put the nail in the coffin.

I've toyed with introducing the term 'Aspirationalist', but maybe we should just split it into 'Lesswrongers', 'SSCers', 'Effective Altruists', etc.?

Answers

answer by Richard_Kennaway · 2020-06-02T16:55:14.843Z · LW(p) · GW(p)

"Rationalism" has the baggage of having meant the idea of finding truth by pure reason, without needing to look at the world.

"Empiricism" has the baggage of having meant the idea of finding truth just by looking at the world, without applying reason to discern its inner structures.

"Bayesianism" is far too narrow.

"Baconianism" might be close enough, but too obscure.

There does not appear to be any word that means "finding the truth by reason and observation, not separate from each other, but different aspects of a single method, as described in the Sequences", however many of the individual ideas there can be found in sources predating them.

comment by Bob Jacobs · 2020-06-02T18:03:31.552Z · LW(p) · GW(p)

Upvoted for introducing me to the term baconianism, even though it is a little bit off. We could do what every academic and their dog does when they find something they almost agree with and slap a 'neo-' in front of it to create e.g. neobaconianism. But if we are gonna invent new terms anyway, we might as well go with aspirationalist.

answer by romeostevensit · 2020-05-30T20:15:03.857Z · LW(p) · GW(p)

I have taken to calling myself a dilettante after someone called me that at an auspicious gathering of thought leaders. I don't actually know what it means but it sounds very French (that means sophisticated for those of you who don't know).

comment by Bob Jacobs · 2020-05-30T21:13:38.129Z · LW(p) · GW(p)

I speak French. It means 'amateur'.

comment by romeostevensit · 2020-05-30T21:27:02.170Z · LW(p) · GW(p)

Ah, a person who engages in a profession while unpaid. Why yes, I am also a philanthropist. I applaud my original complimenteur for noticing.

comment by Bob Jacobs · 2020-05-30T22:26:25.842Z · LW(p) · GW(p)

I really really like that you didn’t take that as an insult. I should really start worrying less about offending people on this website. I rewrote that answer like five times wondering if I should make it sound less harsh, but I didn’t and you remained upbeat. The faux-posh writing style also made me grin from ear to ear. I verily say unto thee, Taketh my upvote good sire.

comment by lsusr · 2020-05-31T10:25:02.745Z · LW(p) · GW(p)

I didn't notice the faux-posh style until you pointed it out. Thank you for bringing that to my attention.

comment by Richard_Kennaway · 2020-05-31T18:14:12.832Z · LW(p) · GW(p)

In English it means a particular kind of amateur: one without commitment, a dabbler, whose knowledge is merely superficial. "Amateur" is also used in the same sense, although it has not entirely lost the meaning of one who engages in something for the love of it, and may be (and occasionally is) the equal of a professional.

answer by Dagon · 2020-05-29T15:21:40.148Z · LW(p) · GW(p)

Who's "we"?

I don't use the term, and don't generally refer to the groups it might describe. When I need to describe something, I try to consider the specifics of what I'm trying to convey, and to whom - almost always there are better terms for whatever is under discussion. "philosophical techno-nerds" is my current go-to, but "LessWrong participants" is more precise.

[note: yes, that first question is intended to both be a dig at some assumptions AND as a pithy restatement of the primary question of community.]

comment by ChristianKl · 2020-05-29T20:44:13.568Z · LW(p) · GW(p)

The difference between people who I would call rationalists and "philosophical techno-nerds" in general is that for rationalists their rationality actually affects the way they act. Rationalists do a lot more sport than your average "philosophical techno-nerd".

comment by Dagon · 2020-05-29T22:42:39.284Z · LW(p) · GW(p)

Heh. We may know different slices of those who identify as Rationalists and those we're calling Philosophical techno-nerds. I agree that there's probably only about 80% overlap, but I'd say the variance in rigor and in behavior is pretty close to the same in the two groups, just with a wider variety of topics that can be too-narrowly analyzed in the PTNs.

comment by Bob Jacobs · 2020-05-29T15:40:56.820Z · LW(p) · GW(p)

I don't use it either (obviously, why would I use a term I actively dislike), so this question is actually aimed at the people who do. But saying 'You should stop calling yourself a rationalist' is coming on way too strong and I generally try to be nice. Since you already don't use it you're not really the target audience, but thanks for commenting anyway because if the term is actually only used by a minority, having that pointed out to them might help retire the term.

answer by lejuletre · 2020-05-29T16:31:49.831Z · LW(p) · GW(p)

I generally use "rationalist" as a short-hand catchall among people who will already know what i'm talking about, ie with my girlfriend or with ppl in ratsphere tumblr. i would never introduce myself to someone outside of the community that way, so maybe i'm also not the target audience for your question.

however, i feel like the minority of people who would self-identify as a "rationalist" to someone decidedly outgroup (hasn't heard of LessWrong/EY, isn't interested in EA, consequentialism, etc) is a different problem where the term itself isn't inherently the problem. people would probably be equally weirded out if you described yourself as a "utilitarian" or "effective altruist" just bc describing ourselves by our philosophies is not super common in the world-at-large.

i do really like the term aspirationalist tho. is it pronounced like aspir-rationalist or aspiration-alist ?

comment by Bob Jacobs · 2020-05-29T16:49:27.094Z · LW(p) · GW(p)

While I do think it's fine to use the term reactively with people who initiate it, I would still advise against using it proactively, because sometimes a person you think is ingroup is actually surprisingly outgroup. My local EA group has a surprising lack of LWers, and I would rather we slowly phase out or replace its usage, instead of continuing to use it and increasing the risk of causing unnecessary confusion amongst the pseudo-ingroup.

EDIT: Also, saying you're a utilitarian is indeed weird, but it is not actively causing confusion like 'rationalist' does, so I would honestly prefer that.

Since this is me throwing a compromise to the folk that do use rationalist, I would suggest pronouncing it like aspir-rationalist.

answer by Rob Bensinger · 2020-06-06T21:30:44.823Z · LW(p) · GW(p)

"LessWrongers" doesn't sound fancy and Latinate enough to be an intellectual movement. We need something like "error-reductionists".

  • Error-reductionism: the idea that error is inevitable, but we're trying to reduce how much. Probabilism, perpetual beta, and ambition/audacity/grit [LW · GW].
  • Error-reductionism: the idea that errors are reducible, i.e., explainable in terms of causes and parts such as cognitive biases and bad micro-habits.
  • Error-reductionism: philosophical reductionism (the world is physical, decomposable, lawful, understandable, not inherently mysterious or magical) combined with an error theory about non-reductionist ideas. We have Bayesianism as a principled (reductive) account of science; we don't need to call Thor mean names like "meaningless" or say he's in a separate magisterium from science. We're allowed to say those ideas were just wrong. We learn about the world by looking at the world and seeing what stuff happens and what methods work — not by applying a priori definitions of "what hypotheses sound sciencey to me".
comment by Bob Jacobs · 2020-06-06T21:57:12.243Z · LW(p) · GW(p)

Ooh I like that, although it is a bit long and it contains a hyphen. The vocalization is also going to be a bit awkward (too many r's in a row). We could shorten it to 'erreductionist', since 'to err' means the same thing, but you do lose some clarity.

answer by Viliam · 2020-05-30T23:07:00.123Z · LW(p) · GW(p)

Something like "Lesswrongers" would be okay for me; at least it is obvious for insiders what it refers to. (For outsiders, there will always be the inferential distance, no matter what label you choose.)

"Effective Altruists" is a specific group I am not a member of, although I sympathize with them. In my opinion, only people who donate (now, not in their imagined future) 10% of their income to EA charities should call themselves this.

"SSCers" on the other hand is too wide to be a meaningful identity, at least for me. It is definitely not a good replacement for what we currently call "rationalists".

About the objections 3 and 4 -- let's look at how we historically got where we are. There was a blog by Eliezer Yudkowsky about some ideas that he considered important to make a blog about. It felt like those ideas made a unified whole. Gradually the blog attracted people who were more interested in some subset and less interested in the rest, or in ideas that were related to some subset, etc., and thus the unity was gradually lost. We can still point towards the general values: having true knowledge about the nature of the world and ourselves, improving the world, improving ourselves by overcoming our cognitive shortcomings and generally becoming stronger, individually and also cooperating with each other. There are also people who like to hang out with the crowd despite not sharing all of these values.

comment by MikkW (mikkel-wilson) · 2020-05-31T03:58:56.682Z · LW(p) · GW(p)

EA is more than just giving: people who work careers based on EA principles have every right to call themselves EAs, even if they never donate a single penny.

comment by Viliam · 2020-05-31T21:13:04.233Z · LW(p) · GW(p)

I don't want to judge individual people, but it is my opinion that many people call themselves EAs although they shouldn't. This could become a problem in the future, if it becomes common knowledge that most "bailey EAs" are actually not "motte EAs".

If someone is developing a malaria vaccine, it sounds reasonable to consider them an EA even if they don't donate a penny, because their research can save millions of lives. If someone makes millions without donating, in order to reinvest and make billions, in which case they will donate the billions, it also makes sense to call them an EA (or perhaps "future EA").

But it is known that people's values often change as they age. For example, people who in their 20s believe they would sacrifice everything for Jesus (and sign abstinence pledges and whatever) can become atheists in their 30s. In the same way, it is completely plausible that people in their 20s sincerely believe they would totally donate 10% of their future income to EA causes (and sign pledges)... and change their opinion in their 30s when they start having an actual income. I am not saying this will happen to all student EAs, but I am saying it will happen to some. (I would expect the fraction to grow if EA becomes more popular in the mainstream, because this feels like something most normies would do without hesitation.)

Thus I am in favor of having a norm "you have to do something (more than merely self-identifying as an EA) to be actually called an EA". If it depended on me, the norm would be like "actually gives 10% of income, and the income is at least the local minimum wage". But I am not an EA, so I am just commenting on this as an outsider.

comment by Philip Dhingra (philip-dhingra) · 2020-06-06T14:17:03.015Z · LW(p) · GW(p)

+1 for "Lesswrongers" or "the LessWrong community"

A name for an emergent community is going to have to also be, well, emergent. But you can nudge that emergence in the direction you choose. I think LessWronger is the next natural candidate. I was introduced to a group once as a "LessWronger" even though, despite being an avid SSCer for 3 years, today is my first time posting or upvoting anything here. I've always been aware of LW, and the label would have been OK for me.

comment by Bob Jacobs · 2020-05-30T23:50:02.099Z · LW(p) · GW(p)

Ok, that second suggestion was not: let's call ourselves one of these three things (LW or SSC or EA); I suggested we drop 'rationalist' in general and split our community into (these and other) subcommunities. And I'm not sure I agree with you on some terminology either. I would call myself an Effective Altruist even though I don't donate 10% (I'm studying ethics to work for EA later), because I'm on the Giving What We Can pledge and I'm active in my local EA community.

And EY's blog was never as coherent as people say it was. But let's be extremely charitable and cut away all his other interests in AI, economics, etc. and only talk about: 1) having accurate beliefs and 2) making good decisions. For one, this is so vague it's almost meaningless, and secondly even that is not coherent, because those two things are in conflict. The first is the philosophy of realism and the second is pragmatism, two irreconcilable philosophies. I've always dropped realism in favor of pragmatism, and apparently that makes me a post-rationalist now? Do people realize that you can't always do both?

comment by Viliam · 2020-05-31T21:55:51.447Z · LW(p) · GW(p)

Commented on EA under sibling comment. Sorry, it wasn't meant as a personal attack, although it probably seems so. Sorry again.

From my perspective, the narrative behind the Sequences was like this: "The superhuman artificial intelligence could easily kill us all, for reasons that have nothing to do with Terminator movies, but instead are like Goodhart's law on steroids. It would require extraordinary work to create an intelligence that has human-compatible values and doesn't screw things up by accident. Such work would require smart people who have unconfused thinking about human values and intelligence. Unfortunately, even highly intelligent people get easily confused about important things. Here is why people are naturally so confused, and here is how to look at those important things properly. (Here is some fictional evidence about doing rationality better.)"

1) having accurate beliefs and 2) making good decisions. For one this is so vague its almost meaningless and secondly even that is not coherent because those two things are in conflict.

To me it seems that pragmatism without accurate beliefs is a bit like running across a minefield. You are so fast that you leave all the losers behind. Then something unexpected happens and you die. (Metaphorically speaking, unless you are Steve Jobs.) A certain fraction of people survives the minefield, and then books and movies are made celebrating their strategy; failing to mention the people who used the same strategy and died. To me it seems like an open question whether such strategy is actually better on average. (Though maybe this is just my ignorance speaking.)

In real life, many people who try to have accurate beliefs are failing, often for predictable reasons. So, maybe this whole project is indeed as doomed as you see it. But maybe there are other factors. For example, both "trying to have accurate beliefs" and "failing at life" could be statistical consequences of being on the autistic spectrum. In that case, if you already happen to be on the spectrum, you cannot get rid of the bad consequences by abandoning the desire to have accurate beliefs. Another possible angle is that "trying to have accurate beliefs" is most fruitful when you associate with people who have the same values. Most of human knowledge is a result of collaboration. In such case, creating a community of people who share these values is the right move.

I don't want to go too deep into "the true X has never been tried yet" territory, but to me LW-style rationality seems like a rather new project, which could possibly bring new fruit. (The predecessors in the same reference class are, I suppose, General semantics and Randian objectivism.) So maybe there is a path to success that doesn't involve self-deception. At least for myself, I don't see a better option. But this may be about my personality, so I don't want to generalize to other people. Actually, it seems like for most people, LW-style rationality is not an option.

I suppose my point is that Less Wrong philosophy -- the attempt to reconcile search for truth with winning at life -- is a meaningful project; although maybe only for some kinds of people (not meant as a value judgment, but: different personality types exist and different strategies work for them).

comment by Bob Jacobs · 2020-06-01T09:08:40.323Z · LW(p) · GW(p)
Commented on EA under sibling comment. Sorry, it wasn't meant as a personal attack, although it probably seems so. Sorry again.

It didn't, because you couldn't even if you wanted to. You don't know me personally, so why would I assume you were attacking me personally? I was merely trying to state a terminological disagreement in an attempt to change the reader's hidden inference [LW · GW].

To me it seems that pragmatism without accurate beliefs is a bit like running across a minefield.

This is not what philosophical pragmatism is about. With pragmatism you learn what is useful, which in 99.999% of cases will be the thing that's accurate. Note that I said:

Do people realize that you can’t always do both? [emphasis added]

But philosophy is all about the edge cases. What do you do when there is knowledge that is dangerous for humanity's survival? Do you learn things that are probably memetic hazards? Realism says 'yes', Pragmatism says 'no'. Pragmatism is about 'winning', realism is about 'truth'. If somehow you can show that these clearly opposed philosophies are actually reconcilable you will win all the philosophy awards. Until that time, I choose winning.

comment by Viliam · 2020-06-02T14:34:54.369Z · LW(p) · GW(p)

OK, thanks for the explanation. The part about avoiding memetic hazards... seems like a valuable thing to do, but it also seems to me that in practice most attempts to avoid memetic hazards have second-order effects. (Obvious counter-argument: if there are successful cases of avoiding memetic hazards that do not have side effects, I would probably not know about them. An important part of keeping a secret is never mentioning that there is a secret.)

But this would be a debate for another day. Maybe an entire field of research: how to communicate infohazards. (If you found it, there is a chance other people will, too. How can you decrease that probability without doing things that will likely blow back later?)

In the meanwhile, if in most cases the accurate thing is the useful thing, and if we don't know how to handle the remaining cases, I feel okay going for the accurate thing. (This is probably easier for me, because I personally don't do anything important on a large scale, so I don't have to worry about accidentally destroying humanity.)

answer by George · 2020-06-01T00:34:19.731Z · LW(p) · GW(p)

To address 2) specifically, I would say that philosophical "Rationalists" are a wider group, but they would generally include the kind of philosophical views that most people on e.g. LW hold, or at least they include a pathway to reaching those views.


See the philosophers listed in the wikipedia article, for example:

Pythagoras -- foundation for mathematical inquiry into the world and for the creation of mathematical formalism in general

Plato -- foundation for "modern" reasoning and logic in general, with a lot of ***s

Aristotle -- (outdated) foundation for observing the world and creating theories and taxonomies. The fact that he's mostly "wrong" about everything and the "wrongness" is obvious also gets you 1/2 of the way to understanding Kuhn

René Descartes -- "questioning" more fundamental assumptions that e.g. Socrates would have had problems seeing as assumptions. Also foundational for modern mathematics.

Baruch Spinoza -- I don't feel like I can summarize why reading Spinoza leads one to the LW brand of rationalism. I think it boils down to his obsession with internal consistency and his willingness to burn any bridge for the sake of reaching a "correct" conclusion.

Gottfried Leibniz -- I mean, personally, I hate this guy. But it seems to me that the interpretations of physics that I've seen around here, and also those that important people in the community (e.g. Eliezer and Scott) use, are heavily influenced by his work. Also arguably one of the earliest people to build computers and think about them, so there's that.

Immanuel Kant -- Arguably introduced the game-theoretical view of the world. Also helped correct/disprove a lot of biased reasoning in philosophy that leads to e.g. arguments for the existence of God based on linguistic quirks.


I think, at least with regard to philosophy up to Kant, if one were to read philosophy following this exact chain of philosophers, they would basically have a very strong base from which to approach/develop rationalist thought as seemingly espoused by LW.

So in that sense, the term "Rationalist" seems well-fitting if one wants to describe "the general philosophical direction" most people here are coming from.

comment by Protagoras · 2020-06-01T02:21:23.902Z · LW(p) · GW(p)

Looking at the listed philosophers is not the best way to understand what's going on here. The category of rationalists is not "philosophers like those guys," it is one of a pair of opposed categories (the other being the empiricists) into which various philosophers fit to varying degrees. It is less appropriate for the ancients than for Descartes, Spinoza, and Leibniz (those three are really the paradigm rationalists). And the wikipedia article is taking a controversial position in putting Kant in the rationalist category. Kant was aware of the categories (indeed, is a major source of the tradition of grouping philosophers into those two categories), and did not consider himself to belong to either of them (his preferred terms for the categories were "dogmatists" for the rationalists and "skeptics" for the empiricists, which is probably enough on its own to give you a sense for how he viewed the two groups). There is admittedly a popular line of Kant interpretation which reads him as a kind of crypto-rationalist, but there are also those of us who read him as a crypto-empiricist, and not a few who take him at his word as being outside both categories.

In any event, the empiricist tradition has at least as much influence, if not more, on the LW crowd as the rationalist tradition, and really both categories work best for the early moderns and aren't fantastic for categorizing most philosophers in the present era. So anybody familiar with the philosophical term is likely to find the application to this community initially confusing.

comment by Bob Jacobs · 2020-06-01T09:20:09.444Z · LW(p) · GW(p)

Great comment. I would just like to add that Kant killed/unified Empiricism and Rationalism, and after Kant the terms quickly started to fizzle out.

comment by Viliam · 2020-06-02T14:43:44.773Z · LW(p) · GW(p)

Seems like Kant killed it by naming it.

The Tao that can be named is not the eternal Tao;

because the later philosophers will call themselves "post-Taoists".

comment by Bob Jacobs · 2020-06-02T16:29:33.326Z · LW(p) · GW(p)

Cool quote, but in this case probably not accurate. From wikipedia:

The term became useful in order to describe differences perceived between two of its founders Francis Bacon, described as an "empiricist", and René Descartes, who is described as a "rationalist".

From etymonline:

Were I obliged to give a short name to the attitude in question, I should call it that of radical empiricism, in spite of the fact that such brief nicknames are nowhere more misleading than in philosophy. I say 'empiricism' because it is contented to regard its most assured conclusions concerning matters of fact as hypotheses liable to modification in the course of future experience; and I say 'radical,' because it treats the doctrine of monism itself as an hypothesis, and, unlike so much of the half way empiricism that is current under the name of positivism or agnosticism or scientific naturalism, it does not dogmatically affirm monism as something with which all experience has got to square. The difference between monism and pluralism is perhaps the most pregnant of all the differences in philosophy. [William James, preface to "The Sentiment of Rationality" in "The Will to Believe and Other Essays in Popular Philosophy," 1897]

EDIT: There is some debate as to when "modern" use of the term empiricism started, the first use was at least much much older. Stanford.edu writes:

The first people to describe themselves as empiricists (empeirikoi) were a group of medical writers of the Hellenistic period. We know of these thinkers only indirectly, through the work of other ancient writers, in particular Galen of Pergamon (129–ca. 200 CE).

EDIT 2: [emphasis added]

11 comments

Comments sorted by top scores.

comment by elityre · 2020-05-30T20:22:51.331Z · LW(p) · GW(p)

I just want to highlight that there are at least two separate things that one could mean by the word "rationalist".

The first is a practitioner of a method, or an aspirant to an ideal, of truth-seeking.

The second is a participant of a particular social cluster.

By the first definition, one might call many scientists or other intellectuals "rationalists" even if they never engage with, or in fact dislike, LessWrong and co.

My impression is that when Eliezer first wrote the sequences, he was using the word in the first sense, as in "how can we become better rationalists?" But, over time (unsurprisingly), it came to describe the social group that sprung up around the nucleus of those sequences.

In 2020, if most people have any association with the word "rationalist" at all, it is with either the philosophical school or the social group, because many people (say, my parents, or members of the SF tech industry) are not going to know much more about what it means to be a "rationalist" than "Oh. I know some people who are into that." So our label for a method / ideal naturally turns into a tribal marker.

I think one thing that would be really great is if there was some way to have terms for those two things, without having them inevitably smoosh together.

comment by Raemon · 2020-05-30T20:28:58.151Z · LW(p) · GW(p)

For the past year, I've used "rationalist" to mean "person who has made a serious study of truthseeking skills", and used "LessWrong folk" or "Berkeley Rationality Community" or other more specific names to refer to the second group.

comment by elityre · 2020-05-31T17:33:00.087Z · LW(p) · GW(p)

That sounds good, but also most outsiders are still going to refer to us as “the rationalists“.

Which is not to say that we can do anything about that, or that we ought to try and change how other people refer to the groups to which we belong.

comment by Raemon · 2020-05-31T21:10:15.506Z · LW(p) · GW(p)

I didn’t think of ‘what others call us’ as the topic of this post, and think it’s much harder to change. 

comment by elityre · 2020-06-01T04:17:56.418Z · LW(p) · GW(p)

Fair point.

comment by Raemon · 2020-05-29T23:10:53.654Z · LW(p) · GW(p)

"Aspirationalist" is actually... maybe surprisingly good as a word? Aspiring Rationalist was bad because it was too long, but I might actually be able to use Aspirationalist in conversation.

It has the downside of getting even more obviously abbreviated to "Aspie Rat", but, well, maybe that's fine. :P

comment by Bob Jacobs · 2020-05-30T08:00:58.228Z · LW(p) · GW(p)

Happy to hear you like it, though I wish I got some more reactions so I could take a stab at guessing the rates of approval/disapproval/ambivalence. I know this post got downvoted, but knowing percentages might give a clearer picture of whether or not we could switch. Any chance we could get a poll feature in the future? We used to have annual surveys, but they have stopped for some reason.

comment by elityre · 2020-05-30T20:10:53.939Z · LW(p) · GW(p)

You could make a poll?

comment by Bob Jacobs · 2020-05-30T21:33:22.720Z · LW(p) · GW(p)

For just this one thing? Seems a little overblown. Maybe I should make a medium-sized survey with some other questions I have. Since we are no longer doing annual surveys, maybe this could be an impromptu quarterly survey? Do you think people would be interested in that?

comment by hybridhuman · 2020-05-31T16:56:55.336Z · LW(p) · GW(p)

I'm pretty new here and can say that, as someone who had done a bit of philosophy at school before arriving, I was familiar with the definition of rationalism you link above but not with the rationalist community in the LessWrong sense.

Aware this is very anecdotal evidence, but thought it might be vaguely useful in some way.

I will, though, point out that you're effectively conflating two different questions here: "What would it be easier for other people to call us, or how should we self-identify as members of a certain community in the public at large?" and "How should we self-refer within the community?" The answers to these questions do overlap, but the factors to consider when arriving at each answer are markedly different.

comment by Bob Jacobs · 2020-06-01T09:13:20.203Z · LW(p) · GW(p)

Yes, you are technically correct, but in this case (as with most cases) I would like terminology to be universal. E.g. I hate it that philosophy, economics and sociology sometimes discover the same things but then name them differently. I would like the term the ingroup uses to be the same as the one the outgroup uses. As Raemon said:

I didn’t think of ‘what others call us’ as the topic of this post, and think it’s much harder to change.