The Core Tags
post by Ruby · 2020-04-22T01:23:34.757Z · 5 comments
Concepts and posts on the LessWrong site are organized according to six “core tags” which roughly segment the major clusters of content on the site. Users may wish to filter for or filter out these clusters.
The core tags:
- Will have complete coverage over all content, i.e., moderators will have decided for every post which of these tags apply.
- Are listed by default as filters when you open the tag settings on the Latest Posts list. Other tags can be used as filters, but you have to search for them.
The core tags are designed to be mostly mutually exclusive on posts, but they will overlap in some small percentage of cases.
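For intuition, here is a minimal sketch of what filtering on core tags could look like; the post representation and function are illustrative assumptions, not LessWrong's actual implementation:

```python
# Sketch of core-tag filtering; the data model here is illustrative only.
CORE_TAGS = {"Rationality", "AI Alignment", "World Modeling",
             "World Optimization", "Practical", "Community"}

posts = [
    {"title": "Betting as epistemic practice", "tags": {"Rationality"}},
    {"title": "Organizing your first meetup", "tags": {"Community", "Practical"}},
]

def filter_feed(posts, hidden):
    """Drop any post carrying a tag the user has chosen to hide."""
    return [p for p in posts if not (p["tags"] & set(hidden))]

# A user filtering out Community content sees only the first post.
for post in filter_feed(posts, hidden={"Community"}):
    print(post["title"])
```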
The Tags
Rationality
AI Alignment
World Modeling
World Optimization
Practical
Community
Common Confusions
Rationality vs AI
Both Rationality and AI are about minds – that's why AI is such a natural fit on LessWrong – and this causes many ideas that apply to AI to also apply to Rationality. Some posts might legitimately belong to both tags, but a good heuristic is that Rationality content should be of interest to a reader even if they're not interested in the design or engineering of artificial intelligences.
Rationality vs Practical
Both of these apply to topics that help people do better. The distinction is that we want the Rationality tag to be more exclusively about doing better by thinking better, in fancy words, by improving your cognitive algorithms. Practical, in contrast, is for all the object-level ways to do better, e.g. getting a good night's sleep and taking care of your health.
World Modeling vs The Rest
Almost all discussions on LessWrong concern models of the world; doesn't this tag apply to everything? Potentially, yes, but we wanted a name that applied to topics primarily driven by a broad curiosity about the world: the raw sciences and similar fields, with "How does it work?" as the foremost question. Science might have been an alternative name, but many topics we wanted to include, e.g. history, aren't implied by it.
In a way, the other core tags could be considered specialized sub-tags of World Modeling, and World Modeling is the catch-all for the rest.
World Optimization vs World Modeling
World Optimization is the tag intended to capture all discussion of topics directly relevant to trying to make the world a better place. That means discussion of altruistic causes, but also models of the things that are especially relevant to changing the world, e.g. incentive structures in large institutions. Many social models have felt appropriate in World Optimization for that reason.
World Optimization vs Practical
The difference between these two is largely scale. World Optimization concerns how we make the world at large better, while Practical covers topics relevant to individuals trying to make their local situation better. "Altruism vs Self-Help" isn't quite it but points in the correct direction.
5 comments
Comments sorted by top scores.
comment by abramdemski · 2020-07-09T19:02:14.620Z
I currently would like tags differentiating epistemic rationality from instrumental rationality. (I know I can just go ahead and create them, now, but the overlap with existing categories is sufficient that I'd rather talk it out a bit. The main reason I want these tags to exist is actually for use with the filtering feature, so I want there to be enough consensus for others to use them!) For example, my recent post about betting is clearly epistemic rationality. But without creating a tag, right now the closest I can come is to label it "epistemology" -- that doesn't seem right, yeah? It isn't theorizing about epistemics -- it's just a practical thingy about improving one's epistemics.
World Modeling / World Optimization seem like a nod to the epistemic/instrumental clusters, but again, they don't actually seem to be the thing I want to tag with -- World Modeling sounds like object-level world modeling, not practical, meta-level advice about how to world-model better.
Thoughts?
comment by habryka (habryka4) · 2020-07-09T20:23:27.759Z
My current model here is something like: "epistemic rationality" is an OK natural cluster, but there are still a bunch of quite fuzzy boundaries between epistemic and instrumental rationality that make me hesitant to just make one of the core tags "epistemic rationality". The most relevant set of stuff that feels like a core part of rationality and underlies a lot of Eliezer's writing, but doesn't really fit into either of those, is everything that has to do with motivations and figuring out what you care about, plus a bunch of stuff about the complicated interactions between truth-seeking and motivations.
I do think that "instrumental rationality" is not a particularly natural cluster and kind of only makes sense if you define it in contrast to epistemic rationality. Like, presumably everything that has some claim to usefulness is in some sense part of instrumental rationality. Learning math could be instrumental rationality since it might help you get a job; working out could be instrumental rationality since you might live longer; organizing meetups could be instrumental rationality since it might help you make friends and win more that way; etc.
My current sense of what I want out of the system is for the "Rationality" core tag to be mostly epistemic-rationality flavored, but to also include a bunch of stuff about motivation and being an embedded human, which is kind of a messy kludge of stuff where you just don't have a really nice map-and-territory divide. And for most instrumental rationality content to go into the more specific categories it belongs to: "Practical" for stuff that is immediately practically useful, "World Optimization" for stuff that is at a higher level about optimizing the world, and "World Modeling" for object-level insights that are important for acting in the world. My sense is that those other categories cover most of what I consider "instrumental rationality" and that on average they should take precedence over the Rationality category, which should be more narrowly about cognitive algorithms.
That said, having a non-core tag that tries to be something more pure, covering really just the slice of rationality that has nothing to do with motivations and goals and is more about abstract truth-seeking algorithms, could be a valuable addition. The main hesitation I have is that I am generally reluctant to create tags that will have many hundreds of posts in them, since it's hard to make sure such a tag gets updated whenever new posts come out, which makes it harder to use for frontpage filtering. (Imagine the world where we have 50 tags that all apply to a substantial fraction of posts; in that case I have to check, for every new post, whether it belongs to any of those 50 tags, which is a really expensive operation. Right now we only make sure the core tags have full coverage of new posts, so that you can reliably use those to filter your experience.)
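To make the coverage concern concrete, here is a toy sketch; the post representation and helper are illustrative assumptions, not LessWrong's actual moderation tooling:

```python
# Sketch: why full coverage is only feasible for a few core tags.
# The data model is illustrative, not LessWrong's actual one.
CORE_TAGS = {"Rationality", "AI Alignment", "World Modeling",
             "World Optimization", "Practical", "Community"}

def awaiting_review(posts, guaranteed_tags):
    """Posts a moderator still needs to review, approximated here as
    those with none of the guaranteed-coverage tags applied yet."""
    return [p for p in posts if not (p["tags"] & guaranteed_tags)]

# With 6 core tags, each new post needs one pass of review; if 50 broad
# tags all demanded the same coverage guarantee, every new post would
# need checking against all 50 before filtering on them was reliable.
```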
comment by abramdemski · 2020-07-09T20:49:16.717Z
Looking at things a bit more, maybe Practical vs Rationality is supposed to cover what I want instrumental rationality vs epistemic rationality to cover? But if so, I don't find those descriptions very intuitive and don't expect people to apply them correctly.
comment by jimrandomh · 2020-07-09T20:54:06.681Z
I think the combination of the Rationality and Practical tags gets pretty close to what you want, but to get all the way there you would also add the Motivations tag. I.e.,
Epistemic Rationality = Rationality & !Practical & !Motivations
Instrumental Rationality = Practical | Motivations
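As a minimal sketch, these combinations can be read as predicates over a post's set of tag names (the representation is illustrative, not how LessWrong stores tags):

```python
# Sketch: the tag combinations above as predicates over a set of tag names.
def is_epistemic_rationality(tags):
    # Epistemic Rationality = Rationality & !Practical & !Motivations
    return "Rationality" in tags and not ({"Practical", "Motivations"} & tags)

def is_instrumental_rationality(tags):
    # Instrumental Rationality = Practical | Motivations
    return bool({"Practical", "Motivations"} & tags)

print(is_epistemic_rationality({"Rationality"}))                  # True
print(is_instrumental_rationality({"Rationality", "Practical"}))  # True
```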
I think the real reason we didn't make Epistemic & Instrumental core tags was because when we tried tagging sample posts, too large a fraction of posts hit corner cases and failed to be well classified by the distinction.
comment by abramdemski · 2020-07-09T20:58:39.698Z
What's the idea behind "motivations"? I don't understand what your proposal is.
Edit: Ah, hadn't read Habryka's comment yet :p