Rationality outreach vs. rationality teaching
post by Lenmar · 2023-12-26T00:37:39.240Z · LW · GW · 2 comments
Epistemic status: Exploratory and in testing
I'm reasonably confident there are a lot of smart/curious people who would like to learn rationality, that is, how to think better and correlate the contents of one's mind more closely with reality.
Framing rationalist outreach as establishing branches of the LW community, Rationalist Clubs, Effective Altruist meetups, etc., may be effective in growing the community to some extent, but anyone who doesn't already think of themselves as a rationalist will come only from whatever tribe the local branch seems to mainly consist of, whether that's weird engineers, animal-welfare vegans, crypto nuts, secular Buddhists, etc.
And then of course there are the cases where people already assume that Effective Altruism is the thing Sam Bankman-Fried pretended to do before he stole all that money, and that LessWrong is that place where they talk about how AI will become an Evil Vaguely Judeo-Christian God who Tortures Us in the Future.
However, I am moving towards the conclusion that, detached from the tribal baggage, the majority of general-purpose debiasing tech/utilitarianism is not that inherently difficult to teach to smart/curious/motivated people, even those from non-LW-median tribes. It is not outrageously Deep Magic to consider that students learn arbitrary parroting instead of knowledge, and continue to think like that after they graduate [? · GW], or that people use their moral philosophies to feel like they agree and associate with their tribes [? · GW], and then go from there.
An old success case I found on my first search [LW · GW]. I have had a pretty decent success rate with leftist-tribe friends and acquaintances as well, and plan to continue testing. Of course, you need to establish a norm of genuine/abstract conversations first, but that's fun and useful to do anyway.
(And any large-scale societal rise in the sanity waterline will presumably involve normalizing concepts outside the community, not expanding the community to that scale, so it's a good time to start.)
2 comments
comment by Heron (jane-mccourt) · 2023-12-26T12:30:51.602Z · LW(p) · GW(p)
I agree that many of us outsiders would like to understand and utilise rationalist thinking. I did not, for example, notice the 'rationalist' take that 'AI will become Evil Vaguely Judeo-Christian God who Tortures Us in the Future'!