[SEQ RERUN] Mandatory Secret Identities
post by MinibearRex · 2013-04-15T07:08:06.432Z · LW · GW · Legacy · 3 comments
Today's post, Mandatory Secret Identities, was originally published on 08 April 2009. A summary (taken from the LW wiki):
This post was not well-received, but the point was to suggest that a student must at some point leave the dojo and test their skills in the real world. The aspiration of an excellent student should not consist primarily of founding their own dojo and having their own students.
Discuss the post here (rather than in the comments to the original post).
This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Whining-Based Communities, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.
Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.
3 comments
Comments sorted by top scores.
comment by Viliam_Bur · 2013-04-15T10:06:01.669Z · LW(p) · GW(p)
Seems to me this post suggests including the outside view when determining how awesome a given rationalist is.
As a rationalist you are not required to admire everyone who is admired by the world (some of those just got lucky), but you also shouldn't admire people whose only success is, e.g., high karma on LessWrong.
More precisely, if someone has high karma on LessWrong, it is rational to admire their ability to get high karma on LessWrong, if you value that ability. (Imagine that someone promises you $1,000,000 if within a year you get the highest karma on LessWrong without using sockpuppet accounts or otherwise cheating. Then you would certainly want to study the strategies of high-karma users.) Just avoid the halo effect; don't assume that karma reflects that person's rationality beyond the karma-gaining itself.
Using an example outside LessWrong: if you know that Kiyosaki made a lot of money writing and selling books about getting rich, you should assume that Kiyosaki is an expert on writing and selling books about getting rich, but you should not automatically believe that his advice itself makes sense. He should, however, be your role model if you want to get rich by writing and selling books about getting rich.
If a teacher of rationality can impress a thousand students by teaching rationality, that per se only proves that the teacher has the ability to impress a thousand students; nothing else. To infer anything else about the teacher, you need further evidence.
↑ comment by falenas108 · 2013-04-15T15:39:59.662Z · LW(p) · GW(p)
On a related note, if the students of a rationality teacher routinely become more rational, this is evidence that the teacher can recognize and teach rationality to others.
This does not necessarily mean they are good at applying rationality in their everyday lives, which goes somewhat against Eliezer's point that someone respected as a good teacher should also be respected for what they do outside teaching.
So there should be two separate categories of respect here: one for the ability to teach and train others in rationality, and another for how well someone uses rationality in their own life.
comment by Petruchio · 2013-04-22T14:22:41.181Z · LW(p) · GW(p)
I have wondered what a Bayesian secret identity would look like. That is to say, I would like to use Bayes' theorem in my own life, day-to-day, but it seems impractical. Does anyone have any success stories about using Bayes or other rationalist techniques for the win in their career or personal life?
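To make the question concrete, here is a minimal sketch (not from the original thread) of what a single everyday Bayesian update looks like in a few lines of Python. All the probabilities are made-up illustrative estimates, chosen to echo Viliam_Bur's teacher example above.

```python
# A minimal sketch of a day-to-day Bayesian update.
# All numbers below are hypothetical estimates, not data from the post.

def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1 - prior))
    return p_evidence_given_h * prior / p_evidence

# Example question: "Is this rationality teacher genuinely skilled,
# given that they impressed a thousand students?"
prior = 0.1                  # base rate of genuinely skilled teachers (assumed)
p_impress_if_skilled = 0.9   # skilled teachers usually impress (assumed)
p_impress_if_not = 0.5       # charismatic-but-unskilled ones often do too (assumed)

print(posterior(prior, p_impress_if_skilled, p_impress_if_not))
# ~0.167: impressing students is only weak evidence of skill,
# which matches the point made earlier in this thread.
```

In practice the same update is easier to do mentally in odds form: multiply your prior odds (1:9) by the likelihood ratio (0.9/0.5 = 1.8), giving 1.8:9 odds, i.e. the same ~0.167 probability.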