Posts

Heuristics for preventing major life mistakes 2023-12-20T08:01:09.340Z
Is there a name for this bias? 2011-02-08T04:52:17.388Z
Link: Compare your moral values to the general population 2010-11-28T03:21:59.940Z

Comments

Comment by SK2 (lunchbox) on Scholarship: How to Do It Efficiently · 2011-05-14T16:40:59.405Z · LW · GW

CiteULike is quite nice for this.

Connotea is a similar "personal research library" service, but it doesn't let you store PDFs, just links to articles.

Comment by SK2 (lunchbox) on Optimal Employment · 2011-02-01T19:08:21.112Z · LW · GW

Even considering that, the 3% figure still seems wildly implausible. It would require something like 90% of the population thinking they pay 0% taxes and the remaining 10% thinking they pay 30% taxes (0.9 × 0% + 0.1 × 30% = 3%), and 30% is itself an underestimate.

The PDF that Louie linked to doesn't explain what the numbers mean. Surely there would be lots of articles about this epidemic of grossly underestimating taxes. Can anyone provide more evidence?

Comment by SK2 (lunchbox) on Statistical Prediction Rules Out-Perform Expert Human Judgments · 2011-01-19T20:44:42.502Z · LW · GW

This is a great article, but it only lists studies where SPRs have succeeded. In fairness, it would be good to know whether any studies showed SPRs failing (and to consider publication bias, etc.).

Comment by SK2 (lunchbox) on The Best Textbooks on Every Subject · 2011-01-17T04:41:05.668Z · LW · GW

Here is a very similar post on Ask Metafilter. (It is actually Ask Metafilter's most favorited post of all time.)

Comment by SK2 (lunchbox) on The Mathematics of Beauty [link] · 2011-01-16T00:48:44.214Z · LW · GW

Here's an insightful comment on the article:

http://www.reddit.com/r/math/comments/ezm6s/the_mathematics_of_beauty/c1c87ts

This is the same reason that, when shopping on Amazon, I ignore reviews from people who rated the product 1 or 5 stars: they often have an ulterior motive of damaging or boosting the product's image as much as possible.

Comment by SK2 (lunchbox) on Techniques for probability estimates · 2011-01-05T03:50:50.664Z · LW · GW

Related positions include operations research analysts and quants at finance firms.

Comment by SK2 (lunchbox) on Efficient Charity: Do Unto Others... · 2010-12-25T16:57:55.191Z · LW · GW

It's a useful exercise for aspiring economists and rationalists to dissect charity into the separate components of warm fuzzies and efficiency. However, maybe it's best for the general population not to be fully conscious that these are separate components, since the spirit of giving is like a frog: you can dissect it, but it dies in the process (an adaptation of an E.B. White quote).

Lemma: we want charity to be enjoyable, so that more people are motivated to do it. (Analogy: capitalist countries let rich people keep their riches, to create an incentive for economic growth, even though it might create more utility in the short term to tax rich people very highly.)

Consider this quote from the article:

If he went to the beach because he wanted the sunlight and the fresh air and the warm feeling of personally contributing to something, that's fine. If he actually wanted to help people by beautifying the beach, he's chosen an objectively wrong way to go about it.

Sure, but making the lawyer conscious of this will be a complete buzzkill. He will realize that he was unconsciously doing the act for selfish (and kind of silly) reasons. Your hope in telling him this is that he will instead opt to use his $1,000 salary to hire people, but I question whether he would actually follow through with that kind of giving in the long run, since his unconscious original motive was warm fuzzies, not efficiency. In effect, you may have prevented him from doing anything charitable at all. Don't let the perfect be the enemy of the good.

So, this article is great fodder for someone trained in rationalist/economic thought, but keep in mind that this type of thinking makes many people uneasy.

Comment by SK2 (lunchbox) on Smart people who are usually wrong · 2010-12-05T18:43:32.873Z · LW · GW

These people comment only on difficult, controversial issues which are selected as issues where people perform worse than random.

Relatedly, maybe they only comment when they have something original and unorthodox to say (selection bias). It's easy to echo conventional wisdom and be right most of the time; for a smart person, it's more exciting to challenge conventional wisdom, even if this carries a higher risk of being wrong. In other words, maybe they place a lower priority on karma points and a higher one on building their muscles for original thought.

Example 1: In my youth, I tried to hold only beliefs I could derive myself, rather than accepting what I was told. As a result, I held many unorthodox beliefs, many of which turned out to be wrong. Statistically, I would have had a better track record if I had just accepted the conventional view.

Example 2: Robin Hanson. I think he is wrong a lot of the time, but he also thinks for himself a lot more than I do, and has advanced human thought way more than I have. He could easily hold more conventional views and increase his accuracy, but I'm sure he finds the risk and challenge appealing.

Comment by SK2 (lunchbox) on Link: Compare your moral values to the general population · 2010-11-28T04:39:36.343Z · LW · GW

I had the same issue with the Schwartz test. It seems not to correct for people who rate everything high (or low).
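
One common correction for this (my sketch of a standard psychometric fix, not necessarily what the Schwartz test itself does) is ipsatization: center each respondent's ratings on their own mean, so that uniform high raters and uniform low raters become comparable. A minimal illustration in Python, with made-up numbers:

    # Ipsatization sketch: center each respondent's ratings on their own mean,
    # so a uniformly high rater and a uniformly low rater become comparable.
    ratings = {
        "high_rater": [9, 8, 9, 7, 8],  # hypothetical responses on a 1-9 scale
        "low_rater":  [5, 4, 5, 3, 4],
    }

    for name, scores in ratings.items():
        mean = sum(scores) / len(scores)
        centered = [round(s - mean, 1) for s in scores]
        print(name, centered)  # both print [0.8, -0.2, 0.8, -1.2, -0.2]

    # Identical centered profiles reveal identical relative value priorities,
    # even though the raw scores differ by a constant offset.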

Comment by SK2 (lunchbox) on Rationality Quotes: November 2010 · 2010-11-04T15:09:49.431Z · LW · GW

Talib Kweli is nonreligious, so I'm not changing the meaning of the quotation. "God" is often used poetically. Example:

"Subtle is the Lord, but malicious He is not."

Albert Einstein

Even if Kweli were religious, the point would not be to put words in his mouth but to reapply a beautiful quotation to another context where it is meaningful.

Comment by SK2 (lunchbox) on Rationality Quotes: November 2010 · 2010-11-04T07:13:54.149Z · LW · GW

All my confidence comes from knowing God's laws.

-- Talib Kweli (substitute "nature" for "God")

Comment by SK2 (lunchbox) on Open Thread: February 2010, part 2 · 2010-02-22T01:40:01.197Z · LW · GW

Thanks Nick. That paper looks very interesting.

Comment by SK2 (lunchbox) on Conversation Halters · 2010-02-21T21:03:55.163Z · LW · GW

Oops, yes, I misread the original post. Thanks for pointing that out.

Comment by SK2 (lunchbox) on Conversation Halters · 2010-02-21T00:06:09.250Z · LW · GW

The items on that list of appeals can also be ranked. According to mainstream US values, "Appeal to egalitarianism" trumps "Appeal to unquestionable authority", "Appeal to personal freedom" trumps "Appeal to egalitarianism", and so on. The standard political talk show debate consists of a back-and-forth escalation up this ladder.

For example, in a televised debate on regulation:

Person 1: "The National Bureau of Economics Research published a study showing conclusively that regulation of X is harmful" (authority)

Person 2: "Well, I don't care what the elite economists say; the poor are not getting equal access to X and that is unfair." (egalitarianism)

Person 1: "Sure, it's unequal, but if the government played big brother with X, that would violate our fundamental freedoms." (personal freedom)

Comment by SK2 (lunchbox) on Open Thread: February 2010, part 2 · 2010-02-20T23:09:25.055Z · LW · GW

Exercising "rational" self-control can be very unpleasant, therefore resulting in disutility.

Example 1: When I buy an interesting-looking book on Amazon, I can either have it shipped to me in 8 days for free or in 2 days for a few bucks. The naive rational thing to do is to select the free shipping, but you know what? That 8-day wait is more unpleasant than spending a few bucks.

Example 2: When I come home from the grocery store, I'm tempted to eat all the tastiest food first. It would be more "emotionally intelligent" to spread it out over the course of the week, but that requires a lot of unpleasant resistance to temptation. Also, the plain food seems more appealing when I'm hungry and it's the only thing in my fridge.

Of course, exercising restraint probably builds willpower, a good thing in the long run. But in some cases we should admit that our willpower is only so elastic, and that the most rational thing to do is to give in to our impulses.

What are some other seemingly "irrational" things we do that are in fact rational when we factor in the pleasantness of doing them?

Comment by SK2 (lunchbox) on Generalizing From One Example · 2010-02-17T06:50:05.334Z · LW · GW

I think clever people are especially susceptible to the belief that their perceptions are typical. Let's say you can't visualize images in your mind, but your coworker insists that he can. Since you're not a brain scientist, you can't verify whether he's right or whether he's just misinterpreted the question. However, the last few times you had a disagreement with him on a verifiable subject, you were vindicated by the facts, so you can only assume that you are right this time as well. Add to that the fact that people's stated perceptions and preferences are frequently dishonest (because of signaling), and it's very easy to mistrust them.

One useful first step to overcoming this bias is to compare one's results on a test like UVA's Moral Foundations Questionnaire to those of other segments of the population.

However, it's not enough to just learn the facts about how other people perceive the world; sometimes one has to experience them firsthand. I have always been an ambitious high achiever and used to get frustrated and confused by people who were not able to follow through with their goals. However, a few years back I had an adverse reaction to a medication, and experienced for a few hours what depression must be like. From then on, it all made perfect sense.

I wonder if one day it will be possible to alter my brain chemistry safely and temporarily so that I can experience what it is like to perceive the world as a conservative, a liberal, a luddite, a woman, a blue-collar worker, a depression sufferer, a jock, an artist, etc. The impact on my emotional maturity and ability to empathize would be tremendous.

Comment by SK2 (lunchbox) on Open Thread: February 2010, part 2 · 2010-02-17T05:01:38.992Z · LW · GW

Exactly.

http://en.wikipedia.org/wiki/Texas_sharpshooter_fallacy

Comment by SK2 (lunchbox) on Adaptive bias · 2010-01-26T06:30:45.941Z · LW · GW

I think this disagreement comes down to the definition of "bias", which Wikipedia defines as "a tendency or preference towards a particular perspective, ideology or result, when the tendency interferes with the ability to be impartial, unprejudiced, or objective." If a bias helps you make fewer errors, I would argue it's not a bias.

Maybe it is clearer if we speak of behaviors rather than biases. A given behavior (e.g. tendency to perceive what you were expecting to perceive) may make you more biased in certain contexts, and more rational in others. It might be advantageous to keep this behavior if it helps you more than it hurts you, but to the extent that you can identify the situations where the behavior causes errors, you should try to correct it.

Great audio clip, BTW.

Comment by SK2 (lunchbox) on Open Thread: January 2010 · 2010-01-11T01:29:38.951Z · LW · GW

How do people here consume Less Wrong? I just started reading and am looking for a good way to stay on top of posts and comments. Do you periodically check the website? Do you use an RSS feed? (which?) Or something else?

Comment by SK2 (lunchbox) on Test Your Calibration! · 2010-01-09T19:11:33.409Z · LW · GW

Imagine an experiment where we randomize subjects into two groups. All subjects are given a 20-question quiz that asks them to provide a confidence interval on the temperatures in various cities around the world on various dates in the past year. However, the cities and dates for group 1 are chosen at random, whereas the cities and dates for group 2 are chosen because they were record highs or lows.

This will result in two radically different estimates of overconfidence. The fact that the result of a calibration test depends heavily on the questions being asked should suggest that the methodology is problematic.

What this comes down to is: how do you estimate the probability that a question has an unexpected answer? See this quiz: maybe the quizzer is trying to trick you, maybe he's trying to reverse-trick you, or maybe he just chose his questions at random. It's a meaningless exercise because you're being asked to estimate values from an unknown distribution. The only rational thing to do is guess at random.

People taking a calibration test should first see the answers to a sample of the data set they will be tested on.
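
Here's a quick simulation of the two-group experiment above (a sketch with made-up numbers: temperatures drawn from a normal distribution, and every subject reporting the same well-calibrated 90% interval):

    import random

    random.seed(0)

    # Hypothetical population of daily temperatures (Fahrenheit).
    temps = [random.gauss(60, 15) for _ in range(100000)]

    # Suppose every subject reports a well-calibrated 90% interval for a
    # randomly chosen day: mean +/- 1.645 standard deviations.
    lo, hi = 60 - 1.645 * 15, 60 + 1.645 * 15

    def coverage(questions):
        return sum(lo <= t <= hi for t in questions) / len(questions)

    # Group 1: cities/dates chosen at random.
    group1 = random.sample(temps, 1000)

    # Group 2: cities/dates chosen because they were record highs or lows,
    # modeled here as the most extreme 1% of days.
    ordered = sorted(temps)
    group2 = ordered[:500] + ordered[-500:]

    print("random questions:  %.0f%% covered" % (100 * coverage(group1)))  # ~90%
    print("extreme questions: %.0f%% covered" % (100 * coverage(group2)))  # ~0%

The same perfectly calibrated subjects look fine on group 1's questions and wildly overconfident on group 2's.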

Comment by SK2 (lunchbox) on Test Your Calibration! · 2010-01-09T10:00:23.770Z · LW · GW

I have seen a problem with selection bias in calibration tests, where trick questions are overrepresented. For example, in this PDF article, the authors ask subjects to provide a 90% confidence interval estimating the number of employees IBM has. They find that fewer than 90% of subjects give a range containing the true value, which they conclude results from overconfidence. However, IBM has almost 400,000 employees, which is atypically high (more than 4x Microsoft). The results of this study have just as much to do with the question asked as with the overconfidence of the subjects.

Similarly, trivia questions are frequently (though not always) designed to have interesting/unintuitive answers, making them problematic for a calibration quiz where people are expecting straightforward questions. I don't know that to be the case for the AcceleratingFuture quizzes, but it is an issue in general.

Comment by SK2 (lunchbox) on Reason as memetic immune disorder · 2009-12-26T11:43:04.009Z · LW · GW

Another reason converts are more zealous than people who grew up with a religion is that conversion is a voluntary act, whereas being born into a religious family is not. Converting to a religion late in life is a radical move, one that generally requires a certain amount of zeal and motivation to begin with, so converts are pre-selected to be zealous.

Comment by SK2 (lunchbox) on Are these cognitive biases, biases? · 2009-12-26T10:01:14.797Z · LW · GW

Regarding the "Repent" example: as conformists, human beings are more likely to make particular decisions (like wear a "Repent" sign) if they believe others would do the same. So instead of framing this study as showing that "sign-wearing volunteers overestimate the probability others would volunteer", one could flip the implied causality and say "people who think others would volunteer are more likely to volunteer themselves", a much more banal claim. One could test the effect by re-running the experiment on self-identified nonconformists, or using behaviors for which conformity is not believed to play a big role. I predict the False Consensus Effect discovered in those settings would be much weaker.

The blue/red ball analogy is good food for thought, but there are way too many differences between it and the "Repent" study for the numerical similarity to be considered anything more than a coincidence. Our approximations of other people's behavior are much more elaborate than making a naive inference based on a sample of one.
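
For reference, here's what that naive sample-of-one inference would predict, assuming (my assumption about what the ball analogy amounts to) a uniform prior and Laplace's rule of succession:

    from fractions import Fraction

    # Laplace's rule of succession: after s successes in n draws,
    # estimate P(success) = (s + 1) / (n + 2).
    def rule_of_succession(successes, draws):
        return Fraction(successes + 1, draws + 2)

    # Your only data point is your own choice (n = 1).
    print(rule_of_succession(1, 1))  # 2/3: having volunteered, you guess ~67% would
    print(rule_of_succession(0, 1))  # 1/3: having refused, you guess ~33% would

Real people presumably bring in far more than this one data point, which is exactly why the numerical match should be treated as a coincidence.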

Comment by SK2 (lunchbox) on Doing your good deed for the day · 2009-12-26T03:11:48.233Z · LW · GW

I wonder how long-lasting this "quota" effect is. The study only looked at the immediate effects of moral behavior, not the more important long-term effects.

To make an analogy with physical exercise, maybe flexing your moral muscles exhausts your ability to be moral for the rest of the day, but when you wake up tomorrow your moral strength will be not only restored but actually strengthened. Most forms of exertion I can think of (e.g. learning, writing, working) work like this, so I wouldn't be surprised if the same held for doing good deeds.