[SEQ RERUN] Just Lose Hope Already
post by Tyrrell_McAllister · 2011-05-01T21:08:42.324Z · LW · GW · Legacy · 15 comments
Today's post, Just Lose Hope Already, was originally published on 25 February 2007. A summary (taken from the LW wiki):
Casey Serin owes banks 2.2 million dollars after lying on mortgage applications in order to simultaneously buy 8 different houses in different states. The sad part is that he hasn't given up: he hasn't declared bankruptcy, and he just attempted to purchase another house. While this behavior seems merely stupid, it recalls Merton and Scholes of Long-Term Capital Management, who made 40% profits for three years and then lost it all when they overleveraged. Each profession has its own rules for success, which makes it seem unlikely that rationality helps greatly in life. Yet one of the greater skills is simply not being stupid, and rationality does help with that.
Discuss the post here (rather than in the comments to the original post).
This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Politics is the Mind-Killer, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.
Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.
15 comments
Comments sorted by top scores.
comment by endoself · 2011-05-02T02:31:06.167Z · LW(p) · GW(p)
From the comments:
Probability of success if you continue: small. Probability of success if you give up: zero.
Doug, that's exactly what people say to me when I challenge them on why they buy lottery tickets. "The chance of winning is tiny, but if I don't buy a ticket, the chance is zero."
I can't think of one single case in my experience when the argument "It has a small probability of success, but we should pursue it, because the probability if we don't try is zero" turned out to be a good idea. Typically it is an excuse not to confront the flaws of a plan that is just plain unripe. You know what happens when you try a strategy with a tiny probability of success? It fails, that's what happens.
--Eliezer Yudkowsky
This was a great addition and probably should have been in the post, so I'm reposting it here for everyone.
↑ comment by Plasmon · 2011-05-02T16:26:18.988Z · LW(p) · GW(p)
Is this ("It has a small probability of success, but we should pursue it, because the probability if we don't try is zero") not a standard pro-cryonics argument? Given a sufficiently large expected payoff, it seems perfectly valid ...
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2011-05-02T19:20:50.909Z · LW(p) · GW(p)
Cryonics should just work if everything we currently already believe about the brain is true and there are no surprises. It is not a small probability. It is the default mainline probability.
↑ comment by wedrifid · 2011-05-02T20:07:31.356Z · LW(p) · GW(p)
Cryonics should just work if everything we currently already believe about the brain is true and there are no surprises. It is not a small probability. It is the default mainline probability.
Cryonics being possible given advanced technology is the default mainline probability. But the probability of being revived given that you prepare to be cryo-preserved is not.
"My head remains in stasis in a facility that remains functional until such time as an agent in the future is willing and able to revive me" is not something that just happens. It could even be said to be a long shot. But the only shot available.
↑ comment by Plasmon · 2011-05-03T06:02:32.808Z · LW(p) · GW(p)
Cryonics being possible given advanced technology is the default mainline probability. But the probability of being revived given that you prepare to be cryo-preserved is not.
That's exactly what I meant. A lot of practical things can go wrong even if our beliefs about the brain are entirely correct. Rationality, Cryonics and Pascal's Wager gives a probability of 0.228, which is indeed not that improbable, but it is still less than 50%.
I conclude, then, that the supposedly useless heuristic described above
It has a small probability of success, but we should pursue it, because the probability if we don't try is zero
is useless only if the probability of success is very small.
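(The 0.228 figure in the linked post comes from chaining together the probabilities of everything that has to go right. As a rough illustration of that kind of calculation, a minimal sketch follows; the stage names and numbers are made-up placeholders, not the estimates from that post.)

```python
# Illustrative only: multiplying the probabilities of each stage that has to
# go right to get an overall chance of cryonics working for you.
# These stages and numbers are hypothetical placeholders, not the actual
# estimates from "Rationality, Cryonics and Pascal's Wager".
stages = {
    "preserved well enough at death": 0.8,
    "organization keeps you in stasis long enough": 0.6,
    "revival technology is eventually developed": 0.7,
    "someone is willing and able to revive you": 0.7,
}

p_success = 1.0
for stage, p in stages.items():
    p_success *= p
    print(f"after '{stage}': {p_success:.3f}")

print(f"overall probability of success: {p_success:.3f}")  # ~0.235 with these placeholder numbers
```

Even when each step looks fairly likely on its own, the product can easily fall well below 50%, which is Plasmon's point.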
↑ comment by endoself · 2011-05-02T18:41:25.258Z · LW(p) · GW(p)
No. In cryonics we do an explicit cost-benefit calculation in order to see whether we value it enough to spend money that could be used elsewhere. Eliezer is referring to the specific case where something is found to be far less likely than cryonics (which isn't that improbable) but is pursued anyway because the alternative has exactly zero benefit. Such situations almost always ignore some cost or benefit in order to rationalize a choice despite ~0 probability.
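(A minimal sketch of the explicit cost-benefit comparison endoself is describing; all numbers are hypothetical placeholders rather than anyone's actual estimates.)

```python
# Hedged sketch: an explicit expected-value comparison for a low-probability,
# high-payoff option. All numbers are hypothetical placeholders.
p_success = 0.23              # estimated probability the plan pays off
value_if_success = 5_000_000  # subjective value of the payoff, in dollars
cost = 80_000                 # what pursuing it actually costs you

expected_benefit = p_success * value_if_success
print(f"expected benefit: {expected_benefit:,.0f}")    # 1,150,000
print(f"worth pursuing:   {expected_benefit > cost}")  # True here, but only because
# the probability and the cost were actually written down and compared.
```

The failure mode Eliezer describes is skipping this comparison entirely and leaning on "the probability if we don't try is zero" instead.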
↑ comment by wedrifid · 2011-05-02T20:14:00.254Z · LW(p) · GW(p)
Probability of success if you continue: small. Probability of success if you give up: zero.
Doug, that's exactly what people say to me when I challenge them on why they buy lottery tickets. "The chance of winning is tiny, but if I don't buy a ticket, the chance is zero."
I can't think of one single case in my experience when the argument "It has a small probability of success, but we should pursue it, because the probability if we don't try is zero" turned out to be a good idea. Typically it is an excuse not to confront the flaws of a plan that is just plain unripe. You know what happens when you try a strategy with a tiny probability of success? It fails, that's what happens.
--Eliezer Yudkowsky
Shut up and do the impossible!
Doug was right, Eliezer was wrong, at least as the quote is stated. That is not to say that a heuristic of avoiding low-probability plans is not usually a good idea.
↑ comment by endoself · 2011-05-02T22:32:44.605Z · LW(p) · GW(p)
I think there is a distinction. In this case literally the entire argument is "It has a small probability of success, but we should pursue it, because the probability if we don't try is zero". It would be valid to justify a low probability with a high utility, but sometimes people just ignore or refuse to calculate probabilities because they believe that all alternatives are futile, even in the face of repeated counterevidence pushing the probability ever-lower. While such a situation is possible, beliefs of this type are far more likely to be caused by rationalization.
comment by beriukay · 2011-05-02T09:15:02.622Z · LW(p) · GW(p)
As it is, I appreciate this post for reminding us that quitting is sometimes the better option. It reminds me of a time when I was doing something or other, failed, and quit. A friend, trying to look on the bright side, said "Well, at least you gave up."
It would be nice if there were some kind of practical gauge provided in the post. We can easily identify this failure in others, but how do we know when we are guilty of it ourselves? How do we know when to give up?
Not wanting to pose questions without thought, I have a couple of possible ideas:
- If you can't find 10 people who think what you are doing is a good idea, it probably isn't. Hacker News pointed me to that one.
- If what you are doing causes you no end of soul-crushing despair, occupies a gigantic proportion of your thoughts, and still rates terrible returns (according to your self-report), then you should probably think about giving up soon.
- If you take some time to look at how your actions impact others, and find that they negatively impact someone at least half as much as they positively affect you, you should really think about quitting (see the sketch below). Doing the math here is probably intractable, but if you are doing anything that involves the equivalent of millions of dollars, you should really try to do the math.
That would be my first stab at this question.
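(For what it's worth, the third heuristic above is just a comparison between two self-reported quantities. A minimal sketch, with made-up inputs, might look like this.)

```python
# Sketch of the third heuristic: think hard about quitting if the harm your
# plan imposes on someone else is at least half the benefit it gives you.
# Both inputs are rough, self-reported utilities; the numbers are made up.
def should_consider_quitting(benefit_to_you: float, harm_to_others: float) -> bool:
    return harm_to_others >= 0.5 * benefit_to_you

print(should_consider_quitting(benefit_to_you=10.0, harm_to_others=6.0))  # True
print(should_consider_quitting(benefit_to_you=10.0, harm_to_others=2.0))  # False
```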
comment by Vladimir_M · 2011-05-02T05:34:15.418Z · LW(p) · GW(p)
One problem with this post, which is also seen in various other posts in the sequences and many LW posts in general, is the tendency to jump to the conclusion that people's behavior is irrational. Neither of these examples (Casey Serin and the LTCM people) strikes me as obviously irrational, in the sense that these people were clearly acting against their own interest to an unusual degree.
This is especially true of the LTCM example. While it might be possible that this venture was really a mistake for Merton, Scholes, and the others involved (from the perspective of their own self-interest), the way this conclusion is reached in the post and the idea that they would have something to learn from being lectured about this topic are just childishly naive.
↑ comment by JohnH · 2011-05-03T04:21:51.424Z · LW(p) · GW(p)
Agreed.
Some economists have moved away from the view that the Bayesian ultra-rational agent is the standard and that any deviation from it on the part of humans is a sign of irrationality. Now some think that humans are rational, and the trick is to figure out how behavior that appears irrational at first glance can be considered rational. This is because the ultra-rational-agent model didn't give very many useful results outside of gambling and playing stocks, and even there people have gotten burned by it.
Much of it comes down to limited information and variable time preferences.
↑ comment by Vladimir_M · 2011-05-05T16:37:15.147Z · LW(p) · GW(p)
I think you misunderstood my point. I meant that the behavior of these people may well have been rational, or at least not remarkably irrational, by the standard economic definition of the term, i.e. in the sense of advancing one's own self-interest.
Even if you're responsible for a failure with large total costs, this may still advance your self-interest if the benefit you derive from it is larger than the share of the costs you have to bear yourself (plus, of course, the future reputational and other indirect costs). It seems to me this may have been the case in both examples from the original post, so it's unjustified to parade them as obvious examples of irrational behavior.
comment by [deleted] · 2011-05-01T23:30:32.356Z · LW(p) · GW(p)
I think this post would have been a lot more interesting if Eliezer had used it to explain the sunk cost fallacy, which is generally the reason that people don't want to give up on ventures like this even if they've already lost a lot of money. Still, it was a decent post and I liked the fact that it was short and to the point.
↑ comment by Tyrrell_McAllister · 2011-05-02T00:14:39.788Z · LW(p) · GW(p)
the sunk cost fallacy, which is generally the reason that people don't want to give up on ventures like this even if they've already lost a lot of money.
Another important motivation is fear of loss of face.
comment by MinibearRex · 2011-05-02T05:16:36.415Z · LW(p) · GW(p)
I remember a situation from a few months ago when a female friend of mine was being romantically pursued by a guy she was not even slightly interested in. He had indicated to a few of her friends that he was going to ask her out, and they had told him that she was certainly going to say no: she liked someone else, she wasn't interested in him, etc. He responded by making a poker analogy and saying that if he didn't try, the probability of success was zero. They pointed out that the plan was not going to work and that he was going to come off looking really stupid. Don't forget to factor the negative penalties of plans not working into your expected utility calculations.
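(A minimal sketch of the expected-utility point in that last sentence, with purely illustrative numbers: the cost of the likely failure branch has to be counted alongside the payoff of the unlikely success branch.)

```python
# Illustrative only: an expected-utility calculation that includes the
# penalty for the plan failing, not just the payoff for it succeeding.
# All numbers are hypothetical.
p_success = 0.02
utility_success = 100.0   # she says yes
utility_failure = -30.0   # looking really stupid in front of mutual friends

eu_ask = p_success * utility_success + (1 - p_success) * utility_failure
eu_dont_ask = 0.0

print(f"EU(ask) = {eu_ask:.1f}")  # -27.4
print(eu_ask > eu_dont_ask)       # False: "the chance isn't zero" doesn't rescue the plan
```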