Hammertime Day 8: Sunk Cost Faith
post by alkjash · 2018-02-05T01:00:00.503Z · 20 comments
[Author’s note: I will be moving future Hammertimes to my personal page to avoid cluttering the frontpage. This one is sufficiently short and probably controversial to leave here.]
This is part 8 of 30 in the Hammertime Sequence. Click here for the intro.
It pains me to begin a post about planning with an announcement about two slight changes of plans for Hammertime:
First, I will be travelling the week after next, so there will be a week-and-a-half intermission between the first and second cycles.
Second, when I sat down to write a post about Focusing, I found myself unable to add anything productive to this excellent post: Focusing, for Skeptics. Focusing is probably the second most powerful technique I learned from CFAR, so I will return to it in future cycles after more thought.
Instead, I want to write three posts on planning. These are the first steps to becoming the kind of person who can make thoughtful long-term plans and follow through with them.
Day 8: Sunk Cost Faith
One of my main motivations ever since writing The Solitaire Principle is to solve the Control Problem in humans: the problem of making and following through with long-term plans and habits despite new information and, even worse, value drift. I propose that what is commonly known and vilified as the Sunk Cost Fallacy actually exists for a good reason and is a useful first-order solution to the Control Problem.
The Uncanny Valley of Sunk Cost
Related: Sunk Costs Fallacy Fallacy.
Here is the uncanny valley one falls into when one naively cancels out Sunk Cost Fallacy:
1. You suck at making plans, but follow through with them anyway. You get a moderate amount done by making overconfident and poorly-thought-out plans, and just doing them despite contradictory information.
2. One day, you learn about the Sunk Cost Fallacy. You decide to be a Good Rationalist and categorically abandon ship on projects that no longer appeal to you. You still suck at making plans. All your plans fail and you get nothing done.
3. Over time, you learn that you’re not the kind of person who can follow through with long-term projects. You jump from bright light to bright light, captured by the briefest caprice. You don’t remember what it’s like to be diachronic. Your time horizon shrinks and you stop bothering with plans at all.
There’s an extremely insidious demon hiding at Step 2, related to adverse selection. Over the course of a multi-year (or multi-day, for that matter) plan, all sorts of noisy information can arise. Imagine your valuation of the project as something like a Brownian motion: it bounces around on new information while slowly converging towards the “true value.”
If at any point the current valuation of the project randomly walks under the Worth It line, you’ll promptly give up the project.
Because of the noisiness of information, following the strategy of “give up whenever it falls below the Worth It line” will make you give up on many projects that actually turn out to be Worth It, just because a sufficiently long random walk will usually fall quite a bit below the mean at least once.
And this doesn’t even take into account all the motivated reasoning and the other reasons that shiny new projects lose momentum and peter out.
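To see the size of the adverse-selection effect, here is a minimal simulation sketch in Python. It is a toy model of my own, not anything formal from the argument above: independent Gaussian noise that shrinks as information accumulates stands in for the converging Brownian motion, and all the parameters are made up.

```python
import random

def simulate(true_value=1.0, noise=1.0, steps=200, worth_it_line=0.0, trials=10_000):
    """Fraction of genuinely Worth It projects abandoned under the rule
    'quit the moment the current valuation dips below the Worth It line'.

    At each step, the valuation is the true value plus Gaussian noise that
    shrinks as information accumulates, a simplified stand-in for the
    converging Brownian motion described above.
    """
    abandoned = 0
    for _ in range(trials):
        for t in range(1, steps + 1):
            valuation = true_value + random.gauss(0, noise / t ** 0.5)
            if valuation < worth_it_line:
                abandoned += 1
                break
    return abandoned / trials

# Every simulated project is truly Worth It (true_value > worth_it_line),
# yet the naive quit rule abandons a large fraction of them on noise alone.
print(f"abandoned {simulate():.1%} of genuinely Worth It projects")
```

With these made-up parameters, roughly 30% of projects get dropped, even though every single one of them is genuinely Worth It.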
Sunk Cost Faith
I think the Uncanny Valley above is a serious and common failure mode in the rationalist community, and one that happened to me.
My diagnosis is that one should not fix one’s Sunk Cost Fallacy without first learning to make strong, fault-tolerant plans, and that one cannot learn to make strong, fault-tolerant plans without the data that comes from following through on bad ones. Therefore, the first step towards becoming good at planning is restoring your Sunk Cost Fallacy and using it to follow through on bad plans. This move I dub Sunk Cost Faith – faith that your past self made good decisions. Faith, of course, because it’s entirely unjustified.
If you find yourself in the position described in the previous step, pick up your Sunk Cost Fallacy again and turn it into Sunk Cost Faith. Build yourself into a diachronic person. Follow through on your plans even after they no longer appeal to you. Expand your time horizon to the scale of months and years so that you’re the kind of person who can actually do things.
Once you can actually follow through with plans again, only then can you get better at planning. This will involve explicitly building into your plans defenses against the dark side of Sunk Cost Fallacy, defenses such as unambiguous ejector seats.
Faith in the Past
Today’s exercise is directed at people who find themselves giving up halfway on projects too frequently.
Pick a completely useless activity (be creative!) that takes about five minutes, and do it every day for a week with Yoda Timers.
Daily Challenge
Convince me that I’m wrong about the Sunk Cost Fallacy and that it’s actually just bad.
20 comments
comment by Qiaochu_Yuan · 2018-02-05T23:10:58.062Z
I think you're basically right, although there's also a large and important signaling component: it's in fact good to signal willingness to complete projects if you want people to work on projects with you.
Also, as I understand it, this was exactly C. S. Lewis' attitude towards faith. I don't have a convenient pull quote though.
↑ comment by alkjash · 2018-02-06T00:51:39.191Z
After I took the time to actually think about planning and motivation, I realized that there were a ton of glaringly obvious things I was missing. What you said, but also this thing:
If you're halfway done with a project and most of its value is in the finished product, then it's worth twice as much to finish it as to start a brand-new, equivalent project: the same payoff for half the remaining effort.
I can't believe I never considered this while hopping from half-finished thing to half-finished thing. This observation might be a sufficient antidote to the "dropping things halfway" bug that I don't even need Sunk Cost Faith.
↑ comment by Qiaochu_Yuan · 2018-02-06T18:19:02.700Z
In addition to just the value of having the finished project, there's also the value of learning about how to do the various stages of the project. I can imagine projects where most of the learning happens in finishing, e.g. putting on a stage play.
comment by servy · 2018-02-06T11:11:55.609Z
I'm curious about the "ejector seats" that you mention in this post and in the Day 1 post, which can help with time sinks and planning. While the other concepts seem familiar, I don't think I've heard of ejector seats before. My guess is that they're something like TAPs with the action of "abandoning the current project/activity". Looking forward to your Day 10 post on planning, which will hopefully have an in-depth explanation and best practices for building them.
Thanks for the sequence that focuses on instrumental everyday rationality.
↑ comment by alkjash · 2018-02-06T14:49:14.063Z
Thanks for asking! This is one of the topics I planned to cover in the later cycles.
Briefly, I use "ejector seats" to refer to building into your plan rules for when you're allowed to give up or modify it. For example, you can read Eight Studies on Excuses and try to develop and precommit to meta-rules for what kinds of excuses you accept from yourself, e.g. "Super Bowl Sunday is not a valid excuse for skipping leg day, but grandma's funeral is." In individual plans, you should take a moment to think about what kinds of happenings and new information you would allow to influence them, e.g. "I will make the preliminary decision to attend MIT, except if Harvard accepts me or another school gives much better financial aid; I have considered all the other factors already and will not think about changing my mind for them."
The point is that every time you break your own rules or plans without having pre-committed to allowing the excuse, you might be left with the nagging suspicion and guilt that you were looking for excuses to give up, or that you're not the kind of person who keeps promises. Ejector seats allow you to minimize the number of times this happens.
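To make this concrete, here is a minimal sketch of an ejector-seat check in code. It is purely illustrative, my own framing of the MIT example above; the class and the condition strings are made up.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Plan:
    """A plan plus its precommitted ejector seats: the only conditions
    under which abandoning or revising the plan counts as legitimate."""
    name: str
    ejector_seats: List[Callable[[str], bool]] = field(default_factory=list)

    def may_eject(self, event: str) -> bool:
        # A legitimate exit requires a precommitted condition to fire;
        # every other excuse is ignored, leaving no room for nagging guilt.
        return any(seat(event) for seat in self.ejector_seats)

mit = Plan(
    name="attend MIT",
    ejector_seats=[
        lambda e: e == "accepted to Harvard",
        lambda e: e == "much better financial aid elsewhere",
    ],
)

print(mit.may_eject("feeling unmotivated today"))  # False: not a valid excuse
print(mit.may_eject("accepted to Harvard"))        # True: precommitted ejector seat
```

The rigid equality checks are the point: the fewer judgment calls left for decision time, the less room for motivated reasoning.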
I will have to think and practice this a bit before writing about it, probably in the second cycle.
comment by [deleted] · 2018-02-05T19:57:06.957Z
I like the analogy you give for why sometimes "toughing it out" can be viable, especially if your estimates are going to be changing over time.
One specific area where I'm sure that the sunk cost fallacy is actually Just Bad, though, is finance and gambling, where it causes people to consistently underperform relative to just sticking to some simple heuristics.
comment by theme_arrow · 2021-01-10T00:24:50.242Z
Epistemic status: I was told to argue this position.
For a long-term project (say, finishing a PhD rather than mastering out), the true utility you'll derive from it is a random variable with some true mean and variance. Maybe finishing the PhD will take 8 years and you'll never get that TT position you dream of; maybe it'll take 5 and there will be a perfect job for you at the end. You can't know the true mean utility; your guess of the mean is an estimator, which is itself a random variable. I think your argument was that sometimes your estimate of the mean utility will be negative, but the true mean utility will be positive, so you should stick it out and see if things improve. But the inverse can also be true: you can have cases where your estimate of the mean is positive and the true mean is negative. So "stick with things even though your estimate of the mean utility is negative because it might actually be positive in reality" seems like as logical a rule as "drop things even though your estimate of the mean utility is positive because it might actually be negative."
comment by itavero · 2020-11-29T22:56:29.778Z
I think you're right and I appreciate this post a lot. You described a lot of experiences I've had but your random walk example is something I'll remember for a long time. Just as I should do more favors for my future self, I should have more faith in my past self.
"Finishing things" is a skill I would love to get better it. It's difficult because most of my internal motivation for projects come from the first 60% - 90% of it, but almost all of the social or external rewards come from finishing the thing (and the long tail of chores in the last 1% - 5%). So I end up doing a lot of "internally impressive" things, but having nothing to show for it (unreleased compositions, unused skills/knowledge, etc).
There are many benefits I'm missing out on from not having faith in past me and pushing through the last few steps. Thank you for making me think about that.
comment by Peter Berggren (peter-berggren) · 2024-12-29T22:42:50.953Z
It seems to me like the "random walk" case you described is poorly formed; the possibility of a project turning out to be worth it after all should already be factored into one's estimate of how "worth it" it is. If it isn't, then that's a problem of motivated reasoning, not a reason to have a sunk cost fallacy.
Intentionally inducing fallacious reasoning in oneself is classified as "Dark Arts" for a reason, especially since it can bias one's own assessment of how well it turns out and whether to continue doing it.
↑ comment by alkjash · 2024-12-30T02:25:56.677Z
I don't follow. As a project progresses it seems common to acquire new information and continuously update your valuation of the project.
↑ comment by Peter Berggren (peter-berggren) · 2024-12-30T02:34:05.342Z
Sorry if this is confusing. What I'm saying is, you have some estimate of the project's valuation, and this factors in the information that you expect to get in the future about the project's valuation (cf. Conservation of Expected Evidence). If there's some chance the project will turn out worthwhile, you know that chance already. But there must also be some counterbalancing chance that the project will turn out even less worthwhile than you think.
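To put the Conservation of Expected Evidence point in symbols (a standard gloss, not something from this exchange): if $V_t$ is your best estimate of the project's value given everything known at time $t$, then Bayesian updating makes the sequence of estimates a martingale,

$$\mathbb{E}[V_{t+1} \mid \text{information at time } t] = V_t,$$

so any anticipated chance of good news is already priced into $V_t$, exactly balanced by the corresponding chance of bad news.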
↑ comment by alkjash · 2024-12-30T14:16:50.603Z
I still don't understand. Your valuation of the project will still change over time as information actually gets revealed though. The probability the project will turn out worthwhile can fluctuate.
↑ comment by Peter Berggren (peter-berggren) · 2024-12-30T16:49:54.941Z
At any given point, you have some probability distribution over how worthwhile the project will be. The distribution can change over time, but it can change either for better or for worse. Therefore, at any point, if a rational agent expects it not to be worthwhile to expend the remaining effort to get the result, they should stop.
Of course, if you are irrational and intentionally fail to account for evidence as a way of getting out of work, this does not apply, but that's the problem then, not your lack of sunk costs.
↑ comment by alkjash · 2024-12-30T17:17:42.826Z
I don't disagree with what you're saying about theoretically rational agents. I think the content of my post was [there are a bunch of circumstances in which humans are systematically irrational; the sunk cost fallacy is, on net, a useful corrective heuristic in those circumstances; and attempting to make rational decisions via explicit legible calculations will in practice underperform just following the heuristic.]
To spell out a bit more, imagine my mood swings cause a large random error term to be added to all explicit calculations. Then if the decision process is to drop a project altogether at any point where my calculations say the project is doomed, then I will drop a lot of projects that are not actually doomed.
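To formalize that a bit (my own toy version of the error term, not a claim from the original post): suppose each explicit calculation reads $\hat{V} = V + \varepsilon$ with mood noise $\varepsilon \sim \mathcal{N}(0, \sigma^2)$, and the decision rule is "drop the project whenever $\hat{V} < 0$." Then a single check falsely condemns a genuinely good project ($V > 0$) with probability $\Phi(-V/\sigma)$, and over $n$ independent checks the project survives with probability only $(1 - \Phi(-V/\sigma))^n$, which decays geometrically even when $V$ is comfortably positive.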
↑ comment by Peter Berggren (peter-berggren) · 2024-12-30T18:14:36.787Z
I agree with you on this, but I don't think "sunk cost fallacy" is the right word to describe what you're saying. The rational behavior here is to factor the existence of a random error term resulting from mood swings into these calculations, and if you can't fully factor it in, then generally err on the side of keeping projects going. I understand "sunk cost fallacy" to mean "factoring the amount of effort already spent into these decisions," which does seem like a pure fallacy to me.
It's reasonable e.g. when about to watch a movie to say "I'm in a bad mood, I don't know how bad a mood I'm in, so even though I think the movie's not worth watching, I'll watch it anyway because I don't trust my assessment and I decided to watch it when in a calmer state of mind." Sunk cost fallacy is where you treat it differently if you bought yourself the tickets versus if they were given to you as a gift, which does seem, even in your apology for "sunk cost fallacy," to remain a fallacy.
comment by silentbob · 2020-10-21T06:32:01.466Z
I'd probably put it this way – the Sunk Cost Fallacy is Mostly Bad, but motivated reasoning may lead to frequent false positive detections of it when it's not actually relevant. There are two broad categories where sunk cost considerations come into play, a) cases where aborting a project feels really aversive because so much has gone into it already, and b) cases where on some level you really want to abort a project, e.g. because the fun part is over or your motivation has decreased over time. In type a cases, knowing about the fallacy is really useful. In type b cases, knowing about the fallacy is potentially harmful because it's yet another instrument to rationalize quitting an actually worthwhile project.
You can use a hammer to drive nails into walls, or you can use a hammer to hurt people. The sunk cost fallacy may be a "tool" with higher than usual risk of hurting yourself. This is probably a very blurry/grayscale distinction that varies a lot between individuals however, and not a clear cut one about this particular tool being bad. But I definitely agree it makes a lot of sense to talk about the drawbacks of that particular concept as there is an unusually clear failure mode involved (as described in the post).
comment by tcheasdfjkl · 2018-08-04T18:58:37.987Z
The "Focusing for Skeptics" post doesn't seem to exist at the link you gave anymore. Do you know where I can find it?
↑ comment by tcheasdfjkl · 2018-08-04T18:59:06.042Z
Same for "Diachronic Done Right", it turns out...
↑ comment by [deleted] · 2018-08-04T21:59:16.071Z
The original author had them removed. You can find quite a few of them now on Medium instead, here: https://medium.com/@ThingMaker
↑ comment by DoubleFelix · 2020-08-11T00:08:58.462Z
There doesn't seem to be any archive of this particular post, but the comments are intact here at least: https://www.greaterwrong.com/posts/G78CnAYMLbWEDKj6w/diachronic-done-right [LW · GW]
(you'll need to open this with copy/paste; LW is too clever and replaces the link's actual URL with lesswrong.com, which doesn't have the comments. Gotta open it on greaterwrong.)