Sunk Cost Fallacy
post by Z_M_Davis · 2009-04-12T17:30:52.592Z · LW · GW · Legacy · 44 comments
Related to: Just Lose Hope Already, The Allais Paradox, Cached Selves
In economics we have this concept of sunk costs, referring to costs that have already been incurred, but which cannot be recouped. Sunk cost fallacy refers to the fallacy of honoring sunk costs, which decision-theoretically should just be ignored. The canonical example goes something like this: you have purchased a nonrefundable movie ticket in advance. (For the nitpickers in the audience, I will also specify that the ticket is nontransferable and that you weren't planning on meeting anyone.) When the night of the show comes, you notice that you don't actually feel like going out, and would actually enjoy yourself more at home. Do you go to the movie anyway?
A lot of people say yes, to avoid wasting the ticket. But on further consideration, it would seem that these people are simply getting it wrong. The ticket is a sunk cost: it's already paid for, and you can't do anything with it but go to the movie. But we've stipulated that you don't want to go to the movie. The theater owners don't care whether you go; they already have their money. The other theater-goers, insofar as they can be said to have a preference, would actually rather you stayed home, making the theater marginally less crowded. If you go to the movie to satisfy your intuition about not wasting the ticket, you're not actually helping anyone. Of course, you're entitled to your values, if not your beliefs. If you really do place terminal value on using something because you've paid for it, well, fine, I guess. But we should all try to notice exactly what it is we're doing, in case it turns out to not be what we want. Please, think it through.
Dearest reader, if you're now about to scrap your intuition against wasting things, I implore you: don't! The moral of the parable of the movie ticket is not that waste is okay; it's that you should implement your waste-reduction interventions at a time when they can actually help. If you can anticipate your enthusiasm waning on the night of the show, don't purchase the nonrefundable ticket in the first place!
You can view ignoring sunk costs as a sort of backwards perspective on the principle of the bottom line. The bottom line tells us that a decision can only be justified by its true causes; any arguments that come strictly afterwards don't count; if it just happens to all turn out for the best anyway, that only means you got lucky. The sunk cost fallacy tells us that a decision can only be justified by its immediate true causes; any arguments considered in the past but subsequently dismissed don't count; if you could have seen it coming, why didn't you?
Another possible takeaway: perhaps don't be so afraid to behave inconsistently. Rational behavior may be consistent, but that doesn't mean you can be more rational simply by being more consistent. (Compare with the argument against majoritarianism: the Aumann results guarantee that Bayesians would agree, but that doesn't mean we can be more Bayesian simply by agreeing.) Overcoming Bias commenter John suggests that you go so far as to pretend that you've just been dropped into your current life with no warning. It may be disturbing to even consider such a radical discontinuity from your past—but you can consider something hypothetically, without necessarily having to believe or act on it in any way. And if, on reflection, it turned out that your entire life up to now was a complete waste, well, wouldn't you want to know about it?—and do something about it?
Decision theory is local. Don't be afraid to ask of your methodology: "What have you done for me lately?"
44 comments
Comments sorted by top scores.
comment by dreeves · 2009-04-13T04:20:11.415Z · LW(p) · GW(p)
It's easy to make up excuses for why it might still be rational to go to the movie. Here's how to factor all that out and cut to the real issue:
Scenario 1: You bought a $10 non-refundable ticket to a show. (And note that you definitely would not have done so if the show cost $20.) As you get to the theater you realize you lost your ticket. Luckily, they have more available, still at $10. Do you buy another ticket?
Scenario 2: You didn't buy a ticket ahead of time. As you get to the theater you realize that $10 has fallen out of your pocket and is lost. Luckily, you still have enough to buy a ticket. Do you do so?
Everyone agrees on Scenario 2: of course you buy the ticket. No one's on such a tight budget that an unexpected $10 change in wealth changes their utility for theater.
But many people refuse (I've checked) to see that Scenario 1 is fully equivalent. They can't bear to pay another $10 for a show they already paid $10 for. If Scenarios 1 and 2 don't feel fully equivalent, you're probably suffering from the sunk cost fallacy!
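To make the equivalence concrete, here is a minimal sketch (the dollar figures and the `value_of_show_to_you` parameter are invented purely for illustration). In both scenarios you arrive at the theater $10 poorer than expected, and the only live question is whether the show is worth $10 to you right now; the decision rule never needs to look at how the $10 was lost.

```python
TICKET_PRICE = 10

def should_buy_ticket(value_of_show_to_you: float) -> bool:
    # Decide based only on the forward-looking comparison: price vs. value now.
    return value_of_show_to_you >= TICKET_PRICE

# Scenario 1: bought a ticket earlier, then lost it at the door.
history_1 = {"spent_so_far": 10, "how": "lost ticket"}
# Scenario 2: never bought a ticket, but a $10 bill fell out of your pocket.
history_2 = {"spent_so_far": 10, "how": "lost cash"}

# The decision function never looks at the history at all:
for history in (history_1, history_2):
    print(history["how"], "->", should_buy_ticket(value_of_show_to_you=15))
```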
Replies from: None, CronoDAS, AnnaSalamon, MrHen
↑ comment by [deleted] · 2014-11-10T14:56:09.783Z · LW(p) · GW(p)
Agreed! I only want to add a reference to Tversky and Kahneman's original study, which is what you are describing: The Framing of Decisions and the Psychology of Choice. The $10 experiment is on page 457.
He talks about how this is caused by people creating different mental accounts. However, Kahneman writes in "Thinking, Fast and Slow" that it is more rational to keep only one mental account for all incomes and expenses. In that case people wouldn't make the mistake of keeping savings in one account and credit card debt in another.
↑ comment by AnnaSalamon · 2011-05-20T06:02:48.010Z · LW(p) · GW(p)
If Scenarios 1 and 2 don't feel fully equivalent, you're probably suffering from the sunk cost fallacy!
There's at least one other common reason why scenarios 1 and 2 may feel non-equivalent, besides the sunk costs fallacy: you may maintain different mental bank accounts for different goals, and so the $10 in scenarios 1 and 2 may be debited from different bank accounts.
For example, if you budget yourself $10/day for recreation, in scenario 1 you would have used up your recreational budget for the day, while in scenario 2 you would not.
Even if you do not maintain separate explicit budgets for e.g. fun vs. productivity vs. do-gooding, you may operate approximately, internally, by allowing each of your goals a certain approximate share of time/money/etc. with which to optimize for what it wants. And so you might still have an intuitive feeling that you'd spent enough on recreation today in scenario 1, and not in scenario 2.
My non-confident guess is that it's healthy to have different implicit or explicit budgets for different goals.
↑ comment by MrHen · 2009-04-14T15:12:18.447Z · LW(p) · GW(p)
The reason scenario 1 matches scenario 2 is that a ticket is worth $10 but extra tickets are worth nothing.
The state diagram:
- Initial state: a ticket is worth $10 and you buy one
- Purchased state: further tickets are now worth $0 but you lose yours
- Lost state: now that you have no ticket, tickets are worth $10 again
The lost state and initial state are equivalent.
Replies from: dreeves
↑ comment by dreeves · 2009-04-15T14:36:23.249Z · LW(p) · GW(p)
I'm not sure if we're saying the same thing but I think the reason they're equivalent is just that the cost of the first ticket is sunk so in both cases you're $10 poorer and are faced with the decision of whether to spend $10 on the show.
By the way, there are two ways to fall prey to the sunk cost fallacy: In the original post the problem is throwing good money (or effort/energy) after bad. In the lost ticket scenario the problem is refusing to throw good money after bad. In general, the problem is being influenced in either direction by money/effort that is spent and unrecoverable.
In examples like the one in the original post, I will ask myself "would I go to see this show (or whatever) right now if it were free?". I've actually seen people hyper-correct for the sunk cost fallacy and ask themselves "do I still think this is worth $X?". The point is to make your decision now as if the cost had never happened, hence "sunk cost".
Replies from: MrHen
↑ comment by MrHen · 2009-04-15T15:28:12.676Z · LW(p) · GW(p)
I'm not sure if we're saying the same thing but I think the reason they're equivalent is just that the cost of the first ticket is sunk so in both cases you're $10 poorer and are faced with the decision of whether to spend $10 on the show.
Yeah, I think we are saying the same thing. History is irrelevant when determining the worth of a movie ticket. I just mentally represent it as a state diagram instead of worrying about whether the ticket lost was a sunk cost in order to avoid a fallacy.
By the way, there are two ways to fall prey to the sunk cost fallacy. [...] In examples like in the original post, I will ask myself "would I go to see this show (or whatever) right now if it were free?".
Right, and I think your question is the valid point.
For what it is worth, the state diagram for the first example would shift the worth from the movie ticket to watching the movie itself:
- Initial state: A ticket costs $10 and allows admission to a movie. If you predict watching the movie will be worth $10, buy the ticket.
- Purchased state: The night of the movie, since you have a ticket, you can watch a movie without paying for a ticket. If it is worth watching the movie, use the ticket.
By the way, the reason I use state diagrams is that I arrive at the "purchased state" if someone else gives me a ticket. If someone gives me a ticket to a movie, am I obligated to use it? Ignoring any social concerns, the answer is no.
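One way to render the state diagram above as code, a rough sketch only (the function names and dollar values are illustrative, not anything from the original comment): the point is that the decision rule in the purchased state never consults how the ticket was obtained.

```python
TICKET_PRICE = 10

def decide_to_buy(predicted_value_of_watching: float) -> bool:
    # Initial state: no ticket yet. Buying is worth it only if the predicted
    # value of watching the movie covers the ticket price.
    return predicted_value_of_watching >= TICKET_PRICE

def decide_to_use(current_value_of_watching: float) -> bool:
    # Purchased state: a ticket is in hand (bought, gifted, or found), so
    # admission is effectively free. Use it iff watching has positive value
    # to you tonight.
    return current_value_of_watching > 0

# The same rule applies whether you paid for the ticket or were given it:
print(decide_to_use(current_value_of_watching=-3))  # False: stay home
print(decide_to_use(current_value_of_watching=4))   # True: go
```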
comment by Daniel_Burfoot · 2009-04-13T14:16:10.834Z · LW(p) · GW(p)
I assert that the sunk cost "fallacy" is actually a quite sophisticated mechanism of human reasoning. People who take sunk costs into account in everyday decisions will make better decisions on average.
My argument relies on the proposition that a person's estimate of his own utility function is highly noisy. In other words, you don't really know if going to the movie will make you happy or not, until you actually do it.
So if you're in this movie-going situation, then you have at least two pieces of data. Your current self has produced an estimate that says the utility of going to the movie is negative. But your former self produced an estimate that says the utility is substantially positive - enough so that he was willing to fork over $10. So maybe you average out the estimates: if you currently value the movie at -$5, then the average value is still positive and you should go. The real question is how confident you are in your current estimate, and whether that confidence is justified by real new information.
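As a sketch of the averaging idea (the numbers and noise levels are invented for illustration), one could treat each self's utility estimate as a noisy measurement and combine them with inverse-variance weights; whether your current confidence is justified then becomes a question about the relative variances.

```python
def combine(estimates):
    # Precision-weighted average of (estimate, variance) pairs.
    weights = [1.0 / var for _, var in estimates]
    total = sum(w * est for (est, _), w in zip(estimates, weights))
    return total / sum(weights)

past_self = (10.0, 25.0)     # was willing to pay $10; treated as a noisy reading
present_self = (-5.0, 25.0)  # tonight's estimate, assumed equally noisy

print(combine([past_self, present_self]))  # 2.5 -> still positive, so go

# If tonight's estimate really does rest on new information (you're ill, the
# reviews are out), its variance should be much lower, and it dominates:
print(combine([past_self, (-5.0, 4.0)]))   # about -2.9 -> stay home
```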
Replies from: Yasser_Elassal, drethelin
↑ comment by Yasser_Elassal · 2009-04-14T19:56:30.940Z · LW(p) · GW(p)
Your utility estimates at any given time should already take into account all of the data available to you at that time, including your previous estimates.
In other words, if you decide you don't want to go to a movie you've already purchased a ticket for, that decision has already been influenced by the knowledge that you did want to go to the movie at some point, so there's no reason to slide your estimate again.
Replies from: DanielLC
↑ comment by drethelin · 2011-05-09T18:27:38.255Z · LW(p) · GW(p)
You give no examples of situations in which you actually come out ahead by taking sunk costs into account, and your justification for doing so amounts to no more than the fallacy itself: "You thought at some point this was a good idea, so you should probably keep doing it."
New information should not be averaged with ignorance. When you did not know how you would feel that night, you thought it likely you would want to see the movie. But now you KNOW you don't feel like seeing it. This should have a much stronger effect on your decision than the prior data you had. You're not averaging two comparably reliable pieces of data; you're combining more accurate information with information that was basically a guess.
comment by mariz · 2009-04-12T21:11:15.758Z · LW(p) · GW(p)
We tend to use trivial examples to illustrate sunk costs, like deciding whether to go to the movies, or deciding whether to leave a restaurant if the food is bad, but there are important real-world situations where these considerations matter. For example, many people are reluctant to change careers midstream because they fear that all the education and experience they've accumulated would go to waste. They've already spent thousands of dollars on a degree, or tens of thousands on a professional degree, and now they have to start over? But isn't that better than being miserable for the rest of your career? You can't get your tuition or your time back, but you can still make yourself happier.
Replies from: ciphergoth
↑ comment by Paul Crowley (ciphergoth) · 2009-04-12T22:04:48.277Z · LW(p) · GW(p)
And, of course, people can have sunk costs in belief - you don't want to waste all that time you spent in church...
comment by billswift · 2009-04-12T19:52:46.247Z · LW(p) · GW(p)
There are reasons to do things that look like honoring sunk costs. The most common example, one I have seen mentioned in almost every discussion of sunk costs I have read, is reputation: if doing what you would currently prefer, rather than what you had planned, would carry negative reputational costs in the future, then it can still be rational to follow through with the plan, even though you would now prefer not to.
The second reason is one that occurred to me while reading your post: sometimes I regret not doing something I had planned because, when the time came, I didn't feel it was the best use of my time. So, if you felt motivated enough to buy the ticket in the first place, it may still be rational to use it, despite how you feel at the time, if you think you may later regret not using it.
Notice that neither of these is actually honoring sunk costs. In both cases, you do what you had planned, despite changing your mind about how desirable it now seems, in order to avoid future problems: reputational costs or regret.
Replies from: anonym
↑ comment by anonym · 2009-04-12T21:20:26.182Z · LW(p) · GW(p)
There is a very nice discussion of the sunk cost fallacy in Rational Choice in an Uncertain World: The Psychology of Judgement and Decision Making. They make your first point there.
They also give an excellent way of re-framing the issue when arguing against honoring sunk costs: explain your position as "not forsaking a project or enterprise, but, rather, wisely refusing 'to throw good money after bad'" [42].
One other reason for honoring the sunk cost that I haven't seen mentioned is that it might much more strongly motivate you to improve your decision making in the future, thereby preventing many future costly mistakes. Again though, that's not really a case of honoring sunk costs, since staying at the movie turns out to be a net win, just not for the obvious reasons.
Replies from: Eliezer_Yudkowsky, billswift
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-04-12T21:26:22.634Z · LW(p) · GW(p)
One other reason for honoring the sunk cost that I haven't seen mentioned is that it might much more strongly motivate you to improve your decision making in the future, thereby preventing many future costly mistakes.
I find myself doubtful of this as empirical psychology. Actually abandoning your course in midstream feels so much more painful - though only for a moment, not in the long run - and more importantly involves so much more of an acknowledgment of error and the initial action as being the key mistake, that I would think the activity substantially less likely to be repeated.
Replies from: anonym
↑ comment by anonym · 2009-04-12T22:22:03.348Z · LW(p) · GW(p)
I was thinking of situations when abandoning course is what you want to do, because it allows you to "cut your losses" and avoid the unpleasantness to follow. Staying the course in that sort of situation would be a form of self-punishment, done with full acknowledgment of the mistakes you made.
Sometimes a strong, harsh lesson is a better instructor than many small, mildly unpleasant lessons.
EDIT: I guess nobody here agrees that sitting through a bad movie unnecessarily might be more unpleasant than leaving immediately but having wasted $10. Or is it that you disagree that imposing the punishment that honors the sunk cost could ever be more effective than the strategy of abandonment? If you can abandon a strategy at any point from conception to completion, you can abandon it when the sunk cost is low or nonexistent and the costs still to be borne are very high, when the sunk cost is high and the remaining cost is low, or anywhere in between, including something like 25% sunk/75% unsunk, where 25% might not make much of an impression on you and thus won't change your behavior, but 75% would make a sufficiently stronger impression that the behavior is changed.
comment by shaesays · 2009-04-15T18:40:13.765Z · LW(p) · GW(p)
I became aware of the sunk-cost fallacy one day when I bought a giant diet coke that tasted funny.
My first impulse was to drink it because I had paid for it, and because it was huge, and because it seemed a shame to waste it. I think I was too far away from the fast food place to complain.
Then I realized that I wasn't particularly thirsty (I'd been interested in the taste and had eaten the food I wanted to drink it with), it had no nutritional value, and I didn't get it for the caffeine boost. There was absolutely no reason to drink it, so I didn't. And felt pretty proud of myself for noticing how stupid it would have been to have drunk it.
comment by Hans · 2009-04-12T21:59:46.680Z · LW(p) · GW(p)
Another reason for honoring the sunk cost of the movie ticket (related to avoiding regret) is that you know yourself well enough to realize you often make mistakes. There are many irrational reasons why you would not want to see the movie after all. Maybe you're unwilling to get up and go to the movie because you feel a little tired after eating too much. Maybe a friend who has already seen the movie discourages you from going, even though you know your tastes in movies don't always match. Maybe you're a little depressed and distracted by work/relationship/whatever problems. Etc.
For whatever reason, your past self chose to buy the ticket, and your present self does not want to see the movie. Your present self has more information, but this extra information is of dubious quality and is not always relevant to the decision. It still influences your state of mind, and you know that. How do you know which self is right? You don't, until after you've seen the movie. The marginal cost, in terms of mental discomfort, of seeing the movie and not liking it is usually smaller than the marginal cost of staying home and dwelling on what a great movie it could have been.
The reasoning behind this trivial example can easily be adapted to sunk cost choices in situations that do matter.
The sunk cost fallacy is easy to understand and to point out to others, but I caution against using it too often. The point of the fallacy is to show that only future costs and benefits matter when making a decision. This is true, but in reality those costs and benefits (and especially their probabilities) are hard to define. It is not clear whether the extra information that was received after 'sinking' the cost has an impact on the cost and benefit probabilities. You also know that, in any case, if the decision to sink the cost in the first place was the right one after all, the decision to continue is even more rational as a large part of the cost has already been spent. You can go see a movie for free that other people still have to pay for.
comment by komponisto · 2009-04-12T17:45:58.707Z · LW(p) · GW(p)
Going to the theater in this case can be viewed as respecting the wishes of your past self, and possibly future selves (who may desire to have seen the film in question). Without some ability to do this, long-term planning would be impossible.
Your current self counts, but not necessarily more than temporally displaced selves.
Replies from: Alicorn, Peter_Twieg
↑ comment by Alicorn · 2009-04-12T18:25:27.237Z · LW(p) · GW(p)
The desire to have done something, in the absence of the desire to actually do it, is an interesting phenomenon. I find that in my case it most often applies to travel, which I hate, although I desire to have been to places I would need to travel to in order to have visited. We have to juggle the wants of self-at-t1-for-self-at-t1 with the wants of self-at-t2-for-self-at-t1. It's like procrastination, except that instead of forming an intention to do something later, one dissolves the intention to do it at all.
Replies from: ikrase, Simetrical, AllanCrossman
↑ comment by Simetrical · 2009-04-14T23:00:58.803Z · LW(p) · GW(p)
Like Mark Twain's definition of a classic: "Something that everybody wants to have read and nobody wants to read."
↑ comment by AllanCrossman · 2009-04-12T18:27:28.556Z · LW(p) · GW(p)
I find that in my case it most often applies to travel, which I hate, although I desire to have been to places I would need to travel to in order to have visited.
That sounds entirely rational - presumably your objection to going to those places would disappear if teleportation became an option?
Replies from: Alicorn
↑ comment by Alicorn · 2009-04-14T19:01:54.950Z · LW(p) · GW(p)
Yes, if I could teleport, I'd certainly travel more - but I don't think I would travel to all of the places I would like to have been, since I'd still have to do other things I don't like about traveling, like packing and dealing with language gaps and setting aside time to be at the place, even if travel time is negligible. In all likelihood, teleportation wouldn't be free, either.
Replies from: ciphergoth
↑ comment by Paul Crowley (ciphergoth) · 2009-04-14T20:07:30.911Z · LW(p) · GW(p)
If you could teleport, you wouldn't need to pack, or book a hotel for that matter...
Replies from: Hans
↑ comment by Hans · 2009-04-16T00:23:26.923Z · LW(p) · GW(p)
Yes, and if there was a utility lever that you could pull to gain utility, you would spend your entire life pulling the lever. But there isn't. And you cannot teleport, nor will you be able to in the foreseeable future. So Alicorn will have to continue taking the burden of travel into account when deciding whether or not to visit a place he would like to have visited.
↑ comment by Peter_Twieg · 2009-04-12T21:19:32.959Z · LW(p) · GW(p)
Actually, without some ability to do that in the future, long-term planning would be impossible. Whether one has the ability in the present to uphold obligations to the past is only relevant to future time-consistency insofar as we think this directly lends itself to having this ability in the future, and... I think it's far from clear whether that relationship will reliably occur, and even whether it should occur.
comment by steven0461 · 2009-04-13T09:20:52.247Z · LW(p) · GW(p)
There's also a sort of reverse sunk cost fallacy. Example: you're reading a book that's worth 8 hours of your time, but only if you read all of it. After 5 hours of reading you realize it's going to take another 5 hours to finish, so the original investment of 5 hours was a mistake; to "correct" the mistake you stop reading, at exactly the point where you could get 8 hours' worth of value for only an additional 5-hour investment.
ETA: I googled "reverse sunk cost fallacy" and found this, which makes the same point.
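For concreteness, here is the arithmetic of the book example, treating hours directly as the unit of value (a simplification for illustration):

```python
value_of_finished_book = 8   # the book is worth 8 hours of your time,
                             # but only if you read all of it
hours_already_read = 5       # sunk: gone whether you stop or continue
hours_remaining = 5

# Judged from the start, the project was a mistake: 10 hours in, 8 out.
print(value_of_finished_book - (hours_already_read + hours_remaining))  # -2

# Judged from where you stand now, finishing is worth it: 8 hours of value
# for only 5 more hours of reading.
print(value_of_finished_book - hours_remaining)  # 3
```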
Replies from: MrHen
↑ comment by MrHen · 2009-04-14T15:04:28.348Z · LW(p) · GW(p)
This reminds me of the Dollar Auction. What happens if the estimate keeps getting pushed out? Are you willing to read the book forever?
I suppose if you keep messing up the estimate, the confidence in your next estimate will go down, which should tip the scales at some point.
Replies from: MrHen
↑ comment by MrHen · 2009-04-15T15:35:02.268Z · LW(p) · GW(p)
(Off-topic) In terms of karma, what should I take away when this comment gets a negative score? Do I need to expound on my point? Was my point invalid/stupid/inane/irrelevant?
Also, is there a kosher method for switching to off-topic discussion or is it something to avoid altogether?
(Edit) Hm. After asking about it, my comment was apparently modded up, so nevermind. Weird.
Replies from: jimrandomh, thomblake
↑ comment by jimrandomh · 2009-04-15T16:20:09.376Z · LW(p) · GW(p)
Also, is there a kosher method for switching to off-topic discussion or is it something to avoid altogether?
For small bits of off-topic discussion, no special action is required. For lengthy asides, you can start a new post (for lengthy topics) or use the monthly open thread (for smaller ones), leaving a link in the place where you would segue.
comment by Hans · 2009-05-14T12:21:34.137Z · LW(p) · GW(p)
Seth Godin has a few examples of sunk costs. I believe these examples better represent true sunk costs than some of the examples given here (such as the movie ticket).
For example, suppose you have paid 50 dollars for a Bruce Springsteen concert. You have searched long and hard for tickets this cheap. Suddenly, somebody offers you 500 dollars for the ticket. Do you sell it? The ticket is now worth $500 to you, and you would have never paid $500 for a ticket in the first place.
comment by ata · 2011-01-12T19:02:42.510Z · LW(p) · GW(p)
Suppose you have n shares of stock X, and you're trying to decide whether to sell or to hold onto them a bit longer; and suppose that if you instead had the current cash value of those shares, you would easily decide not to buy them. So it would seem that this amount of money is currently worth more to you as available cash than as shares of this stock. Yet if you already have those shares, particularly if you bought them at a higher price, you may be reluctant to sell them immediately: you'd be inclined to hold onto them in hopes of recouping your loss, even if, given those shares' value in cash instead, you could think of a better use for it than investing it back in the same stock.
Questions:
- This probably happens a lot, right?
- Is this pattern of thinking ever not a bad idea? Or should stock traders consistently apply this sort of reversal test to stocks they already own?
- Is there a name or description for this particular fallacy/bias or is it just grouped with the sunk cost fallacy?
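On the second question, a minimal sketch of the reversal test being described (the share counts, prices, and value estimate are all hypothetical): ignore what you paid, and ask only whether you would buy the position today at its current price if you held the equivalent cash instead.

```python
shares_held = 50
purchase_price = 40.0   # per share; sunk, and deliberately unused below
current_price = 25.0

def would_buy_today(price: float, your_estimate_of_fair_value: float) -> bool:
    # The reversal test: would you convert cash into this stock at today's price?
    return your_estimate_of_fair_value > price

if would_buy_today(current_price, your_estimate_of_fair_value=20.0):
    decision = "hold"
else:
    decision = "sell"   # holding "to get back to even" is honoring a sunk cost

print(decision, "| cash value of the position:", shares_held * current_price)
```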
↑ comment by gwern · 2011-01-12T20:11:40.688Z · LW(p) · GW(p)
I think I would call it either the sunk cost fallacy, or the endowment effect/bias.
The latter because you are equally uncertain about whether its value will go up or go down, and so its value to you ought to be exactly its (current) cash-equivalent, and in fact, less because presumably you are risk-averse like all other humans (and so an equal chance of loss or gain is worse than a guaranteed no-loss/no-gain) - and yet you are still holding the stock.
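As a worked illustration of the risk-aversion point, here is a tiny example using log utility as a stand-in for any concave utility function (the wealth and swing figures are arbitrary):

```python
import math

wealth = 100.0
swing = 10.0   # the stock is equally likely to gain or lose this much

expected_utility = 0.5 * math.log(wealth + swing) + 0.5 * math.log(wealth - swing)
certainty_equivalent = math.exp(expected_utility)

print(round(certainty_equivalent, 2))  # ~99.5: the fair 50/50 gamble is worth
                                       # less to a risk-averse holder than $100 cash
```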
↑ comment by A1987dM (army1987) · 2011-11-20T14:35:02.670Z · LW(p) · GW(p)
Is this pattern of thinking ever not a bad idea?
If there is some kind of tax/fee/whatever on buying or selling stock, so that if you buy a certain number of shares and sell them back immediately there's a fraction of your money you don't get back, then...
↑ comment by simpleton · 2011-01-13T03:00:31.758Z · LW(p) · GW(p)
This does happen a lot among retail investors, and people don't think about the reversal test nearly often enough.
There's a closely related bias which could be called the Sunk Gain Fallacy: I know people who believe that if you buy a stock and it doubles in value, you should immediately sell half of it (regardless of your estimate of its future prospects), because "that way you're gambling with someone else's money". These same people use mottos like "Nobody ever lost money taking a profit!" to justify grossly expected-value-destroying actions like early exercise of options.
However, a bias toward holding what you already own may be a useful form of hysteresis for a couple of reasons:
- There are expenses, fees, and tax consequences associated with trading. Churning your investments is almost always a bad thing, especially since the market is mostly efficient and whatever you're holding will tend to have the same expected value as anything else you could buy.
- Human decisionmaking is noisy. If you wake up every morning and remake your investment portfolio de novo, the noise will dominate. If you discount your first-order conclusions and only change your strategy at infrequent intervals, after repeated consideration, or only when you have an exceptionally good reason, your strategy will tend towards monotonic improvement.
↑ comment by Unnamed · 2011-01-13T06:41:15.946Z · LW(p) · GW(p)
There's a closely related bias which could be called the Sunk Gain Fallacy: I know people who believe that if you buy a stock and it doubles in value, you should immediately sell half of it (regardless of your estimate of its future prospects), because "that way you're gambling with someone else's money".
That's called the house money effect (from Thaler & Johnson, 1990).
comment by serenus · 2013-05-15T18:24:37.669Z · LW(p) · GW(p)
It reminds me a bit of how Jefferson Hope went about getting his betrothed's ring back. He hired an actor to fool Holmes, to salvage what he knew he'd lost. He could (if he were a real criminal, and not a dummy for Holmes's deductions) have gained more than just his ring back, if the actor also brought him valuable intelligence (or even incapacitated Holmes in some way). Would that be turning a sunk cost into a purchase, or am I mistaken?
comment by gwern · 2012-02-05T03:17:58.404Z · LW(p) · GW(p)
Related: http://lesswrong.com/lw/9si/is_sunk_cost_fallacy_a_fallacy/
comment by nazgulnarsil · 2009-04-13T00:47:26.499Z · LW(p) · GW(p)
this just seems like loss aversion.
comment by agolubev · 2009-04-13T15:58:03.234Z · LW(p) · GW(p)
1. Not all costs are financial.
2. Controlling habit patterns.
3. Most importantly, it's a BAD example. If I want to see a movie but don't want to see it that night, I know I'll want to see it in the future, so you go to see it to avoid a FUTURE cost.
This blog reminds me of LTCM.