Giving calibrated time estimates can have social costs
post by Alex_Altair · 2022-04-03T21:23:46.590Z · 16 comments
In the normal course of my rationalist upbringing, I learned about the classic cognitive biases, including the planning fallacy. This is essentially the fact that it almost always takes longer for a task to get done than people estimate at the beginning. The explanation described in the original sequence post is that people visualize the mainline path of accomplishing the steps of the task, and then just add those times together, whereas in reality, at least one of those steps will have something go wrong, and make the whole thing take much longer.
So I read this, and then just... updated? Typically one should be very skeptical of the feeling of not being subject to a bias, but as discussed in the original sequence post, this one is empirically reported to be correctable. And I'm sure it took me some time to adjust, but it wasn't too difficult to install a TAP to just consult the outside view instead of the inside view.
And then I went forth into the world, and gave calibrated time estimates, because I want to have true beliefs and I want to say true things. Slowly, over time, I got the sinking sense that this was costing me social points.
Types of costs
People may think you're slow
A primary context in which I had to give time estimates was when working as a software engineer. I've worked at multiple companies that regularly used "sprints" to plan work, and we often assigned explicit estimates to the tasks at hand. I regularly gave longer estimates than my coworkers or manager. These estimates were also calibrated; I actually took about that long. Sometimes more, sometimes less. My coworkers usually took longer than their estimates, but their estimates were shorter.[1]
Over time, I believe my managers got a worse impression of me.[2] One simple problem here is that shorter estimates just sound better than longer estimates, because the manager wants the thing done sooner. But another, slightly subtler problem is that since almost everyone gives optimistic estimates, my manager would reasonably be used to hearing optimistic estimates. So if they hear me give a longer estimate, they don't know that it's more calibrated; they subconsciously assume it's just as optimistic as my coworkers', and then they would reasonably believe that I am worse.
Ideally, my calibration would eventually be evident through cumulative statistics. But realistically, people aren't paying attention on this level. They regularly feel the short-term displeasure of longer estimates, and they only irregularly get the pleasure from noticing that I finished it earlier than I said, and it's hard to connect up these two signals that are days or weeks apart.
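For concreteness, here's a minimal sketch of what those cumulative statistics could look like if I did keep the data handy (all numbers made up for illustration):

```python
# Made-up log of sprint tasks: estimated vs. actual duration, in days.
tasks = [
    {"estimate": 5, "actual": 4},
    {"estimate": 3, "actual": 3},
    {"estimate": 8, "actual": 10},
    {"estimate": 2, "actual": 2},
]

# For calibrated median estimates, roughly half of tasks should
# finish within the estimated time.
within = sum(t["actual"] <= t["estimate"] for t in tasks)
print(f"{within}/{len(tasks)} finished within estimate ({within / len(tasks):.0%})")

# Chronic optimism shows up as a mean actual/estimate ratio well above 1.
mean_ratio = sum(t["actual"] / t["estimate"] for t in tasks) / len(tasks)
print(f"mean actual/estimate ratio: {mean_ratio:.2f}")
```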
People may think you're lazy
Claiming that things will take longer also just sounds like you're trying less hard. If Jane says she can get this done in a day, why can't you? Are you saying she's lying? Maybe you're just not trying as hard.
People dislike pessimism
Saying that something will take longer is negative-valence. You are claiming that the world is worse, and hearing that feels bad. People don't like regulations and safety expectations, because in the short term they are annoying and costly, even though in the long term they are designed to be much better for you. Hearing someone say that something will take longer is annoying and costly in the short term, even though being calibrated could allow the whole project to plan better.
People can feel insulted
The planning fallacy also applies to other people telling me their time estimates. I have no reason to assume other people are calibrated, and in my experience they usually do take longer than they estimate. And yet, telling someone that you think their estimate is too short is rarely going to be taken well. Often I can just hear their estimate, not say anything, and then personally assume and prepare for it to take much longer. But that's not always an option, for example if you're working together on the task, or if you care about helping the person plan better.
People want commitment
In a related problem, sometimes people will ask me "can you get this done by [time]?" and I'll say something like "seems reasonable", but as the discussion carries on, it becomes clear that they want me to basically *promise* that I'll get the task done by then. My problem with this has two parts. One is that my priors on how long it takes to do things in general are not just later than others', but also very wide. There just is a good chance that things will take *much* longer.
The second part is that, for me to utter a statement X, I have to believe it with a certain probability that is fairly high. So when someone wants me to echo back the unqualified statement "I will get it done by [time]", they're basically asking me to be, I dunno, 98% confident, and I just can't be, because I know that the 98th percentile time is actually more like 3*[time]. So I continue to hedge, and then if they push I basically tell them the above, and then they feel like I'm making a point to not commit, and then maybe they start thinking the above things, like that I'm slow or lazy or something.
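To put rough numbers on that: suppose completion times are lognormally distributed (an assumption I'm making purely for illustration) with median $m$ and log-scale spread $\sigma \approx 0.53$. Then the 98th-percentile time is

$$P_{98} = m \cdot e^{z_{0.98}\,\sigma} \approx m \cdot e^{2.05 \times 0.53} \approx 3m,$$

which is exactly the kind of distribution where honest 98% confidence requires quoting triple the median.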
The annoying thing here is that I believe the only difference between me and another task doer in this situation is that I have more accurate beliefs, or I have a higher belief threshold for making claims (or something similar, like that I only use statements for communicating beliefs and not for socially enforcing a commitment to myself). I think I can get the thing done just as fast as someone else, and thus I think I can satisfy the task-giver just as much, but they currently feel less satisfied because of the above communication.
Being out of sync with others
The last type of cost I can think of is when someone else and I are both equal parties waiting on something else to get done. Say that Jane, Jack, and I are going to the movies. I know that Jack always takes a long time to get ready at the last minute, and so I am psychologically prepared for this. We show up at Jack's house, and I go ahead and sit down on the couch and start reading on my phone. Maybe we'll miss the first few minutes of the movie, or get worse seats or something, but I've already accepted that as a likely consequence of going to the movies with Jack. Jane, however, has not done the same level of planning-fallacy compensation as me, and so Jane feels surprised and frustrated when Jack takes a long time. This isn't exactly a conflict between me and Jane, but it can be awkward. Maybe Jane is anxiously pacing around while I'm relaxing on the couch. Maybe she feels like I should be on her side by feeling equally impatient, or something. Maybe she feels like I'm enabling Jack by not demonstrating urgency.
An example that did happen to me is that my house is having its electrical system replaced. This is taking a very, very, very long time. Many of my roommates seem not just annoyed (which anyone reasonably would be) but also something like surprised by this. And, I dunno, I just knew it would take forever and ever? I just decided to start living my life as if it would never end. I remember literally thinking to myself, "it would be nice if this got finished in 2021". I currently have my fingers crossed for it being finished in 2022.
And again, this isn't exactly a conflict between me and my roommates, but I feel a little bit snobbish just for even writing the above sentences. And I think that difference can seed some kind of low-key tension after enough repetitions.
Solutions?
Mostly my solution to the above is to just remember that these costs happen, and add that into the social calculus that I'm always running every time I interact with a person. Often I just pay the social costs, partly because I have pretty high social resilience, and partly because I continue to want to be a person who acts from my beliefs even when it's costly. (The solution of "just give shorter time estimates" is basically a non-starter for me.) The adjustments I do make are usually in the form of giving the estimate with softer wording, some kind of hedging, or just being more clear about my intention versus my uncertainty. I'm pretty sure that a more socially skilled person than me would have clearly communicable solutions that used some kind of charisma or warmth to make the listener feel better despite the longer estimate.
1. ^ Here and elsewhere, feel free to just, like, not believe my self-reports. I do not in fact have the data handy in spreadsheets, or anything.
2. ^ To be totally fair, there were also times when I was actually worse than my coworkers. But this problem compounded the impression during those times.
16 comments
comment by Vael Gates · 2022-04-03T22:59:27.739Z
Thanks Alex :). Comment just on this section:
"The annoying thing here is that I believe the only difference between me and another task doer in this situation is that I have more accurate beliefs, or I have a higher belief threshold for making claims (or something similar, like that I only use statement for communicating beliefs and not for socially enforcing a commitment to myself)."
As someone who was in this situation with Alex recently (wanting a commitment from him, in order to make plans with other people that relied on this initial commitment), I think there's maybe an additional thing in my psychology and not in Alex's which is about self-forcing.
I'm careful about situations where I'm making a very strong commitment to something, because it means that if I've planned the timing wrong, I'll get the thing done but with high self-sacrifice. I'm committing to skipping sleep, or fun hangouts I had otherwise planned, or relaxing activities, to get the thing done by the date I said it'd be done. I'm capable and willing to force myself to do this, if the other person wants a commitment from me enough. It's not 100% certain I'll succeed -- e.g. I might be hit by a car -- but I'm certain enough of success that people would expect me to succeed barring an emergency, which is mostly what I expect from other people when they're for-real-for-real committing to something.
So when I'm asking someone to for-real-for-real commit to me, I'm asking "are you ready to do self-sacrifice if you don't get it done by this date, barring an emergency? It's fine if it's a later date, I just want the certainty of being able to build on this plan". And I do think there are a bunch of different kinds of commitments in day-to-day life, where I make looser commitments all the time, but I do have a category for "for-real-for-real commitment", and will track other people's failures to meet my expectations when I believe they've made a "for-real-for-real" commitment to me. I might track this more carefully than other people do though -- feels like it kinda rhymes with autism and high conscientiousness, maybe also high-performance environments but idk?
Anyway, this all might be the same thing as "I only use statements for communicating beliefs and not for socially enforcing a commitment to myself". I'm not sure I'd use exactly the "socially enforcing a commitment to myself" phrase; in my mind, it feels like a social commitment and also feels like "I'm now putting my personal integrity on the line, since I'm making a for-real-for-real commitment, so I'd better do what I said I would, even if no one's looking".
Amusingly, I think Alex and I are both using self-integrity here, but one hypothesis is that maybe I'm very willing and able to force myself to do things, and that this makes up the difference in which concepts we're each referring to with "(strong) commitment"?
Always fun getting unduly detailed with very specific pieces of models :P.
comment by TheMajor · 2022-04-04T08:21:31.076Z
If your colleagues are regularly giving unrealistically optimistic estimates, and you are judged worse for giving realistic ones, clearly your superiors don't care much about the accuracy of the estimates. You're trying to play a fair game in a situation where you will be rewarded for doing the opposite.
Personally I've had good mileage out of offering to lie to the people asking for estimates. When asked for estimates during a sprint, or the likes, and if I sufficiently trust the people involved I would say something like "You are asking us to do X, which I think will take 2 months. My colleagues are giving estimates of 2-3 weeks, but the previous times they gave estimates like that the project took 6-10 weeks. I'm committed to the project, and if you want to hear that we can do it in 3 weeks I'm happy to tell you that, but I don't think we will finish it within 2 months."
If after that you still find you are being punished for giving realistic estimates, consider not telling the truth?
comment by pjeby · 2022-04-11T21:01:57.766Z
This probably doesn't help your specific situation, but best practice for estimation in software projects is to use "ideal/naive" estimates and then use a velocity multiplier. What that means is, take everybody's estimates and then multiply them by a team-wide constant factor to allow for the fact that there are interruptions, meetings, unexpected problems, etc. This is usually a 2-4x multiplier, and can be treated as a thing to optimize at the team level (i.e., figuring out what things increase or decrease the multiplier for the team as a whole).
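Concretely, computing and applying the multiplier is trivial; here's a minimal sketch with illustrative numbers:

```python
# Past tasks as (naive estimate, actual time), in days -- illustrative data only.
history = [(3, 7), (2, 5), (5, 9), (1, 4)]

# Team-wide velocity multiplier: total actual time over total naive estimate.
multiplier = sum(actual for _, actual in history) / sum(est for est, _ in history)

def calendar_estimate(naive_days: float) -> float:
    """Scale a naive 'ideal days' estimate by the team's historical multiplier."""
    return naive_days * multiplier

print(f"multiplier ~ {multiplier:.2f}")   # ~2.3 for the data above
print(f"{calendar_estimate(4):.1f} days") # a 4-day naive estimate becomes ~9 days
```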
Of course, if that's what the team is explicitly doing, then you know to go ahead and give your naive estimate anyway and let the manager worry about the multiplier. But a lot of organizations don't do this explicitly, because the manager is just doing it in the back of their head and maybe not even acknowledging it to themselves because they think that if they do, it means everybody will act like they have more time. In essence, you may be being judged for defecting from the implicit norm of giving naive estimates for this process.
comment by cranberry_bear · 2022-04-11T20:37:26.268Z
I often use the following way of explaining time estimates. "If I were to do thing X and not stumble across any issues I haven't accounted for / unknown unknowns / circumstances don't change, it will take me about 2 weeks", followed by "in my experience, most of the time issues do come up, therefore a more conservative estimate is on the order of months, not weeks." This way I can calibrate with others who are estimating "how long does this take if everything goes completely smoothly" while also bringing up how unrealistic this is.
comment by waveman · 2022-04-03T23:02:45.883Z
I ran into a similar problem. I was doing estimates of time and costs for projects, which then went into the business case. As with the OP, my estimates were calibrated and usually fairly accurate.
Others' estimates were massively biased to low $ and time and often wildly wrong - in one case too low by a factor of 12.5. This is not rare of course - Microsoft Word for Windows V1.0 took over 5 years but never had an "end date" more than 1 year out.
The problem is that the business units wanted lowball estimates so they could get their projects started. It was then not too hard to exploit the sunk cost fallacy to keep the project alive. They felt I was not a "team player" and so forth.
See the extracts from Moral Mazes https://www.lesswrong.com/posts/45mNHCMaZgsvfDXbw/quotes-from-moral-mazes for more on this kind of world.
comment by Alex_Altair · 2023-01-01T04:43:11.541Z
I remember literally thinking to myself, "it would be nice if this got finished in 2021". I currently have my fingers crossed for it being finished in 2022.
Update: it has not finished in 2022.
comment by Lukas Finnveden (Lanrian) · 2022-04-12T18:31:56.875Z
Here's a related model for why it might be beneficial to give optimistic ETAs, as well as an argument for this (perhaps) explaining why we're over-optimistic in the first place: https://sideways-view.com/2016/11/26/if-you-cant-lie-to-others-you-must-lie-to-yourself/
↑ comment by Alex_Altair · 2022-04-12T20:21:04.248Z
Gosh, I have never disagreed so strongly with something Paul has written.
↑ comment by Lukas Finnveden (Lanrian) · 2022-04-12T21:33:06.117Z
Why / which part?
↑ comment by Alex_Altair · 2022-04-12T23:44:50.997Z
Right, so, I should give the preface that I'm not up for fully explaining why I think my position/opinion/reaction is true/correct. But to answer your question, these are the parts that I objected to:
- The title, "If we can’t lie to others, we will lie to ourselves"
- The statement "I’d prefer the first outcome."
- "my loss function is a sum of two terms"
- "In fact, this procedure always results in distorted estimates, no matter how large we make the penalty for bad predictions."
And that's where I stopped reading, to be honest. It sounds like he's trying to prove to me that I must lie no matter what, and the content I read is overwhelmingly inadequate for getting me to take that seriously. It kind of feels like someone trying to argue to me that I must either stab other people, or stab myself, at least to some degree, at least paper cuts -- and I just don't need to take that kind of argument seriously. It feels similar to Pascal's mugging, in that I'm confident that I don't need to figure out why it's wrong to know that it's okay to ignore.
Of course, as a rationalist, it is deeply important to me that I accept that if it is true. And I am at least moderately curious about his argument. Just not curious enough to keep reading.
I don't think his math is wrong, but I think his model is a wrong representation of the situation. If I had to guess, without thinking too hard, at what his actual model error is, it would be that my "loss function" is not a sum of two terms, but is instead a case-structure: just say the true thing, up until some level of other costs (like social), at which point just stop saying things, up until some further very high cost (like if my life is on the line), at which point it's okay to lie.
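Spelled out (with $c$ the external cost of speaking truthfully, and $c_1 < c_2$ purely illustrative thresholds):

$$\text{response}(c) = \begin{cases} \text{say the true thing} & c < c_1 \\ \text{say nothing} & c_1 \le c < c_2 \\ \text{lying is on the table} & c \ge c_2 \end{cases}$$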
↑ comment by Lukas Finnveden (Lanrian) · 2022-04-13T15:53:54.534Z
The first few sections are best read as empirical claims about what's evolutionarily useful for humans (though I agree that the language is sloppy and doesn't make this clear). Later sections distinguish what we consciously want from what our brains have been optimised to achieve, and venture some suggestions for what we should do given the conflict. (And it includes a suggestion that it might be ok to give over-optimistic ETAs, but it doesn't really argue for it, and it's not an important point.)
Your suggested alternate loss-function seems like a plausible description of your conscious desires, which may well be different from what evolution optimised us for.
comment by Martin Randall (martin-randall) · 2022-04-05T02:27:20.749Z
I aim to give well-calibrated estimates. Sometimes projects finish faster than my median estimate, and sometimes they finish slower. The ones that finish faster then become convenient examples for deflecting social problems. Hypothetical simplified conversation:
- Alice: "I think this will take a couple of weeks"
- Martin: "Maybe two weeks if everything goes right, but I expect something more like six to eight weeks. 80% confidence."
- Bob: "There's no way it will take six weeks! We just have to add the thing to the other thing."
- Martin: "Remember I thought Project X would take two weeks and then Alice knocked it out over the weekend? I'm not saying Alice is wrong, just giving my best estimate."
Naturally any final plan presented externally has to fit the external organizational culture, whatever that is, but within a team there are benefits to a diversity of optimism/pessimism and quantitative/qualitative.
comment by amitlevy49 · 2022-04-04T12:38:38.162Z
Stress is a major motivator for everyone. Giving a fake overly optimistic deadline means that for every single project, you feel stressed (because you are missing a deadline) and work faster. You don't finish on time, but you finish faster than if you would have given an accurate estimate. I don't know how many people internalize it, but I think it makes sense that a manager would want you to "promise" to do something faster than possible - it'll just make you work harder.
Taking this into account, whenever I am asked to give an estimate, I try to give a best-case estimate (which is what most people give naturally). If I took the time I spend aimlessly on my phone into account when planning, I'd just spend even more time on my phone, because I wouldn't feel guilty.
comment by MikeH (mike-heaton) · 2022-04-04T04:51:42.113Z
...sometimes people will ask me "can you get this done by [time]?" and I'll say something like "seems reasonable" but as the discussion carries on, it becomes clear that they want me to basically *promise* that I'll get the task done by then.
I think you're right in this diagnosis, but I don't think that this is strange or unreasonable. In complex organizations, people do often need to know with reasonable certainty that someone else's task will be done by a certain time. So in many work cultures the counterparty is actually asking "can you promise to work as hard as it takes to get it done by X time", not asking for a calibrated estimate for how long it would take when you're working an average 40 hour week.
↑ comment by gjm · 2022-04-04T09:25:55.886Z
This is common, but note that when it interacts with the other problem (everyone operating on unrealistic time estimates) it becomes very unreasonable: you may be paid to work 40 hours a week but then expected by your peers and managers to give commitments that amount to "I will work 90-hour weeks to get this done".