The D-Squared Digest One Minute MBA – Avoiding Projects Pursued By Morons 101
post by Benquo · 2017-03-19T18:48:55.856Z · LW · GW · Legacy · 16 comments
This is a link post for https://dsquareddigest.wordpress.com/2004/05/27/108573518762776451/
comment by Benquo · 2017-03-19T18:54:31.742Z · LW(p) · GW(p)
Good ideas do not need lots of lies told about them in order to gain public acceptance. I was first made aware of this during an accounting class. We were discussing the subject of accounting for stock options at technology companies. There was a live debate on this subject at the time. One side (mainly technology companies and their lobbyists) held that stock option grants should not be treated as an expense on public policy grounds; treating them as an expense would discourage companies from granting them, and stock options were a vital compensation tool that incentivised performance, rewarded dynamism and innovation and created vast amounts of value for America and the world. The other side (mainly people like Warren Buffett) held that stock options looked awfully like a massive blag carried out by management at the expense of shareholders, and that the proper place to record such blags was the P&L account.
Our lecturer, in summing up the debate, made the not unreasonable point that if stock options really were a fantastic tool which unleashed the creative power in every employee, everyone would want to expense as many of them as possible, the better to boast about how innovative, empowered and fantastic they were. Since the tech companies’ point of view appeared to be that if they were ever forced to account honestly for their option grants, they would quickly stop making them, this offered decent prima facie evidence that they weren’t, really, all that fantastic. [...]
Fibbers’ forecasts are worthless. Case after miserable case after bloody case we went through, I tell you, all of which had this moral. Not only that people who want a project will tend to make inaccurate projections about the possible outcomes of that project, but also that attempts to “shade” downward a fundamentally dishonest set of predictions are futile. If you have doubts about the integrity of a forecaster, you can’t use their forecasts at all. Not even as a “starting point”. By the way, I would just love to get hold of a few of the quantitative numbers from documents prepared to support the war and give them a quick run through Benford’s Law.
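As an aside, the Benford's Law check mentioned here is simple to run. The sketch below is purely illustrative (it is not code from the post, and the function names and commented-out example figures are invented): it compares observed first-digit frequencies in a set of reported figures with Benford's expected probabilities, log10(1 + 1/d). It is only a screening heuristic; it needs many figures spanning several orders of magnitude, and a poor fit is a prompt for scrutiny rather than proof of fabrication.

```python
import math
from collections import Counter

def leading_digit(x):
    """Return the first significant digit of a nonzero number."""
    return int(f"{abs(x):.15e}"[0])

def benford_check(values):
    """Compare observed first-digit frequencies with Benford's law.

    Returns (digit, observed, expected) rows plus a Pearson chi-square
    statistic (8 degrees of freedom).
    """
    values = [v for v in values if v != 0]
    n = len(values)
    counts = Counter(leading_digit(v) for v in values)
    rows, chi_sq = [], 0.0
    for d in range(1, 10):
        expected = math.log10(1 + 1 / d)   # Benford probability for digit d
        chi_sq += (counts.get(d, 0) - n * expected) ** 2 / (n * expected)
        rows.append((d, counts.get(d, 0) / n, expected))
    return rows, chi_sq

# Hypothetical usage with invented figures:
# rows, chi_sq = benford_check([1200, 340, 1900, 87, 2100, 96, 1500, 430])
# for d, obs, exp in rows:
#     print(f"{d}: observed {obs:.2f}, expected {exp:.2f}")
```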
Application to Iraq. This was how I decided that it was worth staking a bit of credibility on the strong claim that absolutely no material WMD capacity would be found, rather than “some” or “some but not enough to justify a war” or even “some derisory but not immaterial capacity, like a few mobile biological weapons labs”. My reasoning was that Powell, Bush, Straw, etc., were clearly making false claims and therefore ought to be discounted completely, and that there were actually very few people who knew a bit about Iraq but were not fatally compromised in this manner who were making the WMD claim. [...]
The Vital Importance of Audit. Emphasised over and over again. Brealey and Myers has a section on this, in which they remind callow students that, like backing up one’s computer files, this is a lesson that everyone seems to have to learn the hard way. Basically, it’s been shown time and time again: companies which do not audit completed projects in order to see how accurate the original projections were tend to get exactly the forecasts and projects that they deserve. Companies which have a culture where there are no consequences for making dishonest forecasts get the projects they deserve. Companies which allocate blank cheques to management teams with a proven record of failure and mendacity get what they deserve.
[...] The raspberry road that led to Abu Ghraib was paved with bland assumptions that people who had repeatedly proved their untrustworthiness could be trusted. There is much made by people who long for the days of their fourth form debating society about the fallacy of “argumentum ad hominem”. There is, as I have mentioned in the past, no fancy Latin term for the fallacy of “giving known liars the benefit of the doubt”, but it is in my view a much greater source of avoidable error in the world. Audit is meant to protect us from this, which is why audit is so important.
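The audit point lends itself to an equally small sketch (again not from the post; the data layout, names, and numbers are invented): record forecast versus outcome for completed projects, and let each team's track record follow it to the next funding request.

```python
from collections import defaultdict

def audit_track_record(projects):
    """Average forecast/actual ratio per team for completed projects.

    `projects` is a list of (team, forecast, actual) tuples; the fields
    and figures are invented for illustration. A ratio well above 1.0
    means the team habitually over-promises, which should feed into how
    its next forecast is weighed.
    """
    ratios = defaultdict(list)
    for team, forecast, actual in projects:
        if actual:                      # skip projects with no outcome yet
            ratios[team].append(forecast / actual)
    return {team: sum(r) / len(r) for team, r in ratios.items()}

# Hypothetical usage:
# audit_track_record([("alpha", 120, 60), ("alpha", 200, 90), ("beta", 80, 85)])
# -> roughly {"alpha": 2.1, "beta": 0.94}
```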
comment by Dagon · 2017-03-20T16:42:00.062Z · LW(p) · GW(p)
downvoting still disabled, but I didn't find this link worth following.
Weird defensive tone. Political references to things I don't follow closely (and that are actively mind-killing). Generalizations that are simply wrong.
Replies from: Benquo
comment by Lumifer · 2017-03-20T15:20:41.463Z · LW(p) · GW(p)
What interesting ideas do you find here? This looks like ranty of-course-it's-clear-in-the-rearview-mirror "wisdom" to me.
Replies from: Luke_A_Somers, Benquo
↑ comment by Luke_A_Somers · 2017-03-20T21:49:08.182Z · LW(p) · GW(p)
A) the audit notion ties into having our feedback cycles nice and tight, which we all like here.
B) This would be a little more interesting if he linked to his advance predictions on the war so we could compare how he did. And of course if he had posted a bunch of other predictions so we could see how he did on those (to avoid cherry-picking). That would rule out rear-view-mirror effects.
Replies from: satt, Benquo
↑ comment by satt · 2017-03-24T00:13:17.250Z · LW(p) · GW(p)
This would be a little more interesting if he linked to his advance predictions on the war so we could compare how he did. And of course if he had posted a bunch of other predictions so we could see how he did on those (to avoid cherry-picking).
We may be able to get part of the way there. I found the following suspiciously prediction-like (and maybe even testable!) statements by Ctrl-Fing the pre-invasion posts on D-Squared's blog.
From October 21, 2002:
On the other hand, I am also convinced by Max Sawicky’s argument that Iraq is likely to be the first excursion of an American policy of empire-building in the Middle East, which is likely to be disastrous under any possible performance metric.
But, I retain my original belief that improvement in Iraq is politically impossible unless there is some sort of shooting war in the area culminating in the removal of Saddam Hussein. I don’t set much store by “nation-building”, and don’t really believe that what the Gulf needs is more US client states, and I never believed any of the scare stories related to the “WMD” acronym which is currently doing such sterling duty in picking out weblog authors who don’t have a fucking clue what they’re talking about. [...]
So, how can we square these beliefs: a) that something has to be done, and b) that if something is done, it will be a disastrous imperial adventure by George Bush?
But apparently, having given up on the bin Laden connection and the Saddam-has-nukes idea, we are now going to be emotionally blackmailed into a war. In my experience, good ideas don’t usually need quite so many outright lies told to support them, but what the hey.
This February 26, 2003 post doesn't explicitly make predictions, but it's clearly written from the premise that the Bush administration would "completely fuck[] up" "the introduction of democracy to Iraq". Compare the end of the footnote on this February 5, 2003 post.
There might be empirical claims relating to WMD in later posts. Such might still count as predictions because the amount of WMD to be found in Iraq remained contentious for some time after the invasion.
↑ comment by Benquo · 2017-03-20T19:44:53.088Z · LW(p) · GW(p)
Putting zero weight on the estimates of people or institutions with a track record of misrepresentations seems obvious but also really hard to do, so it's interesting to see what sort of person can do it anyway, despite substantial social momentum on the other side. Overall, this seems like an extension of the recent Slate Star Codex post about lying on the internet. If lying is cheap and effective, then this level of caution is entirely appropriate.
To give a decision-relevant example, I think this sort of attitude would have long since given up on something like EA as mostly worthless. Is that excessively skeptical?
Replies from: Lumifer
↑ comment by Lumifer · 2017-03-20T20:07:31.030Z · LW(p) · GW(p)
Putting zero weight on the estimates of people or institutions with a track record of misrepresentations seems obvious
I don't know about that. How are you going to deal with information coming out of the political establishment, for example? There is an abundant track record of misrepresentations, but if you just discard all of it, you are left with very little.
Or take such an institution as the Internet X-D
Plus the OP quite explicitly decided that reversed stupidity is intelligence: "This was how I decided that it was worth staking a bit of credibility on the strong claim that absolutely no material WMD capacity would be found".
Replies from: dglukhov
↑ comment by dglukhov · 2017-03-20T20:47:19.892Z · LW(p) · GW(p)
Just out of curiosity, how much work would you expect it to take to look for evidence of WMD (or the lack thereof)? I'm sure it'd take more than just a couple of quick phone calls to the CIA, or even a trip to the region itself...
Replies from: Lumifer
↑ comment by Lumifer · 2017-03-20T21:04:05.227Z · LW(p) · GW(p)
How much work for whom to achieve what degree of confidence?
Replies from: dglukhov
↑ comment by dglukhov · 2017-03-21T12:56:39.966Z · LW(p) · GW(p)
I wouldn't know; I don't know the space of evidence in the first place. I guess in hindsight that question is a little silly, since you can't know until you know.
What I really wanted to capture was the idea that looking for such evidence seems highly impractical for the average person writing a simple blog. The logistics of going out and finding such evidence don't seem trivial. Unless I'm just not being creative, I'd at least start by integrating into the military operation there, which could range anywhere from active service to civilian contract work. Even then, I'd have to know the right people with the right informants; evidence of WMD would likely be sensitive information, and all the more likely to be kept under wraps for that reason.
Replies from: Lumifer
↑ comment by Lumifer · 2017-03-21T16:22:33.016Z · LW(p) · GW(p)
looking for such evidence seems highly impractical for the average person writing a simple blog
Right. Which implies that the average person shouldn't have a strong opinion on the topic.
Unless she can analyze the publicly-available contradictory information and come to a conclusion (which still shouldn't be particularly strong because it's all based on hearsay, essentially).
comment by Benquo · 2017-03-20T20:39:38.949Z · LW(p) · GW(p)
Good ideas do not need lots of lies told about them in order to gain public acceptance.
There's one construction of this which is obviously false - lies being told in support of X doesn't inherently discredit X, because often there are also lies being told supporting not-X, and they can't both be false. But in the stock options example, Davies is pointing to something more specific: a principled argument for lies, on the grounds that they are necessary to support the desirable policy. His application of this to the Iraq war generalizes this somewhat: when you find people explaining away the misleading statements of the principal advocates for an action or proposition, as just part of a sales pitch, you should suspect that the lies are in fact central to the case, and not just accidental.
Fibbers’ forecasts are worthless.
This is a pretty radical claim. It makes the most sense in conjunction with the last point, about audits. In the absence of any force holding people to account, once they've shown themselves willing to mislead at all, we should expect people to lie quite a lot. But in practice there are varying levels of audit, and I'm not sure what cognitive simplifications to use.
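A toy model makes the "shading" point concrete. In the sketch below the assumptions are entirely invented (the prior, the inflation range, and the shading factor): true project values come from a known prior, and the forecaster reports them inflated by an unknown, highly variable factor. With these numbers, a fixed downward shade beats taking the report at face value but is still worse than discarding the forecast and falling back on the prior mean.

```python
import random

def simulate(n=100_000, shade=0.5, seed=0):
    """Toy model of 'shading' a dishonest forecast.

    True project values come from a known prior; the forecaster reports
    the truth inflated by an unknown, highly variable factor. Compare
    three estimators: take the report at face value, apply a fixed
    downward shade, or ignore the report and use the prior mean.
    """
    random.seed(seed)
    prior_mean = 100.0
    err_face = err_shaded = err_ignore = 0.0
    for _ in range(n):
        truth = random.gauss(prior_mean, 20)
        report = truth * random.uniform(1.0, 4.0)   # unknown inflation factor
        err_face += abs(report - truth)
        err_shaded += abs(shade * report - truth)
        err_ignore += abs(prior_mean - truth)
    return err_face / n, err_shaded / n, err_ignore / n

# print(simulate())
# With these invented numbers, shading beats taking the report at face
# value, but is still worse than throwing the forecast away entirely.
```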
Replies from: Lumifer
↑ comment by Lumifer · 2017-03-20T21:14:19.172Z · LW(p) · GW(p)
a principled argument for lies, on the grounds that they are necessary to support the desirable policy
It's not like this is unusual. In the generic form this is probably as old as politics.
Here is a recent example: Jonathan Gruber being candid about how the Obamacare sausage was made:
This bill was written in a tortured way to make sure that the CBO (Congressional Budget Office) did not score the mandate as taxes. If CBO scored the mandate as taxes, the bill dies. Okay. So it was written to do that. In terms of risk-rated subsidies, if you had a law that said healthy people are going to pay in -- if you made it explicit that healthy people pay in and sick people get money, it would not have passed. Okay. Lack of transparency is a huge political advantage. And basically call it the stupidity of the American voter, or whatever, but basically that was really, really critical in getting the thing to pass.