The Flow-Through Fallacy
post by Chris_Leong · 2023-09-13T04:28:28.390Z · LW · GW · 7 comments
There is an informal reasoning fallacy where you promote something that seems like something we should "surely do", something that seems obviously important, but where we haven't thought through all of the steps involved, so we aren't actually justified in assuming that the impact will flow through.
Here are some examples:
- The Dean of a computer science department thinks that "surely, it's important to produce not only technically proficient graduates, but graduates who use their skills for good". So they mandate that all students take a "technology and ethics" class. The only problem is that none of their professors is an expert in this, or even interested in it, so it ends up being a poorly taught and poorly run subject: students put in the absolute minimum effort and forget everything a week after the exam.
- The prime minister of a country wants to reduce crime. He notices that the police department is severely underfunded so he significantly increases funding. Unfortunately, they're so corrupt and nepotistic that the department is unable to spend the funds effectively.
- A government wants to increase recycling, so they create a "national recycling day", reasoning "surely this will increase recycling". Unfortunately, most people end up ignoring it and, of those who actually are enthusiastic, most make an extra effort to recycle for a few days or even a week, then basically forget about it for the rest of the year. So it ends up having some effect, but it's basically negligible.
In each of these cases, the decision maker may not have chosen the same option if they'd taken the time to think it through and ask themselves if the benefits were likely to actually accrue.
Domain experience can be helpful to know that these kinds of issues are likely to crop up, but so is the habit of Murphyjitsu [? · GW]'ing your plans.
To be clear, I'm not intending to refer to the following ways in which things could go wrong:
- Side-effects: The government introduces snakes to reduce the rodent population, but this has the side-effect of causing more people to gain snake bites.
- Reactions: Amy donates $10 million to the Democrats. Bob hears about this and decides to donate $20 million to the Republicans in response.
- Value confusion: James spends years acquiring his Pokemon card collection as a kid. He builds an amazing Pokemon card collection, but regrets all the time and money he spent on it once he becomes an adult.
(Please let me know if this fallacy already has a name or if you think you've thought of a better one.)
comment by noggin-scratcher · 2023-09-13T08:30:26.675Z · LW(p) · GW(p)
There's the old syllogism,
- Something must be done
- This is something
- Therefore: this must be done
Not sure if there's a snappy name for it
comment by Richard Horvath · 2023-09-13T18:31:40.238Z · LW(p) · GW(p)
"Politician's logic"
Wiki: https://en.wikipedia.org/wiki/Politician%27s_syllogism
Snappy British sitcom clip:
comment by romeostevensit · 2023-09-13T05:10:13.037Z · LW(p) · GW(p)
Related effects are referred to under the headings of lost purposes and principal-agent problems.
comment by Nathaniel Monson (nathaniel-monson) · 2023-09-13T05:05:51.593Z · LW(p) · GW(p)
I think lots of people would say that all three examples you gave are more about signalling than about genuinely attempting to accomplish a goal.
comment by junk heap homotopy (zrkrlc) · 2023-09-13T15:34:48.492Z · LW(p) · GW(p)
I wouldn’t say that. Signalling, the way you seem to have used it, implies deception on their part, but each of these instances could just be a skill issue on their end: an inability to construct the right causal graph with sufficient resolution.
For what it’s worth whatever this pattern is pointing at also applies to how wrongly most of us got the AI box problem, i.e., that some humans by default would just let the damn thing out without needing to be persuaded.
comment by Viliam · 2023-09-13T19:43:35.835Z · LW(p) · GW(p)
How would one even distinguish between those who don't actually care about solving the problem and only want to signal that they care, and those who care but are too stupid to realize that intent is not magic? I believe that both do exist in the real world.
I would probably start by charitably assuming stupidity, and try to explain. If the explanations keep failing mysteriously, I would gradually update towards them not wanting to actually achieve the declared goal.
comment by Dweomite · 2023-09-14T02:14:26.897Z · LW(p) · GW(p)
Sounds similar to fabricated options [LW · GW].