Necessary, But Not Sufficient

post by pjeby · 2010-03-23T17:11:03.256Z

There seems to be something odd about how people reason in relation to themselves, compared to the way they examine problems in other domains.

In mechanical domains, we seem to have little problem with the idea that things can be "necessary, but not sufficient".  For example, if your car fails to start, you will likely know that several things are necessary for the car to start, but not sufficient for it to do so.  It has to have fuel, ignition, compression, and oxygen...  each of which in turn has further necessary conditions, such as an operating fuel pump, electricity for the spark plugs, electricity for the starter, and so on.

And usually, we don't go around claiming that "fuel" is a magic bullet for fixing the problem of car-not-startia, or argue that if we increase the amount of electricity in the system, the car will necessarily run faster or better.

For some reason, however, we don't seem to apply this sort of necessary-but-not-sufficient thinking to systems above a certain level of complexity...  such as ourselves.

When I wrote my previous post about the akrasia hypothesis, I mentioned that there was something bothering me about the way people seemed to be reasoning about akrasia and other complex problems.  And recently, with taw's post about blood sugar and akrasia, I've realized that the specific thing bothering me is the absence of causal-chain reasoning there.

When I was a kid, I remember reading about a scientist who said that the problem with locating brain functions by what's impaired when a particular area is damaged is that it's like opening up a TV set and taking out a resistor.  If the picture goes bad, you might then conclude that the resistor is the "source of pictureness", when all you have really proved is that the resistor (or brain part) is necessary for pictureness.

Not that it's sufficient.

And so, in every case where an akrasia technique works for you -- whether it's glucose or turning off your internet -- all you have really done is the equivalent of putting the missing resistor back into the TV set.

This is why "different things work for different people" in different circumstances.  And it's why "magic bullets" are possible, like vitamin C as a cure for scurvy.  When you fix a deficiency (as long as it's the only deficiency present), it seems like a "magic" fix.

But, just because some specific deficiency creates scurvy, akrasia, or no-picture-on-the-TV-ia, this doesn't mean the resistor you replaced is therefore the ultimate, one true source of "pictureness"!

Even if you've successfully removed and replaced that resistor repeatedly, in multiple televisions under laboratory conditions.

Unfortunately, it seems that thinking in terms of causal chains like this is not really a "natural" feature of human brains.  And upon reflection, I realize that I only learned to think this way because I studied the Theory of Constraints (ToC) about 13 years ago, and I also had a mentor who drilled me in some aspects of its practice, even before I knew what it was called.

But, if you are going to reason about complex problems, it's a very good tool to have in your rationalist toolkit.

Because what the Theory of Constraints teaches us about problem solving is that if you can reason well enough about a system to identify which necessary-but-not-sufficient conditions are currently deficient (or underpowered relative to the whole), then you will be able to systematically create your own "magic bullets".
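To make the causal-chain picture concrete, here is a minimal sketch in Python, using the car example from earlier. All of the condition names and truth values are hypothetical, chosen purely for illustration: each condition is necessary but not sufficient, some conditions decompose into further necessary conditions, and a "magic bullet" is simply whatever repairs the particular condition that happens to be deficient in this instance.

    # A minimal sketch of necessary-but-not-sufficient diagnosis.
    # Condition names and truth values are hypothetical, for illustration only.

    def diagnose(condition, checks):
        """Return the deficient leaf conditions beneath `condition`.

        `checks` maps a condition name to either:
          - a bool (does this necessary condition currently hold?), or
          - a list of sub-conditions, ALL of which are necessary.
        """
        value = checks[condition]
        if isinstance(value, bool):
            return [] if value else [condition]
        deficiencies = []
        for sub in value:
            deficiencies.extend(diagnose(sub, checks))
        return deficiencies

    # Each top-level condition is necessary, not sufficient, and some
    # conditions have further necessary conditions of their own.
    car_checks = {
        "car starts": ["fuel", "ignition", "compression", "oxygen"],
        "fuel": ["fuel in tank", "fuel pump works"],
        "ignition": ["battery charged", "spark plugs fire"],
        "compression": True,
        "oxygen": True,
        "fuel in tank": True,
        "fuel pump works": False,   # the one deficiency in this instance
        "battery charged": True,
        "spark plugs fire": True,
    }

    print(diagnose("car starts", car_checks))
    # -> ['fuel pump works']

The point is not the code, of course, but the shape of the reasoning: the fix that "works" is whichever necessary condition happens to be false right now, and a different instance of the same problem may have an entirely different deficient condition.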

So, I encourage you to challenge any fuzzy thinking you see here (or anywhere) about "magic bullets", because a magic bullet is only effective where it addresses the only insufficiency present in the system under consideration.  And having found one magic bullet is not equivalent to actually understanding the problem, let alone understanding the system as a whole.  (Which, by the way, is also why self-help advice is so divergent: it reflects a vast array of possible deficiencies in a very complex system.)

15 comments


comment by taw · 2010-03-23T18:16:57.071Z

(Which, by the way, is also why self-help advice is so divergent: it reflects a vast array of possible deficiencies in a very complex system.)

Scurvy advice was also vast and divergent before it turned out it was all wrong and the problem was solved by a magic bullet. And so was our cholera advice. And advice about every single issue about which we were wrong. Advice divergence can as easily reflect our lack of knowledge as actual complexity of the problem.

Replies from: Richard_Kennaway, pjeby
comment by Richard_Kennaway · 2010-03-23T18:34:42.647Z

The linked article on scurvy also describes how the magic bullet for scurvy was lost for a while. The problem was that people didn't know how the magic bullet worked or exactly what it was, so when faster ships reduced the risk of scurvy, they lost feedback on how effective their precautions were.

Even when the magic bullet works reliably and repeatably, you still need to understand why.

Replies from: taw
comment by taw · 2010-03-23T18:45:13.399Z

Understanding why and how is all great, but we don't understand why half of the things work the way they work - we usually only have good phenomenological descriptions and severely wrong theory.

If we had waited for the full laws of thermodynamics before starting to use fire, we'd still be gathering roots on the African savanna.

If small sugary drinks at the right times work, we need to figure out what the right amount is and in which circumstances, what the side effects would be, and which problems it can and cannot solve. If they don't work, we should figure that out too. The fine details of why it works can wait.

Replies from: pjeby
comment by pjeby · 2010-03-23T18:54:23.279Z

If small sugary drinks at the right times work

You haven't yet sufficiently defined what "work" means. Scientists have had trouble replicating some ego depletion experiments in humans, and there appear to be confounding factors introduced by the fact that it's the experimenters choosing the tasks, and the fact that the subjects are already motivated to co-operate.

This is almost nothing like the environment for individuals' personal/private akrasia.

That being said, I would be happy to be proven wrong. Feel free to go try it for a month, then come back and add it to the anti-akrasia techniques survey results.

comment by pjeby · 2010-03-23T18:50:23.973Z

Advice divergence can as easily reflect our lack of knowledge as actual complexity of the problem.

Certainly. It's something you'd expect to appear either way.

However, if the relationship between akrasia and glucose were the same as the relationship between scurvy and vitamin C, you would expect all of the divergent-but-successful self-help advice to involve ways of (directly or indirectly) modifying glucose availability or depletion, in the same way that all the divergent-but-successful scurvy cures affected vitamin C consumption or depletion (e.g. fresh meat, short trips, not using copper pots).

Replies from: taw
comment by taw · 2010-03-23T19:18:57.203Z

Most of the advice on scurvy did not address scurvy in any way - it was just wrong. The kind that worked - like short trips - was extremely far removed from the real solution.

Likewise with cholera advice. It was just ridiculously wrong.

comment by Jonathan_Graehl · 2010-03-23T20:44:09.318Z

I think most of us assumed that past a certain necessary level of glucose available to the brain, there would be little further benefit to willpower.

A related example: creatine significantly increases timed-IQ-test performance on average in vegetarians, but not in omnivores - creatine (and its precursors for endogenous production) is found in beef, pork, and fish much more than in plants.

What I'd like to know is where I lie on the marginal improvement/dose curve at any moment (as you warn, an accurate model of this may be quite complicated). If it's hard for me to predict this, maybe I can titrate (not all proposed magic bullets are easy to incrementally dose and quick to reflect results, but some are).

comment by Johnicholas · 2010-03-23T22:24:51.214Z

If anyone has not yet read it, the original post references (I think) a paper called "Can a biologist fix a radio?"

On the other hand, it might be a standard example from the "systems" discipline. Anyway, it's a good paper.

Replies from: pjeby
comment by pjeby · 2010-03-23T23:52:34.329Z

That's not what I was referring to; it really was something I read decades ago -- maybe in a Time/Life book about the brain. However, the paper you referenced is a hilarious read, and I was especially amused by this bit:

At this stage, the Chinese saying that it is difficult to find a black cat in a dark room, especially if there is no cat, comes to mind too often.

comment by saliency · 2010-03-23T19:54:30.912Z

Oversimplification has its uses.

In an environment where you get multiple chances to solve a problem and the cost of failure is small, it is often efficient to use trial and error. To use trial and error efficiently, it is best to fix all but one variable and confirm or eliminate possibilities.

The success of trial-and-error techniques influences the way we think about problems. We naturally seek to simplify the system to the point where it can be tested. When something works, we start with it next time; but if it does not work the next time, we move on to the next variable we can isolate.

The problem is that this way of thinking can infect other areas of reasoning. A person who has had the experience of his car not starting because his battery could no longer hold a charge may replace his battery first, before checking that his alternator is working.

Replies from: pjeby
comment by pjeby · 2010-03-23T21:33:36.719Z

Oversimplification has its uses.

Note that necessary-but-not-sufficient doesn't rule out oversimplification. You don't need to know about every necessary condition to solve a problem, only about the ones that are deficient in the current instance.

You just can't generalize from such a solution, if the evidence supports the idea that there may be other necessary conditions that are deficient in other cases.

Replies from: saliency
comment by saliency · 2010-03-24T14:08:57.911Z

"only about the ones that are deficient in the current instance."

How do you know a priori what is deficient? Investigation and diagnosis take resources and have an opportunity cost.

comment by Matt_Stein · 2010-03-30T18:04:45.010Z

I realize that it's not the main focus of the article, but I found the bit about locating brain functions, and whether a part of the brain is necessary or sufficient to cause some function, interesting. To me, that's the largest hole in my belief in materialism: we've observed that certain areas of the brain are necessary for some functions, but not that they are sufficient. I hope that once these areas have been properly simulated, they may prove sufficient, but there is some doubt.

Replies from: RobinZ
comment by RobinZ · 2010-03-30T20:35:33.838Z

I hope that once these areas have been properly simulated, they may prove sufficient, but there is some doubt.

Not a lot, though - we have strong reasons independent of personality to believe that materialism is correct, and it remains plausible that remaining questions can be answered without resorting to non-materialist explanations.

P.S. Welcome to Less Wrong! Feel free to post an introduction in the welcome thread.

comment by Wx_Doc · 2013-05-16T14:55:11.685Z

You have stated "For some reason, however, we don't seem to apply this sort of necessary-but-not-sufficient thinking to systems above a certain level of complexity... such as ourselves". The theologian spends a great deal of time with this question. Witness the life of Martin Luther and others who have considered which of righteousness, the works of righteousness, grace, and faith are necessary but not sufficient for salvation. To an unbeliever, I realize this exercise may be one of futility, but to the religious it is an all-consuming task, and the answer(s) are considered to have eschatological consequences.