The Mind Is A Shaky Control System

post by DirectedEvolution (AllAmericanBreakfast) · 2021-09-29

Judges instruct juries to disregard statements ruled inadmissible in court. Juries have a hard time following those instructions.

The mind is a shaky control system. It trembles, like your hands. It gets squished by g-forces, like your body. Information flows in, and the mind responds, whether you want it to or not.

But you do have some control. You can choose to increase or decrease the amount of attention you pay to certain inputs. You can choose what to seek out. You can make decisions about what to include or exclude from a formal, explicit model. You can mentally "tag" pieces of information as being "no evidence," "weak evidence," "strong evidence," and so on. A practice of rationality is like an umbrella to protect you from the rain, or a windbreaker to keep off the wind. It doesn't always work, but it might help. There's a theory to it, and there's a practice. The practice doesn't always look quite like the theory.

How these attempts to control your own mind's response to information will actually affect the way you synthesize all the evidence available, and how that synthesis will ultimately inform your decisions, is hard to specify. When you say that an argument from authority is "not evidence," how does hearing that argument, and tagging it as "not evidence," affect your actual synthesis? How does that synthesis affect your decision-making?

We may not be able to describe this process in detail. But it surely diverges from any ideal, mathematical specification of a perfect process for interpreting evidence and forming judgments. The mind is an imperfect control system.

So is the body. My hands shake, I bump into things, and I struggle to approach every task with the level of power, grace, relaxation, and focus that would be most appropriate. Over a lifetime of banging my literal head on literal doorframes, I've learned both how to improve my body control and how to mitigate my imperfections.

I've gone through the same process with learning to control my mind, and mitigate its failings.

I recognize that many forms of evidence commonly regarded as "not evidence" are more precisely forms of "extremely weak evidence." The fact that Andrew Yang once claimed that "the smartest people in the world predict that 1/3 of Americans will lose their jobs to automation in 12 years" is, perhaps, extremely weak evidence that the prediction is correct.

But most people with a modicum of rationality would describe this statement as an argument from authority. I think this is wisdom. Such forms of "extremely weak evidence" are, first of all, so weak that they're not worth repeating. We should punish those who clog up the airwaves with them by mocking such claims as fallacious. Furthermore, by mentally "tagging" this statement as "fallacious," rather than as "extremely weak evidence," I believe I prevent my mind from exaggerating the weight it deserves. This is me grappling with the limitations of my own mind.
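To make the quantitative intuition concrete, here is a minimal sketch in Python of a Bayesian update in odds form. The prior of 0.10 and the likelihood ratios are hypothetical numbers I've chosen for illustration, not anything from the post. The point: rounding an "extremely weak" likelihood ratio down to exactly 1 ("not evidence") costs almost nothing, while letting the mind treat the claim as strong evidence costs a lot.

```python
def update(prior_prob, likelihood_ratio):
    """Bayesian update in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior_prob / (1 - prior_prob)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

prior = 0.10  # hypothetical prior credence in the automation prediction

print(update(prior, 1.0))   # "not evidence": ratio 1 leaves the prior at 0.100
print(update(prior, 1.05))  # "extremely weak evidence": moves it to ~0.104
print(update(prior, 10.0))  # the failure mode: over-weighting pushes it to ~0.526
```

On these (made-up) numbers, the gap between "not evidence" and "extremely weak evidence" is about half a percentage point, while the gap between "extremely weak evidence" and "strong evidence" is over forty. That asymmetry is why the lossy "fallacious" tag can be a good trade.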

The mind has weird physics. With physical matter, F = ma: at a given acceleration, force is directly proportional to mass. When it comes to thoughts, however, F ≠ ma. A lightweight piece of evidence can exert a disproportionate amount of force on the mind. One way we can compensate is by explicitly tagging evidence as "I should ignore this." The ignoring probably won't be complete, for the same reason that juries can't simply forget inadmissible statements they've already heard. But it will help.
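If tagging works at all, one way to picture it is as a clamp on the force a piece of evidence is allowed to exert. This is a sketch under that assumption, not a model of what the mind actually does; the function name, tag names, and numbers below are my own inventions for illustration.

```python
def tagged_update(prior_prob, felt_ratio, tag):
    """The felt_ratio is how strong the evidence *feels*; the tag clamps
    that ratio before it enters the synthesis."""
    clamp = {"fallacious": 1.0, "weak evidence": min(felt_ratio, 1.1)}
    lr = clamp.get(tag, felt_ratio)
    odds = prior_prob / (1 - prior_prob) * lr  # odds-form Bayesian update
    return odds / (1 + odds)

print(tagged_update(0.10, 10.0, None))          # untagged: feels strong, ~0.526
print(tagged_update(0.10, 10.0, "fallacious"))  # tagged: force clamped, 0.100
```

The clamp throws away a sliver of real evidential weight, which is exactly the tradeoff the next paragraphs discuss: a deliberate divergence from ideal reasoning that compensates for the mind's tendency to over-respond.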

Rationality training ought to make us more rational. One way that could work is by teaching us the ideal form of rational reasoning and training us to execute it. An alternative, and one that seems more plausible to me, is to understand weird tricks like mentally tagging extremely weak evidence as "not evidence" or "fallacious." It's valid to consider the ways these tricks diverge from ideal reasoning. But we can and should also ask why they nevertheless help the human mind think more accurately.

Sometimes there will be alternatives to these tricks that both mirror an ideal reasoning process and work nicely with human psychology. Other times there will be a tradeoff. Understanding how a trick diverges from ideal rationality is helpful; understanding why that divergence nevertheless serves the human mind is equally valuable. In general, I think we should address both questions whenever we explore rationality techniques and try to improve our thinking.
