Notes on Caution
post by David Gross (David_Gross) · 2022-12-01T03:05:21.490Z
This post examines the virtue of caution. As with my other posts in this sequence, I’m less interested in breaking new ground and more in gathering and synthesizing whatever wisdom I could find on the subject. I wrote this not as an expert on the topic, but as someone who wants to learn more about it. I hope it will help people who want to know more about this virtue and how to nurture it.
What is caution?
To a first approximation, someone with the virtue of caution is habitually alert to potential risks, weighs those risks well when making decisions, and deploys strategies to mitigate them.
I want to distinguish this definition from another one that suggests itself: Caution as the habit of making choices that minimize possible negative consequences. That is “risk aversion” and can be part of caution, or in an unbalanced form can become an unwise timidity.
However, unlike phrónēsis (prudence, practical wisdom), caution does focus on negative outcomes.
Is it a mistake to call the virtue “caution”?
Does framing this virtue as “caution”—rather than just subsuming caution under “practical wisdom”—put a thumb on the scale rather than weighing things wisely? How can you know ahead of time whether the best decision is more cautious rather than more bold? There is something to that criticism, but here are some reasons why this framing might nonetheless make sense:
For one thing, a virtue is a habitual variety of choice-making. There may well be advantages to making cautious choices by habit (and therefore by default) and incautious ones as an exception that requires a more deliberate override. Perhaps in any particular instance there is no good reason to prejudge it as being best met by caution, before you have a chance to deliberate, but for those many choices we make without deliberating much, a cautious default mode may save us a lot of trouble.
For another, the virtue of caution—properly understood—has to do less with risk-averse results of your decision-making than with a risk-aware process of decision-making. Caution does not insist that you make risk-minimizing decisions, but that you be alert for risks and carefully weigh those risks during your deliberation.
For another, in human lives risk/reward is not symmetrical. For example, death is a sort of risk-singularity for which there is no reward counterpart (unless you are among the believers in the possibility of eternal life). People do not see the range of possible future up-sides and down-sides as equivalent opposites, and this is not merely an irrational bias, but reflects certain asymmetries in the human condition.
A more mundane possibility is that the virtue is called “caution” because recklessness is more common or more problematic than timidity, and so if you’re counseling someone to aim at the golden mean, you’ll more likely be telling them to be more cautious. This would be similar to how we call the virtue “patience” even though there are a minority of people who are too long-suffering and would be better off losing their cool and not putting up with so much.
Caution and “prudence”
In my Notes on Prudence I noted that “prudence” was the traditional translation for phrónēsis (Greek) and prudentia (Latin) as the name of the virtue of practical wisdom, but that the word “prudence” has since become more associated with caution specifically. For example, the VIA Institute on Character, which takes a modern virtue-based approach, summarizes prudence in this way: “I act carefully and cautiously, looking to avoid unnecessary risks and planning with the future in mind.”[1]
I chose to stick with the more traditional and expansive definition of prudence-as-practical-wisdom when I discussed that virtue, and I’ll discuss this more restricted sense of prudence-as-caution in this post.
That said, it is difficult to discern where caution ends and practical wisdom begins. If you act or deliberate to avoid a danger or threat, you’re acting cautiously. But what if you do so in order to avoid something that is merely suboptimal: if you are careful not so much to avoid a danger but to avoid missing an opportunity? Is the consideration “I don’t want to look back on my life and realize I never took any big chances” also a cautious consideration?
Risks are complicated
The risks you are subject to at any time include a shifting variety of potential threats to your health, reputation, property, comfort, plans, family & friends, values, things you hold sacred, and so forth. In order to evaluate and respond to such diverse risks well, you must also at least implicitly have some way of ranking these many things or of evaluating trade-offs between them. If you weigh some of these components superstitiously or disproportionately, your evaluation of the portfolio of risks will be imbalanced, and your response to those risks will suffer.
You may find that you have a large-scale risk strategy that is subject to change, with effects that ripple throughout your decisions and habits. For example, at some point in your life you may have adopted a roughly “maximin” outlook toward life: trying to make decisions that protect you from the worst down-sides, but perhaps at the cost of taking risks that could gain you some of the better up-sides. Then something in your life or outlook changes and you decide that you would be better off sticking your neck out and reaching for the brass ring. This is one interpretation of the mid-life crisis: someone has arranged their life on conservative, playing-it-safe principles, and then abruptly asks “is this all there is?” and decides to switch strategies. What makes the mid-life crisis sometimes comic is the awkward way in which such a strategy tweak ripples out in the form of uncharacteristic decisions and value-rankings and risk tolerances—ones that can seem incongruous and ridiculous, at least until the pendulums resettle.
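The contrast between the two large-scale strategies described above can be made concrete with a toy decision rule. This is only an illustrative sketch; the option names and payoffs are invented:

```python
# Toy contrast between a "maximin" rule and a plain expected-value rule.
# The options and payoff numbers below are hypothetical.

options = {
    "play_it_safe": [1, 2, 3],        # narrow range of possible outcomes
    "stick_neck_out": [-10, 1, 20],   # wide range: big upside, big downside
}

def maximin_choice(opts):
    """Pick the option whose worst-case payoff is least bad."""
    return max(opts, key=lambda name: min(opts[name]))

def expected_value_choice(opts):
    """Pick the option with the highest average payoff (outcomes equally likely)."""
    return max(opts, key=lambda name: sum(opts[name]) / len(opts[name]))
```

With these made-up numbers, the maximin rule picks `play_it_safe` (worst case 1 versus −10), while the expected-value rule picks `stick_neck_out` (mean ≈3.7 versus 2). On this toy model, the mid-life crisis amounts to abruptly swapping which rule you run, with every downstream decision re-ranked accordingly.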
All of your possible options have consequences that in turn present their own sets of risks, for many of which you can only roughly estimate the probability distribution of the various possible degrees of severity. Each of those risks in turn invites a set of additional options in the form of possible mitigation strategies or other responses, and so ad infinitum. Each option also has potential benefits, and among the risks of each option is the set of opportunity costs of not having chosen differently.
For that matter, there’s an opportunity cost paid for putting in the effort to be better-informed about risk, better prepared, more thorough in evaluating options, and so forth. This too is a risk, and a chicken-and-egg problem: how do you know if this cost is worth paying or is too high until you’ve already paid it?
Given all this, it is probably beyond our grasp to choose optimally. We need shortcuts.
Unfortunately, many common heuristics are demonstrably faulty. Using them, people often estimate risks poorly and plan for them badly. Lists of typical human cognitive biases show few that are not also ways risk-assessment can go awry.[2] We seem to have a variety of contradictory devices that are good enough to help us make the day-to-day quick decisions we need to muddle through life, but that reveal themselves to be shockingly absurd when examined closely.
The popularity of casino gambling, and its addictiveness in some people, suggests that even when we gamify simple scenarios of risk management and provide prompt negative feedback for poor risk assessment, people can fail to correct appropriately.
Certainly if the stakes are high enough and we have enough time to think about it, we would be wise to insist on more rational methods than “just eyeballing it” with our ramshackle instincts. This is especially true in circumstances in which we are exposed to risks very different from those our ancestors would have faced—such as driving on the freeway, starting a course of chemotherapy, or sharing an unguarded opinion on an internet forum. In such cases we can expect even less reliable help from our instinctual heuristics.
One way we may be able to improve the reliability of our caution is to be better aware of cognitive biases so that we are more apt to notice when they lead us astray, or so that we can correct for their effects. For example, people are often tripped up by risks that are individually negligible but either cumulatively (cigarette smoking) or occasionally (fatigued driving) tremendous. Because these choices only rarely result in immediately threatening consequences, we may not instinctively regard them as risks to be taken into account. We can adjust for this by more deliberately and rationally looking on them as real risks.
It is unintuitive to judge tiny probabilities of catastrophic outcomes, but these can be really important. Plenty of people die from catastrophes that were themselves improbable in their specifics, but that are not at all exceptional when you see them as making up part of the large class of ordinary accidents to which mortals are liable. You are assailed on all sides at all times by a swarm of tiny risks, no one of which is at all likely to be instantly threatening, but each of which could happen to be the one that has your number. A habit of carefulness (that is, the virtue of caution) helps you to better your odds against the whole swarm, whereas a painstakingly rational calculation to counteract a habitual incautiousness on a risk-by-risk basis is comparatively expensive and ineffective.
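The arithmetic behind that “swarm of tiny risks” is worth making explicit. A minimal sketch, with hypothetical probabilities:

```python
# How individually negligible risks compound over repeated exposure.
# The probability and exposure counts here are invented for illustration.

def cumulative_risk(p_per_exposure: float, n_exposures: int) -> float:
    """Probability of at least one bad outcome over n independent exposures."""
    return 1 - (1 - p_per_exposure) ** n_exposures

# A 1-in-10,000 risk looks ignorable on any single occasion...
single = cumulative_risk(1e-4, 1)
# ...but over 30 years of daily exposure the odds turn against you:
lifetime = cumulative_risk(1e-4, 30 * 365)
```

With these made-up numbers, the single-exposure risk is 0.01%, but the thirty-year figure comes out to roughly two in three. This is why a standing habit of carefulness, which shaves a little off every per-instance probability, outperforms deliberating heroically about each instance one at a time.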
Related virtues and vices
The vice of deficiency goes by names like recklessness, carelessness, incautiousness. The vice of excess is timidity.
There are other failure modes of cautiousness that don’t map well to a linear deficiency/excess scale. For example, phobias and superstitions can cause carefulness to be targeted at the wrong sorts of threats. People also often exaggerate certain sources of risk in irrational ways that don’t rise to the level of phobia (things like air travel feeling riskier than road travel). Advertisers and political manipulators may drum up fears or promote safety-mimicking responses that are really only in their own interests. When risks are difficult to understand, or whenever “demonstrating safety” becomes more important than being cautious, rational action can give way to rituals and totems or to “safety theater.”
#yolo (see below) can go beyond recklessness to be intentional risk-seeking for thrills or for show.
Some people expect the universe to be just, to right the scales, to protect the righteous, to deal a fair hand. That turns out not to be a practical substitute for caution. We may not realize we’re counting on the cosmos to take our side until it fails us, and then we catch ourselves saying things like “how could they?” or “why me?”
Virtues in tension with caution include optimism (which may obscure caution’s concern for downsides with its focus on the upsides) and boldness/daring (which encourages you to take high-reward gambles now and again). A common failure of caution is when you over-represent the possible rewards of risks in your calculations and then choose unwisely based on this exaggeration. This seems to be at least part of what’s going on in casino gambling and lotteries: wins are big and flashy (though infrequent); losses are small and subdued (but cumulatively costly).
Virtues that can come to the assistance of caution include attention, focus, foresight, and curiosity (how could this go wrong?). Willful ignorance can interfere with risk analysis. It is easy to get in the habit of substituting the most convenient of either “here there be dragons” or “nothing to see here” as placeholders for things you haven’t investigated well. If you are already motivated to either engage in or refrain from some course of action, you may be tempted to use placeholders like these to justify such a decision.
Subject-matter expertise can help you to better intuit when things don’t smell right and it’s time to get up your guard. “Wisdom” more generally helps you anticipate the variety of potential dangers, estimate their likelihoods, and mitigate them most efficiently.
If we lack courage, some sorts of frightening risks will seem more dangerous than they really are because we “fear fear itself.” Lack of courage can masquerade as caution and can hide the fact that we are deficient in caution. Much of the challenge of courage has to do with mastering our emotional response to fear, whereas much of the challenge of caution has to do with the cognitive challenge of assessing risk well. Still, there is some overlap, and some people who think of themselves as overly risk-averse may need to work on courage as much as or more than on risk-assessment.
Caution is a component or ingredient in virtues like frugality, care, prudence, preparedness, fitness, and know-how.
#yolo
Sometimes (and always, to some extent) danger envelops us such that all of our paths forward are dangerous ones. And sometimes we may rationally decide that the promise of exceptional gains merits taking on exceptional risks. But there are also occasions on which some people appear to be intentionally incautious, as if recklessness were itself desirable—they flout caution in order to take on what seem to be gratuitous risks. Why might this be, and can this be justified? Here are some possibilities:
- It is an excusable folly of youth.
- There may be some survivorship bias here, but that probably can’t explain all of it.
- Youth are less experienced, don’t always have a good grasp of consequences, and haven’t yet been burned once, so they aren’t yet twice shy.
- Perhaps the rewards of risk-taking are higher at a younger age, or the costs are lower? When you are young, you may have more of a support network to pick you up when you fall, for one thing. And maybe a higher percentage of rewards are available through one-off feats of derring-do during youth, while adults are better equipped to cash in on slow-and-steady long-term plans. I’m skeptical and would want to see the math before I’d accept this argument, though.
- It could be peacocking, in a sexual selection sense.[3] That’s a plausible just-so-story, anyway.[4] This might also explain the stereotypical sexual dimorphism in #yolo behavior. Men might be more punished than women for being mediocre or more rewarded for being exceptional, in terms of reproductive success, and this could encourage young men to take more chances.
- It is an impressive demonstration of courage. People often think of those who court danger, at least in some ways, as being formidable, dashing, and vigorous (not merely endangered, vain, or foolhardy).
- To the person who confronts danger, this can be thrilling, either from the adrenaline rush or from the more cerebral sense that one is dancing with death, taking one’s destiny into one’s hands, living fully in the moment, and that sort of thing. For example, a researcher who studied skydiving enthusiasts said such a person “passes from a state of fear to a condition of happiness and excitement characterized by hyperrealism and perception of time which is focused on the present. Once the risk behaviour has come to a happy end, a very pleasant feeling of self-realization develops. [They] talk about having felt a kind of purification, an amplification of the self and of feeling a higher level of self-determination.”[5]
- Curiosity may also be a motive for incautious behavior (what will really happen if I stick a fork in the light socket?).
- Deliberately putting yourself into a dangerous situation can be a way of testing your limits and learning more about your capacity or resilience.
- You might put yourself in “danger” because you idiosyncratically do not believe in the danger, and you hope to debunk invalid warnings.
- Some dangers are only apparent (for example, those in certain stunts or magic tricks).
- You might deliberately tempt disaster in order to learn from mistakes. When the stakes are low, it can be useful to practice being innovative to overcome deliberately invited challenges. For example, a strong player in chess may offer to handicap themselves against a weaker opponent (e.g. by removing certain of their pieces before gameplay begins) in order to better learn from the game.
On reflection, some of these excuses for incautious-seeming behavior strike me as inadequate. In particular, the excuse that you “feel more alive” when dancing with death I think deserves more scrutiny than it’s usually given. We may nod or shrug when we hear this excuse from, say, “free solo” rock climbers, but it should raise our suspicions that we hear similar words from people who deliberately cut or burn themselves as a symptom of depression or certain personality disorders. “Depend upon it, sir, when a man knows he is to be hanged in a fortnight, it concentrates his mind wonderfully,” said Samuel Johnson,[6] but such wonders aren’t worth getting hanged over. (There are healthier ways to improve your focus.) If you feel you need to brush up against the final curtain in order to feel alive, maybe that’s more an issue to confront than an urge to indulge.
That said, it’s true that times of danger can, in retrospect, be charged with significance and feel like meaningful, valuable landmarks in your life: times when you were put to the test and your character developed a new polish. What distinguishes these from mere #yolo stunts, though, is that typically they involve real stakes. Surviving a “hold my beer” moment may be thrilling, but is unlikely to be life-defining (except in a bad way, e.g. if you are maimed).
Sometimes you hear it said that “it’s better to regret something you have done than to regret something you haven’t done”[7] or you hear that people, as they approach the end of their lives, regret not having taken more chances. I think there is good reason to be suspicious of this line of thought. For one thing, it’s too easy to cast the glow of the best possible outcome onto risks you regret not having taken. Really, what you regret is missing out on that rosy-colored fantasy-outcome. The real-life distribution of likely outcomes was probably far different. For another, I don’t think end-of-life regrets of this sort deserve uncritical respect, even if they turn out to be more than anecdata. Survivorship bias is one reason (those who live long enough to have regrets about their caution may have been more well-served by their caution than they realize). But also, at the end of life you may sensibly discount mere survival: caution to preserve your few waning years may no longer seem as valuable as it once did. Meanwhile, risks for great rewards may seem more out-of-reach than before. When you cast your mind back to when you had an opportunity to roll the dice you may wish you had another chance since the wager now seems more in your favor.
However, it can make sense to be cautious about being overcautious. You don’t want to be so afraid of taking risks that you lose out on opportunities for worthwhile rewards, or to be so obsessed about preserving your life that you are afraid to live it. But I think this, properly understood, is not in opposition to the virtue of caution but is incorporated in it.
How to develop the virtue of caution
Caution incorporates several distinguishable strategies:
- Anticipate possible risks.
- Avoid exposing yourself to risk unnecessarily.
- Take preventative steps to reduce negative consequences.
- Prepare to cope with such consequences.
- Act with care and attention.
This sort of subdivision can be helpful because each component can go awry in its own way. You may be able to most effectively improve your caution by concentrating on one component of it.
For example, if you are frequently blindsided by things that other people seem to more-easily anticipate, you may need to work on the first of these components. If for you “what’s the worst that could happen?” is always merely a rhetorical question, you may want to apply yourself here. People who have irrational phobias also may recognize that there is something awry in the way they anticipate possible sources of risk.
People who take unnecessary risks, or who do not seem to take risks into account when they choose among options, would be wise to attend to the second subcomponent.
There are some risks that can and should be avoided, but others must be faced. For those especially you need the next two components of caution: mitigation and preparedness. You cannot prevent the rain if the storm comes, but you can bring an umbrella and an extra pair of dry socks.
Finally, although cautious choice involves imagining, thinking ahead, and anticipating possibilities, cautious action requires focus on the present and actual: taking care, not getting sloppy or skipping steps, staying alert. “Watch what you’re doing.”
Know the limits of instinct
Particularly when you are dealing with things that are very high, very deep, very sharp, very fast, very hot, very cold, very bright, very powerful, very heavy, very big, and so forth: don’t just eyeball it. Your instincts are liable to mislead you when they confront something outside of the parameters in which they evolved.
Consider instead (when these are available) the hard-won risk mitigation strategies embodied in the practices of experts, institutions, and traditions.
Don’t be too eager to override or skip safety features that seem at first glance to be unnecessary or excessive. Such things may be a small price to pay to avoid uncommon disasters. If you do things extra-cautiously by default, you’ll be doing things extra-cautiously that one time when it really matters.
Another way instinct misleads is by exaggerating the danger of things that may emotionally feel scary (harmless spiders, graveyards) and discounting objectively dangerous things that feel reassuringly comforting (climate-controlled car interiors, cigarette breaks, cocktails).
Attend to good advice, #rtfm, #lfmf
Don’t waste time learning from your own mistakes when there are so many excellent prefab mistakes out there you could learn from instead. When you embark on something unfamiliar, there’s no shame in asking for advice from those with experience, or in reading the manual first.
Meditate on others’ misfortunes. Indulge your pity or your schadenfreude if you must, but use the opportunity to learn from their fail. If you lend a sympathetic ear to those who are suffering, one beneficial side effect is that you may learn more about paths that lead to suffering and about choices people wish they had made differently. Where did they make a wrong turn or fail to prepare? How might you have known better?
Reporting, books, and literature of all sorts can also tell stories of disasters and of missed opportunities to avoid them (or nick-of-time decisions that did). However such stories are selected for how gripping they are rather than for how representative or accurate they are, so caveat lector.
Dreams and daydreams
Maybe this is just a personal foible, but I’ve noticed that in idle moments (or during meditation when I’m being especially alert to what spray my mind tosses up from its turbulence), my thoughts often turn to improbable horror stories. What if I were accused of a murder I didn’t (or did!) commit? What if I were walking along the Golden Gate Bridge with my mother and a sudden gust of wind blew her into the deep? What if a passenger jet crashed into the parking lot of my apartment complex? (These aren’t very good horror stories, typically, nor very realistic.) I used to get frustrated at my brain for dangling this sort of cheap macabre entertainment into my consciousness. Now I wonder whether maybe it’s trying to prepare me for the sort of rare, long-tail sorts of crises that do pop up from time to time. My mind seems to take crisis elements in odd combinations, almost improv-theater style, and then present them to me as problems to solve, in my otherwise free time.
It does seem to me that some of my real-world precautions have been prompted by side-effects of such daydreams. Is it an abstract actuarial understanding or some wandering fantasy of an unlikely conflagration that actually gets me to check my smoke alarm batteries or buy a fire extinguisher?
This sort of thing seems to happen in dreams, too. They often involve me having some agenda that keeps getting frustrated by the warped Escher landscape of the dream-world, so that I have to continually improvise new solutions. Is this a kind of training for how to handle novel challenges?
Dystopian literature might also be seen in this light: as exploring certain possible-if-unlikely disaster modes as a way of preparing for or preventing them.
“Catastrophizing” is a pathological version of this. People who catastrophize exaggerate the likelihood and the danger of the worst possible outcomes of scenarios. This can result in anxiety and in excessive timidity.
Drugs and risk-taking
One way you can promote your caution is to beware of decisions you make under the influence of drugs. Alcohol, of course, is a notorious #yolo promoter. But any drug that causes altered states of consciousness as part of its menu of effects may also alter the way you detect, evaluate, and react to risk.
Stimulants, medications and recreational drugs that tweak dopamine, and even acetaminophen/paracetamol can demonstrably and predictably incline people to more risky choices.[8]
Consider the advantages of making decisions about risk by using similar brain chemistry to that with which you calibrated your risk evaluation. Or, more plainly: maybe sleep on it, and see if it still seems like a good idea tomorrow when you’re sober.
More generally, your state of mind is important to your ability to practice caution. The more well-rested you are, the less stress you are under, the less clutter or distraction there is in your environment, the more capable you are of focusing, the more you will be able to make cautious decisions and take careful actions.
Recreational drug use can itself be an incautious behavior. It is in some cases an unwise risk that comes pre-packaged with additional incentives to take the risk or rewards for having taken it (the high) along with neurochemical changes that degrade or distort caution or that indeed promote risk-taking. When you look at that package as a whole, it ought at the very least to get your guard up.
Institutional help
In this post I concentrate on caution as a personal virtue, rather than on safety generally. However, safety-oriented conventions as embodied in institutions, authorities, and traditions can supplement personal caution, and it can be a component of personal caution to regard such things wisely. Yet ostensibly safety-oriented conventions can also be dangerous, even (especially?) well-meaning ones. One way to exercise social responsibility is by improving and promoting the best such conventions, while helping to usher the worst to their deserved obsolescence.
Adding safety features and procedures to some process may make that process safer but at the same time more costly, and thereby incentivize the use of alternate processes that are even less safe, as Paul Graham asserts happened with nuclear power in the United States.
People sometimes respond to the presence of safety features by using them to compensate for additional risk rather than allowing them to reduce risk. In particular, safety interventions that reduce the harm or likelihood of otherwise frequently-encountered negative consequences of a risk can increase people’s willingness to undertake that risk and increase their exposure to otherwise less-frequently encountered negative consequences. For example, a frequently-encountered risk of speeding is traffic tickets; a less frequently-encountered risk is a fatal traffic accident. Some feature that makes you more safe from the frequently-encountered risk (a radar detector, a “back the blue” bumper sticker) may thereby encourage you to speed and expose yourself to the less frequently-encountered but more severe danger.
(One economist recommended, tongue-in-cheek I think, that vehicles have sharp spikes protruding from the center of their steering wheels such that accidents would be more dangerous to drivers, in order to counteract this sort of risk compensation and thereby make driving more safe.)[9]
Similarly, some safety features—because they are meant to supplement fallible human attention—can cause people to compensate with increased inattention. Traffic engineer Hans Monderman made a name for himself by demonstrating how removing lane markers, crosswalks, advisory signage, traffic lights, and other such safety features can make people more cautious and alert and as a result can reduce traffic accidents (a strategy called “shared space”).
You can count me among those who tend to believe that our society is over-regulated, choked with laws, and stiflingly litigious. And yet I have to acknowledge that it is difficult to come up with an objective answer about whether or not such a belief is really true. Regulations and lawsuit-preventing precautions, when they do successfully prevent disaster, typically do so without fanfare: Nobody notices the catastrophe that didn’t happen. It’s possible that Ralph Nader saved my life at some point, and in my obliviousness I never even knew to thank him. I bristle at the paternalism of “nudges” like taxing alcohol and subsidizing vaccines, but will grudgingly admit that, at least considered in some isolation, they can probably save lives. At the same time, the victims of overzealous safety regulation can also be hidden: people whose lives could have been saved by a medical device that was red-taped out of existence, people who turned to dangerous black-market drugs when legal alternatives were prohibited, or the many diverse benefits people have had to forego in order that the expensive demands (or, yes, nudges) of the safety bureaucracies may be satisfied.
Sometimes institutions adopt “safety” procedures, but do so under the influence of incentives that do not actually prioritize the safety of those people who will be following those procedures. It can require some sophisticated understanding of these institutions and incentives to understand when this might be the case. (Other times, with only a little experience of how the world works, you can recognize that a three-page, fine-print “important safety advisory” is really only an “our lawyers made us say something that should be obvious to anyone.”)
Regulators may come to understand that their only job is to reduce direct risk, and so they lose all sense of cost-benefit proportion. Would more lives be saved if air traffic were made cheaper by being allowed to become more dangerous, if this also meant fewer people traveling by car? Maybe so, but those who regulate airline safety are unlikely to find that saving those lives is part of their job description.
As the covid pandemic emerged, safety guidance from some of the most authoritative sources was disturbingly hit-or-miss. Public health authorities, whose incentives at times seemed to be dominated by concern about optics and ass-covering, too frequently gave advice (or promoted/hindered mitigation strategies) in a way that was sub-optimal or indeed harmful from a safety perspective. And unfortunately, the sensible alternatives to this struggled to be heard over a tsunami of superstition and snake-oil and disinformation. Anyone who wanted to behave with an appropriate level of caution had to put in extraordinary effort to learn how to do so.
Jason Crawford’s post, “Towards a Philosophy of Safety” has some good thoughts on how we should think about safety in these institutional contexts, what biases we should be vigilant about, and what techniques might help us better optimize safety trade-offs.[10] Crawford also considers how our traditional institutional safety culture, which typically springs into action to prevent repeats of disasters that have already happened, might be retooled for an era of rapid technological change in which we would be wise to prevent novel disasters from happening the first time.
The Precautionary Principle
“The Precautionary Principle” has been formulated in a variety of ways.[11] The lack of a consensus definition means that sometimes people form strong opinions about different conceptions of it and then agree or disagree with one another without being clear about what they’re agreeing or disagreeing about.
In broad outline, this principle goes something like this: If a sufficiently bad consequence is a foreseeably possible (even if not certain or probable) outcome of some endeavor, then before we pursue that endeavor, we ought to take steps to guard against that possible consequence (even in the absence of certainty).
(There is also a closely-related “Catastrophe Principle,” which holds that if the foreseeably possible consequence is sufficiently apocalyptic [? · GW], you must act to prevent it, even if it seems unlikely. There are some Pascal’s Mugging [? · GW]-like edge cases of this to beware of, but it can be a persuasive argument.[11])
Sometimes the Precautionary Principle is deployed to put the burden-of-proof on those who propose a change from the status quo: demonstrate that you have thought through the possible consequences and that you are prepared to meet them before you go through with your plan. This is a conservative, status-quo-preserving, Chesterton’s Fence [? · GW]-upholding interpretation. Critics point out that sometimes bad consequences may plausibly but unprovably result not only from an action, but also from preventing that action, or from failure to act. What standard justifies [? · GW] holding action to a higher burden of proof than inaction?
It seems possible to interpret the Precautionary Principle in such a way that it similarly treats action and inaction, risks and remedies, advances and retreats. But then it seems to become simply the more mundane advice to anticipate and prepare for the possible consequences of whatever comes to pass, quantify risks carefully, and choose your course in life wisely based on these possible consequences: in other words, to practice caution.
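The symmetric reading described above amounts to ordinary expected-harm comparison: treat acting and not acting as two options, each with its own foreseeable downside, and pick whichever has the lower expected harm. A minimal sketch, with all probabilities and harm values as hypothetical placeholders:

```python
# A minimal sketch of the symmetric Precautionary Principle: compare the
# expected harms of action and inaction instead of privileging the status quo.
# All probabilities and harm magnitudes are hypothetical placeholders.

def expected_harm(p_bad: float, harm_if_bad: float) -> float:
    """Expected harm of an option with one salient bad outcome."""
    return p_bad * harm_if_bad

# Option 1: act (e.g., approve a new medical device); small chance of harm.
harm_act = expected_harm(p_bad=0.02, harm_if_bad=1_000)

# Option 2: don't act; larger chance that the status quo keeps causing harm.
harm_wait = expected_harm(p_bad=0.10, harm_if_bad=500)

# Neither option gets a free pass: the lower expected harm wins.
best = min(("act", harm_act), ("wait", harm_wait), key=lambda t: t[1])
print(best)
```

Under these particular assumptions inaction is the riskier choice, which is the critics’ point: an asymmetric principle that only ever scrutinizes action would have blocked the better option here.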
Appendix: We’re Beginning Our Descent
It may be helpful, when assessing your risk of death in particular, to remind yourself of the statistically most common causes of death, so that you aren’t overly fearful of rare but flashy sources of doom, nor astonished by the creepingly common assassins. To this end, I have composed a mnemonic poem:
Were you warned? of March? the Ides?
I’ve been warned too: triglycerides.
Chronic obstructive lung disease,
Blocked coronary arteries.
None would marvel if I croak
From diabetes or a stroke.
I miss a step; head-first I fall.
That lump was cancer after all.
A full-grown man, could it be true:
Laid on the slab just from the flu?
Before they had a good vaccine,
I rubbed my nose: Covid-19.
“It’s like he never saw the red”
(The driver explains why I’m dead).
A brain rot nobody can cure,
A “hunting accident” they’re sure.
Charts of mortality provide
These common ends and more beside.
The Steward said as we embarked:
“The exit rows are clearly marked.”
1. “Character Strengths: Prudence,” VIA Institute on Character: The 24 Character Strengths
2. Flavio Gerbino, “Logical Fallacies when Assessing Risks,” scip
3. George Leybourne, “The Daring Young Man on the Flying Trapeze” (1867)
4. Ryan H. Murphy, “The Rationality of Literal Tide Pod Consumption,” Journal of Bioeconomics (2019)
5. Stephen Lyng, “A social psychological analysis of voluntary risk taking,” American Journal of Sociology (1990), as described in Fiorenzo Ranieri, “Extreme Risk Seeking Addiction: Theory and Treatment,” British Journal of Psychotherapy (2011)
6. James Boswell, Life of Samuel Johnson (1791), entry for September 19, 1777
7. Butthole Surfers, “Sweat Loaf,” Locust Abortion Technician (1987)
8. @Scott Alexander [LW · GW], “The Psychopharmacology Of The FTX Crash,” Astral Codex Ten, 16 Nov. 2022; Peter Dockrill, “The Most Common Pain Relief Drug in The World Induces Risky Behavior, Study Shows,” ScienceAlert, 14 Nov. 2022
9. Jason Torchinsky, “There’s Actually A Name For A Steering Wheel With A Big Spike In The Middle,” Jalopnik, 22 Jan. 2017
10. @jasoncrawford [LW · GW], “Towards a Philosophy of Safety” [LW · GW], LessWrong, 16 Sep. 2022
11. Neal A. Manson, “Formulating the Precautionary Principle,” Environmental Ethics (2002)