Gems from the Wiki: Do The Math, Then Burn The Math and Go With Your Gut

post by habryka (habryka4), riceissa · 2020-09-17T22:41:24.097Z · LW · GW · 3 comments

This is a link post for https://www.lesswrong.com/tag/do-the-math-then-burn-the-math-and-go-with-your-gut

Contents

  History
  See also
  External links

During the LessWrong 1.0 Wiki Import [LW · GW] we (the LessWrong team) discovered a number of great articles that most of the LessWrong team hadn't read before. Since we expect many others haven't read them either, we are creating a series of the best posts from the Wiki to help give those hidden gems some more time to shine.

The original wiki article was written entirely by riceissa [LW · GW], whom I've added as a coauthor on this post. Thank you for your work on the wiki!


"Do the math, then burn the math and go with your gut"1 [? · GW] is a procedure for decision-making that has been described by Eliezer Yudkowsky [? · GW]. The basic procedure is to go through the process of assigning numbers and probabilities that are relevant to some decision ("do the math") and then to throw away this calculation and instead make the final decision with one's gut feelings ("burn the math and go with your gut"). The purpose of the first step is to force oneself to think through all the details of the decision and to spot inconsistencies.

History

In July 2008, Eliezer Yudkowsky wrote the blog post "When (Not) To Use Probabilities", which discusses the situations under which it is a bad idea to verbally assign probabilities. Specifically, the post claims that while theoretical arguments in favor of using probabilities (such as Dutch book and coherence arguments) always apply, humans have evolved algorithms for reasoning under uncertainty that don't involve verbally assigning probabilities (such as using "gut feelings"), which in practice often perform better than actually assigning probabilities. In other words, the post argues in favor of using humans' non-verbal/built-in forms of reasoning under uncertainty even if this makes humans incoherent/subject to Dutch books, because forcing humans to articulate probabilities would actually lead to worse outcomes. The post also contains the quote "there are benefits from trying to translate your gut feelings of uncertainty into verbal probabilities. It may help you spot problems like the conjunction fallacy. It may help you spot internal inconsistencies – though it may not show you any way to remedy them."2 [? · GW]

In October 2011, LessWrong user bentarm gave an outline of the procedure in a comment in the context of the Amanda Knox case. The steps were: "(1) write down a list of all of the relevant facts on either side of the argument. (2) assign numerical weights to each of the facts, according to how much they point you in one direction or another. (3) burn the piece of paper on which you wrote down the facts, and go with your gut." This description was endorsed by Yudkowsky in a follow-up comment. bentarm's comment claims that Yudkowsky described the procedure during the summer of 2011.3 [? · GW]

In December 2016, Anna Salamon [? · GW] described the procedure parenthetically at the end of a blog post. Salamon described the procedure as follows: "Eliezer once described what I take to be a similar ritual for avoiding bucket errors, as follows: When deciding which apartment to rent (he said), one should first do out the math, and estimate the number of dollars each would cost, the number of minutes of commute time times the rate at which one values one's time, and so on. But at the end of the day, if the math says the wrong thing, one should do the right thing anyway."4 [? · GW]
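Salamon's apartment example amounts to simple arithmetic. A minimal sketch follows; every figure (rents, commute times, the dollar value of a commute minute) is invented here for illustration, not taken from her post.

```python
# Toy version of the apartment calculation Salamon describes.
# All dollar figures and commute times are invented for illustration.

VALUE_OF_TIME = 0.50   # assumed dollars per minute of commute
DAYS_PER_MONTH = 22    # assumed commuting days per month

def monthly_cost(rent, commute_minutes_one_way):
    """Rent plus the dollar value of round-trip commute time per month."""
    commute_cost = 2 * commute_minutes_one_way * VALUE_OF_TIME * DAYS_PER_MONTH
    return rent + commute_cost

apartments = {
    "close but pricey": monthly_cost(rent=1800, commute_minutes_one_way=10),
    "cheap but far":    monthly_cost(rent=1400, commute_minutes_one_way=45),
}

for name, cost in apartments.items():
    print(f"{name}: ${cost:.0f}/month")

# Having done out the math, the procedure says the final decision is
# still made by judgment -- "if the math says the wrong thing, one
# should do the right thing anyway."
```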

See also


  1. Qiaochu Yuan. "Qiaochu_Yuan comments on A Sketch of Good Communication" [LW(p) · GW(p)]. March 31, 2018. LessWrong.
  2. Eliezer Yudkowsky. "When (Not) To Use Probabilities" [LW · GW]. July 23, 2008. LessWrong.
  3. bentarm. "bentarm comments on Amanda Knox: post mortem" [LW(p) · GW(p)]. October 21, 2011. LessWrong.
  4. Anna Salamon. "'Flinching away from truth' is often about *protecting* the epistemology" [LW · GW]. December 20, 2016. LessWrong.

3 comments

Comments sorted by top scores.

comment by Adam Zerner (adamzerner) · 2020-09-18T00:34:58.994Z · LW(p) · GW(p)

Relevant excerpt from Chapter 86 of HPMOR:

Harry was wondering if he could even get a Bayesian calculation out of this. Of course, the point of a subjective Bayesian calculation wasn't that, after you made up a bunch of numbers, multiplying them out would give you an exactly right answer. The real point was that the process of making up numbers would force you to tally all the relevant facts and weigh all the relative probabilities. Like realizing, as soon as you actually thought about the probability of the Dark Mark not-fading if You-Know-Who was dead, that the probability wasn't low enough for the observation to count as strong evidence. One version of the process was to tally hypotheses and list out evidence, make up all the numbers, do the calculation, and then throw out the final answer and go with your brain's gut feeling after you'd forced it to really weigh everything. The trouble was that the items of evidence weren't conditionally independent, and there were multiple interacting background facts of interest...

Replies from: riceissa
comment by riceissa · 2021-12-24T19:58:18.334Z · LW(p) · GW(p)

Thanks, I have added the quote to the page.

comment by riceissa · 2021-12-24T20:00:23.149Z · LW(p) · GW(p)

Vipul Naik has discovered that Alfred Marshall had basically the same idea (he even used the phrase "burn the mathematics"!) way back in 1906 (!), although he only described the procedure as a way to do economics research, rather than for decision-making. I've edited the wiki page to incorporate this information.