Your intuitions are not magic
post by Kaj_Sotala · 2010-06-10T00:11:30.121Z · LW · GW · Legacy · 42 comments
People who know a little bit of statistics - enough to use statistical techniques, not enough to understand why or how they work - often end up horribly misusing them. Statistical tests are complicated mathematical techniques, and to work, they tend to make numerous assumptions. The problem is that if those assumptions are not valid, most statistical tests do not cleanly fail and produce obviously false results. Neither do they require you to carry out impossible mathematical operations, like dividing by zero. Instead, they simply produce results that do not tell you what you think they tell you. As a formal system, pure math exists only inside our heads. We can try to apply it to the real world, but if we are misapplying it, nothing in the system itself will tell us that we're making a mistake.
Examples of misapplied statistics have been discussed here before. Cyan discussed a "test" that could only produce one outcome. PhilGoetz critiqued a statistical method which implicitly assumed that taking a healthy dose of vitamins had an effect comparable to taking a toxic dose.
Even a very simple statistical technique, like taking the correlation between two variables, might be misleading if you forget about the assumptions it's making. When someone says "correlation", they are most commonly talking about Pearson's correlation coefficient, which seeks to gauge whether there's a linear relationship between two variables. In other words, if X increases, does Y also tend to increase (or decrease)? However, as with vitamin dosages and their effects on health, two variables might have a non-linear relationship. Increasing X might increase Y up to a certain point, after which increasing X would decrease Y. Simply calculating Pearson's correlation on two such variables might give someone a low correlation, leading them to conclude that there's no relationship, or only a weak one, between the two. (See also Anscombe's quartet.)
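To make this concrete, here is a minimal sketch in Python (with made-up data; the inverted-U shape is just an illustrative assumption): the relationship between X and Y is strong and obvious to the eye, yet Pearson's r comes out near zero.

```python
# Hypothetical illustration: a clear inverted-U relationship that Pearson's r misses.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)                      # e.g. a "dose"
y = -(x - 5) ** 2 + rng.normal(0, 1, x.size)     # "response" peaks in the middle, plus noise

r = np.corrcoef(x, y)[0, 1]
print(f"Pearson r = {r:.2f}")  # close to 0, even though y clearly depends on x
```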
The lesson here, then, is that not understanding how your analytical tools work will get you incorrect results when you try to analyze something. A person who doesn't stop to consider the assumptions of the techniques she's using is, in effect, thinking that her techniques are magical: no matter how she uses them, they will always produce the right results. Of course, that assumption makes about as much sense as assuming that your hammer is magical and can be used to repair anything. Even if you had a broken window, you could fix it by hitting it with your magic hammer. But I'm not only talking about statistics here, for the same principle applies more generally.
Every moment in our lives, we are trying to make estimates of the way the world works. Of what causal relationships there are, of what ways of describing the world make sense and which ones don't, which plans will work and which ones will fail. In order to make those estimates, we need to draw on a vast amount of information our brains have gathered throughout our lives. Our brains keep track of countless pieces of information that we will not usually even think about. Few people will explicitly keep track of the number of different restaurants they've seen. Yet in general, if people are asked about the relative number of restaurants in various fast-food chains, their estimates generally bear a close relation to the truth.
But like explicit statistical techniques, the brain makes numerous assumptions when building its models of the world. Newspapers are selective in their reporting of disasters, focusing on rare shocking ones above common mundane ones. Yet our brains assume that we hear about all those disasters because we've personally witnessed them, and that the distribution of disasters in the newspapers therefore reflects the distribution of disasters in the real world. Thus, people asked to estimate the frequency of different causes of death underestimate the frequency of those that are underreported in the media, and overestimate the ones that are overreported.
On this site, we've also discussed a variety of other ways by which the brain's reasoning sometimes goes wrong: the absurdity heuristic, the affect heuristic, the affective death spiral, the availability heuristic, the conjunction fallacy... the list goes on and on.
So what happens when you've read too many newspaper articles and then naively wonder about how frequent different disasters are? You are querying your unconscious processes about a certain kind of statistical relationship, and you get an answer back. But like the person who was naively misapplying her statistical tools, the process which generates the answers is a black box to you. You do not know how or why it works. If you did, you could tell when its results were reliable, when they needed to be explicitly corrected for, and when they were flat-out wrong.
Sometimes we rely on our intuitions even when they are being directly contradicted by math and science. The science seems absurd and unintuitive; our intuitions seem firm and clear. And indeed, sometimes there's a flaw in the science, and we are right to trust our intuitions. But on other occasions, our intuitions are wrong. Yet we frequently persist in holding onto our intuitions. And what is ironic is that we persist in holding onto them exactly because we do not know how they work, because we cannot see their insides and all the things inside them that could go wrong. We only get the feeling of certainty, a knowledge of this being right, and that feeling cannot be broken into parts that could be subjected to criticism to see if they add up.
But like statistical techniques in general, our intuitions are not magic. Hitting a broken window with a hammer will not fix the window, no matter how reliable the hammer. It would certainly be easy and convenient if our intuitions always gave us the right results, just like it would be easy and convenient if our statistical techniques always gave us the right results. Yet carelessness can cost lives. Misapplying a statistical technique when evaluating the safety of a new drug might kill people or cause them to spend money on a useless treatment. Blindly following our intuitions can cause our careers, relationships or lives to crash and burn, because we did not think of the possibility that we might be wrong.
That is why we need to study the cognitive sciences, figure out the way our intuitions work and how we might correct for mistakes. Above all, we need to learn to always question the workings of our minds, for we need to understand that they are not magical.
42 comments
comment by GreenRoot · 2010-06-10T16:46:20.551Z · LW(p) · GW(p)
Thanks for the well-written article. I enjoyed the analogy between statistical tools and intuition. I'm used to questioning the former, but more often than not I still trust my intuition, though now that you point it out, I'm not sure why.
Replies from: xv15, Jayson_Virissimo
↑ comment by xv15 · 2010-06-11T10:42:15.443Z · LW(p) · GW(p)
You shouldn't take this post as a dismissal of intuition, just a reminder that intuition is not magically reliable. Generally, intuition is a way of saying, "I sense similarities between this problem and other ones I have worked on. Before I work on this problem, I have some expectation about the answer." And often your expectation will be right, so it's not something to throw away. You just need to have the right degree of confidence in it.
Often one has worked through the argument before and remembers the conclusion but not the actual steps taken. In this case it is valid to use the memory of the result even though your thought process is a sort of black box at the time you apply it. "Intuition" is sometimes used to describe the inferences we draw from these sorts of memories; for example, people will say, "These problems will really build up your intuition for how mathematical structure X behaves." Even if you cannot immediately verbalize the reason you think something, it doesn't mean you are stupid to place confidence in your intuitions. How much confidence depends on how frequently you tend to be right after actually trying to prove your claim in whatever area you are concerned with.
↑ comment by Jayson_Virissimo · 2010-06-10T21:02:25.625Z · LW(p) · GW(p)
I do know why I trust my intuitions as much as I do. My intuitions are partly the result of natural selection and so I can expect that they can be trusted for the purposes of surviving and reproducing. In domains that closely resemble the environment where this selection process took place I trust my intuition more, in domains that do not resemble that environment I trust my intuition less.
Black box or not, the fact that we are here is good evidence that they (our intuitions) work (on net).
Replies from: diegocaleiro, therufs, tommyjohn
↑ comment by diegocaleiro · 2010-06-12T08:18:54.954Z · LW(p) · GW(p)
How sexy is that?
If you are evaluating intuitions, there are two variables you should account for: the similarity with the evolutionary environment, indeed, AND your current posterior belief about the importance of this kind of act for the variance in offspring production.
We definitely evolved in an environment full of ants. Does that mean my understanding of ant-colony intelligence is intuitive?
↑ comment by tommyjohn · 2011-11-18T22:27:32.188Z · LW(p) · GW(p)
So then anything that has evolved may be relied upon for survival? It is impossible to rationalize faith in an irrational cognitive process. In the book Blink, the author asserts that many instances of intuition are just extremely rapid rational thoughts, possibly at a sub-conscious level.
comment by fool_hill · 2010-06-10T18:20:11.033Z · LW(p) · GW(p)
I don't know why we prefer to hold on to our intuitions. Your claim that "we persist in holding onto them exactly because we do not know how they work" has not been proven, as far as I can tell, and seems unlikely. I also don't know why our own results seem sharper than what we learn from the outside [although about this latter point, I bet there's some story about lack of trust in homo hypocritus societies or something].
As somebody who fits into the "new to the site" category, I enjoyed your article.
Replies from: RobinZ, JDM
↑ comment by RobinZ · 2010-06-10T19:15:23.729Z · LW(p) · GW(p)
Welcome to Less Wrong! Feel free to post an explicit introduction on that thread, if you're hanging around.
I think the critical point is in the next sentence:
We only get the feeling of certainty, a knowledge of this being right, and that feeling cannot be broken into parts that could be subjected to criticism to see if they add up.
Yes, we don't know what the interiors are - but the original source of our confidence is our (frequently justified) trust in our intuitions. I think another related point is made in How An Algorithm Feels From Inside, which talks about an experience which is illusory, merely reflecting an artifact of the way the brain processes data. The brain usually doesn't bother flagging a result as a result, it just marks it as true and charges forward. And as a consequence we don't observe that we are generalizing from the pattern of news stories we watched, and therefore don't realize our generalization may be wrong.
↑ comment by JDM · 2012-11-05T13:18:55.680Z · LW(p) · GW(p)
I think it's a combination of not understanding the process with a lifetime of experience where it's far more right than wrong (even for younger people, if they have 10-15 years of instinctive behavior being rewarded on some level, it's hard to accept there are situations where it doesn't work as well). Combine that with the tendency of positive outcomes to be more memorable than others, and it's not too difficult to understand why people trust their intuition as much as they do.
your claim, that "we persist in holding onto them exactly because we do not know how they work", has not been proven, as far as I can tell, and seems unlikely.
It may not be the only reason, but an accurate understanding of how intuitions work would make it easier to rely on them less in situations they're not as well equipped for, just as an understanding of different biases makes it easier to fight them in our own thought processes.
comment by Wei Dai (Wei_Dai) · 2010-06-11T14:45:47.444Z · LW(p) · GW(p)
Intuition seems to be one of the least studied areas of cognitive science, at least until very recently. The Wikipedia entry on cognitive sciences that the post links to has no mention of "intuition", and one paper I found said that the 1999 MIT Encyclopedia of Cognitive Sciences doesn't even have a single index entry for it (while "logic" has almost 100 references).
After a bit more searching, I found a 2007 book titled Intuition in Judgment and Decision Making, which apparently represents the current state of the art in understanding the nature of intuition.
comment by [deleted] · 2017-01-19T14:17:37.635Z · LW(p) · GW(p)
Indeed, intuitions are fallible. Though beware of the other extreme: writing off your intuitions altogether and trying to live solely based on logic. I've seen various people in the LW sphere try this, and it doesn't quite work. In some cases, like nutrition or social life, there is a bottomless pit of complexity. Trying to provably 'solve' such problems will lead to a bottomless pit of thinking, stagnation, and depression.
Logic is not a magic hammer either.
comment by Alexander (alexander-1) · 2022-01-26T07:37:22.213Z · LW(p) · GW(p)
The intuitions vs reasons debates often appear misguided to me. You've eloquently pointed out that intuitions result from the same black box of spaghetti code as reasons, namely the brain.
Replies from: tomcatfish
↑ comment by Alex Vermillion (tomcatfish) · 2022-07-16T05:19:13.299Z · LW(p) · GW(p)
Volcanoes and dolphins are both generated from physics, but that doesn't convince me they're the same, so there must be more to this argument than that!
Replies from: alexander-1, alexander-1
↑ comment by Alexander (alexander-1) · 2022-07-16T07:27:42.140Z · LW(p) · GW(p)
Additionally, your analogy doesn't map well to my comment. A more accurate analogy would be to say that active volcanoes are explicit and non-magical (similar to reason), while inactive volcanoes are mysterious and magical (similar to intuitions), when both phenomena have the same underlying physical architecture (rocks and pressure for volcanoes and brains for cognition), but manifest differently.
↑ comment by Alexander (alexander-1) · 2022-07-16T07:01:59.455Z · LW(p) · GW(p)
I just reckon that we are better off working on understanding how the black box actually works under the hood instead of placing arbitrary labels and drawing lines in the sand on things we don't understand, and then debating those things we don't understand with verve. Labelling some cognitive activities as reason and others as intuitions doesn't explain how either phenomenon actually works.
comment by Douglas_Knight · 2010-06-10T22:40:08.507Z · LW(p) · GW(p)
People who know a little bit of statistics - enough to use statistical techniques, not enough to understand why or how they work - often end up horribly misusing them.
How often do people harm themselves with statistics, rather than further their goals through deception? Scientists data-mining get publications; financiers get commissions; reporters get readers.
ETA: the people who are fooled are harming themselves with statistics. But I think the people who want to understand for themselves generally only use statistics that they understand.
Replies from: SilasBarta, Dre
↑ comment by SilasBarta · 2010-06-10T22:44:02.780Z · LW(p) · GW(p)
True, but many of those scientists and reporters really do want to unravel the actual truth, even if it means less material wealth or social status. These people would enjoy being corrected.
↑ comment by Dre · 2010-06-11T01:25:21.908Z · LW(p) · GW(p)
There is also an opportunity cost to the poor use of statistics instead of proper use. The costs may be only externalities (the person doing the test may actually benefit more from deception), but overall the world would be better if all statistics were used correctly.
comment by Franco Vairoletti (franco-vairoletti) · 2019-08-05T19:05:20.517Z · LW(p) · GW(p)
Hi! I'm new here and I'd like to say thanks for the site and for this instructive article in particular. I'm quite convinced that overconfidence in our own intuition, even without knowing its underlying mechanisms, is one of the main obstacles to rational thinking. Maybe this black-box working of our intuition also plays a role in communicating our ideas to others. How can we change someone's opinions if we don't know how they got them in the first place?
Thanks again and congrats for your work!
comment by michael_b · 2015-01-29T13:02:15.917Z · LW(p) · GW(p)
The immediately available example supporting your article for me is the relationship between dietary cholesterol and blood cholesterol. There's high general confusion around this health claim.
What no doubt compounds the confusion on the issue is that intuitively you might infer that eating zero cholesterol should lower blood cholesterol, or that eating high cholesterol should raise blood cholesterol. Evidence shows this often happens, but not always. There are enough notable outliers that the claim has been defeated in the general mind because it doesn't support the intuitive story.
That is, vegans who eat almost no cholesterol containing foods, can have high blood cholesterol. On the flip side, surely everyone has heard of that friend of a friend who eats inf eggs a day and has low blood cholesterol.
There's a reasonably interesting story that fits the evidence for the claim that dietary cholesterol raises blood cholesterol, but the nonlinearity of the relationship and the incidence of intuition-defeating cases cloud the issue.
comment by akshatrathi · 2010-08-13T01:50:57.295Z · LW(p) · GW(p)
I enjoyed your article and, as a scientist, I've been trying to understand this: a method that seems intuitive for solving a scientific problem is not seen as intuitive when solving 'other' problems.
By 'other', I mean things like psychological problems or problems that arise from conflicts amongst people. It may be obvious why it is not 'intuitive', but what goes beyond my understanding is that most people will never even consider using the scientific method for the latter types of problem.
comment by Yuval Maharshak · 2021-02-11T17:44:31.424Z · LW(p) · GW(p)
"Your first idea is always in doubt" seems like a nice law. But then I think: is it a nice law only because my first idea was that it was nice?
comment by Данило Глинський (danilo-glinskii) · 2019-09-17T20:21:04.095Z · LW(p) · GW(p)
Yet in general, if people are asked about the relative number of restaurants in various fast-food chains, their estimates generally bear a close relation to the truth.
The link is broken. Is it this article https://psycnet.apa.org/record/1992-18641-001 ?
comment by AFinerGrain_duplicate0.4555006182262571 · 2017-10-02T23:52:39.505Z · LW(p) · GW(p)
I originally learned about these ideas from Thinking, Fast and Slow, but I love hearing them rephrased and repeated again and again. Thinking clearly often means getting in the cognitive habit of questioning every knee-jerk intuition.
On the other hand, coming from a Bryan Caplan / Michael Huemer perspective, aren't we kind of stuck with some set of base intuitions? Intuitions like: I exist, the universe exists, other people exist, effects have causes, I'm not replaced by a new person with memory implants every time I go to sleep...
You might even call these base intuitions, "magic," in the sense that you have to have faith in them in order to do anything like rationality.
Replies from: TheAncientGeek
↑ comment by TheAncientGeek · 2017-10-03T11:37:19.564Z · LW(p) · GW(p)
Well, we don't know if they work magically, because we don't know that they work at all. They are just unavoidable.
It's not that philosophers weirdly and unreasonably prefer intuition to empirical facts and mathematical/logical reasoning; it is that they have reasoned that they can't do without it: that (the whole history of) empiricism and maths as foundations themselves rest on no further foundation except their intuitive appeal. That is the essence of the Inconvenient Ineradicability of Intuition. An unfounded foundation is what philosophers mean by "intuition". Philosophers talk about intuition a lot because that is where arguments and trains of thought ground out... it is a way of cutting to the chase. Most arguers and arguments are able to work out the consequences of basic intuitions correctly, so disagreements are likely to arise from differences in basic intuitions themselves.
Philosophers therefore appeal to intuitions because they can't see how to avoid them... whatever a line of thought grounds out in is, definitionally, an intuition. It is not a case of using intuitions when there are better alternatives, epistemologically speaking. And the critics of their use of intuitions tend to be people who haven't seen the problem of unfounded foundations because they have never thought deeply enough, not people who have solved the problem of finding sub-foundations for your foundational assumptions.
Scientists are typically taught that the basic principles of maths, logic and empiricism are their foundations, and take that uncritically, without digging deeper. Empiricism is presented as a black box that produces the goods... somehow. Their subculture encourages use of basic principles to move forward, not a turn backwards to critically reflect on the validity of basic principles. That does not mean the foundational principles are not "there". Considering the foundational principles of science is a major part of philosophy of science, and philosophy of science is a philosophy-like enterprise, not a science-like enterprise, in the sense that it consists of problems that have been open for a long time and which do not have straightforward empirical solutions.
Does the use of empiricism shortcut the need for intuitions, in the sense of unfounded foundations?
For one thing, epistemology in general needs foundational assumptions as much as anything else. Which is to say that epistemology needs epistemology as much as anything else: to judge the validity of one system of epistemology, you need another one. There is no way of judging an epistemology starting from zero, from a complete blank. Since epistemology is inescapable, and since every epistemology has its basic assumptions, there are basic assumptions involved in empiricism.
Empiricism specifically has the problem of needing an ontological foundation. Philosophy illustrates this point with sceptical scenarios about how you are being systematically deceived by an evil genie. Scientific thinkers have closely parallel scenarios in which you cannot be sure whether you are in the Matrix or some other virtual reality. Either way, these hypotheses illustrate the point that empiricists are running on the assumption that if you can see something, it is there.
comment by [deleted] · 2015-02-10T01:10:26.886Z · LW(p) · GW(p)
I think this article doesn't quite appreciate the full role intuitions play in science. It seems to me that intuitions help shape science in large ways. For instance, our intuitions that 'deduction works' and 'induction works' seem to stop all of us from turning into Cartesian sceptics, which would prevent any science. Intuitions (and philosophical arguments) about metaphysics shape the basis of acceptable hypotheses within physics. Intuitions about what makes a scientific theory good/explanatory/falsified shape how science proceeds. Intuitions also serve to define the concepts we have. If I remember correctly, in the Newtonian era mass was not analysed in terms of anything else. It was a primitive concept in Newton's physics, and it was defined intuitively. Nowadays, modern physics has analysed concepts in terms of further and increasingly obscure concepts; but nevertheless, there is always a limit to what has been analysed in terms of what, and what remains is held, insofar as we know of it, as known primitively. That is, known intuitively.
I also have a question: Does this site in general take a negative view of heuristics humans have? I've seen various pages complaining about heuristics humans have, and not much about how helpful they are in keeping us all functioning.
comment by MikeDobbs · 2013-03-25T13:28:08.270Z · LW(p) · GW(p)
This was an excellent read- I particularly enjoyed the comparison drawn between our intuition and other potentially "black box" operations such as statistical analysis. As a mathematics teacher (and recreational mathematician) I am constantly faced with, and amused by, the various ways in which my intuition can fail me when faced with a particular problem.
A wonderful example of the general failure of intuition can be seen in the classic "Monty Hall Problem." In the old TV game show, Monty Hall would offer the contestant their choice of one of three doors. One door would have a large amount of cash, the other two a non-prize such as a goat. Here's where it got interesting. After the contestant makes their choice, Monty opens one of the "losing" doors, leaving only two closed (one of which contains the prize), then offers the contestant the opportunity to switch from their original door to the other remaining door.
The question is, should they switch? Does it even matter? For most people (myself included) our intuition tells us it doesn't matter. There are two doors, so there's a 50/50 chance of winning whether you switch or not. However, a quick analysis of the probabilities involved shows us that they are in fact TWICE as likely to win the prize if they switch than if they stay with their original choice.
That's a big difference, and a very counterintuitive result when first encountered (at least in my opinion).
Replies from: TheOtherDave
↑ comment by TheOtherDave · 2013-03-25T15:39:49.838Z · LW(p) · GW(p)
I was first introduced to this problem by a friend who had received as a classroom assignment "Find someone unfamiliar with the Monty Hall problem and convince them of the right answer."
The friend in question was absolutely the sort of person who would think it was fun to convince me of a false result by means of plausible-sounding flawed arguments, so I was a very hard sell... I ended up digging my heels in on a weird position roughly akin to "well, OK, maybe the probability of winning isn't the same if I switch, but that's just because we're doing something weird with how we calculate probabilities... in the real world I wouldn't actually win more often by switching, cuz that's absurd."
Ultimately, we pulled out a deck of cards and ran simulated trials for a while, but we got interrupted before N got large enough to convince me.
So, yeah: counterintuitive.
Replies from: None
↑ comment by [deleted] · 2013-03-25T16:40:32.395Z · LW(p) · GW(p)
I remember how my roommates and I drew a game tree for the Monty Hall problem, assigned probabilities to outcomes, and lo, it was convincing.
Replies from: TheOtherDave
↑ comment by TheOtherDave · 2013-03-25T16:59:38.385Z · LW(p) · GW(p)
(nods)
It continues to embarrass me that ultimately I was only "convinced" that the calculated answer really was right, and not some kind of plausible-sounding sleight-of-hand, when I confirmed that it was commonly believed by the right people.
Replies from: MikeDobbs
↑ comment by MikeDobbs · 2013-03-25T17:29:23.233Z · LW(p) · GW(p)
One of my favorites for exactly that reason. If you don't mind, let me take a stab at convincing you absent "the right people agreeing."
The trick is that once Monty removes one door from the contest you are left with a binary decision. Now, to understand why the probability differs from our "gut" feeling of 50/50, you must notice that switching amounts to winning IF your original choice was wrong, and losing IF your original choice was correct (of course, staying with your original choice results in winning if you were right and losing if you were wrong).
So, consider the probability that your original guess was correct. Clearly this is 1/3. That means the probability of your original choice being incorrect is 2/3. And there's the rub. If you initially guess the wrong door 2/3 of the time, then when you are faced with the option to switch doors, your original choice will be wrong 2/3 of the time, and switching would result in you switching to the correct door. Only 1/3 of the time will your original choice be correct, making switching a losing move in those cases.
It becomes clearer if you begin with 10 doors. In this modified Monty Hall problem, you pick a door, then Monty opens 8 doors, leaving only your original choice and one other (one of which contains the prize money). In this case your original choice will be incorrect 9/10 of the time, which means that when faced with the option to switch, switching will result in a win 9/10 of the time, as opposed to staying with your original choice, which will result in a win only 1/10 of the time.
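If the argument still feels like sleight of hand, a quick simulation settles it. Here's a rough Monte Carlo sketch in Python (an illustrative toy along the lines of the card-deck trials mentioned above, not anyone's published code): with three doors the switcher wins about 2/3 of the time, and with ten doors about 9/10.

```python
# Toy Monte Carlo sketch of the Monty Hall game, comparing "stay" vs "switch".
import random

def play(switch: bool, n_doors: int = 3) -> bool:
    doors = list(range(n_doors))
    prize = random.choice(doors)
    choice = random.choice(doors)
    # Monty leaves exactly one other door closed: the prize door if the pick was
    # wrong, otherwise a randomly chosen goat door.
    remaining = prize if prize != choice else random.choice(
        [d for d in doors if d != choice])
    if switch:
        choice = remaining
    return choice == prize

trials = 100_000
for strategy in (False, True):
    wins = sum(play(strategy) for _ in range(trials))
    print(f"switch={strategy}: win rate = {wins / trials:.3f}")
# Expected output: roughly 0.333 when staying and 0.667 when switching;
# with n_doors=10, switching wins about 9 times out of 10.
```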
Replies from: TheOtherDave
↑ comment by TheOtherDave · 2013-03-25T18:07:25.620Z · LW(p) · GW(p)
(nods) Yah, I'm familiar with the argument. And like a lot of plausible-sounding-but-false arguments, it sounds reasonable enough each step of the way until the absurd conclusion, which I then want to reject. :-)
Not that I actually doubt the conclusion, you understand.
Of course, I've no doubt that with sufficient repeated exposure this particular problem will start to seem intuitive. I'm not sure how valuable that is.
Mostly, I think that the right response to this sort of counterintuitivity is to get seriously clear in my head the relationship between justified confidence and observed frequency. Which I've never taken the time to do.
comment by JamesCole · 2010-06-11T07:07:17.933Z · LW(p) · GW(p)
Yet our brains assume that we hear about all those disasters [we read about in the newspaper] because we've personally witnessed them, and that the distribution of disasters in the newspapers therefore reflects the distribution of disasters in the real world.
Even if we had personally witnessed them, that wouldn't, in itself, be any reason to assume that they are representative of things in general. The representativeness of any data is always something that can be critically assessed.
Replies from: Kutta
↑ comment by Kutta · 2010-06-11T09:25:53.020Z · LW(p) · GW(p)
For many people, representativeness is the primary governing factor in any data analysis, not just a mere facet of reasoning that should be critically assessed. Also, aside from the mentioned media bias, which is indeed relatively easily correctable, there are many subtler instances of biasing via representativeness on the level of cognitive processes.
comment by nazgulnarsil · 2010-06-11T02:23:38.490Z · LW(p) · GW(p)
"However, like with vitamin dosages and their effects on health, two variables might have a non-linear relationship."
If we limit our interval, we can make a linear approximation within that interval. This is often good enough if we don't much care about data outside that interval. The easy pitfall, of course, is people wanting to extend the linearization beyond the bounds of the interval.
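As a toy illustration of that point (Python/NumPy, with an invented curve): a line fitted on a narrow interval of a non-linear relationship approximates it reasonably well there, but extrapolating that same line far outside the interval fails badly.

```python
# Fit a line on a narrow slice of a non-linear curve, then extrapolate it.
import numpy as np

x = np.linspace(0, 10, 101)
y = -(x - 5) ** 2 + 25           # invented non-linear "dose-response" curve, peaks at x = 5

inside = x <= 3                   # fit only on the interval [0, 3]
slope, intercept = np.polyfit(x[inside], y[inside], 1)
approx = slope * x + intercept

print("max error inside [0, 3]:", np.abs(approx - y)[inside].max())  # small
print("error at x = 10:", abs(approx[-1] - y[-1]))                   # much larger
```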
Replies from: Nanani
↑ comment by Nanani · 2010-06-16T03:10:25.507Z · LW(p) · GW(p)
Voted down because tangential replies that belong elsewhere really get on my nerves. Please comment on the post about the vitamin study, linked in the OP.
Replies from: nazgulnarsil
↑ comment by nazgulnarsil · 2010-06-17T16:04:15.821Z · LW(p) · GW(p)
0_o I was responding directly to the OP.
comment by fatnomen · 2021-10-17T08:17:47.371Z · LW(p) · GW(p)
Interesting text, but I'm getting a nagging intuition here regarding what assumptions you are using for correct reasoning. Correct me if I am wrong, but is your assumption for normative rationality (both decision making and inquiry) one of pure statistical and logical inference, with any deviation from this norm considered a fallacy?
Is this correct of me to assume?
Replies from: Kaj_Sotala
↑ comment by Kaj_Sotala · 2021-10-17T09:26:25.611Z · LW(p) · GW(p)
It's been quite a while since I wrote this post, so it's hard for me to remember what exactly I was thinking when writing it. :) But I think that I at least meant that there's a significant component of epistemic rationality that's basically the same as statistical inference, and that deviating from its norm is likely to create incorrect beliefs.
I don't know whether I would have said that that's the only part of rationality, but at least I wouldn't endorse such a claim now, and I think the post works even without such an assumption.