Rationality Compendium: Principle 2 - You are implemented on a human brain
post by ScottL · 2015-08-29T16:24:37.866Z · LW · GW · Legacy · 5 comments
Irrationality is ingrained in our humanity. It is fundamental to who we are, because being human means that you are implemented on kludgy and limited wetware (a human brain). A consequence of this is that biases ↓ and irrational thinking are not mistakes per se; they are not misfirings or accidental activations of neurons. They are the default mode of operation for wetware that has been optimized for purposes other than truth maximization.
If you want something to blame for the fact that you are innately irrational, you can blame evolution ↓. Evolution tends not to produce optimal organisms, but instead produces ones that are kludgy ↓, limited, and optimized for criteria relating to ancestral environments rather than for criteria relating to optimal thought.
A kludge is a clumsy or inelegant, yet surprisingly effective, solution to a problem. The human brain is an example of a kludge: it contains many distinct substructures dating from widely separated periods of evolutionary development ↓. One example is the two kinds of processes in human cognition, one fast (Type 1) and the other slow (Type 2) ↓.
There are many other characteristics of the brain that induce irrationality. The main ones are the following:
- The brain is innately limited in its computational abilities and so it must use heuristics ↓, which are mental shortcuts that ease the cognitive load of making a decision.
- The brain has a tendency to blindly reuse salient or pre-existing responses rather than developing new answers or thoroughly checking pre-existing solutions ↓.
- The brain does not inherently value truth. One of the main reasons for this is that many biases can actually be adaptive. An example of an adaptive bias is the sexual overperception bias ↓ in men. From a truth-maximization perspective, young men who assume that all women want them are showing severe social-cognitive inaccuracies, judgment biases, and probably narcissistic personality disorder. From an evolutionary perspective, however, the same young men are behaving in a near-optimal manner, one which has consistently maximized the reproductive success of their male ancestors (a brief expected-cost sketch of this trade-off follows this list). Another similar example is the bias for positive perception of partners ↓.
- The brain acts more like a coherence maximiser than a truth maximiser, which makes people liable to believe falsehoods ↓. If you want to believe something, or if you are often in situations in which two things merely happen to co-occur, then your brain will by default tend to treat the belief as true or the two things as genuinely connected ↓.
- The brain trusts its own version of reality much more than other people's. This makes people defend their beliefs even when doing so is extremely irrational ↓. It also makes it hard for people to change their minds ↓ and to accept when they are wrong ↓.
- Disbelief requires System 2 thought ↓. This means that if System 2 is otherwise engaged (busy or depleted), we are liable to believe pretty much anything. System 1 is gullible and biased to believe; it is System 2 that is in charge of doubting and disbelieving.
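To make the adaptiveness point concrete, here is a minimal expected-cost sketch in the spirit of Error Management Theory (Haselton & Buss, 2000, cited in the papers below). Every number in it is an assumption invented for illustration, not data from the paper: the point is only that when a missed opportunity (false negative) is much costlier in fitness terms than a misread signal (false positive), the over-perceiving strategy can have the lower expected cost even though it makes more errors.

```python
# Illustrative Error Management Theory sketch. All numbers below are
# assumptions chosen for illustration, not data from Haselton & Buss (2000).

def expected_cost(p_interest, p_detect, p_false_alarm, cost_miss, cost_fa):
    """Expected fitness cost of a perception strategy.

    A 'miss' is failing to detect genuine interest (false negative);
    a 'false alarm' is perceiving interest where there is none.
    """
    p_miss = p_interest * (1 - p_detect)      # real interest, not seen
    p_fa = (1 - p_interest) * p_false_alarm   # no interest, but 'seen'
    return p_miss * cost_miss + p_fa * cost_fa

P_INTEREST = 0.1                 # assumed base rate of genuine interest
COST_MISS, COST_FA = 10.0, 1.0   # assumed: a miss is 10x costlier than a false alarm

# An 'accurate' perceiver: few false alarms, but misses some real interest.
accurate = expected_cost(P_INTEREST, 0.60, 0.05, COST_MISS, COST_FA)
# A 'biased' over-perceiver: many false alarms, but almost never misses.
biased = expected_cost(P_INTEREST, 0.95, 0.40, COST_MISS, COST_FA)

print(f"accurate perceiver: expected cost {accurate:.3f}")  # 0.445
print(f"biased perceiver:   expected cost {biased:.3f}")    # 0.410
```

Under these assumed numbers the over-perceiver is wrong far more often yet pays a lower expected cost, which is exactly the sense in which a bias can be an adaptation rather than a malfunction.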
One important non-brain-related factor is that we must make use of and live with our current adaptations ↓. People cannot reconfigure themselves to fulfil purposes suited to their current environment, but must instead make use of pre-existing machinery that has been optimised for other environments. This means that there is probably never going to be a miracle cure for irrationality, because eradicating it would require altering you so fundamentally that you would no longer be human.
One of the first major steps on the path to becoming more rational is the realisation that you are not only irrational by default, but that you are always fundamentally compromised. This doesn't mean that improving your rationality is impossible. It just means that if you stop applying your knowledge of what improves rationality, then you will slip back into irrationality. This is because the brain is a kludge: it works most of the time, but in some cases its innate and natural course of action must be diverted if we are to be rational. The good news is that this kind of diversion is possible, because humans possess second-order thinking ↓. They can observe their inherent flaws and systematic errors, and then, by studying the laws of thought and action, apply second-order corrections and thereby become more rational.
The process of applying these second-order corrections, or training yourself to mitigate the effects of your propensities, is called debiasing ↓. Debiasing is not something that you can do once and then forget about. It is something that you must either do constantly or instill into habits so that it occurs without volitional effort. There are three main types of debiasing, described below:
- Counteracting the effects of bias - this can be done by adjusting your estimates or opinions in order to avoid errors due to biases. This is probably the hardest of the three types of debiasing, because to do it correctly you need to know exactly how much you are already biased, which is something that people are rarely aware of (a toy example of this kind of adjustment follows this list).
- Catching yourself when you are being, or could be, biased and applying a cognitive override. The basic idea is that you observe and track your own thoughts and emotions so that you can catch yourself before you move too deeply into irrational modes of thinking. This is hard because it requires superb self-awareness skills, which often take a long time to develop and train. Once you have caught yourself, it is often best to resort to formal methods such as algebra, logic, probability theory, or decision theory. It is also useful to instill habits in yourself that allow this observation to occur without conscious and volitional effort. It should be noted that incorrectly applying the first two methods of debiasing can actually make you more biased; this is a common problem faced by beginners to rationality training ↓.
- Understanding the situations that make you biased so that you can avoid them ↓ - the best way to achieve this is simply to ask yourself: how can I become more objective? You do this by taking your biased and faulty perspective out of the equation as much as possible. For example, instead of taking measurements yourself, you could have them taken automatically by a scientific instrument.
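As a toy illustration of the first type of debiasing, here is a sketch of correcting a planning-fallacy-prone estimate with an outside view. The track-record figures are invented for illustration; the point is that the size of the correction comes from recorded past data rather than from introspecting on how biased you feel, since, as noted above, people are rarely aware of how biased they are.

```python
# Toy sketch of counteracting a bias by adjusting an estimate.
# The past-project figures below are invented for illustration.

def outside_view_estimate(inside_estimate, past_estimates, past_actuals):
    """Scale a gut estimate by the average overrun on similar past tasks."""
    overruns = [actual / est for est, actual in zip(past_estimates, past_actuals)]
    correction = sum(overruns) / len(overruns)
    return inside_estimate * correction

# Hypothetical track record: hours estimated vs. hours actually taken.
estimated = [10, 8, 20, 5]
actual = [18, 12, 35, 9]

gut_feeling = 12  # "this task will take about 12 hours"
adjusted = outside_view_estimate(gut_feeling, estimated, actual)
print(f"adjusted estimate: {adjusted:.1f} hours")  # roughly 20.5 hours
```

The design choice worth noticing is that the correction factor is measured rather than guessed: without a track record to anchor it, this first type of debiasing degenerates into the very introspection it is trying to route around.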
Related Materials
Wikis:
- Bias - the obstacles to truth which are produced by our kludgy and limited wetware (brains) working exactly the way that they should. ↩
- Evolutionary psychology - the idea of evolution as the idiot designer of humans - that our brains are not consistently well-designed - is a key element of many of the explanations of human errors that appear on this website.
- Slowness of evolution - the tremendously slow timescale of evolution, especially for creating new complex machinery (as opposed to selecting on existing variance), is why the behavior of evolved organisms is often better interpreted in terms of what did in fact work in the ancestral environment, rather than what would be optimal now. ↩
- Alief - an independent source of emotional reaction which can coexist with a contradictory belief. For example, the fear felt when a monster jumps out of the darkness in a scary movie is based on the alief that the monster is about to attack you, even though you believe that it cannot.
- Wanting and liking - The reward system consists of three major components:
- Liking: The 'hedonic impact' of reward, comprised of (1) neural processes that may or may not be conscious and (2) the conscious experience of pleasure.
- Wanting: Motivation for reward, comprised of (1) processes of 'incentive salience' that may or may not be conscious and (2) conscious desires.
- Learning: Associations, representations, and predictions about future rewards, comprised of (1) explicit predictions and (2) implicit knowledge and associative conditioning (e.g. Pavlovian associations). ↩
- Heuristics and biases - a program in cognitive psychology that tries to work backward from biases (experimentally reproducible human errors) to heuristics (the underlying mechanisms at work in the brain). ↩
- Cached thought - an answer that was arrived at by recalling a previously-computed conclusion, rather than by performing the reasoning from scratch. ↩
- Sympathetic Magic - humans seem to naturally generate a series of concepts known as sympathetic magic, a host of theories and practices which have certain principles in common, two of which are of overriding importance: the Law of Contagion holds that two things which have interacted, or were once part of a single entity, retain their connection and can exert influence over each other; the Law of Similarity holds that things which are similar or treated the same establish a connection and can affect each other. ↩
- Motivated Cognition - an academic/technical term for various mental processes that lead to desired conclusions regardless of the veracity of those conclusions.
- Rationalization - Rationalization starts from a conclusion, and then works backward to arrive at arguments apparently favoring that conclusion. Rationalization argues for a side already selected; rationality tries to choose between sides. ↩
- Oops - There is a powerful advantage to admitting you have made a large mistake. It's painful. It can also change your whole life. ↩
- Adaptation executors - Individual organisms are best thought of as adaptation-executers rather than as fitness-maximizers. Our taste buds do not find lettuce delicious and cheeseburgers distasteful once we are fed a diet too high in calories and too low in micronutrients. Taste buds are adapted to an ancestral environment in which calories, not micronutrients, were the limiting factor. Evolution operates on too slow a timescale to re-adapt to new conditions (such as a modern diet).
- Corrupted hardware - our brains do not always allow us to act the way we should. Corrupted hardware refers to those behaviors and thoughts that act for ancestrally relevant purposes rather than for stated moralities and preferences. ↩
- Debiasing - The process of overcoming bias. It takes serious study to gain meaningful benefits; half-hearted attempts may accomplish nothing, and partial knowledge of bias may do more harm than good. ↩
- Costs of rationality - Becoming more epistemically rational can only guarantee one thing: what you believe will include more of the truth. Knowing that truth might help you achieve your goals, or cause you to become a pariah. Be sure that you really want to know the truth before you commit to finding it; otherwise, you may flinch from it.
- Valley of bad rationality - It has been observed that someone who is just starting to learn rationality often appears to be worse off than they were before, while those with more experience claim that learning still more about rationality leaves you better off than when you started. The period before this improvement is known as "the valley of bad rationality".
- Dunning–Kruger effect - is a cognitive bias wherein unskilled individuals suffer from illusory superiority, mistakenly assessing their ability to be much higher than is accurate. This bias is attributed to a metacognitive inability of the unskilled to recognize their ineptitude. Conversely, highly skilled individuals tend to underestimate their relative competence, erroneously assuming that tasks that are easy for them are also easy for others. ↩
- Shut up and multiply - In cases where we can actually do calculations with the relevant quantities, the ability to shut up and multiply, to trust the math even when it feels wrong, is a key rationalist skill. ↩
Posts:
- Cognitive science of rationality - discusses fast (Type 1) and slow (Type 2) processes of cognition, thinking errors, and the three kinds of minds (reflective, algorithmic, autonomous). ↩
- The Lens That Sees Its Own Flaws - a human brain is a flawed lens that can understand its own flaws—its systematic errors, its biases—and apply second-order corrections to them. ↩
- We Change Our Minds Less Than We Think - between hindsight bias, fake causality, positive bias, anchoring/priming, et cetera et cetera, and above all the dreaded confirmation bias, once an idea gets into your head, it's probably going to stay there. ↩
- You Are A Brain - a presentation by Liron Shapira that is tailored for a general audience and provides an introduction to some of the core LessWrong concepts.
- Your intuitions are not magic - blindly following our intuitions can cause our careers, relationships or lives to crash and burn, because we did not think of the possibility that we might be wrong.
- To Spread Science, Keep It Secret - People seem to have holes in their minds for Esoteric Knowledge, Deep Secrets, the Hidden Truth. We've gotten into the habit of presenting the Hidden Truth in a very unsatisfying way, wrapped up in false mundanity.
Popular Books:
- Marcus, Kluge: The Haphazard Evolution of the Human Mind ↩
- Chabris, The Invisible Gorilla: How Our Intuitions Deceive Us
- Kurzban, Why Everyone (Else) Is a Hypocrite: Evolution and the Modular Mind
- Dawkins, The Selfish Gene (30th Anniversary Edition)
- McCauley, Why Religion is Natural and Science is Not ↩
Papers:
- Haselton, M. (2003). The sexual overperception bias: Evidence of a systematic bias in men from a survey of naturally occurring events. Journal of Research in Personality, 37, 34-47.
- Haselton, M., & Buss, D. (2000). Error Management Theory: A New Perspective on Biases in Cross-Sex Mind Reading. Journal of Personality and Social Psychology, 78, 81-91. ↩
- Murray, S., Griffin, D., & Holmes, J. (1996). The Self-Fulfilling Nature of Positive Illusions in Romantic Relationships: Love Is Not Blind, but Prescient. Journal of Personality and Social Psychology, 71, 1155-1180. ↩
- Gilbert, D. T., Tafarodi, R. W., & Malone, P. S. (1993). You can't not believe everything you read. Journal of Personality and Social Psychology, 65, 221-233. ↩
Notes on decisions I have made while creating this post
(these notes will not be in the final draft):
- This post doesn't have any specific details on debiasing or the biases. I plan to provide these details in later posts. The main point of this post is to convey the idea in the title.
5 comments
Comments sorted by top scores.
comment by [deleted] · 2015-09-03T08:28:53.362Z · LW(p) · GW(p)
OBJECTION!
I say that's an overly specific and under-sensitive claim to make: that we are implemented on a human brain.
Rather, we are implemented on a human body!
Everybody here has probably heard of the placebo effect. There are some interesting theories about how it works organically that you can read elsewhere. However, there are other kinds of somatisation, that is, links from psychological phenomena to physical phenomena, which don't have any explanations. One is irritable bowel syndrome, which has significant overlap in suffering populations with individuals that have generalised anxiety disorder. Interestingly, it can also be treated with CBT, as described here. Perhaps the microbes in our guts control our brains! My imagination runs wild when there is little evidence for a particular line of thought! And there's evidence for gut-to-brain associations in autism and mood disorders too! Perhaps one day gastroenterologists will treat half the psychological problems, and neurologists will treat the other half (e.g. psychotic disorders, which have neurological organic causes rather than mere 'indicators' of problems down below, perhaps). Would love to see some LW heavyweights weigh in on this topic. I wonder if IBS tends to follow psychiatric medication use, since it's associated with higher serotonin! Though CBT tends to work for IBS patients even when those with psychiatric diagnoses are excluded. Hm... I wonder if I should get my gut microbiome analysed? Are there any potentially useful interpretations, say for uBiome?
Replies from: ScottL, NicksNyx
↑ comment by ScottL · 2015-09-03T11:10:06.335Z · LW(p) · GW(p)
There is no doubt that the brain and the body are entwined. I guess that a more explicit title would be: you are implemented on kludgy and limited wetware (a human brain) which is influenced by a myriad of factors, most of which you are unaware of.
Your body does influence you, but then so do a lot of other things. If I changed it to "you are implemented on a human body", then someone else would say: "hey, what about the microbes, bacteria and organisms that live in my gut and on my skin". I would then need to acquiesce and add this in. Then, someone would say: "hey, what about social influences". I would then need to add this in. Hopefully, you get the idea that this could potentially go on for a very long time.
I think that "you are implemented on a human brain" is the best way to convey the ideas in the post.
Would love to see some LW heavy weights weigh in on this topic
Same. I guess it comes down to how much of the causal chain you want to consider. I am happy just considering the brain, but of course there are many other things that influence the neural patterns that get activated in the brain.
comment by imuli · 2015-08-30T01:40:19.928Z · LW(p) · GW(p)
Thinking Fast and Slow references studies of disbelief requiring attention - which is what I assume you mean by "easier".
Replies from: ScottL
↑ comment by ScottL · 2015-08-31T12:18:31.506Z · LW(p) · GW(p)
Yes. That's what I mean. Thanks. I added a link to this paper: Gilbert, D.T., Tafarodi, R.W. and Malone, P.S. (1993) You can't not believe everything you read. Journal of Personality and Social Psychology, 65, 221-233.
This is the quote from Thinking Fast and Slow:
Gilbert proposed that understanding a statement must begin with an attempt to believe it: you must first know what the idea would mean if it were true. Only then can you decide whether or not to unbelieve it. The initial attempt to believe is an automatic operation of System 1, which involves the construction of the best possible interpretation of the situation. Even a nonsensical statement, Gilbert argues, will evoke initial belief. Try his example: “whitefish eat candy.” You probably were aware of vague impressions of fish and candy as an automatic process of associative memory searched for links between the two ideas that would make sense of the nonsense. Disbelief is a system 2 thought process. The moral is significant: when System 2 is otherwise engaged, we will believe almost anything. System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy. Indeed, there is evidence that people are more likely to be influenced by empty persuasive messages, such as commercials, when they are tired and depleted.