New York Times on Arguments and Evolution [link]
post by Nic_Smith · 2011-06-14T18:12:17.383Z · LW · GW · Legacy · 13 comments
I saw this in the Facebook "what's popular" box, so it's apparently being heavily read and forwarded. There's nothing earth-shattering for long-time LessWrong readers, but it's a bit interesting and not too bad a condensation of the topic:
Now some researchers are suggesting that reason evolved for a completely different purpose: to win arguments. Rationality, by this yardstick (and irrationality too, but we’ll get to that) is nothing more or less than a servant of the hard-wired compulsion to triumph in the debating arena. According to this view, bias, lack of logic and other supposed flaws that pollute the stream of reason are instead social adaptations that enable one group to persuade (and defeat) another. Certitude works, however sharply it may depart from the truth. -- Cohen, Patricia "Reason Seen More as Weapon Than Path to Truth"
A glance at the comments [at the Times], however, seems to indicate that most people are misinterpreting this, and at least one person has said flatly that it's the reason his political opponents don't agree with him.
ETA: Oops, I forgot the most important thing. The article is at http://www.nytimes.com/2011/06/15/arts/people-argue-just-to-win-scholars-assert.html
13 comments
Comments sorted by top scores.
comment by Jay_Schweikert · 2011-06-16T06:35:51.184Z · LW(p) · GW(p)
Maybe I'm misinterpreting this article (or maybe the NY Times isn't exactly presenting everything correctly), but doesn't Hugo Mercier seem to be coming pretty close to saying something like "this whole attempt at identifying and correcting biases is misguided -- flaws in reasoning are 'natural,' so we should be okay with them"? I mean, consider the following excerpt:
Mr. Mercier, a post-doctoral fellow at the University of Pennsylvania, contends that attempts to rid people of biases have failed because reasoning does exactly what it is supposed to do: help win an argument.
“People have been trying to reform something that works perfectly well,” he said, “as if they had decided that hands were made for walking and that everybody should be taught that.”
Am I missing something, or is this one of the most absurd statements about human rationality ever made? We shouldn't try to get rid of biases, not because the effort is futile, but because flawed reasoning works? I guess that's why most people are so successful at handling personal finances, calculating risk, evaluating political proposals, and questioning ingrained religious beliefs.
↑ comment by haig · 2011-08-11T22:59:24.259Z · LW(p) · GW(p)
Reading his essay here: http://edge.org/conversation/the-argumentative-theory it appears that he does indeed come off as pessimistic with regard to raising the sanity waterline for individuals (i.e., teaching individuals to reason better and become more rational on their own). However, he does also offer a way forward by emphasizing group reasoning, such as what the entire enterprise of science (peer review, etc.) encourages and is structured for. I suspect he thinks that even though most people might understand, on an academic level, that their reasoning is flawed and that they are susceptible to biases, they will still not be able to overcome those strongly innate tendencies in practice; hence his pragmatic insistence on group deliberation to put the individual in check.
IMO, what he fails to take into consideration is the adaptability of human learning through experience and social forces: with the proliferation of, and prolonged participation in, communities like Less Wrong or other augmented reasoning systems, one would internalize the rational arts as habits and override the faulty reasoning to some extent much of the time. I still agree with him that we will always need a system like peer review or group deliberation to reach the most rational conclusions, but in the process of using those systems we individually become better thinkers.
comment by Dr_Manhattan · 2011-06-15T14:50:25.441Z · LW(p) · GW(p)
Interesting article, but my general problem with these approaches is that they seem to look for one major "reason" evolution produced brains. Even for less sophisticated internal organs, the functions are multiple (which is what makes medicine more complicated than we'd wish), and for something that's a Turing-complete piece of wetware there is no reason to expect any less.
The interesting question to me is "do people argue to win arguments?", followed by "which people?", "how often?", "under which circumstances?" The general gist of "brains evolved for arguing" does little to control my anticipation, though admittedly it might have been a useful reflection/observation-directing hint to my earlier, more naive self.
comment by Wei Dai (Wei_Dai) · 2011-06-14T22:41:01.706Z · LW(p) · GW(p)
Thanks. One of the researchers mentioned in the article has a very interesting website. Here's a quote that seems especially relevant to LW:
Based on the dominant, Cartesian view people have been trying for many years to reform reasoning: to teach critical thinking, to rid us of our biases, to make Kants of us all. This approach has not been very successful. According to our theory this is not surprising, as people have been trying to reform something that works perfectly well—as if they had decided that hands were made for walking and that everybody should be taught that. Instead, we claim that reasoning does well what it is supposed to do—arguing—and that it produces good results in appropriate—argumentative—contexts. So, instead of trying to change the way people reason, interventions based on the environment—institutional in particular—are much more likely to succeed. If we can increase people’s exposition to arguments, if we manage to make them argue more with people who disagree with them, then reasoning should produce very good results without having had to be reformed.
↑ comment by Nornagest · 2011-06-14T23:04:52.727Z · LW(p) · GW(p)
That sounds likely to produce more effective argumentation rather than more effective reasoning. We're essentially talking about reviving Rhetoric as a subject of study, either formally as a course or informally by way of lots of practice in the domain -- and while that might include some inoculation against biases, it's not at all clear whether that would dominate the effects of learning to leverage biases more subtly and effectively.
At a guess, in fact, I'd say the reverse is true.
↑ comment by fubarobfusco · 2011-06-15T05:20:03.010Z · LW(p) · GW(p)
That sounds likely to produce more effective argumentation rather than more effective reasoning.
If you expose populations of gazelles to populations of cheetahs, you will get gazelles who are more effective cheetah-avoiders.
They will also be faster. Actually, really, objectively faster, as measured by someone who has a clock rather than a cheetah's appetite as their metric.
The strongest techniques of argumentation — the ones that work against people who are also strong arguers — happen to be those that are in conformance with the mathematical rules of logic and evidence. That is why the ancients figured out syllogisms, and the less-ancients figured out probability, rules of evidence, symbolic logic, significance tests, the rule against hearsay, and so on.
(Evidence is not just "what seems convincing", either. In a world where other people are trying to convince you of false things in order to take advantage of you, it is to your advantage to only be convinced by that which is actually true.)
This should not be surprising. If you want to beat others on a given field, you have to take advantage of the properties of that field — not just take advantage of naïve opponents. You do not become a chess master by studying the psychology of chess players; you study chess.
↑ comment by Nornagest · 2011-06-15T18:04:29.798Z · LW(p) · GW(p)
Evidence, at the level of a single argument in any field that isn't subject to unambiguous experimental tests, is "what seems convincing". That's almost tautological. Careers in these fields -- which make up the vast majority of talky fields out there, incidentally, and thus include the vast majority of arguments that a randomly selected member of the public will ever get into -- aren't made by being right in an abstract sense, but by convincing bosses, investors, and/or members of the public that you're right. Avoiding being manipulated by your opponents is also important, but that has a lot less to do with formal logic and a lot more to do with social dynamics.
Out in the wild, I don't see a whole lot of passion for the mathematical rules of logic and evidence in the practice of people whose job it is to argue with other strong debaters, i.e. lawyers and politicians. Same goes -- in an admittedly more sophisticated way -- for many branches of academia, which is theoretically a reference class made up entirely of people who're well-informed about the rules of logic and evidence, so we're not just dealing with a need to pander here.
What I do see is a lot of complex signaling behavior, a lot of sophistication around the selection and presentation of evidence that favors your side, and a lot of techniques for seeming, or for actually being, sincere in the presentation of your argument. Which is exactly what I'd expect. We're not dealing with predator/prey dynamics here, where the criteria for fit and unfit are unambiguous and large chunks of fitness ultimately come down to physics; we're dealing with a nasty incestuous free-for-all, where fitness is usually socially determined, using brains that're built not for formal logic but for managing personal alliances. What do you think the cheapest route to winning an argument is going to be, most of the time?
↑ comment by Desrtopa · 2011-06-17T20:31:30.089Z · LW(p) · GW(p)
Spending a lot of time arguing is very different from optimizing for being persuasive, or for only being persuaded by true arguments. Curi evidently spends a lot more time in argument than most members of this board, but I certainly wouldn't say that it's been helpful for him.
A gazelle that gets caught by a cheetah will die. A person who makes less sound points in a debate and refuses to change their mind can not only insist that they won the argument; they may even preserve more social status by doing so than by acknowledging that they were wrong.
comment by Eugine_Nier · 2011-06-15T05:34:01.753Z · LW(p) · GW(p)
I don't think the only evolutionary purpose of reason is to win arguments; part of the purpose must have been to decide on the best course of action, otherwise we would have evolved to not listen to what anyone else says.
↑ comment by Mercurial · 2011-06-15T12:27:12.000Z · LW(p) · GW(p)
... otherwise we would have evolved to not listen to what anyone else says.
That's a death sentence for a great ape. All the great apes form tribes as one of their primary survival strategies. It could simply be that evolution didn't make ignoring others' arguments an option, any more than it made ignoring dominance contests a way to retain one's status in the pecking order.
↑ comment by ShardPhoenix · 2011-06-16T04:01:02.161Z · LW(p) · GW(p)
Just as dominance contests are ultimately backed by greater physical force, so arguments must ultimately be backed by greater correctness (on average, with a lot of variance). This is made more complicated by the fact that in some cases, "correctness" may have elements of "social truth" or self-fulfilling prophecy.
↑ comment by Mercurial · 2011-06-16T13:26:41.522Z · LW(p) · GW(p)
...arguments must ultimately be backed by greater correctness...
I'd certainly like to think so! I'm just suspicious of that intuition, especially in myself. The subjective impression that reasoning is for truth-seeking could be because it is. However, if it's not, as the lead article suggests, then we'd still be under the impression that our reasoning is in pursuit of truth and that those who disagree with us are willfully ignoring the truth. So we can't use that intuition as a guide to tell us about what's the case in this situation.
It's also worth noting that people generally aren't convinced by true arguments. They're usually convinced instead by peer pressure and repetition. Presenting a really crushing (!) argument that leaves no logical wiggle room can actually make the person who initially disagreed become more certain of their initial position and resentful toward you. That really makes no sense if reason is supposed to be for truth-pursuit - but it makes a lot of sense if arguments are more about dominance than determining what's real.
↑ comment by ShardPhoenix · 2011-06-17T02:36:38.859Z · LW(p) · GW(p)
I'm not saying I feel like reasoning is for truth-seeking; I'm saying that to some significant extent it has to be - like Eugine_Nier says, even if there's a lot of noise and social posturing involved, on average it has to bottom out in truth somewhere, else why would we have evolved to put effort into something worthless? If it were purely about social dominance, why talk at all instead of sticking to fighting/physical displays?
edit: Although I'm not sure how much purely social content can be built on top of a little physical truth - maybe a lot.