Rational Groups Kick Ass
post by talisman · 2009-04-25T02:37:31.992Z · LW · GW · Legacy · 24 comments
Reply to: Extreme Rationality: It's Not That Great
Belaboring of: Rational Me Or We?
Related to: A Sense That More Is Possible
The success of Yvain's post threw me off completely. My experience has been the opposite of what he describes: x-rationality, which I've been working on since the mid-to-late nineties, has been centrally important to successes I've had in business and family life. Yet the LessWrong community, which I greatly respect, broadly endorsed Yvain's argument that:
There seems to me to be approximately zero empirical evidence that x-rationality has a large effect on your practical success, and some anecdotal empirical evidence against it.
So that left me pondering what's different in my experience. I've been working on these things longer than most, and am more skilled than many, but that seemed unlikely to be the key.
The difference, I now think, is that I've been lucky enough to spend huge amounts of time in deeply rationalist organizations and groups--the companies I've worked at, my marriage, my circle of friends.
And rational groups kick ass.
An individual can unpack free will or figure out that the Copenhagen interpretation is nonsense. But I agree with Yvain that in a lonely rationalist's individual life, the extra oomph of x-rationality may well be drowned in the noise of all the other factors of success and failure.
But groups! Groups magnify the importance of rational thinking tremendously:
- Whereas a rational individual is still limited by her individual intelligence, creativity, and charisma, a rational group can promote the single best idea, leader, or method out of hundreds or thousands or millions.
- Groups have powerful feedback loops; small dysfunctions can grow into disaster by repeated reflection, and small positives can cascade into massive success.
- In a particularly powerful feedback process, groups can select for and promote exceptional members.
- Groups can establish rules/norms/patterns that 1) directly improve members and 2) counteract members' weaknesses.
- Groups often operate in spaces where small differences are crucial. Companies with slightly better risk management are currently preparing to dominate the financial space. Countries with slightly more rational systems have generated the 0.5% of extra annual growth that leads, over centuries, to dramatically improved ways of life. Even in family life, a bit more rationality can easily be the difference between gradual divergence and gradual convergence.
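The compounding claim in the last bullet is easy to check. A quick back-of-the-envelope sketch (the 2% baseline and 2.5% "more rational" growth rates are hypothetical, chosen only to illustrate a 0.5% edge):

```python
# How much richer is an economy with a 0.5% annual growth edge
# after two centuries? Rates below are illustrative assumptions.
base = 1.020   # hypothetical baseline: 2.0% annual growth
edge = 1.025   # the same economy with an extra 0.5%
years = 200

# Relative wealth is the ratio of the two compounded growth factors.
ratio = (edge / base) ** years
print(f"After {years} years, the faster-growing economy is {ratio:.2f}x richer")
```

A half-percent edge, invisible in any single year, works out to roughly a 2.7x difference in living standards over two hundred years.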
And we're not even talking about the extra power of x-rationality. Imagine a couple that truly understood Aumann, a company that grokked the Planning Fallacy, a polity that consistently tried Pulling the Rope Sideways.
When it comes to groups--sized from two to a billion--Yvain couldn't be more wrong.
Update: Orthonormal points out that I don't provide many concrete examples; I only link to three above. I'll try to put more here as I think of them:
- In Better, Atul Gawande talks about ways in which some groups of doctors have dramatically improved by becoming more group-rational, including OB standbys like keeping score and having sensitive discussions privately (and thus more openly).
- Google seems like an extremely rational place for a public company. Two strong signals are that they are extremely data-driven and have used prediction markets. To be painfully clear, I'm not claiming that Google's success is due to the use of prediction markets, merely that these datapoints help demonstrate Google's overall rationality.
- As AlanCrowe points out in the comments, Warren Buffett and Charlie Munger have a rationalist approach.
24 comments
comment by Paul Crowley (ciphergoth) · 2009-04-25T09:10:32.165Z · LW(p) · GW(p)
Thank you! It's been said here before (by Robin Hanson among others) that the art of rationality as set out so far is a lonely art, and that's an artifact of how it was developed. In practice, it seems likely to me that even with the most desperate effort I could easily leave some central mistake like a detached lever at the heart of some complex of ideas, and someone else is going to have to point it out. We need to start thinking now about how to avoid the irrational diseases of groups, because groups can do so much to counteract the irrational diseases of individuals.
comment by orthonormal · 2009-04-25T02:44:43.937Z · LW(p) · GW(p)
Concrete examples would be much appreciated here.
↑ comment by AlanCrowe · 2009-04-25T15:08:28.720Z · LW(p) · GW(p)
The author of this article is clearly pursuing rationality in a similar vein to LessWrong and OvercomingBias, and recommending it to others who would emulate his worldly success.
↑ comment by orthonormal · 2009-04-26T19:05:58.132Z · LW(p) · GW(p)
Excellent link! There's definitely a lot of consonance there, especially on the subject of heuristics and biases. Is anyone with some knowledge of finance interested in reading Munger's book on investing with an eye to practical uses of rationality?
↑ comment by talisman · 2009-04-25T02:46:47.480Z · LW(p) · GW(p)
There are three specific examples linked to; I agree that I could/should have done more.
↑ comment by hrishimittal · 2009-04-25T09:26:54.469Z · LW(p) · GW(p)
How have you used rationality in your marriage and family life? Did it help you choose the right partner?
How do you 'imagine a couple that truly understood Aumann'?
↑ comment by talisman · 2009-04-28T00:56:27.132Z · LW(p) · GW(p)
I was several years away from starting to learn about x-rationality when I met my partner.
Since there seems to be some interest, I'm going to try to collect my thoughts to describe the contribution of x-rationality to my personal life, but this may take considerable time; I've never tried to put it in words, and there's a strong dash of "dancing about architecture" to it.
↑ comment by AnnaSalamon · 2009-04-25T03:46:58.245Z · LW(p) · GW(p)
How about examples from your own work, marriage, or circle of friends?
↑ comment by talisman · 2009-04-25T03:53:11.075Z · LW(p) · GW(p)
I wanted to avoid the anecdotes-ain't-data writeoff and to avoid making the post too much about me specifically. Is that a mistake?
↑ comment by AnnaSalamon · 2009-04-25T04:12:39.288Z · LW(p) · GW(p)
Anecdotes often are significant evidence; it depends how rare the anecdotal successes, how large a population of individuals the anecdotes are selected from (either by you as you choose anecdotes, or implicitly by the community if individuals who by chance have certain sorts of anecdotes are more likely to share), and on how high the prior is on "these tricks really do help" (if the tricks are a priori plausible, it takes less data to establish that they're likely to really work).
But whether or not your anecdotes are significant evidence, do share. If nothing else, it'll give us a better idea of what kind of rationality you have found to be what kind of useful. "Rationality" is such an abstract-sounding term; we need to put flesh on it, from scenes in daily life. Being about you specifically is fine.
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-04-25T03:55:14.010Z · LW(p) · GW(p)
Probably. Specificity really matters for effective writing.
Besides, this is, technically, a blog...
comment by outlawpoet · 2009-04-25T02:54:45.099Z · LW(p) · GW(p)
A further point in favor: dysrationalia accumulates in groups much as the small advantages you describe do.
Mensa and AMA members may not have superpowers (to pick two weakly rationalist groups), but they also don't spend millions of dollars sponsoring attempts to locate Noah's Ark, or traces of the Jewish tribe that the LDS church believes existed in South America.
↑ comment by hirvinen · 2009-04-25T03:24:18.440Z · LW(p) · GW(p)
Using the martial arts metaphor, at least Mensa appears to be more about having a lot of muscle, not about fighting skills, and there isn't a strong agenda to improve either.
↑ comment by outlawpoet · 2009-04-25T03:34:41.759Z · LW(p) · GW(p)
I agree. Mensa and the AMA aren't actually avowedly rational, nor do they have any group goals that require the same, but they are weakly rational groups, because they contain a lot of smart people and they have institutional biases against failures of intelligence and opinion.
This keeps out certain types of dysrationalia, which is all I needed for my comparison to more vulnerable groups like the LDS and those Charismatic Protestants.
↑ comment by JulianMorrison · 2009-04-25T09:04:35.761Z · LW(p) · GW(p)
I'd say they have no better success at rationalism than the Mormons. All they have is a reactive distaste for some of the traditional symptoms of dumb, including the sillier kinds of religion. They are completely undefended against other death spirals, even closely related ones concerning silly but detailed theories with no evidence.
comment by Cameron_Taylor · 2009-04-25T13:36:32.029Z · LW(p) · GW(p)
The success of Yvain's post threw me off completely. My experience has been the opposite of what he describes: x-rationality, which I've been working on since the mid-to-late nineties, has been centrally important to successes I've had in business and family life.
I don't believe you. I think a counterfactual Talisman with his x-rationality cut out and replaced by a double dose of "How to Win Friends and Influence People" would still have a successful business and family life. More or less successful, I really couldn't guess.
When it comes to groups--sized from two to a billion--Yvain couldn't be more wrong.
Nothing you said seemed to support that conclusion. Partly because it implies a bizarre caricature of Yvain's post and misses his main point.
comment by Mike Bishop (MichaelBishop) · 2009-04-25T06:17:43.878Z · LW(p) · GW(p)
It sounds like you're arguing that there are increasing returns to rationality in groups.
I am not sure. But I think that it would be helpful to think about what experiment would demonstrate the argument you're making here. e.g. Give a rationality diagnostic exam to a bunch of people, then put them in groups of various sizes and measure how well they perform various tasks.
↑ comment by talisman · 2009-04-28T01:12:45.805Z · LW(p) · GW(p)
Relatively rational people can form deeply irrational groups, and vice versa.
I would probably take a group with rational institutions but irrational members over a group with irrational institutions but rational members.
Of course, rational people will be better on average at building rational groups, so I would still predict a positive correlation in the experiment.
comment by cabalamat · 2009-04-25T18:04:51.087Z · LW(p) · GW(p)
If groups magnify the effectiveness of rational thinking, what would an entire community or nation of rationalists be like? And how could such an outcome be achieved?
↑ comment by Cameron_Taylor · 2009-04-26T06:00:12.026Z · LW(p) · GW(p)
If groups magnify the effectiveness of rational thinking, what would an entire community or nation of rationalists be like?
Extremely dangerous! Any population that has a sex imbalance in favour is a war waiting to happen. A rampaging army of nerds with 'freaking laser beams'? Save us all!
And how could such an outcome be achieved?
No thanks! (Unless there are catgirls.)
comment by Technologos · 2009-04-25T16:40:14.119Z · LW(p) · GW(p)
I broadly agree with your conclusions, and I wanted to further note that this article in the PNAS draws a link between cognitive skills and material success in a way not simply mediated by job choice.
While we certainly cannot say that cognitive skills and rationality are identical, the article does discuss how groups with higher average ability to plan, take calculated risks, etc. seem to do better over the long-run, including a specific discussion of the Industrial Revolution in Britain.
comment by MrShaggy · 2009-04-25T03:33:07.154Z · LW(p) · GW(p)
It is true that groups magnify the importance of rational thinking, but I don't think the few examples cited prove that non-x-rational thinking was the cause of the success. And even if that were proven, it wouldn't prove that x-rational thinking would (given opportunity cost) make a group better. The first example is Toyota, but the link isn't a serious argument that continuous improvement is the main cause of their success versus the US auto industry, nor that adopting continuous improvement was Toyota being rational while the US auto industry's failure to adopt it was them being less rational. I doubt all those claims.
I also doubt that "Companies with slightly better risk management are currently preparing to dominate the financial space." As best I can tell, this presupposes that the financial crisis hit all companies the same and that the difference in effects was due to risk management policies. I tend to think the crises hit different companies differently because they were varied and somewhat random in their effects, so that some companies got lucky (though some probably did have better risk policies).
And I really don't understand the claim about certain countries having more rational systems which over centuries led to their current improved "ways of life." I assume this refers to the highly industrialized countries. Was their growth only .5% more per year than other countries? Was the Tsarist rule of Russia more rational than that of imperial China? Or Great Britain more rational than Denmark? I highly doubt that amounts of rationality is what explains the current economic distinctions between countries.
Finally, there is the cliche about finance professors--if they really understood it, why aren't they rich? Similarly, why not start a company using x-rationality to guide it? Find, for example, a new small business that is growing, and just do the same thing--only using x-rationality.
↑ comment by stcredzero · 2009-04-25T04:46:36.188Z · LW(p) · GW(p)
From the OP: "Whereas a rational individual is still limited by her individual intelligence, creativity, and charisma, a rational group can promote the single best idea, leader, or method out of hundreds or thousands or millions."
GIGO. At any given time, the rational group will be limited by their consensus beliefs and mental models. (Indicator of group quality -- do their mental models improve over time?) Rationality is just a tool for uncovering truth. I've often found that I'll go to work on a hard problem using rigorous intellectual tools (Mathematics, on which rationality is based) but then I'll be flummoxed and have to set it aside. Then the answer will pop into my head the next morning as I'm eating breakfast, at which point I'll use math to validate the intuition I've just had.
Sometimes we need to stumble on the truth by accident or other non-rational means. Can rationality help us if we don't even know the right questions to ask in the first place? I think it's a powerful tool, not a panacea.
We may well benefit from rationality, but that doesn't mean all of the answers we seek will come sliding down the chute when we turn the crank on the machine. However, I will say that it is a fantastic filter for revealing which are the wrong answers.
Toyota is an example of a company that utilizes rationality for a competitive edge. Whenever they have an assembly line problem, they go through the "Five Whys" exercise. Ask "why" iteratively five times. Why five? It's a manageable number, usually enough to get at the real underlying issue, and you have to set some fairly low limit, otherwise employees doing the exercise will keep quitting their jobs and go off to live in the woods to live as philosopher ascetics.
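The iterated-"why" walk described above can be sketched as a toy lookup over a chain of causes. Everything here (the failure chain, the names) is illustrative and hypothetical, not Toyota's actual data; the point is only the mechanics of stopping after a fixed number of "whys":

```python
# Hypothetical chain of causes for an assembly-line failure.
causes = {
    "robot arm stopped": "circuit overloaded",
    "circuit overloaded": "bearings seized",
    "bearings seized": "insufficient lubrication",
    "insufficient lubrication": "pump intake clogged",
    "pump intake clogged": "no filter on the intake",
}

def five_whys(problem, causes, depth=5):
    """Follow the cause chain at most `depth` steps; return the path."""
    chain = [problem]
    for _ in range(depth):
        cause = causes.get(chain[-1])
        if cause is None:  # ran out of known causes early
            break
        chain.append(cause)
    return chain

chain = five_whys("robot arm stopped", causes)
print(" -> ".join(chain))
```

The fixed depth is the interesting design choice: it bounds the exercise at a level where the answer is usually actionable (fit a filter) rather than metaphysical.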