The Benefits of Rationality?
post by cousin_it · 2009-03-31T11:17:39.503Z · LW · GW · Legacy · 80 comments
Robin wrote how being rational can harm you. Let's look at the other side: what significant benefits does rationality give?
The community here seems to agree that rationality is beneficial. Well, obviously people need common sense to survive, but does an additional dose of LessWrong-style rationality help us appreciably in our personal and communal endeavors?
Does LessWrong make us WIN?
(If we don't WIN, our evangelism rings a little hollow. Science didn't spread due to evangelism, science spread because it works. Art spreads because people love it. I want to hold my Art to this standard. Push-selling a solution while it's still inferior might be the locally optimal decision but it corrupts long-term, as many of us have seen in the IT industry. That's if the example of all religions and political movements isn't enough for you. Beware the Evangelism Death Spiral!)
We may claim internal benefits, such as improved clarity of thought, from each new blog insight. But religious people claim similar internal benefits that actually spill out into the measurable world, such as happiness and charitability. This makes us envious, and we try to channel our internal changes into group efforts at world-benefiting tasks. To my mind this puts the cart before the horse: why compete with religion on its terms? Don't we have utility functions of our own to satisfy?
No, feelings won't do. If feelings turn you on, do drugs or get religious. Rationalism needs to verifiably bring external benefit. Don't help me become pure of racism or some such. Help me WIN, and the world will beat a path to our door.
Okay, interpersonal relationships are out. Then the most obvious area where rationalism could help is business. And the most obvious community-beneficial application (riffing on some recent posts here) would be scientists banding together and making a profitable part-time business to fund their own research. I can see how many techniques taught here could help, e.g. Prisoner's Dilemma cooperation techniques. If a "rationalism case study" of this sort ever gets launched, I for one will gladly offer my effort. Of course this is just one suggestion; everything's possible.
One thing's definite for me: rationalism needs to be grounded in real-world victories for each one of us. Otherwise what's the point?
Comments sorted by top scores.
comment by Scott Alexander (Yvain) · 2009-03-31T12:04:41.416Z · LW(p) · GW(p)
All this stuff about "Something to Protect" and "Rationalists Should Win" is well and good. And Eliezer's point is well taken: grounding rationalism in some real-world need ensures that we don't enter affective death spirals around some particular less-than-optimal rational method.
But dammit, truth for the sake of truth is okay too!
I want to know how to become better at certain things, but that's not the main reason I'm on here. Have you ever had all the wires connected to your computer all tangled up, and it's not really making anything that much harder to use, but it just gets on your nerves until you have to stop what you're doing and spend however long it takes trying to untangle all of them and put them in nice little lines? That's how my brain feels all the time. Reading Overcoming Bias helped me untangle some of those wires.
Of course, if I didn't then take advantage of my new clarity to advance some of my other values, you'd have to wonder whether I'd really learned anything. But that wasn't the main reason I came here, and it's not the main reason I stay.
...and judging by how much time people here spend on Newcomb-like problems, unless you have some really unusual day jobs, I'm guessing I'm not alone.
PS: If you think rationality's important for succeeding at business, and I prove to you that great-looking hair is a greater contributor to success in business than knowledge of Bayescraft, would you stop reading Less Wrong and start reading hairstylist blogs? Would you recommend other people do so?
Replies from: cousin_it, MichaelVassar
↑ comment by cousin_it · 2009-03-31T12:40:59.204Z · LW(p) · GW(p)
Truth isn't something you feel is true.
Programming abounds with theories that give their adherents a sense of clarity - for example the relational model of data, object-oriented design, or REST. Funny that all those theories have active "evangelists". My experience with each of them followed the same pattern: an initial "wow", followed much later by a painful un-clarification as contact with the real world exposed the theory's shortcomings. Curiously, my experience with Overcoming Bias has followed the same pattern: I bought into the ideas wholesale for the first months, then slowly grew disillusioned over about a year. Today it's all the same old same old. The idea of correcting cognitive biases just isn't as powerful by itself as we like to think. Hence my post.
Don't get stuck in the box an ideology outlines for you. It may be a good fit today but you'll hopefully grow.
Re PS: if rationality had no other real-world uses except business success, and in that single area hairstyle proved to be a greater contributor, then yes, I'd leave and never look back.
Replies from: Yvain
↑ comment by Scott Alexander (Yvain) · 2009-03-31T14:11:08.103Z · LW(p) · GW(p)
And yet Newton didn't develop the theory of gravitation as part of his quest to make cheaper widgets. Nor did Einstein develop relativity because there was anything practical that classical mechanics couldn't do.
Looking for truth is not a fancy way of saying "looking for the first half-assed solution you can find so you can feel like you know something true and go back to what you were doing before."
The easiest way to get a bundle of beliefs that completely dissolve previously mysterious questions, don't contradict one another, don't contradict experience, don't contain sacred mysteries, and don't force you into dark side epistemology to maintain them - as far as I know the easiest way to get such beliefs is to believe things that are true.
If there's some other weird attractor state of beliefs that also fulfills those requirements, I guess I risk falling into it. But then again, so do you - such beliefs would have to predict experience as successfully as the truth, which means they would have to give you the same widget-making capacity as true beliefs.
I started out at Less Wrong as a believer in the "rationalism helps you win!" school, but the more I read, the less I think going from the sort of person who's read all of Overcoming Bias to the sort of person who's read OB plus all of Less Wrong is going to grant you any hugely significant extra real-world-winning capacity. I should make a post on this sometime.
Replies from: cousin_it, pjeby
↑ comment by cousin_it · 2009-03-31T14:51:13.476Z · LW(p) · GW(p)
Never mind widgets. I overemphasized business in the original post. Any kind of reality check will do. Newton had a reality check, what about us? Not many accurate predictions here. Choose any real-world metric that suits you - just don't degenerate into "what biases have you overcome today" soft-science bullshit.
I once joked that science has four levels, high to low: "this works", "this is true", "this sounds true", "this sounds neat". We here are still at number three, no?
Replies from: Yvain, AlexU
↑ comment by Scott Alexander (Yvain) · 2009-03-31T18:10:06.573Z · LW(p) · GW(p)
I think you have too limited a picture of what searching for truth entails, and that we don't have as great a difference between our views as you think.
Newton and Einstein used rationality to seek truth and bring unity to experience, not for practical results. But they were both smart enough to know they'd better check their results against experience, or they'd get the wrong answer and never be able to move further. If we're smart, we'll do the same, whether we're after truth or whatever.
Someone once said there were two kinds of rich people - those who really like having luxury goods, and those for whom money is just a way to keep score. The same could apply to rationalists; there are those who want some specific practical goal or predictive ability, and there are others for whom the ability to achieve practical goals or make predictions is a way to keep score. Einstein was happy to hear his theory successfully predicted the path of light during an eclipse, I'm sure, but not because he was in it for the eclipse-light-predicting.
Replies from: cousin_it, Nebu
↑ comment by cousin_it · 2009-03-31T18:46:10.992Z · LW(p) · GW(p)
You're right, we are more or less in agreement. The expression "to keep score" captures the topic perfectly. Pickup artists have attained a very accurate/predictive view of female mating psychology because they keep score. :-) I'd love to have something similarly objective for rationalism.
↑ comment by Nebu · 2009-04-14T14:02:30.398Z · LW(p) · GW(p)
Newton and Einstein used rationality to seek truth and bring unity to experience, not for practical results. But they were both smart enough to know they'd better check their results against experience, or they'd get the wrong answer and never be able to move further.
In 1919, Sir Arthur Eddington led expeditions to Brazil and to the island of Principe, aiming to observe solar eclipses and thereby test an experimental prediction of Einstein's novel theory of General Relativity. A journalist asked Einstein what he would do if Eddington's observations failed to match his theory. Einstein famously replied: "Then I would feel sorry for the good Lord. The theory is correct."
It seems like a rather foolhardy statement, defying the trope of Traditional Reality that experiment above all is sovereign. Einstein seems possessed of an arrogance so great that he would refuse to bend his neck and submit to Nature's answer, as scientists must do. Who can know that the theory is correct, in advance of experimental test?
Replies from: Vladimir_Nesov
↑ comment by Vladimir_Nesov · 2009-04-14T14:06:42.921Z · LW(p) · GW(p)
A typo in Yudkowsky's article: Traditional Reality -> Traditional Rationality.
↑ comment by AlexU · 2009-03-31T15:10:32.826Z · LW(p) · GW(p)
Yes. I've been a semi-regular reader of OB for about a year. I think it's an interesting blog. But have I learned anything useful from it? Has it made any practical difference in the choices I make, either day-to-day or long-term? The answer is no. Admittedly, this may be my own fault. But I recall a post, not too long ago, soliciting people's feedback on "the most important thing you learned from OB in the past year," or something of that sort. And while there were lots of people excitedly posting about how much OB has taught them, the examples they gave were along the lines of "I learned the power of the fundamental attribution error!" or "I learned the importance of continually adjusting my priors!" with curiously few examples of real differences OB made in anyone's practical choices. This raises the question: if tweaking our rationality has no appreciable effect on anything, then how can we say we're really tweaking our rationality at all? Perhaps we're just swapping new explanations for fundamentally irrational processes that are far too buried and obscure to be accessible to us.
That said, I think things like the recent posts on akrasia are strong moves in the right direction. Intellectually interesting, but with easy to grasp real-world implications.
Replies from: robzahra, Kaj_Sotala, Demosthenes
↑ comment by robzahra · 2009-03-31T16:06:34.092Z · LW(p) · GW(p)
OB has changed people's practical lives in some major ways. Not all of these are mine personally:
"I donated more money to anti aging, risk reduction, etc"
"I signed up for cryonics."
"I wear a seatbelt in a taxi even when no one else does."
"I stopped going to church but started hanging out socially with aspiring rationalists."
"I decided rationality works and started writing down my goals and pathways to them."
"I decided it's important for me to think carefully about what my ultimate values are."
Replies from: Bleys
↑ comment by Bleys · 2009-03-31T22:33:11.477Z · LW(p) · GW(p)
Yes! Or even further, "I am now focusing my life on risk reduction and have significantly reduced akrasia in all facets of my life."
Replies from: AlexU
↑ comment by AlexU · 2009-04-01T02:05:15.980Z · LW(p) · GW(p)
This sounds an awful lot like one of the examples I gave above. OK, so you're focused on "risk reduction" and "reducing akrasia." So what does that mean? You've decided to buckle up, wear sunscreen, and not be so lazy? Can't I get that from Reader's Digest or my mom?
Replies from: ciphergoth
↑ comment by Paul Crowley (ciphergoth) · 2009-04-01T08:07:44.328Z · LW(p) · GW(p)
Telling people to buckle up is nothing special. Successfully persuading people to buckle up - helping people understand and fix the internal sources of error that stood in the way of doing so in the past - will save a life if you can do it enough.
↑ comment by Kaj_Sotala · 2009-03-31T21:38:51.547Z · LW(p) · GW(p)
The problem is that even though learning to identify and avoid certain biases will affect your behavior, there's no easy way to articulate those effects. It comes mainly from things not done, not things done.
For instance, upon hearing a fallacious argument, being aware of its fallacies causes the hearer not to believe it, whereas he previously would have. Or if he thinks something on his own: previously a bias would have caused him to think a certain thought, which would have led to a certain action. Now, having learned to identify the bias, he doesn't even generate that thought, but instead another, which leads to him taking a different action. While these things certainly have an effect, they're too subtle to identify. You're not going to know the thoughts you avoided (even if you can try to guess), only the ones you've actually thought.
I feel this has largely been the case for me. My behavior has certainly been affected because I now think more clearly; of that I'm pretty certain. But can I give any concrete examples? I'm afraid not. The effect is on too subtle a level for me to properly observe. But that doesn't mean there aren't any concrete examples, it only means I can't verbalize them.
↑ comment by Demosthenes · 2009-03-31T16:24:21.357Z · LW(p) · GW(p)
This debate has already played out in attacking and defending Pragmatism.
A lot of the rubrics by which to judge whether or not rationalism wins or whether or not rationalism is an end in itself involve assigning meaning and value on a very abstract level. Eliezer's posts outline a reductionist, materialist standpoint with some strong beliefs about following the links of causality. Rationalism follows, but rationalism isn't going to prove itself true.
Deciding that rationalism is the best answer for your axiomatic belief system requires taking a metaphysical stand; if you are looking for a definite metaphysical reason to practice rationalism, then you are interested in something that the practice of rationalism is not going to help much with.
Replies from: AlexU
↑ comment by AlexU · 2009-04-01T02:02:43.771Z · LW(p) · GW(p)
"Rationalism" as compared to what? Mysticism? Random decision-making? Of course rational behavior is going to be by far the best choice for achieving one's particular ends. I wasn't questioning the entire concept of rationalism, which clearly has been the driving force behind human progress for all of history. I was questioning how much making "tweaks" -- the kind discussed here and on OCB -- can do for us. Or have done for us. Put differently, is perseverating on rationality per se worth my time? Can anyone show that paying special attention to rationality has measurable results, controlling for IQ?
Replies from: Demosthenes
↑ comment by Demosthenes · 2009-04-01T23:28:09.159Z · LW(p) · GW(p)
Mysticism and random decision-making are both acceptable and highly successful methods of making decisions; most of human history has relied on those two... we still rely on them. If you are a consequentialist, you can ignore the process and just rate the outcome; who cares why nice hair is correlated with success - it just is! Why does democracy work?
What makes rationalism worth the time is probably your regard for the process itself or for its outcomes. If it's the outcomes, then you might want to consider other options; following your biases and desires almost blindly works out pretty well for most people.
↑ comment by pjeby · 2009-03-31T18:12:23.420Z · LW(p) · GW(p)
If there's some other weird attractor state of beliefs that also fulfills those requirements, I guess I risk falling into it. But then again, so do you - such beliefs would have to predict experience as successfully as the truth, which means they would have to give you the same widget-making capacity as true beliefs.
There are plenty of things like this -- engineering models, heuristics, etc. You don't have to have a "true" map to have a "useful" map. An idealized right-angle, not-to-scale map of a city which nonetheless allowed you to logically navigate from point A to point B would be "useful" even if not "true" or "accurate" in certain senses.
Meanwhile, if you wait around for a "true" map, you're not going anywhere.
Replies from: Yvain
↑ comment by Scott Alexander (Yvain) · 2009-03-31T18:31:41.694Z · LW(p) · GW(p)
But such maps are only useful insofar as they are true. For example, the London Tube Map claims to be a useful representation of which stations are on which lines. It's useful in doing that because it is correct in its domain - every station it says is on the Piccadilly Line really is on the Piccadilly Line. It doesn't claim to accurately represent distance, and anyone who tried to use it to determine distances would quickly get some surprises.
There the danger doesn't seem to be getting something that isn't the truth, the danger is stopping at something that's just true enough for a certain purpose, and no more.
And a seeker of truth seems less likely to get stuck there than a seeker of win - witness classical mechanics, which is still close enough to be useful for everything practical, versus relativity, which exists because Einstein wouldn't accept a theory which worked well enough but had a few little loose ends.
Replies from: pjeby
↑ comment by pjeby · 2009-03-31T19:42:01.065Z · LW(p) · GW(p)
There the danger doesn't seem to be getting something that isn't the truth, the danger is stopping at something that's just true enough for a certain purpose, and no more.
Why is that bad?
And a seeker of truth seems less likely to get stuck there than a seeker of win - witness classical mechanics, which is still close enough to be useful for everything practical, versus relativity, which exists because Einstein wouldn't accept a theory which worked well enough but had a few little loose ends.
How has relativity made us better off? If you want to pursue truth because you like truth, that's great -- it's a "win" for you. But if you only need the truth to get to something else, it's not a win to add useless knowledge.
Are you sure that this isn't all about signaling being a truth-seeker? (i.e. "Truth-Seeking Isn't About The Truth")
After all, credibly signaling that you value the truth could make you a valuable ally, be considered a neutral judge, etc. etc. For these reasons, credibly valuing the truth above all else might be beneficial... for reasons not having anything to do with actually getting to the truth.
So, if you're saying we should seek truth just because it's the truth, and not because it brings practical benefit or pleasure or sends good signals, then what is the use of seeking truth?
Replies from: Eliezer_Yudkowsky, HughRistik, Yvain, PhilGoetz, Nebu
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-04-14T15:41:16.634Z · LW(p) · GW(p)
How has relativity made us better off?
The lightspeed limit stops aliens from eating us.
Replies from: Eugen
↑ comment by Eugen · 2018-08-22T17:12:43.468Z · LW(p) · GW(p)
What if it actually doesn't, and their craft are really only limited by how fast their typical UFO-discs can spin without killing the crew inside (apparently they are sponge-like inside), since unlike us they already know how to create anti-gravity to pull their ships forward? In that case the reason we are not dead yet is that they needed to figure out how to construct motherships fast enough for a full-scale Earth invasion after we apparently killed most of their messengers. Our strategy should then be to pool resources into defending the Earth against an alien invasion and make it so costly to them that they will instead consider a trade agreement, which may at some point become more attractive to them than all-out war. Trade is the way forward. Of course that is only conjecture; I don't really know if they exist, but assigning literally zero probability to this may be stupid.
↑ comment by HughRistik · 2009-03-31T23:57:37.863Z · LW(p) · GW(p)
There the danger doesn't seem to be getting something that isn't the truth, the danger is stopping at something that's just true enough for a certain purpose, and no more.
Why is that bad?
It can be bad if you mistakenly rest at a local maximum in your results.
You take a theory that is close enough to being true that it gives you results. Say you make $1000 a month from a certain theory of web advertising on your website. If you worked a little harder to uncover the truth, you might confuse yourself and go down to $500 a month. Yet if you worked even harder still, you might make $2000 a month. The $1000 was a local maximum. If so, seeking the truth could help you find that out, assuming that (on average, at least) more truth leads to better results in solving real-world problems.
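The trap can be made concrete with a minimal sketch (the revenue numbers below are invented for illustration, loosely echoing the $500/$1000/$2000 example): a greedy improver that accepts a change only when it immediately pays more will halt at the local peak, because every path to the global peak first passes through the dip.

```python
# Hypothetical monthly revenue for successive "versions" of an ad theory.
# Version 3 is the $1000 local peak, versions 4-5 are the dip toward $500,
# and version 8 is the $2000 global peak. All figures are made up.
REVENUE = [200, 500, 800, 1000, 700, 500, 900, 1400, 2000, 1600]

def greedy_improve(v):
    """Move to a neighboring version only while doing so pays more."""
    while True:
        neighbors = [n for n in (v - 1, v + 1) if 0 <= n < len(REVENUE)]
        best = max(neighbors, key=lambda n: REVENUE[n])
        if REVENUE[best] <= REVENUE[v]:
            return v, REVENUE[v]
        v = best

print(greedy_improve(0))  # (3, 1000): halts at the local maximum
print(greedy_improve(6))  # (8, 2000): reaches the global peak only from nearby
```

Whether real-world truth-seeking escapes this trap depends on exactly the assumption flagged above: that more truth, on average, buys better results.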
↑ comment by Scott Alexander (Yvain) · 2009-03-31T20:07:42.548Z · LW(p) · GW(p)
Why is that bad?
It's not, if you know you're doing it.
Are you sure that this isn't all about signaling being a truth-seeker?
Pretty sure. If I wanted to signal, I'd be a lot more highfalutin about it. Actually, my comments do sound a bit highfalutin (I was looking for a better word than "truth seeker", but couldn't find one), but that wasn't exactly what I wanted to express. The untangling-wires metaphor works a little better. Nominull's "I only seek to be right because I hate being wrong" works too. It's less of a "I vow to follow the pure light of Truth though it lead me to the very pits of Hell" and more of an "Aaargh, my brain feels so muddled right now, how do I clear this up?"
Also, this would be a terrible community to signal truth-seeking in, considering how entrenched the "rationality as win" metaphor is. As I mentioned in the hair example, I think a lot more people here are signaling a burning interest in real-world application than really have one.
So, if you're saying we should seek truth just because it's the truth, and not because it brings practical benefit or pleasure or sends good signals, then what is the use of seeking truth?
Um...this line of argument applies to everything, doesn't it? What is the use of seeking money, if it doesn't bring pleasure or send good signals? What is the use of seeking love, if it doesn't bring pleasure or send good signals? What is the use of seeking 'practical benefits', if they don't bring pleasure or send good signals?
Darned if I know. That's the way my utility function works. And it certainly is mediated by pleasure and good signals, but I prefer not to say it's about pleasure and good signals because I'd rather not be turned into orgasmium just yet.
Replies from: ciphergoth, Demosthenes, ciphergoth, igoresque
↑ comment by Paul Crowley (ciphergoth) · 2009-03-31T21:34:58.021Z · LW(p) · GW(p)
Yeah, "rationalists WIN!" is the most widely misued EYism on all of LessWrong.com.
↑ comment by Demosthenes · 2009-03-31T22:35:00.187Z · LW(p) · GW(p)
Yvain:
Do you really believe that you engage in Truth-Seeking for utilitarian reasons? I get the impression that you don't really believe that.
Would you be willing to enter a computer simulation where you got to investigate higher math puzzles (or metaphysics) with no applications? Spend your days in a fantastic and never-ending Truth-Seeking project (we'll throw great sex, food and housing into the holodeck for you as well)?
I liked this better at the beginning when you were prodding people who say that they see rationalism as a means to an end! You seem to be going back to consequentialism!
I don't believe that rationalists WIN, because I don't believe that winning WINS.
Replies from: Nebu, Demosthenes
↑ comment by Nebu · 2009-04-14T14:10:50.715Z · LW(p) · GW(p)
Would you be willing to enter a computer simulation where you got to investigate higher math puzzles (or metaphysics) with no applications? Spend your days in a fantastic and never-ending Truth-Seeking project (we'll throw great sex, food and housing into the holodeck for you as well)?
Maybe a few videogames (or other forms of entertainment in addition to sex) and this sounds like a very sweet deal.
↑ comment by Demosthenes · 2009-03-31T22:38:14.158Z · LW(p) · GW(p)
And you must enjoy the signal value a little bit! You aren't keeping your Less Wrong postings in your diary under lock and key!
Replies from: Demosthenes, ciphergoth, loqi
↑ comment by Demosthenes · 2009-04-01T13:19:12.953Z · LW(p) · GW(p)
loqi:
That's possible and probably partially accurate; if there were more posts taking the form "I believe X because..." on Less Wrong, I might be more open to the idea that people are doing that.
Ciphergoth:
Also, this would be a terrible community to signal truth-seeking in, considering how entrenched the "rationality as win" metaphor is. As I mentioned in the hair example, I think a lot more people here are signaling a burning interest in real-world application than really have one.
I just wanted to get Yvain's opinion about how much value from posting on Less Wrong was coming from signaling. Yvain suggested that this was not his or her main goal and that LW would be a uniquely poor place to attempt it. I personally doubt both of those points, but I was hoping to get some clarification since the comments about signaling and the nature of truth-seeking don't seem to be part of a system of beliefs.
Are you worried that signaling truth-seeking is legitimate enough?
↑ comment by Paul Crowley (ciphergoth) · 2009-04-01T08:16:19.797Z · LW(p) · GW(p)
Sure, but it's pretty clear that a lot of people are enjoying the WIN! signal too. Let's try not to get too caught up in who is signalling what.
↑ comment by Paul Crowley (ciphergoth) · 2009-03-31T22:28:04.364Z · LW(p) · GW(p)
I inherently value humanity's success in understanding as much as we do, but I don't discount the utility much in time; I don't much mind if we learn something later rather than earlier.
As a result, it's not that important to me to try to serve that end directly; I think it's a bigger gain to serve it indirectly, by trying to reduce the probability of extinction in the next hundred years. This also serves several other goals I value.
↑ comment by igoresque · 2009-03-31T20:39:49.774Z · LW(p) · GW(p)
There the danger doesn't seem to be getting something that isn't the truth, the danger is stopping at something that's just true enough for a certain purpose, and no more.
Why is that bad?
It's not, if you know you're doing it.
This is an interesting debate. I believe all the truth we'll ever get will be like the tube map: good for purpose X, and no more. Or at least, bad for purpose Y. Wanting more is surrendering to metaphysics, realism, platonism, absolutism - whatever you wish to call it.
I believe Platonism shaped first the Hellenistic world, then Christianity (Paul was of Greek culture, the whole New Testament was written in Greek, and books like John's are soaked in Platonic philosophy), and rules until today. It also really sucks, because it makes people not want to be less wrong. They want to be completely, absolutely right, in a way you can never claim with the help of mere rationality. Only delusion can help with that.
The Truth Pilgrim's progress goes like this:
Slightly Rational -> Less Wrong -> Delusional
Replies from: pjeby, thomblake
↑ comment by pjeby · 2009-04-14T15:53:57.457Z · LW(p) · GW(p)
Wanting more is surrendering to metaphysics, realism, platonism, absolutism - whatever you wish to call it. ....
Because it makes people not want to be less wrong. They want to be completely, absolutely right, in a way you can never claim with the help of mere rationality. Only delusion can help with that.
The Truth Pilgrim's progress goes like this:
Slightly Rational -> Less Wrong -> Delusional
Yep -- and that's probably as close to an "absolute truth" as you can get. Robert Anton Wilson's "Quantum Psychology" (bad title, awesome book, some parts approach GEB in awesomeness) has some very good information along these lines, along with lots of "class exercises" that might be useful for developing an instrumental rationality group.
↑ comment by PhilGoetz · 2009-04-14T14:22:47.996Z · LW(p) · GW(p)
In a TV set, the electron gun of the cathode ray tube shoots electrons at the screen. An electromagnet bends these beams in a precisely timed manner to make them scan the screen. Since the electrons travel at relativistic speeds, they are time-dilated from our point of view, and you need to use relativity to bend them the right amount.
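For scale, here is a rough sanity check (a sketch with assumed numbers, not from the comment; a typical color-TV anode voltage of about 25 kV is taken as given). An electron accelerated through voltage V gains kinetic energy eV, so its Lorentz factor is gamma = 1 + eV/(m_e c^2).

```python
from math import sqrt

ELECTRON_REST_ENERGY_EV = 511_000  # m_e * c^2 expressed in electron-volts
anode_voltage = 25_000             # volts; assumed typical CRT value

gamma = 1 + anode_voltage / ELECTRON_REST_ENERGY_EV
beta = sqrt(1 - 1 / gamma**2)      # speed as a fraction of the speed of light

print(f"gamma = {gamma:.4f} (~{(gamma - 1) * 100:.1f}% relativistic correction)")
print(f"speed = {beta:.3f} c")
# gamma ~ 1.049 and speed ~ 0.30 c: treating the beam classically would
# misjudge its momentum, and hence the magnetic deflection, at the percent level.
```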
Replies from: pjeby
↑ comment by Nebu · 2009-04-14T14:09:19.902Z · LW(p) · GW(p)
How has relativity made us better off?
I'm a bit stunned by this question, so maybe it was intended to be rhetorical. But if not: I believe things like GPS rely on relativity. And my life has been so much better ever since I got an iPhone with a GPS receiver. It integrates with Google Maps and the local public transportation system to tell me what time I should leave my house in order to arrive at another location at a specific time, by cross-referencing the departure and arrival times of all the subway trains and buses in the system.
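To put a rough number on how much GPS depends on relativity, here is a back-of-the-envelope sketch using standard textbook constants and assuming a circular GPS orbit (the figures are illustrative, not from the comment): special relativity makes the satellite clock run slow, general relativity makes it run fast, and the well-known net drift is roughly +38 microseconds per day - enough, uncorrected, to smear positions by kilometers within a day, since light covers about 300 meters per microsecond.

```python
from math import sqrt

C = 2.99792458e8      # speed of light, m/s
GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6     # mean Earth radius, m
R_ORBIT = 2.6561e7    # GPS orbital radius (~20,200 km altitude), m
DAY = 86400.0         # seconds per day

v = sqrt(GM / R_ORBIT)                                 # orbital speed, ~3.9 km/s
sr_drift = -(v**2) / (2 * C**2) * DAY                  # moving clock runs slow
gr_drift = GM / C**2 * (1/R_EARTH - 1/R_ORBIT) * DAY   # high-altitude clock runs fast

print(f"special relativity: {sr_drift * 1e6:+.1f} microseconds/day")  # ~ -7.2
print(f"general relativity: {gr_drift * 1e6:+.1f} microseconds/day")  # ~ +45.7
print(f"net drift:          {(sr_drift + gr_drift) * 1e6:+.1f} microseconds/day")
```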
Replies from: pjeby
↑ comment by pjeby · 2009-04-14T15:35:02.501Z · LW(p) · GW(p)
I'm a bit stunned by this question, so maybe it was intended to be rhetorical. But if not, I believe things like GPS relies on relativity. And my life has been so much better ever since I got an iPhone with a GPS receiver.
It was intended to point out that Einstein didn't seek out relativity in order to produce useful results, and that, with the possible exception of nuclear energy and atomic bombs, it's quite likely that, had Einstein not come along, most of the "practical" uses of relativity today would've prompted engineers to add time dilation fudge factors to their plans, and then inspired some not-Einstein physicists to figure out what the heck was going on.
In other words, there was really no danger of "stopping at something that's just true enough for a certain purpose, and no more", in a way that would actually produce a bad result, or deprive us of a good one for more than a limited time.
In other words, Einstein's truth-seeking was about his personal desire to "know God's thoughts", not to improve the lot of humanity by helping us get iPhone GPS receivers. And as I said earlier in this thread, wanting truth because you're curious is all well and good, but in the end it's the search for practical models that drives progress. Science having "true" models saves engineers time and mistakes getting started, but they still have to work out practical models anyway... and sometimes need to be able to deal with things that the scientists haven't even started figuring out yet.
Case in point: hypnotism. Scientists still don't have a "true" model for it, AFAIK, but hypnotists have plenty of practical models for it.
↑ comment by MichaelVassar · 2009-03-31T17:03:03.793Z · LW(p) · GW(p)
I'd sure appreciate exact info on the monetary cost/benefit breakdown for more expensively styled hair.
OTOH, isn't that an instance of trying to use rationality to win as well as one of trying to use great hair to win?
Rationality will very rarely win directly.
It definitely won't win if it fails to go meta and never tells us to spend some of our time on anything else; but if it does seem to tell us that and we still don't win, it seems silly to blame rationality rather than ourselves, for using it wrong or making it an idol.
↑ comment by Scott Alexander (Yvain) · 2009-03-31T18:37:27.918Z · LW(p) · GW(p)
I agree that it's rational to seek better hair if better hair leads to your goals. I'm trying to point out an inconsistency: that if you claim to be after success in business, and you spend a lot of time reading Less Wrong but very little time worrying over your hair, then either you're not being as rational as you think or you're not as focused on success in business as you think.
I further wonder if some people who read this will make a token attempt to consider getting nice haircuts, not because they're really after real-world success but because they want to be able to continue telling themselves credibly that they're really after real-world success.
Replies from: ciphergoth
↑ comment by Paul Crowley (ciphergoth) · 2009-03-31T22:10:24.323Z · LW(p) · GW(p)
If you are ambitious for money and power, and you are not already obsessed with your looks, see http://biasandbelief.pbwiki.com/Halo-Effects-of-Attractiveness
(I have great hair, thank you for asking :-) )
comment by PhilGoetz · 2009-04-01T05:27:58.932Z · LW(p) · GW(p)
I am reading Less Wrong at 1 AM even though I need to get up in the morning. Rationally, I would get a better return on my time from sleeping. Rationally, it's pretty clear that, wrt my major goals of managing my life and my job, Less Wrong is more of a hazard than a potential benefit, as the time I spend on it has often had a considerable negative impact on me. So I'm using Less Wrong to help me be irrational and lose. :P
comment by Sideways · 2009-03-31T17:22:16.011Z · LW(p) · GW(p)
No, feelings won't do. If feelings turn you on, do drugs or get religious. Rationalism needs to verifiably bring external benefit.
I couldn't disagree more. Since becoming a rationalist my repertoire of feelings has measurably improved. I'm now capable of being delighted by things that wouldn't have interested me a few months ago. In certain situations I used to feel an overwhelming, paralyzing fear that rationality has cured me of. This is a huge verifiable, external benefit for me.
Since becoming a rationalist, my job performance has improved, and I spend twice as much time (yes, I've kept track) doing the things I've always wanted to, but was previously kept from doing by akrasia or poor time management. I don't know if that's as "awesome" as a martial art, but I definitely consider it a WIN.
If anyone's interested, the four major influences on my rationalism are Marcus Aurelius's Meditations, Moshe Feldenkrais's writing on body awareness, P.J. Eby's mind hacking techniques, and OB/LW.
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-04-01T07:17:56.380Z · LW(p) · GW(p)
I developed my Art in the course of trying intensely to untangle deep confusions and answer wrong questions. If you are trying to do something similar, then I should think and hope that reading through all my reams of advice will be a HUGE win.
Does it have other applications? Oh, some of it does, I should rather think. The art of clear thinking is inherently less specialized than, say, welding.
But my Art of rationality is confessedly incomplete, a strong punch without a hint of kicking. I have not been trying with all my strength to live a happy everyday life. Perhaps you are looking for some of the other pieces, yet to be developed?
But just realizing that this sort of thing can have an art of rationality, one that can be based in cognitive science, is quite a hint; and for that matter, trying to invent that art will be something of a confusing question...
Replies from: cousin_it, cousin_it
↑ comment by cousin_it · 2009-04-01T12:01:00.429Z · LW(p) · GW(p)
You say your Art is grounded in something real for you. I don't dispute that! I only say the Art isn't grounded for the rest of us. This is a problem. Not a problem with your personal intellectual pursuit, but a problem out here, where other people are reading and clapping. A problem with how the movement is turning out. Maybe you shouldn't care, maybe you should.
comment by Alex · 2009-03-31T15:57:01.581Z · LW(p) · GW(p)
Why are interpersonal relationships out? I think rationality can help a great deal here.
Replies from: PhilGoetz, cousin_it, MichaelVassar
↑ comment by PhilGoetz · 2009-04-01T05:21:15.125Z · LW(p) · GW(p)
Can you cite specific examples which exhibit rationality, and not just self-control? (e.g. "I should not lose my temper or get drunk so often" is not a rationality issue.)
I don't doubt you; I'd just like specific examples for discussion before I give an up-vote. :)
Replies from: ciphergoth
↑ comment by Paul Crowley (ciphergoth) · 2009-04-01T08:26:32.836Z · LW(p) · GW(p)
Sure, yes. The housework I do is more available to me than the housework J does, because I am there the whole time I'm doing mine, seeing myself do it, while hers might be done while I'm not even there, and all I get is a few seconds of "Oh, you did the dishwasher, thanks!" As a result, there's a cognitive bias in favour of thinking you do a larger proportion of the housework than you really do. I think a lot of domestic disputes could be fixed if this were more widely recognised.
↑ comment by cousin_it · 2009-03-31T16:11:15.116Z · LW(p) · GW(p)
This sentence was a nod to Robin's point that believing certain falsehoods may be socially desirable, and wrongly estimating one's attractiveness etc. may be desirable romantically.
Replies from: Alex, MBlume
↑ comment by Alex · 2009-03-31T16:53:02.463Z · LW(p) · GW(p)
I would say that being rational - as Robin defined it: more "rational" means better believing what is true, given one's limited info and analysis resources - might harm you, but should not, and does not necessarily do so.
How can rationality help you win? Maybe:
- It can help you win to the extent that its lessons contribute to the winning process.
- In terms of hindering you: it may hinder you if it interferes with this process, but it shouldn't (if it does, you're doing something wrong), because it's about matching your map of reality closer to the territory, allowing any and all strategies to be better implemented.
I don't think that better believing what is true must, by itself, lead to a worse outcome than believing the falsehoods in your example; it is just a case where believing the falsehoods isn't outright defeated by the alternative strategy that 'rationality' provides. With this in mind, I think the idea that rationality can help you in your interpersonal relationships is a very interesting path to follow, and it should be followed.
↑ comment by MichaelVassar · 2009-03-31T17:07:12.199Z · LW(p) · GW(p)
Hell yes.
comment by Nominull · 2009-03-31T14:44:05.816Z · LW(p) · GW(p)
I'm not like Eliezer, I don't seek to be right because I have some beautiful vision of the future I have to actualize. I'm not as smart as him, and I lack his willpower. I only seek to be right because I hate being wrong. Which means that for my purposes drugs are unproductive and religion is counterproductive.
Replies from: Annoyance
comment by AlexU · 2009-03-31T12:51:35.064Z · LW(p) · GW(p)
Your postscript raises an interesting point. I strongly suspect that readers here can have a much greater impact on their real world success by improving arational traits like charisma and physical appearance than by continuing to strive for marginal gains in what are likely to be already-high levels of rationality. At the very least, it seems uncontroversial to say that these traits play a huge role in one's real world success. If we assume, then, that "real world success" is a rational objective, why isn't everyone here hitting the gym daily, working to improve their fashion sense, and enrolling in acting classes to improve social finesse?
Replies from: cousin_it, MichaelVassar, ciphergoth, robzahra
↑ comment by cousin_it · 2009-03-31T13:00:49.247Z · LW(p) · GW(p)
Funny. For the last several months I've been doing exactly what you enumerated: physical training, voice lessons and improving my fashion sense. Those were all rational decisions (caused in small part by Overcoming Bias) and they did already help me.
Replies from: Tom_Talbot
↑ comment by Tom_Talbot · 2009-03-31T13:44:11.891Z · LW(p) · GW(p)
Hah! When I read the top post I immediately thought of my own everyday struggle with irrationality, along the lines of: "I want to get fit and live longer. This requires rationally allotting a certain amount of time to exercise. It's hard to get motivated to exercise, due to akrasia (laziness). I want to solve the problem of akrasia, so I'll go to Less Wrong and see what others are saying about it."
The point is that rationality may have no direct benefits whatsoever, but it is still useful since it helps you choose between, and stick to, behaviours that do have direct benefits.
A classic controversial example: should rationalists go to church?
Replies from: AlexU
↑ comment by AlexU · 2009-03-31T14:01:59.248Z · LW(p) · GW(p)
Rationality has its limits. We all know that daily exercise is good for us, and that it's something we should be doing. It's pretty clearly the "rational" choice. But can rationality actually get us to exercise every day? Is there some further bias we can eliminate that will enable us to drag our asses to the gym even when we're feeling completely exhausted? I doubt it -- there's just nothing much more that rationality can do for us in that department. A related (and rhetorical) question: are fat people fat because they're rationally deficient in some sense? We need to be careful not to downplay the extremely powerful and seemingly ineradicable influences of emotion and subjective experience (urges, fatigue, impulses, etc.) in our day-to-day decision-making.
Replies from: pjeby, Tom_Talbot, favrot
↑ comment by pjeby · 2009-03-31T18:17:59.111Z · LW(p) · GW(p)
Is there some further bias we can eliminate that will enable us to drag our asses to the gym even when we're feeling completely exhausted?
Yes, several. Unfortunately, the exact list is usually different from one person to the next. Here are a few I've had to get rid of:
- The idea that people who like to exercise are jerks
- The idea that it's bad to be too exhausted
- The idea that I shouldn't have to do things if they're uncomfortable
- The idea that it's embarrassing to exercise if I don't already know how
- The idea that if it's too easy, I'll be an idiot for not having done this sooner
This is less than a third of the full list, it's just the ones that come to mind right off... and I'm not really done yet, either. I lost 27 pounds last year, and expect to do a similar amount this year, but my actual habit of exercising is still pretty erratic, due to another bias which I only just eliminated. (Still too soon to tell what impact it's going to have.)
Replies from: cousin_it
↑ comment by Tom_Talbot · 2009-03-31T15:17:31.390Z · LW(p) · GW(p)
Why should our emotions always rule our reason? There ought to be a rational way to deal with urges, fatigue and so on. I think the methods currently under discussion are Pjeby's motivation techniques, cognitive behavioural therapy and possibly meditation. If these lines of inquiry bear fruit, then that should make it possible for people here to muster the willpower to do whatever it is they want to do. At that point we'll be able to say that any Less Wrong reader who wants to lose weight or whatever and can't, is failing to be sufficiently rational.
Replies from: AlexU
↑ comment by AlexU · 2009-03-31T15:30:43.247Z · LW(p) · GW(p)
My point was just that knowing what to do and actually doing it are two separate things. It's possible that someone could come to the objectively rational conclusion in every single circumstance, yet fail to act on those conclusions for a variety of other reasons. In that case, it would be very tough to say their rationality is in any way at fault.
Replies from: AlexU, Emile
↑ comment by AlexU · 2009-04-02T22:52:48.220Z · LW(p) · GW(p)
Anyone care to explain why this comment (and for that matter, the one below) was downvoted? Given that my karma score just dropped about 10 points in under an hour, I can only assume someone is going through my history and downvoting me for some reason. Great use of the karma system.
Replies from: ciphergoth
↑ comment by Paul Crowley (ciphergoth) · 2009-04-02T23:07:52.018Z · LW(p) · GW(p)
I've had some very weird karma behaviour recently too.
All karma systems are abused. On this one, I'd be curious to know what proportion of votes are coming from non-commenting accounts.
Replies from: SoullessAutomaton
↑ comment by SoullessAutomaton · 2009-04-02T23:14:30.772Z · LW(p) · GW(p)
It seems increasingly likely that this bug needs to be prioritized, especially part 3.
↑ comment by favrot · 2009-03-31T15:17:39.410Z · LW(p) · GW(p)
Motivation often comes from witnessing the positive results of your actions. A rationalist is especially attuned to this form of observation so it would seem that exercise is the perfect arena for the rationalist to succeed. I run and lift weights and I feel and perform better (disregard looks for now because it's too loaded). If I stop, then I feel and perform worse. Therefore, as a matter of rationalist discipline I will continue to exercise. Eventually, this should normalize into a sense of motivation. For the first timer, exercise might feel like hell but over time a positive and motivating association should develop.
I would illustrate that like this: exercise (don't like) = feel and perform better (like) => exercise (like) = feel and perform better (like)
And I don't think fat people are irrational, just undisciplined. Developing habits and mental associations takes time. Doing things you don't like over a period of time (which is the same as giving up something you like in the short term) in the interest of a deferred goal is the definition of discipline.
↑ comment by MichaelVassar · 2009-03-31T17:06:38.337Z · LW(p) · GW(p)
And only rational means - practice, taking care to figure out what is working, looking at and beneath the surface features of what other successful people are doing, etc. - are going to improve real traits like charisma, physical appearance, and so on.
Replies from: AlexU
↑ comment by AlexU · 2009-04-01T02:14:00.982Z · LW(p) · GW(p)
Yes, but these are things most reasonably intelligent people know, or figure out, anyway. It seems correct to chalk up these insights to rationality, but trivially so. I don't see what extra work studying rationality per se would be doing for us here.
↑ comment by Paul Crowley (ciphergoth) · 2009-04-01T08:10:44.046Z · LW(p) · GW(p)
First, I'm not here to further myself; I largely have the real-world success I want. Second, I'm already attractive and charming thank you very much...
comment by Dustin · 2009-03-31T17:03:31.991Z · LW(p) · GW(p)
To be honest, I was already pretty rational! OB and LW have helped me clarify many of the reasons I behave as I do, and clarified why I don't understand why others do the things they do (or sometimes why I do things that I don't understand).
Even if rationalism didn't bring real-world victories, I don't agree that there's no point. There's any number of scientific endeavors that I understand, or would like to understand, that don't lead me to anything other than ... understanding.
Replies from: loqi
↑ comment by loqi · 2009-03-31T19:39:31.464Z · LW(p) · GW(p)
To be honest, I was already pretty rational! OB and LW have [...] clarified why I don't understand why others do the things they do
Assuming you mean "now I understand precisely how they're being irrational", I find this to be a dangerous sentiment. I'm curious as to whether or not your rationality self-appraisal is higher or lower after reading OB, particularly Hanson's material. Mine fell considerably.
Replies from: Dustin
↑ comment by Dustin · 2009-07-11T20:23:36.678Z · LW(p) · GW(p)
Old post that I only now came across...
No, I don't mean I understand precisely how they're being irrational. I can't see inside their head.
What I do mean is that I now understand more clearly that there is a system of inherent weaknesses in the way that people think.
As far as my self-appraisal goes, if we're talking relative to the average human being, I would say it has remained constant or fallen slightly.
If you're still reading these comments, I'm interested in what you think that says about you and me.
comment by CronoDAS · 2009-03-31T20:58:20.533Z · LW(p) · GW(p)
No, feelings won't do. If feelings turn you on, do drugs or get religious.
Most drugs that have large, noticeable positive effects on mood (cocaine, amphetamines, opiates, MDMA, etc.) are expensive, risky to acquire, and become less effective over time as the body's feedback systems adjust to compensate for the presence of the drug. Most commonly prescribed psychiatric medications (Prozac, etc.) have only small, minor effects.
If one has six months to live, spending it doing cocaine and heroin is a reasonable way to maximize personal pleasure, but if you don't plan on dying any time soon, their long-term effects make them a bad deal even from a pure hedonistic perspective.
comment by insaneabd · 2009-03-31T15:49:11.378Z · LW(p) · GW(p)
As Eliezer's Quantum Physics sequence showed, Rationality can go a long way in helping scientists get to better theories faster. Which helps mankind WIN, which helps the individual scientists who worked on the theories WIN, etc. A very practical benefit of Rationality.