In What Ways Have You Become Stronger?
post by Vladimir_Nesov · 2009-03-15T20:44:47.697Z · LW · GW · Legacy · 40 comments
Related to: Tsuyoku Naritai! (I Want To Become Stronger), Test Your Rationality, 3 Levels of Rationality Verification.
Robin and Eliezer ask about ways to test rationality skills, for each of the many important purposes such testing might serve. Depending on what's possible, you may want to test yourself to learn how well you are doing at your studies, to check (at least to some extent) the sanity of the teaching you follow, to estimate the effectiveness of specific techniques, or even to force a rationality test on a person whose position depends on the outcome.
Verification procedures have various weaknesses, making them admissible for one purpose and not for another. But however rigorous the verification methods are, one must first find the specific properties to test for. These properties or skills may come naturally with the art, or they may be cultivated specifically for the testing, in which case they need to be good signals, hard to demonstrate without also becoming more rational.
So, my question is this: what have you become reliably stronger at after walking the path of an aspiring rationalist for a considerable time? Maybe you have noticeably improved at something, or maybe you haven't learned a certain skill yet but are reasonably sure that, because of your study of rationality, you'll be able to do it considerably better than other people.
This is a significantly different question from the ones Eliezer and Robin ask. Some of the skills you obtained may be virtually unverifiable, some may be easy to fake, some may be easy to learn without becoming sufficiently rational, and some may be standard in other disciplines. But I think it's useful to step back and write a list of skills before selecting the ones most suitable for testing.
40 comments
Comments sorted by top scores.
comment by AnnaSalamon · 2009-03-18T21:40:52.007Z · LW(p) · GW(p)
Ways I’ve benefitted:
My head feels clearer. I rationalize less. Life feels better and has better aesthetics.
I’m less defensive, and less attached to the particular habits or traits I started out with. I’m more willing to ditch my current ways of doing things for something else that looks promising. (This may follow from having Something to protect more than from rationality per se.)
I have more self-confidence, and I’m more likely to look around in the world and notice and address real problems, instead of self-absorbedly poking around “interesting ideas”, or notions of virtue, as a kind of entertainment. I make more decisions and am less prone to stalling around staring at all the options.
I notice more contexts where others' starting ideas for how to proceed are better than my initial impressions, and I more often go with those ideas. I notice more contexts where others' starting anticipations are a better guide than my own, particularly in subjects around which I have emotional biases, and I more often believe them.
I’m more likely to stick with a difficult question instead of shying away from it.
My social skills have improved somewhat.
I'm more likely to notice when the evidence favors a particular hypothesis, instead of making up entertaining arguments to myself about how I could support this side, or that side, and aren't I far-sighted to be above the fray.
I have more true beliefs and fewer false beliefs.
When I do science, I’m better able to look at the evidence first, move through many possible hypotheses, etc., instead of staying locked into my own relatively boring initial research avenue.
↑ comment by MichaelHoward · 2009-03-18T23:10:45.832Z · LW(p) · GW(p)
Thank you! This is almost exactly my own list, but for some weird reason I had huge difficulty articulating it.
I'm not sure why that was so tricky. I thought about it a lot, because despite the improvements in my thinking and decision-making, my effectiveness at actually doing stuff hasn't changed greatly. I'm steering better, but I'm not pedaling faster.
↑ comment by AnnaSalamon · 2009-03-19T00:00:25.524Z · LW(p) · GW(p)
Can you explain more what you mean by steering vs. pedaling?
What you say about steering vs. pedaling, or about improvements in “thinking and decision-making”, and not so much improvement in “doing stuff”, sounds like it might fit for me and in general. But... really? If so, by what mechanisms?
If we’re better at choosing e.g. what path to take toward useful scientific research, or positive relationships, or income, this should increase the amount of useful research, goodness in relationships, or income we gain.
As to myself: my ability to make money as a tutor did increase, once I started actually trying to make money as a tutor (vs. just doing tutoring and marketing activities, without tracking what helped students and what made money). I think my visible social skills improved somewhat, but I’m not confident, and I should check with others. My effectiveness at improving the outside state of the world has improved to a ridiculous extent, because I’m working on existential risks now, and my picture of how to make the world a better place used to involve activity that was ridiculously less efficient. My effectiveness at writing decent prose, keeping healthy, etc., has improved... slightly... in the manner that I might’ve expected from just experimenting a bit and reading some non-rationalist self-help literature.
If “good decision making” doesn’t improve one’s actual goal-achievement, why not? Is it just a “feeling” of good decision-making, rather than actual good decision making? Does rationality only work where it isn’t measurable? Does rationality only help much for “really tricky issues” like global philanthropy, and not for questions like how to make money or build positive relationships? Do we just need to actually discuss and practice the “actually apply this rationality to your day-to-day decisions” step?
(Re: this last possibility: I was talking the other day to a good rationalist by OB/LW standards, who comments here fairly often. He was talking about his plans to get a higher-paying job, and how he was undergoing a particular certification process for the purpose. He’d gotten some distance into studying for the certification, but it hadn’t occurred to him to, like, actually look up the wages and employment rates of people who got the certification and to compare to alternatives. “Look into wages before you go through a degree/certification program, if your goal in the job is to make money” should be a cached heuristic for rationalists, and might improve the mundane usefulness of rationality. I don’t know how many other such cached heuristics we should have.)
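The "look into wages before you go through a certification program" heuristic amounts to a quick expected-value comparison. A minimal sketch, with entirely hypothetical wage, employment-rate, and cost numbers (none of these figures come from the discussion above or from real labor data):

```python
# Hypothetical comparison: is a paid certification worth it versus the
# status quo? All numbers below are made up for illustration.

def net_value(expected_wage, employment_rate, cost, years=5):
    """Expected earnings over `years`, weighted by employment odds,
    minus the up-front cost of the credential."""
    return expected_wage * employment_rate * years - cost

certification = net_value(expected_wage=60_000, employment_rate=0.8, cost=8_000)
status_quo = net_value(expected_wage=45_000, employment_rate=0.9, cost=0)
print(certification > status_quo)  # True
```

The point of the heuristic is not the arithmetic, which is trivial, but remembering to look up the inputs before committing years of effort.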
↑ comment by MichaelHoward · 2009-03-19T00:58:16.774Z · LW(p) · GW(p)
Can you explain more what you mean by steering vs. pedaling?
To be honest I had the analogy cached from here and it seemed appropriate, but I'll try to clarify. I'm making better decisions about what I do or don't do, what I keep doing or stop doing, and what I pay attention to or ignore, but I'm not significantly better or faster at the actual doing itself.
I think rationality will help me with that bit, indirectly, by helping me understand which skills are best to learn for the things I want to do, which methods work best for learning them, and which are snake oil. But at the moment, most of my learning time is going into rationality-related topics because I think it's important to get those foundations first. I still have a huge amount of that to learn.
comment by Emiya (andrea-mulazzani) · 2020-03-06T13:40:20.717Z · LW(p) · GW(p)
Ways I've benefitted:
- I've managed to stop an overwhelming tendency to self-sabotage my efforts in order to preserve my self-esteem. I still catch myself doing it from time to time, but it's nowhere near as bad as in the past, when, given a couple more years, it would likely have made me drop out of my university.
- My self-esteem has increased and become more stable. I'm managing to interpret hard tests I could fail as tests of my current skills, not of my stable, absolute worth, and I think the whole becoming-stronger philosophy was the main factor, combined with learning about the incremental mindset during my studies. If I don't know how to do something, rather than retreating I try to break down the problem and look for knowledge about it, which usually does the trick. Stuff that proves to be way above my current level is still a hard blow, but at least I'm managing not to run from what I can do, or could learn to do if I tried my hardest.
- I've somehow become the prevalent voice of reason in my group of friends. This was likely the result of an interaction between rationality and my studies (psychology), but I've become really good at catching the irrationalities in their behaviour and nudging them in the right direction when asked for help. This works best when they need to resolve interpersonal conflict with one another; I've yet to see anyone change their habits from advice alone. Still, I'm shocked by how much weight my opinion now carries, having lived most of my life completely clueless about interpersonal relations.
- I've become better at making decisions when I take the time to think them through rationally. I still make mistakes, but I've moved from a stage where my brain would just keep suggesting arguments for one side or another until I went with what I'd chosen from the start, to actually managing to weigh the arguments and change my mind. I'm not yet at a level I'm satisfied with, but my feeling is that my accuracy went up considerably.
- Related to the two above, I've become significantly better at understanding situations. I feel like I can see the factors influencing something as they actually are, rather than as society, subjective points of view, or common sense frame them.
- I've become better at predicting when my brain will fail me by acting instinctively and irrationally against my rational decisions, and at arranging circumstances that make me go along with those decisions anyway. My best accomplishments in this regard are switching from a really unhealthy diet to an averagely healthy one, starting to practice a reasonable amount of physical activity almost regularly after years of not doing anything at all, and quitting smoking completely. I still haven't managed to make a habit of this, and there are many instances where my dumb brain takes control and does what it instinctively wants to do, but so far I've managed to apply it to most of my specific long-term issues.
- I've become much, much better at science and at judging scientific evidence, which is really important for both my studies and my future job.
- I've managed to choose something I want to protect, and I'm planning what to do after my studies accordingly. I've yet to accomplish any real result toward it, but I think having this purpose is a considerable motivation for working to increase my abilities.
I feel like starting my journey as a rationalist had a huge impact on my life because I was the kind of person who would benefit most from it (high intelligence, poor judgement, fluctuating self-esteem that would force me into all kinds of rationalisations and Escheresque reasonings to make sure it was never, ever challenged by reality). So I likely benefitted a lot because I was suffering more from the deficits it could help correct. Interestingly though, the improvement I feel most satisfaction from, and have felt most strongly, doesn't seem to have started as correcting a deficit, because:
- I think that I've actually become smarter. I've always had above-average intelligence, but I was pretty bad at using it, due to the previous reasons and extreme akrasia (which has yet to be solved at a satisfying level); even so, I feel like I'm thinking in a completely different mode than before. Finding rationality was like suddenly pressing the gas pedal of my brain, hard. I remember that when I had just started learning it, I would often get headaches from how much I thought about something, and I got a tremendous emotional kick from rejoicing at all the new kinds of thoughts and strategies I was suddenly having. I'd often start thinking about a difficult problem while I walked, just to amuse myself and be impressed by how smart I was feeling (yes, I know how that sounds). All of a sudden I could plot; I could find new, interesting strategies that likely had a lot of overlooked flaws I wasn't seeing yet. I felt like I was thinking at my hardest, and after a while I felt I wasn't really exerting myself as much as before anymore; my normal thinking was just at that new level. Improvements from that point were slower to come, but I still look back at how I thought months or a year before, and it feels like I could reliably outsmart my older self. External reality seems to confirm my inner feelings: I'm finding significantly fewer problems I'm unable to grasp, I manage to come up with plans that work most of the time for everyday problems I face for the first time, and most of my friends just team up against me every time there's a strategy game of some sort (which is both frustrating and exhilarating). If anyone managed to get past the point where I got excited while writing (read: started bragging), I'd be really interested to know if other people have had a similar experience.
comment by Nominull · 2009-03-16T03:52:14.576Z · LW(p) · GW(p)
Learning about the halo and horns biases has helped me make more accurate predictions about people's actions, and realize that my friends are terrible people and my enemies are pretty cool.
Hearing Eliezer's solutions to philosophical problems has made me stop wasting so much time on those philosophical problems, which is an advantage I've gained from Overcoming Bias, but not really from increased rationality.
comment by NicoleTedesco · 2012-01-15T15:51:49.865Z · LW(p) · GW(p)
Both my husband and I have cognitive challenges (don't we all, we imperfect results of evolution, us) that constantly threaten to sour our relationship. We strongly credit our study of cognitive science and rationality for keeping our relationship sane and enjoyable. We make conscious decisions about how we manage our moods relative to each other. We work hard to recognize cognitive biases and to tease out objective fact from fallacy. We have a very strong, twenty year relationship because of that.
Then again, one of the things that attracted each of us to each other was the fact that we were both rationality-seeking and valued that in the other. I believe, however, I can make a rational case that our approach towards mutual anger management, anxiety management, and so on, has helped our relationship to remain strong, fruitful, and enjoyable.
For certain, understanding the philosophy of science -- minimally, the concept of "confounding variables", along with other concepts and practices -- has helped us get and keep my husband's bipolar disorder under control. For about a year he was hallucinating due to some medication imbalances. His understanding of experimental methods and rational thinking helped him differentiate hallucination from reality during that period. Our understanding of science has also helped us make some pretty good decisions about our health, and even about our dog's health (the little tyke is 17).
comment by Technologos · 2009-03-16T01:48:27.921Z · LW(p) · GW(p)
My emotional stability has noticeably increased since I left Catholicism. Realizing that I was responsible for any outcome I achieved or failed to achieve left me with no choice but to focus on winning and to shut up and multiply.
This was, of course, long before I learned of those particular concepts. The effect has significantly amplified since then. It is now very difficult to make me unhappy, for essentially that reason.
comment by meta_ark · 2011-04-05T01:06:31.212Z · LW(p) · GW(p)
I've become good at resolving fights in my family - when people don't understand why the other person is angry, I can explain the mistakes they're making in terms of probability, biases, cognitive science, or (often) status-seeking behaviour, and they understand.
Also, I've become a lot better at managing my life romantically. I kept changing my mind about whether someone had feelings for me - she did, she didn't, she did, she didn't - and I could never be sure if I was re-interpreting evidence to suit my preferred hypothesis. So I decided upon a test, decided ahead of time how I should update my beliefs based on how she'd react, and did it. Saved myself a lot of heartache.
Learning about priming, consistency effects and cached thoughts has also helped me steer my future self towards what I'd like to be.
↑ comment by Alicorn · 2011-04-05T01:24:00.100Z · LW(p) · GW(p)
So I decided upon a test, decided ahead of time how I should update my beliefs based on how she'd react, and did it.
This sounds virtuous in terms of empiricism, but testing someone to see if she has feelings for you is kind of a nasty thing to do in general... what was your test?
↑ comment by Benquo · 2011-04-05T02:17:16.976Z · LW(p) · GW(p)
testing someone to see if she has feelings for you is kind of a nasty thing to do in general...
You're making a big assumption about what is meant by "test". A test could be anything from simply asking "Do you have feelings for me?" to the sort of mind games I'm guessing you have in mind.
There is an illusion of transparency here - look here for a similar case, resolved successfully - and we should be quick to taboo vague words like "test" when it sounds like someone is saying something objectionable.
To be fair, you sort of did that by asking for clarification, but it would have been better to wait until after you knew what meta_ark meant by "test" before using emotionally charged words like "nasty".
↑ comment by wedrifid · 2011-04-05T03:12:45.697Z · LW(p) · GW(p)
A test could be anything from simply asking "Do you have feelings for me?"
Or change your body language and see if she mirrors. Or maintain eye contact and see how long she maintains it and in what manner she breaks it. Or make a moderately funny joke and see if she laughs. Or just ask her out already - willingness to go on an outing being an easier and potentially less personal admission to make than an outright confession of feelings, and also easier to decline, so the test is even further from 'nasty'.
↑ comment by meta_ark · 2011-04-05T06:07:40.544Z · LW(p) · GW(p)
wedrifid: I would have, except mutual friends who had been in similar situations with her had tried that, and it made things very awkward between them for a few months. So I had to find a more subtle way.
Alicorn: I just made an overly flirtatious joke and when she didn't respond, I knew what it meant. She's usually very flirtatious with everyone, so it was very unexpected behaviour for her.
↑ comment by wedrifid · 2011-04-05T06:53:18.549Z · LW(p) · GW(p)
I just made an overly flirtatious joke and when she didn't respond, I knew what it meant. She's usually very flirtatious with everyone, so it was very unexpected behaviour for her.
Smoothly done. Flirtation fulfilling one of its intended roles!
except mutual friends who had been in similar situations with her had tried that, and it made things very awkward between them for a few months
In some situations and for some people I actually consider that a beneficial side effect. It doesn't apply to yourself, of course, but some people lack the self-awareness or pragmatic ability to evolve relationships in a beneficial direction based on available information and preferences. In such cases the aversive emotion of awkwardness can prompt a healthy response that they are otherwise too naive to consider. Like 'Next!', for example.
↑ comment by meta_ark · 2011-04-05T06:05:56.791Z · LW(p) · GW(p)
Oh! The issue is she's a very flirtatious person normally, even with people she's not interested in. So I just made a joke about us making out which was a little more flirtatious than she usually acts. And she didn't really respond, which was very unlike her. Nothing manipulative, just checking for deviations from usual behaviour.
comment by Matt_Simpson · 2009-03-16T23:27:27.221Z · LW(p) · GW(p)
Moving from a binary logic to a logic of probability has done wonders for avoiding various philosophical paradoxes, even without taking into account the Bayesian formalism. I say avoid rather than solve because they are no longer interesting problems, so I ignore them and spend my time on something more intriguing.
Also, I used to stall on certain decisions because I would try to figure out what the "right" thing to do was. I don't do that anymore; instead I stall trying to figure out what the hell I want. I probably stall just as often, but I come away from my decisions feeling much better about them than before.
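Trading binary logic for a logic of probability can be sketched as a single Bayes update, in which belief is a number between 0 and 1 rather than a true/false verdict. The numbers below are illustrative only, not drawn from the comment above:

```python
# A minimal sketch of graded, probabilistic belief via Bayes' rule,
# as opposed to flipping a binary true/false switch.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) from a prior and two likelihoods."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Start 50/50 on a hypothesis, then see evidence that is four times
# likelier if the hypothesis is true than if it is false.
posterior = bayes_update(0.5, 0.8, 0.2)
print(posterior)  # 0.8
```

The paradox-dissolving point is that no proposition ever has to be assigned exactly 0 or 1; evidence just moves the number.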
comment by thelittledoctor · 2012-01-17T16:18:14.160Z · LW(p) · GW(p)
While akrasia is still an enormous problem for me (as it always has been), it is oh-so-slowly becoming less of one. I have always been a fairly devil-may-care person regarding responsibilities and schoolwork, possibly due to early encouragement about being "smart" and "talented", which led me to think that I didn't HAVE to work hard - an idea that was unfortunately borne out in many ways by the evidence throughout my adolescence. I had the impression that I had some kind of supernatural power of avoiding consequences, that I was in some way vastly better and more intelligent than my peers, and so on. This in spite of being an avowed materialist and agnostic; my beliefs weren't propagating properly. LessWrong opened my eyes: the world is allowed to kill me, my innate talents are not enough to get through life (I have to work hard!), and there are right and wrong decisions based on the evidence. Since then (it's been about a year) I've been trying to turn myself into the kind of person who Gets Shit Done. I still haven't finished the Sequences (I'm apparently too akratic even to sit down and plow through something I enjoy reading), so I'm sure there's a great deal more for me to learn... But if nothing else, x-rationality has given me the clarity to know what needs to be done.
comment by glennonymous · 2012-01-07T13:05:10.661Z · LW(p) · GW(p)
The first answer that occurs to me:
- I am very significantly happier and more even-tempered.
To expand: I have long suffered from mood swings in which I would 'enjoy' a month or two of borderline hypomania, followed by one to four months of depression and anxiety, accompanied by a lot of akrasia and mildly self-destructive behavior.
Before my 'rationalist conversion' in 2005, my main support system for dealing with these problems had become various Alcoholics Anonymous-style 12-step groups. After my rationalist conversion (I'll use BRC and ARC from here), I realized these programs don't have very good efficacy, especially for their primary purpose of helping people quit self-destructive behaviors. They have a secondary purpose, which is enhancing adherents' quality of life, for which they are somewhat more efficacious, but they promote too many irrational beliefs to be recommended for this purpose IMO.
ARC, I went on a fairly intense quest to discover better ways to improve the quality of my life by rational means, which ultimately led me here, among other places. Along the way, I learned to practice on myself the psychological techniques of Stoicism (I use that to refer to the ancient Greek and Roman philosophical school that was the basis for Cognitive Behavior Therapy, not the modern slang term), as outlined by William Irvine in A Guide To The Good Life.
Without writing a long post about Stoicism, one of the core techniques is to doubt the validity of your thoughts and interpretations, especially thoughts that you find disturbing or that give you pleasure. The reason is that you have zero, or close to zero, influence over many of the things that happen that you get disturbed or ecstatic about. The Stoics hold that it is irrational to get worked up over things about which you can do nothing. Thus the aim of Stoicism is to train yourself to pursue and avoid only those things it is possible to EFFECTIVELY pursue and avoid, and to cultivate serene acceptance of The Things You Cannot Change. (Yes, this is the 12-steppers' Serenity Prayer, but with a much better set of psychological techniques for cultivating the lofty state it describes.)
In a nutshell, learning these techniques has allowed me to effectively short-circuit the mental habit of going into a "tizzy," which is what I call that thing where you start playing an anguish-provoking mental loop in your head over and over again. This in turn has reduced the cognitive component of my depression down close to nothing. It has also diminished some of the cognitive component of hypomania, by instilling a habit of being skeptical of my "high" thoughts as much as I am of my "low" thoughts. This also has a positive impact on my overall happiness by softening the crash that occurs when my rose-colored notions about things I am going to do (get rich by starting my own business, usually) fail to come true. (Note that none of this means I shouldn't start a business or aspire to become rich!! However, there is a big difference between the hard-headed mental state that would set a person up for success in starting and running a business and the fragile high I am describing.)
Bottom line: I've experienced a major improvement and stabilization in my mood, without antidepressants or other psychoactive drugs. (I do get regular exercise -- another direct outcome of Stoic practice -- and this also helps.) I haven't had a serious bout of depression in two years, which is unprecedented in my adult life.
I've got to stop writing, so for the moment I will just list a couple of other major benefits of my rationalist conversion, to be unpacked later:
- I indulge in fewer self-destructive/addictive behaviors, have lost a lot of weight, exercise regularly, work harder and am more productive -- in short, I have less akrasia.
This is a result of various aspects of rationality kung fu, most recently Less Wrong, commitment contracts, and Beeminder.
I also arguably:
Make more money than I would have otherwise (because I studied negotiation techniques)
Read and study more
Sleep better
And the skin on my hands is less dry, especially in winter. (I really like that last one, which is a nice little object lesson in rationality in itself, but in the interest of getting something posted, I will elaborate later.)
comment by Court_Merrigan · 2009-03-17T03:04:04.400Z · LW(p) · GW(p)
Big one for me: cutting the Gordian knot of the philosophical antinomies, i.e., those philosophical dilemmas with no answers. Someone somewhere at Overcoming Bias commented that the "useful" parts of philosophy evolved into the natural sciences; the rest became the muted academic wordgames we see today (or something like that - the poster was much more incisive).
And just like that, my interest in those endless philosophical dilemmas dissolved. What a timesaver.
If anyone can locate that post / commenter, I'd be grateful.
↑ comment by Cameron_Taylor · 2009-03-18T03:12:55.755Z · LW(p) · GW(p)
So true.
comment by AnnaSalamon · 2009-03-18T21:39:06.573Z · LW(p) · GW(p)
With Marshall, I'm surprised there are so few apparent gains listed. Are most people who benefited just being silent? We should expect a certain number of headache-cures, etc., just by placebo effects or coincidences of timing. I’ll post mine below; I hope others do so. If you’ve spent many hours on OB/LW-type content and haven’t benefitted, post that, too. Data on harms or on lack of harms also welcome.
comment by nazgulnarsil · 2009-03-17T02:45:47.030Z · LW(p) · GW(p)
I would say the biggest change is in using status-seeking explanations to yield better predictions about the people around me. This has also let me form a more accurate picture of how others interpret my behavior.
↑ comment by pwno · 2009-03-17T21:11:10.734Z · LW(p) · GW(p)
I have gotten better at using my knowledge of rationality to signal for status...
↑ comment by Cameron_Taylor · 2009-03-18T03:12:12.113Z · LW(p) · GW(p)
Ditto on parent and grandparent.
comment by Richard_Kennaway · 2009-03-16T23:29:58.918Z · LW(p) · GW(p)
I have learnt ways of noticing when I am acting in ways I can improve on; discerning how I came to diverge from the Way; and then correcting myself.
People have given examples of realising they were doing something dreadfully wrong, but I don't recall the general topic of how to notice and correct errors having been discussed, other than a passing mention by Eliezer somewhere recently to the effect that when a great error comes to light, it had its seed in an earlier, small error that went unnoticed.
Knowing the virtues and vices is good, but the Way has no tramlines. You cannot follow it unless you can notice when you have left it, and find your way back.
And on the subject of noticing and correcting, it would be convenient if there was a way of previewing comments here.
comment by PhilGoetz · 2009-03-16T02:08:45.898Z · LW(p) · GW(p)
I don't think I can answer this, since I think I was a rationalist by nature - at least back to the age of 5 or 6, when I was upset by logical fallacies in the Bible. My life has been more one of discovering things that being a rationalist made me bad at, and things that being a rationalist given bad information made me bad at, and trying to fix them.
I'm bad at translating thought into action, because I want to gather more data and do more analysis. I'm bad at status games, because I internalized egalitarian values to the extent that I find it painful to assert dominance over anyone. I'm bad at self-promotion and displaying value, because I was taught that doing so was prideful. Etc. A familiar story, I think.
Replies from: psycho↑ comment by psycho · 2009-03-16T21:53:29.742Z · LW(p) · GW(p)
I am not sure your discovering logical fallacies in the Bible at 5 or 6 is much proof of being rational. I would imagine the actions you have taken, or are taking, to achieve some difficult goal would be better proof of your rationality.
Just a thought...
comment by Desrtopa · 2011-01-24T04:46:42.268Z · LW(p) · GW(p)
I've learned to recognize confused questions, and to avoid wasting time answering them on their own terms when the proper response is to simply dissolve them. Since this motivated me to drop a second major in philosophy, one I originally pursued because I enjoyed messing around with those confused questions, it has saved me quite a bit of time and effort.
I've greatly refined my ability to update on evidence and actually change my mind. Some of these beliefs have little payoff in terms of behavior, but occasionally it allows me to correct for fairly harmful mistakes.
Like Anna Salamon, my social skills have benefited.
I trust my explicit reasoning. I am able to agree or disagree with arguments separately from their conclusions. When I follow an argument, I can pinpoint if and where it contains elements I disagree with. I no longer reach the end of an argument without spotting any flaws, yet find myself disagreeing with its conclusion for reasons I cannot articulate. I am also prepared to spot and call out flaws in arguments whose conclusions I agree with, rather than treating them as soldiers on my side.
Although I consider myself to still be a novice in this ability, I have at least begun to be able to analyze the true motivations behind my actions, and what I want to want, and make efforts to shape my behavior with that knowledge.
I have to confess, though, that I've learned very few of my core rationality skills by reading and participating in Less Wrong.
comment by SeanMCoincon · 2014-07-31T00:10:56.027Z · LW(p) · GW(p)
The most useful skill I've developed has been in meeting immaturity (in both rationale and delivery) with maturity (ditto). I work in a heavily right-wing workplace that refuses to allow anything but Fox News on anything resembling a television. This is my training environment. Even in the presence of highly irrational and emotionally charged convictions, I've found that the ability to maintain an uninvested calm and slowly help my partner make their argument better (through gradual consilience with reality) can result in ACTUALLY CHANGED MINDS. The first step seems, invariably, to be pointing out counterfactuals that back them away from absolute confidence; when presented as potential improvements ("You'd probably see greater success at decreasing the actual number of abortions if you could find ways to enable people to only purposefully conceive a child."), even a position they once reviled can seem outright tasteful. The key appears to be presenting oneself as a potential ally, so as to avoid the "I must engage on all fronts" mentality that prevents meaningful engagement at all.
comment by CannibalSmith · 2009-03-16T10:13:23.650Z · LW(p) · GW(p)
I've become marginally better at predicting how much time something will take.
comment by Oklord · 2011-04-12T14:50:03.180Z · LW(p) · GW(p)
Certainly, rigor of thought. I've been constantly testing every damn thing I do, be it research, leadership, or what have you. What might previously have been a gut decision in group work now involves testing the motives of everyone involved, including myself, and being incredibly wary of any viewpoint that seems like a shortcut, whether from myself or others.
My evaluation of others has certainly improved: predicting and making sense of others' behavior has become a lot easier (in at least one case allowing me to cherry-pick co-workers for an assignment when I really should not have been able to), and I'd like to think I'm slowly working through and against attribution bias in general.
Also, my fear of failure has decreased significantly and my willingness to get out there has grown, to the point where opportunities come floating by every other day now. Unfortunately this has produced an acute fear of missing out, and I've reached the point where I need to cull responsibilities: too many unfinished projects and not enough time.
comment by Cameron_Taylor · 2009-03-18T03:14:18.524Z · LW(p) · GW(p)
The ability to shut up and save my energy in the face of excessive inductive difference.
comment by Marshall · 2009-03-17T17:52:39.952Z · LW(p) · GW(p)
What a good question, and how strange that there are so few answers. Rationalists are a strange bunch and still rather unpredictable to me, which in a sense is rather surprising. I think "belonging" to this uneasy club of rationalists has made me believe more in the results of my own analysis (and to Hell with Aumann), because rationality gives more than one answer, as repeated readings of OB and LW confirm.
comment by FeepingCreature · 2012-01-21T11:16:14.705Z · LW(p) · GW(p)
This may not count, and it's more of a fringe benefit anyway, but reading LW/EY gave me confidence that Many-Worlds is probably something that's true about reality, and that in turn has practically eliminated my far-mode fear of death.
Replies from: MarkusRamikin↑ comment by MarkusRamikin · 2012-01-21T11:49:44.061Z · LW(p) · GW(p)
and that in turn has practically eliminated my far-mode fear of death.
...why?
Replies from: FeepingCreature↑ comment by FeepingCreature · 2012-01-21T18:48:17.817Z · LW(p) · GW(p)
Because I don't anticipate ever not finding myself to exist.
Replies from: MarkusRamikin, APMason↑ comment by MarkusRamikin · 2012-01-21T19:45:19.480Z · LW(p) · GW(p)
Does this also mean you can take any risks you want, and that Eliezer is wasting his time trying to save the world?
↑ comment by APMason · 2012-01-21T18:56:55.826Z · LW(p) · GW(p)
It seems that if you say "because of MW, I ought to anticipate all possibilities being actualised", you have to include possibilities such as "suddenly and apparently inexplicably undergoing the most extreme torture a human being can undergo" (as well as the inverse, but really, we know they don't cancel out). So even if that were the correct interpretation of MW, it wouldn't be a reason not to feel fear (although, yes, perhaps the fear of death in particular would go out the window).
However, it doesn't seem like that is the correct interpretation of MW, because not all apparent branching points are in fact quantum branching points. If someone offers me a choice between orange juice and apple juice, and I prefer orange juice, I shouldn't anticipate choosing orange juice in one universe and apple juice in another. I prefer orange juice, so I choose the orange juice.