has anyone actually gotten smarter, in the past few years, by studying rationality?
post by sboo · 2015-12-28T18:34:33.354Z · LW · GW · Legacy · 31 comments
I feel I've learned a lot about problem solving from Less Wrong (and HPMOR in particular): be concrete, hold off on proposing solutions, et cetera. The effect*, unfortunately, doesn't seem to be as large as I had hoped.
Small increases in intelligence are extremely general and even recursive, so it feels worth the effort. But I found alternatives, like meditation or smart drugs, much more effective (though still modest) than studying/discussing/applying the Sequences.
I'm interested in other LessWrongers' experiences with cognitive enhancement.
* by "smarter" I mean "better at problem solving", where examples of "problems" are writing a program, finding the right thing to say to resolve an interpersonal conflict, or memorizing some random fact quickly and then recalling it quickly/vividly. let me know if you want further clarification.
Comments sorted by top scores.
comment by RomeoStevens · 2015-12-31T03:23:15.062Z · LW(p) · GW(p)
I never would have gotten a business off the ground without understanding concepts like the outside view, the introspection illusion, anchoring, fox-vs-hedgehog thinking, beliefs constraining expected evidence, basic statistics, and many more.
But it's more than that. LW type material upgraded my understanding of what it means to understand material. Previously I was doing a lot of guessing the teacher's password, CYA, moral licensing etc in justifying my actions to myself. Building models of when and where to apply concepts, how to seek out new ones when the existing concepts are insufficient, and how to validate them against problem domains is as important as the concepts themselves.
This is partly the difference between LW and just handing someone a copy of Thinking Fast and Slow. In the latter case, I would have read the book, gone "yes, that sounds very nice" and then continued on my way (I think).
comment by Raziel123 · 2015-12-29T20:39:28.744Z · LW(p) · GW(p)
The most important benefit from Less Wrong is that before LW I had a very fixed mindset about things I knew and didn't know, as if these were properties of the things themselves, and when I wanted to improve at something I just did it in a vague, directionless way.
A more concrete example: I always liked modding video games, but modding is very limited in what you can do compared to coding, so at least once a year I made a half-hearted attempt to get better at modding, which resulted in nothing, because the next step was always to learn to code (which was in the "I can't" bin). After reading posts here of people doing awesome stuff, and internalizing that the map is not the territory and so on, I realized that I could likely learn to code, and then the "I can't" bin broke. Exactly two years later I'm now fairly good with Python and Java, and know some Haskell just for fun. I'm currently close to releasing an Android game.
A life-changing benefit I gained was "curing" my social anxiety, mostly thanks to a post made here linking to Mark Manson. It totally changed the way I interact with people, from all fear and uneasiness to flow and actually enjoying being around people (especially women).
Other, less direct benefits: clearing up a lot of philosophical confusion, being saved from a couple of death spirals, having the memorization problem mostly solved with spaced repetition, changing my mind more often, strategic thinking, meta-thinking, and more stuff that gets increasingly abstract and that I don't think is in the spirit of the question.
To answer the question: I DO think that my past self was dumber than I am now, so in a way I've gotten smarter.
Replies from: sboo
comment by gjm · 2015-12-29T15:49:30.182Z · LW(p) · GW(p)
Related LW article from 2009: Extreme Rationality: it's not that great by Yvain, also known as Scott Alexander.
Replies from: None
↑ comment by [deleted] · 2016-01-04T13:18:27.232Z · LW(p) · GW(p)
Wait what! Yvain is Scott Alexander?
Replies from: gjm
↑ comment by gjm · 2016-01-04T15:46:18.478Z · LW(p) · GW(p)
Yup. And (this'll blow your mind) neither of those is his actual exact real name.
Replies from: Lumifer
↑ comment by Lumifer · 2016-01-04T17:20:50.116Z · LW(p) · GW(p)
Um, one of these is his actual real name. Not the full one, of course.
Replies from: gjm
↑ comment by gjm · 2016-01-04T17:36:50.391Z · LW(p) · GW(p)
I know his name. I chose my words carefully. Neither "Yvain" nor "Scott Alexander" is his actual exact real name, any more than (say) "Martin Luther" was Martin Luther King's actual exact real name.
Replies from: Lumifer
↑ comment by Lumifer · 2016-01-04T17:41:57.544Z · LW(p) · GW(p)
You're treating "exact" as "the complete string of characters present on the birth certificate". I'm treating "exact" as "perfectly matches one of the space-delimited words in that string". "Martin" was an actual exact real name of MLK, by my lights.
Replies from: gjm
↑ comment by gjm · 2016-01-04T21:54:02.961Z · LW(p) · GW(p)
I would say that Martin was an exact name of MLK but not the exact name, and I would say that neither "Martin Luther" nor "Luther King" was even an exact name because treating "Martin Luther" as a name means taking "Luther" as a surname and treating "Luther King" as a name means taking "Luther" as a given name, and neither is correct for MLK.
Of course, you may choose to use these terms differently. Perhaps you couldn't truthfully say "neither of those is his actual exact real name". But I could, and I did.
(I would also have preferred not to get into this particular dispute because although it's not exactly hard to find out Scott's actual exact real full complete official name -- or at least something that I take to be that -- I would have preferred to offer a little less assistance to anyone trying to do it. But no matter.)
comment by moridinamael · 2015-12-31T20:32:54.639Z · LW(p) · GW(p)
Pre Less Wrong: Something bad happens. I freak out at all possible levels of freaking out, identify with failure, flinch away from unpleasant thoughts and thus fail even harder, react emotionally, try primarily to preserve my self-image.
Post Less Wrong: Something bad happens. I compensate for my emotional reaction, prioritize important choices with an emphasis on making dispassionate decisions, try to pause even amid emotion and use a varied problem-solving toolkit to consider multiple solutions, and am more concerned with outcomes than appearances.
comment by [deleted] · 2015-12-29T06:43:24.001Z · LW(p) · GW(p)
LW has definitely pointed me to some practical tools that have improved my decision making. Reading books tends to have helped more... but I would never have read those books if not for LW.
comment by IlyaShpitser · 2015-12-30T20:09:46.802Z · LW(p) · GW(p)
How do you know it's not from simply growing older?
Replies from: None
↑ comment by [deleted] · 2016-01-04T13:21:05.397Z · LW(p) · GW(p)
Do people usually get smarter as they get older?
Replies from: IlyaShpitser, Richard_Kennaway
↑ comment by IlyaShpitser · 2016-01-07T18:05:49.607Z · LW(p) · GW(p)
I think I did. I was an idiot in my 20s; it was not a very high bar.
Replies from: None
↑ comment by Richard_Kennaway · 2016-01-04T13:42:37.749Z · LW(p) · GW(p)
How do you know it's not from simply growing older?
Do people usually get smarter as they get older?
During development to adulthood, clearly so. Thereafter, interpreting "smart" as the OP's "better at problem solving", they do, or at least, they can and should. They learn more, acquire and develop skills, accumulate experiences, and so on.
But it's not clear that "simply growing older" is a thing, distinct from all of the things one actually does while time passes — including such stuff as studying rationality.
Replies from: Lumifer
↑ comment by Lumifer · 2016-01-07T18:12:02.120Z · LW(p) · GW(p)
I'd probably say that "growing older" implies accumulating experience. How well you convert that experience into wisdom (= "getting smarter") very much depends on the person (and in particular, on IQ). Some people do an excellent job of it, others not so much.
comment by MrMind · 2015-12-30T09:06:16.835Z · LW(p) · GW(p)
I think an often-overlooked part of rationality is the time and effort that LWers avoid wasting on irrelevant things: debating the meaning of words, theology, blue/green affiliations, and so on. Not because of faith, but because all of these have been dealt with.
True, it's not guaranteed that the freed-up time and energy gets put into more productive things, but it could be.
comment by chaosmage · 2015-12-29T15:53:02.082Z · LW(p) · GW(p)
It worked for me - like I described here.
But the more obvious difference between LWers and non-LWers, in my view, is that LWers do not appear "stuck", while a solid majority of non-LWers do.
Replies from: WoodSwordSquire
↑ comment by WoodSwordSquire · 2015-12-29T16:23:46.451Z · LW(p) · GW(p)
That's a good way of describing how the difference in my own thinking felt - when I was a Christian I had enough of a framework to try to do things, but they weren't really working. (It's not a very good framework for working toward utilitarian values.) Then I bumbled around for a couple of years without much direction. LW gave me a framework again, and one that worked a lot better for my goals.
I'm not sure I can say the same thing about other people, though, so we might not be talking about the same thing. (Though I tend not to pay as much attention to the intelligence or "level" of others as most people seem to, so it might just be that.)
comment by NancyLebovitz · 2015-12-31T12:37:43.027Z · LW(p) · GW(p)
I'm definitely better at checking on whether information makes sense and/or is grounded in somewhat verified facts. Some of that is LW, some of that is a result of spending a lot of time online. I'm more cynical about psych studies, but probably not yet cynical enough.
comment by Gunslinger (LessWrong1) · 2015-12-29T05:51:24.732Z · LW(p) · GW(p)
I'm surprised there still wasn't a response, given that I saw this post yesterday.
On topic, the best you can do is:
- Start solving some problems;
- Write down your thoughts, and compare them with your future thoughts;
- Read more, because even if you are rational, it does not make you omniscient.
I don't think you should be worried much about intelligence either; I think it's vastly overrated.
comment by [deleted] · 2015-12-30T22:31:53.952Z · LW(p) · GW(p)
Based on a very rough comparison of my SAT score with my GRE score, my IQ rose around 10 points in the interim (a gap of about 10 years), and I did begin reading LW during that period. However, my own impression is that most of the increase (if it even exists) came before I started reading LessWrong, and came instead from economics blogs and the literature recommendations I mainly got from those blogs. One of those early blogs was OvercomingBias, so I suppose that could be seen as a win for LessWrong. Previously, most of my reading had consisted of either schoolwork or fiction, typically sci-fi or fantasy.
comment by WoodSwordSquire · 2015-12-29T07:02:39.537Z · LW(p) · GW(p)
The one improvement that I'm fairly certain I can attribute to LessWrong/HPMOR/etc. is getting better at morality. First, being introduced to and convinced of utilitarianism helped me get a grip on how to reason about ethics. Realizing that morality and "what I want the world to be like, when I'm at my best" are really similar, possibly the same thing, was also helpful. (And from there, HPMOR's Slytherins and the parts of Objectivism that EAs tend to like were the last couple of ideas I needed in order to learn how to have actual self-esteem.)
But as to the kinds of improvements you're interested in: I'm better at thinking strategically, often just from using some estimation in decision making. (If I built this product, how many people would I have to sell it to, and at what price, to make it worth my time? This often results in not building the thing.) But the time since I discovered LessWrong includes my last two years of college and listening to startup podcasts to cope with a boring internship, so it's hard to attribute credit.
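The kind of back-of-the-envelope check described in the parenthetical above can be sketched in a few lines of arithmetic. Every number and name here is a hypothetical illustration, not anything from the comment:

```python
import math

# Back-of-the-envelope check: is building a product worth the time?
# All figures below are hypothetical placeholders.

def breakeven_units(build_hours, hourly_rate, price, cost_per_unit):
    """How many units must sell before per-unit margin covers the build time?"""
    margin = price - cost_per_unit
    if margin <= 0:
        raise ValueError("price must exceed per-unit cost")
    opportunity_cost = build_hours * hourly_rate
    # Round up: you can't sell a fraction of a unit.
    return math.ceil(opportunity_cost / margin)

# Example: 200 hours to build, time valued at $30/h,
# selling at $20 with a $5 per-unit cost:
print(breakeven_units(200, 30, 20, 5))  # 400 units just to break even
```

A number like 400 is the point of the exercise: if you can't imagine reaching it, the estimate "often results in not building the thing."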
My memory isn't better, but I haven't gone out of my way to improve it. I'm pretty sure that programming and reading about programming are much better ways of improving at programming, than reading about rationality is. The sanity waterline is already pretty high in programming, so practicing and following best practices is more efficient than trying to work them out yourself from first principles.
It didn't surprise me at all to see that someone had made a post asking this question. The Sequences are a bit over-hyped, in that they suggest rationality might make the reader superhuman, and then that usually doesn't happen. I think I still got a lot of useful brain-tools from them, though. It's like a video game that was advertised as the game to end all games, and then it turns out to just be a very good game with a decent chance of becoming a classic. (For the record, my expectations didn't go quite that high, as far as I can remember, but it's not surprising that some people's did. It's possible mine did and I just take disappointment really well.)
Replies from: sboo
↑ comment by sboo · 2015-12-30T04:39:16.483Z · LW(p) · GW(p)
I'm pretty sure that programming and reading about programming are much better ways of improving at programming, than reading about rationality is.
right, that's what motivated the post. I feel like spending time learning "domain-specific knowledge" is much more effective than "general rationality techniques". Even if you want to get better at three totally different things over the course of a few years, the time spent on a general technique (that could help all three) might not help as much as time spent exclusively on specific techniques.
still, I tend to have faith in abstractions/generality, as my mind has good long-term memory and bad short-term memory. I guess this is... a crisis of faith, if you will, in "recursive personal cognitive enhancement" (lol).
Replies from: Baughn
↑ comment by Baughn · 2015-12-30T14:21:05.140Z · LW(p) · GW(p)
I feel like spending time learning "domain specific knowledge" is much more effective than "general rationality techniques".
However, reading Lesswrong is what prodded me towards getting better at spending my time effectively, really getting into a growth mindset. My only problem nowadays is that there are too many things I want to learn, and that's a much better problem to have; I know I can, I just have to pick and choose. I'm getting better at that, too.
Maybe the same would have happened anyway, but I don't think it would have happened quite as fast.
comment by Gunnar_Zarncke · 2016-01-05T20:19:15.856Z · LW(p) · GW(p)
I didn't get 'smarter' per se, but I got much better at interpersonal communication, and the start of this trend coincides with getting in touch with LW. (It also overlaps with the breakup with my ex-wife, but I think I can rule out a causal connection there.)
Actually, a lot of specific changes in my behavior can be traced back to specific advice from LW. I cannot rule out that through natural development I might have acquired these from other sources, but I think the sheer number of changes speaks for itself.
I considered writing up this development from my Rationality Diary entries and such (I think Brienne mentioned somewhere that we need more success stories), but my low success with more elaborate posts here has discouraged me from doing that.
comment by [deleted] · 2015-12-30T20:40:45.672Z · LW(p) · GW(p)
I've noticed some positive changes in my work efficiency that seem to have happened after I took a more serious attitude to paying attention to my thought processes, writing things down, and systematically approaching tasks:
Though do take this anecdotal evidence with a grain of salt, as the below relates to a high school career (from 10th to 11th grade); I suspect you might not get gains as large in a higher-level environment like university or a technical career.
- I've been going to bed earlier this year, despite having a heavier workload in school (around 1 hour earlier, on average, from 10:30 to 9:30).
- I've managed to maintain approximately the same grades while doing so, I'm working in more time to read books (~30 min/day), I now exercise every two days (~50 min), and I end all computer usage after 7:00, phones and tablets included. These are all habits I didn't consciously form last year, when I hadn't paid much attention to how I thought about things and used my time.
comment by casebash · 2015-12-29T10:57:51.218Z · LW(p) · GW(p)
Firstly, like WoodSwordSquire I now support (total) utilitarianism. Secondly, I am much more confident at solving philosophical problems. Many philosophical puzzles which I found difficult in the past now seem rather clear to me. Thirdly, I am more aware of the assumptions that I am making and hence better able to critique them.