Comments sorted by top scores.
comment by ChristianKl · 2018-08-13T05:58:13.362Z · LW(p) · GW(p)
To me, the piece of writing that comes closest to a rationalist guide to achieving goals is Nick Winter's "The Motivation Hacker". While I don't think Nick Winter posted on LessWrong, I would count him among the top ten people in the Quantified Self movement.
A few years later, Nick Winter got into YC without going through a formal YC interview. He went to an onstage office-hours session, and Paul Graham liked what he saw so much that he accepted the company into YC on the spot (something Paul Graham had never done before).
↑ comment by habryka (habryka4) · 2018-08-13T21:17:42.144Z · LW(p) · GW(p)
I met him at the CFAR Alumni Reunion 2014, so he is definitely connected to the community.
comment by m_arj · 2018-08-09T19:56:10.247Z · LW(p) · GW(p)
(Excuse my English.)
This is, so far, the list of books that have contributed to my enrichment as a rationalist:
[*] mildly related to rationalism
[**] slightly related to rationalism
On Intelligence - Jeff Hawkins
What Do You Care What Other People Think? - Richard Feynman
** Allegro ma non troppo - Carlo Cipolla
Descartes' Error - Antonio Damasio
** Sherlock Holmes Complete Canon - Arthur Conan Doyle
** Freakonomics - Steven Levitt
You Are Not So Smart - David McRaney
* Moonwalking with Einstein - J. Foer
* Mastermind - Maria Konnikova
* What Every Body Is Saying - J. Navarro
* Louder Than Words - J. Navarro
** Ender's Game - Orson Scott Card
** The Martian - Andy Weir
* The Future of the Mind - Michio Kaku
Predictably Irrational - Dan Ariely
Spent - Geoffrey Miller
Why Everyone (Else) Is a Hypocrite - R. Kurzban
The Moral Animal - Robert Wright
* Our Inner Ape - Frans de Waal
All the Sequences - Eliezer Yudkowsky
Sequence: The Science of Winning at Life - lukeprog
The Archaeology of Mind - J. Panksepp
Sapiens - Yuval Noah Harari
Sequence: Living Luminously - Alicorn
Sequence: Yvain's Excellent Articles
Why Zebras Don't Get Ulcers - R. Sapolsky
All of Kaj Sotala's LessWrong posts
Origins of Human Communication - Tomasello
The Selfish Gene - Richard Dawkins
The Joy of Game Theory - P. Talwalkar
What Intelligence Tests Miss - Stanovich
The Third Chimpanzee - Jared Diamond
The Handicap Principle - Amotz Zahavi
The Mating Mind - Geoffrey Miller
The 10,000 Year Explosion - G. Cochran
Superintelligence - Nick Bostrom
The Body Keeps the Score - B. van der Kolk
Facing the Intelligence Explosion - Luke Muehlhauser
Intelligence: All That Matters - Stuart Ritchie
** Hackers and Painters - Paul Graham
Surfaces and Essences - Douglas Hofstadter
* Inside Jokes - Matthew Hurley
All Melting Asphalt blog posts - K. Simler
The Psychology of Social Status - Cheng, Tracy
All of Luke Muehlhauser's LessWrong posts
The Replacing Guilt series - N. Soares
Metaphors We Live By - George Lakoff
** Deep Thinking - Garry Kasparov
Blog: Mindlevelup - The Book
Conor Moreton's 30 Days of Rationality - LessWrong
Dark Lord's Answer - E. Yudkowsky
Why Beautiful People Have More Daughters - A. Miller
Influence - Robert B. Cialdini
The Art of Thinking Clearly - R. Dobelli
Inadequate Equilibria - E. Yudkowsky
* Never Split the Difference - Chris Voss
The Elephant in the Brain - Simler, Hanson
Hive Mind - Garett Jones
** If the Universe Is Teeming with Aliens... Where Is Everybody? - S. Webb
* The Edge Effect - Eric Braverman
Crystal Society - Max Harms
* Our Mathematical Universe - M. Tegmark
Bayes' Theorem: A Visual Introduction - Dan Morris
The Theory That Would Not Die - S. B. McGrayne
* Skin in the Game - Nassim Taleb
Smarter Than Us - Stuart Armstrong
Language in Thought and Action - Hayakawa
I would like to read more books! I am ready for suggestions.
↑ comment by Elo · 2018-08-10T01:41:23.507Z · LW(p) · GW(p)
What do you want to know? Then I can suggest something.
↑ comment by m_arj · 2018-08-10T07:29:54.497Z · LW(p) · GW(p)
Thank you, Elo. A list of books that helped you to become more rational. I don’t care about the subject.
↑ comment by Elo · 2018-08-10T09:50:37.305Z · LW(p) · GW(p)
http://bearlamp.com.au/books-i-read-2017-part-1-relationships-learning/
My 2017 reading list. But I should warn you that my path is not your path, and some of these will waste your time if you don't have direction. Don't just copy me; skip what you can.
comment by sone3d · 2018-08-09T18:03:57.639Z · LW(p) · GW(p)
Could you share your list of Eliezer-recommended books?
↑ comment by Noah Topper (noah-topper) · 2018-08-10T21:34:39.355Z · LW(p) · GW(p)
Well, I tend to throw them onto my general to-read list, so I'm not entirely sure. A few I remember are Gödel, Escher, Bach; Judgment Under Uncertainty: Heuristics and Biases; Influence: Science and Practice; The End of Time; QED: The Strange Theory of Light and Matter; The Feynman Lectures; Surely You're Joking, Mr. Feynman!; Probability Theory: The Logic of Science; Probabilistic Reasoning in Intelligent Systems; and The Player of Games. There's a longer list here, but it's marked as outdated.
comment by norswap · 2018-08-09T13:47:03.532Z · LW(p) · GW(p)
I've been thinking about this too, and I'm not sure guides suffice. Getting in shape or learning about a topic are simple problems (not that they can't be challenging in their own right) compared to the complexity of actually achieving something.
At this point, we don't even have good theories or hypotheses on why these things are hard. It's a lot of small issues that aggregate and compound. Motivation is a big class of these issues. Another is not seeing clearly enough: failing to perceive dangers, opportunities, and alternative ways of doing things.
To achieve something, you have to get the strategy, the tactics, and the operations right. There's a lot you can screw up at every level.
One key issue, I think, is that it's damn hard to hack yourself on some fundamental levels: for instance, to "be more perceptive". You can't really install a TAP (trigger-action plan) for that. I guess some mindfulness practice can help (although I'd be wary of prescribing meditation; more like mindfulness on the move). Consuming self-help, insights, news, etc. only seems to move the needle marginally.
So yeah, I don't know. Just throwing some ideas out there.
Something like this: https://www.lesswrong.com/posts/qwdupkFd6kmeZHYXy/build-small-skills-in-the-right-order [LW · GW] might be a nice starting point. Maybe, just maybe, we're trying to lift heavy weights without having built the required muscles. Worth investigating and expanding.
↑ comment by [deleted] · 2018-08-09T23:09:38.266Z · LW(p) · GW(p)
I think this is largely correct and points at where some of the larger bottlenecks are.
It's not about finding a list of good resources. There are a lot of those already. It's about what happens next. Things like:
- Getting yourself to actually read said resources.
- Figuring out ways of making the material stick.
- Looking for applications, tracking your progress.
- Repeating all of the above, over and over.
↑ comment by Noah Topper (noah-topper) · 2018-08-09T14:49:51.599Z · LW(p) · GW(p)
I definitely agree that there's a bigger issue, but I think this could be a good small-scale test. Can we apply our own individual rationality to pick up skills relevant to us and distinguish between good and bad practices? Are we able to coordinate as a community to distinguish between good and bad science? Rationality should in theory be able to work on big problems, but we're never going to be able to craft the perfect art without testing it on smaller problems first and honing the skill.
So yeah. I think a guide putting together good resources, along with practical advice in the posts and comments, could be useful. Something like this could be the start of answering Eliezer's questions "how do we test which schools of rationality work" and "how do we fight akrasia". That second question might be easier once we've seen the skills work in practice. Maybe I should make a guide first to get the ball rolling, but I'm not sure I know any topic in enough depth to craft one just yet.
↑ comment by tinyanon (aaron-teetor) · 2018-08-09T17:45:31.563Z · LW(p) · GW(p)
Give me a month to make a fitness one. I train a bunch of my friends, including one rationalist who has been pushing me to write up analyses of studies, so I have a good amount of experience finding ways into fitness for people who've struggled against the baser urge to just sit down and conserve calories.
comment by gilch · 2018-08-10T02:51:18.941Z · LW(p) · GW(p)
How about A Rationalist's Guide to Early Retirement?
Or, If you're so rational, why ain'cha rich?
OK, so some of you got lucky with the Bitcoins. Can we do any better than buy-and-hold the index (like Wealthfront, etc.)? Option spreads? Tax liens? Arbitraging junk on eBay? Do you have to start a company, or can you do just as well as a consultant? Are there easy ways to start up passive income, or do you need a rich uncle?
↑ comment by ChristianKl · 2018-08-13T05:26:41.142Z · LW(p) · GW(p)
If I had a really easy way to do a task that produces good passive income, I would hire low-wage people to do the task until the opportunity was saturated, instead of writing it up.
Most good ways to make money involve leveraging what Peter Thiel calls secrets, which derive from local knowledge.
comment by gilch · 2018-08-10T02:29:58.184Z · LW(p) · GW(p)
I'd like A Rationalist's Guide to Signing Up for Cryonics.
Suppose you've made the decision and have the finances to do it. How do you go about it? Which institution would have better expected outcomes? Neuro or full-body? Which life insurance company? What do you tell your family? How can you best ensure that you actually get frozen before your brain rots in case of unforeseen accidental death, as opposed to a more convenient death from age or disease in a controlled setting like a hospital? (Accidental death is what we might expect in a younger age group.)
comment by gilch · 2018-08-10T01:44:38.466Z · LW(p) · GW(p)
I would like to have A Rationalist's Guide to Reading Science.
Particularly, how to benefit from existing scientific publications and understand them well enough to write a Rationalist's Guide to X or Much More Than You Wanted to Know About X, where X is some field without common knowledge consensus, like medicine or diet or exercise or psychology.
Reading science news headlines seems suboptimal. How confident can we be in any particular study? We know there are some perverse incentives in science. Publish or perish, new discoveries more valued than replications, p-hacking, etc. What should we be wary of? How much training do we need in the field? Is this impossible without a degree in statistics?
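A guide like that could start with the arithmetic behind "how confident can we be in any particular study?". Given a field's base rate of true hypotheses, a study's statistical power, and its significance threshold, Bayes' theorem gives the probability that a significant finding is real (the calculation behind Ioannidis's "Why Most Published Research Findings Are False"). A minimal sketch in Python; the example numbers are illustrative assumptions, not measured field data:

```python
def prob_finding_is_true(prior, power, alpha=0.05):
    """Posterior probability that a statistically significant finding
    reflects a real effect (ignoring bias and p-hacking).

    prior: base rate of true hypotheses among those tested in the field
    power: P(significant | effect is real), i.e. 1 - beta
    alpha: P(significant | no effect), the false-positive rate
    """
    true_positives = power * prior
    false_positives = alpha * (1 - prior)
    return true_positives / (true_positives + false_positives)

# Illustrative assumptions, not field data: a speculative field
# testing mostly long-shot hypotheses with decent power...
print(prob_finding_is_true(prior=0.1, power=0.8))   # ~0.64
# ...versus a well-grounded field running high-powered studies.
print(prob_finding_is_true(prior=0.5, power=0.9))   # ~0.95
```

The point generalizes: in fields testing mostly long-shot hypotheses with underpowered studies, a single significant result should move you much less than the headline suggests.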
comment by gilch · 2018-08-10T02:21:50.995Z · LW(p) · GW(p)
I would like A Rationalist's Guide to Personal Catastrophic Risks.
We like to think a lot about Global Catastrophic Risks (especially the EA folks), but there are smaller problems that are just as devastating to the individual.
Should we wear helmets in cars [LW · GW]? Should we wear covert body armor? Own a gun? Get a bug-out bag [LW · GW]? An emergency cube? Learn wilderness survival?
And how much should we be concerned about those "survivalist" topics vs. less obvious longevity steps like flossing your teeth [LW · GW]? Not everyone's risk profile is the same. How do we assess that?
How should we measure that? Dollars? QALYs? Micromorts? Should we use hyperbolic discounting? Do we expect to reach actuarial escape velocity (or be granted near-immortality after the Singularity) and how would that change the calculus?
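One candidate common currency is the micromort, a one-in-a-million chance of death; multiplying by a (contestable) dollar value of a statistical life turns annual risks into expected costs you can rank against each other. A toy sketch in Python; every activity figure below is a placeholder assumption, not a sourced estimate:

```python
# Toy ranking of annual risks in micromorts (1 micromort = a
# one-in-a-million chance of death). All activity figures are
# placeholder assumptions for illustration, not sourced estimates.
MICROMORT = 1e-6
VALUE_OF_LIFE = 10_000_000  # assumed dollar value of a statistical life

activities = {
    "daily car commute": 50,      # hypothetical micromorts/year
    "weekend motorcycling": 500,  # hypothetical
    "never flossing": 5,          # hypothetical
}

for name, mm_per_year in sorted(activities.items(),
                                key=lambda kv: -kv[1]):
    p_death = mm_per_year * MICROMORT
    expected_cost = p_death * VALUE_OF_LIFE
    print(f"{name}: {mm_per_year} micromorts/yr "
          f"= ${expected_cost:,.0f}/yr expected cost")
```

Discounting and beliefs about actuarial escape velocity would then enter as adjustments to the value-of-life term.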
Do anthropic effects matter to subjective survival? In the multiverse?
Consider also other catastrophes that don't kill you, like losing a limb, or going blind, or more social risks like identity theft or getting scammed or robbed or sued, etc.
comment by Stefan De Young (stefan-de-young) · 2018-08-09T14:50:53.370Z · LW(p) · GW(p)
This also interests me. Some of my hopes for the Ottawa Rationality Dojo are that we can assemble people who are interested in skill development, and that we can build curricula that are useful to similar groups. I'm not convinced that I'm the kind of person who could follow these curricula alone or with an online community, so I'm trying to build it locally.
One major concern that I have with trying to build a curriculum for instrumental rationality is that the art must have a purpose beyond itself. I believe that it is for this reason that CFAR has realigned itself from teaching instrumental rationality to teaching instrumental rationality for AI alignment.
At the upcoming local SSC meetups, I will be asking "For what purpose do you intend to develop the art?" If I get answers, I'll post them to LW.
↑ comment by Noah Topper (noah-topper) · 2018-08-09T14:58:03.767Z · LW(p) · GW(p)
Sounds awesome! A meatspace group would be great, I'm sure. One of my issues with self-study is having nobody to go to when I have questions or don't understand something. Having an empirical goal can also tell you if you've succeeded or failed in your attempt to learn the art.
↑ comment by Stefan De Young (stefan-de-young) · 2018-08-09T19:33:08.812Z · LW(p) · GW(p)
Having a group of rationalists to talk to in person has been invaluable to me this year. It's helping me emerge from depression, overcome impostor syndrome, and find my project. The previous sentence reads like the battles have been won, but they are being fought constantly.
Check this list of upcoming meetups: https://slatestarcodex.com/2018/08/09/ssc-meetups-2018-times-and-places/
Right now is a really good time to start or join meatspace communities.
comment by Screwtape · 2018-08-13T20:49:27.216Z · LW(p) · GW(p)
First of all, I'd just like to take a moment to say that I quite appreciate your username.
Second, to take your initial question literally, I don't think there are that many rationalists who actually want to rule the world. The position sounds like it would involve paperwork and talking to uninteresting yet obstinate people, so speaking for myself I don't think I'd actually want the job. There are probably many rationalists who would take the position for instrumental reasons, but because it's an instrumentally useful job, the competition for it is fierce. I'm not saying you meant it literally, but I think the distinction points at something important; what is it we actually want?
I'd like to be more in shape, to work on more interesting programming projects, and to go on more successful dates. I'd pretty cheerfully read a guide on those subjects, and would probably be amenable to contributing to such a guide. Somebody else might want to save more lives, or have a higher-class lifestyle, or lead a more interesting and exciting life. Some skills are generically useful for a large range of goals (financial management, persuasion, etc.), but something that might be crucial to my goals might be irrelevant to yours. In addition, the format of whatever we're learning from matters; when learning to work out, a YouTube video is probably more useful than text. I would love to see more essays in the vein of SSC's Much More Than You Wanted To Know, but audio lectures, videos, or illustrations are good too. (Have you ever tried learning martial arts from a textbook? It's not ideal.)
Lastly, something worth thinking about: we all have the internet, and can all ask Google for information. What advantage does a rationalist repository of teachings have? I'm confident we have some (offhand, we have common jargon, possibly a willingness to do experiments, and the occasional dedicated specialist), but if we want to do more than toss a lot of blogs in a pot and stir, it might be good to keep the comparative advantages in mind.
comment by Elo · 2018-08-09T06:29:19.816Z · LW(p) · GW(p)
The rationalists are winning. You are not looking carefully enough.
↑ comment by Timothy Johnson (timothy-johnson) · 2018-08-09T10:21:27.448Z · LW(p) · GW(p)
Can you be more specific? What evidence leads you to believe that rationalists are winning?
↑ comment by norswap · 2018-08-09T13:38:25.535Z · LW(p) · GW(p)
Elo's a nice guy, but I have no idea what he's talking about either.
Maybe rationality improves your quality of life or subjective well-being; there is certainly evidence for that.
But in terms of accomplishing more material and outwardly visible goals, you're right that the evidence is scant. CFAR and EA could be evidence, but there are a lot of non-rat institutions that perform well too.
↑ comment by habryka (habryka4) · 2018-08-09T16:40:39.879Z · LW(p) · GW(p)
I think overall the success of EA and Rationality is pretty visible. Open Phil has access to over 10 billion dollars, which makes them one of the 10 largest foundations in the world; we have successfully created safety teams at many of the world's top AI labs, have had many of the world's top entrepreneurs and researchers speak at our conferences, and generally seem to be doing much better at achieving our goals than I bet almost anyone would have naively expected had you asked them in 2010.
Obviously, not everyone who reads LessWrong suddenly develops superpowers, and generally as communities grow the average level of success or competence goes down, but in aggregate I think we are doing pretty well.
(Note: I don't think most of Open Phil would self-identify as rationalist, but their focus on AI alignment in particular seems heavily influenced by the rationality community, and in general I think a lot of the staff at Open Phil are executing the kind of algorithms we usually describe here as "the art of rationality". Many of them have read LessWrong and found it quite valuable.)
↑ comment by DanielFilan · 2018-08-09T17:24:43.142Z · LW(p) · GW(p)
I agree with the thrust of this comment, but I'd like to push back against "have had many of the world's top entrepreneurs and researchers speak at our conferences" as a measure of success (although perhaps it's a predictor in the right context).
↑ comment by habryka (habryka4) · 2018-08-10T02:08:12.389Z · LW(p) · GW(p)
Agree that it's a weaker one; I guess it comes up for me because I worked more directly on it :P
↑ comment by Elo · 2018-08-09T11:03:12.261Z · LW(p) · GW(p)
No. You will have to see for yourself. Of course you'd have to be looking for that to work.
↑ comment by Aiyen · 2018-08-10T16:20:30.334Z · LW(p) · GW(p)
...? "Winning" isn't just an abstraction, actually winning means getting something you value. Now, maybe many rationalists are in fact winning, but if so, there are specific values we're attaining. It shouldn't be hard to delineate them.
It should look like, "This person got a new job that makes them much happier, that person lost weight on an evidence-based diet after failing to do so on a string of other diets, this other person found a significant other once they started practicing Alicorn's self-awareness techniques and learned to accept their nervousness on a first date..." It might even look like, "This person developed a new technology and is currently working on a startup to build more prototypes."
In none of these cases should it be hard to explain how we're winning, nor should Elo's "not looking carefully enough" be an issue. Even if the wins are limited to subjective well-being, you should at least be able to explain that! Do you believe that we're winning, or do you merely believe you believe it?