What should normal people do?

post by seez · 2013-10-25T02:28:50.885Z · LW · GW · Legacy · 94 comments


What should a not-very-smart person do?  Suppose you know a not-very-smart person (around or below average intelligence).  S/he read about rationality, has utilitarian inclinations, and wants to make the world better.  However, s/he isn't smart enough to discover new knowledge in most fields, or contribute very much to a conversation of more knowledgeable experts on a given topic.  Let's assume s/he has no exceptional talents in any area.

How do you think a person like that could best use his/her time and energy?  What would you tell the person to do?  This person may be, compared to average LW readership, less capable of noticing the irrationality in his/her actions even if s/he wants to be rid of it, and less easily able to notice the flaws in a bad argument.  S/he may never be able to deeply understand why certain arguments are correct, certain scientific facts have to be the way they are, and telling him/her to be unsure or sure about anything seems dangerous if s/he doesn't really understand why.  

My practical advice might be:

1) If you want to give to charity, follow GiveWell recommendations.  

2) Learn about the basic biases, and commit to resisting them in your own life. 

3)  Follow advice that has been tested and correctly predicts a positive outcome.  If a hypothesis is untestable (there's an undetectable dragon in your garage), doesn't predict anything (fire comes from phlogiston in combustible substances), or is tested and demonstrably false (god will smite you if you say it doesn't exist), don't waste time and energy on it.  If you want to improve, look for tested methods that have significant positive results relevant to the area of interest.  Similarly, if a person regularly gives you advice that does not lead to good outcomes, stop following it, and if someone gives advice that leads to good outcomes, start paying attention even if you like that person less.  

 

At a more general level, my thoughts are tentative, but might include basic LW tenets such as:

1) Don't be afraid of the truth, because you're already enduring it.

2) If all the experts in a field agree on something, they might be wrong, but you are extremely unlikely to be better at uncovering the truth, so follow their advice, which might appear to conflict with...

3) Don't trust deep wisdom.  Use Occam's razor: think about simple, basic reasons something might be true (this seems good for religion and moral issues, bad for scientific ideas and understanding).

4) If you find yourself flinching away from an idea, notice that, and give it extra attention.  

Note:  I mean this as a serious, and hopefully non-insulting question.  Most people are intellectually near-average or below-average, and I have not seen extensive discussion on how to help them lead happier lives that make the world a better place. 

94 comments

Comments sorted by top scores.

comment by summerstay · 2013-10-25T16:13:35.775Z · LW(p) · GW(p)

Here's my advice: always check Snopes before forwarding anything.

Replies from: shminux
comment by Shmi (shminux) · 2013-10-25T17:05:04.074Z · LW(p) · GW(p)

I wish there were a checkbox in email sites and clients: "check incoming messages against known urban myths". Probably no harder to implement than the current automatic scam and spam filtering.
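
For what it's worth, a toy version of that checkbox is easy to sketch. Here is a minimal, hedged example in Python; the hoax phrases and similarity threshold are made up for illustration, and a real filter would need a maintained hoax database:

    # Toy "check against known urban myths" filter: flag a message if it
    # contains, or closely resembles, a known hoax phrase.
    import difflib

    KNOWN_HOAXES = [
        "forward this to ten friends or your account will be deleted",
        "bill gates will donate one dollar for every forward",
    ]

    def looks_like_hoax(message_body, threshold=0.8):
        body = message_body.lower()
        for hoax in KNOWN_HOAXES:
            similar = difflib.SequenceMatcher(None, hoax, body).ratio() > threshold
            if hoax in body or similar:
                return True
        return False

    print(looks_like_hoax("Bill Gates will donate one dollar for every forward!"))  # True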

Replies from: None, Viliam_Bur
comment by [deleted] · 2013-10-26T13:42:07.604Z · LW(p) · GW(p)

Do people actually still get those things? I have literally never received one of those chain letters or story-forwardings.

comment by Viliam_Bur · 2013-10-26T09:19:13.254Z · LW(p) · GW(p)

Then there would be a next level in the arms race. Just like spammers used to add "this is not spam" disclaimers, people who create hoax mails would add something like: "When you send this e-mail to your friends, ask them later whether they received it, because [someone] is removing criticism against them from the internet."

Or the hoaxes would be sent as attached images.

Replies from: shminux
comment by Shmi (shminux) · 2013-10-26T19:14:40.960Z · LW(p) · GW(p)

Then there would be a next level in the arms race.

There is hardly any arms race with spam anymore. Gmail detects virtually 100% of it these days. Maybe a few spam messages a year make it through to my Inbox.

comment by James_Miller · 2013-10-25T17:05:44.709Z · LW(p) · GW(p)

More actionable rules might be better such as:

Wear a seat belt when driving. Save 10% of your income through your pension plan's index fund option. Don't smoke. Practice safe sex. Sign up for cryonics.

Replies from: Gunnar_Zarncke
comment by Gunnar_Zarncke · 2013-10-26T11:44:55.181Z · LW(p) · GW(p)

Wear a seat belt when driving.

Don't smoke.

Practice safe sex.

Safe bets. These significantly increase your life expectancy and health at almost no cost.

Save 10% of your income through your pension plan's index fund option.

This heavily depends on your age, country, social level (which affects future discounting), and so on, and is thus questionable as general advice.

Sign up for cryonics.

I don't know what kind of advice this is. Sure, you are convinced that it may be right for you, but it is as far away from item 2) above as it can be.

From the standpoint of normal people, cryonics is not very different from other afterlife memes, and thus adding it to the list risks discrediting (to normal people) the whole list.

comment by cousin_it · 2013-10-25T11:13:16.539Z · LW(p) · GW(p)

It seems to me that the advice about Givewell has a lot of evidence behind it, but the rest of the advice doesn't have much evidence that it gives any benefit at all, for people of average intelligence or otherwise. It would be good to have a Givewell-like project that evaluated the costs and benefits of following various rationality advice.

Replies from: jkaufman, RolfAndreassen
comment by jefftk (jkaufman) · 2013-10-25T17:53:01.511Z · LW(p) · GW(p)

a Givewell-like project that evaluated the costs and benefits of following various rationality advice

CFAR is kind of working along these lines.

comment by RolfAndreassen · 2013-10-25T14:25:38.585Z · LW(p) · GW(p)

Heck, just having some kind of metric to see whether people were following rationality advice would be a big step forward. We can get a visceral impression that someone is more or less formidable, and we can spot patterns of repeated mistakes, but we don't really have a good way of seeing the extent to which someone is applying rationality advice in their daily lives. (Of course this is just a restatement of the good old "Rationality Dojo" problem, one of the very first posts in the Sequences.) Paper tests don't really capture the ability to apply the lessons to real-world problems that people actually care about.

comment by syllogism · 2013-10-28T11:45:20.514Z · LW(p) · GW(p)

I have a fairly wide variety of friends. Here's some advice I find myself giving often, because it seems to cover a lot of what I think are the most common problems. The wording here isn't how I'd say it to them.

Health and lifestyle

  • Don't engage the services of any non-evidence based medical practitioner.
  • If you have a health problem, you're receiving treatment/advice, and not obviously improving, get a second opinion. And probably also a third. (I am not in the US)
  • Don't smoke cigarettes. If you already smoke cigarettes, buy an e-cigarette (like, right now), even if you're sure you'll never use it. Once you've bought it, do try to use it, even if you don't think you'll like it.
  • Always use barriers for vaginal and anal penetration outside a monogamous relationship
  • Use either barriers or hormonal birth control for p-in-v intercourse
  • If you or your partner is having non-monogamous sex, get STI tests twice a year
  • If you frequently feel depressed or anxious, see a psychologist who practices cognitive behavioural therapy. Do not see one who practices psychoanalysis (see "evidence based health care"). Expect to trial several psychologists before finding one who is a good fit for you (see: "if health care not working, seek second opinion").

Personal finance

  • Don't buy cars new. There's an established value-point for depreciation vs maintenance costs for cars. Use that.
  • Don't get a credit card
  • Don't take personal loans unless the interest is dominated by the cost of having no liquidity (e.g. eviction)
  • If you can't maintain a monthly budget, give yourself a weekly budget. If you still can't maintain a weekly budget, give yourself a daily budget. A budget over a shorter time period is much less convenient --- but failing to maintain a budget at all and going broke is less convenient still.
  • Never gamble, including playing the lottery
  • Don't invest in individual stocks, or pay someone else to invest your money in individual stocks.
  • Comply with your local tax laws, including filing your taxes on time. Know that people smarter than you are being employed to catch tax cheats who are also smarter than you. You will likely lose.

Career

  • Avoid "winner takes all" professions, e.g. sports, music, academia in fields with no industrial application, etc
  • Judge potential careers by expected value, not best or worst possible outcome. Look at median salaries and working conditions.
  • Only enrol in tertiary/graduate education with a specific career outcome in mind
  • Recognise the true cost of tertiary degrees, in opportunity cost as well as course fees. e.g. a full-fee-paid PhD program with a modest stipend is still enormously expensive
  • Recognise that reputational capital distorts labor markets. Teachers are paid poorly because they're well respected, not despite being well respected. The how-do-you-do value of having a well-respected career is not that enduring, so it's probably over-valued. It's better to avoid careers with excess reputational capital.
  • Consider careers in the allied health professions. These professions have the best job security: the aging population ensures rising demand, they must be performed locally, and most seem like unlikely targets for automation or disruption. They also offer lots of human contact, which many find produces good job satisfaction. But, because they don't require medical degrees, they are fairly neutral in reputational capital, so your peers don't just work themselves to death to win zero-sum positional games against you.

Law and the justice system

  • Avoid dominance contests. Learn to display submission readily, meekly and convincingly.
  • Regard police officers as people who have power over you. They do. Their power is broad, and not constrained by some set of written rules. If you challenge them to a dominance contest, you will likely lose. They will fuck you up. And even if you "win", it'll be a pyrrhic victory --- you'll still have been better off not getting into it in the first place.
  • Power is not moral authority. Someone may have real power over you, with no legitimate right to it.
  • Moral authority is not power. Even if you have the legitimate right to do something, someone may have the power to stop you.
Replies from: army1987, ChristianKl, Lumifer
comment by A1987dM (army1987) · 2013-10-28T19:27:44.812Z · LW(p) · GW(p)

Never gamble, including playing the lottery

Make that “never gamble large sums of money” -- spending €7 for a poker tournament with your friends isn't obviously worse than spending €7 on a movie ticket IMO.

I agree about pretty much all of the list -- and most of it is also good advice for pretty much all people, not just normal ones.

comment by ChristianKl · 2014-01-29T19:45:51.290Z · LW(p) · GW(p)

Don't get a credit card

"Don't hold a balance on a credit card" might be valid general advice. There are, however, many cases where the miles or cashback you can get through credit cards provide a valuable benefit.

It also builds a credit rating that might be valuable for getting a mortgage, and given the tax-deductible status of mortgages for buying a home, they aren't completely bad.

Replies from: Nornagest
comment by Nornagest · 2014-01-29T20:22:00.116Z · LW(p) · GW(p)

I don't think miles and cashback are the primary benefit of a credit card, although they're handy. A credit rating on the other hand is very important: at least in the US, finding housing (including apartment housing) is seriously complicated by having bad or no credit, and the same goes for buying vehicles or anything else customarily paid for on an installment plan. Making major purchases on credit is a good deal if you think hanging onto the money is worth more to you yearly than the APR, which isn't unlikely if you're investing; basically it's leverage.
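
As a rough illustration of that trade-off (all numbers below are hypothetical, not a recommendation):

    # Financing a purchase only pays off if the expected return on the cash
    # you keep invested exceeds the loan's APR. Hypothetical numbers.
    principal = 10_000          # purchase you could pay in cash but finance instead
    apr = 0.04                  # interest rate on the installment loan
    expected_return = 0.07      # what you expect the retained cash to earn per year

    net_benefit_per_year = principal * (expected_return - apr)
    print(f"Expected net benefit per year: ${net_benefit_per_year:.0f}")  # $300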

If you find credit cards morally objectionable or are absent-minded enough not to always pay them off on time, you can probably drop the card once you've been approved for an automotive loan or something comparably serious. I use mine to handle gas and certain other predictable expenses.

comment by Lumifer · 2013-10-28T16:18:17.526Z · LW(p) · GW(p)

That's an interesting list. Without going into individual items, what goal structure does it support?

In other words, what is it that you want to do (or, maybe, be) that following the advice on this list enables?

comment by Shmi (shminux) · 2013-10-25T17:20:27.689Z · LW(p) · GW(p)

Just wanted to point out an implicit and not necessarily correct assumption, leading to poor-quality advice:

Suppose you know a not-very-smart person (around or below average intelligence)

It seems that you assume that intelligence is one-dimensional. In my experience, while there is a correlation, most people are smarter in some areas than in others. For example, a mathematical genius may be incapable of introspection and have no interest in rational thinking outside math. Let's take your example:

S/he read about rationality, has utilitarian inclinations, and wants to make the world better. However, s/he isn't smart enough to discover new knowledge in most fields, or contribute very much to a conversation of more knowledgeable experts on a given topic. Let's assume s/he has no exceptional talents in any area.

First, an "average person" does not read about rationality and has no "utilitarian inclinations". They often do want to make the world better if socially conditioned to do so by their church or by the TV commercials showing a sick child in the 3rd world whom you can save for a dollar a day or something. So, the person you describe is not "average".

Second, this "average person" might be (and likely is) intelligent in a way that does not show up on the IQ tests: he or she might be unusually good at running a corner store, or being a great parent, or whatever. Some of the talents may be latent, because they had no chance of being manifested. I would still call it "intelligence" by Eliezer's definition: ability to optimize the universe, or at least some small slice of it.

As a consequence, your advice is suspiciously indistinguishable from the one you'd give an "LW-smart" person. My inclination would be to find this person's area of aptitude and offer custom advice that plays to their strengths.

Replies from: army1987, seez
comment by A1987dM (army1987) · 2013-10-26T09:11:30.720Z · LW(p) · GW(p)

I would still call it "intelligence" by Eliezer's definition: ability to optimize the universe, or at least some small slice of it.

IIRC the optimization power has to be cross-domain according to his definition, otherwise Deep Blue would count as intelligent.

Replies from: magfrump, shminux
comment by magfrump · 2013-10-26T19:13:02.288Z · LW(p) · GW(p)

That doesn't seem to count as a problem with the above definition. Taboo "intelligent." Is Deep Blue an optimizing process that successfully optimizes a small part of the universe?

Yes.

Is it an optimizing process that should count as sentient for the purposes of having legal rights? Should we be worried about it taking over the world?

No.

comment by Shmi (shminux) · 2013-10-26T19:11:05.629Z · LW(p) · GW(p)

Deep Blue is a narrow AI...

comment by seez · 2013-10-27T01:04:43.945Z · LW(p) · GW(p)

I agree that the word intelligence is too vague, but I'm specifically not including a mathematical genius (who would have an exceptional talent in the area of mathematics).

I strongly disagree that average people can't or don't have utilitarian inclinations. I think utilitarianism is one of the easiest philosophies to grasp, and I know a lot of average-IQ people who express the desire to "do as much good as possible" or "help as many people as possible." Even the advertisements for charities that you mention tend to stress how much good can be achieved with how little money.

It's certainly good to customize advice, but I think there is a class of advice I would offer to smart, skeptical people that I would hesitate to give to others. For example, I would tell my brightest students to question expert advice, because then they can more deeply understand why experts think what they do, or potentially uncover a true fault in expert reasoning. With my less-bright pupils, I find that this pushes towards conspiracy theories and pseudoscience, and thus more frequently advise them to trust experts and distrust people on the fringe. When smart people question mainstream scientific thinking, they may go astray. In my experience, when average-or-below intelligence people question mainstream scientific thinking, they almost always go astray, and when they don't it's usually coincidence.

I'm trying to figure out how to help them understand things more deeply and question things in a more productive manner, and definitely borrowing lots of ideas from LW, but I still think there is a lot more room for improvement.

Replies from: Lumifer
comment by Lumifer · 2013-10-27T03:07:22.167Z · LW(p) · GW(p)

I know a lot of average-IQ people who express the desire to "do as much good as possible" or "help as many people as possible."

I'm sure they express the desire, but do they actually desire it and do they actually do it?

comment by David_Gerard · 2013-10-25T13:43:19.889Z · LW(p) · GW(p)

Read Yvain's Epistemic Learned Helplessness. You can be convinced of anything by good arguing, but forewarned is forearmed.

comment by Mestroyer · 2013-10-25T02:53:09.228Z · LW(p) · GW(p)

"Study rationality anyway. Work harder to make up for your lack of intelligence." I don't think most of LessWrong's material is out of reach of an average-intelligence person.

"Think about exactly what people mean by words when they use them; there are all kinds of tricks to watch out for involving subtle variations of a word's meaning." Read Yvain's "The Worst Argument in the World"..

"Don't fall for the sunk cost fallacy, which is what you're doing when you say 'This movie I'm watching sucks [absolutely, not just relative to what you'd expect for the cost], but I'm gonna keep watching, because I already payed to get in.'"

"Your brain loves to lie to you about the reason you want to do something. For example, maybe you're thinking about moving to a new job. You don't get along well with one of your current coworkers, but you don't think this is a good reason to get a new job, so you refuse to take that into consideration. Your brain makes a bigger deal of minor advantages of the new job to compensate. Learn to recognize these lies."

Replies from: ShardPhoenix, None, BaconServ
comment by ShardPhoenix · 2013-10-25T10:39:20.025Z · LW(p) · GW(p)

I don't think most of LessWrong's material is out of reach of an average-intelligence person.

Wasn't the average IQ here from the survey something like 130+?

Replies from: maia, DavidAgain, Moss_Piglet, ThisSpaceAvailable, Dentin
comment by maia · 2013-10-25T14:12:52.746Z · LW(p) · GW(p)

These statements don't necessarily contradict each other. Even if average-intelligence people don't read Less Wrong, perhaps they could. Personally, I suspect it's more because of a lack of interest (and perhaps a constellation of social factors).

comment by DavidAgain · 2013-10-25T15:09:41.937Z · LW(p) · GW(p)

I bet the average LessWrong person has a great sense of humour and feels things more than other people, too.

Seriously, every informal IQ survey amongst a group/forum I have seen reports very high IQ. My (vague) memories of the LessWrong one included people who seemed to be off the scale (I don't mean very bright; I mean that such IQs either have never been given out in official testing, as opposed to online tests, or possibly just can't be got on those tests at all and people were lying).

There's always a massive bias in self-reporting: those will only be emphasised on an intellectual website that starts the survey post by saying that LessWrongers are, on average, in the top 0.11% for SATs, and gives pre-packaged excuses for not reporting inconvenient results - "Many people would prefer not to have people knowing their scores. That's great, but please please please do post it anonymously. Especially if it's a low one, but not if it's low because you rushed the test", (my emphasis).

If there's a reason to be interested in average IQ beyond mutual ego-massage, I guess the best way would be to have an IQ test where you logged on as 'Less Wrong member X' and then it reported all the results, not just the ones that people chose to share. And where it revealed how many people pulled out halfway through (to avoid people bailing if they weren't doing well).

Replies from: Luke_A_Somers, bramflakes, somervta
comment by Luke_A_Somers · 2013-10-25T15:39:38.089Z · LW(p) · GW(p)

Selection bias - which groups and forums actually asked about IQ?

Your average knitting/auto maintenance/comic book forum probably has a lower average IQ but doesn't think to ask. And of course we're already selecting a little just by taking the figures off of web forums, which are a little on the cerebral side.

Replies from: DavidAgain
comment by DavidAgain · 2013-10-25T16:43:57.396Z · LW(p) · GW(p)

True. I don't think I can define the precise level of inaccuracy or anything. My point is not that I've detected the true signal: it's that there's too much noise for there to be a useful signal.

Do I think the average LessWronger has a higher IQ? Sure. But that's nothing remotely to do with this survey. It's just too flawed to give me any particularly useful information. I would probably update my view of LW intelligence more based on its existence than its results. In that respect, reading the thread lowers my opinion of LW intelligence, simply because this forum is usually massively more rational and self-questioning than every other forum I've been on, which I would guess is associated with high IQ, and people taking the survey seriously is one of the clearest exceptions.

BTW, I'm not sure your assessments of knitting/auto maintenance/comic books/web forums are necessarily accurate. I'm not sure I have enough information on any of them to reasonably guess their intelligence. Forums are particularly exceptional in terms of showing amazing intelligence and incredible stupidity side by side.

Replies from: Luke_A_Somers
comment by Luke_A_Somers · 2013-10-25T17:50:52.952Z · LW(p) · GW(p)

People with high IQ have extra power to be exceptionally stupid.

comment by bramflakes · 2013-10-25T15:35:02.730Z · LW(p) · GW(p)

If there's a reason to be interested in average IQ beyond mutual ego-massage, I guess the best way would be to have an IQ test where you logged on as 'Less Wrong member X' and then it reported all the results, not just the ones that people chose to share.

Would still suffer from selection effects. People that thought they might not do so well would be disinclined to do it, and people who knew they were hot shit would be extra inclined to do it. The phrase "anonymous survey" doesn't really penetrate into our status-aware hindbrains.

Replies from: DavidAgain
comment by DavidAgain · 2013-10-26T16:43:45.019Z · LW(p) · GW(p)

Yep! But it's the best way I can imagine that someone could plausibly create on the forum.

Replies from: DSherron
comment by DSherron · 2013-10-26T21:08:26.364Z · LW(p) · GW(p)

Better: randomly select a group of users (within some minimal activity criteria) and offer the test directly to that group. Publicly state the names of those selected (make it a short list, so that people actually read it, maybe 10-20) and then after a certain amount of time give another public list of those who did or didn't take it, along with the results (although don't associate results with names). That will get you better participation, and the fact that you have taken a group of known size makes it much easier to give outer bounds on the size of the selection effect caused by people not participating.

You can also improve participation by giving those users an easily accessible icon on Less Wrong itself which takes them directly to the test, and maybe a popup reminder once a day or so when they log on to the site if they've been selected but haven't done it yet. Requires moderate coding.
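
A minimal sketch of the sampling step described above (the user fields, activity cutoff, and sample size here are placeholders, not a real LW export):

    # Draw a fixed-size random sample from users meeting a minimal-activity
    # cutoff; knowing the sample size up front makes it easier to bound the
    # selection effect from non-response.
    import random

    users = [
        {"name": "alice", "comments_last_year": 40},
        {"name": "bob", "comments_last_year": 3},
        {"name": "carol", "comments_last_year": 120},
    ]

    MIN_COMMENTS = 10      # minimal activity criterion (placeholder)
    SAMPLE_SIZE = 2        # the comment suggests 10-20 for the real thing

    eligible = [u["name"] for u in users if u["comments_last_year"] >= MIN_COMMENTS]
    selected = random.sample(eligible, min(SAMPLE_SIZE, len(eligible)))
    print(selected)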

Replies from: epursimuove
comment by epursimuove · 2013-10-30T01:02:46.910Z · LW(p) · GW(p)

I would find such a feature to be extraordinarily obnoxious, to the point that I'd be inclined to refuse such a test purely out of anger (and my scores are not at all embarrassing). I can't think of any other examples of a website threatening to publicly shame you for non-compliance.

comment by somervta · 2013-10-26T04:39:45.139Z · LW(p) · GW(p)

btw, in Markdown use double asterisks at each end for bold, like this: **bold text** (two at the start and two at the end).

comment by Moss_Piglet · 2013-10-27T14:27:58.519Z · LW(p) · GW(p)

Wasn't the average IQ here from the survey something like 130+?

The average self-reported IQ.

If we really wanted to measure LWs collective IQ, I'd suggest using the education data as a proxy; we have fairly good information about average IQs by degree and major, and people with less educational history will likely be much less reticent to answer than those with a low IQ test result since there are so many celebrated geniuses who didn't complete their schooling.

Replies from: Nornagest
comment by Nornagest · 2013-10-27T19:16:16.585Z · LW(p) · GW(p)

The average tested IQ on the survey was about 125, which is close to my estimate of the true average IQ around here; I don't entirely trust the testing site that Yvain used, but I think it's skewing low, and that ought to counteract some of the reporting bias that I'd still expect to see.

125 is pretty much in line with what you'd expect if you assume that everyone here is, or is going to be, a four-year college graduate in math, philosophy, or a math-heavy science or engineering field (source). That's untrue as stated, of course, but we do skew that way pretty hard, and I'm prepared to assume that the average contributor has that kind of intellectual chops.

Replies from: Moss_Piglet
comment by Moss_Piglet · 2013-10-27T19:35:25.220Z · LW(p) · GW(p)

I think that's a fair assessment, although it might be because my guess was around 120 to start with. I never meant to say we're not smart around here, far from it, but I don't think we're all borderline geniuses either. It's important to keep perspective and very easy to overestimate yourself.

comment by ThisSpaceAvailable · 2013-10-27T06:20:49.117Z · LW(p) · GW(p)

To comprehend a text, a person must:

  1. Become aware of it
  2. Have some reason for reading it
  3. Find those reasons to be sufficient to spend time reading it
  4. Read it
  5. Put forth the cognitive effort to understand it (reading something and putting forth cognitive resources to understand it are not the same thing)
  6. Succeed in understanding it

Intelligence is just one component of knowledge acquisition, and probably less important than affective issues. Often, intelligence acts indirectly by affecting affect, but in such cases, those effects can be counteracted. The mistaking of performance of cognitive tasks for intelligence is, I believe, often an aspect of the fundamental attribution error.

comment by Dentin · 2013-10-25T15:38:43.054Z · LW(p) · GW(p)

140+

Replies from: Nornagest
comment by Nornagest · 2013-10-27T07:08:39.466Z · LW(p) · GW(p)

Not anymore, though only just. The 2012 survey reports a mean of 138 and change with a SD of 12.7. It was 140 or higher on the 2011 and 2009 surveys, though.

All the usual self-reporting caveats apply, of course.

comment by [deleted] · 2015-07-17T05:11:58.862Z · LW(p) · GW(p)

I'm interested in operationalising your advice. By study rationality, I assume you mean read rationality blogs and try to practice persuasive prescriptions. At the moment I only read Lesswrong regularly, but I try to give the blogs mentioned in the sidebar a go once in a while. Robin Hanson opened my mind in this Youtube interview but I find it hard to understand his blog posts on Overcoming Bias. I thought maybe that's because the blog posts are very scholastic and I'm not up to date. I don't find this is the case on SSC, but it is occasionally the case here on Lesswrong. If you could describe the intended readership of each rationality blog in a way that lets potential audience members decide which to commit to reading, how would you do it? Could you come up with a scale of academic rigour vs accessibility, or similar?

comment by BaconServ · 2013-10-25T05:42:14.409Z · LW(p) · GW(p)

The thing that struck me most about the sequences was how accessible they were. Minimal domain-specific jargon and a comprehensive (excessive at times, in my opinion) explanation of each concept in turn. I do believe LessWrong is not over-the-top inaccessible, but as the existence of this post implies, it seems that's not always agreed upon.

Replies from: RolfAndreassen, seez
comment by RolfAndreassen · 2013-10-25T14:30:40.313Z · LW(p) · GW(p)

I think this underestimates the difficulty average humans have with just reading upwards of 2500 words about abstract ideas. It's not a question even of getting the explanation, it's a question of simply being able to pay attention to it.

I keep repeating this: The average human is extremely average. Check your privilege, as the social-justice types might say. You're assuming a level of comfort with, and interest in, abstraction that just is not the case for most of our species.

Replies from: gattsuru, Error, DavidAgain, BaconServ
comment by gattsuru · 2013-10-26T03:51:33.780Z · LW(p) · GW(p)

Upvoted. Every time I'm tempted to provide a long post aimed at the general public, I've found it worthwhile to look at a math or biochemistry paper that is far outside of my knowledge domains -- the sort of stuff that requires you to go searching for a glossary to find the name of a symbol.

Sufficiently abstract writing feels like that to a significant amount of the populace, and worse, even Up-Goer 5-level writing looks like it will feel like that to a lot of people who've been trained into thinking they're not good enough at this sort of thing.

((I still tend to make a lot of long posts, because apparently I am a terrible person.))

comment by Error · 2013-10-26T03:20:11.914Z · LW(p) · GW(p)

Datum: I know at least one person who refuses to read LW links, explicitly because they are walls of text about abstract ideas that she thinks (incorrectly, IMO) are over her head. This occurs even for topics she's normally interested in. So things like that do limit LW's reach.

Whether they do so in a significant fraction of cases, I don't know. But the impact is non-zero.

comment by DavidAgain · 2013-10-25T15:00:29.244Z · LW(p) · GW(p)

This doesn't seem to me to be about fundamental intelligence, but upbringing/training/priorities.

You say in another response that IQ correlates heavily with conscientiousness (though others dispute it). But even if that's true, different cultures/jobs/education systems make different sort of demands, and I don't think we can assume that most people who aren't currently inclined to read long, abstract posts can't do so.

I know from personal experience that it can take quite a long while to get used to a new way of taking in information (lectures rather than lessons, reading rather than lectures, reading different sorts of things (science, to arguments relying on formal or near-formal logic, to broader humanities)). And even people who are very competent at focusing on a particular way of gaining information can get out of the habit and find it hard to readjust after a break.

In terms of checking privilege, there is a real risk that those with slightly better training/jargon, or simply those who think/talk more like ourselves are mistaken for being fundamentally more intelligent/rational.

Replies from: RolfAndreassen
comment by RolfAndreassen · 2013-10-25T16:40:41.180Z · LW(p) · GW(p)

This doesn't seem to me to be about fundamental intelligence, but upbringing/training/priorities.

Well, then I have to ask what you think "fundamental intelligence" consists of, if not ability with (and consequently patience for and interest in) abstractions.

Can we taboo 'intelligence', perhaps? We are discussing what someone ought to do who is average in something, which I think we are implicitly assuming to be bell-curved-ish distributed. How changeable is that something, and how important is its presence to understanding the Sequences?

Replies from: DavidAgain
comment by DavidAgain · 2013-10-25T16:50:02.228Z · LW(p) · GW(p)

I reject the assumption behind 'ability with (and consequently patience for and interest in)'. You could equally say 'patience for and interest in (and consequently ability in)', and it's entirely plausible that said patience/interest/ability could all be trained.

Lots of people I know went to schools where languages were not prioritised in teaching. These people seem to be less inherently good at languages, and to have less patience with languages, and to have less interest in them. If someone said 'how can they help the Great Work of Translation without languages', I could suggest back-office roles, acting as domestic servants for the linguists, whatever. But my first port of call would be 'try to see if you can actually get good at languages'.

So my answer to your question is basically that by the time someone is the sort of person who says 'I am not that intelligent but I am a utilitarian rationalist seeking advice on how to live a more worthwhile life' that they are either already higher on the bellcurve than simple 'intelligence' would suggest, or at least they are highly likely to be able to advance.

comment by BaconServ · 2013-10-25T20:04:05.190Z · LW(p) · GW(p)

Oh no, I don't expect very many people to read it all. I expect a select few articles to go viral every now and then, though. This wouldn't be possible if the writing wasn't clear and accessible.

Replies from: RolfAndreassen
comment by RolfAndreassen · 2013-10-25T20:42:33.997Z · LW(p) · GW(p)

Sure, but I suggest that "viral on the Internet" for a long text article does not in fact mean that humans of average intelligence are reading it. The Internet skews up in intelligence to start with, but the stuff that goes viral enough to be noticed by mainstream media - which at least in principle reach down to the average human - is cat videos and cute kids, not long articles. Sequence posts may certainly go viral among a Hacker-News-ish, technical, college-educated, Populares-ish sort of crowd, but that's already well outside the original "average intelligence" demographic.

Replies from: BaconServ
comment by BaconServ · 2013-10-25T20:55:31.802Z · LW(p) · GW(p)

I think you're vastly underestimating internet usage here. One of the best things Facebook has done (in my opinion) is massively proliferate the practice of internet arguing. The enforced principle of not getting socked by someone in a fit of rage just makes the internet so irresistible for speaking your mind, you know?

Additionally, every so often I see my siblings scrolling through Facebook or some "funny image collection" linked from Facebook, seeing for the first time images I saw years ago. If the internet has a higher-than average intelligence, then the internet usage resulting from Facebook is a powerful intelligence boost to the general population.

I suppose I should write my analysis here into a proper post some time, as I do consider it a significant modern event.

Replies from: gattsuru, RolfAndreassen
comment by gattsuru · 2013-10-26T04:03:43.549Z · LW(p) · GW(p)

I agree that internet usage has led to a massive proliferation of certain types of knowledge and certain types of intelligent thought.

At the same time, it's important to note that image memes, Twitter, and Tumblr have increasingly replaced Livejournal or other long-form writing at the same time that popular discussion has expanded, and style guides have increasingly encouraged three-sentence paragraphs over five-sentence paragraphs for internet publishing. There are a few exceptions -- fanfiction has been tending to longer and longer-form, often exceeding the length of what previous generations would traditionally consider a doorstopper by orders of magnitude* -- but much social media focuses on short and often very short form writing.

  • There are at least a dozen Harry Potter fanfictions with a higher wordcount than the entire Harry Potter series, spinoff media included. Several My Little Pony authors have put out similar million-word-plus texts in just a few years, including a couple of the top twenty read fictions. This may increase tolerance for nonfiction long reads, although I'm uncertain the effects will hit the general populace.
comment by RolfAndreassen · 2013-10-26T01:32:31.373Z · LW(p) · GW(p)

I agree that the Internet is a boost to human intelligence, relative to the TV that it is replacing and to whatever-it-was that TV replaced - drinking at the pub, probably. I don't think the effect is large compared to the selection bias of hanging out in LW-ish parts of the Internet.

Replies from: BaconServ
comment by BaconServ · 2013-10-26T01:46:41.670Z · LW(p) · GW(p)

I'd agree if I thought LessWrong performed better than average.

Replies from: RolfAndreassen
comment by RolfAndreassen · 2013-10-26T01:56:00.724Z · LW(p) · GW(p)

What metric would you propose to measure LW performance?

Replies from: BaconServ
comment by BaconServ · 2013-10-26T02:44:04.064Z · LW(p) · GW(p)

My current heuristic is to take special note of the times LessWrong has a well-performing post identify one of the hundreds of point-biases I've formalized in my own independent analysis of every person and disagreement I've ever seen or imagined.

I'm sure there are better methods to measure that LessWrong can figure out for itself, but mine works pretty well for me.

Replies from: RolfAndreassen
comment by RolfAndreassen · 2013-10-26T03:45:56.073Z · LW(p) · GW(p)

identify one of the hundreds of point-biases

Not quite sure what you mean here; could you give an example?

But this aside, it seems that you are in some sense discussing the performance of LessWrong, the website, in identifying and talking about biases; while I was discussing the performance of LessWrongers, the people, in applying rationality to their real lives.

Replies from: BaconServ
comment by BaconServ · 2013-10-26T04:17:06.542Z · LW(p) · GW(p)

A good example would be any of the articles about identity.

It comes down to a question of what frequency of powerful realizations individual rationalists are having that make their way back to LessWrong. I'm estimating it's high, but I can easily re-assess my data under the assumption that I'm only seeing a small fraction of the realizations individual rationalists are having.

comment by seez · 2013-10-27T00:43:17.502Z · LW(p) · GW(p)

I think the sequences are accessible for how abstract they are and how unfamiliar the ideas are (usually, abstraction and unfamiliarity decrease accessibility). I work as a tutor in a program for young people, and one of the interesting parts of the program is that all of the students are given a variety of tests, including IQ tests, which the tutors have access to as part of an effort to customize teaching approach to best suit students' interests and abilities. I have all kinds of doubts about how ethical and useful the program is, but it has taught me a lot about how incredibly widely people vary. I don't believe most of my students would get much out of the sequences, but perhaps I'm too pessimistic. I think even if they understood the basic argument, they would not internalize it or realize its implications. I'd guess that their understanding would be crappily correlated with IQ. I have been spending a lot of time trying to figure out how to communicate those ideas without simplifying away the point.

Replies from: BaconServ
comment by BaconServ · 2013-10-27T02:51:09.341Z · LW(p) · GW(p)

These ideas are trivial. When I say "accessible," I mean in terms of the people educated in the world of the past who systematically had their ideas shut down. Anyone who has been able to control their education from an early age is a member of the Singularity already; their genius—the genius that each person possesses—has simply yet to fully shatter the stale ideas of a generation or two of fools who thought they knew much about anything. You really don't need to waste your time trying to get them to recognize the immense quality of this old-world content to old-world rationalists.

I apologize that this will come across as an extraordinary claim, but I've already grown up in the Singularity and derived 99% of the compelling content of LessWrong—sequences and Yudkowsky's thoughts included—by the age of 20. I'm gonna get downvoted to hell saying this, but really I'm just letting you know this so you don't get confused by how amazing unrestricted human curiosity is. Basically, I'm only saying this because I want to see your reaction in ten years.

comment by dougclow · 2013-10-25T17:48:32.287Z · LW(p) · GW(p)

Play to your strengths; do what you're best at. You don't have to be best in the world at it for it to be valuable.

Good things about this advice are (a) it has a fairly-sound theory behind it (Comparative advantage), and (b) it applies whether or not you're smart, normal or dumb, so you don't get in to socially-destructive comparisons of intelligence.

comment by ChristianKl · 2013-10-25T17:02:22.973Z · LW(p) · GW(p)

When in doubt, ask. The stackexchange network is great to get answers to questions.

Skeptics StackExchange, for example, is great for getting answers to general questions.
If you encounter a significant claim on the internet, it's often a useful website to check. Recently I came across the claim that batteries follow something like Moore's law. I headed over to Skeptics StackExchange and posted a question.

Another useful habit is Anki. Especially if you don't trust your brain to remember information on its own, let Anki help you.

comment by Viliam_Bur · 2013-10-25T08:35:33.733Z · LW(p) · GW(p)

look for tested methods that have significant positive results relevant to the area of interest

This part of the advice needs to be more specific: for example, which "positive results" should be trusted and which not? Everyone who wants to sell you something will tell you about "positive results".

comment by Manfred · 2013-10-25T07:00:34.864Z · LW(p) · GW(p)

My 3 pieces of advice, for someone already convinced that something fairly accurate is going on, would be:

1) Try to sign up for cryonics before dying.

2) When donating to charity, use the recommendations of an organization like GiveWell ("the practical approach"), or donate to a charity working on existential risk ("the 'taking ideas seriously' approach").

3) (My one best piece of rationality advice) Other people have good reasons for their actions, according to themselves. That doesn't mean you'll think they're good once you find them out - but it does mean you should try to find them out.

Replies from: Creutzer
comment by Creutzer · 2013-10-25T11:12:55.037Z · LW(p) · GW(p)

Other people have good reasons for their actions, according to themselves.

Do they? (Or are you referring to the fact that people, when asked explicitly why they did something, make up some reason and convince themselves that that was it? Context suggests to me, though, that "according to themselves" refers to what they think, not what they say (and maybe then think) upon asking.)

Replies from: NancyLebovitz, niceguyanon
comment by NancyLebovitz · 2013-10-25T13:52:47.446Z · LW(p) · GW(p)

I believe that people can only do what makes sense to them, for some very expansive meaning of "makes sense". The gain from believing this is to give up the delusion that what feels right to me should automatically transfer to other people.

Replies from: Creutzer
comment by Creutzer · 2013-10-25T14:13:10.874Z · LW(p) · GW(p)

True, but I know I do a lot of things without first thinking about whether they make sense. I don't generally have the time to check that for every single action I take (e.g. performing a speech act in a conversation).

Replies from: NancyLebovitz
comment by NancyLebovitz · 2013-10-25T15:15:28.939Z · LW(p) · GW(p)

That's kind of where I was pointing with "very expansive meaning of "makes sense" "-- system 1 has its own background premises, even if they aren't verbal or filtered through the conscious mind.

I'll see if I can come up with a better phrasing.

Replies from: Creutzer
comment by Creutzer · 2013-10-26T10:19:50.053Z · LW(p) · GW(p)

It seems to me that the situation is this: everybody does everything for a reason (surprise, surprise), but they may not know it, you may not know it, it may not be what they say it is even if they try to be honest, and it may not be a good reason.

That, unfortunately, is not a neatly summarisable point, and the question of what moral to draw from it is not trivial.

comment by niceguyanon · 2013-10-25T11:45:33.961Z · LW(p) · GW(p)

I think the latter.

comment by ChristianKl · 2013-10-25T03:00:33.452Z · LW(p) · GW(p)

I'm not sure how much raw intelligence matters. If a person of average intelligence stays for 10 years with a problem which doesn't get much attention, I see no reason why they shouldn't be able to contribute something to it.

Being intellectual means staying with intellectual problems over years instead of letting them drop because a new television series is more important.

Replies from: RolfAndreassen, None
comment by RolfAndreassen · 2013-10-25T05:23:03.839Z · LW(p) · GW(p)

Since IQ correlates with practically everything, including conscientiousness and the ability to concentrate, I'm not convinced this advice is helpful. The average human may be plain unable to meaningfully stick with a problem for ten years. (That is, to actually productively work on the problem daily, not just have it on the to-do list and load up the data or whatever every so often.) I fear the LW bubble gives most people here a rather exaggerated estimate of the "average"; your median acquaintance is likely one or two standard deviations above the real population average, and that already makes a big difference.

Replies from: ChristianKl
comment by ChristianKl · 2013-10-25T11:55:10.464Z · LW(p) · GW(p)

The average human may be plain unable to meaningfully stick with a problem for ten years. (That is, to actually productively work on the problem daily, not just have it on the to-do list and load up the data or whatever every so often.)

I don't think working every day on the problem is necessary. For a lot of problems visiting them monthly does a lot.

If you want to formalize the approach, it's something like: I have learned something new X; how does X relate to problems Y_1 through Y_n?

If you inform yourself widely, I think you have the potential to contribute. Most people aren't intellectual because they don't invest any effort in being intellectual.

Since IQ correlates with practically everything, including conscientiousness

Given that papers get published with titles like Why is Conscientiousness negatively correlated with intelligence? I don't think that's the case.

comment by [deleted] · 2013-10-27T18:43:19.979Z · LW(p) · GW(p)

a problem which doesn't get much attention for 10 years

Could you give examples of problems like this?

Replies from: ChristianKl
comment by ChristianKl · 2013-10-27T23:10:29.355Z · LW(p) · GW(p)

I will give three examples of problems that I have stayed with over a long time: spaced repetition learning, polyphasic sleep, and quantified self.

Quantified Self is the example where I have the most to show publicly. I did community work in QS. My name is in a dozen mainstream media pieces in a total of three languages. By "piece" I mean newspaper, radio, or TV; I did all of them multiple times.

Spaced repetition learning would be one problem which is extremely important but has very few people working on it.

The Mnemosyne data has lain around for years without anyone analysing it. Going through that data and doing a bit of modeling with it should be easy for anyone who's looking for a bachelor's thesis in computer science or otherwise seeks a project.

Another question would be: how do you calculate a good brain-performance score for a given day, given Anki review data? (Anki stores all the review data internally in an SQL database.)

You don't need to be a genius to contribute to either of those two issues. Both problems are pretty straightforward if you can program and have an interest in modelling.
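
As one concrete starting point, here is a minimal sketch of such a daily score, assuming Anki's standard revlog table layout (the review timestamp in milliseconds is stored in id, and the answer button in ease, with 1 meaning a failed recall); how to weight card difficulty and intervals is left open:

    # Daily "brain performance" score from an Anki collection: the fraction of
    # reviews answered correctly on each day. Assumes the standard revlog schema.
    import sqlite3
    from collections import defaultdict
    from datetime import datetime

    con = sqlite3.connect("collection.anki2")   # path to your Anki collection file
    daily = defaultdict(lambda: [0, 0])         # date -> [correct, total]

    for review_ms, ease in con.execute("SELECT id, ease FROM revlog"):
        day = datetime.fromtimestamp(review_ms / 1000).date()
        daily[day][0] += ease > 1               # ease 1 is "Again", i.e. a failed recall
        daily[day][1] += 1

    for day in sorted(daily):
        correct, total = daily[day]
        print(day, round(correct / total, 3))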

Polyphasic sleep is a problem where I would say that I contribute to the discussion. I tried it probably 8/9 years ago and I stayed with the problem intellectually. Last year a friend of mine was trying uberman for a month and in researching the topic he came across something I wrote. When talking with him about the topic he quoted one of my online opinions on the topic to me, and at first it surprised me because I hadn't made that point in his physical presence.

My highest rated answer on skeptics stackexchange is also about the uberman schedule: http://skeptics.stackexchange.com/questions/999/does-polyphasic-sleep-work-does-it-have-long-term-or-short-term-side-effects/1007#1007

It's not like I contributed a breakthrough in thinking about polyphasic sleep but I did contribute to the knowledge on the topic a bit.

Replies from: gwern, None
comment by gwern · 2013-10-28T17:21:17.778Z · LW(p) · GW(p)

The Mnemosyne data has lain around for years without anyone analysing it. Going through that data and doing a bit of modeling with it should be easy for anyone who's looking for a bachelor's thesis in computer science or otherwise seeks a project.

It's a real pain to, though, because it's so big. A month after I started, I'm still only halfway through the logs->SQL step.

Replies from: ChristianKl
comment by ChristianKl · 2013-10-28T18:28:51.050Z · LW(p) · GW(p)

It's a real pain to, though, because it's so big. A month after I started, I'm still only halfway through the logs->SQL step.

That sounds like you're doing one insert per transaction, which is the default way SQL operates. It is possible to batch multiple inserts together into one transaction.

If I remember right the data was something in the size of 10GB. I think that a computer should be able to do the logs->SQL step in less than a day provided one doesn't do one insert per transaction.
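
A minimal sketch of that batching approach with Python's sqlite3 module (the table and column names are placeholders, not the actual Mnemosyne schema):

    # Batch many inserts into one transaction instead of committing per row;
    # this is usually the difference between hours and minutes for bulk imports.
    import sqlite3

    def import_rows(rows, db_path="logs.db", batch_size=10_000):
        con = sqlite3.connect(db_path)
        con.execute("CREATE TABLE IF NOT EXISTS log (user TEXT, ts INTEGER, grade INTEGER)")
        batch = []
        for row in rows:
            batch.append(row)
            if len(batch) >= batch_size:
                con.executemany("INSERT INTO log VALUES (?, ?, ?)", batch)
                con.commit()                    # one commit per batch, not per row
                batch = []
        if batch:
            con.executemany("INSERT INTO log VALUES (?, ?, ?)", batch)
            con.commit()
        con.close()

    import_rows([("u1", 1300000000, 3), ("u2", 1300000100, 1)])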

Replies from: gwern
comment by gwern · 2013-10-28T22:31:44.641Z · LW(p) · GW(p)

I believe so, yeah. You can see an old copy of the script at http://github.com/bartosh/pomni/blob/master/mnemosyne/science_server/parse_logs.py (or download the Mnemosyne repo with bzr). My version is slightly different in that I made it a little more efficient by shifting the self.con.commit() call up into the exception handler, which is about as far as my current Python & SQL knowledge goes. I don't see anything in http://docs.python.org/2/library/sqlite3.html mentioning 'union', so I don't know how to improve the script.

If I remember right the data was something in the size of 10GB.

The .bz2 logs are ~4GB; the half-done SQL database is ~18GB so I infer the final database will be ~36GB.

EDIT: my ultimate solution was to just spend $540 on an SSD, which finished the import process in a day; the final uploaded dataset was 2.8GB compressed and 18GB uncompressed (I'm not sure why it was half the size I expected).

comment by [deleted] · 2013-10-28T17:10:13.870Z · LW(p) · GW(p)

Thanks for the round-up! I thought that by "problems" you meant things like the millennium problems and friendly AI, and couldn't picture how average people could make any progress on them (well, maybe some with dedication), but these make more sense. How easy is it to get funding for these kinds of projects? I'm just wondering because these are still a bit fringe issues, but of course very important.

Replies from: ChristianKl
comment by ChristianKl · 2013-10-28T18:00:54.113Z · LW(p) · GW(p)

Quantified Self is in its nature about dealing with epistemology. It's not certain that you will learn something about how an AGI works by doing Quantified Self, but the potential is there.

A mathematical model of how human memory works, which could be produced by looking at the Mnemosyne data, could also potentially matter for FAI.

FAI is a hard problem and therefore it's difficult to predict, where you will find solutions to it.

How easy is it to get funding for these kinds of projects?

It very much depends on the project. I don't know how hard it is to get grants for the spaced repetition problems I mentioned. I do, however, think that if someone is seeking a topic for a bachelor's or master's thesis, these are good topics if you want an academic career.

The daily Anki score would allow other academics to do experiments on how factor X affects memory. If you provide the metric that they use in their papers, they will cite you.

I thought that by "problems" you meant things like the millennium problems

I don't understand why anyone would want to work on the Riemann Hypothesis. It doesn't seem to be a problem that matters.

It's one of those examples that suggests that people are really bad at prioritising. Mathematicians work on it because other mathematicians think it's hard and solving it would impress them.

It has a bit of Terry Pratchett's Unseen University which was created to prevent powerful wizards from endangering the world by keeping them busy with academic problems. The only difference is that math might advance in a way that makes an AGI possible and is therefore not completely harmless.

Replies from: None
comment by [deleted] · 2013-10-28T18:26:03.468Z · LW(p) · GW(p)

I don't understand why anyone would want to work on the Riemann Hypothesis. It doesn't seem to be a problem that matters.

Could the fact that it doesn't seem to have many practical applications be what attracts certain people towards it? It doesn't have practical applications -> it's "purer" math. You're not trying to solve the problem for some external reason or using the math as a tool, you're trying to solve it for its own sake. I remember reading studies that mathematicians are on average more religious than scientists in general, and I've also gotten the impression that some mathematicians relate to math a bit like it's religion. There is also this concept: http://en.wikipedia.org/wiki/Mathematical_beauty

It could be that some are just trying to impress others but I don't think it's always that simple.

And to my knowledge, there is some application for almost all the math that's been developed. Of course, if you optimized purely for applications, you might get better results.

Replies from: ChristianKl
comment by ChristianKl · 2013-10-28T18:58:07.230Z · LW(p) · GW(p)

It could be that some are just trying to impress others but I don't think it's always that simple.

Yes, you are right it's more complicated.

comment by BaconServ · 2013-10-25T02:51:31.547Z · LW(p) · GW(p)

I find your third point for practical advice to be significantly uncharitable to someone of average intelligence. There are people that miss obvious patterns like, "This person gives bad advice," but I think people of average intellect are already well equipped to notice simple patterns like that.

I don't believe there is a coherent set of general advice that can be given here. Which specific details and methods of rationality any given "average" person is missing, and which specific cognitive biases they suffer from most severely, will vary too widely to get good coverage with a few short points. My approach would be to work on an individual basis to determine what's causing the most problems for each person and address them accordingly. This may seem highly inefficient, but remember that success stories are told and retold virally as each new person has experiences that confirm the wisdom:

That sounds a lot like what I went through. What really helped me was...

There are far too many average people for me to expect that addressing any single centralized fault will be especially effective.

comment by Vika · 2013-10-25T15:16:48.164Z · LW(p) · GW(p)

One of the most important steps to becoming more rational for an average person would be to disentangle themselves from the default goals / values imposed by society or their peers. This would free up a lot of time for figuring out their own goals and developing relevant skills.

An average person could go far with instrumental rationality techniques like those taught at CFAR. Exercises like goal factoring and habit training don't require a high capacity for abstraction, only willingness to be explicit about one's motivations. For accumulating factual knowledge, spaced repetition software could be very useful.

comment by BartMan · 2013-12-19T09:49:21.927Z · LW(p) · GW(p)

Right now I'm reading the book "The Art of Thinking Clearly: Better Thinking, Better Decisions", so that I can familiarize myself with the biases that I unconsciously commit.

comment by SoerenMind · 2013-10-25T22:22:16.523Z · LW(p) · GW(p)

I would say asking for advice seems like a pretty useful heuristic then. Approach people with the same goals that are smart and ask them where to donate, or even what to believe. The fact that a smart person (who has given a lot of thought to something) believes something is good evidence that it is true. So basically: find a mentor.

comment by niceguyanon · 2013-10-25T11:36:19.645Z · LW(p) · GW(p)

However, s/he isn't smart enough to discover new knowledge in most fields, or contribute very much to a conversation of more knowledgeable experts on a given topic. Let's assume s/he has no exceptional talents in any area.

Most people are intellectually near-average or below-average, and I have not seen extensive discussion on how to help them lead happier lives that make the world a better place.

Upvoted for caring about other people. Most of your suggestions I agree with, and there are some other good ones in the comments. I want to point out that the advice given is pretty much the same as it would be for most people anyway. Most people one standard deviation or more above average are also not likely to discover new knowledge either. I believe it is a matter of execution; practical advice is fitting for not-so-smart people and smart people alike (for the most part). Whether you can get it done is another matter, and smart people usually do; that's why they are smart/conscientious.

comment by TsviBT · 2013-10-25T06:38:39.230Z · LW(p) · GW(p)

Specific advice: [ETA: If you decide that it is worth your time and effort to work directly on improving your general thinking skills, then one difficult but effective way to do that is to] learn to program and/or to learn math. Use google to find resources. Don't be embarrassed by books/articles with "for beginners" or "introductory" or "elementary" in their titles, and especially don't be embarrassed if even those are too hard (in fact, beware of the good old "elementary" math text, meaning "elementary... for grad students, hahaha!"). Just keep looking until you find something you can dig into. Do lots of exercises; basically, do most of them until they get boring, but learn to feel the difference between being bored because it's easy and bored because you have no idea how to do it. Find someone who can check your understanding, or find books with really good worked out examples.

Learning a technical field (meaning proof-based math, programming, and to a lesser extent other sciences) - even the very basics - will help you think faster, hold more things in your head all at once, follow long and complicated arguments, notice flaws or vagueness in arguments, distinguish crucial structure from irrelevant details, etc. Accordingly, it will take a huge sustained effort.

General advice: find someone smarter than you, and ask for life strategy help. E.g., do you think you could do a non-glamorous but well-paid job and donate to charity? If so, brainstorm with google and/or a smart person for jobs like that. Maybe check out 80,000 Hours. I haven't really looked at that site much but it seems like it has some helpful stuff.

Replies from: Creutzer, bramflakes
comment by Creutzer · 2013-10-25T11:14:42.272Z · LW(p) · GW(p)

Specific advice: [...]

This is supposed to be for a person of average intelligence? ...

Replies from: TsviBT
comment by TsviBT · 2013-10-25T20:28:11.009Z · LW(p) · GW(p)

Yes it is. There was a big additional assumption I was making in my head, I've edited to clarify. Now does it make sense?

Replies from: Creutzer
comment by Creutzer · 2013-10-26T10:12:24.610Z · LW(p) · GW(p)

I don't think so. Now it basically reduces to the general claim "learning math and programming improves general thinking skills" - which, by the way, I'm not convinced of in full generality -, but has nothing to do with the average person. The problem is that learning programming and math takes so much time and effort, if it is at all possible, for the average person (and with no easily identifiable returns at that) that the antecedent of your conditional is unlikely to ever be true, thus rendering your advice largely irrelevant.

comment by bramflakes · 2013-10-25T15:40:54.263Z · LW(p) · GW(p)

You're significantly overestimating how a) easy and b) enjoyable the average person finds math-related subjects. Most people don't get past algebra.

I think a better option would be to gain a gut-level understanding of comparative advantage, and how much to value your time, so that you can get paid for what you do best, and outsource what you're bad at or find boring to other, more competent and enthusiastic people.

Replies from: TsviBT
comment by TsviBT · 2013-10-25T20:36:53.231Z · LW(p) · GW(p)

b) enjoyable

In my head, I was assuming motivation, edited to clarify.

Most people don't get past algebra.

Yeah I know, that's why I commented. Even basic facility in proof based math is an extremely powerful mental technology, as I tried to say. I would not recommend calculus. I am talking about combinatorics or graph theory, or discrete math in general, where you can see the basic building blocks of proofs and proof strategies. This is worth years of effort.

Replies from: Gurkenglas
comment by Gurkenglas · 2013-10-27T03:27:44.949Z · LW(p) · GW(p)

Maybe proficiency in proof-based math is not a cause of mental superiority, but an indicator?