Comment by wilkox on Group rationality diary, 6/4/12 · 2012-06-05T09:52:39.010Z · LW · GW

Have you considered melatonin? Quoting gwern:

Melatonin allows us a different way of raising the cost, a physiological & self-enforcing way. Half an hour before we plan to go to sleep, we take a pill. The procrastinating effect will not work - half an hour is so far away that our decision-making process & willpower are undistorted and can make the right decision (viz. following the schedule). When the half-hour is up, the melatonin has begun to make us sleepy. Staying awake ceases to be free, to be the default option; now it is costly to fight the melatonin and remain awake.

I use it for exactly this reason and it works brilliantly.

Comment by wilkox on If epistemic and instrumental rationality strongly conflict · 2012-05-10T07:51:39.541Z · LW · GW

This is like saying "if evolution wants a frog to appear poisonous, the most efficient way to accomplish that is to actually make it poisonous". Evolution has a long history of faking signals when it can get away with it. If evolution "wants" you to signal that you care about the truth, it will do so by causing you to actually care about the truth if and only if causing you to actually care about the truth has a lower fitness cost than the array of other potential dishonest signals on offer.

Comment by wilkox on A Kick in the Rationals: What hurts you in your LessWrong Parts? · 2012-04-26T04:25:51.190Z · LW · GW

I've noticed many people who practise meditation have a strong belief in meditation and the more 'rational' core of Buddhist practices, but only belief in belief about the new age-y aspects. My meditation teacher, for example, consistently prefaces the new age stuff with "in Buddhist teachings" or "Buddhists believe" ("Buddhists believe we will be reincarnated") while making other claims as simple statements of fact ("mindfulness meditation is a useful relaxation technique").

Comment by wilkox on 9/11 as mindkiller · 2011-09-13T22:56:28.078Z · LW · GW

I appreciate this. I genuinely didn't (still don't) understand what lessdazed was trying to say, and it would be a really bad thing if downvoting ignorance became common practice.

Comment by wilkox on 9/11 as mindkiller · 2011-09-13T00:04:43.674Z · LW · GW

It's important to avoid the if-not-for-the-worst-waste-of-money-in-the-budget-the-most-worthy-unfunded-program-would-have-been-funded argument.

Can you explain why? This seems like a perfectly normal and reasonable sort of argument about dividing a limited pool of resources wisely.

Comment by wilkox on Please do not downvote every comment or post someone has ever made as a retaliation tactic. · 2011-08-21T23:09:37.476Z · LW · GW

Perhaps sparklines would work for this. They compress the recent history of a measurement in a space-efficient way which can fit inline with text.
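As an illustration of the idea, a sparkline can be rendered inline with nothing more than Unicode block characters. This is a minimal sketch, not any site's actual implementation, and the karma figures are invented for the example:

```python
# Illustrative sketch: render a measurement's recent history as a
# Unicode sparkline that fits inline with text.
BARS = "▁▂▃▄▅▆▇█"

def sparkline(values):
    """Map each value onto one of eight block characters."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero for a flat series
    return "".join(BARS[int((v - lo) / span * (len(BARS) - 1))] for v in values)

karma_history = [3, 5, 4, 8, 7, 10, 9, 12]  # hypothetical data
print(sparkline(karma_history))
```

Eight characters of output compress eight data points, which is the space-efficiency being suggested.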

Comment by wilkox on Leveling IRL - level 1 · 2011-08-09T23:09:47.222Z · LW · GW

This sounds a lot like the Scouting merit system, in a good way. I learned more life skills from Scouts than I ever did from public education.

Comment by wilkox on Distracting wolves and real estate agents · 2011-07-07T23:44:41.968Z · LW · GW

This doesn't seem to be an answer to Wei Dai's question.

Comment by wilkox on Best articles to link to when introducing someone to Less Wrong? · 2011-07-06T04:19:13.684Z · LW · GW

I recently introduced a friend to HPMoR and she went on to discover Less Wrong entirely of her own accord. She has explicitly cited it as sparking her interest in things like Bayesian inference, which she would never have considered learning about before.

Comment by wilkox on The 48 Rules of Power; Viable? · 2011-05-27T08:42:28.240Z · LW · GW

The link "summary" and the link "Here is a little more expanded text" seem to point to the same place, in my browser at least.

Comment by wilkox on You'll die if you do that · 2011-05-15T23:45:39.214Z · LW · GW

From the linked McDonald's coffee case article:

In addition, they awarded her $2.7 million in punitive damages. The jurors apparently arrived at this figure from [the burn victim's lawyer's] suggestion to penalize McDonald's for one or two days' worth of coffee revenues, which were about $1.35 million per day.

Talk about a brilliant use of anchoring...

Comment by wilkox on You'll die if you do that · 2011-05-15T23:37:24.574Z · LW · GW

I may also explain to them that if defending oneself receives the exact same penalty that attacking someone gets it will usually be best to initiate the combat yourself.

This is excellent advice, with the caveat that the school's disciplinary penalty is probably not the only cost. Being known as "the kid who walks expressionlessly up to other kids and punches them in the testicles without warning" may be a significant penalty too. (This doesn't mean striking first is always a bad strategy, just that it needs to be done carefully.)

Comment by wilkox on Holy Books (Or Rationalist Sequences) Don’t Implement Themselves · 2011-05-11T23:00:31.392Z · LW · GW

In any case, it is pretty clear that it is possible to hold rationality and religion in your head at the same time. This is basically how most people operate.

More generally, "In any case, it is pretty clear that it is possible to hold rationality and irrationality in your head at the same time. This is basically how most people operate." I'm no more surprised to hear about a religious rationalist than I am when I notice yet another of my own irrational beliefs or practices.

Comment by wilkox on Scholarship: How to Do It Efficiently · 2011-05-11T01:55:15.198Z · LW · GW

Mendeley is good for this, and specifically designed for managing a library of academic papers. It supports tagging and full text searches, as well as some half-baked "social" features which can be safely ignored. The most useful feature for me is that it can watch a directory for new papers, and add them to its library as well as my directory tree (author/year/paper). It can also maintain a bibtex file for the entire library which is handy for citations.

Comment by wilkox on The 5-Second Level · 2011-05-10T23:28:45.310Z · LW · GW

Good point. Reading my comment again, it seems obvious that I committed the typical mind fallacy in assuming that it really is a choice for most people.

Comment by wilkox on Building rationalist communities: lessons from the Latter-day Saints · 2011-05-10T06:41:32.896Z · LW · GW

Missionary work, including LDS, has a phenomenally low success rate. I don't recall it, but from memory a missionary might convert 1-2 people per year based on cold calls.

A one year doubling or tripling time doesn't strike me as "phenomenally low".

Comment by wilkox on [HPMoR] Celebratory Trailer · 2011-05-09T23:25:17.382Z · LW · GW

This was what confirmed Eliezer's skill as a writer in my mind. He resisted the (typical nerdish) impulse to vomit out pages of obsessively detailed explanations, instead leading the reader on with tantalising hints spaced far apart. It probably accounts for a lot of the book's notorious addictiveness.

Comment by wilkox on The 5-Second Level · 2011-05-09T01:09:24.715Z · LW · GW

"things that people say that really actionable beliefs even though they may not be clear on the difference"

This sounds interesting, but I can't parse it.

Comment by wilkox on The 5-Second Level · 2011-05-08T12:37:59.643Z · LW · GW

In any case there really isn't any reason to be offended and especially there is no reason to allow the other person to provoke you to anger or acting without thought.

It seems really, really difficult to convey to people who don't understand it already that becoming offended is a choice, and it's possible to not allow someone to control you in that way. Maybe "offendibility" is linked to a fundamental personality trait.

Comment by wilkox on Hollow Adjectives · 2011-05-06T08:43:23.218Z · LW · GW

Agreed, with the addendum that in this context there seems as much disagreement over the definition of "possible" as the definition of "omnipotent".

Comment by wilkox on Hollow Adjectives · 2011-05-05T09:13:56.994Z · LW · GW

This bothered me too. If 'omnipotent' is defined as 'able to do things which can be done', we're all gods.

Comment by wilkox on The Cognitive Costs to Doing Things · 2011-05-04T11:00:26.672Z · LW · GW

The difference between activation energy and inertia is that you can want to do something, but be having a hard time getting started - that's activation energy. Whereas inertia suggests you'll keep doing what you've been doing, and largely turn your mind off. Breaking out of inertia takes serious energy and tends to make people uncomfortable.

I don't mean to nitpick, but this distinction isn't obvious to me. It seems like inertia is just a component of activation energy.

Great post regardless.

Comment by wilkox on Admit your ignorance · 2011-05-03T11:22:08.099Z · LW · GW

This problem is compounded when the students feel obliged to stay in the class even if they're not getting anything out of it. The result is a room full of tired, frustrated students terrified of being "found out" or giving the wrong answer. I encourage my undergrad students to leave and work on a problem later if their brains just aren't up to the job, but they never do. It's not clear if this is because of years of authoritarian schooling, or if they just don't trust themselves to do the work outside of a classroom.

Comment by wilkox on SIAI - An Examination · 2011-05-02T08:29:34.589Z · LW · GW

Thank you very much for doing this. You've clearly put a lot of effort into making it both thorough and readable.

Formulate methods of validating the SIAI’s execution of goals.

Seconded. Being able to measure the effectiveness of the institute is important both for maintaining the confidence of their donors, and for making progress towards their long-term goals.

Comment by wilkox on Do you have High-Functioning Asperger's Syndrome? · 2011-05-02T00:54:48.772Z · LW · GW

I'm also not sure why the position of her eyes is supposed to be relevant to any of this.

Maybe something to do with the facial asymmetry JanetK mentions here?

Comment by wilkox on Mitigating Social Awkwardness · 2011-05-01T06:12:30.991Z · LW · GW

Why do you say this?

Comment by wilkox on Mitigating Social Awkwardness · 2011-05-01T02:04:51.625Z · LW · GW

Always wait for someone else to laugh at your joke before you join in.

This is generally good advice, but can backfire if you show no signs that you are conscious of making a joke. Making people laugh while remaining deadpan yourself is a high-level humour skill. Listeners who are not sure whether or not to laugh will look for cues from other listeners and from you, and if you're not laughing they may just go along with that.

Often it's better to make it obvious that you've amused yourself with your own joke, with a smile or small chuckle, but not react to whether others laugh or not. That displays confidence, and gives others the social room to laugh if they want.

Comment by wilkox on How hard do we really want to sell cryonics? · 2011-04-30T06:32:58.349Z · LW · GW

I have an intuition that most people would find it less weird to hear a pro-cryonics advertisement from an actual cryonics company than a "Public Service Announcement" from a third party. The former would be processed more like a normal advertisement, to be judged on its merits, while the latter could invite suspicion of the creators' motives. I might be wrong - anyone from marketing or advertising have something to say here?

Comment by wilkox on Meditation, insight, and rationality. (Part 1 of 3) · 2011-04-29T03:45:14.072Z · LW · GW

I'm confused by the idea that the kinds of meditation you are talking about have until now been practised by "small and somewhat private groups" in secret. Why would this kind of meditation be taboo? What did these groups have to fear that drove them to secrecy, and why has that changed?

Comment by wilkox on HELP! I want to do good · 2011-04-28T11:14:56.447Z · LW · GW

Why is continuing to donate as you did previously mutually exclusive with your evangelism plan?

Comment by wilkox on Mini-camp on Rationality, Awesomeness, and Existential Risk (May 28 through June 4, 2011) · 2011-04-27T02:41:28.296Z · LW · GW

Possibly, although I didn't think of that analogy until your comment. It seems more likely that the program will break even when I consider the potential for increased donation compared to my previous estimate, which was based only on AnnaSalamon's described expected outcomes for the program ("more rational, effective people"). I'm not sure that the program actually will break even in terms of existential risk reduction, which is why I'm very interested in seeing SIAI measure any increase in donations.

Comment by wilkox on Mini-camp on Rationality, Awesomeness, and Existential Risk (May 28 through June 4, 2011) · 2011-04-27T02:31:59.769Z · LW · GW

I don't know they will - see my above comment suggesting the SIAI actually measure donations from program participants. It does seem more likely now, however, that the program will at least break even on reducing existential risk, hence my increased comfort with the idea.

Comment by wilkox on Mini-camp on Rationality, Awesomeness, and Existential Risk (May 28 through June 4, 2011) · 2011-04-27T02:16:49.205Z · LW · GW

By way of analogy, suppose a cancer charity has $10,000 to spend. It could invest the money directly into research, for a marginal expected return in decreased cancer suffering, or it could spend it on a glitzy event where potential donors get to "try their hand" at working in a research lab for a day. The second option could sound like a waste of money, as the donors probably won't do anything worthwhile in a day of messing around in a lab. However, if they go on to contribute $100,000 more to the charity than they otherwise would have, that money can be reinvested in research for a 9x greater return on investment than investing the original $10,000 directly into research would have yielded (ignoring discount rates and assuming linear return on research investment). If any of the participants did happen to go on and become great cancer researchers, this would just be an excellent bonus effect.
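The arithmetic in the analogy can be checked in a few lines (all dollar figures are the hypothetical ones from the example above):

```python
# Quick check of the hypothetical figures in the cancer-charity analogy.
budget = 10_000            # the charity's discretionary funds
extra_donations = 100_000  # additional donations attributed to the event

# Option A: invest the budget directly in research.
direct_research = budget

# Option B: spend the budget on the event, then reinvest the extra donations.
event_research = extra_donations

multiplier = event_research / direct_research
print(multiplier)  # 10.0: ten times the research funding, i.e. 9x greater
```

This ignores discount rates and assumes linear returns on research investment, as the original example does.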

The idea that this program will result in increased donations makes me more comfortable because it seems this is a more likely way the program will directly reduce existential risk than the vaguer goal of 'raising the sanity waterline'. If it does succeed in raising the sanity waterline in a way that reduces existential risk, that would be an excellent bonus.

Comment by wilkox on Mini-camp on Rationality, Awesomeness, and Existential Risk (May 28 through June 4, 2011) · 2011-04-27T01:22:06.696Z · LW · GW

The idea of holding a program to increase donations actually made me more comfortable, as it seems more like a long term investment in reducing existential risk than money squandered on something fun but not obviously essential.

Comment by wilkox on Mini-camp on Rationality, Awesomeness, and Existential Risk (May 28 through June 4, 2011) · 2011-04-27T01:05:42.937Z · LW · GW

That's a good point. An increase in donations from a specific group of people should be easy to measure too, so the SIAI could use it to directly assess the effectiveness of these programs.

Comment by wilkox on Mini-camp on Rationality, Awesomeness, and Existential Risk (May 28 through June 4, 2011) · 2011-04-26T08:28:02.029Z · LW · GW

Why is the Singularity Institute paying for this?

We're trying to reduce existential risk -- to increase the odds that an eventual Singularity is good, from the perspective of humane values. To do this, we need more rational, effective people -- people who can train to do the needed research, who can fund that or other work, and who can otherwise exert influence toward good outcomes.

I'd be interested in hearing more about how you foresee graduates of these camps working to reduce existential risk, especially as a donor to the SIAI. Is there a long term plan in place or are you just trying some things out?