Comments

Comment by Aaron3 on Worse Than Random · 2008-11-11T19:34:52.000Z · LW · GW

A combination dial often has a tolerance of 2 in either direction. 20-45-35 will open a lock set to 22-33-44.

I certainly hope not! I think you intended 20-35-45 for the first or 22-44-33 for the second.
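
A quick sketch of that tolerance check (ignoring dial wrap-around; the function and numbers below are just my illustration of the arithmetic, not anything from the post):

```python
def dial_opens(dialed, actual, tolerance=2):
    """True if every dialed number is within `tolerance` of the lock's set number."""
    return all(abs(d - a) <= tolerance for d, a in zip(dialed, actual))

print(dial_opens((20, 45, 35), (22, 33, 44)))  # False -- the pairing as originally written
print(dial_opens((20, 35, 45), (22, 33, 44)))  # True  -- corrected dialed combination
print(dial_opens((20, 45, 35), (22, 44, 33)))  # True  -- corrected lock setting
```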

Comment by Aaron3 on Ban the Bear · 2008-09-19T20:21:06.000Z · LW · GW

The Onion: "Bush to Cut Deficit from Federal Budget"

Comment by Aaron3 on Dreams of Friendliness · 2008-08-31T03:48:54.000Z · LW · GW

But spinning a hard drive can move things just outside the computer, or just outside the room, by whole neutron diameters

Not long ago, when hard drives were much larger, programmers could make them inch across the floor; they would even race each other. From the Jargon File:

There is a legend about a drive that walked over to the only door to the computer room and jammed it shut; the staff had to cut a hole in the wall in order to get at it!

Comment by Aaron3 on Extensions and Intensions · 2008-02-04T22:07:59.000Z · LW · GW

It's 'Peirce', not 'Pierce'.

Comment by Aaron3 on The "Outside the Box" Box · 2007-10-13T00:53:58.000Z · LW · GW

Eliezer is certainly correct that our real goal is to make optimal decisions and perform optimal actions, regardless of how different they are from those of the herd. But that doesn't mean we should ignore information about our conformity or non-conformity. It's often important.

Consider the hawk-dove game. If you're in a group of animals who randomly bump into each other and compete for territory, the minority strategy is the optimal strategy. If all your peers are cowards, you can completely dominate them by showing some fang. Or if your peers follow the "never back down, always fight to the death" strategy, you should be a coward until they've killed each other off. Non-conformity is a valid goal (or subgoal, at least).
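
A minimal sketch of that payoff logic, using the standard hawk-dove payoff matrix (the resource value V and fight cost C below are my own illustrative numbers, not from the post):

```python
# Hawk-dove expected payoffs when a fraction p of the population plays hawk.
# Standard payoffs: hawk vs hawk -> (V - C) / 2, hawk vs dove -> V,
#                   dove vs hawk -> 0,           dove vs dove -> V / 2.
V, C = 2.0, 6.0  # illustrative: the fight costs more than the territory is worth

def hawk_payoff(p):
    return p * (V - C) / 2 + (1 - p) * V

def dove_payoff(p):
    return (1 - p) * V / 2

for p in (0.0, 0.5, 1.0):
    print(f"hawk fraction {p:.1f}: hawk earns {hawk_payoff(p):+.2f}, dove earns {dove_payoff(p):+.2f}")
# At the extremes, whichever strategy is in the minority comes out ahead:
# a lone hawk dominates a population of doves, and a lone dove outlasts a population of hawks.
```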

On the other hand, in situations with network effects, you want to be a conformist. If you're selling your widget on Bob's Auction Site, which has 20 users, instead of eBay, your originality is simply stupid.

Comment by Aaron3 on Priming and Contamination · 2007-10-10T04:35:18.000Z · LW · GW

What can we do about this? Can we reduce the effects of contamination by consciously avoiding contaminating input before making an important decision? Or does consciously avoiding it contaminate us?

Comment by Aaron3 on 9/26 is Petrov Day · 2007-09-26T19:33:02.000Z · LW · GW

Oops, as a correction to my previous comment, that should be "ground radars." "Ground satellites" is just an oxymoron.

Comment by Aaron3 on 9/26 is Petrov Day · 2007-09-26T19:22:42.000Z · LW · GW

"He sent messages declaring the launch detection a false alarm, based solely on his personal belief that the US did not seem likely to start an attack using only five missiles."

According to the Wikipedia article, Petrov claimed that he had other reasons for believing it was a false alarm: lack of corroborating evidence from ground satellites and the fact that the detection technology was new and immature.

Comment by Aaron3 on Semantic Stopsigns · 2007-08-24T21:27:54.000Z · LW · GW

Oops, that should be "non-falsifiability," not "non-falsification."

Comment by Aaron3 on Semantic Stopsigns · 2007-08-24T21:25:48.000Z · LW · GW

Interesting post. It strikes me that semantic stopsigns join adoration of mystery and non-falsification as survival tricks acquired by story memes when curiosity -- and other stories -- threatened their existence.

Comment by Aaron3 on Conservation of Expected Evidence · 2007-08-13T18:07:00.000Z · LW · GW

One minor correction, Eliezer: the link to your essay uses the text "An Intuitive Expectation of Bayesian Reasoning." I think you titled that essay "An Intuitive EXPLANATION of Bayesian Reasoning." (I am 99.9999% sure of this, and would therefore pay especial attention to any evidence inconsistent with this proposition.)
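
A toy illustration of that last parenthetical (the likelihoods below are made up purely to show the asymmetry, not taken from the essay):

```python
# Start 99.9999% confident the title says "Explanation". Suppose evidence E is
# ten times likelier if I'm wrong than if I'm right (made-up likelihoods).
prior = 0.999999
p_e_given_right, p_e_given_wrong = 0.01, 0.10

p_e = p_e_given_right * prior + p_e_given_wrong * (1 - prior)
posterior_e = p_e_given_right * prior / p_e                  # after seeing E
posterior_not_e = (1 - p_e_given_right) * prior / (1 - p_e)  # after not seeing E

print(f"P(E) = {p_e:.8f}")
print(f"posterior given E     = {posterior_e:.8f}")      # odds drop ~10x on the rare inconsistent E
print(f"posterior given not-E = {posterior_not_e:.8f}")  # barely moves when the expected thing happens
print(f"expected posterior    = {posterior_e * p_e + posterior_not_e * (1 - p_e):.8f}")  # equals the prior
```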