Comments

Comment by Lunin on The Great Filter is early, or AI is hard · 2014-09-05T16:28:59.049Z · LW · GW

There is a third alternative: the observable universe is limited, the probability of life arising from non-living matter is low, suitable planets are rare, and evolution doesn't directly optimize for intelligence. Civilizations advanced enough to build strong AI are probably just too rare to appear in our light cone.
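As a back-of-the-envelope illustration (every number below is an invented placeholder, not an estimate from the comment), multiplying a few independent rare factors quickly pushes the expected number of such civilizations below one:

```python
# Toy Drake-style product: all values are illustrative assumptions,
# chosen only to show how rare factors compound.

n_planets = 1e22          # rough planet count in the observable universe
p_abiogenesis = 1e-15     # life arising from non-living matter
p_suitable = 1e-3         # planet staying habitable long enough
p_intelligence = 1e-4     # evolution stumbling into general intelligence
p_ai_civ = 1e-2           # civilization advancing far enough to build strong AI

expected_civs = (n_planets * p_abiogenesis * p_suitable
                 * p_intelligence * p_ai_civ)
print(expected_civs)      # 0.01 -- plausibly fewer than one per light cone
```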

We could have already passed the 'Great Filter' by actually existing in the first place.

Comment by Lunin on Bayesianism for humans: prosaic priors · 2014-09-03T18:37:03.682Z · LW · GW

Some property X is 'banal' if X applies to a lot of people in a disappointingly mundane way, without any redeeming features that would make it rarer (and, hence, interesting). In other words, X is banal iff the base rate of X is high. Or, equivalently, the prior probability of X is high.
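A minimal numeric sketch of that reading (the probabilities are invented for illustration): when the base rate is high, even evidence that mildly favors not-X leaves the posterior high.

```python
def posterior(prior, p_e_given_x, p_e_given_not_x):
    """Bayes' rule: P(X | evidence E)."""
    num = prior * p_e_given_x
    return num / (num + (1 - prior) * p_e_given_not_x)

# 'Banal' X: base rate 0.9. Evidence E is twice as likely when X is
# false, yet the posterior barely moves below the prosaic prior.
print(posterior(0.9, 0.4, 0.8))  # ~0.82
```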

I doubt that (widespread belief) <=> (high prior probability). If a statement is believed and considered obvious by a lot of people, it can be genuinely obvious and true. Or it can be a wrong cached thought, or just a common misconception.

If you are proficient in some area, your sense of prior probabilities in that area is better than average, because you are already used to some of the field's counterintuitive facts, and you have some idea of what proper 'counterintuitiveness' looks like. In effect, you update your intuition. If you already believe in General Relativity, you will be more apt to believe counterintuitive facts from Quantum Mechanics. The same probably holds if you are on the right side of the Bell Curve.
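One way to cash out 'updating your intuition' is as a Beta-Bernoulli update on the rate at which counterintuitive claims in a field turn out true (the counts here are again purely illustrative):

```python
# Track how often counterintuitive claims in a field have panned out,
# as a Beta(alpha, beta) belief over that rate. Illustrative counts only.
alpha, beta_ = 1.0, 1.0          # uniform prior: no experience with the field

# Suppose 4 counterintuitive facts (time dilation, etc.) checked out
# and 1 didn't:
alpha += 4
beta_ += 1

print(alpha / (alpha + beta_))   # ~0.71 -- the next QM weirdness looks more credible
```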

By the way, the Boring Advice Repository seems not boring at all. It's more like a collection of lifehacks and advice aimed at blind spots and ugh fields.