Comments

Comment by albatross on Free to Optimize · 2009-01-02T18:02:45.000Z · LW · GW

I find the parallel with what we want from government help kind of interesting. Because I'm about 99% certain that I'd rather have fixed rules about how people get help (if you're unemployed, you get $X per week for N weeks maximum; if you're seriously poor, you qualify for $Y per week under qualifying conditions Z, etc.) than have some government employee deciding, on a per-case basis, how much I deserved, or (worse) trying to improve me by deciding whether I should be given $X per week, or whether that might just encourage me to laze around the house for too long.

The parallel isn't perfect--bureaucracies, like markets and legal systems, end up being more like some kind of idiot-savant AI than like some near-omniscient one. But I think there is a parallel there--we'd probably mostly prefer consistent, understandable rules for our safety nets or whatever, rather than some well-meaning powerful person trying to shape us for our own good.

Comment by albatross on Joy in Discovery · 2008-03-21T16:54:09.000Z · LW · GW

Does this mean all the Wikipedia entries on science need spoiler alerts?

Honestly, I've had the experience of knowing something nobody else does for a while (though in cryptography, not something world-shaking involving tides or planets), and it's kind of a cool feeling. I think part of this is anticipation of improved status, but a bigger part, at least for me, is that this gives me a way to measure myself against something external. If I'm discovering/inventing stuff that nobody else has managed to discover/invent, this gives me a sense that I'm doing a good job.

I agree with the above comment that our motivations for stuff like this are mixed; I love solving the puzzle, I get a bigger charge out of it when the puzzle is hard (but not so hard that I can't get anywhere on it and give up instead), I like the status of being the guy who did something cool, I like the knowledge that I know something nobody else does (and if I died right now, maybe nobody would figure this out for many more years), and I like the way of measuring myself against the best other people can do. And there are probably other sources of motivation for trying to discover new stuff, invent new stuff, understand things nobody else has ever understood, etc.

Comment by albatross on Conjunction Fallacy · 2007-09-19T13:45:39.000Z · LW · GW

The interesting thing to me is the thought process here, as I also knew what was being tested and corrected myself. But the intuitive algorithm for answering the question is to translate "which of these statements is more probable" into "which of these stories is more plausible." And adding detail adds plausibility to the story; this is why you can have a compelling novel in which the main character does some incomprehensible thing at the end, which makes perfect sense in the sequence of the story.

The only way I can see to consistently avoid this error is to map the problem into the domain of probability theory, where I know how to compute an answer, and then map that answer back to the story.
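A minimal sketch of that mapping in Python, using the usual Linda-the-bank-teller framing and numbers I've made up for illustration (none of this comes from the post): the conjunction rule says P(A and B) = P(A) * P(B | A), and since P(B | A) can't exceed 1, adding detail can only lower the probability even as it makes the story more plausible.

```python
# Toy version of "map the story into probability theory": for any events A
# and B, P(A and B) = P(A) * P(B | A), and P(B | A) <= 1, so the conjunction
# can never be more probable than either conjunct alone.
# Event names and numbers are made up purely for illustration.

p_teller = 0.05                 # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.80  # assumed P(feminist | bank teller)

p_teller_and_feminist = p_teller * p_feminist_given_teller

assert p_teller_and_feminist <= p_teller
print(p_teller, p_teller_and_feminist)  # 0.05 vs roughly 0.04 -- the detailed story is less probable
```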

Comment by albatross on Bayesian Judo · 2007-08-01T00:45:11.000Z · LW · GW

Of course, you may not be invited back, if you offend them badly enough....

Comment by albatross on Universal Fire · 2007-04-30T14:20:42.000Z · LW · GW

I recall Vinge saying something along the lines of "I needed an explanation for why everyone didn't hit the singularity" in some interview or something, but I thought the zones were pretty clearly modeled on the complexity hierarchy.

Think of consciousness and FTL travel as each needing certain algorithms to run. The local laws of physics bound what computing devices can be built, in a way that determines which algorithms can be run. In the Unthinking Depths, no computing devices can exist on which consciousness algorithms can run. (Or maybe it just gets successively harder for consciousness algorithms to work.) I kept thinking of the Transcend as where the algorithms were linear time, the Beyond as polynomial with increasing exponents, and the Slow Zone (for FTL) or Unthinking Depths (for consciousness) as exponential.

The big idea behind A Deepness in the Sky was that with limits on computing power and other technology (the "failed dreams"), you're caught in endless Motie-style cycles of collapse and recovery (or sometimes just collapse). All IMO. (Does Vinge ever comment on blogs?)

I keep thinking that in Dies the Fire (I've only read 1.5 books in the series), someone should start using compressed air tanks to store unlimited amounts of energy. I mean, the pressure gradient never gets above some maximum, right?
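A back-of-the-envelope sketch of that arithmetic in Python, with numbers I've made up (a 10-atm pressure cap, a one-cubic-meter tank, ideal gas expanded isothermally--none of this comes from the books): if pressure is capped, the energy per tank is bounded, but the total just scales linearly with tank volume.

```python
import math

# Rough isothermal estimate of the energy stored in a compressed-air tank:
#   W ~= P * V * ln(P / P0)
# This ignores the work spent pushing back the atmosphere and any heat
# losses, so it's an optimistic upper bound. Numbers are assumptions.

P0 = 101_325.0    # ambient pressure, Pa
P_cap = 10 * P0   # assumed in-universe pressure ceiling, Pa
V_tank = 1.0      # tank volume, m^3

def stored_energy_joules(p, v, p0=P0):
    """Ideal isothermal expansion work from pressure p down to p0."""
    return p * v * math.log(p / p0)

w = stored_energy_joules(P_cap, V_tank)
print(f"{w / 1e6:.2f} MJ per cubic meter at a 10 atm cap")  # ~2.33 MJ
# Doubling the tank volume doubles the energy, so a pressure cap limits
# energy density, not total storage.
```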