Comments

Comment by Anti-reductionist on Crisis of Faith · 2008-10-10T23:43:39.000Z · LW · GW

Many in this world retain beliefs whose flaws a ten-year-old could point out

Very true. Case in point: the belief that "minimum description length" or "Solomonoff induction" can actually predict anything. Choose a language that can describe MWI more easily than Copenhagen, and they say you should believe MWI; choose a language that can describe Copenhagen more easily than MWI, and they say you should believe Copenhagen. I certainly could have told you that when I was ten...
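The language-dependence complaint above can be made concrete with a toy sketch (my construction, not anyone's actual formalism): give each hypothesis a description length in bits under two hypothetical "languages", and the minimum-description-length verdict flips with the language. The bit counts here are made up for illustration.

```python
# Toy illustration: the MDL winner depends entirely on which
# description language you choose. Lengths are invented numbers.
lang_a = {"MWI": 5, "Copenhagen": 40}   # a language that describes MWI compactly
lang_b = {"MWI": 40, "Copenhagen": 5}   # a language that describes Copenhagen compactly

def mdl_winner(lang):
    """Return the hypothesis with the shortest description length."""
    return min(lang, key=lang.get)

print(mdl_winner(lang_a))  # -> MWI
print(mdl_winner(lang_b))  # -> Copenhagen
```

The standard reply, for what it's worth, is that any two universal languages differ by at most a constant overhead; the comment's point is that the constant can be made as large as you like.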

Comment by Anti-reductionist on Make an Extraordinary Effort · 2008-10-08T02:29:04.000Z · LW · GW

...expand itself exponentially taking over nearby space uploading all of humanity into a simulation

Ah yes, there's nothing wrong with murdering people as long as you name video game characters after them.

Comment by Anti-reductionist on My Bayesian Enlightenment · 2008-10-05T17:49:35.000Z · LW · GW

What's your justification for having P(she says "at least one is a boy" | 1B,1G) = P(she says "at least one is a girl" | 1B,1G)? Maybe the hypothetical mathematician is from a culture that considers it important to have at least one boy. (China was like that, IIRC.)
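The objection can be checked numerically. Here is a small Monte Carlo sketch (mine, not from the thread) where the announcement policy is an explicit parameter: the probability that a mother of one boy and one girl mentions the boy. The posterior P(two boys | "at least one is a boy") moves with that policy, which is exactly the point being raised.

```python
import random

def simulate(p_say_boy_given_mixed, trials=200_000, seed=0):
    """Estimate P(two boys | she says 'at least one is a boy').

    p_say_boy_given_mixed: chance she mentions the boy when she has
    one boy and one girl. With two boys she always says it; with two
    girls she never does.
    """
    rng = random.Random(seed)
    said_boy = both_boys = 0
    for _ in range(trials):
        kids = [rng.choice("BG") for _ in range(2)]
        boys = kids.count("B")
        if boys == 2:
            says_boy = True
        elif boys == 1:
            says_boy = rng.random() < p_say_boy_given_mixed
        else:
            says_boy = False
        if says_boy:
            said_boy += 1
            both_boys += boys == 2
    return both_boys / said_boy

print(simulate(0.5))  # unbiased announcer: about 1/2
print(simulate(1.0))  # always mentions a boy if she has one: about 1/3
```

A culture that insists on reporting boys (policy near 1.0) gives the textbook 1/3 answer; a symmetric announcer gives 1/2. The likelihood assumption is doing the work.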

Comment by Anti-reductionist on Beyond the Reach of God · 2008-10-04T16:53:31.000Z · LW · GW

Summary: "Bad things happen, which proves God doesn't exist." Same old argument that atheists have thrown around for hundreds, probably thousands, of years. The standard rebuttal is that evil is Man's own fault, for abusing free will. You don't have to agree, but at least quit pretending that you've proven anything.

Comment by Anti-reductionist on Above-Average AI Scientists · 2008-09-29T02:33:57.000Z · LW · GW

TGGP: I'm not an epiphenomenalist. My guess is that if a non-conscious version of me were created (if such a thing is possible), it would claim to have been wrong all along and become an Eliezer follower.

A reductionist explanation for everything would, to avoid having anything irreducible, have to be infinitely deep; either the theory itself would be never-ending, or it would be self-referential. Is that what you believe?

Comment by Anti-reductionist on Above-Average AI Scientists · 2008-09-28T23:52:56.000Z · LW · GW

Caledonian: I don't claim that reductions aren't often possible or useful; that claim would just be stupid :) Rather, what I oppose is reductionism, the dogmatic belief that the Standard Model can explain everything. (Never mind that it can't even explain all of known physics...) Besides, I'm not nearly conceited enough to think of myself as the best.

Besides, if you aren't a p-zombie, then it's patently obvious why reductionism is flat-out wrong: you can't build a consciousness out of quarks and such (as they are described by current physics, anyway) any more than you could build a house out of water (one that would stay up at standard temperature and pressure; sorry, I suck at analogies). Consciousness isn't an "emergent" property.

Comment by Anti-reductionist on Above-Average AI Scientists · 2008-09-28T19:15:55.000Z · LW · GW

Ah, here we go again with the "I'm so smart because I believe in a meaningless universe". And as usual, a creationist is brought out as a straw man. Non-reductionists always have to be judged according to the worst that can be dredged up from their ranks... but of course, bringing up the fact that Marx, Lenin, and Stalin were all staunch reductionists would just be going off topic.

There is nothing "rational" about the particular brand of religious beliefs espoused by Eliezer. So-called "rationalists" love to point to Occam's Razor, which can actually support anything one wants just by choosing an appropriate definition of the word "simple". Or, if they're more mathematically sophisticated, they'll put lipstick on that pig and use Solomonoff induction instead, which once again can give anything one wants a high prior just by choosing the reference language.

Since there exists a universal Turing machine on which the bitstring "0" emulates a universe just like our own, except that the Earth is actually flat and people only think it's round because of a massive conspiracy, a "rationalist" would have the right to assign at least a 50% prior to that hypothesis if he wanted to. And that probability is never going to decrease, since P(people say Earth is round | Earth is round) = P(people say Earth is round | massive conspiracy). To claim that such a language is too "complex" would just be begging the question.
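The "probability is not going to decrease" step is just two-hypothesis Bayes with equal likelihoods, and it checks out arithmetically. A minimal sketch (my own, with made-up numbers for the likelihoods):

```python
# When a datum is equally likely under both hypotheses, updating on it
# leaves the odds unchanged, so the prior alone decides the posterior.

def posterior(prior_h1, like_h1, like_h2):
    """Two-hypothesis Bayes update; returns P(H1 | datum)."""
    joint1 = prior_h1 * like_h1
    joint2 = (1 - prior_h1) * like_h2
    return joint1 / (joint1 + joint2)

# P(people say round | Earth is round) = P(people say round | conspiracy) = 0.99
p = posterior(prior_h1=0.5, like_h1=0.99, like_h2=0.99)
print(p)  # -> 0.5: the observation moves nothing
```

Whether the 50% prior itself is defensible is the actual point of contention; the update step, at least, is uncontroversial.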