Comments

Comment by Hans on Attention Lurkers: Please say hi · 2010-04-19T08:56:46.387Z · LW · GW

Hi. I've made a few posts here and there, but have mostly been lurking lately.

Comment by Hans on Let Them Debate College Students · 2009-09-11T00:56:44.034Z · LW · GW

I know that the student would be studying a related field; that was not the point. I, as a hypothetical viewer, would not care what exactly the grad student was studying; I would care that he was only a 20-year-old graduate student still studying at a university (one that I would assume to be populated with liberal professors).

"Winners don't win by playing dumb."

And that is why I don't get this proposal. It is assumed that this college student would absolutely destroy the creationist debater and persuade the open-minded, objective audience through sheer, well, persuasiveness. But the audience, unless already completely in favor of evolution, is at least sympathetic to the creationist and interested in their views. Sending a student signals that this experienced debater and high-status leader of a movement merits no better an opponent than a wet-behind-the-ears, inexperienced student. Doubting listeners will dismiss that implication out of hand, a priori; they will think it condescending to send someone like that to debate someone like this, which it is: condescending to the creationist, but especially to the audience. They will then attach less weight to any arguments, however persuasive, that the student makes.

Biasing your audience against you before the debate has even started is not a viable tactic.

Comment by Hans on Let Them Debate College Students · 2009-09-10T22:59:42.337Z · LW · GW

Okay, but he's clearly young. I don't see how sending a low-status person to debate a high-status person could ever convince the adherents of the high-status person.

Comment by Hans on Let Them Debate College Students · 2009-09-10T19:31:23.276Z · LW · GW

So I've turned on the TV to watch a debate on evolution and creationism on CNN (or Fox News). The creationists have sent an older, respectable-looking gentleman in a suit, Bible in hand. The evolutionists have sent a scrappy-looking college kid in jeans, barely out of his diapers and studying something fancy-shmancy at the University of Liberal Professors, Berkeley.

A priori, whose side will I be on?

How many people will think: "Is this the best guy the evolutionists have to offer?"

Comment by Hans on Misleading the witness · 2009-08-11T13:29:13.176Z · LW · GW

As previous comments have said, it would be possible to sell the 15% chance for anything up to $150k. Once people realise that the 15% chance is a liquid asset, I'm sure many will change their minds and take it instead of the $500.

What does this mean? If the 15% chance is made liquid, that removes nearly all of the risk of taking that chance. This leads me to believe that people pick the $500 because they are, quite simply, (extremely) risk-averse. Other explanations (diminishing marginal utility of money, the $1 million actually having negative utility, etc.) are either wrong, or they are not a large factor in the decision-making process.
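As a quick, hedged bit of arithmetic behind the $150k figure (assuming a risk-neutral buyer who values the gamble at its expected value):

$$0.15 \times \$1{,}000{,}000 = \$150{,}000$$

Once the gamble can actually be sold for something close to that, picking the $500 means giving up roughly $149,500 of real, immediately available money, which is hard to explain by anything other than strong risk aversion.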

Comment by Hans on Why Real Men Wear Pink · 2009-08-08T21:27:45.654Z · LW · GW

Yeah, what he said, except that I couldn't find the words to explain it in English.

Comment by Hans on Why Real Men Wear Pink · 2009-08-08T17:18:08.854Z · LW · GW

They emphasize the legs and the thighs, and create a more "female" body posture.

Comment by Hans on Why Real Men Wear Pink · 2009-08-07T10:11:33.152Z · LW · GW

What a fascinating case of parallel evolution: As the cicada has a life cycle of 17 years (a prime number) to avoid predators with shorter life cycles, so too does the common or garden nerd choose clothes that are fashionable only once every 17 years, to minimize overlap with other, dangerous fashions.

Comment by Hans on The Hero With A Thousand Chances · 2009-07-31T09:35:45.516Z · LW · GW

"someone who's never heard of X (in this case the many-worlds interpretation of quantum mechanics and the anthropic principle) isn't going to have a clue what the hell you're talking about."

Yeah, that must be why I didn't understand anything. But I got the Tolkien reference!

Comment by Hans on Shut Up And Guess · 2009-07-22T23:35:13.454Z · LW · GW

Indeed. As a consequence, once you can narrow the answer down to two or three choices, you're always better off guessing.

Comment by Hans on Shut Up And Guess · 2009-07-22T23:03:49.095Z · LW · GW

It sounds like your fellow students understood the concept of a guessing penalty, but did not realise that the guessing penalty was too low in this case. One approach to convince them might have been:

Assume you get -0.0001 points for an incorrect guess. Obviously, you should answer every question, because the penalty for guessing is so low. Now assume that the guessing penalty is -20 points. Just as obviously, you shouldn't guess. At what penalty would you be indifferent between guessing and not guessing? At -1 point: if you guess on two questions where you've narrowed the answer down to a 50/50 chance, on average one guess is correct and the other is not, and your expected score is 0. In this case the penalty is only -0.5, half of the -1 indifference point, so you should always guess.

NB: At my university, multiple-choice exams always feature four possible answers, and you lose 0.33 points for an incorrect guess. Every student understands this concept perfectly. If they had to take your exam, they would've guessed every single time. It's strange to see that there are universities where the guessing penalty is not well calibrated; it seems like an elementary thing to get right.
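A minimal sketch of the expected-value calculation behind all this (assuming, as the indifference argument above implies, that a correct answer is worth +1 point; the function name and everything other than the -0.5, -1, and -0.33 penalties is just illustrative):

```python
def expected_guess_score(num_choices: int, penalty: float) -> float:
    """Expected score of a blind guess among `num_choices` equally likely
    remaining answers, where a correct answer scores +1 and a wrong
    answer scores -penalty."""
    p_correct = 1.0 / num_choices
    return p_correct * 1.0 - (1.0 - p_correct) * penalty

# A 50/50 guess with the exam's -0.5 penalty has positive expected value,
# so you should always guess:
print(expected_guess_score(2, 0.5))   # 0.25
# The break-even penalty for a 50/50 guess is -1:
print(expected_guess_score(2, 1.0))   # 0.0
# A blind guess among four choices with a -0.33 penalty is roughly break-even:
print(expected_guess_score(4, 0.33))  # ~0.0025
```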

Comment by Hans on Sunk Cost Fallacy · 2009-05-14T12:21:34.137Z · LW · GW

Seth Godin has a few examples of sunk costs. I believe these examples better represent true sunk costs than some of the examples given here (such as the movie ticket).

For example, suppose you have paid $50 for a Bruce Springsteen ticket, after searching long and hard for one this cheap. Suddenly, somebody offers you $500 for the ticket. Do you sell it? Keeping the ticket now effectively costs you $500, and you would never have paid $500 for a ticket in the first place.

Comment by Hans on Of Gender and Rationality · 2009-04-20T00:53:22.766Z · LW · GW

What do the women reading this post think of these statements?

As a man, I often find myself thinking the same thing; however, I have yet to meet a woman who does.

Comment by Hans on Sunk Cost Fallacy · 2009-04-16T00:23:26.923Z · LW · GW

Yes, and if there were a utility lever that you could pull to gain utility, you would spend your entire life pulling the lever. But there isn't. And you cannot teleport, nor will you be able to in the foreseeable future. So Alicorn will have to continue taking the burden of travel into account when deciding whether or not to visit a place he would like to have visited.

Comment by Hans on Extreme Rationality: It's Not That Great · 2009-04-13T10:14:30.862Z · LW · GW

I second that. Here in the LW/OB/sci-fi/atheism/cryonics/AI... community, many of us fit quite a few stereotypes. I'll summarize them in one word that everybody understands: we're all nerds.* This means our lives and personalities introduce many biases into our way of thinking, and these often preclude discussions about acting rationally in interpersonal situations such as sales, dating, etc., because we don't have much experience in those fields. Anything that bridges this gap would be extremely useful.

*This is not a value judgment. And not everybody conforms to this stereotype. I know, I know, but this is not the point. I'm talking averages here.

Comment by Hans on Extreme Rationality: It's Not That Great · 2009-04-13T09:55:51.635Z · LW · GW

But many poor and middle-class people also believe that they can never become rich (except by winning the lottery), because the only ways to become rich are crime, fraud, or inheritance. And this leads them to underestimate the value of hard work, education, and risk-taking.

The median rationalist will perform better than these cynics. But his average wealth will also be higher, assuming he accurately assesses his chances of becoming successful.

Comment by Hans on Sunk Cost Fallacy · 2009-04-12T21:59:46.680Z · LW · GW

Another reason for honoring the sunk cost of the movie ticket (related to avoiding regret) is that you know yourself well enough to realize you often make mistakes. There are many irrational reasons why you might not want to see the movie after all. Maybe you're unwilling to get up and go because you feel a little tired after eating too much. Maybe a friend who has already seen the movie discourages you from going, even though you know your tastes in movies don't always match. Maybe you're a little depressed and distracted by work/relationship/whatever problems. Etc.

For whatever reason, your past self chose to buy the ticket, and your present self does not want to see the movie. Your present self has more information, but this extra information is of dubious quality and is not always relevant to the decision. Still, it influences your state of mind, and you know that. How do you know which self is right? You don't, until after you've seen the movie. The marginal cost, in terms of mental discomfort, of seeing the movie and not liking it is usually smaller than the marginal cost of staying home and wondering what a great movie it could have been.

The reasoning behind this trivial example can easily be adapted to sunk cost choices in situations that do matter.

The sunk cost fallacy is easy to understand and to point out to others, but I caution against using it too often. The point of the fallacy is to show that only future costs and benefits matter when making a decision. This is true, but in reality those costs and benefits (and especially their probabilities) are hard to define. It is not clear whether the extra information that was received after 'sinking' the cost has an impact on the cost and benefit probabilities. You also know that, in any case, if the decision to sink the cost in the first place was the right one after all, the decision to continue is even more rational as a large part of the cost has already been spent. You can go see a movie for free that other people still have to pay for.

Comment by Hans on Toxic Truth · 2009-04-11T21:28:36.096Z · LW · GW

Your post definitely illustrates your point, by misleading otherwise well-informed LW readers for at least a few paragraphs.*

Therefore, I believe it's a useful post. However, as you can see in the comments, the temptation to write lame follow-up jokes is just too big. Don't expect too much serious discussion here.

*unless they were previously familiar with the joke, of course.

Comment by Hans on Toxic Truth · 2009-04-11T21:23:13.981Z · LW · GW

Those people died after ingesting impure DHMO, which is 'watered down' by relatively harmless minerals, making it fatal only after ingesting large amounts. 100% pure, distilled DHMO is actually extremely dangerous even in small quantities, as it leaches essential nutrients from your body through a nefarious process called 'reverse osmosis'.

[EDIT: this is actually not true, according to the wisdom of the interwebs. Thank you, extremely expensive European public school system, for filling my young impressionable mind with this untrue factoid. Nevertheless, the dangers and risks of DHMO ingestion remain poorly understood.]

Comment by Hans on Real-Life Anthropic Weirdness · 2009-04-07T17:33:11.697Z · LW · GW

I get your point now. But all we need to know is whether P(L|H) > P(L|~H)*.

If this is the case, then if an extremely unlikely event L (P(L|~H) -> 0) happens to you, this evidently increases the chance that you're in a holodeck simulation. In the formula, P(H|L) approaches (almost) 1 as P(L|~H) approaches zero. The unlikelier the event (amazons on unicorns descending from the heavens to take you to the land of bread and honey), i.e. the larger the ratio of P(L|H) to P(L|~H), the larger the probability that you're experiencing a simulation.

This is true as long as P(L|H) > P(L|~H). If L is a mundane event, P(L|H) = P(L|~H) and the formula reduces to P(H|L) = P(H). If L is so supremely banal that P(L|~H) > P(L|H), the occurrence of L actually decreases the chance that you're experiencing a holodeck simulation.

Indeed, I believe that was the point of the original post.

The core assumption remains, of course, that you're more likely to win the lottery when you're experiencing a holodeck simulation than in the real world (P(L|H) > P(L|~H)).
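For reference, a sketch of the formula being appealed to here (just the standard form of Bayes' theorem, with H and L following the comment's notation):

$$P(H \mid L) = \frac{P(L \mid H)\,P(H)}{P(L \mid H)\,P(H) + P(L \mid \neg H)\,P(\neg H)}$$

As P(L|~H) approaches zero, the second term in the denominator vanishes and P(H|L) approaches 1; when P(L|H) = P(L|~H), the likelihoods cancel and P(H|L) = P(H), matching the two cases described above.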

I'm not well-versed in Bayesian reasoning, so correct me if I'm wrong. Your posts have definitely helped to clarify my thoughts.

*I don't know how to type the "not"-sign, so I'll use a tilde.

Comment by Hans on Real-Life Anthropic Weirdness · 2009-04-07T12:02:16.431Z · LW · GW

The person controlling the holodeck (who presumably designed the simulation) needs to know the probability. But the person being simulated, who experiences winning the lottery, does not need to know anything about the inner workings of his (simulated) world. For the experience to seem real enough, it might even be best not to know every detail of what's going on.

Comment by Hans on Real-Life Anthropic Weirdness · 2009-04-07T11:00:22.874Z · LW · GW

I interpreted the last statement as follows:

IF you assign a probability higher than 10^(-8) to the hypothesis that you are in a holodeck

AND you win the lottery (which had a probability of 10^(-8) or thereabouts)

THEN you have good reason to believe you're in a holodeck, because you've had such improbable good fortune.

Correct me if I'm wrong on this.
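A hedged numeric illustration of that inference (the 10^(-8) lottery odds are from the interpretation above; the 10^(-4) prior on the holodeck hypothesis, and the assumption that a holodeck simulation makes the win nearly certain, are made up for the example):

$$P(H \mid L) = \frac{P(L \mid H)\,P(H)}{P(L \mid H)\,P(H) + P(L \mid \neg H)\,P(\neg H)} \approx \frac{1 \times 10^{-4}}{1 \times 10^{-4} + 10^{-8} \times (1 - 10^{-4})} \approx 0.9999$$

So even a modest prior on the holodeck hypothesis gets pushed to near-certainty by an event that improbable.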

Comment by Hans on Real-Life Anthropic Weirdness · 2009-04-06T10:22:30.965Z · LW · GW

Not really. Think of Nozick's experience machine. If you were to use the machine to simulate yourself in a situation extremely close to the center of the singularity, would you also give yourself the looks of Brad Pitt and the wealth of Bill Gates?

a) Would this not make the experience feel so 'unreal' that your simulated self would have trouble believing it's not a simulation, and therefore not enjoy the simulation at all? In constructing the simulation, you need to define how many positive attributes you can give your simulated self before it realizes that its situation is so improbable that it must be a simulation. I'd use caution and not make my simulated self too 'lucky.'

b) More importantly, you may believe that a) doesn't apply, and that your simulated self would take the blue pill, and willingly choose to continue to live in the simulation. Even then, having great looks and great wealth would probably distract you from creating the singularity. All I'd care about is the singularity, and I'd design the simulation so that I have a comfortable, not too distracting life that would allow me to focus maximally on the singularity, and nothing else.

Comment by Hans on Why Support the Underdog? · 2009-04-05T11:50:30.200Z · LW · GW

I read your comment and I immediately wanted to vote up Marshall's original comment. After all, he's the underdog being criticized and chased away by the founder and administrator of this blog.

In the end, I didn't, probably for equally irrational reasons.

Comment by Hans on Closet survey #1 · 2009-03-15T12:07:39.337Z · LW · GW

Voted up from -1 because I want you to clarify. Do you believe that bisexuality is ubiquitous in women, while present but not ubiquitous in men? Or that it is completely absent in men, but present though not ubiquitous in women? Or some other combination of absent, present, or ubiquitous in each sex?

Comment by Hans on The Apologist and the Revolutionary · 2009-03-12T01:30:05.602Z · LW · GW

Actually, the trick worked, but the effects had worn off by the time you wrote this message, which is why you deny that your opinion on the AI issue was completely reversed, in a shocking aha moment, for a brief ten minutes at least. Remember to videotape yourself next time.

Comment by Hans on Failed Utopia #4-2 · 2009-01-21T16:16:21.000Z · LW · GW

I really hope (perhaps in vain) that humankind will be able to colonize other planets before such a singularity arrives. Frank Herbert's later Dune books have as their main point that a Scattering of humanity throughout space is needed, so that no single event can cause its extinction. An AI that screws up (such as this one) would be exactly such an event.

Comment by Hans on Failed Utopia #4-2 · 2009-01-21T11:32:52.000Z · LW · GW

Wow - that's pretty f-ed up right there.

This story, however, makes me understand your idea of "failed utopias" a lot better than when you just explained them. Empathy.