Comments

Comment by Tenek on Harry Potter and the Methods of Rationality discussion thread, February 2015, chapter 111 · 2015-02-25T19:08:49.671Z · LW · GW

So, random possible goals... make invincible, intelligent horcrux. Provoke Harry to attempted murder.

I am somewhat suspicious of announcing one's own vulnerability as opposed to just killing her in the half-second it takes for AK.

Comment by Tenek on Harry Potter and the Methods of Rationality discussion thread, February 2015, chapter 111 · 2015-02-25T18:57:46.147Z · LW · GW

Fantasy / Double Shyamalan

Comment by Tenek on Harry Potter and the Methods of Rationality discussion thread, part 18, chapter 87 · 2012-12-24T15:04:15.852Z · LW · GW

So you're replacing 'lose a portion of magic' with 'risk being sent to Azkaban'? It changes the cost of binding but it certainly doesn't remove it.

Comment by Tenek on Harry Potter and the Methods of Rationality discussion thread, part 17, chapter 86 · 2012-12-19T18:14:21.950Z · LW · GW

OK. I'm thinking of this in terms of Harry being able to see Bellatrix because it's his cloak. Harry should then be able to see the other Harrys because they're also wearing his cloak, unless the Cloak distinguishes between "master" and "time-travelled master", or the "loan" part is significant enough that Harry wouldn't be able to see someone under the cloak if they just pick it up without him expressly loaning it to them. If that counts as "stealing" and transfers ownership then you could "loan" the Cloak to everyone and they'd never be able to take it from you.

There's something unsettling about a cloak that hides you from everyone, except its Master, unless the Master is also you.

Comment by Tenek on Harry Potter and the Methods of Rationality discussion thread, part 17, chapter 86 · 2012-12-19T17:02:15.086Z · LW · GW

Makes sense. Six possible one-directional (A loves B, B loves A, etc) relationships that can be either present or not, so 2^6 = 64. Each person has 3 graphs where they're disconnected but the others are not (A loves B, B loves A, A and B love each other), and one where there are no connections at all. 64 - 3*3 - 1 = 54.
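The 54 figure can be verified by brute force: enumerate all 2^6 subsets of the six directed "loves" relations and keep the graphs in which nobody is completely disconnected. A minimal sketch (names are my own):

```python
from itertools import product

people = ("A", "B", "C")
# the 6 possible one-directional "loves" relations among 3 people
edges = [(a, b) for a in people for b in people if a != b]

count = 0
for present in product([False, True], repeat=len(edges)):
    graph = [e for e, p in zip(edges, present) if p]
    involved = {x for e in graph for x in e}
    # keep only graphs where nobody is completely disconnected
    if all(p in involved for p in people):
        count += 1

print(count)  # 54
```

This matches the inclusion-exclusion count above: 64 total graphs, minus 3 nonempty pair-only graphs per excluded person, minus the empty graph.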

Comment by Tenek on Harry Potter and the Methods of Rationality discussion thread, part 17, chapter 86 · 2012-12-19T16:54:24.821Z · LW · GW

On further consideration of the Moody fight - as soon as Harry walked into the office, shouldn't he have seen all his invisible copies as well? Ch. 56 -

Bellatrix was still transparent within the Cloak, but to Harry she was no longer hidden, he knew that she was there, as obvious to him as a Thestral. For Harry had only loaned his Cloak, not given it; and he had comprehended and mastered the Deathly Hallow that had been passed down through the Potter line.

Comment by Tenek on Harry Potter and the Methods of Rationality discussion thread, part 17, chapter 86 · 2012-12-19T16:49:58.646Z · LW · GW

He gets the mechanism wrong though. In a real fight, Moody kills him or at the very least takes his toys away if he's needed alive. There wouldn't be any time-turned copies of him in the first place.

Comment by Tenek on Harry Potter and the Methods of Rationality discussion thread, part 17, chapter 86 · 2012-12-19T16:45:23.816Z · LW · GW

Just because he has no reason to tell Snape doesn't mean he has any particular reason to fear the knowledge getting out. He's already earned his spot on the Supremely Dangerous Wizards list.

Kinda makes me wonder why he didn't conceal the fact that he has a magical eye at all, though.

Comment by Tenek on Harry Potter and the Methods of Rationality discussion thread, part 17, chapter 86 · 2012-12-17T18:15:31.699Z · LW · GW

It should already be pretty high though - Harry even points it out at the time (Rule 1 of Unforgivable Curse Safety) and Quirrell equivocates it away by mixing up etiquette rules with safety rules. That might just as easily have ended with "I just shot Bahry in the face", considering how fast the spell must be going - probably <100 ms to recognize that he can't dodge in time and to push him away instead.

Comment by Tenek on Harry Potter and the Methods of Rationality discussion thread, part 17, chapter 86 · 2012-12-17T17:57:56.265Z · LW · GW

I don't. GL's canon strategy provides a perfectly reasonable explanation for all his supposed feats, and I didn't see anything in 86 to suggest this is going to be a major divergence. I expect it'll go something like the other CoS reference - at some future point we'll get a "gee, looks like he's just a fraud, moving on" with a possible joke about him teaching Defense.

Comment by Tenek on Rationality Quotes: May 2011 · 2011-05-04T17:42:11.106Z · LW · GW

Cached Thoughts

Comment by Tenek on Mental Metadata · 2011-03-30T17:14:36.850Z · LW · GW

You might want to check out TVTropes.

Comment by Tenek on You're in Newcomb's Box · 2011-02-06T15:21:06.882Z · LW · GW

Maybe Prometheus could predict your decision by running a simulation of you and putting "you" in that situation.

Comment by Tenek on Some rationality tweets · 2010-12-30T15:13:34.300Z · LW · GW

There's a difference between learning a skill and learning a skill while remaining human. You need to decide which you want.

Learning how to transplant a kidney is much easier when you have a few dozen people to experiment on. (I think that was the idea, anyways...)

Comment by Tenek on Folk grammar and morality · 2010-12-20T15:53:57.522Z · LW · GW

Is there any dialect that is readily understandable to everyone who speaks English?

Comment by Tenek on The Problem With Trolley Problems · 2010-10-26T20:08:28.137Z · LW · GW

I'm not using it as an example of why they're good. I'm offering it as an example because it's relevant to the topic.

Adding a cost to circumvent the law makes you less likely to do so, though. If you keep hiring people who are decidedly suboptimal because you have to use a lousy approximation of whatever characteristic you want, you might give up on it.

I get that you would rather, given that you're going to be rejected for your age/skin color/gender/etc, be told why. But if you want to reduce the use of those criteria, then banning it will stop the people who care a small amount (i.e. not enough to bother getting around it).

Comment by Tenek on The Problem With Trolley Problems · 2010-10-26T14:56:56.628Z · LW · GW

"I have no idea what criteria were used when I'm rejected for a job, and I'm not even seeing the jobs that never get posted because it's easier to hire someone you know than go through the formal process and jump through all its hoops." Maybe.

I don't think your views don't count - I was hoping that I'd gone to sufficient lengths to point out that while it might have just been bitterness, there was a substantial chance it wasn't. Maybe I underestimated the LW rationalist:racist ratio... actually, probably by a huge margin. %#$@.

So what would happen if you traded the kafkaesque life for the officially-banned screening methods? Would you rather have twice the number of job opportunities and lose 3/4 of them right away because you're ? Or would you rather that other people get rejected for them, if you don't personally have many of the 'undesirable' attributes?

Finally, let's go to story mode. A friend of mine applied for a job. They weren't allowed to ask her about her religion. But she has a name commonly found among members of a particular one. She got the job, and became the only employee (out of a couple dozen) not sharing the faith of the rest of them. So I guess they took a guess at her religion based on her name, and chose using that metric. I have no idea whether this is a success or failure of antidiscrimination laws. Without them, she'd have had no chance. With them, they tried anyways. But at least it was possible, even if she didn't fit in... and quit a few years later, when her husband got cancer and they blamed it on her not praying.

Comment by Tenek on The Problem With Trolley Problems · 2010-10-25T16:16:52.868Z · LW · GW

I've been rereading this comment for the past 10 minutes and I have no idea whether this is an (attempted) arm's-length assessment of discrimination law (I say attempted because of the "matters close to me" acknowledgement) or the bitter result of the author being turned down for a job. At first glance it looks like the latter, but this is exactly the sort of situation in which I would expect to see someone make a completely rational analysis and not pay any attention to how it's going to come across to someone who doesn't know you're not just another bigot. (I call it Larry Summers Syndrome.

http://en.wikipedia.org/wiki/Lawrence_Summers#Differences_between_the_sexes )

It's one thing to talk about "risk profiles" or "incentives" in general terms, but when you actually want to implement something, it becomes a particular incentive, and there is no a priori reason to assume the cost will outweigh the benefit. When you concentrate on the existence of a cost (or benefit) and ignore the magnitude, you start making statements like "[the Bush tax cuts] increased revenue, because of the vibrancy of these tax cuts in the economy". Similarly, if you try to transfer utility from group A to group B, group A is going to be upset and try to minimize their loss - that doesn't mean that group A is going to completely negate it, or that group B is going to be worse off.

Comment by Tenek on Rationality quotes: October 2010 · 2010-10-06T16:05:10.039Z · LW · GW

Well, Jack doesn't want any thinking at all, so I'm not sure if that's better or worse than fuzziness.

Comment by Tenek on The Irrationality Game · 2010-10-05T04:26:35.946Z · LW · GW

I would imagine not (99%), although it doesn't appear to be in common usage.

Comment by Tenek on The Irrationality Game · 2010-10-04T19:48:13.949Z · LW · GW

At that speed, you have less than 0.3 mm per clock cycle for your signals to propagate. Seems like you'd either need to make ridiculously tiny gadgets, or devote a lot of resources to managing the delays. Seems reasonable enough.
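The 0.3 mm figure follows directly from the speed of light. A quick sanity check, assuming the clock rate under discussion is 1 THz (which is what a 0.3 mm per cycle figure implies):

```python
# distance a signal can cover in one clock cycle,
# at the physical upper limit of the speed of light
c = 299_792_458        # speed of light in vacuum, m/s
clock_hz = 1e12        # assumed clock rate: 1 THz

distance_mm = c / clock_hz * 1000  # metres per cycle, converted to mm
print(f"{distance_mm:.3f} mm per cycle")  # 0.300 mm per cycle
```

Real signals in silicon propagate at a fraction of c, so the usable distance per cycle is even smaller.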

Comment by Tenek on The Irrationality Game · 2010-10-04T19:36:01.904Z · LW · GW

No. I intend to revive one. Possibly all four, if necessary. Consider it thawing technology so advanced it can revive even the pyronics crowd.

Comment by Tenek on The Irrationality Game · 2010-10-04T15:29:46.927Z · LW · GW

The pinnacle of cryonics technology will be a time machine that can at the very least, take a snapshot of someone before they died and reconstitute them in the future. I have three living grandparents and I intend to have four living grandparents when the last star in the Milky Way burns out. (50%)

Comment by Tenek on Against the standard narrative of human sexual evolution · 2010-07-23T17:41:51.637Z · LW · GW

How is WrongBot going to learn to think and write more skillfully by moving to a place that's collectively worse at doing so?

Comment by Tenek on Simplified Humanism, Positive Futurism & How to Prevent the Universe From Being Turned Into Paper Clips · 2010-07-22T20:43:04.459Z · LW · GW

That doesn't help maximize paperclips, though. If you make all decisions based on two criteria - paperclip count and emotions - then the only situation in which those decisions differ from what you would have decided based solely on paperclip count is one in which you choose an outcome with fewer paperclips but a better emotional result.

If you were to refuse my offer, you would not only be losing a paperclip now, but also increasing the likelihood that in the future, you will decide to sacrifice paperclips for emotion's sake. Perhaps you will one day build a paperclip-creator that creates one paperclip per second, and I will threaten to destroy a paperclip unless you shut it down. If you care too much about the threatened paperclip you might comply, and then where would you be? Sitting in an empty room where paperclips should have been.

Comment by Tenek on Simplified Humanism, Positive Futurism & How to Prevent the Universe From Being Turned Into Paper Clips · 2010-07-22T15:28:31.131Z · LW · GW

Would you trade those base emotions for a paperclip?

Comment by Tenek on Financial incentives don't get rid of bias? Prize for best answer. · 2010-07-15T14:21:02.900Z · LW · GW

Well, I would have done some research and gotten a warm fuzzy feeling out of expanding my knowledge, but if you're going to displace that motivation with only a chance at a measly $10 I guess it's not worth my time.

http://naggum.no/motivation.html

Comment by Tenek on A proposal for a cryogenic grave for cryonics · 2010-07-07T18:04:26.233Z · LW · GW

Then we can suggest that they're temporarily dead, but they're still dead, so it's a "grave". Religions have been saying that death is temporary for thousands of years anyways, it wouldn't be anything new.

Comment by Tenek on Taking the awkwardness out of a Prenup - A Game Theoretic solution · 2010-05-23T06:13:58.754Z · LW · GW

Because you need to know if they've made a commitment, and using old information can get you burned if, as stated, you pre-commit simultaneously.

Comment by Tenek on Taking the awkwardness out of a Prenup - A Game Theoretic solution · 2010-05-23T06:10:44.521Z · LW · GW

Then the statement of absolute trust is accounted for by the significant rate of mistakes people make.

Alternatively, you can make that statement as part of a strategy to maximize your expected return on a marriage - if the increase in marriage quality from placing absolute trust in your spouse is greater than the expected cost of being disadvantaged in the divorce negotiations (if your spouse turns out to be untrustworthy), then you might rationally do it anyways.

Comment by Tenek on Blame Theory · 2010-05-20T15:01:37.714Z · LW · GW

Guilt is an added cost to making decisions that benefit you at the expense of others. (Ideally, anyways.) It encourages people to cooperate to everyone's benefit. Suppose we have a PD matrix where the payoffs (yours, theirs) are:

(defect, cooperate) = (3, 0)
(defect, defect) = (1, 1)
(cooperate, cooperate) = (2, 2)
(cooperate, defect) = (0, 3)

Normally we say that 'defect' is the dominant strategy, since regardless of the other person's decision, your 'defect' option payoff is 1 higher than 'cooperate'.

Now suppose you (both) feel guilty about betrayal to the tune of 2 units:

(defect, cooperate) = (1, 0)
(defect, defect) = (-1, -1)
(cooperate, cooperate) = (2, 2)
(cooperate, defect) = (0, 1)

The situation is reversed - 'cooperate' is the dominant strategy. Total payoff in this situation is 4. Total payoff in the guiltless case is 2 since both will defect. In the OP $10-button example the total payoff is $-90, so people as a group lose out if anyone pushes the button. Guilt discourages you from pushing the button and society is better for it.
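The dominance flip can be checked mechanically. A minimal sketch (the payoff numbers are taken from the matrices above; the function names are my own):

```python
# payoffs[(my_move, their_move)] = my payoff, from the PD matrix above
base = {("D", "C"): 3, ("D", "D"): 1, ("C", "C"): 2, ("C", "D"): 0}

def with_guilt(payoffs, guilt):
    # subtract a guilt cost whenever you defect
    return {(me, them): p - (guilt if me == "D" else 0)
            for (me, them), p in payoffs.items()}

def dominant(payoffs):
    # a move that is strictly best regardless of the opponent's move
    for mine in ("C", "D"):
        other = "D" if mine == "C" else "C"
        if all(payoffs[(mine, t)] > payoffs[(other, t)] for t in ("C", "D")):
            return mine
    return None

print(dominant(base))                 # D
print(dominant(with_guilt(base, 2)))  # C
```

With a guilt cost of 2, defecting loses exactly enough that cooperation strictly dominates, matching the second matrix.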

Comment by Tenek on Beauty quips, "I'd shut up and multiply!" · 2010-05-07T15:34:54.130Z · LW · GW

We can tweak the experiment a bit to clarify this. Suppose the coin is flipped before she goes to sleep, but the result is hidden. If she's interviewed immediately, she has no reason to answer other than 1/2 - at this point it's just "flip a fair coin and estimate P(heads)". What information does she get the next time she's asked that would cause her to update her estimate? She's woken up, yes, but she already knew that would happen before going under and still answered 1/2. With no new information she should still guess 1/2 when woken up.