Comments

Comment by Nate_Gabriel on 2014 Survey Results · 2015-01-04T06:24:40.086Z · LW · GW

P Supernatural: 6.68 ± 20.271 (0, 0, 1) [1386]

P God: 8.26 ± 21.088 (0, 0.01, 3) [1376]

(Format: mean ± standard deviation (25th, 50th, 75th percentile) [number of respondents]; probabilities are in percent.)

The question for P(Supernatural) explicitly said "including God." So either LW assigns a median probability of at least one in 10,000 that God created the universe and then did nothing, or there's a bad case of the conjunction fallacy.
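To make the implied inequality explicit (my framing, not part of the original comment): since the question wording folds God into the supernatural, each coherent respondent's P(God) can be at most their P(Supernatural), and medians preserve that pointwise ordering, so the reported medians should obey the same inequality:

```latex
% The question wording makes "God" a special case of "Supernatural",
% so each respondent i should satisfy
\[ P_i(\mathrm{God}) \;\le\; P_i(\mathrm{Supernatural}). \]
% Medians preserve pointwise dominance, so across respondents we should see
\[ \operatorname{median} P(\mathrm{God}) \;\le\; \operatorname{median} P(\mathrm{Supernatural}), \]
% whereas the reported medians run the other way: 0.01\% > 0\%.
```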

Comment by Nate_Gabriel on Rationality Jokes Thread · 2014-12-19T21:37:32.830Z · LW · GW

An infinite number of mathematicians walk into a bar. It collapses into a gravitational singularity.

Comment by Nate_Gabriel on When should an Effective Altruist be vegetarian? · 2014-11-24T03:12:24.814Z · LW · GW

I tried something vaguely similar with completely different assumptions. I basically ignored the number of animal deaths in favor of minimizing the amount of animal torture. The whole thing was based on how many animals it takes before empathy kicks in, rather than an actual utility comparison.

I instinctively distrust animal-to-human utility conversions, but the ideal version of your method is better than the ideal version of mine. I do recommend that meat eaters do what I did to establish an upper bound, though. It might even convince someone to change their behavior, since it works solely by convincing people that they already hold preferences that favor eating less meat.

Comment by Nate_Gabriel on Open thread, Oct. 27 - Nov. 2, 2014 · 2014-10-27T23:37:29.821Z · LW · GW

Do you think we currently need more inequality, or less?

Comment by Nate_Gabriel on Open thread, Oct. 27 - Nov. 2, 2014 · 2014-10-27T13:52:51.275Z · LW · GW

Compared to technological progress, there has been little or no social/political progress since the mid-18th century - if anything, there has been a regression

Regression? Since the 1750s? I realize Europe may be unusually bad here (at least, I hope so), but it took until 1829 for England to abolish the husband's right to punish his wife however he wanted.

Comment by Nate_Gabriel on What false beliefs have you held and why were you wrong? · 2014-10-18T21:11:39.791Z · LW · GW

I once walked around a university campus convincing people that it's impossible to see the Moon during daylight hours. I think it was about 2/3 who believed me, at least until I pointed up.

Comment by Nate_Gabriel on What false beliefs have you held and why were you wrong? · 2014-10-17T01:44:41.614Z · LW · GW

Just that moment. I definitely didn't follow any of its implications. (Other than "if I say this, then people will react as if I said an obviously true thing.")

Comment by Nate_Gabriel on What false beliefs have you held and why were you wrong? · 2014-10-16T21:50:41.473Z · LW · GW

I once believed that six times one is one.

I don't remember how it came up in conversation, but for whatever reason numbers became relevant and I clearly and directly stated my false belief. It was late, we were driving back from a long hard chess tournament, and I evidently wasn't thinking clearly. I said the words "because of course six times one is one." Everyone thought for a second and someone said "no it's not." Predictable reactions occurred from there.

The reason I like the anecdote is that I reacted exactly the same way I would today if someone corrected me when I said that six times one is six. I thought the person who corrected me must be joking; he knows math and couldn't possibly be wrong about something that obvious. A second person said that he was definitely not joking. I thought back to the Sequences, specifically the thing about what evidence could convince me I'm wrong about basic arithmetic. I ran through some math terminology in my head: of course six times one is one; any number times one is one. That's what a multiplicative identity means. In my head it was absolutely clear that 6×1=1, that this is required for what I know of math to fit together, and that anything else is completely logically impossible.

It probably took a good fifteen seconds from being called out on it to getting appropriately embarrassed.

This anecdote is now my favorite example of the important lesson that from the inside, being wrong feels exactly like being right.

Comment by Nate_Gabriel on Questions on Theism · 2014-10-09T05:00:18.122Z · LW · GW

The main prediction that comes to mind is that if Christianity is true, one would expect substantially more miracle claims by Christians (legitimate claims plus false ones) than by any other religion (false claims only).

This also assumes there isn't a saturation point where people only want to talk about so many miracles. (Ignoring buybuydandavis' point, which probably interacts with this one in unfortunate ways.) If people only forward X annoying chain emails per month, you'd expect X from each religion. The best we can hope for is the true religion having on average slightly more plausible claims, since some of its miracles are true.

Comment by Nate_Gabriel on Questions on Theism · 2014-10-09T04:47:11.978Z · LW · GW

It wasn't actually a muscular condition. My friend is surprisingly unwilling to spread this around, and she only told me under the extreme circumstances of my telling her I might be about to become an atheist. I wanted to change enough details that if she read this on the Internet she wouldn't know it was about her.

Comment by Nate_Gabriel on Questions on Theism · 2014-10-09T03:03:18.825Z · LW · GW

I have done this. The most impressive-sounding one happened to a friend of mine who had formerly been an athlete. She had to withdraw from sports for a year because of an unexpected muscular condition. (If this is obviously medically wrong, it's probably because I changed details for privacy.) As you probably expect, that year involved plenty of spiritual growth that she attributes to having had to quit sports.

At the end of that time, a group of church people laid hands on her and prayed, she felt some extreme acceleration in her heart rate, and her endurance was back the next time she tested it. A doctor confirmed that the muscular thing was completely gone, and she's been physically active ever since.

Now obviously this isn't bulletproof. You just need her to spontaneously recover at some point before the laying on of hands. (I have no idea how likely this would be; probably not very.) The rest is exactly the sort of thing that might happen regardless of whether there's a miracle. But it still sounds really impressive. If I weren't actively trying not to spin it to sound even more miraculous, it'd sound even more impressive.

But this is just the most miraculous-sounding story I've heard from a source I trust. I only know so many people. This account is probably well within the distribution of how miraculous anecdotes can get. I'd feel weird saying "you spontaneously got better a few months earlier, and so did anyone else with a similar story."

Comment by Nate_Gabriel on Questions on Theism · 2014-10-09T02:40:59.858Z · LW · GW

It's appointed. Doesn't mean the guy who did the appointing can't make exceptions if he feels like it.

Comment by Nate_Gabriel on Open thread, September 22-28, 2014 · 2014-09-23T21:17:58.070Z · LW · GW

Well no, because I doubt he'd share the downvoter's objective. (I assume. I wasn't following the kerfuffle.) To conclude that he would, you have to transplant his methods onto a forum setting but not his goals. Which is a weird level to model at.

Comment by Nate_Gabriel on Anthropics doesn't explain why the Cold War stayed Cold · 2014-08-23T18:33:55.895Z · LW · GW

Anthropics fails to explain King George because it's double-counting the evidence. The same does not apply to any extinction event, where you have not already conditioned on "I wouldn't exist otherwise."

If it's a non-extinction nuclear exchange, where population would be significantly smaller but nonzero, I'm not confident enough in my understanding of anthropics to have an opinion.

Comment by Nate_Gabriel on Anthropics doesn't explain why the Cold War stayed Cold · 2014-08-22T04:04:52.840Z · LW · GW

I still don't think George VI having more siblings is an observer-killing event.

Since we now know that George VI didn’t have more siblings, we obtain

Probability(You exist [and know that George VI had exactly five siblings] | George VI had more than five siblings) = 0

I assume you mean "know" in the usual way: not 100% certainty, just that I saw it on Wikipedia and it's now a fact I'm aware of. Then P(I exist with this mind state | George VI had more than five siblings) isn't zero; it's some number based on my prior for Wikipedia being wrong.

So my mind state is more likely in a five-sibling world than a six-sibling one, but using it as anthropic evidence would just be double-counting whatever evidence left me with that mind state in the first place.
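A toy sketch of that double-counting point (all numbers here are mine and purely illustrative): once my mind state already includes having read the Wikipedia page, the Bayesian update is done, and treating existence-with-that-mind-state as further anthropic evidence would apply the same likelihood ratio a second time.

```python
# Toy sketch of the double-counting point. All numbers are assumptions for
# illustration; none come from the original discussion.
wiki_error = 1e-4  # assumed prior that Wikipedia misstates the sibling count
p_five = 0.5       # assumed flat prior: exactly five siblings vs. more than five

# My mind state includes "read 'five siblings' on Wikipedia":
p_mind_given_five = 1 - wiki_error  # the page is right
p_mind_given_more = wiki_error      # the page is wrong

# Bayes' rule: P(five siblings | my mind state)
posterior_five = (p_mind_given_five * p_five) / (
    p_mind_given_five * p_five + p_mind_given_more * (1 - p_five)
)
print(posterior_five)  # ~0.9999, driven entirely by the Wikipedia observation

# Updating again on "a mind like mine exists" with the same likelihoods would
# multiply the identical ratio in a second time: that is the double-counting.
```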

Comment by Nate_Gabriel on Rationality Quotes April 2014 · 2014-04-02T13:30:26.034Z · LW · GW

I don't think it's lumping everything together. It's criticizing the rule "Act on what you feel in your heart." That applies to a lot of people's beliefs, but it certainly isn't the epistemology of everyone who doesn't agree with Penn Jillette.

The problem with "Act on what you feel in your heart" is that it's too generalizable. It proves too much, because of course someone else might feel something different and some of those things might be horrible. But if my epistemology is an appeal to an external source (which I guess in this context would be a religious book but I'm going to use "believe whatever Rameses II believed" because I think that's funnier), then that doesn't necessarily have the same problem.

You can criticize my choice of Rameses II, and you probably should. But now my epistemology is based on an external source and not just my feelings. Unless you reduce me to saying I trust Rameses because I Just Feel that he's trustworthy, this epistemology does not have the same problem as the one criticized in the quote.

All this to say: Jillette is not unfairly lumping things together, and there exist types of morality/epistemology that can be wrong without this argument applying to them.

Comment by Nate_Gabriel on Why are Harvard's alumni so wealthy? · 2014-03-17T05:14:34.086Z · LW · GW

What we need to do is convince Harvard to run a controlled experiment: accept half their students as usual, and the other half at random from the applicant pool. We'd have an answer within a couple of decades.

Comment by Nate_Gabriel on Mental Subvocalization --"Saying" Words In Your Mind As You Read · 2014-02-15T02:46:11.003Z · LW · GW

I always do. Mentally but not muscularly, and I can kind of suppress it if I consciously try. It is indeed the limiting factor on my reading speed.

Comment by Nate_Gabriel on Tulpa References/Discussion · 2014-01-04T00:03:32.259Z · LW · GW

Is it possible for a tulpa to have skills or information that the person doing the emulating doesn't? What happens if you play chess against your tulpa?

Comment by Nate_Gabriel on Open Thread, November 23-30, 2013 · 2013-11-26T23:40:42.296Z · LW · GW

I just realized it's possible to explain people picking the specks in the torture vs. dust specks question using only scope insensitivity and no other mistakes. I'm sure that's not original, but I bet it's what's going on in the head of a normal person when they pick the specks.
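A toy model of that explanation (all numbers are mine and purely illustrative): if the felt badness of N dust specks grows sublinearly, say logarithmically, rather than linearly, then the specks never feel worse than the torture no matter how large N gets.

```python
import math

# Toy model: illustrative numbers only, not from the original discussion.
speck_harm = 1e-6   # assumed disutility of one dust speck
torture_harm = 1e6  # assumed disutility of 50 years of torture
n_specks = 10**30   # stand-in for 3^^^3 (the real number is unimaginably larger)

# A consistent utilitarian sums linearly: the specks dominate.
linear_total = n_specks * speck_harm
print(linear_total > torture_harm)  # True

# Scope insensitivity: felt badness saturates (modeled here as log growth),
# so the specks never feel worse than the torture, however many there are.
felt_total = speck_harm * math.log1p(n_specks)
print(felt_total > torture_harm)    # False, so people pick the specks
```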

Comment by Nate_Gabriel on Open Thread, November 23-30, 2013 · 2013-11-26T23:28:02.128Z · LW · GW

Not very tempted, actually. In this hypothetical, since I'm not feeling empathy, the murder wouldn't make me feel bad, and I'd get money. But who says I have to decide based on how stuff makes me feel?

I might feel absolutely nothing for this stranger and still think "Having the money would be nice, but I guess that would lower net utility. I'll forgo the money because utilitarianism says so." That's pretty much exactly what I think when donating to the AMF, and I don't see why a psychopath couldn't have that same thought.

I guess the question I'm getting at is, can you care about someone else and their utility function without feeling empathy for them? I think you can, and saying you can't just boils down to saying that ethics are determined by emotions.

Comment by Nate_Gabriel on Open Thread, November 23-30, 2013 · 2013-11-25T19:02:23.579Z · LW · GW

I had actually been wondering about this recently. People define a psychopath as someone with no empathy, and then jump to "therefore, they have no morals." But it doesn't seem impossible to value something or someone as a terminal value without empathizing with them. I don't see why you couldn't even be a psychopath and an extreme rational altruist, though you might not enjoy it. Is the word "psychopath" being used two different ways (meaning a non-empathic person and meaning a complete monster), or am I missing a connection that makes these the same thing?

Comment by Nate_Gabriel on What do we already have right? · 2013-11-25T03:47:44.193Z · LW · GW

Well, it doesn't establish that induction is always valid, so I guess we might not really be disagreeing. But, pragmatically, everyone basically has to assume that it usually works, or at least is likely to work in any particular case. I think it's a good enough heuristic to be called a rational principle that people already have down.

Comment by Nate_Gabriel on What do we already have right? · 2013-11-25T01:29:44.679Z · LW · GW

I'm sure there are philosophers who say they don't, but I guarantee you they act as if they do. Even if they don't know anything about electronics, they'd still expect the light to come on when they flip the switch.

Comment by Nate_Gabriel on Open Thread, November 15-22, 2013 · 2013-11-21T14:27:19.192Z · LW · GW

Standard young-Earther responses, taken from when I was a young-Earth creationist.

Round Earth: Yes. You sort of have to stretch to interpret the Bible as saying the Earth is round or flat, so it's not exactly a contradiction. Things like "the four corners of the Earth" are obvious metaphor.

Animals on the boat: The "kinds" of animals (the creationist term "baramin," coined from Hebrew roots) don't correspond exactly to what we call species. There were fewer animals on the ark than 2 × (number of modern species); this is considered a sufficient answer even though it probably isn't. I don't know exactly what level of generality the baramin are supposed to be at; I guess it depends on how much evolution the particular creationist is willing to accept. They'll typically use the example of dogs and wolves being the same "kind," but if that's the level of similarity we're talking about, there will still be an awful lot of kinds.

Amount of water: The Earth used to be a lot smoother, with shallower oceans, lower mountains, and so on, so it could be covered with a more reasonable amount of water. We know this because in the genealogies some guy named his son after the fact that "in his day the Earth was divided." (The word for divided, Peleg, means earthquake or cataclysm or something. The same verse also doubles as an account of the tectonic plates being moved around.)

I don't agree with these, but I thought that to avoid strawmanning I should post the responses I would have used. Not that they're much better than the straw version, but this is the kind of thing that would have been said by at least one YEC.

Comment by Nate_Gabriel on Open Thread, November 15-22, 2013 · 2013-11-18T20:11:00.354Z · LW · GW

Ideally, how people feel about things would be based on real-world consequences, and a chance of someone being not dead is usually strictly better than the alternative. But I can see how, for a small enough chance of resurrection, that value could be outweighed by the cost to everyone else of holding on. I still hope that isn't what's going on in this case, though. That would require people to be feeling "I'd rather have this person permanently dead, because at least then I know where I stand."

Comment by Nate_Gabriel on Open Thread, November 15-22, 2013 · 2013-11-17T23:38:36.781Z · LW · GW

That's...that's terrible. That it would feel worse to have a chance of resurrection than to have closure. It sounds depressingly plausible that that's people's true rejection, but I hope it's not.

Religion doesn't have the same problem, and in my experience it's because of the certainty. People believe themselves to be absolutely certain in their belief in the afterlife. So there's no closure problem, because they simply know that they'll see the person again. If you could convince people that cryonics would definitely result in them being resurrected together with their loved ones, then I'd expect this particular problem to go away.

Comment by Nate_Gabriel on Rationality Quotes November 2013 · 2013-11-16T09:38:26.519Z · LW · GW

And I'm not sure it's a mistake. If you're getting your information in a context where you know it's meant completely literally and nothing else (e.g., Omega, lawyers, Spock), then yes, it would be wrong. In normal conversation, people may (sometimes but not always; it's infuriating) use "if" to mean "if and only if." As for this particular case, somervta is probably completely right. But I don't think it's conducive to communication to accuse people of bias for following Grice's maxims.

Comment by Nate_Gabriel on Open Thread, September 30 - October 6, 2013 · 2013-10-03T08:15:35.711Z · LW · GW

Algebra.

Comment by Nate_Gabriel on What did governments get right? Gotta list them all! · 2013-09-19T07:04:45.810Z · LW · GW

Of each other, I think it means.

Comment by Nate_Gabriel on Open thread, September 9-15, 2013 · 2013-09-13T21:57:53.801Z · LW · GW

Of the set of all possible actions that you haven't denied doing, you've only done a minuscule percentage of them.

Of the times that you deny having done something, you lie some non-trivial percent of the time.

Therefore, your denial is evidence of guilt.
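Making the syllogism's Bayesian skeleton explicit with toy numbers (all assumed, none from the comment): if denial is more likely from the guilty than from the innocent, conditioning on a denial raises the probability of guilt, even though it stays minuscule.

```python
# Toy numbers, purely illustrative; not from the original comment.
p_did = 1e-9             # prior: fraction of all possible actions you've done
p_deny_given_did = 0.5   # assumed: the guilty deny a nontrivial fraction of the time
p_deny_given_not = 1e-4  # assumed: you rarely bother denying things you never did

# Bayes' rule: P(did it | denied it)
posterior = (p_deny_given_did * p_did) / (
    p_deny_given_did * p_did + p_deny_given_not * (1 - p_did)
)
print(posterior > p_did)  # True: under these premises, denial is evidence of guilt
print(posterior)          # ~5e-6: still minuscule
```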

Comment by Nate_Gabriel on The Ultimate Newcomb's Problem · 2013-09-10T17:26:09.961Z · LW · GW

This post almost convinced me. I was thinking about it in terms of a similar algorithm, "one-box unless the number is obviously composite." Your argument convinced me that you should probably one-box even if Omega's number is, say, six. (Even leaving aside the fact that I'd probably mess up more than one in a thousand questions that easy.) For the reasons you said, I tentatively think that this algorithm is not actually one-boxing and is suboptimal.

But the algorithm "one-box unless the numbers are the same" is different. If you were playing the regular Newcomb game, and someone credibly offered you $2M if you two-box, you'd take it. More to the point, you presumably agree that you should take it. If so, you are now operating on an algorithm of "one-box unless someone offers you more money."

In this case, it's just like they are offering you more money: if you two-box, it's composite 99.9% of the time, and you get $2M.

The one thing we know about Omega is that it picks composites iff it predicts you will two-box. In the Meanmega example, it picks the numbers so that you two-box whenever it can, which here means whenever the lottery number is composite. So in all those cases you get the $2M that you would have gotten anyway. Huh. And $1M from one-boxing if the lottery number is prime. Whereas if you one-box, you get $1M 99.9% of the time, plus a lot of money from the lottery anyway. OK, so you're completely right. I might have to think about this more.
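To make that arithmetic concrete, here is a toy expected-value sketch (assumptions are mine: the lottery number is composite with probability q, a composite lottery number pays $2M, and Box B holds $1M iff Omega predicted one-boxing):

```python
# Toy comparison of "always one-box" vs. "one-box unless the numbers are the
# same", against Meanmega. All assumptions are for illustration only.
q = 0.9  # assumed P(lottery number is composite)

# Always one-box: Box B's $1M is yours, and the lottery pays on its own schedule.
ev_always_one_box = 1_000_000 + q * 2_000_000

# One-box unless the numbers match: Meanmega matches them whenever the lottery
# number is composite, inducing you to two-box and forfeit Box B's $1M.
ev_match_strategy = q * 2_000_000 + (1 - q) * 1_000_000

print(ev_always_one_box - ev_match_strategy)  # == q * 1_000_000: always one-boxing wins
```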

Assuming Manfred is completely right, how many non-identical numbers should it take before you decide you're not dealing with Meanmega and can start two-boxing when they're the same?

Comment by Nate_Gabriel on Polyhacking · 2013-08-25T08:01:11.237Z · LW · GW

As cool as that term sounds, I'm not sure I like it. I think it too strongly reinforces ideas like the superiority of rationalists over non-rationalists. Even in cases where rationalists are just better at things, it seems to encourage Us-versus-Them thinking to an unnecessary degree.

Also, assuming there is a good enough reason to convince me that the term should be used, why is transhumanism-and-polyamory the set of powers defining the non-muggles? LessWrong isn't that overwhelmingly poly, is it?