Posts

Thwarting a Catholic conversion? 2012-06-18T16:26:42.666Z
When None Dare Urge Restraint, pt. 2 2012-05-30T15:28:27.801Z
Non-theist cinema? 2012-01-08T07:54:12.190Z
The scope of "free will" within biology? 2011-06-29T06:34:09.175Z

Comments

Comment by Jay_Schweikert on Rationality Quotes December 2014 · 2014-12-18T05:02:23.228Z · LW · GW

It is wrong to put temptation in the path of any nation,
For fear they should succumb and go astray;
So when you are requested to pay up or be molested,
You will find it better policy to say: --

"We never pay any-one Dane-geld,
No matter how trifling the cost;
For the end of that game is oppression and shame,
And the nation that pays it is lost!"

--Rudyard Kipling, "Dane-Geld"

A nice reminder about the value of one-boxing, especially in light of current events.

Comment by Jay_Schweikert on Rationality Quotes December 2014 · 2014-12-03T04:40:11.923Z · LW · GW

All the logical work (if not all the rhetorical work) in “Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety” is being done by the decision about what aspects of liberty are essential, and how much safety is at stake. The slogan might work as a reminder not to make foolish tradeoffs, but the real difficulty is in deciding which tradeoffs are wise and which are foolish. Once we figure that out, we don’t need the slogan to remind us; before we figure it out, the slogan doesn’t really help us.

--Eugene Volokh, "Liberty, safety, and Benjamin Franklin"

A good example of the risk of reading too much into slogans that are basically just applause lights. Also reminds me of "The Choice between Good and Bad is not a matter of saying 'Good!' It is about deciding which is which."

Comment by Jay_Schweikert on Personal examples of semantic stopsigns · 2013-12-07T22:05:55.494Z · LW · GW

I'm torn on "it's complicated." Clearly, you're correct that it can function as a powerful semantic stopsign. But increasingly, I also find that it's actually an entirely appropriate and even useful response (or at least an initial response) to many questions, especially political/policy/legal/normative questions.

For example, imagine a poll asking American citizens the following question: "In one sentence, what would you say is the major problem with the American health care system?" Now imagine the people who respond with something like "It's complicated," and ask yourself whether these individuals might ultimately have something interesting and productive to say about health care (compared to the average responder).

Comment by Jay_Schweikert on 2013 Less Wrong Census/Survey · 2013-12-07T21:57:50.735Z · LW · GW

Answered every question to which I had an answer. I haven't spent much time on Less Wrong recently, but it's really pretty remarkable how just answering Less Wrong surveys causes me to think more seriously than just about anything else I come across in any given week.

Comment by Jay_Schweikert on Rationality Quotes September 2013 · 2013-09-22T17:45:50.652Z · LW · GW

Zortran, do you ever wonder if it's all just meaningless?

What's "meaningless?"

It's like... wait, really? You don't have that word? It's a big deal over here.

No. Is it a good word? What does it do?

It's sort of like... what if you aren't important? You know... to the universe.

Wait... so humans have a word to refer to the idea that it'd be really sad if all of reality weren't focused on them individually?

Kinda, yeah.

We call that "megalomania."

Well, you don't have to be a jerk about it.

Saturday Morning Breakfast Cereal

Comment by Jay_Schweikert on Using Evolution for Marriage or Sex · 2013-05-07T04:04:16.874Z · LW · GW

I'm not quite sure what to make of this post as a whole. I find myself appreciative of the general point, and a lot of it resonates with me, but I also agree that more precise sourcing would be desirable for such a controversial and empirically open-ended subject.

But the main reason I wanted to comment is that Bang With Friends seems like such an obvious and obviously value-adding concept that I was surprised I'd never heard of anything like it before. If I were in the position of looking for additional sexual partners right now (and if the privacy protections actually worked), I don't see any good reason not to use something like that. If people feel comfortable sharing on this subject, has anyone out there had positive experiences with this app or something like it?

Comment by Jay_Schweikert on Help us name the Sequences ebook · 2013-04-15T20:52:11.152Z · LW · GW

I like the general point about something catchy with pizzazz. "Being Less Wrong" is my favorite so far, but it could probably be improved on. "Winning: Theory and Practice" is also pretty good, though I wonder whether there's too much of an association between "winning" and Charlie Sheen. Maybe that's a silly concern, but we wouldn't want anyone to think this was just a joking reference to that.

Comment by Jay_Schweikert on Help us name the Sequences ebook · 2013-04-15T20:46:00.872Z · LW · GW

Is it helpful for the phrase "The Sequences" to appear in the title? My sense is that anyone who's already familiar enough with the Sequences to know what it means isn't going to need that phrase to be interested in the book, and that the phrase doesn't add much value for someone who's never heard of the Sequences before. It's sort of a weird word that doesn't immediately suggest anything about rationality.

The only people for whom it would add value would be those who (1) have at least sort of heard of the Sequences and are somewhat interested; (2) need to know that this ebook is about the Sequences to decide to read it; and (3) wouldn't understand that this was the Sequences ebook without that word in the title. I doubt that's a very large class, so my initial sense is that it's not necessary in the actual title. But then, that's just what occurred to me in the last 10 minutes, and the people who have thought about this more carefully may well have other reasons.

Comment by Jay_Schweikert on Rationality Quotes April 2013 · 2013-04-04T14:18:00.683Z · LW · GW

Jack Sparrow: [after Will draws his sword] Put it away, son. It's not worth you getting beat again.

Will Turner: You didn't beat me. You ignored the rules of engagement. In a fair fight, I'd kill you.

Jack Sparrow: Then that's not much incentive for me to fight fair, then, is it? [Jack turns the ship, hitting Will with the boom]

Jack Sparrow: Now as long as you're just hanging there, pay attention. The only rules that really matter are these: what a man can do and what a man can't do. For instance, you can accept that your father was a pirate and a good man or you can't. But pirate is in your blood, boy, so you'll have to square with that some day. And me, for example, I can let you drown, but I can't bring this ship into Tortuga all by me onesies, savvy? So, can you sail under the command of a pirate, or can you not?

--Pirates of the Caribbean

The pirate-specific stuff is a bit extraneous, but I've always thought this scene neatly captured the virtue of cold, calculating practicality. Not that "fairness" is never important to worry about, but when you're faced with a problem, do you care more about solving it, or arguing that your situation isn't fair? What can you do, and what can't you do? Reminds me of "What do I want? What do I have? How can I best use the latter to get the former?"

Comment by Jay_Schweikert on Rationality Quotes February 2013 · 2013-02-05T19:46:24.952Z · LW · GW

Well, but in the universe of the commercials, it clearly did, so long as you went to the appropriate expert.

Comment by Jay_Schweikert on Rationality Quotes January 2013 · 2013-01-29T18:30:59.335Z · LW · GW

And to think, I was just getting on to post this quote myself!

Comment by Jay_Schweikert on [minor] Separate Upvotes and Downvotes Implimented · 2013-01-29T16:20:51.561Z · LW · GW

Same thing happened to me, and I also had moved an article from Discussion to Main after it had gotten a lot of upvotes. So that's almost certainly the explanation.

Comment by Jay_Schweikert on Rationality Quotes January 2013 · 2013-01-16T18:25:32.793Z · LW · GW

[After analyzing the hypothetical of an extra, random person dying every second.] All in all, the losses would be dramatic, but not devastating to our species as a whole. And really, in the end, the global death rate is 100%—everyone dies.

. . . or do they? Strictly speaking, the observed death rate for the human condition is something like 93%—that is, around 93% of all humans have died. This means the death rate among humans who were not members of The Beatles is significantly higher than the 50% death rate among humans who were.

--Randall Munroe, "Death Rates"

Comment by Jay_Schweikert on Rationality Quotes January 2013 · 2013-01-10T17:16:22.317Z · LW · GW

Ah, okay, thanks for clarifying. In case my initial reply to MugaSofer was misleading, Dennett doesn't seem to be suggesting here that this is really what most theists believe, or that many theists would try to convert atheists with this tactic. It's more just a tongue-in-cheek example of what happens when you lose track of what concept a particular group of syllables is supposed to point at.

But I think there are a great many people who purport to believe in "God," whose concept of God really is quite close to something like the "anthropomorphic representation of natural processes, or of the universe and its inner workings." Probably not for those who identify with a particular religion, but most of the "spiritual but not religious" types seem to have something like this in mind. Indeed, I've had quite a few conversations where it became clear that someone couldn't tell me the difference between a universe where "God exists" and where "God doesn't exist."

Comment by Jay_Schweikert on Rationality Quotes January 2013 · 2013-01-10T15:53:33.113Z · LW · GW

Sorry, can you clarify what you mean here? None of what passes an ideological Turing test? Are you saying something like "theists erroneously conclude that the proponents of evolution must believe in God because evolutionists believe that evolution is what produced all creatures great and small"? What exactly is the mistake that theists make on this point that would lead them to fail the ideological Turing test?

Or, did I misunderstand you, and are you saying that people like Dennett fail the ideological Turing test with theists?

Comment by Jay_Schweikert on Rationality Quotes January 2013 · 2013-01-10T15:04:06.358Z · LW · GW

In large part, yes. This passage is in Dennett's chapter on "Belief in Belief," and he has an aside on the next page describing how to "turn an atheist into a theist by just fooling around with words" -- namely, that "if 'God' were just the name of whatever it is that produced all creatures great and small, then God might turn out to be the process of evolution by natural selection."

But I think there's also a more general rationality point about keeping track of the map-territory distinction when it comes to abstract concepts, and about ensuring that we're not confusing ourselves or others by how we use words.

Comment by Jay_Schweikert on Rationality Quotes January 2013 · 2013-01-09T16:39:07.096Z · LW · GW

Suppose you've been surreptitiously doing me good deeds for months. If I "thank my lucky stars" when it is really you I should be thanking, it would misrepresent the situation to say that I believe in you and am grateful to you. Maybe I am a fool to say in my heart that it is only my lucky stars that I should thank—saying, in other words, that there is nobody to thank—but that is what I believe; there is no intentional object in this case to be identified as you.

Suppose instead that I was convinced that I did have a secret helper but that it wasn't you—it was Cameron Diaz. As I penned my thank-you notes to her, and thought lovingly about her, and marveled at her generosity to me, it would surely be misleading to say that you were the object of my gratitude, even though you were in fact the one who did the deeds that I am so grateful for. And then suppose I gradually began to suspect that I had been ignorant and mistaken, and eventually came to the correct realization that you were indeed the proper recipient of my gratitude. Wouldn't it be strange for me to put it this way: "Now I understand: you are Cameron Diaz!"

--Daniel Dennett, Breaking the Spell (discussing the differences between the "intentional object" of a belief and the thing-in-the-world inspiring that belief)

Comment by Jay_Schweikert on Rationality Quotes January 2013 · 2013-01-09T16:22:41.270Z · LW · GW

I answered "rarely," but I should probably qualify that. I've been an atheist for about 5 years, and in the last 2 or 3, I don't recall ever seriously thinking that the basic, factual premises of Christianity were any more likely than Greek myths. But I have had several moments -- usually following some major personal failing of mine, or of others close to me -- where the Christian idea of man-as-fallen living in a fallen world made sense to me, and where I found myself unconsciously groping for something like the Christian concept of grace.

As I recall, in the first few years after my deconversion, this feeling sometimes led me to think more seriously about Christianity, and I even prayed a few times, just in case. In the past couple of years that hasn't happened; I understand more fully exactly why I'd have those feelings even without anything like the Christian God, and I've thought more seriously about how to address them without falling back on old habits. But certainly that experience has helped me understand what would motivate someone to either seek or hold onto Christianity, especially if they didn't have any training in Bayescraft.

Comment by Jay_Schweikert on Rationality Quotes January 2013 · 2013-01-09T16:10:02.995Z · LW · GW

Upvoted. I actually had a remarkably similar experience reading Lewis. Throughout college I had been undergoing a gradual transformation from "real" Christian to liberal Protestant to deist, and I ended up reading Lewis because he seemed to be the only person I could find who was firmly committed to Christianity and yet seemed willing to discuss the kind of questions I was having. Reading Mere Christianity was basically the event that let me give Christianity/theism one last look over and say "well said, but that is enough for me to know it is time to move on."

Comment by Jay_Schweikert on New censorship: against hypothetical violence against identifiable people · 2012-12-25T20:42:41.232Z · LW · GW

Is that your true rejection? That is, if this poll were posted to Main, and all readers were encouraged to answer, and the results came back essentially the same, would you then allow the results to influence what kind of policy to adopt? Or are you just sufficiently confident about the need for such a moderation policy that, absent clear negative consequences not previously considered, you'll implement it anyway?

I don't mean at all to suggest that the latter answer is inappropriate. Overall I trust your moderating judgment, and you clearly have more experience with and have thought more about LW's public image than probably anyone. If you decide the strong version of this policy is needed, notwithstanding disagreement from most LW members, I'm happy to give substantial deference to that decision. But does it matter either way whether this post is selectively read?

Comment by Jay_Schweikert on That Thing That Happened · 2012-12-18T17:37:04.956Z · LW · GW

"Sorry, that accusation expires after one use per conversation."

Comment by Jay_Schweikert on Rationality Quotes December 2012 · 2012-12-14T17:16:31.125Z · LW · GW

I don't have any previous experience with this sort of thing, but judging from what I hear and read, I'm supposed to be asking why all this is happening, and why it's happening to me. Honestly, those questions are about the farthest thing from my mind.

Partly, that’s because they aren't hard questions. Why does our world have gravity? Why does the sun rise in the East? There are technical answers, but the metaphysical answer is simple: that’s how reality works. So too here. Only in the richest parts of the rich world of the twenty-first century could anyone entertain the thought that we should expect long, pain-free lives. Suffering and premature death (an odd phrase: what does it mean to call death "premature"?) are constant presences in the lives of most of the peoples of the Earth, and were routine parts of life for generations of our predecessors in this country—as they still are today, for those with their eyes open. Stage 4 cancers happen to middle-aged men and women, seemingly out of the blue, because that's how reality works.

As for why this is happening to me in particular, the implicit point of the question is an argument: I deserve better than this. There are two responses. First, I don't—I have no greater moral claim to be free from unwanted pain and loss than anyone else. Plenty of people more virtuous than I am suffer worse than I have, and some who don't seem virtuous at all skate through life with surprising ease. Welcome to the world. Once again, it seems to me that this claim arises from the incredibly unusual experience of a small class of wealthy professionals in the wealthiest parts of the world today. We think we live in a world governed by merit and moral desert. It isn't so. Luck, fortune, fate, providence—call it what you will, but whatever your preferred label, it has far more to do with the successes of the successful than what any of us deserves. Aristocracies of the past awarded wealth and position based on the accident of birth. Today's meritocracies award wealth and position based on the accident of being in the right place at the right time. The difference is smaller than we tend to think. Once you understand that, it’s hard to maintain a sense of grievance in the face of even the ugliest medical news. I’ve won more than my share of life's lotteries. It would seem churlish to rail at the unfairness of losing this one—if indeed I do lose it: which I may not.

The second response is simpler; it comes from the movie "Unforgiven." Gene Hackman is dying, and says to Clint Eastwood: "I don't deserve this. To die like this. I was building a house." Eastwood responds: "Deserve’s got nothing to do with it."

That gets it right, I think. It's a messed-up world, upside-down as often as it's rightside up. Bad things happen; future plans (that house Hackman was building) come to naught. Deserve's got nothing to do with it.

--William J. Stuntz, discussing his cancer diagnosis

Apologies for the length, but I wanted to include the full substantive point and hated to snip lines here and there. For what it's worth, Prof. Stuntz was a devout Christian, and the linked post went on to discuss his theological views on why "something deep within us expects, even demands moral order—in a world that shouts from the rooftops that no such order exists." Obviously I draw a different conclusion about this conflict, but I still respect that he could take such an unflinching view of how morally empty nature really is.

Comment by Jay_Schweikert on By Which It May Be Judged · 2012-12-11T17:46:17.718Z · LW · GW

I don't think this works, because "fairness" is not defined as "divide up food equally" (or even "divide up resources equally"). It is the algorithm that, among other things, leads to dividing up the pie equally in the circumstances described in the original post -- i.e., "three people exactly simultaneously spot a pie which has been exogenously generated in unclaimed territory." But once you start tampering with these conditions -- suppose that one of them owned the land, or one of them baked the pie, or two were well-fed and one was on the brink of starvation, etc. -- it would at least be controversial to say "duh, divide equally, that's just what 'fairness' means." And the fact of that controversy suggests most of us are using "fairness" to point to an algorithm more complicated than "divide up resources equally."

More generally, fairness -- like morality itself -- is complicated. There are basic shared intuitions, but there's no easy formula for popping out answers to "fair: yes or no?" in intricate scenarios. So there's actually quite a bit of value in using words like "fair," "right," "better," "moral," "good," etc., instead of more concrete, less controversial concepts like "equal division" -- if you can show that even those broad, complicated concepts can be derived from physics+logic, then it's that much more of an accomplishment, and that much more valuable for long-term rationalist/reductionist/transhumanist/friendly-ai-ist/whatever goals.

At least, that's how I understand this component of Eliezer's project, but I welcome correction if he or others think I'm misstating something.

Comment by Jay_Schweikert on Participation in the LW Community Associated with Less Bias · 2012-12-10T00:02:14.736Z · LW · GW

Did you specifically think at the time "well, if 'married' and 'unmarried' were the only two possibilities, then the answer to the question would be 'yes' -- but Anne could also be divorced or a widow, in which case the answer would be 'no,' so I have to answer 'not enough information'"?

Not accusing you of dishonesty -- if you say you specifically thought of all that, I'll believe you -- but this seems suspiciously like a counter-factual justification, which I say only because I went through such a process. My immediate response on learning that I got the answer wrong was "well, 'unmarried' isn't necessarily coextensive with '~married'," except then I realized that nothing like this occurred to me when I was actually answering, and that if I had thought in precisely these terms, I would have answered 'yes' and been quite proud of my own cleverness.

Regardless, for any potential future purposes, this problem could be addressed by changing "is a married person looking at an unmarried person?" to "is a married person looking at someone who is not married?" Doesn't seem like there's any reasonable ambiguity with the latter.
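
To make the case analysis concrete, here's a minimal sketch of the enumeration (assuming the standard Jack/Anne/George formulation of the puzzle, and reading "unmarried" as simply "not married"):

```python
# Standard version of the puzzle: Jack is looking at Anne, and Anne is
# looking at George. Jack is married; George is not. Is a married person
# looking at someone who is not married? Enumerate Anne's two possible states.
jack_married, george_married = True, False

for anne_married in (True, False):
    # Each (looker, looked_at) pair: Jack -> Anne, Anne -> George.
    pairs = [(jack_married, anne_married), (anne_married, george_married)]
    answer = any(looker and not looked_at for looker, looked_at in pairs)
    print(f"Anne married: {anne_married} -> answer: {answer}")

# Both iterations print True, so the correct response is "yes" -- provided
# "unmarried" really is read as "not married" rather than "never married."
```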

Comment by Jay_Schweikert on Rationality Quotes December 2012 · 2012-12-01T21:09:40.280Z · LW · GW

Hail the heav'n-born Prince of Peace!
Hail the Sun of Righteousness!
Light and life to all He brings,
Ris'n with healing in His wings.
Mild He lays His glory by,
Born that man no more may die;
Born to raise the sons of earth,
Born to give them second birth.

--Hark! The Herald Angels Sing (traditionally, the third verse -- starts at 2:52 in the linked video)

An unusual choice, to be sure. But notwithstanding the obvious religious content, I actually find this piece of the hymn to be a beautiful expression of genuine transhumanist sentiment. We've previously discussed how rationalism doesn't seem to leave much room for "Glory be to Gauss in the Highest!", but even if the sentiment of "highest praise" is a little Dark Artsy, I find myself thinking of something like a Friendly AI Singularity when I hear these lines. Sung in the right way, the song can actually give me chills to a degree rivaling HP:MoR -- you know, that chapter. Just listen to it from that perspective and see if you don't find it inspiring.

I will note that I had a hard time finding a version of the song sung exactly how I wanted. It's usually performed slowly, often by a choir, whereas I imagine it brisk, sung by one person with a deep voice, and with strong accenting -- as in, "Mild he lays his glory by/Born that man no more may die/Born to raise the sons of earth . . ."

Comment by Jay_Schweikert on Rationality Quotes November 2012 · 2012-11-30T18:15:57.906Z · LW · GW

At the very least, even assuming there's no reason to worry about your own death, you would probably still care about the deaths of others -- at least your friends and family. Given a group of people who mutually value having each other in their lives, death should still be a subject of enormous concern. I don't grant the premise that we shouldn't be concerned about death even for ourselves, but I don't think that premise is enough to justify Epicurus's attitude here.

Of course, for most of human history, there genuinely wasn't much of anything that could be done about death, and there's value in recognizing that death doesn't render life meaningless, even if it's a tragedy. But today, when there actually are solutions on the table, this quote sounds more in complacency than acceptance. Upvoted though, because it points to an important cluster of questions that's worth untangling.

Comment by Jay_Schweikert on [Link] "An OKCupid Profile of a Rationalist" · 2012-11-14T15:48:02.714Z · LW · GW

This response is such a strawman! No one's arguing that "the right to conduct sexual activities in any way without being judged" is a "sacred value" that "overrides any consequentialist concern for actually producing more effective rationalists." If every other post by Eliezer made specific, detailed reference to his dalmatian fetish, or if SIAI had a specific section of their website listing the fetishes and relationship styles of all their members, then yes, that would likely be problematic -- because it would be seriously, actually distracting from important rationality work.

But that situation is miles away from one person's private dating profile listing some relatively harmless fetishes that only surprise people because we don't usually talk about that sort of thing publicly. There's just no reason to think that this is the serious concern you're making it out to be. Frankly, I'd say that posting and arguing about it here is likely to do (or has already done) more damage than the profile on its own was capable of. Seriously, look at the comments and the downvotes you're getting, and consider that maybe this isn't a productive fight to be having. Just let it go.

Comment by Jay_Schweikert on [Link] "An OKCupid Profile of a Rationalist" · 2012-11-14T04:01:24.780Z · LW · GW

So far, the evidence that this profile is a PR problem seems limited to a handful of negative comments on one Internet comment thread. Most of those comments are limited to the idea that the post is too boastful or too open, and thus unlikely to be successful in attracting women. And the same thread includes people with neutral or positive responses at roughly the same frequency (maybe a little lower, but the same order of magnitude). This evidence falls well below what I would consider sufficient to trot this issue out in public, much less to demand that Eliezer take down the profile.

Should we treat "the freedom to broadcast sexual weirdness" as a deontological good that simply cannot be balanced against PR concerns? No, probably not. But does it make sense to protect that freedom as a strong institutional value that can only be overcome for extremely important reasons? Yes, and I'm confident this profile doesn't rise to that level.

Also, this sentence--

The second is exactly analogous to an anti-abortion activist who opposes teaching birth control.

--makes very little sense to me.

Comment by Jay_Schweikert on [Link] "An OKCupid Profile of a Rationalist" · 2012-11-14T02:55:36.784Z · LW · GW

I'm pretty uncomfortable with... well, just about everything in this post.

First, even assuming that lots of commenters at Marginal Revolution "reacted negatively" to the profile, I find it hard to believe that it could really have much effect on the general LW project of "raising the sanity waterline." Fine, Eliezer talks about some personal things that most people wouldn't mention publicly and, surprise surprise, some people have a sharp reaction to that. But how many out there are really going to think "oh my, this 'rationality' business sounded okay, but now that I've seen this dating profile, not so much." Yes, I realize that's an exaggeration, and yes, I understand there can be small, even unconscious effects at the margins. But come on, anyone seriously put off by something this harmless probably wasn't a very promising rationalist anyway.

Second, who's to say whether the net effect is negative? This is just speculation on my part, but I imagine a lot of people who read something like this (even some of those who purport to act "shocked" or whatever) would actually think "wow, that sounds pretty cool -- wish my life were more like that." Just looking at the anecdotal evidence from the four comments quoted above, the first suggests he doesn't get laid, the second suggests he does, and the fourth notes that reading the profile helped uncover a source of pleasure the person hadn't noticed before. Only the third comment -- invoking the "cult leader" concept -- redounds to the detriment of the community itself. And lots of people are going to think that anyway, so I don't see much in the way of an additional problem here. If anything, maybe this makes us look cool and into kinky stuff?

Third, come on, is this really something that needs to get hashed out on a LW post? To whatever extent Eliezer wants privacy, can't we just let him date in peace? A sexual fetish isn't the kind of thing that truth can destroy (let's be honest, that section is 90% of the "controversy" here), and presumably, most of us are fighting for a world that's more open, more tolerant, and just straight-up more fun when it comes to sex and relationships. It would be pretty sad if, in pursuit of that goal, our community required its more prolific members to pretend to the contrary. If my expectations are wrong and this OKCupid profile really becomes a major problem, well, I guess Eliezer will have to decide how to deal with it. But I seriously doubt we're at that point, and I kind of hope we can just drop it.

Comment by Jay_Schweikert on 2012 Less Wrong Census/Survey · 2012-11-06T01:19:57.138Z · LW · GW

Took basically all of the survey except for the extra IQ tests. Thanks, Yvain! Looking forward to seeing the results.

Comment by Jay_Schweikert on Open Thread, November 1-15, 2012 · 2012-11-04T01:58:22.749Z · LW · GW

Thanks to everyone for all the answers. I'd say this one makes the most sense to me -- pretty quick to say and easily scalable for any number -- but I guess there's just not one, well-accepted convention.

Comment by Jay_Schweikert on Open Thread, November 1-15, 2012 · 2012-11-02T15:06:36.329Z · LW · GW

This is a random question, and I have poked around a bit on Google looking for the answer: what's the convention for pronouncing particular instances of Knuth's up-arrow notation? Like, if you had 3^^^3, how would you actually say that out loud? I always find myself stumbling through something like "three three-up-arrows three," but that seems terribly clunky. I also read somewhere that "3^^^3" would read as "three threes," which is more elegant, but doesn't seem to work when the numbers are different -- e.g., how would you say "3^^^4"? Anyway, I figured someone here would know.
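
For anyone unfamiliar with the notation, here's the standard recursive definition as I understand it -- one arrow is ordinary exponentiation, and each additional arrow iterates the level below it:

```latex
% Knuth's up-arrow notation: a single arrow is exponentiation, and each
% additional arrow applies the next-lower operator repeatedly.
\[
  a \uparrow b = a^{b}, \qquad
  a \uparrow^{n} b =
    \underbrace{a \uparrow^{n-1} \bigl( a \uparrow^{n-1} ( \cdots \uparrow^{n-1} a ) \bigr)}_{b \text{ copies of } a}
\]
% Example: 3^^^3 = 3^^(3^^3) = 3^^(3^(3^3)) = 3^^7,625,597,484,987.
```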

Comment by Jay_Schweikert on Proofs, Implications, and Models · 2012-10-31T16:44:54.810Z · LW · GW

Yeah, we were taught in basically the exact same way -- moving around different colored weights on plastic print-outs of balances. I'll also note that this was a public (non-magnet) school -- a reasonably good public school in the suburbs, to be sure, but not what I would think of as an especially advanced primary education.

I join lots of other commenters in being genuinely surprised that the content of this post is understood so little, even by mathematicians, as it all seemed pretty common sense to me. Indeed, my instinctive response to the first meditation was almost exactly what Eliezer went on to say, but I kept trying to think of something else for a while because it seemed too obvious.

Comment by Jay_Schweikert on Rationality Quotes October 2012 · 2012-10-26T18:53:10.064Z · LW · GW

What would be a better way to teach young children about the nuances of the scientific method? This isn't meant as a snarky reply. I'm reasonably confident that Tom Murphy is onto something here, and I doubt most elementary school science fairs are optimized for conveying scientific principles with as much nuance as possible.

But it's not clear to me what sort of process would be much better, and even upon reading the full post, the closest he comes to addressing this point is "don't interpret failure to prove the hypothesis as failure of the project." Good advice to be sure, but it doesn't really go to the "dynamic interplay" that he characterizes as so important. Maybe instruct that experiments should occur in multiple rounds, and that participants will be judged in large part by how they incorporate results from previous rounds into later ones? That would probably be better, although I imagine you'd start brushing up pretty quickly against basic time and energy constraints -- how many elementary schools would be willing and able to keep students participating in year-long science projects?

That's not to say we shouldn't explore options here, but it might be that, especially for young children, traditional one-off science fairs do a decent enough job of teaching the very basic idea that beliefs are tested by experiment. Maybe that's not so bad, akin to why Mythbusters is a net positive for science.

Comment by Jay_Schweikert on 2012 Less Wrong Census Survey: Call For Critiques/Questions · 2012-10-23T15:55:47.127Z · LW · GW

Agree that there needs to be a cryonics option amounting to something like "no, but planning to sign up." I'd refrain from calling it "cryocrastinating" in the survey, both because that phrase has a judgmental tinge that, even if warranted, probably doesn't belong in survey answers, and also because it's possible that you could be purposefully delaying without it being mere procrastination -- for example, maybe you anticipate starting a job in the near future that will make it significantly easier to fund a life insurance policy.

Comment by Jay_Schweikert on 2012 Less Wrong Census Survey: Call For Critiques/Questions · 2012-10-23T15:32:39.315Z · LW · GW

I agree that splitting up libertarianism into subcategories would likely yield some benefit. As I understand the "left vs. right" aspect of this question, the difference would mostly come down to what the person thinks about the state's role in providing social insurance. Presumably all libertarians would support a high degree of economic and social liberty -- basically letting people make decisions for themselves so long as those decisions are voluntary and they don't hurt non-consenting parties. But where "left libertarians" would be more comfortable using taxes to provide some minimum level of welfare provisions, "right libertarians" would lean more toward minarchism and say that governments shouldn't go beyond things like courts, police, and the military.

It's entirely possible that others won't share my intuition around these concepts, but I do think this framing helps explain some of the confusion that arises from pointing to particular countries as examples. European countries like Denmark and Switzerland have higher taxation and more extensive welfare states than the U.S., but they score higher on pretty much every other measure of economic freedom. That would probably make them fairly amenable to left-libertarians, but not right-libertarians/minarchists. This post at Bleeding Heart Libertarians does a pretty good job of explaining the distinction.

Relatedly, I'd say that minarchism fits more closely with something like right-libertarianism than with anarchism. There seems to be a much bigger gulf between "the state should do this small handful of important things" and "there shouldn't be a state" than between "the state should do this small handful of important things" and "the state should do this small handful of important things, and maybe a couple other ones too."

Comment by Jay_Schweikert on Problem of Optimal False Information · 2012-10-17T17:51:24.875Z · LW · GW

Well, maybe. I'm actually skeptical that it would have much effect on my productivity. But to reverse the question, suppose you actually did know this about your boss. If you could snap your fingers and erase the knowledge from your brain, would you do it? Would you go on deleting all information that causes you to resent someone, so long as that information wasn't visibly relevant to some other pending decision?

Comment by Jay_Schweikert on Problem of Optimal False Information · 2012-10-17T16:57:26.504Z · LW · GW

Would you want him to tell you that your new boss secretly burns little puppies at night? The boss also doesn't take it kindly if people criticize him for it.

Well, yes, I would. Of course, it's not like he could actually say to me "your boss secretly burns puppies -- do you want to know this or not?" But if he said something like "your boss has a dark and disturbing secret which might concern you; we won't get in trouble just for talking about it, but he won't take kindly to criticism -- do you want me to tell you?", then yeah, I would definitely want to know. The boss is already burning puppies, so it's not like the first-level harm is any worse just because I know about it. Maybe I decide I can't work for someone like that, maybe not, but I'm glad that I know not to leave him alone with my puppies.

Now of course, this doesn't mean it's of prime importance to go around hunting for people's dark secrets. It's rarely necessary to know these things about someone to make good decisions on a day-to-day basis, the investigation is rarely worth the cost (both in terms of the effort required and the potential blow-ups from getting caught snooping around in the wrong places), and I care independently about not violating people's privacy. But if you stipulate a situation where I could somehow learn something in a way that skips over these concerns, then sure, give me the dark secret!

Comment by Jay_Schweikert on Problem of Optimal False Information · 2012-10-17T14:43:21.797Z · LW · GW

Oops, meant to say "years." Fixed now. Thanks!

Comment by Jay_Schweikert on Problem of Optimal False Information · 2012-10-16T17:44:56.487Z · LW · GW

Even if it's a basilisk? Omega says: "Surprise! You're in a simulation run by what you might as well consider evil demons, and anyone who learns of their existence will be tortured horrifically for 3^^^3 subjective years. Oh, and by the way, the falsehood was that the simulation is run by a dude named Kevin who will offer 3^^^3 years of eutopian bliss to anyone who believes he exists. I would have used outside-of-the-Matrix magic to make you believe that was true. The demons were presented with elaborate thought experiments when they studied philosophy in college, so they think it's funny to inflict these dilemmas on simulated creatures. Well, enjoy!"

If you want to say this is ridiculously silly and has no bearing on applied rationality, well, I agree. But that response pretty clearly meets the conditions of the original hypothetical, which is why I would trust Omega. If I somehow learned that knowing the truth could cause so much disutility, I would significantly revise my estimate that we live in a Lovecraftian horror-verse with basilisks floating around everywhere.

Comment by Jay_Schweikert on Problem of Optimal False Information · 2012-10-16T17:28:13.606Z · LW · GW

Can we solve this problem by slightly modifying the hypothetical to say that Omega is computing your utility function perfectly in every respect except for whatever extent you care about truth for its own sake? Depending on exactly how we define Omega's capabilities and the concept of utility, there probably is a sense in which the answer really is determined by definition (or in which the example is impossible to construct). But I took the spirit of the question to be "you are effectively guaranteed to get a massively huge dose of utility/disutility in basically every respect, but it's the product of believing a false/true statement -- what say you?"

Comment by Jay_Schweikert on Problem of Optimal False Information · 2012-10-16T14:41:45.882Z · LW · GW

Okay, I suppose that probably is a more relevant question. The best answer I can give is that I would be extremely hesitant to do this. I've never experienced anything like this, so I'm open to the idea that there's a pain here I simply can't understand. But I would certainly want to work very hard to find a way to deal with the situation without erasing my memory, and I would expect to do better in the long-term because of it. Having any substantial part of my memory erased is a terrifying thought to me, as it's really about the closest thing I can imagine to "experiencing" death.

But I also see a distinction between limiting your access to the truth for narrow, strategic reasons, and outright self-deception. There are all kinds of reasons one might want the truth withheld, especially when the withholding is merely a delay (think spoilers, the Bayesian Conspiracy, surprise parties for everyone except Alicorn, etc.). In those situations, I would still want to know that the truth was being kept from me, understand why it was being done, and most importantly, know under what circumstances it would be optimal to discover it.

So maybe amnesia drugs fit into that model. If all other solutions failed, I'd probably take them to make the nightmares stop, especially if I still had access to the memory and the potential to face it again when I was stronger. But I would still want to know there was something I blocked out and was unable to bear. What if the memory was lost forever and I could never even know that fact? That really does seem like part of me is dying, so choosing it would require the sort of pain that would make me wish for (limited) death -- which is obviously pretty extreme, and probably more than I can imagine for a traumatic memory.

Comment by Jay_Schweikert on Problem of Optimal False Information · 2012-10-16T13:54:31.182Z · LW · GW

I would pick the black box, but it's a hard choice. Given all the usual suppositions about Omega as a sufficiently trustworthy superintelligence, I would assume that the utilities really were as it said and take the false information. But it would be a painful choice, both because I want to be the kind of person who pursues and acts upon the truth, and also because I would be desperately curious to know what sort of true and non-misleading belief could cause that much disutility -- was Lovecraft right after all? I'd probably try to bargain with Omega to let me know the true belief for just a minute before erasing it from my memory -- but still, in the Least Convenient Possible World where my curiosity was never satisfied, I'd hold my nose and pick the black box.

Having answered the hypothetical, I'll go on and say that I'm not sure there's much to take from it. Clearly, I don't value Truth for its own sake over and beyond all other considerations, let the heavens fall -- but I never thought I did, and I doubt many here do. The point is that in the real world, where we don't yet have trustworthy superintelligences, the general rule that your plans will go better when you use an accurate map doesn't seem to admit of exceptions (and little though I understand Friendly AI, I'd be willing to bet that this rule holds post-singularity). Yes, there are times where you might be better off with a false belief, but you can't predictably know in advance when that is, black swan blow-ups, etc.

To be more concrete, I don't think there's any real-world analogue to the hypothetical. If a consortium of the world's top psychiatrists announced that, no really, believing in God makes people happier, more productive, more successful, etc., and that this conclusion holds even for firm atheists who work for years to argue themselves into knots of self-deception, and that this conclusion has the strongest sort of experimental support that you could expect in this field, I'd probably just shrug and say "I defy the data". When it comes to purposeful self-deception, it really would take Omega to get me on board.

Comment by Jay_Schweikert on Causal Diagrams and Causal Models · 2012-10-12T14:44:36.959Z · LW · GW

Right, it seems like "Burglar" and "Recession" should switch places in the third diagram.

Comment by Jay_Schweikert on Rationality Quotes October 2012 · 2012-10-09T16:24:58.261Z · LW · GW

Ah, fair enough. I suppose the title of the work and the idea of an actual course on Munchkinry should have been clues about the setting.

Comment by Jay_Schweikert on Rationality Quotes October 2012 · 2012-10-09T15:36:59.297Z · LW · GW

This is a clever little exchange, and I'm generally all about munchkinry as a rationalist's tool. But as a lawyer, this specific example bothers me because it relies on and reinforces a common misunderstanding about law -- the idea that courts interpret legal documents by giving words a strict or literal meaning, rather than their ordinary meaning. The maxim that "all text must be interpreted in context" is so widespread in the law as to be a cliche, but law in fiction rarely acknowledges this concept.

So in the example above, courts would never say "well, you did 'attend' this school on one occasion, and the law doesn't say you have to 'attend' more than once, so yeah, you're off the hook." They would say "sorry, but the clear meaning of 'attend school' in this context is 'regular attendance,' because everyone who isn't specifically trying to munchkin the system understands that these words refer to that concept." Lawyers and judges actually understand the notion of words not having fixed meanings better than is generally appreciated.

Comment by Jay_Schweikert on Rationality Quotes October 2012 · 2012-10-03T12:11:53.514Z · LW · GW

Agreed. Though of course, I don't really see Faramir as disagreeing -- it was, after all, the Rangers of Ithilien who ambushed the Haradrim and killed the soldier they're talking about.

Comment by Jay_Schweikert on Rationality Quotes September 2012 · 2012-10-01T23:14:38.133Z · LW · GW

I hate to break up the fun, and I'm sure we could keep going on about this, but Decius's original point was just that giving a wrong answer to an open-ended question is trivially easy. We can play word games and come up with elaborate counter-factuals, but the substance of that point is clearly correct, so maybe we should just move on.

Comment by Jay_Schweikert on Rationality Quotes October 2012 · 2012-10-01T20:27:21.627Z · LW · GW

Frodo: Those that claim to oppose the Enemy would do well not to hinder us.

Faramir: The Enemy? (turns over body of an enemy soldier) His sense of duty was no less than yours, I deem. You wonder what his name is, where he came from, and if he was really evil at heart. What lies or threats led him on this long march from home, and if he'd not rather have stayed there... in peace. War will make corpses of us all.

-- The Lord of the Rings: The Two Towers (extended edition)

Comment by Jay_Schweikert on Rationality Quotes September 2012 · 2012-09-02T17:48:43.816Z · LW · GW

Qhorin Halfhand: The Watch has given you a great gift. And you only have one thing to give in return: your life.

Jon Snow: I'd gladly give my life.

Qhorin Halfhand: I don’t want you to be glad about it! I want you to curse and fight until your heart’s done pumping.

--Game of Thrones, Season 2.