I'm fond of Perl as a first language, for a couple of reasons. Foremost among them is that Perl is fun and easy, so it serves as a gentle introduction to programming (and modules) that's easy to stick with long enough to catch the bug, and it's versatile in that it can be used for webapps, for automating system tasks, or just for playing around. But I wouldn't recommend making it anybody's only language, because it IS a scripting language and consequently encourages a sort of sloppy wham-bam-thank-you-ma'am approach to coding. Start with it, learn the basics, then move on to Python, and after achieving competence there, learning new languages pretty much just feels like fun and games. Perl remains my favorite language for anything to do with SQL, and also for hammering out quick scripts to automate boring tasks.
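For what it's worth, here's a minimal sketch of the sort of quick Perl-plus-SQL script I have in mind. It assumes the DBI and DBD::SQLite modules are installed, and the database, table, and column names (tasks.db, tasks, name/due/done) are made up purely for illustration:

    #!/usr/bin/env perl
    # Quick one-off report: list unfinished tasks from a local SQLite database.
    use strict;
    use warnings;
    use DBI;

    # Connect to the (hypothetical) tasks.db; RaiseError makes database errors fatal.
    my $dbh = DBI->connect("dbi:SQLite:dbname=tasks.db", "", "", { RaiseError => 1 });

    # Fetch pending tasks, soonest deadline first.
    my $sth = $dbh->prepare("SELECT name, due FROM tasks WHERE done = 0 ORDER BY due");
    $sth->execute();

    # Print each pending task with its due date.
    while (my ($name, $due) = $sth->fetchrow_array) {
        print "$due  $name\n";
    }

    $dbh->disconnect;

Nothing fancy, but that's rather the point: a dozen lines and the boring task is automated.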
Lisp is probably not necessary, but IS fun to learn. I don't know whether it makes you a better programmer. I'm definitely better now than I was before I learned it, but I don't know how to differentiate between "I gained experience" and "Lisp fixed my brain".
My first languages were C++ and Java, incidentally, and I would say that I became a decent programmer in spite of that rather than because of it. C++ was too much all at once, at least for twelve-year-old me, and Java by contrast is so gentle and coddling that it became a kind of tarpit from which I almost did not escape.
I think more than anything what reliably converts you to a higher value programmer (provided you already have good math skills) is going through the larval stage as many times as possible.
What a silly thought experiment. The fact that two people use one word to refer to two different things (which superficially appear similar) doesn't mean anything except that the language is imperfect.
Case in point: Uses of the word "love".
Okay. Thank you very much for your insight; I do appreciate it.
I... was not even aware that such a game existed; I was referring to The Once and Future King. But clicking through the wiki a little bit has me fascinated by the tangle of mythological references.
Just call me le Chevalier mal Fet.
You make an interesting point. To be sure I've understood: Behave in a more truth-seeking manner in general, because if I do so I will be a more truth-seeking person in the future from force of habit, and if I do not do so then I will be less of one? If the force of habit is really so potent in cases like this then it's a very convincing argument; I wouldn't want to give up the ability to be rational just to be a tiny bit better at manipulation.
Both twenty-one. But that is a less useful statistic than emotional maturity, which I think is what you're getting at, so I should note that there is a definite discrepancy in terms of how well we handle feelings - I have a great deal more emotional control than does she. So despite being the same age, there is a power imbalance in a sense similar to the one you're asking about. Of the two undescribed parties, one is older than me (22) and one is younger (19).
Actually, I don't quite have to pretend that the other parties are attempting manipulation in the other direction; they've all been fairly transparent in their attempts (albeit with varying degrees of persistence; of the three, J sits in the middle in terms of time spent attempting to convert me).
I look forward very much to seeing your sequence.
This is a very valid point, but I'm less interested in whether such a plan is practical than in whether, assuming feasibility, it is ethical.
Explicitly declaring "I am going to try to convert you" to any of these people would definitely eliminate or minimize all potential avenues of influence, and I do not think I am nearly subtle enough to work around that. Still, if I understand what you're saying correctly, it's more an issue of informed consent of study participants than of letting people decide whether they want their buttons pushed. Is that an accurate understanding of your perspective?
If it shows up in Elcenia, I do declare I shall explode from pure joy.
What's unreasonable about Chick tracts, I think, is that strangers can't really walk up and manipulate you like that unless you're already in an extremely emotionally vulnerable state. It's easier if there's an established relationship.
Yes. Which is a very good reason for me not to trust my inclinations.
Does LessWrong have an actual primer on the Dark Arts anywhere? There's a lot of discussion of Defense Against, but I haven't seen any Practice Of... Perhaps that's beyond the scope of what we really intend to teach here?
Heavens, no. I want my friends to be atheists for purely selfish reasons. It so happens that some of those selfish reasons involve things like "I want my friends to know what's true", but most of them are reasons like "I want this awkward piece of the relationship gone" and "It's a shame none of you believe in casual premarital sex, because I could really go for an orgy right now" and "If I have to hear you talk about how wrong gay marriage is ONE MORE TIME I do declare I shall explode."
In other words, I really do not trust my personal desires as an ethical system, because in a vacuum I'm a pretty unmitigated asshole.
I'm going to describe such a conversation (the first of what would, I think, be many) with a girl whom I will call Jane, though that is not her name. Some background: Jane is a devout Catholic, an altar girl, a theology major, a performer of the singing-acting-dancing type, and one of the bubbliest people I know. She is also firmly against gay marriage, abortion, premarital sex, and consumption of alcohol or other drugs (though for some reason she has no problem with consumption of shellfish). You may have read the previous two sentences and thought "there's a lot of sexual repression going on there" and you would be quite correct, though she would never admit that. Here is what I would say and do. Don't take the wording too literally; I'm not that good.
tld: (At an appropriate moment) Jane, I have a very personal question for you.
J: Okay, shoot.
tld: It's about God.
J: Oh dear. I'm listening.
tld: So God exists. And he's up there, somewhere, shouting down that he loves us. But if tomorrow morning he suddenly vanished - just ceased to exist, packed up and left town, whatever - would you want to know?
J: I - uh - gosh. That would go against everything God's said, about how he would never abandon us-
tld: I know. But just think of it as a counterfactual question. God leaves, or vanishes. Do you want to know?
J: I don't know. It's - I just can't imagine that happening.
tld: (taking Jane's hand, gentle smile) Hey. Don't let it rattle you. Just remember, here in the real world, God's up there somewhere, and he loves us, and he would never abandon us.
J: I love hearing you say that.
tld: Sure. So in the real world, nothing to worry about. But over there in the imaginary, fake world - God vanishes. Would you want to know?
J: Well... I guess so. Because otherwise it's just living a lie, isn't it?
tld: Right. (squeeze hand softly) I'm glad you agree, it's very brave and honest of you to be able to say that. So the follow-up question is, what would change, in that world?
J: What do you mean?
tld: Well, God was there, and now he's left that world behind. So it's a world without God - what changes, what would be different about the world if God weren't in it?
J: I can't imagine a world without God in it.
tld: Well, let's look at it the other way, then. Let's imagine another world, just like the first two except that it never had a God in the first place, and then God shows up. He came from the other world, the first one we imagined, to give this new world some of His light, right? (reassuring squeeze)
J: (squeeze back) Okay...
tld: So God comes into this new world, and the first thing he does is make it a better place, right? That's what God does, he makes the world a better place.
J: Yeah! Yeah, exactly. God makes the world a better place.
tld: So God comes down himself, or sends down His son, and feeds the poor and heals the sick, and pretty soon the world is better off because God is there.
J: Of course.
tld: Great! (smile) So let's think about the other world, the one that got left behind, for a second. What would you do, if you were there?
J: What? (shocked)
tld: Well, the you in the other world finds out there's no God anymore, and that's that. So what would you do? (lean in, squeeze hand again) There must be some things you'd dare to do that you wouldn't otherwise.
J: (pause, blush) Um. Well. I don't know. I'd have to think about it.
tld: Right, it's a hard question. (final hand squeeze, lean back) But I hope you'll think about it, for the next time we talk, and let me know what you've come up with. I've actually got to run, it's getting kind of late (or other excuse for why I need to leave, etc.)
Proceed to wait until she brings the subject up again, or bring it up again later myself.
So, yes. The above conversation has two purposes, which are (a) to plant the idea of dealing with a world where God doesn't exist, and (b) to remind Jane that there are things she wants but can't have because of her faith so that she has a reason, though unspoken, to want to be rid of it; there are a couple of other things going on as well which I'm sure faul_sname will cringe at, but that's the gist. Intended arc of development: A few months' worth of working on a truth-seeking mindset, possibly more work on building rapport and position-of-authority mojo, and eventually the Jenga moment, which it's difficult to plan out precisely in advance. And yes, I realize that playing on sexual tension to manipulate someone's beliefs is, in a word, disgusting. I did say Dark Arts for a reason.
The other two people who've been weighing on my mind are let's-call-him-James and let's-call-her-Mary, for whom the intended sequence is a little different (neither of them has an easily-accessible repressed-sexuality motivator) but you get the idea, I think.
I caught myself doing more or less the same thing (but for substantially eviller reasons), which is why I asked LW in the first place.
In fact I have attempted such meta-discussion. Unfortunately it's very difficult to get a straight answer to questions like that; people will almost always CLAIM to care about the truth, but that's also what they would claim if they merely thought they cared and didn't reflect enough on it to know otherwise.
The possibility that I am incorrect about what would make them lose their belief is a very real one; I used to think that merely repeating the things that broke MY faith in God would work on everyone, and that was clearly wrong. Still, I'd give p>.33 for success, and thus expect it to work on at least one of the three people I'm writing about.
You're absolutely right that my primary motivation is simply that I WANT to do it. But ethical reasoning is about what is right in spite of my preferences, is it not? So the question of truth-versus-negative-consequences remains an important one.
Your point about truth-seeking versus atheism as a religion is a very good one. I do generally think that converting atheists to truth-seekers is easier than converting Catholics to truth-seekers, but I had not considered the possibility that I might, rather than failing entirely (which is not unlikely), fail at the halfway point and end up with atheist zealots for friends, which would DEFINITELY create more problems than it would solve.
That was a very thoughtful piece of advice. Thank you.
Absolutely, contingent on being able to convince myself it's ethical to do so. Give me a moment to do some typing and I'll outline how I think one such conversation sequence would go.
Not quite the advice I was hoping for, but thank you for your honesty.
Didn't see this! You're right, that is quite a bit too strong. Let me reduce the strength of that statement: Among theists to whom I have become close enough to ask deeply personal questions and expect truthful answers, such levers seem prevalent.
Even if it were just a matter of telling the truth, I don't think it would be ethically unambiguous. The more general question is whether the value of increasing some person's net-true-beliefs stat outweighs the corresponding decrease in that person's ability-to-fit-comfortably-in-theist-society stat. In other words I am questioning WHETHER they would be better off, not which conditional I should thereafter follow.
The first question is a difficult one to answer - more specifically, a very difficult one to get a theist to answer genuinely rather than just as signalling.
I would approve of more-adept friends pushing analogous levers in my own head (emphasis 'friends' - I want them to be well-intentioned), but I am weird enough to make me wary of generalizing based on my own preferences.
I certainly don't mean to say that I have any kind of fully-general way to convert theists. I mean rather to say that as you get closer to individual people, you find out what particular levers they have to flip and buttons they have to push, and that with sufficient familiarity the sequence of just-the-right-things-to-say-and-do becomes clear. But if you would like an example of what I'd say to a specific person (currently there are three to whom I know what I would say), I can do that. Let me know.
And here I always thought it was set to the Imperial March.
"as I am no different from anyone else as far as rational thinking is concerned" is the part that bothers me about this. This approach makes sense to me in the context of clones or Tegmark duplicates or ideal reasoning agents, sure, but in the context of actual other human beings? Not a chance. And I think the results of Hoftstadter's experiments proved that trusting other humans in this sense wouldn't work.
No; instead I will cut a deal with Clipmega for two million paperclips in exchange for eir revealing said information only to me, and exploit that knowledge for economic gain of, presumably, ~1e24 paperclips. 1e24 is a lot, even of paperclips. 1e6, by contrast, is not.
You likely wouldn't be able to just dissolve anhydrous caffeine powder in water and keep yourself blinded; it's incredibly bitter (second only, in my experience, to tongkat ali / eurycoma longifolia root powder).
ohgodohgodohgod
I think that this may be true about the average person's supposed caring for most others, but that there are in many cases one or more individuals for whom a person genuinely cares. Mothers caring for their children seems like the obvious example.
Well, if his trick for deactivating other wizards' patronuses (patronii?) works, he basically has an unblockable army of instant-death assassins, the only defense against which would be Apparition... That's a pretty good ultimate weapon in a Mutually Assured Destruction sense. And as long as we're discussing mutually assured destruction, there seems little doubt that Harry would be able to transfigure nuclear weaponry. Or botulinum toxin (of which it would take an appallingly small amount to kill every human on Earth). Etc, etc. Harry does not lack for access to Ultimate Weapons.
It seems irrelevant whether the AI is quote-unquote "highly intelligent" as long as it's clever enough to take over a country and kill several million people.
Assuming, from the title, that you're looking for argument by counterexample...
The obvious reply would be to invoke Godwin's Law - there's a quote in Mein Kampf along the lines of "I am convinced that by fighting off the Jews, I am doing the work of our creator...". Comments like this pretty reliably generate a response something like "Hitler was a diseased mind/insane/evil!" to which you may reply "Yeah, but he was pretty sharp, too." However, this has the downside of invoking Nazis, which in a certain kind of person may provoke an instant "This is a reactionary idiot" response and a complete discarding of the argument. So it's a temperamental trick, and I'm not skilled enough in the dark arts to know if it's a net gain.
On the other hand, you might prefer Pol Pot, or Ted Bundy, or any of a very large number of dictators and serial killers who don't produce the same mindkilling response as Hitler.
A lot of fictional evidence comes to mind as well, but we do try not to generalize from that... Still, if you just want to WIN the argument rather than win rationally, it may help to pull an example from some media form that the audience is likely to appreciate. Lex Luthor, Snidely Whiplash, Yagami Light (or L, if you prefer), Mephistopheles (or Faust), and so on.
Is that the sort of thing you wanted?
Maybe they are just more optimistic about it than about rotting six feet under.
My feelings exactly.
I had a hidden ugh-field about that one. It took quite a few repetitions of the Litany of Gendlin to grok it.
I confess I rather enjoyed the part where Snape's head exploded. There's a certain window of "So bad it's good" in there, before you get to the "So bad it's horrible". As I said in another comment, it's not bad at the start.
Other than "cheroybbq snzvyvrf znvagnva gurve jrnygu guebhtu neovgenel zbabcbyvrf tenagrq ol gur Jvmratnzbg"?
I never, in Canon, got quite such an impression of Eerie Alien Geometries from the castle as I do in MoR. Thankfully Event Horizon hadn't come out in 1991, or I'd wager a lot of Muggleborns would be very uncomfortable in the upper floors.
Yes, Dumbledore's icy glare at the end seems to imply that he figured it out.
I actually found it fairly enjoyable as well for the first few chapters. I didn't realize how much I hated it until I came to Qhzoyrqber'f guvegrragu Ubepehk.
Of course, the mere existence of that spoiler may make you want to read more just to find out how on earth such a thing could happen.
SA?
The difference is primarily one of quality. Time Braid is excellent, provided one is willing to accept the rewritten cosmology, while Chunin Exam Day is pretty much universally considered to be refuse.
Something I just noticed on a second read-through - the reuse of the word "riddle" in context here seems like a reminder to Lucius of who he thinks Harry really is, and this is not the first time it's come up when Harry is exposed to Dementors. Perhaps this lends credence to the theory that riddle is the "strange word" he learned when first exposed?
All the same things that are wrong with Chunin Exam Day.
If he picked the right lottery he'd only need to do that once, period. There are many lotteries paying out well over two million pounds... But I suspect Locke is right on this count.
But gaze not overlong into that particular abyss.
Edit: In retrospect, TvTropes itself is probably the bigger abyss of the two. So don't gaze overlong into that one either.
Draco's going to want to go back, of course.
Why on earth did this not immediately occur to me? This is usually my first thought in time-travel stories. Clearly my dislike of Lucius is clouding my judgement.
I think my favorite part of this update comes not from the chapter, but from the Author's Notes:
"If you write sufficiently good fanfiction, you can realize your romantic dreams!"
(Although "Make him go away" is either tied for the position or a close second.)