Comment by SpaceFrank on Rationality Quotes May 2012 · 2012-05-04T20:03:04.482Z · LW · GW

When life gives you lemons, order miracle berries.

Comment by SpaceFrank on Eliezer Yudkowsky Facts · 2012-02-28T14:14:11.121Z · LW · GW

ph'nglui mglw'nafh Eliezer Yudkowsky Clinton Township wgah'nagl fhtagn

Doesn't really roll off the tongue, does it?

Comment by SpaceFrank on Harry Potter and the Methods of Rationality discussion thread, part 9 · 2012-02-28T05:10:59.879Z · LW · GW

Considering the ridiculous context of the rest of the conversation (i.e., Dumbledore either pretending to be insane or actually letting some real insanity slip through), is it too far outside the realm of possibility for that comment to be a joke? It seemed like Dumbledore was going out of his way to screw with Harry in this chapter. Even if the machine actually does what he said it does, I could easily see the comment about "how much work it took to nail that down" being a joke Dumbledore told for his own amusement, knowing that Harry was too young to "get it".

Comment by SpaceFrank on Is That Your True Rejection? · 2012-02-03T20:18:08.916Z · LW · GW

I had to look it up, but I definitely agree. Especially considering how quickly the karma changes reversed after I edited in that footnote.

Comment by SpaceFrank on Is That Your True Rejection? · 2012-02-03T15:27:02.660Z · LW · GW

I wish I could upvote this post back into the positive.

(It seems pretty obvious to me that this is a direct satire of the previous post by a similar username. What, no love for sarcasm?)

Comment by SpaceFrank on Explain/Worship/Ignore? · 2012-02-01T22:14:49.867Z · LW · GW

Such a great game. Seeing this makes me want to play it again, having discovered this site and done some actual reading on transhumanism and AI. It might change the choice I'd make at the end...

Of course, this goes even further than just proving the old saying about Deus Ex, considering you never even mentioned the title!

I know this is a serious necro-post, but I felt compelled.

Comment by SpaceFrank on Welcome to Less Wrong! (2012) · 2012-01-30T18:45:53.845Z · LW · GW

Thanks! I hadn't read that article yet, but I became familiar with the concept when reading one of Eliezer Yudkowsky's papers on existential risk assessment (either this one or this one). I did have a kind of "Oh Shit" moment when the context of the article hit me.

Comment by SpaceFrank on Tell Your Rationalist Origin Story · 2012-01-26T19:04:42.918Z · LW · GW

Now that I think about it, "natural selection" seems more appropriate.

Comment by SpaceFrank on Welcome to Less Wrong! (2012) · 2012-01-25T21:41:34.477Z · LW · GW

Exactly. I also suspect that logical overconfidence, i.e. knowing a little bit about bias and thinking it no longer affects you, is magnified with higher intelligence.

I can't help but remember that saying about great power and great responsibility.

Comment by SpaceFrank on Welcome to Less Wrong! (2012) · 2012-01-25T19:32:04.701Z · LW · GW

Hello, Less Wrong.

Like some others, I eventually found this site after being directed by fellow nerds to HPMOR. I've been working haphazardly through the Sequences (getting neck-deep in cognitive science and philosophy before even getting past the preliminaries for quantum physics, and loving every bit of it).

I can't point to a clear "aha!" moment when I decided to pursue the LW definition of rationality. I always remember being highly intelligent and interested in Science, but it's hard for me to model how my brain actually processed information that long ago. Before high school (at the earliest), I was probably just as irrational as everyone else, only with bigger guns.

Sometime during college (B.S. in mechanical engineering), I can recall beginning an active effort to consider as many sides of an issue as possible. This was motivated less from a quest for scientific truth and more from a tendency to get into political discussions. Having been raised by parents who were fairly traditional American conservatives, I quickly found myself becoming some kind of libertarian. This seems to be a common occurrence, both in the welcome comments I've read here and elsewhere. I can't say at this point how much of this change was the result of rational deliberation and how much was from mere social pressure, but on later review it still seems like a good idea regardless.

The first time I can recall actually thinking "I need to improve the way I think" was fairly recent, in graduate school. The primary motivation was still political. I wanted to make sure my beliefs were reasonable, and the first step seemed to be making sure they were self-consistent. Unfortunately, I still didn't know the first thing about cognitive biases (aside from running head-on into confirmation bias on a regular basis without knowing the name). Concluding that the problem was intractable, I withdrew from all friendly political discussion except one in which my position seemed particularly well-supported and therefore easy to argue rationally. I never cared much for arguing in the first place, so if I'm going to do it I'd prefer to at least have the data on my side.

I've since lost even more interest in trying to figure out politics, and decided while reading this site that it would be more immediately important anyway to try figuring out myself. I've yet to identify that noble cause to fight for (although I have been interested in manned space exploration enough to get two engineering degrees), but I think a more rational me will be more effective at whatever that cause turns out to be.

Still reading and updating...

Comment by SpaceFrank on Tell Your Rationalist Origin Story · 2012-01-24T17:24:24.884Z · LW · GW

(I'm neither a theology scholar nor an anthropologist, so I may lack some important background on this.)

I agree that the idea of early church leaders isolating members in order to explicitly limit the introduction of new ideas sounds far-fetched. It strikes me as the kind of thing that would only be said after the fact, by a historian looking for meaning in the details. But attributing those member-isolating rules to something like "preserving group identity" seems like the same thing.

I find myself wondering if something like the anthropic principle is at work here, i.e. the only religious groups to survive that long are the ones who historically isolated their members from outside ideas. There's probably a more general term for what I'm getting at.