Posts

The Upward Scaling Importance of Rationality 2013-04-27T18:30:02.865Z · score: 5 (14 votes)
The Wrongness Iceberg 2013-02-04T09:02:54.914Z · score: 20 (29 votes)

Comments

Comment by alfredmacdonald on The Upward Scaling Importance of Rationality · 2013-04-27T22:16:00.816Z · score: 1 (1 votes) · LW · GW

Kindness will only affect decisions where altruistic behavior wouldn't occur without it. Integrity I'm even less sure about. Rationality could affect any decision where bias or fuzzy reasoning is involved, which is almost every decision.

Comment by alfredmacdonald on Are there good reasons to get into a PHD (i.e. in Philosophy)? And what to optimize for in such case? · 2013-04-27T18:20:17.076Z · score: 13 (13 votes) · LW · GW

You should get a Ph.D. in Philosophy if you consider the material studied in philosophy to be an end in itself. Philosophy is a truthseeking discipline, so if you find that inherently rewarding and could imagine doing it for a large part of your life, it's a good decision. Don't worry about the wariness toward philosophy here: I can guarantee you that the criticisms leveled here against philosophy have been addressed tenfold in actual philosophy departments, by people with sympathies closer to Luke's than you'd think.

That said, a lot of people go into graduate programs for bad reasons. Here are two I've been tempted by:

#1.

Minimizing Status Risk. A lot of people think about risk in terms of financial gain or loss, but few think about risk in terms of status, even though it's a real concern for many people. Graduating college can be intimidating, especially if you're at a prestigious college, because you're about to be stripped of your hierarchical standing among people your age. If you've attended, say, Harvard for four years, you've spent those four years thinking of yourself as at the top of the food chain relative to other college students.

Once you're out of college, this is no longer true, and you're measured by what kind of job you have. It's extremely tempting to avoid this by applying to graduate school, because graduate school allows you to continue the imagined hierarchical standing that you've had for the past few years. Eventually you'll get a Ph.D. and be on top of the intellectual food chain. This has nothing to do with "avoiding the real world", because "the real world" as an employment area is conspicuously centered on office jobs or whatever the majority of people happen to do for money. (I wonder if farmers consider everyone else to have a "fake" job. Probably.)

It's a way of shielding your status from vulnerability, because working as a clerk or receptionist or barista or server or whatever after college is generally not prestigious and makes you feel like your intellect isn't worth anything. That's an uncomfortable feeling, sure, but make sure you're not eyeing a Ph.D. just to avoid that feeling.

#2.

Even if you're not avoiding Status Risk, make sure you're not getting a Ph.D. just to feel like an intellectual hotshot. A lot of people reason about competence in binary terms (expert or non-expert) even though competence obviously exists on a spectrum, so it's tempting to get a title that places you immediately at the "expert" end of any discussion. That way, you can throw your weight around whenever there's a clash of words.

Philosophy especially is enigmatic to a lot of people; there's a mystery about what you actually learn in an advanced program. So a Ph.D. looks like a "certified smart person" badge to a lot of people, and that's tempting. Make sure you're not getting it for that reason either.


Here's the litmus test. Ask yourself: "Would I self-study this material anyway if I had the next three to five years paid for? Would this occupy a large part of my time regardless of what I'm doing?" If so, it's worth it.

Comment by alfredmacdonald on The Wrongness Iceberg · 2013-02-11T21:20:19.535Z · score: 0 (0 votes) · LW · GW

Sure, in the very short run (starting from absolutely no knowledge of the game) you'd have to make mistakes to learn anything at all. But the process of getting better is a gradual decrease of the frequency of those mistakes. You'd want to minimize your mistakes as much as possible as you got better, because the frequency of mistakes will be strongly correlated with how much you lose.

I think you're seeing "try to minimize how many mistakes you make" and reading it as "try to make no mistakes." There are certainly mistakes you'll have to make to get better, but there are also superfluous mistakes that some people make while others won't, and catastrophic mistakes that would make you look really bad, which you'd definitely want to avoid. In other words, the pool of possible mistakes runs much deeper than the necessary mistakes you'd have to make to get better.

Comment by alfredmacdonald on How to offend a rationalist (who hasn't thought about it yet): a life lesson · 2013-02-08T05:53:46.672Z · score: 2 (2 votes) · LW · GW

I really liked this post, and I think a lot of people aren't giving you enough credit. I've felt similarly before -- not to the point of suicide, and I think you might want to find someone you can confide those anxieties in -- but about being angered by someone's dismissal of rationalist methodology. Because ultimately, it's the methodology that makes someone a rationalist, not necessarily a set of beliefs. The categorizing of emotions as in opposition to logic, for example, is a feature I've been frustrated with for quite some time, because emotions aren't anti-logical so much as they are alogical. (In my personal life, I'm an archetype of someone who gets emotional about logical issues.)

What I suspect was going on is that you felt this person was being dismissive of the methodology and did not believe reason to be an arbiter of disagreements. That reads to me like saying "I'm not truth-seeking, and I think my gut perception of reality is more important than the truth" -- a stance that sounds both arrogant and immoral. I've run across people like this too, and every time I feel like someone is prioritizing their kneejerk reaction over the truth, it's extremely insulting. Perhaps that's what you felt?

Comment by alfredmacdonald on The Wrongness Iceberg · 2013-02-08T05:41:19.286Z · score: 0 (0 votes) · LW · GW

I don't currently work at a restaurant, so at the moment I'm afraid of nothing.

But for the purposes of the example, it's not about discovering mistakes or incompetence -- it's about your level of incompetence being much greater than you previously estimated, for reasons you were unaware of prior to being exposed to those reasons.

Comment by alfredmacdonald on The Wrongness Iceberg · 2013-02-04T10:42:27.601Z · score: 3 (3 votes) · LW · GW

I find that similar to the concept of fractal wrongness. What distinguishes an iceberg from a fractal is that, with an iceberg, someone is resisting exposing the whole thing for one reason or another. In the dishonesty scenario, one lie reveals many others, but only because that person has left you a tidbit of information that cracks their facade and lets you infer just how deeply they've lied to you -- or, in the case of attraction, an event or action that would only occur if a much greater level of attraction existed below the surface.

Comment by alfredmacdonald on Beware Trivial Inconveniences · 2013-01-01T19:01:56.428Z · score: 4 (4 votes) · LW · GW

I think LessWrong actually has a higher barrier to contribution -- at least for articles -- because you're expected to have 20 comment karma before you can submit. This means that, if you're honest anyway, you'll have to spend your time in the pit interacting with people who could potentially shout you down, or call you a threat to their well-kept garden, or whatever.

I have at least three articles in draft form that I want to submit once I reach that total, but I don't comment on discussions much because most of what I would say is usually said in one comment or another. For people like me, the barrier of "must email someone" is actually easier, since contributing to discussion requires knowing how the community works, intuiting what the community deems a good comment, and posting along those lines.

Comment by alfredmacdonald on Train Philosophers with Pearl and Kahneman, not Plato and Kant · 2012-12-15T16:04:59.128Z · score: 0 (0 votes) · LW · GW

Luke, I was curious: where does informal logic fit into this? It is the principal method of reasoning tested on the LSAT's logical reasoning section, and I would say the most practical form of reasoning one can engage in, since most everyday arguments use informal logic in one way or another. Honing it is valuable, and the LSAT percentiles suggest that not nearly as many people are as good at it as they should be.

Comment by alfredmacdonald on Train Philosophers with Pearl and Kahneman, not Plato and Kant · 2012-12-15T15:48:28.754Z · score: 1 (1 votes) · LW · GW

His understanding of philosophy is barely up to undergraduate level. Sorry, but that's the way it is.

I feel like the phrasing "barely up to undergraduate level" is like calling something "basic" or "textbook" not because it actually is, but because it insinuates there is an ocean of knowledge that your opponent has yet to cross. If Luke is "barely undergraduate," then I know a lot of philosophy undergrads who might as well not call themselves that.

While I agree that reform is far more likely to be done by a Dewey or Erasmus, your reasoning gives me a very "you must be accepted into our system if you want to criticize it" vibe.

Comment by alfredmacdonald on 2012 Survey Results · 2012-12-15T10:26:08.927Z · score: 5 (5 votes) · LW · GW

The general population would contain 50 sociopaths per 1000; I don't think LessWrong contains 50 sociopaths per 1000. Rationality is a truthseeking activity at its core, and I suspect a community of rationalists would do their best to avoid lying consciously.

I am not sure what "the common definition of the word 'lie'" is, especially since there are a lot of differing interpretations of what it means to lie. I do know that wrong answers are distinct from lies, however. I think a lot of LessWrong people may have put an IQ that does not reflect an accurate result, but I doubt many put a deliberately inaccurate one. Barring "the common definition" (whatever that is), I'm defining a lie as "stating something you know to be false" -- someone can put down a number without knowing for sure that it's the true number, yet without knowing it to be false either, and that isn't a lie.

I don't know what you mean by "mean something" with respect to Mensa Denmark's normings. They will probably be less accurate than a professional IQ testing service, but I don't know why they would be inaccurate or "meaningless" by virtue of their organization not being a professional IQ testing service.

The only way I can think of in which the self-reported numbers would be more accurate than the IQTest.dk numbers is if the LW respondents knew that their IQ numbers were from a professional testing service and they had gone to this service recently. But since the test didn't specify how they obtained this self-report, I can't say, nor do I think it's likely.

IQTest.dk uses Raven's Progressive Matrices, which is a standard way to measure IQ across cultures, since large verbal/spatial splits in IQ are not that common. It wouldn't discriminate against autistics; if anything, it discriminates in their favor: people with disorders on the autism spectrum are likely to score higher on it, not lower.

I'm not sure how the bolding of "in that way" bolsters your argument. Paraphrased, it would be "in the sense that the user types the IQ score into the survey box themselves, the IQTest.dk question is just as flawed as the other intelligence questions." But this neglects that the source of the number is different: both are self-reports in the sense that the number is up to someone to recall, but if someone types in their IQTest.dk number, you know it came from IQTest.dk. If someone types in their IQ without specifying the source, you have no idea where they got that number -- they could be estimating, it could be a childhood test score, and so on.

Please consider getting some rationality training or something.

Remarks like these are unnecessary, especially since I've just joined the site.

Comment by alfredmacdonald on 2012 Survey Results · 2012-12-15T09:29:20.208Z · score: 1 (1 votes) · LW · GW

Over 1000 people took the test. Statistically speaking, it should have included about 50 sociopaths.

Not if LessWrong values truthseeking activities more than the general population does, or considers lying/truth-fabrication a greater sin than the general population does, or just generally attracts fewer sociopaths than the general population. If over 1000 fitness enthusiasts take a test about weight, the statistics on obesity are not going to reflect the general population's. Considering the CRT scores of LessWrong and this site's admiration of introspection and truthseeking, I doubt LW is reflective of the general population in this way.
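The "about 50 sociopaths" figure is just a base rate multiplied by a sample size; the real disagreement is over which base rate applies to a self-selected community. A sketch of the arithmetic (the prevalence figures are illustrative assumptions for the sake of the example, not survey data):

```python
def expected_count(n: int, prevalence: float) -> float:
    """Expected number of people with a given trait in a sample of size n."""
    return n * prevalence

# The parent comment's claim assumes a general-population base rate
# of roughly 5% applies directly to the 1000+ survey respondents:
print(expected_count(1000, 0.05))  # 50.0

# If self-selection cuts the rate to, say, 1%, the expectation drops sharply:
print(expected_count(1000, 0.01))  # 10.0
```

The selection-effect point is just that swapping in a different prevalence changes the expectation proportionally, exactly as with the fitness-enthusiast example.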

Lies are more than untrue statements; at least in the context of self-reports, they are conscious misstatements of what one knows to be true. Someone might think they know their IQ because they've taken less reliable IQ tests, or because they had a high childhood IQ, or because they extrapolated their IQ from SAT scores, or for a host of other reasons. In those cases they haven't actually lied; they've just stated something inaccurate.

Someone could put an IQ when they have no idea what their IQ is, yes, in the sense that they have never taken a test of any sort and have no idea what their IQ would be if they took one, even an inaccurate one. I don't think many people here would do that, though, because of the truthseeking reasons mentioned earlier.

Mensa is a club not a professional IQ testing center. They're not even legally allowed to give out scores anymore. Their test scores are not considered to be accurate.

Mensa doesn't need to be a professional IQ testing center for its normings to be accurate, however. I am also not sure how failing to account for learning disorders would seriously reduce IQTest.dk's validity relative to self-reports.

However, it's inaccurate to say that because someone puts their IQTest.dk number in the box, it's "equally flawed" compared to the other intelligence questions. Someone who self-reports an IQ number, any number, may not know whether that number was obtained using accurate methodology. It may be an old score from childhood, and childhood IQ scores vary wildly compared to adult IQ scores. It may be an extrapolation from SAT scores, as I mentioned above. There are a number of ways in which self-reported IQ differs from a reported IQTest.dk IQ.

LessWrong is going to eat you alive, honey. Get out while you're ahead.

This reads as unnecessarily tribalistic to me. I take it you think I am an undiscriminating skeptic? In any case, cool it.

Comment by alfredmacdonald on How I Ended Up Non-Ambitious · 2012-12-15T08:57:21.599Z · score: 0 (0 votes) · LW · GW

Some personality traits may be conducive to "natural" rationality. High scores on the Narcissistic Personality Inventory, for example, may indicate ego-preserving tendencies that make greater levels of rationality more difficult to attain. I'd imagine natural introversion would also help, and I say that as someone who usually maxes out the extroversion scale on these kinds of tests.

Comment by alfredmacdonald on How I Ended Up Non-Ambitious · 2012-12-15T08:53:04.449Z · score: 1 (1 votes) · LW · GW

Does anyone know if there is a similar trope website for rationality -- does the LessWrong wiki qualify? Or trope websites for humor? Or rhetorical devices?

Hell, Silva Rhetoricae is sort of like TVTropes for rhetoric, but managed by one person instead of a community.

Comment by alfredmacdonald on Philosophy: A Diseased Discipline · 2012-12-15T08:41:42.221Z · score: 0 (0 votes) · LW · GW

Often people who dismiss philosophy end up going over the same ground philosophers trode hundreds or thousands of years ago. That's one reason philosophers emphasize the history of ideas so much. It's probably a mistake to think you are so smart you will avoid all the pitfalls they've already fallen into.

While I agree that it's important to avoid falling into these pitfalls, philosophy curricula tend to emphasize not just the history of ideas but the history of philosophers, which makes getting up to speed on contemporary philosophy take entirely too long. It is not so important that we know what Augustine or Hume thought as why we now know their ideas can't be right.

Also, "the history of ideas" is really broad, because there are a lot of ideas that by today's standards are just absurd. Including the likes of Anaximander and Heraclitus in "the history of ideas" is probably a waste of time and cognitive energy.

Comment by alfredmacdonald on Philosophy: A Diseased Discipline · 2012-12-15T08:33:40.076Z · score: 4 (4 votes) · LW · GW

YeahOKButStill has an interesting take on the interaction between philosophy done in blogs and philosophy done in journals:

"... Many older philosophers lament the current lack of creativity and ingenuity in the field (as compared to certain heady, action-packed periods of the 20th century), yet, it is a well-established fact that in order to be published in a major journal or present at a major conference, a young philosopher has to load their paper/presentation with enormous amounts of what is called the "relevant literature". This means that even the most creative people among us (a group I do not count myself as belonging to) must spend huge amounts of time, space and energy trying to demonstrate just how widely they have read and just how many possible objections to their view they can consider, lest some irritable senior philosopher think that their view has not been given a fair shake. Of course, there is no evidence whatsoever that the great philosophers of the 20th century wrote and thought in this manner, as a quick survey of that relevant literature will show.

Blogs are a space for young philosophers to explore their ideas without these sorts of constraints, to try ideas on for size and to potentially get feedback from a wide audience. Indeed, the internet has the potential to host forums that could make reading groups at Oxford and Cambridge look positively stultifying. Yet, this is not how things are playing out: most young philosophers I know are afraid to even sign their real names to a comment thread. This, as anyone can see, is an absurd situation. However, since I have no control over it, I must bid this public space adieu."

Comment by alfredmacdonald on 2012 Survey Results · 2012-12-15T08:23:49.359Z · score: 1 (1 votes) · LW · GW

I have always despised the term "pseudointellectualism," since there isn't exactly a set of criteria for being a pseudointellectual, nor a process of accreditation for becoming an intellectual; the closest thing I'm aware of is perhaps a doctorate, but the world isn't exactly short of Ph.D.s who put out crap. There are numerous graduate programs where the GRE/GPA combination needed to get in is barely above the undergraduate average, for example.

Comment by alfredmacdonald on 2012 Survey Results · 2012-12-15T08:18:39.290Z · score: 0 (4 votes) · LW · GW

I don't think anyone on LessWrong has lied about their IQ. (Addendum: not enough to seriously alter the results, anyway.) If you came up with a "valuing the truth" measure, LessWrong would score pretty highly on it, considering the elaborate ways people who post here go about finding true statements in the first place. To lie about your IQ, you'd have to know to some degree what your real IQ is and then exaggerate from there.

However, I do think it's more likely than you suggest that most people on LessWrong self-reporting IQ simply don't know their IQ with certainty, since to know your adult IQ you'd have to see a psychometrician and take an administered IQ test. IQTest.dk is normed by Mensa Denmark, so it's far more reliable than self-reports. You don't know where the self-reported IQ figures are coming from -- they could be from a psychometrician measuring adult IQ, or from somewhere far less reliable. Respondents could know their childhood IQ was measured at around 135, for example, and be going by memory. Or they could remember that their SAT was 99th percentile and have spent a minute looking up what the 99th percentile is for IQ, not knowing it's not a reliable proxy. Or they might have taken an online test somewhere that gave ~140 and be recalling that number. Who knows? Either way, I consider "don't attribute to malice what you can attribute to cognitive imperfection" a good mantra here.

126 is actually higher than a lot of people think. As an average for a community, that's really high -- probably higher than any group I can think of except math professors, physics professors, and psychometricians themselves. It's certainly higher than the averages for MIT and Harvard, anyway.
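For a sense of scale: on the conventional mean-100, SD-15 norming, an IQ of 126 sits near the 96th percentile. A quick check using Python's standard library (the distribution parameters are the standard norming convention, not anything specific to the survey):

```python
from statistics import NormalDist

# Conventional IQ norming: mean 100, standard deviation 15.
iq = NormalDist(mu=100, sigma=15)

# Fraction of the population expected to score below 126:
percentile = iq.cdf(126)
print(round(percentile * 100, 1))  # ~95.8
```

So a community *average* of 126 would place the typical member above roughly 19 out of 20 people in the general population.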

About the similarity between self-reported IQ and SAT scores: SAT scores post-1994 (which most of the scores here are likely to be) are not reliable as IQ test proxies; Mensa no longer accepts them. This is because the modern test is much easier to game. I tutor the SAT, and when I took it prior to applying to a tutoring company my reading score was 800, whereas in high school it was only in the mid-600s. SAT reading scores are heavily influenced by (1) your implicit understanding of informal logic, and (2) your familiarity with English composition and how arguments/passages may be structured. Considering the SAT has contained these kinds of questions since the mid-90s, I am inclined to throw out its value as a proxy IQ test, and I don't think you can draw conclusions about LessWrong's real collective IQ from the reported SAT scores.

The IQTest.dk result may have given the lowest measure, but I also think it's the most accurate one. It might not put LessWrong in the 130s, but it would still mean the community is on the same level of intellect as, say, surgeons and Harvard professors, which is pretty formidable for a community.