Well, exactly. That's what I meant when I said that it was very confusing to me, as a young grad student in an outside field, to have a course that assigned Peirce and Lacan side by side with a straight face, evidently taking them equally seriously.
There may or may not be some legitimate field of inquiry going under the name of semiotics. In grad school a number of years ago, however, I took a (graduate-level) Introduction to Semiotics that was a pretty remarkable hodgepodge of bullshit, along with just enough non-bullshit to make a complete outsider like myself (not at all fluent in the obscurantist discourse of "cultural studies," "critical theory," and the like) feel like maybe the problem was me and not the material. (Later reflection gave me a lot more confidence that the problem was, in fact, the material.)
Among the readings were Freud, Lacan, Derrida, J. L. Austin, Marcel Mauss, Saussure, Lévi-Strauss, and Peirce. (There was other stuff too that I don't recall right now.) Interestingly, of those I would say that only Lacan and Derrida were outright charlatans (which is not to endorse any of the others in particular, just to say that they were all doing something at least potentially more valuable than pulling stuff out of their asses). But the writings of the non-charlatans were presented so confusingly and tendentiously that it never remotely cohered into any sense that semiotics was a field with any integrity of its own or anything useful to contribute. That is to say, none of those thinkers would have described himself as a "semiotician," so it was very much a post hoc attempt to put a framework around a bunch of very diverse writing, in ways often quite foreign to the writers' original intent.
This is all n=1, of course, but on that basis I tend to think that semiotics as a standalone field is probably more or less as you say it is.
I like how you call it "a set of heuristic practices that work well in a lot of situations, justified by complete bullshit." Because my first instinct when writing this comment was to include a remark to the effect that even if theoretical semiotics is mostly or entirely crap, there is some valuable work that calls itself some variety of applied semiotics. For example, there are some people who do musical semiotics, and—since it's not at all obvious what music signifies and how it does so, either in the general or specific cases—I have found some of that work enlightening. But on further thought, you're absolutely right in your characterization of it. Musical semiotics can be full of insight, but its adoption of the theoretical apparatus of (e.g.) Roland Barthes or Umberto Eco is no part of its value—that, instead, is an attempt to bring "theoretical" rigor to a fundamentally nonrigorous enterprise. So much the worse for any "application" of semiotics if it relies on the cesspool of semiotic theory to back up its assertions.
Phil, you've probably seen this already, but a bunch of proposals for alternative notation systems are collected here. Some of them are basically exactly what you would prefer to be reading. It would be really cool if someone wrote a Lilypond package that could output in some of these systems. (Maybe someone has, I don't know.)
Very true. Staff notation essentially says "Here are the pitches and rhythms, now it's your job to figure out how to make them happen on your instrument." As you point out, a very real alternative to staff notation exists in tablature, which (in general) is any notation system that instead says "Here's what you need to do physically on your instrument. Follow these instructions and the notes will automatically be the right ones—you don't need to worry about what they 'are'."
Tablatures are surprisingly old, apparently going back 700 years or so in various forms. Of course, their drawbacks as general musical notation are clear enough. Namely, if you want to understand what's going on in the music or play music on a different instrument, tablature is really only a kind of lookup table for actual notes, and often a very cumbersome one.
Agreed on all this.
I didn't have anything really radical in mind. I think it's pretty clear that there's a long-term trend toward high-level music-making relying on notation to a decreasing extent. I have a number of friends who are professional composers, and some of them use notation to write for instruments, while others use electronics and largely don't use notation at all. (The latter group, who compose for video games, movies, etc., are the ones who actually make money at it, so I'm by no means just talking about avant-garde electronic music.) A lot of commercial composers who would have been using paper and pencil 30 years ago are using Logic or Digital Performer today.
The other factor, of course, is that notated genres of music ("classical" music and its descendants, and some others) are increasingly marginal in Western culture. This trend is often way overblown, but is clearly visible at the timescale of decades or longer.
What I certainly don't mean to suggest is that those of us who use notation in our musical lives, like you or me, will stop using it. It'll be a cohort replacement effect, and no doubt a very gradual one. Nor do I think that music notation will entirely go away at some foreseeable point in the future. But reading and using it will slowly become a more specialized skill. My impression, though I don't have a reference for this and could be completely wrong, is that the ability of American adults (not pro musicians) to read music notation with some fluency has hugely declined over the last half-century.
All this is very much the framing argument of Taruskin's Oxford History of Western Music, with its much-criticized focus on what he calls the "literate [his needlessly inflammatory term for 'notated'] traditions" of music. Within that frame, he casts the present day as essentially an "end-of-history" moment.
Correct me where I'm wrong here! I'm not a specialist in these issues.
Let me add that, like you, I absolutely love music notation, borderline fetishize it, and say all this with more than a trace of a Luddite's sadness.
Good post and I'll chime in if you don't mind. I teach this stuff for a living and even highly skilled musicians struggle with it in various ways (myself emphatically included).
The main thing I want to say is that there's a reason why essentially all music education consists of many years of rote learning. Obviously, that rote learning works better if it's guided in appropriate directions, but I really don't know of any alternative to what you describe when you say "an orders-of-magnitude-less-efficient mechanism for memorizing note-to-note mappings for every note and every pair of keys." I hate to say it, but ... yep. [EDIT: eh, let me qualify that a bit. See point (A) below.]
Sight-transposition (i.e. sight-reading plus on-the-fly transposition) is a ninja-level skill. Some instrumentalists (usually those who play non-concert-pitch instruments) can do it reasonably well for at least some transposition intervals, and a few people like professional vocal accompanists and church organists need to be able to do it fluently as an expected part of their job. But outside of those folks, even professional musicians rarely have that facility.
Here's something that directly supports your point at (D). As you know, pitch intervals in tonal theory are given names that break arithmetic—a second plus a fourth is a fifth, even though 2+4≠5. A certain well-known music theorist often expresses the view that this blatantly illogical convention is almost entirely responsible for the popular perception that music theory is a really, really difficult subject. I think this exaggerates things, but he's got a point. However, most musicians know those interval names really well and have never thought much about how stupid they are, and so then high-level music theory becomes opaque to skilled musicians because we start by renaming intervals correctly (i.e. a second is diatonic interval 1, and you can add them like normal numbers).
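To make the renaming concrete, here's a toy illustration (the framing and function names are mine, not the theorist's): traditional 1-based interval names force you to subtract 1 when stacking intervals, while 0-based diatonic intervals add like ordinary integers.

```python
# Toy illustration of interval arithmetic (my own framing, not any
# particular theorist's notation).

def stack_traditional(a, b):
    # Traditional names are 1-based (unison = 1, second = 2, ...),
    # so stacking two intervals double-counts the shared note:
    return a + b - 1

def stack_diatonic(a, b):
    # Zero-based diatonic intervals (unison = 0, second = 1, ...)
    # add like ordinary integers.
    return a + b

print(stack_traditional(2, 4))  # 5: a "second" plus a "fourth" is a "fifth"
print(stack_diatonic(1, 3))     # 4: the same interval, with arithmetic intact
```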
In the case of the frustrating conventions of staff notation, there are historical reasons going back a millennium why we write pitches like that. Various reforms have been proposed, but path-dependency basically makes it impossible that any of them would ever be adopted. Far more likely (and well underway for decades now) is that musicians will stop using notation altogether.
Just to briefly answer your other questions with my personal views:
(A) Personally yes, I have all the note-to-note mappings memorized. I do this completely via thinking in scale degrees. I can name any scale degree in any key, so questions like the one you mentioned just revolve around thinking "B-flat is scale-degree 4 in F major. What's scale-degree 4 in C or A-flat?" (There's a quick sketch of this in code after point (C) below.)
(B) Yes, I do think this is plausible, and underappreciated in the specific case of music, since most musicians don't think much about the ways in which notation isn't an optimized system.
(C) Maybe this is too glib, but ... social interaction? "Overthinking it" isn't a path to doing well in social settings. For that matter, natural language might be another. In many respects it's best learned by rote (along with some theory—just like music) but I've certainly had classmates in language courses who get too hung up on the illogic of grammar to progress well in basic skills like speaking and listening comprehension.
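Here's the quick sketch promised under (A): a minimal illustration of transposition via scale degrees. The scale table and function name are my own, purely illustrative; spellings are simplified to a few major keys.

```python
# A minimal sketch of transposition via scale degrees, as described in (A).
# The scale table and function name are illustrative, not a real library.

MAJOR_SCALES = {
    "F":  ["F", "G", "A", "Bb", "C", "D", "E"],
    "C":  ["C", "D", "E", "F", "G", "A", "B"],
    "Ab": ["Ab", "Bb", "C", "Db", "Eb", "F", "G"],
}

def transpose(note, from_key, to_key):
    """Find the note's scale degree in from_key; return that degree in to_key."""
    index = MAJOR_SCALES[from_key].index(note)  # 0-based; scale degree = index + 1
    return MAJOR_SCALES[to_key][index]

print(transpose("Bb", "F", "C"))   # F:  scale degree 4 of C major
print(transpose("Bb", "F", "Ab"))  # Db: scale degree 4 of A-flat major
```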
Yes, it should be clarified. The main ambiguity that I was reacting to is that "art" can mean specifically visual arts or it can mean "the arts," extending to performing and literary arts. As it is, I'm not sure if my profession (scholarship concerning music) is "art" or "other."
In fact (now addressing Yvain again), why is this category called "Profession" instead of "field"? It creates some odd overlap with the previous category of "Work status" which produces a little bit of confusion per my original suggestion and fubarobfusco's reply.
On "Profession," the field label "Art" is vague. Better would be "Arts and humanities."
I used to hear something similar in debates over gay marriage:
Gay person: "I only want to have the same right as a straight person: the right to marry the person I love."
Gay marriage opponent: "No no, you already have the same right as a straight person: the right to marry a person of the opposite sex. If you also want the right to marry a person of the same sex, you're asking for extra rights, special privileges just because you're gay. And that simply wouldn't be fair."
Edit: bramflakes beat me to it.
Right. But, when exposed to it, some are drawn in and some run as fast as possible in the opposite direction. The point of the example was that there's a surprisingly large amount of individual variation in what kinds of fundamental sounds and timbres people find most pleasing, and (I cautiously suggest) that appears to be the most innate and least malleable or learnable aspect of a person's response to various kinds of music.
Just a couple of thoughts about this. First, as far as anyone can tell music enjoyment is a remarkably multifaceted phenomenon (and "music" itself is a term that describes a pretty giant range of human behaviors). There's no single reason, or even manageably short list of reasons, why people like it. It seems to be wrapped up in many different physical, neurological, cognitive, emotional, social, and cultural systems, any of which (in any combinations) could be responsible for a certain person's reaction to a certain kind of music. Some of the aspects of that seem to be relatively innate, like finding certain sonic timbres inherently pleasurable, while others are highly learned, like the kind of pleasurable "understanding" that comes from knowing how a classical sonata movement is ordinarily structured.
In your case, I'd guess that you have an atypically low physiological/neurological enjoyment of things like instrumental timbres, which makes the more cognitively demanding aspects of music-listening no more than a chore. For comparison, this is why we don't generally listen to spoken words (e.g., audiobooks) as background listening: there's nothing to be gained from it outside the semantic content, which is distracting unless you can tune it out, in which case why bother.
(Merely finding music distracting is not at all rare. In fact, the various professional musicians and music scholars I know listen to less music than most other people do, because our training makes it hard for us to listen as other than a "foreground" mental activity. I myself almost never listen to background music. Unlike you, though, I do like music a lot.)
We seem to have a tendency, when discussing music as when discussing other things, to assume that other people are more like us than we have any good reason to think they are. For example, I find the timbres and general sound world of noise music to be extremely unpleasant. So when I imagine someone who likes noise music a lot, my first impulse is to think they must in some sense "enjoy unpleasant things" (an obvious category error), or at least that they must find something in noise music that's rewarding enough to get past how clearly unpleasant the sounds are. And yet when I actually talk to a fan of noise music, they often tell me they find the timbres and sounds of noise music (exactly the aspects of it I can't even imagine liking) to be very pleasant or arousing in some way. The enjoyment of these basic aspects of a kind of music (what kinds of sounds it's made up of) seems to be sufficiently physiologically/neurologically determined for a lot of people that it is almost impossible to imagine liking a kind of music you don't "naturally" like.
In other words, and I do not mean this even slightly pejoratively, I would expect it to be very difficult for you to imagine why other people find, say, the sound of an orchestra playing a single major triad (NB, a purely sonic event with no syntactic or semantic content) pleasant. Much as it is for me to imagine finding noise music pleasant—it's just not what my brain is built to enjoy.
Relatedly, the history of the questions "why do people like music?" and "what kind of music is best?" features some truly aggravating episodes that seem to stem from the idea that music is (or should be) a single kind of thing to all people, and that we just have to figure out what. (To be clear, I'm in no way suggesting that you're taking that point of view.) The idea that music is just a really, really complicated phenomenon with which everyone interacts a bit differently—and the corresponding aesthetic pluralism that follows from that fact—has been amazingly slow to spread, no less so in professional music circles than elsewhere.
This is strictly pop-science writing, but there was an interesting piece in the NYT Magazine a couple of years ago about ketosis as a treatment for pediatric epilepsy, where apparently it's extremely effective at controlling seizures in a significant fraction of patients.
I don't think I understand at all what these descriptions of confidence levels are supposed to mean. Do they refer to your confidence in specific pieces of information about the people in the descriptions? Information you heard from those people? What scenario does the business about email addresses envision?
EDIT: Apologies, I now see the parenthetical "(being applied to identity verification, where possible)," which I managed to completely overlook on a first reading. Please ignore the above criticism, but you still might want to make the deciban descriptions more explicit.
My pleasure, glad it seems useful.
Sounds like you have some good, concrete ideas about how to proceed. Contacting professors whose work interests you, to ask about graduate study in their departments and/or labs, is certainly a necessary step.
Throughout academia, we have a rule of thumb: do not ever, ever, spend any of your own money or go into debt for a PhD. That means that any place at which you should give the slightest consideration to doing graduate work should offer you a full waiver of tuition, plus a modest income ("stipend") and health insurance, for the duration of a reasonable period of study. The rationale for this rule of thumb is twofold: First, the expected financial returns to a PhD simply aren't such that you can afford to risk having tens of thousands of dollars (or more) of debt to repay. Second, a university's willingness to spend their money to fully fund you serves as a useful indicator that they think you have real potential for success.
When you correspond with scientists with whom you might want to study, they should be able to tell you roughly how funding works in their departments. It's not the same at every university or for every student. Possible sources for funding are basically: (1) You working as a researcher in someone's lab, supported by the university and/or by grants won by the lab's PI; (2) you working as a teacher or teaching assistant; (3) fellowship support provided by the university (i.e. they just give you money); (4) outside grants or fellowships you win yourself. The normal case for scientists is that your funding mostly comes from (1), but among scientists of my acquaintance there has been a healthy mixture of all four, and nearly all graduate students in science will at some point get funding from more than one of those sources. However, what they should be able to tell you before you even apply is how many years of funding are guaranteed by the university, whether funding is usually available beyond the guaranteed years, and what the typical funding package consists of (as I said earlier, it should at a minimum contain a full tuition waiver, health insurance, and a modest stipend for living expenses suitable to the area you'd be living in).
That's pretty much all I can tell you about the funding of graduate study in the sciences, since my entire academic life has been spent on the arts and humanities side, which handles graduate funding somewhat differently. The people you should be leaning on for advice are professors at your own undergraduate institution—particularly younger ones, since they will have gone through this more recently—and other knowledgeable scientists. They should be able to separate your academic and scientific potential from your lack of practical know-how and help guide you through the process of application, from identifying places to apply all the way to deciding which of your admission/funding offers to accept, if you get that far. They will have a lot more to tell you than I possibly can about what questions you should be asking of potential grad schools at all stages of the process.
A few other notes:
- If you're noticing conflicting information about how graduate funding works, it's probably just because different departments handle it differently. When in doubt, refer to the rule of thumb above. It's ok for departments to achieve full funding of graduate students in different ways, but not ok for them to fund some students but not others, or to admit you without making it clear how funding will work.
- You could also be getting conflicting information from people with experience in different branches of science. Psychology, molecular bio, evolutionary bio, experimental physics—to pick a few—all have their own characteristic ways of approaching graduate study, collaboration, funding, etc. So it's best to get advice from people as near as possible to your own interests.
- Some science departments admit graduate students to the overall program and then let them later choose which lab to affiliate with. Others admit you with the up-front understanding that you will be working in a particular lab. Find out how it works at the places you apply to.
- When weighing offers of graduate admission, try to get some data on outcomes for students in the program, such as job placement, time to degree, and success at winning grants (especially if grants are relied upon for graduate funding). Also, talk to current students in the program, who can tell you whether the program does well by its students, or alternatively makes life tough for them, e.g. by screwing them out of funding.
- A really serious round of graduate applications does cost some money. In your comments you often seem concerned about that. Unfortunately with a total lack of support from your parents you'll probably need to have a few hundred bucks in reserve for costs associated with applying, and another few hundred bucks for moving to the area where your new school is located. If you aren't prepared to live off-campus in an apartment, which carries logistical headaches that you seem quite daunted by, all large research universities have on-campus graduate dorms, so you really would not need to do anything except drive there with your personal belongings packed into a car. Anyway, save up a little money.
A lot of these concerns are a ways down the road for you, though. You'll probably find that getting funding is easier than you might think at graduate programs you really want to get into. The best thing you can do as an undergrad is make yourself an un-ignorable candidate for graduate admission. Study like crazy, get high test scores (super important, don't let anyone tell you otherwise—this is true even in the humanities), find some ways to take initiative, and if possible form some good relationships with faculty at your college.
Good luck! Do try to get a mentor at your college, it's a much more reliable source of personalized information than pseudonymous musicologists you met on the internet. There are also books and online forums for people who want to do graduate study in the sciences, although I can't personally recommend any by name.
Thanks for this post. Whatever problems the JTB definition of knowledge may have—the most obvious one of those to LWers probably being the treatment of "knowledge" as a binary condition—the Gettier problem has always struck me as being a truly ridiculous critique, for just the reasons you put forward here.
Scott Lemieux once called this the "my-utopia-versus-your-grubby-reality asymmetry," a delightful turn of phrase which has stuck with me since I read it.
Although Lemieux was talking about something subtly different from, or possibly a subset of, what you're talking about: the practice of describing the benefits of your own preferences as if you could completely and down to the smallest detail redesign the relevant system from scratch, while insisting on subjecting your opponent's preferences to a rigorous "how do we get there from here" analysis with all the compromises, imperfections, and unforeseeable obstacles the real world always entails.
"That" if you're a grammar Nazi; either one if you're a professional linguist or mere native speaker of English. :)
A big +1 to this and it echoes in many respects my advice here to a similar question. What you hit upon here that I did not do in that comment is the importance of understanding the etiology of one's new belief.
Yeah, but it might be useful to know what the person in question considers to have been the crucial aspects of their procedure, as opposed to merely ancillary aspects. This won't be foolproof, but it will at least have better-than-chance odds of contributing something useful to the advice.
For starters, I'd say it would be best to take advice from people whose careers and accomplishments are to some extent a matter of public record. Then you can evaluate (a) whether they seem to have actually accomplished the things they're trying to teach you to accomplish, and (b) whether they seem to have accomplished those things via the procedure they're encouraging you to follow. If yes to both, then you might proceed further.
In that case, the problem of making good advice seem too easy might come down to a couple of things. First, you want to see a good step-by-step procedure where you can really understand each step and imagine exactly what you'd have to do to achieve it. Second, it would be a red flag if any of those steps seem to be "magic" steps such as "Have a brilliant, lucrative idea for a business."
Glad you're doing this and sorry that I do not currently have time to proofread a batch of them myself.
This might be the only time ever that I can mention this without automatically sounding like an asshole, so here goes: Eliezer, whose writing is generally amazingly consistent and well-proofread with respect to style and punctuation, has the habit of using a hyphen, surrounded by single spaces, in place of a dash. He's far from alone in doing this, and it's an entirely reasonable habit to have given that the hyphen is an ASCII character but dashes aren't. However, it doesn't look good even on the web, and looks a lot worse in nicely typeset text. I'd strongly recommend replacing these space-enclosed hyphens with one of the two standard, correct dash usages: either an em (long) dash without spaces, or an en (short) dash enclosed in single spaces.
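For what it's worth, the replacement is mechanical enough that a script could do a first pass. Here's a toy sketch of the em-dash option (illustrative only, not part of any actual proofreading toolchain; a real pass would also need to skip deliberate spaced hyphens, code samples, and the like):

```python
# Toy sketch: convert space-hyphen-space to an unspaced em dash.
import re

def fix_dashes(text):
    # "\u2014" is the em dash; the lookarounds require a non-space
    # character on each side so we only touch hyphens used as dashes.
    return re.sub(r"(?<=\S) - (?=\S)", "\u2014", text)

print(fix_dashes("He's far from alone - and it's a reasonable habit."))
# He's far from alone—and it's a reasonable habit.
```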
My apologies if this kind of advice isn't welcome or, of course, if you and your proofreading team already thought of it!
If you want to be a professional biologist (or any professional scientist) you will probably need to get one or more graduate degrees. (There are exceptions to this, but your career possibilities will be more limited.) This complicates matters in some respects and simplifies them in others. Let me mostly focus on ways it simplifies matters.
- Going to grad school will give you the chance to move someplace for a few years before eventually moving on to yet someplace else. This is great for a few reasons. You get a chance to see what it's like to move -- maybe it won't be as bad as you think. You defer making a more permanent decision until you're older and have more experience. You (maybe) get to see another part of the country for some perspective on where you live now. Even if you wind up not liking the place you go very much, it'll still have been a good experience for these reasons.
- If your parents are remotely reasonable, they have to be more ok with you moving away for solid career-related reasons than they would be with you moving away for more nebulous lifestyle-related reasons. (Although from this and your other thread, it sounds like your parents might actually not be remotely reasonable at all.)
- However, if you get some admission offers from grad programs, they might pay your travel expenses to go there and check out the programs. Furthermore, most good science PhD programs will fully fund your studies and living expenses, so saving up a bunch of money before moving isn't really an issue. Having this kind of financial freedom will enable you to defy your parents' ridiculousness about this. (You won't be living high on the hog by any means, but you'll have enough to get by.)
- If you can get good advice from your college professors about which grad programs would be a good match for your interests and aptitudes, this will usefully constrict your options and help you overcome the paralysis of choice. While ultimately you may want to be more discriminating about where you move for the long run, see my first bullet point for why it would probably be good to just get that first move under your belt.
I'm not a scientist, so I may be underestimating the possibilities for interesting, fulfilling employment in the sciences without a graduate degree -- others can correct me if this is the case. But I think I've given some reasons why you might want to consider grad school even if I'm wrong in that respect.
Well, at the risk of explaining my joke, I only meant to suggest that the opening of the chapter makes it sound like Beck thinks Beethoven's Fifth would have been "famous" and instantly recognizable to Englishmen in 1678. Maybe I should charitably assume that Beck originally had it as "the latest church anthem by Purcell" but his editors made him change it.
He should have brought Archimedes's Chronophone with him instead!
(ugh, I'm sorry for that.)
The chapter begins with a pretty delightful infelicity, since in 1678 Beethoven's Fifth Symphony was still 130 years away from its premiere. Granted, this is very specialized knowledge available only to professional musicologists like myself and I doubt Beck's publisher can afford my consulting fees.
(I can just imagine the English scientists standing around wondering why this lunatic is inflicting this cacophony on them and looking at them so expectantly.)
You know, this is one of those cases (coming out as GLBT would be another one) where we sometimes have to, in essence, parent our parents. Be the patient grownup while they have their temper tantrum, and after they calm down be willing to forgive the hurtful, ridiculous things they said. I think it's more than reasonable to say you'll only talk to her about this when she can be at least calm about it. Encourage her to ask you questions and answer them honestly. Reassure her that nothing about your relationship with her has changed -- she has no need to feel that she doesn't know you.
If this is really a shock to her, it might be a while before she can get used to it, and again, you have to be the patient grownup during that time. But she will probably get used to it eventually. And if after a reasonable length of time she is still giving you grief about it and making it clear that she doesn't accept you, you can let her know that she needs to hurry up and get over it or else she will not see you as often. (All this is entirely parroting Dan Savage's advice to people whose parents don't accept their sexual orientation: as he says, the only leverage you ultimately have over your parents is your presence in their lives.)
I'll add this: in your conversations with your mother, this is not the right time to argue for the factual correctness of atheism. Even if you don't really believe this, I would emphasize that religion is a very personal matter and that you are just the kind of person to whom religion doesn't seem right. That way you're making it about you, not attacking the foundations of her own beliefs. (Furthermore, this can help reassure her that she didn't fail as a parent -- you were just not the kind of person who could have been given a Catholic education that would really stick.) Ultimately, having close personal relationships with people who you really disagree with about religion has to involve agreeing to disagree and to compartmentalize some things, and also at times to leave some topics off the table for discussion. Obviously, this isn't the way we'd behave in a society of pure rationalists, but the fact is that we do often want to have those relationships and so allowances must be made.
Lastly: you've done the right thing by -- and sorry to keep using this metaphor -- coming out of the closet. Society as a whole is bettered when religious people think of atheists not as a faceless, scary group but as a group of normal people including their own friends and/or children.
Basically, I disagree with this. A few thoughts:
(1) Even if we were all perfectly rational, it'd still take time to research the optimal answer to every question. Why shouldn't I outsource that research to people who are interested in doing it and whose basic viewpoints I trust? [Edit: RomeoStevens already made this point above.]
(2) What's the harm to you from posts on "applied rationality" topics being posted on LW? Don't read or comment on what you aren't interested in. If you prefer posts on the theory and practice of rationality itself, just read, comment on, and write those kinds of posts. LW Discussion is currently nowhere near being such a firehose that you can't quickly sift through what's been posted recently and decide which threads you're likely to be interested in.
(3) As we improve as rationalists, it's vital to repeatedly apply those skills in various contexts in order to practice them. Why shouldn't that practice take place in a quasi-social arena where others can point out flaws or gaps? If I think I've learned some skill of rationality (such as researching the optimal product to buy for some purpose), the odds of my continuing to apply it successfully with no further input from other interested people are not very good. I guess in this bullet point I'm arguing that even seemingly very object-level discussions are, in the LW context, actually functioning in part as discussions of rationality.
So, this is about developing a photographic memory for text, one paragraph at a time. Is that really something you want? Why not make an Anki flashcard out of the one thing (or more, if it's a really information-dense paragraph) you most want to remember from the text?
Thanks for the pointer, I'll check it out.
Leo, in my line of work, is one of the most useful resources on the web. I don't think there are comparable resources for any other languages besides German, which is really too bad. (If anyone knows different, please link!)
It's a curiosity stopper in the sense that people don't worry any more about risks from AI when they assume that intelligence correlates with doing the right thing, and that superintelligence would do the right thing all the time.
Stuart is trying to answer a different question, which is "Given that we think that's probably false, what are some good examples that help people to see its falsity?"
I guess I think this is, at best, only part of your true rejection. If there were some visionary artist who wanted to create art that would get thousands of people interested in the SIAI cause, such that donations poured in and some bright mathy kids decided to help solve FAI problems, I have a feeling you'd tell that artist "Go for it, with our gratitude."
(Ahem.)
This would in no way entail converting that person into anything other than a "pure" artist. There would be no need for that person to become the kind of highly flexible SIAI researcher you're suggesting here.
I think your true rejection is roughly as follows:
What you're arguing here is that some kinds of things are just plain unuseful to the SIAI cause. You almost certainly don't need the assistance of a musicologist, much as it may pain me to say. If I show up and say "I'm a musicologist, how can I help?", you're going to say "Well, either learn to do something useful for us or else donate some portion of your lavish musicologist salary to SIAI." And then if I say, "No no, how can I use musicology to help?", you're going to think I'm an idiot. This is more or less what you've sketched above.
However, a whole bunch of other things, including all the activities you included above (artist, scientist, business person, politician, hacker) are indeed potentially useful to SIAI. What you actually don't want—and quite reasonably so—is to be in the position of needing to manage those people's efforts. This is for a variety of reasons: You don't have enough people to manage them. You don't have enough in-house expertise in those fields to manage them effectively. You don't have jobs for them yet and don't want them hanging around in the meantime. You don't want the inertia that can come along with having a bunch of affiliated helpers act like they're owed a role when they've outlived their usefulness. Or, as you hint at the end, you don't think very highly of the quality of people who want to help but don't want to learn a bunch of new skills and shed some of their old identity, which seems like a reasonable heuristic to me.
As I say, these are all perfectly good reasons to demur when asked "I'm a [whatever], how can I help?". But I do not think that the answer is really "Sorry, an artist can't help." It's more like "Sorry, we're not interested in helping an artist figure out how to use art to help—if you can figure it out yourself, knock yourself out."
If both your work and your procrastination are computer-based (and isn't that a concise description of all my problems!), Beeminder plus TagTime looks like a pretty promising combination. Beeminder keeps track of personal goal-related data for you, and TagTime is a random sampling-based way of seeing how you spend your computer time. They're put out by (at least some of) the same people, and TagTime can automatically send your data to the relevant Beeminder graph.
NB: TagTime is only available in a developer version right now, which means that I haven't tried it because I don't know how to clone a git repository (not a skill much needed among musicologists). So this is just going from the description on their website of how it works. They say it will be available in a user-friendly version eventually. Beeminder, on the other hand, I've been using for a few things, and it's cool.
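Going, again, purely from their website's description, the core mechanism seems simple enough to sketch. This is my guess at the idea, not TagTime's actual code or defaults: pings arrive at random, exponentially distributed intervals, so the fraction of pings tagged with an activity estimates the fraction of time spent on it.

```python
# A toy sketch of random-sampling time tracking in the spirit of TagTime
# (my reading of their description, not their implementation or defaults).
import random
import time

MEAN_GAP_MINUTES = 45  # illustrative average time between pings

def next_gap_minutes():
    # Exponential gaps make the pings a Poisson process: memoryless,
    # so you can't game them by watching the clock.
    return random.expovariate(1 / MEAN_GAP_MINUTES)

tags = []
while True:
    time.sleep(next_gap_minutes() * 60)
    tags.append(input("Ping! What are you doing right now? "))
    # Over time, tags.count("work") / len(tags) estimates the fraction
    # of time spent working.
```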
Yeah, I thought so too. :)
That's a great song -- I hadn't heard it before and it's satire at its finest.
This is tangential, but I noticed that it's a very literal parody (especially at the beginning) of "I Have Confidence" from The Sound of Music, which (while not exactly a rationalist anthem or anything) is a song about the virtue of shutting up and doing the impossible, when you have to.
Interesting ideas -- I can think of a few more. On the "smart = more moral" side:
- Smart people are more likely to be able to call on Kahneman's System 2 when necessary, which correlates with utilitarian judgments (see this paper by Joshua Greene et al.). Similarly, they're more likely to have the mental resources to resist their worst impulses, if they want to resist them.
- Note that some of your "smart = less moral" proposals concern a world in which some people are much smarter than others. If cognitive enhancement were widespread, we might get its moral benefits without the drawbacks of smart people suffering social stigmas of various kinds (your first two bullets in the second set).
- Being much smarter might include being much better at interpersonal skills, increasing empathy for others.
- Likewise, if there are morality network effects -- as in the tendency for well-organized societies to be less violent -- then a smarter overall population might be very much more moral.
On the "smart = less moral" side:
- If cognitive enhancement happens such that some people are much, much smarter than others, the temptation for the much smarter people to use their intelligence to take advantage of the less-smart people may be simply too great to resist. Presumably even very, very smart people will have their price.
By and large, I think I'd agree with you that it seems right that a smarter human population would be more moral, but it's by no means certain.
This suggests to me that Task #1 is finding ways for people to engage with your ideas without involving a status competition between you and them.
I think this is exactly right. In other words, people who don't yet know how to leave themselves a line of retreat might, at the outset, need us to do it for them.
Good for you for learning this material. Let me know if you want more suggestions for things to read concerning group theory and music theory.
Wow, that's very interesting. I haven't seen any use of Bayesian methods along similar lines in music theory -- that is, to try to account for otherwise opaque compositional motivations on the part of an individual composer. I look forward to reading the article more closely, thank you for passing it along.
Where Bayes is beginning to crop up more often is in explicitly computational music theory, such as corpus music research and music cognition. I have a colleague who (among other things) develops key-finding algorithms on a large corpus of tonal music, in which Bayes's theorem is sometimes useful. I don't know for sure how much of that has appeared in print so far, since it isn't my area, but I know it's a tool that researchers are aware of.
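To give a flavor of how Bayes's theorem can enter that kind of work, here's a heavily simplified sketch. The pitch-class profile values and function are invented for illustration; they are not my colleague's algorithm or any published one.

```python
# A heavily simplified Bayesian key-finding sketch: infer the most probable
# major key from observed pitch classes. Profile values are invented for
# illustration; real key profiles come from corpus statistics.
import math

# Hypothetical P(pitch class | major key), indexed relative to the tonic
# (0 = tonic, 7 = dominant, etc.); the values sum to 1.
MAJOR_PROFILE = [0.20, 0.01, 0.11, 0.01, 0.13, 0.10,
                 0.02, 0.18, 0.01, 0.11, 0.02, 0.10]

def key_posterior(pitch_classes):
    """P(tonic | notes) over the 12 major keys, with a uniform prior."""
    log_likelihoods = []
    for tonic in range(12):
        # Likelihood of the observed notes, with the profile rotated
        # so that this candidate tonic sits at index 0.
        ll = sum(math.log(MAJOR_PROFILE[(pc - tonic) % 12])
                 for pc in pitch_classes)
        log_likelihoods.append(ll)
    # Bayes with a uniform prior: posterior is proportional to likelihood.
    m = max(log_likelihoods)
    weights = [math.exp(ll - m) for ll in log_likelihoods]
    total = sum(weights)
    return [w / total for w in weights]

# A C-major-ish fragment: C, E, G, C, D, E.
posterior = key_posterior([0, 4, 7, 0, 2, 4])
best = max(range(12), key=lambda k: posterior[k])
print(best, round(posterior[best], 3))  # 0 (= C) should come out on top
```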
I think how important these criticisms are depends on who the intended audience of the essay is -- which Gwern doesn't really make clear. If it's basically for SIAI's internal research use (as you might think, since they paid for it), tone probably hardly matters at all. The same is largely the case if the intended audience is LW users -- our preference for accessibly, informally written scholarly essays is revealed by our being LW readers. If it's meant as a more outward-facing thing, and meant to impress academics who aren't familiar with SIAI or LW and who judge writings based on their adherence to their own disciplinary norms, then sure. (Incidentally, I do think this would be a worthwhile thing to do, so I'm not disagreeing.) Perhaps Gwern or Luke would care to say who the intended beneficiaries of this article are.
For myself, I prefer scholarly writing that's as full of first-person statements as the writer cares to make it. I feel like this tends to provide the clearest picture of the writer's actual thought process, and makes it easier to spot where any errors in thinking actually occurred. I rarely think the accuracy of an article would be improved if the writer went back after writing it and edited out all the first-person statements to make them sound more neutral or universal.
That's it, thanks. I should have known it was on Less Wrong!
I wish I could remember where I read this (or even in what academic field). But some academic once wrote that his most acclaimed, most cited papers were always the ones he thought of as mere summaries of existing knowledge. This made a strong impression on me. In most cases when dealing with high-level ideas, very good restatements of previous research are not only valuable, but likely to make those ideas click for some non-trivial number of readers. A few other thoughts:
This seems strongly related to the notion of inferential distance -- we tend to underestimate it.
This is a good way to sum up the Lukeprog era of Less Wrong: There is plenty of low-hanging fruit in merely doing your research and saying the obvious.
If people are disinclined to say the obvious, I wonder how many conversations on difficult topics consist mostly of talking past one another. Perhaps more than we'd otherwise think.
Carrier's book may be seen as the first salvo in that attack, but this makes me wish his case had not been presented in the context of such a parochial and disreputable sub-field of history as Jesus Studies.
Boy, do I ever agree with this. I would love to be able to cite Carrier's work (edit: that is, his methodological program) without appearing to take on the baggage of interest in an area that is simultaneously irrelevant and mindkilling -- that is, in which having opinions might be taken as chiefly an indication of tribalism.
Certainly, to really get the attention of historians who might benefit from using Bayes, it does make sense to present the method in conjunction with an application of the method -- preferably one that leads to a really striking, counterintuitive claim being argued for. But the cultural loadedness of using Jesus studies seems likely to forestall Carrier's work having that result.
Tangentially: Perhaps it's emblematic that McCullagh's (non-Bayesian) Justifying Historical Descriptions has a preface that concludes with these sentences: "Finally I would like to acknowledge what I believe to have been God's guidance and support in the production of this book. It is just a pity that the clay He had to mould was so recalcitrant! Please praise Him for what is true in it, and forgive me for what is not." McCullagh does, in fact, present some useful heuristics for historians to use.
That quote is kind of awesomely terrible. Sure, as everyone knows, all fields of human endeavor have exactly the same kind of purpose!
Ok, it's undoubtedly true that de Botton and I share a good many values. But I do insist that his current project strikes me as incredibly misguided if not outright stupid. I would expect him to be quite resistant to an SIAI-like program of answers to the kinds of "philosophical" questions he's asking. He seems to believe that religious leaders, despite basing their teachings on their totally groundless factual claims about reality, are important moral teachers who must be taken with utmost seriousness. And he believes that (for example) Richard Dawkins, in advocating for factual positions that de Botton believes are correct, is being destructive. It's simply no better than a theory of non-overlapping magisteria.
Also, as I said before, I think he's wrong that research into the questions he's interested in is not being done. For a man who abandoned academia (he began a PhD in French philosophy, a field of interest which is very unlikely to be a good sign) in favor of being a popular writer, he doesn't seem very interested in seeking out that research and popularizing it. Instead he says things like (from the original link): "The arrogance that says analysing the relationship between reasons and causes is more important than writing a philosophy of shyness or sadness or friendship drives me nuts. I can't accept that." I'm not sure exactly what analysis of "the relationship between reasons and causes" he's referring to, but he clearly states that all research into metaphysics is pointless, while "philosophy of" various aspects of everyday life is of vital importance.
I see no sign that he'd find LW-style thinking congenial or constructive, or that he in fact values knowledge as such. I think he values lofty rhetoric and vague-but-profound-sounding statements about ordinary life. I deny that he plays for my team.
It's a nice quote, and correct as far as it goes. "We raise these questions not in order to provide definitive answers, but in order to stimulate questioning" is an annoying trope. However, a few thoughts:
- There may be some value in finding definitive answers offputting. Namely, if one values definitive answers too highly, one may be excessively compelled to prematurely proclaim one's answers definitive! But this isn't to say that definitive answers would not be desirable when they can be achieved.
- I doubt the attitude he describes is as prevalent in philosophy departments as he suggests. The vast majority of publications in current mainstream philosophy, whatever else you may say about them, do appear to me to be concerned with the enterprise of providing actual answers to questions. (Refining the precision of the questions themselves and knocking down failed answers to them both count as aspects of this enterprise.) If philosophy isn't coming up with actual, definitive, no-longer-questionable answers very often, it may be because the questions are actually very hard, or because philosophers are bad at answering them. But celebrating ignorance in favor of rhetorically pretty question-asking is not a frequent feature of any of the philosophy I read.
- Also, Alain de Botton is an idiot who I've been wanting to gripe about for a while now. I listened to him on the Philosophy Bites podcast and he seriously seems to believe that if religious institutions are weakened any further, we won't have community or nice architecture any more. While acknowledging the factual correctness of atheism, he doesn't want people to respond to their newfound atheism by actually changing any of their behavior surrounding religious institutions and rituals.
- The questions he claims Oprah Winfrey raises are, if you click through: "how do we live with other people, how do we cope with our ambitions, how do we survive as a society". These are all fine questions, although I don't know what they would reduce down to if formulated more precisely, but it seems just silly to think they're principally philosophy questions. Serious people are working on all three of them, just not mostly in philosophy departments. De Botton seems deeply bored by generality and abstraction -- but one thing philosophers do best is figure out how to see specific problems as special cases of more abstract ones. I think he just doesn't like philosophy very much, and (in keeping with his overriding concern with keeping religious institutions active in an atheistic society) he would prefer philosophers who remind him more of life coaches, religious sages, or the like. (When he says "how do we survive as a society", I don't think he's referring to existential risk!)
I am very, very wary of wading into anything approaching a debate with you, given my respect for you. But I feel that this comment assumes an unrealistic picture of how time/money tradeoffs work in most people's lives. Most of us do not have direct ways we can translate a couple of spare minutes into the corresponding amount of money, and even if we did, we aren't perfect utilitarians who always make as much money as we possibly can and then donate every remaining penny to the most efficient possible charity. If anyone is that kind of person, they should indeed act as you suggest.
However, most people have some inflexibility in terms of how much of their spare time they can trade money for, and how much of that money they feel prepared to give away. If you can spare 3 minutes to swab your cheek that you would not otherwise spend to earn 3 minutes' worth of money and send it to the Against Malaria Foundation, then you should probably consider that "free" time. Then the calculus shifts over to the odds that your donation, should you be asked to give it, would save someone's life. You estimated a generous 25% ($500) -- I don't know, but I'll go with that. In that case, the time would probably be worthwhile.
Let me emphasize that I understand I am not assuming perfect rationality or perfect utilitarianism, but rather how these kinds of tradeoffs are likely to play out in ordinary people's lives.
I've done this, and it is as easy as atorm says it is. I've also been the beneficiary of stem cell donation: my mother is currently alive and has a normal life expectancy after receiving a transplant that cured her of leukemia. She would otherwise have died within months of her diagnosis. Some years after her transplant, she was able to correspond with her stem cell donor, who told her that the donation was as simple as going to his local hospital and having his blood drawn. (These days, an agonizing, old-fashioned bone marrow extraction is rarely if ever necessary on the donor's end.) A cheap and easy source of utilons, indeed.
EDIT: this is especially worth doing if (as is apparently the case for another few days) a corporate sponsor has agreed to pay for the testing (but not to donate the same sum to the charity of your choice). If donors have to pay for their own testing, as is normally the case, I'm not claiming that this is necessarily a better use of the money than donating to some other charity. When I signed up, it was being paid for by my university.