I'd offer a couple of counterpoints:
a) Even at high levels, professors are rarely teaching the absolute cutting edge. With the exception of my AI/ML courses and some of the upper-level CS, I don't think I've learned very much that a professor 10-20 years ago wouldn't have known. And I would guess that CS is very much the outlier in this regard: I would be mildly surprised if more than 5-10% of undergrads encounter, say, chemistry, economics, or physics that wasn't already mainstream 50 years ago.
b) As a ballpark estimate based on looking at a couple of specific schools, maybe 10% of undergrads at a top university go on to a PhD. Universities can (and should) leverage the fact that very few of their students want to go on to do research, and the ones that do will almost all have 4-5 more years of school in which to learn how to do good research.
If I were running a university, I would employ somewhat standardized curricula for most courses and stipulate that professors must test their students on that material. For the undergraduate program, I would aim to hire the best teachers (conditional on a very strong understanding of the material, obviously), while for the graduate school I would aim to hire the best researchers, who would teach fewer courses since they would never teach undergrads. Top researchers would be attracted by not having to teach any intro courses, top teachers by not being pressured to constantly put out research, undergrads by having competent teachers, and PhD students by the individual attention that comes with having research faculty's full focus. And as a university, the amount of top-tier research output would probably increase, since those people wouldn't have to teach Bio 101 or whatever.
I contend that this leaves all the stakeholders better off without being more expensive, more difficult, or more resource-intensive. Obviously I'm wrong somewhere, or colleges would just do this, but I'm unsure where...
Hm... I seem to have mistaken "flexibility" for low hours and underestimated how much professors work. Is "teaches math at Stanford" really viewed as much lower-status than "researches math at Stanford" (or whatever college)? It seems like universities could drum up some prestige around being a good teacher if prestige is really the main incentive.
Update: someone IRL gave me an interesting answer. In high school, we had to take a bunch of standardized tests: AP tests, the SAT and ACT, national standardized tests, etc. My school was a public school, so its funding and status were highly dependent on these exam results. This meant that my teachers had a true vested interest in the students actually understanding the content.
Colleges, on the other hand, have no such obligation. Since the same institution both administers the classes and decides who gets a degree, there's very little incentive for it to teach anything, especially since students will typically be willing to teach themselves the skills they need for a job anyway (e.g. all the CS kids grinding LeetCode for a FAANG internship). There's actually so little accountability it's laughable. And with that little oversight, why would anyone bother being a good teacher?
God, I hate bad incentive structures.
I'm not fully convinced by the salary argument, especially after adjusting for quality of life. As an example, let's imagine I'm a skilled post-PhD ML engineer deciding between:
Jane Street senior ML engineer: $700-750k, 50-55 hrs/week, medium job security, low autonomy
[Harvard/Yale/MIT] tenured ML professor: $200-250k, 40-45 hrs/week, ultra-high job security, high autonomy
A quick google search says that my university grants tenure to about 20 people per year. Especially since many professors have kids, side jobs, etc., it seems unlikely that a top university really can't find 20 good people across all fields who are both good teachers and would take the second option (in fact, I would guess that being a good teacher predisposes you toward taking the second option). Is there some part of the tradeoff I'm missing?
I agree that this is the case (and indeed, a quick google search of even my worst professors yields considerably impressive CVs). I don't understand why that's the case. Is it, as ErickBall suggests, simply cheaper to hire good researchers than good teachers? I find that a little unlikely. I also find it unlikely that this is more profitable: surely student tuition plus higher alumni donations would be worth more than whatever cut of NIH/NSF/etc. funding they're taking.
My question is: who does this system leave better off? Students get worse professors, good researchers have to waste their time teaching, and good teachers have to waste their time researching. Other than maybe the science journals or something, who has a stake in perpetuating this?