Contests vs. Real World Problems

post by badger · 2009-03-25T01:29:02.264Z · LW · GW · Legacy · 34 comments


John Cook draws on the movie Redbelt to highlight the difference between staged contests and real-world fights. The main character of the movie is a Jiu Jitsu instructor who is willing to fight if necessary, but will not compete under arbitrary rules. Cook analogizes this to the distinction between academic and real-world problem solving. Academics and students are often bound by restrictions that are useful in their own contexts, but are detrimental to someone who is more concerned with having a solution than with where the solution came from.

Robin has pointed out arbitrary restrictions in academia before, but his question then was about topics neglected for being silly. Following Cook's line of reasoning, are there any arbitrary restrictions we have picked up in school or other contexts that are holding us back? Are there rationalist "cheats" that are being underused?


Comments sorted by top scores.

comment by James_Miller · 2009-03-25T14:21:45.890Z · LW(p) · GW(p)

Rather than just gambling with money, people could gamble with their lives. A global warming denier, for example, could announce that he is so sure that the earth will not be significantly warmer in ten years than it is today that if he is wrong about this he will kill himself. A legal system that enforced such a promise would, clearly, make it possible for someone to very credibly communicate the sincerity of his beliefs.

Replies from: Pierre-Andre, randallsquared
comment by Pierre-Andre · 2009-03-25T15:36:24.298Z · LW(p) · GW(p)

Scary, voted up.

This can be pushed further: law/morality/ethics are often "holding us back". Dissection of the human body has been forbidden and allowed many times throughout history, and this affected our knowledge of anatomy and medicine. Many physical and psychological experiments done in the past cannot be reproduced today, because they were "unethical".

It doesn't have to be Nazi experimentation. Informed consent requires that the person knows he is under study, which might skew the results.

Some famous experiments were even against the legislation of their time: Louis Pasteur tested his rabies vaccine illegally.

This vaccine was first used on 9-year old Joseph Meister, on July 6, 1885, after the boy was badly mauled by a rabid dog. This was done at some personal risk for Pasteur, since he was not a licensed physician and could have faced prosecution for treating the boy. However, left without treatment, the boy faced almost certain death from rabies.

Replies from: Kaj_Sotala
comment by Kaj_Sotala · 2009-03-25T20:26:13.423Z · LW(p) · GW(p)

Related to the Nazi experiments, there are people in the scientific community who argue that they should not be cited, even in cases where they provided valuable information:

Although it is difficult morally, one might concede that within the mass of pseudoscientific Nazi data some shreds can be valuable to researchers, as a small portion of the hypothermia data has proven to be. Of course, such data should be used only in the most exceptional circumstances and only in the absence of ethically derived data.

This seems absurd. The experiments were horrid and reprehensible, no question - but if they provided useful data, shouldn't we try to salvage at least something good from their deeds?

Replies from: CarlShulman
comment by CarlShulman · 2009-03-25T20:44:51.223Z · LW(p) · GW(p)

A policy against it may provide some marginal disincentive to future scientists under vile regimes.

Edit: of course the real cause of the objection is just 'moral contamination,' the same trigger-happy associational neural machinery used to avoid poisonous foods attaches negative affect to anything associated with the Nazis. But the heuristic can sometimes be useful, just as our cooperative emotions can be hacks to implement binding commitments.

Replies from: Eliezer_Yudkowsky, taw
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-26T00:20:23.641Z · LW(p) · GW(p)

If we assume those scientists actually care about their future number of citations, then yes.

comment by taw · 2009-03-26T00:15:09.215Z · LW(p) · GW(p)

How likely is it that genuine reasoning led to this conclusion, and how likely is it just a rationalization of the yuck factor? It seems pretty straightforward.

comment by randallsquared · 2009-03-25T21:09:09.831Z · LW(p) · GW(p)

A "global warming denier" doesn't necessarily believe the world is not getting warmer, or that it will certainly get colder. Just FYI.

comment by CronoDAS · 2009-03-25T05:29:12.248Z · LW(p) · GW(p)

One obvious "cheat" - ask someone else for an answer.

Replies from: matt
comment by matt · 2009-03-30T12:06:25.315Z · LW(p) · GW(p)

Much time is wasted working out something that should be looked up.

comment by talisman · 2009-03-25T02:57:01.213Z · LW(p) · GW(p)

The fear and hatred of gambling. Contra Tyler Cowen, betting your beliefs is one of the best paths to both individual and group rationality. You should be doing it twice a day, like brushing your teeth. The beliefs that don't get bet get cavities and rot; the beliefs that are unbettable create unbreakable deadlocks that later require ophthalmological intervention. Bet!

Replies from: William
comment by William · 2009-03-27T01:36:10.519Z · LW(p) · GW(p)

One warning, though: gambler's ruin is very possible with betting systems, even if your strategy has a positive expected value.
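
A minimal sketch of this point (the bankroll, stake, and win probability below are assumptions chosen purely for illustration): every bet has a positive expected value, yet a bettor with a small bankroll and a fixed stake still goes broke a noticeable fraction of the time.

```python
import random

def simulate_bettor(bankroll=10, stake=1, p_win=0.55, max_bets=10_000):
    """Make repeated even-money bets with a positive edge.

    Each bet has expected value p_win - (1 - p_win) = +0.10 per unit staked,
    but the bettor is ruined if the bankroll can no longer cover the stake.
    Returns True if ruin occurs within max_bets bets.
    """
    for _ in range(max_bets):
        if bankroll < stake:
            return True   # ruined despite the positive edge
        if random.random() < p_win:
            bankroll += stake
        else:
            bankroll -= stake
    return False

if __name__ == "__main__":
    trials = 10_000
    ruined = sum(simulate_bettor() for _ in range(trials))
    print(f"Ruined in {ruined / trials:.1%} of trials")
    # With these numbers the classic gambler's-ruin formula gives
    # (0.45 / 0.55) ** 10, i.e. roughly a 13% chance of going broke.
```

The average bankroll grows, but variance relative to a small bankroll is what creates the ruin risk; staking a fraction of the current bankroll rather than a fixed amount is the usual way to keep that risk down.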

comment by TreeFrog · 2009-03-25T16:43:18.531Z · LW(p) · GW(p)

I warn those of you with a Netflix account that Redbelt is one awful mess of a movie. Yes, the Brazilian jiu-jitsu instructor refuses to compete against the evil Brazilians, but it's not because prize fights are inherently different from grappling matches. It's out of some sort of dedication to an ideal of honoring a Japanese-looking master, an ideal that even he can't coherently articulate. Brazilian jiu-jitsu is built upon wrestling others - in class, in tournaments, in MMA fights and in real life.

It's one thing to have an arbitrary restriction; it's another to have one that's simultaneously a contradiction of the core principles of your profession and so unclear that even the person following it can't tell you what it is. And it's yet another thing to reward the instructor with a red belt - the supposed highest honor in jiu-jitsu - solely for getting angry, beating up security guards and brawling in the hallways of an arena to protest the stealing of his stupid idea.

Mamet tried to do about eleven different things with his script; Ejiofor's performance is the movie's sole redeeming feature.

That being said, this "do not hit a girl" thing has always annoyed me. There are biological differences between genders, but if someone deserves to be punched, they deserve to be punched.

Replies from: thomblake
comment by thomblake · 2009-04-02T16:16:46.716Z · LW(p) · GW(p)

That being said, this "do not hit a girl" thing has always annoyed me. There are biological differences between genders, but if someone deserves to be punched, they deserve to be punched.

Given the difficulty of determining whether someone's pregnant, and the large possibility of violence causing a miscarriage, and the long time it takes for a successful pregnancy, it seems like a pretty good heuristic. Almost as good as "do not hit anyone".

comment by [deleted] · 2009-03-25T05:21:29.838Z · LW(p) · GW(p)

The first things that spring to mind are the arcane company policies in large corporations, where responsibilities and duties are so finely assigned that they often become barriers to getting actual work done. This becomes especially true if a situation comes along that there aren't codified rules for (or, as I like to think of it, a selection pressure is introduced that the system isn't adapted for).

I also think it's interesting to contrast this to the rational technique of the least convenient possible world, where instead of removing artificial restrictions you keep adding them.

Replies from: billswift
comment by billswift · 2009-03-25T15:12:11.587Z · LW(p) · GW(p)

I'm curious about which "large corporations" you're writing about. I worked for Walmart, the largest corporation in the world, for 2.5 years, and my perception of the "responsibilities and duties" was diametrically opposite to what you are claiming. Of course, I know it's the leftist/academic thing to dump on business, but could some of you who do actually provide some references showing you aren't just making these claims up?

Replies from: Nebu
comment by Nebu · 2009-03-25T16:06:21.633Z · LW(p) · GW(p)

I think a common situation is the manager/idea-generator relationship. An idea generator is a person who spends a lot of their working time simply "thinking", and there is no apparent output to their task until the very end, when they output an idea. A programmer trying to design the right algorithm to solve a given problem is one example of an idea generator.

Often, the manager will want to have some sort of feedback on the progress, and have an estimate of the time remaining to completion. The idea-generator, however, has no idea how long their task will take. They might find the solution this afternoon, or they may spend months brainstorming on it.

And so the manager may "assign" responsibilities like writing daily reports on what was found so far, filling in time sheets, etc., to alleviate their nervousness from seeing nothing produced. Bureaucracy like this just takes the idea-generator's mind off the real problem at hand, and can slow things down.

Replies from: B_Frank
comment by B_Frank · 2009-03-26T01:05:11.366Z · LW(p) · GW(p)

We might say there are two kinds of "responsibility." School teaches people to be responsible to authority; the other kind is being responsible for eventual outcomes (such as truthfulness) by asking questions and challenging authority.

An example would be something I read recently about the institutional mindset held by journalists at newspapers: older editors and managers are practically begging young reporters for new ideas... the problem is that the type of people who go to work for newspapers now tend to want responsibilities (and security) given to them.

Meanwhile, a lot of people who never finished their homework or followed their assignment guidelines were distracted from school by new technologies -- sites like this -- and by learning from the proliferating information available online.

comment by gwern · 2009-03-25T15:33:18.655Z · LW(p) · GW(p)

Ad hominems. We are so well schooled in 'traditional' deductive rationality that we instinctively shy away from using this strategy, even though it's quite powerful and often we're using it in practice anyway.

Replies from: Cyan, army1987
comment by Cyan · 2009-03-25T16:09:32.632Z · LW(p) · GW(p)

...we instinctively shy away from using this strategy... often we're using it in practice anyway.

Is this not contradictory?

Replies from: gwern
comment by gwern · 2009-03-28T14:31:34.185Z · LW(p) · GW(p)

You understand the hypocrisy, then. We rely on this very general & valid strategy in all sorts of real-life, real-money situations, but when it comes to discussions of complex important topics? All of a sudden it is 100% verboten.

This, it seems to me, is exactly what an underused rationalist cheat would look like.

Replies from: ciphergoth, thomblake
comment by Paul Crowley (ciphergoth) · 2009-04-02T21:37:00.689Z · LW(p) · GW(p)

Can you give an example of something that this change would sanction?

comment by thomblake · 2009-04-02T16:19:19.929Z · LW(p) · GW(p)

This, it seems to me, is exactly what an underused rationalist cheat would look like.

I agree that it seems to match my impression of what the form should be. However, it's not just an arbitrary rule to not use ad hominem arguments. Ad hominem is an informal fallacy - non-fallacious ad hominems are really not all that unheard-of in academia.

comment by A1987dM (army1987) · 2012-04-17T16:03:19.950Z · LW(p) · GW(p)

Well...

Replies from: gwern
comment by gwern · 2012-04-17T17:06:01.709Z · LW(p) · GW(p)

I don't think that post disagrees with me.

comment by Psy-Kosh · 2009-03-25T02:50:54.028Z · LW(p) · GW(p)

Well, are you talking about higher academia, or earlier schooling too? I.e., there's always the classic "your paper must be at least X pages long", which college professors will explicitly say that you have to unlearn.

Replies from: thomblake, badger
comment by thomblake · 2009-04-02T16:25:09.261Z · LW(p) · GW(p)

your paper must be at least X pages long ... college professors will explicitly say that you have to unlearn

Funny - I find paper lengths to be a good guide to just how much I need to unpack my arguments. And these requirements never go away - try sending a 202-word abstract to a journal that asked for 200.

Replies from: Psy-Kosh
comment by Psy-Kosh · 2009-04-02T18:48:58.134Z · LW(p) · GW(p)

What do you mean?

Do you mean something like "This is really short. I'm probably skipping a bunch of steps in my reasoning and need to spell it out more explicitly" or do you mean something else?

And the abstract thing is about upper limits, whereas I was talking about lower limits.

Replies from: thomblake
comment by thomblake · 2009-04-02T19:02:25.464Z · LW(p) · GW(p)

Do you mean something like "This is really short. I'm probably skipping a bunch of steps in my reasoning and need to spell it out more explicitly" or do you mean something else?

Kind of something else, though I think you get the idea.

For any paper, you can always explain more steps in your reasoning, define your terms better, or bring out more of your assumptions. One of the good heuristics for determining how much you need to do this is to consider your audience. However, when writing for a class for your professor, you don't really have this luxury (if you're writing on something noncontroversial, your argument could be written as a conclusion followed by 'you already know the rest'). Since part of the purpose of writing the paper for the class is to demonstrate whether you understand the material, you need to explain to the professor things he already knows. Paper length is a good rough heuristic to let you know just how much of that you need to do in the course of your argument.

comment by badger · 2009-03-25T03:23:06.562Z · LW(p) · GW(p)

Both, and other contexts as well. Large corporate environments often work under the constraint of "use only standard, safe software tools like Microsoft Word or Java". It's just that those sorts of constraints aren't internalized as frequently.

comment by Nick_Tarleton · 2009-03-25T02:31:50.351Z · LW(p) · GW(p)

Is Science Doesn't Trust Your Rationality along the lines of what you're thinking?

Replies from: badger, Cameron_Taylor
comment by badger · 2009-03-25T03:25:27.462Z · LW(p) · GW(p)

Yes, thanks for reminding me of that post. Science (as an institution) is a very good example of a constraint that has its purposes, but sometimes has to be overridden.

comment by Cameron_Taylor · 2009-03-25T03:29:27.688Z · LW(p) · GW(p)

Interesting article. One part confused me:

Libertarianism secretly relies on most individuals being prosocial enough to tip at a restaurant they won't ever visit again.

Why is this the case? In Australia tipping isn't particularly a norm that we adhere to. What is the point that it is trying to make? How prosocial do I have to be to tip at restaurants?

Replies from: badger
comment by badger · 2009-03-25T03:37:01.929Z · LW(p) · GW(p)

I think it's intended as a sufficient, but not necessary condition. If a culture can maintain a voluntary, prosocial norm like tipping, it will do well under libertarianism.

comment by billswift · 2009-03-25T15:19:14.939Z · LW(p) · GW(p)

Problems in the real world have much more detailed context, which can be used to "guess" an answer that can then be tested rationally. Even if it turns out the answer isn't adequate as it stands, and it often isn't, it usually provides an "advanced base" from which to find a better answer.

Also, real-world problems rarely have an accessible "best" answer; you need to find one that is good enough, and you usually need to define "good enough" on the fly as well.