Rationality Quotes May 2014

post by elharo · 2014-05-01T09:45:45.166Z · LW · GW · Legacy · 299 comments

Another month has passed and here is a new rationality quotes thread. The usual rules are:

  • Please post all quotes separately, so that they can be upvoted or downvoted separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
  • Do not quote yourself.
  • Do not quote from Less Wrong itself, HPMoR, Eliezer Yudkowsky, or Robin Hanson. If you'd like to revive an old quote from one of those sources, please do so here.
  • No more than 5 quotes per person per monthly thread, please.
  • Provide sufficient information (URL, title, date, page number, etc.) to enable a reader to find the place where you read the quote, or its original source if available. Do not quote with only a name.


Comments sorted by top scores.

comment by johnlawrenceaspden · 2014-05-03T15:17:45.439Z · LW(p) · GW(p)

When another asserted something that I thought an error, I deny'd myself the pleasure of contradicting him abruptly, and of showing immediately some absurdity in his proposition; and in answering I began by observing that in certain cases or circumstances his opinion would be right, but in the present case there appear'd or seem'd to me some difference, etc.

I soon found the advantage of this change in my manner; the conversations I engag'd in went on more pleasantly. The modest way in which I propos'd my opinions procur'd them a readier reception and less contradiction; I had less mortification when I was found to be in the wrong, and I more easily prevail'd with others to give up their mistakes and join with me when I happened to be in the right.

Benjamin Franklin

Replies from: Grif, Torello
comment by Grif · 2014-05-06T13:58:33.856Z · LW(p) · GW(p)

Unfortunately this self-debasing style of contradiction has become the norm, and the people I talk to can instantly notice when I am pouring sugar on top of a serving of their own ass. Perhaps they are simply noticing changes in my tone of voice or body language, but in sufficiently intellectual partners I've noticed that abruptly contradicting them startles them into thinking more often, though I avoid this in everyday conversation with non-intellectuals for fear of increasing resentment.

comment by Torello · 2014-05-03T19:44:50.434Z · LW(p) · GW(p)

I would love to hear what Richard Dawkins would say in reply to this quote.

Personally, I think it's great advice--challenging people immediately and directly is often not a good long-term strategy.

Replies from: SaidAchmiz
comment by Said Achmiz (SaidAchmiz) · 2014-05-04T00:14:30.848Z · LW(p) · GW(p)

Dawkins, in arguments with theists, homeopaths, etc., is not trying to convince his interlocutors; nor are most of the other well-known atheist public figures. The aim is to convince bystanders — the private atheist who is unsure whether to "come out", the theist who's all but lost his faith but isn't sure whether atheism is a position one may take publicly, the person who's lukewarm on religious arguments but has always had a rather benign and respectful view of religion, etc.

In private conversations with someone whose opinions are of concern to you, Franklin's advice makes sense. The public arguments of Dawkins & Co. are more akin to performances than conversations. I think he achieves his aim admirably. I, for one, have little interest in watching people get on a public stage and have exchanges laden with "in certain cases or circumstances..." and other such mealy-mouthed nonsense.

Replies from: Jiro
comment by Jiro · 2014-05-04T02:20:07.738Z · LW(p) · GW(p)

I don't know of nontrivial cases and circumstances where homeopaths are right about homeopathy (and where their statements are taken as normally understood).

Replies from: Torello, SaidAchmiz
comment by Torello · 2014-05-04T04:49:33.232Z · LW(p) · GW(p)

We could imagine cases where people underwent homeopathic treatments and saw improvements in their symptoms for other reasons. For example, colds usually stick around for 3-4 days and dissipate without treatment, so you take a homeopathic medicine, two days later your cold vanishes, and you think "It worked." The correlation-causation error might seem obvious to skeptics, but it isn't to homeopathy believers.

As I interpret the Franklin quote, you provisionally accept (don't immediately and explicitly challenge) the claim that the homeopathic medicine made the cold go away, so you can establish a further dialogue with some chance (let's just say 10%) of causing doubt in the other person. If you immediately say "There is no way that the homeopathic medicine had any effect," the person will get angry at you. You'll probably have a smaller chance of changing their mind, and they won't like you, which generally doesn't help you accomplish goals.

With Franklin's approach, I think it doesn't even matter that there are no merits to a homeopath's treatments (or insert whichever group); you need to cede some ground to keep negotiations open and to get people to like you, because it's helpful later.

Replies from: roystgnr
comment by roystgnr · 2014-05-05T17:02:04.355Z · LW(p) · GW(p)

We could even imagine cases where people underwent homeopathic treatments and saw improvements in their symptoms for that reason. The placebo effect is often a real thing, and is most effective when you don't believe what you're taking is a placebo.

If it were possible to keep homeopathy from being inexplicably muddled up with non-evidence-based naturopathy (where your treatment may have negative side effects), unfortunately mixed up with anti-"allopathy" (where you forgo a more medically-effective treatment), or inescapably tied to anti-epistemology in general, it might even be a net good on its own.

Replies from: roystgnr
comment by roystgnr · 2014-05-06T16:52:22.465Z · LW(p) · GW(p)

If anyone has found that the placebo effect isn't real, making scientific history by publishing your discovery might be of higher utility than downvoting my outdated information.

Replies from: Lumifer
comment by Lumifer · 2014-05-06T17:10:11.468Z · LW(p) · GW(p)

If anyone has found that the placebo effect isn't real

The placebo effect is complicated. See e.g. this.

Replies from: roystgnr
comment by roystgnr · 2014-05-07T15:43:32.088Z · LW(p) · GW(p)

True. But am I just being biased when I interpret that as support for my claim? "Sham acupuncture" and even placebo pills given to people who are told they're taking placebos both show significant positive effects. I'd be very surprised if placebo pills given to people who are told they're taking real "homeopathic" medicine didn't show real effects too.

Replies from: Lumifer
comment by Lumifer · 2014-05-07T16:23:55.857Z · LW(p) · GW(p)

But am I just being biased when I interpret that as support for my claim?

What is your claim, precisely?

Sure, giving homeopathic pills to people is likely to make them feel better via placebo. But by the same reasoning, this will also work for voodoo rituals, holy water, and mind rays from outer space.

comment by Said Achmiz (SaidAchmiz) · 2014-05-04T04:33:12.656Z · LW(p) · GW(p)

I'm not sure I know what point you meant to make by this.

I read Franklin's advice as applying, and intending to be applied, quite readily in those cases where one's interlocutor is totally and clearly wrong. The idea is that you take a certain roundabout approach to telling them that they're wrong, without quite coming out and saying it straight out. The fact that they are wrong need not be in question; it's merely a matter of which tactics are effective in convincing them. (The assumption, of course, is that you're interested in convincing them.)

In any case, I am unsure in what sense your comment is a response to what I said... could you clarify?

Replies from: Jiro
comment by Jiro · 2014-05-05T19:42:57.137Z · LW(p) · GW(p)

The way I read Franklin's quote is that if someone says "well, (factual statement X) is true, and from it I draw (unwarranted conclusion Y)", we should claim to agree with him (because we agree with X) and act as though drawing conclusion Y is a minor flaw in his theory that doesn't negate the fact that he's basically correct.

But he's not basically correct. He did invoke X, and X is true, but to say that he's right, or even partially right, means he's right about a substantial part of his argument, not that he's based it on at least one statement that is true. A homeopath doesn't become partly right just because he says "well, vaccines work by using a tiny amount of something to protect against it, so perhaps homeopathy can also use a tiny amount of a substance to protect against it", even if the statement about vaccines is literally correct.

Replies from: dthunt
comment by dthunt · 2014-05-05T20:54:36.823Z · LW(p) · GW(p)

What do you think of the following?

'If the data is good, but the argument is not, argue the argument (e.g. by showing that it doesn't hold water). Don't argue about the conclusion and point to the bad argument as evidence.' (not a rationality quote, just curious about your reaction)

Replies from: Jiro
comment by Jiro · 2014-05-06T14:12:55.060Z · LW(p) · GW(p)

I think that is not what Franklin was saying.

comment by B_For_Bandana · 2014-05-03T02:28:58.290Z · LW(p) · GW(p)

One afternoon a student said "Roshi, I don't really understand what's going on. I mean, we sit in zazen and we gassho to each other and everything, and Felicia got enlightened when the bottom fell out of her water-bucket, and Todd got enlightened when you popped him one with your staff, and people work on koans and get enlightened, but I've been doing this for two years now, and the koans don't make any sense, and I don't feel enlightened at all! Can you just tell me what's going on?"

"Well you see," Roshi replied, "for most people, and especially for most educated people like you and I, what we perceive and experience is heavily mediated, through language and concepts that are deeply ingrained in our ways of thinking and feeling. Our objective here is to induce in ourselves and in each other a psychological state that involves the unmediated experience of the world, because we believe that that state has certain desirable properties. It's impossible in general to reach that state through any particular form or method, since forms and methods are themselves examples of the mediators that we are trying to avoid. So we employ a variety of ad hoc means, some linguistic like koans and some non-linguistic like zazen, in hopes that for any given student one or more of our methods will, in whatever way, engender the condition of non-mediated experience that is our goal. And since even thinking in terms of mediators and goals tends to reinforce our undesirable dependency on concepts, we actively discourage exactly this kind of analytical discourse."

And the student was enlightened.

Replies from: satt
comment by satt · 2014-05-04T11:50:31.047Z · LW(p) · GW(p)

I don't think there's such a thing as "unmediated experience of the world".

(I like the quotation a lot for giving a plausible, lucid reason why Zen might spurn the usual sort of analytical discourse. But it's so clear an explanation of an idea that I think it's revealed a basic problem with the idea, namely that it points towards a non-existent goal.)

Replies from: NancyLebovitz, David_Gerard, TheAncientGeek, ChristianKl, TheAncientGeek
comment by NancyLebovitz · 2014-05-04T13:33:56.441Z · LW(p) · GW(p)

There is such a thing as a less mediated experience of the world.

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2014-05-09T06:30:33.315Z · LW(p) · GW(p)

Can you give some examples of more and less mediated experiences?

Replies from: NancyLebovitz, Armok_GoB
comment by NancyLebovitz · 2014-05-09T15:08:14.030Z · LW(p) · GW(p)

That's an interesting question-- "mediated" should probably be modified by "of what?" and "by what?".

It's definitely possible for perceptions to become less mediated by focusing on small details so that prototypes aren't dominant. It's possible to become a lot more perceptive about color, and Drawing on the Right Side of the Brain is about seeing angles, lengths, shading, curves, etc. rather than objects and thus being able to draw accurately.

If you get some distance on your emotions through meditation and/or CBT, is your experience of your emotions less mediated? More mediated? Wrong questions? I think meditators assume that the calm you achieve is already there-- you just weren't noticing it until you meditated enough, so your emotions are more mediated and your calm is less mediated, but now that I've put it into words, I'm not sure what you would use for evidence that the calm was always there rather than created by meditation.

Thank you for the evidence that it's possible to get 12 karma points for something that doesn't exactly make sense.

comment by Armok_GoB · 2014-05-09T14:35:06.711Z · LW(p) · GW(p)

Reasoning inductively rather than deductively, over uncompressed data rather than summaries.

Mediated: "The numbers between 3 and 7" Unmediated: "||| |||| ||||| |||||| |||||||"

comment by David_Gerard · 2014-05-06T12:50:11.172Z · LW(p) · GW(p)

It's like neutrality on Wikipedia. You'll never attain neutrality, but there is such a thing as less and more, and you want to head in the "more" direction.

Replies from: satt
comment by satt · 2014-05-07T03:29:15.633Z · LW(p) · GW(p)

I think I see what you mean; if I mentally substitute "is closer to an" for "involves the", and "that state would have" for "that state has", the practice the quotation describes makes more sense to me. (I'm leery of the idea that it's better to head in the direction of less mediation — taking off my glasses doesn't give me a clearer view of the world — but that's a different objection.)

Replies from: Aleksander
comment by Aleksander · 2014-05-07T20:46:09.967Z · LW(p) · GW(p)

So while the original quotation talked about not thinking at all, your revised version urges that we think as little as possible. How does it qualify as a "rationality quote"?

Replies from: TheAncientGeek, satt
comment by TheAncientGeek · 2014-05-07T21:19:02.100Z · LW(p) · GW(p)

It can be rationally beneficial to realise how much mediation is involved in perception, in the same way it is useful to replace naive realism with scientific realism.

Relatively unmediated perception is also aesthetically interesting, and therefore of terminal value to many.

comment by satt · 2014-05-09T01:49:15.014Z · LW(p) · GW(p)

How does it qualify as a "rationality quote"?

You tell me; I have to squint pretty hard to make it read as telling me something useful about rationality.

comment by TheAncientGeek · 2014-05-04T14:22:24.503Z · LW(p) · GW(p)

Because? People who claim it are lying? You don't have it, and your mind is typical?

Replies from: army1987, satt, Aleksander
comment by A1987dM (army1987) · 2014-05-04T18:58:57.633Z · LW(p) · GW(p)

Or maybe they and satt mean different things by “unmediated”.

comment by satt · 2014-05-04T23:23:18.398Z · LW(p) · GW(p)

Because causal mechanisms to relay information from the world to one's brain are a necessary prerequisite for "experience of the world", so one's "experience of the world" is always mediated by those causal mechanisms.

Replies from: TheAncientGeek
comment by TheAncientGeek · 2014-05-05T10:06:52.810Z · LW(p) · GW(p)

And it's not possible for just the cognitive mechanisms to shut down, and leave the perceptual ones?

Replies from: Viliam_Bur, satt
comment by Viliam_Bur · 2014-05-06T09:06:29.096Z · LW(p) · GW(p)

If you shut down the cognitive mechanisms completely, would you even remember what you have perceived? Or even that you have perceived something?

Replies from: TheAncientGeek
comment by TheAncientGeek · 2014-05-06T09:42:19.660Z · LW(p) · GW(p)

Maybe not. That matches some reports of nonordinary experience.

comment by satt · 2014-05-07T02:49:46.646Z · LW(p) · GW(p)

I doubt it's possible. I'm sceptical that one can cleanly sort every experience-related bodily mechanism into a "cognitive" category xor a "perceptual" category. Intuitively, for example, I might think of my eyes as perceptual, and the parts of my brain that process visual signals as cognitive, but if all of those bits of my brain were cut out, I'd expect to see nothing at all, not an "unmediated" view of the world — which implies my brain is perceptual as well as cognitive. So I expect the idea of just shutting down the cognitive mechanisms and leaving the perceptual mechanisms intact is incoherent.

(Often there're also external physical mechanisms which are further mediators. You can't see an object without light going from the object to your eye, and you can't hear something without a medium between the source and your ear.)

Replies from: TheAncientGeek
comment by TheAncientGeek · 2014-05-07T10:52:00.555Z · LW(p) · GW(p)

So are people who claim unmediated experience lying?

Replies from: Viliam_Bur, satt
comment by Viliam_Bur · 2014-05-08T07:46:19.951Z · LW(p) · GW(p)

Or using a different definition of "unmediated", or confused about their experience, or...

comment by satt · 2014-05-08T21:18:22.366Z · LW(p) · GW(p)

My best guess is that the vast majority of them are sincere. Being correct vs. being a liar is a false dichotomy.

Replies from: TheAncientGeek
comment by TheAncientGeek · 2014-05-09T12:18:36.489Z · LW(p) · GW(p)

So are they sincerely mistaken about that they think unmediated experience is, or about what you think it is?

Replies from: satt
comment by satt · 2014-05-10T17:04:28.969Z · LW(p) · GW(p)

(Presumably your first "that" is meant to be a "what"?) That question implies a false dichotomy too. The mistaken people might not be mistaken about what anyone thinks unmediated experience is; perhaps everyone pretty much agrees on what it is, and the mistaken people are simply misremembering or misinterpreting their own experiences.

This conversation might be more productive if you switch from Socratic questioning to simply presenting a reasonable definition of "unmediated experience" according to which unmediated experience exists. After all, your true objection seems to be that I'm using a bad definition.

Replies from: TheAncientGeek
comment by TheAncientGeek · 2014-05-11T11:47:39.541Z · LW(p) · GW(p)

Anybody can be wrong about anything. That isn't an interesting observation, because it is general. Earlier you gave a specific reason, which you think is empirical, and I think is partly conceptual.

comment by Aleksander · 2014-05-07T16:39:02.012Z · LW(p) · GW(p)

There are also people who claim that they feel God's presence in their heart, you know.

Replies from: Nornagest, TheAncientGeek
comment by Nornagest · 2014-05-07T17:12:09.992Z · LW(p) · GW(p)

I believe them. I don't believe in God, but I do believe that it's possible to have the subjective experience of a divine presence -- there's too much agreement on the broad strokes of how one feels, across cultures and religions, for it to be otherwise. Though on the other hand, some of the more specific takes on it might be bullshit, and basic cynicism suggests that some of the people talking about feeling God's presence are lying.

Seems reasonable to extend the same level of credulity to claims about enlightenment experiences. That's not to say that Buddhism is necessarily right about how they hash out in terms of mental/spiritual benefits, or in terms of what they actually mean cognitively, of course.

Replies from: Aleksander
comment by Aleksander · 2014-05-07T23:27:42.660Z · LW(p) · GW(p)

I don't disagree with any of that. Who knows, could be even one and the same experience which people raised in one culture interpret as God's presence, and in another as enlightenment.

Replies from: Desrtopa
comment by Desrtopa · 2014-05-13T18:09:07.656Z · LW(p) · GW(p)

The research summarized in this book seems to suggest that this is indeed the case.

comment by TheAncientGeek · 2014-05-07T18:35:19.107Z · LW(p) · GW(p)

And people who claim to see cold fusion and canals on Mars.

There is a happy medium between treating empirical evidence as infallible, and dismissing it as not conforming to your favourite theory.

comment by ChristianKl · 2014-05-08T22:40:18.309Z · LW(p) · GW(p)

Words are used to point to places. The thing that comes to your mind when you hear the words "unmediated experience of the world" might not exist. That doesn't mean that there aren't people who use that phrase to point to something real.

Replies from: None
comment by [deleted] · 2014-05-09T00:41:05.612Z · LW(p) · GW(p)

Couldn't you say exactly that to anyone who doubts the existence of anything?

Replies from: Richard_Kennaway, ChristianKl
comment by Richard_Kennaway · 2014-05-09T06:26:10.179Z · LW(p) · GW(p)

Couldn't you say exactly that to anyone who doubts the existence of anything?

You could. And the way to resolve a dispute over the existence of, say, unicorns, would be to determine what is being meant by the word, in terms of what observations their existence implies that you will be more likely to see. Then you can go and make those observations.

The problem with talk of mental phenomena like "unmediated perception" is that it is difficult to do this, because the words are pointing into the mind of the person using them, which no-one else can see. Or worse, the person isn't pointing anywhere, but repeating something someone else has said, without having had personal experience. How can you tell whether a disagreement is due to the words being used differently, the minds being actually different, or the words and the minds being much the same but the people having differing awareness of their respective minds?

This is a problem I have with pretty much everything I have read about meditation. I can follow the external instructions about sitting, but if I cannot match up the description of the results to be supposedly obtained with my experience, there isn't anywhere to go with that.

comment by ChristianKl · 2014-05-09T11:24:07.107Z · LW(p) · GW(p)

Couldn't you say exactly that to anyone who doubts the existence of anything?

The assumptions in that sentence are interesting. It presupposes that a debate is an interaction where you compete against the other person by proving them wrong. I would rather offer a friendly way to improve understanding. Whether or not the other person accepts it is their choice.

In cases like this it's very useful to think about what people mean with words and not go with your first impression of what they might mean.

Replies from: None
comment by [deleted] · 2014-05-09T13:13:54.414Z · LW(p) · GW(p)

It presupposes that a debate is an interaction where you compete against the other person by proving them wrong.

I don't think so. I just meant to point out that what you said was a triviality. If you intended it as a protreptic triviality, that's fine, I have no objection and that's justification enough for me.

Replies from: ChristianKl
comment by ChristianKl · 2014-05-09T13:19:01.333Z · LW(p) · GW(p)

I don't think so. I just meant to point out that what you said was a triviality.

Could you define what you mean with "triviality"?

Replies from: None
comment by [deleted] · 2014-05-09T14:12:08.797Z · LW(p) · GW(p)

I mean something which follows from anything. I don't intend it as a term of disapprobation: trivialities are often good ways of expressing a thought, if not literally what was said. If you intended this: "In cases like this it's very useful to think about what people mean with words and not go with your first impression of what they might mean" then I agree with you, and with the need to say it. I just missed your point the first time around (and if you were to ask me, you put the point much better when you explained it to me).

Replies from: ChristianKl
comment by ChristianKl · 2014-05-09T16:38:01.405Z · LW(p) · GW(p)

Yes, that's roughly what I mean. However, there might be no way for you to know what they mean if you lack certain experiences.

If a New Agey person speaks about how the observer effect in quantum physics means X, his problem is that he doesn't have any idea what "observer" means for a physicist. Actually getting the person to understand what "observer" means to a physicist isn't something that you accomplish in an hour if the person has a total lack of physics background.

The same is true in reverse. It's not straightforward for the physicist to understand what the New Agey person means. Understanding people with a very different mindset than you is hard.

Replies from: None
comment by [deleted] · 2014-05-09T17:06:20.331Z · LW(p) · GW(p)

You seem to be saying two things here:

Actually getting the person to understand what "observer" means to a physicist isn't something that you accomplish in an hour if the person has a total lack of physics background....It's not straightforward for the physicist to understand what the New Agey person means.

This entails that it is possible to simply explain what you mean, even across very large inferential gaps.

However there might be no way for you to know what they mean if you lack certain experiences.

Yet here you seem to entertain the idea that it's sometimes impossible to explain what you mean, because a certain special experience is necessary.

I endorse the first of these two points, and I'm extremely skeptical about the second. It also seems to me that physicists tend to hold to the first, and new agers tend to hold to the second, and that this constitutes much of the difference in their epistemic virtue.

Replies from: ChristianKl
comment by ChristianKl · 2014-05-10T12:29:41.993Z · LW(p) · GW(p)

Yet here you seem to entertain the idea that it's sometimes impossible to explain what you mean

I said impossible in an hour, not impossible in general. It simply might take a few years. There's a scene in Neuromancer where, at the end, one protagonist asks the AI why another acted the way they did. The first answer is: it's unexplainable. Then the answer is that it's not really unexplainable, but would take 37 years to explain. (My memory of the exact number might not be accurate.)

On the other hand, teaching new phenomenological primitives is extremely hard. It takes more than an hour to teach a child that objects don't fall because they are heavy but because of gravity. Yes, you might get some token agreement, but when you ask questions the person still thinks that a heavy object ought to fall faster than a light one because they haven't really understood the concept on a deep level. In physics education this is called teaching phenomenological primitives.

This entails that it is possible to simply explain what you mean, even across very large inferential gaps.

You can't explain to a blind man what red looks like. There are discussions that are about qualia.

Replies from: Lumifer, Richard_Kennaway
comment by Lumifer · 2014-05-10T19:53:22.597Z · LW(p) · GW(p)

but when you ask questions the person still thinks that a heavy object ought to fall faster than a light one because they haven't really understood the concept on a deep level.

No, they think that a heavy object ought to fall faster than a light one because that's how it actually works for most familiar objects falling through air.

If you've just been telling without demonstrating, this is pure reliance on authority.

Replies from: Vladimir_Nesov, ChristianKl
comment by Vladimir_Nesov · 2014-05-10T20:35:56.296Z · LW(p) · GW(p)

If you've just been telling without demonstrating, this is pure reliance on authority.

(Or taking a hypothetical seriously.)

An important factor is just understanding the details of how everything supposedly fits together. Even if you don't know from observation that it's the way things work in our world, there is evidence in seeing a coherent theory, as opposed to contradictory lies and confusion. Inventing a robust description of a different world is hard; more likely it's just the truth about ours.

comment by ChristianKl · 2014-05-10T20:55:54.746Z · LW(p) · GW(p)

No, they think that a heavy object ought to fall faster than a light one because that's how it actually works for most familiar objects falling through air.

Empty water bottles don't exactly fall faster than full water bottles.

But my point isn't about whether you rely on authority or not, but about how people actually make decisions. There's literature on phenomenological primitives in physics.

The one time we tested the theory of gravity experimentally in school I did not get the numbers that the Newtonian formula predicted. At the same time I don't think those formulas are wrong. I believe them because smart people tell me that they are true and I don't care enough about physics to investigate the matter further.

Replies from: Lumifer
comment by Lumifer · 2014-05-10T22:50:36.540Z · LW(p) · GW(p)

Empty water bottles don't exactly fall faster than full water bottles.

Through air full water bottles do fall faster than empty ones.

The one time we tested the theory of gravity experimentally in school I did not get the numbers that the Newtonian formula predicted. At the same time I don't think those formulas are wrong. I believe them because smart people tell me that they are true and I don't care enough about physics to investigate the matter further.

LOL. "Who are you going to believe, me or your lying eyes?"

Replies from: ChristianKl
comment by ChristianKl · 2014-05-11T12:40:48.035Z · LW(p) · GW(p)

Through air full water bottles do fall faster than empty ones.

A bit maybe but I think they should have roughly the same speed. How much faster do you think they would fall?

LOL. "Who are you going to believe, me or your lying eyes?"

Sometimes you have to make hard choices...

There was a time when I thought it was about picking sides and being for empiricism or against it. I'm well past that point. There are times when believing the authority is simply the right choice.

Replies from: V_V, Lumifer
comment by V_V · 2014-05-11T14:43:01.598Z · LW(p) · GW(p)

A bit maybe but I think they should have roughly the same speed. How much faster do you think they would fall?

If the fall is sufficiently long, they reach different terminal velocities, which are proportional to the square root of their masses.
According to Teh Interwebz, an average 0.5 litre empty plastic bottle weighs about 13 g. A full bottle weighs 513 g. Therefore, at terminal velocity it falls about 6.3 times faster.
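
For readers who want the arithmetic spelled out, here is the standard quadratic-drag calculation behind that 6.3 figure (a sketch using the bottle masses quoted above; it assumes both bottles present the same projected area and drag coefficient):

\[
v_t = \sqrt{\frac{2 m g}{\rho A C_d}},
\qquad
\frac{v_{t,\mathrm{full}}}{v_{t,\mathrm{empty}}}
= \sqrt{\frac{m_\mathrm{full}}{m_\mathrm{empty}}}
= \sqrt{\frac{513\ \mathrm{g}}{13\ \mathrm{g}}}
\approx 6.3.
\]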

Replies from: ChristianKl
comment by ChristianKl · 2014-05-11T15:21:57.105Z · LW(p) · GW(p)

If the fall is sufficiently long, they reach different terminal velocities, which are proportional to the square root of their masses.

What does sufficiently long mean in practice?

Replies from: V_V
comment by V_V · 2014-05-11T20:51:34.335Z · LW(p) · GW(p)

It depends on the drag coefficient and forward projected surface area of the bottle. My mildly informed guess is that it would take between 20 and 30 seconds.

EDIT:

Actually, I've just tried dropping 1.5 litre bottles from a height of about 1.8 m. Even if the fall lasts perhaps one second, the empty bottle starts to tumble much more than the full one, and hits the ground noticeably later.
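
A minimal numerical sketch of the same comparison (my own illustration, not V_V's calculation; the empty mass, diameter, and drag coefficient below are rough assumptions for a 1.5 litre bottle):

```python
import math

G = 9.81                      # gravitational acceleration, m/s^2
RHO_AIR = 1.2                 # air density, kg/m^3
AREA = math.pi * 0.045 ** 2   # projected area of a ~9 cm diameter bottle, m^2 (assumption)
CD = 0.8                      # drag coefficient of a tumbling bottle (rough assumption)

def terminal_velocity(mass_kg):
    """Speed at which quadratic drag balances gravity."""
    return math.sqrt(2 * mass_kg * G / (RHO_AIR * AREA * CD))

def fall_time(mass_kg, height_m, dt=1e-4):
    """Euler-integrate a fall with quadratic drag; return (time, impact speed)."""
    k = 0.5 * RHO_AIR * AREA * CD / mass_kg   # drag deceleration per unit v^2
    v, y, t = 0.0, height_m, 0.0
    while y > 0.0:
        v += (G - k * v * v) * dt
        y -= v * dt
        t += dt
    return t, v

for label, mass in [("empty (~30 g)", 0.03), ("full (~1.53 kg)", 1.53)]:
    t, v = fall_time(mass, 1.8)
    print(f"{label}: 1.8 m drop takes {t:.2f} s, hits at {v:.1f} m/s, "
          f"terminal velocity {terminal_velocity(mass):.0f} m/s")
```

Under these assumed numbers the drag-induced difference over a 1.8 m drop is only a few hundredths of a second, consistent with the visible lag of the empty bottle coming mostly from tumbling rather than from straight-line drag.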

comment by Lumifer · 2014-05-12T02:29:29.035Z · LW(p) · GW(p)

There are times when believing the authority is simply the right choice.

In epistemic matters? I don't think so.

Replies from: ChristianKl
comment by ChristianKl · 2014-05-27T07:35:57.315Z · LW(p) · GW(p)

Information isn't free, and there are many cases where gathering more information is too expensive and you have to go with the best authority that's available.

On the other hand it's worthwhile to be conscious of the decision that one makes in that regard. Most people follow authorities for all the wrong reasons.

comment by Richard_Kennaway · 2014-05-10T16:23:54.668Z · LW(p) · GW(p)

It takes more than an hour to teach a child that objects don't fall because they are heavy but because of gravity.

"Because of gravity" isn't any better an explanation than "because they are heavy". Why does "gravity" accelerate all masses the same? Really thinking about that leads to general relativity, so it actually takes many years to explain why things fall, and it can't be done without going through calculus, topology, and differential geometry.

Cf. Feynman on explanations (07:10–09:05).
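
To make the puzzle concrete, here is the Newtonian version of it (a standard textbook step, not something from the linked Feynman clip): the mass cancels out of the acceleration only because gravitational and inertial mass happen to be equal,

\[
a = \frac{F}{m_\mathrm{inertial}}
  = \frac{1}{m_\mathrm{inertial}} \cdot \frac{G M m_\mathrm{grav}}{r^2}
  = \frac{G M}{r^2}
  \quad \text{(given } m_\mathrm{grav} = m_\mathrm{inertial}\text{)},
\]

and explaining why those two masses should be equal is exactly where general relativity comes in.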

Replies from: ChristianKl
comment by ChristianKl · 2014-05-10T18:14:09.270Z · LW(p) · GW(p)

Just being able to recite "because of gravity" is not enough for many purposes. I myself did well in physics at school and finished best in class in it but I haven't studied any physics since then and I'm well aware that I don't understand advanced physics.

"Because of gravity" isn't any better an explanation than "because they are heavy".

It's not perfect but it is better. Airplanes fly well based on Newtonian physics.

comment by TheAncientGeek · 2014-05-06T13:23:45.863Z · LW(p) · GW(p)

You can construe the goal as nonexistent, but that is an uncharitable reading.

Replies from: satt
comment by satt · 2014-05-07T02:51:03.195Z · LW(p) · GW(p)

Whether the goal exists is an empirical question, no...? I don't understand where (a lack of) charity enters into it.

Replies from: TheAncientGeek
comment by TheAncientGeek · 2014-05-07T10:45:54.687Z · LW(p) · GW(p)

The principle of charity relates to what people mean by what they say. Unmitigated experience might be empirically nonexistent under one interpretation of unmediated but not under another. If someone claims to have had unmediated experience, that is evidence relating to what they mean by their words.

Replies from: satt, TheAncientGeek
comment by satt · 2014-05-08T21:20:48.095Z · LW(p) · GW(p)

Unmitigated [sic] experience might be empirically nonexistent under one interpretation of unmediated but not under another.

I see. What more charitable interpretation of "unmediated experience" would you prefer?

comment by TheAncientGeek · 2014-05-07T10:51:01.867Z · LW(p) · GW(p)

Maybe the PoC would be an easier sell if it were phrased in terms of the "typical semantics fallacy".

comment by Cyan · 2014-05-05T04:06:21.490Z · LW(p) · GW(p)

Bruno de Finetti heard of [the author's empirical Bayes method for grading tests] and he wrote to me suggesting that the student should be encouraged to state their probability for each of the possible choices. The appropriate score should be a simple function of the probability distribution and the correct answer. An appropriate function would encourage students to reply with their actual distribution rather than attempt to bluff. I responded that it would be difficult to get third graders to list probabilities. He answered that we should give the students five gold stars and let them distribute the stars among the possible answers.

- Herman Chernoff (pg 34 of Past, Present, and Future of Statistical Science, available here)
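
One standard example of the kind of "appropriate function" de Finetti had in mind is a proper scoring rule such as the quadratic (Brier-type) score; the sketch below is illustrative and not taken from Chernoff's account:

```python
def quadratic_score(probs, correct_index):
    """Quadratic (Brier-type) score for a reported distribution over the choices.

    Higher is better. Its expected value is maximized by reporting your honest
    probabilities, so piling everything on your best guess when you are unsure
    does not pay on average.
    """
    return 2 * probs[correct_index] - sum(p * p for p in probs)

def expected_score(reported, believed):
    """Average score a student expects if `believed` are their true probabilities."""
    return sum(q * quadratic_score(reported, i) for i, q in enumerate(believed))

# A student who thinks answer A is 60% likely and answer B is 40% likely:
honest = [0.6, 0.4, 0.0, 0.0]
bluff  = [1.0, 0.0, 0.0, 0.0]

print(expected_score(honest, honest))  # 0.52 -- reporting honestly
print(expected_score(bluff, honest))   # 0.20 -- bluffing full confidence on A
```

De Finetti's five gold stars amount to reporting such a distribution in increments of 0.2 (three stars on A and two on B for the student above).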

Replies from: Mestroyer
comment by Mestroyer · 2014-05-05T08:52:19.654Z · LW(p) · GW(p)

Actually, if you do this with something besides a test, this sounds like a really good way to teach a third-grader probabilities.

comment by roystgnr · 2014-05-02T15:35:24.092Z · LW(p) · GW(p)

PLAYBOY: So the experiment didn’t work?

[Craig] FERGUSON: No, the experiment always works. There’s no such thing as an experiment that doesn’t work. There are only results, but results may vary. Here’s what I learned:

Replies from: AndHisHorse, satt, anandjeyahar, DanielLC
comment by AndHisHorse · 2014-05-03T07:57:36.291Z · LW(p) · GW(p)

Experiments can fail if they are executed or planned improperly. If both the control and the experimental group are given sugar pills, for example, or the equipment fails in a shower of sparks, the experiment has provided no evidence by which one can update. It is a small quibble, and probably not what the quote meant to illustrate (I'm guessing that the experiment provided evidence which downgraded the probability of the hypothesis), but something to note nonetheless: experiments are not magic knowledge-providers.

Replies from: Vaniver
comment by Vaniver · 2014-05-03T20:49:18.993Z · LW(p) · GW(p)

Experiments can fail if they are executed or planned improperly. If both the control and the experimental group are given sugar pills, for example, or the equipment fails in a shower of sparks, the experiment has provided no evidence by which one can update.

I think Ferguson would call those "results," and from those you would have learned about performing experiments, not about the original hypothesis you were interested in.

Replies from: Desrtopa, wedrifid
comment by Desrtopa · 2014-05-07T05:43:02.793Z · LW(p) · GW(p)

If anything, I think a really failed experiment is one that makes you think you've learned something that is in fact wrong, which is the result of flaws in the experiment that you never become aware of.

comment by wedrifid · 2014-05-16T07:25:57.593Z · LW(p) · GW(p)

I think Ferguson would call those "results," and from those you would have learned about performing experiments, not about the original hypothesis you were interested in.

Ferguson's proposed new language is a downgrade. Being unable to identify something as a failure when the outcome sucks is fatalism and not particularly useful.

comment by satt · 2014-05-03T21:37:04.683Z · LW(p) · GW(p)

Systems built without requirements cannot fail; they merely offer surprises — usually unpleasant!

— Robert Morris, quoted in Brian Snow's "We Need Assurance!"

comment by anandjeyahar · 2014-05-03T05:39:47.023Z · LW(p) · GW(p)

I tend to disagree. I have done some things which I thought were experiments but did not come up with any clear conclusion after the experiment and analysis. On rewriting the thesis it turned out there were a lot more implicit assumptions inside the hypothesis that I was not aware of. I think it was a badly designed experiment and it was rather unproductive in retrospective analysis. I suppose one could argue that it brought to light the implicit assumptions and that was a useful result. Somehow (not sure how or why) I find that a low standard for considering something an experiment.

comment by DanielLC · 2014-05-15T21:53:21.750Z · LW(p) · GW(p)

An experiment is supposed to teach you the truth. If you run the experiment badly and, say, get a false positive, then the experiment failed.

comment by Tenoke · 2014-05-07T12:25:19.994Z · LW(p) · GW(p)

"Man is not going to wait passively for millions of years before evolution offers him a better brain."

--Corneliu E. Giurgea, the chemist who synthesized Piracetam and coined the term 'Nootropic'

comment by arundelo · 2014-05-03T18:27:58.219Z · LW(p) · GW(p)

Things like linear algebra, group theory, and probability have so many uses throughout science that learning them is like installing a firmware upgrade to your brain -- and even the math you don't use will stretch you in helpful ways.

-- Scott Aaronson

Replies from: ChristianKl
comment by ChristianKl · 2014-05-03T18:59:53.848Z · LW(p) · GW(p)

The same is true for a lot of intellectual concepts outside of math.

Replies from: David_Gerard, bramflakes
comment by David_Gerard · 2014-05-04T10:12:32.472Z · LW(p) · GW(p)

If only we could put together, say, a four-year college degree course intended to have this effect ...

Replies from: johnlawrenceaspden
comment by johnlawrenceaspden · 2014-05-05T18:02:17.966Z · LW(p) · GW(p)

I think that's a super idea. I'd like to design it and I'd like to take it. The ideas that underlie everything else. Like a whole university course devoted to A-level maths, but covering every simple underlying idea. We should start by trying to work out what the syllabus should be.

(one 16 lecture course on each topic, and we'll have three courses per term so that's 36 courses in total)

Off the top of my head we should have: groups, calculus, dimensional analysis, estimation, probability (inc bayes), relativity, quantum mechanics, electronics, programming, chemistry, evolution, evolutionary psychology, heuristics and biases, law, public speaking, creative writing, economics, logic, game theory, game-of-life, how-to-win-friends-and-influence people, history, cosmology, geography, atomic theory, molecular biology ...

All taught with immediate direct applications to actual things in the immediate environment and if you can't come up with simple examples that a child would find interesting and could understand then it doesn't make the cut.

Any more suggestions? If we get loads let's make a post on 'The ideal 4-year university course'.

Replies from: David_Gerard, Kaj_Sotala
comment by David_Gerard · 2014-05-06T08:10:00.747Z · LW(p) · GW(p)

The joke was that this is precisely what a liberal arts degree was meant to be; the main problem is that liberal arts degrees haven't kept up with the times.

comment by Kaj_Sotala · 2014-05-06T06:10:19.618Z · LW(p) · GW(p)

Here's a related post, though it doesn't have that many suggestions: http://lesswrong.com/lw/l7/the_simple_math_of_everything/

comment by bramflakes · 2014-05-03T21:11:42.991Z · LW(p) · GW(p)

What like?

Replies from: tristanhaze, Torello, None, ChristianKl
comment by tristanhaze · 2014-05-04T01:37:28.093Z · LW(p) · GW(p)

For my part, I've found the economic notions of opportunity cost and marginal utility to be like this.

Replies from: johnlawrenceaspden
comment by johnlawrenceaspden · 2014-05-05T17:48:33.798Z · LW(p) · GW(p)

That's maths too.

Replies from: Viliam_Bur, TobyBartels
comment by Viliam_Bur · 2014-05-06T09:10:39.694Z · LW(p) · GW(p)

The specific application of the math does add value.

Most obviously, for opportunity costs, on the math side you only have to understand the "minus" symbol, which pretty much everyone already does. With marginal utility you have to understand the "derivative", but you still have to apply it in a situation outside of math class.
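
A toy illustration of that application step (my example, not the commenter's): with utility \(U(x) = \log x\) over units of some good, the marginal utility of the next unit is

\[
U'(x) = \frac{1}{x},
\]

so the tenth unit is worth about a tenth of what the first one was. The calculus is just a derivative; the non-math step is noticing that "how much is one more worth to me?" is asking for that derivative.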

comment by TobyBartels · 2014-05-12T04:04:37.842Z · LW(p) · GW(p)

It's applied math, not the pure math that the OP was talking about. Furthermore, these can be useful ideas even when used purely qualitatively; then it's not even applied math (except in a sense that everything is math, if we make the math sufficiently imprecise).

comment by Torello · 2014-05-04T04:35:37.776Z · LW(p) · GW(p)

"Nothing in Biology Makes Sense Except in the Light of Evolution"

— Theodosius Dobzhansky

The fact that a theory that can be stated in ten words frames an entire discipline is quite incredible. Compared to group theory and probability, it sure seems like an easier uploading process as well.

Replies from: SaidAchmiz, ChristianKl
comment by Said Achmiz (SaidAchmiz) · 2014-05-04T05:12:28.268Z · LW(p) · GW(p)

What are the ten words or less in which evolution can be stated?

Replies from: Torello, Kawoomba, infinityGroupoid, BloodyShrimp
comment by Torello · 2014-05-04T14:28:28.527Z · LW(p) · GW(p)

"Multiply, vary, let the strongest live and the weakest die."

-Charles Darwin, The Origin of Species

Replies from: Desrtopa
comment by Desrtopa · 2014-05-07T05:35:22.794Z · LW(p) · GW(p)

I think that Darwin would himself acknowledge that "fittest" is a more accurate rendition than "strongest," but whether the quote can be rendered in this way without breaking the ten words constraint comes down to a question of whether "unfittest" counts as a legit word.

Replies from: NancyLebovitz, Nornagest
comment by NancyLebovitz · 2014-05-09T15:20:38.831Z · LW(p) · GW(p)

I think "fit" has become a free-floating standard rather than meaning "fitting into a particular environment".

comment by Nornagest · 2014-05-07T06:34:01.239Z · LW(p) · GW(p)

Maladapted, as an adjective? Though I suppose that's cheating a bit since it's a sense of adaptation that draws on an evolutionary metaphor.

comment by Kawoomba · 2014-05-04T07:43:44.088Z · LW(p) · GW(p)

warped by random change

what replicates stays around

always evolving

(More constraints! More constraints!)

change without motion

the lament of the red queen

coevolution

comment by infinityGroupoid · 2014-05-07T00:49:04.703Z · LW(p) · GW(p)

Natural Selection: the differential survival of replicators with heritable variation.

comment by BloodyShrimp · 2014-05-04T06:42:53.728Z · LW(p) · GW(p)

"We have what replicated better; noise permanently affects replicative ability"?

comment by ChristianKl · 2014-05-04T13:41:07.870Z · LW(p) · GW(p)

"Mathematics is about proving theorems based on axioms and other theorems" also frames a whole discipline.

A frame tells you something about a discipline, but it doesn't tell you everything.

comment by [deleted] · 2014-05-04T12:57:27.010Z · LW(p) · GW(p)

A good deal of the sequences seem to fall in this category. Conservation of expected evidence, for instance.

comment by ChristianKl · 2014-05-04T13:48:32.627Z · LW(p) · GW(p)

When it comes to general concepts, cybernetics is something to which a lot of people on LW don't have much exposure, and cybernetics is as central as knowing probability theory for understanding how the world works.

Basically any subject in which I invested a decent amount of thought produces lessons that are applicable to other topics. I even learned a lot in an activity like Salsa dancing that's useful in other contexts.

Replies from: army1987
comment by A1987dM (army1987) · 2014-05-04T18:58:14.695Z · LW(p) · GW(p)

When it comes to general concepts, cybernetics is something to which a lot of people on LW don't have much exposure, and cybernetics is as central as knowing probability theory for understanding how the world works.

What introductory material about it would you recommend?

Replies from: ChristianKl
comment by ChristianKl · 2014-05-04T20:33:21.127Z · LW(p) · GW(p)

Unfortunately I don't have a good recommendation. Formally I learned about it in a physiology lecture at university and the professor said that there isn't a good textbook that he could use to teach us.

While searching around I found An Introduction to Cybernetics by Ross Ashby. It might not be perfect but I think it's probably a good enough introduction.

comment by elharo · 2014-05-01T09:53:13.277Z · LW(p) · GW(p)

The brutal truth is that reality is indifferent to your difficulty in finding enough subjects. It’s like astronomy: To study things that are small and distant in the sky you need a huge telescope. If you only have access to a few subjects, you need to study bigger effects, and maybe that wouldn’t be such a bad thing.

-- Joseph P. Simmons, The Reformation: Can Social Scientists Save Themselves

Replies from: NancyLebovitz
comment by NancyLebovitz · 2014-05-27T14:48:35.763Z · LW(p) · GW(p)

Voted up for the linked article more than for the quote.

comment by Mestroyer · 2014-05-04T03:38:21.293Z · LW(p) · GW(p)

we're human beings with the blood of a million savage years on our hands. But we can stop it. We can admit that we're killers, but we're not going to kill Today.

Captain James Tiberius Kirk dodging an appeal to nature and the "what the hell" effect, to optimize for consequences instead of virtue.

Replies from: Cyan
comment by Cyan · 2014-05-05T20:25:52.829Z · LW(p) · GW(p)

That clip is a brilliant example of Shatner's much-mocked characteristic acting-speak.

comment by redlizard · 2014-05-15T02:58:04.627Z · LW(p) · GW(p)

Even with measurements in hand, old habits are hard to shake. It’s easy to fall in love with numbers that seem to agree with you. It’s just as easy to grope for reasons to write off numbers that violate your expectations. Those are both bad, common biases. Don’t just look for evidence to confirm your theory. Test for things your theory predicts should never happen. If the theory is correct, it should easily survive the evidential crossfire of positive and negative tests. If it’s not you’ll find out that much quicker. Being wrong efficiently is what science is all about.

-- Carlos Bueno, Mature Optimization, pg. 14. Emphasis mine.

comment by aarongertler · 2014-05-16T01:13:23.605Z · LW(p) · GW(p)

“I refuse to answer that question on the grounds that I don't know the answer.”

― Douglas Adams

Replies from: brazil84
comment by brazil84 · 2014-05-16T01:42:39.946Z · LW(p) · GW(p)

I like this quote, but it occurs to me that "I don't know" is often a reasonable answer to a question.

How about this:

"I refuse to answer that question on the grounds that I can't think of an answer which I am confident will not put me in a negative light."

Replies from: AndHisHorse
comment by AndHisHorse · 2014-05-16T03:27:01.608Z · LW(p) · GW(p)

That just seems like overly honest politicking to me.

Replies from: brazil84
comment by brazil84 · 2014-05-16T12:43:54.593Z · LW(p) · GW(p)

That just seems like overly honest politicking to me.

"All the world is a political campaign. And the men and women are merely politicians."

-- me right now

P.S. "overly honest" is kinda the point of the joke.

comment by satt · 2014-05-01T23:09:26.368Z · LW(p) · GW(p)

Nothing is so obvious that it’s obvious.

Errol Morris

comment by Vulture · 2014-05-03T21:17:23.316Z · LW(p) · GW(p)

[N]ature is constantly given human qualities. Wordsworth wrote that “nature never did betray the heart that loved her.” Mother Nature has comforted us in every culture on earth. In the 20th and 21st centuries, some environmentalists claimed that the entire earth is a single ecosystem, a “superorganism” in the language of Gaia.

I would argue that we have been fooling ourselves. Nature, in fact, is mindless. Nature is neither friend nor foe, neither malevolent nor benevolent.

Nature is purposeless. Nature simply is. We may find nature beautiful or terrible, but those feelings are human constructions. Such utter and complete mindlessness is hard for us to accept. We feel such a strong connection to nature. But the relationship between nature and us is one-sided. There is no reciprocity. There is no mind on the other side of the wall. That absence of mind, coupled with so much power, is what so frightened me...

-- Alan Lightman

Replies from: Torello
comment by Torello · 2014-05-04T04:25:15.272Z · LW(p) · GW(p)

Every 100 million years or so, an asteroid or comet the size of a mountain smashes into the earth, killing nearly everything that lives. If ever we needed proof of Nature’s indifference to the welfare of complex organisms such as ourselves, there it is. The history of life on this planet has been one of merciless destruction and blind, lurching renewal.

Sam Harris, Mother Nature is Not Our Friend, in response to the Edge Annual Question 2008

http://www.samharris.org/site/full_text/the-edge-annual-question-20081#sthash.IBMyMOQN.dpuf

comment by Torello · 2014-05-01T23:33:48.566Z · LW(p) · GW(p)

Accident, n. An inevitable occurrence due to the action of immutable natural laws.

  • Ambrose Bierce, The Enlarged Devil's Dictionary, compiled and edited by Ernest J. Hopkins

comment by EGarrett · 2014-05-06T16:39:52.975Z · LW(p) · GW(p)

"The power of accurate observation is commonly called cynicism by those who have not got it." -George Bernard Shaw

Replies from: philh
comment by philh · 2014-05-07T22:37:38.810Z · LW(p) · GW(p)

Or naivety, depending on how cynical the critic is.

And of course, inaccurate observations are commonly called cynical and/or naive as well...

comment by James_Ernest · 2014-05-04T06:18:56.476Z · LW(p) · GW(p)

Real probabilities about the structure and properties of the cosmos, and its relation to living organisms on this planet, can be reach’d only by correlating the findings of all who have competently investigated both the subject itself, and our mental equipment for approaching and interpreting it — astronomers, physicists, mathematicians, biologists, psychologists, anthropologists, and so on. The only sensible method is that of assembling all the objective scientifick data of 1931, and forming a fresh chain of partial indications bas’d exclusively on that data and on no conceptions derived from earlier and less ample arrays of data; meanwhile testing, by the psychological knowledge of 1931, the workings and inclinations of our minds in accepting, connecting, and making deductions from data, and most particularly weeding out all tendencies to give more than equal consideration to conceptions which would never have occurred to us had we not formerly harboured provisional and capricious ideas of the universe now conclusively known to be false. It goes without saying that this realistic principle fully allows for the examination of those irrational feelings and wishes about the universe, upon which idealists so amusingly base their various dogmatick speculations.

-- H.P. Lovecraft, Selected Letters, 1932-1934.

Replies from: johnlawrenceaspden, Vulture, James_Ernest
comment by johnlawrenceaspden · 2014-05-05T18:08:55.707Z · LW(p) · GW(p)

What's with bas'd and dogmatick? Is Lovecraft aiming at some antique effect, or did he write in a non-standard dialect?

Replies from: Nornagest
comment by Nornagest · 2014-05-05T18:27:46.963Z · LW(p) · GW(p)

Yes and yes. Lovecraft was writing in early 20th century New England, but he typically affected the forms of late 1700s British English, or at least tried to. Partly this was for stylistic effect, but I get the sense that he also thought of his native idiom as intellectually debased.

The aesthetics of tradition were kind of a thing with Lovecraft, although in other ways he was thoroughly modern. Not that these affectations were exclusive to Lovecraft by any means; William Hope Hodgson for example wrote The Night Land (a seminal 1912 horror/SF story and notable Lovecraft influence) in an excruciating pseudo-17th-century dialect.

comment by Vulture · 2014-05-04T17:54:32.384Z · LW(p) · GW(p)

Good god, he did write everything like that!

comment by James_Ernest · 2014-05-04T06:29:38.167Z · LW(p) · GW(p)

Consider my priors for knowledge of Bayes-fu by wise predecessors to be significantly raised.

comment by TobyBartels · 2014-05-12T02:57:38.824Z · LW(p) · GW(p)

Don't just tell me what you'd like to be true.

This is from Greg Egan's 1999 novel Teranesia; since there are no hits for ‘Teranesia’ in the Google custom search, I'm inferring that it hasn't been posted before.

Here's a little background. This is a spoiler for some events early in the novel, but it is early; it's not a spoiler for the really big stuff (not even in this chapter). So Prabir lives alone with his father (‘Baba’) and mother (and baby sister Madhusree who is not in this scene), and their garden has been sown with mines for some very interesting reasons that needn't concern us, and Baba has discovered this by being blown up by one. But he's still alive, so mother and Prabir have laid a ladder atop some boxes across the garden, and she's crawled along the ladder to rescue Baba without setting off more mines. But this is harder than anticipated.

She turned to Prabir. “I'm going to try sitting down, so I can get Baba on to the ladder. But then I might not be able to stand up with him, to carry him. If I leave him on the ladder and walk back to my end, do you think the two of us could carry the ladder to the side of the garden with Baba on it—like a stretcher?”

Prabir replied instantly, “Yes, we can do it.”

His mother looked away, angry for a moment. She said, “I want you to think about it. Don't just tell me what you'd like to be true.”

Chastened, Prabir obeyed her. Half his father's weight. More than twice as much as Madhusree's. He believed he was strong enough. But if he was fooling himself, and dropped the ladder …

He said, “I'm not sure how far I could carry him without resting. But I could slide the crate along the ground with me—kick it along with one foot. Then if I had to stop, I could rest the ladder on it.”

His mother considered this. “All right. That's what we'll do.” She shot him a half-smile, shorthand for all the reassuring words that would have taken too long to speak.

(taken from the American hardback edition, pages 50&51)

[Edit: grammar in the text written by me]

Replies from: shminux
comment by shminux · 2014-05-12T20:41:35.978Z · LW(p) · GW(p)

It is a good quote, and it works in context, but often it pays to (temporarily) believe that "what you'd like to be true" actually is and do your hardest (or even the impossible) to figure out how you got there. “Yes, we can do it.” could be the first step toward figuring out the "how" part.

comment by raisin · 2014-05-19T17:36:56.810Z · LW(p) · GW(p)

"There's a blind spot in the center of your visual field," Sarasti pointed out. "You can't see it. You can't see the saccades in your visual timestream. Just two of the tricks you know about. Many others."

Cunningham was nodding. "That's my whole point. Rorschach could be—"

"Not talking about case studies. Brains are survival engines, not truth detectors. If self-deception promotes fitness, the brain lies. Stops noticing— irrelevant things. Truth never matters. Only fitness. By now you don't experience the world as it exists at all. You experience a simulation built from assumptions. Shortcuts. Lies. Whole species is agnosiac by default. Rorschach does nothing to you that you don't already do to yourselves."

comment by A1987dM (army1987) · 2014-05-03T09:04:11.321Z · LW(p) · GW(p)

The little boy's mother was off to market. She worried about her boy, who was always up to some mischief. She sternly admonished him, "Be good. Don't get into trouble. Don't eat all the cabbage. Don't spill all the milk. Don't throw stones at the cow. Don't fall down the well." The boy had done all of these things on other market days. Hoping to head off new trouble, she added, "And don't stuff beans up your nose!" This was a new idea for the boy, who promptly tried it out.

Wikipedia:Don't stuff beans up your nose

Replies from: Lumifer, TobyBartels
comment by Lumifer · 2014-05-03T17:45:55.713Z · LW(p) · GW(p)

There is a shorter version :-)

"Kids, while we're away, don't lock the cat in the fridge", said the parents.

"Ooooh, that's a great idea", said the kids...

comment by TobyBartels · 2014-05-12T03:57:03.578Z · LW(p) · GW(p)

That's not necessarily a bad result. If he's busy stuffing beans up his nose, then this might keep him out of greater trouble; everything else that's listed before (and which apparently he did before) seems worse. That might be just what his mother planned.

Replies from: alex_zag_al
comment by alex_zag_al · 2014-05-22T02:50:49.797Z · LW(p) · GW(p)

I once had to go to the doctor so he could fish a Lego out of my nose. So, that was worse than eating all the cabbage or spilling all the milk, I think. More scary, and probably more expensive, depending on how the insurance worked out.

Replies from: TobyBartels
comment by TobyBartels · 2014-05-22T12:10:05.897Z · LW(p) · GW(p)

I think that shape, hardness, and solubility would all make a Lego brick worse than a bean.

Really, the only way to tell is probably to try it out. Who wants to volunteer for an experiment?

comment by Kawoomba · 2014-05-21T15:16:02.437Z · LW(p) · GW(p)

The workman of today works every day in his life at the same tasks, and this fate is no less absurd [than that of Sisyphus]. But it is tragic only at the rare moments when it becomes conscious.

Camus, The Myth of Sisyphus

Replies from: Richard_Kennaway, roystgnr
comment by Richard_Kennaway · 2014-05-31T16:52:14.152Z · LW(p) · GW(p)

The worker is paid for his work, and with this money he obtains a roof over his head, food on the table, and the wherewithal to raise a family and to pursue other activities when he is not working. Sisyphus works for nothing and does nothing but work. That Camus sees, or affects to see, no difference between their situations says something about Camus, but nothing about work.

Replies from: MugaSofer
comment by MugaSofer · 2014-07-05T22:15:10.342Z · LW(p) · GW(p)

Is it truly different to work because the Gods have forced you, compared to working because the threat of starvation and homelessness has forced you?

I thought the quote was suggesting both tasks are equally arbitrary and pointless, though, rather than discussing compensation. It seems more interesting.

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2014-07-06T09:49:50.333Z · LW(p) · GW(p)

Is it truly different to work because the Gods have forced you, compared to working because the threat of starvation and homelessness has forced you?

Yes, it is.

Some people have it harder than others, but we all work because the threat of starvation and homelessness forces us; except for those relying on the charity of friends and family (including deceased ones), or of institutions. The meat machines we live in require sustenance and shelter, without which we die, and these resources are provided either by our own work or by that of others. Death is free. Life has to be worked for.

Some are fortunate enough to have the abilities, health, energy, and social environment to be confident of always finding people to pay for whatever it is we want to direct our efforts towards. The wolves are so very far from our door that we can forget, or never realise, that they are out there, inching closer when we rest and retreating when we work.

So you can apply the story of Sisyphus to all of us, but only in the larger sense that we are forced to run all the while just to stay alive, and that only for 70 years or so. It applies just as much to Camus (whose Wiki page is rather uninformative about how he actually earned a living) as to the lowest factory worker.

We may, of course, daydream of a future in which we need care no more to clothe and eat. We may work to bring such a future about. But that is not the world we live in today, nor has it ever been, nor will it be for a very long time.

I thought the quote was suggesting both tasks are equally arbitrary and pointless, though, rather than discussing compensation.

It is suggesting that, and, I say, it is wrong.

Replies from: MugaSofer
comment by MugaSofer · 2014-07-06T18:19:20.513Z · LW(p) · GW(p)

(One might argue that "the workman of today" is less likely to accomplish something meaningful, in the course of earning their living.)

Even if everything was meaningless - which it isn't, in my opinion, but Camus does seem to have thought so - and everyone must work or starve - which, as you note, is not true because people are compassionate - surely that merely makes the comparison to Sisyphus that much more relevant? How does it undermine the quote?

Indeed, if it's that hard to escape, surely comparing starvation to the inescapable will of the gods is that much more accurate?

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2014-07-06T20:50:07.196Z · LW(p) · GW(p)

Indeed, if it's that hard to escape, surely comparing starvation to the inescapable will of the gods is that much more accurate?

That depends on the strength of one's transhumanist faith. :)

One can repurpose Camus as much as Camus repurposes Sisyphus, but the original passage does go on to say, "Sisyphus, proletarian of the gods, powerless and rebellious..." So Camus is not talking about us all, certainly not intellectuals like himself, but about the proles.

comment by roystgnr · 2014-05-31T14:12:19.103Z · LW(p) · GW(p)

I think there's a non-negligible difference between "I push the same rock around every day, and there it is back in the exact place it started again" and e.g. "I push the same kinds of rock around every day, but last year's are now embedded in the building we just finished."

Replies from: Kawoomba
comment by Kawoomba · 2014-05-31T14:32:30.438Z · LW(p) · GW(p)

Camus may answer along the lines of "since [any ascribing of meaning] is absurd in the first place, if you think there's objectively more meaning in the building you built than in the rock you pushed up, you're not taking the premise seriously". In a way we're whistling in a dark forest.

comment by Eugine_Nier · 2014-05-01T23:21:22.242Z · LW(p) · GW(p)

People are surely better off with the truth. Oddly enough, everyone agrees with this when it comes to the arts. Sophisticated people sneer at feel-good comedies and saccharine romances in which everyone lives happily ever after. But when it comes to science, these same people say, "Give us schmaltz!" They expect the science of human beings to be a source of emotional uplift and inspirational sermonizing.

Steven Pinker

Replies from: fubarobfusco, 123
comment by fubarobfusco · 2014-05-02T15:33:05.757Z · LW(p) · GW(p)

This lacks a ring of truth for me.

A lot of folks seem to expect the science of human beings to reinforce their bitterness and condemnation of human nature (roughly, "people are mostly crap"). I kinda suspect that if you asked "sophisticated people" (whoever those are) to name some important psychology experiments, those who named any would come up with Zimbardo's Stanford prison experiment and Milgram's obedience experiments pretty early on. Not a lot of emotional uplift there.

As for the arts — horror films where everyone dies screaming seem to be regarded as every bit as lowbrow as feel-good comedies.

comment by 123 · 2014-05-03T12:14:50.858Z · LW(p) · GW(p)

It's not obvious that one is better off with the truth. Assume that for some desirable thing X:

P(X|I believe X will happen) = 49%

P(X|I believe X won't happen) = 1%

It seems I can't rationally believe that X will happen. Perhaps I would be better off being deluded about it.
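
A toy expected-value comparison (purely illustrative; the payoff V is a made-up number, not something from the comment above) of why the delusion could pay off here, ignoring any costs of self-deception:

```python
# Hypothetical payoff for X happening; the 100.0 is an arbitrary choice.
V = 100.0

p_if_believe = 0.49   # P(X | I believe X will happen)
p_if_not     = 0.01   # P(X | I believe X won't happen)

ev_deluded  = p_if_believe * V  # expected value if I (irrationally) believe X will happen
ev_accurate = p_if_not * V      # expected value with the calibrated belief that X won't happen

print(ev_deluded, ev_accurate)  # 49.0 vs. 1.0: ignoring any cost of self-deception, the delusion wins
```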

Replies from: Remlin
comment by Remlin · 2014-05-06T05:47:06.710Z · LW(p) · GW(p)

Sorry, I don't understand - why doesn't the sum of the probabilities equal 100% in your example? I assume you dropped a "5" and meant "P(X|I believe X won't happen) = 51%"?

Perhaps I would be better off being deluded about it.

But for what reason?

Replies from: timujin, 123
comment by timujin · 2014-05-06T08:20:20.678Z · LW(p) · GW(p)

These probabilities are not required to sum to 1, because they are not incompatible and exhaustive possible outcomes of an experiment. More obvious example to illustrate:

P(6-sided die coming up as 6 | today is Monday) = 1/6
P(6-sided die coming up as 6 | today is not Monday) = 1/6
1/6 + 1/6 != 1
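
A minimal simulation sketch (my own illustrative addition, assuming a fair die and that each day is Monday with probability 1/7) confirming that these two conditionals needn't sum to 1, while P(X|A) + P(~X|A) always does:

```python
import random

N = 200_000
monday_six = monday_total = other_six = other_total = 0

for _ in range(N):
    is_monday = random.randrange(7) == 0      # today is Monday with probability 1/7
    roll_is_six = random.randint(1, 6) == 6   # fair six-sided die
    if is_monday:
        monday_total += 1
        monday_six += roll_is_six
    else:
        other_total += 1
        other_six += roll_is_six

p_six_given_monday = monday_six / monday_total
p_six_given_other  = other_six / other_total

print(p_six_given_monday + p_six_given_other)         # ~1/3: P(X|A) + P(X|~A) need not be 1
print(p_six_given_monday + (1 - p_six_given_monday))  # exactly 1: P(X|A) + P(~X|A)
```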

Replies from: Remlin
comment by Remlin · 2014-05-06T14:48:14.426Z · LW(p) · GW(p)

I think your example is not suitable for the situation above - there I can see only two possible outcomes: X happens or X doesn't happen. We don't know anything more about X. So P(X|A) + P(X|~A) = 1, isn't it?

Replies from: timujin, JQuinton
comment by timujin · 2014-05-06T16:35:55.600Z · LW(p) · GW(p)

No. You may have confused it with P(X|A) + P(~X|A) = 1 (note the tilde). In my case, either the 6-sided die comes up as 6, or it doesn't.

comment by JQuinton · 2014-05-14T16:29:14.542Z · LW(p) · GW(p)

Yes, either X happens or X doesn't happen. P(X) + P(~X) = 1, and therefore P(X | A) + P(~X | A) = 1. Both formulations are stating the probability of X, but one is conditioning on A; so either X given A happens or X given A doesn't happen (which is P(~X | A), not P(X | ~A)).

comment by 123 · 2014-05-07T04:56:25.309Z · LW(p) · GW(p)

But for what reason?

When Pinker said "better off", I assumed he included goal achievement. It's plausible that people are more motivated to do something if they're more certain than they should be based on the evidence. They might not try as hard otherwise, which will influence the probability that the goal is attained. I don't really know if that's true, though.

The thing may be worth doing even if the probability isn't high that it will succeed, because the expected value could be high. But if one isn't delusionally certain that one will be successful, it may no longer be worth doing because the probability that the attempt succeeds is lower. (That was the point of my first comment.)

There could be other psychological effects of knowing certain things. For example, maybe it would be difficult to handle being completely objective about one's own flaws and so on. Being objective about people you know may (conceivably) harm your relationships. Having to lie is uncomfortable. Knowing a completely useless but embarrassing fact about someone but pretending you don't is uncomfortable, not simply a harmless, unimportant update of your map of the territory. Etc.

I'm not saying I know of any general way to avoid harmful knowledge, but that doesn't mean it doesn't exist.

comment by timujin · 2014-05-18T08:29:07.644Z · LW(p) · GW(p)

“All witches are selfish, the Queen had said. But Tiffany’s Third Thoughts said: Then turn selfishness into a weapon! Make all things yours! Make other lives and dreams and hopes yours! Protect them! Save them! Bring them into the sheepfold! Walk the gale for them! Keep away the wolf! My dreams! My brother! My family! My land! My world! How dare you try to take these things, because they are mine! I have a duty!”

― Terry Pratchett, The Wee Free Men (Discworld, #30)

Replies from: DanielLC, TheAncientGeek
comment by DanielLC · 2014-05-19T03:15:50.118Z · LW(p) · GW(p)

If you want to use your selfishness to help others, then you're not selfish.

Replies from: army1987, timujin, Neo
comment by A1987dM (army1987) · 2014-05-21T09:22:09.200Z · LW(p) · GW(p)

Do we really need to go into the question of what “selfishness” actually means? In ordinary situations I'd say that “the actual altruist [is] whichever one actually holds open doors for little old ladies”; maybe in certain situations we need different words to specify whether they do so because it's in their own utility function or because of religious/game-theoretical/superrational/acausal/whatever-they-call-it-these-days reasons, but...

Replies from: DanielLC
comment by DanielLC · 2014-05-21T19:37:21.318Z · LW(p) · GW(p)

I don't think this is just a problem with definitions. This is fake morality.

She's giving a fake justification for helping others as her own self interest. Someone who finds a way to justify buying a million dollar laptop is clearly just being selfish and doesn't really care about their claimed morality of altruism. Similarly, someone who tries to justify helping others is clearly just being altruistic and doesn't really care about their claimed morality of selfishness.

comment by timujin · 2014-05-19T11:01:52.652Z · LW(p) · GW(p)

Of course you're not. But human nature is supposedly selfish, and if your true goals are altruistic, you will have to find a way to turn it around.

Replies from: None
comment by [deleted] · 2014-05-19T13:24:01.080Z · LW(p) · GW(p)

human nature is supposedly selfish

Emphasis on "supposedly", since the popular hypotheses about "selfish human nature" are far too simplistic to reflect any actual results of psychological research.

Replies from: timujin
comment by timujin · 2014-05-19T15:25:36.264Z · LW(p) · GW(p)

Of course they are. Unlike the ones about Pratchett's witches, though, which reflect the 'locally-selfish-globally-altruistic' concept surprisingly well.

comment by Neo · 2014-05-19T08:58:54.199Z · LW(p) · GW(p)

Selfishness seems to be referred to primarily as a mindset or attitude, and helping others as an outcome. I think they can co-exist, for example in Adam Smith's invisible hand in capitalism.

Replies from: DanielLC
comment by DanielLC · 2014-05-19T18:11:23.095Z · LW(p) · GW(p)

I'm not saying that your selfishness can't result in others being helped. I'm saying that if you're trying to figure out how to use your selfishness to help others, then helping others is clearly your goal, which proves you're not selfish. If you're willing to game the system to help others, then you'd be willing to help others without gaming the system.

Replies from: VAuroch
comment by VAuroch · 2014-05-21T20:20:01.018Z · LW(p) · GW(p)

If you are selfish (this usually will cash out as "you alieve that selfishness is good") but believe it is virtuous or beneficial to act unselfishly, then you would rightly seek ways to act in ways that feel locally selfish but have unselfish consequences.

Replies from: DanielLC, DanielLC
comment by DanielLC · 2014-05-21T21:45:40.601Z · LW(p) · GW(p)

You have a left parenthesis but no matching right parenthesis.

Replies from: VAuroch
comment by VAuroch · 2014-05-21T23:01:54.682Z · LW(p) · GW(p)

I have now fixed this serious issue. (Is this sarcasm? You Decide!)

Replies from: Cyan
comment by Cyan · 2014-05-22T03:02:24.104Z · LW(p) · GW(p)

Shouldn't that be

(Is this sarcasm? You Decide!

?

Replies from: VAuroch
comment by VAuroch · 2014-05-22T16:06:44.114Z · LW(p) · GW(p)

I considered that but decided it was needlessly cruel. And now you did it for me, so I get the best of both worlds.

comment by DanielLC · 2014-05-22T01:26:27.557Z · LW(p) · GW(p)

Now that I can understand your sentence:

If you are selfish, but believe it is virtuous to act unselfishly, then you'll seek ways to act in ways that look unselfish, but have selfish consequences.

Tiffany seems to be an altruist who thinks she's supposed to be selfish, and is trying to justify acting altruistically as somehow being selfish.

Replies from: VAuroch
comment by VAuroch · 2014-05-22T16:40:56.183Z · LW(p) · GW(p)

You're describing someone who believes it is beneficial to look unselfish but not be unselfish.

If you are selfish, but have reasoned out that helping others is the correct goal to have, you would believe not that it is beneficial to look unselfish, but that it is beneficial to act unselfishly. And if you believe that but do not alieve it, System 2 would look for ways to do unselfish things that System 1 would perceive as selfish, so as to better motivate yourself toward those goals.

comment by TheAncientGeek · 2014-05-19T16:44:45.838Z · LW(p) · GW(p)

Making your identity small is wisdom...

Making your identity large is....?

Replies from: timujin, VAuroch
comment by timujin · 2014-05-19T17:01:27.941Z · LW(p) · GW(p)

...witchcraft?

Replies from: TheAncientGeek
comment by TheAncientGeek · 2014-05-19T17:26:38.653Z · LW(p) · GW(p)

No. Try a monosyllable.

Replies from: timujin
comment by timujin · 2014-05-19T17:38:15.093Z · LW(p) · GW(p)

"...on"?

Damn, that's tricky. Only boring monosyllables come to mind, like "good". "...power" is almost a monosyllable if you say it fast enough, though.

Oh! I know.

"...life".

Replies from: TheAncientGeek
comment by TheAncientGeek · 2014-05-19T17:58:40.540Z · LW(p) · GW(p)

Getting warmer ...

Replies from: timujin
comment by timujin · 2014-05-20T03:50:42.588Z · LW(p) · GW(p)

Oh, so I am to seek the one true answer, not optimise for the most badass one? Belch... I've got nothing. Why is this conversation getting downvoted anyway?

comment by VAuroch · 2014-05-21T18:07:16.910Z · LW(p) · GW(p)

CEV, in this case.

comment by shminux · 2014-05-29T20:40:44.923Z · LW(p) · GW(p)

the fact that I don’t know exactly what consciousness is, doesn’t mean that I can’t be crystal-clear about what it isn’t!

Scott Aaronson, in reply to statements like "A stone is conscious to the “inputs” of gravity and electrostatic repulsion"

Replies from: roystgnr, MugaSofer, JosephY
comment by roystgnr · 2014-05-31T14:05:07.112Z · LW(p) · GW(p)

I'm not sure Scott isn't just falling victim to the sorites paradox here. There are lots of macroscale definitions which seem to break down at their smallest application, and it's not immediately obvious that consciousness couldn't be one of them.

Replies from: Kawoomba
comment by Kawoomba · 2014-05-31T14:40:50.026Z · LW(p) · GW(p)

The question is whether to interpret such a falling apart of a definition (which I take to mean that related decision problems cannot be clearly answered anymore) as an inherent or even necessary attribute of concepts which 'live' at a macroscale, or as a weakness of said definition, as a sign that we're mistaking a fuzzy word cloud for a precisely defined set.

comment by MugaSofer · 2014-07-05T22:09:31.807Z · LW(p) · GW(p)

Hmm. I see his point, I think, but I ... think it does mean that, actually. Without fully understanding the definition, you should be less sure that a better understanding wouldn't classify them differently.

Picture a slave-owner saying something similar about a slave, for instance. Slave-owners were even more confused than we are about personhood, and I think it's clear that they weren't "crystal clear on what [isn't] a person", in retrospect.

Replies from: shminux
comment by shminux · 2014-07-05T23:31:11.116Z · LW(p) · GW(p)

Sure, there are debatable cases. But there are also clear-cut ones, like "a bacterium, while alive, has no personhood"; and if your model predicts that it has more personhood than a human (as IIT does for consciousness for a certain 2D configuration), then you should not call whatever your model says the bacterium has more of "personhood".

comment by JosephY · 2014-05-29T21:42:25.072Z · LW(p) · GW(p)

It reminds me of Justice Potter Stewart: "I know it when I see it!"

Replies from: shminux
comment by shminux · 2014-05-29T22:06:25.934Z · LW(p) · GW(p)

Well, it's the converse, which seems a lot more useful a criterion to me.

comment by Eugine_Nier · 2014-05-01T23:25:31.560Z · LW(p) · GW(p)

Predictors have an incentive to predict likely-events-of-low-consequence when they are not harmed by their errors. But in the real world, what matters is warning about events of high consequence. In the real world, the latter can only be revealed through skin-in-the-game as the supposedly "good predictors" go bankrupt.

Nassim Taleb

comment by NancyLebovitz · 2014-05-04T13:38:36.522Z · LW(p) · GW(p)

I look at books as investments in a future of learning rather than a fleeting moment of insight, soon to be forgotten.

--Kevan Lee

comment by shminux · 2014-05-19T18:32:46.626Z · LW(p) · GW(p)

People are extraordinarily sensitive to framing. "Art" is valuable. "Content" is not.

Patrick McKenzie on why having a publication date on your blog entry devalues it.

Replies from: Richard_Kennaway, shminux, Nornagest
comment by Richard_Kennaway · 2014-05-20T07:45:27.527Z · LW(p) · GW(p)

Patrick McKenzie on why having a publication date on your blog entry devalues it.

(Link to the, er, "content".)

And yet books always have a publication date.

ETA: as do scientific articles, of course, and the date really matters, not because of being "up to date" but because the date gives some context to whatever it is.

Replies from: gwern
comment by gwern · 2014-05-20T20:37:35.123Z · LW(p) · GW(p)

As far as books go:

Most writing only carries a publication date because that was inserted several years ago into the WordPress template by a designer. The designer likely knows nothing about your company, to say nothing of the instant work. He put in a date because WordPress makes it really easy and because everyone knows that blog posts have dates. He also probably made the decision to make the date front-and-center in the blog post, rather than treating it as minimal-impact metadata and burying it after the main text or putting it in a bots-only header.

I'm curious if showing a date is as bad as he thinks; he doesn't mention ever A/B testing the claim himself. (I'd test it on my site, except the date is already buried in the sidebar to the point where many people miss it, so I wouldn't expect much of a difference.)

Replies from: Vaniver
comment by Vaniver · 2014-05-21T20:24:16.217Z · LW(p) · GW(p)

I'm curious if showing a date is as bad as he thinks

I predict yes, but if I'm reading his position right showing the date is just a symptom of not having a Long Content focus, which is what he's really arguing for in that article (and which your site already has in spades).

Replies from: gwern
comment by gwern · 2014-05-21T22:14:21.129Z · LW(p) · GW(p)

If the problem is focusing on short-term writing which becomes worthless quickly, then simply hiding or showing dates shouldn't much affect how long readers stay on the page: most short-term stuff shows its colors very quickly. (How many sentences does it take to figure out you're not interested in a rant about John Kerry from 2004?)

Replies from: Vaniver
comment by Vaniver · 2014-05-21T22:40:43.843Z · LW(p) · GW(p)

I think McKenzie's argument is that using a date can turn long content into short content, which many people do by accident, and while he doesn't quantify it (which would be the value of A/B testing) I think he has enough evidence to establish the direction of the effect. Not using a date is obviously not sufficient to turn short content into long content, but I do think it may be helpful in getting one into the right state of mind, as it focuses the attention on sorting things by content rather than time. (Imagine trying to find all of Robin Hanson's writing on construal level theory - yes, you can use the nearfar tag on Overcoming Bias, but that's sorted by date, and there's no solid introduction.)

Replies from: gwern
comment by gwern · 2014-05-22T03:07:17.073Z · LW(p) · GW(p)

(Imagine trying to find all of Robin Hanson's writing on construal level theory - yes, you can use the nearfar tag on Overcoming Bias, but that's sorted by date, and there's no solid introduction.)

That's a good example of how weak date markers are: if the dates were deleted completely from every OB post, people would still find them incomprehensible, because there's only one post which could be considered an overview of the concept, and it is a needle in the haystack until and unless Hanson in some way synthesizes all his scattershot posts and allusions into a single Near-Far page.

The posts need some sort of organization imposed; the lack of that organization is what kills them, not some date markers. If my essays were broken up into 500-word chunks, and sorted either randomly or by date, they wouldn't look much better.

comment by shminux · 2014-05-21T23:54:39.596Z · LW(p) · GW(p)

To expand on this a bit: he gives the following supporting example:

I once wrote an article about salary negotiation. If you go by the numbers, it created more value for more people than any other single thing I've ever written. (I keep a label in Gmail for when folks tell me they got a raise as a direct result of advice in there. The running tally is in the high seven figures a year these days.) I think if I were to revisit the topic today I'd write substantially the same advice. However, that article has a date on it, and just the fact of it having a date on it makes it less useful.

I have seen variants of the following conversation happen on Twitter / Reddit / HN / etc multiple times.

"I just got a job offer as a front-end engineer at a Valley company. How do I handle the salary negotiation?"

"Patrick wrote about that here. It is good advice."

"That looks like it was written in 2012. Do you have anything more up-to-date?"

History is a pretty wild rollercoaster, but nothing which happened in the interim has suddenly made "Don't negotiate your salary!" or "If you do negotiate your salary, start by naming a nice low number. You can always work your way up later!" into good advice. And yet if you put a date on your work, people immediately assume it gets stale.

comment by Nornagest · 2014-05-20T20:57:31.657Z · LW(p) · GW(p)

I wanted to give this a fair shake, but it reads like McKenzie has never heard of journalism.

Replies from: Vaniver
comment by Vaniver · 2014-05-21T20:22:17.231Z · LW(p) · GW(p)

Given the choice, unless you're the New York Times and your entire business is built around throwing out some of the world's best writing every day right after breakfast, you should choose to write things which last. After all, you don't write software with the explicit intention that it will suffer bitrot hours after release, now do you?

He's writing for an audience that sells software as a service (SaaS). Why would he give journalism more than a disclaimer (which he does include)?

Replies from: Nornagest
comment by Nornagest · 2014-05-21T20:35:24.794Z · LW(p) · GW(p)

He might be writing for an SaaS audience, but he's writing about the blog format, which is built to facilitate crowdsourced magazine journalism or editorial-style content. Now, he's quite right that the format's poorly suited to long-form or reference-style content, but starting a post with "let's talk about blogging" and proceeding to talk about all the ways it sucks for those content types, without much more than a word for its intended purpose, strikes me as a pretty serious omission.

If instead he'd framed it as "blogs are often misused", then we wouldn't be having this conversation. But that's not where we're standing.

Replies from: Vaniver
comment by Vaniver · 2014-05-21T21:01:24.254Z · LW(p) · GW(p)

strikes me as a pretty serious omission.

What makes it serious? What purpose does including journalism in the article serve?

comment by Vaniver · 2014-05-27T18:06:51.357Z · LW(p) · GW(p)

Because positive illusions typically provide a short-term benefit with larger long-term costs, they can become a form of emotional procrastination.

-- Max H. Bazerman

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2014-05-27T19:01:41.642Z · LW(p) · GW(p)

Context? I can randomly replace elements of this by their opposites and get something that sounds just as truthy.

Try it!

"[Because/although] [positive/negative] [illusions/perceptions] provide a [short/long]-term [benefit/cost] with [larger/smaller] [long/short]-term [costs/benefits], they can [become/avoid] a form of [emotional/intellectual] [procrastination/spur to action]."

Replies from: Vaniver, faul_sname
comment by Vaniver · 2014-05-27T21:19:02.442Z · LW(p) · GW(p)

It's from a book on decision-making, in a section on motivational biases. Bazerman discusses the evidence that positive illusions help ("[research] suggest[s] that positive illusions enhance and protect self-esteem, increase personal contentment, help individuals to persist at difficult tasks, and facilitate coping with aversive and uncontrollable events" is a short sample), talks about clusters (unrealistically positive views of the self, unrealistic optimism, illusion of control, self-serving attributions, and positive illusions in groups and society), and then the quote is from a section labeled "Are Positive Illusions Good for You?". Here's the full paragraph it is from:

I believe that each of these findings is true and that in some specific situations (e.g., severe health conditions), positive illusions may prove to be beneficial. In addition, positive illusions may help people cope with tragic events, particularly when they have few alternatives and are not facing any major decisions. However, I also believe that the story told by this literature is incomplete and therefore dangerous in most decision-making environments. Every day, people invest their life savings in new businesses that have little chance of success. Similarly, falsely assuming that they are irreplaceable, people make ultimatums to their employers and often end up losing their jobs. Positive illusions are hazardous when they cause people to continually fool themselves. Because positive illusions typically provide a short-term benefit with larger long-term costs, they can become a form of emotional procrastination. I believe that you cannot maintain these illusions without reducing the quality of your decisions.


Try it!

It looks to me like doing an odd number of flips is often silly. ("Because positive illusions typically provide a long-term cost with larger long-term costs, they can avoid a form of emotional procrastination." What?)

comment by faul_sname · 2014-06-01T08:42:07.256Z · LW(p) · GW(p)

"Because positive illusions provide a short-term benefit with smaller short-term benefits, they can become a form of intellectual procrastination."

comment by Xelaz · 2014-05-13T21:51:45.244Z · LW(p) · GW(p)

I wander through
the dark wilderness
by the light
of my burning map

-- Lucien Zell (can't find an authoritative attribution)

Replies from: Desrtopa
comment by Desrtopa · 2014-05-14T01:49:57.920Z · LW(p) · GW(p)

I'm really not clear on what this is actually supposed to be a metaphor for.

It's clearly not something you would literally want to do, since the night is temporary and the light provided by the map is dim and brief. But maybe this is a metaphorical long-lasting night and bright burning map?

Replies from: Cube
comment by Cube · 2014-05-14T15:53:15.375Z · LW(p) · GW(p)

Destroying something that would be useful or even necessary in the future so that you can better get through or perhaps survive the present.

Going to the same college as your high school sweetheart for example. Perhaps it will work out and you won't need the map.

comment by BloodyShrimp · 2014-05-06T02:27:31.656Z · LW(p) · GW(p)

I'm sure this has been discussed before, but my attempts at searches for those discussions failed, so...

Why is this thread in Main and not Discussion?

Replies from: Viliam_Bur, elharo
comment by Viliam_Bur · 2014-05-06T09:31:50.877Z · LW(p) · GW(p)

Tradition, I guess.

In the Age of Sequences, Eliezer sometimes posted rationality quotes, in the article text (1, 2, 3, etc.). Things written by Eliezer in that era are probably automatically considered Main-level. And the new Rationality Quotes threads don't seem worse than the traditional ones -- if we look at the highly voted quotes.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2014-05-07T01:04:26.242Z · LW(p) · GW(p)

Things written by Eliezer in that era are probably automatically considered Main-level.

Well, Discussion didn't exist back then.

comment by elharo · 2014-05-06T11:04:30.083Z · LW(p) · GW(p)

Last month I posted the rationality quotes in discussion. Someone complained and said it belonged in main so I moved it there. This month I just started it in Main.

comment by Darklight · 2014-05-14T19:46:51.253Z · LW(p) · GW(p)

In the midst of it all you must take your stand, good-temperedly and without disdain, yet always aware that a man's worth is no greater than the worth of his ambitions.

-- Marcus Aurelius, Meditations, pg. 76

comment by Vulture · 2014-05-05T15:28:47.341Z · LW(p) · GW(p)

It's always a good idea to let reality be your only obstacle. Your imagination shouldn't be the limit on your success.

-- Scott Adams

comment by B_For_Bandana · 2014-05-03T21:22:00.762Z · LW(p) · GW(p)

Context: In a short video, a woman throws out an old desk lamp. The music and cinematography are contrived such that the viewer feels tempted to feel sorrow on behalf of the lamp. Then a man walks up and addresses the camera with:

Many of you feel bad for this lamp. That is because you crazy. It has no feelings. And the new one is much better.

A good example of the difference between fuzzies and utilons.

comment by Gunnar_Zarncke · 2014-05-01T21:50:17.572Z · LW(p) · GW(p)

'Whatever our calling, whether we are scientists, engineers, poets, public servants, or parents, we all live in a complex, and ever-changing world, and all of us deserve what's in this toolbox [meaning the humanities]: critical thinking skills; knowledge of the past and other cultures; an ability to work with and interpret numbers and statistics; access to the insights of great writers and artists; a willingness to experiment, to open up to change; and the ability to navigate ambiguity.'

In an opinion piece in the Boston Globe called "At MIT, the humanities are just as important as STEM" by Deborah K. Fitzgerald, Apr 30, 2014

The Slashdot poster AthanasiusKircher goes on to ask:

What other essential knowledge or skills should we add to this imaginary 'toolbox'?

See the Slashdot post

Replies from: SaidAchmiz, Eugine_Nier
comment by Said Achmiz (SaidAchmiz) · 2014-05-02T00:08:58.593Z · LW(p) · GW(p)

critical thinking skills; knowledge of the past and other cultures; an ability to work with and interpret numbers and statistics; access to the insights of great writers and artists; a willingness to experiment, to open up to change; and the ability to navigate ambiguity.'

Some of these things are not like the others...

Replies from: johnlawrenceaspden
comment by johnlawrenceaspden · 2014-05-02T21:18:36.875Z · LW(p) · GW(p)

Which are the odd ones out?

Replies from: SaidAchmiz
comment by Said Achmiz (SaidAchmiz) · 2014-05-02T21:24:39.157Z · LW(p) · GW(p)

To a first approximation:

{ critical thinking skills; an ability to work with and interpret numbers and statistics; a willingness to experiment, to open up to change }

vs.

{ knowledge of the past and other cultures; access to the insights of great writers and artists }

Then you've got this one by itself because what the heck does it even mean:

{ the ability to navigate ambiguity }

Replies from: johnlawrenceaspden, EHeller, dspeyer
comment by johnlawrenceaspden · 2014-05-03T12:55:14.665Z · LW(p) · GW(p)

{ the ability to navigate ambiguity }

I think this is one of the most important skills you get from the humanities. I have a friend who's a history professor. He's very used to hearing 20 different accounts of the same event told by different people, most of whom are self-serving if not outright lying, and working out what must actually have gone on, which looks like a strength to me.

He has a skill I'd like to have, but don't, and he got it from studying history (and playing academic politics).

Replies from: SaidAchmiz, Lumifer
comment by Said Achmiz (SaidAchmiz) · 2014-05-03T18:05:56.814Z · LW(p) · GW(p)

working out what must actually have gone on

How did he know that his judgment of what actually had gone on was correct? How did he verify his conclusion?

comment by Lumifer · 2014-05-03T17:48:01.743Z · LW(p) · GW(p)

{ the ability to navigate ambiguity }

I think this is one of the most important skills you get from the humanities.

Statistics is precisely that, but with numbers.

Replies from: VAuroch
comment by VAuroch · 2014-05-05T20:27:25.669Z · LW(p) · GW(p)

That only works if you have numbers.

Replies from: Lumifer
comment by Lumifer · 2014-05-06T16:05:45.044Z · LW(p) · GW(p)

Luckily, you can make numbers.

Replies from: VAuroch
comment by VAuroch · 2014-05-06T20:22:41.397Z · LW(p) · GW(p)

"Making numbers" is unlikely to produce useful numbers.

Replies from: army1987, Lumifer
comment by A1987dM (army1987) · 2014-05-09T08:13:55.678Z · LW(p) · GW(p)

Not necessarily.

Relevant Slate Star Codex post: “If It’s Worth Doing, It’s Worth Doing With Made-Up Statistics

comment by Lumifer · 2014-05-06T20:40:52.999Z · LW(p) · GW(p)

"Making" is not "making up".

When you flip a coin a bunch of times and decide that it's fair, you've made numbers. There are no numbers in the coin itself, but you reasonably can state the probability of the coin coming up heads and even state your certainty in this estimate. These are numbers you made.

As a more general observation, in the Bayesian approach the prior represents information available to you before data arrives. The prior rarely starts as a number, but you must make it a number before you can proceed further.
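
A minimal sketch (my own illustration of the coin example, assuming a uniform Beta(1, 1) prior and 200 simulated flips, neither of which is specified above) of "making numbers" this way:

```python
import random

random.seed(0)
flips = [random.random() < 0.5 for _ in range(200)]  # 200 flips of a hypothetical fair coin
heads = sum(flips)
tails = len(flips) - heads

# Beta-Binomial update: Beta(1, 1) prior -> Beta(1 + heads, 1 + tails) posterior
a, b = 1 + heads, 1 + tails
posterior_mean = a / (a + b)
posterior_sd = (a * b / ((a + b) ** 2 * (a + b + 1))) ** 0.5

# The coin itself contains no numbers, but now we can state an estimate and our uncertainty about it.
print(f"P(heads) ~ {posterior_mean:.3f} +/- {posterior_sd:.3f}")
```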

Replies from: VAuroch
comment by VAuroch · 2014-05-07T04:15:00.851Z · LW(p) · GW(p)

There are no numbers in the coin itself, but you reasonably can state the probability of the coin coming up heads and even state your certainty in this estimate. These are numbers you made.

No, those are numbers you found. The inherent tendency to produce numbers when tested in that way ("fairness/unfairness") was already a property of the coin; you found what numbers it produced, and used that information to derive useful information.

Making numbers, on the other hand, is almost always making numbers up. Sometimes processes where you make numbers up have useful side-effects:

Of course, the point of a subjective Bayesian calculation wasn't that, after you made up a bunch of numbers, multiplying them out would give you an exactly right answer. The real point was that the process of making up numbers would force you to tally all the relevant facts and weigh all the relative probabilities.

but that doesn't mean that making numbers is at all useful.

Basically, I think it's important to distinguish between finding numbers which encode information about the world, and making numbers from information you already have. Making numbers may be a necessary prerequisite for other useful processes, but it is not in itself useful, since it requires you to already have the information.

Replies from: Lumifer
comment by Lumifer · 2014-05-07T06:08:21.582Z · LW(p) · GW(p)

No, those are numbers you found.

I don't think this is a useful distinction, but if you insist...

You said: "That only works if you have numbers." Then the answer is: "Luckily, you can find numbers."

Replies from: VAuroch
comment by VAuroch · 2014-05-07T16:11:55.774Z · LW(p) · GW(p)

Finding relevant numbers is significantly difficult in most circumstances.

Replies from: Lumifer
comment by Lumifer · 2014-05-07T16:18:15.496Z · LW(p) · GW(p)

That phrase is so general as to be pretty meaningless.

I do not subscribe to the notion that anything not expressible in math is worthless, but "in most circumstances" the inability to find any numbers is a strong indication that you don't understand the issue well.

Replies from: VAuroch
comment by VAuroch · 2014-05-07T21:35:53.578Z · LW(p) · GW(p)

the inability to find any numbers is a strong indication that you don't understand the issue well.

Yes, that's the whole point. There aren't always numbers you can find; even when there are, finding them is nontrivial; and you often have to deal with the ambiguous situation or problem regardless.

{ the ability to navigate ambiguity }

I think this is one of the most important skills you get from the humanities.

Statistics is precisely that, but with numbers.

What you said here is a vast oversimplification; if you have gotten to the point where you can find relevant numbers, you have already successfully navigated most of the ambiguity.

Is there still an inferential gap here? I thought I made my point clear about three comments ago, but this is clearly not as obvious a distinction as I expected it to be.

Replies from: Lumifer
comment by Lumifer · 2014-05-08T14:56:26.590Z · LW(p) · GW(p)

if you have gotten to the point where you can find relevant numbers, you have already successfully navigated most of the ambiguity.

And that's where you are being misled by your insistence on "finding" numbers instead of "making" them.

1. It's pretty easy to construct estimates. The problem is that without good data these estimates will be too wide to the point of uselessness.
2. But you can think, and find some data, and clean some existing data, and maybe narrow these estimates down a bit.
3. Go back to 1. and repeat until you run out of data or the estimate is narrow enough to fit its purpose.

Ambiguity isn't some magical concept limited to the humanities. The whole of statistics is dedicated to dealing with ambiguity. In fact, my standard definition of statistics is "a toolbox of methods to deal with uncertainty".

I understand your point, I just think it's mistaken.

Replies from: VAuroch
comment by VAuroch · 2014-05-08T17:41:06.441Z · LW(p) · GW(p)

I consider all the things you've said to be my best arguments for why you're wrong, so there's clearly something wrong here. But I've run out of novel arguments and can't figure out where the disconnect is.

Replies from: Lumifer
comment by Lumifer · 2014-05-08T18:21:08.449Z · LW(p) · GW(p)

why you're wrong

What is that statement of mine to which you are assigning the not-true value?

Replies from: VAuroch
comment by VAuroch · 2014-05-08T21:33:10.663Z · LW(p) · GW(p)

You seem to think that it is generally easy to turn arbitrary ambiguities into numbers in a way amenable to using statistics to resolve them. I find that to be obviously, blatantly false.

Where you see things like this:

1. It's pretty easy to construct estimates. The problem is that without good data these estimates will be too wide to the point of uselessness.
2. But you can think, and find some data, and clean some existing data, and maybe narrow these estimates down a bit.
3. Go back to 1. and repeat until you run out of data or the estimate is narrow enough to fit its purpose.

I see something more like

In order to get an estimate narrow enough to fit the purpose, gather data, make a bad estimate, gather more data, refine the estimate, gather still more data, refine further, repeat until you can't find any more data and then hope you got something useful out of it.

Where the difficult part is gathering data. If you can gather data that is relevant, then statistics are useful. But often you can't, and so they aren't. I outlined the exact same process as you; I'm just significantly more pessimistic about how often and how well it works.

Replies from: Lumifer
comment by Lumifer · 2014-05-09T00:53:42.715Z · LW(p) · GW(p)

You seem to think that it is generally easy to turn arbitrary ambiguities into numbers

Yes, I do.

in a way amenable to using statistics to resolve them.

No, I do not. I said nothing about "resolving" things.

When I say "numbers" in the context of statistics, I really mean probability distributions, often uncertain probability distributions. For example, the probability of anything lies somewhere between zero and one -- see, we don't have any information, but we already have numbers.

You're likely thinking that when I am turning ambiguities into numbers, I turn them into nice hard scalars, like "the probability of X is 0.7". No, I don't. I turn them into wide probability distributions, often without any claims about the shape of these distributions. That is still firmly within the purview of statistics.

Where the difficult part is gather data. If you can gather data that is relevant, then statistics are useful.

If you have no data, nothing is useful. Remember, the original context was how the humanities teach us to deal with ambiguity. But if you have no data, the humanities won't help, and if you do, you can use numbers.

I'm not saying that everything should be converted to numbers. My point is that there are disciplines -- specifically statistics -- that are designed to deal with uncertainty and, arguably, do it better than the handwaving common in the humanities.

Replies from: VAuroch
comment by VAuroch · 2014-05-10T22:50:15.175Z · LW(p) · GW(p)

Your confidence in your ability to do statistics to everything is clearly unassailable, and I have no desire to be strawmanned further.

comment by EHeller · 2014-05-02T21:57:28.268Z · LW(p) · GW(p)

{ the ability to navigate ambiguity }

This is part of critical thinking. Taking a vaguely defined or ambiguous problem, parsing out what it means and figuring out an approach.

Replies from: dthunt
comment by dthunt · 2014-05-05T19:22:45.699Z · LW(p) · GW(p)

I'm rather curious:

If you take people across a big swath of humanities, and ask them about subjects where there is a substantial amount of debate and not a lot of decisive evidence - say, theories of a historical Jesus - how many of those people are going to describe one of those theories as more likely than not?

Like, if you have dozens of theories that you've studied and examined closely, are we going to see people assigning >50% to their favored theory? Or will people be a lot more conservative with their confidence?

Replies from: army1987, Vulture, Eugine_Nier
comment by A1987dM (army1987) · 2014-05-06T15:12:38.025Z · LW(p) · GW(p)

BTW, the probability that the Jesus character in the four Gospels was based on a real person would be a great question to ask in the next LW census/survey.

Replies from: Plasmon, Nornagest
comment by Plasmon · 2014-05-06T17:57:30.432Z · LW(p) · GW(p)

Was Bram Stoker's Dracula "based on" a real person? Possibly, given an extremely weak interpretation of "based on".

What does it take for a fictional character to be based on a real person? Does it suffice to have a similar name, live in a similar place at a similar time? Do they have to perform similar actions as well? This has to be made clear before the question can be meaningfully answered.

Replies from: Nornagest
comment by Nornagest · 2014-05-06T19:07:59.635Z · LW(p) · GW(p)

That's an extraordinarily weak "based on". The Dracula/Tepes connection in Bram Stoker's work doesn't go much beyond Stoker borrowing what he thought was a cool name with exotic, ominous associations (and that "exotic" is important; Eastern Europe in Stoker's time was seen as capital-F Foreign to Brits, which comes through quite clearly in the book). Later authors played on it a bit more.

The equivalent here would be saying that there was probably someone named Yeshua in the Galilee area around 30 AD.

Replies from: Vulture, Plasmon
comment by Vulture · 2014-05-06T19:13:52.976Z · LW(p) · GW(p)

Was Yeshua that uncommon of a name? You're setting the bar pretty low here. (That being said, my understanding is that there's a strong scholarly consensus that there was a Jew named Yeshua who lived in Galilee, founded a cult which later became Christianity, and was crucified by the Romans controlling the area. So these picky ambiguities about "based on" aren't really relevant anyway)

Replies from: Nornagest, army1987
comment by Nornagest · 2014-05-06T19:19:15.607Z · LW(p) · GW(p)

Was Yeshua that uncommon of a name? You're setting the bar pretty low here.

Not that uncommon, no. I'm exaggerating for effect, but the point should still have carried if I'd used "Yeshua ben Yosef" or something even more specific: if you can't predict anything about the character from the name, the character isn't meaningfully based on the name's original bearer.

comment by A1987dM (army1987) · 2014-05-07T07:26:23.924Z · LW(p) · GW(p)

That being said, my understanding is that there's a strong scholarly consensus that there was a Jew named Yeshua who lived in Galilee, founded a cult which later became Christianity, and was crucified by the Romans controlling the area.

There is also a strong scholarly consensus that anthropogenic global warming is occurring, and yet plenty of LW census respondents put in numbers not very close to 100% there.

comment by Plasmon · 2014-05-06T19:37:18.814Z · LW(p) · GW(p)

That's an extraordinarily weak "based on"

That is true, and intentional. It is far from obvious that the connection between the fictional Jesus and the (hypothetical?) historical one is any less tenuous than that (1). The comparison also underscores the pointlessness of the debate: just as evidence for Vlad Dracul's existence is at best extremely weak evidence for the existence of vampires, so too is evidence for a historical Jesus at best extremely weak evidence for the truth of Christianity.

(1) Keep in mind that there are no contemporary sources that refer to him, let alone to anything he did.

comment by Nornagest · 2014-05-06T17:14:39.243Z · LW(p) · GW(p)

I predict you'd get a minority of people using it as a proxy for atheism, another minority favoring it simply because it's an intensely contrarian position, and the majority choosing whatever the closest match to "I don't know" on the survey is.

comment by Vulture · 2014-05-06T16:44:47.856Z · LW(p) · GW(p)

I seem to remember reading that virtually all serious scholars agree that there was a historical Jesus, and that the opposite claim is considered a fringe idea along the lines of homeopathy, so soundly has it been debunked. My memory might be exaggerating, but I think the gist is correct.

comment by Eugine_Nier · 2014-05-06T01:21:23.937Z · LW(p) · GW(p)

If you take people across a big swath of humanities, and ask them about subjects where there is a substantial amount of debate and not a lot of decisive evidence - say, theories of a historical Jesus

Could you have picked an example where one side isn't composed entirely of crackpots?

Replies from: Richard_Kennaway, dthunt
comment by Richard_Kennaway · 2014-05-06T06:07:05.515Z · LW(p) · GW(p)

Which side are you claiming to be crackpots?

Replies from: Eugine_Nier, elharo
comment by Eugine_Nier · 2014-05-07T01:18:00.239Z · LW(p) · GW(p)

Seriously, I can't see how anyone could claim that Jesus was ahistorical who isn't some combination of doing reverse-stupidity on Christianity or taking an absurd contrarian position for the sake of taking an absurd contrarian position.

Edit: fixed typo.

Replies from: Richard_Kennaway, JQuinton
comment by Richard_Kennaway · 2014-05-07T08:03:52.562Z · LW(p) · GW(p)

Am I correct in reading "a historical" as "ahistorical" and not as "a historical figure"?

comment by JQuinton · 2014-05-14T17:03:25.078Z · LW(p) · GW(p)

I would think that believing Jesus didn't exist would be just as absurd as thinking that all or almost all of the events in the Gospels literally happened. Yet the latter group makes up a significant number of practicing Biblical scholars. And the majority of Biblical scholars who don't think the Gospels are almost literally true still have a form of Jesus-worship going on, as they are practicing Christians. It would be hard to think that Jesus both came back from the dead and also didn't exist; meaning that it would be very hard to remain a Christian while also claiming that Jesus didn't exist, and most Biblical scholars were Christians before they were scholars.

The field is biased in a non-academic way against one extreme position while giving cover and legitimacy to the opposite extreme position.

comment by elharo · 2014-05-06T11:11:52.081Z · LW(p) · GW(p)

Modern-day people who believe there was no real historical preacher, probably named Yeshua or something like that, wandering around Palestine in the first century, and on whom the Gospels are based, are crackpots. Their position is strongly refuted by the available evidence. You don't have to be a theist or a Christian to accept this. See, for example, pretty much any of the works of Bart Ehrman, particularly "Did Jesus Exist?"

There are legitimate disputes about this historical figure. How educated was he? Was he more Jewish or Greek in terms of philosophy and theology? (That he was racially Jewish is undenied.) Was he a Zealot? Etc. However, that he existed has been very well established.

comment by dthunt · 2014-05-06T02:37:08.844Z · LW(p) · GW(p)

Depends on your definition of crackpots. I don't think most Jesus scholars are crackpots, just most likely overly credulous of their favored theories.

What I'm curious about is whether people in these fields that are starved for really decisive evidence still feel compelled to name a >50%-confidence theory, or whether they are comfortable with the notion that their most-favored hypothesis indicated by the evidence is still probably wrong, and just comparatively much better than the other hypotheses that they have considered.

Replies from: Fronken
comment by Fronken · 2014-05-07T08:55:29.301Z · LW(p) · GW(p)

I think he meant "Jesus myth" proponents, who IIRC are ... dubious.

Replies from: dthunt
comment by dthunt · 2014-05-07T16:40:44.493Z · LW(p) · GW(p)

Well, hence "historical Jesus". If I were talking about Jesus mythicists, I would have said that. I ignorantly assume there aren't that many Jesus mythicist camps fighting each other out over specific theories of mythicism...

I'm actually looking forward to Richard Carrier's book on that, but I do not expect it to decide mythicism.

comment by dspeyer · 2014-05-08T06:37:06.996Z · LW(p) · GW(p)

Then you've got this one by itself because what the heck does it even mean:

{ the ability to navigate ambiguity }

Perhaps the ability to work with poorly-defined objectives? Including how to get some idea of what someone wants and use that to ask useful questions to refine it?

comment by Eugine_Nier · 2014-05-02T01:35:50.677Z · LW(p) · GW(p)

critical thinking skills; knowledge of the past and other cultures; an ability to work with and interpret numbers and statistics; access to the insights of great writers and artists; a willingness to experiment, to open up to change; and the ability to navigate ambiguity.

Now if only the humanities departments of most universities taught any of those things, rather than the latest PC/SJ fashionable nonsense.

Replies from: EHeller
comment by EHeller · 2014-05-02T02:30:07.487Z · LW(p) · GW(p)

Now if only the humanities departments of most universities taught any of those things, rather than the latest PC/SJ fashionable nonsense.

According to the "Academically Adrift" study, humanities and social science majors show the second highest gains in critical thinking skills, behind only science/math, above engineering and computer science.

To the extent that students are showing limited and declining learning, it largely reflects a switch to business and education majors (business shows the least learning, with education right behind), not a weakening of humanities majors.

Replies from: wedrifid, johnlawrenceaspden, ChristianKl
comment by wedrifid · 2014-05-02T06:25:59.462Z · LW(p) · GW(p)

According to the "Academically Adrift" study, humanities and social science majors show the second highest gains in critical thinking skills, behind only science/math, above engineering and computer science.

Is this a reflection of the influence of course participation or of reasoning capability prior to entry?

comment by johnlawrenceaspden · 2014-05-02T21:22:02.340Z · LW(p) · GW(p)

For some reason, I very much want this to be true. And I take that as a warning sign. Does anyone know if it is true? And what sort of test could possibly measure 'maths creativity' and 'English creativity' on the same scale anyway?

Replies from: EHeller
comment by EHeller · 2014-05-02T21:54:41.545Z · LW(p) · GW(p)

They aren't measuring field-specific skills, which is the whole point. They are measuring gains in critical thinking using the CLA test (i.e., how much better you get at general critical thinking as a result of studying your major). The study itself was quite famous and made the blog rounds a few years ago; I'm sure some light googling will answer any other questions.

comment by ChristianKl · 2014-05-02T12:15:05.325Z · LW(p) · GW(p)

To the extent that students are showing limited and declining learning, it largely reflects a switch to business and education majors (business shows the least learning, with education right behind),

There's a joke somewhere in education majors being near the bottom when it comes to learning, but at the moment I don't know how best to make it.

Replies from: johnlawrenceaspden
comment by johnlawrenceaspden · 2014-05-02T21:20:03.860Z · LW(p) · GW(p)

.... Those that can't teach, teach teaching.

comment by NancyLebovitz · 2014-05-27T13:51:54.759Z · LW(p) · GW(p)

I don't know what I mean. I remain convinced that whatever I meant is 100% right, but what I meant is subject to change with passing whimsy.

comment by NancyLebovitz · 2014-05-06T13:04:36.172Z · LW(p) · GW(p)

"[I'm] Still thinking, remember? Means I look at things one by one."

--- The Black Opera by Mary Gentle

comment by AshwinV · 2014-06-01T14:25:17.766Z · LW(p) · GW(p)

"The best way to sort out confusion is to expose it" - Richard Dawkins. (In the greatest show on earth, p.157. )

comment by NancyLebovitz · 2014-05-27T13:59:43.735Z · LW(p) · GW(p)

There is no such thing as absolute truth. ... People are less deceived by failing to see the truth than by failing to see its limits.

  • Sénac de Meilhan

comment by Eugine_Nier · 2014-05-03T04:58:12.976Z · LW(p) · GW(p)

a way to quickly evaluate any proposed new form of government or legal system: ask the proposer how arrest is distinguished from kidnapping, and search and seizure from trespassing and theft -- if they can't give a good answer, the proposal is based on ignorance and you need not waste any more of your time on it

Nick Szabo

Replies from: AndHisHorse, timujin
comment by AndHisHorse · 2014-05-03T07:39:36.220Z · LW(p) · GW(p)

The very narrow choice of values and their seemingly libertarian phrasing imply some hidden criteria for what constitutes "a good answer" - which enables whoever follows this advice to immediately dismiss a proposal based on some unspecified "good"-ness of the answer, without further thought or discussion, and to dramatically downgrade their opinion of the proposer in the bargain. This seems detrimental to the rational acquisition of ideas and options.

EDIT: Criticism has since been withdrawn in response to context provided below.

Replies from: David_Gerard
comment by David_Gerard · 2014-05-04T10:15:21.525Z · LW(p) · GW(p)

The quote doesn't give that impression in context, including the comments - it's actually a statement about the importance of the rule of law. From the comments, Nick notes:

Indeed, the moral principle of non-initiation of force, far from being a possible basis of society as Murray Rothbard and David Friedman would have it, is a sophisticated outcome of long legal evolution and a highly involved legal procedure that itself cannot stick to that principle: it coerces people to a certain extent so that they will not coerce each other to a much greater extent.

Replies from: AndHisHorse
comment by AndHisHorse · 2014-05-05T20:17:31.512Z · LW(p) · GW(p)

Acknowledged, and criticism withdrawn.

comment by timujin · 2014-05-06T08:28:28.647Z · LW(p) · GW(p)

Trivially true, since anyone who cannot point out the difference is ignorant of the field of legal systems. I guess that is not what is meant?

comment by [deleted] · 2014-05-08T18:53:32.593Z · LW(p) · GW(p)

I'm currently reading Rapture of the Nerds by Cory Doctorow and Charles Stross, and came across this passage:

"... I'm the me who spent the two years subjective actually trying transcendence, rather than denying it. You're the superstition-based Huw who foreordained the outcome of the experiment on the way in. I'm the evidence-based Huw who actually ran the experiment and had the intellectual honesty to face the outcome."

It's funny in context, as both of them originate from a forced upload and the one speaking here just gave legal testimony in favor of forcibly uploading the entire Earth and converting the planet to computronium.

Also, the Committee (who were receiving the testimony) had to run a very large number of parallel Huw-instances and boil them down to representatively divergent samples, because they have no concept of CEV. Oh, the irony.

comment by CronoDAS · 2014-05-20T22:30:39.052Z · LW(p) · GW(p)

The truth of art keeps science from becoming inhuman, and the truth of science keeps art from becoming ridiculous.

-- Raymond Chandler

Replies from: Jiro, None, MarkusRamikin, DanArmak
comment by Jiro · 2014-05-21T15:01:17.784Z · LW(p) · GW(p)

I've always been skeptical of anything that uses "truth" to mean something other than "is factually correct". It is almost invariably used as an excuse to say "we can't show this is factually correct, but we want you to treat it as such anyway".

comment by [deleted] · 2014-05-21T00:34:00.526Z · LW(p) · GW(p)

truth of art keeps science from becoming inhuman

Examples?

Replies from: zaogao
comment by zaogao · 2014-05-29T21:14:58.190Z · LW(p) · GW(p)

The first part could be read as: art (morality, aesthetics, appreciation of humanity) can keep us from pursuing certain scientific methods (http://en.wikipedia.org/wiki/Nazi_human_experimentation#Freezing_experiments) or accepting certain conclusions (human biodiversity). Regarding the freezing experiments, I wouldn't be surprised if that knowledge has saved more people than were killed in the experiments. While "shut up and calculate" is popular around here, I think a lot of people would have a problem with such experiments, no matter what the net positive is.

The second part could be read as being against post-modernism/relativism/new-age b.s. Sadly, the pointed, acknowledged absurdity of dada and surrealism has gone mainstream, and "What I say is art is art" is interpreted non-ironically.

comment by MarkusRamikin · 2014-05-21T09:02:28.862Z · LW(p) · GW(p)

the truth of science keeps art from becoming ridiculous

Looking at modern art, I'd say it's not doing a good job...

comment by DanArmak · 2014-05-21T19:25:30.512Z · LW(p) · GW(p)

My science, unrestrained by mere art, will reveal inhuman laws of physics! I will prove inhuman mathematical theorems and research an inhuman cure for cancer!

...Seriously, is that saying anything beyond "both artists and scientists should have high status"?

comment by [deleted] · 2014-05-01T13:50:52.046Z · LW(p) · GW(p)

Context: The quotes here are taken from the C.S. Lewis sci-fi novel Perelandra, in which the protagonist, Ransom, goes to an extremely ideal Venus to make philosophical discoveries and box with a man possessed by a demon.

These quotes come from the beginning of the novel when Ransom is attempting to describe the experience of having been transported through space by extraterrestrial means which had augmented his body to protect it from cold and hunger and atrophy for the duration of the journey.

This discussion (taking place in a debate over the Christian afterlife) touches upon certain sentiments about how the augmentation (or, for Lewis, glorification) of modern human bodies does not lessen us as humans but instead only improves that which is there.

'Oh, don't you see, you ass, that there's a difference between a trans-sensuous life and a non-sensuous life?'

What emerged was that in Ransom's opinion the present functions and appetites of the body would disappear, not because they were atrophied but because they were, as he said, 'engulfed.' He used the word 'trans-sexual' I remember and began to hunt about for some similar words to apply to eating (after rejecting 'trans-gastronomic'), and since he was not the only philologist present, that diverted the conversation into different channels.

I was questioning him on the subject and had incautiously said, 'Of course I realise it's all rather too vague for you to put into words,' when he took me up rather sharply, for such a patient man, by saying, 'On the contrary, it is words that are vague. The reason why the thing can't be expressed is that it's too definite for language.'

C.S. Lewis, Perelandra, p. 29.

Replies from: Viliam_Bur, Aleksander, NancyLebovitz
comment by Viliam_Bur · 2014-05-02T19:14:01.076Z · LW(p) · GW(p)

The reason why the thing can't be expressed is that it's too definite for language.

This feels like a combination of words that are supposed to sound Wise, but don't actually make sense. (I guess Lewis uses this technique frequently.)

How specifically could being "definite" be a problem for language? Take any specific thing, apply an arbitrary label, and you are done.

There could be a problem when a person X has experienced some "qualia" that other people have never experienced, so they can't match the verbal description with anything in their experience. Or worse, they have something similar, which they match instead, even when told not to. And this seems like the situation described in the text. -- But then the problem is the lack of shared experience, not the language. If they did share the experience, they would just need to apply an arbitrary label, and somehow make sure they refer to the same thing when using the label. The language would have absolutely no problem with that.

Replies from: None, tristanhaze, anandjeyahar, anandjeyahar
comment by [deleted] · 2014-05-02T21:10:26.111Z · LW(p) · GW(p)

Since any attempt to defend the quote itself will only come off as a desire to shoehorn my chosen author into the rationality camp, I'll just give the simple reason why I chose to include that quote instead of stopping with the two previous:

I felt it touched on the subject of inferential distance and discussing reality using labels in a manner that was worthy of attention.

comment by tristanhaze · 2014-05-04T01:53:42.856Z · LW(p) · GW(p)

How specifically could being "definite" be a problem for language? Take any specific thing, apply an arbitrary label, and you are done.

This remark seems to flow from an oversimplified view of how language works. In the context of, for example, a person or a chair, this paradigm seems pretty solid... at least, it gets you a lot. You can ostend the thing ('take' it, as it were) and then apply the label. But in the case of lots of "objects" there is nothing analogous to such 'taking' as a prior step, discrete from talking. For example, "objects" like happiness, or vagueness or definiteness themselves.

I think you may benefit from reading Wittgenstein, but maybe you'd just hate it. I think you need it though!

Replies from: anandjeyahar
comment by anandjeyahar · 2014-05-28T15:59:16.263Z · LW(p) · GW(p)

Am not sure I follow your comment. I think I get the basic gist of it and I agree with it, but I gotta ask: did you really mean ostend (or was it a typo)? I can't really find it as a word on m-w.com or on Google.

Replies from: tristanhaze, TheAncientGeek
comment by tristanhaze · 2014-07-08T03:28:12.328Z · LW(p) · GW(p)

Yep, what The Ancient Geek said. Sorry I didn't reply in a timely way - I'm not a regular user. I'm glad you basically agree, and pardon me for using such a recherché word (did I just do it again?) needlessly. Philosophical training can do that to you; you get a bit blind to how certain words, while they could be part of the general intellectual culture, are actually only used in very specific circles. (I think 'precisification' is another example of this. I used it with an intelligent nerd friend recently and, while of course he understood it - it's self-explanatory - he thought it was terrible, and probably thought I just made it up.)

Hope you look at Wittgenstein!

comment by TheAncientGeek · 2014-05-28T17:18:53.455Z · LW(p) · GW(p)

As in ostension: basically pointing, or a verbal substitute for it.

comment by anandjeyahar · 2014-05-28T15:56:16.276Z · LW(p) · GW(p)

But then the problem is not having the shared experience. If they did, they would just need to apply an arbitrary label,

Yes. If they had the shared experience, they would just need to apply an arbitrary label. However, given how we learn language (by association, based on how words are used by people around us for what we see as objective events/experiences), I am not too confident the labels would match even after having the shared experience. My previous comment assumes this, but did not make it explicit. And I derive the

The reason why the thing can't be expressed is that it's too definite for language

quote from that assumption. I may be wrong about the assumption (since it seems to be more of a thought experiment than a practical experiment at the moment), but I nevertheless assign it fairly high probability/confidence.

comment by anandjeyahar · 2014-05-03T05:52:48.588Z · LW(p) · GW(p)

And this seems like a situation described in the text. -- But then the problem is not having the shared experience.

I tend to think of language as a symbolic system to denote/share/communicate these experiences with other brains. Of course, there's the inherent challenge that seldom are two experiences the same (even if it is an experiment on electrons). It's one of the reasons one of my favourite sci-fi scenarios is brain-brain interfaces that figure out some way to interpret and transfer the empirical heuristic rules about a probability distribution (of any given event) from one person to another. Or maybe I am just being too idealistic about people always having such heuristics in their heads (even if they are not aware of them). :-)

comment by Aleksander · 2014-05-07T20:30:09.602Z · LW(p) · GW(p)

While we are quoting Perelandra:

"How far does it go? Would you still obey the Life-Force if you found it prompting you to murder me?"
"Yes."
"Or to sell England to the Germans?"
"Yes."
"Or to print lies as serious research in a scientific periodical?"
"Yes."
"God help you!" said Ransom.

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2014-05-07T21:24:46.816Z · LW(p) · GW(p)

A parallel passage from 1984:

"You will understand that I must start by asking you certain questions. In general terms, what are you prepared to do?'

'Anything that we are capable of,' said Winston.

O'Brien had turned himself a little in his chair so that he was facing Winston. He almost ignored Julia, seeming to take it for granted that Winston could speak for her. For a moment the lids flitted down over his eyes. He began asking his questions in a low, expressionless voice, as though this were a routine, a sort of catechism, most of whose answers were known to him already.

'You are prepared to give your lives?'

'Yes.'

'You are prepared to commit murder?'

'Yes.'

'To commit acts of sabotage which may cause the death of hundreds of innocent people?'

'Yes.'

'To betray your country to foreign powers?'

'Yes.'

'You are prepared to cheat, to forge, to blackmail, to corrupt the minds of children, to distribute habit-forming drugs, to encourage prostitution, to disseminate venereal diseases--to do anything which is likely to cause demoralization and weaken the power of the Party?'

'Yes.'

'If, for example, it would somehow serve our interests to throw sulphuric acid in a child's face--are you prepared to do that?'

'Yes.'

'You are prepared to lose your identity and live out the rest of your life as a waiter or a dock-worker?'

'Yes.'

'You are prepared to commit suicide, if and when we order you to do so?'

'Yes.'

'You are prepared, the two of you, to separate and never see one another again?'

'No!' broke in Julia.

It appeared to Winston that a long time passed before he answered. For a moment he seemed even to have been deprived of the power of speech. His tongue worked soundlessly, forming the opening syllables first of one word, then of the other, over and over again. Until he had said it, he did not know which word he was going to say. 'No,' he said finally.

comment by NancyLebovitz · 2014-05-27T14:51:27.183Z · LW(p) · GW(p)

There's a passage by Lewis, and probably from Perelandra, which is to the effect that people's actual choices are from a deeper part of themselves than the conscious mind. Might you happen to know it?

Replies from: None
comment by [deleted] · 2014-05-27T20:25:53.583Z · LW(p) · GW(p)

Off hand, I don't recall. There is a moment at the end of the book where Ransom has a revelatory experience of all life in existence and understands it as an interlocking dance, something that fits neither his theory of predestination nor free will.

Actually, looked up some quotes and found this:

The whole struggle was over, and yet there seemed to have been no moment of victory. You might say, if you liked, that the power of choice had been simply set aside and an inflexible destiny substituted for it. On the other hand, you might say he had been delivered from the rhetoric of his passions and had emerged in unassailable freedom. Ransom could not, for the life of him, see any difference between these two statements. Predestination and freedom were apparently identical. He could no longer see any meaning in the many arguments he had heard on the subject.

comment by johnlawrenceaspden · 2014-05-05T17:47:35.073Z · LW(p) · GW(p)

Show me a good loser, and I'll show you a loser.

-- Vince Lombardi

Replies from: VAuroch, AndHisHorse
comment by VAuroch · 2014-05-05T20:35:40.794Z · LW(p) · GW(p)

Not true; one of the key skills needed to improve at most games with chance factors is the ability to distinguish cases where you did the right thing and lost anyway from those where you made mistakes and lost because of them. You have to take loss gracefully and focus on mistakes and expected outcomes.

comment by AndHisHorse · 2014-05-05T20:13:16.585Z · LW(p) · GW(p)

Trivially true, and the meaning is ambiguous depending on the meaning of "good" (skilled/frequent at losing? able to handle losing without psychological distress? capable of pulling some benefit out of a "loss"?). Is there context which might illuminate the connotation?

Replies from: VAuroch, TheAncientGeek
comment by VAuroch · 2014-05-05T20:33:07.126Z · LW(p) · GW(p)

There's probably cultural context you're missing (I'm guessing you're not a native English speaker, or at least not American), because it's pretty straightforward from here without any textual context.

A "good loser" is idiomatically someone who can accept defeat graciously (i.e. not get bitter or angry at the opponent). The quote says that anyone who doesn't get offended by their own losses won't improve and will remain a loser.

Replies from: kalium, AndHisHorse, ChristianKl
comment by kalium · 2014-05-05T22:27:07.109Z · LW(p) · GW(p)

If you get offended by losing, that's not an incentive to improve beyond a pretty low threshold. It's an incentive to avoid tough competition and remain a medium-sized fish in a tiny pond.

comment by AndHisHorse · 2014-05-05T20:42:28.628Z · LW(p) · GW(p)

I actually am a native speaker of American English, and while I am aware that the common usage refers to somebody who is able to handle loss without taking offense, I did not rest on the assumption that the common usage was the relevant usage here. I would consider the quote inaccurate under the common usage, as I find the implication that a gracious loser is necessarily an unmotivated loser incorrect. Therefore, I left open the possibility that the quote might use a less common meaning of the term "good loser".

Replies from: dthunt
comment by dthunt · 2014-05-05T20:45:43.349Z · LW(p) · GW(p)

The speaker is a football guy, if that helps. But yes, I also find it a distasteful remark. You can improve without being in poor form in front of others (or even in private, really). And it's pretty rare to literally NEVER lose.

comment by ChristianKl · 2014-05-06T14:27:42.523Z · LW(p) · GW(p)

I think being a good loser is more than that. Not investing more resources into a losing project because of the sunk cost bias is one of the skills that makes someone a good loser.

comment by TheAncientGeek · 2014-05-05T21:11:38.513Z · LW(p) · GW(p)

It's true if you think that winning arbitrary competitions is important, and false if you can place things in a wider context. Consider losing to your boss.