Honesty: Beyond Internal Truth

post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-06-06T02:59:00.296Z · LW · GW · Legacy · 87 comments

When I expect to meet new people who have no idea who I am, I often wear a button on my shirt that says:

SPEAK THE TRUTH,
EVEN IF YOUR VOICE TREMBLES

Honesty toward others, it seems to me, obviously bears some relation to rationality.  In practice, the people I know who seem to make unusual efforts at rationality, are unusually honest, or, failing that, at least have unusually bad social skills.

And yet it must be admitted and fully acknowledged, that such morals are encoded nowhere in probability theory.  There is no theorem which proves a rationalist must be honest - must speak aloud their probability estimates.  I have said little of honesty myself, these past two years; the art which I've presented has been more along the lines of:

SPEAK THE TRUTH INTERNALLY,
EVEN IF YOUR BRAIN TREMBLES

I do think I've conducted my life in such fashion, that I can wear the original button without shame.  But I do not always say aloud all my thoughts.  And in fact there are times when my tongue emits a lie.  What I write is true to the best of my knowledge, because I can look it over and check before publishing.  What I say aloud sometimes comes out false because my tongue moves faster than my deliberative intelligence can look it over and spot the distortion.  Oh, we're not talking about grotesque major falsehoods - but the first words off my tongue sometimes shade reality, twist events just a little toward the way they should have happened...

From the inside, it feels a lot like the experience of un-consciously-chosen, perceptual-speed, internal rationalization.  I would even say that so far as I can tell, it's the same brain hardware running in both cases - that it's just a circuit for lying in general, both for lying to others and lying to ourselves, activated whenever reality begins to feel inconvenient.

There was a time - if I recall correctly - when I didn't notice these little twists.  And in fact it still feels embarrassing to confess them, because I worry that people will think:  "Oh, no!  Eliezer lies without even thinking!  He's a pathological liar!"  For they have not yet noticed the phenomenon, and actually believe their own little improvements on reality - their own brain being twisted around the same way, remembering reality the way it should be (for the sake of the conversational convenience at hand).  I am pretty damned sure that I lie no more pathologically than average; my pathology - my departure from evolutionarily adapted brain functioning - is that I've noticed the lies.

The fact that I'm going ahead and telling you about this mortifying realization - that despite my own values, I literally cannot make my tongue speak only truth - is one reason why I am not embarrassed to wear yon button.  I do think I meet the spirit well enough.

It's the same "liar circuitry" that you're fighting, or indulging, in the internal or external case - that would be my second guess for why rational people tend to be honest people.  (My first guess would be the obvious: respect for the truth.)  Sometimes the Eli who speaks aloud in real-time conversation, strikes me as almost a different person than the Eliezer Yudkowsky who types and edits.  The latter, I think, is the better rationalist, just as he is more honest.  (And if you asked me out loud, my tongue would say the same thing.  I'm not that internally divided.  I think.)

But this notion - that external lies and internal lies are correlated by their underlying brainware - is not the only view that could be put forth, of the interaction between rationality and honesty.

An alternative view - which I do not myself endorse, but which has been put forth forcefully to me - is that the nerd way is not the true way; and that a born nerd, who seeks to become even more rational, should allow themselves to lie, and give themselves safe occasions to practice lying, so that they are not tempted to twist around the truth internally - the theory being that if you give yourself permission to lie outright, you will no longer feel the need to distort internal belief.  In this view the choice is between lying consciously and lying unconsciously, and a rationalist should choose the former.

I wondered at this suggestion, and then I suddenly had a strange idea.  And I asked the one, "Have you been hurt in the past by telling the truth?"  "Yes", he said, or "Of course", or something like that -

(- and my brain just flashed up a small sign noting how convenient it would be if he'd said "Of course" - how much more smoothly that sentence would flow - but in fact I don't remember exactly what he said; and if I'd been speaking out loud, I might have just said, "'Of course', he said" which flows well.  This is the sort of thing I'm talking about, and if you don't think it's dangerous, you don't understand at all how hard it is to find truth on real problems, where a single tiny shading can derail a human train of thought entirely -)

- and at this I suddenly realized, that what worked for me, might not work for everyone.  I haven't suffered all that much from my project of speaking truth - though of course I don't know exactly how my life would have been otherwise, except that it would be utterly different.  But I'm good with words.  I'm a frickin' writer.  If I need to soften a blow, I can do with careful phrasing what would otherwise take a lie.  Not everyone scores an 800 on their verbal SAT, and I can see how that would make it a lot harder to speak truth.  So when it comes to white lies, in particular, I claim no right to judge - and also it is not my primary goal to make the people around me happier.

Another counterargument that I can see to the path I've chosen - let me quote Roger Zelazny:

"If you had a choice between the ability to detect falsehood and the ability to discover truth, which one would you take? There was a time when I thought they were different ways of saying the same thing, but I no longer believe that. Most of my relatives, for example, are almost as good at seeing through subterfuge as they are at perpetrating it. I’m not at all sure, though, that they care much about truth. On the other hand, I’d always felt there was something noble, special, and honorable about seeking truth... Had this made me a sucker for truth's opposite?"

If detecting falsehood and discovering truth are not the same skill in practice, then practicing honesty probably makes you better at discovering truth and worse at detecting falsehood.  If I thought I was going to have to detect falsehoods - if that, not discovering a certain truth, were my one purpose in life - then I'd probably apprentice myself out to a con man.

What, in your view, and in your experience, is the nature of the interaction between honesty and rationality?  Between external truthtelling and internal truthseeking?

87 comments

Comments sorted by top scores.

comment by wuwei · 2009-06-06T06:10:18.334Z · LW(p) · GW(p)

I find that there is often a conflict between a motivation to speak only the truth and a motivation to successfully communicate as close approximations to the most relevant truths as constraints of time, intelligence and cultural conversational conventions allow.

Replies from: slr, MendelSchmiedekamp, AndrewH
comment by slr · 2009-06-07T01:05:05.430Z · LW(p) · GW(p)

Eli, you can get away with wearing whatever buttons you want, because you can back up all your claims by pointing skeptical people to your writings, works, etc.

But say I am a rationalist and I prefer honesty over dishonesty. And say I am trying to win someone's confidence/trust/love/respect, and I believe in the 'rationalists should win' principle. And this person doesn't necessarily care about rationality or honesty other than to the extent that was passed on to her by the prevailing social norms in her community/church etc. Moreover, she doesn't see me as some ultra-rationalist guy, and there is nothing I can do to prove otherwise, short of saying, hey, check out all the websites I browse every day, or hey, see all the books I have.

Now, when I talk to her, I twist the truth (or lie outright) to make sure I send a friendly signal to get what I want.

If I am talking to some person I've known for years, still, I'll probably calibrate my words to send a message that I know would be received in a way I want it to be received to eventually get what I want.

My gut feeling is that this way of thinking is surely not right, but why? It surely is the 'less wrong' way in some situations. So does it just boil down to personal preference about where the line should be drawn? I think so.

Replies from: patrissimo, Eliyoole
comment by patrissimo · 2009-06-10T06:09:43.898Z · LW(p) · GW(p)

I think the problem is that lying to other people will tend to reinforce lying to yourself and to others. Your brain can't just separate the circumstances like that. Rationalists win when their hardware lets them - and our hardware is very limited.

comment by Elias (Eliyoole) · 2024-08-08T10:06:46.433Z · LW(p) · GW(p)

I think the issue is with the "get what I want" part. Isn't this treating people as means to an end, instead of treating them as ends in and of themselves?* (I think that Kant would not be happy - though I don't know of anything that has been written on LessWrong about this.)

If you are talking to another person and you are trying to convince them to adopt a certain view of you, that is not what I would call truth-oriented. So, whether you specifically lie, omit, or whatever; it's already secondary. If your goal is to have an honest interaction with another being, I don't think you can in that interaction want to edit their perception of you (apart from misunderstandings etc).

I'd say that the way you achieve your goal is to become what you want to be seen as. This is, of course, harder than just lying, but in a way it takes less effort, too. 
Plus, you avoid another important pitfall I could see here: lying to yourself about wanting a connection with a person who doesn't share your values. If you have to lie to fit in with them, maybe not fitting in with them is a good thing, and you should pay attention to that. In this way, the impulse to lie may be useful in the same way as the tiny voice telling you that you are confused.


(The following is just about the effort it takes to lie vs. telling the truth. It's not really required for the core idea; read it if you wish^^)
Imagine what insane effort it would take to lie all the time but try to be perceived as being honest! While "just" being honest is hard in a different way, on subtler and subtler levels, I at least was freed of a lot of the mental overhead that lying brings with it. (Sure, part of that was replaced by the mental habits of self-checking, but still, way less. I don't have to worry about what I may have said at some point if I don't remember it: I will see what I would say now, and unless I have acquired new information or insight, this will probably approximate what I said then. If I am also honest about this process, my self-perceived fault of imperfect memory isn't too bad anymore. This can never work with lying, because you need to keep tabs on what you told whom, how they may have gained additional information, etc.)

 

*(The fact that you specified the gender of the other person also implies a certain degree of "means to an end" to me (yes, even without knowing your gender) unless you are talking about one specific situation and nothing else. But that may just as well be wrong.)

comment by MendelSchmiedekamp · 2009-06-06T17:06:34.413Z · LW(p) · GW(p)

Definitely. There is a significant risk in failing to communicate accurately by deciding that honesty is all we are obligated to do. This seems inconsistent with the ideal that rationalists should win, in this case winning over the difficulties of accurate communication, rather than simply trying virtuously.

More broadly, though, there is an ambiguity about what exactly honesty really means. After all, as Douglas Adams points out, speaking the truth is not literally what people mean when they tell each other to be honest - for one thing, this is neither a sane nor a terminating request. I suspect this is one of those cases where the graceful failure of the concept is socially very useful, and so the ideal is useful, but overachievement is not necessarily any better than underachievement (at least not in societal terms).

Replies from: Douglas_Knight, AspiringRationalist
comment by Douglas_Knight · 2009-06-06T17:56:14.902Z · LW(p) · GW(p)

I wouldn't say "trying virtuously," though maybe that's right. I definitely wouldn't talk about a "motivation to speak only the truth." It seems so rigid that I would call it a ritual or a superstition, a sense that there is only one correct thing that can be said.

Perhaps the problem is that the (unconscious) goal is not to communicate, but to show off to third parties, or even to make the listener feel stupid?

comment by NoSignalNoNoise (AspiringRationalist) · 2012-09-24T04:40:47.835Z · LW(p) · GW(p)

A good working definition might be "attempting to communicate in a way that makes the recipient's map match the territory as closely as is reasonable."

comment by AndrewH · 2009-06-07T21:35:01.357Z · LW(p) · GW(p)

That's teaching for you: the raw truth of the world can be either difficult to understand in the context of what you already 'know' (Religion -> Evolution) or difficult to understand in its own right (quantum physics).

This reminds me of the "Lies to Humans" of Hex, the thinking machine of Discworld, where Hex tells the Wizards the 'truth' of something, couched in things they understand, basically to shut them up rather than to actually tell them what is really happening.

In general, a person cannot jump from any preconceived notion of how something is to the (possibly subjective!) truth. Instead, to teach, you tell lesser and lesser lies, which in the best case may simply be more and more accurate approximations of the truth. Throughout, you, the teacher, have been as honest to the learner as you can be.

But when someone has a notion of something that is wrong enough, I can see how these steps could, in themselves, contain falsehoods which are not approximations of the truth itself. Is this honest? To teach a flat-Earther the world is round, perhaps a step is to consider the world as being convex, so as to explain why 'ships over the horizon disappear'.

If your goal is to get someone's understanding closer to the truth, it may be rational, but the steps you take, the things you teach, might not be honest.

Replies from: johnlawrenceaspden
comment by johnlawrenceaspden · 2012-09-29T13:48:49.308Z · LW(p) · GW(p)

To teach a flat-Earther the world is round, perhaps a step is to consider the world as being convex, so as to explain why 'ships over the horizon disappear'.

Only nitpicking, and I do like the example, but 'the world is convex' is actually less false than 'the world is round'.

comment by Zack_M_Davis · 2013-01-06T21:38:44.273Z · LW(p) · GW(p)

a born nerd, who seeks to become even more rational, should allow themselves to lie, and give themselves safe occasions to practice lying, so that they are not tempted to twist around the truth internally

I'm starting to think that this is exactly correct.

As we all know, natural language sentences (encoded as pressure waves in the air, or light emitted from a monitor) aren't imbued with an inherent essence of trueness or falseness. Rather, we say a sentence is true when reporting it to a credulous human listener would improve the accuracy of that human's model of reality. For many sentences, this is pretty straightforward ("The sky is blue" is true if and only if the sky is blue, &c.), but in other cases it's more ambiguous, not because the sentence has an inherently fuzzy truth value, but because upon interpreting the sentence, the correspondence between the human's beliefs and reality could improve in some aspects but not others; e.g., we don't want to say "The Earth is a sphere" is false, even though it's really more like an oblate spheroid and has mountains and valleys. This insight is embedded in the name of the site itself: "Less Wrong," suggesting that wrongness is a quantitative rather than binary property.

But if sentences don't have little XML tags attached to them, then why bother drawing a bright-line boundary around "lying", making a deontological distinction where lying is prohibited but it's okay to achieve similar effects on the world without technically uttering a sentence that a human observer would dub "false"? It seems like a form of running away from the actual decision problem of figuring out what to say. When I'm with close friends from my native subculture, I can say what I'm actually thinking using the words that come naturally to me, but when I'm interacting with arbitrary people in society, that doesn't work as a matter of cause and effect, because I'm often relying on a lot of concepts and vocabulary that my interlocutor hasn't learned (with high probability). If I actually want to communicate, I'm going to need a better decision criterion than my brain's horrifyingly naive conception of honesty, and that's going to take consequentialist thinking (guessing what words will produce what effect in the listener's mind) rather than moralistic thinking (Honesty is Good, but Lying is Bad, so I'm not Allowed to say anything that could be construed as a Lie, because then I would be a Bad Person). The problem of "what speech acts I should perform in this situation" and the problem of having beliefs that correspond to reality are separate problems with different success criteria; it really shouldn't be surprising that one can do better on both of them by optimizing them separately.

Looking back on my life, moralistic reasoning---thinking in terms of what I or others "should" do, without having a consequentialist reduction of "should"---has caused me a lot of unnecessary suffering, and it didn't even help anyone. I'm proud that I had an internalized morality and that I cared about doing the Right Thing, but my conception of what the Right Thing was, was really really stupid and crazy, and people tried to explain to me what I was doing wrong, and I still didn't get it. I'm not going to make that (particular) mistake again (in that particular form).

comment by taw · 2009-06-06T03:32:28.285Z · LW(p) · GW(p)

I have a small theory - "enhancing reality" is a normal part of human social interactions; we are designed to lie exactly for this reason, and not doing it properly hurts your signaling skills and lowers your ability to achieve your social goals (including getting mates and so on).

So, based on the premise that rationality is about success, rationalists should have no qualms about lying when the situation is right. I'm quite good at lying, and also I'm pretty sure I'm extremely honest with myself.

Replies from: dclayh
comment by dclayh · 2009-06-06T03:42:28.659Z · LW(p) · GW(p)

In particular, those who can tell an entertaining anecdote are widely praised, and I've been told by such people that my anecdotes suffer because I adhere too closely to the truth.

Replies from: taw, pwno
comment by taw · 2009-06-06T03:54:40.777Z · LW(p) · GW(p)

That's amusing; I always enhance my anecdotes, at least by dropping the irrelevant parts.

Now, this will sound funny, but I think posting on 4chan helped me quite a lot with learning this. In real life (or online, where you have a long-term identity you care about) it's difficult to train enhancing the truth, because failure makes you a known liar and has negative consequences. On 4chan, on the other hand - just go for it - nobody really cares.

comment by pwno · 2009-06-06T22:05:56.302Z · LW(p) · GW(p)

When it comes to good storytelling, using emotive language, suspense, good timing, exaggerated facial expressions, etc., are much more important than embellishing the truth.

comment by RobinHanson · 2009-06-06T08:36:20.300Z · LW(p) · GW(p)

It is a good question, but I fear I'm lacking data to answer. It is much harder to see my own self-deceptions than my lies, making it hard to see the sign of the correlation between them.

comment by knb · 2009-06-06T05:43:21.979Z · LW(p) · GW(p)

I have noticed that, in addition to being honest, rationalists (or those striving toward rationality) also seem to speak very precisely (although not necessarily accurately). This is a trait they seem to share with philosophers, lawyers, programmers, etc.

I suspect this is because these people recognize the confusion caused by the vagueness of language.

Replies from: JGWeissman
comment by JGWeissman · 2009-06-06T07:10:17.183Z · LW(p) · GW(p)

I think the ideal here is to speak with a level of precision that matches your confidence in your accuracy. For example, I would say that an event is probable to indicate I have a vague impression it is more likely to occur than not; if I say it has a probability of .62, that means I have done an explicit mathematical analysis of my uncertainty.

comment by patrissimo · 2009-06-10T06:06:35.327Z · LW(p) · GW(p)

Agreed about the non-universality and the impact of personal history. I got an 800 on the SAT verbal and have been reasonably popular, hanging with a smart and very accepting crowd from college on. I have also never had to financially rely on any entity more conservative than Google. And I live a more unusual and transparent life than most people, being open about things that most people are private about.

It is tempting to think that my open approach is the best way for everyone, but the fact is that I have not suffered for my openness at all, while lots of other people have. All my exploration of non-mainstream sexuality, for example, has happened in either LA or SF, where such things are not only not lynch-worthy, they are cool and hip.

But if I had grown up in different circumstances, I'm not sure I would have been less open and honest. It's a pretty strong drive for me. I think I might just have ended up miserable and bitter about the awfulness and dishonesty of the world.

comment by PhilGoetz · 2009-06-08T19:20:50.700Z · LW(p) · GW(p)

Interesting Esquire article on Radical Honesty.

comment by randallsquared · 2009-06-08T13:23:54.499Z · LW(p) · GW(p)

When I noticed this sort of thing in myself (I don't remember exactly when, but probably in my teens), I started intentionally pausing and rehearsing what I was about to say before saying it, in most situations. This might or might not have made me less of a liar, but it sure made me say less, because after vetting what I'm about to say, it's often the case that I don't feel the need to actually say it. Most things I would say in person don't seem worth saying once I've reviewed them for a bit.

comment by saturn · 2009-06-10T05:41:13.946Z · LW(p) · GW(p)

In general I'm honest with people I think are at least moderately sane and high in agreeableness and openness, in a cooperative context. Otherwise I tend to think of myself as an actor in an improvised play that's "based on a true story." I don't seem to have too much difficulty keeping track of this, but I admit the possibility I'm self-deceived.

comment by PhilGoetz · 2009-06-08T18:44:29.903Z · LW(p) · GW(p)

There are competitions that honest people regularly do poorly at, because liars have an advantage. Mating and attracting venture capital are examples. I knew a kid who claimed to have had sex with 100 women by the time he was 20. "How?" I asked. "I told them all I loved them," he said.

(Though, in my experience, telling women you love them before you've had sex with them is more likely to get you labeled 'creepy' than get you laid. Maybe it's different for teenagers.)

Replies from: Alicorn
comment by Alicorn · 2009-06-08T18:58:46.744Z · LW(p) · GW(p)

Have you considered the possibility that this dishonest person was not honest about how many women he'd had sex with?

Replies from: PhilGoetz
comment by PhilGoetz · 2009-06-08T18:59:37.847Z · LW(p) · GW(p)

Yes. I have no way of knowing.

comment by dclayh · 2009-06-06T03:38:45.355Z · LW(p) · GW(p)

If detecting falsehood and discovering truth are not the same skill in practice, then practicing honesty probably makes you better at discovering truth and worse at detecting falsehood. If I thought I was going to have to detect falsehoods - if that, not discovering a certain truth, were my one purpose in life - then I'd probably apprentice myself out to a con man.

I've heard from many sources that con men are actually among the easiest to deceive. The rationale seems to be along the lines that con men have utter contempt for their marks, and therefore once a con man thinks of you as a mark he'll be oblivious to any signs that you're actually the one playing him.

(I've also heard, separately, a saying to the effect of "It is difficult/impossible to deceive an honest man", but it comes with no justification other than perhaps a religious one.)

Replies from: None, Cyan
comment by [deleted] · 2009-06-06T09:20:21.357Z · LW(p) · GW(p)

del

comment by Cyan · 2009-06-06T03:53:40.410Z · LW(p) · GW(p)

I recall watching a television show just as microexpressions were receiving pop-science attention which claimed that tests against videotaped gold standards showed that the only people who were reliably able to distinguish liars from truth-tellers at a rate above chance were either people explicitly trained to detect microexpressions or professional spies.

comment by nazgulnarsil · 2009-06-08T01:34:52.885Z · LW(p) · GW(p)

I never before thought of poor social skills as simply a disregard for the standard signals. I have a problem with that interpretation though because I often find that groups with poor social skills have their own sets of in-group signals and that they jockey for position just as much as the rest of us.

comment by MendelSchmiedekamp · 2009-06-06T15:59:28.217Z · LW(p) · GW(p)

We're all operating using human brains. It simply isn't safe to assume we don't lie or deceive ourselves constantly. The failure to notice self-deceptions is, I expect, one of the most pervasive forms of akrasia, though obviously not one of the most self-evident ones.

I doubt focused honesty or even radical honesty will have a major effect on the frequency of self-deception. It seems too much like sympathetic magic. So I expect it to work like any other placebo ritual.

I expect combating self-deception will require working against the akrasia directly. But most anti-akrasia approaches assume at least a small window of consciousness of the akrasia, which in this case is often not possible.

comment by AlanCrowe · 2009-06-06T15:10:27.361Z · LW(p) · GW(p)

I had fun writing Lies. I do not know if it is a fun read.

Replies from: cousin_it, orthonormal
comment by cousin_it · 2009-06-07T19:26:54.048Z · LW(p) · GW(p)

Great metaphor at the end. And great piece "Thrice woe". Thanks!

Edit: and also thanks for the Simpson's paradox essay.

comment by orthonormal · 2009-06-06T17:52:55.411Z · LW(p) · GW(p)

I enjoyed it, as well as some of your other Soapbox pieces. Ever thought about adapting some of them for LW?

comment by JamesCole · 2009-06-06T05:03:10.305Z · LW(p) · GW(p)

I don't think Zelazny's statement makes out that "detecting falsehood and discovering truth are not the same skill in practice". He just seems to be saying that you can have good 'detecting falsehood' skills without caring much about the truth ("I’m not at all sure, though, that they care much about truth").

If I thought I was going to have to detect falsehoods - if that, not discovering a certain truth, were my one purpose in life - then I'd probably apprentice myself out to a con man.

I think that's equating 'detecting falsehood' too much with 'detecting tricks of deception'.

If detecting falsehood and discovering truth are not the same skill in practice, then practicing honesty probably makes you better at discovering truth and worse at detecting falsehood.

I'm very doubtful that practising honesty, itself, could make you worse at detecting falsehoods.

Being naive -- for example, by assuming anything that superficially seems to make sense must be true -- can make you worse at detecting falsehoods. We often associate honesty with a kind of naivety, but the problem with being poor at detecting falsehoods is a problem with naivety not with honesty.

A certain kind of naivety is thinking that since you have good intentions about being honest, you therefore are honest. Saying or thinking or feeling that you are honest does not necessarily mean you are actually honest. Yes, having a genuine desire to be honest is going to make you more likely to be honest, and put you on the right track to being honest, but claims of honesty don't necessarily equate with a genuine desire.

To actually make sure you're more honest takes work. It requires you to monitor and reflect on what you say and do. It requires you to monitor and reflect upon the ways that you or others can be dishonest. And I reckon that means that having the ability to be genuinely honest also means you'll have pretty good skills for detecting falsehood.

The following is just sketchy thoughts:

In relation to rationality, I'd say that rationality requires certain types of honesty to yourself, and being rational is likely to make you more honest to yourself as well.

If you can be successfully rational (without any major pockets of irrationality), then you're probably more likely to be considerate of others (because you're better able to appreciate the negative consequences of your actions), and are thus more likely to be honest to people in matters where there could be non-trivial negative consequences.

But I still suspect you can be quite rational without it necessitating that you're particularly honest to others.

comment by JulianMorrison · 2009-06-08T09:15:06.941Z · LW(p) · GW(p)

I know I'm a really crap rationalist. Lots of the ways I fail at reason are in the category of "lying to myself", "making things easier by adopting a role", "maintaining a bubble of cosy normality". When I'm not lying to myself overtly, manipulating others can be a means to untruth-maintenance.

Honestly, even if lying were useful, I don't think I can trust myself with it.

Maybe a graduate Beisutsukai can lie, but a student ought to be scrupulous with truth.

Replies from: Annoyance
comment by Annoyance · 2009-06-08T23:11:38.703Z · LW(p) · GW(p)

I know I'm a really crap rationalist.

You don't have to run faster than the bear, just faster than your slowest friend.

In explicitspeak: You may be right, but your being right on this matter makes you a better rationalist than the vast majority of people who claim that title.

Replies from: JulianMorrison
comment by JulianMorrison · 2009-06-09T08:17:55.460Z · LW(p) · GW(p)

I suggest that competitive bear races, metaphorically speaking, are so rare that even looking for them is positively harmful. In nearly all tests of reason, from frying with hot fat to investing your money to buying cryonics, the only ones in the race are you and the bear, and you really do have to be faster.

comment by Mike Bishop (MichaelBishop) · 2009-06-06T18:59:27.846Z · LW(p) · GW(p)

Aside: I don't think it's really verbal ability, e.g. as measured by the SAT, which truthtelling requires. What is more important is having a good theory of mind and knowing what deserves emphasis. An aspect of emotional intelligence, no?

comment by MichaelGR · 2009-06-06T05:14:45.681Z · LW(p) · GW(p)

What, in your view, and in your experience, is the nature of the interaction between honesty and rationality? Between external truthtelling and internal truthseeking?

Could it be as simple as forming a habit?

Being honest as much as possible 'by default' turns into a habit that can be very helpful when it comes to being rational. I have the feeling that external truthtelling helps form that habit, which leads to fewer internal lies. But I can't prove it (maybe it's just me).

The chain goes something like: habit of external truth -> helps internal truth, even when it's inconvenient -> helps to practice the art of rationality, because it's harder to avoid tough questions and problems by deceiving yourself.

comment by Jakeness · 2012-09-23T17:00:25.911Z · LW(p) · GW(p)

After I first read this article about a year ago, I set out to be more honest in all my conversations. (At this point in time it has become a part of my persona and I no longer do it consciously.) There are a few things I've noticed since I made the switch:

  • It is easier for me to think clearly during social events. I suspect this is because I no longer have to generate lies and keep track of all of them.

  • I have become more outgoing, although undoubtedly more socially awkward. Occasionally, a person will be shocked at how carelessly I reveal something considered to be embarrassing.

  • It is easier for me to detect certain lies. I attribute this to my being able to think more clearly, because as Eliezer points out, detecting falsehood might be negatively correlated with being honest.

Note that there is still much more room for me to improve, and my personal reflections on this matter are likely to be deeply flawed.

Replies from: Swimmer963
comment by Swimmer963 (Miranda Dixon-Luinenburg) (Swimmer963) · 2012-09-28T16:28:44.443Z · LW(p) · GW(p)

I have become more outgoing, although undoubtedly more socially awkward. Occasionally, a person will be shocked at how carelessly I reveal something considered to be embarrassing.

I am like this. It occasionally creates a false note in a conversation, but for the most part it doesn't harm my relations with other people...and it feels good to realize that people don't actually judge me for the things I might be judging myself for.

Replies from: Jakeness
comment by Jakeness · 2012-11-04T05:14:30.780Z · LW(p) · GW(p)

How can you be sure you aren't being judged?

Replies from: Swimmer963
comment by Swimmer963 (Miranda Dixon-Luinenburg) (Swimmer963) · 2012-11-04T18:44:54.930Z · LW(p) · GW(p)

Clarify that to "people don't display all the usual behaviours of judging someone, i.e. sharing looks and smirking with each other, avoiding me afterwards, etc." Maybe they go on to judge me behind my back, but I've seen no reflection on my overall social standing...except that I've possibly developed more of a reputation since then, in the sense that I went from being semi-invisible to fairly interesting.

comment by scav · 2009-06-08T13:28:36.395Z · LW(p) · GW(p)

I think there is a link between external truth-telling and external truth-seeking.

Say I make a decision for the reason that it is my best guess at the right thing to do in a given situation, based on the facts I have. Suppose further it is not viewed as the right thing to do for social or political reasons among my peers. (This has happened to me now and then.)

I'd like to be able to defend my decision against vaguely stated, uninformed, logically inconsistent or otherwise irrational objections, by reference to measurable and demonstrable facts. I can only plausibly take this position (not guaranteed of success, as I'm sure we all know), as long as I stick to facts. If at any point I become known as a "liar" I lose the social standing to do so in the future, even when the facts are on my side.

Someone with much better social skills could probably charm their way through dispute situations without having to bother with mere facts.

I can't pick apart my tendency towards truth-telling into its various components and their causes, but maybe my love of the truth is partly fear of the quicksand of uncontrollable social situations. I long for solid ground, and stand there fiercely.

comment by [deleted] · 2009-06-06T12:57:14.832Z · LW(p) · GW(p)

del

Replies from: Eliezer_Yudkowsky, MichaelBishop
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-06-06T17:59:04.328Z · LW(p) · GW(p)

TAWME: not on first Google page or Urban Dictionary.

Replies from: SoullessAutomaton, None
comment by SoullessAutomaton · 2009-06-06T18:19:07.812Z · LW(p) · GW(p)

This Agrees With My Experience.

It seemed obvious to me from context but that may merely be a sign that I spend too much time around people who enjoy acronyms.

comment by [deleted] · 2009-06-07T10:30:34.248Z · LW(p) · GW(p)

del

comment by Mike Bishop (MichaelBishop) · 2009-06-06T18:53:35.794Z · LW(p) · GW(p)

I assign high probability to this statement: 'If you turn away from the truth it stabs you in the back.'

Can you elaborate?

Replies from: None
comment by [deleted] · 2009-06-07T10:24:06.325Z · LW(p) · GW(p)

del

Replies from: MichaelHoward
comment by MichaelHoward · 2009-06-13T13:37:17.561Z · LW(p) · GW(p)

d) It's just plain bad karma.

I hope that was a joke.

Replies from: Eliezer_Yudkowsky, None
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-06-13T20:16:10.635Z · LW(p) · GW(p)

He means that lies get voted down on LW.

Replies from: None
comment by [deleted] · 2009-06-14T10:13:14.548Z · LW(p) · GW(p)

del

Replies from: MichaelHoward
comment by MichaelHoward · 2009-06-14T22:03:29.851Z · LW(p) · GW(p)

That too. But I mean it first in a way MichaelHoward fears.

I used to myself. Hope this helps :)

Excluding the Supernatural

Joy in the Merely Real

All stories that invoke ontologically basic mental quantities have turned out to be wrong.

Replies from: None
comment by [deleted] · 2009-06-15T09:05:42.388Z · LW(p) · GW(p)

del

Replies from: MichaelHoward
comment by MichaelHoward · 2009-06-15T16:55:32.945Z · LW(p) · GW(p)

Ah, that's good. :) Most people I've known who believed in karma meant something quite different.

comment by [deleted] · 2009-06-14T10:10:43.043Z · LW(p) · GW(p)

del

comment by djcb · 2009-06-06T12:14:56.659Z · LW(p) · GW(p)

A 'Machiavellian rationalist' would only speak truths when it's in his/her own best interests, and lie when that is more useful.

However, I think Eliezer wants to be a 'friendly rationalist'[1]. Then, speaking the truth becomes the optimal thing in many more cases. It could also make it harder to succeed in fields like trade, politics, war, where bluff, misrepresentations etc. are important. And what about 'Daddy, do you like this drawing I spent three hours making for you?'

And sometimes lying seems simply the best choice - no matter what Kant thinks. The classical Gestapo-knocks-on-the-door example comes to mind. The disutility of lying can be smaller than the utility of saving other people's lives (as well as your own).

Truthfulness is something to aspire to, but it's hard to be absolutist about it. I think being 'friendly'[1] is a useful dividing line, even if it can be abused to rationalize away lies.

[1] with 'friendly' here I mean: 'being nice to the rest of humanity determines your utility function in a significant way'

comment by ajayjetti · 2009-06-06T05:16:49.472Z · LW(p) · GW(p)

From what I've read and been able to understand (a little) after spending 3 months reading this blog, rationality is (just to put what has already been said lots of times by Eliezer) "something that helps us get more of what we want" - please correct me if I have got it dead wrong, else I might be "de-rationalised" in a rational way. I'm a pathological liar; I have little to hide (nobody who knows me visits this blog, I think, and if somebody does and happens to read this, then he wouldn't mind it, I'm sure).

"........- the theory being that if you give yourself permission to lie outright, you will no longer feel the need to distort internal belief. In this view the choice is between lying consciously and lying unconsciously, and a rationalist should choose the former."

I like the idea of outright lying, but what I've realised over a period of time is that the "outward lying" slowly creeps into the "conscious" and "aware" mind, and sometimes I forget that I am lying; I guess it happens to others also.

Again, from the definition of rationality (as I've put it above), I think honesty is related to rationality, since not being honest may help us get more of what we want, or just get us what we want. How well we manage lies and are able to separate conscious and unconscious lies is subjective.

"Truth is beautiful, without doubt; but so are lies." --Ralph Waldo Emerson

Replies from: billswift
comment by billswift · 2009-06-06T12:32:16.479Z · LW(p) · GW(p)

What you say, or write down, tends to be more strongly remembered than what you just think. (This is the basis for writing down your ideas whenever you get them (speaking them aloud is helpful if you can't stop to write).)

comment by loqi · 2009-06-06T04:57:11.443Z · LW(p) · GW(p)

If detecting falsehood and discovering truth are not the same skill in practice, then practicing honesty probably makes you better at discovering truth and worse at detecting falsehood.

It seems to me that the hard part of both is being "more confused by fiction than reality", and that the only relevant additional concern when detecting falsehood is to have a reasonable prior on deliberate deception.

comment by Annoyance · 2009-06-08T23:17:38.827Z · LW(p) · GW(p)

The problem, as I see it, is that it's not possible to lie to people and simultaneously act in their interests.

Lying to someone is an inherently hostile act, and indicates that (at least in regard to the matter at hand) you're enemies. There may be very special cases in which it's the act of an ally or friend (in the same way that semi-hostile microorganisms may be beneficial to our immune functioning) but they must be quite rare.

Even if some of the consequences of the lie are 'good', you're reinforcing the other person's tendencies towards irrationality by getting them to believe untruths and profiting by it.

Replies from: PhilGoetz, Jack, Alicorn, Adaptive
comment by PhilGoetz · 2009-06-09T03:53:38.690Z · LW(p) · GW(p)

The problem, as I see it, is that it's not possible to lie to people and simultaneously act in their interests.

What is your reasoning or evidence?

See the radical honesty link below. You can lie to people to avoid hurting their feelings. You can lie to children for their entertainment, their protection, or their cognitive development. (IMHO, speaking from experience, exposure to too many brutal truths of the world at the age of 5 is not a good thing. Also, training very young children to act as self-interested expectation maximizers has bad results.) You can lie to people to protect them from their predictable self-destructive responses (e.g., I once told an alcoholic "No, I haven't got any whiskey in the house.")

Replies from: Annoyance
comment by Annoyance · 2009-06-09T19:19:43.309Z · LW(p) · GW(p)

What is your reasoning or evidence?

Accepting a lie means accepting an understanding of the world that is less accurate than it could be, and that in turn limits your ability to react appropriately to reality. Any so-called positive consequences of believing the lie could also be produced by choosing a rational strategy in awareness of the truth.

Viewed as an abstract hypothetical, preferring to have believed a comforting lie rather than an unpleasant truth just isn't compatible with rationality; more importantly, it's incompatible with the desires that cause people to be rational.

Is it wrong? I happen to believe that irrationality is wrong, not merely a pragmatically poor choice, but the line of reasoning that brought me to that conclusion is a rational one; I recognize the limits to that argument. If you're a rationalist, or you wish to be, preferring the irrational is the wrong way to get there.

Replies from: PhilGoetz
comment by PhilGoetz · 2009-06-09T20:54:02.995Z · LW(p) · GW(p)

Viewed as an abstract hypothetical, preferring to have believed a comforting lie rather than an unpleasant truth just isn't compatible with rationality; more importantly, it's incompatible with the desires that cause people to be rational.

Perhaps, if everyone were perfectly rational, you would be right that lying is inherently hostile.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2009-06-09T22:46:01.970Z · LW(p) · GW(p)

Only all else being equal is the truth clearly preferable to deceit. If the choice of deceit allows one to save the world, it's inevitable for any notion of human morality to prefer deceit.

Replies from: Annoyance
comment by Annoyance · 2009-06-10T16:59:49.841Z · LW(p) · GW(p)

It's not inevitable. Nor is it likely, on balance, that deceit would have such large and positive effects. Far, far more probable is a shallow but wide flood of corrosive harms.

comment by Jack · 2009-06-09T03:46:07.188Z · LW(p) · GW(p)

How do you feel about lies of omission? What about other forms of deception (make-up, dress, airs of confidence when really nervous, etc.)? I'm really sympathetic to your view, but I think it profoundly underestimates the amount of deception we routinely engage in for the purposes of avoiding conflict, smoothing things over, maintaining privacy, keeping promises, avoiding delay, and generally not being a jerk. I'm really not sure our society could function in a state of obligatory information symmetry. Deception is so enmeshed in our institutions and conventions that I don't know if you could eliminate it without destabilizing everything else.

Obviously the solution isn't going to be "lie away!" but the practice of 'radical honesty' is a genuinely revolutionary act and it should be taken with appropriate seriousness and caution.

Replies from: Annoyance
comment by Annoyance · 2009-06-09T19:22:58.784Z · LW(p) · GW(p)

You're right that most interpersonal relationships, even (and especially) casual ones, absolutely require deception and silence on certain matters.

But most humans are not rational and don't desire to be rational. That only harms them, but it can't be helped. In such situations, probably the best way to salvage things and limit the total amount of harm is to play along and deceive them.

Would it be even better to refuse to play that game, to insist on interacting only upon a rational basis, or even to refuse interaction with people dedicated to being irrational? I don't know. I can't even guess. Maybe. But the cost would be terrible.

comment by Alicorn · 2009-06-08T23:45:19.690Z · LW(p) · GW(p)

The problem, as I see it, is that it's not possible to lie to people and simultaneously act in their interests... There may be very special cases in which it's the act of an ally or friend...but they must be quite rare.

This would be true if we lived in a world in which there was always plenty of time to communicate before action needed to be taken. But we don't; occasionally action is urgent and a lie is the only thing that can induce an urgent action in a small number of syllables.

Replies from: Annoyance, SoullessAutomaton
comment by Annoyance · 2009-06-09T19:28:31.751Z · LW(p) · GW(p)

If you yell fire when there is none, and people take your word for it and rationally respond appropriately, then it's overwhelmingly likely that their reaction would not be what they would perceive to be their optimal or appropriate response to the real situation. You presumably benefit from their reaction, profiting from their inappropriate response.

If there was a real situation where the best response is approximately what the response to a fire would be, there was no pre-awareness of the possibility of this situation, and there wasn't enough time to explain, yelling "Fire!" might truly be in everyone's interests. But that would be an exceptional and highly unlikely (a priori) circumstance.

Replies from: Alicorn
comment by Alicorn · 2009-06-09T20:01:07.802Z · LW(p) · GW(p)

Should I take this to mean that you think the advice to yell "fire" instead of "rape" or something else more accurate than "fire", when one is assaulted, is ethically misguided and indicates hostility to people who hear the exclamation, or do you just think that it's an "exceptional and highly unlikely" situation? For example. Not every possible situation is a weird fringe case where you could accurately yell "Godzilla!" but, to avoid people thinking you're making a joke, you go with "fire".

Replies from: Annoyance
comment by Annoyance · 2009-06-09T20:16:02.717Z · LW(p) · GW(p)

Not at all. But that's just another example of other people's interests not being compatible with your own, and choosing to trick them into actions that they wouldn't take if they knew the truth.

In that situation, you are their enemy, and vice versa.

Replies from: Alicorn
comment by Alicorn · 2009-06-09T20:28:53.825Z · LW(p) · GW(p)

Actually, what yelling "fire" in that situation does most effectively is get the attention of a group. Fire individually endangers every passerby, so they're motivated to assess the situation and call the fire department. Whereas in a moderately well-traveled area, the bystander effect can yield tragedy.

It's (probably) not against a passerby's interest to join a dozen other people in calling the fire department and/or serving to scare off a would-be attacker with excess attention, unless the passerby is a sociopath (or unless the passerby is never going to find out about the assault and you agree with pjeby on preferences being about the map only). It's only against their interests to stop and see what's going on and try to help if, in so doing, they put themselves in danger, and they don't care about the victim in particular. Yelling "fire" gets the attention of multiple people and reduces the danger.

So every helpful passerby gets to think, "Well, I am certainly a good person and would have helped even if she had yelled "rape", but it's a good thing she yelled "fire", because if she hadn't, these other self-interested jerks would have just walked right by and then I would have been in trouble and so would she." No enmity is required, just psychological facts.

Replies from: Annoyance
comment by Annoyance · 2009-06-10T15:13:01.504Z · LW(p) · GW(p)

Or people want to avoid the personal risk without gain of confronting a potentially violent rapist, and would choose to not become involved if they knew the reality of the situation.

I'm sure few people want to consider the possibility that such considerations motivate them, and I'm equally sure that many people are in actuality motivated by them.

comment by SoullessAutomaton · 2009-06-09T00:09:18.504Z · LW(p) · GW(p)

But we don't; occasionally action is urgent and a lie is the only thing that can induce an urgent action in a small number of syllables.

The ends justify the means--how delightfully consequentialist!

Further, of course, sometimes, the situation is severe enough that the business end of a weapon is the only thing that can effectively induce action. The heart of the question is how to choose the least bad approach that is sufficient to attain the necessary results when you may not even have enough time to follow Yudkowsky's exhortation to "shut up and multiply".

Ethics is difficult.

comment by Adaptive · 2009-06-10T04:19:04.038Z · LW(p) · GW(p)

It seems to be a questionable assumption that other people's interests are best served by

  • my subjective evaluation of what is true
  • the communication of this in full, regardless of circumstance

I am reminded of the Buddhist terms upaya and prajna, which I believe are commonly translated as "means" and "insight" respectively, but which I first encountered as "utility" and "truth". The principle, as I understood it when studying the subject, was that while one may feel in possession of a truth, it is not always useful to simply communicate that truth directly. I have personally taken true/useful as dual criteria for my own interpersonal (though not intrapersonal) communication.

I'll leave the ends-justifies-the-means implication of such a truth/utility formulation for a separate time and place.

comment by Sideways · 2009-06-08T01:31:42.826Z · LW(p) · GW(p)

Truth-telling is necessary but not sufficient for honesty. Something more is required: an admission of epistemic weakness. You needn't always make the admission openly to your audience (social conventions apply), but the possibility that you might be wrong should not leave your thoughts. A genuinely honest person should not only listen to objections to his or her favorite assumptions and theories, but should actively seek to discover such objections.

What's more, people tend to forget that their long-held assumptions are assumptions and treat them as facts. Forgotten assumptions are a major impediment to rationality--hence the importance of overcoming bias (the action, not the blog) to a rationalist.

comment by LukeParrish · 2009-06-06T19:35:24.241Z · LW(p) · GW(p)

Now that you mention it, I have kind of noticed this. Social skills appear to consist of a certain amount of glossing over details, so that they don't derail the more important thought processes. You don't mention aloud exactly everything you are thinking, because if you do you end up labeled a "dork".

Writing/typing is a little different because you know in advance that whatever comes out is not going to be perceived immediately by anyone, so it goes through fewer filters at first. Afterwards you have the chance to edit it to be less embarrassing (in flow or emotion) or more correct (factually or logically).

Overall this means you can reveal more without risking as much. Rather than revealing half-formed thoughts (which people find to be threatening), you can fully develop them before revelation, and make yourself sound sophisticated. Even so, some thoughts are best left out (in some contexts) because they distract from the main point -- you can only develop so many concepts in a given amount of space.

comment by JamesAndrix · 2009-06-06T16:19:29.976Z · LW(p) · GW(p)

For a long, long time I've tried not to say anything false. Usually this takes the form of caveats or 'weasel words'. This can make me seem less confident than I actually am, so I guess they're misleading in their own way.

Not sure how this influences self-deception. It probably makes it easier for me to misremember my confidence.

comment by AshwinV · 2014-10-14T05:48:10.609Z · LW(p) · GW(p)

In practice, the people I know who seem to make unusual efforts at rationality, are unusually honest, or, failing that, at least have unusually bad social skills.

It's the same "liar circuitry" that you're fighting, or indulging, in the internal or external case - that would be my second guess for why rational people tend to be honest people.

I have another alternate hypothesis: most normal people are such poor rationalists that it simply isn't worth the effort to develop proper "lying skills" (if that's an acceptable term - yes, I know it sounds weird as a phrase!!). In a majority of the situations that rationalists find themselves in, normal people will do such a thoroughly good job of deceiving themselves, and convince themselves so thoroughly, that there isn't any point in trying anymore. For that matter, there is probably a reason why even seasoned con-men have set patterns... the number of ways in which people (average Joe/Jane) can be manipulated is not just finite, but also likely to be extremely limited.

Alternately, with trained rationalists, the odds of successfully telling a lie are relatively and significantly smaller, so it serves as a tremendous pico-economic disincentive to even attempt a fabrication. This would also explain, perhaps in some ways better than the internal-circuitry theory, why there is a strong correlation between rationality and honesty.

comment by Ttochpej · 2009-06-06T13:05:04.510Z · LW(p) · GW(p)

My view of the relationship between honesty and rationality is similar to taw's 'enhancing reality' theory in his comment on this page, but I would think of it as an 'augmented reality' theory: I don't think we are designed to lie, just that the truth is normally very complex, and we are designed to simplify.

I think there are two basic factors that limit honesty: language, and the way the human mind works.

Language

Even a person who is trying to tell the truth and be honest is always going to have language problems. It's important to remember that new words are still being created all the time, and it is impossible for one person to know them all; we have to learn them, and even when two people are speaking the same language, there are still going to be misinterpretations. Before I found this site, I had a lot of ideas in my head but wouldn't have been able to describe them or write them down clearly, because I didn't know what words to use, or the way to use the words, to express what I meant.

So, for example: I am doing an IT degree, and my wife has asked me before, "What are you doing?" when I have been sitting at the computer writing. Sometimes I give her an accurate and honest answer like "I'm trying to fix a problem in the back-propagation learning function in the code for my neural net assignment," and she just looks at me funny, because she doesn't know what half those words mean. Other times I just tell her I'm programming or doing an assignment, even though a lot of the time I'm not exactly programming or doing an assignment - I might be installing software I need so I can program, or reading information so I can learn how to do an assignment. But giving a long, more accurate answer would actually tell her less than giving a short, slightly inaccurate one.

The Way The Mind Works

The other problem with honesty and the truth, I think, is the human mind. People's minds work by building associative links, and the more these links are used, the stronger they become; it is impossible to change them instantly, and so changing them takes time. For example, I have Christian friends with whom it would be pointless to sit down and argue that God doesn't exist; I would most likely just get a circular argument, and even if I did provide them with the evidence, they wouldn't accept it. The best thing I can do is understand where they are at and slowly introduce new concepts. This isn't just something that happens with people who are religious: if you look at a lot of scientific breakthroughs, they were not instantly accepted just because they were logical and gave the correct answers; the ideas were accepted slowly, over time.

So I think it is sometimes important to lie when the lie is a part of a bigger process of getting to the truth.

comment by joelgarnier · 2009-06-11T16:36:27.909Z · LW(p) · GW(p)

E.-

I think you are refining yourself for very good reasons. It could happen that you help 'raise' a seed AI someday. Like the quote on your pin, there is 'be the person you are seeking'. Perhaps your personal honesty practices are very important for your AI work. If you ever find yourself working with an evolving machine consciousness, it might be very good that you've taken time to search yourself.

I like your comment about feeling your nervous system change while running into that inconvenience-of-reality-what-to-say-or-do-myaaa!... thing. I think it's lucky that there are feelings to go along with lying, but very sad that it is so common and uncomfortable. You've seen others lie? A chill falls on things? Being absorbed in loads of truth seems like a flow state. Again, what is happening in our brains when we've really got it right?

Last, the practical part of my comment; have you considered having a look at your brain activity in some kind of honesty experiment? I know that would be expensive to do on a whim, but something to keep in mind for the future. In the next decade or so, wouldn't it be nice to get to know that 'wetware'; grab Nick Bostrom, Kurzweil, Aubrey de Grey, and have an MRI party? I'm being cheeky, but also sincere; you folks are so likely to be involved in some sort of GAI success, it would be neat to know your own minds in presently unusual but really useful ways.

If this is non-sense, go easy. I'm a dreamer.

comment by Annoyance · 2009-06-10T15:15:42.993Z · LW(p) · GW(p)

There is no theorem which proves a rationalist must be honest

I'd like to see the non-existence proof of that claim. Absent that, it should really start "I am not aware of the existence of a theorem which..."

How do we know that there is no such theorem?

comment by [deleted] · 2009-06-06T08:44:48.305Z · LW(p) · GW(p)

It's a special case of the interaction between seeking and sharing.

If you want something for yourself, are you obligated to help others achieve it, or at least not hinder them, even against your own interest? If you want something for yourself, are people who don't want it just wrong? Should you push it on them?

Different people will handle it different ways.

I don't have an absolute rule. I avoid lying, or try to, but I don't particularly avoid being silent.

Replies from: tut
comment by tut · 2009-06-11T05:52:16.318Z · LW(p) · GW(p)

I think it goes beyond that. Does acquiring the habit of telling the truth make it easier to see/remember the truth yourself? It just might.

Replies from: None
comment by [deleted] · 2009-06-11T09:39:51.163Z · LW(p) · GW(p)

Saying some particular thing a lot will help you remember it. But suppose a topic comes up in which the people you are talking to would be upset if you spoke the truth. What makes you want to speak the truth? What makes you not want to? The thought that I might need to refresh my memory by saying it out loud never seems to be the biggest factor.

I have seen people confused by their own lies, and it's one of many reasons I try to avoid lying, but that seems to only work in one direction and be confined to specific issues. You can walk around all day saying the sun is up, and it won't help you find your car keys.

comment by Peterdjones · 2013-09-10T16:42:35.594Z · LW(p) · GW(p)

There is no theorem which proves a rationalist must be honest - must speak aloud their probability estimates.

Speaking what you believe may be frankness, candour or tactlessness, but it isn't honesty. Honesty is not lying. It involves no requirement to call people Fatty or Shorty.