A Dissent on Honesty

post by eva_ · 2025-04-15T02:43:44.163Z · LW · GW · 20 comments

Contents

  Context
    Reminder About Winning
    Technical Truth is as Bad as Lying
    Being Mistaken is Also as Bad as Lying
    This is Partially a Response
    Examples
  Biting the Bullet
    My Proposed Policy
    Appearing Trustworthy Anyway
    Cooperative Epistemics
  Conclusion

Context

Disney's Tangled (2010) is a great movie. Spoilers if you haven't seen it.

The heroine, having been kidnapped at birth and raised in a tower, has never set foot outside. It follows, naturally, that she does not own a pair of shoes, and she is barefoot for the entire adventure. The movie contains multiple shots that focus at length on her toes. Things like that can have an outsized influence on a young mind, but that's Disney for you.

Anyway.

The male romantic lead goes by the name of "Flynn Rider." He is a dashingly handsome, swashbuckling rogue who was carefully crafted to be maximally appealing to women. He is the ideal male role model. If you want women to fall in love with you, it should be clear that the optimal strategy is to pretend to be Flynn Rider. Shortly into the movie is the twist: Flynn Rider (real name Eugene Fitzherbert) is also just pretending to be Flynn Rider. He was once a nerdy, loner orphan with no friends (I hope this is cutting close to your heart) till he read a series of books called "The Tales of Flynnigan Rider" and decided to steal the protagonist's personality wholesale to use as his own. Presumably what followed was an unbroken series of victories and successes all the way until he stumbled into the arms of a beautiful, naive teenager with implausible hair, who he seduced on his first try in about a day and a half.

Flynn admits his real name (and parts of his real personality) to his love interest and only his love interest, and only after he's already successfully seduced her. She accepts this, finding his (selective) openness endearing.

The lesson here is likewise clear: If your actual personality isn't good enough, pretend to be Flynn Rider to everyone at all times, with the sole carve-out being people who love you, like your mother or a princess. This works because people who love you will find your openness endearing, whereas everyone else will think you pathetic and use it against you.

Actually, even if your personality is good enough, you should probably still pretend to be Flynn Rider, because his personality is better. It was, after all, carefully designed by a crack team of imagineers. Was yours? Didn't think so.

Reminder About Winning

Once upon a time, two armies went to war. The first army desired honour and glory, to prove their bravery against their foe, to stand their ground whatever their odds, to bring no shame to their ancestors, and to be worthy of great ballads that would be sung across the lands for generations to come. The second army wanted to kill the people in the first army, without dying themselves in the process. It should be of little surprise to you that, since none of their goals contradicted, in the end everybody got what they wanted.
- Sun Tzu maybe, IDK I made it up.

Philosophers get to pick one axiom, one highest value, to declare superior to all others. If you have no highest value you have no way to judge all your other values. If you have two highest values you will not make it three paces from your own front door before they start contradicting each other.

Rationality can be about Winning [LW · GW], or it can be about The Truth, but it can't be about both. Sooner or later, your The Truth will demand you shoot yourself in the foot, while Winning will offer you a pretty girl with a country-sized dowry. The only price will be presenting various facts about yourself in the most seductive order instead of the most informative one.

If your highest value isn't Winning, you do not get to be surprised when you lose. You do not even get to be disappointed. By revealed preference, you have to have a mad grin across your face, because you were able to hold fast to your highest-value-that-isn't-winning all the way to the bitter end.

And yet, the rationalist movement has some kind of weird fetish for honesty, without much formal proof or empirical evidence that it's a good idea. Why? Did you watch the wrong Disney movies growing up?

Technical Truth is as Bad as Lying

There is a philosophy I want to call "Technical Truthism". Technical Truthists believe that, so long as what they said was technically true, you have no right to be mad at them, including when they tricked you into giving them all your money, cleverly danced around important issues, lied to themselves so they could truthfully report their own delusional beliefs as if they were valuable, laundered their opinions through a series of shell companies to create the illusion of an independent source that they were merely quoting, refused to give a straight answer on the important "have I just been scammed" question, and publicly lauded their own commitment to Absolute Honesty while they did it.

The gospel of Technical Truthism includes such sacred wisdom as:

I'm not sure which Disney movies get you into this, because every pop culture example I can think of is either the devil or a lawyer who looks like a stand-in for the devil. I think this philosophy is ridiculous and self-defeating. It defeats the entire point of telling the truth in the first place.

If you are an honest person, and others can by some mechanism know this, then they will believe you when you say things, and this can be used to share valuable information. If you are a liar and everyone knows it, there's nothing you can do to get the village to save you from the wolf, because when you yell wolf they just ignore you.

The purpose of a word is to carve reality at a joint useful for the discussion taking place, and we should pause here to note that the joint in question isn't "emits true statements", it's "emits statements that the other party is better off for listening to". Nobody should care if your statement "A wolf has been observed near our sheep!" is technically true if, when they come running, they find it was a drawing of a wolf and you're laughing at them. That is no better for their interests than an outright lie. The technical truth is useless as a defense, except against courts that are obligated to follow explicit laws and exact wordings. Nobody made out of meat should care.

Being Mistaken is Also as Bad as Lying

Just as we don't have reason to care if they technically saw a wolf, we also don't have much reason to care if they thought they saw a wolf and were merely sincerely mistaken. Sure, malice can be disincentivised with punishments in a way that mere errors are less susceptible to, but when it comes to whether we should listen to them next time, being frequently wrong because they're not very bright is just as bad as being just as wrong for any other reason.

The honest might say "By never lying, I get a reputation for never lying, so people will always believe me". This isn't true. They'd also have to never be mistaken, never be misled by the lies of another, never misread another's report, never stumble over their words and say something they didn't mean, never accidentally imply something they didn't mean and be mistaken for saying it outright, etc. Basically they'd have to be omniscient, but they're not omniscient. They're made out of meat too, remember.

Fortunately, you don't need a perfect reputation, just a good enough reputation that other people think it passes their Expected Utility Calculation to act on what you say. If you are an aspiring rationalist, you may well be so far above the median in accuracy of belief that you can get away with far above median dishonesty if you want, and still be an authority figure.

This is Partially a Response

In "Meta-Honesty: Firming Up Honesty Around Its Edge-Cases" [LW · GW], Eliezer writes, and the community seems to agree [? · GW], in the direction of a general premise that truth-speaking is admirable, and something rationalists should aspire to have more of.

As to whether the honest have a better ability to discern lies than the liars do, Eliezer writes:

This is probably not true in normal human practice for detecting other people's lies. I'd expect a lot of con artists are better than a lot of honest people at that.

I think this is probably correct. You can tell because Eliezer says so, and it's an admission against interest, so he wouldn't say it if he didn't believe it and he wouldn't believe it (because of self-interest biases) unless it was probably correct, but you might still check over your memories of your own life or try looking for an independent study anyway.

He goes on to write:

I once heard somebody claim that rationalists ought to practice lying, so that they could separate their internal honesty from any fears of needing to say what they believed. That is, if they became good at lying, they'd feel freer to consider geocentrism without worrying what the Church would think about it. I do not in fact think this would be good for the soul, or for a cooperative spirit between people. This is the sort of proposed solution of which I say, "That is a terrible solution and there has to be a better way."

Here (I think) he is straightforwardly wrong, and you can tell because he's only able to defend it by resorting to non-rational frames. Who cares if it is "good for the soul"? Souls aren't real, and we're supposed to be trying to Win here. There does not in fact have to be a better way. Sometimes the best option isn't also the maximally honest one. Tradeoffs exist, and you aren't going to make those tradeoffs at anywhere near an optimal rate if you're refusing to ever think of the possibility for fear of spiritual damage.

Whether it is bad for a "cooperative spirit" I promise I will get back to.

The purpose of this essay is not to disagree with Eliezer's Meta-Honesty as a principle for how to be unusually honest despite the dangers (I think it's mostly correct given its premise), but rather to disagree that being unusually honest is a good idea in the first place.

Examples

It is very easy to come up with moral hypotheticals where You Must Lie or Else, but let's ceremonially do it anyway.

A Paragon of Morality is out travelling, when he is beset by bandits. They demand he hand over his gold or they will kill him and take it from his corpse. This is not a decision-theoretic threat [? · GW] because the bandits value getting his gold more than they disprefer committing murder, but would otherwise avoid the murder if possible. If he hands over all his gold he will lose all his gold. If he hands over all the gold in his pockets, neglects the extra he has hidden in his sock, and says "I have given you all my gold" in a sufficiently convincing tone of voice, then he will lose less than all his gold.

This isn't Omega we're dealing with here; the bandits are totally trickable by a moderately convincing performance. If he keeps some of the gold he can donate it to GiveWell-approved charities and save however many QALYs or whatever.

Does he have a moral obligation to lie?

Yeah, obviously. Just do the Expected Value Calculation. Why care about Honour here, they're bloody bandits. I think even the particularly devout agree here.
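
To make "do the Expected Value Calculation" concrete, here is a toy version of it. Every number below (the amounts, the chance the performance fails) is invented for illustration:

```python
# Toy expected-value comparison for the bandit example.
# Every number here is made up for illustration.

p_caught = 0.10        # chance the bandits see through the performance
gold_total = 100       # gold the Paragon is carrying
gold_hidden = 30       # the portion hidden in his sock

# Option A: hand over everything, truthfully.
ev_honest = -gold_total

# Option B: hand over the pocket gold and lie about the sock.
# If caught, assume the bandits simply take everything anyway
# (they prefer not to murder when avoidable).
ev_lie = (1 - p_caught) * -(gold_total - gold_hidden) + p_caught * -gold_total

print(ev_honest, ev_lie)  # -100 vs -73.0: on these numbers, the lie wins
```

On any inputs like these the lie dominates; you have to crank p_caught very high, or attach some large independent penalty to lying, before honesty comes out ahead.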

A Normally Honest Man is applying for a job as a Widget Designer. He has many years of industry experience in Widget Engineering. He has memorised the Widget Manufacturing Process. He's actually kind of obsessed with Widgets. Typically whenever a conversation becomes about Widgets he gushes openly and makes a bad impression with his in-laws. Since that incident he has developed the self control to pretend otherwise, and the rest of his personality is okay.

The interviewer works for a Widget Manufacturing company but seems to only care about Widgets a normal amount. He asks "How interested are you in Widgets?" The applicant has learnt from previous job interviews that, if he answers honestly, the interviewer will think he is any of lying, insane, or too weird to deal with, and not hire him, even though this is not in the best financial interests of the company, were they fully informed.

Should he pretend to like widgets the amount most likely to get him hired, or does he have a moral obligation to keep answering honestly until he runs out of rent money and becomes homeless?

The thing I'm trying to point at here is that Not Lying is not a good decision principle in general. It might still be a valuable social norm e.g. for "Cooperative Epistemics" (It isn't, but I'll get to that later), but you definitely shouldn't think of it as any kind of bright line or guiding light.

These aren't the important examples to me though, the important example is this:

A Self-Improvement and Epistemics Nerd has an online community for Self-Improvement and Epistemics Nerds. Half the people reading it are autists with bad social skills, who weren't at exactly the right age to be saved by Disney's Tangled. They struggle with navigating ordinary social situations and obtaining true beliefs because they're bad at lying, and insufficiently aggressive at anticipating it in others.

Would they be doing anyone a favour in encouraging a social norm of truthfulness and the expectation of truthfulness in others, when all those people will inevitably have to leave the computer one day and end up like the subjects of the previous two examples? Would they be making the world a better place?

I don't think they would be doing a favour, and I expect they would be making the world a worse place. In fact, if they pretended it was a good idea anyway, I think they would be lying.

Biting the Bullet

Other people are not all your friends, do not all love you, and have not sworn an unbreakable oath not to use the information you provide them against your interests. Sometimes you are made better off by them having less accurate information.

Saying words is just an action, like any other action. Whether the words are literally true or not is just a fact about the action, like any other fact about an action. It's not the morally important fact. You judge actions by their consequences, whether you expect it to lead to more good or bad. Then you take the action with the best consequences overall.

Far more important for an aspiring rationalist however is the inverse: You should expect other people to do similar, and (sometimes) emit sentences that it does not benefit you to listen to. Even if they're good people, even if they're paragons of virtue who donate to all the right charities. You have an obligation to think things through yourself and carefully check. You cannot sell this obligation to the other party.

How to actually do this will involve some details.

My Proposed Policy

Lie by default whenever you think it passes an Expected Value Calculation to do so, just as for any other action. Include the reputation effects of your statement, the loss of valuable credibility if you are known to be a liar, and the risk of revenge by angry victims if your lies hurt others in your calculation. If your lie would be breaking a valuable and delicate social institution that actually exists and is being stably participated in by your counterparties (you have to check for this, do not just assume one exists), consider the value of that institution also in your decision. If you could go to prison about it, remember that prison is a really incredibly terrible place and that even tiny probabilities of ending up there can quickly dominate Expected Values.
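
As a minimal sketch of what weighing those terms might look like (every name, weight, and probability below is invented for illustration, not a claim about the right values):

```python
# Sketch of the proposed policy as a sum of the terms named above.
# All inputs are hypothetical; the point is which terms appear.

def ev_of_lying(direct_gain, p_caught, reputation_cost,
                p_revenge, revenge_cost,
                institution_cost,  # nonzero only if a real, verified institution is at stake
                p_prison, prison_cost):
    return (direct_gain
            - p_caught * reputation_cost
            - p_revenge * revenge_cost
            - institution_cost
            - p_prison * prison_cost)

# A small social lie with no legal exposure, on made-up numbers:
print(ev_of_lying(direct_gain=5, p_caught=0.2, reputation_cost=10,
                  p_revenge=0.05, revenge_cost=20,
                  institution_cost=0, p_prison=0.0, prison_cost=10**6))
# 5 - 2 - 1 - 0 - 0 = 2.0 > 0, so the policy says lie; nudge the inputs and it flips.
```

Note how the prison term behaves: even p_prison = 0.001 against a cost that size contributes -1000 and swamps everything else, which is the point of the warning above.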

Practice lying until you are good at it, so you can use it when you need to. Practice quickly assessing whether it is a good idea to say something, separately from whether that something is true. Practice discerning the lies of others, and better track and model reputation and reliability in different circumstances.

Once you have these tools, reassess all your beliefs for whether you really believe them or were just tricking yourself because you felt you needed to believe them to maintain your social relationships in the absence of your new ability to lie. For any such belief you find, secretly abandon it in favour of believing the truth. If necessary, lie to all your friends and pretend you still believe it to maintain your social position (If you wouldn't let yourself do this, you won't be able to really reassess the belief in the first place).

Treat Cooperate-Cooperate dynamics, where they locally exist, as extremely valuable things that you would not want to cheaply destroy, but do not assume they exist where they do not. Require proof and err towards caution. If you think your friend is mistaken or overly naive, try to help them reach truth if and only if you aren't shooting yourself in the foot even more so by doing that.

When your friends ask you about how trustworthy you are, make no implications that you are abnormally honest. Tell them truthfully (if it is safe to do so) about all the various bad incentives, broken social systems, and ordinary praxis that compel dishonesty from you and any other person, even among friends, and give them sincere advice about how to navigate these issues. 

Build a mental model of how and when other people are trustworthy based on past behaviour, demographic and selection effects, random mannerisms, and anything else you find that is useful. As you know someone better, you will update away over time from the general distribution to a more accurate subpopulation and eventually a precise internal model of how that individual thinks. If a specific claim greatly affects your choices and would be cheap to check, check anyway, as your Expected Value Calculations direct you.
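
For concreteness, here is one toy way to picture that "update away from the general distribution" process. The Beta-Bernoulli form and every prior below are illustrative assumptions, not the only way to do it:

```python
# Toy trust model: start from a population base rate, drift toward an
# individual's track record as their claims get checked. Priors are invented.

class TrustModel:
    def __init__(self, base_rate=0.7, prior_weight=5.0):
        # Pseudo-counts encoding the reference-class prior.
        self.accurate = base_rate * prior_weight
        self.inaccurate = (1 - base_rate) * prior_weight

    def observe(self, claim_checked_out: bool):
        if claim_checked_out:
            self.accurate += 1
        else:
            self.inaccurate += 1

    def p_reliable(self) -> float:
        return self.accurate / (self.accurate + self.inaccurate)

alice = TrustModel(base_rate=0.7)          # hypothetical prior for her reference class
for outcome in (True, True, False, True):  # four claims you bothered to check
    alice.observe(outcome)
print(round(alice.p_reliable(), 2))        # 0.72: drifting from 0.70 toward her record
```

The prior_weight controls how quickly the individual record overrides the demographic prior, matching the drift from general distribution to precise internal model described above.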

I know you can't actually do an Expected Value Calculation. I just mean pretend to do one to the best of your ability, make up a number, and then act on that. The universe won't be able to tell a made up number from a real one anyway; nobody else can check your internal thoughts. It's still a better policy than just trusting people.

Appearing Trustworthy Anyway

People often compress reputation into a single scalar, or, worse, a single boolean, when it really isn't one.

If you mostly emit (verifiably) bad sentences that hurt the listener, they will eventually notice and stop listening. If you lie always and obviously about a specific topic they will disbelieve you about that topic but might still believe you about other things. If you lie only in ways where you can't get caught, like about your private preferences or beliefs ("yes, I am feeling fine", "yes, I do like your hat", "I don't know, I didn't see") then you're not going to be seen as dishonest even if you did somehow get caught.

Reputation is an over-simplification. The other person has in their mind a model of you and how you behave, that they use to make decisions. Your actions impact that model, and not in the Functional Decision Theory [? · GW] sense where they quickly end up with a perfect clone of you [? · GW]. They are not going to be able to accurately model the real you, because the real you is too big to fit in their tiny mental model.

Most of the time, the amount of brainpower you're putting into your facade is both wider and deeper than what they're putting into trying to get a read on you. They're distracted by other people and other tasks. They are going to apply limited heuristics because they have no other choice. If you want to be trusted, you do not even need to trick the other person, just the heuristic they're using to assess credibility.

Typically people will trust you more if you more accurately match their existing model of how a trustworthy person behaves (wearing a suit, sitting straight, answering politely, etc.) even when those things aren't causally related, and even when you are doing those things deceptively to gain their trust. If you show up to the job interview with your real personality and really are a person who would never mislead anyone, but that personality has features that correlate with dishonour in people they've met before, sucks to be you.

If you want a reputation and appearance of Trustworthiness, you have to roleplay the Flynn Rider of Appearing Trustworthy, not your real self who obsesses over the Actual Truth. Most people who obsess over the truth are crazy, and have so many mistaken beliefs that they're worse than liars. You do not want to look like them. The details and techniques of how to do this fill many books, so I have sprinkled some examples through this essay as a scavenger hunt. Or if you prefer, pick on any other persuasive writer you like and dig out all the ways they try to make you hallucinate credibility via text alone.

Cooperative Epistemics

I promised to get back to this earlier (I promise I made solely so I can point it out now, so you think me a promise-keeper and therefore more trustworthy (and I am now pointing that out too in the hopes for another chance for you to see how this works)).

The motto of science is not "If we all tell the Truth to each other, we can form a little bubble where we collectively figure out all the big issues and come to fully understand the world". The motto of science is nullius in verba, "take nobody's word for it".

You cannot actually make a community of good epistemics on the expectation of trust and cooperation. It is like trying to start a communist utopia on the expectation that everybody just does their part and takes only what they need. People will not just.

Even if they would just, that wouldn't even be good enough. A person who is trying to be truthful can still be mistaken. They can still be selectively silenced. They can still trick themselves out of fear or regret, or because it's necessary to protect their ego. They can have a psychotic break. Their account could be hacked.

People who have heard other people claim the title of Honest Person, and for some reason believed them, are creating the worst possible set of incentives. The greatest possible force by which to bring sociopaths into your community, or to make otherwise good people decide just maybe this one time to fudge an answer. Nobody would notice. Everybody is just trusting them. Do them the charity of admitting they would be making a terrible mistake by imagining they can take you or anyone else at their word. Build your Cooperative Epistemics on distrust instead.

Conclusion

I believe that we are all friends here: I am not an honest person. You can tell that's true because if it wasn't I wouldn't say it, would I? You can tell I think of you as friends, because if I didn't I'd lie and say I was honest. It is only because I believe this that I am admitting to the incentives around dishonesty, and trying to help you all navigate towards truth better in the choppy waters where you find yourselves.

Do not go through life as a pedantic idiot with no social model just because people on a forum for truthseekers who hate social models think it's admirable and virtuous. Especially do not do it if you are trying to accomplish some other task like saving lives or getting a job or getting someone to love you.

I imagine you are going to feel bad about it, like you are doing something wrong. That sense of wrongness will show on your face and make you look like a crazy person who panics over normal small talk, so you're going to have to get over it. To your benefit in getting over it, it isn't actually wrong.

Saying words is just an action, like any other action. You judge actions by their consequences. Are people made worse off or not? Most of the time, you're not poisoning a shared epistemic well. The well was already poisoned when you got here. It's more of a communal dumping ground at this point. Mostly you'd just be doing the sensible thing like everybody else does, except that you lack the instinct and intuition and need to learn to do it by rote.

When it makes sense to do so, when the consequences are beneficial, when society is such that you have to, when nobody wants the truth, when nobody is expecting the truth, when nobody is incentivising the truth: just lie to people.

20 comments

Comments sorted by top scores.

comment by Julian Bradshaw · 2025-04-15T21:27:39.592Z · LW(p) · GW(p)

Have we forgotten Sam Bankman-Fried already? Let’s not renounce virtues in the name of expected value so lightly. 
 

Rationalism was founded partly to disseminate the truth about AI risk. It is hard to spread the truth when you are a known liar, especially when the truth is already difficult to believe. 

comment by Sonata Green · 2025-04-15T14:43:46.130Z · LW(p) · GW(p)

I feel like this post contains a valuable insight (it's virtuous to distrust and verify, to protect the community against liars), sandwiched inside a terrible framing (honor is for suckers).

comment by tailcalled · 2025-04-15T08:46:42.608Z · LW(p) · GW(p)

Actually, even if your personality is good enough, you should probably still pretend to be Flynn Rider, because his personality is better. It was, after all, carefully designed by a crack team of imagineers. Was yours? Didn't think so.

Personalities don't just fall into a linear ranking from worse to better.

Imagineers' job isn't to design a good personality for a friendless nerd, it's to come up with children's stories that inspire and entertain parents and which they proudly want their children to consume.

The parents think they should try to balance the demands of society with the needs of their children by teaching their children to scam the surrounding society while being honest about the situation with their loved ones. Disney is assisting the parents by producing propaganda/instructions for it.

https://benjaminrosshoffman.com/guilt-shame-and-depravity/

Basing your life on scamming society is a bad idea but you shouldn't solve it by also trying to scam your loved ones. If you are honest, you can more easily collaborate with others to figure out what is needed and how you can contribute and what you want.

comment by AnthonyC · 2025-04-15T10:50:37.428Z · LW(p) · GW(p)

Lie by default whenever you think it passes an Expected Value Calculation to do so, just as for any other action. 

How do you propose to approximately carry out such a process, and how much effort do you put into pretending to do the calculation?

I'm not as much a stickler/purist/believer in honest-as-always-good as many around here, I think there are many times that deception of some sort is a valid, good, or even morally required choice. I definitely think e.g. Kant was wrong about honesty as a maxim, even within his own framework. But, in practice, I think your proposed policy sets much too low a standard, and in practice the gap between what you proposed vs "Lie by default whenever it passes an Expected Value Calculation to do so, just as for any other action," is enormous in both the theoretical defensibility, and in the skillfulness (and internal levels of honesty and self-awareness) required to successfully execute it.

Replies from: eva_
comment by eva_ · 2025-04-15T11:25:47.459Z · LW(p) · GW(p)

How do you propose to approximately carry out such a process, and how much effort do you put into pretending to do the calculation?

The thing I am trying to gesture at might be better phrased as "do it if it seems like a good idea, by the same measures as you'd judge if any other action was a good idea", but then I worry some overly conscientious people will just always judge lying to be a bad idea and stay in the same trap, so I kind of want to say "do it if it seems like a good idea and don't just immediately dismiss it or assign some huge unjustifiable negative weight to all actions that involve lying" but then I worry they'll argue over how much of a negative weight can be justified so I also want to say "assign lying a negative weight proportional to a sensible assessment of the risks involved and the actual harm to the commons of doing it and not some other bigger weight" and at some point I gave up and wrote what I wrote above instead.

Putting too much thought into making a decision is also not a useful behavioural pattern but probably the topic of a different post, many others have written about it already I think.

I think your proposed policy sets much too low a standard, and in practice the gap between what you proposed vs "Lie by default whenever it passes an Expected Value Calculation to do so, just as for any other action," is enormous

I would love to hear alternative proposed standards that are actually workable in real life and don't amount to tying a chain around your own leg, from other non-believers in 'honest-as-always-good'. If there were ten posts like this we could put them in a line and people could pick a point on that line that feels right.

Replies from: Seth Herd, AnthonyC
comment by Seth Herd · 2025-04-15T15:04:04.753Z · LW(p) · GW(p)

I think this issue of the difficulty of making each decision about lying as an independent decision is the main argument for treating it as a virtue ethics or deontological issue.

I think you make many good points in the essay arguing that one should not simply follow a rule of honesty. I think that in practice the difference can be split, and that is in fact what most rationalists and other wise human beings do. I also think it is highly useful to write this essay on the mini virtues of lying, so that that difference can be split well.

There are many subtle downsides to lying, so simply adding a bit of a fudge factor to the decision that weighs against it is one way to avoid taking forever to make that decision. You've talked about practicing making the decision quickly, and I suspect that is the result of that practice.

This is a separate issue, but your point about being technically correct is also a valuable one. It is clearly not being honest to say things you know will cause the listener to form false beliefs.

I have probably erred on the side of honesty as have many rationalists, treating it not as an absolute deontological issue and being willing to fudge a little on the side of technically correct to maintain social graces in some situations. I enjoy a remarkable degree of trust from my true friends, because they know me to be reliably honest. However, I have probably suffered reputational damages from acquaintances and failed friends, for whom my exceptional honesty has proven hurtful. Those people don't have adequate experience with me to see that I am reliably honest and appreciate the advantages of having a friend who can be relied upon to tell the truth. That's because they've ceased being my friend when they've been either insulted or irritated by my unhelpful honesty.

There is much here I agree with and much I disagree with. But I think this topic is hugely valuable for the rationalist community, and you've written it up very well. Nice work!

comment by AnthonyC · 2025-04-15T11:35:31.031Z · LW(p) · GW(p)

We apply different standards of behavior for different types of choices all the time (in terms of how much effort to put into the decision process), mostly successfully. So I read this reply as something like, "Which category of 'How high a standard should I use?' do you put 'Should I lie right now?' in?"

A good starting point might be: One rank higher than you would for not lying, see how it goes and adjust over time. If I tried to make an effort-ranking of all the kinds of tasks I regularly engage in, I expect there would be natural clusters I can roughly draw an axis through. E.g. I put more effort into client-facing or boss-facing tasks at work than I do into casual conversations with random strangers. I put more effort into setting the table and washing dishes and plating food for holidays than for a random Tuesday. Those are probably more than one rank apart, but for any given situation, I think the bar for lying should be somewhere in the vicinity of that size gap.

comment by jenn (pixx) · 2025-04-15T19:46:06.786Z · LW(p) · GW(p)

Thanks for writing this post! I think it's insightful, and agree about technical truthtelling being annoying. After I thought about it though, I come down on the side of disagreeing with your post, largely on practical grounds.

A few thoughts:

  1. You propose: Lie by default whenever you think it passes an Expected Value Calculation to do so, just as for any other action. This is fine, but the rest of the section doesn't make it clear that by default there are very few circumstances where it seems theoretically positive EV to lie (I think this situation happens once or twice a year for me at most, certainly not enough for there to be good feedback loops.) Lies are annoying to keep track of, they bite you in the ass often, and even if you're fine with lying, most people are bad at it. This means that the average liar will develop a reputation for dishonesty over time, which people generally won't tell you about, but will tell other people in your social network so they know to watch out. More explicitly, I disagree with the idea that since each person is on average not paying attention, lying is easy. This is because people love to gossip about other people in their social circle who are acting weird, and being noticed by any person means that the information will propagate across the group.
  2. You propose: Practice lying. Same as Tangled, this only works if you start very young. If you do this after high school, you will permanently burn social capital! In the case of you doing so with non-consensual subjects, you will be caught because you are bad at it, and people will think that you are deceptive or weird. In the case where you find parties who can actively help you become a more dishonest person, those people will reasonably trust you less, and also it seems generally unwise to trust such parties.
  3. Re: developing the skill of detecting the relative honesty of other people: I agree that this is a good skill to have, and that "people will lie to you" is a good hypothesis to entertain on a regular basis. However this is a separate skill tree, and also one where facts and logic™ can thankfully save you. I'm not terrible at assessing vibes, decent at thinking about if stories check out, and I also can tap into the network of mutual acquaintances if something seems subtly off or weird about a person. This has not made me any less terrible at lying.
  4. Advocating for more lying seems like especially bad advice to give to people with poor social skills, because they lack the skills to detect if they're succeeding at learning how to lie or if they're just burning what little social capital they have for no gain. For people with poor social skills, I recommend, like, reading books about improving your social skills or discussing their confusions with friends who are more clued in, and for autistic people I recommend developing a better model of how neurotypicals think. I have disagreements with some of the proposed models in the book, but I think A Field Guide to Earthlings by Ian Ford is a good place to start.
  5. The flip side to the average person not being totally honest, is that if you can credibly signal that you are unusually honest using expensive signals, there actually are many niches for you in the world, and people pay attention to that too. I touch on this in a previous post of mine on unusually scrupulous non-EA charities [LW · GW]. While it's true that a few folks on the website can stand to become a little savvier socially[1], I think in general it would be better if they chose to play to their advantage. This seems like the higher EV route to me. And this is actually one of the reasons that I'm annoyed about technical truth telling - people who practice it are technically honest but they're not even getting any good reputation for it because they're functionally deceiving people, badly.
  6. All of the best things in my life came from moments where it felt very scary to tell the truth, and then I was brave and did so anyways.
  1. ^

    i think this case is generally overstated, btw. it's true that some lw people are bad at social skills but i think the median user is probably fine.

comment by Said Achmiz (SaidAchmiz) · 2025-04-15T09:30:49.747Z · LW(p) · GW(p)

I agree with a lot of things in this post and disagree with a lot of things in this post, but before I comment in more detail, I would like to clarify one thing, please:

Are you aware that there exist moral frameworks that aren’t act consequentialism? And if so, are you aware that some people adhere to these other moral frameworks? And if so, do you think that those people are all idiots, crazy, or crazy idiots?

(These questions are not rhetorical. Especially the last one, despite it obviously sounding like the most rhetorical of the set. But it’s not!)

Replies from: eva_
comment by eva_ · 2025-04-15T09:44:54.823Z · LW(p) · GW(p)

Yes I am aware of other moral frameworks, and I freely confess to having ignored them entirely in this essay. In my defence, a lot of people are (or claim to be, or aspire to be) some variant of consequentialist or another. Against strict Kantian deontologists I admit no version of this argument could be persuasive and they're free to bite the other bullet and fail to achieve any good outcomes. Against rule utilitarians (who I am counting as a primary target audience) this issue is much more thorny than to act utilitarians, but I am hoping to be persuasive that never lying is not actually a good rule to endorse and that they shouldn't endorse it.

I don't necessarily think they're crazy, but to various extents I think they'd be lowering their own effectiveness by not accepting some variation on this position, and they should at least do that knowingly.

Replies from: SaidAchmiz
comment by Said Achmiz (SaidAchmiz) · 2025-04-15T10:08:45.351Z · LW(p) · GW(p)

Ok, thanks. (You omit from your enumeration rule consequentialists who are not utilitarians, but I infer that you have a similar attitude toward these as you do towards rule utilitarians.)

Well, as I am most partial to rule consequentialism, I have to agree that “this issue is much more thorny”. On the one hand, I agree with you that “never lie” is not a good rule to endorse (if even for the very straightforward reason that lying is sometimes not only permissible, but in fact is morally obligatory, so if you adopted a “never lie” rule then this would obligate you to predictably behave in an immoral way). On the other hand, I consider act consequentialism[1] to be obviously foolish and doomed (for boring, yet completely non-dismissable and un-avoidable, reasons of bounded rationality etc.), so your proposed solution where you simply “do the Expected Utility Calculation” is a non-starter. (You even admit that this calculation cannot be done, but then say to pretend to do it anyway; this looks to me like saying “the solution I propose can’t actually work, but do it anyway”. Well, no, if it can’t work, then obviously I shouldn’t do it, duh.)

(More commentary to come later.)


  1. Utilitarianism, specifically (of any stripe whatsoever, and as distinct from non-utilitarian consequentialist frameworks) seems to me to be rejectable in a thoroughly overdetermined manner. ↩︎

comment by cousin_it · 2025-04-15T11:54:06.455Z · LW(p) · GW(p)

I mean, Flynn Rider was also really good-looking. For a lot of people, maybe most, this look is just unattainable. Even if you can get in as good physical shape (which is far from easy), what if you're older, shorter, balder, have a goofy face and so on.

comment by Jiro · 2025-04-15T05:41:23.351Z · LW(p) · GW(p)

He asks “How interested are you in Widgets?” He has learnt from previous job interviews that, if he answers honestly, the interviewer will think he is any of lying, insane, or too weird to deal with, and not hire him, even though this is not in the best financial interests of the company, were they fully informed.

By the standard "intentionally or knowingly cause the other person to have false beliefs", answering 'honestly' would be lying, and answering in a toned down way would not (because it maximizes the truth of the belief that the interviewer gets).

comment by dirk (abandon) · 2025-04-15T19:43:55.267Z · LW(p) · GW(p)

This is directionally correct and most lesswrongers could probably benefit from taking the advice herein, but goes too far (possibly as deliberate humor? The section about Flynn especially was quite funny XD).

I do take issue with the technical-truths section; I think using technical truths to trick people, while indeed a form of lying, is quite distinct from qualifying claims which would be false if unqualified. It's true that some philistines skim texts in order to respond to vibes rather than content, but the typical reader understands qualifiers to be part of the sentences which contain them, and to affect their meaning. That is why qualifiers exist, to change the meanings of the things they qualify, and choosing to ignore their presence is a choice to ignore the actual meaning of the sentences you're ostensibly reading.

Replies from: eva_
comment by eva_ · 2025-04-15T23:08:47.155Z · LW(p) · GW(p)

I think a distinction can be made between the sort of news article that's putting a qualifier in a statement because they actually mean it, and are trying to make sure the typical reader notices the qualifier, and the sort putting "anonymous sources told us" in front of a claim that they're 99% sure is made up, and then doing whatever they can within the rules to sell it as true anyway, because they want their audience of rubes to believe it. The first guy isn't being technically truthist, they're being honest about a somewhat complicated claim. The second guy is no better than a journalist who'd outright lie to you in terms of whether it's useful to read what they write.

comment by Shankar Sivarajan (shankar-sivarajan) · 2025-04-15T16:58:03.781Z · LW(p) · GW(p)

If you believed this, why would you write this post?

Replies from: eva_
comment by eva_ · 2025-04-15T23:01:12.573Z · LW(p) · GW(p)

I like a lot of the people in this space, have seen several of them hurt themselves by doing not this, would prefer they stopped, and nobody else seems to have written this post for me somewhere I can link to.

comment by Afterimage · 2025-04-15T10:18:39.579Z · LW(p) · GW(p)

Great post! I really enjoy your writing style. I agree with everything up to your last sentence of cooperative epistemics. It looks like a false equivalence between a community of perfect trust and a community based on mistrust. I'm thinking a community of "trust but verify" with a vague assumption of goodwill will capture all the benefits of mistrust without the risks of half rationalists or "half a forum of autists" going off the deep end and making a carrying error in their EV calculations to overly negative results.

Corrupted Hardware [? · GW] leads me to think we need to aim high to end up at an optimum level of honesty. 

Edit: Thanks Cole and Shankar. 

Replies from: Amyr, noggin-scratcher
comment by Cole Wyeth (Amyr) · 2025-04-15T13:25:42.270Z · LW(p) · GW(p)

Highlight, right-click, the little diagonal line thing that usually symbolizes links.

Replies from: shankar-sivarajan
comment by Shankar Sivarajan (shankar-sivarajan) · 2025-04-15T17:01:00.981Z · LW(p) · GW(p)

Or Ctrl-K, the standard shortcut.