Degrees of Radical Honesty
post by MBlume · 2009-03-31T20:36:10.497Z
The Black Belt Bayesian writes:
Promoting less than maximally accurate beliefs is an act of sabotage. Don’t do it to anyone unless you’d also slash their tires, because they’re Nazis or whatever.
Eliezer adds:
If you'll lie when the fate of the world is at stake, and others can guess that fact about you, then, at the moment when the fate of the world is at stake, that's the moment when your words become the whistling of the wind.
These are both radically high standards of honesty. Thus, it is easy to miss the fact that they are radically different standards of honesty. Let us look at a boundary case.
Thomblake puts the matter vividly:
Suppose that Anne Frank is hiding in the attic, and the Nazis come asking if she's there. Harry doesn't want to tell them, but Stan insists he mustn't deceive the Nazis, regardless of his commitment to save Anne's life.
So, let us say that you are living in Nazi Germany, during WWII, and you have a Jewish family hiding upstairs. There's a couple of brownshirts with rifles knocking on your door. What do you do?
I see four obvious responses to this problem (though there may be more):
1: "Yes, there are Jews living upstairs, third door on the left" -- you have promoted maximally accurate beliefs in the Nazi soldiers. Outcome: The family you are sheltering will die horribly.
2: "I cannot tell you the answer to that question" -- you have not deceived the Nazis. They spend a few minutes searching the house. Outcome: The family you are sheltering will die horribly.
3: "No, there are no Jews here" -- your words are like unto the whistling of the wind. The Nazis expect individuals without Jews in their homes to utter these words with near certainty. They expect individuals with Jews in their homes to utter these words with near certainty. These words make no change in P(there are Jews here) as measured by the Nazis (a short Bayes sketch after this list makes this explicit). Even a couple of teenaged brownshirts will possess this much rationality. Outcome: The family you are sheltering will die horribly.
4: Practice the Dark Arts. Heil Hitler enthusiastically, and embrace the soldiers warmly. Thank them for the work they are doing in defending your fatherland from the Jewish menace. Bring them into your home, and have your wife bring them strong beer, and her best sausages. Over dinner, tell every filthy joke you know about rolling pennies through ghettos. Talk about the Jewish-owned shop that used to be down the street, and how you refused to go there, but walked three miles to patronize a German establishment. Tell of the Jewish moneylender who ruined your cousin. Sing patriotic songs while your beautiful adolescent daughter plays the piano. Finally, tell the soldiers that your daughter's room is upstairs, that she is shy, and bashful, and would be disturbed by two strange young men looking through her things. Appeal to their sense of chivalry. Make them feel that respecting your daughter's privacy is the German thing to do -- is what the Führer himself would want them to do. Before they have time to process this, clasp their hands warmly, thank them for their company, and politely but firmly show them out. Outcome: far from certain, but there is a significant chance that the family you are sheltering will live long, happy lives.
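To spell out the arithmetic behind response 3, here is a minimal Bayes sketch. The numbers are made up purely for illustration, and the code is just a convenient way of showing the update; nothing hangs on the particular values.

```python
# Minimal sketch with illustrative, made-up numbers: both kinds of
# household deny hiding Jews with near certainty, so the denial is
# almost zero evidence either way.

def posterior(prior, p_no_given_jews, p_no_given_no_jews):
    """P(Jews hidden here | household says 'No'), by Bayes' rule."""
    evidence = p_no_given_jews * prior + p_no_given_no_jews * (1 - prior)
    return p_no_given_jews * prior / evidence

prior = 0.10  # soldiers' prior that this house hides Jews
print(posterior(prior, 0.99, 0.98))  # ~0.101 -- essentially the prior
```

With numbers like these the likelihood ratio is about 1.01, so the soldiers' posterior is essentially their prior; only an answer that the two kinds of household would give at very different rates could move their belief.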
I am certain that Yvain could have a field day with the myriad ways in which response 4 does not represent rational discourse. Nonetheless, in this limited problem, it wins.
(It should also be noted that response 4 came to me in about 15 minutes of thinking about the problem. If I actually had Jews in my attic, and lived in Nazi Germany, I might have thought of something better).
However:
What if you live in the impossible possible world in which a nuclear blast could ignite the atmosphere of the entire earth? What if you are yourself a nuclear scientist, and have proven this to yourself beyond any doubt, but cannot convey the whole of the argument to a layman? The fate of the whole world could depend on your superiors believing you to be the sort of man who will not tell a lie. And, of course, in order to be the sort of man who would not tell a lie, you must not tell lies.
Do we have wiggle room here? Neither your superior officer, nor the two teenaged brownshirts, are Omega, but your superior bears a far greater resemblance. The brownshirts are young and ruled by hormones. It is easy to practice the Dark Arts against them, and get away with it. Is it possible to grab the low-hanging fruit to be had by deceiving fools (at least, those who are evil and whose tires you would willingly slash), while retaining the benefits of being believed by the wise?
I am honestly unsure, and so I put the question to you all.
ETA: I have of course forgotten about the unrealistically optimistic option:
5: Really, truly, promote maximally accurate beliefs. Teach the soldiers rationality from the ground up. Explain to them about affective death spirals, and make them see that they are involved in one. Help them to understand that their own morality assigns value to the lives hidden upstairs. Convince them to stop being Nazis, and to help you protect your charges.
If you can pull this off without winding up in a concentration camp yourself (along with the family you've been sheltering) you are a vastly better rationalist than I, or (I suspect) anyone else on this forum.
Comments
comment by James_Miller · 2009-04-01T01:35:42.691Z
You have left out one horrible option.
(6) Talk to the Nazis about Jews. Like many other Europeans of the time, become convinced that they are right and gladly tell them where Anne Frank is hiding.
↑ comment by Teerth Aloke · 2019-04-07T09:01:31.302Z
Well, then why would the person even give refuge to Anne Frank? Or wait for the Brownshirts to come to his house?
comment by Alicorn · 2009-04-01T00:48:44.302Z
Telling the truth is an expression of trust, in addition to being a way to earn it: telling someone something true that could be misused is saying "I trust you to behave appropriately with this information". The fact that I would lie to the brownshirts as convincingly as possible shouldn't cause anyone else to mistrust me as long as 1) they know my goals; 2) I know their goals and they know that I do; 3) our goals align, at least contextually; and 4) they know that I'm not just a pathological liar who'll lie for no reason. The Nazis will be misled about (1), because that's the part of their knowledge I can manipulate most directly, but anyone with whom I share much of a trust relationship (the teenage daughter playing the piano, perhaps) will know better, because they'll be aware that I'm sheltering Jews and lying to Nazis.
The fact that I would lie to save the world should only cause someone to mistrust my statements on the eve of the apocalypse if they think that I think that they don't want to save the world.
↑ comment by PhilGoetz · 2009-04-01T04:36:37.843Z
Edited to not sound like I know what Eliezer is thinking:
In the Nazi example, there are only 3 likely options: Nazi, anti-Nazi, or self-interested. If non-Nazi C sees person A lie to Nazi B, C can assume, with a high degree of certainty, that person A is on the non-Nazi side. Being caught lying this way increases A's trustworthiness to C.
Radical honesty is a policy for when one is in a more complicated situation, in which there are many different sides, and there's no way to figure out what side someone is on by process of elimination.
In Eliezer's situation in particular, which probably motivates his radical honesty policy, some simple inferences from Eliezer's observed opinions on his own intelligence vs. the intelligence of everyone else in the world would lead one to give a high prior probability that he will mislead people about his intentions. Additionally, he wants to get money from people who are going to ask him what he is doing and yet are incapable of understanding the answer; so it hardly seems possible for him to answer "honestly", or even to define what that means. Most questions asked about the goals of the SIAI are probably some variation of "Have you stopped beating your wife?"
Radical honesty is one way of dealing with this situation. Radical honesty is a rational variation on revenge strategies. People sometimes try to signal that they are hot-tempered, irrational people who would take horrible revenge on those who harm them, even when to do so would be irrational. Radical honesty 0 is, likewise, the attempt, say by religious people, to convince you that they will be honest with you even when it's irrational for them to do so. Radical rational honesty is a game-theoretic argument that doesn't require the radically honest person RHP to commit to irrationality. It tries to convince you that radical honesty is rational (or at least that RHP believes it is); therefore RHP can be trusted to be honest at all times.
And it all collapses if RHP tells one lie to anybody. The game-theory argument needed to justify the lie would become so complicated that no one would take time to understand it, and so it would be useless.
(Of course nobody can be honest all the time in practice; of course observers will make some allowance for "honest dishonesty" according to the circumstances.)
The hell of it is that, after you make this game-theoretic argument, somebody comes along and asks you if you would lie to Nazis to save Anne Frank. If you say yes, then they can't trust you to be radically honest. And if you say no, they decide they wouldn't trust you because there's something wrong with you.
Because radical honesty is a game-theoretic argument, you could delimit a domain in which you will be radically honest, and reserve the right to lie outside the domain without harming your radical honesty.
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-04-01T05:28:16.965Z
Phil, how many times do I have to tell you that every time you try to speak for what my positions are, you get it wrong? Are you incapable of understanding that you do not have a good model of me? Is it some naive realism thing where the little picture in your head just seems the way that Eliezer is? Do I have to request a feature that lets me tag all your posts with a little floating label that says "Phil Goetz thinks he can speak for Eliezer, but he can't"?
There's some here that is insightful, and some that I disagree with. But if I want to make promises I'll make them myself! If I want to stake all my reputation on always telling the truth, I'll stake it myself! Your help is not solicited in doing so!
And it all collapses if he tells one lie to anybody.
I strive for honesty, hard enough to take social penalties for it; but my deliberative intelligence literally doesn't control my voice fast enough to prevent it from ever telling a single lie to anybody. Maybe with further training and practice.
↑ comment by PhilGoetz · 2009-04-01T16:19:23.777Z
I do not have a good model of Eliezer. Very true. I will edit the post to make it not sound like I speak for Eliezer.
But if you want to be a big man, you have to get used to people talking about you. If you open any textbook on Kant, you will find all sorts of attributions saying "Kant meant... Kant believed..." These people did not interview Kant to find out what he believed. It is understood by convention that they are presenting their interpretation of someone else's beliefs.
If you don't want others to present their interpretations of your beliefs, you're in the wrong business.
↑ comment by MBlume · 2009-04-01T01:11:40.008Z
What if you need to explain to a Nazi general that the bomb he's having developed could destroy the world? Your goals don't align, except in the fairly narrow sense that neither of you wants to destroy the world.
↑ comment by Alicorn · 2009-04-01T15:00:08.599Z
That's an interesting case because, if the Nazi is well-informed about my goals, he will probably be aware that I'd lie to him for things short of the end of the world and he could easily suspect that I'm falsely informing him of this risk in order to get him not to blow up people I'd prefer to leave intact. If all he knows about my goals is that I don't want the world to end, whether he heeds my warnings depends on his uninformed guess about the rest of my beliefs, which could fall either way.
↑ comment by MBlume · 2009-04-01T19:27:43.671Z
That's why I think that if, say, a scientist were tempted by the Noble Lie "this bomb would actually destroy the whole earth, we cannot work on it any further," this would be a terrible decision. By the same logic that says I hand Omega $100 so that counterfactual me gets $10000, I should not attempt to lie about such a risk so that counterfactual me can be believed where the risk actually exists.
comment by NancyLebovitz · 2010-08-28T14:08:41.261Z
A small scale and probably more common example-- a friend who lost his job because his co-workers and his immediate boss didn't trust him because he wouldn't pad his expense account.
I think the problem was that having a non-standard moral system meant he was too unpredictable.
Admittedly (and I only have his account of his life), there was another problem-- he has Asperger's and he's a large guy. Neurotypicals (perhaps only neurotypical men) would automatically see him as physically threatening.
If you want to offer advice about the situation he was in, please make it hypothetical. At this point, his depression and anxiety are bad enough that he's on disability and not in the job market.
For another example, note that whistle-blowers need legal protection to keep their jobs. In the case of the man who released the Abu Ghraib photos, he needed protection because of death threats.
Where does signaling absolute honesty fit in a world where many people make it unwelcome?
↑ comment by jimrandomh · 2010-08-30T17:20:17.600Z
A small scale and probably more common example--a friend who lost his job because his co-workers and his immediate boss didn't trust him because he wouldn't pad his expense account.
They were probably worried that he would inform on others for padding their expense accounts. Someone who follows a stricter set of moral rules, but doesn't plan to force those rules on others, should be sure to clarify that they don't mind others following less strict rules, within reason, and wouldn't make trouble over them.
↑ comment by NancyLebovitz · 2010-08-30T17:26:09.932Z
From what my friend told me, his Asperger's meant that he was very bad at getting neurotypicals to trust him.
Also, his explicit thinking was that he had no way to figure out when to bend the explicit rules, so he was going to follow them. This might imply that if the rules changed to require him to report people who were breaking them, he'd comply.
↑ comment by WrongBot · 2010-08-30T21:53:57.923Z
These sound like two problems of the same class. To avoid similar future problems of this type, your friend should study how NTs create and enforce social norms, probably with reference to game theory.
Many of the social deficiencies that are typically a part of AS can be worked around by explicitly thinking out chains of reasoning that NTs perform instinctively, and then practicing that kind of thinking enough that it becomes (relatively) instinctual. Social skills are skills.
comment by AlanCrowe · 2009-03-31T23:44:42.546Z
Somewhere in Polybius is a line about character. Men have characters. Yours may lead you to greatness, but when times change and your character is ill-suited to new circumstances, you remain the same. Disaster follows.
I agree that there is a problem. I think it was recognised long ago and has no solution.
I was very struck that in our imaginations we are always the good guys. I have developed serious doubts about this. Looking back on my life it struck me that as a young man I was rather attracted by authoritarian governments. If I had been a 13 year old German boy when Hitler came to power in 1933 I would have worshipped the man. He would have been my hero and I would have believed all that stuff about Jews.
I'm not trying to shock or abase myself. There is actually an element of smugness as I feel ahead of the pack in realising that the good guys and the bad guys are mostly just ordinary guys and land on one side or the other because it is hard to transcend one's circumstances. We know that Hitler was popular in 1933. What must also be true is that if we had been young and German we might well have fallen under his spell. We might perhaps have been proud of our philosophical sophistication and knowledge of Heidegger but it wouldn't have helped.
I see the post as aiming way too high. If you are crap at lying, that has consequences, perhaps good, perhaps bad. Time will tell, don't worry about it. The more interesting question is about how to transcend one's circumstances. How does one avoid being one of the brownshirts?
When we worry here about promoting maximally accurate beliefs it is self-deception that is the principal concern, not the deception of others. Eliezer posted about affective death spirals. An example could be being so impressed by the Fuehrer that we believe everything he says. Later, we notice that he says an awful lot. We notice, with some surprise, that it is all true. What a genius! And round we go. I've understood Eliezer to be addressing the issue of how we can tell from the inside; how we can escape our own affective death spirals. That is a very different problem from seeing the problem in others and opposing them.
↑ comment by gwern · 2009-04-01T03:56:03.434Z
'Somewhere in Polybius...'
See, that's where your post first goes wrong. This is LW: you need to quote either anime, SF, fanfiction, or something, preferably all three. I'll spot you a suitable quote, but next time you're on your own!
'Somewhere in Dune Messiah, Frank Herbert writes "A creature who has spent his life creating one particular representation of his selfdom will die rather than become the antithesis of that representation."...'
comment by MendelSchmiedekamp · 2009-03-31T21:31:39.018Z
Perhaps this is tangential, but if we're assuming non-omniscient perspectives then what allows you to conclude that the only way to be trusted absolutely is to never be observed to have lied?
If the supervisors weigh positive evidence over negative evidence, a very common bias, then we should actually expect them to trust you more if they have accumulated a great deal of reliable evidence of when you do lie. That's one reason being caught in an understandable lie is a way con artists can build trust.
It seems that you're using honesty as a behavior, truth as a value of a statement or a communication, and trust as a feature of a relationship as though they are interchangeable. But they really aren't.
Even if you wish to behave in such a way to communicate things as accurately as possible (honesty) it becomes necessary to say things which are not accurate in themselves (lie), in order that the outcome of communication is accurate (honest).
↑ comment by Mike Bishop (MichaelBishop) · 2009-04-01T00:10:17.950Z
Mendel, do you have an example for the claim in your last sentence?
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-03-31T23:14:32.483Z
If the soldiers are old, play 3: become the person who has nothing to hide.
4 might work against the young, maybe. Older soldiers would notice that you're behaving differently from most houses they knocked on.
To be specific, try to become the most common sort of person the soldiers would encounter who has nothing to hide - a decent German patriot, but still nervous of course about being questioned.
↑ comment by outlawpoet · 2009-04-01T00:09:40.188Z
so, lie?
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-04-01T02:33:00.695Z
In this case? Yes. Even if the Nazis had Omega-like powers, you'd still want to fool them - they're not any sort of game-theoretic counterpart who you wish would trust your honesty. I'm not entirely sure I'm describing all the factors here, but this scenario doesn't even feel to me like it's about the quantity ordinarily known as honesty, there is no bond you are breaking.
The proper form of this scenario is if a Nazi soldier who's feeling conflicted comes to you and says he wants to talk to you, but only if you vow silence. You do, and he tells you that he suspects there's a Jewish family next door. He gives you a chance to talk him out of turning them in. You fail. Do you warn the family next door? Now that's a dilemma of honesty with someone else's life at stake.
And of course it can get even worse. E.g. Knut Haukelid.
↑ comment by MBlume · 2009-04-01T03:13:37.937Z
Can the world be so easily partitioned into those we simply wish to fool, and those with whom we might need to cooperate? Is there a simple, parsimonious precommitment to honesty that allows for fooling nazis, taking confessions, and being believed when we point out global risks? I guess that's what this post was getting at.
↑ comment by Demosthenes · 2009-04-01T03:58:40.807Z
"To tell the truth is a duty, but is a duty only with regard to the man who has a right to the truth."
Kant disagrees and seems to warn that the principle of truth telling is universal; you can't go around deciding who has a right to truth and who does not. Furthermore, he suggests that your lie could have terrible unforeseen consequences.
Lie to the Nazis who you feel "don't deserve the truth" and then they end up treating everyone on the rest of the block like liars and sending all sorts of people to the concentration camps or outright killing them because it's not worth trying to ferret out truth, etc., etc., etc.
Eliezer:
When I was reading through your other article I thought the "fate of the world" part suggested that not lying should be the basis for a universalizable duty like Kant's. The existence of a future "fate of the world" event makes it seem like you are getting at the same unforeseen consequences point as The Big K - is this accurate?
I am concerned that deciding on who is rational enough to treat honestly is a slippery slope.
Personally, this seems like a point where you take a metaphysical stand, rationally work your way through the options within your axiomatic system and then apply your rational program to the choice at hand. I am more utilitarian than Kant, but it is not hard to ignore "proximity" and come up with a cost/benefit calculation that agrees with him.
↑ comment by Paul Crowley (ciphergoth) · 2009-04-01T07:48:33.305Z
I don't see the Haukelid comparison. Haukelid maximised for expected lives saved; here it's clear what decision does that, but the cost is that you wouldn't be in a position to do that if the other party had known that's what you would do.
comment by Sean · 2009-04-03T22:19:32.939Z
Now, I'll nitpick a bit, but the site's name is "less wrong", so I'll give it a go:
I do not think the scenarios proposed in this post are realistic, and I think it shows that a lot of the commenters are Americans, who seem to not know a lot about how things worked in Europe during the War.
I'm a Dane, not a German, but my great grandparents did hide a member of the Danish resistance - my grandfather's brother - from the Germans during the war. If they had not helped him hide in the barn one night when four members of the SS came looking for him at their house, he would have been caught and sent to the Danish concentration camp in Frøslev. They all knew that their own lives were at stake as well - hiding Jews or members of the resistance was often enough to make you a member of the resistance in the eyes of the Nazis, and everybody knew this.
In short, if you were living in Nazi Germany during WW2 and you hid Jews from the Nazis, you would share the fate of the Jews you were hiding if they were ever found, no matter what you told the search party. Telling the truth would not only condemn the Jews to death - it could also quite easily become the same as signing your own death sentence.
comment by randallsquared · 2009-04-01T20:15:01.976Z
EY was quoted as saying:
If you'll lie when the fate of the world is at stake, and others can guess that fact about you, then, at the moment when the fate of the world is at stake, that's the moment when your words become the whistling of the wind.
For the vast majority of people, however, the fate of the world will never be at stake in a way that turns on their statements. For those people (and those of us who expect to be them), mentioning that we'd lie to save the world increases others' trust in a way that asserting we'd never lie no matter what does not.
comment by JulianMorrison · 2009-04-01T07:25:07.741Z
I hate to burst the feel-good bubble of #5, but have you noticed that would just put the burden of lying right back onto the newly rational brownshirts? Unless, that is, you were such a capable rationalist teacher that you could train them to teach, on a few hours' notice - and set loose a recursion that would obliterate the Nazi ideology in days, leaving the rest of the world caught between relief and awe, and wondering if such a pointedly infectious idea ought to be quarantined.
↑ comment by MBlume · 2009-04-01T07:27:38.093Z
There was no feel-good bubble -- I listed 5 as being the obviously impossible extreme of "promote maximally accurate beliefs"
↑ comment by JulianMorrison · 2009-04-01T07:43:16.211Z
No criticism implied. I'm just having fun with a counterfactual. The feel-good bubble was mine as much as anybody's. "Wouldn't it be nice - oh wait, but then they'd need to ... and they'd end up having to convert Hitler ...". Heh. And now I'm wondering what a nation of sudden rationalists would do with a Nazi war machine if they had one - dismantle it? Or fight someone? (Sorry, I know I'm hijacking a bit. Threaded replies should defend the main discussion, anyhow.)
comment by smoofra · 2009-04-01T04:30:10.397Z
I think you have entirely missed the point. If there are Nazis at your door, sabotaging them is exactly what you are trying to do. There's nothing irrational about that. Black Belt isn't saying rationalists don't lie, he's saying they don't lie to people they don't want to harm.
The point of the Eliezer quote isn't radically different either. If your friends think you'd lie to them "for their own good", then they may disbelieve you when you need their trust the most. So again, don't lie to your friends.
↑ comment by Nominull · 2009-04-01T17:57:48.616Z
I think you have entirely missed the point. BBB is happy to sabotage his enemies by lying to them, but Eliezer isn't. You only tell half the story of Eliezer's quote: sure, he doesn't want you to lie to your friends, but it is just as much a consequence of his argument that if your enemies think you'd lie to them for your own good, they may disbelieve you when you need their trust the most, so don't lie to your enemies.
EDIT: After reading his comments, it seems Eliezer himself does not realize that this is a consequence of his arguments. It's so frustrating when people do not realize how insightful they are being.
comment by whpearson · 2009-03-31T22:15:30.009Z
If the wise are truly wise they wouldn't judge your honesty as a binary choice. They would allow you to run more complex algorithms that were scrupulously honest in some situations but dishonest in others and see them as separate clusters in person-honesty-given-situation space.
↑ comment by Tom_Talbot · 2009-04-01T00:20:54.323Z
I agree. The wise ought to recognise when you were forced into telling a lie because you valued something more highly than your reputation, and that, in an oddly self-nullifying way, should enhance your reputation. At least among the wise.
EDIT: Gods! I just noticed the accidental similarity to Newcomb's problem (Omega ("the wise") is allocating reputation instead of money). I've been reading Yudkowsky for too long.
comment by mdcaton · 2009-04-01T00:27:25.845Z
Is this question really so hard? Remind me never to hide from Nazis at your house!
First off, Kant's philosophy was criticized on exactly these grounds, i.e. that by his system, when the authorities come to your door to look for a friend you're harboring, you should turn him in. I briefly scanned for clever Kant references (e.g. "introduce the brownshirts to your strangely-named cat, Egorial Imperative") but found none. Kant clarified that he did not think it immoral to lie to authorities looking to execute your friend.
The larger issue here is the purpose of rationality. We start in medias res and reason is a tool to help us navigate the world better. I imagine that none of us have a commitment to rationality for its own sake, but rather support a clearer world-view out of some initial kind of self-interest. Consequently I'm a-okay with engaging in the Dark Arts in cases where even my basic interest (my own and my friend's continued survival) and that of another party totally diverge. Otherwise the joke about the engineer (or atheist) and the guillotine isn't really a joke.
Often the long-term best strategies in game theory are irrational in the short-term; as in, games of chicken, or in punishing wrongdoers even though the cost of punishment is more than letting them off.
↑ comment by Nick_Tarleton · 2009-04-01T00:35:23.912Z
Often the long-term best strategies in game theory are irrational in the short-term; as in, games of chicken, or in punishing wrongdoers even though the cost of punishment is more than letting them off.
↑ comment by Pablo (Pablo_Stafforini) · 2009-04-01T01:54:23.543Z
Kant's philosophy was criticized on exactly these grounds, i.e. that by his system, when the authorities come to your door to look for a friend you're harboring, you should turn him in. I briefly scanned for clever Kant references (e.g. "introduce the brownshirts to your strangely-named cat, Egorial Imperative") but found none. Kant clarified that he did not think it immoral to lie to authorities looking to execute your friend.
The critic was Benjamin Constant. He wrote:
The moral principle stating that it is a duty to tell the truth would make any society impossible if that principle were taken singly and unconditionally. We have proof of this in the very direct consequences which a German philosopher has drawn from this principle. This philosopher goes as far as to assert that it would be a crime to tell a lie to a murderer who asked whether our friend who is being pursued by the murderer had taken refuge in our house.
For Kant's reply, see his essay On a Supposed Right to Lie Because of Philanthropic Concerns.
↑ comment by thomblake · 2009-04-03T15:36:37.535Z
Thanks for the citation - that's actually exactly the sort of thing that informed the background of my original comment, but I didn't have time to look it up. Though I was of course at the time addressing a virtue ethics reading of the problem, not a Kantian one.
comment by Jonnan · 2009-04-01T04:18:05.715Z
I think you're undervaluing simple respect in the equation, as opposed to strict honesty. There is the potential for simply telling your boss - "I don't have the skill required to explain this in layman's terms yet, and you don't have the skill required to evaluate this as raw data yet, but we have a serious problem. Give me time to get someone smarter than me to either debunk this or verify it."
It has worked for me numerous times.
comment by igoresque · 2009-03-31T21:03:00.200Z
Not that it matters to the discussion, but Anne Frank lived in Amsterdam, Netherlands (not in Nazi Germany). As a native of the city I can't not make this remark. There have been many Jews in Amsterdam for centuries, until WW2. But it would be relatively hard to find German establishments there. Sorry for being off-topic.
comment by Stuart_Armstrong · 2009-04-01T10:39:33.916Z
If you'll lie when the fate of the world is at stake, and others can guess that fact about you, then, at the moment when the fate of the world is at stake, that's the moment when your words become the whistling of the wind.
But there's implicit assumptions there; for instance that you can credibly signal that you would not lie with the fate of the world at stake. You can do this by building up something of a reputation, but most people will not believe that you will scale your truth-telling to the "fate of the world" level.
There is only one way to do this: to have told the truth when the fate of the world was at stake, when it would have been advantageous to lie. Preferably, to have done this more than once.
Similarly, the only way of being entirely credible when the Nazis ask you "are you hiding Jews?" is if you have previously hidden Jews, and previously denounced them when asked.
comment by blink · 2009-04-01T02:49:22.331Z
I am having trouble with the Nazi scenario because it seems paradoxical. Hiding is implicitly lying, so the always honest person should not be able to get into this situation in the first place. (The family could make this explicit simply by asking what you would do if the Nazis came.) Turning the family away may lead to horrible deaths as well, so this may not be an improvement. One may simply be the type of person who is too honest to hide a fugitive, no matter the benefit.
↑ comment by Rings_of_Saturn · 2009-04-01T04:44:10.788Z
Yes, I think that you are just shunting the moral problem here down the pike.
Maybe you can imagine you just discovered Anne Frank's family was living in your attic moments before the brownshirts come knocking?
... and resume with dilemma.
comment by lloyd · 2012-09-14T01:22:19.247Z
The basis for honesty is an argument for the development of an egalitarian relationship. If the relationship is not based on equality then dishonesty is an inevitable result in resolving moral dilemmas. In the example case there is no reason to consider whether or not deception in words should mirror the deception of hiding filthy Jews. To split hairs further, it is absolutely impossible to convey the truth in language. The allusion of the 1st quote is towards this understanding: anything contained in language is only an approximation of the truth. So how honest can we really be?
comment by Kevinburke · 2009-04-01T07:48:19.064Z
Success in option 5 has much more to do with persuasiveness than rationality. I believe that a hyper-rational Urkel would fail every time.
Is it ethical to use irrational persuasion techniques to convince people to become more rational?
↑ comment by MBlume · 2009-04-01T09:42:52.515Z
I don't see rationality as meaning we become hyper-rational Urkels. Our emotional faculties are part of us, and we should strive to grow to use them wisely. There is nothing irrational in this.
Persuasion as such is a Dark Art. However, there is an Art to telling the truth well, of telling it clearly, so that it can be understood. This is the art which I will spend my life attempting to master. Were one sufficiently advanced in this art, it seems to me that one could simply explain to a couple of brownshirts why they ought not to be brownshirts any more. I do not expect to reach this level on this side of the singularity, but I see no reason that it could not be done.
comment by Demosthenes · 2009-04-01T00:24:29.967Z
If you'll lie when the fate of the world is at stake, and others can guess that fact about you, then, at the moment when the fate of the world is at stake, that's the moment when your words become the whistling of the wind.
Is it correct to interpret this as similar to Pascal's Wager? The possibility of a fate-of-the-world moment is very low but the payout for being an honest fellow in this case is huge?
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-04-01T02:35:04.646Z
No, it's a similar dilemma at all scales - the point is that it doesn't change just because the stakes are large.
comment by billswift · 2009-04-01T17:45:37.486Z
Don't lie, just kill all of the Nazis. http://williambswift.blogspot.com/2009/03/violence.html
↑ comment by thomblake · 2009-04-03T15:28:27.175Z
just kill all of the Nazis?
Are you serious? A German factory-worker with a family (possibly a Nazi himself who happens to not be anti-semitic) should personally kill all the Nazis? This seems to you like a way to cut this particular knot?