The Incoherence of Honesty

post by Gordon Seidoh Worley (gworley) · 2018-06-08T02:28:59.044Z · 16 comments


Some rationalists have what feels to me like an obsession with honesty. That's fine: we can all be obsessed with our own concerns. But I also think it's a rather strange sort of obsession given the nature of truth and our relationship to it, even if we assume the framing of Bayesian epistemology. This is a brief case that most thinking about honesty rests on shaky ground (and is therefore strictly incoherent) as presented, and only makes sense through the lens of a generous, reconstructive reading.

Let's first be clear about what "honesty" means. The word "honesty" has its origins in "honor" and came to be associated with truth-telling by way of conflating all the moral virtues together. I'm going to diverge from this history and use "honesty" as a technical term with a precise definition. I do so because, oddly, English has the single-word verb "to lie" but must use a verb phrase for its antonym, "to tell the truth". This is unfortunate because "to lie" is the only word of the bunch that does not attempt to make a normative statement about the subject's actions or imply a disposition in the action or the subject, but it would be an odd circumlocution to talk about antilying, so "honesty" it is.

(As we'll see, though, there's perhaps good reason for this: it's much easier to talk about what it means to lie than what it means to not lie!)

We might like to say a statement is honest if it is a true statement, conflating honesty with truth as English tends to do, but this ignores that a person can be mistaken, and we would not call them a liar for stating what they believed to be the truth, i.e. when they did not try to deceive. So instead we could say a statement is honest if it is a true statement about a person's knowledge, but this phrasing has two problems. First, it supposes that truth is a quality a statement can have, rather than an assessment someone makes of a statement against some criteria; it would be safer to say that a statement is honest for a particular subject if they believe it to be a true statement about a person's knowledge. But perhaps you are an essentialist (you believe ontology precedes epistemology) and this is not a problem for you. No worries, because the second problem with this definition of honesty is that it still supposes we can reliably tell what is true, contrary to epistemic circularity.

Since it's important, I'll elaborate. The problem of epistemic circularity is that to know something reliably, something else must first be reliably known. Its core identification comes from the problem of the criterion, which observes that to know something reliably we must know the criteria by which things are reliably known, but to know the criteria by which things are reliably known is to know something reliably. It is tied to the problems of infinite regress and induction, and it creates an unreliable gap in our knowledge, positivists be damned, that prevents us from really knowing anything. I'll admit this is mainly a philosophical problem, because it has a pragmatic solution: assume some hinge propositions to be true and get on with life well enough to keep living. But it's worth knowing about because it is the source of uncertainty in all epistemology.

So if we want to talk about honesty in terms of truth we'll be hard pressed to do so in a coherent way because, while it does not necessarily require resolving the nature of truth, it does at least require resolving the question of assessing truth. Instead I think it makes sense to talk about honesty without appealing to truth, but then it turns into something rather weird: a statement about beliefs and beliefs about beliefs. Specifically, we might say a statement is honest if the subject making the statement believes the statement will lead listeners to believe the statement to be as likely to be true as the subject does. This makes honesty about beliefs rather than truth, and I think this is the undoing of much excitement about honesty.
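
To make this belief-centric definition concrete, here is one way to write it down formally. This is a sketch in notation of my own invention (the symbols B_S, P_S, and P_L are not from the post), under the simplifying assumption of a single listener L:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Sketch of the belief-centric definition of honesty. The notation is
% introduced here purely for illustration: P_S and P_L are the credences
% of speaker S and listener L, and B_S[...] is what S believes will happen.
\[
\operatorname{Honest}_S(\phi) \iff
B_S\bigl[\, P_L(\phi \mid \text{$S$ asserts $\phi$}) \approx P_S(\phi) \,\bigr]
\]
% Read: S's assertion of phi is honest iff S believes that hearing the
% assertion will bring the listener's credence in phi in line with S's own.
\end{document}
```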

The trouble is that beliefs, unlike the reliable yet unobtainable knowledge of reality we might call truth or facts, are so unreliable in practice that we often don't even know our own beliefs. This makes any thoroughgoing attempt to be "honest" doomed from the start, because we can't even accurately assess the likelihoods we assign to our own beliefs, let alone the beliefs of others. Even if we strive for something like Christiano's integrity we face serious issues of unreliability, and so I'm left wondering what this honesty business is even important for.

Obviously, virtue signaling. Making strong statements about honesty signals virtue and makes you appealing to ally with. But putting such extra-material reasons aside, I'm inclined to conclude that folks' interest in honesty is either due to their using "honesty" to mean something coherent but left undefined, or due to not fully updating on the fundamental unreliability of knowledge that epistemic circularity implies. I say this because I don't even know how to really trust my own beliefs, let alone the beliefs of others, but I don't view this as a problem to be solved so much as a reality to be lived with. Knowledge is unreliable, we will always be forced to reason unreliably, and concerns about honesty and truth distract from the real problem of better predicting future experiences (including experiences of learning about present and past events) and aligning those future experiences with one's values.

If you want to talk about coordination under uncertainty rather than honesty, that sounds much more interesting to me!

16 comments


comment by ChristianKl · 2018-06-08T17:22:36.243Z

There's one great cybernetics book with the German title "Wahrheit ist die Erfindung eines Lügners", which translates to "Truth is the invention of a liar". It's unfortunate that cybernetics fell out of fashion and lost in the intellectual marketplace against probabilistic thinking. Forty years ago there were a lot of smart nerds in cybernetics, and our community would likely get a lot out of integrating it more.

Especially when it comes to the article about meta-honesty, I was also thinking to myself: "It's not trivial to know beforehand whether one will lie in a certain situation. In the back of my mind there's the thought that the studies we have likely showed that people are quite bad at that kind of introspection about possible future situations."

All the examples also seemed very detached. Quite practically, emotions matter a great deal when it comes to whether or not people lie, and the abstract examples of the meta-honesty post didn't contain any information about the emotional state of the speaker. If we see honesty as being about telling the truth, I think it's questionable how honest a reply to the question of whether you would lie in situation X can be when it doesn't consider the emotional state in which the situation occurs.

I find the kind of honesty I saw at a Radical Honesty workshop, which doesn't include any "you shall not lie" or "you shall tell the truth", much more desirable than what Eliezer proposed in his post. I haven't heard the word cybernetics at a Radical Honesty workshop, but it still works in that general paradigm. It's also worth noting that withholding information counts as lying in the Radical Honesty context (e.g. glomarization would be seen as lying).

Replies from: TAG
comment by TAG · 2018-06-09T11:10:38.902Z

How does cybernetics help with epistemology?

Replies from: ChristianKl
comment by ChristianKl · 2018-06-09T22:11:07.813Z

If we admit that we don't know the truth, we can still calibrate ourselves with reality through feedback processes.
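
A minimal sketch of what such a feedback process might look like (my own illustration, not from any cybernetics text): an estimator that never claims access to the truth, but adjusts itself against each observed outcome.

```python
# A minimal sketch (my illustration, not from the cybernetics literature):
# calibrating an estimate against reality purely through feedback,
# without ever claiming direct access to "the truth".

def calibrate(observations, estimate=0.5, learning_rate=0.1):
    """Nudge a probability estimate toward each observed outcome (0 or 1)."""
    for outcome in observations:
        error = outcome - estimate          # feedback signal from reality
        estimate += learning_rate * error   # correct in the direction of the error
    return estimate

# After enough feedback the estimate tracks the empirical frequency,
# even though no step ever checked it against ground truth directly.
print(calibrate([1, 1, 0, 1, 1, 0, 1, 1]))
```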

For those who know German, https://media.ccc.de/v/24c3-2334-de-die_wahrheit_und_was_wirklich_passierte is a great example of that kind of skeptical thinking.

In the case of Radical Honesty, it comes out of Gestalt therapy, which was founded by Fritz Perls, who was influenced by cybernetics. Practically, it has a lot to do with speaking out the emotions that one feels in response to the environment. Unfortunately, I can't do it justice in a few paragraphs, and I didn't get what it was about myself before attending workshops.

Replies from: TAG
comment by TAG · 2018-06-10T13:29:00.016Z

I think you can be sceptical about know-that knowledge whilst accepting the existence of know-how knowledge. Doesn't have much to do with cybernetics, though.

Replies from: gworley
comment by Gordon Seidoh Worley (gworley) · 2018-06-10T15:16:43.857Z

Cybernetics is relevant since learning about the world is a project necessarily carried out by cybernetic systems. It has something to say about anything where we have feedback.

comment by ESRogs · 2018-06-08T20:44:55.587Z

I'm not sure I understand -- what is the claim or hypothesis that you are arguing against?

The point about the limits of knowledge is well taken (and also a familiar one around here, no?), but I'm not sure what that implies for honesty.

Surely you would agree that a person or statement can still be more or less honest?

Is the idea that there's nothing much to be gained by trying to be especially honest -- that there's no low-hanging fruit there?

Replies from: gworley
comment by Gordon Seidoh Worley (gworley) · 2018-06-08T21:23:40.315Z

The point about the limits of knowledge is well taken (and also a familiar one around here, no?), but I'm not sure what that implies for honesty.

I'm not so sure it is a familiar point, or at least not as familiar as it should be. Maybe it's just me, but I continue to read a strong positivist undercurrent in the expressed thinking of many rationalists, even with Bayesian epistemology layered on top. I think of this as a sort of half measure: saying "we can't know things 100% for sure" but then still assuming we can use experiential data to know something about everything, with the only obstacle being gathering more data within the available time bounds. As I argue above, there are deeper-seated blind spots that make this sort of approach impossible without first making some assumptions, like, say, that sense data is data about a world that exists independent of the subject (the typical scientific materialist position). I'm not opposed to making assumptions like these, since they are necessary to address epistemic circularity, but I think there is a strong tendency to forget that these assumptions are metaphysical speculation, which results in weird things, like overly valuing naive notions of honesty and truth.

Surely you would agree that a person or statement can still be more or less honest?

Given the way I eventually define honesty above, sure, but at that point I think I've gutted most of what people care about when they talk about honesty. To me, the fact that someone would bother to talk about honesty rather than coordination suggests they are holding on to something like an unsupportable view of truth essentialism.

Is the idea that there's nothing much to be gained by trying to be especially honest -- that there's no low-hanging fruit there?

I'd sort of say so, insofar as we consider naive notions of honesty. There are hard problems around how to coordinate and how to convince others of things you want to convince them of, like perhaps that they should have high credence in the things you say. That is sort of a reasonable reinterpretation of what people are talking about when they talk about honesty, but it's also importantly different, because it abandons any attempt to use normative assumptions about truth to simplify the problem.

comment by cousin_it · 2018-06-08T07:16:19.677Z

Its core identification comes from the problem of the criterion, which observes that to know something reliably we must know the criteria by which things are reliably known, but to know the criteria by which things are reliably known is to know something reliably.

Not sure how any knowledge could be reliable in an absolute sense. Heck, it’s possible that 2+2=5 under the usual rules of arithmetic, and demons are constantly adjusting everyone’s thoughts to maintain the lie that 2+2=4. The best we can get is knowledge that was output by some machine, with no guarantee that the machine outputs only true things. Of course we could compare the outputs of different machines or use a machine to inspect its own parts, but that still amounts to a consistency check, not a truth check. Colloquial "truth" then means something that passes many consistency checks against many other things, and we shouldn't desire anything more absolute than that.
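
To make the consistency-vs-truth distinction concrete, here is a toy illustration of my own (not cousin_it's): two independently written "machines" that can be checked against each other, where agreement establishes only consistency.

```python
# Toy illustration (mine, not cousin_it's): checking two "machines"
# against each other. Agreement shows the machines are consistent with
# each other -- it can never show that either one outputs the truth.

def machine_a(x, y):
    """Add using the built-in operator."""
    return x + y

def machine_b(x, y):
    """An independently written implementation: repeated increment."""
    result = x
    for _ in range(y):
        result += 1
    return result

# The check only ever reports agreement or disagreement. A shared bug
# (or cousin_it's demons) would pass it just as easily.
assert machine_a(2, 2) == machine_b(2, 2)
print("Machines agree; truth not thereby established.")
```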

Replies from: TAG
comment by TAG · 2018-06-09T11:23:27.641Z

The starting point is that knowledge is defined as the output of a reliable process. If that definition is correct, and there are no reliable processes, the conclusion should be that there is no knowledge. You seem to be saying that there clearly is knowledge... but then how are you defining it?

Replies from: cousin_it
comment by cousin_it · 2018-06-09T22:39:42.167Z

I think knowledge is relative to a particular process.

Replies from: TAG
comment by TAG · 2018-06-10T13:20:56.148Z

There isn't much content there. I don't think it's impossible to solve epistemology, but I don't think it's possible in one sentence.

Replies from: cousin_it
comment by cousin_it · 2018-06-10T21:02:30.092Z

My ambition doesn't go as far as solving epistemology. But it seems to me that the problem of the criterion relies on the sentence "everything requires justification", which sounds wrong. I believe a different sentence instead: "everything except cousin_it's axioms requires justification". Call me self-serving, but it just sounds so right, and I can't seem to derive a contradiction from it! :-)

Of course things are more fuzzy in practice. The "axioms" are more like laws, with provisions for amending themselves. So your beliefs could also be described as "everything except TAG's axioms requires justification" etc. And a group of people can have shared knowledge justified by the common subset of their axioms, without appealing to anything universal.

That still leaves the question of what our axioms/laws actually say and where their self-amendment leads. But I suspect the answer to that is complicated and path-dependent, like everything else about us.

Replies from: TAG
comment by TAG · 2018-06-13T13:46:51.863Z

Epistemology remains unsolved because dropping the requirement for knowledge (not "everything") to be justified creates further problems. In particular, treating individual axiom sets as true leads to relativism.

Replies from: cousin_it
comment by cousin_it · 2018-06-13T14:52:52.602Z

I only treat my axiom set as true. Is that relativism? What problems does it lead to?

Replies from: Ikaxas
comment by Vaughn Papenhausen (Ikaxas) · 2018-06-14T02:13:06.490Z

What does your epistemology recommend for others? For example, should I:

1. treat cousin_it's axioms as true?

2. treat Ikaxas's axioms as true?

3. Something else?

If the first, why should the rule be

C: For all x, x should treat cousin_it's axioms as true

rather than say "treat TAG's axioms as true" or "treat Barack Obama's axioms as true" or "treat Joe Schmoe's axioms as true"? Don't symmetry considerations speak against this "epistemological egoism"?

If the second, then the rule seems to be

A: For all x, x should treat x's axioms as true.

This is pretty close to relativism. Granted, A is not relativism--Relativism would be

R: For all x, x's axioms are true for x

--but it is fairly close. For all that, it may in fact be the best rule from a pragmatic perspective.

To put this in map-territory terminology: the best rule, pragmatically speaking, may be, "for all x, x should follow x's map" (i.e. A), since x doesn't really have unmediated access to other people's maps, or to the territory. But the rule could not be "for all x, the territory corresponds to x's map," (i.e. R) since this would either imply that there is a territory for each person, when in fact there is only one territory, or it would imply that the territory contains contradictions, since some people's maps contain P and others' contain not-P.

Alternatively, perhaps your epistemology only makes a recommendation for you, cousin_it, and doesn't tell others what they should believe. But in that case it's not complete.

Also, it's not clear to me what "everything except cousin_it's axioms requires justification" has to do with the original statement that "knowledge is relative to a particular process." That statement certainly seems like it could be open to the charge of relativism.

Replies from: cousin_it
comment by cousin_it · 2018-06-14T06:45:04.963Z

Of course my honest advice to others is that they should follow my axioms! :-)

For example, let's say I have an axiom that the winning lottery number is either 1234 or 4321. You're thinking about playing the lottery, and have an opportunity to snoop the first digit of the winning number before making a bet. Then in my opinion, the best strategy for you to achieve your goal is to bet on 1234 if you learn that the first digit is 1, or bet on 4321 if you learn that the first digit is 4. Learning any other first digit is in my opinion impossible for you, so I don't have any advice for that case. And if you have an axiom of your own, saying the winning number is either 1111 or 4444, then in my opinion you're mistaken and following your axiom won't let you achieve your goal.
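
For concreteness, here is that advice rendered as a small decision procedure (my own rendering of the example, not cousin_it's code):

```python
# cousin_it's lottery example as a decision procedure (my rendering).
# Axiom: the winning number is either 1234 or 4321.

def best_bet(first_digit):
    """Advice under the axiom {1234, 4321}, given the snooped first digit."""
    if first_digit == 1:
        return 1234
    if first_digit == 4:
        return 4321
    # Under the axiom, observing any other first digit is impossible,
    # so no advice exists for that case.
    raise ValueError("Observation inconsistent with the axioms; no advice.")

print(best_bet(1))  # -> 1234
print(best_bet(4))  # -> 4321
```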

That seems like the only reasonable way to think about beliefs, no matter if they are axioms or derived beliefs. Symmetry considerations do matter, but only to the extent they are themselves part of beliefs.

But I'm not sure why my advice on what you should do is relevant to you. After all, if I'm right, you will follow your own axioms and use them to interpret anything I say. The only beliefs we can agree on are beliefs that agree with both our axiom sets. Hopefully that's enough to include the following belief: "a person can consistently believe some set of axioms without suffering from the problem of the criterion". Moreover, we know how to build artificial reasoners based on axioms (e.g. a theorem prover) but we don't know any other way, so it seems likely that people also work that way.