What are the components of intellectual honesty?

post by abramdemski · 2019-01-15T20:00:09.144Z · 1 comment

This is a question post.


I have a pretty strong intuition that "intellectual honesty" points to a specific thing, with a characteristic set of behaviors (both outward (in a conversation/writing) and inward (patterns of thinking)). My concept of intellectual honesty also seems to largely coincide with the concept as used by others, but I'm not sure.

I am talking about a high standard. This isn't at all about violating a code of academic ethics.

I am talking about the sort of thing which helps to foster epistemic trust, helping to take conversations to higher levels. (But don't follow those links if you already have a clear enough sense of what I mean to answer, and want to avoid anchoring your view to mine.)

Answers

answer by Gordon Seidoh Worley (G Gordon Worley III) · 2019-01-15T20:33:24.011Z

When it comes to matters of honesty, or more broadly to behavior that works towards shared epistemic ends (i.e. pro-social epistemic behavior), I tend to think the main issue is whether or not we see evidence of deception.

My reason for focusing on deception rather than, say, truth or facts is that I don't think we can reliably assess those things to a fine enough degree to avoid getting stuck in a debate with infinite regress. But even if you disagree with my epistemological stance, I still think treating honesty as separate from questions of fact helps, because most of what we care about in honesty is how we relate to reality and to the facts we think we know about it, rather than the facts themselves. That is, we want a notion of honesty that allows for honest mistakes, so to me the way to get one is to move away from directly judging epistemic actions and instead look at the actions that inform epistemic behavior.

Thus I tend to think of honesty as the opposite of deception. If in deception one is trying to confuse or mislead others as to what one believes the facts are, in honesty one is trying to deconfuse and show others plainly what one believes to be true. Honesty is, in this way, a kind of virtue we can cultivate: being straightforward and upright in presenting our beliefs, not hiding or distorting things for purposes other than seeing reality without hindrance.

To add more subtlety, I think there is also an active/passive component to honesty and deception. Sometimes people are aware of what they are doing and actively trying to deceive, like the villain in a plot; other times they are unaware and passively deceiving without intent, as when people forget things that are uncomfortable for them, or when hidden beliefs they wouldn't necessarily endorse warp their perspective so that they can't see things as they are. This is not to make a moral distinction (although I suppose you could do that on this basis), but to point out that deception is often sneaky: even a person who is not actively being dishonest may still fail at honesty because of passive deception that performs through them.

Total, radical honesty, then, is just what happens when we stop even passively deceiving ourselves. Quite the virtue to strive for, but in the context of something like epistemic trust, it helps make sense of why some people are more deserving of trust than others, even if no one is actively trying to deceive.

comment by Raemon · 2019-01-16T03:53:52.199Z

I like this answer, but it made me think about this post on privacy, which argues that radical honesty can end up leading you to self-deceive so that you don't accidentally reveal damaging things. This isn't precisely an argument against your frame, just something to consider as you go about trying to cultivate intellectual honesty.

answer by Dagon · 2019-01-15T21:55:40.416Z

I think that most of what people call "intellectual honesty" would be more accurately called "epistemic humility". It's not just about trying to minimize deception and bias; it's about recognizing that doing so perfectly is an impossible task, and extending the same allowance for possibly-wrong beliefs to others that you extend to yourself.

Which isn't to say that all wrong beliefs are equally acceptable, just that there's a wider range of reasonable beliefs than you probably realize when you're discussing/debating.

1 comment


comment by dxu · 2019-01-16T01:54:37.319Z

(Posted as a comment rather than an answer because all of this is pretty rambling, and I'm not super-confident about any of the stuff I say below, even if my tone or phrasing seems to suggest otherwise.)

For the purposes of a discussion like this, rather than talk about what intellectual honesty is, I think it makes more sense to talk about what intellectual honesty is not. Specifically, I'd suggest that the kinds of behavior we consider "intellectually honest" are simply what human behavior looks like when it's not being warped by some combination of outside incentives. The reason intellectual honesty is so hard to find, then, is simply that humans tend to find themselves influenced by external incentives almost all of the time. Even absent more obvious factors like money or power, humans are social creatures, and all of us unconsciously track the social status of ourselves and others. Throw in the fact that social status is scarce by definition, and we end up playing all sorts of social games "under the table".

This affects practically all of our interactions with other people, even interactions ostensibly for some other purpose (such as solving a problem or answering a question). Unless people are in a very specific kind of environment, by default, all interactions have an underlying status component: if I say something wrong and someone corrects me on it, I'm made to seem less knowledgeable in comparison, and so that person gains status at my expense. If you're in an environment where this sort of thing is happening (and you pretty much always are), naturally you're going to divert some effort away from accomplishing whatever the actual goal is, and toward maintaining or increasing your social standing. (Of course, this behavior needn't be conscious at all; we're perfectly capable of executing status-increasing maneuvers without realizing we're doing it.)

This would suggest that intellectual honesty is most prevalent in fields that prioritize problem-solving over status, and (although confirmation bias is obviously a thing) I do think this is observably true. For example, when a mathematician finds that they've made a mistake, they pretty much always own up to it immediately, and other mathematicians don't respect them less for doing so. (Ditto physicists.) And this isn't because mathematicians and physicists have some magical personality trait that makes them immune to status games--it's simply because they're focused on actually doing something, and the thing they're doing is more important to them than showing off their own cleverness.

If you and I are working together to solve a particular problem, and both of us actually care about solving the problem, then there's no reason for me to feel threatened by you, even if you do something that looks vaguely like a status grab (such as correcting me when I make a mistake). Because I know that we're fundamentally on the same side, I don't need to worry nearly as much about what I say or do in front of you, which in turn allows me to voice my actual thoughts and opinions much more freely. The atmosphere is collaborative rather than competitive. In that situation, both of us can act "intellectually honest", but importantly, there's not even a need for that term. No one's going to compliment me on how "intellectually honest" I'm being if I quickly admit that I made a mistake, because, well, why would I be doing anything other than trying to solve the problem I set out to solve? It's a given that I'd immediately abandon any unpromising or mistaken approaches; there's nothing special about that kind of behavior, and so there's no need to give it a special name like "intellectual honesty".

The only context in which "intellectual honesty" is a useful concept is one that's already dominated by status games. Only in cases where the incentives are sharply aligned against admitting that you're wrong does it become something laudable, something unusual, something to be praised whenever someone actually does it. In practice, these kinds of situations crop up all the time because status is something humans breathe, but I still think it's useful to point out that "intellectual honesty" is really just the default mode of behavior, even if that default mode is often corrupted by other stuff.