Nate Soares' "Assuming Good Intent"

post by Raemon · 2017-04-30T17:45:10.088Z · 5 comments

This is a link post for http://mindingourway.com/assuming-positive-intent/


Comments sorted by top scores.

comment by Raemon · 2017-04-30T17:46:30.407Z

This is a recent post from Nate's Minding Our Way, touching on some disagreements that have come up lately within the rationality and EA communities.

comment by RomeoStevens · 2017-05-02T00:27:53.129Z

I'm not sure things have actually changed; I think some people's perception is getting sharper. But on the flip side, there is the notion of not regarding an entire person as wholly bad or wholly well-intentioned. People are loose coalitions.

comment by tristanm · 2017-04-30T23:51:32.931Z

I agree that it would be weird to assume that we aren't all fairly closely aligned with each other - especially within the rationality community. But I think there are a few factors that might cause someone to assume another person is acting adversarially toward them when that's not really the case.

First, no one is completely transparent, especially during times of disagreement. When we disagree with each other, we might present only a filtered list of our reasons, and be particularly careful with how we word them. The intent is to do the least harm to our listener, but if not done carefully it can inadvertently create an impression of secrecy or evasiveness. So when having a conversation over a disagreement, I think it is important either to be as forthright as possible, or at least to indicate that there is some context or other information producing one's final assessment of the issue that may not be possible to share or fully articulate at the moment. Aumann's agreement theorem says that two Bayesian reasoners with a common prior cannot agree to disagree: once their posterior beliefs about an event are common knowledge, those posteriors must be equal. So in any persistent disagreement we either have different information or different priors and intuitions guiding our thought processes. Increased transparency should therefore probably mitigate this issue.
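
For reference, here is a minimal statement of the theorem, my paraphrase; the event E, prior P, and information partitions are the standard textbook symbols, not anything from Nate's post:

\[
\text{If agents } 1 \text{ and } 2 \text{ share a common prior } P \text{, and their posteriors } q_i = P(E \mid \mathcal{I}_i) \text{ are common knowledge, then } q_1 = q_2 .
\]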

The second issue is that even if our goals (utility functions, if you want to think of them that way) are closely aligned, small differences between people's goals in certain areas can still cause different actions - perhaps even dramatically different ones - to be taken. This is a harder issue, because it is difficult to say when differing goals, rather than simply differing beliefs, are what is driving the different actions. This is probably behind some choices of which type of utility to pursue first - say I want to focus on gaining status or prestige, or money, or some other form of capital, before developing a skill that will help me obtain a goal. All of those actions might help me attain that goal, but to an outside observer who has seen only one of them, it may look as though my motives are completely different from my true motives.
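
To make that concrete, here is a toy sketch in Python - the actions, their values, and the weights are all invented for illustration, not taken from anything above. When choosing an action means taking an argmax over weighted values, a tiny shift in the weights can flip which action wins:

# A toy model (hypothetical numbers): each action yields a tuple of
# (status, money, skill) value, and an agent picks the action that
# maximizes the weighted sum under its own utility weights.

ACTIONS = {
    "network for prestige":   (0.9, 0.2, 0.1),
    "take a high-paying job": (0.3, 0.9, 0.2),
    "study a new skill":      (0.1, 0.1, 0.9),
}

def best_action(weights):
    """Return the action maximizing the weighted sum of its values."""
    return max(ACTIONS, key=lambda a: sum(w * v for w, v in zip(weights, ACTIONS[a])))

# Two agents whose weights over (status, money, skill) differ by only 0.02
# end up taking entirely different actions, because argmax is discontinuous:
print(best_action((0.46, 0.34, 0.20)))  # -> "network for prestige"
print(best_action((0.44, 0.36, 0.20)))  # -> "take a high-paying job"

So two people whose values overlap almost entirely can still visibly pursue different things, and an observer who sees only the chosen action can easily infer the wrong motives.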

Human brains also seem to be hardwired to get excited about politics and status games. I don't fully understand why, but it might at best cause us to look for signs of competition where there aren't any, and at worst subtly motivate our actions to be more adversarial toward each other than is necessary. This seems like a problem that could be mitigated simply by improving our general rationality.

Overall, the solution, as far as I can tell, seems to converge on greater transparency. And of course, simply being nicer to each other.

Replies from: Dagon
comment by Dagon · 2017-05-01T15:59:01.763Z

Overall, the solution, as far as I can tell, seems to converge on greater transparency.

Ehn. Complete transparency is impossible and often counterproductive. No individual or organization has a correct and concise model of themselves that can be communicated to others. Attempts to do so slow things down and inevitably mislead the conversation into minutiae that mask deeper disagreements over values or beliefs. And those disagreements ALSO imply that there's a strategic advantage to being somewhat opaque.

And of course, simply being nicer to each other.

Hard to argue against that!

Replies from: tristanm
comment by tristanm · 2017-05-01T16:22:25.233Z

Ehn. Complete transparency is impossible and often counterproductive. No individual or organization has a correct and concise model of themselves that can be communicated to others. Attempts to do so slow things down and inevitably mislead the conversation into minutiae that mask deeper disagreements over values or beliefs. And those disagreements ALSO imply that there's a strategic advantage to being somewhat opaque.

Transparency is something I strongly believe is desirable, and my prediction is that many other people consider it a good as well, and that they (we) are unlikely to be swayed by arguments like these. It's not that I disagree that accurate communication is costly - I agree that it is - and it's not that I believe there are no risks associated with increased transparency. It's just that I think the alternative is much, much worse, and that the short-term gains of opacity are far outweighed by the long-term benefits of greater transparency.

But I believe this may come down to, as you said, deeper disagreements over values and beliefs, and I don't expect we will reach consensus on this very easily.