More or Fewer Fights over Principles and Values?
post by Ben Pace (Benito), Vaniver · 2023-10-15T21:35:31.834Z
I wrote:
I wish that people would generally get into more conflicts over the values and principles that they hold
Vaniver wrote:
I think we are unusually fractious and don't value peace treaties and fences as much as we should
I wrote:
Fight me :)
Start of Dialogue
People seem to me to be very conformist and most of the values that people want to build around get overwhelmed by the aggressively conventional-minded people (of Paul Graham's essay). My sense is that people who fight for their values in honorable ways just make the world so much better and we should all aspire to be more like that.
How much of this is a difference between "people" and "we"? I was imagining, like, LessWrongers, instead of people that I went to college with. Who specifically are you talking about?
I mean, many LWers are people with ordinary-enough lives who have enough sanity to see that the world could be better, but still most of them just fade away with time. So I'm including LWers in this too.
I think now maybe I'm also confused about what specifically you mean by values and principles and so on. I like for things to be high-quality and notice flaws; also I have developed an attitude of tolerance about this to be a bit less bitchy. It's not yet clear to me if you're trying to argue that I should be more critical of things in general.
Similarly, one of the things that I'm more in touch with now than before is animal suffering; I could fight with housemates who buy chicken because it's cheaper, telling them I think they're doing something hideous, but this goes against my sense of good neighborliness.
(I think the thing where I most feel like you're right that I could be more fractious is, like, talking with people about how much time I think we have left and how much I think various political changes are necessary; I think here I have learned helplessness from being a rare libertarian instead of actually endorsing how fighty I am.)
I have historically found myself with few allies in the comment section of the EA Forum for most of the history of EA, when I tried to stand up for letting people speak their mind and not be subject to crippling PR constraints.
Also I have found myself supportive of people who I thought were in the EA ecosystem for the values and for doing what's right, and then when I pointed out times that the ecosystem as a whole was not doing that, someone said to me "I try with a couple of percentage points of my time to help steer the egregore, but mostly I am busy with my work and life", and I had the wind knocked out of my sails a bit.
And yet I think it often has outsized effects and the costs aren't that big.
I agree that the fighting over animals thing sounds like a lot of friction and probably not worth it... my proposal is not to be generally more annoying, but to orient more to conversations with people as hashing out key disagreements and principles. For instance, I think if you lived in my house and were always uneasy about me eating chicken yet never made the argument to me, that would feel like a mistake. And I think it would also be good for you to look for trades where you could get this value satisfied.
(One of the principles of my group house is 'face conflicts directly and vulnerably'; I agree it would be a mistake to let resentments fester rather than ever bringing them up. I think one of the questions here is "should I be stoking this resentment or smothering it?".)
I think this points to a possible crux: it seems to me like the core distinguishing factor between 'annoying' conversations and cooperative ones is whether both people have bought in to the particular conflict. If they also think that animal suffering is important but don't share my view on chickens vs. beef, then we can easily have a good conversation about orienting to the world together.
But if I'm like "hey I kind of think that we're complicit in a colossal crime and you could just not do that relatively cheaply" this is opening up a big worldview shift for them unilaterally? And many people oppose that.
I don't see why I should respect someone's desire not to have a big worldview shift toward truth.
I mean, this is a 'peace treaty' where you don't do it to them and they don't do it to you. I have the sense that, among rationalists, that treaty mostly got repealed (except you still thought it would be annoying ;) ) but it's not obvious that this is one of our functional adaptations!
So, maybe I can move to a slightly more circumspect position, which is not "locally become more annoying and argumentative", but something like (note: this is going to be poorly stated at first, I will iterate on it):
Maneuver your environment and life such that at least 20% of your conversations involve arguing for the principles and values that you care about.
For the record I have sort of given up any interest in the frame that rationalists are weird and it's unreasonable to want people to share rationalist norms. There's a Krishnamurti quote
It is no measure of health to be well adjusted to a profoundly sick society.
I bring this up to say, if most people get along okay by having a peace treaty not to hear about major ways in which their worldview is wrong, I consider this ~no evidence that this is a reasonable treaty to hold.
Hmm, I think 20% of your conversations is, like, a lot! And not very flexible based on your circumstances.
I think a related frame that seems better to me is, like, the fraction of your close friends that you have values disagreements with.
(I generally think people should have friends and acquaintances that they have value disagreements with. My default behavior is to hide and deflect on a bunch of 'irrelevant' features, I think downstream of growing up closeted, but it seems easy to imagine that I should be more open about a bunch of these things.)
I think the main downside of openness that I'm worried about is—even if I'm tolerant of variance, defining myself means that I'm at risk from everyone who is intolerant of variance. (I might think that people are making mistakes by eating chicken but it's mostly fine, whereas someone who thinks that I'm judging them for eating chicken now doesn't want to be friends with me anymore, and so I gave up a bunch for a relatively small amount.)
But of course the numbers matter here! I basically no longer expect to lose anything (on net) by people knowing that I'm gay; maybe also I should no longer expect to lose anything on net by being open about various values and principles.
I bring this up to say, if people have a peace treaty not to hear about major ways in which their worldview is wrong, I consider this ~no evidence that this is a reasonable treaty to hold
Yeah I don't want to make the majoritarian argument here; I want to say that we should look at it and figure out whether or not it makes sense.
I'm not sure what to make of the friends thing.
I'm tempted to say "I don't want to have friends who withhold substantial values disagreements from me."
What aura would you need to radiate such that friends who had substantial value disagreements with you were open about them / it made sense from their point-of-view?
I should ask another disambiguating question first, since I realized it wasn't quite obvious. Which do you want?
- friends with known value disagreements with you
- no friends with value disagreements with you, which their hiding is preventing you from achieving
I think one heuristic I follow is that I try to explicitly state disagreements and values-disagreements, without an implied need to hash it out. That part is optional if it comes up, but it means we know where we stand and aren't surprised by it later.
From your list, I'm pretty sure I want the first one.
There's lots of people with major values differences that I want to coordinate with and that I would be satisfied by being close allies and friends with.
I think I partially am like "perfect values alignment is not really a thing anyway".
Same.
So I'm trying to take a step back and think about whether or not I think a lot of people moving in this direction would be helpful for them.
Like,
There's lots of people with major values differences that I want to coordinate with and that I would be satisfied by being close allies and friends with.
I think my argument here is "this is both a target and a methodology" and my suspicion is that we should deploy more 'traditional' solutions as our methodology.
Pardon me, what is the implied methodology?
I think I mostly mean 'professionalism norms' here, which is people hiding lots of things 'at work' and only sharing them 'at home'. Like, when I take your advice to the typical American workplace, someone who follows a particular religion probably should tell other people about it so their disagreements can be predicted ahead of time, whereas when I take the 'standard professionalism' advice, it's more like "don't even wear your cross because then maybe people will anticipate disagreements with you."
I think when coordinating with folks with whom I have factual and moral disagreements, I often have an attitude of "In this meeting we are both attempting to coordinate on getting X done. We also have disagreements and conflicts A, B and C, and I will not attempt to hide that if it comes up, but my intent here is not to resolve them." It's possible that this is deceptive in some way.
But I don't think hiding them is a good idea. I would often want to publicly state that we have these disagreements and are working together anyway. I would never want to publicly be read as having perfect values alignment with everyone I work with. That seems preposterous.
I don't know what to make of the religion example. All theistic/deistic religions are false and should probably die of natural causes.
It seems to me like there are a few possible 'peace' states: 1) everyone agrees, 2) everyone disagrees and knows they disagree, 3) everyone knows what topics are irrelevant, and 4) everyone disagrees but mistakenly believes they agree.
It seems to me like you're anti-4 (and I am also) but I think 2 and 3 are importantly different.
My guess is still that I should be able to coordinate with religious folks, but for a ton of them I think the right answer is "These people themselves are not capable of coordinating with people who have factual disagreements with them and take factual disagreements as an attack." That's a hard spot for me to be in.
So a contemporary example is the current Israel-Palestine conflict. (I was playing board games with someone recently and one person brought it up and another person responded with "please let's not talk about that" in the style of 3, since it was basically "only downside" for us to know what each other thinks about that.)
These people themselves are not capable of coordinating with people who have factual disagreements with them and take factual disagreements as an attack.
I think this is a bucket error? I think if you are working with a Christian accountant about financial stuff, and you point out to them that Christianity is false, they'll behave pretty differently than if you point out to them that they made a math error somewhere. That's because the first is a package deal about their club membership and family relationships and many other things, and the latter is about work.
(I, of course, wish their club memberships and family relationships and all that didn't depend on their beliefs about the supernatural, but here we are.)
On "3) everyone knows what topics are irrelevant" I am not endorsing the strategy of "whenever having a business conversation with someone, make sure to share a comprehensive list of all disagreements you have with them at the start".
However I do endorse something like "If a claim is mentioned that you disagree with, briefly state your position, and if a disagreement is likely to be decision-relevant, pro-actively state your position on it, and be willing to debate it a little, though work to avoid it derailing the meeting."
Implicitly this seems to me like a claim that "known but set aside disagreements are better than plausible deniability about whether or not there is a disagreement." This seems like it could be true, but how would we check? What's the empirical grounding we could use to resolve this?
(Noting that this could easily be a style thing, like the Bridgewater ratings system, where some people love it and some hate it and it's good that there are multiple companies so people can self-select into it being there or not.)
The current Israel-Palestine conflict... my guess for one thing going on there is that lots of people are experiencing grief and fear, and care a lot about being respected in that, and feel easily triggered when people state opposing positions to theirs without respecting their current emotional state. Respecting people's state and letting them process the grief in the middle of a board-game is often too hard to do and the board game will just not continue, and so I think it's reasonable to hold off on discussions of it during that period. But no, I would not want to avoid knowing someone's position, and sometimes have let people know my position differs from theirs leading them to cry and be upset (not on this issue to be clear), and I think that should also happen sometimes when a person thinks they have consensus.
I think this is a bucket error? I think if you are working with a Christian accountant about financial stuff, and you point out to them that Christianity is false, they'll behave pretty differently than if you point out to them that they made a math error somewhere. That's because the first is a package deal about their club membership and family relationships and many other things, and the latter is about work.
I don't mean that they cannot handle it in full-generality, I mean that they probably have heard prescriptions from their leaders like "He who does not believe in God and does not accept God into his heart, immediately accepts the Devil into his heart, and by doing so is in a conflict with you, and you must fight this sinner at every opportunity." Like, specifically on these key beliefs the disagreement is a social conflict.
Insofar as my lack of belief in God means to them "Ben will not grow with you, you will not accept him into your community, he will not be a long-term part of your life, etc" then that seems good to let them know. I hope that they can still work together with me on the accounting job, and I think there is some skill to learn about how to work respectfully with people where you don't have a long-term interest in knowing each other.
I mean that they probably have heard prescriptions from their leaders
Yes but they have also heard prescriptions from other leaders to not bring up religion at work; I think you might not be appreciating the relative strengths of, like, professionalism and religion and wokeness and so on.
[Like I think it would be way easier for someone to say "hey, no political talk at work" to shut down a conversation they dislike than to say "hey, some of your coworkers support Trump" or w/e.]
there is some skill to working respectfully with people where you don't have a long-term interest in knowing each other.
I can't tell if we agree or disagree on whether deflection is included in this skill, or whether you see deflection as easier or harder than tolerance. [Deflection here trying to point to the idea of turning away from irrelevant disagreements, rather than sharing them.]
Yes but they have also heard prescriptions from other leaders to not bring up religion at work; I think you might not be appreciating the relative strengths of, like, professionalism and religion and wokeness and so on.
So, sometimes I think it is acceptable to not be open about the specific details of everyone's salaries, because our human brains are too status oriented to be able to be sane about it. I think similarly about some aspects of sex and romance. I can imagine having a space where we just don't know details about each others' core values and principles, just in order to get along. But in every instance it seems super sad and not The Way and I am not convinced that we can't do better.
But in every instance it seems super sad and not The Way and I am not convinced that we can't do better.
Sure, I think you can do better—I think the question is whether or not tolerance on irrelevant things is worth the cost, where the peace treaty view is "no, you're paying more than what you get back."
(On the meta level I think it might be worthwhile to pop up the stack a little and check in how this relates to the broader question of more-or-fewer values-and-principles fights.)
Returning for summary and takeaways after lunch
[For those following at home: Ben had chicken for lunch.]
So here's how I think about this:
- Values and principles can be more or less relevant to particular relationships or community memberships.
- Hashing out conflicts / integrating positions has value proportional to that relevance. (You get a lot out of integrating your views on parenting with your spouse; you get much less out of integrating your views on bowling with them.)
- Hashing out conflicts / integrating positions has costs proportional to the personal relevance of the issues for the people involved. (Even if parenting (or not) is very important to your relationship, it's not obviously important to you-the-person). Ben's point about the Israel-Palestine conflict probably evoking strong emotions for people is the claim that it's predictably a high-personal-relevance topic, and thus a particularly costly one to integrate on.
- Professionalism is a strategy of deliberately keeping things separate unless they're high-relevance (a 'high-decoupling' style), which can be seen as a way to balance maximizing benefits against minimizing costs.
This is in contrast to 'high-integrity' styles (think authentic relating, radical honesty, and so on); the goal there is to get all of the benefits and to pay fewer costs by 'buying in bulk', basically.
I should also add that a lot of these have a 'peace treaty' nature to them (i.e. it's about managing conflict between individuals / groups) instead of a 'clear principle' nature. Many years ago, the Charity Navigator folks wrote a nasty letter about EA being insufficiently 'cause-neutral', because it dared to say not just that some charities were more or less effective than others but that some causes were more or less worthy than others. But disagreement of opinion among EAs meant that we didn't end up with the one best cause; instead we got several parallel clusters of effective charities, each pushing its particular cause. (This story is not quite accurate because x-risk did, I think, become the top cause for many but not all EAs.) The impulse to keep different causes under one big tent pushes people towards tolerance of disagreement on the issue of cause prioritization (and tolerance's cheaper cousin, avoidance), where the impulse to maximize effectiveness instead pushes towards conflict resolution. Would EA have been better off over the years if more or less effort had been spent on trying to prioritize between wildly different causes?
Beyond the more strategic "what's the endpoint?" question, there's also the more tactical question of "where are we on the margin?". Even if everyone agrees that the 'professional' approach is the right way to go, they might disagree on how to order topics by relevance or where the cutoff thresholds should be. ("Actually someone's opinion on the Israel-Palestine conflict is relevant to EA because--") I think my disagreement with Ben is on the more fundamental point (is it better to do conflict avoidance or conflict tolerance) but we might also have different views on where we are even after settling that.
My summary of our conversation (with me saying your points in my own words):
- I said I wanted people to fight for their values more
- You said it's good for people to be able to live and work alongside one another given values differences
- You said that people have coordinated to not bring up lots of disagreements in order to be able to get along
- I took the position that I think people should instead practice tolerance of disagreements and values-conflicts rather than hiding them
- We discussed some examples like work meetings and religious differences and ongoing military conflicts, and there's a bunch more pragmatic detail that we could go into
I guess I ended up arguing that people should at least know that disagreements exist. My guess is that this would lead to more discussion of disagreements too.
Reading Vaniver's last response:
- Hm, I don't like the "relevance" frame for whether to bring up disagreements. I think any disagreement that is mentioned in a conversation is worth naming, and shouldn't pass through people's faulty models of how important it is, which I think people will often just get wrong. It's better to use your importance model for choosing which disagreements to spend time talking about than to choose which disagreements people should even know exist.
- Yep, there's definitely also pragmatic questions of "where does the boundary lie for which disagreements to bring up in a meeting or when playing board games" that we didn't touch on. There's another pragmatic question of "when is it too derailing to even spend 30 seconds naming the existence of a disagreement". My guess is that there is some disagreement here.
- This isn't very important, but I feel a bit sad reading your list of high-integrity styles, as I don't identify with either of the examples listed.
- I don't think "professionalism" is a good name for the vision of "getting things done without getting derailed by orthogonal disagreements and conflicts". I think of "professionalism" as a good name for "being a blankface". I guess that might just be a degenerate case of the latter. I think I might prefer names like "Building-oriented" over "Conflict-oriented". But I don't have a strong recommendation here.
I think we both wrote a bunch of stuff, and have some crisp presentations of our views at the end, but I don't feel like we actually 'got into it'? Like, we have some sense of how conflict avoidance works and some sense of how conflict tolerance works, but I didn't put forward much reason to choose avoidance besides "well, the mainstream chooses it" and I don't see much of your argument for conflict tolerance besides "well, it seems less sad / more in tune with my personality".
Comments
comment by DanielFilan · 2023-10-15T23:08:08.690Z
So a contemporary example is the current Israel-Palestine conflict. (I was playing board games with someone recently and one person brought it up and another person responded with "please let's not talk about that" in the style of 3, since it was basically "only downside" for us to know what each other thinks about that.)
I wonder how much of this is desiring to not collapse the wave-function? There are certain contexts where I wouldn't want to talk about Israel-Palestine because I might get rounded off to "pro-Israel" or "pro-Palestine", including by myself, in a way that I wouldn't endorse.
comment by DanielFilan · 2023-10-15T23:06:35.057Z
I mean that they probably have heard prescriptions from their leaders like "He who does not believe in God and does not accept God into his heart, immediately accepts the Devil into his heart, and by doing so is in a conflict with you, and you must fight this sinner at every opportunity."
I'm pretty sure this is wrong as a statement about the distribution of Christians.
↑ comment by Ben Pace (Benito) · 2023-10-15T23:42:19.119Z
I agree, but is it a wrong statement about the distribution of Christians who would be unwilling to work with me if I mentioned that I don't believe in any theistic gods?
↑ comment by JohnBuridan · 2023-10-16T03:23:25.276Z
I know sincere intelligent Christians who would just be relieved and respect that you've actually thought about the question of deism, and see that as a positive sign of intelligence, maybe even truth seeking?
↑ comment by Ben Pace (Benito) · 2023-10-16T03:44:36.487Z
Perhaps I'm writing unclearly, but I'll try to restate: the point I'm making is that if we're conditioning on someone being unable to work with me because I'm an atheist, then I'm saying this sort of thing is likely something they have heard.
comment by Vaniver · 2023-10-15T21:37:09.228Z
I think of "professionalism" as a good name for "being a blankface".
FWIW I think these are pretty distinct concepts!
↑ comment by Zac Hatfield-Dodds (zac-hatfield-dodds) · 2023-10-15T22:38:45.943Z
I certainly think there are ways to comport yourself as a professional which have very little in common with Scott's conception of a blankface, although pretending to professionalism is a classic blankface strategy.
A blankface is anyone who enjoys wielding the power entrusted in them to make others miserable by acting like a cog in a broken machine, rather than like a human being with courage, judgment, and responsibility for their actions. A blankface meets every appeal to facts, logic, and plain compassion with the same repetition of rules and regulations and the same blank stare—a blank stare that, more often than not, conceals a contemptuous smile.
A professional may be caught in a broken machine or bureaucracy; your responsibility (and the call for courage and judgement) then is to choose voice and eventually exit rather than loyal complicity.
comment by Vaniver · 2023-10-17T07:15:16.630Z
I have historically found myself with few allies in the comment section of the EA Forum for most of the history of EA, when I tried to stand up for letting people speak their mind and not be subject to crippling PR constraints.
Also I have found myself supportive of people who I thought were in the EA ecosystem for the values and for doing what's right, and then when I pointed out times that the ecosystem as a whole was not doing that, someone said to me "I try with a couple of percentage points of my time to help steer the egregore, but mostly I am busy with my work and life", and I had the wind knocked out of my sails a bit.
And yet I think it often has outsized effects and the costs aren't that big.
I think I never really responded to this but also it was probably the main generator of Ben's opinion?
I'm not sure whether I would have said my initial "we" statement about EAs. (Part of this is just being less confident about what EA social dynamics are like; another is thinking they are less fractious than rationalists.)
comment by romeostevensit · 2023-10-16T16:20:11.913Z
There are pieces of evidence that are not problematic for consensus reality. People disagree over these based on differing priors for how to handle certain forms of evidence. Then there are secrets. Secrets aren't random but are overrepresented in the category of things that are uncomfortable tensions in consensus reality. Disagreements over differing secrets and differing interpretations of secrets (because there can't be widespread consensus about secrets, by definition) are difficult because they mix together the factual nature of the secrets with differing opinions about:
- What the relationship of the secret to consensus reality is.
- What the relationship of the secret to consensus reality ought to be.
Both of these, both in particular and in general.
From these conversations going poorly, I (and I imagine others) wind up conditioned to have fewer disagreements, especially outside of high-trust contexts. Which tends to keep the secrets smaller.