On Virtue
post by Midnight_Analyst · 2022-01-13T16:36:56.306Z · LW · GW · 15 comments
This post is composed partly of my own ideas, and partly of the ideas of Scott Alexander. Part of what I am doing here is just explaining my interpretation of his post. It is my first post, and it’s essentially just an experiment. I’d be highly appreciative of feedback and constructive criticism. I’m not 100% sure I’ve completely expressed what I want to, but I think I’ve at least made a reasonable start. In the spirit of writing quickly, I decided not to let my doubts hold me back. My point might seem quite obvious/self-evident, but I do feel emotionally like it’s a point I need to make.
A lot of us have people we look up to, and some people feel inadequate because they can’t replicate the amazing feats that their idols can. Perhaps this post can go some way towards helping people to feel better about themselves relative to those they admire, and to feel more self-worth in general.
Virtue almost certainly doesn’t actually exist. But if we imagine for a moment that we’re living in a world where it does, it is almost certainly directly proportional to effort, and nothing else. Any other model of virtue seems, to me, somewhat incoherent.
The first part of Scott Alexander’s ‘I Can Tolerate Anything Except the Outgroup’ really struck a chord with me, and it illustrates quite well the point I want to make. His point is essentially encapsulated in the following extract:
“There are a lot of people who say “I forgive you” when they mean “No harm done”, and a lot of people who say “That was unforgivable” when they mean “That was genuinely really bad”. Whether or not forgiveness is right is a complicated topic I do not want to get in here. But since forgiveness is generally considered a virtue, and one that many want credit for having, I think it’s fair to say you only earn the right to call yourself ‘forgiving’ if you forgive things that genuinely hurt you.
To borrow Chesterton’s example, if you think divorce is a-ok, then you don’t get to “forgive” people their divorces, you merely ignore them. Someone who thinks divorce is abhorrent can “forgive” divorce. You can forgive theft, or murder, or tax evasion, or something you find abhorrent.
I mean, from a utilitarian point of view, you are still doing the correct action of not giving people grief because they’re a divorcee. You can have all the Utility Points you want. All I’m saying is that if you “forgive” something you don’t care about, you don’t earn any Virtue Points.”
This post is about Virtue Points, and (importantly) not about Utility Points. I’ve noticed that I habitually assign people merit in my head, and I suspect almost everyone else does too. This post is about that process.
I want to set aside for the moment whether it's right to think in terms of people 'deserving' things. As I've already said, the concept of 'desert' is probably best thought of as something instrumentally useful rather than as a fundamental principle of the universe. But to the extent that we assign Virtue Points, I feel like it's best for everyone if we do it in a way that is fair and makes sense.
As Scott says, in terms of our behaviour, it probably makes sense to praise people at least partly based on the effects of their actions, rather than how much effort they put into them. We shouldn’t just reward pain. But this post isn’t about our outward behaviour - it’s about how we internally process instances of people doing things we consider admirable. It’s about the natural human reaction of ‘Wow, this person really did a good thing, so I’d better mentally assign them some Virtue Points. They deserve good things to come their way as a result of the good thing they themselves have done.' I think this is a good and useful way to think to some degree, but I also believe we need to be very careful when thinking in this way not to confuse Virtue Points with Utility Points, so as to avoid seriously over- or under-assigning merit.
Let’s take the example of learning a new skill. Developing a skill is often a difficult process, and going through that process usually merits Virtue Points. People who have learned new languages, built muscle, or overcome social anxiety deserve a lot of Virtue Points for doing these things. But I think we should be careful not to internally kid ourselves that these people deserve a lot of Virtue Points for continuing to do these things day-in day-out, once they have made the initial investment in learning the skill.
Sometimes people talk about those who have achieved good things in a way that makes me worry that they’re not internally assigning Virtue Points in a fair, healthy way. For example, my French teacher used to praise me in almost every lesson for being good at French. Sure, maybe I deserved some praise for the continued effort I put into French, but I didn’t need praise for the general skill of being good at learning languages, which is one of the main components of me being good at French. I think that the skill of being good at languages is almost entirely a result of genes and circumstance, and is basically not at all a result of my own effort or suffering. I therefore think I deserve very few Virtue Points for it, if any. On the other hand, there are all sorts of things that I’ve done in my life that were really hard, and that I don’t think anyone gave me Virtue Points for.
Or take Bob, for example. Bob has a job that he loves, working for an EA organisation. He doesn’t have any anxiety disorders, or depression, or any other major health problems. He eats well, exercises, and meditates every day, and has done all these things for years. There are probably a few people in the world with circumstances similar to Bob’s, although given how unusually comfortable his life is, I’d be surprised if there were many.
Now, I think Bob is amazing. But I think this in the sense that what he has accomplished is unusual and good for the world. I would probably praise Bob outwardly, partly out of habit and/or social pressure, and partly because praising him might encourage him to continue to do similar good actions in the future. But I don’t think that Bob is amazing in the sense that his continued daily activities involve an unusual amount of effort and suffering. I think his life is probably actually quite easy. Those Virtue Points instead should probably go mostly to people who find it near-impossible to get up in the morning but still do it anyway, or single mothers who have to hold down three jobs just to feed their families, or to chickens suffering in factory farms.[1] Bob gets a lot of my Utility Points, but very few of my Virtue Points.
When I’m deciding how many Utility Points I should give you - which hopefully correlates reasonably well with how much I outwardly praise you - I care about the effects of your actions. But when I’m dishing out Virtue Points, I couldn’t care less about things you can do without breaking a sweat. I care about what you do in moments of uncertainty. I care about what you did when you didn’t want to do the right thing. I care about what you did when you felt so confused that you had a very poor grasp on what to do, but knew that you had to do something. I care about whether you did the right thing when doing the right thing took effort.
The fact that someone has learnt a skill is impressive. The fact that they continue to do it is often quite ordinary. Not really impressive at all. We’re not surprised by it.
An important implication of what Scott is saying, and one I want to underline, is that you should be careful of mentally assigning people many Virtue Points for things they probably no longer have to fight for much. And you certainly shouldn’t feel bad about yourself for not doing these same things on a day-to-day basis. What’s amazing about people is the ways that we grow, not the ways we stay the same.
[1] A lot of this post is about voluntary suffering, but you’ll notice that the suffering of factory-farmed animals is not voluntary. I actually don’t think I make much of a distinction in my head between voluntary and involuntary suffering in terms of deciding what someone ‘deserves’. So I think I give beings who suffer involuntarily something akin to ‘Virtue Points’ as well.
Comments sorted by top scores.
comment by jaspax · 2022-01-13T14:16:59.335Z · LW(p) · GW(p)
Virtue almost certainly doesn’t actually exist
Strongly disagree. If you think of Virtue-with-a-capital-V as a Hellenic deity or a Platonic essence, then sure, Virtue doesn't exist. But if "virtue" is like "honesty", ie. a name for a particular pattern of action that is observable and which can be somewhat-objectively evaluated, then virtue absolutely exists, even if we might disagree about its identification.
Now, about the actual meat of your argument: you seem to have inverted the classical way of thinking of the virtues, which is that a person is virtuous precisely insofar as their virtues are habituated. You suggest that we award Virtue Points for initially acquiring some skill which is both difficult and praiseworthy, but stop awarding them thereafter. To the classical authors, this is backwards: the state in which you effortlessly always do the right thing is virtue, while the struggles to get there are merely the process of its acquisition.
(And yeah, this implies that some people are naturally more virtuous than others, just like some people are taller or smarter than others. C'est la vie.)
I don't think the models are totally incompatible: your Virtue Points are just the delta between different levels of virtue. Nonetheless, I think it's more useful to model virtue as a state and not as a series of events, and to think of yourself and others as virtuous based on their expressed habits, not on how much they're struggling to acquire them.
Replies from: Midnight_Analyst
↑ comment by Midnight_Analyst · 2022-01-13T16:25:49.490Z · LW(p) · GW(p)
Strongly disagree. If you think of Virtue-with-a-capital-V as a Hellenic deity or a Platonic essence, then sure, Virtue doesn't exist. But if "virtue" is like "honesty", ie. a name for a particular pattern of action that is observable and which can be somewhat-objectively evaluated, then virtue absolutely exists, even if we might disagree about its identification.
I could rephrase my claim here by saying that utility is what actually matters, and that there isn't actually a thing called virtue that matters in and of itself. I agree that virtue can still be a useful concept for generating utility. But it seems wrong to me to say 'virtue can be a useful concept for generating utility, and therefore virtue has intrinsic value'. To me, it's just a useful concept.
Now, about the actual meat of your argument: you seem to have inverted the classical way of thinking of the virtues, which is that a person is virtuous precisely insofar as their virtues are habituated. You suggest that we award Virtue Points for initially acquiring some skill which is both difficult and praiseworthy, but stop awarding them thereafter. To the classical authors, this is backwards: the state in which you effortlessly always do the right thing is virtue, while the struggles to get there are merely the process of its acquisition.
(And yeah, this implies that some people are naturally more virtuous than others, just like some people are taller or smarter than others. C'est la vie.)
I don't think I agree with virtue ethics, but I definitely agree that the virtue that I'm talking about and the virtue that virtue ethicists talk about are different things.
I don't think the models are totally incompatible: your Virtue Points are just the delta between different levels of virtue.
Interesting. I guess since I'm not a virtue ethicist, I think it's probably not a good idea to define my virtue with respect to that kind of virtue.
Nonetheless, I think it's more useful to model virtue as a state and not as a series of events, and to think of yourself and others as virtuous based on their expressed habits, not on how much they're struggling to acquire them.
I think that utility comes from people's expressed habits and not how much they're struggling to acquire them, but my claim in the post is that we shouldn't be internally praising expressed habits as opposed to effort to acquire them as much as we currently do. It's probable that we also shouldn't be externally praising expressed habits as much as we currently do either, although this isn't a claim I explicitly make in the post, and one I'd need to think about more in order to be confident in.
Replies from: zac-hatfield-dodds
↑ comment by Zac Hatfield-Dodds (zac-hatfield-dodds) · 2022-01-14T19:37:03.941Z · LW(p) · GW(p)
comment by Viliam · 2022-01-13T14:10:24.236Z · LW(p) · GW(p)
If you can achieve the same outcome using an easy way, or using a hard way, then the easy way is preferable, and you do not get any extra point for using the hard way. Your utility points depend on the outcome, your virtue points are proportional to the difficulty of the easy way -- whether you actually took it or not.
If you can achieve an outcome only using the hard way, your utility points depend on the outcome, your virtue points are proportional to the difficulty of the hard way.
That is, your utility points always depend on the outcome. Your virtue points... seem proportional to the difficulty of the easiest way available to you. You get the virtue points for going towards the outcome (as opposed to not doing anything), but you do not get extra virtue points for making it more difficult than it needed to be.
If you are a Superman (all tasks are super easy for you), does it mean you can't get many virtue points? Not at all; you just need to do a lot of tasks. So much that it will be difficult to do all of them... which is kinda the point.
This is all about what you should do as an individual who works alone. Then we also need to consider implications (for virtue points and utility points) of dividing the labor (by trade or otherwise) between people who have different skills.
Intuitively, the group utility points depend on the total outcome produced. The group virtue points should be calculated based on the most reasonable division of labor (everyone following their relative advantages); there should be no extra points for being idiots and assigning each work to the person least fit to do it.
Trade allows you to increase total utility. I mean, if for person A it is easy to do X, but difficult to do Y; and for person B it is difficult to do X, but easy to do Y; if they agree that A does all X, and B does all Y, they can produce more in total.
But it seems like trade does not allow you to increase total virtue. I mean, assuming that you spent 100% of your energy when you were working alone, then whatever trade you arrange, ultimately you can still only spend 100% of your energy doing your part, so even if you produce more... you don't get more virtue points.
But then, if we ignore the utility points for a moment, is there any point in engaging in trade from the virtue perspective? I think the answer is that if you know there is an opportunity to trade, but you refuse it, you should be docked some virtue points for needlessly choosing the more difficult way (i.e. your virtue points now do not depend on how hard you actually work, but on how hard you would have to work if you used the benefits of trade).
This all works out in the short term -- each individual independently maximizing their virtue points is aligned with more output. What about the long term? Virtue points are awarded for the fraction of your current capacity that you use, but in the long term, there is a question of increasing your capacity (or refusing to do so). As you say, working on increasing your capacity is a virtuous thing. Having your capacity increased is not; you did the virtuous thing in the past, but you are not doing it now.
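The model above can be made concrete with a toy sketch (my own construction, not from the comment; the function names and difficulty numbers are hypothetical): utility tracks only the outcome, while virtue tracks the difficulty of the easiest path actually open to you.

```python
def utility_points(outcome_value: float) -> float:
    """Utility depends only on the outcome, however it was achieved."""
    return outcome_value

def virtue_points(path_difficulties: list[float]) -> float:
    """Virtue is proportional to the easiest way available to you:
    taking a harder path earns nothing extra, and refusing an easier
    path (e.g. refusing trade) only counts against you."""
    return min(path_difficulties)

# Same outcome, reachable by an easy way (difficulty 2) or a hard way (difficulty 9):
print(utility_points(5.0))        # 5.0 either way
print(virtue_points([2.0, 9.0]))  # 2.0 -- no bonus for the needlessly hard way
```

On this sketch, a "Superman" with only easy paths gets few virtue points per task, and can only accumulate them through volume, which matches the comment's point.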
comment by romeostevensit · 2022-01-15T18:31:15.064Z · LW(p) · GW(p)
Effort is opaque and subject to goodharting if it becomes the measure of goodness. It generally eventually turns into suffering as currency, which is how hells are constructed.
comment by noggin-scratcher · 2022-01-13T15:16:18.186Z · LW(p) · GW(p)
the concept of 'dessert'
Amusing typo. Dessert is the sweet food at the end of a meal; desert is the one to do with things you deserve.
Replies from: Midnight_Analyst
↑ comment by Midnight_Analyst · 2022-01-13T16:29:09.663Z · LW(p) · GW(p)
And there I was, convinced I had already looked this up and found that it was 'dessert'. Thanks.
comment by Dacyn · 2022-01-13T13:35:28.732Z · LW(p) · GW(p)
From the perspective of the collective, the point of awarding Virtue Points is so that people know what traits to signal to remain in good graces with the community. From the perspective of the individual, a lot of the time that will feel like doing the Right Thing and not getting rewarded, due to phenomena discussed here [LW · GW].
Since involuntary suffering doesn't show up in this paradigm, I think it is irrelevant to the notion of Virtue Points.
Replies from: Midnight_Analyst
↑ comment by Midnight_Analyst · 2022-01-13T15:55:10.246Z · LW(p) · GW(p)
From the perspective of the collective, the point of awarding Virtue Points is so that people know what traits to signal to remain in good graces with the community. From the perspective of the individual, a lot of the time that will feel like doing the Right Thing and not getting rewarded, due to phenomena discussed here [LW · GW].
I think with my post I'm pointing to something quite specific - a collection of ideas I expect to be somewhat useful in some not-particularly-well-thought-through way, by making sure that, to the extent that people think 'person X deserves recompense', they think so in a way that is fair. Basically, I think I'm trying to make sure people don't get Utility Points and Virtue Points muddled up. I'm not going into whether people should mentally assign others Virtue Points, but I'm saying that most people will mentally assign others Virtue Points whatever anyone says, and that it'd probably be good for those people to be fairer in the way they do so.
I want to distinguish this mental action from the behaviours that result from it. I'm trying not to make claims directly about who and what should be outwardly praised.
On the connection to involuntary suffering, I have written the following in response to another comment:
Replies from: Dacyn
I said 'something akin to Virtue Points', because I agree that someone getting hit is not actually more virtuous than someone not getting hit. I can understand why you would be very surprised if I thought that.
I think perhaps the whole post could be rewritten and framed in terms of suffering (or pain, or something of that nature), because I think that's essentially what I'm getting at, and I feel it might be what Scott is getting at as well. I think it's a highly common intuition that suffering is bad, and people often think that those who suffer deserve some kind of compensation, regardless of whether it was voluntary or not.
For example, say I have the following options:
A) Give a meal to a starving child.
B) Give something equally valuable to a healthy, non-starving child (note that obviously 'something equally valuable' doesn't mean 'a meal' in this case, because a meal is a lot less valuable to a non-starving child than to a starving child. It'd probably have to be something more expensive than a meal.)
I've tried to define this such that, from a utilitarian perspective, there's no difference between choosing option A and choosing option B.
I'd still rather choose A, because even though I know the Utility Points from both A and B are equal, there's something about balancing out past suffering that makes me feel nice and fuzzy inside, and gives me a sense of justice. I expect this sense of justice is quite common, probably very common.
I should say that I think my post generally should not change the behaviour of people who hold strongly utilitarian views. But I think that even those who would consider themselves staunch utilitarians still possess to some degree these evolved intuitions about virtue and suffering, and to the extent that they do, I feel like it'd be nice (and probably valuable) for them (and everyone else) to be assigning their mental Virtue Points in ways that make more sense and are fairer.
↑ comment by Dacyn · 2022-01-13T18:10:09.802Z · LW(p) · GW(p)
I want to distinguish this mental action from the behaviours that result from it. I’m trying not to make claims directly about who and what should be outwardly praised.
I totally agree that this is an important distinction (it is the distinction the post I linked to is about), and when I talked about Virtue Points in my comment I was meaning the mental action. But mental actions still have effects, which is why the first sentence of my comment isn't self-contradictory.
On the connection to involuntary suffering, fair enough, but then you shouldn't call it "Virtue Points". That already means something else.
Replies from: Midnight_Analyst
↑ comment by Midnight_Analyst · 2022-01-15T18:04:16.288Z · LW(p) · GW(p)
Yeah, so on the question of the effects of mentally assigning Virtue Points: I think the extent to which the ideas of my post should change behaviour, and whether that change would be good, is unclear. I wrote the post under the assumption that it'd be better for us to have this fairer understanding of how the amount of suffering involved in a task can be drastically different for different people depending on their existing abilities. I feel like it's important for society to realise this, and I feel like we've only partially realised it at the moment. But possibly this isn't the case, and I need to think about it more. I'm open to the idea that the way people currently assign Virtue Points actually shouldn't be meddled with (which is why my post is more of a 'starting point for discussion' than a 'thing I am completely sure about'). I think you're right to see the effects (rather than the mental action itself) as the thing that is actually important at the end of the day.
On involuntary suffering, having thought about this a bit more, I suppose the phrase 'something akin to Virtue Points' does imply that I think 'Virtue Points' would be an okay-ish name for the kind of thing I'm pointing to in the case of involuntary suffering, which is not the case. I do agree that Virtue Points is not a good name for that. I was trying to point out in the post that, as a very general statement, I feel like sufferers deserve compensation whether or not the suffering was voluntary.
comment by Alex Vermillion (tomcatfish) · 2022-01-13T05:05:20.500Z · LW(p) · GW(p)
I'm curious why you think "Virtue Points" are appropriate to award for involuntary suffering. I do not share this belief, so I'm wondering what motivated it for you. I'm very surprised you would feel that (ex) someone getting hit is more virtuous than someone not getting hit (all else equal, of course).
Replies from: Midnight_Analyst
↑ comment by Midnight_Analyst · 2022-01-13T09:28:50.790Z · LW(p) · GW(p)
I said 'something akin to Virtue Points', because I agree that someone getting hit is not actually more virtuous than someone not getting hit. I can understand why you would be very surprised if I thought that.
I think perhaps the whole post could be rewritten and framed in terms of suffering (or pain, or something of that nature), because I think that's essentially what I'm getting at, and I feel it might be what Scott is getting at as well. I think it's a highly common intuition that suffering is bad, and people often think that those who suffer deserve some kind of compensation, regardless of whether it was voluntary or not.
For example, say I have the following options:
A) Give a meal to a starving child.
B) Give something equally valuable to a healthy, non-starving child (note that obviously 'something equally valuable' doesn't mean 'a meal' in this case, because a meal is a lot less valuable to a non-starving child than to a starving child. It'd probably have to be something more expensive than a meal.)
I've tried to define this such that, from a utilitarian perspective, there's no difference between choosing option A and choosing option B.
I'd still rather choose A, because even though I know the Utility Points from both A and B are equal, there's something about balancing out past suffering that makes me feel nice and fuzzy inside, and gives me a sense of justice. I expect this sense of justice is quite common, probably very common.
I should say that I think my post generally should not change the behaviour of people who hold strongly utilitarian views. But I think that even those who would consider themselves staunch utilitarians still possess to some degree these evolved intuitions about virtue and suffering, and to the extent that they do, I feel like it'd be nice (and probably valuable) for them (and everyone else) to be assigning their mental Virtue Points in ways that make more sense and are fairer.
Replies from: mike
↑ comment by mike · 2022-01-13T14:06:24.147Z · LW(p) · GW(p)
In defining A and B as equally valuable, I have to equate the two. That said, it's hard to imagine something that would be as valuable to a non-starving healthy child right now as the meal is to the starving child. So if the valuable thing you gave in scenario B were at all marketable, the inefficiency of using it to help B instead of the x (where x > 1) starving children it could feed would make the real pseudo-equation:
utility(a) = utility(b)
cost(b) = x * cost(a)
if x > 1, do a; if x < 1, do b
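The pseudo-equation above can be sketched as a runnable decision rule (my own rendering, under the reading that the gift in scenario B could instead fund x meals for starving children; the function name is hypothetical):

```python
def choose(x: float) -> str:
    """With utility(a) == utility(b), pick the cheaper option.
    x = how many meals for starving children the gift in B could fund."""
    if x > 1:
        return "a"  # B's gift could fund more than one meal, so A is cheaper
    if x < 1:
        return "b"
    return "either"  # costs are equal, so the choice is a wash

print(choose(3.0))  # "a" -- feeding the starving child wins whenever x > 1
```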
comment by Alex Vermillion (tomcatfish) · 2022-01-13T05:03:50.187Z · LW(p) · GW(p)
You can use Markdown to make real footnotes here if you want; I almost missed your footnote because of a misread. In your text, a footnote marker looks like [1] (I typed that as `[^this]`), and below, the footnote body is written the same way with a colon and a space after it: `[^this]:` ↩︎
↑ comment by Midnight_Analyst · 2022-01-13T09:31:39.279Z · LW(p) · GW(p)
Thanks! :)