Do you vote based on what you think total karma should be?

post by Rafael Harth (sil-ver) · 2020-08-24T13:37:52.987Z · LW · GW · 13 comments

This is a question post.

Contents

  Answers
    27 jimmy
    22 Ericf
    14 Viliam
    12 Bucky
    12 Dagon
    7 Measure
    6 algon33
    5 Mark Xu
13 comments

I recently strong-downvoted a post that I would have weak-upvoted if it had been at a lower karma. In general, I usually vote primarily based on what I think the total karma should be. I'm curious whether other people do similar things.

This is both a question and a poll. The poll is in the comments; it works via upvotes but there is a karma balance comment. (Note that one can recover the non-weighted results (i.e., number of votes) by hovering one's mouse over the current score.) This is about votes on LessWrong only.

I'm also wondering whether this behavior is, in some sense, anti-virtuous. If everyone votes based on what they think the total karma should be, then a post's karma reflects [a weighted average of opinions on what the post's total karma should be] rather than [a weighted average of opinions on the post]. This feels worse, though I'm not entirely sure that it is.

Correction: as jimmy points out [LW(p) · GW(p)], voting independently of current karma does not give you a weighted average of opinions on the post because there are only a limited number of ways you can vote.

Meta: There's been some speculation [LW(p) · GW(p)] about this (maybe read after voting), but nothing conclusive.

Current non-weighted results (08/28 07:05 EDT) (TK is 'target karma'.)

Answers

answer by jimmy · 2020-08-24T17:32:49.958Z · LW(p) · GW(p)

Voting based on current karma is a good thing.

Without that, a post that is unanimously barely worth upvoting will get an absurd number of upvotes, while another post which is recognized as earth-shatteringly important by 50% of readers will fail to stand out. Voting based on current karma gives you a measure of the *magnitude* of people's liking for a comment as well as the direction, and you don't want to throw that information out.

If everyone votes based on what they think the total karma should be, then a post's karma reflects [a weighted average of opinions on what the post's total karma should be] rather than [a weighted average of opinions on the post].

This isn't true.

If people vote based on what the karma should be, the final value you get is the median of what people think the karma should be -- i.e., a median of people's opinions of the post. If you force people to ignore the current karma, you don't actually get a weighted average of opinions on the post, because there's very little flexibility in how strongly you can upvote a post. In order to get that magnitude signal back, you'd have to dilute your voting with dither, and while that will no doubt happen to some extent (people might be too lazy to upvote slightly good posts, but will make sure to upvote great ones), you will still end up overestimating the value of slightly good posts.

This is bad, because the great posts hold a disproportionate share of the value, and we very much want them to rise to the top and stand out above the rest.
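
A toy simulation of this point, offered as a sketch only: the per-reader values, the weak-vote weight, and the assumption that target-karma voting settles at the median of individual targets are all stipulated for illustration and don't reflect the actual LW mechanism.

```python
import statistics

# Two stylized posts: one that every reader finds slightly good, and one that
# half of readers find extremely good and half find worthless.
slightly_good = [1] * 100            # per-reader value
half_great    = [10] * 50 + [0] * 50

def blind_vote_total(values, weak=1):
    """Everyone upvotes anything they got positive value from, ignoring current karma."""
    return sum(weak for v in values if v > 0)

def target_vote_total(values, karma_per_value=10):
    """Everyone nudges the score toward their personal target; with many voters
    the result settles near the median of those targets."""
    targets = [v * karma_per_value for v in values]
    return statistics.median(targets)

print(blind_vote_total(slightly_good), blind_vote_total(half_great))    # 100 50
print(target_vote_total(slightly_good), target_vote_total(half_great))  # 10.0 50.0
```

Under blind voting the unanimously-okay post outranks the half-great one (100 vs 50); under target voting the ordering reverses (10 vs 50).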

comment by Bucky · 2020-08-24T19:47:55.971Z · LW(p) · GW(p)

This is an interesting point that I hadn't thought of.

Without that, a post that is unanimously barely worth upvoting will get an absurd number of upvotes, while another post which is recognized as earth-shatteringly important by 50% of readers will fail to stand out.

I think this oversells the problem somewhat.

First, a technicality: strong votes are, at least for active members, considerably stronger than weak votes.

Second, if a post is earth-shatteringly important to some, then it is likely to be net positive to many others, so it would also receive a large number of weak upvotes.

So a more realistic scenario would be:

100% weakly upvoted post

is similar to

20% strongly upvoted, 30% weakly upvoted.

These would clearly both be very high scoring posts so would certainly stand out from the crowd, exactly as they should. It doesn't seem obvious to me that the former should stand out significantly less (or be rewarded significantly less) than the latter.
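
Putting rough numbers on this comparison (the weights below, weak = 2 and strong = 6 for an established account, are assumptions for illustration rather than exact LW values):

```python
weak, strong = 2, 6   # assumed vote weights for an established account
readers = 100

unanimous_weak = readers * weak                                   # 100% weakly upvoted
mixed = int(0.2 * readers) * strong + int(0.3 * readers) * weak   # 20% strong, 30% weak

print(unanimous_weak, mixed)  # 200 180 -- both clearly stand out, by similar margins
```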

Replies from: Bucky
comment by Bucky · 2020-08-24T21:02:26.575Z · LW(p) · GW(p)

Further, I'm not sure having a voting condition of "vote to try to bring the karma to the value you think it should be" helps in this situation. If 50% of people didn't get any value from a post/comment, then they would be trying to vote the karma down to 0. So a "50% earth-shattering, 50% meh" post would end up with ~0 karma.

Replies from: sil-ver
comment by Rafael Harth (sil-ver) · 2020-08-24T21:14:40.893Z · LW(p) · GW(p)

Speaking as the only person so far who has chosen Option 1, I definitely don't downvote every post that I don't get value out of. "How much I get out of it" and "how much karma I think it should have" are two very different things. If I read something and don't get value out of it, usually I'm agnostic as to how much other people get out of it and don't vote. Downvoting it requires me being confident that public consensus is wrong, which is rare (but happens occasionally).

I guess this means that my algorithm isn't 'always assess desired karma target and move in that direction' but that plus a whole bunch of epistemic humility. I'm also not claiming to know exactly what I'm doing, though I think it's right to say that it's primarily about the karma target.

Replies from: Bucky
comment by Bucky · 2020-08-24T21:50:25.304Z · LW(p) · GW(p)

Hmm, interesting - I'm now slightly confused by:

I recently strong-downvoted a post that I would have weak-upvoted if it had been at a lower karma

Was that post good or bad? It sounded to me like you thought the post had value, just not as much as was currently showing. If you downvoted a post you thought had positive value (you were confident that its current karma value was too high?), why not downvote one that you don't see any value in?

If being agnostic is a cause for not voting at all, a 50% great, 50% agnostic post would get a higher score than a 50% great, 50% slightly-good post, as the readers who found it slightly good would downvote and the agnostics wouldn't.

I think my main concern with the "vote to try to give posts/comments the total karma they should have" rule is that I can't see a way to operationalise it which doesn't suffer from worse problems than the simple "I want to see more/less of this" rule.

Replies from: sil-ver, Raemon
comment by Rafael Harth (sil-ver) · 2020-08-25T11:23:44.325Z · LW(p) · GW(p)

Was that post good or bad? It sounded to me like you thought the post had value, just not as much as was currently showing. If you downvoted a post you thought had positive value (you were confident that its current karma value was too high?)

It was good -- the "if" part of your quote is pretty accurate.

why not downvote one that you don't see any value in?

Downvote a different post of the same author because I didn't like that one? That doesn't sound like a good idea.

Replies from: Bucky
comment by Bucky · 2020-08-25T11:40:19.249Z · LW(p) · GW(p)

Downvote a different post of the same author because I didn't like that one? That doesn't sound like a good idea.

No, I mean why wouldn't you downvote a hypothetical post that you are agnostic about? 

Imagine there are two posts, both have 50 karma.

You read one and feel confident that it is net positive but that 50 is too high.

You read the other and it is not net positive for you - you just have a meh reaction to it.

It seems very odd to me that one would downvote the former but not the latter. The net effect is to encourage people to read/write the post that is more likely to produce a meh reaction than to be net positive.

Replies from: sil-ver
comment by Rafael Harth (sil-ver) · 2020-08-26T09:16:09.331Z · LW(p) · GW(p)

No, I mean why wouldn't you downvote a hypothetical post that you are agnostic about? 

Because such a post has more probability mass on the "it's really good" hypothesis. If I'm confident that a post is only slightly good, well then I'm also confident that it's not very good.

comment by Raemon · 2020-08-25T00:12:59.949Z · LW(p) · GW(p)

I think my main concern with the "vote to try to give posts/comments the total karma they should have" rule is that I can't see a way to operationalise it which doesn't suffer from worse problems than the simple "I want to see more/less of this" rule.

What are the worse problems you're seeing?

Replies from: Bucky
comment by Bucky · 2020-08-25T10:46:03.124Z · LW(p) · GW(p)

The fundamental problem is that we're trying to map a multidimensional thing into a single dimension. Whenever you do this you end up throwing out some information and you have to do the best you can. 

As described by jimmy, with the "I want to see more/less of this" rule you lose some information on the magnitude of like/dislike. This is somewhat mitigated by having weak and strong votes, plus the dither factor jimmy describes (which I think is quite significant for me), so overall I'm not hugely worried about this. 

(You can also get some of this information back, if you're really interested, by comparing the total number of votes to the score, although this is less obvious.)

 

I'm not sure how a "how much total karma should this post have" rule even works in practice but a couple of options:

How much karma a post has needs to link to the post's value / the correct amount of reward to the author.

If I judge this according to how much value I personally got out of it, then the great-great-grandparent comment applies and 50% awesome, 50% meh posts get 0 karma - a worse result than with the "I want to see more/less of this" rule, with all of the information from the 50% of people who found it awesome disappearing.

If instead I am trying to judge how much value I think the average LWer would get out of it, then I think this gets really hard to assess. As an example, the recent 10 fun questions results showed that people weren't very good [LW(p) · GW(p)] at guessing whether others believed the Civilisational Inadequacy thesis more or less than they themselves did. Here you lose some information on people's actual opinions in favour of information on what other people think their opinions might be, adding significant noise to the result.

Whichever option you choose, you probably end up throwing out information on how many people got value from the post. You can try to get around this by having each person estimate how many others would find it useful, but I think this just adds more noise to the result.

 

You can try to make the rule some combination of rules (as it seems most people do), but then interpreting karma scores seems really difficult to me. We also run into the problem of how much weighting to give to each sub-rule, and if people give different weightings then you get a discrepancy [LW(p) · GW(p)] in how effective each person's opinion is.

 

I'm interested if someone can explain another way that a "how much total karma should this post have" rule would work in practice which doesn't run into such problems.

Replies from: Raemon
comment by Raemon · 2020-08-25T20:25:16.809Z · LW(p) · GW(p)

(note: none of this is representing LW team opinion or anything)

I think karma is sufficiently noisy that trying to get some kind of "real" information out of it is already pretty intractable. People definitely vary in how they use it – some people vote liberally, some people vote rarely, some people use it as a "yay/boo" thing for things they disagree with, some people use it in some principled fashion. I'm betting that almost nobody consciously considers whether to upvote every single comment they read.

(Note that we deliberately don't show number of upvotes/downvotes on a post (mostly for a reason unrelated to this, but it means that "how many people downvoted this" is not public information that is supposed to communicate anything))

In my mind, the algorithm I'm implementing is still the "do you want to see more/less of this?", I just also have a vague term for "more/less relative to what?". If I see a bunch of people have upvoted a thing that I want to see more of, enough that I feel like the system has already given it enough reinforcement, I feel less inclined to upvote it myself, because the system already was giving it the amount of "more of this" that I wanted.

This just seems like the usual system with a bit more information, rather than less, to me.

Karma has a few concrete effects on the site: how long posts stay on the home page, whether comments get collapsed, and how comments and sometimes posts get sorted on a page. I think it is basically fine to use voting to deliberately manipulate those things, based on your judgment of how relatively important a post/comment is. That essentially is the operationalization of what it means to want "more or less of this", and it seems super reasonable to have that be encouraged rather than treated as an abuse of the system (otherwise, you have a system that relies on people avoiding being strategic, which is an asshole filter)

Downvoting a comment because it's getting sorted to the top of a thread, when you think another comment is relatively more important, seems fine to me.

(This is in addition to "karma has some vague psychological effect on how high status / community-endorsed a post feels", which I think is also fine to vote on as another knob on the "less/more of this" thing)

Some notes on my own usage:

  • I very rarely downvote things because they are "too high." When I do that, it's in particular cases where I honestly think it was important that something was said, but there were some aspects of it that I definitely don't want too much of on the site. The prototypical example is a post that makes some good points, but also some bad points (or uses bad arguments). If it were to get 100+ karma, I'd feel pretty bad about a high profile post making bad arguments. But, it still contributed enough value that I'd also feel bad if it didn't get at least some reinforcement.
  • Usually, I make the decision once based on how many people have already voted on a thing, and it's just a matter of "not bothering to upvote." If someone makes a clever joke, and it already has a few upvotes, I usually don't upvote it further. I like having clever jokes on LessWrong, but I think it's somewhat bad if they end up getting more karma than the more substantive comments.
  • I sometimes withdraw upvotes (on, say, a clever joke that ended up getting 70 karma, where when I first voted on it it only had 5). I do this more commonly than downvoting things that are "too high." Obviously I don't do this reliably, but I also wasn't reliably checking each comment and deciding how hard to upvote it, and I don't expect most people were either. This doesn't seem to me like it adds any noise to the system beyond what was there already, and meanwhile I think it sends a more accurate signal of "Ray's preferences about what there should be more/less of" than not doing that when I happen to notice.
Replies from: Bucky, Raemon
comment by Bucky · 2020-08-25T21:52:06.277Z · LW(p) · GW(p)

I think the asshole filter is a good point and, to be honest, it's possibly enough to get me to change my mind about this subject. There should be some mitigation in the karma weighting system, but even long-term members might be assholes.

Does it prove too much? Should the current karma, then, actually be the main consideration in deciding how to vote? Few people on the site seem willing to bite that bullet. Should I almost always use my strong votes, since if I don't then an asshole might do that and thereby have an oversized effect?

Count me confused.

 

I won't go through the other points one by one. The main thing, I think, is that what you're describing is in conflict with the explicit phrasing of the reasons for voting. Compare:

What should my votes mean?

We encourage people to vote such that upvote means “I want to see more of this” and downvote means “I want to see less of this.”

with

What should karma indicate?

The karma on a post is intended to indicate whether, and by how much, members of the site would like to see more of the posts/comments in question. We encourage people to vote accordingly.

The former is the FAQ, but I think the latter is what you're describing. If this is the case, then I think this ends up being an asshole filter in itself, and the phrasing in the FAQ should be corrected.

(I realise as Ray qua user this has nothing to do with you but if you can pass this along to Ray qua admin that would be great!)

Replies from: Raemon
comment by Raemon · 2020-08-25T23:03:24.580Z · LW(p) · GW(p)

I actually think it's fairly bad that Strong Upvotes aren't asshole-filter-proof, and in this case I bite the bullet in the direction of "we should limit the power of strong upvotes somehow so they can't be abused." (I've thought this for a while; it just hasn't been top priority, and/or the team couldn't come up with an improvement that seemed better to everyone.)

That said, I think you did just remind me that there are a ton of vulnerabilities in the karma system that absolutely rely on people not abusing them most of the time, and yeah, I just actually retract that part of my argument. I do think we should eventually someday have a karma system that's more resilient, and the only reason it's not a higher priority is that in fact people are mostly good people, and the system just actually mostly works, so it's not as high priority as other site changes.

But I do still stand by "manipulating the position/weighting/visibility of posts is basically what 'see more/less of this' is actually supposed to mean, and is basically in the spirit of it."

We encourage people to vote such that upvote means “I want to see more of this” and downvote means “I want to see less of this.”

I basically always interpreted this to mean "I want to see more/less of this, and among the things that factor into what I want to see more/less of are subtle things about site norms that impact other people." 

I realize it's ambiguously worded.

My overall biggest crux here is "Karma is so far away from being a robust system that means concrete things that you just shouldn't worry too much about what exactly it means. You should know that other people are using it differently from you [for any value of 'you']. It's a vague, kludgy approximation that seems to mostly output reasonable things, and that's basically fine for now."

Replies from: Bucky, ChristianKl
comment by Bucky · 2020-08-26T13:13:22.210Z · LW(p) · GW(p)

Ok, I think I actually agree with your crux.

The points I was trying to make were (kinda scattered across the comments here!):

1. It is advantageous if people have a shared understanding of the system

2. Voting your own belief actually should work pretty well

3. There is a written norm in favour of voting your own belief

I think we disagree on all 3 to some extent, at least in how important they are. I think if the disagreement on number 3 goes away, then the disagreements on 1 & 2 are less important.

I'm ok with a norm of voting based somewhat on target karma (making it an overly strong effect would, I think, be detrimental), especially as this is now common knowledge and seems to be most people's preference. 

This whole thing has resolved some of my confusion as to why karma scores end up the way they do.

Replies from: Raemon
comment by Raemon · 2020-08-26T21:09:31.610Z · LW(p) · GW(p)

I want to note that I see "vote towards the ideal karma" as completely compatible with "vote your belief."

I think there are two fairly different questions here:

  • Should your vote include your beliefs about how much you want other people to see a given post, or what you think is best for others?
  • Should you vote based on your ideal total-karma for a post?

It so happens I think we might disagree about both of them (and disagree about what the best interpretation of the current rules is about them). But those are quite different questions, and you can do the second based entirely on your own preferences/beliefs. 

When a post is at 50, I can think that is a bit too high just from my general sense of what I want to see more of on the site. And it'd be throwing away information about my own beliefs to not give me the fine gradation of "I want to see these posts on the site about as often as I would if they got 50 karma, not the amount that I would if they got 200 karma."

Replies from: Bucky, Bucky
comment by Bucky · 2020-08-26T23:23:36.457Z · LW(p) · GW(p)

When a post is at 50, I can think that is a bit too high just from my general sense of what I want to see more of on the site. And it'd be throwing away information about my own beliefs to not give me the fine gradation of "I want to see these posts on the site about as often as I would if they got 50 karma, not the amount that I would if they got 200 karma."

This is true when the equilibrium position of the karma system is set to Total Karma Voting. 

 

I think that Blind voting would move the karma system to a new equilibrium. I'm not convinced we should do so, as I think it would be a fairly unstable equilibrium, but I think it would work if everyone did it, and it would allow for fine-grained expressions of your belief.

The equilibrium I envisage would be that the current amount of something that LW has is taken into account when people blind vote their opinion.

As an example, I think the reason that joke comments can get fairly high karma is that they're rare. If more people start writing joke comments as a result, then that's fine for as long as people keep upvoting them.

At some point the people who value the jokes least stop upvoting them or start downvoting them. This continues until the reward experienced by the jokers roughly matches the effort taken or some other balancing factor.

In the case of low-positive-value posts, some people have a higher threshold for what they will give an upvote to, and the more low-positive-value posts there are, the fewer people will upvote them.

(I think it's important to note here that we are not really that homogeneous in our opinions and weightings of different sources of value. A lot of the worries about Blind voting seem to assume that we're all going to vote the same way on the same posts, which I think is highly unrealistic. There also seems to be an assumption that everything fractionally above 0 value will get an upvote, which again seems unrealistic. Frankly, I think that anyone who can write a post good enough to persuade 100 different people with different standards to click the upvote button deserves to get 150 karma!)

The key then is that in order to get an oversized reward for the amount of effort put in, you have to do better than average at providing value.

In Blind Voting, accounting-for-how-much-of-a-certain-thing-there-currently-is-on-LW does the same job that considering-what-message-the-total-karma-sends does with Total Karma Voting. The former seems to have a lag in the message getting out, but I think when you're in a rough equilibrium the lag is relatively short.

 

So this brings me to what I think the main cost of Total Karma voting is. If an author looks at a post which has 25 karma from 10 votes, what does it mean? Roughly speaking, it means that it was considered about as valuable as another 25-karma post. The 10 votes tell the author how efficient the karma market was for the post and possibly give limited information on how varied the opinions were.

With Blind voting the author sees that and knows that 10 people had an opinion that this post was wanted more or less, and that their average strength of opinion was 2.5 karma points in favour. This probably consists of something like 3 people who want a lot more like it and 7 people who want a little more like it (or possibly some who wish there was less like it or were just yay/booing).

I agree that karma is a kludge and the true meaning isn't necessarily clear but with Blind voting it seems importantly less of a kludge and some extra information can be extracted.

comment by Bucky · 2020-08-26T21:39:11.163Z · LW(p) · GW(p)

I want to note that I see "vote towards the ideal karma" as completely compatible with "vote your belief."

Agreed. I was looking for a shorthand way of referring to the different voting policies but have yet to find one which is satisfactory - you've (rightly) shot down a couple of my ideas! Total Karma voting seems fine for one policy; maybe direct opinion voting for the other? If you shoot that one down too, you can come up with your own!

Replies from: Raemon
comment by Raemon · 2020-08-26T21:50:33.700Z · LW(p) · GW(p)

I think "blind voting" captures the distinction better – the key difference is whether you're supposed to look at or model the outcome.

Btw, another reason I think "take total karma into account" is important is because of how big a slap downvotes feel like. Blind voting means both that "mildly good comments" will get like 80 karma and that mildly bad comments will get like -80 karma, which would make the site feel very punishing.

Replies from: Bucky
comment by Bucky · 2020-08-26T23:31:35.948Z · LW(p) · GW(p)

I do think that it would be very bad if this happened. However I don't think this is likely. Quoting my other comment:

I think it's important to note here that we are not really that homogeneous in our opinions and weightings of different sources of value. A lot of the worries about Blind voting seem to assume that we're all going to vote the same way on the same posts, which I think is highly unrealistic. There also seems to be an assumption that everything fractionally above 0 value will get an upvote, which again seems unrealistic. 

This seems even more true for downvotes - I think people realise that downvotes feel extra bad and only use them sparingly. For instance, I only really downvote when I think something has been a definite breaking of a conversational norm or if someone is doubling down on an argument which has been convincingly refuted.

I think a spread of opinions on what constitutes a downvote (and the fact that comments get fewer votes in general) would make the -80 only happen to super egregiously bad comments.

comment by ChristianKl · 2020-08-26T13:53:30.192Z · LW(p) · GW(p)

It seems the definition of abuse of Strong Upvotes is about a person using them all the time. You could say that if a person uses Strong votes more than X% of the time they vote, the impact of their Strong votes gets reduced.

Replies from: jimmy
comment by jimmy · 2020-08-26T18:31:39.220Z · LW(p) · GW(p)

Adjusting in the other direction seems useful as well. If someone Strong Upvotes ten times less frequently than average I would want to see their strong upvote as worth somewhat more.

Replies from: Raemon
comment by Raemon · 2020-08-26T19:36:38.244Z · LW(p) · GW(p)

There's a hypothetical direction we could go where voting-weight is determined based on your vote frequency. The main disadvantage of this is that it becomes a lot harder to predict and conceptualize what voting does.

One hesitation habryka had about penalizing excessive strong downvotes is that people would end up trying to conserve them as a resource, like a videogame where you end up hoarding all your potions because you "might need them some day" and never actually use them.
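
One hypothetical way to operationalize the frequency adjustment discussed above; the X% budget (`target_fraction`) and the proportional-dilution rule are invented for illustration and are not anything the LW codebase actually does.

```python
def adjusted_strong_vote_weight(base_weight, strong_votes, total_votes,
                                target_fraction=0.1):
    """Scale a user's strong-vote weight by how sparingly they use strong votes.

    Users at or under the (hypothetical) target fraction keep their full weight;
    users over it have their strong votes diluted in proportion to the overshoot.
    """
    if total_votes == 0:
        return base_weight
    observed = strong_votes / total_votes
    if observed <= target_fraction:
        return base_weight
    return base_weight * (target_fraction / observed)

print(adjusted_strong_vote_weight(6, 5, 100))   # 6    -- sparing user keeps full weight
print(adjusted_strong_vote_weight(6, 50, 100))  # ~1.2 -- heavy user's strong votes get diluted
```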

comment by Raemon · 2020-08-25T20:27:14.322Z · LW(p) · GW(p)

Also note:

If instead I am trying to judge how much value I think the average LWer would get out of it, then I think this gets really hard to assess.

I think this sort of problem is still there if you're not trying to "move posts towards the 'correct' karma." 

"I want to see less/more of this" still depends on "how good do I think this is for LW as a whole?"

comment by Lukas Finnveden (Lanrian) · 2020-08-26T19:46:04.996Z · LW(p) · GW(p)

I could see this argument going the other way. If a post is loved by 45% of people, and meh to 55% of people, then if everyone uses target karma, the meh voters will downvote it to a meh position. As you say, the final karma will become people's median opinion, and the median opinion does not highlight things that minorities love.

However, if everyone votes solely based on their opinion, 45% will upvote the comment, and 55% won't vote at all. That means that it will end up in an overall quite favorable spot, as long as most comments are upvoted by less than half of readers.

I think both systems would have to rely on some people not always voting on everything. The non-TK system relies on there being large variability in how prone people are to voting (which I think exists; beware the typical mind fallacy... maybe another poll on how often people vote?), whereas the TK system relies on people abstaining if they're uncertain about how valuable something is to other people.

comment by Rafael Harth (sil-ver) · 2020-08-24T17:56:02.697Z · LW(p) · GW(p)
If you force people to ignore the current karma, you don't actually get a weighted average of opinions on the post because there's very little flexibility in how strongly you upvote a post.

Oops. Yes, this seems pretty obvious now that you've said it. I've edited the correction into the post.

comment by SarahSrinivasan (GuySrinivasan) · 2020-08-25T15:01:08.233Z · LW(p) · GW(p)

I think you mean that having a measure of the magnitude of people's like for a comment is a good thing, and voting based on current karma is the only easy way to get that, at present, even though voting based on current karma is an abjectly silly thing. Or I hope you mean that.

answer by Ericf · 2020-08-24T16:45:25.112Z · LW(p) · GW(p)

This seems like an actively harmful norm, and should be stopped. If the existing karma total influences your vote strength at all, then the same post could end up with different final karma depending on the order people read/rate it. I think that is actively harmful to the goals of the karma system.

comment by Rafael Harth (sil-ver) · 2020-08-24T17:23:31.667Z · LW(p) · GW(p)
This seems like an actively harmful norm, and should be stopped. If the existing karma total influences your vote strength at all, then the same post could end up with different final karma depending on the order people read/rate it. I think that is actively harmful to the goals of the karma system.

Yeah, this seems like a pretty reasonable reaction to me.

You're right about the dependence on order. However, it's worth pointing out that there is another way in which karma will depend on order that exists without this norm: people will decide to click or not to click on a post based on current karma. So, for example, suppose a post is at 2 karma, and persons A and B will both give it -2 upon reading it, but A will only click on it if it has at least 2 karma while B only needs at least 0; then the order A, B will lead to -2 karma, while B, A will lead to 0.

And a third way karma could end up depending on order is if people's perception of how good a post is depends on how much karma it has.

You could still be right about introducing yet another dependence being a bad idea.
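
A minimal sketch of the order effect in the example above (the thresholds and vote sizes are the hypothetical values from the example, not real data):

```python
def final_karma(initial_karma, readers):
    """Readers only open the post if its current karma meets their threshold,
    and vote a fixed amount once they have read it."""
    karma = initial_karma
    for threshold, vote in readers:
        if karma >= threshold:  # the decision to click depends on current karma
            karma += vote
    return karma

a = (2, -2)  # person A: clicks only at >= 2 karma, then votes -2
b = (0, -2)  # person B: clicks at >= 0 karma, then votes -2

print(final_karma(2, [a, b]))  # -2: A votes it to 0, then B votes it to -2
print(final_karma(2, [b, a]))  #  0: B votes it to 0, A no longer clicks
```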

answer by Viliam · 2020-08-24T20:50:06.939Z · LW(p) · GW(p)

I think this way of voting is completely fucked up.

upvote = I liked it

downvote = I disliked it

spitevote = I don't really like/dislike it, I just resent that others dislike/like it

If you like it then fucking upvote it, and if you don't like it then fucking downvote it, but don't do this "I am gonna let you vote first, and then whatever you choose, I will do the exact opposite so that your vote gets cancelled". You are just adding noise, and if many people do this, then the outcomes will depend on the order people voted -- an article that divides the audience 50:50 may end up upvoted or downvoted depending on whether the vote order was "downvoters, then spitevoters, then upvoters" or "upvoters, then spitevoters, then downvoters".

EDIT:

I agree that the current system has the problem that essentially karma = appeal × visibility, so that "slightly better than meh" content with lots of visibility can score lots of total karma. So maybe there should be a third way to vote, some kind of "mehvote" that would give +0 karma but also somehow drag the total karma towards zero, so that a comment with 5 upvotes and 5 mehvotes would result in less than 5 karma, but if it later gets an additional 3 downvotes, it would still remain positive. Not sure about the exact formula, but the idea is that the result is positive if upvotes outweigh the downvotes, negative if it's the other way round, exactly U-D if there are no mehvotes, and adding mehvotes brings the result asymptotically closer to zero.
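
One candidate formula with these properties, offered purely as an illustration (it works on raw vote counts, ignores differing vote weights, and is my guess rather than anything specified above): score = (U − D) × (U + D) / (U + D + M).

```python
def mehvote_score(upvotes, downvotes, mehvotes):
    """Illustrative scoring rule: the sign follows upvotes - downvotes, it reduces
    to U - D when there are no mehvotes, and mehvotes drag it toward zero."""
    total = upvotes + downvotes + mehvotes
    if total == 0:
        return 0.0
    return (upvotes - downvotes) * (upvotes + downvotes) / total

print(mehvote_score(5, 0, 0))  # 5.0  -- behaves like today's U - D
print(mehvote_score(5, 0, 5))  # 2.5  -- 5 mehvotes pull the result below 5
print(mehvote_score(5, 3, 5))  # ~1.2 -- still positive after 3 late downvotes
```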

Feedback should be genuine, and not include strategic thinking about other people's feedback. You can't have the wisdom of the crowd if too many people are concerned with what other people think. Maybe adding the third vote option is necessary; I am not sure. As a side effect, it would help distinguish between "I haven't voted on this yet" and "my vote is: meh".

comment by Raemon · 2020-08-25T00:08:41.752Z · LW(p) · GW(p)

It seems like your opinion here is carrying over from a pretty different voting mechanism (in political elections, people only vote once, and the point of the vote is to choose a single thing). Here, people can change their vote willy nilly, and the point of the vote is to get a general sense of how good something is, and people can constantly adjust their vote in response to other people if they want.

The karma = appeal + visibility thing makes your preferred way of voting an absolute dealbreaker in my opinion – it automatically outputs a wrong answer to the question I think voting is trying to answer ("which posts or comments are best?"). Naively, it results in any slightly good post getting the same amount of karma as a great post. (Or, now that we've added Strong Votes, it only allows two clusters of karma scores for anything that most people agree is good.)

I think it's plausible we should change the whole voting system to accommodate the feedback concern you're advocating here, but IMO you are advocating for basically switching to a new voting system, not "properly" implementing the current one.

Replies from: Viliam, Bucky
comment by Viliam · 2020-08-25T13:38:55.045Z · LW(p) · GW(p)

I think it is a desirable property of a voting mechanism (whether here or in politics) that your vote should reflect your opinion on the issue, instead of... some strategic calculation that includes other people's votes.

Here, people can change their vote willy nilly, and [...] constantly adjust their vote in response to other people if they want.

In theory yes, but is this how you really want to spend your time? Revisiting the old discussions and reconsidering your old votes in light of the new votes from other users...

IMO you are advocating for basically switching to a new voting system, not "properly" implementing the current one.

I have a strong opinion on "don't downvote the content you like, and don't upvote the content you dislike". Other than this, I am quite happy with the voting mechanism as it exists now.

However, if spitevoting becomes a common practice, then I'd prefer to see the voting mechanism changed rather than abused. If other people (not me) feel a strong desire to express that some content is mediocre (in a way different than abstaining from voting), I would prefer that they have a first-class option to do that, instead of strategically abusing the existing options.

comment by Bucky · 2020-08-25T14:24:03.758Z · LW(p) · GW(p)

IMO you are advocating for basically switching to a new voting system, not "properly" implementing the current one.

Compare to the LW FAQ [? · GW]:

What should my votes mean?

We encourage people to vote such that upvote means “I want to see more of this” and downvote means “I want to see less of this.”

comment by Ben Pace (Benito) · 2020-08-24T21:14:34.810Z · LW(p) · GW(p)

I am inclined to take your strong language as expressive, kind of like Shia LaBeouf roaring at me.

But in case not, I think it's good to remember Your Price For Joining [? · GW].

But usually... I observe that people underestimate the costs of what they ask for, or perhaps just act on instinct, and set their prices way way way too high.  If the nonconformist crowd ever wants to get anything done together, we need to move in the direction of joining groups and staying there at least a little more easily.  Even in the face of annoyances and imperfections!  Even in the face of unresponsiveness to our own better ideas!

The voting system is overall doing its job well. Great posts reliably find their way to the top, we're not overrun by newbies taking all the attention (I claim), and a number of other good things.

If I find out a lot of people this whole time have been voting using an algorithm that seems bad to me... it's not my favorite thing, but I can live with it, clearly. I won't escalate very much on that fight, and I don't think it's worth it to escalate too much. 

I don't mean with this comment to take an object level stance on the question at hand.

Replies from: Viliam, sil-ver
comment by Viliam · 2020-08-25T14:01:14.472Z · LW(p) · GW(p)

My strong language is an expression of annoyance, not anger, just to avoid misunderstanding.

The effects of a voting system depend on how people use it. For example, you could have exactly the same voting mechanism and a group norm of "upvote everything that contains your ingroup's applause lights, and downvote everything that contains your outgroup's applause lights" and the effects would be quite different.

If it becomes common knowledge that "you should sometimes downvote the stuff you like, and upvote the stuff you don't like", gods help us all. I am trying to fight this... emerging group norm.

On reflection, "downvoting a comment you like but not too much" is the lesser problem here. What makes my blood boil is people strategically upvoting comments that "are stupid, but I think that -10 karma goes a bit too far". But I assume the same norm covers both.

Replies from: Benito
comment by Ben Pace (Benito) · 2020-08-25T19:20:51.680Z · LW(p) · GW(p)

(Thx for the reply, that makes sense. Will try to get around to writing my own answer on this thread soon.)

comment by Rafael Harth (sil-ver) · 2020-08-24T21:24:35.135Z · LW(p) · GW(p)

By the way, I think (~80%) you were the one who once made a comment mentioning that you consider current karma in your votes. That comment was what got me thinking about this in the first place.

Replies from: Benito
comment by Ben Pace (Benito) · 2020-08-24T22:04:17.903Z · LW(p) · GW(p)

That makes sense. I was thinking of saying something in an answer/comment. I agree with many of the critiques in this thread; it has been pretty helpful, so thanks for setting it up.

comment by Rafael Harth (sil-ver) · 2020-08-24T21:03:03.999Z · LW(p) · GW(p)

Thanks for being unfiltered here; I definitely want to know if others think this is a bad thing to do. I share the intuition (at least somewhat).

Replies from: Viliam
comment by Viliam · 2020-08-24T21:29:50.741Z · LW(p) · GW(p)

In a recent election in my country, there was a political coalition that was new and quite popular in my bubble. According to the law, they needed 7% of votes to get into the parliament. And many people I know were like "they are safely above the limit, so although I prefer them, I will strategically vote for one of the less popular but still okay parties instead, to help them also pass the limit". Then all the votes were counted, and the coalition received 6.96%. I personally know at least three people who then regretted their vote.

So I may be a bit more sensitive about this topic than usual. But this "voting to balance other votes" is not a new idea, and I already opposed it before.

answer by Bucky · 2020-08-24T20:10:42.830Z · LW(p) · GW(p)

It is important that everyone use a similar condition for voting.

Inasmuch as voting has a defined meaning understood by the community ("I want to see more/less of this [? · GW]"), using it to mean something else is a Simulacrum level 2 action which starts to distort the shared map.

If we want to change the meaning of karma to be self-referential then I guess that might work but it would require this being agreed by the community as the new meaning. Doing so unilaterally on an individual basis increases the effectiveness of one's own opinion at the expense of others' opinions.

answer by Dagon · 2020-08-24T15:21:28.047Z · LW(p) · GW(p)

I think it's a mistake to focus very much on karma scores. My voting is not consistent - sometimes I vote based on like/dislike of the topic, sometimes I vote based on well-argued/confusing presentation, etc. Other than "don't want this on LW" cases, I tend to leave a comment about downvotes, but rarely do for upvotes, and almost never for not-voting.

I tend to at least glance at most new posts every morning as I wake up and get my head moving before work, so I often vote before there is much accumulated karma on the post. I do sometimes later remove the vote, or even downvote, if it has accumulated much more than I think it should.

I answered "non-negligible consideration", but it's worth mentioning that I follow a pattern: I tend to vote based on my simple reaction if the current score is between 1 and 25-ish. And I tend to vote more strategically ("what signal do I want the total score to send") below 1 and above 25.

answer by Measure · 2020-08-26T16:33:20.455Z · LW(p) · GW(p)

I rarely vote on anything since I mainly lurk without contributing anything.

If I did start voting more often, then my strategy would take into account desired karma target, but I would use current karma as evidence of a post's value to the community (which I also value in addition to the post's direct value to myself).

A way to mitigate the vote-order effect at the mechanism level would be to have users vote by specifying a desired karma target and then have the system vote strategically on their behalf (changing the vote later if necessary as new votes are cast).
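
A rough sketch of that delegated-target mechanism; the class name, the ±2 vote cap, and the single-pass rebalancing rule are all assumptions for illustration, not how LW voting actually works.

```python
class DelegatedKarma:
    """Each user states a desired total karma; the system casts the strongest
    allowed vote in that direction and re-casts votes as new targets arrive."""

    def __init__(self, max_vote: int = 2):
        self.max_vote = max_vote
        self.targets = {}  # user -> desired total karma
        self.votes = {}    # user -> vote currently cast on that user's behalf
        self.karma = 0

    def set_target(self, user: str, target: int) -> None:
        self.targets[user] = target
        self._rebalance()

    def _rebalance(self) -> None:
        # One pass over users in order: each vote is re-cast so the running
        # total moves toward that user's target, clamped to +/- max_vote.
        for user, target in self.targets.items():
            base = self.karma - self.votes.get(user, 0)
            vote = max(-self.max_vote, min(self.max_vote, target - base))
            self.karma = base + vote
            self.votes[user] = vote

post = DelegatedKarma()
post.set_target("alice", 10)   # alice's target is 10; she can only add +2 -> karma 2
post.set_target("bob", 0)      # bob's target is 0; his -2 brings it back to 0
print(post.karma, post.votes)  # 0 {'alice': 2, 'bob': -2}
```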

answer by algon33 · 2020-08-25T20:31:48.565Z · LW(p) · GW(p)

When I bother to vote, I do take TK into account when upvoting. Karma serves a signalling purpose. But only when abs(TK) is large. If I see a post with +50 karma, I would have quite high expectations of it. If it exceeds that expectation, and I remember voting is a thing, I will upvote it. Since I almost never downvote, I can't say how much TK affects that.

answer by Mark Xu · 2020-08-24T16:19:38.806Z · LW(p) · GW(p)

copying my comment from https://www.lesswrong.com/posts/PX7AdEkpuChKqrNoj/what-are-your-greatest-one-shot-life-improvements?commentId=t3HfbDYpr8h2NHqBD

Note that this is in reference to voting on question answers.

> Downvoting in general confuses me, but I think that downvoting to 0 is appropriate if the answer isn't quite answering the question, but downvoting past zero doesn't make sense. Downvoting to 0 feels like saying "this isn't that helpful" whereas downvoting past 0 feels like "this is actively harmful".

13 comments

Comments sorted by top scores.

comment by Rafael Harth (sil-ver) · 2020-08-24T12:50:10.432Z · LW(p) · GW(p)

Voting Thread

(Don't forget the karma balance comment.)

Replies from: sil-ver, sil-ver, sil-ver, sil-ver
comment by Rafael Harth (sil-ver) · 2020-08-24T12:54:59.770Z · LW(p) · GW(p)

Option 2: Weak-upvote this if you give non-negligible consideration to what you think the total karma should be, but it isn't your primary concern.

comment by Rafael Harth (sil-ver) · 2020-08-24T12:55:46.120Z · LW(p) · GW(p)

Option 3: Weak-upvote this if you give zero or negligible consideration to what you think the total karma should be.

comment by Rafael Harth (sil-ver) · 2020-08-24T12:53:05.861Z · LW(p) · GW(p)

Option 1: Weak-upvote this if you vote primarily based on what you think the total karma should be.

comment by Rafael Harth (sil-ver) · 2020-08-24T12:58:26.578Z · LW(p) · GW(p)

Karma Balance: Weak-downvote this if you participated in this poll by weak-upvoting any of the three options.

comment by Dagon · 2020-08-25T15:56:19.019Z · LW(p) · GW(p)

It's interesting that this is somewhat related to the abstraction-levels and truth-telling discussions. Are votes a statement of fact (how you think about this post), or a performative act (trying to influence future behavior)?

The impact of scores is entirely about the aggregate level. Why WOULDN'T a consequentialist focus on the effect of their action, rather than the written-but-unenforceable suggestions for how to vote?

Replies from: Bucky
comment by Bucky · 2020-08-25T18:19:35.357Z · LW(p) · GW(p)

I like that framing in the first paragraph.

In the second paragraph I can’t work out if the question is intended rhetorically, ironically or genuinely!

Replies from: Dagon
comment by Dagon · 2020-08-25T21:29:16.745Z · LW(p) · GW(p)

Sorry for the confusion! It was intended as semi-ironic, semi-genuine rhetoric. In many discussions, I take the position that communication is an action by an agent, rather than necessarily a conveyance of truth, more strongly than many people on LW do. I probably wrote it in an overly-reactive style, but I don't regret it enough to change it.

Replies from: Bucky
comment by Bucky · 2020-08-25T22:22:48.626Z · LW(p) · GW(p)

Now I feel less stupid for not getting it - at least I included all of the different parts of the recipe! A very impressively content-dense comment.

I have a close-to-deontological belief in the need to obey the rules of a community that's trying to create things together (even when the rules seem wrong) and I think I tend to interpret things in that frame (for or against) even if that isn't the intention. In the immortal words of Scott Alexander:

No! I am Exception Nazi! NO EXCEPTION FOR YOU!

Replies from: Dagon, Raemon
comment by Dagon · 2020-08-26T00:01:23.328Z · LW(p) · GW(p)
I have a close-to-deontological belief in the need to obey the rules of a community that's trying to create things together (even when the rules seem wrong) and I think I tend to interpret things in that frame (for or against) even if that isn't the intention.

Yeah, I acknowledge that I'm a bit of a jerk in that I disregard rules more easily than most people find comfortable. I take more of a Chesterton's fence approach within an overall consequentialist framework. Following the rules is a great default choice. If I don't want to put the energy into analyzing the reasons behind the rules, or can't understand the situation well enough to know WHY I think the universe is improved by my rules violation, I should just obey.

But if I do have a belief that the outcome is better by some other action, I take that action.

The written, legible rules are, I believe, a map of the ideas (maps) of the authors of the rules. The actual rules are what happens - the results of my actions, whether that's better identification of great posts, or more interesting discussions, or confusion and discomfort in readers, or my ejection from the community. The written rules are both a prediction and a coarse-grained statement of intent about those results, but they almost always diverge from reality.

Note that I do not generalize to "everyone should take this action". I'm denying the completeness of rules, not creating new ones (though I do sometimes propose new rules or different Schelling points; that's just another level of consequentialism). I'm also something of a jerk in my level of elitism that lets me do things I think are good, EVEN IF those choices wouldn't scale, and would cease to be good if many others did them (I'd STOP the rogue actions if that occurred, but I wouldn't avoid them just because of the counterfactual universality).

For voting, I haven't analyzed whether I should change my strategy based on whether some, many, or most other LW voters are voting their conscience or voting for results. I think my preference (vote for results) scales, but I'm not certain.

comment by Raemon · 2020-08-25T23:07:27.690Z · LW(p) · GW(p)

I have a close-to-deontological belief in the need to obey the rules of a community that's trying to create things together

To be clear, I agree with this, it's just that in this case I think it's actually kinda important that the rules of karma are vague and underspecified (so that it can handle a wide variety of problems in different contexts), and trying to "follow the letter of the law" deontologically probably isn't a good use of your effort.

comment by Dagon · 2020-08-26T17:12:31.248Z · LW(p) · GW(p)

As a followup - do you vote using the same strategy for comments, shortforms, personal blogposts, linkposts, and frontpage original content?

I hadn't really thought about the distinction until this question came up, but I think my strategy is "vote naively (based only on my opinion)" for things with low total vote expectations (say, less than 15), which describes most comments and shortforms, some personal posts, and only a few frontpage posts.

comment by Slider · 2020-08-24T23:26:22.029Z · LW(p) · GW(p)

At one time I modeled my posts' scores as being dominated by this kind of voting. It led me to think about post stereotypes and norms rather than about individual people being individually opinionated.

There is something funky about the irregular level of scrutiny. If every post was voted on by everyone who read it, that would be one thing. But the norms seem to be different at the top of the nest vs. in the depths of the nest.