Meta: Karma and lesswrong mainstream positions

post by FAWS · 2011-04-07T10:44:28.446Z · LW · GW · Legacy · 45 comments

My impression is that critiques of lesswrong mainstream positions and arguments for contrary positions are received well and achieve high karma scores when they are of very high quality. Similarly, posts and comments that take lesswrong mainstream positions will still be voted down if they are of very low quality. But in between there seems to be a gulf: Moderately low quality mainstream comments will stay at 0 to -1 karma while contra-mainstream comments of (apparently) similar quality score solidly negative karma; moderately high quality mainstream comments achieve good positive karma while similar quality contra-mainstream comments stay at 0 to 2.

Do you share my impression? And if this is the case, should we try to do something about it? 

45 comments

Comments sorted by top scores.

comment by David_Gerard · 2011-04-07T11:59:59.804Z · LW(p) · GW(p)

Best thing to do about it, I think, would be to point out that putting effort into your comment will get you paperclips.

In my experience, if I do my homework and show it (e.g. links in the post), I can be not merely contrary but actually wrong, and still get awarded paperclips for effort. I presume this is a vote for "more like this", i.e. posts that put in effort.

This will also result in higher-quality comments, which is the point of awarding paperclips in the first place. So, so far so good.

Replies from: jwhendy, None, radical_negative_one
comment by jwhendy · 2011-04-07T17:48:33.867Z · LW(p) · GW(p)

In my experience, if I do my homework and show it (e.g. links in the post), I can be not merely contrary but actually wrong, and still get awarded paperclips for effort. I presume this is a vote for "more like this", i.e. posts that put in effort.

Well, that, or the effort makes you look like you're right. You could be right and a vote is translated as, "Yes, more of this," but an alternative would be, "Wow, sources, looks smart and probably right."

Or, people just don't follow the links but upvote because they're passing by and the links attract attention and a click of the mouse.

(Not pooh-poohing your effort, just suggesting alternative interpretations.)

comment by [deleted] · 2011-04-09T05:51:15.128Z · LW(p) · GW(p)

Best thing to do about it, I think, would be to point out that putting effort into your comment will get you paperclips.

Probably. But suppose you want to increase your paperclips per unit of effort. Then that point won't really help.

If someone wants to up their account karma per unit of effort, there are ways.

  • Rather than writing one long comment, splitting it into two gives you the chance to garner two upvotes per fan instead of one. If your comment is a mix of good and bad points, dividing it may separate the gold in your comment from the lead.

  • Putting a lot of effort into your comments can backfire, because hard to write can mean hard to read. If your long comment isn't read, it probably won't be upvoted.

  • Repetition works. If you notice that one of your comments got a lot of karma, repeating the performance can multiply your profit.

So: does effort pay off? I'm sure it does on average, but from what I've seen, the variation is very high.

comment by radical_negative_one · 2011-04-09T05:32:39.191Z · LW(p) · GW(p)

Nothing substantive to say, but I do remember somebody recently suggesting that the karma "points" should be nicknamed "paperclips". It's nice to see somebody using this convention; it amuses me.

Replies from: David_Gerard
comment by David_Gerard · 2011-04-09T07:34:35.088Z · LW(p) · GW(p)

Paperclips are great. I'm finding that as I spend more time on LessWrong, I'm becoming increasingly interested in the idea of paperclips.

comment by Alicorn · 2011-04-07T15:43:49.325Z · LW(p) · GW(p)

I tend to downvote comments that are abusive, borderline spam (nonborderline stuff I can ban), repetitive ad nauseam with other comments the same person has made recently, or -

have three or more entangled things wrong with them (where "wrong with them" doesn't include the bare fact of a non-mainstream position, but often includes the reasoning that leads to such positions).

When it's just one or two problems, or problems that exist independently and can be addressed separately, I prefer to confront the problems in a comment, and my voting thereafter is mostly dependent on how my comment is answered, if at all. (Repetition of the content of the original comment trips the "repetitive" switch, for instance, and abuse also sometimes enters the conversation at this stage.)

When it's three problems or more, all of which relate to and intertwine with each other, I feel helpless to attack any of them, because I feel like I'd have to somehow manage all of the problems simultaneously. Sometimes I click "reply" and then start to write an answer four or five times, before giving up because any grammatically correct first sentence won't do justice to the awfulness I'm reacting to. The only way to express how very bad such a comment is, is to give the entire thing a good hard swat, which means downvoting.

Replies from: Desrtopa
comment by Desrtopa · 2011-04-07T16:56:58.176Z · LW(p) · GW(p)

This is pretty much the same standard I use (save for not being able to ban spam.) I also downvote people who persist in confusions with less than three entangled problems in the face of repeated attempts at correction, but usually not when I'm attempting to provide the corrections myself because it makes me feel rude and more likely to become frustrated.

comment by PhilGoetz · 2011-04-08T03:20:26.359Z · LW(p) · GW(p)

I have similar impressions. I think there may also be a difference in time: Contrarian posts and comments will sometimes get voted down very quickly, within a few minutes of being posted, then be voted up over the next few days. It could be that LW conformists are more likely to check the site obsessively, and so most of the first dozen people to see a comment are conformists.

There is a phenomenon on Wikipedia, where some editors spend their days hunched over the keyboard, not trying to make Wikipedia better, but trying to enforce their vision of Wikipedia's social norms, mostly by deleting articles with the zeal of a Spanish Inquisitor and the rules-lawyering of a D&D munchkin. See the user contributions of Realkyhick for an example of this. Note that he has several times deleted an article within minutes of it first appearing, while the author was still writing it. Perhaps there's a similar psychology at work on LW.

Replies from: wedrifid
comment by wedrifid · 2011-04-08T03:37:06.261Z · LW(p) · GW(p)

I think there may also be a difference in time: Contrarian posts and comments will sometimes get voted down very quickly, within a few minutes of being posted, then be voted up over the next few days. It could be that LW conformists are more likely to check the site obsessively, and so most of the first dozen people to see a comment are conformists.

I attribute this to disagreement within the immediate discourse. In general, whenever a group of a few people are having an argument in a sequence of comments and (at least) one side clearly cares, a new comment that refutes the previous one often receives quick downvotes. This is independent of whether the position disagrees with the community at large and depends directly on the 'other side' taking it personally.

I know that some of my highest voted series of comments actually started at below -2. It is only after a day or two that they settled at their stable 'approved of' status. In contrast to your suggestion, this would seem to suggest that it takes time for the 'mainstream' tide of opinion on comment value to overwhelm the eddy current of personal dislike.

comment by TheOtherDave · 2011-04-07T14:10:30.356Z · LW(p) · GW(p)

My observations are consistent with your impression, and I don't think we should try to do anything about it.

I also suspect that an analysis of tone -- roughly speaking, of how obnoxiously the commenter comes across -- would account for a large chunk of the variance.

Replies from: Gray
comment by Gray · 2011-04-08T04:03:42.637Z · LW(p) · GW(p)

I was just trying to think of what obnoxious means in this context because, well, which of us wants to come across as obnoxious? And I think it means, with some latitude, that the writer suggests that he or she is aiming at something different from what the other participants are aiming at. This could be egotism/narcissism, persuading others towards a pet belief system, or taunting others/trollishness.

The other alternative could be issues concerning rhetorical style. Either the writer's rhetoric clashes with what the reader is accustomed to, or the emphasis of the post makes it difficult for readers to get through to the substance of the arguments.

Or other meanings I'm not aware of.

Replies from: TheOtherDave
comment by TheOtherDave · 2011-04-08T13:52:46.643Z · LW(p) · GW(p)

If I were to do this analysis, there are a few dimensions I would look at first to see how much variance they account for:

  • ratio of words devoted to negation, vs. words devoted to proposing an alternative idea or to asking for clarification

  • nonresponsiveness... that is, where comment C1 = X, and C2 = "-X, because Y1..Y3", and C3 = X with no significant addressing of Y1..Y3. (This is a little tricky, though, because in general I expect the absolute value of karma scores to decrease the more nested they are.)

  • the combination of stridency and incoherence. (Either in isolation I wouldn't expect to account for much negative karma.)

Replies from: Gray
comment by Gray · 2011-04-08T21:44:13.274Z · LW(p) · GW(p)

Ahah, that certainly aids in understanding your statement.

comment by [deleted] · 2011-04-07T23:49:02.195Z · LW(p) · GW(p)

One potential function of a karma system I have not seen mentioned is that it can act as a release valve that gives a person a way to express his displeasure, fair or not. Without that release valve the person might instead write a flame, triggering a flame war. This is in addition to the fact that actual flames are themselves likely to be heavily downvoted. Some unfairly assigned negative karma may be a fair price to pay for relative freedom from flamewars and the tremendous waste of time they represent.

comment by Vladimir_M · 2011-04-07T23:21:14.483Z · LW(p) · GW(p)

My impression is that there are a few issues where contrarian positions will tick people off and result in undeserved downvotes unless they're extremely well written and argued. There is also a somewhat larger set of issues where posts and comments will get upvoted heavily, and sometimes stratospherically, despite being pure applause lights. However, as someone who has written many comments of varying quality criticizing various positions that are prevalent on LW, I can say that as long as they don't touch any of the few third rails, contrarian comments overwhelmingly end up with non-negative scores even if they're less than stellar.

When it comes to posts downvoted to -1, and sometimes also -2, one confounding issue is the passive-aggressive downvotes of frustrated participants in the discussion. (Scores below -2 usually indicate a wider range of downvoters.) These typically get reversed by other readers, but sometimes nobody sees it or cares enough.

On the whole, I think the present system works as well as could be reasonably expected. The only change I'd like to see is separate tracking of upvotes and downvotes, so that controversial comments would stand apart from those that are just plain uninteresting.

Replies from: Swimmer963, None
comment by Swimmer963 (Miranda Dixon-Luinenburg) (Swimmer963) · 2011-04-10T02:49:12.549Z · LW(p) · GW(p)

The only change I'd like to see is separate tracking of upvotes and downvotes, so that controversial comments would stand apart from those that are just plain uninteresting.

I would like that too... it would also be nice to know this for top-level posts.

comment by [deleted] · 2011-04-08T01:28:26.587Z · LW(p) · GW(p)

contrarian comments overwhelmingly end up with non-negative scores even if they're less than stellar.

Really? There are currently four discussion posts on Bayesian Epistemology versus Popper:

http://lesswrong.com/r/discussion/lw/54u/bayesian_epistemology_vs_popper/ http://lesswrong.com/r/discussion/lw/551/popperian_decision_making/ http://lesswrong.com/r/discussion/lw/552/reply_to_benelliott_about_popper_issues/ http://lesswrong.com/lw/3ox/bayesianism_versus_critical_rationalism/

These posts have attracted close to six hundred comments between them, yet the first three remain at around 0 to -1 and the last one at 4. The first three posts were written by curi, a Popperian critical of Bayesianism, who has also contributed a large number of quality comments, and probably many more comments than anyone else on LW during the same time period. So you've got this high quality commenter who has the skills to write a lot of good stuff very quickly and who is generating interest but who has fairly dismal karma. Explain.

Personally, I don't think the votes are anything people should care about.

Replies from: JoshuaZ, CarlShulman, Desrtopa, TheOtherDave, curi, curi, curi
comment by JoshuaZ · 2011-04-08T03:08:50.338Z · LW(p) · GW(p)

So you've got this high quality commenter who has the skills to write a lot of good stuff very quickly and who is generating interest but who has fairly dismal karma. Explain.

You are making potentially unsubstantiated assumptions here. Note, for example, that curi at one point asserted that he didn't want to read up on Bayesianism because he'd find it boring, but that the fact that he had read Harry Potter and the Methods of Rationality should help. Curi's comments have been of highly variable quality.

Replies from: curi
comment by curi · 2011-04-08T09:28:10.451Z · LW(p) · GW(p)

It does help create familiarity with the culture. I have of course also read up on Bayesianism. I specifically said I wasn't interested in the sequences. I read stuff from Eliezer before they existed. I see no need to read more of the same. Be more careful not to misquote.

Note that I just read an academic paper on Bayesianism. And before that I read from Jaynes's book. I ordered two books from the library on recommendations. So obviously I will read up on Bayesianism in some ways.

http://lesswrong.com/lw/54u/bayesian_epistemology_vs_popper/3vew

Replies from: JoshuaZ
comment by JoshuaZ · 2011-04-08T13:33:20.759Z · LW(p) · GW(p)

It does help create familiarity with the culture. I have of course also read up on Bayesianism. I specifically said I wasn't interested in the sequences. I read stuff from Eliezer before they existed. I see no need to read more of the same. Be more careful not to misquote.

I don't know how this is a misquote. You didn't know the details of Cox's theorem (it wasn't even clear you were familiar with the theorem).

I said:

You may have valid points to make but it might help in getting people to listen to you if you don't exhibit apparent double standards. In particular, your main criticism seems to be that people aren't reading Popper's texts and related texts enough. Yet, at the same time, you are apparently unaware of the basic philosophical arguments for Bayesianism. This doesn't reduce the validity of anything you have to say but as an issue of trying to get people to listen, it isn't going to work well with fallible humans.

You then replied:

Learning enough Bayesian stuff to sound like a Bayesian so people want to listen to me more sounds to me like more trouble than it's worth, no offense. I'm perfectly willing to read more things when I make a mistake and there is a specific thing which explains the issue. I have been reading various things people refer me to. If you wanted me to study Bayesian stuff for a month before speaking, well, I'd get bored because I would see flaws and then see them repeated, and then read arguments which depend on them. I did read the whole HP fic if that helps.

Neither of us made any mention of the sequences (which in any event wouldn't be great reading for this purpose -- very little of them actually has to do with Bayesianism directly).

Your link showing that you read an academic paper on Bayesianism occurs 30 hours after your above comment. Even if you were trying specifically to understand the culture of LW (not something stated in your earlier remark), reading HPMR is an awful way of going about it. So I don't understand your point at all.

In any event, whether or not you intended to mean something else isn't terribly relevant to the point I was trying to make: comments which try to include reading incomplete Harry Potter fanfic as legitimate evidence of having done one's research are not high quality remarks and seriously undermine your credibility.

comment by CarlShulman · 2011-04-08T02:55:09.263Z · LW(p) · GW(p)

Many people are motivated to comment to critique posts that they see as low quality. In contrast, a post that covers most issues it raises well may leave little room for comment.

comment by Desrtopa · 2011-04-08T23:14:13.813Z · LW(p) · GW(p)

The first three posts were written by curi, a Popperian critical of Bayesianism, who has also contributed a large number of quality comments, and probably many more comments than anyone else on LW during the same time period. So you've got this high quality commenter who has the skills to write a lot of good stuff very quickly and who is generating interest but who has fairly dismal karma. Explain.

Upvotes signal "I would like to see more like this," and downvotes signal "I would like to see less like this." Curi was upvoted to start with for raising some ideas not in common circulation here, and for making some claims of poor scholarship on Eliezer's part (absent extenuating factors, we tend to upvote comments which promote improved scholarship). He began to be downvoted as other posters became frustrated with his double standards of scholarly expectations, poorly founded arguments, and failure to follow through on requests for information that would convince us to take further interest in Popper. The downvotes indicate that other posters no longer feel that he is participating according to standards we consider appropriate.

Criticism of the ideas that are mainstream here always generates activity, and the reception is positive when the conduct is positive, and negative when the conduct is negative.

comment by TheOtherDave · 2011-04-08T03:16:52.074Z · LW(p) · GW(p)

who is generating interest but who has fairly dismal karma. Explain.

My tentative explanation is that it's not actually generating as much interest as you suggest.

Rather, there are a small number of people generating a vast number of comments that don't seem to generate any useful progress, and thus don't garner much karma.

Admittedly I'm mostly generalizing from my own experience here... that kind of volume-to-content ratio is not something I want to see more of (and is, indeed, something I'd like to see less of), and the low karma scores seem consistent with the idea that other people are like me in this respect.

That said, I could be wrong... it may be that lots of other people find that dialog worthwhile, that I'm the exception, and that the karma scores have some other explanation I haven't thought of.

Personally, I don't think the votes are anything people should care about.

To the extent (A) that votes on X's comments/posts reflect other people's desire to have stuff like those comments/posts on this site, and to the extent (B) that X cares about other people's desires, X should care about votes.

Of course, extent A is difficult to determine with confidence, and extent B is a consequence of X's values. For myself, I estimate A to be fairly high, and B is pretty high for me, so I care about votes and I think I should care about them.

Replies from: JoshuaZ
comment by JoshuaZ · 2011-04-09T17:05:15.371Z · LW(p) · GW(p)

Rather, there are a small number of people generating a vast number of comments that don't seem to generate any useful progress, and thus don't garner much karma.

I've obtained a delta of about +100 karma in this discussion. So this explanation seems wrong.

Replies from: TheOtherDave
comment by TheOtherDave · 2011-04-09T18:25:25.874Z · LW(p) · GW(p)

Fair enough... if that's coming from a small number of highly-ranked comments, that is indeed evidence that a lot of people are interested in the exchange. (If it's a large number of low-ranked comments, it's equally consistent with a small number of people who endorse your engagement in it.)

Thanks for the counterargument.

Replies from: JoshuaZ
comment by JoshuaZ · 2011-04-09T21:13:15.717Z · LW(p) · GW(p)

Fair enough... if that's coming from a small number of highly-ranked comments, that is indeed evidence that a lot of people are interested in the exchange. (If it's a large number of low-ranked comments, it's equally consistent with a small number of people who endorse your engagement in it.)

That seems like an accurate assessment. At present this comment http://lesswrong.com/lw/54u/bayesian_epistemology_vs_popper/3uqx is at +8 and this comment http://lesswrong.com/lw/54u/bayesian_epistemology_vs_popper/3usd is at +13, but the second comment also got linked to in a separate thread. No other comment of mine in that discussion has been upvoted to more than 5. That data, combined with your remark above, suggests that your initial remark was correct.

comment by curi · 2011-04-08T09:21:36.721Z · LW(p) · GW(p)

My Karma has been experiencing heavy fluctuations. This is different from what I have seen on other websites with Karma. It would often go up 10 points or more, and then down 10 points, in a short time period. After getting to 50 a few times it then dropped to 20. Then I went to sleep and woke up with 88. Today it changed less, gradually dropping to 65.

I have no particular opinion about what this means, and I don't really care what my karma is, but I'm a bit curious why this is different from what happens at, e.g., Hacker News.

comment by curi · 2011-04-08T22:57:33.730Z · LW(p) · GW(p)

BTW my Karma hit 0 today from a high of around 90. I think it may be capped there...

There is rate filtering on writing comments here. I think low karma makes the rate filter more aggressive. (E.g., this comment ran into rate filtering, but a couple days ago I was posting at a higher rate than today.)

In two attempts to post fairly exactly when allowed to, both times I got a message about how many milliseconds I still had to wait (250 and 184). Seems pretty unlikely to cut it that close twice without trying to. I wonder if there's a bug. I wonder, given that we have here some genuine numbers, if any Bayesian could calculate the probability it's a bug vs coincidence, to show us an example of their philosophy in action (I have yet to see any realistic examples where an actual number is calculated).
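For what it's worth, here is a minimal sketch (added for illustration) of how that bug-vs-coincidence comparison could be set up. The prior, the assumed ~5-second timing slop, and the bug model's likelihood are invented numbers; only the two reported waits (250 ms and 184 ms) come from the comment above.

```python
# Back-of-the-envelope Bayesian comparison of "timer bug" vs. "coincidence".
# All model parameters below are illustrative assumptions.

prior_bug = 0.5               # assumed prior probability that the timer is buggy
slop_ms = 5_000               # assume "fairly exactly when allowed to" means within ~5 s
waits_ms = [250, 184]         # the two "milliseconds remaining" messages reported

# Coincidence model: the attempt lands uniformly in the slop window, so the
# chance of falling inside the last w milliseconds before the unlock is w / slop.
p_data_given_coincidence = 1.0
for w in waits_ms:
    p_data_given_coincidence *= w / slop_ms

# Bug model: assume a buggy timer shows a small leftover wait ~90% of the time.
p_data_given_bug = 0.9 ** len(waits_ms)

posterior_bug = (p_data_given_bug * prior_bug) / (
    p_data_given_bug * prior_bug + p_data_given_coincidence * (1 - prior_bug)
)
print(f"P(data | coincidence) = {p_data_given_coincidence:.2e}")  # ~1.8e-03
print(f"P(bug | data)         = {posterior_bug:.3f}")            # ~0.998
```

Under these made-up assumptions the bug hypothesis dominates, but the answer is driven almost entirely by the assumed slop window, which is exactly the kind of judgment call the calculation cannot settle on its own.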

edit: just jumped up to 54 in the last like 10 minutes. i wonder why.

edit again: make that 67. i guess someone is upvoting all of my comments. lol

edit again: 84. whoever is doing this, if you like me that much want to talk? email me curi@curi.us

Replies from: Desrtopa
comment by Desrtopa · 2011-04-08T23:57:59.533Z · LW(p) · GW(p)

It's not a bug, it's a feature. Downvotes mean "I want to see less like this." If a person generates content that other members want to see more of, they quickly reach a point where they are allowed to post it at a rate only limited by how quickly they can generate it. Why limit someone's productivity if their work is getting a positive reception? But if their contributions attract mostly negative attention, it makes sense to limit their activity on the board until they improve their conduct to the point where their contributions are more welcome.
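To make the mechanism concrete, here is a toy sketch of karma-dependent throttling; the function, thresholds, and delays are invented for illustration and say nothing about how LessWrong actually implements its rate limit.

```python
# Toy karma-dependent rate limiter: well-received posters face no enforced delay,
# while posters with a negative recent reception must wait between comments.
# Thresholds and delays are made up for illustration.

def min_seconds_between_comments(recent_karma: int) -> int:
    """How long a user must wait before posting another comment."""
    if recent_karma >= 0:
        return 0              # limited only by how quickly they can write
    # The worse the recent reception, the longer the enforced wait (capped at 10 min).
    return min(600, 60 * -recent_karma)

print(min_seconds_between_comments(5))    # 0   -- post as fast as you can write
print(min_seconds_between_comments(-3))   # 180 -- wait three minutes
```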

Replies from: curi
comment by curi · 2011-04-09T00:09:52.608Z · LW(p) · GW(p)

you misread quite badly.

i was wondering if the timer was bugged in such a way that you have more chance to get a milliseconds remaining message than you should.

comment by curi · 2011-04-08T09:31:06.127Z · LW(p) · GW(p)

The karma system is a psychological trick to gain more engagement with the website from certain kinds of foolish people. Just ignore it.

Replies from: timtyler, Emile
comment by timtyler · 2011-04-08T12:25:11.539Z · LW(p) · GW(p)

It lets the community defend itself against spammers and other undesirables, and provides a low-cost/low-effort feedback mechanism. Other sites use reputation systems too - they have real benefits in terms of creating responsible behaviour.

comment by Emile · 2011-04-09T09:49:24.130Z · LW(p) · GW(p)

I find it quite useful - if I don't have much time to read the rest of the posts, or the comments to an interesting post, I'll only read those with high karma, so I appreciate the time-saving mechanism. It also creates an incentive for high-quality comments, which I appreciate too.

comment by JoshuaZ · 2011-04-07T18:39:33.362Z · LW(p) · GW(p)

I share this impression. But it might be due to some sort of self-congratulatory/Lake Wobegon bias about my own comments, since almost all my contra-mainstream comments have been voted up. This is true for a large variety of different criticisms. Thus I've been critical of cryonics, of AI fooming, and of Bayesianism. I've been deeply critical of the narrative here that portrays phlogiston as a bad scientific theory, and every single comment of that form has been voted up. But there's a related issue: I do try to talk in a way that will get LW people to listen. Thus for example, when discussing cryonics, I will go out of my way to explicitly discuss it in terms of expected utility because that gives a useful common vocab. If one tried to discuss cryonics from some form of deontological ethics, even if the system had strongly anti-deathist attitudes, I expect that this would lead to confusion and downvoting here.

Edit: Another thing that seems to help get contra-mainstream comments voted up is to acknowledge weaknesses in one's idea. If one includes counter-arguments to what one is saying, even if one only includes a few of them, one comes across as more reasonable. Coming across as Tevye the milkman but leaning against LW consensus works a lot better than coming across as just strongly against the consensus.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2011-04-08T04:25:33.315Z · LW(p) · GW(p)

Thus I've been critical of cryonics, of AI fooming,

I'm not sure how mainstream these positions actually are. While these positions are certainly held by several high-status members of the community, I'm pretty sure a majority of posters don't believe in AI fooming, and I wouldn't be surprised if a significant fraction are critical of cryonics.

comment by benelliott · 2011-04-07T12:32:33.457Z · LW(p) · GW(p)

We really need to clarify, for everyone, whether a down-vote means 'I disagree' or 'I think you shouldn't have posted this at all'.

Replies from: TheOtherDave, Gray
comment by TheOtherDave · 2011-04-07T14:13:12.661Z · LW(p) · GW(p)

For what it's worth, when I arrived here it became pretty clear to me pretty quickly that a downvote is intended to mean "I want less of this sort of thing."

That some people want less disagreement is perhaps unfortunate, and perhaps not, but in neither case is it necessarily a sign of confusion, and further clarification may not help. This may be a case, rather, of values in opposition.

comment by Gray · 2011-04-08T04:30:48.384Z · LW(p) · GW(p)

I think I heard an excellent answer, somewhere, that suggested that upvotes/downvotes merely represent "I want more people to see this" or "I want fewer people to see this". This is implied in the system that reorganizes the posts so that posts with higher scores are put in a more conspicuous place on the page.

As excellent as that answer is, however, I do wonder if it misses something. As much as we might prefer votes to have a consistent interpretation, I think this bends to the idea that the meaning of a vote on a post depends on the nature of the post. Sometimes it is a method of voting on whether the argument is sound, sometimes it is voting on whether you like the post, sometimes it is voting on whether you like the poster, and sometimes it is voting on the rhetorical style of the post. Someone could even make a post saying "vote me down if you like this post", whereby votes are unparsably ambiguous.

comment by wedrifid · 2011-04-07T11:46:52.376Z · LW(p) · GW(p)

Do you share my impression? And if this is the case, should we try to do something about it?

Not a priority. Obviously comments that people disagree with are going to be voted worse than those they agree with. And while we can't say that people agree with the mainstream by definition, it is not far from it.

comment by Normal_Anomaly · 2011-04-07T12:05:38.379Z · LW(p) · GW(p)

I haven't done a solid survey, but I don't share your impression. I frequently see contra-mainstream comments get 6 or 7 karma; I also upvote them myself when I think they're well done.

Moderately low quality mainstream comments will stay at 0 to -1 karma while contra-mainstream comments of (apparently) similar quality score solidly negative karma,

Another influence that may be confounding your observations here is that comments with scores below -2 get automatically hidden. When is a commenter most likely to look at buried comments? Quite possibly, when there's an argument over the validity of a mainstream position going on. This would lead to bad contra-mainstream comments getting looked at and downvoted further even after they're buried.

Replies from: TheOtherDave
comment by TheOtherDave · 2011-04-07T14:07:08.737Z · LW(p) · GW(p)

You say:

I frequently see contra-mainstream comments get 6 or 7 karma;

The OP says:

arguments for contrary positions are received well and achieve high karma scores when they are of very high quality

...and yet you frame this as not sharing the OP's impression, which confuses me, because it sure does sound like your impressions are compatible.

Replies from: Normal_Anomaly
comment by Normal_Anomaly · 2011-04-07T22:10:47.301Z · LW(p) · GW(p)

I think I misread this:

moderately high quality mainstream comments achieve good positive karma while similar quality contra-mainstream comments stay at 0 to 2.

as

high quality mainstream comments achieve good positive karma while similar quality contra-mainstream comments stay at 0 to 2.

And, of course, the borderline between high quality and moderately high quality is very subjective. So, our impressions are mostly compatible but they differ enough that I don't see a problem.

comment by AlephNeil · 2011-04-07T12:12:31.897Z · LW(p) · GW(p)

Do you share my impression?

I agree, but I think it's more a 'feature' rather than a 'bug'. As I see it, the advantage of the present system is that we're not condemned to keep on debating basic philosophical questions until everyone agrees, which in any case would never happen.

Chalmers said this when he dropped by (at least I think it was him):

One way that philosophy makes progress is when people work in relative isolation, figuring out the consequences of assumptions rather than arguing about them. The isolation usually leads to mistakes and reinventions, but it also leads to new ideas. Premature engagement can minimize all three.

comment by hairyfigment · 2011-04-09T04:20:08.918Z · LW(p) · GW(p)

I started out disagreeing strongly with the last part of your description, and I still lean that way - I think "moderately high quality" disagreement gets at least as much positive karma as moderately good agreement.

It does occur to me that we could be using a double standard when it comes to 'scholarship'. By this, I mean that criticism which misunderstands the "mainstream" position (or seems to) likely gets downvotes even when we don't fully grasp the critic's position. At a glance, the current dispute seems like weak evidence for people talking past each other (and plain old annoyance at someone asking Bayesians what they could possibly mean by "support").

But even knowing this, I don't feel that actively curious about an old but out-of-fashion philosophy which, according to its proponent here, denies any degree of validity to the reasoning that we observe people using to stay alive. Some philosophies are just wrong. In a quick search I can't find curi making an effort to explain how critical rationalism escapes the problems of Bayes, which seems like the best way to show that studying Popper has value for me. ("If you don't understand it, that's a criticism -- it should have been easier to understand.") Tell me that (say) your approach rules out a species consistently using the Gambler's Fallacy to set probabilities or reject theories, and we'll talk.

For curi: having read the link, do you think evolution would 'criticize' that species out of existence? What makes you think so? What makes your reasons less circular than assigning a rough probability?

Replies from: Desrtopa
comment by Desrtopa · 2011-04-09T13:19:01.976Z · LW(p) · GW(p)

I don't think that this tends to be the case. My impression is that if someone doesn't understand the mainstream positions here, they'll usually be offered corrections unless their position appears to be too confused for explanation to be likely to help. Curi didn't start being downvoted until he started demonstrating poor debate conduct, including exercising double standards himself with regards to scholarship and polite behavior.