Lesswrong, Effective Altruism Forum and Slate Star Codex: Harm Reduction
post by diegocaleiro · 2015-06-08T16:37:02.177Z · LW · GW · Legacy · 153 comments
Cross Posted at the EA Forum
At Event Horizon (a Rationalist/Effective Altruist house in Berkeley) my roommates yesterday were worried about Slate Star Codex. Their worries also apply to the Effective Altruism Forum, so I'll extend them.
The Problem:
Lesswrong was for many years the gravitational center for young rationalists worldwide, and it permits posting by new users, so good new ideas had a strong incentive to emerge.
With the rise of Slate Star Codex, the incentive for new users to post content on Lesswrong went down. Posting at Slate Star Codex is not open, so potentially great bloggers are not incentivized to come up with their ideas, but only to comment on the ones there.
The Effective Altruism forum doesn't have that particular problem. It is however more constrained in terms of what can be posted there. It is after all supposed to be about Effective Altruism.
We thus have three different strong attractors for the large community of people who enjoy reading blog posts online and are nearby in idea space.
Possible Solutions:
(EDIT: By possible solutions I merely mean to say "these are some bad solutions I came up with in 5 minutes, and the reason I'm posting them here is that if I post bad solutions, other people will be incentivized to post better solutions.")
If Slate Star Codex became an open blog like Lesswrong, more people would consider transitioning from passive lurkers to actual posters.
If the Effective Altruism Forum got as many readers as Lesswrong, there could be two gravity centers at the same time.
If the moderation and self-selection of Main were changed into something that attracts those who have been on LW for a long time, and Discussion were changed into something like a newcomers' discussion, LW could go back to being the main space, with a two-tier system (maybe one modulated by karma as well).
The Past:
In the past there was Overcoming Bias, and Lesswrong in part became a stronger attractor because it was more open. Eventually lesswrongers migrated from Main to Discussion, and from there to Slate Star Codex, the 80k blog, the Effective Altruism Forum, back to Overcoming Bias, and Wait But Why.
It is possible that Lesswrong had simply exhausted its capacity.
It is possible that a new higher tier league was needed to keep post quality high.
A Suggestion:
I suggest two things should be preserved:
Interesting content being created by those with more experience and knowledge who have interacted in this memespace for longer (part of why Slate Star Codex is powerful), and
The opportunity (and total absence of trivial inconveniences) for new people to try creating their own new posts.
If these two properties are kept, there is a lot of value to be gained by everyone.
The Status Quo:
I feel like we are living in a very suboptimal blogosphere. On LW, Discussion is more read than Main, which means what is being promoted to Main is not attractive to the people who are actually reading Lesswrong. The top tier quality for actually read posting is dominated by one individual (a great one, but still), disincentivizing high quality posts by other high quality people. The EA Forum has high quality posts that go unread because it isn't the center of attention.
Comments sorted by top scores.
comment by John_Maxwell (John_Maxwell_IV) · 2015-06-08T19:30:07.728Z · LW(p) · GW(p)
I've previously talked about how I think Less Wrong's culture seems to be on a gradual trajectory towards posting less stuff and posting it in less visible places. For example, six years ago a post like this qualified as a featured post in Main. Nowadays it's the sort of thing that would go in an Open Thread. Vaniver's recent discussion post is the kind of thing that would have been a featured Main post in 2010.
Less Wrong is one of the few forums on the internet that actually discourages posting content. This is a feature of the culture that manifests in several ways:
One of the first posts on the site explained why it's important to downvote people. The post repeatedly references experiences with Usenet to provide support for this. But I think the internet has evolved a lot since Usenet. Subtle site mechanics have the potential to affect the culture of your community a lot. (I don't think it's a coincidence that Tumblr and 4chan have significantly different site mechanics and also significantly different cultures and even significantly different politics. Tumblr's "replies go to the writer's followers" mechanic leads to a concern with social desirability that 4chan's anonymity totally lacks.)
On reddit, if your submission is downvoted, it's downvoted into obscurity. On Less Wrong, downvoted posts remain on the Discussion page, creating a sort of public humiliation for people who are downvoted.
The Main/Discussion/Open Thread distinction invites snippy comments about whether your thing would have been more appropriate for some other tier. On most social sites, readers decide how much visibility a post should get (by upvoting, sharing, etc.) Less Wrong is one of the few that leaves it down to the writer. This has advantages and disadvantages. One advantage is that important but boring scholarly work can get visibility more easily.
Upvotes substitute for praise: instead of writing "great post" type comments, readers will upvote you, which is less of a motivator.
My experience of sitting down to write a Less Wrong post is as follows:
I have some interesting idea for a Less Wrong post. I sit down and excitedly start writing it out.
A few paragraphs in, I think of some criticism of my post that users are likely to make. I try to persevere for a while anyway.
Within an hour, I have thought of so many potential criticisms or reasons that my post might come across as lame that I am totally demoralized. I save my post as a draft, close the tab, and never return to it.
Contrast the LW model with the "conversational blogging" model where you sit down, scribble some thoughts out, hit post, and see what your readers think. Without worrying excessively about what readers think, you're free to write in open mode and have creative ideas you wouldn't have when you're feeling self-critical.
Anyway, now that I've described the problem, here are some offbeat solution ideas:
LW users move away from posting on LW and post on Medium.com instead. There aren't upvotes or downvotes, so there's little fear of being judged. Bad posts are "punished" by being ignored, not downvoted. And Medium.com gives you a built-in audience so you don't need to build up a following the way you would with an independent blog. (I haven't actually used Medium.com that much; maybe it has problems.)
The EA community pays broke postdocs to create peer-reviewed, easily understandable blog posts on topics of interest to the EA community at large (e.g. an overview of the literature on how to improve the quality of group discussions, motivation hacking, rationality stuff, whatever). This goes on its own site. After establishing a trusted brand, we could branch out into critiquing science journalism in order to raise the sanity waterline or other cool stuff like that.
Someone makes it their business to read everything that gets written on every blog in the EA-sphere and create a "Journal of Effective Altruism": a continually updated list of links to the very best writing in the EA-sphere. This gives boring scholarly stuff a chance to get high visibility. This "Editor-in-Chief" figure could also provide commentary, link to related posts that they remember, etc. I'll bet it wouldn't be more than a part-time job. Ideally it would be a high-status, widely trusted person in the EA community who has a good memory for related ideas.
Some of these are solutions that make more sense if the EA movement grows significantly beyond its current scope, but it can't hurt to start kicking them around.
The top tier quality for actually read posting is dominated by one individual (a great one, but still)
Are we talking about LW proper here? Arguably this has been true over a good chunk of the site's history: at one time it was Eliezer, then Yvain, then Lukeprog, etc.
↑ comment by [deleted] · 2015-06-08T19:44:40.960Z · LW(p) · GW(p)
Within an hour, I have thought of so many potential criticisms or reasons that my post might come across as lame that I am totally demoralized. I save my post as a draft, close the tab, and never return to it.
It doesn't help that even the most offhand posting is generally treated as if it were an academic paper and reviewed and skewered accordingly :-p.
↑ comment by Gondolinian · 2015-06-08T20:22:14.050Z · LW(p) · GW(p)
It doesn't help that even the most offhand posting is generally treated as if it were an academic paper and reviewed and skewered accordingly :-p.
I agree. There are definitely times for unfiltered criticism, but most people require a feeling of security to be their most creative.
↑ comment by John_Maxwell (John_Maxwell_IV) · 2016-11-27T14:52:27.718Z · LW(p) · GW(p)
I believe this is referred to as "psychological safety" in the brainstorming literature, for whatever that's worth.
↑ comment by MathiasZaman · 2015-06-09T08:38:37.053Z · LW(p) · GW(p)
Agreed. This is, for me, one of the main advantages of posting on tumblr. You still get the feedback and criticism you want from clever people, but that criticism doesn't feel quite as bad as it would here, because everyone realizes that tumblr is a good space to test and try out ideas. Less Wrong feels, to me, more like a place where you share more solidified ideas (with the Open Thread as a possible exception).
↑ comment by Vaniver · 2015-06-08T20:25:44.384Z · LW(p) · GW(p)
Vaniver's recent discussion post is the kind of thing that would have been a featured Main post in 2010.
I will point out that I didn't put that in Main (which is where I target the majority of the post-style content I create) because I think the first paragraph is the only 'interesting' part of that post, and it's a fairly straightforward idea, and the primary example was already written about by Eliezer, twice.
Within an hour, I have thought of so many potential criticisms or reasons that my post might come across as lame that I am totally demoralized. I save my post as a draft, close the tab, and never return to it.
This is a more serious issue, which was actually pretty crippling with the aforementioned discussion post--but that was mostly because it was a post telling people "you can't tell people things they don't know." (Yes, there's the consolation that you can explain things to people, but did I really want to put in the effort to explain that?)
↑ comment by Gondolinian · 2015-06-08T19:46:55.544Z · LW(p) · GW(p)
Is anyone in favor of creating a new upvote-only section of LW?
[pollid:988]
↑ comment by Nornagest · 2015-06-08T21:44:44.250Z · LW(p) · GW(p)
Proposals for making LW upvote-only emerge every few months, most recently during the retributive downvoting fiasco. I said then, and I continue to believe now, that it's a terrible idea.
JMIV is right to say in the ancestor that subtle features of moderation mechanics have outsized effects on community culture; I even agree with him that Eliezer voiced an unrealistically rosy view of the downvote in "Well-Kept Gardens". But upvote-only systems have their own pitfalls, and quite severe ones. The reasons behind them are somewhat complex, but boil down to bad incentives.
Imagine posting as a game scored in utility. Upvotes gain you utility; downvotes lose you it; and for most people being downvoted costs you more than being upvoted gains you, though the exact ratio varies from person to person. You want to maximize your utility, and you have a finite amount of time to spend on it. If you spend that time researching new content to post, your output is low but it's very rarely downvoted. Debate takes a moderate amount of time; votes on debate are less reliable, especially if you're arguing for something like neoreaction or radical feminism or your own crackpot views on time and dimension, but you're all but guaranteed upvotes from people that agree with you. Plus telling people they're wrong is fun, so you get some bonus utility. Finally, you can post cat pictures, which takes almost no time, will score a few upvotes from people that like looking at their little jellybean toes, but violates content norms.
Which one of these is optimal changes, depending on how tolerant you are of downvoting and how good you are at dodging it. But while removing the downvote option incentivizes all three (which is why social media likes it), it should be clear that it incentivizes the last two much more. You can see the fruits of this on Facebook groups, that site's closest analogy to what's being proposed here. (Tumblr, and Facebook user pages, are also upvote-only in practice, but their sharing and friending mechanisms make them harder to analyze in these terms.)
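The incentive model above lends itself to a toy calculation. All the numbers below (hours, vote counts, the 2x sting of a downvote) are made-up assumptions, not measurements; the sketch only shows the claimed effect that turning downvotes off boosts every strategy, but boosts the low-effort ones most.

```python
# Toy model of posting-as-a-game, per the comment above.
# All numbers are illustrative assumptions.

STRATEGIES = {
    # name: (hours spent, expected upvotes, expected downvotes)
    "research":     (5.0, 10, 1),
    "debate":       (1.0,  6, 4),
    "cat_pictures": (0.1,  2, 5),
}

LOSS_WEIGHT = 2.0  # assume a downvote hurts about twice as much as an upvote helps

def utility_per_hour(name, downvotes_enabled=True):
    hours, ups, downs = STRATEGIES[name]
    penalty = LOSS_WEIGHT * downs if downvotes_enabled else 0.0
    return (ups - penalty) / hours

for enabled in (True, False):
    ranking = sorted(STRATEGIES,
                     key=lambda s: utility_per_hour(s, enabled),
                     reverse=True)
    print("downvotes on: " if enabled else "downvotes off:", ranking)
```

Under these assumed numbers, careful research wins while downvotes exist (1.6 utility/hour versus negative payoffs for the other two), but with downvotes removed, cat pictures jump from roughly -80 to +20 utility/hour and dominate, which is exactly the incentive shift the comment describes.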
↑ comment by pianoforte611 · 2015-06-08T23:42:34.113Z · LW(p) · GW(p)
He isn't suggesting making LW upvote-only, just creating a new section of it that is upvote-only. And why not? If you're right, the evidence will bear out that it is a terrible system. But we won't know until we test the idea.
↑ comment by Nornagest · 2015-06-08T23:52:50.108Z · LW(p) · GW(p)
An earlier version of my comment read "LW or parts of it". Edited it out for stylistic reasons and because I assumed the application to smaller domains would be clear enough in context. Guess I was wrong.
Granted, not everything I said would apply to the first proposal, the one where top-level posts are upvote-only but comments aren't. That's a little more interesting; I'm still leery of it but I haven't fully worked out the incentives.
As to empirics, one thing we're not short on is empirical data from other forums. We're not so exceptional that the lessons learned from them can't be expected to apply.
↑ comment by pianoforte611 · 2015-06-08T23:59:27.915Z · LW(p) · GW(p)
Apologies if that seemed like a nitpick (which I try to avoid). I thought it was relevant because even if you are right, trying out the new system wouldn't mean making Less Wrong terrible, it would just mean making a small part of Less Wrong terrible (which we could then get rid of). The cost is so small that I don't see why it shouldn't be tried.
↑ comment by Nornagest · 2015-06-09T00:02:02.679Z · LW(p) · GW(p)
I think the cost is higher than you're giving it credit for. Securing dev time to implement changes around here is incredibly hard, at least if you aren't named Eliezer, and changes anywhere are usually harder to back out than they are to put in; we can safely assume that any change we manage to push through will last for months, and forever is probably more likely.
↑ comment by Houshalter · 2015-06-10T02:28:56.267Z · LW(p) · GW(p)
Hacker News has a downvote, but you need to have 500 karma to use it. This keeps it from being used too often, and only by people very familiar with the community culture. Stackoverflow allows anyone to downvote, but you have to spend your own karma, to discourage it.
HN also hides the votes that comments have. And reddit has been moving to this policy as well.
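The two gating mechanisms described in the comment above can be sketched as a single check. The 500-karma threshold is the figure the comment states for Hacker News; the one-point cost stands in for Stack Overflow-style rep spending (the exact numbers and eligibility rules on the real sites may differ).

```python
HN_DOWNVOTE_THRESHOLD = 500  # threshold stated in the comment above
DOWNVOTE_COST = 1            # assumed karma spent per downvote, SO-style

def try_downvote(karma, policy):
    """Return (allowed, karma_after) under the given gating policy."""
    if policy == "threshold":   # HN style: earn the right, downvotes are free
        return (karma >= HN_DOWNVOTE_THRESHOLD, karma)
    if policy == "spend":       # SO style: every downvote costs the voter
        if karma >= DOWNVOTE_COST:
            return (True, karma - DOWNVOTE_COST)
        return (False, karma)
    raise ValueError(f"unknown policy: {policy}")
```

Both rules throttle casual downvoting, but differently: the threshold concentrates the power in established users, while spending makes every downvote cost the voter something regardless of standing.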
↑ comment by Dahlen · 2015-06-21T01:55:40.589Z · LW(p) · GW(p)
Imagine posting as a game scored in utility. Upvotes gain you utility; downvotes lose you it
That's exactly my problem with reddit-style voting in general. Human communication, even in an impoverished medium such as forum posting, is highly, highly complex and pluridimensional. Plus one and minus one don't even begin to cover it. Even when the purpose is a quick and informal moderation system. Good post on a wholly uninteresting topic? Good ideas once you get past the horrendous spelling? One-line answers? Interesting but highly uncertain info? Excessive posting volume? The complete lack of an answer where one would have been warranted? Strong (dis)approval looking just like mild (dis)approval? Sometimes it's difficult to vote.
Besides, the way it is set up, the system implicitly tells people that everyone's opinion is valid, and equally valid at that. Good for those who desire democracy in everything, but socially and psychologically not accurate. Some lurker's downvote can very well cancel out EY's upvote, for instance, and you'll never know. Maybe some sort of weighted karma system would work better, wherein votes would count more according to a combination of the voter's absolute karma and positive karma percentage.
To address your specific concerns about upvote-only systems, positive feedback expressed verbally may be boring to read and to write, hence reducing it to a number, but negative feedback expressed silently through downvotes leaves you wondering what the hell is wrong with your post and according to who. As long as people can still reply to each other, posters of cat pictures can still be disapproved of, even without downvotes. And perhaps the criticism may stick more if there are words to "haunt" you rather than an abstract minus one.
However, this one strongly depends on community norms. If the default is approval, then the upvote is the cheap signal and a downvote-only system can in fact work better. If the default is disapproval, then the downvote is a cheap signal. An upvote-only policy works best in a significantly more hostile environment.
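The weighted-karma idea suggested above can be sketched with a hypothetical weighting formula: logarithmic in the voter's total karma, scaled by their positive-karma percentage, with a small floor so new accounts still count. The formula and constants are illustrative assumptions, not any real LW mechanism.

```python
import math

MIN_WEIGHT = 0.25  # floor so a brand-new account's vote still counts a little

def vote_weight(total_karma, positive_fraction):
    """Hypothetical vote weight: log-scaled karma times approval ratio."""
    raw = math.log10(1 + max(total_karma, 0)) * positive_fraction
    return max(MIN_WEIGHT, raw)

def weighted_score(votes):
    """votes: iterable of (total_karma, positive_fraction, direction) tuples."""
    return sum(direction * vote_weight(k, p) for k, p, direction in votes)

# A veteran's upvote now outweighs a lurker's downvote:
score = weighted_score([(100_000, 0.95, +1), (0, 0.50, -1)])
```

Under this sketch the fresh lurker's vote carries 0.25 while the high-karma veteran's carries about 4.75, so the net score stays positive, addressing the "some lurker's downvote can very well cancel out EY's upvote" worry.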
↑ comment by Sarunas · 2015-06-09T21:35:07.050Z · LW(p) · GW(p)
Other. I do not think there is a need for a new section. Instead, we could encourage people to use tags (e.g. something like these belief tags) and put disclaimers at the top of their posts. Even though actual tags aren't very easy to notice, we can use "informal tags", such as, e.g. putting a tag in square brackets.
For example, if you want to post your unpolished idea, your post could be titled something like: "A statement of idea [Epistemic state: speculation] [Topic: Something]" or "A statement of idea [Epistemic state: possible] [Topic: Something]" or "A statement of idea [Epistemic state: a very rough draft] [Topic: Something]". In addition, you could put a disclaimer at the top of your post. Such clarity might make it somewhat easier to be somewhat more lenient toward unpolished ideas. Currently, even if a reader can see that the poster intended their post as a rough draft with many flaws, they cannot be sure that the draft being highly upvoted won't be taken by another reader as a sign that the post is correct and flawless (or at least thought to be by a lot of LWers), thus sending the wrong message. If a poster made it clear that they are merely exploring a curious idea, an interesting untested model, or something with only a remote possibility of not being not even wrong, a reader could upvote or downvote the post based on what it was trying to achieve, since there would be less need to signal to other readers that a post has serious flaws, and therefore should not be believed, if it was already tagged as "unlikely" or something like that.
Perhaps, numerical values to indicate the belief status (e.g. [0.3]) could be used instead of words.
There would still be an incentive to tag your posts as "certain" or "highly likely", because most likely they would be treated as having more credibility and thus attract more readers.
↑ comment by plex (ete) · 2015-06-09T16:04:07.195Z · LW(p) · GW(p)
Another approach would be not opening the downvote to all users. On the Stack Exchange network, for example, you need a certain amount of reputation to downvote someone. I'd bet that a very large majority of the discouraging/unnecessary/harmful downvotes come from users who don't have above, say, 5-15 karma in the last month. Perhaps messaging the official downvote policy to a user the first time they pass that threshold would help too.
This way involved users can still downvote bad posts, and the bulk of the problem is solved.
But it requires technical work, which may be an issue.
↑ comment by Gondolinian · 2015-06-09T16:41:17.853Z · LW(p) · GW(p)
Perhaps messaging the official downvote policy to a user the first time they pass that threshold would help too.
Anything with messages could be implemented by a bot account, right? That could be made without having to change the Less Wrong code itself.
Maybe we could send a message to users with guidelines on downvoting every time they downvote something? This would gently discourage heavy and/or poorly reasoned downvoting, likely without doing too much damage to the kind of downvoting we want. One issue with this is it would likely be very difficult or practically impossible for a bot account to know when someone downvotes something without changing the LW code. (Though it probably wouldn't require a very big change, and things could be limited to just the bot account(s).)
[pollid:989]
↑ comment by plex (ete) · 2015-06-09T21:14:12.120Z · LW(p) · GW(p)
Every time someone downvotes would probably be too much, but maybe the first time, or, if we restrict downvotes to users with some amount of karma, when they hit that level?
↑ comment by diegocaleiro · 2015-06-08T21:15:23.323Z · LW(p) · GW(p)
Would you be willing to run a survey on Discussion also about Main being based on upvotes instead of a mix of self-selection and moderation? As well as all ideas that seem interesting to you that people suggest here?
There could be a research section, a Upvoted section and a discussion section, where the research section is also displayed within the upvoted, trending one.
↑ comment by Gondolinian · 2015-06-10T13:56:19.769Z · LW(p) · GW(p)
On second thought, I'll risk it. (I might post a comment to it with a compilation of my ideas and my favorites of others' ideas, but it might take me a while.)
↑ comment by Gondolinian · 2015-06-09T17:12:46.874Z · LW(p) · GW(p)
Would you be willing to run a survey on Discussion also about Main being based on upvotes instead of a mix of self-selection and moderation? As well as all ideas that seem interesting to you that people suggest here?
I'd rather not expose myself to the potential downvotes of a full Discussion post, and I also don't know how to put polls in full posts, only in comments. Nonetheless I am pretty pro-poll in general and I'll try to include more of them with my ideas.
↑ comment by Richard_Kennaway · 2015-06-09T23:00:59.874Z · LW(p) · GW(p)
Another suggestion. Every downvote costs a point of your own karma. You must have positive karma to downvote.
↑ comment by Richard_Kennaway · 2015-06-09T22:59:19.170Z · LW(p) · GW(p)
Another suggestion: Every downvote costs a point of your own karma.
↑ comment by Evan_Gaensbauer · 2015-06-09T01:11:43.381Z · LW(p) · GW(p)
Contrast the LW model with the "conversational blogging" model where you sit down, scribble some thoughts out, hit post, and see what your readers think. Without worrying excessively about what readers think, you're free to write in open mode and have creative ideas you wouldn't have when you're feeling self-critical.
I don't know if I've ever read the following from an original source (i.e., Eliezer or Scott), but when people ask "why do those guys no longer post on Less Wrong?", the common response I get from their personal friends in the Bay Area, or wherever, and from the community at large is, however justified or not, that the worry their posts would be overly criticized is what drove them off Less Wrong for fairer pastures, where their ideas wouldn't need to pass through a crucible of (possibly motivated) skepticism before being valued or spread.
↑ comment by Jiro · 2015-06-09T17:57:08.319Z · LW(p) · GW(p)
Which shows that a bug to some people is a feature to others.
A lot of posts, including in the Sequences, have really good criticisms in the comments. (For that matter, a lot of SSC posts have really good criticisms in the comments, which Scott usually just ignores.) I can easily understand why people don't like reading criticism, but if you're posting for the ideas, some criticism should be expected.
↑ comment by Gunnar_Zarncke · 2015-06-08T21:46:54.251Z · LW(p) · GW(p)
All true.
I have some interesting idea for a Less Wrong post. I sit down and excitedly start writing it out.
The key point seems to be not to aim for Main if you have some creative idea. Most creative ideas fail. That doesn't mean they were bad ideas, just that creativity doesn't work like safe success. Main is for a specific audience and requires a specific class of writers. Why not aim for Discussion or Open Thread? Yes, these are tiers, and maybe a smoother transition would be nicer, but as it is, that works fine.
↑ comment by jacob_cannell · 2015-06-13T22:56:31.084Z · LW(p) · GW(p)
Within an hour, I have thought of so many potential criticisms or reasons that my post might come across as lame that I am totally demoralized. I save my post as a draft, close the tab, and never return to it.
This. My standard for what I would post on LW eventually just became too high - higher than what I would post on my own blog, and beyond justifiable effort.
↑ comment by Evan_Gaensbauer · 2015-06-09T01:27:35.165Z · LW(p) · GW(p)
This comment is great. Please cross-post the suggestions for effective altruism especially to the Effective Altruism Forum. If you don't, do you mind if I do?
↑ comment by John_Maxwell (John_Maxwell_IV) · 2015-06-09T04:35:20.668Z · LW(p) · GW(p)
Thanks! I already linked to my comment from the EA forum. If you want to signal-boost it further, maybe put a link to it and/or a summary of my suggestions in the EA Facebook group? By the way, I'm planning to write a longer post fleshing out the idea of peer-reviewed blog posts at some point.
↑ comment by Evan_Gaensbauer · 2015-06-09T01:09:10.206Z · LW(p) · GW(p)
Are we talking about LW proper here?
I think he's only talking about Slate Star Codex.
comment by Alicorn · 2015-06-08T19:27:28.338Z · LW(p) · GW(p)
I think this post misses a lot of the scope and timing of the Less Wrong diaspora. A lot of us are on Tumblr now; I've made a few blog posts at the much more open group blog Carcinisation, there's a presence on Twitter, and a lot of us just have made social friendships with enough other rationalists that the urge to post for strangers has a pressure release valve in the form of discussing whatever ideas with the contents of one's living room or one's Facebook friends.
The suggestions you list amount to "ask Scott to give up his private resource for a public good, even though if what he wanted to do was post on a group blog he still has a LW handle", "somehow by magic increase readership of the EA forum", and "restructure LW to entice the old guard back, even though past attempts have disintegrated into bikeshedding and a low level of technical assistance from the people behind the website's actual specs". These aren't really "solutions".
↑ comment by Dustin · 2015-06-08T20:15:27.848Z · LW(p) · GW(p)
A lot of us are on Tumblr now; I've made a few blog posts at the much more open group blog Carcinisation, there's a presence on Twitter, and a lot of us just have made social friendships with enough other rationalists that the urge to post for strangers has a pressure release valve in the form of discussing whatever ideas with the contents of one's living room or one's Facebook friends.
I don't like this.
I do not have the time to engage in the social interactions required to even be aware of where all this posting elsewhere is going on, but I want to read it. I've been regularly reading OB/LW since before LW existed and this diaspora makes me feel left behind.
↑ comment by Evan_Gaensbauer · 2015-06-09T01:39:23.053Z · LW(p) · GW(p)
I started a thing back in March called the LessWrong Digest. First of all, to you and/or anyone else reading this who signed up for it, I'm sorry I've been neglecting it for so long. I ran it for a few weeks in March, but I was indisposed for most of April, and it's been fallow since then. It contains highlights from the blogs of rationalists who post off of Less Wrong. It doesn't contain Tumblrs yet. I'll restart it tonight. I intend to build upon it to have some sort of rationalist RSS feed. I don't know how many other rationalist Tumblrs or blogs it would include, but lots. Hopefully I can customize it.
Anyway, it's my goal to bring such projects to fruition so that no rationalist under the sun goes unfound, no matter how deep into the blogosphere they burrow.
↑ comment by MathiasZaman · 2015-06-09T08:49:34.880Z · LW(p) · GW(p)
If you want, I can help with the tumblr part of this. If you don't need help with the tumblr part, but want to be pointed in the right direction, I host the Rationalist Masterlist with most of the tumblr rationalists on it.
Also keep in mind that tumblr tends to have a very low signal-to-noise ratio.
↑ comment by Gondolinian · 2015-06-08T20:26:53.447Z · LW(p) · GW(p)
I do not have the time to engage in the social interactions required to even be aware of where all this posting elsewhere is going on, but I want to read it.
There's a Masterlist for rationalist Tumblr, but I'm not aware of a complete list of all rationalist blogs across platforms.
Perhaps the Less Wrong community might find it useful to start one? If it were hosted here on LW, it might also reinforce LW's position as a central hub of the rationality community, which is relevant to the OP.
↑ comment by Evan_Gaensbauer · 2015-06-09T01:42:14.480Z · LW(p) · GW(p)
I have already thought of doing this, and want to do it. I've been neglecting this goal, and I've got lots of other priorities on my plate right now, so I'm not likely to do it alone soon (i.e., by the end of June). If you want me to help you, I will. I may have an "ugh field" around starting this project. Suggestions for undoing any trivial inconveniences therein you perceive are welcomed.
↑ comment by Gondolinian · 2015-06-12T22:07:55.178Z · LW(p) · GW(p)
Sorry for the late reply, and thanks for the offer! Unfortunately I wasn't actually talking about doing it myself, just putting it out there as an idea. Good luck though; it sounds like a valuable thing for the rationality community to have.
↑ comment by Raemon · 2015-06-08T23:03:39.900Z · LW(p) · GW(p)
Curious what (in your own case, and your best estimation of other people's case) motivated the move to Tumblr?
↑ comment by Alicorn · 2015-06-09T05:31:38.280Z · LW(p) · GW(p)
I don't feel like I "moved to" Tumblr. I ran out of things that seemed like they'd be best expressed as LW posts and stopped being motivated by karma circa I think late 2010/early 2011 and my posting dropped off considerably. It was the end of 2012 when my sister convinced me to get a Tumblr, and I don't even mostly Tumbl about rationality (mostly). Scott has a Tumblr I think explicitly because he can dash off posts without worrying as much about quality, there; Mike has one for very similar social reasons to my own; I don't think most other people I can think of who are big on Rationalist Diaspora Tumblr were ever heavy posters on LW, although I could be missing some who don't have corresponding screen names, or forgetting someone. They're to a substantial extent different people who happened to enter the circle of interest when Tumblr was a reasonably effective way to hang out on a space with rationalists in it, and so they did that, because for whatever reason it was comfier.
↑ comment by witness · 2015-06-09T02:49:03.508Z · LW(p) · GW(p)
I rarely bother to comment on this site, but this is important meta information. Many outsider groups, and rationalists in particular, seem to dissolve the moment their exclusion from standard social systems is removed. The most dumbed-down example I have, and I specifically desire to post as lowbrow an example as possible, is the episode of Malcolm in the Middle titled "Morp." It's "prom" backwards, in case you missed that. The outsider group starts an anti-prom where they do everything ironically, and amusingly have all the same status bullshit problems over who is in charge or what should even be done as the normal kids' prom. Then when some random dumb popular girls come down, feel upper-class-girl pity, and invite them to the real prom, everyone but Malcolm goes.
Less Wrong and its specific section of the rationalist community has approached this same singularity. It was all about getting enough like-minded and conveniently located people to form your own samesy, dull, cookie cutter clique just like normal people. Alicorn is a prime example of posts that expose this issue, although that whole cuddle pile bullshit is a more general example.
Much like say Atheism+, the OB/LW community has exploded into a million uncoordinated fragments merely seeking to satisfy their standard social needs. Meanwhile each of these shards has the same number of useless, weird, counterproductive group beliefs as mainstream Christians. And they've accomplished almost nothing except maybe funding the useless MIRI, if one even considers that an accomplishment. EA people even came and said MIRI doesn't qualify for GiveWell.
Indeed, I feel my comparison to A+ is quite apt. So much bullshit spewed about improving stuff (raising the sanity waterline vs. inclusive atheism), but each group did essentially the opposite of its goal.
As per my title and associated duties, I here mark the collapse of "internet rationalists" as a cohesive, viable, or at all productive group. Scott has a popular blog, Elie has a full-time job wasting his life but gets paid good money, and Alicorn can now throw "interesting" "dinner parties." Also innumerable Tumblr-related bullshit storms. Well, some movements accomplished less.
Adieu.
Replies from: Alicorn, gjm↑ comment by Alicorn · 2015-06-09T05:36:41.089Z · LW(p) · GW(p)
This is undiplomatically expressed but may contain an important seed of useful information for anyone who would like to recentralize rationalism: meeting people's normal, boring, apey social needs is important for retention, especially at scale when it seems more tempting to split off with your favorite small percentage of the group and not put in the effort with the rest. If you want people to post on Less Wrong, what's in it for them, anymore?
(I understand the desire to scare-quote the interestingness of my dinner parties but they are, in fact, parties at which dinner is served, in the most literal possible sense.)
Replies from: Vaniver, witness↑ comment by Vaniver · 2015-06-09T15:06:11.271Z · LW(p) · GW(p)
This is undiplomatically expressed but may contain an important seed of useful information for anyone who would like to recentralize rationalism: meeting people's normal, boring, apey social needs is important for retention, especially at scale when it seems more tempting to split off with your favorite small percentage of the group and not put in the effort with the rest.
Indeed. Especially if the point of LW is to socialize newcomers to rationality, well, socializing newcomers is hard and not particularly glamorous work, and we're (to some extent) selecting for people who don't want to be socialized!
Replies from: witness↑ comment by witness · 2015-06-11T20:59:33.958Z · LW(p) · GW(p)
That's clearly not true. Alicorn again is a perfect example of someone who clearly wanted to be socialized. I mean... dinner parties. Yes, I cannot get over the whole dinner party thing; get over it.
More on point, though, centralization is the ultimate bugbear of the left/progressive/radicals/w.e. Look at the internecine wars of feminism or socialism or atheism. Furthermore, everyone wants to address their local personal issues first, and also divides who is allowed to interfere in problems along demographic or identity lines.
The success of a revolutionary movement (various religions being examples) requires both that it be more correct than what came before and that it be equally or more satisfying. One should be careful, though, of copying the old systems too closely. Ethical Humanist solstice parties? Good lord, what a terrible idea.
Replies from: Lumifer↑ comment by witness · 2015-06-11T20:32:44.957Z · LW(p) · GW(p)
I scare quoted dinner parties because they are the most ridiculously conventional upper middle class thing of all time. Even more than Valium.
Replies from: OrphanWilde↑ comment by OrphanWilde · 2015-06-11T20:46:30.787Z · LW(p) · GW(p)
Dinner parties are extraordinarily useful social tools. There's a -reason- upper middle class people do them.
The causal relationship between "Being the sort of person to host dinner parties" and "Being upper middle class" doesn't flow in only one direction.
Replies from: witness↑ comment by witness · 2015-06-11T21:11:12.524Z · LW(p) · GW(p)
Yes but it underlines what I was saying about "Morp." And it also addresses people who were asking why I singled out Alicorn.
Whenever someone tells me I'm only doing something for attention, or that I only hate on certain things because I'm excluded, I say: "Thanks, Captain Obvious." It throws them off a lot. People who are different are different not by choice but by force. Conventional social norms exert a massive pressure on every individual, even ones with non-conforming parents/siblings/peers/teachers, and the only reason it doesn't work is that an equal or greater pressure is going the other way.
So many groups, including Less Wrong, are full of so much conscious or subconscious self-signalling, and it destroys their ability to understand their own motivations or those of similar people.
The original post is all uptight about content, but content doesn't matter. Socializing matters. No amount of actually thought-provoking content is going to save LessWrong unless the community improves. But the community's own standards won't allow it to improve, because you aren't properly regulating who is allowed to stay, among other issues, including the aforementioned issue of the community, and not the content, being the problem. Creating a surviving discussion website is not the same as creating a growing discussion website.
I won't get into the drama that would develop if I explained what I mean about regulating who can post, since you wouldn't implement my suggestion anyway. But I think many people know what I mean even if they don't agree, and we'll leave it at that.
Replies from: OrphanWilde↑ comment by OrphanWilde · 2015-06-12T18:07:36.353Z · LW(p) · GW(p)
This reads more like you're using my comment as an excuse to talk more about what you want to talk about than that you're responding in any meaningful sense to the actual content of my comment.
Replies from: witness↑ comment by gjm · 2015-06-09T11:24:46.479Z · LW(p) · GW(p)
Alicorn is a prime example of posts that expose this issue
What does this mean? I guess you mean "(some subset of) Alicorn's posts" (though I can't help thinking the way you've phrased it is suggestive of some kind of personal animosity), but which ones and what exactly do you think is wrong with them?
↑ comment by diegocaleiro · 2015-06-08T20:54:06.588Z · LW(p) · GW(p)
The solutions were bad on purpose, so other people would come up with better solutions on the spot. I edited to clarify :)
↑ comment by Bruno_Coelho · 2015-06-09T17:30:43.983Z · LW(p) · GW(p)
The boundaries of relevance are something to think about. A lot of places outside LW have discussions. Political topics were a thing back then, but now apparently people mention them in Open Threads, and the most frequent talkers are still posting elsewhere. EA emerged, and with good coordination. However, this does not mean we should stop possible dynamic changes.
comment by katydee · 2015-06-08T19:34:46.014Z · LW(p) · GW(p)
I think LessWrong has a lot of annoying cultural problems and weird fixations, but despite those problems I think there really is something to be gained from having a central place for discussion.
The current "shadow of LessWrong + SSC comments + personal blogs + EA forum + Facebook + IRC (+ Tumblr?)" equilibrium seems to have in practice led to much less mutual knowledge of cool articles/content being written, and perhaps to less cool articles/content as well.
I'd really like to see a revitalization of LessWrong (ideally with a less nitpicky culture and a lack of weird fixations) or the establishment of another central hub site, but even failing that I think people going back to LW would probably be good on net.
Replies from: Evan_Gaensbauer, Evan_Gaensbauer↑ comment by Evan_Gaensbauer · 2015-06-10T04:00:59.200Z · LW(p) · GW(p)
I've read some of the comments below, and I'm thinking both for your own use and further discussion it will help to distinguish between different sorts of Less Wrong users by reading this post by Ozy Frantz.
↑ comment by Evan_Gaensbauer · 2015-06-09T00:59:58.171Z · LW(p) · GW(p)
annoying cultural problems and weird fixations
Not that they aren't here, but which ones are you talking about? What's a weird fixation to some might be an attractor for others, and vice versa.
Replies from: katydee↑ comment by katydee · 2015-06-09T01:06:08.911Z · LW(p) · GW(p)
In terms of weird fixations, there are quite a few strange things that the LW community seems to have as part of its identity - polyamory and cryonics are perhaps the best examples of things that seem to have little to do with rationality but are widely accepted as norms here.
If you think rationality leads you to poly or to cryo, I'm fine with that, but I'm not fine with it becoming such a point of fixation or an element of group identity.
For that matter, I think atheism falls into the same category. Religion is basically politics, and politics is the mind-killer, but people here love to score cheap points by criticizing religion. The fact that things like the "secular solstice" have become part of rationalist community norms and identity is indicative of serious errors IMO.
For me, one of the most appealing things about EA (as opposed to rationalist) identity is that it's not wrapped up in all this unnecessary weird stuff.
Replies from: Jiro, Vaniver, Viliam, Evan_Gaensbauer↑ comment by Jiro · 2015-06-09T18:02:34.288Z · LW(p) · GW(p)
For me, one of the most appealing things about EA (as opposed to rationalist) identity is that it's not wrapped up in all this unnecessary weird stuff.
I'd consider EA itself to be one of those strange things that LW has as part of its identity. It's true that EA involves rationality, but the premises that EA is based on are profoundly weird. I have no desire to maximize utility for the entire human race in such a way that each person's utility counts equally, and neither does just about everyone else outside of the LW-sphere. I prefer to increase utility for myself, my family, friends, neighbors, and countrymen in preference to increasing the utility of arbitrary people. And you'll find that pretty much everyone else outside of here does too.
Replies from: jsteinhardt, None, ChristianKl↑ comment by jsteinhardt · 2015-06-09T18:25:32.948Z · LW(p) · GW(p)
I don't view this as inconsistent with EA. I basically share the same preferences as you (except that I don't think I care about countrymen more than arbitrary people). On the other hand, I care a non-zero amount about arbitrary people, and I would like whatever resources I spend helping them to be spent efficiently. (Also, given the sheer number of other people, things like scientific research that would potentially benefit everyone at once feel pretty appealing to me.)
Replies from: Jiro↑ comment by Jiro · 2015-06-09T19:43:52.714Z · LW(p) · GW(p)
Well, that's a matter of semantics. I could say "I don't want to maximize utility added up among all people", or I could say "I assign greater utility to people closer to me, and I want to maximize utility given that assignment". Is that EA? If you phrase it the second way, it sort of is, but if you phrase it the first, it isn't.
Also, I probably should add "and people who think like me" after "countrymen". For instance, I don't really care about the negative utility some people get when others commit blasphemy.
↑ comment by ChristianKl · 2015-06-10T13:51:19.825Z · LW(p) · GW(p)
I prefer to increase utility for myself, my family, friends, neighbors, and countrymen in preference to increasing the utility of arbitrary people. And you'll find that pretty much everyone else outside of here does too.
I think there are plenty of people out there who do care to some extent about saving starving African children.
Replies from: Jiro↑ comment by Jiro · 2015-06-10T14:46:32.684Z · LW(p) · GW(p)
Yes, they care to some extent, but they would still prefer saving their own child from starvation to saving another child in a distant continent from starvation. Caring to some extent is not equally preferring.
Replies from: ChristianKl↑ comment by ChristianKl · 2015-06-10T14:52:43.130Z · LW(p) · GW(p)
I don't think any of the EA people wouldn't care more about their own child. To me that seems like a strawman.
Replies from: Jiro↑ comment by Jiro · 2015-06-10T15:03:58.151Z · LW(p) · GW(p)
The argument usually goes in reverse: since you'd care about your own child, surely you should care equally about this child in Africa who's just as human. It's presented as a reason to care more for the distant child, not care less for your own child. But it still implies that you should care equally about them, not care more about your own.
Replies from: ChristianKl↑ comment by ChristianKl · 2015-06-10T15:25:21.679Z · LW(p) · GW(p)
I don't know any EA who says that they have a utility function that treats every child 100% equally.
↑ comment by Vaniver · 2015-06-09T01:25:26.297Z · LW(p) · GW(p)
I'm not fine with it becoming such a point of fixation or an element of group identity.
So, maybe this is just my view of things, but I think a big part of this conversation is whether you're outside looking in or inside looking out.
For example, I'm neither poly nor signed up for cryo, but I'm open to both of those things, and I've thought them through and have a balanced sense of what facts about the world would have to change for my identification / recommendations to have to change. In a place where most people have seriously considered the issue, that gets me no weird looks.
But saying "I'm open to cryo" to an audience of stereotypical skeptics comes across as an admission of kookery, and so that's the relevant piece about LW they notice: not "they don't scoff at ideas" but "they believe in cryonics more than normal."
people here love to score cheap points by criticizing religion.
Is that true? I mostly don't notice people scoring cheap points by criticizing religion; I mostly notice them ignoring religion.
Religion is basically politics... The fact that things like the "secular solstice" have become part of rationalist community norms and identity is indicative of serious errors IMO.
Mmm. I would say that "religion is basically community"--they're the people you spend a lot of time with, they're the people you have a shared history / myth base with, they're people you can trust more than normal. And any community, as it becomes more sophisticated, basically becomes a 'religion.' The Secular Solstice is part of making a genuine sophisticated rationalist community--i.e., a rationalist religion, of the "brownies and babysitting" variety rather than the "guru sex cult" variety.
Replies from: katydee↑ comment by katydee · 2015-06-09T19:26:54.154Z · LW(p) · GW(p)
So, maybe this is just my view of things, but I think a big part of this conversation is whether you're outside looking in or inside looking out.
I'm on the inside and I think we should get rid of these things for the sake of both insiders and outsiders.
Is that true? I mostly don't notice people scoring cheap points by criticizing religion; I mostly notice them ignoring religion.
See for instance Raising the Sanity Waterline, a post which raises very important points but is so unnecessarily mean-spirited towards religion that I can't particularly show it to many people. As Eliezer writes elsewhere:
Why would anyone pick such a distracting example to illustrate nonmonotonic reasoning? Probably because the author just couldn't resist getting in a good, solid dig at those hated Greens.
↑ comment by Viliam · 2015-06-09T09:15:10.619Z · LW(p) · GW(p)
Seems to me we have to differentiate between two things:
a) x-rationality (rationality without compartmentalization)
b) LessWrong x-rationalist culture
Rationality means thinking and acting correctly, not doing stupid stuff. Culture means creating an environment where people feel comfortable, and are encouraged to do (what the culture considers to be) the right thing.
There is only one rationality, but there can be multiple rationalist cultures. Different cultures may work better for different people. But different people cannot have different definitions of rationality.
Seems to me that polyamory is a clearly cultural thing, atheism is a part of rationality itself (not believing in magic, not accepting "mysterious answers", reductionism), and cryonics is... somewhere in between, these days probably more on the cultural side. Secular solstice is obviously a cultural thing, and in my opinion not even a central component of the traditional LW culture; although it's obviously related.
I love the "old good hardcore LessWrong rationalist culture", and I would be sad to see it disappear. I want it to survive somewhere, and LW seems like the logical place. (I mean, where else?) But I don't want to push it on other people, if they object. I enjoy it, but I can understand if other people don't. I support experimenting with other rationalist cultures.
Not sure what is the solution here. Maybe making the cultures more explicit? Giving them names? Yes, this encourages tribal thinking, but on the other hand, names are Schelling points. (And if we don't have an explicit name for the culture, people will simply use "the rationalist community" as a name, and then there will be confusion when different people will try to define it differently, when what they really mean is they prefer different cultures.)
Actually, this could be an interesting topic for a separate discussion: Do we need a rationalist culture? What kinds of cultures (that we could consider rationalist) already exist? How to design a new one?
↑ comment by Evan_Gaensbauer · 2015-06-09T01:26:30.939Z · LW(p) · GW(p)
I don't notice Less Wrong users bashing religion all the time. At some point in the past, there may have been more overlap with New Atheism, but because there are no new points being made in that domain these days, among other reasons, I don't observe this as much. Mind you I could be biased based on how I spend less time on Less Wrong the website these days, and spend more time discussing with friends on social media and at meetups, where bashing religion seems like it would take place less often anyway.
Religion is basically politics, and politics is the mind-killer
Mentally, I've switched out "politics is the mind-killer" for "politics is hard mode". That article was originally written by Robby Bensinger, and I think it works better than the original sentiment, for what it's worth.
I perceive the secular solstice as part of the rationalist community taking a step away from the public atheism and skeptic communities at large. While in many skeptic circles, or among casual atheists, people I know seem grossed out by the elements of piety and community devotion, it seems to me the rationalist community embraces them because it understands, psychologically, that replicating such activity from organized religion can engender happiness and be empowering. The rationalist community may be able to do so without receiving all the fake and false beliefs which usually come with the territory of organized religion. In embracing the secular solstice, perhaps the rationalist community isn't afraid of looking like a bunch of clowns to achieve its goals as a social group.
On the other hand, the secular solstice could be too heavy-handed with symbolism and themes of anti-deathism and transhumanism. I haven't attended one. I know there were big ones in Seattle, New York, and Berkeley in 2014, and I think only the latter was so overtly steeped in transhumanist memes. I could also have more sentimentality for the idea of a "secular solstice" than most non-religious folk, as I seem to perceive more value in "spirituality" than others.
comment by Anatoly_Vorobey · 2015-06-16T21:58:44.305Z · LW(p) · GW(p)
Note how all the exodus is to places where people own their particular space and have substantial control over what's happening there. Personal blogs, tumblrs, etc. Not, say, subreddits or a new shinier group blog.
Posting on LW involves a sink-or-swim feeling: will it be liked/disliked? upvoted/downvoted? many comments/tepid comments/no comments? In addition, you feel that your post stakes a claim on everybody's attention, so you inevitably imagine it'll be compared to other people's posts. After all, when you read the Discussion page, you frequently go "meh, could've done without that one", so you imagine other people thinking the same about your post, and that pre-discourages you. In addition, a few years' worth of status games and signalling in the comments have bred to some degree a culture of ruthlessness and sea-lawyering.
So, these three: fretting about reactions; fretting about being compared with other posts; fretting about mean or exhausting comments. One way to deal with it is to move to an ostensibly less demanding environment. So you post to Discussion, but then everyone starts doing that, Main languishes and the problem reoccurs on Discussion. So you post to open threads, but then Discussion languishes, open threads balloon and become unpleasant to scan, and the problem reoccurs, to a lesser degree, on them too. But if you go off to a tumblr or a personal blog or your Facebook: 2nd problem disappears; 3rd problem manageable through blocking or social pressure from owner (you); 1st problem remains but is much less acute because no downvotes.
It's useless to say "just don't fret, post on LW anyway". The useful questions are "why didn't this happen in the first 4-5 years of the site?" and "assuming we want this reverted, how?" For the first question, because as the site was growing, the enthusiasm for this exciting community and the desire to count your voice among its voices overrode those feelings of discomfort. But after a few years things changed. Many regulars established lateral links. The site feels settled in, with an established pecking order of sorts (like the top karma lists; these were always a bad idea, but they just didn't matter much at first). There's no longer a feeling of "what I'll post will help make LW into what it'll be". And there's a huge established backlog that feels formidable to build on, especially since nobody's read it all. So the motivation lessened while the dis-motivation stayed as it was.
How to fix this? I think platformizing LW might work well. Everybody prefers their own space, so give everybody their own space on the common platform. Every user gets a personal blog (e.g. vaniver.lesswrong.com) on the same platform (reddit code under the hood). The global list of users is the same. Everybody gets to pick their reading list (tumblr-style) and have their custom view of new posts. There's also RSS for reading from outside of course. Blog owners are able to ban users from their particular blog, or disallow downvotes.
Then bring back Main as a special blog to which anyone can nominate a post from a personal blog, and up/downvotes determine pecking order, with temporal damping (HN style). Would also be cool to have a Links view to which everyone can nominate links from other rationality blogs and LWers can discuss.
(I realize that this would require nontrivial programming work, and have a good understanding of how much of it would be required. That isn't an insurmountable challenge).
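The platform design sketched in the two comments above (a global user list, a personal blog per user, tumblr-style reading lists, per-blog bans, and nomination of posts to a shared Main feed) can be expressed as a toy data model. This is a hypothetical illustration only; all names are invented and none of this is actual LessWrong or reddit code.

```python
# Toy sketch of the proposed "everyone gets a blog on one platform" model.
# Hypothetical: illustrates the data relationships, not a real implementation.
from dataclasses import dataclass, field


@dataclass
class Blog:
    owner: str
    allow_downvotes: bool = True               # owners may disallow downvotes
    banned: set = field(default_factory=set)   # bans apply to this blog only
    posts: list = field(default_factory=list)

    def post(self, title):
        self.posts.append(title)

    def can_comment(self, user):
        return user not in self.banned


class Platform:
    """One shared user list; each user owns a blog; readers pick reading lists."""

    def __init__(self):
        self.blogs = {}           # owner name -> Blog
        self.reading_lists = {}   # reader -> set of blog owners they follow
        self.main = []            # posts nominated to the shared Main feed

    def register(self, user):
        self.blogs[user] = Blog(owner=user)

    def follow(self, reader, owner):
        self.reading_lists.setdefault(reader, set()).add(owner)

    def feed(self, reader):
        # Tumblr-style custom view: only posts from blogs the reader follows.
        return [(owner, title)
                for owner in self.reading_lists.get(reader, set())
                for title in self.blogs[owner].posts]

    def nominate_to_main(self, owner, title):
        # Anyone can nominate an existing personal-blog post to Main;
        # ranking by votes with temporal damping is left out of this sketch.
        if title in self.blogs[owner].posts:
            self.main.append((owner, title))
```

The point of the sketch is that the three frets identified above are addressed structurally: a personal feed removes comparison with other posts, per-blog bans give owners control over comment culture, and Main remains an opt-in place where posts compete for attention.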
comment by Vaniver · 2015-06-09T01:04:08.048Z · LW(p) · GW(p)
So, I have lots of thoughts and feelings about this topic. But I should note that I am someone who has stayed on LessWrong, and who reads a sizable portion of everything that's posted here, and thus there's some difference between me and people who left.
In order to just get this comment out there, I'm going to intermingle observations with prescriptions, and not try to arrange this comment intelligently.
Individual branding. There are lots of benefits to having your own site. Yvain can write about whatever topics he wants without any concern about whether or not other people will think the subject matter is appropriate--it's his site, and so it's what he's interested in. As well, people will remember that they saw it on SSC, rather than on LW, and so they'll be much more likely to remember it as a post of his.
This could be recreated on LW either by giving post authors more control over the page appearance for things they post (a different page header?), having author / commenter images, or by shifting the "recent on rationality blogs" from a sidebar to a section of similar standing to Main and Discussion. I must admit I haven't used reddit much, but I'm of the impression that the standard use case is a link to content elsewhere, which can be up/downvoted, and comments on the Reddit link. I doubt it'd be very difficult to tie blogs and LW accounts, so that whenever Nate posts to Minding our way, the so8res LW account posts a link to it in the Rationality Links section, and then any upvotes on the link would translate to karma for the so8res LW account.
Gresham's Law. It applies to social groups as easily as money. Jerks make a conversation less fun for people who are not jerks, so they participate less, so the conversation is even more dominated by jerks, and so on. (Compare to counterfeit money making all money less trusted and valuable, so known good money is hoarded, so the average value of traded money is assumed to be even lower.) There needs to be some counter-force that encourages pleasant interactions and discourages unpleasant interactions, or it seems like this will happen anywhere.
There are a couple of ways to make LW warmer and fuzzier. I don't know how well I expect that to work, and I think that's hard to square with a "truth uber alles" approach.
One is to have active moderators paying attention to people who seem like jerks, hopefully starting with modeling good approaches / pointing to NVC principles / discussing, and then moving to red text and karma penalties and banhammers. I think we can get some novel, interesting work out of this suggestion if it's heavy on the "we want to teach NVC to troublesome posters" and light on the "let's just ban the trolls and the problem is solved," but I'm not currently opposed to banning people who aren't "trolls" but are just aggressively unfun for others.
Another approach is to move from a "main" and "discussion" split, where the difference is "seriousness," to something like a "sensitivity" and "specificity" split, where the former is for speculative / broad / hastily stated ideas and run under "yes, and" norms, where the likelihood is high that something is there, and the latter is for fully baked / precise ideas and run under "no, but" norms. When there's something you want to hold up to high scrutiny, you put it in "specificity," and things upvoted to the top in that feed will be high quality; when there's something that you want to suggest but don't necessarily want to defend, you put it in "sensitivity."
Coordination problems. Part of the problem with a gradual, systematic shift is that no individual can stop it. If there are, say, eight high-profile interesting posters who gradually posted and checked LW less and less in a mutually reinforcing fashion, then just one of them coming back won't do much. They'll see that the other seven are still missing, and more importantly, the other seven won't notice they're back, because they don't check LW much! But this is a coordination problem, and coordination problems can be solved. If those eight got together and decided "yes, we will recolonize LW," then the activation barrier could be crossed and LW could flip from the low-energy local minimum to the high-energy local minimum. But in order for this to make sense, it needs to be a good idea to recolonize LW!
Other things. The rationalist community may be at a point where a community blog is not where the community really resides, or where it should reside. LW originally existed to create a connected community of people able to think clearly about the future, in order to provide sufficient attention and funding to MIRI and other institutions (both for-profit and non-profit) that do work in offices. And now MIRI has the attention and funding it needs, and there is a community of people able to think clearly about the future. One possibility is to just discard LW, as a booster rocket that served its purpose, and another possibility is to try to recreate it to better serve an interstitial, communal role.
But moving from "the place where any idea will be considered for epistemic rationality reasons" to "the common ground of many causes that benefit from clear thinking" seems even worse for "truth uber alles" than trying to be warm and fuzzy, because you need the weird stuff to be enough out of the way and minimized that people think it's PR-positive to advertise there, rather than a liability.
Replies from: Gram_Stone↑ comment by Gram_Stone · 2015-06-10T19:04:29.339Z · LW(p) · GW(p)
But I should note that I am someone who has stayed on LessWrong, and who reads a sizable portion of everything that's posted here, and thus there's some difference between me and people who left.
Out of curiosity, why have you stayed, why do you read as much as you do, and how are you different?
Replies from: Vaniver↑ comment by Vaniver · 2015-06-11T02:03:13.428Z · LW(p) · GW(p)
Out of curiosity, why have you stayed, why do you read as much as you do, and how are you different?
I suspect I find reading and posting on forums more intrinsically motivating than most people; this was one of my primary hobbies before LW, and it will likely be one of my primary hobbies after LW. LW was just the best forum I had found.
comment by Houshalter · 2015-06-09T09:23:51.667Z · LW(p) · GW(p)
People seem to be complaining about community fracturing, and good writers going off onto their own blogs. Why not just accept that and encourage people to post links to the good content from these places?
Hacker News is successful mainly because they encourage people to post their own blog posts there, to get a wider audience and discussion. As opposed to reddit where self promotion is heavily discouraged.
Lesswrong is based on reddit's code. You could add a lesswrong.com/r/links, and just tell people it's ok to publish links to whatever they want there. This could be quite successful, given lesswrong already has a decent community to seed it with. As opposed to going off and starting another subreddit, where it's very hard to attract an initial user base (and you run into the self-promotion problem I mentioned).
Replies from: tog↑ comment by tog · 2015-06-19T07:11:52.304Z · LW(p) · GW(p)
Potentially worth actually doing - what'd be the next step in terms of making that a possibility?
Relevant: a bunch of us are coordinating improvements to the identical EA Forum codebase at https://github.com/tog22/eaforum and https://github.com/tog22/eaforum/issues
Replies from: Houshalter, ChristianKl↑ comment by Houshalter · 2015-06-22T01:36:24.814Z · LW(p) · GW(p)
You'd need to convince whoever runs Lesswrong. There was some other discussion in this thread about modifying the code, but no point in doing that if they aren't going to push it to the site. Otherwise there is /r/RationalistDiaspora which is attempting to fill this niche for now.
↑ comment by ChristianKl · 2015-06-19T09:47:22.766Z · LW(p) · GW(p)
Potentially worth actually doing - what'd be the next step in terms of making that a possibility?
Getting agreement from MIRI (likely Eliezer) that LW should be changed in that way.
comment by RyanCarey · 2015-06-10T08:37:01.708Z · LW(p) · GW(p)
Hey all,
As the admin of the effective altruism forum, it seems potentially useful to chip in here, or at least to let everyone know that I'm aware of and interested in this kind of conversation, since it seems like most of what needs to be said has already been said.
The statement of the problem - online rationalist discourse is more fractured than is optimal - seems plausible to me.
I think that SSC and Scott's blogging persona is becoming quite a bit bigger than LessWrong currently is - it's got to the stage where he's writing articles that are getting thousands of shares, republished in the New Statesman, etc. I think SSC's solo blogging is striking a winning formula and shouldn't be changed.
For the EA Forum, the risk has always been that it would merely fracture existing discussion rather than generating any of its own. People usually don't think enough about how their project could become just another competing standard, because they have a big glorious vision of how it will unite everyone. The people who are enthusiastic enough to start a project tend to be way out on the bell curve in terms of estimating how successful it is likely to be, so it can be unthinkable that it would end up as 'just another project' like the others (e.g. Standards: https://xkcd.com/927/). I thought about this a lot when starting the forum, and despite the fact that significant effort has been put into promoting it to a clear existing target community, this is still a plausible objection to the forum.
That's why I'm sceptical of the idea of creating new centres of gravity on subreddits. If the EA Forum is only uniting somewhat more than it's fracturing, then it's unlikely that a subreddit would do so. Why would a subreddit fare so much better at centralising discussion than LessWrong, the userbase that it's directly trying to cannibalise, and which has been a supremely popular blog over multiple years? It's so unlikely that it's simply not going to happen.
As for the forum, it's been growing slowly yet persistently, the output of content is going well, and the discourse is more constructive and action-oriented than one might have hoped for. Overall, I think it's having a centralising effect on EA discussion more so than a fracturing one, and a constructive effect on EA activity more so than just addicting people to nonproductive discussion. Since the growth is good so far, the hope is that as it continues, it will attract more members from outside of existing communities. If the growth trajectory starts to reverse itself, then we'd have to revisit some of these questions, but essentially, so far, so good. Incremental updates are to be made, but not any complete overhaul.
LessWrong also has significant value as it is.
So what does that leave? My initial thoughts would be:
- A handful of LessWrongers read the EA Forum and vice versa, but most don't. Most are interested in, rather than actively repulsed by, the other group's writing, to the extent that they'd make an effort to relate to it. So maybe we should feel less inhibited about posting articles from either source to the other, e.g. in comments where relevant, so that people can feel happy that they're seeing more of the whole picture.
- Maybe we should make a smart automatic feed for rationality and EA stuff, using this algorithm but training it on the LW diaspora. This algorithm looks very effective, and wouldn't create new problems by decentralising comments.
- An embarrassing number of people run personal blogs to some extent because of vanity, when the value of doing so (number of people who engage with them, community-building effect) is lower. As discussed, it's kind of like a prisoner's dilemma, for which the default solution should be to try to establish a social norm to ensure cooperation. Some articles could be written in this vein and promoted to encourage centralisation of discussion.
The overall mass of people thinking about rationality, x-risk and effective altruism seems to be growing, though, which is good news, so this kind of discussion is not crisis talk. Still, it does seem like an important discussion to have. Happy as always for comments and criticism.
Replies from: philh↑ comment by philh · 2015-06-10T14:38:06.104Z · LW(p) · GW(p)
The statement of the problem - online rationalist discourse is more fractured than is optimal - seems plausible to me.
I wonder if we should be distinguishing between essays and discussions here.
The subreddit might end up fracturing discussions by adding a new place to comment, but unifying essays by adding a place to find them without needing to subscribe to everybody's personal blog.
comment by Princess_Stargirl · 2015-06-10T04:12:41.004Z · LW(p) · GW(p)
A possible dark explanation:
-The main reason people cared about lesswrong was that Scott and Eliezer posted on lesswrong. Neither posts on lesswrong anymore. Unless some equally impressive thinkers can be recruited to post on LW, the site will not recover.
Replies from: knb↑ comment by knb · 2015-06-16T20:51:05.803Z · LW(p) · GW(p)
I'll weigh in and say that neither Scott nor Eliezer were much of an incentive for posting on LW. Mostly I like the high standards of discussion in the comments, and the fact that there is a much lower inferential distance on many important topics.
Replies from: Username
comment by Raziel123 · 2015-06-08T19:26:44.597Z · LW(p) · GW(p)
Mm, I think an aggregator for Less Wrong, SSC, the EA Forum and OB posts would be great, but only if all of them have an easy (visible) link to it. It could allow more traffic to flow between those gravity centers. It may be better than crossposting.
Replies from: Vaniver↑ comment by Vaniver · 2015-06-09T02:31:45.474Z · LW(p) · GW(p)
Pros of having it on Reddit:
- It's a clearly neutral place, with no history or baggage.
- It's a bit more cleanly set up for link posts.
- Instead of a potentially costly change to the LW codebase, it's already done.
Cons of having it on Reddit, instead of on LW (see this other comment of mine for suggestions on how that could be done):
- It requires a different account, and a new account for anyone who doesn't already use Reddit.
- It doesn't inherit the good parts of the history, like tying the Yvain of SSC links to the Yvain of Generalizing From One Example.
- It creates a new source of gravity, potentially diffusing things even more, rather than consolidating them. Instead of conversations in SSC comments and tumblr and Facebook and a LessWrong link post, we now might have conversations in SSC comments, tumblr, Facebook, a LessWrong link post, and Reddit.
↑ comment by Raziel123 · 2015-06-09T03:12:34.923Z · LW(p) · GW(p)
I would be surprised if that subreddit got traction. I was thinking of something more like Reaction Times (damn Scott and his FAQ), and having it in a visible place on all of the rationality-related sites: a coordinated effort.
Well, the idea was not to comment in the aggregator; that way it would be like a highway. It should take you to other sites in 2 clicks (3 max). If that is not possible, I'm not sure there will be any impact, besides making another gravity center.
Replies from: ESRogs↑ comment by ESRogs · 2015-06-09T14:56:04.126Z · LW(p) · GW(p)
the idea was not to comment in the aggregator
I'm thinking about whether to try to explicitly establish this as a norm of /r/RationalistDiaspora. Haven't made up my mind yet.
Replies from: Vaniver↑ comment by Vaniver · 2015-06-09T15:11:58.169Z · LW(p) · GW(p)
Comments in the aggregator make much more sense to me--no trivial inconvenience to posting a comment, people can read the comments to determine whether or not to follow the link, and every link gets access to Reddit-quality commenting (karma, threads, etc.) regardless of how the source is set up.
It does make it harder for the content creator to see those comments.
Replies from: Raziel123, philh↑ comment by Raziel123 · 2015-06-09T17:54:10.213Z · LW(p) · GW(p)
But in that case the people will be even more diluted. Why create another gravity center? That's the issue we are trying to solve. I'm mostly convinced that it would be better if the aggregator had no comments.
Edit: I guess the aggregator has more traffic than I thought. I'm just worried there may be only a one-way flow from Less Wrong to all the other sites.
↑ comment by philh · 2015-06-09T16:48:32.996Z · LW(p) · GW(p)
It might also make sense to have multiple parallel discussions with different norms, so that people who are turned off by one set of norms can still comment elsewhere. (This does run the risk of fragmentation.)
...though I'd suggest that if we're going to discuss the comment policy of the new place, we should do that in a meta thread at the new place.
comment by Unknowns · 2015-06-09T04:57:05.450Z · LW(p) · GW(p)
I agree with the comments (like John Maxwell's) that suggest that Less Wrong effectively discourages comments and posts. My karma score for the past 30 days is currently +29, 100% positive. This isn't because I don't have anything controversial to say. It is because I mostly stopped posting the controversial things here. I am much more likely to post them on Scott's blog instead, since there is no voting on that blog. I think this is also the reason for the massive numbers of comments on Scott's posts -- there is no negative incentive to prevent that there.
I'm not sure of the best way to fix this. Getting rid of karma or only allowing upvotes is probably a bad idea. But I think the community needs to fix its norms relating to downvoting in some way. For example, officially downvoting purely for disagreement has been discouraged, but in practice I see a very large amount of such downvoting. Comments referring to religion in particular are often downvoted simply for mentioning the topic without saying something negative, even if nothing positive is said about it.
I also agree with those who have said that the division between Main and Discussion is not working. I would personally prefer simply to remove that distinction, even if nothing else is put in to replace it.
Replies from: None
comment by IlyaShpitser · 2015-06-09T19:50:01.535Z · LW(p) · GW(p)
I think "LW type" rationalists should learn to be colleagues rather than friends. In other words, I think the win condition is if you agree on the ideals, but possibly bicker on a personal level (successful academic communities are often like this).
Replies from: None↑ comment by [deleted] · 2015-06-10T13:03:11.129Z · LW(p) · GW(p)
Sorry, this was a useless post so now it's gone
Replies from: IlyaShpitser↑ comment by IlyaShpitser · 2015-06-10T13:22:31.408Z · LW(p) · GW(p)
There are brain-imposed bounds on movement growth if you insist on staying in your cuddle pile.
Why conflate your personal social goals, and general movement goals?
Replies from: None↑ comment by [deleted] · 2015-06-11T14:43:43.170Z · LW(p) · GW(p)
sorry, this was an unhelpful comment that is now gone :)
Replies from: IlyaShpitser↑ comment by IlyaShpitser · 2015-06-11T23:03:27.624Z · LW(p) · GW(p)
I don't see why personal social goals and general movement goals should be necessarily mutually exclusive.
They are not necessarily, but they are in this case. I think Scott once mentioned that BA rationalists can't grow beyond about 150. 150 is a magic number, and is suggestive of what the problem might be.
"Cuddle pile" is my slightly unkind shorthand for the kinds of social peculiarities rationalists, imo, should leave behind if they want the ideas to become more mainstream.
Metacomment: "it is not necessarily the case that X" is almost always true for interesting X.
Replies from: Nornagest↑ comment by Nornagest · 2015-06-16T20:50:52.318Z · LW(p) · GW(p)
I suspect most rationalists will turn out to care more about their cuddle piles than about their ideas becoming mainstream. There's always been a rather unhealthy interaction between community goals and the community's social quirks (we want to raise the sanity waterline -> we are saner -> our quirks should be evangelized), and we don't really have a working way to sort out what actually comes with increased rationality and what's just a founder effect.
Replies from: IlyaShpitser↑ comment by IlyaShpitser · 2015-06-16T20:54:07.997Z · LW(p) · GW(p)
I agree. And that's too bad.
I have been trying to serve as a bit of a "loyal opposition" re: separating rationality from social effects. But I am just one dude, and I am biased, too. Plus, I am an outsider, and my opinions don't really carry a lot of weight outside my area of expertise, around here.
The community itself has to want it, on some level.
comment by Paamayim · 2015-06-09T07:30:58.434Z · LW(p) · GW(p)
Honestly, my marginal returns on spending time on LW dropped drastically once I finished reading the sequences. Attending local meetups was kinda fun to meet some like-minded people, but they inevitably were far behind in the sequences and for the most part always struck me as trying to identify as rationalists rather than trying to become more rational. This strikes me as the crux of the issue: LW has become (slash might have always been) an attractor of nerd social status, which would be fine if that were its stated goal, though it doesn't seem to be.
Additionally, in the 5 years I've been attending meetups (at least six different ones in three different countries), I've noticed a drastic increase in the levels of weirdness happening, to the extent that I find myself discouraged from attending and having to deal with these people. This is the point I think Witness was trying to express below, perhaps not in so many words, but I find myself explicitly not liking a lot of the people/memes now associated with LW. This is not good.
I do, however, think a place for cultivating rationality is important to have, and to that end I would suggest using Github as the platform. Having some sort of rationality repository (preferably without the LW label), where people can open pull requests for things they're thinking about/working on solving. As an added bonus, you get the ability to track how ideas change over time, can easily fork differing opinions, and get all of the cool things a commit history would do for you. I think having some sort of rationality platform is important, but personally I would do away with the LW culture, keep our identities small, and individually go on our way.
LessWrong has had its time and its place, but its lingering death is probably something we should pay a lot of attention to. As a community experiment, I think the results speak for themselves.
My $0.02.
comment by [deleted] · 2015-06-09T11:22:27.464Z · LW(p) · GW(p)
If you have drafts you think are not good enough for LW, then polish them, include the criticisms of which you can think, make a falsifiable prediction and GO POST THEM ON YOUR OWN BLOG. Link to LW articles on specific biases that could have guided your thoughts if you can identify them. You do not owe anyone anything, and if you write well enough, you will have readers. Make your own rules, change them when you need to, hell, STOP BLOGGING if you don't feel the need.
It does not mean that you have to leave LW. Comment, post, IGNORE KARMA HITS, comment on YOUR OWN BLOG about what you think was badly discussed here, just - just go ahead already.
It's annoying to read articles about possible solutions that don't get implemented instead of, well, content.
comment by Ruby · 2015-06-09T04:28:13.064Z · LW(p) · GW(p)
I'm surprised by this idea of treating SSC as a rationalist hub. I love Scott, Scott's blog, and Scott's writing. Still, it doesn't seem like it is a "rationality blog" to me. Not directly at least. Scott is applying a good deal of epistemic rationality to his topics of interest, but the blog isn't about epistemic rationality, and even less so about practical rationality. (I would say that Brienne's and Nate's 'self-help' posts are much closer to that.) By paying attention, one might extract the rationality principles Scott is using, but they're not outlined.
There's a separate claim that while Scott's blog isn't about rationality in the same way LW is, it has attracted the same audience, and therefore can be a rationality attractor/hub. This has some legitimacy, but I still don't like it. LW has attracted a lot of people who like to debate interesting topics and ideas on the internet, with a small fraction who are interested in going out and doing things (or just staying in, but actually changing themselves). Scott's blog, being about ideas, seems to also attract lots of people who simply like mental stimulation, but without a filter for those most interested in doing. I'd really like our rationality community hubs to select for those who want to take rationality seriously and implement it in their minds and actions.
On this score of selecting for doing, or at least being about it, the EA Forum is actually quite good.
Lastly, maybe I feel strong resistance to trying to open Scott's blog up because it really is his personal blog about things he wants to write about, and just because he's really successful and part of the community doesn't mean we get to tell him to 'open it up'/'give it over'/co-opt it for the rest of the community.
comment by Gondolinian · 2015-06-08T19:14:54.246Z · LW(p) · GW(p)
A few tangential ideas off the top of my head:
If the moderation and self selection of Main was changed into something that attracts those who have been on LW for a long time, and discussion was changed to something like Newcomers discussion, LW could go back to being the main space, with a two tier system (maybe one modulated by karma as well).
People have been proposing for a while that we create a third section of LW for open threads and similar content.
We could have a section without any karma scores for posts/upvote only, though we could still keep the same system for comments.
We could allow Discussion posts to be Promoted while still using the Discussion karma system.
We could have Promotion somehow be based on popular vote (not necessarily karma), instead of a moderator's judgement.
↑ comment by Jiro · 2015-06-09T18:05:45.845Z · LW(p) · GW(p)
Those are all phrased as "do you agree that people are saying X" or "do you agree that we could X" rather than "is X a good idea".
Replies from: Gondolinian↑ comment by Gondolinian · 2015-06-09T18:41:08.154Z · LW(p) · GW(p)
Good point, thanks. I was already not a fan of the way the polls made the post look, so I went ahead and took them down. I could replace them with something better, but I think this thread has already gotten most of the attention it's going to get, so I might as well just leave the post as it is.
comment by knb · 2015-06-09T21:38:22.252Z · LW(p) · GW(p)
People enjoy writing elsewhere more because they don't have to write about "refining the art of human rationality," which is the stated topic and purpose of LW. Actually making progress on this topic is difficult and fairly dry. If you're concerned that we're missing out on the rationality-relevant content they post elsewhere, just ask them for permission to repost on LW. I know this is already happening with some Slate Star Codex posts.
comment by SilentCal · 2015-06-09T20:06:52.972Z · LW(p) · GW(p)
I'm not exactly a top-tier contributor, but my writings here tend to get positive responses, and the reason I don't write more is chiefly lack of ideas. One thing I'm doing is resolving right now to try to write more on LW; another is resolving to be willing to post a broader variety of things until I actually get some negative feedback that I should narrow.
But as far as methods external to myself go, I wonder if something like a topic of the month could seed participation. Maybe do posts with discussion questions--I actually really enjoyed these on the Superintelligence reading group posts.
comment by Princess_Stargirl · 2015-06-10T04:14:16.490Z · LW(p) · GW(p)
Maybe people should post here:
comment by raydora · 2015-06-13T03:04:31.374Z · LW(p) · GW(p)
This is not a well thought out post, in keeping with the nature of the subject matter. Less Wrong does seem to encourage solidified thoughts rather than subconscious reactions. A good thing, I think, but difficult all the same. Ideas follow.
- An IRC-style (not necessarily chat) section which has neither votes nor a delineation between post and comment. An area for LWers to post thoughts as they occur. Restrict formatting of these posts to plain text. Not a design choice, so much as to encourage train-of-thought style conversation.
- Why upvotes at all? Why not a well defined rating scheme, in addition to use of belief tags in standalone Main and Discussion posts?
comment by Lumifer · 2015-06-09T15:02:40.379Z · LW(p) · GW(p)
I feel the need to go a bit meta.
A bunch of people here expressed discomfort with downvoting. Essentially, they are saying that the likelihood of criticism -- either overt (the post gets skewered) or covert (the post gets silently downvoted) -- discourages them from doing things such as posting content.
Let me agree that this is a problem. It's a problem of being thin-skinned and it's a big problem for these people. The thing is, real life is not a support group full of nice boys and girls with gold stars for everyone and no criticism ever because it might stunt your personal growth.
The ability to handle disagreement, criticism (fair or unfair), and the general attitude of "fuck you you fucking fuck" is a very very useful ability to have. In fact, I would call it essential. If someone calling you an idiot makes you go curl up in the corner and never ever try anything like what you did again, well, either you need therapy or you need to HTFU.
The world is full of unfair, mean, nasty people. You will meet them at various points in your life. You have to be able to deal with them. If the idea of a downvote on an internet forum scares you into impotence, how will you handle things like negative performance reviews or just a boss who's having a bad day and decided to scream at you for a bit?
Of course, as with all things, there has to be a certain balance. If you are surrounded by assholes, often the best response is to go someplace else. But LW is not full of assholes. If you cannot handle LW, how will you handle reality?
Replies from: Vaniver, Dahlen↑ comment by Vaniver · 2015-06-09T15:41:31.608Z · LW(p) · GW(p)
It's a problem of being thin-skinned and it's a big problem for these people. The thing is, real life is not a support group full of nice boys and girls with gold stars for everyone and no criticism ever because it might stunt your personal growth.
No... but real life is a place where people can take their balls and go home, because they don't want to play with you anymore. Eliezer doesn't have to post to LW; Yvain doesn't have to post to LW; interesting people can just go elsewhere and do things that are more fun for them, and the more interesting they are, the more likely they are to have other options. Yes, there should be a place on LW where people can ruthlessly skewer technical ideas, but empirically we are massively losing out by limiting the audience of LW to TOUGH GUYS who can HANDLE CRITICISM.
Replies from: Lumifer↑ comment by Lumifer · 2015-06-09T16:26:38.575Z · LW(p) · GW(p)
empirically we are massively losing out by limiting the audience of LW to TOUGH GUYS who can HANDLE CRITICISM
First, not audience but content creators, but second, is this so? Did any of the really valuable contributors to LW go away because they were driven away by incessant criticism? You think Scott Alexander moved to SSC because he couldn't handle the downvotes?
The general cry here seems to be "We want more content!". Well, I don't want more content. I have a whole internet full of content. What I want is more high-quality content that I do not need to search through piles of manure to find. The great advantage of LW is that here pearls are frequent but bullshit is rare -- and I attribute this in not a small degree to the fact that you'll be punished (by downvotes and comments) for posting bullshit.
A system without downvotes encourages posting, true, but it encourages posting of everything including cat pictures and ruminations on a breakfast sandwich in three volumes. Someone has to do pruning and if you take this power away from the users, it'll fall to the moderators. I don't see why this would be better -- and people whose cat got disrespected will still be unhappy.
Replies from: Risto_Saarelma, Vaniver, philh↑ comment by Risto_Saarelma · 2015-06-10T09:38:10.023Z · LW(p) · GW(p)
Did any of the really valuable contributors to LW go away because they were driven away by incessant criticism? You think Scott Alexander moved to SSC because he couldn't handle the downvotes?
Didn't Eliezer say somewhere that he posts on Facebook instead of LW nowadays because on LW you get dragged into endless point-scoring arguments with dedicated forum arguers and on Facebook you just block commenters who come off as too tiresome to engage with from your feed?
Replies from: Lumifer, ChristianKl↑ comment by Lumifer · 2015-06-10T14:57:09.378Z · LW(p) · GW(p)
As far as I understand (it isn't very far), Eliezer prefers Facebook basically because it gives him control -- which is perfectly fine, his place on FB is his place and he sets the rules.
I don't think that degree of control would be acceptable on LW -- the local crowd doesn't like tyrants, even wise and benevolent.
↑ comment by ChristianKl · 2015-06-10T10:08:25.605Z · LW(p) · GW(p)
On the LW Facebook group Eliezer occasionally bans people who post really low-quality content. The same goes for his own feed.
If Eliezer banned someone on LW, on the other hand, he would get a storm of criticism.
Replies from: Vaniver↑ comment by Vaniver · 2015-06-10T14:02:19.758Z · LW(p) · GW(p)
If Eliezer banned someone on LW, on the other hand, he would get a storm of criticism.
I'm curious what solution would work here.
Suppose you had a list of ~10 users with 'censor' power, and the number of censors who have 'remonstrated' a user is public, possibly also with the remonstrations. "Don't be a jerk," or "don't promote other sites in your early posts," or "think before you speak," or so on. If a sufficient number of censors have remonstrated a user, then they're banned, but censors can lift their remonstration once it's no longer appropriate.
Thoughts on this solution:
Reasoning is clear and transparent, and gradual. Instead of "all clear" suddenly turning to "can't post anymore," people are put 'on notice.'
If which censor has remonstrated a user is hidden, it isn't "Eliezer" using his dictatorial powers; it's some moderator moderating.
If which censor has remonstrated a user is hidden, the drama might multiply rather than decrease. Now an offending user can message the entire group of censors, pleading to have their remonstration removed, or complain bitterly that clearly it was their enemy who is a censor, regardless of whether or not that was actually the person that remonstrated with them.
If three out of ten moderators agree that a poster should stop posting, then it becomes much easier to defend the action to remove the poster.
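Vaniver's remonstration scheme above can be sketched as a small data structure. This is a toy illustration under stated assumptions, not an actual LW feature: all names are hypothetical, and the ban threshold of 3 is taken from the "three out of ten moderators" figure in the comment.

```python
# Toy sketch of the "remonstration" moderation scheme described above.
# Censors may remonstrate a user (with a public count) and may later
# lift their remonstration; enough concurrent remonstrations = a ban.

class RemonstrationTracker:
    def __init__(self, censors, ban_threshold=3):
        self.censors = set(censors)          # users with 'censor' power
        self.ban_threshold = ban_threshold   # remonstrations needed to ban
        self.remonstrations = {}             # user -> set of censors

    def remonstrate(self, censor, user, reason=""):
        if censor not in self.censors:
            raise ValueError("only censors may remonstrate")
        self.remonstrations.setdefault(user, set()).add(censor)

    def lift(self, censor, user):
        # A censor can withdraw their remonstration once it's no longer apt.
        self.remonstrations.get(user, set()).discard(censor)

    def count(self, user):
        # Public: how many censors currently remonstrate this user.
        return len(self.remonstrations.get(user, set()))

    def is_banned(self, user):
        # "On notice" turns into a ban only past the threshold,
        # and lifting remonstrations can un-ban gradually.
        return self.count(user) >= self.ban_threshold
```

Note how the gradualness falls out of the structure: a user's status is a live count rather than a one-way switch, so bans reverse automatically as censors lift their remonstrations.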
↑ comment by ChristianKl · 2015-06-10T14:43:45.906Z · LW(p) · GW(p)
That's a bureaucratic solution.
But it doesn't really get at the heart of the issue. Eliezer acts that way because of the Roko affair and people telling him that he shouldn't have moderated. In that case the decision being made by three people instead of one wouldn't have made it more defensible.
This forum currently has MIRI ties that make controversial moderating decisions reflect badly on MIRI. A solution would be to cut those ties and give LW into the hand of a small group of moderators who are more free to focus on what's good for the community instead of larger PR effects.
↑ comment by Vaniver · 2015-06-09T17:21:41.230Z · LW(p) · GW(p)
You think Scott Alexander moved to SSC because he couldn't handle the downvotes?
He did explicitly point out that this culture of criticism / high standards makes writing for LW a chore, and so he doesn't do it anymore. So, yes.
I am not advocating for the removal of downvotes; I think they serve a necessary function, and I think having some sort of pruning and sorting methodology is a core site feature. But to cultivate good content, it is not enough to just remove bad content.
Replies from: Lumifer↑ comment by Lumifer · 2015-06-09T17:32:30.010Z · LW(p) · GW(p)
He did explicitly point out that this culture of criticism / high standards makes writing for LW a chore
Let's bring in the entire quote. Yvain said:
Less Wrong requires no politics / minimal humor / definitely unambiguously rationality-relevant / careful referencing / airtight reasoning (as opposed to a sketch of something which isn't exactly true but points to the truth.) This makes writing for Less Wrong a chore as opposed to an enjoyable pastime.
Note that the first three points have nothing to do with criticism. The fourth point is the requirement to show evidence, which still isn't criticism. And the final point I read as having to be literal and formal with little "free play" in the moving parts -- I think there is a connection with the recent series of posts by Jonah Sinick where he talks about how gestalt pattern recognition is, at a certain level, superior to formal reasoning (and LW expects formal reasoning).
Yeah, I still think Scott Alexander could handle the downvotes just fine.
But to cultivate good content, it is not enough to just remove bad content.
I agree, but the suggestions offered tend to gravitate to "Let's just be nice to everyone"...
What kind of positive incentives to creators of high-quality content can LW come up with?
Replies from: Username↑ comment by Username · 2015-06-17T17:08:19.198Z · LW(p) · GW(p)
The thing is, the high standards on LW that Yvain refers to are precisely what makes LW content valuable. At some level, wanting to escape requirements such as airtight reasoning means you want to write stuff that doesn't have airtight reasoning.
Replies from: Lumifer↑ comment by Lumifer · 2015-06-17T17:57:40.982Z · LW(p) · GW(p)
The thing is, the high standards on LW that Yvain refers to are precisely what makes LW content valuable.
Yes, I agree. That's why I think "more content" is the wrong yardstick. I want "more high-quality content" which you don't get by relaxing standards.
wanting to escape requirements such as airtight reasoning means you want to write stuff that doesn't have airtight reasoning
Correct, but that's fine. There is a lot of high-quality and valuable stuff that is not airtight-reasoned.
↑ comment by philh · 2015-06-09T17:05:24.431Z · LW(p) · GW(p)
I've refrained from posting because I expected to get really banal criticism. You may or may not consider that a loss. But I kind of get the impression that Scott feels somewhat similarly. It's not like he doesn't get criticized on SSC.
I think this isn't a case of me needing to HTFU. (Other self-modification would have worked, but it would also not be very useful outside of LW.) So it may not be relevant to what you're trying to say. But I also wonder whether other people feel similarly, and are expressing it in ways that you're interpreting as them needing to HTFU.
Replies from: Lumifer↑ comment by Lumifer · 2015-06-09T17:36:37.637Z · LW(p) · GW(p)
I expected to get really banal criticism
...so just ignore it?
I think yours is a different case -- it's as if you want better readers than the LW crowd. Would you be fine with insightful and to-the-point skewering?
Replies from: philh↑ comment by philh · 2015-06-09T18:36:25.700Z · LW(p) · GW(p)
(I know I'm allowed to ignore comments like that, but I still didn't feel like bothering.)
I don't think "better" readers would be a helpful way to frame it. There are lots of dimensions of quality. E.g. one of the HN comments said
Game theorists always seem to assume that there's no such thing as nuanced communication.
which is a bad comment in a way that I don't think would get traction on LW.
I think... maybe one factor is comments that are bad because they're wrong, and comments that are bad because they're right but, really, who cares? Like jaywalking in front of a policeman who then stops you, gives you a stern lecture, and you have to say yes officer and no officer and so on. It feels more like a power trip than an actual attempt to make me or anyone else safer.
If insightful and to-the-point skewering was justified, then I wouldn't enjoy it and it might put me off future posting (and maybe it should), but I hope I would find it valuable and take it as a sign that I needed to level up.
Replies from: Lumifer↑ comment by Lumifer · 2015-06-09T18:56:05.983Z · LW(p) · GW(p)
maybe one factor is comments that are bad because they're wrong, and comments that are bad because they're right but, really, who cares?
So, nit-picking? Yes, it's popular on LW :-/ but (a) you are still free to ignore those; and (b) as opposed to the example with the cop, there is no inherent power imbalance. Nothing prevents you from going meta and pointing out the difference between what is important and what is not.
Do I read you right in that you want more co-travelers in figuring out problems and solutions and less critics who carefully examine your text for minor flaws and gotchas, basically?
Replies from: philh, OrphanWilde↑ comment by philh · 2015-06-10T14:26:43.228Z · LW(p) · GW(p)
On reflection, I'm not sure that nitpicking is quite the problem that I'm pointing at, but I don't think I have a very good handle on what is. (I do think nitpicking is a problem.)
Maybe next time I have that feeling, I'll just post anyway and see what happens.
↑ comment by OrphanWilde · 2015-06-10T15:04:01.076Z · LW(p) · GW(p)
So, nit-picking? Yes, it's popular on LW :-/ but (a) you are still free to ignore those; and (b) as opposed to the example with the cop, there is no inherent power imbalance. Nothing prevents you from going meta and pointing out the difference between what is important and what is not.
It often takes a special effort to -notice- that a criticism isn't meaningful, especially when it is correct - especially because Less Wrong entertains a -much- higher level of pedantry than will generally be encountered elsewhere. More problematically, pedantry tends to get upvoted, which means people may pay too much attention to it, and also that it is being encouraged.
If we're interested in discouraging pedantry-for-the-sake-of-pedantry, I'd lean towards implementing an applause-lights keyword to indicate that a criticism may be valid, but doesn't actually add anything to what is being said, along the lines of how "Updating" was used as an applause-lights keyword to counterbalance the generally negative attitude people start with towards admitting wrongness.
Replies from: Lumifer↑ comment by Lumifer · 2015-06-10T15:24:02.615Z · LW(p) · GW(p)
It often takes a special effort to -notice- that a criticism isn't meaningful, especially when it is correct
True -- but I think it's a very useful skill to develop and practice.
pedantry tends to get upvoted
And that is probably a feature of the local culture by now, heavily supported by the meme of how you can't make even one tiny little itty bitty mistake when programming the AI because if you do it's all paperclips all the time.
I'd lean towards implementing an applause-lights keyword
I call such things "technically correct, but irrelevant", but I don't think this expression functions well as an applause-lights switch. Ideas?
Replies from: OrphanWilde↑ comment by OrphanWilde · 2015-06-10T17:18:10.924Z · LW(p) · GW(p)
The best opposite to "pedantry" I can come up with is "pragmatic." Pragmatism is a relatively good value on Less Wrong, but I don't see a good application.
Yours seems good. It concedes the point the critic is trying to raise, shutting off further discussion - a very desirable quality when dealing with somebody who is specifically looking for something to argue with - and rebuts the fundamental problem, redirecting future attention there. (A minor shift, for reasons I have trouble explicating, seems a stronger, slightly harsher version of the sentiment - "Technically correct. Also irrelevant.") If it's used appropriately, and consistently, I think it could become an applause-light within the sub-culture here.
↑ comment by Dahlen · 2015-06-21T01:10:36.114Z · LW(p) · GW(p)
Ah, a vote in favour of strife. Yes, that's what it is. If you start off from the premise of a world full of unfair, mean, nasty people, you do still have the choice of either adapting by joining their ranks, or ensuring that the patch of reality you control remains well-defended from the corruption. This is a very useful matter to conceive of in terms of tendencies. What to promote? Harmony, or strife? You're pushing for more strife now in what seems to me you conceive of as overzealous pro-harmony efforts, but with that attitude I have no guarantee that you won't push for strife even further. Even with the talk of balances and all.
Ironically enough, I do think that LW is pretty balanced in that regard (with some outliers, of course), so on the surface, I agree that downvotes shouldn't be having a great emotional impact on a reasonably stable individual. It's the attitude that begets disapproval, not the facts as they now stand. You'll still be more comfortable with trolling than with sensitivity even after this bout of excess sensitivity might have passed or been successfully countered.
There are 1) better and 2) enough venues for getting acquainted with the harsher realities of the world. Why anyone would try to make more of them out of milder spaces is beyond me. But I suppose conflict is another one of those acquired tastes.
On the topic of negative feedback:
If someone calling you an idiot makes you go curl up in the corner and never ever try anything like what you did again, well, either you need therapy or you need to HTFU.
This line right here illustrates the belief that the dignified way to deal with mean-spirited criticism is never to internalise it; presumably to have / express a low opinion of the criticiser right back? Criticism, in order to serve some useful purpose besides just creating tension between people, has to be listened to; otherwise it's just a pointless battle between my pride and yours. Who knows, maybe the person really is an idiot who should never ever try anything like he/she did again. If the local culture has it that that option is never even up for consideration, every attempt at criticism will just result in a lot of pointless bickering. If we're being realistic rather than either sensitive or prideful, and want criticism to function properly as negative feedback, then we want bad posters to maybe consider a defanged version of "they're being idiots", but without feeling like they've made a new enemy. Then it's the criticiser's responsibility to deliver the criticism in a manner that maximises the signal and minimises the noise. I.e. no pointless hostility.
Interestingly, there's a forum I hang out around that has this same philosophy of thick-skinnedness. It has upvotes but no downvotes, and this was a conscious decision by the admins -- because they knew everyone would be downvoting left, right, and centre. A signal of appreciation was more, let's say, signal-y than one of dislike, for them. It's been working like a charm for years and years.
Replies from: Lumifer↑ comment by Lumifer · 2015-06-22T16:03:05.412Z · LW(p) · GW(p)
Ah, a vote in favour of strife. Yes, that's what it is.
Nope, that's what it is not.
That specific comment is really not about the LW voting system at all, it's about people's ability to take criticism (of various sorts, including totally unfair ones) and the usefulness of such an ability.
What to promote? Harmony, or strife?
Still nope, even in the context of LW karma that's the wrong framework. Negative feedback is not strife -- if you screwed up and no one will tell you so because it's not nice, you will continue to screw up until reality delivers the message to you. "Feedback" and "consequences" are a much more useful pair of terms to use.
It's the attitude that begets disapproval
LOL. Would you like to... adjust my attitude? X-D
the belief that the dignified way to deal with mean-spirited criticism is never to internalise it; presumably to have / express a low opinion on the criticiser right back?
You're missing a very important part: distinguishing between the criticism of an idea or a proposal, and the criticism of a person.
You should listen and pay attention to the criticism of your ideas. You should not interpret the criticism of your ideas as criticism of your self/identity/personality/soul/etc.
comment by Evan_Gaensbauer · 2015-06-09T01:33:15.847Z · LW(p) · GW(p)
Nate Soares' blog seems excellent, of what I've read. I don't read all of it. He posts approximately once or twice per week, and writes his blog posts in the form of sequences, like Eliezer or Luke have done in the past. He doesn't seem to have slowed down in recent weeks as he steps into his role as executive director of MIRI. I'm unsure if he'll blog less frequently once he takes on his new role at MIRI in full. Anyway, if he intends to keep blogging every couple of weeks, you/we could ask him to cross-post as many blog posts as he feels like to Less Wrong, as most content on his blog seems more than appropriate. He could act as a lightning rod or new hero to revitalize Less Wrong, at least for a time. I don't know how lazy other users are. Maybe most of us never read a post if it's not directly in Main or Discussion. Maybe most of us never click on links on the lower sidebar(s), but would be more inspired to build upon or respond to articles posted directly on Less Wrong.
Replies from: Vaniver↑ comment by Vaniver · 2015-06-09T01:37:43.284Z · LW(p) · GW(p)
you/we could ask him to cross-post as many blog posts as he feels like to Less Wrong
He's already cross-posted several, but I don't see this solution working long-term, or generalizing to many people, unless it is technically very easy.
comment by Journeyman · 2015-06-11T08:14:02.030Z · LW(p) · GW(p)
Another piece of the rationalist diaspora is neoreaction. They left LW because it wasn't a good place for talking about anything politically incorrect, an ever-expanding set. LW's "politics is the mindkiller" attitude was good for social cohesion, but bad for epistemic rationality, because so many of our priors are corrupted by politics and by yesterday's equivalent of social justice warriors.
Neoreaction is free of political correctness and progressive moral signaling, and it takes into account history and historical beliefs when forming priors about the world. This approach allows all sorts of uncomfortable and repulsive ideas, but it also results in intellectual progress along novel lines of thought.
Neoreactionary thought varies in quality and rigor, but the current leadership contains rationalists now, and they have recognized the need to provide more rigorous arguments. I predict that more and more rationalists will explore neoreaction once they get over their absurdity heuristic and realize what it actually is.
Replies from: IlyaShpitser, ChristianKl, hairyfigment↑ comment by IlyaShpitser · 2015-06-11T10:09:33.927Z · LW(p) · GW(p)
I think I learned what I needed to learn about Moldbug and neoreaction based on his reaction to Scott's post. "Intellectual progress" is when you engage with your critics.
Replies from: None, knb, Journeyman, Lumifer↑ comment by Journeyman · 2015-06-12T08:57:04.944Z · LW(p) · GW(p)
I think many people would have loved to see a response by Moldbug, and found his response disappointing. My guess is that Moldbug felt that his writings already answered a lot of Scott's objections, or that Scott's approach wasn't fair. And Moldbug isn't the same thing as neoreaction; there were other responses by neoreactionaries to Scott's FAQ.
The FAQ nails neoreaction on a lot of object-related issues, and it has some good philosophical objections. But it doesn't do a good job of showing the object-related issues that neoreaction got right, and it doesn't quite do justice to some ideas, like The Cathedral and demotism. And the North Korea stuff has really easy-to-anticipate objections from neoreactionaries (like the fact that it was led by communists).
The FAQ answers the question "what are a bunch of objections to neoreaction?", but it doesn't answer the question "how good a philosophy is neoreaction?" because it only makes a small dent. If you consider the FAQ in conjunction with Neoreactionary Philosophy in an Enormous, Planet-sized Nutshell, then you would get a better sense of the big picture of neoreaction, but he doesn't really integrate his arguments across the two essays, which causes an unfortunately misleading impression.
The FAQ put me off getting into neoreaction for a while, but when I did, I was much more impressed than I expected. The only way to get a good sense of what it actually is would be spending a lot of time with it.
Replies from: None, VoiceOfRa↑ comment by [deleted] · 2015-06-17T11:31:16.170Z · LW(p) · GW(p)
Things that need to happen before I take NRx any sort of seriously:
- Someone hires an editor for Moldbug and publishes a readable and structured ebook
Currently I have no idea whether Moldbug's writings really answered Scott's objections, and finding out looks like harder work than a generic reader should be expected to put in.
Replies from: VoiceOfRa↑ comment by VoiceOfRa · 2015-06-16T01:22:22.919Z · LW(p) · GW(p)
The FAQ nails neoreaction on a lot of object-related issues,
And gets a bunch of the object level issues wrong, as Michael Anissimov has pointed out.
Replies from: Journeyman↑ comment by Journeyman · 2015-06-16T02:58:32.302Z · LW(p) · GW(p)
Fully agreed. Oops, accidentally retracted this and can't fix it.
↑ comment by Lumifer · 2015-06-11T14:34:51.367Z · LW(p) · GW(p)
"Intellectual progress" is when you engage with your critics.
Without getting into NRx issues, this sentence is very wrong.
Replies from: None↑ comment by [deleted] · 2015-06-18T09:23:09.117Z · LW(p) · GW(p)
Arguing and pursuing truth are indeed not the same, but when virtually every empirical, numerical claim is falsified by an opponent, that is a situation where arguing, or changing one's mind, is really called for.
To be fair, when they were making them I already smelled something. I have some familiarity with the history of conservative thought back to Oakeshott, Chesterton, Burke, or Cicero, and those thinkers never just pointed to a crime stat or something, saying: see, that is what is wrong here. Empirics were never their strength, and I was half-expecting that engaging in chart duels is something they were not going to win.
↑ comment by ChristianKl · 2015-10-28T11:28:15.201Z · LW(p) · GW(p)
Neoreaction is free of political correctness and progressive moral signaling, and it takes into account history and historical beliefs when forming priors about the world.
"Taking into account history" means, for neoreactionaries, deconstructivist techniques rather than factual discussion for which evidence has to be presented. At least that's a position Moldbug argued explicitly.
When you look at the success of Moldbug's predictions, such as Bitcoin going to zero, you find that Moldbug is very bad at understanding politics because he lets himself get blinded by stories.
↑ comment by hairyfigment · 2015-06-22T18:41:18.815Z · LW(p) · GW(p)
Downvoted for "political correctness". The short response is that some neoreactionary volunteered the claim that Scott could not be a leader of NRx until he ended a particular relationship, his personal life being "politically incorrect" in that commenter's circle.
Replies from: Journeyman↑ comment by Journeyman · 2015-06-23T06:31:40.939Z · LW(p) · GW(p)
While both the left and the right have their own forms of ideological conformity, the term "political correctness" is associated with left ideological conformity. There is a reason that ideological purges and struggle sessions throughout history are associated with the left. I realize that "political correctness" is a loaded term, but I agree with its connotations and I'm not interested in feigning neutrality.
As for Scott, I cannot comment on that particular case, but him as a leader of NRx wouldn't make sense anyway because he isn't right-leaning enough.
Replies from: hairyfigment↑ comment by hairyfigment · 2015-06-23T18:32:51.540Z · LW(p) · GW(p)
You're just digging yourself in deeper. (See also.) You know this, or you wouldn't have used the meaningless phrase, "associated with the left."