Revitalizing Less Wrong seems like a lost purpose, but here are some other ideas
post by John_Maxwell (John_Maxwell_IV) · 2016-06-12T07:38:58.557Z · LW · GW · Legacy · 34 comments
This is a response to ingres' recent post sharing Less Wrong survey results. If you haven't read & upvoted it, I strongly encourage you to--they've done a fabulous job of collecting and presenting data about the state of the community.
So, there's a bit of a contradiction in the survey results. On the one hand, people say the community needs to do more scholarship, be more rigorous, be more practical, be more humble. On the other hand, not much is getting posted, and it seems like raising the bar will only exacerbate that problem.
I did a query against the survey database to find the complaints of top Less Wrong contributors and figure out how best to serve their needs. (Note: it's a bit hard to read the comments because some of them should start with "the community needs more" or "the community needs less", but adding that info would have meant constructing a much more complicated query.) One user wrote:
[it's not so much that there are] overly high standards, just not a very civil or welcoming climate. Why write content for free and get trashed when I can go write a grant application or a manuscript instead?
ingres emphasizes that in order to revitalize the community, we would need more content. Content is important, but incentives for producing content might be even more important. Social status may be the incentive humans respond to most strongly. Right now, from a social status perspective, the expected value of creating a new Less Wrong post doesn't feel very high. Partially because many LW posts get downvotes and critical comments, so my System 1 expects mine will too. And partially because the Less Wrong brand is weak enough that I don't expect associating myself with it will boost my social status.
When Less Wrong was founded, the primary failure mode guarded against was Eternal September. If Eternal September represents a sort of digital populism, Less Wrong was attempting a sort of digital elitism. My perception is that elitism isn't working because the benefits of joining the elite are too small and the costs are too large. Teddy Roosevelt talked about the man in the arena--I think Less Wrong experienced the reverse of the evaporative cooling EY feared, where people gradually left the arena as the proportion of critics in the stands grew ever larger.
Given where Less Wrong is at, however, I suspect the goal of revitalizing Less Wrong represents a lost purpose.
ingres' survey received a total of 3083 responses. Not only is that about twice the number we got in the last survey in 2014, it's about twice the number we got in 2013, 2012, and 2011 (and much bigger than the first survey in 2009). It's hard to know for sure, since previous surveys were only advertised on the LessWrong.com domain, but it doesn't seem like the diaspora thing has slowed the growth of the community much, and it may have dramatically accelerated it.
Why has the community continued growing? Here's one possibility. Maybe Less Wrong has been replaced by superior alternatives.
- CFAR - ingres writes: "If LessWrong is serious about it's goal of 'advancing the art of human rationality' then it needs to figure out a way to do real investigation into the subject." That's exactly what CFAR does. CFAR is a superior alternative for people who want something like Less Wrong, but more practical. (They have an alumni mailing list that's higher quality and more active than Less Wrong.) Yes, CFAR costs money, because doing research costs money!
- Effective Altruism - A superior alternative for people who want something that's more focused on results.
- Facebook, Tumblr, Twitter - People are going to be wasting time on these sites anyway. They might as well talk about rationality while they do it. Like all those phpBB boards in the 00s, Less Wrong has been outcompeted by the hot new thing, and I think it's probably better to roll with it than fight it. I also wouldn't be surprised if interacting with others through social media has been a cause of community growth.
- SlateStarCodex - SSC already checks most of the boxes under ingres' "Future Improvement Wishlist Based On Survey Results". In my opinion, the average SSC post has better scholarship, rigor, and humility than the average LW post, and the community seems less intimidating, less argumentative, more accessible, and more accepting of outside viewpoints.
- The meatspace community - Meeting in person has lots of advantages. Real-time discussion using Slack/IRC also has advantages.
Less Wrong had a great run, and the superior alternatives wouldn't exist in their current form without it. (LW was easily the most common way people heard about EA in 2014, for instance, although sampling effects may have distorted that estimate.) But that doesn't mean it's the best option going forward.
Therefore, here are some things I don't think we should do:
- Try to be a second-rate version of any of the superior alternatives I mentioned above. If someone's going to put something together, it should fulfill a real community need or be the best alternative available for whatever purpose it serves.
- Try to get old contributors to return to Less Wrong for the sake of getting them to return. If they've judged that other activities are a better use of time, we should probably trust their judgement. It might be sensible to make an exception for old posters that never transferred to the in-person community, but they'd be harder to track down.
- Try to solve the same sort of problems Arbital or Metaculus is optimizing for. No reason to step on the toes of other projects in the community.
But that doesn't mean there's nothing to be done. Here are some possible weaknesses I see with our current setup:
- If you've got a great idea for a blog post, and you don't already have an online presence, it's a bit hard to reach lots of people, if that's what you want to do.
- If we had a good system for incentivizing people to write great stuff (as opposed to merely tolerating great stuff the way LW culture historically has), we'd get more great stuff written.
- It can be hard to find good content in the diaspora. Possible solution: Weekly "diaspora roundup" posts to Less Wrong. I'm too busy to do this, but anyone else is more than welcome to (assuming both people reading LW and people in the diaspora want it).
- EDIT 11/27/16 - Recently people have been arguing that social media generates relatively superficial discussions. This plausibly undermines my "lost purpose" thesis.
ingres mentions the possibility of Scott Alexander somehow opening up SlateStarCodex to other contributors. This seems like a clearly superior alternative to revitalizing Less Wrong, if Scott is down for it:
- As I mentioned, SSC already seems to have solved most of the culture & philosophy problems that people complained about with Less Wrong.
- SSC has no shortage of content--Scott has increased the rate at which he creates open threads to deal with an excess of comments.
- SSC has a stronger brand than Less Wrong. It's been linked to by Ezra Klein, Ross Douthat, Bryan Caplan, etc.
But the most important reasons may be behavioral. SSC has more traffic--people are in the habit of visiting there, not here. And the posting habits people have acquired there seem more conducive to community. Changing habits is hard.
As ingres writes, revitalizing Less Wrong is probably about as difficult as creating a new site from scratch, and I think creating a new site from scratch for Scott is a superior alternative for the reasons I gave.
So if there's anyone who's interested in improving Less Wrong, here's my humble recommendation: Go tell Scott Alexander you'll build an online forum to his specification, with SSC community feedback, to provide a better solution for his overflowing open threads. Once you've solved that problem, keep making improvements and subfora so your forum becomes the best available alternative for more and more use cases.
And here's my humble suggestion for what an SSC forum could look like:
As I mentioned above, Eternal September is analogous to a sort of digital populism. The major social media sites often have a "mob rule" culture to them, and people are increasingly seeing the disadvantages of this model. Less Wrong tried to achieve digital elitism and it didn't work well in the long run, but that doesn't mean it's impossible. Edge.org has found a model for digital elitism that works. There may be other workable models out there. A workable model could even turn into a successful company. Fight the hot new thing by becoming the hot new thing.
My proposal is based on the idea of eigendemocracy. (I recommend reading the link before continuing--eigendemocracy is cool.) In eigendemocracy, your trust score is a composite rating of what trusted people think of you. (It sounds like infinite recursion, but it can be resolved using linear algebra.)
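To make the linear-algebra point a bit more concrete, here's a rough sketch in Python (toy endorsement data and invented function names, not any existing implementation) of resolving that circular definition by power iteration over the endorsement matrix, PageRank-style:

```python
import numpy as np

# Hypothetical endorsement matrix: endorse[i][j] = 1 if user i vouches for user j.
endorse = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

def trust_scores(endorsements, damping=0.85, iterations=100):
    """Resolve 'trust = what trusted people think of you' by power iteration
    over the endorsement graph (PageRank-style)."""
    n = endorsements.shape[0]
    out = endorsements.sum(axis=1, keepdims=True)
    out[out == 0] = 1  # users who endorse nobody: avoid division by zero
    transition = (endorsements / out).T  # column j: where user j's trust flows
    scores = np.full(n, 1.0 / n)
    for _ in range(iterations):
        scores = (1 - damping) / n + damping * transition @ scores
    return scores / scores.sum()

print(trust_scores(endorse))  # user 2, endorsed by all the others, scores highest
```

The fixed point it converges to is the trust vector: users endorsed by highly trusted users end up highly trusted themselves.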
Eigendemocracy is a complicated idea, but a simple way to get most of the way there would be to have a forum where having lots of karma gives you the ability to upvote multiple times. How would this work? Let's say Scott starts with 5 karma and everyone else starts with 0 karma. Each point of karma gives you the ability to upvote once a day. Let's say it takes 5 upvotes for a post to get featured on the sidebar of Scott's blog. If Scott wants to feature a post on the sidebar of his blog, he upvotes it 5 times, netting the person who wrote it 1 karma. As Scott features more and more posts, he gains a moderation team full of people who wrote posts that were good enough to feature. As they feature posts in turn, they generate more co-moderators.
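Here's a toy sketch of that bootstrapping dynamic (Python; the class, names, and numbers are illustrative assumptions, not a spec):

```python
# Toy model of the proposed scheme: each point of karma grants one upvote per
# day, and a post that collects FEATURE_THRESHOLD upvotes gets featured,
# earning its author 1 karma.
FEATURE_THRESHOLD = 5

class Forum:
    def __init__(self):
        self.karma = {"scott": 5}      # Scott seeds the system with 5 karma
        self.votes_cast_today = {}     # user -> upvotes spent today
        self.post_votes = {}           # post -> upvotes received

    def upvote(self, voter, post, times=1):
        spent = self.votes_cast_today.get(voter, 0)
        allowed = self.karma.get(voter, 0) - spent
        times = min(times, allowed)    # can't spend more votes per day than karma
        self.votes_cast_today[voter] = spent + times
        self.post_votes[post] = self.post_votes.get(post, 0) + times

    def feature_if_ready(self, post, author):
        if self.post_votes.get(post, 0) >= FEATURE_THRESHOLD:
            self.karma[author] = self.karma.get(author, 0) + 1  # author joins the mod pool
            return True
        return False

forum = Forum()
forum.upvote("scott", "great_post", times=5)           # Scott spends his 5 daily votes
print(forum.feature_if_ready("great_post", "alice"))   # True: alice now has 1 karma
```

After Scott features alice's post, alice can cast one upvote per day of her own; as more posts get featured, upvotes from the growing pool of past contributors add up to feature new posts without Scott's direct involvement.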
Why do I like this solution?
- It acts as a cultural preservation mechanism. On reddit and Twitter, sheer numbers rule when determining what gets visibility. The reddit-like voting mechanisms of Less Wrong meant that the site deliberately kept a somewhat low profile in order to avoid getting overrun. Even if SSC experienced a large influx of new users, those users would only gain power to affect the visibility of content if they proved themselves by making quality contributions first.
- It takes the moderation burden off of Scott and distributes it across trusted community members. As the community grows, the mod team grows with it.
- The incentives seem well-aligned. Writing stuff Scott likes or meta-likes gets you recognition, mod powers, and the ability to control the discussion--forms of social status. Contrast with social media sites where hyperbole is a shortcut to attention, followers, upvotes. Also, unlike Less Wrong, there'd be no punishment for writing a low quality post--it simply doesn't get featured and is one more click away from the SSC homepage.
TL;DR - Despite appearances, the Less Wrong community is actually doing great. Any successor to Less Wrong should try to offer compelling advantages over options that are already available.
34 comments
Comments sorted by top scores.
comment by Gram_Stone · 2016-06-12T22:22:07.888Z · LW(p) · GW(p)
Well, this is discouraging to someone who had the opposite reaction to ingres' recent survey analysis. I heard, "Try to solve the object-level problem and create content that meets the desiderata that are implicit in the survey results."
I was going to start writing about feelings-as-information theory; Kaj Sotala introduced moods as information in Avoid misinterpreting your emotions, lukeprog mentions it briefly in When Intuitions Are Useful (which Wei Dai thought might be relevant to metaphilosophy), and gwern mentions related work on processing fluency here. There are simple but interesting twists on classic, already-simple heuristics and biases experiments that everyone here's familiar with, debiasing implications, stuff about aesthetics, stuff on how we switch between Type 1 and Type 2 processing, which is relevant to the stuff lukeprog was getting into with Project guide: How IQ predicts metacognition and philosophical success, and what Kaj Sotala was getting into with his summaries of Stanovich's What Intelligence Tests Miss.
I was just about to write another post about how thinking of too many alternative outcomes to historical events can actually make hindsight bias worse, with explanations of the experimental evidence, like my most recent post. I don't know how to do more for the audience than do things like warn them about how debiasing hindsight can backfire.
And there's other stuff I could think to write about after all of that.
There are quite a number of people coordinating to fulfill the goal of revitalizing LW, and I wonder if something like this couldn't have waited. I mean, everyone just told everyone exactly what everyone's doing wrong.
Replies from: John_Maxwell_IV
↑ comment by John_Maxwell (John_Maxwell_IV) · 2016-06-13T07:10:35.767Z · LW(p) · GW(p)
I'm sorry for discouraging you. I think writing the posts you described is a great idea. I hope that if you write them, people who've read this will be more inclined to upvote them if they like them, given increased awareness of the incentives problem I described.
Another option is to pursue multiple angles of attack in parallel. My angle requires a programmer or two to volunteer their time (may as well contact Scott now if you're interested!); your angle requires people who have ideas to write them up. My guess is that these requirements don't funge against each other very much. Plus, even if the community ultimately decides to go elsewhere, I'm sure your ideas will be welcomed in that new place if you just post whatever you were going to post to LW there, and that will be a valuable kickstart.
I also agree that having people repeatedly say "LW is dying" can easily become a self-fulfilling prophecy. Even if LW is no longer a check-once-a-day kind of place, it can still be a perfectly fine check-once-a-week kind of place. I probably should have been more careful in my phrasing.
Replies from: Gram_Stone
↑ comment by Gram_Stone · 2016-06-13T16:11:48.693Z · LW(p) · GW(p)
IAWYC.
comment by philh · 2016-06-13T17:08:15.705Z · LW(p) · GW(p)
(I've mostly only skimmed.)
It can be hard to find good content in the diaspora. Possible solution: Weekly "diaspora roundup" posts to Less Wrong. I'm too busy to do this, but anyone else is more than welcome to (assuming both people reading LW and people in the diaspora want it).
This is what /r/RationalistDiaspora was intended to do. It never really got traction, and is basically dead now, but it still strikes me as a good solution. If that's not going to revive though, I agree that a weekly thread on LW is worth trying. By default, I'll make one later this week. (I'm not currently sure I'll have anything to post in it myself, I'll be asking people to post links in the comments.)
Go tell Scott Alexander you'll build an online forum to his specification, with SSC community feedback, to provide a better solution for his overflowing open threads.
He tried to move people to /r/SlateStarCodex, but that didn't work. We'd want to understand why. (Some hypotheses: it wasn't actually on SSC, where people go directly; posts there don't pop up in their RSS readers; people have an aversion to comment systems with voting; people have an aversion to reddit specifically.)
As Scott features more and more posts, he gains a moderation team full of people who wrote posts that were good enough to feature.
I'm not sure that "writes good posts" and "would make a good moderator" are sufficiently correlated for this to work. A lot of people like Eliezer's writing but dislike his approach to moderation.
(On the other hand: maybe, if we want Eliezers to stick around, we need them to be able to shape the community? Even if that means upsetting people who don't write much.)
It also creates weird incentives, like: "I liked this post that was highly critical of our community, but I don't want the author to be a mod". (This is the problem that Scott Aaronson points to of "this system can only improve on ordinary democracy if the trust network has some other purpose"--I worry that voting-for-comment-scores isn't a sufficiently strong purpose to outweigh voting-for-moderators.)
Another system to consider would be to do it based on the way people administer votes, not the way they remove them. If your votes tend to correlate with others', they have more weight in future. If posts you flag tend to get removed, your flags count for more. (I'm not convinced that this works either.)
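For concreteness, a minimal sketch of what that weighting rule might look like (toy Python; the update sizes and names are arbitrary assumptions):

```python
# Toy sketch: a voter's weight drifts up when their vote matches the post's
# eventual consensus (net positive vs. net negative) and down when it doesn't.
vote_weight = {"alice": 1.0, "bob": 1.0}

def settle_post(votes, consensus_positive, step=0.1, floor=0.1):
    """votes: {user: +1 or -1}; consensus_positive: did the post end up net-positive?"""
    for user, vote in votes.items():
        agreed = (vote > 0) == consensus_positive
        vote_weight[user] = max(floor, vote_weight[user] + (step if agreed else -step))

settle_post({"alice": +1, "bob": -1}, consensus_positive=True)
print(vote_weight)  # alice's future votes count a bit more, bob's a bit less
```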
Replies from: Lumifer, John_Maxwell_IV, Pimgd
↑ comment by Lumifer · 2016-06-14T14:28:14.662Z · LW(p) · GW(p)
He tried to move people to /r/SlateStarCodex, but that didn't work.
He didn't really try. All he did was mention offhand a couple of times that if people are unhappy with how the comment section works, there is the subreddit and it looks reasonable to him.
It would not be hard for Scott to move people to the subreddit: put a link to it at the end of each article + just go there and respond to comments in the subreddit.
↑ comment by John_Maxwell (John_Maxwell_IV) · 2016-06-14T09:38:32.890Z · LW(p) · GW(p)
He tried to move people to /r/SlateStarCodex, but that didn't work. We'd want to understand why. (Some hypotheses: it wasn't actually on SSC, where people go directly; posts there don't pop up in their RSS readers; people have an aversion to comment systems with voting; people have an aversion to reddit specifically.)
I think a big explanation is that /r/SlateStarCodex was not advertised sufficiently, and people never developed the habit of visiting there. I imagine that if Scott chose to highlight great comments or self posts from /r/SlateStarCodex each week, the subreddit would grow faster, for instance.
Online communities are Schelling points. People want to be readers in the community where all the writers are, and vice versa. Force of habit keeps people visiting the same places over and over again, but if they don't feel reinforced through interesting content to read / recognition of their writing, they're liable to go elsewhere. The most likely explanation for why any online community fails, including stuff like /r/RationalistDiaspora and /r/SlateStarCodex, is that it never becomes a Schelling point. My explanation for why LW has lost traffic: there was a feedback loop involving people not being reinforced for writing and LW gradually losing its strength as a Schelling point.
Edit: also, subreddits are better suited to link sharing than original posts IMO.
I'm not sure that "writes good posts" and "would make a good moderator" are sufficiently correlated for this to work. A lot of people like Eliezer's writing but dislike his approach to moderation.
Acknowledged, but as long as the correlation is above 0, I suspect it's a better system than what reddit has, where ability to vote is based on possession of a warm body.
It also creates weird incentives, like: "I liked this post that was highly critical of our community, but I don't want the author to be a mod".
Concrete example: Holden Karnofsky's critical post was liked by many people. Holden has posted other stuff too, and his karma is 3689. That would give him about 1% of Eliezer's influence, 4% of Yvain's influence, or 39% of my influence. This doesn't sound upsetting to me and I doubt it would upset many others. If Holden was able to, say, collect mucho karma by writing highly upvoted rebuttals of every individual sequence post, then maybe he should be the new LW moderator-in-chief.
But even if you're sure this is a problem, it'd be simple to add another upvote option that increases visibility without bestowing karma. I deliberately kept my proposal simple because I didn't want to take away the fun of hashing out details from other people :) I'm in favor of giving Scott Alexander "god status" (ability to edit the karma for every person and post) until all the incentive details are worked out, and maybe even after that. In the extreme, the system I describe is simply a tool to lighten Scott's moderation load.
(This is the problem that Scott Aa points to of "this system can only improve on ordinary democracy if the trust network has some other purpose" - I worry that voting-for-comment-scores isn't a sufficiently strong purpose to outweigh voting-for-moderators.)
So I guess the analogy here would be if I want a particular user to have more influence, I'd vote up a post of theirs that I didn't think was very good in order to give them that influence? I guess this is a problem that would need to be dealt with. Some quick thoughts on solutions: Anonymize posts before they're voted on. Give Scott the ability to "punish" everyone who voted up a particularly bad post and lessen their moderation abilities.
Another system to consider would be to do it based on the way people administer votes, not the way they remove them. If your votes tend to correlate with others', they have more weight in future. If posts you flag tend to get removed, your flags count for more. (I'm not convinced that this works either.)
A related idea that might work better: Make it so downvotes work to decrease the karma score of everyone who upvoted a particular thing. This incentivizes upvoting things that people won't find upsetting, which works against the sort of controversy the rest of the internet incentivizes. But there's no Keynesian beauty contest because you can never gain points through upvoting, only lose them. This also creates the possibility that there will be a cost associated with upvoting a thing, which makes karma a bit more like currency (not necessarily a bad thing).
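A minimal sketch of that rule (toy Python; the penalty size and data structures are assumptions, just to show the shape of the incentive):

```python
# Toy sketch: each downvote on a post subtracts karma from everyone who
# upvoted it, so you can only lose karma by upvoting, never gain it --
# which removes the Keynesian beauty contest.
karma = {"alice": 10, "bob": 4}
upvoters = {"controversial_post": ["alice", "bob"]}

def downvote(post, penalty=1):
    for user in upvoters.get(post, []):
        karma[user] = karma.get(user, 0) - penalty

downvote("controversial_post")
print(karma)  # {'alice': 9, 'bob': 3}
```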
Replies from: John_Maxwell_IV
↑ comment by John_Maxwell (John_Maxwell_IV) · 2016-06-28T01:47:02.621Z · LW(p) · GW(p)
The Less Wrong diaspora demonstrates that the toughest competition for online forums may be individual personal blogs. By writing on your personal blog, you build up your own status & online presence. To be more competitive with personal blogs, it might make sense to give high-karma users of a hypothetical SSC forum the ability to upvote their own posts multiple times, in addition to those of others. That way if I have a solid history of making quality contributions, I'd also have the ability to upvote a new post of mine multiple times if it was an idea I really wanted to see get out there, in the same way a person with a widely read personal blog has the ability to really get an idea out there. The mechanism I outlined above (downvotes taking away karma from the people who upvoted a thing) could prevent abuse of self-upvoting: if I self-upvote my own post massively, but it turns out to be lousy, other people will downvote it, and I'll lose some of the karma that gave me the ability to self-upvote massively.
↑ comment by Pimgd · 2016-06-14T09:07:35.292Z · LW(p) · GW(p)
If posts you flag tend to get removed, your flags count for more. (I'm not convinced that this works either.)
StackExchange uses a flag weight model. They removed it from the visible section of the profile (http://meta.stackexchange.com/questions/119715/what-happened-to-flag-weight) but I think they still use it internally.
comment by lsusr · 2021-03-04T09:25:17.820Z · LW(p) · GW(p)
I'm writing this from Less Wrong 2.0.
If you've got a great idea for a blog post, and you don't already have an online presence, it's a bit hard to reach lots of people, if that's what you want to do.
I don't know what Less Wrong 1.0 was like but I feel like Less Wrong 2.0 accomplishes this.
If we had a good system for incentivizing people to write great stuff (as opposed to merely tolerating great stuff the way LW culture historically has), we'd get more great stuff written.
Once again, I don't know what Less Wrong 1.0 was like, but I think Less Wrong 2.0 does a good job of this without incentivizing too much.
comment by plex (ete) · 2016-06-12T19:25:45.205Z · LW(p) · GW(p)
Excellent post. Agree with all major points.
I think Less Wrong experienced the reverse of the evaporative cooling EY feared, where people gradually left the arena as the proportional number of critics in the stands grew ever larger.
I'd think it was primarily not the proportional number of critics, but lower quality of criticism and great users getting tired of replying to/downvoting it. Most of the old crowd of lesswrongers welcomed well thought out criticism, but when people on the other side of an inferential distance gap try to imitate those high-criticism norms it is annoying to deal with, so they end up leaving. Especially if the lower quality users are loud and more willing to use downvotes as punishment for things they don't understand.
Replies from: casebash, someonewrongonthenet
↑ comment by someonewrongonthenet · 2016-06-30T06:54:42.305Z · LW(p) · GW(p)
So basically it is Eternal September, then. It's just that Less Wrong's "September" took the form of excessively/inappropriately contrarian people.
Replies from: ete
↑ comment by plex (ete) · 2016-07-03T01:40:31.281Z · LW(p) · GW(p)
Among other forms, yes.
comment by ChristianKl · 2016-06-12T17:27:40.222Z · LW(p) · GW(p)
Try to solve the same sort of problems Arbital or Metaculus is optimizing for. No reason to step on the toes of other projects in the community.
I don't think it's bad to have multiple websites that gather predictions. It's good to have different websites trying different approaches.
Metaculus has curated questions; users can suggest new questions, but they have to be chosen. PredictionBook allows user-suggested questions. GJOpen is completely curated, without many ways for users to suggest questions (except a bit; see the recent question asking for new questions). Metaculus starts by showing the user the average guesses of the community before the user votes.
Metaculus lets the user pick probabilities with a sliding scale. PredictionBook lets the user input a number. GJOpen lets the user click twice (you click first on 10-19 and then on 14).
All three try to score users differently.
I would welcome more experimentation.
comment by rayalez · 2016-06-15T00:20:49.280Z · LW(p) · GW(p)
Hey, everyone! Author of rationalfiction.io here.
I am actively building and improving our website, and I would be happy to offer it as a new platform for the LW community, if there's interest.
I can take care of the hosting, and build all the necessary features.
I've been thinking about creating an LW-like website for a while now, but I wasn't sure that it would work. After reading this post I have decided that I'm going to launch it and see where it goes.
If there are any ideas or suggestions about how such a platform could be improved or what features we'll need, let's discuss them.
By the way, the platform is open source (though I will probably fork it as a separate project and develop it in a new repo).
comment by Vaniver · 2016-06-12T23:00:41.304Z · LW(p) · GW(p)
It's hard to know for sure, since previous surveys were only advertised on the LessWrong.com domain, but it doesn't seem like the diaspora thing has slowed the growth of the community a ton and it may have dramatically accelerated it.
My impression is that a big part of the increase in survey responses was that this was explicitly advertised as a diaspora survey, with a number of people saying things like "if you're reading this, it's for you."
Replies from: ingres
↑ comment by namespace (ingres) · 2016-06-12T23:49:19.118Z · LW(p) · GW(p)
Yes, that was exactly why it was marketed that way.
comment by Gleb_Tsipursky · 2016-06-12T21:20:48.789Z · LW(p) · GW(p)
Nice ideas! I think you highlighted well the fundamental problem: a lack of social rewards for writing content for LW, and strong criticism when you do.
Regarding changing things, I think it makes sense to work with people like Scott who have a lot of credibility, and figure out what would work for them.
However, it also seems that LW itself has a certain brand, and attracts a sizable community. I would like to see a version of the voting system you described implemented here, with people who have more karma having votes that weigh more. I'd also like to see some cross-posting of content from Scott and others on LW itself.
So not doing away with LW as it exists, but expanding it in collaboration with others who would be interested in revitalizing a different form of LW. One where authors get appropriate credit for posting, with credible people - those who have lots of karma - being able to upvote them more.
comment by namespace (ingres) · 2016-06-12T19:38:13.434Z · LW(p) · GW(p)
I am honored that my survey writeup produced this level of quality discussion and endorse this post.
(Though not necessarily its proposed upvote scheme, sounds kind of flaky to me. I'm personally very skeptical of upvotes and community curation as cure-alls.)
Replies from: John_Maxwell_IV
↑ comment by John_Maxwell (John_Maxwell_IV) · 2016-06-12T20:23:40.084Z · LW(p) · GW(p)
I am honored that my survey writeup produced this level of quality discussion and endorse this post.
Thanks!
(Though not necessarily its proposed upvote scheme, sounds kind of flaky to me. I'm personally very skeptical of upvotes and community curation as cure-alls.)
What's your favored solution?
comment by casebash · 2016-06-13T01:36:27.015Z · LW(p) · GW(p)
The incentives are currently a major problem. Looking at my posting history, my most upvoted posts are all light and fluffy things.
Try to tackle a difficult and controversial problem and you'll find it very hard to get upvotes, because these will mostly be balanced out by downvotes from people who strongly believe in the opposite.
Replies from: Dagon
↑ comment by Dagon · 2016-06-14T02:12:31.816Z · LW(p) · GW(p)
How much are you motivated by votes in the first place, though? I care a bit that a comment got any reaction, and a bit that it's positive, but I give a lot more weight to followups and responses than to votes. And I especially don't care about vote magnitude. 2 or 3 is as good as 10 or 12 to my happiness-at-posting response.
Replies from: casebash
↑ comment by casebash · 2016-06-14T14:24:36.585Z · LW(p) · GW(p)
I think I probably care less about votes than the average person does, since I appreciate feedback. I suspect that many people who received as many critical comments and downvotes as I have would give up on posting on the forum (not that most of my posts end up negative, but I've often had posts downvoted into the negative, then voted back up by other people).
comment by buybuydandavis · 2016-06-12T22:44:22.058Z · LW(p) · GW(p)
How about just reposting the sequences? One a week. Plenty of quality material.
Replies from: Vaniver
↑ comment by Vaniver · 2016-06-12T23:04:20.941Z · LW(p) · GW(p)
MinibearRex did this a while back, and the Rationality Reading Group recently finished RAZ. I weakly suspect that we're better off doing this with something besides the Sequences, like Superintelligence or Good and Real or so on.
(But if you want to do Sequence reposts, go ahead.)
comment by Evan_Gaensbauer · 2016-06-12T13:03:21.292Z · LW(p) · GW(p)
I don't know if it was in the comments here or in an SSC post, but when talking about the rationalist diaspora and where the community goes from here, Scott has said he would welcome blog posts from guest authors, and mentioned several people he'd be willing to have on the site, or had already invited to make a guest post. Naturally, the guest authors he mentioned were already once-prominent LW bloggers--I forget who he mentioned besides Eliezer, who declined, but there were almost a dozen others. Scott's writing is so impressive I wouldn't be surprised if even some of his close friends--our friends, who hundreds of us think often write just as well as or better than Scott--are personally too intimidated to post on SSC.
Well, that's one hypothesis. Sometimes posts are so top-notch on SSC guest authors might feel they're not up to snuff. Another hypothesis is that, for prominent authors, LW was a forum which exhausted all the low-hanging fruit, and wasn't receptive to juicier, edgier, topics, like the culture and politics Scott writes about. However, with that level of exposure on a personal blog, writing on more controversial topics, earns a lot more scrutiny. SSC isn't without its share of contentious posts. Maybe all that politicking, having the patience to grit your teeth and exercise the principle of charity in the face of hundreds of commentators, is a skill Scott has that's harder for the rest of us to hack. Maybe other diaspora authors know this, and don't think they can thrust themselves into the spotlight without getting burned out.
I think if the LW/rationalist community made an effort to publicly laud the authors of various blogs we like, they'd feel like it's more worth the effort if respected readers want more. I don't know if that'll work. However, I know that if there were a thread where dozens of people were commenting that they liked my blog, even though they usually didn't speak up about it, and each of those comments had dozens of upvotes or whatnot, I'd be more inclined to write.
Replies from: Evan_Gaensbauer
↑ comment by Evan_Gaensbauer · 2016-06-12T13:05:31.266Z · LW(p) · GW(p)
I'm aware Ozy Frantz had one or two guest posts on SSC as well, and that they and Scott sometimes had a dynamic where they were responding to posts on each other's blogs, and that was interesting. AFAIK, that was mostly while they were dating. I don't know what the status is of Ozy ever being a guest author on SSC again.
comment by Gunnar_Zarncke · 2016-06-12T09:35:13.714Z · LW(p) · GW(p)
Where can I find the CFAR mailing list you mentioned?
Replies from: Fluttershy, ChristianKl
↑ comment by Fluttershy · 2016-06-12T11:06:18.895Z · LW(p) · GW(p)
I believe that you'll need to attend a CFAR workshop ($3,900 without a scholarship) to receive a subscription to the CFAR mailing list. I'd be willing to pay some amount just to get added to it, since I already have a CFAR workbook, and am relatively familiar with the material taught during the workshops.
↑ comment by ChristianKl · 2016-06-12T17:47:05.089Z · LW(p) · GW(p)
It's not public. It's an alumni mailing list. By only opening up the mailing list to people who were at CFAR, CFAR manages to create a selected circle of people with a shared vocabulary.
comment by root · 2016-06-12T14:10:55.549Z · LW(p) · GW(p)
Two questions:
Can anyone who has been a user for a significant amount of time give links to anything that wasn't deemed worthy of the sequences but is a worthy read? I have no idea when the sequences were collected, but if LW was really great in the past, there would've been a bunch of other high-quality posts that are easily missed. This could also double as proof that LW was, indeed, as great as advertised.
What do other places have that LW doesn't? If LW is dedicated to human rationality, is it truly doing that?
Am I a complete dumbass for typing this? In hindsight, it doesn't take a special variation of Godwin's law to think 'someone probably posted a similar question before'.
↑ comment by Manfred · 2016-06-13T20:26:28.918Z · LW(p) · GW(p)
1. There have been several posters who wrote some very nice articles, including Alicorn, lukeprog, Yvain, AnnaSalamon, and Wei_Dai. (Listed in order on a sort of life-hacks to decision-theory spectrum).
Oh, and here's a classic by that prolific author, anonymous (Who? That would be telling :) )
2. To be uncharitable, we might say that other places have way more discussions of race, politics, and gender. Or to be uncontroversial, we might just say that other places have a lot more ordinary blog-type content, which people read for ordinary blog-type reasons.
A lot of the diaspora blogs I like most (e.g. Otium, Paul Christiano's Medium) don't have such content, and are correspondingly unpopular.
3. On question 1, there are definitely index posts aimed at this sort of thing, but I couldn't find the specific one I was thinking of with just a cursory search.
Replies from: root
↑ comment by John_Maxwell (John_Maxwell_IV) · 2016-06-12T20:38:32.624Z · LW(p) · GW(p)
Here is a list of lists of archive posts.