Less Wrong lacks direction
post by casebash · 2015-05-25T14:53:30.972Z · LW · GW · Legacy · 34 comments
I think the greatest issue with Less Wrong is that it lacks direction. There doesn't appear to be anyone driving it forward or helping the community achieve its goals. At the start this role was taken by Eliezer, but he barely seems active these days. The expectation seems to be that things will happen spontaneously, on their own. And that has worked for a few things (e.g. the subreddit, the study hall, etc.), but on the whole the community is much less effective than it could be.
I want to give an example of how things could work. Let's imagine Less Wrong had some kind of executive (as opposed to moderators, who just keep everything in order). At the start of the year, they could create a thread asking which goals the community thought were important for Less Wrong - e.g. increasing the content in Main, producing more content for a general audience, or increasing the female participation rate.
They would then have a Skype meeting to discuss the feedback and to debate which goals they wanted to focus on primarily. Suppose, for example, they decided they wanted to increase the content in Main. They might solicit community feedback on what kinds of articles people would like to see more of. They might contact people who wrote discussion posts that were of Main quality and suggest they submit some content there instead. They could come up with ideas for new kinds of content LW might find useful (e.g. project management) and seed the site with content in that area, so that people understand that kind of content is desired.
These roles would take significant work, but I imagine people would be motivated to do this by altruism or status. By discussing ideas in person (instead of just over the internet), there would be more of an opportunity to build a consensus, and they would be able to make more progress towards addressing these issues.
If a group said that they thought A was an important issue and the solution was X, most members would pay more attention than if a random individual said it. No one would have to listen to anything they said, but I imagine that many would choose to. Furthermore, if the exec were all actively involved in the projects, I imagine they'd be able to complete some smaller ones themselves, or at least provide the initial push to get things going.
34 comments
Comments sorted by top scores.
comment by ChaosMote · 2015-05-26T13:43:31.922Z · LW(p) · GW(p)
I think the issue you are seeing is that Less Wrong is fundamentally an online community / forum, not a movement or even a self-help group. "Having direction" is not a typical feature of such a medium, nor would I say that it would necessarily be a positive feature.
Think about it this way. The majority of the few (N < 10) times I've seen explicit criticism of Less Wrong, one of the main points cited was that Less Wrong had a direction, and that said direction was annoying. This usually referred to Less Wrong focusing on the FAI question and X-risk, though I believe I've seen the EA component of Less Wrong challenged as well. By its nature, having direction is exclusionary - people who disagree with you stop feeling welcome in the community.
With that said, I strongly caution against trying to impart direction to Less Wrong as a whole (e.g. by having an official "C.E.O."). Organizing a sub-movement within Less Wrong for that sort of thing, on the other hand, carries much less risk of alienating people. I think that would be the healthiest direction to take it, plus it allows you to grow organically (since people can easily join/leave your movement and you don't need to get the entire community mobilized to get started).
Replies from: RobbBB
↑ comment by Rob Bensinger (RobbBB) · 2015-05-28T22:27:03.253Z · LW(p) · GW(p)
I think these concerns are good ones if we expect the director(s) (/ the process of determining LessWrong's agenda) to not be especially good. If we do expect the director(s) to be good, then they should be able to take your concerns into account -- include plenty of community feedback, deliberately err on the side of making goals inclusive, etc. -- and still produce better results, I think.
If you (as an individual or as a community) don't have coherent goals, then exclusionary behavior will still emerge by accident; and it's harder to learn from emergent mistakes ('each individual in our group did things that would be good in some contexts, or good from their perspective, but the aggregate behavior ended up having bad effects in some vague fashion') than from more 'agenty' mistakes ('we tried to work together to achieve an explicitly specified goal, and the goal didn't end up achieved').
If you do have written-out goals, then you can more easily discuss whether those goals are the right ones -- you can even make one of your goals 'spend a lot of time questioning these goals, and experiment with pursuing alternative goals' -- and you can, if you want, deliberately optimize for inclusiveness (or for some deeper problem closer to people's True Rejections). That creates some accountability when you aren't sufficiently inclusive, makes it easier to operationalize exactly what we mean by 'let's be more inclusive', and makes it clearer to outside observers that at least we want to be doing the right thing.
(This is all just an example of why I think having explicit common goals at all is a good idea; I don't know how much we do want to become more inclusive on various axes.)
Replies from: ChaosMote
↑ comment by ChaosMote · 2015-05-29T01:44:46.828Z · LW(p) · GW(p)
You make a good point, and I am very tempted to agree with you. You are certainly correct that even a completely non-centralized community with no stated goals can be exclusionary. And I can see "community goals" serving a positive role, guiding collective behavior towards communal improvement, whether that comes in the form of non-exclusiveness or other values.
With that said, I find myself strangely disquieted by the idea of Less Wrong being actively directed, especially by a singular individual. I'm not sure what my intuition is stuck on, but I do feel that it might be important. My best interpretation right now is that having an actively directed community may lend itself to catastrophic failure (in the same way that having a dictatorship lends itself to catastrophic failure).
If there is a single person or group of people directing the community, I can imagine them making decisions which anger the rest of the community, making people take sides or split from the group. I've seen that happen in forums where the moderators did something controversial, leading to considerable (albeit usually localized) disruption. If the community is directed democratically, I again see people being partisan and taking sides, leading to (potentially vicious) internal politics; and politics is both a mind killer and a major driver of divisiveness (which is typically bad for the community).
Now, to be entirely fair, these are somewhat "worst case" scenarios, and I don't know how likely they are. However, I am having trouble thinking of any successful online communities which have taken this route. That may just be a failure of imagination, or it could be that something like this hasn't been tried yet, but it is somewhat alarming. That is largely why I urge caution in this instance.
Replies from: casebash
↑ comment by casebash · 2015-06-02T00:31:51.312Z · LW(p) · GW(p)
"With that said, I find myself strangely disquieted by the idea of Less Wrong being actively directed, especially by a singular individual." - the proposal wasn't that a single individual would choose the direction, but that there would be a group.
Replies from: Lumifer
comment by estimator · 2015-05-25T15:31:12.375Z · LW(p) · GW(p)
Agreed that LW is in a kind of stagnation. However, I think that just someone writing a series of high-quality posts would suffice to fix it. Now, the amount of discussion in comments is quite good, the problem is that there aren't many interesting posts.
If a group said that they thought A was an important issue and the solution was X, most members would pay more attention than if a random individual said it. No-one would have to listen to anything they say, but I imagine that many would choose to. Furthermore if the exec were all actively involved in the projects, I imagine they'd be able to complete some themselves, especially if they choose smaller ones.
That isn't necessarily a good thing; many people have noticed that LW is somewhat of an echo chamber for Eliezer. Actually, we should endorse high-quality opinions that differ from the LW mainstream.
Replies from: ZacHirschman, passive_fist
↑ comment by ZacHirschman · 2015-05-25T20:18:01.813Z · LW(p) · GW(p)
What are your heuristics for telling whether posts/comments contain "high-quality opinions," or "LW mainstream"? Also, what did you think of Loosemore's recent post on fallacies in AI predictions?
Replies from: estimator
↑ comment by estimator · 2015-05-25T21:50:28.119Z · LW(p) · GW(p)
It's just my impression; I don't claim that it is precise.
As for the recent post by Loosemore, I think that it is sane and well-written, and clearly required a substantial amount of analysis and thinking to write. I consider it a central example of high-quality non-LW-mainstream posts.
Having said that, I mostly disagree with its conclusions. All the reasoning there is based on the assumption that the AGI will be logic-based (CLAI, following the post's terminology), which I find unlikely. I'm 95% certain that if the AGI is going to be built anytime soon, it will be based on machine learning; anyway, the claim that CLAI is "the only meaningful class of AI worth discussing" is far from being true.
↑ comment by passive_fist · 2015-05-25T22:25:41.959Z · LW(p) · GW(p)
I think LW might actually be suffering from something like a collective affective death spiral.
Replies from: efim
↑ comment by efim · 2015-05-26T07:01:03.344Z · LW(p) · GW(p)
I'm not sure how it relates to the proposed stagnation (i.e. loss of momentum) of the LW community. Could you please elaborate? I understand affective death spirals to mean something completely different, so I am totally confused.
Replies from: passive_fist
↑ comment by passive_fist · 2015-05-26T08:40:33.997Z · LW(p) · GW(p)
It's quite easy (and in fact almost inevitable) to get carried away with a theory (as in a bunch of axiomatic ideas together with a logical framework) you have. "As the theory seems truer, you will be more likely to question evidence that conflicts with it. As the favored theory seems more general, you will seek to use it in more explanations." Thus you will cease to question the theory and cease to truly go beyond it, leading to stagnation.
Replies from: Richard_Kennaway
↑ comment by Richard_Kennaway · 2015-05-26T11:31:25.274Z · LW(p) · GW(p)
What is the theory that you think LW has such a spiral around?
Replies from: passive_fist, estimator
↑ comment by passive_fist · 2015-05-26T22:14:54.236Z · LW(p) · GW(p)
The idea that you can actually optimize your thought processes using deliberate rational will and analysis of biases, as exemplified by the home page, and specifically the extreme version of this idea that some users try to adopt.
Replies from: estimator, Richard_Kennaway
↑ comment by estimator · 2015-05-26T22:34:51.804Z · LW(p) · GW(p)
Can you unpack "optimizing thought processes"? Under some definitions the statement is questionable, under others trivially true.
Also, the articles you've linked to describe techniques that are very popular outside LW -- so if they are overrated, it isn't a LW-specific mistake.
Replies from: passive_fist
↑ comment by passive_fist · 2015-05-27T02:25:49.484Z · LW(p) · GW(p)
I can try to elaborate on the criticisms of the pages I linked. There hasn't been any study of the long-term effects of spaced repetition. There are indications that it may be counter-productive and that it may act as an artificial 'importance inflator' of information, desensitizing the brain's long-term response to new knowledge that is actually important, especially if one is not consciously aware of that.
About the pomodoro technique, it's even less researched than spaced repetition and there's very little solid evidence that it works. One thing that seems a bit worrying is that it seems like a 'desperate measure' adopted by people experiencing low productivity, indicating some other problem (depression/burnout etc.) that should be dealt with directly. In these cases pomodoros would make things far worse.
It could be said that none of these are criticisms of LW, but just criticisms of these specific techniques, which arose outside of LW. However, if one is too eager to adopt and believe in such techniques, it betrays ADS-type thinking around the idea that optimization of thought processes can be done through 'productivity hacks'.
↑ comment by Richard_Kennaway · 2015-05-26T22:33:43.861Z · LW(p) · GW(p)
The idea that you can actually optimize your thought processes using deliberate rational will and analysis of biases, as exemplified by the home page,
How are you distinguishing an affective death spiral from people thinking that something is a good idea?
and specifically the extreme version of this idea that some users try to adopt.
People using Anki and Pomodoros (neither of which were invented on LW or by LWers) doesn't look extreme to me.
↑ comment by estimator · 2015-05-26T22:15:28.955Z · LW(p) · GW(p)
TDT, FAI (esp. CEV), acausal trading, MWI -- regardless whether they are true or not, the level of criticism is lower than one would expect; either because of the Halo effect or ADS.
Replies from: Richard_Kennaway
↑ comment by Richard_Kennaway · 2015-05-26T22:38:17.107Z · LW(p) · GW(p)
I see these things being discussed here from time to time. I don't see any general booming of them, still less any increasing trend. Eliezer, of course, has boomed MWI quite strongly; but he is no longer here.
Replies from: estimator
↑ comment by estimator · 2015-05-26T23:02:36.464Z · LW(p) · GW(p)
My impression is that inside LW they are usually assumed true, while outside LW they are usually assumed false or highly questionable. Again, I'm not saying that these theories are wrong, but the pattern looks suspicious; almost every one of LW's non-mainstream beliefs can be traced back to Eliezer. What a coincidence. One of the possible explanations is the halo effect of the Sequences. Or they are actually underrated outside LW. Or my impressions are distorted.
Replies from: gwern
↑ comment by gwern · 2015-05-26T23:50:22.104Z · LW(p) · GW(p)
Or my impressions are distorted.
I'm going with distorted.
Take MWI for example; apparently a lot of people are under the impression that LWers must be ~100% MWI fanatics. But the annual surveys report that lukewarm endorsements of MWI as the least bad QM interpretation covers, what, <50% of respondents? And it's not clear to me that LW is even different from mainstream physicists, since the occasional polls of them show MWI keeps becoming more popular. It seems like people overgeneralize from the generally respectful treatment of MWI as a valid alternative (as opposed to early criticism of it as nonsense or crackpot pseudoscience) and from MWI topics being a lot more fun to discuss than, say, Copenhagen.
Or, global pandemics are regularly rated in the survey as a very concerning x-risk up there with AI, but are discussed much less; possibly because the risk of pandemics seems well-appreciated by society at large and there's little new to discuss.
Similarly for some of the other stereotypical beliefs; critics like Stross and XiXiDu have been campaigning to turn Roko's basilisk into the defining shibboleth of LW, but do even <5% of LWers take it seriously or as more than an obscure hypothetical in one superseded decision theory? (I don't think so but in that case I can't prove it with survey data.)
And with TDT and acausal trading, they're technical and difficult enough, relying heavily on formal logic and decision theory, that it's hard to make any comments on them at all, either pro or con. Personally, I don't believe in acausal trading. But I also don't ever come out and talk about it, because I don't feel I understand it or UDT/TDT well, am not particularly interested in them, and have nothing new to contribute to conversations about them; so why would I write about them, and if I were writing about them, why would you or anyone want to read what I wrote?
comment by kilobug · 2015-05-26T08:18:39.443Z · LW(p) · GW(p)
I'm not really sure the issue is about "direction", but more about people who have enough time and ideas to write awesome (or at least, interesting) posts like the Sequences (the initial ones by Eliezer or the additional ones by various contributors).
What I would like to see are sequences of posts that build on each other, starting from the basics and going on to deeper things (a bit like the Sequences). It could be collective work (and would then need a "direction"), but it could also be the work of a single person.
As for myself, I did write a few posts (a few in Main and a few in Discussion), but if I haven't written recently it's mostly because of three issues:
1. Lack of time, like I guess for many of us.
2. The feeling of not being "good enough" - that's the problem with a community of "smart" people like LW, with high-quality base content (the Sequences); it's a bit intimidating.
3. The "taboo" subjects (like politics), which I do understand and respect, but they limit what I could write about.
There are a few things I would like to write about, but either I feel I lack the skill/knowledge to do it at LW level (point 2), or they border too much on the "taboo" subjects (point 3).
comment by ChristianKl · 2015-05-25T18:28:09.031Z · LW(p) · GW(p)
These roles would take significant work, but I imagine people would be motivated to do this by altruism or status.
If someone's goal is status, why write on LW instead of writing a personal blog?
Replies from: roryokane, casebash
↑ comment by casebash · 2015-05-26T02:01:16.653Z · LW(p) · GW(p)
Perhaps they want status among the rationalist community?
Replies from: ChristianKl
↑ comment by ChristianKl · 2015-05-26T13:36:49.469Z · LW(p) · GW(p)
If I read a blog post on a personal blog, it's more likely that a month later I'll remember who wrote it than if I had read it on LW.
Replies from: tim
↑ comment by tim · 2015-05-27T02:48:20.660Z · LW(p) · GW(p)
Yeah, but it's not fair to start with "given that I read a post and it was on a personal blog..." if the odds of you reading said post in the first place are higher when it's posted on LW rather than on someone's personal blog that you may not be aware of or check regularly.
comment by Vaniver · 2015-05-25T20:21:37.313Z · LW(p) · GW(p)
I think the greatest issue with Less Wrong is that it lacks direction.
I'm not sure "direction" is the right phrase. It's not like Eliezer told other people what posts to write--he had a goal and a thing to explain, and then he explained it. People didn't read his posts because he had authority; they read his posts because someone recommended them as being useful and interesting.
And similarly, I don't think the absence of roles is the issue: I think it's the absence of time and topics. Maybe there are people with something to write who aren't writing it because of some block that a clearly visible person could help remove--but it seems more likely to me that there are people with something to say but no time to say it, or with the time to write posts but little to talk about, and there are fewer intersections of those two as time goes on.
Replies from: ChristianKl, RobbBB
↑ comment by ChristianKl · 2015-05-25T21:43:54.134Z · LW(p) · GW(p)
but it seems more likely to me that there are people with something to say but no time to say it, or with the time to write posts but little to talk about, and there are fewer intersections of those two as time goes on.
Or they simply write the post on their own blogs.
↑ comment by Rob Bensinger (RobbBB) · 2015-05-28T22:40:33.557Z · LW(p) · GW(p)
This assumes that the period when Eliezer was writing a lot was pretty optimal, and we should be trying to recapitulate that era of success. Maybe the thing Eliezer was trying to do has now succeeded, and we should be using the site in a very different way now? Or, if he didn't succeed, there may still be better goals to move toward than 'lots of people are active and have interesting discussions about miscellaneous topics a la early-LW'.
Replies from: Vaniver
↑ comment by Vaniver · 2015-05-28T23:37:47.475Z · LW(p) · GW(p)
This assumes that the period when Eliezer was writing a lot was pretty optimal, and we should be trying to recapitulate that era of success.
The OP states that, and I think I agree, in the sense that if there's a topic that would be as general and as interesting as the Sequences, then I would rather someone write a long text about it here than them not do that. I suspect, as I've argued elsewhere, that there isn't something in that class; there are lots of interesting things for people to do now, but they aren't going to be as common an interest.
I do think that there are changes LW could make to adapt to the new community and purpose for it, but I'm not sure here is the best place to discuss that.
comment by SebastianG (JohnBuridan) · 2015-06-05T04:36:09.693Z · LW(p) · GW(p)
I don't think the site as a whole needs a "new" direction. It needs continued conversation, new sub-projects, and for the members to engage with the community.
Less Wrong has developed its own conventions for argument, reference points for logic, and traditions of interpretation of certain philosophical, computational, and everyday problems. The arguments all occur within a framework which implicitly furnishes the members with a certain standard of thinking and living (which we don't always live up to).
Maybe what you really want is for people in the community to find a place where they can excel and contribute more. What we need most is to continue to develop ways people can contribute, not to force the generation of projects from above.
comment by [deleted] · 2015-06-01T07:56:52.242Z · LW(p) · GW(p)
this was an unhelpful comment, removed and replaced by the comment you are now reading
Replies from: JohnBuridan
↑ comment by SebastianG (JohnBuridan) · 2015-06-05T04:35:12.208Z · LW(p) · GW(p)
ha!