Has LessWrong Ever Backfired On You?

post by Evan_Gaensbauer · 2014-12-15T05:44:32.795Z · LW · GW · Legacy · 70 comments

Several weeks ago I wrote a heavily upvoted post called Don't Be Afraid of Asking Personally Important Questions on LessWrong. I thought it would only be due diligence to track users on LessWrong who have received advice on this site that then backfired. In other words, to keep the record unbiased, we should also notice what LessWrong as a community is bad at giving advice about. So, I'm seeking feedback. If you have anecdotes or data about how a plan or advice taken directly from LessWrong backfired, failed, or didn't lead to satisfaction, please share below.

70 comments

Comments sorted by top scores.

comment by Shmi (shminux) · 2014-12-15T17:53:28.253Z · LW(p) · GW(p)

Bringing up EY/LW in a positive way in unrelated online discussions got me labeled a weirdo once or twice. I recall having to leave one forum because of the hostility. I am tempted to say that this was for the best, but it could be just the sour grapes fallacy.

Replies from: FiftyTwo, Kawoomba, buybuydandavis, Error
comment by FiftyTwo · 2014-12-16T14:56:24.013Z · LW(p) · GW(p)

Yeah, I've had people complain about the standard basilisk and weird AI speculation stuff. Also the association with neoreactionaries, sexists and HBD people.

Replies from: None
comment by [deleted] · 2015-02-23T16:36:29.110Z · LW(p) · GW(p)

Sometimes you get the opposite: LW seen as an SJW forum, because Scott Alexander is okay with referring to his partner as ze/zir/zur on his blog, and if you are not American, or are over 40, or at any rate did not go to a US college in the last 15 years, this comes across as weird. I remember that even on Reddit, as late as 2009, the "in" progressive thing was to hate Bush, not to understand something about transgenderism or feminism, so I would say this is a very recent thing in mainstream circles.

comment by Kawoomba · 2014-12-15T21:23:21.365Z · LW(p) · GW(p)

Whatever you do, don't mention the Contract.

Replies from: Punoxysm
comment by Punoxysm · 2014-12-15T22:11:39.992Z · LW(p) · GW(p)

What is this?

Replies from: Kawoomba, Metus
comment by Kawoomba · 2014-12-15T22:27:03.115Z · LW(p) · GW(p)

Nothing. Absolutely nothing.

comment by Metus · 2014-12-15T22:21:35.030Z · LW(p) · GW(p)

A joke.

comment by buybuydandavis · 2014-12-16T01:10:47.516Z · LW(p) · GW(p)

I recall having to leave one forum because of the hostility.

"An infidel in our midst! Burn him! BURN!"

For LW? How wacky.

I am tempted to say that this was for the best

Harry Browne wrote a pretty good book, "How I Found Freedom in an Unfree World". A couple of themes I took away were "sell to your market" and "it pays to advertise". You want to attract, and invest your time in, people who actually are your market, so that your investment has a good chance of paying off over time.

That forum ain't it, and chasing you out over LW doesn't say flattering things about them. There are always special cases, but in general, a forum with such hostility toward what you are is not a good place for you to get attached to.

comment by Error · 2014-12-15T21:22:11.150Z · LW(p) · GW(p)

Was the hostility because of the weirdo-ness, or something else, if you don't mind me asking? Seems to me that if strangeness on its own creates that sort of response, you're probably better off elsewhere...

Replies from: shminux
comment by Shmi (shminux) · 2014-12-16T07:26:03.137Z · LW(p) · GW(p)

Eliezer's writing, fiction and non-fiction alike, tends to attract hostility, and all LWers are automatically labeled "Yudkowskians". On a somewhat related note, the idea of AGI x-risk he's been pushing for years has finally gone mainstream, yet the high-profile people who speak out about it avoid mentioning him, like he is low-status or something.

Replies from: Vulture, Viliam_Bur
comment by Vulture · 2014-12-16T18:09:15.538Z · LW(p) · GW(p)

Eliezer seems to be really really bad at acquiring or maintaining status. I don't know how aware of this fault he is, since part of the problem is that he consistently communicates as if he's super high status.

Replies from: gothgirl420666
comment by gothgirl420666 · 2014-12-17T09:30:36.802Z · LW(p) · GW(p)

Eliezer is kind of a massive dork who also has an unabashedly high opinion of himself and his ideas. So people see him as a low-status person acting as if he is high-status, which is a pattern that for whatever reason inspires hatred in people. LessWrong people don't feel this way, because to us he is a high-status person acting as if he is high-status, which is perfectly fine.

Also, one thing he does that I think works against him is how defensive he gets when facing criticism. On Reddit, he occasionally writes long rants about how he is an unfair target of hate and misrepresentation whenever someone brings up Roko's basilisk. Which may be true, but feeling the need to defend yourself to such an extent is very low-status behavior. Just the other day I saw him post on Facebook a news story which portrayed the secular solstice in a positive light, with the caption "FINALLY SOME HONEST JOURNALISM!!!!!" or something like that. This is just not a good look. I wonder if he could hire an image consultant or PR person; it seems like that could make FAI more likely.

Replies from: Viliam_Bur
comment by Viliam_Bur · 2014-12-18T13:46:35.285Z · LW(p) · GW(p)

For some reason this reminds me of a scene from Game of Thrones, where one person says "knowledge is power", and the other person responds by threatening their life and then saying "no, power is power". (Kept unspecific to avoid spoilers.)

The point is, some kinds of power depend on context, some don't. Generally, respecting people for their intellectual or artistic skills is context-dependent. You don't get status by being good at maths among people who consider maths low status. You don't get status for writing good fan fiction among people who consider fan fiction low status. You don't get status for being able to debate rationality among people who consider rational debating low status. -- More universal sources of status are money and the ability to harm people, because almost everyone is afraid of harm, and almost everyone needs money.

When dealing with journalists, it is useful to realize that journalists have this kind of destructive power. Dealing with a journalist is like meeting a thug in a dark street: you don't want to make him angry. If you get out alive, you should consider it a success, and not complain about small inconveniences. In the long term, if you live on that dark street, you should try to "befriend" the thug, so that he will not attack you, and may even agree to attack people you don't like.

How specifically do you "befriend" journalists? Well, this is exactly what PR is about. You treat them with respect, invite them to conferences where you give them free food, and offer help with writing articles. Because they usually have small salaries and have to write a lot of articles, by giving them free food and doing part of their work for them, you make them happy. If you keep them hungry and disrespected, they may randomly attack you.

Replies from: gothgirl420666
comment by gothgirl420666 · 2014-12-18T18:51:31.426Z · LW(p) · GW(p)

Yes, but I don't think the negative press LessWrong receives is simply because journalists are fickle creatures. I think there is something inherent to the culture that turns outsiders off.

My guess is that Eliezer, MIRI, and LWers in general are strange people who believe strange things, and yet they (we) are pretty confident that they are right and everyone else is wrong. Not only that, but they believe that the future of humanity is in their hands. So at best, they're delusional. At worst, they're right... which is absolutely terrifying.

Also, like I said, Eliezer is a big dork, who for example openly talks about reading My Little Pony fanfiction. The idea that such a goober claims to be in charge of humanity's destiny is off-putting for the same reason. I wonder if to most people, Eliezer pattern-matches better to "weird internet celebrity", kind of an Amazing Atheist figure, than to "respectable intellectual" the way e.g. Nick Bostrom does. We can see in presidential elections that Americans don't trust someone who isn't charismatic, tall, in good shape, etc. to run the country. So, of course, the average person will not trust someone who lacks those qualities to save the world. It's an ivory tower thing, but instead of ivory it's more like green play-doh.

I think Eliezer's lack of "professionalism" in this sense probably has its upsides as well. It makes him more relatable, which helps him establish an audience. It makes his writings more fun to read. And it is probably easier for him to communicate his ideas if he isn't trying to sanitize them to meet a certain standard. MIRI in general seems to favor an "open book, keep it real, no bullshit" approach, as exemplified by how lukeprog wrote on this forum that it had been disastrously managed before he took over, and all he had to do was read Nonprofits for Dummies. From a PR standpoint, that seems unequivocally stupid to admit publicly, but he did it anyway. I feel like this philosophy has its benefits for MIRI as a whole, but I can't quite put my finger on what they are.

Replies from: Viliam_Bur
comment by Viliam_Bur · 2014-12-18T23:03:41.763Z · LW(p) · GW(p)

Now I feel like every group that tries to do something faces a trilemma:

1) Deny your weakness. Leads to irrationality.

2) Admit your weakness. Leads to low status, and then opposition from outsiders.

3) Deny your weakness publicly, and only admit it among trusted members. Leads to cultishness.

Replies from: Kaj_Sotala, Lumifer, gothgirl420666, hawkice, Sarunas
comment by Kaj_Sotala · 2014-12-19T14:01:20.348Z · LW(p) · GW(p)

2) Admit your weakness. Leads to low status, and then opposition from outsiders.

I wonder: with individuals, it feels like honestly and directly admitting your weaknesses, while giving the impression that they're not anything you're trying to hide, can actually increase your status. Having weaknesses yet being comfortable with them signals that you believe you have strength that compensates for those weaknesses, plus having flaws makes you more relatable. Could that also work for groups? I guess the biggest problem would be that with groups, it's harder to present a unified front: even when a single person smoothly and honestly admits the flaw, another gets all defensive.

Replies from: Princess_Stargirl
comment by Princess_Stargirl · 2014-12-20T17:03:29.642Z · LW(p) · GW(p)

I don't think this strategy works well for individuals, though maybe we are thinking of different reference sets. To me, the way to understand social interactions is to look at what politicians do; or, if one only cares about a more intelligent set of humans, executives at companies. People may hate politicians/executives, but they are demonstrably good at succeeding socially.

Are politicians/executives big on admitting weakness? I don't think so. They seem much more fond of either blatantly lying (and betting their supporters will defend them) or making only the weakest possible admissions of weakness/guilt ("mistakes were made").

Of course acting like a politician is usually pretty terrible for all sorts of reasons. But it's probably the "playing to win" action socially.

comment by Lumifer · 2014-12-19T16:21:24.345Z · LW(p) · GW(p)

In real life these choices are neither exclusive nor binary. A group might well admit the weakness in internal meetings and PR-manage the exposure of that weakness to the outside without either fully denying it or doing the whole sackcloth-and-ashes bit.

comment by gothgirl420666 · 2014-12-18T23:22:27.821Z · LW(p) · GW(p)

Great point, I didn't think of it that way.

Replies from: Viliam_Bur
comment by Viliam_Bur · 2014-12-19T10:36:09.084Z · LW(p) · GW(p)

Thanks! On the other hand, lest I prove too much, each of these ways can work:

1) Irrationality does not have to be fatal. Dilbert makes a living complaining about the irrationality of companies, and yet those companies make billions in profit.

2) Open source software exposes all its bugs, and still many open-source projects are respected. (Although this may be because the exposed weakness is incomprehensible to most people, so on the social level it is as if they exposed nothing.)

3) Most organizations have people with privileged access to information, and don't expose everything to the public. Most organizations have a clear boundary between a non-member and a member, between a non-manager and a manager. People don't question this, because it's business as usual.

So probably the problem here is that LessWrong is not an organization, and that LessWrong is not sufficiently separated from MIRI. Which feels ironic, because I am on LessWrong every day, and I mostly don't know what people at MIRI are working on now, so the separation clearly exists from my view; but it may not exist from an outsider's view, for whom simply LessWrong = Eliezer, and MIRI = Eliezer (so if Eliezer said something low status on LessWrong, it automatically means MIRI is low status). So my conclusion is that compartmentalization has an important role, and Eliezer failed to do it properly.

In real life, we usually don't have much data about the leaders of high-status organizations. From the outside they seem like boring people who only do their work, and that's all they ever do. (Think about what it did for Bill Clinton's career when the details of his sex life became public.) I understand the desire to be influential and to be free to expose whatever you want about yourself, but it probably doesn't work this way. By exposing too much, you limit your status. Powerful people do not enjoy freedom of speech in the same way popular bloggers do. Eliezer went the popular-blogger way. Now we need a way to promote MIRI which does not mention Eliezer.

Replies from: Lumifer
comment by Lumifer · 2014-12-19T16:27:47.722Z · LW(p) · GW(p)

I understand the desire to be influential and to be free to expose whatever you want about yourself, but it probably doesn't work this way. By exposing too much, you limit your status.

It certainly doesn't work that way, but I think it's not just about status. If you want to be influential (aka have power, which is different from just being high-status), you should be instrumentally rational about it; that is, evaluate whether the consequences of your actions serve your goals. In this particular case, you need to carefully manage your public persona, the image you present to the outside. This careful management is not very compatible with exposing "whatever you want about yourself".

This is actually a problem in that it's a serious disincentive for good people to get involved in high-level politics. Would you want a team of smart lawyers and investigators to go over your visible life with a fine-toothed comb looking with malice for any kind of dirt they can fling at you?

comment by hawkice · 2014-12-21T17:21:54.418Z · LW(p) · GW(p)

So, obviously that list isn't exhaustive, because there are more ways to split interactions than public/private, but in an attempt to add meaningful new outlooks:

4) Speak about your weaknesses openly when in public, and deny them in private.

Many high-status individuals are much harsher, more demanding, more arrogant, and more certain in private than in public. I think this is a result of -- when you don't know the target well -- not knowing who you will have to impress, who you have to suck up to, and who is only useful when they get you the thing you want.

comment by Sarunas · 2014-12-19T14:23:17.624Z · LW(p) · GW(p)

2) Admit your weakness. Leads to low status, and then opposition from outsiders.

That sounds similar to the standard job interview question "What is your greatest weakness?". In that situation, perhaps the standard advice on how to answer (emphasize how you intend to overcome that weakness, and what weaknesses you have conquered in the past) is applicable here as well?

Edit: Although perhaps you meant that the very act of letting outsiders define what is and what is not a weakness leads to low status.

Replies from: Princess_Stargirl, Viliam_Bur
comment by Princess_Stargirl · 2014-12-20T17:07:08.845Z · LW(p) · GW(p)

It is suicidal to admit an actual serious weakness, for multiple reasons. One is that admitting a serious weakness will leave a very bad impression that is hard to overcome. See the research showing that people will frequently pay more for a single intact set of objects than for two sets of the same objects where one set is damaged.

The other problem is that admitting an actual error is going off the social script. It either paints you as clueless or a "weirdo." This is also a very serious problem.

Replies from: jsteinhardt
comment by jsteinhardt · 2014-12-21T06:45:18.235Z · LW(p) · GW(p)

I don't think this is right. I talk pretty publicly about whatever problems/insecurities I have, but I do so in a pretty self-confident manner. It may help that I'm visibly competent at what I do, and I don't claim that it is a universally good strategy, but it works for me and helps me stay in a fairly constant state of growth mindset, which I've found to be beneficial.

comment by Viliam_Bur · 2014-12-19T23:47:40.858Z · LW(p) · GW(p)

In a job interview, you are explicitly given the task of describing your weakness. And you probably choose one that is relatively harmless, something like "I am very rational, but sometimes I am underconfident". So that's different.

comment by Viliam_Bur · 2014-12-16T21:51:48.042Z · LW(p) · GW(p)

I remember reading an article on Overcoming Bias long ago which predicted exactly this, in general, not just about AGI: that in many areas, the first people who go there are those who ignore social conventions (otherwise they wouldn't be first), but when the area becomes successful, there comes a second wave of people who are following a safe path to success. The people from the second wave usually don't credit the people from the first wave, so the public perceives the second wave as the founders.

Eliezer did say and write many things. Some of them are now perceived as low status, some as high status. The safe road to success is to repeat only the high status things, and to never mention Eliezer. (Plus do some other high status things unrelated to Eliezer.)

Replies from: gwern, seez
comment by gwern · 2014-12-19T01:44:29.906Z · LW(p) · GW(p)

"Even When Contrarians Win, They Lose" http://www.overcomingbias.com/2007/09/even-when-contr.html

This is not just a plausible story – I have personally known people where similar stories have played out, and have read about others. It has happened to varying degrees with Ted Nelson, Eric Drexler, Douglas Engelbart, Doug Lenat, David Deutsch, Alfred Russel Wallace, Hugh Everett, and, yes, me.

comment by seez · 2014-12-18T23:02:58.070Z · LW(p) · GW(p)

Can you provide a link to the article, if you remember it?

comment by tgb · 2014-12-16T19:08:03.634Z · LW(p) · GW(p)

I once asked for advice here, and the responses felt overly demeaning and presumptuous; they largely ignored trying to help in favor of lambasting me for being in the situation at all. It was not a response I had been expecting, and it made me feel bad and less likely to ask somewhat personal questions in the future. I don't think anyone replying was intending to cause me any harm, and it wasn't a big deal in any sense of the word. But I felt disappointed with the outcome and the community.

I'm sure anyone sufficiently interested could find this in my post history, but the details aren't particularly interesting. To a third party it probably won't seem like much at all, but at the time it wasn't a good feeling.

Replies from: Brillyant
comment by Brillyant · 2014-12-17T02:04:15.411Z · LW(p) · GW(p)

I've seen at least one or two occurrences like this on LW. There is a very cold and, well, rational tone to the responses here. Overall, I think it's good, since there are plenty of other forums for people to go to for encouragement. But if this is your go-to forum for life advice, and you are going through something difficult and personal that you decide to share, the responses might not give you a warm fuzzy feeling.

Replies from: None
comment by [deleted] · 2014-12-19T14:11:47.780Z · LW(p) · GW(p)

Rationality is about winning. If your goal is to give people advice that they will accept, imbuing your message with hopefulness and cheer will assist you in that goal. If your goal is to get people to continue asking for advice, slapping a "Working as intended" on their complaints about your advice-giving technique is an abysmal failure.

comment by FiftyTwo · 2014-12-16T14:58:14.917Z · LW(p) · GW(p)

Using terms that I picked up here which are not well known, or which mean different things in different contexts.

Also, I sometimes over-pattern-match arguments and concepts I've picked up on LessWrong to other situations, which can result in trying to condescendingly explain something irrelevant.

Replies from: Smaug123
comment by Smaug123 · 2014-12-17T14:01:23.922Z · LW(p) · GW(p)

I do something similar. I consistently massively underestimate the inferential gaps when I'm talking about these things, and end up spending half an hour talking about tangential stuff the Sequences explain better and faster.

Replies from: MathiasZaman, Gunnar_Zarncke
comment by MathiasZaman · 2014-12-18T10:07:42.066Z · LW(p) · GW(p)

Since I mostly communicate in Dutch when in meatspace, I find myself rarely using terms directly from Less Wrong (because good translations don't always come to mind). Of course, this isn't exactly a lifehack, since you wouldn't expect most people to move to a different language zone for a minor benefit.

comment by Gunnar_Zarncke · 2014-12-17T17:24:41.562Z · LW(p) · GW(p)

Same with me. Except pointing to LW or the sequences doesn't help.

comment by Punoxysm · 2014-12-15T10:18:00.218Z · LW(p) · GW(p)

Well, I've found that the advice about time management, of which this site has tons, is not really helpful. It is not the lack of a system to organize my efforts but a lack of persistence that has always been the bottleneck for me.

Replies from: Gunnar_Zarncke, passive_fist, Unknowns
comment by Gunnar_Zarncke · 2014-12-17T17:22:49.423Z · LW(p) · GW(p)

I think motivation is a hard problem and there is no simple solution. As with much self-help advice, trying out multiple approaches may ultimately lead to one that works for you.

comment by passive_fist · 2014-12-16T10:17:55.652Z · LW(p) · GW(p)

I tried the pomodoro system for a bit (which I understand is somewhat popular here), but I found it to be largely useless, for myself at least. Instead, just removing various distractions was far more powerful. This is corroborated by the literature; Gloria Mark's research is worth a look: http://www.ics.uci.edu/~gmark/Home_page/Welcome.html

Replies from: Cyan, hamnox
comment by Cyan · 2014-12-18T21:08:46.178Z · LW(p) · GW(p)

Someone who really cared about time management wouldn't be reading this site in the first place.

Replies from: passive_fist
comment by passive_fist · 2014-12-18T21:49:08.100Z · LW(p) · GW(p)

As far as internet distractions go, lesswrong is hardly the worst offender. Although more than 10 minutes a day on LW is probably too much.

comment by hamnox · 2014-12-17T01:00:23.010Z · LW(p) · GW(p)

Same, actually. Pomodoros have never stuck as a solution for me.

comment by Unknowns · 2014-12-15T10:49:28.351Z · LW(p) · GW(p)

Someone who really cared about time management wouldn't be reading this site in the first place.

Replies from: satt
comment by satt · 2014-12-22T00:16:04.156Z · LW(p) · GW(p)

Someone who really cared about time management wouldn't be reading this site in the first place.

I can see why that was downvoted. I am surprised it was downvoted to -11; tone it down a bit and it's probably true. (And as ever, fuck da troll toll.)

Replies from: Unknowns
comment by Unknowns · 2014-12-22T01:58:39.876Z · LW(p) · GW(p)

Maybe people thought it was a comment about the advice on LW or on the site in general, but actually it was just about procrastination. I spend more time procrastinating here than anywhere else.

comment by ChristianKl · 2014-12-15T12:54:19.005Z · LW(p) · GW(p)

I probably spend more time on LW than is warranted, but otherwise I don't have a particular story of backfired advice that comes to mind.

Replies from: Username
comment by Username · 2014-12-16T01:28:36.817Z · LW(p) · GW(p)

Agreed. The biggest way LW has backfired is eating up free time that provides a very questionable ROI. I've spent quite a bit of time procrastinating on here, and the amount of actionable advice I've put into practice is quite low.

The advice I have put into practice usually works pretty well, but that's mostly a function of me realizing that it slots in well to my existing habits/ways of thinking.

Replies from: beoShaffer
comment by beoShaffer · 2014-12-16T02:58:20.541Z · LW(p) · GW(p)

Same, though in my case it is likely that that time was mostly funging against other internet time-wasters, not my work time.

comment by Ritalin · 2014-12-18T10:56:43.926Z · LW(p) · GW(p)

Describing myself as a "rationalist" pretty much automatically makes a bad impression, no matter how much you explain afterwards that you value emotion and passion and humanity and you're totally not a Straw Vulcan or an Objectivist.

Replies from: Vulture
comment by Vulture · 2014-12-19T20:59:42.704Z · LW(p) · GW(p)

"Aspiring rationalist" or "" could be a less negative alternative.

comment by lmm · 2014-12-16T12:28:23.313Z · LW(p) · GW(p)

I can recall one instance of bad advice on a particular subject (I don't want to be specific). In retrospect it should have been obvious that the person giving the advice lacked the experience to give it, but it's hard to judge someone's credentials over the internet.

Some of the media recommendations have been bad; of course no recommendation is perfect, but in my limited experience LW's strike rate is worse than e.g. TV Tropes' (which may just be a function of the latter containing a lot more detail and having more contributors).

Replies from: Nornagest, Richard_Kennaway
comment by Nornagest · 2014-12-16T18:27:59.355Z · LW(p) · GW(p)

Back when I browsed TV Tropes regularly, my algorithm for using it to find media that I liked centered around skimming a lot of media pages that looked vaguely interesting and using them to get a better idea of themes and target audience, while throwing out anything that was full of creepy fanservice tropes or obviously written by a single very enthusiastic viewer. When I tried mining it for actual recommendations, they were usually bad.

LW doesn't have anything that lends itself to that sort of exploratory search, but recommendations from the media threads have been somewhat reliable for me, probably thanks to a closer demographic match. Better coverage in certain topic areas, too: we seem to have a greater proportion of literary SF readers posting here, for example.

comment by Richard_Kennaway · 2014-12-17T14:12:02.196Z · LW(p) · GW(p)

Of the anime recommendations I've followed up on account of their claimed rationality content, I've yet to find one that repaid the effort.

Replies from: Jayson_Virissimo
comment by Jayson_Virissimo · 2014-12-18T01:35:05.201Z · LW(p) · GW(p)

Had better luck anywhere else?

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2014-12-18T10:47:30.580Z · LW(p) · GW(p)

The rationality content was my only interest, so I haven't particularly looked for any other source of anime recommendations. However, I have seen Princess Mononoke, Howl's Moving Castle, and Spirited Away, and all I can say of them is that they were pleasant enough.

Replies from: Vulture
comment by Vulture · 2014-12-19T20:54:56.082Z · LW(p) · GW(p)

Is it possible that some of the reported "rationality content" was more like genre-savviness, which is more visible to people who are very familiar with the genre in question?

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2014-12-20T10:36:39.497Z · LW(p) · GW(p)

I think it was more a case of people looking at the works with the hammer of rationality in their hand and seeing lots of nails for the characters to knock in. For example, The Melancholy of Haruhi Suzumiya sets up a problem (Unehuv vf Tbq naq perngrq gur jbeyq 3 lrnef ntb ohg qbrfa'g ernyvfr vg, naq vs fur rire qbrf gura fur zvtug haperngr vg whfg nf rnfvyl), but I found that setup fading into the background as the series of DVDs that I watched went on. By the fourth in the series (the murder mystery on the island isolated by storms), it was completely absent.

With Fate/Stay Night, one problem is that I was looking at ripped videos on Youtube, while the original material is a "visual novel" with branching paths, so it's possible (but unlikely) that the people who put up the videos missed all the rationality-relevant bits.

I've not tried Death Note, but I suspect I'd find the same dynamic as in Haruhi Suzumiya. A hard problem is set up (how does a detective track down someone who can remotely kill anyone in the world just by knowing their name?), which makes it possible to read it as a rationality story, but unless the characters are actually being conspicuously rational beyond the usual standards of fiction, that won't be enough.

I'm also not part of the anime/manga community: I watched these works without any context beyond the mentions on LessWrong and a general awareness of what anime and manga are.

It's weird how the girls all look like cosplay characters. :)

Replies from: Desrtopa, Vulture
comment by Desrtopa · 2014-12-21T09:22:44.578Z · LW(p) · GW(p)

With Fate/Stay Night, one problem is that I was looking at ripped videos on Youtube, while the original material is a "visual novel" with branching paths, so it's possible (but unlikely) that the people who put up the videos missed all the rationality-relevant bits.

I haven't watched the anime, but I have read the visual novel, and the anime does not have a reputation for being a very faithful adaptation. The visual novel at least does share themes that often feature in Eliezer's work, but I wouldn't call them "rationality content" as such. More in the manner of Heroic Responsibility and related concepts.

comment by Vulture · 2014-12-20T16:13:53.661Z · LW(p) · GW(p)

In terms of Death Note, I've read the first several volumes and can vouch that it's a fun, "cerebral" mystery/thriller, especially if you like people being ludicrously competent at each other, having conversations with multiple levels of hidden meaning, etc. Can't say there's anything super rational about it, but the aesthetic is certainly there.

Replies from: Desrtopa
comment by Desrtopa · 2014-12-21T09:33:34.801Z · LW(p) · GW(p)

Actually, I for one gave up on Death Note in frustration very early on, because I couldn't help focusing on how much of the real inferential work was being done by the authors feeding the correct answers to the characters. Like when L concludes that Kira must know the victim's real name to kill him... there were so many reasons that just didn't work. Kira's apparent modus operandi was to kill criminals; there was no particular reason to suppose he would respond to a challenge to kill anyone else, so the fact that he didn't was already only weak evidence regarding whether he could at all, let alone what the restrictions might be. Whether Kira knew his real name or not was just one variable switched between him and Lind L. Taylor. L could just as easily have been immune because he eats too many sweets.
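To make the "weak evidence" point concrete, here is a minimal toy Bayes calculation (all hypotheses and numbers are invented for illustration, not taken from the show). Several competing explanations predict the same observation, so that observation barely shifts the posterior toward "needs the real name":

```python
# Toy Bayesian update for the Lind L. Taylor scene; priors and
# likelihoods are made-up numbers, purely for illustration.
priors = {
    "needs_name": 0.25,       # can only kill knowing a real name
    "criminals_only": 0.25,   # kills criminals only, by choice or rule
    "wont_take_bait": 0.25,   # could kill anyone, but ignores taunts
    "other_limit": 0.25,      # some other restriction entirely
}
# P(the real L survives | hypothesis): almost every hypothesis
# predicts the same observation, so it can't separate them well.
likelihoods = {
    "needs_name": 1.0,
    "criminals_only": 1.0,
    "wont_take_bait": 1.0,
    "other_limit": 0.9,
}
evidence = sum(priors[h] * likelihoods[h] for h in priors)
posteriors = {h: priors[h] * likelihoods[h] / evidence for h in priors}
print(posteriors)  # "needs_name" moves from 0.25 to ~0.256: weak evidence
```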

While smart, knowledgeable people can often extract a greater yield of inference from a limited amount of data than others, I find that far too many writers take this idea and run with it while forgetting that intelligence very often means recognizing how much you can't get out of a limited amount of data.

comment by Gunnar_Zarncke · 2014-12-16T22:24:28.977Z · LW(p) · GW(p)

I can't think of anything on which LW did backfire, but there are some points on which LW is roughly neutral. I think it is valuable to list these neutral data points too.

I find myself arguing over the quality and content of LessWrong posts with friends, one close friend in particular. He questions the aspirations and qualifications of LW in general, and of some posts/authors we were discussing in particular. And I find myself at least partly agreeing with his assessments, not because of his rhetoric or my wish to agree, but because his reasoning is convincing.

For context: he is my sparring partner for validating new ideas I have developed or picked up (not only on LW). I admit that I use his clear and sceptical reasoning and solid theoretical footing as a sanity check for my own reasoning (thus guarding against egocentrism and overconfidence bias; actually a good idea: everybody should have such a sanity-check partner). Now I have to ask myself: do his arguments generalize? It taints the LW experience. It eats time not only on LW but also in our personal discussions, which now tend to go meta more often. He hasn't read much of LW and thus has an outside view, somewhat filtered by my presentation of the topics under consideration. And the inferential gap this causes, between what I know about topics on LW and what he knows about them, creates a "needed explanation" tension. My discussions with him would probably run more smoothly without LW.

I also agree with some commenters that reading and commenting on LW can eat quite a lot of time, not all of which is worth it.

Replies from: Arkanj3l
comment by Arkanj3l · 2015-03-19T21:28:12.403Z · LW(p) · GW(p)

Any LW-concept-specific critiques applicable to everyone else?

comment by ilzolende · 2014-12-23T00:41:15.964Z · LW(p) · GW(p)

According to my parents, certain behaviors are immoral if you can explain why you're doing them.

Overreacting to a parent listening in on my phone calls or using physical coercion (not hitting me, just grabbing me and blocking my movements) when they claim good intentions? Teenage hormones.

Stating that I have a precommitment to react negatively to people who wiretap me or use force on me, even when following through is costly for me? Morally wrong.

[Yes, I realize that the actual moral here is "Don't tell people you understand the concept of precommitments, just pretend to be an irrational actor". This isn't an example of advice being wrong, just an example of advice needing to be clarified.]

Replies from: Lumifer, Arran_Stirton, Viliam_Bur
comment by Lumifer · 2014-12-23T01:38:07.329Z · LW(p) · GW(p)

Yes, I realize that the actual moral here is "Don't tell people you understand the concept of precommitments, just pretend to be an irrational actor".

Well, I would read the actual moral as "Parents are likely to phrase their arguments in terms of morality if it suits their purpose, even if it isn't actually their morality".

Replies from: ilzolende
comment by ilzolende · 2014-12-23T05:56:50.447Z · LW(p) · GW(p)

I think we're using definitions differently here: I was using "moral" to mean "lesson for the reader based on what the main character wishes she had done".

Also, parents in this instance react to events based on their stated moral system, not on their actual moral system.

However, that is the sort of assumption I already make about my parents' statements about morality whenever those statements are suspiciously specific and applicable to a current argument that they would like to win.

Replies from: Lumifer
comment by Lumifer · 2014-12-23T06:26:34.532Z · LW(p) · GW(p)

I was using "moral" to mean "lesson for the reader based on what the main character wishes she had done".

So was I :-)

comment by Arran_Stirton · 2014-12-26T05:34:06.327Z · LW(p) · GW(p)

Are you sure precommitment is a useful strategy here? Generally, the use of precommitments is only worthwhile when the other actors behave in a rational manner (in the strictly economic sense), consider your precommitment credible, and are not willing to pay the cost of your following through on it.
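As a toy illustration of those conditions, here is a minimal best-response sketch; the payoff numbers and names are invented for this comment, not anything the actors actually compute:

```python
# Minimal sketch of the deterrence logic above, with made-up payoffs.
# wiretap_value: how much the parent values monitoring ("keeping you
# safe"); retaliation_cost: how much the child's precommitted negative
# reaction costs the parent. All numbers are illustrative assumptions.

def parent_wiretaps(wiretap_value: float, retaliation_cost: float,
                    precommitment_credible: bool) -> bool:
    """Return True if wiretapping is the parent's best response."""
    if not precommitment_credible:
        # Parent expects no follow-through, so only the upside counts.
        return wiretap_value > 0
    # Credible precommitment: weigh the upside against the punishment.
    return wiretap_value - retaliation_cost > 0

# Deterrence only works in a narrow band:
print(parent_wiretaps(5, 3, precommitment_credible=False))  # True
print(parent_wiretaps(5, 3, precommitment_credible=True))   # True: willing to pay
print(parent_wiretaps(2, 3, precommitment_credible=True))   # False: deterred
```

The third case is the only one where the precommitment changes anything, which is the point of the next paragraph.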

While I'm in no position to comment on how rational your parents are, it's likely that the cost of your being upset with them is a price they're willing to pay for what they may conceptualize as "keeping you safe", "good parenting", or whatever their claimed good intentions were. As a result, no amount of precommitment will let you win that situation, and we all know that rationalists should win.

The optimal solution is probably the one where your parents no longer feel that they should listen to your phone calls or use physical coercion in the first place. I couldn't say exactly how to go about achieving this without knowing more about your parents' intentions. However, you should be able to figure out what their goal was and explain to them how they can achieve it without using force or eavesdropping on you.

comment by Viliam_Bur · 2014-12-29T10:32:11.151Z · LW(p) · GW(p)

I think this is a common intuition. It seems wrong to me, but I don't know exactly how to fix it.

I am similarly frustrated by moral intuitions which follow this pattern: (1) Imagine that you see a drowning person, and you are a good swimmer. Is it your moral duty to save them? Yes, it is. (2) Now imagine that you see a drowning person, but you absolutely can't swim. Is it your moral duty to try saving them? No, it isn't; you would probably just kill yourself and achieve nothing. (3) There is no urgent situation. You just have a choice between learning to swim and e.g. spending your time watching anime. Is it your moral duty to learn to swim? Uhm... no, it isn't. Why would it be?

So, in other words, there are obstacles which can absolve you from a moral duty, but you don't have a moral duty to remove these obstacles.

Actually, your situation seems a bit similar to this pattern. Being irrational and making "precommitments" by instinct absolves you morally. If you become rational and good at introspection, learn game theory, and understand your motives, then you supposedly have a moral duty (to avoid acting on these instinctive "precommitments" without replacing them with conscious ones). However, no one supposedly has a moral duty to become more rational and introspective.

It seems like one part of the problem is skills which are not under your control in the short term, but are under your control in the long term (being good at swimming, being rational and introspective). Our intuition is too quick to classify them as immutable, because in the short-term scenario, they are. So these skills give you moral duties, but you get no moral rewards for developing them.

comment by [deleted] · 2014-12-19T14:07:11.573Z · LW(p) · GW(p)

Not really advice, but I started talking about feminism here and immediately dropped in karma. The people arguing against me produced unbacked assertions contrary to my points, without doing a modicum of research. My responses each took one to two hours of research.

If you care about the answer to a question, and not just about feeling happy because you think you're right, you should do the research on your own. I spend a lot of time arguing against atheists on /r/DebateReligion, and I have to do the research for them. (Guess what: 2,000 years of people practicing a religion with a strong tradition of exegesis and discussion means they aren't likely to make mistakes about how to practice their religion that an outsider can spot in two seconds!) I expected people around here to realize this. I was wrong. Maybe I should write a post about it.

comment by ErinFlight · 2015-03-14T17:24:15.885Z · LW(p) · GW(p)(Edited)