Where can I read about the Roko's Basilisk drama?

post by theflowerpot · 2022-01-30T16:29:40.732Z · LW · GW · No comments

This is a question post.

Contents

  Answers
    15 Viliam
    -10 lincolnquirk
None
No comments

I'm pretty new here and would like to learn more about Roko's Basilisk and the drama that happened here. Could one of you explain this?

Answers

answer by Viliam · 2022-01-30T18:26:34.508Z · LW(p) · GW(p)

There is a tag [? · GW] for that, although not all related articles are tagged. There is an article on RationalWiki that made the whole thing popular, and there is also a mention at Wikipedia, because the RationalWiki admin also happens to be a Wikipedia admin.

Long story short, a Less Wrong user called Roko once wrote a comment containing a thought experiment about an artificial intelligence that would incentivize its own creation by torturing everyone who did not contribute to it. (So once people started building this monstrosity, it would be a Prisoner's Dilemma kind of situation where everyone has a selfish motive to contribute to the project, even if they actually wish it never existed.) Some people were triggered, Eliezer Yudkowsky deleted the comment, and then of course people talked about it more; then Eliezer nuked the entire discussion, saying that brainstorming publicly about machines that would torture people is a really stupid idea regardless of whether the plan is feasible, and banned further discussion on similar topics, just in case.

Then our friends at RationalWiki noticed this, and rationally concluded that because this comment was posted on Less Wrong, it must represent what all Less Wrong users actually believe, and because Eliezer deleted it, it must be Eliezer's darkest secret. And therefore, it is their civic duty to document it for posterity, and make sure everyone knows about it, and everyone knows this is what Less Wrong is actually about -- especially Wikipedia.

Because RationalWiki ranked quite high in Google results at that time, everyone who googled Less Wrong found this, and every journalist who wrote an article mentioning Less Wrong or the rationalist community made sure to also mention this. Some people call this the Streisand effect, some call it citogenesis.

Since then, about once a year someone comes and asks people to explain what Roko's Basilisk is all about. In 2022, it happens to be you. Congratulations! As you can clearly see, we talk about this topic all day, every day, because we deeply care. But we also never talk about it, because it is our most deeply held secret. Sorry if this is confusing.

EDIT: If you click on the tag [? · GW], and then click "Read More", there is actually a long explanation of the idea.

EDIT2: I just learned that there is a Reddit forum where about once a week someone new asks about Roko's Basilisk. (And XiXiDu [LW · GW] is its moderator; why am I not surprised?) Seems like there is a huge basilisk fan community out there.

comment by mikbp · 2022-02-27T17:35:49.363Z · LW(p) · GW(p)

And what are/were the problems with RationalWiki? I am also pretty new here and was aware of the Basilisk controversy, but I don't know about that seemingly related problem... Probably I just forgot that I read about RationalWiki when reading about the Basilisk, because I don't know what it is. If it can be briefly summarised, please do, as I am now quite curious. If not, please don't bother -- it is not worth the effort!

Replies from: Viliam, TAG
comment by Viliam · 2022-02-27T18:56:08.982Z · LW(p) · GW(p)

Long story short, RationalWiki tries to promote rationality (more precisely, mainstream science), and oppose pseudoscience and religious fanaticism, which is nice.

But they are also politically on the woke left, and it clouds their judgment -- they sometimes treat "left-wing" and "science" as synonyms, and likewise "politically incorrect" and "pseudoscience". (It makes sense historically: they started as the opposite of Conservapedia.) For example, among their 10 longest pages, 2 are currently about "gamergate", which is completely unrelated to science or pseudoscience; it's just a thing that people like to have a political opinion about. So, kinda, they started as "against pseudoscience" but ended up as "against anything we don't like".

RationalWiki only has a few editors, and some of them have too much free time, so if one of them has a personal grudge against something, that kinda becomes the official position of RationalWiki. The specific editor who has a problem with Less Wrong and the rationalist community in general is David Gerard; he is an admin at both RationalWiki and Wikipedia, and he often abuses his position: he writes something negative about someone on RW, and when a journalist quotes him (a few years ago, RW was famous and often quoted by journalists), he then quotes this on Wikipedia (so he is effectively quoting his own words, but coming from a seemingly independent source). It seems some other Wikipedia admins are finally getting tired of this, and he was recently banned from attacking Scott Alexander (the author of Astral Codex Ten) on Wikipedia.

Replies from: mikbp, TAG
comment by mikbp · 2022-02-27T22:29:40.005Z · LW(p) · GW(p)

Thanks for the answer!

comment by TAG · 2022-02-27T20:14:14.008Z · LW(p) · GW(p)

Long story short, RationalWiki tries to promote rationality (more precisely, mainstream science), and oppose pseudoscience and religious fanaticism, which is nice.

But they are also politically woke left, and it clouds their judgment

Long story short, lesswrong tries to promote rationality (more precisely, mainstream science), and oppose pseudoscience and religious fanaticism, which is nice.

But they are also politically libertarian, and it clouds their judgment.

Replies from: Viliam
comment by Viliam · 2022-02-28T09:34:35.721Z · LW(p) · GW(p)

Well, Eliezer is. I couldn't find a more recent survey, but here [LW · GW] is a 2016 analysis of the politics of the LW community, and if we can trust the answers, there are 10× as many Democrats as Libertarians among the American members.

And yes, it would be nice if RationalWiki made it clear that this is their true objection. ;)

Replies from: TAG
comment by TAG · 2022-03-02T00:14:09.246Z · LW(p) · GW(p)

And "rationalwiki is woke" is hard empirical fact...or a "vibe"?

Replies from: Viliam
comment by Viliam · 2022-03-02T09:01:40.383Z · LW(p) · GW(p)

Not sure what would be the most convincing evidence in this case. As far as I know, they do not have a page saying "yes, we are woke", or anything like this.

But if you read the content, there are many articles that are just about politics, unrelated to the "science - pseudoscience" dimension. And I don't mean articles like "Donald Trump", because those are relevant: as president he could increase or decrease government spending on science, or promote some pseudoscientific opinion on TV, etc.

But why is it necessary to have a page on, e.g., "men's movement" (keywords: "non-existent problem", "ridiculously absurd", "neo-reactionary", "overtly conspiratorial view", "reactionary" -- and that was just the short summary at the top of the article) or "gamergate"? How is this, like, related to science?

The level of charity is exactly what you would expect from "a snarky point of view", i.e. don't let reality stand in the way of a good jab at the enemy. If you e.g. characterize men's rights advocacy as "bros before hoes", you make it clear that their arguments are going to get fair treatment, right?

Ok, let's look at the specific ideas: MRAs complain about the draft, but this "borders on red herring" because almost no one is doing it these days. (Meanwhile, in the actual universe, quite a few 18-year-old kids are being drafted on both the Russian and Ukrainian sides these days; you can watch them dying on Reddit.) According to RationalWiki, even the argument that men were conscripted during the two World Wars is not valid, because... wait, I am not making this up... disabled men were exempt. (Checkmate, misogynist!)

With gamergate, I will skip the object-level claims for the sake of brevity and focus on the process. If you look at the Wikipedia article on gamergate, it clearly depicts the whole affair as utterly negative and without any merit whatsoever. Yet there was one editor, I think his name was Ryulong or something like that, who was kicked out of Wikipedia for being too obsessed and unfair about the topic. (Mind you, this was a judgment made by people who had a very negative opinion of the topic themselves, but tried to hold themselves to some standards, such as not making up stuff.) The banned editor then moved to RationalWiki, where he wrote three long articles on gamergate (at that moment, those were 3 of the 15 longest pages on RationalWiki), and those articles are still there. I take this as evidence not only that RW's standards are way lower than Wikipedia's, but also of how one person can bring his own pet topic and publish it on RationalWiki with pretty much zero fact-checking, as long as it fits culturally.

Now let's look at how RationalWiki treats the opposite fringe, for example "critical race theory". It is a "cross-disciplinary intellectual and social movement". Nice.

Feel free to give me an example from RationalWiki that a woke person would not agree with.

Replies from: TAG
comment by TAG · 2022-03-02T14:01:21.401Z · LW(p) · GW(p)

So, vibe. If you can assess them as woke based on vibe, they can assess you as libertarian based on vibe. If the sole purpose of lesswrong is to prevent an AI apocalypse, why are there so many articles about government regulation (bad) and stock trading (good)?

The level of charity is exactly as much as you would expect from “a snarky point of view”

So they are not failing at what they say they are doing...they are failing at what you think they should be doing.

comment by TAG · 2022-02-27T19:40:44.470Z · LW(p) · GW(p)

And what are/were the problems with RationalWiki?

If you ask that question here, you will get an answer that assumes that rationality is being done entirely correctly here, and is therefore being done wrongly at rationalwiki, inasmuch as rationalwiki does anything differently. Of course, rationalwiki has its own objections to lesswrong and Yudkowsky.

It may be possible to settle the issue, but first people would need to get out of the mindset that says "my tribe is right because it's my tribe".

Replies from: mikbp
comment by mikbp · 2022-02-27T22:34:46.064Z · LW(p) · GW(p)

Thanks for the answer. I went to quickly read the two posts you linked and... they are informative and I do agree with some of the critiques, but they are far from objective. Although I don't like the tone and style, they would be alright for a personal blog, but not for a wiki, and much less for a rational wiki. It felt really weird and very untrustworthy.

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2022-03-06T14:03:47.114Z · LW(p) · GW(p)

It is. Rationality is their flag, not their method.

answer by lincolnquirk · 2022-01-30T16:50:24.420Z · LW(p) · GW(p)

This community has a virtue of taking weird ideas seriously. Roko came up with a weird idea which, the more seriously you took it, the more horrifying it became. This was deemed an info hazard and censored in some way; I don't know how. But the people who didn't take it seriously in the first place weren't horrified by the idea, were thus confused about why it should have been censored, and so boosted the Streisand effect.

comment by Charlie Steiner · 2022-02-01T03:13:52.839Z · LW(p) · GW(p)

the more seriously you took it, the more horrifying it became. 

Eh. Up to a point. And then if you take it more seriously than that, it becomes less horrifying again.

Arguments for why it's scary are the decision-theory equivalent of someone describing how scary knives are, and how to make your own sharp knives, but never mentioning any knife safety tips.

"Sharp knives," in this metaphor, is the recognition that other people might try to manipulate us, and the decision theory of why they'd do it and how it would work. "Knife safety" is our own ability to use decision theory to not get manipulated.

The reason I think Roko's basilisk is a net-harmful idea is that there are a lot of people who are way more motivated to learn/talk about "cool" topics like sharp scary knives or ideas that sound dangerous, and who are not nearly as motivated to learn about "boring" topics like knife safety or mathy decision theory. For people who happen to allocate their attention this way (maybe you're an edgy young adult, or maybe you're just a naturally anxious person, or maybe you're an edgy anxious person), the idea might just make them more anxious or otherwise mislead them.

comment by ChristianKl · 2022-01-30T17:14:26.456Z · LW(p) · GW(p)

I don't know how. 

If you don't know, why try to answer?

In general, your post is pretty misleading. It was not censored because the idea itself horrified people. 

The idea was either wrong, in which case preventing people from reading a wrong idea is net beneficial because it points them toward better ideas, or the idea was right, which suggests it's dangerous. EY censored it because he believed that in neither case would it be valuable to have the post on LessWrong, and maybe out of a general precautionary principle. You don't need to be horrified by things to use the precautionary principle.

Replies from: Dagon
comment by Dagon · 2022-01-30T23:42:39.236Z · LW(p) · GW(p)

I recall that among the reasons given was that it had triggered severe reactions in some members of the community. "Horrified" would be a mild way to describe the reaction that was claimed at the time. There was discussion about whether the idea itself could be self-fulfilling and thus inherently dangerous (in addition to the pain it caused some people who thought about it), but that didn't last long.

I don't think simply being wrong would have been enough to try to censor it.  

Replies from: ChristianKl
comment by ChristianKl · 2022-01-31T10:34:24.393Z · LW(p) · GW(p)

The precautionary principle mattered. 

Jessica wrote a post [LW · GW] about how the people at MIRI at the time thought about keeping potentially dangerous information secret. It was pretty extreme, and their trying to keep secret information that Jessica would have seen as more trivial drove her into schizophrenia.

It was not an intellectual atmosphere of treating ideas as dangerous simply because they horrify people. MIRI had complex ideas about secrecy that they took very seriously, and if you ignore those and treat the motivation as "people did something because they were horrified", you project decision heuristics onto EY that he didn't use.

Replies from: Dagon
comment by Dagon · 2022-01-31T16:17:28.068Z · LW(p) · GW(p)

Fair enough.  I'm not part of the Bay Area rationalist community, and I suspect there was a lot of stuff going on that didn't appear in public posts or discussion on the topic.  People (including Eliezer and others) are complicated, and there are both private and public reasons for actions, as well as reasons that aren't easily understood, even by the actors.

BTW, none of this explains why lincolnquirk's comment was strong-downvoted. Even if it's incorrect (though to me it seems more incomplete than incorrect), it's not harmful or wasteful.

Replies from: ChristianKl
comment by ChristianKl · 2022-01-31T16:29:47.463Z · LW(p) · GW(p)

I don't think someone who by their own admission doesn't have a good understanding should offer up their explanation of a rumor in a case like this.
