Should LW have a public censorship policy?

post by Bongo · 2010-12-11T22:45:15.282Z · LW · GW · Legacy · 42 comments


It might mollify people who disagree with the current implicit policy, and make discussion about the policy easier. Here's one option:

There's a single specific topic that's banned because the moderators consider it a Basilisk. You won't come up with it yourself, don't worry. Posts talking about the topic in too much detail will be deleted. 

One requirement would be that the policy be no more and no less vague than needed for safety.

Discuss.

42 comments


comment by WikiLisks · 2010-12-11T23:44:39.173Z · LW(p) · GW(p)

WikiLisks is a non-karma media organization dedicated to bringing important existential risks to the public. We provide an innovative, secure, and acausal way for independent AIs around the multiverse to provide information about their utility functions.

Replies from: Lightwave
comment by Lightwave · 2010-12-12T00:18:00.575Z · LW(p) · GW(p)

wedrifid, is that you?

Replies from: wedrifid
comment by wedrifid · 2010-12-12T07:55:04.930Z · LW(p) · GW(p)

Dammit. It should be!

:P

comment by David_Gerard · 2010-12-11T23:23:21.797Z · LW(p) · GW(p)

PROPOSAL: To outline a clear set of directions for the really creative trolls.

This could be just the first in a whole series of basilisks! The written equivalent of Goatse! And its successors!

I can't see what could possibly go wrong with this idea.

comment by ronnoch · 2010-12-11T23:20:12.726Z · LW(p) · GW(p)

Of course, if you put something like that in the FAQ or about page, right there for eager new members to read, a lot of them are bound to go looking for it. Forbidden knowledge is tantalizing.

Replies from: fubarobfusco
comment by fubarobfusco · 2010-12-12T00:19:09.975Z · LW(p) · GW(p)

As an eager new member who did exactly that, I have to say I don't see what the big fuss is about. It seems to be one big case of privileging the hypothesis.

Replies from: Pavitra
comment by Pavitra · 2010-12-12T03:00:20.532Z · LW(p) · GW(p)

I'd like to have some way to discuss the basilisk with others who have already seen it, if it would be possible to do so without the forbidden-fruit problem.

Replies from: Lightwave, Snowyowl
comment by Lightwave · 2010-12-12T19:33:27.912Z · LW(p) · GW(p)

Some aspects of the basilisk are currently being discussed and have been discussed even before the original basilisk was proposed. You just need to connect the dots and be able to recognize them.

comment by Snowyowl · 2010-12-12T09:17:03.419Z · LW(p) · GW(p)

I don't see the purpose of such discussion. All the posts which are not criticising Roko's argument will be downvoted into oblivion. That's not a discussion, it's a monologue. The only aspect of this mess worth discussing any more is the censorship itself.

(I think it was uncalled for. We have downvoting for a reason.)

Replies from: Perplexed
comment by Perplexed · 2010-12-12T17:32:26.698Z · LW(p) · GW(p)

'Oblivion' is not true oblivion. Heavily downvoted comments are still visible if you look. I found myself peeking at negative karma comments so often that I have simply eliminated the visibility threshold.

Replies from: Snowyowl
comment by Snowyowl · 2010-12-12T23:36:33.560Z · LW(p) · GW(p)

True. Still, it's an incentive not to make posts that will negatively impact your karma.

comment by Eneasz · 2010-12-13T19:46:19.284Z · LW(p) · GW(p)

At this point I think we need to TVTropes this subject. Roko's Rule: At least once per quarter someone will recommend hiding the Basilisk, which will introduce it to a whole new generation of readers.

Why do we even bother making oblique references to it anymore? It should be fully explained in the FAQ/Wiki and included in the introductory post.

Replies from: Mitchell_Porter, Eneasz
comment by Mitchell_Porter · 2010-12-13T23:54:36.863Z · LW(p) · GW(p)

You are being humorous, but here is the answer to your question: People are talking about it obliquely because they want to talk about it openly, but don't believe they can, without having their discussions disappear.

LW is not a police state. Discussions are free and fearless, except for this one thing. And of course that makes people even more curious to test the boundaries and understand why, on this one topic, the otherwise sensible moderators think that "you can't handle the truth".

We can seek a very loose historical analogy in the early days of nanotechnology. Somewhere I read that for several years, Eric Drexler was inhibited in talking about the concept, because he feared nanotechnology's destructive side. I don't know what actually happened at all, so let's just be completely hypothetical. It's the early 1970s, and you're part of a little group who stumbled upon the idea of molecular machines. There are arguments that such machines could make abundance and immortality possible. There are also arguments that such machines could destroy the world. In the group, there are people who want to tell the world about nanotechnology, because of the first possibility; there are people who want to keep it all a secret, because of the second possibility; and there are people who are undecided or with intermediate positions.

Now suppose we ask the question: Are the world-destroying nanomachines even possible? The nano-secrecy faction would want to inhibit public consideration of that question. But the nano-missionary faction might want to encourage such discussion, either to help the nano-secrecy faction get over its fears, or just to make further secrecy impossible.

In such a situation, it would be very easy for the little group of nano-pioneers to get twisted and conflicted over this topic, in a way which to an outsider would look like a collective neurosis. The key structural element is that there is no-one outside the group presently competent to answer the question of whether the world-destroying nanomachines are physically possible. If they went to an engineer or a physicist or a chemist, first they would have to explain the problem - introduce the concept of a nanomachine, then the concept of a world-destroying nanomachine - before this external authority could begin to solve it.

The deep reason why LW has this nervous tic when it comes to discussion of the forbidden topic is that it is bound up with a theoretical preoccupation of the moderators, namely, acausal decision theory.

In my 1970s scenario, the nano-pioneers believe that the only way to know whether grey goo is physically possible or not is to develop the true (physically correct) theory of possible nanomachines; and the nano-secrecy faction believes that, until this is done, the safe course of action is to avoid discussing the details in public.

Analogously, it seems that here in the real world of the 2010s, the handful of people on this site who are working to develop a formal acausal decision theory believe that the only way to know whether [scary idea] is actually possible, is to finish developing the theory; and a pro-secrecy faction has the upper hand on how to deal with the issue publicly until that is done.

Returning to the hypothetical scenario of the nano-pioneers, one can imagine the nano-secrecy faction also arguing for secrecy on the grounds that some people find the idea of grey goo terrifying or distressing. In the present situation, that is analogous to the argument for censorship on the grounds that [scary idea] has indeed scared some people. In both cases, it's even a little convenient - for the pro-secrecy faction - to have public discussion focus on this point, because it directs people away from the conceptual root of the problem.

In my opinion, unlike grey goo, the scary idea arising from acausal decision theory is an illusion, and the theorists who are afraid of it and cautious about discussing it are actually retarding the development of the theory. If they were to state, publicly, completely, and to the best of their ability, what it is that they're so afraid of, I believe the rest of us would be able to demonstrate that, in the terminology of JoshuaZ, there is no basilisk, there's only a pseudo-basilisk, at least for human beings.

Replies from: Eneasz, drethelin
comment by Eneasz · 2010-12-14T19:20:44.374Z · LW(p) · GW(p)

Well, that was a much more in-depth reply than I was expecting. I had actually been trying to point out that any pro-censorship person who spoke about this idea, ever, for any reason, even to justify the censorship, was actually slitting their own wrists by magnifying its exposure. But this was a very interesting reply that sparked some new thoughts for me. Thanks!

comment by drethelin · 2012-01-26T04:49:50.365Z · LW(p) · GW(p)

I love this post

comment by Eneasz · 2010-12-14T23:08:20.857Z · LW(p) · GW(p)

Woo! In an awesome display of confirmation bias, I've found another application of Roko's Rule less than 24 hours after the coinage! Go Chipotle! http://www.sogoodblog.com/2010/12/14/chipotle-social-media/ :)

comment by Manfred · 2010-12-12T20:15:46.175Z · LW(p) · GW(p)

Hm. Probably.

Given the difficulty of genuine censorship, it might be better merely to outline the risks (which can be put scarily) and encrypt it (maybe rot-n, so that it takes a little more work to break but is still doable for everyone).

Frankly, actually censoring things just invites the Streisand effect.
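
For concreteness, a minimal sketch of the rot-n scheme Manfred suggests (the function name and the choice of Python are illustrative assumptions, not anything from the thread):

```python
def rot_n(text: str, n: int) -> str:
    """Shift each ASCII letter n places, wrapping around the alphabet."""
    out = []
    for ch in text:
        if "a" <= ch <= "z":
            out.append(chr((ord(ch) - ord("a") + n) % 26 + ord("a")))
        elif "A" <= ch <= "Z":
            out.append(chr((ord(ch) - ord("A") + n) % 26 + ord("A")))
        else:
            out.append(ch)  # punctuation, digits, and spaces pass through
    return "".join(out)

# rot-13 is its own inverse; for any other n, decode with rot_n(text, 26 - n)
assert rot_n(rot_n("the forbidden post", 13), 13) == "the forbidden post"
```

The point of such a weak cipher is exactly what Manfred says: it is not security, only a speed bump that forces a deliberate act of decoding.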

comment by D_Alex · 2010-12-12T11:01:36.287Z · LW(p) · GW(p)
  1. The reasoning given for banning the "dangerous topic" is, to put it bluntly, irrational. Also, the manner in which it was made was appalling. It is hard to say more without revealing the topic, so take this as my opinion FWIW... but also:

  2. The "dangerous topic" is basically in the public domain now, so further censorship is pointless.

comment by Aurini · 2010-12-12T01:13:39.516Z · LW(p) · GW(p)

Goddamnit, I want to stare at this basilisk!

Replies from: Alicorn, David_Gerard
comment by Alicorn · 2010-12-12T04:20:17.830Z · LW(p) · GW(p)

Outside view indicates that if you stare at the basilisk, you will most likely either a) think it was a terrible idea and wish you hadn't and maybe have nightmares, or b) wonder what the heck all the fuss is about and consider it a waste of time except insofar as you might consider censorship wrong in itself, and might thereby be tempted to share the basilisk with others, each of whom has an independent risk of suffering reaction (a).

Do you want to want to stare at the basilisk?
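
(A hedged aside on how that independent risk compounds: if each new reader has probability p of reaction (a), then sharing with k readers leaves a 1 - (1 - p)^k chance of at least one reaction (a); even a modest p = 0.05 crosses 40% by k = 10.)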

Replies from: Kingreaper, topynate, JoshuaZ, NihilCredo, MarkusRamikin
comment by Kingreaper · 2010-12-12T16:25:08.437Z · LW(p) · GW(p)

I'll add my opinion to the list:

I'm not an (a) or a (b).

Turns out, the basilisk was very close to one of the things I'd thought up, based on the nature of this community's elders, and gone "no, no, they wouldn't buy into that idea, would they? No-one here would fall for that...".

Reading about it, combined with the knowledge that EY banned it, gives me an insight into EY's thought patterns that significantly decreases my respect for him. I think that that insight was worth the effort involved in reading it.

Replies from: Broggly
comment by Broggly · 2010-12-14T17:46:00.473Z · LW(p) · GW(p)

Honestly I was surprised at EY's reaction. I thought he had figured out things like that problem and would tear it to pieces rather than become upset by it. Possibly I'm not as smart as him, but even presuming Roko's right you would think Rationalists Should Win. Plus, I think Eliezer has publicly published something similar to the Basilisk, albeit much weaker and without being explicitly basilisk-like, so I'd have thought he would have worked out a solution. (EDIT: No, turns out it was someone else who came up with it. It wasn't really fleshed out so Eliezer may not have thought much of it or never noticed it in the first place.)

The fact that people are upset by it could be reason to hide it away, though, to protect the sensitive. Plus, having seen Dogma, I get that the post could be an existential risk...

Replies from: Kingreaper, Vaniver
comment by Kingreaper · 2010-12-14T18:35:40.847Z · LW(p) · GW(p)

The fact that people are upset by it could be reason to hide it away, though, to protect the sensitive.

I don't think hiding it will prevent people getting upset. In fact, hiding it may make people more likely to believe it, and thus get scared. If someone respects EY, and EY says "this thing you've seen is a basilisk", then they're more likely to be scared than if EY says "this thing you've seen is nonsense".

comment by Vaniver · 2010-12-14T18:46:31.508Z · LW(p) · GW(p)

Plus, having seen Dogma, I get that the post could be an existential risk...

My understanding is that the post isn't the x-risk; a UFAI could think this up itself. The reaction to the post is supposedly an x-risk: if we let on we can be manipulated that way, then a UFAI can do extra harm.

But if you want to show that you won't be manipulated a certain way, it seems that the right way to do that is to tear that approach apart and demonstrate its silliness, not seek to erase it from the internet. I can't come up with a metric by which EY's approach is reasonable.

Replies from: wedrifid, Broggly
comment by wedrifid · 2010-12-14T20:41:19.055Z · LW(p) · GW(p)

My understanding is that the post isn't the x-risk; a UFAI could think this up itself. The reaction to the post is supposedly an x-risk: if we let on we can be manipulated that way, then a UFAI can do extra harm.

(Concerns not necessarily limited to either existential or UFAI, but we cannot discuss that here.)

But if you want to show that you won't be manipulated a certain way, it seems that the right way to do that is to tear that approach apart and demonstrate its silliness, not seek to erase it from the internet. I can't come up with a metric by which EY's approach is reasonable.

Agree. :)

comment by Broggly · 2010-12-14T19:58:31.764Z · LW(p) · GW(p)

The reaction to the post is supposedly an x-risk

Yes, but not in the way you seem to be saying. I was semi-joking here, in that the post could spook people enough to increase x-risks (which wfg seems to be trying to do, albeit as blackmail rather than for its own sake). I was referring to how in the film Dogma gjb snyyra natryf, gb nibvq uryy, nggrzcg gb qrfgebl nyy ernyvgl. (rot13'd for spoilers, and in case it's too suggestive of the Basilisk)

if we let on we can be manipulated that way, then a UFAI can do extra harm.

It can? I suppose I just don't get decision theory. The non-basilisk part of that post left me pretty much baffled.

comment by topynate · 2010-12-12T20:20:55.291Z · LW(p) · GW(p)

"Do you want to know?" whispered the guide; a whisper nearly as loud as an ordinary voice, but not revealing the slightest hint of gender.

Brennan paused. The answer to the question seemed suspiciously, indeed extraordinarily obvious, even for ritual.

"Yes, provided that * * ** ** ** ** * * * ** ** ** * * ** * *," Brennan said finally.

"Who told you to say that?", hissed the guide.

Replies from: Roko
comment by Roko · 2010-12-12T20:44:26.040Z · LW(p) · GW(p)

Brennan is a fucking retard. No, you don't want to know. You want to signal affiliation with desirable groups, to send hard-to-fake signals of desirable personality traits such as loyalty, intelligence, power and the presence of informed allies. You want to say everything bad you possibly can about the outgroup and everything good about the ingroup. You want to preach altruism and then make a plausible but unlikely reasoning error which conveniently stops you from having to give away anything costly.

All the other humans do all of these things. This is the true way of our kind. You will be punished if you deviate from the way, or even if you try to overtly mention that this is the way.

Replies from: katydee
comment by katydee · 2010-12-17T03:46:30.582Z · LW(p) · GW(p)

This may be the way now, but it doesn't have to be the way always.

comment by JoshuaZ · 2010-12-12T21:57:58.746Z · LW(p) · GW(p)

You seem to be talking mainly in part (a) about the pseudo-basilisk rather than the basilisk itself. I suspect that most people who are vulnerable to the pseudo-basilisk are either mentally ill or vulnerable to having similar issues simply when thinking about the long-term implications of the second law of thermodynamics or the like. If one is strongly vulnerable to that sort of disturbing idea then between known laws of physics and nasty low probability claims made by some major religions, most basilisking of this sort is already well-covered.

comment by NihilCredo · 2010-12-12T16:00:18.520Z · LW(p) · GW(p)

Minus the censorship part, that's not worse than watching Saw.

Replies from: Alicorn
comment by Alicorn · 2010-12-12T19:40:27.234Z · LW(p) · GW(p)

One can receive partial-impact synopses of Saw without risking the full effect, and gauge one's susceptibility with more information on hand.

Replies from: David_Gerard, NihilCredo
comment by David_Gerard · 2010-12-12T19:58:30.893Z · LW(p) · GW(p)

There's a reason I've refrained from seeking out 2 Girls 1 Cup.

(I should stop bringing it into my mind, really.)

comment by NihilCredo · 2010-12-12T20:01:03.769Z · LW(p) · GW(p)

True. I think that after reading the debate(s) about the censored post one should have a pretty good idea of what it is, though.

comment by MarkusRamikin · 2012-05-14T10:09:46.287Z · LW(p) · GW(p)

My own reaction was

c) More.

Yes, I know. I'm hopelessly stupid.

comment by David_Gerard · 2010-12-12T01:36:45.616Z · LW(p) · GW(p)

http://www.youtube.com/v/9MTVMwIFAzc - ignore the title.

Edit: Well, it was the best I could do for a harmless Standard Test Basilisk at the time.

comment by gwern · 2011-01-08T18:02:22.087Z · LW(p) · GW(p)
[a lesser light asks Eliezer
What are the activities of an FAI?
Eliezer answers
I have not the slightest idea
The dim light then says
Why haven't you any idea?
Eliezer replies
I just want to keep my no-idea]

(With apologies to the original.)

comment by wedrifid · 2010-12-12T17:46:04.129Z · LW(p) · GW(p)

EDIT: I might choose to un-look at it myself

Now I am confused. You can do that?

Replies from: Normal_Anomaly
comment by Normal_Anomaly · 2010-12-12T18:04:31.673Z · LW(p) · GW(p)

No, I mean that I might if I had the opportunity. I would like that option, but I can't. Sorry for the confusing wording.

comment by hairyfigment · 2011-03-09T09:26:04.257Z · LW(p) · GW(p)

So Eliezer decided to draw attention to this scenario as a trinity knot joke? Well done, sir!

comment by Snowyowl · 2010-12-12T09:05:48.940Z · LW(p) · GW(p)

The only interesting piece of information I pulled from that is that in the time between EY posting to announce his intention to censor the post and his clicking the "delete" button, he got at least 3 points of positive karma (depending on how long it was between the page being cached and deleted). The basilisk must have made an impact.

comment by [deleted] · 2010-12-18T01:52:15.129Z · LW(p) · GW(p)

If it's dangerous for information to be merely written (say a link to an extremist website, or staggeringly good source code for sentient AI), it should not be posted here.

If it's obvious by widely accepted priors that an idea will harm the reader, then it should not be posted.

If the idea might be the above and you just don't know, it should be discussed in a limited circle of masochists, until they decide how obvious it is. Find the circle using the method below, but with as much precaution as possible.

If the idea's harmfulness depends on priors which vary across the community, it should be protected.

An adequate test for a protected idea: A series of questions, or even of one-on-one discussions, which test the odds an individual puts on all relevant priors. (Of course, the test should be clever enough to not give away the idea itself!) Anyone who would predictably regret knowing the idea should be told so.

If they still insist, give a second, clearer warning about the predicted effects of the idea on the person, then let them have the key. Or require that the person state why he or she wants to take this risk, and have a moderator decide.
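
One way to picture that gating procedure is the sketch below. Every name, score, and threshold is a hypothetical placeholder chosen for illustration; this is a minimal sketch of the flow described above, not a real system:

```python
# A self-contained sketch of the gating test described above. The prior
# questionnaire, the regret score, and the 0.5 threshold are all assumptions.

def assess_priors(answers: dict[str, float]) -> float:
    """Combine a person's stated odds on the relevant priors into one score,
    without ever revealing the protected idea itself."""
    return sum(answers.values()) / len(answers)

def request_access(answers: dict[str, float], insists: bool,
                   stated_reason: str, moderator_ok: bool,
                   regret_threshold: float = 0.5) -> bool:
    score = assess_priors(answers)
    if score >= regret_threshold:
        # priors predict regret: warn, then warn again more clearly
        print("Warning: you would predictably regret knowing this idea.")
        if not insists:
            return False
        print(f"Second warning: predicted-regret score is {score:.2f}.")
        # hand over the key only with a stated reason and a moderator's decision
        return bool(stated_reason) and moderator_ok
    return True  # no predicted regret: let them have the key

# Example: a reader whose priors flag regret, who insists, with moderator sign-off
print(request_access({"prior_A": 0.8, "prior_B": 0.7}, insists=True,
                     stated_reason="eyes open, curiosity", moderator_ok=True))
```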