Sorry to see this so heavily downvoted. Thanks -- this made for interesting reading and watching.
If you haven't checked out the archive of iq.org it's also a rather interesting blog :)
re: formatting... you don't happen to use Ubuntu/Chrome, do you?
He says that natural events are included in the category of journalism that's not about exposing other people's secrets...
LOL, how did I miss this:
1) There is quite a bit of journalism that has nothing to do with exposing other people's secrets. This would include reporting on natural events (storms, snow, earthquakes, politicians lying or accepting bribes).
Are you under the impression that a politician wouldn't consider his accepting bribes to be a secret?
Wikileaks has published less than 1% of the diplomatic cables[1]. It goes through and removes sensitive and personal information before posting them online[2]. Except for a handful of exceptions, they only publish information that one of their newspaper partners has already published[2].
In the US we don't say people are guilty until proven so -- Manning has made no public confession, and has not been tried. He's being held solely as the result of one man's (Adrian Lamo's) testimony, to the best of our knowledge[3]. That man was forcibly checked into a mental institution 3 weeks before said informing, and has made several inconsistent statements to the press about his relationship with Manning and about what Manning told him[4].
What do you suppose Einstein would say about doing different things over and over and expecting the same result? :p
Never trust anyone unless you're talking in person? :p
Yes. If I didn't none of this would make any sense...
It's interesting, but I don't see any similarly high-effectiveness ways to influence Peter Thiel... Republicans already want to do high x-risk things, Thiel doesn't already want to decrease funding.
After reviewing my copies of the deleted post, I can say that he doesn't say this explicitly. I was remembering another commenter who was trying to work out the implications on x-risk of having viewed the basilisk.
EY does say things that directly imply he thinks the post is a basilisk because of an x-risk increase, but he does not say what he thinks that increase is.
Edit: can't reply, no karma. It means I don't know if it's proportional.
At karma 0 I can't reply to each of you one at a time (rate limited - 10 min per post), so here are my replies in a single large comment:
I would feel differently about nuke designs. As I said in the "why" links, I believe that EY has a bug when it comes to tail risks. This is an attempt to fix that bug.
Basically non-nuke censorship isn't necessary when you use a reddit engine... and Roko's post isn't a nuke.
Yes, though you'd have to say more.
Incredible, thanks for the link
Incredible. Where were you two days ago!
After Roko's post on the question of enduring torture to reduce existential risks, I was sure there must be a SIAI/LWer who was willing to kill for the cause, but no one spoke up. Thanks :p
In this case my estimate is a 5% chance that EY wants to spread the censored material, and used censoring for publicity. Therefore spreading the censored material is questionable as a tactic.
Great! Get EY to rot13 posts instead of censoring them.
You can't just pretend that the threat is trivial when it's not.
Fair enough. But you can't pretend that it's illegal when it's not (ie. the torture/murder example you gave).
Actually, I just sent an email. Christians/Republicans are killing ??? people for the same reason they blocked stem cell research: stupidity. Also, why you're not including EY in that causal chain is beyond me.
I think his blackmail declarations either don't cover my precommitment, or they also require him to not obey US laws (which are also threats).
Re #1: EY claimed his censorship caused something like 0.0001% risk reduction at the time, hence the amount chosen -- it is there to balance his motivation out.
Re #2: Letting Christians/Republicans know that they should be interested in passing a law is not the same as hostage taking or harming someone's family. I agree that narrow targeting is preferable.
Re #3 and #4: I have a right to tell Christians/Republicans about a law they're likely to feel should be passed -- it's a right granted to me by the country I live in. I can tell them about that law for whatever reason I want. That's also a right granted to me by the country I live in. By definition this is legitimate authority, because a legitimate authority granted me these rights.
Yes, hopefully for EY as well
Yes: talk some sense into Eliezer.
The common misunderstanding in these comments is that people didn't click on the "precommitment" link and read the reasons why the precommitment reduced existential risk.
If I ever do this again, I'll make the reasoning more explicit. In the meantime I'm not sure what to do except add this comment, and the edit at the bottom of the article for new readers.
By my math it's an existential risk reduction. Your point was talked about already in the "precommitment" post linked to from this article.
You throw some scary ideas around. Try this one on for size. This post of yours has caused me to revise my probability of the proposition "the best solution to some irrational precommitments is murder" from Pascal's-wager levels (indescribably improbable) to 0.01%.
There are some people who agree with you (the best way to block legislation is to kill the people who come up with it).
I'd say that since I've only been talking about doing things well within my legal rights (using the legal system), that talking about murdering me is a bit "cultish"...
I actually explicitly said what oscar said in the discussion of the precommitment.
I also posted my reasoning for it.
Those are both from the "precommitted" link in my article.
If one lone individual told me that if I didn't wear my seatbelt, he'd bust my kneecaps, then that would be blackmail.
I think this is closer to one lone individual saying that every time he saw you not wearing a seatbelt (which for some reason a law couldn't be passed for), he'd nudge gun control legislation closer to being enacted (assuming he knew you'd hate gun control legislation).
Also note that it wasn't when I submitted to the main site...
I should have taken this bet
YES IT IS. In case anyone missed it: it isn't Roko's post we're talking about right now.
In this case, the comment censored was not posted by you. Therefore you're not the author.
FYI the actual author didn't even know it was censored.
Are you aware of the damage your censoring is doing?
That blocking these things you think are bad (and most people do not) is causing tangible PR problems, chilling effects, SIAI/LW appearing psycho, and repeated discussion of an ancient thread?
If you weren't you, but were instead witnessing a friend do something so utterly stupid, wouldn't you tell them to stop?
I don't think my addition gives EY the high ground.
What are the points you wanted to bring up with him?
I already did that; that's why we're here... make that email horribleforsiai.wfg@xoxy.net
Releasing the stuff hinted at here would cause a minor scandal -- perhaps rolling dice whenever a post is banned would be a happy medium.
Huh? Try:
P(law will pass | (sane existential risk estimators would think it's retarded) && (lots of right wingers heard great reasons) && (a lobby or two was convinced it was in their interest))
But (hopefully!) nothing like that could be caused with a single email. Baby steps.
Are you joking? Do you have any idea what a retarded law can do to existential risks?
A (slightly more) sane response would be to direct your altruistic punishment towards the SIAI specifically.
I'm all ears.
If you can think of something equally bad that targets SIAI specifically, (or anyone reading this can), email it to badforsiai.wfg@xoxy.net
If you feel more comfortable labeling it 'terrorism'... well... it's your thinking to bias.
someone has to stand up against your abuse
Agree except for the 'terrorism' and 'allegedly' part.
I just emailed a right-wing blogger some stuff that probably isn't good for the future. Not sure what the increase was, hopefully around 0.0001%.
I'll write it up in more detail and post a top-level discussion thread after work.
-wfg
Then I guess I'll be asked to leave the lesswrong site.
The 0.0001% bit was a reference to my earlier precommitment
yep, this one is showing as deleted
Lol. right, that'd do it
Great post. It confuses me why this isn't at 10+ karma
I notice that your list is future facing.
Lies are usually about the past.
It's very easy to not lie when talking about the future. It is much easier to "just this once" lie about the past. You can do both at once, for instance, by explaining that you believe a project will succeed, even while withholding information that would convince a donor otherwise.
An example of this would be errors or misconduct in completing past projects.
Lack of relevant qualifications for people SIAI plans to employ on a project.
Or administrative errors and misconduct.
Or public relations / donor outreach misconduct.
To put the question another, less abstract way: have you ever lied to a SIAI donor? Do you know of anyone affiliated with SIAI who has lied to a donor?
Hypothetically, if I said I had evidence in the affirmative to the second question, how surprising would that be to you? How much money would you bet that such evidence doesn't exist?
why shouldn't they shut up?
Because this is LessWrong -- you can give a sane response and not only does it clear the air, people understand and appreciate it.
Cable news debating isn't needed here.
Sure we might still wonder if they're being perfectly honest, but saying something more sane on the topic than silence seems like a net-positive from their perspective.
no sensible person who had the answer would
I respectfully disagree, and have my hopes set on Carl (or some other level-headed person in a position to know) giving a satisfying answer.
This is LessWrong after all -- we can follow complicated arguments, and at least hearing how SIAI is actually thinking about such things would (probably) reduce my paranoia.
Make that "they do it for the greater good"
Sorry about mistakenly implying s/he was affiliated. I'll be more diligent with my google stalking in the future.
edit: In my defense, SIAI affiliation has been very common when looking up very "pro" people from this thread
but he won me back by answering anyway <3
This sounds very sane, and makes me feel a lot better about the context. Thank you very much.
I very much like the idea that top SIAI people believe that there is such a thing as too much devotion to the cause (and, I'm assuming, actively talk people who are above that level down as you describe doing for Roko).
As someone who has demonstrated impressive sanity around these topics, you seem to be in a unique position to answer these questions with an above-average level-headedness:
Do you understand the math behind the Roko post deletion?
What do you think about the Roko post deletion?
What do you think about future deletions?
Am I missing something? Desrtopa responded to questions about lying to the donor pool with the equivalent of "We do it for the greater good."
That "confessor" link is terrific
If banning Roko's post would reasonably cause discussion of those ideas to move away from LessWrong, then by EY's own reasoning (the link you gave) it seems like a retarded move.
Right?
accusations stick in the mind even when one is explicitly told they are false
Actually that citation is about both positive and negative things -- so unless you're also asking pro-SIAI people to hush up, you're (perhaps unknowingly) seeking to cause a pro-SIAI bias.
Another thing that citation seems to imply is that reflecting on, rather than simply diverting our attention away from, scary thoughts is essential to coming to a correct opinion on them.
One of the interesting morals from Roko's contest is that if you care deeply about getting the most benefit per donated dollar you have to look very closely at who you're giving it to.
Market forces work really well for lightbulb-sales businesses, but not so well for mom-and-pop shops, let alone charities. The motivations, preferences, and likely future actions of the people you're giving money to become very important. Knowing if you can believe the person, in these contexts, becomes even more important.
As you note, I've studied marketing, sales, propaganda, cults, and charities. I know that there are some people who have no problem lying for their cause (especially if it's for their god or to save the world).
I also know that there are some people who absolutely suck at lying. They try to lie, but the truth just seeps out of them.
That's why I give Roko's blurted comments more weight than whatever I'd hear from SIAI people who were chosen by you -- no offence. I'll still talk with you guys, but I don't think a reasonably sane person can trust the sales guy beyond a point.
As far as your question goes, my primary desire is a public, consistent moderation policy for LessWrong. If you're going to call this a community blog devoted to rationality, then please behave in sane ways. (If no one owns the blog -- if it belongs to the community -- then why is there dictatorial post deletion?)
I'd also like an apology from EY with regard to the chilling effects his actions have caused.
But back to what you replied to:
What would SIAI be willing to lie to donors about?
Do you have any answers to this?
Okay, you can leave it abstract. Here's what I was hoping to have explained: why were you discussing what people would really be prepared to sacrifice?
... and not just the surface level of "just for fun," but also considering how these "just for fun" games get started, and what they do to enforce cohesion in a group.
Ahh. I was trying to ask about Cialdini-style influence techniques.
The wording here leaves weird wiggle room -- you're implying it wasn't Nick?
I think the question you should be asking is less about evil conspiracies, and more about what kind of organization SIAI is -- what would they tell you about, and what would they lie to you about.
I agree that there's a lot in history, but the examples you cited have something that doesn't match here -- historically, you lie to people you don't plan on cooperating with later.
If you lie to an oppressive government, it's okay because it'll either get overthrown or you'll never want to cooperate with it (so great is your reason for lying).
Lying to your donor pool is very, very different from lying to the Nazis about hiding Jews.
One of the more disturbing topics in this post is the question of how much you can trust an organization of people who are willing to endure torture, rape, and death for their cause.
Surely lying isn't as bad as any of those...
Of course, lying for your cause is almost certainly a long term retarded thing to do... but so is censoring ideas...
It's hard to know what to trust on this thread
First off, great comment -- interesting, and complex.
But, some things still don't make sense to me...
Assuming that what you described led to:
I was once criticized by a senior singinst member for not being prepared to be tortured or raped for the cause. I mean not actually, but, you know, in theory. Precommitting to being prepared to make a sacrifice that big. shrugs
How did precommitting enter in to it?
Are you prepared to be tortured or raped for the cause? Have you precommitted to it?
Have other SIAI people you know of talked about this with you, have other SIAI people precommitted to it?
What do you think of others who do not want to be tortured or raped for the cause?
Thanks, wfg