Posts

How To Lose 100 Karma In 6 Hours -- What Just Happened 2010-12-10T08:27:28.781Z · score: -44 (92 votes)
How Greedy Bastards Have Saved More Lives Than Mother Theresa Ever Did 2010-12-03T06:20:38.961Z · score: 16 (26 votes)

Comments

Comment by waitingforgodel on Understanding Wikileaks history · 2011-01-02T16:09:37.021Z · score: 2 (2 votes) · LW · GW

Sorry to see this so heavily downvoted. Thanks -- this made for interesting reading and watching.

If you haven't checked out the archive of iq.org, it's also a rather interesting blog :)

re: formatting... you don't happen to use Ubuntu/Chrome, do you?

Comment by waitingforgodel on Dark Arts 101: Using presuppositions · 2011-01-02T08:32:00.050Z · score: -1 (1 votes) · LW · GW

He says that natural events are included in the category of journalism that's not about exposing other people's secrets....

Comment by waitingforgodel on Dark Arts 101: Using presuppositions · 2011-01-01T21:52:02.714Z · score: 1 (1 votes) · LW · GW

LOL, how did I miss this:

1) There is quite a bit of journalism that has nothing to do with exposing other people's secrets. This would include reporting on natural events (storms, snow, earthquakes, politicians lying or accepting bribes).

Are you under the impression that a politician wouldn't consider his accepting bribes to be a secret?

Comment by waitingforgodel on Dark Arts 101: Using presuppositions · 2011-01-01T14:49:41.986Z · score: 2 (6 votes) · LW · GW
  1. Wikileaks has published less than 1% of the diplomatic cables[1]. It goes through and removes sensitive and personal information before posting them online[2]. With a handful of exceptions, they only publish information that one of their newspaper partners has already published[2].

  2. In the US we don't say people are guilty until proven so -- Manning has made no public confession, and has not been tried. He's being held solely on the basis of one man's (Adrian Lamo's) testimony, to the best of our knowledge[3]. That man was forcibly checked into a mental institution three weeks before said informing, and has made several inconsistent statements to the press about his relationship with Manning and about what Manning told him[4].

Comment by waitingforgodel on New Year's Resolutions · 2011-01-01T05:02:05.128Z · score: 1 (5 votes) · LW · GW

What do you suppose Einstein would say about doing different things over and over and expecting the same result? :p

Comment by waitingforgodel on Rationality Quotes: December 2010 · 2010-12-15T11:33:18.456Z · score: 1 (3 votes) · LW · GW

Never trust anyone unless you're talking in person? :p

Comment by waitingforgodel on How To Lose 100 Karma In 6 Hours -- What Just Happened · 2010-12-12T08:54:53.589Z · score: 2 (10 votes) · LW · GW

Yes. If I didn't, none of this would make any sense...

Comment by waitingforgodel on How To Lose 100 Karma In 6 Hours -- What Just Happened · 2010-12-12T08:40:39.390Z · score: 1 (5 votes) · LW · GW

It's interesting, but I don't see any similarly high-effectiveness ways to influence Peter Thiel... Republicans already want to do high-x-risk things; Thiel doesn't already want to decrease funding.

Comment by waitingforgodel on How To Lose 100 Karma In 6 Hours -- What Just Happened · 2010-12-11T06:11:56.073Z · score: 4 (12 votes) · LW · GW

After reviewing my copies of the deleted post, I can say that he doesn't say this explicitly. I was remembering another commenter who was trying to work out the implications on x-risk of having viewed the basilisk.

EY does say things that directly imply he thinks the post is a basilisk because of an x-risk increase, but he does not say what he thinks that increase is.

Edit: can't reply, no karma. It means I don't know if it's proportional.

Comment by waitingforgodel on How To Lose 100 Karma In 6 Hours -- What Just Happened · 2010-12-11T05:57:34.102Z · score: 2 (12 votes) · LW · GW

At karma 0 I can't reply to each of you one at a time (rate limited - 10 min per post), so here are my replies in a single large comment:


@JoshuaZ

I would feel differently about nuke designs. As I said in the "why" links, I believe that EY has a bug when it comes to tail risks. This is an attempt to fix that bug.

Basically, non-nuke censorship isn't necessary when you use a Reddit engine... and Roko's post isn't a nuke.


@rwallace

Yes, though you'd have to say more.


@jaimeastorga2000

Incredible, thanks for the link


@shokwave

Incredible. Where were you two days ago!

After Roko's post on the question of enduring torture to reduce existential risks, I was sure there must be an SIAI/LWer who was willing to kill for the cause, but no one spoke up. Thanks :p


@Jack

In this case my estimate is a 5% chance that EY wants to spread the censored material, and used censoring for publicity. Therefore spreading the censored material is questionable as a tactic.


@rwallace

Great! Get EY to rot13 posts instead of censoring them.


@Psychohistorian

You can't just pretend that the threat is trivial when it's not.

Fair enough. But you can't pretend that it's illegal when it's not (i.e. the torture/murder example you gave).


@katydee

Actually, I just sent an email. Christians/Republicans are killing ??? people for the same reason they blocked stem cell research: stupidity. Also, why you're not including EY in that causal chain is beyond me.


@Lightwave

I think his blackmail declarations either don't cover my precommitment, or they also require him to not obey US laws (which are also threats).

Comment by waitingforgodel on How To Lose 100 Karma In 6 Hours -- What Just Happened · 2010-12-11T05:15:14.144Z · score: -1 (11 votes) · LW · GW

Re #1: EY claimed at the time that his censorship produced something like a 0.0001% risk reduction, hence the amount chosen -- it's there to balance his motivation out.

Re #2: Letting Christians/Republicans know that they should be interested in passing a law is not the same as hostage taking or harming someone's family. I agree that narrow targeting is preferable.

Re #3 and #4: I have a right to tell Christians/Republicans about a law they're likely to feel should be passed -- it's a right granted to me by the country I live in. I can tell them about that law for whatever reason I want. That's also a right granted to me by the country I live in. By definition this is legitimate authority, because a legitimate authority granted me these rights.

Comment by waitingforgodel on How To Lose 100 Karma In 6 Hours -- What Just Happened · 2010-12-11T04:57:46.427Z · score: -5 (25 votes) · LW · GW

Yes, hopefully for EY as well

Comment by waitingforgodel on How To Lose 100 Karma In 6 Hours -- What Just Happened · 2010-12-10T18:38:14.916Z · score: -9 (23 votes) · LW · GW

Yes: talk some sense into Eliezer.

Comment by waitingforgodel on How To Lose 100 Karma In 6 Hours -- What Just Happened · 2010-12-10T18:36:50.082Z · score: -2 (16 votes) · LW · GW

The common misunderstanding in these comments is that people didn't click on the "precommitment" link and read the reasons why the precommitment reduced existential risk.

If I ever do this again, I'll make the reasoning more explicit. In the meantime I'm not sure what to do except add this comment, and the edit at the bottom of the article for new readers.

Comment by waitingforgodel on How To Lose 100 Karma In 6 Hours -- What Just Happened · 2010-12-10T18:31:35.373Z · score: -6 (12 votes) · LW · GW

By my math it's an existential risk reduction. Your point was talked about already in the "precommitment" post linked to from this article.

Comment by waitingforgodel on How To Lose 100 Karma In 6 Hours -- What Just Happened · 2010-12-10T18:29:00.901Z · score: 3 (9 votes) · LW · GW

You throw some scary ideas around. Try this one on for size. This post of yours has caused me to revise my probability of the proposition "the best solution to some irrational precommitments is murder" from Pascal's-wager levels (indescribably improbable) to 0.01%.

There are some people who agree with you (the best way to block legislation is to kill the people who come up with it).

I'd say that since I've only been talking about doing things well within my legal rights (using the legal system), talking about murdering me is a bit "cultish"...

Comment by waitingforgodel on How To Lose 100 Karma In 6 Hours -- What Just Happened · 2010-12-10T18:18:12.852Z · score: 2 (8 votes) · LW · GW

I actually explicitly said what oscar said in the discussion of the precommitment.

I also posted my reasoning for it.

Those are both from the "precommitted" link in my article.

Comment by waitingforgodel on How To Lose 100 Karma In 6 Hours -- What Just Happened · 2010-12-10T18:02:37.723Z · score: -3 (11 votes) · LW · GW

If one lone individual told me that if I didn't wear my seatbelt, he'd bust my kneecaps, then that would be blackmail.

I think this is closer to: if one lone individual said that every time he saw you not wearing a seatbelt (which for some reason a law couldn't get passed for), he'd nudge gun control legislation closer to being enacted (assuming he knew you'd hate gun control legislation).

Comment by waitingforgodel on How To Lose 100 Karma In 6 Hours -- What Just Happened · 2010-12-10T08:33:05.689Z · score: -1 (17 votes) · LW · GW

Also note that it wasn't when I submitted to the main site...

Comment by waitingforgodel on Best career models for doing research? · 2010-12-10T08:27:52.872Z · score: -1 (15 votes) · LW · GW

I should have taken this bet

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T22:28:29.426Z · score: 5 (17 votes) · LW · GW

YES IT IS. In case anyone missed it: it isn't Roko's post we're talking about right now.

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T22:27:43.067Z · score: 4 (14 votes) · LW · GW

In this case, the comment censored was not posted by you. Therefore you're not the author.

FYI, the actual author didn't even know it was censored.

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T22:14:32.043Z · score: -4 (30 votes) · LW · GW

Are you aware of the damage your censoring is doing?

That blocking these things you think are bad (and most people do not) is causing tangible PR problems, chilling effects, SIAI/LW appearing psycho, and repeated discussion of an ancient thread?

If you weren't you, but were instead witnessing a friend do something so utterly stupid, wouldn't you tell them to stop?

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T22:09:31.291Z · score: -4 (16 votes) · LW · GW

I don't think my addition gives EY the high ground.

What are the points you wanted to bring up with him?

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T22:08:55.629Z · score: -11 (19 votes) · LW · GW

I already did that, that's why we're here.... make that email horribleforsiai.wfg@xoxy.net

Releasing the stuff hinted at here would cause a minor scandal -- perhaps rolling dice whenever a post is banned would be a happy medium.

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T21:48:27.077Z · score: -6 (12 votes) · LW · GW

Huh? Try:

P(law will pass | (sane existential risk estimators would think it's retarded) && (lots of right wingers heard great reasons) && (a lobby or two was convinced it was in their interest))

But (hopefully!) nothing like that could be caused by a single email. Baby steps.

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T21:41:30.182Z · score: 0 (8 votes) · LW · GW

Are you joking? Do you have any idea what a retarded law can do to existential risks?

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T21:37:49.111Z · score: -14 (30 votes) · LW · GW

A (slightly more) sane response would be to direct your altruistic punishment towards the SIAI specifically.

I'm all ears.

If you can think of something equally bad that targets SIAI specifically, (or anyone reading this can), email it to badforsiai.wfg@xoxy.net

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T21:26:26.347Z · score: -12 (30 votes) · LW · GW

If you feel more comfortable labeling it 'terrorism'... well... it's your thinking to bias.

someone has to stand up against your abuse

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T21:15:48.770Z · score: -29 (45 votes) · LW · GW

Agree except for the 'terrorism' and 'allegedly' part.

I just emailed a right-wing blogger some stuff that probably isn't good for the future. Not sure what the increase was, hopefully around 0.0001%.

I'll write it up in more detail and post a top-level discussion thread after work.

-wfg

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T21:07:26.158Z · score: -10 (24 votes) · LW · GW

Then I guess I'll be asked to leave the lesswrong site.

The 0.0001% bit was a reference to my earlier precommitment

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T20:44:38.009Z · score: 1 (9 votes) · LW · GW

yep, this one is showing as deleted

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T18:59:39.187Z · score: -4 (12 votes) · LW · GW

Lol. right, that'd do it

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T18:53:46.024Z · score: 1 (9 votes) · LW · GW

Great post. It confuses me why this isn't at 10+ karma

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T18:18:40.344Z · score: -11 (21 votes) · LW · GW

I notice that your list is future facing.

Lies are usually about the past.

It's very easy to not lie when talking about the future. It is much easier to "just this once" lie about the past. You can do both, for instance, by explaining that you believe a project will succeed, even while withholding information that would convince a donor otherwise.

Examples of this would be: errors or misconduct in completing past projects; lack of relevant qualifications among people SIAI plans to employ on a project; administrative errors and misconduct; or public relations / donor outreach misconduct.

To put the question another, less abstract way: have you ever lied to an SIAI donor? Do you know of anyone affiliated with SIAI who has lied to a donor?

Hypothetically, if I said I had evidence in the affirmative to the second question, how surprising would that be to you? How much money would you bet that no such evidence exists?

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T17:10:31.741Z · score: 0 (6 votes) · LW · GW

why shouldn't they shut up?

Because this is LessWrong -- you can give a sane response and not only does it clear the air, people understand and appreciate it.

Cable news debating isn't needed here.

Sure we might still wonder if they're being perfectly honest, but saying something more sane on the topic than silence seems like a net-positive from their perspective.

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T17:06:13.829Z · score: 1 (7 votes) · LW · GW

no sensible person who had the answer would

I respectfully disagree, and have my hopes set on Carl (or some other level-headed person in a position to know) giving a satisfying answer.

This is LessWrong after all -- we can follow complicated arguments, and at least hearing how SIAI is actually thinking about such things would (probably) reduce my paranoia.

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T17:00:34.961Z · score: 1 (7 votes) · LW · GW

Make that "they do it for the greater good"

Sorry about mistakenly implying s/he was affiliated. I'll be more diligent with my Google stalking in the future.

edit: In my defense, SIAI affiliation has been very common when looking up very "pro" people from this thread

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T16:54:46.530Z · score: 4 (8 votes) · LW · GW

but he won me back by answering anyway <3

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T16:54:30.853Z · score: 6 (10 votes) · LW · GW

This sounds very sane, and makes me feel a lot better about the context. Thank you very much.

I very much like the idea that top SIAI people believe that there is such a thing as too much devotion to the cause (and, I'm assuming, actively talk people who are above that level down as you describe doing for Roko).

As someone who has demonstrated impressive sanity around these topics, you seem to be in a unique position to answer these questions with an above-average level-headedness:

  1. Do you understand the math behind the Roko post deletion?

  2. What do you think about the Roko post deletion?

  3. What do you think about future deletions?

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T16:32:48.873Z · score: -1 (15 votes) · LW · GW

Am I missing something? Desrtopa responded to questions of lying to the donor pool with the equivalent of "We do it for the greater good"

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T16:32:18.049Z · score: 4 (8 votes) · LW · GW

That "confessor" link is terrific

If banning Roko's post would reasonably cause discussion of those ideas to move away from LessWrong, then by EY's own reasoning (the link you gave) it seems like a retarded move.

Right?

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T16:21:46.510Z · score: 3 (17 votes) · LW · GW

accusations stick in the mind even when one is explicitly told they are false

Actually that citation is about both positive and negative things -- so unless you're also asking pro-SIAI people to hush up, you're (perhaps unknowingly) seeking to cause a pro-SIAI bias.

Another thing that citation seems to imply is that reflecting on, rather than simply diverting our attention away from, scary thoughts is essential to coming to a correct opinion on them.

One of the interesting morals from Roko's contest is that if you care deeply about getting the most benefit per donated dollar you have to look very closely at who you're giving it to.

Market forces work really well for lightbulb-sales businesses, but not so well for mom-and-pop shops, let alone charities. The motivations, preferences, and likely future actions of the people you're giving money to become very important. Knowing if you can believe the person, in these contexts, becomes even more important.

As you note, I've studied marketing, sales, propaganda, cults, and charities. I know that there are some people who have no problem lying for their cause (especially if it's for their god or to save the world).

I also know that there are some people who absolutely suck at lying. They try to lie, but the truth just seeps out of them.

That's why I give Roko's blurted comments more weight than whatever I'd hear from SIAI people who were chosen by you -- no offence. I'll still talk with you guys, but I don't think a reasonably sane person can trust the sales guy beyond a point.

As far as your question goes, my primary desire is a public, consistent moderation policy for LessWrong. If you're going to call this a community blog devoted to rationality, then please behave in sane ways. (If no one owns the blog -- if it belongs to the community -- then why is there dictatorial post deletion?)

I'd also like an apology from EY with regard to the chilling effects his actions have caused.

But back to what you replied to:

What would SIAI be willing to lie to donors about?

Do you have any answers to this?

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T15:56:36.230Z · score: 4 (8 votes) · LW · GW

Okay, you can leave it abstract. Here's what I was hoping to have explained: why were you discussing what people would really be prepared to sacrifice?

... and not just the surface level of "just for fun," but also considering how these "just for fun" games get started, and what they do to enforce cohesion in a group.

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T05:32:58.582Z · score: 0 (4 votes) · LW · GW

Ahh. I was trying to ask about Cialdini-style influence techniques.

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T05:32:44.089Z · score: -6 (14 votes) · LW · GW

The wording here leaves weird wiggle room -- you're implying it wasn't Nick?

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T04:18:40.955Z · score: 2 (12 votes) · LW · GW

I think the question you should be asking is less about evil conspiracies, and more about what kind of organization SIAI is -- what would they tell you about, and what would they lie to you about.

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T03:46:52.113Z · score: -2 (20 votes) · LW · GW

I agree that there's a lot in history, but the examples you cited have something that doesn't match here -- historically, you lie to people you don't plan on cooperating with later.

If you lie to an oppressive government, it's okay because it'll either get overthrown or you'll never want to cooperate with it (so great is your reason for lying).

Lying to your donor pool is very, very different from lying to the Nazis about hiding Jews.

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T03:29:03.244Z · score: -6 (22 votes) · LW · GW

One of the more disturbing topics in this post is the question of how much you can trust an organization of people who are willing to endure torture, rape, and death for their cause.

Surely lying isn't as bad as any of those...

Of course, lying for your cause is almost certainly a long-term retarded thing to do... but so is censoring ideas...

It's hard to know what to trust on this thread

Comment by waitingforgodel on Best career models for doing research? · 2010-12-09T03:16:21.653Z · score: 2 (12 votes) · LW · GW

First off, great comment -- interesting, and complex.

But, some things still don't make sense to me...

Assuming that what you described led to:

I was once criticized by a senior singinst member for not being prepared to be tortured or raped for the cause. I mean not actually, but, you know, in theory. Precommiting to being prepared to make a sacrifice that big. shrugs

  1. How did precommitting enter into it?

  2. Are you prepared to be tortured or raped for the cause? Have you precommitted to it?

  3. Have other SIAI people you know of talked about this with you, have other SIAI people precommitted to it?

  4. What do you think of others who do not want to be tortured or raped for the cause?

Thanks, wfg