MIRI strategy 2013-10-28T15:33:10.040Z


Comment by ColonelMustard on CFAR in 2014: Continuing to climb out of the startup pit, heading toward a full prototype · 2014-12-28T22:37:17.134Z · LW · GW

It is, or was, an organisation to teach thinking skills. Please don't focus on the example; it was the first one that came to mind and I didn't realise the website had expired. The point is that a lot of groups claim to teach thinking skills. Do you consider all such groups to count as EA? If not, what distinguishes CFAR from those that don't?

Comment by ColonelMustard on CFAR in 2014: Continuing to climb out of the startup pit, heading toward a full prototype · 2014-12-26T22:54:14.199Z · LW · GW

Strongly agree with the last two sentences here.

Comment by ColonelMustard on CFAR in 2014: Continuing to climb out of the startup pit, heading toward a full prototype · 2014-12-26T22:35:41.339Z · LW · GW

How does CFAR rank other thinking-skills organisations outside the EA/MIRI groups? For instance, is Ember Associates plausibly one of the most important organisations currently existing?

Comment by ColonelMustard on Effective Altruism Summit 2014 · 2014-05-10T13:17:00.149Z · LW · GW

Any word on this? We submitted applications ~6 weeks ago and it would be useful to find out who will be offered a spot.

Comment by ColonelMustard on Open Thread April 8 - April 14 2014 · 2014-04-11T12:29:19.492Z · LW · GW

Thought experiment. Imagine a machine that can create an identical set of atoms to the atoms that comprise a human's body. This machine is used to create a copy of you, and a copy of a second person, whom you have never met and know nothing about.

After the creation of the copy, 'you' will have no interaction with it. In fact, it's going to be placed into a space ship and fired into outer space, as is the copy of Person 2. Unfortunately, one spaceship is going to be very painful to be in. The other is going to be very pleasant. So a copy of you will experience pain or pleasure, and a copy of someone else will experience the other sensation.

To what extent do you care which copy receives which treatment? Zero? As much as you would care if it was you who was to be placed into the spaceship? Or something in between?

Comment by ColonelMustard on 2013 Less Wrong Census/Survey · 2013-12-09T12:24:16.228Z · LW · GW

"Where are you from" and "where do you live now" are different questions. The first of these has multiple answers for a lot of people I know; the second probably doesn't. I would suggest both questions be asked next year.

Comment by ColonelMustard on 2013 Less Wrong Census/Survey · 2013-12-09T05:04:45.629Z · LW · GW

Took the survey. I assume from the phrasing that 'country' means where I'm "from" rather than where I currently reside (there is more room for uncertainty about the former than about the latter). Might be interesting to put both questions.

Comment by ColonelMustard on A critique of effective altruism · 2013-12-02T04:57:57.944Z · LW · GW

EA doesn't want to take over countries

"Take over countries" is such an ugly phrase. I prefer "country optimisation".

Comment by ColonelMustard on MIRI strategy · 2013-10-30T01:26:20.687Z · LW · GW

"Hear ridiculous-sounding proposition, mark it as ridiculous, engage explanation, begin to accept arguments, begin to worry about this, agree to look at further reading"

Comment by ColonelMustard on MIRI strategy · 2013-10-29T12:50:24.523Z · LW · GW

I agree and I like it. I think it could be further optimised for "convince intelligent non-LWers who have been sent one link from their rationalist friends and will read only that one link", but it could definitely serve as a great starting point.

Comment by ColonelMustard on MIRI strategy · 2013-10-29T12:46:55.808Z · LW · GW

I agree and would also add that "Eliezer failed in 2001 to convince many people" does not imply "Eliezer in 2013 is incapable of persuading people". From his writings, I understand he has changed his views considerably in the last dozen years.

Comment by ColonelMustard on MIRI strategy · 2013-10-29T12:44:57.482Z · LW · GW

Thanks, Luke. This is an informative reply, and it's great to hear you have a standard talk! Is it publicly available, and where can I see it if so? Maybe MIRI should ask FOAFs to publicise it?

It's also great to hear that MIRI has tried one pamphlet. I would agree that "This one pamphlet we tried didn't work" points us in the direction that "No pamphlet MIRI can produce will accomplish much", but that proposition is far from certain. I'd still be interested in the general case of "Can MIRI reduce the chance of UFAI x-risk through pamphlets?"

Pamphlets...don't work for MIRI's mission. The inferential distance is too great, the ideas are too Far, the impact is too far away.

You may be right. But it is possible to convince intelligent non-rationalists to take UFAI x-risk seriously in less than an hour (I've tested this), and anything that can do that in a manner that scales well would have a huge impact. What's the Value of Information on trying to do that? You mention the Sequences and HPMOR (which I've sent to a number of people with the instruction "set aside what you're doing and read this"). I definitely agree that they filter nicely for "able to think". But they also require a huge time commitment on the part of the reader, whereas a pamphlet or blog post would not.

Comment by ColonelMustard on Open Thread, October 27 - 31, 2013 · 2013-10-28T08:44:11.307Z · LW · GW

Thank you! One more - how much karma do I need? I was under the impression one needed 2 to post to discussion (20 to main), but presumably this is not the case. Is there an up-to-date list?

Comment by ColonelMustard on Open Thread, October 27 - 31, 2013 · 2013-10-28T04:20:24.625Z · LW · GW

Not sure where this goes: how can I submit an article to discussion? I've written it and saved it as a draft, but I haven't figured out a way to post it.

Comment by ColonelMustard on How does MIRI Know it Has a Medium Probability of Success? · 2013-10-07T02:32:06.367Z · LW · GW

I am thinking of writing a discussion thread to propose MIRI make it a priority to create a (video/pamphlet/blog post), tailored to intelligent non-rationalists and with as little jargon as possible (e.g. no terms like Kolmogorov complexity), to explain the dangers of UFAI. Please upvote this comment if you think LW is better with such a post, because I have zero karma.

Comment by ColonelMustard on Rationality Quotes October 2013 · 2013-10-07T02:27:20.793Z · LW · GW

Great damage is usually caused by those who are too scrupulous to do small harm.