The Singularity Institute is expanding its research program at very little cost. Boo-yah!

post by lukeprog · 2011-07-22T10:48:55.724Z · LW · GW · Legacy · 18 comments

Three new research associates. Link to the announcement.


comment by JGWeissman · 2011-07-22T15:52:08.803Z · LW(p) · GW(p)

Research associates are not salaried staff, but we encourage their Friendly AI-related research outputs by, for example, covering their travel costs for conferences at which they present academic work relevant to our mission.

Are the research associates given enough support that they can work full time on their research?

Replies from: lukeprog
comment by lukeprog · 2011-07-22T20:09:37.963Z · LW(p) · GW(p)

Are the research associates given enough support that they can work full time on their research?

No. SI needs more funding to be able to support that kind of thing. The Research Associates program allows SI to support some additional research with almost no increase in our funding levels.

Replies from: Rain
comment by Rain · 2011-07-22T20:25:45.892Z · LW(p) · GW(p)

Is it SI, or SIAI, or the Singularity Institute, or The Institute?

Replies from: lukeprog
comment by lukeprog · 2011-07-22T21:03:17.616Z · LW(p) · GW(p)

I believe the org is in the process of shortening its name from Singularity Institute for Artificial Intelligence (SIAI) to simply Singularity Institute (SI).

Replies from: Document, Dorikka
comment by Document · 2011-07-31T20:38:36.963Z · LW(p) · GW(p)

I'm irrationally slightly annoyed to have "wasted" the time it took to train my brain to see "superintelligence" instead of "Singularity Institute" for SI back when it was used that way.

comment by Dorikka · 2011-07-23T06:34:18.871Z · LW(p) · GW(p)

Isn't the 'Artificial Intelligence' part the most important?

ETA: Seeing as it's trying to reduce existential risks from AI first and foremost.

Replies from: Zack_M_Davis, knb
comment by Zack_M_Davis · 2011-07-23T16:59:06.596Z · LW(p) · GW(p)

(Disclaimer: I don't speak for SingInst, nor am I presently affiliated with them.)

But recall that the old name was "Singularity Institute for Artificial Intelligence," chosen before the inherent dangers of AI were understood. The unambiguous for is no longer appropriate, and "Singularity Institute about Artificial Intelligence" might seem awkward.

I seem to remember someone saying back in 2008 that the organization should rebrand as the "Singularity Institute For or Against Artificial Intelligence Depending on Which Seems to Be a Better Idea Upon Due Consideration," but obviously that was only a joke.

comment by knb · 2011-07-23T07:55:38.338Z · LW(p) · GW(p)

Also, "SI" is pretty commonly used as an abbreviation for Sports Illustrated and the International System of Units.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2011-07-23T15:04:52.715Z · LW(p) · GW(p)

(I think SingInst is a better abbreviation now, as opposed to SIAI or SI.)

Replies from: Rain
comment by Rain · 2011-07-24T15:16:55.495Z · LW(p) · GW(p)

Yeah, and we can use 'The Institute' as a threshold warning sign that things have gone too far.

Replies from: Baughn
comment by Baughn · 2011-07-26T14:38:54.152Z · LW(p) · GW(p)

As in, once random people call it "the institute" we know it's probably time for the singularity?

comment by cousin_it · 2011-07-24T23:24:06.375Z · LW(p) · GW(p)

I really hope SingInst weakens its policy of secrecy and ensures that the research output of the new associates is made public. I don't care about publishing it in "proper journals", but please please please put everything online.

Replies from: steven0461
comment by steven0461 · 2011-07-24T23:35:40.774Z · LW(p) · GW(p)

Because you think doing so would reduce expected existential disaster, or because you want to read the material?

Replies from: cousin_it
comment by cousin_it · 2011-07-24T23:41:33.300Z · LW(p) · GW(p)

Because I want to read the material, I want to build upon it, and I want to see other people build upon it. For example, Wei Dai is not on the list of new associates. Do you agree that hiding the material from him will likely slow down progress?

Replies from: steven0461
comment by steven0461 · 2011-07-25T00:15:49.536Z · LW(p) · GW(p)

Sure, but to make a good decision you need to weigh upsides against downsides.

Replies from: cousin_it
comment by cousin_it · 2011-07-25T14:33:33.274Z · LW(p) · GW(p)

It's true that publishing the material can hasten the arrival of unfriendly AI, but it can also give the world a chance where it had none. If the problem of Friendliness is hard enough that SingInst folks can't generate all the required insights by themselves before unfriendly AI arrives, then secrecy has negative expected utility. Looking at the apparent difficulty of the problem and the apparent productivity of SingInst over the 10 years of its existence, that seems to me to be the case. Eliezer believes the solution is just a handful of insights away, but I don't see why.

comment by Manfred · 2011-07-23T18:59:43.417Z · LW(p) · GW(p)

Huh, Dewey's "Learning what to value" paper didn't cite Eliezer's CFAI. I'm glad he's doing a good job of sharing his paper at the AGI conference - was it published anywhere notable?

comment by Rain · 2011-07-22T17:06:09.679Z · LW(p) · GW(p)

And they have an SIAI credit card now. Does it have fees?

Edit: Answered my own question.