Singularity Institute is now Machine Intelligence Research Institute

post by Kaj_Sotala · 2013-01-31T08:25:45.762Z · LW · GW · Legacy · 96 comments

http://singularity.org/blog/2013/01/30/we-are-now-the-machine-intelligence-research-institute-miri/

As Risto Saarelma pointed out on IRC, "Volcano Lair Doom Institute" would have been cooler, but this is pretty good too. As the word "Singularity" has pretty much lost its meaning, it's better to have a name that doesn't give a new person all kinds of weird initial associations as their first impression. And "Machine Intelligence Research Institute" is appropriately descriptive while still being general enough.

96 comments

Comments sorted by top scores.

comment by Xachariah · 2013-01-31T11:10:40.347Z · LW(p) · GW(p)

The new acronym for the Singularity Institute is MIRI.

The first Google hit is the Wikipedia page for the Star Trek: TOS episode Miri (S1E8). It's about how 90% of the population of not-Earth was destroyed by an existential threat, leaving nothing but irrational children. The crew find themselves under a death sentence from this threat and try to find a solution, but they need the children's help. However, the children think themselves immune and immortal and won't assist. In the last seconds, the crew manages to convince the children that the existential threat cannot be ignored and must be solved or the kids will eventually die too. With their help, the crew saves the day and everyone lives happily ever after. Also, the episode was so ahead of its time that even though it was reviewed as excellent, it got so many complaints that it was not rebroadcast for 20 years.

I think my symbolism detector just pegged off the charts then exploded.

Replies from: lukeprog, wuncidunci, lukeprog, MugaSofer
comment by lukeprog · 2013-02-26T20:55:16.795Z · LW(p) · GW(p)

For the record, we totally didn't know this before you posted this comment. I remember seeing the Google search result but didn't read the plot summary on Wikipedia.

Replies from: Kawoomba
comment by Kawoomba · 2013-02-26T21:07:10.001Z · LW(p) · GW(p)

You should still replace his broken symbolism detector, if only symbolically.

comment by wuncidunci · 2013-01-31T14:44:17.483Z · LW(p) · GW(p)

When I searched, the first hit was the Malaysian town called Miri. Looks like an example of filter bubbles.

Replies from: None, Xachariah, FiftyTwo
comment by Xachariah · 2013-01-31T15:56:34.703Z · LW(p) · GW(p)

What do you get when you use incognito mode? I checked with that and got the same Star Trek result. I think incognito pops Google's filter bubble, although I'm not certain.

Though other search engines do give me the Malaysian town.

Replies from: wuncidunci
comment by wuncidunci · 2013-01-31T16:00:38.433Z · LW(p) · GW(p)

The same, though Star Trek comes up second. Google uses a lot of other info about your computer to determine the results (like IP address and browser details).

Replies from: Xachariah
comment by Xachariah · 2013-01-31T16:16:35.911Z · LW(p) · GW(p)

I had overlooked that I was trying to hide from Google using their own software. Of course they wouldn't have provided it if it worked against them. Silly of me in retrospect.

A handful of proxies verify that it is indeed Malaysia first and Star Trek second.

comment by FiftyTwo · 2013-01-31T15:15:37.532Z · LW(p) · GW(p)

Same, followed by "Mid-Infrared Instrument".

Replies from: gwern
comment by gwern · 2013-01-31T15:43:48.835Z · LW(p) · GW(p)

Funny, I get Star Trek, Malaysia, language, Star Trek, musical instructor, and only hit #6 is yours.

comment by lukeprog · 2013-02-27T05:26:29.542Z · LW(p) · GW(p)

Near the end of the episode, Kirk is trying to persuade the irrational children to help him save the planet, and the children just keep yelling "Blah Blah Blah!"

Kirk says:

No "blah blah blah"! Because... if you don't help us, there won't be any games anymore. There won't be anything. Nothing... Forever and ever...

...I'm begging you: let me help you! Or there won't be anything left at all.

comment by MugaSofer · 2013-02-01T13:08:44.101Z · LW(p) · GW(p)

I knew I'd heard that name somewhere ... suddenly feel a lot more respect for whoever picked it.

comment by Mitchell_Porter · 2013-01-31T16:19:04.965Z · LW(p) · GW(p)

MIRI's number-one goal will be the discovery of a Consequentialist Anticipatory Logic that can save the world (codename, MIRI-CAL).

comment by Larks · 2013-01-31T10:20:01.755Z · LW(p) · GW(p)

I think this is a good change. Bravo!

It will be hard not to refer to "Singinst" anymore, as it did trip off the tongue, but I guess "The Singularity Institute for or against Artificial Intelligence, depending on which seems like a better idea at the time" was never really on the cards.

Replies from: cody-bryce, Benito
comment by cody-bryce · 2013-01-31T14:57:36.444Z · LW(p) · GW(p)

There is an important tradition of people using the former names of things for decades.

comment by Ben Pace (Benito) · 2013-02-01T06:59:26.424Z · LW(p) · GW(p)

We could call it the Machriarchy.

(MACHine intelligence reseARCH Institute, and just stick in the 'I')

comment by Exiles · 2013-01-31T12:58:25.898Z · LW(p) · GW(p)

We wish, Dmytry.

comment by Rain · 2013-01-31T13:43:45.406Z · LW(p) · GW(p)

I like it.

Does this mean I have to put both on my tax forms for donations this year?

Replies from: lukeprog, None
comment by lukeprog · 2013-01-31T20:55:25.388Z · LW(p) · GW(p)

We'll be sure to inform all 2013 donors of the proper procedure.

comment by [deleted] · 2013-01-31T14:39:51.098Z · LW(p) · GW(p)

And what about PayPal monthlies? Will they redirect properly or do I have to exert agency?

Replies from: lukeprog
comment by lukeprog · 2013-01-31T20:55:41.323Z · LW(p) · GW(p)

We'll be in touch with everyone about that when the time comes.

comment by Ben Pace (Benito) · 2013-01-31T10:54:41.312Z · LW(p) · GW(p)

This is an interesting example of how changing definitions doesn't change reality. 'Singularity Institute for Artificial Intelligence' sounded all future-y to me, and 'Machine Intelligence Research Institute' sounds all technical and intelligent (silly stereotypes), but in reality they've not changed. Still the same people, doing the same things.

Gonna have to change the site name, though...

Replies from: private_messaging
comment by private_messaging · 2013-02-01T08:48:55.199Z · LW(p) · GW(p)

The Machine Intelligence Research Institute indeed sounds like some older guys, maybe cranky but well accomplished in the field, working on something... That is, until you find the Star Trek reference.

comment by wedrifid · 2013-01-31T08:57:44.251Z · LW(p) · GW(p)

And "Machine Intelligence Research Institute" is appropriately descriptive while still being general enough.

MIRI. Machine Intelligence Research Institute.

Adequate. Not especially inspiring but I can't think of anything better either. It is certainly better than "Singularity Institute". (Good change!)

comment by novalis · 2013-01-31T22:23:54.537Z · LW(p) · GW(p)

Don't forget to redo your credit card design, when you get around to it.

comment by timtyler · 2013-01-31T10:46:47.265Z · LW(p) · GW(p)

"Machine Intelligence" is my preferred term. "Singularity" seems like pseudoscientific terminology to me.

comment by buybuydandavis · 2013-01-31T12:05:41.987Z · LW(p) · GW(p)

it's better to have a name that doesn't give a new person all kinds of weird initial associations as their first impression.

Yeah, though I buy into much of what Kurzweil has to say, Singularity has always rubbed me the wrong way - too much of Omega Immanentizing the Eschaton for me.

Is there any other snappy term out there for the idea that we're headed for very big changes through exponential performance improvement in lots of technologies?

Replies from: Eugine_Nier
comment by Eugine_Nier · 2013-02-01T03:29:49.492Z · LW(p) · GW(p)

Yeah, though I buy into much of what Kurzweil has to say, Singularity has always rubbed me the wrong way - too much of Omega Immanentizing the Eschaton for me.

Changing the name doesn't change the fact that they're trying to Immanentize the Eschaton.

comment by A1987dM (army1987) · 2013-01-31T11:14:58.082Z · LW(p) · GW(p)

Why "Machine" rather than "Artificial"?

Replies from: Kaj_Sotala
comment by Kaj_Sotala · 2013-01-31T12:59:11.025Z · LW(p) · GW(p)

To me, "Machine Intelligence" sounds less worn than "Artificial Intelligence", and also seems to more strongly imply that they're talking about general intelligence rather than narrow AI. But I don't know whether those were the actual reasons.

Replies from: amacfie, army1987, FiftyTwo, David_Gerard
comment by amacfie · 2013-01-31T17:46:49.864Z · LW(p) · GW(p)

I was under the impression that this use of the word "machine" was archaic -- it was used decades ago for naming things like machine learning, machine translation, and the Association for Computing Machinery. I don't immediately see why a more familiar term wasn't used.

Replies from: loup-vaillant
comment by loup-vaillant · 2013-02-01T10:09:00.299Z · LW(p) · GW(p)

Possibly for the "M". Imagine "AIRI" instead of "MIRI".

comment by A1987dM (army1987) · 2013-01-31T17:31:31.261Z · LW(p) · GW(p)

"Machine Intelligence" sounds less worn than "Artificial Intelligence"

It does, but why is “worn” a bad thing in this context? Wouldn't you want a familiar-sounding phrase?

and also seems to more strongly imply that they're talking about general intelligence rather than narrow AI

I get the reverse impression, probably because “artificial intelligence” reminds me of science fiction, whereas “machine intelligence” reminds me of Google Translate and self-driving cars.

comment by FiftyTwo · 2013-01-31T15:37:17.801Z · LW(p) · GW(p)

Agreed. It's also narrower: you could plausibly argue that lots of things were 'artificial intelligence' (e.g. bioengineered neural goop), but 'machine' is closer to what we're actually talking about.

comment by David_Gerard · 2013-01-31T13:03:53.148Z · LW(p) · GW(p)

The first hit on "AIRI" isn't as good.

comment by RomeoStevens · 2013-01-31T08:54:35.690Z · LW(p) · GW(p)

Is it an acronym or initialism?

Replies from: Baughn
comment by Baughn · 2013-01-31T15:58:40.268Z · LW(p) · GW(p)

I'm vaguely suspecting a retronym, based on that Star Trek episode.

comment by Tuxedage · 2013-01-31T12:57:50.814Z · LW(p) · GW(p)

I personally dislike the change, but I trust that you guys have changed your names for a reason. I think I may be reacting negatively due to the nostalgia factor.

Replies from: Qiaochu_Yuan, Alexei
comment by Qiaochu_Yuan · 2013-01-31T19:37:48.269Z · LW(p) · GW(p)

Reversal test: if it were always called MIRI and just now decided to change its name to SIAI, how would you feel about it? (This isn't quite the right test because maintaining the same name over time does have some value, but it might help.)

comment by JoshuaFox · 2013-02-03T09:20:41.981Z · LW(p) · GW(p)

Good.

The name "Singularity Institute for Artificial Intelligence" seemed clunky to me from the moment I encountered it.

Too long. Two parts that don't work together, a first part with social acceptability issues, and a second part which has been taboo in leading tech circles since 1992.

It took me about half a year before I stopped typing siai.org when looking for the Institute's site.

Not to mention that SU took over the mindspace, and that SI occasionally strayed from the area implied by its name.

So, good job! I like it.

comment by Document · 2015-04-11T04:21:54.472Z · LW(p) · GW(p)

I wonder if anyone suggested the Council for Understanding Logic and Technology.

comment by curiousepic · 2013-02-28T19:17:59.998Z · LW(p) · GW(p)

I just realized that "MIRI" is (perhaps intentionally) evocative of the word "mirror", which is all kinds of suitable.

comment by bentarm · 2013-02-03T16:48:28.526Z · LW(p) · GW(p)

I notice that http://www.miri.org is very definitely not a placeholder for a new Singularity Institute page. Have you managed to acquire it?

(miri.com seems as though it should be available, but isn't exactly appropriate. Maybe better than nothing.)

Replies from: Kaj_Sotala
comment by Kaj_Sotala · 2013-02-03T20:13:50.363Z · LW(p) · GW(p)

Have you managed to acquire it?

Nope.

comment by JoshuaFox · 2013-01-31T19:28:42.569Z · LW(p) · GW(p)

Now, all we need is a replacement for "Singularitarian" ... this time, one that people can get right when they try to repeat it.

Replies from: John_Maxwell_IV, MichaelAnissimov, MichaelAnissimov, handoflixue, CronoDAS
comment by John_Maxwell (John_Maxwell_IV) · 2013-02-01T07:44:12.770Z · LW(p) · GW(p)

What happened to futurist, transhumanist, extropian?

comment by MichaelAnissimov · 2013-02-03T02:54:21.728Z · LW(p) · GW(p)

Transhumanist

Replies from: None
comment by [deleted] · 2013-02-03T05:46:52.356Z · LW(p) · GW(p)

Already exists, and not all transhumanists are singularitarians.

comment by MichaelAnissimov · 2013-02-03T02:53:17.076Z · LW(p) · GW(p)

Transhumanist

Replies from: JoshuaFox
comment by JoshuaFox · 2013-02-03T09:12:30.613Z · LW(p) · GW(p)

Singularitarians are, IMHO, a subtype of Transhumanists who are (1) focused on the Intelligence Explosion rather than nanotech, life extension, cryopreservation, cyborg art, or any other Transhumanist area, and (2) do something about it.

Of course, people can be Singularitarians and also other things as well, and people can believe something is a good idea without doing anything about it, but the above definition seems significant enough to be worth having a name for.

You could also potentially be a Singularitarian but not a Transhumanist, if you focused on AGI safety and even Friendliness, without caring about the entire H+ memeplex. Though there may be nobody like this today, I can imagine a future in which a game theorist or other scientist enters the field out of a combined desire to do good and to boost their career, but is not interested in all the other stuff.

comment by handoflixue · 2013-01-31T22:26:28.035Z · LW(p) · GW(p)

Mirian seems easy to say :)

Replies from: John_Maxwell_IV
comment by John_Maxwell (John_Maxwell_IV) · 2013-02-01T07:42:28.246Z · LW(p) · GW(p)

That's a terrible idea if they want to maintain credibility as a serious independent bunch of academic types.

Replies from: handoflixue
comment by handoflixue · 2013-02-01T21:48:29.201Z · LW(p) · GW(p)

I had assumed the question was more aimed towards LessWrong, not serious academic usage. I'd expect academic references to be in the form "I work for MIRI", not cutesy shorthand.

Replies from: John_Maxwell_IV
comment by John_Maxwell (John_Maxwell_IV) · 2013-02-02T08:19:41.015Z · LW(p) · GW(p)

Still, no one walks around calling themself a CSAILian. If there are people calling themselves Mirians, even if those people aren't directly affiliated with MIRI, that could hurt MIRI's credibility, I suspect, because it would make them seem unusual relative to other serious research groups.

comment by CronoDAS · 2013-02-01T03:51:24.189Z · LW(p) · GW(p)

"Phyg" member! ;)

Replies from: None
comment by [deleted] · 2013-02-01T04:05:29.588Z · LW(p) · GW(p)

"phygger"

Replies from: Nornagest
comment by Nornagest · 2013-02-01T05:08:03.371Z · LW(p) · GW(p)

"Dirty phygger"?

Replies from: fubarobfusco
comment by fubarobfusco · 2013-02-01T05:14:38.584Z · LW(p) · GW(p)

"Phyglet".

Replies from: Nornagest
comment by Nornagest · 2013-02-01T05:22:37.562Z · LW(p) · GW(p)

Isn't the term itself kind of a phyg leaf?

Replies from: fubarobfusco
comment by fubarobfusco · 2013-02-01T05:54:43.677Z · LW(p) · GW(p)

I'm sure that's just a phygment of your imagination.

Replies from: BlazeOrangeDeer
comment by BlazeOrangeDeer · 2013-02-01T08:57:56.066Z · LW(p) · GW(p)

I don't think I've seen a pun thread on lesswrong before... Perhaps it's one of those things that should stay on reddit.

comment by Andreas_Giger · 2013-01-31T15:34:51.535Z · LW(p) · GW(p)

Looks like an attempt to get rid of the negative image associated with the name Singularity Institute. I wonder if it isn't already too late to take PR seriously.

Replies from: gwern
comment by gwern · 2013-01-31T15:45:41.550Z · LW(p) · GW(p)

Looks like an attempt to get rid of the negative image associated with the name Singularity Institute.

From OP:

When Singularity University (SU) acquired the Singularity Summit from us in December, we also agreed to change the name of our institute to avoid brand confusion between the Singularity Institute and Singularity University.

I'm not sure this name change is a good idea or worth whatever SU offered (or that there was a real brand issue), but there apparently was some other motivation than 'SI now has embarrassing connotations'.

Replies from: Kaj_Sotala, David_Gerard
comment by Kaj_Sotala · 2013-01-31T16:17:51.960Z · LW(p) · GW(p)

SU and SI kept getting confused with each other all the time.

Replies from: ciphergoth, gwern
comment by Paul Crowley (ciphergoth) · 2013-02-03T08:12:18.242Z · LW(p) · GW(p)

Also, when you're trying to explain that there's a gigantic difference between you and another organisation with a similar-sounding name, you can sound a little like the People's Front of Judea.

comment by gwern · 2013-01-31T17:04:49.380Z · LW(p) · GW(p)

By whom? I think the only time I personally saw that happen online was in one British newspaper article.

Replies from: Alex_Altair, lukeprog, Alicorn, ciphergoth
comment by Alex_Altair · 2013-01-31T17:34:27.287Z · LW(p) · GW(p)

My impression is that anyone who has ever heard of Singularity University doesn't even have it in their hypothesis space that you mean something different when you say Singularity Institute.

Replies from: Kevin
comment by Kevin · 2013-02-03T06:02:24.951Z · LW(p) · GW(p)

Yup.

Even when they do have it in their hypothesis space, it still gets mangled. I recently got a follow-up email from someone who still thought I was Singularity University. I had briefly explained to him how SU had acquired the Singularity Summit from us, and his follow-up email said "now that you have acquired the Singularity Summit, you may be interested in my product..."

comment by lukeprog · 2013-01-31T20:57:47.832Z · LW(p) · GW(p)

By whom?

Almost every time I spoke to anyone who wasn't deeply familiar with either SU or SI. Including almost every press person.

comment by Alicorn · 2013-01-31T18:30:40.418Z · LW(p) · GW(p)

Rudi Hoffman confused them when I sought a quote from him and mentioned Singinst. And him you'd expect to move in the right circles to know the difference.

comment by Paul Crowley (ciphergoth) · 2013-02-03T08:10:59.966Z · LW(p) · GW(p)

When I stayed with a friend in the Bay Area, I was confused that he said he knew loads of people in SingInst, but kept naming people I'd never heard of - and guess why!

comment by David_Gerard · 2013-01-31T17:47:43.511Z · LW(p) · GW(p)

There is a real brand issue. When I say "Singularity Institute" to people down the pub, the ones who've heard the word go "ah, Kurzweil!" (I was trying to explain this site I like called LessWrong.)

Replies from: Rain
comment by Rain · 2013-01-31T17:56:06.196Z · LW(p) · GW(p)

I told someone at work, and they said, "Oh, like on that Fringe episode [about Kurzweillian uploading]."

comment by pleeppleep · 2013-01-31T14:27:40.857Z · LW(p) · GW(p)

Kinda awkward to say aloud. I think Institute for the Research of Machine Intelligence would sound better. Minor nitpick.

Replies from: Kaj_Sotala, Kevin
comment by Kaj_Sotala · 2013-01-31T15:00:16.808Z · LW(p) · GW(p)

Really? To me, the extra words in "Institute for the Research of Machine Intelligence" feel redundant, and MIRI is better for being concise and to the point.

comment by Kevin · 2013-02-01T01:44:56.035Z · LW(p) · GW(p)

IRMI?

irm-y? Sounds like squirm. Or the name Erma.

comment by Shmi (shminux) · 2013-01-31T15:21:29.395Z · LW(p) · GW(p)

What MIRI really is, is the Institute for the study of emergent intelligences, whether machine, biological, hybrid or any other kind, but given how Eliezer dislikes the term emergence, I can see why EIRI would be a non-starter. Still, I like the new name better than the old one.

Replies from: hankx7787, Baughn, David_Gerard
comment by hankx7787 · 2013-02-09T16:33:30.214Z · LW(p) · GW(p)

"What exactly do you mean by ‘machine’, such that humans are not machines?" - Eliezer Yudkowsky

Replies from: shminux
comment by Shmi (shminux) · 2013-02-09T19:08:15.772Z · LW(p) · GW(p)

Good point. The Wikipedia description certainly covers humans.

comment by Baughn · 2013-01-31T15:53:46.810Z · LW(p) · GW(p)

There's a reason he doesn't like it...

I'm not entirely sure what your sentence means. Could you rewrite it to not use "emergence" (or define "emergence")?

Replies from: shminux
comment by Shmi (shminux) · 2013-01-31T16:40:24.937Z · LW(p) · GW(p)

The reason he does not like the term is that, as pointed out before, "emergence" is not an explanation of anything. However, it is an observational phenomenon: when you get a lot of simple things together, they combine in ways one could not foresee, and the resulting entities behave by rules not constructible from (but reducible to) those of the simple constituents. When you combine a lot of simple molecules, you get a solid, a liquid or a gas with properties you generally cannot infer without observing them first. When you get a group of people together, they start interacting in a priori unpredictable ways as they form a group. Once you observe the group behavior, you can often reduce it to that of its constituents, but a useful description is generally not in terms of the constituents, but in terms of the collective. For example, in thermodynamics people use gas laws and other macroscopic laws instead of Newton's laws.

I am guessing that one reason that the (friendly) machine intelligence problem is so hard is that intelligence is an emergent property: once you understand it, you can reduce it to interactions between neurons, but you cannot infer it from such interactions. And what's more, it's several layers above, given that intelligence evolved long after simpler neural processes got established.

Thus what MIRI is doing is studying the laws of an emergent structure (AI) without being able to observe the structure first, since it does not exist yet. This is like trying to deduce the behavior of a beehive by studying single cells. Even if you come up with some new "emergent" laws, it may well end up being more like a tree than a hive.

Replies from: drethelin, gjm
comment by drethelin · 2013-01-31T21:43:35.354Z · LW(p) · GW(p)

Emergence is a subset of the word Surprise. It's not meaningless, but you can't use it to usefully predict things you want to achieve with something, because it's equivalent to saying "If we put all these things together, maybe they'll surprise us in an awesome way!"

Replies from: shminux, timtyler
comment by Shmi (shminux) · 2013-01-31T22:07:44.912Z · LW(p) · GW(p)

Sort of. It is not surprising that incremental quantitative changes result in a qualitative change, but the exact nature of what emerges can indeed be quite a surprise. It is nevertheless useful to keep in mind the general pattern in order to not be blindsided by the fact of emergence in each particular case ("But... but... they are all nice people, I didn't expect them to turn into a mindless murderous mob!"). And to be ready to take action when the emergent entity hits the fan.

Replies from: Baughn, drethelin
comment by Baughn · 2013-01-31T22:28:10.007Z · LW(p) · GW(p)

Or in simpler terms, AI is a crapshoot.

comment by drethelin · 2013-02-01T01:32:15.419Z · LW(p) · GW(p)

Agreed. Like with surprises, you can try to be robust to them or agile enough to adapt.

comment by timtyler · 2013-02-01T00:10:45.102Z · LW(p) · GW(p)

If something is an emergent property, you can bet on it not being the sum of its parts. That has some use.

Replies from: loup-vaillant
comment by loup-vaillant · 2013-02-01T11:21:03.571Z · LW(p) · GW(p)

Aiming the tiny Friendly dot in AI-space is not one of them, though.

comment by gjm · 2013-02-03T17:47:13.404Z · LW(p) · GW(p)

Surely what MIRI would ideally like to do is to find a way of making intelligence not "emergent", so that it's easier to make something intelligent that behaves predictably enough to be classified as Friendly.

Replies from: shminux
comment by Shmi (shminux) · 2013-02-03T19:57:40.431Z · LW(p) · GW(p)

find a way of making intelligence not "emergent"

I don't believe that MIRI has been consciously paying attention to thwarting undesirable emergence, given that EY refuses to acknowledge it as a real phenomenon.

Replies from: gjm
comment by gjm · 2013-02-03T21:43:56.700Z · LW(p) · GW(p)

I fear we're at cross purposes. I meant not "thwart emergent intelligence" but "find ways of making intelligence that don't rely on it emerging mysteriously from incomprehensible complications".

Replies from: shminux
comment by Shmi (shminux) · 2013-02-03T22:22:27.026Z · LW(p) · GW(p)

Sure, you cannot rely on spontaneous emergence for anything predictable, as neural network attempts at AGI demonstrate. My point was that if you ignore the chance of something emerging, that something will emerge at a most inopportune moment. I see your original point, though. Not sure if it can be successful. My guess is that the best case is some kind of "controlled emergence", where you at least set the parameter space of what might happen.

comment by David_Gerard · 2013-01-31T17:49:26.290Z · LW(p) · GW(p)

First hit on EIRI. More appositely, EIRI exists.

Replies from: fubarobfusco, shminux
comment by fubarobfusco · 2013-02-01T00:06:53.853Z · LW(p) · GW(p)

... and here I was thinking of Masami Eiri from Serial Experiments Lain.

comment by Shmi (shminux) · 2013-01-31T17:57:59.676Z · LW(p) · GW(p)

Surely this is not your real objection. One can try EII or IEI or...

comment by Thomas · 2013-01-31T08:35:18.217Z · LW(p) · GW(p)

What if the Institute has lost its meaning and not the Singularity?