A Brief Summary of Recent Happenings at the Singularity Institute

post by MichaelAnissimov · 2011-12-11T20:59:46.344Z · LW · GW · Legacy · 10 comments

Posted at http://singinst.org/blog/2011/12/11/a-brief-summary-of-recent-happenings-at-the-singularity-institute/

The Singularity Summit was a huge success. We raised over $300,000, one-third of that a donation from Jaan Tallinn, another third from a variety of generous donors who gave during or immediately after the Summit, and a third from ticket sales. We have roughly $500,000 in the bank, while annual payroll expenses are about $350,000. We almost have enough money to make it through the next year without adding any research staff, though we definitely would like to add additional researchers. In the meantime, our (unpaid) Research Associates team has been growing, and is tackling a variety of projects.

Since the Singularity Summit in October, our President Michael Vassar moved on to help found Personalized Medicine, a company we are all excited about, and Luke Muehlhauser was appointed Executive Director. Luke answered questions about these changes and our future plans in a recent video Q&A. I encourage you to watch it, and submit additional questions you may have about the Singularity Institute.

We have seven staff at this time: Luke, Michael Anissimov, Anna Salamon, Carl Shulman, Amy Willey, Louie Helm, and Eliezer Yudkowsky. Our internal collaboration has increased, we're keeping work logs, and we regularly eat dinner together. It's becoming more of a family.

What are we all doing? Luke and Michael are working on a new website for the Singularity Institute. Amy and Luke are working hard to expand the Singularity Summit brand and bring the Summit to "the next level." Anna and Eliezer are building a curriculum for the new Rationality Org, which we hope to spin off as a separate organization from Singularity Institute sometime next year. Luke, Anna, and Carl have been working on a variety of research papers, and Carl has also been working with the fast-growing "optimal philanthropy" movement, which is now poised to direct substantial funds over the next few years to reducing existential risks. Louie is working on donor relations, fundraising, recruiting, operations, and has also contributed to some of our forthcoming research articles.

In the coming year, we look forward to improving communication and transparency with our supporters, and to increasing the rate of our published research output.

10 comments


comment by turchin · 2011-12-11T22:01:41.634Z · LW(p) · GW(p)

What are you doing about FAI?

Replies from: lukeprog
comment by lukeprog · 2011-12-12T09:33:06.024Z · LW(p) · GW(p)

At the moment, our primary efforts on FAI are devoted to working toward writing an Open Problems in Friendly Artificial Intelligence document, so that the whole world can be working on the technical sub-problems of Friendly AI rather than just a small group mostly at the Singularity Institute and FHI.

comment by shminux · 2011-12-11T21:39:07.591Z · LW(p) · GW(p)

the new Rationality Org, which we hope to spin off as a separate organization from Singularity Institute sometime next year

That's an excellent idea, separating rationality from existential risk studies. I wonder how the new organization would look.

comment by Normal_Anomaly · 2011-12-11T22:00:40.515Z · LW(p) · GW(p)

our (unpaid) Research Associates team has been growing, and is tackling a variety of projects.

Luke, Anna, and Carl have been working on a variety of research papers,

some of our forthcoming research articles.

Can we get more information on these? Topics and expected completion times especially! ETA: and how they will contribute to eventually creating an FAI! (Thanks, turchin.)

Replies from: Bugmaster
comment by Bugmaster · 2011-12-12T10:54:18.210Z · LW(p) · GW(p)

I was going to ask a question exactly like this, so: upvoted.

This article makes it seem that the SIAI's primary objective is to extend community outreach in order to grow the SIAI. Thus, the SIAI is making itself exponentially more powerful, and is therefore rapidly approaching the Singularity event horizon. I sure hope the SIAI is Friendly, or we're all doomed.

comment by khafra · 2011-12-12T16:00:57.192Z · LW(p) · GW(p)

Speaking of Personalized Medicine, have they given the website to Hacker News for a good working-over yet? HN usually assumes Crocker's Rules under such circumstances, and has a lot of experience at maximizing the effectiveness of a landing page. I ask partially because it seems to me to be missing some info and a call to action--as a patient, can/should I ask my doctor to sign up with them?

comment by JoshuaZ · 2011-12-11T22:22:58.007Z · LW(p) · GW(p)

Luke and Michael are working on a new website the Singularity Institute

Missing word here.

comment by Bruno_Coelho · 2011-12-12T02:26:50.084Z · LW(p) · GW(p)

That transparency is something I'm waiting for. The SI is part of something bigger: the reduction of existential risks and the spread the memes. Maybe along the road some excess of agreement will arise, but I think it is inevitable.

Replies from: Normal_Anomaly
comment by Normal_Anomaly · 2011-12-12T21:41:41.909Z · LW(p) · GW(p)

Is English your second language? I find I am having trouble parsing your comment. Can you explain what you mean by "the spread the memes"?

comment by Alexei · 2011-12-12T03:46:59.706Z · LW(p) · GW(p)

Typo: "to increasing the rate"