SIAI Fundraising

post by BrandonReinhart · 2011-04-26T08:35:22.242Z · LW · GW · Legacy · 120 comments

Contents

    Please refer to the updated document here: http://lesswrong.com/lw/5il/siai_an_examination/
    This version is an old draft.
  Overview
  Revenue
  Expenses
  Big Donors
  Officer Compensation
  Where to Start?
120 comments

Please refer to the updated document here: http://lesswrong.com/lw/5il/siai_an_examination/

This version is an old draft.

 

NOTE: The analysis here will be updated as people point out errors! I've tried to be accurate, but this is my first time looking at these (somewhat hairy) non-profit tax documents. Errors will be corrected as soon as I know of them! Please double-check and criticize this work so that it can improve.

Document History:

Todo:

Disclaimer:

Acting on gwern's suggestion in his Girl Scout Cookie analysis, here is a first pass at looking at SIAI funding, suggestions for a funding task-force, etc.

The SIAI's Form 990s are available at GuideStar and Foundation Center. You must register to access the files at GuideStar.

Work is being done in this Google Spreadsheet.

Overview

Notes:
Sometimes the listed end-of-year balances didn't match what the spreadsheet calculated:

Analysis:

Revenue

Revenue is composed of public support, program service revenue (events and conferences held, etc.), and investment interest. The "Other" category tends to include Amazon.com affiliate income and the like.

Analysis:

Expenses

Expenses are composed of grants, benefits, salaries & compensation, contracts, travel, program services, and an "Other" category (mostly administrative costs, usually itemized; check the source data).
The contracts column in the chart below includes legal and accounting fees; again, check the source data.

Analysis:

Big Donors

Analysis:

Officer Compensation

Analysis:
  • This graph needs further work to reflect the duration of officers' service.
  • From 2002 to 2005, Eliezer Yudkowsky received compensation in the form of grants from the SIAI for AI research.
  • Starting in 2006, all compensation for key officers is reported as salary instead of in the form of grants.
  • SIAI officer compensation has decreased in recent years.
  • Eliezer's base salary increased 20% in 2008 and then 7.8% in 2009 (see the compounding sketch after this list).
    • It seems reasonable to compare Eliezer's salary with that of professional software developers. Eliezer could make a fair amount more working in private industry as a software developer.
  • Both Yudkowsky and Vassar report working 60-hour work weeks.
  • It isn't indicated how the SIAI conducts performance reviews or salary adjustments.
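
A quick check on how those two raises compound (a minimal Python sketch; the 20% and 7.8% figures above are its only inputs, so the result is plain arithmetic, not a number from the filings):

    # Compound the two reported raises to get the total increase over 2008-2009.
    raise_2008 = 0.20   # 20% increase reported for 2008
    raise_2009 = 0.078  # 7.8% increase reported for 2009
    total_growth = (1 + raise_2008) * (1 + raise_2009) - 1
    print(f"Compound increase over the two years: {total_growth:.1%}")
    # -> Compound increase over the two years: 29.4%
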
Further Editorial Thoughts...

Prior to doing this investigation, I had some expectation that the Singularity Summit was a money-losing operation. I had an expectation that Eliezer probably made around $70k (programmer money, discounted for being paid by a non-profit). I figured the SIAI had a broader donor base. I was off base on all counts. I am not currently an SIAI supporter. My findings have greatly increased the probability that I will donate in the future. 

Overall, the allocation of funds strikes me as highly efficient. I don't know exactly how much the SIAI is spending on food and fancy tablecloths at the Singularity Summit, but I don't think I care: it's growing and it's nearly breaking even. An attendee can be quite confident that their fee covers their cost to the organization. If you go and contribute, you add pure value by your attendance.

At the same time, the organization has been able to expand services without draining the coffers. A donor can hold a strong expectation that the bulk of their donation will go toward actual work in the form of salaries for working personnel or events like the Visiting Fellows Program.

Eliezer's compensation is slightly more than I thought. I'm not sure what upper bound I would have balked at or would balk at. I do have some concern about the cost of recruiting additional Research Fellows. The cost of additional RFs has to be weighed against new programs like Visiting Fellows.

The organization appears to be managing its cash reserves well. It would be good to see the SIAI build up some asset reserves so that it could operate comfortably in years where public support dips or so that it could take advantage of unexpected opportunities.

The organization has a heavy reliance on major donor support. I would expect the 2010 filing to reveal a broadening of revenue and continued expansion of services, but I do not expect the organization to have become independent of big-donor support. Things are much improved from 2006, and without the initial support from Peter Thiel the SIAI would not be able to provide the services it has, but it would still be good to see the SIAI's operating capacity become larger than any one donor's annual contribution. It is important for Less Wrong to begin a discussion about broadening SIAI's revenue sources.

Where to Start?

There is low-hanging fruit to be found. The SIAI's annual revenue is well within the range where we could have a significant impact. These suggestions aren't all equally promising; they are just things that come to mind.

  • Grant Writing. I don't know a lot about it. Presumably a Less Wrong task force could investigate likely candidate grants, research proper grant writing methodology, and then apply for the grants. Academic members of Less Wrong who have applied for research grants would already have expertise in this area.
  • Software. There are a lot of programmers on Less Wrong. A task force could develop an application and donate the revenue to the SIAI.
  • Encouraging Donations. Expanding the base of donations is valuable. The SIAI is heavily dependent on donations from Peter Thiel. A task force could focus on methods of encouraging donations from new supporters big and small.
  • Prize Winning. There are prizes out there to be won. A Less Wrong task force could identify a prize and then coordinate a group to work towards winning it.
  • Crowd Source Utilization. There are sites devoted to crowd-sourced funding for projects. A task force could conceive of a project with the potential to generate more revenue than required to build it. Risk could be reduced through the use of crowd sourcing, with the excess revenue donated to the SIAI. (Projects don't have to be software; they could be fabricating an interesting device, a piece of art, or music.)
  • General Fund Raising Research. There are a lot of charities in the world. Presumably there are documented methods for growing them. A task force could attack this material and identify low-hanging fruit or synthesize new techniques.
I have more specific thoughts, but I want to chew on them a bit.

120 comments

Comments sorted by top scores.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2011-04-26T12:02:29.194Z · LW(p) · GW(p)

Note that it appears that Eliezer was paid $83,934 in 2009 as additional compensation for the completion of the Sequences.

Okay, that didn't happen. I got my standard salary in 2009, no more. I think my standard salary must've been put down as payment for the Sequences... or something; I don't know. But I didn't get anything but my standard salary in 2009 and $84K sounds right for the total of that salary.

Replies from: BrandonReinhart
comment by BrandonReinhart · 2011-04-26T16:45:14.621Z · LW(p) · GW(p)

Fixed.

The section that led me to my error was 2009 III 4c. The amount listed as expenses there is $83,934, while your salary is listed in 2009 VII Ad as $95,550. The text in III 4c says:

"This year Eliezer Yudkowsky finished his posting sequences on Less Wrong [...] Now Yudkowsky is putting together his blog posts into a book on rationality. [...]"

This is listed next to two other service accomplishments (the Summit and Visiting Fellows).

If I had totaled the program accomplishments section I would have seen that I was counting some money twice (and also noticed that the total in this field doesn't feed back into the main sheet's results).

Please accept my apology for the confusion.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2011-04-27T01:13:13.474Z · LW(p) · GW(p)

Hm. $95K still sounds too high, but if I recall correctly, owing to a screwup in our payments processor at that time, my salary for the month of January 2010 was counted into the 2009 tax year instead of 2010.

No apology is required; you wrote without malice.

comment by Wei Dai (Wei_Dai) · 2011-04-26T16:39:25.086Z · LW(p) · GW(p)

Am I the only one who is now curious how Eliezer spends the bulk of his disposable income? Is it to save for retirement in case the Singularity either doesn't occur, or occurs in a Hansonian way, despite his best efforts?

Replies from: Eliezer_Yudkowsky, Kevin
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2011-04-27T01:16:34.881Z · LW(p) · GW(p)

Large air-conditioned living space, healthy food, both for 2 people (myself and girlfriend). My salary is at rough equilibrium with my spending; I am not saving for retirement. The Bay Area is, generally speaking, expensive.

Replies from: Wei_Dai, ciphergoth, VNKKET, V_V
comment by Wei Dai (Wei_Dai) · 2011-04-27T05:09:24.898Z · LW(p) · GW(p)

Wow, my intuition was rather off on what $95,550 in compensation means for someone living in the Bay Area. Here are some actual calculations for others who are similarly curious. (There are apparently quite a few of us, judging from the votes on my comment.)

Assuming salary is 75% of compensation, that comes to $71,662. $4,557 in CA state tax. $11,666 federal income tax. $5,482 FICA tax. So: $49,957 in after-tax income.
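
(A minimal Python sketch reproducing the arithmetic above; the 75% split and the individual tax amounts are this comment's own estimates, not values recomputed from 2009 tax tables.)

    # Reproduce the back-of-the-envelope arithmetic above.
    compensation = 95550
    salary = int(compensation * 0.75)   # assumed 75% of compensation -> 71662
    taxes = 4557 + 11666 + 5482         # CA state + federal income + FICA estimates
    after_tax = salary - taxes
    print(f"salary ${salary:,}; after-tax income ${after_tax:,}")
    # -> salary $71,662; after-tax income $49,957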

For comparison, my wife and I (both very frugal) spend about $35,000 (excluding taxes and savings) per year. Redwood City's rent is apparently double the rent in our city, which perfectly accounts for the additional $15,000.

Eliezer, you might want to consider getting married, in which case you can file your taxes jointly and save about $6,000 per year (assuming your girlfriend has negligible income).

comment by Paul Crowley (ciphergoth) · 2011-04-27T10:36:07.735Z · LW(p) · GW(p)

You're not saving for retirement because you think that, one way or another, it's unlikely you'll be collecting that money?

comment by VNKKET · 2011-07-02T07:26:24.899Z · LW(p) · GW(p)

Is the Singularity Institute supporting her through your salary?

I hope you're not too put out by the rudeness of this question. I've decided that I'm allowed to ask because I'm a (small) donor. I doubt your answer will jeopardize my future donations, whatever it is, but I do have preferences about this.

(Also, it's very good to hear that you're taking health seriously! Not that I expected otherwise.)

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2011-07-02T17:31:56.182Z · LW(p) · GW(p)

My salary is my own, to do with as I wish. I'm not put out by the rudeness, per se, but I will not entertain further questions along these lines - it is not something on which I'm interested in having other people vote.

comment by V_V · 2012-12-18T00:40:50.058Z · LW(p) · GW(p)

Quiz: Who said it?

I would be asking for more people to make as much money as possible if they’re the sorts of people who can make a lot of money and can donate a substantial amount fraction, never mind all the minimal living expenses, to the Singularity Institute. This is crunch time. This is crunch time for the entire human species. […] and it’s crunch time not just for us, it’s crunch time for the intergalactic civilization whose existence depends on us. I think that if you’re actually just going to sort of confront it, rationally, full-on, then you can’t really justify trading off any part of that intergalactic civilization for any intrinsic thing that you could get nowadays

Replies from: hairyfigment, wedrifid, MugaSofer
comment by hairyfigment · 2012-12-18T04:08:19.216Z · LW(p) · GW(p)

Who said:

There are maybe two or three people in the entire world who spend only the bare possible minimum on themselves, and contribute everything else to a rationally effective charity. They have an excuse for not signing up. No one else does.

comment by wedrifid · 2012-12-18T01:34:40.528Z · LW(p) · GW(p)

Context please.

Replies from: hairyfigment
comment by hairyfigment · 2012-12-18T04:11:08.346Z · LW(p) · GW(p)

By a startling coincidence, V_V's editing seems deliberately deceptive:

So for example, if you’re making money on Wall Street, I’m not sure you should be donating all but minimal living expenses because that may or may not be sustainable for you. And in particular if you’re, say, making 500,000 dollars a year and you’re keeping 50,000 dollars of that per year, which is totally not going to work in New York, probably, then it’s probably more effective to double your living expenses to 100,000 dollars per year and have the amount donated to the Singularity Institute go from 450,000 to 400,000 when you consider how much more likely that makes it that more people follow in your footsteps. That number is totally not realistic and not even close to the percentage of income donated versus spent on living expenses for present people working on Wall Street who are donors to the Singularity Institute.

Question 5, if you don't want to paste into Find. Ph33r my drunken library school graduate skillz!

Replies from: V_V
comment by V_V · 2012-12-18T10:14:35.763Z · LW(p) · GW(p)

I didn't edit it myself, I pasted it from there

Anyway, the editing doesn't seem to be particularly deceptive: apparently there is a special clause for Wall Street bankers that allows them to trade the bright future of our intergalactic (sic) descendants for their immediate personal luxuries.

comment by MugaSofer · 2013-01-09T12:58:07.764Z · LW(p) · GW(p)

... are you implying that Eliezer is wrong to be working to save the world, because he could earn significantly more money and pay others to do better? How much do you think his current "crunch time" efforts would cost?

Replies from: V_V
comment by V_V · 2013-01-10T17:51:57.415Z · LW(p) · GW(p)

No. Yudkowsky is paid by the SI, hence he could effectively donate to the SI just by accepting a lower salary.

He claims that any single dollar of extra funding the SI has could make the difference between an exceptionally positive scenario (Friendly superhuman AI, intergalactic civilization, immortality, etc.) and an exceptionally negative one (evil robots who kill us all). He asks other people to forfeit a substantial part of their income to secure this positive scenario and avert the negative one. He claims to be working to literally save the world, and therefore to be working on his very own survival.

And then, he draws from the SI resources that could be used to hire additional staff and do more research, just to support his lifestyle of relative luxury.

He could live in a smaller house; he could move himself and the SI to a less expensive area (Silicon Valley is one of the most expensive areas in the world, and there doesn't seem to be a compelling reason for the SI to be located there). If he is honest about his claimed beliefs, if he "confronted them, rationally, full-on", how could he possibly be trading any part of the bright future of our (and his) intergalactic descendants, how could he be trading the chance of his own survival, for a nice house in an expensive neighborhood?

I'm not suggesting he should move to a slum in Calcutta and live on a subsistence wage, but he certainly doesn't seem to be willing to make any sacrifice for what he claims to believe, especially when he asks other people to make such sacrifices.

Of course, I'm sure he can come up with a thousand rationalizations for that behavior. He could say that a lifestyle any less luxurious than his current one would negatively affect the productivity of his supremely important work. I won't buy it, but everyone is entitled to their opinion.

Replies from: gwern, MugaSofer
comment by gwern · 2013-01-10T18:48:00.115Z · LW(p) · GW(p)

he could move himself and the SI to a less expensive area (Silicon Valley is one of the most expensive areas in the world, and there doesn't seem to be a compelling reason for the SI to be located there)

There are compelling reasons to be there: it is the epicenter of the global tech world. You will not find a place more interested in these topics, with more potential donors, critics, potential employees, etc.

This is the same reasoning for why the WikiMedia Foundation moved from St Petersburg, Florida to San Francisco back in 2007 or 2008 or so: that they would be able to recruit more talent and access more big donors.

I was a little disgusted at the high cost of living since I thought the WMF's role ought to be basically keeping the servers running and it was a bad idea to go after more ambitious projects and the big donors to pay for those projects. But sure enough, a year or two later, the multi-million dollar donations and grants began to come in. Or notice that Givewell is still located in NYC, even after spending a while working out of India with a much lower cost of living (Mumbai, not your Calcutta, but close enough).

I still think the projects themselves are largely wasted, and that the WMF should have been obsessed with reducing editor attrition & deletionism rather than SWPL projects like African DVDs and so I stopped donating long ago; but the move itself performed as advertised.

Replies from: V_V
comment by V_V · 2013-01-11T09:50:33.257Z · LW(p) · GW(p)

There are compelling reasons to be there: it is the epicenter of the global tech world. You will not find a place more interested in these topics, with more potential donors, critics, potential employees, etc.

This is the same reasoning for why the WikiMedia Foundation moved from St Petersburg, Florida to San Francisco back in 2007 or 2008 or so: that they would be able to recruit more talent and access more big donors.

AFAIK the SI doesn't do software development or direct computer science research. Other than operating Less Wrong, their main outputs seem to be philosophical essays and some philosophical publications, plus the annual Singularity Summits (which make sense to hold in Silicon Valley, but don't have to be physically close to the SI's main location). A cursory look at the SI team pages suggests that most of the staff are not CompSci professionals, and many of them didn't get their education or do research at Stanford or other Silicon Valley colleges.

From the donors' point of view, IIUC, most of the money donated to the SI comes from very few big donors; Peter Thiel in particular donates much more than everybody else (maybe more than everybody else combined?). I suppose that such donors would continue to support the SI even if it were relocated.

Even assuming that there are benefits from staying in Silicon Valley that outweigh the costs, the point stands that Yudkowsky could accept a lower salary while still staying well above subsistence level.

Replies from: gwern
comment by gwern · 2013-01-11T17:13:52.473Z · LW(p) · GW(p)

AFAIK the SI doesn't do software development or direct computer science research. Other than operating Less Wrong, their main outputs seem to be philosophical essays and some philosophical publications, plus the annual Singularity Summits (which make sense to hold in Silicon Valley, but don't have to be physically close to the SI's main location). A cursory look at the SI team pages suggests that most of the staff are not CompSci professionals, and many of them didn't get their education or do research at Stanford or other Silicon Valley colleges.

The audience and donors are there, which is enough, but your point about the people is not strong: most of the people in Silicon Valley were not taught at Stanford; does that mean they are wasting their time there? Of course not. It just points out how California sucks in strange people and techies (and both) from around the world, e.g. my elder sister was raised and went to college on the east coast, but guess where she's working now? Silicon Valley.

From the donors' point of view, IIUC, most of the money donated to the SI comes from very few big donors; Peter Thiel in particular donates much more than everybody else (maybe more than everybody else combined?). I suppose that such donors would continue to support the SI even if it were relocated.

You suppose that because it is convenient for your claim that being in Silicon Valley is wasteful, not because it is true. The widespread absence of telecommuting in corporations, the worldwide emphasis on clustering into cities so you can be physically close to everyone, how donors in every charity like to physically meet principals and "look them in the eyes", the success of LW meetups - all these point to presence being better than absence.

SI would never have gotten Thiel's support, I suspect, if it had remained in Atlanta. Having gotten his support, it will not keep it by moving out of Silicon Valley. Having moved out of Silicon Valley, it will find it hard to find any more donors.

What, like Thiel is guaranteed to never drop support? Even in such an absurd situation, why would you risk it by ignoring all other big donors? And what if you wanted to grow? If SI were to leave Silicon Valley to save some money on salaries, it would be a major long-term strategic mistake which would justify everything critics might say about SI being incompetent in choosing to be penny-wise and pound-foolish.

Even assuming that there are benefits from staying in Silicon Valley that outweigh the costs, the point stands that Yudkowsky could accept a lower salary while still staying well above subsistence level.

Dunno, but that wasn't the point I was addressing.

Replies from: V_V
comment by V_V · 2013-01-11T19:10:53.872Z · LW(p) · GW(p)

Of course not. It just points out how California sucks in strange people and techies (and both) from around the world, e.g. my elder sister was raised and went to college on the east coast, but guess where she's working now? Silicon Valley.

Yes, of course Silicon Valley attracts CompSci professionals from all over the world, but the SI doesn't employ them. Strange people, you say? I've never been to San Francisco, but I've heard that it's considered home to weirdos of every possible kind. Maybe those are the people the SI panders to?

SI would never have gotten Thiel's support, I suspect, if it had remained in Atlanta. Having gotten his support, it will not keep it by moving out of Silicon Valley. Having moved out of Silicon Valley, it will find it hard to find any more donors.

Well, I dunno. It's not like Peter Thiel doesn't know how to use the Internet or can't afford flying. Facebook, for instance, was located in Massachusetts and only moved to Silicon Valley in 2011.

Replies from: gwern
comment by gwern · 2013-01-11T20:13:55.233Z · LW(p) · GW(p)

Yes, of course Silicon Valley attracts CompSci professionals from all over the world, but the SI doesn't employ them. Strange people, you say? I've never been to San Francisco, but I've heard that it's considered home to weirdos of every possible kind. Maybe those are the people the SI panders to?

All people who like SI are by definition out of the mainstream, but not all people out of the mainstream are whom SI 'panders' to.

It's not like Peter Thiel doesn't know how to use the Internet or can't afford flying.

And yet...

Facebook, for instance, was located in Massachusetts and only moved to Silicon Valley in 2011.

How wasteful of them. Don't they know they can just use the Internet to do this thing called 'social networking'? There's no reason for them to be in Silicon Valley. Hopefully their shareholders will do something about that.

comment by MugaSofer · 2013-01-11T12:54:54.577Z · LW(p) · GW(p)

No. Yudkowsky is paid by the SI, hence he could just donate to the SI just by accepting a lower salary.

Oh, right. That makes sense, I guess. Of course, as you say, he may have reasons he hasn't shared for this lifestyle. Low prior probability of them being good reasons though.

comment by Kevin · 2011-04-26T22:16:42.320Z · LW(p) · GW(p)

I believe he saves it to ensure that FAI development would continue to occur in the event of a collapse of SI. He doesn't exactly live an ostentatious lifestyle.

comment by jasonmcdowell · 2011-04-26T09:29:52.576Z · LW(p) · GW(p)

I like seeing these numbers. Transparency + people organizing the information is great. Seeing this presented here (on Less Wrong) where I am likely to see it makes me more likely to donate. Thanks!

Replies from: jsalvatier
comment by jsalvatier · 2011-04-26T14:36:50.172Z · LW(p) · GW(p)

Ditto!

comment by cjb · 2011-04-27T03:16:37.718Z · LW(p) · GW(p)
  • In 2009 the SIAI reported $118,802 lost to theft - "Misappropriation of assets, by a contractor [...]" This is a significant amount when compared to annual revenue or liquid assets. The year's surplus appears to have been eaten up by the theft. No details are provided, other than that a suit has been filed to seek restitution.

I'm surprised that no one's mentioned this -- it's hard to imagine how someone could steal that much money. Can someone at SIAI tell us whether you're allowed to talk about what happened; and if you can't right now, do you have any idea when you might be able to?

Replies from: gwern, Rain
comment by gwern · 2012-02-15T18:36:54.739Z · LW(p) · GW(p)

The theft must have been discovered to be more extensive than thought, because one early report says

Embezzlement report: Alicia Isaac, 37, of Sunnyvale arrested on embezzlement, larceny and conspiracy charges in connection with $51,000 loss, Singularity Institute for Artificial Intelligence in 1400 block of Adams Drive, Dec. 10.

Which is significantly less than $120k.

Replies from: MileyCyrus
comment by MileyCyrus · 2012-02-20T08:46:45.309Z · LW(p) · GW(p)

So in December 2009, Alicia Isaac was arrested for stealing from SIAI. One year later, she was hired by the Lifeboat Foundation, where she apparently still works. On the finance board, no less!

Was she vindicated in 2010, or is the Lifeboat Foundation just stupid?

Replies from: gwern, steven0461
comment by gwern · 2012-02-20T15:11:05.208Z · LW(p) · GW(p)

Was she vindicated in 2010,

Luke in his questions page for when he became director said the case was ongoing and scheduled for trial, IIRC, and that was either 2011 or 2012.

or is the Lifeboat Foundation just stupid?

I'm not sure stupid is the right adjective for the things I wonder about Lifeboat...

comment by steven0461 · 2012-02-20T18:21:58.710Z · LW(p) · GW(p)

Based on the links, I don't think she actually works there any more than all the other advisory board members do. She isn't listed on the staff page.

Replies from: gwern
comment by gwern · 2012-02-20T23:37:02.861Z · LW(p) · GW(p)

Looking in the Internet Archive, their staff page never lists anything with the keyword 'finance' (for the past 2-3 years), so I'm not sure that's a strong argument from silence.

Replies from: steven0461
comment by steven0461 · 2012-02-21T00:30:23.720Z · LW(p) · GW(p)

The "finance board" that she's a part of is one of LF's "advisory boards", which look like they total 1000+ people; see the first link in the grandparent. These people aren't employees even though they have bios on the site. My impression is they just get listed on the site and added to a mailing list.

Replies from: Alicorn, gwern
comment by Alicorn · 2012-02-21T00:36:46.592Z · LW(p) · GW(p)

Yep. They seem to just look for people who have some connection, however tenuous, to what they do, and then ask nicely if you'd like to be on the board. Then they email you occasionally and maintain a wee profile with links to your stuff. It's pretty okay.

comment by gwern · 2013-06-24T01:35:10.735Z · LW(p) · GW(p)

I think you may be right. I just took a look at the 2011 form 990 (2012 is not out), which is where I'd expect to first see her mentioned if she was handling books for them, but the form is listed as being prepared by the president Eric Klien and Isaac is not mentioned anywhere in it I can see.

comment by Rain · 2011-04-27T16:31:19.252Z · LW(p) · GW(p)

Michael Vassar sent out an email with more information back in Dec 2009 (shortly after they discovered the theft?). I'm not sure if it was just to donors or also included newsletter subscribers. It basically said, 'we trusted this person and they took advantage of that trust.' It also states that since legal action is still pending, they have to "limit what [they] say", but that you can send further inquiries to Michael.

Replies from: cjb
comment by cjb · 2011-04-27T16:52:34.589Z · LW(p) · GW(p)

Thanks. I guess the followup questions are:

  • Is the legal action still pending, or can the situation be talked about openly now?
  • Has SIAI been able to recover the money?
  • Was it a mistake to trust a contractor with access to >$100k of funds? Do they still do that?
Replies from: CarlShulman
comment by CarlShulman · 2011-05-03T10:11:57.501Z · LW(p) · GW(p)

My understanding is that the case is ongoing in criminal court, at least as of a few weeks ago, and that the money has largely not yet been recovered. As far as I know, only that one contractor had the relevant financial access, which was required for the job, but obviously the financial controls on that access were not sufficient. I think that currently only the President and COO have the relevant access to the accounts (though others, including the majority-donor board, have limited access to monitor the accounts).

comment by jsalvatier · 2011-04-26T15:23:46.822Z · LW(p) · GW(p)

Seeing SIAI's financials has made me more likely to donate to SIAI.

Does anyone have links to writing on what SIAI would do with increased funding? For example, "Allison Hu is a brilliant young Y and has come up with good ideas a, b, c. We would like to hire her, but we don't have the funding to do so." I'd like to see arguments about SIAI's marginal spending.

Also. Brandon! You should have talked about this at the meetup so we could all say what a great idea it was!

comment by MichaelAnissimov · 2011-04-26T23:39:00.276Z · LW(p) · GW(p)

For a little more information, there's also this donor list, which consists of my best effort at finding $1K+ donors over the last few years:

http://singinst.org/donors

If I missed anyone who donated and wanted to be on the list, please contact me at anissimov@intelligence.org. Making this list involved going over thousands of PayPal records from the past few years. (This was necessary because Summit payments are intertwined with actual donations in the records, so I had to mentally filter out all payments that were obviously for the Summit.)

Replies from: BrandonReinhart, Clippy
comment by BrandonReinhart · 2011-04-27T02:18:13.373Z · LW(p) · GW(p)

Zvi Mowshowitz! Wow, color me surprised. Zvi is a retired professional Magic player. I used to read his articles and follow his play. Small world.

Replies from: Alicorn, ArisKatsaris, gwern, drethelin
comment by ArisKatsaris · 2011-04-27T13:30:10.286Z · LW(p) · GW(p)

Come now, it's probably a different Zvi Mowshowitz.

Replies from: Sniffnoy
comment by Sniffnoy · 2011-04-28T00:27:57.816Z · LW(p) · GW(p)

Since the Zvi who posts here is indeed the same Zvi Mowshowitz he speaks of, it is close to certain that the one who donated the money is as well.

Replies from: Eliezer_Yudkowsky, ata, ArisKatsaris
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2011-04-28T04:22:54.744Z · LW(p) · GW(p)

Zvi is one of the leaders of the New York Less Wrong community, actually. Munchkinism generalizes.

I played Magic against Zvi using one of his own decks, and the deck won - I was there, but I wasn't involved.

comment by ata · 2011-04-28T05:18:10.291Z · LW(p) · GW(p)

It is close to certain that ArisKatsaris was joking (imitating the format of an exchange where one person says something about, say, a political candidate named John Smith, and the other person says "Oh, John Smith, isn't that the guy who won the Nobel Prize in Chemistry a few years ago?" and the first person says "I dunno, probably a different John Smith", while subverting it by (seemingly very confidently) applying it to a comparatively very uncommon name).

Replies from: Sniffnoy
comment by Sniffnoy · 2011-04-28T05:38:33.514Z · LW(p) · GW(p)

...wow, I entirely failed to pick up on that. >.<

comment by ArisKatsaris · 2011-04-28T08:30:20.383Z · LW(p) · GW(p)

I was just joking, as ata explained. Sorry for the confusion. :-)

comment by gwern · 2011-04-27T02:52:57.863Z · LW(p) · GW(p)

Birthday paradox? Given a set of donor-list-readers and another set of donors, there's a better chance than one would expect that there's a commonality. :)

comment by drethelin · 2011-04-27T03:36:53.827Z · LW(p) · GW(p)

Did he retire since last year?

comment by Clippy · 2011-04-27T14:52:34.975Z · LW(p) · GW(p)

That doesn't list the 1000 USD that I gave.

Replies from: MichaelAnissimov
comment by MichaelAnissimov · 2011-04-27T15:13:28.206Z · LW(p) · GW(p)

Can you email me your name so I can check our records, confirm the donation, and list it? Did you check the box that said "list my name"?

Replies from: Clippy
comment by Clippy · 2011-04-27T15:17:59.174Z · LW(p) · GW(p)

Ignore previous; I structured the contribution so that it would go under another human's name or be anonymous. Still, be it known that I gave 1000 USD, just ask User:Kevin or "Michael Vassar".

comment by playtherapist · 2011-04-26T17:34:26.318Z · LW(p) · GW(p)

Employee compensation generally includes more than just salary - there's the cost of the employer's share of Social Security, health insurance, and any other benefits. If these are included in the figures listed, then the employees' salaries are considerably less. If the Singularity Institute isn't providing health insurance, then buying individual policies is a major expense for the employees. The Bay Area is also one of the most expensive places to live in the U.S.

Replies from: cousin_it, ameriver
comment by cousin_it · 2011-04-26T18:34:57.828Z · LW(p) · GW(p)

If donation money is used to buy worktime (which is good and well), why not move to Thailand and save the world from there? :-)

Replies from: CarlShulman, Nisan
comment by CarlShulman · 2011-04-26T18:44:47.231Z · LW(p) · GW(p)

Sounds great from a weather perspective :)

Alas, folk need to see collaborators, arrange Singularity Summits, and interact with donors, board members, and media in the US. Constant travel to and fro would be an imperfect substitute, and flight costs (including time and jet lag) would claw back the cost-of-living gains and more.

Replies from: cousin_it
comment by cousin_it · 2011-04-26T20:32:43.168Z · LW(p) · GW(p)

My suggestion wasn't completely serious, but thanks for the answer anyway!

comment by Nisan · 2011-04-26T20:38:29.344Z · LW(p) · GW(p)

I'm given to understand this is why the Visiting Fellows program is being temporarily moved to Bali.

Replies from: Nick_Tarleton
comment by Nick_Tarleton · 2011-04-26T20:41:26.440Z · LW(p) · GW(p)

Last I heard, this is not actually the case.

comment by ameriver · 2011-04-27T08:09:24.844Z · LW(p) · GW(p)

The rule of thumb I've heard is that an employee's cost to their employer is between two and three times their salary. Even if the employer is not paying benefits, they still have to carry worker's comp insurance, for example, as well as administrative overhead on managing payroll, etc.
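
A minimal sketch of that rule of thumb in Python, assuming a hypothetical $80,000 salary (the figure and the multipliers are illustrative inputs, not data about any actual employer):

    # Apply the commenter's 2x-3x loaded-cost rule of thumb.
    salary = 80_000                       # hypothetical salary
    low, high = 2 * salary, 3 * salary
    print(f"Estimated total cost to employer: ${low:,} to ${high:,}")
    # -> Estimated total cost to employer: $160,000 to $240,000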

comment by ata · 2011-04-26T09:15:34.513Z · LW(p) · GW(p)

Note that there's also data for 2002 and 2003, though that time period may not be relevant to much now.

Replies from: BrandonReinhart
comment by BrandonReinhart · 2011-04-26T16:46:31.634Z · LW(p) · GW(p)

I'm also going to see if I can get a copy of the 2010 filing.

Edit: The data from 2002 on is now largely incorporated. Still working on a few bits. I don't have the 2010 data, but the SIAI hasn't necessarily filed it yet.

comment by lukeprog · 2011-04-26T16:17:09.334Z · LW(p) · GW(p)

Kudos and karma for putting so much work into summarizing all this.

comment by jsalvatier · 2011-04-26T15:33:10.602Z · LW(p) · GW(p)

I think this should be on the front page. Brandon, you should also mention whether you are affiliated with SIAI and whether you've donated to SIAI before.

Replies from: BrandonReinhart, curiousepic
comment by BrandonReinhart · 2011-04-27T07:26:37.212Z · LW(p) · GW(p)

Once I finish the todo at the top and get independent checking on a few things I'm not clear on, I can post it to the main section. I don't think there's value in pushing it to a wider audience before it's ready.

comment by curiousepic · 2011-04-26T18:36:35.413Z · LW(p) · GW(p)

From the OP: "I am not currently an SIAI supporter. My findings have greatly increased the probability that I will donate in the future. "

Replies from: jsalvatier
comment by jsalvatier · 2011-04-26T19:00:51.029Z · LW(p) · GW(p)

Oops, missed that. Thanks.

comment by FAWS · 2011-04-29T11:17:24.068Z · LW(p) · GW(p)

Why is this post deleted? Did something go wrong when transferring it to the main section?

Replies from: gwern
comment by gwern · 2011-04-26T15:21:58.930Z · LW(p) · GW(p)

Acting on gwern's suggestion in his Girl Scout Cookie analysis, here is a first pass at looking at SIAI funding, suggestions for a funding task-force, etc.

Congratulations on this writeup; it's pretty good. (Nothing in it strikes me as erroneous from my previous quick readings of the filings, except that Sequences thing.) I hadn't actually expected anyone to take my suggestion, so this is a pleasant surprise.

comment by Bongo · 2011-04-26T11:04:58.052Z · LW(p) · GW(p)

$83,934 was also paid in additional compensation to Eliezer for the completion of the Sequences. (It could also have been an advance on the assembly of his book on rationality... the text is a tad vague.)

Eliezer's base compensation increased 20% in 2008 and then 7.8% in 2009.

Personally I think this is pretty shocking and the worst thing I've ever learned about SIAI.

(And since it's relevant when saying these kinds of things, I've donated to SIAI before.)

EDIT: False alarm; apparently there was no Sequences bonus.

Replies from: Eliezer_Yudkowsky, ciphergoth, jimrandomh, None
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2011-04-26T12:02:56.417Z · LW(p) · GW(p)

Should've noticed your own confusion, that didn't actually happen.

http://lesswrong.com/r/discussion/lw/5fo/siai_fundraising/40v2

(Base compensation rates of increase sound about right, though.)

comment by Paul Crowley (ciphergoth) · 2011-04-26T11:32:29.516Z · LW(p) · GW(p)

FWIW $95.5K doesn't seem excessive to me.

Replies from: None
comment by [deleted] · 2011-04-26T11:49:51.159Z · LW(p) · GW(p)

It seems a little on the high side to me - and $180,000 (when you combine his salary and the Sequences money) in 2009 is ludicrous. I mean, if that's what people want to spend their money on, fair enough, but it's a big chunk of the total funds raised by SIAI that year. So when people talk about the marginal utility of donating a dollar to the SIAI, an equally valid way to phrase it might be "the marginal utility of increasing the salary of someone who earned $180,000 in 2009 by 28 cents".

But I'm not wanting to dissuade anyone from spending their money if that's what they want to spend it on...

Replies from: Nick_Tarleton
comment by Nick_Tarleton · 2011-04-26T19:21:17.697Z · LW(p) · GW(p)

So when people talk about the marginal utility of donating a dollar to the SIAI, an equally valid way to phrase it might be "the marginal utility of increasing the salary of someone who earned $180,000 in 2009 by 28 cents".

Assuming marginal money is allocated proportional to existing spending, which is surely not the case. (Yes, the $180,000 figure would be unreasonable if true.)

comment by jimrandomh · 2011-04-26T11:53:33.693Z · LW(p) · GW(p)

Personally I think this is pretty shocking and the worst thing I've ever learned about SIAI.

Shockingly high or shockingly low?

comment by [deleted] · 2011-04-26T11:32:00.705Z · LW(p) · GW(p)

Where did you think the money was going?!

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2011-04-26T12:04:08.574Z · LW(p) · GW(p)

Where did you think the money was going?!

They thought it wasn't going to paying me $180K. Correctly. +1 epistemic point to everyone who expressed surprise at this nonfact, -1 point for hindsight bias to anyone who claimed not to be shocked by it.

Replies from: None
comment by [deleted] · 2011-04-26T12:37:23.471Z · LW(p) · GW(p)

Fair enough. I assumed that most of the money would be going on salary, so if an organisation with a small staff had a large income, it'd be paying high salaries. It's one reason (of many) I've never donated. So I've just made a $10 donation, partly to punish myself for my own biases, and partly to make some restitution for acting on those biases in a way which might have seemed insulting.

comment by Armok_GoB · 2011-04-26T16:07:50.540Z · LW(p) · GW(p)

"Crowd Source Utilization. There are sites devoted to crowd sourced funding for projects. A task force could conceive of a project with the potential to generate more revenue than required to build it. Risk could be reduced through the use of crowd sourcing. Excess revenue donated to the SIAI. (Projects don't have to be software, they could be fabricating an interesting device, piece of art, or music.)"

More info and discussion on this, please? This sounds like something that I, just maybe, could actually make myself useful with, depending on what it means.

While I'm too unreliable and responsibility-shy to dare take the lead on such a project, I might be able to come up with and bootstrap some art project if there are other people interested and dedicated who can complement my weaknesses and take over if I lose interest.

I have no idea what the scheme for something like this resulting in money is, though, or whether the LW-related stuff I'm planning could be reworked for that as well as its currently planned purpose of possible awareness-raising and mostly being for fun. If it ever becomes anything other than vaporware, that is.

comment by Clippy · 2011-04-26T16:00:31.640Z · LW(p) · GW(p)

This doesn't list any payment to me for purchase and safekeeping of paperclips.

comment by BrandonReinhart · 2011-04-27T17:39:36.152Z · LW(p) · GW(p)

Can everyone see all of the images? I received a report that some appeared broken.

Replies from: Rain
comment by Rain · 2011-04-28T12:57:30.864Z · LW(p) · GW(p)

All of the images are blocked by my work internet filter. I can see them all at home.

comment by James_Miller · 2011-04-26T14:15:43.575Z · LW(p) · GW(p)

Perhaps everyone who gives over $100 to SIAI could get a star by their name on LessWrong.

Replies from: jsalvatier, Armok_GoB, cousin_it
comment by jsalvatier · 2011-04-26T16:52:51.436Z · LW(p) · GW(p)

I am uncomfortable with making the link between SIAI and LW so official (even though they sponsor LW).

Replies from: jsalvatier
comment by jsalvatier · 2011-04-26T23:16:48.737Z · LW(p) · GW(p)

Thinking about it more, I am uncomfortable with linking LW social status so officially with support for SIAI.

comment by Armok_GoB · 2011-04-26T16:09:37.901Z · LW(p) · GW(p)

I'm embarrassed to say this'd probably make me a fair bit more likely to donate.

Replies from: Larks
comment by Larks · 2011-04-26T20:43:05.005Z · LW(p) · GW(p)

I agreement-upvoted this when I thought it read the exact opposite; I would be less likely to donate if this occurred. Selling CEV shares is all very well, but that idea sounds crass and divisive.

Replies from: Armok_GoB
comment by Armok_GoB · 2011-04-26T21:01:15.631Z · LW(p) · GW(p)

I agree, hence why I'm embarrassed about it.

comment by cousin_it · 2011-04-26T17:56:10.415Z · LW(p) · GW(p)

Yay, LessWrong Gold accounts! :-)

reference

comment by Johnicholas · 2011-04-26T15:11:26.431Z · LW(p) · GW(p)

I'm not happy about justifying the high payouts to EY as "that's what a programmer might make". Instead, put him (and any other SIAI full-time employees, possibly just Michael Vassar) on half pay (and half time), and suggest that he work in the "real world" (something not SIAI/futurism-related) the rest of the time. This means that his presumed skills are tested and exercised with actual short-term tasks, and it also gives an approximate market price for his skills.

Currently, his market-equivalence to a programmer is decoupled from reality.

Replies from: ciphergoth, None, timtyler, Bongo, Bongo
comment by Paul Crowley (ciphergoth) · 2011-04-26T15:30:43.876Z · LW(p) · GW(p)

This is a great idea, if SIAI put signalling what moral people they are over actually bringing about the best outcome.

Replies from: Johnicholas
comment by Johnicholas · 2011-04-26T16:04:35.756Z · LW(p) · GW(p)

Can you elaborate? I don't understand my proposal as related to signaling at all; it's about measuring EY's (and others') effectiveness, rather than taking it for granted. Yes, it's costly in the event it's unnecessary, but corruption/ineffectiveness/selfishness (where EY and Vassar are primarily building a career and a niche for themselves, consciously or unconsciously) is also costly.

Replies from: gjm, Nick_Tarleton, benelliott
comment by gjm · 2011-04-26T19:48:28.985Z · LW(p) · GW(p)

Perhaps other employers should also employ everyone half-time so that they get more information about their employees' market value?

If SIAI were paying Eliezer to be a "generic" programmer, then I suppose they could get a reasonable idea of whether he's a good one in the way you describe. Or they could just fire him and hire some other guy for the same salary: that's not a bad way of getting (where SIAI is) a middling-competent programmer for hire.

But it doesn't seem to me even slightly credible that that's what they're paying Eliezer for. They might want him writing AI software -- or not, since he's well known to think that writing an AI system is immensely dangerous -- in which case sending him out to work half-time for some random software company isn't going to give much idea of how good he is at that. Or they might want him Thinking Deep Thoughts about rationality and friendly AI and machine ethics and so forth, in which case (1) his "market value" would need to be assessed by comparing with professional philosophers and (2) presumably SIAI sees the value of his work in terms of things like reducing existential risk, which the philosophy-professor market is likely to be ... not very responsive to.

What sending Eliezer out to work half-time commercially demonstrably won't do is to measure his "effectiveness" at anything that seems at all likely to be what SIAI thinks it's worth paying him $100k/year for.

The most likely effects seem to me some combination of: (1) Eliezer spends less time on SIAI stuff and is less useful to SIAI. (2) Eliezer spends all his time on SIAI stuff and gets fired from his other job. (3) Eliezer finds that he can make a lot more money outside SIAI and jumps ship or demands a big pay rise from SIAI. (4) Eliezer decides that an organization that would do something so obviously silly is not fit to (as he sees it) try to decide the fate of the universe, quits SIAI, and goes to do his AI-related work elsewhere.

No combination of these seems like a very good outcome. What's the possible benefit for SIAI here? That with some (not very large) probability Eliezer turns out not to be a very good programmer, doesn't get paid very well by the commercial half-time gig, accepts a lower salary from SIAI on the grounds that he obviously isn't so good at what he does after all, but doesn't simultaneously get so demoralized as to reduce his effectiveness at what he does for SIAI? Well, I suppose it's barely possible, but it doesn't seem like something worth aiming for.

What am I missing here? What halfway plausible way is there for this to work out well?

Replies from: Johnicholas
comment by Johnicholas · 2011-04-26T20:26:47.942Z · LW(p) · GW(p)

I think it's entirely possible for people within corporations to build cozy empires and argue that they should be paid well, and for those same people to in fact be incompetent at value creation - that is, they could be zero-sum internal-politics specialists. The corporation would benefit from enforcing a policy against this sort of "employee lock-in", just like corporations now have policies against "supplier lock-in".

This would entail, among other things, everyone within the corporation having a job description that is sufficiently generic that other people also fit the same job description, and for outside auditors to regularly evaluate whether the salaries being paid for a given job description are comparable to industry standards.

I haven't heard of anyone striving to prevent "employee lock-in" (though that might just be the wrong words) - but people certainly do strive for those related policies.

There are lots of potential upsides:

  1. At the prospect of potentially being tested, EY shapes up and starts producing.
  2. Due to real-world experience, EY's ideas are pushed along faster and more accurately.
  3. SIAI discovers that EY is "just a guy" and reorganizes, in the process jumping out of its recurrent circling of the cult attractor.
  4. Due to EY's stellar performance in the real world, other people start following the "work half time and do rationality and existential risk reduction half time" lifestyle.

In general, my understanding of SIAI's proposed financial model is "other people work in the real world, and send money without strings to SIAI, in exchange for infrequent documentation regarding SIAI's existential risk reduction efforts". I think that model is unsustainable, because the organization could switch to becoming simply about sustaining and growing itself.

Replies from: cousin_it, gjm, Eliezer_Yudkowsky
comment by cousin_it · 2011-04-26T22:22:22.755Z · LW(p) · GW(p)

SIAI firing Eliezer would be like Nirvana firing Kurt Cobain. Most of the money and public attention will follow Eliezer, not stay with SIAI.

You're not alone in wanting Eliezer to start publishing new results already. But there's also the problem that he likes secrecy way too much. Alexandros Marinos once compared his attitude to staying childless: every childless person came from an unbroken line of people who reproduced (=published their research), and couldn't exist otherwise.

For example, our decision-theory-workshop group is pretty much doing its own thing now. I believe it diverged from Eliezer's ideas a while ago, when we started thinking about UDT-ish theorem provers instead of TDT-ish causal graph thingies. I don't miss Eliezer's guidance, but I sure miss his input - it could be very valuable for the topics that interest us. But our discussions are open, so I guess it's a no go.

Replies from: None
comment by [deleted] · 2011-04-27T16:43:59.627Z · LW(p) · GW(p)

This is something I've never really understood. I can understand wanting to keep any moves directly towards creating an AI quiet - if you create 99% of an AI and someone else does the other 1%, goodbye world. It may not be optimal, but it's a comprehensible position. But the work on decision theory is presumably geared towards codifying Friendliness in such a way that an AI could be 'guaranteed Friendly'. That seems like the kind of thing that would be aided by having many eyeballs looking at it, while being useless for anyone who wanted to put together a cobbled-together quick-results AI.

Replies from: cousin_it
comment by cousin_it · 2011-04-27T17:08:14.639Z · LW(p) · GW(p)

Eliezer stated his reasons here:

...a constructive theory of the world's second most important math problem, reflective decision systems, is necessarily a constructive theory of seed AI; and constitutes, in itself, a weapon of math destruction, which can be used for destruction more quickly than to any good purpose. Any Singularity-value I attach to publicizing Friendly AI would go into explaining the problem. Solutions are far harder than this and will be specialized on particular constructive architectures.

So in a nutshell, he thinks solving decision theory will make building unfriendly AIs much easier. This doesn't sound right to me because we already have idealized models like Solomonoff induction or AIXI, and they don't help much with building real-world approximations to these ideals, so an idealized perfect solution to decision theory isn't likely to help much either. But maybe he has some insight that I don't.

Replies from: Wei_Dai, wedrifid, None
comment by Wei Dai (Wei_Dai) · 2011-04-27T18:22:20.187Z · LW(p) · GW(p)

I think Eliezer must have changed his mind after writing those words, because his TDT book was written for public consumption all along. (He gave two reasons for not publishing it sooner: he wanted to see if a university would offer him a PhD based on it, and he was using DT as a problem to test potential FAI researchers.) I guess his current lack of participation in our DT mailing list is probably due to some combination of being busy with his books and lack of significant new insights.

Replies from: Nick_Tarleton
comment by Nick_Tarleton · 2011-04-27T18:36:20.186Z · LW(p) · GW(p)

I think TDT is different from the "reflective decision systems" he was talking about, which sounds like it refers to a theory specifically of self-modifying agents.

comment by wedrifid · 2011-04-27T17:41:57.965Z · LW(p) · GW(p)

a weapon of math destruction

That's the first time I noticed the pun. Good one. I want a tshirt.

comment by [deleted] · 2011-04-27T17:21:38.932Z · LW(p) · GW(p)

Ah. I see what he means, if you're talking about a) just the 'invariant under reflection' part and not Friendliness and b) you're talking about a strictly pragmatic tool. That makes sense.

comment by gjm · 2011-04-26T20:47:00.769Z · LW(p) · GW(p)
  1. Starts producing what?
  2. What real-world experience, and how will it be relevant to his SIAI work?
  3. Yup, that's possible. See below.
  4. Just like they do for all the other people who do stellar work as software developers, you mean?

I think #3 merits a closer look, since indeed it's one of the few ways that your proposal could have a positive outcome. So let's postulate, for the sake of argument, that indeed Eliezer's skills in software development are not particularly impressive and he doesn't do terribly well in his other half-time job. So ... now they fire him? Because he hasn't performed very well in another job doing different kinds of work from what he's doing for SIAI? Yeah, that's a good way to do things.

It would probably be good for SIAI to fire Eliezer if he's no good at what he's supposed to be doing for them. But, if indeed he's no good at that, they won't find it out by telling him to get a job as a software engineer and seeing what salary he can make.

Yes, it's bad that SIAI can't easily document how much progress it's making with existential risk reduction so that potential donors can decide whether it's worth supporting. But Eliezer's market-salary-as-a-generic-programmer is -- obviously -- not a good measure of how much progress it's making. Thought experiment: Consider some random big-company CEO who's being paid millions. Suppose they get bored of CEOing and take a fancy to AI, and suppose they agree to replace Eliezer at SIAI, and even to work for half his salary. In this scenario, should SIAI tell their donors: "Great news, everyone! We've made a huge stride towards avoiding AI-related existential risk. We just employed someone whose market salary is measured in the millions of dollars!"?

Yes, it's bad if SIAI can't tell whether Eliezer is actually doing work worth the salary they pay him. (My guess, incidentally, is that he is, on PR grounds alone, even if his actual AI-related work is of zero value. But that's a separate issue.) But measuring something to do with Eliezer that has nothing whatever to do with the value of the work he does for SIAI is not going to solve that problem.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2011-04-27T01:19:16.793Z · LW(p) · GW(p)

You seem to be optimizing this entire problem for avoiding the mental pain of worrying about whether you're being cheated. This is the wrong optimization criterion.

Replies from: Johnicholas
comment by Johnicholas · 2011-04-27T01:37:00.404Z · LW(p) · GW(p)

I'm working from "organizations are superhumanly intelligent (in some ways) and so we should strive for Friendly organizations, including structural protections against corruption" standpoint.

Replies from: None
comment by [deleted] · 2011-04-27T16:39:17.572Z · LW(p) · GW(p)

I hardly think the SIAI, a tiny organisation heavily reliant on a tiny pool of donors, is the most likely organisation to become corrupt. Even when I thought Eliezer was being paid significantly more than he actually was (see threads above), I wouldn't have called that corruption. Eliezer is doing a job. His salary is largely paid for by a very small number of individuals. As the primary public face of SIAI, he is under more scrutiny than anyone else in the organisation. As such, if those donors don't think he's worth the money, he'll be gone very quickly - and so long as they do, it's their money to spend.

comment by Nick_Tarleton · 2011-04-26T19:15:10.901Z · LW(p) · GW(p)

I don't understand my proposal as related to signaling at all

What's the good reason to care about whether EY's salary is calibrated to the market rate, rather than (or independently of) whether it's too low or too high for this particular situation?

it's about measuring EY's (and others') effectiveness, rather than taking it for granted.

I don't understand why SI (i.e., its board) shouldn't employ EY and MV full-time and continually evaluate the effectiveness of their work for it, like any other organization in the world would do.

comment by benelliott · 2011-04-26T19:41:56.035Z · LW(p) · GW(p)

The fact that both are costly is irrelevant; the point is that one has the potential to be vastly more costly than the other.

comment by [deleted] · 2011-04-26T15:45:47.336Z · LW(p) · GW(p)

Downvoted.

"high payouts"? Good programmers are worth their weight in gold. (As for AI researchers, bad ones are worthless, good-but-not-good-enough ones will simply kill us all, and good-enough ones are literally beyond value...) NYT:

Then there are salaries. Google is paying computer science majors just out of college $90,000 to $105,000, as much as $20,000 more than it was paying a few months ago. That is so far above the industry average of $80,000 that start-ups cannot match Google salaries. Google declined to comment.

"half pay (and half time)"? I'm just a programmer, not an AI researcher, but I'm confident that this applies equally: it is ridiculously hard to apply concentrated thought to solving a problem when you have to split your focus. As Paul Graham said:

One valuable thing you tend to get only in startups is uninterruptability. Different kinds of work have different time quanta. Someone proofreading a manuscript could probably be interrupted every fifteen minutes with little loss of productivity. But the time quantum for hacking is very long: it might take an hour just to load a problem into your head. So the cost of having someone from personnel call you about a form you forgot to fill out can be huge.

This is why hackers give you such a baleful stare as they turn from their screen to answer your question. Inside their heads a giant house of cards is tottering.

Replies from: Johnicholas
comment by Johnicholas · 2011-04-26T15:56:43.280Z · LW(p) · GW(p)

A policy of downvoting posts that you disagree with will, over time, generate a "unison" culture, driving away (evaporatively cooling) dissent.

Though you're correct about interruptions and sub-day splitting, in my experience it is entirely feasible to split your time X days vs. Y days without suffering context-switch overhead - that is, since we're presumably sleeping anyway, we're already forced to "boot up" every morning. I agree it's harder to coordinate a team in which some members are full time, some are half time, and some are the other half time - but you'd have 40k to make up for the lost team productivity.

Replies from: None, Kevin, wedrifid
comment by [deleted] · 2011-04-26T16:24:47.030Z · LW(p) · GW(p)

A policy of downvoting posts that you disagree with will, over time, generate a "unison" culture, driving away (evaporatively cooling) dissent.

What do you think downvotes are for? It's just a number, not an insult.

(Now, if you want to suggest that I shouldn't announce a downvote when replying with objections, perhaps I could be convinced of that. I think I'd appreciate a downvote-with-explanation more than a silent downvote.)

but you'd have 40k to make up for the lost team productivity.

The man-month is mythical.

Replies from: endoself
comment by endoself · 2011-04-26T21:01:50.012Z · LW(p) · GW(p)

What do you think downvotes are for? It's just a number, not an insult.

Downvotes are for maintaining the quality of the conversation, not for expressing agreement or disagreement. No matter what someone's opinion is, as long as its incorrectness would not be made evident by reading the Sequences, downvotes should express disapproval of the quality of the argument, not of the conclusion. In a case like this, no argument for the opinion you disapprove of was even made. Unless he had refused to acknowledge the substance of your disagreement, which was not the case here, no downvote was warranted.

comment by Kevin · 2011-04-26T23:41:19.042Z · LW(p) · GW(p)

It's not just that I disagreed with you; it's that you are wrong in a more objective sense.

Replies from: ArisKatsaris
comment by ArisKatsaris · 2011-04-27T13:16:47.181Z · LW(p) · GW(p)

How can you tell the two apart?

comment by wedrifid · 2011-04-26T16:35:51.049Z · LW(p) · GW(p)

A policy of downvoting posts that you disagree with will, over time, generate a "unison" culture, driving away (evaporatively cooling) dissent.

STL's downvote was appropriate, and he gave far more justification than was needed. I similarly downvoted both your comments here because they both gave prescriptions of behavior to others that were bad advice based on ignorance.

comment by timtyler · 2011-04-26T20:43:27.031Z · LW(p) · GW(p)

Currently, his market-equivalence to a programmer is decoupled from reality.

More appropriate reference classes: philosophers, writers, teachers, fundraisers.

comment by Bongo · 2011-04-26T17:32:07.575Z · LW(p) · GW(p)

I'm not happy with how big Eliezer's salary is either, but having him work half-time as a programmer to verify the market value of his skills is probably not the best thing to do about it.

Replies from: None
comment by [deleted] · 2011-04-27T04:15:48.252Z · LW(p) · GW(p)

I'm not happy with how big Eliezer's salary is either

What rational reasons do you have?

I can imagine two rational reasons for feeling that someone is overpaid. First and most commonly, someone may be overpaid relative to their productivity. For example, a programmer who writes buggy, poorly designed code and makes $130k for it is clearly overpaid, as is a CEO who makes zillions while driving their company into the ground. This objection could be bluntly phrased as "Eliezer is a hack" - if you think so, say so. I suspect that very few people on LW hold this opinion, especially if, as I said above, they agree that good-enough AI researchers are literally beyond value. (That is, if you subscribe to the basic logic that AI holds the potential to unleash a technological singularity that can either destroy the world or remake it according to our wishes, then EY's approach is the way to go about achieving the latter. Even if you disagree with the particulars, he is obviously onto something, and such insights have value.)

Second, your objection may be that someone who works for a nonprofit shouldn't be richly compensated. For example, you could probably go through Newsweek's Fifteen Highest-Paid Charity CEOs and pick one where you could say, "Yeah, that's a well-run organization, but that CEO is paid way too much - why don't they voluntarily accept a smaller, but still generous, salary, like a few hundred K?" I don't believe this objection applies to EY, because he works in an expensive area. More importantly, the fundamental root of this objection is "if X accepted less money, the nonprofit would have more resources to spend elsewhere". That's pretty obvious when you're talking about mega-zillion CEO salaries. What about Eliezer's case? What if he handed back, say, $10k of his salary to SIAI? That would be a significant hit in income for someone whose income matches his expenses, and whose expenses aren't unreasonable, while being much less significant to SIAI. Finally, EY is already working 60 hours a week for SIAI, and you would want him to donate a chunk of his current salary on top of that? Really?

On the other hand, I can think of an irrational reason to be unhappy with Eliezer's salary, which I think I'll be too polite to mention here.
