Posts

Sorry for the downtime, looks like we got DDosd 2024-12-02T04:14:30.209Z
(The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser 2024-11-30T02:55:16.077Z
OpenAI Email Archives (from Musk v. Altman) 2024-11-16T06:38:03.937Z
Using Dangerous AI, But Safely? 2024-11-16T04:29:20.914Z
Open Thread Fall 2024 2024-10-05T22:28:50.398Z
If-Then Commitments for AI Risk Reduction [by Holden Karnofsky] 2024-09-13T19:38:53.194Z
Open Thread Summer 2024 2024-06-11T20:57:18.805Z
"AI Safety for Fleshy Humans" an AI Safety explainer by Nicky Case 2024-05-03T18:10:12.478Z
Goal oriented cognition in "a single forward pass" 2024-04-22T05:03:18.649Z
Express interest in an "FHI of the West" 2024-04-18T03:32:58.592Z
Structured Transparency: a framework for addressing use/mis-use trade-offs when sharing information 2024-04-11T18:35:44.824Z
LessWrong's (first) album: I Have Been A Good Bing 2024-04-01T07:33:45.242Z
How useful is "AI Control" as a framing on AI X-Risk? 2024-03-14T18:06:30.459Z
Open Thread Spring 2024 2024-03-11T19:17:23.833Z
Is a random box of gas predictable after 20 seconds? 2024-01-24T23:00:53.184Z
Will quantum randomness affect the 2028 election? 2024-01-24T22:54:30.800Z
Vote in the LessWrong review! (LW 2022 Review voting phase) 2024-01-17T07:22:17.921Z
AI Impacts 2023 Expert Survey on Progress in AI 2024-01-05T19:42:17.226Z
Originality vs. Correctness 2023-12-06T18:51:49.531Z
The LessWrong 2022 Review 2023-12-05T04:00:00.000Z
Open Thread – Winter 2023/2024 2023-12-04T22:59:49.957Z
Complex systems research as a field (and its relevance to AI Alignment) 2023-12-01T22:10:25.801Z
How useful is mechanistic interpretability? 2023-12-01T02:54:53.488Z
My techno-optimism [By Vitalik Buterin] 2023-11-27T23:53:35.859Z
"Epistemic range of motion" and LessWrong moderation 2023-11-27T21:58:40.834Z
Debate helps supervise human experts [Paper] 2023-11-17T05:25:17.030Z
How much to update on recent AI governance moves? 2023-11-16T23:46:01.601Z
AI Timelines 2023-11-10T05:28:24.841Z
How to (hopefully ethically) make money off of AGI 2023-11-06T23:35:16.476Z
Integrity in AI Governance and Advocacy 2023-11-03T19:52:33.180Z
What's up with "Responsible Scaling Policies"? 2023-10-29T04:17:07.839Z
Trying to understand John Wentworth's research agenda 2023-10-20T00:05:40.929Z
Trying to deconfuse some core AI x-risk problems 2023-10-17T18:36:56.189Z
How should TurnTrout handle his DeepMind equity situation? 2023-10-16T18:25:38.895Z
The Lighthaven Campus is open for bookings 2023-09-30T01:08:12.664Z
Navigating an ecosystem that might or might not be bad for the world 2023-09-15T23:58:00.389Z
Long-Term Future Fund Ask Us Anything (September 2023) 2023-08-31T00:28:13.953Z
Open Thread - August 2023 2023-08-09T03:52:55.729Z
Long-Term Future Fund: April 2023 grant recommendations 2023-08-02T07:54:49.083Z
Final Lightspeed Grants coworking/office hours before the application deadline 2023-07-05T06:03:37.649Z
Correctly Calibrated Trust 2023-06-24T19:48:05.702Z
My tentative best guess on how EAs and Rationalists sometimes turn crazy 2023-06-21T04:11:28.518Z
Lightcone Infrastructure/LessWrong is looking for funding 2023-06-14T04:45:53.425Z
Launching Lightspeed Grants (Apply by July 6th) 2023-06-07T02:53:29.227Z
Yoshua Bengio argues for tool-AI and to ban "executive-AI" 2023-05-09T00:13:08.719Z
Open & Welcome Thread – April 2023 2023-04-10T06:36:03.545Z
Shutting Down the Lightcone Offices 2023-03-14T22:47:51.539Z
Review AI Alignment posts to help figure out how to make a proper AI Alignment review 2023-01-10T00:19:23.503Z
Kurzgesagt – The Last Human (Youtube) 2022-06-29T03:28:44.213Z
Replacing Karma with Good Heart Tokens (Worth $1!) 2022-04-01T09:31:34.332Z

Comments

Comment by habryka (habryka4) on 1a3orn's Shortform · 2024-12-06T21:25:09.546Z · LW · GW

I was just thinking of adding some kind of donation tier where if you donate $20k to us we will custom-build a Gerver sofa, and dedicate it to you.

Comment by habryka (habryka4) on johnswentworth's Shortform · 2024-12-06T20:04:10.026Z · LW · GW

My guess is neither of you is very good at using them, and getting value out of them somewhat scales with skill. 

Models can easily replace on the order of 50% of my coding work these days, and if I have any major task, my guess is I quite reliably get 20%-30% productivity improvements out of them. It does take time to figure out which things they are good at, and how to prompt them.

Comment by habryka (habryka4) on Open Thread Fall 2024 · 2024-12-06T19:49:41.766Z · LW · GW

Could you send me a screenshot of your post list and tag filter list? What you are describing sounds really very weird to me and something must be going wrong.

Comment by habryka (habryka4) on Common misconceptions about OpenAI · 2024-12-06T17:31:39.523Z · LW · GW

It… was the fault of Jacob?

The post was misleading when it was written, and I think was called out as such by many people at the time. I think we should have some sympathy with Jacob being naive and being tricked, but surely a substantial amount of blame accrues to him for going to bat for OpenAI when that turned out to be unjustified in the end (and at least somewhat predictably so).

Comment by habryka (habryka4) on Thoughts on sharing information about language model capabilities · 2024-12-05T17:02:24.060Z · LW · GW

What is plausibly a valid definition of multi-hop reasoning that we care about and that excludes getting mathematical proofs right and answering complicated never-before-seen physics questions and doing the kind of thing that a smaller model needed to do a CoT for?

Comment by habryka (habryka4) on Thoughts on sharing information about language model capabilities · 2024-12-05T16:48:31.819Z · LW · GW

Transformers are obviously capable of doing complicated internal chains of reasoning. Just try giving them a difficult problem and forcing them to start their answer in the very next token. You will see no interpretable or visible traces of their reasoning, but they will still get it right for almost all questions.

Visible CoT is only necessary for the frontier of difficulty. The rest is easily internalized.

Comment by habryka (habryka4) on Thoughts on sharing information about language model capabilities · 2024-12-05T16:28:10.007Z · LW · GW

I do not understand your comment at all. Why would it be falsified? Transformers are completely capable of steganography if you apply pressure towards it, which we will (and have done).

In Deepseek we can already see weird things happening in the chain of thought. I will happily take bets that we will see a lot more of that.

Comment by habryka (habryka4) on The 2023 LessWrong Review: The Basic Ask · 2024-12-05T00:40:42.883Z · LW · GW

How are the triangle numbers not quadratic?

Sure looks quadratic to me.
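For reference, a quick sketch of why: the n-th triangle number has the closed form n(n+1)/2, which is a degree-2 polynomial in n, i.e. quadratic:

```python
def triangle(n):
    """n-th triangle number: 1 + 2 + ... + n."""
    return n * (n + 1) // 2

# The closed form n(n+1)/2 expands to (n^2 + n)/2, a quadratic in n.
assert all(triangle(n) == (n * n + n) // 2 for n in range(1000))
print([triangle(n) for n in range(1, 7)])  # → [1, 3, 6, 10, 15, 21]
```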

Comment by habryka (habryka4) on Karl Krueger's Shortform · 2024-12-04T22:15:11.685Z · LW · GW

Welcome! Hope you have a good time emerging from the shadows.

Comment by habryka (habryka4) on The 2023 LessWrong Review: The Basic Ask · 2024-12-04T20:58:51.315Z · LW · GW

I think people usually want that sentence to mean something confused. I agree it has fine interpretations, but people by default use it as a semantic stopsign to stop looking for ways the individual parts mechanistically interface with each other to produce something of higher utility than the individual parts naively summed would (see also https://www.lesswrong.com/posts/8QzZKw9WHRxjR4948/the-futility-of-emergence)

Comment by habryka (habryka4) on A Qualitative Case for LTFF: Filling Critical Ecosystem Gaps · 2024-12-04T19:46:43.772Z · LW · GW

Also, I don't claim there's another major grant maker that's less constrained like this.)

I think the SFF appears less constrained like this.

Comment by habryka (habryka4) on romeostevensit's Shortform · 2024-12-04T19:31:45.126Z · LW · GW

What's the context? 

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-04T18:00:56.191Z · LW · GW

Alas, thank you for looking into it.

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-03T23:31:04.603Z · LW · GW

I set up an every.org donation link which supports crypto donations, stock donations and recurring donations, so this is now the case!

Comment by habryka (habryka4) on Launching Applications for the Global AI Safety Fellowship 2025! · 2024-12-03T23:27:52.185Z · LW · GW

The post feels very salesy to me, was written by an org account, and also made statements that seemed false to me like: 

1⃣ Fellows will work with the world’s leading AI safety organisations to advance the safe and beneficial development of AI. Some of our placement partners are the Center for Human Compatible AI (CHAI), FAR.AI, Conjecture, UK AISI and the Mila–Quebec AI Institute.

(Of those, maybe Far.AI would be deserving of that title, but also, I feel like there is something bad about trying to award that title in the first place). 

There also is no disambiguation of whether this program is focused on existential risk efforts or near-term bias/filter-bubble/censorship/etc. AI efforts, the latter of which I think is usually bad for the world, or at the very least a lot less valuable.

Comment by habryka (habryka4) on papetoast's Shortforms · 2024-12-03T19:34:25.441Z · LW · GW

Our post font is pretty big, but for many reasons it IMO makes sense for the comment font to be smaller. So that plus LaTeX is a bit of a dicey combination.

Comment by habryka (habryka4) on papetoast's Shortforms · 2024-12-03T18:25:10.185Z · LW · GW

In the case of forward propagation, these artifacts means you get  for ~free, and in backwards propagation you get  for ~free.

Presumably you meant to say something else here than to repeat  twice?

Edit: Oops, I now see. There is a switched . I did really look quite carefully to spot any difference, but I apparently still wasn't good enough. This all makes sense now.

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-03T18:02:51.150Z · LW · GW

This is a line item for basically all the service staff of a 100-bed, 30,000 sq. ft. conference center/hotel.

I don't think I understand how not living in the Bay Area and making flights there instead would work. This is a conference center, we kind of need to be where the people are to make that work.

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-03T17:42:10.460Z · LW · GW

That would be great! Let’s hope they say yes :)

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-03T17:37:15.037Z · LW · GW

I am working on it! Will post here in the coming week or two about how it’s going.

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-03T07:29:26.825Z · LW · GW

Thank you!

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-03T02:05:03.762Z · LW · GW

Thank you! 

I'd guess plenty are planning to donate after Jan 1st for tax reasons, so perhaps best to keep highlighting the donation drive through the first week of Jan.

Yeah, I've been noticing that when talking to donors. It's a tricky problem because I would like the fundraiser to serve as a forcing function to get people who think LW should obviously be funded, but would like to avoid paying an unfair multiple of their fair share, to go and fund it. 

But it seems like half of the donors will really want to donate before the end of this year, and the other half will want to donate after the start of next year. 

It's tricky. My current guess is I might try to add some kind of "pledged funds" section to the thermometer, but it's not ideal. I'll think about it in the coming days and weeks.

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-03T01:09:47.182Z · LW · GW

It was never much additional revenue. The reason is that Amazon got annoyed at us because of some niche compliance requirement for our Amazon France account, and has decided to block all of our sales until that's resolved. I think it's going to be resolved before the end of the year, but man has it been a pain. 

If you come by Lighthaven you can also buy the books in-person! :P

Comment by habryka (habryka4) on Kaj's shortform feed · 2024-12-02T17:58:26.483Z · LW · GW

It seems to me that o1 and deepseek already do a bunch of the "mental simulation" kind of reasoning, and even previous LLMs did so a good amount if you prompted them to think in chain-of-thoughts, so the core point fell a bit flat for me.

Comment by habryka (habryka4) on Conjecture: A Roadmap for Cognitive Software and A Humanist Future of AI · 2024-12-02T17:55:37.404Z · LW · GW

This essay seems to have lost the plot of where the problems with AI come from. I was historically happy that Conjecture focused on the parts of AI development that are really obviously bad, like having a decent chance of literally killing everyone or permanently disempowering humanity, but instead this seems like it's a random rant against AI-generated art, and name-calling of obviously valuable tools like AI coding assistants.

I am not sure what happened. I hope you find the plot again.

Comment by habryka (habryka4) on Sorry for the downtime, looks like we got DDosd · 2024-12-02T17:51:00.589Z · LW · GW

Oh, interesting. I had not properly realized you could unbundle these. I am hesitant to add a hop to each request, but I do sure expect Cloudflare to be fast. I'll look into it, and thanks for the recommendation.

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-02T10:41:07.312Z · LW · GW

Oh no! I just signed up for an account on Benevity, hopefully they will confirm us quickly. I haven't received any other communication from them, but I do think we should try to get on there, as it is quite helpful for matching, as you say.

Comment by habryka (habryka4) on Sorry for the downtime, looks like we got DDosd · 2024-12-02T10:09:00.466Z · LW · GW

Yeah, we considered setting up a Cloudflare proxy for a while, but at least for logged-in users, LW is actually a really quite dynamic and personalized website, and not a great fit for it (I do think it would be nice to have a logged-out version of pages available on a Cloudflare proxy somehow).

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-02T05:51:45.532Z · LW · GW

I could have saved a bit of money with better tax planning, but not as much as one might think. 

The money I was able to donate came from appreciated crypto, and was mostly unrelated to my employment at Lightcone (and also as an appreciated asset was therefore particularly tax-advantageous to donate). 

I have generally taken relatively low salaries for most of my time working at Lightcone. My rough guess is that my average salary has been around $70k/yr[1]. Lightcone only started paying more competitive salaries in 2022 when we expanded beyond some of our initial founding staff, and I felt like it didn't really make cultural or institutional sense to have extremely low salaries. The only year in which I got paid closer to any competitive Bay Area salary was 2023, and in that year I also got to deduct most of that since I donated in the same year.

(My salary has always been among the lowest in the organization, mostly as a costly signal to employees and donors that I am serious about doing this for impact reasons)

  1. ^

    I don't have convenient tax records for years before 2019, but my income post-federal-tax (but before state tax) for the last 6 years was $59,800 (2019), $71,473 (2020), $83,995 (2021), $36,949 (2022), $125,175 (2023), ~$70,000 (2024). 
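As a quick sanity check, the average of the post-tax figures listed in the footnote comes out reasonably close to the ~$70k/yr estimate (a sketch; note these are post-federal-tax numbers, so not directly comparable to gross salary):

```python
# Post-federal-tax income figures from the footnote above, 2019-2024.
incomes = [59_800, 71_473, 83_995, 36_949, 125_175, 70_000]
average = sum(incomes) / len(incomes)
print(round(average))  # → 74565
```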

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-02T05:22:10.284Z · LW · GW

Aah, that makes sense. I will update the row to say "Expected Lighthaven Income"

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-02T05:21:44.382Z · LW · GW

I fixed some misunderstandable parts, I meant the $500k being the LW hosting + Software subscriptions and the Dedicated software + accounting stuff together. And I didn't mean to imply that the labor cost of the 4 people is $500k, that was a separate term in the costs. 

Ah yeah, I did misunderstand you there. Makes sense now. 

Is Lighthaven still cheaper if we take into account the initial funding spent on it in 2022 and 2023?

It's tricky because a lot of that is capital investment, and it's extremely unclear what the resell price of Lighthaven would end up being if we ended up trying to sell, since we renovated it in a pretty unconventional way. 

Total renovations cost around ~$7M-$8M. About $3.5M of that was funded as part of the mortgage from Jaan Tallinn, and another $1.2M of that was used to buy a property right next to Lighthaven which we are hoping to take out an additional mortgage on (see footnote #3), and which we currently own in full. The remaining ~$3M largely came from SFF and Open Phil funding. We also lost a total of around ~$1.5M in net operating costs so far. Since the property is super hard to value, let's estimate the value of the property after our renovations at our current mortgage value ($20M).[1]

During the same time, the Lightcone Offices would have cost around $2M, so if you view the value we provided in the meantime as roughly equivalent, we are out around $2.5M, but also, property prices tend to increase over time at least some amount, so by default we've probably recouped some fraction of that in appreciated property values, and will continue to recoup more as we break even.
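A rough back-of-the-envelope of the comparison above, with all figures approximated from the text (the mortgage-funded portion and the adjacent property are treated as debt and an owned asset respectively, so only the grant-funded spend and operating losses count as "out"):

```python
# All figures in $M, approximations taken from the comment above.
grant_funded_renovations = 3.0  # the ~$3M from SFF and Open Phil
net_operating_losses = 1.5      # ~$1.5M net operating losses so far
offices_counterfactual = 2.0    # what the Lightcone Offices would have cost meanwhile

net_out = grant_funded_renovations + net_operating_losses - offices_counterfactual
print(net_out)  # → 2.5
```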

My honest guess is that Lighthaven would make sense even without FTX, from an ex-post perspective, but that if we hadn't had FTX there wouldn't have been remotely enough risk appetite for it to get funded ex-ante. I think in many worlds Lighthaven turned out much worse than it did (and for example, renovation costs already ended up in the like 85th percentile of my estimates due to much more extensive water and mold damage than I was expecting in the mainline).

  1. ^

    I think this is a potentially controversial choice, though I think it makes sense. Most buyers would not be willing to pay remotely as much for the venue as that, since they would basically aim to return the property back to its standard hotel usage, and throw away most of our improvements, probably putting the property value at something like $15M. But our success running the space as a conference venue suggests to me that someone else should also be able to tap into that, for e.g. weddings or corporate events, and I think that establishes the $20M as a more reasonable mean, though reasonable people could disagree with this.

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-02T05:03:46.107Z · LW · GW

Due to an apparently ravenous hunger among you all for benches with plaques dedicated to you, and us not actually having that many benches, I increased the threshold for getting a bench (or equivalent) with a plaque to $2,000. Everyone who donated more than $1,000 but less than $2,000 before Dec 2nd will still get their plaque.

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-02T04:42:40.635Z · LW · GW

Thank you so much!

Some quick comments: 

then the real costs are $500k for the hosting and hosting cost of LessWrong 

Raw server costs for LW are more like ~$120k (and to be clear, you could drive this lower with some engineering, though you would have to pay for that engineering cost). See the relevant line in the budget I posted.

Total labor cost for the ~4 people working on LW is closer to ~$800k, instead of the $500k you mention.

(I'm not super convinced it was a good decision to abandon the old Lightcone offices for Lighthaven, but I guess it made sense in the funding environment of the time, and once we made this decision, it would be silly not to fund the last 1 million of initial cost before Lighthaven becomes self-funded).

Lighthaven is actually cheaper (if you look at total cost) than the old Lightcone offices. Those also cost on the order of $1M per year, and were much smaller, though of course we could have recouped a bunch of that if we had started charging for more things. But cost-savings were actually a reason for Lighthaven, since according to our estimates, the mortgage and rent payments would end up quite comparable per square foot.

Again, thank you a lot.

Comment by habryka (habryka4) on Sorry for the downtime, looks like we got DDosd · 2024-12-02T04:20:51.862Z · LW · GW

Should now be fixed. We've blocked traffic to basically all pages and been restoring them incrementally to make sure we don't go down again immediately. I just lifted the last of those blocks.

Comment by habryka (habryka4) on Habryka's Shortform Feed · 2024-12-02T04:06:25.582Z · LW · GW

We were down between around 7PM and 8PM PT today. Sorry about that.

It's hard to tell whether we got DDosd or someone just wanted to crawl us extremely aggressively, but we've had at least a few hundred IP addresses and random user agents request a lot of quite absurd pages, in a way that was clearly designed to avoid bot-detection and block methods. 

I wish we were more robust to this kind of thing, and I'll be monitoring things tonight to prevent it from happening again, but it would be a whole project to make us fully robust to attacks of this kind. I hope it was a one-off occurrence. 

But also, I think we can figure out how to make it so we are robust to repeated DDoS attacks, if that is the world we live in. I do think it would mean strapping in for a few days of spotty reliability while we figure out how to do that.

Sorry again, and boo for the people doing this. It's one of the reasons why running a site like LessWrong is harder than it should be.

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-01T23:05:53.412Z · LW · GW

Stripe doesn't allow for variable-amount recurring donations in their payment links. We will probably build our own donation page to work around that, but it might take a bit. 

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-01T22:56:07.586Z · LW · GW

(I at least have no ability to access the phone numbers of anyone who has donated so far, and am pretty sure this is unrelated to the fundamental Stripe payment functionality. Just to verify this, I just went through the Stripe donation flow on an incognito window with a $1 donation, and it did not require any phone numbers)

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-01T22:52:28.321Z · LW · GW

Have you considered cutting salaries in half? According to the table you share in the comments, you spend 1.4 million on the salary for the 6 of you, which is $230k per person. If the org was in a better shape, I would consider this a reasonable salary, but I feel that if I was in the situation you guys are in, I would request my salary to be at least halved. 

We have! Indeed, we have considered it so hard that we did in fact do it. For roughly the last 6-8 months our salaries have on-average been halved (and I have completely forfeited my salary, and donated ~$300k to Lightcone at the end of last year myself to keep us afloat). 

I don't think this is a sustainable situation and I expect that in the long run I would end up losing staff over this, or I would actively encourage people to make 3x[1] their salary somewhere else (and maybe donate it, or not), since I don't think donating 70% of your counterfactual salary is a particularly healthy default for people working on these kinds of projects. I currently think I wouldn't feel comfortable running Lightcone at salaries that low in the long run, or would at least want to very seriously rearchitect how Lightcone operates to make that more OK.

(Also, just to clarify, the $230k is total cost associated with an employee, which includes office space, food, laptops, insurance, payroll taxes, etc. Average salaries are ~20% lower than that.)

Relatedly, I don't know if it's possible for you to run with fewer employees than you currently have. I can imagine that 6 people is the minimum that is necessary to run this org, but I had the impression that at least one of you is working on creating new rationality and cognitive trainings, which might be nice in the long-term (though I'm pretty skeptical of the project altogether), but I would guess you don't have the slack for this kind of thing now if you are struggling for survival.

We are generally relatively low on slack, and mostly put in long hours. Ray has been working on new rationality and cognitive training projects, but not actually on his work time, and when he has been spending work time on it, he basically bought himself out with revenue from programs he ran (for example, he ran some recent weekend workshops for which he took 2 days off from work, and in-exchange made ~$1.5k of profit from the workshops which went to Lightcone to pay for his salary).

I currently would like to hire 1-2 more people in the next year. I definitely think we can make good use of them, including for projects that more directly bring in revenue (though I think the projects that don't would end up a bunch more valuable for the world).

On the other side of the coin, can you extract more money out of your customers? The negotiation strategy you describe in the post (50-50ing the surplus) is very nice and gentlemanly, and makes sense if you are both making profit. But if there is a real chance of Lightcone going bankrupt and needing to sell Lighthaven, then your regular customers would need to fall back to their second best option, losing all their surplus. So I think in this situation it would be reasonable to try to charge your regular customers practically the maximum they are willing to pay.

I think doing the negotiation strategy we did was very helpful for getting estimates of the value we provide to people, but I agree that it was quite generous, and given the tightness have moved towards a somewhat more standard negotiation strategy. I am not actually sure that this has resulted in us getting more of the surplus, I think people have pretty strong fairness instincts around not giving up that much of the surplus, and negotiations are hard. 

We do expect to raise prices in the coming year, mostly as demand is outstripping supply for Lighthaven event slots, which means we have more credible BATNAs in our negotiations. I do hope this will increase both the total surplus, and the fraction of the surplus we receive (in as much as getting that much will indeed be fair, which I think it currently is, but it does depend on things being overall sustainable).

  1. ^

    Our historical salary policy was roughly "we will pay you 70% of what we are pretty confident you could make in a similar-ish industry job in compensation". Cutting that 70% in half leaves you with ~1/3rd of what you would make in industry, so the 3x is a relatively robust estimate, and probably a bit of an underestimate, since we haven't increased salaries in 2-3 years despite inflation, and it doesn't take into account tail outcomes like founding a successful company (though engineering salaries have also gone down somewhat in that time, though not as much in more AI-adjacent spaces, so it's not totally obvious)

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-01T21:12:24.012Z · LW · GW

Yeah, I agree, and I've been thinking through things like this. I want to be very careful in making the site not feel like it's out to get you, and so isn't trying to sell you anything, and so have been hesitant for things in the space that come with prominent UI implications, but I also think there are positive externalities. I expect we will do at least some things in this space.

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-01T21:10:33.093Z · LW · GW

My inside view is that it's about as strong of a COI as I've seen. This is largely based on the exact dynamics of the LTFF, where there tends to be a lot of negotiation going on, and because there is a very clear way in which everything is about distributing money which I think makes a scenario like "Caleb rejects me on the EAIF, therefore I recommend fewer things to orgs he thinks are good on the LTFF" a kind of threat that seems hard to rule out. 

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-01T21:00:41.690Z · LW · GW

Oops, I thought I had added a footnote for that, to clarify what I meant. I shall edit. Sorry for the oversight.

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-01T18:49:09.522Z · LW · GW

Caleb is heavily involved with the EAIF as well as the Long Term Future Fund, and I think me being on the LTFF with him is a stronger conflict of interest than the COI between EAIF and other EVF orgs.

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-01T18:47:22.077Z · LW · GW

Though it's the website which I find important, as I understand it, the majority of this money will go towards supporting Lighthaven.

I think this is backwards! As you can see in the budget I posted here, and also look at the "Economics of Lighthaven" section, Lighthaven itself is actually surprisingly close to financially breaking even. If you ignore our deferred 2024 interest payment, my guess is we will overall either lose or gain some relatively small amount on net (like $100k). 

Most of the cost in that budget comes from LessWrong and our other generalist activities. At least right now, I think you should be more worried about the future of Lighthaven being endangered by the financial burden of LessWrong (and in the long run, I think it's reasonably likely that LessWrong will end up in part funded by revenue from Lighthaven).

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-01T09:13:50.335Z · LW · GW

My favorite fiscal sponsorship would be through GWWC: https://www.givingwhatwecan.org/inclusion-criteria 

Their inclusion criteria suggest that they want to see at least $50k of expected donations in the next year. My guess is if we have $10k-$20k expected this month, then that is probably enough, but I am not sure (and it might also not work out for other reasons).

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-12-01T06:24:40.149Z · LW · GW

I am working on it! What country would you want it for? Not all countries have charity tax-deductibility, IIRC.

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-11-30T23:00:09.343Z · LW · GW

I think a lot of projects in the space are very high variance, and some of them are actively deceptive, and I think that really means you want a bunch of people with context to do due diligence and think hard about the details. This includes some projects that Zvi recommends here, though I do think Zvi's post is overall great and provides a lot of value.

Another big component is doing fair splitting. I think many paths to impact require getting 4-5 pieces in place, and substantial capital investment, and any single donor might feel that there isn't really any chance for them to fund things in a way that gets the whole engine going, and before they feel good giving they want to know that other people will actually put in the other funds necessary to make things work. That's a lot of what our work on the S-Process and Lightspeed Grants was solving.

In-general, the philanthropy space is dominated by very hard principal-agent problems. If you have a lot of money, you will have tons of people trying to get your money, most of them for bad reasons. Creating infrastructure to connect high net worth people with others who are actually trustworthy and want to put in a real effort to help them is quite hard (especially in a way that results in the high net-worth people then actually building justified trust in those people).

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-11-30T22:55:08.626Z · LW · GW

I am working on making that happen right now. I am pretty sure we can arrange something, but it depends a bit on getting a large enough volume to make it worth it for one of our UK friend-orgs to put in the work to do an equivalence determination. 

Can you let me know how much you are thinking of giving (either here or in a DM)?

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-11-30T22:38:19.099Z · LW · GW

Ah, yep, I am definitely more doomy than that. I tend to be around 85%-90% these days. I did indeed interpret you to be talking about timelines due to the "farther".

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-11-30T22:17:02.147Z · LW · GW

Hmm, my guess is we probably don’t disagree very much on timelines. My honest guess is that yours are shorter than mine, though mine are a bit in flux right now with inference compute scaling happening and the slope and reliability of that mattering a lot.

Comment by habryka (habryka4) on (The) Lightcone is nothing without its people: LW + Lighthaven's big fundraiser · 2024-11-30T22:11:07.778Z · LW · GW

Yep, both of those motivate a good chunk of my work. I think the best way to do that is mostly to work one level removed, on the infrastructure that allows ideas like that to bubble up and be considered in the first place, but I’ll also take opportunities that make more direct progress on them as they present themselves.