Should I Divest from AI?

post by OKlogic · 2025-02-10T03:29:33.582Z · LW · GW · No comments

This is a question post.


In recent years, AI has been all the rage in the stock market, and there is no sign of that slowing down. With the picture of AGI on the horizon becoming clearer and clearer, faster and smarter models being released, and more and more investment being poured into AI stocks, it seems likely that prices will continue to rise. However, there is no point in having a great portfolio if we are all dead.

On the other hand, the effect that marginal changes in stock prices would have on large companies like Microsoft, Google, and Nvidia seems very small, and holding their stock is not the same as giving them funding directly. Moreover, money generated by gains in AI stocks could be used to fund AI safety efforts, which receive comparatively less money.

What do you all think? Also, as a tangent, what is the highest impact AI safety charity?

Answers

answer by RHollerith · 2025-02-10T19:17:09.743Z · LW(p) · GW(p)

money generated by increases in AI stock could be used to invest in efforts into AI safety, which receives comparably less money

In the present situation, contributing more money to AI-safety charities has almost no positive effects and does almost nothing to make AI "progress" less dangerous. (In fact, in my estimation, the overall effect of all funding for alignment research so far has made the situation a little worse, by publishing insights that tend to be usable by capability researchers without making non-negligible progress toward an eventual practical alignment solution.)

If you disagree with me, then please name the charity you believe can convert donations into an actual decrease in p(doom) or say something about how a charity would spend money to decrease the probability of disaster.

Just to be perfectly clear: I support the principle, which I believe has always been operative on LW, that people who are optimistic about AI or who are invested in AI companies are welcome to post on LW.

comment by OKlogic · 2025-02-10T20:15:32.176Z · LW(p) · GW(p)

Given your response, it seems like there should be a stronger push toward AI divestment from within the LessWrong and EA communities. Assuming that many members are heavily invested in index funds like the S&P 500, millions of dollars are being spent by the LessWrong community alone on the stock of companies pursuing AI capabilities research (Microsoft, Google, and Nvidia alone make up more than 10% of the index's market cap), which is not an intuitively negligible effect in my view. One could rationalize this by saying that the excess gains could be used to fund AI safety, but you seem to disagree with this (I am uncertain myself, given my lack of experience with AI safety non-profits).

Replies from: rhollerith_dot_com
comment by RHollerith (rhollerith_dot_com) · 2025-02-12T15:48:08.100Z · LW(p) · GW(p)

An influential LW participant, Jim Miller, who I believe is a professor of economics, has written here that divestment does little good because any reduction in the stock price caused by pulling investments can be counteracted by profit-motivated actors. For publicly traded stocks, there is a robust supply of profit-motivated actors scanning for opportunities. I am eager for more discussion on this topic.

I am alarmed to see that I made a big mistake in my previous comment: where I wrote that "contributing more money to AI-safety charities has almost no positive effects", I should have written "contributing to technical alignment research has almost no positive effects". I have nothing bad to say about contributing money to groups addressing the AI threat in other ways, e.g., by spreading the message that AI research is dangerous or lobbying governments to shut it down.

Replies from: OKlogic
comment by OKlogic · 2025-02-12T17:41:55.095Z · LW(p) · GW(p)

Could you link me to his work? If he is correct, it seems a little counterintuitive.
