Amazon to invest up to $4 billion in Anthropic
post by Davis_Kingsley · 2023-09-25T14:55:35.983Z · LW · GW · 8 comments
This is a link post for https://twitter.com/AnthropicAI/status/1706202966238318670
Comments sorted by top scores.
comment by Hoagy · 2023-09-25T16:47:10.253Z · LW(p) · GW(p)
Would be interested to hear from Anthropic leadership about how this is expected to interact with previous commitments about putting decision making power in the hands of their Long-Term Benefit Trust.
I get that they're in some sense just another minority investor, but a trillion-dollar company making Anthropic a central plank of its AI strategy, with a multi-billion-dollar investment and a load of levers (via AWS) to make things difficult for the company, is a step up in the level of pressure to aggressively commercialise.
↑ comment by evhub · 2023-09-25T20:15:24.860Z · LW(p) · GW(p)
From the announcement, they said (https://twitter.com/AnthropicAI/status/1706202970755649658):
As part of the investment, Amazon will take a minority stake in Anthropic. Our corporate governance remains unchanged and we’ll continue to be overseen by the Long Term Benefit Trust, in accordance with our Responsible Scaling Policy.
↑ comment by Hoagy · 2023-09-25T22:29:36.190Z · LW(p) · GW(p)
Cheers, I did see that and wondered whether to still post the comment, but I do think that having a gigantic company own a large chunk of Anthropic, and presumably hold a lot of leverage over it, is a new form of pressure, so it'd be reassuring to see some discussion of how that relationship will be managed.
↑ comment by Daniel_Eth · 2023-09-26T05:42:26.644Z · LW(p) · GW(p)
Didn't Google previously own a large share? So now there are two gigantic companies each owning a large share, which makes me think each has much less leverage, since Anthropic could get further funding from the other.
↑ comment by evhub · 2023-09-26T00:17:33.719Z · LW(p) · GW(p)
Yeah, I agree that that's a reasonable concern, but I'm not sure what they could possibly discuss about it publicly. If the public, legible, legal structure hasn't changed, and the concern is that the implicit dynamics might have shifted in some illegible way, what could they say publicly that would address that? Any sort of "Trust us, we're super good at managing illegible implicit power dynamics." would presumably carry no information, no?
↑ comment by Stephen Fowler (LosPolloFowler) · 2023-09-26T05:58:27.962Z · LW(p) · GW(p)
The difficulty Anthropic faces in reassuring people stems from the contrast between its responsibility-focused mission statements and the hard reality of receiving billions of dollars in profit-motivated investment.
It is rational to draw conclusions by weighting a company's actions more heavily than its PR.
↑ comment by evhub · 2023-09-26T07:22:56.182Z · LW(p) · GW(p)
It is rational to draw conclusions by weighting a company's actions more heavily than its PR.
Yeah, I'm very on board with this. I think people tend to put far too much weight on nice-sounding PR rather than focusing on concrete evidence, past actions, hard commitments, etc. If you focus on nice-sounding PR, then GenericEvilCo can very cheaply gain your favor by manufacturing it for you, whereas actually making concrete commitments is much more expensive.
So yes, I think your opinion of Anthropic should mostly be priors + hard evidence. If you learned that there was an AI lab that had taken in a $4B investment from Amazon and had also committed to the LTBT governance structure and Responsible Scaling Policy, what would you then think about that company, updating on no other evidence? Ditto for OpenAI, Google DeepMind—I think you should judge them each in approximately the same way. You'll end up relying on your priors a lot if you do this, but you'll also be able to operate much more safely in an epistemic environment where some of the major players might be trying to game your approval.