post by [deleted]

This is a link post for


Comments sorted by top scores.

comment by gjm · 2023-01-01T14:34:54.192Z

The connection between the article you link to and the specific claims you are making and implying is rather indirect, and so is the connection between those claims and the people around here to whom you are obliquely referring.

What I've seen of the "recent uptick of people who are very vocally advocating for 'slowing down AI'" has not at all involved any suggestion of sabotage or obstruction. E.g., Katja Grace's recent post about this explicitly points out that people may tend to think of "slowing down AI research" in terms of terrorist-type action, and gives a list of more likely things, none of which is in any way coercive or violent or illegal or the sort of thing that gets security agencies very interested in you. (And all of which are about AI research as opposed to e.g. chip manufacture, which is the actual subject of this "cold war" and the actual topic of the NYT article, which does not mention AI at any point. The article is not talking about a "current AI standoff".)

And the NYT article, although indeed its headline uses the term "Cold War", makes no suggestion of any state activity other than some funding aimed at getting more chip factories built in the US. And, I repeat, it's not (on its face, at any rate) at all about AI as such; it's about chip manufacturing capability.

It seems plausible to me that we are several steps removed from any situation where people arguing about slowing down AI need to worry about "ludicrously serious consequences in the very near term [...] covert operations [...] to infiltrate and overcome, suppress, or eliminate the threat".

I do agree that it's worth being aware of that sort of possibility, and I think Katja was wise to point out that "think about slowing down AI" means, in practice, not "try to blow up GPU factories and university research departments" but "make arguments suggesting that researchers consider working on something else".

During the original Cold War, what were the worst things that happened to people who were advocating non-violently for nuclear disarmament, and who exactly were the targets? That seems like it might give a rough idea of the worst case for what some AI-slowing advocates might face. I would expect, moderately confidently, that things will be less bad for AI-slowing advocates than for disarmament advocates, because the connection with actual military capabilities is less direct.

comment by trevor (TrevorWiesinger) · 2023-01-01T16:39:54.708Z

"I would expect, moderately confidently, that things will be less bad for AI-slowing advocates than for disarmament advocates, because the connection with actual military capabilities is less direct."

Technically this is true: if you read books about nuclear game theory such as Schelling's Arms and Influence, it becomes pretty clear that military capabilities absolutely hinge on nuclear weapons. However, the latest AI is vital both for modern military capabilities AND for nuclear weapons (e.g. cruise missile guidance and loitering munitions), so while the connection is technically "less direct", it's probably not very much less direct.

"The connection between the article you link to and the specific claims you are making and implying is rather indirect, and so is the connection between those claims and the people around here to whom you are obliquely referring."

The article is the latest report on exactly how important the AI industry is for geopolitical purposes; how the money is moving is a more reliable indicator than how frequently and directly senior government officials declare AI to be critical for national security (which had been happening for years before GPT-3).

comment by gjm · 2023-01-01T19:54:57.489Z

The article does not mention the AI industry, nor does it make any claim (though doubtless it's true) that ICs are "important for geopolitical purposes". (It does say that recent big investments in US semiconductor manufacture "ha[ve] implications for [...] geopolitics", but that isn't about chips being important for e.g. military applications; it's about the US wanting China not to have too much power over an industry that's important for everything. The specific examples the article gives are smartphones and VR goggles.)

For the avoidance of doubt, I am not saying that AI isn't of military importance, only that the article doesn't say anything about AI or about the military.