Comment by Eleanor Konik (eleanor-konik) on Paranoia, Cognitive Biases, and Catastrophic Thought Patterns. · 2025-02-14T13:57:37.778Z · LW · GW

Your points about the history of human fear and the negativity bias make sense to me. I certainly tend to dismiss anyone who says I "should" "worry" -- I will consider, I will plan, I will watch out for, I will keep an eye on, I will ask advice about. I try not to worry.

But a few things stood out to me here, enough that I went ahead and finally made an account instead of just lurking. First, the point about nuclear war remaining a genuine existential risk -- I'm not going to rehash all the debates here, not least because it's not my area of expertise, and also because it wasn't the main thrust of your argument. But I do want to note that I don't think it's at all an uncontroversial claim.

Second, societal collapse is certainly a thing that has happened. Referencing the Luddites is popular because theirs was a technological disruption that mirrors the rise of AI, but it's not like people don't have real bad times to point to. Leaving aside little things like the Hundred Years' War, and just generally sucky times like the Bengal Famine or whatever... the Fall of Rome was a big deal, as Bret Devereaux points out. So was 1177 BC. So was the collapse of the Maya civilization. People can argue all they want that it wasn't a "real" collapse because the culture lived on, not everyone died, and it was "just" a change. But although "did Rome really fall" is as popular an AP test prep question as "was Alexander really Great," I think Devereaux has the right of it when he points out that when the carrying capacity of a region takes a bad enough hit, the result is a lot of suffering: a huge loss of population, a lot of starving babies.

I do think it's a mistake to conflate the likelihood of societal collapse with the likelihood of human extinction. 

But although I am not an AI doomer by any stretch of the imagination, I don't think it needs to be "human extinction level" bad for AI to really mess us up, and for people to be justified in fretting. One easy-to-imagine scenario that could lead to a great deal of human suffering and population collapse would be for AI to disrupt the American political system just badly enough that we stop defending international trade against piracy. If no one steps up to quickly fill the gap -- perhaps due to knock-on effects like some kind of economic collapse caused by American trade policy -- then all sorts of trade routes get less protected, less gets traded, and critical supply chains get disrupted.

If the Pax Americana falls, it could be as bad as when the Pax Romana did. I don't think I'm being particularly doomerist to consider these scenarios. AI doesn't even need to be particularly smart for that to happen! It merely needs to be economically disruptive, and tbh we're already there with Deep Research and such.

Note: This is not a prediction; I'm just saying it's not unreasonable to imagine such a scenario impacting people.