Why do futurists care about the culture war?

post by Knight Lee (Max Lee) · 2025-01-14T07:35:05.136Z · LW · GW · 3 comments

This is a question post.

I don't understand why some futurists (e.g. Elon Musk, Peter Thiel) care so much about the culture war. After the singularity, a lot of these conflicts should disappear.

This all assumes that the singularity goes well and that we don't get taken over by a misaligned ASI (or by a human who deliberately makes others suffer). The logical thing to do is to make sure the singularity goes well, rather than fighting the culture war.

Unfortunately, few people are making this logical choice.

A lot of people care deeply about the culture wars because they don't believe the singularity is coming soon. Yet a lot of people who do believe it is coming soon still seem just as invested (e.g. Elon Musk and Peter Thiel on the right, and others on the left).

Why?

Edit: haha this has been downvoted somehow. I really don't know if the answer is obvious (but people gave 3 different answers), or if the answer is unimportant, or if I made too many unrealistic claims in the question.

Edit: Question 2: why can't we settle the culture wars after the singularity, when everyone might become more civilized and have more time to think? Do the people who want to fight the culture war now believe that whoever controls the singularity will be so extremely closed-minded that they will prevent even themselves from changing their minds or hearing arguments from the other side, and hence that the culture wars will be frozen the moment the singularity happens?

Answers

answer by Richard_Kennaway · 2025-01-14T11:50:15.692Z · LW(p) · GW(p)

> A lot of people care deeply about the culture wars because they don't believe the singularity is coming soon. Yet a lot of people who do believe it is coming soon still seem just as invested (e.g. Elon Musk and Peter Thiel on the right, and others on the left).
>
> Why?

Because the results of culture wars now will determine the post-singularity culture.

comment by Milan W (weibac) · 2025-01-14T12:53:50.884Z · LW(p) · GW(p)

I agree that the culture wars as fought now will influence what the great masses of people believe on the day before AGI is created. But is that a relevant input to what they will believe 50 years after that?

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2025-01-14T13:05:38.165Z · LW(p) · GW(p)

Is there an implicit assumption of some convergence of singularities? Or that the near term doesn’t matter because the vastly bigger long term can’t be predicted?

Replies from: weibac
comment by Milan W (weibac) · 2025-01-14T13:12:10.838Z · LW(p) · GW(p)

Rather, an implicit assumption that normative culture tends to propagate top-down rather than bottom-up. Thus, influencing mass culture now seems like a losing strategy relative to influencing the culture of those who will in the future control AGI (if we manage to have controllable AGI).

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2025-01-14T13:50:09.837Z · LW(p) · GW(p)

Elon Musk, Peter Thiel, and the like — the people the OP mentions — are shaping up to be the ones controlling the Singularity (if anyone does).

comment by Knight Lee (Max Lee) · 2025-01-14T20:01:13.799Z · LW(p) · GW(p)

Can you give an example of a result now which will determine the post-singularity culture in a really good/bad way?

PS: I edited my question post to include "question 2"; what do you think about it?

answer by Seth Herd · 2025-01-14T20:56:34.467Z · LW(p) · GW(p)

Because people aren't rational. Motivated reasoning is a big factor [LW(p) · GW(p)], but we're also all trying to think using monkey brains.

Believing what feels good is evolutionarily adaptive, in the sense that arriving at correct conclusions about whether God or Singularities exist won't help much if doing so makes your tribemates dislike you. This bias is a cumulative, recursive problem that stacks up over the thousands or millions of cognitive acts that go into our beliefs about what we should care about.

And this gets a lot worse when it's combined with our sharp cognitive limitations. We seem to have roughly the least cognitive capacity that still lets a species as a whole very slowly invent and build technologies.

We are idiots, every last one of us. Rationalists with tons of knowledge are a bit less idiotic, but let's not get cocky - we're still monkey-brained idiots. We just don't have the cognitive horsepower to do the Bayesian math on all the relevant evidence, because important topics are complex. And we're resistant, but far from immune, to motivated reasoning: you've got to really love rationalism to enjoy being proven wrong, and so not turn away from it cognitively when it happens.

What I take from all this is that humans are nobly struggling against our own cognitive limitations. We should try harder precisely because rationality is challenging. Success is possible, just not easy and never certain. And very few people are really bad; they're just deluded.

To your exact question:

Musk believes in an intelligence explosion. He cares a lot about the culture war because, roughly as he puts it, he's addicted to drama. I don't know about Thiel.

Most of humanity does not believe in an intelligence explosion happening soon. So actually people who both believe in a singularity and still care about culture wars are quite rare.

I do wonder why people downvoted this quite reasonable question. I suspect they're well-meaning monkey-brained idiots, just like the rest of us.

comment by Knight Lee (Max Lee) · 2025-01-14T21:34:56.021Z · LW(p) · GW(p)

I think the sad part is that although these people are quite rare, they represent a big share of singularity believers' potential influence. E.g. Elon Musk alone has a net worth of $400 billion, while worldwide AI safety spending is between $0.1 and $0.2 billion/year [? · GW].

If the story of humanity were put in a novel, it might be one of those novels that feel quite sour. There isn't even a great battle where the good guys organized themselves, did their best, and lost honorably.

Replies from: Seth Herd
comment by Seth Herd · 2025-01-14T21:53:39.020Z · LW(p) · GW(p)

I disagree. There is such a battle. It is happening right now, in this very conversation. The rationalist X-risk community is the good guys, and we will be joined by more as we organize. We are organizing right now, and already fighting aspects of that battle. It won't be fought with weapons but with ideas. We are honing our ideas and working out goals and strategies. Once we figure out what to do in public, we will fight to get it done. We are already fighting to figure out alignment of AGI, and starting to work on alignment of humans to meet that challenge.

It's a shame Musk hasn't joined up, but in most good stories, the good guys are the underdogs anyway.

Now, I'd much rather live in dull times than exciting ones. But here we are. Time to fight. The main enemy is our collective monkey-brained idiocy.

Join the fight!

Replies from: Max Lee
comment by Knight Lee (Max Lee) · 2025-01-14T22:12:34.683Z · LW(p) · GW(p)

:) that's a better attitude. You're very right.

On second thought, just because I don't see the struggle doesn't mean there is none. Maybe someday in the future we'll learn the real story, and it will turn out beautiful, with lots of meaningful spirit and passion.

Thank you for mentioning this.

answer by Milan W · 2025-01-14T12:49:16.046Z · LW(p) · GW(p)

As you repeatedly point out, there are multiple solutions to each issue. Assuming good enough technology, all of them are viable. Which solutions (if any) end up being illegal, incentivized, made fun of, or made mandatory becomes a matter of which values end up being normative. Thus, these people may be culture-warring because they think they're influencing "post-singularity" values. This would betray the fact that they aren't really thinking in classical singularitarian terms.

Alternatively, they just spent too much time on Twitter and got caught up in dumb tribal instincts. Happens to the best of us.

answer by exmateriae · 2025-01-14T16:21:40.401Z · LW(p) · GW(p)

To add to the other answers: beyond the consequences for the post-singularity world, what happens pre-singularity still matters if the singularity does not happen extremely soon.

  1. We don't know when (or IF!) the singularity will happen.
  2. Does nothing matter until then? What if the singularity happens in a century? Should you abandon all your beliefs because one day, when you will (likely) be dead, people will have solved all problems?

3 comments

Comments sorted by top scores.

comment by Mitchell_Porter · 2025-01-14T15:56:06.745Z · LW(p) · GW(p)

Regarding Musk and Thiel: first and foremost, they are billionaire capitalists, individuals who built enormous business empires. Even if we assume your thinking about the future is correct, we shouldn't assume that they have reproduced every step of it. You may simply be more advanced in your thinking about the future than they are. Their thought about the future crystallized in the 1980s, when they were young. Since then, they have been preoccupied with building their empires.

This raises the question: how do they see the future, and their relationship to it? I think Musk's life purpose is the colonization of Mars, so that humanity's fate isn't tied to what happens on Earth. Everything else is subordinate to that; even robots and AI are just servants and companions for humanity in its quest for other worlds. As for Thiel, I have less sense of the gestalt of his business activities, but philosophically, the culture war seems very important to him. He may have a European sense of how self-absorbed cultural elites can narrow a nation's horizons, which drives his sponsorship of "heterodox" intellectuals outside the academy.

If I'm right, the core of Musk's futurism is space colonization, and the core of Thiel's futurism is preserving an open society. They don't have the idea of an intelligence singularity whose outcome determines everything afterwards. In this regard, they're closer to e/acc than to singularity thinking, because e/acc believes in a future that always remains open, uncertain, and pluralist, whereas singularity thinking tends towards a single apocalyptic moment in which superintelligence is achieved and irreversibly shapes the world.

There are other reasons I can see why they would involve themselves in the culture war. They don't want a socialism that would interfere with their empires; they think (or may have thought until the last few years) that superintelligence is decades away; they see their culture war opponents as a threat to a free future (whether that is seen in e/acc or singularity terms), or even to the very existence of any kind of technological future society. 

But if I were to reduce it to one thing: they don't believe in models of the future according to which you get one thing right and then utopia follows, and they believe such thinking actually leads to totalitarian outcomes (where their definition of totalitarian may be: a techno-political order capable of preventing the building of a personal empire). Musk started OpenAI so that Google wouldn't be the sole AI superpower; he was worried about centralization as such, not about whether they would get the value system right. Thiel gave up on MIRI's version of AI futurology years ago as a salvationist cult; I think he would actually prefer no AI to aligned AI, if the latter means alignment with a particular value system rather than with what the user wants.

Replies from: Seth Herd
comment by Seth Herd · 2025-01-14T20:34:31.114Z · LW(p) · GW(p)

Musk definitely understands and believes in an intelligence explosion of some sort. I don't know about Thiel.

Replies from: Max Lee
comment by Knight Lee (Max Lee) · 2025-01-14T20:47:28.260Z · LW(p) · GW(p)

Thiel used to donate to MIRI, but after reading your comment I searched for more about him and saw this:

“The biggest risk with AI is that we don’t go big enough. Crusoe is here to liberate us from the island of limited ambition.”

(In this December 2024 article)

He's using e/acc talking points to promote a company.

I still consider him a futurist, but it's possible he is so optimistic about AGI/ASI that he's more concerned about the culture war than about AI risk.