post by [deleted]

Comments sorted by top scores.

comment by gwern · 2022-04-12T01:16:52.562Z · LW(p) · GW(p)

These are controversial statements to make about the Cold War, and even more controversial to make about the War on Terror which a lot of people still remember firsthand a good deal less rosily than you seem to.

comment by ChristianKl · 2022-04-12T13:23:14.854Z · LW(p) · GW(p)

> What I keep seeing is that it seems overwhelmingly based on firsthand experience with open, transparent organizations, even though most of the relevant decisions are made by closed, nontransparent orgs like tech companies and military-minded, nationalistic governments.

It's quite natural that you can't see the nontransparent engagements of the people with whom you are talking. Eliezer has spoken about his experiences of being invited to military planning exercises in the past.

Organizations like MIRI and other EA organizations that get a good portion of their budget from billionaires are well connected enough to speak to influential people behind closed doors.

> Meanwhile, the emergence of intense cybersecurity competition has made chipmaking and computer software more opaque than ever, and there is plenty of room for it to be strangled further, merely by the threat of individual hackers (let alone individual Manhattan-projects).

While the threat is high, it still hasn't made the NSA pivot to being more interested in defense and getting companies to fix vulnerabilities than it is in exploiting those vulnerabilities.

> Also, the publicly known history of nuclear weapons isn't all that terrible. According to Thomas Schelling's foundational book Arms and Influence (1966), nuclear escalations are effectively two nuclear powers playing a game of chicken, ramping up the odds of nuclear war until one of the sides folds.

Thomas Schelling didn't know that if Arkhipov hadn't stopped the submarine commander from launching nuclear weapons, they would actually have been used during the Cuban Missile Crisis.

When it comes to understanding our current capability regarding nuclear weapons, Dominic Cummings, who did have inside insight, summarizes the current state in the UK on his Substack:

> If you think I’m probably too pessimistic, then ponder this comment by Professor Allison who has spent half a century in these circles: ‘Over the past decade, I have yet to meet a senior member of the US national security team who had so much as read the official national security strategies’ (emphasis added). NB. he is referring to reading the official strategies, not the explanations of why they are partly flawed!

Even if you think we weren't just lucky with nuclear safety, it's quite different from AGI risk.

Biosafety seems more similar. We are two years into a pandemic that likely exists because of people screwing up biosafety, and there's a lot of pressure against publicly stating that as a basic fact. In a world where it's hard to admit basic facts, we lack the political will to shut down other gain-of-function research.

Shutting down gain-of-function research after a catastrophe that made us all suffer looks like a problem that would be comparatively easier than regulating AGI, but powerful forces are not coordinating to make that happen.

Replies from: TrevorWiesinger
comment by trevor (TrevorWiesinger) · 2022-04-13T14:12:38.078Z · LW(p) · GW(p)
  1. There are big billionaires and little billionaires. And then there are military elites and big billionaires. Inequality is prevalent among elites too, and insulation from ambitious and well-connected outsiders is a prerequisite to having any sort of stability in national security decision-making. However, personal networks abound and all sorts of things can happen due to chance.
  2. I think that nuclear accidents are very real, but they are also overemphasized on LessWrong, and far too few people here know the basics of nuclear deterrence and coercion, which are among the biggest prerequisites to understanding nuclear standoffs and major conflicts like Ukraine. Deliberate actions can be depicted as accidents, dramatically decreasing the risks and costs of the deliberate action.
  3. Gain-of-function research in the current era is understandable and sane, even if it's unfortunate. The programs aren't gathering dust anymore, which they've generally appeared to do since WW2. It's terrible news, obviously, but everyone's thinking about it, which means everyone's thinking about everyone thinking about it. Also, in terms of offense-defense, deterrence and MAD can be outmaneuvered if the enemy has many more options than you, e.g. they can do something that's a little bit insane while your only option is to retaliate with something that's extremely insane.
comment by Ricardo Meneghin (ricardo-meneghin-filho) · 2022-04-12T12:03:02.819Z · LW(p) · GW(p)

If you change the analogy to developing nuclear weapons instead of launching them, the picture becomes much grimmer.

Replies from: TrevorWiesinger
comment by trevor (TrevorWiesinger) · 2022-04-13T13:51:34.151Z · LW(p) · GW(p)

I agree wholeheartedly. In addition, it's worth noting that all militaries around the world live in fear of the CIA, which will give them special attention and burn them if they develop nuclear weapons. As a result, developing them is unthinkable for most. However, the US has always been likely to flip-flop on nonproliferation policy and has already done plenty of that with Iran for more than a decade, and of course Iran persisted in spite of the consequences and despite its existing capability to hold the Gulf hostage.

comment by Raemon · 2022-04-13T05:05:48.749Z · LW(p) · GW(p)

(this post is empty, which I think might be a bug. When I dig into the version-history with my admin-powers I see it was recently updated to delete all content, and want to doublecheck if that was intentional)

Replies from: TrevorWiesinger
comment by trevor (TrevorWiesinger) · 2022-04-13T13:34:16.165Z · LW(p) · GW(p)

It was intentional. I wasn't aware of the "do not frontpage" option, and even if I had been, I never seriously thought about the possibility that it would happen.

I'm new here, and I didn't do enough thinking about the tradeoffs between a smaller, tight-knit discussion among specialists and hobbyists, and a broader display to all kinds of people. I'm fine with additional people seeing it here and there, since I wouldn't have written it if I didn't think it was highly worth reading, but big public statements come with consequences that can't be predicted, even on LessWrong where people are very pragmatic.