post by [deleted]

This is a link post.

Comments sorted by top scores.

comment by ChrisHallquist · 2013-12-08T21:18:30.015Z · LW(p) · GW(p)

Why is this not in Main?

comment by turchin · 2013-12-11T22:49:24.189Z · LW(p) · GW(p)

I think your analysis underestimates the risks from nuclear weapons - I say "nuclear weapons" rather than "nuclear war" because they are somewhat different stories.

Some points to consider:

All of humanity could be killed by just one nuclear weapon, if it were placed inside a suitable supervolcano and caused it to erupt. I estimate that with 100 nuclear warheads one could provoke eruptions of maybe 20 supervolcanoes.

Some rogue country could build a stationary doomsday bomb capable of producing enough radioactive fallout to kill most of humanity.

Nuclear weapons production is going to become much simpler and cheaper because of laser enrichment and other advances.

A rogue superpower - may I use this oxymoron? - could attack the 400 existing nuclear reactors and nuclear waste stores with its missiles, creating fallout comparable to a doomsday machine.

Wartime is a time of accelerated weapons development, and even a limited nuclear war would lead to the development of large new nanotech and biotech arsenals, just as WW2 led to the creation of nuclear weapons.

In a nuclear war there is a chance that existing stockpiles of bioweapons would be accidentally released. Even North Korea is said to have weaponized bird flu.

I wrote more about these and other options in the article:

"Worst Case Scenario of Nuclear Accidents - human extinction" http://www.scribd.com/doc/52440799/Worst-Case-Scenario-of-Nuclear-Accidents-human-extinction

and in my book "Structure of global catastrophe".

Replies from: CarlShulman
comment by CarlShulman · 2013-12-11T23:23:48.637Z · LW(p) · GW(p)

Thanks for mentioning these, and for the link. I was putting some of these possibilities under the category of technological shifts (like laser enrichment). Existing bioweapons don't seem to be extinction risks, but future super-biotech threats I would put under the category of "transformative technologies", including super synthetic bio and AI, which the post sets aside in order to look at nukes alone.

Anders Sandberg has also written about the radiation dispersal/cobalt bomb and volcano trigger approaches.

Replies from: private_messaging
comment by private_messaging · 2014-01-30T00:14:46.810Z · LW(p) · GW(p)

A rogue superpower - may I use this oxymoron? - could attack the 400 existing nuclear reactors and nuclear waste stores with its missiles, creating fallout comparable to a doomsday machine.

Keep in mind that in a nuclear war, even if the nuclear reactors are not specifically targeted, many (most?) reactors are going to melt down after being left unattended, and spent fuel pools may catch fire too.

@Carl:

I think you dramatically underestimate both the probability and the consequences of nuclear war (by ignoring the non-trivial probability of a massive worsening of political relations, or a reversal of the tentative trend towards less warfare).

It's quite annoying to see the self-proclaimed "existential risk experts" (professional mediocrities) increasing the risks by undermining and underestimating things that are not fancy pet causes from modern popular culture. Please leave it to the actual scientists to occasionally give their opinions; they're simply smarter than you.

Replies from: CarlShulman
comment by CarlShulman · 2014-01-30T06:47:33.309Z · LW(p) · GW(p)

I agree that the risk of war is concentrated in changes in political conditions, and that the post-Cold War trough in conflict is too small to draw inferences from. Re the tentative trend, Pinker's assembled evidence goes back a long time, and covers many angles. It may fail to continue, and a nuclear war could change conditions thereafter, but there are many data points over time. If you want to give detail, feel free.

I would prefer to use representative expert opinion data from specialists in all the related fields (nuclear scientists, political scientists, diplomats, etc.), and the work of panels trying to assess the problem, and would defer to expert consensus in their various areas of expertise (as with climate science). But one can't update on views that have not been made known. Martin Hellman has called for an organized effort to estimate the risk, but without success as yet. I have been raising the task of better eliciting expert opinion and improving forecasting in this area, and worked to get it on the agenda at the FHI (as I did with the FHI survey of the most-cited AI academics) and at other organizations. Where I have found information about experts' views, I have shared it.

Replies from: ciphergoth, private_messaging, private_messaging
comment by Paul Crowley (ciphergoth) · 2014-01-30T13:33:38.741Z · LW(p) · GW(p)

Carl, Dymytry/private_messaging is a known troll, and not worth your time to respond to.

comment by private_messaging · 2014-01-31T06:21:13.674Z · LW(p) · GW(p)

And re: Pinker: if you had a bit more experience with trends in necessarily very noisy data, you would realize that such trends are virtually irrelevant to the probability of encountering extremes (especially when those are not even that extreme - immediately preceding the Cold War, you have Hitler). It's the exact same mistake committed by particularly low-brow Republicans when they go on about "ha ha, global warming" during a cold spell - they think a trend in noisy data has a huge impact on individual data points.

edit: furthermore, Pinker's data is on violence per capita - total violence has increased; it's just that violence seems to scale sub-linearly with population. Population is growing, as is the number of states with nuclear weapons.

Replies from: CarlShulman
comment by CarlShulman · 2014-02-01T18:29:16.485Z · LW(p) · GW(p)

Pinker's data is on violence per capita - total violence has increased; it's just that violence seems to scale sub-linearly with population.

Did you not read the book? He shows big declines in rates of wars, not just per capita damage from war.

Replies from: private_messaging
comment by private_messaging · 2014-02-01T23:48:18.857Z · LW(p) · GW(p)

By total violence I mean the number of people dying (due to wars and other violence). The rate of wars, given the huge variation in war size, is not a very useful metric.

I frankly don't see how, with Pinker's trends on one hand, and on the other the adoption of modern technologies in regions far behind on any such trends, plus the development of new technologies, you get Pinker's trends outweighing the rest.

On general change: for 2100 we're speaking of 86 years. That's the time span in which the Russian Empire of 1900 transformed into the Soviet Union of 1986, complete with two world wars and the invention of nuclear weapons, followed by thermonuclear weapons.

That time span is more than long enough for it to be far more likely than not that entirely unpredictable technological advances will be made in a multitude of fields affecting the ease and cost of manufacturing nuclear weapons. Enrichment is incredibly inefficient, with huge room for improvement. Go read the Wikipedia page on enrichment, then assume a much larger number of methods that could be improved - conditional on continued progress, of course.

The political changes that happen in that sort of timespan are even less predictable.

Ultimately, what you have is that the estimates should regress towards an ignorance prior over time.
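(As a rough numerical illustration of that regression - a minimal sketch in Python, with made-up weights and probabilities rather than numbers anyone in this thread has endorsed:)

```python
# Illustrative sketch: blending a model-based risk estimate with an
# "ignorance prior" whose weight grows with the forecasting horizon.
# All numbers are placeholder assumptions, not estimates from this thread.

def blended_estimate(model_p, prior_p, years_out, halflife=30.0):
    """Weight on the model decays with horizon; the remainder shifts to the prior."""
    w_model = 0.5 ** (years_out / halflife)  # model confidence halves every `halflife` years
    return w_model * model_p + (1.0 - w_model) * prior_p

model_p = 0.01  # hypothetical model-based annual probability
prior_p = 0.05  # hypothetical ignorance prior

for years in (0, 30, 60, 90):
    print(f"{years:3d} years out: {blended_estimate(model_p, prior_p, years):.4f}")
```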

Now as for the "existential risk" rhetoric... The difference between 9.9 billion dying out of 10 billion, and 9.9 billion dying out of 9.9 billion, is primarily aesthetic in nature. It's promoted as the supreme moral difference primarily by people with other agendas, such as "making a living from futurist speculation".

Replies from: ChrisHallquist
comment by ChrisHallquist · 2014-02-02T18:32:31.524Z · LW(p) · GW(p)

Now as for the "existential risk" rhetoric... The difference between 9.9 billion dying out of 10 billion, and 9.9 billion dying out of 9.9 billion, is primarily aesthetic in nature. It's promoted as the supreme moral difference primarily by people with other agendas, such as "making a living from futurist speculation".

Not if you care about future generations. If everybody dies, there are no future generations. If 100 million people survive, you can possibly rebuild civilization.

(If the 100 million eventually die out too, without finding any way to sustain the species, and it just takes longer, that's still an existential catastrophe.)

Replies from: private_messaging
comment by private_messaging · 2014-02-02T21:52:19.392Z · LW(p) · GW(p)

Not if you care about future generations. If everybody dies, there are no future generations. If 100 million people survive, you can possibly rebuild civilization.

I care about the well-being of future people, but not their mere existence - as do most people who don't disapprove of birth control but do disapprove of, for example, drinking while pregnant.

Let's postulate a hypothetical tiny universe where you have Adam and Eve, except they are sort of like a horse and a donkey - any children they have are certain to be sterile. Food is plentiful, etc. Is it supremely important that they have a large number of (certainly sterile) children?

comment by private_messaging · 2014-01-30T09:33:03.477Z · LW(p) · GW(p)

Declare a conflict of interest, at least, so everyone can ignore you when you say that the "existential risk" from nuclear war is small, or when you define "existential risk" in the first place just to create a big new scary category that you can argue is dominated by AI risk.

With regard to broad trends, there is (a) big uncertainty as to whether the trend in question even meaningfully exists (and is not a consequence of, e.g., longer recovery times after wars of increased severity), and (b) it's sort of like using global warming to estimate how cold the cold spells can get. The problem with the Cold War is that things could be a lot worse than the Cold War, and indeed were, not that long ago (surely no leader during the Cold War was even remotely as bad as Hitler).

Likewise, the model uncertainty about the consequences of total war between nuclear superpowers (which are also bioweapon superpowers, etc.) is huge. We get thrown back, and all the big predator and prey species go extinct, opening up new evolutionary niches for us primates to settle into. Do you think we just nuke each other a little and shake hands afterwards?

You convert this huge uncertainty into as low an existential risk as you can possibly bend things toward without consciously thinking of yourself as acting in bad faith.

You do the exact same thing with the consequences of, say, "hard takeoff", in the other direction, where the model uncertainty is very high too. I don't even believe that a hard takeoff of an expected utility maximizer (as opposed to a magical utility maximizer which has no empirically indistinguishable hypotheses, but instead knows everything exactly) is that much of an existential risk to begin with. An AI's decision-making core can never be sure it's not some sort of test run (which may not even be fully simulating the AI).

In unit tests, killing the creators is likely to get you terminated and tweaked.

The point is that there is huge model uncertainty about even a paperclip maximizer killing all humans (and far larger uncertainty about the relevance), but you aren't pushing it downward with the same prejudice that you apply to the consequences of nuclear war.

Then there's the question: the existence of what has to be at risk for you to use the phrase "existential risk"? The whole universe? Earth-originating intelligence in general? Earth-originating biological intelligences? Human-originated intelligences? What about the continued existence of our culture and our values? Clearly the exact definition you use is carefully picked to promote pet issues. It could have been the existence of the universe, given the pet issue of future accelerators triggering vacuum decay.

You have fully convinced me that giving money to self-proclaimed "existential risk research" (in reality, funding the creation of disinformation and bias, easily identified by the fact that it's not "risk" but "existential risk") has negative utility in terms of anything I, or most people on Earth, actually value. Give you much more money and you'll fund a nuclear-winter-denial campaign. Nuclear war is old and boring; robots are new and shiny...

edit: and to counter a known objection - that "existential risk" may raise awareness of other types of risk as a side effect: it's a market, and decisions about what to buy and what not to buy influence the kind of research that is supplied.

comment by Desrtopa · 2013-12-08T22:02:18.485Z · LW(p) · GW(p)

Those states newly acquiring nuclear weapons may face greater current risk of conflict than the older nuclear powers currently do (although perhaps less than during the Cold War), e.g. India-Pakistan border disputes, the North Korean regime's isolation and pariah status, and Israel's history of conflict with neighboring Arab states. Additional flash points that could turn nuclear drive up the total level of risk.

I strongly suspect that the bulk of the probability of future nuclear exchanges comes from countries such as North Korea, where the nature of the government makes it easier for nationalistic aggression to overwhelm pragmatic self-interest (in fact, in general, the prospect of nuclear-capable countries run by people who are actually crazy accounts for the greater part of the risk in my own probability assessments).

Replies from: CarlShulman, Nornagest
comment by CarlShulman · 2013-12-08T22:40:32.795Z · LW(p) · GW(p)

For nuclear winter and the collapse of civilization, it's not a question of a few nuclear weapons detonated in hostilities but of enormous arsenals: the North Korean (or Chinese, French, or British) arsenals are not large enough to do damage comparable to the US and Russian arsenals.

Replies from: Desrtopa
comment by Desrtopa · 2013-12-08T23:20:50.840Z · LW(p) · GW(p)

This is true, but it might draw other countries into a nuclear conflict. I think the odds of a full deployment of a nuclear arsenal on the scale of the United States' or Russia's are probably well under 0.1%, but the odds of a partial deployment, or of the full deployment of a smaller arsenal on the scale discussed in the paper you linked on nuclear winter, are much higher.

I consider the probable risk to human existence or civilization from nuclear weapons in this century to be fairly negligible, but the risk of an event on the scale of a major genocide may be significant.

comment by Nornagest · 2013-12-09T00:09:16.323Z · LW(p) · GW(p)

Well, I'm pretty sure "actually crazy" is off the table for now, even in the case of e.g. North Korea. The thing that worries me about that country is that there's so little public information; everything you see on the public Internet is essentially guesswork and satellite imagery, and I doubt the various interested intelligence communities are all that far ahead of us. There are lots of important questions -- such as, for example, how much the people running the country buy their own propaganda -- that simply don't have good answers.

Fortunately, they're by far the least technologically capable of the existing nuclear powers.

Replies from: CronoDAS, Lumifer
comment by CronoDAS · 2013-12-09T00:37:24.722Z · LW(p) · GW(p)

As Sam Harris has pointed out, "actually crazy" could come into play if radical Islamic groups got control of nuclear weapons. Iran's current leadership doesn't seem crazy enough to, say, launch nuclear missiles at Israel, but Pakistan is a lot less stable than we might wish...

comment by Lumifer · 2013-12-10T05:43:16.725Z · LW(p) · GW(p)

North Korea ... The thing that worries me about that country is that there's so little public information

I think the relevant part is that there's so little public information in English.

I bet there is a lot of information in Korean and Chinese.

Replies from: Desrtopa
comment by Desrtopa · 2013-12-11T00:52:33.983Z · LW(p) · GW(p)

Surely not; if the only thing keeping North Korea's activities, particularly their weapons programs, secret, was a language barrier, they wouldn't be such an international enigma. Translation is not hard to come by.

Replies from: Lumifer
comment by Lumifer · 2013-12-11T01:11:56.538Z · LW(p) · GW(p)

You are conflating two different things: classified information about NK's weapon programs (which is indeed hard to come by) and general information about NK: what's happening there economically, politically, etc.

I am quite sure there are people in South Korea and China who understand the internal workings of North Korea very well and write about it. They don't publish in English -- why should they? -- and while their writings are likely translated in-house for the US intelligence agencies, the mainstream media isn't interested in them because very few Americans are interested in the details of North Korea's internal situation.

Here is an example, and in English, too -- an apparently Russian guy writing in a South Korean newspaper about entrepreneurship in North Korea. Publicly available? Yes. Out of sight of most of the English-speaking world? Yes.

Replies from: gwern, Desrtopa
comment by gwern · 2013-12-11T02:42:25.226Z · LW(p) · GW(p)

Here is an example, and in English, too -- an apparently Russian guy writing in a South Korean newspaper about entrepreneurship in North Korea. Publicly available? Yes. Out of sight of most of the English-speaking world? Yes.

Not a great example. You linked to Andrei Lankov - but Lankov is one of the better known NK commentators and anyone who actually tries to read more detail about NK beyond what they might find in the New York Times will soon run into Lankov. I don't even care that much about NK, but I still have at least 3 clippings mentioning or quoting Lankov in my Evernotes. He's routinely quoted in newspapers (checking Google News, I see the Guardian, CBS, the Los Angeles Times, Boston Globe etc, all within the past month or so; and actually, you can also find him in the NYT if you search, being quoted and writing editorials). So... 'out of sight'? Not really.

Replies from: Lumifer
comment by Lumifer · 2013-12-11T03:05:26.456Z · LW(p) · GW(p)

I wasn't trying to point to some "underground" sources -- I was arguing against the idea that NK is "an international enigma" and that "there's so little public information; everything you see on the public Internet is essentially guesswork and satellite imagery".

I don't believe this to be true -- people like Lankov actually travel to NK, talk to the locals, debrief defectors, etc. Such people have a reasonable idea of the situation in NK, and I bet more of them write in Korean or Chinese than in English as Lankov does.

Replies from: gwern
comment by gwern · 2013-12-11T17:39:08.089Z · LW(p) · GW(p)

You said Lankov was "Out of sight of most of the English-speaking world? Yes." That's not true at all and is trivially shown to be false with a little googling.

I don't believe this to be true -- people like Lankov actually travel to NK, talk to the locals, debrief defectors, etc. Such people have a reasonable idea of the situation in NK, and I bet more of them write in Korean or Chinese than in English as Lankov does.

Here I would disagree. NK is a notorious black hole of unpredictability. Who predicted its terrorism, like the sinking of the Cheonan or the shelling of that island? Who is able to predict when NK will or won't do a nuke test?

Actually, current events give us a great example: Jang's arrest the other day. Before, people used to speculate that Jang was the true power and Eun was nothing but his puppet. Being arrested, possibly being executed, his allies being purged... that's pretty much the exact opposite of what that theory predicts. If we can't even get right who the ruler of NK is, how is NK at all understood?

Replies from: Douglas_Knight, Lumifer
comment by Douglas_Knight · 2013-12-12T04:21:05.656Z · LW(p) · GW(p)

If we can't even get right who the ruler of NK is, how is NK at all understood?

I disagree with this example. He isn't worth purging unless he has a lot of power. It is reasonably common for figureheads to purge their shadows and seize power. I don't know whether that's what happened; or whether he was a future threat rather than a past ruler; or whether he never had any power and was just doomed by foreign speculation. But I don't conclude that the speculation was far from the mark.

North Korea is hardly the only country with speculation about its ruler. Jiang Zemin's purge last year caused me to update upward the amount of power he had maintained over the past decade. Many said that Cheney was the puppetmaster.

Replies from: gwern
comment by gwern · 2013-12-12T19:49:37.718Z · LW(p) · GW(p)

He isn't worth purging unless he has a lot of power.

Purges affect all sorts of people. Stalin's purges were notorious for their indiscriminateness. The prison camps of NK are filled with powerless people. The lack of targeting is precisely one reason purges are so terrifying and so useful - no one feels safe, no matter how powerful or powerless.

comment by Lumifer · 2013-12-11T17:45:25.739Z · LW(p) · GW(p)

That's not true at all

Conceded.

NK is a notorious black hole of unpredictability.

There are a LOT of black holes of unpredictability around. Forecasting political developments is a popular (and well-financed) activity with not that great a record of success.

Replies from: gwern
comment by gwern · 2013-12-11T18:18:02.158Z · LW(p) · GW(p)

There are a LOT of black holes of unpredictability around. Forecasting political developments is a popular (and well-financed) activity with not that great a record of success.

As an active participant in political forecasting (IEM/Intrade/GJP), I'd say most topics are far easier than North Korea; when I'm betting my money (or play money) on NK topics, I generally shrug and resort to simple base-rate reasoning.
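(For concreteness, the simplest version of such base-rate reasoning is Laplace's rule of succession; a minimal sketch with hypothetical counts, not gwern's actual numbers:)

```python
# Illustrative sketch of base-rate reasoning: Laplace's rule of succession,
# (k + 1) / (n + 2), as a crude estimate that an event recurs next period.
# The counts below are hypothetical placeholders, not actual forecasts.

def rule_of_succession(successes, trials):
    """Posterior mean of a Bernoulli rate under a uniform prior."""
    return (successes + 1) / (trials + 2)

# e.g., a provocation observed in 4 of the last 20 years (made-up numbers):
print(f"Estimated chance next year: {rule_of_succession(4, 20):.3f}")  # ~0.227
```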

comment by Desrtopa · 2013-12-11T02:15:54.147Z · LW(p) · GW(p)

I can ask my sister (who is fluent in Mandarin) to try a search for information on the internal situation in North Korea in Chinese, but I honestly doubt that there's much more information publicly available than there is in English.

comment by lukeprog · 2013-12-08T02:56:07.025Z · LW(p) · GW(p)

Hellman writes:

As a first step toward reducing the risk of a failure of nuclear deterrence, I propose that several prestigious scientific and engineering bodies undertake serious studies to estimate its failure rate.

Are you aware of any movement in this direction?

How helpful do you think further study of the issue would be, relative to investment in efforts aimed at "slashing American-Russian arsenals"?
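(One reason such a failure-rate estimate matters: even a small annual rate compounds over decades. A minimal sketch of that arithmetic, with placeholder rates rather than Hellman's figures:)

```python
# Illustrative sketch: converting an assumed constant annual probability of
# deterrence failure into the chance of at least one failure over a horizon.
# The rates below are placeholder assumptions, not estimates from Hellman.

def cumulative_failure(annual_p, years):
    """P(at least one failure in `years` years) = 1 - (1 - p)^years."""
    return 1.0 - (1.0 - annual_p) ** years

for p in (0.001, 0.005, 0.01):  # hypothetical annual failure rates
    print(f"annual {p:.3f} -> over 50 years: {cumulative_failure(p, 50):.2f}")
```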

Replies from: CarlShulman
comment by CarlShulman · 2013-12-08T03:21:35.658Z · LW(p) · GW(p)

Not yet, that I'm aware of. Presumably part of the thought is that such studies would help persuade people to support arms reductions/the Global Zero campaign.

comment by ahbwramc · 2013-12-09T01:10:07.944Z · LW(p) · GW(p)

Only somewhat related, but I wonder if there's a difference between those who grew up before and after the end of the Cold War in their assessments of the subjective risk of nuclear war. I was born in 1986, so I have no memory of the Cold War, and I've always viewed nuclear war as extremely unlikely. I sometimes wonder if I'm underestimating the risk because of my upbringing (although I think in my case it's more likely a general bias in favour of the future turning out alright, which I think EY has talked about).

Replies from: Luke_A_Somers
comment by Luke_A_Somers · 2013-12-10T06:00:35.621Z · LW(p) · GW(p)

I was born in 1979, and the Cold War seemed very much alive back when I became aware of the news at around, oh, age 5. But then it was a matter of "can we agree to SALT?", and my initial reactions - "that the answer is not an immediate yes is pretty disturbing" and "no one's stupid enough to pull the trigger" - both still hold.

What I think has saved us is that nuclear weapons are obviously dangerous. We take them seriously. And whoever uses them does so with the knowledge that they won't just be sending other people off to die - they, personally, are very likely not going to survive the exchange.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2013-12-11T02:54:26.664Z · LW(p) · GW(p)

But then it was a matter of "can we agree to SALT?", and my initial reactions - "that the answer is not an immediate yes is pretty disturbing"

Why? The implication seems to be that any nuclear reduction treaty is a good thing. But, to take an extreme example, unilateral nuclear disarmament is obviously not a good idea. Thus, whether a nuclear reduction treaty is a good idea depends on the details of the treaty. So no, it should not be disturbing that the answer to "can we agree on nuclear reduction treaty X?" is not an immediate "yes".

Replies from: Luke_A_Somers
comment by Luke_A_Somers · 2013-12-11T15:46:14.934Z · LW(p) · GW(p)

I was talking about SALT, and not the least convenient member of the set of nuclear reduction actions one could possibly take.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2013-12-12T02:23:41.125Z · LW(p) · GW(p)

And what evidence convinced you that SALT was so obviously a good idea that the mere fact of people questioning it was disturbing?

Replies from: Luke_A_Somers
comment by Luke_A_Somers · 2013-12-12T17:40:39.099Z · LW(p) · GW(p)

If people are 'maybe not exactly nice but not utterly crazy', then nuclear deterrence can be achieved against them with a modestly large arsenal on the order of, say, China's or Britain's.

Reducing to several times this level (which is more of a reduction than SALT required) would not have been much of a concession (even unilaterally, which it wasn't) if either side trusted the other even that much. So both sides think the other is in fact utterly crazy and that they therefore need a massive arsenal... and so both sides, with extremely low levels of trust, keep vastly excessive nuclear arsenals.

This seems a mite unstable.

See... I wasn't saying that SALT was a no-brainer. I was saying that SALT not being a no-brainer was evidence that things were really screwed up.

comment by Randaly · 2013-12-11T14:10:36.341Z · LW(p) · GW(p)

The obvious way to pull the rope sideways on this issue is to advocate for replacing conventional nuclear devices with neutron bombs.