Comments

Comment by Jacopo Baima (jacopo-baima) on The Future of Nuclear Arms Control? · 2021-02-15T11:07:30.069Z · LW · GW

"I think a good path for nuclear modernization would be to generally reduce nuclear weapon yields while increasing precision"

I am unsure about this. My impression is that this is partly the trend going on right now, and that geopolitics experts are very worried about it, because it opens a path to gradual escalation and makes nuclear war more likely. When nuclear bombs were large and inaccurate, they were kept only as a last resort for deterrence: it was clear that any use would be followed by a devastating counter-strike with ~100% probability. In contrast, when a military has smaller, well-targeted tactical nukes, there is a temptation to use them for a limited strike, perhaps against military installations, thinking that at worst the other country will do the same (that's what the word "tactical" means in this context, after all). But the country on the receiving end may very well react with an all-out nuclear counter-strike, especially if its own capabilities for a tactical strike are limited or absent (or crippled by the first country's attack).

This is very much a real-world worry, by the way. Only a year or two ago, the risk of this scenario was discussed for an India-Pakistan war, after Pakistan increased its tactical nuke capabilities, if I remember correctly. In fact, given the stated policies of the two countries and their respective capabilities, it is what would likely happen if no one is bluffing. (In detail: India overwhelms Pakistani conventional forces and attempts a quick, deep invasion of Pakistan to disable nuclear installations and/or force surrender -> Pakistan destroys the invading force or cripples its logistics chain with tactical nukes -> India uses its own nukes, not ruling out targeting major cities.)

Of course the complete elimination of larger "strategic" arsenals, as you propose, would ease this worry somewhat. However, in a world where military technology levels are not equal everywhere, it may be impossible to convince the less advanced military to give up what it sees as its only deterrent.

Comment by Jacopo Baima (jacopo-baima) on A review of Where Is My Flying Car? by J. Storrs Hall · 2020-11-08T19:18:22.402Z · LW · GW

I have not read the book, so I might not do it justice. But while the topic is very much worthwhile, some of the stories given to explain "the roots of stagnation" seem off to me. I will try to explain why.

On nanotech: the NNI was a US initiative, and it is strange to explain a worldwide setback by a dysfunction of US funding. Europe has a completely different science funding system, with EU-scale funding interacting with diverse national systems and priorities. One research area that did get large centralized funding in the EU is nanoscale materials, which the review above describes as contributing to the death of nanotech in order to get a share of the pot of money. Yet in both the US and the EU, nanoscale materials research is alive, if with somewhat disappointing results, while nanotech is struggling. You could say that research in the EU is tied to that in the US, which would be true, although the relationship is complex and bidirectional. But other geographical areas are more loosely connected: Japan, for example, has its own research ecosystem, to the point that it often takes me much longer to read a Japanese paper - too many references to things I have never heard of. There is no chance that the NNI killed nanotech in Japan as well. My hypothesis (pure speculation) is that nanotech went for the technological payoff too soon, when it needed ten more years of basic research, improved tools, etc.

On nuclear: I don't think it makes sense to discuss nuclear without mentioning the military applications, i.e. the atomic bomb. I think this contributed to the difficulties of the technology through two channels. First, in the public mind nuclear = scary is a very clear and strong association, so good luck to any attempt to deregulate nuclear power. Second, a top priority of governments has been to limit access to nuclear technology, i.e. the opposite of a free market. They were probably correct: in the world imagined above, with at-home nuclear reactors, all countries and many non-state actors would have nuclear bombs. By now we would have had a dozen nuclear wars, and al-Qaeda would have blown up Manhattan instead of just destroying two skyscrapers.

I suspect the author of the book approached the subject from a libertarian perspective and concluded, unsurprisingly, that the government is at fault.

I also have more nebulous doubts about the "counterculture trends" explanation. My feeling is that anti-technology culture has always been there, and pro-technology culture has not really gone away. But explaining my position on this would make this wall of text even longer :)

Comment by Jacopo Baima (jacopo-baima) on Availability · 2020-09-25T15:26:56.882Z · LW · GW

The increased damage is due to building more on the flood plains, which brings economic gains. It is very possible that those gains outweigh the increased damage. Within standard economics, they should, unless strongly subsidized insurance (or the expectation of state help for the uninsured after a predictable disaster) is distorting the incentives. Then again, standard economics assumes rational agents, which is kind of the opposite of what is discussed in this post...

The straightforward way to force irrational homeowners, business owners, and developers to internalize the risk would be compulsory but unsubsidized insurance. That's not politically feasible, I think. That's why most governments instead use some clunky and probably sub-optimal combination of regulation, subsidized insurance, and other policies (such as getting the same community to pay for part of the insurance subsidies through local taxes).