Anti-automation policy as a bottleneck to economic growth
post by mhampton · 2025-04-09
Epistemic status: These are preliminary notes on a topic that I hope to explore more thoroughly in the future. I would estimate the probability of the scenario I'm describing as less than 50%, but still significant enough to be worth discussing.
Introduction
A commonly suggested policy solution to automation-driven unemployment is a universal basic income. An implicit or explicit assumption in such proposals is that automation will create such significant economic growth that there will be ample gains for a UBI to redistribute — making all of us better off rather than just cutting up a fixed pie.[1] But is this necessarily the case? Or are there situations in which AI could drive widespread unemployment without causing similarly significant economic growth that could fund UBI? How likely are such scenarios, and what are the implications?
I raise the possibility that, even if many jobs are automated, growth may not "foom" if some employees successfully push for policies that preserve their jobs and these non-automated jobs serve as a bottleneck to growth. On the other hand, gross world product (GWP) growth may still be significant, because it is unlikely that such policies will be adopted in every country. Even so, this may create hurdles for countries that automate incompletely: the fact that some countries' automation spurs GWP growth may not benefit countries that do not experience this growth themselves.
Unautomated tasks may bottleneck growth
Several reasons why growth may hit bottlenecks despite widespread automation are discussed in Davidson (2021). Among them is the idea of essential but unautomated tasks.
Davidson notes that, even if many tasks are automated, there may be complementary tasks that are not automated, which would limit growth.[2] Using a toy example, he explains:
“Suppose there are three stages in the production process for making a cheese sandwich: make the bread, make the cheese, combine the two together. If the first two stages are automated and can proceed much more quickly, the third stage can still bottleneck the speed of sandwich production if it isn’t automated. Sandwich production as a whole ends up proceeding at the same pace as the third stage, despite the automation of the first two stages.”
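Davidson's toy example can be sketched numerically. When every unit of output requires all stages, aggregate throughput follows a Leontief-style (fixed-proportions) production function: it is capped by the slowest stage. The stage rates below are hypothetical illustrations, not figures from the source.

```python
# Illustrative sketch of Davidson's cheese-sandwich bottleneck.
# Each stage has a throughput in sandwich-equivalents per hour; total
# output is the minimum across stages (a Leontief production function).

def sandwich_output(bread_rate, cheese_rate, assembly_rate):
    """Sandwiches per hour when every sandwich needs all three stages."""
    return min(bread_rate, cheese_rate, assembly_rate)

before = sandwich_output(10, 10, 10)      # no automation
after = sandwich_output(1000, 1000, 10)   # first two stages automated 100x

print(before, after)  # prints: 10 10
```

Despite a hundredfold speedup in two of three stages, output is unchanged: the unautomated assembly stage binds, which is the sense in which a single preserved task can bottleneck growth.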
This may happen due to inherent limits to creating automation technology[3] or human preferences,[4] but one reason that I think may be worth discussing is legal restrictions on automation.
Are governments growth-maximizers?
Davidson suggests, “There will be huge incentives to remove bottlenecks to growth, and if there’s just one country that does this it would be sufficient.”[5] Indeed, governments do have such incentives: Economic growth can increase tax revenues and make voters happy.
But governments also face incentives that cut in other directions. Do we have strong reasons to believe that governments, acting on the combination of these incentives, will be adequate at picking low-hanging fruit with regard to economic growth?[6] There are clear real-world examples in which they are not. Consider the resource curse: resource-rich countries often fail to develop economically because the ruling elite can rely on revenue from natural resources, eliminating the need for an economically productive population to tax.[7] Such elites could in theory encourage their population to develop economically, allowing the government to collect tax revenues as well as resource rents, but they judge (perhaps accurately) that it is in their material interest to rule a less prosperous autocracy rather than a more prosperous state in which their rule would be more constrained.
Democracies with diverse economies have stronger incentives to cater to their citizens, but it’s not clear that this system is aligned with economic growth either. Rather than disinterestedly calculating what will maximize GDP,[8] governments are subject to numerous competing voting blocs and interest groups, each of which makes demands in its own interests, and the product of these interests is not necessarily optimal. Two examples:
- Nuclear power has the potential to provide cheap, low-carbon electricity, but because voters worry about accidents (despite nuclear being safer than fossil fuels), it supplies only about 9% of the world’s electricity.
- Professional groups often engage in regulatory capture or lobby for regulations that benefit their members at the expense of others, such as the American Medical Association preventing telehealth visits.[9]
Will we see anti-automation policies?
Groups are already taking political action to protect their employment from automation.[10] It should not be surprising if this becomes more salient as AI becomes able to replace a broader range of human jobs. People generally dislike being unemployed, and may feel this way even if they anticipate that they could receive a UBI.[11] People link their identity to their work; they think of it as more than just a source of income; they attribute to it a sacredness that cannot be copied by "mere machines," as we are seeing in the acrimonious debates in which artists argue that AI art is definitely Not Real Art.
The important question for this scenario is whether some of the groups that serve as essential elements of economic production (perhaps transportation or manufacturing) effectively preserve their jobs, stifling productivity increases. Since the ability to exercise political power is distributed unevenly between professions, we may then see a situation where professional groups/industries with sufficient political power get preserved, whereas workers without this influence are replaced.[12] If these unautomated jobs serve as significant bottlenecks, then, despite many jobs being automated, there is no economic boom to fund a UBI; the attempts to save some from automation cause it to more severely harm the population as a whole.[13]
Localism and adequacy
But as alluded to by Davidson, we do not have to assume that governments tend to do what is in the interest of economic growth: “if there’s just one country that does this it would be sufficient.” Nor do we have to assume that even this one country acts in the interest of economic growth in every way, or even in most ways, but only in the one particular question of automation. The idea that regulation will prevent economic growth then faces a much higher bar to pass: it must imply that governments are so misaligned with economic growth that barriers will not be lifted in even one of the world’s many jurisdictions. Perhaps this is surmountable, but it is a more difficult challenge, and a priori, it tilts the scale towards Davidson’s assumption.
This tilt is a consequence of the fact that we have many jurisdictions. Does this reasoning, therefore, suggest that international decision-making on the question of automation should be avoided, on the grounds that we would be “putting all our eggs in one basket”?
At the same time, does it also suggest the need for free trade, free movement across borders, or globalized redistribution? If e.g. Liechtenstein is the one country whose policies allow for automation in a way that leads to explosive growth, this does little for countries outside its borders if they see little of this growth.
Counterpoints
- One may object that a world in which certain sectors preserve their jobs from automation is one in which humans can still find jobs, and therefore no UBI is needed. But one should not expect these jobs to be plentiful enough or available to many or most people who have been displaced. Indeed, regulatory capture would likely involve erecting barriers to entry, as increased labor supply would drive down wages.
- Growth may still be high enough to fund an ample UBI even if bottlenecked.
- People will likely find ways to work around previously essential unautomated tasks.
- It may still be a net benefit to preserve jobs on non-economic grounds.
- ^
E.g. Altman (2021)
- ^
See also Aghion et al. (2017). Note that while growth may still increase to some degree if there are bottlenecks, it will not be able to increase to the degree that it would if bottlenecks were removed.
- ^
E.g. difficulties in obtaining training data that conveys tacit knowledge.
- ^
E.g. wanting humans to act as athletes or caregivers.
- ^
I.e. sufficient for “explosive growth” defined as >30% annual gross world product increase.
- ^
Bryan Caplan alludes to this concern in this article (cited by Davidson at footnote 75): “But the most favorable political environments on earth still have plenty of regulatory hurdles – especially for technologies that pose a threat to reigning powers.”
- ^
As summarized by Luke Drago: “Because they earn money from resources, rentier states have no incentive to pay regular people today or invest in them tomorrow. Building better schools doesn’t earn them more money. They invest just as much as it takes to move the oil out of the ground, onto trucks, and out to the ports. It’s not that their citizens couldn’t do anything worth taxing, it’s that there’s no reason to develop them into a taxable population. Why ask your people for money when you can get it from the ground?”
- ^
Or any other, perhaps more meaningful, metric of well-being.
- ^
As discussed by labor economist Michael Webb here.
- ^
E.g. Wiseman (2025), Coyle (2024)
- ^
Of course, UBI may not be a certainty, or may be inferior to working on purely financial terms.
- ^
Perhaps universal unionization or other movements to equalize political power among different groups would make this scenario less likely, but it seems difficult to ensure that differences in political power are small enough that either automation is constrained enough to be trivial or that no major bottlenecks exist from unautomated sectors.
- ^
Narrow automation may be bad enough if opportunities for displaced workers are not available or do not fit their skill sets.