Some thoughts on David Roodman’s GWP model and its relation to AI timelines
post by Tom Davidson (tom-davidson-1) · 2021-07-19T22:59:06.861Z · LW · GW
[Cross posted from the EA forum [EA · GW].]
I’ve been working on a report (see blog) assessing possible trajectories for GWP out to 2100. A lot of my early work focussed on analysing a paper by my colleague David Roodman. Roodman fits a growth model to long-run GWP data; the model predicts a 50% probability that annual GWP growth reaches 30% or more by 2043.
I was thinking about whether to trust this model’s GWP forecasts, compared with the standard extrapolations that predict GWP growth of ~3% per year or less.[1] I was also thinking about how the model might relate to AI timelines.
This post briefly describes some of my key takeaways, as they don’t figure prominently in the report. I explain them briefly and directly, rather than focussing on nuance or caveats.[2] I expect it to be useful mostly for people who already have a rough sense for how Roodman’s model works. Many points here have already been made elsewhere.
Although for brevity I sometimes refer to “Roodman’s extrapolations”, what I really mean is the extrapolations of his univariate model once it’s been fitted to long-run GWP data. Of course, David does not literally believe these extrapolations. More generally, this post is not about David’s beliefs at all but rather about possible uses and interpretations of his model.
[Views are my own, not my employer's.]
Economic theory doesn’t straightforwardly support Roodman’s extrapolation over standard extrapolations
Early on in the project, I had the following rough picture in my mind (oversimplifying for readability):
Standard extrapolations use what are called ‘exogenous growth models’. These fit the post-1900 data well. However, the exponential growth is put in by hand and isn’t justified by economic theory. (Exogenous growth models assume technology grows exponentially but don’t attempt to justify this assumption; the exponential growth of technology then drives exponential growth of GDP/capita.)
On the other hand, endogenous growth models can explain growth without putting in the answer by hand. They explain technological progress as resulting from economic activity (e.g. targeted R&D), and they find that exponential growth is a knife-edge case and so implausible. Setting that case aside, growth is either sub- or super-exponential. Roodman fits an endogenous growth model to the data and finds super-exponential growth (because growth has increased over the long run on average).
So Roodman uses a better growth model (endogenous rather than exogenous). His model also has the advantage of taking more data into account (standard extrapolations typically don't use pre-1900 data).
Overall, we should put more weight on Roodman's model than on standard extrapolations, at least over the long run.
I no longer see things this way. My attitude is more like (again oversimplifying for readability):
Although exogenous growth models don’t justify the assumption of exponential growth of technology, semi-endogenous growth models justify this claim pretty nicely.[3] These semi-endogenous models can explain the post-1900 exponential growth and the pre-1900 super-exponential growth in a pretty neat way - for example see Jones (2001).
Roodman’s model departs from these semi-endogenous models primarily in that it assumes population is ‘output-bottlenecked’.[4] This assumption means that if we produced more output (e.g. food, homes), population would become larger as a result: more output → more people. This assumption hasn’t been true over the last 140 years, and doesn’t seem to be true currently: since the demographic transition began around 1880, fertility has decreased even as output per person has increased. (That said, significant behaviour change or technological advance could make the assumption reasonable again, e.g. a return to Malthusian conditions, human cloning, or AGI.)
So semi-endogenous growth models are more suitable than Roodman’s model for extrapolating GWP into the future: the main difference between them is that the latter assumes population is output-bottlenecked. Both theories can explain the pre-1900 data,[5] and semi-endogenous models provide a better explanation of the post-1900 data.
Overall, by default I’ll trust the projections of the semi-endogenous models.[6] There’s one important caveat. If significant behaviour change or tech advance happens, then population may become output-bottlenecked again. In this case, I’ll trust the predictions of Roodman’s model.
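To make the difference concrete, here is a toy simulation with made-up parameters of my own (not Roodman's or Jones's fitted values). In both runs, ideas come from people; only the first run lets output feed back into population. The bottlenecked run accelerates toward ever-higher growth, while the fixed-population run sees growth slow as diminishing returns bite:

```python
def years_to_10pct_growth(output_bottlenecked, dt=0.1, horizon=2000):
    # Toy model: output Y = A * P; ideas from people with diminishing returns.
    # Illustrative parameters only -- not a fitted growth model.
    A, P, t = 1.0, 1.0, 0.0          # technology, population, time (years)
    while t < horizon:
        Y = A * P
        dA = 0.02 * P * A ** 0.5     # more people -> more ideas
        dP = 0.02 * Y if output_bottlenecked else 0.0  # more output -> more people
        growth = (dA / A) + (dP / P) # growth rate of Y = A * P
        if growth >= 0.10:
            return t
        A += dA * dt
        P += dP * dt
        t += dt
    return None                      # 10% growth never reached within horizon

print(years_to_10pct_growth(True) is not None)  # True: growth accelerates
print(years_to_10pct_growth(False))             # None: growth slows instead
```

With the output-population feedback on, growth is super-exponential and eventually passes any threshold; with it off, growth declines over time, as in a semi-endogenous model with a fixed population.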
Roodman’s GWP extrapolation is aggressive from an outside-view perspective
The above section gives an inside-viewy reason to think Roodman’s GWP projections are aggressive. (The model assumes population is output-bottlenecked; when you remove this assumption you predict slower growth.)
I think they’re also aggressive from an outside-view perspective, based purely on recent GWP growth data.
First, the model over-predicts GWP growth over the last 60 years and over-predicts frontier GDP/capita growth over the last 120 years. (This is widely recognised, and is documented in Roodman’s original post.)
Second, its median prediction for growth in 2020 is 7%. This is after updating on the GWP data until 2019. Why is this? Roodman’s model bases its prediction on the absolute level of GWP, and doesn’t explicitly take into account the recent growth rate. Roughly speaking, it believes that "higher GWP means higher growth" based on the pre-1900 data and it observes ~3% growth in the 1900s. GWP in 2020 is way higher than the average GWP in the 1900s, so the model predicts a higher value for 2020 growth than it observed throughout the 1900s.[7]
Why does it matter that the model predicts 7% growth in 2020? Well, GDP growth in frontier economies has recently been more like 2% (source). That’s a difference of 1.8 doublings.[8] Another 1.8 doublings gets us to 24% growth.[9] In log-space, Roodman’s model thinks that we’ve already covered half the distance between 2% and 24%.[10]
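The doubling arithmetic in the footnotes can be checked directly, using the growth rates quoted in the text:

```python
from math import log2

current, predicted_2020, tai = 0.02, 0.07, 0.24   # growth rates from the text
print(round(log2(predicted_2020 / current), 2))   # doublings from 2% to 7%: ~1.81
print(round(log2(tai / predicted_2020), 2))       # doublings from 7% to 24%: ~1.78
```

Both gaps come out at roughly 1.8 doublings, which is why 7% sits about halfway between 2% and 24% in log-space.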
To put it another way, Roodman’s model falsely thinks we’ve already covered ~half the distance to TAI-growth in log-space.
If in fact we have to travel twice as far through log-space, it will take more than twice as long according to hyperbolic-y models like Roodman’s. That’s because each doubling of growth is predicted to take less time than the last. Roodman’s model thinks we’ve already covered the slowest doublings (from 2% to 4%, and from 4% to 7%). In its mind, all we have left are the much quicker doublings from 7% to 14% and from 14% to 24%.
How would adjusting for this change the GWP projections? Roughly speaking, it should much more than double the time until 24% growth. Double it because growth has to double twice as many times. Much more than double it because the doublings Roodman’s model omitted will take much longer than the ones it included.
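To see why the omitted doublings dominate: in a stylized deterministic hyperbolic model where growth doubles each time GWP rises 3.5x (footnote [10]; this is not Roodman's fitted stochastic model), each successive doubling of the growth rate takes about half as long as the one before:

```python
from math import log

beta = log(2) / log(3.5)      # growth doubles per 3.5x increase in GWP
Y, t, dt = 1.0, 0.0, 0.01     # normalize GWP so growth starts at 2%
times = []
for target in (0.04, 0.08, 0.16, 0.32):
    while 0.02 * Y ** beta < target:      # growth rate g(Y) = 0.02 * Y^beta
        Y += 0.02 * Y ** beta * Y * dt    # dY = g(Y) * Y * dt
        t += dt
    times.append(t)                       # year growth first hits the target
gaps = [times[0]] + [b - a for a, b in zip(times, times[1:])]
print([round(g, 1) for g in gaps])  # each doubling takes ~half as long as the last
```

So a model that wrongly thinks the two slowest doublings are already behind us will understate the remaining time by much more than a factor of two.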
I modelled this a bit more carefully, focussing on the time until we have 30% growth. Roodman’s model’s median prediction for the first year of 30% growth is 2043 - ~20 years away. I tried to adjust the model in two hacky ways, each time forcing it to predict that the growth in 2020 was 2%.[11] I found the median prediction for 30% growth shifts back to ~2110 or later - ~90 years away.[12] The time to 30% much more than doubles.
(The average GWP growth over the last 20 years is ~3.5%. If I set the 2020 growth to 3.5% rather than 2%, the predicted date of explosive growth is delayed to ~2075, ~55 years away.)
In other words, if we adjust Roodman’s model based on what we know to be the current growth rate, its predictions become much more conservative.
To be clear, I’m not saying that these adjustments make the model ‘better’. For example, they may overadjust if the recent period is surprisingly slow.[13] But I do think my adjustments are informative when considering what to actually believe from an outside-view perspective about future growth, especially in the next few decades. From an outside view perspective, I’d personally put more weight on the adjusted models than on Roodman’s original model.
(Note: there may be inside view-y reasons to think an AI-driven growth acceleration will be sooner and more sudden than Roodman’s model suggests; I’m putting these aside here.)
Roodman’s GWP extrapolation shouldn’t be given much weight in our AI timelines
Roodman’s model can predict how long it will take to get to 30% annual GWP growth. Some people have thought about using this to inform timelines for transformative artificial intelligence (TAI). The rough idea is: “we have pretty good outside-view reasons to think 30% growth is coming soon; TAI is the only plausible mechanism; so we should expect TAI soon”.
I don’t think this reasoning is reliable, for a few reasons (some discussed above):
- The same reasoning would have led you astray over the last few decades, as the model’s predicted date of 30% growth has been increasingly delayed.
- The model thinks we’re already halfway in log-space to TAI-growth; this makes its TAI timelines aggressive.
- We shouldn’t trust the predictions of Roodman’s model until we have advanced AI (or population is output-bottlenecked for another reason). So it can’t predict when advanced AI will happen.
  - In my mind, population being output-bottlenecked is (part of) the mechanism for super-exponential growth. Roodman’s model describes how powerful this mechanism has been over human history: how quickly it has led growth to increase.[14] The mechanism no longer applies, due to the demographic transition. However, advanced AI could reinstate the mechanism in a new form. So forecasting advanced AI is like forecasting when this mechanism will be in place again. But Roodman’s model forecasts growth on the assumption that the mechanism already applies; it can’t (reliably!) forecast when the mechanism will start to apply again.
  - Here I’m drawing on my belief that population being output-bottlenecked was an important mechanism driving historical super-exponential growth, that this mechanism no longer applies, and that AI could reinstate this mechanism in a new form.
- The dynamics of historical growth and a potential future AI-driven growth explosion will be different in many ways.
  - Roodman’s model is fit to long-run GWP data. The dynamics increasing growth in this period are more people -> more ideas -> more people, plus probably a bunch of significant one-off changes in institutions around the industrial revolution.[15]
  - With AI, the dynamics of increasing growth are more AIs -> more ideas -> more hardware/software/wealth -> more and cleverer AIs -> ...
  - There’s a suggestive surface similarity there, suggesting that if the former leads to super-exponential growth the latter might as well.
  - But the actual processes will look pretty different, which could introduce huge differences in growth, e.g. ‘How easy is it to make AI cleverer compared with humans?’, ‘How many resources does it take to sustain an AI mind compared with a human mind?’, ‘AIs can be copied’, ‘People may not want to hand over tasks to AIs’, ‘Will diminishing marginal returns to tech R&D be different for AI minds than for human minds?’.
Overall I think Roodman’s model is useful for indicating that something big could happen, that growth could dramatically accelerate, but otherwise not very informative. To the extent Roodman’s model is informative about AI timelines, I view it as aggressive for the reasons given in the bullets.
Read the report for nuance and caveats! ↩︎
In semi-endogenous growth models technology improves as the result of R&D effort but there are diminishing returns -- each 1% increase in technology (measured as TFP) requires more researcher-years than the last -- so you need exponential growth in researchers to sustain exponential technology growth. The justification of exponential growth is then roughly: the number of researchers has grown roughly exponentially, so we’d expect technology to have grown roughly exponentially as well. ↩︎
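This convergence can be illustrated numerically with a Jones-style law of motion, dA/dt = R^lam * A^phi with phi < 1 (all parameter values here are made up for illustration): when researchers R grow exponentially at rate n, technology growth settles at the constant rate lam * n / (1 - phi).

```python
# Made-up parameters: idea elasticity, diminishing returns, researcher growth.
lam, phi, n = 1.0, 0.5, 0.02
A, R, dt = 1.0, 1.0, 0.01
for _ in range(300000):        # 3000 model-years, forward-Euler steps
    dA = R ** lam * A ** phi   # dA/dt = R^lam * A^phi
    g = dA / A                 # instantaneous technology growth rate
    A += dA * dt
    R *= 1 + n * dt            # researchers grow exponentially at rate n
print(round(g, 3))             # converges to lam * n / (1 - phi) = 0.04
```

The initial growth rate is far above 4%, but diminishing returns pull it down until it balances the exponential inflow of researchers.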
This assumption is made explicit in Roodman’s multivariate model. The univariate model doesn’t feature population, so is naturally understood as a purely statistical model without interpretation. However, in Roodman’s paper the univariate model is motivated theoretically as the univariate analogue of the multivariate model (in which population is output-bottlenecked). This is why I say that the univariate model “assumes population is ‘output-bottlenecked’”. Technically, you can get hyperbolic growth from the multivariate model even if population is held constant and so it is not literally true that the univariate model assumes population is output-bottlenecked. However, more extreme parameter values are needed for this to happen, and such values are in tension with the non-hyperbolic growth of the last 100 years. So in practice, if not technically, I do think of Roodman’s univariate model as assuming that population is output-bottlenecked. ↩︎
Here I’m putting aside reasonable doubts over whether their explanation of pre-1900 growth is correct. ↩︎
When combined with the standard assumption that global population will stabilize, semi-endogenous models imply economic growth will gradually slow down over time. They don’t imply constant exponential growth long into the future. ↩︎
I’ve oversimplified my description of the data to simplify this paragraph. In reality GWP growth increased until ~1960 and got as high as 5%, even though frontier growth stopped increasing from 1900. ↩︎
2 * 2^(1.8) ≈ 7 ↩︎
7 * 2^(1.8) ≈ 24 ↩︎
Why think about it in terms of log-space? Roodman’s model (ignoring the randomness) believes that “each time GWP increases by some factor f, GWP growth doubles”. f depends on the data, and comes out at about 3.5 for Roodman’s data set. So in Roodman’s model, considering doublings of growth, i.e. log-space, is natural: growth doubles each time GWP increases by 3.5X. This is true of other hyperbolic models as well, e.g. Kremer (1993). ↩︎
Perhaps I should have set it to 2%, but I was using the recent GWP growth rate of 3.5%. ↩︎
The first method is my ‘growth multiplier’ explained here. Its median predicted date of explosive growth ranged from 2120-2140 depending on an arbitrary choice of timescale (r in the model). See code here. The second method just reduces the instantaneous growth rate of Roodman’s model at every time-step by a constant factor of 2/7 (because it currently predicts 7% rather than 2%). This led to a median prediction of 2110. See code here. ↩︎
I also think Roodman’s unadjusted model is more informative about how fast we could grow if the population were as large as our economy could support (Malthusian conditions). ↩︎
Of course, Roodman’s parameters will also implicitly include other mechanisms influencing growth like the massive increase in the share of labour focussed on innovation, improvements in education, and other things. ↩︎
E.g. expansion of R&D as a share of the economy and better institutions for investing in new businesses. ↩︎
comment by Daniel Kokotajlo (daniel-kokotajlo) · 2021-07-20T08:30:18.506Z · LW(p) · GW(p)
Nice post! I basically agree with you here. Trying to forecast AI using GDP data is like trying to forecast fossil fuel production by looking at global mean temperature data. [LW · GW] But it's useful for rebutting people who think that transformative AI, AGI, etc. is crazy/unprecedented/low-prior.
Nitpick: I don't think it's helpful to describe some of the arguments as "inside view" and others as "outside view" here. This can mislead people into thinking that e.g. the arguments you label "outside view" should be treated differently from those you label "inside view," that e.g. people who aren't experts should put more weight on the "outside view" arguments, etc., and that this is justified by Tetlock's experiments and the surrounding literature.
Whereas in fact Tetlock's experiments etc. were about a different sort of thing than the kinds of arguments you are considering here. Besides, the terms "inside view" and "outside view" mean so many different things today that they basically just tell the audience how you feel about an argument and how you want the audience to feel about it. Taboo outside view! [LW · GW] I would suggest you replace instances of "inside-viewy" with "model-based" or "technical," or "implausible assumptions" and replace instances of "outside-viewy" with "Sanity check" or "implausible predictions."