At what point does disease spread stop being well-modeled by an exponential function?

post by Eli Tyre (elityre) · 2020-03-08T23:53:48.342Z · LW · GW · 4 comments

This is a question post.

Early in the spread of a new disease, its growth is exponential. But obviously no disease can maintain that rate forever: it would soon run out of new people to infect, so the curve has to level off at some point.

At what point (at what percent of the population infected?) does exponential spread become unrealistic?

Secondarily, how should you model the spread after that point?

(My very naive idea is to model the first 50% as exponential, and then model the second 50% as the mirror image of the first half. (i.e. if weeks 1, 2, 3, and 4 have 6.25%, 12.5%, 25%, and 50% infection rates, project that weeks 5, 6, and 7 will have 75%, 87.5%, and 93.75% infection rates.) How good an approximation would that be?)
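The mirroring idea above is simple enough to sketch directly. This is a toy illustration of the question's naive model (the function name and the weekly-doubling inputs are just for this example, not anything established):

```python
def mirror_projection(first_half):
    """Given infection fractions for the exponential phase (ending at 50%),
    project the rest as the mirror image: each later week's *uninfected*
    fraction shrinks by the same factors the infected fraction grew by."""
    second_half = [1 - x for x in reversed(first_half[:-1])]
    return first_half + second_half

# Weeks 1-4 from the example: 6.25%, 12.5%, 25%, 50%
weeks = [0.0625, 0.125, 0.25, 0.5]
print(mirror_projection(weeks))
# [0.0625, 0.125, 0.25, 0.5, 0.75, 0.875, 0.9375]
```

This reproduces the projected 75%, 87.5%, and 93.75% for weeks 5-7. As the answers below note, this symmetric S-shape is essentially a logistic curve.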

tags: coronavirus, COVID-19

answer by clone of saturn · 2020-03-09T00:29:28.895Z · LW(p) · GW(p)

The basic idea is that the ratio of infected to susceptible people grows exponentially (without bound), which means the absolute number of infected people follows a logistic function.
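One way to see this relationship numerically (a minimal sketch with made-up parameters: growth rate k = ln 2 so the infected:susceptible ratio doubles each week, midpoint at week 4):

```python
import math

def infected_fraction(t, k=math.log(2), t0=4):
    # Logistic curve: i(t) = 1 / (1 + e^{-k(t - t0)}).
    # k and t0 are illustrative, not fitted to any real outbreak.
    return 1 / (1 + math.exp(-k * (t - t0)))

for week in range(1, 8):
    i = infected_fraction(week)
    ratio = i / (1 - i)  # infected : susceptible
    print(f"week {week}: infected {i:.4f}, ratio {ratio:.3f}")
```

The printout shows the ratio doubling every week (0.125, 0.25, 0.5, 1, 2, 4, 8) while the infected fraction itself traces an S-curve: nearly exponential early on, then leveling off toward 100%.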

comment by Bucky · 2020-03-09T23:18:05.889Z · LW(p) · GW(p)

I tried this [LW · GW] with the China data and it seems to fit well, thanks.

answer by kithpendragon · 2020-03-09T00:27:48.989Z · LW(p) · GW(p)

3Blue1Brown just did a video about this subject that I found very informative. The chart they use to explain the "inflection point" looks much like the model you described: roughly exponential growth up to about 50% of the total infections over the course of an outbreak, leveling off after that.
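That inflection point can be checked numerically. A sketch (with assumed parameters, not taken from the video): the growth rate of a logistic curve is k·i·(1−i), which peaks exactly where the curve crosses 50%.

```python
import math

def logistic(t, k=0.5, t0=10):
    # k and t0 are assumed example parameters
    return 1 / (1 + math.exp(-k * (t - t0)))

def growth_rate(t, k=0.5, t0=10):
    # Derivative of the logistic curve: k * i * (1 - i),
    # which is maximal when i = 0.5 (the inflection point).
    i = logistic(t, k, t0)
    return k * i * (1 - i)

peak = max(range(21), key=growth_rate)
print(peak, logistic(peak))  # → 10 0.5
```

So under this model, new cases per unit time are highest at the moment half of the eventual total has been infected, matching the symmetric shape the question proposed.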