An appeal for vitamin D supplementation as a prophylactic for coronaviruses and influenza and a simple evolutionary theory for why this is plausible.

post by Michael A (michael-a) · 2020-12-22T19:40:08.731Z · LW · GW · 1 comment


I'd like to bring home a case for vitamin D as a probable prophylactic for seasonal respiratory viruses like influenza or coronaviruses. Medical authorities already advise supplementing vitamin D in the winter, but they haven't emphasized why it especially matters now. This is a no-risk, low-cost, low-effort intervention, so our standard of evidence doesn't need to be all that high here. We only need good cause to suspect that it helps, even if only a little.


There are quite a few studies showing that vitamin D deficiency is strongly associated with severity of infection. Here is a nice one: in its decently sized cohort of 185 patients, it found that "VitD deficiency was associated with a 6-fold higher hazard of severe course of disease and a ~15-fold higher risk of death."

These are very large numbers. They really are not the sort you expect to see from a parameter that is merely correlated with the true causes and not itself causative. So perhaps there is something to this? A meta-analysis of studies relevant to severity paints a similar, but less stark, picture.


Influenza and coronaviruses are both enveloped respiratory viruses, and both are prevalent in fall and winter, precisely when vitamin D levels are low and falling. In tropical countries, the rainy season brings clouds and drives people indoors. It also brings their flu season, though the difference is not as stark as it is in temperate climes. As a spot check, let's look at the most populous nation with a rainy season. India's rainy season generally ends in September, which is when its cases peaked at ~100k per day. This is the month when vitamin D levels would be expected to start increasing. It is down to ~25k now.

India isn't alone. Broadly speaking, COVID-19 appears so far to have seasonality that syncs up closely with that of influenza. We have been seeing a large increase in cases across Europe and the US as we enter the dead of winter. But perhaps this is due to the cold outside and dry conditions inside. Our protective mucus dries up when humidity is below ~40%, making us more vulnerable to infection. As temperatures drop, the virus's outer envelope solidifies like butter. This obstructs ozone and other reactive chemicals from penetrating inside and damaging its vulnerable interior. However, these two commonly cited factors obviously can't explain the long-standing observation that influenza prefers rainy and humid conditions in India and other tropical/subtropical climates. In fact, they seem to undermine what we observe there!


On the other hand, the seasonal variation of vitamin D serum levels efficiently explains these broad seasonal observations. Seasonality seems to be stronger in temperate regions in part because vitamin D status there drops much lower in winter. The cold and indoor dryness that also come with winter magnify the effect even more.


This should already be enough to convince you that it is especially important right now to supplement vitamin D. I mean, why would we need much else simply to justify picking up a cheap bottle of something we need anyway?


I don't want to dwell on geographical observations, but Africa should not be forgotten about yet again. Throughout the world, black people have been hit much harder. In the US, they are ~4x more likely to be hospitalized and nearly 3x more likely to die. Yet somehow Sub-Saharan Africa has fared incredibly well. Only ~40k deaths in a population of 1.1B comes out to 1 death per ~27k people, radically lower than the world average of 1 per ~4.5k. Perhaps it's because it's the sunniest continent?

A lot of people point to the young median age (20) instead, and they certainly have a point. That no doubt helps. But the number is so good that even excluding the half of the population under 20 still leaves us with only 1 death per ~13.5k. So this raises the question: why is there such a racial disparity in severity throughout the world, yet Sub-Saharan Africa has some of the lowest severity outside of China and Southeast Asia (which were probably already substantially immune)?
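For the skeptical, the arithmetic is easy to check. Here is a minimal sketch in Python; the Sub-Saharan figures are the rough ones quoted above, and the world totals are my own back-of-the-envelope assumptions chosen to be consistent with the "1 per ~4.5k" figure:

```python
# Back-of-the-envelope per-capita death rates (rough, late-2020 figures).
ssa_deaths = 40_000          # ~40k reported COVID-19 deaths in Sub-Saharan Africa
ssa_pop = 1_100_000_000      # ~1.1B population

world_deaths = 1_700_000     # assumed world total, consistent with "1 per ~4.5k"
world_pop = 7_800_000_000    # assumed world population

print(ssa_pop / ssa_deaths)          # ~27,500 people per death
print(world_pop / world_deaths)      # ~4,600 people per death
print((ssa_pop / 2) / ssa_deaths)    # ~13,750 per death, excluding the under-20 half
```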


Perhaps you are still not convinced that sun exposure matters. Perhaps some other unmentioned environmental features are the causal ones here and vitamin D isn't causal at all. Well, to address that, we can also try to separate external contagiousness factors from internal biological ones.

This article points to an old controlled study in which volunteers were inoculated with live attenuated influenza: winter volunteers developed 8x more fevers and elicited many times more antibodies than summer volunteers. This sort of observation efficiently excludes contagiousness factors like crowding or who knows what. It suggests that influenza causes much less severe disease in whoever it manages to infect in the summer. Can we corroborate this?

Flu seems to disappear each spring, but we know that these strains don't all just immediately die out locally. In particular, influenza B strains don't cause pandemics, so they have to stick around. So flu must still be infecting someone. Yet when the US CDC tests influenza-like illnesses in the summer, it virtually never detects flu. We are talking about only ~100 cases each month for the whole country. Those could easily just be from the occasional person who is vitamin D deficient even in the summer. The people who get tested are people who had something that felt worse than a cold. So the fact that almost none of them have the flu is actually pretty reasonable observational evidence that influenza causes less severe disease in the summer.

Given the similarity of influenza and coronaviruses, this is again ample reason for our modest purpose. To require certainty beyond a reasonable doubt here would be beyond unreasonable, no doubt.


I can even offer a straightforward evolutionary explanation. A virus that would gain some advantage by preferentially spreading during a particular season would need some way of 'knowing', from within the body, what season it is. It just so happens that vitamin D is exactly the sort of biochemical calendar needed. The virus will surely take advantage of this information if it can, and it easily can. Vitamin D levels control whether the host immune system is being frugal and relying on adaptive tactics, or being liberal and relying on innate tactics because it's summer and there is plenty to spare. In the summer, it is worthwhile to spend the extra resources to place innate guards at every door and in every corridor. These innate defenses are static and do not get better at killing the enemy. So a virus strain can precisely tune itself to be kept in check, but not eliminated, by the innate system, and thus stick around longer per host. This results in fewer symptoms and less built-up immunity. When the host species switches to winter defenses, the virus will cause a very different and more aggressive course of disease. Simply put, the strategy here is to spread when the spreading is good and to hide and bide time when it isn't.

It is hard for me to see how this couldn't be at least part of the reason. It's almost inevitable whenever a virus gains some advantage by preferentially spreading in winter. So again, why aren't you supplementing vitamin D right now?


Are you waiting for some government-endorsed health organizations to recommend it? That might take a while. For the cynical, the institutional inertia of bureaucracies so often makes for such sweet schadenfreude. Don't expect any recommendations short of a favorable, large, well-executed, double-blind RCT. As they are, these institutional structures can hardly help but use the same evidentiary standard they use for everything. It's one that has been found through experience to work decently enough for evaluating potentially dangerous drugs developed by not entirely trustworthy companies with a strong incentive toward profit, in situations where time is not critical and treatment capacity is not even an issue. That's a classic type I decision context. Whether to recommend vitamin D here is not a type I decision. Unfortunately, for all practical purposes such committees act as though they reject the very concept of a statistical type II error, and they will continue to do so until the day they finally make a bad enough call that everyone notices. We often have to fail in order to learn.


Any potential vitamin D might have as a treatment would of course also work as good evidence for its use in prevention. So far only one RCT has come out examining it as a plausible treatment for hospitalized COVID-19 patients. They administered calcifediol, aka 25(OH)D, the liver-activated form of vitamin D. It's the metabolite usually measured in a blood test. Other studies have used the usual dietary D3 form instead. D3 just won't work well for acute conditions: it takes valuable days to increase calcifediol levels, and even with a massive dose the rate is still limited by the liver's maximum capacity to activate it. Giving calcifediol instead directly raises plasma levels within hours to whatever level you would like. Let's look at the results.

Upon admission, 76 non-ICU patients in Cordoba, Spain were randomly assigned either to calcifediol or to control. 13 out of 26 controls progressed to the ICU, and 2 died. 1 out of 50 treated patients progressed, and 0 died. There was no placebo, but everyone was otherwise blinded.

This is direct evidence in favor of a causal role.

The probability that the randomization would yield a result this lopsided purely by chance is less than one in a million.
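That figure is easy to sanity-check. Here is a minimal sketch in Python (my own reconstruction, not necessarily the method of the analysis linked below): treat the 14 ICU admissions as fixed and ask how likely it is, under random assignment alone, that at most 1 of them lands in the 50-patient treatment arm. This is a standard hypergeometric (Fisher-style) tail calculation:

```python
from scipy.stats import hypergeom

# 76 patients total; 50 randomized to calcifediol; 14 ICU admissions overall
# (13 in the 26-patient control arm, 1 in the treatment arm).
M, n, N = 76, 14, 50  # scipy's convention: population size, tagged items, draws

# P(at most 1 of the 14 ICU cases falls among the 50 treated) under pure chance:
p = hypergeom.cdf(1, M, n, N)
print(f"{p:.1e}")  # ~7.7e-07, i.e. less than one in a million
```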

The blinding was imperfect but it strains belief to suppose that the doctors would give care unevenly enough to produce this result.

Most likely the treatment was not as effective as it appeared to be here, but it is almost certainly effective. In a physician's hands, calcifediol is not all that much more dangerous than a magic wand that could accelerate the rate at which the liver processes D3.


The Cordoba study has been dismissed by many out of hand for having a small sample size and imperfect blinding. However, sample size is not appropriate as a fundamental deciding factor! Attending to it is supposed to be just a quick rule of thumb that works well in a domain where observed effect sizes are typical for that domain. If your justification bottoms out with "the sample size is too small", then you have badly mistaken a heuristic convenience for a proper epistemic foundation and turned a rule of thumb into a thought-terminating cliché.

Here is an actually good analysis.

In any case, I am only appealing for vitamin D supplementation here. This easily suffices for that.


Why is vitamin D insufficiency so prevalent?

The word 'nutrient' is ill-defined. It actually started out as an adjective in the phrase 'nutrient principle' back when they thought there was only one fundamental nourishing vital substance found in food. There are several incompatible 'definitions' in use out there, and they are all seriously flawed. Hopefully mine is less terrible:

A dietary nutrient is a fundamental substance that an organism's ancestors were evolutionarily selected to utilize through dietary ingestion.

This definition grounds nutrition in biology instead of medicine and pays special attention to the relation of evolution to the route of absorption. The route is of significant pragmatic importance. People still stubbornly believe that they get enough vitamin D from their diet alone, despite it simply not occurring in food in sufficient quantities. This misconception is probably the most fundamental reason why vitamin D insufficiency is so widespread. The root cause here is just that our categories don't make it clear that vitamin D is not a normal dietary nutrient.


Vitamin D is used throughout the animal kingdom as a biochemical calendar for cells. However, different animals get their vitamin D by at least three different routes. Carnivores largely favor getting their vitamin D from their prey, so it's primarily a dietary nutrient for them. Furry herbivores favor getting much of their vitamin D from grooming their fur, thus consuming the now ultraviolet-activated natural oils. Reptilian herbivores get their vitamin D from basking in sunlight. Omnivores might use any mixture of the three. So, to be more precise,

A grooming nutrient is utilized through ingestion of sebum during grooming.

A basking nutrient is utilized through basking in sunlight.

Vitamin D is primarily a basking nutrient for humans. We are furless omnivores and evolved to thrive on a variety of diets with varying vitamin D content. So naturally it made the most sense to track sun exposure directly.


I've focused here on trying to articulate a plausible theoretical picture. No one is surprised that vitamin D treats rickets, because we understand that our physiology assumes there will always be at least a minimal amount of vitamin D present. I believe a similar story is likely true for vitamin D with respect to susceptibility to certain co-evolved viruses (and bacteria). Here, though, it is not just our genetics that have built-in evolutionary expectations about our vitamin D status. Species-hopping seasonal viruses surely have such expectations as well. Every time such a virus jumps to a new species, it predominantly kills individuals who respond with too much adaptive immunity and too little innate immunity. They are replaced by the progeny of the other individuals, who then serve as better long-term hosts. This happens throughout the animal kingdom and across ecosystems, and it ultimately results in a rough balance where countless species all have somewhat coordinated seasonal immune responses. Species with more frequent viral transfers between each other will be more coordinated.

Consider what happens if a particularly rare path is blazed by a virus: one starting from bats with their strange immune systems, leaping to subtropical pangolins, which lack exposed skin to bask or fur to groom, and ending with sun-deficient humans in temperate climes. We shouldn't be surprised if vitamin D insufficiency ended up mattering quite a lot in such a situation.


According to the NIH, 4000 IU is the safe upper limit for a daily dose. The body buffers D3 by building up stores in the spring and summer. So in order to see benefit any time soon at all, an initial one-time large loading dose (or doses) should be taken, ideally with a fatty meal. I'm not sure if providing a conservative number here would be against the forum rules. So consider asking a doctor how much a person like yourself, in the middle of winter, should take in order to rapidly get to vitamin D serum levels corresponding to late springtime or so.


Please share this. I am not sure how many people are even aware of this controversy, or that the evidence is actually rather great. I only presented a tiny fraction of what's out there. Plus, if some real awareness finally gets out there, then I can take the time to post some babble about other purely semiotic nutrients besides vitamin D, like niacin, or about how calcifediol is actually also an active form of vitamin D and how the two active forms enable the biochemistry to "mathematichemically" differentiate the calcitriol signal. Or maybe I can just finally go back to math. =)


In any case, I apologize for how long this post is. Thank you for taking the time to read it!


Resources:

An in depth analysis of the Cordoba RCT

A collection of medical evidence both pro and con

An excellent accessible philosophical treatment of the decision theory

A very articulate doctor going over medical evidence for literally an hour

A different doctor's blog suggesting that the follow-up trial to the Cordoba RCT is continuing to show results

1 comment


comment by AnthonyC · 2020-12-24T12:39:58.012Z · LW(p) · GW(p)

According to the NIH, 4000 IU is the safe upper limit for a daily dose. The body buffers D3 by building up stores in the spring and summer. So in order to see benefit any time soon at all, an initial one-time large loading dose (or doses) should be taken, ideally with a fatty meal. I'm not sure if providing a conservative number here would be against the forum rules.

I can't comment on what anyone should take, but I'm a ~215 lb male, and a few years ago (age 30) I was diagnosed with a mild Vitamin D deficiency and prescribed 50,000 IU once per week for either one or two months.