Jim Manzi suggests we view the results of the Massachusetts mortality study by Sommers et al. with some caution. (The study and related issues have been discussed by me here and here; Adrianna here; and Bill here, here, and here.) Manzi’s reasoning is broadly sound, but it may not quite work in detail. This is subjective, so you get to be your own judge.
Manzi’s overarching point is that an observational study—like the Massachusetts mortality one—can’t demonstrate causality, only a randomized controlled trial (RCT) can. This is basically right, though I’d say it differently. One can make causal inferences with observational study results, but only with some additional assumptions. Those assumptions should be explicit and ought to be backed up with plausible arguments and analysis that tests their validity. Notice, I did not say “proves their validity,” as that is impossible. But the more and varied the probes of the assumptions, the more confidence we should have that they hold.
From what I’ve read, it is widely held that the Massachusetts mortality study is a “strong” one, with “solid” methods. What does this mean? It basically means that the investigators made a number of plausible arguments and did enough validity testing to meet a “community standard.” That sounds lame, but there’s really no established benchmark. One can always do one more robustness check. One can always ponder another way the results might not be causal. In practice, you stop covering bases when your colleagues (or peer reviewers) tell you you can stop. In principle, they can torture you for as long as they like.
If you go to a typical economics seminar (particularly a job candidate’s talk), you’ll see this kind of torture in action. Sometimes, the sage scholars in the audience will toy with their victim until they’ve reduced the study to smoldering dust. Other times the presenter will have just enough answers to satisfy the establishment that s/he belongs, and the torture ends before tears are shed. (Very often it’s not so dramatic, but it can be.)
Manzi’s hypotheses for how the causality assumptions in the Massachusetts mortality study could fail are not unlike those one might encounter at such a seminar. They’re clever, worth reading in full, and deserve to be answered. You may find them convincing. But before you make up your mind there are a few things you should consider.
The main thrust of Manzi’s critique is that the results could have been driven by compositional changes. Maybe the population of Massachusetts became healthier (less likely to die) than that in control counties in other states due to migration. This is not implausible, and Manzi offers several possible reasons why healthier people might have moved to the state precisely when the coverage expansion law was implemented, which you can read for yourself.
Here are some possible responses to Manzi’s probes:
First, the study authors found that mortality amenable to health care declined even more than mortality in general. Would we expect migration patterns to produce that precise, disproportionate effect? Maybe, if the population became more efficient users of health care by virtue of migration. One way that might happen is if parents were drawn into the state for educational opportunities for their children, one of Manzi’s hypotheses. It’s plausible that such concerned, prudent, forward-looking parents might also be better able to avail their families of mortality-protecting health care, though why wouldn’t they also be better able to avoid car accidents, gun-related deaths, and other ways of dying not amenable to health care? But hold that thought …
Second, the study authors found that counties with higher uninsurance rates at baseline experienced greater mortality reductions. Could this be due to migration, even by wise parents? Well, of course, but it requires an argument that the healthier migrants ended up disproportionately in such counties. I didn’t see such an argument from Manzi explicitly. (It could happen by random chance, which is a general concern Manzi implies about a study that is driven by the results in one state.)
Finally, let’s consider migration head on. How large was it for Massachusetts, relative to control counties? We don’t know exactly, but in a paper published in Health Affairs earlier this year, Aaron Schwartz and Ben Sommers (the lead author of the Massachusetts mortality study)
used data from the Current Population Survey to examine the migration patterns of low income people before and after recent expansions of public insurance in Arizona, Maine, Massachusetts, and New York. Using difference-in-differences analysis of migration in expansion and control states, we found no evidence of significant migration effects. Our preferred estimate was precise enough to rule out net migration effects of larger than 1,600 people per year in an expansion state.
That seems like too little net migration to fully explain several hundred fewer deaths in Massachusetts per year (though I’ll come back to this).
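For readers unfamiliar with the difference-in-differences analysis the quote mentions, the idea is simple: compare the change over time in the expansion state with the change over the same period in control areas, so that shared trends net out. Here is a minimal sketch with invented numbers; these figures are hypothetical illustrations, not data from the mortality study or the migration paper:

```python
# Difference-in-differences sketch with made-up numbers.
# These are NOT figures from the Massachusetts study or the
# Schwartz & Sommers migration analysis; they only illustrate the logic.

# Hypothetical mortality rates (deaths per 100,000), before and after
# a coverage expansion, in the expansion state and in control counties.
treated_before, treated_after = 300.0, 280.0
control_before, control_after = 310.0, 305.0

# Change within each group over the same period.
treated_change = treated_after - treated_before   # -20.0
control_change = control_after - control_before   # -5.0

# The DiD estimate nets out the shared time trend. Interpreting what
# remains as a causal effect requires the usual assumption that the two
# groups would have followed parallel trends absent the expansion.
did_estimate = treated_change - control_change
print(did_estimate)  # -15.0
```

The same logic applies whether the outcome is mortality or net migration, which is why the design can be used both to estimate the coverage effect and to probe the migration threat to it.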
None of this is meant to imply that Manzi’s hypotheses are necessarily wrong or that there aren’t other threats to a causal interpretation. I’m sure Manzi, I, and others could think of some. Moreover, my responses above aren’t airtight. For example, the Health Affairs study I quoted wasn’t Massachusetts specific, so it’s possible that state experienced larger migration than the average effect estimated by that study. Or, it’s possible that even if net migration was small, it still changed the composition of the state’s population in ways that threaten the causal link between coverage and mortality (e.g., 200,000 new, healthier residents move in as 198,400 less healthy residents move out).
That brings me back to my main point, and Manzi’s. You can never prove causality with an observational study. At the same time, you can’t insist on an RCT for everything. You’ll die waiting. And policy decisions need to be made today. (Not to expand access to affordable coverage is as much a policy decision as to do so.) The best you can hope for, in many cases, is a substantial body of high caliber observational work that lines up largely pointing in one direction. Imperfect as it is, I think the Massachusetts mortality study complements such a body of work that suggests health insurance reduces mortality and morbidity.