A new General Mills infographic posted on Businesswire.com last week asks if cereal is the secret superfood. Here’s a (not so secret) secret: It isn’t.
The following is a guest post by Melissa Garrido, PhD (@GarridoMelissa). She is the Associate Director of the Partnered Evidence-based Policy Resource Center (PEPReC) at the Boston VA Healthcare System, U.S. Department of Veterans Affairs, and a Research Associate Professor with the Department of Health Law, Policy, and Management at Boston University School of Public Health.
A new proposal from the Trump administration would make millions of Americans ineligible for federal food assistance. This is likely to make people sicker and cost Medicare and Medicaid more.
The US Department of Agriculture is proposing eligibility changes to the Supplemental Nutrition Assistance Program (SNAP) — limiting the number of people who are categorically enrolled based on receipt of other federal benefits. Households that include an older adult are chiefly affected: 13 percent of such households currently eligible for SNAP are at risk of losing eligibility.
The government’s own analysis acknowledges that this change could exacerbate food insecurity — inadequate access to food — for those no longer eligible for SNAP. What they don’t acknowledge is that food insecurity is also linked to negative health outcomes.
Food insecurity is associated with an increased risk of both mental and physical health issues in adults, including depression, diabetes, high blood pressure, and malnutrition. In food-insecure children, anxiety, aggression, asthma, and iron-deficiency anemia have been observed. Malnutrition, a consequence of food insecurity, weakens immunity and increases susceptibility to communicable illnesses.
The good news is SNAP works. Receipt of SNAP benefits is associated with reductions in food insecurity. It is also associated with reduced health expenditures among low-income adults. Among older adults, receipt of SNAP benefits is associated with reduced risks of hospitalization and nursing home admission. Older adults with SNAP are better able to afford medication — making them less likely to skip or ration important medications.
By restricting SNAP eligibility, the government will pay for nutrition assistance for fewer people. This may come at the price of worsened health status and increased health care spending for millions of Americans.
The USDA is seeking public comments on this proposal until September 23, 2019.
The following originally appeared on The Upshot (copyright 2019, The New York Times Company)
Almost anyone with a child in day care or preschool has received the call. They say your child has a minor ailment like pink eye and must go to the doctor. Otherwise, they say, the child won’t be able to return to school or day care. Sometimes, they even say your child can’t come back until they’re on antibiotics.
The best evidence, however, says there should be less treatment for pink eye and other minor illnesses, not more. Day care centers usually ignore this evidence — and parents often pay the price.
This pattern is extremely hard to reverse. There is an understandable urge to be protective when it comes to children (and also an incentive to protect yourself from future criticism should your decision not to act go poorly).
Being overprotective has costs, though. Visits to the doctor aren’t free, nor are drugs. Parents have to skip work to take their children to the doctor. They already pay a lot for day care, and this is a forced expense that often lacks value.
About 30 percent to 40 percent of working mothers pay for child care outside the home. In a recent Upshot article, Claire Cain Miller pointed out that child care can cost a typical family a third of its income. And the current administration keeps adding work requirements to many safety net programs.
Moreover, a visit to the physician is not harm-free. Children are exposed to other potentially ill children in a waiting room. Day care centers seem concerned that children could spread disease to other children in their facilities, but not nearly as concerned that they could spread illnesses to other children in a clinic.
Day care centers sometimes act as medical experts. In 2010, researchers examined the medical policies of day care centers in Pennsylvania. Almost all (97 percent) of them had written policies for when ill children should stay home. In almost all cases, those making decisions about who was too ill to attend were directors or teachers — neither of whom had medical training.
More than 90 percent had policies requiring antibiotics for pink eye with a white or yellow discharge. More than half had policies requiring antibiotics for diarrhea. Neither requirement makes sense.
Pink eye, also known as conjunctivitis, can be highly contagious, which helps explain why day care centers have such strict policies regarding it.
But a lot of pink eye is caused by viruses, which are unaffected by antibiotics.
Even when pink eye is caused by a bacterial infection, antibiotics have a limited effect. They slightly increase the chance that a child will feel better within two to five days. But there’s no promise of a cure the “next day,” as most day care centers seem to believe, and no lessening of infectiousness.
This is not just an American problem. A study of child care centers in Ontario found workers often recommended that parents visit the doctor for a runny nose or cough; they also said antibiotics would be useful for colds (they are not). More than two-thirds had allowed children to return if they were on antibiotics. Studies in England have also found that about half of policies required antibiotics for pink eye, and that such policies influence clinicians’ decisions about whether to prescribe them.
The American Academy of Pediatrics, along with some other major health organizations, has published recommendations on when children should be kept from child care centers. Two main criteria should drive considerations, according to the recommendations. If children are so sick that they cannot participate in activities, they should stay home. If they need more attention than the staff can provide, they should stay home, too.
Otherwise, it should depend not on whether they are sick, but on how sick. Clearly a very ill child should stay home, but each child experiences illness differently. Fever requires staying away only if it causes changes in activity participation or if it’s accompanied by other symptoms like a sore throat or rash. Vomiting requires staying away if it occurs more than once.
Multiple studies have shown that the optimal strategy for managing pink eye is to delay antibiotics — to see if it resolves on its own. There’s certainly no reason to require antibiotics, or to delay a child’s return until they’re on them.
Many schools will probably be too worried about children who spread infections to follow these guidelines. But children are often infectious before they show any symptoms, and we’re never going to be able to prevent all transmission of illness.
Our best bet is still focusing on hygiene, like proper hand washing, covering your mouth when you cough or sneeze, and not sharing food and drink with others.
Instead of relying on antibiotics and after-the-fact measures to prevent the spread of disease at day care, a center’s staffers could focus on disease prevention. A randomized controlled trial published last year in Pediatrics examined a hand hygiene program at child care centers in Spain that focused on more diligent hand washing or sanitizing. The program resulted in a reduction in respiratory infections and antibiotic prescriptions, as well as fewer days missed. Another study from a decade earlier found a similar result.
Unnecessary treatment isn’t just a problem with pink eye. It happens for sore throats. It happens for rashes. Taken together, it essentially amounts to a tax on being a parent, with costs both obvious and hidden: medical fees, missed work, lost time.
The idea is to protect children, but current practices don’t seem to be serving anyone well.
Our approach to alcohol and pregnancy is full of good intentions, but may lead to more harm than good. Punitive policies can dissuade women from getting prenatal care, which runs counter to the overall goal of healthier mothers and infants.
Earlier today, Tennessee released a draft proposal to introduce block grants into its Medicaid program. Setting aside the dubious policy merits of block grants, however, I don’t think the proposal is legal. I don’t even think it’s close.
Under section 1903 of the Medicaid statute, the federal government must pay a fixed “match rate” (known in the statutory lingo as “the Federal medical assistance percentage”) to every state that participates in Medicaid. In Tennessee, the match rate is 65.21%. That means that, for every $1 that Tennessee spends on its Medicaid program, the federal government kicks in about $2.
Tennessee wants to change that financing structure. Instead of matching dollars, Tennessee would like an up-front, lump-sum payment—a “federal block grant”—that’s calculated to cover the anticipated expenses of its Medicaid population. Because Tennessee would have a fixed sum of federal dollars to spend, the state would have an incentive to economize on Medicaid. Any savings, Tennessee says, would be split 50-50 with the federal government.
As Tennessee recognizes, it’ll need a waiver from HHS to make these changes. And section 1115 of the Medicaid statute does allow HHS to waive lots of the law’s restrictions in connection with experimental projects that are likely to assist in promoting Medicaid’s objectives.
Now, I’ve written before that I’m not at all sure that block granting Medicaid counts as an experiment that serves Medicaid’s purposes. But there’s a more fundamental problem with Tennessee’s proposal. You can’t use section 1115 to waive section 1903. To the contrary, section 1903 is pointedly omitted from the list of statutory provisions that HHS is empowered to waive.
So you can’t use Medicaid waivers to change Medicaid’s financing structure. And that’s exactly what Tennessee is proposing to do.
Under the proposed waiver, the state would no longer get $2 for every $1 in Medicaid spending. It would instead get a lump sum every year that’s calculated to cover its anticipated Medicaid expenditures. If Tennessee spends more than it expects, the effective match rate would be lower than 65.21%. If it spends less, the effective match rate would be higher. Either way, the proposal would violate section 1903.
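The arithmetic here is simple but worth making concrete. A minimal sketch, using hypothetical dollar amounts (only the 65.21% FMAP comes from the post): once the federal payment is a fixed lump sum, the effective match rate is just the grant divided by actual total spending, so it moves inversely with how much the state ends up spending.

```python
# Sketch: effective federal match rate under a fixed block grant,
# compared with Tennessee's statutory 65.21% FMAP.
# Dollar figures below are hypothetical, for illustration only.

FMAP = 0.6521  # Tennessee's statutory federal medical assistance percentage

def effective_match_rate(block_grant: float, total_spending: float) -> float:
    """Federal share of total spending when the federal payment is a fixed lump sum."""
    return block_grant / total_spending

# Suppose the grant is sized to cover the federal share of a projected
# $10 billion in total Medicaid spending.
projected = 10_000_000_000
grant = FMAP * projected

# Spending matches the projection: effective rate equals the statutory FMAP.
print(round(effective_match_rate(grant, projected), 4))            # 0.6521

# State overspends by $1 billion: effective rate falls below 65.21%.
print(round(effective_match_rate(grant, 11_000_000_000), 4))       # 0.5928

# State underspends by $1 billion: effective rate rises above 65.21%.
print(round(effective_match_rate(grant, 9_000_000_000), 4))        # 0.7246
```

Either deviation from the projection produces an effective rate different from the fixed percentage section 1903 requires, which is the core of the legal problem described above.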
I’d been thinking that Tennessee would find a way to conform its block grant proposal to section 1903. But it looks like it didn’t even try. Indeed, it’s telling that Tennessee doesn’t even mention section 1903 in its proposal—a glaring omission for a state that wants to make a fundamental change to how the federal government pays for Medicaid.
Maybe Tennessee has an argument for why this proposal is consistent with the Medicaid statute. I’m all ears. For now, however, I don’t see how to square what Tennessee wants with the language of the law.
Electronic cigarettes, like Juul, have been in use in the United States for over a decade now, so what’s up with the sudden spike in e-cigarette-related illnesses? Spoiler: We wish we knew more.
A few years ago, Oregon found itself in a position that you’d think would be more commonplace: It was able to evaluate the impact of a substantial, expensive health policy change.
Rigorous evaluations of health policy are exceedingly rare. The United States spends a tremendous amount on health care, but very little of it on learning which health policies work and which don’t. In fact, less than 0.1 percent of total spending on American health care is devoted to evaluating them.
As a result, there’s a lot less solid evidence to inform decision making on programs like Medicare or Medicaid than you might think. There is a similar uncertainty over common medical treatments: Hundreds of thousands of clinical trials are conducted each year, yet half of treatments used in clinical practice lack sound evidence.
As bad as this sounds, the evidence base for health policy is even thinner.
A law signed this year, the Foundations for Evidence-Based Policymaking Act, could help. Intended to improve the collection of data about government programs, and the ability to access it, the law also requires agencies to develop a way to evaluate these and other programs.
Evaluations of health policy have rarely been as rigorous as clinical trials. A small minority of policy evaluations have had randomized designs, which are widely regarded as the gold standard of evidence and commonplace in clinical science. Nearly 80 percent of studies of medical interventions are randomized trials, but only 18 percent of studies of U.S. health care policy are.
Because randomized health policy studies are so rare, those that do occur are influential. The RAND health insurance experiment is the classic example. This 1970s experiment randomly assigned families to different levels of health care cost sharing. It found that those responsible for more of the cost of care used far less of it — with no short-term adverse health outcomes (except for the poorest families with relatively sicker members).
The results have influenced health care insurance design for decades. In large part, you can thank (or curse) this randomized study and its interpretation for your health care deductible and co-payments.
More recently, the study based on random access to Oregon’s Medicaid program has been influential in the debate over Medicaid expansion. A state lottery — which provided the opportunity for Medicaid coverage to low-income adults — offered rich material for researchers. The findings that Medicaid increases access to care, diminishes financial hardship and reduces rates of depression have provided justification for program expansion. But its lack of statistically significant findings of improvements in other health outcomes has been pointed to by some as evidence that Medicaid is ineffective.
Sickle cell disease affects 100,000 Americans, and has been pretty well understood for a long time. So, why are there only two drugs available for the condition? Why are so few research dollars allocated to the problem? Well, the answers aren’t very nice.
This piece originally appeared in Public Health Post and is coauthored by Elsa Pearson (@epearsonbusph) and Austin Frakt (@afrakt). Research for this work was funded by the Laura and John Arnold Foundation.
Health insurance is supposed to help us pay for expensive medical care, but what if the insurance itself becomes too expensive? What happens to our health?
It is possible to make progress on taming health care cost growth, which often makes its way into insurance premiums. In Massachusetts, employer-sponsored insurance premiums are the highest in the country, and they keep growing. Yet, compared to the rest of the nation, the state has done a good job improving access to care and addressing spending growth. Two major laws are responsible.
Chapter 58—sometimes called RomneyCare—was passed in 2006 to expand access to affordable health insurance through state and federal subsidies. To address health care costs, Chapter 224 was passed six years later. Among other things, it established the Massachusetts Health Policy Commission, which sets health care cost growth goals and monitors the state’s progress.
Both laws were successful; the state boasts the lowest uninsured rate in the country and an annual health care spending growth rate below the national average for most of this decade. Still, health insurance costs—especially for private plans—are persistently high.
Premiums are only part of it. Out-of-pocket spending for private plans—deductibles, coinsurance, and copays—has also increased considerably nationwide.
The individuals hit hardest by this cost growth are those stuck in a coverage gap, with incomes too high to qualify for Medicaid but too low to comfortably afford a private plan. Massachusetts residents below 300% of the federal poverty level and enrolled in employer sponsored insurance spend nearly a third of their entire income on health care.
They are underinsured—they have insurance, but also deal with unaffordable out-of-pocket costs. In 2018, nearly 30% of insured adults in the United States were underinsured. Of that 30%, four in 10 admitted to delaying care because of cost.
Just like being uninsured, being underinsured has consequences. People who have to pay more out-of-pocket tend to delay or skip care because of the added cost. A study in JAMA Pediatrics found that some parents of children with asthma rationed their children’s medications and even delayed appointments when out-of-pocket costs were high.
Without change, health insurance will become even less affordable for more people over time, increasing the number who forgo coverage or care altogether. What could we do to fix this?
For one, Massachusetts has set the state’s health care cost growth benchmark at the national rate of 3.1%. If the state can stay below this, it could help prevent health care costs from getting worse. Other states could follow suit and establish or maintain annual growth benchmarks of their own.
But insurance is expensive now; what else could be done in Massachusetts and elsewhere?
For those who purchase insurance on the exchange, many states have good and modestly priced options. In Massachusetts, individuals with exchange plans spend a smaller share of their income on health care than those with any other type of insurance.
For those with employer sponsored insurance, states could push employers to offer affordable insurance options. For example, Massachusetts could make permanent a law that penalizes employers who should offer affordable insurance but don’t. It’s set to sunset at the end of 2019, but allowing it to do so would be a step backwards, removing an incentive for employers to use their negotiating influence to reduce health care costs.
High health insurance costs are built into our system by years of uncontrolled spending growth. States have made progress in addressing the issue, but, for many Americans, insurance is still unaffordable. With unaffordable insurance comes skipped coverage, delayed care, and worsening health. Future reform must close the insurance coverage gap in which so many Americans are still stuck.
A new study out this month claims plant-based diets are associated with lower risk of not only cardiovascular disease and mortality but of all-cause mortality as well. Let’s see how the results stand up to scrutiny.