• Which Food is Going to Kill You?

    A few stranger-than-usual nutrition studies have been circulating social media lately and need some clearing up.

     

    @DrTiff_

     
  • Does More Social Spending Reduce Healthcare Costs?

    We spend a lot of money on social programs. Studies indicate that a lot of those programs work to improve health outcomes. So, does that social spending reduce healthcare costs? Not really, at least not on a national level. But cutting social spending doesn’t help with health. As usual, it’s complicated, but there’s a lot of evidence these programs are doing good. Spending is worth it sometimes.

     

    @DrTiff_

     
  • We’re making an NIH-funded podcast on a very important topic! Maybe we want to interview you!

    We are creating a podcast about science culture! Specifically, it is a podcast that will be accompanied by educational modules (all NIH funded) addressing how the “culture of science” plays into the “reproducibility crisis”.

     

    We are focusing on the “why” behind the “how”. Meaning rather than focusing on p-hacking/image duplication/fabricated data, we’re focusing on what drives people to p-hack/duplicate images/fabricate data. We want to examine, among other things, the incentive systems put into place by universities, granting agencies, & publishers, and how that contributes to poor research.

     

    A (non-exhaustive) list of things we plan to cover:

    -How quantity over quality is rewarded in hiring, tenure/promotion, funding, publishing

    -Media coverage (fame factor and how the media doesn’t report on null results)

    -Conflicts of interest (professional, ideological, financial)

    -Poor oversight/mentoring (including how that can trickle down scientific “generations”)

    -Authority structure

     

    Particularly in terms of those last two: We want to talk to people who have felt pressure from mentors (likely during grad school or postdoc) to conduct inappropriate analyses and/or other data/publication related tasks for the sake of publishing specific and/or significant results. We want to hear from you whether you felt comfortable doing what was asked of you or not, whether you complied or did not. There is no judgement, only the wish to talk about an issue that is difficult to quantify precisely because we do not talk about it. We can take measures to anonymize you if you prefer.

     

    If you have stuff to say about any of the above, we want to hear from you!!! You can e-mail Tiffany at tsdohert@iu.edu or DM her on Twitter @DrTiff_

    We’ll likely conduct a pre-interview over the phone (less than 20 minutes) to make sure we’ll have plenty of relevant and interesting things to discuss, and if we all feel like it’s a go, we’ll get you scheduled for an interview. We’ll come to you and work around your schedule. The interview itself would be around an hour, and it’s not live, so we can edit anything. Very low pressure. We want this to be as easy on you as possible.

    In addition, if you have further topics related to this angle that you think we should consider, we want to hear about it! And if you know someone who would be awesome but not likely to see this, we want to hear about them, too!

     
  • You Guys Should Get the HPV Vaccine, Too

    The HPV vaccine is pretty well known amongst the public, but mostly as a measure against cervical cancer in women. A recent news article highlights its use in men, and we want to highlight that highlight.

     

    @DrTiff_

     
  • Meat’s Bad for You! No, It’s Not! How Experts See Different Things in the Data.

    The following originally appeared on The Upshot (copyright 2019, The New York Times Company)

     

    Researchers are in another fight about food.

    This week the Annals of Internal Medicine published studies arguing that eating red meat poses minimal health risks for most people, and that even our certainty about that link is weak. With these conclusions in hand, the authors issued recommendations that most people can continue their current levels of meat consumption.

    Although not involved in the research, I co-authored an editorial for the journal summarizing the findings, arguing that our messaging about the harms of red meat may be falling on deaf ears, and pointing out other messages that might work better to reduce consumption of it.

    The conclusions and the guideline recommendations, made by an international team led by Bradley Johnston, an epidemiologist at Dalhousie University, run counter to those of many established health authorities. This week, a number of nutrition researchers wrote me to say they vehemently disagree with the publication of these papers, and feel that they could do real harm.

    They believe that red meat and processed meat consumption poses a health hazard to people, and that if people don’t reduce their consumption, they are putting themselves, and the planet, at risk.

    I agree with them on the environmental argument for eating less meat. You can read about that here.

    But readers may well be confused about the health risks. How can experts disagree so strongly?

    The following questions may help you understand why even researchers of good faith can land on different sides of a debate.

     

     

    Part of the problem lies in the difficulty in doing research in this area. It’s almost impossible (and some would say unethical) to do the most rigorous type of experiment — a randomized controlled trial — in areas like red meat consumption. Because of that, we must rely on observational data; we ask people what they are eating and correlate that with outcomes.

    Others, like John Ioannidis, an expert in research design and analysis at Stanford, argue that we can and perhaps should do long-term randomized controlled trials of dietary patterns before we make proclamations.

     

     

    Even observational studies are hard to do well. Most major health setbacks are pretty rare. It’s hard to see big differences in death, cancer and heart attacks in even large groups of people, unless you follow them over long periods. But quantifying what people are eating over long periods is challenging, too, because often people don’t remember.

    Such studies are also difficult to interpret because of what are called confounding factors. Maybe people who eat more meat are poorer. Maybe they smoke, drink too much alcohol or don’t exercise. Those things would also lead to bad outcomes, and it’s hard to tease out individual components over time.

    If you do trials of people at higher risk — those who have already had heart attacks, for example — it’s easier to see if changes matter. The Predimed trial, for instance, which studied the Mediterranean diet, focused on people who already had diabetes or a number of traits placing them at high risk for heart disease. But these people aren’t necessarily representative of the general public, for whom dietary recommendations are written.

    All of this means that observational evidence, which is easier to obtain, will be ruled as “low quality” by some researchers. Others will argue it’s the best we can get, and therefore we should apply different standards to such research.

     

     

    Because big outcomes are rare, research sometimes looks at intermediate measures. Those, like weight, blood pressure, cholesterol levels and more, can change in shorter periods. Some will point to studies in these domains and say that they prove that meat reduction has significant health effects. High blood pressure or cholesterol levels are widely believed to be major risk factors for adverse events. Others will disagree as to how much we should rely on intermediate measures. These new studies focused only on those end-stage outcomes.

     

     

    Critics of the new meat studies argue that given the authors’ low certainty about their findings, they should have issued no recommendations at all. That’s not unreasonable. When the U.S. Preventive Services Task Force lacks sufficient evidence to publish recommendations on prevention, it gives recommendations an “I” rating, and says the current evidence is insufficient to assess the balance of benefits and harms. That’s all. Maybe that would have been preferable here, instead of publishing recommendations that people continue their current meat consumption.

     

     

    Even in studies that find statistically significant effects, the absolute benefits in most studies are small. I’ve written about this before. Many will argue, however, that even if there is a small individual benefit, the benefits to the population can be large.

    They are not wrong. Let’s say that the absolute risk reduction with respect to colon cancer is 0.5 percent. That would mean for every 200 people who reduced their meat consumption, one would see a benefit; 199 would not. To an individual, that might not seem like a big deal.

    But it also means that if two million people made that change, 10,000 would see a benefit, which from a population standpoint is great. It also means, of course, that 1,990,000 would see no benefit.
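    The arithmetic above can be sketched in a few lines. This is a minimal illustration using the article’s hypothetical 0.5 percent absolute risk reduction, not a measured result:

```python
# Absolute risk reduction (ARR) arithmetic from the example above.
# The 0.5 percent figure is hypothetical, as in the text.
arr = 0.005  # absolute risk reduction: 0.5 percent

# Number needed to treat: how many people must reduce their meat
# consumption for one of them to see a benefit.
nnt = round(1 / arr)  # 200: one benefits, 199 do not

# The same reduction scaled to a population of two million.
population = 2_000_000
beneficiaries = round(population * arr)  # 10,000 see a benefit
no_benefit = population - beneficiaries  # 1,990,000 see none

print(nnt, beneficiaries, no_benefit)
```

    The same tiny individual effect and large population effect come from one number; only the denominator changes.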

    There are many things that might make a difference at a population level for which people won’t make changes at an individual one. People accept major risks every day: to drive, to ski and more. Why? Because those activities bring benefits that individuals value over the harms. Should we care more about the individual or the population in recommendations?

     

     

    Some believe we shouldn’t bring preferences into play when we write guidelines: Focus on the health benefits alone, and not on other factors such as how much people like meat. After all, we don’t care if people “like” to smoke when we tell them not to.

    Others might counter that a study in the International Journal of Cancer in 2012 found that men smoking more than 30 cigarettes a day had a 10,250 percent increased risk for developing squamous cell carcinoma. That’s huge. An increase of 18 percent (relative risk of 1.18) for processed meat consumption is not the same, and therefore it might be reasonable to think about how much people derive joy from their current diets.

    Relative risk refers to the percentage change in one’s absolute (overall) risk as a result of some change in behavior (1.18, for example, is an 18 percent change from 1.0, and 1.0 represents a baseline of no difference in risk between an experimental group and a control group).
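    The conversion from relative risk to percent change described above can be sketched as a one-line calculation:

```python
def percent_change(relative_risk: float) -> float:
    """Percent change in risk relative to a baseline of 1.0
    (no difference between experimental and control groups)."""
    baseline = 1.0
    return (relative_risk - baseline) / baseline * 100

# The 1.18 relative risk cited for processed meat is an 18 percent
# increase; the heavy-smoking relative risk works out to 10,250 percent.
print(round(percent_change(1.18), 2))   # 18.0
print(percent_change(103.5))            # 10250.0
```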

    Because these questions don’t have black and white answers, researchers can look at the same sets of data and come to very different conclusions. Should recommendations be more concerned with populations or individuals? How much risk should there be to matter? Should personal preferences be considered? What should we say in the face of less than optimal evidence? Nothing? Play it safe?

    Unfortunately, too many of these arguments on meat consumption devolve into tribal fights.

    On the other hand, there are points on which I’m not seeing disagreement. Eating beef, for example, is a major problem for the environment. Eating less to improve the long-term outlook for climate change could make a huge difference, and would be something on which a majority of those involved in these debates might agree.

    @aaronecarroll 

     
  • The Real Problem With Beef

    The following originally appeared on The Upshot (copyright 2019, The New York Times Company)

     

    The potentially unhealthful effects of eating red meat are so small that they may be of little clinical significance for many people.

    This finding, just released in multiple articles in the Annals of Internal Medicine, is sure to be controversial. It should certainly not be interpreted as license to eat as much meat as you like. But the scope of the work is expansive, and it confirms prior work that the evidence against meat isn’t nearly as solid as many seem to believe. (While I had no role in the new research, I co-wrote a commentary about it in the journal.)

    Red meat has been vilified more than almost any other food, yet studies have shown that while moderation is important, meat can certainly be part of a healthy diet.

    This doesn’t mean that there aren’t other reasons to eat less meat. Some point out that the ways in which cattle are raised and consumed are unethical. Others argue that eating red meat is terrible for the environment.

    Recently, meat substitutes have emerged as a possible solution, but the promise is much greater than the reality, at least so far.

    Burger King and other fast-food chains are trying out Impossible Foods burgers as a vegan answer to beef. Let’s dispense with the idea that this is “healthier” in any way. The Impossible Whopper has 630 calories (versus a traditional Whopper’s 660). It also contains similar amounts of saturated fat and protein, and more sodium and carbohydrates. No one should think they’re improving their health by making the switch.

    What about the environmental argument? Almost 30 percent of the world’s ice-free land is used to raise livestock. We grow a lot of crops to feed animals, and we cut down a lot of forests to do that. But beef, far more than pork or chicken, contributes to environmental harm, in part because it requires much more land. The greenhouse gas production per serving of chicken or pork is about 20 percent that of a serving of beef.

    Cows also put out an enormous amount of methane, causing almost 10 percent of anthropogenic greenhouse gas emissions and contributing to climate change.

    There has been a lot of hope that Beyond Meat’s pea protein or Impossible Burger’s soy could serve as beef burger substitutes, reducing the need for cows. That’s unlikely to happen, according to Sarah Taber, a crop scientist and food system specialist. Ground beef is not the problem; steak is.

    “There’s no profit to be made in ground beef,” she said. “That all comes either from leftover parts once cattle have been slaughtered for more expensive cuts, or from dairy cattle that have outlived their usefulness. If everyone gave up hamburgers tomorrow, the same number of cows would still be raised and need to be fed.”

    In other words, to improve the environment by reducing the number of cows slaughtered, we’d need to find a way to replace the many other cuts of beef Americans enjoy. No lab, and no company, is close to that.

    To greatly reduce the reliance on cows, we’d also need to wean ourselves from our high level of milk consumption. The increasing use of alternative milks, like oat or soy, could help, but the dairy industry still dominates.

    (The dairy industry’s claims about the health of its product are somewhat overblown. Milk isn’t nearly as “necessary for health” as many believe.)

    Some companies are researching ways to replace the more complex cuts of meat that drive the market. These companies aren’t replacing beef with substitutes; they’re trying to grow it in the lab using stem cells.

    Tamar Haspel, who writes on food policy for The Washington Post, has said such advances are not likely soon. Nor is it clear that they would have an overall positive impact, unless we are sure that this meat can be made in a more energy-efficient way than we can raise cattle.

    If meat substitutes won’t help in the short run, other things still might. Some believe that raising cattle on pastures, from birth until slaughter, might sequester carbon in the soil better than having cows finish their growth on feed lots. Researchers at the University of Florida argue that it can also be profitable for farmers in warmer climates to do just that. It would require the cattle industry to make significant changes, as well as to relocate, and it seems unlikely they’d be willing to do that.

    “Grass-feeding cattle without grain is the norm in New Zealand, but almost no one in the United States does it,” Dr. Taber said.

    It’s also worth pointing out that it would probably take longer to raise cows this way, giving them more time to emit methane.

    Other new developments could help with that problem. Some have proposed farming insects to make animal feed. And feeding seaweed to cows, even in small amounts, can significantly reduce their methane burps.

    One problem with seaweed is that the component that helps reduce methane emissions is classified as a carcinogen by the Environmental Protection Agency. It’s present in small amounts in seaweed, though, and humans have been eating seaweed safely for a long time. A larger problem is that we are unprepared to farm the unbelievable amount of seaweed it would take to feed all the cattle the world is raising.

    “Picture a seaweed farm the size of Manhattan,” Dr. Taber said.

    Until people are truly ready to reduce consumption of dairy or consumption of higher-end beef cuts, or to commit to raising cattle differently, it seems unlikely that any of the changes with respect to ground beef will make a significant environmental difference in the near future.

    That doesn’t mean there’s nothing we can do. I asked Dr. Taber what we might advise people, right now, to help the environment.

    “Who needs steak when there’s bacon and fried chicken?” she said.

     

    @aaronecarroll

     
  • Are Daycare Policies Driving Overtreatment of Kids?

    Many daycare policies on children with minor ailments are unhelpful. Rules that keep sick kids at home for things like pink eye or a fever can make parents’ lives more difficult, and usually for no good reason. Many providers won’t let kids back in without proof that they’re taking antibiotics, even for mild illnesses that wouldn’t benefit from them.

     

    This video was adapted from a column Aaron wrote for the Upshot. Links to sources can be found there.

    @DrTiff_ 

     
  • Why do doctors still offer treatments that may not be helpful?

    There are a surprising number of treatments that get accepted into mainstream care and covered by insurance, despite there being little to no evidence that they work.

     

    This video was adapted from a column Austin wrote for the Upshot. Links to sources can be found there.

    @DrTiff_

     
  • Is Cereal Really a Superfood?

    A new General Mills infographic posted on Businesswire.com last week asks if cereal is the secret superfood. Here’s a (not so secret) secret: It isn’t.

     

    @DrTiff_

     
  • Day Care Directors Are Playing Doctor, and Parents Suffer

    The following originally appeared on The Upshot (copyright 2019, The New York Times Company)

     

    Almost anyone with a child in day care or preschool has received the call. Your child has a minor ailment like pink eye, they say, and must go to the doctor. Otherwise, the child won’t be able to return to school or day care. Sometimes they even say your child can’t come back until they’re on antibiotics.

    The best evidence, however, says there should be less treatment for pink eye and other minor illnesses, not more. Day care centers usually ignore this evidence — and parents often pay the price.

    This pattern is extremely hard to reverse. There is an understandable urge to be protective when it comes to children (and also an incentive to protect yourself from future criticism should your decision not to act go poorly).

    Being overprotective has costs, though. Visits to the doctor aren’t free, nor are drugs. Parents have to skip work to take their children to the doctor. They already pay a lot for day care, and this is a forced expense that often lacks value.

    About 30 percent to 40 percent of working mothers pay for child care outside the home. In a recent Upshot article, Claire Cain Miller pointed out that child care can cost a typical family a third of its income. And the current administration keeps adding work requirements to many safety net programs.

    Moreover, a visit to the physician is not harm-free. Children are exposed to other potentially ill children in a waiting room. Day care centers seem concerned that children could spread disease to other children in their facilities, but not nearly as concerned that they could spread illnesses to other children in a clinic.

    Day care centers sometimes act as medical experts. In 2010, researchers examined the medical policies of day care centers in Pennsylvania. Almost all (97 percent) of them had written policies for when ill children should stay home. In almost all cases, those making decisions about who was too ill to attend were directors or teachers — neither of whom had medical training.

    More than 90 percent had policies requiring antibiotics for pink eye with a white or yellow discharge. More than half had policies requiring antibiotics for diarrhea. Neither requirement makes sense.

    Pink eye, also known as conjunctivitis, can be highly contagious, which helps explain why day care centers have such strict policies regarding it.

    But a lot of pink eye is caused by viruses, which are unaffected by antibiotics.

    Even when pink eye is caused by a bacterial infection, antibiotics have a limited effect. They slightly increase the chance that a child will feel better within two to five days. But there’s no promise of a cure the “next day,” as most day care centers seem to believe, and no lessening of infectiousness.

    This is not just an American problem. A study of child care centers in Ontario found workers often recommended that parents visit the doctor for a runny nose or cough; they also said antibiotics would be useful for colds (they are not). More than two-thirds had allowed children to return if they were on antibiotics. Studies in England have also found that about half of policies required antibiotics for pink eye, and that such policies influence clinicians’ decisions about whether to prescribe them.

    The American Academy of Pediatrics, along with some other major health organizations, has published recommendations on when children should be kept from child care centers. Two main criteria should drive considerations, according to the recommendations. If children are so sick that they cannot participate in activities, they should stay home. If they need more attention than the staff can provide, they should stay home, too.

    Otherwise, it should depend not on whether they are sick, but on how sick. Clearly a very ill child should stay home, but each child experiences illness differently. Fever requires staying away only if it causes changes in activity participation or if it’s accompanied by other symptoms like a sore throat or rash. Vomiting requires staying away if it occurs more than once.

    Multiple studies have shown that the optimal strategy for managing pink eye is to delay antibiotics — to see if it resolves on its own. There’s certainly no reason to require antibiotics, or to delay a child’s return until they’re on them.

    Many schools will probably be too worried about children who spread infections to follow these guidelines. But children are often infectious before they show any symptoms, and we’re never going to be able to prevent all transmission of illness.

    Our best bet is still focusing on hygiene, like proper hand washing, covering your mouth when you cough or sneeze, and not sharing food and drink with others.

    Instead of relying on antibiotics and after-the-fact measures to prevent the spread of disease at day care, a center’s staffers could focus on disease prevention. A randomized controlled trial published last year in Pediatrics examined a hand hygiene program at child care centers in Spain that focused on more diligent hand washing or sanitizing. The program resulted in a reduction in respiratory infections and antibiotic prescriptions, as well as fewer days missed. Another study from a decade earlier found a similar result.

    Unnecessary treatment isn’t just a problem with pink eye. It happens for sore throats. It happens for rashes. Taken together, it essentially amounts to a tax on being a parent, with costs both obvious and hidden: medical fees, missed work, lost time.

    The idea is to protect children, but current practices don’t seem to be serving anyone well.

     

    @aaronecarroll

     