The media need to better understand the studies they’re writing about. What do we do about that?

I get a lot of emails, tweets, etc., asking me whether the health articles people are reading can be believed. This week, a lot of them were about prostate cancer. I was traveling, though, so I couldn’t look right away.

They were reacting to a fairly large number of articles reporting on a study claiming that the rate of diagnoses of advanced prostate cancer was rising, and that the rise was correlated with the reduction in screening recommended in recent years.

As always, I refused to comment or answer them until I had read the study. I have now. It’s “Increasing incidence of metastatic prostate cancer in the United States (2004–2013)” and it was published in the journal Prostate Cancer and Prostatic Diseases. The methods:

Methods: We identified all men diagnosed with prostate cancer in the National Cancer Data Base (2004–2013) at 1089 different health-care facilities in the United States. Joinpoint regressions were used to model annual percentage changes (APCs) in the incidence of prostate cancer based on stage relative to that of 2004.

The results:

The annual incidence of metastatic prostate cancer increased from 2007 to 2013 (Joinpoint regression: APC: 7.1%, P<0.05) and in 2013 was 72% more than that of 2004. The incidence of low-risk prostate cancer decreased from years 2007 to 2013 (APC: −9.3%, P<0.05) to 37% less than that of 2004. The greatest increase in metastatic prostate cancer was seen in men aged 55–69 years (92% increase from 2004 to 2013).
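
A quick aside on the statistics: the APC in a joinpoint analysis is conventionally estimated by fitting a log-linear trend of incidence on calendar year, with the joinpoint part additionally searching for the years where that trend changes slope. Here’s a minimal sketch of the core calculation, skipping the joinpoint search and using made-up incidence values, not the study’s data:

```python
# Minimal sketch of an annual percentage change (APC) estimate: fit
# ln(incidence) ~ year and convert the slope, APC = (exp(slope) - 1) * 100.
# (Joinpoint regression additionally finds the years where the slope
# changes; this sketch skips that step.)
# The incidence values below are MADE UP for illustration.
import numpy as np

years = np.arange(2007, 2014)                               # 2007..2013
incidence = np.array([4.8, 5.1, 5.5, 5.9, 6.3, 6.8, 7.3])   # hypothetical rates

slope, _ = np.polyfit(years, np.log(incidence), 1)  # fit ln(rate) ~ year
apc = (np.exp(slope) - 1) * 100
print(f"APC: {apc:.1f}% per year")                  # ~7% with these numbers
```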

Can you spot the problem here? Denise Grady at the NYT did:

In the study, the doctors examined the records of 767,550 men with prostate cancer diagnosed from 2004 to 2013. Using the number of cases of metastatic disease in 2004 (1,685) and 2013 (2,890), they reported an alarming increase of 72 percent.

But for the United States population, that percentage could be meaningless. On the cancer society website, Dr. Brawley said that to measure whether a disease was becoming more common, researchers could not rely on just the absolute number of cases. They need to calculate rates, meaning the number of cases per a certain number of people.

You can’t just look at the number of cases. You also have to look at the number of people who might have been diagnosed. You have to look at the rates.
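
To make that concrete: the raw counts quoted above (1,685 cases in 2004, 2,890 in 2013) do give a 72% increase, but dividing the same counts by a denominator can tell a completely different story. Here’s a minimal sketch; the at-risk denominators are hypothetical, chosen purely to illustrate the arithmetic, since the paper doesn’t supply them here:

```python
# Why raw case counts can mislead without denominators.
# Case counts are the numbers quoted above; the at-risk populations
# are HYPOTHETICAL, picked to show how the arithmetic can flip.

cases_2004, cases_2013 = 1_685, 2_890   # metastatic diagnoses (quoted above)

# Raw-count comparison: the 72% figure the study reported.
raw_increase = (cases_2013 / cases_2004 - 1) * 100
print(f"Raw-count increase: {raw_increase:.0f}%")   # ~72%

# Rate comparison: cases per 100,000 men at risk. If the database's
# catchment grew (more hospitals reporting, more men covered), the rate
# can stay flat even as counts rise. These denominators are made up.
at_risk_2004, at_risk_2013 = 35_000_000, 60_000_000

rate_2004 = cases_2004 / at_risk_2004 * 100_000
rate_2013 = cases_2013 / at_risk_2013 * 100_000
rate_change = (rate_2013 / rate_2004 - 1) * 100
print(f"Rate in 2004: {rate_2004:.2f} per 100,000")
print(f"Rate in 2013: {rate_2013:.2f} per 100,000")
print(f"Rate change: {rate_change:+.0f}%")  # essentially flat with these denominators
```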

This is epidemiology 101. It’s bread and butter.

There are other issues, too. One reason more aggressive disease is being found is that we’ve become better at finding it. The number of diagnoses started rising even before the recommendations to reduce screening appeared. We don’t know whether the number of people treated at these hospitals changed. The bottom line is that the coverage, and likely even the press release, went further than this study warranted.

This isn’t the first time.


The big question here is what the media should do about this. I include myself in that. There are so many publications, and press releases sometimes say things that aren’t correct. I can’t expect every journalist to be able to read a paper and parse its methods, can I? I also can’t expect every news organization to run its own peer review system to judge papers.

You’re going to have a hard time getting people who have the scientific expertise to do this to become full-time journalists. The number of people with this skill set who are even willing to do what Austin and I do (WHICH IS A LOT) is very, very limited. There are only so many of us. Most of us have other full-time jobs.

So what’s the solution? I’m seriously wondering. This kind of stuff hurts science and the credibility of the media.

@aaronecarroll
