Several weeks ago I quoted an article claiming that “90% of papers that have been published in academic journals are never cited.” As I wrote, I could not substantiate that claim with actual studies. Many, many people emailed and tweeted me links to other published pieces that quoted stats like this, but few were grounded in science.
I hate that.
So does Dahlia Remler, who did some legwork to track down actual studies on citation rates. Her post is worth a full read, but (glossing over some details, which are important) the bottom line seems to be this: that 90% figure is bogus.
Many academic articles are never cited, although I could not find any study with a result as high as 90%. Non-citation rates vary enormously by field. “Only” 12% of medicine articles go uncited, compared to about 82% (!) for the humanities. It’s 27% for the natural sciences and 32% for the social sciences ([ungated] cite). For everything except the humanities, those numbers are far from 90%, but they are still high: one third of social science articles go uncited!
Ultimately, I’m not sure what to make of this. Let’s take health economics and health services research, the fields whose literature I know best. I presume they’re in the “social sciences” category, which has a 32% uncited rate. Is that too high, too low, or just right?
On the one hand, not everything deserves to be cited. Few in the field would deny that a lot of published work rests on shaky methods or dwells on obscure minutiae of little practical importance. On the other hand, given the resources expended on a publication (from grant application through paper writing, including the work of reviewers and editors), it’s a shame that some research products basically go nowhere and make no measurable impact.
I know full well that there are some very good papers that don’t get cited or recognized enough. In part, this blog is about bringing such work to light just when it’s relevant to the policy debate, as well as raising its profile in the scholarly community.
So, I can’t reason my way to the “right” uncited rate. What I can reason my way to, however, is this: it’s bad scholarship and bad journalism to cite a figure from a “study” that isn’t grounded in any, you know, research. Let’s stop doing that. Follow the links and the references. If there are none, or they dead-end at a non-study WAG (wild-ass guess), let’s stop treating the figure like a fact. That’s how this works.