Vox seems to be getting a lot of clicks for a new paper claiming that there are “too many scientific studies out there.” Mind you, it’s not because there are too many papers with inadequate methods, or too many with bad results. It’s because citations to new papers decay more quickly than they used to.
I think that metric is flawed for any number of reasons. For one, it used to be that the only way to read a paper was to subscribe to the journal. So a few top journals hoarded all the good papers, and those got cited massively, even if they weren’t perfectly on point. As it became easier to read more journals and papers (see the Internet), people spread their citations around.
The metric also assumes that citing a paper less often means researchers have “forgotten” it or are “unable to track it.” That’s not necessarily true at all.
Further, there are a lot more journals around than there used to be! That’s not a terrible thing: there are more places for people to publish, and more things for people to cite.
Now, it’s entirely possible that this could mean that more bad science is being published. But the paper didn’t measure that. Nor did it prove in any way that a lack of persistence of citations is a bad thing. It could mean science is moving faster.
But one thing that could replace the persistence of citations is a media with an attention span greater than that of a gnat.* If people are involved in the media who are experts in their domains, who know the key studies, or how to find them, then this is all moot. Those people can bring the important research to the forefront and keep people focused on what matters. I like to think that the members of TIE are trying to do just that.
*There are lots of good media sources these days that are trying to combat this problem!