• Citation analysis, in one “sobering fact”

    It is a sobering fact that some 90% of papers that have been published in academic journals are never cited. Indeed, as many as 50% of papers are never read by anyone other than their authors, referees and journal editors. We know this thanks to citation analysis, a branch of information science in which researchers study the way articles in a scholarly field are accessed and referenced by others.

    Sadly and ironically, this sobering fact from Lokman Meho is not associated with a citation. Not that I necessarily doubt its veracity, but I would love to be made aware of the body of work that supports it. Anybody know? (Comments open for one week for leads only. Email/Twitter fine too. I have also emailed the author.)


    Comments closed
    • Funny you should ask. I was just looking for actual studies behind this claim and others yesterday, for something that I was writing, and found no evidence for such a broad result, although I did find specific studies consistent with it. Subject-specific studies seem to dominate: http://www.bibsonomy.org/bibtex/246df8e4899f7ca5ed2a0f3fc4a34afb7/wdees in library and information science; http://www.biomedcentral.com/1471-2288/4/14. Mark Bauerlein did a case study on literary research at four colleges: http://centerforcollegeaffordability.org/research/studies/literary-research-analysis. In a CHE article (http://chronicle.com/article/We-Must-Stop-the-Avalanche-of/65890/), Bauerlein and others refer to an academic study by Peter Jasco in Online Information Review in 2009, but when I looked at his 2009 papers in that journal, I did not see any abstracts that could contain a result like that. I was planning to email Jasco… Please let me know what you find.

      • Addendum to prior comment: the studies I found were not necessarily consistent with more than 90% of papers going uncited, but rather with the general point. The library and information science study shows a median of 0 citations, so at least 50% were never cited; the subset of medical studies had 25% uncited; literary scholarship was closer to 90%.

    • You are quoting from a 2007 article, which I *presume* is based on scholarship that the author and a co-author published in 2006.


    • Citation distributions are highly skewed. But Meho is exaggerating the overall skewness.
      If we do a search in Web of Science (WoS) for all items published in, say, 2000, there are 1.2 million. The web interface for WoS only allows one to scroll down to the 100,000th most cited item. But that’s almost 10%, and the 100,000th item has 53 citations. That’s still a lot of citations. Unfortunately the web interface doesn’t allow us to scroll further down the list. But we’d have to go much further to get to zero citations.
      Or if you try publication year 1994, which has just slightly more than a million items in WoS, the 100,000th item has 43 citations today.
      Anyone with access to Web of Science can try that. But from some research by Larry Smolinsky and me, as yet unpublished, we can get more detail.
      Skew varies by field. Economics is very skewed. There are 53,985 publications categorized as Economics by WoS published in the 1960s. Of these, 82% had no citations by the end of the 1970s.
      By contrast, Biochemistry is less skewed. There are 109,202 publications in WoS published in the 1960s categorized as Biochemistry and Molecular Biology. Of these, only 11% were uncited by the end of the 1970s. A quick simulation below makes the contrast concrete.
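      To make that contrast concrete, here is a minimal simulation sketch (not drawn from the commenter’s unpublished analysis): it draws citation counts from negative binomial distributions whose dispersion and mean parameters are hypothetical, chosen only to contrast a heavily skewed field with a less skewed one, and reports the mean, the median, and the uncited share.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical (dispersion n, mean) pairs; not fitted to WoS data.
      # A smaller n gives a heavier-tailed, more skewed distribution.
      fields = {
          "economics-like (very skewed)": (0.15, 2.0),
          "biochemistry-like (less skewed)": (1.5, 12.0),
      }

      for name, (n, mean) in fields.items():
          # NumPy's negative binomial has mean n * (1 - p) / p; solve for p.
          p = n / (n + mean)
          cites = rng.negative_binomial(n, p, size=100_000)
          print(f"{name}: mean={cites.mean():.1f}, "
                f"median={np.median(cites):.0f}, "
                f"uncited={(cites == 0).mean():.0%}")

      With the very skewed parameters the median is 0 even though the mean is positive, echoing the library-and-information-science result above; with the less skewed parameters, almost every simulated paper picks up at least one citation.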

    • I’m obviously not an academic, but I’ve posted two links below that might help in answering this question, or maybe in suggesting additional research for Richard Freeman.

      Shankar Vedantam did a clip for NPR on Richard Freeman’s look into how diversity affects the quality of research. Here is a link to the IDEAS page that describes the paper:

      Also, Freeman’s earlier paper “Why and Wherefore of Increased Scientific Collaboration” may hold a partial answer to this question.
      File URL: