Study shows scientific reproducibility is hampered by lack of specificity of resources


Nicole Vasilevsky and Melissa Haendel

Oregon Health & Science University Library’s Ontology Development Group

A key requirement when performing scientific experiments is the accessibility of material resources, such as the reagents or model organisms needed to address a specific hypothesis. The published scientific literature is a source of this valuable information: for example, the name and vendor of an antibody that will pick up a signal on a Western blot, or the description of a mutant zebrafish strain that shows altered development in a specific tissue at a specific time. But the published literature frequently lacks the detail researchers need to identify the material resources used to perform experiments. Biocurators must often chase down authors to identify these resources to support data ingest into public repositories, which is expensive, time consuming, and not sufficiently effective. To highlight the significance of the problem to publishers, editors, reviewers, and authors, we undertook a study to quantify it.

Our study was published on Sept. 5th in the new journal PeerJ, an Open Access publisher with a novel, open approach to peer review and scholarly publishing (and, as an aside, an excellent venue to publish in). The study demonstrates the magnitude of the problem, which negatively affects the ability of scientists to reproduce and extend reported work: a large proportion of scientific resources are unidentifiable based on the information reported within the journal articles.


Our study examined nearly 240 articles from more than 80 journals spanning five disciplines: neuroscience, immunology, cell biology, developmental biology, and general science. The articles were evaluated to determine whether the reported material research resources could be uniquely identified based on the information provided in each article, its supplemental data, or prior references. Specific criteria were developed to determine whether antibodies, cell lines, constructs, model organisms, and knockdown reagents were identifiable. Based on these criteria, we also developed guidelines for reporting research resources. These guidelines are available and can be used as a new data standard by authors, reviewers, publishers, and other data contributors to aid reproducibility. They are already in use at PeerJ!


The study showed that just under 50 percent of the scientific resources used in previously published articles were unidentifiable, a percentage that varied across resource types and disciplines. While poor identifiability was not unexpected, the actual rate of unidentifiable resources is still shockingly high. Our study also found no improvement in identifiability in journals with more stringent reporting guidelines, suggesting that those guidelines are not strictly enforced in the evaluation of the research.


Our hope is that by quantifying the lack of research reproducibility stemming from inadequate resource identification, we can show the research and publishing community that there is a significant and pressing need to make material resource information more accessible. Libraries are poised to help researchers with their data management and publishing needs; ask a friendly librarian if you have questions.




About Melissa Haendel

Melissa is an active participant in the Force11 community; she is on the executive board, chaired the Force2016 conference, was the program chair for Force2015, and currently co-leads the Attribution WG. Melissa also co-leads the Monarch Initiative, which aims to provide open integrated access to model organism and human phenotype-...
