The Resource Identification Initiative is underway and will continue through the end of the year. We invite publishers, editors, authors, biocurators, librarians, resource providers, and vendors to participate. Research Resource Identifiers (RRIDs) are now appearing in the published literature; publications reporting RRIDs can be found through Google Scholar or PubMed.
The Resource Identification Initiative (#RII) is designed to help researchers cite, in sufficient detail, the key resources used to produce the scientific findings reported in the biomedical literature. A diverse group of collaborators is leading the project, including the Neuroscience Information Framework and the Oregon Health & Science University Library, with the support of the National Institutes of Health and the International Neuroinformatics Coordinating Facility.
Resources (e.g., antibodies, model organisms, and software) reported in the biomedical literature often lack sufficient detail to enable reproducibility or reuse. For example, catalog numbers for antibody reagents are infrequently reported, and the version numbers of software programs used for data analysis are often omitted. The Resource Identification Initiative aims to enable resource identification within the biomedical literature through a pilot study promoting the use of unique Research Resource Identifiers (RRIDs). In addition to being unique, RRIDs meet three key criteria; they are:
- Machine readable.
- Free to generate and access.
- Consistent across publishers and journals.
The first step in this project is to test the feasibility of the system for a limited set of resources in the biomedical literature: antibodies, model organisms (mice, zebrafish, and flies), and tools (i.e., software and databases). Authors publishing in participating journals will be asked to provide RRIDs for these resources, drawn from a set of community databases and registries.
To make it easy for authors to find the appropriate RRIDs and to format their citations, we have created the Resource Identification Portal, where authors can search across all of these sources from a single location.
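Because RRIDs follow a regular, machine-readable pattern, they can be pulled out of methods sections programmatically, which is what makes the tracking described below feasible. A minimal sketch in Python, assuming the common `RRID:<Authority>_<identifier>` form (the exact grammar may vary across registries, so this pattern is illustrative, not definitive):

```python
import re

# Assumption: an RRID is "RRID:" followed by a source-authority prefix
# (e.g. AB for antibodies, SCR for software) and an identifier.
# Real-world identifiers may use other separators or prefixes.
RRID_PATTERN = re.compile(r"RRID:\s?([A-Za-z_]+[_:][A-Za-z0-9_-]+)")

def extract_rrids(text):
    """Return all RRID citations found in a block of text."""
    return ["RRID:" + match for match in RRID_PATTERN.findall(text)]

# Hypothetical methods-section excerpt for illustration only.
methods = ("Sections were stained with anti-GFAP (RRID:AB_477010) "
           "and analyzed in ImageJ (RRID:SCR_003070).")
print(extract_rrids(methods))  # → ['RRID:AB_477010', 'RRID:SCR_003070']
```

A simple scan like this is exactly what an indexing service or resource provider could run over full-text articles to count citations of a given reagent or tool.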
In addition to facilitating reproducibility and reuse, the inclusion of RRID citations in the literature will allow resource providers, funders, and others to better track usage and impact. Ultimately, we believe that the outcome of this pilot will show:
- The need for better reporting of materials and methods to promote reproducible science. Proper resource identification is a step towards this goal.
- The need for a cultural shift in the way we write and structure papers. We must recognize the increasingly dominant model of interacting with the literature through automated agents; the conventions we adopt should therefore be tailored towards greater machine-processability.
- The need for a cultural shift in the way we view the literature. The literature is not only a source of papers for people to read, but a connected set of data: observations and claims in biomedicine that span journals, publishers, and formats. The synthesis of information from the literature and other sources requires universal machine access to key entities.
This pilot project was an outcome of a meeting held at the NIH on June 26, 2013. A draft report from the meeting is available. We are working with commercial partners, tool builders, and others in the FORCE11 community to provide a set of interfaces that will allow authors to access the data set acquired from this pilot in various ways that will demonstrate the power of this approach. We hope the Resource Identification Initiative will be a small step towards improving scholarly communication and scientific reproducibility.
Please see this guide if you are interested in participating. For an overview of the project, visit the FAQ.
More reasons why we are doing this:
- Faulty Antibodies Continue to Enter US and European Markets, Warns Top Clinical Chemistry Researcher
- Helsby et al. (2013). Reporting research antibody use: how to increase experimental reproducibility. F1000Research.
- Biocompare: Taking Steps Towards Scientific Reproducibility