Resource Identification Initiative


Looking to get RRIDs for your paper? Go to scicrunch.org/resources.

The Resource Identification Initiative has been launched and moved beyond its pilot phase. We invite publishers, editors, authors, biocurators, librarians, resource providers, and vendors to participate.

Authors can participate by adding RRIDs to their papers; to get started, go to scicrunch.org/resources.

Research Resource Identifiers (RRIDs) are now in the published literature; publications currently reporting RRIDs can be found in Google Scholar, PubMed Central, or PubMed.

The Resource Identification Initiative (#RRID) is designed to help researchers sufficiently cite the key resources used to produce the scientific findings reported in the biomedical literature. A diverse group of collaborators is leading the project, including the Neuroscience Information Framework, with the support of the National Institutes of Health and the International Neuroinformatics Coordinating Facility.

Resources (e.g., antibodies, model organisms, and software projects) reported in the biomedical literature often lack sufficient detail to enable reproducibility or reuse. For example, catalog numbers for antibody reagents are infrequently reported, and the version numbers of software programs used for data analysis are often omitted. The NIH considered this problem serious enough to introduce new Rigor and Transparency guidelines for nearly all awards starting in May 2016.

These guidelines argue for authentication of key research resources and transparency in how they are reported.


The Resource Identification Initiative aims to enable resource transparency within the biomedical literature by promoting the use of unique Research Resource Identifiers (RRIDs). In addition to being unique, RRIDs meet three key criteria. They are:

  1. Machine readable (see the sketch after this list).
  2. Free to generate and access.
  3. Consistent across publishers and journals.
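
As a concrete illustration of the "machine readable" criterion, the minimal sketch below shows how a script might pull RRIDs out of a methods section with a simple regular expression. This is not an official RRID parser; the pattern and the example sentence are assumptions for illustration only, built from RRID formats quoted elsewhere on this page (e.g. RRID:AB_2178887, RRID:SCR_007358).

```python
import re

# Hypothetical sketch: find RRIDs in free text with a simple pattern.
# The pattern is an assumption based on the RRID examples cited on this
# page (RRID:AB_..., RRID:SCR_..., RRID:MGI:..., RRID:CVCL_...); it is
# not an official or exhaustive RRID grammar.
RRID_PATTERN = re.compile(r"RRID:\s?([A-Za-z]+[_:][A-Za-z0-9_:-]+)")

def extract_rrids(text):
    """Return the list of RRID accessions found in a block of text."""
    return RRID_PATTERN.findall(text)

# Example methods sentence reusing identifiers quoted later on this page.
methods = ("Staining used an antibody (RRID:AB_2178887); analysis was "
           "performed with a software tool (RRID:SCR_007358).")

print(extract_rrids(methods))  # ['AB_2178887', 'SCR_007358']
```

Because every RRID follows the same prefix-plus-accession convention across publishers, this kind of automated extraction is what lets resource providers, funders, and text-mining agents track resource usage across the literature.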

The first step in this project was to test the feasibility of the system for a limited set of resources in the biomedical literature: antibodies, model organisms (mice, zebrafish, flies), and tools (i.e., software and databases). Authors publishing in participating journals were asked to provide RRIDs for their resources, drawn from the authoritative community databases for each resource type.

To make it easy for authors to find the appropriate RRIDs and to format their citations, we have created the Resource Identification Portal, where authors can search across all of these sources from a single location.

In addition to facilitating reproducibility and reuse, the inclusion of RRID citations in the literature allows resource providers, funders, and others to better track usage and impact. Ultimately, we believe the pilot phase demonstrated:

  • The need for better reporting of materials and methods to promote reproducible science.  Proper resource identification is a step towards this goal.
  • The need for a cultural shift in the way we write and structure papers. We must recognize the increasingly dominant model of interacting with the literature through automated agents; therefore, the conventions we adopt should be tailored towards greater machine-processability.
  • The need for a cultural shift in the way we view the literature. The literature is not only a source of papers for people to read, but a connected set of data: observations and claims in biomedicine that span journals, publishers, and formats. The synthesis of information from the literature and other sources requires universal machine access to key entities.

The pilot project was an outcome of a meeting held at the NIH on June 26, 2013. A draft report from the meeting is available. We are working with commercial partners, tool builders, and others in the FORCE11 community to provide a set of interfaces that will allow authors to access the acquired data set in various ways, demonstrating the power of this approach. We hope the Resource Identification Initiative will be a small step towards improving scholarly communication and scientific reproducibility.

Please see this guide if you are interested in participating. For an overview of the project, visit the FAQ.



Relevant publications

Bandrowski et al. (2015) The Resource Identification Initiative: A cultural shift in publishing.

   Co-published in the Journal of Comparative Neurology [10.1002/cne.23913], Brain and Behavior [10.1002/brb3.417], F1000Research [10.12688/f1000research.6555.2], and Neuroinformatics [10.1007/s12021-015-9284-3].

 

Vasilevsky et al. (2013) On the reproducibility of science: unique identification of research resources in the biomedical literature. PeerJ.

Another reason why we are doing this:  Faulty Antibodies Continue to Enter US and European Markets, Warns Top Clinical Chemistry Researcher

Helsby et al. (2013) Reporting research antibody use: how to increase experimental reproducibility. F1000Research.

Biocompare:  Taking Steps Towards Scientific Reproducibility


Comments

http://www.elsevier.com/about/content-innovation/minimal-data-standards

 

Please let us know if this needs to be significantly updated. We seem to be converging on a set of formats.

Thanks to Elena, our press release is live!!!

Elsevier and the Neuroscience Information Framework Work Together to Improve Reporting of Research in Neuroscience Literature!

http://www.elsevier.com/about/press-releases/science-and-technology/else...

 

Thank you Francis Collins for including the work of this group in your thinking about improving reproducibility of pre-clinical research!

http://www.nature.com/news/policy-nih-plans-to-enhance-reproducibility-1...

As of yesterday this was true for two papers; now we have 3!

Looks like the Journal of Comparative Neurology and PeerJ contain the first RRIDs to come out. This is a great week!

Looking forward to many more in the coming weeks.

 

My Blog on the subject: http://blog.neuinfo.org/index.php/essays/rrids-are-in-the-wild-thanks-to...

Thanks to 15 journals (and counting) we have reached 100 articles that contain RRIDs.

 

Other statistics:

# of RRIDs total: 630

# of RRIDs that are correct: 605

% of RRIDs that are correct: ~96% (605/630)

Most common research resource: RRID:nif-0000-30467, appearing in 20 papers

   *If you want to know which research resource it is, you are just going to have to go to Google Scholar and find out!*

This is a cool benchmark:

We just hit the 50th journal that now contains at least 1 paper with an RRID, doubling the original number of journals that signed up to participate in the pilot.

...and that journal is: Alcoholism: Clinical and Experimental Research.

pubmed.gov/25916839

Great quote from a blog post by Joseph Esposito that describes our aim in the RRID project and why we're seeing such growth:

 

"What we need are not new systems but new services. Services are not top-down comprehensive solutions to all the problems (and some of the merits) of scholarly communications but activities that address specific needs...

"What all of these things have in common is that they did not set out to change the entire world but to improve one piece of it."

http://www.nature.com/news/researchers-argue-for-standard-format-to-cite...

 

Best line: One format to rule them all!

Love it, just hope we don't need to be thrown down into lava at some point.

Hi Folks, just saw an independent confirmation from a friend (author who shall remain nameless until her paper is published) that Cell Press is delivering on their promise to add RRIDs to relevant resources.

Woohooo!

Nice piece from Phil, explaining the data commons. http://www.nature.com/nature/journal/v527/n7576_supp/full/527S16a.html "First, each research object in the commons — for example, data, software, narratives or papers — must be uniquely identified, sharable (taking into account privacy issues), and resolvable to its source by using a common identifier." - Agreed!

https://www.faseb.org/Portals/2/PDFs/opa/2016/FASEB_Enhancing%20Research... "...To achieve uniform reporting of research findings, FASEB recommends that investigators, funding agencies, and journals adopt best practices for experiments using antibodies...."

We need to think and digest. How can RII help? http://arxiv.org/abs/1602.02296

Thanks to our friends at NIDA, we have some rather strongly worded text supporting RRIDs now in the inboxes of all current and former NIDA grantees. SciCrunch: A Resource for Enhancing Reproducibility Through Rigor and Transparency https://www.force11.org/blog/nida-supports-scicrunch-and-rrids-making-re...

Thanks to Christophe Bernard, who spearheaded this effort, eNeuro, the premier open-access journal at the Society for Neuroscience, has started asking for RRIDs. Read all about it: http://eneuro.org/content/3/2/ENEURO.0046-16.2016

Thanks to Colleen Hammer, all products are identified with RRIDs on the ImmunoStar website and in Material Data Sheets. This is exactly what we need from all manufacturers of products. http://immunostar.com/latest-news/

Thanks to Katharine Miller, RRIDs are discussed alongside BioSharing, DataMed, and CEDAR!

http://bcr.org/content/data%E2%80%99s-identity-crisis-struggle-name-it-d...

Read all about it: Practical steps towards reproducibility:

http://dx.doi.org/10.1016/j.neuron.2016.04.030

 

Please take a look at the instructions to authors (oh go on, I know you have never done it before....)

http://submit.elifesciences.org/html/elife_author_instructions.html

"To help promote the identification, discovery, and reuse of key research resources, we encourage you to include Research Resource Identifiers (RRIDs) within the Materials and Methods section to identify the model organisms, cells lines, antibodies, and tools (such as software or databases) you have used (e.g.RRID:AB_2178887 for an antibody, RRID:MGI:3840442 for an organism, RRID:CVCL_1H60 for a cell line, and RRID:SCR_007358 for a tool). The RRID Portal lists existing RRIDs, and instructions for creating a new one if an RRID matching the resource does not already exist."

Instructions to authors at the Journal of Neuroscience have just been updated to include clear, short guidance on RRIDs.

http://www.jneurosci.org/site/misc/ifa_organization.xhtml

 

http://www.economist.com/news/science-and-technology/21702166-two-studie...

 

Would it not be nice to know which software tool was used in each of those studies, so that we could quickly figure out which ones are likely experiencing this 70% false-positive rate? Yes, yes it would. Just another reason to use RRIDs!

 

"...Since all the “participants” in these newly conducted trials were, in fact, controls in the original trials, there ought to have been no discernible signal. All would presumably have been thinking about something, but since they were idling rather than performing a specific task there should have been no discernible distinction between those categorised as controls and those used as subjects. In many cases, though, that is not what the analysis suggested. The software spat out false positives—claiming a signal where there was none—up to 70% of the time.

False positives can never be eliminated entirely. But the scientific standard used in this sort of work is to have only one chance in 20 that a result could have arisen by chance. The problem, says Dr Eklund, lies with erroneous statistical assumptions built into the algorithms. And in the midst of their inspection, his team turned up another flaw: a bug in one of the three software packages that was also generating false positives all on its own."

 

http://biomethods.oxfordjournals.org/content/manuscript-preparation#RRID

Welcome to the family Biology Methods and Protocols.

 

 

We have great news about the Endocrine Society journal Endocrinology.

They have updated their instructions to authors as follows (more to come!):

http://press.endocrine.org/page/endoita#mozTocId290874

"Antibody Table

It is the policy of Endocrinology to require authors using antibodies for immunohistochemistry, immunocytochemistry, western blots, immunoblots, immunoneutralization, or related methodology, to submit an Antibody table. This table should be numbered to indicate its position in the sequence of tables in the article (e.g. Table 1). In the Methods section, describe appropriate positive or negative controls, antibody validation, lot number, and provide references. Beginning in September 2016, authors should also ascertain whether the antibody has a Research Resource Identifier (RRID) by consulting the Antibody Registry and include this information, if available, in the Methods section and/or the Antibody table of the original submission. If there is not an RRID, authors are required to register the antibody and obtain one no later than the revision stage of submission. For more information, see the editorial Antibody Validation Requirements for Articles Published in Endocrinology and the Resource Identification Portal.

- See more at: http://press.endocrine.org/page/endoita#mozTocId290874"

This is really exciting; several papers have already come out in this new format.

It is everything we wanted: transparent, structured, and filled with good information for finding materials!

http://dx.doi.org/10.1016/j.cell.2016.08.021

 

http://www.sciencedirect.com/science/article/pii/S009286741631011X

http://www.sciencedirect.com/science/article/pii/S0092867416309953

http://www.sciencedirect.com/science/article/pii/S0092867416309321

 

 

http://us8.campaign-archive2.com/?u=9938fea8603b684592ae3da72&id=9a1fd0e21e

..."The Table allows for linking to reagent databases such as the Resource Identification Initiative’s RRID. In addition, the methods section will have no size limit and, more importantly, will NOT count against a word count. Cell suggests that “[t]he presence of these sections helps authors to report and readers to easily appreciate how the studies follow guidelines from the NIH Rigor and Reproducibility initiative” as well as the ARRIVE and TOP guidelines. "

http://www.eurekalert.org/pub_releases/2016-08/cp-cpt082516.php

"We are emboldened by the opportunity to be pioneers, and to extend what we do best--publishing the highest quality of science--to the methods section"

Nature Methods just published a paper entitled "A proposal for validation of antibodies," by the members of the International Working Group on Antibody Validation.

The pillars of validation include something near and dear to our hearts here at RII: identification of antibodies, i.e., the inclusion of RRIDs.

http://www.nature.com/nmeth/journal/vaop/ncurrent/full/nmeth.3995.html

Yes, you read this correctly: Thermo Fisher Scientific, one of the biggest antibody companies, will validate all products according to the standards outlined by the IWGAV.

http://www.businesswire.com/news/home/20160905005488/en/Thermo-Fisher-Sc...

Check out each BioLegend page showing the citation format for each product, using RRIDs! 

This is great news for validation of methods. Thank you BioLegend for helping authors do the right thing. 

http://www.genengnews.com/gen-articles/research-antibody-reproducibility...

 

The update to the instructions to authors is working; the first 5 papers are out!

https://wellcomeopenresearch.org/for-authors/article-guidelines/research...

This is the final report from the 2016 Asilomar meeting, which established, among many other things, consensus from the antibody validation community, including companies and researchers, that RRIDs solve one of the validation problems: transparency.

https://www.gbsi.org/gbsi-content/uploads/2016/12/Workshop-Report_12-15-...

According to this article, multiple solutions are needed to address antibody reproducibility issues, but all of them include RRIDs!

http://www.biocompare.com/333120-The-Antibody-Crisis-An-Ongoing-Discussion/

http://biorxiv.org/content/early/2017/02/16/109017

RRIDs are part of the solution, not the precipitate!

Chemistry is also dealing with reproducibility and nomenclature problems! Great read!

https://www.chemistryworld.com/news/taking-on-chemistrys-reproducibility...

http://blog.benchsci.com/2017/03/31/reproducibility-in-science-how-prope...

..."An even better citation method that is gaining momentum with scientists was created by the Research Resource Initiative. This initiative was launched with the intention of creating a resource to standardize the citation of biomedical resources based on the references in which they were generated or used."

Thank you PLoS, apparently I completely missed this blog post.

http://blogs.plos.org/plos/2016/09/riding-a-wave-towards-improved-truth-...

Please see instructions to authors!

http://www.fasebj.org/site/misc/rc.xhtml