Evaluating FAIR maturity through a scalable, automated, community-governed framework
Mark D. Wilkinson, M. Dumontier, S. Sansone, Luiz Olavo Bonino da Silva Santos, Mario Prieto, Dominique Batista, Peter McQuilton, T. Kuhn, P. Rocca-Serra, M. Crosas, E. Schultes
Published 2019 in Scientific Data
ABSTRACT
Transparent evaluations of FAIRness are increasingly required by a wide range of stakeholders, from scientists to publishers, funding agencies, and policy makers. We propose a scalable, automatable framework for evaluating digital resources that combines measurable indicators, open-source tools, and participation guidelines to accommodate domain-relevant, community-defined FAIR assessments. The framework has three components: (1) Maturity Indicators – community-authored specifications that each delimit a specific, automatically measurable FAIR behavior; (2) Compliance Tests – small Web apps that test digital resources against individual Maturity Indicators; and (3) the Evaluator – a Web application that registers, assembles, and applies community-relevant sets of Compliance Tests to a digital resource and produces a detailed report of what a machine "sees" when it visits that resource. We discuss the technical and social considerations of FAIR assessments and how these translate into our community-driven infrastructure. We then illustrate how the Evaluator's output can serve as a roadmap to help data stewards incrementally and realistically improve the FAIRness of their resources.
PUBLICATION RECORD
- Publication year: 2019
- Venue: Scientific Data
- Publication date: 2019-05-28
- Fields of study: Biology, Sociology, Computer Science, Medicine
- Source metadata: Semantic Scholar, PubMed