Interoperability and the FAIR principles – a discussion
Posted on 4 April 2019
By Sarah Maddox, Technical Writer
This post was originally published at Ffeathers.
This week I’m attending a conference titled Collaborations Workshop 2019, run by the Software Sustainability Institute of the UK. The conference focuses on interoperability, documentation, training and sustainability. I’m blogging my notes from the talks I attend. All credit goes to the presenter, and all mistakes are my own.
Patricia Herterich from the University of Birmingham presented a session on "Interoperable as in FAIR – A librarian’s personal point of view".
A simple definition of interoperability: the ability of computer systems or software to exchange and make use of information. People also talk about semantic interoperability and other interpretations of the term.
Data interoperability
Patricia introduced the FAIR principles: a set of guidelines that aim to ensure data is:
- findable,
- accessible,
- interoperable, and
- reusable,
by both people and machines. The FAIR principles focus more on the semantic aspects of interoperability than on the technical aspects.
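As an aside from my own notes (this wasn't part of Patricia's talk), here's a rough sketch of what "readable by machines" can mean in practice: describing a dataset with structured metadata in a widely shared vocabulary, such as schema.org's Dataset type serialised as JSON-LD. All of the field values below are hypothetical placeholders.

```python
import json

# A minimal, hypothetical dataset description using the schema.org
# "Dataset" vocabulary serialised as JSON-LD. Structured metadata like
# this helps make data findable and interoperable for machines as well
# as for people.
metadata = {
    "@context": "https://schema.org",
    "@type": "Dataset",
    "name": "Example rainfall measurements",          # hypothetical title
    "description": "Daily rainfall readings, 2018.",  # hypothetical description
    "identifier": "https://doi.org/10.xxxx/example",  # placeholder persistent identifier
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "keywords": ["rainfall", "meteorology"],
    "encodingFormat": "text/csv",                     # an open, non-proprietary format
}

print(json.dumps(metadata, indent=2))
```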
Patricia highlighted a big problem: interoperability is not a well-defined term, and no one quite agrees on what it means.
Some organisations have developed tools to assess data interoperability:
- The Dutch Data Archiving and Networked Services (DANS) organisation has developed a FAIR data assessment tool (see the prototype) that attempts to measure data interoperability.
- The Australian Research Data Commons (ARDC) has also developed a FAIR data self-assessment tool.
Software interoperability
For software, we can think of interoperability in terms of the following (see the sketch after this list):
- Use of open standards
- Use of platform/plugin architectures
- Use of common libraries and package managers
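To make the platform/plugin idea a little more concrete, here is a minimal sketch (my own illustration, not from the talk) of a plugin architecture: the host application defines an interface and a registry, and independently written code interoperates by implementing that interface. The names here (ExporterPlugin, register, CsvExporter) are all hypothetical.

```python
from abc import ABC, abstractmethod

# The host application defines the contract that plugins must satisfy.
class ExporterPlugin(ABC):
    """Interface every exporter plugin implements."""

    format_name: str  # e.g. "csv", "json"

    @abstractmethod
    def export(self, records: list[dict]) -> str:
        """Serialise records into this plugin's format."""

# A simple registry: the host only knows the interface, never the
# concrete plugins, so third-party code can slot in later.
_REGISTRY: dict[str, ExporterPlugin] = {}

def register(plugin: ExporterPlugin) -> None:
    """Make a plugin available to the host application."""
    _REGISTRY[plugin.format_name] = plugin

# One example plugin, written independently of the host.
class CsvExporter(ExporterPlugin):
    format_name = "csv"

    def export(self, records: list[dict]) -> str:
        if not records:
            return ""
        header = ",".join(records[0])
        rows = [",".join(str(v) for v in r.values()) for r in records]
        return "\n".join([header, *rows])

register(CsvExporter())

# The host dispatches through the registry without knowing the plugin's class.
print(_REGISTRY["csv"].export([{"site": "A", "rainfall_mm": 3}]))
```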
Patricia pointed out that FAIRsharing.org catalogues relevant standards, but it already lists well over 1,000 of them.
So how does a researcher go about choosing the right standard, and how do we train researchers to make their data FAIR? Patricia left these as open questions for discussion.
Questions and comments from the floor:
- The FAIR principles were originally developed for data. Does it make sense to apply them to software?
- The FAIR principles seem like just a catchy way of packaging techniques that have been applied for a long time.
- Interoperability is not simple, and we need a set of user-friendly tools.
Thank you, Patricia, for a good discussion of the complex world of interoperability.