
Never has Software and Credit been such an important topic - CW16

Author(s)
Shoaib Sufi

Community Team Lead

Posted on 26 January 2016

Estimated read time: 7 min

Register for CW16 at www.software.ac.uk/cw16 

The Collaborations Workshop 2016 (CW16) brings together researchers, developers, project leaders, funders, publishers and more to explore the space of software and credit, collect best practices, form collaborations and think about the future. CW16 will inform nascent research leaders and the wider community about what is necessary to sustain the future of computationally powered research: supporting the people behind it and recognising their indispensable contributions to research.

CW16 follows on from the Institute’s first Software and Credit workshop, held in late 2015, where the research software community came together to highlight the multi-faceted problems, and sometimes solutions, around software and reputational credit. Whether the question is current best practice, career progression or research evaluation, getting credit for something like software is difficult. Software certainly does not have the same status as peer-reviewed journal papers, yet you could argue that it is just as important to research. Changing the status quo requires nothing less than a real and enduring culture change in research: in how it is funded, how it is published and how it is evaluated.

Change is not easy, and we might not see the fruits of efforts in this area until a generation of researchers has come and gone. However, it is imperative that progress is made so that the rewards associated with modern research match the necessary products and by-products of world-leading research, such as software and analysis.

Emergent guides to best practice in the software and credit space are starting to appear and are certainly welcome. Martin Fenner of DataCite, in Software Citation Workflows, introduced methods for obtaining better citation of software and the systems that support this, a key example being the minting of DOIs for software via integration between the popular GitHub code repository and the Zenodo and FigShare repositories. However, problems remain. Daniel S. Katz of the University of Chicago highlights that although giving software a DOI is a good thing, such a DOI is qualitatively different from a paper’s, in that the software is neither peer reviewed nor indexed by the likes of Scopus or Web of Science. These are really two problems, one of discoverability and one of peer review. Even though the latter is harder to resolve than the former, both are critical to improving the lot of software and credit, and could perhaps contribute towards ‘h-index’ scores for those working in research. Liz Allen (F1000) is a proponent of the excellent CRediT taxonomy, which assigns roles to authors, making it clear who played what part in the research and thereby encouraging credit for people who contributed to the paper but who may not traditionally make it onto the author list. New systems such as Depsy from the altmetrics community aim to go a step further: they mine papers to find software use, analyse open GitHub projects to track package use, and work on fractional credit ratings based on the number of contributions to a repository, making an accurate project history a must if credit is to be discoverable.

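To make the fractional-credit idea concrete, here is a minimal sketch of splitting credit for a package in proportion to commit counts. The contributor names and numbers are made up, and this is only an illustration of the general idea of proportional splitting, not Depsy’s actual model.

```python
# Toy example: divide credit for a software package across contributors
# in proportion to their number of commits (made-up data, not Depsy's model).

def fractional_credit(commit_counts):
    """Return each contributor's share of credit, proportional to commits."""
    total = sum(commit_counts.values())
    return {name: count / total for name, count in commit_counts.items()}

if __name__ == "__main__":
    commits = {"alice": 120, "bob": 45, "carol": 15}  # hypothetical contributors
    for name, share in fractional_credit(commits).items():
        print(f"{name}: {share:.0%} of the credit")
```
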
In terms of software-based career paths in research, there is often real frustration that students and staff who have developed incredible computational skills, and who would rather stay in an academic setting, are being lost, or their roles misunderstood, because of a lack of funding and a lack of appreciation of the key role they play in research. Things are fortunately improving in this space: with the establishment of Research Software Engineering (RSE) groups at UCL, Manchester, Sheffield and Southampton, EPSRC’s support for the newly minted Research Software Engineering Fellows, and the work of the Institute and the Research Software Engineer community, progress is starting to be made. There is much still to be done, and if this area is of interest to you personally, or you are growing a computationally advanced team to aid research, then CW16 will be the right place to get insights into current thinking, network with existing RSE leaders and discover future plans in this area.

How research is evaluated is perhaps the thorniest issue relating to software in research. Research evaluation needs to take into consideration software outputs and how they contributed to the research. Michael Doube of the Royal Veterinary College, London (RVC) leads the BoneJ project; his 2010 paper describing BoneJ is the most highly cited paper at the RVC, gaining two new citations per week on average(!). However, it was not deemed the right shape to submit to the UK Research Excellence Framework (REF) by RVC management, even though they admitted it was highly impactful. The REF is the UK’s system for assessing the research of academic organisations, and success in the exercise can determine particular types of funding for an organisation. Liz Allen (F1000), formerly head of evaluation at the Wellcome Trust and involved in running the recent REF, is quite clear that software-related items could indeed be put forward to UK REF committees, but agreed that it was not entirely clear how those committees would judge such submissions. This state of affairs only compounds the lack of clarity around software and evaluation in the UK’s latest research assessment. Coming together and agreeing concrete steps and suggestions for how software’s place in research should be viewed by institutions and by evaluation committees could be a key outcome of the CW16 discussions, as so many of those attending have a stake in this.

What should the future of software credit look like? What should software citation mean for software credit? How might peer-reviewed software compare to the traditional journal publication? How do you encourage researchers whose work is supported by software to actually cite the software without which they could not have achieved their results as easily, as quickly, or at all? These and many other questions will be up for discussion at CW16.

The future of software credit is the future of research software’s viability in supporting research, now and in years to come; that is why we chose software and credit as the main theme of this year’s annual Software Sustainability Institute Collaborations Workshop.

Software and credit is not the only topic at CW16 that will get airtime; other areas include reproducible research, collaborative working, data science and code/data sharing, all of which overlap strongly with software and credit.

Learn, inform and collaborate: the Institute’s Collaborations Workshops are consistently rated by attendees as enjoyable, useful and a place where collaborations germinate.

Help yourself, your group, your domain and your institution by registering at www.software.ac.uk/cw16 - research software needs you!
