10 pivotal moments in the history of the Institute

Author(s)
Simon Hettrick

Director of Strategy

Posted on 14 August 2020

Image: road sign that says "turning point". Adapted from Roger Bradshaw.

By Simon Hettrick, Deputy Director.

This post is part of our 10 year anniversary series.

Over the last ten years there have been a number of moments where it felt that the future of the Institute was shaped. With some, there was a palpable feeling that something important had just happened. Others occurred without fanfare, and it wasn’t until later, sometimes much later, that we realised just how important they had been. In this post, I’ve chosen ten moments that stand out from the history of the Institute.

1. A lot’s in a name

There is something unbelievably soul-sapping about devising a mission statement. It’s important, but feels corporate. It needs detail, but can’t be too long. It has to be relatable, but needs to say something new. There are committees. And rewrites. And arguments. Then someone says “Better Software, Better Research”.

Our informal mission statement came out of the blue during a meeting in 2013. We like a grand gesture, so we created a design with massive block capitals in a font called “Impact”. Then we put it on stickers and it turned out that everyone wanted one. So we put it on T-shirts and everyone wanted one of them too.

“Better Software, Better Research” works because it perfectly describes not just what we stand for, but what a good proportion of the research community already knew to be true. It sounds frivolous to choose stickers and T-shirts as a pivotal moment in the Institute’s history, but they were the perfect medium for raising awareness of our primary goal across the research community. It also demonstrates our modus operandi: like so many of the Institute’s successes it was created by us, but the success was achieved by the response of our community.

2. How many researchers rely on software?

There was quite a lot of scepticism about our chances of success during our early years. Academia did not value software - and that was not going to change. For one thing, a vast amount of work was needed just to understand the scale of the problem we faced. One question always filled me with dread during those early years - how many researchers rely on software?

Everyone knew the answer - probably most of them - but it wasn’t easy to evidence how many researchers rely on software. We were trying to fix academia’s practice of ignoring software, which meant it was illogical to look for evidence in academic outputs, like papers or bids, because they were going to under-represent software. There was no requirement to store software somewhere it could be counted, nor was there a department or role responsible for software that could be asked to report. In the end, we were left with little choice but to run a survey.

Running a national survey without any prior experience presents the kind of learning curve that would deter lesser organisations, but a month later we had almost 500 responses from across disciplines and had delivered the first study of software reliance in academia. The ability to quickly answer the question of how many researchers relied on software certainly made our lives easier during the “any questions” section of a presentation, but it achieved vastly more. For the first time we could demonstrate what everyone knew: that most of research would be adrift without software. Once you can evidence that importance, you can start to change the way academia thinks about software.

3. Going nuclear

Being the first organisation to campaign about the risks of not sustaining software meant that we received some pretty interesting invitations. At first glance, the “Workshop on Software Sustainability for Safeguards Instrumentation” appeared a little dull, but then we learned that these “safeguards” were used by the International Atomic Energy Agency to monitor whether nuclear reactors were producing energy, rather than material for nuclear weapons.

It was a lot of pressure on Neil Chue Hong, our Director, to be the sustainability expert at a workshop focussed on sustaining this vital software, but it also perfectly illustrated how something as innocuous as making sure software functionality didn’t degrade over time could have global importance. From this point onwards, if anyone challenged our assertion that software sustainability was vital, we had the perfect case study.

4. Collaborate, don’t compete

When we launched the Institute, we intended to conduct software engineering, not teach it. Quite reasonably, we thought that no research bid would allow us the resources to support all of the trainers we would need to make an impact. It didn’t take long to realise that a lack of training was one of the biggest problems holding back academic software. A solution had to be found.

If we created our own training materials and they became popular, then we would guarantee that the success would be attributed to us. If we promoted someone else’s existing materials, we could end up expending a load of effort only for some other group to gain the glory. One of our tenets is “collaborate, don’t compete”. This question of training materials was the first major test of our commitment to this concept.

We decided to promote Software Carpentry, because it was already a popular course - although not yet popular in Europe - and became the UK coordinators in 2012. Rather than investing our time in creating a course, we invested it into publicising, coordinating and running Software Carpentry workshops. In 2012 there were seven workshops in the UK, and last year there were 64. We risked not being associated with this success, but we succeeded in using our limited resources to train a constantly increasing number of researchers, making a significant impact on skill levels in the UK. This decision justified our firmly held belief that collaboration leads to greater success than competition.

5. The Fellowship has a certain ring to it

Being a truly interdisciplinary organisation sounds like a wonderful idea right up until you learn just how many disciplines there are. Our remit was to represent the UK’s more than 200,000 researchers and, when we started, our handful of staff were all physicists and computer scientists. How could they understand the travails of researchers from all disciplines? The obvious solution was to choose representatives from across the community to help us, but how do you attract the people who can help?

We kicked off a programme called “Agents” in 2012 that played on the idea of being a spy collecting intelligence on the practices of the research community. People applied to join the Institute as Agents; they received funding for travel and, in return, provided information on what they learned on their travels. It was very popular, but it lacked punch. The concept of “Agents” was too transactional. What we needed was a community in which people could become invested.

The programme was renamed the Fellowship, so that other academics could recognise its worth. But the big impact came from building it from an intelligence-collecting programme into an interactive community by making the selection process an in-person affair, and by bringing the successful Fellows together at least twice a year at workshops. Quickly the people in the programme began collaborating, attending our events, writing for us, generating ideas and reviewing our plans. In short, Fellows became honorary staff members of the Institute, and contributed directly to the continuing success of the programme. The intelligence generated by the Agents was useful, but with only a few small changes the Fellowship increased the Institute’s impact by orders of magnitude.

6. The people who write code, not papers

Software developers have existed in academia for decades, but they were not valued in the same way as researchers. In 2013, we wanted to run a campaign that would have an impact on the quality of software used across the academic community. There were various ideas, but the frontrunner came from a discussion at our Collaborations Workshop in 2012 which dealt with the problems faced by software developers in academia.

A major obstacle that emerged from the discussions at the Collaborations Workshop seemed facile: the lack of a recognised job title for the role. Instead of one title, there were hundreds in use across the UK. This was a problem in that it made it very difficult to recruit a software developer, and it also made it difficult to find a job as one. It also made it near impossible to run a campaign to support the role. How can you fight for something that no one can name? A new name was born at the Collaborations Workshop: the Research Software Engineer (RSE).

The RSE campaign has lasted seven years so far. It now encompasses tens of thousands of people and it led to the foundation of a new, and much needed, Learned Society: the Society for Research Software Engineering. The impact of the campaign has reached around the world, but none of it could have happened without the creation of the job title.

7. Peeking over the parapet

In September 2010, we set off to the UK e-Science Conference in Cardiff for the first public outing of the then five-month-old Software Sustainability Institute. We brought with us a ton of enthusiasm and an open-minded idea of who we were and what we hoped to achieve.

There was a lot of interest... and a reasonable number of people who thought we’d massively over-promised in suggesting that we could enact change across UK research. My abiding memory of this conference is that, looking past the inherent scepticism that comes with the territory in academia, most people wanted to collaborate and help. By harnessing this interest, we started to create a community that would work with us to improve research software for everyone.

8. Will anyone turn up?

By 2016, we had won some funding from EPSRC to start a conference series for RSEs. Anyone familiar with holding a party will recognise the feeling of “will anyone turn up?”, which grew to almost pathological levels over the summer of 2016. It was worth it. I was lucky enough to give an opening presentation in front of 200 RSEs, buoyed by an immense sense of pride at what the Institute and the community had achieved.

What was not expected was that those attendees came from 14 different countries. Not only did this prove the widespread popularity of the RSE cause, it also brought together people who would go on to fight for change in their own countries. A group of German RSEs were the first to succeed, then the Dutch, followed by Canada, the Nordic countries, South Africa, the US, Australia, New Zealand and, most recently, Belgium. With one conference, the RSE campaign went global.

9. People care about sustainability

The single greatest problem with our work is this: scalability. Demand for our expertise is easy to demonstrate, but how can we ensure that everyone has access to it? Our RSEs were achieving great successes when they worked with researchers, but they were vastly outnumbered. We needed to distill this expertise and make it widely available.

We tasked our senior RSEs with distilling their expertise - the questions they would ask - into a guide on software evaluation and made this available on our website. The result was the “Software Evaluation Guide”: everything you needed to know to get started with software sustainability. In doing this we laid bare the fundamental question about the worth of the Software Sustainability Institute: would researchers be interested?

We’ve had almost 2 million visitors to the Institute website over the last ten years, which is a huge number of people for what was thought to be an esoteric subject. Over this time, the Software Evaluation Guide has consistently been the most popular page on our website. Almost 70,000 people have reviewed the guide and, we hope, benefitted from its advice. This proves without doubt that software sustainability is a subject of importance to research.

10. Lucky number seven

When we launched the Institute, we had a completely interdisciplinary remit but were funded only by the EPSRC. We have campaigned over the last ten years against the idea that software was only an issue for the engineering and physical sciences, and many of the pivotal moments above are about us slowly winning this argument.

That we were making progress on this issue has been demonstrated by our funders. In 2015, we picked up the BBSRC and the ESRC. In 2018, we added the AHRC, MRC, NERC and STFC, and became one of the few UK organisations funded by all seven of the Research Councils. In 2010 we represented a gamble by the EPSRC: would this concept embed across academia? By 2018, we had proved definitively that software is not a concern for specific disciplines, but is instead vital to everyone in research.
