Software and research: the Institute's Blog

Fellowship - how it helped a digital humanities scholar.

By Stuart Dunn, lecturer at the Centre for e-Research, King's College London, and 2014 Institute Fellow.

One problem with being a digital humanities academic these days is the sheer volume of scholarly activity available – from seminars and workshops to conferences and symposia. In London alone, one could easily attend three or four such events every week, if not more.

My Fellowship has given me an excellent heuristic for choosing which events to attend, and it has helped me connect my participation in the community with how digital humanists approach and practice the sustainability of what they use and build.

The application process was extremely simple and lightweight, especially for someone used to applying for research grants. The focus was on ideas and thinking rather than box-ticking. Even writing the application forced me to think succinctly about the challenges and questions facing the DH community in sustaining software: whether we are too reliant on proprietary software, what role crowdsourcing will play in the future, and how the inherently collaborative nature of Digital Humanities affects sustainability.

Fellowships - what's it like being a Fellow?

By Stephen Eglen, senior lecturer in Computational Biology at the University of Cambridge and 2014 Institute Fellow.

I first heard about the Software Sustainability Institute in 2013, when Laurent Gatto and I were planning an R programming bootcamp.

I have long been a believer in the open sharing of software, and so I was glad to read about many of the complementary issues that the Institute has promoted, both within the UK and worldwide. Another thing that convinced me to apply was that a respected colleague in the R community, Barry Rowlingson, was also a Fellow.

I found the application procedure refreshingly short and straightforward. The most useful part of the process was proposing what I would do in the course of the Fellowship. I had been discussing with colleagues in the neuroscience community ways in which we could encourage data and code sharing.

Top tips for software developers working with researchers

By Mike Jackson, Software Architect.

Working with researchers is something the Institute has been doing for many years now. So we thought it was about time to put together our top tips for software developers working with researchers, to help foster productive, and enjoyable, collaborations.

1. Remember they are not software developers

You may know the difference between centralised and distributed revision control, classes and objects, pass-by-value and pass-by-reference, upcasting and downcasting, coupling and cohesion, processes and threads, or a stack overflow and StackOverflow, but your researcher may not. Knowing how to knock together a few dozen lines of code does not make someone a software developer, as writing code is just a fraction of what a software developer does.
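
To make one of those distinctions concrete, here is a small Python illustration of the reference-semantics surprise that regularly catches out occasional coders; the function and data are invented for this example.

    def normalise(readings):
        """Scale readings so the largest becomes 1.0.

        Python passes the list as a reference to the same object,
        so the loop below mutates the caller's data - a surprise
        if you expect pass-by-value semantics.
        """
        peak = max(readings)
        for i, value in enumerate(readings):
            readings[i] = value / peak
        return readings

    raw = [2.0, 4.0, 8.0]
    scaled = normalise(raw)
    print(scaled)  # [0.25, 0.5, 1.0]
    print(raw)     # also [0.25, 0.5, 1.0] - the original was changed

A developer spots immediately that the caller's list is mutated; a researcher may not, and a safer version would copy the input first with readings = list(readings).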

Reducing the Distance between theory and practice

By Mike Jackson, Software Architect.

Clever theory about how to estimate the density or abundance of wildlife is of limited value unless this theory can be readily exploited and applied by biologists and conservationists. Distance sampling is a widely-used methodology for estimating animal density or abundance and the Distance project provides software, Distance, for the design and analysis of distance sampling surveys of wildlife populations. Distance is used by biologists, students, and decision makers to better understand animal populations without the need for these users to have degrees in statistics or computer science. Distance places statistical theory into the hands of practitioners.
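
To give a flavour of the kind of calculation Distance automates, here is a minimal Python sketch of the basic line-transect density estimator, assuming a half-normal detection function; a real Distance analysis adds survey design, model selection and variance estimation on top of this, and the numbers below are invented.

    import math

    def density_estimate(distances, transect_length):
        """Line-transect density under a half-normal detection function.

        distances: perpendicular detection distances
        transect_length: total transect length (same units)

        The maximum-likelihood scale is sigma^2 = sum(x^2) / n, giving
        an effective strip half-width of sigma * sqrt(pi / 2); density
        is then n / (2 * L * effective half-width).
        """
        n = len(distances)
        sigma = math.sqrt(sum(x * x for x in distances) / n)
        effective_half_width = sigma * math.sqrt(math.pi / 2)
        return n / (2 * transect_length * effective_half_width)

    # Hypothetical survey: six detections along a 2 km transect,
    # perpendicular distances in km; the result is animals per km^2.
    print(density_estimate([0.01, 0.03, 0.02, 0.05, 0.01, 0.04], 2.0))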

“Is this a good time?” – how ImprompDo can tell when you’re busy

By Liam Turner, PhD student at Cardiff School of Computer Science & Informatics.

This article is part of our series: a day in the software life, in which we ask researchers from all disciplines to discuss the tools that make their research possible.

Growth in smartphone technology has devolved the traditional trawl for information down to the individual. This presents a challenge, because traditional methods of making information available depend on when it is readily available, rather than on when it is most convenient for a busy user.

Currently, users have to work out the best way to get information while managing their other commitments at the same time, but it would be more useful if this could be managed proactively: a predictive system would analyse its user's behaviour and arrange delivery around it before sending new information. This forms the backbone of our project, which uses the technical capabilities of the smartphone to infer interruptibility and so decide whether to deliver or delay.
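
As a purely illustrative sketch of such a deliver-or-delay decision (not ImprompDo's actual algorithm; every name, weight and threshold below is invented), a simple approach scores a handful of sensor signals and delivers only when the user looks free:

    from dataclasses import dataclass

    @dataclass
    class PhoneState:
        """Hypothetical snapshot of smartphone signals."""
        screen_on: bool    # user recently interacting with the device
        in_motion: bool    # accelerometer suggests walking or travelling
        in_meeting: bool   # calendar shows a current appointment

    def interruptible(state: PhoneState) -> bool:
        """Toy decision rule: weights and threshold are invented."""
        score = 0.0
        score += 0.5 if state.screen_on else 0.0
        score -= 0.4 if state.in_motion else 0.0
        score -= 0.6 if state.in_meeting else 0.0
        return score >= 0.3

    state = PhoneState(screen_on=True, in_motion=False, in_meeting=False)
    print("deliver" if interruptible(state) else "delay")

A real system would learn such weights from the user's behaviour over time rather than hard-coding them.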

What makes good code good at EMCSR 2014

By Steve Crouch.

On August 8th 2014, I attended the first Summer School in Experimental Methodology in Computational Research at the University of St Andrews in Scotland. Run as a pilot aimed primarily at computer scientists, it explored the latest methods and tools for enabling reproducible and recomputable research; the aim is to build on this successful event and hold a bigger one next year.

The Institute already works with the Summer School organisers in a related project, recomputation.org. Led by Ian Gent, this project aims to allow the reproduction of scientific results generated using software by other researchers, by packaging up software and its dependencies into a virtual machine that others can easily download and run to reproduce those results.

Online psychological therapy for Bipolar Disorder

By Nicholas Todd, Psychologist in Clinical Training at Leeds Teaching Hospitals NHS Trust.

This article is part of our series: a day in the software life, in which we ask researchers from all disciplines to discuss the tools that make their research possible.

People with Bipolar Disorder often have problems gaining access to psychological therapy. Online interventions are an innovative solution to this accessibility problem and are recommended in clinical guidelines for mild to moderate anxiety and depression. These interventions provide round-the-clock, evidence-based, self-directed support for a large number of people at a reduced cost to the NHS.

The Living with Bipolar project was funded by Mersey Care NHS Trust and led by me under the supervision of Professor Fiona Lobban and Professor Steven Jones of the Spectrum Centre for Mental Health Research, Lancaster University. It was the first randomised controlled trial of an online psychological intervention for Bipolar Disorder to find preliminary evidence that the web-based treatment approach is feasible and potentially effective.

The Wild Man Game - bringing historic places to life

By Gavin Wood and Simon Bowen, Digital Interaction Group, Newcastle University.

This article is part of our series: a day in the software life, in which we ask researchers from all disciplines to discuss the tools that make their research possible.

Heritage organisations, such as museums, and managers of historic sites are increasingly interested in using mobile phones as a way of adding value to visits and directly connecting with the general public. App designers have responded with gamified digital experiences, borrowing game mechanics and elements in an attempt to engage the user.

However, these experiences often fall short: we are given uninteresting treasure hunts that are more about achieving goals and collecting rewards than about thinking about and connecting with the heritage space itself. In response, we are exploring how digital play can bring our cherished cultural spaces to life, challenging the typical role of mobile phone apps in such contexts.

A map of many views - what Google Earth and a 1500 AD chart of Venice have in common

By Juraj Kittler, Assistant Professor of Communication at St. Lawrence University, and Deryck Holdsworth, Professor of Geography at Penn State University.

This article is part of our series: a day in the software life, in which we ask researchers from all disciplines to discuss the tools that make their research possible.

Our recent study, published last month in New Media & Society, surveyed the technical approaches adopted by Renaissance artist Jacopo de’ Barbari when he drafted his iconic bird’s-eye view of Venice in the last decade of the fifteenth century. We pointed out some important parallels between this masterpiece of Renaissance mapmaking and the current computer-supported digital representations of urban spaces.

The historical sources that we analysed indicate that de’ Barbari’s map was a composite image stitched together from a multitude of partial views. These were produced by surveyors using a technical device called the perspectival window, in a fashion that may be seen as a proto-digital technology. When constructing his two-dimensional image, the artist was intentionally tricking the eye of the observer into seeing a three-dimensional panoply, evoking what later became known as virtual reality.

Super-computing, graphics cards, brains and bees

There's a bit of a buzz about brain simulation and GPUs.

By Thomas Nowotny, Professor of Informatics at the University of Sussex.

This article is part of our series: a day in the software life, in which we ask researchers from all disciplines to discuss the tools that make their research possible.

Computer simulators have transformed almost every aspect of science and technology. From Formula One cars to jet aircraft engines, from predicting the weather to modelling the stock market, and to the inner workings of the brain itself, most research and development activities today depend heavily on numerical simulations.

This is thanks to rapid advances over the last decades that have seen computer speeds double every two years. For much of this time, speed was raised both by shrinking the size of components and by doubling the frequency at which central processing units (CPUs) – the workhorse of every computer – operate. Yet we are now near the limits set by quantum physics that prohibit further advances in this direction. This has led to a trend of focusing instead on parallel architectures, where CPUs still run at the same speed but there are more of them to share the work.
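
A minimal Python illustration of that work-sharing idea (our own sketch, not code from any of the simulators discussed here): a pool of worker processes splits a list of independent jobs across the available cores instead of one core working through them in turn.

    from multiprocessing import Pool

    def simulate_unit(unit_id):
        """Stand-in for one independent chunk of simulation work."""
        total = 0.0
        for step in range(100_000):
            total += (unit_id + step) % 7
        return total

    if __name__ == "__main__":
        # One fast core would process all 64 units in sequence;
        # a pool shares them across however many cores are available.
        with Pool() as pool:
            results = pool.map(simulate_unit, range(64))
        print(sum(results))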