Software and research: the Institute's Blog

Increasing our understanding of location - GeoTOD II

Latest version published on 6 October, 2016.

The time it takes for an ambulance to arrive, the routes that police should patrol and the measures put in place during a national emergency all rely upon an understanding of location. This requires an understanding not only of geographical features like hills, buildings, roads and rivers, but also of information relating to a location at any point in time, such as road works, diversions, burst riverbanks and collapsed bridges. With this in mind, the UK Government have proposed a Location Strategy which seeks to maximise the value of geographic information by combining maps with information…

GeoTOD II linked data demonstrator is live

Latest version published on 3 October, 2016.

GeoTOD-II's linked data server, built using OGSA-DAI, is live, allowing geo-spatial data to be browsed and queried.

Over the past six months we have been providing the GeoTOD II project with advice on using OGSA-DAI. GeoTOD II, based at STFC, have been exploring ways of linking geographic information with location data. Much of the required data is already available in existing systems (for example, relational databases or files), so GeoTOD II built a framework, based on OGSA-DAI, to access this data and expose it as linked data. GeoTOD II have deployed a demonstration…
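As an illustrative sketch of the general idea (this is plain Python, not the OGSA-DAI framework itself, and the table, identifiers and vocabulary URIs are invented for the example), a row in a relational database can be exposed as linked-data triples in N-Triples form like so:

```python
# Illustrative sketch: mapping relational rows to linked-data triples.
# The namespace and vocabulary URIs below are hypothetical examples.
import sqlite3

BASE = "http://example.org/place/"           # hypothetical subject namespace
NAME = "http://example.org/vocab/name"       # hypothetical predicate URIs
LAT  = "http://example.org/vocab/latitude"

# A stand-in for an existing relational source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE places (id INTEGER, name TEXT, lat REAL)")
conn.execute("INSERT INTO places VALUES (1, 'Ben Nevis', 56.7969)")

def row_to_triples(row):
    """Map one relational row to subject-predicate-object triples."""
    pid, name, lat = row
    subject = f"<{BASE}{pid}>"
    return [
        f'{subject} <{NAME}> "{name}" .',
        f'{subject} <{LAT}> "{lat}" .',
    ]

triples = []
for row in conn.execute("SELECT id, name, lat FROM places"):
    triples.extend(row_to_triples(row))

print("\n".join(triples))
```

A real framework would of course add URI minting policies, datatype handling and a query interface on top, but the core mapping step looks much like this.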

Experiences with ParaView and Python

Latest version published on 3 October, 2016.

While working with the Culham Centre for Fusion Energy on streamlining the visualisation of output from the GS2 fusion simulation suite we've had a few tussles with the ParaView visualisation tool.

Beginning with an existing Python program that post-processes GS2 output into a format that can be read by ParaView, our initial goal was to modify the program so that it could be used as a component (a Reader) in a ParaView/VTK pipeline (ParaView is built on VTK). One main requirement was to enable the user to select input files via a dialog.

We worked with…

Ask Steve! - What is test-driven development?

Latest version published on 30 September, 2016.

Got this question the other day from Andrew Milstead, also from the University of Southampton…

Test-driven development is a way of working – a discipline – when developing software. Essentially, you develop the unit tests for code before you write the code itself.
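The test-first cycle can be sketched in a few lines of Python (an illustrative example, not from the question; the function and its behaviour are invented for the sketch): write a failing test for a function that does not yet exist, then write just enough code to make it pass.

```python
import re

# Step 1: the test is written first, before slugify() exists.
# Running only this would fail with a NameError - that failure is
# the starting point of the test-driven cycle.
def test_slugify():
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("  spaces  ") == "spaces"

# Step 2: the minimal implementation, written to make the test pass.
def slugify(text):
    """Lower-case text and replace runs of non-alphanumerics with '-'."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

# Step 3: run the test again; it now passes, and stays in place to
# guard against regressions as the code evolves.
test_slugify()
```

In practice a test runner such as unittest or pytest would collect and run tests like this automatically, but the discipline is the same: the test defines the behaviour before the code provides it.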

In general, testing is often given short shrift in academic projects, because academia generally lacks the resources to place a proper emphasis on testing – functionality is often prioritised over quality. This is completely understandable because you are often developing proof-of-concept software. However,…

GeoTOD meet with data.gov.uk and Ordnance Survey

Latest version published on 3 October, 2016.

In October, the GeoTOD II team presented their work to key members of data.gov.uk and the Ordnance Survey. The meeting went well, and there was interest in GeoTOD II's work on exposing legacy data sets, linking geographic and locational information, and UML to RDFS conversion, as well as in GeoTOD II's longer-term plans.

GeoTOD II is contributing to the evolution of the UK's Location Strategy by exploring ways of linking geographic information with location data. The Software Sustainability Institute is assisting GeoTOD II in their use of the OGSA-DAI open source framework for…

Changing software a nightmare for tracking scientific data

Latest version published on 3 October, 2016.

Following the recent Nature article "Computational science: ... Error - why scientific programming does not compute", spawned by the Climategate affair, there's another interesting article titled "Changing software, hardware a nightmare for tracking scientific data" from the Nobel Intent blog on Ars Technica. Again, it is the pace of technological advance, so important for making new discoveries, that is also forcing us to question whether we can reproduce our past results.

The author notes the difficulties of keeping a fully reproducible analysis pipeline working, with…
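One common, partial mitigation (our sketch, not something proposed in the article; the tool names and versions are hypothetical) is to record the software environment and input checksums alongside every result, so that a later reader can at least detect when the pipeline has changed:

```python
# Sketch: bundling provenance metadata with a result so that changes
# in software or input data are detectable later. Tool names and
# versions below are hypothetical placeholders.
import hashlib
import json
import platform
import sys

def provenance_record(input_bytes, tool_versions):
    """Collect enough context to tell whether a result may be reproducible."""
    return {
        "python": sys.version.split()[0],
        "platform": platform.platform(),
        "input_sha256": hashlib.sha256(input_bytes).hexdigest(),
        "tools": tool_versions,  # e.g. {"analysis-script": "1.2"} - hypothetical
    }

record = provenance_record(b"raw data", {"analysis-script": "1.2"})
print(json.dumps(record, indent=2))
```

This doesn't make old pipelines run again, but storing such a record next to each published result gives future researchers a fighting chance of diagnosing why a re-run differs.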

Life after NeSCForge - surviving a repository closure

Latest version published on 3 October, 2016.

NeSCForge has been home to many of the software projects from the UK's e-Science programme and beyond. However, funding pressures have led to the decision to close NeSCForge permanently on 20th December 2010.

So what do you do if your software was using NeSCForge or another site marked for closure? Don't panic - it's fairly painless and the SSI can help guide you through the process.

We've recently extended our collection of guides for developers to include:

choosing a repository, migrating to a new repository, retrieving files from NeSCForge,…

What cloud computing got right

Latest version published on 3 October, 2016.

Cloud computing is, in my experience, a subject that creates excitement and scepticism in equal measure. My introduction to the subject came courtesy of a presentation by Werner Vogels, Amazon's Vice President & Chief Technology Officer, on Amazon's Elastic Compute Cloud. It was a fascinating presentation, but it was even more interesting to hear the passion behind the questions that followed. I’m not going to focus on the technical side of things in this post, because what interests me is the way in which cloud computing has captured the public’s attention.

Unlike Grid…

Computational science: eliminating the errors

Latest version published on 3 October, 2016.

There's an interesting feature over at Nature by Zeeya Merali called "Computational science: ... Error - why scientific programming does not compute" (disclaimer: I was one of the people interviewed for the piece). In it, Merali considers the issues of computational software written in the scientific context, particularly in light of the problem revealed by the leak of emails from the Climatic Research Unit at the University of East Anglia last year.

As the article notes, a lot of scientific software has grown ever more complex and there is a steep learning…

"Riding the Wave"

Latest version published on 3 October, 2016.

With another hat on I've been reading the recent report from the European Commission's High-Level Expert Group on Scientific Data - "Riding the Wave: how Europe can gain from the rising tide of scientific data".  It captures the current state of research's digital landscape very well, offers a compelling vision of the value of scientific digital data twenty years hence, and recommends a number of key policy steps for the EU to consider.

I recommend it.  It's a good read - not perhaps in the same way that Patrick O'Brian's Jack Aubrey novels are a good read - but it does underline…