Planning software sustainability for a TNT library
By Sarah Al-Assam, Postdoc in the Dieter Jaksch Theory Group, Physics Dept., University of Oxford.
Accurately simulating strongly interacting, many-body systems is crucial to understanding a variety of phenomena – from quantum systems such as high-temperature superconductors to classical systems such as traffic networks. Tensor network theory (TNT) provides accurate and efficient algorithms for simulating such systems by optimising the description of correlations within a given network structure. We are developing the first general-purpose library based on these algorithms. Attending the recent SeIUCCR Summer School gave me many useful tips for ensuring that our library complies with best practice in software sustainability.
As the name suggests, TNT provides a description of a system in which its state is given by a network of tensors. There are many variants of the networks used, but these well-established and highly successful algorithms share a common feature: they can all be broken down into simple operations on tensors that contract the network and extract the required properties. TNT algorithms have already enjoyed considerable success. Taking the work further and tackling more interesting problems, especially in higher spatial dimensions, requires high-quality software that exploits the symmetries of the system and parallelises each operation on the tensors as far as possible.
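The kind of primitive operation such algorithms are built from can be sketched in plain Python. The function below is purely illustrative – it is not the TNT library's API, and `contract` is a hypothetical name – showing a contraction of two rank-2 tensors over a shared index; a real implementation would use optimised, parallelised linear-algebra routines rather than nested lists.

```python
def contract(a, b):
    """Contract tensor a (shape i x k) with tensor b (shape k x j)
    over their shared index k, producing a tensor of shape i x j.
    Repeating such pairwise contractions collapses a whole network
    of tensors down to the required quantity."""
    rows, shared, cols = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(shared))
             for j in range(cols)]
            for i in range(rows)]

# Contract two small tensors in a chain:
t1 = [[1.0, 2.0],
      [3.0, 4.0]]
t2 = [[0.5, 0.0],
      [0.0, 0.5]]
result = contract(t1, t2)
print(result)  # [[0.5, 1.0], [1.5, 2.0]]
```

Because every step of a TNT algorithm reduces to contractions like this, making each one fast – by exploiting symmetries and parallelism – speeds up the whole simulation.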
The development of our TNT library is part of the collaborative computational project CCPQ. It includes core functions that allow users to build their own highly efficient TNT algorithms with ease, without having to worry about the underlying tensor manipulations. We are collaborating with partners at UCL and will soon be joined by members of the NAG team assisting us under the HECToR dCSE support initiative. We also benefit from core support from the Software Engineering Group (SEG) at STFC RAL, who have been helping us adhere to good software development practices.
During the development of the library we are taking care to ensure that it will be widely usable and long-lived, with a view to disseminating it more broadly in the near future. Another crucial consideration is that, once the first version is complete, we want others to be able to contribute to future versions. We expect these contributions will mostly take the form of high-level routines for TNT algorithms that will be made available for others to share, although there may also be contributions to the core library itself.
Of particular interest to me at the SeIUCCR Summer School was the talk by Steve Crouch on managing software sustainability and the associated approaches. Although some of the issues were already familiar through the support from the SEG, it was still very useful to determine systematically which of them had been addressed and which required more thought. One issue that matters to us, given the intended long life of the library, is how to maintain its quality beyond the timescale of the current project and potentially with many contributors – Steve's explanation of the role of a release manager was very interesting here.

A theme that ran through a number of talks was the idea of a continuum of skill sets, ranging from mad scientist through geek to computer scientist. Once your position on the scale is identified, it can be advantageous to seek out those with skills elsewhere on the spectrum, while acknowledging that communication can be difficult when there is a large distance between skill sets. The expertise available from those on the computer scientist end of the scale is certainly something that we mad scientists and geeks aim to take advantage of as much as possible.
Posted by Simon Hettrick on Wednesday 29 August 2012.