
“On the fly” coding an android app for drone and kite-based sensing


Author(s)

Karen Anderson

Posted on 18 November 2016

Estimated read time: 5 min

Image: Grassroots mapping

By Karen Anderson, University of Exeter, and David Griffiths, FoAM Kernow

This article is part of our series: A day in the software life, in which researchers from all disciplines discuss the tools that make their research possible.

Smartphones have emerged as powerful research tools for collecting scientific data: they are used widely around the world, carry on-board microcomputers, and are equipped with a broad suite of sensors (e.g. cameras, microphones, light sensors, accelerometer, compass, gyroscope and GPS). Many smartphones are designed to service the information requirements of multinational developers: they are location-aware, and applications downloaded by users can transmit information back to providers. This capability can be exploited through the programmable nature of smartphones, because sensors originally developed to supply location-based services can be repurposed using readily available computing resources. One opportunity that remains untapped is the smartphone as a remote sensing imaging device deployed in conjunction with rapidly developing lightweight drone technology.

We undertook a short project to explore the development of an Android-based smartphone application for delivering geo-tagged aerial photographs. We wanted to explore the potential of a basic, ubiquitous smartphone turned into a ready-to-use remote sensing device. Remote sensing is the term used to describe the acquisition of (usually imaging) data from a distance; we are exposed to remote sensing data every day through portals such as Google Earth. However, such data are coarse-grained, and putting data capture into the hands of end users is widely acknowledged in the scientific literature as a powerful driver of more distributed monitoring and mapping.

Some smartphone apps for remote sensing already exist, but all of them require the user to trigger the camera on the handset with button presses. Such apps are often inflexible: they do not allow the user to programme the conditions under which the camera is triggered, or to trigger it automatically under particular parameters of operation. Furthermore, the conventional app design process assumes that the purpose of the system being designed is fixed, and that a programmer's job is to provide features that solve existing problems for a client. In contrast to this 'hard-wired' approach, we wanted to pursue a more flexible approach to the app design, drawing specifically on research in visual and live coding. Visual coding enables simple programming by manipulating program elements graphically rather than through text definitions. It also allows end-user development of apps through live coding, where code is designed and implemented on the fly.

In designing our Android application, we considered two critical functions that would set our app apart from others:

(a)   Pressing a button to trigger data capture is not feasible when the device is airborne, so it was essential that the app allow the camera and sensors on the smartphone to function autonomously after setup;

(b)   The use of a visual-coding schema within the app to allow users to customise it themselves (e.g. to collect images at specific points in space or time, or under particular conditions such as the camera being level); a minimal sketch of such condition-driven triggering follows this list.
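To illustrate requirement (a), the sketch below (Python, purely illustrative and not the toolkit's own code) shows a capture loop that runs autonomously after setup and fires the camera only when user-defined conditions hold; the helpers read_accelerometer and camera_capture are hypothetical stand-ins for the phone's sensor and camera APIs.

```python
import math
import time

# Hypothetical stand-ins for the handset's sensor and camera APIs.
def read_accelerometer():
    """Return (x, y, z) acceleration in m/s^2 (placeholder values)."""
    return (0.1, 0.2, 9.79)

def camera_capture(tag):
    """Trigger the camera and tag the image (placeholder)."""
    print(f"captured image: {tag}")

def is_level(tolerance=0.5):
    """True if the device is roughly horizontal, i.e. gravity sits mostly on the z axis."""
    x, y, z = read_accelerometer()
    return abs(x) < tolerance and abs(y) < tolerance and abs(z - 9.81) < tolerance

def capture_loop(interval_s=5.0, duration_s=60.0):
    """Autonomous loop: once started, capture an image every interval_s seconds,
    but only while the camera is level - no button presses needed in flight."""
    start = time.time()
    last_capture = -math.inf
    while time.time() - start < duration_s:
        now = time.time()
        if now - last_capture >= interval_s and is_level():
            camera_capture(f"t={now - start:.1f}s")
            last_capture = now
        time.sleep(0.1)  # poll the sensors at roughly 10 Hz

if __name__ == "__main__":
    capture_loop(interval_s=5.0, duration_s=20.0)
```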

We developed an approach that allows users to program the app on the phone's touch screen using a system based on 'Scheme Bricks', in which drag-and-drop positioning of logical blocks constructs expressions that are evaluated in flight by an interpreter. Blocks are picked up using Android's 'long press' gesture, and haptic feedback (a vibration) indicates a successful selection.
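The interpreter idea can be sketched as follows: each brick is a nested expression, and the app walks the expression tree during flight to decide when to fire the camera. This is a simplified Python sketch rather than the toolkit's actual Scheme implementation, and the primitive names (level?, interval-elapsed?, take-photo) are illustrative only.

```python
# A toy evaluator for Scheme-Bricks-style block expressions.
# Each block is a tuple: (operator, *arguments). Nested tuples are sub-blocks.

def make_env(sensors, actions):
    """Bundle the primitives the expressions are allowed to call."""
    return {"sensors": sensors, "actions": actions}

def evaluate(expr, env):
    """Recursively evaluate one block expression."""
    op, *args = expr
    if op == "when":                       # (when <condition> <action>)
        cond, action = args
        return evaluate(action, env) if evaluate(cond, env) else None
    if op == "and":                        # (and <expr> ...)
        return all(evaluate(a, env) for a in args)
    if op in env["sensors"]:               # sensor predicates, e.g. level?
        return env["sensors"][op](*args)
    if op in env["actions"]:               # actions, e.g. take-photo
        return env["actions"][op](*args)
    raise ValueError(f"unknown block: {op}")

# Example: trigger the camera when the phone is level and 5 s have passed.
program = ("when",
           ("and", ("level?",), ("interval-elapsed?", 5)),
           ("take-photo",))

env = make_env(
    sensors={"level?": lambda: True,
             "interval-elapsed?": lambda s: True},   # stubbed sensor readings
    actions={"take-photo": lambda: print("click")},
)

evaluate(program, env)   # prints "click" when both conditions hold
```

On the handset, each tuple in such a structure would correspond to a draggable block, so the expression the user assembles on screen is the same structure the interpreter walks during flight.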

There is a wide variety of potential users for such an application, given the recent rapid expansion of mobile phone ownership alongside increased sales of lightweight drones. Our concept was that any user with a lightweight drone or kite and an Android phone could use the app for rapid surveying of an area of interest. This functionality might be particularly useful in supporting, for example, humanitarian disaster relief: recent examples from Nepal demonstrate the power of citizen-gathered remote sensing data from personal drones.

During the project, we designed and tested several versions of the app in operational mapping scenarios. We used basic equipment to secure Android phones to the underside of drones, and we suspended them from kites using simple fixings. We found that vibration from the drone motors needed dampening to ensure the camera could capture good-quality images, but we demonstrated that this could be achieved with very low-cost materials. The result was an easy-to-use, configurable app that any user can program to perform aerial mapping tasks effectively. Our paper in PLOS ONE describes the full approach and gives further details of the coding used.

The UAV toolkit is free to download and is completely open source. Details can be found in our GitHub repository.

References

Anderson, Karen, et al. "A grassroots remote sensing toolkit using live coding, smartphones, kites and lightweight drones." PLOS ONE 11.5 (2016): e0151564.

Acknowledgements

We acknowledge the support of a wide variety of scientists within the ESI DroneLab and beyond, including: James Duffy, Liam Reinhardt, Jamie Shutler and Steve Hancock, many of whom acted as test pilots for the flights.
