Regular readers of this blog will know that the Software Sustainability Institute has been collaborating with the Software Carpentry initiative to develop and deliver courses. Greg Wilson from Software Carpentry has set up a Peer2Peer University course on "How to Teach Webcraft and Programming to Free Range Students". One of the things the SSI has become aware of through its projects is that researchers' experience and skills in programming vary greatly, even within a single research domain or group.
As part of the first exercise, members of the course have been considering the recommendations published in 2007 by the US Department of Education’s Institute of Education Sciences in a 60-page report: Organizing Instruction and Study to Improve Student Learning. The seven recommendations are summarised below, but the full report is worth a read as it contains a great deal of evidence to back up the validity of the recommendations and other claims.
- Space learning over time. Arrange to review key elements of course content after a delay of several weeks to several months after initial presentation.
- Interleave worked example solutions with problem-solving exercises. Have students alternate between reading already worked solutions and trying to solve problems on their own.
- Combine graphics with verbal descriptions. Combine graphical presentations (e.g., graphs, figures) that illustrate key processes and procedures with verbal descriptions.
- Connect and integrate abstract and concrete representations of concepts. Connect and integrate abstract representations of a concept with concrete representations of the same concept.
- Use quizzing to promote learning. Use quizzing with active retrieval of information at all phases of the learning process to exploit the ability of retrieval directly to facilitate long-lasting memory traces.
- Help students allocate study time efficiently. Assist students in identifying what material they know well, and what needs further study, by teaching children how to judge what they have learned.
- Ask deep explanatory questions. Use instructional prompts that encourage students to pose and answer “deep-level” questions on course material. These questions enable students to respond with explanations and support deep understanding of the taught material.
Many of these recommendations are aimed at more traditional notions of students - the SSI is training those who have already undertaken university degrees: typically PhD students and early-career researchers, though also all the way up to established professors.
So how do our experiences at the SSI in this slightly different learning context reflect these best practices?
Space learning over time.
A significant issue for many of the people we are helping to learn is time constraints. Researchers, although nominally having self-learning budgets, are already working at 200%. Although our experience suggests that reviewing key elements of course content after a delay is very useful (we have done this as follow-ups to our software sustainability workshops), it is difficult to do this in as organised a fashion as the initial training. As a result, whilst it is effective as a learning practice, it is not easy to implement scalably as a tutor.
Interleave worked example solutions with problem-solving exercises.
This is something that has always been used at EPCC (one of the SSI's member organisations). We've found it incredibly effective for teaching topics such as parallel programming and visualisation - basically topics where there is a chance that the technology might get in the way of understanding the concepts. By removing technological barriers (e.g. understanding particular pieces of software), students are able to step through examples first of all, before consolidating this information by trying it themselves.
Combine graphics with verbal descriptions.
Again, this is something that anecdotally we believe is useful. For software preservation we make use of flowcharts to help illustrate processes. Many of our guides are illustrated with pictures, but we could make these more relevant to the topics. This still sidesteps the question of how you provide sensible graphical representations of abstract concepts such as software sustainability.
Use quizzing to promote learning.
In our "What Makes Good Code Good" module we use quizzing both to promote discussion and to fix certain concepts. We have gathered examples of responses to these quizzes from digital social researchers and scientists. As part of the latter course, a virtual research environment (VRE) was set up for students to follow up the initial training.
Help students allocate study time efficiently.
As mentioned above, this is one of the hardest things to do with our type of students. This is one reason that taking students out of their regular working environments is a good thing - it allows them to block out time and ignore regular distractions. Certainly, one thing that we could do better is understand what material is most important, and help learners identify which areas they should devote more time to, perhaps through self-assessment quizzes.
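To make the self-assessment idea concrete, a quiz along these lines need not be elaborate - even a short script that scores a learner's answers and points them at the topics to revisit would do. Here is a minimal sketch; the questions and topic names are hypothetical examples, not actual SSI course material:

```python
# Minimal sketch of a self-assessment quiz: score answers against
# expected responses and report which topics need further study.
# The questions below are hypothetical examples for illustration.

QUESTIONS = [
    # (topic, question, expected answer)
    ("version control", "Does your project use version control?", "yes"),
    ("testing", "Do you have automated tests for your core routines?", "yes"),
    ("documentation", "Could a colleague build your software from its docs alone?", "yes"),
]

def assess(answers):
    """Return (score, total, topics_to_revisit) for a list of answers."""
    revisit = [topic for (topic, _q, expected), given in zip(QUESTIONS, answers)
               if given.strip().lower() != expected]
    score = len(QUESTIONS) - len(revisit)
    return score, len(QUESTIONS), revisit

score, total, revisit = assess(["yes", "no", "yes"])
print(f"{score} of {total} answered as recommended.")
if revisit:
    print("Consider spending more time on: " + ", ".join(revisit))
```

In practice the scoring would feed into the workshop follow-up, flagging which modules each learner should prioritise rather than asking them to review everything.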
Ask deep explanatory questions.
What's a deep explanatory question in our context? Perhaps it's something like "how will you keep your software going?" - certainly something which will require use of all the material we are teaching and more!
Many of these experiences have also been noted by e-Science training projects such as ICEAGE. Nevertheless, there remains a demand for software development training amongst researchers in the UK, and delivering it more effectively should also enable more peer-training to occur.
Overall, the SSI should be able to adapt its teaching and training methods to enable more researchers to benefit from them, and understand ways of providing follow-ups which consolidate the information learned. The use of quizzes after the initial training is certainly something that we'd be keen to implement, and the issue of lack of time to organise follow-ups is one which we'll need to explore with the people we're working with to come up with a workable solution.