Digital preservation in 3 days flat: The AQuA approach to hacking and mashing

The British Library has hosted the second of the AQuA Project's mashup events, bringing together practitioners from Libraries and Archives with digital preservation and technical experts. Following the same format as the first event in Leeds, but scaling up to 30 attendees, we spent 3 days examining content challenges, developing solutions and sharing in the digital preservation journey. The results from the 3 days can be found here, although some final gardening tasks are still outstanding. We plan to tidy up, tag and categorise, and then merge with the results from Leeds, so I'd encourage interested readers to return again in a few weeks' time.

The events themselves were, I believe, pretty successful on a number of counts. With fantastic efforts from the participants we were able to capture and document lots of detailed preservation challenges and requirements, we developed prototypes and solutions for many of those challenges, and the journey to those results prompted lots of collaboration and knowledge sharing. I've already blogged about what we got up to in Leeds, so I won't repeat the details of the format here, which we followed pretty closely at our second event in London. However, I thought I'd share some thoughts on how we made AQuA work and also mention some of the lessons that we learnt along the way. As one of the AQuA organisers, with pretty limited experience of hackathon or mashup-type events, I realise that some of these observations might be less than enlightening for hack veterans. At the same time, some naivety on our part may have helped us go in some slightly different and hopefully interesting directions…

Document as you go: The AQuA format relied heavily on capturing results throughout the event in a wiki. With such a short-lived period of collaboration and development, longevity of results was a huge concern for us when we were planning AQuA. As a consequence we hounded the participants to record their work as they proceeded. Despite our worries, our very game hackers and mashers embraced the concept, and in fact some commented in post-event feedback that they liked the approach. Having some basic structure in place, with wiki proformas ready to go, makes the process much easier. We of course learnt this on the second day of our first event.

Break down the barriers: One of the challenges at the recent Aligning National Approaches to Digital Preservation event was to break down national and cultural barriers in digital preservation collaboration. We went a step further at AQuA by trying to get techies, Librarians and Archivists working together. Just writing that down makes me shiver, but AQuA didn't do too badly in meeting this gargantuan challenge! Matching techies armed with agile coding technologies with collection owners who had preservation problems to solve facilitated a lot of useful interaction. And they seemed to really like it!

Several of the non-technical participants appreciated a chance to learn about the technical challenges in a non-threatening environment. It's perhaps no coincidence that the winners of our mashup competition (voted for by the participants themselves) really broke down the barriers, with both techie and non-techie learning a lot from the process of working with each other.

Many of the techies appreciated being able to problem solve in an agile manner, and get iterative feedback and steer from the collection owners. One participant voiced a feeling of liberation at being able to consider a problem and write a solution for it in 3 days without the need for any project planning or other institutional bureaucracy.

On the downside, because we defined the tasks/challenges during the actual event (many hackathons have these ready at the start), things can get a bit hectic for the organisers at times. Spending time chatting to attendees before the event gave us some confidence that they would bring challenges that would fit with what we were trying to do. We didn't find a particularly scalable way to match problem owners with solution providers, and this was seat-of-the-pants stuff for the facilitators at the London event, though I think we did a reasonable job. A slightly more techie-heavy balance of attendees would have been useful, but 50/50 was about right. The non-techies didn't quite have enough to keep them busy on the second day, an area we would look to improve on in a future event.

Encourage the knowledge sharing: There were lots of comments in the event feedback about how good the event was for learning about new tools and approaches. I think we helped this happen by creating an inclusive and friendly atmosphere where people felt they could ask supposedly dumb questions (they rarely turn out to be, of course) without feeling uneasy. Our facilitators also spent quite a bit of time chatting with participants about their progress, suggesting ways forward where we could help and putting attendees in touch with each other where we knew there was appropriate experience. This needs to be fairly light touch, so that support is there when needed but pressure isn't put on the attendees. Some of the solutions being developed were applicable to a number of different collections and preservation issues, and our network wiki structure also helped facilitate some of this join-up while avoiding duplication of work.

We struggled to get enough reporting back into the schedule (the challenge for us at the London event was making the format work with 30 people), but still could have done more. Looking back, we could also have done a bit more small-group work, which always seems to work well to get the discussions going, particularly when many participants have not met each other before. One participant thought we could have done more to encourage knowledge sharing via tweeting, or perhaps with a ticker display where people could glance up and see what others were working on. This is something we'd be keen to explore next time round, and is pretty standard stuff at typical hackathon events.

Knowledge sharing is the somewhat intangible output from an event like this, but in some ways it's the most valuable. So working hard to get these interactions going is worthwhile.

Get people outside of their comfort zone: Where possible we encouraged the techies to work on problems that were a little outside their usual range of experience, and as a result we saw quite a bit of benefit from fresh approaches being brought to familiar problems. There were exceptions, however, where we wanted to really exploit the experience and skills of particular attendees. A mix seemed to work well.

The right venue is paramount: We went for conference-type venues, with mixed results. We were initially worried about holding the Leeds event at Weetwood, which is hidden away on the outskirts of the town, albeit in lovely surroundings. In practice the “coding retreat”, as we had optimistically sold it, worked really well, and our hackers in particular wanted more. Many commented about the London event that they wanted to go on hacking into the night, albeit while being plied with pizza, coffee and (ideally) beer. Getting a venue that we could have kept open until late would have maximised the value we got out of our techies.

We had a few wifi problems on the first day in London, and obviously this is critical to get right at a mashup or hackathon. As someone commented during the event: “decide how much wifi you need, then double it”. Wise words.

Not too long, not too short: We struggled with getting the event length right, and haven't really come to a conclusion on this one. 2 days isn't enough time to get sufficient hacking done, but 3 days is a long time out of the office and makes it difficult to convince managers to release their staff to attend. The work we did to brainstorm and capture preservation problems (as well as time spent on wrapping things up and evaluating at the end of the event) was really valuable and got the interactions going, but it does quickly suck up time that could be spent on hacking. More complex arrangements of tag-team development, with some attendees present on the first two days and others on the second and third days (or some other such configuration), appear wildly ambitious from an organisational standpoint. It's worth noting that facilitators will be rushed off their feet at a structured event like this, so keeping things simple wherever possible is a good idea.

Think about next steps: Defining our preservation issues and hacking together solutions for them isn't the end of the story. We want the best solutions to be utilised by the participants and the tools to be supported and developed further. This is vastly easier to say than to make happen. So we spent time on capturing the institutional contexts of the participants, and need to do some work to analyse this data. We also hope to follow up on the events with a further review to see how well the AQuA results have been exploited and whether there are obstacles to implementation or further development. We can then hopefully find out how we can mitigate these issues. This is where OPF has a role to play.

We had a brief discussion at the end of the 3 days on next steps, and were encouraged by the enthusiasm to see us take forward the mini-community we'd forged during the event. As OPF Director Bram commented during the mashup, this is exactly what the OPF is all about. We'll be doing our best to keep the interactions going…
