How do we make preservation planning work for us?

OPF and DPC recently held a hackathon in York that followed the style of the AQuA hackathons I’ve previously reported on. I won’t say much on the event in general as Carl has just blogged about it, but I’m pleased to say it was our best yet and I think we really nailed the formula this time round. Suffice to say, we’ll be running more of these events next year. We will have some exciting news on this soon!

One of the liveliest sessions we ran with our Content Owners (and some of the Devs that we distracted from their hacking) was on preservation planning. Carl Wilson talked us through the BL’s experiences of using Plato to plan some preservation work with its digitised newspapers. This kicked off a lively discussion with the event participants. I captured a few notes from the discussion that I thought were worth sharing here. Picking up on the last bullet below, we’re planning to establish a location on the OPF wiki to capture information about various bits of work related to preservation planning. In the meantime we thought it might be useful to post these notes, and we’ll update things here when we make some more progress.

Just recently, the EPIC Project has released its report on experiences in using Plato and there were several parallels between this report and comments made during our discussion.

  • Underlying essence of Plato process is very good: ensures best practice is followed to avoid making mistakes in handling/processing/changing digital content. Ensures some of the key aspects have been considered.
  • Mechanism is imperfect. Technology stack is difficult to manage and implement. Process doesn’t scale well to complex, non-uniform collections. Steepish learning curve. Is there enough detail in the resulting plan on the actual decision made?
  • The Plato process maps very closely to the ISO 31000 Risk Management Process (http://en.wikipedia.org/wiki/ISO_31000). But ISO 31000 is more modular, and a more modular process would enable better alignment with other institutional activity.
  • Have we been missing a trick by not reusing effort/progress from other work and other relevant disciplines?
  • It’s not just one person that should be doing preservation planning from end to end. In reality it needs to bring in other views and organisational processes (e.g. risk management at a higher or organisational level). How can we facilitate this?
  • Could OPF provide a place to share policy information? Several participants were keen on this idea.
  • Observation: It’s very hard to find preservation plans (perhaps excepting Plato and those engaged with working on it). Can we lower the barrier to entry? Through AQuA-style events it’s become clear that preservation work *is* going on out there, but preservation planning doesn’t seem to be happening, or at least isn’t being formally recorded.
  • Strong desire to support further research and development of Plato while encouraging greater uptake.
  • Plato is the cutting edge that is pushing us forward.
  • Strong value in sharing preservation plans and sharing best practice and approaches embedded within them as Plato already supports. Community approach and sharing was a recurring theme in the discussion (and there was a parallel here to the social tools provided by Taverna that was also discussed in a session at the event).
  • It was noted that there was quite a lot of relevant preservation planning information on the web, but most participants weren’t aware of all of it. It was suggested that referencing all this information from one place on the OPF wiki would be useful, and might help to get further discussion going. This would include: Plato, ISO 31000, Planets PP2/D2, Claire Ravenwood’s PhD work (Loughborough), Cambridge/JISC’s EPIC report, blog posts referencing preservation plans and policy work on The Signal, etc.

By paul, posted in paul's Blog

11th Oct 2011  3:56 PM
