Emulation as a preservation strategy for memory institutions has been researched since Jeff Rothenberg's article in Scientific American (1995). The strategy relies on emulating hardware and/or software that has become obsolete. Rendering a digital work by emulating the platform for which it was originally created has the preservation benefit of recreating the (near-)original look and feel of the work. The better the original rendering platform is emulated, the higher the fidelity of the rendering.
It has been argued that emulation is specifically suited for preserving complex objects (e.g. works of art) and executable works (e.g. software games). The specification of such works and of their behavior during execution is more complex than that of simple objects: a PDF document, for example, requires only relatively simple format and reader specifications. Specifying complex objects and executable works often requires going back to the source code of the work (if it is software) and the application layers on which it runs, in order to recreate it faithfully.
As has been discussed extensively in the literature, emulating all the layers of software and hardware on which a work runs is not the most effective or economical strategy. A more effective strategy might be to emulate only the lower layers (hardware and operating system software) and to preserve the upper layers (specific software applications). In general, the more distant a layer is from the work, the more generic it is in terms of functionality; the closer an application layer is to the work, the more specific it is to the rendering of that work. It is therefore better to preserve the specific application software and to emulate the more generic underlying software and hardware layers.
The emulation of the lower layers is not something memory institutions need to develop entirely on their own: many (open source) emulation initiatives are developing emulators of commodity software and hardware.
The preservation of specific application software by memory institutions raises organizational, legal and other issues that need to be addressed. To name just two examples: 1) the duplication of effort if all memory institutions start preserving the same software, and 2) the licensing issues relating to use restrictions beyond the software's lifecycle. Research into emulation solutions has discussed the issues at stake in some detail, and has also produced several emulation test-beds. To date, however, there are very few practical tools and services that support memory institutions wanting to implement emulation strategies.
In close collaboration with the University of Freiburg, OPF is planning a hackathon on November 13-15 in Freiburg, Germany. For this November 2012 OPF Hackathon we will be looking at emulation from a practical implementation perspective. The purpose is clearly not to repeat the academic discussion on emulation as a preservation strategy, nor to build an emulation stack for full-fledged preventive preservation. The hackathon aims to focus on emulation as a preservation method for long-term access, working with real-life test cases.
Practitioners will contribute examples of obsolete works from their repositories that cannot be rendered with current commodity hardware/software platforms. They will make their requirements explicit in terms of rendering fidelity levels (from low to high). Researchers and developers will propose emulation solutions for these examples and requirements. This exercise will illustrate how emulation can be used in practice and will provide valuable feedback for the development of emulation tools and services.
By Bram van der Werf, posted in Bram van der Werf's Blog