Sustainable Business Through Sustainable Emulation

Following up on the idea of system imaging or "snapshotting" discussed in a previous post and presented at iPRES 2011, a general concept of long-term sustainable computer systems could emerge. Why not take emulation into consideration when designing future systems and defining their requirements? A seamless process would allow systems previously run on real or virtualized hardware to be moved into an appropriate emulator for their final preservation state. That preservation state could then be kept accessible for a pre-defined period or indefinitely. Sustainable emulators are the key to such an approach, and the effort required to achieve this would most likely amount to only a small fraction of the system's overall planned costs.
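As a minimal sketch of that final step, assuming a raw disk image has already been captured from the decommissioned machine (for example with dd) and that QEMU is the target emulator, the image could be converted and booted read-only. The file names here are purely illustrative:

```python
import subprocess

# Hypothetical paths: a raw disk image previously captured from the
# original machine, e.g. via `dd if=/dev/sda of=server.img bs=4M`.
RAW_IMAGE = "server.img"
PRESERVED_IMAGE = "server-preserved.qcow2"

# Convert the raw capture into qcow2, a compact copy-on-write format
# understood by QEMU, to serve as the archived preservation copy.
subprocess.run(
    ["qemu-img", "convert", "-f", "raw", "-O", "qcow2",
     RAW_IMAGE, PRESERVED_IMAGE],
    check=True,
)

# Boot the preserved system under full system emulation. The -snapshot
# flag keeps all writes in a temporary overlay, leaving the archived
# image untouched across access sessions.
subprocess.run(
    ["qemu-system-x86_64", "-m", "1024",
     "-hda", PRESERVED_IMAGE, "-snapshot"],
    check=True,
)
```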

Virtualization and Emulation in the Mainstream

The boiled-down approach would make original systems compatible with today's emulators or virtualization tools through slight modifications applied while they are still running on the original hardware. Additional drivers for network cards, sound cards, graphics adaptors, etc. could be installed beforehand. The original MAC address of the network adaptor could be read out and written into the emulator's configuration, and any fancy high-resolution graphics setup would need to be switched to a more generic mode that is compatible with the emulated hardware.
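A sketch of those last two steps, assuming a Linux source system and QEMU as the emulator (the interface name and disk image path are assumptions):

```python
import pathlib
import subprocess

IFACE = "eth0"                  # assumed interface name on the source machine
DISK_IMAGE = "preserved.qcow2"  # hypothetical preserved disk image

# Read the original adaptor's MAC address from sysfs (Linux-specific),
# so the emulated NIC presents the same identity to the guest.
mac = pathlib.Path(f"/sys/class/net/{IFACE}/address").read_text().strip()

# Launch QEMU with the original MAC on an emulated e1000 NIC, and a
# generic standard-VGA adapter instead of the original graphics card.
subprocess.run(
    ["qemu-system-x86_64",
     "-m", "2048",
     "-hda", DISK_IMAGE,
     "-netdev", "user,id=net0",
     "-device", f"e1000,netdev=net0,mac={mac}",
     "-vga", "std"],
    check=True,
)
```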

The more advanced approach would require a system design that is "virtualizable" and subsequently "emulatable", ensuring accessibility after the original software and hardware manufacturers declare the system out of service. This strategy could include defining the relevant hardware components, which could augment the specifications already drawn up for system purchases. Governments or companies could require that the systems they acquire use only hardware components that are easily virtualized or emulated.
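To illustrate how such a procurement requirement could be checked mechanically, the sketch below compares a machine's PCI device list against a whitelist of devices known to have emulated counterparts. The whitelist entries are a hypothetical starting point, and the reliance on lspci assumes a Linux system:

```python
import subprocess

# Hypothetical whitelist: PCI vendor:device IDs with well-supported
# emulated counterparts in common emulators.
EMULATABLE_DEVICES = {
    "8086:100e",  # Intel 82540EM Gigabit Ethernet (QEMU's e1000)
    "1234:1111",  # QEMU/Bochs standard VGA
}

def installed_pci_ids():
    """Return the vendor:device IDs reported by `lspci -n` (Linux)."""
    out = subprocess.run(["lspci", "-n"], capture_output=True,
                         text=True, check=True).stdout
    # Each line looks like: "00:02.0 0300: 8086:100e (rev 03)"
    return {line.split()[2] for line in out.splitlines() if line.strip()}

unsupported = installed_pci_ids() - EMULATABLE_DEVICES
if unsupported:
    print("Components without a known emulated counterpart:")
    for dev in sorted(unsupported):
        print(" ", dev)
else:
    print("All PCI components map onto emulated hardware.")
```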

Emulators often already exist as applications for the popular ARM architecture (Android tablets and smartphones), for which software is routinely developed off-device using system emulation. Other examples are found throughout the history of computing. Apple, for instance, implemented an emulator within its operating system to allow applications compiled for Motorola CPUs to run on the PowerPC systems introduced in the mid-1990s, and did the same again to run PowerPC programs compiled for Mac OS 8 or 9 on early Mac OS X versions.

Sustainable Emulation

Emulators and virtualization tools have already proven their usefulness in IT processes, but they lack long-term sustainability. Judging by the product cycles and policies of VMware, the oldest player in the market, there is too much change and uncertainty to rely on long-term availability and fitness for digital preservation purposes. Moreover, what becomes of the commercial software if EMC2/VMware is no longer around in a couple of decades is unclear. At the moment, at least, they seem not to care much about older systems, having already dropped support for quite a few operating systems at the tail end of their product line.

Rapid technological change and the constant improvement of computer systems require continuous adaptation of applications and operating systems. For example, if new peripheral interfaces such as USB or new CPU instructions are added, operating systems must be modified to use these new features. This often implies the need to update the executables running on a given platform; otherwise, after a certain number of updates of the underlying operating system, older versions are no longer usable. This is especially true for virtualization tools, which depend directly on hardware and operating system interfaces. Moreover, executables such as emulators and virtualization tools depend on a set of basic libraries whose interfaces and functions change over time. Despite the constant change of the digital ecosystem in which an emulator is embedded, it must change neither its behavior nor the functions it provides to the original environment.
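One practical consequence is that every new emulator build should be checked against a reference run before it replaces its predecessor. Below is a minimal sketch of such a regression check, assuming a reference guest image that writes deterministic output to the serial console and then powers itself off; the image name, timeout, and recorded digest are all assumptions:

```python
import hashlib
import subprocess

GUEST_IMAGE = "reference-guest.qcow2"  # hypothetical reference test image
# Digest of the serial output, recorded once from a trusted emulator build.
EXPECTED_DIGEST = "sha256-recorded-from-trusted-build"

def boot_digest(qemu_binary):
    """Boot the guest headless, capture its serial output, return a hash."""
    result = subprocess.run(
        [qemu_binary, "-m", "512", "-hda", GUEST_IMAGE,
         "-snapshot", "-nographic"],   # -nographic routes serial I/O to stdio
        capture_output=True, timeout=120,
    )
    return hashlib.sha256(result.stdout).hexdigest()

# Compare a candidate emulator build against the recorded reference.
if boot_digest("qemu-system-x86_64") != EXPECTED_DIGEST:
    print("Emulator behaviour changed: do not deploy this build.")
```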

Thus, an Open Source approach to emulators, overlaid with a commercial service layer, would best fit digital preservation and access purposes. Nevertheless, sustainable software, and especially sustainable emulation, remains an open research topic.

Future Systems

Virtualization and emulation allow for a new lifecycle-management paradigm for extended digital objects. It would allow sustainable business environments to sidestep the fast pace of technology change set by the IT industry. However, to make this happen, a number of actions should be discussed. On the one hand, it would be helpful to agree on standard peripherals and emulator feature sets to ease conversions from real to emulated hardware. On the other hand, the emulators used for development and for the introduction of new architectures should be considered: they should be Open Source (if not already) and made available to the general public. Hardware vendors are in a core position here and must be convinced to cooperate in the future.
