
The Future of the Past:

Preservation in American Research Libraries


by Abby Smith
April 1999

Copyright 1999 by the Council on Library and Information Resources. No part of this publication may be reproduced or transcribed in any form without permission of the publisher. Requests for reproduction for noncommercial purposes, including educational advancement, private study, or research, will be granted. Full credit must be given to both the author and the Council on Library and Information Resources.

Foreword

Introduction

Survival, Triage, and Preservation

The Embrittlement of Research Collections

Permanent Paper
Paper Deacidification
The Rationale for Reformatting
The Scope of the Problem
Local Responsibilities vs. National Priorities

A National Plan for Preserving Brittle Books

Selection for a National Plan
The Role of Scholars in Selection

The Future of the Present

References


Foreword

This is a stock-taking report: a summary of challenges and accomplishments in preservation efforts since the early 1960s. For preservation specialists, the message is not new. But in our work with scholarly groups, we have found little knowledge of the library community’s preservation agenda. Preservation of library resources is a vital matter to both scholars and librarians, and this report is our attempt to provide a common backdrop against which further work can proceed.

At a time when digitization is posed as the solution to a wide range of problems, we believe it is important to review the lessons learned from a national, coordinated preservation microfilming program. The library community has held different views about the best course of action to preserve brittle books, and the controversies have been public and sometimes contentious. Yet the progress in preserving the information recorded on the embrittled imprints of the past century and a half has been remarkable. In part as a result of the work done to address the brittle-book problem, guidelines for preservation of library resources are well established and followed by virtually all libraries in the United States, as well as by libraries in other countries.

As we begin to address the preservation challenges presented by twentieth-century media, we are once again faced with decisions about how to attack a far-reaching problem. Should we deal with non-print media preservation in a national program? How do we select the materials that will be preserved? Where will the resources come from? How will scholars be involved? Abby Smith’s essay does not answer these questions, but it provides a concise history of how the preservation of books and journals has been framed by librarians and archivists. Her report is meant to be a review that prepares us for the next phase of preservation work.

Deanna Marcum
President

Introduction

Recorded knowledge is as fragile as the medium on which it is recorded and as enduring as the human resolve to transmit it. In the United States, where full access to the human record is important for citizenship and for scholarship, libraries play a critical role in the acquisition, preservation, and dissemination of that record. Research libraries today make information in all media accessible to patrons onsite and off. What is not available in one’s own institutional library may be identified through a network of national and international databases and then retrieved and delivered through interlibrary loan or document delivery. This easy access to bibliographic records and source materials is currently being expanded through the extraordinary technology of digitally stored and transmitted information that can become instantly visible onscreen.

While continuing to provide traditional source materials in their original formats to scholars onsite, libraries are moving aggressively into the new world of electronic creation and dissemination of information. Electronic technology offers new methods of making collections more accessible to researchers through digital finding aids and surrogates. Early indications are that, rather than decreasing the demand to consult originals, wide dissemination of digital surrogates has created fresh demand for the use of primary sources in their original media. This new demand has placed a greater burden on research libraries to preserve as well as to serve artifactual collections, but it has so far generated no new funding for their preservation. The abiding importance to scholars of primary records in their original forms, together with the proliferation of new information on increasingly unstable media, creates an imperative for research libraries and the communities they serve to act energetically and collaboratively to ensure that the record of this century, as well as that of previous ones, is carefully selected and preserved before it erodes and degrades.

This paper gives an overview of the preservation and management of research collections and describes the context in which researchers and librarians decide what to preserve and how. By examining how librarians and scholars grappled with the first great crisis in the preservation of library materials (the pandemic loss of information printed on embrittled acid paper), it traces the development of the current consensus on how to manage large collections recorded on many media of varying stability. Finally, the paper addresses a persistent problem: despite striking progress in preservation technology and management, the difficulties of preserving original library materials have scarcely diminished over time, and as scholars and librarians enter the twenty-first century, those difficulties demand the same thoughtful cooperation that the brittle-book problem received in the 1980s.

Survival, Triage, and Preservation

Since the invention of movable type five-and-a-half centuries ago, there has been an explosion of recorded information following each technological innovation in recording media, from the manufacture of cheap paper in the 1840s to the pressing of compact discs in the 1980s. Libraries must now manage stores of information proliferating so rapidly that they threaten to overwhelm anyone’s capacity to use them efficiently and intelligently. Based on our knowledge of the past, two things can be said definitively about future library collections: not all recorded information will survive, and we will never be able to predict accurately which information will be in demand by scholars in the future. Librarians routinely make conscious and active interventions to collect and preserve things, even if they cannot now know what the researchers of the future will need. They work in a variety of ways with the research community to identify which resources are and will be in demand by users of their collections and, taking into consideration the collecting practices of other research institutions, to develop policies for acquiring and building the collections.

Scholars work from a source base that is, of necessity, incomplete. Like stratigraphers, they must analyze phenomena on the basis not only of evidence, but also of inferences about the absence of evidence. Untold numbers of records have perished over the centuries through neglect, natural disaster, and war, and they will continue to do so. What one culture may consider worth saving may be of little interest to succeeding generations, while materials neglected by the present generation may come to be highly valued later. The very fact that some things are chosen to be in libraries or archives and others are not attests to the highly selective nature of transmission.

Although the process of preservation is frequently seen as retarding or reversing the effects of time, in fact much of the work of preservation involves forecasting how something will age and taking steps to mitigate the aging. In some sense, preservation resembles that other dismal science, economics, which can at best know and understand how things have turned out in the past but is called upon constantly to make forecasts about a future in which only one thing is sure: that change will have occurred. Few people in the nineteenth century, for example, could have known that the technological innovation that made large research libraries possible (the production of cheap paper from wood pulp) would threaten the collections of those libraries within a century.

While information has been recorded on such diverse media as pebbles, papyrus, parchment, and plastic, research libraries took shape in the days when most information was printed on paper. The traditional, and most expedient, method of ensuring long-term access to that information was simply to protect the integrity of the medium on which it was recorded-that is, to repair and rebind books. Replacement with reprints and facsimiles was an option also employed in the case of rare or obsolete volumes. Most preservation activities have grown out of those traditional approaches to the book, and so has the approach to materials conservation that informs many decisions about collections care. This approach changed quite dramatically in the 1970s and 1980s as a result of the recognition of mass embrittlement of volumes in the stacks, a condition affecting as many as 35 percent of the holdings at some institutions. This bitter fact forced a new approach to custodial care.

The Embrittlement of Research Collections

Until the 1970s, research libraries’ chief concern for preservation was keeping up with the wear and tear on their books: through rebinding, repair, and protective enclosure of monographs and serials and, occasionally, through some other type of item-level treatment for rare items. In 1971, there were only three or four full-time preservation administrators in Association of Research Libraries (ARL) institutions.1 Where preservation departments existed, they were staffed by people usually trained in item-level conservation, often based on the model used by museums.

By the mid to late 1970s, the preservation problem had grown beyond keeping up with repairs to damaged books and finding additional shelving on which to put the volumes. The rapid growth of research collections in the postwar years coincided with an increasing recognition of the acid-paper problem, and libraries everywhere were reporting incidents of crumbling books. Both the Council on Library Resources (CLR) and the ARL recognized the problem of book deterioration in the early 1960s and developed programs to address it.2 The dilemma was pandemic, and it became evident that more than item-level treatment was needed. Indeed, because the disintegration derived from a natural chemical process of degradation in the paper itself, some librarians likened the problem to a time bomb in the stacks. In 1984, this phenomenon was documented in surveys of two major libraries, the Library of Congress and Yale University Library. The surveys revealed that a quarter to a third of the collections were highly embrittled and in danger of imminent disintegration. Other libraries were quick to wonder whether these two libraries could be exceptional or whether these findings pointed to a general phenomenon.

Why were so many books on the shelves of American research libraries literally crumbling when touched? Books deteriorate from stress of two kinds: the chemical composition of their materials (the paper, the binding, the glues, and other elements of construction) and the environmental conditions under which the books are kept. The greatest culprit in decay is the deleterious level of acid found in paper manufactured from 1840 until 1980 and beyond. Before the middle of the nineteenth century, paper was made from linen and cotton rags and was a remarkably stable medium. Most of the deterioration found in rag-paper documents has been caused by inks, many of which (iron gall ink among them) are acidic enough to eat into the paper, or by the introduction of mold and pests into a volume.

Demand for paper was very high in the early nineteenth century, and by the 1840s mills had begun to produce it from a much more abundant source: wood pulp. Chemicals such as aluminum sulphate (known as alum), added during the papermaking process to improve the paper’s hand and to keep inks from being too readily absorbed, react with humidity to produce sulfuric acid, which, over time, weakens the molecular structure of the cellulose in the paper. Together with the weakening of fibers by the bleaches used to brighten sheets, this chemical process leads to embrittlement. Other types of paper, manufactured from ground wood pulp and not cooked with chemicals, contain lignin, a component of wood that causes discoloration upon oxidation (noticeable in any newsprint left in the sun for as little as a few hours).

The chemical composition of wood-pulp paper is highly reactive to the environments, both micro and macro, in which books are stored and used. High temperature and relative humidity accelerate the chemical processes that lead to embrittlement, and fluctuations in either or both of these environmental factors add further stress. There is no uniform rate of deterioration, and assessing the damage that has occurred or will occur is a local matter, depending not only on the environment of a library but also on the specific conditions within stacks and, to a degree not yet thoroughly studied, on the microenvironment of the bound volume itself. The brittle-book problem, therefore, while endemic to all books printed from 1840 onward, manifests itself differently in different parts of the country. In the mid-Atlantic states, for example, which have a large number of research libraries with old collections, holdings have decayed more, in large part because of the relatively high ambient temperature and humidity in library buildings. Collections on the West Coast, on the other hand, have not degraded as much, because they are younger and their environments are less damaging to acid paper.

Permanent Paper

The problem with paper made from wood pulp was first noticed at the end of the nineteenth century, but it was not until shortly after the Second World War, coincident with the rapid growth of research institutions and their library collections, that the chemistry of the phenomenon was systematically studied.3 By the end of the 1960s, the cost-effective manufacture of alkaline paper was feasible, and the development of standards for making so-called permanent paper was well under way. To prevent the future self-destruction of research collections, CLR and The Andrew W. Mellon Foundation brought together in 1979 a group of experts on book production, publishing, and paper preservation to assemble information about the problem and devise a strategy for moving forward. In the following years, reports on book longevity and paper permanence were issued, and the American National Standards Institute (ANSI) worked to develop a standard for permanent paper. A number of library organizations agitated for the use of permanent paper in all books and documents that might be considered for long-term retention by libraries and archives. The campaign was successful among commercial as well as noncommercial publishers and printers, and in 1990 the federal government began mandating the use of permanent paper for its official documents.

While there has been a great deal of scientific analysis of paper and the effects of aging on it (done through accelerated-aging testing), much remains unknown. Most testing, for example, is done on single sheets of paper, not on paper in bound volumes. What exactly is the microclimate inside a book that was bound in leather in 1864? Does the composition of the leather, and of all the chemicals introduced into it during the tanning and tooling processes, act to accelerate or to buffer acidification? How much acid migrates from one sheet of paper to another, or from the paper to the binding? Paper made from wood pulp often has a very high lignin content, and lignin is known to be a major factor in the discoloration of paper upon oxidation. But lignin is also thought to act as a buffer against certain pollutants found in the air of urban and heavily industrialized areas. Much research remains to be done into performance-based, rather than composition-based, standards for paper longevity.

Paper Deacidification

There are a number of strategies one could employ to prevent, or at least forestall, damage to acidic materials. In the 1960s and 1970s, one of the most promising was to deacidify paper, and methods for doing so on a mass scale were under active development. (A method for aqueous deacidification of individual leaves has been available to conservators for several decades now, and it is used to treat items of sufficient value to warrant item-level treatment.) Most deacidification methods significantly retard the natural deterioration of paper by depositing an alkaline buffer to neutralize the acid.4 Though deacidification stabilizes paper, it cannot strengthen paper or reverse any damage that has already occurred. While very promising for the prospective treatment of materials threatened with decay, the process is not efficacious for materials that have already reached a certain level of embrittlement. For the millions of volumes already embrittled in libraries, deacidification would offer no relief. Another approach was necessary.

The Rationale for Reformatting

There were, in every library, materials so embrittled that handling them would, in effect, destroy them. For these books, there was only one option: to reformat them, that is, to transfer the information they contained onto another medium, such as photocopy or microfilm. This realization among library custodians marked a turning point in the management of library collections. Heretofore, retaining the information in a book had meant saving the book. For embrittled books, that was no longer possible. For the first time, librarians and archivists had to start making a distinction between the information and the object. This relatively new concept (abandoning the carrier altogether and neither rebuilding nor restoring texts but capturing the information on a more stable medium) created a radically new situation for those who had dedicated their professional lives to restoring and rebuilding. Laboratories that had focused their efforts on restoring the integrity of an object in order to make it available to researchers now had to engage in the painful work of triage. As Patricia Battin, president of the Commission on Preservation and Access (CPA), a group that took the lead in developing a national strategy to address the problem, wrote in 1992, “We faced very painful and wrenching choices-we had to accept the fact that we couldn’t save it all, that we had to accept the inevitability of triage, that we had to change our focus from single-item salvation to a mass production process, and we had to create a comprehensive cooperative strategy. We had to move from the cottage industries in our individual library back rooms to a coordinated nationwide mass-production effort.” (Battin 1992, 6)

The Scope of the Problem

If up to a third of the collections at libraries such as Yale and the Library of Congress were already embrittled, and those collections contained many more volumes printed on acid paper that would inevitably turn brittle without intervention, how many volumes were at risk across the country, and how many needed to be reformatted as soon as possible? Warren J. Haas, the president of CLR, urged the Association of American Universities and the American Council of Learned Societies (ACLS) to join him in creating a task force to study the extent of book deterioration. In 1984, Haas commissioned a series of studies by Robert Hayes, then dean of the Graduate School of Library and Information Science at the University of California, Los Angeles, to determine the percentage of embrittlement and duplication at major U.S. repositories. In 1988, there were about 305 million volumes in the ARL libraries, of which, Hayes determined, approximately 25 percent were brittle. Hayes further calculated that 0.6 percent of the collections were changing from endangered to embrittled every year (as determined by a test of paper strength, the so-called MIT fold test, developed to identify mechanical weakness or embrittlement). Taking into account the number of titles held in more than one institution, Hayes calculated that 12 million volumes were unique and either already brittle or destined to become so within 20 years (Hayes 1985). These same reports forecast that, realistically, only about a third of these titles could be filmed in a 20-year period. That third became the focus of the brittle-books preservation microfilming programs initiated by scores of libraries around the country.
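The rough arithmetic implied by these figures can be traced as follows. This is a back-of-the-envelope sketch only, written out in Python for concreteness: the quantities come from the text above, but the simple multiplications are illustrative, and Hayes’s actual methodology was more detailed.

    # Back-of-the-envelope sketch of the Hayes estimates cited above.
    # The figures come from the text; the arithmetic is illustrative only.
    total_volumes = 305_000_000   # ARL holdings, 1988
    brittle_share = 0.25          # roughly a quarter already brittle
    unique_at_risk = 12_000_000   # unique volumes brittle now or within 20 years
    filmable_share = 1 / 3        # portion realistically filmable in 20 years

    brittle_volumes = total_volumes * brittle_share     # about 76 million
    filmable_volumes = unique_at_risk * filmable_share  # about 4 million

    print(f"Brittle volumes, duplicates included: {brittle_volumes:,.0f}")
    print(f"Unique volumes filmable in 20 years:  {filmable_volumes:,.0f}")

On this reckoning, roughly 76 million brittle volumes reduce, after deduplication across institutions, to 12 million unique volumes at risk, of which on the order of 4 million could realistically be filmed in two decades.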

Local Responsibilities vs. National Priorities

At the same time that preservation librarians were grappling with the consequences of being able to rescue only some, but not all, of the information in jeopardy, they realized that the way they had normally selected items for preservation treatment was inadequate for the problems that mass-scale deterioration posed. The threatened loss of information through widespread embrittlement was seen as a national problem. However, preservation decisions about what to treat, when, and how had always been made locally, with a view to the needs of specific institutions. Libraries are charged with custodial responsibility for items under their direct care, and no library can dictate to another what to preserve. To rescue a national literature, however, and to avoid costly duplication of effort in the process, a national collaborative effort was needed in which individual libraries could participate.

When preservation decisions are local, a comprehensive care program tries to strike a balance between keeping heavily used items in good repair and preventing deterioration of, and damage to, materials that are of high priority for their artifactual value or rarity. Access to a library’s holdings is provided to users primarily on site, and, because researchers by and large prefer to consult a source in its original format, the repair and strengthening of heavily used items, books in particular, is a major preservation activity. Other treatment options include creating a surrogate for fragile materials and, for rare items, restricting access.5

A National Plan for Preserving Brittle Books

Some, but not all, of these preservation selection criteria come into play when developing a plan to preserve collections on a national level. Unlike other countries that have national libraries responsible for collecting and preserving the national output, the United States must develop and implement national plans through the existing decentralized network of repositories. There are four central questions that have to be answered for a plan to be effective at the national level:

  1. how do we document the information to be saved, on what medium, and using what standards;
  2. how do we record the fact that a title has been preserved;
  3. what should be preserved; and,
  4. who should be responsible for accomplishing the preservation.

Developing standards for preservation-quality microfilming was crucial to ensuring the integrity of the record. And the bibliographical record needed to be modified to record the existence of, or intention to create, a master microform of a title, to avoid duplication and make the availability of the copy known to the research community. While there had been cooperative reformatting programs in the 1970s aimed at preserving core literature in certain fields (for example, the efforts of the American Theological Library Association), it was only in the 1980s with the advent of the Research Libraries Group (RLG) cooperative filming program that some of the core issues of standards came into focus and the bibliographical infrastructure to support a national plan came into being.

With funding from the National Endowment for the Humanities (NEH), RLG undertook a cooperative microfilming project that captured 30,000 American monographs, published from 1876 to 1910, held in seven participating libraries. This project was significant in large part because it recorded the existence of the newly created surrogates in a shared database and thereby introduced an efficient way of obviating duplication of effort.6 In addition, with this project RLG began its long-term effort to develop standards and best practices for filming and bibliographical control, two of the four elements essential to any national preservation effort. Over the course of the 1980s, and in consultation with the preservation community, RLG developed and published microfilming guidelines that became the accepted standard for American filming projects. Among the most important results of this process was the confirmation of 35mm silver halide film as the most durable and reliable medium for textual reformatting. Testing has indicated that, when stored under the proper conditions, such film can last up to 300 years. It can be reproduced quickly and inexpensively, and its images are readable with the aid of light and magnification alone, making it less vulnerable to hardware and software obsolescence than other possible technologies. While acknowledged to be less user-friendly than, say, digital imaging, microfilm is, as of this writing, still considered a far more stable preservation medium. Ongoing research into the conversion of film to digital images, and of digital images back to film, indicates that between the two formats we may be able to have the stability of a good storage medium together with the advantages of a flexible access technology.

The endeavor to rescue so much endangered information faced not only technological problems but also the much greater challenge of rousing the collective will of librarians and scholars to identify what must be filmed and who should take responsibility for it, and of raising the huge sums of money the task would require. As a result of the work begun by CLR in the mid-1980s, CPA was created in 1986 to coordinate the effort and instigate collaborative action, to publicize the issue of brittle books, and to provide leadership, with the specific task of working with the scholarly community to raise awareness and enlist the support of users. CPA developed three strategies to attack the brittle-book crisis:

  1. convince printers and publishers to change to alkaline paper;
  2. explore the feasibility of deacidification; and,
  3. capture the intellectual contents of a substantial number of brittle books in an archival master copy format.

The National Advisory Council on Preservation was formed at the same time to enlist academic and professional organizations as advisors to CPA and to promote awareness of, and interest in, this preservation crisis among scholars.

On the two other matters crucial to a successful plan (selecting what should be filmed and deciding how to apportion responsibility among institutions for filming particular titles), the microfilming project begun by RLG engaged its member libraries directly: they chose to film their collection strengths. This method assumed two things: that an institution’s collection was strong enough that, if the whole collection were filmed, a meaningful representation of the print record on a subject would be preserved; and that the economy of scale gained by not selecting on a title-by-title basis would make the model attractive to many other libraries.

As stated by RLG in a proposal to NEH for funding (Child 1992, 151), “This collection-based approach to preservation selection . . . assumes that institutions can review their holdings, identify discrete groups of materials, and determine that such collections are worthy of preservation.” Collection excellence was embraced as a principle for selection, especially when informed by the use of the so-called Conspectus, an inventory and ranking system for collections created and maintained by RLG.

Selection for a National Plan

What has been called the “great collections” approach to selection for reformatting turned out to resonate with many libraries. Using the RLG Conspectus to identify institutional strengths and searching databases for holdings in other libraries revealed that the level of duplication among institutions was surprisingly low. But selecting only those titles and volumes that are both intellectually valuable and physically fragile is a very labor-intensive activity: it would have meant choosing items for preservation literally title by title. In libraries that typically held millions of volumes arranged by subject rather than by date of publication, the volumes printed between 1840 and 1950, those most likely to be embrittled, were not shelved together. The title-by-title approach was simply impractical. The great-collections approach minimized the effort spent on selection decisions and relied for academic integrity on the original decision to acquire the item because it had research potential, now or in the future, and should be preserved. This method had the virtue of eliminating any influence of the current view of scholarship on the intrinsic value of the item, but it ran the risk of expending funds to preserve items that might never be used (incidentally, the same risk taken when the item was acquired) or that might not yet be brittle (though they were likely to have been printed on acid paper). This is the method most widely used by institutions that receive funds from NEH. It reduces the expenditure of scarce resources on selection and concentrates them on actual conversion and bibliographical documentation.

Another selection method, which may be called the bibliographical model and which was often favored by micropublishers, uses a series of titles or a body of literature identified by a bibliographer or scholarly editorial board as the basis for selection, thus assembling a single metacollection that exists only in surrogate form. (The contemporary discussions about building a virtual digital library hark back to similar debates about the bibliographical model decades ago.) It was generally acknowledged that this method could be effective only in academic fields with a highly evolved bibliographical consensus, such as classics, agriculture, theology, and some area studies. It is simply impractical for newer or more dynamic fields that are evolving too quickly for consensus to emerge, or for fields such as history, in which the size of the source base precludes the idea of comprehensiveness.

Use-driven selection takes the opposite approach to the collections- or subject-based methods. It is an essentially passive form of identification, in which any item called for use is treated if it is in bad condition. While every library employs this approach in selecting items for repair or replacement in order to keep the circulating collection in serviceable condition, few libraries use it for preservation reformatting (as distinct from making a photocopy for use). That this approach has not been widely adopted is due in part to the fact that NEH, the primary source of funding for preservation microfilming, endorses only the collections- and subject-based methods. Use-driven selection is seen as ineffective in helping to rescue endangered information within a national, rather than local, context. By focusing on commonly used materials, it has been argued, one would end up creating a so-called national collection that is randomly assembled, not a coherent body of literature. One would also run the risk that a little-used item would have decayed by the time scholarship came to see its potential for research. Proponents of the approach, on the other hand, point out that little-used titles might well turn out to be better preserved in the future than the materials most in use (De Stefano 1995).

Since there is little significant overlap of titles among the major research libraries, and since each method of selection for preservation has both advantages and disadvantages, the chances of reformatting the most important embrittled titles are greatly increased by using all three methods in parallel.

The Role of Scholars in Selection

Scholars play a role in selecting items for preservation similar to the role they play in selecting items for acquisition: sometimes it is direct and systematic, at other times indirect and ad hoc. The brittle book was an alarming threat to scholarly resources, and many scholars were galvanized into working with librarians to solve the problem. One of the earliest and most illuminating collaborations between scholars and librarians to identify and preserve the most endangered literature in a discipline was that between the American Philological Association (APA) and the Columbia University Libraries. The APA received a grant from NEH and Mellon in 1984 to microfilm the most important research materials printed between 1850 and 1918. The APA appointed an editorial board of seven scholars to select the materials, and Columbia was contracted to do the filming. The goals of the project were

  • the preservation of a substantial body of the most important materials from classical studies in a mature but now endangered period,
  • the improvement of scholarly access to this material through a wide availability of inexpensive copies, and
  • an investigation into how a preservation program involving scholars directly in decision making might work (Bagnall and Harris 1987, 141).

The classicists followed the bibliographical method. The editorial board began by working from the published shelf list of Harvard’s Widener Library. Some scholars also read the shelves of their own institutions’ libraries. Not surprisingly, “the scholars disagreed significantly on the number of titles recommended for preservation,” though the board decided to err on the side of inclusion (Child 1992, 148). As Margaret Child wrote,

Although this approach proved successful in terms of its goals, the preservation and increased availability of the core literature of the field, the process was cumbersome and expensive. It cost an estimated $1.50 per title to cover the direct, paid costs of the operation of the editorial board. However, the time invested in the review process by individual scholars was not reimbursed and represented a substantial donation by senior faculty with many other obligations. In addition, about thirty percent of the items recommended were not found at Columbia, and copies had to be located elsewhere, either to be filmed by the holding library or loaned to Columbia for filming. Although this method of selection has some obvious strengths, especially for small, well-defined fields such as classics, no other discipline has to date attempted to replicate it.

When the newly formed CPA first addressed the question of selection for microfilming in 1986, it turned to ACLS, which represents largely humanities and social science disciplines, to survey its member societies’ knowledge of, and concern about, the preservation of printed materials important to their fields. The response to the survey, completed in 1987, was disappointing (fewer than one-quarter of the societies answered), indicating that this was not a pressing issue for most scholars. With the help of ACLS, CPA established a number of scholars’ task forces to identify the areas of their academic disciplines most valuable to reformat. Eventually seven task forces were convened, on Renaissance studies, history, philosophy, medieval studies, modern language and literature, art history, and the Hispanic literary heritage. The task forces’ experiences were similar in many significant ways. Scholars were generally unaware of the scope of the brittle-book problem in libraries. When they came to understand that not all of the print record from the last century and a half in their fields would long survive, they were, at first, shocked. Upon reflection, they expressed understandable reluctance to predict what future research needs would be. They stressed that collaborative, cross-institutional action was necessary to rescue the endangered information. In 1995, CPA asked Gerald George to review and assess the activities of the scholarly task forces and to suggest options for continuing the consultative process. He recommended that the scholarly associations “should take leadership responsibility for preserving materials of priority importance for research in their respective fields,” and should expand their work to include materials that should be prioritized for digital conversion (George 1995, 15).

There has also been significant and ongoing involvement of scholars in the selection of materials to be filmed with funding from NEH. Libraries and library consortia are responsible for submitting proposals to the endowment, and, while each library has its own method for selecting materials to be filmed, scholars, subject specialists, and bibliographers, many of whom have advanced degrees in a subject specialty, identify the collections to be preserved. The University of Texas at Austin, for example, developed its filming lists for Latin American titles with the aid of an academic advisory group. The United States Newspaper Program, designed to preserve regional newspapers throughout the country, was the direct result of a research-tools survey of historians, who identified that source base as among the most valuable and endangered. The process of evaluating the grant applications always involves written reviews and panels of scholars working with librarians to assess the value of the collections proposed for filming.7

The Future of the Present

The next century’s major preservation challenge will be to cope with the fragile media of the present one, from magnetic tape to digital files. The record of the twentieth century exists on many media that are far more fragile than paper. From the earliest methods of recording images and sound, such as nitrate film and wax cylinders, to the more contemporary formats of videotape and audiocassette, the media that carry the nontextual information of this century will present future scholars and librarians with more difficult access and preservation decisions than any yet posed by acid paper. Unlike the codex, these new recording media, especially those that record information in real time and must be played back on intermediary machinery, pose an intellectual challenge when it comes to deciding what to preserve, because we do not yet fully understand the ways in which they carry significance and are used. In comparison with the printed word, sound and image are still very close to what could be considered an incunable stage. Just as the revolutionary effects of print on religious, political, social, and economic development were barely presaged during the first one hundred years of printing, it is likely that we will need decades more to understand enough about these media to make fully informed decisions about collection development and preservation. But by then, unlike the contents of print incunables, a significant portion of the information on these media will have been permanently lost, just as up to 80 percent of the motion pictures made before 1940 have perished: either no one saved them at the time, or the copies that were saved have deteriorated physically beyond recovery.

The Council on Library and Information Resources (CLIR), formed by the merger of CLR and CPA, joined with ACLS in 1997 to form joint scholar-librarian task forces to address the collection development and preservation problems that these media present. What has emerged clearly from the CLIR-ACLS task forces is the unchanging desire of scholars to work with original, unreformatted primary source materials, from paper manuscripts to vintage photographs, though many look to digital copying to facilitate the use of older and very fragile media such as wax cylinder recordings (CLIR 1999). Future use of new media and formats is hard to predict reliably. Disciplines are consulting many different types of materials and treating ever-broader groups of documents as primary sources (for example, railroad schedules, menus, advertising), and texts are being scrutinized in ways that are in some cases quite new. Indeed, more attention is being paid to the circumstances of production and consumption of documents than ever before, leading to increased demand for the artifact itself as the bearer of information. For secondary source materials, however, digital access is sometimes preferable. The accumulation and retention of more sources present librarians with serious storage, preservation, and access problems not covered by current funding levels. One clear trend is that ever greater numbers of collection items will go into secondary storage, which is usually salubrious for the longevity of the holdings but comes at the price of less ready access.

The balance between preservation and access has always been precarious for items that rely on the durability of the carrier, be it print on vellum, emulsion on wood-pulp paper, or grooves on acetate disks. Copying as a preservation strategy (for example, making a safety copy of nitrate film or a photocopy of a manuscript leaf) nearly always entails some loss of information. Therefore, the treatments developed to preserve such items are designed to retard the tendency of materials to decay (so-called inherent vice) and to minimize physical handling, in order to obviate the need for copying or reformatting.

The strategies that have emerged in the past two decades to manage preservation risk to print and media (non-digital) research collections include

  • controlling storage environments to keep temperature and humidity levels consistent and at optimal settings that retard the natural processes of decay (such as photo oxidation and acid hydrolysis);
  • instructing patrons and staff on how to handle fragile materials in a way to minimize damage;
  • restricting access to fragile items (for example, by allowing only staff to photocopy);
  • removing items from service and providing surrogates to users (copy prints of photographs, digitized images, microforms, photocopies, and the like);
  • rehousing items in acid-free containers or inert Mylar sleeves; and, perhaps least visible of all,
  • implementing emergency preparedness strategies by equipping storage areas with water-damage protection and training staff in urgent response to catastrophes, man-made and natural, and by taking other similar steps.

These activities are unobtrusive and seldom remarked upon. Ironically, most patrons see preservation at work only when they themselves are asked to modify their behavior to protect the collections (for example, by wearing cotton gloves to handle photographic images), or when the medium has failed to survive one stress or another and the user must make do with a surrogate such as a microform or facsimile. Changes in the behavior of researchers as well as librarians, including using safe handling techniques with fragile materials, learning how to retrieve items that are stored off-site, and making increased use of search tools to find materials that may no longer be browsable, indicate that the research environment of the future will rely, as it does now, on the continuing education and adaptability of all members of the research community.

There is great hope that digital technology can help to preserve and make more accessible many rare and fragile items, because the quality of digital images is high and the use of electronic surrogates is easier than that of microforms. What role can digital conversion play in making the irreplaceable information contained in oral histories, field recordings, vintage photographs, and wax cylinders both more accessible and more durable? Unfortunately, we know already that electronic data are even more liable to disappear or become unrecoverable than information in analog media, such as microfilm. Refreshing data and migrating them from one hardware and software configuration to another as the technology changes require a major investment of resources, financial and other. Unlike a book, digital data will not survive long if left on the shelf and not refreshed. Critical preservation choices have to be made when digital information is being created, not, as is the case with other formats, at a later time and usually not by those directly involved in the creation of the information.

Moreover, the very concept of preserving original items (or information in its original format) is problematical in the digital environment. Electronic information has a virtual reality that has little or no dependence on physical media. There is nothing in the digital world comparable to the analog world’s well-understood notions of an original, an artifact, or a facsimile. One reason the original is so valued in the analog environment is that it has more integrity and authenticity than a copy, and analog copying always produces some loss of information in addition to the change of medium. In the digital world, there is no loss of information during copying. Indeed, one could argue that one does not copy digital information; one clones it. What is at risk over time, however, is the loss of functionality in digital files. When data are preserved by migrating files from one system to another, some features of a file nearly always drop out or are altered.
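The difference between analog copying and digital cloning can be made concrete: because a digital copy is bit-for-bit identical to its source, its fidelity can be verified exactly with a checksum, something no analog copying process permits. The following minimal sketch, with hypothetical file names, illustrates the point in Python:

    # Minimal sketch: a digital "copy" is a bit-identical clone,
    # and a cryptographic hash can prove it. File names are hypothetical.
    import hashlib
    import shutil

    def sha256(path):
        # Hash the file's bytes in chunks so large files fit in memory.
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    shutil.copyfile("master.tif", "clone.tif")
    # Matching digests demonstrate that no information was lost in copying.
    assert sha256("master.tif") == sha256("clone.tif")

The loss that does threaten digital files, by contrast, occurs at migration, when a file is transformed to run on new systems rather than duplicated bit for bit.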

Even though the problems of digital preservation have yet to be solved in a cost-effective manner, all major research libraries are devoting scarce resources to the development of an infrastructure that will support the creation of and access to digital information. This is happening in the very libraries where, every day, decisions are made about how to treat damaged or brittle volumes stressed by being photocopied, falling into book drops, being pulled and reshelved, and being shoved into backpacks and briefcases.

Preservation is the art of managing risk to the intellectual and physical heritage of a community and all members of that community have a stake in it. Risk management is dynamic, and, in practice, preservation becomes an ever-changing assessment of value and endangerment. A collaboration between scholars, who can advise about the intellectual value of collection items, and librarians, who can make judgments about the physical risk that threatens collections, is the best and most responsible way to ensure that the legacy we have inherited, and to which we contribute, will survive into the future.

References

Bagnall, Roger S., and Carolyn L. Harris. 1987. “Involving Scholars in Preservation Decisions: The Case of Classicists.” The Journal of Academic Librarianship 13(3).

Battin, Patricia. 1992. “Substitution: the American Experience.” Typescript, lecture in Oxford Library Seminars, “Preserving Our Library Heritage,” February 25.

Child, Margaret. 1992. “Selection for Preservation.” In Advances in Preservation and Access. Edited by Barbra Buckner Higginbotham and Mary E. Jackson. Westport, Conn.: Meckler Publishing.

Commission on Preservation and Access. 1991. “Review and Assessment Committee Final Report,” September 26. Washington, D.C.: Author.

Council on Library and Information Resources. 1999. Scholarship, Instruction, and Libraries at the Turn of the Century: Results from Five Task Forces Appointed by the American Council of Learned Societies and the Council on Library and Information Resources. Washington, D.C.: Author.

De Stefano, Paula. 1995. “Use-Based Selection for Preservation Microfilming.” College & Research Libraries (September).

George, Gerald. 1995. Difficult Choices: How Can Scholars Help Save Endangered Research Resources? A Report to the Commission on Preservation and Access. Washington, D.C.: Commission on Preservation and Access.

Hayes, Robert M. 1985. “Analysis of the Magnitude, Costs, and Benefits of the Preservation of Research Library Books.” Working paper prepared for the Council on Library Resources.


FOOTNOTES

1 In contrast, by 1991 there were 52 (CPA 1991).

2 In 1961, CLR helped to establish the Barrow Research Laboratory in Richmond, Virginia, to study the effects of environment on the longevity of books. The following year, ARL commissioned Gordon Williams to conduct a large-scale preservation survey of its member libraries.

3 The work of William J. Barrow of the Virginia State Library.

4 Large-scale development of mass deacidification became a major initiative of the Library of Congress in the 1980s. Testing of the library’s selected method of gaseous deacidification, known as diethylzinc, or DEZ, was carried out at the NASA Goddard Space Flight Center. However, an accident in a testing chamber in 1985 led to the dismantling of the facility and proved to be a serious deterrent to the library’s deacidification program.

5 Those items selected for special treatment usually fall into the following categories: items that are rare or unique, including things valued for their association with an author, owner, etc.; fragile objects; primary, not secondary, source materials, which are, by their nature, often unique; and items at risk of theft or mutilation.

6 The National Register of Microform Masters (NROMM) was created in 1965 to record where a surrogate resides. Now fully machine-readable, NROMM comprises nearly 2.3 million records, available online in OCLC and RLIN. Since 1990, libraries have been able to record not only the creation of a microform master, but also the intention to film a title.

7 NEH also makes funds available to prepare an item for the camera and to stabilize the object before it is put back on the shelf.
