Making a Path from the 21st Century to the 19th
–by Abby Smith
DLF Looks Beyond the Technical Issues
–by Donald J. Waters
We Can’t Save Everything
–by Deanna B. Marcum
Making a Path from the 21st Century to the 19th
–by Abby Smith
AS WE NEAR the moment when the 20th century will displace the 19th in everyday speech as “the last century,” librarians and scholars have begun a national dialogue about how best to ensure the transmission of information from the 19th century into the 21st. The 19th century is marked, among other things, by remarkable technological innovations in recording media, from the invention of photography to the devising of means to capture and play back sound. These two technological breakthroughs did not reach maturity until this century, when they transformed the way people communicate among themselves and record and transmit information. One other notable 19th-century advance in recording information—the invention of cheap paper that enabled inexpensive book production—revolutionized the reading and writing, if not the thinking, habits of Americans in the 1800s. Paper manufactured from wood pulp made the book and the periodical the key media of information for generations and was largely responsible for the development and growth of research libraries as we know them today. It is only natural that scholars are systematically investigating the material culture of the 19th century, including the multifarious aspects of book production, distribution, and consumption, to document and interpret the century’s “Information Revolution.”
“Librarians and scholars have begun a national dialogue about how best to ensure the transmission of information from the 19th century into the 21st.”
One of the more lamentable legacies of the 19th-century publishing boom is the plethora of sources in our libraries that are printed on acidic paper. Without active intervention to save them, through techniques such as deacidification, these books and periodicals are destined for rapid obsolescence. The problem was recognized decades ago and addressed with particular energy and vision by the research library community, members of which joined together in an aggressive campaign to advocate the production of alkaline paper, to develop mass deacidification technologies that would halt the process of natural degradation, and to capture on film millions of pages of text imperiled by the already-brittle paper on which it was printed. Because the threat of the “slow fires” of acid paper was widely perceived to be a crisis that could not wait to be addressed, it received urgent attention and financial resources at a national level. Funding from Federal agencies such as the National Endowment for the Humanities, the Library of Congress, and the National Library of Medicine and support from generous private sources such as The Andrew W. Mellon Foundation and the William and Flora Hewlett Foundation were directed to countering the acidic fires through a massive filming program to rescue at least the intellectual content of these materials. Other problems with the books, judged to be less time-sensitive, were left to receive their due attention at a later time.
A growing number of scholarly and library associations are talking and writing about the need to preserve not only the information in a book, as filming does, but the book itself—an artifact of the 19th century.
That time has now come, as is signaled by the growing number of scholarly and library associations talking and writing about the need to preserve not only the information in a book, as filming does, but the book itself—an artifact of the 19th century. The program to film brittle books has worked within the stringent premise that preservation must focus, through reformatting, on capturing textual information, not the contextual information about the creation and recording of the information that inheres in the physical volume itself. That was assumed to be beyond rescue. The strategy to rescue information destined to perish along with the fragile medium on which it was carried necessitated a decision-making process that divorced information from artifact. And yet, for the large store of library materials that may be fragile but not yet highly embrittled, there can—and should—be a way to make decisions about preservation treatment that take into account the research potential of the book or serial not just as “content provider” but as artifact.
The millions of volumes of 19th-century printed matter that may warrant being preserved for their artifactual as well as their informational value are not to be confused with rare books, of course, and it is not being proposed that all books printed before 1901 be segregated in rare-book stacks across the country, as books printed before 1801 are segregated. Nineteenth-century books, often referred to in shorthand as “medium-rare,” do need additional protection during service in the reading room, in the manner of rare books. But they should not be rebound into buckram, sturdy but aesthetically numbing. Nor, if they are reformatted, should they automatically be removed from circulation. In 1989, in a paper entitled On the Preservation of Books and Documents in Original Form written for the Commission on Preservation and Access, Barclay Ogden, preservation specialist at UC Berkeley, recommended one simple solution to selection: “The vast majority of artifacts could be preserved without treatment and at low cost through preservation measures to reduce their rates of deterioration and wear, thereby extending their lives and minimizing the number of artifacts in need of treatment at any one time.” This eminently practical approach to preserving 19th-century resources has the great virtue of recognizing the responsibility of each library to care for its own collections. It makes all decisions about treatment local.
On the other hand, what groups as diverse as the Modern Language Association and the American Library Association’s Rare Books and Manuscripts Section are calling for is not a strategy for preserving collections locally but a plan to do so nationally. The plan would be coordinated and, presumably, funded at a level that would reflect the groups’ conviction that the book legacy of the last century is a national asset that must be preserved nationally, in much the same way as saving the information threatened by acid paper was treated as a national preservation problem. Such a plan would call for consensus between scholars and librarians about the essential elements of the 19th-century book that need to be documented and preserved.
To develop a national plan for collaboration in ensuring long-term access to the printed records of the last century, four questions must be answered. The integrity and the feasibility of the plan will depend on who provides the answers and how the process of securing them is managed:
- What are the essential elements, both intellectual and physical, of the 19th-century book that must be documented and preserved?
- How do we document them, and what standards need to be developed or ratified for the uniform recording of the elements?
- How do we record the preservation of an artifact and make that record available nationally?
- How do we decide who is responsible for documenting and preserving the artifacts?
Once the first question is answered, and the intellectual and physical criteria for preservation selection are defined, how do we build the infrastructure, which is both bibliographical and political, that will allow individual institutions to preserve their collections for the nation and not only for their local patrons? How would libraries make the preserved artifacts accessible nationally? They cannot provide a surrogate because the surrogate would not do for the type of research that requires the artifact—and the book itself is presumably too fragile or rare to let out on interlibrary loan. What kind of access could libraries offer to researchers, other than the usual on-site consultations that currently exist?
This issue might not be germane from a service point of view, because a scholar who needs to consult a unique source is likely to assume the burden of traveling to the repository where it is housed. But the question of accessibility may well have consequences for funding artifactual preservation. The element that has made the Federal Government so willing a partner with libraries in preservation reformatting is the new accessibility of the filmed items: the endangered collections are not only preserved but made accessible through easy copying of the microfilm. While NEH does make grants for the preservation of special collections, it does so through a limited program—which is also, at present, the only Federal program that dedicates funds to saving unique research items.
“What is now essential if the 19th century is to have a smooth transition into the 21st is that scholars and librarians work together to solve these problems and make the difficult choices that preservation always entails.”
The collaboration has begun, in advance of the millennium, and promises to bestow a long life on what our 19th-century forebears created.
DLF Looks Beyond the Technical Issues
–by Donald J. Waters
THE FUNDAMENTAL GOAL of the Digital Library Federation is to create the conditions under which distributed digital libraries can be federated. DLF partners want to integrate digital materials held in repositories at many different institutions, so that the virtual whole created from the materials is far more than the sum of its parts. To this rich and diverse whole, each partner in the DLF will contribute resources of various kinds: locally born digital materials, including databases, articles, monographs, reports, and course materials; materials digitized from other formats; and digitized materials that an institution has licensed from elsewhere and wants to make available to other institutions.
The successful federation of these categories of materials—across institutions—requires the DLF to confront a host of difficult technical issues. What systems components are needed? What is the most efficient scanning process for converted materials? What file formats should be used? What are the appropriate user interfaces? How can the persistence of data be assured?
But the technical issues, for all their complexity, may prove less challenging than the issues of policy that govern access to the digital materials and establish procedures for their distribution. An understanding of the significant policy work that needs to be done at senior administrative levels of DLF universities might usefully begin with discussion of two issues in particular: institutional copyright policies and interinstitutional licensing policies.
Institutional Copyright Policies. At universities, faculty members generally hold the copyright to their work, and upon publication they turn the copyright over to a publisher. As the costs of journal publications, particularly in the sciences, have soared for libraries and universities, speculation has turned to how changes in faculty ownership of copyright might alter patterns of scholarly communication. The new arrangements would favor libraries and institutions of higher education rather than commercial publishers. The speculation depends, in part, on an assumption that faculty members (at universities such as those in the DLF) would not simply give away the rights to the scholarship they produce. Rather, by reserving the rights, particularly to the digital versions of their work, they would open the possibility of new forms of distribution, including those forms that might be leveraged by investment in federated digital libraries.
There are many factors at play in faculty members’ decisions about how to handle the rights to their scholarly output. Prominent among them are the administrative policies in effect at their universities. For some years now, universities have asserted interest in patents that faculty members have generated. But they have not laid claim to copyright. At the DLF Steering Committee meeting in June, about half the members reported that their institutions have recently reviewed their policies on faculty copyrights or appointed high-level policy committees, which include faculty members, to do so. What changes are occurring or being considered? What opportunities do these new policies offer, or should they offer, for libraries to improve the system of scholarly communication by managing work created locally?
“Some institutions are beginning to assert an interest in copyright much as they do in patents, and, by all accounts, change so directed promises to meet with resistance from faculty members.”
According to the reports given at the Steering Committee meeting, some institutions have decided, after reviewing the matter, to make few, if any, changes at this time. Others, however, are considering substantial changes in one of two directions. Some institutions are beginning to assert an interest in copyright much as they do in patents, and, by all accounts, change so directed promises to meet with resistance from faculty members. In other cases, institutions are urging faculty members, through various proposed mechanisms, to recognize different kinds of rights in their works, to grant publishers limited licenses to distribute them, and otherwise to retain the rights.
Policy change in either direction—toward patent-like control of copyright or toward vigorous support for faculty members’ retention of copyright—creates an administrative burden for the institution, which must engage in copyright management. It also creates service opportunities for organizations like libraries to assist in managing copyright and in reshaping the patterns of dissemination of the works under copyright.
Models of Interinstitutional Licensing. Policies about copyright in institutions of research and higher education may also create significant opportunities for libraries to manage digital materials born locally and “federate” them with materials at other institutions. On the other hand, such policies might forestall opportunities and place them beyond the bounds of what the institution has decided is the proper scope of its business. In the short term, the results are likely to be mixed: institutions will experiment with a range of policy options to find out which work for them and which do not. In this experimental environment, library directors can contribute substantially to the larger institutional policy discussions by sorting out and articulating with clarity—and in terms of library policy—their own values about how to manage and federate the digital intellectual property for which they are responsible.
Thus, if the library directors were to consider, for example, only the materials they have digitized from other formats, they would immediately face a string of important questions: What are the service objectives in providing the digitized property? Who are the users, and how are they known? What quality of service should be provided, and how is quality of service defined? How are the intellectual property and the value of an institution’s investment in it being protected? What is the value of the investment? How are institutions selecting property for digitization? How are they financing both the initial investment and the ongoing operations it generates?
As libraries become more proficient in the technical mechanics of creating and delivering digital information, these broader service questions will become more insistent and demand attention. A recent article in the Chronicle of Higher Education (May 22, 1998, p. A27) suggests that one set of answers to the questions will move digital libraries in an explicitly commercial direction—to act as publishers of converted materials and to offer at least some of those materials for sale. But a commercial approach will be neither the only solution nor necessarily the best. Indeed, at this stage in the development of digital libraries, perhaps the wisest approach is to offer no certain answers to the questions but rather to ensure that they are framed properly, within an environment where staff and other organizational resources are deployed to explore the full range of responses.
Within the context of the DLF’s objective to bring together resources contributed by various partners, institutions need to pose these questions as a matter of providing interinstitutional services. What practical modes of operation are best suited to that context of service? One relatively familiar approach may prove especially useful. In recent years, as libraries have invested substantial sums in electronic journals, they have worked with publishers in a contractual environment. There is a distinct commercial component to these arrangements: the contracts serve to set prices for licensed access to journal content. But in many respects the pricing component is the least important part of the contracts. Their more significant feature is that they provide a common ground from which the parties can address the new agenda of service issues associated with the provision of digital information. Contractual arrangements set the terms and the conditions for taking on the issues.
Because the contractual experience of libraries and publishers establishes an environment for dealing with critical service issues, it has a good deal of relevance for digital libraries in their role as providers of converted information (and, potentially, of locally born and remarketed digital materials). In June, the DLF Steering Committee began to debate whether libraries, archives, and special collections should adopt, as a matter of policy, the use of licenses to make their digital materials available across interinstitutional boundaries. The partners next need to model the conditions under which interinstitutional licensing seems appropriate, and then decide how the DLF might illustrate and test the models.
DLF Adds Research Associate
REBECCA GRAHAM WILL join the CLIR staff as a Research Associate for the Digital Library Federation (DLF) on October 1. Graham comes to the DLF from the University of Illinois at Urbana-Champaign, where she was the Manager of Integrated Systems in the Library Systems Office. Previously, she held positions as a help-desk manager for Navistar International Corporation, as a senior computer operator for Wright State University, and as the Distributed Systems Manager for the Johnson County Library in Kansas.
Graham took a Bachelor of Science degree in Organizational Management at Wilberforce University and has just received her Master of Science in Library and Information Science from the University of Illinois at Urbana-Champaign.
At CLIR, Graham will develop and manage the DLF Web space, work with staff members of the DLF partner institutions to communicate about digital library projects and developments, assist in organizing and conducting Federation-sponsored projects, and represent the Federation to various external organizations.
We Can’t Save Everything
–by Deanna B. Marcum
This essay appeared on the OP-ED page of The New York Times on July 6, 1998
You’d think this would be a great time to be in the business of preserving history. Cyberspace is not real estate, and digital information takes up no room. Unlike books, which fill hundreds of miles of shelves in the Library of Congress, or reels of film, which require expensive refrigerated vaults, bits and bytes can record staggering amounts of data on a single computer hard drive.
So it may seem that, at little or no cost, we should now be able to save everything—every report, receipt, E-mail message, Web page. Unfortunately, it’s not that simple. In fact, technology is turning out to be as much foe as friend. The amount of information we create on our computers is growing exponentially, leaving us with a quantity of data that humans have never coped with before.
All this has presented librarians, archivists, and others who preserve history with difficult challenges and choices. What do we preserve? How do we preserve it? At what cost?
There is no doubt that much of the information our society gathers is critical to science, medicine, and our economic prosperity. Government agencies, colleges, and foundations produce thousands of studies a year on topics large and small. Satellites continuously beam information to earth about our weather, our national defense, our cosmos.
Individuals, too, are amassing information in amounts that would have been unwieldy in the print-on-paper world. An office worker might create dozens of electronic documents, E-mail messages, and databases every day, and then go home, log onto a personal computer, and create even more.
Saving electronic documents seems like it should be as easy as creating them. That perception recently helped a group of historians out-argue the National Archivist in court. Three years ago the Archivist, John W. Carlin, advised Government agencies that they could delete certain computer files if they kept paper copies. Historians, however, argued that the records should be retained in electronic form (which would show, for instance, who handled each document), and in April a judge agreed, at least pending the findings of a study committee.
Looking ahead to when President Clinton leaves office, the Archives estimates that it will receive upward of eight million electronic files. Surely, it seems, these files would be easier to store electronically than as tens of millions of pieces of paper. (The Nixon Administration, for example, left more than 40 million pages to the Archives.)
But electronic storage, too, has its drawbacks. Of course, there is a sense of security in knowing that should an environmental scientist in 2040 need to know what the chemical composition of the rivers of western Colorado was in 1998, she could retrieve that information with a few computer commands.
But the preservation of data over decades and centuries demands not only a huge financial investment in computer hardware and software, but also the institutional commitment to maintaining those systems. Try using Fortran, CP/M, or WordPerfect 2.0 data on a new computer. Storage on a disk may well be more costly than storage in vaults and on shelves.
Thus, the paradox in all this abundance is that the easier it is to create and store information, the harder that information is to manage, and the greater is the threat that we will not be able to find something when we need it. There is simply too much to sort through.
Experts in and outside of Government are still looking for the best solution to the problem. But clearly one thing that is not the solution is to save every tidbit of data generated.
Information is useful only if it can be easily found and retrieved. Anyone who has gone to an Internet search engine with a real if imprecisely worded query and gotten thousands of “hits” in response knows that too much information is as bad as none at all.
In contrast, consider our much-envied research libraries—Harvard, Stanford, the University of Illinois, and others. Library collections are useful to scholars, students, and the public because they were shaped by men and women who used their critical judgment to select items that would be of value. They did not just hit the “save” button.
Archivists and record managers at those same institutions, working with a different mission from the librarians, have stored large numbers of documents that were required for limited periods of time, usually as evidence. (Think of your own medical records, or your bank’s financial records.) These materials have been gathered and stored with the view that eventually most could be discarded.
“Save everything” is not now the rule in Government either. At most, only 5 percent of all Government records are selected for permanent retention. Those who decide what to keep are professional archivists who work under legally sanctioned guidelines. These guidelines are periodically reviewed to prevent partisan opinions from compromising the intellectual audit trail of state and Federal governments.
Librarians and archivists know that a good collection, like a good book, is made in the editing. Individuals don’t save every scrap of paper in their files—tax records, restaurant receipts, drycleaning tickets—and deed them to their descendants with the injunction that they be kept forever. That would be irresponsible, self-centered, and lazy.
So, too, as a nation where full citizenship is based on free and unfettered access to public information, we must take responsibility in shaping that legacy. As James Madison wrote: “A popular Government, without popular information, or the means of acquiring it, is but a prologue to a Farce or a Tragedy, or, perhaps, both. Knowledge will forever govern ignorance.”
Madison, of course, was worried about censorship. In our age, when too much information can numb the mind and paralyze the will, knowledge may develop a new sort of elusiveness.
CLIR is pleased to announce the following new publications:
The Mirage of Continuity: Reconfiguring Academic Information Resources for the 21st Century.
Edited by Brian L. Hawkins and Patricia Battin. $25.
This book of essays, which CLIR is publishing in partnership with the Association of American Universities, comes to grips with the profound, and indeed transforming, changes technology will effect in how the nation’s university campuses provide information resources in the 21st century. Hawkins is the new president of EDUCAUSE; Battin served as Vice President for Information Sciences and University Librarian of Columbia University. The additional contributors to the volume are John Seely Brown, Stanley Chodorow, Paul Duguid, Douglas Greenberg, José-Marie Griffiths, Susan Hockey, Richard N. Katz, Donald Kennedy, Michael E. Lesk, Paula Kaufman, Peter Lyman, Deanna B. Marcum, Susan Rosenblatt, Donald J. Waters, and Samuel R. Williamson.
Computerization of the Archivo General de Indias: Strategies and Results.
By Pedro González. $20.
This report, written by the former director of the Archivo General de Indias in Seville, is an account of how the Archives dealt with the myriad technical, organizational, and managerial challenges that were presented when it undertook to digitize more than eleven million pages of documents about the Spanish presence in the New World.
Selecting Research Collections for Digitization.
By Dan Hazen, Jeffrey Horrell, and Jan Merrill-Oldham. $15.
Three Harvard University librarians serve as guides through the practical questions that need to be asked and answered before research libraries commit to digital conversion projects.
Each of the above publications may be ordered from CLIR for the price shown, postage and handling included. Orders must be prepaid by check made out to CLIR.
IFLA Principles for the Care and Handling of Library Material.
Compiled by Edward P. Adcock, with Marie-Thérèse Varlamoff and Virginie Kremp.
This introduction to the care and handling of library materials for individuals and institutions with little or no preservation knowledge is being copublished by IFLA and CLIR. PAC regional offices will distribute the publication to IFLA members. Others wishing a copy should write the International Centre at the Bibliothèque nationale de France, IFLA PAC, 2, rue Vivienne, 75084 Paris cedex 02 France.
Council on Library and Information Resources
1755 Massachusetts Avenue, N.W. Suite 500
The four current programs of CLIR are the Commission on Preservation and Access, Digital Libraries, the Economics of Information, and Leadership.