Number 16
July/August 2000
Contents
A New Approach to Finding Research Materials on the Web
Library Services for the Future: CLIR Symposium
by Deanna B. Marcum
Preventive Measures: All the Print That’s Fit to Save
by Abby Smith
- Anne Kenney to Join CLIR
- CLIR and DLF Initiate Program for Distinguished Fellows
- A. R. Zipf Fellowship Awarded to Rich Gazan of UCLA
- Forthcoming Publications
Authenticity in the Digital Environment
A New Approach to Finding Research Materials on the Web
WIDESPREAD NETWORK ACCESS to digital resources has created a paradox for the academic and research community. Despite enormous institutional investment in the creation and description of materials of serious interest to research and education, these resources exist in isolated pockets. They are difficult to find and impossible to search across. Meanwhile, students and faculty are tempted into over-reliance on commercial Internet search engines, despite their limitations and the uneven quality of the materials they include. In terms of providing convenient network access to information resources of value, neither the traditional library approach nor the emergent Internet approach is serving the academic community well.
The traditional library approach has relied upon the creation of descriptive metadata to give users search access to materials. To make researchers aware of the contents of dispersed and uncoordinated collections, union catalogs that reflect the holdings of many libraries have been created. These systems, however, require participants to actively contribute records to the union database. Moreover, union catalogs are designed around a single type of metadata, which limits the catalogs to materials that can be described by such metadata. Commercial abstracting and indexing services have developed separate systems for access to the journal literature, leading to a disjuncture between the search for monographs and the search for journal articles. The emergence in the 1990s of new metadata formats for everything from archival finding aids to social science data sets has led to an increasing balkanization of search-and-retrieval systems.
Some libraries have taken an alternative approach, using a standard search-and-retrieval protocol, Z39.50, to perform broadcast searching of physically separate databases. While this approach has the advantage of offering virtual union search capability across repositories with differing underlying data formats, it presents problems of its own. The data providers must support complex Z39.50 server software, and considerable coordination is required to set up workable profiles. Z39.50 search also works best across a limited number of services; it does not scale to the thousands of potential sources of digital content.
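To make the scaling problem concrete, here is a minimal sketch, in Python, of the broadcast-search pattern. It is purely illustrative: real Z39.50 clients must negotiate sessions, record syntaxes, and attribute profiles, so each repository search is simulated here with a random delay, and all repository names are invented. Even in this toy form the weakness shows: the user either waits for the slowest server or silently loses its results, and every added repository raises the odds that one is down or slow.

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor, wait

REPOSITORIES = [f"repo-{i:02d}" for i in range(20)]  # hypothetical targets

def search_repository(repo: str, query: str) -> list[str]:
    """Stand-in for a Z39.50 search; real servers vary widely in latency."""
    time.sleep(random.uniform(0.1, 3.0))  # simulated network and server time
    return [f"{repo}: hit for '{query}'"]

def broadcast_search(query: str, timeout: float = 2.0) -> list[str]:
    """Fan the query out to every repository; merge whatever returns in time."""
    with ThreadPoolExecutor(max_workers=len(REPOSITORIES)) as pool:
        futures = [pool.submit(search_repository, r, query) for r in REPOSITORIES]
        done, not_done = wait(futures, timeout=timeout)
        results = [hit for f in done for hit in f.result()]
        # Stragglers simply vanish from the result set the user sees.
        print(f"{len(results)} results; {len(not_done)} repositories too slow")
    return results

broadcast_search("digital preservation")
```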
Internet search engines, in contrast, have generally relied upon the automatic indexing of HTML text rather than the creation of metadata, and upon the automatic harvesting of Web sites rather than active contributions from data providers. Consequently, these services scale in a way that traditional library approaches do not, and they allow for the creation of massive indexes that dwarf the largest library union catalog. However, Internet search engines cover academic and scholarly materials poorly, burying them in quantities of less reliable resources provided by the commercial sector or by unknown and uncredentialed individuals and organizations. Further, the search engines cover only a portion of “Web space” (as little as 17 percent, according to a recent study) and frequently rank retrieved resources according to their own business considerations rather than the needs of searchers. They are susceptible to page-jacking, index spamming, and other dubious practices. Moreover, an enormous proportion of scholarly materials, from digitized slides to survey data, is described not on static Web pages but in myriad databases; consequently, such materials are largely invisible to the search engines.
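The indexing half of that model is simple enough to sketch. The fragment below is a toy illustration, not any actual engine's code; the example.edu and example.com pages are invented. It strips tags from raw HTML and builds an inverted index, and in doing so shows both failings at once: the scholarly and the commercial page come back undifferentiated, and database-backed content is never indexed.

```python
import re
from collections import defaultdict

def index_page(url: str, html: str, inverted: dict[str, set[str]]) -> None:
    """Crude full-text indexing: strip tags, then map each word to the
    pages containing it. No human-created metadata is involved."""
    text = re.sub(r"<[^>]+>", " ", html)
    for word in re.findall(r"[a-z]+", text.lower()):
        inverted[word].add(url)

inverted: dict[str, set[str]] = defaultdict(set)
index_page("http://example.edu/slides",
           "<html><body>Digitized slide archive of Roman mosaics</body></html>",
           inverted)
index_page("http://example.com/shop",
           "<html><body>Buy discount slide projectors here!</body></html>",
           inverted)

# The scholarly page and the sales page come back jumbled together, and a
# slide collection served out of a database would never be crawled at all.
print(sorted(inverted["slide"]))
```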
A recent and promising approach attempts to combine the best of library and Internet techniques into a wholly new model for accessing scholarly resources. This model has its genesis in the October 1999 meeting of the Open Archives Initiative in Santa Fe, New Mexico, held under the sponsorship of CLIR, the Digital Library Federation (DLF), the Scholarly Publishing and Academic Resources Coalition, the Association of Research Libraries, and the Research Library of the Los Alamos National Laboratory. The group discussed the interoperation of “e-print archives” (collections of electronic journal articles and preprints), focusing on how e-print repositories could most easily share metadata about their holdings. The group decided to pursue an approach adapted from the harvesting technique employed by the Internet search engines. In this approach, there are data providers and service providers. A data provider agrees to support a simple harvesting protocol and to provide extracts of its metadata in a common, minimal-level format in response to harvest requests. It then records information about its collection in a shared registry. A service provider uses this registry to locate participating data providers, and uses the harvest protocol to collect metadata from them. The service provider is then able to build intellectually useful services, such as catalogs and portals to materials distributed across multiple e-print sites. The principals of the Open Archives Initiative remain committed to refining these conventions and to building testbed implementations of the model.
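Because these conventions are still being refined, the following Python sketch is a generic illustration of the data-provider/service-provider flow rather than the actual protocol: the registry entries, request parameters, and metadata element names are all assumptions made for the example.

```python
import urllib.request
import xml.etree.ElementTree as ET

# A shared registry tells service providers which data providers exist.
# Both URLs and the XML element names below are invented for illustration.
REGISTRY = [
    "http://eprints.example.edu/harvest",
    "http://archive.example.org/harvest",
]

def harvest(base_url: str, since: str) -> list[dict[str, str]]:
    """Data-provider side of the bargain: answer a simple harvest request
    with minimal-level metadata for records changed since a given date."""
    url = f"{base_url}?verb=ListRecords&from={since}"
    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)
    return [
        {"identifier": record.findtext("identifier", ""),
         "title": record.findtext("title", "")}
        for record in tree.iter("record")
    ]

def build_union_index(since: str) -> list[dict[str, str]]:
    """Service-provider side: harvest every registered provider and merge
    the metadata into one index over which catalogs or portals are built."""
    index: list[dict[str, str]] = []
    for provider in REGISTRY:
        index.extend(harvest(provider, since))  # periodic, largely unattended
    return index

# Usage (against real servers): build_union_index("2000-07-01")
```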
Under the aegis of the DLF and with support from The Andrew W. Mellon Foundation, a related group has been building on this promising foundation by discussing how to generalize the concepts developed in Santa Fe into a universal model for research metadata harvesting. This model would apply to a wider range of digital resources of academic and scholarly interest. In addition to e-prints and electronic texts, such resources include science and social science data sets, visual materials, archival collections, geographic information system (GIS) data, sound and music, video, and any other type of resource for which metadata is typically created.
This approach appears to have many virtues. The effort required for a data provider to support metadata harvesting is modest compared with that required to contribute to a union catalog or to provide Z39.50 search access. After the repository is set up, its metadata can be harvested with relatively little intervention, providing the scalability that library approaches have lacked. The development of both inclusive and specialized search services is possible; in fact, the model encourages the development of many search services competing in terms of functionality, audience, and business models, thereby enriching the entire research environment.
This model could be used to expose the metadata in thousands of individual systems worldwide to central collection. For example, comprehensive collections of Americana or GIS data could be developed. This would make local repositories more generally known, and more generally useful, because researchers could search across previously unconnected materials. It would also illuminate the “dark matter” of the Internet—material that is hard or impossible to find if the user does not already know where it exists.
Most important, the academic community could begin to ensure the development of services that express its values—services that center on materials of research and educational interest, that provide honest and transparent ranking and retrieval, and that improve search quality by intelligently integrating metadata.
The services that could be developed under the expanded Santa Fe framework are limited only by need and ingenuity. The following are among those discussed:
- A portal to digital Americana. Many universities, archives, historical societies, cultural institutions, and other organizations are creating Web-accessible collections of Americana, often with grant funds. Currently, these materials remain largely invisible to educators and scholars. A service focusing on harvested metadata for Americana might combine access to archival visual and textual collections, such as those included in American Memory, with citations to electronic journal articles from JSTOR, early American fiction from the University of Virginia, H. H. Richardson architectural drawings from Harvard, the Hoagy Carmichael collection at Indiana University, Hawaiian language newspapers from the University of Hawaii, and audio, visual, textual, and multimedia materials from hundreds of relevant sites.
- A portal to environmental information. Environmental information is collected by hundreds of international, federal, state, and private agencies, and described using dozens of metadata formats. This information is used intensively by government and university researchers, despite the difficulty of finding data scattered among such a vast number of sites. A portal built upon harvested metadata could combine access to land, air, and space data from key government agencies with access to white papers, treaties, policy documents, journals, newsletters, and other relevant sources of environmental information. An even more ambitious service might combine search access to environmental information with geographic information resources such as those indexed by the Illinois Natural Resources Geospatial Data Clearinghouse, the University of Nevada Geospatial Data Clearinghouse, the NYS (New York State) Spatial Data Clearinghouse, and other regional clearinghouses of geospatial information.
- The academic engine. Despite the availability of library catalogs, online journal search services, and departmental databases, many university students and researchers turn first to the major commercial Internet search engines for resource discovery. A comprehensive Internet search service oriented toward academic and research resources would be a more productive alternative. Such a service might include all the information covered in more specialized portals (e.g., Americana, environmental information, GIS), as well as metadata from academic catalogs and databases, Web pages in the “.edu” domain, and commercial resources aimed at the research community.
It may not be a huge undertaking to move this vision to reality. The next steps are to formalize the framework, to establish mechanisms to encourage research collections to make their metadata available, and to encourage service providers to build useful tools based upon the harvesting of this metadata. The following areas must be addressed:
- extending the general framework to encompass tools, business models, and project coordination;
- formalizing the governance structures for maintaining, documenting, and promoting the technical framework;
- creating a registry of high-quality research sites with harvestable metadata; and
- defining a set of demonstration projects to build a few catalogs, portals, and other services of interest to the research community, thereby testing the validity of the model.
Recent statements from the library community have urged research libraries to step up to the creation of a scholarly information commons—a networked space where users readily and seamlessly traverse the collected wealth of our disparate educational and cultural collections. A workable model for research metadata harvesting may provide the infrastructure needed for approaching this task. To gain community input to this initiative, the Digital Library Federation and other involved parties will issue additional discussion papers and sponsor meetings exploring these ideas over the next several months. For further information, contact the DLF at info@clir.org.
Library Services for the Future: CLIR Symposium
ONE HUNDRED TWENTY CLIR sponsors gathered at the Washington Hilton Hotel on May 5 for a symposium on Library Services for the Future. The session began with four speakers, two of whom have directed CLIR-funded projects and two of whom are CLIR staff members managing major projects.
Wendy Lougee, assistant director for digital library initiatives at the University of Michigan, and Max Marmor, director of the Arts Library at Yale University, described projects in which libraries have used digital technology to push the boundaries of service.
Ms. Lougee reported on the PEAK (Pricing Electronic Access to Knowledge) project, which investigated the pricing, cost, and usage of electronic publications. The three-year study, undertaken with support from CLIR, gave 12 university and research libraries a choice of several pricing options for access to all 1,200 Elsevier Science journals. The PEAK models included a new construct, the “generalized subscription,” whereby an institution prepurchases a bundle of articles rather than title-specific subscriptions, and its users’ actual choices determine which articles make up the “subscription.”
Mr. Marmor portrayed image libraries as being at a crossroads. “We are confronted with a decision that will fundamentally shape the digital image library of the future,” he said. “We will either take the path of least resistance, in which we continue to create useful but redundant image collections of dubious provenance and legitimacy and . . . conceal them from one another behind firewalls . . . or we will follow the road less taken.” Taking the less conventional path means summoning the considerable effort needed to create the organizational structures that favor the development of legitimate, shared digital image collections and respond to the needs of students, teachers, and scholars—that is, the development of demand-driven digital libraries.
Daniel Greenstein, director of the Digital Library Federation (DLF), encouraged the audience to consider the sizable challenges facing digital libraries. Since even institutions with robust digital library initiatives have limited research and development capacity, there is every reason for such institutions to share information on research, development, and implementation practices. Institutional efforts are relatively small and scattered; federated activity allows participating institutions to develop sustainable, scalable, and innovative digital collections. The DLF aims to learn more about use in an online environment and about dealing with users not only as patrons but also as content and systems suppliers. Preservation continues to be a major concern, and DLF institutions are eager to move digital preservation into a phase of applied, practical work. Mr. Greenstein also noted the need to connect digital libraries to innovative business models.
Abby Smith, CLIR’s director of programs, outlined the work of the Task Force on the Role of the Artifact in Libraries. The task force is considering the circumstances under which scholars must have access to the original artifact for their research. The task force, comprising scholars from several disciplines, two librarians, and an archivist, has divided itself into three working groups that will consider paper-based artifacts, audiovisual materials, and digital materials. Ms. Smith said that this work has become especially important as libraries increasingly offer digital surrogates, rather than original materials, to patrons. In the digital world, librarians are called upon to consider, more carefully than ever before, what it means to build a collection for scholarly use in the future.
The afternoon session of the symposium featured talks by representatives from three funding agencies. Richard Ekman, vice president for programs of the Atlantic Philanthropic Service Agency, challenged the audience to consider a system of scholarly communication that retains important features of traditional academic publishing: high-quality content, peer review, and wide distribution of the information. He also noted that reducing costs will require streamlining operations and taking full advantage of the technology.
Donald Waters, program officer for scholarly communication at The Andrew W. Mellon Foundation, imagined the future of library and information services. The future will not be utopian, he said. Instead, it will be “undisciplined, polymorphous, and polyglot.” But such a future includes emerging technologies for creating and disseminating “works of lasting value” by a diverse set of cultural institutions and groups. Universities, scholars, scholarly societies, publishers, museums, libraries, and archives will all be involved in creating these works. This will compel them to ask serious questions about what core services they retain, as well as what investments should be made to secure their future.
Richard Akeroyd, executive director for libraries and public access to information at the Bill and Melinda Gates Foundation, moved the discussion to a broader consideration of the role of the library in society. The Foundation’s library initiatives, charged primarily with improving the capability of public libraries to meet their patrons’ growing information needs, have focused on bringing more computing and networking capacity to disadvantaged areas of the United States and, increasingly, to other countries. Mr. Akeroyd emphasized the importance of a national information infrastructure that links public libraries with research institutions.
The sponsors’ symposium offered glimpses into the future of library services. Embedded in every discussion was this question: How must libraries transform themselves to play an important role in the future?
At the close of the day, CLIR invited sponsors to identify topics that should be included in subsequent sessions. At the top of the list was a consideration of the organizational changes that are necessary in academic libraries. Faculty members expect all of the traditional services and an expanding paper-based book and journal collection. At the same time, electronic services are making new demands on library staff. Instead of being acquisitions specialists, librarians must become licensing experts. The traditional emphasis on collection holdings is shifting to an emphasis on portals for access. Increasingly, libraries are viewed as entities that offer a range of mediated and direct services rather than as facilities that occupy a physical place. New economic models are needed for the new environment, to be sure, but it is even more important for libraries to decide what they will become when students and faculty no longer must visit them, and to reengineer the organization so that it has a future.
Librarians’ most courageous acts will be to work collaboratively with information technologists, scholars, staff of other information agencies, and users to invent a new information system. CLIR’s agenda is devoted to helping them move effectively into this new environment.
Preventive Measures: All the Print That’s Fit to Save
TEN YEARS AGO, few American research libraries were actively engaged in mass deacidification of materials at risk of embrittlement. The options for treating textual, geographic, and visual resources printed on high-acid paper were either to rescue titles already embrittled by reformatting them onto archival quality film, or to deacidify individual leaves of rare and valuable items. The only large-scale preventive measure was to improve the storage conditions for collections in an attempt to slow the inevitable decay of wood-pulp paper.
Today, there are several safe and effective technologies for mass deacidification. The Bookkeeper system is used in the United States (by the Library of Congress and others), Canada, and the Netherlands; Battelle is used in Germany and Switzerland; and a modification of the Wei T’o system is used in Canada and the United States. Despite the advantages of these technologies, however, they are not as widely used as one might expect. It is time for deacidification to be integrated into the standard set of preservation treatments aimed at stabilization, such as binding and regular repairs. Most books can be deacidified for between $12 and $16, depending on their size.
Selection of suitable items for treatment is critical to an effective deployment of deacidification. The process does not reverse embrittlement or strengthen paper. It is not appropriate for materials that are already badly deteriorated. However, it is highly effective for stabilizing items on acid paper that are structurally sound—a condition that describes a large percentage of retrospective collections and, of course, 100 percent of new titles that are not printed on “permanent” paper. Surprisingly, on this continent there are still learned societies that publish their journals on acidic paper and major publishers that produce monographs on unbuffered pulp paper. Abroad, the picture is also mixed. China, the world’s most prolific publishing nation, prints almost exclusively on acidic paper, and a number of European and Latin American firms do as well. Any research library that collects foreign literature should seriously consider a deacidification program—of both retrospective and current titles—to ensure that its holdings are accessible long into the future.
Other titles appropriate for deacidification are so-called “medium rare” books, that is, post-1800 imprints that have artifactual value because of their binding, notations, illustrations, or relative scarcity. To reduce mishandling, many libraries are pulling medium rare books from their general stacks and serving them under controlled conditions. Buffering the paper to neutralize the acid should also be made a priority. Some libraries, especially those collaborating in secondary storage, are deacidifying many of the titles being sent off-site.
Serious interest in mass deacidification is growing. The Library of Congress, which has been scaling up its deacidification program since the mid-1990s, has declared that its efforts will focus first on Americana. A growing number of eastern and midwestern libraries are also routinely deacidifying parts of their holdings. In November 1999, representatives of United Kingdom and Irish repositories decided to initiate a feasibility study to assess all the issues surrounding deacidification and to recommend future action. The U.S. National Archives Annual Preservation Conference this spring focused exclusively on deacidification. It attracted more than 200 people, including scientific experts, practitioners from libraries, museums, and archives, vendors, and preservation managers. The European Commission on Preservation and Access and the State Archives of Lower Saxony are organizing a meeting this fall to examine how mass deacidification can be successfully integrated into the repertoire of preservation treatments in major cultural repositories.
As more institutions decide to deacidify their collections, one of the issues that must be considered is what role such local decisions play in the aggregate. Unlike preservation reformatting onto film, which was conceived of and planned as a rescue effort for information trapped on already-decayed paper, deacidification is a preventive measure. Moreover, it does not produce a surrogate, such as a microform, that can be easily duplicated for access. One book deacidified is one book made safe for local use and, often enough, for interlibrary loan as well. In selecting items for treatment, should a library first check whether the title has already been treated by another library? How many libraries that deacidify items record that treatment in the bibliographic record? Should a library select on the basis of collection strengths, regardless of current use, or should it treat only those items that are most heavily used? So far, selection criteria for deacidification vary among research libraries, as does the practice of recording treatment in the record. As more libraries begin to deacidify, it is time for a thoughtful assessment of the implications of these different approaches.
ANNOUNCEMENTS
Anne Kenney to Join CLIR
ANNE R. KENNEY, codirector of the Cornell Institute for Digital Collections and associate director of the Department of Preservation and Conservation at Cornell University Library, will join the CLIR staff September 1, 2000. Ms. Kenney will be based in Ithaca, New York, and will divide her time between work on initiatives at Cornell and at CLIR.
At CLIR, she will initially focus on advancing strategies for the creation of short- and long-term digital archival repositories and on promoting preservation-education initiatives.
Ms. Kenney has worked with the Department of Preservation and Conservation at Cornell since 1987. The author of numerous books and reports, she is also coeditor of RLG DigiNews. Her most recent work, Moving Theory into Practice: Digital Imaging for Libraries and Archives, written with Oya Rieger, was published this spring by the Research Libraries Group.
Photo by Nicola Kountoupes
CLIR and DLF Initiate Program for Distinguished Fellows
CLIR AND THE Digital Library Federation (DLF) are pleased to announce a new opportunity for librarians, archivists, information technologists, and scholars to pursue their professional development and research interests as Distinguished Fellows.
The CLIR/DLF Program for Distinguished Fellows is open to individuals who have achieved a high level of professional distinction in their fields and who are working in areas of interest to CLIR or the DLF. Unlike other fellowship programs that provide support for individual research, the Distinguished Fellows program is aimed at identifying potential partners for the CLIR/DLF agenda.
The fellowships, available for periods of between three and twelve months, are ideal for senior professionals with well-developed personal research agendas who will benefit significantly from time away from their day-to-day responsibilities.
Although Distinguished Fellows will not be required to relocate to Washington, D.C., during their tenure, they will be expected to participate in program planning sessions in Washington and to cooperate with CLIR staff on existing projects, in addition to working on their own projects. For more information on how to apply for the fellowship, see www.clir.org/news/pressrelease/fellows.html.
A. R. Zipf Fellowship Awarded to Rich Gazan of UCLA
THE 2000 A. R. Zipf Fellowship in Information Management has been awarded to Rich Gazan, a Ph.D. student in the Department of Information Studies at the University of California, Los Angeles (UCLA). Mr. Gazan is the fourth recipient of the Zipf Fellowship, which was established in 1997 to recognize a graduate student who shows exceptional promise for leadership and technical achievement in information management.
Mr. Gazan began doctoral study at UCLA in 1999, after earning his master’s degree in Library and Information Science at the University of Hawaii. His research interests include information retrieval, database design, and the information industry, with a particular focus on integrating content from disparate sources.
Forthcoming Publications
GUIDES TO QUALITY in Imaging. Copublished by the Digital Library Federation and the Research Libraries Group (RLG), this Web-based publication provides practical advice on designing and carrying out an imaging project. The authors emphasize that there are few cut-and-dried approaches to planning an imaging project. Five guides pose questions that will help readers define their projects and objectives more fully. Topics include planning a digital library project, selecting a scanner, setting up an imaging system, measuring the quality of digital masters, and selecting file formats for digital masters. The guides will be available on both the CLIR and RLG Web sites in July.
Risk Management of Digital Information: A File Format Investigation. This report is based on an investigation conducted by Cornell University Library to assess the risks to digital file formats during migration. Written by Gregory Lawrence, William Kehoe, Oya Rieger, William Walters, and Anne Kenney of Cornell University Library, the study was carried out with support from CLIR. The report includes a workbook that will help library staff identify potential risks associated with migrating digital information. Each section of the workbook opens with a brief issue summary; this is followed by questions that will guide users in completing a risk assessment. The appendixes also include two case studies for migration: one for image files and the other for numeric files. A summary of the report appears in the June 15 issue of RLG DigiNews. CLIR will publish the full report in late June, at which time the text will also be available on CLIR’s Web site.
Authenticity in the Digital Environment
WHAT IS AN authentic digital object? In the world of print and analog media, we have developed elaborate ways of identifying authentic documents and detecting fakes. In the digital world, we have only begun to do this. Nonetheless, the question has gained importance and urgency as information—from personal correspondence to medical and financial records—is increasingly created, stored, and transmitted electronically.
For humanists and scientists, the question of what constitutes authenticity must be resolved before they can feel confident in creating and relying upon digital information. For custodians of information resources, the question has profound implications for the tasks of cataloging and describing an item, as well as for setting the parameters of what is preserved and by what technique or series of techniques.
In the fall of 1999, CLIR commissioned five experts from different disciplines to write papers addressing the question: What is an authentic digital object and what are the core attributes that, if missing, would render the object something other than what it purports to be? The papers formed the basis of a workshop, held in January 2000, to which CLIR invited representatives from different domains of the information resources community. The papers, and an overview of the key issues discussed, were recently published by CLIR in a report entitled Authenticity in a Digital Environment.
The authors of the papers are Charles Cullen, president and librarian of the Newberry Library; Peter Hirtle, codirector of the Cornell Institute for Digital Collections; David Levy, consultant and former researcher at the Xerox Palo Alto Research Center; Clifford Lynch, executive director of the Coalition for Networked Information; and Jeff Rothenberg, senior computer scientist at The Rand Corporation. A concluding essay by CLIR Director of Programs Abby Smith highlights the responses of workshop participants to the issues raised by the authors and the key themes that emerged.
Defining Authenticity
Authenticity in recorded information connotes precise, yet disparate, things in different contexts. It can mean being original as well as being faithful to an original. It can mean uncorrupted, but it can also mean simply of clear and known provenance, whether corrupted or not. The word has a specific meaning to an archivist and an equally specific, but different, meaning to a rare-book historian, just as there are different criteria for assessing the authenticity of published and unpublished materials. Behind any definition of authenticity lie assumptions about the meaning and significance of content, fixity, consistency of reference, provenance, and context. In the digital environment, creating a common understanding of the multiple meanings and significance of authenticity is critical.
Discussion Underscores Diverse Perspectives
The workshop discussion brought to light how differently various communities define authenticity. Not surprisingly, a community’s understanding of what constitutes an authentic digital object mirrors its understanding of what constitutes an authentic analog object. Most of the workshop participants grounded their thinking about digital objects and their identity in the fitness of these objects for some specified function or purpose, such as a record that bears evidence; a historical source that bears witness to an event, a time, or a life; or data that could produce a replicable experiment. In other words, what was deemed intrinsic to an object was determined by the purpose for which it was created (or, in the case of archival records, the most narrowly defined of the digital objects under discussion, by the purpose of bearing evidence about the record’s creation and intended use). Perhaps for that reason, if no other, neither the presenters nor the workshop participants addressed systematically and directly the question of what an authentic digital object is and what the core attributes are that, if missing, would render the object something other than what it purports to be. However, threaded through the discussion were various responses to other questions that had been posed to the participants:
- If all information—textual, numeric, audio, and visual—exists as a bit stream, what does that imply for the concept of format and its role as an attribute essential to the object?
- Does the concept of an original have meaning in the digital environment?
- What role does provenance play in establishing the authenticity of a digital object?
- What implications for authenticity, if any, are there in the fact that digital objects are contingent on software, hardware, network, and other dependencies?
The discussion that ensued on the topic of provenance gives a flavor of the varying perspectives on each of the questions. The role of provenance is as important in the digital world as in the analog world, if not more so. Archives can provide evidence of authenticity by documenting the chain of transmission and custody, and they have internal controls that reduce to an acceptable level the risk of tampering. They serve as a trusted third party. Beyond the relatively controlled environment of the archives, however, the workshop participants agreed that the role of provenance is far more complicated. Whenever information crosses administrative and technological boundaries, as it does in the world of publishers and libraries, the role of trusted third parties is harder to develop and maintain. The digital environment will still need trusted third parties to store material, and libraries and publishers will need to agree on protocols for digital publishing and preservation that work as effectively as have those of the past.
Interestingly, the scholar-participants suggested that technological solutions to the problem will probably emerge that will obviate the need for trusted third parties. Such solutions may include, for example, embedding texts, documents, images, and the like with various warrants (e.g., time stamps, encryption, digital signatures, and watermarks). The technologists replied with skepticism, saying that there is no technological solution that does not itself involve the transfer of trust to a third party. Technological solutions such as encryption or public key infrastructure are as weak or strong as the trusted third party.
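A concrete sketch may clarify the technologists' point. The example below, a minimal illustration rather than anything prescribed at the workshop, uses the third-party Python package cryptography (an assumption of this sketch) to sign and verify a document with an Ed25519 key. Verification proves that the bits are unchanged since signing; it says nothing about whose key signed them, so trusting the binding between key and signer merely relocates the trusted third party rather than removing it.

```python
# Requires the third-party package "cryptography" (pip install cryptography);
# the package and the Ed25519 scheme are assumptions of this sketch.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

document = b"Final text of the report, June 2000."

private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()
signature = private_key.sign(document)

public_key.verify(signature, document)  # passes silently: bits unchanged
try:
    public_key.verify(signature, document + b" [altered]")
except InvalidSignature:
    print("Signature does not match: the document was altered.")

# What the signature cannot settle is trust in the key itself: the verifier
# must obtain public_key from somewhere, and whoever vouches for the binding
# between key and signer (a registry, a certificate authority, an archive)
# is precisely the trusted third party that cannot be engineered away.
```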
Authenticity in a Digital Environment is available on CLIR’s Web site. Print copies of the report may be ordered from CLIR.
Council on Library and Information Resources
1755 Massachusetts Avenue NW, Suite 500
Washington, DC 20036
(202) 939-4750 · Fax: (202) 939-4765 · E-mail: info@clir.org

The Council on Library and Information Resources (CLIR) grew out of the 1997 merger of the Commission on Preservation and Access and the Council on Library Resources. CLIR identifies the critical issues that affect the welfare and prospects of libraries and archives and the constituencies they serve, convenes individuals and organizations in the best position to engage these issues and respond to them, and encourages institutions to work collaboratively to achieve and manage change.