The Commission on Preservation and Access
Developing a European Commission on Preservation and Access
One outcome of the Commission’s June 1993 international scholars conference, Preserving the Intellectual Heritage, was a unanimous vote to create a European commission on preservation and access. The 23 scholars from 11 western European countries and the United States at the conference recommended that the new commission, as a counterpart to the U.S. Commission on Preservation and Access, should be “an independent body that is endorsed by a wide range of existing cultural institutions and that is materially supported by two or three of them in the early stages of its development.” The scholars identified emerging issues for further exploration by the new commission and others as: how to bring awareness of preservation problems to the attention of library users and scholars, how to involve scholars with librarians and other custodians in the difficult decisions about preservation priorities, and how to identify and allocate funding and resources for preservation activities. For a complete report of that event, see Preserving the Intellectual Heritage: A Report of The Bellagio Conference, June 7-10, 1993, available from the Commission for $10.00, prepayment required.
Current efforts to establish the European commission are being funded in part by a grant from the Gladys Krieble Delmas Foundation. Acting upon specific recommendations adopted at the conference, a steering committee has appointed a board of directors, which is scheduled for its initial meeting in Amsterdam in late March. UNESCO has invited board members to a meeting to discuss ways to cooperate, and interest in collaboration also has been expressed by the Commission of the European Communities and the European Cultural Foundation.
European commission board members as of February 1994 are: Universities–Professor Michel Jouve, Université Michel de Montaigne Bordeaux III; Professor Jack Meadows, Loughborough University; Professor Hinrich Seidel, Universität Hannover. Academies and learned societies–Professor Pieter Drenth, Royal Netherlands Academy of Arts and Sciences; Sir Anthony Kenny, British Library. Libraries–Mrs. Fernanda Maria Campos, National Library of Portugal; Professor Klaus-Dieter Lehmann, Die Deutsche Bibliothek; Professor Adam Manikowski, National Library of Poland. Archives–Professor Eric Ketelaar, Netherlands State Archives Department; Professor Geoffrey Martin, University of Essex. Appointments from the publishing and other areas are pending. Serving as secretary to the board is Ms. Alison de Puymège, Deputy Secretary General, Standing Conference of Rectors, Presidents and Vice-Chancellors of the European Universities (CRE). Hans Rütimann serves as an ex officio member.
Photo Images Exhibited at American Historical Association
Digitized images of 300 photographs from the special collections of the University of Southern California (USC) were demonstrated at the Commission exhibit during the annual meeting of the American Historical Association (AHA) in San Francisco January 6-9, 1994. Over 100 professors, scholars, curators and archivists viewed and manipulated photo images using a Photo CD player and a 27″ monitor provided by Eastman Kodak. The donated equipment will become a regular part of the Commission’s exhibits at scholarly and publishing conferences over the next 18 months.
USC is a repository for a vast collection of black-and-white historical photographs of southern California and the American Southwest. The images used for the AHA demonstration were selected from the Hearst and Whittington Collections and included stock photographs documenting the Los Angeles area; coverage of street and city views; and images of Hollywood environs and people. USC is preserving the photographs as part of the work of the Kodak Library Image Consortium (KLIC), formed in 1992 by USC, Eastman Kodak, Cornell University, and the Commission. The University Libraries are planning a prototype image database that will archive and have the capacity to electronically deliver approximately 3,000 historical black-and-white photographic images. The KLIC consortium is exploring ways in which libraries, museums, and other archival repositories might use Photo CD technology. One of USC’s supporting reports, Perceived Interface and Output Requirements of Potential Users of an Electronic Image Database System, is included as an insert to this month’s newsletter.
Librarians from the University of California at Berkeley, Stanford University, the Research Libraries Group, Inc., and the University of Southern California staffed the exhibit which was funded by grants from the Gladys Krieble Delmas and H.W. Wilson Foundations. Forty AHA members signed up to be added to the Commission’s mailing list.
Pilot Test Libraries Report on Microfilm Audit
Libraries at Harvard, Yale, and Ohio State Universities participated in a pilot test of an audit process in late 1992 and early 1993 to help develop benchmarks and procedures for the periodic inspection of the quality of microfilm produced under the brittle books program of the National Endowment for the Humanities (NEH) Division of Preservation and Access. (See the March and August 1993 newsletters for more background.) The following report on that pilot test and subsequent activities was prepared by Wesley Boomgaarden, Preservation Officer at Ohio State, on behalf of the three participants.
The Commission is working to establish a regular audit of preservation microfilms produced under the National Endowment for the Humanities (NEH) brittle books program which will include qualified third-party assessment. Such an audit process can help assure that public funds are being spent for high quality preservation products.
In the pilot test, one percent of the master negatives created by Harvard, Yale, and Ohio State with Federal funds were examined in first- and third-generation copies by MSTC, Inc., a suburban Washington, DC, company specializing in such film examination. The examination included film base, leader, density, resolution, and indications of physical deterioration (dirt, mold, scratches, etc.). The audit process was organized and funded by the Commission. A follow-up meeting of these participants and other specialists was held in early November, where plans for a nationwide-scale audit were discussed. At the November meeting, participants indicated in positive terms the value of the pilot audit to their preservation microfilming operations.
According to Paul Conway, Head, Preservation Department, Yale University, the audit process was especially useful for helping the institution learn more about preservation of large bodies of material. At Ohio State, the audit found a number of improvements that needed to be made to microfilm processing and generated discussions that resulted in improvements in understanding by the filmer and the library, notably in adjusting film density levels. At Harvard, several concrete steps were taken as a result of the audit, including: additional education of camera operators, readjustment of cameras, increased attention to resolution charts, changes in frequency of cleaning lenses, and considerations of new equipment and changes in reduction ratios. Participants in the pilot test suggested that the following characteristics of master microforms be considered for an ongoing national film audit:
- The quality of the film manufacture
- The quality of the image
- The quality and permanence of the processing
- The quality and completeness of the bibliographic entity, including bibliographic control
- The quality of the storage environment

The Commission is working with several preservation administrators to develop a draft microfilm data collection procedure.
“Rescuing Our Heritage”–Preservation in the Southwest
In cooperation with the Commission, Southern Methodist University (SMU) held a regional symposium on preservation in Spring 1993. The final report of that event, including a summary of the presentations of keynote speakers, is now available as a special edition of the Central University Libraries faculty newsletter The Link and on the SMU Gopher on the Internet. The symposium’s target audience was budgetary decision makers, in particular college and university presidents and provosts, state legislators, and directors of state libraries and archives. In addition, library directors and officers in state and regional library and archives organizations were invited.
Speakers included Commission board president Billy E. Frye, Interim President and Provost, Emory University, and Commission board member David B. Gracy II, Gov. Bill Daniel Professor of Archival Enterprise, University of Texas at Austin. Rescuing Our Heritage Symposium Report, by Dr. Kenneth Lavender, Curator, Rare Book and Texana Collections, University of North Texas Libraries, Denton, and Thomas F.R. Clareson, Preservation Service Manager, AMIGOS Bibliographic Council, Dallas, is available from Maureen Pastine, Central University Librarian, Southern Methodist University, Dallas, TX 75275-0135.
Acid-Free Paper in Europe and the U.S.
New survey findings from Europe indicate that some publishers find acid-free paper no more costly than acid paper, and that their main rationale for adopting acid-free stock is the request of library buyers for permanent paper. The survey was completed in January 1994 by the European Foundation for Library Cooperation/Groupe de Lausanne (EFLC) and the Dutch company Swets & Zeitlinger (see January 1994 newsletter for more background). Sixty-eight respondents, most of them scientific publishers, from 13 European countries reported they are printing their publications on acid-free paper. Twenty-one are British, a lesser number are German and Dutch, and others are spread throughout France, Belgium, Nordic countries, Austria, Switzerland, Italy, Spain and Ireland. Half of the 68 do not announce by any logo or mention that their publications are printed on permanent paper.
Another 74 publishers from 14 countries reported they do not use acid-free paper, and some of these were unaware of its availability. Most indicated they will consider adopting permanent paper provided there is significant demand and it is not too expensive. The EFLC noted in its press release, “Probably, most of the eighteen-hundred ones [publishers] who did not reply to the survey do not use acid-free paper…. The problem of brittle books falling apart in libraries does not yet seem sufficiently known outside the library world in many countries of continental Europe.” More detailed results of the survey and a free copy of European Directory of Acid-Free and Permanent Book Paper are available from: EFLC, 17 Chemin des Vieux Amis, B-1380 Lasne, Brussels, Belgium.
In other news, Rolf Dahlø of the National Office for Research and Special Libraries, Oslo, Norway, and chairman of the ISO-committee for Physical Keeping of Documents, reports that the final proofs for the new International Paper Standard ISO 9706 are completed and publication is underway. The English version of the document is ISO 9706:1994(E) and the French version is ISO 9706:1994(F). The official publication is expected to be available in early spring of this year.
…Meanwhile, in the U.S. (Adapted from the ALA Washington Newsletter, November 30, 1993)
President Clinton’s Executive Order 12873 (October 1993) requiring the use of recycled paper within the federal government makes no mention of permanence considerations or Public Law 101-423, enacted in October 1990, which established a national policy to promote the use of permanent alkaline paper. The new executive order addresses several qualities that may affect paper permanence. The order is detailed and includes enforcement mechanisms, while the earlier law on permanent paper establishes policy but lacks enforcement mechanisms. Whether a sufficient supply of paper exists that meets both sets of standards–for recycled content and for permanence–is yet to be determined. ALA and several other library and scholarly organizations have recommended that the Executive Order accommodate needs for alkaline permanent paper so that costs for preserving the federal record would be minimized. ALA and other groups are considering courses of action and would welcome input from experts in permanent paper and preservation of library and archival materials. Contact: Carol Henderson, Executive Director Designate, American Library Association Washington Office, 110 Maryland Avenue, N.E., Suite 101, Washington, D.C. 20002.
Science Research Projects Available
The Commission’s Preservation Science Council (PSC) is widely distributing descriptions of three recommended research projects that will lead to new management tools for libraries and archives, and a fourth project has received funding from the National Endowment for the Humanities (NEH) Division of Preservation and Access. The PSC is composed of 20 preservation administrators and scientists representing a broad sampling of North American research libraries and archives. It has released information on the following projects: (1) Paper and Book Collections: use of accelerated aging experiments to yield specific predictions concerning the life expectancy under different temperature and relative humidity (RH) conditions of five types of paper commonly found in libraries and archives; (2) Magnetic Media: a review of the status of current research on the longevity and durability of magnetic media, identification of tools useful to librarians and archivists in managing these collections, and identification of research necessary to develop management tools; and (3) Paper–Lignin: an assessment of the influences of lignin in a paper on its permanence, including such factors as color changes upon aging and possible deleterious effects upon non-lignin papers coming into contact with lignin-containing papers.
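Accelerated aging predictions of the kind described in project (1) are commonly based on the Arrhenius equation, which relates degradation rate to temperature. The sketch below is only an illustration of that general approach, not the PSC’s actual model; the activation energy used is an assumed round figure, not a measured value.

```python
import math

# Assumed, illustrative activation energy for paper degradation
# (real values are determined experimentally and vary by paper type).
EA = 100_000.0   # activation energy, J/mol
R = 8.314        # universal gas constant, J/(mol*K)

def acceleration_factor(t_oven_c, t_storage_c):
    """Arrhenius ratio of degradation rates: how much faster paper
    degrades in a hot aging oven than at stack temperature."""
    t_storage = t_storage_c + 273.15   # convert Celsius to kelvin
    t_oven = t_oven_c + 273.15
    return math.exp(EA / R * (1 / t_storage - 1 / t_oven))

# One day in a 90 C oven corresponds to roughly this many days of
# natural aging at a 20 C storage temperature (under the assumed EA).
factor = acceleration_factor(90, 20)
print(round(factor))
```

Under these assumptions a few weeks in the oven stands in for decades on the shelf, which is what lets such experiments yield life-expectancy predictions at all.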
The fourth recommended project, to develop the technical information necessary for improving storage practices and enclosures for film collections in libraries and archives, is being undertaken by the Image Permanence Institute (IPI) at Rochester Institute of Technology, Rochester, NY, which recently received a $304,625 NEH grant. As described in an IPI press release, the three-year project will look at the ways in which enclosures speed up or slow down the deterioration of acetate film supports and will investigate the effects of enclosures on temperature and RH equilibrium relations. Results will include clear recommendations of the best types of enclosures for minimizing acetate base degradation in microfilm, movie film, and sheet film. Further, the study will furnish a model of how film reacts to the temperature and moisture changes it encounters when it is moved in and out of cold storage, will clarify how much RH fluctuation can safely be permitted in film archives, and will supply missing data that will provide a new approach to improved storage through dynamic, rather than static, set points for temperature and RH.
Final descriptions of the remaining two recommended projects are being developed. These involve a study of microclimate effects on paper and book collections and an investigation into the relative performance of several types of polyvinyl acetate adhesive (PVA) films.
The Commission has sponsored the PSC’s work over 2 1/2 years, first to study, analyze, and set priorities for research possibilities, and then to develop a finite number of manageable projects of primary value. The six projects have the full endorsement of the Commission, which encourages laboratories, research institutions, individual researchers, and funding agencies to consider undertaking them. While the Commission cannot fund the projects directly, it will support submissions of proposals for funding. For more information, contact Program Officer Maxine K. Sitts at the Commission.
Perceived Interface and Output Requirements of Potential Users of an Electronic Image Database System
Gary Jones, University of Southern California
Annenberg School for Communication
under the direction of
John Waiblinger, Associate University Librarian
for Scholarly Technology & Information Systems
and Victoria Steele, Head, Special Collections.
As a member of the Digital Preservation Consortium, the University Libraries at the University of Southern California are planning a prototype image database which will archive and have the capacity to electronically deliver approximately 3,000 historical black and white photographic images. This report on user perceptions of system interface and output requirements was submitted to the Commission as one aspect of a more comprehensive interim report on this project.
The University of Southern California (USC) is a repository for a vast collection of black-and-white historical photographs of southern California and the American Southwest. These include the Hearst Collection (1.2 million images) and the Whittington Collection (500,000), as well as a number of smaller but significant holdings. The subject content of these collections is extensive and varied, ranging from stock photographs documenting the Los Angeles area; images showing the growth of important municipal enterprises such as ground transportation systems, aviation, and shipping; coverage of street and city views; and images of Hollywood–its environs and people; to whole collections of prominent local photographers whose subject focus included Los Angeles, California Missions, Southwestern Indians, and turn-of-the-century Nevada, Arizona, and California. The planning process will result in a determination of appropriate system technologies (scanning, storage, network delivery and output), legal protection (copyright restrictions and fair use), budgetary considerations, and user interface. This report addresses issues including:
- The usefulness of an image database, for either research or instructional purposes, to a given academic discipline.
- Expectations and requirements relating to how photographs in an electronic database should be indexed for search purposes.
- The degree of need for the capacity to “browse” multiple images simultaneously.
- Approximate level of required image resolution.
- Desired location and form of hardcopy output.
Although this report does not specifically address matters of system hardware, some technological points will be considered where critical to the questions listed above. This report is a distillation of comments made by selected faculty at three focus group meetings held at Doheny Memorial Library during the last ten days of April 1993. Participants were selected based upon potential interest in an electronic image delivery system. That interest was determined either by general reputation or from research concerns as included in the University of Southern California’s 1992 Experts Directory. Faculty were also selected with some attention to academic discipline. As a result, the fifty invitees were approximately equally divided according to appointment in the physical sciences, social sciences, and the humanities.
Comments were also solicited by three other means: an open-ended questionnaire (designed only for those who were unable to attend the focus groups), personal interviews, and observations offered subsequent to a presentation of the project overview on May 14. The presentation was attended by approximately 25 people, primarily interested library staff.
There are several limitations to this method of inquiry. First, respondents were not randomly selected. Although identification of individuals most likely to make use of an image delivery system is a rational approach, it inevitably introduces bias into the collective response (a “pro-innovation” bias, for example). Second, respondents were not severely constrained by legal, budgetary or technological considerations. While some real-world parameters were introduced at the focus group meetings, subjects were generally allowed to express preferences for an idealized system. Finally, it should be kept in mind that this interim report is based on a relatively limited number of respondents (approximately 20 actually submitted comments), which at this point includes no students.
Findings and Analysis
Image Database Usefulness
All respondents endorsed the potential benefit of a campus-wide image delivery system. Subjects agreed that the photographic holdings of the university were grossly underutilized. This was attributed to lack of awareness, lack of accessibility and awkwardness of existing search techniques (although that difficulty will be mitigated by the recent cataloging of some of the collections). Respondents agreed that an electronic image database would be useful for various purposes. Faculty tended to emphasize the potential for purposes of research, instruction and individual student projects. Library staff tended to focus on the benefits of preservation and access. All were concerned with issues of cataloging.
The perceived usefulness of an image database system revolved around three primary considerations: convenience of access, speed of access, and user control. Convenience may be especially salient as USC’s photographic holdings are presently located beyond easy walking distance from campus. Regardless of physical location, subjects perceived a clear advantage to rapid on-line access of photographs cataloged at the item level. Finally, although several subjects were highly complimentary of the collection curator, most agreed that individual control of a search offered by an on-line system would be highly advantageous to much of their work (again qualified by indexing and browsing options, discussed below).
General endorsement of digital image technology was not without one important qualification: no one anticipates that electronically delivered images (or text) can completely replace the occasional use of primary documents. As one professor put it: “We still go to see the original art work, the original building. The original material–is important.” Also, “There is the matter of serendipity, or oversight, or coincidence, or human error. There is always the possibility of discovering something that fell through the cracks.” Several agreed that there is “something about” the aura of an original document–even an original photographic print. Finally, faculty emphasized the importance of exposing students to original sources. Such statements, however, were made as asides–as parenthetical tribute to primary archives before moving on to the more intriguing questions surrounding the technology at issue.
Some of those questions ranged afield of the photographic collection under consideration. Professors of music, geography and history found the possibility of digitizing various musical scores, maps and historical documents of great potential benefit to their respective fields. These discussions were eventually steered back to photo images, which usually resulted in deliberation of the primary concern: how images were to be cataloged, indexed and searched.
Indexing and Searching
Participants in this study were vitally concerned both with issues of individual item indexing and total system flexibility. Regarding item description, the following categories were mentioned: source, biographical names, geographic location, and date. Whenever a photograph is captioned, a journalism professor pointed out, full-text retrieval by caption should be available. Several respondents expressed a need for technical descriptors, such as original image size, quality, whether black & white or color, and copyright information. One subject desired information on the time of day the photo was taken (if deducible) and the type of illumination used. Another suggested that the system keep track of, and be able to report, the number of times an individual image was accessed.
More difficult was the problem of determining keyword descriptors to designate the subject/content of a photograph. While there have been some tentative steps toward searching an image database using an iconic interface (including the Getty’s Art History Information Program and Photodex, a commercial firm out of Durango, CO), “language is still the most precise search and retrieval tool we have” (Michael Ester, director of the Getty’s AHIP). Photodex claims to have developed an extremely sophisticated keyword search engine based on artificial intelligence algorithms and “fuzzy logic” programming, but the technology is proprietary.
In addition to item indexing there is the issue of searching style. Without exception, respondents expressed a need for a system capable of accommodating both “hunts” for a specific image and a more general “browsing” technique for photos that meet more relaxed search criteria. Those who had performed image-based research almost invariably began with broad categories. This initial phase was described as a “limbering up period” while the researcher acquired a better feel for what was sought and how it might be indexed or described by the system. As one professor put it, “this stage of the search process is a matter of collecting the points, much like note cards scattered all over.” Researchers then proceed to narrow the field to smaller subsets of material, as they “winnow out” extraneous images. In this connection, words like “system resilience,” “subsets,” and “flexibility” were mentioned repeatedly. These were the types of qualities, it was pointed out, that a storage medium like microfilm lacks.
Not surprisingly, the most difficult issues surrounding search criteria were those involving what was variously described as the “look” or “tone” or “feel” of a photograph. While this is an unlikely concern in the physical sciences, it arose frequently in the domains of social science and the humanities. One participant had a need to search for “city night views,” another was interested in “1950s sci-fi imagery,” a kind of iconographic “look.” Another researcher once searched for 1930s images of “really seedy hotels and crummy bars”–unlikely formal descriptors in even the most sophisticated system.
Interesting discussions ensued in each focus group on the idea of providing a user-defined field with each image to allow system users to elaborate on image descriptors. There was a clearly perceived willingness on the part of these users to add their own descriptive content to the image cataloging record. While such a capability raises considerable administrative and authentication issues, it does provide an interesting, “low-cost” method of enhancing the descriptive content of the indexing records.
The importance of providing adequate indexing tools for searching the image database was consistently stressed by all participants in the focus group meetings. Success of a digital image database will be dependent on easy-to-use yet content-rich indexing tools.
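The kind of keyword searching and broad-to-narrow “winnowing” the participants describe can be pictured as a simple inverted index over descriptor terms. The sketch below is a toy illustration of that idea; the catalog entries and image identifiers are invented for the example, not drawn from the USC collections.

```python
from collections import defaultdict

def build_index(catalog):
    """Map each descriptor keyword to the set of image IDs it describes."""
    index = defaultdict(set)
    for image_id, keywords in catalog.items():
        for kw in keywords:
            index[kw.lower()].add(image_id)
    return index

# Hypothetical item records: image ID -> descriptor keywords.
catalog = {
    "img-001": ["Hollywood", "street", "1930s"],
    "img-002": ["Hollywood", "portrait"],
    "img-003": ["aviation", "1930s"],
}
index = build_index(catalog)

# A broad search returns a large candidate set; intersecting with
# further descriptors "winnows out" extraneous images.
broad = index["hollywood"]
narrow = broad & index["1930s"]
print(sorted(narrow))  # ['img-001']
```

The design choice worth noting is that narrowing is just set intersection, which is what gives an electronic index the “subsets” and “flexibility” that a linear medium like microfilm cannot offer.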
The universally perceived importance of database browsing capability was acknowledged above. This section is concerned primarily with comments related to multiple-image display. Briefly stated, every respondent expressed a strong desire for the simultaneous screen display of multiple images. There was also unanimous agreement that the exact nature of this display involved a complex trade-off among the number of concurrent images, screen size, and potential image resolution. Beyond this, general agreement breaks down somewhat, depending upon how a particular individual tends to browse. Several respondents expressed a preference for the capability of displaying a maximum number of discernible images on screen at a time–as many as 30. “I may need to pull up 500 images related to a particular subject, and be able to see a whole array of them, or flip through them like pages of a book,” said one. This seemed especially true if one was searching for an image with a certain “look” or “feel.”
Others were equally certain they would have no need to display more than a few images at a time. A professor from fine arts stated that much of her research involves high-resolution paired comparisons. A geographer with a research interest in urban history expressed a preference for no more than four images displayed at a time. He explained, “I always need detail. If the screen displayed more than four images, I would just ask for blow-ups, whereas with four I would probably get the resolution I required in the first place.” “That first read is important,” he continued, “if I reject an image at that time, I may never go back.” Again, these opinions are intertwined with considerations of screen size and image resolution. Screen size is budget dependent; resolution is somewhat more variable.
A technical discussion of image resolution is beyond the scope of this report. However, some basic facts and assumptions were set forth at the various user meetings in order to establish a foundation for discussion. Digital scanners are now capable of capturing a photographic image at resolutions equal to the grain of the print or negative emulsion. There are tradeoffs: scanning at extremely high resolutions is more costly in terms of time, money, and computer storage space. Likewise, as resolution increases, so does the time of transmission over a network. As images approach photo-realistic resolution, human ability to discern slight degradations of image quality decreases.
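The storage side of that tradeoff follows from simple arithmetic. The sketch below uses hypothetical scan dimensions, not figures from the report, to show how quickly uncompressed storage grows with resolution.

```python
def storage_mb(width_px, height_px, bits_per_pixel=8):
    """Uncompressed size of one grayscale scan, in megabytes."""
    return width_px * height_px * bits_per_pixel / 8 / 1_000_000

# A modest 1024 x 768 grayscale scan versus a 4096 x 3072 scan
# of the same photograph (both dimensions chosen for illustration).
low = storage_mb(1024, 768)
high = storage_mb(4096, 3072)

# Quadrupling linear resolution multiplies storage (and, roughly,
# network transmission time) sixteen-fold.
print(round(high / low))  # 16
```

At the higher setting, a 3,000-image archive would run into tens of gigabytes uncompressed, which is why resolution choices carry real cost consequences for a project of this scale.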
Perhaps predictably, respondents wanted maximum resolution in minimal time, and low resolution even faster. Several participants had experienced the slow process of retrieving an image by standard vertical fill-in, which paints an image down the screen line by line. A few had sat through the long reception times of high-resolution images over the Internet. These methods were declared unacceptable by those who had experienced them.
There is, however, a technology called “progressive transmission” which strikes a balance between image transmission speed and resolution. Progressive transmission sends the entire image, almost instantaneously, at extremely low resolution upon demand. Within two seconds the user receives a full-screen image of recognizable detail. Within 15 seconds the image fills in to moderate resolution. The process continues until maximum resolution is reached, usually between one and two minutes.
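The coarse-to-fine idea behind progressive transmission can be sketched in a few lines. This is a toy simulation of the principle (successively finer versions of one image, ending with the full-resolution original), not the actual encoding the project used.

```python
def downsample(image, factor):
    """Keep every `factor`-th pixel in each dimension of a 2D pixel grid."""
    return [row[::factor] for row in image[::factor]]

def progressive_versions(image, factors=(8, 4, 2, 1)):
    """Yield the image at coarse-to-fine resolutions; the final
    version (factor 1) is the full-resolution original."""
    for factor in factors:
        yield downsample(image, factor)

# A toy 16 x 16 "image" of pixel values.
full = [[x + 16 * y for x in range(16)] for y in range(16)]
versions = list(progressive_versions(full))

# Each stage doubles the linear resolution of the previous one.
print([len(v) for v in versions])  # [2, 4, 8, 16]
```

Because each coarse version is small, the viewer gets a recognizable picture almost immediately and can abandon the transfer early, which matches the browsing behavior the respondents described.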
The user response to this solution, presented by the project’s technical coordinator, was consistent approval. As long as the image became recognizable within a few seconds, it would either be rejected or allowed to continue “filling itself in.” It was expected that the image could be captured at any stage during the transmission. If maximum resolution was desired a ninety-second wait was not perceived as too burdensome.
The question of system outputs related both to terminal location and hardcopy output. Although it is envisioned that image delivery will eventually be campus-wide, it is likely that the initial delivery will be to high-capacity workstations in selected locations. There was no serious objection to this design, especially if the terminals delivered the high-resolution imagery of which they will be technically capable. There were three primary questions related to hardcopy system output: a) the ratio of instructional needs to research demands, b) the issue of cost recovery, and c) the perceived priority of output types (“Xerox” quality, slides, high-resolution grayscale prints, downloaded files, including Photo CD output).
The perceived use of the system for instructional rather than research purposes varied greatly from one professor to another. For those who viewed the technology primarily as a means of improved instruction, slide output was a priority, followed by medium-resolution photocopy quality. Research demands dictated high-resolution prints. Those in the fine arts, particularly, expressed a strong interest in downloadable graphic files. All respondents indicated a willingness to pay for hardcopy output on a cost-recovery basis. There was an expressed willingness to pay standard rates for any image used in a publication.
Several groups became particularly animated when discussion turned to the possibility of transmitting entire sets of selected images directly into the classroom teaching environment. While such extensions of this technology are a possibility, it was emphasized that creation of university-wide electronic classrooms is not within the compass of this project.
The USC project team has also been working extensively with Kodak’s Photo CD technology. Several Photo CD image databases have already been created and incorporated into the instructional program (public-access Photo CD reading stations have been installed in the Architecture and Fine Arts Library and in the Doheny Reference Center, and students have been assigned Photo CD “readings”). This has generated considerable discussion of the advantages of such a “portable” image database “subset” medium. Incorporation of a Photo CD output mechanism from the networked image server has been identified as an important output requirement. The ease of use and portability of this medium were seen as an important step in making digitized images available for classroom use.
Conclusions and Recommendations
Within the limitations of this study outlined at the outset, the following conclusions and recommendations can be drawn:
- Although digital archiving and retrieval is not viewed as a complete substitute for original document research, the technology is generally welcomed for its perceived advantages in convenience, speed of access, and individual user control.
- While certain indexing criteria for photographs can be sharply defined, one of the most important, content, remains problematic. Some preliminary attempts at “iconic search procedures” are being made in the marketplace, but language remains the most precise descriptor of images.
- Image database searches frequently begin with broad categories which the user then narrows as the search continues. An electronic image-delivery system should be flexible enough to accommodate both a broad (low resolution) search as well as a narrow (generally higher resolution) image request.
- One short-term solution to the difficulty of providing exhaustive keyword descriptors to photographic images is to allow for an associated user-input field. Although difficult to administer and monitor, several potential users liked the idea.
- A simultaneous multi-image display capability is essential to users. To the extent technically feasible, this feature should offer maximum flexibility.
- Flexibility should also be built into the system at the intersection of image transmission resolution and delivery speed. The proposed technology of progressive transmission, which offers a trade-off between speed and resolution, was well received by potential system users.
- Prior to campus-wide image delivery, potential users would be willing to go to designated locations with high-end workstation terminals to use the system. “Hardcopy” and “portable” output needs varied among professors according to emphases on research, teaching, or the creative arts.
- Finally, at various stages in the search and retrieval process, potential users expressed a desire for system ability to save marked subsets of previous image searches.
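The broad-to-narrow search and user-annotation recommendations above can be sketched in a small model. Everything here is hypothetical (the records, field names, and matching rule are illustrative, not the project’s design): each photograph carries a curated keyword set plus a free-text user-input field, and a search matches against both, so adding terms narrows a broad result set.

```python
# Hypothetical sketch: curated keywords plus a user-supplied annotation
# field, searched together, with queries narrowed by adding terms.

photos = [
    {"id": 1, "keywords": {"los angeles", "street", "1920s"},
     "user_notes": "broadway at night"},
    {"id": 2, "keywords": {"los angeles", "harbor", "1920s"},
     "user_notes": ""},
    {"id": 3, "keywords": {"pasadena", "street", "1930s"},
     "user_notes": "rose parade"},
]

def search(terms):
    """Return ids of photos whose curated keywords or user-input
    annotation field contain every search term."""
    hits = []
    for p in photos:
        text = " ".join(p["keywords"]) + " " + p["user_notes"]
        if all(term in text for term in terms):
            hits.append(p["id"])
    return hits

broad = search(["los angeles"])               # broad category search
narrow = search(["los angeles", "broadway"])  # narrowed via a user note
```

The narrowed query succeeds only because of the user-contributed annotation, which is the point of the recommendation: user input can supply descriptors the curated indexing missed, at the cost of requiring administration and monitoring.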
Commission on Preservation and Access
1400 16th Street, NW, Suite 740
Washington, DC 20036-2217
(202) 939-3400 Fax: (202) 939-3407
The Commission on Preservation and Access was established in 1986 to foster and support collaboration among libraries and allied organizations in order to ensure the preservation of the published and documentary record in all formats and to provide enhanced access to scholarly information.
The Newsletter reports on cooperative national and international preservation activities and is written primarily for university administrators and faculty, library and archives administrators, preservation specialists and administrators, and representatives of consortia, governmental bodies, and other groups sharing in the Commission’s goals. The Newsletter is not copyrighted; its duplication and distribution are encouraged.
Patricia Battin–President
Maxine K. Sitts–Program Officer, Editor
Sonny Koerner – Managing Editor