
Enabling Access in Digital Libraries: Invited Presentations

Creation of an Authorization Database

Russell S. Vaught, Director, Center for Academic Computing, Pennsylvania State University

Russell Vaught described the large multiyear effort at Penn State, a DLF institution, to build central authentication and authorization services for a large university with many campuses. The complexities, he pointed out, reach beyond the technical aspects of the project to issues of university policy and cost-benefit tradeoffs that affect far more than just the computing service organization.

Vaught sought to clarify the meaning of authentication and authorization. He pointed out that they are often confused because they are frequently employed together. Users, he noted, can be authenticated by something they know (such as an identity code or user ID and password), something they have (such as a SecureID card), or something they are (which can be verified, for example, using a retina scan). Authorization grants a user the right to use a system or data and usually presupposes authentication of users.
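The distinction Vaught drew can be illustrated with a short sketch. This is not Penn State's implementation; the user IDs, services, and data structures below are invented for illustration. The point is only the two-step sequence: authentication verifies identity ("something you know," here a user ID and password), and authorization, which presupposes it, checks what that identity may do.

```python
import hashlib

# "Something you know": a user ID and a stored password digest.
# (Hypothetical data -- not an actual Penn State principal.)
CREDENTIALS = {
    "rvaught": hashlib.sha256(b"correct-horse").hexdigest(),
}

# Authorization data: services each authenticated user may use.
PERMISSIONS = {
    "rvaught": {"email", "dialup", "microlab"},
}

def authenticate(user_id: str, password: str) -> bool:
    """Authentication: verify WHO the user is."""
    digest = hashlib.sha256(password.encode()).hexdigest()
    return CREDENTIALS.get(user_id) == digest

def authorize(user_id: str, service: str) -> bool:
    """Authorization: verify WHAT the user may do; presupposes authentication."""
    return service in PERMISSIONS.get(user_id, set())

def access(user_id: str, password: str, service: str) -> bool:
    # Authorization is only consulted after authentication succeeds.
    return authenticate(user_id, password) and authorize(user_id, service)
```

A production system such as Kerberos replaces the password table with cryptographic tickets, but the same separation of the two questions applies.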

In 1992 the centralized Computer and Information Systems (CIS) service installed the Andrew File System (AFS), a distributed file system developed at Carnegie Mellon University (CMU), which relied on Kerberos, developed at the Massachusetts Institute of Technology (MIT), for authentication. AFS and Kerberos both emerged in the 1980s as byproducts of large projects dedicated to building campus networks and distributed systems. Kerberos provides authentication based on a user ID and a password (“something you know”). In the summer of 1993, CIS decided to build a central authentication service based on version 4 of Kerberos to support all core computing systems (such as e-mail, dialup access, and the use of microcomputer labs). Kerberos is used in conjunction with SecureID cards for administrative applications in which the high cost of a security breach justifies the cost of the card (which generates passwords for one-time use). The Kerberos database includes 114,500 active principals (user identities); Penn State has 80,000 students and 30,000 faculty and staff at 24 locations.

In mid-1996, CIS decided to provide authorization services that could support more applications. The new system is based on the Distributed Computing Environment (DCE) Security Services, which uses version 5 of Kerberos for authentication. A cross-organizational task group was formed to develop an initial database to control authorization. The new system saw its first application in the summer of 1997. It is now being used to support many applications, including a proxy system for access to the remote JSTOR archive of scholarly journal literature. Vaught hopes that all systems using the old authentication service will be converted by the summer of 1999.

Although the system is complex, Vaught finds other options, such as a Public Key Infrastructure, equally complex and perhaps less cost effective. Performance and scalability, though still a concern, are expected to improve with the planned enhancements to the DCE directory component (and increased network capacity and processing power).

Reflections on the NISO DOI Rights Metadata Working Group

John S. Erickson, Vice President for Rights Technologies, Yankee Rights Management

John Erickson began by pointing out that copyright serves both as enabler and as inhibitor, establishing a balance to facilitate creativity for the overall benefit of society. His presentation described the current state of thinking of the NISO DOI Rights Metadata Working Group, chaired by Sally Morris, of Wiley, U. K., a working group formed to establish a standard rights metadata schema to facilitate electronic commerce for information objects (whether in digital or nondigital form). This working group, which has very active participation from U.K. publishers, is one of several emerging from a series of joint workshops organized by the National Information Standards Organization (NISO) and the International Digital Object Identifier (DOI) Foundation.4

The DOI system and related activities have developed within the publishing community and until recently, the focus has been on making money through the enforcement of rights. Erickson believes that the group is beginning to tolerate some degree of fair use and the related ambiguity. The joint activities with NISO signal recognition that discussion must be opened up to a broader community.

The group’s stated objective is to develop “a consensus rights transaction model through very active, highly visible public discussions and information sharing.” The resulting conceptual model is shown in figure 3.5

Figure 3: Elements of the Conceptual Model for Rights Transactions

In the group’s opinion, certain digital property rights languages, such as that proposed by Mark Stefik of Xerox PARC, have both advantages and disadvantages and hence alternative models are needed for purposes of comparison and practical evaluation. In particular, the group sees a need for a model to express agreements. Their current thinking borrows from approaches used by stock photography agencies and is based on the use of decision trees for evaluating permissions. A basic assumption here is that any use has a price, even if the price is zero. The model can accommodate a default agreement with standard prices for all users for a limited set of operations. Agreements could relate users or classes of users to certain operations on (uses of) classes of objects. Owners and administrators of agreements would have to be able to apply templates of operations and prices to groups of objects and users.
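The decision-tree evaluation the group envisions might be sketched as follows. All names, classes, and prices here are hypothetical examples, not part of the working group's model; the sketch only shows the two assumptions stated above: every use has a price (possibly zero), and agreements relate classes of users to operations on classes of objects, with a default agreement as the fallback branch.

```python
# Default agreement: standard prices for all users for a limited set of
# operations. A zero price still counts as a price.
DEFAULT_PRICES = {"view": 0.0, "print": 0.50}

# Agreements relate user classes to operations on object classes.
# (Invented example entries.)
AGREEMENTS = [
    {"user_class": "subscriber", "object_class": "journal",
     "operation": "print", "price": 0.0},
]

def price_for(user_class: str, object_class: str, operation: str):
    """Walk the decision tree: try specific agreements first,
    then fall back to the default agreement; None means not permitted."""
    for a in AGREEMENTS:
        if (a["user_class"] == user_class
                and a["object_class"] == object_class
                and a["operation"] == operation):
            return a["price"]
    if operation in DEFAULT_PRICES:
        return DEFAULT_PRICES[operation]
    return None  # no branch matched: the requested use is unpriced/denied
```

An administrator's "template" in this sketch would amount to appending one agreement record covering a whole class of users and objects, rather than enumerating individual pairs.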

In Erickson’s view, it is essential that the gathering of appropriate metadata become part of the publishing workflow. Two other important issues have been raised. Who would be accountable for codifying a license agreement and maintaining the data that supports access management? And would rights metadata for content be made available to third-party services along with descriptive metadata?

After Erickson’s presentation, Clifford Lynch (CNI) provided some additional context for the activity of the NISO DOI working group. In a new approach to standards setting, NISO has sponsored exploratory workshops encouraging broad participation. In 1997 and early 1998, NISO and the DOI Foundation sponsored a series of joint meetings that addressed the question of whether DOI activities should be brought into the regular process for national and international standards. The meetings, the related electronic forum, and the five or six working groups they spawned have no formal standing within the national or international standards process. They are not intended to be exclusionary and have served a valuable educational role.


REFERENCES

4 The Digital Object Identifier (DOI) system is a mechanism for marking digital objects in order to facilitate electronic commerce and enable copyright management in a digital environment. The system emerged from activities of the Association of American Publishers, which is a charter member of the International DOI Foundation. As indicated on its Web site (http://www.doi.org/), the foundation is dedicated to supporting the needs of the intellectual property community in the digital environment, by establishing and governing the DOI system, setting policies for the system, choosing service providers for the system, and overseeing its successful operation.

5 Erickson’s full set of slides is available on the Yankee Book Peddler, Inc., Web site (http://www.ybp.com/yrm/presentations/DLF_CRIAShow/).
