
A Summary of a Report Published by the Council on Library and Information Resources

Census of Institutional Repositories in the United States
MIRACLE Project Research Findings

by Karen Markey, Soo Young Rieh, Beth St. Jean, Jihyun Kim, and Elizabeth Yakel
February 2007


Why Study Institutional Repositories?

A considerable portion of the scholarly record is born digital, and some scholarship is produced in digital formats that have no physical, in-the-hand counterparts. The proliferation of digital scholarship raises serious and pressing issues about how to organize, access, and preserve it in perpetuity. The response of academic institutions has been to build and deploy institutional repositories (IRs) to manage the digital scholarship their learning communities produce. IR efforts require a considerable financial, personnel, and technical investment. For this reason, it would be helpful if academic institutions could learn from one another, sharing their experiences, building models, and formulating best practices. Such sharing would streamline the implementation of IRs at institutions where the decision to initiate an IR effort has not yet been made.

Why Conduct a Census of IRs in the United States?

Previous surveys have focused on academic institutions where IRs are already operational or on specialized groups of academic institutions likely to be first adopters of new technologies such as IRs (Appendix F3). To avoid duplication, MIRACLE Project staff (i.e., this report’s authors) sought to cast a wide net and fill a void. Conducting a census of academic institutions in the United States about their involvement with IRs would include institutions that have not yet jumped on the IR bandwagon. Being inclusive increases our confidence that we can identify the wide range of practices, policies, and operations in effect at institutions where decision makers are contemplating, planning, pilot testing, or implementing IRs. It also enables us to learn why some institutions have ruled out IRs entirely.

Who Participated in the MIRACLE Project Census of IRs in the United States?

Of the 2,147 academic library directors and senior library administrators MIRACLE Project staff contacted, 446 participated in the census, a response rate of 20.8%. Characterizing the extent of their involvement with IRs, 236 (52.9%) respondents reported that they have done no IR planning (NP) to date, 92 (20.6%) are only planning (PO) for IRs, 70 (15.7%) are actively planning and pilot testing IRs (PPT), and 48 (10.8%) have implemented (IMP) an operational IR (Figure 2.1).

What Kinds of Educational Institutions Have and Do Not Have IRs?

MIRACLE Project staff used the Carnegie Classification of Institutions of Higher Education (CCHE) to characterize census respondents (Table 2.2 and Figure 2.3). Research universities vastly outnumber other CCHE classes with respect to involvement in IR planning, pilot testing, and implementation (Table 2.3). Most NP and PO respondents come from master’s and baccalaureate institutions.

Who Bears the Responsibility for IR Planning, Pilot Testing, and Implementation?

At PPT and IMP institutions, librarians take the lead in IR pilot testing and system implementation (Table 2.4), assume most of the responsibility for the IR effort (Figure 2.6), and are members of various IR committees (Figure 2.5). Funding almost always comes from the library (Table 3.1). A typical approach is to absorb the IR’s cost into routine library operating costs.

At NP institutions, where no IR effort is under way, the library director takes the lead, consulting with the provost, chief information officer, faculty, and archivist about funding, technical expertise, potential contributors and users, and digital collections (Tables 2.4 and 2.5). IR committee membership becomes less inclusive as the IR project progresses from pilot testing to implementation, leaving the library “holding the bag” (Figure 2.5).

What Are Useful Investigative Activities?

Staff involved with various phases of IR efforts have voracious appetites for information about IRs, especially information pertaining to best practices and successful implementations at institutions similar to their own (Tables 4.1, 8.1, 8.2, and 9.3). The needs assessment is not as important as other investigative activities (Table 4.1 and Figure 4.1). Pilot testing one or more IR-system packages is very important. About 16% of MIRACLE census respondents are pilot testing one or more IR-system packages (Figure 2.1), and almost three-quarters of PO respondents intend to pilot test IR-system software (Figure 4.2). Benefits of pilot testing include developing the requisite technical expertise for IR implementation, evaluating IR-system software, and estimating implementation costs (Table 4.3). For most PO institutions in the census, the next step is to widen the scope of their investigations. For most PPT institutions, the next step is to implement IR-system software (Figure 4.3). Very few (about 10%) PO and PPT institutions are likely to terminate their IR efforts (Figure 4.5).

What Are Respondents’ Experiences with IR-system Software Packages?

Respondents’ preferred IR-system software for both pilot testing and implementation is DSpace (Table 5.2). Asked how long their IR has been operational, 52.1% of respondents with operational IRs cite 12 months or less, 27.1% cite 13 to 24 months, 4.2% cite 25 to 36 months, and 16.6% cite more than 36 months. IR-system functionality is satisfactory, but the user interface, including controlled vocabulary searching and authority control, needs serious reworking (Table 5.3). Except for portable document format (PDF) files, institutions with operational IRs do not guarantee file formats in perpetuity (Table 6.2). Improving preservation functionality in IRs should be a systems-development priority because IMP respondents rate greater preservation capacity as the major reason why they would migrate to a new IR system (Table 5.4). To date, respondents have used IR-system evaluation methods that are limited to simple counts that most IR systems produce automatically in management reports (Table 7.5).

What Content Is in Pilot-test and Operational IRs?

Both pilot-test and operational IRs are very small (Figure 6.1). About 80% of the former and 50% of the latter contain fewer than 1,000 digital documents. Only four (8.3%) pilot-test IRs and seven (19.4%) operational IRs contain more than 5,000 documents. There is no relationship between IR size and age. Pilot-test and operational IRs contain a wide range of text, numeric, and multimedia files, but traditional text-based document types produced by the research enterprise of staff and students at postsecondary institutions especially characterize their content (Table 6.1).

What Progress Have Respondents Made on IR Policies?

At least 60% of census respondents with operational IRs report they have implemented policies for (1) acceptable file formats, (2) determining who is authorized to make contributions to the IR, (3) defining collections, (4) restricting access to IR content, (5) identifying metadata formats and authorized metadata creators, and (6) determining what is acceptable content (Figure 6.2). There are many more IR-related activities for which these institutions report drafted policies or no policies at all.

It may not be necessary for all IR policies to be in place at the time of the public launch of an institution’s IR. Taking a wait-and-see attitude, evaluating what transpires after a period of time, and then firming up existing policies and implementing new ones as needed may be the most expedient course of action.

Who Contributes to IRs and at What Rate?

Authorized contributors to IRs are typically members of the institution’s learning community: faculty, librarians, research scientists, archivists, and graduate and undergraduate students (Table 6.3). Staff who facilitate the research and teaching missions of the institution (e.g., press, news service, academic support staff, central computer staff) are less likely to be authorized to contribute. Asked to identify the major contributor to their IR, only PPT staff are unified in their response, with almost 60% naming faculty (Table 6.4). Percentages drop to 48.1% and 33.3% for PO and IMP respondents, respectively. The unified response of PPT staff probably stems from the fact that they work one-on-one with faculty who are early adopters during the planning and pilot-test phase of the IR effort. In fact, PO, PPT, and IMP respondents choose “IR staff working one-on-one with early adopters” as the most successful method for recruiting IR content (Figure 6.5). Other successful methods are “word of mouth from early adopters to their colleagues” (Figure 6.6), “personal visits to staff and administrators,” and “presentations about the IR at departmental and faculty meetings” (Figure 6.7).

Respondents report that recruiting content for the IR is difficult (Figure 7.3). At institutions with operational IRs, IR staff are willing to entertain institutional mandates that would require members of their institution’s learning community to deposit certain document types in the IR (Table 7.3). Asked why they think people will contribute to the IR, respondents give high ratings to reasons related to enhancing scholarly reputations and offloading research-dissemination tasks onto others. Lower-ranked reasons pertain to enhancing the institution’s standing.

What Are the Benefits of IRs?

Asked to rate a list of 14 benefits of IRs, census respondents give high ratings to all but two (Figure 7.1 and Table 7.1). Instead of having a few benefits that stand far above the others, IRs may have many benefits. Respondents may also feel it is premature to rank one or two benefits above the others because IRs have not yet “come into their own.” Once IRs have become more common in all types of educational institutions, the answers to this question might be different. One or two benefits may ultimately dominate.

NP respondents are especially interested in benefits of IRs so they can incorporate them into arguments to convince their institutions’ decision makers to support IR planning (Tables 8.2 and 9.1).

What Factors Inhibit the Deployment of a Successful IR?

Factors affecting the successful deployment of an IR depend on the stage of an institution’s IR effort (Table 7.3). IMP respondents are concerned about contributors and contributions to the IR. In fact, that concern is pushing them to consider mandating contributions of certain material types. PPT respondents are also concerned about contributions, but other priorities, projects, and initiatives are competing with the IR effort for resources. PO respondents are most concerned about sustaining the IR effort in terms of competing for resources and supporting the costs of an operational IR.

How Likely Are Institutions Where No IR Planning Has Been Done to Jump on the IR Bandwagon?

The largest percentage (52%) of MIRACLE Project census respondents comes from institutions where no IR planning has been done. Dominating these NP respondents are master’s and baccalaureate institutions (Table 2.3).

Among NP respondents is a sleeping beast of demand for IRs. These respondents want to know how much IRs cost to plan, implement, and maintain, and what institutions comparable to their own are doing with regard to IRs (Table 8.2 and Subchapter 9.1). None of the top-ranked reasons why NP institutions have not begun IR planning rules out future involvement with IRs (Table 8.1); however, right now, NP institutions have other things on their plate or have insufficient resources or expertise for IR planning. Very few are totally in the dark in terms of what IRs are and whether IRs have relevance for their institutions (Figure 8.1). Slightly under 50% of NP respondents may start IR planning within the next 24 months (Figure 8.2).

Asked how the MIRACLE Project could assist them regarding IRs, NP respondents want to learn about (1) IRs generally, (2) the details and specifics of IRs, (3) best practices, (4) benefits of IRs, (5) securing funding for IRs, and (6) opportunities for partnerships (Table 9.1). NP respondents’ interest in IRs is a wake-up call to their colleagues at institutions other than research universities to share their IR success stories with an audience that is craving information. It is also an opportunity for the MIRACLE Project to focus on such institutions in subsequent project activities, because that is where the need is greatest and where the gap in our knowledge about IRs is widest.

What Previous Findings about IRs Do MIRACLE Project Census Findings Verify?

The MIRACLE Project census verifies almost two dozen findings from previous surveys. Among these findings are that research universities lead in the implementation of IRs, that libraries play a leading role in the IR effort, and that DSpace leads in IR-system pilot testing and implementation. See Table 9.2 for the complete list.

What Findings Are Unique to the MIRACLE Project Census?

Subchapter 9.3 features a discussion of 13 unique findings. Examples are the shrinking-violet role that archivists play in the IR effort; the voracious appetites that census respondents have for information, especially about successful IR implementations at institutions similar to their own; the ability of the IR to forge new relationships for libraries; and the need for improved preservation functionality in IRs.

What Long-term Issues Will Occupy IR Staff Long after the MIRACLE Project Ends?

Subchapter 9.4 discusses seven such issues. Examples are the benefits of IRs, whether IRs will derail the current publishing model, and whether learning communities should be required to submit the products and by-products of their research and teaching enterprises to the IR.


More About this Report

Census of Institutional Repositories in the United States
MIRACLE Project Research Findings

by Karen Markey, Soo Young Rieh, Beth St. Jean, Jihyun Kim, and Elizabeth Yakel
February 2007. ISBN 978-1-932326-28-4. 167 pages.

Report text is available free at https://www.clir.org/pubs/abstract/pub140. Printed copies are not available.
