
7 IR BENEFITS AND BENEFICIARIES

Chapter 7 explores institutional repository (IR) benefits from the perspectives of IR staff as well as staff perceptions of IR contributors. It also examines the IR’s effect on building relationships with other units, the deployment of a successful IR, and the methods that institutions are using to evaluate IRs.

7.1            Benefits of IRs

Questionnaires asked all census respondents about the benefits of IRs, but they asked the question in different ways depending on the extent of respondents’ involvement with IRs:

•               No planning (NP): How important do you think these anticipated benefits of IRs would be to your institution?

•               Planning only (PO) and planning and pilot testing (PPT): How important are these anticipated benefits of IRs to your institution?

•               Implementation (IMP): At the beginning of IR planning at your institution, how important did you think these anticipated benefits of IRs would be to your institution?

All four questionnaires listed the same 16 anticipated benefits and the same response categories. With two exceptions, respondents were uniformly positive in their ratings of the listed benefits. When the percentages of “very” and “somewhat” important ratings were totaled, the sum equaled or exceeded 67% for 14 of the 16 benefits.

Respondents ranked the following two benefits next to last and last, respectively:

•               an increase in citation counts to your institution’s intellectual output

•               reducing user dependence on your library’s print collection

Respondents’ positive ratings vary in a systematic way. IMP respondents’ ratings for a listed benefit are almost always more positive than PPT respondents’ ratings; likewise, PPT respondents’ ratings almost always exceed PO respondents’ ratings, and PO respondents’ ratings almost always exceed NP respondents’ ratings. Even though NP respondents are not as positive as respondents involved with IRs (i.e., POs, PPTs, and IMPs), they are definitely positive in the ratings they give to IR benefits. Figure 7.1 shows how respondents’ positive ratings for listed benefits increase from NP to PO, from PO to PPT, and, finally, from PPT to IMP.

These data raise the question: why should IMP respondents be more positive about IR benefits than respondents in the other groups? It may be that IMP respondents, having experienced the IR implementation effort from beginning to end, are more confident about IR benefits and that they express this confidence by giving benefits high ratings. Or, having invested much time and effort into IR implementation, IMP respondents may want the IR to succeed so strongly that they give it the highest ratings.

To simplify results of this analysis of benefits, MIRACLE Project staff assigned weights to response categories as follows: (+2) very important; (+1) somewhat important; (0) no opinion, don’t know, or not applicable; (-1) somewhat unimportant; and (-2) very unimportant. They summed the weights for each benefit and used the totals to rank order the benefits. Table 7.1 uses IMP ranks to order top- (1 to 7), middle- (8 to 14), and bottom-ranked (15 to 16) benefits.
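This weighting-and-ranking arithmetic is simple enough to sketch in a few lines of code. The Python sketch below uses invented response counts for two benefits (not MIRACLE data) to show how category weights combine into a single score per benefit and how scores translate into ranks:

```python
# Weights MIRACLE staff assigned to each response category.
WEIGHTS = {
    "very important": 2,
    "somewhat important": 1,
    "no opinion/don't know/not applicable": 0,
    "somewhat unimportant": -1,
    "very unimportant": -2,
}

# Hypothetical response counts per benefit (illustrative only, not census data).
responses = {
    "Capturing intellectual capital": {"very important": 30, "somewhat important": 10,
                                       "somewhat unimportant": 2},
    "Reducing print dependence": {"very important": 5, "somewhat important": 12,
                                  "somewhat unimportant": 15, "very unimportant": 10},
}

# Score each benefit by summing (weight x count) over its response categories.
scores = {
    benefit: sum(WEIGHTS[cat] * n for cat, n in counts.items())
    for benefit, counts in responses.items()
}

# Rank benefits from highest to lowest total weight (rank 1 = top).
for rank, benefit in enumerate(sorted(scores, key=scores.get, reverse=True), start=1):
    print(rank, benefit, scores[benefit])
```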

[Figure 7.1]

Table 7.1. IR benefits

| Top-ranked benefits (1 to 7) | NP | PO | PPT | IMP |
| --- | --- | --- | --- | --- |
| Capturing the intellectual capital of your institution | 2 | 2 | 2 | 1 |
| Better service to contributors | (8)† | 6 | 3 | 2 |
| Exposing your institution’s intellectual output to researchers around the world who would not have access to it through traditional channels | (9) | (9) | (7) | 3 |
| An increase in your library’s role as a viable partner in the research enterprise | 6 | 5 | 4 | 4 |
| Longtime preservation of your institution’s digital output | 3 | 3 | 5 | 5T* |
| Better services to your institution’s learning community | 1 | 1 | 1 | 5T |
| A solution to the problem of preserving your institution’s intellectual output | 5 | 4 | 6 | 7 |

| Middle-ranked benefits (8 to 14) | NP | PO | PPT | IMP |
| --- | --- | --- | --- | --- |
| An increase in the accessibility to knowledge assets such as numeric, video, audio, and multimedia formats | (7) | 8 | 8 | 8 |
| A boost to your institution’s prestige | 14 | 13 | 10 | 9 |
| Maintaining control over your institution’s intellectual property | (4) | (7) | 9 | 10 |
| Contributing to the reform of the entire enterprise of scholarly communication and publishing | 13 | 14 | 12 | 11 |
| New services to learning communities beyond your institution | 10T | 10 | 11 | 12 |
| A reduction in the amount of time between discovery and dissemination of research findings | 12 | 11T | 13 | 13 |
| Providing maximal access to the results of publicly funded research | 10T | 11T | 14 | 14 |

| Bottom-ranked benefits (15 to 16) | NP | PO | PPT | IMP |
| --- | --- | --- | --- | --- |
| An increase in citation counts to your institution’s intellectual output | 15 | 15 | 15 | 15 |
| Reducing user dependence on your library’s print collection | 16 | 16 | 16 | 16 |

†  Parentheses indicate NP, PO, and PPT benefits that deviated from IMP top, middle, or bottom ranks.

*   T indicates a ranked benefit that tied another benefit’s weight.

IMPs do not agree with NPs, POs, and PPTs about the top-ranked benefit. IMPs choose “Capturing the intellectual capital of your institution,” whereas the other three respondent types choose “Better services to your institution’s learning community.” However, all four respondent types are very positive about both benefits.

Questionnaires asked IMP respondents about benefits a second time. The question was “Now that you are implementing or have implemented an IR, reassess these same anticipated benefits of IRs and tell whether you think they are less important or more important than you originally thought.” Answer categories listed after each benefit were (1) very much more important, (2) somewhat more important, (3) no change in importance, (4) somewhat less important, (5) very much less important, (6) no opinion, (7) do not know, and (8) not applicable.

When IMP respondents report a change in the importance of listed benefits, the change is an increase in importance. From 30% to 49% of IMP respondents report an increase in importance for the 11 benefits listed in Table 7.2.

Table 7.2. Increases in benefits’ importance

| Benefit | % Increase |
| --- | --- |
| An increase in your library’s role as a viable partner in the research enterprise | 48.7 |
| Longtime preservation of your institution’s digital output | 35.0 |
| An increase in the accessibility to knowledge assets such as numeric, video, audio, and multimedia data sets | 35.0 |
| Better service to contributors | 34.2 |
| Better services to your institution’s learning community | 34.2 |
| A solution to the problem of preserving your institution’s intellectual output | 32.5 |
| Exposing your institution’s intellectual output to researchers around the world who would not have access to it through traditional channels | 32.5 |
| New services to learning communities beyond your institution | 32.5 |
| A boost to your institution’s prestige | 31.7 |
| Capturing the intellectual capital of your institution | 30.0 |
| Maintaining control over your institution’s intellectual property | 30.0 |

The library’s role as a viable research partner makes the biggest jump: almost half of IMP respondents report an increase in its importance. Overall, 11 of 16 IR benefits register a 30%-or-more increase in importance between planning and implementation, a result that reinforces our idea about the multifaceted nature of IR benefits. Respondents did not use write-in responses to explain their answers to this question, but MIRACLE Project staff will be able to explore respondents’ answers in subsequent follow-up activities.

7.2            Building Relationships

Asked to what extent the IR will affect their institution’s ability to build relationships with others, such as archives, student services, library systems, and digital asset management systems, PO, PPT, and IMP respondents are overwhelmingly positive (Figure 7.2). No one chooses the answer categories “big negative effect” or “moderate negative effect,” and only between 3% and 6% of respondents choose the “combination of positive and negative effects” category. Because of the leading role that libraries take in the IR effort, this question is really a referendum on new relationships forged by libraries as a result of the IR effort.

[Figure 7.2]

A larger percentage of IMP respondents choose the “big positive effect” answer category (39.4%) than the “moderate positive effect” category (33.3%). Perhaps IMP respondents are especially enthusiastic in their responses to this question because they are starting to see evidence of new relationships as a result of IR implementation.

In the previous question about IR benefits, IMP respondents acknowledge an increased role in the research enterprise (see Table 7.2). Such a role probably comes with the new relationships to which IMP respondents are referring. The nature of these relationships can be explored in this project’s follow-up activities, e.g., phone interviews and case studies, when respondents can give open-ended responses to interview questions.

7.3            Deploying a Successful IR

Questionnaires asked PO, PPT, and IMP respondents what was likely to inhibit their ability to deploy a successful IR, listing 13 potentially inhibiting factors and asking respondents to rate them. To simplify results, MIRACLE Project staff assigned weights to response categories as follows: (+2) very likely; (+1) somewhat likely; (0) no opinion, don’t know, or not applicable; (-1) somewhat unlikely; and (-2) very unlikely. They totaled the weights for each factor and used the totals to rank order the inhibiting factors. Table 7.3 uses IMP ranks to order factors from top (1) to bottom (13).

Table 7.3. Factors inhibiting the deployment of a successful IR

| Top-ranked inhibiting factors (1 to 4) | PO | PPT | IMP |
| --- | --- | --- | --- |
| Absence of campus-wide mandates regarding mandatory contribution of certain material types, e.g., doctoral dissertations, master’s theses, faculty preprints | (5)† | (8) | 1 |
| Contributors’ lack of knowledge about how they can benefit from IRs | 4 | 1 | 2 |
| Convincing faculty that the IR will not adversely affect the current publishing model | (10) | (5) | 3 |
| Contributors’ concerns about intellectual property rights for digital materials | 3 | 4 | 4 |

| Middle-ranked inhibiting factors (5 to 8) | PO | PPT | IMP |
| --- | --- | --- | --- |
| Encouraging faculty to submit digital content to the IR | 7 | (2) | 5 |
| Competing for resources with other priorities, projects, and initiatives | (1) | (3) | 6 |
| Making members of your institution’s learning community aware of the IR | (12) | (9) | 7 |
| Contributors’ concerns about the difficulty of using the IR system to contribute digital content to the IR | 6 | 7 | 8 |

| Bottom-ranked inhibiting factors (9 to 13) | PO | PPT | IMP |
| --- | --- | --- | --- |
| Supporting all ongoing costs of an operational IR | (2) | (6) | 9 |
| Inability of contributors to formulate quality metadata | 9 | 10 | 10 |
| Difficulties in long-term preservation of digital files | 11 | 11 | 11 |
| Inadequacy of the IR system’s digital preservation capabilities | 13 | 12 | 12 |
| Lack of on-campus technical expertise in IR systems | (8) | 13 | 13 |

†  Parentheses indicate PO and PPT factors that deviate from IMP top, middle, or bottom ranks.

IMP respondents’ top-five ranked factors pertain to IR contributors and contributions. In fact, their concern in this regard is pushing them to consider mandating contributions of certain material types. Although PPT respondents are concerned about IR contributors and contributions, they have other priorities, projects, and initiatives that are competing with the IR effort for resources. PO respondents are even more concerned than PPT respondents about sustaining the IR effort, ranking “Competing for resources” and “Supporting ongoing costs of an operational IR” first and second, respectively.

Although there is little agreement among the three respondent types regarding the ranking of inhibiting factors, top-ranked factors for each respondent type reflect their pervasive concerns at their particular stage in the IR effort. IMP respondents have operational IRs. They are concerned about contributors and contributions; in fact, they will even venture to consider mandating deposits of certain materials into the IR. PO respondents do not have a pilot-test or operational IR; thus, their top concerns are competing with other priorities for resources and supporting the ongoing costs of an operational IR, followed by securing contributions for the IR. PPT respondents are pilot testing IRs, and thus, they are concerned about securing contributions to the IR.

7.4           IR Contributors

Questionnaires asked NPs and POs, the two respondent types with the least experience with IRs, to speculate on how easy it would be for them to get faculty and other members of their institutions’ learning community to contribute to the IR (see Figure 7.3). In view of the concern PPT and IMP respondents have about the success of IRs being connected to IR contributors and contributions (see Table 7.3), we should have invited all census respondents to answer this question.

Large percentages of NP and PO respondents think it will be difficult to get faculty to contribute to IRs. NP respondents are more positive than PO respondents about getting other members of their institution’s learning community to contribute to IRs; however, about one in eight PO respondents check the “do not know” category.

[Figure 7.3]

Questionnaires asked respondents to rate 15 reasons why others would contribute to the IR. For PO and PPT respondents, the question said, “Why do you think members of your institution’s learning community will contribute to an IR?” For IMP respondents, the question said, “When planning for an IR, what did you think would be the most important reasons why people would contribute to an IR?” Respondents rated the reasons on a scale from “very important” to “very unimportant.”

To simplify results, MIRACLE Project staff assigned weights to response categories as follows: (+2) very important; (+1) somewhat important; (0) no opinion, don’t know, or not applicable; (-1) somewhat unimportant; and (-2) very unimportant. The staff totaled the weights for each reason and used the totals to rank order the reasons. Table 7.4 uses the total ranks to order the top- (1 to 5), middle- (6 to 10), and bottom-ranked (11 to 15) reasons. Parentheses indicate PO, PPT, and IMP reasons that deviated from the total top, middle, or bottom ranks.

Table 7.4. Reasons for contributing to the IR

| Top-ranked reasons (1 to 5) | PO | PPT | IMP | Total |
| --- | --- | --- | --- | --- |
| To expose the particular scholar’s intellectual output to researchers around the world who would not have access to it through traditional channels | 3 | 1 | 1 | 1 |
| To boost the particular scholar’s prestige | 1 | 2 | 2 | 2 |
| To increase the accessibility to knowledge assets such as numeric, video, audio, and multimedia datasets | 4 | 3 | 3 | 3 |
| To place the burden of preservation on the IR instead of on individual faculty members | 5 | (6)†T* | (7) | 4 |
| To solve the problem of preserving your institution’s intellectual output | 2 | (6)T | (10) | 5 |

| Middle-ranked reasons (6 to 10) | PO | PPT | IMP | Total |
| --- | --- | --- | --- | --- |
| To expose your institution’s intellectual output to researchers around the world who would not have access to it through traditional channels | 6 | (4) | 6 | 6 |
| To increase citation counts to the particular scholar’s oeuvre | 8 | (5) | (5) | 7 |
| To reduce the amount of time between discovery and dissemination of research findings to scholarly communities | 7 | 8 | (4) | 8 |
| To encourage other scholars to provide open access to their intellectual output | (12) | 9 | 8 | 9T |
| To provide maximal access to the results of publicly funded research | 9 | (11) | 10 | 9T |

| Bottom-ranked reasons (11 to 15) | PO | PPT | IMP | Total |
| --- | --- | --- | --- | --- |
| To boost your institution’s prestige | 11 | (10) | (9) | 11 |
| To increase the library’s role as a viable partner in the research enterprise | (10) | 12 | 13 | 12 |
| To increase citation counts to your institution’s intellectual output | 13 | 13 | 12 | 13 |
| To contribute to the reform of the entire enterprise of scholarly communication and publishing | 14 | 14 | 14 | 14 |
| To reduce user dependence on your library’s print collection | 15 | 15 | 15 | 15 |

†  Parentheses indicate PO, PPT, and IMP reasons that deviated from the total top, middle, or bottom ranks.

*   T indicates a ranked reason that tied another reason’s weight.

In Table 7.4, respondents give high ratings to reasons that enhance faculty’s scholarly reputations and assign responsibility for research-dissemination tasks to others so that faculty can focus on intellectual tasks. Lower-ranked reasons pertain to enhancing the institution’s standing.

MIRACLE Project investigators listed the reasons “Boosting the institution’s prestige” and “Reforming scholarly communication” because they are prominent in discussions about the ability of IRs to derail the current publishing model (Chan 2004; Crow 2002a; Harnad 2001b). These reasons, however, are not the ones that respondents feel motivate faculty to contribute to IRs; in fact, respondents rank these reasons toward the bottom.

Most census respondents are positive about the importance of all listed reasons. Only the two bottom-ranked reasons have most respondents checking the two “unimportant” response categories.

7.5            Evaluation Methods

Table 7.5 reports the number and percentage of IMP respondents who have used the listed methods to assess their IR’s success.

Table 7.5. Methods of assessing the IR’s success

| Method | IMP No. | IMP % |
| --- | --- | --- |
| Tracking number of contributions | 27 | 56.3 |
| Tracking number of users | 21 | 43.8 |
| Tracking number of unique contributors | 19 | 39.6 |
| Tracking number of searches | 19 | 39.6 |
| Conducting interviews with IR contributors | 10 | 20.8 |
| Tracking number of queries | 9 | 18.8 |
| Tracking number of unique IR users | 7 | 14.6 |
| Conducting interviews with IR users | 5 | 10.4 |
| Surveying IR contributors | 4 | 8.3 |
| Surveying IR users | 2 | 4.2 |

Over half of IMP respondents are tracking the number of contributions to their IRs. Other popular methods are tracking the number of users, unique contributors, and searches. Popular approaches enlist simple counts that the IR system probably produces automatically in periodic management reports. Less popular are interviews with IR contributors (20.8%) or IR users (10.4%) and surveys of IR contributors (8.3%) or users (4.2%). These methods are more intensive, requiring staff to draft data-collection instruments, submit them to institutional review boards for human subjects approval, recruit respondents, collect and analyze data, and communicate results.
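For readers implementing similar assessments, the count-based measures in Table 7.5 amount to a few aggregations over an IR system’s event records. Below is a minimal Python sketch over a hypothetical log of (event, actor, item) tuples; the log format and field names are assumptions for illustration, not any particular IR system’s API:

```python
from collections import Counter

# Hypothetical event log: (event_type, actor, item) tuples. Real IR systems
# expose equivalents through their statistics or reporting modules.
log = [
    ("deposit", "prof_a", "paper1"),
    ("deposit", "prof_b", "paper2"),
    ("deposit", "prof_a", "dataset1"),
    ("search", "student_x", None),
    ("search", "student_y", None),
    ("download", "student_x", "paper1"),
]

events = Counter(e for e, _, _ in log)
contributions = events["deposit"]                         # number of contributions
searches = events["search"]                               # number of searches
unique_contributors = len({a for e, a, _ in log if e == "deposit"})
unique_users = len({a for _, a, _ in log})                # unique actors of any kind

print(contributions, searches, unique_contributors, unique_users)
```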

Questionnaires should have featured a response category for “Nothing to date” because three IMP respondents wrote us saying they have not collected any data for evaluation purposes.

7.6            Chapter 7 Summary

Census respondents give high ratings to more than a dozen benefits of IRs. In fact, their ratings are so high it is hard to single out one or two benefits as more important than the others (see Subchapter 7.1). Two explanations are given for this finding: (1) IRs have many benefits; or (2) it may be premature to single out any particular benefits because IRs have not yet come into their own.

Questionnaires asked IMP respondents to examine IR benefits a second time, reassessing whether benefits are more or less important now that they are implementing or have implemented an IR. When respondents note a change, the change is an increase in importance (see Table 7.2). The library’s role as a viable research partner makes the biggest jump; almost 50% of IMP respondents rated this benefit as increasing in importance.

Asked to what extent the IR will affect their institution’s ability to build relationships with others such as archives, student services, and library systems, PO, PPT, and IMP respondents are overwhelmingly positive (see Figure 7.2). Hardly anyone chooses “Negative effect” or “Combination of positive and negative effects” categories.

What it means to be a viable research partner, and the new relationships that the library has established as a result of the IR, can be explored in this project’s follow-up activities, when respondents can give open-ended responses to interview questions.

Findings about factors that are likely to inhibit their ability to deploy a successful IR reflect the pervasive concerns of PO, PPT, and IMP respondents at their particular stage in the IR effort (see Table 7.3). Because IMP respondents have operational IRs, they are concerned about contributors and contributions. In fact, they will even consider mandating deposits of certain materials into the IR. PO respondents do not have a pilot-test or operational IR; their concerns relate primarily to competing with other projects, priorities, and initiatives for resources at a time when they will soon be acquiring hardware, IR-system software, and the requisite technical expertise. PPT respondents are pilot testing IRs; thus, they are concerned about securing contributions to the IR.

The four top-ranked reasons why census respondents think people will contribute to IRs are connected with enhancing scholarly reputations and offloading research-dissemination tasks onto others (see Table 7.4). Reasons pertaining to reforming the current publishing model figure toward the bottom of the ranked list.

Methods that institutions are using to evaluate IRs usually enlist simple counts that IR systems produce automatically in periodic management reports (see Table 7.5). Less popular are interviews and surveys that require staff to dedicate considerable time and effort to planning, data collection, analysis, and reporting.

 

8 INSTITUTIONS THAT HAVE NO INVOLVEMENT WITH IRS

Chapter 8 features findings pertaining to institutions that have done no planning for an institutional repository (IR).

8.1            Reasons for No Planning

Participating in the MIRACLE Project’s nationwide census are 236 respondents from institutions where no IR planning (NP) has been done to date. Dominating the NP respondent type are institutions from the Carnegie Classification of Institutions of Higher Education (CCHE) master’s (43.6%) and baccalaureate (33.5%) classes (see Table 2.3).

Questionnaires asked NP respondents to rate 15 reasons why they have not yet done such planning. Table 8.1 reports the percentage of respondents who gave each reason a “very” or “somewhat” important rating.
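The “% Important” figures in Table 8.1 collapse the two positive response categories into a single percentage. A short Python sketch of that calculation, again with invented counts rather than census data:

```python
# Invented category counts for one reason (not actual census data).
counts = {
    "very important": 120,
    "somewhat important": 86,
    "no opinion/don't know/not applicable": 14,
    "somewhat unimportant": 10,
    "very unimportant": 6,
}

total = sum(counts.values())
# "% Important" = share of respondents choosing either positive category.
pct_important = 100 * (counts["very important"] + counts["somewhat important"]) / total
print(f"{pct_important:.1f}%")  # 87.3% for these invented counts
```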

Table 8.1. Reasons for no planning

| Rank | Top-ranked reasons (1 to 5) | % Important |
| --- | --- | --- |
| 1 | Other priorities, issues, activities, etc., are more pressing than an IR | 87.2 |
| 2 | We have no resources to support planning | 71.1 |
| 3 | We want to assess IRs at institutions like our own before taking the plunge | 65.5 |
| 4 | We have no in-house expertise for planning | 58.8 |
| 5 | We want to assess IRs at other institutions generally before taking the plunge | 56.1 |

| Rank | Middle-ranked reasons (6 to 10) | % Important |
| --- | --- | --- |
| 6 | We are waiting for funding to support IR planning | 48.2 |
| 7 | We have no support from our institution’s administration | 36.3 |
| 8 | We are waiting to join a consortium, partnership, or group | 36.0 |
| 9 | We doubt members of our institution’s learning community will contribute to an IR | 33.3 |
| 10 | We are not convinced that an IR would benefit our institution’s learning community | 32.9 |

| Rank | Bottom-ranked reasons (11 to 15) | % Important |
| --- | --- | --- |
| 11 | We have no support from our institution’s information technology group | 23.4 |
| 12 | We do not understand or believe in the value or effectiveness of an IR | 19.4 |
| 13 | We will outsource IR services to another institution, consortium, partnership, or group | 16.8 |
| 14 | We do not need an IR | 15.4 |
| 15 | We have no support from our library’s administration | 9.5 |

None of the top-ranked reasons precludes these institutions from getting involved with IRs at a later date. Right now, NP institutions appear to have other things on their plates: they have neither the resources nor the expertise needed for IR planning, or they want to assess what comparable institutions have done before taking the plunge.

Reasons ranked 10, 12, and 14 are ones that might preclude academic institutions from becoming involved with IRs. They are ranked almost at the bottom of Table 8.1. Had these reasons been ranked at the top, we might be persuaded that NP respondents have little interest in IRs or do not consider them appropriate for their institution’s learning community. Responses indicate that this is not the case.

Figure 8.1 graphs NP respondents’ exact answers to these three reasons. Between 40% and 49% of NP respondents consider them “unimportant,” and between 33% and 41% are undecided, choosing the answers “Do not know,” “Not applicable,” or “No opinion.”

The no-planning reason getting the least support is “We have no support from our library administration,” implying that NP respondents generally do have support from the library administration. Because libraries play such major roles in IR implementation, we do not think that the alternative explanation—NP respondents do not think library support is important for IR implementation—applies here.

Write-in answers give details that are impossible for respondents to express in closed-ended questions. Here is how one NP respondent explained why his or her institution could not take on an IR:

•               “Our institution is almost 200 years old and has rich historical resources and an administration that values the resources and is willing to support them financially. We have not yet begun conversations about IR. We have, however, hired our first professionals: museum director, museum registrar, university archivist, and special collections librarian. They are tackling the initial processing of huge collections. We are also completing construction of a $9 million museum and significantly expanding and renovating the space in the library for archives and special collections.”

A handful of respondents write that they have no need for an IR because of their institution’s emphasis on teaching over research:

•               “Our faculty do not do research as we are a computer-aided design school.”

•               “We are a very small institution (500 full-time equivalents [FTEs]) with an emphasis on teaching and produce relatively little material of this nature.”

Write-in answers reveal two answer categories that MIRACLE Project investigators should have included on the questionnaires—one about consortia and a second about the issue of IRs never having been raised. Respondent comments about these are as follows:

•               “We are in the talking stage internally and are in conversation with a consultant to meet with us and sort out the issues as well as a strategy for eliciting interest in the university. Moreover, our library and IT consortia are interested in a joint endeavor to analyze regional resources and gaps in resources for pursuing institutional repositories and/or consortial repository.”

•               “Participating in state-wide library planning effort.”

•               “To my knowledge, the issue [of IRs] has never been raised.”

•               “There does not appear to be any college discussion or support [for an IR initiative].”

•               “This [IR] question has never been discussed on our campus as far as I know.”

Two NP respondents note that they are in the dark about IRs:

•               “We do not understand what an institutional repository is.”

•               “We are not aware of this whole topic as you obviously are aware by the ‘don’t know’ responses.”

Had large numbers of NP respondents expressed comparable sentiments, MIRACLE Project investigators would have been convinced that NP respondents were uninterested in IR implementation or did not think IRs appropriate for their institutions. Having encountered few such comments, we conclude that NP respondents simply do not have IR implementation on their agendas right now.

8.2            IR-related Activities

The questionnaires listed 14 IR-related activities, events, or issues that might put NP respondents on the road to an IR and asked them to rate the importance of each item on a scale from “very important” to “very unimportant.” Table 8.2 reports the percentage of respondents who rated each IR-related activity “very” or “somewhat” important on the road to an IR.

Table 8.2. IR-related activities on the road to an IR

| Rank | Top-ranked activities (1 to 5) | % Important |
| --- | --- | --- |
| 1 | How much it costs to implement an IR | 90.2 |
| 2 | How much it costs to maintain an IR | 89.6 |
| 3 | How much it costs to plan for an IR | 83.0 |
| 4 | What institutions comparable to my own are doing with regard to IRs | 81.3 |
| 5 | Whether members of my institution’s learning community will use our IR | 74.9 |

| Rank | Middle-ranked activities (6 to 10) | % Important |
| --- | --- | --- |
| 6 | Whether members of my institution’s learning community will contribute to our IR | 73.9 |
| 7 | How to interest my institution’s administration in IR planning | 72.3 |
| 8 | What is the impetus for IR planning and implementation at institutions comparable to my own | 71.6 |
| 9 | What other institutions generally are doing with regard to IRs | 71.4 |
| 10 | An IR as an accepted “best practice” in the profession | 68.2 |

| Rank | Bottom-ranked activities (11 to 14) | % Important |
| --- | --- | --- |
| 11 | How much it costs to migrate to a new IR | 63.8 |
| 12 | What is the impetus for IR planning and implementation at other institutions generally | 59.8 |
| 13 | How to interest a consortium, partnership, group, library network, etc., in IR planning | 48.3 |
| 14 | How to interest an institution(s) in partnering with us on an IR | 37.1 |

First, NP respondents are concerned about the costs of IRs. Next, they want to know what institutions comparable to their own are doing with regard to IRs. They then want to know whether members of their institution’s learning community will contribute to and use their IR. Finally, they want to know how to interest their institution’s administration in IR planning.

Only the bottom two activities in Table 8.2 receive “important” ratings from fewer than half of NP respondents. Both of these activities address partnering with other institutions for IR services. Clearly, NP respondents in the MIRACLE Project census prefer to go it alone in terms of IR services.

Because NP respondents give high ratings to almost all IR-related activities, they appear surprisingly favorably inclined toward IRs. This may be because of how we invited people to participate in the census. We performed the electronic version of the salesperson’s “cold call”; that is, we sent prospective respondents e-mail messages with a substantive phrase in the “SUBJECT” line announcing our IR census and asked them to participate. Most likely, the people who responded to our e-mail message are interested in IRs and are thus more likely than others to open and read such a message and to respond positively about IRs on their questionnaire.

Questionnaires asked NP respondents to choose one or more events that would have to happen for IR planning to begin at their institutions. Table 8.3 reports the percentage of respondents who chose each listed event.

Table 8.3. What would have to happen for IR planning to begin at your institution?

| Rank | Top-ranked events (1 to 4) | % Choosing |
| --- | --- | --- |
| 1 | We receive funding from our institution’s administration | 66.1 |
| 2 | Successful IR demonstration projects at a comparable institution | 54.7 |
| 3 | We receive approval from our institution’s administration | 52.1 |
| 4 | We are convinced that our institution’s learning community would contribute to it | 43.2 |

| Rank | Middle-ranked events (5 to 7) | % Choosing |
| --- | --- | --- |
| 5 | We receive additional personnel resources to support planning | 36.4 |
| 6 | We receive approval from our institution’s information technology group | 34.3 |
| 7 | We reassess our institution’s current priorities, issues, and activities | 33.9 |

| Rank | Bottom-ranked events (8 to 12) | % Choosing |
| --- | --- | --- |
| 8 | Successful IR demonstration projects at other institutions generally | 19.5 |
| 9T | Contracting for IR services from another institution, consortium, or group | 18.2 |
| 9T | We receive approval from our library’s administration | 18.2 |
| 11 | We receive funding from our institution’s information technology group | 17.4 |
| 12 | We receive funding from our library’s administration | 12.7 |

To initiate IR planning, NP respondents need approval from their institution’s administration and funding that includes support for the personnel to undertake the project. Respondents also want evidence of successful IR projects at comparable institutions. They are not interested in evidence of such projects at institutions unlike their own and are not interested in partnering for IR services. Because most NP respondents come from master’s and baccalaureate institutions, they want to be convinced that IRs at institutions awarding these degrees are successful in terms of technical implementation, securing contributions to the IR from the local learning community, and system use by the local community and beyond.

NP respondents’ open-ended responses to this question reveal five themes: (1) the pressing nature of other priorities, issues, activities, etc.; (2) the need for resources to begin planning; (3) waiting for a consortium; (4) low levels of research at the institution; and (5) raising the issue of IRs at their institution. Here are examples of each:

1.               Other priorities, issues, and activities:

•               “[The] IR is one of the university librarian’s hot topics for his 10-year plan for content services. For now, however, other issues are more pressing.”

•               “Achieve stability in the administration of the institution.”

•               “Many of these requirements are already in place; delay is just because we have other projects higher on the priority list.”

2.               Resources needed:

•               “We have appropriate, trained, skilled personnel who are committed to overseeing the project.”

•               “We [must] receive staff support [for the IR project] from [the] information technology group.”

•               “We find some outside funding.”

•               “We have limited staff to engage in the planning, promotion, education, etc. Funding may eventually be a problem, but until planning is done, I can’t say for certain.”

3.               Waiting for a consortium:

•               “Move by the state board of regents to develop a system-wide IR.”

•               “Consortium, could move toward a statewide IR site.”

4.               Low research levels:

•               “This is not a research institution, hence IR planning is not a high priority here.”

•               “We need to publish more.”

5.               Raising the issue:

•               “Someone at the administrative level would need to embrace an IR as a goal for our institution—right now, an IR seems to be completely off radar.”

8.3            Likelihood of IR Planning

Two questions asked NP respondents about the likelihood of future IR planning. The first question asked them about such a likelihood in the near future (next 12 months), and the second asked them about this in the medium term (next 24 months). Figure 8.2 shows the results.

Fewer than 20% of NP respondents are likely to start IR planning in the next 12 months. Just under 50% are likely to start such planning within 24 months. An operational IR is a distinct possibility for many of the NP institutions participating in the MIRACLE Project census.

8.4            Chapter 8 Summary

The top-ranked reason why NP institutions have done no IR planning to date is the pressing nature of other priorities, issues, and activities (see Table 8.1). None of the other top-ranked reasons rules out these institutions from eventually getting involved with IRs. Had reasons such as “We do not need an IR” or “We do not understand or believe in the value or effectiveness of an IR” been top ranked, then we would question whether NP institutions would get involved with IRs in the short- to medium-term future.

Right now, NP institutions appear to have other things on their plate, they have no resources or expertise for IR planning, or they want to assess what others are doing before taking the plunge.

Asked to rate the importance of 14 next steps on the road to an IR (see Table 8.2), NP respondents give the highest ratings to three steps pertaining to costs:

  • “How much it costs to implement an IR”
  • “How much it costs to maintain an IR”
  • “How much it costs to plan for an IR”

After learning about costs, NP respondents want to know what institutions comparable to their own are doing with regard to IRs. Then they want to know whether members of their institution’s learning community will contribute to and use the IR. Finally, they want to know how to interest their institution’s administration in IR planning. They are not interested in partnering with other institutions. When asked about learning from other institutions, they want to know about the IR-implementation experience of comparable institutions, meaning master’s and baccalaureate institutions, because these are the majority of NP institutions responding to the MIRACLE Project census.

Underlying the high ratings NP respondents give to all but a handful of the next steps on the road to IR planning is a favorable inclination toward IRs. This may be because of how MIRACLE Project staff invited people to participate in the census, performing an e-mail version of cold calling. Most likely, the people who responded to our e-mail message are interested in IRs and are thus more likely to open and read a message about them and, eventually, respond positively about IRs on their questionnaires.

To initiate IR planning (see Table 8.3), NP institutions need approval from their institution’s administration and funding that includes support for personnel. They also want evidence of successful IR projects at comparable institutions; again, this means master’s and baccalaureate institutions. They are not interested in evidence of such projects at institutions unlike their own and are not interested in contracting for IR services. Several NP respondents wrote open-ended responses that reveal these themes: (1) the pressing nature of other priorities, issues, activities, etc.; (2) the need for resources to begin planning; (3) waiting for a consortium; (4) low levels of research at the institution; and (5) raising the issue of IRs at their institution.

Fewer than 20% of NP respondents are likely to start IR planning in the next 12 months. The percentage increases to just under 50% for a start within 24 months.

Generally, NP respondents in the MIRACLE Project census are favorably inclined toward IRs. They come from master’s and baccalaureate institutions where research may not be as important as teaching and service. These respondents recognize the importance of IRs for their institutions and for educational institutions in general. Because they have other priorities right now, they are content to take a wait-and-see approach, that is, monitoring whether IRs at institutions like their own have been successful in technical implementation, whether members of their learning communities will contribute to the IR, and whether they will use the IR. Planning for the cost of IR implementation, finding staff with the requisite expertise, and broaching the issue of IR implementation with their administration are important issues for NP respondents.


9 DISCUSSION OF CENSUS FINDINGS

Chapter 9 discusses census findings, specifically, the sleeping beast of demand for institutional repositories (IRs) from master’s and baccalaureate institutions, findings that confirm those of previous surveys, and findings that build on our knowledge of IRs. It concludes with observations on issues pertaining to IRs that will persist long after the MIRACLE Project terminates.

9.1            The Sleeping Beast of Demand for IRs from Master’s and Baccalaureate Institutions

No planning (NP) institutions are the largest respondent type in the MIRACLE Project’s nationwide census, accounting for 52% of respondents (see Figure 2.1). Planning only (PO) institutions are the second-largest respondent type in the census, accounting for 21% of respondents (see Figure 2.1). Dominating both NP and PO respondent types are institutions from the Carnegie Classification of Institutions of Higher Education (CCHE) master’s (43.6% and 34.8%, respectively) and baccalaureate (33.5% and 31.5%, respectively) classes (see Table 2.3).

Despite their prevalence (56.6%) in the population of institutions of higher education (IHEs) in the United States (see Figure 2.3), master’s and baccalaureate CCHE institutions are not where IR activity is happening. To date, the story of IRs in U.S. academic institutions has been written by the research universities CCHE class. Although research universities represent only 7.9% of IHEs in the U.S. (see Figure 2.3), they are the majority (62.5%) of IMP institutions in the MIRACLE census, that is, institutions where IRs have been implemented (see Table 2.3). Previous IR surveys (Bailey et al. 2006; Shearer 2004) have been limited to members of the Association of Research Libraries (ARL) in the United States and of the Canadian Association of Research Libraries (CARL)—types of libraries that are typical of the research universities CCHE class. The Coalition for Networked Information (CNI), another recent surveyor, is sponsored by ARL, and CNI’s survey of 81 liberal arts colleges bearing CNI consortial membership reveals that only 6% have an operational IR (Lynch and Lippincott 2005). Two prominent research universities, Massachusetts Institute of Technology (MIT) and Cornell University, have been involved in the development of the popular DSpace and Fedora IR systems, respectively. Case studies (e.g., Smith et al. 2003; Rogers 2003; Walters 2006; Baudoin and Branschofsky 2003) focus on research library and research university experiences with IRs. IRs are a recent phenomenon and they are happening at research universities.

The MIRACLE Project census has uncovered a sleeping beast of demand on the part of master’s and baccalaureate universities and colleges regarding IRs. Respondents at these institutions want to know about the IR experiences of master’s and baccalaureate institutions generally (see Tables 8.1, 8.2, and 8.3). They also want to learn about their peers’ experiences with IR costs, required technical expertise, funding the IR effort, whether the local learning community will contribute to and use the IR, and raising the issue of IRs with their institution’s central administration.

MIRACLE Project questionnaires ended with this question: “How can the MIRACLE Project assist you regarding IRs?” Many NP and PO respondents asked us particularly about the small and mid-size college and university experiences with IRs. Samples of such responses include the following:

  • “Tell us stories about how small institutions made their IRs a reality. We really need models and realistic next steps.” (PO respondent)
  • “Best practices, identification of institutions like ours [a small liberal arts college in the middle Atlantic states] who have succeeded, formation or information about collaborative groups who have (or will have) a shared IR that we can join.” (NP respondent)
  • “I believe that a full-fledged IR is beyond our capabilities at this point, but would be interested in continuing to hear about developments in this area, especially in small universities.” (NP)
  • “Provide examples of what other small to mid-size public universities are doing with IRs.” (NP)
  • “Would love to see models in a small, liberal arts college environment, particularly for consortial opportunities.” (NP)
  • “We are always interested in what our peer institutions [mid-size public midwestern universities] are doing.” (NP)
  • “Providing information about what comparable institutions [small liberal arts colleges in the Central Plains states] are doing …” (NP)
  • “By publicizing success stories from institutions similar in size and mission to ours [small private midwestern college].” (NP)
  • “Provide more information about IRs, their benefits, and the resources needed to establish and maintain them. We are a small, underfunded undergraduate [private Southern] college just struggling to fund basic needs.” (NP)

NP respondents participating in the MIRACLE Project are surprisingly positive about IRs. Very few are totally in the dark in terms of what IRs are and whether they have relevance for their institutions (see Figure 8.1). Slightly less than 50% of NP respondents may start IR planning within the next 24 months (see Figure 8.2).

The positive attitude that this project’s NP respondents have about IRs may be the result of how MIRACLE Project staff invited people to participate. We performed the electronic version of cold calling, that is, we sent prospective respondents e-mail messages with a substantive phrase in the “SUBJECT” line announcing our IR census and asked them to participate. Most likely, the people who responded to our message are interested in IRs and are more likely to read and respond to such a message, and, eventually, to respond positively about IRs on their questionnaire.

MIRACLE Project investigators identified more themes in NP respondents’ answers to our question “How can the MIRACLE Project assist you regarding IRs?” These themes are (1) learning about IRs generally, (2) learning the details and specifics of IRs, (3) best practices, (4) benefits of IRs, (5) securing funding for IRs, (6) encouragement and advocacy, (7) opportunities for partnerships, and (8) learning about IRs from completing the MIRACLE Project’s questionnaire. Table 9.1 lists a few remarks for each of these themes. Many more examples could be enumerated in this table, and most remarks cut across two or more themes.

Table 9.1. How the MIRACLE Project can assist NP respondents regarding IRs

| Theme | NP respondent’s remark | Institution detail |
| --- | --- | --- |
| (1) Learning about IRs generally | “I’m still learning what IRs are and how we might think about starting one ourselves. Any information on those topics would be useful to me at this point.” | Special focus professional school in the Southwest |
| | “Continue disseminating information. Review and publicize tools, especially those for institutions with limited technical support and funding.” | Small private liberal arts college in the Southeast |
| | “I think your study itself will be valuable.” | Western master’s university |
| | “Send us survey results. Connect us to institutions like us who are considering IRs.” | Small private liberal arts college in the Central Plains states |
| (2) Learning the details and specifics of IRs | “Marketing materials, potential benefits and liabilities … The whole administrative impact. From the smallest size institution, this is more than just adding a service; it could relate to a huge percentage of extremely tight resources. Erase the FEAR of costs.” | Small private church-affiliated liberal arts college in the Great Lakes area |
| | “Information and assistance on coping with copyright issues associated with an IR; promotional material and arguments to convince faculty and the learning community to participate in and support an IR.” | Mid-size public research university in the Mountain West |
| (3) Best practices | “Provide ‘best practices’ for an institution of my size. Offer guidelines for partnering with other institutions.” | Small public baccalaureate university in the Mountain West |
| | “We’ll be interested in the procedures developed by others and what current best practices are at the time we’re ready to start.” | Small private master’s university in New England |
| (4) Benefits of IRs | “We are in the midst of an institution-wide reassessment. The benefits listed in question 6a reflect many of the values we would like to incorporate in our plans for the future [and] … to the academic community as a whole.” | Small private master’s university in a large northeastern city |
| | “Provide concise examples and talking points of benefits and successes from IRs for use to gain campus and administrative support.” | Mid-size public master’s university in a Central Plains state |
| (5) Securing funding for IRs | “Figure out a way so that the top administration would want to fully fund such an operation. It really requires a lot of talented labor to input and maintain.” | Small private research university in the Northeast |
| | “Provide more information about IRs, their benefits, and the resources needed to establish and maintain them. We are a small, underfunded, undergraduate college just struggling to fund basic needs.” | Small private church-affiliated baccalaureate college in the South |
| (6) Encouragement and advocacy | “This is really low on the radar right now. Just being there in the future is all I could ask at this time.” | Small technical [special-focus] institution in the Northern Plains states |
| | “If you can help me wake people up to the potential of an IR over the din of all the other challenges of an institution like ours, that would be great.” | Small private religious-affiliated master’s university in the Northeast |
| (7) Opportunities for partnership | “Suggest some consortial models we could investigate.” | Small private liberal arts college in the Southeast |
| | “Would love to see models in a small, liberal arts college environment, particularly for consortial opportunities.” | Small private master’s university in the Southeast |
| (8) Learning about IRs from the questionnaire | “You have already identified the issue for me. I will leave it simmering on the back burner until I see more interest within the faculty and the library community in general.” | Small, private, church-affiliated liberal arts college in the Central Plains states |
| | “Having this kind of in-depth survey to use as background information and ammunition will help spark the whole planning and implementation.” | Mid-size midwestern doctoral university in a Central Plains state |
| | “Presenting the range of questions we should be thinking about, so just taking this survey has been educational.” | Major military academy |

The high level of interest in IRs is an opportunity for institutions other than research universities to share their stories about IRs with an audience that is craving information. It is also an opportunity for the MIRACLE Project to focus on such institutions in subsequent project activities, where the need is greatest and where the widest gap in our knowledge about IRs exists.

9.2            Verifying Previous Survey Findings Pertaining to Institutions Involved with IRs

Findings from the MIRACLE Project census verify previous survey findings pertaining to institutions with operational IRs (see Appendix F). Table 9.2 summarizes the most important of these findings. The Executive Summary is also comprehensive in its enumeration of MIRACLE Project census findings.

Table 9.2. Previous survey findings verified in the MIRACLE Project census

| Finding | Report references |
| --- | --- |
| Research universities lead in the implementation of IRs. | Table 2.3 and Subchapter 2.2 |
| Master’s and baccalaureate institutions lag far behind in the implementation of IRs. | Table 2.3 and Subchapter 2.2 |
| Libraries play a leading role in planning, pilot testing, and implementing IRs. | Table 2.4 and Subchapter 2.3; Table 2.5, Figure 2.5, and Subchapter 2.5; Figure 2.6 and Subchapter 2.6; Table 3.1 |
| Committee membership becomes increasingly less inclusive as the IR project progresses from pilot testing to implementation. | Figure 2.5 and Subchapter 2.5 |
| The number of staff involved in the IR effort decreases from the planning and pilot-testing stage to the IR implementation stage. | Figure 2.4 and Subchapter 2.4 |
| Libraries bear the brunt of the cost of the IR. | Table 3.1 and Subchapter 3.1 |
| A typical approach to funding the IR is absorbing its cost in routine library operating costs. | Table 3.1 and Subchapter 3.1 |
| Staff and benefits costs dominate the budget for the IR. | Figure 3.1 and Subchapter 3.2 |
| Pilot testing IR-system software is an important investigative activity. | Table 4.3 and Subchapter 4.3 |
| Institutions’ preferred IR-system software for both pilot testing and implementation is DSpace. | Table 5.2 and Subchapter 5.2 |
| Most IR staff modify their IR-system software. | Figure 5.1 and Subchapter 5.3 |
| Both pilot-test and operational IRs are very small. | Figure 6.1, Table 6.1, and Subchapter 6.1 |
| Dominating pilot-test and operational IRs are the traditional products and by-products of the research enterprise. | Table 6.1 and Subchapter 6.1 |
| Operational IRs contain a wide range of text, numeric, and multimedia files. | Table 6.1 and Subchapter 6.2 |
| Except for PDFs, institutions with operational IRs do not guarantee file formats in perpetuity. | Table 6.2 and Subchapter 6.4 |
| The major contributors to operational IRs are faculty or graduate students. | Table 6.4 and Subchapter 6.5.2 |
| Recruiting digital content for the IR is difficult. | Figure 6.7 and Subchapter 6.5.4; Table 7.3 and Subchapter 7.3; Figure 7.3 and Subchapter 7.4 |
| IR staff working one-on-one with early adopters is a successful method for recruiting IR content. | Figure 6.5 and Subchapter 6.5.3 |
| IR staff may consider institutional mandates that require members of their institution’s learning community to deposit certain document types in the IR. | Subchapter 6.5.4; Table 7.3 and Subchapter 7.3 |
| For IR staff, top-ranked benefits of IRs are institution based. | Table 7.1 and Subchapter 7.1 |
| Evaluation methods to date are limited to simple counts that most IR systems produce automatically in management reports. | Table 7.5 and Subchapter 7.5 |

9.3            Building on Our Knowledge of IRs

MIRACLE Project findings that build on our knowledge of IRs are featured in this subchapter. Because the questionnaire method does not allow MIRACLE Project investigators to gain insight above and beyond closed-ended responses, we must sometimes speculate on the reasons for these findings. Subsequent project activities (e.g., phone interviews, case studies, follow-up electronic mail correspondence, and quasi-controlled experiments with IR users) will determine whether our speculation, arguments, and reasoning are on target.

  1. Except for the library director, the key people who would have to be very active to initiate an IR effort where none is under way are external to the library. (See Table 2.4 and Subchapter 2.3, and Table 8.3. See also Table 4.1 and Subchapter 4.1.)

Librarians are especially active in the IR effort at PO, PPT, and IMP institutions, serving on planning and advisory committees, pilot testing software, recruiting content, identifying early adopters, etc. At NP institutions where no IR effort is under way, the library director takes the lead, inquiring about funding from the provost and technical expertise from the chief information officer (CIO) and learning about the faculty’s interest in making contributions and urging their students to make contributions.

Related to this finding are the important investigative activities that IR staff undertake. Especially important to PO respondents is demonstrating operational IRs to their institutions’ decision makers (see Table 4.1). Because PO respondents are in the early stages of the IR initiative, they want those who will ultimately be making the decision about their institution’s IR effort and, possibly, giving financial support for the IR project, to understand the basic concept of IRs. Demonstrating IRs makes them more tangible to decision makers so they may be more favorably inclined to the IR initiative in terms of both funding and rhetoric.

  2. Archivists are less prominent in the IR effort than expected. (See Tables 2.4 and 2.5, Figures 2.5 and 2.6, Table 3.1 and Subchapter 3.1, and Subchapters 2.3, 2.4, 2.5, and 2.6.)

Despite the inclusion of “archivists” response categories on questionnaires, archivists figure in the middle when respondents report the positions of people involved in the IR effort (see Table 2.4), leading the IR effort (see Table 2.5), and serving on IR committees (see Figure 2.5). Between the planning and implementation phases of the IR effort, archivists’ responsibility for the IR appears to diminish (see Figure 2.6). Archivists are also not expected to bear the burden of funding the IR project (see Table 3.1).

MIRACLE Project investigators have no census data that would help explain the marginalization of the archivist with respect to IRs. There may be merit to Crow’s (2002a) observation that the IR competes with the university archives (see Appendix F4).

IRs could benefit from archivists who are experts in collection building and disposition. The type of one-on-one collection development and content recruitment now being carried out by librarians to populate IRs is exactly the type of field work that archivists have done for decades. Closely related to this type of content recruitment is archival appraisal, which is a different type of collection analysis for librarians and pushes their skill set into the archival arena.

In future studies, the relationship between IRs and archivists deserves further investigation to shed light on the reasons for archivists’ limited participation in IR efforts.

  3. Staff involved with the IR effort have voracious appetites for information about IRs, especially information pertaining to successful implementations at institutions like their own. (See Table 4.1 and Subchapter 4.1, Tables 8.1 and 8.2, and Subchapters 8.1 and 8.2.)

The tables cited above only scratch the surface of what respondents want. The wide range of their interests and needs is demonstrated in their answers to the final question on MIRACLE Project questionnaires, “How can the MIRACLE Project assist you regarding IRs?” Because enumerating their specific requests in their own words would be too lengthy for this report, we use Table 9.3 to characterize them.

Table 9.3. Characterizing respondents’ requests for more information about IRs

Respondents want information on NP PO PPT IMP
MIRACLE Project census findings X X X X
Successful IR implementations especially at institutions like their own X X X X
Best practices X X X X
State-of-the-art regarding IRs X X X X
Examples* X X X X
MIRACLE Project census data X X X
Costs and budgets X X X
Case studies especially at institutions like their own X X
Written policies X X
MIRACLE Project as a clearinghouse for all information about IRs X X X
Joining a consortium or partnership X X
Benefits of IRs X X
Compelling arguments for an IR in institutions like their own X X
Software reviews X X
Grant funding opportunities X
MIRACLE Project questionnaires X
*   Examples cited by one or more respondent types: written policies, procedures, consortial agreements, permissions templates, intellectual property agreements, requests for proposals, benchmarks, models, collection development policies, IR-system software checklists (especially for open-source alternatives), comparative analyses of IR-system software products.

Respondents from PO institutions are especially demanding and the most articulate about what they want. In fact, the majority of examples come from PO respondents.

  4. The needs assessment is not as important as other IR investigative activities. (See Table 4.1 and Subchapter 4.1, Figure 4.1, and Subchapter 4.2.)

The needs assessment ranks in the middle of a dozen investigative activities (see Table 4.1). The majority of MIRACLE Project respondents do not conduct such an assessment (see Figure 4.1); in fact, between 5% and 12% of PO, PPT, and IMP respondents do not even know whether their institution has conducted one. Learning about successful IR implementations at comparable institutions ranks head and shoulders above the needs assessment. Table 9.3 lists other information that IR planners and implementers would like to have in hand.

  5. The next steps for IR planners and pilot testers will be to continue, not terminate, their institution’s IR effort. (See Table 4.4, Figure 4.5, and Subchapter 4.4.)

The next step for PO respondents is to widen the scope of their IR investigations, and, for PPT respondents, to implement an IR-system software package (see Table 4.4). Terminating the IR project is another logical next step for respondents in these two groups, but only about 10% of PO and PPT respondents in the MIRACLE Project census plan to do so (see Figure 4.5).

  6. Waiting for a consortium is a viable alternative for a very small minority of institutions interested in IR services. (See Table 4.1 and Subchapter 4.1, Table 4.4, Figure 4.5 and Subchapter 4.4, Subchapter 5.4, Table 8.1 and Subchapter 8.1, and Subchapter 2.6.)

When MIRACLE Project investigators included references to consortia, partnerships, networks, or groups in response categories, few respondents chose these categories. When we failed to include such references, a handful of respondents volunteered them in write-in responses. Asked directly about joining a consortium, respondents rank it in the middle of the pack of reasons why they have not yet begun IR planning (see Table 8.1). Respondents involved with IR planning only or not involved with IRs express their interest in consortia in the final question on MIRACLE Project instruments about how the Project could help them. For example:

  • “We are in the process of investigating IR systems and are in talks with other colleges about our digital needs. A consortial agreement for an IR system would be ideal.” (PO respondent at a small private liberal arts college in a Great Lakes state)
  • “Provide information about collaboratives, either within a consortium, a system, or amongst institutions with similar needs.” (PO respondent at a mid-size master’s university in a northern Great Lakes state)
  • “Offer guidelines for partnering with other institutions.” (NP respondent at a small public baccalaureate university in the Mountain West)
  • “Best practice, identification of institutions like ours who have succeeded, formation or information about collaborative groups who have (or will have) a shared IR that we can join. We see a shared system as one of the more viable options.” (NP respondent at a small private liberal arts college in the central Atlantic states)
  • “Would love to see models in a small, liberal arts college environment, particularly for consortial opportunities.” (NP respondent at a small master’s university in the Southeast)
  7. About one-quarter of institutions pilot testing or implementing an IR have two or more IRs available to their institution’s learning community. (See Table 5.1 and Subchapter 5.1.)

Perhaps MIRACLE Project respondents are counting the academic departments and research units that have launched IR-like software to preserve, exchange, and distribute research and teaching objects among themselves, colleagues at other schools, and Web searchers generally. These could also be subject-based repositories that maintain a subject focus rather than reflecting the encyclopedic nature of liberal arts colleges and research universities. What will happen at institutions with multiple IRs? Will they join forces and consolidate their efforts? What are the forces for and against centralization? Is it advantageous for multiple IRs at a single institution to prosper?

  8. The availability of additional commercial options for IR-system software may enable more institutions to get involved with IRs, especially the many master’s and baccalaureate institutions where IR implementation is uncommon. (See Table 5.2 and Subchapter 5.2, Table 4.3 and Subchapter 4.3, and Table 8.1.)

Although the open-source DSpace and Fedora systems are the most popular IR systems for pilot testing and implementation (see Table 5.2), they require systems staff to program, profile, and deploy them. An important benefit of pilot testing is developing the requisite technical expertise for system deployment (see Table 4.3). NP institutions in the MIRACLE Project census have neither the resources nor the in-house expertise to support IR planning (see Table 8.1). These institutions could benefit from commercial vendors who install the system and train on-site staff in system management and maintenance. Institutions involved with IR planning only or not at all involved with IRs express their interest in commercial vendors in the final question on MIRACLE Project instruments about how the Project could help them. For example (the comparative analysis requested in the final quotation is sketched after the list):

  • “Models of IRs that are managed at the institution and those that are managed by vendors, for example, Digital Commons.” (PO respondent at a small public master’s university in the Southeast)
  • “Review and publicize tools, especially those for institutions with limited technical support and funding.” (NP respondent at a small baccalaureate college in the Southeast)
  • “Currently investigating platforms and working with a consortium to get RFPs … We are still planning. Would help to know more about commercial alternatives.” (PO respondent at a small baccalaureate college in a Central Plains state)
  • “Conduct some measure of comparative analyses of IR product offerings or software that would allow a single, inexperienced institution to expedite or focus its own analysis.” (PO respondent at a small master’s university in the Pacific Northwest)
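
Requests like the last one lend themselves to a simple weighted checklist. The following is a minimal sketch of such a comparative analysis; the criteria, weights, and scores are hypothetical illustrations, not evaluations of any actual IR-system product:

```python
# Minimal sketch of the comparative checklist analysis respondents request.
# Criteria, weights, and scores are hypothetical examples only.
CRITERIA = {                              # weight reflects local importance (0-3)
    "runs without dedicated systems staff": 3,
    "vendor installation and training": 2,
    "preservation support": 3,
    "cost within budget": 3,
    "customizable user interface": 1,
}

# Hypothetical 0-5 scores a review team might assign to each candidate.
candidates = {
    "Open-source package A": {"runs without dedicated systems staff": 1,
                              "vendor installation and training": 0,
                              "preservation support": 3,
                              "cost within budget": 5,
                              "customizable user interface": 4},
    "Hosted commercial service B": {"runs without dedicated systems staff": 5,
                                    "vendor installation and training": 5,
                                    "preservation support": 3,
                                    "cost within budget": 2,
                                    "customizable user interface": 2},
}

def weighted_score(scores):
    # Sum each criterion's score multiplied by its local weight.
    return sum(CRITERIA[c] * scores[c] for c in CRITERIA)

for name, scores in sorted(candidates.items(),
                           key=lambda kv: weighted_score(kv[1]),
                           reverse=True):
    print(f"{name}: {weighted_score(scores)}")
```
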
  9. IR-system functionality is satisfactory, but the user interface, including controlled vocabulary searching and authority control, needs serious reworking. (See Table 5.3 and Subchapter 5.3.)

Census respondents give high ratings to IR-system functionality for browsing, searching, and retrieving digital content; mediocre ratings to the user interface; and dead-last ratings to controlled vocabulary searching and authority control (see Table 5.3). A user interface that impedes retrieval will send most users packing, that is, switching from the IR to one of the hundreds of databases that research institutions (where most IRs are deployed) offer their learning communities. Another option is switching to Google, whose popularity-based retrieval does a good job of ranking relevant items at the top. People who search online systems conform to the principle of least effort: “The design of any … information system should be the system’s ease of use … If an organization desires to have a high quality of information used, it must make ease of use of primary importance” (Rosenberg 1966, 19).

At this early point in the development and deployment of IRs, few people have searched these systems. Now is the time to make user-interface improvements before too many users have negative experiences and abandon them altogether.

  10. Improve preservation functionality in IRs. (See Appendix F1, Table 5.4 and Subchapter 5.4, and Table 6.2 and Subchapter 6.4.)

Long-term preservation of digital materials figures prominently in Clifford Lynch’s (2003) definition of IRs (see Appendix F1). If universities, through their IRs, are going to replace the current publishing paradigm, the ability to maintain the documents (in whatever format or medium) over time is required. The promise of the IR, then, is not only to maintain the viability of the byte stream of these materials but also to support technologies that make a variety of file formats accessible over the long term. If one agrees with Lynch’s definition, every IR must become a trusted digital repository (RLG 2002).

Except for PDF files (see Table 6.2), today’s IR systems make few guarantees about preserving digital file formats in perpetuity. The top reason census respondents migrate to new IR-system software is greater capacity for handling preservation (see Table 5.4). IR systems must improve their preservation functionality. At the least, such an improvement fulfills a key reason for the very existence of IRs.
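
The most tractable piece of this preservation promise is verifying that stored byte streams remain intact. As a minimal sketch, assuming a manifest of previously recorded checksums (the manifest format and file names here are hypothetical), a periodic fixity audit might look like this:

```python
# Minimal sketch of a fixity (byte-stream integrity) audit, one small piece
# of the preservation functionality discussed above. The manifest format
# ("<sha256>  <path>" per line) and file locations are hypothetical.
import hashlib
import pathlib

def sha256_of(path):
    # Hash the file in 1 MB chunks so large digital objects fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def audit(manifest_path):
    """Report every stored file whose bytes no longer match the manifest."""
    for line in pathlib.Path(manifest_path).read_text().splitlines():
        recorded, path = line.split(maxsplit=1)
        if sha256_of(path) != recorded:
            print(f"DAMAGED: {path}")

audit("ir_manifest.txt")
```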

  11. Institutions do not need policies written in stone at the public launch of their IR. (See Figure 6.2 and Subchapter 6.3.)

At least 60% of census respondents with operational IRs report implemented policies for (1) acceptable file formats, (2) determining who is authorized to make contributions to the IR, (3) defining collections, (4) restricting access to IR content, (5) identifying metadata formats and authorized metadata creators, and (6) determining what is acceptable content (see Figure 6.2). There are many more policies for which these institutions report drafted policies or no policies at all.
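
Policies in this first group are concrete enough to be encoded and enforced mechanically. The sketch below is purely illustrative, with hypothetical field names and values rather than any particular IR system’s configuration:

```python
# Illustrative sketch only: encoding the six policy areas listed above as a
# machine-checkable configuration. Field names and values are hypothetical.
IR_POLICIES = {
    "acceptable_file_formats": ["application/pdf", "text/xml", "text/csv"],
    "authorized_contributors": ["faculty", "staff", "graduate_students"],
    "collections": ["theses", "working_papers", "datasets"],
    "access_restrictions": {"datasets": "campus_only"},
    "metadata": {"format": "oai_dc", "creators": ["library_catalogers"]},
    "acceptable_content": ["scholarly", "teaching", "administrative_record"],
}

def deposit_allowed(mime_type, contributor_role, collection):
    """Check a proposed deposit against the implemented policies."""
    return (mime_type in IR_POLICIES["acceptable_file_formats"]
            and contributor_role in IR_POLICIES["authorized_contributors"]
            and collection in IR_POLICIES["collections"])

print(deposit_allowed("application/pdf", "faculty", "theses"))  # True
```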

It may not be necessary for all IR policies to be in place at the time of the public launch of an institution’s IR. Taking a wait-and-see attitude, evaluating what transpires after a period of time, and then firming up existing policies and implementing new ones as needed may be the most expedient course. Here is advice from a respondent whose institution has an operational IR:

  • “Halfway through [completing this questionnaire], I realized that it wasn’t going to help me at all and that it would only serve to let the timid think that they had to have all of their eggs in the basket before they tried anything. JUST DO IT!” (IMP respondent from a mid-size master’s public university in the Midwest)
  12. The IR helps libraries build new relationships. (See Figure 7.2 and Subchapter 7.2, and Table 7.2.)

Asked to what extent the IR will affect their institution’s ability to build relationships with others such as archives, student services, digital asset management systems, etc., PO, PPT, and IMP respondents answer overwhelmingly positively (see Figure 7.2). Because of the leading role that libraries take in the IR effort, this question really pertains to the new relationships that result from the library’s involvement with IRs. MIRACLE Project staff can explore the nature of these relationships in this project’s follow-up activities, e.g., phone interviews and case studies, when respondents can give open-ended responses to interview questions and interviewers can probe promising areas more deeply. A preview of potential findings in this regard comes from IMP respondents who credit the IR with increasing the library’s role as a viable partner in the research enterprise (see Table 7.2).

  13. To what extent is the impetus for the IR coming from faculty, staff, and students? (See Subchapter 4.2, and Subchapters 7.1 and 7.2.)

The ARL SPEC Kit survey reports “38% of implementers and 47% of planners were responding to requests for an IR from faculty, staff, and students” (Bailey et al. 2006, 25). The MIRACLE Project census did not question respondents directly about the impetus for an IR; however, it did question them extensively about benefits of IRs (see Subchapters 7.1 and 7.2) and respondents failed to volunteer write-in responses that mentioned faculty, staff, and students in this regard.

A few comments that IMP respondents volunteered in response to a question about the importance of conducting a needs assessment describe faculty interest in IRs that helped start the IR project:

  • “Our assessment was more dynamic and ongoing … it involved response to innovative faculty requests and ongoing outreach from librarians regarding changes in scholarly communication practices … “
  • “Our former Dean of Faculty was particularly interested in DSpace and secured funding … to support its use here.”
  • “There was no needs assessment but the IR was very much faculty driven. Leadership was taken by the University Library Council (a senate-provostial advisory group) that pushed the agenda and prepared the report that led to provost funding and support.”

The difficulty that IR staff are experiencing in getting contributions from members of their institution’s learning community may be evidence contradicting the faculty, staff, and student impetus for IRs. In subsequent project activities, MIRACLE Project staff will look for evidence of such an impetus. They will also look for evidence of “peer pressure” as the impetus for an IR.

9.4            Observations on Long-term Issues Pertaining to IRs

MIRACLE Project census findings and the project’s follow-up activities will be able to explore the seven points discussed in this subchapter, but they will not be able to answer them conclusively. Definitive answers are possible only for those who can hold a mirror to the future. The passing of time, the convergence of events, advances in technology, and the inevitable march of human progress will eventually reveal the future.

  1. Is it too early in the evolution of IRs to single out one particular benefit? (See Table 7.1, Figure 7.1, and Subchapter 7.1, Table 7.4 and Subchapter 7.4, and Table 7.5.)

Asked to rate a list of 16 benefits of IRs, census respondents give very high ratings to all but two (see Figure 7.1 and Table 7.1). Instead of having a couple of benefits that stand head and shoulders above the others, IRs may have many benefits. Or it may be premature for one or two benefits to rise above the others because IRs have not yet come into their own. It may be prudent to give IRs a half decade or so to become commonplace in all types of educational institutions and then pose this question again to the same audience. One benefit may then rise above the others.

Because the MIRACLE Project census was limited to IR staff, we could ask them only why they thought people would contribute to IRs (see Table 7.4). They gave high ratings to reasons connected with enhancing faculty scholarly reputations and passing on research-dissemination tasks to others so that faculty can stay focused on intellectual tasks. Lower-rated reasons pertained to enhancing the institution’s standing.

Future surveys should ask IR contributors directly why they contribute to IRs and what benefits they receive from their contributions, and should compare the results with IR staff’s perceptions of benefits.

Today’s IRs do not yet have a dedicated corps of end users. No studies have been conducted to determine why people search IRs and whether they find relevant materials of interest. Very few institutions at which IRs are operational collect much more than counts of users, unique contributors, and searches (see Table 7.5); a minimal sketch of tabulating such counts follows the quotations below. Future user studies should question users about IR benefits. IR staff who have done no IR planning or are just beginning to plan will benefit from anecdotes about how IR content helps researchers, for example, enabling a scholar to examine unique primary materials or putting an undergraduate working on a senior thesis in touch with a scientist who supplied her with data files for analysis. Institutional staff could especially use these stories to convince their decision makers to support an IR effort. Here is what they have to say in this regard:

  • “If our president and provost were more aware of the benefit of an IR to the scholarly community and for access to the institution’s historical record, I would be more likely to meet with success. At this small institution, it is imperative that I use an approach that addresses both scholarly communication and the institution’s digital archival material.” (NP respondent at a small baccalaureate college in New England)
  • “Provide concise examples and talking points of benefits and successes from IRs for use to gain campus and administrative support.” (NP respondent at a mid-sized public master’s university in a Central Plains state)
  • “Testimonials that cut to the heart of what each size institution can gain. …” (NP respondent at a small private church-affiliated liberal arts college in a Great Lakes state)
  • “Help publicize to the great diversity of academic community members the value that IRs have added to all types of institutions that have implemented them.” (PO respondent at a small master’s university in the Pacific Northwest)
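
The counts mentioned above are straightforward to tabulate. Here is a minimal sketch, assuming a hypothetical event log with one tab-separated event-and-user pair per line; the log format and event names are assumptions for illustration only:

```python
# Minimal sketch: tabulating the basic IR usage counts mentioned above
# (unique users, unique contributors, searches) from a hypothetical event
# log. The log format (tab-separated "event<TAB>user_id" lines) and file
# name are assumptions, not any particular IR system's format.
from collections import Counter

def tabulate_usage(log_path):
    events = Counter()
    users, contributors = set(), set()
    with open(log_path) as log:
        for line in log:
            event, user_id = line.rstrip("\n").split("\t")
            events[event] += 1
            users.add(user_id)
            if event == "deposit":          # a contribution event
                contributors.add(user_id)
    return {
        "unique users": len(users),
        "unique contributors": len(contributors),
        "searches": events["search"],
    }

print(tabulate_usage("ir_access.log"))
```
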
  2. Will top-rated IR benefits someday pertain to derailing the current publishing model? (See Appendixes F8.4 and F9, Table 7.1 and Subchapter 7.1, Table 7.3, and Table 7.4 and Subchapter 7.4.)

MIRACLE Project investigators included three benefits that figure prominently in discussions of the ability of IRs to derail the current publishing model (see Table 7.1 and Subchapter 7.1 and Table 7.4 and Subchapter 7.4). Although census respondents rated these three benefits in the middle of the list of 16 benefits, they were generally positive about them. Will these benefits be the ones to rise above the others in the years to come? If the future brings significant changes to today’s publishing model, to what extent will IRs be responsible for the changes?

The low rate of contributions to IRs could be attributed to researchers’ reluctance to upset the delicate balance between themselves and publishers (see Table 7.3). While some call for restraint, inviting the various stakeholders to partner in discussion and negotiation (Lynch 1992; Drabenstott 1994; Borgman 2006), others have become activists, urging scholars to license their publications to the public domain (Creative Commons 2006), rating publishers on their self-archiving policies (RoMEO SHERPA 2006), and serving as vocal advocates of the open-access movement (e.g., Harnad 2006; Suber 1996–2006).

As more academics become aware of the evidence in favor of higher citation rates for articles published in open-access publications (see Appendix F8.4), will they be more likely to seek open-access publishers for publishing their work? What differences do scholars and scientists notice when they publish in open-access publications? Are publishers noticing changes in their relationships with authors, members of professional societies and editorial boards, and reviewers? Although others have asked these questions in the past, it is only recently that the infrastructure for self-archiving has been in place to challenge publishers and the stronghold they have long had on scholarly publishing.

  3. Will IRs coexist alongside subject- and discipline-oriented repositories? Or, after the dust settles on open access, will one repository type be left standing? (See Subchapter 6.5.)

The physics discipline with its arXiv subject-oriented repository is exemplary in terms of building and maintaining a successful digital repository that is embraced by the discipline as a whole (Pinfield 2001). Physicists are expected to contribute to arXiv—their standing in the field depends on it.

No other digital repository has made such deep inroads into a discipline or been met with such widespread acceptance as the physics-based arXiv digital repository. For example, Borgman (2006) cites contribution rates for subject-oriented repositories struggling to reach 15% and mentions in passing PubMed Central, where the contribution rate is a disappointing 3.8%.

The IRs that academic institutions support report low contribution rates in this very report (see Subchapter 6.5), in all previous surveys (Lynch and Lippincott 2005; Shearer 2004; Bailey et al. 2006), and in a long list of articles focusing on contributions (e.g., Foster and Gibbons 2005; Jenkins, Breakstone, and Hixson 2005; Chan, Kwok, and Yip 2005; Bell, Foster, and Gibbons 2005). Considering such low contribution rates, does it make sense for IRs and subject-oriented repositories to compete for contributors? Should one repository type yield to the other? One could imagine building functionality into subject-oriented repositories that automatically deposits links to newly deposited material into the IR. Is such double-posting necessary? Will it serve only to confuse end users searching for their topics of interest?
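
The automatic cross-posting imagined above is technically plausible because most repository platforms expose new deposits through the OAI-PMH harvesting protocol. The sketch below assumes a hypothetical subject-repository endpoint and a hypothetical register_link() hook in the local IR; only the OAI-PMH request itself follows the real protocol:

```python
# Minimal sketch of the cross-posting idea discussed above: harvest newly
# deposited records from a subject repository's OAI-PMH interface and hand
# their identifiers to the local IR for linking. The endpoint URL, set-up
# date, and register_link() hook are hypothetical; OAI-PMH itself is a
# real, widely supported harvesting protocol.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

def harvest_new_records(base_url, from_date):
    params = urllib.parse.urlencode({
        "verb": "ListRecords",
        "metadataPrefix": "oai_dc",
        "from": from_date,                 # e.g. "2007-01-01"
    })
    with urllib.request.urlopen(f"{base_url}?{params}") as response:
        tree = ET.parse(response)
    for record in tree.iter(f"{OAI}record"):
        header = record.find(f"{OAI}header")
        title = record.find(f".//{DC}title")
        yield (header.findtext(f"{OAI}identifier"),
               title.text if title is not None else "(untitled)")

def register_link(identifier, title):
    # Hypothetical hook: store a pointer to the external deposit in the
    # local IR instead of duplicating the full item.
    print(f"link -> {identifier}: {title}")

for oai_id, title in harvest_new_records(
        "https://subject-repository.example.org/oai", "2007-01-01"):
    register_link(oai_id, title)
```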

In the future, we might expect professional societies to enter the competition, wanting to be the digital repository of record for their discipline’s scholarly and scientific production. At a certain point, a shakeout will occur: some repositories will merge, and others will disappear. Who will be left standing in the digital repository business remains to be seen.

  4. What is the likelihood of making the deposit of scholarly and scientific data and production in digital repositories mandatory? Who would police compliance with such a policy? (See Subchapter 7.3 and Table 7.3.)

MIRACLE Project questionnaires asked PO, PPT, and IMP respondents what was likely to inhibit their ability to deploy a successful IR, listing a dozen potentially inhibiting factors and asking them to rate each factor. The three respondent types gave the listed factors different ratings depending on their stage in the IR effort. PO respondents rate factors pertaining to deploying an operational IR highest, e.g., supporting the costs of an operational IR and competing with other priorities. PPT respondents are most concerned with building their IR’s database with quality content, e.g., encouraging faculty to contribute to the IR and contributors’ lack of knowledge of the benefits of IRs. IMP respondents, who know firsthand how difficult it is to recruit contributors and contributions to the IR, rank the “absence of campus-wide mandates regarding mandatory contribution of certain material types” as the top factor.

Policing compliance may be easier in IRs than in subject-oriented repositories because of the promotion, tenure, and merit-increase reviews that many academic units periodically require of their research and teaching staff. Staff would be expected to link their publications to full-text sources in digital repositories. Whether they also deposit the data files they create for their research depends on whether they work in the humanities, social sciences, or sciences, because scholars create new knowledge in different ways across the disciplines (Borgman 2006).

To our knowledge, Queensland University of Technology is the only academic institution that has adopted a mandatory deposit policy for all staff members (Queensland University of Technology 2006). University administrators who are being asked to support IRs in both rhetoric and funds, library directors who bear the brunt of the IR’s support in their budgets (see Table 3.1), and foundation, federal, and state funding officers who are eager for positive outcomes from the grants, contracts, and cooperative agreements they award are undoubtedly monitoring the Queensland mandate with great interest.

  5. In the absence of IR-user studies, let us enumerate questions about IR users and uses. Answers to these questions will interest (1) decision makers at academic institutions who want to initiate an IR effort and need to convince their superiors of the benefits of IRs, (2) IR staff who recruit digital content, (3) IR-systems staff who are responsible for migrations to new versions and new systems, and (4) IR-system designers who are making improvements to existing systems and planning new ones. An IR-user study is a future MIRACLE Project activity. The list that follows provides examples of questions that could figure into such a study.
  • Who uses IRs?
  • For what are they searching?
  • What level of perseverance do they demonstrate searching the IR?
  • Why do they use IRs?
  • How did they learn about the IR?
  • How many times have they searched the IR in the past?
  • Would they search the IR in the future? Why or why not?
  • Do they understand the digital artifacts they retrieve?
  • To what extent do people’s searches of IRs yield relevant results?
  • What exactly are relevant results?
  • To what extent do people come across things of interest other than what they are looking for?
  • Is a particular user type (e.g., faculty, librarians, archivists, undergraduate students) more likely to use IRs and to have success finding relevant results?
  • To what purpose do people put their relevant results?
  • How do they benefit from IRs?
  • What help do people need before, during, and after their search of the IR?
  • What improvements can be made to IR searching? To IR metadata? To IR contents? To IR controlled vocabularies?
  • What other online systems do they search daily?
  • What do they like about the other online systems they search that they would like to see in the IR?
  • Would they recommend that their peers, colleagues, students, subordinates, etc., search the IR? Why or why not?
  6. What metadata are appropriate for the wide range of artifact genres that characterize IR databases? (See Table 5.3.)

MIRACLE Project investigators were cautious about asking respondents about metadata because the closed-ended nature of questionnaires precluded achieving much depth on the topic. The one question that touched on metadata asked PPT and IMP respondents about IR-system features, and respondents ranked features pertaining to metadata dead last (see Table 5.3). MIRACLE Project investigators will follow up on this and other metadata issues in subsequent project activities.

A source of new-knowledge products and byproducts, IRs will enable scientists and scholars to find data files that pertain to their research interests, retrieve research papers that describe original and follow-up analyses using these files, and download the files for their own analyses. Unlike texts, data files are not self-describing; thus, users will benefit from metadata during both the retrieval and selection phases of the search process.

The library community that has taken responsibility for IRs has much experience in metadata creation, but this experience pertains primarily to text-based documents. What metadata pertaining to texts are appropriate for data files? Do some metadata elements pertain to data files only? What do data-file contributors want prospective users to know about their files that deserves to be represented in metadata? What do prospective data-file users want to know? What can librarians learn from the practice of data archivists about metadata (e.g., ICPSR 2005)? How should metadata in the IR relate to metadata in other campus information systems and library databases?

Metadata for traditional library collections have fallen short of user expectations (Markey 2007). IRs are an opportunity to start anew and to learn from the people who create data files, from the wide range of prospective users of these files, and from data archivists who have a proven track record with data files, uses, and users, so that the metadata assigned to data files in IRs fulfill everyone’s expectations.
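
To make these questions concrete, consider the flat, text-oriented Dublin Core record that a typical IR creates, next to the extra elements a data file arguably needs. The data-specific fields below are illustrative guesses, not a proposed standard:

```python
# Illustrative sketch: a default Dublin Core description of a data file,
# plus the kinds of data-specific elements the questions above suggest.
# The added fields and all values are hypothetical examples.
dc_record = {                      # typical text-oriented Dublin Core
    "dc.title": "Lake Huron water-quality samples, 1998-2005",
    "dc.creator": "Example, A. Researcher",
    "dc.date": "2006-04-12",
    "dc.type": "Dataset",
    "dc.format": "text/csv",
    "dc.rights": "Restricted to campus users",
}

data_file_extensions = {           # what a data file may also need
    "collection_method": "Monthly grab samples at 14 stations",
    "variables": ["station_id", "date", "temp_c", "ph", "p_total_mg_l"],
    "codebook": "hdl.example/2027/42-codebook",   # hypothetical pointer
    "software_required": "Any CSV-capable package",
}

# Merge and print the combined record, one element per line.
full_record = {**dc_record, **data_file_extensions}
for element, value in full_record.items():
    print(f"{element}: {value}")
```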

  7. In the design of IR systems and enhancement of metadata for IR content, be aware of the principle of least effort.

When staff are designing new IRs, they must keep in mind the principle of least effort. “This principle states that most researchers (even ‘serious’ scholars) will tend to choose easily available information sources, even when they are objectively of low quality, and, further, will tend to be satisfied with whatever can be found easily in preference to pursuing higher-quality sources whose use would require a greater expenditure of effort” (Mann 1993, 91).

IRs are likely to be a curious mix of primary, secondary, and tertiary (e.g., encyclopedias, annual reviews, yearbooks, bibliographies) sources. Humanities scholars are accustomed to searching resource-type mixtures; in fact, “recognizing something [from an archive, library, corporate records, mass media, etc.] could be a data source is a scholarly act in itself” (Borgman 2006). Researchers in science and social science disciplines are less accustomed to finding data and the research that interprets the data in one place. IR-system designers can expect people with varying levels of domain expertise—from undergraduate students to senior faculty members—to be potential users of IRs at academic institutions. A key objective for these designers should be honoring the principle of least effort so that IRs are usable regardless of their users’ domain expertise.

9.5            Chapter 9 Summary

Chapter 9 discusses the findings of the MIRACLE Project census. It begins with an examination of NP respondents, who represent the largest percentage (52%) of census respondents (see Subchapter 9.1). NP respondents come from institutions where no IR planning has been done. Dominating NPs are master’s and baccalaureate institutions.

Our analysis of NP respondents reveals their great interest in IRs. They want to know how much IRs cost to plan, implement, and maintain, and what institutions comparable to their own are doing with regard to IRs (see Table 8.2 and Subchapter 9.1). None of the top-ranked reasons why NP institutions have not begun IR planning rules out their involvement with IRs at a later date (see Table 8.1). Right now, NP institutions have other things on their plate or have no resources or expertise for IR planning. Very few are totally in the dark in terms of what IRs are and whether IRs have relevance for their institutions (see Figure 8.1). Slightly less than 50% of NP respondents may start IR planning within the next 24 months (see Figure 8.2).

NP respondents would benefit from success stories about IRs from their colleagues at other-than-research universities. If subsequent MIRACLE Project activities were biased toward the experiences of such institutions, they would focus where the need is greatest and where the gap in knowledge about IRs is widest.

Subchapter 9.2 enumerates census findings that verify findings from previous surveys. Subchapter 9.3 is an in-depth examination of 13 findings that are unique to the MIRACLE Project census. Subchapter 9.4 concludes the report by making observations on seven long-term issues pertaining to IRs that will continue to occupy educational institutions long after the MIRACLE Project ends. Consult the report’s Executive Summary for a comprehensive treatment of MIRACLE Project census findings.
