
Part 3: Research Methodology

This part of the report describes the research methodologies used in the study. The intention is to enable readers to judge how reliable the study’s findings are and to explore further the implications of the study’s data.

Described here are the methodologies used in (1) compiling the investment parameters reported in Figure 1 and Table 1; (2) the survey of academic institutions that undertook renovations and additions to existing libraries or built new libraries between 1992 and 2001; and (3) the phone interviews of library directors and academic officers at a number of institutions that responded to the survey. For information about the methodology used in the Council of Independent Colleges (CIC) survey of library directors and chief academic officers, see section 4 of the report on that survey, which is available at

Investment Parameters

Information from 1992 through 2001 on several factors (e.g., number of academic library projects, total cost, total gross square feet [GSF]) was extracted and summed from the reports on library capital projects published annually in the December issue of Library Journal. Ten-year means and standard deviations were calculated for each factor, as were the z scores for each annual statistic. These z scores indicate that all the annual statistics fall well within a normal distribution of values. Note that the total current-dollar costs, as reported in Library Journal, were converted to total real-dollar costs using the index values of the gross domestic product published by the U.S. Bureau of Economic Analysis. This conversion was done to permit comparisons across the ten-year period.
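The ten-year calculation described above can be sketched as follows. The annual totals and index values below are hypothetical placeholders, not the study's actual Library Journal or BEA figures.

```python
# Sketch of the ten-year statistics described above. The annual totals
# and index values are hypothetical, not the study's actual data.
from statistics import mean, stdev

# Hypothetical annual totals (e.g., total GSF in thousands, 1992-2001)
annual_totals = [410, 455, 430, 470, 500, 520, 480, 510, 540, 530]

m = mean(annual_totals)
sd = stdev(annual_totals)                    # sample standard deviation
z_scores = [(x - m) / sd for x in annual_totals]

# "Well within a normal distribution": no annual statistic is an outlier
assert all(abs(z) < 3 for z in z_scores)

def to_real_dollars(current_cost, index, base_index=100.0):
    """Convert a current-dollar cost to real dollars via a GDP price index."""
    return current_cost * base_index / index
```

Dividing each current-dollar total by the index value for its year (relative to a base year) is what makes the annual cost figures comparable across the decade.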

Survey of Academic Institutions Undertaking Library Construction between 1992 and 2001

Scope of the Survey
The survey focused on academic library projects (new buildings, renovations, and additions) undertaken in the United States primarily
between 1992 and 2001.

  • Academic libraries are of interest to the study’s sponsor, the Council
    on Library and Information Resources. They have a distinctive institutional setting and clientele, and the investigator has some familiarity with academic libraries, having worked most of his professional life in such libraries.
  • Projects in the 1990s were completed at a time of significant pedagogical
    and technological change in higher education. Personal and institutional memories of projects completed earlier than 1992 are likely to have dimmed.
  • Imaginative, forward-looking libraries were built in many countries
    other than the United States in the 1990s. This study nonetheless
    confined itself to those built in the United States to simplify the identification of projects, to keep the number of projects manageable, to facilitate the interviews (which could be done in one language and in only four time zones), and to avoid contextual issues (e.g., the kind of national planning represented by the Follett report in the United Kingdom) with which the investigator is not familiar.

Database of Projects Undertaken between 1992 and 2001
A list of additions, renovations, and new building projects was compiled
from the lists of libraries undertaking such construction published
annually in the December issue of Library Journal. Other projects
identified in a literature search were added, yielding a list of 443 projects. The intention was to identify projects completed between 1992 and 2001, but inaccuracies in reporting dates and other factors resulted in the inclusion of a small number of projects completed or ongoing in 2002.

For each project, the following information was compiled:

  • the institution’s name and mailing address
  • the institution’s Carnegie classification
  • the name of the library involved
  • the name of and contact information for the library director responsible
    for the project
  • the nature of the project (i.e., renovation, addition, or new construction),
    the size of the project in gross square feet, and its completion date
  • the name of the lead architect for the project (this information about architects provides the basis for Figure 7 in part 1 and Table 5 in part 2 of the report)

Survey respondents were asked to review the database information about their project and to correct any mistakes. The letter inviting library directors to participate in the study appears on p. W-167.


Methods of Analyzing Survey Data
The survey posed four kinds of questions:

  • Questions 2, 5, 8, and 10–13: These questions invited a single affirmative
    response. Every affirmative implied a reciprocal negative.
  • Questions 3, 4, 6, and 7: These three- to six-element questions invited multiple affirmative responses. An affirmative response to one element in these questions did not exclude an affirmative response to other elements but did, by inference, imply a reciprocal negative for the element in question. The individual elements in these questions could therefore be treated as if they were “yes/no” questions comparable to questions 2, 5, 8, and 10–13.
  • Question 9: This three-element question was treated as if an affirmative response to one element excluded an affirmative response to other elements.
  • Question 1: This 14-element question invited affirmative responses weighted (on a six-point scale) by intensity. The intensity rating for any given element carried no implication for the rating of other elements.

Excluding the “other” responses, there were a total of 114 question categories to track for each survey returned and reported in Tables 3a and 4a. When the responses were analyzed by ungrouped and grouped Carnegie classification numbers (Tables 3b and 4b), there were 1,596 question categories to track. When the responses were analyzed by project-completion year (Tables 3c and 4c), there were 1,254 question categories to track.

There were 240 usable, nonduplicative responses to the survey, yielding a 54% rate of return. The study considered the 443 libraries identified in its database as the study’s population, while it took the 240 survey responses as a study sample representing the study population.

In the ensuing description of statistical methods, the following terms are used:

  • “Question” identifies a question as it was presented to readers of the survey (e.g., question 1, about several different factors that motivated projects)
  • “Question element” identifies a particular factor in the question (e.g., the growth of library staff in question 1, or the “yes” option in several other questions)
  • “Question category” or “category” identifies the intensity response or the yes/no response used by the respondent for each question element (e.g., the growth of library staff identified as a weak motivator, or a “no” response about post-occupancy assessment)

For all questions, the proportion (P) of affirmative responses to each question element was determined and a corresponding confidence interval for P calculated. This confidence interval indicates the range (reported as plus-or-minus percentage points) within which responses for the study population are likely to vary from the study sample (i.e., P) in 95 out of 100 cases.
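Such an interval can be sketched with the standard normal (Wald) approximation for a proportion. The proportion used below is hypothetical, and the report does not state whether any finite-population correction was applied.

```python
# Sketch of a 95% confidence interval for a response proportion P.
# The proportion p below is a hypothetical illustration only.
import math

def wald_ci_halfwidth(p, n, z=1.96):
    """Half-width (in proportion units) of the 95% Wald interval."""
    return z * math.sqrt(p * (1 - p) / n)

n = 240          # usable survey responses (the study sample)
p = 0.60         # hypothetical proportion of affirmative responses
half = wald_ci_halfwidth(p, n)
# Reported as plus-or-minus percentage points, e.g. 60% plus or minus
# roughly 6 points for this hypothetical P
```

The half-width is largest when P is near 0.5 and shrinks as P approaches 0 or 1, which is why the reported plus-or-minus values vary from one question element to another.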

The chi-square test was then applied in the following ways to determine whether differences in P were statistically significant.

  • In Tables 3a and 4a, actual P responses were compared to a random (i.e., uniform) set of responses. For question 1 (Table 3a) “random response” was defined as the total number of responses divided by six, the number of question categories (i.e., random response = mean response). To determine the mean or random response rates in the other questions (Table 4a), the total number of responses was divided by 2 for the “yes” and “no” response categories actually provided in questions 2, 5, 8, and 10–13 or implied for each of the elements in questions 3, 4, 6, and 7. For questions 3, 4, 6, and 7, the total number of responses had to be inferred because it could not be observed directly (that is, an affirmative response to one element in these questions did not exclude an affirmative response to other elements). The inferred number of responses was defined as the mean of the actual responses to questions 2, 5, 8, 9, and 10–13 (where an affirmative response to one element did exclude an affirmative response to other elements). This mean was 224, suggesting that 224 out of a possible 240 respondents (93.3%) actually answered the questions at issue here.
  • In the analysis of P responses grouped by Carnegie classification type (Tables 3b and 4b) and by project-completion year (Tables 3c and 4c), the actual responses by institutional type were compared not with random responses but with the actual responses for the sample considered as a whole (i.e., the Tables 3a and 4a results).
  • Where differences in response were statistically significant, that fact was registered in Tables 3a–c and 4a–c by the use of bold type and by a ratio called the chi-square factor. This factor equals the result of the chi-square test divided by the value of the critical region appropriate to the given question element (different elements require the use of different degrees of freedom in determining the critical region). In effect, the chi-square factor indicates how many times the result of the chi-square test exceeded the value of the critical region for that test. This use of the chi-square factor provides a single scale for comparing responses that require somewhat different underlying values in testing for statistical significance. Hence, any chi-square factor value of less than 1 falls outside the critical region and is not statistically significant. Any factor value of 1 or greater falls within the critical region and is statistically significant. The higher the chi-square factor value is above 1, the less likely it is that the response could have happened by chance.
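The chi-square factor defined in the list above can be sketched as follows. The observed counts are hypothetical, and the critical value shown is the standard table value for one degree of freedom at the 0.05 significance level.

```python
# Sketch of the "chi-square factor": the chi-square statistic divided
# by the critical value for the test. Counts here are hypothetical.
def chi_square_stat(observed, expected):
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical yes/no counts vs. a uniform "random response" expectation
observed = [150, 74]           # 224 responses, as inferred in the text
expected = [112, 112]          # 224 responses split evenly over 2 categories

stat = chi_square_stat(observed, expected)
critical = 3.841               # chi-square critical value, df = 1, alpha = 0.05
factor = stat / critical       # factor of 1 or more => statistically significant
```

Question elements with more response categories use larger degrees of freedom and hence larger critical values; dividing each statistic by its own critical value is what puts all elements on the single comparable scale described above.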

Interviews of Library Directors and Chief Academic Officers at Some Institutions that Responded to the Survey

Selection of Library Directors to Participate in Phone Interviews
The identification of persons to interview was a multistep process. No attempt was made to identify a random, stratified sample from the survey respondents, but an attempt was made to include a variety
of types of institutions in the interviews that roughly paralleled the variety of institutions in the study sample.

  • As a first step, the number of interviews to be sought at various institutional types (using the Carnegie Classification) was determined, based on the proportion of institutional types responding to the survey: Doctoral/Research Universities-Extensive (nine interviews); Doctoral/Research Universities-Intensive (four interviews); Master’s Colleges and Universities I (seven interviews); Master’s Colleges and Universities II (one interview); Baccalaureate Colleges-Liberal Arts (five interviews); Baccalaureate Colleges-General (two interviews); Associate’s Colleges (two interviews). This institutional profile for the interviews approximates
    that of the study sample, except that Doctoral/Research Universities-Extensive are under-represented by one interview and Baccalaureate Colleges-Liberal Arts are overrepresented by one interview. This slight adjustment was made with the hope of securing more informative interview results.
  • All respondents who offered three or more comments in responding
    to the survey were selected for interviews, on the supposition that they were most strongly engaged with the topics being investigated. In fact, these respondents often had thoughtful, provocative things to say in their survey comments.
  • All the respondents who offered two comments in responding to the survey were sorted into institution types (using the Carnegie classification). Most of these respondents could be included in the interviews within the limits set for the various institutional types. No individual with particularly interesting or provocative comments
    in the survey was omitted.
  • A small number of Doctoral/Research Universities-Extensive, Master’s Colleges and Universities I, Baccalaureate Colleges-Liberal Arts, and Baccalaureate Colleges-General were selected from the list of survey respondents, sorted by institutional types. In selecting these institutions, an attempt was made to balance public and private institutions and to secure some geographical spread.
  • All respondents identified in this way had indicated, in survey question 14, a willingness to participate in a follow-up interview. Not all respondents so identified were in fact willing to be interviewed,
    so the institutional profile of completed interviews did not match the intended profile. Notably, there were no interviews of library directors from Associate’s Colleges.

In the event, the study included 25 interviews with library directors. Some directors asked colleagues to substitute for them in these interviews; others asked colleagues to join them during the interviews.

Questions Posed in the Phone Interviews of Library Directors
Library directors received the following set of interview questions well before their actual interview. They were asked, as part of the scheduling process, to identify which of these questions would be most pertinent to their project. The scripts for each interview included the questions so identified as well as other questions of particular interest to the investigator. Actual interviews often varied somewhat from the prepared scripts.

1. Survey results indicate that meeting the space needs of library instruction, especially that for electronic classrooms, was a major motivator of library capital projects.

  • Was this a major motivator for your project?
  • Aside from electronic classrooms, how if at all did your project strengthen your library instruction program?

2. Survey results indicate that accommodating changing patterns of student study, especially as regards group study, was a major motivator of library capital projects.

  • Was this a major motivator for your project?
  • Aside from group study space, how if at all did your project respond to student needs for study space?

3. Though not explicitly inquired about, respondent comments on the survey indicate that the needs of special collections sometimes were a major motivator of library capital projects.

  • Was that so for your project?
  • If it was, what conception of these often less frequently used and/or relatively narrowly defined collections succeeded in attracting support for your project?

4. Survey results indicate that the need to provide shelving for collections was a major motivator of library capital projects.

  • Was this a major motivator for your project?
  • Do you expect shelving to be a major motivator of capital spending for your library over the next 30 years?
  • Is the long-term preservation of your print collections a major factor in the design of your library’s shelving?
  • If preservation and access values were in conflict in the design of shelving space (e.g., lighting and temperature conditions ideal for books but less than ideal for readers), which value would prevail?
  • Have you considered a satellite, high-efficiency shelving facility for your library?
  • Does the availability of electronic journals and books figure explicitly in your thinking about future shelving needs? Have you quantified the likely impact of electronic materials on your future need for shelving?
  • Does the availability of print material through consortial arrangements figure explicitly in your thinking about future shelving needs? Have you quantified the likely impact of consortial access arrangements on your future need for shelving?

5. Survey results indicate that “vision statements” were often critically important in guiding library capital projects.

  • Was that so for your project?
  • How was the vision statement developed?
  • What relationship did the vision statement have to formal, systematic needs assessments?
  • Might you share the vision statement with me?

6. Survey results indicate that while changes in technology frequently drive the need to reconfigure library space for specific services and operations, there is relatively little fundamental rethinking of the need for and uses of library space. Aside from the omnipresent computer (often presented in clusters), group study space, and electronic classrooms, library space today has much the same character and basic function as library space built a generation ago.

  • Do you agree with this characterization of your project? If not, how would you modify it?
  • Should we expect major changes in library space design to evolve in largely incremental and experimental ways, building on what we know has worked well in the past?
  • Are there opportunities to break with an evolutionary process of library design and adopt more radical, revolutionary, and possibly risky views of what library space should be?

7. Survey results indicate that the formal, systematic assessment of specific departmental operations sometimes plays a significant role in formulating library capital projects and in justifying them to academic and funding bodies. Otherwise, the formal assessment of readers’ wishes, of student and faculty academic needs, and of library space as one element in the campus-wide provision of academic space is rarely done. By contrast, consultation with library users, as distinguished from formal needs assessment, is quite frequent. But respondents often comment that such consultation is largely routine in nature and rarely if ever decisively important in project design.

  • Was your project completed without any significant, formal, and systematic assessment of reader needs?
  • If so, would your project have been strengthened by such an assessment
    of (for instance) student learning behaviors, faculty teaching strategies, the campus-wide provision of study space, or the interrelations of social space and study space? Why (or why not) would assessments of this sort have strengthened your project? Why was such assessment not done, if it would have been helpful?
  • Was consultation with faculty and students decisively important to specific design decisions made for your project?
  • Faculty members are quite frequently consulted about library projects. In your case, was that done for reasons relating to the specific teaching and research functions of faculty, as distinguished from reasons relating to the weight of faculty opinion generally in setting campus goals and priorities?
  • Research indicates that faculty visit and use library space much less frequently than do students, yet consultation with students about library design happens much less frequently than with faculty.
    Was this the case in your project? If so, does this fact reflect the relative weight of faculty and student opinion in setting priorities on your campus? Does it reflect some other consideration?
  • In consulting with faculty and students about your project, what did you aim for beyond “buy in” and “political support” among decision makers?

8. Aside from the assessment and consultation activities just discussed, how would you describe the process for coming to agreement on your library project

  • as regards its programmatic goals, especially any goals rooted in the identified needs of students and faculty as distinguished from the operational goals of library units?
  • as regards your project’s relative priority among competing campus projects?
  • as regards funding?

9. How long did it take to move your project from (a) its first formulation for members of the campus community beyond librarians to (b) the institutional funding of the project? Did the length of project gestation bear on your decisions regarding formal, systematic needs assessment and consultation with various reader constituencies?

10. Survey results indicate that formal post-occupancy studies are infrequently undertaken to measure the success of library capital projects.

  • Was that the case for your project?
  • If so, what other methods (if any) did you use to assess the success of your project?
  • Has your assessment (whether formal or informal) of the success of your project changed over time?
  • If you would now do your project differently in some significant way, did you know at the time that the project design should be changed or did you discover that only afterwards?

11. Was the inclusion in your project of some function not administered by the library (e.g., some function related to information technology services) critical to its conception and success? If so, what was that function? Do you regard the inclusion of that nonlibrary function as a strategically important alliance for the library or as a “marriage of convenience” useful in moving your project forward?

12. If you had to reduce to just one factor the value your library creates for your campus, would timely and convenient access to information resources be that value?

  • If not this, what would be the single most important value your library creates?
  • Where would you rank instruction in the identification and effective use of information resources among the values created by your library?

13. What questions, beyond those posed above, would help to understand the planning process for your project, especially as regards the identification of the teaching and learning functions of library space?

Phone Interview Procedures for Library Directors
A description of the procedures used in the phone interviews appears on p. W-168.

Selection of Chief Academic Officers to Participate in Study Interviews
Library directors were asked about the participation in their library projects of their chief academic officers and other academic officers. Interviews were sought with all such individuals identified as having
a significant, substantive impact on the project beyond the typical responsibilities of setting the priority of the library project amid competing campus projects, establishing project budget parameters, and fund raising. The study included six interviews with chief academic officers and other administrative officers.

Questions Posed in the Phone Interviews of Chief Academic Officers
The e-mail message inviting chief academic officers and other administrative officers to participate in the interviews included the following paragraphs. Actual interviews often varied somewhat from the script.

The following questions all ask about the same thing: the role of the chief academic officer in ensuring that library space is as responsive as possible to the institution’s teaching and learning missions. Our conversation could begin with any one of these questions, or with some other matter that seems more salient to you.

1. Chief academic officers typically play several managerial roles in library space planning. They are involved in determining the priority of library projects among other campus projects, setting the timetable
for projects, establishing project budgets and securing funds, and managing the political process needed to initiate and complete capital projects. What other roles, if any, did you play in planning for [project name]? Did any part of your involvement in library space planning focus specifically on possibilities for advancing your core concerns with teaching and learning?

2. In setting priorities for the project, how did you balance responding to long-accumulating space problems (e.g., lack of shelving space, obsolescent mechanical systems) with opportunities to enhance teaching and learning (e.g., group study space, electronic classrooms)?

3. Most library planning efforts involve consultation with students and faculty. This consultation seeks to understand the operational needs of these readers (e.g., students’ seating preferences); gain buy-in for the project, especially from faculty; and manage the political process of deciding on project priorities. How well does this consultation process identify opportunities for library space to advance strongly the institution’s fundamental missions in teaching and learning? Can this consultation process be improved? Are there other steps, such as formal assessments of modes of student learning and faculty teaching, that might increase the likelihood that library space will advance the institution’s teaching and learning missions?

4. Aside from the consultation process, it appears that college and university academic officers depend heavily on the good professional judgment of librarians, especially the library director, to guide library space planning. How well does this dependence advance opportunities for library space to serve the institution’s fundamental missions in teaching and learning?

5. Chief academic officers are involved in planning for all sorts of capital projects. Does library space planning offer you distinctive opportunities to advance the teaching and learning missions of your institution? What is distinctive, if anything, about your involvement in library space planning?

The phone interview procedures for chief academic officers and other administrative officers were essentially the same as for library directors.


