
Interview with Christa Williford, Senior Director of Research and Assessment

What is your elevator pitch for this report, or to use a more popular phrase these days, the TL;DR (too long; didn't read)?

This report is the story of how our attempt to adapt the original Digitizing Hidden Collections program affected the people and groups who participated, in their own words. Because we are a funder, we don’t always get a completely honest impression of how individuals are affected by our program and process. I feel like the researchers did a great job of reflecting that diversity of individual perspectives.

What do you think the researchers, Jesse Johnston and Ricardo Punzalan, did to earn the trust of those responding to the survey questions?

First, they were not CLIR staff. As contractors, Jesse and Ricky guaranteed anonymity to the study participants, which lowered the fear of consequences, especially when critical statements were made. These honest opinions are what all funders yearn to know when they’re designing programs and making decisions. But frequently, when asked direct questions by a funder, there’s another narrative happening in the grantee’s head. They are often thinking, “How does my idea affect my potential competitiveness for future awards?” So having independent researchers and protecting the anonymity of participants took the pressure out of the conversation.

Christa Williford, Senior Director of Research and Assessment

So anonymity is important…

Super important. With internal assessments, which we do with all of our programs, people know that when they’re filling out the survey, they’re speaking directly to CLIR—even when a survey is anonymous. Speaking to a third party researcher who filters the diversity of impressions into a coherent message for us allows us to learn and make the program better.

How were the researchers chosen and how did their interactions with the participants help the report?

That’s a fantastic question and I’m glad you asked it. Jesse and Ricky were selected in April of 2021 (see official announcement). As a team, they have deep expertise in qualitative assessment and familiarity with the activities of grant seeking and operating funding programs, as well as working with smaller cultural heritage organizations and community-based archives with culturally sensitive materials. Since DHC:AUV sought to serve a wide variety of organizations, it was important that the researchers had the sensitivity to pose questions that are relevant to people working in a wide range of contexts.

Jesse Johnston (left) and Ricardo Punzalan (right)

So they asked the questions that participants might be afraid to answer if they were asked directly by CLIR staff?

Yes, and in a non-threatening way that clarified the purpose of the evaluation, how answers were to be used, and how the surveys would be translated and given to CLIR as feedback. Qualitative research, using methods such as interviews, involves putting people at ease when asking questions that might be challenging or even feel a little invasive. Handling these conversations and then translating what you learn from different people into a coherent set of findings is not easy, and it requires practice to get right. We were lucky that Jesse and Ricky were interested in the project and agreed to take it on.

Who will find the DHC:AUV report interesting and why should they read it?

This is a detailed look at one program with a specific set of goals and values. On its surface, one wouldn’t think that it would necessarily translate to audiences outside of CLIR, but this is far from true. People working in philanthropy and those who are interested in funding work with collections will find it helpful. Not only do the researchers identify the ways that our program had an impact, but they also point to other types of work that are equally important for advancing the goal of amplifying unheard voices; this is work that CLIR does not do with DHC:AUV that other organizations could take on. I hope that those working in philanthropy will pick this up, for they will get great ideas for their work.

The detailed discussion of the methodology and examples of instruments used by the researchers will also be useful to organizations creating programmatic assessments. And they will find the diversity of voices in the firsthand quotations from participants quite valuable. Professionals in cultural heritage often specialize in one sector or type of institution, and they approach work with their collections in that narrow context. They may not think about how the experience of working with collections is different for other groups or collections. Because our program is trying to appeal to such a wide variety of cultural organizations, it’s good to consider what other organizations are doing to advance and enrich our digital research and learning environment.

So this report is important to read if you are a practitioner interested in learning about different approaches to work?

Yes, or if you are thinking about the full range of organization types that are engaged in building our digital research environment. Readers can get a taste of the wide variety of content that could be accessible to them—the different types of artifacts, the audio and audio visual content, images, published and unpublished works. We need different types of organizations to participate in the creation of our research environment.

Will students in MLIS programs find the assessment useful for their studies?

Students who are planning future careers and considering challenges that they may encounter could find it useful. Grant seeking is such a necessary part of cultural heritage work that learning about how grant programs are designed could have an impact. Grants can be mysterious and challenging for people new to the field. One of the best ways to figure grant work out is to consider how programs are built and operated, which you can glean from reviewing assessments like this one.

Demystifying the grant process would seem to be useful for both the grant makers and grant seekers.

Absolutely. Grant seekers appreciate when CLIR is transparent about the ways the process is imperfect and unpredictable, so it’s important to acknowledge the complexity. It helps grant seekers understand that although they may not have gotten a grant, it wasn’t because they didn’t have a great idea.

Was there anything that surprised you when reviewing the assessment?

Seeing how the researchers pointed to the limitations of our tightly focused program was eye-opening. I’ve thought for years that when trying to appeal to a broad range of organizations and attempting to even the playing field for all grant seekers, it helped to have a focused goal or set of activities that we do and don’t fund. I thought it was the best way of being fair. This assessment has led me to question that assumption and is encouraging me to consider ways that the program can still be fair, but also be more flexible.

I still like the neatness of a broadly appealing program with a narrow scope, but there are so many different dimensions of amplifying unheard voices. So there is project work that we may need to be more open to in the future. We must consider this idea of creating an even playing field for organizations and their projects, even if they have very different starting points and different priorities.

Can you expand on that a little more?

The original DHC program focused mainly on “assembly line” digitization projects. A project takes an original object and captures it digitally, then the digital object is described, then it is put somewhere so that it’s accessible, and then it is put somewhere else so that it’s preservable. This focus is useful because there’s a simplicity there.

But there is pre-work that has to be done to identify collections and to understand their importance. Then there’s post-work involving making decisions about whether and how to make items accessible and who should make those decisions. Plus, there are legal considerations. So, if DHC is only funding the middle part—the digitization—and not funding the beginning, the planning, or the work at the end, some organizations can’t get to where they need to be in order to be competitive in our grant process. The report tells us that we need to be more open to other types of work, such as reparative description, translation, and transcription—all of which are expensive and laborious. But this is important work that needs to happen to make digital collections useful in the end. Our focus on the assembly line concept of digitization is insufficient if our goal is to enrich our digital research environment and make previously underrepresented constituencies as visible as they really should be.

In the past, the tighter focus on funding digital capture, description, and preservation helped us with assessments of DHC. We tallied the numbers of digital items created over time and those comparisons gave us a quantitative measurement of the impact of our program. If we expand our focus to include other types of related work, it will make that final assessment more difficult. It’s not impossible, but it will require more staff and a more nuanced approach to the review of applications and reports than we currently have allotted.

A broader focus means that you will have to compare apples to oranges as they say, and you will also have to include watermelons and pineapples and kiwis.

Exactly, a whole fruit basket of differences. So redrawing achievable boundaries in a way that works for our small staff and the grant seekers will be part of our decision making for the program moving forward.

Another recommendation from the report focuses on DHC’s approach to intellectual property and collection ownership. Why is a reciprocal notion of ownership important?

Many collections are products of, or reflect, a whole community or group, but often there are others involved in the stewardship or management of those collections who are making decisions about digitization. Ownership is very complicated, especially when you create a digital copy of an original item or recording. Now you have the original and the digital copy to consider, and the caretakers of each item can be different and involve unrelated people or organizations. When collections reflect a community, the caretakers have to negotiate decisions about access and prioritize the needs of the people represented in the items, especially with projects that involve indigenous communities where the American legal concepts of ownership don’t apply. It gets very complicated when multiple partners with different cultural backgrounds are trying to sustain collections that can affect the lives and experiences of individuals.

There has to be trust built up over time for people to feel comfortable with sharing ownership. With digitization projects, once something is made accessible online, you can’t take it back easily. There must be a reciprocal notion of ownership based in a relationship of trust where people are consulted and decisions are made together about what happens.

What is the future direction of the DHC program?

We’re working on a plan to continue the program, at least for one year, which will give us a chance to apply some of the recommendations. Ideally, we hope to run the program two or three times over the next few years. But we are waiting on funding. We have a short list of changes that we’d like to make that I’m negotiating with the staff. Right now we have a smaller staff, and that means that a lot of the changes are intimidating to consider. Hopefully, the trust that we’ve built from doing this initial assessment will help our community members feel comfortable giving us feedback on an ongoing basis so that we’ll continue to improve and share what we’re learning.

Will you do this level of assessment for every year?

This was a one-off project. It had a pretty big budget for an assessment project and we’ve been very fortunate it was written into the grant for this first iteration. I don’t think we will get that again. Fortunately, there’s enough in this report to inform not just one year’s worth of evolution, but other future programs. We will make use of these findings for some time.

That's wonderful. Are there any final thoughts that you would like to share about the draft report?

We would love for readers to let us know what they think about the report, what they agree with, what they don’t agree with, and what they think we should prioritize in making changes moving forward. That would be great. Readers can email us at
