
A conversation between digital and “traditional” humanities

By Sayan Bhattacharyya

Some time in the first decade of the twenty-first century, what had until then been called “humanities computing” gave way to a supposedly new field, “digital humanities.” At that point, the word “humanities” shifted within the conceptual space of the academic imaginary. No longer a mere adjective qualifying a gerund seemingly rooted in the world of technology, “humanities” became the central noun when the name changed from “humanities computing” to “digital humanities.” The shift seemed to announce that the field, under its new name, was staking a claim to centrality in a way intended to make it difficult for mainstream practitioners of the humanities to ignore it. Not coincidentally, this shift in nomenclature took place when, first, an economic recession and its effect on state higher education budgets had accentuated a sense of crisis in many humanities departments, and, second, a revolution in data science, leading to an increased emphasis on quantification in many aspects of life, seemed to be taking off. To quite a few humanities scholars, it seemed that the upstart and newly ascendant field of digital humanities, having shaken off the slightly musty odor that used to cling to the now quaint-seeming term “humanities computing,” had suddenly revealed itself to be the stalking-horse of neoliberalism in an academy where the ideal of the disinterested pursuit of the life of the mind had previously helped, at least in theory if not always in practice, to keep the vulgarity of the market somewhat at bay. The newly repurposed and renamed field at times appeared to deserve suspicion, if not hostility, from the more staid practitioners of the humanities academy. Often funded by short-term grants, and with its practitioners sometimes embracing a style of functioning that tends to get called a “startup mindset,” some parts of digital humanities simply seemed too alien to how the humanities had hitherto been done in the university.

One worrisome consequence of this state of affairs is a kind of balkanization in which, although the digital humanities have carved out a place at the table at major humanities conferences such as the annual MLA convention, the conference sessions seem to be self-segregated, at least in terms of speakers. Sessions with a digital humanities theme tend, in my experience, to have birds-of-a-feather speakers, and so sessions in which digital humanities researchers engage with traditional humanities scholars remain a rarity. However, I think there is a lot of common ground to be found if such conversations were to take place. For example, what would the use of digital tools in the service of humanistic inquiry entail? How can digital techniques work in tandem with, or even enhance and inform, philosophical critique or “theory”? Such questions are not only of considerable interest to digital humanists; they also have the potential to make the humanities more critically self-aware, as they open up the question of what it is that we actually do when we “do” humanities.

With these thoughts in mind, I decided to organize a seminar at the American Comparative Literature Association (ACLA)’s annual conference in Seattle last month that would put presenters from the digital humanities at a table in conversation with presenters from the “non-digital” humanities, so to speak. I selected the metaphor of “text as process” as the common ground for the endeavor, as it occurred to me that both digital humanities and traditional humanities potentially have a lot to say on this subject in a mutually intelligible way. Computation makes the process of any form of analysis salient, but humanists have, after all, been talking about deconstructing, mobilizing, defamiliarizing, and generally unraveling text for decades; so a productive encounter did not seem unreasonable to expect. My reason for choosing the ACLA as the venue for this experiment was twofold: first, unlike the annual conference of its larger sister, the Modern Language Association (MLA), the ACLA conference has not had much of a digital humanities presence. For this reason, birds of the same digital feather would not already have partitioned themselves off to their own corner of the conference at the ACLA. Second — as Ali Behdad mentioned in a keynote speech at the ACLA this year — what is at the core of Comparative Literature is really a comparative mindset: an openness to looking at things in multiple ways simultaneously. The ACLA was a venue, then, where the digital and the non-digital could productively mingle.

I am happy to report that the attempt seemed to be quite a success. I received so many excellent papers in response to my call for papers that, in the end, we had to open two seminars instead of one, with the result that, instead of two sessions, we ended up having a total of four. Jacob Haubenreich, an assistant professor at Southern Illinois University who had himself submitted a paper, graciously took charge of administering the additional seminar. A compendium of the abstracts that were submitted and accepted is available here, and browsing it will give the reader a sense of the lively conversation that ensued in the sessions. The presentations by Jessica Merrill (whose presentation showed how the work of early twentieth-century East European thinkers represents an alternative genealogy for subsequent literary theory, one that foregrounds process) and by Peter Libbey (whose presentation painted a rich and complex picture of the relationship between textuality and causality) provided a theoretical scaffolding that helped contextualize our conversations. Graham Sack’s presentation and mine focused on relationships between literary and cultural theory and actual digital instantiations (simulated models based on literary theory and data mining in digital libraries, respectively); Rita Raley, Chelsea Adewunmi, and Roberto Cruz Arzabal focused on the material form and media of textuality (in relation to both traditional and non-traditional media); and Jordan Buysse’s intervention pointed the way to how science studies can help inform our investigations into textuality. The seminar marked the beginning of a conversation that will, one hopes, continue into the future.

Sayan Bhattacharyya is currently a CLIR Postdoctoral Fellow at the Graduate School of Library and Information Science at the University of Illinois, Urbana-Champaign, where he works with the HathiTrust Research Center, the research arm of the HathiTrust Digital Library.
