Indiana University Bloomington


Past Colloquia

Nineteenth-Century Digital Humanities

Roger Whitson
Washington State University

Scholars tend to consider the digital humanities a methodological field. But the cultural changes brought about by digital technology also have the potential to affect the content of humanistic study. I'll examine steampunk as a subculture whose combined experiments with alternate history, nineteenth-century writing, and the engineering of anachronistic devices contribute to a bold vision of a historically inflected digital humanities practice.

In addition to showing how steampunk authors respond to the use of nineteenth-century technological design in countries like India and China, and to the looming reality of climate change transforming our technological infrastructure, I will detail my own experiments with anachronistic computing as a response to Kari Kraus's call for more conjectural inquiry in the digital humanities. Steampunk is unique in its ability to appropriate nineteenth-century history in order to find alternatives to the futurism that runs rampant in our neoliberal technology industry. This talk will show how a similar futurism can be found in some parts of the digital humanities, while detailing how nineteenth-century digital humanities can inspire alternatives to the future.

Monday, October 20, 2014
Sponsored by the Catapult Center for Digital Humanities & Computational Analysis of Texts

The VAT: Video Analysis at Scale

Virginia Kuhn
School of Cinematic Arts, University of Southern California

Cultural analytics, a newer branch of the digital humanities, is an approach that deploys computer technologies to analyze the formal features of art and culture, making them available to interpretive methods. Moving-image media are particularly ripe for computational analysis given their increasing ubiquity in contemporary culture. Indeed, digital video—whether recorded digitally or digitized from film—is a rapidly expanding form of contemporary cultural production, one made possible by the proliferation of personal recording technologies and hosting platforms like YouTube, Vimeo, and the Internet Archive. In short, video is one of the most compelling “big data” issues of the current cultural moment; its formats are diverse, rapidly transmitted, and all but boundless in number.

Yet despite its scale and importance, video remains a daunting object for sustained research, owing to obstacles that are technological, institutional, and conceptual in nature. In this talk, Virginia Kuhn will describe her large-scale video analytics project, supported by the NSF’s XSEDE (Extreme Science and Engineering Discovery Environment) program, and her project team’s efforts at establishing a software workbench for video analysis, annotation, and visualization using both current and experimental discovery methods.

Thursday, October 16, 2014
Sponsored by the Catapult Center for Digital Humanities & Computational Analysis of Texts

Computer Science and Literary History, in Conversation about Poetry

Ted Underwood
University of Illinois at Urbana-Champaign

To the extent that humanists discuss computer science at all, we tend to imagine it instrumentally, as a source of useful "tools." But the conversation between computer scientists and humanists needn't be purely instrumental. Computer science is an epistemologically flexible discourse that seeks to model learning and belief; humanists may have more in common with it than we imagine.

To make these abstract reflections more concrete, I'll describe a research project that tries to understand the history of poetic diction from 1700 to 1920. In this project, I've found computers useful for practical reasons: confronted with hundreds of thousands of digitized books, we need a way to identify the particular volumes and pages that contain "poetry," and a way to identify historically significant trends across that many works. But beyond those practical problems, I've found the conceptual resources of computer science helpful as I grapple with the necessary ambiguities surrounding critical terms like "genre" and "diction." Methods discussed will include multi-label classification, semi-supervised learning, and probabilistic graphical models, as well as versification and close reading.

Monday, April 7, 2014
Sponsored by the Catapult Center for Digital Humanities & Computational Analysis of Texts

Exploring, Analyzing, Remixing: Teaching History with Digital Media

Mills Kelly
George Mason University

Digital media have profoundly changed the ways that our students find, make sense of, and work with historical information. In this talk, Professor Kelly will discuss his research on teaching and learning history with digital means, with a particular focus on how students are using digital tools to work with massive data sets and how they are using new media to stretch the boundaries of traditional forms of learning.

Monday, March 31, 2014
Sponsored by the Catapult Center for Digital Humanities & Computational Analysis of Texts

The Syntactic Reference Corpus of Medieval French (SRCMF):
A dependency treebank for Old French

Thomas Rainsford
University of Oxford

Building on two large existing corpora of Old French (the Base de Français Médiéval (BFM) and the Nouveau Corpus d'Amsterdam (NCA)), the SRCMF project has created the first dependency treebank for Old French. This talk will consider the preparation of the corpus, and will go on to highlight some of the work that the corpus has made possible in the two years since the end of the project. The manual annotation of the corpus according to an innovative grammar model was made possible by the NotaBene annotation tool (Mazziotta 2010), and the quality of the annotation was ensured by a process of double-blind annotation and subsequent comparison. The annotated texts are available (i) in Tiger-XML format, most helpful for conventional linguistic queries, and (ii) in CoNLL format for use in training the mate-tools parser (Bohnet 2010). I will present ongoing work on constructing a web interface for the completed corpus using the TXM platform (Heiden 2010), which is used to host the BFM, and will demonstrate the export of query results in the form of "Key Node in Context" concordances.

Wednesday, March 26, 2014
Sponsored by the Catapult Center for Digital Humanities & Computational Analysis of Texts

The Virtual World Heritage Laboratory: How We Apply 3D Technologies to Teaching and Research at IU

Bernard Frischer
Indiana University
Department of Informatics

In August 2013 the Virtual World Heritage Laboratory moved to IU from the University of Virginia. In this talk, lab director Bernard Frischer will discuss recent projects of the lab, including the Virtual Meridian of Augustus, the Digital Hadrian's Villa Project, and the Digital Sculpture Project. Frischer will also present the new online, peer-reviewed journal he founded, Digital Applications in Archaeology and Cultural Heritage, the first scholarly outlet focusing on 3D modeling of cultural heritage artifacts, monuments, and settlements.

Monday, March 3, 2014
Sponsored by the Catapult Center for Digital Humanities & Computational Analysis of Texts

3D Technologies and the Democratization of Archaeology and Natural History:
A Model for Humanities and Sciences

Herbert Maschner
Idaho State University

3D technologies are revolutionizing the research experience by bringing critical, rare, fragile, or inaccessible collections to a worldwide audience. Coupled with on-screen analytical tools, the virtualization of entire repositories is democratizing the research enterprise by creating opportunities for students, indigenous peoples, politicians, and a global research community to conduct analyses from anywhere on the planet. The Idaho Museum of Natural History is at the forefront of virtualizing collections, with over 15,000 archaeological and paleontological specimens online and key visualization and analytical tools built into the user experience. This presentation will provide a complete demonstration of this new model for the creation of global heritage repositories.

Monday, March 3, 2014
Sponsored by the Catapult Center for Digital Humanities & Computational Analysis of Texts

Blaise Agüera y Arcas
Google Inc.

Talk #1: The Economic Future of Women

With rare exceptions, women have been an economic underclass relative to men for thousands of years, perhaps since the dawn of agriculture. While we have made strides in recent history toward equal rights for the genders, it is still the case that the great majority of world capital remains in the hands of men; that a 50-year-old American woman with full-time employment in 1970 made on average only 55% of a man’s wage; and that today’s business executives and tech industry are overwhelmingly male. However, this talk will argue that over the next few years we will see a reversal of this very longstanding status quo: women will become economically dominant. The case is argued using labor and wealth statistics, both American and international. The second half of the talk is more speculative, delving into theories as to why we are seeing this reversal now.

Monday, February 10, 2014
Sponsored by the Rob Kling Center for Social Informatics and
the Catapult Center for Digital Humanities & Computational Analysis of Texts

Talk #2: Reinventing Gutenberg

Johann Gutenberg is widely understood to have been the inventor of the most important information technology of the second millennium: printing.  Before Gutenberg, literacy in Europe was for the rich and noble, and books were individually hand-copied treasures.  After his invention and its rapid spread throughout the continent, books and letters became widely accessible, leading to the rise of an educated middle class, a profound acceleration in technological innovation, the Protestant Reformation, and much else that shaped world history from 1500 on.

But how much do we know about what Gutenberg's invention actually was? Prior to the fruitful collaboration ten years ago between Blaise Agüera y Arcas, then an applied mathematics graduate student at Princeton, and Paul Needham, a world authority on Gutenberg and the curator of the Scheide Library at Princeton, much of what was known about Gutenberg's life and work came from secondary sources, legal records, and guesswork, and much of it was in error. Needham and Agüera y Arcas worked together to apply modern analytic methods to Gutenberg's surviving output, using high-resolution imaging and shape clustering to prove that while he was indeed a great inventor, he did not invent the technologies that have been ascribed to him, and that the early evolution of printing technology was much more complex than had been thought.

This talk will be of interest to humanists and scientists alike.

Monday, February 10, 2014
Sponsored by the Lilly Library and the Catapult Center for Digital Humanities & Computational Analysis of Texts

Operationalizing Networks in the Humanities (video)
Elijah Meeks
Stanford University

The use of the network as an object for representing and measuring systems has grown in the humanities and continues to increase in popularity, and as it does there are more and more signs of networks being used in simulation and modeling to explore humanities subjects. Building on Moretti's concept of operationalization, this talk will focus on the use of networks and network models not merely to describe systems but to hypothesize their formation, reproduction, maintenance, and decline. This will include an exploration of the general principles of network analysis and representation, as well as the use of networks in establishing system-based and agent-based models. These objects are dynamic, and as such interactive methods of publication are necessary to allow readers to interrogate, cite, critique, or extend them. Some examples of these publication methods will be explored in depth, utilizing models from the work being done in the English and Classics departments at Stanford.

Monday, January 27, 2014
Sponsored by the Catapult Center for Digital Humanities & Computational Analysis of Texts

The Emergence of the Digital Humanities
Steven E. Jones
Loyola University Chicago

The past decade has seen a profound shift in our collective understanding of the digital network. What was once understood to be a transcendent virtual reality is now experienced as a ubiquitous grid of data that we move through every day. As William Gibson says, cyberspace has everted: turned inside out and spilled out into the world. At the same time, the digital humanities has emerged as a new interdisciplinary field. This is no coincidence, as Jones argues in his new book, The Emergence of the Digital Humanities (Routledge, 2013). The eversion of cyberspace, as manifested in 3D printing, ubiquitous computing, data mining and analysis, the geospatial turn, and experiments with digital publishing platforms, provides an essential context for understanding the emergence and influence of the digital humanities in the new millennium.

Tuesday, November 19, 2013
Sponsored by the Catapult Center for Digital Humanities & Computational Analysis of Texts

Networks and Neighbourhoods in Early Modern London

Janelle Jenstad
University of Victoria

Janelle Jenstad will take you on a tour of early modern London via the four interoperable projects that make up The Map of Early Modern London. Taking the Agas map of 1560s London as its landscape, MoEML connects an encyclopedia of the city's people and places with an anthology of London's literature and a versioned edition of John Stow's Survey of London. The MoEML team works with a network of contributors in Digital Humanities and early modernists in various disciplines to map out relationships of proximity and to visualize social and material relationships in the early modern city.

Bio: Janelle Jenstad is Associate Professor of English at the University of Victoria. She is the founder and current Director of The Map of Early Modern London, as well as the Assistant Coordinating Editor of the Internet Shakespeare Editions.

Monday, October 28, 2013
Sponsored by College of Arts and Sciences - Themester, Department of English, and
the Catapult Center for Digital Humanities & Computational Analysis of Texts

Digital Humanities: Chances and Challenges

Caroline Sporleder
Saarland University

The field of "Digital Humanities" has expanded considerably in the past few years and continues to grow. Dedicated programmes and funding schemes have led both to an increase in the availability of data through large-scale digitisation projects (e.g. historical newspapers, archaeological field books, mediaeval manuscripts) and to a boost in basic research in this area. Several universities have set up research centers dedicated to digital humanities research. Furthermore, the growing demand for researchers who combine deep knowledge of the humanities with IT skills has led to the establishment of many new degree programmes in digital humanities and related areas. However, the Digital Humanities is a very heterogeneous field, with different sub-communities that hold very different views of what "Digital Humanities" should entail. Moreover, there continues to be a certain amount of skepticism towards the use of digital techniques within the Humanities themselves.

In this talk, I will present my view of the field and of its chances and challenges. I will present several case studies, ranging from digitally assisted curation in cultural heritage institutes, through digital research tools for humanities researchers (virtual research environments, digital editions, search interfaces), to methods for visualising and mining data and for extracting and quantifying information. It is this last area that has the biggest potential for changing humanities research, by enabling researchers to discover facts that could not be discovered by traditional, purely manual means. However, this last area is also the most controversial and poses the biggest challenge, often requiring the development of novel, quantitative research methodologies in the Humanities. Clearly, these new methods should complement rather than replace traditional ones.

Friday, October 11, 2013
Sponsored by the Data to Insight Center and
Catapult Center for Digital Humanities & Computational Analysis of Texts

Global Seduction and Disasters at 1:40,000,000 - Challenges in Social Geography

Ingo Günther

Friday, September 20, 2013
Sponsored by the Catapult Center for Digital Humanities & Computational Analysis of Texts

The Hands-On Imperative

William J. Turkel
University of Western Ontario

The idea that "making is thinking," as Richard Sennett puts it, has always had some place in the humanities. Until recently, however, it was costly and difficult to produce physical objects. Now online maker communities, powerful design software, cloud-based services, desktop fabrication and physical computing make it almost as easy for people to make and share artifacts as information or software. I describe how to set up a makerspace and fab lab for humanists, and why you might want to.

Friday, April 26, 2013
Sponsored by the Catapult Center for Digital Humanities & Computational Analysis of Texts

The Research Program at the Distributed Digital Music Archives and Libraries Laboratory

Ichiro Fujinaga
McGill University

The main goal of this research program is to develop and evaluate practices, frameworks, and tools for the design and construction of worldwide distributed digital music archives and libraries. Over the last few millennia, humans have amassed an enormous amount of information and cultural material that is scattered around the world. It is becoming abundantly clear that the optimal path for acquisition is to distribute the task of digitizing the wealth of historical and cultural heritage material that exists in analogue formats, which may include books, manuscripts, music scores, maps, photographs, videos, analogue tapes, and phonograph records. In order to achieve this goal, libraries, museums, and archives throughout the world, large or small, need well-researched policies, proper guidance, and efficient tools to digitize their collections and to make them available economically. The research conducted within the program will address unique and imminent challenges posed by the digitization and dissemination of music media. In this talk, various projects currently conducted at our laboratory will be presented, including optical music recognition, workflow management for automatic metadata extraction from LP recordings, creation of ground truth for music structural and chord analysis, and evaluation of digitization methods for analogue recordings.

Friday, March 22, 2013
Sponsored by the Catapult Center for Digital Humanities & Computational Analysis of Texts

A Computational Research System for the History of Science and its Connections to Bioinformatics

Manfred Laubichler, Julia Damerow, & Erick Peirson
Arizona State University

Computational methods and perspectives can transform the history of science by enabling the pursuit of novel types of questions, dramatically expanding the scale of analysis (geographically and temporally), and offering novel forms of publication that greatly enhance access and transparency. In this talk we present a brief summary of a computational research system for the history of science, introduce some of its tools and use cases, and discuss its implications for research, education, and publication practices, as well as its connections to the open access movement and similar transformations in the natural and social sciences emphasizing big data. One such connection is with genomics, and we will explore the isomorphic structure between different types of historically evolving information systems, such as genomes and science. We also argue that computational approaches help to reconnect the history of science to individual scientific disciplines.

Tuesday, February 19, 2013
Sponsored by the Catapult Center for Digital Humanities & Computational Analysis of Texts

Correlating Theme, Geography, and Sentiment in the 19th Century Literary Imagination

Matthew Jockers
University of Nebraska

How do literary expressions of and attitudes toward slavery in the 19th century change according to fictional setting? Do novels set in Ireland present a perspective toward landlords and tenants that is similar or different from what we find in novels set in America or England? How do the answers to these and similar questions fluctuate over time or according to author gender or author nationality?

This study uses tools and techniques from text mining, natural language processing, machine learning, and statistics to address questions such as these and to identify and study how specific places, themes, and sentiments find synchronous or asynchronous expression within the 19th-century literary imagination. Using data mined from a large corpus of roughly 3,500 works of British, Irish, and American fiction, this macroanalysis seeks to expose persistent links between geographic setting, theme, and sentiment, and then to chart the ways in which places (such as Ireland) are constructed, or “invented,” within the literary imagination of the century.

Friday, February 8, 2013
Sponsored by the Catapult Center for Digital Humanities & Computational Analysis of Texts and
The Digital Culture Lab in the School of Library and Information Science

Challenges for a Humanities Macroscope

Timothy R. Tangherlini

With the advent of very large digital repositories of literary and other culturally expressive works, canonical approaches to Humanities research are confronted with a crisis of sustainability. If we have access to all the literature produced in the 19th century, should we not take it into account in some way in our research? Similarly, if we have access to several hundred thousand stories in our folklore archives, should we not devise methods that allow all of that information to inform our research? Several fundamental challenges confront Humanities scholars as we move toward considering these larger and larger corpora, not least of which is how one “reads” fifteen thousand novels, or a quarter of a million stories. In this presentation, I explore the theoretically tantalizing prospect of a research environment that Katy Börner has labeled the “macroscope.” For Börner, “Macroscopes provide a ‘vision of the whole,’ helping us ‘synthesize’ the related elements and detect patterns, trends, and outliers while granting access to myriad details. Rather than make things larger or smaller, macroscopes let us observe what is at once too great, slow, or complex for the human eye and mind to notice and comprehend” (Börner 2011, 60). The macroscope holds the promise of wedding “close reading” approaches, which have been a fundamental analytical approach in folkloristics since the beginning of the field, to what Franco Moretti has called “distant reading,” where “Distance… is a condition of knowledge: it allows you to focus on units that are much smaller or much larger than the text: devices, themes, tropes—or genres and systems” (Moretti 2000, 57). In this presentation, I make use of work on a large corpus of Danish folklore, the Google Books corpus, and other preliminary explorations of internet data, from blogs to Twitter, Facebook, and YouTube. I present some possible algorithmic approaches to specific research problems, including network methods and topic modeling.

Friday, January 25, 2013
Sponsored by the Catapult Center for Digital Humanities & Computational Analysis of Texts

Digital Scholarship and the Mental Worlds of Isaac Newton

Rob Iliffe
University of Sussex

Since Newton's non-scientific papers were sold at auction in 1936, there has been a progressive revelation of information about the little-known private mental life of Isaac Newton. Although his writings on alchemical and religious topics were effectively available on microfilm by the mid-1980s, the online publication of all of his writings dramatically improves our ability to investigate this unknown world. In this talk I examine how the digital medium, coupled with hard scholarly work, has facilitated the acquisition of new insights into the development of Newton's beliefs and research practices. I conclude by considering some of the wider implications for humanities research that arise from creating and engaging with the online Newton.

Friday, November 30, 2012
Sponsored by History and Philosophy of Science

Challenges for the Humanities: Scholarly Work and Publishing in the Digital Age

Urs Schoepflin
Max Planck Institute for the History of Science

Since the foundation of the Max Planck Institute for the History of Science in 1994, it has been our primary concern to make source materials available in digital form while developing cutting-edge tools and instruments to adequately support scholarly work. ECHO - Cultural Heritage Online, an open access repository and research environment, is the most prominent outcome of this endeavor. Based on our experience, basic issues of motivation, collection-building strategies, specific tool development, Open Access as a primary prerequisite (Berlin Declaration), research collaboration, and trans-disciplinarity will be raised. Reflecting on changing notions of "the document" and on the information economy, novel ways of disseminating research results will be presented (e.g. by way of Virtual Spaces; with Edition Open Access). Finally, problems of organizing quality control, of long-term sustainability, and of gaining recognition in evaluation procedures will be discussed.

Monday, October 22, 2012
Sponsored by Sawyer Seminar (Mellon Foundation)
Co-sponsored by the Catapult Center for Digital Humanities
Co-sponsored by HPSC
