In the last decade, the use of software tools for data analysis and data visualization has proliferated in the humanities. The availability of digitized material, increasing computational power, and analytical techniques adopted from network science, geospatial analysis, and natural language processing have inspired new ways to interrogate cultural heritage data. But those tools, reliant on statistical modeling, also limit the questions we can ask and the meanings we can discover. To uncover significance in materials that have passed through many hands, and in stories told by different voices inflected with opinion, argument, and perspective, we need tools that support human-scale exploration of complex systems. The research process requires “thinking through data”: our term for the reflective, slow collecting and editing of information, as distinct from the quick, mechanistic, algorithmic approach to data processing. This talk will demonstrate how the requirements of humanistic inquiry are encoded in tools developed at Humanities + Design, and why, in this age of artificial intelligence, it is so important to capture the intellectual work of data modeling.
Nicole Coleman is Digital Research Architect for the Stanford University Libraries and a consultant for the Stanford University Press’s Digital Publications project. She is also co-founder and Research Director of Humanities + Design, a research lab at Stanford’s Center for Spatial and Textual Analysis dedicated to encoding humanistic method into open-source software for research. She is currently developing an initiative to make library collections more useful to researchers through applications of artificial intelligence.