We collect the detritus of the dead; we carefully tag, sort, and organize. We group it in some ways, we ungroup it in others. We shelve it in this thing called ‘museum’, or, more recently, this ‘database’. We look at dots on a map, with a God’s-eye-view from on high.

In short, we look at the past through an oracular omnipresent present.

We try to see the past. But I’m tired of that. I want to hear it instead. But I know that I can’t; nevertheless, when I hear an instrument, I can imagine the physicality of the player playing it; in its echoes and resonances I can discern the physical space. I can feel the bass; I can move to the rhythm. The music engages my whole body, my whole imagination. Its associations with sounds, music, and tones I’ve heard before create a deep temporal experience, a system of embodied relationships (to paraphrase Husserl). Visual? Close your eyes. The graphics are getting in the way.

In Listening to Watling Street, I continue a series of experiments in trying to sonify the digital representations of the past (see this post and this one). To hear the discordant notes as well as the pleasing ones, and to use these to understand something of the unseen experience of the Roman world: that is my goal. Space in the Roman world was most often represented as a series of places-that-come-next; traveling along these two-dimensional paths replete with meanings was a sequence of views, sounds, and associations. In Listening to Watling Street, I take the simple counts of epigraphs in the Inscriptions of Roman Britain website discovered in the modern districts that correspond with the Antonine Itinerary along Watling Street, and compare these to the total number of inscriptions for the surrounding county. The algorithm then selects instruments, tones, and durations according to a kind of auction based on my counts, and stitches them into a song. As we listen to this song, we hear crescendos and diminuendos that reflect a kind of place-based shouting: here are the places that are advertising their Romanness, that have an expectation to be heard (Roman inscriptions quite literally speak to the reader); as Western listeners, we have also learned to interpret such musical dynamics as implying movement (emotional, physical) or importance. The same itinerary can then be repeated using different base data - coins from the Portable Antiquities Scheme database, for instance - to generate a new tonal poem that speaks to the economic world, and, perhaps, to the insecurity of that world (for why else would one bury coins?).
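The core move is simple enough to sketch. The snippet below is a minimal illustration of the idea, not the project's actual code: the place names are real stops along Watling Street, but the inscription counts and the way they are scaled to a 0-1 'loudness' are purely hypothetical.

```python
# A minimal sketch (not the project's actual code): each stop on the
# itinerary gets a relative 'loudness' from how many inscriptions its
# modern district contributes compared to the county total.
# The counts below are invented for illustration.

stops = [
    # (place, inscriptions in corresponding district, county total)
    ("Durovernum", 42, 180),
    ("Durolevum", 3, 180),
    ("Verulamium", 67, 220),
    ("Viroconium", 95, 150),
]

def dynamics(district_count, county_total):
    """Map a district's share of its county's inscriptions to 0.0-1.0."""
    return district_count / county_total if county_total else 0.0

for place, count, total in stops:
    level = dynamics(count, total)
    # louder passages for places that 'shout' their Romanness
    print(f"{place:12s} relative volume {level:.2f}")
```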

As this project develops, competing songs backwards and forwards along the same paths will be generated from various kinds of data, to create a sonified, ‘songified’, Roman Britain, and a way of engaging our other analytic senses.

Code

The code is repurposed from Brian Foo’s ‘DataDrivenDJ’ project. His code was developed to illustrate economic disparity along a subway line in New York, and so it embodies assumptions about how best to highlight that - most notably, in using a kind of auctioning system to select instruments and tones. The listener is invited to think about what those assumptions do to our understanding of Roman Britain sonified in this way. Foo draws his musical samples from music written by New York artists, music that ‘captures the throbbing vibrancy of New York and the movement of its citizens’. I too am interested in movement, but using these base samples perhaps unwittingly invites an aural comparison to New York; then again, Roman Britain represents a new moment in urban development on the island, so maybe the comparison isn’t that far off. In the first version of Listening to Watling Street (available to hear here), I slowed down the beats-per-minute to reflect a kind of marching cadence, to introduce subtly the idea of the marching Roman army. In the second version, I sped it up to capture the frenetic motion of the Roman trader. Both versions are true, for a given value of ‘truth’.
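To give a feel for what an auction-style selection does, here is a toy version. It is not Foo's actual auction logic; the instrument names and thresholds are invented. The point is only that richer data values can 'afford' more layers of sound, so busier places come out denser and louder.

```python
# A toy auction (not Brian Foo's actual code): each instrument has a
# threshold 'price', and a data value buys every instrument it can
# afford, so higher values trigger more simultaneous layers.

instruments = [
    # (name, minimum data value needed to trigger it) - illustrative only
    ("drone", 0.0),
    ("drum loop", 0.2),
    ("bass", 0.4),
    ("melody", 0.6),
    ("brass hit", 0.8),
]

def auction(value):
    """Return the instruments whose threshold this value can 'afford'."""
    return [name for name, price in instruments if value >= price]

print(auction(0.25))  # a quiet rural stop: ['drone', 'drum loop']
print(auction(0.85))  # a major town: all five layers at once
```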

In future versions, I will need to reflect more deeply on these stylistic and compositional approaches, and find instrument samples that better capture the emic sensibilities of Roman experience. The process, then, is to swap the raw data into a spreadsheet of values, which is fed to a Python script that creates a sequence of sound files to play (with duration and instrumentation). This sequence is fed into ChucK, a programming language that converts it into music. The animation is created using Foo’s Processing script, which neatly reflects that sequential series-of-places-that-come-next conception of space so natural to the Romans.
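For readers who want a concrete sense of that data-preparation step, the sketch below shows roughly what it involves, under loudly stated assumptions: the file names, column names, sample name, and the 75 BPM 'marching' tempo are all hypothetical stand-ins, not the project's actual inputs or settings. The output is the kind of (sample, start, duration, volume) sequence that a ChucK script could then render as audio.

```python
# A rough sketch of the spreadsheet-to-sequence step. File names,
# column names, the sample, and the tempo are illustrative only.

import csv

BPM = 75                       # a slower, marching tempo (assumed value)
BEAT_MS = 60000.0 / BPM

with open("watling_street_inscriptions.csv") as infile, \
     open("sequence.csv", "w", newline="") as outfile:
    reader = csv.DictReader(infile)        # assumed columns: place, count, total
    writer = csv.writer(outfile)
    writer.writerow(["sample", "start_ms", "duration_ms", "volume"])
    start = 0.0
    for row in reader:
        # a district's share of its county's inscriptions sets the volume
        share = int(row["count"]) / max(int(row["total"]), 1)
        duration = BEAT_MS * 4             # one bar per stop on the itinerary
        writer.writerow(["marching_drum.wav", round(start),
                         round(duration), round(share, 2)])
        start += duration
```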

Acknowledgements

I can’t thank Brian Foo enough. His experiments in sonification, and his commitment to sharing his code and expertise, are exemplary. You too can play with this code to generate your own sonified experience of space - see Foo’s GitHub repo at https://github.com/beefoo/music-lab-scripts.

Electric Archaeology

is a production of Shawn Graham, Associate Professor of Digital Humanities in the Department of History at Carleton University. Shawn likes to think of himself as a Romanist, and quite possibly, an archaeologist of Roman landscapes and social spaces. His digital archaeology work often turns on questions of simulation and the best ways to represent these things in everything from interactive fiction to virtual worlds. He’s into data mining archaeological datasets, and has a book coming out soonish on digital methods in history with Ian Milligan and Scott Weingart.

Earlier experiments in sonification may be enjoyed here: ‘Listening to Topic Models’; ‘Historical Friction’