"it will discourse most eloquent music": sonify variants of hamlet

“It will discourse most eloquent music”: Sonifying Variants of Hamlet
Iain Emsley, Oxford e-Research Centre, University of Oxford
[email protected] | @iainemsley | @minnelieder

Uploaded by iain-emsley on 21 January 2017


TRANSCRIPT

CRISP-SKA Cluster of Research Infrastructures for Synergies in Physics - Square Kilometre Array


Overview

Introduction to Sonification
Sonifying Hamlet
Auditory Beacons
Visualization
Conclusions
Future work

Introduction to Sonification

What is sonification? Sonification is an alternative to visualization:

the use of nonspeech audio to convey information. More specifically, sonification is the transformation of data relations into perceived relations in an acoustic signal for the purposes of facilitating communication or interpretation. (Kramer, 1997) (bold is mine)

Sonification as an alternative or complement to visualisation.

This is the standard definition of sonification. *pause* We believe that it can be extended into analytics. Previous work has focused on the transformation. We are looking at ways of facilitating interpretation and understanding of changes.

Introduction to Sonification: Related Work

TEI-Comparator - James Cummings and Arno Mittelbach

Listening to Wikipedia - Hatnote

We Need Us - Julie Freeman

Sonification of hyperstructures - De Roure, Blackburn et al.


The TEI Comparator was used for visual comparisons of words for the Holinshed Chronicles. We look at the editorial structures in an auditory fashion.

In Listening to Wikipedia, Hatnote sonifies the size of each change using a Wikipedia feed with a simple sonification. We try to move to music rather than sound.

We Need Us is a sonification and visualisation of click and swipe data.

De Roure, Blackburn et al. sonify an artificial hyperstructure using algorithmic composition. We sonify extracted structures and transform them.

Clearly there is other related work.

Hinman Collator image: By Folger Shakespeare Library, Washington, DC (Julie Ainsworth, photographer) (http://luna.folger.edu/luna/servlet/s/919x31) [CC BY-SA 4.0 (http://creativecommons.org/licenses/by-sa/4.0)], via Wikimedia Commons

We took inspiration from the Hinman Collator and the stereoscope.

Sonifying Hamlet

Sonifying the Variants
Auditory Beacons
Visualization

William Gaver's 1986 paper.

Sonifying the Variants: From Play to Sonification

Using First Folio and Quartos data
Parsing the TEI XML, converting it with a rule set into numbers, and sonifying the data to produce sounds


Sonification

----- Meeting Notes (25/10/15 20:38) -----
Pipeline to transform the XML into numbers according to a simple set of rules. These numbers are then transformed into sound in the black box.

Mention the Hinman collator here and stereoscopy.

Used the First Folio Hamlet and the Quartos variants as the test data.
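The pipeline step can be sketched roughly as follows; the element-to-number rule table, the function name, and the TEI snippet are illustrative assumptions, not the project's actual code.

```python
import xml.etree.ElementTree as ET

TEI_NS = "{http://www.tei-c.org/ns/1.0}"

# Illustrative rule set: map TEI element names to base MIDI-style pitches.
RULES = {
    "div": 48,    # act/scene boundary
    "stage": 60,  # stage direction
    "sp": 72,     # speech
}

def tei_to_numbers(tei_xml):
    """Walk a TEI fragment in document order, emitting one number
    per element that the rule set maps."""
    root = ET.fromstring(tei_xml)
    numbers = []
    for elem in root.iter():
        tag = elem.tag.replace(TEI_NS, "")
        if tag in RULES:
            numbers.append(RULES[tag])
    return numbers

sample = (
    '<body xmlns="http://www.tei-c.org/ns/1.0">'
    '<div type="scene">'
    '<stage>Enter Hamlet.</stage>'
    '<sp><speaker>Ham.</speaker><l>To be, or not to be</l></sp>'
    '</div></body>'
)
print(tei_to_numbers(sample))  # -> [48, 60, 72]
```

The resulting number stream is what the "black box" of the pipeline would then turn into sound.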

One stream, or two streams to create an audio version of a stereoscopic illusion.

Auditory Beacons

Acts and Scenes: different instruments and pitches
Stage Directions: different instruments; period versus modern sounds?
Speakers: increasing volume
Stereoscopic illusion using two streams

Challenges of making the information useful for the listener. Builds on the work of William Gaver, going back to the early 1980s.

Acts & Scenes are relatively static. We can be confident that they will be in each play.

Stage directions - use different instruments. *pause* Do we use period sounds or modern sounds? How does this affect the listener?

Speakers - same instrument with different pitches. Also use increasing volumes as a way marker.
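One way the speaker beacons might be sketched: each speaker keeps one pitch on a shared instrument, and the volume rises each time that speaker reappears. The function name, base pitch, and step sizes are assumptions for illustration.

```python
BASE_PITCH = 60   # middle C; each new speaker takes the next semitone up
BASE_VOLUME = 64  # MIDI-style velocity, raised on each reappearance

def beacon_events(speakers):
    """Map a sequence of speaker labels to (name, pitch, volume) events."""
    pitch_of = {}  # speaker -> fixed pitch
    seen = {}      # speaker -> appearance count
    events = []
    for name in speakers:
        if name not in pitch_of:
            pitch_of[name] = BASE_PITCH + len(pitch_of)
        seen[name] = seen.get(name, 0) + 1
        # Louder each time the same speaker returns, capped at full velocity.
        volume = min(127, BASE_VOLUME + 8 * (seen[name] - 1))
        events.append((name, pitch_of[name], volume))
    return events

print(beacon_events(["Hamlet", "Horatio", "Hamlet"]))
# -> [('Hamlet', 60, 64), ('Horatio', 61, 64), ('Hamlet', 60, 72)]
```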

Play twice: once before and once after the explanation.

Visualization

Use of the Processing visual arts language
Reused the note data to create images

----- Meeting Notes (25/10/15 20:38) -----
Abstract notation: use of bars and circles for speakers.

This aided comprehension.

Issues of timing and synchronization between events.

Expand on this in the talk verbally. Add exeunt to slide
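The slides used Processing; as a rough stdlib analogue of the bars-and-circles notation, here is a sketch that writes the note data out as SVG. The event format and layout are assumptions, not the project's visualization.

```python
def notes_to_svg(events, width=400, height=120):
    """Render (speaker, pitch) events as an abstract score:
    one bar per event (taller = higher pitch), and a circle
    marking each change of speaker."""
    step = width // max(1, len(events))
    parts = [f'<svg xmlns="http://www.w3.org/2000/svg" '
             f'width="{width}" height="{height}">']
    last_speaker = None
    for i, (speaker, pitch) in enumerate(events):
        x = i * step
        parts.append(f'<rect x="{x}" y="{height - pitch}" '
                     f'width="{step - 2}" height="{pitch}" fill="grey"/>')
        if speaker != last_speaker:
            parts.append(f'<circle cx="{x + step // 2}" cy="10" '
                         f'r="4" fill="black"/>')
            last_speaker = speaker
    parts.append('</svg>')
    return "".join(parts)

svg = notes_to_svg([("Hamlet", 60), ("Hamlet", 64), ("Horatio", 62)])
```

Three bars, with circles only where the speaker changes, mirroring the abstract notation described in the notes.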

Conclusions

Sonification has potential in comparing texts
Perceptions altered by choices of sound
Use of multiple parameters
Use of spatial as well as temporal events to help users

Either a single channel or stereo channels. Stereo allows for stereoscopic channels, which emulates the collation illusion in audio.
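The two-stream idea could be sketched, under assumed note frequencies, by writing one variant's stream to the left channel and the other's to the right of a stereo WAV using only the standard library:

```python
import math
import struct
import wave

RATE = 44100

def tone(freq, secs=0.2, amp=0.4):
    """Samples of a plain sine tone."""
    n = int(RATE * secs)
    return [amp * math.sin(2 * math.pi * freq * i / RATE) for i in range(n)]

def write_stereo(path, left_freqs, right_freqs):
    """One variant's note stream in the left channel, the other's in
    the right, padded with silence to equal length."""
    left = [s for f in left_freqs for s in tone(f)]
    right = [s for f in right_freqs for s in tone(f)]
    n = max(len(left), len(right))
    left += [0.0] * (n - len(left))
    right += [0.0] * (n - len(right))
    with wave.open(path, "wb") as w:
        w.setnchannels(2)
        w.setsampwidth(2)  # 16-bit samples
        w.setframerate(RATE)
        w.writeframes(b"".join(
            struct.pack("<hh", int(l * 32767), int(r * 32767))
            for l, r in zip(left, right)))

# First Folio stream left, Quarto stream right (frequencies illustrative).
write_stereo("variants.wav", [262, 294, 330], [262, 294, 349])
```

Where the two variants agree the channels sound the same note; a divergence (the last note here) appears as a left/right mismatch, which is the audio analogue of the collator's flicker.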

Sonification for interpretation and analysis is not as well researched as visualization for this use case. We need to involve specialists to make it more useful.

Future Work

Work on different sounds for elements
Transforming and linking the structures
Sonifying speeches with the choice elements
Using metadata for additional sonification
Perceptions of speakers' characteristics through pitches
Characteristics of speech: blank verse, prose and rhyme
Filtering

Period versus modern sounds: historicist arguments and understanding the staging and period. Mention the workshop.
Transforming: writing out the timing data to link the structures together using SMIL or MEI.
Sonifying the words: need an algorithm for this. The choice element provides a different set of challenges.
Metadata: use of gender; use of rhythm elements -> rhyming or prose.
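Writing the timing data out as SMIL might look like the following minimal sketch. The event format and file names are assumptions, and a real timeline linking play structures would carry far more detail.

```python
import xml.etree.ElementTree as ET

def events_to_smil(events):
    """events: list of (clip_uri, start_seconds, dur_seconds).
    Emit a minimal SMIL document scheduling the clips in parallel."""
    smil = ET.Element("smil")
    body = ET.SubElement(smil, "body")
    par = ET.SubElement(body, "par")
    for uri, begin, dur in events:
        ET.SubElement(par, "audio", src=uri,
                      begin=f"{begin}s", dur=f"{dur}s")
    return ET.tostring(smil, encoding="unicode")

# Hypothetical per-act clips with their computed start times and durations.
doc = events_to_smil([("act1.wav", 0, 12.5), ("act2.wav", 12.5, 9.0)])
print(doc)
```

The same timing data could equally be serialized as MEI; SMIL is shown here only because its `begin`/`dur` attributes map directly onto the numbers the pipeline already produces.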

Filtering: there is already some filtering of which elements to map, but it may be useful to add a way for the user to filter further elements.

Acknowledgements

The work is part of an MSc project in Software Engineering supported by the Oxford e-Research Centre. Thanks to Pip Willcox, Rahim Lakhoo and Professor David De Roure.

It will discourse most eloquent music: Sonifying variants of Hamlet http://ora.ox.ac.uk/objects/uuid:1785e0ac-5cbb-4d35-8546-4495aa8baec8

Thank you for listening

Any questions?

[email protected] | @iainemsley | @minnelieder

References

Gregory Kramer. 1993. Auditory Display: Sonification, Audification, and Auditory Interfaces. Perseus Publishing.
Alexandra Supper. 2012. "The Search for the Killer Application: Drawing the Boundaries around the Sonification of Scientific Data". In Trevor Pinch and Karin Bijsterveld (eds), The Oxford Handbook of Sound Studies. New York: Oxford University Press, p. 253.
Keith V. Nesbitt and Stephen Barrass. 2004. "Finding Trading Patterns in Stock Market Data". IEEE Computer Graphics and Applications 24:5, IEEE Computer Society, pp. 45-55.
Digital facsimile of the Bodleian First Folio of Shakespeare's plays, Arch. G c.7. First Folio home page: http://firstfolio.bodleian.ox.ac.uk/
Charlton Hinman. 1947. "Mechanized collation; a preliminary report". Papers of the Bibliographical Society of America 41, pp. 99-106.
Steven Escar Smith. 2000. "'The Eternal Verities Verified': Charlton Hinman and the Roots of Mechanical Collation". Studies in Bibliography 53, pp. 129-62.
The tragedy of Hamlet Prince of Denmarke: an electronic edition. Hamlet, First Quarto, 1603. British Library Shelfmark: C.34.k.1. http://www.quartos.org/XML_Orig/ham-1603-22275x-bli-c01_orig.xml
The tragedy of Hamlet Prince of Denmarke: an electronic edition. Hamlet, Second Quarto Variant, 1605. British Library Shelfmark: C.34.k.2. http://www.quartos.org/XML_Orig/ham-1605-22276a-bli-c01_orig.xml
James Cummings and Arno Mittelbach. 2010. "The Holinshed Project: Comparing and linking two editions of Holinshed's Chronicle". International Journal of Humanities and Arts Computing 4:1-2, pp. 39-53. ISSN 1753-8548. http://dx.doi.org/10.3366/ijhac.2011.0006
Keith V. Nesbitt and Stephen Barrass. 2002. "Evaluation of a Multimodal Sonification and Visualisation of Depth of Market Stock Data". In R. Nakatsu and H. Kawahara (eds), Proceedings of the International Conference on Auditory Display (ICAD 2002).
David C. De Roure, Don G. Cruickshank, Danius T. Michaelides, Kevin R. Page and Mark J. Weal. 2002. "On Hyperstructure and Musical Structure". In Proceedings of the Thirteenth ACM Conference on Hypertext and Hypermedia (Hypertext 2002), Maryland, USA, 11-15 June 2002. ACM, pp. 95-104.
Claire Holden. 2012. "Recreating early 19th-century style in a 21st-century marketplace: An orchestral violinist's perspective". Presented at the Institute of Musical Research DeNote Seminar, Senate House, London, 30 January 2012.
G. Kramer, B. Walker, T. Bonebright, et al. 1997. Sonification report: Status of the field and research agenda. Prepared for the National Science Foundation by members of the International Community for Auditory Display. http://sonify.psych.gatech.edu/publications/pdfs/1999-NSF-Report.pdf
L. Lehmann, A. Mittelbach, J. Cummings, C. Rensing and R. Steinmetz. 2010. "Automatic Detection and Visualisation of Overlap for Tracking of Information Flow". In Proceedings of I-KNOW.
William W. Gaver. 1986. "Auditory icons: using sound in computer interfaces". Human-Computer Interaction 2:2 (June 1986), pp. 167-177. DOI: http://dx.doi.org/10.1207/s15327051hci0202_3