
Motion Origami

Daniel Bartoš¹

¹ FAMU, Prague, Czech Republic
[email protected]

Abstract. The Motion Origami project explores live performance strategies focused on gesture-based control of sound. The sound processing involves granular sound synthesis in a Max/MSP patch. Motion Origami is designed for a live performance scenario as an interactive music instrument. It makes use of either the live performance audio input or pre-recorded sound material. This live interface prototype explores gesture-based music composition and performance techniques. The sound transformations are driven by hand gestures, while the use of a motion-tracking device lets the user build up specific experience and virtuosity.

Keywords: Leap Motion sensor, gesture recognition, motion tracking, music expression, granular synthesis, Max/MSP

Introduction

The name of the Motion Origami project¹ is inspired by the Japanese art tradition of origami folding. The actual process of paper folding is reflected in a specific compositional strategy, which uses captured hand gestures. In other words, the performing artist, the musician, is able to 'fold' sounds with his own hand gestures into new sound objects. The simple 'folds' thus result in complex soundscapes during the performance.

1 Motion Origami; the project presentation can be accessed online at http://www.danielbartos.com/motion-origami

Figure 1. Motion Origami – Max/MSP patch with the Leap Motion

This paper describes an original Max/MSP patch, which uses the Smooth Overlapping Granular Synthesis object sogs~² as its main sound-transforming element. The novelty of this approach lies not in the production of new code or a particular design, but in making creative use of existing, advanced audio processing tools in sound and music performance. The project shows how a live interface prototype can be turned into an original compositional tool. As such, the Motion Origami project represents a recipe for approaching the design of an experimental music instrument, and demonstrates an approach similar to rapid prototyping applied to the domain of advanced Max/MSP audio processing.

Leap Motion

The Leap Motion³ sensor was designed for touchless tracking of hand gestures and movements. In conjunction with the Leap Motion SDK, the sensor becomes a sophisticated controller and delivers complex information monitoring the hand movements in real time. Hand gestures are captured with high accuracy, along with the individual finger positions, rotations and fingertip accelerations.
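
For illustration, the per-frame data the sensor delivers can be read as in the following minimal Python sketch, assuming the legacy Leap Motion SDK v2 Python bindings; the Motion Origami patch itself receives the same data inside Max/MSP through a dedicated external, not through this code:

    import Leap  # legacy Leap Motion SDK v2 Python bindings

    controller = Leap.Controller()

    def read_hand():
        """Return the tracked state of the first visible hand, or None."""
        frame = controller.frame()          # most recent tracking frame
        if frame.hands.is_empty:
            return None
        hand = frame.hands[0]
        return {
            "palm": hand.palm_position,     # Leap.Vector in millimetres
            "pitch": hand.direction.pitch,  # orientation angles in radians
            "yaw": hand.direction.yaw,
            "roll": hand.palm_normal.roll,
            "grab": hand.grab_strength,     # 0.0 = open hand, 1.0 = fist
            "fingers": [f.tip_position for f in hand.fingers],
        }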

The Leap Motion sensor was introduced to the market in mid-2013 and swiftly found its place among other devices designed for body movement and gesture tracking, such as the Kinect⁴ and the Wii controller⁵. Such devices usually perform general body tracking and can be repurposed. The Leap Motion, on the contrary, is designed to capture hand gestures and movements only. In fact, the Leap Motion sensor can be thought of as a special interactive space⁶ of a predefined volume of air. Common music applications of the Leap Motion sensor are primarily based on imitations of the physical interfaces of existing musical instruments. This is true especially for the following projects: Air-Keys, Crystal Piano, AirHarp, etc. (Han 2014). The main reason is that the virtual keyboard represents an ideal setup for testing system latency (Silva 2013), as a low-latency response is one of the most important elements of any real-time music performance application. The latency of the Leap Motion, as advertised by the manufacturer, is anywhere from 5 ms to 20 ms, but this particular figure obviously depends on the whole system configuration and the components used. Another category of existing Leap Motion applications is represented by various hybrid instruments. A specific selection of such projects is described in the paper Lessons Learned in Exploring Leap Motion™ (Han, Gold 2014).

The Motion Origami Body & Sound Interaction

The theme of physical interaction and sound processing is thoroughly investigated in the paper Sound Design as Human Matter Interaction (Wei 2013), where the most important keyword is the term material computation⁷. In an extended sense, we can think of the Leap Motion sensor as constituting an interactive space of its own, where the calculations and interaction take place. Gesture recognition in conjunction with music performance is also being explored by IRCAM's ISMM research team⁸. The physical body & sound interaction concept is present, for example, in the projects of Marco Donnarumma, who uses a set of repurposed biosensors⁹.

2 Max/MSP object sogs~ (Smooth Overlap Granular Synthesis) by Norbert Schnell, IRCAM – Centre Pompidou; more information can be accessed online at http://forumnet.ircam.fr/product/max-sound-box
3 Leap Motion; complete sensor and SDK specification can be accessed online at http://leapmotion.com
4 Microsoft Kinect is a sensor using a depth map to create a 3D space representation, developed by Microsoft for the Microsoft® Xbox 360 game console. The Kinect specification can be accessed online at www.microsoft.com/en-us/kinectforwindows
5 The Wii Remote is part of the Nintendo® Wii console developed by Nintendo® Company, Ltd. The Wii specification can be accessed online at www.nintendo.com/wii
6 An interactive space of 8 cubic feet (0.22 m³), as stated by the manufacturer; more information can be accessed online at http://leapmotion.com
7 "Real-time, continuously responsive environments for the design of technical systems for human-computer interaction design…". Ibid., p. 2010.


In the Motion Origami Max/MSP patch, the performer's physical gesture interaction is the primary source of the resulting sound transformations. The performer creates new sound objects with the captured hand gestures. The hand gestures are recognised in the interactive space above the sensor and turned into a control mechanism coded in the Max/MSP patch. A single recognised hand gesture initialises the audio buffer with the incoming audio. The buffer acts as a starting point for building the granular synthesis soundscape. The hand gestures control the granular synthesis engine parameters, along with the timing of the buffer initialisation with new audio material during the live performance. The hand gestures also control the wet/dry ratio of the audio signal input and the multichannel audio distribution via the Ambisonics engine¹⁰.
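
A minimal sketch of this control flow is given below; the engine object and its methods are hypothetical stand-ins for the messages the actual patch sends to its granular, wet/dry and spatialisation sections, and the threshold value is purely illustrative:

    GRAB_THRESHOLD = 0.9  # assumed value: how firmly the hand must close

    def control_step(hand, engine):
        """Route one frame of gesture data to the sound engine."""
        if hand is None:
            return                               # no hand above the sensor
        if hand["grab"] > GRAB_THRESHOLD:
            engine.capture_live_input()          # 'grab the sound': refill the buffer
        palm = hand["palm"]
        engine.set_granular_params(palm.x, palm.y)  # continuous granular control
        engine.set_wet_dry(palm.z)                  # depth axis -> mix ratio (assumed mapping)
        engine.set_source_position(palm.x, palm.y)  # x-y plane -> Ambisonics position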

Motion Origami Patch Implementation

The Motion Origami patch is programmed in Max/MSP¹¹. Data from the Leap Motion sensor are captured by the Swirly Leap object¹². The updated version of the patch uses the current and well-documented IRCAM leapmotion object¹³.

8 IRCAM – ISMM team {SOUND MUSIC MOVEMENT} INTERACTION; more information can be accessed online at http://ismm.ircam.fr
9 Marco Donnarumma; project presentations can be accessed online at www.marcodonnarumma.com
10 Ambisonics Tools from the Institute for Computer Music and Sound Technology (ICST), Zurich University of the Arts, can be accessed online at https://www.zhdk.ch/index.php?id=icst_ambisonicsexternals
11 Max/MSP visual programming environment by Cycling '74; more information can be accessed online at http://www.cycling74.com

Figure 2. Motion Origami – the individual patch sections explained

The Smooth Overlapping Granular Synthesis object sogs~¹⁴ was chosen because it offers simple and creative control over the audio captured in the audio buffer and can be used for specific navigation and exploration of that buffer. The sogs~ object also mimics the paper folding technique in the sense that the original paper surface is substituted with a 2D plane made of two individual parameters: the grain position and the grain size. The data from the Leap Motion sensor are mapped to drive the sogs~ object with these two parameters: the performer then navigates the space of the audio buffer defined by the grain size and grain position parameters.
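
Mapping palm coordinates onto this two-parameter plane amounts to scaling the tracked ranges into the ranges sogs~ expects. A minimal Python sketch of the idea follows; the coordinate and parameter ranges are chosen purely for illustration and are not taken from the patch:

    def scale(value, in_lo, in_hi, out_lo, out_hi):
        """Clamp value to [in_lo, in_hi] and map it linearly to [out_lo, out_hi]."""
        t = max(0.0, min(1.0, (value - in_lo) / (in_hi - in_lo)))
        return out_lo + t * (out_hi - out_lo)

    def palm_to_grain(palm):
        # Horizontal swipe (x, mm) -> grain position within the buffer (ms).
        position_ms = scale(palm.x, -150.0, 150.0, 0.0, 2000.0)
        # Vertical movement (y, mm above the sensor) -> grain duration (ms).
        size_ms = scale(palm.y, 100.0, 400.0, 10.0, 500.0)
        return position_ms, size_ms

Clamping at the edges of the tracked range keeps the granular engine stable when the hand briefly leaves the sensor's reliable tracking volume.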

The wet/dry mix ratio, which is also mapped to hand gestures, offers detailed control over the sound merging and accentuates the actual 'sound folds'. These 'sound folds' can build up a certain level of complexity thanks to the fact that the live audio source is coupled with the audio material stored in the buffer. Although the audio is modified in the granulation process, it shares the same spectral and tonal characteristics with the original sound source. This in turn creates elaborate sound transformations, which can be thought of as the introduced 'sound folding' process.

The patch recognises a specific gesture, which is required to start the initialisation of the audio buffer. In this way the audio buffer is filled with new audio material. The buffer initialisation starts with a closed-hand gesture, which can be paraphrased as a 'grab the sound' gesture. At this very moment, the buffer is filled with the live sound input and becomes available to the sogs~ object. Subsequent hand gestures control various aspects of the granular synthesis engine: a horizontal hand swipe controls the grain position selection, and a vertical hand movement controls the time length of a grain. Moreover, the overall palm position above the sensor in the x-y plane defines the sound source position in the multichannel Ambisonics space and adds a multichannel spatialisation layer to the performance. Other variables recognised by the Leap Motion, such as yaw, pitch and roll, are alternatively mapped to extended FX processing (reverb, distortion, etc.), depending on the performance scenario.
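
The spatialisation and FX mappings follow the same pattern. In the sketch below, an azimuth for an Ambisonics panner is derived from the palm's x-y position, and the orientation angles are scaled to normalised effect sends; the angle convention and the choice of which angle drives which effect are assumptions for illustration, not taken from the patch:

    import math

    def scale(value, in_lo, in_hi, out_lo, out_hi):
        t = max(0.0, min(1.0, (value - in_lo) / (in_hi - in_lo)))
        return out_lo + t * (out_hi - out_lo)

    def palm_to_azimuth(x, y):
        """Palm position in the horizontal plane -> source azimuth in degrees."""
        return math.degrees(math.atan2(x, y))  # 0 degrees straight ahead (assumed)

    def orientation_to_fx(pitch, yaw, roll):
        """Scale hand orientation (radians) to normalised FX send levels."""
        half_pi = math.pi / 2.0
        return {
            "reverb":     scale(pitch, -half_pi, half_pi, 0.0, 1.0),
            "distortion": scale(roll,  -half_pi, half_pi, 0.0, 1.0),
            "delay":      scale(yaw,   -half_pi, half_pi, 0.0, 1.0),  # hypothetical send
        }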

12 Swirly Leap, a Max/MSP object accessing the Leap Motion API, written by Tom Ritchford from New York. The project can be accessed online at https://github.com/rec/swirly-leap
13 Well-documented Max/MSP object leapmotion by IRCAM; more information can be accessed online at http://ismm.ircam.fr/leapmotion/
14 Max/MSP object sogs~ (Smooth Overlap Granular Synthesis) by Norbert Schnell, IRCAM – Centre Pompidou; more information can be accessed at http://forumnet.ircam.fr/product/max-sound-box

Figure 3. Motion Origami – the recognised gestures illustrated

Conclusions & Performance

The most inviting application of the Motion Origami patch¹⁵ is a vocal performance, or simply a music instrument which leaves enough space for interaction with the sensor itself. The beauty of this live performance approach lies in the fact that the performer can interact with his/her captured music material and add multiple layers of expression solely by using hand gestures. A new layer of improvisation can be introduced as new themes and phrases emerge. The performer then interacts with new music material, which is based on the sound qualities of the original music instrument or the vocal performance. The performer can control the following parameters in the Motion Origami live interface: the time-based selection of a phrase sampled into the buffer; the grain size and its position in the buffer; the wet/dry mix ratio; and the Ambisonics sound source position (if applicable).

Using gestures in music composition and performance proves to be very intuitive. The sensor itself has to be 'learned' to be operated properly, and this fact builds a specific virtuosity over time. The Leap Motion sensor with the Motion Origami patch opens up an exciting new field of music composition and sound processing coupled with immediate gestural interaction. The biggest challenge in gesture-based performance is the recognition of quantized gestures (Potter 2013). While parametric control of the various patch elements does not present any technical problem tracking-wise, the recognition of quantized, unique gestures proved difficult throughout the development phase of the patch. For example, while playing on a virtual keyboard, one can limit the keystrokes to a specific scale and so reduce mis-triggered notes. But when it comes to evoking a specific functionality (sampling initialisation, sound bank toggle, etc.), the gestures have to be recognised with exceptionally high precision, as those decisions form an integral part of the performance itself. This aspect of live performance imposes specific constraints that have to be considered in live performance scenarios when using the Leap Motion sensor. The quality of the tracking also depends on the ambient light conditions and the overall sensor setup. For example, direct light reaching the sensor's surface can introduce inconsistency in the tracking.
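
One common way to make such quantized gestures more robust is to require the triggering condition to hold for several consecutive frames and to release it through a lower threshold (hysteresis), so that tracking jitter around a single threshold cannot re-trigger the function. A minimal sketch of this idea, with illustrative threshold values that are not taken from the patch:

    class GestureTrigger:
        """Debounced detector for a discrete 'closed hand' gesture.

        Fires once when grab_strength stays above `on` for `hold` consecutive
        frames; re-arms only after it falls back below `off` (hysteresis).
        """

        def __init__(self, on=0.9, off=0.5, hold=5):
            self.on, self.off, self.hold = on, off, hold
            self.count = 0
            self.armed = True

        def update(self, grab_strength):
            if not self.armed:
                if grab_strength <= self.off:
                    self.armed = True       # hand opened again: re-arm
                return False
            if grab_strength >= self.on:
                self.count += 1
                if self.count >= self.hold:
                    self.armed = False      # fire once, then lock until release
                    self.count = 0
                    return True             # e.g. start the buffer initialisation
            else:
                self.count = 0              # jitter below `on` resets the hold count
            return False

In a control loop, a call such as trigger.update(hand["grab"]) would then replace the bare threshold test used in the earlier control-flow sketch.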

15 Motion Origami; the project presentation can be accessed online at http://www.danielbartos.com/motion-origami

Figure 4. Motion Origami – a detail of the subpatch controlling the audio buffer initialisation

Overall, the Leap Motion is very suitable for various intuitive music composition and performance scenarios. Occasional errors in tracking can be overcome with good patch design and a restricted use of quantized gestures, leaving the quantized gestures to be serviced by traditional hardware controllers. That said, the Leap Motion sensor excels in intuitive gesture interaction performance and gesture-based music composition strategies.

Acknowledgements. The Motion Origami project was funded by the research grant scheme "Sensors as music instruments" at HAMU (Music Academy in Prague), run by Michal Rataj. Special thanks go to Michal Rataj¹⁶ and Matthew Ostrowski¹⁷ for their support, their help with the patch design and their ideas about performance and music composition.

References

Han, Jihyun; Gold, Nicolas. 2014. "Lessons Learned in Exploring the Leap Motion™ Sensor for Gesture-based Instrument Design," in Caramiaux, B.; Tahiroğlu, K.; Fiebrink, R.; Tanaka, A. (eds.), Proceedings of the International Conference on New Interfaces for Musical Expression (NIME'14), London, Goldsmiths University of London: 371-374.

Potter, Leigh Ellen; Araullo, Jake; Carter, Lewis. 2013. "The Leap Motion controller: A view on sign language," Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration, Adelaide, ACM, New York: 175-178.

Silva, Eduardo S.; de Abreu, Jader Anderson O.; de Almeida, Janiel Henrique P.; Teichrieb, Veronica; Ramalho, Geber L. 2013. "A Preliminary Evaluation of the Leap Motion Sensor as Controller of New Digital Musical Instruments," Proceedings of the Brazilian Symposium on Computer Music, Brasilia, SBCM: 59-70.

Schwarz, Diemo. 2006. "Concatenative sound synthesis: The early years," Journal of New Music Research 35(1): 3-22 (March).

Wei, Sha Xin; Freed, Adrian; Navab, Navid. 2013. "Sound Design As Human Matter Interaction," Proceedings of CHI'13 Extended Abstracts on Human Factors in Computing Systems, Paris, ACM, New York: 2009-2018.

16 Michal Rataj – music composer and researcher at HAMU (Music Academy in Prague). More information can be accessed online at http://michalrataj.com
17 Matthew Ostrowski – expert on sensors and multimedia programming based at Harvestworks NYC, who offered valuable insights into Leap Motion programming. More information can be accessed online at http://www.ostrowski.info and http://www.harvestworks.org