Free-space gesture mappings for music and sound

This thesis describes a set of software applications for real-time gesturally controlled interactions with music and sound. The applications for each system are varied but related, addressing unsolved problems in the field of audio and music technology. The three systems presented in this work capture 3D human motion with spatial sensors and map position data from the sensors onto sonic parameters. Two different spatial sensors are used interchangeably to perform motion capture: the radiodrum and the Xbox Kinect. The first two systems are aimed at creating immersive virtually-augmented environments. The first application uses human gesture to move sounds spatially in 3D surround sound by physically modelling the movement of sound in a space. The second application is a gesturally controlled self-organized music browser in which songs are clustered based on auditory similarity. The third application is specifically aimed at extending musical performance through the development of a digitally augmented vibraphone. Each of these applications is presented with related work, theoretical and technical details for implementation, and discussions of future work.

Identifier: oai:union.ndltd.org:uvic.ca/oai:dspace.library.uvic.ca:1828/4288
Date: 21 September 2012
Creators: Odowichuk, Gabrielle
Contributors: Driessen, Peter F., Tzanetakis, George
Source Sets: University of Victoria
Language: English
Detected Language: English
Type: Thesis
Rights: Available to the World Wide Web