This thesis studies interaction with music synthesis systems through hand gestures. Traditionally, users of such systems were limited to input devices such as buttons, pedals, faders, and joysticks. Gestures allow the user to interact with the system in a more intuitive way. Freed from the constraints of physical input devices, the user can simultaneously control more elements of the musical composition, increasing the system's responsiveness to the musician's creative thoughts. A working system embodying this concept is implemented, employing computer vision and machine intelligence techniques to recognise the user's gestures. / Dissertation (MSc)--University of Pretoria, 2006. / Computer Science / MSc / unrestricted
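The record above only summarises the approach; the dissertation's actual method is not reproduced here. As a purely illustrative sketch of the concept (vision-based hand tracking driving synthesis parameters), the following Python/OpenCV snippet segments a hand by colour and maps its centroid to two hypothetical control values. The HSV skin range, the webcam index, and the pitch/volume mapping are assumptions introduced for illustration, not the author's implementation.

# Illustrative sketch only: track a skin-toned hand region with OpenCV and map
# its centroid to two synthesis-style parameters (pitch and volume).
# The HSV bounds and the parameter mapping below are assumptions, not the
# dissertation's actual gesture-recognition method.
import cv2
import numpy as np

LOW_SKIN = np.array([0, 40, 60], dtype=np.uint8)      # assumed HSV lower bound
HIGH_SKIN = np.array([25, 255, 255], dtype=np.uint8)  # assumed HSV upper bound

cap = cv2.VideoCapture(0)  # default webcam (assumed to exist)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOW_SKIN, HIGH_SKIN)
    # [-2] picks the contour list in both OpenCV 3.x and 4.x return conventions
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if contours:
        hand = max(contours, key=cv2.contourArea)  # largest blob taken as the hand
        m = cv2.moments(hand)
        if m["m00"] > 0:
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            h, w = frame.shape[:2]
            pitch = 48 + int(36 * cx / w)     # horizontal position -> MIDI-style pitch
            volume = int(127 * (1 - cy / h))  # vertical position -> volume
            print(f"pitch={pitch} volume={volume}")
    cv2.imshow("hand mask", mask)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()

In a full system the printed values would instead be sent to a synthesiser (for example as MIDI messages), and the simple colour segmentation would be replaced by a trained gesture classifier.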
Identifier | oai:union.ndltd.org:netd.ac.za/oai:union.ndltd.org:up/oai:repository.up.ac.za:2263/29240 |
Date | 05 November 2007 |
Creators | Pun, James Chi-Him |
Contributors | Engelbrecht, Andries P., respho@gmail.com, Van den Bergh, Frans |
Publisher | University of Pretoria |
Source Sets | South African National ETD Portal |
Detected Language | English |
Type | Dissertation |
Rights | © University of Pretoria |