
A Machine Vision System for Capture and Interpretation of an Orchestra Conductor's Gestures

This work involves the design and implementation of a real-time Machine Vision-based Human Computer Interface (HCI) that analyzes and interprets a music conductor's gestures to detect the musical "beat". This HCI system interfaces directly with the "Virtual Orchestra", an electronic, MIDI-sequenced "orchestra". Prior to the development of this HCI system, the tempo of the "Virtual Orchestra" could be controlled in real time only by tapping a tempo on a MIDI controller device, a process that is foreign to most music conductors. The real-time beat information detected by this HCI system allows a conductor to conduct the "Virtual Orchestra" as if it were a live orchestra. The system was developed using the Broadway real-time color image capture board manufactured by Data Translation, Incorporated. The implementation involved the use of Microsoft Visual C++, Microsoft Foundation Classes (MFC) for the Graphical User Interface (GUI), Video For Windows (VFW), MIDI note generation, and Intel assembly-level code optimization. Algorithms were developed for rapid RGB color thresholding, multiple contour extraction, fast contour-based area and center of mass calculations, and gesture interpretation. Real-time, live-video interpretation has been achieved and an end-to-end system has been demonstrated in conjunction with a MIDI sequencer.
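The abstract does not reproduce the algorithms themselves, but the minimal C++ sketch below illustrates two of the building blocks it names: a fixed-box RGB threshold test and a contour-based area and center-of-mass computation (shown here with the standard shoelace/Green's theorem formula, which visits only boundary points rather than every interior pixel). The threshold values, data structures, and the choice of the shoelace formula are assumptions for illustration only, not the thesis's actual implementation.

    #include <vector>
    #include <cstdio>

    // Hypothetical pixel and contour-point types; the thesis's actual data
    // structures are not given in the abstract.
    struct Pixel { unsigned char r, g, b; };
    struct Point { double x, y; };

    // RGB color thresholding: keep only pixels whose channels fall inside a
    // fixed box in RGB space (e.g. a brightly colored baton tip or glove).
    // The numeric bounds here are illustrative values only.
    bool insideColorBox(const Pixel& p)
    {
        return p.r > 200 && p.g < 80 && p.b < 80;
    }

    // Contour-based area and center of mass: the shoelace formula computes
    // both from the boundary polygon alone, which keeps the per-frame cost
    // proportional to the contour length rather than the region size.
    void contourAreaAndCentroid(const std::vector<Point>& c,
                                double& area, double& cx, double& cy)
    {
        double a = 0.0, sx = 0.0, sy = 0.0;
        const size_t n = c.size();
        for (size_t i = 0; i < n; ++i) {
            const Point& p = c[i];
            const Point& q = c[(i + 1) % n];
            const double cross = p.x * q.y - q.x * p.y;
            a  += cross;
            sx += (p.x + q.x) * cross;
            sy += (p.y + q.y) * cross;
        }
        area = 0.5 * a;
        cx = sx / (6.0 * area);
        cy = sy / (6.0 * area);
    }

    int main()
    {
        // Trivial checks: a saturated red pixel passes the color box, and a
        // unit-square contour yields area 1 with centroid (0.5, 0.5).
        Pixel red = { 230, 40, 30 };
        std::printf("red pixel in box: %s\n", insideColorBox(red) ? "yes" : "no");

        std::vector<Point> square = { {0, 0}, {1, 0}, {1, 1}, {0, 1} };
        double area, cx, cy;
        contourAreaAndCentroid(square, area, cx, cy);
        std::printf("area=%.2f centroid=(%.2f, %.2f)\n", area, cx, cy);
        return 0;
    }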

Identifier: oai:union.ndltd.org:wpi.edu/oai:digitalcommons.wpi.edu:etd-theses-1807
Date: 11 May 1999
Creators: Driscoll, Michael T.
Contributors: Richard Campbell, Committee Member; Frederick Bianchi, Committee Member; David Cyganski, Advisor
Publisher: Digital WPI
Source Sets: Worcester Polytechnic Institute
Detected Language: English
Type: text
Format: application/pdf
Source: Masters Theses (All Theses, All Years)
