
Robust Upper Body Pose Recognition in Unconstrained Environments Using Haar-Disparity

In this research, an approach is proposed for the robust tracking of upper
body movement in unconstrained environments by using a Haar-
Disparity algorithm together with a novel 2D silhouette projection
algorithm. A cascade of boosted Haar classifiers is used to identify
human faces in video images, and a disparity map is then used to
establish the 3D locations of the detected faces. Based on this information,
anthropometric constraints are used to define a semi-spherical interaction
space for upper body poses. This constrained region serves the purpose of
pruning the search space as well as validating user poses. Haar-Disparity
improves on traditional skin-manifold tracking by relaxing constraints
on clothing, background, and illumination. The 2D silhouette projection
algorithm provides three orthogonal views of the 3D objects, allowing the
upper limbs to be tracked in 2D space rather than by manipulating noisy
3D data directly. This thesis also proposes a complete
optimal set of interactions for very large interactive displays.
Experimental evaluation includes the performance of alternative camera
positions and orientations, accuracy of pointing, direct manipulative
gestures, flag semaphore emulation, and principal axes. As a minor part
of this research, the usability of interaction using only arm gestures is
also evaluated against the ISO 9241-9 standard. The results
suggest that the proposed algorithm and optimal set of interactions are
useful for interacting with large displays.
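
For illustration only, below is a minimal sketch of the Haar-Disparity idea summarised in the abstract, written against OpenCV's Python bindings. It is not the thesis's implementation: the image file names, focal length, baseline, principal point, and block-matching parameters are all illustrative assumptions, and a rectified stereo pair is assumed.

import cv2
import numpy as np

# Assumed calibration (illustrative values, not from the thesis).
FOCAL_LENGTH_PX = 700.0   # focal length in pixels
BASELINE_M = 0.12         # stereo baseline in metres

# Hypothetical rectified stereo pair.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# 1. Detect faces with a cascade of boosted Haar classifiers
#    (OpenCV's bundled frontal-face cascade).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
faces = cascade.detectMultiScale(left, scaleFactor=1.1, minNeighbors=5)

# 2. Compute a dense disparity map by block matching.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# 3. Turn each face's median disparity into a depth estimate, giving a
#    3D anchor around which an interaction space could be constrained.
for (x, y, w, h) in faces:
    patch = disparity[y:y + h, x:x + w]
    valid = patch[patch > 0]
    if valid.size == 0:
        continue
    d = float(np.median(valid))
    z = FOCAL_LENGTH_PX * BASELINE_M / d      # depth in metres
    # Pinhole back-projection; the principal point is assumed to be
    # the image centre.
    cx = (x + w / 2 - left.shape[1] / 2) * z / FOCAL_LENGTH_PX
    cy = (y + h / 2 - left.shape[0] / 2) * z / FOCAL_LENGTH_PX
    print(f"face centre approx. ({cx:.2f}, {cy:.2f}, {z:.2f}) m")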

Identifier: oai:union.ndltd.org:canterbury.ac.nz/oai:ir.canterbury.ac.nz:10092/2165
Date: January 2008
Creators: Chu, Cheng-Tse
Publisher: University of Canterbury. Computer Science and Software Engineering
Source Sets: University of Canterbury
Language: English
Detected Language: English
Type: Electronic thesis or dissertation, Text
Rights: Copyright Cheng-Tse Chu, http://library.canterbury.ac.nz/thesis/etheses_copyright.shtml
Relation: NZCU
