
Dynamic Descriptors in Human Gait Recognition

Feature extraction is the most critical step in any human gait recognition system. Although gait is a dynamic process, static body parameters also play an important role in characterizing human gait. A few studies have assessed the comparative relevance of static and dynamic gait features. There is, however, a lack of work comparing the performance of dynamic gait features drawn from different parts of the silhouette in an appearance-based setup. This dissertation presents a comparative study of dynamic features extracted from the legs, arms and shoulders for gait recognition.
Our study partially supports the general notion that leg motion is the most important determining factor in gait recognition. However, we also observe that features extracted from the upper arm and shoulder areas become more significant on some databases. The usefulness of the study hinges on the fact that the lower parts of the legs are generally noisier owing to variations such as walking surface, occlusion and shadows. Dynamic features extracted from the upper part of the silhouette possess significantly higher discriminatory power in such situations; in other situations these features can play a complementary role in the gait recognition process.
We also propose two new feature extraction methods for gait recognition. The new methods use silhouette area signals, which are simple and inexpensive to extract. A significant performance increase is achieved with the new features over the benchmark method, and the recognition results compare well with other current techniques. The simplicity and compactness of the proposed gait features are their major advantage, as they entail low computational overhead.
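
The abstract does not spell out how the silhouette area signals are computed. As a rough, hypothetical sketch, an area signal can be read as the count of foreground pixels in each binary silhouette frame, optionally computed per horizontal band of the body (shoulders, arms, thighs, lower legs); the function names and band split below are illustrative assumptions, not the thesis's actual formulation.

```python
import numpy as np

def silhouette_area_signal(silhouettes):
    """Whole-body area signal: foreground pixel count per frame.

    silhouettes: iterable of 2-D binary arrays (one per frame),
    where nonzero pixels are the segmented body.
    Returns a 1-D array of length n_frames.
    """
    return np.array([np.count_nonzero(frame) for frame in silhouettes])

def band_area_signals(silhouettes, n_bands=4):
    """Per-band area signals: split each silhouette into horizontal
    bands and count foreground pixels in each band separately.

    Returns an array of shape (n_bands, n_frames), one signal per band.
    """
    per_frame = []
    for frame in silhouettes:
        bands = np.array_split(frame, n_bands, axis=0)
        per_frame.append([np.count_nonzero(band) for band in bands])
    return np.array(per_frame).T
```

Under this reading, the resulting one-dimensional signals vary periodically with the gait cycle and can be fed to any standard matcher; their low dimensionality is what keeps the computational overhead small.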

Identifier: oai:union.ndltd.org:TORONTO/oai:tspace.library.utoronto.ca:1807/35762
Date: 02 August 2013
Creators: Amin, Tahir
Contributors: Hatzinakos, Dimitrios
Source Sets: University of Toronto
Language: en_ca
Detected Language: English
Type: Thesis
