
Human Action Recognition from Gradient Boundary Histograms

This thesis presents a framework for the automatic recognition of human actions in uncontrolled, realistic video data captured with fixed cameras, such as surveillance videos. We divide human action recognition into three steps: description, representation, and classification of local spatio-temporal features. The bag-of-features model is used to build the classifier, and Fisher Vectors are also studied. We focus on the potential of these methods under the joint optimization of two constraints: classification precision and computational efficiency.
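To make the three-step pipeline concrete, the Python sketch below illustrates a generic bag-of-features classifier with scikit-learn: local descriptors are quantized against a k-means codebook, each video becomes a normalized histogram of visual words, and a linear SVM performs classification. The codebook size, helper names, and the variables in the usage comment are illustrative assumptions, not the thesis's actual configuration.

    # Generic bag-of-features pipeline sketch (not the thesis's exact code).
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import LinearSVC

    def build_codebook(train_descriptors, k=4000):
        """Cluster a sample of local descriptors into k visual words."""
        return KMeans(n_clusters=k, n_init=4, random_state=0).fit(train_descriptors)

    def encode_video(descriptors, codebook):
        """Hard-assign each descriptor to its nearest word and return an
        L1-normalized histogram (the bag-of-features representation)."""
        words = codebook.predict(descriptors)
        hist = np.bincount(words, minlength=codebook.n_clusters).astype(np.float64)
        return hist / max(hist.sum(), 1.0)

    # Hypothetical usage: train_sets is a list of per-video descriptor matrices,
    # train_labels the corresponding action classes.
    # codebook = build_codebook(np.vstack(train_sets))
    # X_train = np.array([encode_video(d, codebook) for d in train_sets])
    # clf = LinearSVC().fit(X_train, train_labels)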
On the performance side, a new local descriptor, called Gradient Boundary Histograms (GBH), is introduced. It is built on simple spatio-temporal gradients, which can be computed quickly. We demonstrate that GBH represents local structure and motion better than other gradient-based descriptors and significantly outperforms them on large datasets. Our evaluation shows that, compared to HOG descriptors, which are based solely on spatial gradients, the GBH descriptor preserves recognition precision even in difficult situations.
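As a rough illustration of the idea behind GBH, assuming only what the abstract states (simple spatio-temporal gradients, quantized HOG-style into orientation histograms), one might take the temporal change of spatial gradient images between consecutive frames and histogram the resulting orientations. The function below is such a sketch; the bin count and normalization are illustrative choices, not the thesis's parameters.

    # Minimal gradient-boundary style descriptor sketch (illustrative only).
    import numpy as np

    def gradient_boundary_histogram(frame_t, frame_t1, n_bins=8):
        """Orientation histogram of the temporal change of spatial gradients
        between two consecutive grayscale frames."""
        # Spatial gradients of each frame (central differences).
        gy_t, gx_t = np.gradient(frame_t.astype(np.float32))
        gy_t1, gx_t1 = np.gradient(frame_t1.astype(np.float32))

        # Temporal derivative of the gradient images ("gradient boundaries").
        dgx = gx_t1 - gx_t
        dgy = gy_t1 - gy_t

        # Quantize orientations, weighted by magnitude, into n_bins bins.
        mag = np.hypot(dgx, dgy)
        ang = np.arctan2(dgy, dgx) % (2 * np.pi)
        bins = np.minimum((ang / (2 * np.pi) * n_bins).astype(int), n_bins - 1)
        hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
        return hist / max(hist.sum(), 1e-12)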
Since surveillance video captured with fixed cameras is the emphasis of our study, removing the background before action recognition helps improve efficiency. We first preprocess the video data by applying HOG to detect humans. The GBH descriptor is then computed at reduced spatial resolutions, which yields both high efficiency and low memory usage; in addition, we apply PCA to reduce the feature dimensions, which results in fast matching and an accelerated classification process.
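A hedged sketch of this preprocessing chain, using standard tools rather than the thesis's own code: OpenCV's built-in HOG person detector restricts attention to people, frame downscaling provides the reduced spatial resolution, and scikit-learn's PCA compresses descriptor dimensions. The scale factor and component count are assumed values.

    # Preprocessing chain sketch: person detection, downscaling, PCA.
    import cv2
    import numpy as np
    from sklearn.decomposition import PCA

    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    def person_regions(frame, scale=0.5):
        """Detect people in a downscaled frame and return the frame plus
        their bounding boxes in the reduced coordinate system."""
        small = cv2.resize(frame, None, fx=scale, fy=scale)
        rects, _ = hog.detectMultiScale(small, winStride=(8, 8))
        return small, rects

    def compress_descriptors(descriptors, n_components=64):
        """Project local descriptors onto their leading principal components,
        shrinking the feature dimension before codebook assignment."""
        pca = PCA(n_components=n_components).fit(descriptors)
        return pca, pca.transform(descriptors)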
Experiments show that our methods achieve good recognition precision while remaining both effective and efficient.

Identifier: oai:union.ndltd.org:uottawa.ca/oai:ruor.uottawa.ca:10393/35931
Date: January 2017
Creators: Wang, Xuelu
Contributors: Laganière, Robert
Publisher: Université d'Ottawa / University of Ottawa
Source Sets: Université d'Ottawa
Language: English
Detected Language: English
Type: Thesis
