
Learning and identification using intelligent shoes. / CUHK electronic theses & dissertations collection

In this thesis, we build intelligent shoes under a framework for capturing and analyzing dynamic human gait. Existing MEMS technology makes it possible to integrate all of the sensors and circuits into a small module. In designing our intelligent shoe system, we require the following key characteristics: (1) it should be convenient to wear and socially acceptable, so the sensors and electronic hardware installed should not substantially change the weight or weight balance of a typical shoe, lest they alter how an individual normally walks; (2) it should support real-time analysis of a user's motion through a wireless interface to a remote laptop or other computer, and it should also incorporate on-shoe data-logging hardware for off-line analysis; (3) sensors that monitor gait motion may need to be attached to the insoles, in close proximity to the wearer's foot. To investigate the problem of capturing power parasitically from normal human body motion for use in personal electronics, we also develop an electromechanical generator embedded within the shoe that harvests power from heel strike.

Next, we encode specific motions to control external devices through a wireless interface. The same system architecture that allows us to classify broad categories of motion also allows the intelligent shoe to act as a programmable, low-data-rate control interface. We apply the system to several tasks built on this platform, most notably the Shoe-Mouse, an interface that lets a user operate a device with the feet.

Then, we present the use of machine learning techniques, in particular the support vector machine (SVM), together with the intelligent shoe platform to detect discrete stages in the cyclic motion of dynamic human gait, and we construct an identifier of five discrete events in the gait cycle for precise control of functional electrical stimulation (FES). Knowing when the legs are in each phase of the gait allows the timing of specific gait phases to be assessed.

Finally, we introduce research on classifying and identifying individuals by their walking patterns. Live biometric features of dynamic human gait are captured by the intelligent shoe system. Since gait data are dynamic, non-linear, stochastic, time-varying, noisy, and multi-channel, we must select a modeling framework capable of dealing with these complexities. Using the proposed machine learning methods, the support vector machine (SVM) and hidden Markov models (HMMs), we build probabilistic models that take the information in human walking patterns into account and compare the overall similarity among the walking patterns of several wearers.

Huang, Bufu.
"September 2007."
Adviser: Yangsheng Xu.
Source: Dissertation Abstracts International, Volume: 69-08, Section: B, page: 4931.
Thesis (Ph.D.)--Chinese University of Hong Kong, 2007.
Includes bibliographical references (p. 122-131).
Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012]. System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Electronic reproduction. [Ann Arbor, MI] : ProQuest Information and Learning, [200-]. System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Abstracts in English and Chinese.
School code: 1307.
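As a rough illustration of the Shoe-Mouse idea described in the abstract, the Python sketch below maps recognised foot motions to cursor commands over a hypothetical low-data-rate link. The motion labels and the motion-to-command mapping are assumptions made for illustration, not the thesis's actual gesture set or wireless protocol.

```python
# Illustrative sketch only: a low-data-rate control interface in the spirit of
# the Shoe-Mouse, mapping recognised foot motions to cursor commands.
# The motion labels and command mapping are assumptions, not the thesis's design.
from dataclasses import dataclass


@dataclass
class CursorCommand:
    dx: int = 0        # horizontal cursor movement, in pixels
    dy: int = 0        # vertical cursor movement, in pixels
    click: bool = False


# Hypothetical mapping from classifier output to cursor actions.
MOTION_TO_COMMAND = {
    "tilt_forward":  CursorCommand(dy=-5),
    "tilt_backward": CursorCommand(dy=+5),
    "tilt_left":     CursorCommand(dx=-5),
    "tilt_right":    CursorCommand(dx=+5),
    "toe_tap":       CursorCommand(click=True),
}


def dispatch(motion_label: str) -> CursorCommand:
    """Translate one recognised foot motion into a cursor command;
    unrecognised motions produce no movement."""
    return MOTION_TO_COMMAND.get(motion_label, CursorCommand())


if __name__ == "__main__":
    for label in ["tilt_forward", "toe_tap", "unknown"]:
        print(label, dispatch(label))
```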
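The gait-phase detection for FES timing could, in principle, be prototyped along the following lines with an off-the-shelf SVM. The window features, the five phase labels, and the hyperparameters are assumptions for the sketch, not the classifier reported in the thesis.

```python
# Illustrative sketch only: classify five gait phases from in-shoe sensor
# windows with a support vector machine, in the spirit of the FES-timing
# identifier the abstract describes. Features and labels are assumed.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Assumed phase labels; the thesis's five discrete gait events may differ.
GAIT_PHASES = ["heel-strike", "foot-flat", "mid-stance", "heel-off", "toe-off"]


def extract_features(window: np.ndarray) -> np.ndarray:
    """Reduce one window of multi-channel insole/IMU samples (T x C)
    to a simple per-channel mean / std / range feature vector."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.max(axis=0) - window.min(axis=0)])


def train_phase_classifier(windows, labels):
    """windows: list of (T x C) arrays; labels: phase index per window."""
    X = np.stack([extract_features(w) for w in windows])
    clf = make_pipeline(StandardScaler(),
                        SVC(kernel="rbf", C=10.0, gamma="scale"))
    clf.fit(X, labels)
    return clf

# Usage, given recorded and labelled gait windows:
#   clf = train_phase_classifier(train_windows, train_labels)
#   phase = GAIT_PHASES[clf.predict([extract_features(new_window)])[0]]
```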
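For the wearer-identification part, one common HMM formulation trains a model per wearer and identifies a new gait sequence by the model with the highest likelihood. The sketch below, using the hmmlearn library, only illustrates that framework; feature dimensions, state counts, and the library choice are assumptions rather than the thesis's implementation.

```python
# Illustrative sketch only: wearer identification by per-wearer hidden Markov
# models over gait feature sequences. Model sizes and features are assumed.
import numpy as np
from hmmlearn.hmm import GaussianHMM


def train_wearer_models(sequences_by_wearer, n_states=5):
    """sequences_by_wearer: {wearer_id: list of (T x D) feature sequences}."""
    models = {}
    for wearer, seqs in sequences_by_wearer.items():
        X = np.concatenate(seqs)          # stack sequences along time axis
        lengths = [len(s) for s in seqs]  # per-sequence lengths for hmmlearn
        m = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        m.fit(X, lengths)
        models[wearer] = m
    return models


def identify(models, sequence):
    """Return the wearer whose HMM assigns the new (T x D) sequence
    the highest log-likelihood."""
    return max(models, key=lambda w: models[w].score(sequence))
```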

Identifier: oai:union.ndltd.org:cuhk.edu.hk/oai:cuhk-dr:cuhk_344117
Date: January 2007
Contributors: Huang, Bufu.; Chinese University of Hong Kong Graduate School. Division of Automation and Computer-Aided Engineering.
Source Sets: The Chinese University of Hong Kong
Language: English, Chinese
Detected Language: English
Type: Text, theses
Format: electronic resource, microform, microfiche, 1 online resource (x, 131 p. : ill.)
Rights: Use of this resource is governed by the terms and conditions of the Creative Commons “Attribution-NonCommercial-NoDerivatives 4.0 International” License (http://creativecommons.org/licenses/by-nc-nd/4.0/)
