This thesis aims to apply neural networks to the classification of human patterns of movement and to compare the accuracy of this technique with existing methods (conventional statistics and clinical assessment). Three different examples of human movement and one of posture were chosen for study, and a variety of biomechanical parameters were used to describe them. The temporal parameters of gait patterns were recorded for walking at different speeds and for walking with a splinted knee or a weighted leg. The angular displacement of both hips and knees was measured during stepping up or down steps of five different heights. Different standing postures were studied by measuring the disposition of body landmarks associated with imagined moods of human subjects. Finally, changes in the sit-stand-sit manoeuvre due to chronic low back pain, expressed as joint movement and forces exerted on the ground, were recorded. Patterns were classified by neural networks, by linear discriminant analysis and, in the case of sit-stand patterns, by qualified clinicians. By varying the number of variables used to discriminate between patterns, the relative merits of these classifiers were identified. Neural networks classified the measured patterns with an accuracy at least as high as that of linear discriminant analysis. A neural network is a useful tool for the discrimination of patterns of human movement; its main advantage is the ability to deal with a large number of predictor variables. A successfully trained and tested neural network can easily be set up on a computer and, on the evidence presented, could be used to help clinicians diagnose or assess pathological patterns of movement.
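The comparison described above can be illustrated with a minimal sketch. This is not the thesis's code: it assumes scikit-learn, and the class counts, feature counts, and synthetic data are hypothetical stand-ins for the biomechanical parameters the thesis actually measured.

```python
# Hypothetical sketch (not the thesis code): comparing a small neural
# network with linear discriminant analysis on synthetic "gait-like"
# features. Assumes scikit-learn; all sizes are illustrative only.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for temporal gait parameters: 3 movement classes,
# 8 predictor variables (e.g. stance time, stride time, cadence, ...).
X, y = make_classification(n_samples=300, n_features=8, n_informative=6,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Linear discriminant analysis: the conventional statistical baseline.
lda = LinearDiscriminantAnalysis().fit(X_train, y_train)

# A small feed-forward neural network (multilayer perceptron).
nn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                   random_state=0).fit(X_train, y_train)

print(f"LDA accuracy: {lda.score(X_test, y_test):.2f}")
print(f"NN  accuracy: {nn.score(X_test, y_test):.2f}")
```

Increasing `n_features` in a sketch like this mirrors the thesis's point about predictor variables: LDA's assumptions become harder to satisfy as the variable count grows, while the network simply absorbs the extra inputs.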
Identifier | oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:307479 |
Date | January 1994 |
Creators | Gioftsos, George |
Publisher | University College London (University of London) |
Source Sets | Ethos UK |
Detected Language | English |
Type | Electronic Thesis or Dissertation |
Source | http://discovery.ucl.ac.uk/1317946/ |