
Human-Robot Interaction Strategies for Walker-Assisted Locomotion

Neurological and age-related diseases affect human mobility at different levels, causing partial or total loss of this faculty. There is a significant need to improve safe and efficient ambulation of patients with gait impairments. In this context, walkers provide important benefits for human mobility, improving balance and reducing the load on the lower limbs. Most importantly, walkers encourage the use of patients' residual mobility capacities in different environments. In the field of robotic technologies for gait assistance,
a new category of walkers has emerged, integrating robotic technology, electronics and mechanics. Such devices are known as robotic walkers, intelligent walkers or smart walkers. One specific and important aspect common to the fields of assistive technologies and rehabilitation robotics is the intrinsic interaction between the human and the robot.
In this thesis, the concept of Human-Robot Interaction (HRI) for human locomotion assistance is explored. This interaction is composed of two interdependent components.
On the one hand, the key role of the robot in Physical HRI (pHRI) is the generation of supplementary forces to empower human locomotion, which involves a net flux of power between both actors. On the other hand, one of the crucial roles of Cognitive HRI (cHRI) is to make the human aware of the robot's capabilities while allowing the user to maintain control of the robot at all times.
This doctoral thesis presents a new multimodal human-robot interface for testing and validating control strategies applied to a robotic walker for assisting human mobility and gait rehabilitation. This interface extracts navigation intentions from a novel sensor fusion method that combines: (i) a Laser Range Finder (LRF) sensor to estimate the kinematics of the user's legs, (ii) wearable Inertial Measurement Unit (IMU) sensors to capture the human and robot orientations, and (iii) force sensors to measure the physical interaction
between the human's upper limbs and the robotic walker.
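As a rough illustration of how such a multimodal interface might combine its inputs, the sketch below fuses LRF-based leg positions, human and walker IMU yaw angles, and handlebar forces into a simple (linear, angular) intention estimate. The function name, the fusion rule and all gains are illustrative assumptions, not the estimator developed in the thesis.

```python
import numpy as np

def fuse_navigation_intention(leg_positions_lrf, imu_yaw_human, imu_yaw_walker,
                              force_left, force_right):
    """Combine LRF leg kinematics, IMU orientations and handlebar forces
    into a (linear, angular) navigation-intention estimate.

    All inputs, gains and the fusion rule are illustrative assumptions.
    """
    # Mean forward force on both handles suggests a desired linear velocity.
    f_forward = 0.5 * (force_left[0] + force_right[0])

    # Force asymmetry and the human-walker orientation mismatch both
    # suggest a turning intention.
    f_turn = force_right[0] - force_left[0]
    yaw_error = imu_yaw_human - imu_yaw_walker

    # Distance between the legs and the walker frame (from the LRF)
    # modulates speed so the walker does not run away from the user.
    leg_distance = np.mean([np.linalg.norm(p) for p in leg_positions_lrf])

    k_v, k_w, k_yaw, d_ref = 0.02, 0.05, 0.8, 0.6   # illustrative gains
    v_intent = k_v * f_forward * np.clip(leg_distance / d_ref, 0.0, 1.5)
    w_intent = k_w * f_turn + k_yaw * yaw_error
    return v_intent, w_intent
```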
Two closed control loops were developed to naturally adapt the walker position and to perform body-weight support strategies. First, a force interaction controller generates velocity commands for the walker based on the upper-limb physical interaction. Second, an inverse kinematics controller keeps the walker at a desired position relative to the human, improving this interaction.
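The sketch below illustrates the general shape of these two loops under simple assumptions: an admittance-style force interaction controller that converts handlebar force and torque into velocity commands, and a relative-position loop that corrects the walker's distance and heading with respect to the user. The class and function names, the first-order admittance model and all gains are assumptions for illustration, not the controllers reported in the thesis.

```python
import numpy as np

class ForceInteractionController:
    """Admittance-style loop: map handlebar force/torque to walker velocities.
    The virtual mass/damping model and gains are illustrative assumptions."""

    def __init__(self, m_v=10.0, b_v=25.0, m_w=2.0, b_w=6.0):
        self.m_v, self.b_v = m_v, b_v   # virtual mass/damping (linear)
        self.m_w, self.b_w = m_w, b_w   # virtual inertia/damping (angular)
        self.v, self.w = 0.0, 0.0

    def step(self, f_forward, torque, dt):
        # m * dv/dt + b * v = f, discretized with explicit Euler
        self.v += dt * (f_forward - self.b_v * self.v) / self.m_v
        self.w += dt * (torque - self.b_w * self.w) / self.m_w
        return self.v, self.w


def relative_position_controller(d, theta, d_ref=0.6, k_d=1.0, k_theta=2.0):
    """Inverse-kinematics-style loop: keep the walker at a desired distance
    d_ref and aligned heading relative to the user, given the distance d and
    bearing theta from LRF leg tracking. Gains are assumed for illustration."""
    v_corr = k_d * (d - d_ref) * np.cos(theta)   # close the distance error
    w_corr = k_theta * theta                     # align with the user
    return v_corr, w_corr
```

A simple way to use both loops would be to sum their velocity outputs before sending them to the walker's drive motors; the actual combination scheme is not detailed in this abstract.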
The proposed control strategies are suitable for natural human-robot interaction, as shown during the experimental validation. Moreover, methods for sensor fusion to estimate the control inputs were presented and validated. In the experimental studies, the parameter estimation was precise and unbiased, and remained repeatable when speed changes and continuous turns were performed.

Identifier: oai:union.ndltd.org:IBICT/oai:dspace2.ufes.br:10/9725
Date: 25 June 2015
Creators: GARCIA, C. A. C.
Contributors: BASTOS FILHO, T. F., SIQUEIRA, A. A. G., ANDREAO, R. V., FERREIRA, A., SALLES, E. O. T., Anselmo Frizera Neto
Publisher: Universidade Federal do Espírito Santo, Doutorado em Engenharia Elétrica, Programa de Pós-Graduação em Engenharia Elétrica, UFES, BR
Source Sets: IBICT Brazilian ETDs
Detected Language: English
Type: info:eu-repo/semantics/publishedVersion, info:eu-repo/semantics/doctoralThesis
Format: application/pdf
Source: reponame:Repositório Institucional da UFES, instname:Universidade Federal do Espírito Santo, instacron:UFES
Rights: info:eu-repo/semantics/openAccess
