This thesis focuses on the development of a robotic device to be used alongside observational learning techniques to facilitate the recovery of upper limb function in post-stroke patients. The observational learning literature has shown that observational practice of goal-directed single-arm movements can engage the mirror neuron system and the motor areas involved in learning motor actions. Robot-based therapy protocols, for their part, have proven successful in enabling participants to learn the required perception-action skill. However, robotic training has not been particularly successful at generalizing learning to other tasks, and such generalization is essential to improving performance on Activities of Daily Living (ADLs). Observational learning of motor skills, in contrast, has been shown to produce transfer across limbs and generalization across muscle groups within the same limb, as well as transfer to perceptual tasks. Our long-term hypothesis is therefore that combining interactive robotics with action observation techniques may offer greater transfer to ADLs than pure robotic training.
The results of this research broaden the theoretical understanding of observational learning and inform the future development of rehabilitation protocols that combine robotic and observational learning techniques. We hypothesize that if the application of these techniques in non-stroke individuals yields benefits for the learning of motor skills, then such a paradigm will serve as a foundation for the future development of methods to facilitate the recovery of upper limb function after stroke.
Identifier | oai:union.ndltd.org:tamu.edu/oai:repository.tamu.edu:1969.1/149455 |
Date | 03 October 2013 |
Creators | Ramos, Jorge Adrian |
Contributors | Hsieh, Sheng-Jen, Robson, Nina P., Buchanan, John J. |
Source Sets | Texas A&M University |
Language | English |
Detected Language | English |
Type | Thesis, text |
Format | application/pdf |