This thesis presents the design of a novel haptic interface for large-format touchscreens. Techniques such as electrovibration, ultrasonic vibration, and external braked devices have been developed by other researchers to deliver haptic feedback to touchscreen users. However, these methods do not address the need for spatial constraints that restrict user motion only in the direction of the constraint. This technology gap contributes to the lack of haptic technology available for touchscreen-based upper-limb rehabilitation, despite the prevalent use of haptics in other forms of robotic rehabilitation. The goal of this thesis is to display kinesthetic haptic constraints to the touchscreen user in the form of boundaries and paths, which assist or challenge the user in interacting with the touchscreen. The presented prototype accomplishes this by steering a single wheel in contact with the display while remaining driven by the user. It employs a novel embedded force sensor to measure the interaction force between the user and the touchscreen, and this force data is used to characterize user intent and control the haptic response of the device. The prototype can operate in a simulated free mode as well as simulate rigid and compliant obstacles and path constraints. A data architecture has been created to allow the prototype to be used as a peripheral add-on device which reacts to haptic environments created and modified on the touchscreen. The long-term goal of this work is to create a haptic system that enables a touchscreen-based rehabilitation platform for people with upper-limb impairments.
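The abstract's mention of simulating rigid and compliant obstacles corresponds to a standard idea in kinesthetic haptics: rendering a virtual wall as a penalty force proportional to penetration depth, where the stiffness gain distinguishes "rigid" from "compliant" behavior. As a minimal sketch (the thesis's actual controller is not reproduced here; `wall_pos`, `k_rigid`, and `k_soft` are illustrative values, not parameters from the work):

```python
def wall_force(position: float, wall_pos: float = 0.0, k: float = 500.0) -> float:
    """Penalty-based virtual wall along one axis.

    Returns a restoring force (N) that opposes penetration past wall_pos;
    zero force on the free side of the wall. Units: meters in, newtons out.
    """
    penetration = position - wall_pos
    if penetration <= 0.0:
        return 0.0          # user is in free space: no constraint force
    return -k * penetration  # push back proportionally to penetration depth

# A stiffer gain approximates a rigid obstacle; a softer gain, a compliant one.
k_rigid, k_soft = 5000.0, 200.0
f_rigid = wall_force(0.002, k=k_rigid)  # 2 mm penetration of a "rigid" wall
f_soft = wall_force(0.002, k=k_soft)    # same penetration, compliant wall
```

In a wheel-based device like the one described, such a force target would be realized indirectly, by steering the wheel to resist motion across the constraint, rather than by a motor pushing back directly.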
Identifier | oai:union.ndltd.org:UMASS/oai:scholarworks.umass.edu:masters_theses_2-1388 |
Date | 13 July 2016 |
Creators | Price, Mark |
Publisher | ScholarWorks@UMass Amherst |
Source Sets | University of Massachusetts, Amherst |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | Masters Theses |