1. Investigation Of Tactile Displays For Robot To Human Communication
Barber, Daniel, 01 January 2012
Improvements in autonomous systems technology and growing demand within military operations are spurring a revolution in Human-Robot Interaction (HRI). Mixed-initiative human-robot teams are enabled by Multi-Modal Communication (MMC), which supports redundancy and levels of communication more robust than single-mode interaction (Bischoff & Graefe, 2002; Partan & Marler, 1999). Tactile communication via vibrotactile displays is an emerging technology that is potentially beneficial to advancing HRI. Incorporating tactile displays within MMC requires developing messages equivalent in communicative power to the speech and visual signals used in the military. Toward that end, two experiments investigated the feasibility of a tactile language that uses a lexicon of standardized tactons (tactile icons) within a sentence structure to communicate messages from robot to human. Experiment one evaluated tactons from the literature with standardized parameters, grouped into categories (directional, dynamic, and static) based on the nature and meaning of the patterns, to inform the design of a tactile syntax. Directional tactons outperformed non-directional tactons, so a syntax for experiment two composed of one non-directional and one directional tacton was most likely to yield performance better than chance. Experiment two tested this syntax structure using equally performing tactons identified in experiment one, revealing participants' ability to interpret tactile sentences better than chance, with or without an independent concurrent task. This finding advanced the state of the art in tactile displays from one- to two-word phrases, facilitating inclusion of the tactile modality within MMC for HRI.
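The two-tacton syntax described above can be sketched in code: a message pairs a non-directional tacton (the command) with a directional tacton (the heading). The tacton names, pulse parameters, and belt-tactor indices below are invented for illustration, not the study's actual lexicon.

```python
# Hypothetical two-word tactile sentence builder. Each non-directional tacton
# is a list of (frequency Hz, duration s) pulses; each directional tacton is a
# list of tactor indices on a belt. All values here are illustrative.
NON_DIRECTIONAL = {
    "halt": [(250, 1.0)],
    "move_out": [(250, 0.3), (250, 0.3)],
}
DIRECTIONAL = {"north": [0], "east": [2], "south": [4], "west": [6]}

def compose_sentence(command: str, direction: str) -> dict:
    """Build a two-word tactile sentence: command tacton, then direction tacton."""
    if command not in NON_DIRECTIONAL or direction not in DIRECTIONAL:
        raise ValueError("unknown tacton")
    return {"command": NON_DIRECTIONAL[command], "direction": DIRECTIONAL[direction]}

msg = compose_sentence("halt", "north")
```

Keeping the lexicon as data separate from the sentence rule mirrors the experiments' separation of tacton design (experiment one) from syntax testing (experiment two).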
2. Cross-modal Effects In Tactile And Visual Signaling
Merlo, James, 01 January 2008
Using a wearable tactile display, three experiments were conducted in which tactile messages were created to emulate five standard US Army and Marine Corps arm-and-hand signals for the military commands "Attention", "Halt", "Rally", "Move Out", and "Nuclear, Biological, or Chemical event (NBC)". Response times and accuracy rates were collected from novices responding to visual and tactile representations of these messages, displayed either alone or together in congruent or incongruent combinations. Results indicated synergistic effects for concurrent, congruent message presentations, with superior response times compared to presentations in either modality alone; this effect was mediated by participant strategy. Accuracy similarly improved when the tactile and visual presentations were displayed concurrently rather than separately. In a low-workload condition, participants could largely attend to a particular modality with little interference from competing signals. When participants were not instructed which modality to attend to, they chose whichever modality arrived first. Lastly, initial learning and subsequent training of intuitive tactile signals occurred rapidly, with large performance gains over short training periods. These results confirm the promise of tactile messages to augment visual messaging in challenging and stressful environments, particularly when visual messaging may be preferred but is not always feasible.
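The congruent/incongruent/single-modality design described above amounts to crossing the visual and tactile signal sets. A minimal sketch of how such a trial list might be generated (this is illustrative of the design, not the thesis's actual protocol):

```python
import itertools
import random

SIGNALS = ["attention", "halt", "rally", "move_out", "nbc"]

def build_trials(seed: int = 0) -> list:
    """Cross visual x tactile signals: matching pairs are congruent, mismatched
    pairs incongruent, and None in one slot gives a single-modality trial."""
    trials = []
    for v, t in itertools.product(SIGNALS + [None], repeat=2):
        if v is None and t is None:
            continue  # an empty trial carries no signal
        kind = ("single" if v is None or t is None
                else "congruent" if v == t else "incongruent")
        trials.append({"visual": v, "tactile": t, "kind": kind})
    random.Random(seed).shuffle(trials)  # seeded shuffle for reproducible order
    return trials
```

With five signals this yields 5 congruent, 20 incongruent, and 10 single-modality trials per block.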
3. Informations vibrotactiles pour l'aide à la navigation et la gestion des contacts avec l'environnement / Vibrotactile information for approach regulation and making contacts
Mandil, Cynthia, 26 October 2017
The purpose of this doctoral research was to study the transmission of vibrotactile information as a navigation aid, in particular to improve the regulation of approach phases and the management of contacts with the environment. A major challenge in this research area is understanding how to convey information, sometimes complex, through a sensory modality not naturally used to process it. This work therefore aimed to demonstrate that tactile information can substitute for vision and to specify the characteristics of vibrotactile stimulation that influence access to approach information. The studies supporting this thesis were carried out with an experimental setup coupling a virtual environment with a tactile display consisting of several vibrators placed on the surface of the skin. The first two experimental chapters were based on time-to-contact (TTC) estimation tasks, a paradigm classically used to study the visual processes involved in regulating approach situations. The first experimental chapter (experiments 1, 2 and 3) was a preliminary study, which notably showed that TTC judgments were more precise when the tactile display conveyed information about the approach distance (compared with information about angular size). The results of the second experimental chapter (experiments 4 and 5) showed that the tactile modality allowed TTC to be estimated, though less precisely than the visual modality; nevertheless, when vision was occluded, conveying tactile information during the occlusion period improved judgment precision. The final experimental chapter (experiments 6 and 7) examined the influence of vibrotactile information on the regulation of a ground approach in a simulated helicopter landing. Both experiments showed that tactile information significantly reduced ground-contact velocity when the visual environment was degraded, and that this reduction depended on the informational variable conveyed by the display. Finally, the results of this research are discussed with respect to fundamental theories of perception and action. Overall, they show how approach information can be perceived through the tactile modality and thus substitute for vision when it is degraded.
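The two quantities at the heart of these experiments can be sketched simply: first-order time-to-contact is remaining distance over closing speed, and a distance-coded tactile display maps approach distance to vibration intensity. The `d_max_m` normalization range below is an assumed value for illustration, not a parameter from the thesis.

```python
def time_to_contact(distance_m: float, closing_speed_mps: float) -> float:
    """First-order time-to-contact: remaining distance over closing speed."""
    if closing_speed_mps <= 0:
        return float("inf")  # not closing: contact never occurs
    return distance_m / closing_speed_mps

def distance_to_intensity(distance_m: float, d_max_m: float = 50.0) -> float:
    """Map approach distance to a normalized vibration intensity in [0, 1],
    nearer = stronger (an assumed linear coding, for illustration)."""
    return min(1.0, max(0.0, 1.0 - distance_m / d_max_m))
```

For example, an obstacle 10 m away closing at 2 m/s gives a TTC of 5 s, while `distance_to_intensity(25.0)` drives the vibrators at half strength.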
4. Contribution to the Design and Implementation of Portable Tactile Displays for the Visually Impaired
Velazquez-Guerrero, Ramiro, 06 1900
This thesis explores the design, implementation, and performance of a new concept for a low-cost, high-resolution, lightweight, compact, and highly portable tactile display. The device is intended for use in a novel visuo-tactile sensory substitution/supplementation electronic travel aid (ETA) for the blind and visually impaired. Based on the psychophysiology of touch and using Shape Memory Alloys (SMAs) as the actuation technology, a mechatronic device was designed and prototyped to stimulate the sense of touch by creating sensations of contact on the fingertips. The prototype consists of an array of 64 elements spaced 2.6 mm apart that vertically actuates SMA-based miniature actuators of 1.5 mm diameter over a height range of 1.4 mm, with a pull force of 300 mN at up to a 1.5 Hz bandwidth. The full display weighs 200 g, and its compact dimensions (a cube of 8 cm side length) make it easy for the user to carry. The display can present a wide range of binary tactile information on its 8 x 8 matrix. Moreover, both the mechanical and electronic drive designs scale easily to larger devices while remaining price-attractive. Human psychophysics experiments demonstrate the effectiveness of the tactile information transmitted by the display to sighted people and show the feasibility in principle of the system as an assistive technology for the blind and visually impaired.
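Driving a binary 8 x 8 pin matrix like the one above typically means reducing a higher-resolution bitmap to one raised/lowered state per pin. A minimal sketch of that reduction by block majority vote (the thesis does not specify this algorithm; it is one plausible choice):

```python
def to_tactile_frame(bitmap, rows: int = 8, cols: int = 8):
    """Downsample a binary bitmap (list of equal-length 0/1 rows) to a
    rows x cols pin frame: each pin is raised if the majority of its
    source block is set."""
    h, w = len(bitmap), len(bitmap[0])
    bh, bw = h // rows, w // cols  # source block covered by one pin
    frame = []
    for r in range(rows):
        row = []
        for c in range(cols):
            block = [bitmap[r * bh + i][c * bw + j]
                     for i in range(bh) for j in range(bw)]
            row.append(1 if 2 * sum(block) >= len(block) else 0)
        frame.append(row)
    return frame
```

A 16 x 16 glyph whose left half is set, for instance, reduces to eight rows of `[1, 1, 1, 1, 0, 0, 0, 0]`.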
5. Towards Improving Teletaction in Teleoperation Tasks Using Vision-Based Tactile Sensors
Oscar Jia Jun Yu (18391263), 01 May 2024
Teletaction, the transmission of tactile feedback or touch, is a crucial aspect of teleoperation. High-quality teletaction feedback allows users to remotely manipulate objects and improves the human-machine interface between operator and robot, making complex manipulation tasks possible. Advances in teletaction for teleoperation, however, have yet to make full use of the high-resolution 3D data provided by modern vision-based tactile sensors. Existing teletaction solutions fall short in one or more areas of form or function, such as fidelity or hardware footprint. In this thesis, we present our research into a low-cost teletaction device for teleoperation that can utilize real-time, high-resolution tactile information from vision-based tactile sensors, through both physical 3D surface reconstruction and shear displacement. We present our device, the Feelit, which combines a pin-based shape display with compliant mechanisms to accomplish this task. The pin-based shape display uses an array of 24 servomotors with miniature Bowden cables, giving the device a resolution of 6x4 pins in a 15x10 mm display footprint. Each pin can actuate up to 3 mm in 200 ms while providing 80 N of force and 3 um of depth resolution. Shear displacement and rotation are achieved using a compliant mechanism design, allowing a minimum of 1 mm of lateral displacement and 10 degrees of rotation. Real-time 3D tactile reconstruction is achieved with a vision-based tactile sensor, the GelSight, along with an algorithm that samples the depth data and tracks markers to generate actuator commands. With our device we perform a series of experiments, including shape recognition and relative weight identification, showing that the device has the potential to expand teletaction capabilities in the teleoperation space.
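Turning sampled sensor depths into pin commands within the stated 3 mm travel and 3 um resolution can be sketched as a clamp-and-quantize step. The function below is an illustrative reconstruction from the numbers in the abstract, not the thesis's actual command pipeline.

```python
MAX_TRAVEL_MM = 3.0   # per-pin actuation range reported above
DEPTH_RES_MM = 0.003  # 3 um depth resolution reported above

def depth_to_pin_commands(depth_grid_mm):
    """Clamp a grid of sampled depths (mm) to the pin travel range and
    quantize each value to the display's depth resolution."""
    return [
        [round(min(max(d, 0.0), MAX_TRAVEL_MM) / DEPTH_RES_MM) * DEPTH_RES_MM
         for d in row]
        for row in depth_grid_mm
    ]
```

Depths beyond 3 mm saturate at full pin extension, and negative (non-contact) samples retract the pin fully.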
6. The Pursuit of Effective Artificial Tactile Speech Communication: Improvements and Cognitive Characteristics of a Phonemic-based Approach
Juan S Martinez (6622304), 26 April 2023
Tactile speech communication allows individuals to understand speech through sensations transmitted via the sense of touch. Devices that enable tactile speech communication can be an effective means of transmitting important messages when the visual and/or auditory systems are overloaded or impaired, with applications in silent communication and for people with hearing and/or visual impairments. An effective artificial speech communication system must be learnable in a reasonable time, easily remembered, and able to transmit any word at rates suitable for speech communication. The pursuit of a system that fulfills these requirements is a complex task requiring work in several areas, and this thesis presents advancements in four of them. First is the encoding of speech information: here, a phonemic-based approach allowed participants to recognize tactile phonemes, words, phrases, and full sentences. Second is training users in the system: to this end, the thesis investigated the phenomenon of incidental categorization of vibrotactile stimuli as the foundation of more natural methods for learning a tactile speech communication system. Third is the neural processing of tactile speech information: here, the functional characteristics of the phonemic-based approach were explored using EEG. Finally, there is implementing the system for consumer use: this work addresses practical considerations of delivering rich haptic effects with current wearable technologies, which inform the design of actuators for tactile speech communication devices.
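The phonemic encoding idea above reduces, at its simplest, to a codebook mapping each phoneme to a distinct tactile code played in sequence. The body sites and carrier frequencies below are invented for this sketch; the thesis's actual codebook is far richer.

```python
# Illustrative phoneme-to-tacton codebook: each phoneme gets a distinct
# (body site, carrier frequency in Hz) code. All entries are assumptions
# made for the sketch, not the thesis's real assignments.
PHONEME_CODES = {
    "K": ("thumb", 300),
    "AE": ("wrist", 60),
    "T": ("index", 300),
}

def encode_word(phonemes):
    """Translate a phoneme sequence into an ordered sequence of tactile codes."""
    unknown = [p for p in phonemes if p not in PHONEME_CODES]
    if unknown:
        raise KeyError(f"no tactile code for: {unknown}")
    return [PHONEME_CODES[p] for p in phonemes]
```

Because any word decomposes into phonemes, a complete codebook of this shape can transmit an open vocabulary, which is the property that distinguishes the phonemic approach from fixed-message tacton sets.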