1 |
The Design and Realization of a Sensitive Walking Platform. Chernyak, Vadim. 24 April 2012.
Legged locomotion provides robots with the capability of adapting to different terrain conditions. General approaches to complex terrain traversal rely solely on proprioception, which readily leads to instability in dynamic situations. Biological legged locomotion uses somatosensory feedback to sense the real-time interaction of the feet with the ground and thereby enhance stability. Nevertheless, limited attention has been given to sensing foot-terrain interaction in robotics. This project introduces a paradigm shift in robotic walking called sensitive walking, realized through the development of a compliant bipedal platform. Sensitive walking extends the success of sensitive manipulation, which uses tactile feedback to localize an object to grasp, determine an appropriate manipulation configuration, and constantly adapt to maintain grasp stability. Based on the same concepts, sensitive walking uses podotactile feedback to enhance real-time walking stability by adapting effectively to variations in the terrain. Adapting legged robotic platforms to sensitive walking is not as simple as attaching an arbitrary tactile sensor to the feet of a robot: the sensors and the limbs need specific characteristics that support the implementation of the algorithms and allow the biped to safely come in contact with the terrain and detect the interaction forces. The challenges of jointly designing and fabricating the hardware and sensors for a podotactile-based sensitive walking robot are addressed. The bipedal platform provides contact compliance through 12 series elastic actuators and contains 190 highly flexible tactile sensors capable of sensing forces at any incident angle. Sensitive walking algorithms are provided to handle legged locomotion challenges including stairs and irregular terrain.
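The platform's control code is not reproduced in the abstract, but the core podotactile idea can be illustrated with a minimal sketch: estimate the foot's center of pressure from the tactile array and command a small corrective ankle adjustment. Everything below (taxel layout, gains, the proportional law) is a hypothetical illustration, not the platform's actual controller.

```python
# Hypothetical sketch of podotactile stance adaptation: estimate the foot's
# center of pressure (CoP) from a tactile array, then command a corrective
# ankle adjustment. Sensor layout and gains are illustrative only.

def center_of_pressure(forces, positions):
    """CoP = force-weighted mean of taxel positions (x, y in meters)."""
    total = sum(forces)
    if total < 1e-6:          # foot not in contact
        return None
    x = sum(f * p[0] for f, p in zip(forces, positions)) / total
    y = sum(f * p[1] for f, p in zip(forces, positions)) / total
    return (x, y)

def ankle_correction(cop, target=(0.0, 0.0), gain=0.5):
    """Proportional correction toward the desired CoP (roll, pitch in rad)."""
    if cop is None:
        return (0.0, 0.0)
    return (gain * (target[1] - cop[1]),   # roll reacts to lateral offset
            gain * (target[0] - cop[0]))   # pitch reacts to fore-aft offset

# Example: three taxels along the foot, heavier load at the toe.
taxel_positions = [(-0.05, 0.0), (0.0, 0.0), (0.08, 0.0)]
taxel_forces = [10.0, 20.0, 45.0]                          # newtons
cop = center_of_pressure(taxel_forces, taxel_positions)
print(cop, ankle_correction(cop))
```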
|
2 |
Electrotactile Feedback System Using Psychophysical Mapping Functions. Marcus, Patrick. January 2006.
Advancements in movement restoration have accelerated in recent years, while the restoration of somatosensation has progressed relatively slowly. This dissertation attempts to partially correct this oversight by developing an electrotactile feedback system that might be used to restore the sense of touch.
Initially, the perceptual parameters of the skin regions likely to be used as a source of tactile information (the fingertip) and as a destination for electrotactile feedback (the back of the neck) were evaluated. The perceptual parameters of tactile threshold sensitivity, spatial acuity, and gain scaling were collected from subjects for both regions of skin. These same parameters were also gathered in response to electrotactile stimulation of the neck. The threshold sensitivity and spatial acuity of the fingertip were found to be far superior to those of the back of the neck, yet the mechanical perceptual gain scaling parameters of the neck were similar to those of the fingertip. However, the psychometric functions for electrical stimulation on the neck differed markedly in gain sensitivity from those for mechanical stimulation. A mapping function between the two modalities was then calculated based upon the tactile and electrotactile characterization data that were collected.
An electrotactile feedback system was then developed based upon the calculated mapping function, allowing conversion of force applied to an artificial sensor on the fingertip to a perceptually equivalent electrical stimulus on the neck. The system proved to be quite effective: subjects were able to accurately evaluate electrical stimuli derived from application of force to the sensor on the fingertip, and the perceptual gain scaling for the feedback system matched that of natural mechanical stimulation.
A grip-force matching task was evaluated in test subjects under three conditions: (a) normal tactile sensation, (b) anesthesia of the fingers, and (c) anesthesia of the fingers with tactile information restored via the electrotactile feedback system. The relative loss in grip-force matching ability when tactile feedback was abolished by local anesthetic was mild, indicating a strong ability of individuals to generate target force levels using other forms of feedback. Electrotactile feedback therefore offered only modest improvement when deployed in the anesthetized hand.
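The abstract describes the mapping function only at a high level. One standard way to construct such a cross-modal mapping, assuming gain scaling follows Stevens' power law (psi = k * I^a) in both modalities, is sketched below; the exponents and constants are placeholders, not the study's fitted values.

```python
# Hypothetical psychophysical mapping: convert fingertip force to an
# electrotactile stimulus of equal perceived magnitude, assuming both
# modalities obey Stevens' power law, psi = k * I**a. All constants are
# placeholders, not the dissertation's fitted values.

K_MECH, A_MECH = 1.0, 0.7    # mechanical (fingertip force) scaling
K_ELEC, A_ELEC = 0.3, 1.4    # electrotactile (neck stimulation) scaling

def perceived_magnitude(intensity, k, a):
    return k * intensity ** a

def force_to_current(force_n):
    """Solve k_e * I**a_e = k_m * F**a_m for the stimulus intensity I."""
    psi = perceived_magnitude(force_n, K_MECH, A_MECH)
    return (psi / K_ELEC) ** (1.0 / A_ELEC)

for f in (0.5, 1.0, 2.0, 4.0):
    print(f"{f:4.1f} N -> {force_to_current(f):5.2f} mA (illustrative units)")
```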
|
3 |
A role for sensory areas in coordinating active sensing motions. Schroeder, Joseph Bradley. 21 June 2016.
Active sensing, which incorporates closed-loop behavioral selection of information during sensory acquisition, is an important feature of many sensory modalities. We used the rodent whisker tactile system as a platform for studying the role cortical sensory areas play in coordinating active sensing motions. We examined head and whisker motions of freely moving mice performing a tactile search for a randomly located reward, and found that mice select from a diverse range of available active sensing strategies. In particular, mice selectively employed a strategy we term contact maintenance, in which whisking is modulated to counteract head motion and sustain repeated contacts, but only when doing so is likely to be useful for obtaining reward. The context-dependent selection of sensing strategies, along with the observation of whisker repositioning prior to head motion, suggests the possibility of higher-level control beyond simple reflexive mechanisms. To further investigate a possible role for primary somatosensory cortex (SI) in coordinating whisk-by-whisk motion, we delivered closed-loop optogenetic feedback to SI, time-locked to whisker motions estimated through facial electromyography. We found that stimulation regularized whisking (increasing overall periodicity) and shifted whisking frequency, changes that emulate the behavior of rodents actively contacting objects. Importantly, we observed changes to whisk timing only for stimulation locked to whisker protraction, possibly reflecting that natural contacts are more likely during forward motion of the whiskers. Simultaneous neural recordings from SI show cyclic changes in excitability: responses to stimulation locked to whisker retraction appeared suppressed, in contrast to stimulation during protraction, which changed whisk timing. Both effects are evident within single whisks. These findings support a role for sensory cortex in guiding whisk-by-whisk motor outputs, but suggest a coupling that depends on behavioral context and occurs on multiple timescales. Elucidating a role for sensory cortex in motor outputs is important to understanding active sensing, and may further provide novel insights to guide the design of sensory neuroprostheses that exploit active sensing context.
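As a rough illustration of the closed-loop setup (not the study's actual pipeline), the sketch below estimates whisker motion from a rectified EMG envelope and gates stimulation to protraction onsets only; the sampling rate, threshold, and synthetic signal are all invented for the example.

```python
# Hypothetical sketch of protraction-locked closed-loop stimulation: estimate
# whisker motion from a rectified EMG envelope and trigger optogenetic pulses
# only at rising (protraction) phases. All parameters are illustrative.

import numpy as np

def emg_envelope(emg, fs, win_ms=10):
    """Rectify and smooth with a moving average (crude envelope estimate)."""
    win = max(1, int(fs * win_ms / 1000))
    return np.convolve(np.abs(emg), np.ones(win) / win, mode="same")

def protraction_triggers(envelope, threshold):
    """Indices where the envelope crosses threshold upward; treat these as
    protraction onsets and deliver a stimulation pulse at each."""
    above = envelope > threshold
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1

fs = 1000                                      # Hz, invented sampling rate
t = np.arange(0, 2, 1 / fs)
emg = np.sin(2 * np.pi * 8 * t).clip(min=0) + 0.05 * np.random.randn(t.size)
env = emg_envelope(emg, fs)
print("stimulate at samples:", protraction_triggers(env, threshold=0.5)[:5])
```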
|
4 |
Spatial Tactile Feedback Support for Mobile Touch-screen Devices. Yatani, Koji. 12 January 2012.
Mobile touch-screen devices can accept flexible touch input and can provide a larger screen than mobile devices with physical buttons. However, many of the user interfaces found on mobile touch-screen devices require visual feedback, which raises a number of user interface challenges. For instance, visually demanding user interfaces make it difficult for the user to interact with the device without looking at the screen, a task the user sometimes wishes to perform, particularly in a mobile setting. In addition, user interfaces on mobile touch-screen devices are not generally accessible to visually impaired users. Basic tactile feedback (e.g., feedback produced by a single vibration source) can be used to enhance the user experience on mobile touch-screen devices. Unfortunately, this basic tactile feedback often lacks the expressiveness to generate vibration patterns that convey specific information about the application to the user. Richer information accessible through the tactile channel would reduce the visual demand of an application. For example, if the user could perceive which button she is touching on the screen through tactile feedback, she would not need to view the screen, and could instead focus her visual attention on the primary task (e.g., walking).
In this dissertation, I address the high visual demand of existing user interfaces on mobile touch-screen devices by using spatial tactile feedback: tactile feedback patterns generated at different points on the user's body (the user's fingers and palm in this work). I developed tactile feedback hardware employing multiple vibration motors on the back of a mobile touch-screen device. These motors can produce various spatial vibration patterns on the user's fingers and palm. I then validated the effects of spatial tactile feedback through three different applications: eyes-free interaction, a map application for visually impaired users, and collaboration support. Findings gained through this series of application-oriented investigations indicate that spatial tactile feedback is a beneficial output modality for mobile touch-screen devices and can mitigate some visual demand issues.
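As a hypothetical illustration of the idea (the motor layout and intensity law below are invented, not the dissertation's hardware mapping), a spatial pattern can be computed by driving each back-mounted motor in proportion to its proximity to the touched screen location:

```python
# Hypothetical sketch of spatial tactile feedback: map the on-screen position
# the user is touching to an activation pattern over vibration motors mounted
# on the back of the device. Motor layout and intensity law are illustrative.

# Normalized motor positions on the device's back (x, y in [0, 1]).
MOTORS = {"top_left": (0.2, 0.2), "top_right": (0.8, 0.2),
          "bottom_left": (0.2, 0.8), "bottom_right": (0.8, 0.8)}

def vibration_pattern(touch_x, touch_y, falloff=2.0):
    """Drive each motor with intensity decreasing with distance to the touch,
    so the perceived vibration 'points at' the touched control."""
    pattern = {}
    for name, (mx, my) in MOTORS.items():
        d = ((touch_x - mx) ** 2 + (touch_y - my) ** 2) ** 0.5
        pattern[name] = round(max(0.0, 1.0 - falloff * d), 2)
    return pattern

# Example: the user's finger rests on a button near the top-right corner.
print(vibration_pattern(0.75, 0.25))
```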
|
5 |
Design and Development of a Framework to Bridge the Gap Between Real and Virtual. Hossain, SK Alamgir. 01 November 2011.
Several researchers have successfully developed realistic models of real-world objects and phenomena and then simulated them in the virtual world. In this thesis, we propose the opposite: instantiating virtual-world events in the real world. An interactive 3D virtual environment provides a useful, realistic 3D world that resembles objects and phenomena of the real world, but it has limited capability to communicate with the physical environment. We argue that new and intuitive 3D user interfaces, such as 3D virtual environment interfaces, may provide an alternative form of media for communicating with the real environment. We propose a 3D virtual-world-based add-on architecture that achieves synchronized virtual-real communication. In this framework, we explored the possibilities of integrating haptic and real-world object interactions with Linden Lab's multiuser online 3D virtual world, Second Life. We enhanced the open-source Second Life viewer client to facilitate communication between the real and virtual worlds. Moreover, we analyzed the suitability of such an approach in terms of user perception, intuitiveness, and other common parameters. Our experiments suggest that the proposed approach not only demonstrates a more intuitive mode of communication, but is also appealing and useful to the user. Potential applications of the proposed approach include remote child care, communication between distant lovers, stress recovery, and home automation.
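A minimal sketch of what such a virtual-to-real bridge might look like is given below, assuming the modified viewer reports in-world events as JSON datagrams; the message format, port, and handler are invented for illustration and are not the thesis's actual protocol.

```python
# Hypothetical virtual-to-real bridge: listen for events emitted by the
# virtual world (e.g., an avatar touching an object) and forward them to a
# physical actuator. Message format, port, and handler are invented.

import json
import socket

def handle_event(event):
    """Map a virtual-world event to a real-world action (stub actuator)."""
    if event.get("type") == "touch":
        print(f"actuate haptic device for avatar {event.get('avatar')}")

recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 9123))            # invented local port

# Simulate the modified viewer reporting an in-world touch event.
send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send.sendto(json.dumps({"type": "touch", "avatar": "guest"}).encode(),
            ("127.0.0.1", 9123))

data, _ = recv.recvfrom(4096)
handle_event(json.loads(data.decode()))
```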
|
6 |
Utilizing Immediate Feedback in Piano Pedagogy. Szabo, Michael. 23 March 2016.
Piano pedagogy is the study of the teaching of piano performance. Several effective methods have been developed since the early 1700s, but they lack empirically supported techniques. Immediate feedback procedures have been shown in the literature to be effective for skill acquisition in various capacities. While some innovative techniques are being developed that utilize technologies such as video and sensor-based feedback, the true impact of these interventions has not been empirically validated. There is also a paucity of research in the behavioral literature evaluating the efficacy of immediate feedback procedures in the acquisition of music performance skills. The current study evaluated the effectiveness of an immediate tactile feedback procedure for teaching basic introductory piano to new learners, covering three unique scales, proper hand and finger positioning, rhythm, and tempo. All three participants successfully acquired the different skill sets, which supported their learning of a simplified arrangement of a preferred song.
|
7 |
Braille Hero: Feedback modalities and their effectiveness on alphabetic braille learning. Hellkvist, Marcus. January 2017.
Braille literacy is an important and vital part of the everyday lives of visually impaired and blind people. The purpose of this paper was to evaluate different feedback modalities used in a smartphone game and analyze their impact and effectiveness on alphabetic braille learning. Three modalities were tested: tactile feedback, auditory feedback, and a combination of both. A quantitative method and a post-test consisting of braille writing and reading exercises were used to measure the effectiveness of each feedback modality. Eighteen people, equally distributed across the three feedback modalities, participated in the study; each played the game blindfolded. The results show no statistically significant difference between the feedback modalities as determined by a one-way ANOVA test. However, a practical difference was found when playing the game: respondents who used the combined feedback method performed better in the game. On average, respondents learned to identify seven out of twelve braille characters and were able to read one out of five words in braille print. The study concluded that the game could be played autonomously and that the feedback modalities could be used separately or in combination without affecting post-test performance.
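For readers unfamiliar with the analysis, the sketch below re-creates a one-way ANOVA over three groups of six post-test scores using SciPy; the scores are fabricated placeholders, and only the test procedure mirrors the study.

```python
# Hypothetical re-creation of the study's analysis: one-way ANOVA over
# post-test scores from the three feedback groups (6 participants each).
# Scores below are made-up placeholders; only the test itself is standard.

from scipy import stats

tactile = [5, 6, 7, 6, 5, 7]
auditory = [6, 7, 6, 5, 7, 6]
combined = [7, 8, 7, 6, 8, 7]

f_stat, p_value = stats.f_oneway(tactile, auditory, combined)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # p >= 0.05 -> no significant difference
```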
|