81 |
Network-based Haptic Systems with Time-Delays. Liacu, Bogdan Cristian, 20 November 2012 (has links) (PDF)
During the last decades, virtual environments have become very popular and are widely used in many domains, for example prototyping, training on different devices, and assistance in completing difficult tasks. The interaction with the virtual reality, as well as the feedback force, is provided by haptic interfaces. Such systems are generally affected by communication and processing time-delays, resulting in a deterioration of performance. In this thesis, a complete study of the existing methods, as well as theoretical tools and new solutions, is proposed for the haptic framework. First, a comparative study, based on experimental results obtained on a 1-dof haptic system, highlights the advantages and drawbacks of the most common control algorithms ported from teleoperation to haptics. Next, the theoretical tools needed to analyze the stability of delayed systems in different situations, as well as the physical limitations of the experimental platforms considered, are examined. Besides the standard case of constant time-delays, uncertainties are also considered and modeled by different types of distributions (uniform, normal, and gamma distribution with gap). Subsequently, to overcome the drawback of time-delays, two new approaches are proposed. First, Smith predictor-based control is addressed and a specific solution for haptic systems is developed and discussed. The main idea is to introduce the environmental forces into the Smith predictor by using additional information from the virtual reality about the distances between the controlled virtual object and other objects in the scene. To overcome the loss of performance induced by using a fixed controller gain in all situations (free or restricted motion), the second approach proposes a gain-scheduled Proportional Derivative control strategy that depends on the distance to a possible collision.
Both approaches are experimentally validated on a 3-dof haptic platform under scenarios of gradually increasing complexity, from simple situations (free and restricted motion, contact with moving objects) to more complex ones (a virtual box with fixed or moving sides).
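As a rough illustration of the gain-scheduling idea described above, the sketch below interpolates a PD stiffness between a low free-motion gain and a high contact gain based on the distance to a possible collision. All gains, thresholds, and names are hypothetical, not taken from the thesis:

```python
def scheduled_pd_force(error, d_error, distance, d_near=0.01, d_far=0.10,
                       kp_free=50.0, kp_contact=400.0, kd=2.0):
    """Gain-scheduled PD: stiffness ramps up as the virtual tool
    approaches a possible collision (all units illustrative)."""
    # Normalize distance into [0, 1]: 0 = at contact, 1 = free motion.
    alpha = min(max((distance - d_near) / (d_far - d_near), 0.0), 1.0)
    # Interpolate between the contact gain and the free-motion gain.
    kp = kp_contact + alpha * (kp_free - kp_contact)
    return kp * error + kd * d_error
```

The same distance signal that drives the Smith predictor's environment model could, under this scheme, also drive the controller gain.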
|
82 |
Design and Development of a Framework to Bridge the Gap Between Real and Virtual. Hossain, SK Alamgir, 01 November 2011 (has links)
Several researchers have successfully developed realistic models of real-world objects/phenomena and then simulated them in the virtual world. In this thesis, we propose the opposite: instantiating virtual-world events in the real world. The interactive 3D virtual environment provides a useful, realistic 3D world that resembles objects/phenomena of the real world, but it has limited capability to communicate with the physical environment. We argue that new and intuitive 3D user interfaces, such as 3D virtual environment interfaces, may provide an alternative form of media for communicating with the real environment. We propose a 3D virtual-world-based add-on architecture that achieves synchronized virtual-real communication. In this framework, we explored the possibilities of integrating haptic and real-world object interactions with Linden Lab's multiuser online 3D virtual world, Second Life. We enhanced the open-source Second Life viewer client to facilitate communication between the real and virtual worlds. Moreover, we analyzed the suitability of such an approach in terms of user perception, intuition, and other common parameters. Our experiments suggest that the proposed approach not only demonstrates a more intuitive mode of communication, but is also appealing and useful to the user. Potential applications of the proposed approach include remote child-care, communication between distant lovers, stress recovery, and home automation.
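A minimal sketch of the add-on idea, routing virtual-world events to real-world actions, might look like the table-driven dispatcher below. The event names and command strings are invented for illustration; the actual Second Life viewer hooks are far more involved:

```python
# Hypothetical mapping from virtual-world events to real-world commands.
# Neither the event names nor the command strings come from the thesis.
EVENT_TO_COMMAND = {
    "avatar_hug": "haptic_vest/squeeze",
    "lamp_toggle": "home_automation/lamp/toggle",
    "door_open": "home_automation/door/unlock",
}

def dispatch(event_name):
    """Translate a virtual-world event into a real-world command string.

    Returns None for unknown events instead of raising, so unmapped
    virtual events simply stay inside the virtual world."""
    return EVENT_TO_COMMAND.get(event_name)
```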
|
83 |
Haptic Image Exploration. Lareau, David, 12 January 2012 (has links)
The haptic exploration of 2-D images is a challenging problem in computer haptics. Research on the topic has primarily focused on the exploration of maps and curves. This thesis describes the design and implementation of a system for the haptic exploration of photographs. The system builds on various research directions related to assistive technology, computer haptics, and image segmentation. An object-level segmentation hierarchy is generated from the source photograph to be rendered haptically as a contour image at multiple levels-of-detail. A tool for authoring object-level hierarchies was developed, along with an innovative type of user interaction by region selection for accurate and efficient image segmentation. An objective benchmark comparing the new method with other interactive image segmentation algorithms shows that our region-selection interaction is a viable alternative to marker-based interaction. The hierarchy authoring tool, combined with precise algorithms for image segmentation, can build contour images of the quality necessary for the images to be understood by touch with our system. The system was evaluated in a user study of 24 sighted participants divided into different groups. The first part of the study had participants explore images using haptics and answer questions about them; the second part asked participants to identify images visually after haptic exploration. Results show that using a segmentation hierarchy supporting multiple levels-of-detail of the same image is beneficial to haptic exploration. As the system gains maturity, our goal is to make it available to blind users.
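Rendering a segmentation hierarchy at multiple levels-of-detail can be sketched as a simple tree traversal: at level 0 only the outermost object contour is drawn, and each additional level descends one step into the hierarchy. The `Region` class and the example regions are hypothetical, not the thesis's actual data structures:

```python
class Region:
    """A node in an object-level segmentation hierarchy (illustrative)."""
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

def contours_at_level(region, level):
    """Collect the region names to draw as contours at a level-of-detail:
    a region is rendered if it is a leaf or sits at the requested depth."""
    if level == 0 or not region.children:
        return [region.name]
    result = []
    for child in region.children:
        result.extend(contours_at_level(child, level - 1))
    return result
```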
|
84 |
Tactile Haptics: A Study of Roughness Perception in Virtual Environments. Samra, Roopkanwal, January 2009 (has links)
This thesis presents the design of a tactile device that can be used to display varying magnitudes of roughness. The device is designed to be attached to an existing force feedback device in order to create a package that is able to display both macro-level (force feedback) and micro-level (tactile feedback) information to the users. This device allows the users to feel a simulated texture by placing an index finger on an aperture. The stimulus is created with a spiral brush made of nylon bristles. The brush is attached to a DC motor and the speed and direction of rotation of the brush are used to generate textures at the fingertip through the aperture.
Three psychophysical experiments are conducted to study the effects of speed and direction on the roughness perception. The first experiment is designed to investigate the sensitivity to a change in the speed of the brush. This experiment is conducted for two levels of base speed and it is found that as the base speed increases, the just noticeable difference (JND) with respect to speed decreases.
In the second experiment, it is found that this tactile device is able to represent rough textures, such as sandpaper. It is also found that human roughness perception cannot be described in a unique manner: two opposite definitions of rough textures are identified in this experiment. While some users relate an increase in the speed of the brush to increasing roughness, others relate it to decreasing roughness. Further, the results show that the effect of direction on roughness perception is insignificant for both groups of users.
In the third experiment, the effects of direction are studied more closely by presenting the two directions successively with a time gap of 0.5 s. It is found that with this small time gap, the users are able to discriminate between directions, unlike in the previous experiment. The roughness perception is affected by the change in direction when the time gap is small.
These findings open further areas that need to be investigated before a robust tactile device can be designed.
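A common way to estimate a JND like the one measured in the first experiment is to find where the psychometric (proportion-correct) curve crosses a criterion such as 75%. The sketch below uses linear interpolation over invented data points; it is not the analysis procedure used in the thesis:

```python
def jnd_from_psychometric(increments, p_correct, criterion=0.75):
    """Estimate the JND as the stimulus increment at which the
    proportion-correct curve crosses the criterion, via linear
    interpolation. `increments` must be sorted ascending."""
    points = list(zip(increments, p_correct))
    for (x0, p0), (x1, p1) in zip(points, points[1:]):
        if p0 <= criterion <= p1:
            # Interpolate between the two bracketing measurements.
            return x0 + (criterion - p0) * (x1 - x0) / (p1 - p0)
    raise ValueError("criterion not crossed within measured range")
```

Comparing the estimated JND across the two base speeds is one way to express the finding that sensitivity changes with base speed.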
|
85 |
Advanced Multi-modal User Interfaces in 3D Computer Graphics and Virtual Reality. Chen, Yenan, January 2012 (has links)
Computers are continuously developed to satisfy human demands and have become typical tools for everything from daily life to all kinds of research. Virtual Reality (VR), a simulated environment that presents physical presence in the real world or in imaginary worlds, has been widely applied for such simulations. When only computers are used, people's perception is limited to vision, since computers mainly display visualizations of data, while the human senses include sight, smell, hearing, taste, touch, and so on. Other devices, such as haptic devices for the sense of touch, can be applied to enhance human perception in a virtual environment. A good way to deploy VR applications is to place them in a virtual display system, a system that uses multiple tools to display a virtual environment engaging different human senses, enhancing the user's feeling of being immersed. Such virtual display systems include the VR dome, the CAVE (a recursive acronym), the VR workbench, the VR workstation, and so on. Menus, with their many advantages for manipulating applications, are common in conventional computer systems; normally a system is not usable without them. Although VR applications are more natural and intuitive, they are much less usable, or not usable at all, without menus. Yet very few studies have focused on user interfaces in VR. This situation motivated us to work further in this area. We created two models for different purposes: one inspired by menus in conventional systems and the sense of touch, and the other designed around the spatial presence of VR. The first model is a two-dimensional pop-up pie menu with spring force feedback. It has a pie shape with eight options on the root menu, and a pop-up hierarchical submenu belongs to each root-menu option.
When the haptic device is near an option on the root menu, the spring force pulls the device toward the center of that option, the option is selected, and a submenu with nine options pops up. The pie shape together with the spring force effect is expected both to increase the speed of selection and to decrease the selection error rate. The other model is a semiautomatic three-dimensional cube menu, designed to provide a simple, elegant, efficient, and accurate user interface. It has four usable faces (the front, back, left, and right faces of the cube); each face represents a category and holds nine widgets, so users can make selections across different categories. An efficient way to change categories is to rotate the cube automatically: a navigable rotation animation system rotates the cube horizontally by ninety degrees at a time, so one face always faces the user. Both models are built on H3DAPI, an open-source haptics software development platform, together with its UI toolkit. After the implementation, we ran a pilot study, a formative study, to evaluate the feasibility of both menus. The study included a list of tasks for each menu, a questionnaire on menu performance for each subject, and a discussion with each subject; six students participated as test subjects. In the pie menu, most subjects felt that the spring force guided them to the target option and that they could control the haptic device comfortably under this force. In the cube menu, the rotation system worked well and the cube rotated accurately and efficiently. The results of the pilot study show that the models work as we initially expected.
The recorded task completion times show that, with the same number of tasks and similar difficulty, subjects spent more time on the cube menu than on the pie menu. This may indicate that the pie menu is the faster approach, and we further consider that both the pie shape and the force feedback may help reduce selection time. The option-selection error rate on the cube menu may indicate that option selection without any force feedback can also achieve considerably good results. According to the questionnaire answers, both menus are comfortable to use and easy to control.
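The spring effect that pulls the haptic proxy toward an option's center can be sketched with Hooke's law in 2-D. The stiffness and capture radius below are invented values, and H3DAPI's actual force-effect API is not shown:

```python
import math

def snap_force(pos, option_center, k=120.0, capture_radius=0.03):
    """Spring force pulling the haptic proxy toward the center of a
    pie-menu option once it is inside the option's capture radius.

    Positions are 2-D (x, y) tuples in meters; k is N/m (illustrative)."""
    dx = option_center[0] - pos[0]
    dy = option_center[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist > capture_radius or dist == 0.0:
        return (0.0, 0.0)        # outside the option, or already centered
    return (k * dx, k * dy)      # Hooke's law: F = k * displacement
```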
|
87 |
Exploring the Human Interactivity with a Robot to Obtain the Fundamental Properties of Materials. Christian, William L., 14 October 2010 (has links)
This research studies the way in which humans and robots interact with each other. When two humans are working together through a set of robotic devices, do they tend to work together or fight with each other more? In which Cartesian direction do they have the most difficulty? Does fighting drastically affect the performance of the team? Finally, what measures can be taken to promote better cooperation between humans and robots to ultimately allow humans to work just as comfortably with a robotic partner as with a human partner? This research answers these questions and provides an analysis of human-robot interaction.
It was found that significant fighting between the subjects does have a negative impact on the performance of the team. Out of the three Cartesian directions, the up-down direction was found to be the most difficult to cooperate in. Although the level of fighting varied greatly among different dyads, two things which greatly assisted in completing the experiments were force feedback and visual feedback. Different methods of feedback were tested, and subject performance in each was compared.
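One simple way to quantify "fighting" between two partners is the fraction of samples in which their applied forces oppose each other, i.e. have a negative dot product. This sketch illustrates that idea only; it is not necessarily the metric used in the study:

```python
def fighting_index(forces_a, forces_b):
    """Fraction of paired force samples in which the two partners push
    in opposing directions (negative dot product).

    Each force is a 3-D (x, y, z) tuple; the two lists are time-aligned."""
    opposing = 0
    for fa, fb in zip(forces_a, forces_b):
        dot = sum(a * b for a, b in zip(fa, fb))
        if dot < 0:
            opposing += 1
    return opposing / len(forces_a)
```

Computing this per Cartesian axis would make it possible to check, for instance, whether the up-down direction really is the hardest to cooperate in.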
|
88 |
The Effect of Training on Haptic Classification of Facial Expressions of Emotion in 2D Displays by Sighted and Blind Observers. Abramowicz, Aneta, 23 October 2009 (has links)
The current study evaluated the effects of training on the haptic classification of culturally universal facial expressions of emotion as depicted in simple 2D raised-line drawings. Blindfolded sighted (N = 60) and blind (N = 4) participants took part in Experiments 1 and 3, respectively. A small vision control study (N = 12) was also conducted (Experiment 2) to compare haptic and visual learning patterns. A hybrid learning paradigm consisting of pre/post- and old/new-training procedures was used to address the nature of the underlying learning process in terms of token-specific learning and/or generalization. During the Pre-Training phase, participants were tested on their ability to classify facial expressions of emotion using the set with which they would subsequently be trained. During the Post-Training phase, they were tested with the training set (Old) intermixed with a completely novel set (New). For sighted observers, visual classification was more accurate than haptic classification; in addition, two of the three adventitiously blind individuals tended to be at least as accurate as the sighted haptic group. All three groups showed similar learning patterns across the learning stages of the experiment: accuracy improved substantially with training; however, while classification accuracy for the Old set remained high during the Post-Training test stage, learning effects for novel (New) drawings were reduced, if present at all. These results imply that learning by the sighted was largely token-specific for both haptic and visual classification. Additional results from a limited number of blind subjects tentatively suggest that the accuracy with which facial expressions of emotion are classified is not impaired when visual loss occurs later in life. Thesis (Master, Neuroscience Studies), Queen's University, 2009.
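The pre/post, old/new design supports a simple decomposition of training gains into a generalized component (improvement that transfers to New drawings) and a token-specific component (extra improvement confined to the trained Old drawings). This sketch and its numbers are purely illustrative:

```python
def learning_effects(pre_acc, post_old_acc, post_new_acc):
    """Split the overall training gain into two parts:

    - generalized: accuracy gain that transfers to novel (New) items
    - token_specific: extra gain confined to the trained (Old) items

    All arguments are classification accuracies in [0, 1]."""
    return {
        "generalized": post_new_acc - pre_acc,
        "token_specific": post_old_acc - post_new_acc,
    }
```

A large token-specific term with a near-zero generalized term would match the pattern the study reports for sighted observers.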
|
89 |
Design of a Haptic Simulator for Pedicle Screw Insertion in Pediatric Scoliosis Surgery. Leung, Regina, 04 December 2013 (has links)
The following work presents the design of a haptic training simulator for pedicle screw insertions in pediatric scoliosis surgery. In particular, the simulator reproduces the haptic sensations associated with probe channeling through the pedicle using the free-hand technique. The design includes a 1-DOF custom haptic device, a haptic model, and a controller. The design is tested and evaluated for feasibility through a small pilot study involving 5 expert surgeons. Significant agreement across the expert surgeons was obtained regarding the feasibility and the potential for the simulator to be a useful training tool.
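A haptic model for probe channeling might be approximated by a piecewise force-depth profile in which each bone layer contributes its own stiffness and resistance vanishes at a simulated cortical breach. The layer thicknesses and stiffnesses below are invented, not the thesis's calibrated values:

```python
def channeling_force(depth_mm, layers=((5.0, 0.8), (25.0, 0.3)),
                     breach_depth=30.0):
    """Piecewise force-depth model for probe channeling (illustrative).

    Each layer is a (thickness_mm, stiffness_N_per_mm) pair, traversed
    in order; past `breach_depth` the resistance drops to zero,
    mimicking a sudden cortical breach."""
    if depth_mm >= breach_depth:
        return 0.0                       # breach: sudden loss of resistance
    force, start = 0.0, 0.0
    for thickness, k in layers:
        end = start + thickness
        if depth_mm > start:
            # Add this layer's contribution over the depth traversed in it.
            force += k * (min(depth_mm, end) - start)
        start = end
    return force
```

A 1-DOF device would render this force along the probe axis as a function of measured insertion depth.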
|