141.
Time course of information processing in visual and haptic object classification. Martinovic, Jasna; Lawson, Rebecca; Craddock, Matt. 28 July 2022.
Vision identifies objects rapidly and efficiently. In contrast, object recognition by touch is much slower. Furthermore, haptics usually serially accumulates information from different parts of objects, whereas vision typically processes object information in parallel. Is haptic object identification slower simply due to sequential information acquisition and the resulting memory load or due to more fundamental processing differences between the senses? To compare the time course of visual and haptic object recognition, we slowed visual processing using a novel, restricted viewing technique. In an electroencephalographic (EEG) experiment, participants discriminated familiar, nameable from unfamiliar, unnamable objects both visually and haptically. Analyses focused on the evoked and total fronto-central theta-band (5–7 Hz; a marker of working memory) and the occipital upper alpha-band (10–12 Hz; a marker of perceptual processing) locked to the onset of classification. Decreases in total upper alpha-band activity for haptic identification of objects indicate a likely processing role of multisensory extrastriate areas. Long-latency modulations of alpha-band activity differentiated between familiar and unfamiliar objects in haptics but not in vision. In contrast, theta-band activity showed a general increase over time for the slowed-down visual recognition task only. We conclude that haptic object recognition relies on common representations with vision but also that there are fundamental differences between the senses that do not merely arise from differences in their speed of processing.
142.
Multi-Rate Control Architectures for Network-Based Multi-User Haptics Interaction. Ghiam, Mahyar Fotoohi. 12 1900.
Cooperative haptics enables multiple users to manipulate computer-simulated objects in a shared virtual environment and to feel the presence of other users. Prior research in the literature has mainly addressed single-user haptic interaction. This thesis is concerned with haptic simulation in multi-user virtual environments in which the users interact in a shared virtual world from separate workstations over Ethernet-based Local Area Networks (LANs) or Metropolitan Area Networks (MANs). In practice, the achievable real-time communication rate using a typical implementation of network protocols such as UDP and TCP/IP can be well below the 1 kHz update rate suggested in the literature for high-fidelity haptic rendering. However, by adopting the multi-rate control strategy proposed in this work, the local control loops can be executed at 1 kHz while data packet transmission between the user workstations occurs at a lower rate. Within this framework, two control architectures, centralized and distributed, are presented. In the centralized architecture a central workstation simulates the virtual environment, whereas in the distributed architecture each user workstation simulates its own copy of the virtual environment. Two different approaches to mathematical modeling of the controllers have been proposed and used in a comparative analysis of their stability and performance. The results of this analysis demonstrate that the distributed control architecture has greater stability margins and outperforms the centralized controller. They also reveal that the limited network transmission rate can degrade haptic fidelity by introducing viscous damping into the perceived impedance of the virtual object. This extra damping is compensated by active control based on the damping values obtained from the analytical results. Experimental results from a dual-user/dual-finger haptic platform are presented for each of the proposed controllers under various scenarios in which the user workstations communicate over UDP subject to a limited transmission rate. The results demonstrate the effectiveness of the proposed distributed architecture in providing stable and transparent haptic simulation in free motion and in contact with rigid environments.

Thesis / Master of Applied Science (MASc)
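The multi-rate idea, running the local coupling at 1 kHz while network packets refresh the remote state at a lower rate, can be sketched with a toy one-degree-of-freedom simulation. This is an illustrative stand-in, not the thesis's controller; the rates, gains, and zero-order-hold packet model are all assumptions.

```python
import math

def run_multirate(t_end=1.0, local_hz=1000, net_hz=100,
                  k=400.0, b=25.0, m=1.0):
    """Toy 1-DOF multi-rate haptic loop: a virtual spring-damper couples the
    local device to the remote user's position, which is only refreshed when
    a network packet arrives (zero-order hold between packets)."""
    dt = 1.0 / local_hz
    hold = local_hz // net_hz          # local control ticks per network packet
    x, v = 0.0, 0.0                    # local device position and velocity
    ref = 0.0                          # last received remote position
    traj = []
    for i in range(int(t_end * local_hz)):
        if i % hold == 0:              # packet arrival at the network rate
            ref = 0.1 * math.sin(2 * math.pi * i * dt)  # remote motion, 1 Hz
        f = k * (ref - x) - b * v      # coupling force, rendered at 1 kHz
        v += f / m * dt                # explicit Euler integration
        x += v * dt
        traj.append(x)
    return traj
```

Even at a 10:1 rate ratio, the fast local loop keeps the coupling stable; the held remote signal mainly appears as extra effective damping, in the spirit of the fidelity degradation the thesis's analysis predicts.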
143.
GENTLE/A : adaptive robotic assistance for upper-limb rehabilitation. Gudipati, Radhika. January 2014.
Advanced devices that can assist therapists to deliver rehabilitation are in high demand given growing rehabilitation needs. The primary requirement of such rehabilitative devices is to reduce therapist monitoring time. If the training device can autonomously adapt to the performance of the user, it can make rehabilitation partly self-manageable. Therefore, the main goal of our research is to investigate how to make a rehabilitation system more adaptable. The strategy we followed to augment the adaptability of the GENTLE/A robotic system was to (i) identify the parameters that indicate the contributions of the user and the robot during a human-robot interaction session and (ii) use these parameters as performance indicators to adapt the system. Three main studies were conducted with healthy participants during the course of this PhD. The first study identified that the difference between the position coordinates recorded by the robot and the reference trajectory position coordinates indicated the leading/lagging status of the user with respect to the robot. Using this lead-lag model we proposed two strategies to enhance the adaptability of the system. The first adaptability strategy tuned the performance time to suit the user's requirements (second study). The second adaptability strategy tuned the task difficulty level based on the user's leading or lagging status (third study). In summary, the research undertaken during this PhD successfully enhanced the adaptability of the GENTLE/A system. The adaptability strategies evaluated were designed to suit various stages of recovery. Apart from potential use for remote assessment of patients, the work presented in this thesis is applicable in many areas of human-robot interaction research where a robot and a human are involved in physical interaction.
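The lead-lag idea, reading the user's leading or lagging status from the gap between recorded and reference positions and adapting task difficulty accordingly, can be sketched as follows. This is an illustrative reconstruction, not the GENTLE/A code; the projection onto the reference direction of travel, the dead band, and the five-level difficulty scale are assumptions.

```python
import math

def mean_lead(user, ref):
    """Signed lead of the user: the position error projected onto the
    reference direction of travel, averaged over the trial.
    Positive -> user ahead of the robot (leading); negative -> lagging.
    `user` and `ref` are lists of (x, y) samples at matching times."""
    total, count = 0.0, 0
    for i in range(1, len(ref)):
        dx, dy = ref[i][0] - ref[i - 1][0], ref[i][1] - ref[i - 1][1]
        norm = math.hypot(dx, dy)
        if norm == 0.0:
            continue                    # reference not moving: no direction
        ex, ey = user[i][0] - ref[i][0], user[i][1] - ref[i][1]
        total += (ex * dx + ey * dy) / norm
        count += 1
    return total / count if count else 0.0

def adapt_difficulty(level, lead, dead_band=0.01):
    """Raise the difficulty for a leading user, lower it for a lagging one."""
    if lead > dead_band:
        return min(level + 1, 5)
    if lead < -dead_band:
        return max(level - 1, 1)
    return level
```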
144.
Experiencing artists' books : haptics and intimate discovery in the work of Estelle Liebenberg-Barkhuizen and Cheryl Penn. Haskins, Phillipa. January 2013.
This dissertation centres on the classification of artists’ books based on the qualities they possess as works of art as well as the intimate engagement required by the reader in order to experience such works in their entirety.
Among the qualities investigated are intimacy through the use of novelty devices, haptics, text, narrative and concrete systems, space, and shape. These qualities are exemplified through works by Estelle Liebenberg-Barkhuizen and Cheryl Penn.

Thesis (M.A.), University of KwaZulu-Natal, Pietermaritzburg, 2013.
145.
Experiencing the moment : Enhancing surrounding awareness when walking. Kerzic, Borut. January 2019.
Today's technology lets people interact with devices at all times, but it also makes them gaze down and lose touch with their surroundings while doing so. This project explored alternatives to the current way of providing guidance while walking by designing in the context rather than designing for the context. This led to several iterations of prototypes that were tested with people in the context. The findings are showcased in the form of a multimodal guidance system called UP, which provides reassurance at every step of the way without requiring the user to look down at the screen.
146.
Force Sensing and Teleoperation of Continuum Robot for MRI-Guided Surgery. Su, Hao. 24 April 2013.
Percutaneous needle placement, a minimally invasive procedure performed tens of millions of times in the U.S. each year, demands dedicated skill and long-term training due to the difficulty of controlling the needle trajectory inside tissue and of mentally registering images to locations inside the patient. Inaccurate needle placement may miss cancerous tumors during diagnosis or damage healthy tissue during therapy. MRI provides ideal procedure guidance, with the merits of excellent soft-tissue contrast and volumetric imaging for high-spatial-resolution visualization of targets and surgical tools. However, manual insertion in the bore of an MRI scanner has awkward ergonomics due to difficult access to the patient, making both training and intervention even harder.

To overcome the challenges of MRI electromagnetic compatibility and the mechanical constraints of the confined closed bore, a modular networked robotic system utilizing piezoelectric actuation for fully actuated prostate biopsy and brachytherapy is developed and evaluated in an accuracy study. To enhance manipulation dexterity, two kinds of steerable continuum needle robots are developed. The asymmetric-tip needle robot performs needle rotation and translation control to minimize tissue deformation and to increase steering dexterity for compensating placement error under continuous MRI guidance. The MRI-guided concentric tube robot is deployed to access delicate surgical sites that are traditionally inaccessible by straight, rigid surgical tools, without relying on tissue reaction force. The master-slave teleoperation system with hybrid actuation is the first of its kind for prostate intervention with force feedback. The teleoperation controller provides the feel and functionality of manual needle insertion. A Fabry-Perot interferometer based fiber optic force sensor is developed for the slave manipulator to measure needle insertion force and render proprioceptive feedback during teleoperation.
147.
Improved Design and Performance of Haptic Two-Port Networks through Force Feedback and Passive Actuators. Tognetti, Lawrence Joseph. 18 January 2005.
Haptic systems incorporate many different components, including virtual simulations, physical robotic interfaces (super joysticks), robotic slaves, signal communication, and digital control; two-port networks offer compact and modular organization of such haptic components. By establishing specific stability properties of the individual component networks, their control parameters can be tuned independently of external components or the interfacing environment. This allows the development of independent haptic two-port networks for interfacing with a class of haptic components. Furthermore, by using the two-port network with virtual coupling paradigm to analyze linear haptic systems, the complete duality between an admittance-controlled device with velocity (position) feedback and virtual coupling and an impedance-controlled device with force feedback and virtual coupling can be examined.

This research first provides background on linear haptic two-port networks and the use of Llewellyn's stability criterion to prove their stability when interfaced with passive environments, with specific comments regarding the application of these linear techniques to nonlinear systems. Furthermore, man-machine interaction dynamics are addressed, with specific attention given to the assumption that the human is a passive element and to how estimated human impedance/admittance dynamic limits can be included in the two-port design. Two-port numerical tuning algorithms and analysis techniques are presented and lay the groundwork for testing of these haptic networks on HuRBiRT (Human Robotic Bilateral Research Tool), a large-scale nonlinear hybrid active/passive haptic display.

First, two-port networks are numerically tuned using a linearized dynamic model of HuRBiRT. The resulting admittance and impedance limits of the respective networks are compared to gain insight into the advantages and disadvantages of the two implementations of haptic causality for the same device, with specific consideration given to the advantage of adding force feedback to the impedance network, the selection of the virtual coupling form, the effects of varying system parameters (such as physical or EMF damping, filters, etc.), and the effects of including human dynamic limits in the network formulation. Impedance and admittance two-port network implementations are experimentally validated on HuRBiRT, adding further practical insight into network formulation. The resulting experimental networks are directly compared to those numerically formulated using HuRBiRT's linearized dynamic models.
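Llewellyn's stability criterion reduces, at each frequency, to three inequalities on the two-port immittance parameters. A minimal sketch of the standard textbook form (nothing here is specific to HuRBiRT, and the parameter values in the examples are hypothetical):

```python
def llewellyn_stable(p11, p12, p21, p22):
    """Llewellyn's absolute-stability conditions at one frequency for a
    two-port with immittance matrix [[p11, p12], [p21, p22]] (complex)."""
    if p11.real < 0 or p22.real < 0:   # both ports must be dissipative
        return False
    prod = p12 * p21
    # Cross-coupling must be bounded by the available dissipation:
    return 2 * p11.real * p22.real >= abs(prod) + prod.real
```

The ideal lossless transmission (p11 = p22 = 0, p12 = -p21) sits exactly on the stability boundary, while any negative port resistance fails outright.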
148.
Haptics with Applications to Cranio-Maxillofacial Surgery Planning. Olsson, Pontus. January 2015.
Virtual surgery planning systems have demonstrated great potential to help surgeons achieve a better functional and aesthetic outcome for the patient, and at the same time reduce time in the operating room, resulting in considerable cost savings. However, the two-dimensional tools employed in these systems today, such as a mouse and a conventional graphical display, are difficult to use for interaction with three-dimensional anatomical images. Therefore, surgeons often outsource virtual planning, which increases cost and lead time to surgery. Haptics relates to the sense of touch, and haptic technology encompasses algorithms, software, and hardware designed to engage the sense of touch. To demonstrate how haptic technology in combination with stereo visualization can make cranio-maxillofacial surgery planning more efficient and easier to use, we describe our haptics-assisted surgery planning (HASP) system. HASP supports in-house virtual planning of reconstructions in complex trauma cases, and of reconstructions with a fibula osteocutaneous free flap, including bone, vessels, and soft tissue, in oncology cases. An integrated stable six degrees-of-freedom haptic attraction force model, snap-to-fit, supports semi-automatic alignment of virtual bone fragments in trauma cases. HASP has potential beyond this thesis as a teaching tool and as a development platform for future research. In addition to HASP, we describe a surgical bone saw simulator with a novel hybrid haptic interface that combines kinesthetic and vibrotactile feedback to display both low-frequency contact forces and realistic high-frequency vibrations when a virtual saw blade comes into contact with a virtual bone model. We also show that visuo-haptic co-location shortens completion time, but does not improve accuracy, in interaction tasks performed on two different visuo-haptic displays: one based on a holographic optical element and one based on a half-transparent mirror.

Finally, we describe two prototype hand-worn haptic interfaces that may expand the interaction capabilities of the HASP system. In particular, we evaluate two types of piezoelectric motors for actuating the interfaces: a walking quasi-static motor and a traveling-wave ultrasonic motor.
149.
Modeling of operator action for intelligent control of haptic human-robot interfaces. Gallagher, William John. 13 January 2014.
Control of systems requiring direct physical human-robot interaction (pHRI) requires special consideration of the motion, dynamics, and control of both the human and the robot. Humans actively change their dynamic characteristics during motion, and robots should be designed with this in mind. Both haptic robots that humans control through physical contact and wearable robots that must work with human muscles are examples of pHRI systems.
Force-feedback haptic devices require physical contact between the operator and the machine, which creates a coupled system. This human contact creates a situation in which the stiffness of the system changes based on how the operator modulates the stiffness of their arm. The natural human tendency is to increase arm stiffness to attempt to stabilize motion. However, this increases the overall stiffness of the system, making it more difficult to control and reducing stability. Instability poses a threat of injury or load damage for large assistive haptic devices with heavy loads. Controllers do not typically account for this, as operator stiffness is often not directly measurable. The common solution of using a controller with significantly increased damping has the disadvantage of slowing the device and decreasing operator efficiency. By expanding the information available to the controller, it can be designed to adjust the robot's motion based on how the operator is interacting with it and to allow faster movement in low-stiffness situations. This research explored the utility of a system that can estimate operator arm stiffness and compensate accordingly. By measuring muscle activity, a model of the human arm was used to estimate the stiffness level of the operator and then adjust the gains of an impedance-based controller to stabilize the device. This achieved the goal of reducing oscillations and increasing device performance, as demonstrated through a series of user trials with the device. Through the design of this system, the effectiveness of a variety of operator models was analyzed and several different controllers were explored. The final device has the potential to increase operator performance and reduce fatigue due to usage, which in industrial settings could translate into better efficiency and higher productivity.
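The adaptation just described, estimating arm stiffness from muscle activity and retuning the controller's damping, can be sketched in a simplified form. The linear co-contraction-to-stiffness map and the constant-damping-ratio gain rule below are illustrative assumptions, not the thesis's arm model or controller.

```python
import math

def arm_stiffness(flexor, extensor, k_min=50.0, k_max=800.0):
    """Map normalized EMG of an antagonist muscle pair (0..1 each) to an
    arm-stiffness estimate in N/m: co-contraction (the smaller of the two
    activations) raises stiffness without producing net torque."""
    cocontraction = min(flexor, extensor)
    return k_min + cocontraction * (k_max - k_min)

def controller_damping(k_arm, m_eff=2.0, zeta=0.7, b_min=5.0):
    """Choose impedance-controller damping so the coupled operator-device
    system keeps a target damping ratio as the arm stiffens."""
    return max(b_min, 2.0 * zeta * math.sqrt(m_eff * k_arm))
```

A relaxed operator gets a light, fast device; a co-contracting operator automatically receives more damping instead of the oscillations a fixed-gain controller would exhibit.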
Similarly, wearable robots must account for human muscle activity. Wearable robots, often called exoskeletons, are used for a variety of tasks, including force amplification, rehabilitation, and medical diagnosis. Force-amplification exoskeletons operate much like haptic assist devices and could leverage the same adaptive control system. The latter two types, however, are designed to modulate human muscles, in which case the wearer's muscles must adapt to the way the robot moves, the reverse of the robot adapting to how the human moves. Here the robot controller must apply a force to the arm that causes the arm muscles to adapt and generate a specific muscle activity pattern. This related problem is explored, and a muscle control algorithm is designed that allows a wearable robot to induce a specified muscle pattern in the wearer's arm.

These two problems, the robot adapting to the human's motion and the robot inducing the human to adapt its motion, are related critical problems that must be solved to enable simple and natural physical human-robot interaction.
150.
Haptic interaction between naive participants and mobile manipulators in the context of healthcare. Chen, Tiffany L. 22 May 2014.
Human-scale mobile robots that manipulate objects (mobile manipulators) have the potential to perform a variety of useful roles in healthcare. Many promising roles for robots require physical contact with patients and caregivers, which is fraught with both psychological and physical implications. In this thesis, we used a human factors approach to evaluate system performance and participant responses when potential end users performed a healthcare task involving physical contact with a robot. We performed four human-robot interaction studies with 100 people who were not experts in robotics (naive participants). We show that physical contact between naive participants and human-scale mobile manipulators can be acceptable and effective in a variety of healthcare contexts.

In this thesis, we investigated two forms of touch-based (haptic) interaction relevant to healthcare. First, we studied how participants responded to physical contact initiated by an autonomous robotic nurse. On average, people responded favorably to robot-initiated touch when the robot indicated that it was a necessary part of a healthcare task. However, their responses strongly depended on what they thought the robot's intentions were, which suggests that this will be an important consideration for future healthcare robots. Second, we investigated the coordination of whole-body motion between human-scale robots and people through the application of forces to the robot's hands and arms. Nurses found this haptic interaction to be intuitive and preferred it over a standard gamepad interface. They also navigated the robot through a cluttered healthcare environment in less time, with fewer collisions, and with less cognitive load via haptic interaction. Through a study with expert dancers, we demonstrated the feasibility of robots as dance-based exercise partners. The experts rated a robot that used only haptic interaction to be a good follower according to subjective measures of dance quality. We also determined that healthy older adults were accepting of using a robot for partner dance-based exercise. On average, they found the robot easy and enjoyable to use and felt that it performed a partnered stepping task well.

The findings in this work make several impacts on the design of robots in healthcare. We found that the perceived intent of robot-initiated touch significantly influenced people's responses. Thus, we determined that autonomous robots that initiate touch with patients can be acceptable in some contexts. This result highlights the importance of considering the psychological responses of users when designing physical human-robot interactions, in addition to considering the mechanics of performing tasks. We found that naive users across three user groups could quickly learn how to effectively use physical interaction to lead a robot during navigation, positioning, and partnered stepping tasks. These consistent results underscore the value of using physical interaction to enable users of varying backgrounds to lead a robot during whole-body motion coordination across different healthcare contexts.