1

Multirate and Perceptual Techniques for Haptic Rendering in Virtual Environments

Ruffaldi, Emanuele January 2006
Haptics is a field of robotics that involves many aspects of engineering, requiring the collaboration of different disciplines such as mechanics, electronics, control theory and computer science. Although multi-disciplinarity is an element in common with other robotic applications, haptic systems have the additional requirement of high performance, because human perception demands a feedback rate of about 1 kHz. Such a high computing requirement impacts the design of the whole haptic system, but it has particular effects on the design and implementation of haptic rendering algorithms. In the chain of software and hardware components that make up a haptic system, haptic rendering is the element whose objective is to compute the force feedback resulting from the interaction of the user with the device. A variety of haptic rendering algorithms have been proposed in the past, both for three degree-of-freedom (3DoF) interactions, in which a single point touches a complex object, and for 6DoF interactions, in which two complex objects interact at multiple points. The choice between 3DoF and 6DoF algorithms depends mostly on the type of application and consequently on the type of device. For example, applications like virtual prototyping require 6DoF interaction, while many simulation applications have less stringent requirements. Apart from the number of degrees of freedom, haptic rendering algorithms are characterized by the geometrical representation of the objects, by the use of rigid or deformable objects, and by the introduction of physical properties of the object surface such as friction and texture. Given this variety of possibilities and the presence of the human factor in the computation of haptic feedback, it is hard to compare different algorithms and assess whether one specific solution performs better than another previously proposed one. The goal of the proposed work is two-fold. First, this thesis proposes a framework that allows a more objective comparison of haptic rendering algorithms. Such a comparison takes the perceptual aspects of haptic interaction into account but tries to factor them out, with the aim of obtaining an objective comparison between algorithms. Second, this thesis proposes a new haptic rendering algorithm for 3DoF interaction and one for 6DoF interaction. The 3DoF algorithm provides interaction with rotational friction based on a simulation of the soft-finger contact model. The new 6DoF algorithm computes the haptic feedback for interactions between voxel models.
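To make the rendering step concrete, here is a minimal sketch of a 3DoF penalty-based rendering loop running at the 1 kHz rate the abstract mentions. It is a generic illustration only, not the multirate or perceptual algorithms proposed in the thesis, and the read_probe_position / send_force callbacks are hypothetical device-API placeholders.

```python
import time
import numpy as np

def render_force(probe_pos, center, radius, stiffness=800.0):
    """Penalty-based 3DoF force for a point probe touching a rigid sphere: F = k * penetration * n."""
    offset = probe_pos - center
    dist = np.linalg.norm(offset)
    if dist >= radius or dist < 1e-9:
        return np.zeros(3)                 # no contact (or degenerate case): no feedback
    normal = offset / dist                 # outward surface normal at the contact point
    return stiffness * (radius - dist) * normal

def haptic_loop(read_probe_position, send_force, rate_hz=1000):
    """Run the force computation at roughly 1 kHz, the feedback rate human perception requires."""
    period = 1.0 / rate_hz
    while True:
        t0 = time.perf_counter()
        pos = read_probe_position()                        # hypothetical device read
        send_force(render_force(pos, np.zeros(3), 0.05))   # 5 cm sphere at the origin; hypothetical write
        time.sleep(max(0.0, period - (time.perf_counter() - t0)))
```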
2

Haptic rendering for 6/3-DOF haptic devices

Kadleček, Petr January 2013
The application of haptic devices has expanded to fields such as virtual manufacturing, virtual assembly and medical simulation. Advances in the development of haptic devices have resulted in a wide distribution of asymmetric 6/3-DOF haptic devices. However, current haptic rendering algorithms work correctly only for symmetric devices. This thesis analyzes 3-DOF and 6-DOF haptic rendering algorithms and proposes an algorithm for 6/3-DOF haptic rendering that involves pseudo-haptics. The 6/3-DOF haptic rendering algorithm is implemented based on this analysis and tested in a user study.
3

Visualization and Haptics for Interactive Medical Image Analysis / Visualisering och Haptik för Interaktiv Medicinsk Bildanalys

Vidholm, Erik January 2008
Modern medical imaging techniques provide an increasing amount of high-dimensional and high-resolution image data that need to be visualized, analyzed, and interpreted for diagnostic and treatment planning purposes. As a consequence, efficient ways of exploring these images are needed. In order to work with specific patient cases, it is necessary to be able to work directly with the medical image volumes and to generate the relevant 3D structures directly as they are needed for visualization and analysis. This requires efficient tools for segmentation, i.e., separation of objects from each other and from the background. Segmentation is hard to automate due to, e.g., high shape variability of organs and limited contrast between tissues. Manual segmentation, on the other hand, is tedious and error-prone. An approach combining the merits from automatic and manual methods is semi-automatic segmentation, where the user interactively provides input to the methods. For complex medical image volumes, the interactive part can be highly 3D oriented and is therefore dependent on the user interface. This thesis presents methods for interactive segmentation and visualization where true 3D interaction with haptic feedback and stereo graphics is used. Well-known segmentation methods such as fast marching, fuzzy connectedness, live-wire, and deformable models, have been tailored and extended for implementation in a 3D environment where volume visualization and haptics are used to guide the user. The visualization is accelerated with graphics hardware and therefore allows for volume rendering in stereo at interactive rates. The haptic feedback is rendered with constraint-based direct volume haptics in order to convey information about the data that is hard to visualize and thereby facilitate the interaction. The methods have been applied to real medical images, e.g., 3D liver CT data and 4D breast MR data with good results. To provide a tool for future work in this area, a software toolkit containing the implementations of the developed methods has been made publicly available.
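As a rough illustration of the constraint-based (proxy) volume haptics idea mentioned above, the sketch below moves a proxy point toward the device position while keeping it outside an object, and returns a spring force between proxy and probe. It is a simplified sketch under stated assumptions, not the rendering used in the thesis; inside() and surface_normal() are hypothetical callbacks over the volume data.

```python
import numpy as np

def proxy_step(proxy, probe, inside, surface_normal, step=1e-3, stiffness=500.0):
    """One update of a constraint-based (proxy) haptic scheme on volume data.

    proxy, probe: 3-vectors; inside(p) -> bool tests an iso-surface; surface_normal(p) -> unit normal.
    """
    direction = probe - proxy
    dist = np.linalg.norm(direction)
    if dist > 1e-9:
        direction /= dist
        # move the proxy toward the probe in small steps, but never let it enter the object
        for _ in range(int(dist / step)):
            candidate = proxy + step * direction
            if inside(candidate):
                n = surface_normal(candidate)
                # project the motion onto the constraint plane so the proxy slides along the surface
                tangential = direction - np.dot(direction, n) * n
                candidate = proxy + step * tangential
                if inside(candidate):
                    break
            proxy = candidate
    force = stiffness * (proxy - probe)   # spring pulls the probe back toward the constrained proxy
    return proxy, force
```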
4

Development Of A Material Cutting Model For Haptic Rendering Applications

Uner, Gorkem 01 July 2007
Haptic devices and haptic rendering are an important topic in the burgeoning field of virtual reality applications. In this thesis, I describe the design and implementation of a cutting force model integrating a haptic device, the PHANToM, with a high-powered computer. My goal was to build a six degree-of-freedom force model that allows the user to interact with three-dimensional deformable objects. Methods for haptic rendering, including graphical rendering, collision detection and force feedback, are illustrated; the implementation of the haptic rendering system is introduced; and the application is evaluated to explore the effectiveness of the system.
5

Haptics with Applications to Cranio-Maxillofacial Surgery Planning

Olsson, Pontus January 2015
Virtual surgery planning systems have demonstrated great potential to help surgeons achieve a better functional and aesthetic outcome for the patient, and at the same time reduce time in the operating room, resulting in considerable cost savings. However, the two-dimensional tools employed in these systems today, such as a mouse and a conventional graphical display, are difficult to use for interaction with three-dimensional anatomical images. Therefore, surgeons often outsource virtual planning, which increases cost and lead time to surgery. Haptics relates to the sense of touch, and haptic technology encompasses algorithms, software, and hardware designed to engage the sense of touch. To demonstrate how haptic technology in combination with stereo visualization can make cranio-maxillofacial surgery planning more efficient and easier to use, we describe our haptics-assisted surgery planning (HASP) system. HASP supports in-house virtual planning of reconstructions in complex trauma cases, and reconstructions with a fibula osteocutaneous free flap including bone, vessels, and soft tissue in oncology cases. An integrated stable six degrees-of-freedom haptic attraction force model, snap-to-fit, supports semi-automatic alignment of virtual bone fragments in trauma cases. HASP has potential beyond this thesis as a teaching tool and also as a development platform for future research. In addition to HASP, we describe a surgical bone saw simulator with a novel hybrid haptic interface that combines kinesthetic and vibrotactile feedback to display both low-frequency contact forces and realistic high-frequency vibrations when a virtual saw blade comes in contact with a virtual bone model. We also show that visuo-haptic co-location shortens the completion time, but does not improve the accuracy, in interaction tasks performed on two different visuo-haptic displays: one based on a holographic optical element and one based on a half-transparent mirror. Finally, we describe two prototype hand-worn haptic interfaces that may expand the interaction capabilities of the HASP system. In particular, we evaluate two different types of piezo-electric motors for actuating the interfaces: one walking quasi-static motor and one traveling-wave ultrasonic motor.
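As an illustration of what a 6-DOF attraction ("snap-to-fit") force can look like, the sketch below computes a spring-like force and torque pulling a bone fragment's pose toward a candidate fit pose. It is a generic pose-attraction sketch with assumed gains, not the stable attraction model described in the thesis.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def snap_to_fit_wrench(frag_pos, frag_rot, fit_pos, fit_rot, k_lin=50.0, k_ang=2.0):
    """Spring-like 6-DOF wrench pulling a fragment pose toward a candidate fit pose.

    frag_rot / fit_rot are scipy Rotation objects; returns (force, torque) in the world frame."""
    force = k_lin * (fit_pos - frag_pos)              # linear spring toward the fit position
    delta = (fit_rot * frag_rot.inv()).as_rotvec()    # axis*angle taking the current orientation to the fit
    torque = k_ang * delta                            # torsional spring about that axis
    return force, torque

# example: fragment 1 cm off and rotated 10 degrees about z from the candidate fit
f, t = snap_to_fit_wrench(np.array([0.01, 0.0, 0.0]), R.from_euler("z", 10, degrees=True),
                          np.zeros(3), R.identity())
```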
6

Control Implementation and Co-simulation  of A 6-DOF TAU Haptic Device / Reglering, implementering och samsimulering av en 6-DOF TAU haptisk enhet

Zhang, Yang January 2020
In the research area of virtual reality, the term haptic rendering is defined as the process of computing and generating the interaction force between the virtual object and the operator. One of the major challenges of haptic rendering is stably rendering contact with stiff objects. Traditional haptic rendering algorithms perform well when rendering contact with soft objects, but when they are used to simulate contact with objects of high stiffness, they may cause an unstable response of the haptic device. Such unstable behavior (e.g., oscillation of the device) can destroy the fidelity of the virtual environment and even hurt the user. To address these stability issues, a new design approach is proposed in this thesis. The proposed approach consists of three main steps: modeling and linearization in ADAMS, LQR position-controller design, and verification by co-simulation. In the first step, a simulation model of the system is created in ADAMS/View. This nonlinear ADAMS multi-body dynamics model is then linearized and exported as a set of linear state-space matrices with the help of ADAMS/Linear. In the second step, unlike traditional force-control algorithms, an LQR position controller is developed in Matlab Simulink based on the exported matrices to emulate interactions with stiff objects. Finally, the control performance is verified by setting up a co-simulation between ADAMS and Simulink. A case-study implementation of the proposed method was performed on the TAU device, previously developed by the Machine Design department at KTH. TAU is an asymmetrical parallel robot with six degrees of freedom for the simulation of surgical procedures such as drilling and milling of hard tissues of bone and teeth. The results show that the linear model exported from ADAMS is sufficiently accurate and that the proposed controller can render a virtual wall with a stiffness on the order of 10^5 N/m. / In the research area of virtual reality, the term haptic rendering is defined as the process of computing and generating interaction forces between the virtual object and the user. One of the biggest challenges of haptic rendering is to stably simulate the sensation of touching stiff materials. Traditional algorithms work when simulating the sensation of touching soft materials, but when they are used to simulate contact with very stiff materials they can cause instability in haptic devices. Such instability, for example oscillation of the device, can destroy the fidelity of the virtual environment and even injure the user. This thesis addresses the above problem by proposing a new design method. The method consists of three main steps: modeling and linearization with ADAMS, design of an LQR position controller, and verification through co-simulation. In the first step, a simulation model of the system is created in ADAMS/View. This nonlinear ADAMS multi-body dynamics model is then linearized and exported as linear state-space matrices using ADAMS/Linear. In the second step, unlike traditional force-control algorithms, an LQR position controller is designed in Matlab Simulink based on the previously exported matrices in order to simulate interactions with stiff materials.
In the last step, the performance of the position controller is verified by setting up a co-simulation between ADAMS and Simulink. A test run of the proposed method was carried out on the TAU device, previously developed by the KTH department of Machine Design. TAU is an asymmetric parallel robot with six degrees of freedom for simulating surgical procedures such as drilling in hard tissues of bone and teeth. The results show that the linear model exported from ADAMS is sufficiently accurate, and that the proposed position controller can render a virtual wall with a stiffness of 10^5 N/m.
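To illustrate the LQR design step on matrices of the kind ADAMS/Linear exports, here is a minimal sketch in Python (the thesis itself uses Matlab Simulink). The A and B matrices below are a toy single-mass stand-in, and the weights Q and R are illustrative assumptions, not values from the thesis.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    """Continuous-time LQR: minimize the integral of x'Qx + u'Ru for x_dot = A x + B u."""
    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)      # K = R^{-1} B' P
    return K

# toy 1-DOF mass standing in for the linearized state-space model (illustrative only)
m = 0.5                                   # kg
A = np.array([[0.0, 1.0], [0.0, 0.0]])    # states: position, velocity
B = np.array([[0.0], [1.0 / m]])
Q = np.diag([1e4, 1.0])                   # weight position error heavily for a stiff virtual wall
R = np.array([[1e-3]])
K = lqr_gain(A, B, Q, R)
# control law: u = -K @ (x - x_ref), where x_ref encodes the commanded (wall) position
```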
7

Couplage de la rObotique et de la simulatioN mEdical pour des proCédures automaTisées (CONECT) / Coupling robotics and medical simulations for automatic percutaneous procedures

Adagolodjo, Yinoussa 06 September 2018
Needle-insertion techniques are among the most common surgical interventions. The effectiveness of these interventions strongly depends on the accuracy with which the needles are positioned at a target location inside the patient's body. The main objective of this thesis is to develop an autonomous robotic system capable of inserting a flexible needle into a deformable structure along a predefined trajectory. The originality of this work lies in the use of inverse finite-element (FE) simulations in the robot's control loop to predict the deformation of the structures. Its particularity is that, during insertion, the FE models are continuously re-registered (corrective step) using information extracted from an intra-operative imaging system. This step keeps the error between the models and the real structures under control and thus prevents them from diverging. A second step (predictive step) starts from the corrected position and anticipates the behaviour of the deformable structures, relying only on the predictions of the biomechanical models. This makes it possible to anticipate the robot command and compensate for tissue displacements even before the needle moves. Experimentally, we used our approach to control a real robot inserting a flexible needle into a deformable foam along a predefined (virtual) trajectory. We proposed a constraint-based formulation that computes the predictive steps in constraint space, giving a total insertion time compatible with clinical applications. We also proposed an augmented-reality system for open liver surgery, based on a semi-automatic initial registration and an intra-operative tracking algorithm using optical (3D) markers. We demonstrated the applicability of this approach in the operating room during a liver resection procedure. The results obtained during this thesis work have led to three publications in international conferences (two at IROS and one at ICRA) and to a journal article (Transactions on Robotics) currently under review. / Needle-based interventions are among the least invasive surgical approaches for accessing deep internal structures within organs without damaging surrounding tissues. Unlike traditional open surgery, needle-based approaches affect only a localized area around the needle, reducing in this way the occurrence of trauma and the risk of complications [Cowan 2011]. Many surgical procedures rely on needles in today's clinical routine (biopsies, local anesthesia, blood sampling, prostate brachytherapy, vertebroplasty ...). Radiofrequency ablation (RFA) is an example of a percutaneous procedure that uses heat at the tip of a needle to destroy cancer cells. Such alternative treatments may open new solutions for unresectable tumors or metastases (because of concerns about the age of the patient, or the extent or localization of the disease). However, contrary to what one may think, needle-based approaches can be exceedingly complex interventions. Indeed, the effectiveness of the treatment is highly dependent on the accuracy of the needle positioning (of the order of a few millimeters), which can be particularly challenging when needles are manipulated from outside the patient under intra-operative images (X-ray, fluoroscopy or ultrasound ...)
offering poor visibility of internal structures. Human factors, organ deformation, needle deflection and the limitations of intra-operative imaging modalities can all cause needle misplacement and raise significantly the technical level necessary to master these surgical acts. The use of surgical robots has revolutionized the way surgeons approach minimally invasive surgery. Robots have the potential to overcome several limitations coming from the human factor, for instance by filtering operator tremor, scaling the motion of the user or adding new degrees of freedom at the tip of instruments. A rapidly growing number of surgical robots has been developed and applied to a large panel of surgical applications [Troccaz 2012]. Yet, an important difficulty for needle-based procedures lies in the fact that both soft tissues and needles tend to deform as the insertion proceeds, in a way that cannot be described with geometrical approaches. Standard solutions address the deformation problem by extracting a set of features from per-operative images (an approach also called visual servoing) and locally adjusting the pose/motion of the robot to compensate for deformations [Hutchinson 1996]. [...] To overcome these limitations, we introduce a numerical method that allows inverse finite-element simulations to be performed in real time. We show that it can be used to control an articulated robot while considering deformations of structures during needle insertion. Our approach relies on a forward FE simulation of the needle insertion (involving complex non-linear phenomena such as friction, puncture and needle constraints). [...]
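The corrective/predictive structure described above can be summarized with the following sketch. Every object and method here (imaging.acquire_markers, fe_model.register_to, fe_model.inverse_simulation, robot.move) is a hypothetical placeholder standing in for the authors' components, not their actual API.

```python
def insertion_control_loop(robot, fe_model, imaging, trajectory, dt=0.1):
    """Corrective/predictive loop in the spirit of the abstract: the FE model is
    re-registered to intra-operative images, then used to anticipate tissue motion.
    All objects and methods are hypothetical placeholders."""
    for waypoint in trajectory:
        # corrective step: extract features from the intra-operative image and
        # re-register the finite-element model so it does not drift from the real structures
        markers = imaging.acquire_markers()
        fe_model.register_to(markers)

        # predictive step: simulate the tissue/needle response to the next motion and
        # solve the inverse problem for the robot command that reaches the waypoint
        command = fe_model.inverse_simulation(target=waypoint,
                                              needle_tip=robot.tip_pose())
        robot.move(command, duration=dt)
```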
8

Application of Electrorheological Fluid for Conveying Realistic Haptic Feedback in Touch Interfaces

Mazursky, Alex James 03 May 2019
No description available.
