31.
Computational Models and Analyses of Human Motor Performance in Haptic Manipulation. Fu, Michael J. 27 April 2011.
No description available.
32.
Real-time mandibular angle reduction surgical simulator with haptic rendering (基于触觉绘制的实时下颌角缩小手术模拟系统). CUHK electronic theses & dissertations collection. January 2012.
Mandibular angle reduction is a popular and effective procedure widely used to alter the facial contour. The primary surgical instruments employed in the surgery, the reciprocating saw and the round burr, share a common feature: they remove bone while operating at high speed. Inexperienced surgeons generally need a long period of practice to learn how to minimize the risks caused by uncontrolled contacts and cutting motions when manipulating instruments under high-speed reciprocation or rotation. Virtual reality (VR)-based surgical simulation with both visual and haptic feedback offers novice surgeons a feasible and safe way to practice their surgical skills. However, creating realistic haptic interaction between a high-speed rotary or reciprocating instrument and stiff bone is a challenging task. In this work, a VR-based surgical simulator for mandibular angle reduction was designed and implemented. High-fidelity visual and haptic feedback is provided to enhance perception in a realistic virtual surgical environment. An impulse-based haptic model is proposed to simulate the contact forces and torques on the instruments; it provides convincing haptic sensation as surgeons control the instruments under different reciprocation or rotation velocities.
In addition, to mimic the lateral and axial burring vibration forces, a three-dimensional vibration model was developed. Real-time methods for bone removal and reconstruction during surgical procedures were proposed to support realistic visual feedback. The simulated contact forces were verified by comparison against actual force data measured on a purpose-built mechanical platform. An empirical study based on patient-specific data was conducted to evaluate the ability of the proposed system to train surgeons with varying levels of experience; the results confirm the validity of the simulator. Wang, Qiong. Thesis (Ph.D.)--Chinese University of Hong Kong, 2012. Includes bibliographical references (leaves 100-114). Abstract also in Chinese.
Contents: Abstract; Acknowledgement; 1 Introduction (1.1 Contributions of the Thesis; 1.2 Thesis Roadmap); 2 Related Work (2.1 Virtual Orthopaedic Surgical Simulator; 2.2 Haptic Rendering for Virtual Surgery; 2.3 Evaluation of the Virtual System); 3 System Design (3.1 Overall System Framework); 4 Bone-burring Surgical Simulation (4.1 Impulse-Based Modeling of Haptic Simulation of Bone-Burring: 4.1.1 Basic Assumptions, 4.1.2 Bone-Burring Contact Description, 4.1.3 Burring Force Modeling; 4.2 Simulation of Bone Removal: 4.2.1 Bone Removal Model, 4.2.2 Adaptive Subdividing Removal Surface; 4.3 Implementation and Experimental Results: 4.3.1 Force Evaluation, 4.3.2 Task-based Evaluation, 4.3.3 Time Performance); 5 Bone-sawing Surgical Simulation (5.1 Impulse-Based Modeling of Haptic Simulation of Bone-Sawing: 5.1.1 Haptic Saw Instruments Description, 5.1.2 Sawing Force Modeling, 5.1.3 Sawing Torque Constraint; 5.2 Real-time Bone Mesh Reconstruction); 6 Evaluation (6.1 Haptic Feedback Evaluation: 6.1.1 Mechanical Platform Setup, 6.1.2 Comparison of the Measured and Simulated Forces; 6.2 Empirical Study: 6.2.1 Patient-Specific Data, 6.2.2 Objective Performance Metrics, 6.2.3 Evaluation Results); 7 Conclusion; Publication List; Bibliography.
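The abstract does not give the impulse-based force model itself; as a rough, hypothetical sketch, a contact impulse J = m(v' - v) spread over one haptic servo period yields the rendered force. All masses, velocities, and rates below are illustrative, not values from the thesis:

```python
def impulse_contact_force(tool_mass, v_in, restitution, dt):
    """Toy impulse-based contact: reflect the incoming normal velocity
    through a restitution coefficient and spread the resulting impulse
    over one haptic servo period. Hypothetical model, for illustration."""
    v_out = -restitution * v_in                # post-impact normal velocity
    impulse = tool_mass * (v_out - v_in)       # J = m * (v' - v)
    return impulse / dt                        # average force over the period

# Example: a 0.05 kg burr tip meeting bone at 0.2 m/s, 1 kHz servo rate
f = impulse_contact_force(0.05, -0.2, 0.5, 0.001)   # 15.0 N
```

Spreading the impulse over the servo period keeps the rendered force bounded even for very stiff contacts, which is the usual motivation for impulse-based haptic rendering of hard materials like bone.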
33.
"Spindex" (speech index) enhances menu navigation user experience of touch screen devices in various input gestures: tapping, wheeling, and flickingJeon, Myounghoon 11 November 2010 (has links)
In a large number of electronic devices, users interact with the system by navigating through various menus. Auditory menus can complement or even replace visual menus, so research on auditory menus has recently increased for mobile devices as well as desktop computers. Despite the potential importance of auditory displays on touch screen devices, little research has attempted to enhance the effectiveness of auditory menus for those devices. In the present study, I investigated how advanced auditory cues enhance auditory menu navigation on a touch screen smartphone, especially for new input gestures such as tapping, wheeling, and flicking for navigating a one-dimensional menu. Moreover, I examined whether advanced auditory cues improve user experience, not only in visuals-off situations but also in visuals-on contexts. To this end, I used a novel auditory menu enhancement called a "spindex" (i.e., speech index), in which brief audio cues inform users of where they are in a long menu. In this study, each item in a menu was preceded by a sound based on the item's initial letter. One hundred twenty-two undergraduates navigated through an alphabetized list of 150 song titles. The study was a split-plot design with manipulated auditory cue type (text-to-speech (TTS) alone vs. TTS plus spindex), visual mode (on vs. off), and input gesture style (tapping, wheeling, and flicking). Target search time and subjective workload for TTS + spindex were lower than for TTS alone in all input gesture types, regardless of visual mode. Also, on subjective rating scales, participants rated the TTS + spindex condition higher than plain TTS on being 'effective' and 'functionally helpful'. The interaction between input methods and output modes (i.e., auditory cue types) and its effects on navigation behaviors was also analyzed based on the two-stage navigation strategy model used in auditory menus.
Results were discussed in analogy with visual search theory and in terms of practical applications of spindex cues.
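The spindex idea described above, prefixing each menu item with a brief sound keyed to its initial letter, can be sketched as follows. The cue-lookup function and the non-letter bucket are assumptions for illustration, not the study's implementation:

```python
def spindex_cue(item):
    """Return the spindex cue key for a menu item: its initial letter,
    which would index a pre-recorded short speech sound. Non-letter
    initials are bucketed under "#" (a hypothetical convention)."""
    s = item.strip()
    first = s[0].upper() if s else "#"
    return first if first.isalpha() else "#"

# A long alphabetized list collapses to a small set of cue sounds,
# letting users skim by letter before listening to full TTS titles.
playlist = ["Abbey Road", "Back in Black", "99 Problems"]
cues = [spindex_cue(t) for t in playlist]   # ["A", "B", "#"]
```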
34.
Haptic cinema: an art practice on the interactive digital media tabletop. Chenzira, Ayoka. 31 January 2011.
Common thought about cinema calls to mind an audience seated in a darkened theatre watching projected moving images that unfold a narrative onto a single screen. Cinema is much more than this. There is a significant history of artists experimenting with the moving image outside of its familiar setting in a movie theatre. These investigations are often referred to as "expanded cinema".
This dissertation proposes a genre of expanded cinema called haptic cinema, an approach to interactive narrative that emphasizes material object sensing, identification and management; viewer's interaction with material objects; multisequential narrative; and the presentation of visual and audio information through multiple displays to create a sensorially rich experience for viewers. The interactive digital media tabletop is identified as one platform on which to develop haptic cinema. This platform supports a subgenre of haptic cinema called tabletop cinema. Expanded cinema practices are analyzed for their contributions to haptic cinema. Based on this theoretical and artistic research, the thesis claims that haptic cinema contributes to the historical development of expanded cinema and interactive cinema practices. I have identified the core properties of a haptic cinema practice during the process of designing, developing and testing a series of haptic cinema projects. These projects build on and make use of methods and conventions from tangible interfaces, tangible narratives and tabletop computing.
35.
Touchscreen interfaces for machine control and education. Kivila, Arto. 20 September 2013.
The touchscreen user interface is an inherently dynamic device that is becoming ubiquitous. The touchscreen's ability to adapt to the user's needs makes it superior to more traditional haptic devices in many ways. Most touchscreen devices come with a very large array of sensors already included in the package, which gives engineers the means to develop human-machine interfaces that are very intuitive to use. This thesis presents research on developing the best touchscreen interface for novice users driving an industrial crane. To generalize the research, testing also determined how touchscreen interfaces compare to the traditional joystick in highly dynamic tracking situations, using a manual tracking experiment.

Three separate operator studies were conducted to investigate touchscreen control of cranes. The data indicate that touchscreen interfaces are superior to the traditional push-button control pendant and that the layout and function of the graphical user interface on the touchscreen play a role in the performance of the human operators.

The touchscreen interface also holds great promise for allowing users to navigate through interactive textbooks. Therefore, this thesis also presents developments directed at creating the next generation of engineering textbooks. Nine widgets were developed for an interactive mechanical design textbook meant to be delivered via tablet computers. Those widgets help students improve their technical writing abilities, introduce them to tools they can use in product development, and give them insight into how some dynamical systems behave. In addition, two touchscreen applications were developed to aid the judging of a mechanical design competition.
36.
Modeling of operator action for intelligent control of haptic human-robot interfaces. Gallagher, William John. 13 January 2014.
Control of systems requiring direct physical human-robot interaction (pHRI) demands special consideration of the motion, dynamics, and control of both the human and the robot. Humans actively change their dynamic characteristics during motion, and robots should be designed with this in mind. Both haptic robots that humans control through physical contact and wearable robots that must work with human muscles are pHRI systems.

Force-feedback haptic devices require physical contact between the operator and the machine, which creates a coupled system. This human contact creates a situation in which the stiffness of the system changes based on how the operator modulates the stiffness of their arm. The natural human tendency is to increase arm stiffness in an attempt to stabilize motion. However, this increases the overall stiffness of the system, making it more difficult to control and reducing stability. Instability poses a threat of injury or load damage for large assistive haptic devices with heavy loads. Controllers do not typically account for this, as operator stiffness is often not directly measurable. The common solution, a controller with significantly increased damping, has the disadvantage of slowing the device and decreasing operator efficiency. By expanding the information available to the controller, it can be designed to adjust a robot's motion based on how the operator is interacting with it, allowing faster movement in low-stiffness situations. This research explored the utility of a system that can estimate operator arm stiffness and compensate accordingly. By measuring muscle activity, a model of the human arm was used to estimate the stiffness level of the operator and then adjust the gains of an impedance-based controller to stabilize the device. This achieved the goal of reducing oscillations and increasing device performance, as demonstrated through a series of user trials with the device. Through the design of this system, the effectiveness of a variety of operator models was analyzed and several different controllers were explored. The final device has the potential to increase operator performance and reduce fatigue due to usage, which in industrial settings could translate into better efficiency and higher productivity.
Similarly, wearable robots must consider human muscle activity. Wearable robots, often called exoskeleton robots, are used for a variety of tasks, including force amplification, rehabilitation, and medical diagnosis. Force amplification exoskeletons operate much like haptic assist devices, and could leverage the same adaptive control system. The latter two types, however, are designed with the purpose of modulating human muscles, in which case the wearer's muscles must adapt to the way the robot moves, the reverse of the robot adapting to how the human moves. In this case, the robot controller must apply a force to the arm to cause the arm muscles to adapt and generate a specific muscle activity pattern. This related problem is explored and a muscle control algorithm is designed that allows a wearable robot to induce a specified muscle pattern in the wearer's arm.
The two problems, in which the robot must adapt to the human's motion and in which the robot must induce the human to adapt its motion, are related critical problems that must be solved to enable simple and natural physical human-robot interaction.
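The stiffness-compensating controller described above can be sketched, under heavy assumptions, as a two-stage mapping: EMG activation to an arm-stiffness estimate, then stiffness to a scheduled damping gain. The linear mappings and every numeric limit below are hypothetical, not the thesis's model:

```python
def estimate_arm_stiffness(emg_activation, k_min=50.0, k_max=400.0):
    """Map normalized muscle activation (0..1) to an arm-stiffness
    estimate in N/m via a clamped linear rule (illustrative only)."""
    a = min(max(emg_activation, 0.0), 1.0)
    return k_min + a * (k_max - k_min)

def impedance_damping_gain(arm_stiffness, b0=5.0, scale=0.05):
    """Raise controller damping as the operator stiffens, trading speed
    for stability. The gain schedule is a placeholder, not the thesis's."""
    return b0 + scale * arm_stiffness

# Half-effort activation -> moderate stiffness -> moderately raised damping
k = estimate_arm_stiffness(0.5)       # 225.0 N/m
b = impedance_damping_gain(k)         # 16.25 N*s/m
```

The point of the schedule is captured even by this toy version: at low activation the device stays lightly damped and fast, and damping grows only when the coupled human-robot system would otherwise tend toward oscillation.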
37.
Haptic interaction between naive participants and mobile manipulators in the context of healthcare. Chen, Tiffany L. 22 May 2014.
Human-scale mobile robots that manipulate objects (mobile manipulators) have the potential to perform a variety of useful roles in healthcare. Many promising roles for robots require physical contact with patients and caregivers, which is fraught with both psychological and physical implications. In this thesis, we used a human-factors approach to evaluate system performance and participant responses when potential end users performed a healthcare task involving physical contact with a robot. We performed four human-robot interaction studies with 100 people who were not experts in robotics (naive participants). We show that physical contact between naive participants and human-scale mobile manipulators can be acceptable and effective in a variety of healthcare contexts. We investigated two forms of touch-based (haptic) interaction relevant to healthcare. First, we studied how participants responded to physical contact initiated by an autonomous robotic nurse. On average, people responded favorably to robot-initiated touch when the robot indicated that it was a necessary part of a healthcare task. However, their responses strongly depended on what they thought the robot's intentions were, which suggests that this will be an important consideration for future healthcare robots. Second, we investigated the coordination of whole-body motion between human-scale robots and people through the application of forces to the robot's hands and arms. Nurses found this haptic interaction intuitive and preferred it over a standard gamepad interface. They also navigated the robot through a cluttered healthcare environment in less time, with fewer collisions, and with less cognitive load via haptic interaction. Through a study with expert dancers, we demonstrated the feasibility of robots as dance-based exercise partners: the experts rated a robot that used only haptic interaction as a good follower according to subjective measures of dance quality. We also determined that healthy older adults accepted using a robot for partner-dance-based exercise; on average, they found the robot easy and enjoyable to use and judged that it performed a partnered stepping task well. The findings in this work have several implications for the design of robots in healthcare. Because the perceived intent of robot-initiated touch significantly influenced people's responses, autonomous robots that initiate touch with patients can be acceptable in some contexts; this highlights the importance of considering users' psychological responses, in addition to the mechanics of performing tasks, when designing physical human-robot interactions. We also found that naive users across three user groups could quickly learn to use physical interaction effectively to lead a robot during navigation, positioning, and partnered stepping tasks. These consistent results underscore the value of physical interaction in enabling users of varying backgrounds to lead a robot during whole-body motion coordination across different healthcare contexts.
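Leading a robot by forces applied to its hands and arms is commonly realized with an admittance law; a minimal sketch, assuming a virtual mass-damper (not necessarily the controller used in the thesis):

```python
def admittance_velocity(force, virtual_mass, virtual_damping, v_prev, dt):
    """One explicit-Euler step of a simple admittance law: the measured
    hand force drives a virtual mass-damper, m*dv/dt + b*v = f, whose
    velocity becomes the base velocity command. Parameters are hypothetical."""
    dv = (force - virtual_damping * v_prev) / virtual_mass
    return v_prev + dv * dt

# A 10 N push on the robot's arm starts the base moving smoothly
v = admittance_velocity(10.0, 5.0, 2.0, 0.0, 0.01)   # 0.02 m/s after one step
```

With such a law, the robot feels like a massive but compliant object: larger virtual damping makes it settle quickly when the person stops pushing, which suits cluttered healthcare spaces.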
38.
Development of a telerobotic test bench system for small-field-of-operation bilateral applications with 3D visual and haptic (kinaesthetic) feedback. Smit, Andre (2014).
Thesis (MScEng)--Stellenbosch University, 2014.

ENGLISH ABSTRACT: Teleoperation as a field has seen much change since its inception in the early 1940s, when Dr. Raymond Goertz produced the first teleoperation system for manipulating radioactive materials. With advances in core and supporting technologies, such systems have grown in complexity and capability, allowing users to perform tasks anywhere in the world irrespective of physical distance. Their feasibility has increased as telepresence robots, exploration robots (as in space exploration), search-and-rescue robots, and military systems such as UAVs and UGVs gain popularity.

This prompted the development of a proof-of-concept modular, user-centred telerobotic system; the current project is the second iteration in that development process.

Teleoperation, and more specifically telerobotic systems, pose a challenge for many system developers, owing to their complexity and the wide assortment of knowledge areas that developers must master to deliver the final system. Developers have to balance system usability, user requirements, technical design, and performance requirements. Several development process models are considered in the context of Engineering Management (EM). A larger Systems Engineering development process is used, with focus on the primary and supportive EM components. The author used a hybrid, user-focussed development model: the User-Centred Systems Design (UCSD) methodology was adopted as the primary model within two distinct development categories. The first category, hardware and system integration, used the UCSD model as is. The second, software development, relied on agile models; rapid application development (RAD) and extreme programming (XP) were discussed, with XP chosen because it could easily incorporate UCSD principles into its development process.

Hardware systems development consisted of mechanical design of end-effectors, configuration management and design, and haptic and visual feedback systems design for the overall physical system, including the physical interface design of the input (master) cell. Software development was broken into three sections: the graphical user interface (the first and most important), the haptic control system with kinematic model, and video feedback control.

The force-following and force-matching characteristics of the system were tested and found to improve on the previous implementation: the force magnitude error at steady state was reduced by 10%, and system response improved dramatically, with the rise time reduced by a factor of 10. The system did, however, show a decrease in angular accuracy, which was attributed to control system limitations.

Further human-factor analysis experiments were conducted to test the system in two typical use-case scenarios: a planar experiment and a 3D placement task. The factors of interest were field of view, feedback vision mode, and input modality. Heuristic performance indicators, such as time-to-completion and number of collisions for a given task, were measured. System performance showed significant improvement only when the system was used with haptic control, indicating that research into haptic control systems will prove valuable in producing usable systems. The vision-factor analysis failed to yield significant results, although it was useful in the qualitative systems analysis.

Feedback from post-experimentation questionnaires showed that users preferred the point-of-view camera perspective and 2D viewing over 3D viewing, and preferred the haptic input modality.

The results of the technical verification process can be used in conjunction with insights gained from user preference and human-factor analysis to provide guidance for future
telerobotic systems development at Stellenbosch University.
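A minimal sketch of the position-forward / force-back structure underlying such bilateral telerobotic systems, with hypothetical gains (the thesis's actual control law is not reproduced here):

```python
def bilateral_force_feedback(x_master, x_slave, f_env, kp=200.0, kf=1.0):
    """One step of a minimal bilateral law: the slave tracks the master
    position through a proportional term, while the measured environment
    force is reflected back to the operator. Gains are hypothetical."""
    slave_cmd_force = kp * (x_master - x_slave)   # slave position tracking
    master_feedback = -kf * f_env                 # force reflected to operator
    return slave_cmd_force, master_feedback

# Master 2 cm ahead of the slave, 3 N measured at the environment
s_cmd, m_fb = bilateral_force_feedback(0.1, 0.08, 3.0)
```

Metrics like the steady-state force-magnitude error and rise time reported above are exactly the quantities such a loop is tuned against: how closely the reflected force matches the environment force, and how fast the slave closes the position gap.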
39.
Development of Multipoint Haptic Device for Spatial Palpation. Muralidharan, Vineeth. January 2017.
This thesis deals with the development of a novel haptic array system that can render a distributed pressure pattern. Haptic devices are force-feedback interfaces seen widely from consumer products to tele-surgical systems: vibration feedback in game consoles and mobile phones, virtual reality applications, and the da Vinci robot in minimally invasive surgery. Telemedicine and computer-enabled medical training systems are modern medical infrastructures: the former provides health-care services to people, especially in rural and remote places, while the latter trains the next generation of doctors and medical students. In telemedicine, a patient at a remote location consults a physician at a distant place through telecommunication media, whereas in a computer-enabled medical training system, physicians and medical students interact with a virtual patient. The experience of the physical presence of the remote patient in telemedicine, and immersive interaction with the virtual patient in a computer-enabled training system, can be attained through haptic devices. This work focuses on palpation simulation in telemedicine and medical training systems. Palpation is a primary diagnostic method involving multi-finger, multi-contact interaction between patient and physician; during palpation, the physician perceives a distributed pressure pattern rather than a point load. Commercially available haptic devices are single- and five-point devices, which lack face validity in rendering a distributed pressure pattern, and only a few works in the literature deal with palpation simulation. There is a strong need for a haptic device that provides a distributed force pattern with multipoint feedback, applicable to palpation simulation in telemedicine and medical training.

The haptic device should be a multipoint device to simulate the palpation process, an array device to render a distributed force pattern, lightweight enough to move from one place to another, and able to cover the hand portion of the physician. We propose a novel under-actuated haptic array device, called the taut cable haptic array system (TCHAS), which in general is an m x n system consisting of m + n actuators that drive m·n haptels (multiple end-effectors). A prototype 3 x 3 TCHAS was developed during this work, and its characterisation is explored in detail. The performance of the device is validated with an elaborate user study, which establishes that the device has promising capability in rendering distributed spatio-temporal pressure patterns.
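Driving m·n haptels with only m + n actuators implies that each haptel's output is some coupling of its row and column actuation. A deliberately simplified mixing rule, purely illustrative (the actual taut-cable coupling is not given in the abstract):

```python
def haptel_pressures(row_tensions, col_tensions):
    """Hypothetical mixing rule for an under-actuated m x n array:
    each haptel (i, j) feels the sum of its row and column cable
    tensions. The thesis's real coupling is not reproduced here."""
    return [[r + c for c in col_tensions] for r in row_tensions]

# A 3 x 3 pressure pattern from only 3 + 3 = 6 actuator tensions
grid = haptel_pressures([1.0, 2.0, 3.0], [0.5, 0.0, 1.5])
```

Even this toy rule shows the key trade-off of under-actuation: the 9 haptel pressures are not independent (only 6 degrees of freedom), so only a constrained family of distributed patterns can be rendered.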
40.
Design and control of a teleoperated palpation device for minimally invasive thoracic surgery. Buttafuoco, Angelo. 25 February 2013.
Minimally invasive surgery (MIS) consists in operating through small incisions through which a camera and adapted instruments are inserted. It allows many interventions to be performed with reduced trauma for the patient; one of these is the ablation of peripheral pulmonary nodules.

Nevertheless, the means for detecting nodules during MIS are limited. Because of the lack of direct contact, the surgeon cannot palpate the lung to find invisible lesions, as he would do in classical open surgery. As a result, only clearly visible nodules can currently be treated by MIS.

This work aims at designing, building, and controlling a teleoperated palpation instrument, in order to extend the possibilities of MIS in the thoracic field. Such an instrument is made of a master device, manipulated by an operator, and a slave device that is in contact with the patient and reproduces the task imposed by the master. Adequate control laws between these two parts restore the operator's haptic sensation. The goal is not to build a marketable prototype, but to establish a proof of concept.

The palpation device was designed in collaboration with thoracic surgeons on the basis of a study of the medical gesture. The specifications were deduced through experiments with experienced surgeons from the Erasmus Hospital and the Charleroi Civil Hospital.

A pantograph was built to serve as the master of the palpation tool. The slave is a clamp with 2 degrees of freedom (dof), which can be actuated in compression and shear; compression corresponds to vertical moves of the pantograph and shear to horizontal ones. Force sensors were designed within this project to measure the efforts along these directions, at both the master and the slave side, in order to implement advanced force-feedback control laws and for validation purposes.

Teleoperation control laws providing suitable kinesthetic force feedback for lung palpation were designed and validated through simulations, realized using a realistic model of the lung validated by experienced surgeons. Among the implemented control schemes, the 3-channel scheme, which includes a local force control loop at the master side, is the most efficient for lung palpation. Moreover, the increased efficiency of a 2-dof device with respect to a 1-dof tool was confirmed: a characteristic force profile due to motion in two directions appeared in the compression force tracking, making the lesion easier to locate.

Doctorat en Sciences de l'ingénieur.
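The 3-channel scheme with a local force loop at the master side can be sketched as one master-side control law, with hypothetical gains and sign conventions (not the thesis's exact formulation):

```python
def three_channel_master_force(x_m, x_s, f_m, f_s, kp=100.0, kf=0.8, kl=0.3):
    """Sketch of a 3-channel bilateral law at the master: a position-error
    term, the reflected slave-side force, and a local force loop on the
    measured master-side force. All gains are hypothetical placeholders."""
    position_term = -kp * (x_m - x_s)      # channel 1: position coordination
    reflection_term = -kf * f_s            # channel 2: slave force reflected
    local_loop = -kl * (f_m - f_s)         # channel 3: local force correction
    return position_term + reflection_term + local_loop

# Master 1 cm ahead, 2 N at the operator's hand, 1 N at the lung
r = three_channel_master_force(0.01, 0.0, 2.0, 1.0)
```

The local loop is what distinguishes this scheme from simple force reflection: it corrects the mismatch between the force the operator actually feels and the force measured at the tissue, which is why it improves force tracking during palpation.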