1

Camera-based estimation of needle pose for ultrasound percutaneous procedures

Khosravi, Sara 05 1900
A pose estimation method is proposed for measuring the position and orientation of a biopsy needle. The technique is intended as a touchless needle guide system for guiding percutaneous procedures with 4D ultrasound. A pair of uncalibrated, lightweight USB cameras is used as input. A database is prepared offline, using both the needle line estimated from camera-captured images and the true needle line recorded by an independent tracking device. A nonparametric learning algorithm determines the best-fit model from the database; this model can then be used in real time to estimate the true position of the needle from the camera images alone. Simulation results confirm the feasibility of the method and show how a small, accurately constructed database can provide satisfactory results. In a series of tests with cameras, we achieved an average error of 2.4 mm in position and 2.61° in orientation. The system is also extended to real ultrasound imaging: the two miniature cameras capture images of the needle in air while the ultrasound system captures a volume as the needle moves through the workspace. A new database is created from the estimated 3D position of the needle in the ultrasound volume and the 2D position and orientation of the needle calculated from the camera images. This study achieved an average error of 0.94 mm in position and 3.93° in orientation. / Applied Science, Faculty of / Electrical and Computer Engineering, Department of / Graduate
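The abstract does not name the nonparametric learner, only that a model is fit to an offline database pairing camera-estimated needle lines with ground-truth poses. A minimal sketch of that idea, assuming a k-nearest-neighbour lookup with inverse-distance weighting (the class name, feature layout and value of k are illustrative, not from the thesis):

```python
# Hypothetical sketch: the thesis does not specify the nonparametric learner,
# so k-nearest-neighbour regression over the offline database is assumed here.
import numpy as np

class NeedlePoseEstimator:
    """Maps camera-derived needle-line features to a ground-truth pose via a lookup database."""

    def __init__(self, camera_features, true_poses, k=3):
        # camera_features: (N, d) needle-line parameters estimated from the two camera images
        # true_poses:      (N, 6) position (mm) and orientation (deg) from the reference tracker
        self.camera_features = np.asarray(camera_features, dtype=float)
        self.true_poses = np.asarray(true_poses, dtype=float)
        self.k = k

    def estimate(self, query_features):
        """Return a pose estimate for one new camera observation."""
        q = np.asarray(query_features, dtype=float)
        dists = np.linalg.norm(self.camera_features - q, axis=1)
        nearest = np.argsort(dists)[: self.k]
        # Inverse-distance weighting of the k closest database entries.
        weights = 1.0 / (dists[nearest] + 1e-9)
        return np.average(self.true_poses[nearest], axis=0, weights=weights)
```

At run time the two camera images would be reduced to the same feature vector used to build the database and passed to estimate(), with no camera calibration required.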
2

A Fast and Robust Image-Based Method for tracking Robot-assisted Needle Placement in Real-time MR Images

Janga, Satyanarayana Reddy 15 January 2014
This thesis deals with automatic localization and tracking of surgical tools such as needles in Magnetic Resonance Imaging (MRI). Accurate and precise localization of needles is very important for medical interventions such as biopsy, brachytherapy, anaesthesia and many other needle-based percutaneous interventions. Needle tracking has to be highly precise, because the target may lie adjacent to organs that are sensitive to injury. Moreover, during needle insertion the MRI scan plane must be aligned so that the needle remains in the surgeon's field of view (FOV). Many approaches to needle tracking and automatic MRI scan-plane control have been proposed over the last decade that use external markers, but they cannot account for possible needle bending. A significant amount of work has been done on image-based approaches for needle tracking in Image Guided Therapy (IGT), but the existing approaches for surgical robots under MRI guidance rely purely on imaging information; they overlook the fact that a great deal of useful information (for example, depth of insertion, entry point and angle of insertion) is available from the kinematic model of the robot. The existing approaches also ignore the fact that needle insertion produces a time sequence of images, so information about needle positions in the images seen so far can be used to form an approximate estimate of the needle position in subsequent images. In this thesis we investigate an image-based approach for needle tracking in real-time MR images that leverages the additional information available from the robot's kinematic model, supplementing the acquired images. The proposed approach uses the Standard Hough Transform (SHT) for needle detection in 2D MR images and a Kalman filter for tracking the needle over the sequence of images. We demonstrate experimental validation of the method on real MRI data using a gel phantom and artificially created test images. The results show that the proposed method can track the needle tip position with a root-mean-squared error of 1.5 mm for a straight needle and 2.5 mm for a curved needle.
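The abstract names a Standard Hough Transform detector combined with a Kalman filter over the image sequence, without implementation detail. A minimal sketch of that detect-then-track loop, assuming OpenCV primitives, an 8-bit grayscale MR slice, and a constant-velocity tip model (edge thresholds, noise covariances and the tip-extraction step are illustrative assumptions, not the thesis implementation):

```python
# Illustrative sketch only: couples a Standard Hough Transform line detector with a
# Kalman filter over successive MR slices; all parameter values are assumptions.
import cv2
import numpy as np

def detect_needle_line(mr_slice):
    """Detect the dominant straight line (candidate needle) in one 2D MR slice (8-bit grayscale)."""
    edges = cv2.Canny(mr_slice, 50, 150)                 # edge map; thresholds are illustrative
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 100)   # Standard Hough Transform
    if lines is None:
        return None
    rho, theta = lines[0][0]                              # strongest line in (rho, theta) form
    return rho, theta

def make_tip_tracker():
    """Constant-velocity Kalman filter over the 2D needle-tip state (x, y, vx, vy)."""
    kf = cv2.KalmanFilter(4, 2)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
    kf.processNoiseCov = 1e-3 * np.eye(4, dtype=np.float32)
    kf.measurementNoiseCov = 1e-1 * np.eye(2, dtype=np.float32)
    return kf

def track_tip(kf, measured_tip):
    """One predict/correct cycle; measured_tip is the (x, y) tip found in the current image, or None."""
    prediction = kf.predict()
    if measured_tip is not None:
        kf.correct(np.array([[measured_tip[0]], [measured_tip[1]]], np.float32))
    return prediction[:2].ravel()
```

The prediction step is what lets information from earlier frames constrain the search in the next image, which is the point the abstract makes about treating insertion as a time sequence.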
3

TELEOPERATED MRI-GUIDED PROSTATE NEEDLE PLACEMENT

Seifabadi, Reza 30 May 2013
Most robotic systems reported for MRI-guided prostate interventions use manual needle insertion based on a previously acquired image, which requires withdrawing the patient from the scanner multiple times during the procedure. This makes the intervention longer and more expensive, prolongs the patient's discomfort and, most importantly, reduces accuracy because of the virtually inevitable motion of the target. As a remedy, automated needle placement methods have been proposed, which remove human supervision from the control loop. This thesis presents the development of enabling technologies for human-operated, in-room, master-slave needle placement under real-time MRI guidance, with the patient kept in the scanner and the needle placement under the continuous control of the physician. The feasibility of teleoperated needle insertion was demonstrated by developing a 1-DOF (degree of freedom) MRI-compatible master-slave system, which was integrated with a 4-DOF robot for transperineal prostate biopsy and brachytherapy. An accuracy study was conducted on a robotic system for MRI-guided prostate needle placement; different error sources were identified and quantified. This study concluded that errors occurring during needle insertion contribute most significantly to needle placement error. To compensate for these errors, teleoperated needle steering under real-time MRI guidance was proposed. A 2-DOF piezo-actuated MRI-compatible needle steering module was developed and integrated with the aforementioned 4-DOF transperineal robot, yielding a fully actuated 6-DOF (x, y, z, yaw, pitch, roll) robotic platform for MRI-guided prostate interventions. A novel MRI-compatible master robot was also developed to enable teleoperated needle steering inside the MRI room, together with MRI-compatible controller hardware and software. A novel MRI-compatible force/torque sensor based on Fiber Bragg Gratings was devised for force measurement in the MRI room. Phantom experiments proved the feasibility of teleoperated needle steering under real-time MRI guidance. A system was also developed for real-time 3D shape tracking of a bevel-tip needle using Fiber Bragg Grating sensors embedded along the needle shaft. The needle profile was overlaid on the real-time MR image, yielding real-time navigation with an accuracy better than 0.5 mm. The experimental system is presently being refitted for clinical safety and feasibility trials on real patients. / Thesis (Ph.D, Mechanical and Materials Engineering) -- Queen's University, 2013-05-30 12:26:18.732
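One component described above is real-time needle shape tracking from Fiber Bragg Grating sensors along the shaft. The abstract gives no reconstruction algorithm; a heavily simplified, single-plane sketch, assuming curvature samples have already been derived from the FBG wavelength shifts and that deflection can be recovered by integrating curvature twice from a clamped base (sensor positions, readings, and the small-deflection assumption are all made up for illustration — the thesis reconstructs a full 3D profile):

```python
# Hypothetical sketch: planar needle deflection from curvature samples along the shaft,
# using crude cumulative-sum integration. Not the thesis algorithm.
import numpy as np

def deflection_from_curvature(sensor_s, curvatures, needle_length, n=200):
    """Integrate curvature twice along the shaft to get the lateral deflection profile.

    sensor_s      : arclength positions (mm) of the FBG stations along the shaft
    curvatures    : curvature (1/mm) measured at those positions
    needle_length : total needle length (mm)
    Returns (s, y): arclength samples and lateral deflection, clamped at the base (y = 0, y' = 0).
    """
    s = np.linspace(0.0, needle_length, n)
    kappa = np.interp(s, sensor_s, curvatures)   # curvature interpolated along the whole shaft
    ds = s[1] - s[0]
    slope = np.cumsum(kappa) * ds                # y'(s) = integral of curvature
    y = np.cumsum(slope) * ds                    # y(s)  = integral of slope
    return s, y

# Example with made-up readings from three FBG stations on a 150 mm needle:
s, y = deflection_from_curvature([20.0, 70.0, 120.0], [1e-4, 3e-4, 2e-4], 150.0)
print(f"estimated tip deflection: {y[-1]:.2f} mm")
```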
4

VISUALLY GUIDED ROBOT CONTROL FOR AUTONOMOUS LOW-LEVEL SURGICAL MANIPULATION TASKS

Ozguner, Orhan 28 January 2020
No description available.
