41 |
An evaluation of transvaginal ultrasound in the assessment of endometrial thickness in black South African patients presenting with postmenopausal uterine bleeding. Moodley, Premla, January 2004
Dissertation submitted in full compliance with the requirements for the Master's degree in Technology: Radiography, Durban Institute of Technology, Durban, 2004. / The objective of this study was to use transvaginal ultrasound to evaluate the thickness of the endometrium in order to exclude endometrial abnormality in Black South African women with postmenopausal uterine bleeding. Transvaginal ultrasound is an excellent diagnostic method for assessing endometrial pathology. The study was carried out at the Gynaecological Ultrasound Department, King Edward VIII Hospital, and included 76 Black women with postmenopausal uterine bleeding. The thickness of the endometrium was measured by transvaginal ultrasound; the measurement included both endometrial layers (double-layer technique). The transvaginal ultrasound measurement was compared with the histopathological diagnosis of the biopsy specimens. At the end of the investigation, the findings obtained were 3.9% non-representative, 44.8% endometrial adenocarcinoma, 14.5% benign polyp, 3.9% chronic endometritis, 17.1% benign endometrium, 5.3% endometrial hyperplasia, 9.2% atrophic endometrium, 3.9% myometrial invasion and 1.3% malignant mixed Müllerian tumour. In this study, the thickness of the endometrial echo varied from 5 mm to 35 mm, with a mean of 18.2 mm. When the thickness of the endometrial echo was compared with the histopathological results, the mean value for non-representative cases was 7.83 mm, much lower than the thickness of an active endometrium (13.25 mm). In cases with atrophic endometrium, the thickness ranged from 6 mm to 30 mm with a mean of 15.86 mm. The mean value for cases with endometrial adenocarcinoma was 20.32 mm (range 11 to 35 mm). The sensitivity, specificity and accuracy of transvaginal ultrasound for detecting endometrial malignancy were 100% when a cutoff limit of 4 mm was used.
In conclusion, this study using transvaginal ultrasound showed that an endometrial thickness greater than 8 mm should be regarded as suspicious for malignancy, and no malignant endometrium was thinner than 5 mm. Therefore, in women with postmenopausal uterine bleeding and an endometrial thickness of less than 4 mm, it may be justified not to perform further investigations. Transvaginal ultrasound is a simple, well-tolerated, safe and reliable method for assessing endometrial thickness in postmenopausal Black South African women. / M
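The reported performance at the 4 mm cutoff follows from how any threshold-based screening test is scored against histopathology. A minimal sketch of that scoring, in Python, with made-up measurements and labels (not the study's data):

```python
# Minimal sketch: scoring a thickness cutoff against histopathology.
# The measurements and labels below are illustrative, not study data.
def screen_performance(thickness_mm, is_malignant, cutoff_mm=4.0):
    """Classify 'positive' when endometrial thickness exceeds the cutoff,
    then compute sensitivity and specificity against histopathology."""
    tp = fp = tn = fn = 0
    for t, malignant in zip(thickness_mm, is_malignant):
        positive = t > cutoff_mm
        if positive and malignant:
            tp += 1
        elif positive and not malignant:
            fp += 1
        elif not positive and malignant:
            fn += 1
        else:
            tn += 1
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    specificity = tn / (tn + fp) if tn + fp else float("nan")
    return sensitivity, specificity

# Example with made-up values: every malignancy measured above 4 mm,
# so sensitivity is 1.0 while specificity depends on the benign cases.
print(screen_performance([3.5, 6.0, 11.0, 20.0, 7.0],
                         [False, False, True, True, False]))
```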
42 |
Diagnostic robuste et estimation de défauts à base de modèle Bond Graph / Robust diagnosis and fault estimation using a bond graph approach. Touati, Youcef, 05 December 2012
This work deals with robust bond graph model-based fault diagnosis. Its main objectives are the generation of thresholds that are robust to measurement uncertainties and a systematic procedure for generating fault estimation equations. A threshold-generation procedure based on a graphical representation of the measurement uncertainty was developed and implemented on a real system. The bond graph model in LFT (linear fractional transformation) form was used to generate the fault estimation equations. These equations are used to improve the decision step concerning the isolation of faults having the same signature and to analyse the sensitivity of the residuals to faults affecting sensors, actuators and parameters. The algorithms developed in this work were validated by implementation on a mechatronic system representing a mobile robot, called Robotino.
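The decision logic behind robust thresholds can be summarized independently of the bond graph derivation: a residual is flagged as a fault only when it leaves an envelope whose width reflects the measurement uncertainty. A minimal sketch of that decision step; the scalar uncertainty model and names here are simplifying assumptions, not the thesis's LFT formulation:

```python
import numpy as np

def detect_fault(residual, nominal_sensitivity, meas_uncertainty):
    """Flag samples where the residual exceeds an adaptive threshold built
    from a sensor uncertainty bound (a simplified stand-in for the
    graphically derived thresholds of the bond graph LFT approach)."""
    residual = np.asarray(residual, dtype=float)
    # Envelope half-width: propagate the absolute measurement uncertainty
    # through the residual's sensitivity to the measurement.
    threshold = abs(nominal_sensitivity) * meas_uncertainty
    return np.abs(residual) > threshold

# Example: a residual that drifts outside the +/- 0.3 envelope after sample 5.
r = [0.05, -0.1, 0.02, 0.2, -0.15, 0.6, 0.8, 0.75]
print(detect_fault(r, nominal_sensitivity=1.5, meas_uncertainty=0.2))
```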
43 |
An implantable sensor for disease detection and treatment. Ngoepe, Mpho Phehello, 25 August 2014
Current sensors employed in medicine are used to detect chemical and biochemical abnormalities. Their applications include brain biopsy, enzyme-linked immunosorbent assay (ELISA) of spinal fluid, blood bio-barcode assays, and sweat and urine bio-diagnostics, where the primary focus is the selection of biomarkers that can pinpoint the occurrence of disease. Emerging sensors for cholesterol detection are based on enzymes that degrade these molecules, with the signal visualized optically by means of a transducer. Cholesterol is a steroid metabolite used for the synthesis of steroid hormones and for establishing proper membrane permeability and fluidity. Since cholesterol is insoluble in blood, it is transported in the circulatory system within lipoproteins, complex spherical particles whose exterior comprises amphiphilic proteins and lipids with outward-facing, water-soluble surfaces and inward-facing, lipid-soluble surfaces. Low-density lipoprotein (LDL) is known as 'bad' cholesterol and high-density lipoprotein (HDL) as 'good' cholesterol. LDL is linked to cardiovascular conditions such as atherosclerosis and hypertension, which ultimately lead to coronary heart disease and myocardial and cerebral infarction (stroke). An appropriate therapeutic response to a sensor system for cholesterol, specifically LDL, detection implies the design of an implantable system for stimuli-responsive drug release. The proposed system was designed to detect specific biochemical changes by employing nanoparticles made of glyceryl behenate, polyoxyethylene-polyoxypropylene block copolymer, avidin, biotin and anti-beta-lipoprotein antibodies as sensors. This was achieved by coating the nanoparticles with antibodies specific to the antigen (i.e. LDL) to create antibody-conjugated solid lipid nanoparticles (henceforth 'antibody-conjugated SLNs'). Fenofibrate was used as a model drug owing to its low water solubility and lipophilic properties, similar to statins. The antibody-conjugated SLNs were 150 nm in size and had a zeta potential of -28 mV. Their drug entrapment efficiency was 86%, with a drug release of 16 mg/day governed by Fickian diffusion and an erosion mechanism. The slow release was attributed to the semi-crystalline structure determined by XRD and DSC. An antigen-responsive hydrogel was designed by incorporating the thiolated antibody-conjugated SLNs (via Traut's reagent) with polyethylene glycol diacrylate, methacrylic acid and polyethylene glycol 200. The osmotic pump was designed from polyethylene oxide, ethyl cellulose and mannitol. The drug reservoir was synthesized from an ethyl cellulose-coated gelatin capsule via a coacervation phase-separation method. The polymeric tube, synthesized from ethyl cellulose, methyl cellulose and castor oil, was coated with the antigen-responsive hydrogel. Ex vivo studies evaluating the intravascular stability of the implant, in correlation with mechanical analysis, indicated that the polymeric tube was unstable; an 18-gauge catheter was therefore used to form an infusion tube as a substitute. The implant showed Korsmeyer-Peppas drug release kinetics in both in vivo and in vitro studies. A constant drug release of 881 μg/day was observed in vivo. This contributed to a reduction in total cholesterol through a 30% reduction in LDL sub-fractions, in correlation with enhanced clearance of LDL particles from the plasma due to SLN-LDL uptake. A 46% increase in HDL was observed, consistent with the therapeutic effect of fenofibrate. Pharmacokinetic analysis indicated improved mean residence time and efficacy. These findings indicate that the device could be used for the delivery of lipophilic drugs and the detection of circulating biomarkers.
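The Korsmeyer-Peppas model mentioned for the release kinetics has the form Mt/M∞ = k·t^n and is commonly fitted by linear regression in log-log space. A minimal sketch with made-up release data (not the study's measurements); the cutoff interpretation of the exponent in the comment is the textbook guideline, stated approximately:

```python
import numpy as np

def fit_korsmeyer_peppas(t_hours, fraction_released):
    """Fit log(Mt/Minf) = log(k) + n*log(t); n near 0.43-0.5 (geometry
    dependent) suggests Fickian diffusion, larger n anomalous transport."""
    log_t = np.log(np.asarray(t_hours, dtype=float))
    log_f = np.log(np.asarray(fraction_released, dtype=float))
    n, log_k = np.polyfit(log_t, log_f, 1)   # slope = n, intercept = log(k)
    return np.exp(log_k), n

# Illustrative data only: cumulative fraction released at early time points.
k, n = fit_korsmeyer_peppas([1, 2, 4, 8, 12], [0.10, 0.14, 0.20, 0.28, 0.34])
print(f"k = {k:.3f}, n = {n:.2f}")
```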
44 |
The spectrum of radiological appearances in bronchoscopically proven pneumocystis pneumonia in HIV positive adults: a retrospective analysis from Helen Joseph Hospital. Rubin, Grace, 21 February 2012
M.Med. (Diagnostic Radiology), Faculty of Health Sciences, University of the Witwatersrand, 2011 / Pneumocystis jirovecii pneumonia (PJP) is a significant opportunistic infection in HIV/AIDS. As CD4 counts decrease, so does the specificity of the chest X-ray (CXR). AIM: To determine the proportion of bronchoscopically proven PJP in HIV-infected adults, CD4 counts and CXR signs, and to compare PJP with TB. METHODS: The proportion of bronchoscopically proven PJP and co-infection was determined. The sensitivity and specificity of CXR for the diagnosis of PJP and TB, and the frequency of CXR signs, were determined. RESULTS: PJP was present in 26.6% of cases and co-infection in 19%. The median CD4 count (13 cells/mm3) was significantly lower for PJP patients (p = 0.0089). CXR sensitivity for PJP was 33% and specificity was 100%. Bilateral, multilobar and diffuse disease, bronchopneumonia, nodules and cavitation overlapped between PJP and TB. Unilateral and unilobar disease indicated TB over PJP. Effusions and lymphadenopathy were not seen with PJP. CONCLUSION: PJP makes up a quarter of indeterminate diagnoses in HIV-infected adults. The sensitivity of diagnosis on CXR is low. The CXR diagnosis of TB is made more confidently, but is overcalled. In patients with low CD4 levels, a diagnosis of PJP should be considered as important as TB.
45 |
Assessment of diagnostic modalities in penetrating cardiac trauma for the haemodynamically stable patient. Surridge, Daniel Johnathan David, 10 September 2014
Introduction: One in 11.5 patients with a thoracic wound has cardiac involvement with potentially life-threatening consequences. Cardiac injury must therefore be assumed in every patient with a penetrating chest lesion, even if the patient is haemodynamically stable. A need exists to diagnose or screen for "occult" cardiac injury.
Methods: A retrospective analysis was conducted of patients with a penetrating injury to the chest at the Charlotte Maxeke Johannesburg Academic Hospital Trauma Unit from 1 January 2007 to 30 June 2010. Data were compared between patients with and without cardiac injury. Clinical examination and special investigations were assessed for sensitivity, specificity, and positive and negative predictive values.
Results: Of 7 781 patients with major injuries assessed, 1 591 (20%) had sustained a penetrating injury to the chest. All cardiac injuries were incurred through a precordial wound. Two investigations were found to be both significant and useful. Transthoracic echocardiography (TTE) had a sensitivity of 100% and a specificity of 95%. Serial Troponin T (Trop T) levels showed a peak at 4 hours, and by 6 hours post admission the specificity and negative predictive value were 100%.
Conclusion: Of the investigations examined, TTE was found to have the best results. However, the need for specialised equipment and training makes TTE less practical in a resource-limited environment. Serial Trop T shows a high negative predictive value and is a cost-effective screening test for penetrating cardiac injury.
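The negative predictive value cited for serial Trop T follows from the standard 2x2 screening table. A minimal sketch; the counts below are illustrative placeholders, not the study's data:

```python
def predictive_values(tp, fp, fn, tn):
    """Positive and negative predictive values from a 2x2 screening table.
    NPV = TN / (TN + FN): with zero false negatives, NPV is 100%, which is
    what makes a serial marker useful for ruling out occult cardiac injury."""
    ppv = tp / (tp + fp) if tp + fp else float("nan")
    npv = tn / (tn + fn) if tn + fn else float("nan")
    return ppv, npv

# Illustrative counts only (not study data): no injuries missed at 6 hours.
print(predictive_values(tp=12, fp=9, fn=0, tn=150))  # -> (~0.57, 1.0)
```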
46 |
Interactive 3D GPU-Based Breast Mass Lesion Segmentation Method Based on Level Sets for DCE-MRI Images. Unknown Date
A new method for the segmentation of 3D breast lesions in dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) images, using parallel programming with general purpose
computing on graphics processing units (GPGPUs), is proposed. The method has two main parts: a pre-processing step and a segmentation algorithm. In the pre-processing step, DCE-MRI images
are registered using an intensity-based rigid transformation algorithm based on gradient descent. After the registration, voxels that correspond to breast lesions are enhanced using the Naïve
Bayes machine learning classifier. This classifier is trained to identify four different classes inside breast images: lesion, normal tissue, chest and background. Training is
performed by manually selecting 150 voxels for each of the four classes from images in which breast lesions have been confirmed by an expert in the field. Thirteen attributes obtained from
the kinetic curves of the selected voxels are later used to train the classifier. Finally, the classifier is used to increase the intensity values of voxels labeled as lesions and to
decrease the intensities of all other voxels. The post-processed images are used for volume segmentation of the breast lesions using a level set method based on the active contours
without edges (ACWE) algorithm. The segmentation algorithm is implemented in OpenCL for the GPGPUs to accelerate the original model by parallelizing two main steps of the segmentation
process: the computation of the signed distance function (SDF) and the evolution of the segmented curve. The proposed framework uses OpenGL to display the segmented volume in real time,
allowing the physician to obtain immediate feedback on the current segmentation progress. The proposed implementation of the SDF is compared with an optimal implementation developed in
Matlab and achieves speedups of 25 and 12 for 2D and 3D images, respectively. Moreover, the OpenCL implementation of the segmentation algorithm is compared with an optimal implementation
of the narrow-band ACWE algorithm. Peak speedups of 55 and 40 are obtained for 2D and 3D images, respectively. The segmentation algorithm has been developed as open source software, with
different versions for 2D and 3D images, and can be used in different areas of medical imaging as well as in areas within computer vision, such as tracking, robotics and
navigation. / A Dissertation submitted to the Department of Scientific Computing in partial fulfillment of the requirements for the degree of Doctor of Philosophy. / Fall Semester 2015. / November 2, 2015. / GPU, Level Sets, OpenCL, OpenGL, Segmentation / Includes bibliographical references. / Anke Meyer-Baese, Professor Directing Dissertation; Mark Sussman, University Representative; Gordon Erlebacher, Committee Member; Dennis Slice,
Committee Member; Xiaoqiang Wang, Committee Member.
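The curve-evolution step that the OpenCL kernels parallelize can be written compactly for reference. A minimal NumPy sketch of one explicit update of the active-contours-without-edges (Chan-Vese) level set on a 2D grayscale image; this is a generic reference formulation with illustrative parameter values, not the dissertation's GPU code:

```python
import numpy as np

def acwe_step(phi, img, mu=0.2, lam1=1.0, lam2=1.0, dt=0.5, eps=1.0):
    """One explicit update of the ACWE level set.
    phi: level set function (same shape as img); img: 2D grayscale image."""
    # Smoothed Heaviside/delta of phi select the inside and outside regions.
    H = 0.5 * (1.0 + (2.0 / np.pi) * np.arctan(phi / eps))
    delta = (eps / np.pi) / (eps ** 2 + phi ** 2)
    c1 = (img * H).sum() / (H.sum() + 1e-8)               # mean intensity inside
    c2 = (img * (1 - H)).sum() / ((1 - H).sum() + 1e-8)   # mean intensity outside
    # Curvature term: divergence of the normalized gradient of phi.
    gy, gx = np.gradient(phi)
    norm = np.sqrt(gx ** 2 + gy ** 2) + 1e-8
    curvature = np.gradient(gx / norm, axis=1) + np.gradient(gy / norm, axis=0)
    # Gradient-descent force balancing smoothness against region fit.
    force = mu * curvature - lam1 * (img - c1) ** 2 + lam2 * (img - c2) ** 2
    return phi + dt * delta * force
```

Each voxel's force term is independent of its neighbours apart from the local gradient stencil, which is the kind of per-voxel data parallelism an OpenCL implementation can exploit.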
47 |
Pattern Recognition in Medical Imaging: Supervised Learning of fMRI and MRI Data. Unknown Date
Machine learning algorithms combined with magnetic resonance imaging (MRI) provide promising techniques to overcome the drawbacks of current clinical screening. This study considered resting-state functional magnetic resonance imaging (fMRI), used to assess the level of activity in a patient's brain, and dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI), used to explore the response to neoadjuvant chemotherapy (NAC) in patients with locally advanced breast cancer. In the first project, we considered fMRI of patients before and after they underwent a double-blind smoking-cessation treatment. For the first time, this study aims at developing new theory-driven biomarkers, implemented and evaluated with novel techniques applied to resting-state scans, that can be used for relapse prediction in nicotine-dependent patients and for assessing future treatment efficacy. Two classes of patients were studied: one took the drug N-acetylcysteine and the other took a placebo. Our goal was to classify the patients as treatment or non-treatment based on their fMRI scans, using the image slices of the brain as input variables. We applied different voxel selection schemes and data reduction algorithms to all images, compared several multivariate classifiers and deep learning algorithms, and investigated how the different data reductions affect classification performance. In the second part, we employed multi-parametric magnetic resonance imaging (mpMRI), which combines different morphological and functional MRI parameters such as T2-weighted imaging, dynamic contrast-enhanced (DCE) MRI and diffusion-weighted imaging (DWI) and has emerged as the method of choice for early response assessment to NAC. Although mpMRI is superior to conventional mammography for predicting treatment response and evaluating residual disease, there is still room for improvement. In the past decade, the field of medical image analysis has grown exponentially, with an increasing number of pattern recognition tools and increasing data sizes. These advances have heralded the field of radiomics. Radiomics allows the high-throughput extraction of quantitative features that convert images into mineable data, and the subsequent analysis of these data for improved decision support, with response monitoring during NAC being no exception. In this study, we determined the importance and ranking of the parameters extracted from mpMRI using T2-weighted, DCE and DWI sequences for the prediction of pCR and of patient outcomes with respect to metastases and disease-specific death, employing different machine learning algorithms. / A Dissertation submitted to the Department of Scientific Computing in partial fulfillment of the requirements for the degree of Doctor of Philosophy. / Summer Semester 2018. / July 6, 2018. / Breast Cancer, Data Mining, Machine Learning, Medical Imaging, Neuroimaging / Includes bibliographical references. / Anke Meyer-Baese, Professor Directing Dissertation; Simon Y. Foo, University Representative; Katja Pinker-Domenig, Committee Member; Peter Beerli, Committee Member; Dennis Slice, Committee Member.
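The general workflow described, feature/voxel selection, dimensionality reduction, then a multivariate classifier evaluated by cross-validation, can be sketched with scikit-learn. The feature matrix, labels and parameter choices below are placeholders for illustration, not the study's actual pipeline:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: rows are subjects, columns are voxel/radiomic features;
# labels distinguish treatment (1) from placebo (0).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5000))
y = rng.integers(0, 2, size=40)

# Scale -> reduce dimensionality -> classify, evaluated with 5-fold CV.
clf = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="linear"))
print(cross_val_score(clf, X, y, cv=5).mean())
```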
48 |
Effective dose estimation for U.S. Army soldiers undergoing multiple computed tomography scans. Prins, Robert Dean, January 2011
Diagnosing the severity of blunt trauma injuries is difficult and involves diagnostic radiological scanning. The primary diagnostic radiology modality used for assessing these injuries is computed tomography (CT), which delivers more radiation dose than other diagnostic scanning modalities. Trauma patients are at an increased risk of radiation-induced cancer because of the cumulative dose from multiple scanning procedures. Current methods for estimating effective dose, the quantity used to describe the whole-body health detriment from radiation, involve published conversion coefficients and procedure-specific machine parameters such as the dose-length product, based on the computed tomography dose index and scan length. Other methods include Monte Carlo simulations based on the specific machine geometry and radiation source. Unless the requisite machine information is known, the only means of estimating the effective dose is through generic estimates published by scientific radiation committees, which have a wide range of values. This research addressed a knowledge gap in assigning effective doses from computed tomography when knowledge of machine parameters is either unavailable or incomplete. It involved the development of a new method of estimating the effective dose from CT through regression models that incorporate patient parameters rather than machine-specific parameters. The new method was experimentally verified using two adult anthropomorphic phantoms and optically stimulated luminescent dosimeters, and was then compared against a real patient population undergoing similar computed tomography scanning procedures. Using statistical procedures, the new method was tested for repeatability and bias against the current conversion-coefficient method. The analysis verifies that its estimation ability is in line with recent research indicating that the older conversion-coefficient methods can underestimate the effective dose to the patient by up to 40%. The new method can be used as a retrospective tool for effective dose estimation from CT trauma protocols for a patient population with physical characteristics similar to the U.S. Army Soldier population.
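The contrast between the two estimation routes can be made concrete: the conventional method multiplies the scanner-reported dose-length product (DLP) by a region-specific conversion coefficient k, while the proposed approach regresses effective dose on patient parameters. A minimal sketch; the k value, the choice of patient features and the training numbers are illustrative assumptions, not the dissertation's fitted model:

```python
import numpy as np

def dose_from_dlp(dlp_mGy_cm, k_mSv_per_mGy_cm):
    """Conventional estimate: effective dose ~= k * DLP, where k is a
    published, region-specific conversion coefficient (machine-dependent)."""
    return k_mSv_per_mGy_cm * dlp_mGy_cm

# Patient-parameter regression as a stand-in for the dissertation's model:
# fit effective dose against, e.g., body mass and scan length.
# All training values below are synthetic placeholders.
features = np.array([[70.0, 45.0], [85.0, 50.0], [95.0, 55.0], [60.0, 40.0]])
doses_mSv = np.array([9.5, 12.0, 14.5, 8.0])
X = np.column_stack([np.ones(len(features)), features])
coef, *_ = np.linalg.lstsq(X, doses_mSv, rcond=None)

def dose_from_patient(mass_kg, scan_length_cm):
    """Retrospective estimate using only patient parameters."""
    return coef @ np.array([1.0, mass_kg, scan_length_cm])

print(dose_from_dlp(600.0, 0.015), dose_from_patient(80.0, 48.0))
```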
49 |
Development and Applications of Laminar Optical Tomography for In Vivo Imaging. Burgess, Sean Adam, January 2011
Laminar optical tomography (LOT) is an optical imaging technique capable of making depth-resolved measurements of absorption and fluorescence contrast in scattering tissue. LOT was first demonstrated in 2004 by Hillman et al [1]. The technique combines a non-contact laser scanning geometry, similar to a low magnification confocal microscope, with the imaging principles of diffuse optical tomography (DOT). This thesis describes the development and application of a second generation LOT system, which acquires both fluorescence and multi-wavelength measurements simultaneously and is better suited for in vivo measurements. Chapter 1 begins by reviewing the interactions of light with tissue that form the foundation of optical imaging. A range of related optical imaging techniques and the basic principles of LOT imaging are then described. In Chapter 2, the development of the new LOT imaging system is described including the implementation of a series of interfaces to allow clinical imaging. System performance is then evaluated on a range of imaging phantoms. Chapter 3 describes two in vivo imaging applications explored using the second generation LOT system, first in a clinical setting where skin lesions were imaged, and then in a laboratory setting where LOT imaging was performed on exposed rat cortex. The final chapter provides a brief summary and describes future directions for LOT. LOT has the potential to find applications in medical diagnostics, surgical guidance, and in-situ monitoring owing to its sensitivity to absorption and fluorescence contrast as well as its ability to provide depth sensitive measures. Optical techniques can characterize blood volume and oxygenation, two important biological parameters, through measurements at different wavelengths. Fluorescence measurements, either from autofluorescence or fluorescent dyes, have shown promise for identifying and analyzing lesions in various epithelial tissues including skin [2, 3], colon [4], esophagus [5, 6], oral mucosa [7, 8], and cervix [9]. The desire to capture these types of measurements with LOT motivated much of the work presented here.
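The multi-wavelength point made above rests on a standard step: converting attenuation changes at two wavelengths into oxy- and deoxy-hemoglobin changes via the modified Beer-Lambert law. A minimal sketch; the extinction coefficients and path lengths below are placeholder values for illustration, not calibrated constants (real values come from tabulated hemoglobin spectra and the measurement geometry):

```python
import numpy as np

def unmix_hemoglobin(d_atten, ext_matrix, pathlengths_cm):
    """Solve d_atten[i] = pathlength[i] * (eps_HbO2[i]*dHbO2 + eps_HbR[i]*dHbR)
    for the two chromophore concentration changes (modified Beer-Lambert law)."""
    A = ext_matrix * pathlengths_cm[:, None]  # 2x2 system, one row per wavelength
    return np.linalg.solve(A, d_atten)        # -> [dHbO2, dHbR]

# Placeholder extinction coefficients and assumed differential path lengths.
eps = np.array([[ 800.0, 1500.0],    # wavelength 1: [HbO2, HbR]
                [1200.0,  900.0]])   # wavelength 2: [HbO2, HbR]
paths = np.array([5.0, 4.5])
print(unmix_hemoglobin(np.array([0.012, 0.015]), eps, paths))
```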
50 |
Taming unstable inverse problems: Mathematical routes toward high-resolution medical imaging modalities. Monard, Francois, January 2012
This thesis explores two mathematical routes that make the transition from some severely ill-posed parameter reconstruction problems to better-posed versions of them. The general introduction starts by defining what we mean by an inverse problem and its theoretical analysis, and then provides motivations from the field of medical imaging. The first part consists of the analysis of an inverse problem involving the Boltzmann transport equation, with applications in optical tomography. There we investigate the reconstruction of the spatially dependent part of the scattering kernel from knowledge of angularly averaged outgoing traces of transport solutions and isotropic boundary sources. We study this problem first in the stationary regime, then in the time-harmonic regime. In particular we show, using techniques from functional analysis and stationary phase, that this inverse problem is severely ill-posed in the former setting, whereas it is mildly ill-posed in the latter. In this case, we deduce that making the measurements depend on modulation frequency improves the stability of reconstructions. In the second part, we investigate the inverse problem of reconstructing a tensor-valued conductivity (or diffusion) coefficient in a second-order elliptic partial differential equation from knowledge of internal measurements of power density type. This problem finds applications in the medical imaging modalities of electrical impedance tomography and optical tomography, and the fact that one considers power densities is justified in practice by assuming a coupling of this physical model with ultrasound waves, a coupling assumption that is characteristic of so-called hybrid medical imaging methods. Starting from the famous Calderón problem (i.e. the same parameter reconstruction problem from knowledge of boundary fluxes of solutions), and recalling its lack of injectivity and severe instability, we show how going from Dirichlet-to-Neumann data to the power density operator leads to reconstruction of the full conductivity tensor via explicit inversion formulas. Moreover, such reconstruction algorithms only require the loss of either zero or one derivative from the power density functionals to the unknown, depending on which part of the tensor one wants to reconstruct. The inversion formulas are worked out with the help of linear algebra and differential geometry, in particular calculus with the Euclidean connection. The practical pay-off of such theoretical improvements in injectivity and stability is twofold: (i) the lack of injectivity of Calderón's problem, no longer present when using power density measurements, implies that future medical imaging modalities such as hybrid methods may make anisotropic properties of human tissues more accessible; (ii) the improvements in stability for both problems, in transport and in conductivity, may yield practical improvements in the resolution of images of the reconstructed coefficients.
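For reference, the second problem as described above can be stated compactly (a summary of the setup only, not additional results):

```latex
\[
\nabla \cdot \big(\gamma(x)\,\nabla u(x)\big) = 0 \ \text{in } \Omega,
\qquad u = g \ \text{on } \partial\Omega,
\]
\[
\text{Calder\'on (boundary) data: } \Lambda_\gamma : g \mapsto
\nu\cdot\gamma\,\nabla u\big|_{\partial\Omega},
\qquad
\text{power-density (internal) data: } H(x) = \nabla u(x)\cdot\gamma(x)\,\nabla u(x),
\ x \in \Omega .
\]
```

The move from the boundary map to the internal functional H is what the thesis credits for restoring injectivity and improving stability.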