1

Computer aided analysis of inflammatory muscle disease using magnetic resonance imaging

Jack, James January 2015
Inflammatory muscle disease (myositis) is characterised by inflammation and a gradual increase in muscle weakness. Diagnosis typically requires a range of clinical tests, including magnetic resonance imaging of the thigh muscles to assess disease severity. In the past, severity has been measured by manually counting the number of muscles affected. In this work, a computer-aided analysis of inflammatory muscle disease is presented to help doctors diagnose and monitor the disease. Methods to quantify the level of oedema and fat infiltration from magnetic resonance scans are proposed, and the disease quantities they produce are shown to correlate positively with expert medical opinion. The methods were designed and tested on a database of clinically acquired T1 and STIR sequences and are shown to be robust despite suboptimal image quality. General background information is introduced first, giving an overview of the medical, technical, and theoretical topics necessary to understand the problem domain, followed by a detailed introduction to the physics of magnetic resonance imaging. A review of important literature from similar and related domains is presented, with valuable insights that are used at a later stage. Scans are carefully pre-processed to bring all slices into a common frame of reference before the oedema and fat-infiltration measures are computed. A number of validation tests are performed with re-scanned subjects to indicate the level of repeatability. The disease quantities, together with statistical features from the T1-STIR joint histogram, are used for automatic classification of disease severity, which is shown to be successful on out-of-sample data for both the oedema and fat-infiltration problems.
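As an illustration of the classification step described above, the sketch below computes two statistics from a T1-STIR joint histogram. This is a minimal sketch in Python, not the thesis's pipeline: the arrays are synthetic stand-ins for co-registered slices, and the particular features (joint entropy and channel correlation) are assumed examples of "statistical features from the T1-STIR joint histogram".

```python
import numpy as np

def joint_histogram_features(t1, stir, bins=64):
    """Illustrative features from the joint histogram of co-registered
    T1 and STIR slices (arrays of equal shape, arbitrary intensity range)."""
    # 2D histogram of paired intensities, normalised to a joint probability table
    h, _, _ = np.histogram2d(t1.ravel(), stir.ravel(), bins=bins)
    p = h / h.sum()
    nz = p[p > 0]
    # Joint (Shannon) entropy: low values suggest a tightly clustered intensity pairing
    entropy = -np.sum(nz * np.log2(nz))
    # Correlation between the two channels; oedema and fat infiltration shift
    # mass away from the healthy-muscle cluster in the joint histogram
    corr = np.corrcoef(t1.ravel(), stir.ravel())[0, 1]
    return {"joint_entropy": entropy, "t1_stir_correlation": corr}

# Toy usage with random data standing in for registered slices
rng = np.random.default_rng(0)
t1 = rng.normal(100, 15, (256, 256))
stir = 0.5 * t1 + rng.normal(0, 10, (256, 256))
print(joint_histogram_features(t1, stir))
```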
2

Deep learning-based algorithm improved radiologists’ performance in bone metastases detection on CT

Noguchi, Shunjiro 23 March 2023
Kyoto University / New-system, course-based doctorate / Doctor of Medical Science / Degree No. Kō 24473 / Medical Doctorate No. 4915 / Library shelfmark 新制||医||1062 (University Library) / Kyoto University Graduate School of Medicine, Medical Science major / Examiners: Professor 溝脇 尚志 (chief), Professor 黒田 知宏, Professor 花川 隆 / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Kyoto University / DFAM
3

Automated Target Detection in Diagnostic Ultrasound based on the CLEAN Algorithm

Masoom, Hassan 14 December 2011
In this thesis, we present an algorithm for the automated detection of abnormalities (targets) in ultrasound images. The algorithm uses little a priori information and does not require training data. The proposed scheme is a combination of the CLEAN algorithm, originally proposed for radio astronomy, and constant false alarm rate (CFAR) processing, developed for use in radar systems. Neither of these algorithms appears to have been previously used for target detection in ultrasound images. The CLEAN algorithm identifies areas in the ultrasound image that stand out above a threshold in relation to the background; CFAR techniques allow for an automated and adaptive selection of the threshold. The algorithm was tested on simulated B-mode images. Using a contrast-detail analysis, probability of detection curves indicate that, depending on the contrast, the method has considerable promise for the automated detection of abnormalities with diameters greater than a few millimetres.
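To make the two borrowed ingredients concrete, here is a schematic Python sketch of cell-averaging CFAR thresholding combined with a CLEAN-style peak-subtraction loop. All parameters (guard and training window sizes, the scale factor, the Gaussian "beam" subtracted at each detection) are illustrative assumptions, not values from the thesis.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def ca_cfar_threshold(img, guard=2, train=8, scale=4.0):
    """Cell-averaging CFAR: estimate local background from a training ring
    around each pixel (excluding a guard region) and scale it to a threshold."""
    big = uniform_filter(img, size=2 * (guard + train) + 1)
    small = uniform_filter(img, size=2 * guard + 1)
    n_big = (2 * (guard + train) + 1) ** 2
    n_small = (2 * guard + 1) ** 2
    background = (big * n_big - small * n_small) / (n_big - n_small)
    return scale * background

def clean_detect(img, max_iters=20, beam_sigma=3.0):
    """CLEAN-style loop: repeatedly take the brightest pixel above its CFAR
    threshold, record it, and subtract a Gaussian 'beam' centred on it."""
    residual = img.astype(float).copy()
    detections = []
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    for _ in range(max_iters):
        thresh = ca_cfar_threshold(residual)
        peak = np.unravel_index(np.argmax(residual), residual.shape)
        if residual[peak] <= thresh[peak]:
            break  # nothing left above the adaptive threshold
        detections.append(peak)
        beam = residual[peak] * np.exp(-((yy - peak[0]) ** 2 + (xx - peak[1]) ** 2)
                                       / (2 * beam_sigma ** 2))
        residual -= beam  # remove the detected target's contribution
    return detections

# Toy usage: Rayleigh noise as a crude stand-in for B-mode speckle, one target
rng = np.random.default_rng(1)
speckle = rng.rayleigh(1.0, (128, 128))
speckle[40, 60] += 15.0
print(clean_detect(speckle))
```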
4

Computer Aided Detection of Masses in Breast Tomosynthesis Imaging Using Information Theory Principles

Singh, Swatee 18 September 2008
Breast cancer screening is currently performed by mammography, which is limited by overlying anatomy and dense breast tissue. Computer-aided detection (CADe) systems can serve as a double reader to improve radiologist performance. Tomosynthesis is a limited-angle cone-beam x-ray imaging modality that is currently being investigated to overcome mammography's limitations. CADe systems will play a crucial role in enhancing workflow and performance for breast tomosynthesis.

The purpose of this work was to develop unique CADe algorithms for breast tomosynthesis reconstructed volumes. Unlike traditional CADe algorithms, which rely on segmentation followed by feature extraction, selection, and merging, this dissertation instead adopts information theory principles, which are more robust. Information theory relies entirely on the statistical properties of an image, makes no assumptions about underlying distributions, and is thus advantageous for smaller datasets such as those currently used for all tomosynthesis CADe studies.

The proposed algorithm has two stages: (1) initial candidate generation of suspicious locations and (2) false positive reduction. Images were accrued from 250 human subjects. In the first stage, suspicious locations were isolated in the 25 projection images per subject acquired by the tomosynthesis system. Only these suspicious locations were reconstructed to yield 3D volumes of interest (VOIs). In the second stage, false positive reduction was done in three ways: (1) using only the central slice of the VOI containing the largest cross-section of the mass, (2) using the entire volume, and (3) making decisions on a per-slice basis and then combining those decisions using either a linear discriminant or decision fusion. A 92% sensitivity was achieved by all three approaches, with 4.4 false positives per volume for the first approach, 3.9 for the second, and 2.5 for the slice-by-slice algorithm using decision fusion.

We have therefore developed a novel CADe algorithm for breast tomosynthesis. The technique uses an information theory approach to achieve very high sensitivity for cancer detection while effectively minimizing false positives. / Dissertation
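The third false-positive-reduction approach, per-slice decisions combined by a linear discriminant or decision fusion, might be sketched as follows. The per-slice scores here are synthetic and the 0.5 vote threshold is an assumption; only the fusion mechanics are illustrated, not the dissertation's actual classifiers.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)

# Synthetic per-slice scores: each VOI yields one suspicion score per slice.
# True masses score higher on their central slices; these numbers are made up
# purely to exercise the fusion step.
n_voi, n_slices = 200, 7
labels = rng.integers(0, 2, n_voi)
scores = rng.normal(0.3, 0.15, (n_voi, n_slices))
scores[labels == 1] += np.hanning(n_slices) * 0.4  # bump centred on middle slice

# Fusion 1: majority vote over per-slice hard decisions
votes = (scores > 0.5).sum(axis=1)
majority = (votes > n_slices // 2).astype(int)

# Fusion 2: linear discriminant over the vector of slice scores
lda = LinearDiscriminantAnalysis().fit(scores, labels)
lda_pred = lda.predict(scores)

print("majority-vote accuracy:", (majority == labels).mean())
print("LDA-fusion accuracy:  ", (lda_pred == labels).mean())
```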
5

Computer-Aided Detection of Breast Cancer Using Ultrasound Images

Guo, Yanhui 01 May 2010
Ultrasound imaging suffers from severe speckle noise. We propose a novel approach for speckle reduction using 2D homogeneity and directional average filters. A logarithmic transformation converts the multiplicative speckle into additive noise, and texture information is used to describe the speckle characteristics of the image. A homogeneity value is defined from this texture information, transforming the ultrasound image from the gray-level domain into a homogeneity domain. Where the homogeneity value is high, the region is homogeneous and has little speckle noise; otherwise, the region is non-homogeneous and speckle noise is present. A threshold distinguishing homogeneous regions from speckled ones is obtained from the 2D homogeneity histogram according to the maximum entropy principle. A new directional filter is then convolved with the pixels in non-homogeneous regions to remove noise, and the filtering iterates until the breast ultrasound image is sufficiently homogeneous. Experiments show the proposed method improves denoising and edge-preserving capability.

We also present a novel enhancement algorithm based on fuzzy logic that enhances the fine details of ultrasound image features while avoiding noise amplification and over-enhancement. It takes into account both the fuzzy nature of ultrasound images and the feature regions that are significant in diagnosis. The maximum entropy principle uses the gray-level information to map the image into the fuzzy domain, where edge and textural information is extracted to describe the features of lesions. The contrast ratio is computed and modified using local information, and a defuzzification operation transforms the enhanced image back to the spatial domain. Experimental results confirm high enhancement performance, including fine details of lesions, without over- or under-enhancement.

Finally, because identifying object boundaries in ultrasound images is a difficult task, we present a novel automatic segmentation algorithm based on characteristics of breast tissue and eliminating particle swarm optimization (EPSO) clustering analysis, transforming the segmentation problem into clustering analysis. Mammary gland characteristics in ultrasound images are utilized, and a step-down threshold technique is employed to locate the mammary gland area. Experimental results demonstrate that the proposed approach increases clustering speed and segments the mass from the tissue background with high accuracy.
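A minimal sketch of the despeckling idea, under stated assumptions: local homogeneity is approximated here by an inverted, normalised local standard deviation, and the threshold is a Kapur-style maximum-entropy choice on its histogram. The thesis's exact texture measure, directional filter, and iteration scheme are not reproduced.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def homogeneity_map(img, size=5):
    """Local homogeneity as 1 - normalised local standard deviation:
    flat (speckle-free) regions score near 1, textured regions near 0."""
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img ** 2, size)
    std = np.sqrt(np.maximum(sq_mean - mean ** 2, 0))
    return 1.0 - std / (std.max() + 1e-12)

def max_entropy_threshold(values, bins=256):
    """Kapur-style maximum-entropy threshold on a 1D histogram."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    best_t, best_h = edges[1], -np.inf
    for t in range(1, bins):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 == 0 or p1 == 0:
            continue
        q0, q1 = p[:t] / p0, p[t:] / p1
        h = -np.sum(q0[q0 > 0] * np.log(q0[q0 > 0])) \
            - np.sum(q1[q1 > 0] * np.log(q1[q1 > 0]))
        if h > best_h:
            best_h, best_t = h, edges[t]
    return best_t

# Toy pipeline: the log transform turns multiplicative speckle into additive
# noise, then the homogeneity map and its max-entropy threshold flag speckled
# regions (which the directional filtering would then smooth).
rng = np.random.default_rng(3)
image = rng.rayleigh(1.0, (128, 128)) * 50 + 1  # synthetic speckled image
log_img = np.log(image)
hom = homogeneity_map(log_img)
speckled = hom < max_entropy_threshold(hom.ravel())
print("fraction flagged as speckle:", speckled.mean())
```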
6

Computer-aided detection and classification of microcalcifications in digital breast tomosynthesis

Ho, Pui Shan January 2012
Currently, mammography is the most common imaging technology used in breast screening. Low-dose X-rays are passed through the breast to generate images called mammograms. One type of breast abnormality is a cluster of microcalcifications. Usually, in benign cases, microcalcifications result from the death of fat cells or from secretion by the lobules. In some cases, however, clusters of microcalcifications are indicative of early breast cancer, partly because of secretions by cancer cells or the death of such cells. Due to the different attenuation characteristics of normal breast tissue and microcalcifications, the latter ideally appear as bright white spots, which allows detection and analysis for breast cancer classification. Microcalcification detection is one of the primary foci of screening and has led to the development of computer-aided detection (CAD) systems.

However, a fundamental limitation of mammography is that it gives a 2D view of the tightly compressed 3D breast. The depths of entities within the breast are lost in this imaging process, even though the breast tissue is spread out by the compression force applied to the breast. The superimposition of tissues can occlude cancers, and this has led to the development of digital breast tomosynthesis (DBT). DBT is a three-dimensional imaging technique in which an X-ray tube moves in an arc around the breast, over a limited angular range, producing multiple images that then undergo a reconstruction step to form a three-dimensional volume of the breast. Reconstruction remains the subject of research, however, and small microcalcifications are "smeared" in depth by current algorithms, preventing detailed analysis of the geometry of a cluster.

By using the geometry of the DBT acquisition system, we derive the "epipolar" trajectory of a microcalcification. As a first application of the epipolars, we develop a clustering algorithm that uses the Hough transform to find corresponding points generated from a microcalcification; noise points can also be isolated. In addition, we show how microcalcification projections can be detected adaptively. Epipolar analysis has also led to a novel detection algorithm for DBT using a Bayesian method, which estimates a maximum a posteriori (MAP) labelling in each individual image and subsequently for all projections iteratively. Not only does this algorithm output the binary decision of whether a pixel is a microcalcification, it can also predict the approximate depth of a detected microcalcification in the breast. Based on the epipolar analysis, reconstruction of just a region of interest (ROI), e.g. a microcalcification cluster, is possible, and it is more straightforward than any existing method using reconstruction slices. This potentially enables future classification of breast cancer when more clinical data become available.
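To show why an epipolar trajectory carries depth information, here is a toy 2D acquisition geometry: a source sweeping a limited arc above a flat detector. The source-to-detector distance, angular range, and point positions are assumed values for illustration, not the thesis's system parameters.

```python
import numpy as np

def projected_position(x, z, source_lateral, source_height):
    """Where a point at lateral position x, height z above the detector
    projects, for an X-ray source at (source_lateral, source_height)."""
    t = source_height / (source_height - z)  # ray scaling to the detector plane
    return source_lateral + t * (x - source_lateral)

# Assumed geometry: the source's lateral offset at angle theta is H*tan(theta).
H = 650.0                                      # source-to-detector distance (mm)
angles = np.deg2rad(np.linspace(-25, 25, 25))  # 25 projections over a ±25° arc
s = H * np.tan(angles)

# Two microcalcifications at the same lateral position but different depths
for x, z in [(10.0, 20.0), (10.0, 60.0)]:
    u = projected_position(x, z, s, H)
    # The trajectory u(s) is linear in s with slope -z/(H - z): depth is
    # recoverable from how fast the projection drifts across the views.
    slope = np.polyfit(s, u, 1)[0]
    print(f"depth {z} mm -> trajectory slope {slope:.4f}, "
          f"implied depth {H * (-slope) / (1 - slope):.1f} mm")
```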
7

Resampling Evaluation of Signal Detection and Classification : With Special Reference to Breast Cancer, Computer-Aided Detection and the Free-Response Approach

Bornefalk Hermansson, Anna January 2007
The first part of this thesis is concerned with trend modelling of breast cancer mortality rates. By using an age-period-cohort model, the relative contributions of period and cohort effects are evaluated once the unquestionable existence of the age effect is controlled for. The result of such a modelling gives indications in the search for explanatory factors. While this type of modelling is usually performed with 5-year period intervals, the use of 1-year period data, as in Paper I, may be more appropriate.

The main theme of the thesis is the evaluation of the ability to detect signals in x-ray images of breasts. Early detection is the most important tool to achieve a reduction in breast cancer mortality rates, and computer-aided detection systems can be an aid for the radiologist in the diagnosing process.

The evaluation of computer-aided detection systems includes the estimation of distributions. One way of obtaining estimates of distributions when no assumptions are at hand is kernel density estimation, or the adaptive version thereof that smoothes to a greater extent in the tails of the distribution, thereby reducing spurious effects caused by outliers. The technique is described in the context of econometrics in Paper II and then applied together with the bootstrap in the breast cancer research area in Papers III-V.

Here, estimates of the sampling distributions of different parameters are used in a new model for free-response receiver operating characteristic (FROC) curve analysis. Compared to earlier work in the field, this model benefits from the advantage of not assuming independence of detections in the images, and in particular, from the incorporation of the sampling distribution of the system's operating point.

Confidence intervals obtained from the proposed model, with different approaches to the estimation of the distributions and the confidence interval extraction methods, are compared in terms of coverage and length of the intervals by simulations of lifelike data.
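A small sketch of the resampling idea described above: a case-level percentile bootstrap of an operating-point statistic, with a kernel density estimate of the bootstrap distribution (the adaptive, tail-smoothing variant is only noted in a comment). The data are synthetic and the beta-distributed case sensitivities are an assumption, not the thesis's model.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(4)

# Synthetic case-level sensitivities standing in for FROC operating-point data
sensitivity_per_case = rng.beta(8, 2, size=120)

# Nonparametric bootstrap of the mean sensitivity: resample whole cases with
# replacement, so that correlated detections within a case stay together
# (mirroring the model's refusal to assume independent detections per image).
boot = np.array([
    rng.choice(sensitivity_per_case, size=sensitivity_per_case.size,
               replace=True).mean()
    for _ in range(5000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"mean sensitivity {sensitivity_per_case.mean():.3f}, "
      f"95% percentile-bootstrap CI ({lo:.3f}, {hi:.3f})")

# A kernel density estimate of the bootstrap distribution; an adaptive variant
# would widen the bandwidth in the tails to damp the influence of outliers.
kde = gaussian_kde(boot)
print("density at the point estimate:", kde(sensitivity_per_case.mean())[0])
```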
8

Semi-automated search for abnormalities in mammographic X-ray images

Barnett, Michael Gordon 24 October 2006
Breast cancer is the most commonly diagnosed cancer among Canadian women; x-ray mammography is the leading screening technique for early detection. This work introduces a semi-automated technique for analyzing mammographic x-ray images to measure their degree of suspiciousness for containing abnormalities. The designed system applies the discrete wavelet transform to parse the images and extracts statistical features that characterize an image's content, such as the mean intensity and the skewness of the intensity. A naïve Bayesian classifier uses these features to classify the images, achieving sensitivities as high as 99.5% for a data set containing 1714 images. To generate confidence levels, multiple classifiers are combined in three possible ways: a sequential series of classifiers, a vote-taking scheme of classifiers, and a network of classifiers tuned to detect particular types of abnormalities. The third method offers sensitivities of 99.85% or higher with specificities above 60%, making it an ideal candidate for pre-screening images. Two confidence level measures are developed: first, a real confidence level measures the true probability that an image was suspicious; and second, a normalized confidence level assumes that normal and suspicious images were equally likely to occur. The second confidence measure allows for more flexibility and could be combined with other factors, such as patient age and family history, to give a better true confidence level than assuming a uniform incidence rate. The system achieves sensitivities exceeding those in other current approaches while maintaining reasonable specificity, especially for the sequential series of classifiers and for the network of tuned classifiers.
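A minimal sketch of the feature-and-classifier pairing described above, using PyWavelets and scikit-learn: subband means and skewnesses feed a Gaussian naive Bayes classifier. The wavelet choice (db4), decomposition level, and toy images are assumptions; the thesis's exact feature set and classifier configuration are not reproduced here.

```python
import numpy as np
import pywt
from scipy.stats import skew
from sklearn.naive_bayes import GaussianNB

def wavelet_features(img, wavelet="db4", level=3):
    """Mean intensity and skewness of the image and of each wavelet detail
    subband: the kind of statistical descriptors the abstract mentions."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    feats = [img.mean(), skew(img.ravel())]
    for detail in coeffs[1:]:            # (cH, cV, cD) triple per level
        for band in detail:
            feats += [np.abs(band).mean(), skew(band.ravel())]
    return np.array(feats)

# Toy data: 'suspicious' images get a bright blob added to random texture
rng = np.random.default_rng(5)
X, y = [], []
for _ in range(100):
    img = rng.normal(0, 1, (64, 64))
    label = int(rng.integers(0, 2))
    if label:
        img[24:40, 24:40] += 2.0         # crude stand-in for an abnormality
    X.append(wavelet_features(img))
    y.append(label)

clf = GaussianNB().fit(X[:80], y[:80])
print("held-out accuracy:", clf.score(X[80:], y[80:]))
```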
