Automatic usability assessment of CR images using deep learning

Wårdemark, Erik; Unell, Olle. January 2024.
Computed Radiography (CR) exams are rarely performed by the same physicians who interpret the images. If an image does not help the physician diagnose the patient, the interpreting physician can reject it. Rejection normally happens after the patient has already left the hospital, so the patient must return to retake the exam, causing unnecessary work for both physicians and patients. To address this problem, we explored deep learning algorithms that automatically analyze the images and distinguish usable from unusable ones. The algorithms include convolutional neural networks, vision transformers, and fusion networks that combine different types of data. In total, seven architectures were used to train 42 models. The models were trained on a dataset of 61 127 DICOM files containing images and metadata collected in a clinical setting and labeled according to whether the images were deemed usable there. The complete dataset was used to train generalized models, and subsets containing specific body parts were used to train specialized models. Three architectures performed classification using images only: two with a ResNet-50 backbone and one with a ViT-B/16 backbone, yielding 15 specialized models and three generalized models. Four architectures implementing joint fusion produced 20 specialized models and four generalized models; two used a ResNet-50 backbone and two a ViT-B/16 backbone. For each backbone, two types of joint fusion with different structures, type I and type II, were implemented. The two modalities were the images and the metadata from the DICOM files. The best image-only model had a ViT-B/16 backbone and was trained on a specialized dataset of hands and feet, reaching an AUC of 0.842 and an MCC of 0.545. The two fusion models trained on the same dataset reached AUCs of 0.843 and 0.834 and MCCs of 0.547 and 0.546, respectively. We conclude that automatic rejection with deep learning models is possible, although the results of this study are not yet good enough for clinical use. Models with a ViT-B/16 backbone consistently outperformed those with ResNet-50. Generalized and specialized models performed equally well in most cases, except on the smaller subsets of the full dataset. Utilizing metadata from the DICOM files did not improve performance compared with image-only models.
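The abstract reports model quality as AUC and MCC scores. As a minimal illustration of the latter metric (the counts below are invented for the example, not taken from the thesis), the Matthews correlation coefficient can be computed directly from confusion-matrix counts:

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from confusion-matrix counts.

    Returns a value in [-1, 1]; by convention 0.0 when any marginal
    sum is zero and the coefficient is undefined.
    """
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    if denom == 0:
        return 0.0
    return (tp * tn - fp * fn) / denom

# Illustrative counts for a classifier flagging unusable images:
# 80 true rejections, 800 correct accepts, 50 false alarms, 20 misses.
print(round(mcc(tp=80, tn=800, fp=50, fn=20), 3))  # → 0.662
```

Unlike accuracy, MCC stays informative under the class imbalance typical of rejection data, which is presumably why it is reported alongside AUC here.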

High Performance Computing for Detection of Gamma Rays

Aubert, Pierre. 04 October 2018.
The new generation of physics experiments will produce an unprecedented amount of data, adding to the continuously increasing output of current experiments. This growth in data rate causes upheavals at every level: data storage, analysis, dissemination, and preservation. The CTA project will be the largest ground-based gamma-ray astronomy observatory from 2021. It will produce several hundred petabytes of data by 2030, which will have to be stored, compressed, analyzed, and reanalyzed every year. This work shows how to optimize such physics analyses with high-performance computing techniques: an efficient data-format generator, low-level optimization of CPU pipeline usage and vectorization of existing algorithms, a fast integer compression algorithm, and finally a new data analysis based on an optimized image-comparison method.
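The thesis mentions a fast integer compression algorithm without detailing it in the abstract. As a hedged sketch of the general idea (this is plain variable-byte encoding, not the thesis's own scheme), small unsigned integers can be stored in fewer bytes by using seven payload bits per byte with a continuation flag:

```python
def varint_encode(values):
    """Variable-byte encoding: 7 payload bits per byte, high bit set
    on all bytes except the last one of each value."""
    out = bytearray()
    for v in values:
        if v < 0:
            raise ValueError("unsigned integers only")
        while v >= 0x80:
            out.append((v & 0x7F) | 0x80)  # low 7 bits, continuation flag
            v >>= 7
        out.append(v)  # final byte, high bit clear
    return bytes(out)

def varint_decode(data):
    """Inverse of varint_encode: reassemble 7-bit groups, LSB first."""
    values, v, shift = [], 0, 0
    for byte in data:
        v |= (byte & 0x7F) << shift
        if byte & 0x80:
            shift += 7          # more bytes follow for this value
        else:
            values.append(v)    # value complete
            v, shift = 0, 0
    return values

samples = [3, 200, 70000]
encoded = varint_encode(samples)
assert varint_decode(encoded) == samples
print(len(encoded))  # → 6 bytes, versus 12 for three 32-bit integers
```

Real high-throughput codecs of the kind the thesis targets typically go further, vectorizing the bit packing with SIMD instructions; the sketch only shows the space-saving principle.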
