
Dose optimization to minimize radiation risk with acceptable image quality

Image quality has been found to be positively correlated with diagnostic accuracy. Radiologists aim for the highest-quality image possible to determine the location of the suspected pathology. However, the most effective way of producing high-quality images is to increase the radiation dose to the patient. To avoid the many risks that come with radiation, patients want to keep the dose as low as possible. Diagnostic instruments are constantly being re-engineered and optimized to keep image quality high and radiation dose low. Patients who wish to avoid ionizing radiation exposure must turn to alternative non-ionizing or low-radiation modalities. The three most important metrics of image quality are spatial resolution, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) [1]. Radiologists and imaging technicians can do very little to improve spatial resolution, and improving the CNR requires a higher dose to increase the value of every pixel. The SNR, however, offers room for dose efficiency: the dose can be reduced by 50% while dropping the SNR by only about 30% [2]. To simulate a lower dose, data is randomly removed during image reconstruction until the lowest acceptable SNR value is reached. Broadly, this approach can reduce the dose required for an acceptable SNR in any modality that involves ionizing radiation and image reconstruction, lowering the risk for every imaged patient.
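The roughly 30% SNR drop at half dose follows from photon statistics: in a photon-limited acquisition the SNR scales with the square root of the dose, so halving the dose multiplies the SNR by 1/sqrt(2), about a 29% reduction. The sketch below illustrates this, and the idea of simulating a lower dose by randomly discarding acquired data, on a toy detector model. It is a minimal illustration, not the thesis's actual reconstruction pipeline; the Poisson photon model and the 50% binomial thinning are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def snr(signal):
    """SNR estimated as mean over standard deviation of the readings."""
    return signal.mean() / signal.std()

# Toy detector: a flat region imaged at a mean of 1000 photons per pixel,
# with Poisson photon (shot) noise. For Poisson data, SNR = sqrt(mean dose).
full_dose = rng.poisson(1000, size=100_000)

# Simulate a half-dose scan from the full-dose data by randomly discarding
# each detected photon with probability 0.5 (binomial thinning of the
# Poisson counts yields a Poisson variable at half the mean).
half_dose = rng.binomial(full_dose, 0.5)

snr_full = snr(full_dose.astype(float))   # ~ sqrt(1000) ~ 31.6
snr_half = snr(half_dose.astype(float))   # ~ sqrt(500)  ~ 22.4
drop = 1 - snr_half / snr_full            # ~ 0.29, i.e. about a 30% SNR loss

print(f"full-dose SNR: {snr_full:.1f}")
print(f"half-dose SNR: {snr_half:.1f}")
print(f"SNR drop at 50% dose: {drop:.0%}")
```

The thinning step is the key idea: rather than rescanning the patient at a lower dose, existing raw data is subsampled and the image re-evaluated, so the acceptable-SNR operating point can be found without additional exposure.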

Identifier: oai:union.ndltd.org:bu.edu/oai:open.bu.edu:2144/43437
Date: 20 November 2021
Creators: Ji, Chuncheng
Contributors: Thomas, Kevin
Source Sets: Boston University
Language: en_US
Detected Language: English
Type: Thesis/Dissertation
Rights: Attribution 4.0 International, http://creativecommons.org/licenses/by/4.0/