191
Adaptive Image Quality Improvement with Bayesian Classification for In-line Monitoring
Yan, Shuo, 01 August 2008
Development of an automated method for classifying digital images using a combination of image quality modification and Bayesian classification is the subject of this thesis. The specific example is the classification of images obtained by monitoring molten plastic in an extruder. These images were to be classified into two groups: the “with particle” (WP) group, which showed contaminant particles, and the “without particle” (WO) group, which did not. Previous work performed the classification using only an adaptive Bayesian model; this work combines adaptive image quality modification with the adaptive Bayesian model. The first objective was to develop an off-line automated method for determining how to modify each individual raw image to obtain the quality required for improved classification results. This was done in a novel way by defining image quality in terms of probability using a Bayesian classification model. The Nelder-Mead simplex method was then used to optimize that quality. The result was a “Reference Image Database”, which was used as a basis for accomplishing the second objective: to develop an in-line method for modifying the quality of new images to improve classification beyond what could be obtained previously. Case-based reasoning used the Reference Image Database to locate reference images similar to each new image, and the database supplied instructions on how to modify the new image to obtain a better-quality image. Experimental verification of the method used a variety of images from the extruder monitor, including images purposefully produced to be of wide diversity. Image quality modification was made adaptive by adding new images to the Reference Image Database. When combined with the adaptive classification employed previously, error rates decreased from about 10% to less than 1% for most images. For one unusually difficult set of images, which exhibited very low local contrast of particles against their background, it was necessary to split the Reference Image Database into two parts on the basis of a critical value for local contrast. The end result of this work is a powerful, flexible and general method for improving the classification of digital images that utilizes both image quality modification and classification modeling.
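As a hedged illustration of the core idea, the sketch below (Python) treats image quality as the posterior probability returned by an already-fitted Bayesian classifier (any scikit-learn-style model exposing predict_proba, e.g. a Gaussian naive Bayes) and tunes enhancement parameters with the Nelder-Mead simplex method; the gamma/contrast enhancement and the two image features are placeholders, not the operations or features used in the thesis.

```python
import numpy as np
from scipy.optimize import minimize

def enhance(image, params):
    # Placeholder quality modification: gamma correction followed by a contrast stretch.
    gamma, gain = params
    out = np.clip(image, 0.0, 1.0) ** max(gamma, 1e-3)
    return np.clip(0.5 + gain * (out - 0.5), 0.0, 1.0)

def posterior_wp(image, classifier):
    # Posterior probability of the "with particle" class; the two features are placeholders.
    features = np.array([[image.mean(), image.std()]])
    return classifier.predict_proba(features)[0, 1]

def optimise_quality(image, classifier, is_wp):
    # Define "quality" as the posterior of the image's true class and maximise it
    # over the enhancement parameters with the Nelder-Mead simplex method.
    sign = -1.0 if is_wp else 1.0
    result = minimize(lambda p: sign * posterior_wp(enhance(image, p), classifier),
                      x0=[1.0, 1.0], method="Nelder-Mead")
    return result.x, enhance(image, result.x)
```

Storing the optimised parameters alongside each reference image is, in spirit, what a Reference Image Database does, so that case-based reasoning can reuse them for similar new images.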
192
Synthetic test patterns and compression artefact distortion metrics for image codecs : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Engineering at Massey University, Palmerston North, New Zealand
Punchihewa, Amal, January 2009
This thesis presents a framework and test methodology for assessing the spatial-domain compression artefacts produced by image and intra-frame coded video codecs. Few researchers have studied this broad range of artefacts. A taxonomy of image and video compression artefacts is proposed, based on the point of origin of each artefact in the image communication model. The thesis presents an objective evaluation, made using synthetic test patterns, of the distortions, known as artefacts, introduced by image and intra-frame coded video compression. The American National Standards Institute document ANSI T1.801 qualitatively defines the blockiness, blur and ringing artefacts; these definitions have been augmented with quantitative definitions in conjunction with the proposed test patterns. A test and measurement environment is proposed in which the codec under test is exercised using a portfolio of test patterns designed to highlight the artefact under study. Algorithms have been developed to detect and measure individual artefacts based on the characteristics of the respective artefacts. Since the spatial contents of the original test patterns form known structural details, the artefact distortion metrics based on those characteristics are clean and swift to calculate. The distortion metrics are validated against a modern image quality metric inspired by the human visual system. Blockiness, blur and ringing artefacts are evaluated for representative codecs using the proposed synthetic test patterns. Colour bleeding due to image and video compression is also discussed, and both qualitative and quantitative definitions of the colour bleeding artefact are introduced. The image reproduction performance of a few codecs was evaluated to ascertain the utility of the proposed metrics and test patterns.
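The thesis defines its metrics against the proposed synthetic test patterns, which are not reproduced here; as a rough sketch of how a blockiness score can be computed, the function below compares luminance jumps across assumed 8x8 coding-block boundaries with jumps everywhere else, so an artefact-free image scores close to 1 and a blocky one scores higher.

```python
import numpy as np

def blockiness(img, block=8):
    # Toy blockiness score: mean absolute luminance jump across block boundaries,
    # normalised by the mean jump at non-boundary positions (image assumed larger
    # than one block in each direction).
    img = np.asarray(img, dtype=float)
    dh = np.abs(np.diff(img, axis=1))   # horizontal neighbour differences
    dv = np.abs(np.diff(img, axis=0))   # vertical neighbour differences
    cols = np.arange(dh.shape[1])
    rows = np.arange(dv.shape[0])
    h_ratio = dh[:, (cols + 1) % block == 0].mean() / (dh[:, (cols + 1) % block != 0].mean() + 1e-12)
    v_ratio = dv[(rows + 1) % block == 0, :].mean() / (dv[(rows + 1) % block != 0, :].mean() + 1e-12)
    return 0.5 * (h_ratio + v_ratio)
```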
193
Vliv korekce prostorové rozlišovací schopnosti kolimátoru na kvalitu obrazu v perfuzní scintigrafii myokardu / Effect of resolution recovery on image quality in myocardial perfusion SPECT
Skibová, Daniela, January 2013
Objectives: Resolution recovery algorithms (IR-RR) were recently proposed as tools to improve the quality of SPECT images through better resolution. The aim of this study was to investigate the effect of IR-RR on myocardial perfusion SPECT studies. Methods: Phantom and clinical studies were performed using a SPECT-CT Infinia/Hawkeye system (GE Healthcare). A NEMA triple-line phantom was scanned according to NEMA procedures. Cold-sphere and cardiac phantoms were scanned under clinical conditions (90°-angled detectors, 60 views and a circular orbit) and reconstructed via IR-RR (Evolution for Cardiac, GE, 12 iterations and 10 subsets), OSEM (ordered-subset expectation maximization, 2 iterations and 10 subsets) and FBP (filtered back projection). IR-RR and OSEM reconstructions were used with and without attenuation and scatter correction (ACSC). The effect of post-reconstruction filtering was evaluated. In the clinical studies two data sets were used (half-time and full-time). The conventional full-time scan (20 s per stress projection and 25 s per rest projection) was reconstructed via FBP and IR-RR; the half-time scan (10 s stress, 12 s rest) was reconstructed via IR-RR. End-diastolic volume (EDV), end-systolic volume (ESV) and left ventricular ejection fraction (EF) were calculated using two software...
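For context, the iterative reconstructions above are variants of the same expectation-maximization idea; the sketch below shows a generic ordered-subset EM (OSEM) update for a linear Poisson projection model. It is an illustration only: the system matrix, subset grouping and iteration counts are assumptions, and it is not the vendor's proprietary resolution-recovery (Evolution for Cardiac) implementation.

```python
import numpy as np

def osem(y, A, n_iter=2, n_subsets=10, eps=1e-9):
    # Generic OSEM for projection data y ~ Poisson(A @ x); illustration only.
    n_bins, n_vox = A.shape
    x = np.ones(n_vox)
    subsets = [np.arange(s, n_bins, n_subsets) for s in range(n_subsets)]
    for _ in range(n_iter):
        for sub in subsets:
            As = A[sub]                         # projection bins in this subset
            ratio = y[sub] / (As @ x + eps)     # measured / currently estimated projections
            x *= (As.T @ ratio) / (As.sum(axis=0) + eps)   # multiplicative EM update
    return x
```

Resolution recovery enters by modelling the distance-dependent collimator-detector response inside the system matrix A, which is what distinguishes IR-RR from plain OSEM.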
194
Reconstrução espectral de tubos de radiação odontológicos usando a transformada inversa de Laplace da curva de atenuação / Spectral Reconstruction of Dental X-Ray Tubes Using the Inverse Laplace Transform of the Attenuation Curve
Alex Malezan, 27 June 2013
In the study of radiographic images, the parameters related to subject contrast (SC), signal-to-noise ratio (SNR) and dose are linked to the shape of the X-ray spectrum used, and knowledge of the spectrum makes it possible to predict and optimize image quality. In this work a methodology was developed to obtain the spectrum of dental X-ray tubes in clinical use in an indirect way. The methodology is based on a mathematical model that uses the inverse Laplace transform of the beam transmission curve to generate data on the spectral distribution of the beam. With the aid of an ionization chamber and high-purity aluminum filters, transmission curves were measured for 8 commercially available X-ray tubes. The method was validated by direct spectrometry with a cadmium telluride (CdTe) detector, whose response was determined by Monte Carlo (MC) simulation. From the reconstructed spectra, studies were carried out on the image quality parameters SNR and subject contrast (SC), and on the entrance skin KERMA. The performance of the tubes was evaluated from the relationship between SNR and entrance skin KERMA. The results show that it is possible to determine the spectral distribution of dental X-ray tubes using the proposed method. The relationship between SNR and entrance skin KERMA suggests that tubes emitting low-energy photons have low output.
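A sketch of the relation the method rests on, assuming a detector signal proportional to the transmitted fluence (any detector-response weighting can be absorbed into the normalised spectrum): for aluminum filters of thickness t,

```latex
T(t) \;=\; \frac{\int_0^{E_{\max}} \Phi(E)\, e^{-\mu_{\mathrm{Al}}(E)\,t}\, \mathrm{d}E}
               {\int_0^{E_{\max}} \Phi(E)\, \mathrm{d}E}
     \;=\; \int \varphi(\mu)\, e^{-\mu t}\, \mathrm{d}\mu
     \;=\; \mathcal{L}\{\varphi\}(t),
\qquad
\varphi(\mu) \;=\; \mathcal{L}^{-1}\{T\}(\mu),
```

where φ(μ) is the normalised spectrum rewritten as a function of the aluminum attenuation coefficient. Since μ_Al(E) is monotonic over the diagnostic energy range, inverting this change of variable recovers Φ(E) from the numerically inverted Laplace transform of the measured transmission curve.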
195
Estudo experimental da otimização em sistemas de mamografia digital CR e DR / Experimental study of optimization in CR and DR digital mammography systems
Alessandra Maia Marques Martinez Perez, 29 January 2015
The recent introduction and rapid advance of digital mammography in Brazil as a breast cancer screening tool, together with evidence that its optimization conditions differ from those of conventional (screen-film) mammography, require new quality parameters to be included and studied and the optimization conditions to be revisited. The objective of this work was to determine the optimized radiographic technique for two detection systems (CR and DR) in use on three mammography units: Mammomat 3000 Nova (Siemens), Senographe DMR (GE) and Senographe 2000D (GE). Optimization was conducted for a variety of combinations of technique factors and breast phantom configurations, such as kilovoltage settings (26 to 32 kV), target/filter combinations (Mo/Mo, Mo/Rh and Rh/Rh), breast-equivalent material of various thicknesses (2 to 8 cm) and simulated mass and calcification lesions, using a figure of merit (FOM) as the optimization parameter. Using the anode/filter combination that generates the most energetic spectrum on each unit gave the highest FOM values for all voltages and phantom thicknesses, owing to the resulting dose reduction. These combinations were Mo/Rh for the Siemens unit and Rh/Rh for both GE units, corresponding to the hardest spectra available on each machine. It was also observed that the kV that maximizes the FOM tends to increase with phantom thickness.
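The abstract does not state the exact figure of merit; a commonly used form in mammography optimization, and one consistent with harder spectra winning through dose reduction, is

```latex
\mathrm{FOM} \;=\; \frac{\mathrm{SNR}^{2}}{D},
```

where D is a dose-related quantity such as the mean glandular dose (some studies use the contrast-to-noise ratio in place of the SNR). Because SNR² and dose both scale roughly linearly with tube loading, the ratio characterises the spectral quality of the technique largely independently of the mAs chosen.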
196
Quantifying image quality in diagnostic radiology using simulation of the imaging system and model observers
Ullman, Gustaf, January 2008
Accurate measures of both clinical image quality and patient radiation risk are needed for successful optimisation of medical imaging with ionising radiation. Optimisation in diagnostic radiology means finding the image acquisition technique that maximises the perceived information content and minimises the radiation risk, or keeps it at a reasonably low level. The assessment of image quality depends on the diagnostic task and may be hampered not only by system and quantum noise but also by overlying projected anatomy. The main objective of this thesis is to develop methods for the assessment of image quality in simulations of projection radiography. Image quality is quantified by modelling the whole x-ray imaging system, including the x-ray tube, patient, anti-scatter device, image detector and the observer. This is accomplished using Monte Carlo (MC) simulation methods that allow simultaneous estimates of image quality and patient dose. Measures of image quality include the signal-to-noise ratio (SNR) of pathologic lesions, and radiation risk is estimated by using organ doses to calculate the effective dose. Based on high-resolution anthropomorphic phantoms, synthetic radiographs were calculated and used for assessing image quality with model observers (the Laguerre-Gauss (LG) Hotelling observer) that mimic real, human observers. Breast and especially chest imaging were selected as study cases, as these are particularly challenging for radiologists. In chest imaging, the optimal tube voltage for detecting lung lesions was investigated in terms of the lesions' SNR and their contrast relative to the ribs. It was found that the choice of tube voltage depends on whether the SNR of the lesion or the interfering projected anatomy (i.e. the ribs) is most important for detection. The LG Hotelling observer is influenced by the projected anatomical background and includes it in its figure of merit, SNRhot,LG; it was found to be a better model of the radiologist than the ideal observer, which only includes quantum noise in its analysis. The measures of image quality derived from our model are found to correlate relatively well with the radiologists' assessment of image quality. MC simulations can therefore be a valuable and efficient tool in the search for dose-efficient imaging systems and image acquisition schemes.
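As a hedged sketch of how such a figure of merit can be estimated (the number of channels, channel width and use of sample covariances below are illustrative assumptions, not the settings used in the thesis), the Laguerre-Gauss channelized Hotelling SNR can be computed from two stacks of simulated images, with and without the lesion:

```python
import numpy as np
from scipy.special import eval_laguerre

def lg_channels(npix, n_channels, a):
    # Rotationally symmetric Laguerre-Gauss channel profiles on an npix x npix grid.
    y, x = np.mgrid[:npix, :npix] - (npix - 1) / 2.0
    r2 = x**2 + y**2
    chans = []
    for j in range(n_channels):
        u = np.exp(-np.pi * r2 / a**2) * eval_laguerre(j, 2.0 * np.pi * r2 / a**2)
        chans.append(u.ravel() / np.linalg.norm(u))
    return np.stack(chans, axis=1)              # shape (npix*npix, n_channels)

def snr_hotelling_lg(imgs_signal, imgs_background, n_channels=5, a=10.0):
    # Channelized Hotelling observer SNR; needs many more images per class than
    # channels for a stable covariance estimate.
    npix = imgs_signal.shape[1]
    U = lg_channels(npix, n_channels, a)
    vs = imgs_signal.reshape(len(imgs_signal), -1) @ U       # channel outputs, lesion present
    vb = imgs_background.reshape(len(imgs_background), -1) @ U
    dv = vs.mean(axis=0) - vb.mean(axis=0)
    K = 0.5 * (np.cov(vs, rowvar=False) + np.cov(vb, rowvar=False))
    return float(np.sqrt(dv @ np.linalg.solve(K, dv)))
```

Because the channel covariance K is estimated from the synthetic radiographs themselves, the anatomical background enters the figure of merit directly, which is what distinguishes this observer from the quantum-noise-only ideal observer.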
197
Web designers, don’t be afraid to use low-quality images in e-retail, unless you want to impress users : Purchase intent and attitudes on product listing pages with varying product image quality
Lundberg, Annika, January 2021
Internet usage is increasing every year, and so are the online activities carried out on the internet. Online shopping is one of the most popular of these activities and continues to grow. This study investigated whether visual information influences consumer behaviour on e-retail websites when shown together with non-imagery information. The study also examined whether the type of visual information presented is a factor in consumer behaviour, and whether image quality matters for purchase intent. A further question was whether imagery information in e-retail improves consumer attitudes towards the design, and whether the type of visual information matters. A significant difference in purchase intent was found for products with imagery information, regardless of its quality. Attitudes towards e-retail designs with high-quality product images were also significantly more positive than towards designs with either low-quality images or no images. The findings fill a gap concerning whether visual stimuli influence consumers compared with products displayed without visual stimuli. The type of product imagery displayed is also a factor in consumer enjoyment: high-quality visual information is perceived more favourably than designs with low-quality or non-imagery product information.
198
Optimization of Fast MR Imaging Technologies using the Case-PDM to Quantitatively Assess Image Quality
Miao, Jun, 08 March 2013
No description available.
199
Matching Pursuit and Residual Vector Quantization: Applications in Image Coding
Ebrahimi-Moghadam, Abbas, 09 1900
In this thesis, novel progressive, scalable region-of-interest (ROI) image coding schemes with a rate-distortion-complexity trade-off, based on residual vector quantization (RVQ) and matching pursuit (MP), are developed. RVQ and MP provide the encoder with multi-resolution signal analysis tools, which are useful for the rate-distortion trade-off and can be used to render a selected region of an image with a specific quality. An image quality refinement strategy is presented which improves the quality of the ROI in a progressive manner; the reconstructed image can mimic foveated images in a perceptual image coding context. The systems are unbalanced in the sense that the decoders have lower computational requirements than the encoders. The methods also provide an interactive way of refining information in the image regions of highest priority to the receiver, who is free to select multiple regions of interest and to switch to alternative regions in the middle of transmission. The proposed RVQ- and MP-based image coding methods raise several issues and reveal new capabilities in image coding and communication. In RVQ-based image coding, the effects of dictionary size, number of RVQ stages and image block size on the reconstructed image quality, the resulting bit rate and the computational complexity are investigated. The progressive nature of the resulting bit-stream makes RVQ- and MP-based image coding methods suitable platforms for unequal error protection. Joint source-channel (JSC) coding has received considerable attention in recent years; in this framework, JSC decoding based on exploiting the residual redundancy in a source coder's output bit-stream is a bandwidth-efficient approach to signal reconstruction. This thesis also addresses the JSC decoding and error concealment problem for matching-pursuit-coded images transmitted over a noisy memoryless channel. The problem is formulated on a minimum mean squared error (MMSE) estimation foundation, and a suboptimal solution is devised that yields high-quality error concealment at different levels of computational complexity. The proposed decoding and error concealment solution takes advantage of the residual redundancy that exists in neighboring image blocks as well as in neighboring MP analysis stages to improve image quality with no increase in the required bandwidth. The effects of different parameters, such as MP dictionary size and number of analysis stages, on the performance of the proposed soft decoding method have also been investigated. / Thesis / Doctor of Philosophy (PhD)
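To make the MP analysis stages concrete, here is a minimal sketch of matching pursuit over one image block (the dictionary construction, block size and quantization/bit allocation used in the thesis are not reproduced; any matrix of unit-norm atoms serves for illustration):

```python
import numpy as np

def matching_pursuit(block, dictionary, n_stages=10):
    # Greedy MP: at each stage pick the dictionary atom most correlated with the
    # residual, record its (index, coefficient) pair and subtract its contribution.
    # `dictionary` holds unit-norm atoms as columns.
    residual = np.asarray(block, dtype=float).copy()
    atoms = []
    for _ in range(n_stages):
        correlations = dictionary.T @ residual
        k = int(np.argmax(np.abs(correlations)))
        c = float(correlations[k])
        residual -= c * dictionary[:, k]
        atoms.append((k, c))
    return atoms, residual
```

Transmitting the (index, coefficient) pairs stage by stage gives the progressive, truncatable bit-stream referred to above, and the statistical dependence between the indices of neighboring blocks and stages is the residual redundancy that the MMSE-based soft decoder exploits.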
200
Accommodative lag, peripheral aberrations, and myopia in children
Berntsen, David A., 01 September 2009
No description available.