1

Adaptivní ekvalizace histogramu digitálních obrazů / Adaptive histogram equalization for digital images processing

Kvapil, Jiří January 2009
The diploma thesis focuses on the histogram equalization method and on its extension with adaptive boundaries. The thesis explains the basic notions on which histogram equalization is built. The next part describes human vision and the principles of imitating it. In the practical part of the thesis, software was created that makes it possible to apply adaptive histogram equalization methods to real images. Finally, some of the results obtained are presented.
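The adaptive variant described above applies equalization locally rather than over the whole image. As a rough NumPy sketch of the idea (not the thesis's software), the following equalizes each tile of a gray-level image independently; production implementations additionally interpolate between neighbouring tile mappings to hide block seams, a step omitted here:

```python
import numpy as np

def equalize(tile, n_bins=256):
    # classical HE: map each gray level through the normalized cumulative histogram
    hist, _ = np.histogram(tile, bins=n_bins, range=(0, n_bins))
    cdf = hist.cumsum() / tile.size
    lut = np.round(cdf * (n_bins - 1)).astype(np.uint8)
    return lut[tile]

def adaptive_equalize(img, tile=8):
    # naive adaptive HE: equalize each tile independently (no seam interpolation)
    out = np.empty_like(img)
    h, w = img.shape
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            out[y:y+tile, x:x+tile] = equalize(img[y:y+tile, x:x+tile])
    return out
```

Because each tile's mapping depends only on its own histogram, local contrast is stretched even in regions that occupy a narrow slice of the global gray-level range.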
2

Image Enhancement & Automatic Detection of Exudates in Diabetic Retinopathy

Mallampati, Vivek January 2019
Diabetic retinopathy (DR) is a growing global health concern that causes vision loss in most patients with the disease. Because of its prevalence, automated detection of DR is needed for quick diagnosis, in which disease progression is monitored by detecting and classifying exudate changes in fundus retina images. Automated diagnosis systems today apply several image enhancement methods to the original fundus images. The primary goal of this thesis is to compare three popular enhancement methods: Mahalanobis Distance (MD), Histogram Equalization (HE), and Contrast Limited Adaptive Histogram Equalization (CLAHE). The comparison is quantified by the ability to detect and classify exudates, and the best of the three methods is then used to detect and classify soft and hard exudates. A graphical user interface was also built with the help of MATLAB. The results showed that the MD method enhanced the digital images better than HE and CLAHE, and it also enabled this study to classify exudates successfully into hard and soft classes. Overall, the research concluded that the suggested method yielded the best exudate detection results; its classification output can be offered to doctors and ophthalmologists to support disease management.
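CLAHE, one of the three methods compared above, limits contrast amplification by clipping the histogram before equalization. A minimal global (untiled) sketch of that clipping step, with an assumed clip fraction:

```python
import numpy as np

def clipped_equalize(img, clip_frac=0.03, n_bins=256):
    # CLAHE-style clipping: histogram counts above the clip limit are truncated
    # and the excess is redistributed uniformly, which caps the slope of the
    # gray-level mapping and so limits noise amplification in flat regions.
    hist, _ = np.histogram(img, bins=n_bins, range=(0, n_bins))
    hist = hist.astype(np.float64)
    limit = clip_frac * img.size
    excess = np.maximum(hist - limit, 0).sum()
    hist = np.minimum(hist, limit) + excess / n_bins  # redistribute the excess
    cdf = hist.cumsum() / hist.sum()
    lut = np.round(cdf * (n_bins - 1)).astype(np.uint8)
    return lut[img]
```

Real CLAHE applies this per tile with bilinear blending of the tile mappings; the clipping rule itself is the part shown here.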
3

Contrast enhancement in digital imaging using histogram equalization

Gomes, David Menotti 18 June 2008
Nowadays, devices able to capture and process images can be found in complex surveillance monitoring systems or in simple mobile phones. In certain applications, the time needed to process an image is less important than the quality of the result (e.g., medical imaging), while in other cases quality can be sacrificed in favour of speed. This thesis focuses on the latter case and proposes two methodologies for fast image contrast enhancement. The proposed methods are based on histogram equalization (HE); some handle gray-level images and others handle color images. As far as HE methods for gray-level images are concerned, current methods tend to shift the mean brightness of the image to the middle of the gray-level range. This is undesirable for contrast enhancement in consumer electronics products, where preserving the input brightness is required to avoid introducing non-existing artifacts in the output image. To overcome this drawback, bi-histogram equalization methods that preserve brightness while enhancing contrast have been proposed. Although these methods preserve the input brightness with significant contrast enhancement, they may produce images that do not look as natural as the input. To overcome this, we propose a technique called Multi-HE, which decomposes the input image into several sub-images and applies the classical HE process to each of them. This performs a less intensive contrast enhancement, so the output image looks more natural. We propose two discrepancy functions for image decomposition, leading to two new Multi-HE methods, and a cost function for automatically deciding into how many sub-images the input image should be decomposed.
Experimental results show that our methods preserve brightness better and produce more natural-looking images than other HE methods. To deal with contrast enhancement in color images, we introduce a generic fast hue-preserving histogram equalization method based on the RGB color space, together with two instances of the generic method. The first instance uses the 1D R (red), G (green), and B (blue) histograms to estimate an RGB 3D histogram to be equalized, whereas the second instance uses the 2D RG, RB, and GB histograms. Histogram equalization is performed using shift hue-preserving transformations, avoiding the appearance of unrealistic colors. Our methods have linear time and space complexity with respect to the image dimension and do not require conversions between color spaces in order to perform contrast enhancement. Objective assessments comparing our methods with others are performed using a contrast measure and color image quality measures, where quality is established as a weighted function of the naturalness and colorfulness indexes. This is the first work to evaluate histogram equalization methods on a well-known database of 300 images (a dataset from the University of Berkeley) using measures such as naturalness and colorfulness. Experimental results show that the contrast of images produced by our methods is on average 50% greater than that of the original images, while keeping the quality of the output images close to the original.
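The bi-histogram equalization family this thesis builds on can be sketched in a few lines. The following is a minimal split-at-mean variant in NumPy (in the spirit of BBHE, not the author's Multi-HE itself): the two sub-histograms are equalized into disjoint output ranges, so the output brightness stays near the input mean instead of drifting to mid-range:

```python
import numpy as np

def bi_histogram_equalize(img):
    # Bi-HE sketch: split at the mean gray level, then equalize the lower
    # sub-histogram into [0, m] and the upper one into [m+1, 255].
    img = img.astype(np.uint8)
    m = int(img.mean())
    out = np.empty_like(img)
    for lo, hi, mask in ((0, m, img <= m), (m + 1, 255, img > m)):
        vals = img[mask]
        if vals.size == 0:
            continue
        hist = np.bincount(vals, minlength=256).astype(np.float64)
        cdf = hist.cumsum() / vals.size
        lut = np.round(lo + cdf * (hi - lo)).astype(np.uint8)
        out[mask] = lut[vals]
    return out
```

Multi-HE generalizes this from two sub-images to several, with the decomposition chosen by the discrepancy and cost functions described above.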
4

True random number generation using genetic algorithms on high performance architectures

MIJARES CHAN, JOSE JUAN 01 September 2016
Many real-world applications use random numbers generated by pseudo-random number generators and true random number generators (TRNGs). Unlike pseudo-random number generators, which rely on an input seed to generate random numbers, a TRNG relies on a non-deterministic source to generate aperiodic random numbers. In this research, we develop a novel and generic software-based TRNG using a random source extracted from today's compute architectures. We show that non-deterministic events such as race conditions between compute threads follow a near-Gamma distribution, independent of the architecture, multi-cores, or co-processors. Our design improves the distribution towards a uniform distribution, ensuring the stationarity of the sequence of random variables. We correct the statistical deficiencies of the random numbers with a post-processing stage based on a heuristic evolutionary algorithm. Our post-processing algorithm is composed of two phases: (i) histogram specification and (ii) stationarity enforcement. We propose two techniques for histogram equalization, Exact Histogram Equalization (EHE) and Adaptive EHE (AEHE), that map the distribution of the random numbers to a user-specified distribution. EHE is an offline algorithm with O(N log N) complexity; AEHE is an online algorithm that improves performance using a sliding window and achieves O(N). Both algorithms ensure a normalized entropy in (0.95, 1.0]. The stationarity enforcement phase uses genetic algorithms to mitigate the statistical deficiencies in the output of histogram equalization by permuting the random numbers until wide-sense stationarity is achieved. By measuring the standard deviation of the power spectral density, we ensure that the quality of the numbers generated by the genetic algorithms is within the level of error specified by the user.
We develop two algorithms: a naive algorithm with an expected exponential complexity of E[O(e^N)], and an accelerated FFT-based algorithm with an expected quadratic complexity of E[O(N^2)]. The accelerated FFT-based algorithm exploits the parallelism found in genetic algorithms on a homogeneous multi-core cluster. We evaluate the effects of its scalability and data size on a standardized battery of tests, TestU01, finding the tuning parameters that ensure wide-sense stationarity on long runs. / October 2016
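The exact histogram specification at the core of EHE can be illustrated by rank mapping: the i-th smallest sample is assigned the corresponding quantile of the target distribution. A sketch under the assumption of a known inverse CDF (this is not the thesis's EHE/AEHE code):

```python
import numpy as np

def exact_histogram_specify(x, target_quantile):
    # Rank mapping: the i-th smallest of N samples receives the (i + 0.5)/N
    # quantile of the target distribution, so the output histogram matches
    # the target as closely as N samples allow. `target_quantile` is the
    # inverse CDF of the desired output distribution.
    order = np.argsort(x, kind="stable")  # stable sort breaks ties deterministically
    n = x.size
    out = np.empty(n, dtype=np.float64)
    out[order] = target_quantile((np.arange(n) + 0.5) / n)
    return out
```

With the identity as the inverse CDF, skewed raw samples (such as the near-Gamma race-condition timings described above) are mapped onto Uniform(0, 1) while preserving their ordering.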
5

Advanced Image Processing Using Histogram Equalization and Android Application Implementation

Gaddam, Purna Chandra Srinivas Kumar, Sunkara, Prathik January 2016
Nowadays, the conditions under which an image is taken may lead to near-zero visibility for the human eye, usually due to a lack of clarity caused by atmospheric effects such as haze, fog, and other daylight effects. Useful information captured under such scenarios should therefore be enhanced and made clear enough to recognize objects and other details. Many image processing algorithms have been implemented to deal with such issues caused by low light or by haze affecting the imaging device; these algorithms also provide nonlinear contrast enhancement to some extent. We took existing algorithms, namely SMQT (Successive Mean Quantization Transform), the V transform, and histogram equalization, to improve the visual quality of digital pictures with large-range scenes and irregular lighting conditions. These algorithms were implemented in two different ways and tested on different images suffering from low light and color change, and succeeded in producing enhanced images. They provide various enhancements such as color and contrast and give very accurate results on low-light images. The histogram equalization technique is implemented by interpreting the histogram of the image as a probability density function. The cumulative distribution function is applied to the image so that accumulated histogram values are obtained; the pixel values are then changed based on their probability and spread over the histogram. From these algorithms we chose histogram equalization: MATLAB code was taken as a reference and modified for implementation as an API (Application Program Interface) in Java, and we confirmed that the application works properly with reduced execution time.
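Of the algorithms named above, SMQT is the least standard. As a sketch of its recursive mean-split idea (my interpretation of the transform, not the thesis's implementation): each subset of pixel values is split at its own mean, and each split contributes one bit of the output code, giving a nonlinear remapping that is insensitive to gain and offset in the input:

```python
import numpy as np

def smqt(x, levels=8):
    # Successive Mean Quantization Transform sketch: recursively split each
    # subset at its mean; values above the mean gain the bit for that level.
    # The result is an order-preserving code in [0, 2**levels - 1].
    out = np.zeros(x.shape, dtype=np.int32)

    def split(mask, level):
        if level == 0 or not mask.any():
            return
        m = x[mask].mean()
        upper = mask & (x > m)
        out[upper] += 1 << (level - 1)
        split(mask & (x <= m), level - 1)
        split(upper, level - 1)

    split(np.ones(x.shape, dtype=bool), levels)
    return out
```

Because every split compares against the subset mean, multiplying the input by a gain or adding an offset leaves all the comparisons, and hence the output, unchanged.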
6

Automatic Exposure Correction And Local Contrast Setting For Diagnostic Viewing of Medical X-ray Images

Pehrson Skidén, Ottar January 2010
To properly display digital X-ray images for visual diagnosis, a proper display range needs to be identified. This can be difficult when the image contains collimators or large background areas which can dominate the histograms. Also, when there are both underexposed and overexposed areas in the image it is difficult to display these properly at the same time. The purpose of this thesis is to find a way to solve these problems. A few different approaches are evaluated to find their strengths and weaknesses. Based on Local Histogram Equalization, a new method is developed to put various constraints on the mapping. These include alternative ways to perform the histogram calculations and how to define the local histograms. The new method also includes collimator detection and background suppression to keep irrelevant parts of the image out of the calculations. Results show that the new method enables proper display of both underexposed and overexposed areas in the image simultaneously while maintaining the natural look of the image. More testing is required to find appropriate parameters for various image types.
7

Fingerprint Segmentation

Jomaa, Diala January 2009
In this thesis, a new algorithm is proposed to segment the foreground of a fingerprint from the image under consideration. The algorithm uses three features: mean, variance, and coherence. Based on these features, a rule system is built to help the algorithm segment the image efficiently. In addition, the proposed algorithm combines split-and-merge with a modified Otsu method. Enhancement techniques such as Gaussian filtering and histogram equalization are applied to improve the quality of the image, and finally a post-processing technique is implemented to counter undesirable effects in the segmented image. Fingerprint recognition is one of the oldest recognition systems in biometrics. Everyone has a unique and unchangeable fingerprint, and based on this uniqueness and distinctness, fingerprint identification has long been used in many applications. A fingerprint image is a pattern consisting of two regions: foreground and background. The foreground contains all the important information needed by automatic fingerprint recognition systems, while the background is a noisy region that contributes to the extraction of false minutiae. To avoid extracting false minutiae, several steps such as preprocessing and enhancement should be followed. One of these steps is the transformation of the fingerprint image from a gray-scale image to a black-and-white image; this transformation is called segmentation or binarization. The aim of fingerprint segmentation is to separate the foreground from the background, and due to the nature of fingerprint images this is an important and challenging task. The proposed algorithm is applied to the FVC2000 database. Manual examination by human experts shows that the proposed algorithm provides efficient segmentation results, as demonstrated in diverse experiments.
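The variance feature above does most of the work in coarse segmentation, since ridge regions have high local gray-level variance while the background is flat. A minimal block-wise sketch of that single rule (the thesis also uses the mean and coherence features plus split-and-merge with a modified Otsu threshold, all omitted here; the threshold value is an assumption):

```python
import numpy as np

def segment_foreground(img, block=16, var_thresh=100.0):
    # Keep a block as foreground when its gray-level variance exceeds the
    # threshold: ridge patterns alternate dark/light, background does not.
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(0, h, block):
        for x in range(0, w, block):
            b = img[y:y+block, x:x+block].astype(np.float64)
            if b.var() > var_thresh:
                mask[y:y+block, x:x+block] = True
    return mask
```

A post-processing pass (e.g. removing isolated foreground blocks) would then clean up the mask, as the thesis describes.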
9

Imaging, characterization and processing with axicon derivatives.

Saikaley, Andrew Grey 06 August 2013
Axicons have been proposed for imaging applications since they offer the advantage of an extended depth of field (DOF). This enhanced DOF comes at the cost of degraded image quality, and image processing has been proposed to improve it. Initial efforts focused on the use of an axicon in a borescope, extending the depth of focus and eliminating the need for a focusing mechanism. Though promising, these results made clear that image processing would lead to improved image quality. Processing would also eliminate the need, in certain applications, for a fiber-optic imaging bundle, as many modern video borescopes use an imaging sensor coupled directly to the front-end optics. In the present work, three types of refractive axicons are examined: a linear axicon, a logarithmic axicon, and a Fresnel axicon. The linear axicon offers the advantage of simplicity and a significant amount of scientific literature, including the application of image restoration techniques. The Fresnel axicon has the advantage of compactness and a potentially low cost of production; as no physical examples of Fresnel axicons were available for experimentation until recently, very little literature exists. The logarithmic axicon has the advantage of a nearly constant longitudinal intensity distribution and an aspheric design, producing pre-processed images superior to those of the aforementioned elements. Point spread functions (PSFs) for each of these axicons have been measured, and these PSFs form the basis for the design of digital image restoration filters. The performance of the three optical elements and a number of restoration techniques are demonstrated and compared.
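One classical restoration technique applicable once a PSF has been measured is Wiener deconvolution; a frequency-domain sketch (a generic method, not necessarily the filters designed in the thesis), where the constant k stands in for the unknown noise-to-signal power ratio:

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=0.01):
    # Wiener filter G = H* / (|H|^2 + k), applied in the frequency domain.
    # `psf` must have the same shape as `blurred`, with its peak at the center;
    # ifftshift moves that peak to the origin before the FFT.
    H = np.fft.fft2(np.fft.ifftshift(psf))
    G = np.conj(H) / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * G))
```

The regularizer k prevents division by near-zero values of H at frequencies the PSF has destroyed, trading perfect inversion for noise robustness.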
