41

Digital Watermarking Based Image and Video Quality Evaluation

Wang, Sha 02 April 2013 (has links)
Image and video quality evaluation is important in many applications. In applications involving signal transmission, Reduced- or No-Reference quality metrics are generally more practical than Full-Reference metrics. Digital watermarking-based quality evaluation has emerged as a potential Reduced- or No-Reference quality metric: it estimates signal quality by assessing the degradation of an embedded watermark. Since the watermark contains only a small amount of information compared to the cover signal, performing accurate signal quality evaluation is a challenging task; meanwhile, the watermarking process itself causes signal quality loss. To address these problems, this thesis proposes a framework for image and video quality evaluation based on semi-fragile and adaptive watermarking. In this framework, the watermark embedding strength is assigned adaptively by examining the signal quality degradation characteristics. An "Ideal Mapping Curve" is generated experimentally to relate watermark degradation to signal degradation, so that the observed watermark degradation can be used to estimate the quality of distorted signals. Within the proposed framework, a quantization-based scheme is first implemented in the DWT domain. In this scheme, the adaptive watermark embedding strengths are optimized by iteratively testing the image degradation characteristics under JPEG compression. This iterative process provides high accuracy for quality evaluation, but at relatively high computational complexity. As an improvement, a tree-structure-based scheme is proposed that assigns adaptive watermark embedding strengths by pre-estimating the signal degradation characteristics, which greatly improves computational efficiency. The SPIHT tree structure and HVS masking are used to guide the watermark embedding, which greatly reduces the signal quality loss caused by embedding. Experimental results show that the tree-structure-based scheme can evaluate image and video quality with high accuracy in terms of PSNR, wPSNR, JND, SSIM and VIF under JPEG compression, JPEG2000 compression, Gaussian low-pass filtering, Gaussian noise distortion, H.264 compression and packet-loss-related distortion.
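The core idea, inferring the quality of the received signal from how much the embedded watermark has degraded, can be sketched briefly. The snippet below is a minimal illustration in which a pre-computed set of (watermark bit-error-rate, quality score) pairs stands in for the "Ideal Mapping Curve"; the curve values, the function names and the use of BER as the degradation measure are illustrative placeholders, not the thesis's exact design.

```python
import numpy as np

def watermark_degradation(ref_bits, extracted_bits):
    """Bit error rate of the extracted semi-fragile watermark: the proxy
    used here for how much the cover signal has degraded."""
    return np.mean(ref_bits != extracted_bits)

def estimate_quality(ber, mapping_curve):
    """Interpolate on an 'Ideal Mapping Curve': experimentally collected
    (watermark BER, quality score) pairs. This turns the observed watermark
    degradation into a Reduced-/No-Reference quality estimate."""
    bers, scores = mapping_curve
    return np.interp(ber, bers, scores)

# Placeholder curve (not thesis data): heavier watermark damage maps to a
# lower estimated PSNR of the distorted image.
curve = (np.array([0.0, 0.1, 0.2, 0.3, 0.5]),
         np.array([48.0, 40.0, 34.0, 29.0, 24.0]))
ber = watermark_degradation(np.array([0, 1, 1, 0, 1, 0, 1, 1]),
                            np.array([0, 1, 0, 0, 1, 0, 0, 1]))
print(estimate_quality(ber, curve))   # BER 0.25 -> roughly 31.5 dB
```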
43

Digitální hudební efekt založený na waveletové transformaci jako plug-in modul / Digital musical effect as a plug-in module based on wavelet transform

Konczi, Róbert January 2011 (has links)
This work deals with the theory of the wavelet transform and Mallat’s algorithm. It also covers the programming method for creating VST plug-in modules and describes the development of a plug-in module which modifies wavelet transform coefficients to apply a musical effect.
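A rough sketch of the coefficient-modification idea, in Python with PyWavelets rather than the actual C++ VST code: decompose one audio block with the DWT (Mallat's algorithm), scale the detail bands, and reconstruct. The per-band gains and wavelet choice below are assumptions for illustration.

```python
import numpy as np
import pywt

def wavelet_effect(block, gains, wavelet='db4', level=4):
    """Apply per-band gains to the DWT detail coefficients of one audio
    block and reconstruct: the coefficient-modification core of a
    wavelet-domain audio effect."""
    coeffs = pywt.wavedec(block, wavelet, level=level)   # Mallat decomposition
    approx, details = coeffs[0], coeffs[1:]
    details = [g * d for g, d in zip(gains, details)]    # coarsest band first
    return pywt.waverec([approx] + details, wavelet)[:len(block)]

# Example: attenuate the two finest detail bands (a crude de-noising effect)
block = np.random.randn(1024)
processed = wavelet_effect(block, gains=[1.0, 1.0, 0.5, 0.25])
```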
44

Využití EEG ve vyhodnocování emocionálních stavů člověka / The use of EEG in assessing the emotional state of a person

Strakoš, Libor January 2016 (has links)
This thesis focuses on EEG processing and emotion classification within a two-dimensional emotion space. The first part consists of theoretical research on the emotional responses of human subjects to sound, image and video stimuli. Emotions are examined from the perspectives of physiology and psychology. Furthermore, a technical overview of measurement, analysis and emotion classification within the two-dimensional emotional space is given. Based on the gathered knowledge, a measurement setup with audiovisual stimuli was designed and recorded with two independent instruments – an EGI GES400MR in laboratory conditions and an Emotiv EPOC device in non-laboratory conditions. The signals were processed and emotions were classified based on chosen features, and the performance of the classifiers was evaluated for multiple feature-selection setups.
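As a deliberately simplified illustration of this kind of pipeline, the sketch below computes per-channel spectral band powers as features and trains one classifier per emotion axis. The band definitions, sampling rate, channel count and the choice of an SVM are assumptions for the example, not the feature sets or classifiers actually evaluated in the thesis.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC

BANDS = {'theta': (4, 8), 'alpha': (8, 13), 'beta': (13, 30)}   # Hz, assumed

def band_powers(epoch, fs=128):
    """Per-channel spectral band powers for one EEG epoch
    (shape: channels x samples)."""
    f, pxx = welch(epoch, fs=fs, nperseg=2 * fs, axis=-1)
    feats = [pxx[:, (f >= lo) & (f < hi)].mean(axis=-1)
             for lo, hi in BANDS.values()]
    return np.concatenate(feats)

# One classifier per emotion axis (valence, arousal); placeholder data only.
epochs = [np.random.randn(14, 512) for _ in range(40)]   # 14-channel epochs
X = np.stack([band_powers(e) for e in epochs])
valence = np.random.randint(0, 2, size=40)               # placeholder labels
clf_valence = SVC(kernel='rbf').fit(X, valence)          # repeat for arousal
```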
45

Hand (Motor) Movement Imagery Classification of EEG Using Takagi-Sugeno-Kang Fuzzy-Inference Neural Network

Donovan, Rory Larson 01 June 2017 (has links) (PDF)
Approximately 20 million people in the United States suffer from irreversible nerve damage and would benefit from a neuroprosthetic device modulated by a Brain-Computer Interface (BCI). These devices restore independence by replacing peripheral nervous system functions such as peripheral control. Although there are currently devices under investigation, contemporary methods fail to offer adaptability and proper signal recognition for output devices. Human anatomical differences prevent a fixed-model system from providing consistent classification performance across subjects. Furthermore, notoriously noisy signals such as Electroencephalography (EEG) require complex measures for signal detection. Therefore, there remains a tremendous need to explore and improve new algorithms. This report investigates a signal-processing model that is better suited to BCI applications because it incorporates machine learning and fuzzy logic. Whereas traditional machine learning techniques utilize precise functions to map the input into the feature space, fuzzy-neuro systems apply imprecise membership functions to account for uncertainty and can be updated via supervised learning. Thus, this method is better equipped to tolerate uncertainty and improve performance over time. Moreover, the variation of this algorithm used in this study has a higher convergence speed. The proposed two-stage signal-processing model consists of feature extraction and feature translation, with an emphasis on the latter. The feature extraction phase includes Blind Source Separation (BSS) and the Discrete Wavelet Transform (DWT), and the feature translation stage includes the Takagi-Sugeno-Kang Fuzzy-Neural Network (TSKFNN). The proposed model achieves an average classification accuracy of 79.4% across 40 subjects, which is higher than the 75% typically reported in the literature, making this a superior model.
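The feature-translation stage is the distinctive part: a TSK fuzzy-inference network computes rule firing strengths from Gaussian membership functions and blends linear rule consequents. The sketch below shows only the forward pass of a first-order TSK system with fixed parameters; in the TSKFNN these parameters are what supervised learning updates, and the rule count, membership shapes and parameter values here are illustrative.

```python
import numpy as np

def tsk_predict(x, centers, sigmas, consequents):
    """First-order Takagi-Sugeno-Kang inference for one feature vector x.
    Each rule r has Gaussian antecedents (centers[r], sigmas[r]) and a
    linear consequent y_r = consequents[r] @ [1, x]. The output is the
    firing-strength-weighted average of the rule outputs."""
    # Rule firing strengths: product of per-dimension Gaussian memberships
    w = np.exp(-((x - centers) ** 2) / (2 * sigmas ** 2)).prod(axis=1)
    y = consequents @ np.concatenate(([1.0], x))      # one output per rule
    return (w * y).sum() / (w.sum() + 1e-12)

# Toy example: 3 rules over 2 features (values are placeholders)
centers = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.5]])
sigmas = np.ones_like(centers)
consequents = np.random.randn(3, 3)      # [bias, w1, w2] per rule
print(tsk_predict(np.array([0.8, 0.4]), centers, sigmas, consequents))
```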
46

Combined robust and fragile watermarking algorithms for still images. Design and evaluation of combined blind discrete wavelet transform-based robust watermarking algorithms for copyright protection using mobile phone numbers and fragile watermarking algorithms for content authentication of digital still images using hash functions.

Jassim, Taha D. January 2014 (has links)
This thesis deals with copyright protection and content authentication for still images. New blind, block-based, transform-domain algorithms using one-level and two-level Discrete Wavelet Transform (DWT) were developed for copyright protection. A mobile phone number with international code is used as the watermarking data. The robust algorithms use the Low-Low frequency coefficients of the DWT to embed the watermarking information, which is embedded in the green channel of RGB colour images and the Y channel of YCbCr images. The watermarking information is scrambled using a secret key to increase the security of the algorithms. Because the watermarking information is small compared to the host image, the embedding process is repeated several times, which increases the robustness of the algorithms. A shuffling process is applied during the multi-embedding process in order to avoid spatial correlation between the host image and the watermarking information. The effects of using one and two levels of DWT on robustness and image quality have been studied. The Peak Signal to Noise Ratio (PSNR), the Structural Similarity Index Measure (SSIM) and the Normalized Correlation Coefficient (NCC) are used to evaluate the fidelity of the images. Several greyscale and colour still images are used to test the new robust algorithms. The new algorithms offered better robustness against different attacks such as JPEG compression, scaling, salt-and-pepper noise, Gaussian noise, filtering and other image processing operations compared to DCT-based algorithms. The authenticity of the images was assessed with a fragile watermarking algorithm that uses a hash function (MD5) to generate the watermarking information, embedded in the spatial domain. The new algorithm showed high sensitivity to any tampering with the watermarked images. The combined fragile and robust watermarking caused minimal distortion to the images, and the combined scheme achieved both copyright protection and content authentication.
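A condensed sketch of the robust stage described above: key-based scrambling of the watermark bits, repeated embedding into the LL subband of the green channel, and blind extraction with a majority vote over the repetitions. The quantisation-parity embedding rule, the step size and the single-level Haar DWT are assumptions chosen to keep the example short, not the thesis's exact scheme.

```python
import numpy as np
import pywt

def embed(green, bits, key, step=12.0):
    """Embed key-scrambled watermark bits repeatedly into the LL subband
    of the green channel by quantising coefficients (blind scheme)."""
    order = np.random.default_rng(key).permutation(bits.size)
    scrambled = bits[order]                        # key-dependent scrambling
    LL, rest = pywt.dwt2(green.astype(float), 'haar')
    flat = LL.ravel()
    reps = flat.size // scrambled.size             # repeated embedding -> robustness
    tiled = np.tile(scrambled, reps)
    q = np.round(flat[:tiled.size] / step)
    q += (q.astype(int) % 2) != tiled              # force coefficient parity = bit
    flat[:tiled.size] = q * step
    LL = flat.reshape(LL.shape)
    return pywt.idwt2((LL, rest), 'haar')[:green.shape[0], :green.shape[1]]

def extract(green, n_bits, key, step=12.0):
    """Blind extraction: read parities, majority-vote over the repetitions,
    then undo the key-dependent scrambling."""
    LL, _ = pywt.dwt2(green.astype(float), 'haar')
    flat = LL.ravel()
    reps = flat.size // n_bits
    votes = np.round(flat[:reps * n_bits] / step).astype(int) % 2
    scrambled = (votes.reshape(reps, n_bits).mean(axis=0) > 0.5).astype(np.uint8)
    order = np.random.default_rng(key).permutation(n_bits)
    bits = np.empty(n_bits, dtype=np.uint8)
    bits[order] = scrambled                        # invert the scrambling
    return bits
```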
47

Efficient Image Processing Techniques for Enhanced Visualization of Brain Tumor Margins

Koglin, Ryan W. January 2014 (has links)
No description available.
48

Image/video compression and quality assessment based on wavelet transform

Gao, Zhigang 14 September 2007 (has links)
No description available.
49

Perceptually Lossless Coding of Medical Images - From Abstraction to Reality

Wu, David, dwu8@optusnet.com.au January 2007 (has links)
This work explores a novel vision-model-based coding approach to encode medical images at a perceptually lossless quality within the framework of the JPEG 2000 coding engine. Perceptually lossless encoding offers the best of both worlds, delivering images free of visual distortions while providing significantly greater compression ratio gains over its information-lossless counterparts. This is achieved through a visual pruning function, embedded with an advanced model of the human visual system, to accurately identify and efficiently remove visually irrelevant/insignificant information. In addition, it maintains bit-stream compliance with the JPEG 2000 coding framework and is consequently compliant with the Digital Imaging and Communications in Medicine (DICOM) standard. Equally, the pruning function is applicable to other Discrete Wavelet Transform based image coders, e.g., Set Partitioning in Hierarchical Trees (SPIHT). Further significant coding gains are exploited through an artificial edge segmentation algorithm and a novel arithmetic pruning algorithm. The coding effectiveness and qualitative consistency of the algorithm are evaluated through a double-blind subjective assessment with 31 medical experts, performed using a novel two-stage forced-choice protocol devised for medical experts, offering greater robustness and accuracy in measuring subjective responses. The assessment showed that no statistically significant differences were perceivable between the original images and the images encoded by the proposed coder.
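The pruning step itself is conceptually simple: discard wavelet detail coefficients that a vision model deems invisible before they reach the entropy coder. The sketch below shows that idea with fixed per-level thresholds; the real pruning function derives its thresholds from an advanced HVS model and operates inside the JPEG 2000 code-stream, so the wavelet choice, level count and threshold values here are placeholders.

```python
import numpy as np
import pywt

def visually_prune(image, thresholds, wavelet='bior4.4', level=3):
    """Zero detail coefficients whose magnitude falls below a per-level
    visibility threshold so the downstream coder spends no bits on them.
    `thresholds[i]` applies to the i-th detail level (coarsest first)."""
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
    pruned = [coeffs[0]]                         # keep the approximation as-is
    for (cH, cV, cD), t in zip(coeffs[1:], thresholds):
        pruned.append(tuple(np.where(np.abs(c) < t, 0.0, c)
                            for c in (cH, cV, cD)))
    return pywt.waverec2(pruned, wavelet)

# Example with placeholder thresholds: finer levels tolerate larger pruning
image = np.random.rand(256, 256) * 255
pruned_image = visually_prune(image, thresholds=[2.0, 4.0, 8.0])
```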
50

Multiresolutional partial least squares and principal component analysis of fluidized bed drying

Frey, Gerald M. 14 April 2005
Fluidized bed dryers are used in the pharmaceutical industry for the batch drying of pharmaceutical granulate. Maintaining optimal hydrodynamic conditions throughout the drying process is essential to product quality. Due to the complex interactions inherent in the fluidized bed drying process, mechanistic models capable of identifying these optimal modes of operation are either unavailable or limited in their capabilities. Therefore, empirical models based on experimentally generated data are relied upon to study these systems.

Principal Component Analysis (PCA) and Partial Least Squares (PLS) are multivariate statistical techniques that project data onto linear subspaces that are the most descriptive of variance in a dataset. By modeling data in terms of these subspaces, a more parsimonious representation of the system is possible. In this study, PCA and PLS are applied to data collected from a fluidized bed dryer containing pharmaceutical granulate.

System hydrodynamics were quantified in the models using high frequency pressure fluctuation measurements. These pressure fluctuations have previously been identified as a characteristic variable of hydrodynamics in fluidized bed systems. As such, contributions from the macroscale, mesoscale, and microscales of motion are encoded into the signals. A multiresolutional decomposition using a discrete wavelet transformation was used to resolve these signals into components more representative of these individual scales before modeling the data.

The combination of multiresolutional analysis with PCA and PLS was shown to be an effective approach for modeling the conditions in the fluidized bed dryer. In this study, datasets from both steady state and transient operation of the dryer were analyzed. The steady state dataset contained measurements made on a bed of dry granulate and the transient dataset consisted of measurements taken during the batch drying of granulate from approximately 33 wt.% moisture to 5 wt.%. Correlations involving several scales of motion were identified in both studies.

In the steady state study, deterministic behavior related to superficial velocity, pressure sensor position, and granulate particle size distribution was observed in PCA model parameters. It was determined that these properties could be characterized solely with the use of the high frequency pressure fluctuation data. Macroscopic hydrodynamic characteristics such as bubbling frequency and fluidization regime were identified in the low frequency components of the pressure signals and the particle scale interactions of the microscale were shown to be correlated to the highest frequency signal components. PLS models were able to characterize the effects of superficial velocity, pressure sensor position, and granulate particle size distribution in terms of the pressure signal components. Additionally, it was determined that statistical process control charts capable of monitoring the fluid bed hydrodynamics could be constructed using PCA.

In the transient drying experiments, deterministic behaviors related to inlet air temperature, pressure sensor position, and initial bed mass were observed in PCA and PLS model parameters. The lowest frequency component of the pressure signal was found to be correlated to the overall temperature effects during the drying cycle. As in the steady state study, bubbling behavior was also observed in the low frequency components of the pressure signal. PLS was used to construct an inferential model of granulate moisture content. The model was found to be capable of predicting the moisture throughout the drying cycle. Preliminary statistical process control models were constructed to monitor the fluid bed hydrodynamics throughout the drying process. These models show promise but will require further investigation to better determine sensitivity to process upsets.

In addition to PCA and PLS analyses, Multiway Principal Component Analysis (MPCA) was used to model the drying process. Several key states related to the mass transfer of moisture and changes in temperature throughout the drying cycle were identified in the MPCA model parameters. It was determined that the mass transfer of moisture throughout the drying process affects all scales of motion and overshadows other hydrodynamic behaviors found in the pressure signals.
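The signal-processing idea at the heart of this approach, splitting each pressure record into scale components with a discrete wavelet multiresolution decomposition, deriving a feature per scale, and letting PCA summarize the variation across operating conditions, can be sketched as follows. The wavelet, level count, variance features and placeholder data below are assumptions for illustration, not the study's actual configuration.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

def mra_components(signal, wavelet='db4', level=5):
    """Multiresolution decomposition: reconstruct each detail level (and the
    final approximation) separately, so the pressure signal is split into
    components dominated by micro-, meso- and macro-scale motion."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    comps = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        comps.append(pywt.waverec(kept, wavelet)[:len(signal)])
    return np.array(comps)            # shape: (level + 1, n_samples)

# Feature per observation: variance of each scale component; PCA then finds
# the directions of greatest variance across operating conditions.
signals = np.random.randn(20, 4096)   # placeholder pressure records
X = np.array([mra_components(s).var(axis=1) for s in signals])
scores = PCA(n_components=2).fit_transform(X)
```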
