  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Time-frequency and Hidden Markov Model Methods for Epileptic Seizure Detection

Zhu, Dongqing 16 July 2009 (has links)
No description available.
32

MULTIRESOLUTION-MULTIVARIATE ANALYSIS OF VIBRATION SIGNALS; APPLICATION IN FAULT DIAGNOSIS OF INTERNAL COMBUSTION ENGINES

Haqshenas, Seyyed Reza 04 1900 (has links)
<p>Condition monitoring and fault diagnosis of mechanical systems are two important issues that have received considerable attention from both academia and industry. Several techniques have been developed to date to address them. One category of techniques, successfully applied in many industrial plants, is based on multiresolution multivariate analysis algorithms, more specifically multi-scale principal component analysis (MSPCA). The present research aims to develop a multiresolution multivariate analysis technique that can be used effectively for fault diagnosis of an internal combustion engine. Because events in each cycle of such an engine are correlated with particular positions of the crankshaft, Crank Angle Domain (CAD) analysis is the most intuitive strategy for monitoring internal combustion engines. Therefore, MSPCA and CAD analysis were combined into a new technique, named CAD-MSPCA. In addition to this contribution, two indices were defined based on estimates of the covariance matrices of the scores and fault matrices; these indices were then employed for both fault localization and fault isolation. A further discovery made through this research was the use of the statistical indices calculated by MSPCA for fault identification. It is shown mathematically that when these indices detect a fault in the system, the spectral characteristics of the fault can be determined by performing spectral analysis of the indices. This analysis demonstrated that MSPCA is an attractive and reliable alternative for bearing fault diagnosis. These new contributions were validated through simulation examples as well as real measurement data.</p> / Master of Applied Science (MASc)
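The multiscale PCA idea at the core of this abstract can be sketched compactly: decompose each sensor channel with a wavelet transform, fit a PCA model per scale, and monitor a Hotelling T² statistic per scale. Below is a minimal numpy illustration of that general scheme, using a one-level Haar split and synthetic three-channel data; it is not the thesis's CAD-MSPCA implementation, and all parameters are illustrative.

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar DWT: returns (approximation, detail) coefficients."""
    x = x[:len(x) // 2 * 2]                      # truncate to even length
    a = (x[0::2] + x[1::2]) / np.sqrt(2)         # low-pass (approximation)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)         # high-pass (detail)
    return a, d

def pca_t2(X, n_pc=2):
    """Hotelling's T^2 statistic of each row of X against a PCA model of X.
    Rows are observations, columns are sensor channels."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_pc].T                              # principal loading vectors
    lam = (s[:n_pc] ** 2) / (len(X) - 1)         # variance along each PC
    scores = Xc @ P
    return np.sum(scores ** 2 / lam, axis=1)     # T^2 per observation

# Multi-channel vibration-like data: 3 correlated channels, 256 samples.
rng = np.random.default_rng(0)
base = rng.standard_normal(256)
X = np.column_stack([base + 0.1 * rng.standard_normal(256) for _ in range(3)])

# MSPCA sketch: split each channel into scales, then fit PCA per scale.
approx = np.column_stack([haar_dwt(X[:, c])[0] for c in range(3)])
detail = np.column_stack([haar_dwt(X[:, c])[1] for c in range(3)])
t2_approx = pca_t2(approx)
t2_detail = pca_t2(detail)
print(t2_approx.shape, t2_detail.shape)          # (128,) (128,)
```

In a monitoring setting, an excursion of either T² sequence beyond a control limit would flag a fault at that scale.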
33

A Unique Wavelet-based Multicarrier System with and without MIMO over Multipath Channels with AWGN

Asif, Rameez, Abd-Alhameed, Raed, Noras, James M. 05 1900 (has links)
Recent studies suggest that multicarrier systems using wavelets outperform conventional OFDM systems using the FFT: they have well-contained side lobes, improved spectral efficiency and BER performance, and they do not require a cyclic prefix. Here we study the wavelet packet and discrete wavelet transforms, comparing the BER performance of wavelet transform-based multicarrier systems and Fourier-based OFDM systems over multipath Rayleigh channels with AWGN. In the proposed system, zero-forcing channel estimation in the frequency domain has been used. Results confirm that discrete wavelet-based systems using Daubechies wavelets outperform both wavelet packet transform-based systems and FFT-OFDM systems in terms of BER. Finally, Alamouti coding and maximal ratio combining schemes were employed in MIMO environments, where results show that the effects of multipath fading were greatly reduced by the antenna diversity.
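The transmitter/receiver duality behind wavelet multicarrier systems (inverse wavelet transform as modulator, forward transform as demodulator) can be shown with a toy numpy sketch. For brevity it uses a one-level Haar transform and BPSK over a pure AWGN channel; the Daubechies and wavelet-packet systems over Rayleigh fading studied in the paper are more involved, and all parameters here are illustrative.

```python
import numpy as np

def haar_idwt(a, d):
    """Inverse one-level Haar DWT (synthesis): used here as the transmitter."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def haar_dwt(x):
    """Forward one-level Haar DWT (analysis): used here as the receiver."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

rng = np.random.default_rng(1)
# BPSK symbols on the two Haar "subchannels" (approximation and detail).
sym_a = rng.choice([-1.0, 1.0], size=64)
sym_d = rng.choice([-1.0, 1.0], size=64)

tx = haar_idwt(sym_a, sym_d)                     # wavelet multicarrier modulation
rx = tx + 0.05 * rng.standard_normal(len(tx))    # AWGN channel (no fading)
rec_a, rec_d = haar_dwt(rx)                      # demodulation

ber = np.mean(np.sign(rec_a) != sym_a) + np.mean(np.sign(rec_d) != sym_d)
print("bit errors:", ber)                        # ~0 at this SNR
```

Because the transform is orthonormal, the subchannels stay decoupled over an ideal channel, which is why no cyclic prefix is needed in this benign case.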
34

Performance evaluation of ZF and MMSE equalizers for wavelets V-Blast

Asif, Rameez, Bin-Melha, Mohammed S., Hussaini, Abubakar S., Abd-Alhameed, Raed, Jones, Steven M.R., Noras, James M., Rodriguez, Jonathan January 2013 (has links)
In this work we present equalization algorithms for use in future orthogonally multiplexed wavelet-based multi-signalling communication systems. The performance of the ZF and MMSE algorithms has been analyzed using SISO and MIMO communication models. The transmitted signals were passed through a Rayleigh multipath fading channel with AWGN. The results show that the two algorithms perform identically in the SISO channel, but in the MIMO environment MMSE performs better.
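For a single-tap (flat-fading) channel the two equalizers are easy to compare directly, and the comparison explains the SISO finding above: the one-tap MMSE output is a positive scaling of the ZF output, so hard symbol decisions coincide. A minimal numpy sketch with illustrative parameters (this is not the authors' simulation):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
x = rng.choice([-1.0, 1.0], size=n)                  # BPSK symbols

# Flat Rayleigh fading: one complex Gaussian tap per symbol, plus AWGN.
h = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
sigma2 = 0.1                                         # noise variance
noise = np.sqrt(sigma2 / 2) * (rng.standard_normal(n)
                               + 1j * rng.standard_normal(n))
y = h * x + noise

# Zero-forcing: invert the channel outright (amplifies noise in deep fades).
x_zf = y / h

# MMSE: regularise the inversion by the noise variance.
x_mmse = np.conj(h) * y / (np.abs(h) ** 2 + sigma2)

ber_zf = np.mean(np.sign(x_zf.real) != x)
ber_mmse = np.mean(np.sign(x_mmse.real) != x)
print(ber_zf, ber_mmse)                              # identical for one tap
```

Since x_mmse = x_zf · |h|²/(|h|² + σ²) and the factor is positive, the sign decisions, and hence the BER, are identical here; the algorithms only separate in MIMO, where matrix inversion and noise colouring come into play.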
35

Klasifikace spánkových EEG / Sleep scoring using EEG

Holdova, Kamila January 2013 (has links)
This thesis deals with wavelet analysis of the sleep electroencephalogram for sleep stage scoring. The theoretical part covers the origin and analysis of the EEG signal and describes polysomnography (PSG), a method for simultaneously recording several electrical signals, chiefly the electroencephalogram (EEG), electromyogram (EMG) and electrooculogram (EOG), which is used to diagnose sleep disorders. Sleep, sleep stages and sleep disorders are therefore also described. The practical part presents results of applying the discrete wavelet transform (DWT) to decompose sleep EEGs to level seven using the Daubechies 2 („db2“) mother wavelet. A feedforward neural network trained with error backpropagation was used to classify the resulting data.
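The decomposition step described above can be sketched without any wavelet library: the db2 analysis filters are short enough to hard-code, and a seven-level decomposition yields eight sub-bands whose relative energies can serve as classifier features. A minimal numpy sketch on synthetic data; the epoch length, sampling rate, periodic boundary handling and feature choice are illustrative assumptions, not the thesis's setup.

```python
import numpy as np

# Daubechies-2 ("db2") analysis filters.
s3 = np.sqrt(3)
lo = np.array([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4 * np.sqrt(2))
hi = np.array([lo[3], -lo[2], lo[1], -lo[0]])          # quadrature mirror

def dwt_level(x, filt):
    """Circular filtering with `filt`, then dyadic downsampling."""
    n = len(x)
    idx = (np.arange(n)[:, None] + np.arange(4)) % n   # periodic extension
    return (x[idx] @ filt)[::2]

def wavedec(x, levels=7):
    """Multi-level db2 decomposition: returns [cA_L, cD_L, ..., cD_1]."""
    coeffs = []
    a = x
    for _ in range(levels):
        coeffs.append(dwt_level(a, hi))                # detail band
        a = dwt_level(a, lo)                           # next approximation
    coeffs.append(a)
    return coeffs[::-1]

# A 30 s "epoch" of synthetic EEG at 128 Hz: 10 Hz alpha-like rhythm + noise.
rng = np.random.default_rng(3)
t = np.arange(30 * 128) / 128
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(len(t))

bands = wavedec(eeg, levels=7)
energies = np.array([np.sum(b ** 2) for b in bands])
features = energies / energies.sum()                   # relative band energies
print(len(bands), features.round(3))                   # 8 sub-bands
```

The eight relative energies form a compact feature vector of the kind one could feed to a feedforward classifier.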
36

Combined robust and fragile watermarking algorithms for still images : design and evaluation of combined blind discrete wavelet transform-based robust watermarking algorithms for copyright protection using mobile phone numbers and fragile watermarking algorithms for content authentication of digital still images using hash functions

Jassim, Taha Dawood January 2014 (has links)
This thesis deals with copyright protection and content authentication for still images. New blind transform-domain block-based algorithms using one-level and two-level Discrete Wavelet Transform (DWT) were developed for copyright protection. A mobile number with international code is used as the watermarking data. The robust algorithms embed the watermarking information in the low-low (LL) frequency coefficients of the DWT. The watermarking information is embedded in the green channel of RGB colour images and the Y channel of YCbCr images, and is scrambled using a secret key to increase the security of the algorithms. Because the watermarking information is small compared with the host image, the embedding process is repeated several times, which increases the robustness of the algorithms. A shuffling process is implemented during the multiple embedding passes to avoid spatial correlation between the host image and the watermarking information. The effects of using one-level and two-level DWT on robustness and image quality have been studied. The Peak Signal-to-Noise Ratio (PSNR), the Structural Similarity Index Measure (SSIM) and the Normalized Correlation Coefficient (NCC) are used to evaluate the fidelity of the images. Several greyscale and colour still images are used to test the new robust algorithms. The new algorithms offered better robustness than DCT-based algorithms against different attacks such as JPEG compression, scaling, salt-and-pepper noise, Gaussian noise, filtering and other image-processing operations. The authenticity of the images was assessed using a fragile watermarking algorithm that embeds a hash (MD5) of the image as watermarking information in the spatial domain. The new algorithm showed high sensitivity to any tampering with the watermarked images. The combined fragile and robust watermarking caused minimal distortion to the images, and the combined scheme achieved both copyright protection and content authentication.
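The embedding idea, hiding key-scrambled bits in the LL sub-band of a DWT, can be sketched in a few lines. This toy version uses a one-level Haar transform and quantization-index modulation with a key-driven coefficient permutation; the thesis's actual algorithms (colour channels, repeated embedding, shuffling) are more elaborate, and the step size and key here are illustrative.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar DWT: returns (LL, LH, HL, HH) sub-bands."""
    a = (img[0::2, :] + img[1::2, :]) / np.sqrt(2)     # rows: low-pass
    d = (img[0::2, :] - img[1::2, :]) / np.sqrt(2)     # rows: high-pass
    LL = (a[:, 0::2] + a[:, 1::2]) / np.sqrt(2)
    LH = (a[:, 0::2] - a[:, 1::2]) / np.sqrt(2)
    HL = (d[:, 0::2] + d[:, 1::2]) / np.sqrt(2)
    HH = (d[:, 0::2] - d[:, 1::2]) / np.sqrt(2)
    return LL, LH, HL, HH

def haar_idwt2(LL, LH, HL, HH):
    """Inverse of haar_dwt2."""
    a = np.empty((LL.shape[0], 2 * LL.shape[1]))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = (LL + LH) / np.sqrt(2), (LL - LH) / np.sqrt(2)
    d[:, 0::2], d[:, 1::2] = (HL + HH) / np.sqrt(2), (HL - HH) / np.sqrt(2)
    img = np.empty((2 * a.shape[0], a.shape[1]))
    img[0::2, :], img[1::2, :] = (a + d) / np.sqrt(2), (a - d) / np.sqrt(2)
    return img

DELTA = 8.0   # quantisation step: larger = more robust, more distortion

def embed(img, bits, key=42):
    LL, LH, HL, HH = haar_dwt2(img.astype(float))
    flat = LL.ravel()                                  # view into LL
    pos = np.random.default_rng(key).permutation(flat.size)[:len(bits)]
    # Quantisation-index modulation: snap each chosen LL coefficient onto an
    # even (bit 0) or odd (bit 1) multiple of DELTA/2.
    flat[pos] = np.round(flat[pos] / DELTA) * DELTA + np.where(bits, DELTA / 2, 0.0)
    return haar_idwt2(LL, LH, HL, HH)

def extract(img, n_bits, key=42):
    LL, _, _, _ = haar_dwt2(img.astype(float))
    flat = LL.ravel()
    pos = np.random.default_rng(key).permutation(flat.size)[:n_bits]
    m = np.mod(flat[pos], DELTA)
    return (m >= DELTA / 4) & (m < 3 * DELTA / 4)

rng = np.random.default_rng(4)
cover = rng.uniform(0, 255, (64, 64))
bits = rng.integers(0, 2, 64).astype(bool)   # e.g. a phone number, bit-encoded
marked = embed(cover, bits)
print(np.array_equal(extract(marked, 64), bits))       # True
```

The key-driven permutation plays the role of the scrambling step: without the key, an attacker does not know which coefficients carry the payload.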
37

Analyse d'images pour une recherche d'images basée contenu dans le domaine transformé. / Image analysis for content based image retrieval in transform domain

Bai, Cong 21 February 2013 (has links)
Cette thèse s’inscrit dans la recherche d’images basée sur leur contenu. La recherche opère sur des images représentées dans un domaine transformé, où sont construits directement les vecteurs de caractéristiques ou indices. Deux types de transformations sont explorés : la transformée en cosinus discrète ou Discrete Cosine Transform (DCT) et la transformée en ondelettes discrète ou Discrete Wavelet Transform (DWT), utilisées dans les normes de compression JPEG et JPEG 2000. Basés sur les propriétés des coefficients de la transformation, différents vecteurs de caractéristiques sont proposés. Ces vecteurs sont mis en oeuvre dans la reconnaissance de visages et de textures couleur. Dans le domaine DCT, quatre types de vecteurs de caractéristiques dénommés « patterns » sont proposés : Zigzag-Pattern, Sum-Pattern, Texture-Pattern et Color-Pattern. Le premier type est l’amélioration d’une approche existante. Les trois derniers intègrent la capacité de compactage des coefficients DCT, sachant que certains coefficients représentent une information de directionnalité. L’histogramme de ces vecteurs est retenu comme descripteur de l’image. Pour réduire la dimension du descripteur lors de la construction de l’histogramme, il est défini soit une adjacence sur des patterns proches puis leur fusion, soit une sélection des patterns les plus fréquents. Ces approches sont évaluées sur des bases de données d’images de visages ou de textures couramment utilisées. Dans le domaine DWT, deux types d’approches sont proposés. Dans le premier, un vecteur-couleur et un vecteur-texture multirésolution sont élaborés ; cette approche relève d’une caractérisation séparée de la couleur et de la texture. La seconde approche se situe dans le contexte d’une caractérisation conjointe de la couleur et de la texture. Comme précédemment, l’histogramme des vecteurs est choisi comme descripteur, construit à l’aide de l’algorithme K-means selon deux méthodes.
La première est le procédé classique de regroupement des vecteurs par partition. La seconde est un histogramme basé sur une représentation parcimonieuse dans laquelle la valeur des bins représente le poids total des vecteurs de base de la représentation. / This thesis comes within content-based image retrieval, constructing feature vectors directly from the transform domain. In particular, two kinds of transforms are considered: the Discrete Cosine Transform (DCT) and the Discrete Wavelet Transform (DWT), which are used in the JPEG and JPEG 2000 compression standards. Based on the properties of the transform coefficients, various feature vectors in the DCT and DWT domains are proposed and applied to face recognition and colour texture retrieval. The thesis proposes four kinds of feature vectors in the DCT domain: Zigzag-Pattern, Sum-Pattern, Texture-Pattern and Color-Pattern. The first is an improved version of an existing approach. The last three are based on the energy-compaction capability of DCT coefficients and the fact that some coefficients hold the directional information of images. The histogram of these patterns is chosen as the image descriptor. While constructing the histogram, to reduce the dimension of the descriptor, either adjacent patterns are defined and merged, or a selection of the most frequent patterns is made. These approaches are evaluated on widely used face and texture databases. In the DWT domain, two kinds of approaches for colour texture retrieval are proposed. In the first, a colour vector and a multiresolution texture vector are constructed, which places this approach in the context of extracting colour and texture features separately. In contrast, the second approach extracts colour and texture features jointly: multiresolution feature vectors are extracted from the luminance and chrominance components of the colour texture.
The histogram of vectors is again chosen as the descriptor, with the k-means algorithm used to divide the feature vectors into partitions corresponding to the bins of the histogram. For histogram generation, two methods are used. The first is the classical method, in which the number of vectors falling into each partition is counted. The second is a proposed sparse-representation-based histogram, in which a bin value represents the total weight of the corresponding basis vector in the sparse representation.
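The "histogram of transform-domain patterns" pipeline can be illustrated end to end: block DCT, a small low-frequency feature vector per block, k-means clustering of the block vectors, and a normalised histogram of cluster labels as the image signature. A minimal numpy sketch; the particular coefficients used as features and the choice k = 4 are illustrative assumptions, not the thesis's pattern definitions.

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix."""
    k, i = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    C = np.sqrt(2 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    C[0] /= np.sqrt(2)
    return C

def block_features(img, bs=8):
    """One feature vector per 8x8 block: magnitudes of a few low-frequency
    DCT coefficients (a crude 'pattern' in the DCT domain)."""
    C = dct_matrix(bs)
    h, w = (s // bs * bs for s in img.shape)
    feats = []
    for r in range(0, h, bs):
        for c in range(0, w, bs):
            B = C @ img[r:r + bs, c:c + bs] @ C.T      # 2-D DCT of the block
            feats.append(np.abs([B[0, 1], B[1, 0], B[1, 1], B[0, 2], B[2, 0]]))
    return np.array(feats)

def kmeans_histogram(feats, k=4, iters=20, seed=0):
    """Cluster block features with k-means; the normalised histogram of
    cluster assignments is the image descriptor."""
    rng = np.random.default_rng(seed)
    centers = feats[rng.choice(len(feats), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(feats[:, None, :] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = feats[labels == j].mean(axis=0)
    hist = np.bincount(labels, minlength=k).astype(float)
    return hist / hist.sum()

rng = np.random.default_rng(5)
img = rng.uniform(0, 255, (64, 64))
desc = kmeans_histogram(block_features(img))
print(desc)            # 4-bin descriptor, sums to 1
```

Retrieval then reduces to comparing such histograms, e.g. by L1 distance, between a query image and the database.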
38

Objective Perceptual Quality Assessment of JPEG2000 Image Coding Format Over Wireless Channel

Chintala, Bala Venkata Sai Sundeep January 2019 (has links)
Compressed images constitute a dominant share of Internet traffic today, and image compression plays an important role in modern multimedia communications. The image compression standards set by the Joint Photographic Experts Group (JPEG) include JPEG and JPEG2000. The group developed the JPEG image compression standard so that still pictures could be compressed for e-mail and web display, and to make high-resolution digital photography possible. This standard was originally based on the Discrete Cosine Transform (DCT), a mathematical method that converts a sequence of data to the frequency domain. In 2000, however, the group proposed a new standard, which came to be known as JPEG2000 and provides better compression efficiency. The new format also has a downside: achieving the same compression efficiency as with the original JPEG format requires more computation. JPEG is a lossy compression standard, which can discard some less important information without causing noticeable perceptual differences; in lossless compression, by contrast, the primary purpose is to reduce the number of bits required to represent the original image samples without any loss of information. Application areas of the JPEG image compression standard include the Internet, digital cameras, and printing and scanning peripherals. In this thesis, a simulator-like setup is built for conducting the objective quality assessment. An image is given as input to the wireless communication system, its data size is varied (e.g. 5%, 10%, 15%, etc.), and a Signal-to-Noise Ratio (SNR) value is given as input for JPEG2000 compression. The compressed image is then passed through a JPEG encoder and transmitted over a Rayleigh fading channel.
The image obtained after applying these constraints to the original image is decoded at the receiver, and the inverse discrete wavelet transform (IDWT) is applied to invert the JPEG2000 compression. The coefficients are scalar-quantized to reduce the number of bits needed to represent them without loss of image quality. The final image is then displayed on the screen. After decoding, the original input image is compared with the decoded images of varying data size at each SNR value at the receiver. In particular, objective perceptual quality assessment through the Structural Similarity (SSIM) index is carried out using MATLAB.
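The SSIM metric used for the assessment has a closed form that is easy to state in code. The sketch below computes a simplified single-window (global) SSIM in Python rather than MATLAB; the standard metric averages the same expression over local windows, and the constants follow the usual K1 = 0.01, K2 = 0.03 convention.

```python
import numpy as np

def ssim_global(x, y, L=255.0):
    """Single-window (global) SSIM: a simplified form of the Structural
    Similarity index. Production implementations average over local windows."""
    C1, C2 = (0.01 * L) ** 2, (0.03 * L) ** 2          # stabilising constants
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + C1) * (2 * cov + C2)) / \
           ((mx ** 2 + my ** 2 + C1) * (vx + vy + C2))

rng = np.random.default_rng(6)
ref = rng.uniform(0, 255, (64, 64))
noisy = np.clip(ref + rng.normal(0, 20, ref.shape), 0, 255)

print(ssim_global(ref, ref))      # ~1.0 for identical images
print(ssim_global(ref, noisy))    # < 1, dropping as distortion grows
```

In the thesis's setup, this score would be computed between the original image and each decoded image for every (data size, SNR) pair.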
39

Comparing Learning Gains in Cryptography Concepts Taught Using Different Instructional Conditions and Measuring Cognitive Processing Activity of Cryptography Concepts

Joseph W Beckman (7027982) 16 October 2019 (has links)
<div>Information security practitioners and researchers who possess sufficient depth of conceptual understanding to reconstitute systems after attacks, or to adapt information security concepts to novel situations, are in short supply. Educating new information security professionals to sufficient conceptual depth is one way this shortage can be reduced. This study instructed two groups of ten undergraduate, pre-cryptography Computer Science majors in cryptography concepts using representational-understanding-first and representational-fluency-first instructional treatments. It compared learning results between the treatment groups using traditional paper-based measures of cognition and fMRI scans of brain activity during cryptography problem solving. The analysis found no statistical difference in measures of cognition or in cognitive processing, but it did produce a statistical model describing the relationships between explanatory variables and cryptography learning, and it identified common areas of cognitive processing of cryptography among the study’s twenty subjects.</div>
40

Digital Watermarking Based Image and Video Quality Evaluation

Wang, Sha 02 April 2013 (has links)
Image and video quality evaluation is important in many multimedia applications. In applications involving signal transmission, Reduced-Reference or No-Reference quality metrics are generally more practical than Full-Reference metrics. Digital watermarking based quality evaluation has emerged as a potential Reduced- or No-Reference quality metric, which estimates signal quality by assessing the degradation of an embedded watermark. Since the watermark contains a small amount of information compared to the cover signal, performing accurate signal quality evaluation is a challenging task. Meanwhile, the watermarking process itself causes signal quality loss. To address these problems, this thesis proposes a framework for image and video quality evaluation based on semi-fragile and adaptive watermarking. In this framework, adaptive watermark embedding strength is assigned by examining the signal's quality degradation characteristics. An "Ideal Mapping Curve" is experimentally generated to relate watermark degradation to signal degradation, so that watermark degradation can be used to estimate the quality of distorted signals. Within the proposed framework, a quantization based scheme is first implemented in the DWT domain. In this scheme, the adaptive watermark embedding strengths are optimized by iteratively testing the image degradation characteristics under JPEG compression. This iterative process provides high accuracy for quality evaluation, but it results in relatively high computational complexity. As an improvement, a tree structure based scheme is proposed that assigns adaptive watermark embedding strengths by pre-estimating the signal degradation characteristics, which greatly improves the computational efficiency. The SPIHT tree structure and HVS masking are used to guide the watermark embedding, which greatly reduces the signal quality loss caused by watermark embedding.
Experimental results show that the tree structure based scheme can evaluate image and video quality with high accuracy in terms of PSNR, wPSNR, JND, SSIM and VIF under JPEG compression, JPEG2000 compression, Gaussian low-pass filtering, Gaussian noise distortion, H.264 compression and packet loss related distortion.
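The central idea of a watermark-based quality metric, namely that the degradation of an embedded watermark tracks the degradation of the host signal, can be demonstrated with a toy quantization watermark: as distortion grows, PSNR falls and the watermark bit-error rate rises, tracing out a mapping curve. A minimal numpy sketch using spatial-domain embedding with a fixed step; the thesis's semi-fragile DWT/SPIHT schemes with adaptive strengths are far more sophisticated.

```python
import numpy as np

DELTA = 8.0                                      # illustrative embedding step
rng = np.random.default_rng(7)
cover = rng.uniform(0, 255, (64, 64))
bits = rng.integers(0, 2, cover.shape).astype(bool)

# Quantisation watermark in the spatial domain: each pixel is snapped to an
# even (bit 0) or odd (bit 1) multiple of DELTA/2.
marked = np.round(cover / DELTA) * DELTA + np.where(bits, DELTA / 2, 0.0)

def psnr(a, b, peak=255.0):
    return 10 * np.log10(peak ** 2 / np.mean((a - b) ** 2))

def watermark_ber(img):
    """Fraction of watermark bits destroyed by whatever distorted `img`."""
    m = np.mod(img, DELTA)
    rec = (m >= DELTA / 4) & (m < 3 * DELTA / 4)
    return np.mean(rec != bits)

# Distort with increasing noise; watermark degradation tracks signal quality,
# which is the idea behind an experimentally fitted "mapping curve".
for sigma in (0.5, 1, 2, 4, 8):
    distorted = marked + rng.normal(0, sigma, marked.shape)
    print(f"sigma {sigma:4.1f}: PSNR {psnr(marked, distorted):5.1f} dB, "
          f"watermark BER {watermark_ber(distorted):.3f}")
```

A receiver that knows only the watermark can therefore estimate the quality of the received signal without access to the original, which is what makes this a Reduced- or No-Reference approach.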
