11

Hokua – A Wavelet Method for Audio Fingerprinting

Lutz, Steven S. 20 November 2009 (has links) (PDF)
In recent years, multimedia identification has become important as the volume of digital media has dramatically increased. With music files, one method of identification is audio fingerprinting. The underlying method for most algorithms is the Fourier transform. However, due to a lack of temporal resolution, these algorithms rely on the short-time Fourier transform. We propose an audio fingerprinting algorithm that uses a wavelet transform, which has good temporal resolution. In this thesis, we examine the basics of certain topics that are needed in understanding audio fingerprinting techniques. We also look at a brief history of work done in this field. We introduce a new algorithm, called the Hokua algorithm. We developed Hokua to take advantage of certain properties of the wavelet transform. The algorithm uses coefficient peaks of wavelet transforms to identify a sample query. Hokua's performance is then compared with that of existing fingerprinting algorithms.
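
The abstract describes the peak-based mechanism only in prose. As a rough sketch of the general idea (not the actual Hokua implementation, which is not reproduced here), a fingerprint built from wavelet coefficient peaks might look like the following, assuming Python with NumPy and PyWavelets; the function name and parameter choices are hypothetical.

```python
import numpy as np
import pywt

def wavelet_peak_fingerprint(signal, wavelet="db4", level=5, peaks_per_band=20):
    """Illustrative sketch: fingerprint a signal from the largest
    wavelet coefficients in each sub-band (not the Hokua code itself)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    fingerprint = []
    for band, c in enumerate(coeffs):
        # Keep positions of the strongest coefficients in this band,
        # normalised by band length so different-length queries compare.
        idx = np.argsort(np.abs(c))[-peaks_per_band:]
        fingerprint.extend((band, pos / len(c)) for pos in sorted(idx))
    return fingerprint

# Example: fingerprint one second of a synthetic "song" sampled at 8 kHz
t = np.linspace(0, 1, 8000, endpoint=False)
song = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 880 * t)
print(wavelet_peak_fingerprint(song)[:5])
```
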
12

The Parameter Signature Isolation Method and Applications

McCusker, James Richard 13 May 2011 (has links)
The aim of this research was to develop a method of system identification that draws inspiration from the approach taken by human experts for simulation model tuning and validation. Human experts are able to use their natural pattern recognition ability to identify the various shape attributes, or signatures, of a time series from simulation model outputs. They can also intelligently and effectively perform tasks ranging from system identification to model validation. However, the feature extraction approach they employ cannot be readily automated, owing to the difficulty of measuring shape attributes. In order to bridge the gap between the approach taken by human experts and traditional iterative approaches, a method to quantify the shape attributes was devised. The method presented in this dissertation, the Parameter Signature Isolation Method (PARSIM), uses the continuous wavelet transform to characterize specific aspects of the time series shape through surfaces in the time-scale domain. A salient characteristic of these surfaces is their enhanced delineation of the model outputs and/or their sensitivities. One benefit of this enhanced delineation is the capacity to isolate regions of the time-scale plane, coined parameter signatures, wherein an individual output sensitivity dominates all the others. The parameter signatures enable the error of each model parameter to be estimated separately, making the method directly applicable to parameter estimation. The proposed parameter estimation method has several unique features, among them a capacity for noise suppression: because it relies entirely on the time-scale domain, noise can be compensated directly in that domain. Yet another utility of parameter signatures is in measurement selection, since the existence of a parameter signature indicates that the corresponding model parameter is identifiable through a given output. The effectiveness of PARSIM is demonstrated on an array of theoretical models, such as the Lorenz system and the Van der Pol oscillator, as well as on real-world simulation models of an injection molding process and a jet engine.
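
The dominance criterion above lends itself to a compact numerical illustration. The sketch below approximates a parameter signature as the set of time-scale points where one parameter's sensitivity surface accounts for most of the total magnitude; this is a simplified stand-in for PARSIM's actual isolation procedure, written in Python with PyWavelets, and every name and threshold in it is an assumption.

```python
import numpy as np
import pywt

def parameter_signatures(sensitivities, scales, dominance=0.8):
    """Simplified sketch of the PARSIM idea: find time-scale regions where
    one parameter's output sensitivity dominates all others.
    `sensitivities` maps parameter name -> 1-D sensitivity time series."""
    surfaces = {
        name: np.abs(pywt.cwt(s, scales, "morl")[0])
        for name, s in sensitivities.items()
    }
    stacked = np.stack(list(surfaces.values()))  # (n_params, n_scales, n_times)
    total = stacked.sum(axis=0) + 1e-12
    # A point belongs to a parameter's signature if that parameter accounts
    # for at least `dominance` of the total sensitivity magnitude there.
    return {name: stacked[i] / total >= dominance
            for i, name in enumerate(surfaces)}

t = np.linspace(0, 10, 500)
sens = {"k": np.sin(t) * np.exp(-0.1 * t), "c": np.cos(3 * t)}
sigs = parameter_signatures(sens, scales=np.arange(1, 32))
print({n: int(m.sum()) for n, m in sigs.items()})  # signature sizes in pixels
```
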
13

On the application of raised-cosine wavelets for multicarrier systems design

Anoh, Kelvin O.O., Mapoka, Trust T., Abd-Alhameed, Raed, Ochonogor, O., Jones, Steven M.R. 08 1900 (has links)
New orthogonal wavelet transforms can be designed by changing the wavelet basis functions or by constructing new low-pass filters (LPF). One family of wavelets may be better suited to a particular application than another. In this study, the wavelet transform based on the raised-cosine spectrum is used as an independent orthogonal wavelet to study multicarrier modulation behaviour over a multipath channel environment. The raised-cosine wavelet is then compared with other well-known orthogonal wavelets that are also used to build multicarrier modulation systems. Traditional orthogonal wavelets do not have side-lobes, while raised-cosine wavelets have many; these characteristics influence the wavelet's behaviour. It is shown that the raised-cosine wavelet transform, as an orthogonal wavelet, does not support the design of multicarrier applications as well as the existing well-known orthogonal wavelets.
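
For readers unfamiliar with the raised-cosine spectrum mentioned above, the sketch below samples the textbook raised-cosine impulse response, the kind of low-pass prototype from which such filter banks are built; it is a generic illustration in Python/NumPy, not the filter construction used in the paper, and the function name and parameter values are hypothetical.

```python
import numpy as np

def raised_cosine_taps(beta, span, sps):
    """Textbook raised-cosine impulse response, sampled (assumes beta > 0).
    beta: roll-off factor, span: length in symbol periods, sps: samples/symbol."""
    t = np.arange(-span * sps // 2, span * sps // 2 + 1) / sps  # symbol periods
    denom = 1.0 - (2.0 * beta * t) ** 2
    num = np.sinc(t) * np.cos(np.pi * beta * t)
    singular = np.isclose(denom, 0.0)
    # L'Hopital limit at the singular points t = +/- 1/(2*beta)
    h = np.where(singular, (np.pi / 4.0) * np.sinc(1.0 / (2.0 * beta)),
                 num / np.where(singular, 1.0, denom))
    return h / h.sum()

taps = raised_cosine_taps(beta=0.35, span=8, sps=8)
print(len(taps), taps[len(taps) // 2])  # 65 taps; the centre tap is the peak
```
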
14

Analyzing the Effect of Moving Resonance on Seismic Response of Structures using Wavelet Transforms

Naga, Pradeep 02 September 2011 (has links)
Nonlinear structures, when subjected to multiple ground motion records scaled to a consistent ground motion intensity, show significant variation in their response. This effect of ground motion randomness on the variation of structural response is termed record-to-record (RTR) variability. Ground motion characteristics that contribute to this variability include the variation of signal composition (frequency content) with time, known as spectral nonstationarity. The phenomenon of moving resonance, which occurs when the frequency content of the ground motion shifts in a manner similar to the natural frequencies of the structural response, is likely a contributor to this variability, which motivates a deeper understanding of its sources. The present study develops a method to analyze the time-frequency content of a ground motion, assess the occurrence of moving resonance, and quantify its potential to affect structural systems. Bilinear elastic and elastoplastic hysteretic behavior were considered. A detailed analysis quantifies the effect of moving resonance on structural systems for 22 far-field ground motion records. The wavelet coefficient plots revealed characteristics of the ground motions that were not apparent from the acceleration time histories and response spectra. Instances of moving resonance were found to be significant, and the resulting amplification quite large. In one instance studied in detail (the accelerogram of the Northridge earthquake recorded at Beverly Hills), the peak displacement was amplified by a factor of six relative to the peak displacement expected had the system not exhibited moving resonance. Based on the analysis results, the characteristics of ground motion records that do not cause a significant moving resonance effect on structural systems were observed, and likewise the characteristics of those that do were examined. / Master of Science
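
A time-frequency analysis of the kind described can be prototyped in a few lines. The sketch below tracks a ground motion's dominant frequency over time with a Morlet CWT, the first ingredient in checking whether that frequency drifts alongside a softening structure's natural frequency; it uses Python with PyWavelets, and all names, signals, and parameter values are illustrative assumptions rather than the study's procedure.

```python
import numpy as np
import pywt

def dominant_frequency_track(accel, dt, freqs):
    """Sketch: track a ground motion's dominant frequency over time
    with a Morlet CWT, one ingredient of a moving-resonance check."""
    fc = pywt.central_frequency("morl")   # scale = fc / (freq * dt)
    scales = fc / (freqs * dt)
    coefs, _ = pywt.cwt(accel, scales, "morl", sampling_period=dt)
    return freqs[np.argmax(np.abs(coefs), axis=0)]  # dominant freq per sample

dt = 0.02
t = np.arange(0, 20, dt)
# Synthetic motion whose frequency content drifts downward, as a softening
# structure's natural frequency also would -- the moving-resonance scenario.
accel = np.sin(2 * np.pi * (2.0 - 0.05 * t) * t)
track = dominant_frequency_track(accel, dt, freqs=np.linspace(0.5, 4.0, 60))
print(track[::100])
```
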
15

CLASSIFICATION OF HIGH IMPEDANCE FAULTS, INCIPIENT FAULTS AND CIRCUIT BREAKER RESTRIKES DURING CAPACITOR BANK DE-ENERGIZATION IN RADIAL DISTRIBUTION FEEDERS

Almalki, Mishrari Metab 01 May 2018 (has links)
Monitoring abnormal events in a distribution feeder with a single technique is a challenging task. Many abnormal events can cause unsafe operation, including a high impedance fault (HIF) caused by a downed conductor touching the ground surface, an incipient fault (IF) caused by partial breakdown of cable insulation, and a circuit breaker (CB) malfunction during capacitor bank de-energization that causes current restrikes. These abnormal events are not detectable by conventional protection schemes. In this dissertation, a new technique to identify distribution feeder events is proposed, based on the complex Morlet wavelet (CMW) and a decision tree (DT) classifier. First, the event is detected using the CMW. Subsequently, a DT using event signatures classifies the event as normal operation, a continuous arcing event, or a non-continuous arcing event (C.A.E. and N.C.A.E.). Additional information from the supervisory control and data acquisition (SCADA) system can be used to identify the event precisely. The proposed method is meticulously tested on the IEEE 13- and 34-bus systems and is shown to classify these events correctly. Furthermore, it is capable of detecting very high impedance incipient faults and CB restrikes at the substation level with relatively short detection times. The proposed method uses only current measurements at a low sampling rate of 1440 Hz, an improvement over existing methods that require much higher sampling rates.
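
As a toy illustration of the detect-then-classify pipeline described above (complex Morlet wavelet features feeding a decision tree), the following Python sketch uses PyWavelets and scikit-learn on synthetic 1440 Hz current windows; the feature set and the synthetic signals are invented for illustration and are not the dissertation's event signatures.

```python
import numpy as np
import pywt
from sklearn.tree import DecisionTreeClassifier

FS = 1440  # sampling rate used in the dissertation (Hz)

def cmor_features(current):
    """Sketch: summary features from a complex Morlet CWT of one
    feeder-current window (illustrative choices only)."""
    coefs, _ = pywt.cwt(current, np.arange(1, 16), "cmor1.5-1.0",
                        sampling_period=1 / FS)
    mag = np.abs(coefs)
    return [mag.mean(), mag.max(), mag.std(), np.angle(coefs).std()]

rng = np.random.default_rng(0)
t = np.arange(FS) / FS
normal = [np.sin(2 * np.pi * 60 * t) + 0.01 * rng.standard_normal(FS)
          for _ in range(20)]
arcing = [np.sin(2 * np.pi * 60 * t) * (1 + 0.3 * rng.random(FS))
          for _ in range(20)]
X = [cmor_features(w) for w in normal + arcing]
y = [0] * 20 + [1] * 20  # 0 = normal operation, 1 = arcing-like event
clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(clf.predict([cmor_features(arcing[0])]))
```
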
16

Characterization and application of analysis methods for ECG and time interval variability data

Tikkanen, P. (Pauli) 09 April 1999 (has links)
Abstract The quantitation of the variability in cardiovascular signals provides information about the autonomic neural regulation of the heart and the circulatory system. Several factors have an indirect effect on these signals, and artifacts and several types of noise are contained in the recorded signal. The dynamics of RR and QT interval time series have also been analyzed in order to predict the risk of adverse cardiac events and to diagnose them. An ambulatory measurement setting is an important and demanding condition for the recording and analysis of these signals, so sophisticated and robust signal analysis schemes are increasingly needed. In this thesis, essential points related to ambulatory data acquisition and analysis of cardiovascular signals are discussed, including the accuracy and reproducibility of the variability measurement. The origin of artifacts in RR interval time series is discussed, and their effects and possible correction procedures are considered. Time series that include intervals differing from normal sinus rhythm sometimes carry important information, but may not as such be suitable for analysis by all approaches. Significant variation in the results of either intra- or intersubject analysis is unavoidable and should be kept in mind when interpreting results. In addition to heart rate variability (HRV) measurement using RR intervals, the dynamics of ventricular repolarization duration (VRD) is considered using the invasively obtained action potential duration (APD) and different estimates of the QT interval taken from a surface electrocardiogram (ECG). Estimating the low quantity of VRD variability involves obvious potential errors and stricter requirements. In this study, the accuracy of VRD measurement was improved by a better time resolution obtained through interpolating the ECG. Furthermore, the RTmax interval was chosen as the best QT interval estimate using simulated noise tests. A computer program was developed for time interval measurement from ambulatory ECGs. This thesis reviews the most commonly used analysis methods for cardiovascular variability signals, including time and frequency domain approaches. The estimation of the power spectrum is presented using an autoregressive (AR) model of the time series, and a method for estimating the powers and spectra of its components is also presented. Time-frequency and time-variant spectral analysis schemes with applications to HRV analysis are presented. As a novel approach, wavelet and wavelet packet transforms and the theory of signal denoising, with several principles for threshold selection, are examined. The wavelet packet based noise removal approach makes use of an optimized signal decomposition scheme called the best tree structure. Wavelet and wavelet packet transforms are further used to test their efficiency in removing simulated noise from the ECG. Power spectrum analysis is examined by means of wavelet transforms, which are then applied to estimate nonstationary RR interval variability. Chaotic modelling is discussed with important questions related to HRV analysis.
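
The threshold-selection principles mentioned above are easy to demonstrate. The sketch below applies soft thresholding with the universal threshold to a noisy synthetic signal using the plain discrete wavelet transform; the thesis's approach additionally uses wavelet packets with a best-tree decomposition, which this simplified Python/PyWavelets illustration omits, and the wavelet and level choices are assumptions.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db6", level=4):
    """Sketch of wavelet threshold denoising with the universal threshold
    (one of several threshold-selection rules the thesis reviews)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise level estimated from the finest detail band (median/0.6745 rule)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)

t = np.linspace(0, 1, 1024)
noisy = np.sin(2 * np.pi * 5 * t) \
        + 0.3 * np.random.default_rng(1).standard_normal(1024)
clean = wavelet_denoise(noisy)
print(np.std(noisy - clean))  # magnitude of the removed component
```
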
17

Fingerprinting for Chiplet Architectures Using Power Distribution Network Transients

Burke, Matthew G 09 August 2023 (has links) (PDF)
Chiplets have become an increasingly popular technology for extending Moore's Law and improving the reliability of integrated circuits. They do this by placing several small, interacting chips on an interposer rather than the traditional, single chip used for a device. Like any other type of integrated circuit, chiplets are in need of a physical layer of security to defend against hardware Trojans, counterfeiting, probing, and other methods of tampering and physical attacks. Power distribution networks (PDNs) are ubiquitous across chiplet and monolithic ICs, and are essential to the function of the device. Thus, we propose a method of fingerprinting transient signals within the PDN to identify individual chiplet systems and physical-layer threats against these devices. In this work, we describe a Python-wrapped HSPICE model we have built to automate testing of our proposed PDN fingerprinting methods. We also document the methods of analysis used (wavelet transforms and time-domain measurements) to identify unique characteristics in the voltage response signals to transient stimuli. We provide the true positive and false positive rates of these methods for a simulated lineup of chips across varying operating conditions to determine the uniqueness and reliability of our techniques. Our simulations show that, if chips are characterized at varying supply voltage and temperature conditions in the factory, and the sensors used for identification meet the sample rates and voltage resolutions used in our tests, our protocol provides sufficient uniqueness and reliability for enrollment. We recommend that experimentation be done to evaluate our methods in hardware and to implement sensing techniques that meet the requirements shown in this work.
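
To make the fingerprint-and-match idea concrete, the sketch below derives a feature vector from a simulated PDN voltage transient (per-band wavelet energies plus a few time-domain measurements) and compares it against an enrolled reference; the feature set, tolerance, and synthetic transient are all assumptions for illustration, not the thesis's protocol.

```python
import numpy as np
import pywt

def pdn_fingerprint(v_transient, wavelet="db4", level=4):
    """Sketch: a fingerprint vector from a PDN voltage transient, combining
    per-band wavelet energies with simple time-domain features
    (illustrative, not the exact features used in the thesis)."""
    coeffs = pywt.wavedec(v_transient, wavelet, level=level)
    band_energy = [float(np.sum(c ** 2)) for c in coeffs]
    time_feats = [v_transient.min(), float(np.argmin(v_transient)),
                  v_transient.std()]
    return np.array(band_energy + time_feats)

def matches(enrolled, measured, tol=0.05):
    """Accept if the measured fingerprint lies within a relative tolerance."""
    return np.linalg.norm(measured - enrolled) / np.linalg.norm(enrolled) < tol

t = np.linspace(0, 1e-6, 512)
chip_a = np.exp(-t / 2e-7) * np.sin(2 * np.pi * 5e7 * t)  # droop-and-ring
enrolled = pdn_fingerprint(chip_a)
print(matches(enrolled, pdn_fingerprint(chip_a + 1e-3)))   # small perturbation
```
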
18

Development of digital imaging technologies for the segmentation of solar features and the extraction of filling factors from SODISM images

Alasta, Amro F.A. January 2018 (has links)
Solar images are one of the most important sources of available information on the current state and behaviour of the Sun, and the PICARD satellite is one of several ground and space-based observatories dedicated to the collection of that data. The PICARD satellite hosts the Solar Diameter Imager and Surface Mapper (SODISM), a telescope aimed at continuously monitoring the Sun. It has generated a huge cache of images and other data that can be analysed and interpreted to improve the monitoring of features such as sunspots, and the prediction and diagnosis of solar activity. In proportion to the available raw material, the little-published analysis of SODISM data has provided the impetus for this study, specifically a novel method of contributing to the development of a system to enhance, detect and segment sunspots using new hybrid methods. This research aims to yield an improved understanding of SODISM data by providing novel methods to tabulate a sunspot and filling factor (FF) catalogue, which will be useful for future forecasting activities. The technologies developed and the findings achieved in this research will serve as a cornerstone to enhance the accuracy of sunspot segmentation, create efficient filling factor catalogue systems, and enhance our understanding of SODISM image enhancement. The results achieved can be summarised as follows: i) a novel enhancement method for SODISM images; ii) new efficient methods to segment dark regions and detect sunspots; iii) a novel filling factor catalogue including the number, size and location of sunspots; iv) a novel statistical method to summarise the FF catalogue. Image processing and partitioning techniques are used in this work; these methods have been applied to remove noise and detect sunspots, and they provide information such as sunspot number, size and filling factor. The performance of the model is compared with the filling factors extracted from other satellites, such as SOHO. The results were also compared with the NOAA catalogue, achieving a precision of 98%. Performance measurement is also introduced and applied to verify the results and evaluate the proposed methods. Algorithms, implementation, results and future work are explained in this thesis.
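
The filling factor itself is a simple ratio: sunspot pixel area over solar disk area. The sketch below computes it with a naive darkness threshold on a synthetic disk; this is a minimal Python/NumPy stand-in for the hybrid enhancement and segmentation pipeline the thesis actually develops, and every name and constant in it is an assumption.

```python
import numpy as np

def filling_factor(disk_image, disk_mask, k=3.0):
    """Sketch: sunspot filling factor as the fraction of solar-disk pixels
    darker than (mean - k*std) of the disk intensity. A simple stand-in
    for the thesis's hybrid segmentation pipeline."""
    disk = disk_image[disk_mask]
    threshold = disk.mean() - k * disk.std()
    sunspot_pixels = np.count_nonzero(disk < threshold)
    return sunspot_pixels / disk_mask.sum()

# Synthetic 200x200 solar disk with one dark "sunspot"
img = np.full((200, 200), 1000.0)
yy, xx = np.mgrid[:200, :200]
mask = (yy - 100) ** 2 + (xx - 100) ** 2 < 90 ** 2          # the disk
img[(yy - 60) ** 2 + (xx - 120) ** 2 < 5 ** 2] = 300.0       # the spot
print(f"FF = {filling_factor(img, mask):.4%}")
```
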
19

Low-complexity block dividing coding method for image compression using wavelets : a thesis presented in partial fulfillment of the requirements for the degree of Master of Engineering in Computer Systems Engineering at Massey University, Palmerston North, New Zealand

Zhu, Jihai January 2007 (has links)
Image coding plays a key role in multimedia signal processing and communications. JPEG2000 is the latest image coding standard; it uses the EBCOT (Embedded Block Coding with Optimal Truncation) algorithm. EBCOT exhibits excellent compression performance, but with high complexity. The need to reduce this complexity while maintaining performance similar to EBCOT has inspired a significant amount of research activity in the image coding community. Within the development of image compression techniques based on wavelet transforms, EZW (Embedded Zerotree Wavelet) and SPIHT (Set Partitioning in Hierarchical Trees) have played an important role. The EZW algorithm was the first breakthrough in wavelet-based image coding. The SPIHT algorithm achieves performance similar to EBCOT, but with fewer features. Another very important algorithm is SBHP (Sub-band Block Hierarchical Partitioning), which attracted significant investigation during the JPEG2000 development process. In this thesis, the history of the development of the wavelet transform is reviewed, and implementation issues for wavelet transforms are discussed. The four main coding methods mentioned above for image compression using wavelet transforms are studied in detail. More importantly, the factors that affect coding efficiency are identified. The main contribution of this research is the introduction of a new low-complexity coding algorithm for image compression based on wavelet transforms. The algorithm is based on block dividing coding (BDC) with an optimised packet assembly. Our extensive simulation results show that the proposed algorithm outperforms JPEG2000 in lossless coding, even though a narrow gap remains in lossy coding situations.
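
All of the coders discussed above share the same front end: a 2-D wavelet transform whose largest coefficients carry most of the image energy. The sketch below shows that common baseline (transform, keep the top coefficients, reconstruct) in Python with PyWavelets; the coefficient-coding stage, where BDC, EZW, SPIHT, SBHP and EBCOT actually differ, is deliberately omitted, and all parameter choices are illustrative.

```python
import numpy as np
import pywt

def compress(image, wavelet="bior4.4", level=3, keep=0.05):
    """Sketch of the wavelet-compression baseline the coders above build on:
    transform, keep only the largest coefficients, reconstruct. The entropy
    coding of the surviving coefficients is what BDC and its rivals supply."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thr = np.quantile(np.abs(arr), 1 - keep)   # keep top 5% by magnitude
    arr[np.abs(arr) < thr] = 0
    kept = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
    return pywt.waverec2(kept, wavelet)

img = np.outer(np.sin(np.linspace(0, 8, 256)), np.cos(np.linspace(0, 8, 256)))
rec = compress(img)
print(f"max reconstruction error: {np.abs(rec[:256, :256] - img).max():.4f}")
```
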
20

Wavelet Based Denoising Techniques For Improved DOA Estimation And Source Localisation

Sathish, R 05 1900 (has links) (PDF)
No description available.
