71

Analysis of Long-Term Utah Temperature Trends Using Hilbert-Huang Transforms

Hargis, Brent H 01 June 2014 (has links) (PDF)
We analyzed long-term temperature trends in Utah using a relatively new signal processing method called Empirical Mode Decomposition (EMD). We evaluated the available weather records in Utah and selected for analysis 52 stations with records longer than 60 years. We analyzed daily minimum and maximum temperature data using EMD, which decomposes non-stationary data (data with a trend) into periodic components and the underlying trend. Most decomposition algorithms require stationary data (no trend) with constant periods, constraints that temperature data do not meet. In addition to identifying the long-term trend, we also identified other periodic processes in the data. While the immediate goal of this research is to characterize long-term temperature trends and identify periodic processes and anomalies, these techniques can be applied to any time series. For example, this approach could be used to separate the effects of dams or other regulatory structures on river flow from natural flow, or to characterize underlying trends, anomalies, and periodic fluctuations in other water quality data over time. If these periodic fluctuations can be associated with physical processes, their causes or drivers might be discovered, helping to better understand the system. Separating and analyzing long-term temperature trends with EMD supports better evaluation of the extremes of climate change and reveals new aspects of nonlinear, nonstationary data. This research was successful and identified several areas in which it could be extended, including data reconstruction for periods with missing data. This analysis tool can be applied to various other time series records.
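The decomposition described above can be illustrated with a minimal, self-contained sifting sketch in Python (numpy/scipy only). This is a bare-bones illustration of EMD on a synthetic trend-plus-oscillation signal, not the analysis pipeline used in the thesis:

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift(x, t, n_iters=10):
    """Extract one IMF by repeated envelope-mean subtraction (crude stopping rule)."""
    h = x.copy()
    for _ in range(n_iters):
        mx = argrelextrema(h, np.greater)[0]
        mn = argrelextrema(h, np.less)[0]
        if len(mx) < 4 or len(mn) < 4:            # not enough extrema to spline
            break
        upper = CubicSpline(t[mx], h[mx])(t)      # upper envelope
        lower = CubicSpline(t[mn], h[mn])(t)      # lower envelope
        h = h - (upper + lower) / 2.0             # subtract the local mean
    return h

def emd(x, t, max_imfs=5):
    """Peel off IMFs until the residue (the underlying trend) stops oscillating."""
    imfs, residue = [], x.copy()
    for _ in range(max_imfs):
        if len(argrelextrema(residue, np.greater)[0]) < 4:
            break
        imf = sift(residue, t)
        imfs.append(imf)
        residue = residue - imf
    return imfs, residue

t = np.linspace(0.0, 10.0, 2000)
x = 0.5 * t + np.sin(2 * np.pi * 2.0 * t)  # linear "warming trend" + seasonal cycle
imfs, trend = emd(x, t)
```

By construction the IMFs and residue sum back to the original signal, and away from the boundaries the residue approximates the linear trend.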
72

Unitarily inequivalent local and global Fourier transforms in multipartite quantum systems

Lei, Ci, Vourdas, Apostolos 23 January 2023 (has links)
Yes / A multipartite system composed of n subsystems, each of which is described with ‘local variables’ in Z(d) and with a d-dimensional Hilbert space H(d), is considered. Local Fourier transforms in each subsystem are defined and related phase space methods are discussed (displacement operators, Wigner and Weyl functions, etc). A holistic view of the same system might be more appropriate in the case of strong interactions, which uses ‘global variables’ in Z(d^n) and a d^n-dimensional Hilbert space H(d^n). A global Fourier transform is then defined and related phase space methods are discussed. The local formalism is compared and contrasted with the global formalism. Depending on the values of d and n, the local Fourier transform is unitarily inequivalent or unitarily equivalent to the global Fourier transform. Time evolution of the system in terms of both local and global variables is discussed. The formalism can be useful in the general area of fast Fourier transforms.
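The unitary inequivalence can be checked numerically in a small case. The sketch below (an illustration, not the paper's proof technique) compares the eigenvalue multisets of the local transform F(2)⊗F(2) and the global transform F(4); unitarily equivalent operators must share eigenvalues, and here they do not:

```python
import numpy as np
from scipy.linalg import dft

d, n = 2, 2
F_local = dft(d, scale='sqrtn')                 # unitary d-dimensional DFT
for _ in range(n - 1):
    F_local = np.kron(F_local, dft(d, scale='sqrtn'))  # local: F(d) ⊗ F(d)
F_global = dft(d**n, scale='sqrtn')                    # global: F(d^n)

ev_local = np.sort_complex(np.linalg.eigvals(F_local))
ev_global = np.sort_complex(np.linalg.eigvals(F_global))
# different eigenvalue multisets -> no unitary U with U F_local U† = F_global
equivalent = np.allclose(ev_local, ev_global)
```

For d = 2, n = 2 the local transform has eigenvalues {1, 1, -1, -1} while the global DFT has {1, 1, -1, -i}, so the two are unitarily inequivalent.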
73

Word spotting in continuous speech using wavelet transform

Khan, W., Jiang, Ping, Holton, David R.W. January 2014 (has links)
No / Word spotting in continuous speech is considered a challenging problem due to the dynamic nature of speech. The literature contains a variety of novel techniques for isolated word recognition and spotting, most of them based on pattern recognition and similarity measures. This paper amalgamates several such techniques, including the wavelet transform, feature extraction and Euclidean distance. Based on acoustic features, the proposed system is capable of identifying and localizing a target (test) word in continuous speech of any length. The wavelet transform is used for time-frequency representation and filtering of the speech signal. Only high-intensity frequency components are passed to the feature extraction and matching process, resulting in robust performance in terms of both matching and computational cost.
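The matching step can be sketched with a toy example: a level-1 Haar approximation (standing in for the paper's wavelet front end, which is not specified in the abstract) as the feature, and a sliding Euclidean distance to localize an embedded "word". All signals and sizes below are made up for illustration:

```python
import numpy as np

def haar_approx(x):
    # level-1 Haar approximation coefficients (low-pass half of the DWT)
    return (x[0::2] + x[1::2]) / np.sqrt(2.0)

rng = np.random.default_rng(0)
speech = rng.normal(size=4096)        # stand-in for continuous speech
template = rng.normal(size=256)       # stand-in for the target word
pos = 1024
speech[pos:pos + 256] = template      # embed the "word" at a known offset

feat_t = haar_approx(template)
best, best_d = None, np.inf
for off in range(0, len(speech) - 256 + 1, 2):   # even offsets keep Haar alignment
    d = np.linalg.norm(haar_approx(speech[off:off + 256]) - feat_t)
    if d < best_d:
        best, best_d = off, d
```

The minimum-distance window lands exactly on the embedded word, since the wavelet features of an identical segment match with distance zero.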
74

Performance comparison of MIMO-DWT and MIMO-FrFT multicarrier systems

Anoh, Kelvin O.O., Ali, N.T., Migdadi, Hassan S.O., Abd-Alhameed, Raed, Ghazaany, Tahereh S., Jones, Steven M.R., Noras, James M., Excell, Peter S. January 2013 (has links)
No / In this work, we discuss two new multicarrier modulating kernels that can be adopted for multicarrier signaling: the fractional Fourier transform (FrFT) and the discrete wavelet transform (DWT). We first relate the transforms mathematically, and then, using numerical and simulation comparisons, show their performance in terms of bit error ratio (BER) for Multiple Input Multiple Output (MIMO) applications. Numerical results using BPSK and QPSK show that both can be applied to multicarrier signaling; however, it can be more resource-effective to use the DWT rather than the FrFT as the baseband multicarrier kernel.
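As a toy illustration of the DWT side of the comparison, the sketch below builds an orthonormal Haar transform matrix and uses it as a multicarrier kernel: BPSK symbols are mapped onto the wavelet "subcarriers" and recovered exactly at the receiver because the transform is unitary. The MIMO channel, noise, and the FrFT are omitted:

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar transform matrix for n a power of 2 (recursive build)."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])                 # scaling (low-pass) rows
    bot = np.kron(np.eye(n // 2), [1.0, -1.0])   # wavelet (high-pass) rows
    return np.vstack([top, bot]) / np.sqrt(2.0)

H = haar_matrix(8)
symbols = np.sign(np.random.default_rng(1).normal(size=8))  # BPSK symbols
tx = H.T @ symbols   # synthesis: map symbols onto wavelet subcarriers
rx = H @ tx          # analysis at the receiver
```

Because H is orthonormal, H @ H.T is the identity and the symbols survive the round trip untouched; this perfect-reconstruction property is what makes the DWT usable as a multicarrier kernel.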
75

Automatic Detection and Verification of Solar Features

Qahwaji, Rami S.R., Colak, Tufan January 2006 (has links)
Yes / A fast hybrid system for the automated detection and verification of active regions (plages) and filaments in solar images is presented in this paper. The system combines automated image processing with machine learning. The imaging part consists of five major stages. The solar disk is detected in the first stage, using a morphological hit-miss transform, a watershed transform and a filling algorithm. An image-enhancement technique is introduced to remove the limb-darkening effect, and intensity filtering is implemented, followed by a modified region-growing technique to detect the regions of interest (RoI). The algorithms are tested on Hα- and Ca II K3-line solar images obtained from Meudon Observatory, covering the period from July 2, 2001 to August 4, 2001. The detection algorithm is fast and achieves a false acceptance rate (FAR) of 67% and a false rejection rate (FRR) of 3% for active regions, and a FAR of 19% and an FRR of 14% for filaments, when compared with the manually detected filaments in the synoptic maps. The detection performance is enhanced further using a neural network (NN), which is trained on statistical features extracted from the RoI and non-RoI. With this combination the FAR drops to 2% for active regions and 4% for filaments. © 2006 Wiley Periodicals, Inc. Int J Imaging Syst Technol, 15, 199-210, 2005
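The modified region-growing step is not specified in detail in the abstract; the sketch below shows a plain intensity-based region grower on a synthetic image, as one simplified reading of that stage:

```python
from collections import deque
import numpy as np

def region_grow(img, seed, tol):
    """Grow a region from `seed`, adding 4-neighbours whose intensity stays
    within `tol` of the seed value (a simplified stand-in for the paper's
    modified region-growing technique)."""
    h, w = img.shape
    ref = img[seed]
    mask = np.zeros_like(img, dtype=bool)
    q = deque([seed])
    mask[seed] = True
    while q:
        y, x = q.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(img[ny, nx] - ref) <= tol):
                mask[ny, nx] = True
                q.append((ny, nx))
    return mask

img = np.zeros((64, 64))
img[20:30, 20:30] = 1.0                  # synthetic bright "plage"
mask = region_grow(img, (25, 25), tol=0.5)
```

Starting from a seed inside the bright patch, the grower recovers exactly the 10x10 region and nothing outside it.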
76

Asian Options: Inverse Laplace Transforms and Martingale Methods Revisited

Sudler, Glenn F. 06 August 1999 (has links)
Arithmetic Asian options are difficult to price and hedge since, at present, no closed-form analytical solution exists for them. This difficulty has led to the development of various methods and models for pricing these instruments. The purpose of this thesis is two-fold. First, we present an overview of the literature. Second, we develop a pseudo-analytical method proposed by Geman and Yor and present an accurate and relatively quick algorithm which can be used to price European-style arithmetic Asian options and their hedge parameters. / Master of Science
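The thesis's Geman-Yor inverse-Laplace method is analytical; as a point of contrast, a plain Monte Carlo pricer for an arithmetic-average Asian call (the standard baseline such methods are compared against) fits in a few lines. The parameters below are purely illustrative:

```python
import numpy as np

def asian_call_mc(S0, K, r, sigma, T, n_steps=64, n_paths=20000, seed=0):
    """Monte Carlo price of a discretely monitored arithmetic-average Asian
    call under geometric Brownian motion (a sketch, not the thesis's method)."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    z = rng.normal(size=(n_paths, n_steps))
    # cumulative log-returns along each simulated path
    log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    S = S0 * np.exp(log_paths)
    payoff = np.maximum(S.mean(axis=1) - K, 0.0)   # arithmetic average payoff
    return np.exp(-r * T) * payoff.mean()

price = asian_call_mc(S0=100, K=100, r=0.05, sigma=0.2, T=1.0)
```

Monte Carlo is slow to converge for hedge parameters, which is precisely the motivation for the faster pseudo-analytical algorithm the thesis develops.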
77

Genomic sequence processing: gene finding in eukaryotes

Akhtar, Mahmood, Electrical Engineering & Telecommunications, Faculty of Engineering, UNSW January 2008 (has links)
Of the many existing eukaryotic gene finding software programs, none are able to guarantee accurate identification of genomic protein coding regions and other biological signals central to the pathway from DNA to protein. Eukaryotic gene finding is difficult mainly due to the noncontiguous and non-continuous nature of genes. Existing approaches are heavily dependent on the compositional statistics of the sequences they learn from and are not equally suitable for all types of sequences. This thesis first develops efficient digital signal processing-based methods for the identification of genomic protein coding regions, and then combines the optimum signal processing-based non-data-driven technique with an existing data-driven statistical method in a novel system demonstrating improved identification of acceptor splice sites. Most existing well-known DNA symbolic-to-numeric representations map the DNA information into three or four numerical sequences, potentially increasing the computational requirements of the sequence analyzer. The proposed mapping schemes, to be used for signal processing-based gene and exon prediction, incorporate DNA structural properties in the representation, in addition to reducing complexity in subsequent processing. A detailed comparison of all DNA representations, in terms of computational complexity and relative accuracy for the gene and exon prediction problem, reveals the newly proposed 'paired numeric' to be the best DNA representation. Existing signal processing-based techniques rely mostly on the period-3 behaviour of exons to obtain one-dimensional gene and exon prediction features, and are not well equipped to capture the complementary properties of exonic/intronic regions or to deal with background noise in the detection of exons at the nucleotide level. These issues are addressed in this thesis by proposing six one-dimensional and three multi-dimensional signal processing-based gene and exon prediction features.
All one-dimensional and multi-dimensional features have been evaluated using standard datasets such as Burset/Guigo 1996, HMR195, and the GENSCAN test set. This is the first time that different gene and exon prediction features have been compared using substantial databases and nucleotide-level metrics. Furthermore, the first investigation of the suitability of different window sizes for period-3 exon detection is performed. Finally, the optimum signal processing-based gene and exon prediction scheme from our evaluations is combined with a data-driven statistical technique for the recognition of acceptor splice sites. The proposed DSP-statistical hybrid achieves a 43% reduction in false positives over WWAM, as used in GENSCAN.
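The period-3 behaviour the thesis relies on can be demonstrated with a binary-indicator DFT, the classic signal-processing cue for coding regions (a generic illustration, not the thesis's proposed 'paired numeric' representation):

```python
import numpy as np

def period3_power(seq):
    """Sum over bases of |DFT at bin N/3|^2 of the base-indicator sequences;
    large values signal period-3 (codon) structure."""
    N = len(seq)
    k = N // 3                                   # DFT bin corresponding to period 3
    total = 0.0
    for base in "ACGT":
        u = np.array([1.0 if c == base else 0.0 for c in seq])  # binary indicator
        total += abs(np.fft.fft(u)[k]) ** 2
    return total

rng = np.random.default_rng(42)
coding_like = "ATG" * 60                                   # exaggerated codon bias
random_like = "".join(rng.choice(list("ACGT"), size=180))  # intron-like background
```

The synthetic coding-like sequence concentrates spectral energy at the N/3 bin, while the random background does not, which is exactly the contrast exon-detection features exploit.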
79

Off-line signature verification

Coetzer, Johannes 03 1900 (has links)
Thesis (PhD (Mathematical Sciences))--University of Stellenbosch, 2005. / A great deal of work has been done in the area of off-line signature verification over the past two decades. Off-line systems are of interest in scenarios where only hard copies of signatures are available, especially where a large number of documents need to be authenticated. This dissertation is inspired by, amongst other things, the potential financial benefits that the automatic clearing of cheques will have for the banking industry.
80

Security and privacy in perceptual computing

Jana, Suman 18 September 2014 (has links)
Perceptual, "context-aware" applications that observe their environment and interact with users via cameras and other sensors are becoming ubiquitous on personal computers, mobile phones, gaming platforms, household robots, and augmented-reality devices. This dissertation's main thesis is that perceptual applications present several new classes of security and privacy risks to both their users and bystanders. Existing perceptual platforms are often completely inadequate for mitigating these risks. For example, we show that augmented reality browsers, a popular class of perceptual platforms, contain numerous inherent security and privacy flaws. The key insight of this dissertation is that perceptual platforms can provide stronger security and privacy guarantees by controlling the interfaces they expose to applications. We explore three approaches that perceptual platforms can use to minimize the risks of perceptual computing: (i) redesigning the perceptual platform interfaces to provide a fine-grained permission system that allows least-privileged application development; (ii) leveraging existing perceptual interfaces to enforce access control on perceptual data, apply algorithmic privacy transforms to reduce the amount of sensitive content sent to applications, and enable users to audit and control the amount of perceptual data that reaches each application; and (iii) monitoring the applications' usage of perceptual interfaces to find anomalous high-risk cases. To demonstrate the efficacy of these approaches, we first build a prototype perceptual platform that supports fine-grained privileges by redesigning the perceptual interfaces. We show that such a platform not only allows the creation of least-privileged perceptual applications but can also improve performance by minimizing the overheads of executing multiple concurrent applications.
Next, we build DARKLY, a security- and privacy-aware perceptual platform that leverages existing perceptual interfaces to deploy several different protection mechanisms: access control, algorithmic privacy transforms, and user audit. We find that DARKLY can run most existing perceptual applications with minimal changes while still providing strong security and privacy protection. Finally, we introduce peer group analysis, a new technique that detects anomalous high-risk perceptual interface usage by creating peer groups of software providing similar functionality and comparing each application's perceptual interface usage against that of its peers. We demonstrate that such peer groups can be created by leveraging information already available in software markets, such as textual descriptions and categories of applications, lists of related applications, etc. Such automated detection of high-risk applications is essential for creating a safer perceptual ecosystem, as it helps users identify and install safer applications with any desired functionality and encourages application developers to follow the principle of least privilege. / text
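Peer group analysis can be sketched in a few lines: compare an application's perceptual permissions against those of its category peers and flag permissions that are rare among them. The app names, permissions, and threshold below are all hypothetical:

```python
from collections import Counter

# Toy peer group: photo apps and their perceptual permissions (made up).
apps = {
    "photo_fun":  {"camera"},
    "photo_pro":  {"camera"},
    "photo_edit": {"camera"},
    "photo_spy":  {"camera", "microphone", "location"},  # anomalous outlier
}

def rare_permissions(app, peers, threshold=0.5):
    """Return permissions of `app` held by fewer than `threshold` of its peers."""
    counts = Counter(p for a in peers for p in peers[a] if a != app)
    n = len(peers) - 1
    return {p for p in peers[app] if counts[p] / n < threshold}

flags = rare_permissions("photo_spy", apps)
```

Here microphone and location are flagged because no other photo app requests them, while camera is not, since every peer uses it; this mirrors the intuition that a permission common in a peer group is probably functional, and a rare one is suspicious.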