  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
71

Kan datorer höra fåglar? / Can Computers Hear Birds?

Movin, Andreas, Jilg, Jonathan January 2019 (has links)
Ljudigenkänning möjliggörs genom spektralanalys, som beräknas av den snabba fouriertransformen (FFT), och har under senare år nått stora genombrott i samband med ökningen av datorprestanda och artificiell intelligens. Tekniken är nu allmänt förekommande, i synnerhet inom bioakustik för identifiering av djurarter, en viktig del av miljöövervakning. Det är fortfarande ett växande vetenskapsområde och särskilt igenkänning av fågelsång som återstår som en svårlöst utmaning. Även de främsta algoritmer i området är långt ifrån felfria. I detta kandidatexamensarbete implementerades och utvärderades enkla algoritmer för att para ihop ljud med en ljuddatabas. En filtreringsmetod utvecklades för att urskilja de karaktäristiska frekvenserna vid fem tidsramar som utgjorde basen för jämförelsen och proceduren för ihopparning. Ljuden som användes var förinspelad fågelsång (koltrast, näktergal, kråka och fiskmås) så väl som egeninspelad mänsklig röst (4 unga svenska män). Våra resultat visar att framgångsgraden normalt är 50–70%, den lägsta var fiskmåsen med 30% för en liten databas och den högsta var koltrasten med 90% för en stor databas. Rösterna var svårare för algoritmen att särskilja, men de hade överlag framgångsgrader mellan 50% och 80%. Dock gav en ökning av databasstorleken generellt inte en ökning av framgångsgraden. Sammanfattningsvis visar detta kandidatexamensarbete konceptbeviset bakom fågelsångigenkänning och illustrerar såväl styrkorna som bristerna av dessa enkla algoritmer som har utvecklats. Algoritmerna gav högre framgångsgrad än slumpen (25%) men det finns ändå utrymme för förbättring eftersom algoritmen vilseleddes av ljud av samma frekvenser. Ytterligare studier behövs för att bedöma den utvecklade algoritmens förmåga att identifiera ännu fler fåglar och röster. 
/ Sound recognition is made possible through spectral analysis, computed by the fast Fourier transform (FFT), and has in recent years made major breakthroughs along with the rise of computational power and artificial intelligence. The technology is now used ubiquitously, in particular in the field of bioacoustics for the identification of animal species, an important task for wildlife monitoring. It is still a growing field of science, and the recognition of bird song in particular remains a hard challenge; even state-of-the-art algorithms are far from error-free. In this thesis, simple algorithms to match sounds to a sound database were implemented and assessed. A filtering method was developed to pick out characteristic frequencies at five time frames, which formed the basis for the comparison and matching procedure. The sounds used were pre-recorded bird songs (blackbird, nightingale, crow and seagull) as well as human voices (4 young Swedish males) that we recorded. Our findings show success rates typically of 50–70%, the lowest being the seagull at 30% for a small database and the highest the blackbird at 90% for a large database. The voices were more difficult for the algorithms to distinguish, but they still had overall success rates between 50% and 80%. Furthermore, increasing the database size did not in general improve success rates. In conclusion, this thesis demonstrates a proof of concept and illustrates both the strengths and the shortcomings of the simple algorithms developed. The algorithms gave better success rates than pure chance (25%), but there is room for improvement, since they were easily misled by sounds of the same frequencies. Further research will be needed to assess the devised algorithms' ability to identify even more birds and voices.
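The matching procedure described above — characteristic frequencies picked at five time frames and compared against a database — can be sketched as follows. This is a hedged illustration, not the thesis's actual code: the Euclidean-distance match, the frame count, and the toy pure-tone "songs" are my assumptions.

```python
import numpy as np

def characteristic_frequencies(signal, rate, n_frames=5):
    """Split the signal into n_frames windows and return the dominant
    frequency (FFT magnitude peak) of each window."""
    peaks = []
    for frame in np.array_split(signal, n_frames):
        spectrum = np.abs(np.fft.rfft(frame))
        freqs = np.fft.rfftfreq(len(frame), d=1.0 / rate)
        spectrum[0] = 0.0            # ignore the DC component
        peaks.append(freqs[np.argmax(spectrum)])
    return np.array(peaks)

def match(signal, rate, database):
    """Return the database label whose stored frequency signature is
    closest (in Euclidean distance) to the signal's signature."""
    sig = characteristic_frequencies(signal, rate)
    return min(database, key=lambda label: np.linalg.norm(sig - database[label]))

# Toy demonstration: two synthetic 'songs' with different dominant pitches.
rate = 8000
t = np.arange(rate) / rate                      # one second of samples
song_a = np.sin(2 * np.pi * 1000 * t)           # 1 kHz tone
song_b = np.sin(2 * np.pi * 2500 * t)           # 2.5 kHz tone
database = {
    "blackbird": characteristic_frequencies(song_a, rate),
    "seagull": characteristic_frequencies(song_b, rate),
}
noisy = song_a + 0.1 * np.random.default_rng(0).normal(size=rate)
best = match(noisy, rate, database)
```

As the abstract notes, a matcher of this kind is easily misled when two sounds share dominant frequencies, since only the peak per frame survives the filtering.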
72

Pricing Basket of Credit Default Swaps and Collateralised Debt Obligation by Lévy Linearly Correlated, Stochastically Correlated, and Randomly Loaded Factor Copula Models and Evaluated by the Fast and Very Fast Fourier Transform

Fadel, Sayed M. January 2010 (has links)
In the last decade, the credit derivatives market has grown considerably in volume. This growth was followed by the recent financial market turbulence, and the two periods together have shown how significant the credit derivatives market and its products are. On the modelling side, this growth has been paralleled by more complicated, composite credit derivatives products such as mth-to-default Credit Default Swaps (CDS), m-out-of-n CDS and collateralised debt obligations (CDO). In this thesis, the Lévy process is proposed to generalise the standard pricing model for credit risk derivatives, the Gaussian Factor Copula Model, and to overcome its limitations. One of the most important drawbacks of the Gaussian model is its lack of tail dependence; in other words, it needs a more skewed correlation structure. With the Lévy Factor Copula Model, the microscopic approach to exploring factor copula models is developed and standardised to incorporate any distribution that admits the Lévy process. Since Lévy processes cover a variety of structural assumptions, from pure jumps to continuous stochastic processes, the distributions that admit them can represent asymmetry and fat tails just as they can characterise symmetry and normal tails; as a consequence, they can capture the probabilities of both high and low events. Subsequently, other techniques that enhance the skewness of the correlation are incorporated within the Lévy Factor Copula Model, namely the 'Stochastic Correlated Lévy Factor Copula Model' and the 'Lévy Random Factor Loading Copula Model'. The Lévy process is then applied to price basket CDS and CDO by the Lévy Factor Copula Model and its skewed versions, evaluated by V-FFT for limiting and mixture cases of the Lévy skew alpha-stable distribution and the generalized hyperbolic distribution.
Numerically, the characteristic functions of the number of defaults for mth-to-default and (n/m)th-to-default CDSs, of the CDO's cumulative loss, and of the loss given default are evaluated by semi-explicit techniques, i.e. via the fast form of the DFT (FFT) and the proposed very fast form (VFFT). These fast and very fast forms reduce the computational complexity from O(N²) to, respectively, O(N log₂ N) and O(N).
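The complexity reduction cited above — from O(N²) for the direct DFT sum to O(N log₂ N) for the FFT — is easy to verify numerically. A minimal sketch (not the thesis's pricing code): the direct quadratic evaluation and the FFT agree to floating-point precision.

```python
import numpy as np

def naive_dft(x):
    """Direct O(N^2) evaluation of the DFT sum X_k = sum_n x_n e^{-2*pi*i*k*n/N}."""
    N = len(x)
    n = np.arange(N)
    W = np.exp(-2j * np.pi * np.outer(n, n) / N)   # full N x N DFT matrix
    return W @ x

rng = np.random.default_rng(1)
x = rng.normal(size=256)
X_naive = naive_dft(x)          # O(N^2) matrix-vector product
X_fft = np.fft.fft(x)           # O(N log N) fast Fourier transform
max_err = np.max(np.abs(X_naive - X_fft))
```

In pricing applications the same trick lets the characteristic function of a loss distribution be inverted over all N grid points at once, which is what makes the (V)FFT evaluation semi-explicit rather than brute-force.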
73

Numeriska fouriertransformen och dess användning : En introduktion / Numerical fourier transform and its usage : An introduction

Tondel, Kristoffer January 2022 (has links)
The aim of this bachelor's thesis is to use three variants of the discrete Fourier transform (DFT) and compare their computational cost. The transformation will be used to numerically solve partial differential equations (PDE). In its simplest form, the DFT can be regarded as a matrix multiplication. It turns out that this matrix has some nice properties that we can exploit: namely, it is well-conditioned, and the elements of its inverse are similar to those of the original matrix, which simplifies the implementation. Also, the matrix can be rewritten using different properties of complex numbers to reduce computational cost. It turns out that each transformation method has its own benefits and drawbacks. One of the methods makes the cost lower but can only use data of a fixed size. Another method needs a specific library to work but is much faster than the other two methods. The types of PDE that will be solved in this thesis are advection and diffusion, which, aided by the Fourier transform, can be rewritten as a set of ordinary differential equations (ODE). These ODEs can then be integrated in time with a Runge-Kutta method. / Detta kandidatarbete går ut på att betrakta tre olika diskreta fouriertransformer och jämföra deras beräkningstid. Fouriertransformen används sedan också för att lösa partiella differentialekvationer (PDE). Fouriertransformerna som betraktas kan ses som en matrismultiplikation. Denna matrismultiplikation visar sig ha trevliga egenskaper, nämligen att matrisen är välkonditionerad och att matrisinversens element liknar ursprungsmatrisens element, vilket kommer att underlätta implementationen. Matrisen kan dessutom skrivas om genom diverse samband hos komplexa tal för att få snabbare beräkningstid. PDE:na som betraktas i detta kandidatarbete är advektion och diffusion, vilka med speciella antaganden kan skrivas om till ordinära differentialekvationer som löses med en Runge-Kutta-metod.
Fouriertransformen används för att derivera, då det motsvarar en multiplikation. Det visar sig att alla metoder har fördelar och nackdelar. Den ena metoden gör beräkningen snabbare men kan endast använda sig av datamängder av viss storlek. Den andra metoden kräver ett specifikt bibliotek för att fungera men är mycket snabbare än de andra två.
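The matrix properties the abstract leans on — the DFT matrix is well conditioned, and its inverse has essentially the same entries — can be checked directly. A small sketch of my own (the size N = 64 is arbitrary): the inverse is the conjugated matrix scaled by 1/N, and the normalised matrix is unitary, so its condition number is exactly 1.

```python
import numpy as np

N = 64
n = np.arange(N)
F = np.exp(-2j * np.pi * np.outer(n, n) / N)   # DFT matrix: F[j, k] = omega^(j*k)

# The inverse has the same form with conjugated entries, scaled by 1/N:
F_inv = np.conj(F) / N
recon_err = np.max(np.abs(F @ F_inv - np.eye(N)))

# F / sqrt(N) is unitary, so its 2-norm condition number is 1 (perfectly
# conditioned) -- the property that makes the numerical DFT so stable.
cond = np.linalg.cond(F / np.sqrt(N))
```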
74

Applications of Fourier Analysis to Audio Signal Processing: An Investigation of Chord Detection Algorithms

Lenssen, Nathan 01 January 2013 (has links)
The discrete Fourier transform has become an essential tool in the analysis of digital signals. Applications have become widespread since the discovery of the Fast Fourier Transform and the rise of personal computers. The field of digital signal processing is an exciting intersection of mathematics, statistics, and electrical engineering. In this study we aim to gain understanding of the mathematics behind algorithms that can extract chord information from recorded music. We investigate basic music theory, introduce and derive the discrete Fourier transform, and apply Fourier analysis to audio files to extract spectral data.
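As one illustration of the kind of chord-detection pipeline investigated, dominant FFT peaks can be mapped to equal-tempered pitch classes. This sketch is my own assumption, not the study's actual algorithm; the fixed peak count and the synthesized A-major triad are invented for the demonstration.

```python
import numpy as np

NOTE_NAMES = ["A", "A#", "B", "C", "C#", "D", "D#", "E", "F", "F#", "G", "G#"]

def freq_to_note(freq, a4=440.0):
    """Map a frequency to the nearest equal-tempered pitch class (A4 = 440 Hz)."""
    semitones = int(round(12 * np.log2(freq / a4)))
    return NOTE_NAMES[semitones % 12]

def detect_notes(signal, rate, n_peaks=3):
    """Take the n_peaks largest spectral bins as the sounding pitches."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / rate)
    idx = np.argsort(spectrum)[-n_peaks:]
    return sorted(freq_to_note(f) for f in freqs[idx])

# Synthesize an A-major triad: A4 (440 Hz), C#5 (~554.37 Hz), E5 (~659.26 Hz).
rate = 44100
t = np.arange(rate) / rate
chord = sum(np.sin(2 * np.pi * f * t) for f in (440.0, 554.37, 659.26))
notes = detect_notes(chord, rate)
```

A real detector would use a chromagram and handle harmonics and spectral leakage; taking the raw top-k bins works here only because the tones are clean and well separated.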
75

DIGITAL RECEIVER PROCESSING TECHNIQUES FOR SPACE VEHICLE DOWNLINK SIGNALS

Natali, Francis D., Socci, Gerard G. October 1985 (has links)
International Telemetering Conference Proceedings / October 28-31, 1985 / Riviera Hotel, Las Vegas, Nevada / Digital processing techniques and related algorithms for receiving and processing space vehicle downlink signals are discussed. The combination of low minimum signal-to-noise density (C/No), large signal dynamic range, unknown time of arrival, and high space vehicle dynamics that characterises some of these downlink signals results in a difficult acquisition problem. A method for rapid acquisition is described which employs a Fast Fourier Transform (FFT). Also discussed are digital techniques for the precise measurement of space vehicle range and range rate using a digitally synthesized numerically controlled oscillator (NCO).
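FFT-based acquisition is rapid because a single transform evaluates the correlation against every frequency hypothesis at once, instead of sweeping a local oscillator serially. A hedged sketch of the idea (the sampling rate, Doppler offset and noise level are invented, and a real receiver must also search code phase and integrate over time):

```python
import numpy as np

def acquire_carrier(samples, rate):
    """Estimate an unknown carrier frequency by locating the peak of the
    FFT magnitude spectrum -- a parallel search over all frequency bins."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    return freqs[np.argmax(spectrum)]

rate = 100_000                          # 100 kHz sampling (assumed)
n = 4096
t = np.arange(n) / rate
true_doppler = 12_300.0                 # unknown offset to be acquired, in Hz
rng = np.random.default_rng(7)
signal = np.cos(2 * np.pi * true_doppler * t) + 0.5 * rng.normal(size=n)
estimate = acquire_carrier(signal, rate)
```

The estimate is accurate to within one FFT bin (rate/n Hz); a tracking loop, such as one built around an NCO, then refines it.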
76

Travel time reliability assessment techniques for large-scale stochastic transportation networks

Ng, Man Wo 07 October 2010 (has links)
Real-life transportation systems are subject to numerous uncertainties in their operation. Researchers have suggested various reliability measures to characterize their network-level performance. One of these measures is travel time reliability, defined as the probability that travel times remain below certain (acceptable) levels. Existing reliability assessment (and optimization) techniques tend to be computationally intensive. In this dissertation we develop computationally efficient alternatives. In particular, we make the following three contributions. In the first contribution, we present a novel reliability assessment methodology when the source of uncertainty is given by road capacities. More specifically, we present a method based on the theory of Fourier transforms to numerically approximate the probability density function of the (system-wide) travel time. The proposed methodology takes advantage of the established computational efficiency of the fast Fourier transform. In the second contribution, we relax the common assumption that the probability distributions of the sources of uncertainty are known explicitly. In reality, these distributions may be unavailable (or inaccurate), as we may have no (or insufficient) data to calibrate them. We present a new method to assess travel time reliability that is distribution-free in the sense that the methodology only requires that the first N moments (where N is any positive integer) of the travel time be known and that the travel times reside in a set of known and bounded intervals. Instead of deriving exact probabilities of travel times exceeding certain thresholds via computationally intensive methods, we develop analytical probability inequalities to quickly obtain upper bounds on the desired probability.
Because of the computationally intensive nature of (virtually all) existing reliability assessment techniques, the optimization of the reliability of transportation systems has generally been computationally prohibitive. The third and final contribution of this dissertation is the introduction of a new transportation network design model in which the objective is to minimize the unreliability of travel time. The computational requirements are shown to be much lower due to the assessment techniques developed in this dissertation. Moreover, numerical results suggest that it has the potential to form a computationally efficient proxy for current simulation-based network design models.
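The FFT's role in such assessments is fast convolution: the distribution of a path's travel time is the convolution of its independent link distributions, and convolution is a pointwise product in the Fourier domain. A minimal sketch (the two link distributions are invented toy numbers, not the dissertation's capacity-based model):

```python
import numpy as np

def fft_convolve(p, q):
    """Convolve two probability mass functions with the FFT
    (linear convolution via zero-padding to the full output length)."""
    n = len(p) + len(q) - 1
    return np.real(np.fft.ifft(np.fft.fft(p, n) * np.fft.fft(q, n)))

# Travel time (in minutes) on two independent links of a path:
link1 = np.array([0.0, 0.2, 0.5, 0.3])      # P(T1 = 0, 1, 2, 3)
link2 = np.array([0.0, 0.0, 0.6, 0.4])      # P(T2 = 0, 1, 2, 3)
path = fft_convolve(link1, link2)           # distribution of T1 + T2

# Travel time reliability: probability the path takes at most 5 minutes.
reliability = path[:6].sum()
```

For a network-wide distribution the same product is taken over many links at once, which is where the FFT's efficiency pays off.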
77

Homogénéisation numérique de structures périodiques par transformée de Fourier : matériaux composites et milieux poreux / Numerical homogenization of periodic structures by Fourier transform : composite materials and porous media

Nguyen, Trung Kien 21 December 2010 (has links)
Cette étude est consacrée au développement d'outils numériques basés sur la Transformée de Fourier Rapide (TFR) en vue de la détermination des propriétés effectives des structures périodiques. La première partie est dédiée aux matériaux composites. Au premier chapitre, on présente et on compare les différentes méthodes de résolution basées sur la TFR dans le contexte linéaire. Au second chapitre on propose une approche à deux échelles, pour la détermination du comportement des composites non linéaires. La méthode couple les techniques de résolution basées sur la TFR à l'échelle locale et une méthode d'interpolation multidimensionnelle du potentiel des déformations à l'échelle macroscopique. L'approche présente de nombreux avantages face aux approches existantes. D'une part, elle ne nécessite aucune approximation et d'autre part, elle est parfaitement séquentielle puisqu'elle ne nécessite pas de traiter simultanément les deux échelles. La loi de comportement macroscopique obtenue a été ensuite implémentée dans un code de calcul par éléments finis. Des illustrations dans le cas d'un problème de flexion sont proposées. La deuxième partie du travail est dédiée à la formulation d'un outil numérique pour la détermination de la perméabilité des milieux poreux saturés. Au chapitre trois, on présente la démarche dans le cas des écoulements en régime quasi-statique. La méthode de résolution repose sur une formulation en contrainte du schéma itératif basée sur la TFR, mieux adaptée pour traiter le cas des contrastes infinis. Deux extensions de cette méthode sont proposées au quatrième chapitre. La première concerne la prise en compte des effets de glissement sur la paroi de la matrice poreuse. La méthodologie employée repose sur le concept d'interphase et d'interface équivalente, introduit dans le contexte de l'élasticité des composites et adapté ici au cas des milieux poreux. Enfin, on présente l'extension de la méthode au cas des écoulements en régime dynamique.
Pour cela, on propose un nouveau schéma itératif pour la prise en compte des effets d'origine inertielle. / This study is devoted to developing numerical tools based on the Fast Fourier Transform (FFT) for determining the effective properties of periodic structures. The first part is devoted to composite materials. In the first chapter, we present and compare the different FFT-based methods in the context of linear composites. In the second chapter, we propose a two-scale approach for determining the behavior of nonlinear composites. The method uses both FFT-based iterative schemes at the local scale and a multidimensional interpolation of the strain potential at the macroscopic scale. This approach has many advantages over existing ones. Firstly, it requires no approximations for the determination of the macroscopic response. Moreover, it is sequential in the sense that the two scales need not be processed simultaneously. The macroscopic constitutive law has been derived and implemented in a finite element code. Some illustrations in the case of beam bending are proposed. The second part of the work is dedicated to the formulation of a numerical tool for determining the permeability of saturated porous media. In chapter three, we present the approach in the context of quasi-static flows. To solve the problem we propose an FFT stress-based iterative scheme, better suited to handling the case of infinite contrasts. Two extensions of this method are proposed in the fourth chapter. The first concerns the slip effects which occur at the interface between solid and fluid. The methodology uses the concept of an interphase and an equivalent interface, initially introduced in the context of elastic composites and adapted here to the case of porous media. Finally, we present the extension of the method to the dynamic context, proposing a new iterative scheme that takes the inertial terms into account.
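FFT-based iterative homogenization schemes of the kind referred to above can be illustrated in their simplest setting: a 1D periodic two-phase conductivity, where the exact effective property is the harmonic mean and so provides an independent check. The fixed-point ("basic") scheme below is a generic sketch in the spirit of such methods, not the thesis's own code; the phase values, reference medium and iteration count are my assumptions.

```python
import numpy as np

# 1D periodic two-phase conductivity field on a regular grid.
n = 256
k = np.where(np.arange(n) < n // 2, 1.0, 10.0)   # phase conductivities
k0 = 0.5 * (k.min() + k.max())                    # reference medium
E = 1.0                                           # imposed mean gradient

e = np.full(n, E)                                 # initial gradient field
for _ in range(200):
    tau = (k - k0) * e                            # polarization field
    tau_hat = np.fft.fft(tau)
    # Green operator of the reference medium in Fourier space (1D):
    # acts as -1/k0 on every nonzero mode, zero on the mean.
    e_hat = -tau_hat / k0
    e_hat[0] = n * E                              # enforce the mean gradient
    e = np.real(np.fft.ifft(e_hat))

k_eff = np.mean(k * e) / E                        # apparent effective conductivity
k_exact = 1.0 / np.mean(1.0 / k)                  # harmonic mean (exact in 1D)
```

In 2D and 3D the Green operator becomes a tensor acting mode by mode, but the structure of the iteration — local constitutive update, FFT, Green operator, inverse FFT — is the same; stress-based variants like the one proposed in the thesis change which field is iterated to handle infinite contrasts.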
78

Spectral Analysis of Nonstationary Heart Rate of Neonates Receiving Therapeutic Hypothermia Treatment

Al-Shargabi, Tareq 26 November 2013 (has links)
We studied the evolution of Heart Rate Variability (HRV) during therapeutic hypothermia in newborns with hypoxic ischemic encephalopathy (HIE) using spectral analysis. We hypothesized that HRV measures are predictive of neurological outcome in babies with HIE. Non-stationarity in the data causes inaccurate quantification of spectral power, so a modification of the power spectral analysis approach was proposed to mitigate the effect of non-stationarity. The modified and the standard approaches were applied to cardiac beat-to-beat (RR) intervals of newborns receiving hypothermia treatment. The performance of the approaches in distinguishing the RR-interval (RRi) dynamics of two groups of newborns was assessed using the area under the receiver operating characteristic (ROC) curve. Our results showed that the modified spectral analysis distinguished the two groups of neonates better than the standard approach. These results may be useful in identifying deteriorating physiology early in infants receiving hypothermia treatment and in strategizing alternate interventions for them.
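A standard spectral HRV computation of the kind being modified here looks roughly as follows: resample the irregularly spaced RR intervals to an even grid (the FFT assumes uniform sampling), then measure power in a physiological band. This is a generic sketch, not the study's modified method; the 4 Hz resampling rate, the HF band limits (0.15–0.4 Hz) and the synthetic respiratory modulation are assumptions.

```python
import numpy as np

def band_power_fraction(rri_ms, fs_resample=4.0, band=(0.15, 0.4)):
    """Fraction of RR-interval spectral power in a frequency band (e.g. HF).
    Beats are resampled to an even time grid before the FFT."""
    beat_times = np.cumsum(rri_ms) / 1000.0              # seconds
    grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs_resample)
    rri_even = np.interp(grid, beat_times, rri_ms)
    rri_even = rri_even - rri_even.mean()                # remove DC
    spectrum = np.abs(np.fft.rfft(rri_even)) ** 2
    freqs = np.fft.rfftfreq(len(rri_even), d=1.0 / fs_resample)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return spectrum[mask].sum() / spectrum[1:].sum()

# Synthetic RR series: 800 ms baseline with a 0.3 Hz (respiratory, HF-band)
# modulation -- most of the variability should land in the HF band.
t_beat = np.cumsum(np.full(300, 0.8))                    # approximate beat times
rri = 800 + 50 * np.sin(2 * np.pi * 0.3 * t_beat)        # ms
hf_fraction = band_power_fraction(rri)
```

A single FFT over the whole record like this is exactly where non-stationarity bites: if the rhythm drifts mid-record, the band powers blur, which motivates modifications such as analyzing shorter quasi-stationary segments.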
79

Deterministic Sparse FFT Algorithms

Wannenwetsch, Katrin Ulrike 09 August 2016 (has links)
No description available.
80

Tabela de covariância : um mapeamento rápido e automático de continuidade espacial

Kloeckner, Jonas January 2018 (has links)
Os modelos de covariância são ferramentas geoestatísticas essenciais para mapear a continuidade espacial. A abordagem atual busca um modelo de continuidade espacial lícito com mínima ou até mesmo sem nenhuma interferência do usuário. Alinhado a essa visão moderna, é proposto obter uma tabela de covariância que visa substituir na prática o modelo tradicional explicitamente definido de covariância. Essa tabela de covariância é obtida por meio de três etapas: interpolar o conjunto de dados para preencher um grid regular, aplicar a convolução através do algoritmo da transformada rápida de Fourier e, por fim, transformar de volta para o domínio espacial. O modelo base para extrair covariância representa o ponto chave comparando com os métodos anteriores que propuseram o uso da tabela de covariância. Os resultados são satisfatórios, tanto na validação estatística do método, quanto na rapidez de obtenção de uma análise de continuidade espacial. Um estudo de caso tridimensional ilustra a aplicação prática através de krigagem e simulação geoestatística em comparação com a modelagem espacial tradicional. / Covariance models are essential geostatistical tools for mapping spatial continuity. The current approach pursues a licit spatial continuity model with minimal or even no user interference. Aligned with this modern view, we propose to obtain a covariance table that aims at replacing in practice the traditional explicitly defined covariance model. This covariance table is obtained through a three-step workflow: interpolating the dataset to fill a regular grid, applying autoconvolution via the Fast Fourier Transform algorithm, and transforming back to the spatial domain. The base model used to extract the covariance is the turning point compared with previous methods that proposed covariance-table usage. The results are satisfactory, both in the statistical validation of the method and in the speed with which a spatial continuity analysis is obtained. A three-dimensional case study illustrates the practical application to kriging and geostatistical simulation in comparison with traditional spatial modeling.
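The FFT step of the workflow above rests on the Wiener-Khinchin relation: the covariance at every lag vector is the inverse transform of the squared spectrum of the (mean-removed) gridded data. A minimal 2D sketch of that step alone (the interpolation-to-grid stage is omitted, and the periodic wrap-around is an assumption of the FFT formulation):

```python
import numpy as np

def covariance_table(z):
    """Covariance map of a (periodic) 2D grid for every lag vector at once,
    via the Wiener-Khinchin relation: C = IFFT(|FFT(z)|^2) / N."""
    z = z - z.mean()
    spec = np.abs(np.fft.fft2(z)) ** 2
    return np.real(np.fft.ifft2(spec)) / z.size

rng = np.random.default_rng(3)
z = rng.normal(size=(32, 32))
table = covariance_table(z)

# Lag (0, 0) of the table is simply the grid variance.
var_err = abs(table[0, 0] - z.var())

# Direct (slow) covariance at lag (0, 1), with periodic wrap-around:
direct = np.mean((z - z.mean()) * (np.roll(z, -1, axis=1) - z.mean()))
lag_err = abs(table[0, 1] - direct)
```

One pair of FFTs thus replaces a separate experimental-variogram computation for every direction and lag, which is what makes the covariance table fast enough to serve as an automatic stand-in for an explicit model.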
