About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
321

Analýza zvukové interpretace hudby metodami číslicového zpracování signálu / Analysis of Expressive Music Performance using Digital Signal Processing Methods

Fančal, Petr January 2017
The aim of this master's thesis is the analysis of musical compositions from the standpoint of the temporal resources of music. The introduction briefly describes the basic musicological terms and variables that bear directly on timing in expressive music performance. The following part of the work is devoted to known methods of digital signal processing suitable for music information retrieval from audio recordings. In the practical part, these methods are demonstrated on three recordings in the MATLAB environment and the results are compared in terms of the agogics used.
322

Structure des noyaux les plus lourds : spectroscopie du noyau ²⁵¹Fm et développement pour des traitements numériques du signal / Structure of the heaviest nuclei : spectroscopy of ²⁵¹Fm and digital signal processing development

Rezynkina, Kseniia 21 September 2016
One of the major challenges of modern nuclear physics is to understand the nuclear structure of the heaviest elements. Fission barriers calculated within the macroscopic liquid-drop model fail to explain the stability of nuclei with proton number Z≥90. Transfermium elements (Z≥100) have a vanishing liquid-drop barrier and are stabilized solely by quantum shell effects. They provide a unique laboratory for studying the evolution of nuclear structure under the extreme conditions of large mass and strong Coulomb field. Although many theories agree on the existence of an "Island of Stability", predictions of its exact location in terms of proton and neutron numbers vary greatly. Systematic study of transfermium nuclei is therefore essential to constrain theoretical models and to better understand the evolution of nuclear shells.

The interplay between single-particle and collective degrees of freedom in ²⁵¹Fm was investigated by means of combined internal conversion electron (ICE) and γ-ray spectroscopy. Excited states in ²⁵¹Fm were populated via the α-decay of ²⁵⁵No, produced in the two fusion-evaporation reactions ²⁰⁸Pb(⁴⁸Ca, 1n)²⁵⁵No and ²⁰⁹Bi(⁴⁸Ca, 2n)²⁵⁵Lr. The experiments were performed at the FLNR, JINR, Dubna. Intense beams were delivered by the U-400 cyclotron, and the separators VASSILISSA or SHELS were used to select fusion-evaporation residues. At their focal planes the GABRIELA spectrometer performed time- and position-correlated measurements of the characteristic decay properties to further isolate the nuclei of interest. ICE spectroscopy of ²⁵¹Fm was performed for the first time. These measurements made it possible to establish the multipolarities of several transitions in ²⁵¹Fm and to quantify the M2/E3 mixing ratio in the decay of the low-lying 5/2⁺ isomer. The extracted B(E3) value is compared to those found in other members of the N=151 isotonic chain and to QRPA calculations using the Gogny effective interaction.

During this work, a novel graphical method for extracting mixing ratios of nuclear transitions was developed. This intuitive, illustrative method and its limits of applicability, as well as certain aspects of calculating mixing ratios beyond those limits, are described and discussed.

Double-sided silicon strip detectors (DSSDs) are widely used in nuclear spectrometry, in particular at the focal plane of separators to detect the implantation and subsequent decay of the heaviest nuclei. It was found that mechanically disconnected strips on one face of a DSSD may lead to the occurrence of lower-energy peaks on the opposite face owing to the change in total capacitance. This effect, along with methods of restoring the correct spectra, has been studied and discussed. The use of GEANT4 simulations for resolving α-ICE summing in the DSSD and for constraining the internal conversion coefficients of the transitions involved in the decay of the nucleus of interest is presented with the example of ²²¹Th.

A significant part of the thesis work was dedicated to R&D for a new digital electronics system for the GABRIELA spectrometer and to comparative tests of several digital acquisition cards. The results of these tests, as well as the digital signal processing algorithms implemented for an unbiased off-line analysis, are presented.
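A typical pulse-processing algorithm implemented on digital acquisition cards of the kind tested here is the trapezoidal shaper, which converts a detector charge step into a flat-topped pulse whose height gives the deposited energy. The sketch below is a minimal illustrative Python version for an idealized step input (real preamplifier signals decay exponentially and additionally need pole-zero correction); the filter parameters are assumptions, not values from the thesis.

```python
import numpy as np

def trapezoid(x, k, m):
    """Trapezoidal pulse shaper for a step-like input: the difference of
    two length-k moving sums separated by a flat top of m samples."""
    def xp(i):
        return x[i] if i >= 0 else 0.0
    d = [xp(i) - xp(i - k) - xp(i - k - m) + xp(i - 2 * k - m)
         for i in range(len(x))]
    return np.cumsum(d)

# Idealized detector event: a charge step of height 5 at sample 10.
x = np.zeros(60)
x[10:] = 5.0
y = trapezoid(x, k=8, m=4)
amp = y.max() / 8  # flat-top height, normalized by the rise time k
print(amp)
```

The flat top makes the amplitude estimate insensitive to small variations in ballistic deficit and sampling phase, which is why trapezoidal shaping is a common choice in digital nuclear spectrometry.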
323

This Creature, Bride of Christ

Bober, Nicholas Bradburn 05 1900
This Creature, Bride of Christ is a composition for soprano, alto flute, viola, marimba, and computer running custom software for live interactive performance in the Max/MSP environment. The work is a setting of excerpts from The Book of Margery Kempe, an early autobiographical manuscript depicting the life of a Christian mystic. The thesis discusses the historical, sociological, and musical context of the text and its musical setting; the use of borrowed materials from music of John Dunstable, Richard Wagner, and the tradition of change ringing; and the technologies used to realize the computer accompaniment. A score of the work is also included in the appendix.
324

Clutter Removal in Single Radar Sensor Reflection Data via Digital Signal Processing

Kazemisaber, Mohammadreza January 2020
Thanks to recent improvements, robots are increasingly applicable in factories and production lines where smoke, fog, dust, and steam are inevitable. Despite their advantages, robots introduce new safety requirements when they work alongside humans. Radars can play a crucial role in this context by verifying that the zones in which robots operate are free of humans. The goal of this Master's thesis is to investigate different clutter suppression methods for single radar sensor reflection data via digital signal processing. The work was done in collaboration with ABB Jokab AB, Sweden, and the digital signal processing algorithms were implemented in Octave. A critical problem is false detection, which could cause irreparable damage; a safety system with an extremely low false alarm rate is therefore desired to reduce costs and damages. In this project, four different digital low-pass filters were studied: moving average, multiple-pass moving average, Butterworth, and window-based filters. The results are compared, and all prove to be logically consistent, broadly comparable, and usable in this context.
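Three of the four filter families compared above can be sketched in a few lines. The example below uses Python with scipy rather than the thesis's Octave, and the signal, window length, and cutoff are illustrative assumptions; it shows how each filter suppresses broadband clutter-like noise around a slow target component.

```python
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
clean = np.sin(2 * np.pi * 3 * t)                   # slow target component
noisy = clean + 0.5 * rng.standard_normal(t.size)   # clutter-like noise

# Moving-average filter: convolution with a normalized boxcar window.
win = 15
ma = np.convolve(noisy, np.ones(win) / win, mode="same")

# Multiple-pass moving average: repeated passes narrow the passband.
ma3 = ma.copy()
for _ in range(2):
    ma3 = np.convolve(ma3, np.ones(win) / win, mode="same")

# 4th-order Butterworth low-pass, applied zero-phase with filtfilt.
b, a = butter(4, 0.05)  # cutoff as a fraction of the Nyquist frequency
bw = filtfilt(b, a, noisy)

for name, y in [("moving avg", ma), ("3-pass avg", ma3), ("butterworth", bw)]:
    rmse = np.sqrt(np.mean((y - clean) ** 2))
    print(f"{name}: RMSE {rmse:.3f}")
```

Each filter trades attenuation of high-frequency clutter against distortion of the slow component; the multiple-pass moving average approaches a Gaussian response, while the Butterworth filter gives a maximally flat passband.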
325

Probability Density Function Estimation Applied to Minimum Bit Error Rate Adaptive Filtering

Phillips, Kimberly Ann 28 May 1999
It is known that a matched filter is optimal for a signal corrupted by Gaussian noise. In a wireless environment, the received signal may be corrupted by Gaussian noise and a variety of other channel disturbances: cochannel interference, multiple access interference, large- and small-scale fading, etc. Adaptive filtering is the usual approach to mitigating this channel distortion. Existing adaptive filtering techniques usually attempt to minimize the mean square error (MSE) of some aspect of the received signal with respect to the desired aspect of that signal. Adaptive minimization of MSE, however, does not always guarantee minimization of the bit error rate (BER). The main focus of this research is estimation of the probability density function (PDF) of the received signal; this PDF estimate is used to adaptively determine a solution that minimizes BER. To this end, a new adaptive procedure called the Minimum BER Estimation (MBE) algorithm has been developed. MBE shows improvement over the Least Mean Squares (LMS) algorithm in most simulations involving interference and in some multipath situations. Furthermore, the new algorithm is more robust than LMS to changes in algorithm parameters such as step size and window width. / Master of Science
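The LMS algorithm used as the baseline above minimizes instantaneous squared error by stochastic gradient descent. The MBE algorithm itself is the thesis's contribution and is not reproduced here; the following is a minimal Python sketch of the LMS baseline equalizing a toy multipath channel, with the channel taps, step size, and filter length chosen for illustration.

```python
import numpy as np

def lms_equalize(x, d, taps=8, mu=0.01):
    """LMS adaptive filter: adjust weights w by stochastic gradient
    descent on the instantaneous squared error (d[n] - w.u[n])^2."""
    w = np.zeros(taps)
    y = np.zeros(len(x))
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]  # newest sample first
        y[n] = w @ u
        e = d[n] - y[n]                  # instantaneous error
        w += mu * e * u                  # gradient step on the MSE surface
    return y, w

rng = np.random.default_rng(1)
d = np.sign(rng.standard_normal(5000))        # +/-1 symbol stream
channel = np.array([1.0, 0.4, 0.2])           # toy multipath channel
x = np.convolve(d, channel)[:len(d)]
x += 0.05 * rng.standard_normal(len(x))       # additive Gaussian noise
y, w = lms_equalize(x, d)
ber = np.mean(np.sign(y[1000:]) != d[1000:])  # BER after convergence
print(ber)
```

Note that LMS drives the MSE toward its minimum; as the abstract points out, that minimum need not coincide with the minimum-BER solution, which is the gap the MBE approach targets.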
326

Lokalisering av brunnar i ELISpot / Localization of Wells in ELISpot

Modahl, Ylva, Skoglund, Caroline January 2019
Health is a fundamental human right. To improve global health, research in the medical sector is of great importance. Reducing the time consumed by biomedical testing could accelerate the research and development of new drugs and vaccines; this could be achieved by automating biomedical analysis with computerized methods. In order to analyze pictures of biomedical tests, it is important to identify the area of interest (AOI) of the test. For example, cells and bacteria are commonly grown in petri dishes; in this case the AOI is the bottom area of the dish, since this is where the object of analysis is located.

This study was performed with the aim of comparing a few computerized methods for identifying the AOI in pictures of biomedical tests. Biomedical images from a testing method called ELISpot were used. ELISpot uses plates with up to 96 circular wells, and pictures of the separate wells were used to find the AOI corresponding to the bottom area of each well. The focus has been on comparing the performance of three edge detection methods, specifically their ability to accurately detect the edges of the well. Furthermore, a method for identifying a circle based on the detected edges was used to specify the AOI.

The study shows that methods using second-order derivatives for edge detection give the best results with respect to robustness.
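The pipeline described above (second-order-derivative edge detection followed by circle identification) can be sketched compactly. This Python example is illustrative, not the study's code: it uses a Laplacian-of-Gaussian detector on a synthetic well image and a Kasa least-squares circle fit; the image size, radius, and thresholds are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

# Synthetic well image: a bright disk of radius 30 centred at (50, 48).
yy, xx = np.mgrid[:101, :101]
img = ((xx - 50) ** 2 + (yy - 48) ** 2 <= 30 ** 2).astype(float)

# Second-order-derivative edge detector: the Laplacian of Gaussian
# responds most strongly near the well boundary.
log_resp = gaussian_laplace(img, sigma=2.0)
edge_y, edge_x = np.where(np.abs(log_resp) > 0.5 * np.abs(log_resp).max())

# Kasa least-squares circle fit: x^2 + y^2 = a*x + b*y + c is linear in
# (a, b, c); the centre is (a/2, b/2) and r = sqrt(c + cx^2 + cy^2).
A = np.column_stack([edge_x, edge_y, np.ones(len(edge_x))]).astype(float)
rhs = edge_x.astype(float) ** 2 + edge_y.astype(float) ** 2
(a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
cx, cy = a / 2, b / 2
r = np.sqrt(c + cx ** 2 + cy ** 2)
print(cx, cy, r)
```

The fitted circle then delimits the AOI (the well bottom) on which spot counting or other downstream analysis is performed.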
327

Digital Back End Development and Interference Mitigation Methods for Radio Telescopes with Phased-Array Feeds

Black, Richard Allen 20 August 2014
The Brigham Young University (BYU) Radio Astronomy group, in collaboration with Cornell University, the University of Massachusetts, and the National Radio Astronomy Observatory (NRAO), has in recent years developed and deployed phased-array feed (PAF) systems that demonstrated the advantages of PAFs for astronomy. However, these systems lacked the bandwidth and acquisition times necessary to be scientifically viable. This thesis outlines the development of a 20-MHz bandwidth system that can acquire for much longer periods of time and across much larger bandwidths than previous BYU systems. The deployment of this system on the 305-meter reflector at the Arecibo Observatory in Puerto Rico is also reported.

The Commonwealth Scientific and Industrial Research Organisation (CSIRO) is currently constructing a PAF-equipped synthesis imaging array named the Australian Square Kilometre Array Pathfinder (ASKAP) that offers great promise for widening fields of view and enhancing RFI mitigation techniques. Previous work in RFI mitigation has demonstrated effective cancellation for synthesis imaging arrays under the assumption that the processing bandwidth is narrow and correlator dump times are short. However, these assumptions do not necessarily reflect real-world instrument limitations. This thesis explores, in simulation, the effectiveness of adaptive array cancellation algorithms applied to the ASKAP instrument given realistic bandwidths and correlator dump times. The results demonstrate that active RFI mitigation performed across long baselines is largely ineffectual.
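One common adaptive cancellation technique for array feeds is subspace projection: estimate the interference subspace from the dominant eigenvector of the sample covariance and project it out of the beamformer weights. The Python sketch below is a generic narrowband illustration under idealized assumptions (a single strong interferer, a half-wavelength uniform linear array, arbitrary angles and power levels), not a reproduction of the thesis's ASKAP simulations.

```python
import numpy as np

rng = np.random.default_rng(2)
M, N = 8, 4000                         # array elements, time snapshots

def steer(theta):
    """Steering vector of a half-wavelength-spaced uniform linear array."""
    return np.exp(1j * np.pi * np.arange(M) * np.sin(theta))

a_sig = steer(0.2)                     # weak signal of interest
a_rfi = steer(-0.7)                    # strong interferer (RFI)
s = 0.1 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
q = 3.0 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
noise = 0.5 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
x = np.outer(a_sig, s) + np.outer(a_rfi, q) + noise

# Subspace projection: remove the dominant eigenvector of the sample
# covariance (dominated by the strong RFI), then steer to the signal.
R = x @ x.conj().T / N
_, vecs = np.linalg.eigh(R)
u = vecs[:, -1]                        # eigenvector of the largest eigenvalue
P = np.eye(M) - np.outer(u, u.conj())  # orthogonal projector
w = P @ a_sig                          # projected beamformer weights

rfi_leak = np.abs(w.conj() @ a_rfi)    # residual response toward the RFI
sig_resp = np.abs(w.conj() @ a_sig)    # retained response toward the signal
print(rfi_leak, sig_resp)
```

In the narrowband, short-dump-time regime this projection nulls the interferer almost completely while barely perturbing the signal response; the thesis's point is that this performance degrades once realistic bandwidths and correlator integration times are imposed.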
328

Comparing Android Runtime with native : Fast Fourier Transform on Android / Jämförelse av Android Runtime och native : Fast Fourier Transform på Android

Danielsson, André January 2017
This thesis investigates the performance differences between Java code compiled by Android Runtime and C++ code compiled by Clang on Android. The Fast Fourier Transform (FFT) was chosen as the test algorithm because it exemplifies workloads that demand high-performance computing on a mobile device. Several aspects that could affect a program's execution time were examined. One test measured the overhead of the Java Native Interface (JNI); the results showed that this overhead was insignificant for FFT sizes larger than 64. Another test compared matching FFT implementations in Java and native code; of the converted algorithms, the Columbia Iterative FFT performed best in both Java and C++. A third test evaluated vectorization, which proved to be an effective option for native optimization. Finally, tests examined the effect of using single-precision (float) versus double-precision (double) data types; choosing float could improve performance by using the cache more efficiently.
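The iterative FFT family benchmarked above avoids recursion by applying a bit-reversal permutation followed by in-place butterfly stages. The sketch below is a generic iterative radix-2 Cooley-Tukey FFT in Python for illustration; it is not the thesis's Java or C++ benchmark code, which additionally concerns itself with memory layout and vectorization.

```python
import cmath

def fft_iterative(x):
    """In-place iterative radix-2 Cooley-Tukey FFT (length must be 2^k)."""
    n = len(x)
    assert n and n & (n - 1) == 0, "length must be a power of two"
    a = [complex(v) for v in x]
    # Bit-reversal permutation puts butterfly operands in sequential order.
    j = 0
    for i in range(1, n):
        bit = n >> 1
        while j & bit:
            j ^= bit
            bit >>= 1
        j |= bit
        if i < j:
            a[i], a[j] = a[j], a[i]
    # Butterfly stages of size 2, 4, ..., n.
    size = 2
    while size <= n:
        w_step = cmath.exp(-2j * cmath.pi / size)
        for start in range(0, n, size):
            w = 1.0 + 0j
            for k in range(size // 2):
                u = a[start + k]
                t = w * a[start + k + size // 2]
                a[start + k] = u + t
                a[start + k + size // 2] = u - t
                w *= w_step
        size *= 2
    return a

X = fft_iterative([1, 1, 1, 1, 0, 0, 0, 0])
print([round(abs(v), 3) for v in X])
```

The sequential memory access of the iterative form is what makes it cache-friendly, which is one reason the precision of the data type (float versus double) has a measurable effect on mobile hardware.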
329

Discussion On Effective Restoration Of Oral Speech Using Voice Conversion Techniques Based On Gaussian Mixture Modeling

Alverio, Gustavo 01 January 2007
Today's world offers many ways to communicate information, and speech is among the most effective. Unfortunately, many people lose the ability to speak, which carries a large negative psychological impact; skills such as lecturing and singing must then be restored through other means. Text-to-speech synthesis, which converts text into speech, has been a popular way of restoring the capability of oral speech. Although text-to-speech systems are useful, they offer only a few default voices, none of which represents the user's own. To achieve full restoration, voice conversion must be introduced. Voice conversion is a method that adjusts a source voice to sound like a target voice, and it consists of a training process and a conversion process. Training is conducted by composing a speech corpus, covering a variety of speech sounds, to be spoken by both the source and target voices. Once training is finished, the conversion function is employed to transform the source voice into the target voice; effectively, voice conversion allows one speaker to sound like any other person. Voice conversion can therefore be applied to alter the output of a text-to-speech system so that it produces the target voice. This thesis investigates how one approach, voice conversion based on Gaussian mixture modeling, can be applied to alter the voice output of a text-to-speech synthesis system, and acceptable results were obtained with these methods. Because voice conversion and text-to-speech synthesis require a sample of the speaker recorded before voice loss for the training process, it is vital that such voice samples be recorded in advance.
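GMM-based voice conversion of the kind discussed above is usually formulated as a joint-density model: a GMM is trained on stacked source-target feature pairs, and conversion takes the posterior-weighted conditional mean of the target feature given the source feature. The Python sketch below illustrates this with a one-dimensional toy feature and a known affine source-target mapping; real systems use multidimensional spectral features (e.g. cepstra), so all data and parameters here are assumptions for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)

# Toy parallel corpus: the target feature is a known warp of the source
# feature plus noise, so the learned conversion can be checked.
x_src = rng.standard_normal(3000)
y_tgt = 2.0 * x_src + 1.0 + 0.1 * rng.standard_normal(3000)

# Train a joint-density GMM on stacked [source, target] features.
gmm = GaussianMixture(n_components=4, covariance_type="full",
                      random_state=0).fit(np.column_stack([x_src, y_tgt]))

def convert(x):
    """MMSE conversion: posterior-weighted conditional means E[y | x, m]."""
    mu_x = gmm.means_[:, 0]
    mu_y = gmm.means_[:, 1]
    s_xx = gmm.covariances_[:, 0, 0]
    s_yx = gmm.covariances_[:, 1, 0]
    # Component responsibilities given the source feature alone.
    log_w = (np.log(gmm.weights_)
             - 0.5 * np.log(2 * np.pi * s_xx)
             - 0.5 * (x[:, None] - mu_x) ** 2 / s_xx)
    w = np.exp(log_w - log_w.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    # Per-component linear regression of y on x, blended by responsibility.
    cond = mu_y + s_yx / s_xx * (x[:, None] - mu_x)
    return (w * cond).sum(axis=1)

x_new = rng.standard_normal(500)
y_hat = convert(x_new)
err = np.sqrt(np.mean((y_hat - (2.0 * x_new + 1.0)) ** 2))
print(err)
```

In a full system, the converted spectral trajectory is then fed to a vocoder to resynthesize speech in the target voice.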
330

Digital Signal Processing of SARSAT Signals Using the MEM and FFT

Chung, Kwai-Sum Thomas 07 1900
This thesis investigates the processing of emergency locator transmitter (ELT) signals used in search and rescue satellite-aided tracking (SARSAT) systems. Essentially, the system relies on ELT signals transmitted from a distressed platform being relayed through an orbiting satellite to an earth station, where signal processing can be performed.

Both linear and nonlinear signal processing methods are investigated. The linear methods include window functions, the autocorrelation function, digital filtering, and the Fast Fourier Transform (FFT); the nonlinear processing is based on the Maximum Entropy Method (MEM). In addition, additive white Gaussian noise is introduced to simulate performance under different carrier-to-noise density ratio conditions.

For a single ELT signal, the thesis shows that the MEM processor gives good spectral performance compared to the FFT when applied to all types of modulation. When multiple ELT signals are present, the MEM also provides certain benefits in improving spectral performance compared to the FFT. / Thesis / Master of Engineering (ME)
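The Maximum Entropy Method compared against the FFT above is equivalent to fitting an autoregressive (AR) model, most commonly with Burg's algorithm, and reading the spectrum off the AR polynomial. The following is a compact illustrative Python sketch, not the thesis's implementation; the model order, signal frequency, and noise level are assumptions.

```python
import numpy as np

def burg_ar(x, order):
    """Burg's method: estimate AR coefficients a (a[0] = 1) by
    minimizing the forward-plus-backward prediction error power."""
    f = np.asarray(x, dtype=float)   # forward prediction errors
    b = f.copy()                     # backward prediction errors
    a = np.array([1.0])
    for _ in range(order):
        fk, bk = f[1:], b[:-1]
        # Reflection coefficient for this lattice stage.
        k = -2.0 * np.dot(fk, bk) / (np.dot(fk, fk) + np.dot(bk, bk))
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
        f, b = fk + k * bk, bk + k * fk
    return a

# Maximum entropy spectrum of a noisy sinusoid at 0.1 cycles/sample.
rng = np.random.default_rng(4)
n = np.arange(1000)
x = np.sin(2 * np.pi * 0.1 * n) + 0.2 * rng.standard_normal(1000)
a = burg_ar(x, order=8)
w = np.linspace(0.0, 0.5, 512, endpoint=False)
A = np.exp(-2j * np.pi * np.outer(w, np.arange(len(a)))) @ a
spectrum = 1.0 / np.abs(A) ** 2      # MEM spectrum, up to a gain factor
peak = w[np.argmax(spectrum)]
print(peak)
```

Because the AR model concentrates spectral power at its pole frequencies, the MEM spectrum resolves closely spaced carriers from short records better than a same-length FFT, which is the property exploited for ELT carrier estimation.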
