91

Não-estacionariedade de séries temporais turbulentas e a grande variabilidade dos fluxos nas baixas freqüências / Time series non-stationarity and the large low frequency turbulent flux variability

Martins, Luís Gustavo Nogueira 11 August 2011 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / The high complexity of turbulent flows makes it difficult to describe phenomena such as the transport of vector and scalar quantities in the lower atmosphere, so the analysis of experimental data, particularly time series, is widely employed. The method most used by the micrometeorological community to quantify such turbulent transport is based on the statistical covariance between two variables. It is known that determining statistical quantities over very long temporal windows leads to a large flux uncertainty. At the same time, theory indicates that the association between fluxes and statistical covariance is only valid for temporally stationary series. The aim of the present study is to test the hypothesis that the uncertainty of the estimates is directly related to the non-stationarity of the series. To better understand this issue, we use a methodology based on a set of parametric and nonparametric statistical tests: the T-test, F-test, median test, U-test, and run test. The test results are also compared with the outputs of two signal-decomposition procedures: multiresolution analysis and empirical mode decomposition. The results suggest that the flux variability over large temporal scales reflects the existence of temporal trends and low-frequency components in the series considered, and that it is therefore associated more with an observational limitation of the analysis than with non-stationarity proper, since stationarity is a property of an ensemble rather than of a single realization. This limitation suggests the definition of a practical first-order stationarity, associated with temporal trends and low-frequency components whose energy is similar to or larger than that of the turbulent fluctuations. For that reason, we find that the run test is, among all those considered, the best suited for analyzing atmospheric data, because it is the most sensitive to the existence of temporal trends. Moreover, this test yields a temporal scale beyond which mesoscale and submesoscale events become important.
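The flux method described above is the eddy-covariance estimate, and the run test is one of the stationarity screens the study compares. Below is a minimal sketch of both, assuming synthetic 10 Hz data; the variable names (`w`, `c`, `n`) are illustrative assumptions, not the thesis code:

```python
# Hedged sketch: eddy-covariance flux estimate plus a Wald-Wolfowitz run
# test used as a simple stationarity screen on a single time series.
import numpy as np

def eddy_covariance_flux(w, c):
    """Flux estimate as the covariance of vertical wind w and scalar c."""
    return np.mean((w - w.mean()) * (c - c.mean()))

def run_test_z(x):
    """Run test on the median-binarized series; |z| well above 1.96
    suggests a trend/low-frequency component (practical non-stationarity)."""
    above = x > np.median(x)
    runs = 1 + np.count_nonzero(above[1:] != above[:-1])
    n1, n2 = above.sum(), (~above).sum()
    mu = 2.0 * n1 * n2 / (n1 + n2) + 1.0
    var = (2.0 * n1 * n2 * (2.0 * n1 * n2 - n1 - n2)
           / ((n1 + n2) ** 2 * (n1 + n2 - 1.0)))
    return (runs - mu) / np.sqrt(var)

rng = np.random.default_rng(0)
n = 36000                        # e.g. one hour of 10 Hz data (assumed)
w = rng.normal(size=n)
c = 0.3 * w + rng.normal(size=n) + np.linspace(0, 2, n)  # scalar with slow trend
print("flux:", eddy_covariance_flux(w, c), " run-test z:", run_test_z(c))
```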
92

Utilização da transformada Wavelet para caracterização de distúrbios na qualidade da energia elétrica / Use of the Wavelet transform for the characterization of disturbances in the power quality

Odilon Delmont Filho 22 September 2003 (has links)
This dissertation presents a study of the Wavelet transform applied to power quality, with the aim of detecting, locating, and classifying disturbances that may occur in the power system. Initially, an introduction to power quality is presented, reviewing facts and developments and explaining the main phenomena that interfere with the power quality of the Brazilian power system, owing mainly to the great demand for electronic devices produced nowadays. A review of the main methods and models currently applied worldwide to this subject is also given. The Wavelet transform is a great aid in this area of signal analysis, as it can extract time and frequency information simultaneously, unlike the Fourier transform. The various disturbances in the system were simulated with the ATP (Alternative Transients Program) software, whose model faithfully follows a real distribution system of the CPFL electric utility. The generated voltage disturbances were detected and located by the multiresolution analysis technique and later classified by the Standard Deviation Curve method.
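A small sketch of the detection step described above, assuming PyWavelets is available; a synthetic 60 Hz voltage sag stands in for the ATP-simulated disturbances, and the sampling rate is an assumption:

```python
# Illustrative sketch, not the dissertation's code: multiresolution
# analysis localizes a voltage sag in time via detail-coefficient spikes.
import numpy as np
import pywt

fs = 15360                                  # 256 samples/cycle at 60 Hz (assumed)
t = np.arange(0, 0.2, 1.0 / fs)
v = np.sin(2 * np.pi * 60 * t)
v[(t > 0.08) & (t < 0.15)] *= 0.6           # synthetic voltage sag

# Detail coefficients localize the sag edges in time
coeffs = pywt.wavedec(v, 'db4', level=6)    # [cA6, cD6, ..., cD1]
d1 = coeffs[-1]
edges = np.nonzero(np.abs(d1) > 5 * np.median(np.abs(d1)))[0]
print("disturbance edges near samples:", edges * 2)  # level-1 details ~ fs/2

# "Standard Deviation Curve": std of each level's details, usable as a
# per-disturbance feature vector for the classification stage
std_curve = [np.std(c) for c in coeffs[1:]]
print("std curve:", np.round(std_curve, 4))
```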
93

Segmentation de documents administratifs en couches couleur / Segmentation of administrative document images into color layers

Carel, Elodie 08 October 2015 (has links)
Industrial companies receive huge volumes of paper documents every day. Automation, traceability, feeding information systems, reducing costs and processing times: dematerialization has a clear economic impact. To meet industrial constraints, the traditional digitization process simplifies images by performing a background/foreground separation. However, this binarization can lead to segmentation and recognition errors. With the improvement of techniques, the document-analysis community has shown growing interest in integrating color information into the processing chain to enhance its performance. To work within the scope set by our industrial partner, an unsupervised approach was required: the goal is to handle document images even when they are encountered for the first time, regardless of their content, structure, and color properties. The first issue of this work was to identify a reasonable number of main colors observable in an image; the second was to group pixels that have both a very similar color appearance and a logical or semantic unity into consistent color layers. Provided as a set of binary images, these layers can be reinjected into the digitization chain as an alternative to the conventional binarization step. They also carry complementary information that can be exploited for segmentation, localization, or description purposes. To this end, we propose a spatio-colorimetric segmentation that yields a set of perceptually coherent local regions, called superpixels, whose size adapts to the specific content of document images. These regions are then merged into global color layers by means of a multiresolution analysis.
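As a rough illustration of the color-layer idea only (not the thesis pipeline, which works on superpixels), the sketch below clusters pixel colors into a few dominant colors with scikit-learn's KMeans and emits one binary layer per color:

```python
# Hedged sketch: dominant-color identification followed by binary color
# layers, as an alternative to plain binarization. The superpixel and
# multiresolution merging stages of the thesis are omitted.
import numpy as np
from sklearn.cluster import KMeans

def color_layers(image_rgb, n_colors=4):
    h, w, _ = image_rgb.shape
    pixels = image_rgb.reshape(-1, 3).astype(float)
    labels = KMeans(n_clusters=n_colors, n_init=10,
                    random_state=0).fit_predict(pixels)
    # One binary image per identified color layer
    return [(labels == k).reshape(h, w) for k in range(n_colors)]

demo = np.random.randint(0, 255, size=(64, 64, 3), dtype=np.uint8)
layers = color_layers(demo, n_colors=3)
print([int(layer.sum()) for layer in layers])   # pixel count per layer
```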
94

Représentation des maillages multirésolutions : application aux volumes de subdivision / Representation of multiresolution meshes : an application to subdivision volumes

Untereiner, Lionel 08 November 2013 (has links)
Volume meshes are widespread in computer graphics, scientific visualization, and numerical computation. Subdivision, simplification, and remeshing operations are sometimes used to speed up processing of these meshes. One way to manage the complexity of an object and of the numerical processing applied to it is to represent it at different scales. Existing models, however, are designed for specific approaches, which limits their use to the applications for which they were conceived. Our work presents a new model for representing multiresolution meshes in any dimension, based on the combinatorial-maps formalism. We first applied the model to multiresolution subdivision volumes; in this setting, we present several algorithms for refining an initial coarse mesh, supporting hierarchies obtained by both regular and adaptive subdivision. Finally, we propose two representations of this model that are opposite in terms of time and space cost.
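A toy illustration of a multiresolution hierarchy built by regular 1-to-4 triangle subdivision; the thesis model is based on combinatorial maps and handles any dimension and adaptive refinement, which this sketch does not capture:

```python
# Hedged sketch: a coarse-to-fine hierarchy where each level refines the
# previous one by splitting every triangle into four at edge midpoints.
import numpy as np

def subdivide(vertices, triangles):
    """One regular refinement step: split each triangle into four."""
    vertices = list(map(tuple, vertices))
    index = {v: i for i, v in enumerate(vertices)}
    def midpoint(a, b):
        m = tuple((np.asarray(vertices[a]) + np.asarray(vertices[b])) / 2.0)
        if m not in index:                  # share midpoints across edges
            index[m] = len(vertices)
            vertices.append(m)
        return index[m]
    fine = []
    for a, b, c in triangles:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        fine += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return np.array(vertices), fine

verts = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
hierarchy = [(verts, [(0, 1, 2)])]
for _ in range(3):                      # three refinement levels
    hierarchy.append(subdivide(*hierarchy[-1]))
print([len(t) for _, t in hierarchy])   # 1, 4, 16, 64 triangles
```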
95

Embedded wavelet image reconstruction in parallel computation hardware

Guevara Escobedo, Jorge January 2016 (has links)
In this thesis an algorithm is demonstrated for reconstructing hard-field tomography images from localized block areas, obtained in parallel within a multiresolution framework; the block areas are subsequently tiled to assemble the full-size image. Because the wavelet transform preserves its compact support after ramp filtering, it has received much attention as a promising route to radiation-dose reduction in medical imaging through the reconstruction of essentially localized regions. In this work, that property is exploited to reduce the time and complexity of the standard reconstruction algorithm. Reconstructing block images independently, with geometry chosen so that together they cover the full frame as a single output image, allows the blocks to be processed in parallel and the performance to be evaluated on a multiprocessor reconfigurable hardware system (e.g., an FPGA). Projection data were obtained from a simulated Radon transform (RT) at 180 evenly spaced angles. To define each relevant block area within the sinogram, the forward RT was applied to template phantoms representing the block frames. Reconstruction was then performed on a domain slightly larger than the block frame, to allow calibration overlaps when fitting adjacent block images. The 256 × 256 Shepp-Logan phantom was used to test both the parallel multiresolution and the parallel block-reconstruction generalizations. The reconstruction of a single block image in a 3-scale multiresolution framework is shown to run about 48 times faster than the standard method; assuming a parallel implementation, the time to reconstruct the full-size, full-resolution image should therefore be very close to that of a single tile.
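A hedged scaffold of the tile-parallel idea only: scikit-image's `iradon` stands in for the per-block wavelet reconstruction (the localized ramp-filtered wavelet step of the thesis is not reproduced), and each worker reconstructs and crops one of four 128 × 128 tiles:

```python
# Sketch of the tiling/parallel structure, under stated assumptions; the
# per-tile work here is ordinary filtered backprojection, not the
# thesis's localized wavelet reconstruction.
import numpy as np
from concurrent.futures import ProcessPoolExecutor
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, resize

THETA = np.linspace(0.0, 180.0, 180, endpoint=False)  # 180 evenly spaced angles

def reconstruct_tile(args):
    sinogram, r0, r1, c0, c1 = args
    return (r0, c0, iradon(sinogram, theta=THETA)[r0:r1, c0:c1])

if __name__ == "__main__":
    phantom = resize(shepp_logan_phantom(), (256, 256))
    sino = radon(phantom, theta=THETA)
    tiles = [(sino, r, r + 128, c, c + 128)
             for r in (0, 128) for c in (0, 128)]   # four 128x128 blocks
    out = np.zeros((256, 256))
    with ProcessPoolExecutor() as pool:
        for r0, c0, block in pool.map(reconstruct_tile, tiles):
            out[r0:r0 + 128, c0:c0 + 128] = block
    print("mean reconstruction error:", np.abs(out - phantom).mean())
```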
96

Non-stationary signal classification for radar transmitter identification

Du Plessis, Marthinus Christoffel 09 September 2010 (has links)
The radar transmitter identification problem involves identifying a specific radar transmitter from a received pulse. The transmitters are of identical make and model, which makes the problem challenging, since the differences between them are due solely to component tolerances and variation. Radar pulses also vary in time and frequency, which means the problem is non-stationary. For this reason, time-frequency representations such as shift-invariant quadratic time-frequency representations (Cohen's class) and wavelets were used. A model of a radar transmitter was developed, consisting of an analytical solution of a pulse-forming network and a linear model of an oscillator. Three signal-classification algorithms were developed. The first uses a radially Gaussian Cohen's class transform, refined to increase classification accuracy, with a support vector machine as the classifier. The second uses a wavelet packet transform to compute the feature values, again classified with a support vector machine. The third also uses wavelet packet features but employs a Universum-type classifier, which uses signals from the same domain to increase classification accuracy. The classifiers were compared against each other on cubic- and exponential-chirp test problems and on the radar transmitter model. The classifier based on the Cohen's class transform achieved the best classification accuracy, while the wavelet packet classifier achieved excellent results on an electroencephalography (EEG) test dataset at significantly lower complexity. Dissertation (MEng), University of Pretoria, 2010. Electrical, Electronic and Computer Engineering.
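A sketch of the shape of the second classifier, assuming PyWavelets and scikit-learn: wavelet-packet sub-band energies as features, a support vector machine for the decision. Synthetic chirps replace the radar pulses, and all parameter values are illustrative:

```python
# Hedged sketch, not the dissertation's implementation: wavelet-packet
# energy features feed an SVM to separate two near-identical "transmitters".
import numpy as np
import pywt
from sklearn.svm import SVC

def wp_features(signal, level=4):
    """Energy of each wavelet-packet sub-band at the given level."""
    wp = pywt.WaveletPacket(signal, wavelet='db4', maxlevel=level)
    return np.array([np.sum(node.data ** 2)
                     for node in wp.get_level(level, 'freq')])

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 512)
def pulse(f):                    # toy "transmitter": chirp + component noise
    return np.sin(2 * np.pi * (f + 30 * t) * t) + 0.2 * rng.normal(size=t.size)

X = np.array([wp_features(pulse(f)) for f in (50, 55) for _ in range(40)])
y = np.repeat([0, 1], 40)
clf = SVC(kernel='rbf', gamma='scale').fit(X[::2], y[::2])  # train on half
print("holdout accuracy:", clf.score(X[1::2], y[1::2]))
```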
97

"Projeto multirresolução de operadores morfológicos a partir de exemplos" / "Multiresolution design of morphological operators from examples"

Daniel André Vaquero 19 April 2006 (has links)
Solving an image processing problem can be a very complex task, typically depending on the knowledge, experience, and intuition of a specialist and on knowledge of the application domain. Motivated by this complexity, some research groups have worked on techniques for designing image operators automatically from a collection of input-output examples of the desired operator. The multiresolution approach has been used successfully in the statistical design of W-operators over large windows. This methodology uses a pyramidal window structure to aid in estimating the conditional probability distributions of patterns not observed in the training set. However, the quality of the designed operator depends directly on the chosen pyramid, a choice made by the designer based on intuition and prior knowledge of the problem. In this work, we investigate the use of conditional entropy as a criterion for automatically determining a good pyramid for W-operator design. To compute the entropy, we developed a technique that uses the multiresolution pyramidal framework as a model for estimating the joint probability distribution. The performance of the method is evaluated on the problem of handwritten digit recognition, using two different databases, with good results. A further contribution of this work is the experimentation with resolution mappings from image-pyramid theory in the context of multiresolution W-operator design.
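The selection criterion itself is easy to state in code. The sketch below estimates the conditional entropy H(Y|X) of the desired output given the observed window pattern from paired examples; the pyramidal estimation model of the thesis is not reproduced, and the data are synthetic:

```python
# Hedged sketch: H(Y|X) from (pattern, output) pairs. A pyramid whose
# window patterns make H(Y|X) small is more informative about the output.
import numpy as np
from collections import Counter

def conditional_entropy(patterns, outputs):
    """H(Y|X) in bits, estimated from paired observations."""
    joint = Counter(zip(patterns, outputs))
    marg = Counter(patterns)
    n = len(patterns)
    return -sum(cnt / n * np.log2(cnt / marg[x])
                for (x, y), cnt in joint.items())

rng = np.random.default_rng(2)
x = rng.integers(0, 4, size=5000)          # window patterns (coded as ints)
y_good = (x > 1).astype(int)               # output fully determined by pattern
y_bad = rng.integers(0, 2, size=5000)      # output independent of pattern
print(conditional_entropy(tuple(x), tuple(y_good)))  # ~0 bits
print(conditional_entropy(tuple(x), tuple(y_bad)))   # ~1 bit
```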
98

Rozpoznání ručně psaných číslic / Recognition of Handwritten Digits

Štrba, Miroslav January 2010 (has links)
Recognition of handwritten digits is a problem that can serve as a model task for multiclass recognition of image patterns. This thesis studies different kinds of algorithms (Self-Organizing Maps, Randomized trees, and AdaBoost) and methods for increasing accuracy through fusion (majority voting, averaging of log-likelihood ratios, linear logistic regression). The fusion methods were used to combine classifiers with identical training parameters, with different training methods, and with multiscale input.
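A minimal sketch of the majority-voting fusion named above; the three base classifiers are stand-ins, not the thesis models:

```python
# Hedged sketch: combine hard label predictions by per-sample majority vote.
import numpy as np

def majority_vote(predictions):
    """predictions: (n_classifiers, n_samples) array of class labels."""
    predictions = np.asarray(predictions)
    n_classes = predictions.max() + 1
    # Per-sample histogram of votes over classifiers, then argmax
    votes = np.apply_along_axis(np.bincount, 0, predictions,
                                minlength=n_classes)
    return votes.argmax(axis=0)

p1 = np.array([3, 1, 7, 7])         # e.g. Self-Organizing Map predictions
p2 = np.array([3, 2, 7, 1])         # e.g. Randomized tree predictions
p3 = np.array([3, 1, 2, 1])         # e.g. AdaBoost predictions
print(majority_vote([p1, p2, p3]))  # -> [3 1 7 1]
```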
99

Auswirkung des Rauschens und Rauschen vermindernder Maßnahmen auf ein fernerkundliches Segmentierungsverfahren / Effect of noise and of noise-reduction measures on a remote-sensing segmentation method

Gerhards, Karl 31 July 2006 (has links)
A large number of smoothing algorithms exist for reducing the noise in very high-resolution satellite images. The effect of various low-pass and edge-preserving filters on the behavior of an object-oriented segmentation method is investigated using two synthetic gray-value images and an IKONOS scene. As a noise measure, a modified procedure originally proposed by Baltsavias et al. [2001] has proven useful, in which, for each gray value, only the standard deviations of the most homogeneous regions are considered. A comparison with synthetically noised images shows, however, that this approach systematically underestimates the image noise by almost a factor of two. Simple filters such as the mean filter and methods derived from it degrade the precision of object recognition dramatically, whereas edge-preserving filters can be advantageous for noisier data. The modified EPOS filter, originally presented by Haag and Sties [1994, 1996], proves to be the best filter where segment boundaries precise to the pixel level are required, and it is controlled by only a single parameter. General image parameters such as the standard deviation or the histogram are affected only marginally by this edge-preserving filter.
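A hedged sketch of the noise measure described above (after Baltsavias et al. [2001], as modified in the thesis): for each gray value, only the local standard deviations of the most homogeneous windows are averaged. Window size, quantile, and binning are illustrative assumptions:

```python
# Sketch, not the thesis code: per-gray-value noise estimate from the
# most uniform image windows.
import numpy as np

def noise_per_gray_value(img, win=5, keep_quantile=0.1, bins=16):
    """Mean local std of the most uniform windows, grouped by gray value."""
    h, w = img.shape
    means, stds = [], []
    for r in range(0, h - win, win):
        for c in range(0, w - win, win):
            patch = img[r:r + win, c:c + win]
            means.append(patch.mean())
            stds.append(patch.std())
    means, stds = np.array(means), np.array(stds)
    noise = np.full(bins, np.nan)          # bins with no samples stay NaN
    for b in range(bins):
        sel = stds[(means >= b * 256 / bins) & (means < (b + 1) * 256 / bins)]
        if sel.size:                       # keep only the most uniform areas
            k = max(1, int(keep_quantile * sel.size))
            noise[b] = np.sort(sel)[:k].mean()
    return noise

rng = np.random.default_rng(3)
img = np.clip(rng.normal(128, 3, size=(256, 256)), 0, 255)  # true sigma = 3
print(noise_per_gray_value(img))  # populated bin near (an underestimate of) 3
```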
100

Feature extraction and similarity-based analysis for proteome and genome databases

Ozturk, Ozgur 20 September 2007 (has links)
No description available.
