61

Otimizações da transmissão de imagens em redes de sensores visuais sem fio explorando a relevância de monitoramento dos nós fontes e codificação DWT / Optimizations for image transmission in wireless visual sensor networks exploiting the sensing relevance of source nodes and DWT coding

Costa, Daniel Gouveia 29 April 2013 (has links)
Made available in DSpace on 2014-12-17T14:55:11Z (GMT). No. of bitstreams: 1 DanielGC_TESE_Capa_pag90.pdf: 3923138 bytes, checksum: b23776867381c62bd332c913640275ac (MD5) Previous issue date: 2013-04-29 / The development of wireless sensor networks for control and monitoring functions has created a vibrant investigation scenario, covering everything from communication aspects to issues related to energy efficiency. When source sensors are endowed with cameras for visual monitoring, a new scope of challenges is raised, as transmission and monitoring requirements change considerably. In particular, visual sensors collect data following a directional sensing model, altering the meaning of concepts such as vicinity and redundancy but allowing the differentiation of source nodes by their sensing relevancies for the application. In this context, we propose the combined use of two differentiation strategies as a novel QoS parameter, exploring the sensing relevancies of source nodes and DWT image coding. This innovative approach supports a new scope of optimizations to improve the performance of visual sensor networks at the cost of a small reduction in the overall monitoring quality of the application. Besides the definition of a new concept of relevance and the proposition of mechanisms to support its practical exploitation, we propose five different optimizations in the way images are transmitted in wireless visual sensor networks, aiming at energy saving, low-delay transmission and error recovery. Putting all this together, the proposed differentiation strategies and the related optimizations open a relevant research trend, where the application monitoring requirements are used to guide a more efficient operation of sensor networks / O desenvolvimento de redes de sensores sem fio para funções de controle e monitoramento tem criado um pulsante cenário de investigação, abrangendo desde aspectos da comunicação em rede até questões como eficiência energética. Quando sensores são equipados com câmeras para funções de monitoramento visual, um novo escopo de desafios é lançado, uma vez que há uma mudança significativa nos requisitos de monitoramento e transmissão. Em particular, sensores visuais coletam dados seguindo um modelo direcional de monitoramento, alterando conceitos já estabelecidos de vizinhança e redundância, porém tornando possível a diferenciação de sensores pelas suas relevâncias de monitoramento para a aplicação. Nesse contexto, propomos que a relevância de monitoramento dos sensores fontes seja explorada em conjunto com a codificação de imagens por transformada DWT, unindo assim dois diferentes escopos de relevância para a criação de novos parâmetros de QoS. Essa abordagem inovadora permite uma nova gama de otimizações da operação da rede, possibilitando aumento de desempenho com pequenas perdas na qualidade global de monitoramento. Além da definição de um novo conceito de relevância e da proposição de mecanismos para suportar sua utilização prática, cinco diferentes otimizações da transmissão de imagens em redes de sensores visuais sem fio são propostas, visando economia de energia, transmissão com baixo atraso e recuperação de erros. Em conjunto, as estratégias de diferenciação e as otimizações relacionadas abrem uma importante vertente de pesquisa, onde os requisitos de monitoramento das aplicações são utilizados para guiar uma operação mais eficiente da rede
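As a rough illustration of the differentiation idea described in this abstract, the sketch below prioritizes DWT subbands according to a node's sensing relevance, dropping the finest detail levels for low-relevance sources. This is not the thesis's actual protocol: the relevance-to-subband policy, the parameter values and the use of the PyWavelets and NumPy packages are assumptions made here purely for illustration.

```python
# Minimal sketch (not the thesis's protocol): drop DWT detail subbands
# according to a hypothetical "sensing relevance" score before transmission.
import numpy as np
import pywt

def encode_by_relevance(image, relevance, wavelet="db2", levels=3):
    """Keep more detail subbands for more relevant source nodes.

    relevance: float in [0, 1]; 1.0 keeps all subbands, lower values
    zero out the finest detail levels (hypothetical policy).
    """
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    # coeffs = [cA_n, (cH_n, cV_n, cD_n), ..., (cH_1, cV_1, cD_1)]
    keep = int(round(relevance * levels))   # number of detail levels to keep
    for i in range(1, len(coeffs)):
        level_from_coarsest = i             # 1 = coarsest detail level
        if level_from_coarsest > keep:
            coeffs[i] = tuple(np.zeros_like(c) for c in coeffs[i])
    return coeffs

def decode(coeffs, wavelet="db2"):
    return pywt.waverec2(coeffs, wavelet)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.random((128, 128))          # stand-in for a captured image
    low = decode(encode_by_relevance(frame, relevance=0.3))
    high = decode(encode_by_relevance(frame, relevance=1.0))
    print(np.abs(frame - low).mean(), np.abs(frame - high).mean())
```

A low-relevance node would then spend fewer packets (and less energy) per image, at the cost of detail, which mirrors the trade-off the abstract describes.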
62

Aplikace waveletové transformace v software Mathematica a Sage / Applications of wavelet transform in Mathematica and Sage

Novotný, Radek January 2013 (has links)
This thesis focuses on image processing using the wavelet transform. The use of the wavelet transform is analysed especially for image compression and image noise reduction purposes. The analysis describes in detail aspects and applications of the following wavelet transform methods: CWT, DWT, DTWT and 2D DWT. The thesis further explains the meaning of the mother wavelet and studies certain specific kinds of wavelets, types of thresholding and their purposes, and also touches on the JPEG2000 standard. The Mathematica and Sage software packages were used to design algorithms for image compression and image noise reduction, utilising the relevant wavelet transform findings. The concluding part of the thesis compares the two software packages and the results obtained using the different algorithms.
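One of the techniques surveyed here, wavelet-based noise reduction by coefficient thresholding, can be sketched in a few lines. The thesis itself works in Mathematica and Sage; the version below uses Python with PyWavelets as a stand-in, and the soft thresholding with the universal threshold sigma*sqrt(2 ln n) is one common choice rather than necessarily the one used in the thesis.

```python
# Illustrative sketch of wavelet denoising by soft thresholding.
# Assumes NumPy and PyWavelets; parameter choices are examples only.
import numpy as np
import pywt

def denoise_soft(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Estimate the noise level from the finest detail coefficients (MAD).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    threshold = sigma * np.sqrt(2 * np.log(len(signal)))
    # Soft-threshold every detail band; keep the approximation untouched.
    coeffs[1:] = [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)

if __name__ == "__main__":
    t = np.linspace(0, 1, 1024)
    clean = np.sin(2 * np.pi * 5 * t)
    noisy = clean + 0.3 * np.random.default_rng(1).standard_normal(t.size)
    restored = denoise_soft(noisy)
    print("noisy RMSE:   ", np.sqrt(np.mean((noisy - clean) ** 2)))
    print("denoised RMSE:", np.sqrt(np.mean((restored[:clean.size] - clean) ** 2)))
```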
63

Vorhersagbarkeit ökonomischer Zeitreihen auf verschiedenen zeitlichen Skalen / Predictability of economic time series on different time scales

Mettke, Philipp 24 November 2015 (has links)
This thesis examines three decomposition techniques and their usability for economic and financial time series. The stock index DAX30 and the exchange rate from British pound to US dollar are used as representative economic time series. Additionally, autoregressive and conditionally heteroscedastic simulations are analysed as benchmark processes alongside the real data. The discrete wavelet transform (DWT) uses wavelike functions to adapt to the behaviour of time series on different time scales. The second method is singular spectrum analysis (SSA), which is applied to extract influential reconstructed modes. As a third algorithm, empirical mode decomposition (EMD) leads to intrinsic mode functions, which reflect the short- and long-term fluctuations of the time series. Some problems arise in the decomposition process, such as bleeding with the DWT method or mode mixing across multiple EMD mode functions. Conclusions about the predictability of the time series are drawn based on entropy and recurrence analysis. The cyclic behaviour of the decompositions is examined via the coefficient of variation, based on the instantaneous frequency. The results show rising predictability, especially on higher decomposition levels. The instantaneous frequency measure leads to low values for regular oscillatory cycles, whereas irregular behaviour results in a high variation coefficient. Singular spectrum analysis shows frequency-stable cycles in the reconstructed modes, but represents the influences of the original time series worse than the other two methods, which in turn show very little frequency stability in the extracted details. Contents: 1. Introduction 2. Data basis 2.1. Selection and characteristics of economic time series 2.2. Simulation study using AR processes 2.3. Simulation study using GARCH processes 3. Decomposition using modern time series analysis techniques 3.1. Discrete wavelet transform 3.2. Singular spectrum analysis 3.3. Empirical mode decomposition 4. Assessment of predictability 4.1. Entropies as a measure of short-term predictability 4.2. Recurrence analysis 4.3. Frequency stability of the decomposition 5. Execution and interpretation of results 5.1. Visual interpretation of the decompositions 5.2. Assessment via characteristics 6. Conclusion
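The decomposition-and-frequency-stability procedure described above can be hinted at with a small sketch: decompose a simulated AR(1) benchmark with the DWT and compute, for each reconstructed detail, the coefficient of variation of its instantaneous frequency. The wavelet, the number of levels and the Hilbert-transform route to the instantaneous frequency are assumptions chosen for illustration; NumPy, SciPy and PyWavelets are assumed to be available.

```python
# Sketch: DWT decomposition of a simulated AR(1) series and the variation
# coefficient of the instantaneous frequency per detail level.
import numpy as np
import pywt
from scipy.signal import hilbert

rng = np.random.default_rng(42)

# AR(1) benchmark process: x_t = 0.8 * x_{t-1} + eps_t
n = 2048
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + rng.standard_normal()

level = 5
coeffs = pywt.wavedec(x, "db4", level=level)
for j in range(1, level + 1):
    only_dj = [np.zeros_like(c) for c in coeffs]
    only_dj[-j] = coeffs[-j]                      # keep detail level j only (1 = finest)
    detail = pywt.waverec(only_dj, "db4")[:n]
    phase = np.unwrap(np.angle(hilbert(detail)))
    inst_freq = np.diff(phase) / (2.0 * np.pi)    # cycles per sample
    cv = np.std(inst_freq) / np.mean(inst_freq)   # low CV = frequency-stable cycles
    print(f"detail level {j}: variation coefficient = {cv:.2f}")
```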
64

Restauration d'images Satellitaires par des techniques de filtrage statistique non linéaire / Satellite image restoration by nonlinear statistical filtering techniques

Marhaba, Bassel 21 November 2018 (has links)
Le traitement des images satellitaires est considéré comme l'un des domaines les plus intéressants du traitement d'images numériques. Les images satellitaires peuvent être dégradées pour plusieurs raisons, notamment les mouvements des satellites, les conditions météorologiques, la dispersion et d'autres facteurs. Plusieurs méthodes d'amélioration et de restauration des images satellitaires ont été étudiées et développées dans la littérature. Les travaux présentés dans cette thèse se concentrent sur la restauration des images satellitaires par des techniques de filtrage statistique non linéaire. Dans un premier temps, nous avons proposé une nouvelle méthode pour restaurer les images satellitaires en combinant les techniques de restauration aveugle et non aveugle. La raison de cette combinaison est d'exploiter les avantages de chaque technique utilisée. Dans un deuxième temps, de nouveaux algorithmes statistiques de restauration d'images basés sur les filtres non linéaires et l'estimation non paramétrique de densité multivariée ont été proposés. L'estimation non paramétrique de la densité a posteriori est utilisée dans l'étape de ré-échantillonnage du filtre bayésien bootstrap pour résoudre le problème de la perte de diversité dans le système de particules. Enfin, nous avons introduit une nouvelle méthode de combinaison hybride pour la restauration des images basée sur la transformée en ondelettes discrète (TOD) et les algorithmes proposés à l'étape deux, et nous avons prouvé que les performances de la méthode combinée sont meilleures que celles de l'approche TOD pour la réduction du bruit dans les images satellitaires dégradées. / Satellite image processing is considered one of the most interesting areas in the field of digital image processing. Satellite images can be degraded for several reasons, including satellite movements, weather conditions, scattering and other factors. Several methods for satellite image enhancement and restoration have been studied and developed in the literature. The work presented in this thesis focuses on satellite image restoration by nonlinear statistical filtering techniques. In the first step, we proposed a novel method to restore satellite images using a combination of blind and non-blind restoration techniques. The reason for this combination is to exploit the advantages of each technique used. In the second step, novel statistical image restoration algorithms based on nonlinear filters and nonparametric multivariate density estimation were proposed. The nonparametric estimate of the posterior density is used in the resampling step of the Bayesian bootstrap filter to resolve the problem of loss of diversity among the particles. Finally, we introduced a new hybrid combination method for image restoration based on the discrete wavelet transform (DWT) and the algorithms proposed in step two, and we showed that the performance of the combined method is better than that of the DWT approach for noise reduction in degraded satellite images.
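The loss-of-diversity problem mentioned above, and the general idea of smoothing the resampling step with a nonparametric density estimate, can be illustrated with a toy example. This is a generic sketch rather than the thesis's algorithm: the one-dimensional state, the Gaussian kernel and the Silverman-style bandwidth are all assumptions.

```python
# Toy illustration: plain bootstrap resampling collapses onto a few
# particles, while a regularized (kernel-smoothed) resampling step redraws
# from a nonparametric density estimate and keeps the particle set diverse.
import numpy as np

rng = np.random.default_rng(7)

def bootstrap_resample(particles, weights):
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

def regularized_resample(particles, weights):
    # Resample, then add kernel noise with a Silverman-style bandwidth so
    # duplicated particles are spread out (a simple KDE-based regularization).
    resampled = bootstrap_resample(particles, weights)
    bandwidth = 1.06 * np.std(resampled) * len(resampled) ** (-1 / 5)
    return resampled + bandwidth * rng.standard_normal(len(resampled))

particles = rng.normal(0.0, 1.0, size=500)
# Sharply peaked likelihood -> most weight concentrates on a few particles.
weights = np.exp(-0.5 * ((particles - 1.5) / 0.05) ** 2)
weights /= weights.sum()

plain = bootstrap_resample(particles, weights)
regular = regularized_resample(particles, weights)
print("unique particles after plain resampling:      ", np.unique(plain).size)
print("unique particles after regularized resampling:", np.unique(regular).size)
```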
65

Odhad dechové frekvence z elektrokardiogramu a fotopletysmogramu / Breathing Rate Estimation from the Electrocardiogram and Photoplethysmogram

Janáková, Jaroslava January 2021 (has links)
The master's thesis deals with the issue of obtaining the respiratory rate from ECG and PPG signals, which are widely used and easily measured signals, and not only in clinical practice. The theoretical part of the work outlines the problem of obtaining a respiratory curve from these signals. The practical part of the work focuses on the implementation of five selected methods and their final evaluation and comparison.
66

Videokodek - komprese videosekvencí / Videocodec - Videosequence Compression

Bařina, David January 2009 (has links)
This thesis deals with modern methods of lossy still image and video compression. The wavelet transform and the SPIHT algorithm belong among these methods. In the second half of the thesis, a video codec is implemented based on the acquired knowledge. This codec uses Daubechies wavelets to analyse an image; afterwards, a modified SPIHT algorithm is applied to the resulting coefficients. A lot of effort was put into optimizing this computation. The created codec can be used in the Video for Windows, DirectShow and FFmpeg multimedia frameworks. At the end of the thesis, commonly used codecs are compared with the newly created one.
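A minimal sketch of the two stages this codec combines is given below: a Daubechies wavelet analysis of a frame followed by the bit-plane significance test (|c| >= 2^n) around which SPIHT-style coders are organized. The spatial-orientation trees, sorting passes and the author's modifications are not reproduced; NumPy, PyWavelets and the particular wavelet and level count are assumptions.

```python
# Rough sketch: Daubechies wavelet analysis of a frame plus the bit-plane
# significance test used by SPIHT-style coders (not the full algorithm).
import numpy as np
import pywt

rng = np.random.default_rng(3)
frame = rng.random((256, 256))                 # stand-in for a video frame

coeffs = pywt.wavedec2(frame, "db4", level=4)
flat = np.concatenate([coeffs[0].ravel()] +
                      [band.ravel() for lvl in coeffs[1:] for band in lvl])

n_max = int(np.floor(np.log2(np.abs(flat).max())))
for n in range(n_max, max(n_max - 6, -1), -1):
    significant = np.abs(flat) >= 2 ** n       # SPIHT significance test
    print(f"bit-plane n={n}: {significant.sum()} significant coefficients")
```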
67

Processamento Inteligente de Sinais de Pressão e Temperatura Adquiridos Através de Sensores Permanentes em Poços de Petróleo / Intelligent Processing of Pressure and Temperature Signals Acquired Through Permanent Sensors in Oil Wells

Pires, Paulo Roberto da Motta 06 February 2012 (has links)
Made available in DSpace on 2014-12-17T14:08:50Z (GMT). No. of bitstreams: 1 PauloRMP_capa_ate_pag32.pdf: 5057325 bytes, checksum: bf8da0b02ad06ee116c93344fb67e976 (MD5) Previous issue date: 2012-02-06 / Originally aimed at operational objectives, the continuous measurement of well bottomhole pressure and temperature, recorded by permanent downhole gauges (PDG), finds vast applicability in reservoir management. It contributes to the monitoring of well performance and makes it possible to estimate reservoir parameters over the long term. However, notwithstanding its unquestionable value, data from PDGs are characterized by a large noise content. Moreover, the presence of outliers within valid signal measurements seems to be a major problem as well. In this work, the initial treatment of PDG signals is addressed, based on curve smoothing, self-organizing maps and the discrete wavelet transform. Additionally, a system based on the coupling of fuzzy clustering with feed-forward neural networks is proposed for transient detection. The obtained results were considered quite satisfactory for offshore wells and met real requirements for utilization / Originalmente voltadas ao monitoramento da operação, as medições contínuas de pressão e temperatura no fundo de poço, realizadas através de PDGs (do inglês, Permanent Downhole Gauges), encontram vasta aplicabilidade no gerenciamento de reservatórios. Para tanto, permitem o monitoramento do desempenho de poços e a estimativa de parâmetros de reservatórios no longo prazo. Contudo, a despeito de sua inquestionável utilidade, os dados adquiridos de PDG apresentam grande conteúdo de ruído. Outro aspecto igualmente desfavorável reside na ocorrência de valores espúrios (outliers) imersos entre as medidas registradas pelo PDG. O presente trabalho aborda o tratamento inicial de sinais de pressão e temperatura, mediante técnicas de suavização, mapas auto-organizáveis e transformada wavelet discreta. Ademais, propõe-se um sistema de detecção de transientes relevantes para análise no longo histórico de registros, baseado no acoplamento entre clusterização fuzzy e redes neurais feed-forward. Os resultados alcançados mostraram-se de todo satisfatórios para poços marinhos, atendendo a requisitos reais de utilização dos sinais registrados por PDGs
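As a hypothetical illustration of the kind of initial treatment such noisy PDG records need, the sketch below applies a rolling-median (Hampel-style) outlier filter before any further smoothing. This is a generic stand-in rather than the thesis's pipeline, which relies on curve smoothing, self-organizing maps and the DWT; the window size, threshold and simulated pressure record are arbitrary.

```python
# Generic outlier rejection for a PDG-like pressure record (illustrative only).
import numpy as np

def hampel_filter(x, window=25, n_sigmas=3.0):
    x = np.asarray(x, dtype=float).copy()
    half = window // 2
    for i in range(half, len(x) - half):
        segment = x[i - half:i + half + 1]
        med = np.median(segment)
        mad = 1.4826 * np.median(np.abs(segment - med))   # robust sigma estimate
        if mad > 0 and abs(x[i] - med) > n_sigmas * mad:
            x[i] = med                                     # replace spurious value
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(11)
    t = np.arange(5000)
    pressure = 250.0 - 0.002 * t + rng.normal(0, 0.5, t.size)   # drifting well pressure
    spikes = rng.choice(t.size, 20, replace=False)
    pressure[spikes] += rng.normal(0, 30, spikes.size)           # injected outliers
    cleaned = hampel_filter(pressure)
    print("outliers attenuated:", int(np.sum(np.abs(pressure - cleaned) > 1)))
```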
68

Predictability of Nonstationary Time Series using Wavelet and Empirical Mode Decomposition Based ARMA Models

Lanka, Karthikeyan January 2013 (has links) (PDF)
The idea of time series forecasting techniques is that the past carries certain information about the future. How this information encoded in the past can be interpreted and then used to extrapolate future events constitutes the crux of time series analysis and forecasting. Several methods, such as qualitative techniques (e.g., the Delphi method), causal techniques (e.g., least squares regression) and quantitative techniques (e.g., smoothing methods, time series models), have been developed in the past, the common concept being to establish a model, either theoretically or mathematically, from past observations and to estimate the future from it. Of all the models, time series methods such as the autoregressive moving average (ARMA) process have gained popularity because of their simplicity of implementation and their accuracy in obtaining forecasts. But these models were formulated based on certain properties that a time series is assumed to possess. Classical decomposition techniques were developed to supplement the requirements of time series models. These methods try to define a time series in terms of simple patterns, called trend, cyclical and seasonal patterns, along with noise. The idea of decomposing a time series into component patterns, modelling each component using forecasting processes and finally combining the component forecasts to obtain the actual time series predictions yielded superior performance over standard forecasting techniques. All these methods rest on the basic principle of moving-average computation. However, the classical decomposition methods have the disadvantage of yielding a fixed number of components for any time series and of producing data-independent decompositions. During moving-average computation, the edges of the time series might not be modelled properly, which affects long-range forecasting. These issues are therefore to be addressed by more efficient and advanced decomposition techniques such as wavelets and Empirical Mode Decomposition (EMD). Wavelets and EMD are among the most innovative concepts considered in time series analysis and are focused on processing nonlinear and nonstationary time series. Hence, this research has been undertaken to ascertain the predictability of nonstationary time series using wavelet and Empirical Mode Decomposition (EMD) based ARMA models. Wavelets were developed from concepts of Fourier analysis and the windowed Fourier transform. Accordingly, the need that led to the advent of wavelets is presented first, followed by a discussion of the advantages that wavelets provide. Wavelets were primarily defined for continuous time series; later, to match real-world requirements, wavelet analysis was defined in the discrete setting, which is called the Discrete Wavelet Transform (DWT). The current thesis utilizes the DWT for performing time series decomposition. A detailed discussion of the theory behind time series decomposition is presented in the thesis, followed by a description of the mathematical viewpoint of time series decomposition using the DWT, which involves the decomposition algorithm. EMD comes under the same class as wavelets as far as time series decomposition is concerned. EMD developed out of the fact that most time series in nature contain multiple frequencies, leading to the existence of different scales simultaneously.
This method, when compared to standard Fourier analysis and wavelet algorithms, has a greater scope of adaptation in processing various nonstationary time series. The method involves decomposing any complicated time series into a very small number of finite empirical modes (IMFs, intrinsic mode functions), where each mode contains information of the original time series. The algorithm of time series decomposition using EMD is presented in the current thesis after the conceptual elucidation. Later, the proposed time series forecasting algorithm that couples EMD with an ARMA model is presented, which also takes into account the number of time steps ahead for which forecasting needs to be performed. In order to test the wavelet and EMD based methodologies for the prediction of time series with nonstationarity, series of streamflow data from the USA and rainfall data from India are used in the study. Four non-stationary streamflow sites (USGS data resources) with monthly total volumes and two non-stationary gridded rainfall sites (IMD) with monthly total rainfall are considered for the study. The predictability of the proposed algorithm is checked in two scenarios, the first being a six-months-ahead forecast and the second a twelve-months-ahead forecast. The Normalized Root Mean Square Error (NRMSE) and the Nash-Sutcliffe Efficiency Index (Ef) are used to evaluate the performance of the proposed techniques. Based on the performance measures, the results indicate that the wavelet-based analyses generate good variations in the case of the six-months-ahead forecast, maintaining harmony with the observed values at most of the sites. Although the methods are observed to capture the minima of the time series effectively in both the six- and twelve-months-ahead predictions, better forecasts are obtained with the wavelet-based method than with the EMD-based method in the case of twelve-months-ahead predictions. It is therefore inferred that the wavelet-based method has better prediction capabilities than the EMD-based method, despite some of the limitations of time series methods and the manner in which decomposition takes place. Finally, the study concludes that the wavelet-based time series algorithm could be used to model events such as droughts with reasonable accuracy. Also, some modifications that could be made to the model have been suggested, which can extend the scope of applicability to other areas in the field of hydrology.
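The two performance measures named above can be written out directly. The sketch below assumes NumPy and normalizes the RMSE by the observed range, which is one of several existing conventions; the sample numbers are invented purely to show the call.

```python
# NRMSE and Nash-Sutcliffe efficiency, as used to score the forecasts.
import numpy as np

def nrmse(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    rmse = np.sqrt(np.mean((observed - simulated) ** 2))
    return rmse / (observed.max() - observed.min())   # range normalization (one convention)

def nash_sutcliffe(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

# Example: a hypothetical twelve-months-ahead forecast against observations.
obs = np.array([120.0, 95.0, 80.0, 60.0, 45.0, 30.0, 40.0, 70.0, 110.0, 150.0, 160.0, 140.0])
sim = np.array([115.0, 100.0, 85.0, 55.0, 50.0, 35.0, 38.0, 65.0, 100.0, 140.0, 155.0, 150.0])
print("NRMSE:", round(nrmse(obs, sim), 3))
print("Nash-Sutcliffe Ef:", round(nash_sutcliffe(obs, sim), 3))
```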
69

On the design of fast and efficient wavelet image coders with reduced memory usage

Oliver Gil, José Salvador 06 May 2008 (has links)
Image compression is of great importance in multimedia systems and applications because it drastically reduces bandwidth requirements for transmission and memory requirements for storage. Although earlier standards for image compression were based on the Discrete Cosine Transform (DCT), a recently developed mathematical technique, called Discrete Wavelet Transform (DWT), has been found to be more efficient for image coding. Despite improvements in compression efficiency, wavelet image coders significantly increase memory usage and complexity when compared with DCT-based coders. A major reason for the high memory requirements is that the usual algorithm to compute the wavelet transform requires the entire image to be in memory. Although some proposals reduce the memory usage, they present problems that hinder their implementation. In addition, some wavelet image coders, like SPIHT (which has become a benchmark for wavelet coding), always need to hold the entire image in memory. Regarding the complexity of the coders, SPIHT can be considered quite complex because it performs bit-plane coding with multiple image scans. The wavelet-based JPEG 2000 standard is still more complex because it improves coding efficiency through time-consuming methods, such as an iterative optimization algorithm based on the Lagrange multiplier method, and high-order context modeling. In this thesis, we aim to reduce memory usage and complexity in wavelet-based image coding, while preserving compression efficiency. To this end, a run-length encoder and a tree-based wavelet encoder are proposed. In addition, a new algorithm to efficiently compute the wavelet transform is presented. This algorithm achieves low memory consumption using line-by-line processing, and it employs recursion to automatically place the order in which the wavelet transform is computed, solving some synchronization problems that have not been tackled by previous proposals. The proposed encode / Oliver Gil, JS. (2006). On the design of fast and efficient wavelet image coders with reduced memory usage [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/1826 / Palancia
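The memory argument made in this abstract, that line-by-line processing avoids holding the whole image, can be hinted at with a toy sketch: stream rows from a source and apply the horizontal 1D DWT one line at a time. The thesis's actual algorithm also performs the vertical filtering with a small line buffer and recurses over decomposition levels, which is omitted here; NumPy, PyWavelets and the bior2.2 wavelet choice are assumptions.

```python
# Toy illustration of the memory argument: only one image line is resident
# while the horizontal DWT is streamed, instead of loading the whole image.
import numpy as np
import pywt

def rows_from_source(n_rows, n_cols, seed=0):
    """Stand-in generator for an image read line by line from storage."""
    rng = np.random.default_rng(seed)
    for _ in range(n_rows):
        yield rng.random(n_cols)

def horizontal_dwt_streaming(row_iter, wavelet="bior2.2"):
    """Yield (approximation, detail) per row while holding a single row."""
    for row in row_iter:
        cA, cD = pywt.dwt(row, wavelet)
        yield cA, cD

if __name__ == "__main__":
    detail_energy = 0.0
    for cA, cD in horizontal_dwt_streaming(rows_from_source(512, 512)):
        detail_energy += float(np.sum(cD ** 2))   # a consumer would encode per line
    print("detail-band energy accumulated line by line:", round(detail_energy, 2))
```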
70

Zamezení výpočetního přetížení počítačového systému v důsledku přerušení / Preventing Computer System from Computational Overload Due to Interrupts

Hajdík, Tomáš January 2019 (has links)
The master's thesis deals with techniques to prevent a computer system from computational overload due to an excessive frequency of interrupts. The goal is to document the effect of interrupts on a selected computing platform containing the ARM Cortex-M4 processor core. The work describes and implements possible software techniques that reduce the impact of overload caused by an excessive interrupt frequency. At the same time, the work verifies and compares the effectiveness of the individual implemented techniques with an appropriate set of experiments.
