541 |
Contributions to Bayesian wavelet shrinkage. Remenyi, Norbert, 07 November 2012 (has links)
This thesis provides contributions to research in Bayesian modeling and shrinkage in the wavelet domain. Wavelets are a powerful tool to describe phenomena rapidly changing in time, and wavelet-based modeling has become a standard technique in many areas of statistics, and more broadly, in sciences and engineering. Bayesian modeling and estimation in the wavelet domain have found useful applications in nonparametric regression, image denoising, and many other areas. In this thesis, we build on the existing techniques and propose new methods for applications in nonparametric regression, image denoising, and partially linear models.
The thesis consists of an overview chapter and four main topics. In Chapter 1, we provide an overview of recent developments and the current status of Bayesian wavelet shrinkage research. The chapter contains an extensive literature review consisting of almost 100 references. The main focus of the overview chapter is on nonparametric regression, where the observations come from an unknown function contaminated with Gaussian noise. We present many methods which employ model-based and adaptive shrinkage of the wavelet coefficients through Bayes rules. These include new developments such as dependence models, complex wavelets, and Markov chain Monte Carlo (MCMC) strategies. Some applications of Bayesian wavelet shrinkage, such as curve classification, are discussed.
In Chapter 2, we propose the Gibbs Sampling Wavelet Smoother (GSWS), an adaptive wavelet denoising methodology. We use the traditional mixture prior on the wavelet coefficients, but also formulate a fully Bayesian hierarchical model in the wavelet domain that accounts for the uncertainty of the prior parameters by placing hyperpriors on them. Since a closed-form solution for the Bayes estimator does not exist, the procedure is computational: the posterior mean is computed via MCMC simulation. We show how to develop an efficient Gibbs sampling algorithm for the proposed model. The developed procedure is fully Bayesian, is adaptive to the underlying signal, and provides good denoising performance compared to state-of-the-art methods. Application of the method is illustrated on a real data set arising from the analysis of metabolic pathways, where an iterative shrinkage procedure is developed to preserve the mass balance of the metabolites in the system. We also show how the methodology can be extended to complex wavelet bases.
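To make the mixture-prior mechanics concrete, the following sketch implements a toy Gibbs sampler for spike-and-slab shrinkage of wavelet coefficients. It is not the GSWS algorithm: the noise and slab variances are held fixed at illustrative values, only the mixing weight receives a (Beta) hyperprior, and the wavelet choice, test signal, and noise level in the usage example are assumptions.

    import numpy as np
    import pywt

    def gibbs_spike_slab(d, sigma2=1.0, tau2=4.0, a=1.0, b=1.0, n_iter=2000, burn=500, seed=0):
        """Toy Gibbs sampler for d_j = theta_j + N(0, sigma2) with
        theta_j ~ w N(0, tau2) + (1 - w) delta_0 and w ~ Beta(a, b).
        Returns the posterior mean of theta (a shrinkage estimate)."""
        rng = np.random.default_rng(seed)
        n = d.size
        w = 0.5
        theta_sum = np.zeros(n)
        kept = 0
        for it in range(n_iter):
            # P(slab | d_j, w): compare marginal likelihoods under slab and spike
            m1 = np.exp(-0.5 * d ** 2 / (sigma2 + tau2)) / np.sqrt(sigma2 + tau2)
            m0 = np.exp(-0.5 * d ** 2 / sigma2) / np.sqrt(sigma2)
            p1 = w * m1 / (w * m1 + (1.0 - w) * m0)
            gamma = rng.random(n) < p1
            # theta_j | slab is normal; theta_j | spike is exactly zero
            post_var = sigma2 * tau2 / (sigma2 + tau2)
            post_mean = d * tau2 / (sigma2 + tau2)
            theta = np.where(gamma, rng.normal(post_mean, np.sqrt(post_var)), 0.0)
            # conjugate Beta update of the mixing weight
            w = rng.beta(a + gamma.sum(), b + n - gamma.sum())
            if it >= burn:
                theta_sum += theta
                kept += 1
        return theta_sum / kept

    # Usage on a synthetic noisy signal: shrink the detail coefficients level by level.
    rng = np.random.default_rng(1)
    x = np.linspace(0, 1, 1024)
    noisy = np.sin(8 * np.pi * x) * (x > 0.3) + 0.3 * rng.standard_normal(x.size)
    coeffs = pywt.wavedec(noisy, "db4", mode="periodization")
    shrunk = [coeffs[0]] + [gibbs_spike_slab(c, sigma2=0.3 ** 2) for c in coeffs[1:]]
    denoised = pywt.waverec(shrunk, "db4", mode="periodization")

In the full hierarchical model of the chapter, the noise and slab variances would themselves receive priors and be sampled inside the same Gibbs loop.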
In Chapter 3, we propose a wavelet-based denoising methodology based on a Bayesian hierarchical model using a double Weibull prior. The interesting feature is that, in contrast to the mixture priors traditionally used by some state-of-the-art methods, the wavelet coefficients are modeled by a single density. Two estimators are developed, one based on the posterior mean and the other based on the larger posterior mode, and we show how to calculate these estimators efficiently. The methodology provides good denoising performance, comparable even to state-of-the-art methods that use a mixture prior and an empirical Bayes setting of hyperparameters; this is demonstrated by simulations on standard test functions. An application to a real-world data set is also considered.
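A minimal sketch of a posterior-mean rule under a single double Weibull prior is given below. The chapter derives efficient expressions for its estimators; here the posterior mean for one coefficient is simply evaluated by brute-force quadrature, and the Weibull shape, scale, and noise level are illustrative assumptions.

    import numpy as np

    def double_weibull_pdf(theta, c=1.5, s=2.0):
        """Double Weibull density: symmetric about zero, Weibull(shape c, scale s) on |theta|.
        (c and s are illustrative, not the values used in the thesis.)"""
        a = np.abs(theta) / s
        return (c / (2.0 * s)) * a ** (c - 1.0) * np.exp(-(a ** c))

    def posterior_mean(d, sigma=1.0, grid=np.linspace(-30.0, 30.0, 6001)):
        """Posterior mean of theta for d = theta + N(0, sigma^2) under the double
        Weibull prior, by simple quadrature on a uniform grid."""
        weights = np.exp(-0.5 * (d - grid) ** 2 / sigma ** 2) * double_weibull_pdf(grid)
        return np.sum(grid * weights) / np.sum(weights)   # grid spacing cancels in the ratio

    # The rule shrinks small observed coefficients strongly and leaves large ones nearly intact.
    for d in (0.5, 1.0, 2.0, 4.0, 8.0):
        print(d, round(posterior_mean(d), 3))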
In Chapter 4, we propose a wavelet shrinkage method based on a neighborhood of wavelet coefficients, which includes two neighboring coefficients and a parental coefficient. The methodology is called Lambda-neighborhood wavelet shrinkage, motivated by the shape of the considered neighborhood. We propose a Bayesian hierarchical model using a contaminated exponential prior on the total mean energy in the Lambda-neighborhood. The hyperparameters in the model are estimated by the empirical Bayes method, and the posterior mean, median, and Bayes factor are obtained and used in the estimation of the total mean energy. Shrinkage of the neighboring coefficients is based on the ratio of the estimated and observed energy. The proposed methodology is comparable, and often superior, to several established wavelet denoising methods that utilize neighboring information, which is demonstrated by extensive simulations. An application to a real-world data set from inductance plethysmography is considered, and an extension to image denoising is discussed.
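The sketch below illustrates only the mechanics of grouping two sibling coefficients with their parent and scaling the pair by a ratio of estimated to observed energy. The shrinkage rule used here is a simple James-Stein-type factor, not the contaminated-exponential empirical Bayes rule of the chapter, and the wavelet, decomposition depth, and noise estimate are assumptions.

    import numpy as np
    import pywt

    def lambda_style_shrink(noisy, wavelet="db4", levels=5, sigma=None):
        """Neighborhood-energy shrinkage sketch: each pair of sibling detail
        coefficients is grouped with its parent coefficient, and the pair is
        scaled by a factor based on the group's observed energy."""
        coeffs = pywt.wavedec(noisy, wavelet, mode="periodization", level=levels)
        if sigma is None:
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # routine MAD noise estimate
        s2 = sigma ** 2
        # coeffs = [cA_L, cD_L, cD_{L-1}, ..., cD_1]; entry i-1 holds the parents of entry i
        for i in range(2, len(coeffs)):
            child, parent = coeffs[i], coeffs[i - 1]
            for k in range(parent.size):
                pair = child[2 * k:2 * k + 2]
                energy = np.sum(pair ** 2) + parent[k] ** 2             # observed Lambda energy
                factor = max(0.0, 1.0 - 3.0 * s2 / max(energy, 1e-12))  # estimated / observed
                child[2 * k:2 * k + 2] = factor * pair
        return pywt.waverec(coeffs, wavelet, mode="periodization")

With periodization and a dyadic signal length, each detail level holds exactly twice as many coefficients as its parent level, so the sibling/parent pairing is trivial to index.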
In Chapter 5, we propose a wavelet-based methodology for estimation and variable selection in partially linear models. The inference is conducted in the wavelet domain, which provides a sparse and localized decomposition appropriate for nonparametric components with various degrees of smoothness. A hierarchical Bayes model is formulated on the parameters of this representation, and the estimation and variable selection are performed by a Gibbs sampling procedure. For both the parametric and nonparametric parts of the model, we use point-mass-at-zero contamination priors with a double exponential spread distribution. In this sense we extend the model of Chapter 2 to partially linear models. Only a few papers in the area of partially linear wavelet models exist, and we show that the proposed methodology is often superior to the existing methods in estimating the model parameters. Moreover, the method is able to perform Bayesian variable selection by a stochastic search for the parametric part of the model.
|
542 |
Compression of Medical Image Stacks using Wavelets and Zero-Tree Coding / Kompression av medicinska bildstackar med användning av wavelets och zerotree-kodning. Sjöblom, Erik, January 2002 (has links)
The use of computers for handling image data in healthcare is growing. The amount of data produced by modern image generating techniques, such as CT and MR, is vast. This can be a problem from a storage point of view or when the data is sent over a network. To overcome these problems, data compression techniques adapted to these applications are needed. There is an abundance of compression schemes for images, but few are geared towards compression of image stacks. Although these 2D techniques can be applied to image stacks, they do not fully exploit the three-dimensional nature of the stacks. In this thesis, ideas from the image compression area are extended to exploit the 3D nature of image stacks; attention is directed to MR and CT stacks. The compression schemes in this thesis are based on the relatively new notion of wavelets, used in the JPEG2000 standard. The schemes also provide quality-progressive transmission, which allows intermediate low-quality results to be constructed. Two different coders are presented in this thesis, both developed for the compression of image stacks containing medical data. They are evaluated and compared, and a top performer is chosen. Advantages and drawbacks of the schemes and future enhancements are also discussed.
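As a rough illustration of the 3D decomposition that such coders build on, the sketch below applies a separable three-dimensional wavelet transform to an image stack and keeps only the largest coefficients. The hard-thresholding step is a crude stand-in for zero-tree coding (which instead encodes significance across scales into an embedded, quality-progressive bit stream), and the stack data, wavelet, level, and keep-ratio are all assumptions.

    import numpy as np
    import pywt

    # Illustrative stack: 32 slices of 128 x 128 data (random here, CT/MR in practice).
    rng = np.random.default_rng(0)
    stack = rng.standard_normal((32, 128, 128)).astype(np.float32)

    # 3-D separable wavelet decomposition of the whole stack, so that correlation
    # between neighbouring slices is exploited, unlike slice-by-slice 2-D coding.
    coeffs = pywt.wavedecn(stack, wavelet="bior4.4", level=3)
    arr, slices = pywt.coeffs_to_array(coeffs)

    # Crude "compression": keep only the largest 5% of coefficients in magnitude.
    keep = 0.05
    thresh = np.quantile(np.abs(arr), 1.0 - keep)
    arr_sparse = np.where(np.abs(arr) >= thresh, arr, 0.0)

    recon = pywt.waverecn(
        pywt.array_to_coeffs(arr_sparse, slices, output_format="wavedecn"),
        wavelet="bior4.4")
    print("kept coefficients:", np.count_nonzero(arr_sparse), "of", arr.size)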
|
543 |
Statistical methods for function estimation and classification. Kim, Heeyoung, 20 June 2011 (has links)
This thesis consists of three chapters. The first chapter focuses on adaptive smoothing splines for fitting functions with varying roughness. In the first part of the first chapter, we study an asymptotically optimal procedure to choose the value of a discretized version of the variable smoothing parameter in adaptive smoothing splines. With the choice given by the multivariate version of generalized cross validation, the resulting adaptive smoothing spline estimator is shown to be consistent and asymptotically optimal under some general conditions. In the second part, we derive the asymptotically optimal local penalty function, which is subsequently used for the derivation of the locally optimal smoothing spline estimator. In the second chapter, we propose a Lipschitz-regularity-based statistical model, and apply it to coordinate measuring machine (CMM) data to estimate the form error of a manufactured product and to determine the optimal sampling positions of CMM measurements. Our proposed wavelet-based model takes advantage of the fact that Lipschitz regularity holds for the CMM data. The third chapter focuses on the classification of functional data which are known to be well separable within a particular interval. We propose an interval-based classifier. We first estimate a baseline of each class via convex optimization, and then identify an optimal interval that maximizes the difference among the baselines. Our interval-based classifier is constructed based on the identified optimal interval. The derived classifier can be implemented via a low-order-of-complexity algorithm.
|
544 |
Detection of point sources in maps of the cosmic microwave background radiation by means of optimal filters. López-Caniego Alcarria, Marcos, 21 December 2006 (has links)
When we observe the sky with the most advanced instruments operating at microwave frequencies, the photons that we see originated in very different regions of the universe. Most of them come from our own Galaxy (diffuse synchrotron, free-free, and dust emission), others come from very distant galaxies and clusters of galaxies, and, finally, a fraction has its origin in the relic radiation of the Big Bang, known as the Cosmic Microwave Background Radiation. This radiation is contaminated by the other components, and separating the different emissions in a satisfactory way is still an open problem. In this thesis we explore different techniques based on linear filters to detect and separate one of the contaminants: the contribution to the observed signal of distant radio and infrared galaxies. These galaxies are known as extragalactic point sources because, at the angular resolution of typical microwave experiments, they appear as point-like, unresolved objects. First, we study the performance of a family of matched filters for detecting point sources, where we allow the scale of the filter to be modified and introduce a Neyman-Pearson test to define the region of acceptance. Second, we consider a new one-dimensional linear filter, the Biparametric Scale-Adaptive Filter (BSAF), which has two parameters that can be optimized to maximize the number of real detections for a fixed number of spurious detections, used in conjunction with a Neyman-Pearson test. Third, we extend the BSAF to two dimensions and compare it with the Mexican Hat Wavelet (MHW). Fourth, we explore a new technique that combines linear and quadratic fusion of images with linear filters, testing it with realistic simulations of one of the Planck satellite channels. Fifth, we study in detail the performance of three filters, the matched filter and two members of the Mexican Hat wavelet family, in realistic simulations of the nine channels of Planck. To conclude, we use the second member of the Mexican Hat wavelet family to carry out a non-blind study, in the five frequencies of WMAP, of a complete sample of 2491 sources observed at 5 GHz, producing a catalogue of 932 objects detected above 3σ and a catalogue of 380 objects detected above 5σ.
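For a flavour of how wavelet-type filtering isolates point sources, the sketch below filters a toy map with a 2D Mexican Hat kernel, which is proportional (up to sign and normalisation) to a Laplacian-of-Gaussian and therefore available through scipy.ndimage. The map, beam width, and 5-sigma cut are illustrative assumptions; this is neither the BSAF nor the optimised filters studied in the thesis.

    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(2)
    npix, beam_sigma = 512, 3.0

    # Toy "sky map": smooth large-scale background plus noise plus beam-shaped sources.
    sky = ndimage.gaussian_filter(rng.standard_normal((npix, npix)), sigma=20) * 50
    sky += rng.standard_normal((npix, npix))
    yy, xx = np.mgrid[0:npix, 0:npix]
    for (y, x, amp) in [(100, 200, 8.0), (300, 50, 6.0), (400, 400, 10.0)]:
        sky += amp * np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * beam_sigma ** 2))

    # 2-D Mexican Hat filtering at the beam scale (negative Laplacian-of-Gaussian):
    # the large-scale background is suppressed while beam-sized peaks are preserved.
    filtered = -ndimage.gaussian_laplace(sky, sigma=beam_sigma)

    # Simple 5-sigma cut on the filtered map (the thesis uses a Neyman-Pearson
    # acceptance region instead of a plain threshold).
    snr = filtered / filtered.std()
    labels, n_detected = ndimage.label(snr > 5.0)
    print("candidate point sources:", n_detected)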
|
545 |
Detecció, caracterització i estudi de les sibilàncies en senyals de so respiratori de pacients amb malalties pulmonars / Detection, characterization and study of wheezes in respiratory sound signals from patients with pulmonary diseases. Homs Corbera, Antoni, 01 July 2005
The main contribution of this PhD thesis is to propose new signal processing techniques for the objective analysis of a type of respiratory sound, the wheeze, as a complementary tool for medical diagnosis and for the clinical monitoring of pulmonary diseases.

To detect wheezes, two algorithms based on analysis of the signal spectrogram were first proposed and validated: the Local Mean Shabtai-Musih (SMML) and the Local Adaptive Wheezes Detection Algorithm (LAWDA). Validation, carried out on 60 sound signals with the assessment of a specialist physician, showed a wheeze detection sensitivity of 90.3% for the SMML and 93.9% for the LAWDA over the forced-expiration flow segment between 1.2 and 0.2 l/s. For the 0.2 to 0 l/s segment the values were 64.5% and 71.0%, respectively.

The sound signals used in the study were acquired during the spirometric maneuver used in routine clinical practice to evaluate patients' pulmonary function. The database contains 37 subjects: 15 healthy controls, 16 non-smoking patients with bronchial asthma, and 6 patients with chronic obstructive pulmonary disease (COPD). For each patient a minimum of 6 maneuvers were studied: 3 in the basal state and 3 twenty minutes after the administration of 1 mg of terbutaline (a bronchodilator).

A statistical study was performed on characteristic parameters of the flow segment of interest (1.2 to 0.2 l/s): the number of detected wheezes, the mean frequency of the wheeze with maximum peak power (FMSPPM) and with maximum mean power (FMSPMM), the average of the mean frequencies of the detected wheezes (PFM), and the percentages of the maneuver occupied by monophonic wheezes, polyphonic wheezes, or no wheezes. Significant differences between groups were obtained for the intra-patient mean and standard deviation of some of the studied parameters, although the results did not always coincide for the two algorithms. Notably, very significant differences between the control and asthmatic groups (p < 0.01) were observed for the intra-patient mean and standard deviation of the number of detected wheezes, both before and after applying the bronchodilator and independently of the detection algorithm used.

To study the dynamics of airway change in the patient groups, a statistical study was also carried out on the differences between the intra-patient means and standard deviations of the parameters computed before and after administering the bronchodilator. Most parameters did not show significant between-state differences within a given patient group. Signed and absolute-value change parameters were also defined as the difference between the values obtained before and after the drug. Their statistical study shows that there are differences between states for a given patient and that these differences are sometimes significant for separating groups, especially when studied in absolute value.

Once the usefulness of the technique was validated, the General Wheezes Detection Algorithm (GWDA) was developed, based on the Malvar transform and the pseudospectrum. It serves to detect and study wheezes in other maneuvers and at lower respiratory flow levels, and it facilitates integration into real-time devices. In its validation the new algorithm achieved a higher sensitivity than the other two, both for the 1.2 to 0.2 l/s flow segment (96.7%) and for the 0.2 to 0 l/s segment (100%). The forced-expiration study was repeated using the GWDA, and the parameters computed from the detected wheezes were analyzed statistically for the different patient groups. Very significant differences were again obtained for the number of wheezes detected in asthmatics versus controls (p < 0.01).

For the GWDA, the statistical studies were complemented with a discriminant analysis to evaluate the feasibility of an automatic classification system based solely on parameters derived from the detected wheezes. The discriminant analysis, combining up to 3 parameters that showed significant differences between groups, achieved classification rates of up to 91.9% in the training phase and 77.8% in the test phase.

Finally, a preliminary study of the number of wheezes detected during the tidal volume (quiet breathing) maneuver was carried out using the GWDA.
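The sketch below shows the generic spectrogram-based idea behind such detectors: look for narrowband spectral peaks that stand well above the local spectrum and persist in time. It is not an implementation of SMML, LAWDA, or the GWDA; the synthetic signal, sampling rate, frequency band, power ratio, and 100 ms persistence rule are all assumptions.

    import numpy as np
    from scipy import signal

    fs = 8000                                  # Hz (assumed sampling rate)
    t = np.arange(0, 3.0, 1.0 / fs)
    rng = np.random.default_rng(3)

    # Synthetic breath sound: broadband noise plus a 400 Hz "wheeze" lasting 0.8 s.
    breath = 0.3 * rng.standard_normal(t.size)
    breath += 0.4 * np.sin(2 * np.pi * 400 * t) * ((t > 1.0) & (t < 1.8))

    f, tt, Sxx = signal.spectrogram(breath, fs=fs, nperseg=1024, noverlap=768)

    # A bin is "wheeze-like" if it exceeds 20x the median spectrum of its own frame
    # and lies in a plausible wheeze band (both thresholds are illustrative).
    ratio = Sxx / np.median(Sxx, axis=0, keepdims=True)
    band = (f > 100) & (f < 1600)
    candidate = (ratio > 20) & band[:, None]

    # Require the narrowband component to persist for at least ~100 ms.
    frame_step = tt[1] - tt[0]
    min_frames = int(0.1 / frame_step)
    hits = candidate.any(axis=0).astype(int)
    persist = np.convolve(hits, np.ones(min_frames, dtype=int), mode="same") >= min_frames
    if persist.any():
        print(f"wheeze-like activity between {tt[persist].min():.2f} s and {tt[persist].max():.2f} s")
    else:
        print("no sustained narrowband activity found")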
|
546 |
Harmonic Wavelets Procedures and Wiener Path Integral Methods for Response Determination and Reliability Assessment of Nonlinear Systems/Structures. January 2011 (has links)
In this thesis a novel approximate/analytical approach based on the concepts of stochastic averaging and of statistical linearization is developed for the response determination of nonlinear/hysteretic multi-degree-of-freedom (MDOF) systems subject to evolutionary stochastic excitation. A significant advantage of the approach is that it is readily applicable to excitations possessing even non-separable evolutionary power spectra (EPS), circumventing the ad hoc pre-filtering and pre-processing excitation treatments associated with existing alternative linearization schemes. Further, the approach can be used, in a rather straightforward manner, in conjunction with recently developed design-spectrum-based analyses for obtaining peak response estimates without resorting to numerical integration of the nonlinear equations of motion. Furthermore, a novel approximate/analytical Wiener path integral based solution (PIS) is developed and a numerical PIS approach is extended to determine the response and first-passage probability density functions (PDFs) of nonlinear/hysteretic systems subject to evolutionary stochastic excitation. Applications include the versatile Preisach hysteretic model, recently applied in modeling systems equipped with smart material (shape memory alloy) devices used for seismic hazard risk mitigation. The approach is also applied to determine the capsizing probability of a ship whose rolling dynamics is captured by a softening Duffing oscillator. Finally, novel harmonic wavelets based joint time-frequency response analysis and identification approaches are developed, capable of determining the time-varying frequency content of non-stationary complex stochastic phenomena encountered in engineering applications. Specifically, a harmonic wavelets based statistical linearization approach is developed to determine the EPS of the response of nonlinear/hysteretic systems subject to stochastic excitation. In a similar context, an identification approach for nonlinear time-variant systems based on the localization properties of the harmonic wavelet transform is also developed. It can be construed as a generalization of the well-established reverse multiple-input/single-output (MISO) spectral identification approach to account for non-stationary inputs and time-varying system parameters. Several linear and nonlinear time-variant systems are used to demonstrate the reliability of the approach.
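As a pocket illustration of the statistical linearization idea that the thesis extends to evolutionary excitation and MDOF hysteretic systems, the sketch below computes the stationary response of a Duffing oscillator under white noise by iterating the classical Gaussian-closure equivalent stiffness. All parameter values are illustrative.

    from math import pi, sqrt

    # Duffing oscillator  m x'' + c x' + k (x + eps x^3) = w(t),  w(t) white noise
    # with constant two-sided power spectral density S0.  Statistical linearization
    # with Gaussian closure replaces the nonlinear restoring force by k_eq x, where
    # k_eq = k (1 + 3 eps E[x^2]); the linear system's exact stationary variance
    # E[x^2] = pi S0 / (c k_eq) (independent of m) closes the fixed-point iteration.
    c, k, eps, S0 = 0.2, 1.0, 0.5, 0.05      # illustrative values

    var = pi * S0 / (c * k)                  # start from the linear (eps = 0) answer
    for _ in range(200):
        k_eq = k * (1.0 + 3.0 * eps * var)
        var_new = pi * S0 / (c * k_eq)
        if abs(var_new - var) < 1e-12:
            break
        var = var_new

    print(f"equivalent stiffness k_eq = {k_eq:.4f}")
    print(f"response standard deviation sigma_x = {sqrt(var):.4f}")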
|
547 |
Self-Similarity of Images and Non-local Image Processing. Glew, Devin, January 2011 (has links)
This thesis has two related goals: the first involves the concept of self-similarity of images. Image self-similarity is important because it forms the basis for many imaging techniques such as non-local means denoising and fractal image coding. Research so far has been focused largely on self-similarity in the pixel domain, that is, examining how well different regions in an image mimic each other. Also, most works so far concerning self-similarity have utilized only the mean squared error (MSE). In this thesis, self-similarity is examined in terms of the pixel and wavelet representations of images. In each of these domains, two ways of measuring similarity are considered: the MSE and a relatively new measurement of image fidelity called the Structural Similarity (SSIM) Index. We show that the MSE and SSIM Index give very different answers to the question of how self-similar images really are.

The second goal of this thesis involves non-local image processing. First, a generalization of the well-known non-local means denoising algorithm is proposed and examined. The groundwork for this generalization is set by the aforementioned results on image self-similarity with respect to the MSE. This new method is then extended to the wavelet representation of images. Experimental results are given to illustrate the applications of these new ideas.
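A small sketch of the kind of patch-to-patch comparison studied here is given below, using scikit-image's SSIM implementation and the standard camera test image. The patch size and locations are arbitrary; the point is simply that MSE and SSIM can rank the same pairs of patches differently.

    import numpy as np
    from skimage import data
    from skimage.metrics import structural_similarity

    img = data.camera().astype(float)        # 512 x 512 grayscale test image
    patch_size = 32

    def patch(y, x):
        return img[y:y + patch_size, x:x + patch_size]

    reference = patch(100, 100)
    candidates = {"self": patch(100, 100), "nearby": patch(104, 100), "far": patch(400, 300)}

    for name, p in candidates.items():
        mse = np.mean((reference - p) ** 2)
        ssim = structural_similarity(reference, p, data_range=img.max() - img.min())
        print(f"{name:7s}  MSE = {mse:8.1f}   SSIM = {ssim:+.3f}")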
|
548 |
Octave-band Directional Decompositions. Hong, Paul S., 19 July 2005 (has links)
A new two-dimensional transform is derived and implemented that is able to discriminate with respect to angular and radial frequency. This octave-band directional filter bank (OBDFB) is maximally decimated, has a separable polyphase implementation, provides perfect reconstruction, and can be implemented in a tree structure allowing for a somewhat arbitrary number of angular and radial divisions. This decomposition is based on the directional filter bank (DFB) and is compared to other transforms with similar properties. Additionally, the OBDFB is used in three applications. Texture segmentation results are provided with comparisons to both decimated and undecimated transforms. With hyperspectral data, the OBDFB is used to increase classification accuracy using texture augmentation and likelihood score combination. Finally, ultrasound despeckling is addressed with respect to real-time implementations, and subjective test results are presented. A non-uniform two-dimensional transform is also designed that is a modified version of the OBDFB. It is rationally sampled and maximally decimated, but it provides both angular and radial frequency passbands from the initial stage instead of making separate divisions like the OBDFB. It also does not create subband boundaries on the principal frequency axes and allows for further decomposition as well.
|
549 |
Modeling the effect of land cover land use change on estuarine environmental flows. Sahoo, Debabrata, 15 May 2009 (has links)
Environmental flows are important to maintain the ecological integrity of an estuary. Within a watershed, they are influenced by land use land cover (LULC) change, climate variability, and water regulations. San Antonio, Texas, the 8th largest city in the US, is likely to affect environmental flows to the San Antonio Bay/Guadalupe Estuary due to rapid urbanization.

Time series analysis was conducted at several stream gauging stations to assess trends in hydrologic variables. A bootstrapping method was employed to estimate the critical value for global significance. Results suggested that a greater number of trends are observed than would be expected to occur by chance. Stream gauging stations in the lower half of the watershed showed increasing trends, whereas those in the upper half showed decreasing trends. A similar spatial pattern was not observed for rainfall. The winter season showed the greatest number of trends.

Wavelet analysis of the hydrologic variables suggested the presence of multi-scale temporal variability; dominant frequencies at the 10 to 15 year scale were observed in some of the hydrologic variables, with a decadal cycle. Dominant frequencies were also observed at the 17 to 23 year scale, with repeatability in 20 to 30 years. It is therefore important to understand the various ecological processes that are dominant at this scale and to quantify possible linkages among them.

A genetic algorithm (GA) was used for calibration of the Hydrologic Simulation Program in FORTRAN (HSPF) model. Although the GA is computationally demanding, it is better than manual calibration. Parameter values obtained for the calibrated model had physical meaning and were well within the ranges suggested in the literature.

Information from LANDSAT images for the years 1987, 1999, and 2003 was introduced into HSPF to quantify the impact of LULC change on environmental flows. Modeling studies indicated that, with the increase in impervious surface, peak flows increased over the years. Wavelet analysis indicated that urbanization also impacted storage. Modeling studies quantified that, on average, about 50% of the variability in freshwater inflows could be attributed to variation in precipitation, and approximately 10% of the variation in freshwater inflows could be attributed to LULC change.

This study will help ecologists, engineers, scientists, and politicians in policy making pertinent to water resources management.
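The sketch below illustrates the bootstrap approach to "global" (field) significance of station trends described above. Kendall's tau is used here as an assumed local trend test (the abstract does not name the test), the data and thresholds are synthetic assumptions, and resampling whole years destroys any trend while preserving the cross-correlation between stations.

    import numpy as np
    from scipy import stats

    def count_significant(flows, alpha=0.05):
        """Number of stations whose series shows a significant monotonic trend
        against time (Kendall's tau used here as an assumed trend test)."""
        n_years, n_stations = flows.shape
        years = np.arange(n_years)
        pvals = np.array([stats.kendalltau(years, flows[:, j]).pvalue
                          for j in range(n_stations)])
        return int(np.sum(pvals < alpha))

    def bootstrap_critical_count(flows, n_boot=500, alpha=0.05, q=0.95, seed=0):
        """Critical number of locally significant trends under the no-trend null,
        obtained by resampling years with replacement."""
        rng = np.random.default_rng(seed)
        n_years = flows.shape[0]
        counts = [count_significant(flows[rng.integers(0, n_years, n_years)], alpha)
                  for _ in range(n_boot)]
        return np.quantile(counts, q)

    # Synthetic example: 50 years x 20 correlated stations, half with a weak trend.
    rng = np.random.default_rng(1)
    flows = rng.standard_normal((50, 1)) + 0.5 * rng.standard_normal((50, 20))
    flows[:, :10] += 0.03 * np.arange(50)[:, None]

    observed = count_significant(flows)
    critical = bootstrap_critical_count(flows)
    print(f"locally significant stations: {observed}; "
          f"95% bootstrap critical value under the null: {critical:.1f}")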
|
550 |
Bayesian classification and survival analysis with curve predictors. Wang, Xiaohui, 15 May 2009 (has links)
We propose classification models for binary and multicategory data where the predictor is a random function. The functional predictor could be irregularly and sparsely sampled or characterized by high dimension and sharp localized changes. In the former case, we employ Bayesian modeling utilizing a flexible spline basis, which is widely used for functional regression. In the latter case, we use Bayesian modeling with wavelet basis functions, which have nice approximation properties over a large class of functional spaces and can accommodate the variety of functional forms observed in real-life applications. We develop a unified hierarchical model which accommodates both the adaptive spline or wavelet based function estimation model and the logistic classification model. These two models are coupled together to borrow strength from each other in this unified hierarchical framework. The use of Gibbs sampling with conjugate priors for posterior inference makes the method computationally feasible. We compare the performance of the proposed models with naive models as well as existing alternatives by analyzing simulated as well as real data.

We also propose a Bayesian unified hierarchical model based on a proportional hazards model and a generalized linear model for survival analysis with irregular longitudinal covariates. This relatively simple joint model has two advantages. One is that using a spline basis simplifies the parameterization while capturing a flexible non-linear pattern of the function. The other is that the joint modeling framework allows sharing of information between the regression of functional predictors and the proportional hazards modeling of survival data to improve the efficiency of estimation. The novel method can be used not only in the case of a single functional predictor, but also with multiple functional predictors. Our methods are applied to analyze real data sets and compared with a parameterized regression method.
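To show the basic "represent each curve in a wavelet basis, then classify" pipeline in a few lines, the sketch below substitutes a plain L1-penalized logistic regression for the Bayesian hierarchical model and Gibbs sampler developed in the thesis. The synthetic curves, wavelet, and regularization settings are assumptions.

    import numpy as np
    import pywt
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)

    # Synthetic functional predictors: 200 noisy curves on 128 points, where class 1
    # carries a sharp localized bump that a wavelet basis represents sparsely.
    n, m = 200, 128
    x = np.linspace(0, 1, m)
    labels = rng.integers(0, 2, n)
    bump = np.exp(-((x - 0.3) ** 2) / (2 * 0.01 ** 2))
    curves = (np.sin(2 * np.pi * x) + rng.standard_normal((n, m)) * 0.5
              + labels[:, None] * 0.8 * bump)

    # Wavelet coefficients of each curve serve as the feature vector.
    features = np.array([np.concatenate(pywt.wavedec(c, "db4", mode="periodization"))
                         for c in curves])

    X_tr, X_te, y_tr, y_te = train_test_split(features, labels, test_size=0.3, random_state=0)
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X_tr, y_tr)
    print("test accuracy:", round(clf.score(X_te, y_te), 3))
    print("nonzero coefficients:", int(np.sum(clf.coef_ != 0)), "of", features.shape[1])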
|