61

Efficient deterministic approximate Bayesian inference for Gaussian process models

Bui, Thang Duc January 2018 (has links)
Gaussian processes are powerful nonparametric distributions over continuous functions that have become a standard tool in modern probabilistic machine learning. However, the applicability of Gaussian processes in the large-data regime and in hierarchical probabilistic models is severely limited by analytic and computational intractabilities. It is, therefore, important to develop practical approximate inference and learning algorithms that can address these challenges. To this end, this dissertation provides a comprehensive and unifying perspective on pseudo-point-based deterministic approximate Bayesian learning for a wide variety of Gaussian process models, which connects previously disparate literature, greatly extends it and allows new state-of-the-art approximations to emerge. We start by building a posterior approximation framework based on Power Expectation Propagation for Gaussian process regression and classification. This framework relies on a structured approximate Gaussian process posterior based on a small number of pseudo-points, which are judiciously chosen to summarise the actual data and enable tractable and efficient inference and hyperparameter learning. Many existing sparse approximations are recovered as special cases of this framework and can now be understood as performing approximate posterior inference using a common approximate posterior. Critically, extensive empirical evidence suggests that the new approximation methods arising from this unifying perspective outperform existing approaches in many real-world regression and classification tasks. We explore extensions of this framework to Gaussian process state-space models, Gaussian process latent variable models and deep Gaussian processes, which also unify many recently developed approximation schemes for these models. Several mean-field and structured approximate posterior families for the hidden variables in these models are studied. We also discuss several methods for approximate uncertainty propagation in recurrent and deep architectures based on Gaussian projection, linearisation, and simple Monte Carlo. The benefits of the unified inference and learning frameworks for these models are illustrated in a variety of real-world state-space modelling and regression tasks.
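To make the pseudo-point idea concrete, the sketch below implements the predictive equations of a variational free-energy sparse GP (one of the special cases such frameworks recover), assuming an RBF kernel and fixed hyperparameters; the data, kernel parameters and pseudo-point locations are illustrative assumptions, and Power-EP updates and hyperparameter learning are omitted.

```python
import numpy as np

def rbf(A, B, ell=1.0, sf2=1.0):
    # Squared-exponential kernel matrix between column vectors A (n,1), B (m,1).
    return sf2 * np.exp(-0.5 * (A - B.T) ** 2 / ell**2)

def sparse_gp_predict(X, y, Z, Xs, ell=1.0, sf2=1.0, sn2=0.1, jitter=1e-8):
    # Titsias-style sparse GP prediction: M pseudo-points Z summarise N data.
    M = Z.shape[0]
    Kuu = rbf(Z, Z, ell, sf2) + jitter * np.eye(M)
    Kuf = rbf(Z, X, ell, sf2)
    Ksu = rbf(Xs, Z, ell, sf2)
    Sigma = Kuu + Kuf @ Kuf.T / sn2            # M x M, cheap when M << N
    mean = Ksu @ np.linalg.solve(Sigma, Kuf @ y) / sn2
    kss = sf2 * np.ones(Xs.shape[0])           # diagonal of K** for the RBF kernel
    var = (kss
           - np.sum(Ksu * np.linalg.solve(Kuu, Ksu.T).T, axis=1)
           + np.sum(Ksu * np.linalg.solve(Sigma, Ksu.T).T, axis=1))
    return mean, var

# Toy usage: N = 500 observations summarised by M = 15 pseudo-points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (500, 1))
y = np.sin(2 * X[:, 0]) + 0.3 * rng.standard_normal(500)
Z = np.linspace(-3, 3, 15)[:, None]
Xs = np.linspace(-3, 3, 7)[:, None]
mu, var = sparse_gp_predict(X, y, Z, Xs)
```

The cost is dominated by the M x M solve rather than an N x N one, which is the computational point of pseudo-point approximations.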
62

Optimal Latin Hypercube Designs for Computer Experiments Based on Multiple Objectives

Hou, Ruizhe 22 March 2018 (has links)
Latin hypercube designs (LHDs) have broad applications in constructing computer experiments and in sampling for Monte Carlo integration, owing to their property of having projections evenly distributed over the univariate distribution of each input variable. LHDs have been combined with commonly used computer experimental design criteria to achieve enhanced design performance. For example, Maximin LHDs were developed to improve the space-filling property in the full dimension of all input variables, and MaxPro LHDs were proposed in recent years to obtain better projections in any subspace of the input variables. This thesis integrates both the space-filling and the projection characteristics of LHDs and develops new algorithms for constructing optimal LHDs that achieve good properties on both criteria, using a Pareto-front optimization approach. The new LHDs are evaluated through case studies and compared with traditional methods to demonstrate their improved performance.
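As a concrete illustration of one of the ingredients being balanced, the sketch below generates random LHDs (even univariate projections by construction) and scores them with the maximin distance criterion; the thesis's Pareto-front search over multiple criteria is replaced here by a crude random search, and all sizes are illustrative assumptions.

```python
import numpy as np

def random_lhd(n, d, rng):
    # Each column is an independent random permutation of the n strata
    # midpoints, which guarantees even one-dimensional projections.
    return np.column_stack([(rng.permutation(n) + 0.5) / n for _ in range(d)])

def min_pairwise_dist(D):
    # Maximin criterion: a larger minimum inter-point distance means
    # better space-filling in the full input space.
    diff = D[:, None, :] - D[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    return dist[np.triu_indices(len(D), k=1)].min()

# Crude random search over candidate LHDs, keeping the best maximin score.
rng = np.random.default_rng(1)
best = max((random_lhd(20, 3, rng) for _ in range(2000)), key=min_pairwise_dist)
```

A Pareto-front approach would instead retain all candidates that are non-dominated with respect to both the maximin and the projection criterion, rather than a single winner.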
63

Sequential simulation on input or output data interpolation of the Araquá software leaching model

Moraes, Diego Augusto de Campos [UNESP] 12 November 2015 (has links) (PDF)
The interface between simulators of pesticide environmental behavior and fate and geoprocessing software has been increasingly used in environmental risk assessment studies. In this context, the use of geostatistics, which considers the spatial correlation and interpolation of a given phenomenon in nature, is of great importance. However, applying geostatistical interpolation to the input or the output data of a simulator can provide different results. The hypothesis of this work relies on the proposition that using stochastic simulation techniques to interpolate the input data of the ARAquá software leaching model will produce a more critical scenario of groundwater contamination than interpolating the output data of the same model. The aim of this work was therefore to implement the stochastic simulation methodology as the interpolation procedure for the ARAquá software input and output data, with subsequent comparison of the results. The study was conducted for a sugarcane area with simulated application of the herbicide Tebuthiuron, in São Manuel, SP, Brazil. Two approaches were considered: Calculate Before - Interpolate After (CI) and Interpolate Before - Calculate After (IC). Both approaches considered groundwater depths of 2 m and 1 m. For the CI approach, the ARAquá software, univariate variograms of the estimated concentrations, and Sequential Gaussian Simulation (SGS) were applied.
For the IC approach, the Linear Model of Coregionalization (LMC) of the soil parameters, co-Sequential Gaussian Simulation (co-SGS), and the ARAquá software were applied to obtain the simulated concentrations. The results showed that the IC approach produced the worst-case scenario for simulated Tebuthiuron concentrations in groundwater, and acute risk to aquatic plants when considering a 1 m groundwater depth. Through the LMC analysis it was possible to identify that field capacity water content, organic ...
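The CI/IC distinction can be made concrete with a toy sketch. Below, a hypothetical one-line function stands in for the ARAquá leaching model and inverse-distance weighting stands in for sequential Gaussian (co-)simulation, which is considerably more involved; the point is only the ordering of the "calculate" and "interpolate" steps, and every functional form and constant here is an assumption.

```python
import numpy as np

def leaching_concentration(theta):
    # Stand-in for the ARAquá leaching model: maps soil parameters at one
    # location to a groundwater concentration (hypothetical form).
    field_capacity, organic_carbon = theta
    return np.exp(-3.0 * organic_carbon) * field_capacity

def idw(train_xy, train_vals, grid_xy, p=2.0):
    # Inverse-distance weighting as a stand-in for the geostatistical
    # interpolator (the thesis uses sequential Gaussian (co-)simulation).
    d = np.linalg.norm(grid_xy[:, None, :] - train_xy[None, :, :], axis=-1) + 1e-9
    w = d ** -p
    return (w @ train_vals) / w.sum(axis=1)

rng = np.random.default_rng(2)
train_xy = rng.uniform(0, 100, (30, 2))          # sampled soil locations
soil = rng.uniform(0.1, 0.5, (30, 2))            # [field capacity, organic carbon]
grid_xy = np.stack(np.meshgrid(np.arange(0, 100, 10),
                               np.arange(0, 100, 10)), -1).reshape(-1, 2)

# CI: calculate at the samples first, interpolate the concentrations after.
ci = idw(train_xy, np.array([leaching_concentration(t) for t in soil]), grid_xy)

# IC: interpolate each soil parameter first, run the model on the grid after.
params = np.column_stack([idw(train_xy, soil[:, j], grid_xy) for j in range(2)])
ic = np.array([leaching_concentration(t) for t in params])
```

Because the model is nonlinear in its inputs, `ci` and `ic` generally differ, which is exactly the effect the thesis investigates.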
64

Extraction of fetal ECG and its characteristics using multi-modality

Noorzadeh, Saman 02 November 2015 (has links)
Fetal health must be carefully monitored during pregnancy to detect fetal cardiac diseases early and provide appropriate treatment. Technological developments allow monitoring during pregnancy using the non-invasive fetal electrocardiogram (ECG). Non-invasive fetal ECG is a method not only for detecting the fetal heart rate but also for analyzing the morphology of the fetal ECG, an analysis currently limited to the invasive ECG recorded during delivery. However, the non-invasive fetal ECG recorded from the mother's abdomen is contaminated by several noise sources, among which the maternal ECG is the most prominent. In the present study, the problem of non-invasive fetal ECG extraction is tackled using multi-modality. Besides the ECG signal, this approach benefits from the phonocardiogram (PCG) as another signal modality, which can provide complementary information about the fetal ECG. A general method for quasi-periodic signal analysis and modeling is first described, and its application to ECG denoising and fetal ECG extraction is explained.
Considering the difficulties caused by the synchronization of the two modalities, event detection in quasi-periodic signals is also studied, which can be specialized to the detection of R-peaks in the ECG signal. The method considers both the clinical and the signal processing aspects of the application to ECG and PCG signals. These signals are introduced and their characteristics explained. Then, using the PCG signal as the reference, Gaussian process modeling is employed to provide flexible models and nonlinear estimates. The method also tries to facilitate the practical implementation of the device by using as few channels as possible and by using only a 1-bit reference signal. The method is tested on synthetic data and also on real data recorded to begin building a synchronous multi-modal data set. Since no standard protocol for acquiring these modalities has yet received much attention, the factors that influence the signals during the recording procedure are introduced and their difficulties and effects investigated. The results show that the multi-modal approach is efficient for detecting R-peaks, and hence for extracting the fetal heart rate, and that it also provides results on the morphology of the fetal ECG.
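As a rough illustration of the quasi-periodic modeling step, the sketch below denoises a synthetic beat-like signal with a GP whose kernel is a periodic term damped by a squared-exponential envelope, a common choice for signals that repeat but drift from cycle to cycle; the waveform, kernel form and all parameters are illustrative assumptions, and the PCG-referenced, multi-modal part of the method is not reproduced.

```python
import numpy as np

def quasi_periodic_kernel(t1, t2, period=0.8, ell_p=0.3, ell_se=2.0, sf2=1.0):
    # Periodic kernel damped by a squared-exponential envelope: correlations
    # repeat at the beat period but decay slowly over longer time lags.
    tau = t1[:, None] - t2[None, :]
    per = np.exp(-2.0 * np.sin(np.pi * tau / period) ** 2 / ell_p**2)
    env = np.exp(-0.5 * tau**2 / ell_se**2)
    return sf2 * per * env

def gp_denoise(t, y, sn2=0.05):
    # Posterior mean of the latent signal under the quasi-periodic prior.
    K = quasi_periodic_kernel(t, t)
    return K @ np.linalg.solve(K + sn2 * np.eye(len(t)), y)

rng = np.random.default_rng(3)
t = np.linspace(0, 4, 400)
clean = np.sin(2 * np.pi * t / 0.8) ** 7        # crude spiky, beat-like waveform
y = clean + 0.4 * rng.standard_normal(len(t))
denoised = gp_denoise(t, y)
```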
65

Sequential simulation on input or output data interpolation of the Araquá software leaching model

Moraes, Diego Augusto de Campos, 1985. January 2015 (has links)
Advisor: Célia Regina Lopes Zimback / Co-advisor: Claudio Aparecido Spadotto / Co-advisor: Annamaria Castrignanò / Committee member: Luis Gustavo Frediani Lessa / Committee member: Paulo Milton Barbosa Landim / Committee member: Alessandra Fagioli da Silva / Committee member: Anderson Antonio da Conceição Sartori / Doctorate
66

Analytic Long Term Forecasting with Periodic Gaussian Processes

Ghassemi, Nooshin Haji January 2014 (has links)
In many application domains, such as weather forecasting, robotics and machine learning, we need to model, predict and analyze the evolution of periodic systems. For instance, time series that follow periodic patterns appear in climatology, where CO2 emissions and temperature changes follow periodic or quasi-periodic patterns. Another example is in robotics, where the joint angle of a rotating robotic arm follows a periodic pattern. It is often very important to make long-term predictions of the evolution of such systems. For modeling and prediction, Gaussian processes are powerful methods that can be adjusted to the properties of the problem at hand. Gaussian processes belong to the class of probabilistic kernel methods, where the kernel encodes the characteristics of the problem into the model. For systems with periodic evolution, taking the periodicity into account can simplify the problem considerably: a Gaussian process model can account for periodicity by using a periodic kernel. Long-term predictions need to deal with uncertain points, which are expressed by a distribution rather than a deterministic point. Unlike prediction at deterministic points, prediction at uncertain points is analytically intractable for Gaussian processes. Approximation methods such as moment matching allow the uncertainty to be handled in analytic closed form, but only particular kernels admit analytic moment matching, and the standard periodic kernel does not when performing long-term predictions. This work presents an analytic approximation method for long-term forecasting in periodic systems. We present a different parametrization of the standard periodic kernel, which allows moment matching to be approximated in analytic closed form. We evaluate our approximate method on different periodic systems. The results indicate that the proposed method is valuable for long-term forecasting of periodic processes.
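The kind of reparametrization described here can be made explicit. The identity below, a standard one consistent with the abstract's description (the thesis's exact parametrization may differ), rewrites the standard periodic kernel as a squared-exponential kernel on sine/cosine-warped inputs, a family for which Gaussian-input expectations, and hence moment matching, are analytically tractable:

```latex
k_{\mathrm{per}}(x, x') = \sigma_f^2 \exp\!\left(-\frac{2\sin^2\!\big(\pi (x - x')/p\big)}{\ell^2}\right),
\qquad
u(x) = \begin{pmatrix}\sin(2\pi x/p)\\[2pt] \cos(2\pi x/p)\end{pmatrix},
\qquad
\|u(x)-u(x')\|^2 = 4\sin^2\!\big(\pi(x-x')/p\big),
```
```latex
k_{\mathrm{SE}}\big(u(x), u(x')\big)
= \sigma_f^2 \exp\!\left(-\frac{\|u(x)-u(x')\|^2}{2\ell^2}\right)
= k_{\mathrm{per}}(x, x').
```

The second identity follows from the first two because the squared chord distance on the unit circle is 4 sin^2 of half the angle difference, so a periodic GP in x is exactly a squared-exponential GP in the warped coordinates u(x).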
67

Discriminative pose estimation using mixtures of Gaussian processes

Fergie, Martin Paul January 2013 (has links)
This thesis proposes novel algorithms for using Gaussian processes for discriminative pose estimation. We overcome the traditional limitations of Gaussian processes, their cubic training complexity and their uni-modal predictive distribution, by assembling them in a mixture-of-experts formulation. Our first contribution shows that by creating a large number of fixed-size Gaussian process experts, we can build a model that scales to large data sets and accurately learns the multi-modal and non-linear mapping between image features and the subject's pose. We demonstrate that this model gives state-of-the-art performance compared to other discriminative pose estimation techniques. We then extend the model to automatically learn the size and location of each expert. Gaussian processes can accurately model non-linear functional regression problems where the output is given as a function of the input. However, when an individual Gaussian process is trained on data that contains multi-modalities, or varying levels of ambiguity, it is unable to model the data accurately. We propose a novel algorithm for learning the size and location of each expert in our mixture of Gaussian processes model, ensuring that the training data of each expert matches the assumptions of a Gaussian process. We show that this model outperforms our previous mixture of Gaussian processes model. Our final contribution is a dynamics framework for inferring a smooth sequence of pose estimates from a sequence of independent predictive distributions. Discriminative pose estimation infers the pose of each frame independently, leading to jittery tracking results. Our novel algorithm uses a model of human dynamics to infer a smooth path through the sequence of Gaussian mixture models given by our mixture of Gaussian processes model. We show that this algorithm smooths and corrects some mistakes made by the appearance model alone, and outperforms a baseline linear dynamical system.
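A minimal sketch of the prediction side of such a model follows: fixed-size experts are formed by partitioning the training inputs, each expert makes a standard GP prediction, and a distance-based softmax gate combines the experts into a Gaussian-mixture predictive whose moments follow from the law of total variance. The gating rule, expert partitioning and hyperparameters are illustrative assumptions; the thesis learns expert size and location rather than fixing them.

```python
import numpy as np

def gp_fit_predict(X, y, Xs, ell=1.0, sf2=1.0, sn2=0.1):
    # Standard full-GP predictive mean/variance (RBF kernel, fixed hypers).
    K = sf2 * np.exp(-0.5 * (X - X.T) ** 2 / ell**2) + sn2 * np.eye(len(X))
    Ks = sf2 * np.exp(-0.5 * (Xs - X.T) ** 2 / ell**2)
    mean = Ks @ np.linalg.solve(K, y)
    var = sf2 + sn2 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, var

def mixture_predict(X, y, Xs, n_experts=4):
    # Fixed-size local experts obtained by sorting on the input.
    order = np.argsort(X[:, 0])
    means, variances, centres = [], [], []
    for idx in np.array_split(order, n_experts):
        m, v = gp_fit_predict(X[idx], y[idx], Xs)
        means.append(m); variances.append(v); centres.append(X[idx].mean())
    means, variances = np.array(means), np.array(variances)
    # Distance-based softmax gate (a stand-in for a learned gating model).
    logits = -np.abs(Xs[:, 0][None, :] - np.array(centres)[:, None])
    w = np.exp(logits) / np.exp(logits).sum(0)
    mix_mean = (w * means).sum(0)
    # Law of total variance for the Gaussian-mixture predictive.
    mix_var = (w * (variances + means**2)).sum(0) - mix_mean**2
    return mix_mean, mix_var

# Toy usage: a piecewise target that a single GP would struggle to model.
rng = np.random.default_rng(6)
X = np.sort(rng.uniform(-4, 4, (300, 1)), axis=0)
y = np.where(X[:, 0] < 0, np.sin(3 * X[:, 0]),
             np.cos(2 * X[:, 0])) + 0.1 * rng.standard_normal(300)
mu, var = mixture_predict(X, y, np.linspace(-4, 4, 9)[:, None])
```

Training each expert on its own chunk also replaces one cubic-cost solve with several much smaller ones, which is the scaling argument behind the mixture formulation.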
68

Bayesian-Entropy Method for Probabilistic Diagnostics and Prognostics of Engineering Systems

January 2020 (has links)
Information exists in various forms, and better utilization of the available information can benefit system awareness and response predictions. The focus of this dissertation is the fusion of different types of information using the Bayesian-Entropy method. The Maximum Entropy method in information theory introduces a unique way of handling information in the form of constraints. The Bayesian-Entropy (BE) principle is proposed to integrate Bayes' theorem and the Maximum Entropy method to encode extra information. The posterior distribution in the Bayesian-Entropy method has a Bayesian part that handles point observation data, and an Entropy part that encodes constraints, such as statistical moment information, range information and general functional relationships between variables. The proposed method is then extended to its network format as the Bayesian-Entropy Network (BEN), which serves as a generalized information fusion tool for diagnostics, prognostics, and surrogate modeling. The proposed BEN is demonstrated and validated with extensive engineering applications. The BEN method is first demonstrated for damage diagnostics of gas pipelines and metal/composite plates. Both empirical knowledge and physics models are integrated with direct observations to improve diagnostic accuracy and to reduce the number of training samples. Next, BEN is demonstrated for prognostics and safety assessment in the air traffic management system. Various information types, such as human concepts, variable correlation functions, physical constraints, and tendency data, are fused in BEN to enhance safety assessment and risk prediction in the National Airspace System (NAS). Following this, the BE principle is applied to surrogate modeling. Multiple algorithms based on different types of information encoding, such as Bayesian-Entropy Linear Regression (BELR), Bayesian-Entropy Semiparametric Gaussian Process (BESGP), and Bayesian-Entropy Gaussian Process (BEGP), are proposed and demonstrated on numerical toy problems and practical engineering analyses. The results show that the major benefits are superior prediction/extrapolation performance and a significant reduction in training samples when additional physics/knowledge is used as constraints. The proposed BEN offers a systematic and rigorous way to incorporate various information sources. Several major conclusions are drawn from the proposed study. / Dissertation/Thesis / Doctoral Dissertation Mechanical Engineering 2020
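In schematic form, and as an assumption consistent with the abstract rather than the dissertation's exact formulation, a posterior with a Bayesian part and an Entropy part can be written as

```latex
p(\theta \mid D) \;\propto\;
\underbrace{p(D \mid \theta)\, p(\theta)}_{\text{Bayesian part: point observations}}
\;\cdot\;
\underbrace{\exp\!\Big( \sum_{i=1}^{m} \lambda_i\, g_i(\theta) \Big)}_{\text{Entropy part: encoded constraints}}
```

where each g_i(theta) encodes a constraint such as a statistical moment, a range indicator, or a known functional relation between variables, and the Lagrange multipliers lambda_i are chosen so that the posterior satisfies the constraints E[g_i(theta)] = c_i. With no constraints, the expression reduces to the ordinary Bayesian posterior.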
69

Spatial Regression and Gaussian Process BART

January 2020 (has links)
Spatial regression is one of the central topics in spatial statistics. Depending on the goal, interpretation or prediction, spatial regression models can be classified into two categories: linear mixed regression models and nonlinear regression models. This dissertation explored these models and their real-world applications, and proposed new methods and models to overcome challenges in practice. There are three major parts. In the first part, nonlinear regression models were embedded into a multistage workflow to predict the spatial abundance of reef fish species in the Gulf of Mexico. There were two challenges: zero-inflated data and out-of-sample prediction. The methods and models in the workflow could effectively handle the zero-inflated sampling data without strong assumptions, and three strategies were proposed to solve the out-of-sample prediction problem. The results and discussion showed that the nonlinear prediction had the advantages of high accuracy, low bias and good performance at multiple resolutions. In the second part, a two-stage spatial regression model was proposed for analyzing soil carbon stock (SOC) data. In the first stage, a spatial linear mixed model captured the linear and stationary effects. In the second stage, a generalized additive model was used to explain the nonlinear and nonstationary effects. The results illustrated that the two-stage model had good interpretability for understanding the effects of covariates while maintaining prediction accuracy competitive with popular machine learning models such as random forest, XGBoost and support vector machines. A new nonlinear regression model, Gaussian process BART (Bayesian additive regression trees), was proposed in the third part. Combining the advantages of both BART and Gaussian processes, the model can capture the nonlinear effects of both observed and latent covariates. To develop the model, the traditional BART was first generalized to accommodate correlated errors. Then, the failure of likelihood-based Markov chain Monte Carlo (MCMC) in parameter estimation was discussed. Based on the idea of analysis of variation, two strategies, back comparing and tuning range, were proposed to tackle this failure. Finally, the effectiveness of the new model was examined through experiments on both simulated and real data. / Dissertation/Thesis / Doctoral Dissertation Statistics 2020
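The additive-decomposition idea behind such a model can be caricatured in a few lines: fit an ensemble of trees to the mean structure, then let a GP posterior mean absorb the spatially correlated residual. The sketch below is a crude two-stage stand-in, not the thesis's MCMC-based GP BART, and every modeling choice in it (stumps, fixed kernel, a single pass) is an assumption for illustration.

```python
import numpy as np

def fit_stump(x, r):
    # Greedy single-split regression stump on residuals r.
    best = (np.inf, None)
    for s in np.quantile(x, np.linspace(0.05, 0.95, 19)):
        left, right = r[x <= s].mean(), r[x > s].mean()
        sse = ((r - np.where(x <= s, left, right)) ** 2).sum()
        if sse < best[0]:
            best = (sse, (s, left, right))
    return best[1]

def trees_plus_gp(x, y, n_trees=50, lr=0.1, ell=1.0, sf2=0.5, sn2=0.05):
    # Stage 1: boosted stumps play the role of the additive tree ensemble.
    tree_pred = np.zeros_like(y)
    for _ in range(n_trees):
        s, l, r_ = fit_stump(x, y - tree_pred)
        tree_pred += lr * np.where(x <= s, l, r_)
    # Stage 2: a GP posterior mean absorbs the spatially correlated residual.
    resid = y - tree_pred
    K = sf2 * np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell**2)
    gp_part = K @ np.linalg.solve(K + sn2 * np.eye(len(x)), resid)
    return tree_pred, gp_part

# Toy usage: a jump plus a smooth correlated component.
rng = np.random.default_rng(7)
x = np.sort(rng.uniform(0, 10, 200))
y = np.where(x < 5, 1.0, -1.0) + np.sin(x) + 0.2 * rng.standard_normal(200)
trees, gp = trees_plus_gp(x, y)
fitted = trees + gp
```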
70

Spatial Function Estimation with Uncertain Sensor Locations

Ptáček, Martin January 2021 (has links)
This thesis addresses the task of spatial function estimation using Gaussian process regression (GPR) under uncertainty in the training positions (sensor positions). First, the theory behind the GPR method with known training positions is described. This theory is then applied to derive expressions for the GPR predictive distribution at a test position that take the uncertainty of the training positions into account. Because these expressions have no analytical solution, they were approximated using the Monte Carlo method. The derived method was shown to improve the quality of the spatial function estimate compared both to the standard use of GPR and to a simplified solution reported in the literature. The thesis further explores the possibility of using GPR with uncertain training positions in combination with expressions that admit an analytical solution. It turns out that obtaining such expressions requires substantial assumptions, which makes the predictive distribution inaccurate from the outset. It is also shown that the resulting method amounts to the standard GPR expressions combined with a modified covariance function. Simulations show that this method produces estimates very similar to the baseline GPR method that assumes known training positions. On the other hand, its predictive variance (estimation uncertainty) is increased, which is the desired effect of accounting for the uncertainty in the training positions.
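A minimal sketch of the Monte Carlo approximation described above, assuming an RBF kernel, fixed hyperparameters and Gaussian position noise: sample plausible sensor positions, run standard GPR for each sample, and combine the per-sample Gaussians with the law of total variance, which is what raises the predictive variance relative to GPR with known positions.

```python
import numpy as np

def gpr_predict(X, y, xs, ell=1.0, sf2=1.0, sn2=0.05):
    # Standard GPR predictive mean/variance at a single test point xs.
    K = sf2 * np.exp(-0.5 * (X - X.T) ** 2 / ell**2) + sn2 * np.eye(len(X))
    ks = sf2 * np.exp(-0.5 * (xs - X[:, 0]) ** 2 / ell**2)
    mean = ks @ np.linalg.solve(K, y)
    var = sf2 - ks @ np.linalg.solve(K, ks)
    return mean, var

def gpr_uncertain_inputs(X_nom, y, xs, pos_std=0.1, n_mc=500, seed=4):
    # Monte Carlo over sensor-position uncertainty: sample plausible training
    # positions, predict with each sample, combine by total mean/variance.
    rng = np.random.default_rng(seed)
    stats = [gpr_predict(X_nom + pos_std * rng.standard_normal(X_nom.shape),
                         y, xs) for _ in range(n_mc)]
    m = np.array([s[0] for s in stats])
    v = np.array([s[1] for s in stats])
    return m.mean(), v.mean() + m.var()   # law of total variance

# Toy usage: noisy sine observed at uncertain sensor positions.
rng = np.random.default_rng(5)
X_nom = np.linspace(0, 5, 25)[:, None]
y = np.sin(X_nom[:, 0]) + 0.1 * rng.standard_normal(25)
mu, var = gpr_uncertain_inputs(X_nom, y, xs=2.5)
```

The extra term m.var() is the spread of the per-sample predictive means, i.e. the part of the uncertainty contributed by not knowing the sensor positions exactly.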
