
EVALUATING THE IMPACT OF THE INFLATION FACTORS GENERATION FOR THE ENSEMBLE SMOOTHER WITH MULTIPLE DATA ASSIMILATION

THIAGO DE MENEZES DUARTE E SILVA 09 September 2021
The ensemble smoother with multiple data assimilation (ES-MDA) has gained much attention as a powerful parameter estimation method. The main idea of the ES-MDA is to assimilate the same data multiple times with an inflated data error covariance matrix. In the original ES-MDA implementation, these inflation factors, as well as the number of assimilations, are selected a priori. The only requirement is that the sum of the inverses of the inflation factors must equal one. Therefore, selecting them equal to the number of assimilations is a straightforward choice. Nevertheless, recent studies have shown a relationship between the ES-MDA update equation and the solution to a regularized inverse problem. Hence, the inflation factors play the role of the regularization parameter at each ES-MDA assimilation step. As a result, these studies have also suggested new procedures to generate the inflation factors based on the discrepancy principle. Although several studies have proposed efficient techniques to generate the ES-MDA inflation factors, an optimal procedure to generate them remains an open problem. Moreover, the studies diverge on which regularization scheme is sufficient to provide the best ES-MDA outcomes. Therefore, in this work, we address the problem of generating the ES-MDA inflation factors and their influence on the method's performance. We present a numerical analysis of the influence of such factors on the main parameters of the ES-MDA, namely the ensemble size, the number of assimilations, and the ES-MDA vector of model parameter updates. Based on the conclusions of this analysis, we propose a new procedure to generate ES-MDA inflation factors based on a regularizing scheme for Levenberg-Marquardt algorithms. It is shown through a synthetic two-dimensional waterflooding problem that the new method achieves a better match of both model parameters and data compared to the other ES-MDA implementations available in the literature.
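As an aside for readers new to the method, here is a minimal sketch of the generic ES-MDA loop described above, using the straightforward constant choice of inflation factors (each equal to the number of assimilations); the forward model, array shapes, and function names are illustrative assumptions, not the thesis's adaptive procedure.

```python
import numpy as np

def es_mda(forward_model, m_ens, d_obs, C_e, alphas, rng=np.random.default_rng(0)):
    """Generic ES-MDA: assimilate the same data len(alphas) times, inflating the
    data-error covariance C_e by alpha at each step.  Requires sum(1/alpha) == 1."""
    assert np.isclose(sum(1.0 / a for a in alphas), 1.0)
    for alpha in alphas:
        d_ens = np.array([forward_model(m) for m in m_ens.T]).T   # predicted data (Nd, Ne)
        dm = m_ens - m_ens.mean(axis=1, keepdims=True)            # parameter anomalies
        dd = d_ens - d_ens.mean(axis=1, keepdims=True)            # data anomalies
        ne = m_ens.shape[1]
        C_md = dm @ dd.T / (ne - 1)                               # cross-covariance
        C_dd = dd @ dd.T / (ne - 1)                               # predicted-data covariance
        # perturb the observations with the inflated noise, then update every member
        noise = rng.multivariate_normal(np.zeros(d_obs.size), alpha * C_e, size=ne).T
        K = C_md @ np.linalg.inv(C_dd + alpha * C_e)
        m_ens = m_ens + K @ (d_obs[:, None] + noise - d_ens)
    return m_ens

# Straightforward choice: four assimilations, all inflation factors equal to 4
# (their inverses sum to one, as required):
# m_post = es_mda(my_simulator, m_prior, d_obs, C_e, alphas=[4, 4, 4, 4])
```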

Exploiting Deep Learning and Traffic Models for Freeway Traffic Estimation

Genser, Alexander, Makridis, Michail A., Kouvelas, Anastasios 23 June 2023
Emerging sensors and intelligent traffic technologies provide extensive data sets in a traffic network. However, realizing the full potential of such data sets for a unique representation of real-world states is challenging due to data accuracy, noise, and temporal-spatial resolution. Data assimilation is a well-known family of methodological approaches that exploit physics-informed traffic models and data observations to perform short-term predictions of the traffic state in freeway environments. At the same time, neural networks capture strong non-linearities, similar to those present in traffic networks. Despite numerous works applying different variants of Kalman filters, the possibility of traffic state estimation with deep-learning-based methodologies is only partially explored in the literature. We present a deep-learning modeling approach to perform traffic state estimation on large freeway networks. The proposed framework is trained on local observations from static and moving sensors and identifies differences between well-trusted data and model outputs. The detected patterns are then used throughout the network, even where no observations are available, to estimate fundamental traffic quantities. The preliminary results of the work highlight the potential of deep learning for traffic state estimation.
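Purely as an illustration of the kind of correction scheme the abstract describes (learning the mismatch between model output and trusted sensor data, then reusing the learned pattern where no sensors exist), here is a hedged sketch; the feature set, network size, and all names are assumptions and do not come from the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical training set built at sensor-equipped freeway cells:
# columns = [traffic-model density, loop-detector flow, probe speed, hour of day / 24]
X_train = rng.random((5000, 4))
y_train = rng.random(5000)          # "trusted" density measured at those cells

# Learn the mapping from model output (plus context) to trusted measurements
corrector = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
corrector.fit(X_train, y_train)

# At cells with no sensors, only the physics-based model output and context are known;
# the learned pattern is transferred there to produce corrected density estimates.
X_unobserved = rng.random((100, 4))
density_estimate = corrector.predict(X_unobserved)
```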

State (hydrodynamics) Identification In The Lower St. Johns River Using The Ensemble Kalman Filter

Tamura, Hitoshi 01 January 2012
This thesis presents a method, the Ensemble Kalman Filter (EnKF), applied to a high-resolution, shallow water equations model (DG ADCIRC-2DDI) of the Lower St. Johns River with observation data at four gauging stations. The EnKF, a sequential data assimilation method for non-linear problems, is developed for tidal flow simulation to estimate the state variables, i.e., water levels and depth-integrated currents, on overland unstructured finite element meshes. The shallow water equations model is combined with observation data, which provides the basis of the EnKF applications. In this thesis, the EnKF is incorporated into the DG ADCIRC-2DDI code to estimate the state variables. Upon its development, DG ADCIRC-2DDI with EnKF is first validated by applying it to a low-resolution, shallow water equations model of a quarter annular harbor with synthetic observation data at six gauging stations. Second, DG ADCIRC-2DDI with EnKF is applied to a high-resolution, shallow water equations model of the Lower St. Johns River with real observation data at four gauging stations. Third, four different experiments are performed by applying DG ADCIRC-2DDI with EnKF to the Lower St. Johns River.
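For reference, a minimal, textbook-style stochastic EnKF analysis step of the kind used here is sketched below; the matrix form, variable names, and Gaussian perturbation of the gauge observations are generic assumptions and are not taken from the DG ADCIRC-2DDI code.

```python
import numpy as np

def enkf_analysis(X_f, y_obs, H, R, rng=np.random.default_rng(0)):
    """One stochastic EnKF analysis step.
    X_f: forecast ensemble of states (water levels/currents), shape (n_state, n_ens)
    H:   linear observation operator mapping the state to the gauging stations
    R:   observation-error covariance; y_obs: measurements at the gauges."""
    n_ens = X_f.shape[1]
    A = X_f - X_f.mean(axis=1, keepdims=True)          # ensemble anomalies
    P_HT = A @ (H @ A).T / (n_ens - 1)                  # P_f H^T estimated from the ensemble
    S = H @ P_HT + R                                    # innovation covariance
    K = P_HT @ np.linalg.inv(S)                         # Kalman gain
    # perturbed observations, one realization per ensemble member
    Y = y_obs[:, None] + rng.multivariate_normal(np.zeros(y_obs.size), R, size=n_ens).T
    return X_f + K @ (Y - H @ X_f)                      # analysis ensemble
```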

DIMENSIONLESS ENSEMBLE SMOOTHER WITH MULTIPLE DATA ASSIMILATION APPLIED ON AN INVERSE PROBLEM OF A MULTILAYER RESERVOIR WITH A DAMAGED ZONE

ADAILTON JOSE DO NASCIMENTO SOUSA 05 December 2022
The ES-MDA has been used extensively for inverse problems of oil reservoirs, with Bayesian statistics at its core. Important properties such as permeability, skin zone radius, and skin zone permeability are estimated from historical reservoir data using this ensemble-based method. In this thesis, the pressure measured at the well during an injectivity test was calculated using an analytical approach for a multilayer reservoir with a skin zone, using the Laplace transform. Stehfest's algorithm was used to invert the data back to the real (time) domain. Furthermore, using this approach, we were able to easily obtain the flow rate in each layer as a new datum to be considered in the ES-MDA, enriching the estimation of the targeted parameters. As we use flow rate and pressure as input data in the ES-MDA, it is important to ensure that the difference in orders of magnitude does not influence our estimates; for this reason, we chose to use the ES-MDA in dimensionless form. Aiming at a greater precision of our estimates, we used an algorithm to optimize the ES-MDA inflation factors.
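Since the Stehfest algorithm is central to the workflow above, a generic Gaver-Stehfest inversion routine is sketched below; the multilayer-reservoir pressure solution in Laplace space is not reproduced here, so a simple known transform is used as a sanity check.

```python
import math

def stehfest_invert(F, t, N=12):
    """Gaver-Stehfest numerical inversion of a Laplace-domain function F(s) at time t.
    N must be even; values around 10-14 are typical for smooth pressure responses."""
    assert N % 2 == 0
    half = N // 2
    total = 0.0
    for k in range(1, N + 1):
        v = 0.0                                   # Stehfest weight V_k
        for j in range((k + 1) // 2, min(k, half) + 1):
            v += (j ** half * math.factorial(2 * j)
                  / (math.factorial(half - j) * math.factorial(j)
                     * math.factorial(j - 1) * math.factorial(k - j)
                     * math.factorial(2 * j - k)))
        v *= (-1) ** (k + half)
        total += v * F(k * math.log(2.0) / t)
    return total * math.log(2.0) / t

# Sanity check on a known pair: F(s) = 1/(s+1)  <->  f(t) = exp(-t)
print(stehfest_invert(lambda s: 1.0 / (s + 1.0), t=1.0))   # ~0.3679
```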

Evaluation of the potential to estimate river discharge using measurements from the upcoming SWOT mission

Yoon, Yeosang 19 December 2013
No description available.

Probabilistic and Statistical Learning Models for Error Modeling and Uncertainty Quantification

Zavar Moosavi, Azam Sadat 13 March 2018
Simulations and modeling of large-scale systems are vital to understanding real-world phenomena. However, even advanced numerical models can only approximate the true physics. The discrepancy between model results and nature can be attributed to different sources of uncertainty, including the parameters of the model, input data, or some missing physics that is not included in the model due to a lack of knowledge or high computational costs. Uncertainty reduction approaches seek to improve model accuracy by decreasing the overall uncertainties in models. Aiming to contribute to this area, this study explores uncertainty quantification and reduction approaches for complex physical problems. It proposes several novel probabilistic and statistical approaches for identifying the sources of uncertainty, modeling the errors, and reducing uncertainty to improve model predictions for large-scale simulations. We explore different computational models. The first class of models studied herein are inherently stochastic, and their numerical approximations suffer from stability and accuracy issues. The second class of models are partial differential equations, which capture the laws of mathematical physics; however, they only approximate a more complex reality and have uncertainties due to missing dynamics that are not captured by the models. The third class are low-fidelity models, which are fast approximations of very expensive high-fidelity models; these reduced-order models have uncertainty due to the loss of information in the dimension reduction process. We also consider uncertainty analysis in the data assimilation framework, specifically for ensemble-based methods, where the effect of sampling errors is alleviated by localization. Finally, we study the uncertainty in numerical weather prediction models coming from approximate descriptions of physical processes. / Ph. D. / Computational models are used to understand the behavior of natural phenomena. Models approximate the evolution of the true phenomenon, or reality, in time. We obtain a more accurate forecast for the future by combining the model approximation with observations from reality; weather forecasting, oceanography, and geoscience provide some examples of such forecasting models. However, models can only approximate the true reality to some extent, and a model's approximation of reality is not perfect due to several sources of error or uncertainty. The noise in measurements or observations from nature, the uncertainty in some model components, missing components in models, and the interaction between different components of the model all cause the model forecast to differ from reality. The aim of this study is to explore techniques and approaches for modeling the error and uncertainty of computational models, provide solutions and remedies to reduce the error of the model forecast, and ultimately improve the model forecast. Taking the discrepancy, or error, between the model forecast and reality over time and mining that error provides valuable information about the origin of uncertainty in models as well as the hidden dynamics not considered in the model. Statistical and machine-learning-based solutions are proposed in this study to identify the source of uncertainty, capture the uncertainty, and use that information to reduce the error and enhance the model forecast. We studied error modeling, error and uncertainty quantification, and reduction techniques in several frameworks, from chemical models to weather forecast models. In each of these models, we sought to provide a suitable solution to detect the origin of uncertainty, model the error, and reduce the uncertainty to improve the model forecast.

Oceanic currents estimation from satellite image sequences

Beyou, Sébastien 12 July 2013
This thesis studies particle filtering-based data assimilation methods for estimating fluid flows observed through image sequences. We rely on a specific particle filter whose proposal distribution is given by an ensemble Kalman filter, namely the Weighted Ensemble Kalman Filter. Two variations of this method are introduced and studied. The first uses a dynamical noise (which models the model uncertainty and separates the particles from each other) whose spatial form follows a power law, consistent with the phenomenological theory of turbulence. The second variation relies on a multiscale assimilation scheme introducing successive refinements from observations at smaller and smaller scales. These two methods were tested on synthetic and experimental sequences of 2D incompressible flows. The results show a significant gain in root mean square error. The methods were then tested on real satellite image sequences, where good temporal coherence is observed, as well as good tracking of vortex structures. The multiscale assimilation shows a visible gain in the number of reconstructed scales. Some additional variations are also presented and tested in order to handle important problems encountered in a real satellite context, notably the handling of missing data in the sea surface temperature images. Lastly, an experiment with a Weighted Ensemble Kalman Filter coupled to a complete oceanic model is presented for the assimilation of surface current fields in the Iroise Sea, at the mouth of the English Channel. Some further avenues of improvement are also sketched and tested.
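To make the first variation concrete, below is a hedged sketch of how an additive noise field with a power-law spatial spectrum can be generated with an FFT; the -5/3 slope mimics the Kolmogorov spectrum, but the exact perturbation used in the thesis's Weighted Ensemble Kalman Filter is not reproduced here.

```python
import numpy as np

def power_law_noise_2d(n, slope=-5.0 / 3.0, rng=np.random.default_rng(0)):
    """Zero-mean, unit-variance 2D noise whose power spectrum decays like k**slope."""
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k = np.sqrt(kx**2 + ky**2)
    k[0, 0] = 1.0                                    # avoid dividing by zero at the mean mode
    amplitude = k ** (slope / 2.0)                   # power ~ amplitude**2 ~ k**slope
    phase = rng.uniform(0.0, 2.0 * np.pi, size=(n, n))
    field = np.fft.ifft2(amplitude * np.exp(1j * phase)).real
    return (field - field.mean()) / field.std()

# one member's additive dynamical-noise perturbation on a 128 x 128 velocity grid
noise = power_law_noise_2d(128)
```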

Spatial downscaling of Meteosat temperatures based on a data assimilation approach (Particle Smoother) to constrain a land surface model

Mechri, Rihab 04 December 2014
Land surface temperature (LST) is one of the most important meteorological variables, giving access to the water and energy budgets governing the biosphere-atmosphere continuum. Because of its high variability in space and time, monitoring vegetation and energy states requires LST measurements at high spatial and temporal resolution. Despite the growing availability of thermal infrared (TIR) remote-sensing LST products at different spatial and temporal resolutions, TIR data at both high spatial resolution (HSR) and high temporal resolution (HTR) are still not available because of the satellite resolution trade-off: the most frequent LST products have low spatial resolution (LSR). It is therefore necessary to develop methods to estimate HSR/HTR LST from the available LSR/HTR TIR data. This solution is known as "downscaling", and this thesis proposes a new approach for downscaling LST based on data assimilation (DA). The basic idea is to constrain the HSR/HTR LST dynamics simulated by a dynamical model by minimizing the discrepancy between their aggregated values and the LSR observations, under the assumption that LST is homogeneous per land cover type within the LSR pixel. Our method uses a particle smoother DA scheme implemented in a land surface model, SETHYS (Suivi de l'Etat Hydrique du Sol). The proposed approach was first evaluated in a synthetic framework and then validated using actual TIR LST over a small area in the south-east of France. Meteosat LST time series were downscaled from 5 km to 90 m and validated against ASTER HSR LST over one day. The encouraging results led us to expand the study area and consider a larger assimilation period of seven months. The downscaled Meteosat LSTs were validated quantitatively at 1 km spatial resolution (SR) with MODIS data and qualitatively at 30 m SR with Landsat7 data. The results show good performance, with downscaling errors below 2.5 K at the MODIS scale (1 km SR).
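The aggregation constraint at the heart of this approach can be illustrated with a toy weighting step of a particle filter/smoother; the per-class temperatures, the Gaussian likelihood, and the area-fraction aggregation below are assumptions made for the sketch, not the SETHYS implementation.

```python
import numpy as np

def particle_weights(lst_per_class, land_cover, lsr_obs, sigma_obs=1.0):
    """Weight particles by how well their aggregated LST matches one LSR observation.
    lst_per_class: (n_particles, n_classes) HSR temperatures, one value per land-cover
    class inside the LSR pixel (the homogeneity assumption); land_cover: class index of
    every HSR cell in that pixel; lsr_obs: the coarse (e.g. Meteosat) LST observation."""
    _, counts = np.unique(land_cover, return_counts=True)
    fractions = counts / counts.sum()                   # area fraction of each class
    aggregated = lst_per_class @ fractions              # area-weighted aggregate per particle
    log_w = -0.5 * ((aggregated - lsr_obs) / sigma_obs) ** 2
    w = np.exp(log_w - log_w.max())
    return w / w.sum()

# 50 particles, 3 land-cover classes inside one 5 km pixel resolved by 90 m cells
rng = np.random.default_rng(0)
particles = 290.0 + 5.0 * rng.standard_normal((50, 3))   # per-class temperatures [K]
cover_map = rng.integers(0, 3, size=(55, 55)).ravel()
weights = particle_weights(particles, cover_map, lsr_obs=292.0)
```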

Use of social media data in flood monitoring

Restrepo Estrada, Camilo Ernesto 05 November 2018
Floods are one of the most devastating types of disasters worldwide in terms of human, economic, and social losses. If authoritative data is scarce, or unavailable for some periods, other sources of information are required to improve streamflow estimation and early flood warnings. Georeferenced social media messages are increasingly being regarded as an alternative source of information for coping with flood risks. However, existing studies have mostly concentrated on the links between geo-social media activity and flooded areas. This thesis presents a novel methodology that helps to close the research gap regarding the use of social networks as a proxy for precipitation-runoff and flood forecast estimates. To address this, it proposes a transformation function that creates a proxy variable for rainfall by analysing messages from geo-social media together with precipitation measurements from authoritative sources, which are then incorporated into a hydrological model for flow estimation. The proxy and authoritative rainfall data are then merged for use in a data assimilation scheme based on the Ensemble Kalman Filter (EnKF). It is found that the combined use of authoritative rainfall values with the social media proxy variable as input to the Probability Distributed Model (PDM) improves flow simulations for flood monitoring. In addition, it is found that when these models are run under a fusion-assimilation scheme, the results improve even further, yielding a tool that can help in the monitoring of "ungauged" or "poorly gauged" catchments. The main contribution of this thesis is the creation of a completely original source of rain monitoring, which had not been explored quantitatively in the literature. It also shows how the joint use of this source and data assimilation methodologies aids in detecting flood events.
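As a rough illustration of the pipeline described above, the sketch below turns counts of georeferenced flood-related messages into a rainfall proxy and merges it with gauge data before it would be fed to the PDM/EnKF scheme; the power-law form, its coefficients, and the simple convex blending are assumptions, not the calibrated transformation function of the thesis.

```python
import numpy as np

def rainfall_proxy(msg_counts, a=0.8, b=1.2):
    """Hypothetical transformation: hourly geo-social message counts -> rainfall proxy [mm/h]."""
    return a * np.asarray(msg_counts, dtype=float) ** b

def merge_rainfall(gauge, proxy, w_gauge=0.7):
    """Blend authoritative gauge rainfall with the proxy; fall back to the proxy where
    the gauge record is missing (NaN)."""
    gauge = np.asarray(gauge, dtype=float)
    merged = w_gauge * gauge + (1.0 - w_gauge) * proxy
    return np.where(np.isnan(gauge), proxy, merged)

counts = np.array([0, 2, 15, 40, 12, 3])                 # flood-related messages per hour
gauge = np.array([0.0, 1.0, np.nan, 22.0, 8.0, 0.5])      # mm/h, one missing hour
forcing = merge_rainfall(gauge, rainfall_proxy(counts))   # rainfall series for the hydrological model
```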
