81

Streamflow and Soil Moisture Assimilation in the SWAT model Using the Extended Kalman Filter

Sun, Leqiang January 2016 (has links)
Numerical models often fail to accurately simulate and forecast hydrological states in operation because of their inherent uncertainties. Data Assimilation (DA) is a promising technique that uses real-time observations to modify a model's parameters and internal variables so that it becomes more representative of the actual state of the system it describes. In this thesis, hydrological DA is first reviewed from the perspective of its objectives, scope, applications and the challenges it faces. Special attention is then given to nonlinear Kalman filters such as the Extended Kalman Filter (EKF). Based on a review of the existing studies, it is found that the potential of the EKF has not been fully exploited. The Soil and Water Assessment Tool (SWAT) is a semi-distributed rainfall-runoff model that is widely used in agricultural water management and flood forecasting. However, studies of hydrological DA based on distributed models are relatively rare because hydrological DA is still in its infancy, with many issues to be resolved, and linear statistical models and lumped rainfall-runoff models are often used for the sake of simplicity. This study aims to fill this gap by assimilating streamflow and surface soil moisture observations into the SWAT model to improve its state simulation and forecasting capability. Unless specifically defined, all 'forecasts' are based on the assumption of perfect knowledge of the meteorological forecast. The EKF is chosen as the DA method for its solid theoretical basis and parsimonious implementation procedure. Given the large number of parameters and storage variables in SWAT, only the watershed-scale variables are included in the state vector, and the Hydrological Response Unit (HRU) scale variables are updated with the a posteriori/a priori ratio of their watershed-scale counterparts. The Jacobian matrix is calculated numerically by perturbing the state variables. Two case studies are carried out with real observation data in order to verify the effectiveness of EKF assimilation: the upstream section of the Senegal River (above Bakel station) in western Africa is chosen for streamflow assimilation, and the USDA ARS Little Washita experimental watershed is chosen to examine surface soil moisture assimilation. In the case of streamflow assimilation, a spin-off study is conducted to compare EKF state-parameter assimilation with a linear autoregressive (AR) output assimilation for improving SWAT's flood forecasting capability. The influence of precipitation forecast uncertainty on the effectiveness of EKF assimilation is discussed in the context of surface soil moisture assimilation. In streamflow assimilation, the EKF was found to be effective mostly in the wet season, owing to the weak connection between runoff, soil moisture and the curve number (CN2) in dry seasons. Both soil moisture and CN2 were significantly updated in the wet season, despite having opposite update patterns. The flood forecast is moderately improved for lead times of up to seven days, especially in the flood period, by applying the EKF subsequent open loop (EKFsOL) scheme, and it is further improved with a newly designed quasi-error update scheme. A comparison between EKF and AR output assimilation in flood forecasting reveals that while both methods can improve forecast accuracy, their performance is influenced by the hydrological regime of the particular year: the EKF outperformed the AR model in dry years, while the AR model outperformed the EKF in wet years. Compared to the AR model, the EKF is more robust and less sensitive to the length of the forecast lead time, and a combined EKF-AR method provides satisfactory results in both dry and wet years. The assimilation of surface soil moisture proves effective in improving the full-profile soil moisture and streamflow estimates. The choice of state and observation vectors has a great impact on the assimilation results: the state vector with streamflow and all-layer soil moisture outperforms other, more complicated state vectors, including those augmented with intermediate variables and model parameters. The joint assimilation of surface soil moisture and streamflow observations provides a much better estimate of soil moisture than assimilating streamflow alone. The updated SWAT model is sufficiently robust to issue improved forecasts of soil moisture and streamflow after the assimilation is 'unplugged'. Error quantification is found to be critical to the performance of EKF assimilation; nevertheless, an adaptive EKF shows no advantage over a trial-and-error approach for determining time-invariant model errors. The robustness of EKF assimilation is further verified by explicitly perturbing the precipitation 'forecast' in the EKF subsequent forecasts: the open-loop model without a previous EKF update is more vulnerable to erroneous precipitation estimates, and soil moisture forecasting is found to be more resilient to erroneous precipitation input than streamflow forecasting.
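As a rough illustration of the filter described above (a hedged sketch under simplifying assumptions, not code from the thesis), the following Python snippet performs one EKF forecast/analysis cycle in which the Jacobians of a generic model and observation operator are obtained numerically by perturbing the state variables; the function names, perturbation size and error covariances are all illustrative.

```python
import numpy as np

def numerical_jacobian(f, x, eps=1e-4):
    """Approximate df/dx by perturbing each state variable in turn."""
    fx = np.atleast_1d(f(x))
    J = np.empty((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros(x.size)
        dx[i] = eps * max(abs(x[i]), 1.0)
        J[:, i] = (np.atleast_1d(f(x + dx)) - fx) / dx[i]
    return J

def ekf_step(model, obs_op, x, P, y, Q, R):
    """One forecast/analysis cycle of the Extended Kalman Filter."""
    M = numerical_jacobian(model, x)            # tangent-linear of the model
    x_f = model(x)                              # a priori (forecast) state
    P_f = M @ P @ M.T + Q                       # forecast error covariance
    H = numerical_jacobian(obs_op, x_f)         # linearized observation operator
    S = H @ P_f @ H.T + R                       # innovation covariance
    K = P_f @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_a = x_f + K @ (y - np.atleast_1d(obs_op(x_f)))   # a posteriori (analysis) state
    P_a = (np.eye(x.size) - K @ H) @ P_f        # analysis error covariance
    return x_a, P_a

# HRU-scale storages that are not carried in the state vector would then be
# rescaled by the a posteriori / a priori ratio of their watershed-scale
# counterparts, e.g. hru_soil_water *= x_a[k] / x_f[k].
```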
82

Analyse de séries temporelles d’images à moyenne résolution spatiale : reconstruction de profils de LAI, démélangeage : application pour le suivi de la végétation sur des images MODIS / Time series analysis of medium spatial resolution remote sensing images: LAI reconstruction, unmixing: application to vegetation monitoring on MODIS data

Gong, Xing 30 January 2015 (has links)
This PhD dissertation is concerned with time series analysis for medium spatial resolution (MSR) remote sensing images. The main advantage of MSR data is their high temporal rate, which makes it possible to monitor land use. However, two main problems arise with such data. First, because of cloud coverage and bad acquisition conditions, the resulting time series are often corrupted and not directly exploitable. Second, pixels in medium spatial resolution images are often "mixed", in the sense that their spectral response is a combination of the responses of several "pure" elements. These two problems are addressed in this dissertation. First, we propose a data assimilation technique able to recover consistent time series of Leaf Area Index (LAI) from corrupted MODIS sequences; to this end, a plant growth model, namely GreenLab, is used as a dynamical constraint. Second, we propose a new and efficient unmixing technique for time series, based in particular on "elastic" kernels able to properly compare time series that are shifted in time or of different lengths. Experimental results on both synthetic and real data demonstrate the good performance of the proposed methodologies.
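For readers unfamiliar with "elastic" kernels, the sketch below (an illustrative assumption, not the thesis implementation) shows a simple global-alignment-style kernel that can compare two series of different lengths or with a temporal shift; the Gaussian bandwidth sigma and the synthetic LAI-like profiles are arbitrary choices.

```python
import numpy as np

def global_alignment_kernel(x, y, sigma=1.0):
    """Elastic kernel between 1-D series x and y (different lengths allowed)."""
    n, m = len(x), len(y)
    M = np.zeros((n + 1, m + 1))
    M[0, 0] = 1.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # local Gaussian similarity between samples x[i-1] and y[j-1]
            k_ij = np.exp(-(x[i - 1] - y[j - 1]) ** 2 / (2.0 * sigma ** 2))
            # sum over all monotone alignments (the "elastic" part)
            M[i, j] = k_ij * (M[i - 1, j] + M[i, j - 1] + M[i - 1, j - 1])
    return M[n, m]

# Example: two synthetic LAI-like profiles, one shorter and shifted in time.
t = np.linspace(0, 1, 40)
lai_a = np.exp(-((t - 0.5) / 0.15) ** 2)
lai_b = np.exp(-((t[:30] - 0.6) / 0.15) ** 2)
print(global_alignment_kernel(lai_a, lai_b, sigma=0.3))
```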
83

Stochastic longshore current dynamics

Restrepo, Juan M., Venkataramani, Shankar 12 1900 (has links)
We develop a stochastic parametrization, based on a 'simple' deterministic model for the dynamics of steady longshore currents, that produces ensembles that are statistically consistent with field observations of these currents. Unlike deterministic models, a stochastic parametrization incorporates randomness and hence can only match the observations in a statistical sense. Unlike statistical emulators, in which the model is tuned to the statistical structure of the observations, stochastic parametrizations are not directly tuned to match the statistics of the observations. Rather, a stochastic parametrization combines deterministic, i.e. physics-based, models with stochastic models for the "missing physics" to create hybrid models that are stochastic yet can be used for making predictions, especially in the context of data assimilation. We introduce a novel measure of the utility of stochastic models of complex processes, which we call consistency of sensitivity. A model with poor consistency of sensitivity requires a great deal of parameter tuning and has only a very narrow range of realistic parameters leading to outcomes consistent with a reasonable spectrum of physical outcomes. We apply this metric to our stochastic parametrization and show that the loss of certainty inherent in the model due to its stochastic nature is offset by the model's resulting consistency of sensitivity. In particular, the stochastic model still retains the forward sensitivity of the deterministic model and hence respects important structural/physical constraints, yet has a broader range of parameters capable of producing outcomes consistent with the field data used in evaluating the model. This leads to an expanded range of model applicability. We show, in the context of data assimilation, that the stochastic parametrization of longshore currents achieves good results in capturing the statistics of observations that were not used in tuning the model.
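The hybrid deterministic-plus-stochastic idea can be illustrated with a toy model (a hedged sketch, not the authors' longshore current model): a deterministic forcing-drag balance for a current-like velocity is augmented with a noise term standing in for the missing physics, and an ensemble is generated with the Euler-Maruyama scheme; the forcing, drag and noise values are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def drift(v, forcing=1.0, drag=0.5):
    """Deterministic part: wave forcing balanced by quadratic bottom drag."""
    return forcing - drag * v * abs(v)

def simulate_ensemble(n_members=100, n_steps=2000, dt=0.01, noise=0.2):
    """Euler-Maruyama integration of the hybrid deterministic/stochastic model."""
    v = np.zeros(n_members)
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), n_members)
        v = v + drift(v) * dt + noise * dW   # deterministic drift + stochastic forcing
    return v

# The ensemble spread is what would be compared, statistically, with field data.
ensemble = simulate_ensemble()
print("ensemble mean %.2f, std %.2f" % (ensemble.mean(), ensemble.std()))
```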
84

Parameterschätzung und Modellevaluation für komplexe Systeme / Parameter estimation and model evaluation for complex systems

Schumann-Bischoff, Jan 06 April 2016 (has links)
No description available.
85

Analyse asymptotique en électrophysiologie cardiaque : applications à la modélisation et à l'assimilation de données / Asymptotic analysis in cardiac electrophysiology : applications in modeling and in data assimilation

Collin, Annabelle 06 October 2014 (has links)
This thesis aims at developing innovative mathematical tools to improve cardiac electrophysiological modeling. A detailed presentation of the bidomain model - a system of reaction-diffusion equations - on a fixed domain is given based on the literature, and the homogenization process is mathematically justified using 2-scale convergence. A study of the impact of mechanical deformations on the conservation laws is then performed using mixture theory. As the atrial walls are very thin and medical imaging generally provides only surfaces for them, a dimensional reduction of the bidomain model from a thin domain to a surface-based formulation is studied; the challenge is crucial in terms of computational efficiency. Following strategies similar to those used in shell mechanical modeling, an asymptotic analysis of the diffusion terms is carried out under assumptions of strong anisotropy through the thickness, as in the atria, and simulations in 2D and 3D illustrate these results. A complete model of the heart - with the asymptotic model for the atria and the volume model for the ventricles - then allows the simulation of full electrocardiogram cycles. Furthermore, the asymptotic methods are used to obtain strong convergence results for the 3D-shell models. Finally, a specific data assimilation method is proposed in order to "personalize" the electrophysiological models: the medical data assimilated into the model - using a specially designed Luenberger-like state filter - are maps of electrical activation. The proposed methods can be used in other application fields where the models (reaction-diffusion) and data (front position) are very similar, such as fire propagation or tumor growth.
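As a conceptual illustration of a Luenberger-type state filter acting on a reaction-diffusion model (a simplified 1-D sketch, not the thesis's bidomain or surface-based formulation), the snippet below nudges an observer simulation toward sparse point observations of a propagating front; the grid, the bistable reaction term and the feedback gain are illustrative assumptions.

```python
import numpy as np

nx, dx, dt, D = 200, 0.05, 1e-3, 1.0
x = np.arange(nx) * dx
react = lambda u: 10.0 * u * (1.0 - u) * (u - 0.1)   # bistable reaction term

def step(u, correction=0.0):
    """Explicit step of u_t = D u_xx + f(u) (+ observer correction)."""
    lap = (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dx**2   # periodic Laplacian
    return u + dt * (D * lap + react(u) + correction)

obs_idx = np.arange(0, nx, 20)        # sparse "activation" observation points
gain = 20.0                           # Luenberger feedback gain

u_true = np.where(x < 1.0, 1.0, 0.0)  # reference solution with an excited region
u_obs = np.zeros(nx)                  # observer starts from rest
for _ in range(2000):
    u_true = step(u_true)
    corr = np.zeros(nx)
    corr[obs_idx] = gain * (u_true[obs_idx] - u_obs[obs_idx])   # innovation feedback
    u_obs = step(u_obs, corr)

print("max state error after assimilation:", np.abs(u_true - u_obs).max())
```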
86

Multi-agent Traffic Simulation using Characteristic Behavior Model / 個別性のある行動モデルを用いたマルチエージェント交通シミュレーション

Kingetsu, Hiroaki 23 March 2021 (has links)
Kyoto University / New-system doctoral program / Doctor of Informatics / 甲第23320号 / 情博第756号 / 新制||情||129 (University Library) / Department of Social Informatics, Graduate School of Informatics, Kyoto University / (Chief examiner) Professor Masatoshi Yoshikawa, Professor Takayuki Ito, Professor Michinori Hatayama / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Informatics / Kyoto University / DFAM
87

History Matching of 4D Seismic Data Attributes using the Ensemble Kalman Filter

Ravanelli, Fabio M. 05 1900 (has links)
One of the most challenging tasks in the oil industry is the production of reliable reservoir forecast models. Because of different sources of uncertainty, the numerical models employed are often only crude approximations of reality. This problem is tackled by conditioning the model with production data through data assimilation, a process known in the oil industry as history matching. Several recent advances are being used to improve the reliability of history matching, notably the use of time-lapse seismic data and automated history matching software tools. One of the most promising data assimilation techniques employed in the oil industry is the ensemble Kalman filter (EnKF), because of its ability to deal with highly nonlinear models, its low computational cost, and its easy implementation compared with other methods. A synthetic reservoir model was used in a history matching study designed to predict the peak production, allowing decision makers to properly plan field development actions. If only production data are assimilated, a total of 12 years of historical data is required to properly characterize the production uncertainty and consequently the correct moment to take action and decommission the field. However, if time-lapse seismic data are available, this conclusion can be reached 4 years in advance thanks to the additional fluid displacement information obtained from the seismic data. Production data are geographically sparse, in contrast with seismic data, which are sparse in time. Several types of seismic attributes were tested in this study. Poisson's ratio proved to be the attribute most sensitive to fluid displacement; in practical applications, however, its use is usually avoided because of the poor quality of the data, and seismic impedance tends to be more reliable. Finally, a new conceptual idea was proposed for obtaining time-lapse information for a history matching study. The use of crosswell time-lapse seismic tomography to map velocities in the interwell region was demonstrated as a potential tool to ensure survey reproducibility and low acquisition cost compared with full-scale surface surveys. This approach relies on the higher sensitivity of velocity to fluid displacement at higher frequencies. The velocity effects were modeled using the Biot velocity model. The method provided promising results, leading to RRMS error reductions similar to those obtained with conventionally history-matched surface seismic data.
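A minimal sketch of the EnKF analysis step used in history matching is given below (illustrative only, not the thesis workflow): an ensemble of reservoir states or parameters is updated with perturbed observations of production or seismic attributes. The toy dimensions, the observation error covariance and the variable names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def enkf_update(X, Y, d_obs, R):
    """X: (n_state, n_ens) state ensemble; Y: (n_obs, n_ens) predicted data."""
    n_obs, n_ens = Y.shape
    Xp = X - X.mean(axis=1, keepdims=True)     # state anomalies
    Yp = Y - Y.mean(axis=1, keepdims=True)     # predicted-data anomalies
    Cxy = Xp @ Yp.T / (n_ens - 1)              # state-data covariance
    Cyy = Yp @ Yp.T / (n_ens - 1) + R          # data covariance + obs error
    K = Cxy @ np.linalg.inv(Cyy)               # Kalman gain
    # perturbed observations (stochastic EnKF)
    D = d_obs[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
    return X + K @ (D - Y)                     # history-matched ensemble

# Toy usage: 50-member ensemble of 10 permeability-like parameters and
# 3 synthetic "production data" values predicted by each member.
X = rng.normal(size=(10, 50))
Y = rng.normal(size=(3, 50))
X_a = enkf_update(X, Y, d_obs=np.array([0.1, -0.2, 0.3]), R=0.05 * np.eye(3))
```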
88

Diagnostika kovariancí chyb předběžného pole ve spojeném systému globální a regionální asimilace dat / Diagnostics of background error covariances in a connected global and regional data assimilation system

Bučánek, Antonín January 2018 (has links)
The thesis deals with the preparation of initial conditions for numerical weather prediction in high-resolution limited-area models. It focuses on the problem of preserving the large-scale part of the global driving model analysis, which cannot be determined with sufficient quality in limited-area models. For this purpose, the so-called BlendVar scheme is used. The scheme consists of the application of the Digital Filter (DF) Blending method, which ensures the transmission of the large-scale part of the driving model analysis to the limited-area model, and of the three-dimensional variational method (3D-Var) at high resolution. The thesis focuses on the appropriate specification of background errors, which is one of the key components of 3D-Var. Different approaches to the modeling of background errors are examined, including the possibility of taking into account the flow-dependent character of background errors. The approaches are also evaluated from the point of view of practical implementation. A study of the evolution of background errors during DF Blending and BlendVar assimilation cycles leads to a new proposal for the preparation of a background error covariance matrix suitable for the BlendVar assimilation scheme. The use of the new background error covariance matrix gives the required property...
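For orientation, the linear 3D-Var analysis below (a hedged sketch, not the thesis's implementation) shows where the background error covariance matrix B enters the analysis and why its specification matters; the toy correlation structure of B, the observation operator H, and the error variances are illustrative assumptions.

```python
import numpy as np

def var3d_analysis(x_b, B, H, y, R):
    """x_a minimizes J(x) = (x-x_b)' B^-1 (x-x_b)/2 + (y-Hx)' R^-1 (y-Hx)/2."""
    innovation = y - H @ x_b
    S = H @ B @ H.T + R
    return x_b + B @ H.T @ np.linalg.solve(S, innovation)

# Toy usage: 5-variable state, 2 observations. A B matrix with broader
# correlations spreads the analysis increment beyond the observed variables.
x_b = np.zeros(5)
B = 0.5 ** np.abs(np.subtract.outer(np.arange(5), np.arange(5)))  # correlated background errors
H = np.array([[1, 0, 0, 0, 0],
              [0, 0, 1, 0, 0]], dtype=float)                      # observe variables 0 and 2
x_a = var3d_analysis(x_b, B, H, y=np.array([1.0, -0.5]), R=0.1 * np.eye(2))
print(x_a)
```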
89

Assimilation of GNSS-R Delay-Doppler Maps into Weather Models

Feixiong Huang (9354989) 15 December 2020 (has links)
Global Navigation Satellite System Reflectometry (GNSS-R) is a remote sensing technique that uses reflected satellite navigation signals from the Earth surface in a bistatic radar configuration. GNSS-R observations have been collected using receivers on stationary, airborne and spaceborne platforms. The delay-Doppler map (DDM) is the fundamental GNSS-R measurement from which ocean surface wind speed can be retrieved. GNSS-R observations can be assimilated into numerical weather prediction models to improve weather analyses and forecasts. The direct assimilation of DDM observations shows potential superiority over the assimilation of wind retrievals.

This dissertation demonstrates the direct assimilation of GNSS-R DDMs using a two-dimensional variational analysis method (VAM). First, the observation forward model and its Jacobian are developed. Then, the observation's bias correction, quality control, and error characterization are presented. The DDM assimilation was applied to a global and a regional case.

In the global case, DDM observations from the NASA Cyclone Global Navigation Satellite System (CYGNSS) mission are assimilated into global ocean surface wind analyses using the European Centre for Medium-Range Weather Forecasts (ECMWF) 10-meter winds as the background. The wind analyses are improved as a result of the DDM assimilation. VAM can also be used to derive a new type of wind vector observation from DDMs (VAM-DDM).

In the regional case, an observing system experiment (OSE) is used to quantify the impact of VAM-DDM wind vectors from CYGNSS on hurricane forecasts, in the case of Hurricane Michael (2018). It is found that the assimilation of VAM-DDM wind vectors at the early stage of the hurricane improves the forecasted track and intensity.

The research of this dissertation implies potential benefits of DDM assimilation for future research and operational applications.
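As a simplified, hedged illustration of a variational analysis with a nonlinear forward operator (a 1-D scalar stand-in, assuming scipy is available; it is not the actual DDM forward model or the VAM code), the sketch below fits a wind field to a background and to a few nonlinear observations by minimizing the usual variational cost; the toy operator h, the grid size and the error variances are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

n = 20                                          # 1-D row of wind-speed cells
w_b = np.full(n, 8.0)                           # background 10-m wind (m/s)
obs_idx = np.array([3, 9, 15])                  # cells observed by the instrument
h = lambda w: np.sqrt(np.maximum(w[obs_idx], 0.0))   # toy nonlinear forward operator
y = h(np.full(n, 11.0)) + 0.05 * np.random.default_rng(2).normal(size=3)

B_inv = np.eye(n) / 2.0**2                      # background error std 2 m/s (uncorrelated here)
R_inv = np.eye(3) / 0.05**2                     # observation error std 0.05

def cost(w):
    """J(w) = background misfit + observation misfit through h(w)."""
    db = w - w_b
    dy = y - h(w)
    return 0.5 * db @ B_inv @ db + 0.5 * dy @ R_inv @ dy

w_a = minimize(cost, w_b, method="L-BFGS-B").x  # gradient estimated numerically
print(w_a[obs_idx])                             # analysis pulled toward ~11 m/s at observed cells
```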
90

Verification of simulated DSDs and sensitivity to CCN concentration in EnKF analysis and ensemble forecasts of the 30 April 2017 tornadic QLCS during VORTEX-SE

Connor Paul Belak (10285328) 16 March 2021 (has links)
Storms in the SE-US often evolve in different environments than those in the central Plains. Many poorly understood aspects of these differing environments may impact the tornadic potential of SE-US storms. Among these differences are potential variations in the CCN concentration owing to differences in land cover, combustion, industrial and urban activity, and proximity to maritime environments. The relative influence of warm and cold rain processes is sensitive to CCN concentration, with higher CCN concentrations producing smaller cloud droplets and more efficient cold rain processes. Cold rain processes result in DSDs with relatively larger drops from melting ice compared to warm rain processes. Differences in DSDs affect cold pool and downdraft size and strength, which in turn influence tornado potential. This study investigates the impact of CCN concentration on DSDs in the SE-US by comparing DSDs from ARPS-EnKF model analyses and forecasts with DSDs observed by portable disdrometer-equipped probes, collected through a collaboration between Purdue University, the University of Oklahoma (OU), the National Severe Storms Laboratory (NSSL), and the University of Massachusetts in a tornadic QLCS on 30 April 2017 during VORTEX-SE.

The ARPS-EnKF configuration, which consists of 40 ensemble members, is used with the NSSL triple-moment microphysics scheme. Surface and radar observations are both assimilated. Data assimilation experiments with CCN concentrations ranging from 100 cm⁻³ (maritime) to 2,000 cm⁻³ (continental) are conducted to characterize the variability of DSDs, and the model output DSDs are verified against the disdrometer observations. The sensitivity of the DSD variability to CCN concentration is evaluated. Results indicate that continental CCN concentrations (close to 1,000 cm⁻³) produce DSDs that align closest to the observed DSDs. Other thermodynamic variables also agree better with observations in intermediate CCN concentration environments.
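To make the model-versus-disdrometer DSD comparison concrete, the sketch below (an assumed illustration, not the study's verification code) computes two standard bulk descriptors of a binned drop size distribution, the total number concentration and the mass-weighted mean diameter D_m, which can be evaluated for both model output and disdrometer spectra; the bin layout and the exponential example DSD are arbitrary.

```python
import numpy as np

def dsd_bulk_params(D, N):
    """D: bin centers (mm); N: number concentration density (m^-3 mm^-1)."""
    dD = np.gradient(D)                        # bin widths
    Nt = np.sum(N * dD)                        # total number concentration (m^-3)
    M3 = np.sum(N * D**3 * dD)                 # 3rd moment (proportional to mass)
    M4 = np.sum(N * D**4 * dD)                 # 4th moment
    Dm = M4 / M3                               # mass-weighted mean diameter (mm)
    return Nt, Dm

# Toy usage: an exponential DSD; a larger D_m would indicate relatively larger
# drops, the kind of signature associated above with cold-rain processes.
D = np.linspace(0.1, 6.0, 60)
N = 8000.0 * np.exp(-2.0 * D)
print(dsd_bulk_params(D, N))
```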
