91

Reconstructing Biological Systems Incorporating Multi-Source Biological Data via Data Assimilation Techniques / データ同化手法を用いた多種生体内データの統合による生体内システム再構築の研究

Hasegawa, Takanori 23 January 2015 (has links)
Kyoto University / 0048 / New-system doctoral program / Doctor of Informatics / Degree No. Kō 18699 / Informatics Doctorate No. 549 / 新制||情||97 (University Library) / 31632 / Department of Intelligence Science and Technology, Graduate School of Informatics, Kyoto University / (Chief examiner) Professor Tatsuya Akutsu, Professor Hisashi Kashima, Professor Shin Ishii / Fulfills Article 4, Paragraph 1 of the Degree Regulations / Doctor of Informatics / Kyoto University / DFAM
92

REMOTE SENSING DATA ASSIMILATION IN WATER QUALITY NUMERICAL MODELS FOR SIMULATION OF WATER COLUMN TEMPERATURE

Xie, Shuangshuang 16 March 2012 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Numerical models are important tools for simulating processes within complex natural systems, such as hydrodynamics and water quality processes within a water body. From decision makers' perspectives, such models also serve as useful tools for predicting the impacts of water quality problems or developing early warning systems. However, the accuracy of a numerical model developed for a specific site depends on multiple model parameters and variables whose values are obtained through calibration and/or expert knowledge. Real-time variations in the actual aquatic system at a site necessitate continuous monitoring so that model parameters and variables are regularly updated to reflect current conditions. Multiple sources of observations can improve the model update by bringing the strengths of each individual monitoring technology into the updating process. For example, remote sensing provides a spatially dense dataset of model variables at the surface of a water body, while in-situ monitoring technologies provide data at multiple depths and at more frequent time intervals than remote sensing. This research presents an overview of an integrated modeling and data assimilation framework that combines a three-dimensional numerical model with multiple sources of observations to simulate water column temperature in a eutrophic reservoir in central Indiana. A variational data assimilation approach is investigated for incorporating spatially continuous remote sensing observations and spatially discrete in-situ observations to update the initial conditions of the numerical model. This research addresses the challenge of improving model performance by combining water temperature from multi-spectral remote sensing analysis and in-situ measurements. Results for the eutrophic reservoir in central Indiana show that with four multi-spectral remote sensing images assimilated, the model results oscillate more around the in-situ measurements during the data assimilation period, and during validation the assimilation degrades the root mean square error. Quantitative analysis indicates that stronger water temperature stratification leads to larger deviations. Differences in sampling depth among the remote sensing retrievals, the in-situ measurements, and the model output are considered a possible error source.
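A minimal illustrative sketch of the kind of surface-observation update such a framework performs, assuming a simpler optimal-interpolation-style blend rather than the variational scheme or the specific reservoir model used in this thesis; the array shapes, error variances, and depth-decay treatment are assumptions.

    import numpy as np

    def blend_surface_temperature(model_temp, rs_surface_temp,
                                  sigma_model=1.0, sigma_obs=0.5, decay_layers=3):
        # model_temp: (nz, ny, nx) initial temperature field, layer k=0 at the surface.
        # rs_surface_temp: (ny, nx) remotely sensed surface temperature; NaN where
        # the retrieval is masked (e.g. by clouds).
        updated = model_temp.copy()
        # Scalar Kalman-style weight from assumed model and observation error variances.
        gain = sigma_model**2 / (sigma_model**2 + sigma_obs**2)
        innovation = rs_surface_temp - model_temp[0]
        innovation = np.where(np.isnan(innovation), 0.0, innovation)
        # The satellite senses only the skin layer, so the increment is tapered with depth.
        for k in range(min(decay_layers, model_temp.shape[0])):
            updated[k] += gain * np.exp(-k / decay_layers) * innovation
        return updated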
93

Data Assimilation and Parameter Recovery for Rayleigh-Bénard Convection

Murri, Jacob William 03 August 2022 (has links)
Many problems in applied mathematics involve simulating the evolution of a system using differential equations with known initial conditions. But what if one records observations and seeks to determine the causal factors which produced them? This is known as an inverse problem. Some prominent inverse problems include data assimilation and parameter recovery, which use partial observations of a system of evolutionary, dissipative partial differential equations to estimate the state of the system and relevant physical parameters, respectively. Recently, a set of procedures called nudging algorithms has shown promise in performing simultaneous data assimilation and parameter recovery for the Lorenz equations and the Kuramoto-Sivashinsky equation. This work applies these algorithms and extensions of them to the case of Rayleigh-Bénard convection, one of the most ubiquitous and commonly studied examples of turbulent flow. The performance of various parameter update formulas is analyzed through direct numerical simulation. Under appropriate conditions and given the correct parameter update formulas, convergence is also established, and in one case, an analytical proof is obtained.
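A minimal sketch of the nudging idea on the Lorenz-63 system, with a gradient-style relaxation update of the unknown parameter rho; the nudging gain, step sizes, and the particular parameter update rule below are illustrative assumptions, not the update formulas analyzed in the thesis.

    import numpy as np

    def lorenz(s, sigma, rho, beta):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

    def nudged_parameter_recovery(rho_true=28.0, rho_guess=18.0, sigma=10.0,
                                  beta=8.0 / 3.0, mu=50.0, eta=2e-4,
                                  dt=1e-3, steps=200000):
        truth = np.array([1.0, 1.0, 1.0])      # reference ("true") trajectory
        assim = np.array([5.0, -5.0, 20.0])    # nudged solution from a wrong start
        rho = rho_guess
        for _ in range(steps):
            truth += dt * lorenz(truth, sigma, rho_true, beta)
            obs = truth[:2]                    # only x and y are observed
            rhs = lorenz(assim, sigma, rho, beta)
            rhs[:2] += mu * (obs - assim[:2])  # nudging (relaxation) term
            assim += dt * rhs
            # rho multiplies x in the y-equation, so the y-innovation scaled by x
            # acts as a gradient-like correction to the parameter estimate.
            rho += eta * (obs[1] - assim[1]) * assim[0]
        return assim, rho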
94

Data Assimilation for Systems with Multiple Timescales

Vicente Ihanus, Dan January 2023 (has links)
This text provides an overview of problems in the field of data assimilation. We explore the possibility of recreating unknown data by continuously inserting known data into certain dynamical systems, under certain regularity assumptions. Additionally, we discuss an alternative statistical approach to data assimilation and investigate the use of the Ensemble Kalman Filter for assimilating data into dynamical models. A key challenge in numerical weather prediction is incorporating convective precipitation into an idealized setting for numerical computations. To address this challenge, we examine the modified rotating shallow water equations, a nonlinear coupled system of partial differential equations, and assess whether this primitive model accurately mimics phenomena observed in operational numerical weather prediction models. Numerical experiments conducted using a Deterministic Ensemble Kalman Filter algorithm support its applicability for convective-scale data assimilation. Furthermore, we analyze the frequency spectrum of numerical forecasts using the wavelet transform. Our frequency analysis suggests that, under certain experimental settings, there are similarities in the initialization of operational models, which can aid in understanding the problem of initializing numerical weather prediction models.
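For reference, a compact sketch of one standard formulation of the Deterministic Ensemble Kalman Filter (DEnKF) analysis step (Sakov and Oke, 2008), the class of filter named in the abstract: the ensemble mean is updated with the full Kalman gain, while the anomalies are updated with half the gain instead of using perturbed observations. The dimensions and the linear observation operator H are assumed for illustration.

    import numpy as np

    def denkf_analysis(ens, H, y, R):
        # ens: (n_state, n_ens) forecast ensemble; H: (n_obs, n_state) observation
        # operator; y: (n_obs,) observations; R: (n_obs, n_obs) observation-error covariance.
        n_ens = ens.shape[1]
        mean = ens.mean(axis=1, keepdims=True)
        A = ens - mean                              # forecast anomalies
        HA = H @ A
        Pf_Ht = A @ HA.T / (n_ens - 1)              # P^f H^T estimated from the ensemble
        S = HA @ HA.T / (n_ens - 1) + R             # innovation covariance
        K = Pf_Ht @ np.linalg.inv(S)                # Kalman gain
        mean_a = mean + K @ (y.reshape(-1, 1) - H @ mean)
        A_a = A - 0.5 * K @ HA                      # half-gain update of the anomalies
        return mean_a + A_a                         # analysis ensemble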
95

Ensemble Kalman Filtering (EnKF) with One-Step-Ahead Smoothing: Application to Challenging Ocean Data Assimilation Problems

Raboudi, Naila Mohammed Fathi 20 September 2022 (has links)
Predicting and characterizing the state of the ocean is needed for various scientific, industrial, social, management, and recreational activities. Despite the tremendous progress in ocean modeling and simulation capabilities, ocean models still suffer from different sources of uncertainties. To obtain accurate ocean state predictions, data assimilation (DA) is widely used to constrain the ocean model outputs with available observations. Ensemble Kalman filtering (EnKF) is a sequential DA approach that represents the distribution of the system state through an ensemble of ocean state samples. Different factors may limit the performance of an EnKF in realistic ocean applications, particularly the use of small ensembles and poorly known model error statistics, and, to a lesser extent, strongly nonlinear variations and abrupt regime changes, and violations of underlying assumptions such as the commonly used white observation noise assumption. The objective of this PhD thesis is to develop, implement and test efficient ensemble filtering schemes to enhance the performance of EnKFs in such challenging settings. We resort to the one-step-ahead (OSA) smoothing formulation of the Bayesian filtering problem to introduce EnKFs involving a new update step with future observations (smoothing) between two successive analyses, thereby conditioning the ensemble sampling on more information. We show that this approach enhances the EnKF's performance by providing improved ensemble background statistics, and showcase it with realistic ocean DA and forecasting applications, namely a storm surge EnKF forecasting system and the Red Sea ensemble DA and forecasting system. We then derive new EnKF-based schemes that account for time-correlated observation errors in large-dimensional DA problems, and further propose a new approach for online estimation of the parameters of the observation-error time-correlation model concurrently with the state. We also exploit the OSA-smoothing formulation to propose a new joint EnKF with OSA smoothing that mitigates the reported inconsistencies in the joint EnKF update, for efficient DA in one-way-coupled systems.
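A structural sketch of one EnKF cycle with OSA smoothing as described above: the observation valid at time n is used twice, first to smooth the ensemble at time n-1 and then, after re-forecasting the smoothed ensemble, in the usual analysis. A basic stochastic (perturbed-observation) EnKF is assumed for both stages; the exact gains, model-noise handling, and the variants developed in the thesis are not reproduced, and forecast, H, and R are assumed inputs.

    import numpy as np

    def stochastic_update(ens, H, y, R, rng):
        # Standard stochastic-EnKF analysis with perturbed observations.
        n = ens.shape[1]
        A = ens - ens.mean(axis=1, keepdims=True)
        HA = H @ A
        K = (A @ HA.T) @ np.linalg.inv(HA @ HA.T + (n - 1) * R)
        y_pert = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n).T
        return ens + K @ (y_pert - H @ ens)

    def enkf_osa_cycle(ens_prev, forecast, H, y, R, rng):
        # One cycle in which the observation y (valid at time n) is used twice.
        n = ens_prev.shape[1]
        ens_pred = forecast(ens_prev)                       # one-step-ahead forecast
        A_prev = ens_prev - ens_prev.mean(axis=1, keepdims=True)
        HA = H @ (ens_pred - ens_pred.mean(axis=1, keepdims=True))
        # Smoothing gain from the cross-covariance between the previous state
        # and the predicted observation.
        Ks = (A_prev @ HA.T) @ np.linalg.inv(HA @ HA.T + (n - 1) * R)
        y_pert = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n).T
        ens_smoothed = ens_prev + Ks @ (y_pert - H @ ens_pred)
        ens_fore = forecast(ens_smoothed)                   # re-forecast the smoothed ensemble
        return stochastic_update(ens_fore, H, y, R, rng)    # analysis with the same observation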
96

A Data Assimilation Scheme for the One-dimensional Shallow Water Equations

Khan, Ramsha January 2017 (has links)
For accurate prediction of tsunami wave propagation, information on the system of PDEs modelling its evolution and full initial and/or boundary data is required. However, the latter is generally not fully available, so the primary objective becomes finding an optimal estimate of these conditions using the available information. Data assimilation is a methodology for optimally integrating observed measurements into a mathematical model to generate a better estimate of some control parameter, such as the initial condition of the wave or the sea floor bathymetry. In this study, we considered the shallow water equations in both linear and non-linear form as an approximation for ocean wave propagation, and derived a data assimilation scheme based on the calculus of variations, the purpose of which is to optimise some distorted form of the initial condition to give a prediction closer to the exact initial data. We considered two forms of distortion: adding noise to the initial wave and rescaling the wave amplitude. Multiple cases were analysed, with observations measured at different points in the spatial domain and with varying numbers of observation points. We found that the misfit between the model predictions and the observation data was sufficiently minimised across all cases. A relationship was found between the number of measurement points and the error, dependent on the choice of where measurements were taken. In the linear case, since the wave simply translates with a fixed form, multiple measurement points did not necessarily provide more information. In the nonlinear case, because the waveform changes shape as it translates, adding more measurement points provides more information about the dynamics and the wave shape. This is reflected in the fact that in the nonlinear case adding more points gave a larger decrease in error and much closer convergence of the optimised guess for the initial condition to the exact initial wave profile. / Thesis / Master of Science (MSc) / In ocean wave modelling, information on the system dynamics and full initial and/or boundary data is required. When the latter is not fully available, the primary objective is to find an optimal estimate of these conditions using the available information. Data assimilation is a methodology for optimally integrating observed measurements into a mathematical model to generate a better estimate of some control parameter, such as the initial condition of the wave or the sea floor bathymetry. In this study, we considered the shallow water equations in both linear and non-linear form as an approximation for ocean wave propagation, and derived a data assimilation scheme to optimise some distorted form of the initial condition so that the resulting predictions converge to the exact initial data. The misfit between the model predictions and the observation data was sufficiently minimised across all cases, and a relationship was found between the number of measurement points and the error, dependent on the choice of where measurements were taken.
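A small sketch of the variational idea: treat the initial wave profile as the control variable, define a cost measuring the misfit between model predictions and observations at a few gauge locations, and minimize it. The one-way wave (advection) surrogate below and the use of L-BFGS-B with numerical gradients are assumptions standing in for the shallow water equations and the calculus-of-variations scheme derived in the thesis; the gauge locations and wave parameters are likewise assumed.

    import numpy as np
    from scipy.optimize import minimize

    nx, dx, c, dt, nt = 100, 1.0, 1.0, 0.4, 120   # grid, wave speed, time stepping
    obs_points = [30, 60, 90]                     # assumed gauge locations

    def advect(eta0):
        # Upwind advection of the surface elevation; returns gauge time series.
        eta = eta0.copy()
        series = []
        for _ in range(nt):
            eta = eta - c * dt / dx * (eta - np.roll(eta, 1))
            series.append(eta[obs_points])
        return np.array(series)

    x = np.arange(nx) * dx
    eta_true = np.exp(-0.01 * (x - 20.0) ** 2)    # "exact" initial wave profile
    observations = advect(eta_true)               # synthetic gauge observations

    def cost(eta0):
        # Least-squares misfit between predicted and observed gauge records.
        return 0.5 * np.sum((advect(eta0) - observations) ** 2)

    rng = np.random.default_rng(0)
    eta_distorted = eta_true + 0.1 * rng.standard_normal(nx)   # noisy first guess
    result = minimize(cost, eta_distorted, method="L-BFGS-B")
    eta_recovered = result.x                      # optimised estimate of the initial condition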
97

Development of Data Assimilation System for Toroidal Plasmas / トロイダルプラズマに対するデータ同化システムの開発

Morishita, Yuya 23 March 2023 (has links)
Kyoto University / New-system doctoral program / Doctor of Engineering / Degree No. Kō 24613 / Engineering Doctorate No. 5119 / 新制||工||1979 (University Library) / Department of Nuclear Engineering, Graduate School of Engineering, Kyoto University / (Chief examiner) Professor Sadayoshi Murakami, Professor Takehiko Yokomine, Professor Takayuki Miyadera / Fulfills Article 4, Paragraph 1 of the Degree Regulations / Doctor of Philosophy (Engineering) / Kyoto University / DFAM
98

Estimating snow water resources from space: a passive microwave remote sensing data assimilation study in the Sierra Nevada, USA

Li, Dongyue 15 December 2016 (has links)
No description available.
99

Numerical Simulation of A Prognostic Meteorological Model Using Four-Dimensional Observational Data Assimilation in Ohio

Lin, Peng January 2007 (has links)
No description available.
100

Uncertainty Quantification and Uncertainty Reduction Techniques for Large-scale Simulations

Cheng, Haiyan 03 August 2009 (has links)
Modeling and simulations of large-scale systems are used extensively not only to better understand natural phenomena, but also to predict future events. Accurate model results are critical for design optimization and policy making. They can be used effectively to reduce the impact of a natural disaster or even prevent it from happening. In reality, model predictions are often affected by uncertainties in input data and model parameters, and by incomplete knowledge of the underlying physics. A deterministic simulation assumes one set of input conditions and generates one result without considering uncertainties. It is of great interest to include uncertainty information in the simulation. By "Uncertainty Quantification" we denote the ensemble of techniques used to model probabilistically the uncertainty in model inputs, to propagate it through the system, and to represent the resulting uncertainty in the model result. This added information provides a confidence level for the model forecast. For example, in environmental modeling, the model forecast, together with the quantified uncertainty information, can assist policy makers in interpreting the simulation results and in making decisions accordingly. Another important goal in modeling and simulation is to improve the model accuracy and to increase the model prediction power. By merging real observation data into the dynamic system through the data assimilation (DA) technique, the overall uncertainty in the model is reduced. With the expansion of human knowledge and the development of modeling tools, simulation size and complexity are growing rapidly. This poses great challenges to uncertainty analysis techniques. Many conventional uncertainty quantification algorithms, such as the straightforward Monte Carlo method, become impractical for large-scale simulations. New algorithms need to be developed in order to quantify and reduce uncertainties in large-scale simulations. This research explores novel uncertainty quantification and reduction techniques that are suitable for large-scale simulations. In the uncertainty quantification part, the non-sampling polynomial chaos (PC) method is investigated. An efficient implementation is proposed to reduce the high computational cost of the linear algebra involved in the PC Galerkin approach applied to stiff systems. A collocation least-squares method is proposed to compute the PC coefficients more efficiently. A novel uncertainty apportionment strategy is proposed to attribute the uncertainty in model results to different uncertainty sources. The apportionment results provide guidance for uncertainty reduction efforts. The uncertainty quantification and source apportionment techniques are implemented in the 3-D Sulfur Transport Eulerian Model (STEM-III), predicting pollutant concentrations in the northeast region of the United States. Numerical results confirm the efficacy of the proposed techniques for large-scale systems and their potential impact for environmental protection policy making. "Uncertainty Reduction" describes the range of systematic techniques used to fuse information from multiple sources in order to increase the confidence one has in model results. Two DA techniques are widely used in current practice: the ensemble Kalman filter (EnKF) and the four-dimensional variational (4D-Var) approach. Each method has its advantages and disadvantages.
By exploring the error reduction directions generated in the 4D-Var optimization process, we propose a hybrid approach to construct the error covariance matrix and to improve the static background error covariance matrix used in current 4D-Var practice. The updated covariance matrix between assimilation windows effectively reduces the root mean square error (RMSE) in the solution. The success of the hybrid covariance updates motivates the hybridization of EnKF and 4D-Var to further reduce uncertainties in the simulation results. Numerical tests show that the hybrid method improves the model accuracy and increases the model prediction quality. / Ph. D.
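A brief sketch of the collocation least-squares idea for the polynomial chaos coefficients: evaluate the model at more collocation points than unknown coefficients and fit the expansion by least squares rather than by Galerkin projection. The toy model, the single Gaussian uncertain input, and the sample counts below are assumptions for illustration, not the STEM-III setting.

    import numpy as np
    from math import factorial
    from numpy.polynomial.hermite_e import hermevander

    def model(xi):
        # Toy response to the uncertain input (stand-in for a large-scale simulation).
        return np.exp(0.3 * xi) + 0.1 * xi**2

    order = 4                                          # PC truncation order
    rng = np.random.default_rng(1)
    xi = rng.standard_normal(3 * (order + 1))          # oversampled collocation points
    V = hermevander(xi, order)                         # probabilists' Hermite basis values
    coeffs, *_ = np.linalg.lstsq(V, model(xi), rcond=None)

    # Orthogonality of the Hermite basis under the Gaussian measure (E[He_k^2] = k!)
    # gives the output mean and variance directly from the PC coefficients.
    mean = coeffs[0]
    variance = sum(coeffs[k] ** 2 * factorial(k) for k in range(1, order + 1))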
