Intelligent Planning and Assimilation of AUV-Obtained Measurements Within a ROMS-Based Ocean Modeling System
Davini, Benjamin J
01 December 2010
Efforts to learn more about the oceans that surround us have increased dramatically as the technological ability to do so grows. Autonomous Underwater Vehicles (AUVs) are one such technological advance. They allow for rapid deployment and can gather data quickly in places and ways that traditional measurement systems (buoys, profilers, etc.) cannot. A ROMS-based data assimilation method was developed that intelligently plans for and integrates AUV measurements with the goal of minimizing model standard deviation. An algorithm developed for this system is first described that optimizes AUV paths, seeking to improve the model by gathering data in high-interest locations. This algorithm and its effect on the ocean model are tested by comparing the results of missions planned with the algorithm against missions created by hand. The results of the experiments demonstrate that the system is successful in improving the ROMS ocean model. Results comparing optimized and unoptimized missions are also shown.
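The idea of routing a vehicle toward high-interest (high-uncertainty) locations can be illustrated with a minimal greedy sketch. This is a hypothetical illustration, not the thesis's planning algorithm: the function name, the 4-neighbour move set, and the factor by which a measurement reduces local standard deviation are all assumptions.

```python
import numpy as np

def greedy_auv_path(stddev, start, n_steps):
    """Greedily route an AUV toward high-uncertainty cells.

    At each step the vehicle moves to the 4-neighbour cell with the
    largest model standard deviation; that cell's value is then reduced
    to mimic the effect of assimilating a measurement there.
    """
    field = stddev.astype(float).copy()
    rows, cols = field.shape
    r, c = start
    path = [start]
    for _ in range(n_steps):
        neighbours = [(r + dr, c + dc)
                      for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if 0 <= r + dr < rows and 0 <= c + dc < cols]
        r, c = max(neighbours, key=lambda rc: field[rc])
        field[r, c] *= 0.1  # assumed uncertainty reduction per visit
        path.append((r, c))
    return path, field
```

A real planner would optimize over whole candidate paths rather than single steps, but the greedy version captures the objective: steer measurements to where they reduce model uncertainty most.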
Improving numerical simulation methods for the assessment of wind source availability and related power production for wind farms over complex terrain
Ive, Federica
26 July 2022
One of the Sustainable Development Goals set in 2015 by the United Nations aims to ensure access to affordable, reliable, sustainable, and modern energy for all, increasing the global share of renewable energy to 32-35% by 2030. Moving towards this goal, the University of Trento funded the interdepartmental strategic project ERiCSol (Energie Rinnovabili e Combustibili Solari) to promote research on renewable-energy storage and solar fuels. The research activity presented in this thesis lies in the framework of this project, focusing on the development of new advanced simulation approaches to improve the estimation of wind resource availability and the related power production for Italian wind farms in complex terrain. The wind farms, operated by the company AGSM S.p.A., are located in two different geographical contexts: Rivoli Veronese and Affi are at the inlet of the Adige Valley, while Casoni di Romagna and Carpinaccio Firenzuola are on the crest of the Apennines close to the border between the provinces of Bologna and Firenze. The analysis of data from year-long field measurements highlighted the different peculiarities of these areas. The wind farms at the mouth of the Adige Valley are influenced by a daily periodic thermally-driven circulation, characterised by an intense nocturnal down-valley wind alternating with a weaker diurnal up-valley wind, while the Apennines wind farms are primarily affected by synoptic-scale winds. Simulations with the mesoscale Weather Research and Forecasting (WRF) model are performed and compared with field measurements in both cases, to highlight strengths and weaknesses. The results show that the model is able to capture wind speed and direction with good accuracy at the Apennines wind farms, while larger errors arise for the Rivoli Veronese and Affi wind farms, where the intensity of the nocturnal down-valley wind is generally underestimated.
Considering the former case, modelled and observed yearly wind speed density distributions are compared, in order to evaluate the impact of model errors on the estimation of the wind resource at these sites. Since reliable simulations of the wind resource are also essential to ensure security of power transmission and to prevent penalties to energy operators, an analysis of the power production is also performed, to evaluate how errors in the estimate of the resource translate into errors in the estimate of the production. Considering the wind farms at the mouth of the Adige Valley, the research work mainly focuses on evaluating the impact of data assimilation by means of observational nudging on model results, in order to optimize the setup for operational forecasts. Different configurations are tested and compared, varying the temporal window for the assimilation of local data.
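Observational nudging relaxes the model state toward observations with a term proportional to the model-observation mismatch. A minimal sketch of the idea follows; the relaxation time scale `tau`, the step size, and the wind-speed numbers are assumptions for illustration, not the WRF obs-nudging implementation or the thesis configuration.

```python
def nudging_step(state, obs, dt, tau):
    """One explicit-Euler step of Newtonian relaxation (nudging):
    the state is pulled toward the observation at a rate set by the
    assumed relaxation time scale tau (seconds)."""
    return state + dt * (obs - state) / tau

# relax a biased wind-speed estimate (m/s) toward an observed value
state, obs = 4.0, 7.0
for _ in range(600):              # 600 steps of 10 s, a 100-minute window
    state = nudging_step(state, obs, dt=10.0, tau=3600.0)
```

The mismatch decays by a factor (1 - dt/tau) per step, so the length of the assimilation window directly controls how closely the model is drawn to the data, which is why the thesis varies that window.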
Aneesh, C S
Today, ocean modeling is fast developing as a versatile tool for the study of earth’s climate, local marine ecosystems and coastal engineering applications. Though the field of ocean modeling began in the early 1950s along with the development of climate models and primitive computers, even today the state-of-the-art ocean models have their own limitations. Many issues still remain, such as the uncertainty in the parameterisation of essential processes that occur on spatial and temporal scales smaller than those that can be resolved in model calculations, atmospheric forcing of the ocean, and the boundary and initial conditions. The advent of data assimilation into ocean modeling has heralded a new era in the field of ocean modeling and oceanic sciences. “Data assimilation” is a methodology in which observations are used to improve the forecasting skill of operational meteorological models. The study in the present thesis mainly focuses on obtaining a four-dimensional realization (the spatial description coupled with the time evolution) of the oceanic flow that is simultaneously consistent with the observational evidence and with the dynamical equations of motion, and on providing initial conditions for predictions of oceanic circulation and tracer distribution. A good implementation of data assimilation can be achieved with the availability of a large number of good-quality observations of the oceanic fields as both synoptic and in-situ data. With the technology in satellite oceanography and in-situ measurements advancing by leaps and bounds over the past two decades, good synoptic and in-situ observations of oceanic fields have been achieved. The current and expected explosion in remotely sensed and in-situ measured oceanographic data is ushering in a new age of ocean modeling and data assimilation. The thesis presents results of analysis of the impact of data assimilation in an ocean general circulation model of the North Indian Ocean.
In this thesis we have studied the impact of assimilation of temperature and salinity profiles from Argo floats and sea surface height anomalies from satellite altimeters in a sigma-coordinate Indian Ocean model. An ocean data assimilation system based on the Regional Ocean Modeling System (ROMS) for the Indian Ocean is used. This model is implemented, validated and applied in a climatological simulation experiment to study the circulation in the Indian Ocean. The validated model is then used for the implementation of the data assimilation system for the Indian Ocean region. This dissertation presents qualitative and quantitative comparisons of the model simulations with and without assimilation of subsurface temperature and salinity profiles and sea surface height anomaly data for the Indian Ocean region. This is the first reported data assimilation study of Argo subsurface temperature and salinity profile data with ROMS in the Indian Ocean region.
Nino Ruiz, Elias David
11 January 2016
Ensemble-based methods have gained widespread popularity in the field of data assimilation. An ensemble of model realizations encapsulates information about the error correlations driven by the physics and the dynamics of the numerical model. This information can be used to obtain improved estimates of the state of non-linear dynamical systems such as the atmosphere and/or the ocean. This work develops efficient ensemble-based methods for data assimilation. A major bottleneck in ensemble Kalman filter (EnKF) implementations is the solution of a linear system at each analysis step. To alleviate it, an EnKF implementation based on an iterative Sherman-Morrison formula is proposed. The rank deficiency of the ensemble covariance matrix is exploited in order to efficiently compute the analysis increments during the assimilation process. The computational effort of the proposed method is comparable to those of the best EnKF implementations found in the current literature. The stability of the new algorithm is theoretically proven based on the positive definiteness of the data error covariance matrix. In order to improve the background error covariance matrices in ensemble-based data assimilation, we explore the use of shrinkage covariance matrix estimators from ensembles. The resulting filter has attractive features in terms of both memory usage and computational complexity. Numerical results show that it performs better than traditional EnKF formulations. In geophysical applications the correlations between errors corresponding to distant model components decrease rapidly with distance. We propose a new and efficient implementation of the EnKF based on a modified Cholesky decomposition for inverse covariance matrix estimation. This approach exploits the conditional independence of background errors between distant model components with regard to a predefined radius of influence.
Consequently, sparse estimators of the inverse background error covariance matrix can be obtained. This implies huge memory savings during the assimilation process under realistic weather forecast scenarios. Rigorous error bounds for the resulting estimator in the context of data assimilation are theoretically proven. The conclusion is that the resulting estimator converges to the true inverse background error covariance matrix when the ensemble size is of the order of the logarithm of the number of model components. We explore high-performance implementations of the proposed EnKF algorithms. When the observational operator can be locally approximated for different regions of the domain, efficient parallel implementations of the EnKF formulations presented in this dissertation can be obtained. The parallel computation of the analysis increments is performed making use of domain decomposition. Local analysis increments are computed on (possibly) different processors. Once all local analysis increments have been computed, they are mapped back onto the global domain to recover the global analysis. Tests performed with an atmospheric general circulation model at a T-63 resolution, varying the number of processors from 96 to 2,048, reveal that the assimilation time can be decreased multiple fold for all the proposed EnKF formulations.
Ensemble-based methods can also be used to reformulate strong-constraint four-dimensional variational data assimilation so as to avoid the construction of adjoint models, which can be complicated for operational models. We propose a trust region approach based on ensembles, in which the analysis increments are computed in the space spanned by an ensemble of snapshots. The quality of the resulting increments in the ensemble space is compared against the gains in the full space. Decisions on whether to accept or reject solutions rely on trust region updating formulas.
Results based on an atmospheric general circulation model with a T-42 resolution reveal that this methodology can improve the analysis accuracy. (Ph.D. dissertation)
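The iterative Sherman-Morrison idea described above can be sketched for a small stochastic EnKF analysis step: instead of factorizing (HPH^T + R) directly, its inverse is built by applying the rank-one Sherman-Morrison update once per ensemble member. This is a simplified illustration under assumptions (diagonal observation-error covariance, perturbed-observations EnKF), not the exact algorithm of the dissertation.

```python
import numpy as np

def enkf_sherman_morrison(Xf, Y, H, r_var, rng):
    """Stochastic EnKF analysis that inverts (H P H^T + R) by repeated
    rank-one Sherman-Morrison updates instead of a direct solve.

    Xf: n x N forecast ensemble; Y: m x 1 observation vector;
    H: m x n observation operator; r_var: assumed diagonal obs variance.
    """
    n, N = Xf.shape
    A = (Xf - Xf.mean(axis=1, keepdims=True)) / np.sqrt(N - 1)
    S = H @ A                                  # observed ensemble anomalies
    m = S.shape[0]
    Winv = np.diag(np.full(m, 1.0 / r_var))    # start from R^{-1}
    for i in range(N):                         # add one rank-one term per member
        u = S[:, i:i + 1]
        Wu = Winv @ u
        Winv = Winv - (Wu @ Wu.T) / (1.0 + float(u.T @ Wu))
    D = Y + rng.normal(0.0, np.sqrt(r_var), size=(m, N))  # perturbed obs
    K = A @ S.T @ Winv                         # Kalman gain
    return Xf + K @ (D - H @ Xf)
```

Since H P H^T = S S^T, after the loop `Winv` equals (H P H^T + R)^{-1}, so the analysis matches the textbook EnKF update while only ever touching rank-one quantities.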
Goncalves de Goncalves, Luis Gustavo
Land surface processes play an important role when modeling weather and climate, and understanding and representing such processes in South America is a particular challenge because of the large variations in regional climate and surface features such as vegetation and soil. Numerical models have been used to explore the climate and weather of continental South America, but without appropriate initialization of land surface conditions model simulations can rapidly diverge from reality. This initialization problem is exacerbated by the fact that conventional surface observations over South America are scarce and biased towards the urban centers and coastal areas. This dissertation explores issues related to the apt representation of land surface processes and their impacts in numerical simulations with a regional atmospheric model (specifically the Eta model) over South America. The impacts of vegetation heterogeneity on regional weather forecasts were first investigated. A South American Land Data Assimilation System (SALDAS) was then created, analogous to that currently used in North America, to estimate soil moisture fields for initializing regional atmospheric models. The land surface model (LSM) used in this SALDAS is the Simplified Simple Biosphere (SSiB). Precipitation fields are critical when calculating soil moisture and, because conventional surface observations are scarce in South America, some of the most important remotely sensed precipitation products were evaluated as potential precipitation forcing for the SALDAS. Spin-up states for SSiB were then compared with climatological estimates of land surface fields, and significant differences were found. Finally, an assessment was made of the value of SALDAS-derived soil moisture fields for Eta model forecasts. The primary result was that model performance is enhanced over the entire continent in forecasts of up to 72 h using SALDAS surface fields.
Simulating the carbon cycling of croplands : model development, diagnosis, and regional application through data assimilation
Sus, Oliver
January 2012
In the year 2000, croplands covered about 12% of the Earth’s ice-free land surface. Through cropland management, humankind currently appropriates about 25% of terrestrial ecosystem productivity. Not only are croplands a key element of human food supply; they also offer potential for increased carbon (C) uptake when best-practice land management approaches are adopted. A detailed assessment of the impact of land use on terrestrial ecosystems can be achieved by modelling, but the simulation of crop C cycling itself is a relatively new discipline. Observational data on crop net ecosystem exchange (NEE) have become available only recently, and constitute an important tool for model development, diagnosis, and validation. Before crop functional types (CFT) had been introduced, however, large-scale biogeochemical models (BGCM) lacked crop-specific patterns of phenology, C allocation, and land management. As a consequence, the influence of cropland C cycling on the seasonality and magnitude of biosphere-atmosphere C exchange is currently poorly known. To date, no regional assessment of crop C cycling and yield formation exists that specifically accounts for spatially and temporally varying patterns of sowing dates within models. In this thesis, I present such an assessment for the first time. In the first step (chapter 2), I built a crop C mass balance model (SPAc) that models crop development and C allocation as a response to ambient meteorological conditions. I compared model outputs against C flux and stock observations of six different sites in Europe, and found a high degree of agreement between simulated and measured fluxes (R2 = 0.83). However, the model tended to overestimate leaf area index (LAI), and underestimate final yield.
In a model comparison study (chapter 3), I found in cooperation with further researchers that SPAc best reproduces observed fluxes of C and water (owing to the model’s high temporal and process resolution), but is deficient in that it does not simulate full crop rotations. I then conducted a detailed diagnosis of SPAc through the assimilation of C fluxes and biometry with the Ensemble Kalman Filter (EnKF, chapter 4), and identified potential model weaknesses in C allocation fractions and plant hydraulics. Further, an overestimation of plant respiration and of seasonal leaf thickness variability was evident. Temporal parameter variability in response to C flux data assimilation (DA) is indicative of ecosystem processes that are resolved in NEE data but are not captured by a model’s structure. Through DA, I gained important insights into model shortcomings in a quantitative way, and highlighted further needs for model improvement and future field studies. Finally, I developed a framework allowing for spatio-temporally resolved simulation of cropland C fluxes under observational constraints on land management and canopy greenness (chapter 5). MODIS (Moderate Resolution Imaging Spectroradiometer) data were assimilated both variationally (for sowing date estimation) and sequentially (for improved model state estimation, using the EnKF) into SPAc. In doing so, I was able to accurately quantify the multiannual (2000-2006) regional C flux and biometry seasonality of maize-soybean crop rotations surrounding the Bondville Ameriflux eddy covariance (EC) site, averaged over 104 pixel locations within the wider area. Results show that MODIS-derived sowing dates and the assimilation of LAI data allow for highly accurate simulations of growing season C cycling at locations for which ground-truth sowing dates are not available.
Through quantification of the spatial variability in biometry, NEE, and net biome productivity (NBP), I found that regional patterns of land management are important drivers of agricultural C cycling and major sources of uncertainty if not appropriately accounted for. Observing C cycling at one single field with its individual sowing pattern is not sufficient to constrain large-scale agroecosystem behaviour. Here, I developed a framework that enables modellers to accurately simulate current (i.e. last 10 years) C cycling of major agricultural regions and their contribution to atmospheric CO2 variability. Follow-up studies can provide crucial insights into testing and validating large-scale applications of biogeochemical models.
15 December 2016
As sensor data become increasingly available, there is growing interest in assimilating real-time sensor data into spatial temporal simulations to achieve more accurate simulation or prediction results. Particle Filters (PFs), also known as Sequential Monte Carlo methods, hold great promise in this area as they use Bayesian inference and stochastic sampling techniques to recursively estimate the states of dynamic systems from given observations. However, PFs face major challenges in working effectively for complex spatial temporal simulations due to the high-dimensional state space of the simulation models, which typically cover large areas and have a large number of spatially dependent state variables. As the state space dimension increases, the number of particles must increase exponentially in order to converge to the true system state. The purpose of this dissertation work is to develop localized particle filtering to support PF-based data assimilation for large-scale spatial temporal simulations. We develop a spatially dependent particle-filtering framework that breaks the system state and observation data into sub-regions and then carries out localized particle filtering based on these spatial regions. The developed framework exploits the spatial locality property of system state and observation data, and employs the divide-and-conquer principle to reduce state dimension and data complexity. Within this framework, we propose a two-level automated spatial partitioning method to provide optimized and balanced spatial partitions with fewer boundary sensors. We also consider different types of data to effectively support data assimilation for spatial temporal simulations. These data include both hard data, which are measurements from physical devices, and soft data, which are information from messages, reports, and social networks. The developed framework and methods were applied to large-scale wildfire spread simulations and achieved improved results. Furthermore, we compare the proposed framework to existing particle-filtering-based data assimilation frameworks and evaluate the performance of each.
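The divide-and-conquer idea can be sketched in a few lines: partition the state vector into spatial regions, then weight and resample each region's block of every particle using only that region's observations. This is a hypothetical minimal sketch, assuming a Gaussian likelihood and an identity observation operator per region; it is not the dissertation's two-level partitioning framework.

```python
import numpy as np

def localized_pf_update(particles, obs, obs_var, regions, rng):
    """One localized particle-filter analysis step.

    particles: (n_particles, n_state) array; obs: length-n_state vector
    (each state component assumed directly observed); regions: list of
    index arrays partitioning the state into spatial sub-regions.
    Each region is weighted and resampled independently.
    """
    n_part = particles.shape[0]
    out = particles.copy()
    for idx in regions:
        # Gaussian log-likelihood of this region's observations per particle
        d = particles[:, idx] - obs[idx]
        logw = -0.5 * np.sum(d * d, axis=1) / obs_var
        w = np.exp(logw - logw.max())
        w /= w.sum()
        keep = rng.choice(n_part, size=n_part, p=w)  # local resampling
        out[:, idx] = particles[keep][:, idx]
    return out
```

Because weights are computed per region over a few state components rather than over the full high-dimensional state, weight degeneracy is far milder than in a global particle filter, which is the motivation for localization.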
Leveraging the information content of process-based models using Differential Evolution and the Extended Kalman Filter
Howard, Lucas
01 January 2016
Process-based models are used in a diverse array of fields, including environmental engineering, to provide supporting information to engineers, policymakers, and stakeholders. Recent advances in remote sensing and data storage technology have provided opportunities for improving the application of process-based models and visualizing data, but also present new challenges. The availability of larger quantities of data may allow models to be constructed and calibrated in a more thorough and precise manner, but depending on the type and volume of data, it is not always clear how to incorporate the information content of these data into a coherent modeling framework. In this context, using process-based models in new ways to provide decision support or to produce more complete and flexible predictive tools is a key task in the modern data-rich engineering world. In standard usage, models can be used for simulating specific scenarios; they can also be used as part of an automated design optimization algorithm to provide decision support, or in a data-assimilation framework to incorporate the information content of ongoing measurements. In that vein, this thesis presents and demonstrates extensions and refinements to leverage the best of what process-based models offer using Differential Evolution (DE) and the Extended Kalman Filter (EKF). Coupling multi-objective optimization to a process-based model may provide valuable information, provided an objective function is constructed appropriately to reflect the multi-objective problem and constraints. That, in turn, requires weighting two or more competing objectives in the early stages of an analysis. The methodology proposed here relaxes that requirement by framing the model optimization as a sensitivity analysis. For demonstration, this is implemented using a surface water model (HEC-RAS), and the impact of floodplain access up- and downstream of a fixed bridge on bridge scour is analyzed.
DE, an evolutionary global optimization algorithm, is wrapped around a calibrated HEC-RAS model. Multiple objective functions, representing different relative weightings of two objectives, are used; the resulting rank-orders of river reach locations by floodplain access sensitivity are consistent across these multiple functions. To extend the applicability of data assimilation methods, this thesis proposes relaxing the requirement that the model be calibrated before performing assimilation (provided the parameters are still within physically defensible ranges). The model is then dynamically calibrated to new state estimates, which depend on the behavior of the model. Feasibility is demonstrated using the EKF and a synthetic dataset of pendulum motion. The dynamic calibration method reduces the variance of prediction errors compared to measurement errors using an initially uncalibrated model, and produces estimates of calibration parameters that converge to the true values. The potential application of the dynamic calibration method to river sediment transport modeling is proposed in detail, including a method for automated calibration using sediment grain size distribution as a calibration parameter.
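The dynamic calibration idea on the pendulum example can be sketched by augmenting the EKF state vector with the unknown parameter and letting the filter update both together. This is a sketch under assumed choices (a semi-implicit Euler discretization, the parameter g/L as the single calibration parameter, and generic tuning constants), not the thesis's exact setup.

```python
import numpy as np

def ekf_pendulum_calibration(obs, dt, q_p=1e-4, r=1e-2):
    """Jointly estimate pendulum state and the unknown ratio g/L with an
    EKF, starting from an uncalibrated guess of the parameter.

    State x = [theta, omega, g/L]; only the angle theta is observed,
    with assumed measurement variance r.
    """
    x = np.array([obs[0], 0.0, 5.0])        # deliberately wrong g/L guess
    P = np.diag([r, 1.0, 25.0])             # large initial parameter variance
    Q = np.diag([0.0, 0.0, q_p])            # small random walk on the parameter
    H = np.array([[1.0, 0.0, 0.0]])
    for z in obs[1:]:
        th, om, p = x
        om_n = om - dt * p * np.sin(th)     # semi-implicit Euler prediction
        x = np.array([th + dt * om_n, om_n, p])
        F = np.array([[1.0 - dt**2 * p * np.cos(th), dt, -dt**2 * np.sin(th)],
                      [-dt * p * np.cos(th),         1.0, -dt * np.sin(th)],
                      [0.0,                          0.0, 1.0]])
        P = F @ P @ F.T + Q
        S = float(H @ P @ H.T) + r
        K = (P @ H.T).ravel() / S
        x = x + K * (z - x[0])
        P = P - np.outer(K, (H @ P).ravel())
    return x
```

The cross-covariances between the angle and the parameter, built up through the Jacobian F, are what let angle measurements correct the parameter estimate over time.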
Software environment for data assimilation in radiation protection (Programové prostředí pro asimilační metody v radiační ochraně)
Majer, Peter
January 2015
In this work we apply data assimilation to the meteorological model WRF for a local domain. We use Bayesian statistics, namely a Sequential Monte Carlo method combined with particle filtering. Only surface wind data are considered. An application written in the Python programming language is also part of this work. This application forms an interface with WRF, performs data assimilation, and provides a set of charts as output of the data assimilation. In the case of stable wind conditions, wind predictions of the assimilated WRF are significantly closer to measured data than predictions of the non-assimilated WRF. Under such conditions, the assimilated model can be used for more accurate short-term local weather predictions.
Statistical Analysis and Bayesian Methods for Fatigue Life Prediction and Inverse Problems in Linear Time Dependent PDEs with Uncertainties
Sawlan, Zaid A
10 November 2018
This work employs statistical and Bayesian techniques to analyze mathematical forward models with several sources of uncertainty. The forward models usually arise from phenomenological and physical phenomena and are expressed through regression-based models or partial differential equations (PDEs) associated with uncertain parameters and input data. One of the critical challenges in real-world applications is to quantify uncertainties of the unknown parameters using observations. To this end, methods based on the likelihood function and Bayesian techniques constitute the two main statistical inference approaches considered here. Two problems are studied in this thesis. The first is the prediction of fatigue life of metallic specimens. The second concerns inverse problems in linear PDEs. Both problems require the inference of unknown parameters given certain measurements. We first estimate the parameters by means of the maximum likelihood approach. Next, we seek a more comprehensive Bayesian inference using analytical asymptotic approximations or computational techniques. In the fatigue life prediction, there are several plausible probabilistic stress-lifetime (S-N) models. These models are calibrated given uniaxial fatigue experiments. To generate accurate fatigue life predictions, competing S-N models are ranked according to several classical information-based measures. A different set of predictive information criteria is then used to compare the candidate Bayesian models. Moreover, we propose a spatial stochastic model to generalize S-N models to fatigue crack initiation in general geometries. The model is based on a spatial Poisson process with an intensity function that combines the S-N curves with an averaged effective stress that is computed from the solution of the linear elasticity equations.
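The maximum likelihood calibration step can be illustrated on a generic Basquin-type S-N model, log N = a - b log S + eps with eps ~ Normal(0, sigma^2); under Gaussian errors the MLE of (a, b) is ordinary least squares and the MLE of sigma^2 is the mean squared residual. This is a generic sketch, not one of the specific probabilistic S-N models compared in the thesis.

```python
import numpy as np

def fit_basquin_mle(S, N):
    """Maximum-likelihood fit of log N = a - b * log S + eps,
    eps ~ Normal(0, sigma^2), for stress amplitudes S and lifetimes N.

    Returns the MLE estimates (a, b, sigma^2): the (a, b) MLE coincides
    with least squares, and the sigma^2 MLE is the mean squared residual.
    """
    x, y = np.log(S), np.log(N)
    X = np.column_stack([np.ones_like(x), -x])   # design matrix [1, -log S]
    (a, b), *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ np.array([a, b])
    sigma2 = float(resid @ resid) / len(y)
    return a, b, sigma2
```

Once several such candidate models are fitted, their maximized likelihoods feed directly into the information-based measures (e.g. AIC-type criteria) used to rank them.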