41

Computational Tools for Chemical Data Assimilation with CMAQ

Gou, Tianyi 15 February 2010 (has links)
The Community Multiscale Air Quality (CMAQ) system is the Environmental Protection Agency's main modeling tool for atmospheric pollution studies. CMAQ-ADJ, the adjoint model of CMAQ, offers new analysis capabilities such as receptor-oriented sensitivity analysis and chemical data assimilation. This thesis presents the construction, validation, and properties of new adjoint modules in CMAQ, and illustrates their use in sensitivity analyses and data assimilation experiments. The new discrete adjoint of advection module is implemented with the aid of the automatic differentiation tool TAMC and is fully validated by comparing the adjoint sensitivities with finite difference values. In addition, adjoint sensitivities with respect to boundary conditions and boundary condition scaling factors are developed and validated in CMAQ. To investigate numerically the impact of the continuous and discrete advection adjoints on data assimilation, various four-dimensional variational (4D-Var) data assimilation experiments are carried out with the 1D advection PDE, and with CMAQ advection using synthetic and real observation data. The results show that the optimization procedure gives better estimates of the reference initial condition and converges faster when using gradients computed by the continuous adjoint approach. This counter-intuitive result is explained using the nonlinearity properties of the piecewise parabolic method (the numerical discretization of advection in CMAQ). Data assimilation experiments with real observations are carried out over a simulation domain encompassing Texas for the period August 30 to September 1, 2006. Data assimilation is used to improve both initial and boundary conditions. These experiments further validate the tools developed in this thesis. / Master of Science
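A minimal sketch of the finite-difference check described in this abstract, applied to a toy 1D periodic advection scheme rather than CMAQ itself: the discrete adjoint of a first-order upwind update (an illustrative stand-in for the piecewise parabolic method) is compared against a central finite difference of the 4D-Var-style cost. All function names and parameter values are assumptions for illustration only.

```python
import numpy as np

def advect(c, u, dx, dt, nsteps):
    """Forward model: first-order upwind advection with periodic boundaries."""
    lam = u * dt / dx
    for _ in range(nsteps):
        c = c - lam * (c - np.roll(c, 1))
    return c

def cost(c0, c_obs, u, dx, dt, nsteps):
    """4D-Var-style cost: misfit between the forecast and observations at final time."""
    return 0.5 * np.sum((advect(c0, u, dx, dt, nsteps) - c_obs) ** 2)

def adjoint_gradient(c0, c_obs, u, dx, dt, nsteps):
    """Discrete adjoint of the upwind scheme, run backwards in time."""
    lam = u * dt / dx
    residual = advect(c0, u, dx, dt, nsteps) - c_obs  # linear model: no stored trajectory needed
    g = residual.copy()
    for _ in range(nsteps):
        # transpose of the linear update c_new = c - lam * (c - roll(c, 1))
        g = g - lam * (g - np.roll(g, -1))
    return g

# Finite-difference validation of one adjoint sensitivity component
nx, u, dx, dt, nsteps = 50, 1.0, 1.0, 0.5, 40
rng = np.random.default_rng(0)
c0 = np.exp(-0.05 * (np.arange(nx) - 25.0) ** 2)
c_obs = advect(c0 + 0.1 * rng.standard_normal(nx), u, dx, dt, nsteps)

g_adj = adjoint_gradient(c0, c_obs, u, dx, dt, nsteps)
eps, i = 1e-6, 10
e = np.zeros(nx); e[i] = eps
g_fd = (cost(c0 + e, c_obs, u, dx, dt, nsteps) - cost(c0 - e, c_obs, u, dx, dt, nsteps)) / (2 * eps)
print(f"adjoint dJ/dc0[{i}] = {g_adj[i]:.6e}, finite difference = {g_fd:.6e}")
```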
42

Improving numerical simulation methods for the assessment of wind resource availability and related power production for wind farms over complex terrain

Ive, Federica 26 July 2022 (has links)
One of the Sustainable Development Goals set in 2015 by the United Nations aims to ensure access to affordable, reliable, sustainable, and modern energy for all, increasing the global share of renewable energy to 32-35% by 2030. Moving towards this goal, the University of Trento funded the interdepartmental strategic project ERiCSol (Energie Rinnovabili e Combustibili Solari) to promote research on renewable energy storage and solar fuels. The research activity presented in this thesis lies in the framework of this project, focusing on the development of new advanced simulation approaches to improve the estimation of the wind resource availability and the related power production for Italian wind farms in complex terrain. The wind farms, operated by the company AGSM S.p.A., are located in two different geographical contexts: Rivoli Veronese and Affi are at the inlet of the Adige Valley, while Casoni di Romagna and Carpinaccio Firenzuola are on the crest of the Apennines, close to the border between the provinces of Bologna and Firenze. The analysis of data from year-long field measurements highlighted the different peculiarities of these areas. The wind farms at the mouth of the Adige Valley are influenced by a daily periodic thermally driven circulation, characterised by an intense nocturnal down-valley wind alternating with a weaker diurnal up-valley wind, while the Apennines wind farms are primarily affected by synoptic-scale winds. Simulations with the mesoscale Weather Research and Forecasting (WRF) model are performed and compared with field measurements in both cases, to highlight strengths and weaknesses. The results show that the model is able to capture wind speed and direction with good accuracy at the Apennines wind farms, while larger errors arise for the Rivoli Veronese and Affi wind farms, where the intensity of the nocturnal down-valley wind is generally underestimated. Considering the former case (the Apennines wind farms), modelled and observed yearly wind speed density distributions are compared, in order to evaluate the impact of model errors on the estimation of the wind resource at these sites. Since reliable simulations of the wind resource are also essential to ensure security in power transmission and to prevent penalties to energy operators, an analysis of the power production is also performed, to evaluate how errors in the estimate of the resource translate into errors in the estimate of the production (a toy sketch of this conversion follows below). Considering the wind farms at the mouth of the Adige Valley, the research work mainly focuses on the evaluation of the impact of data assimilation by means of observational nudging on model results, in order to optimize the setup for operational forecasts. Different configurations are tested and compared, varying the temporal window for the assimilation of local data.
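A rough sketch, not taken from the thesis, of how errors in a simulated wind speed distribution propagate into errors in expected power production by weighting a turbine power curve; the Weibull parameters and the idealised power curve are illustrative assumptions.

```python
import numpy as np

def weibull_pdf(v, k, c):
    """Weibull density, a common parametric model for wind speed distributions."""
    return (k / c) * (v / c) ** (k - 1) * np.exp(-(v / c) ** k)

def power_curve(v, rated_kw=2000.0, cut_in=3.0, rated_v=12.0, cut_out=25.0):
    """Idealised turbine power curve: cubic ramp between cut-in and rated speed."""
    ramp = rated_kw * ((v - cut_in) / (rated_v - cut_in)) ** 3
    p = np.where((v >= cut_in) & (v < rated_v), ramp, 0.0)
    return np.where((v >= rated_v) & (v <= cut_out), rated_kw, p)

v = np.linspace(0.0, 30.0, 601)
dv = v[1] - v[0]
pdf_obs = weibull_pdf(v, k=1.9, c=7.5)   # distribution fitted to observations (illustrative)
pdf_mod = weibull_pdf(v, k=2.1, c=6.8)   # distribution from the model, biased low (illustrative)

mean_obs, mean_mod = np.sum(v * pdf_obs) * dv, np.sum(v * pdf_mod) * dv
p_obs = np.sum(power_curve(v) * pdf_obs) * dv   # expected power output [kW]
p_mod = np.sum(power_curve(v) * pdf_mod) * dv
print(f"relative error in mean wind speed: {mean_mod / mean_obs - 1:+.1%}")
print(f"relative error in expected power:  {p_mod / p_obs - 1:+.1%}")
```

Because production depends roughly cubically on wind speed below rated power, a modest underestimate of the resource typically translates into a noticeably larger underestimate of expected production, which is the kind of amplification the thesis quantifies.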
43

A Sensitivity Equation Framework for Parameter Estimation in Dynamical Systems

Newey, Joshua 14 August 2024 (has links) (PDF)
We present a new framework for understanding parameter estimation in dynamical systems. The approach is developed within the modeling framework of continuous data assimilation. We outline the basic assumptions that lead to our derivation. Under these assumptions we show that parameter estimation turns into a finite-dimensional nonlinear optimization problem. We show that our derivation reproduces and extends the algorithm originally developed in [9]. We then implement these methods in three example systems: the Lorenz '63 model, the two-layer Lorenz '96 model, and the Kuramoto-Sivashinsky equation. So as to remain sufficiently general, our derivations are largely formal; we leave a more rigorous justification for future work.
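A small sketch, under assumptions and not the thesis code, of the kind of setup described here: a nudged copy of the Lorenz '63 system is synchronized to observations of one component, and an unknown parameter is recovered by minimizing the resulting model-observation misfit as a finite-dimensional optimization problem. The nudging coefficient, integration settings, and observed component are illustrative choices.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize_scalar

def lorenz63(t, s, sigma, rho, beta):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

def nudged(t, s, sigma, rho, beta, mu, x_obs):
    """Nudged copy of Lorenz '63: relax the x-component toward the observed x."""
    x, y, z = s
    dx, dy, dz = lorenz63(t, s, sigma, rho, beta)
    return [dx - mu * (x - x_obs(t)), dy, dz]

# Reference ("truth") run with rho = 28, observed only through x(t)
t_span, t_eval = (0.0, 10.0), np.linspace(0.0, 10.0, 2001)
truth = solve_ivp(lorenz63, t_span, [1.0, 1.0, 1.0], args=(10.0, 28.0, 8.0 / 3.0),
                  t_eval=t_eval, rtol=1e-8)
x_obs = lambda t: np.interp(t, truth.t, truth.y[0])

def misfit(rho_guess, mu=20.0):
    """Misfit between the nudged model's x and the observations for a trial rho."""
    sol = solve_ivp(nudged, t_span, [0.0, 0.0, 0.0],
                    args=(10.0, rho_guess, 8.0 / 3.0, mu, x_obs),
                    t_eval=t_eval, rtol=1e-6)
    return np.mean((sol.y[0] - truth.y[0]) ** 2)

res = minimize_scalar(misfit, bounds=(20.0, 35.0), method="bounded")
print(f"recovered rho ~ {res.x:.3f} (truth: 28)")
```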
44

Iterative near-term forecasting of the terrestrial carbon cycle at Harvard Forest

Helgeson, Alexis Rose 25 September 2024 (has links)
Through a combination of fossil fuel emissions, land use change, and other anthropogenic activities, mankind has dramatically altered global biogeochemical cycles, leading to an unprecedented era of rapid environmental change. To anticipate how the carbon and water cycles will change in the future, and to inform decisions about how to adapt to and mitigate these changes, we need a better understanding of the inherent predictability of these cycles. To begin to address this challenge, I designed, implemented, and analyzed a 35-day iterative forecasting workflow using Harvard Forest as an initial testbed. A key aim of this forecast is to understand the predictability of leaf area index (LAI), net ecosystem exchange (NEE), and latent heat flux (LE), which I assess in terms of how forecast uncertainty changes as a function of forecast lead time, and how the predictability of LAI, NEE, and LE is impacted by the assimilation of MODIS LAI observations. I used four metrics of uncertainty (root mean square error, bias, continuous ranked probability score, and mean absolute error) to evaluate forecast performance. Uncertainty in LAI, LE, and NEE was not positively correlated with forecast lead time. The inclusion of MODIS LAI observations improved the predictability of NEE and LE, but had the greatest impact on LAI (~50% uncertainty reduction). Carbon stores (LAI as a proxy for leaf carbon) were more predictable than terrestrial fluxes (NEE, LE).
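A short sketch, illustrative rather than the thesis workflow, of the four verification metrics mentioned, computed for an ensemble forecast against observations; the CRPS is the standard sample estimator for an ensemble, and the synthetic data are assumptions.

```python
import numpy as np

def verification_metrics(ensemble, obs):
    """RMSE, bias, and MAE of the ensemble mean, plus sample CRPS of the ensemble.

    ensemble: array of shape (n_members, n_times); obs: array of shape (n_times,).
    """
    mean = ensemble.mean(axis=0)
    rmse = np.sqrt(np.mean((mean - obs) ** 2))
    bias = np.mean(mean - obs)
    mae = np.mean(np.abs(mean - obs))
    # Sample CRPS: E|X - y| - 0.5 * E|X - X'|, averaged over forecast times
    term1 = np.mean(np.abs(ensemble - obs), axis=0)
    term2 = 0.5 * np.mean(np.abs(ensemble[:, None, :] - ensemble[None, :, :]), axis=(0, 1))
    crps = np.mean(term1 - term2)
    return {"rmse": rmse, "bias": bias, "mae": mae, "crps": crps}

# Synthetic example: a 20-member forecast over a 35-day horizon (illustrative values)
rng = np.random.default_rng(1)
obs = np.sin(np.linspace(0.0, 4.0, 35))
ensemble = obs + 0.1 + 0.3 * rng.standard_normal((20, 35))
print(verification_metrics(ensemble, obs))
```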
45

Data Assimilation Experiments Using An Indian Ocean General Circulation Model

Aneesh, C S 08 1900 (has links)
Today, ocean modeling is fast developing as a versatile tool for the study of the earth's climate, local marine ecosystems, and coastal engineering applications. Though the field of ocean modeling began in the early 1950s along with the development of climate models and primitive computers, even today the state-of-the-art ocean models have their own limitations. Many issues still remain, such as the uncertainty in the parameterisation of essential processes that occur on spatial and temporal scales smaller than those that can be resolved in model calculations, the atmospheric forcing of the ocean, and the boundary and initial conditions. The advent of data assimilation into ocean modeling has heralded a new era in the field of ocean modeling and oceanic sciences. “Data assimilation” is a methodology in which observations are used to improve the forecasting skill of operational meteorological models. The study in the present thesis mainly focuses on obtaining a four-dimensional realization (the spatial description coupled with the time evolution) of the oceanic flow that is simultaneously consistent with the observational evidence and with the dynamical equations of motion, and on providing initial conditions for predictions of oceanic circulation and tracer distribution. A good implementation of data assimilation can be achieved with the availability of a large number of good-quality observations of the oceanic fields, both synoptic and in-situ. With the technology in satellite oceanography and in-situ measurements advancing by leaps and bounds over the past two decades, good synoptic and in-situ observations of oceanic fields have become available. The current and expected explosion in remotely sensed and in-situ measured oceanographic data is ushering in a new age of ocean modeling and data assimilation. The thesis presents results of an analysis of the impact of data assimilation in an ocean general circulation model of the North Indian Ocean. In this thesis we have studied the impact of assimilating temperature and salinity profiles from Argo floats and sea surface height anomalies from satellite altimeters in a sigma-coordinate Indian Ocean model. An ocean data assimilation system based on the Regional Ocean Modeling System (ROMS) for the Indian Ocean is used. This model is implemented, validated, and applied in a climatological simulation experiment to study the circulation in the Indian Ocean. The validated model is then used for the implementation of the data assimilation system for the Indian Ocean region. This dissertation presents qualitative and quantitative comparisons of the model simulations with and without assimilation of subsurface temperature and salinity profiles and sea surface height anomaly data for the Indian Ocean region. This is the first reported data assimilation study of Argo subsurface temperature and salinity profile data with ROMS in the Indian Ocean region.
46

Efficient formulation and implementation of ensemble based methods in data assimilation

Nino Ruiz, Elias David 11 January 2016 (has links)
Ensemble-based methods have gained widespread popularity in the field of data assimilation. An ensemble of model realizations encapsulates information about the error correlations driven by the physics and the dynamics of the numerical model. This information can be used to obtain improved estimates of the state of non-linear dynamical systems such as the atmosphere and/or the ocean. This work develops efficient ensemble-based methods for data assimilation. A major bottleneck in ensemble Kalman filter (EnKF) implementations is the solution of a linear system at each analysis step. To alleviate it, an EnKF implementation based on an iterative Sherman-Morrison formula is proposed. The rank deficiency of the ensemble covariance matrix is exploited in order to efficiently compute the analysis increments during the assimilation process. The computational effort of the proposed method is comparable to that of the best EnKF implementations found in the current literature. The stability of the new algorithm is theoretically proven based on the positive definiteness of the data error covariance matrix. In order to improve the background error covariance matrices in ensemble-based data assimilation, we explore the use of shrinkage covariance matrix estimators computed from ensembles. The resulting filter has attractive features in terms of both memory usage and computational complexity. Numerical results show that it performs better than traditional EnKF formulations. In geophysical applications the correlations between errors corresponding to distant model components decrease rapidly with distance. We propose a new and efficient implementation of the EnKF based on a modified Cholesky decomposition for inverse covariance matrix estimation. This approach exploits the conditional independence of background errors between distant model components with regard to a predefined radius of influence. Consequently, sparse estimators of the inverse background error covariance matrix can be obtained. This implies huge memory savings during the assimilation process under realistic weather forecast scenarios. Rigorous error bounds for the resulting estimator in the context of data assimilation are theoretically proved. The conclusion is that the resulting estimator converges to the true inverse background error covariance matrix when the ensemble size is of the order of the logarithm of the number of model components. We explore high-performance implementations of the proposed EnKF algorithms. When the observational operator can be locally approximated for different regions of the domain, efficient parallel implementations of the EnKF formulations presented in this dissertation can be obtained. The parallel computation of the analysis increments is performed making use of domain decomposition. Local analysis increments are computed on (possibly) different processors. Once all local analysis increments have been computed, they are mapped back onto the global domain to recover the global analysis. Tests performed with an atmospheric general circulation model at a T-63 resolution, and varying the number of processors from 96 to 2,048, reveal that the assimilation time can be decreased multiple fold for all the proposed EnKF formulations. Ensemble-based methods can be used to reformulate strong-constraint four-dimensional variational data assimilation so as to avoid the construction of adjoint models, which can be complicated for operational models.
We propose a trust-region approach based on ensembles in which the analysis increments are computed in the space of an ensemble of snapshots. The quality of the resulting increments in the ensemble space is compared against the gains in the full space. Decisions on whether to accept or reject solutions rely on trust-region updating formulas. Results based on an atmospheric general circulation model with a T-42 resolution reveal that this methodology can improve the analysis accuracy. / Ph. D.
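A compact, assumption-laden illustration, not the dissertation's implementation, of the idea behind a Sherman-Morrison-based EnKF analysis step: the innovation covariance in observation space is inverted by applying the Sherman-Morrison identity once per ensemble perturbation, so no full linear system is factorized. The toy state, observation operator, and error variances are all assumptions.

```python
import numpy as np

def enkf_analysis_sherman_morrison(X, y, H, r_diag, rng):
    """Stochastic EnKF analysis step.

    The m x m matrix (R + S S^T / (N-1)) with S = H X' is inverted by applying the
    Sherman-Morrison identity once per ensemble column instead of factorizing it.
    X: (n, N) state ensemble; y: (m,) observations; H: (m, n) observation operator;
    r_diag: (m,) observation error variances (R assumed diagonal).
    """
    n, N = X.shape
    Xp = X - X.mean(axis=1, keepdims=True)        # state perturbations
    S = H @ Xp                                    # observation-space perturbations
    Ainv = np.diag(1.0 / r_diag)                  # start from R^{-1}
    for k in range(N):
        u = S[:, k] / np.sqrt(N - 1.0)            # rank-one factor of the ensemble covariance
        Au = Ainv @ u
        Ainv -= np.outer(Au, Au) / (1.0 + u @ Au) # Sherman-Morrison update
    # Kalman gain K = P H^T (H P H^T + R)^{-1}, with P H^T approximated by Xp S^T / (N-1)
    K = (Xp @ S.T / (N - 1.0)) @ Ainv
    # Perturbed observations for each member (stochastic EnKF)
    Y = y[:, None] + np.sqrt(r_diag)[:, None] * rng.standard_normal((len(y), N))
    return X + K @ (Y - H @ X)

# Toy example: 40-variable state, 10 observed components, 20 members (illustrative sizes)
rng = np.random.default_rng(2)
n, m, N = 40, 10, 20
x_true = np.sin(np.linspace(0.0, 2.0 * np.pi, n))
H = np.zeros((m, n)); H[np.arange(m), np.arange(0, n, n // m)] = 1.0
r_diag = 0.05 * np.ones(m)
X = x_true[:, None] + 0.5 * rng.standard_normal((n, N))
y = H @ x_true + np.sqrt(r_diag) * rng.standard_normal(m)
Xa = enkf_analysis_sherman_morrison(X, y, H, r_diag, rng)
print("prior RMSE    :", np.sqrt(np.mean((X.mean(axis=1) - x_true) ** 2)))
print("posterior RMSE:", np.sqrt(np.mean((Xa.mean(axis=1) - x_true) ** 2)))
```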
47

LAND SURFACE-ATMOSPHERE INTERACTIONS IN REGIONAL MODELING OVER SOUTH AMERICA

Goncalves de Goncalves, Luis Gustavo January 2005 (has links)
Land surface processes play an important role when modeling weather and climate, and understanding and representing such processes in South America is a particular challenge because of the large variations in regional climate and in surface features such as vegetation and soil. Numerical models have been used to explore the climate and weather of continental South America, but without appropriate initialization of land surface conditions model simulations can rapidly diverge from reality. This initialization problem is exacerbated by the fact that conventional surface observations over South America are scarce and biased towards the urban centers and coastal areas. This dissertation explores issues related to the appropriate representation of land surface processes and their impacts in numerical simulations with a regional atmospheric model (specifically the Eta model) over South America. The impacts of vegetation heterogeneity on regional weather forecasts were first investigated. A South American Land Data Assimilation System (SALDAS) was then created, analogous to that currently used in North America, to estimate soil moisture fields for initializing regional atmospheric models. The land surface model (LSM) used in this SALDAS is the Simplified Simple Biosphere (SSiB) model. Precipitation fields are critical when calculating soil moisture and, because conventional surface observations are scarce in South America, some of the most important remotely sensed precipitation products were evaluated as potential precipitation forcing for the SALDAS. Spin-up states for SSiB were then compared with climatological estimates of land surface fields, and significant differences were found. Finally, an assessment was made of the value of SALDAS-derived soil moisture fields in Eta model forecasts. The primary result was that model performance is enhanced over the entire continent for forecasts of up to 72 h when using SALDAS surface fields.
48

Simulating the carbon cycling of croplands : model development, diagnosis, and regional application through data assimilation

Sus, Oliver January 2012 (has links)
In the year 2000, croplands covered about 12% of the Earth's ice-free land surface. Through cropland management, humankind currently appropriates about 25% of terrestrial ecosystem productivity. Not only are croplands a key element of human food supply, but they also bear potential for increased carbon (C) uptake when best-practice land management approaches are adopted. A detailed assessment of the impact of land use on terrestrial ecosystems can be achieved by modelling, but the simulation of crop C cycling itself is a relatively new discipline. Observational data on crop net ecosystem exchange (NEE) have only recently become available, and constitute an important tool for model development, diagnosis, and validation. Before crop functional types (CFT) had been introduced, however, large-scale biogeochemical models (BGCM) lacked crop-specific patterns of phenology, C allocation, and land management. As a consequence, the influence of cropland C cycling on the seasonality and magnitude of biosphere-atmosphere C exchange is currently poorly known. To date, no regional assessment of crop C cycling and yield formation exists that specifically accounts for spatially and temporally varying patterns of sowing dates within models. In this thesis, I present such an assessment for the first time. In the first step (chapter 2), I built a crop C mass balance model (SPAc) that simulates crop development and C allocation as a response to ambient meteorological conditions. I compared model outputs against C flux and stock observations from six different sites in Europe, and found a high degree of agreement between simulated and measured fluxes (R2 = 0.83). However, the model tended to overestimate leaf area index (LAI) and underestimate final yield. In a model comparison study (chapter 3), I found, in cooperation with other researchers, that SPAc best reproduces observed fluxes of C and water (owing to the model's high temporal and process resolution), but is limited by its inability to simulate full crop rotations. I then conducted a detailed diagnosis of SPAc through the assimilation of C fluxes and biometry with the Ensemble Kalman Filter (EnKF, chapter 4), and identified potential model weaknesses in C allocation fractions and plant hydraulics. Further, an overestimation of plant respiration and of seasonal leaf thickness variability was evident. Temporal parameter variability in response to C flux data assimilation (DA) is indicative of ecosystem processes that are resolved in NEE data but are not captured by a model's structure. Through DA, I gained important quantitative insights into model shortcomings, and highlighted further needs for model improvement and future field studies. Finally, I developed a framework allowing for spatio-temporally resolved simulation of cropland C fluxes under observational constraints on land management and canopy greenness (chapter 5). MODIS (Moderate Resolution Imaging Spectroradiometer) data were assimilated both variationally (for sowing date estimation; a toy sketch of this step follows below) and sequentially (for improved model state estimation, using the EnKF) into SPAc. In doing so, I was able to accurately quantify the multiannual (2000-2006) regional C flux and biometry seasonality of maize-soybean crop rotations surrounding the Bondville Ameriflux eddy covariance (EC) site, averaged over 104 pixel locations within the wider area.
Results show that MODIS-derived sowing dates and the assimilation of LAI data allow for highly accurate simulations of growing season C cycling at locations for which ground-truth sowing dates are not available. Through quantification of the spatial variability in biometry, NEE, and net biome productivity (NBP), I found that regional patterns of land management are important drivers of agricultural C cycling and major sources of uncertainty if not appropriately accounted for. Observing C cycling at one single field with its individual sowing pattern is not sufficient to constrain large-scale agroecosystem behaviour. Here, I developed a framework that enables modellers to accurately simulate current (i.e. last 10 years) C cycling of major agricultural regions and their contribution to atmospheric CO2 variability. Follow-up studies can provide crucial insights into testing and validating large-scale applications of biogeochemical models.
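A simplified sketch of variational sowing-date estimation, an illustration under assumptions rather than the SPAc implementation: candidate sowing dates shift a prescribed toy LAI phenology curve, and the date minimizing the misfit to noisy satellite-like LAI retrievals is selected. The phenology shape, noise level, and compositing interval are assumptions.

```python
import numpy as np

def lai_curve(doy, sowing_doy, peak_lai=5.0, season_len=120.0):
    """Toy crop LAI phenology: sine-shaped green-up and senescence after sowing."""
    dt = np.asarray(doy, dtype=float) - sowing_doy
    lai = peak_lai * np.sin(np.pi * np.clip(dt, 0.0, season_len) / season_len) ** 2
    return np.where((dt >= 0.0) & (dt <= season_len), lai, 0.0)

def estimate_sowing_date(doy_obs, lai_obs, candidates):
    """Variational-style estimate: the sowing date minimizing the squared LAI misfit."""
    costs = np.array([np.sum((lai_curve(doy_obs, c) - lai_obs) ** 2) for c in candidates])
    return candidates[int(np.argmin(costs))], costs

# Synthetic satellite LAI: 8-day composites, true sowing at day-of-year 130
rng = np.random.default_rng(3)
doy_obs = np.arange(100, 300, 8)
lai_obs = lai_curve(doy_obs, 130) + 0.4 * rng.standard_normal(doy_obs.size)
best, costs = estimate_sowing_date(doy_obs, lai_obs, candidates=np.arange(110, 160))
print(f"estimated sowing day-of-year: {best} (truth: 130)")
```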
49

Data Assimilation for Spatial Temporal Simulations Using Localized Particle Filtering

Long, Yuan 15 December 2016 (has links)
As sensor data becomes more and more available, there is an increasing interest in assimilating real-time sensor data into spatial temporal simulations to achieve more accurate simulation or prediction results. Particle Filters (PFs), also known as Sequential Monte Carlo methods, hold great promise in this area as they use Bayesian inference and stochastic sampling techniques to recursively estimate the states of dynamic systems from given observations. However, PFs face major challenges in working effectively for complex spatial temporal simulations due to the high-dimensional state space of the simulation models, which typically cover large areas and have a large number of spatially dependent state variables. As the state space dimension increases, the number of particles must increase exponentially in order to converge to the true system state. The purpose of this dissertation work is to develop localized particle filtering to support PF-based data assimilation for large-scale spatial temporal simulations. We develop a spatially dependent particle-filtering framework that breaks the system state and observation data into sub-regions and then carries out localized particle filtering based on these spatial regions. The developed framework exploits the spatial locality property of system state and observation data, and employs the divide-and-conquer principle to reduce state dimension and data complexity. Within this framework, we propose a two-level automated spatial partitioning method to provide optimized and balanced spatial partitions with fewer boundary sensors. We also consider different types of data to effectively support data assimilation for spatial temporal simulations. These data include both hard data, which are measurements from physical devices, and soft data, which are information from messages, reports, and social networks. The developed framework and methods are applied to large-scale wildfire spread simulations and achieve improved results. Furthermore, we compare the proposed framework to existing particle-filtering-based data assimilation frameworks and evaluate the performance of each of them.
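A bare-bones sketch of the divide-and-conquer idea, with assumptions throughout and not the dissertation's framework: the state and observations are split into spatial sub-regions, and particle weighting and resampling are carried out independently within each region, keeping each local problem low-dimensional.

```python
import numpy as np

def localized_pf_update(particles, obs, regions, obs_var, rng):
    """One localized particle-filter analysis step.

    particles: (n_particles, n_cells) state ensemble; obs: (n_cells,) observations
    (NaN where no sensor); regions: list of index arrays partitioning the cells.
    Weighting and resampling are performed independently within each region.
    """
    n_particles = particles.shape[0]
    updated = particles.copy()
    for idx in regions:
        observed = idx[~np.isnan(obs[idx])]
        if observed.size == 0:
            continue                      # no sensors in this region: keep prior particles
        # Gaussian likelihood of the region's observations under each particle
        misfit = particles[:, observed] - obs[observed]
        logw = -0.5 * np.sum(misfit ** 2, axis=1) / obs_var
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Systematic resampling, applied to this region's state variables only
        positions = (rng.random() + np.arange(n_particles)) / n_particles
        picks = np.searchsorted(np.cumsum(w), positions)
        updated[:, idx] = particles[picks][:, idx]
    return updated

# Toy example: 100 cells in 20 small regions, every 5th cell observed, biased prior
rng = np.random.default_rng(4)
n_particles, n_cells = 200, 100
truth = np.sin(np.linspace(0.0, 3.0 * np.pi, n_cells))
particles = truth + 0.7 + 0.5 * rng.standard_normal((n_particles, n_cells))  # biased prior
obs = np.full(n_cells, np.nan)
obs[::5] = truth[::5] + 0.1 * rng.standard_normal(n_cells // 5)
regions = np.array_split(np.arange(n_cells), 20)   # small regions keep local dimension low
analysis = localized_pf_update(particles, obs, regions, obs_var=0.01, rng=rng)
seen = ~np.isnan(obs)
print("prior RMSE at observed cells    :", np.sqrt(np.mean((particles.mean(0)[seen] - truth[seen]) ** 2)))
print("posterior RMSE at observed cells:", np.sqrt(np.mean((analysis.mean(0)[seen] - truth[seen]) ** 2)))
```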
50

Leveraging the information content of process-based models using Differential Evolution and the Extended Kalman Filter

Howard, Lucas 01 January 2016 (has links)
Process-based models are used in a diverse array of fields, including environmental engineering, to provide supporting information to engineers, policymakers, and stakeholders. Recent advances in remote sensing and data storage technology have provided opportunities for improving the application of process-based models and visualizing data, but also present new challenges. The availability of larger quantities of data may allow models to be constructed and calibrated in a more thorough and precise manner, but depending on the type and volume of data, it is not always clear how to incorporate the information content of these data into a coherent modeling framework. In this context, using process-based models in new ways to provide decision support or to produce more complete and flexible predictive tools is a key task in the modern data-rich engineering world. In standard usage, models can be used for simulating specific scenarios; they can also be used as part of an automated design optimization algorithm to provide decision support, or in a data-assimilation framework to incorporate the information content of ongoing measurements. In that vein, this thesis presents and demonstrates extensions and refinements to leverage the best of what process-based models offer using Differential Evolution (DE) and the Extended Kalman Filter (EKF). Coupling multi-objective optimization to a process-based model may provide valuable information provided an objective function is constructed appropriately to reflect the multi-objective problem and constraints. That, in turn, requires weighting two or more competing objectives in the early stages of an analysis. The methodology proposed here relaxes that requirement by framing the model optimization as a sensitivity analysis. For demonstration, this is implemented using a surface water model (HEC-RAS), and the impact of floodplain access upstream and downstream of a fixed bridge on bridge scour is analyzed. DE, an evolutionary global optimization algorithm, is wrapped around a calibrated HEC-RAS model. Multiple objective functions, representing different relative weightings of the two objectives, are used; the resulting rank orders of river reach locations by floodplain access sensitivity are consistent across these multiple functions. To extend the applicability of data assimilation methods, this thesis proposes relaxing the requirement that the model be calibrated (provided the parameters are still within physically defensible ranges) before performing assimilation. The model is then dynamically calibrated to new state estimates, which depend on the behavior of the model. Feasibility is demonstrated using the EKF and a synthetic dataset of pendulum motion; a sketch of this idea appears below. The dynamic calibration method reduces the variance of prediction errors compared to measurement errors when starting from an initially uncalibrated model, and produces estimates of calibration parameters that converge to the true values. The potential application of the dynamic calibration method to river sediment transport modeling is proposed in detail, including a method for automated calibration using sediment grain size distribution as a calibration parameter.
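A condensed sketch of the dynamic calibration idea under illustrative assumptions, not the thesis implementation: the pendulum's unknown length is appended to the EKF state vector, so assimilating noisy angle measurements jointly updates the state estimate and the calibration parameter. The discretization, noise levels, and initial guesses are assumptions.

```python
import numpy as np

g, dt = 9.81, 0.01

def f(s):
    """One Euler step of the pendulum, carrying the length L as a state variable."""
    theta, omega, L = s
    return np.array([theta + dt * omega, omega - dt * (g / L) * np.sin(theta), L])

def F(s):
    """Jacobian of f, including the sensitivity of the dynamics to the parameter L."""
    theta, omega, L = s
    return np.array([
        [1.0,                           dt,  0.0],
        [-dt * (g / L) * np.cos(theta), 1.0, dt * (g / L**2) * np.sin(theta)],
        [0.0,                           0.0, 1.0],
    ])

H = np.array([[1.0, 0.0, 0.0]])        # only the angle is measured
Q = np.diag([1e-8, 1e-8, 1e-8])        # small process noise keeps L adjustable
R = np.array([[0.02 ** 2]])            # measurement noise variance

# Synthetic truth with L = 1.5 m; the filter starts from a wrong guess L = 1.0 m
rng = np.random.default_rng(5)
truth = np.array([0.5, 0.0, 1.5])
s, P = np.array([0.5, 0.0, 1.0]), np.diag([0.01, 0.01, 0.25])
for _ in range(5000):
    truth = f(truth)
    z = truth[0] + 0.02 * rng.standard_normal()
    # EKF forecast (Jacobian evaluated at the previous estimate)
    s, P = f(s), F(s) @ P @ F(s).T + Q
    # EKF analysis
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    s = s + K @ (np.array([z]) - H @ s)
    P = (np.eye(3) - K @ H) @ P
print(f"estimated pendulum length: {s[2]:.3f} m (truth: 1.5 m)")
```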
