61

Advancing Streamflow Forecasts Through the Application of a Physically Based Energy Balance Snowmelt Model With Data Assimilation and Cyberinfrastructure Resources

Gichamo, Tseganeh Zekiewos 01 May 2019 (has links)
The Colorado Basin River Forecast Center (CBRFC) provides streamflow forecasts for purposes such as flood warning and water supply. Much of the water in these basins comes from spring snowmelt, and the forecasters at CBRFC currently employ a suite of models that includes a temperature-index snowmelt model. While the temperature-index snowmelt model works well for weather and land cover conditions that do not deviate from those historically observed, the changing climate and alterations in land use necessitate models that do not depend on calibration against past data. This dissertation reports work done to overcome these limitations by using a snowmelt model based on invariant physical principles, which depends less on calibration and can directly accommodate weather and land use changes. The first part of the work developed the ability to update the conditions represented in the model based on observations, a process referred to as data assimilation, and evaluated the resulting improvements to snowmelt-driven streamflow forecasts. The second part of the research was the development of web services that enable automated and efficient access to and processing of input data for the hydrological models, as well as parallel processing methods that speed up model execution. Together, these tasks allow the more detailed model and the data assimilation methods to be used more efficiently for streamflow forecasting.
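As a minimal illustration of the assimilation step described above, the sketch below blends a modeled snow water equivalent (SWE) value with an observation using a scalar Kalman update; the one-variable setting and all numbers are hypothetical, and the dissertation's actual scheme operates on a distributed energy balance model.

```python
# Scalar Kalman (optimal interpolation) update: blend a modeled snow water
# equivalent (SWE) estimate with an observation, weighting by error variance.
def update_swe(swe_model, var_model, swe_obs, var_obs):
    gain = var_model / (var_model + var_obs)      # Kalman gain in [0, 1]
    swe_analysis = swe_model + gain * (swe_obs - swe_model)
    var_analysis = (1.0 - gain) * var_model       # analysis uncertainty shrinks
    return swe_analysis, var_analysis

swe, var = update_swe(swe_model=250.0, var_model=40.0**2,
                      swe_obs=310.0, var_obs=25.0**2)
print(f"analysis SWE = {swe:.1f} mm, std = {var ** 0.5:.1f} mm")
```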
62

Patient-Specific Numerical Models of Atrial Fibrillation

Gerard, Antoine 10 July 2019 (has links)
Atrial arrhythmias are a major pathology in cardiology, and their study is a large research topic. To study them, many mathematical models of action potential propagation in the atria have been developed. Most of these generic models can reproduce typical activation sequences of the atria. Such models may have experimental or even clinical value, for example in helping to locate arrhythmic foci or in analyzing treatment failures for these arrhythmias. Nevertheless, to achieve this goal, the model must be fitted as closely as possible, in its geometric and functional dimensions, to individual data. Data assimilation, the mathematical discipline in which we seek to optimally combine theory and observations, is therefore a good candidate for personalizing action potential propagation models. In this thesis, we study different data assimilation methods, sequential and variational, in order to fit action potential propagation models to electroanatomical data. More precisely, we are interested in two possible applications of data assimilation: state estimation and parameter estimation. First, we study a state observer that corrects the position of the simulated propagation front based on the position of the observed front. This observer is then used to complete an activation map obtained during a clinical procedure. Next, the same observer is combined with a reduced-order Kalman filter to estimate the conductivity parameters of the action potential propagation model. A study of the joint state-parameter estimation strategy is then carried out to see how the method behaves in the presence of modeling errors, and the method is tested on a clinically acquired dataset. We then turn to variational data assimilation methods, which allow the estimation of spatially distributed parameters. Several minimization problems for estimating a conductivity parameter distributed in space are introduced and analyzed. We show that discretizing these minimization problems, in order to obtain numerical solution methods, can be complex. A numerical method is then implemented for one of the minimization problems studied, and three one-dimensional test cases are analyzed. Finally, we prove the existence of a minimum for one of the objective functions studied, based on functional analysis results from the literature.
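The front-tracking observer can be pictured with a toy one-dimensional sketch like the following, in which a simulated activation front position is nudged toward noisy observations of the true front; the constant-speed model and the gain value are illustrative assumptions, not the formulation used in the thesis.

```python
import numpy as np

# Luenberger-style observer for a 1D front position: predict with the model,
# then pull the prediction toward the observed front by a feedback gain.
def track_front(x0, speed, observations, dt=1.0, gain=0.4):
    x, history = x0, []
    for y in observations:
        x = x + speed * dt            # model prediction step
        x = x + gain * (y - x)        # correction toward observed position
        history.append(x)
    return np.array(history)

rng = np.random.default_rng(0)
truth = np.cumsum(np.full(20, 1.2))              # true front, speed 1.2
obs = truth + rng.normal(0.0, 0.3, truth.size)   # noisy front observations
est = track_front(x0=0.0, speed=1.0, observations=obs)  # biased model speed
print(np.round(est - truth, 2))                  # tracking error stays bounded
```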
63

Ensemble Statistics and Error Covariance of a Rapidly Intensifying Hurricane

Rigney, Matthew C. 16 January 2010 (has links)
This thesis presents an investigation of ensemble Gaussianity, the effect of non-Gaussianity on covariance structures, storm-centered data assimilation techniques, and the relationship between commonly used data assimilation variables and the underlying dynamics for the case of Hurricane Humberto. Using an Ensemble Kalman Filter (EnKF), a comparison of data assimilation results in storm-centered and Eulerian coordinate systems is made. In addition, the extent of the non-Gaussianity of the model ensemble is investigated and quantified. The effect of this non-Gaussianity on covariance structures, which play an integral role in the EnKF data assimilation scheme, is then explored. Finally, the correlation structures calculated from a Weather Research and Forecasting (WRF) ensemble forecast of several state variables are investigated in order to better understand the dynamics of this rapidly intensifying cyclone. Hurricane Humberto rapidly intensified in the northwestern Gulf of Mexico from a tropical disturbance to a strong category one hurricane with 90 mph winds in 24 hours. Numerical models did not capture the intensification of Humberto well. This could be due in large part to initial condition error, which can be addressed by data assimilation schemes. Because the EnKF is a linear update derived under the assumption that the ensemble distribution is normal, non-Gaussianity in the ensemble could affect the EnKF update. Through an inspection of statistical moments, it is shown that multiple state variables do indeed exhibit significant non-Gaussianity. In addition, storm-centered data assimilation schemes present an alternative to traditional Eulerian schemes by emphasizing the centrality of the cyclone to the assimilation window. This allows for an update that is most effective in the vicinity of the storm center, which is of most concern in mesoscale events such as Humberto. Finally, the effect of non-Gaussian distributions on covariance structures is examined through data transformations of normal distributions. Various standard transformations of two Gaussian distributions are made, and skewness, kurtosis, and the correlation between the two distributions are computed before and after each transformation. A relationship emerges between changes in skewness and kurtosis and the correlation between the distributions. These effects are then taken into consideration as the dynamics contributing to the rapid intensification of Humberto are explored through correlation structures.
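The transformation experiment summarized at the end of the abstract can be sketched directly: two correlated Gaussian samples are passed through a nonlinear transform, and skewness, kurtosis, and correlation are compared before and after (the exponential transform and the sample size here are illustrative assumptions).

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(0)
cov = [[1.0, 0.6], [0.6, 1.0]]                   # correlated Gaussian pair
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=100_000).T

# Compare moments and correlation before and after a nonlinear transform.
for name, (u, v) in {"gaussian": (x, y),
                     "exp-transformed": (np.exp(x), np.exp(y))}.items():
    print(f"{name:>15}: skew={skew(u):+.2f}  excess kurt={kurtosis(u):+.2f}  "
          f"corr={np.corrcoef(u, v)[0, 1]:.3f}")
```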
64

Data assimilation for parameter estimation in coastal ocean hydrodynamics modeling

Mayo, Talea Lashea 25 February 2014 (has links)
Coastal ocean models are used for a vast array of applications, including modeling tidal and coastal flows, waves, and extreme events such as tsunamis and hurricane storm surges. Tidal and coastal flows are the primary application of this work, as they play a critical role in many practical research areas such as contaminant transport, navigation through intracoastal waterways, development of coastal structures (e.g. bridges, docks, and breakwaters), commercial fishing, planning and execution of military operations in marine environments, and recreational aquatic activities. Coastal ocean models are used to determine tidal amplitudes, time intervals between low and high tide, and the extent of the ebb and flow of tidal waters, often at specific locations of interest. However, modeling tidal flows can be quite complex, as factors such as the configuration of the coastline, water depth, ocean floor topography, and hydrographic and meteorological impacts can have significant effects and must all be considered. Water levels and currents in the coastal ocean can be modeled by solving the shallow water equations. The shallow water equations contain many parameters, and the accurate estimation of both tides and storm surge depends on the accuracy of their specification. Of particular importance are the parameters used to define the bottom stress in the domain of interest [50]. These parameters are often heterogeneous across the seabed of the domain. Their values cannot be measured directly, and relevant data can be expensive and difficult to obtain. The parameter values must often be inferred, and the estimates are often inaccurate or contain a high degree of uncertainty [28]. In addition, as is the case with many numerical models, coastal ocean models have various other sources of uncertainty, including the approximate physics, the numerical discretization, and uncertain boundary and initial conditions. Quantifying and reducing these uncertainties is critical to providing more reliable and robust storm surge predictions. It is also important to reduce the resulting error in the forecast of the model state as much as possible. The accuracy of coastal ocean models can be improved using data assimilation methods. In general, statistical data assimilation methods are used to estimate the state of a model given both the original model output and observed data. A major advantage of statistical data assimilation methods is that they can often be applied non-intrusively, which makes them relatively straightforward to implement. They also provide estimates of the uncertainty in the predicted model state. Unfortunately, with the exception of the estimation of initial conditions, they do not contribute to the information contained in the model. The model error that results from uncertain parameters is reduced, but information about the parameters themselves remains unknown. Thus, the other commonly used approach to reducing model error is parameter estimation. Historically, model parameters such as the bottom stress terms have been estimated using variational methods. Variational methods formulate a cost functional that penalizes the difference between the modeled and observed state, and then minimize this functional over the unknown parameters. Though variational methods are an effective approach to solving inverse problems, they can be computationally intensive and difficult to code, as they generally require the development of an adjoint model.
They are also not formulated to estimate parameters in real time, e.g. as a hurricane approaches landfall. The goal of this research is to estimate the parameters defining the bottom stress terms using statistical data assimilation methods. In this work, we use a novel approach to estimate the bottom stress terms in the shallow water equations, which we solve numerically using the Advanced Circulation (ADCIRC) model. In this model, a modified form of the 2-D shallow water equations is discretized in space by a continuous Galerkin finite element method, and in time by finite differencing. We use the Manning's n formulation to represent the bottom stress terms in the model, and estimate various fields of Manning's n coefficients by assimilating synthetic water elevation data with a square root Kalman filter. We estimate three types of fields defined on both an idealized inlet and a more realistic spatial domain. For the first field, the Manning's n coefficient is given a constant value over the entire domain. For the second, we let the Manning's n coefficient take two distinct values, one defining the bottom stress in the deeper water of the domain and the other defining the bottom stress in the shallower region. And finally, because bottom stress terms are generally spatially varying parameters, we consider the third field as a realization of a stochastic process. We represent a realization of the process using a Karhunen-Loève expansion, and then seek to estimate the coefficients of the expansion. We perform several observing system simulation experiments, and find that we are able to accurately estimate the bottom stress terms in most of our test cases. Additionally, we are able to improve forecasts of the model state in every instance. The results of this study show that statistical data assimilation is a promising approach to parameter estimation.
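A minimal sketch of the Karhunen-Loève representation mentioned above, under assumed values for the mean, variance, and correlation length of the Manning's n field: the spatially varying field is reduced to a handful of expansion coefficients, which is what the filter then has to estimate.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 200)                    # 1D spatial grid
ell, sigma = 0.2, 0.005                           # assumed corr. length, std
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / ell)  # covariance

lam, phi = np.linalg.eigh(C)                      # eigenpairs of covariance
order = np.argsort(lam)[::-1]                     # sort modes by energy
lam, phi = lam[order], phi[:, order]

k = 5                                             # truncation order
z = np.random.default_rng(1).standard_normal(k)   # coefficients to estimate
n_field = 0.03 + phi[:, :k] @ (np.sqrt(lam[:k]) * z)  # one field realization
print("Manning's n range:", n_field.min().round(4), "-", n_field.max().round(4))
```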
65

Understanding the Coupled Surface-Groundwater System from Event to Decadal Scale using an Un-calibrated Hydrologic Model and Data Assimilation

Tao, Jing January 2015 (has links)
In this dissertation, a Hydrologic Data Assimilation System (HDAS), relying on the Duke Coupled surface-groundwater Hydrology Model (DCHM) and various data assimilation techniques including the EnKF (Ensemble Kalman Filter), the fixed-lag EnKS (Ensemble Kalman Smoother), and the Asynchronous EnKF (AEnKF), was developed to 1) investigate the hydrological predictability of precipitation-induced natural hazards (i.e. floods and landslides) in the Southern Appalachians in North Carolina, USA, and 2) characterize the seasonal (wet/dry) and inter-annual variability of surface-groundwater interactions, with implications for water resource management in the Upper Zambezi River Basin (UZRB) in southern Africa. The overarching research objective is to improve the hydrologic predictability of precipitation-induced natural hazards and water resources in regions of complex terrain. The underlying research hypothesis is that hydrologic response in mountainous regions is governed by surface-subsurface interaction mechanisms, specifically interflow in soil-mantled slopes, surface-groundwater interactions in recharge areas, and wetland dynamics in alluvial floodplains at low elevations. The research approach is to investigate the modes of uncertainty propagation from atmospheric forcing and hydrologic states on processes at multiple scales using a parsimonious, uncalibrated hydrologic model (the DCHM) together with Monte Carlo and data assimilation methods. In order to investigate the coupled surface-groundwater system and assess the predictability of precipitation-induced natural hazards (i.e. floods and landslides) in headwater basins, including the propagation of uncertainty in QPE/QPF (Quantitative Precipitation Estimates/Forecasts) to QFE/QFF (Quantitative Flood Estimates/Forecasts), the DCHM was implemented first at high spatial resolution (250 m) in the Southern Appalachian Mountains (SAM) in North Carolina, USA. The DCHM modeling system was implemented subsequently at coarse resolution (5 km) in the Upper Zambezi River Basin (UZRB) in southern Africa for decadal-scale simulations (water years 2002 to 2012).

The research in the SAM showed that joint QPE-QFF distributions for flood response at the headwater catchment scale are highly non-linear with respect to the space-time structure of rainfall, exhibiting strong dependence on basin physiography, initial soil moisture conditions (transient basin storage capacity), the space-time organization of runoff generation and conveyance mechanisms, and in particular interflow dynamics. The errors associated with QPEs and QPFs were characterized using rainfall observations from a dense rain gauge network in the Pigeon River Basin, resulting in a simple linear regression model for adjusting and improving QPEs. Deterministic QFEs simulated by the DCHM agree well with observations, with Nash-Sutcliffe (NS) coefficients of 0.8 to 0.9. The limitations of state-of-the-science operational QPF, and the impact of even limited improvements in rainfall forcing, were demonstrated through an experiment consisting of nudging satellite-like observations (i.e. adjusted QPEs) into operational QPE/QPF, which showed significant improvement in QFF performance, especially when the timing of the satellite overpass is such that it captures transient episodes of heavy rainfall during the event. The research further showed that the dynamics of subsurface hydrologic processes play an important role as a trigger mechanism of shallow landslides through soil moisture redistribution by interflow. Specifically, transient mass fluxes associated with the temporal-spatial dynamics of interflow govern the timing of shallow landslide initiation, and subsequent debris flow mobilization, independently of storm characteristics such as precipitation intensity and duration. Interflow response was shown to be dominant at high elevations in the presence of deep soils, as well as in basins with large alluvial fans or unconsolidated debris flow deposits. In recharge areas, and where subsurface flow is an important contribution to streamflow, subsurface-groundwater interactions determine initial hydrologic conditions (e.g. soil moisture states and water table position), which in turn govern the timing and magnitude of flood response at the event scale. More generally, surface-groundwater interactions are essential to capture low flows in the summer season, and generally during persistent dry weather and drought conditions. Future advances in QFF and landslide monitoring remain principally constrained by progress in QPE and QPF at the spatial resolution necessary to resolve rainfall-interflow dynamics in mountainous regions.

The predictability of QFE/QFF was further scrutinized in a complete operational environment during the Intense Observing Period (IOP) of the Integrated Precipitation and Hydrology Experiment (IPHEx-IOP), in order to investigate the predictability of floods (and flash floods) in headwater catchments of various drainage sizes in the Southern Appalachians. With the DCHM, a variety of operational QPEs were used to produce hydrological hindcasts for the previous day, from which the final states were used as initial conditions for the hydrological forecast for the current day. Although the IPHEx operational testbed results were promising in that none of the flash flood events during the IOP were missed, with lead times of up to 6 hours, significant overprediction or underprediction errors were identified that could be traced back to the QPFs and to the subgrid-scale variability of radar QPEs. Furthermore, the added value of improving QFE/QFF by assimilating discharge observations into the DCHM was investigated with a view to advancing flood forecasting skill in operational mode. Both hindcast and forecast flood results were significantly improved by assimilating the discharge observations into the DCHM using the EnKF, the fixed-lag EnKS, and the AEnKF. The results not only demonstrate the utility of discharge assimilation in operational forecasts, but also reveal the importance of the initial water storage in the basin when issuing flood forecasts. Specifically, hindcast NSEs as high as 0.98, 0.71, and 0.99 at 15-minute time scales were attained for three headwater catchments in the inner mountain region, demonstrating that assimilation of discharge observations at the basin's outlet can reduce the errors and uncertainties in soil moisture. Success in operational flood forecasting at lead times of 6, 9, 12, and 15 hours was also achieved through discharge assimilation, with NSEs of 0.87, 0.78, 0.72, and 0.51, respectively. The discharge assimilation experiments indicate that the optimal assimilation time window depends not only on basin properties but also on the storm-specific space-time structure of rainfall within the basin, and therefore adaptive, context-aware configurations of the data assimilation system should prove useful to address the challenges of flood prediction in headwater basins.

A physical parameterization of wetland hydrology was incorporated in the DCHM for water resource assessment studies in the UZRB. The spatial distribution of wetlands was introduced in the model using probability-of-occurrence maps generated by logistic regression models with MODIS reflectance-based indices as predictor variables. Continuous model simulations for the 2002-2012 period show that the DCHM with the wetland parameterization was able to reproduce wetland hydrologic processes adequately, including surface-groundwater interactions. The modelled regional terrestrial water storage anomaly (TWSA) captured the inter- and intra-annual variability of the system's water storage changes very well, in good agreement with TWSA observations from NASA's GRACE (Gravity Recovery and Climate Experiment). Specifically, the positive trend in TWSA documented by GRACE was simulated independently by the DCHM. Furthermore, it was determined that the positive TWSA trend results from cumulative water storage in the sandy soils of the Cuando-Luana sub-basin when shifts in storm tracks move rainfall to the western sector of the Angolan High Plateau.

Overall, the dissertation demonstrates the capability of the DCHM to predict specific characteristics of the hydrological response to extreme events, as well as the inter- and intra-annual variability of surface-groundwater interactions at the decadal scale. The DCHM, coupled with a slope stability module and a wetland module featuring surface-groundwater interaction mechanisms, not only has great potential in the context of developing a regional warning system for natural hazards (i.e. flash floods and landslides), but is also promising for investigating regional water budgets at the decadal scale. In addition, the DCHM-HDAS demonstrated the ability to reduce forecasting uncertainty and errors associated with the forcing data and the model proper, thus significantly improving the predictability of natural hazards. The HDAS could also be used for regional water resource assessment, especially in poorly gauged regions (e.g. southern Africa), by taking advantage of satellite observations.
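The discharge-assimilation step can be pictured with a schematic ensemble Kalman update, in which a single observed discharge at the basin outlet corrects an ensemble of soil moisture states through their cross-covariance with the simulated discharge; the toy state-to-discharge map and ensemble sizes below are assumptions, not DCHM components.

```python
import numpy as np

def enkf_update(X, hx, y_obs, obs_var, rng):
    """X: (n_state, n_ens) state ensemble; hx: (n_ens,) simulated discharge."""
    Xp = X - X.mean(axis=1, keepdims=True)        # state anomalies
    hp = hx - hx.mean()                           # discharge anomalies
    n = X.shape[1]
    cov_xy = Xp @ hp / (n - 1)                    # state-discharge covariance
    var_y = hp @ hp / (n - 1) + obs_var
    K = cov_xy / var_y                            # Kalman gain, (n_state,)
    y_pert = y_obs + rng.normal(0.0, np.sqrt(obs_var), n)  # perturbed obs
    return X + np.outer(K, y_pert - hx)

rng = np.random.default_rng(2)
X = rng.uniform(0.1, 0.4, size=(50, 30))          # 50 soil layers, 30 members
hx = 5.0 + 40.0 * X.mean(axis=0) + rng.normal(0, 0.5, 30)  # toy discharge map
Xa = enkf_update(X, hx, y_obs=14.0, obs_var=1.0, rng=rng)
print("mean wetness:", X.mean().round(3), "->", Xa.mean().round(3))
```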
66

Diagnostics and Generalizations for Parametric State Estimation

Nearing, Grey Stephen January 2013 (has links)
This dissertation comprises five distinct research projects that apply, evaluate, and extend common methods for land surface data assimilation. The introduction of novel diagnostics and extensions of existing algorithms is motivated by an example, related to estimating agricultural productivity, in which current methods fail. We subsequently develop methods, based on Shannon's theory of communication, to quantify the contributions of all possible factors to the residual uncertainty in state estimates after data assimilation, and to measure the amount of information contained in observations that is lost due to erroneous assumptions in the assimilation algorithm. Additionally, we discuss an interpretation of Shannon information that allows us to measure the amount of information contained in a model, and use this interpretation to measure the amount of information introduced during data assimilation-based system identification. Finally, we propose a generalization of the ensemble Kalman filter designed to alleviate one of its primary assumptions: that the observation function is linear.
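One standard route to relaxing the linear-observation assumption is sketched below: the nonlinear observation operator is evaluated on each ensemble member, and the gain is built from ensemble covariances. This is a generic technique; the generalization proposed in the dissertation may differ from it.

```python
import numpy as np

def enkf_nonlinear_obs(X, h, y_obs, R, rng):
    Y = np.apply_along_axis(h, 0, X)              # h(x) per member, (n_obs, n_ens)
    Xp = X - X.mean(axis=1, keepdims=True)
    Yp = Y - Y.mean(axis=1, keepdims=True)
    n = X.shape[1]
    Pxy = Xp @ Yp.T / (n - 1)                     # state-obs covariance
    Pyy = Yp @ Yp.T / (n - 1) + R                 # obs covariance + noise
    K = Pxy @ np.linalg.inv(Pyy)
    Yo = y_obs[:, None] + rng.multivariate_normal(np.zeros(len(y_obs)), R, n).T
    return X + K @ (Yo - Y)                       # perturbed-obs analysis

rng = np.random.default_rng(3)
X = rng.normal(1.0, 0.5, size=(3, 40))            # 3 states, 40 members
h = lambda x: np.array([x[0] ** 2, x[1] * x[2]])  # nonlinear observation map
Xa = enkf_nonlinear_obs(X, h, y_obs=np.array([1.5, 0.8]),
                        R=0.05 * np.eye(2), rng=rng)
print("analysis mean:", Xa.mean(axis=1).round(3))
```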
67

Data Assimilation In Systems With Strong Signal Features

Rosenthal, William Steven January 2014 (has links)
Filtering problems in high dimensional geophysical applications often require spatially continuous models to interpolate spatially and temporally sparse data. Many applications in numerical weather and ocean state prediction are concerned with tracking and assessing the uncertainty in the position of large scale vorticity features, such as storm fronts, jet streams, and hurricanes. Quantifying the amplitude variance in these features is complicated by the fact that both height and lateral perturbations in the feature geometry are represented in the same covariance estimate. However, when there are sufficient observations to detect feature information like spatial gradients, the positions of these features can be used to further constrain the filter, as long as the statistical model (cost function) has provisions for both height perturbations and lateral displacements. Several authors since the 1990s have proposed formalisms for the simultaneous modeling of position and amplitude errors, and the typical approaches to computing the generalized solutions in these applications are variational or direct optimization. The ensemble Kalman filter is often employed in large scale nonlinear filtering problems, but its predication on Gaussian statistics causes its estimators to suffer from analysis deflation or collapse, as well as the usual curse of dimensionality in high dimensional Monte Carlo simulations. Moreover, there is no theoretical guarantee of the performance of the ensemble Kalman filter with nonlinear models. Particle filters, which employ importance sampling to focus attention on the important regions of the likelihood, have shown promise in recent studies on controlling the required number of particles. Consider an ensemble forecast of a system with prominent feature information. The correction of displacements in these features, by pushing them into better agreement with observations, is an application of importance sampling, and Monte Carlo methods, including particle filters and possibly the ensemble Kalman filter as well, are well suited to feature displacement correction. In the present work, we show that the ensemble Kalman filter performs well in problems where large features are displaced both in amplitude and position, as long as it is used with a statistical model that includes both function height and local position displacement in the model state. In a toy model, we characterize the performance-degrading effect that untracked displacements have on filters when large features are present. We then employ tools from classical physics and fluid dynamics to statistically model displacements by area-preserving coordinate transformations. These maps preserve the area of contours in the displaced function, and, using strain measures from continuum mechanics, we regularize the statistics on these maps to ensure that they model smooth, feature-preserving displacements. The position correction techniques are incorporated into the statistical model, and this modified ensemble Kalman filter is tested on a system of vortices driven by a stochastically forced barotropic vorticity equation. We find that when the position correction term is included in the statistical model, the modified filter provides estimates that exhibit a substantial reduction in analysis error variance, using a much smaller ensemble than is required when the position correction term is removed from the model.
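The augmented-state idea can be illustrated with a toy example in which the state holds a feature's amplitude and position, so a single ensemble update corrects both height and displacement; the Gaussian-bump forward map and all numbers below are illustrative assumptions, not the regularized area-preserving maps developed in the dissertation.

```python
import numpy as np

x = np.linspace(-5.0, 5.0, 101)
bump = lambda a, s: a * np.exp(-((x - s) ** 2))   # feature: height a, shift s

rng = np.random.default_rng(4)
n_ens = 60
ens = np.vstack([rng.normal(1.0, 0.2, n_ens),     # amplitude ensemble
                 rng.normal(0.0, 0.8, n_ens)])    # position ensemble
Y = np.array([bump(a, s) for a, s in ens.T]).T    # predicted fields, (101, n_ens)
y_obs = bump(1.3, 1.0) + rng.normal(0.0, 0.02, x.size)  # "true" feature + noise

Xp = ens - ens.mean(axis=1, keepdims=True)
Yp = Y - Y.mean(axis=1, keepdims=True)
R = 0.02**2 * np.eye(x.size)
K = (Xp @ Yp.T / (n_ens - 1)) @ np.linalg.pinv(Yp @ Yp.T / (n_ens - 1) + R)
ens_a = ens + K @ (y_obs[:, None] - Y)            # unperturbed-obs sketch
print("posterior mean [a, s]:", ens_a.mean(axis=1).round(2))  # pulled toward [1.3, 1.0]
```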
68

Application of frequency-dependent nudging in biogeochemical modeling and assessment of marine animal tag data for ocean observations

Lagman, Karl Bryan 28 June 2013 (has links)
Numerical models are powerful and widely used tools for environmental prediction; however, any model prediction contains errors due to imperfect model parameterizations, insufficient model resolution, numerical errors, imperfect initial and boundary conditions, and so on. A variety of approaches are applied to quantify, correct, and minimize these errors, including skill assessment, bias correction, and formal data assimilation. All of these require observations and benefit from comprehensive data sets. In this thesis, two aspects related to the quantification and correction of errors in biological ocean models are addressed: (i) a new bias correction method for a biological ocean model is evaluated, and (ii) a novel approach for expanding the set of typically available phytoplankton observations is assessed. The bias correction method, referred to as frequency-dependent nudging, was proposed by Thompson et al. (Ocean Modelling, 2006, 13:109-125) and nudges a model only in prescribed frequency bands. A desirable feature of this method is that it can preserve high-frequency variability that would be damped by conventional nudging. The method is first applied to an idealized signal consisting of a seasonal cycle and high-frequency variability. In this example, frequency-dependent nudging corrected the imposed seasonal bias without affecting the high-frequency variability. The method is then applied to a nonlinear, one-dimensional (1D) biogeochemical ocean model. Results showed that frequency-dependent nudging leads to better biogeochemical estimates than conventional nudging. In order to expand the set of available phytoplankton observations, light measurements from sensors attached to grey seals were assessed to determine whether they provide a useful proxy for phytoplankton biomass. A controlled experiment in Bedford Basin showed that attenuation coefficient estimates derived from the seal tags correlate significantly with chlorophyll. On the Scotian Shelf, the results of the assessment indicate that seal tags can uncover spatio-temporal patterns related to phytoplankton biomass; however, more research is needed to derive absolute biomass estimates in the region.
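A minimal sketch of frequency-dependent nudging on an idealized signal, along the lines of the first experiment above: the observation-minus-model misfit is low-pass filtered before being added back, so a seasonal bias is corrected while high-frequency variability is untouched (the offline FFT filtering here is a simplification of the published formulation, which applies the nudging during time stepping).

```python
import numpy as np

def lowpass(signal, dt, cutoff):
    spec = np.fft.rfft(signal)
    freq = np.fft.rfftfreq(signal.size, dt)
    spec[freq > cutoff] = 0.0                     # zero out high frequencies
    return np.fft.irfft(spec, signal.size)

t = np.arange(0.0, 730.0)                         # two years, daily steps
truth = np.sin(2 * np.pi * t / 365) + 0.3 * np.sin(2 * np.pi * t / 7)
model = truth + 0.5 * np.sin(2 * np.pi * t / 365 + 0.4)  # biased seasonal cycle

misfit = truth - model                            # obs-minus-model difference
nudged = model + lowpass(misfit, dt=1.0, cutoff=1 / 60.0)  # nudge periods > 60 d
print("rms error before:", np.sqrt(np.mean(misfit**2)).round(3),
      " after:", np.sqrt(np.mean((truth - nudged)**2)).round(3))
```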
69

Data Assimilation for Agent-Based Simulation of Smart Environment

Wang, Minghao 18 December 2014 (has links)
Agent-based simulation of smart environments finds application in studying people's movements to inform the design of applications such as energy utilization, HVAC control, and egress strategies in emergency situations. Traditionally, agent-based simulations are not dynamically data driven: they run offline and do not assimilate real sensor data about the environment. As more and more buildings are equipped with various sensors, it becomes possible to use real-time sensor data to inform the simulation. To incorporate real sensor data into the simulation, we introduce the method of data assimilation. The goal of data assimilation is to infer the system state from incomplete, ambiguous, and uncertain sensor data using a computer model. A typical data assimilation framework consists of a computer model, a series of sensors, and a melding scheme. The purpose of this dissertation is to develop a data assimilation framework for agent-based simulation of smart environments. With the developed framework, we demonstrate an application to building occupancy estimation that focuses on position estimation. We build an agent-based model to simulate the occupants' movements in the building and use this model in the data assimilation framework. The melding scheme we use to incorporate sensor data into the model is the particle filter algorithm, a statistical method that approximates the posterior distribution of the underlying system with a set of samples. It has the benefit of making no assumptions about the target distribution and not requiring the target system to be written in analytic form. To overcome the high-dimensional state space problem as the number of agents increases, we develop a new resampling method, called component set resampling, and evaluate its effectiveness in data assimilation. We also developed a graph-based model for simulating building occupancy. The developed model will be used to carry out building occupancy estimation with extremely large numbers of agents in the future.
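A bare-bones particle filter for position estimation, in the spirit of the occupancy application above, is sketched below; the one-dimensional corridor, the motion model, and the sensor noise are hypothetical stand-ins for the dissertation's agent-based model and sensor suite.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500
particles = rng.uniform(0.0, 10.0, n)             # initial position hypotheses

true_pos = 2.0
for step in range(10):
    true_pos += 0.5                               # occupant walks down corridor
    z = true_pos + rng.normal(0.0, 0.4)           # noisy position sensor
    particles += 0.5 + rng.normal(0.0, 0.3, n)    # predict: motion model + noise
    w = np.exp(-0.5 * ((z - particles) / 0.4) ** 2)   # Gaussian likelihood
    w /= w.sum()
    particles = rng.choice(particles, size=n, p=w)    # multinomial resampling

print(f"truth = {true_pos:.2f}, estimate = {particles.mean():.2f}")
```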
70

Hydromagnetic Waves in a Quasi-Geostrophic Model of the Earth's Core

Labbé, François 28 September 2015 (has links)
Variations of the Earth's magnetic field are documented by ground observatories and low-orbiting satellites, on time scales from a year to a century. On such periods, the dynamics of the outer core, where the magnetic field is mainly generated, is strongly influenced by the Earth's rotation, which tends to impose invariance of the flow in the direction parallel to the rotation axis. In this thesis, I study a model based on this hypothesis of a two-dimensional velocity field, the quasi-geostrophic model. I present a new derivation of this model through a variational approach, better suited to the steep slopes at the boundaries of the spherical domain. I present a modal study of hydromagnetic waves that, for the first time, takes into account the impact of a non-zonal imposed magnetic field. Two groups of hydromagnetic waves then appear: magneto-Coriolis waves (centennial) and torsional Alfvén waves (interannual). I describe the evolution of these waves as the effect of rotation is intensified until geophysical parameters are reached. I also discuss to what extent a version of the quasi-geostrophic model in which the Lorentz force is represented by quadratic products of the magnetic field is suited to interpreting three-dimensional numerical dynamo simulations. I observe that, for the parameters accessible to such computations today, the magnetic forces are weak. In the long term, we hope to use the quasi-geostrophic model in the context of satellite data assimilation.
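For orientation, the interannual time scale of the torsional waves follows from the textbook Alfvén wave relations (standard results, not derived in this thesis):

```latex
% Alfven speed for a field B threading fluid of density rho, and the
% resulting dispersion relation and core-crossing time for torsional waves:
\[
  v_A = \frac{B}{\sqrt{\mu_0 \rho}}, \qquad \omega = v_A k, \qquad
  T \sim \frac{r_c}{v_A} = \frac{r_c \sqrt{\mu_0 \rho}}{B}.
\]
```

With a core-interior field of a few millitesla, a fluid density near 10^4 kg/m^3, and a core radius r_c of about 3.5 x 10^6 m, the crossing time T comes out at roughly six years, consistent with the interannual scale quoted above.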
