1

Non-Adjoint Surfactant Flood Optimization of Net Present Value and Incorporation of Optimal Solution Under Geological and Economic Uncertainty

Odi, Uchenna O., December 2009
The advent of smart well technology, which uses downhole sensors to adjust well controls (e.g., injection rate and bottomhole pressure), has made it possible to control a field in all stages of production. This possibility holds great promise for better managing enhanced oil recovery (EOR) processes, especially in terms of applying optimization techniques. However, some procedures for optimizing EOR processes are not based on the physics of the process, which may lead to erroneous results. In addition, optimization of EOR processes can be difficult, and limited, if there is no access to the simulator code for computation of the adjoints used in optimization. This research describes the development of a general procedure for designing an initial starting point for a surfactant flood optimization. The method relies neither on a simulator's adjoint computation nor on external computing of adjoints. The reservoir simulator used for this research was Schlumberger's Eclipse 100, and optimization was accomplished through a program written in Matlab. Utility of the approach is demonstrated by using it to optimize the net present value (NPV) of a 320-acre, 5-spot surfactant flood and incorporating the optimized solution into a probabilistic geological and economic setting. This thesis includes a general procedure for optimizing a surfactant flood and provides groundwork for optimizing other EOR techniques. The research is useful because it takes the optimal solution and calculates a probability of success over the range of possible NPVs. This is very important when assessing risk in a business scenario, because projects with an unknown probability of success are likely to be abandoned as uneconomic. The thesis also illustrates the range of NPVs possible if the optimal solution were used.
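A minimal sketch of the kind of adjoint-free NPV optimization described above, in Python for illustration (the thesis itself drove Eclipse 100 from Matlab): the surrogate simulator, prices, and discount rate below are invented stand-ins, and the optimizer needs only NPV evaluations, never simulator gradients.

```python
import numpy as np
from scipy.optimize import minimize

def simulate_flood(controls):
    """Hypothetical stand-in for a reservoir simulator run (Eclipse 100 in
    the thesis): oil produced per period from surfactant injection levels."""
    return 1e4 * np.sqrt(controls)          # diminishing returns, illustrative

def npv(controls, oil_price=60.0, surf_unit_cost=2e5, rate=0.10):
    """Discounted cash flow of the flood over the control schedule."""
    c = np.clip(controls, 0.0, None)        # no negative injection
    cash = oil_price * simulate_flood(c) - surf_unit_cost * c
    t = np.arange(1, c.size + 1)
    return np.sum(cash / (1.0 + rate) ** t)

# Adjoint-free optimization: Nelder-Mead uses only objective evaluations.
x0 = np.full(5, 1.0)                        # initial injection schedule
res = minimize(lambda c: -npv(c), x0, method="Nelder-Mead")
print("optimized controls:", res.x.round(2), " NPV:", round(-res.fun))
```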
2

Testing a Coupled Global-limited-area Data Assimilation System Using Observations from the 2004 Pacific Typhoon Season

Holt, Christina, August 2011
Tropical cyclone (TC) track and intensity forecasts have improved in recent years due to increased model resolution, improved data assimilation, and the rapid increase in the number of routinely assimilated observations over oceans. The data assimilation approach that has received the most attention in recent years is Ensemble Kalman Filtering (EnKF). The most attractive feature of the EnKF is that it uses a fully flow-dependent estimate of the error statistics, which can have important benefits for the analysis of rapidly developing TCs. We implement the Local Ensemble Transform Kalman Filter (LETKF) algorithm, a variation of the EnKF, on a reduced-resolution version of the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) model and the NCEP Regional Spectral Model (RSM) to build a coupled global-limited-area analysis/forecast system. This is the first time, to our knowledge, that such a system has been used for the analysis and forecasting of tropical cyclones. We use data from summer 2004 to study eight tropical cyclones in the Northwest Pacific. The benchmark data sets we use to assess the performance of our system are the NCEP Reanalysis and the NCEP operational GFS analyses from 2004. These benchmark analyses were both obtained with the Spectral Statistical Interpolation, which was the operational data assimilation system of NCEP in 2004. The operational GFS analysis assimilated a large number of satellite radiance observations in addition to the observations assimilated in our system. All analyses are verified against the Joint Typhoon Warning Center Best Track data set, with errors calculated for the position and intensity of the TCs. The global component of the ensemble-based system shows improvement in position analysis over the NCEP Reanalysis, but shows no significant difference from the NCEP operational analysis for most of the storm tracks. The regional component of our system improves position analysis over all the global analyses. The intensity analyses, measured by the minimum sea level pressure, are of similar quality in all of the analyses. Regional deterministic forecasts started from our analyses are generally not significantly different from those started from the GFS operational analysis. On average, the regional experiments produced better sea level pressure forecasts beyond 48 h, while the global forecasts predicted position better beyond 48 h.
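The flow-dependent error statistics highlighted above come from estimating the background covariance directly from the forecast ensemble at each analysis time. A minimal perturbed-observation EnKF analysis step (a toy sketch with invented dimensions, not the LETKF implementation used in the thesis) looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 100, 10, 20           # state size, observation count, ensemble size

Xf = rng.normal(size=(n, k))    # forecast ensemble (columns are members)
H = np.eye(m, n)                # observe the first m state components
R = 0.5 * np.eye(m)             # observation-error covariance
y = rng.normal(size=m)          # observations

# Flow-dependent background covariance, estimated from the ensemble itself.
Xp = Xf - Xf.mean(axis=1, keepdims=True)
Pf = Xp @ Xp.T / (k - 1)

# Kalman gain, then a perturbed-observation update of every member.
K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
Yp = y[:, None] + rng.multivariate_normal(np.zeros(m), R, size=k).T
Xa = Xf + K @ (Yp - H @ Xf)

print("analysis-mean first entries:", Xa.mean(axis=1)[:3])
```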
3

Advancing Sequential Data Assimilation Methods for Enhanced Hydrologic Forecasting in Semi-Urban Watersheds

Leach, James, January 2019
Accurate hydrologic forecasting is vital for proper water resource management. Practices that depend on these forecasts include power generation, reservoir management, agricultural water use, and flood early warning systems. Despite these needs, the models in wide use are simplifications of the real world and are therefore imperfect. Forecasters face other challenges in addition to model uncertainty, including imperfect observations used for model calibration and validation, imperfect meteorological forecasts, and the difficulty of effectively communicating forecast results to decision-makers. Bayesian methods are commonly used to address some of these issues, and this thesis focuses on improving methods for recursive Bayesian estimation, more commonly known as data assimilation. Data assimilation is a means of optimally accounting for the uncertainties in observations, models, and forcing data. In the literature, data assimilation for urban hydrologic and flood forecasting is rare; the main areas of study in this thesis are therefore urban and semi-urban watersheds, where improvements to data assimilation methods can enhance both hydrologic and flood forecasting. This work explored the use of alternative data products as a type of observation that can be assimilated to improve hydrologic forecasting in an urban watershed. The impact of impervious surfaces in urban and semi-urban watersheds was also evaluated with regard to remotely sensed soil moisture assimilation. Lack of observations is another issue in data assimilation, particularly in semi- or fully-distributed models; to address this, an improved method was developed for updating locations that have no observations, using information theory's mutual information (see the sketch below). Finally, we explored extending data assimilation into the short-term forecast by using prior knowledge of how a model will respond to forecasted forcing data. Results from this work show that alternative data products, such as those from the Snow Data Assimilation System or the Soil Moisture and Ocean Salinity mission, can be effective at improving hydrologic forecasting in urban watersheds. They were also effective at identifying a limiting imperviousness threshold for soil moisture assimilation in urban and semi-urban watersheds. Additionally, including mutual information between gauged and ungauged locations in a semi-distributed hydrologic model provided better state updates. Finally, extending data assimilation into the short-term forecast substantially improved the reliability of the forecasts. / Dissertation / Doctor of Philosophy (PhD) / The ability to accurately model hydrological systems is essential, as it allows for better planning and decision making in water resources management. The better we can forecast the hydrologic response to rain and snowmelt events, the better we can plan and manage our water resources. This includes better planning and usage of water for agricultural purposes, better planning and management of reservoirs for power generation, and better preparation for flood events. Unfortunately, the hydrologic models in common use are simplifications of the real world and are therefore imperfect. Additionally, our measurements of the physical system's responses to atmospheric forcing can be prone to both systematic and random errors that need to be accounted for.
To address these limitations, data assimilation can be used to improve hydrologic forecasts by optimally accounting for both model and observation uncertainties. The work in this thesis helps to further advance and improve data assimilation, with a focus on enhancing hydrologic forecasting in urban and semi-urban watersheds. The research presented herein can be used to provide better forecasts, which allow for better planning and decision making.
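The mutual-information criterion mentioned in the abstract is not spelled out there; under a bivariate-Gaussian assumption it reduces to a function of the ensemble correlation between a gauged and an ungauged location, as in this hypothetical sketch:

```python
import numpy as np

def gaussian_mutual_info(x, y):
    """MI (nats) between two ensembles under a bivariate-Gaussian
    assumption: I(X;Y) = -0.5 * ln(1 - rho**2)."""
    rho = np.corrcoef(x, y)[0, 1]
    return -0.5 * np.log(1.0 - rho**2)

rng = np.random.default_rng(1)
gauged = rng.normal(size=500)                         # ensemble at a gauged reach
ungauged = 0.8 * gauged + 0.6 * rng.normal(size=500)  # correlated ungauged reach

# A state update at the ungauged location would be weighted by (or gated on)
# how informative the gauged observation is about that location.
print(f"estimated MI: {gaussian_mutual_info(gauged, ungauged):.3f} nats")
```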
4

Vers une assimilation des données de déformation en volcanologie / Towards assimilation of deformation measurements in volcanology

Bato, Mary Grace, 02 July 2018
Tracking magma emplacement at shallow depth as well as its migration towards the Earth's surface is crucial to forecast volcanic eruptions. With the recent advances in Interferometric Synthetic Aperture Radar (InSAR) imaging and the increasing number of continuous Global Navigation Satellite System (GNSS) networks recorded on volcanoes, it is now possible to provide a continuous and spatially extensive picture of the evolution of surface displacements during inter-eruptive periods. For basaltic volcanoes, these measurements combined with simple dynamical models can be exploited to characterise and to constrain magma pressure building within one or several magma reservoirs, allowing better predictive information on the emplacement of magma at shallow depths. Data assimilation—a sequential time-forward process that best combines models and observations, sometimes with a priori information based on error statistics, to predict the state of a dynamical system—has recently gained popularity in various fields of geoscience (e.g. ocean-weather forecasting, geomagnetism and natural resources exploration). In this dissertation, I present the very first application of data assimilation in volcanology, from synthetic tests to the analysis of real geodetic data. The first part of this work focuses on the development of strategies to test the applicability and assess the potential of data assimilation—in particular, the Ensemble Kalman Filter (EnKF)—using a simple two-chamber dynamical model (Reverso et al., 2014) and artificial geodetic data. Synthetic tests are performed in order to: 1) track the magma pressure evolution at depth and reconstruct the synthetic ground surface displacements, as well as estimate non-evolving uncertain model parameters, 2) properly assimilate GNSS and InSAR data, and 3) highlight the strengths and weaknesses of the EnKF in comparison with a Bayesian-based inversion technique (e.g. Markov chain Monte Carlo). Results show that the EnKF works well in the synthetic cases and that there is great potential in utilising data assimilation for real-time monitoring of volcanic unrest. The second part applies the strategy developed through the synthetic tests to forecast the rupture of a magma chamber in real time, exploring the 2004–2011 inter-eruptive dataset at Grímsvötn volcano in Iceland. Here, we introduced the concept of "eruption zones" based on the evaluation of the probability of eruption at each time step, estimated as the percentage of model ensemble members that exceeded their failure overpressure values, initially assigned following a given distribution. Our results show that when 25 +/- 1% of the model ensemble exceeded the failure overpressure, an actual eruption was imminent.
Furthermore, in this chapter, we extend the previous synthetic tests by further enhancing the EnKF strategy for assimilating geodetic data in order to adapt it to real-world problems, such as the limited amount of geodetic data available to monitor ice-covered active volcanoes. Common diagnostic tools in data assimilation are also presented. Finally, I demonstrate that, in addition to its interest for predicting volcanic eruptions, sequential assimilation of geodetic data based on the EnKF shows a unique potential to give insights into the deep feeding system of a volcano. Using the two-reservoir dynamical model for Grímsvötn's plumbing system and assuming a fixed geometry and constant magma properties, we retrieve the temporal evolution of the basal magma inflow beneath Grímsvötn, which drops by up to 85% during the 10 months preceding the initiation of the Bárdarbunga rifting event. The loss of at least 0.016 km³ in the magma supply of Grímsvötn is interpreted as a consequence of magma accumulation beneath Bárdarbunga and the subsequent feeding of the Holuhraun eruption 41 km away.
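The probability of eruption used above is simply the fraction of ensemble members whose modelled chamber overpressure exceeds the failure threshold drawn for that member; a minimal sketch with invented pressures and thresholds:

```python
import numpy as np

rng = np.random.default_rng(2)
k = 200                                   # ensemble size

# Each member carries its own critical overpressure, drawn once from a
# prior distribution (values in MPa, purely illustrative).
critical = rng.normal(loc=10.0, scale=1.0, size=k)

def eruption_probability(overpressure):
    """Fraction of members whose modelled overpressure exceeds failure."""
    return np.mean(overpressure > critical)

# As assimilation drives the pressure estimate upward over time, the
# exceedance fraction rises; ~25% signalled an imminent eruption here.
for step, p in enumerate([8.0, 9.0, 9.5, 9.8]):
    members = rng.normal(loc=p, scale=0.5, size=k)
    print(f"step {step}: P(eruption) = {eruption_probability(members):.2f}")
```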
5

Bayesian methods for inverse problems

Lian, Duan, January 2013
This thesis describes two novel Bayesian methods, the Iterative Ensemble Square Root Filter (IEnSRF) and the Warp Ensemble Square Root Filter (WEnSRF), for solving the barcode detection problem, the deconvolution problem in well testing, and the history matching problem for facies patterns. For the barcode detection problem, at the expense of overestimating the posterior uncertainty, the IEnSRF efficiently achieves successful detections on very challenging real barcode images that the other methods considered, including commercial software, fail to detect. It also performs reliable detection on low-resolution images under poor ambient light conditions. For the deconvolution problem in well testing, the IEnSRF is capable of quantifying estimation uncertainty, incorporating cumulative production data, and estimating the initial pressure, which were thought to be unachievable in the existing well-testing literature. The estimation results for the real benchmark data considered, obtained using the IEnSRF, significantly outperform the existing methods in commercial software. The WEnSRF is utilised for solving the history matching problem for facies patterns. Through the warping transformation, the WEnSRF adjusts the reservoir features directly and is thus superior in estimating large-scale complicated facies patterns. It is able to provide accurate estimates of the reservoir properties robustly and efficiently, given reasonably reliable prior information on the reservoir structure.
6

[en] HYBRID METHOD BASED ON KALMAN FILTER AND DEEP GENERATIVE MODELS FOR HISTORY MATCHING AND UNCERTAINTY QUANTIFICATION OF GEOLOGICAL FACIES MODELS / [pt] MÉTODO HÍBRIDO BASEADO EM FILTRO DE KALMAN E MODELOS GENERATIVOS DE APRENDIZAGEM PROFUNDA NO AJUSTE DE HISTÓRICO SOB INCERTEZAS PARA MODELOS DE FÁCIES GEOLÓGICAS

Arauco Canchumuni, Smith Washington, 25 March 2019
[en] Kalman filter-based methods have had remarkable success in the oil industry in recent years, especially in solving real-life history matching problems. However, because the formulation of these methods rests on assumptions of Gaussianity and linearity, their performance is severely degraded when the a priori geology is described in terms of complex distributions (e.g., facies models). The current trend in solutions for the history matching problem is to take into account more realistic reservoir models with complex geology. Geological facies modeling thus plays an important role in reservoir characterization, as a way of reproducing important patterns of heterogeneity and of facilitating the modeling of the petrophysical properties of the reservoir rocks. This thesis introduces a new methodology to perform history matching of complex geological models. The methodology integrates Kalman filter-based methods, particularly the method known in the literature as Ensemble Smoother with Multiple Data Assimilation (ES-MDA), with a parameterization of the geological facies through deep learning techniques in autoencoder-type architectures. An autoencoder always consists of two parts, the encoder (recognition model) and the decoder (generator model). The procedure begins with the training of a set of facies realizations via deep generative models, through which the main characteristics of the geological facies images are identified, allowing the creation of new realizations with the same characteristics as the training base from a low-dimensional parameterization of the facies models at the output of the encoder. This parameterization is regularized at the encoder to provide a Gaussian distribution at the output, which is then used to update the models according to the observed reservoir data through the ES-MDA method. In the end, the updated models are reconstructed through deep learning (the decoder), with the objective of obtaining final models that present characteristics similar to those of the training base (see the sketch below). The results, in three case studies with 2 and 3 facies, show that the parameterization of facies models based on deep learning can reconstruct facies models with an error lower than 0.3 percent. The proposed methodology generates history-matched geological models that preserve the a priori geological description of the reservoir (facies with curvilinear channels), while remaining consistent with the observed reservoir data.
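A rough sketch of that coupling, with ES-MDA applied to the Gaussian latent vector rather than to the facies grid itself, is given below; `decode` and the forward model `g` are toy stand-ins for the trained decoder and the reservoir simulator, and all dimensions are invented:

```python
import numpy as np

rng = np.random.default_rng(3)
nz, nd, k, Na = 20, 8, 100, 4      # latent dim, data dim, ensemble size, passes

def decode(z):                      # stand-in for the trained decoder
    return np.tanh(z)               # maps latent vector to a "facies" field

def g(z):                           # stand-in for simulator + observation map
    return decode(z)[:nd] ** 2

d_obs = g(rng.normal(size=nz)) + 0.01 * rng.normal(size=nd)   # synthetic data
Cd = 1e-4 * np.eye(nd)              # observation-error covariance

Z = rng.normal(size=(nz, k))        # Gaussian latent ensemble (encoder output)
for _ in range(Na):
    alpha = Na                      # uniform inflation: the 1/alpha terms sum to 1
    D = np.stack([g(Z[:, j]) for j in range(k)], axis=1)
    Zp = Z - Z.mean(axis=1, keepdims=True)
    Dp = D - D.mean(axis=1, keepdims=True)
    Czd = Zp @ Dp.T / (k - 1)       # latent/data cross-covariance
    Cdd = Dp @ Dp.T / (k - 1)       # data auto-covariance
    E = rng.multivariate_normal(np.zeros(nd), alpha * Cd, size=k).T
    Z = Z + Czd @ np.linalg.solve(Cdd + alpha * Cd, d_obs[:, None] + E - D)

facies = decode(Z)                  # updated ensemble mapped back to facies
print("data mismatch of first member:", np.linalg.norm(g(Z[:, 0]) - d_obs))
```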
7

Assimilation of snow covered area into a hydrologic model

Hreinsson, Einar Örn, January 2008
Accurate knowledge of the water content of seasonal snow can be helpful for water resource management. In this study, a distributed temperature-index snow model, forced by temperature and precipitation data, is used to estimate snow storage in the Jollie catchment, approximately 20 km east of the main divide of the central Southern Alps, New Zealand. The main objective is to apply a frequently used assimilation method, the ensemble Kalman square root filter, to assimilate remotely sensed snow-covered area into the model and to evaluate the impact of this approach on simulations of snow water equivalent. A 250 m resolution remotely sensed product from the Moderate Resolution Imaging Spectroradiometer (MODIS), specifically tuned to the study location, was used. Temperature and precipitation were given on a 0.055° latitude/longitude grid. Precipitation was perturbed as input to the model, generating 100 ensemble members, which represented the model error. Only observations of snow-covered area with less than 25% cloud cover classification were used in the assimilation process. The error in the snow-covered area observations was assumed to be 0.1 for a cloud-free pixel and to grow linearly with cloud cover fraction up to 1 for a totally cloud-covered pixel (see the sketch below). As the model was not calibrated, two withholding experiments were conducted, in which observations withheld from the assimilation process were compared to the results. Two model states were updated in the assimilation: the total snow accumulation state variable and the total snowmelt state variable. The results of this study indicate that the model underestimates snow storage at the end of winter and/or does not detect snowfall events during the ablation period. The assimilation method affected simulated snow-covered area and snow storage only during the ablation period, which corresponded to a higher correlation between modelled snow-covered area and the updated state variables. The withholding experiments show good agreement between observations and simulated snow-covered area. This study successfully applied the ensemble Kalman square root filter and showed its applicability for New Zealand conditions.
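The observation-error rule stated above (0.1 for a cloud-free pixel, growing linearly to 1 for full cloud cover, with pixels at or above 25% cloud cover rejected) can be written directly; the function name is illustrative:

```python
def sca_obs_error(cloud_fraction):
    """Error for a remotely sensed snow-covered-area pixel: 0.1 when
    cloud-free, growing linearly to 1.0 when fully cloud-covered."""
    if cloud_fraction >= 0.25:
        return None                 # pixel rejected: too cloudy to assimilate
    return 0.1 + 0.9 * cloud_fraction

print([sca_obs_error(c) for c in (0.0, 0.1, 0.2, 0.3)])
```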
8

Structural and shape reconstruction using inverse problems and machine learning techniques with application to hydrocarbon reservoirs

Etienam, Clement, January 2019
This thesis introduces novel ideas in subsurface reservoir model calibration, known as history matching in the reservoir engineering community. The target of history matching is to match historical pressure and production data from the producing wells with the output of the reservoir simulator, for the sole purpose of reducing the uncertainty of such models and improving confidence in production forecasts. Ensemble-based methods such as the Ensemble Kalman Filter (EnKF) and the Ensemble Smoother with Multiple Data Assimilation (ES-MDA) have been proposed for history matching in the literature. EnKF/ES-MDA are Monte Carlo ensemble filters in which the covariance is represented about the ensemble mean rather than the uncertain true model. EnKF/ES-MDA requires no gradient calculations: the ensemble mean provides the best estimate, and the ensemble itself estimates the probability density. However, because of the inherent assumptions of linearity and Gaussianity of the petrophysical property distributions, EnKF/ES-MDA does not provide an acceptable history match and characterisation of uncertainty when tasked with calibrating reservoir models with channel-like structures. One of the novel methods introduced in this thesis combines successive parameter and shape reconstruction using level set functions (EnKF/ES-MDA-level set), in which the indicator functions of the spatial permeability fields are transformed into signed distances (see the sketch below). These signed distance functions, better suited to the Gaussian requirement of EnKF/ES-MDA, are then updated during the EnKF/ES-MDA inversion. The method outperforms standard EnKF/ES-MDA in retaining the geological realism of channels during and after history matching, and also yields a lower root-mean-square (RMS) misfit than standard EnKF/ES-MDA. To improve on the petrophysical reconstruction attained with the EnKF/ES-MDA-level set technique, a novel parameterisation incorporating an unsupervised machine learning method for the recovery of the permeability and porosity fields is developed. The permeability and porosity fields are posed as a sparse field recovery problem, and a novel SELE (Sparsity-Ensemble optimisation-Level-set Ensemble optimisation) approach is proposed for the history matching. In SELE, some realisations are learned using K-SVD (a singular-value-decomposition generalisation of K-means clustering) to generate an overcomplete codebook or dictionary. This dictionary is combined with Orthogonal Matching Pursuit (OMP) to ease the ill-posed nature of the production data inversion, converting the permeability/porosity field into a sparse domain. SELE enforces prior structural information on the model during the history matching and reduces the computational complexity of the Kalman gain matrix, leading to faster attainment of the minimum of the cost function. From the results shown in the thesis, SELE outperforms conventional EnKF/ES-MDA in matching the historical production data, evident in the lower RMS value and a high geological realism/similarity to the true reservoir model.
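The indicator-to-signed-distance transformation at the heart of the level set method can be sketched with standard tools; the toy facies grid below is an invented example, not the thesis test case:

```python
import numpy as np
from scipy import ndimage

def signed_distance(indicator):
    """Signed distance to the facies boundary: positive inside the
    channel facies, negative outside (the zero level set is the edge)."""
    inside = ndimage.distance_transform_edt(indicator)
    outside = ndimage.distance_transform_edt(1 - indicator)
    return inside - outside

# Toy channel-facies indicator: a horizontal band across the grid.
facies = np.zeros((16, 16), dtype=int)
facies[6:10, :] = 1

phi = signed_distance(facies)       # updated by EnKF/ES-MDA instead of facies
recovered = (phi > 0).astype(int)   # thresholding recovers the facies map
assert (recovered == facies).all()
```

Thresholding the updated signed-distance field at zero recovers a crisp facies map, which is what lets the inversion retain channel geometry.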
9

Framework for Calibration of a Traffic State Space Model

Sandin, Mats; Fransson, Magnus, January 2012
To evaluate the traffic state over time and space, several models can be used. A typical model for estimating the state of the traffic on a stretch of road or a road network is the cell transmission model, a form of state space model. This kind of model typically needs to be calibrated, since different roads have different properties. This thesis presents a calibration framework for the velocity-based cell transmission model, the CTM-v. The cell transmission model for velocity is a discrete-time dynamical system that models the evolution of the velocity field on highways. Such a model can be fused with an ensemble Kalman filter update algorithm for the purpose of velocity data assimilation; indeed, enabling velocity data assimilation was the reason the model was developed in the first place, and it is an essential part of the Mobile Millennium research project. A systematic methodology for calibrating the cell transmission model is therefore needed. The framework presented here consists of two separate methods. One is a statistical approach to calibration of the fundamental diagram (see the sketch below); the other is a black-box optimization method, a simplification of the Complex method, that can solve inequality-constrained optimization problems with non-differentiable objective functions. Both methods are integrated with the existing system, yielding a calibration framework for, in particular, highways where stationary detectors are part of the infrastructure. The output produced by the system is highly dependent on the values of its characterising parameters, which need to be calibrated to make the model a valid representation of reality. Model calibration and validation is a process of its own, most often tailored to the researcher's models and purposes. The combination of the two methods is tested in a suite of experiments on two separate highway models, of Interstate 880 and Interstate 15, California, which are evaluated against travel-time and space-mean-speed estimates given by Bluetooth detectors, with errors between 7.4 and 13.4% for the validation time periods, depending on the parameter set and model.
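The statistical calibration of the fundamental diagram is not detailed in the abstract; as an illustration of the idea, a least-squares fit of a Greenshields-type speed-density relation (an assumed form, not necessarily the CTM-v velocity function) to synthetic stationary-detector data:

```python
import numpy as np
from scipy.optimize import curve_fit

def greenshields(density, v_free, rho_jam):
    """Equilibrium speed as a function of density (Greenshields form)."""
    return v_free * (1.0 - density / rho_jam)

# Synthetic stationary-detector samples: density (veh/km) vs speed (km/h).
rng = np.random.default_rng(4)
rho = rng.uniform(5.0, 120.0, size=200)
v = greenshields(rho, 100.0, 160.0) + rng.normal(scale=4.0, size=200)

(v_free, rho_jam), _ = curve_fit(greenshields, rho, v, p0=[80.0, 150.0])
print(f"v_free = {v_free:.1f} km/h, rho_jam = {rho_jam:.1f} veh/km")
```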
10

Ensemble Statistics and Error Covariance of a Rapidly Intensifying Hurricane

Rigney, Matthew C., 16 January 2010
This thesis presents an investigation of ensemble Gaussianity, the effect of non-Gaussianity on covariance structures, storm-centered data assimilation techniques, and the relationship between commonly used data assimilation variables and the underlying dynamics for the case of Hurricane Humberto. Using an Ensemble Kalman Filter (EnKF), a comparison of data assimilation results in storm-centered and Eulerian coordinate systems is made. In addition, the extent of the non-Gaussianity of the model ensemble is investigated and quantified. The effect of this non-Gaussianity on covariance structures, which play an integral role in the EnKF data assimilation scheme, is then explored. Finally, the correlation structures calculated from a Weather Research and Forecasting (WRF) ensemble forecast of several state variables are investigated in order to better understand the dynamics of this rapidly intensifying cyclone. Hurricane Humberto rapidly intensified in the northwestern Gulf of Mexico from a tropical disturbance to a strong category one hurricane with 90 mph winds in 24 hours. Numerical models did not capture the intensification of Humberto well. This could be due in large part to initial condition error, which can be addressed by data assimilation schemes. Because the EnKF is a linear scheme developed on the assumption that the ensemble distribution is normal, non-Gaussianity in the ensemble could affect the EnKF update. Inspection of statistical moments shows that multiple state variables do indeed exhibit significant non-Gaussianity. In addition, storm-centered data assimilation schemes present an alternative to traditional Eulerian schemes by emphasizing the centrality of the cyclone to the assimilation window; this allows for an update that is most effective in the vicinity of the storm center, which is of most concern in mesoscale events such as Humberto. Finally, the effect of non-Gaussian distributions on covariance structures is examined through data transformations of normal distributions: various standard transformations of two Gaussian distributions are made, and the skewness, kurtosis, and correlation between the two distributions are computed before and after each transformation (see the sketch below). A relationship is observed between the change in skewness and kurtosis and the correlation between the distributions. These effects are then taken into consideration as the dynamics contributing to the rapid intensification of Humberto are explored through correlation structures.
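That transformation experiment is easy to reproduce in miniature: draw a correlated Gaussian pair, apply a nonlinear transform (the exponential below is just one example of a standard transformation), and compare moments and correlation before and after:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
cov = [[1.0, 0.8], [0.8, 1.0]]                 # correlated Gaussian pair
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=100_000).T

for name, (a, b) in {"gaussian": (x, y),
                     "exp-transformed": (np.exp(x), np.exp(y))}.items():
    print(name,
          " skew:", round(float(stats.skew(a)), 2),
          " excess kurtosis:", round(float(stats.kurtosis(a)), 2),
          " corr:", round(float(np.corrcoef(a, b)[0, 1]), 2))
```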
