  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Um esquema de assimilação de dados oceanográficos para o modelo oceânico HYCOM ao largo da costa sudeste brasileira / A data assimilation scheme using the ocean model HYCOM for the Southeastern Brazilian Bight

Oliveira, Jean Felix de 22 December 2009 (has links)
This work presents a data assimilation scheme customized for the Hybrid Coordinate Ocean Model (HYCOM) in the Southeastern Brazilian Bight. HYCOM uses hybrid vertical coordinates: z coordinates in the mixed layer, isopycnal coordinates in the deep ocean, and sigma-z coordinates on the continental shelf. However, since vertical profiles of the main ocean variables, such as temperature, density and salinity, are observed in z coordinates, assimilating these data into HYCOM is not trivial. For this reason, a technique to transform vertical profiles from isopycnal to z coordinates is proposed here as a way to carry out data assimilation in HYCOM. The technique uses Lagrange multipliers in an optimization process that guarantees conservation of the barotropic mass flux. It is applied together with the data assimilation method proposed by Ezer & Mellor (1997), which uses statistical interpolation and correlations, calculated a priori from the model's output, between sea surface data (temperature, SST, and/or height, SSH) and the subsurface potential temperature and density structures. Numerical experiments showed that the data assimilation scheme reproduces the local ocean circulation efficiently, with the best performance obtained when both SST and SSH are used in the correlations.
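The surface-to-subsurface projection in the Ezer & Mellor (1997) approach can be sketched numerically. The following toy example is purely illustrative (the synthetic statistics, the regression setup, and all numbers are invented, not taken from the thesis): regression factors precomputed from model output project a surface innovation onto a subsurface temperature profile.

```python
import numpy as np

# Hypothetical sketch: precomputed correlations between a surface
# variable (e.g. an SSH anomaly) and the subsurface temperature
# structure project a surface innovation downward.

rng = np.random.default_rng(0)

# Synthetic "model history": surface anomaly and subsurface temperature
# anomalies at 5 depth levels, statistically linked by construction.
n_samples, n_levels = 200, 5
surf = rng.normal(size=n_samples)
true_gain = np.array([1.0, 0.8, 0.5, 0.2, 0.05])     # decays with depth
sub = surf[:, None] * true_gain + 0.1 * rng.normal(size=(n_samples, n_levels))

# "A priori" regression factors F(z) = cov(surf, T(z)) / var(surf),
# computed once from the model's own output.
F = (surf @ sub) / (surf @ surf)

# Assimilation step: the observed-minus-modelled surface value (the
# innovation) is projected onto the subsurface profile.
innovation = 0.6                      # hypothetical surface misfit
correction = F * innovation           # delta T at each depth
print(correction)
```

The correction is strongest near the surface and decays with depth, following the precomputed statistics rather than any dynamical balance.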

Etude et développement d'algorithmes d'assimilation de données variationnelle adaptés aux modèles couplés océan-atmosphère / Study and development of some variational data assimilation methods suitable for ocean-atmosphere coupled models

Pellerej, Rémi 26 March 2018 (has links)
The quality of meteorological forecasts relies mainly on the quality of the model used and of its initial state. This initial state is produced by combining information from the model and the available observations using data assimilation techniques. Historically, forecasting and assimilation have been carried out separately for the atmosphere and the ocean. However, operational centres increasingly develop and use coupled ocean-atmosphere models, and assimilating data in an uncoupled way is not satisfactory for coupled systems: the resulting initial state exhibits flux inconsistencies at the interface between the two media, which lead to forecast errors. There is therefore a strong need to adapt data assimilation methods to coupled systems. This thesis addresses that need and was carried out within the FP7 ERA-Clim2 project, which aims to produce a global reanalysis of the Earth system.
In a first part, we introduce the concepts of data assimilation and coupling, together with the existing methodologies applied to the coupled assimilation problem. Since these methodologies are unsatisfactory in terms of coupling strength or computational cost, we propose alternative methods in a second part. These are based on optimal control theory and differ in the choice of the cost function to minimize, the controlled variables, and the coupling algorithm used. A theoretical study of these algorithms yields a necessary and sufficient convergence criterion in a linear setting. To conclude this second part, the different methods are compared in terms of analysis quality and computational cost using a 1D linear coupled model.
In a third and final part, a 1D non-linear coupled model including physical parametrizations was developed and implemented in OOPS (Object-Oriented Prediction System), a software layer that allows a set of data assimilation algorithms to be run. We then assessed the robustness of our algorithms in a more realistic setting and evaluated their performance against existing methods. Implementing our methods within OOPS should make it straightforward to apply them to realistic operational forecast models in the future. Finally, we outline some prospects for improving these algorithms.
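The idea of coupling two media through the choice of cost function can be illustrated with a toy quadratic problem. This sketch is purely illustrative (the cost, the coupling penalty, and all values are invented; it is not one of the thesis algorithms): a penalty on the interface mismatch couples an atmospheric and an oceanic component during minimization.

```python
import numpy as np

# Toy variational problem: background term, observation term, and a
# coupling penalty that penalises mismatch at the ocean-atmosphere
# interface, minimised by plain gradient descent.

xb = np.array([2.0, 0.0])      # background: [atmosphere, ocean] interface state
y = np.array([1.0])            # single observation of the atmospheric component
H = np.array([[1.0, 0.0]])     # observation operator
alpha = 10.0                   # coupling strength: pulls the two media together

def cost_grad(x):
    d_b = x - xb                            # background misfit
    d_o = H @ x - y                         # observation misfit
    d_c = x[0] - x[1]                       # interface mismatch
    return d_b + H.T @ d_o + alpha * d_c * np.array([1.0, -1.0])

x = xb.copy()
for _ in range(500):
    x = x - 0.05 * cost_grad(x)

# With strong coupling, both components end up close to each other,
# between the background and the observation.
print(x)
```

Because the problem is quadratic, the descent converges to the unique minimizer; with a strong coupling weight the two components agree closely at the interface instead of being analysed independently.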

Assimilation variationnelle de données altimétriques dans le modèle océanique NEMO : exploration de l'effet des non-linéarités dans une configuration simplifiée à haute résolution / Variational altimetric data assimilation in the oceanographic numerical model NEMO : investigation of the impact of nonlinearities in an academic configuration at high resolution

Bouttier, Pierre-Antoine 04 February 2014 (has links)
A current challenge for ocean models is to represent meso- and submesoscale activity faithfully, in order to simulate its crucial contribution to the general ocean circulation and to the ocean energy budget. Pursuing this objective leads to ever higher spatial and temporal resolution in both models and observation networks. At these small scales, however, the flow dynamics become strongly turbulent and non-linear, and current data assimilation (DA) methods, variational ones in particular, are generally much less efficient than in (quasi-)linear contexts. For variational methods such as incremental 4DVAR, non-linearities cause convergence difficulties, since the cost functions to be minimized exhibit multiple local minima.
The purpose of this thesis is to explore the behaviour of variational DA methods in a non-linear ocean model. To this end, we performed a series of "twin" experiments assimilating simulated altimeter data with the characteristics of the Jason-1 and SARAL/AltiKa satellites, analysed from several angles the problems that non-linearities pose for DA, and opened several avenues for improving the efficiency of variational systems in this context.
This work is based on the NEMO ocean modelling system, including the idealized turbulent ocean basin configuration SEABASS at several spatial resolutions. Building on the NEMO-ASSIM research platform, we used and contributed to a set of tools, including an observation operator and the tangent linear and adjoint models of NEMO. The variational DA system used is NEMOVAR.
The results presented attempt to link the characteristic scales of the analysis-error structures to the small-scale activity, using a wide range of diagnostics: spatial and temporal RMSE, cost-function characteristics, validity of the tangent linear hypothesis, projection of error fields on EOFs, and power spectral densities of the analysis-error fields. Our experiments show that incremental 4DVAR controls the analysed trajectory efficiently at 1/4° (eddy-permitting) resolution for long DA windows (2 months). As the resolution increases, the minimization converges more slowly, or not at all under some conditions, although the algorithm still reduces the analysis error acceptably; the 3DFGAT algorithm proves clearly less effective at both resolutions. We also show that matching the spatial scales represented by the simulated circulation and by the altimetric sampling is crucial to obtain the best reduction of the analysis error. Finally, we explored a "progressive" minimization strategy that accelerates the convergence of 4DVAR at high resolution.
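Incremental 4DVAR, discussed above, alternates outer re-linearizations of the nonlinear model with inner quadratic minimizations for an increment. A minimal scalar sketch follows (the one-step model, the background, and the observation are all invented for illustration; the real algorithm works on high-dimensional trajectories):

```python
import math

# Toy scalar incremental 4DVAR: outer loops re-linearise a nonlinear
# one-step model, inner loops solve the resulting quadratic problem
# exactly for the increment dx.

def model(x):              # invented nonlinear forecast model
    return x + 0.1 * math.sin(x)

def tangent(x):            # its tangent linear
    return 1.0 + 0.1 * math.cos(x)

xb = 0.5                   # background initial condition
y = model(1.0)             # observation generated from the "true" x0 = 1.0

x0 = xb
for _ in range(10):                    # outer loop
    d = y - model(x0)                  # innovation at end of window
    M = tangent(x0)                    # linearised model
    # Inner problem: min_dx 0.5*(x0 + dx - xb)**2 + 0.5*(M*dx - d)**2,
    # solved in closed form since it is a scalar quadratic.
    dx = (M * d - (x0 - xb)) / (1.0 + M * M)
    x0 = x0 + dx

print(x0)   # pulled from the background toward the true value 1.0
```

When the model is only mildly nonlinear, as here, the outer loop converges quickly to a stationary point of the full nonlinear cost; the multiple-local-minima difficulty described in the abstract appears when the nonlinearity is much stronger.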

Calage en ligne d'un modèle dynamique de trafic routier pour l'estimation en temps réel des conditions de circulation / Online calibration of a dynamic traffic model for real time estimation of traffic states

Clairais, Aurélien 12 April 2019 (has links)
Traffic models are of paramount importance for understanding and forecasting traffic dynamics, and they provide significant support at every level of traffic management. This thesis focuses on issues related to daily traffic management, addressing four challenges faced by road network managers. Speed refers to the choice of the scale of representation and of the formulation of the flow model; the selected model is the Lagrangian-space LWR model. Reliability concerns accounting for model errors in the estimation of traffic conditions. Reactivity is the capacity of the method to take the observed traffic states into account in real time. Finally, adaptability refers to the capacity of the method's parameters to evolve with the observed traffic situations.
The scientific questions addressed follow these four challenges: integrating uncertainty propagation directly into the flow model; producing operational indicators that convey the reliability of the results; setting up a sequential data assimilation scheme and calibrating the internal conditions of the flow model while accounting for model and observation errors; and, finally, calibrating the flow-model parameters online.
An error-tracking model is developed in which the variables of the flow model are distributed according to Gaussian mixtures, the errors being tracked by a perturbation method designed for the multi-component form of the mixtures. A sensitivity analysis establishes the link between the robustness of the proposed method and the discretization of the network, the number of components in the Gaussian mixture, and the errors on the flow-model parameters. The model produces operational indicators, with their associated errors, that convey the reliability of the estimated traffic conditions.
The sequential assimilation process estimates and forecasts traffic conditions in accordance with the observations when demand and supply are not calibrated. The posterior state is computed with a Bayesian formulation from the prior states and the observations. Two model-update methods were tested; to address the model inconsistencies introduced by substituting the prior states with the posterior states, the update also acts on the vehicles by adding, deleting, advancing, or delaying their passing times. The proposed concepts are validated on a network consisting of a single homogeneous link without discontinuity. When the traffic-flow parameters are not calibrated, data assimilation alone cannot propagate traffic states consistent with the observed situation; the calibration of the flow parameters is addressed in a closing chapter that suggests research directions for this last scientific question.
This work opens both research and operational perspectives. It would be valuable to quantify how much model-centred methods reinforce the usual data-centred methods for real-time estimation and short-term forecasting of traffic conditions, and the methods developed here, together with the suggested research directions, could contribute substantially to daily traffic management tools.
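The Gaussian-mixture error tracking described above rests on Bayesian updates of each mixture component. A hypothetical scalar sketch (the weights, modes, and observation values are invented, not taken from the thesis): each component is updated as in a scalar Kalman filter, and the mixture weights are re-weighted by the evidence each component assigns to the observation.

```python
import math

# Prior over a traffic density [veh/km]: a two-component Gaussian
# mixture representing a free-flow mode and a congested mode.
weights = [0.5, 0.5]
means = [20.0, 80.0]
variances = [25.0, 25.0]

obs, obs_var = 70.0, 16.0      # noisy sensor reading and its variance

post_w, post_m, post_v = [], [], []
for w, m, v in zip(weights, means, variances):
    s = v + obs_var                       # innovation variance
    evidence = math.exp(-0.5 * (obs - m) ** 2 / s) / math.sqrt(2 * math.pi * s)
    k = v / s                             # scalar Kalman gain
    post_w.append(w * evidence)           # unnormalised posterior weight
    post_m.append(m + k * (obs - m))      # component mean update
    post_v.append((1 - k) * v)            # component variance update

total = sum(post_w)
post_w = [w / total for w in post_w]
print(post_w, post_m)
```

A reading near the congested mode collapses nearly all of the posterior weight onto that component, while both component means shift toward the observation.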

Inverse Modelling of Trace Gas Exchange at Canopy and Regional Scales

Styles, Julie Maree January 2003 (has links)
This thesis deals with the estimation of plant-atmosphere trace gas exchange and isotopic discrimination from atmospheric concentration measurements. Two space scales were investigated: canopy and regional. The canopy-scale study combined a Lagrangian model of turbulent dispersal with ecophysiological principles to infer vertical profiles of fluxes of CO2, H2O and heat as well as carbon and oxygen isotope discrimination during CO2 assimilation, from concentration measurements within a forest. The regional-scale model used a convective boundary layer budget approach to infer average regional isotopic discrimination and fluxes of CO2 and sensible and latent heat from the evolution during the day of boundary layer height and mean concentrations of CO2 and H2O, temperature and carbon and oxygen isotope composition of CO2. For the canopy study, concentrations of five scalar quantities, CO2, 13CO2, C18O16O, H2O and temperature, were measured at up to nine heights within and above a mixed fir and spruce forest in central Siberia over several days just after snow melt in May 2000. Eddy covariance measurements of CO2, H2O and heat fluxes were made above the canopy over the same period, providing independent verification of the model flux estimates. Photosynthesis, transpiration, heat exchange and isotope discrimination during CO2 assimilation were modelled for sun and shade leaves throughout the canopy through a combination of inversion of the concentration data and principles of biochemistry, plant physiology and energy balance. In contrast to the more usual inverse modelling concept where fluxes are inferred directly from concentrations, in this study the inversion was used to predict unknown parameters within a process-based model of leaf gas and energy exchange. 
Parameters relating to photosynthetic capacity, stomatal conductance, radiation penetration and turbulence structure were optimised by the inversion to provide the best fit of modelled to measured concentration profiles of the five scalars. Model results showed that carbon isotope discrimination, stomatal conductance and intercellular CO2 concentration were depressed due to the low temperatures experienced during snow melt, oxygen isotope discrimination was positive and consistent with other estimates, radiation penetrated further than simple theoretical predictions because of leaf clumping and penumbra, the turbulence coherence was lower than expected and stability effects were important in the morning and evening. For the regional study, five flights were undertaken over two days in and above the convective boundary layer above a heterogeneous pine forest and bog region in central Siberia. Vertical profiles of CO2 and H2O concentrations, temperature and pressure were obtained during each flight. Air flask samples were taken at various heights for carbon and oxygen isotopic analysis of CO2. Two budget methods were used to estimate regional surface fluxes of CO2 and plant isotopic discrimination against 13CO2 and C18O16O, with the first method also used to infer regional sensible and latent heat fluxes. Flux estimates were compared to ground-based eddy covariance measurements. Model results showed that afternoon estimates for carbon and oxygen isotope discrimination were close to those expected from source water isotopic measurements and theory of isotope discrimination. Estimates for oxygen isotope discrimination for the morning period were considerably different and could be explained by contrasting influences of the two different ecosystem types and non-steady state evaporative enrichment of leaf water.
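The inversion strategy described above, in which parameters of a process model, rather than fluxes, are optimized to fit measured concentration profiles, can be sketched with a one-parameter toy model (the profile shape, the parameter, and the noise level are all invented; this is not the canopy model of the thesis):

```python
import numpy as np

# Illustrative inversion: optimise a parameter of a simple process
# model so that modelled concentrations fit "measured" ones.

heights = np.linspace(1.0, 9.0, 9)        # measurement heights [m]

def modelled_profile(capacity):
    # Toy process model: uptake proportional to "capacity" draws the
    # concentration down, most strongly near the lowest levels.
    return 380.0 - capacity * np.exp(-0.3 * heights)

true_capacity = 12.0
rng = np.random.default_rng(1)
measured = modelled_profile(true_capacity) + 0.2 * rng.normal(size=9)

# Brute-force 1D inversion: pick the parameter whose modelled profile
# best matches the measured concentrations in the least-squares sense.
candidates = np.linspace(0.0, 30.0, 301)
errors = [np.sum((modelled_profile(c) - measured) ** 2) for c in candidates]
best = candidates[int(np.argmin(errors))]
print(best)
```

In the thesis several parameters (photosynthetic capacity, stomatal conductance, radiation penetration, turbulence structure) are optimized jointly against profiles of five scalars; the toy example shows only the principle with one parameter and one profile.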

Model error space and data assimilation in the Mediterranean Sea and nested grids / Espace d'erreur et assimilation de données dans un modèle de la Mer Mediterranée et des grilles gigognes.

Vandenbulcke, Luc 11 June 2007 (has links)
In this work, we implemented the GHER hydrodynamic model in the Gulf of Lions (resolution 1/100°). This model is nested interactively in another model covering the north-western basin of the Mediterranean Sea (resolution 1/20°), itself nested in a model covering the whole basin (1/4°). A data assimilation filter, the SEEK filter, is used to test in which of these grids observations taken in the Gulf of Lions are best assimilated. For this purpose, twin experiments are used: a reference run is considered as the truth, and another run, starting from different initial conditions, assimilates pseudo-observations coming from the reference run. It appeared that, in order to best constrain the coastal model, the available data should be assimilated in that model. The most efficient setup, however, is to group the state vectors of the three grids into a single vector, and hence coherently modify the three domains at once during assimilation cycles. Operational forecasting with nested models often uses only so-called passive nesting: no data feedback happens from the regional models to the global model. We propose a new idea: using data assimilation as a substitute for this feedback. Again using twin experiments, we show that assimilating outputs from the regional model in the global model has beneficial impacts on the subsequent forecasts in the regional model. The data assimilation method used in these experiments corrects model errors using only some privileged directions in state space, and these directions are selected from a previous model run, which is a weakness of the method when real observations are available. We therefore built new directions of the state space using an ensemble run, this time covering only the Mediterranean basin (without grid nesting). This led to a quantitative characterization of the forecast errors to be expected when various parameters and external forcings are affected by uncertainties.
Finally, using these new directions, we tried to build a statistical model intended to emulate the hydrodynamic model using only a fraction of the computing resources needed by the latter. To achieve this goal, we tried out artificial neural networks, nearest-neighbour methods, and regression trees. This study constitutes only a first step toward an innovative statistical model: in its present form, only a few degrees of freedom are considered and the primitive-equation model is still required to build the AL method. We tried forecasting at two different time horizons: one day and one week.
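The idea of correcting model errors only along privileged directions of state space, as in SEEK-type filters mentioned above, can be illustrated schematically. The directions, state, and observations below are synthetic stand-ins (not from the GHER model): the forecast error is constructed to lie in a two-dimensional subspace, and the correction is confined to that subspace.

```python
import numpy as np

# Minimal sketch of an error-subspace correction: the analysis update
# is restricted to the span of a few "error directions" L.

n = 6
rng = np.random.default_rng(2)

# Two orthonormal error directions (stand-ins for EOFs of a model run).
L = np.linalg.qr(rng.normal(size=(n, 2)))[0]

truth = rng.normal(size=n)
forecast = truth + L @ np.array([1.5, -0.8])   # error lies in span(L)

# Observe the full state with small noise, but correct only within span(L).
y = truth + 0.01 * rng.normal(size=n)
coeffs = L.T @ (y - forecast)                  # project the innovation
analysis = forecast + L @ coeffs

err_before = np.linalg.norm(forecast - truth)
err_after = np.linalg.norm(analysis - truth)
print(err_before, err_after)
```

Because the true error was built inside the subspace, the projected correction removes almost all of it; when real errors have components outside the chosen directions, as the abstract notes, those components cannot be corrected.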

Assimilation of trace gas retrievals obtained from satellite (SCIAMACHY), aircraft and ground observations into a regional scale air quality model (CMAQ-DDM/3D)

Kaynak, Burcak 15 September 2009 (has links)
A major opportunity for using satellite observations of tropospheric chemical concentrations is to improve our scientific understanding of atmospheric processes through integrated analysis of satellite, aircraft, and ground-based observations with global and regional scale models. One endpoint of such efforts is to reduce modeling biases and uncertainties. The idea of coupling these observations with a regional scale air quality model was the starting point of this research, whose overall objective was to improve NOₓ emission inventories by integrating observations from different platforms with regional air quality modeling. Specific objectives were: 1) comparison of satellite NO₂ retrievals with NO₂ simulated by the regional air quality model; 2) comparison of tropospheric gas concentrations simulated by the regional air quality model with aircraft and ground-based observations; 3) assessment of the uncertainties in comparing satellite NO₂ retrievals with NOₓ emission estimates and model simulations; 4) identification of biases in emission inventories by data assimilation of satellite NO₂ retrievals and ground-based NO, NO₂ and O₃ observations, using an iterative inverse method with the regional air quality model coupled with sensitivity calculations; and 5) improvement of our understanding of NOₓ emissions, and of the interaction between regional and global air pollution, by an integrated analysis of satellite NO₂ retrievals with the regional air quality model. Along with these objectives, a lightning NOₓ emission inventory was prepared for two months of summer 2004 to account for a significant upper-level NOₓ source. Spatially resolved weekly NO₂ variations from satellite retrievals were compared with estimated NOₓ emissions for different region types, and data assimilation of satellite NO₂ retrievals and ground-based NO, NO₂ and O₃ observations was performed to evaluate the NOₓ emission inventory.
This research contributes to a better understanding of the use of satellite NO₂ retrievals in air quality modeling, and to improvements in NOₓ emission inventories by correcting some of the inconsistencies found in them; it may therefore offer guidance on areas for improvement to groups that develop emission estimates. In addition, it highlights the weaknesses and strengths of the satellite NO₂ retrievals and offers suggestions for improving the quality of the retrievals for further use in tropospheric air pollution research.
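An iterative inverse adjustment of an emission inventory, in the spirit of the assimilation described above, can be sketched as follows. The model response and all numbers are invented (this is not the CMAQ-DDM/3D method): emissions are rescaled by the ratio of observed to modelled columns until the two agree.

```python
# Hypothetical iterative inverse adjustment of an emission estimate.

def modelled_no2(emission):
    # Toy chemistry: columns respond slightly sublinearly to emissions.
    return 2.0 * emission ** 0.9

observed = modelled_no2(50.0)      # columns consistent with true emission 50
emission = 80.0                    # biased prior inventory

for _ in range(20):
    ratio = observed / modelled_no2(emission)
    emission *= ratio              # simple multiplicative correction

print(emission)
```

Because the toy response is sublinear, each iteration overshoots slightly in log space, but the scheme still contracts rapidly toward the emission value consistent with the observations; real inversions replace the ratio step with sensitivity-based updates.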

Tsunami Prediction and Earthquake Parameters Estimation in the Red Sea

Sawlan, Zaid A 12 1900 (has links)
Tsunami concerns have increased worldwide after the 2004 Indian Ocean tsunami and the 2011 Tohoku tsunami, and tsunami models have consequently developed rapidly in the last few years. One of the advanced tsunami models is the GeoClaw model introduced by LeVeque (2011), which is adaptive and consistent. Because of different sources of uncertainty in the model, observations are needed to improve model prediction through a data assimilation framework; the model inputs are the earthquake parameters and the topography. This thesis introduces a real-time tsunami forecasting method that combines the tsunami model with observations using a hybrid ensemble Kalman filter and ensemble Kalman smoother: the filter is used for state prediction, while the smoother estimates the earthquake parameters. This method reduces the error produced by uncertain inputs. In addition, a state-parameter EnKF is implemented to estimate the earthquake parameters. Although the number of observations is small, the estimated parameters generate a better tsunami prediction than the model alone. Methods and results of prediction experiments in the Red Sea are presented, and the prospect of developing an operational tsunami prediction system in the Red Sea is discussed.
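The state-parameter EnKF idea mentioned above, where an unknown parameter is updated through its ensemble cross-covariance with the observed quantities, can be sketched in one dimension. The forward model and all values here are invented (this is not the GeoClaw setup): an ensemble of parameter guesses is corrected by a single wave-height-like observation.

```python
import numpy as np

# Schematic state-parameter EnKF step: the parameter is updated via
# the ensemble cross-covariance between parameter and observed state.

rng = np.random.default_rng(3)
n_ens = 200
true_param = 4.0                       # e.g. an earthquake slip scale

def forward(param):
    # Toy forward model: observed wave height depends on the parameter.
    return 1.5 * param

# Ensemble of parameter guesses and the corresponding model observations.
params = rng.normal(5.0, 1.0, n_ens)
obs_ens = forward(params) + 0.1 * rng.normal(size=n_ens)

y = forward(true_param)                # a single observation
obs_err = 0.1

# Kalman update of the parameter through the cross-covariance.
cov_py = np.cov(params, obs_ens)[0, 1]
var_y = np.var(obs_ens, ddof=1) + obs_err ** 2
gain = cov_py / var_y
params_updated = params + gain * (y + obs_err * rng.normal(size=n_ens) - obs_ens)

print(params_updated.mean())
```

Even with one observation, the biased prior ensemble (centred at 5.0) is pulled close to the true parameter value, and the ensemble spread shrinks, mirroring the abstract's point that a few observations can substantially improve the parameter estimate.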

Constraining 3D Petroleum Reservoir Models to Petrophysical Data, Local Temperature Observations, and Gridded Seismic Attributes with the Ensemble Kalman Filter (EnKF)

Zagayevskiy, Yevgeniy Unknown Date
No description available.

Use of Temperature data for assisted history matching and characterization of SAGD heterogeneous reservoirs within EnKF framework

Panwar, Amit Unknown Date
No description available.
