71

Application of frequency-dependent nudging in biogeochemical modeling and assessment of marine animal tag data for ocean observations

Lagman, Karl Bryan 28 June 2013 (has links)
Numerical models are powerful and widely used tools for environmental prediction; however, any model prediction contains errors due to imperfect model parameterizations, insufficient model resolution, numerical errors, and imperfect initial and boundary conditions. A variety of approaches are applied to quantify, correct, and minimize these errors, including skill assessments, bias correction, and formal data assimilation. All of these require observations and benefit from comprehensive data sets. In this thesis, two aspects related to the quantification and correction of errors in biological ocean models are addressed: (i) a new bias correction method for a biological ocean model is evaluated, and (ii) a novel approach for expanding the set of typically available phytoplankton observations is assessed. The bias correction method, referred to as frequency-dependent nudging, was proposed by Thompson et al. (Ocean Modelling, 2006, 13:109-125) and nudges a model only in prescribed frequency bands. A desirable feature of this method is that it preserves high-frequency variability that would be dampened by conventional nudging. The method is first applied to an idealized signal consisting of a seasonal cycle and high-frequency variability. In this example, frequency-dependent nudging corrected the imposed seasonal bias without affecting the high-frequency variability. The method is then applied to a non-linear, one-dimensional (1D) biogeochemical ocean model, where it yields better biogeochemical estimates than conventional nudging. To expand the set of available phytoplankton observations, light measurements from sensors attached to grey seals were assessed to determine whether they provide a useful proxy of phytoplankton biomass. In a controlled experiment in Bedford Basin, attenuation coefficients estimated from seal-tag light measurements correlated significantly with chlorophyll. On the Scotian Shelf, the assessment indicates that seal tags can uncover spatio-temporal patterns related to phytoplankton biomass; however, more research is needed to derive absolute biomass estimates in the region.
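The central idea of frequency-dependent nudging, relaxing the model toward observations only within a prescribed (here, low-frequency) band, can be sketched for a scalar state. The exponential low-pass filter, the time scales tau_nudge and tau_filter, and the toy seasonal signal below are illustrative assumptions, not the configuration used by Thompson et al. (2006) or in the thesis.

```python
import numpy as np

def nudged_run(model_tendency, obs, dt, tau_nudge, tau_filter):
    """Integrate dx/dt = model_tendency(x, t) with frequency-dependent
    nudging: only the low-pass-filtered part of the model-observation
    misfit is relaxed toward the data (time scale tau_nudge), so the
    model's high-frequency variability is preserved.  Illustrative
    sketch only, not the Thompson et al. (2006) implementation."""
    x = obs[0]
    slow_misfit = 0.0
    alpha = dt / tau_filter            # exponential low-pass filter weight
    out = np.empty(len(obs))
    for k, y in enumerate(obs):
        x += dt * model_tendency(x, k * dt)              # free model step
        slow_misfit += alpha * ((y - x) - slow_misfit)   # slow part of misfit
        x += dt / tau_nudge * slow_misfit                # nudge slow part only
        out[k] = x
    return out

# toy test: model with a 30% weak seasonal cycle plus weekly "weather"
if __name__ == "__main__":
    dt, t = 1.0, np.arange(3 * 365.0)                    # daily steps, 3 years
    truth = np.sin(2 * np.pi * t / 365) + 0.3 * np.sin(2 * np.pi * t / 7)
    obs = truth + 0.05 * np.random.default_rng(0).normal(size=t.size)

    def weak_model(x, tt):                               # biased seasonal tendency
        return ((2 * np.pi / 365) * 0.7 * np.cos(2 * np.pi * tt / 365)
                + (2 * np.pi / 7) * 0.3 * np.cos(2 * np.pi * tt / 7))

    corrected = nudged_run(weak_model, obs, dt, tau_nudge=10.0, tau_filter=90.0)
    print("rms error:", np.sqrt(np.mean((corrected - truth) ** 2)))
```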
72

Data Assimilation for Agent-Based Simulation of Smart Environment

Wang, Minghao 18 December 2014 (has links)
Agent-based simulation of smart environments finds application in studying people's movement to support the design of a variety of applications such as energy utilization, HVAC control, and egress strategies in emergency situations. Traditionally, agent-based simulations are not dynamic-data driven; they run offline and do not assimilate real sensor data about the environment. As more and more buildings are equipped with various sensors, it becomes possible to use real-time sensor data to inform the simulation. To incorporate real sensor data into the simulation, we introduce data assimilation. The goal of data assimilation is to provide inference about the system state based on incomplete, ambiguous, and uncertain sensor data using a computer model. A typical data assimilation framework consists of a computer model, a set of sensors, and a melding scheme. The purpose of this dissertation is to develop a data assimilation framework for agent-based simulation of smart environments. With the developed framework, we demonstrate an application to building occupancy estimation, focusing on position estimation. We build an agent-based model to simulate the occupants' movements in the building and use this model in the data assimilation framework. The melding scheme used to incorporate sensor data into the model is the particle filter algorithm, a set of statistical methods that computes the posterior distribution of the underlying system state using a set of samples. Its benefits are that it makes no assumptions about the target distribution and does not require the target system to be written in analytic form. To overcome the high-dimensional state space that arises as the number of agents increases, we develop a new resampling method, named component set resampling, and evaluate its effectiveness in data assimilation. We also develop a graph-based model for simulating building occupancy, which will be used to carry out building occupancy estimation with extremely large numbers of agents in future work.
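As a point of reference for the melding scheme, a bootstrap particle filter (predict, weight by the sensor likelihood, resample) can be sketched on a hypothetical one-dimensional position-tracking problem. The motion model, sensor model, and noise levels below are assumptions for illustration; they use plain multinomial resampling and do not reproduce the dissertation's building model or the component set resampling method.

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=500,
                              motion_std=1.0, sensor_std=2.0):
    """Bootstrap particle filter: propagate particles with a motion model,
    weight them by the sensor likelihood, then resample.  Hypothetical 1-D
    position-tracking setup for illustration only."""
    rng = np.random.default_rng(0)
    particles = rng.normal(0.0, 5.0, n_particles)      # initial position guess
    estimates = []
    for z in observations:
        # predict: each particle takes a noisy step (simple motion model)
        particles += rng.normal(0.0, motion_std, n_particles)
        # update: weight particles by how well they explain the sensor reading
        weights = np.exp(-0.5 * ((z - particles) / sensor_std) ** 2)
        weights /= weights.sum()
        # resample: draw a new particle set proportional to the weights
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
        estimates.append(particles.mean())              # posterior mean position
    return np.asarray(estimates)

# toy usage: track an agent walking down a corridor from noisy readings
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_path = np.cumsum(rng.normal(0.5, 1.0, 50))     # drifting random walk
    sensor = true_path + rng.normal(0.0, 2.0, 50)       # noisy position sensor
    est = bootstrap_particle_filter(sensor)
    print("mean abs error:", np.abs(est - true_path).mean())
```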
73

Ondes hydro-magnétiques dans un modèle Quasi-géostrophique du noyau terrestre / Hydromagnetic waves in a Quasi-geostrophic model of Earth's core

Labbé, François 28 September 2015 (has links)
Variations of the Earth's magnetic field are documented by ground observatories and low-orbiting satellites on time scales from a year to a century. On such periods, the dynamics of the outer core, where the magnetic field is chiefly generated, is strongly influenced by the Earth's rotation, which tends to impose invariance of the flow in the direction parallel to the rotation axis. In this thesis, I study a model based on this assumption of a two-dimensional velocity field, the quasi-geostrophic model. I present a new derivation of this model through a variational approach, better suited to the steep slopes at the boundaries of the spherical domain. I present a modal study of hydromagnetic waves that takes into account, for the first time, the impact of a non-zonal imposed magnetic field. Two groups of hydromagnetic waves emerge: centennial magneto-Coriolis waves and interannual torsional Alfvén waves. I describe the evolution of these waves as the effect of rotation is intensified until geophysical parameters are reached. I also discuss to what extent a version of the quasi-geostrophic model in which the Lorentz force is represented by quadratic products of the magnetic field is suited to interpreting three-dimensional numerical dynamo simulations. I observe that, for the parameters accessible to such computations today, the magnetic forces are weak. In the longer term, we hope to use the quasi-geostrophic model in the context of satellite data assimilation.
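A quick order-of-magnitude check of why torsional Alfvén waves live on interannual time scales follows from the Alfvén speed V_A = B / sqrt(mu0 * rho). The field strength, density, and core radii below are commonly quoted representative values assumed for illustration, not results from the thesis.

```python
import numpy as np

# Why torsional Alfvén waves are interannual: V_A = B / sqrt(mu0 * rho).
# B ~ 2 mT and rho ~ 1.1e4 kg/m^3 are commonly quoted outer-core values
# (assumed here for illustration only).
mu0 = 4e-7 * np.pi                  # vacuum permeability [T m / A]
B = 2e-3                            # field strength inside the core [T]
rho = 1.1e4                         # outer-core density [kg/m^3]
v_alfven = B / np.sqrt(mu0 * rho)   # ~0.02 m/s

core_gap = 3.48e6 - 1.22e6          # outer-core thickness [m] (CMB - ICB radius)
travel_time_years = core_gap / v_alfven / 3.15e7
print(f"Alfvén speed ~{v_alfven:.3f} m/s, crossing time ~{travel_time_years:.1f} yr")
```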
74

The climate of Mars from assimilations of spacecraft data

Ruan, Tao January 2015 (has links)
The Mars climate has been explored using two reanalysis datasets based on combining spacecraft observations of temperature and dust with the UK version of the LMD Mars GCM. The semiannual oscillation (SAO) of the zonal-mean zonal wind was studied using the existing Mars Analysis Correction Data Assimilation reanalysis during Mars Years (MYs) 24-27. The SAO of the zonal-mean zonal wind was shown to exist and to extend over a wide range of latitudes. The dynamical processes driving the SAO in the tropics were investigated, and the forcing due to meridional advection appeared to be the main contributor. The study also highlighted phenomena associated with perturbations of the global circulation during the MY 25 global dust storm (GDS). The meridional advection term was shown to be weaker in the first half of the GDS year MY 25 than in the following year, but the forcing due to meridional advection and westward thermal tides both appeared to intensify during the MY 25 GDS. The capabilities of the Mars data assimilation system were also extended in this thesis, (1) to represent dynamic dust lifting and dust transport during the assimilation and (2) to assimilate measurements of the dust vertical distribution. The updated reanalysis was then used to study several major dust events during MYs 28-29. It proved able to reproduce a southward-moving regional dust storm without the overwhelming assistance of the assimilation. Dust devil lifting was found to at least partly provide the initial dust pattern of this moving storm. The cold anomaly of the cooling zone beneath this dust storm could be as large as ∼2 K, similar in magnitude to that found during the MY 25 GDS. Using the reanalysis, the life cycle of the planet-encircling global dust storm in MY 28 was also studied. The Noachis dust storm that occurred just before the MY 28 GDS was found to be the joint result of a travelling Chryse storm, enhanced by dust lifting along its path, and local dust lifting in Noachis itself. The adiabatic heating associated with the north polar warming during the MY 28 GDS was up to ∼3 times as large as that found during the non-GDS year MY 29. Wind-stress dust lifting was shown to be strongly correlated with the globally averaged dust loading and decreased significantly as the GDS decayed.
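One common way to isolate the SAO in a zonal-mean zonal wind record is a least-squares fit of annual and semiannual harmonics. The sketch below runs on a synthetic series; the Mars year length in sols is the only physical input, the data are purely illustrative, and the reanalysis fields are not reproduced.

```python
import numpy as np

def harmonic_amplitudes(t_sol, u, year_length=668.6):
    """Least-squares fit of annual and semiannual harmonics to a
    zonal-mean zonal wind time series u(t) (t in sols); returns the
    amplitude of each harmonic.  Illustrative diagnostic only."""
    omega = 2.0 * np.pi / year_length
    design = np.column_stack([
        np.ones_like(t_sol),
        np.cos(omega * t_sol), np.sin(omega * t_sol),          # annual
        np.cos(2 * omega * t_sol), np.sin(2 * omega * t_sol),  # semiannual (SAO)
    ])
    coef, *_ = np.linalg.lstsq(design, u, rcond=None)
    annual = np.hypot(coef[1], coef[2])
    semiannual = np.hypot(coef[3], coef[4])
    return annual, semiannual

# toy usage with a synthetic wind record containing a 15 m/s SAO
if __name__ == "__main__":
    t = np.arange(0.0, 4 * 668.6)                       # four Mars years of sols
    u = 30 * np.cos(2 * np.pi * t / 668.6) + 15 * np.cos(4 * np.pi * t / 668.6)
    u += np.random.default_rng(0).normal(0, 5, t.size)
    print(harmonic_amplitudes(t, u))                     # ~ (30, 15)
```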
75

Utilisation de données cliniques pour la construction de modèles en oncologie / Clinical data used to build models in oncology

Kritter, Thibaut 01 October 2018 (has links)
This thesis deals with the use of clinical data in the construction of models applied to oncology. Existing models that attempt to account for many biological mechanisms of tumor growth have too many parameters and cannot be calibrated on clinical cases. Conversely, overly simple models fail to predict tumor evolution precisely for each patient. The variety and volume of data acquired by clinicians are a source of information that can make model estimates more precise. Through two different projects, we integrated data into the modeling process in order to extract as much information from it as possible. In the first part, imaging and genetic data from glioma patients are combined using machine learning methods. The aim is to distinguish patients who relapse quickly after treatment from those who relapse more slowly. Results show that the resulting stratification is more effective than those currently used by clinicians, which could help physicians adapt treatment in a patient-specific way. In the second part, data are used to correct a simple tumor growth model. Even though this model predicts tumor volume well, its simplicity prevents it from capturing the evolution of shape; yet anticipating the future shape of a tumor can help the clinician better plan surgery. Data assimilation techniques are used to adapt the model and reconstruct the tumor environment responsible for these shape changes. The prediction of the growth of brain metastases is then more accurate.
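To illustrate how sequential data assimilation can correct a simple growth model from sparse volume measurements, the sketch below jointly estimates tumor volume and a growth rate with a stochastic ensemble Kalman filter on an exponential-growth toy model. The model, rates, and error levels are assumptions and do not reproduce the thesis' model or its shape reconstruction.

```python
import numpy as np

def enkf_tumor_tracking(volumes, dt=1.0, n_ens=200):
    """Sequentially correct a toy growth model dV/dt = a * V with observed
    tumor volumes, estimating the state V and growth rate a jointly with a
    stochastic ensemble Kalman filter.  Generic illustration only."""
    rng = np.random.default_rng(0)
    obs_std = 0.05 * volumes[0]                     # assumed measurement error
    # ensemble rows: [volume, growth rate]
    ens = np.column_stack([
        rng.normal(volumes[0], obs_std, n_ens),
        rng.normal(0.05, 0.02, n_ens),
    ])
    estimates = []
    for v_obs in volumes[1:]:
        ens[:, 0] *= np.exp(ens[:, 1] * dt)         # forecast each member
        anomalies = ens - ens.mean(axis=0)
        cov_xy = anomalies.T @ anomalies[:, 0] / (n_ens - 1)
        var_y = cov_xy[0] + obs_std ** 2            # innovation variance
        gain = cov_xy / var_y                       # Kalman gain for [V, a]
        perturbed = v_obs + rng.normal(0, obs_std, n_ens)
        ens += np.outer(perturbed - ens[:, 0], gain)
        estimates.append(ens.mean(axis=0))          # analysed [V, a]
    return np.asarray(estimates)

# toy usage: volumes growing at roughly 8%/day, filter recovers the rate
if __name__ == "__main__":
    days = np.arange(0, 30, 3.0)
    true_v = 2.0 * np.exp(0.08 * days)
    obs = true_v * (1 + 0.05 * np.random.default_rng(1).normal(size=days.size))
    print(enkf_tumor_tracking(obs, dt=3.0)[-1])     # ~ [final volume, 0.08]
```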
76

Um esquema de Assimilação de dados Oceanográficos para o Modelo Oceânico HYCOM ao largo da Costa Sudeste Brasileira / A Data Assimilation Scheme Using The Ocean Model HYCOM For Southeastern Brazilian Bight

Jean Felix de Oliveira 00 December 2009 (has links)
This work presents a data assimilation scheme customized to work with the Hybrid Coordinate Ocean Model (HYCOM) for the Southeastern Brazilian Bight. HYCOM uses hybrid vertical coordinates: z coordinates in the mixed layer, isopycnal coordinates in the stratified deep ocean, and sigma-z coordinates on the continental shelf and in shallow coastal regions. However, since vertical profiles of the main ocean variables, such as temperature, salinity, and density, are observed and distributed in z-coordinates, assimilating these data into HYCOM is not trivial. For this reason, a technique to transform vertical profiles from isopycnal to z-coordinates is proposed here as a way to perform data assimilation in HYCOM. The technique uses Lagrange multipliers together with an optimization process that guarantees conservation of the barotropic mass flux. It is applied together with the data assimilation method proposed by Ezer & Mellor (1997), which uses statistical interpolation and correlations, calculated a priori from the model's output, between sea surface data (temperature, SST, and/or height, SSH) and the subsurface potential temperature and density structure. Numerical experiments showed that the data assimilation scheme is able to reproduce the local ocean circulation efficiently, with the best performance obtained when both SST and SSH are used in the correlations.
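The essence of the Ezer & Mellor (1997) step, projecting surface misfits downward with regression factors precomputed from a free model run, can be sketched as below. The depth-dependent factors, weights, and misfit values are placeholders for illustration, not quantities from the thesis.

```python
import numpy as np

def subsurface_correction(delta_ssh, delta_sst, f_ssh, f_sst,
                          c_ssh=0.5, c_sst=0.5):
    """Project surface misfits (SSH and SST, observation minus model) onto
    subsurface temperature using regression factors F(z) precomputed from a
    free model run, in the spirit of Ezer & Mellor (1997).  The factors and
    weights here are placeholders, not values from the thesis."""
    return c_ssh * f_ssh * delta_ssh + c_sst * f_sst * delta_sst

# toy usage: stronger SST coupling near the surface, SSH reaching deeper
if __name__ == "__main__":
    z = np.linspace(0, -500, 26)                    # depth levels [m]
    f_sst = np.exp(z / 100.0)                       # SST influence, upper ocean
    f_ssh = 0.8 * np.exp(z / 300.0)                 # SSH influence, deeper reach
    dT = subsurface_correction(delta_ssh=0.1, delta_sst=0.5,
                               f_ssh=f_ssh, f_sst=f_sst)
    print(dT[:5])                                   # correction at top levels [degC]
```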
77

Coherent Doppler Lidar for Boundary Layer Studies and Wind Energy

January 2013 (has links)
abstract: This thesis outlines the development of a vector retrieval technique, based on data assimilation, for a coherent Doppler LIDAR (Light Detection and Ranging). A detailed analysis of the Optimal Interpolation (OI) technique for vector retrieval is presented. Through several modifications to the OI technique, it is shown that the modified technique yields a significant improvement in velocity retrieval accuracy. These modifications include changes to innovation covariance partitioning, covariance binning, and the analysis increment calculation. The modified technique makes retrievals with better accuracy, preserves local information better, and compares well with tower measurements. In order to study the error of representativeness and the vector retrieval error, a lidar simulator was constructed and used for a thorough sensitivity analysis of the lidar measurement process and the vector retrieval. The error of representativeness as a function of scales of motion, and the sensitivity of the vector retrieval to look angle, are quantified. Using the modified OI technique, a study of nocturnal flow in Owens Valley, CA, was carried out to identify and understand uncharacteristic events on the night of March 27, 2006. Observations from 1030 UTC to 1230 UTC (0230 to 0430 local time) on March 27, 2006 are presented. Lidar observations show complex and uncharacteristic flows such as sudden bursts of westerly cross-valley wind mixing with the dominant up-valley wind. Model results from the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS®) and other in-situ instrumentation are used to corroborate and complement these observations. The modified OI technique is also used to identify uncharacteristic and extreme flow events at a wind development site. Estimates of turbulence and shear from this technique are compared to tower measurements. A formulation for equivalent wind speed in the presence of variations in wind speed and direction, combined with shear, is developed and used to determine the wind energy content in the presence of turbulence. / Dissertation/Thesis / Ph.D. Mechanical Engineering 2013
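The Optimal Interpolation step at the core of the retrieval is the standard BLUE analysis update; a minimal sketch on a hypothetical two-beam radial-velocity problem is shown below. The beam geometry, covariances, and wind values are assumptions for illustration, and the thesis' modifications (innovation covariance partitioning, covariance binning, modified analysis increments) are not reproduced.

```python
import numpy as np

def oi_analysis(x_b, B, H, R, y):
    """Optimal Interpolation / BLUE analysis step:
    x_a = x_b + B H^T (H B H^T + R)^(-1) (y - H x_b).
    Generic textbook form, not the modified OI of the thesis."""
    innovation = y - H @ x_b
    S = H @ B @ H.T + R                       # innovation covariance
    K = B @ H.T @ np.linalg.inv(S)            # gain matrix
    return x_b + K @ innovation

# toy usage: retrieve a 2-component wind vector from two radial velocities
if __name__ == "__main__":
    x_b = np.array([4.0, 0.0])                              # background (u, v) [m/s]
    B = np.diag([4.0, 4.0])                                 # background error cov
    angles = np.deg2rad([30.0, 60.0])                       # two lidar beam azimuths
    H = np.column_stack([np.sin(angles), np.cos(angles)])   # radial projection
    R = 0.25 * np.eye(2)                                    # measurement error cov
    y = H @ np.array([5.0, 2.0]) + np.array([0.1, -0.1])    # "observed" radials
    print(oi_analysis(x_b, B, H, R, y))                     # ~ [5, 2]
```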
78

Fusing tree-ring and forest inventory data to infer influences on tree growth

Evans, Margaret E. K., Falk, Donald A., Arizpe, Alexis, Swetnam, Tyson L., Babst, Flurin, Holsinger, Kent E. 07 1900 (has links)
Better understanding and prediction of tree growth is important because of the many ecosystem services provided by forests and the uncertainty surrounding how forests will respond to anthropogenic climate change. With the ultimate goal of improving models of forest dynamics, here we construct a statistical model that combines complementary data sources: tree-ring and forest inventory data. A Bayesian hierarchical model was used to gain inference on the effects of many factors on tree growth (individual tree size, climate, biophysical conditions, stand-level competitive environment, tree-level canopy status, and forest management treatments), using both diameter at breast height (dbh) and tree-ring data. The model consists of two multiple regression models, one for each data source, linked via a constant of proportionality between coefficients that appear in parallel in the two regressions. This model was applied to a data set of ~130 increment cores and ~500 repeat measurements of dbh at a single site in the Jemez Mountains of north-central New Mexico, USA. The tree-ring data serve as the only source of information on how annual growth responds to climate variation, whereas both data types inform non-climatic effects on growth. Inferences from the model included positive effects on growth of seasonal precipitation, wetness index, and height ratio, and negative effects of dbh, seasonal temperature, southerly aspect and radiation, and plot basal area. Climatic effects inferred by the model were confirmed by a dendroclimatic analysis. Combining the two data sources substantially reduced uncertainty about non-climate fixed effects on radial increments. This demonstrates that forest inventory data measured on many trees, combined with tree-ring data developed for a small number of trees, can be used to quantify and parse multiple influences on absolute tree growth. We highlight the kinds of research questions that can be addressed by combining the high-resolution information on climate effects contained in tree rings with the rich tree- and stand-level information found in forest inventories, including projection of tree growth under future climate scenarios, carbon accounting, and investigation of management actions aimed at increasing forest resilience.
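The linked-regression structure described above can be written schematically as two normal regressions that share covariate effects up to a proportionality constant. The notation below is ours; the paper's exact parameterization (link functions, random effects, priors) may differ.

```latex
% Schematic form of the linked regressions (illustrative notation only):
\begin{align*}
  y^{\text{ring}}_{it} &\sim \mathcal{N}\!\big(\beta_0 + \mathbf{x}_{it}^{\top}\boldsymbol{\beta},\ \sigma^2_{\text{ring}}\big), \\
  y^{\text{dbh}}_{jt}  &\sim \mathcal{N}\!\big(\gamma_0 + \tilde{\mathbf{x}}_{jt}^{\top}(c\,\boldsymbol{\beta}),\ \sigma^2_{\text{dbh}}\big),
\end{align*}
% x collects the covariates (size, climate, site, competition, canopy status,
% treatment); the dbh design matrix \tilde{x} omits the climate terms, which
% only the annually resolved tree-ring data can inform; c is the constant of
% proportionality linking the coefficients shared by the two regressions.
```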
79

Prévisions des crues en temps réel sur le bassin de la Marne : assimilation in situ pour la correction du modèle hydraulique mono-dimensionnel Mascaret / Operational flood forecasting on the Marne catchment : data assimilation for hydraulic model Mascaret correction

Habert, Johan 06 January 2016 (has links)
Flood forecasting remains a challenge for anticipating floods and ensuring the safety of people and property. In France, this role is carried out by the SCHAPI, which reports to the MEDDE. Water levels and discharges depend strongly on interactions at different scales between rainfall, the geometric characteristics of the river, and the topographic, geological, and soil properties of the watershed. Hydraulic models used for flood forecasting carry uncertainties that need to be quantified and corrected in order to better anticipate the hydrodynamic evolution of the river in real time. The work carried out in this thesis aims to improve the water level and discharge forecasts on the Marne catchment produced by the hydraulic models used operationally for flood forecasting, using data assimilation methods. These forecasts rely on a one-dimensional (1D) hydrodynamic model of the river built with the 1D hydraulic code Mascaret, which solves the Saint-Venant equations, enhanced by an in-situ data assimilation method based on an Extended Kalman Filter (EKF). The thesis consists of five chapters, three dedicated to research and the last two to the operational application. Chapter 1 presents the data and tools used to characterize flood risk in the context of flood forecasting, as well as the Marne Amont Global (MAG) and Marne Moyenne (MM) hydraulic models to which the data assimilation methods developed in this study are applied. Chapter 2 is devoted to methodology: it covers the different sources of uncertainty in hydraulic modeling and presents the EKF-based data assimilation approach applied in this study, through the DAMP prototype, to reduce them. In Chapter 3, this approach is applied to the MAG and MM models in reanalysis mode for a set of past floods on the Marne catchment; two publications are included in this chapter. In Chapter 4, the corrections applied in Chapter 3 are validated by replaying the 1983 flood under operational conditions with the MM model; the quantification of forecast uncertainties and the production of maps of potentially flooded areas are also addressed. The operational application of these data assimilation methods to the MAG and MM models at the SCHAPI at the national level and at the SPC SAMA at the local level is presented in Chapter 5. This thesis is part of a collaborative effort in which each partner contributes its expertise: hydraulic modeling for the LNHE, numerical methods for CERFACS, and operational forecasting for the SCHAPI. Overall, this work demonstrates the benefits and complementarity of estimating model parameters and the hydraulic state by assimilating data into the water levels and discharges forecast by a 1D hydraulic model, an important issue for anticipating hydrological risk. These methods have been integrated into the operational forecasting chains of the SCHAPI and the SPC SAMA.
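An EKF analysis for model parameters (for example friction coefficients or inflow corrections), with the observation operator linearized by finite differences around the current estimate, can be sketched as below. The toy rating-curve-like observation operator, gauge values, and covariances are assumptions for illustration; this is not the DAMP/Mascaret implementation.

```python
import numpy as np

def ekf_parameter_update(theta, P, observe, y, R, eps=1e-4):
    """One Extended Kalman Filter analysis step for model parameters theta,
    with the observation operator linearized by finite differences, as is
    commonly done when the forward model is a black box.  Schematic only."""
    y_b = observe(theta)                              # simulated water levels
    n_p, n_o = len(theta), len(y)
    H = np.empty((n_o, n_p))
    for j in range(n_p):                              # finite-difference Jacobian
        dtheta = np.zeros(n_p)
        dtheta[j] = eps
        H[:, j] = (observe(theta + dtheta) - y_b) / eps
    S = H @ P @ H.T + R                               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                    # Kalman gain
    theta_a = theta + K @ (y - y_b)                   # analysed parameters
    P_a = (np.eye(n_p) - K @ H) @ P
    return theta_a, P_a

# toy usage: recover a friction-like parameter from two gauge levels;
# repeated analyses with the same data simply re-linearize the operator
if __name__ == "__main__":
    true_theta = np.array([0.035])
    observe = lambda th: np.array([10.0 * th[0] ** 0.6, 8.0 * th[0] ** 0.6])
    y = observe(true_theta) + np.array([0.01, -0.01])
    theta, P = np.array([0.050]), np.array([[1e-4]])
    for _ in range(3):
        theta, P = ekf_parameter_update(theta, P, observe, y, np.eye(2) * 1e-4)
    print(theta)                                      # close to the true 0.035
```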
80

Chemical Feedback From Decreasing Carbon Monoxide Emissions

Gaubert, B., Worden, H. M., Arellano, A. F. J., Emmons, L. K., Tilmes, S., Barré, J., Martinez Alonso, S., Vitt, F., Anderson, J. L., Alkemade, F., Houweling, S., Edwards, D. P. 16 October 2017 (has links)
Understanding changes in the burden and growth rate of atmospheric methane (CH4) has been the focus of several recent studies but still lacks scientific consensus. Here we investigate the role of decreasing anthropogenic carbon monoxide (CO) emissions since 2002 on hydroxyl radical (OH) sinks and tropospheric CH4 loss. We quantify this impact by contrasting two model simulations for 2002-2013: (1) a Measurements of Pollution in the Troposphere (MOPITT) CO reanalysis and (2) a control run without CO assimilation. These simulations are performed with the Community Atmosphere Model with Chemistry of the Community Earth System Model, a fully coupled chemistry-climate model, with prescribed CH4 surface concentrations. The assimilation of MOPITT observations constrains the global CO burden, which decreased significantly over this period, by ~20%. We find that this decrease results in (a) an increase in CO chemical production, (b) higher CH4 oxidation by OH, and (c) a ~8% shorter CH4 lifetime. We elucidate this coupling with a surrogate CO-OH-CH4 mechanism quantified from the full-chemistry simulations.
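The sign of the coupling can be illustrated with a zero-dimensional steady-state box model: with less CO, less OH is consumed by the CO + OH reaction, so more OH remains to oxidize CH4 and the CH4 lifetime shortens. The rate constants and abundances below are rough near-surface textbook values assumed for illustration, not the CAM-chem/MOPITT reanalysis numbers, and only the sign and rough size of the relative change are meaningful.

```python
# Toy steady-state box model of the CO-OH-CH4 coupling (illustration only).
k_co, k_ch4 = 2.4e-13, 6.3e-15          # OH reaction rates [cm^3 s^-1]
loss_other = 1.0                         # OH loss to everything else [s^-1]
n_air = 2.5e19                           # air number density [cm^-3]
ch4 = 1.8e-6 * n_air                     # ~1800 ppb methane
p_oh = 5.0e6                             # assumed OH production [cm^-3 s^-1]

def ch4_lifetime(co_ppb):
    co = co_ppb * 1e-9 * n_air
    oh = p_oh / (k_co * co + k_ch4 * ch4 + loss_other)   # steady-state OH
    return 1.0 / (k_ch4 * oh)                            # lifetime [s]

change = ch4_lifetime(80.0) / ch4_lifetime(100.0) - 1.0  # 20% less CO
print(f"CH4 lifetime change: {100 * change:.1f}%")       # a few percent shorter
```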
