11 |
ESA ExoMars Rover PanCam System Geometric Modeling and Evaluation. Li, Ding. 14 May 2015 (has links)
No description available.
|
12 |
Error Estimations in the Design of a Terrain Measurement System. Rainey, Cameron Scott. 22 March 2013 (has links)
Terrain surface measurement is an important tool in vehicle design work as well as pavement classification and health monitoring. Non-deformable terrain is the primary excitation to vehicles traveling over it, and it is therefore important to be able to quantify terrain surfaces. Knowledge of the terrain can be used in combination with vehicle models to predict the force loads vehicles would experience while driving over the terrain surface. This is useful in vehicle design, as it can speed the design process through the use of simulation as opposed to prototype construction and durability testing. Additionally, accurate terrain maps can be used by highway engineers and maintenance personnel to identify deterioration in road surface conditions for immediate correction. Repeated measurements of terrain surfaces over an extended length of time also allow for long-term pavement health monitoring.
Many systems have been designed to measure terrain surfaces; historically most captured single-line profiles, while more modern equipment can capture three-dimensional measurements of the terrain surface. These modern systems are often constructed from a combination of sensors that measure the height of the terrain relative to the measurement system. Additionally, terrain measurement systems are equipped with sensors that locate the system in some global coordinate space and estimate its angular attitude. Since all sensors return estimated values with some uncertainty, combining a group of sensors also combines their uncertainties, resulting in a system that is less precise than any of its individual components. To predict the precision of the system, the individual probability densities of the components must be quantified, in some cases transformed, and finally combined. This thesis provides a proof of concept of how such an evaluation of final precision can be performed. / Master of Science
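The combination of sensor uncertainties described above can be sketched with first-order (delta-method) error propagation. The measurement geometry, variable names, and noise levels below are illustrative assumptions, not the thesis's actual sensor model:

```python
import math

def terrain_height_sigma(r, theta, sigma_z, sigma_r, sigma_theta):
    """First-order propagation for a toy model h = z_gps - r*cos(theta).

    Assumes independent, zero-mean Gaussian sensor errors: GPS height (z),
    laser range (r), and attitude/pitch (theta). Purely illustrative.
    """
    dh_dz = 1.0                      # sensitivity to GPS height error
    dh_dr = -math.cos(theta)         # sensitivity to laser range error
    dh_dtheta = r * math.sin(theta)  # sensitivity to attitude error
    var_h = (dh_dz * sigma_z) ** 2 + (dh_dr * sigma_r) ** 2 \
            + (dh_dtheta * sigma_theta) ** 2
    return math.sqrt(var_h)

# The combined estimate is less precise than any single error source alone:
sigma = terrain_height_sigma(r=2.0, theta=0.5,
                             sigma_z=0.02, sigma_r=0.005, sigma_theta=0.01)
```

Here the combined standard deviation exceeds the largest single contribution (the GPS height error), mirroring the abstract's point that fusing sensors fuses their uncertainties.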
|
13 |
Metodologia para quantificação e acompanhamento de indicadores-chave de desempenho operacional / Methodology for the quantification and monitoring of operational key performance indicators. Giaquinto, Cláudia Daniela Melo. January 2017 (has links)
Key performance indicators (KPIs) play an extremely important role in the process industry, aiding in decision-making. However, to be representative they need to be calculated reliably. The present work proposed a methodology for calculating these KPIs based on steady-state detection, noise removal, error propagation, and sensitivity analysis techniques. The KPIs are presented, following conventions reported in the literature, in a new graphical monitoring tool proposed by the authors, called StatSSCandlePlot. StatSSCandlePlot displays KPIs in the candlestick style widely used in the stock market, enriched with additional information. Its key differentiator is that the indicators and their displayed properties are computed with techniques that encompass data treatment and statistical analysis. The proposed methodology was applied in a case study of a shower with two heating modes, gas and electric. For this study, the Bath Quality Index (BQI) was created, an indicator that depends on the outlet temperature and flow rate, whose data were evaluated in three scenarios: in the first, the system is subjected to flow disturbances; in the second, the cold-water temperature drops; and in the last, the BQI was evaluated with the system subjected to flow disturbances under a new plant control strategy. From the StatSSCandlePlot it was possible to identify the indicator's trends in the different scenarios, the steady-state share of each window, and the indicator values to be considered, and, complementarily, to identify through the sensitivity analysis the variable that most influenced the indicator's variation.
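The KPI workflow described — propagate measurement noise into the indicator, then ask which input drives its variation — can be sketched as below. The BQI's functional form is not given in the abstract, so the index definition, setpoints, and noise levels here are stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)

def bqi(T, Q):
    """Illustrative 'bath quality' index penalising deviation from setpoints.
    A stand-in for the thesis's actual (unpublished-here) BQI definition."""
    return 100.0 - 2.0 * np.abs(T - 38.0) - 5.0 * np.abs(Q - 8.0)

# Hypothetical steady-state window of noisy measurements
T = 38.0 + rng.normal(0.0, 0.3, 500)   # temperature, deg C
Q = 8.0 + rng.normal(0.0, 0.1, 500)    # outlet flow, L/min

# Monte Carlo error propagation: distribution of the KPI over the window
k = bqi(T, Q)
mean, std = k.mean(), k.std()

# One-at-a-time sensitivity: which input contributes more KPI variation?
s_T = bqi(T, np.full_like(Q, 8.0)).std()   # temperature noise only
s_Q = bqi(np.full_like(T, 38.0), Q).std()  # flow noise only
```

With these (assumed) noise levels the temperature term dominates; reporting the KPI as a distribution rather than a point value is what allows a candlestick-style display.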
|
14 |
Quantitative performance evaluation of autonomous visual navigation. Tian, Jingduo. January 2017 (has links)
Autonomous visual navigation algorithms for ground mobile robotic systems working in unstructured environments have been extensively studied for decades. Among these works, performance evaluations between different design configurations mainly rely on benchmark datasets with a limited number of real-world trials. Such evaluations, however, struggle to provide sufficient statistical power for performance quantification, and they cannot independently assess an algorithm's robustness to individual realistic uncertainty sources, including environment variations and processing errors. This research presents a quantitative approach to the performance and robustness evaluation and optimisation of autonomous visual navigation algorithms, using large-scale Monte-Carlo analyses. The Monte-Carlo analyses are supported by a simulation environment designed to represent a real-world level of visual information, using perturbations drawn from realistic visual uncertainties and processing errors. With the proposed evaluation method, a stereo-vision-based autonomous visual navigation algorithm is designed and iteratively optimised. The algorithm encodes edge-based 3D patterns into a topological map and uses them for subsequent global localisation and navigation. An evaluation of the performance perturbations from individual uncertainty sources indicates that stereo match error imposes a significant limitation on the current system design. An optimisation approach is therefore proposed to mitigate this error: it maximises the Fisher information available in stereo image pairs by manipulating the stereo geometry. Moreover, the simulation environment is further updated alongside the algorithm design, including the quantitative modelling and simulation of how localisation error affects the subsequent navigation behaviour.
During a long-term Monte-Carlo evaluation and optimisation, the algorithm performance improved significantly. Simulation experiments demonstrate that navigation of a 3-DoF robotic system is achieved in an unstructured environment, with sufficient robustness to realistic visual uncertainty sources and systematic processing errors.
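Why stereo match error dominates can be seen with a small Monte-Carlo experiment on triangulation. The rig parameters and the sub-pixel matching noise below are assumptions standing in for the thesis's simulator, not its actual values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stereo rig: focal length (pixels) and baseline (metres)
f_px, baseline = 700.0, 0.12
sigma_d = 0.3                       # assumed stereo match error, pixels

def depth(d):
    """Triangulated depth from disparity d (pixels)."""
    return f_px * baseline / d

true_Z = 10.0
true_d = f_px * baseline / true_Z   # disparity of a point 10 m away

# Propagate the match error through triangulation by sampling
Z = depth(true_d + rng.normal(0.0, sigma_d, 10000))

bias = Z.mean() - true_Z            # convexity of 1/d gives a small bias
spread = Z.std()                    # grows roughly as Z^2 * sigma_d / (f*b)
```

Because the depth error scales with the square of the distance, distant landmarks dominate the localisation error budget — one plausible reading of why widening the stereo geometry (larger `f_px * baseline`) raises the available Fisher information.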
|
15 |
Calage en ligne d'un modèle dynamique de trafic routier pour l'estimation en temps réel des conditions de circulation / Online calibration of a dynamic traffic model for real time estimation of traffic states. Clairais, Aurélien. 12 April 2019 (has links)
Traffic models are of paramount importance for understanding and forecasting traffic dynamics, and they represent significant support for all stages of traffic management. This thesis focuses on issues related to daily traffic management. For road network managers, four challenges are addressed. Speed refers to the choice of the scale of representation and the formulation of the flow model; the selected model is the Lagrangian-Space LWR model. Reliability concerns the integration of model errors into the traffic-condition estimation process. Reactivity is the capacity of the method to take the prevailing traffic states into account in real time. Finally, versatility refers to the capacity of the method's parameters to evolve with the observed traffic situations. The scientific questions addressed in this work follow from these four challenges. The first is the integration of uncertainty propagation directly into the flow model, followed by the production of operational indicators that account for the reliability of the results. Concerning reactivity, the questions addressed are the design of a sequential, vehicle-index-based data assimilation scheme and the calibration of the model's internal conditions accounting for model and observation errors. Concerning versatility, the associated question is the online calibration of the parameters of the traffic flow model. A model for tracking the errors, assumed to be distributed following Gaussian mixtures, is developed; the error tracking is achieved with an original perturbation method designed for the multi-component formulation of Gaussian mixtures.
A sensitivity analysis is performed to establish the link between the robustness of the proposed method and the discretization of the network, the number of components in the Gaussian mixture, and the errors on the flow model's parameters. The model yields operational indicators, with their associated errors, that account for the reliability of the estimated traffic conditions. The sequential assimilation process makes it possible to estimate and forecast traffic conditions in accordance with the observations when demand and supply are not calibrated. The posterior state is computed with a Bayesian formulation given the prior states and the observations. Two model-update methods were tested; facing the model inconsistencies introduced by simply substituting prior states with posterior states, the update also acts on the vehicles through the addition, deletion, advancing, or delaying of their passing times. The proposed concepts are validated on a network composed of a single homogeneous link without discontinuity. When the traffic flow parameters are not calibrated, data assimilation alone cannot propagate traffic states in accordance with the observed situation. The calibration of the flow parameters is addressed in a closing chapter that suggests research avenues for this last scientific question. This work opens both research and operational perspectives. It would be valuable to quantify the reinforcement that model-centred methods bring to the usual data-centred methods for real-time estimation and short-term forecasting of traffic conditions; furthermore, the developed methods, together with the suggested research avenues, could be a considerable contribution to daily traffic management tools.
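The sequential Bayesian update at the core of such assimilation schemes can be sketched in the scalar Gaussian case. A single Gaussian stands in for the thesis's Gaussian-mixture machinery, and the state, observations, and variances are illustrative:

```python
# Minimal sequential assimilation sketch for a scalar traffic state
# (e.g. a cumulative vehicle count on a link).

def bayes_update(prior_mean, prior_var, obs, obs_var):
    """Posterior of a Gaussian state given a Gaussian observation."""
    gain = prior_var / (prior_var + obs_var)           # Kalman-style gain
    post_mean = prior_mean + gain * (obs - prior_mean)
    post_var = (1.0 - gain) * prior_var
    return post_mean, post_var

# Prior from a (mis-calibrated) flow model, then sequential detector readings
mean, var = 120.0, 25.0
for obs in (131.0, 128.0, 130.0):    # hypothetical observations, var = 9
    mean, var = bayes_update(mean, var, obs, 9.0)
```

Each observation pulls the state toward the measurements and shrinks its variance below both the prior and the observation noise — the "substitution of prior states by posterior states" that, per the abstract, must then be reconciled with the vehicles' passing times.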
|
16 |
Error Propagation and Metamodeling for a Fidelity Tradeoff Capability in Complex Systems Design. McDonald, Robert Alan. 07 July 2006 (has links)
Complex man-made systems are ubiquitous in modern technological society. The national air transportation infrastructure and the aircraft that operate within it, the highways stretching coast-to-coast and the vehicles that travel on them, and global communications networks and the computers that make them possible are all complex systems.
It is impossible to fully validate a systems analysis or a design process. Systems are too large, complex, and expensive for test and validation articles to be built. Furthermore, the operating conditions throughout the life cycle of a system are impossible to predict and control for a validation experiment.
Error is introduced at every point in a complex systems design process. Every error source propagates through the complex system the same way information propagates: feedforward, feedback, and coupling are all present with error.
As with error propagation through a single analysis, error sources grow and decay when propagated through a complex system. These behaviors are further complicated by the interactions of a complete system, and the loss of intuition that accompanies this complication makes proper error propagation calculations all the more important to the decision maker.
Error allocation and fidelity trade decisions answer questions like: Is the fidelity of a complex systems analysis adequate, or is an improvement needed, and how is that improvement best achieved? Where should limited resources be invested for the improvement of fidelity? How does knowledge of the imperfection of a model impact design decisions based on the model and the certainty of the performance of a particular design?
In this research, a fidelity trade environment was conceived, formulated, developed, and demonstrated. This development relied on the advancement of enabling techniques including error propagation, metamodeling, and information management. A notional transport aircraft is modeled in the fidelity trade environment. Using the environment, the designer is able to make design decisions while considering error, and to decide on the required tool fidelity as the design problem evolves. These decisions could not be made in a quantitative manner before the fidelity trade environment was developed.
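How feedback coupling reshapes error propagation can be illustrated with two toy "disciplinary analyses" iterated to convergence. The functions, coefficients, and error source below are invented stand-ins, not the thesis's aircraft model:

```python
import numpy as np

rng = np.random.default_rng(2)

def coupled_system(x, n_iter=50):
    """Fixed-point iteration of two coupled analyses with error input x.
    'Aero' computes drag from weight; 'structures' grows weight with drag."""
    w, d = 1000.0, 0.0
    for _ in range(n_iter):
        d = 0.02 * w + 10.0 * x    # drag depends on weight and the error source
        w = 900.0 + 4.0 * d        # weight feeds back on drag
    return w

# Monte Carlo propagation of a Gaussian error source through the coupling
x = rng.normal(1.0, 0.05, 5000)
w = np.array([coupled_system(xi) for xi in x])

# One pass (no feedback) would give sensitivity dw/dx = 4*10 = 40;
# the converged loop gives 40/0.92, so the coupling amplifies the error.
amplification = w.std() / (10.0 * 4.0 * 0.05)
```

Even this two-equation loop grows the input error by about 9% relative to the uncoupled estimate; in a full system with many coupled analyses, such effects are exactly the ones intuition misses without explicit error propagation.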
|
17 |
Numerical study of error propagation in Monte Carlo depletion simulations. Wyant, Timothy Joseph. 26 June 2012 (has links)
Improving computer technology and the desire to more accurately model the heterogeneity of the nuclear reactor environment have made the use
of Monte Carlo depletion codes more attractive in recent years, and feasible (if not practical) even for 3-D depletion simulation. However, in this case statistical uncertainty is combined with error propagating through the calculation from previous steps. In an effort to understand this error propagation, four test problems were developed to test error propagation in
the fuel assembly and core domains. Three test cases modeled and tracked individual fuel pins in four 17x17 PWR fuel assemblies. A fourth problem
modeled a well-characterized 330 MWe nuclear reactor core. By changing the code's initial random number seed, the data produced by a series of 19 replica runs of each test case were used to investigate the true and apparent variance in k-eff, pin powers, and number densities of several isotopes. While this study does not intend to develop a predictive model for error
propagation, it is hoped that its results can help to identify some common regularities in the behavior of uncertainty in several key parameters.
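The replica-run design — same problem, different initial seeds, then compare seed-to-seed variance against the internally reported one — can be mimicked on a toy tally. The AR(1) autocorrelation below is an assumed stand-in for the step-to-step error propagation in a depletion sequence:

```python
import numpy as np

def replica_run(seed, n=400, rho=0.9):
    """One 'run': an autocorrelated tally history plus its naive variance.
    The AR(1) process is illustrative, not a depletion model."""
    rng = np.random.default_rng(seed)
    x, xs = 0.0, []
    for _ in range(n):
        x = rho * x + rng.normal(0.0, 1.0)   # error carried between steps
        xs.append(x)
    xs = np.array(xs)
    mean = xs.mean()
    apparent_var = xs.var(ddof=1) / n        # assumes i.i.d. samples
    return mean, apparent_var

# 19 replicas differing only in the random number seed, as in the study design
means, apparents = zip(*(replica_run(s) for s in range(19)))
true_var = np.var(means, ddof=1)             # seed-to-seed ('true') variance
apparent_var = np.mean(apparents)            # internally reported variance
underestimation = true_var / apparent_var
```

Because error propagates between steps, the apparent variance badly understates the true run-to-run variance — the kind of regularity the replica comparison is designed to expose.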
|
20 |
Estimation de la biomasse en forêt tropicale humide : propagation des incertitudes dans la modélisation de la distribution spatiale de la biomasse en Guyane Française / Biomass estimation in neotropical forests: uncertainty propagation and spatial modelling with applications in French Guiana. Molto, Quentin. 13 December 2012 (has links)
Tropical forests hold, in their above-ground biomass, a carbon stock that is significant at the global scale. Measuring this stock and understanding its dynamics helps clarify the stakes of its protection, destruction, or modification, as well as its role in the carbon cycle and in global climate mechanisms. Within the economic and ecological debates around climate, forests are progressively being included in international programmes aiming to reduce the anthropogenic impact on climate (Kyoto Protocol, 1998; Copenhagen Accord, 2009), and the integration of forests into carbon-market financial mechanisms ("forest-origin" carbon markets, REDD+) is clearly envisaged in the coming years. Estimating the above-ground biomass of an inventoried forest plot requires the successive use of several models. In these plots, the diameter and species of each tree are recorded. The plots are then used as references for the spatial extrapolation of biomass values; this extrapolation often relies on remote-sensing signals (aircraft, satellites) and can also use environmental covariates (climate, geology, …). The uncertainty of a region's biomass estimate is the combined result of the uncertainties attached to each of these steps, and current knowledge does not allow this uncertainty to be quantified. The objective of the thesis is to develop a biomass estimation methodology that propagates the uncertainties and identifies their sources.
Height model: a new height model for tropical forests is proposed, whose parameters have an ecological meaning and can be predicted from variables describing the stand to which it is applied. It was fitted with data from the Amalin and Bridge projects, using 42 plots across French Guiana in which the diameter and height of every tree were measured. Wood density model: a model was designed to associate each species with a wood density distribution rather than a fixed value, using data from the Bridge project, in which wood density was measured on 2504 trees representing 466 species. Biomass model: a sensitivity analysis showed that the uncertainties of the height and wood-density predictions contribute only a negligible share of the uncertainty of the biomass predictions, so improving the precision of these models is not a priority; the height model can, however, be a source of bias. Spatial modelling: the models were applied to several plot networks covering the Guianese coastal zone well (ONF pulpwood inventories, the Guyafor network, the Amalin project, …). The biomass predictions for these plots feed a spatial model, yielding a biomass map of French Guiana together with a map of the uncertainty of this estimate. / Applied to the neo-tropical forest of French Guiana (South America, Guiana Shield), this work develops models and methods to estimate the biomass of a region while propagating the uncertainties.
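The tree-to-plot propagation can be sketched with a Monte Carlo loop over a power-law allometry of the form AGB = a(ρD²H)^b. The coefficients, the species-level density distribution, the height model, and the error magnitudes below are illustrative assumptions, not the thesis's fitted values:

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed pantropical-style allometry coefficients (illustrative)
a, b = 0.0673, 0.976

def plot_agb_samples(D_cm, n=2000):
    """Sample plot biomass (Mg) propagating wood-density and height errors."""
    D = np.asarray(D_cm)
    n_trees = len(D)
    total = np.zeros(n)
    for i in range(n):
        # Wood density drawn from a species-level distribution, not a fixed value
        rho = rng.normal(0.6, 0.15, n_trees).clip(0.1)
        # Asymptotic height model with multiplicative prediction error
        H = 35.0 * (1.0 - np.exp(-0.04 * D))
        H = H * rng.lognormal(0.0, 0.1, n_trees)
        total[i] = (a * (rho * D**2 * H) ** b).sum() / 1000.0  # kg -> Mg
    return total

# One hypothetical plot of 120 trees with diameters between 10 and 80 cm
agb = plot_agb_samples(D_cm=rng.uniform(10.0, 80.0, 120))
estimate, ci = agb.mean(), np.percentile(agb, [2.5, 97.5])
```

The output is a biomass distribution per plot rather than a point value, which is what a spatial model needs in order to carry the uncertainty through to a regional map.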
|