41 |
Etude de la complémentarité et de la fusion des images qui seront fournies par les futurs capteurs satellitaires OLCI/Sentinel 3 et FCI/Meteosat Troisième Génération / Study of the complementarity and the fusion of the images that will be provided by the future satellite sensors OLCI/Sentinel-3 and FCI/Meteosat Third Generation
Peschoud, Cécile, 17 October 2016 (has links)
The objective of this thesis was to propose, validate, and compare methods for fusing images from a Low Earth Orbit (LEO) multispectral sensor and a geostationary multispectral sensor, in order to produce water composition maps that are both spatially detailed and refreshed as frequently as possible. The methodology was applied to the OLCI LEO sensor on Sentinel-3 and the FCI Geostationary Earth Orbit (GEO) sensor on Meteosat Third Generation. First, the sensitivity of both sensors to water color was analyzed. As images from the two sensors were not yet available, they were simulated over the Gulf of Lion using hydrosol maps (chlorophyll, suspended particulate matter, and colored dissolved organic matter) and radiative transfer models (Hydrolight and Modtran). Two fusion methods were then adapted and tested on the simulated images: the SSTF (Spatial, Spectral, Temporal Fusion) method, inspired by Vanhellemont et al. (2014), and the STARFM (Spatial Temporal Adaptive Reflectance Fusion Model) method of Gao et al. (2006). The fusion results were validated against simulated reference images, and the hydrosol maps estimated from the fused images were compared with the input maps of the simulation process. To improve the SNR of the FCI images, a temporal filtering step was proposed. Finally, since the goal is to obtain water quality indicators, the fusion methods were also adapted and tested on the hydrosol maps estimated from the simulated FCI and OLCI images.
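The core assumption behind STARFM-style fusion can be illustrated with a minimal sketch: the coarse geostationary sensor is assumed to capture the temporal change between two dates, and that change is transferred onto the detailed LEO image. The arrays below are hypothetical reflectance fields, not the thesis implementation.

```python
import numpy as np

def starfm_like_fusion(leo_t0, geo_t0, geo_t1):
    """Predict a detailed image at t1 from a detailed LEO image at t0 and
    coarse GEO images at t0 and t1. Core STARFM-style assumption: the
    coarse sensor captures the temporal change, so
        leo_t1 ~ leo_t0 + (geo_t1 - geo_t0)."""
    return leo_t0 + (geo_t1 - geo_t0)

rng = np.random.default_rng(0)
leo_t0 = rng.uniform(0.0, 0.1, (4, 4))   # detailed reflectance at t0
geo_t0 = np.full((4, 4), leo_t0.mean())  # coarse sensor sees only the mean
geo_t1 = geo_t0 + 0.02                   # uniform temporal change of 0.02
pred_t1 = starfm_like_fusion(leo_t0, geo_t0, geo_t1)
```

In the real method the weighting also accounts for spectral and spatial distance between neighbouring pixels; this sketch keeps only the temporal-transfer term.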
|
42 |
Modelling of Atmospheric Mercury Emission, Transport, Transformation and Deposition in North America
Wen, Deyong, January 2006 (has links)
A modelling study was conducted to explore the emission, transport, transformation and deposition behaviour of atmospheric Hg. A detailed natural Hg emission model was developed to estimate natural Hg emissions from soil, water and vegetation. The U.S. Environmental Protection Agency (EPA) Community Multiscale Air Quality (CMAQ) Hg modelling system was improved by incorporating this detailed treatment of natural Hg emissions, adopting boundary conditions from a global Hg model (Seigneur et al., 2004), and including the calculation of the dry deposition velocity of elemental Hg. The natural Hg emission model and the improved CMAQ-Hg model were validated against measurements and then applied to North America for a full year (2002).
The natural Hg emission model used detailed soil Hg concentration measurements, meteorological data and soil conditions to estimate Hg emissions from soil, water and vegetation; the influence of snow cover and low temperature was also considered. The model was applied to simulate natural Hg emissions in North America over 2002, and the modelled results compared well with reported natural Hg emission measurements. The spatial and temporal variations of emission fluxes were examined through numerical simulations: a pronounced diurnal cycle and a seasonal cycle were found in the emissions from most land uses, and natural Hg emission was significantly limited in winter compared with summer. Simulation results showed that about 229 metric tons of natural Hg, 1.8 times the anthropogenic Hg emission, was emitted from the simulation domain in 2002.
The improved U.S. EPA CMAQ-Hg modelling system was then applied to simulate the emission, transport, transformation and deposition of atmospheric Hg in North America for 2002. The simulated results were compared with hourly Total Gaseous Mercury (TGM) measurements at three sites; the good agreement demonstrated the model's performance. The Hg budget and net evasion of Hg in North America were also investigated.
A sensitivity analysis was conducted to assess the effects of Hg and non-Hg emissions on the air concentration and deposition of atmospheric Hg. Ambient TGM concentration was much more sensitive to Hg emissions than to non-Hg emissions, and more sensitive to natural than to anthropogenic Hg emission, indicating that natural Hg emission is a key factor influencing ambient TGM concentration. Unlike TGM concentration, Hg dry deposition was sensitive not only to Hg emissions but also to non-Hg emissions such as VOCs and NOx; anthropogenic Hg, natural Hg and NOx emissions had almost the same effect on total Hg dry deposition. Hg wet deposition was sensitive only to non-Hg emissions such as NOx and VOCs, especially VOCs. Because VOC reductions have an inverse effect on Hg wet deposition, reducing NOx emissions is the more promising way to mitigate Hg wet deposition. A possible pathway through which emission changes strongly affect atmospheric Hg was identified: emissions of pollutants, especially VOCs and NOx, greatly affect the level of OH in the atmosphere, and OH in turn influences the concentration and deposition of Hg through the gas-phase reaction between Hg(0) and OH.
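The one-at-a-time style of emission sensitivity experiment described above can be sketched as follows; the response function and its coefficients are purely illustrative stand-ins for the CMAQ-Hg system, not values from the study.

```python
def toy_tgm_concentration(natural_hg, anthro_hg, nox):
    """Toy linear response for ambient TGM concentration (ng/m3).
    Coefficients are hypothetical, chosen only so Hg emissions dominate."""
    return 1.0 + 0.004 * natural_hg + 0.002 * anthro_hg + 0.0001 * nox

def oat_sensitivity(model, base, key, frac=0.2):
    """One-at-a-time sensitivity: relative response of the model output
    to a +/- frac perturbation of a single input."""
    hi = dict(base); hi[key] *= 1 + frac
    lo = dict(base); lo[key] *= 1 - frac
    return (model(**hi) - model(**lo)) / model(**base)

# Hypothetical annual emissions (metric tons for Hg, arbitrary for NOx)
base = dict(natural_hg=229.0, anthro_hg=127.0, nox=500.0)
s_natural = oat_sensitivity(toy_tgm_concentration, base, "natural_hg")
s_nox = oat_sensitivity(toy_tgm_concentration, base, "nox")
```

With these illustrative coefficients the TGM response to natural Hg emission is an order of magnitude larger than to NOx, mirroring the qualitative ranking reported in the abstract.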
|
43 |
Modelling a Manufacturing Line: Analysis and Decision Support Based on Discrete Event Simulation
Ibrahim, Fady, January 2011 (has links)
Increasing competition forces companies to develop their production continuously in order to stay competitive in the global market and to become as efficient and effective as possible. This master thesis was conducted at Getrag All Wheel Drive, one of the largest suppliers of transmissions and powertrain systems. The company has long worked to improve production flow in its manufacturing departments using conventional methods. Because its flow is highly complex and intersecting, management wanted an approach that takes dynamic information into consideration, and building a simulation model is such a solution: according to Banks et al. (2001), simulation is the imitation of a real-world process or system and its behaviour as it progresses over time, a very useful method for evaluating complex systems where the usual mathematical tools fall short. The simulation model was created following the simulation methodology of Banks et al. (2001) with the help of the ExtendSim software. The resulting model serves as a decision-support tool for developing the CONWIP system applied to the manufacturing line under study, and for investigating "what if" scenarios. The results were obtained from two experiments: the first, supported by a sensitivity analysis, recommends an optimal upper bound on the total amount of work allowed in the CONWIP system; the second analyses the overall effect on the system of separating the paths of high-volume and low-volume products. The project demonstrates the power of simulation in situations where it is very hard, or even impossible, to improve the performance of a manufacturing line by other means, i.e. when a large number of variables are involved and affect the system.
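A minimal sketch of the CONWIP idea, a fixed number of cards capping the total work in the line, is given below; all parameters (station count, service times, horizon) are hypothetical and not calibrated to the Getrag line or the ExtendSim model.

```python
import random

def simulate_conwip(wip_cap, n_stations=3, mean_service=1.0,
                    horizon=2000.0, seed=1):
    """Minimal CONWIP sketch: wip_cap cards circulate; a new job is
    released into the serial line only when a completed job frees a card.
    Service times are exponential. Returns throughput (jobs per unit time)."""
    rng = random.Random(seed)
    free_at = [0.0] * n_stations   # time each station becomes idle
    cards = [0.0] * wip_cap        # times at which each card is released
    completions = 0
    while True:
        release = min(cards)       # earliest available card
        if release >= horizon:
            break
        cards.remove(release)
        finish = release
        for s in range(n_stations):          # push the job through the line
            start = max(finish, free_at[s])  # wait if the station is busy
            finish = start + rng.expovariate(1.0 / mean_service)
            free_at[s] = finish
        cards.append(finish)       # card freed when the job completes
        completions += 1
    return completions / horizon

tp_wip1 = simulate_conwip(wip_cap=1)
tp_wip6 = simulate_conwip(wip_cap=6)
```

Raising the card count increases throughput toward the bottleneck rate, which is the trade-off the thesis's first experiment explores when recommending an upper bound on work in process.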
|
45 |
The Validity Problem of Reverse Engineering Dynamic Systems
Chen, Jian-xun, 15 August 2006 (has links)
High-throughput measurement devices for DNA, RNA, and proteins produce large amounts of information-rich data from biological dynamic systems. These data need to be reverse engineered to reveal the parameter/structure and behaviour relationships implicit in them; ultimately, the complex interactions between the components that make up a system can then be better understood.
However, while issues of reverse engineering in bioinformatics, such as the choice of algorithm, the number of temporal samples, and whether the input data are continuous or discrete, have been discussed, the validity problem has received little attention. We argue that, since the data available in reality are imperfect, the results of reverse engineering are affected by those imperfections. If so, it is important to know how, and to what extent, this affects the results. We chose parameter estimation as our reverse engineering task and developed a novel method to investigate this validity problem: the data we use deviate slightly from the true data at each data point, and we compare the results of reverse engineering with the target parameters. The more error in the data, the more serious the validity problem in reverse engineering.
Three artificial systems are used as a test bed to demonstrate our approach. The experiments show that a minor deviation in the data may introduce a large deviation in the estimated parameters; we conclude that data error cannot be ignored in reverse engineering. To better understand this phenomenon, we further developed an analytical procedure that examines the dynamics of the systems to identify which characteristics contribute to this impact. Sensitivity tests, propagation analysis and impact factor analysis were applied to the systems, and some qualitative rules describing the relationship between reverse engineering results and system dynamics are summarized.
The findings of this exploratory research need further study for confirmation. Along this line of research, the biological meaning of robustness and its possible relationship with parameter variation in reverse engineering is worth studying in the future. A better reverse engineering algorithm that avoids this validity problem is another topic for future work.
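The core observation, that small data deviations can shift estimated parameters, can be illustrated with a toy parameter estimation problem (an exponential decay fitted by least squares on the log of the data); the system and noise level are illustrative stand-ins, not the artificial systems used in the thesis.

```python
import numpy as np

def fit_decay(t, y):
    """Estimate (a, b) of y = a * exp(-b * t) by linear least squares
    on log(y): log y = log a - b t."""
    slope, intercept = np.polyfit(t, np.log(y), 1)
    return np.exp(intercept), -slope

rng = np.random.default_rng(42)
t = np.linspace(0.1, 5.0, 20)
a_true, b_true = 2.0, 0.8
y_clean = a_true * np.exp(-b_true * t)
a_clean, b_clean = fit_decay(t, y_clean)                 # perfect data
y_noisy = y_clean * (1 + rng.normal(0.0, 0.05, t.size))  # 5% deviation
a_noisy, b_noisy = fit_decay(t, y_noisy)                 # imperfect data
```

With clean data the estimator recovers the target parameters essentially exactly; adding a 5% deviation at each data point shifts the estimates, which is the validity problem in miniature.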
|
46 |
Computer-Aided Analysis and Design of the Steering System of Front Wheels for Vehicle
Li, Wei-Chun, 04 July 2007 (has links)
The four-wheeled vehicle is one of the most widely used types of transportation, and the performance of its steering system has a decisive influence on the safety of the driver and of vehicle travel. The purpose of this work is therefore to establish kinematic analysis theories for the vehicle steering system, to propose an improved dimensional synthesis design theory, and to develop a user-friendly computer-aided analysis and design program.
First, the related literature is collected to understand the research techniques and design directions for vehicle steering mechanisms. Second, the position analysis, steering torque analysis, mechanical advantage, and other performance calculations are established for the two most widely used configurations: the rocker-shaft-type linkage steering mechanism and the rack-and-pinion-type linkage steering mechanism. Third, a user-friendly computer-aided kinematic analysis program is built using the VB.NET programming language. A systematic motion design procedure is then proposed, and an optimal dimensional synthesis design theory is established using a genetic algorithm. Afterwards, a sensitivity analysis of each design variable of the steering mechanism is carried out with an orthogonal array, and a kinematic improvement design program is established. Finally, for each of the two steering mechanisms, a new set of mechanism dimensions is designed whose steering performance is better than that of the original mechanism.
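The orthogonal-array sensitivity step mentioned above can be sketched with an L4(2^3) array; the factors and response values below are hypothetical, not the actual steering design variables of the study.

```python
# L4(2^3) orthogonal array: four runs covering three two-level factors
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

def main_effects(response):
    """Main effect of each factor: mean response at level 1 minus the
    mean response at level 0, computed over the four array runs."""
    effects = []
    for f in range(3):
        hi = [r for row, r in zip(L4, response) if row[f] == 1]
        lo = [r for row, r in zip(L4, response) if row[f] == 0]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Hypothetical steering-error responses for the four runs
effects = main_effects([2.0, 2.4, 3.1, 3.5])
```

A design variable with a large main effect (here, the first factor) is the one most worth tuning in the improvement design; a near-zero effect (the third factor) can be left at its nominal value.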
|
47 |
Modeling the Sensitivity of a Seasonalized Semi-arid Aquifer to the Quantity of Recharge and Evapotranspiration
Neff, Kirstin Lynn, January 2013 (has links)
The Upper San Pedro River aquifer in southern Arizona has been modeled with MODFLOW several times. The current model improves upon previous models by switching stream packages, adding a third season to represent the summer monsoon, and thereby creating a seasonalized steady-state oscillatory model. Recharge was seasonalized using ratios of seasonal precipitation to seasonal actual or potential evapotranspiration (ET), and maximum ET was seasonalized according to estimates of riparian groundwater consumption by vegetation in the study area. The model was run with inputs of 80%, 100%, and 120% of the base values for recharge and maximum ET to assess the sensitivity of the groundwater system and river to the seasonal timing and quantity of recharge and ET. The greatest share of baseflow, 47%, occurred during the wet winter season, 35% during the dry summer, and 18% during the wet summer (monsoon) season.
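One plausible reading of the seasonalization step, distributing annual recharge in proportion to each season's precipitation-to-ET ratio, can be sketched as follows; the numbers are invented for illustration and the exact method in the thesis may differ.

```python
def seasonal_recharge(annual_recharge, precip, pet):
    """Distribute annual recharge across seasons in proportion to each
    season's precipitation-to-PET ratio (a hypothetical approximation of
    the seasonalization described above; units arbitrary)."""
    ratios = {s: precip[s] / pet[s] for s in precip}
    total = sum(ratios.values())
    return {s: annual_recharge * r / total for s, r in ratios.items()}

# Hypothetical seasonal values for a semi-arid basin (mm)
precip = {"wet_winter": 120.0, "dry_summer": 30.0, "monsoon": 180.0}
pet = {"wet_winter": 150.0, "dry_summer": 450.0, "monsoon": 400.0}
recharge = seasonal_recharge(100.0, precip, pet)
```

Because PET is lowest in winter, the wet winter season receives the largest recharge share even though the monsoon brings more rain, consistent with the winter-dominated baseflow reported above.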
|
48 |
Land Surface Processes in Natural and Artificial Tropical Ecosystems
Rosolem, Rafael, January 2010 (has links)
Land Surface Parameterization (LSP) schemes have evolved from simple tipping-bucket models to fully interactive models, including parameterizations that account for exchanges of momentum, energy, mass, and biogeochemistry. As the demand for greater realism has increased, so has the complexity of LSPs, which now include some parameters that may not be universally relevant to all regions of the globe. The performance of LSP schemes depends on the magnitude of structural, data-related (input and output), and parameter uncertainties in the model. Parameter estimation uncertainty can be reduced by calibrating LSPs against measurements available at field sites; given the multiple outputs of the models, multi-objective optimization approaches are applied. Some of the parameter values used in LSPs were originally obtained from laboratory studies that analyzed plant behavior under a range of conditions in enclosed chambers. The research described in this dissertation takes advantage of currently available data from several eddy covariance flux towers, located mainly in the Brazilian Amazon basin, to estimate parameter values of a widely used LSP scheme, version 3 of the Simple Biosphere model (SiB3). Background climatological data were used to assess the representativeness of the data collection period, which might have affected model calibration. Variance-based sensitivity analysis was then used to investigate potential structural deficiencies in SiB3 and to reduce the dimensionality of the subsequent optimization by identifying those model parameters that merit calibration. Finally, some structural and conceptual aspects of SiB3 were tested inside the Biosphere 2 Tropical Rain Forest biome (B2-TRF) under meteorological conditions that resemble those predicted in future climate scenarios for the Amazon basin.
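Variance-based (Sobol-style) first-order sensitivity, the screening technique named above, can be sketched with a crude conditioning estimator on a two-parameter toy model; this illustrates the technique only and is not the SiB3 analysis itself.

```python
import numpy as np

def first_order_indices(model, n=20000, seed=0, nbins=20):
    """Crude first-order Sobol indices for a 2-parameter model on [0,1]^2:
    S_i = Var(E[Y | X_i]) / Var(Y), with the conditional expectation
    estimated by binning the samples of X_i."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(size=(n, 2))
    y = model(x[:, 0], x[:, 1])
    var_y = y.var()
    indices = []
    for i in range(2):
        edges = np.linspace(0.0, 1.0, nbins + 1)
        which = np.digitize(x[:, i], edges) - 1      # bin index per sample
        cond = np.array([y[which == b].mean() for b in range(nbins)])
        indices.append(cond.var() / var_y)           # Var of conditional means
    return indices

# Toy surrogate model: output dominated by the first parameter
s1, s2 = first_order_indices(lambda p1, p2: 5.0 * p1 + p2)
```

Parameters with small first-order indices (like the second one here) contribute little output variance and can be fixed, which is exactly how such an analysis reduces the dimensionality of the subsequent optimization.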
|
49 |
Sensitivity analysis of surface wind field reconstructions in tropical cyclones
Madison, Emily Victoria, 27 August 2014 (has links)
Accurate forecasts of tropical cyclone surface wind fields are essential for decisions involving evacuation preparation and damage potential. To support these decisions, the CFAN tropical cyclone surface wind field model is compared with the H*Wind wind field reanalyses to assess the accuracy of the CFAN algorithm and to determine potential limitations of its use. Sixteen tropical cyclones were assessed using the correlation coefficient, mean bias, and root mean square error. The resolution of the initial conditions ingested into the model was also analyzed, along with storm type and whether wind shear was a limiting factor. Results suggest that the CFAN wind model accurately reproduces the H*Wind analyses in most regions of the TC. The center of circulation has the highest error because the CFAN wind model treats the center of circulation as a point rather than as a feature of finite lateral extent. Results from the sensitivity analysis based on input resolution show that the minimum input resolution for the CFAN wind model to produce fine spatial resolutions with high fidelity is 0.25°. Reproductions of weaker tropical cyclones have lower accuracy owing to wind field asymmetries within these systems, while stronger TCs are better reproduced, as these systems are usually better organized. Finally, the wind shear analysis shows that the accuracy of reconstruction does not depend on the magnitude of the vertical wind shear.
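The three verification metrics named above can be computed as in this short sketch; the wind speeds are hypothetical, not H*Wind data.

```python
import math

def verify(pred, obs):
    """Correlation coefficient, mean bias, and RMSE between a modeled
    wind field and a reference analysis (both flattened to 1-D lists)."""
    n = len(pred)
    mp, mo = sum(pred) / n, sum(obs) / n
    cov = sum((p - mp) * (o - mo) for p, o in zip(pred, obs))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    corr = cov / (sp * so)
    bias = mp - mo
    rmse = math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / n)
    return corr, bias, rmse

# Hypothetical surface wind speeds (m/s) at matched grid points
obs = [18.0, 22.0, 30.0, 41.0, 25.0]
pred = [17.0, 23.0, 29.0, 43.0, 24.0]
corr, bias, rmse = verify(pred, obs)
```

Bias captures systematic over- or under-prediction, RMSE the typical point error, and correlation the agreement in spatial pattern, which is why all three are reported together.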
|
50 |
A study on the heat transfer and energy performance implications of cool roofs
Zhang, Tianyao, 12 January 2015 (has links)
In this study, we examined the effect of cool roofs on commercial and residential buildings in each climate zone by reviewing monitored case studies and DOE-2 simulations from the literature, and by using an online tool, the Cool Roof Calculator, together with a simple COP ratio model to validate the case study results. Because the Cool Roof Calculator does not take building form into account, a sensitivity analysis was first conducted on the EPC normative building energy model to rank the importance of various building parameters against one another. Roof absorptance coefficient, aspect ratio, and number of floors either ranked highest or emerged as important parameters, and were chosen for further parametric analysis to evaluate their impact on total building loads.
A simple COP ratio model was also developed to validate the results from the literature review and the Cool Roof Calculator. In terms of cost, for a prototype medium-sized commercial building, it was found that a white roof is always beneficial, but cities in northern climates may gain little advantage, and insulation may be a better choice there.
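A COP-ratio style cost trade-off, which is the likely spirit of the "simple COP ratio model" above, can be sketched as below; the loads, efficiencies, and energy prices are invented for illustration and are not the thesis's values.

```python
def cool_roof_net_benefit(dq_cool, dq_heat, cop_cool=3.0, eff_heat=0.8,
                          price_elec=0.12, price_gas=0.04):
    """Hypothetical annual cost saving ($) from a cool roof: the roof cuts
    the cooling load by dq_cool (kWh thermal) but raises the heating load
    by dq_heat, trading electric cooling (COP) against gas heating
    (combustion efficiency)."""
    cooling_saving = dq_cool / cop_cool * price_elec
    heating_penalty = dq_heat / eff_heat * price_gas
    return cooling_saving - heating_penalty

# Illustrative climates: cooling-dominated vs heating-dominated
southern = cool_roof_net_benefit(dq_cool=8000.0, dq_heat=1000.0)
northern = cool_roof_net_benefit(dq_cool=2000.0, dq_heat=6000.0)
```

Under these invented numbers the cooling-dominated climate shows a net saving while the heating-dominated one shows a net cost, mirroring the conclusion that northern cities may gain little from a white roof.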
|