81 |
Developing A Computer Program For Evaluating Uncertainty Of Some Typical Dimensional Measuring And Gauging Devices. Celebioglu, Emrah Hasan, 01 April 2005
In dimensional measurements, the uncertainty in measurement must be specified as the range of possible deviation of the measurement result. In this thesis, a computer program is developed for evaluating the uncertainty in measurement of commonly used dimensional measuring devices such as vernier callipers, micrometers, comparators, and gauge blocks.
In evaluating the uncertainty in measurement, uncertainty sources such as the temperature difference between the measured part and the instrument, the uncertainty in the reference gauge block's dimension, and mechanical effects are considered. The program employs the EAL, NIST, and GUM uncertainty evaluation equations as standard equations. However, the program can also be used for other measuring instruments, and users can define their own uncertainty equation. In the evaluations, symmetric distributions are used for the standard uncertainty of the variables considered.
The program produces the uncertainty budget and, to compare the contribution of each variable to the overall uncertainty of the measurement, also reports the uncertainty effect ratio. This thesis discusses the evaluation process for uncertainty in measurement, the difference between measurement error and uncertainty in measurement, and the structure of the program. A set of experiments has also been carried out to illustrate the application of the program in evaluating the measurement uncertainty of vernier callipers with 1/50 mm and 1/20 mm resolutions, a digital vernier calliper, and a 25 mm micrometer.
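As a rough illustration of the GUM-style combination such evaluation equations rely on, the sketch below combines a few uncertainty contributions into a combined and an expanded uncertainty. The sources, sensitivity coefficients, and numerical values are illustrative assumptions, not figures from the thesis or its program.

```python
import math

def combined_standard_uncertainty(contributions):
    """GUM-style combination: root-sum-of-squares of (sensitivity * standard uncertainty)."""
    return math.sqrt(sum((c * u) ** 2 for c, u in contributions))

# Hypothetical uncertainty sources for a gauge-block comparison, in micrometres:
# (sensitivity coefficient, standard uncertainty)
contributions = [
    (1.0, 0.030),   # calibration uncertainty of the reference gauge block
    (1.0, 0.020),   # instrument resolution (already reduced from a rectangular distribution)
    (0.9, 0.015),   # temperature difference between part and instrument
    (1.0, 0.010),   # mechanical effects (contact deformation, alignment)
]

u_c = combined_standard_uncertainty(contributions)
U = 2.0 * u_c  # expanded uncertainty with coverage factor k = 2 (about 95 % level)
print(f"combined standard uncertainty u_c = {u_c:.3f} um, expanded U = {U:.3f} um")
```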
|
82 |
A study of uncertainty aspects in venture appraisal. Johar, Khalid Lutfi, Civil & Environmental Engineering, Faculty of Engineering, UNSW, January 2009
The appraisal or the feasibility of an engineering venture or an investment relies on the estimation of the analysis parameters, which usually occur in the future. All such estimates have an element of uncertainty which needs to be acknowledged. Traditional methods of engineering economic or discounted cash flow analysis, for example, net present value, benefit/cost ratio, internal rate of return and payback period, do not take into account the uncertainty associated with the analysis parameters. To this end, the present study proposes a number of evaluation methodologies in order to deal with the inherent uncertainty. The present study uses second-order moment thinking to determine the expected value and the variance of feasibility measures, net present value, benefit/cost ratio, internal rate of return and payback period. A venture's feasibility is defined in this study as the probability of the total benefit exceeding the total cost, the probability of the internal rate of return being greater than a specified interest rate, or the probability of the payback period being less than a specified time period. However, the determination of the variance of these measures requires the estimation of the correlation coefficients between the benefits and costs. The task of estimating correlation coefficients is difficult without making certain assumptions. An examination of the degree of correlation is presented which can be used for guidance in feasibility studies. The present study also gives a theoretical formulation for feasibility for single and multiple ventures and supports this with representative results based on case studies. Such a formulation resolves which combination of ventures is best from a viewpoint of feasibility. Additionally, venture appraisal is modelled as a system with Markov properties. When analysis parameters such as the interest rate, benefits and costs are defined as states, with the associated transition probabilities from one period of time to another, Markov chains can be used to estimate a venture's feasibility. This provides further insight into the influence of variability in the analysis parameters, and provides the solution to the problem of the determination of the optimal policy, which maximises the expected net present value or the venture's feasibility over its life span. Markov chains provide further insights into the effect of the inter-temporal correlation coefficients on the variance of the net present value. When each state is taken to represent a different value of inter-temporal correlation coefficient, and consequently a different variance, it is possible to evaluate the venture's expected variance and the variance of the variance of the net present value, according to the transition probabilities associated with each state.
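As a rough sketch of the second-order moment idea described above, the following example computes the expected value and variance of the net present value for a short cash-flow series and reads feasibility as the probability that the NPV exceeds zero. The cash flows, discount rate, inter-period correlation, and the normal approximation are all illustrative assumptions, not values or results from the thesis.

```python
import math
from statistics import NormalDist

# Second-order-moment sketch: net cash flows X_t with given means and standard
# deviations, discount rate r, and a single correlation coefficient rho assumed
# between all pairs of periods (illustrative values only).
means  = [-100.0, 30.0, 40.0, 45.0, 50.0]   # periods 0..4
stdevs = [   5.0,  8.0, 10.0, 12.0, 15.0]
r, rho = 0.08, 0.3

disc = [(1 + r) ** -t for t in range(len(means))]
e_npv = sum(d * m for d, m in zip(disc, means))

var_npv = 0.0
for i, (di, si) in enumerate(zip(disc, stdevs)):
    for j, (dj, sj) in enumerate(zip(disc, stdevs)):
        corr = 1.0 if i == j else rho
        var_npv += di * dj * corr * si * sj

sd_npv = math.sqrt(var_npv)
# Feasibility: probability that NPV exceeds zero, assuming NPV is roughly normal.
feasibility = 1.0 - NormalDist(e_npv, sd_npv).cdf(0.0)
print(f"E[NPV] = {e_npv:.2f}, sd[NPV] = {sd_npv:.2f}, P(NPV > 0) = {feasibility:.3f}")
```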
|
83 |
QUALITATIVE AND QUANTITATIVE PROCEDURE FOR UNCERTAINTY ANALYSIS IN LIFE CYCLE ASSESSMENT OF WASTEWATER SOLIDS TREATMENT PROCESSES. Alyaseri, Isam, 01 May 2014
In order to perform the environmental analysis and find the best management option for wastewater treatment processes using the life cycle assessment (LCA) method, uncertainty in LCA has to be evaluated. A qualitative and quantitative procedure was constructed to deal with uncertainty in wastewater treatment LCA studies during the inventory and analysis stages. The qualitative steps in the procedure include setting rules for the inclusion of inputs and outputs in the life cycle inventory (LCI), setting rules for the proper collection of data, identifying and conducting data collection analysis for the significant contributors in the model, evaluating data quality indicators, selecting the proper life cycle impact assessment (LCIA) method, evaluating the uncertainty in the model through different cultural perspectives, and comparing with other LCIA methods. The quantitative steps include assigning the best-guess value and the proper distribution for each input or output in the model, calculating the uncertainty for those inputs or outputs based on data characteristics and the data quality indicators, and finally using probabilistic analysis (Monte Carlo simulation) to estimate uncertainty in the outcomes. Environmental burdens from the solids handling unit at Bissell Point Wastewater Treatment Plant (BPWWTP) in Saint Louis, Missouri were analyzed. Plant-specific data plus literature data were used to build an input-output model. The environmental performance of the existing treatment scenario (dewatering, multiple hearth incineration, ash to landfill) was analyzed. To improve the environmental performance, two alternative scenarios (fluid bed incineration and anaerobic digestion) were proposed, constructed, and evaluated. System boundaries were set to include the construction, operation, and dismantling phases. The impact assessment method chosen was Eco-indicator 99, and the impact categories were carcinogenicity, respiratory organics and inorganics, climate change, radiation, ozone depletion, ecotoxicity, acidification-eutrophication, and minerals and fossil fuels depletion. Analysis of the existing scenario shows that most of the impacts came from the operation phase in the categories related to fossil fuels depletion, respiratory inorganics, and carcinogens, due to the energy consumed and emissions from incineration. The proposed alternatives showed better performance than the existing treatment, and fluid bed incineration performed better than anaerobic digestion. Uncertainty analysis showed a 57.6% probability that fluid bed incineration has less impact on the environment than anaerobic digestion. Based on single-score ranking in the Eco-indicator 99 method, the environmental impact order is: multiple hearth incineration > anaerobic digestion > fluid bed incineration. This order was the same for the three model perspectives in the Eco-indicator 99 method and when using other LCIA methods (Eco-point 97 and CML 2000). The study showed that the incorporation of qualitative/quantitative uncertainty analysis into LCA gives more information than deterministic LCA and can strengthen an LCA study. The procedure tested in this study showed that Monte Carlo simulation can be used to quantify uncertainty in wastewater treatment studies, and the procedure can be used to analyze the performance of other treatment options.
Although analysis under different perspectives and with different LCIA methods did not change the ranking of the scenarios, it showed possible variation in the final outcomes for some categories. The study showed the importance of providing decision makers with the best and worst possible outcomes in any LCA study and of informing them about the perspectives and assumptions used in the assessment. Monte Carlo simulation can perform uncertainty analysis in comparative LCA only between two products or scenarios, based on the (A-B) approach, because of the overlap between the probability distributions of the outcomes; it is recommended to extend the approach to cover more than two scenarios.
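A minimal sketch of the comparative (A-B) Monte Carlo step mentioned above: each trial samples a single impact score for both scenarios, and the probability of one scenario having less impact is the fraction of trials in which the difference is negative. The lognormal distributions, medians, and geometric standard deviations are illustrative assumptions, not data from the study.

```python
import random

# Comparative (A - B) Monte Carlo sketch with illustrative lognormal single scores.
random.seed(1)
N = 100_000

def sample_impact(median, gsd):
    """Sample a lognormal impact score with a given median and geometric standard deviation."""
    return median * gsd ** random.gauss(0.0, 1.0)

wins = 0
for _ in range(N):
    fluid_bed = sample_impact(median=95.0, gsd=1.25)   # hypothetical single score, scenario A
    digestion = sample_impact(median=100.0, gsd=1.25)  # hypothetical single score, scenario B
    if fluid_bed - digestion < 0:   # scenario A has less impact than scenario B in this trial
        wins += 1

print(f"P(fluid bed < anaerobic digestion) ~ {wins / N:.3f}")
```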
|
84 |
Evaluation of uncertainty management in software projects (Avaliação do gerenciamento de incertezas em projetos de software). SOUZA, José Alfredo Santos de, 31 August 2015
Software development projects have become increasingly complex, driven mainly by the high degree of innovation and technology employed. Added to these elements is uncertainty, which is characterized by a lack of information about an event, its understanding, its knowledge, its consequences, or its probability. Uncertainty is almost always present in software development projects and contributes to their high failure rates, because traditional project management approaches do not consider an environment that is unstable and exposed to several sources of uncertainty. This work aims to elaborate a proposal, geared to software development organizations, for evaluating their competence in managing uncertainty. The evaluation proposal was grounded in a theoretical study in which a process for managing uncertainty in software projects is suggested. The construction of the evaluation proposal used the GQM (Goal/Question/Metric) approach, from which metrics were derived to help organizations assess their practices for managing uncertainty in their development process. The evaluation proposal was demonstrated and applied through a case study.
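To make the GQM structure concrete, here is a small, hypothetical sketch of a goal decomposed into questions and metrics for assessing uncertainty management practices. The goal, questions, and metrics are illustrative assumptions and are not the ones defined in the dissertation.

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    scale: str           # e.g. ratio, ordinal
    collection: str      # how the value is obtained

@dataclass
class Question:
    text: str
    metrics: list[Metric] = field(default_factory=list)

@dataclass
class Goal:
    purpose: str
    issue: str
    obj: str
    viewpoint: str
    questions: list[Question] = field(default_factory=list)

# Hypothetical GQM decomposition for evaluating uncertainty management practices.
goal = Goal(
    purpose="Evaluate",
    issue="uncertainty management practices",
    obj="software development projects",
    viewpoint="project manager",
    questions=[
        Question(
            "Are sources of uncertainty identified at project start?",
            [Metric("identified_sources_per_project", "ratio", "project records")],
        ),
        Question(
            "How often are uncertainty reviews held?",
            [Metric("reviews_per_iteration", "ratio", "meeting logs")],
        ),
    ],
)
print(f"GQM goal '{goal.issue}' with {len(goal.questions)} questions")
```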
|
85 |
Distribution network architecture in the presence of distributed generation: planning under uncertainty and decentralized operation modes / Multi-objective distributed generation planning in a flexible environment. Soroudi, Alireza, 04 October 2011
The process of deregulation in electricity markets has introduced several new and interesting research topics in the power system area. This thesis addresses one of them: the integration of distributed generation, both renewable and conventional, into distribution networks. From the distribution network operator (DNO)'s point of view, it is worthwhile to develop a comprehensive methodology that considers the various distributed generation (DG) technologies as options for supplying the demand. In this thesis, the planning problem is modelled as a multi-objective problem, which helps the planner in decision making by making explicit the trade-offs between the objective functions.
To find the Pareto-optimal front of the problem, a hybrid genetic-immune algorithm is proposed, and the fuzzy satisfying method is used to select the final solution. Various objectives such as cost, active losses, emissions, and the satisfaction of technical constraints are taken into account. The decision variables are the distribution network reinforcement strategies and, in the case where the DNO can also invest in DG units, the investment decisions regarding those units. Another aspect that makes the proposed models more flexible is the treatment of uncertainty in the input parameters: the uncertainties of the input data are handled in three different ways, namely probabilistic, possibilistic, and mixed possibilistic-probabilistic methods. Two types of models are developed: a centralized and an unbundled DG planning model. In both models, the DNO is responsible for providing a reliable and efficient network for the customers in its territory. In the centrally controlled planning context, the DNO is authorized to invest in DG units; in this model, the optimal size, number of DG units, location, DG technology, and timing of investment in both DG units and network components are determined. The developed model is useful not only in the centrally controlled planning context but also in other power markets that need to assess, monitor, and guide the decisions of DG developers. In the unbundled DG planning model, the DNO is not authorized to make investment decisions in DG options; its decision variables are limited to feeder/substation expansion and reinforcement, capacitor placement, network reconfiguration, and smart grid technologies.
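A minimal sketch of the fuzzy satisfying (max-min) selection step, which picks a single compromise plan from a Pareto front, is shown below. The candidate plans and their objective values are illustrative assumptions, not results from the thesis, and all objectives are treated as minimized.

```python
# Illustrative Pareto-front candidates with three objectives to minimize.
pareto_front = {
    "plan_A": {"cost": 120.0, "losses": 8.0, "emissions": 55.0},
    "plan_B": {"cost": 140.0, "losses": 6.0, "emissions": 60.0},
    "plan_C": {"cost": 160.0, "losses": 5.0, "emissions": 45.0},
}

objectives = ["cost", "losses", "emissions"]
f_min = {k: min(p[k] for p in pareto_front.values()) for k in objectives}
f_max = {k: max(p[k] for p in pareto_front.values()) for k in objectives}

def membership(value, k):
    """Linear fuzzy membership: 1 at the best (minimum) value, 0 at the worst."""
    if f_max[k] == f_min[k]:
        return 1.0
    return (f_max[k] - value) / (f_max[k] - f_min[k])

# Each plan's overall satisfaction is its weakest objective; choose the plan whose
# weakest objective is the strongest (max-min).
best = max(pareto_front, key=lambda name: min(membership(pareto_front[name][k], k)
                                              for k in objectives))
print("fuzzy-satisfying choice:", best)
```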
|
86 |
Uncertainty in process innovations: A case study on the adaption of search engine optimization. Samuelsson, Jonathan; Skoglund, Lovisa, January 2020
Process innovation is an important topic in business research; it enables competitive advantages for companies if managed properly. It has previously been acknowledged that uncertainty in process innovation is common, and previous research shows that it has a negative impact on process innovation projects, as it can cause a waste of resources for the company. For SMEs, where resources are limited, it is imperative that uncertainty does not affect process innovation projects negatively. Previous scholars do not identify sources of process innovation uncertainty in SMEs or how it can be managed, thus leaving a gap in theory that is important to fill. The purpose of the study was to investigate how uncertainty in process innovation arises in an SME and how it can be reduced, through an investigation of how SEO, as an instance of process innovation, was perceived before and after an implementation process and whether a change in perception was related to uncertainty. A single case study with qualitative interviews, combined with an implementation process of SEO, was used to investigate the topic and generate in-depth knowledge. Our findings identify sources of process innovation uncertainty in SMEs, arising from either a resource perspective or an organizational perspective. Furthermore, we suggest how to manage the identified sources of uncertainty through information, communication, or results. Organizations can use these findings to manage process innovation uncertainty before it arises and thus achieve successful process innovation.
|
87 |
Method for the Interpretation of RMR Variability Using Gaussian Simulation to Reduce the Uncertainty in Estimations of Geomechanical Models of Underground Mines. Rodriguez-Vilca, Juliet; Paucar-Vilcañaupa, Jose; Pehovaz-Alvarez, Humberto; Raymundo, Carlos; Mamani-Macedo, Nestor; Moguerza, Javier M., 01 January 2020
The full text of this work is not available in the UPC Academic Repository owing to restrictions imposed by the publisher. / The application of conventional techniques, such as kriging, to model rock mass is limited because such techniques do not consider the spatial variability and heterogeneity of the rock mass. In this context, as an alternative, the application of the Gaussian simulation technique to simulate rock mass spatial heterogeneity based on the rock mass rating (RMR) classification is proposed. This research proposes a methodology that includes a variographic analysis of the RMR in different directions to determine its anisotropic behavior. In a case study of an underground deposit in Peru, geomechanical record data compiled in the field were used. A total of 10 simulations were conducted, with approximately 6 million values each; they were computed and verified, and an absolute mean error of only 3.82% was estimated, which is acceptable when compared with the 22.15% obtained with kriging.
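As a rough illustration of the variographic analysis that underlies this kind of RMR simulation, the sketch below computes an experimental variogram from a short series of RMR samples along a drillhole. The sample positions, RMR values, and lag settings are illustrative assumptions, not data from the study.

```python
import math

# Illustrative RMR samples along one drillhole: (position in metres, RMR value).
samples = [
    (0.0, 62), (5.0, 58), (10.0, 55), (15.0, 60),
    (20.0, 48), (25.0, 45), (30.0, 50), (35.0, 52),
]

def experimental_variogram(points, lag, tol):
    """Half the average squared difference over pairs whose separation is within lag +/- tol."""
    num, den = 0.0, 0
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            h = abs(points[j][0] - points[i][0])
            if abs(h - lag) <= tol:
                num += (points[j][1] - points[i][1]) ** 2
                den += 1
    return num / (2 * den) if den else math.nan

for lag in (5.0, 10.0, 15.0, 20.0):
    print(f"gamma({lag:>4.1f} m) = {experimental_variogram(samples, lag, 2.5):.1f}")
```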
|
88 |
STRUCTURAL UNCERTAINTY IN HYDROLOGICAL MODELS. Abhinav Gupta (11185086), 28 July 2021
All hydrological models incur various uncertainties that can be broadly classified into three categories: measurement, structural, and parametric uncertainties. Measurement uncertainty exists due to errors in measurements of properties and variables (e.g. streamflows, which are typically an output, and rainfall, which serves as an input to hydrological models). Structural uncertainty exists due to errors in the mathematical representation of real-world hydrological processes. Parametric uncertainty exists due to structural and measurement uncertainty and the limited amount of data available for calibration.

Several studies have addressed the problem of measurement and parametric uncertainties, but studies on structural uncertainty are lacking. Specifically, there does not exist any model that can be used to quantify structural uncertainties at an ungauged location. This was the first objective of the study: to develop a model of structural uncertainty that can be used to quantify total uncertainty (including structural uncertainty) in streamflow estimates at ungauged locations in a watershed. The proposed model is based on the idea that, since the effect of structural uncertainty is to introduce a bias into the parameter estimation, one way to accommodate structural uncertainty is to compensate for this bias. The developed model was applied to two watersheds: the Upper Wabash Busseron Watershed (UWBW) and the Lower Des Plaines Watershed (LDPW). For UWBW, mean daily streamflow data were used, while for LDPW mean hourly streamflow data were used. The proposed model worked well for mean daily data but failed to capture the total uncertainties for hourly data, likely because measurement uncertainties in hourly streamflow data are higher than was assumed in the study.

Once a hydrological and error model is specified, the next step is to estimate the model and error parameters. Parameter estimation in hydrological modeling may be carried out using either formal or informal Bayesian methodology. In formal Bayesian methodology, a likelihood function, motivated by probability theory, is specified over a space of models (or residuals), and a prior probability distribution is assigned over the space of models. There has been significant debate on whether the likelihood functions used in Bayesian theory are justified in hydrological modeling, but relatively little attention has been given to the justification of prior probabilities. In most hydrological modeling studies, a uniform prior over hydrological model parameters is used to reflect a modeler's complete lack of knowledge about the model parameters before calibration; such a prior is also known as a non-informative prior. The second objective of this study was to scrutinize the assumption that a uniform prior is non-informative, using the principle of maximum information gain. This principle was used to derive non-informative priors for several hydrological models, and the obtained priors were found to be significantly different from a uniform prior. Further, the posterior distributions obtained by using these priors were significantly different from those obtained by using uniform priors.

The information about uncertainty in a modeling exercise is typically obtained from the residual time series (the difference between observed and simulated streamflows), which is an aggregate of structural and measurement uncertainties for a fixed model parameter set. Using this residual time series, an estimate of total uncertainty may be obtained, but it is impossible to separate structural and measurement uncertainties. The separation of these two uncertainties is, however, required to facilitate the rejection of deficient model structures and to identify whether the model structure or the measurements need to be improved to reduce the total uncertainty. The only way to achieve this goal is to obtain an estimate of measurement uncertainty before model calibration. An estimate of measurement uncertainty in streamflow can be obtained by rating-curve analysis, but it is difficult to obtain an estimate of measurement uncertainty in rainfall. In this study, the classic idea of repeated sampling is used to estimate measurement uncertainty in rainfall and streamflows. In the repeated-sampling scheme, an experiment is performed several times under identical conditions to obtain an estimate of measurement uncertainty. This kind of repeated sampling, however, is not strictly possible for environmental observations; therefore, repeated sampling was applied in an approximate manner using a machine learning algorithm called random forest (RF). The main idea is to identify rainfall-runoff events across several different watersheds that are similar to each other, such that they can be thought of as different realizations of the same experiment performed under identical conditions. The uncertainty bounds obtained by RF were compared against the uncertainty bands obtained by rating-curve analysis and by the runoff-coefficient method. Overall, the results of this study are encouraging for the use of RF as a pseudo repeated sampler.

In the fourth objective, the importance of uncertainty in streamflows estimated at ungauged locations and in streamflows measured at gauged locations is illustrated for water quality modeling. The results showed that it is not enough to obtain an uncertainty bound that envelops the true streamflows; for water quality modeling, the individual realizations produced by the uncertainty model should also be able to emulate the shape of the true streamflow time series.
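The following sketch illustrates the pseudo-repeated-sampling idea in a simplified form: a random forest is fitted to synthetic rainfall-runoff events, and the spread of per-tree predictions for a new event is read as an approximate uncertainty band. The features, the synthetic response, the noise model, and the percentile band are all illustrative assumptions, not the data or the exact procedure used in the dissertation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic rainfall-runoff events standing in for "similar events across watersheds".
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(5, 80, n),     # event rainfall depth (mm)
    rng.uniform(0.1, 0.9, n),  # antecedent wetness index (-)
    rng.uniform(1, 24, n),     # event duration (h)
])
true_runoff = 0.6 * X[:, 0] * X[:, 1]              # hypothetical response
y = true_runoff * rng.lognormal(0.0, 0.15, n)      # multiplicative "measurement" noise

rf = RandomForestRegressor(n_estimators=200, min_samples_leaf=5, random_state=0)
rf.fit(X, y)

event = np.array([[40.0, 0.5, 6.0]])               # a new event
per_tree = np.array([tree.predict(event)[0] for tree in rf.estimators_])
lo, hi = np.percentile(per_tree, [5, 95])
print(f"runoff estimate ~ {per_tree.mean():.1f} mm, approx. 5-95% band [{lo:.1f}, {hi:.1f}] mm")
```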
|
89 |
Nonlinear Uncertainty Quantification, Sensitivity Analysis, and Uncertainty Propagation of a Dynamic Electrical Circuit. Doty, Austin, January 2012
No description available.
|
90 |
Propagation of Unit Location Uncertainty in Dense Storage Environments. Reilly, Patrick, 01 January 2015
Effective space utilization is an important consideration in logistics systems and is especially important in dense storage environments. Dense storage systems provide high space utilization; however, because not all items are immediately accessible, storage and retrieval operations often require shifting other stored items to access the desired item, which results in item-location uncertainty when asset tracking is insufficient. Given an initial certainty in item location, we use Markovian principles to quantify the growth of uncertainty as a function of retrieval requests and find that the steady-state probability distribution for any communicating class of storage locations approaches the uniform distribution. Using this result, an expected search time model is developed and applied to the systems analyzed. We also develop metrics that quantify and characterize uncertainty in item location to aid in understanding the nature of that uncertainty. By incorporating uncertainty into our logistics model and conducting numerical experiments, we gain valuable insights into the uncertainty problem, such as the benefit of multiple item copies in reducing expected search time and the varied response to different retrieval policies in otherwise identical systems.
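A small sketch of the Markovian view described above: an item's location distribution is propagated through an illustrative doubly stochastic transition matrix (which has a uniform stationary distribution), and the expected sequential search time is then computed from that limiting distribution. The transition probabilities and search costs are illustrative assumptions, not values from the thesis.

```python
import numpy as np

# Illustrative doubly stochastic transition matrix over four storage positions:
# each retrieval may shuffle the item to a neighbouring position.
P = np.array([
    [0.7, 0.3, 0.0, 0.0],
    [0.3, 0.4, 0.3, 0.0],
    [0.0, 0.3, 0.4, 0.3],
    [0.0, 0.0, 0.3, 0.7],
])

belief = np.array([1.0, 0.0, 0.0, 0.0])   # item initially known to be in position 1
for k in (1, 5, 20, 100):
    # Location distribution after k retrieval requests; it tends to uniform.
    print(k, np.round(np.linalg.matrix_power(P.T, k) @ belief, 3))

# With a uniform location distribution and a search cost of 1, 2, 3, 4 checks for
# positions 1..4, the expected sequential search time is the average position cost.
uniform = np.full(4, 0.25)
expected_search = uniform @ np.arange(1, 5)
print("expected search time:", expected_search)
```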
|