451 |
A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation. Akram, Muhammad Farooq, 28 March 2012
The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Because of the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation is critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio. When multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent.
It is observed that, when knowledge about the problem is lacking, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions made during the elicitation process, where experts would otherwise be forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and the Dempster-Shafer theory of evidence are better suited for the quantification of epistemic uncertainty in the technology valuation process. The proposed technique seeks to offset some of the problems faced when using deterministic or traditional probabilistic approaches for uncertainty propagation. Non-linear behavior in technology interactions is captured through expert-elicitation-based technology synergy matrices (TSM). The proposed TSMs increase the fidelity of current technology forecasting methods by including higher-order technology interactions.
A test case for the quantification of epistemic uncertainty was selected: a large-scale combined cycle power generation system. A detailed multidisciplinary modeling and simulation environment was adopted for this problem. Results have shown that the evidence theory based technique provides more insight into the uncertainties arising from incomplete information or lack of knowledge than deterministic or probability theory methods. Margin analysis was also carried out for both techniques. The usage of TSMs in conjunction with technology impact matrices and technology compatibility matrices is described in detail. Various combination methods are also proposed for higher-order interactions, which can be applied according to expert opinion or historical data. The introduction of the technology synergy matrix enabled the capture of higher-order technology interactions and improved the predicted system performance.
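To make the evidence-theory machinery concrete, below is a minimal sketch of Dempster's rule of combination applied to two hypothetical expert opinions about a technology's impact level, together with the resulting belief and plausibility bounds. The discretized outcomes, mass assignments and function names are illustrative assumptions, not taken from the thesis.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.
    Focal elements are frozensets of discrete outcomes."""
    combined = {}
    conflict = 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

def belief(m, hypothesis):
    # masses of focal elements entirely contained in the hypothesis
    return sum(v for s, v in m.items() if s <= hypothesis)

def plausibility(m, hypothesis):
    # masses of focal elements that intersect the hypothesis
    return sum(v for s, v in m.items() if s & hypothesis)

# Hypothetical expert opinions on a technology's impact level ("low"/"med"/"high").
expert_1 = {frozenset({"low", "med"}): 0.6, frozenset({"med", "high"}): 0.4}
expert_2 = {frozenset({"med"}): 0.5, frozenset({"low", "med", "high"}): 0.5}

m = dempster_combine(expert_1, expert_2)
H = frozenset({"med", "high"})
print("belief:", belief(m, H), "plausibility:", plausibility(m, H))
```

The belief/plausibility pair gives a lower and upper bound on the probability of the hypothesis without forcing the experts to commit to a single distribution.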
|
452 |
Preserving Texture Boundaries for SAR Sea Ice Segmentation. Jobanputra, Rishi, January 2004
Texture analysis has been used extensively in the computer-assisted interpretation of SAR sea ice imagery. The provision of maps that distinguish relevant ice types is significant for monitoring global warming and for ship navigation. Given the abundance of SAR imagery available, there is a need to develop an automated approach for SAR sea ice interpretation. Grey level co-occurrence probability (GLCP) texture features are very popular for SAR sea ice classification. Although these features are used extensively in the literature, they have a tendency to erode and misclassify texture boundaries. Proposed here is an advancement of the GLCP method that preserves texture boundaries during image segmentation. The method exploits the relationship a pixel has with its closest neighbors and weights the texture measurement accordingly. These texture features are referred to as WGLCP (weighted GLCP) texture features. In this research, the WGLCP and GLCP feature sets are compared in terms of boundary preservation, unsupervised segmentation ability, robustness to increasing boundary density and computation time. The WGLCP method outperforms the GLCP method in all aspects except computation time. The comparative analysis revealed an inconsistency with the GLCP correlation statistic, which motivated an investigative study into using this statistic for image segmentation. As the overall goal of the thesis is to improve SAR sea ice segmentation accuracy, the concepts developed from the study are applied to the image segmentation problem. The results indicate that for images with high-contrast boundaries, the GLCP correlation statistical feature decreases segmentation accuracy. When comparing WGLCP and GLCP features for segmentation, the WGLCP features provide higher segmentation accuracy.
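For readers unfamiliar with co-occurrence features, here is a minimal sketch of computing a grey level co-occurrence matrix and a few standard GLCP statistics for one image patch and one displacement. The quantization, offset and chosen statistics are illustrative; the thesis's boundary-preserving WGLCP weighting is only described in the comment, not reproduced.

```python
import numpy as np

def glcm(patch, levels, dr=0, dc=1):
    """Normalized, symmetric grey level co-occurrence matrix for one offset.
    `patch` must already be quantized to integers in [0, levels)."""
    P = np.zeros((levels, levels))
    rows, cols = patch.shape
    for r in range(rows - dr):
        for c in range(cols - dc):
            i, j = patch[r, c], patch[r + dr, c + dc]
            P[i, j] += 1
            P[j, i] += 1          # symmetric accumulation
    return P / P.sum()

def glcp_stats(P):
    """A few common GLCP texture statistics."""
    i, j = np.indices(P.shape)
    nz = P[P > 0]
    return {
        "entropy": float(-np.sum(nz * np.log(nz))),
        "contrast": float(np.sum(P * (i - j) ** 2)),
        "uniformity": float(np.sum(P ** 2)),
    }

# Illustrative patch; a WGLCP-style variant would additionally weight each pixel's
# contribution by its relationship to its closest neighbours before accumulation.
patch = np.random.randint(0, 8, (32, 32))
print(glcp_stats(glcm(patch, levels=8)))
```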
|
453 |
The Impact of Performance Ratings on Federal Personnel Decisions. Oh, Seong Soo, 08 January 2010
Can pay-for-performance increase the motivation of public employees? By providing a basis for personnel decisions, particularly by linking rewards to performance, performance appraisals aim to increase employees' work motivation and ultimately to improve their work performance and organizational productivity. With the emphasis on results-oriented management, performance appraisals have become a key managerial tool in the public sector. Critics charge, however, that pay-for-performance is ineffective in the public sector, largely because the link between performance and rewards is weak. Yet no one has empirically measured the strength of that linkage. If performance ratings do have an impact on career success in the federal service, they might also contribute to race and gender inequality. Although many studies have examined factors affecting gender and racial differences in career success, studies that try to connect gender and racial inequalities to managerial tools are scarce. Using a one percent sample of federal personnel records, the first essay examines the impact of performance ratings on salary increases and promotion probabilities, and the second essay explores whether women and minorities receive lower ratings than comparable white males, and whether they receive lower returns on the same level of performance ratings. The first essay finds that performance ratings have only a limited impact on salary increases but significantly affect promotion probability. Thus, the argument that the performance-rewards link is weak could be partially correct if it considers only the pay-performance relationship. The second essay finds that women receive equal or higher performance ratings than comparable white men, but some minority male groups, particularly black men, tend to receive lower ratings than comparable white men. On the other hand, the returns on outstanding ratings do not differ between women and minority male groups and white men, though women appear to be at a disadvantage in promotion, even with the same high ratings as comparable men, in highly male-dominated occupations.
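As a rough sketch of how a ratings-promotion link of this kind might be estimated from personnel records, the snippet below fits a logistic regression of promotion on ratings and demographic controls. The column names and the randomly generated panel are placeholders, not the actual federal personnel data or the essays' specifications.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical employee-year panel; values are simulated placeholders.
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "promoted": rng.integers(0, 2, n),
    "rating":   rng.integers(1, 6, n),   # five-level performance rating
    "female":   rng.integers(0, 2, n),
    "minority": rng.integers(0, 2, n),
    "tenure":   rng.integers(1, 30, n),
})

X = sm.add_constant(df[["rating", "female", "minority", "tenure"]])
fit = sm.Logit(df["promoted"], X).fit(disp=False)
print(fit.summary())   # the "rating" coefficient measures the ratings-promotion link
```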
|
455 |
Knotting statistics after a local strand passage in unknotted self-avoiding polygons in Z^3. Szafron, Michael Lorne, 15 April 2009
We study here a model for a strand passage in a ring polymer about a randomly chosen location at which two strands of the polymer have been brought "close" together. The model is based on Θ-SAPs, which are unknotted self-avoiding polygons in Z^3 that contain a fixed structure Θ that forces two segments of the polygon to be close together. To study this model, the Composite Markov Chain Monte Carlo (CMCMC) algorithm, referred to as the CMC Θ-BFACF algorithm, that I developed and proved to be ergodic for unknotted Θ-SAPs in my M.Sc. thesis, is used. Ten simulations (each consisting of 9.6×10^10 time steps) of the CMC Θ-BFACF algorithm are performed and the results from a statistical analysis of the simulated data are presented. To this end, a new maximum likelihood method, based on previous work of Berretti and Sokal, is developed for obtaining maximum likelihood estimates of the growth constants and critical exponents associated respectively with the numbers of unknotted (2n)-edge Θ-SAPs, unknotted (2n)-edge successful-strand-passage Θ-SAPs, unknotted (2n)-edge failed-strand-passage Θ-SAPs, and (2n)-edge after-strand-passage-knot-type-K unknotted successful-strand-passage Θ-SAPs. The maximum likelihood estimates are consistent with the result (proved here) that the growth constants are all equal, and provide evidence that the associated critical exponents are all equal.
We then investigate the question "Given that a successful local strand passage occurs at a random location in a (2n)-edge knot-type K Θ-SAP, with what probability will the Θ-SAP have knot-type K' after the strand passage?". To this end, the CMCMC data is used to obtain estimates for the probability of knotting given a (2n)-edge successful-strand-passage Θ-SAP and the probability of an after-strand-passage polygon having knot-type K given a (2n)-edge successful-strand-passage Θ-SAP. The computed estimates numerically support the unproven conjecture that these probabilities, in the n→∞ limit, go to a value lying strictly between 0 and 1. We further prove here that the rate of approach to each of these limits (should the limits exist) is less than exponential.
We conclude with a study of whether or not there is a difference in the "size" of an unknotted successful-strand-passage Θ-SAP whose after-strand-passage knot-type is K when compared to the "size" of a Θ-SAP whose knot-type does not change after strand passage. The two measures of "size" used are the expected lengths of, and the expected mean-square radius of gyration of, subsets of Θ-SAPs. How these two measures of "size" behave as a function of a polygon's length and its after-strand-passage knot-type is investigated.
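For concreteness, here is a minimal sketch of a Berretti-Sokal-style maximum likelihood fit of a growth constant and critical exponent from sampled polygon lengths, assuming the sampled lengths follow P(2n) proportional to c_{2n} z^{2n} with c_{2n} approximately A mu^{2n} (2n)^{gamma-1}. The placeholder lengths, the fugacity value, and the exponent convention are illustrative assumptions, not the thesis's estimator.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, lengths, z, n_max=2000):
    """Model: the number of polygons of length n grows like mu**n * n**(gamma - 1),
    and a sampled length n occurs with probability proportional to that count times z**n."""
    log_mu, gamma = params
    n = np.arange(4, n_max + 1, 2)                      # admissible (even) polygon lengths
    log_w = n * (log_mu + np.log(z)) + (gamma - 1.0) * np.log(n)
    log_Z = np.logaddexp.reduce(log_w)                  # log of the normalizing constant
    log_p = lengths * (log_mu + np.log(z)) + (gamma - 1.0) * np.log(lengths) - log_Z
    return -np.sum(log_p)

lengths = np.array([100, 148, 96, 212, 174, 88, 130])   # placeholder sampled lengths (2n)
z = 0.2                                                 # fugacity assumed known from the sampler
res = minimize(neg_log_likelihood, x0=[np.log(4.68), 0.5],
               args=(lengths, z), method="Nelder-Mead")
log_mu_hat, gamma_hat = res.x
print("estimated growth constant:", np.exp(log_mu_hat), "critical exponent:", gamma_hat)
```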
|
456 |
Managing Uncertainty in Engineering Design Using Imprecise Probabilities and Principles of Information Economics. Aughenbaugh, Jason Matthew, 22 June 2006
The engineering design community recognizes that an essential part of the design process is decision making. Because decisions are generally made under uncertainty, engineers need appropriate methods for modeling and managing uncertainty. Two important characteristics of uncertainty in the context of engineering design are imprecision and irreducible uncertainty. In order to model both of these characteristics, it is valuable to use probabilities that are most generally imprecise and subjective. These imprecise probabilities generalize traditional, precise probabilities; when the available information is extensive, imprecise probabilities reduce to precise probabilities. An approach for comparing the practical value of different uncertainty models is developed. The approach examines the value of a model using the principles of information economics: value equals benefits minus costs. The benefits of a model are measured in terms of the quality of the product that results from the design process. Costs are measured not only in terms of direct design costs, but also in terms of the costs of creating and using the model. Using this approach, the practical value of using an uncertainty model that explicitly recognizes both imprecision and irreducible uncertainty is demonstrated in the context of a high-risk engineering design example in which the decision-maker has few statistical samples to support the decision. It is also shown that a particular imprecise probability model called probability bounds analysis generalizes sensitivity analysis, a process of identifying whether a particular decision is robust given the decision-maker's lack of complete information. An approach for bounding the value of future statistical data samples while collecting information to support design decisions is developed, and specific policies for making decisions in the presence of imprecise information are examined in the context of engineering.
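As a small illustration of decision making with imprecise probabilities, the sketch below bounds the expected cost of two candidate designs when only an interval is known for each failure probability. The designs, costs and probability intervals are hypothetical, and the interval arithmetic shown is a simplification of probability bounds analysis.

```python
def expected_cost_bounds(p_interval, cost_fail, cost_ok):
    """Bounds on expected cost when the failure probability is only known to an interval."""
    p_lo, p_hi = p_interval
    costs = [p * cost_fail + (1 - p) * cost_ok for p in (p_lo, p_hi)]
    return min(costs), max(costs)

# (failure probability interval, cost if it fails, nominal cost) -- all illustrative
designs = {
    "robust design":      ((0.001, 0.01), 5.0e6, 1.2e6),
    "lightweight design": ((0.001, 0.08), 5.0e6, 1.0e6),
}

for name, (p_int, c_fail, c_ok) in designs.items():
    lo, hi = expected_cost_bounds(p_int, c_fail, c_ok)
    print(f"{name}: expected cost in [{lo:,.0f}, {hi:,.0f}]")

# With precise probabilities the two designs could be ranked directly; with the
# intervals above their cost bounds overlap, so the comparison is indeterminate,
# which is a cue that collecting more information may be worth its cost.
```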
|
457 |
Derivative pricing based on time series models of default probabilities. Chang, Kai-hsiang, 02 August 2006
In recent years, much attention has been paid to derivative pricing subject to credit risk. In this paper, we propose an autoregressive time series model of log odds ratios to price derivatives. Examples of the proposed model are given via the structural and reduced form approaches. Pricing formulae for the proposed time series models are derived for bonds and options. Furthermore, simulation studies are performed to confirm the accuracy of the derived formulae.
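To give the flavor of such a model, here is a minimal sketch (not the paper's formulation) in which the log odds ratio of default follows an AR(1) process and a defaultable zero-coupon bond is priced by Monte Carlo. All parameter values, the recovery assumption and the monthly time grid are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
c, phi, sigma = -0.4, 0.9, 0.3        # AR(1) on the log odds: x_t = c + phi * x_{t-1} + eps_t
r, recovery = 0.03, 0.4               # flat risk-free rate and recovery rate
T, n_paths = 12, 100_000              # monthly periods to maturity, Monte Carlo paths

x = np.full(n_paths, c / (1.0 - phi))         # start at the stationary mean of the log odds
alive = np.ones(n_paths, dtype=bool)
payoff = np.zeros(n_paths)

for t in range(T):
    x = c + phi * x + sigma * rng.standard_normal(n_paths)
    p_default = 1.0 / (1.0 + np.exp(-x))      # per-period default probability from the log odds
    defaulted_now = alive & (rng.random(n_paths) < p_default)
    payoff[defaulted_now] = recovery * np.exp(-r * (t + 1) / 12.0)
    alive &= ~defaulted_now

payoff[alive] = np.exp(-r * T / 12.0)         # survivors receive the face value of 1 at maturity
print("Monte Carlo price of the defaultable zero-coupon bond:", payoff.mean())
```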
|
458 |
Bayesian Inference in ANOVA Models. Ozbozkurt, Pelin, 01 January 2010
Estimation of location and scale parameters from a random sample of size n is of paramount importance in Statistics. An estimator is called fully efficient if it attains the Cramer-Rao minimum variance bound besides being unbiased. The method that yields such estimators, at any rate for large n, is the method of modified maximum likelihood estimation. Apparently, such estimators cannot be made more efficient by using sample-based classical methods. That makes room for the Bayesian method of estimation, which engages prior distributions and likelihood functions. A formal combination of the prior knowledge and the sample information is called the posterior distribution. The posterior distribution is maximized with respect to the unknown parameter(s); that gives the HPD (highest probability density) estimator(s). Locating the maximum of the posterior distribution is, however, enormously difficult (computationally and analytically) in most situations. To alleviate these difficulties, we use the modified likelihood function in the posterior distribution instead of the likelihood function. We have derived the HPD estimators of the location and scale parameters of distributions in the Generalized Logistic family. We have extended the work to experimental design, namely one-way ANOVA. We have obtained the HPD estimators of the block effects and the scale parameter (in the distribution of errors); they have beautiful algebraic forms. We have shown that they are highly efficient. We have given real-life examples to illustrate the usefulness of our results. Thus, the enormous computational and analytical difficulties with the traditional Bayesian method of estimation are circumvented, at any rate in the context of experimental design.
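As a rough illustration of the posterior-maximization step, the following sketch numerically locates the posterior mode for the location and scale of a logistic sample under a flat prior on the location and a 1/sigma prior on the scale. The prior choice, the shape parameter, and the placeholder data are assumptions for illustration; this is not the thesis's modified-likelihood construction or its closed-form estimators.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_posterior(params, x, b=1.0):
    """Negative log posterior for a type-I generalized logistic sample with shape b,
    flat prior on the location and a 1/sigma prior on the scale (assumed priors)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    z = (x - mu) / sigma
    log_lik = np.sum(np.log(b) - z - np.log(sigma) - (b + 1.0) * np.log1p(np.exp(-z)))
    log_prior = -log_sigma                       # p(mu, sigma) proportional to 1/sigma
    return -(log_lik + log_prior)

rng = np.random.default_rng(2)
x = rng.logistic(loc=10.0, scale=2.0, size=200)  # placeholder data (b = 1 is the ordinary logistic)

res = minimize(neg_log_posterior, x0=[np.median(x), np.log(x.std())],
               args=(x,), method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print("posterior-mode (HPD-style) estimates:", mu_hat, sigma_hat)
```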
|
459 |
Probabilistic Latent Semantic Analysis Based Framework for Hybrid Social Recommender Systems. Eryol, Erkin, 01 June 2010
Today, there are user-annotated internet sites, user interaction logs, and online user communities, which are valuable sources of information for the personalized recommendation problem. In the literature, hybrid social recommender systems have been proposed to reduce the sparsity of the usage data by integrating user-related information sources. In this thesis, a method based on probabilistic latent semantic analysis is used as a framework for a hybrid social recommendation system. Different data hybridization approaches on top of probabilistic latent semantic analysis are experimented with. Based on this flexible probabilistic model, network regularization and model blending approaches are applied to the probabilistic latent semantic analysis model as a way to use the social trust network throughout the collaborative filtering process. The proposed model outperformed the baseline methods in our experiments. As a result of the research, it is shown that the proposed methods successfully model the rating and social trust data together in a theoretically principled way.
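For orientation, here is a minimal EM sketch of the plain aspect (PLSA) model on a user-item count matrix, P(u, i) = sum_z P(z) P(u|z) P(i|z). The toy matrix, the number of latent aspects and the iteration count are illustrative, and the thesis's hybridization with social trust data (network regularization, model blending) is not shown.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.integers(0, 5, size=(20, 15)).astype(float)   # users x items counts/ratings (toy data)
n_users, n_items = X.shape
K = 4                                                  # latent aspects

Pz = np.full(K, 1.0 / K)
Pu_z = rng.random((K, n_users))
Pu_z /= Pu_z.sum(axis=1, keepdims=True)
Pi_z = rng.random((K, n_items))
Pi_z /= Pi_z.sum(axis=1, keepdims=True)

for _ in range(100):
    # E-step: responsibilities P(z | u, i), shape (K, users, items)
    joint = Pz[:, None, None] * Pu_z[:, :, None] * Pi_z[:, None, :]
    resp = joint / joint.sum(axis=0, keepdims=True)
    # M-step: re-estimate the multinomial parameters from expected counts
    weighted = resp * X[None, :, :]
    Pz = weighted.sum(axis=(1, 2))
    Pz /= Pz.sum()
    Pu_z = weighted.sum(axis=2)
    Pu_z /= Pu_z.sum(axis=1, keepdims=True)
    Pi_z = weighted.sum(axis=1)
    Pi_z /= Pi_z.sum(axis=1, keepdims=True)

scores = (Pz[:, None, None] * Pu_z[:, :, None] * Pi_z[:, None, :]).sum(axis=0)
print("predicted affinity of user 0 for each item:", np.round(scores[0], 4))
```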
|
460 |
Development of a GIS Software for Evaluating Network Reliability of Lifelines under Seismic Hazard. Oduncucuoglu, Lutfi, 01 December 2010
Lifelines are vital networks, and it is important that those networks remain functional after major natural disasters such as earthquakes. The goal of this study is to develop a GIS software for evaluating the network reliability of lifelines under seismic hazard. In this study, GIS, statistics and facility management are used together, and a GIS software module, which constructs GIS-based reliability maps of lifeline networks, is developed using geoTools. The developed GIS module imports seismic hazard and lifeline network layers in GIS formats using geoTools libraries and, after creating a gridded network structure, uses a network reliability algorithm, initially developed by Yoo and Deo (1988), to calculate the upper and lower bounds of lifeline network reliability under seismic hazard. It can also display the results in graphical form and save them in shapefile format. In order to validate the developed application, results were compared with a former case study of Selcuk (2000), and the results are satisfactorily close to those of the previous study. As a result of this study, an easy-to-use, GIS-based software module that creates GIS-based reliability maps of lifelines under seismic hazard was developed.
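To illustrate the underlying reliability question, here is a minimal Monte Carlo sketch of source-to-sink connectivity reliability for a small lifeline graph with per-edge failure probabilities (which, in the thesis, would come from the seismic hazard layer). This is a brute-force estimate, not the Yoo and Deo (1988) bounding algorithm, and the network, node names and probabilities are illustrative.

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(4)
G = nx.Graph()
edges = [("source", "a", 0.05), ("source", "b", 0.10), ("a", "b", 0.02),
         ("a", "sink", 0.08), ("b", "sink", 0.12)]          # (u, v, failure probability)
G.add_weighted_edges_from(edges, weight="p_fail")

def survives(G, rng):
    # keep each edge independently with probability 1 - p_fail, then test connectivity
    H = nx.Graph((u, v) for u, v, d in G.edges(data=True)
                 if rng.random() > d["p_fail"])
    H.add_nodes_from(G)                 # keep isolated nodes so the path check is well defined
    return nx.has_path(H, "source", "sink")

n_trials = 50_000
hits = sum(survives(G, rng) for _ in range(n_trials))
print("estimated source-sink reliability:", hits / n_trials)
```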
|