1 |
The mathematics of hedging
Chen, Yi-Jen Elaine, 24 August 2010
Knowing how to hedge energy price risks properly is essential for running a long-term business. Many hedging instruments have been invented and are now widely used; with these derivatives, decision makers reduce price risk to a certain degree.
To apply these hedging instruments correctly within a hedging strategy, it is necessary to be familiar with the tools in the first place. This work introduces the financial instruments widely used in hedging, including forward contracts, futures, swaps, and options, as well as the hedging strategies used in energy hedging. Since individuals build strategies according to their own risk appetite and the information they collect, this work presents three risk appetites and a method for distinguishing valuable information.
Building on this thesis, future work can connect information valuation and energy hedging by examining how the hedging ratio changes under each risk appetite.
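For context on the hedge ratio mentioned above, the sketch below computes the classical minimum-variance hedge ratio, h* = Cov(dS, dF) / Var(dF), from spot and futures price changes; the data series are hypothetical and are not taken from the thesis.

```python
import numpy as np

# Illustrative (hypothetical) spot and futures price changes for an energy commodity.
spot_changes = np.array([1.2, -0.8, 0.5, 2.1, -1.4, 0.9])
futures_changes = np.array([1.0, -0.7, 0.6, 1.9, -1.2, 0.8])

# Classical minimum-variance hedge ratio: h* = Cov(dS, dF) / Var(dF),
# equivalently rho * sigma_S / sigma_F.
h_star = (np.cov(spot_changes, futures_changes, ddof=1)[0, 1]
          / np.var(futures_changes, ddof=1))
print(f"minimum-variance hedge ratio: {h_star:.3f}")
```

A more or less risk-averse hedger would scale this ratio up or down, which is one way the three risk appetites discussed above could enter the strategy.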
|
2 |
Optimal Sensor Placement for Infrastructure System Monitoring using Probabilistic Graphical Models and Value of Information
Malings, Carl Albert, 01 May 2017
Civil infrastructure systems form the backbone of modern civilization, providing the basic services that allow society to function. Effective management of these systems requires decision-making about the allocation of limited resources to maintain and repair infrastructure components and to replace failed or obsolete components. Making informed decisions requires an understanding of the state of the system; such an understanding can be achieved through a computational or conceptual system model combined with information gathered on the system via inspections or sensors. Gathering of this information, referred to generally as sensing, should be optimized to best support the decision-making and system management processes, in order to reduce long-term operational costs and improve infrastructure performance. In this work, an approach to optimal sensing in infrastructure systems is developed by combining probabilistic graphical models of infrastructure system behavior with the value of information (VoI) metric, which quantifies the utility of information gathering efforts (referred to generally as sensor placements) in supporting decision-making in uncertain systems. Computational methods are presented for the efficient evaluation and optimization of the VoI metric based on the probabilistic model structure. Various case studies on the application of this approach to managing infrastructure systems are presented, illustrating the flexibility of the basic method as well as various special cases for its practical implementation.
Three main contributions are presented in this work. First, while the computational complexity of the VoI metric generally grows exponentially with the number of components, growth can be greatly reduced in systems with certain topologies (designated as cumulative topologies). Following from this, an efficient approach to VoI computation based on a cumulative topology and Gaussian random field model is developed and presented. Second, in systems with non-cumulative topologies, approximate techniques may be used to evaluate the VoI metric. This work presents extensive investigations of such systems and draws some general conclusions about the behavior of this metric. Third, this work presents several complete application cases for probabilistic modeling techniques and the VoI metric in supporting infrastructure system management. Case studies are presented in structural health monitoring, seismic risk mitigation, and extreme temperature response in urban areas.
Other minor contributions included in this work are theoretical and empirical comparisons of the VoI with other sensor placement metrics and an extension of the developed sensor placement method to systems that evolve in time. Overall, this work illustrates how probabilistic graphical models and the VoI metric can allow for efficient sensor placement optimization to support infrastructure system management. Areas of future work to expand on the results presented here include the development of approximate, heuristic methods to support efficient sensor placement in non-cumulative system topologies, as well as further validation of the efficient sensing optimization approaches used in this work.
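As a point of reference for the VoI metric used throughout this abstract, the minimal sketch below computes VoI for a single hypothetical component monitored by one imperfect sensor: the expected utility of the best maintenance decision after a Bayesian update on the sensor reading, minus the expected utility of the best decision without sensing. The states, actions, utilities, and sensor likelihoods are illustrative assumptions, not values from the thesis.

```python
# Hypothetical two-state component (intact / damaged) and two actions (do nothing / repair).
p_damaged = 0.2
utility = {("do_nothing", "intact"): 0.0, ("do_nothing", "damaged"): -100.0,
           ("repair", "intact"): -10.0, ("repair", "damaged"): -10.0}
p_state = {"intact": 1 - p_damaged, "damaged": p_damaged}
actions = ["do_nothing", "repair"]

def expected_utility(action, belief):
    return sum(belief[s] * utility[(action, s)] for s in belief)

# Prior value: best action without any sensing.
prior_value = max(expected_utility(a, p_state) for a in actions)

# Imperfect sensor: likelihood of an "alarm" reading given each state.
p_alarm_given = {"intact": 0.1, "damaged": 0.9}
p_alarm = sum(p_alarm_given[s] * p_state[s] for s in p_state)

preposterior_value = 0.0
for reading, p_reading in [("alarm", p_alarm), ("no_alarm", 1 - p_alarm)]:
    # Bayes update of the state belief for this reading.
    like = p_alarm_given if reading == "alarm" else {s: 1 - p_alarm_given[s] for s in p_state}
    posterior = {s: like[s] * p_state[s] / p_reading for s in p_state}
    preposterior_value += p_reading * max(expected_utility(a, posterior) for a in actions)

voi = preposterior_value - prior_value  # value of this sensor placement
print(f"VoI = {voi:.2f}")
```

Optimizing sensor placement then amounts to repeating this calculation for each candidate placement (or set of placements) and selecting the one with the highest VoI net of sensing cost; the thesis develops efficient ways to do this for large systems.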
|
3 |
Assessing Parameter Importance in Decision Models. Application to Health Economic Evaluations
Milev, Sandra, 25 February 2013
Background: Uncertainty in parameters is present in many risk assessment and decision-making problems and leads to uncertainty in model predictions. An analysis of the degree of uncertainty around the model inputs is therefore often needed. Importance analysis uses quantitative methods to identify the contribution of uncertain input parameters to output uncertainty. The expected value of partial perfect information (EVPPI) is the current gold-standard measure of parameter importance in health economics models, but the standard approach of estimating EVPPI through double Monte Carlo simulation (MCS) can require a long run time. Objective: To investigate different importance analysis techniques with the aim of finding an alternative with a shorter run time that still identifies the parameters contributing most to uncertainty in the model output. Methods: A health economics model was updated and served as a test bed for various importance analysis techniques. Twelve alternative techniques were applied: rank correlation analysis, contribution to variance analysis, mutual information analysis, dominance analysis, regression analysis, analysis of elasticity, ANCOVA, maximum separation distances analysis, sequential bifurcation, double MCS EVPPI, EVPPI-quadrature, and the EVPPI-single method. Results: Among these techniques, the dominance measure produced calibrated scores most closely correlated with the EVPPI calibrated scores. Performing a dominance analysis as a screening step to identify a subgroup of candidate parameters, and then performing EVPPI analysis only on the selected parameters, reduces the overall run time.
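For reference, the double Monte Carlo (nested-loop) estimate of EVPPI mentioned above can be sketched as follows for a toy decision model; the net-benefit function, parameter distributions, and sample sizes are hypothetical and are not taken from this work.

```python
import numpy as np

rng = np.random.default_rng(0)

def net_benefit(decision, theta, phi):
    # Toy net-benefit model (hypothetical); decision 0 = standard care, 1 = new treatment.
    return decision * (20000 * theta - 5000) + (1 - decision) * 1000 * phi

def sample_theta(n):  # parameter of interest (e.g., treatment effect)
    return rng.beta(2, 8, size=n)

def sample_phi(n):    # remaining parameters
    return rng.normal(5.0, 1.0, size=n)

n_outer, n_inner = 500, 500

# Baseline: expected net benefit of each decision over all parameter uncertainty.
theta_all, phi_all = sample_theta(n_outer * n_inner), sample_phi(n_outer * n_inner)
baseline = max(net_benefit(d, theta_all, phi_all).mean() for d in (0, 1))

# Double-loop Monte Carlo: outer loop over theta (the parameter whose EVPPI is sought),
# inner loop over the remaining parameters phi.
outer_best = []
for theta in sample_theta(n_outer):
    phi = sample_phi(n_inner)
    outer_best.append(max(net_benefit(d, theta, phi).mean() for d in (0, 1)))

evppi = np.mean(outer_best) - baseline
print(f"EVPPI(theta) ~= {evppi:.1f}")
```

The long run time comes from the n_outer x n_inner model evaluations per parameter, which is exactly what the screening techniques compared in the thesis try to avoid.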
|
4 |
Evaluation of information bundles in engineering decisions
Bakir, Niyazi Onur, 15 November 2004
This dissertation addresses the question of choosing the best information alternative in engineering decisions. The decision maker maximizes expected utility under uncertainty, where both the action taken and the state of the environment determine the payoff earned. The decision maker has the opportunity to gather information about the decision environment a priori, at a cost. There may be several information alternatives, and the decision maker has to determine which one offers better prospects for improving the decision.
Any decision environment characterized by a finite number of outcomes and a discrete probability distribution over those outcomes is a lottery. We analyze the value of information on a single outcome and determine the attributes of each piece of information that maximize its value. Information is valuable when it changes the decision. We show that if the number of optimal actions taken under different outcome scenarios is finite, the decision maker does not require perfect information. Further, we analyze the relation between the value of information and its determinants and show that a monotonic relation exists for a restricted class of information bundles and utility functions. We use different approaches to evaluate information and analyze the cases where preference reversals occur between approaches. We observe that a priori pricing of information does not necessarily induce the same ranking as the expected utility approach; however, both approaches agree on whether a given piece of information is valuable.
The second part of this dissertation evaluates information in both static and dynamic coinsurance problems. In static insurance decisions, we analyze the case where the decision maker gathers information about the severity of risk events and rank information bundles within a specific class. In dynamic insurance problems, we present a case study analyzing the different physical risks to which production facilities are exposed; here the information involves more detail about the timing of multiple risk events. We observe that information on events that pose relatively good scenarios for the decision maker has value; however, that value may diminish as the probability of occurrence decreases. The decision maker purchases more information as the profitability of the product increases and less information as the initial wealth increases. Furthermore, a lower cost of insurance does not necessarily make information more valuable, since the value is tied to the change in decisions rather than to the cost of taking a specific action.
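The comparison between a priori pricing of information and the expected-utility improvement discussed above can be illustrated with a minimal sketch: for a hypothetical two-outcome lottery and an exponential utility, it computes both the gain in expected utility from free perfect information and the buying price that leaves the informed decision maker indifferent to remaining uninformed. All payoffs and parameters are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical two-outcome lottery and two actions with state-dependent payoffs.
p_good = 0.6
payoff = {("invest", "good"): 50.0, ("invest", "bad"): -40.0,
          ("hold",   "good"):  5.0, ("hold",   "bad"):   5.0}
states = {"good": p_good, "bad": 1 - p_good}
actions = ["invest", "hold"]
wealth = 100.0

def u(x, r=0.02):                       # exponential (constant risk aversion) utility
    return -np.exp(-r * x)

def eu_without(price=0.0):              # choose one action before the outcome is known
    return max(sum(p * u(wealth - price + payoff[(a, s)]) for s, p in states.items())
               for a in actions)

def eu_with(price=0.0):                 # perfect information: act after observing the outcome
    return sum(p * max(u(wealth - price + payoff[(a, s)]) for a in actions)
               for s, p in states.items())

# Valuation 1: improvement in expected utility from free perfect information.
delta_eu = eu_with() - eu_without()

# Valuation 2: buying price, the fee that makes the informed and uninformed choices indifferent.
buying_price = brentq(lambda fee: eu_with(fee) - eu_without(0.0), 0.0, 100.0)

print(f"EU improvement: {delta_eu:.4f}, buying price: {buying_price:.2f}")
```

With these (assumed) numbers both valuations are positive, consistent with the statement that the two approaches agree on whether information is valuable, even though they need not rank different information bundles the same way.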
|
5 |
Value of information and portfolio decision analysis
Zan, Kun, 25 September 2013
Value of information (VOI) is the amount a decision maker is willing to pay for information to better understand the uncertainty surrounding a decision, prior to making the decision. VOI is a key part of decision analysis (DA), and in this age of information explosion, evaluating the value of information is critical. VOI research tries to derive generic conclusions regarding VOI properties; however, in most cases these properties depend on the specific decision context and may not be generalizable, so they have instead been derived for typical or representative decisions. VOI analysis has also been successfully applied to practical decision problems in a variety of industries and has been adopted as the basis of a heuristic algorithm in recent research in simulation and optimization. Portfolio Decision Analysis (PDA), rooted in DA, is a body of theories, methods, and practices that helps decision makers with a limited budget select a subset of candidate items through mathematical modeling that accounts for relevant constraints, preferences, and uncertainties. As one of the main tools for resource allocation problems, its successful implementation, especially in capital-intensive industries such as pharmaceuticals and oil & gas, has been documented (Salo, Keisler and Morton 2011). Although VOI and PDA have been extensively researched separately, their combination has received attention only recently. Resource allocation problems are ubiquitous, yet comparatively little effort has been devoted to understanding VOI within this setting and to the role of VOI analysis in solving resource allocation problems; this gap motivates the present work. We investigate VOI properties in portfolio contexts that can be modeled as a knapsack problem. By further examining these properties, we illustrate how VOI analysis can yield portfolio management insights that facilitate the PDA process. We also develop a method to evaluate the VOI of information portfolios and to examine how the VOI is affected by correlations between information sources. Last, we investigate the performance of a widely implemented portfolio selection approach, the benefit-cost ratio (BCR) approach, in PDA practice.
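As an illustration of the benefit-cost ratio approach examined in the last part, the sketch below compares greedy BCR selection with an exact solution of a tiny knapsack instance; the candidate projects, benefits, costs, and budget are hypothetical, and the example simply shows that the two can disagree.

```python
from itertools import combinations

# Hypothetical candidate projects: (name, expected benefit, cost).
projects = [("A", 10.0, 5.0), ("B", 5.5, 3.0), ("C", 5.5, 3.0)]
budget = 6.0

# Benefit-cost ratio heuristic: fund projects in decreasing benefit/cost order until the budget runs out.
bcr_portfolio, spent = [], 0.0
for name, benefit, cost in sorted(projects, key=lambda p: p[1] / p[2], reverse=True):
    if spent + cost <= budget:
        bcr_portfolio.append(name)
        spent += cost

# Exact 0-1 knapsack by enumeration (fine for a handful of candidates).
best_value, best_portfolio = 0.0, []
for k in range(len(projects) + 1):
    for combo in combinations(projects, k):
        cost = sum(p[2] for p in combo)
        value = sum(p[1] for p in combo)
        if cost <= budget and value > best_value:
            best_value, best_portfolio = value, [p[0] for p in combo]

print("BCR portfolio:  ", bcr_portfolio)            # picks the single high-ratio project A
print("Exact portfolio:", best_portfolio, best_value)  # B and C together are worth more
```

In this toy instance the BCR rule funds project A (value 10) while the optimal portfolio is B and C (value 11), which is the kind of behavior a study of the BCR approach in PDA practice would need to quantify.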
|
6 |
Three Perspectives on the Worth of Hydrologic Data
Kikuchi, Colin P., January 2015
Data collection is an integral part of hydrologic investigations; yet, hydrologic data collection is costly, particularly in subsurface environments. Consequently, it is critical to target data collection efforts toward prospective data sets that will best address the questions at hand, in the context of the study. Experimental and monitoring network designs that have been carefully planned with a specific objective in mind are likely to yield information-rich data that can address critical questions of concern. Conversely, data collection undertaken without careful planning may yield datasets that contain little information relevant to the questions of concern. This dissertation research develops and presents approaches that can be used to support careful planning of hydrologic experiments and monitoring networks. Specifically, three general types of problems are considered. Under the first problem type, the objective of the hydrologic investigation is to discriminate among rival conceptual models, or among rival predictive groupings. A Bayesian methodology is presented that can be used to rank prospective datasets during the planning phases of a hydrologic investigation. Under the second problem type, the objective is to quantify the impact of existing data on reductions in parameter uncertainty. An inverse modeling approach is presented to quantify the impact of existing data on parameter uncertainty when the hydrogeologic conceptual model is uncertain. The third and final problem type focuses on data collection in a water resource management context, with the specific goal to maximize profits without imposing adverse environmental impacts. A risk-based decision support framework is developed using detailed hydrologic simulation to evaluate probabilistic constraints. This enables direct calculation of the profit gains associated with prospective reductions in system parameter uncertainty, and the possible environmental impacts of unknown bias in the system parameters.
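One way to make the first problem type concrete is a preposterior calculation that scores a prospective measurement by the expected posterior probability assigned to the true model. The sketch below does this for two rival conceptual models and two candidate observation wells with Gaussian measurement noise; the model predictions and noise level are illustrative assumptions, not results from the dissertation.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Two rival conceptual models, equally probable a priori, each predicting a head value
# at two candidate observation wells (hypothetical predictions and noise level).
prior = np.array([0.5, 0.5])
predictions = {"well_1": np.array([10.0, 10.5]),   # models barely differ here
               "well_2": np.array([10.0, 13.0])}   # models differ strongly here
sigma = 1.0                                        # measurement noise standard deviation

def expected_true_model_probability(pred, n_samples=20000):
    """Preposterior score: E[ P(true model | y) ], averaging over models and noisy data."""
    score = 0.0
    for m, p_m in enumerate(prior):
        y = pred[m] + sigma * rng.standard_normal(n_samples)        # data if model m is true
        like = np.array([norm.pdf(y, loc=pred[k], scale=sigma) for k in range(len(prior))])
        posterior = prior[:, None] * like
        posterior /= posterior.sum(axis=0)
        score += p_m * posterior[m].mean()
    return score

for well, pred in predictions.items():
    print(well, f"expected posterior probability of true model = {expected_true_model_probability(pred):.3f}")
```

Ranking candidate wells (or entire prospective datasets) by such a score is one simple instance of the Bayesian data-worth ideas the dissertation develops in far more general form.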
|
7 |
Decadal Climate Variability: Economic Implications in Agriculture and Water in the Missouri River Basin
Fernandez Cadena, Mario, 16 December 2013
Economic research on the climate and productivity effects of ocean phenomena has mostly focused on interannual cases such as the El Niño Southern Oscillation. Here, decadal climate variability (DCV) refers to ocean-related climate influences lasting from seven to twenty years. The specific phenomena analyzed are the Pacific Decadal Oscillation, the Tropical Atlantic Gradient, and the West Pacific Warm Pool. Their positive and negative phases, occurring individually or in combination, are associated with variations in crop and water yields.
This dissertation examines the value of DCV information to agriculture and water users in the Missouri River basin using a price-endogenous agricultural and non-agricultural model that depicts cropping and water use. The model is used to evaluate the welfare gains and adaptations under various levels of DCV information.
The analysis shows that the value (as a 10-year average) of a perfect forecast is about 5.2 billion dollars, though 86% of this value, 4.55 billion dollars, can be obtained with a less-than-perfect forecast based on already available data, namely the prediction of DCV phase under transition probabilities. The results indicate that forecasting any DCV state is important because of differential responses in the acreage of major crops and in water use adjustments by residential, agricultural, and industrial users.
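The distinction between a perfect forecast and a forecast based on phase transition probabilities can be sketched for a stylized two-phase system: compare expected welfare when planning on climatology, on the most likely next phase given the current phase, and on the realized phase. The transition matrix and welfare table below are hypothetical and unrelated to the dissertation's estimates.

```python
import numpy as np

# Hypothetical two-phase DCV system, a transition matrix, and the welfare earned
# when the planner's cropping plan matches or mismatches the realized phase.
phases = ["wet", "dry"]
T = np.array([[0.8, 0.2],                 # P(next phase | current phase)
              [0.3, 0.7]])
stationary = np.array([0.6, 0.4])         # long-run phase frequencies (stationary distribution of T)
welfare = np.array([[100.0, 60.0],        # welfare[plan, realized phase]
                    [70.0, 90.0]])

# No forecast: plan for the climatologically best phase every year.
no_info = max(stationary @ welfare[plan] for plan in range(2))

# Transition-probability forecast: knowing the current phase, plan against the transition probabilities.
trans_info = sum(stationary[i] * max(T[i] @ welfare[plan] for plan in range(2))
                 for i in range(2))

# Perfect forecast: always plan for the realized phase.
perfect = stationary @ welfare.max(axis=0)

print(f"value of transition-probability forecast: {trans_info - no_info:.1f}")
print(f"value of perfect forecast:                {perfect - no_info:.1f}")
```

The point of the sketch is only structural: an imperfect, transition-based forecast recovers part of the value of a perfect forecast, which is the comparison the dissertation quantifies for the Missouri River basin.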
|
8 |
Centralization And Advance Quality Information In Remanufacturing
Unal, Muruvvet, 01 September 2009
In this study, the value of quality information and the effects of centralization are investigated for a reverse supply chain consisting of a remanufacturer and a collector. Used products are collected and inspected to classify them into quality groups, and they are then remanufactured to meet the demand for remanufactured products. The supply of collected products and the demand for remanufactured products are both price-sensitive. The uncertain quality of the collected products is revealed by an inspection process. Two quality classes are considered, and the cost of remanufacturing depends on the quality class. The main decisions are the acquisition fee for returns, the selling price of remanufactured products, and the transfer prices of inspected products between the collector and the remanufacturer. For this environment, centralized and decentralized settings are considered, and models that differ in the availability of quality information at the time the pricing decisions are made are built. We explore the value of advance quality information and the effects of centralization on the optimal prices and profits via a computational study.
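A minimal sketch of the centralized version of this problem, under assumed linear price-sensitive supply and demand and a simple grid search over the acquisition fee and selling price, is given below; the functional forms, cost parameters, and quality split are illustrative assumptions rather than the models used in the thesis.

```python
import numpy as np

# Hypothetical linear, price-sensitive supply of used products and demand for remanufactured ones.
def supply(acquisition_fee):           # returns collected per period
    return max(0.0, 50.0 * acquisition_fee)

def demand(selling_price):             # remanufactured units demanded per period
    return max(0.0, 200.0 - 10.0 * selling_price)

q_high = 0.6                           # fraction of returns inspected into the high-quality class
c_reman = {"high": 2.0, "low": 5.0}    # remanufacturing cost per unit, by quality class
c_inspect = 0.5                        # inspection cost per collected unit

def centralized_profit(fee, price):
    returns = supply(fee)
    d = demand(price)
    # Remanufacture cheap (high-quality) cores first, then low-quality cores, up to demand.
    from_high = min(q_high * returns, d)
    from_low = min((1 - q_high) * returns, d - from_high)
    sold = from_high + from_low
    revenue = price * sold
    cost = (fee + c_inspect) * returns + c_reman["high"] * from_high + c_reman["low"] * from_low
    return revenue - cost

# Grid search over the two centralized pricing decisions (a sketch, not the thesis's solution method).
fees = np.linspace(0.0, 5.0, 101)
prices = np.linspace(0.0, 20.0, 201)
best = max(((centralized_profit(f, p), f, p) for f in fees for p in prices), key=lambda t: t[0])
print(f"profit {best[0]:.1f} at acquisition fee {best[1]:.2f}, selling price {best[2]:.2f}")
```

The decentralized setting would add transfer prices between the collector and the remanufacturer and let each party optimize its own profit; comparing the two settings, with and without advance quality information, is the computational study described above.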
|