331 |
An Application of Statistical Decision Theory to Farm Management in Sevier County, Utah / Lakawathana, Suwaphot, 01 May 1970
The major purpose of this study is to present selected empirical results of a study employing decision-making theory as a framework for considering decision making under risk. The particular problem involves choices between alternative crop rotations for Sevier County farmers. The study demonstrates the usefulness of Bayesian theory, which gives more than a point estimate.
A multiple regression model using two linear terms was employed to determine the influence of snow pack and reservoir storage on water availability for irrigation purposes during July, August, and September.
The Bayesian approach was employed. The optimal action or decision was first determined where only knowledge of the a priori probabilities of the states of nature was available. Optimal strategies were then determined where a run-off observation was available and the a posteriori probabilities of the states of nature were determined.
Study results indicate that the expected value of the additional information is substantial: it comes very close to the expected value of a perfect predictor and exceeds the expected value of the "no data" problem. This means that the Bayesian approach gives more than a point estimate and is useful for farm management decision making under risk.
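As a hedged illustration of the value-of-information calculation this abstract describes, the sketch below works a toy version of the problem: discrete water-supply states with prior probabilities, a payoff for each crop rotation under each state, and a run-off signal whose likelihoods update the prior by Bayes' rule. All payoffs, priors, and likelihoods are invented for illustration and are not the thesis's Sevier County data.

```python
# Toy expected-value-of-information calculation (all numbers hypothetical).
import numpy as np

payoff = np.array([            # rows: crop rotations; cols: water states
    [40.0, 60.0, 70.0],        # rotation A under low / medium / high water
    [10.0, 55.0, 90.0],        # rotation B
])
prior = np.array([0.3, 0.4, 0.3])    # a priori P(state)
like = np.array([                    # P(run-off signal | state), assumed
    [0.7, 0.2, 0.1],                 # signal "low run-off"
    [0.2, 0.6, 0.2],                 # signal "medium run-off"
    [0.1, 0.2, 0.7],                 # signal "high run-off"
])

ev_no_data = max(payoff @ prior)          # best action on the prior alone
ev_perfect = prior @ payoff.max(axis=0)   # value with a perfect predictor

p_signal = like @ prior                   # marginal P(signal)
ev_with_data = 0.0
for z in range(like.shape[0]):
    posterior = like[z] * prior / p_signal[z]        # Bayes' rule
    ev_with_data += p_signal[z] * max(payoff @ posterior)

print("EV (no data):       ", ev_no_data)
print("EV (run-off signal):", ev_with_data)   # EVSI = this minus ev_no_data
print("EV (perfect info):  ", ev_perfect)     # EVPI = this minus ev_no_data
```

The value of the run-off observation lands between the "no data" and perfect-predictor values, mirroring the abstract's finding.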
|
332 |
Confirmation Bias and Related Errors / Borthwick, Geoffrey Ludlow, 01 January 2010
This study attempted to replicate and extend the study of Doherty, Mynatt, Tweney, and Schiavo (1979), which introduced what is here called the Bayesian conditionals selection paradigm. The present study used this paradigm (and a script similar to that used by Doherty et al.) to explore confirmation bias and related errors that can appear in both search and integration in probability revision. Despite selection differences and weak manipulations, this study provided information relevant to four important questions. First, by asking participants to estimate the values of the conditional probabilities they did not learn, this study was able to examine the use of "intuitive conditionals". This study found evidence that participants used intuitive conditionals and that their intuitive conditionals were affected by the size of the actual conditionals. Second, by examining both phases in the same study, this study became the first to look for inter-phase interactions. A strong correlation was found between the use of focal search strategies and focal integration strategies (r=.81, p
|
333 |
Bayesian inquiry: an approach to the use of experts / Yee, King G., 01 January 1976
Subjective information is a valuable resource; however, decision-makers often ignore it because of difficulties in eliciting it from assessors. This thesis, on Bayesian inquiry, presents an approach to eliciting subjective information from assessors. Based on the concepts of cascaded inference and Bayesian statistics, the approach is designed to reveal to the decision-maker the way in which the assessor considers his options and the reasons he has for selecting particular alternatives. Unlike previous work on cascaded inference, the approach here focuses on incoherency. Specifically, it employs additional information to revise and check the estimates. The reassessment may be done directly or indirectly; the indirect procedure uses a second-order probability, or type II, distribution. An algorithm utilizing this approach is also presented. The methodology is applicable to any number of assessors. Procedures for aggregating and deriving surrogate distributions are also proposed.
|
334 |
Essays on commitment and flexibility / Payro Chew, Fernando, 30 October 2021
This dissertation consists of two essays studying economic agents who choose menus, or opportunity sets from which they will make a choice at a later date. The common theme is that the utility of a menu can be affected by inclusion of alternatives that are not subsequently chosen. This effect can create either a preference for commitment or a preference for flexibility.
The first essay models an agent who experiences temptation when choosing from a menu of lotteries, and who is self-aware and anticipates her future behavior when choosing between menus. Her desire to eliminate tempting alternatives from a menu creates a preference for commitment. When studying menus of lotteries, the literature has typically assumed that preferences satisfy the Independence axiom. Independence requires that the ranking of two menus is not affected if each is mixed (probabilistically) with a common third menu. In particular, the preference for commitment is invariant under Independence. This essay argues that intuitive behavior may require that the preference for commitment be affected by such mixing, and hence be mixture-dependent. To capture such behavior, a generalization of the temptation and self-control model of Gul and Pesendorfer (2001) is provided. The model generalizes Gul and Pesendorfer (2001) by replacing their Independence axiom with a suitably adapted version of the Mixture-Betweenness axiom of Chew (1989) and Dekel (1986). The result is an implicit utility model in which utility is defined as the unique solution of an equation that depends on the agent's commitment and temptation rankings, neither of which needs to satisfy the Independence axiom. Axiomatic characterization of the model exploits a novel extension of the Mixture Space Theorem to preferences that satisfy Mixture-Betweenness. Since the Mixture Space Theorem is central to decision theory, this extension is potentially useful for addressing issues in economics other than temptation and self-control.
The second essay explores the testable implications of the linear representations considered in Dekel et al. (2001). Dekel et al. (2001) extends the seminal model of preference for flexibility due to Kreps (1979) by considering menus of lotteries rather than deterministic alternatives. They show that a simple set of axioms characterizes a representation that can be interpreted as if the agent is uncertain about her future tastes. This taste uncertainty is summarized by the “subjective state space”, consisting of the set of possible future preferences over lotteries. Their approach is axiomatic; testability thus requires that the entire preference order be observable. This essay provides a corresponding revealed preference analysis and assumes that only finitely many choices are observed. It is shown that for a particular class of data sets, the characterizing conditions can be reformulated as nonlinear systems of inequalities for which the existence of solutions can be verified using numerical methods. Hence, for this type of data, the analysis provides a test of the subjective state space hypothesis that is, in principle, implementable. In addition, the analysis covers the case where the available data involve only menus of alternatives (and not lotteries). Hence, it also provides a revealed preference characterization of Kreps (1979).
|
335 |
Essays on Statistical Decision Theory and Econometrics / De Albuquerque Furtado, Bruno, January 2023
This dissertation studies statistical decision making in various guises. I start by providing a general decision theoretic model of statistical behavior, and then analyze two particular instances which fit in that framework.
Chapter 1 studies statistical decision theory (SDT), a class of models pioneered by Abraham Wald to analyze how agents use data when making decisions under uncertainty. Despite its prominence in information economics and econometrics, SDT has not been given formal choice-theoretic or behavioral foundations. This chapter axiomatizes preferences over decision rules and experiments for a broad class of SDT models. The axioms show how certain seemingly natural decision rules are incompatible with this broad class of SDT models. Using these representation results, I then develop a methodology to translate axioms from classical decision theory, à la Anscombe and Aumann (1963), to the SDT framework. The usefulness of this toolkit is then illustrated by translating various classical axioms, which serve to refine my baseline framework into more specific statistical decision theoretic models, some of which are novel to SDT. I also discuss foundations for SDT under other kinds of choice data.
Chapter 2 studies statistical identifiability of finite mixture models. If a model is not identifiable, multiple combinations of its parameters can lead to the same observed distribution of the data, which greatly complicates, if not invalidates, causal inference based on the model. High-dimensional latent parameter models, which include finite mixtures, are widely used in economics, but are only guaranteed to be identifiable under specific conditions. Since these conditions are usually stated in terms of the hidden parameters of the model, they are seldom testable using noisy data. This chapter provides a condition which, when imposed on the directly observable mixture distribution, guarantees that a finite mixture model is non-parametrically identifiable. Since the condition relates to an observable quantity, it can be used to devise a statistical test of identification for the model. Thus I propose a Bayesian test of whether the model is close to being identified, which the econometrician may apply before estimating the parameters of the model. I also show that, when the model is identifiable, approximate non-negative matrix factorization provides a consistent, likelihood-free estimator of mixture weights.
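As a schematic sketch of the matrix-factorization idea in Chapter 2: if each row of a matrix M is an observed (discretized) distribution that mixes K shared component distributions, non-negative matrix factorization recovers the mixture weights, up to permutation and scaling of the components. The simulated data, the use of scikit-learn's NMF, and the row renormalization below are illustrative assumptions, not the chapter's actual estimator or identification condition.

```python
# Recovering mixture weights by NMF on a stack of observed distributions.
# Everything here is simulated for illustration.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
H_true = rng.dirichlet(np.ones(20), size=2)   # 2 component distributions, 20 bins
W_true = rng.dirichlet(np.ones(2), size=30)   # mixture weights for 30 groups
M = W_true @ H_true                           # observed (noiseless) mixtures

model = NMF(n_components=2, init="nndsvda", max_iter=2000)
W_hat = model.fit_transform(M)                # M ~ W_hat @ model.components_
W_hat /= W_hat.sum(axis=1, keepdims=True)     # renormalize rows to weights

# W_hat matches W_true only up to column permutation and component scaling.
print(np.round(W_hat[:3], 3))
```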
Chapter 3 studies the robustness of pricing strategies when a firm is uncertain about the distribution of consumers' willingness-to-pay. When the firm has access to data to estimate this distribution, a simple strategy is to implement the mechanism that is optimal for the estimated distribution. We find that such an empirically optimal mechanism boasts strong profit and regret guarantees. Moreover, we provide a toolkit to evaluate the robustness properties of different mechanisms, showing how to consistently estimate and conduct valid inference on the profit generated by any one mechanism, which enables one to evaluate and compare their probabilistic revenue guarantees.
|
336 |
The effect of alternate information structures on probability revisions / Dickhaut, John Wilson, January 1970
No description available.
|
337 |
Bayesian analysis of particle tracking data using hierarchical models for characterization and design / Dhatt-Gauthier, Kiran, January 2022
This dissertation explores the intersection between the fields of colloid science and statistical inference where the stochastic trajectories of colloidal particles are captured by video microscopy, reconstructed using particle tracking algorithms, and analyzed using physics-based models and probabilistic programming techniques. Although these two fields may initially seem disparate, the dynamics of micro- and nano-sized particles dispersed in liquids at room temperature are inherently stochastic due to Brownian motion.
Further, both the particles under observation and their environment are heterogeneous, leading to variability between particles as well. We use Bayesian data analysis to infer the uncertain parameters of physics-based models that describe the observed trajectories, explicitly modeling the hierarchical structure of the noise under a set of varying experimental conditions.
We set the stage in Chapter 1 by introducing Robert Brown's curious observation of incessantly diffusing pollen grains and Albert Einstein's statistical physics model that describes their motion. We analyze Jean Baptiste Perrin's data from Les Atomes using a probabilistic model to infer the uncertain diffusivities of the colloids. We show how the Bayesian paradigm allows us to assign and update our credences before and after observing this data, and to quantify the information gained by the observation.
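As a minimal sketch of the conjugate Bayesian update involved in inferring a diffusivity from tracked displacements: in one dimension, a Brownian displacement over a frame interval dt is Normal(0, 2*D*dt), so an inverse-gamma prior on that variance gives a closed-form posterior. The frame interval, prior parameters, and simulated data below are illustrative assumptions, not Perrin's measurements.

```python
# Conjugate inverse-gamma update for a diffusivity (illustrative numbers).
import numpy as np

rng = np.random.default_rng(1)
dt = 0.05                              # s between frames (assumed)
D_true = 0.4                           # um^2/s, used only to simulate data
dx = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=200)  # displacements

alpha0, beta0 = 2.0, 0.05              # weak inverse-gamma prior on var = 2*D*dt
alpha_n = alpha0 + len(dx) / 2         # posterior shape
beta_n = beta0 + np.sum(dx**2) / 2     # posterior scale

# Posterior mean of the variance is beta_n / (alpha_n - 1); divide by 2*dt for D.
D_post_mean = beta_n / (alpha_n - 1) / (2 * dt)
print(f"posterior mean D = {D_post_mean:.3f} um^2/s (true {D_true})")
```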
In Chapter 2, we build on these concepts to provide insight on the phenomenon of enhanced enzyme diffusion, whereby enzymes are purported to diffuse faster in the presence of their substrate. We develop a hierarchical model of enzyme diffusion that describes the stochastic dynamics of individual enzymes drawn from a dispersed population. Using this model, we analyze single molecule imaging data of urease enzymes to infer their uncertain diffusivities for different substrate concentrations. Our analysis emphasizes the important role of model criticism for establishing self-consistency between experimental observations and model predictions; moreover, we caution against drawing strong conclusions when such consistency cannot be established.
In Chapter 3, we automate and optimize the data acquisition process, tuning a resonant acoustic cell using minimal experimental resources. By iterating a cycle of observation, inference, and design, we select the frequency of the applied signal and the framerate of the data acquisition, garnering the same amount of information as a grid-search approach with a fraction of the data.
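A toy sketch of one observe-infer-design step of the kind described here: pick the drive frequency that maximizes the expected information gain about an uncertain resonance frequency, under an assumed Lorentzian response with Gaussian noise. The response model, parameter and design grids, and noise level are invented for illustration and are not the chapter's acoustic cell.

```python
# Greedy Bayesian design: maximize expected information gain (toy model).
import numpy as np

f0_grid = np.linspace(0.8, 1.2, 81)              # candidate resonance frequencies
prior = np.full(f0_grid.shape, 1 / f0_grid.size) # uniform prior over the grid
designs = np.linspace(0.7, 1.3, 25)              # drive frequencies we could apply
sigma, width = 0.05, 0.05                        # assumed noise and linewidth

def response(f, f0):                             # assumed Lorentzian amplitude
    return 1.0 / (1.0 + ((f - f0) / width) ** 2)

def expected_info_gain(f, n_mc=300):
    rng = np.random.default_rng(3)               # common random numbers per design
    f0_samp = rng.choice(f0_grid, size=n_mc, p=prior)
    y = response(f, f0_samp) + rng.normal(0, sigma, n_mc)
    loglik = -0.5 * ((y[:, None] - response(f, f0_grid)) / sigma) ** 2
    post = prior * np.exp(loglik - loglik.max(axis=1, keepdims=True))
    post /= post.sum(axis=1, keepdims=True)
    # EIG = E_y[ KL(posterior || prior) ], the mutual information I(f0; y)
    kl = np.sum(post * np.log(post / prior + 1e-300), axis=1)
    return kl.mean()

best = max(designs, key=expected_info_gain)
print(f"most informative drive frequency = {best:.3f}")
```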
Finally, in Chapter 4, we discuss the role of Bayesian inference and design in optimizing functional goals and discuss selected examples where black-box techniques may prove useful. We review the current state of the art for magnetically actuated colloids and pose the search for autonomous magnetic behaviors as a design problem, offering insight as we seek to augment and accelerate the capabilities of micron-scale magnetically actuated colloids using modern computational techniques.
|
338 |
Bayesian collocation tempering and generalized profiling for estimation of parameters from differential equation models / Campbell, David Alexander, January 2007
No description available.
|
339 |
Comparison of two drugs by multiple stage sampling using Bayesian decision theory / Smith, Armand V., 02 February 2010
The general problem considered in this thesis is to determine an optimum strategy for deciding how to allocate the observations in each stage of a multi-stage experimental procedure between two binomial populations (e.g., the numbers of successes for two drugs) on the basis of the results of previous stages. After all of the stages of the experiment have been performed, one must make the terminal decision of which of the two populations has the higher probability of success. The optimum strategy is to be optimum relative to a given loss function; and a prior distribution, or weighting function, for the probabilities of success for the two populations is assumed. Two general classes of loss functions are considered, and it is assumed that the total number of observations in each stage is fixed prior to the experiment.
In order to find the optimum strategy a method of analysis called extensive-form analysis is used. This is essentially a method for enumerating all the possible outcomes and corresponding strategies and choosing the optimum strategy for a given outcome. However, it is found that this method of analysis is much too long for all but small examples even when a digital computer is used.
Because of this difficulty two alternative procedures, which are approximations to extensive-form analysis, are proposed.
In the stage-by-stage procedure one assumes that at each stage he is at the last stage of his multi-stage procedure and allocates his observations to each of the two populations accordingly. It is shown that this is equivalent to assuming at each stage that one has a one-stage procedure.
In the approximate procedure one (approximately) minimizes the posterior variance of the difference of the probabilities of success for the two populations at each stage. The computations for this procedure are quite simple to perform.
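As a minimal sketch, under stated assumptions, of an allocation rule in the spirit of this approximate procedure: with Beta posteriors for the two success probabilities, split the next stage's observations so as to (approximately) minimize the posterior variance of their difference, approximating each drug's new variance by adding its share of observations at the current posterior mean. The priors and stage size are illustrative; this is not the thesis's exact rule.

```python
# Approximate allocation: minimize Var(p1) + Var(p2) after the next stage.

def beta_var(a, b):
    return a * b / ((a + b) ** 2 * (a + b + 1))

def approx_var_after(a, b, n_extra):
    m = a / (a + b)                    # current posterior mean success rate
    return beta_var(a + n_extra * m, b + n_extra * (1 - m))

def allocate(a1, b1, a2, b2, n):
    # choose n1 in 0..n minimizing the approximate total posterior variance
    return min(range(n + 1),
               key=lambda n1: approx_var_after(a1, b1, n1)
                            + approx_var_after(a2, b2, n - n1))

# Rectangular (Beta(1,1)) priors updated by stage-1 results 7/10 and 3/5:
n1 = allocate(1 + 7, 1 + 3, 1 + 3, 1 + 2, 20)
print(f"assign {n1} of 20 next-stage observations to drug 1, {20 - n1} to drug 2")
```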
The stage-by-stage procedure is also considered for the case where the two populations are normal with known variance rather than binomial. It is then shown that the approximate procedure can be derived as an approximation to the stage-by-stage procedure when normal approximations to binomial distributions are used.
The three procedures are compared with each other and with equal division of the observations in several examples by computing the probability of making the correct terminal decision for various values of the population parameters (the probabilities of success). It is assumed in these computations that the prior distributions of the population parameters are rectangular distributions and that the loss functions are symmetric, i.e., the losses are as great for one wrong terminal decision as they are for the other. These computations show that, for the examples studied, there is relatively little loss in using the stage-by-stage procedure rather than extensive-form analysis and relatively little gain in using the approximate procedure instead of equal division of the observations. However, there is a relatively large loss in using the approximate procedure rather than the stage-by-stage procedure when the population parameters are close to 0 or 1.
At first it is assumed that there are a fixed number of stages in the experiment, but later in the thesis this restriction is relaxed so that only the maximum possible number of stages is fixed and the experiment can be stopped at any stage before the last possible stage is reached. Stopping rules for the stage-by-stage and the approximate procedures are then derived. / Ph. D.
|
340 |
Investigation of the rate of convergence in the two sample nonparametric empirical Bayes approach to an estimation problem / Wang, Alan Then-Kang, January 1965
In this thesis we consider the following. We choose the random variable θ, which has some fixed but unknown distribution with a finite second moment. We observe the value x of a preliminary random variable X, which has an unknown distribution conditional on θ. Using x and our past experience, we are asked to estimate the value of θ under a squared error loss function. After we have made our decision we are given the value y of a detailed random variable Y, which has an unknown distribution conditional on θ. The random variables X and Y are assumed independent given a particular θ. Our past experience consists of the values of preliminary and detailed random variables from previous decision problems which are independent of, but similar to, the present one.
With the risk defined in the usual way, the Bayes decision function is the expected value of θ given that X = x. Since the distributions are unknown, the use of the two sample nonparametric empirical Bayes decision function is proposed. With the regret defined in the usual way, it can be shown that the two sample nonparametric empirical Bayes decision function is asymptotically optimal, i.e., for a large number of past decision problems the regret in using it tends to zero. The main purpose of this thesis is to verify this property using a hypothetical numerical example. / Master of Science
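As a minimal sketch of the two sample empirical Bayes idea, under the added assumption that the detailed variable Y is conditionally unbiased for θ: then E[Y | X = x] = E[θ | X = x], the Bayes rule, so a nonparametric regression of past detailed values on past preliminary values estimates the Bayes decision function. The Gaussian kernel, bandwidth, and simulated normal example are illustrative, not the thesis's construction.

```python
# Two sample empirical Bayes via kernel regression of past y on past x.
import numpy as np

rng = np.random.default_rng(2)
theta = rng.normal(0, 1, size=500)             # past (unobserved) parameters
x_past = theta + rng.normal(0, 1, size=500)    # past preliminary observations
y_past = theta + rng.normal(0, 0.2, size=500)  # past detailed observations

def eb_decision(x, h=0.3):
    w = np.exp(-0.5 * ((x - x_past) / h) ** 2) # Nadaraya-Watson weights
    return w @ y_past / w.sum()

# In this normal example the true Bayes rule is E[theta | x] = x / 2:
print(eb_decision(1.0), "vs Bayes rule", 0.5)
```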
|