1

THINKING POKER THROUGH GAME THEORY

Palafox, Damian 01 June 2016 (has links)
Poker is a complex game to analyze. In this project we will use the mathematics of game theory to solve some simplified variations of the game. Probability is the building block behind game theory, so we must understand a few of its concepts, such as distributions, expected value, variance, and enumeration methods, to aid us in studying game theory. We will solve and analyze games through game theory using different decision methods, decision trees, and the process of domination and simplification. Poker models, with and without cards, will be provided to illustrate optimal strategies. Extensions to those models will be presented, and we will show that optimal strategies still exist. Finally, we will close this paper with an original extension that can be used as a medium for creating further extensions and/or different games to explore.
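The "domination and simplification" process the abstract mentions can be sketched for a tiny zero-sum game. The payoff matrix below is hypothetical; it is chosen so that iterated elimination of dominated strategies reduces the game completely.

```python
# A minimal sketch of solving a small zero-sum game by iterated
# elimination of dominated strategies. Payoffs are hypothetical.

def dominates(a, b):
    """Row a dominates row b if it pays at least as much everywhere
    and strictly more somewhere (payoffs to the row player)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# Hypothetical payoffs to the row player (rows: our strategies,
# columns: opponent's strategies).
M = [
    [3, 1],   # strategy A
    [2, 0],   # strategy B (dominated by A)
]

# Eliminate dominated rows.
rows = [r for r in M if not any(dominates(other, r) for other in M if other is not r)]

def col(m, j):
    return [r[j] for r in m]

# For the column (minimizing) player, a column dominates another if its
# payoffs to the row player are no larger anywhere and smaller somewhere.
ncols = len(rows[0])
keep = [j for j in range(ncols)
        if not any(all(x <= y for x, y in zip(col(rows, k), col(rows, j)))
                   and any(x < y for x, y in zip(col(rows, k), col(rows, j)))
                   for k in range(ncols) if k != j)]
reduced = [[r[j] for j in keep] for r in rows]
print(reduced)  # the game left after iterated elimination
```

Here elimination leaves a single cell, whose payoff is the value of the game; larger games generally need mixed strategies after this reduction step.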
2

The Utilization of Expert Advice: Effects of Cost and Accuracy

Sutherland, Steven C. 01 January 2009 (has links)
Effects of cost and accuracy on the decision to request and to utilize expert advice were investigated in two experiments using a choice task. Experiments 1 and 2 found that experienced accuracy significantly predicted requesting expert advice. Participants in Experiment 2 used very inaccurate experts to rule out the expert's option. Cost affected requesting advice in Experiment 1 only when it could exceed the amount that could be gained from a correct choice. Experiment 2 found a significant interaction between cost and experienced accuracy. Both experiments found that requesting advice was the only significant predictor of changing answers. The results did not support adherence to sunk costs in the decision to change answers.
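The cost/accuracy trade-off studied here can be framed as a simple expected-value comparison. All numbers below are hypothetical, since the abstract does not give the experiments' actual payoffs.

```python
# A hedged sketch of when requesting advice is worthwhile in expected
# value. The gain, unaided accuracy, and expert accuracy are hypothetical.

def ev_alone(p_self, gain):
    """Expected value of answering without advice."""
    return p_self * gain

def ev_with_advice(p_expert, gain, cost):
    """Expected value of buying advice and following it."""
    return p_expert * gain - cost

gain = 10.0      # hypothetical reward for a correct choice
p_self = 0.5     # hypothetical unaided accuracy
p_expert = 0.8   # hypothetical experienced accuracy of the expert

# Advice is worth requesting only while its cost stays below the expected
# accuracy gain; note that cost can exceed the gain itself, as in Experiment 1.
break_even = (p_expert - p_self) * gain
print(break_even)
```

Under these assumptions, any cost above the break-even point makes requesting advice a losing proposition even from a fairly accurate expert.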
3

The Influence of Relative Subjective Value on Preparatory Activity in the Superior Colliculus as Indexed by Saccadic Reaction Times

Milstein, David 26 June 2013 (has links)
Deal or no deal? Hold 'em or fold 'em? Buy, hold, or sell? When faced with uncertainty, a wise decision-maker evaluates each option and chooses the one they deem most valuable. Scientists studying decision-making processes have spent much theoretical and experimental effort formalizing a framework that captures how decision makers can maximize the subjective value they accrue from such decisions. This thesis tested two hypotheses: first, that subjective value guides our simplest and most common motor actions much as it guides more deliberative economic decisions; second, that subjective value is allocated across pre-motor regions of the brain to make our actions more efficient. To accomplish these goals, I adapted a paradigm used by behavioural economists for use in neurophysiological experiments in non-human primates. In our task, monkeys repeatedly made quick orienting eye movements, known as saccades, to targets which, they had learned through experience, had different values. In support of the hypothesis that subjective value influences simple motor actions, the speed with which monkeys responded, known as saccadic reaction time (SRT), and their saccadic choices to valued targets were highly correlated, and therefore both acted as behavioural measures of subjective value. Two complementary results support the hypothesis that subjective value influences activity in the intermediate layers of the superior colliculus (SCi), a well-studied brain region important to the planning and execution of saccades, to produce efficient actions. First, when saccades were elicited with microstimulation, we found that the timing and spatial allocation of pre-saccadic activity in the SCi was shaped by subjective value. Second, the baseline preparatory activity and transient visual activity of SCi neurons prior to saccade generation were also influenced by subjective value.
Our results can be incorporated into existing models of SC functioning that use dynamic neural field theory. I suggest that saccades of higher subjective value will result in higher activation of their associated neural field such that they will be more likely and more quickly selected. In summary, this thesis demonstrates that subjective value influences neural mechanisms, not only for deliberative decision making, but also for the efficient selection of simple motor actions. / Thesis (Ph.D, Neuroscience Studies) -- Queen's University, 2013-06-25 17:18:25.393
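The reported link between SRT and saccadic choice can be illustrated with a correlation over per-target summaries. The data below are hypothetical, chosen only to show the expected pattern: higher-valued targets draw faster (shorter-latency) saccades and are chosen more often, giving a strongly negative correlation.

```python
# A minimal sketch of correlating SRT with choice frequency across targets.
# The per-target values are hypothetical, not the thesis's data.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

srt_ms = [180, 195, 210, 230]        # hypothetical mean SRT per target
choice_rate = [0.9, 0.7, 0.4, 0.2]   # hypothetical choice frequency per target

r = pearson(srt_ms, choice_rate)
print(round(r, 3))  # strongly negative: faster saccades to preferred targets
```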
4

Evaluation of the Swedish Trade Council’s Business Opportunity Projects

Allerup, Jonas January 2010 (has links)
The purpose of this paper is to investigate the effects of the Business Opportunity Projects (BOPs) that the Swedish Trade Council uses to promote exports by small enterprises. The Business Opportunity Projects have the same setup at all offices where the Swedish Trade Council is established and are subsidized at 60 percent by the government. A ten-year dataset on the firms' financial state is used, together with survey interviews conducted in 2005/06 and 2007/08. From these data, three methods are applied: calculations of expected returns, a panel data model, and a probit model. The results show that the expected return of one project is around 250,000 SEK, and if the project is successful the average return is around 1,000,000 SEK. The governmental return is around 22 times the invested money. The probability of creating business volume, directly or indirectly, is around 45 percent. The projects are also shown to have an impact on the export turnover of the participating firms; the effect appears after two years and increases until four years after the BOP. The interpretation of the exact effect should be made with caution due to estimation issues. The results also indicate that a BOP generates around 1.5 employees on average. Participating firms gain no advantage from being larger, being from the middle region of Sweden, or belonging to a specific branch. Firms from the northern part of Sweden have a slightly smaller chance of a successful project, while projects carried out through Western European offices have a higher probability of success than those at other offices.
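The expected-return figures can be related by a probability-weighted calculation. The success probability (~45 percent) and the average return given success (~1,000,000 SEK) come from the abstract; project costs are not given, which is presumably why the gross figure below differs from the reported net expected return of ~250,000 SEK.

```python
# A hedged sketch of the expected-value arithmetic behind the reported
# figures. Costs are omitted because the abstract does not state them.

p_success = 0.45                 # probability of creating business volume
return_if_success = 1_000_000    # SEK, average return of a successful project

# Gross expected return: probability-weighted return before any costs.
gross_expected_return = p_success * return_if_success
print(gross_expected_return)     # SEK, before subtracting project costs
```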
5

Sequential Auction Design and Participant Behavior

Taylor, Kendra C. 20 July 2005 (has links)
This thesis studies the impact of sequential auction design on participant behavior from both a theoretical and an empirical viewpoint. In the first of the two analyses, three sequential auction designs are characterized and compared based on expected profitability to the participants. The optimal bid strategy is derived as well. One of the designs, the alternating design, is a new auction design and is a blend of the other two. It assumes that the ability to bid in or initiate an auction is given to each side of the market in an alternating fashion to simulate seasonal markets. The conditions for an equilibrium auction design are derived and characteristics of the equilibrium are outlined. The primary result is that the alternating auction is a viable compromise auction design when buyers and suppliers disagree on whether to hold a sequence of forward or reverse auctions. We also found the value of information on future private value for a strategic supplier in a two-period case of the alternating and reverse auction designs. The empirical work studies the cause of low aggregation of timber supply in reverse auctions of an online timber exchange. Unlike previous research results regarding timber auctions, which focus on offline public auctions held by the U.S. Forest Service, we study online private auctions between logging companies and mills. A limited survey of the online auction data revealed that the auctions were successful less than 50% of the time. Regression analysis is used to determine which internal and external factors to the auction affect the aggregation of timber in an effort to determine the reason that so few auctions succeeded. The analysis revealed that the number of bidders, the description of the good and the volume demanded had a significant influence on the amount of timber supplied through the online auction exchange. 
A plausible explanation for the low aggregation is that the exchange was better suited to check the availability for custom cuts of timber and to transact standard timber.
6

Statistical Idealities and Expected Realities in the Wavelet Techniques Used for Denoising

DeNooyer, Eric-Jan D. 01 January 2010 (has links)
In the field of signal processing, one of the underlying enemies in obtaining a good quality signal is noise. The most common examples of signals that can be corrupted by noise are images and audio signals. Since the early 1980s, a time when wavelet transformations became a modernly defined tool, statistical techniques have been incorporated into processes that use wavelets with the goal of maximizing signal-to-noise ratios. We provide a brief history of wavelet theory, going back to Alfréd Haar's 1909 dissertation on orthogonal functions, as well as its important relationship to the earlier work of Joseph Fourier (circa 1801), which brought about that famous mathematical transformation, the Fourier series. We demonstrate how wavelet theory can be used to reconstruct an analyzed function, ergo, that it can be used to analyze and reconstruct images and audio signals as well. Then, in order to ground the understanding of the application of wavelets to the science of denoising, we discuss some important concepts from statistics. From all of these, we introduce the subject of wavelet shrinkage, a technique that combines wavelets and statistics into a "thresholding" scheme that effectively reduces noise without doing too much damage to the desired signal. Subsequently, we discuss how the effectiveness of these techniques is measured, both in the ideal sense and in the expected sense. We then look at an illustrative example in the application of one technique. Finally, we analyze this example more generally, in accordance with the underlying theory, and make some conclusions as to when wavelets are an effective technique in increasing a signal-to-noise ratio.
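The wavelet-shrinkage idea the abstract describes can be sketched end to end with a one-level Haar transform and soft thresholding. The signal and threshold below are illustrative, not taken from the thesis.

```python
# A minimal, self-contained sketch of wavelet shrinkage: Haar transform,
# soft-threshold the detail coefficients, invert. Data are illustrative.
import math

SQRT2 = math.sqrt(2.0)

def haar_forward(signal):
    """One-level Haar transform: (approximation, detail) coefficients."""
    approx = [(a + b) / SQRT2 for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) / SQRT2 for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    """Invert the one-level Haar transform."""
    out = []
    for s, d in zip(approx, detail):
        out.append((s + d) / SQRT2)
        out.append((s - d) / SQRT2)
    return out

def soft_threshold(coeffs, lam):
    """Shrink coefficients toward zero; small ones (mostly noise) vanish."""
    return [math.copysign(max(abs(c) - lam, 0.0), c) for c in coeffs]

noisy = [1.0, 1.1, 4.0, 3.9, 1.05, 0.95, 4.1, 4.0]
approx, detail = haar_forward(noisy)
denoised = haar_inverse(approx, soft_threshold(detail, lam=0.2))
print([round(x, 2) for x in denoised])
```

Because every detail coefficient here falls below the threshold, the output is piecewise-flat: each pair collapses to its average, smoothing the jitter while preserving the large steps in the signal.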
7

The Practicality of Statistics: Why Money as Expected Value Does Not Make Statistics Practical

Reimer, Sean 01 January 2015 (has links)
This thesis covers the uncertainty of empirical prediction. As opposed to objectivity, I discuss the practicality of statistics, where practicality is defined as being "useful" in an unbiased sense, in relation to something in the external world that we care about. We want our model of prediction to give us unbiased inference while also being able to speak about something we care about. For the reasons explained, the inherent uncertainty of statistics undermines unbiased inference for many methods. Bayesian statistics, by valuing hypotheses, is more plausible but ultimately cannot arrive at an unbiased inference. I posit the value theory of money as a concept that might allow us to derive unbiased inferences while still concerning something we care about. However, money is of instrumental value, ultimately being worth less than an object of "transcendental value," which I define as something worth more than money, since money's purpose is to help us achieve "transcendental value" under the value theory. Ultimately, as long as an individual has faith in a given hypothesis, it will be worth more than any hypothesis valued with money. From there we undermine statistics' practicality: without the concept of money we have no way of valuing hypotheses unbiasedly, and uncertainty undermines the "objective" inferences we might otherwise have been able to make.
8

Zadávání veřejných zakázek z pohledu zadavatele / Public Procurement from the Perspective of the Contracting Authority

Židková, Michaela January 2016 (has links)
The aim of this master's thesis is primarily to develop a methodological framework for a contracting authority that receives an abnormally low bid. The theoretical part outlines the basic terms and definitions, analyzes a public tender from the perspective of a contracting authority, and defines an abnormally low bid price. The practical part applies the methodological framework for dealing with an abnormally low bid price to a case study, in which a set of selected mechanical items is used to identify possible above-limit costs among the unit prices of individual items of construction work.
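One common screening rule for an abnormally low bid is to flag any bid that falls well below the average of the competing bids. The sketch below uses that generic rule with an illustrative 75 percent threshold; it is not the specific framework the thesis develops.

```python
# A hedged sketch of a generic low-bid screening rule: flag bids lower
# than `threshold` times the mean of the other bids. Figures are hypothetical.

def abnormally_low(bids, threshold=0.75):
    """Return the bids lower than `threshold` x the mean of the other bids."""
    flagged = []
    for i, b in enumerate(bids):
        others = bids[:i] + bids[i + 1:]
        if b < threshold * (sum(others) / len(others)):
            flagged.append(b)
    return flagged

tenders = [980_000, 1_020_000, 1_050_000, 640_000]  # hypothetical bid prices
print(abnormally_low(tenders))
```

A flagged bid is not automatically rejected; as the thesis's framework implies, it triggers a request for the bidder to justify the price.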
9

The Relationship of Expected Value-based Risky Decision Making Tasks to Attitudes Toward Various Kinds of Risks

Brown, Andrew B. 04 August 2011 (has links)
No description available.
10

Methodologische Aspekte biomechanischer Messungen unter Laborbedingungen / Methodological Aspects of Biomechanical Measurements under Laboratory Conditions

Oriwol, Doris 30 March 2012 (has links) (PDF)
"Now tell me, where do you stand on measurement in the laboratory?" This, or something like it, is the Gretchen question this thesis raises about biomechanical analyses and studies of running carried out under laboratory conditions. Such work assumes that a laboratory measurement is a valid experimental operationalization of endurance running. Because of the spatial constraints, only a comparatively small number of individual trials can be recorded. For the statistical analysis, discrete parameters are then usually computed from the time series and, aggregated as means, must represent the subject. Using discrete parameters reduces the information captured in the time series considerably. This raises the question of whether a subject's variability should be assessed from discrete values or from the entire curve. A further question is to what extent the arithmetic mean over a large number of trials can be used as the statistic representing the subject, and whether its variability can be characterized from a finite number of repetitions. Two studies were conducted in which ground reaction forces and angular velocity were recorded during 100 runs on each of two measurement days in the laboratory. The statistical analyses cover the convergence of sequences of cumulative means, standard deviations, and root mean square errors, both for discrete parameters and for the complete recorded ground reaction force and angular velocity signals, as well as the examination of prediction bands. In addition, several algorithms for determining the minimum number of trials to record were developed. 
These include nonlinear regression models fitted to the cumulative area of the prediction bands of whole curves, and the analysis of differences between successive standard deviation curves. In summary, this work shows that the postulated sufficient and stable characterization of a subject by the arithmetic mean, and a complete and sound description of variability, could not be demonstrated for discrete parameters. For whole curves the picture was different: subjects could be characterized stably and sufficiently by the mean vertical ground reaction forces and by the ground reaction forces in the anterior-posterior direction, but this was not confirmed for the ground reaction forces in the mediolateral direction or for the angular velocity curve. The possibility of characterizing a subject's variability was, moreover, verified. If the original measurement procedure is retained, it is very likely that the resulting error influences the outcome of the statistical analysis and thus misrepresents properties of the underlying population. The use of the mean of discrete parameters should therefore be avoided: the error and its unknown magnitude are partly uncontrollable, and its effect on further biomechanical quantities cannot be checked. The assumption that a laboratory measurement can be regarded as a valid experimental operationalization of endurance running is therefore untenable. Future work should press ahead with research into new recording and analysis procedures, the alternative use of whole curves, and the development of new test methods.
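The idea of determining a minimum number of trials from the convergence of cumulative means can be sketched with a simple stopping rule. The rule below (cumulative mean changes by less than a tolerance for k consecutive trials) is an illustrative simplification, not the thesis's exact algorithm, and the per-trial values are hypothetical.

```python
# A hedged sketch of choosing a minimal trial count by watching the
# sequence of cumulative means stabilize. Rule and data are illustrative.

def minimal_trials(values, tol=0.05, k=3):
    """Smallest n whose cumulative mean changed by less than `tol`
    for the last `k` increments; None if it never stabilizes."""
    means, stable = [], 0
    total = 0.0
    for n, v in enumerate(values, start=1):
        total += v
        means.append(total / n)
        if n > 1 and abs(means[-1] - means[-2]) < tol:
            stable += 1
            if stable >= k:
                return n
        else:
            stable = 0
    return None

# Hypothetical peak vertical ground reaction forces (in body weights) per trial.
trials = [2.40, 2.55, 2.48, 2.52, 2.50, 2.49, 2.51, 2.50, 2.50, 2.50]
print(minimal_trials(trials))
```

As the thesis cautions, a rule like this can look stable for a discrete parameter while the full curve still varies, which is one reason the work favors whole-curve analyses.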
