61

Accurate Approximation Series for Optimal Targeting Regions in a Neural Growth Model with a Low Branching Probability

Nieto, Bernardo 16 December 2015 (has links)
Understanding the complex growth process of dendritic arbors is essential for the medical field and for disciplines such as biology and neuroscience. The establishment of dendritic patterns has received increasing attention from experimental researchers who seek to determine the cellular mechanisms at play in the growth of neural trees. Our goal in this thesis was to prove the recurrence formula for the probability distribution of all possible neural trees, as well as formulas for the expected number of active branches and their variances. We also derived formulas for the spatial locations of the optimal targeting region for a tree with a low branching probability. These formulas were necessary for the simplified stochastic computational model that Osan et al. have developed to examine how changes in branching probability influence the success of targeting neurons located at different distances from a starting point.
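The expected-branch-count idea can be illustrated with a minimal stochastic sketch. This is a deliberately simplified toy process (each active branch independently splits with a fixed probability per step), not the thesis's actual recurrence formulas:

```python
import random

def simulate_active_branches(p_branch, n_steps, rng):
    """One realization of a toy growth process: at each step every active
    branch independently splits into two with probability p_branch."""
    active = 1
    for _ in range(n_steps):
        active += sum(1 for _ in range(active) if rng.random() < p_branch)
    return active

def expected_active_branches(p_branch, n_steps):
    # Each branch splits with probability p, so E[N_{k+1}] = (1 + p) * E[N_k]
    # and E[N_n] = (1 + p)**n starting from a single branch.
    return (1.0 + p_branch) ** n_steps

rng = random.Random(42)
trials = [simulate_active_branches(0.1, 5, rng) for _ in range(20_000)]
mc_mean = sum(trials) / len(trials)   # should be close to (1.1)**5, about 1.61
```

In such models a low branching probability keeps the expected number of active branches growing only slowly, which is what makes the targeting-region question tractable.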
62

Robust coalition formation in a dynamic, contractless environment

Jones, Christopher Lyman 21 June 2010 (has links)
This dissertation focuses on robust coalition formation between selfish agents in a dynamic environment where contracts are unenforceable. Previous research on this topic has covered each different aspect of this problem, but no research successfully addresses these factors in combination. Therefore, a novel approach is required. This dissertation accordingly has three major goals: to develop a theoretical framework that describes how selfish agents should select jobs and partners in a dynamic, contractless environment, to test a strategy based on that framework against existing heuristics in a simulated environment, and to create a learning agent capable of optimally adjusting its coalition formation strategy based on the level of dynamic change found in its environment. Experimental results demonstrate that the Expected Utility (EU) strategy based on the developed theoretical framework performs better than strategies using heuristics to select jobs and partners, and strategies which simulate a centralized “manager”. Future work in this area includes altering the EU strategy from an anytime strategy to a hill-climbing one, as well as further game theoretic explorations of the interactions between different strategies.
63

The Influence of Relative Subjective Value on Preparatory Activity in the Superior Colliculus as Indexed by Saccadic Reaction Times

Milstein, David 26 June 2013 (has links)
Deal or no deal? Hold ‘em or fold ‘em? Buy, hold or sell? When faced with uncertainty, a wise decision-maker evaluates each option and chooses the one they deem most valuable. Scientists studying decision-making processes have spent much theoretical and experimental effort formalizing a framework that captures how decision makers can maximize the subjective value they accrue from such decisions. This thesis tested two hypotheses. The first was that subjective value guides our simplest and most common motor actions, much as it guides more deliberative economic decisions. The second was that subjective value is allocated across pre-motor regions of the brain to make our actions more efficient. To accomplish these goals, I adapted a paradigm used by behavioural economists for neurophysiological experiments in non-human primates. In our task, monkeys repeatedly made quick, orienting eye movements, known as saccades, to targets that, as they learned through experience, had different values. In support of the hypothesis that subjective value influences simple motor actions, the speed with which monkeys responded, known as saccadic reaction time (SRT), and their saccadic choices to valued targets were highly correlated, and therefore both acted as behavioural measures of subjective value. Two complementary results support the hypothesis that subjective value influences activity in the intermediate layers of the superior colliculus (SCi), a well-studied brain region important to the planning and execution of saccades, to produce efficient actions. First, when saccades were elicited with microstimulation, we found that the timing and spatial allocation of pre-saccadic activity in the SCi was shaped by subjective value. Second, the baseline preparatory activity and transient visual activity of SCi neurons prior to saccade generation were also influenced by subjective value.
Our results can be incorporated into existing models of SC functioning that use dynamic neural field theory. I suggest that saccades of higher subjective value will result in higher activation of their associated neural field, such that they will be more likely and more quickly selected. In summary, this thesis demonstrates that subjective value influences neural mechanisms, not only for deliberative decision making, but also for the efficient selection of simple motor actions. / Thesis (Ph.D., Neuroscience Studies), Queen's University, 2013.
64

Conditional betas, higher comoments and the cross-section of expected stock returns

Xu, Lei January 2010 (has links)
This thesis examines the performance of different models of conditional betas and higher comoments in the context of the cross-section of expected stock returns, both in-sample and out-of-sample. In Chapter 3, I first examine the performance of different conditional market beta models using monthly returns of the Fama-French 25 portfolios formed on quintiles of size and book-to-market ratio. This is a cross-sectional test of the conditional CAPM. The models examined include simple OLS regressions, the macroeconomic variables model, the state-space model, the multivariate GARCH model and the realized beta model. The results show that the state-space model performs best in-sample, with significant betas and insignificant intercepts. Out-of-sample, however, none of the models examined can explain the returns of the 25 portfolios. Next, in Chapter 4, I examine the recently proposed realized beta model, which is based on the realized volatility literature, using individual stocks listed in the US market. I extend the realized market beta model to betas of multi-factor asset pricing models. The models tested are the CAPM, the Fama-French three-factor model and a four-factor model comprising the three Fama-French factors and a momentum factor. Realized betas of the different models are used in cross-sectional regressions along with firm-level variables such as size, book-to-market ratio and past returns. The in-sample results show that market beta is significant and that the additional betas of multi-factor models can reduce, although not eliminate, the effects of firm-level variables. The out-of-sample results show that no betas are significant. The results are robust across different markets such as NYSE, AMEX and NASDAQ. In Chapter 5, I test whether realized coskewness and cokurtosis can help explain the cross-section of stock returns by adding them to the factor pricing models tested in Chapter 4.
The results show that the coefficients of coskewness and cokurtosis have the correct sign as predicted by the higher-moment CAPM theory but only cokurtosis is significant. Cokurtosis is significant not only in-sample but also out-of-sample, suggesting cokurtosis is an important risk. However, the effects of firm-level variables remain significant after higher moments are included, indicating a rejection of higher-moment asset pricing models. The results are also robust across different markets such as NYSE, AMEX and NASDAQ. The overall results of this thesis indicate a rejection of the conditional asset pricing models. Models of systematic risks, i.e. betas and higher comoments, cannot explain the cross-section of expected stock returns.
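As a rough illustration of the realized beta estimator underlying this kind of analysis, here is a minimal sketch with toy intraday returns (not the thesis's data or its exact estimator):

```python
def realized_beta(stock_returns, market_returns):
    """Realized beta: the realized covariance with the market divided by the
    realized market variance, both computed from high-frequency returns."""
    cov = sum(rs * rm for rs, rm in zip(stock_returns, market_returns))
    var = sum(rm * rm for rm in market_returns)
    return cov / var

# Toy intraday returns: a stock that moves exactly 1.5x the market
market = [0.010, -0.020, 0.005, 0.015, -0.010]
stock = [1.5 * r for r in market]
beta = realized_beta(stock, market)   # exactly 1.5 by construction
```

In practice the sums run over all intraday observations within the estimation window, and the resulting betas then enter the cross-sectional regressions as described above.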
65

How Low Can You Go?: Quantitative Risk Measures in Commodity Markets

Forsgren, Johan January 2016 (has links)
The volatility-model approach to forecasting Value at Risk is complemented with modelling of Expected Shortfall using an extreme value approach. Using three models from the GARCH family (GARCH, EGARCH and GJR-GARCH) and assuming two conditional distributions, the Gaussian (normal) and Student's t, to make predictions of VaR, the forecasts are used as a threshold for assigning losses to the distribution tail. The Expected Shortfalls are estimated assuming that the violations of VaR follow the Generalized Pareto distribution, and the estimates are evaluated. The results indicate that the most efficient model for making predictions of VaR is the asymmetric GJR-GARCH, and that assuming the t distribution generates conservative forecasts. In conclusion, there is evidence that the commodities are characterized by asymmetry and conditional normality. Since no comparison is made, the EVT approach cannot be deemed either superior or inferior to standard approaches to Expected Shortfall modelling, although the data intensity of the method suggests that a standard approach may be preferable.
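The GPD-based Expected Shortfall step described above can be sketched as follows. This is a minimal illustration in which simulated Student-t losses stand in for the GARCH-filtered series, and the threshold is a plain empirical quantile rather than a model-based VaR forecast:

```python
import numpy as np
from scipy.stats import genpareto

# Simulated heavy-tailed losses stand in for the filtered return series
rng = np.random.default_rng(0)
losses = rng.standard_t(df=4, size=50_000)

u = np.quantile(losses, 0.95)          # threshold (stand-in for the VaR level)
exceedances = losses[losses > u] - u   # "violations" beyond the threshold

# Fit the Generalized Pareto distribution to the exceedances (location fixed at 0)
xi, _, beta = genpareto.fit(exceedances, floc=0)

# For xi < 1 the GPD mean excess over u is beta / (1 - xi), so the
# Expected Shortfall at the threshold's confidence level is:
es = u + beta / (1 - xi)
```

For Student-t(4) losses the fitted shape parameter should be close to the theoretical tail index 1/4, and the resulting ES is noticeably larger than the threshold itself, reflecting the fat tail.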
66

The expected signature of a stochastic process

Ni, Hao January 2012 (has links)
The signature of a path provides a top-down description of the path in terms of its effects as a control. It is a group-like element in the tensor algebra and an essential object in rough path theory. When the path is random, the linear independence of the signatures of different paths leads one to expect, and it has been proved in simple cases, that the expected signature captures the complete law of the random path. It therefore becomes of great interest to be able to compute examples of expected signatures. In this thesis, we aim to compute the expected signature of various stochastic processes via a PDE approach. We consider the case of an Ito diffusion process up to a fixed time, and the case of Brownian motion up to the first exit time from a domain. We derive the PDE for the expected signature in both cases, and find that this PDE system can be solved recursively. Some specific examples are included as well, e.g. Ornstein-Uhlenbeck (OU) processes, and Brownian motion coupled with its Lévy area.
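For the simplest case, one-dimensional Brownian motion up to a fixed time, the expected signature is known in closed form and a Monte Carlo check is easy to sketch (illustrative only; the thesis treats the general multidimensional PDE approach):

```python
import math
import random

def expected_signature_bm(t, n):
    """Closed-form expected signature terms of 1-D Brownian motion on [0, t].

    For a one-dimensional path the n-th (geometric/Stratonovich) signature
    term is B_t**n / n!, so E[S_n] = 0 for odd n and t**k / (2**k * k!)
    for even n = 2k, using E[B_t**(2k)] = (2k-1)!! * t**k.
    """
    if n % 2 == 1:
        return 0.0
    k = n // 2
    return t ** k / (2 ** k * math.factorial(k))

# Monte Carlo check of the level-2 term: E[B_t**2 / 2] = t / 2
rng = random.Random(1)
t_end = 1.0
samples = [rng.gauss(0.0, math.sqrt(t_end)) for _ in range(200_000)]
mc_s2 = sum(b * b / 2.0 for b in samples) / len(samples)
```

These closed-form terms are exactly the coefficients that the recursive PDE solution reproduces in the one-dimensional fixed-time case.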
67

Quantile-based methods for prediction, risk measurement and inference

Ally, Abdallah K. January 2010 (has links)
The focus of this thesis is the use of theoretical and practical quantile methods to address prediction, risk measurement and inference problems. From a prediction perspective, the problem of creating model-free prediction intervals for a future unobserved value of a random variable drawn from a sample distribution is considered. With the objective of reducing prediction coverage error, two common distribution transformation methods, based on the normal and exponential distributions, are presented and theoretically demonstrated to attain exact and error-free prediction intervals respectively. The second problem studied is the estimation of expected shortfall via kernel smoothing. The goal here is to introduce methods that reduce the estimation bias of expected shortfall. To this end, several one-step bias-corrected expected shortfall estimators are presented, investigated via simulation studies, and compared with existing estimators. The third problem is that of constructing simultaneous confidence bands for quantile regression functions when the predictor variables are constrained within a region. In this context, a method is introduced that makes use of asymmetric Laplace errors in conjunction with a simulation-based algorithm to create confidence bands for quantile and inter-quantile regression functions. Furthermore, the simulation approach is extended to an ordinary least squares framework to build simultaneous bands for quantile functions of the classical regression model, both when the model errors are normally distributed and when this assumption is not fulfilled. Finally, attention is directed towards the construction of prediction intervals for realised volatility, exploiting an alternative volatility estimator based on the difference of two extreme quantiles.
The proposed approach makes use of an AR-GARCH procedure to model time series of intraday quantiles and to forecast the predictive distribution of intraday returns. Moreover, two simple adaptations of an existing model are also presented.
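For context, here is a minimal sketch of the baseline (unsmoothed) empirical expected shortfall estimator that kernel-smoothed, bias-corrected estimators of the kind studied above refine; the simulated normal losses are purely illustrative:

```python
import math
import random

def empirical_es(losses, alpha):
    """Plain empirical expected shortfall: the average of the losses at or
    beyond the empirical alpha-quantile (losses sorted ascending, so the
    tail is the top (1 - alpha) fraction of the sample)."""
    xs = sorted(losses)
    k = int(math.ceil(alpha * len(xs)))
    tail = xs[k:]
    return sum(tail) / len(tail)

# Simulated standard-normal losses; the true ES at the 95% level is
# phi(z_0.95) / 0.05, approximately 2.063
rng = random.Random(7)
losses = [rng.gauss(0.0, 1.0) for _ in range(100_000)]
es95 = empirical_es(losses, 0.95)
```

The simple tail average above is known to be biased in small samples, which is precisely the motivation for the kernel-smoothed and one-step bias-corrected estimators the thesis investigates.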
68

Measuring Extremes: Empirical Application on European Markets

Öztürk, Durmuş January 2015 (has links)
This study employs Extreme Value Theory and several univariate methods to compare their Value-at-Risk and Expected Shortfall predictive performance. We conduct several out-of-sample backtesting procedures, such as unconditional coverage, independence and conditional coverage tests. The dataset includes five different stock markets: PX50 (Prague, Czech Republic), BIST100 (Istanbul, Turkey), ATHEX (Athens, Greece), PSI20 (Lisbon, Portugal) and IBEX35 (Madrid, Spain). These markets have different financial histories, and the data span over twenty years. We analyze the global financial crisis period separately to inspect the performance of these methods during this high-volatility period. Our results support the common finding that Extreme Value Theory is one of the most appropriate risk measurement tools. In addition, we find that the GARCH family of methods, after accounting for asymmetry and fat-tail phenomena, can be equally useful and sometimes even better than the Extreme Value Theory-based method in terms of risk estimation. Keywords: Extreme Value Theory, Value-at-Risk, Expected Shortfall, Out-of-Sample Backtesting
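The unconditional coverage backtest mentioned above is commonly implemented as Kupiec's proportion-of-failures likelihood-ratio test; a sketch (not the author's code):

```python
import math
from scipy.stats import chi2

def kupiec_pof(n_obs, n_violations, p):
    """Kupiec proportion-of-failures (unconditional coverage) test.

    Likelihood-ratio statistic comparing the observed VaR violation
    frequency with the nominal rate p; asymptotically chi-squared with
    one degree of freedom. Returns (LR statistic, p-value).
    """
    T, x = n_obs, n_violations
    if x == 0:
        lr = -2.0 * T * math.log(1.0 - p)
    elif x == T:
        lr = -2.0 * T * math.log(p)
    else:
        pi = x / T
        lr = -2.0 * ((T - x) * math.log(1.0 - p) + x * math.log(p)
                     - (T - x) * math.log(1.0 - pi) - x * math.log(pi))
    return lr, chi2.sf(lr, df=1)

# 80 violations in 1000 days at a 5% VaR level is far too many:
lr, pval = kupiec_pof(1000, 80, 0.05)
```

A small p-value rejects correct unconditional coverage; the independence and conditional coverage tests extend this by also examining the clustering of violations.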
69

Essais sur la fraude à l'impôt sur le revenu / Essays on income tax evasion

Trotin, Gwenola 26 June 2012 (has links)
This dissertation analyzes the tax evasion behavior of taxpayers who declare only part of their income. The first chapter studies the taxpayer's declaration and the effects of changes in the tax rate, the penalty rate and the probability of audit; the tax and penalty functions are assumed to be non-linear, and the setting is provided by expected utility theory. The setting provided by cumulative prospect theory is used in the second chapter, with particular attention to the dependence of the taxpayer's decisions on the reference income introduced by this theory. The third chapter characterizes the optimal income tax schedule and the audit and penalty strategy the State should implement when taxpayers' evasion behavior follows prospect theory.
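The first chapter's expected utility setting follows the classic declaration problem, which can be sketched numerically. All parameters below are hypothetical, the utility is a simple log utility, and the linear tax and penalty are illustrative stand-ins for the non-linear schedules studied in the thesis:

```python
import math

def expected_utility(declared, income, tax, fine, audit_prob):
    """Expected utility of declaring `declared` out of `income` (log utility).

    If not audited, tax is paid on the declared amount only; if audited,
    the evaded tax is recovered and a proportional penalty is added.
    """
    evaded_tax = tax * (income - declared)
    w_no_audit = income - tax * declared
    w_audit = w_no_audit - evaded_tax - fine * evaded_tax
    return ((1.0 - audit_prob) * math.log(w_no_audit)
            + audit_prob * math.log(w_audit))

# Hypothetical parameters: 30% tax, 150% penalty on evaded tax, 30% audit rate
income, tax, fine, audit = 100.0, 0.3, 1.5, 0.3
grid = [i * 1.0 for i in range(0, 101)]
best = max(grid, key=lambda d: expected_utility(d, income, tax, fine, audit))
# With these parameters the optimum is interior: the taxpayer declares part,
# but not all, of the income
```

Varying the tax rate, penalty rate and audit probability in this sketch reproduces the comparative statics the first chapter studies, before the non-linearities and prospect-theoretic preferences are introduced.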
70

Risk Management Project

Yan, Lu 02 May 2012 (has links)
In order to evaluate and manage portfolio risk, we separated this project into three sections. In the first section we constructed a portfolio of 15 different stocks and six options with different strategies. The portfolio was implemented in Interactive Brokers and rebalanced weekly through five holding periods. In the second section we modeled the loss distribution of the whole portfolio with normal and Student-t distributions, computed the Value-at-Risk and Expected Shortfall in detail for the portfolio loss in each holding week, and then evaluated the differences between the normal and Student-t fits. In the third section we applied an ARMA(1,1)-GARCH(1,1) model to simulate our assets and compared the polynomial tails obtained with Gaussian and t-distributed innovations.
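The normal and Student-t loss models used in the second section have closed-form VaR and Expected Shortfall; a hedged sketch at unit scale (not the project's fitted parameters):

```python
from scipy.stats import norm, t

def var_es_normal(alpha, mu=0.0, sigma=1.0):
    """Closed-form VaR and Expected Shortfall for normal losses."""
    z = norm.ppf(alpha)
    return mu + sigma * z, mu + sigma * norm.pdf(z) / (1.0 - alpha)

def var_es_student_t(alpha, df, mu=0.0, sigma=1.0):
    """Closed-form VaR and ES for Student-t losses (location-scale form)."""
    q = t.ppf(alpha, df)
    es = t.pdf(q, df) * (df + q * q) / ((df - 1.0) * (1.0 - alpha))
    return mu + sigma * q, mu + sigma * es

v_n, e_n = var_es_normal(0.95)
v_t, e_t = var_es_student_t(0.95, df=4)
# The fat-tailed t model implies a larger ES than the normal at the same level
```

Comparing the two at the same confidence level shows why the choice of loss distribution matters: the Student-t tail produces materially larger Expected Shortfall estimates than the normal.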
