201

The Impact of the Carry Trade on Global Currency Markets

Smit, Steven 28 January 2020 (has links)
This work analyses the effect of the carry trade factor, statistically derived from a comprehensive basket of currencies, on currencies in various heuristically defined global risk-appetite regimes. The factor is found to have a heightened impact on Emerging/Commodity currencies and a lessened impact on Developed/European currencies in the presence of high risk. The risk-appetite process is additionally analysed by modelling it with a Markov-switching model, providing evidence of three inherent regimes with properties roughly consistent with findings in the literature.
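As an illustration of the regime-modelling step described above (not code from the thesis), the sketch below fits a three-regime Markov-switching model to a synthetic risk-appetite series using statsmodels; the series, parameters and regime count are assumptions for the example.

```python
# Sketch: three-regime Markov-switching model on a synthetic risk-appetite series.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic "risk appetite" index: three regimes with different means and volatilities.
means, vols, lengths = [-1.5, 0.0, 1.5], [0.8, 0.3, 0.6], [300, 500, 200]
risk_appetite = np.concatenate(
    [m + v * rng.standard_normal(n) for m, v, n in zip(means, vols, lengths)]
)

# Three regimes, each with its own mean and variance.
model = sm.tsa.MarkovRegression(risk_appetite, k_regimes=3, switching_variance=True)
result = model.fit()
print(result.summary())

# Smoothed probability of being in each regime at every date.
print(result.smoothed_marginal_probabilities[:5])
```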
202

Stock price fragility in an emerging market

Nairac, Jean-Michel January 2013 (has links)
This research project examines stock price fragility, a measure developed by Greenwood and Thesmar (2011) that serves as a proxy for non-fundamental risk, i.e. it aims to isolate the drivers of stock price volatility beyond traditional fundamental drivers, in particular the impact of concentrated stock ownership and correlated liquidity shocks on price volatility. Here, the measure is applied to the South African financial market. Subject to data complications, it is nevertheless shown that stock price fragility is a significant predictor of total return volatility owing to the ownership structure of South African funds, even when controlling for endogeneity, autocorrelation and heteroskedasticity in the model.
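For readers unfamiliar with the measure, the following is a minimal sketch (not from the thesis) of the Greenwood-Thesmar fragility calculation for a single stock; the holdings, market capitalisation and flow covariance below are purely illustrative.

```python
# Sketch: Greenwood-Thesmar (2011) fragility of a single stock,
# fragility = w' Omega w / theta^2. All numbers are illustrative.
import numpy as np

theta = 5.0e9                                   # market capitalisation of the stock
w = np.array([4.0e8, 2.5e8, 1.0e8, 5.0e7])      # each fund's holding of the stock

# Covariance matrix of (size-scaled) fund flow shocks, e.g. estimated from monthly flows.
omega = np.array([
    [0.0040, 0.0010, 0.0005, 0.0002],
    [0.0010, 0.0030, 0.0008, 0.0001],
    [0.0005, 0.0008, 0.0025, 0.0003],
    [0.0002, 0.0001, 0.0003, 0.0020],
])

fragility = w @ omega @ w / theta**2
print(f"stock price fragility: {fragility:.3e}")
# Larger values reflect more concentrated ownership and more correlated
# liquidity shocks, i.e. greater exposure to non-fundamental volatility.
```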
203

Analysis of clustered competing risks with application to a multicentre clinical trial

Familusi, Mary Ajibola January 2016 (has links)
The usefulness of time-to-event (survival) analysis has made it gain wide applicability in statistical modelling research. The methodological developments of time-to-event analysis that have been widely adopted are: (i) the Kaplan-Meier method, for estimating the survival function; (ii) the log-rank test, for comparing the equality of two or more survival distributions; (iii) the Cox proportional hazards model, for examining the covariate effects on the hazard function; and (iv) the accelerated failure time model, for examining the covariate effects on the survival function. Nonetheless, in time-to-event endpoint assessment, if subjects can fail from multiple mutually exclusive causes, data are said to have competing risks. For competing risks data, the Fine and Gray proportional hazards model for sub-distributions has gained popularity due to its convenience in directly assessing the effect of covariates on the cumulative incidence function. Furthermore, sometimes competing risks data cannot be considered independent because of a clustered design, for instance in registry cohorts or multi-centre clinical trials. The Fine and Gray model has been extended to the analysis of clustered time-to-event data by including random-centre effects or frailties in the sub-distribution hazard.
This research focuses on the analysis of clustered competing risks with an application to the Investigation of the Management of Pericarditis (IMPI) clinical trial dataset. IMPI is a multi-centre clinical trial that was carried out across 19 centres in 8 African countries with the principal objective of assessing the effectiveness and safety of adjunctive prednisolone and Mycobacterium indicus pranii immunotherapy in reducing the composite outcome of death, constriction, or cardiac tamponade requiring pericardial drainage in patients with probable or definite tuberculous pericarditis. The clinical objective in this thesis is therefore to analyse time to these outcomes. In addition, the risk factors associated with these outcomes were determined, and the effect of prednisolone and M. indicus pranii was examined, while adjusting for these risk factors and considering centres as a random effect.
Using the Cox proportional hazards model, it was found that age, weight, New York Heart Association (NYHA) class, hypotension, creatinine, and peripheral oedema show a statistically significant association with the composite outcome. Furthermore, weight, NYHA class, hypotension, creatinine and peripheral oedema show a statistically significant association with death. In addition, NYHA class and hypotension show a statistically significant association with cardiac tamponade. Lastly, prednisolone, gender, NYHA class, tachycardia, haemoglobin level, peripheral oedema, pulmonary infiltrate and HIV status show a statistically significant association with constriction. A significance level of 0.1 was used to identify variables as significant in the univariate model using the forward stepwise regression method. The random effect was found to be significant in the incidence of the composite outcome of death, cardiac tamponade and constriction, and in the individual outcome of constriction, but this only slightly changed the estimated effects of the covariates compared to when the random effect was not considered. Accounting for death as a competing event to the outcomes of cardiac tamponade or constriction does not affect the effect of the covariates on these outcomes.
In addition, in the multivariate models that adjust for other risk factors, there was no significant difference in the primary outcome between patients who received prednisolone, and those who received placebo, or between those who received M. indicus pranii immunotherapy, and those who received placebo.
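As a minimal illustration of the competing-risks machinery referred to above (not the thesis code, which uses the Fine and Gray model with random centre effects), the sketch below computes the nonparametric cumulative incidence function for an event of interest in the presence of a competing event, using synthetic data in place of the IMPI outcomes.

```python
# Sketch: nonparametric cumulative incidence function (CIF) for competing risks,
# i.e. P(event of interest by time t) when a competing event can preclude it.
import numpy as np

rng = np.random.default_rng(1)
n = 500
t1 = rng.exponential(10.0, n)          # latent time to event of interest (e.g. constriction)
t2 = rng.exponential(15.0, n)          # latent time to competing event (e.g. death)
censor = rng.uniform(5.0, 20.0, n)     # administrative censoring

time = np.minimum.reduce([t1, t2, censor])
cause = np.select([(t1 <= t2) & (t1 <= censor), (t2 < t1) & (t2 <= censor)],
                  [1, 2], default=0)   # 0 = censored

def cumulative_incidence(time, cause, k):
    """Aalen-Johansen-type estimator of P(T <= t, cause = k)."""
    order = np.argsort(time)
    time, cause = time[order], cause[order]
    n_at_risk, surv, cif = len(time), 1.0, 0.0
    grid, values = [], []
    for t in np.unique(time):
        at_t = time == t
        d_k = np.sum(at_t & (cause == k))      # events of interest at t
        d_all = np.sum(at_t & (cause != 0))    # all events at t
        cif += surv * d_k / n_at_risk          # S(t-) times the cause-k hazard
        surv *= 1.0 - d_all / n_at_risk        # overall Kaplan-Meier survival
        n_at_risk -= np.sum(at_t)
        grid.append(t); values.append(cif)
    return np.array(grid), np.array(values)

t_grid, cif1 = cumulative_incidence(time, cause, k=1)
print(f"estimated CIF of the event of interest at the last observed time: {cif1[-1]:.3f}")
```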
204

Functional quantization-based stratified sampling

Platts, Alexander January 2017 (has links)
Functional quantization-based stratified sampling is a method for variance reduction proposed by Corlay and Pagès (2015). This method requires the ability both to create functional quantizers and to sample Brownian paths from the strata defined by the quantizers. We show that product quantizers are a suitable approximation of an optimal quantizer for the formation of functional quantizers. The notion of functional stratification is then extended to options written on multiple stocks and to American options priced using the Longstaff-Schwartz method. To illustrate the gains in performance we focus on geometric Brownian motion (GBM), constant elasticity of variance (CEV) and constant elasticity of variance with stochastic volatility (CEV-SV) models. The pricing algorithm is used to price knock-in, knock-out, autocall, call-on-the-max and path-dependent call-on-the-max options.
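The following sketch (not the thesis implementation) illustrates the variance-reduction idea in its simplest form: instead of quantizing and stratifying whole Brownian paths, it stratifies only the terminal Brownian value for a GBM European call, with illustrative parameters.

```python
# Sketch: stratified sampling on the terminal Brownian value for a GBM call.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
n_paths, n_strata = 10_000, 50

def payoff_from_z(z):
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    return np.exp(-r * T) * np.maximum(ST - K, 0.0)

# Plain Monte Carlo.
plain = payoff_from_z(rng.standard_normal(n_paths))

# Stratified: sample uniformly inside each of n_strata equiprobable strata of Z.
m = n_paths // n_strata
u = (np.arange(n_strata)[:, None] + rng.random((n_strata, m))) / n_strata
strat = payoff_from_z(norm.ppf(u))

print(f"plain MC:      {plain.mean():.4f}  (std err {plain.std(ddof=1)/np.sqrt(n_paths):.4f})")
# With proportional allocation the estimator variance is the mean of the
# within-stratum variances divided by the total sample size.
strat_mean = strat.mean(axis=1).mean()
strat_se = np.sqrt(strat.var(axis=1, ddof=1).mean() / n_paths)
print(f"stratified MC: {strat_mean:.4f}  (std err {strat_se:.4f})")
```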
205

Efficient Monte Carlo simulations of pricing captions using Libor market models

Mkhwanazi, MA (Mpendulo Armstrong) January 2013 (has links)
The caption (an option on a cap) is one of the common European exotic options discussed in the literature. This interest rate exotic option has no closed-form solution, and its accurate pricing and hedging in a volatile market is a challenge for traders. The reason for this is that, comparatively, the behaviour of an individual interest rate is more complex than that of a stock price. To price any interest rate product, it is essential to develop an interest rate model describing the behaviour of the entire zero-coupon yield curve. The equity and yield-curve cases differ in that the former involves the dynamics of a scalar variable and the latter those of a vector variable. Moreover, captions are second order with respect to the discount bonds, in that they are options on caps (which are themselves options on bonds). These reasons make it of particular interest to study efficient numerical solutions for pricing captions. Monte Carlo simulation provides a simple method for pricing this option, and a suitable interest rate model to use is the Libor market model. Describing the behaviour of the entire zero-coupon yield curve with a single curve is what, in the era following the 2007 credit crunch crisis, is called the standard single-curve market practice, and Part I of this work is based on it. After introducing the framework for option pricing in the interest rate market, the theory and implementation procedure for Monte Carlo simulation using Libor market models is described. A detailed analysis of the results is presented together with a sensitivity analysis, and finally suggestions for efficient pricing of captions are given. In Part II we review the recent financial market evolution, triggered by the credit crunch crisis, towards a double-curve approach. Unfortunately, such a methodology is not easy to build. In practice an empirical approach to pricing and hedging interest rate derivatives has prevailed in the market: future cash flows are generated through multiple forwarding yield curves associated with the underlying rate tenors, and their net present value is calculated through discount factors from a single discounting yield curve.
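As a rough sketch of the Monte Carlo machinery described above (not the dissertation code), the example below simulates a one-factor Libor market model under the spot measure with a log-Euler scheme stepping on tenor dates and prices a single caplet, comparing it to Black's formula. The flat curve, flat volatilities and single driving factor are simplifying assumptions, and the coarse time step introduces a small discretisation bias.

```python
# Sketch: one-factor Libor market model under the spot measure, log-Euler on tenor dates.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
tau, N, n_paths = 0.5, 12, 200_000
K, k = 0.03, 6                                  # caplet on L_k, which fixes at t_k = k*tau
sigma = np.full(N, 0.20)                        # flat lognormal forward-rate vols
L = np.full((n_paths, N), 0.03)                 # L[:, i] is the forward for [t_i, t_{i+1}]

disc = 1.0 / (1.0 + tau * L[:, 0])              # L_0 is already fixed at t_0 = 0
for m in range(k):                              # evolve from t_m to t_{m+1}
    z = rng.standard_normal(n_paths)
    drift_sum = np.zeros(n_paths)
    mu = np.zeros((n_paths, N))
    for i in range(m + 1, N):                   # spot-measure drift, frozen at t_m
        drift_sum = drift_sum + tau * sigma[i] * L[:, i] / (1.0 + tau * L[:, i])
        mu[:, i] = sigma[i] * drift_sum
    for i in range(m + 1, N):                   # log-Euler update (one driving factor)
        L[:, i] = L[:, i] * np.exp((mu[:, i] - 0.5 * sigma[i] ** 2) * tau
                                   + sigma[i] * np.sqrt(tau) * z)
    disc = disc / (1.0 + tau * L[:, m + 1])     # L_{m+1} fixes at t_{m+1}

payoff = tau * np.maximum(L[:, k] - K, 0.0)     # caplet pays at t_{k+1}
mc_price = np.mean(disc * payoff)

# Black's formula benchmark, discounting off the flat initial curve.
P = 1.0 / (1.0 + tau * 0.03) ** (k + 1)         # P(0, t_{k+1})
vol = sigma[k] * np.sqrt(k * tau)
d1 = (np.log(0.03 / K) + 0.5 * vol**2) / vol
black = tau * P * (0.03 * norm.cdf(d1) - K * norm.cdf(d1 - vol))
print(f"LMM Monte Carlo caplet: {mc_price:.6f}   Black caplet: {black:.6f}")
```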
206

Empirical statistical modelling for crop yields predictions: Bayesian and uncertainty approaches

Adeyemi, Rasheed Alani January 2015 (has links)
This thesis explores uncertainty statistics to model agricultural crop yields in a situation where there are neither sampling observations nor historical records. The Bayesian approach to a linear regression model is useful for the prediction of crop yield when there are data quantity issues, when there is model structure uncertainty, and when the regression model involves a large number of explanatory variables. Data quantity issues might occur when a farmer is cultivating a new crop variety, moving to a new farming location or introducing a new farming technology, where the situation may warrant a change in the current farming practice. The first part of this thesis involved the collection of data from domain experts and the elicitation of probability distributions. Uncertainty statistics, the foundations of uncertainty theory and the data-gathering procedures were discussed in detail. We proposed a procedure for the estimation of uncertainty distributions. The procedure was then implemented on agricultural data to fit uncertainty distributions to five cereal crop yields. A Delphi method was introduced and used to fit uncertainty distributions to multiple experts' data on sesame seed yield. The thesis defined an uncertainty distance and derived a distance for the difference between two uncertainty distributions. We lastly estimated the distance between a hypothesized distribution and an uncertainty normal distribution. Although the applicability of uncertainty statistics is limited to one-sample models, the approach provides a fast way to establish a standard for process parameters. Where no sampling observations exist, or they are very expensive to acquire, the approach provides an opportunity to engage experts and arrive at a model for guiding decision making.
In the second part, we fitted a full dataset obtained from an agricultural survey of small-scale farmers to a linear regression model using direct Markov chain Monte Carlo (MCMC), Bayesian estimation (with a uniform prior) and the maximum likelihood estimation (MLE) method. The results obtained from the three procedures yielded similar mean estimates, but the credible intervals were found to be narrower in the Bayesian estimates than the confidence intervals in the MLE method. The predictive outcome of the estimated model was then assessed using simulated data for a set of covariates. Furthermore, the dataset was randomly split into two data sets. An informative prior was estimated from one half, called the "old data", using the ordinary least squares (OLS) method. Three models were then fitted to the second half, called the "new data": a general linear model (GLM) (M1), a Bayesian model with a non-informative prior (M2) and a Bayesian model with an informative prior (M3). A leave-one-out cross-validation (LOOCV) method was used to compare the predictive performance of these models. It was found that the Bayesian models showed better predictive performance than M1. M3 (with the expert prior) had moderate average cross-validation (CV) error and CV standard error. The GLM performed worst, with the least average CV error and the highest CV standard error among the models. In model M3 (expert prior), the predictor variables were found to be significant at 95% credible intervals. In contrast, most variables were not significant under models M1 and M2. The model with the informative prior also had narrower credible intervals compared to the non-informative-prior and GLM models.
The results indicated that variability and uncertainty in the data were reasonably reduced due to the incorporation of the expert (informative) prior. We lastly investigated the residual plots of these models to assess their prediction performance. Bayesian model averaging (BMA) was later introduced to address the issue of model structure uncertainty in a single model. BMA allows the computation of a weighted average over possible model combinations of predictors. An approximate AIC weight was then proposed for model selection instead of frequentist hypothesis testing (or model comparison within a set of competing candidate models). The method is flexible and easy to interpret, compared with raw AIC or the Bayesian information criterion (BIC), which approximates the Bayes factor. Zellner's g-prior was considered appropriate as it has been widely used in linear models; it preserves the correlation structure among predictors in its prior covariance. The method also yields closed-form marginal likelihoods, which leads to large computational savings by avoiding sampling in the parameter space as in BMA. We lastly determined a single optimal model from all possible combinations of models and also computed the log-likelihood of each model.
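As a small illustration of the approximate AIC-weight idea mentioned above (not the thesis code), the sketch below computes Akaike weights over all predictor subsets of a synthetic crop-yield regression; the data and variable names are assumptions for the example.

```python
# Sketch: Akaike (AIC) weights over all predictor subsets of a linear model.
import itertools
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 200
X = rng.standard_normal((n, 3))                 # e.g. rainfall, fertiliser, farm size
y = 2.0 + 1.5 * X[:, 0] + 0.8 * X[:, 1] + rng.standard_normal(n)
names = ["rainfall", "fertiliser", "farm_size"]

models, aics = [], []
for size in range(len(names) + 1):
    for subset in itertools.combinations(range(len(names)), size):
        cols = list(subset)
        design = sm.add_constant(X[:, cols]) if cols else np.ones((n, 1))
        aics.append(sm.OLS(y, design).fit().aic)
        models.append([names[j] for j in cols] or ["intercept only"])

# Akaike weights: exp(-delta/2) normalised over the candidate set.
delta = np.array(aics) - min(aics)
weights = np.exp(-0.5 * delta) / np.exp(-0.5 * delta).sum()
for model, weight in sorted(zip(models, weights), key=lambda t: -t[1]):
    print(f"{weight:6.3f}  {' + '.join(model)}")
```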
207

Bias-Free Joint Simulation of Multi-Factor Short Rate Models and Discount Factor

Lopes, Marcio Ferrao 06 February 2019 (has links)
This dissertation explores the use of single- and multi-factor Gaussian short rate models for the valuation of interest rate sensitive European options. Specifically, the focus is on deriving the joint distribution of the short rate and the discount factor, so that an exact and unbiased simulation scheme can be derived for risk-neutral valuation. We see that the derivation of the joint distribution remains tractable when working with the class of Gaussian short rate models. The dissertation compares three joint and exact simulation schemes for the short rate and the discount factor in the single-factor case, and two schemes in the multi-factor case. We price European floor options and European swaptions using a two-factor Gaussian short rate model and explore the use of variance reduction techniques. We compare the exact and unbiased schemes to other solutions available in the literature: simulating the short rate under the forward measure and approximating the discount factor using quadrature.
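A minimal sketch of the idea, assuming the one-factor Vasicek/Hull-White case with constant parameters (the dissertation treats the general single- and multi-factor Gaussian setting): the short rate and its time integral are jointly Gaussian, so both can be drawn exactly and the simulated discount factor checked against the closed-form bond price.

```python
# Sketch: exact joint simulation of the Vasicek short rate r_T and Y = integral of r,
# whose joint law is bivariate Gaussian; exp(-Y) is the pathwise discount factor.
import numpy as np

rng = np.random.default_rng(5)
kappa, theta, sigma, r0, T = 0.8, 0.04, 0.015, 0.03, 2.0
n_paths = 500_000

B = (1.0 - np.exp(-kappa * T)) / kappa
mean_r = theta + (r0 - theta) * np.exp(-kappa * T)
mean_Y = theta * T + (r0 - theta) * B
var_r = sigma**2 * (1.0 - np.exp(-2.0 * kappa * T)) / (2.0 * kappa)
var_Y = sigma**2 / kappa**2 * (T - 2.0 * B + (1.0 - np.exp(-2.0 * kappa * T)) / (2.0 * kappa))
cov_rY = sigma**2 * B**2 / 2.0

mean = np.array([mean_r, mean_Y])
cov = np.array([[var_r, cov_rY], [cov_rY, var_Y]])
samples = rng.multivariate_normal(mean, cov, size=n_paths)
r_T, Y = samples[:, 0], samples[:, 1]

mc_bond = np.mean(np.exp(-Y))                            # simulated discount factor
A = np.exp((theta - sigma**2 / (2 * kappa**2)) * (B - T) - sigma**2 * B**2 / (4 * kappa))
closed_form = A * np.exp(-B * r0)                        # Vasicek zero-coupon bond price
print(f"Monte Carlo bond price: {mc_bond:.6f}")
print(f"Closed-form bond price: {closed_form:.6f}")
```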
208

Approximating the Heston-Hull-White Model

Patel, Riaz 04 February 2020 (has links)
The hybrid Heston-Hull-White (HHW) model combines the Heston (1993) stochastic volatility and Hull and White (1990) short rate models. Compared to stochastic volatility models, hybrid models improve upon the pricing and hedging of long-dated options and equity-interest rate hybrid claims. When the Heston and Hull-White components are uncorrelated, an exact characteristic function for the HHW model can be derived. In contrast, when the components are correlated, the more useful case for the pricing of hybrid claims, an exact characteristic function cannot be obtained. Grzelak and Oosterlee (2011) developed two approximations for this correlated case, such that the characteristic functions are available. Within this dissertation, the approximations, referred to as the deterministic and stochastic approximations, were implemented to price vanilla options. This involved extending the Carr and Madan (1999) method to a stochastic interest rate setting. The approximations were then assessed for accuracy and efficiency. In determining an appropriate benchmark for assessing the accuracy of the approximations, the full truncation Milstein and quadratic exponential (QE) schemes, which are popular Monte Carlo discretisation schemes for the Heston model, were extended to the HHW model. These schemes were then compared against the characteristic function for the uncorrelated case, and the QE scheme was found to be more accurate than the Milstein-based scheme, with the differences in performance becoming increasingly noticeable when the Feller (1951) condition was not satisfied and when the maturity and the volatility of the Hull-White model (η) were large. In assessing the accuracy of the approximations against the QE scheme, both approximations were similarly accurate when η was small. In contrast, when η was large, the stochastic approximation was more accurate than the deterministic approximation. However, the deterministic approximation was significantly faster than the stochastic approximation, and the stochastic approximation displayed signs of potential instability. When η is small, the deterministic approximation is therefore recommended for use in applications such as calibration. Given its shortcomings, the stochastic approximation could not be recommended; however, it did show promising signs of accuracy that warrant further investigation into its efficiency and stability.
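The following sketch (not the dissertation code) shows the Carr-Madan pricing step in isolation, using the Black-Scholes characteristic function as a stand-in for the HHW approximations so the result can be checked against a closed form; the damping parameter and quadrature grid are illustrative choices.

```python
# Sketch: Carr-Madan (1999) damped-transform pricing of a vanilla call from a
# characteristic function of ln(S_T), checked against Black-Scholes.
import numpy as np
from scipy.stats import norm

S0, K, r, sigma, T, alpha = 100.0, 110.0, 0.05, 0.2, 1.0, 1.5

def cf_bs(u):
    """Risk-neutral characteristic function of ln(S_T) under Black-Scholes."""
    return np.exp(1j * u * (np.log(S0) + (r - 0.5 * sigma**2) * T)
                  - 0.5 * sigma**2 * T * u**2)

def carr_madan_call(cf, log_strike, v_max=200.0, n_grid=4000):
    v = np.linspace(1e-8, v_max, n_grid)
    psi = np.exp(-r * T) * cf(v - (alpha + 1.0) * 1j) \
        / (alpha**2 + alpha - v**2 + 1j * (2.0 * alpha + 1.0) * v)
    integrand = np.real(np.exp(-1j * v * log_strike) * psi)
    # Simple rectangle-rule quadrature of the damped transform.
    return np.exp(-alpha * log_strike) / np.pi * np.sum(integrand) * (v[1] - v[0])

price_cm = carr_madan_call(cf_bs, np.log(K))

d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
price_bs = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d1 - sigma * np.sqrt(T))
print(f"Carr-Madan: {price_cm:.4f}   Black-Scholes: {price_bs:.4f}")
```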
209

Level Dependence in Volatility in Linear-Rational Term Structure Models

Ramnarayan, Kalind 14 February 2020 (has links)
The degree of level dependence in interest rate volatility is analysed in the linear-rational term structure model. The linear-rational square-root (LRSQ) model, where level dependence is set a priori, is compared to a specification where the factor process follows CEV-type dynamics, which allows a more flexible degree of level dependence. Parameters are estimated using an unscented Kalman filter in conjunction with quasi-maximum likelihood. An extended specification for the state price density process is required to ensure reliable parameter estimates. The empirical analysis indicates that the LRSQ model generally overestimates level dependence. Although the CEV specification captures the degree of level dependence in volatility more accurately, this comes at the cost of analytical tractability. The optimal specification therefore depends on the type of model implementation and general economic conditions.
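As an illustration of CEV-type level dependence (not the thesis code), the sketch below simulates a mean-reverting factor whose diffusion coefficient scales as the factor level raised to an exponent gamma; gamma = 0.5 recovers the square-root dynamics underlying the LRSQ model, while larger values imply stronger level dependence. Parameters are assumptions for the example.

```python
# Sketch: mean-reverting factor with CEV-type diffusion,
# dX = kappa*(theta - X) dt + sigma * X**gamma dW, full-truncation Euler scheme.
import numpy as np

def simulate_cev_factor(gamma, kappa=1.0, theta=0.04, sigma=0.15,
                        x0=0.04, T=5.0, n_steps=1250, n_paths=20_000, seed=6):
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.full(n_paths, x0)
    for _ in range(n_steps):
        xp = np.maximum(x, 0.0)           # full truncation keeps the diffusion real
        dw = np.sqrt(dt) * rng.standard_normal(n_paths)
        x = x + kappa * (theta - xp) * dt + sigma * xp**gamma * dw
    return x

for gamma in (0.5, 0.75, 1.0):
    x_T = simulate_cev_factor(gamma)
    print(f"gamma = {gamma:4.2f}: mean {x_T.mean():.4f}, std {x_T.std():.4f}")
```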
210

Characteristic function pricing with the Heston-LIBOR hybrid model

Sterley, Christopher 24 February 2020 (has links)
We derive an approximate characteristic function for a simplified version of the Heston-LIBOR model, which assumes a constant instantaneous volatility structure in the underlying LIBOR market model. We also implement measures to improve the numerical stability of the characteristic function derived in this dissertation as well as the one derived by Grzelak and Oosterlee. The ultimate aim of the dissertation is to prevent these characteristic functions from exploding for given parameter values.
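As background on the kind of numerical-stability measure involved (not the dissertation's Heston-LIBOR derivation), the sketch below implements the plain Heston characteristic function in the so-called "little Heston trap" formulation, which avoids the branch-cut discontinuities that cause the original Heston (1993) formulation to explode for long maturities; parameters are illustrative.

```python
# Sketch: numerically stable ("little Heston trap") Heston characteristic function of ln(S_T).
import numpy as np

def heston_cf(u, S0=100.0, r=0.05, T=10.0,
              kappa=1.5, theta=0.04, sigma=0.5, rho=-0.7, v0=0.04):
    u = np.asarray(u, dtype=complex)
    d = np.sqrt((rho * sigma * 1j * u - kappa)**2 + sigma**2 * (1j * u + u**2))
    g = (kappa - rho * sigma * 1j * u - d) / (kappa - rho * sigma * 1j * u + d)
    exp_dT = np.exp(-d * T)
    C = (kappa * theta / sigma**2) * (
        (kappa - rho * sigma * 1j * u - d) * T
        - 2.0 * np.log((1.0 - g * exp_dT) / (1.0 - g))
    )
    D = ((kappa - rho * sigma * 1j * u - d) / sigma**2) * (1.0 - exp_dT) / (1.0 - g * exp_dT)
    return np.exp(1j * u * (np.log(S0) + r * T) + C + D * v0)

u = np.linspace(0.01, 50.0, 5)
print(heston_cf(u))   # remains finite even for the long maturity T = 10
```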
