1

Four essays in quantitative analysis : artificial intelligence and statistical inference

Hassanniakalager, Arman. January 2018
This thesis consists of four essays exploring quantitative methods for investment analysis. Chapter 1 introduces the topic and discusses the background, motivation and contributions of the thesis. It also proposes an expert system paradigm that accommodates the methodology of all four empirical studies presented in Chapters 2 to 5. Chapter 2 examines the profitability of technical analysis and Bayesian statistics in trading the EUR/USD, GBP/USD, and USD/JPY exchange rates. For this purpose, 7,846 technical rules are generated, and their profitability is assessed through a novel data-snooping procedure. The most promising rules are then combined with a Naïve Bayes (NB), a Relevance Vector Machine (RVM), a Dynamic Model Averaging (DMA), a Dynamic Model Selection (DMS) and a Bayesian regularised Neural Network (BNN) model. The findings show that technical analysis has value in Foreign eXchange (FX) trading, but the profit margins are small. Bayesian statistics, on the other hand, appears to increase the profitability of technical rules by up to four times. Chapter 3 introduces the concept of Conditional Fuzzy (CF) inference. The proposed approach deduces Fuzzy Rules (FRs) conditional on a set of restrictions. This conditional rule selection discards weak rules, so that the generated forecasts are based only on the most powerful ones. To achieve this, an RVM is used to extract the most relevant subset of predictors as the CF inputs. Through this process, the approach achieves higher forecasting performance and improves the interpretability of the underlying system. The CF concept is applied in a betting application on football games from three major European championships.
CF’s performance in terms of accuracy and profitability over the In-Sample (IS) and Out-Of-Sample (OOS) periods is benchmarked against a single RVM, an Adaptive Neuro-Fuzzy Inference System (ANFIS) fed with the same CF inputs, and an Ordered Probit (OP) fed with the full set of predictors. The results demonstrate that the CF provides higher statistical accuracy than its benchmarks, while offering substantial profits in the designed betting simulation. Chapter 4 proposes the Discrete False Discovery Rate (DFDR+/-) as an approach to comparing a large number of hypotheses at the same time. The presented method limits the probability of lucky findings and accounts for the dependence between candidate models. The performance of this approach is assessed by backtesting the predictive power of technical analysis in stock markets. A pool of 21,000 technical rules is tested for a positive Sharpe ratio, and the surviving technical rules are used to construct dynamic portfolios. Twelve categorical and country-specific Morgan Stanley Capital International (MSCI) indexes are examined over ten years (2006-2015). There are three main findings. First, the proposed method has high power in detecting profitable trading strategies and time-related anomalies across the chosen financial markets. Second, the emerging and frontier markets are more profitable than the developed markets, despite having higher transaction costs. Finally, for successful portfolio management, it is vital to rebalance the portfolios monthly or more frequently. Chapter 5 undertakes an extensive investigation of volatility models for six securities in the FX, stock index and commodity markets, using daily one-step-ahead forecasts over five years.
A discrete false discovery controlling procedure is employed to study 1,512 volatility models from twenty classes of the Generalized AutoRegressive Conditional Heteroskedasticity (GARCH), Exponentially Weighted Moving Average (EWMA), Stochastic Volatility (SV), and Heterogeneous AutoRegressive (HAR) families. The results indicate significant differences in forecasting conditional variance. The most accurate models vary across the three market categories and depend on the study period and measurement scale. Time-varying means, Integrated GARCH (IGARCH) and SV specifications, as well as fat-tailed innovation distributions, dominate among the outperforming models compared to three benchmarks: ARCH(1), GARCH(1,1), and the volatility pool’s 90th percentile. Finally, Chapter 6 draws together the main findings of the four essays and presents the concluding remarks.
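The data-snooping problem behind Chapters 2 and 4 is one of multiple testing: when thousands of rules are tested at once, some look profitable by luck. The DFDR+/- procedure itself is not reproduced here; as a minimal sketch of the family of methods it extends, here is the classical Benjamini-Hochberg step-up rule applied to hypothetical p-values for trading-rule Sharpe-ratio tests (the p-values are invented for illustration):

```python
def benjamini_hochberg(pvalues, q=0.10):
    """Benjamini-Hochberg step-up rule: return the indices of hypotheses
    rejected while controlling the false discovery rate at level q."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k = 0
    for rank, i in enumerate(order, start=1):   # walk p-values in ascending order
        if pvalues[i] <= rank / m * q:
            k = rank                            # largest rank passing its threshold
    return sorted(order[:k])

# hypothetical p-values for ten trading rules' Sharpe-ratio tests
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205, 0.212, 0.216]
survivors = benjamini_hochberg(pvals, q=0.10)   # rules 0-5 survive at the 10% level
```

Note the step-up character: rules 2 and 3 fail their own per-rank thresholds but are still rejected because a higher-ranked rule passes. The discrete variant in the thesis additionally exploits the discreteness of test statistics and the dependence between candidate models.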
2

Validating and extending the two-moment capital asset pricing model for financial time series

Neslihanoglu, Serdar. January 2014
This thesis contributes to the ongoing discussion about the financial and statistical modelling of returns on financial stock markets. It develops the asset pricing model concept, which has received continuous attention for almost 50 years in finance as a method for identifying the stochastic behaviour of financial data when making investment decisions, such as portfolio choices, and when determining market risk. The best-known and most widely used asset pricing model in the finance literature is the Two-Moment Capital Asset Pricing Model (CAPM) (consistent with the Linear Market Model), developed by Sharpe, Lintner and Mossin in the 1960s to explore systematic risk in a mean-variance framework; it is the benchmark model for this thesis. However, this model has been criticised as misleading and insufficient as a tool for characterising returns in financial stock markets, partly as a consequence of non-normally distributed returns and non-linear relationships between asset and market returns. The inadequacies of the Two-Moment CAPM are documented in this thesis, and extensions are proposed that improve both model fit and forecasting ability. To validate and extend the benchmark Linear Market Model, the empirical work presented in this thesis centres on three related extensions. The first extension compares the Linear Market Model’s modelling and forecasting abilities with those of the time-varying Linear Market Model (consistent with the conditional Two-Moment CAPM) for 19 Turkish industry sector portfolios. Two statistical modelling techniques are compared: a class of GARCH-type models, which allow for non-constant variance in stock market returns, and state space models, which allow the systematic covariance risk to change linearly over time in the time-varying Linear Market Model. The state space modelling is shown to outperform the GARCH-type modelling.
The second extension compares the performance of the Linear Market Model with models for higher-order moments, including polynomial extensions and a Generalised Additive Model (GAM). In addition, time-varying versions of the Linear Market Model and the polynomial extensions, in the form of state space models, are considered. All these models are applied to 18 global markets during three periods: the entire period from July 2002 to July 2012, from July 2002 to just before the October 2008 financial crisis, and from after the October 2008 financial crisis to July 2012. Although the more complex unconditional models improve slightly on the Linear Market Model, the state space models again improve substantially on all the unconditional models. The final extension compares the performance of four possible multivariate state space forms of the time-varying Linear Market Model, using data on the same 18 global markets and utilising correlations between markets. This approach is shown to improve further on the performance of the univariate state space models. The thesis concludes by drawing together three related themes: the inappropriateness of the Linear Market Model, the extent to which multivariate modelling improves on the univariate market model, and the state of the world’s stock markets.
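The state space idea behind the time-varying Linear Market Model can be sketched with a scalar Kalman filter in which the market beta follows a random walk. The noise variances `q` and `r` and the simulated data below are illustrative assumptions, not values or data from the thesis:

```python
import random

def kalman_tv_beta(asset, market, q=1e-3, r=1e-2):
    """Filtered estimates of a random-walk beta_t in
    asset_t = beta_t * market_t + noise (scalar Kalman filter).
    q: state (beta) innovation variance, r: observation noise variance."""
    beta, p = 0.0, 1.0          # diffuse-ish starting state and variance
    path = []
    for y, x in zip(asset, market):
        p += q                  # predict: beta carried forward, variance grows by q
        s = x * p * x + r       # innovation variance
        k = p * x / s           # Kalman gain
        beta += k * (y - beta * x)
        p *= (1.0 - k * x)
        path.append(beta)
    return path

random.seed(0)
market = [random.gauss(0.0, 1.0) for _ in range(500)]
true_beta = [0.5 + 0.001 * t for t in range(500)]        # slow drift up to about 1.0
asset = [b * x + random.gauss(0.0, 0.1) for b, x in zip(true_beta, market)]
est = kalman_tv_beta(asset, market)                      # tracks the drifting beta
```

A GARCH-type model, by contrast, keeps beta fixed and lets the variance move; the filter above is the simplest form of the competing view that the systematic risk itself drifts.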
3

The design of dynamic and nonlinear models in cash flow prediction

Pang, Yang. January 2015
This thesis is concerned with designing a novel model for cash flow prediction. Cash flow and earnings are both important measures of a firm’s profit. The extant literature has discussed different models applied to cash flow prediction, but previous studies have not attempted to address the dynamics of the cash flow model parameters, which are potentially nonlinear processes. This thesis proposes a grey-box model to capture the nonlinearity and dynamics of the cash flow model parameters. The parameters are modelled as a black box, which adopts a Padé approximant as the functional form and takes as inputs two exogenous variables considered to have explanatory power for the parameter process. In addition, this thesis employs a Bayesian forecasting model in an attempt to capture the parameter dynamics of the cash flow modelling process. The Bayesian model has the advantage of being applicable when the number of observations is limited. Compared with the grey-box model, the Bayesian model places a linear restriction on the parameter dynamics. A prior is required to implement the Bayesian model, and this thesis uses the results of a random parameter model as the prior. Panel data estimation methods are also applied to see whether they can outperform the pooled regression widely applied in the extant literature. Four datasets, all in panel form, are employed to examine the various models’ performance in predicting cash flow. This work studies the pattern of net operating cash flow (or the cash-flow-to-assets ratio) over time for the different datasets. An out-of-sample comparison is conducted among the applied models, and two performance measures are selected to compare the models’ practical predictive power. The designed grey-box model performs encouragingly in all the datasets, especially for U.S. listed firms.
However, the Bayesian model does not appear superior to the simple benchmark models in making practical predictions. Similarly, the panel data models cannot beat pooled regression. The traditional discounted cash flow model for equity valuation is then combined with the cash flow prediction models developed in this thesis to obtain theoretical equity values based on the predicted cash flows. The reported results show that simpler models such as the random walk are closer to the market’s expectation of future cash flows, as they fit market share prices better under the new discounting model. These results suggest that the valuation models developed in this thesis could have investment value. This thesis makes both theoretical and practical contributions. Through the derivation of the various models, it is found that cash flow prediction models exhibit potentially nonlinear and dynamic features, so it is crucial to capture the nonlinearity with appropriate tools. In addition, this thesis builds a framework that can be used to analyse problems of a similar kind, such as panel data prediction. The models are derived at a theoretical level and then applied to empirical data. The promising results suggest that, in practice, the models developed in this work could provide useful guidance for decision makers.
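The abstract does not give the grey-box model's exact specification. Purely as an illustration of the idea, a time-varying parameter driven by a rational (Padé-type) function of two exogenous drivers `u1` and `u2` could be sketched as follows; the coefficient vector `theta` and the one-step forecast rule are hypothetical:

```python
def pade_parameter(u1, u2, theta):
    """Time-varying model parameter as a [1/1]-style Pade rational map
    of two exogenous drivers u1, u2. theta = (a0, a1, a2, b1, b2)."""
    a0, a1, a2, b1, b2 = theta
    num = a0 + a1 * u1 + a2 * u2
    den = 1.0 + b1 * u1 + b2 * u2   # assumed to stay away from zero
    return num / den

# hypothetical coefficients, chosen only for illustration
theta = (0.8, 0.1, 0.0, 0.5, 0.0)
beta_now = pade_parameter(0.0, 0.0, theta)   # 0.8 when both drivers are zero
next_cf = beta_now * 100.0                   # one-step forecast: beta_t * current cash flow
```

The appeal of the rational form is that, unlike a polynomial, it can saturate or bend sharply with few coefficients, which is why Padé approximants are a common black-box choice for mildly nonlinear parameter maps.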
4

Robust asset allocation under model ambiguity

Tobelem-Foldvari, Sandrine. January 2010
A decision maker, when facing a decision problem, often considers several models to represent the possible outcomes of the decision variable. More often than not, the decision maker does not fully trust any of those models and hence displays ambiguity, or model uncertainty, aversion. This PhD thesis focuses on the specific case of the asset allocation problem under ambiguity faced by financial investors. The aim is not to find an optimal solution for the investor, but rather to develop a general methodology that can be applied to the asset allocation problem in particular and allows the investor to find a tractable, easy-to-compute solution that takes ambiguity into account. The thesis is structured as follows. First, some classical and widely used models of asset returns are presented. It is shown that the performance of asset portfolios built using these single models is very volatile: no model consistently outperforms the others over the period considered, which gives empirical evidence that no model can be fully trusted over the long run and that several models are needed to achieve the best possible asset allocation. Classical portfolio theory must therefore be adapted to take ambiguity, or model uncertainty, into account. Many authors have previously attempted to include ambiguity aversion in the asset allocation problem, and the literature is reviewed to outline the main models proposed. However, these models often lack flexibility and tractability: the search for an optimal solution to the asset allocation problem under ambiguity aversion is often difficult to apply in practice to high-dimensional problems such as those faced by modern financial investors. This motivates a novel methodology that is easily applicable, robust, flexible and tractable.
The Ambiguity Robust Adjustment (ARA) methodology is presented theoretically and then tested on a large empirical dataset. Several forms of the ARA are considered and tested, and the empirical evidence demonstrates that the ARA methodology substantially improves portfolio performance. Through this specific illustration on the asset allocation problem in finance, the thesis proposes a new general methodology that will hopefully help decision makers solve many different problems under ambiguity.
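The ARA formula itself is not given in this abstract. Purely as an illustrative sketch of the underlying idea, one might blend the portfolio weights proposed by several candidate models and shrink toward the equally weighted portfolio when overall trust in the model set is low; the `trust` scores and the blending rule below are assumptions, not the thesis's method:

```python
def ambiguity_adjusted_weights(model_weights, trust):
    """Blend candidate models' portfolio weights by normalised trust scores,
    then shrink toward the equally weighted portfolio as total trust falls.
    Illustrative only: this is NOT the thesis's ARA formula."""
    n_assets = len(model_weights[0])
    total = sum(trust)
    blended = [sum(t * w[i] for t, w in zip(trust, model_weights)) / total
               for i in range(n_assets)]
    lam = min(1.0, total)               # overall confidence in the model set
    ew = 1.0 / n_assets                 # ambiguity-safe fallback weight
    return [lam * wi + (1.0 - lam) * ew for wi in blended]

# two candidate models, trusted equally and fully
w = ambiguity_adjusted_weights([[0.6, 0.4], [0.2, 0.8]], [0.5, 0.5])
```

The attraction of such closed-form adjustments, as opposed to solving a worst-case optimisation, is exactly the tractability the thesis argues for: the cost scales linearly in the number of assets and models.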
5

A Bayesian approach to modelling mortality, with applications to insurance

Cairns, George Lindsay. January 2013
The purpose of this research was to use Bayesian statistics to develop flexible mortality models that could be used to forecast human mortality rates. Several models were developed as extensions to existing mortality models, in particular the Lee-Carter mortality model and the age-period-cohort model, by including some of the following features: age-period and age-cohort interactions, random effects on mortality, measurement errors in population count and smoothing of the mortality rate surface. One expects mortality rates to change in a relatively smooth manner between neighbouring ages or between neighbouring years or neighbouring cohorts. The inclusion of random effects in some of the models captures additional fluctuations in these effects. This smoothing is incorporated in the models by ensuring that the age, period and cohort parameters of the models have a relatively smooth sequence which is achieved through the choice of the prior distribution of the parameters. Three different smoothing priors were employed: a random walk, a random walk on first differences of the parameters and an autoregressive model of order one on the first differences of the parameters. In any model only one form of smoothing was used. The choice of smoothing prior not only imposes different patterns of smoothing on the parameters but is seen to be very influential when making mortality forecasts. The mortality models were fitted, using Bayesian methods, to population data for males and females from England and Wales. The fits of the models were analysed and compared using analysis of residuals, posterior predictive intervals for both in-sample and out-of-sample data and the Deviance Information Criterion. The models fitted the data better than did both the Lee-Carter model and the age-period-cohort model. 
From the analysis undertaken, the preferred model based on the Deviance Information Criterion score for male and female death counts was a Poisson model whose mean, for any given age and calendar year, equals the number of lives exposed to the risk of dying at that age in that calendar year multiplied by a mortality parameter. The logit of this mortality parameter was a function of age, year (period) and cohort, with additional interactions between the age and period parameters and between the age and cohort parameters. The form of parameter smoothing that suited the males was an autoregressive model of order one on the first differences of the parameters; for the females it was a random walk. Moreover, it was found useful to add Gaussian random effects to account for overdispersion caused by unobserved heterogeneity in the population mortality. The research concluded by applying a selection of these models to forecast period and cohort life expectancies, as well as the numbers of centenarians, for males and females in England and Wales. In addition, the thesis illustrated how Bayesian mortality models can be used to assess the impact on longevity risk of the new European Union solvency regulations for insurers (Solvency II). This research underlined the important role that Bayesian stochastic mortality models can play in the analysis of longevity risk.
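The Lee-Carter model that these extensions build on decomposes log death rates as log m(x,t) ≈ a_x + b_x·k_t, an age profile plus an age-specific loading on a single period index. A minimal sketch of its standard first-order estimator (not the Bayesian machinery of the thesis), checked on a synthetic rate matrix generated exactly from the model:

```python
def lee_carter_fit(log_m):
    """First-order estimates for the Lee-Carter model
    log m(x,t) ~ a_x + b_x * k_t, with sum_x b_x = 1 and sum_t k_t = 0.
    log_m is a list of rows, one per age x, columns indexed by year t."""
    n_ages, n_years = len(log_m), len(log_m[0])
    a = [sum(row) / n_years for row in log_m]                       # age profile
    k = [sum(log_m[x][t] - a[x] for x in range(n_ages))             # period index
         for t in range(n_years)]
    ss = sum(kt * kt for kt in k)
    b = [sum((log_m[x][t] - a[x]) * k[t] for t in range(n_years)) / ss
         for x in range(n_ages)]                                    # age sensitivities
    return a, b, k

# synthetic log rates generated exactly from the model, so the fit recovers the truth
a_true = [-4.0, -3.0, -2.0]
b_true = [0.5, 0.3, 0.2]
k_true = [2.0, 0.0, -2.0]
log_m = [[a_true[x] + b_true[x] * k_true[t] for t in range(3)] for x in range(3)]
a, b, k = lee_carter_fit(log_m)
```

In the classical two-stage approach, the fitted k_t series is then forecast as a random walk with drift; the thesis instead places smoothing priors (random walk or AR(1) on first differences) directly on the age, period and cohort parameters within one Bayesian model.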
6

Essays in international finance

Huang, Huichou. January 2015
This Ph.D. thesis contains three essays in international finance, focusing on the foreign exchange market from the perspectives of empirical asset pricing (Chapters 2 and 3) and of forecasting and market microstructure (Chapter 4). In Chapter 2, I derive a position-unwinding likelihood indicator for currency carry trade portfolios within an option pricing model, and show that it represents the systematic crash risk associated with global liquidity imbalances and prices the cross-section of global currency, sovereign bond, and equity portfolios. I also explore the currency option-implied sovereign default risk in Merton’s framework, and link the sovereign CDS-implied credit risk premia to currency excess returns, showing that they price the cross-section of currency carry, momentum, and volatility risk premium portfolios. In Chapter 3, I investigate the factor structure of the currency market and identify three important properties of global currencies: currencies that are overvalued (undervalued) with respect to equilibrium exchange rates tend to be crash sensitive (insensitive), as measured by copula lower tail dependence; relatively cheap (expensive) to hedge, in terms of the volatility risk premium; and exposed to high (low) speculative propensity, as gauged by the skew risk premium. I further show that these three characteristics have rich asset pricing and asset allocation implications, e.g. striking crash-neutral and diversification benefits for portfolio optimization and risk management purposes. In Chapter 4, I examine the term structure of exchange rate predictability by return decomposition, incorporate common latent factors across a range of investment horizons into the exchange rate dynamics with a broad set of predictors, and handle both parameter uncertainty and model uncertainty.
I demonstrate the time-varying term-structure and model-disagreement effects of exchange rate determinants, and the projection of predictive information over the term structure, and utilize the time-variation in the probability weighting from dynamic model averaging to identify the scapegoat drivers of customer order flows. Finally, I comprehensively evaluate both the statistical and the economic significance of the model, allowing for a full spectrum of currency investment management, and find that the model generates substantial performance fees.
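The copula lower tail dependence used in Chapter 3 to measure crash sensitivity can be estimated empirically by counting joint lower-quantile exceedances of two return series. A rough rank-based sketch (the cut-off `q` is an illustrative choice, and finite-sample estimators of this quantity vary):

```python
def lower_tail_dependence(x, y, q=0.1):
    """Empirical lower-tail dependence coefficient at quantile level q:
    P(both series fall in their own bottom-q region) / q.
    Roughly 1 for joint crashes, roughly q under tail independence."""
    n = len(x)
    m = int(n * q)
    low_x = set(sorted(range(n), key=lambda i: x[i])[:m])   # indices of lowest x values
    low_y = set(sorted(range(n), key=lambda i: y[i])[:m])
    return (len(low_x & low_y) / n) / q

# perfectly comoving series crash together: coefficient equals 1
print(lower_tail_dependence(list(range(100)), list(range(100))))  # 1.0
```

Because the estimator works on ranks, it is invariant to the marginal distributions, which is exactly the copula property the chapter exploits: only the joint crash behaviour matters, not the scale of either series.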
7

Angel diversity : studying the decision making criteria

Botelho, Tiago dos Santos. January 2017
Business angels are widely acknowledged as a key source of risk finance for growth-oriented enterprises, and their importance has become even more significant since the onset of the financial crisis. Research on business angels goes back some 30 years, focusing primarily on two themes: (i) their characteristics and (ii) the investment process. It has become clear that business angels are not a homogeneous population. Various studies have sought to develop typologies of business angels based on their personal characteristics, competence, motivations, investment approach and types of investment made. However, this stream of research remains limited and has not progressed beyond establishing typologies. Moreover, the possibility that typologies are dynamic, with angels shifting between categories over time, remains largely unexplored. Nor has it been considered how different types of business angels approach the process of making investment decisions or managing the post-investment relationship. The aim of this research is to develop this line of research on angel typologies further, exploring differences between types of angel investors in their approach to investment and looking in particular at their decision-making criteria. The dissertation starts by questioning the methodologies used in research on business angel decision making, in particular how comparable the results arising from different methodologies are. Using a sample of 51 business angels (21 gatekeepers and 30 individual investors), the findings indicate that the results are methodologically dependent. The next stage used data on 472 investment decisions made by 238 angel investors, collected through an online survey, in the subsequent analysis. First, a two-step cluster analysis procedure was conducted to cluster the investment decisions by their criteria weights; three clusters were identified.
Investment experience and the level of influence of others both help to explain the differences across groups. Second, the cluster membership was used to evaluate whether angel investors change their investment criteria, for which a logistic model was developed. The results indicate that the likelihood of a business angel changing their investment criteria depends on three key areas: the investment specific area (ISA), the angel specific area (ASA) and the group specific area (GSA).
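The two-step clustering reported here was presumably run in a statistics package; as a rough analogue of the first step, clustering decision-criteria weight vectors with plain k-means (Lloyd's algorithm) could look like this, with toy weight vectors standing in for the survey data:

```python
import random

def kmeans(points, k, iters=50, seed=1):
    """Plain Lloyd's algorithm on lists of criteria-weight vectors."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)              # random initial centroids
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:                         # assign each point to nearest centroid
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[j].append(p)
        centers = [[sum(col) / len(g) for col in zip(*g)] if g else centers[i]
                   for i, g in enumerate(groups)]  # recompute centroids
    return centers, groups

# toy "criteria weight" vectors forming two obvious clusters
pts = [[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]]
centers, groups = kmeans(pts, 2)
```

The second step in the study then treats cluster membership as the outcome of a logistic model; the clustering simply supplies the categorical labels that the regression explains.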
8

Forecasting exchange rates in the presence of instabilities

Ribeiro, Pinho J. January 2016
Many exchange rate papers articulate the view that instabilities constitute a major impediment to exchange rate predictability. In this thesis we implement Bayesian and other techniques to account for such instabilities, and examine some of the main obstacles to exchange rate models' predictive ability. In Chapter 2 we consider a time-varying parameter model in which fluctuations in exchange rates are related to short-term nominal interest rates ensuing from monetary policy rules, such as Taylor rules. Unlike existing exchange rate studies, the parameters of our Taylor rules are allowed to change over time, in light of the widespread evidence of shifts in fundamentals, for example in the aftermath of the Global Financial Crisis. Focusing on quarterly data from the crisis onwards, we detect forecast improvements upon a random walk (RW) benchmark for at least half, and for as many as seven out of ten, of the currencies considered. Results are stronger when we allow the time-varying parameters of the Taylor rules to differ between countries. In Chapter 3 we look closely at the role of time-variation in parameters, and of other sources of uncertainty, in hindering exchange rate models' predictive power. We apply a Bayesian setup that incorporates the notion that the relevant set of exchange rate determinants, and their corresponding coefficients, change over time. Using statistical and economic measures of performance, we first find that predictive models which allow for sudden, rather than smooth, changes in the coefficients yield significant forecast improvements and economic gains at horizons beyond one month. At shorter horizons, however, our methods fail to forecast better than the RW. We identify uncertainty in coefficient estimation, and uncertainty about the precise degree of coefficient variability to incorporate in the models, as the main factors obstructing predictive ability.
Chapter 4 focuses on the problem of the time-varying predictive ability of economic fundamentals for exchange rates. It uses bootstrap-based methods to uncover the time-specific conditioning information for predicting fluctuations in exchange rates. Employing several metrics for the statistical and economic evaluation of forecasting performance, we find that our approach, based on pre-selecting and validating fundamentals across bootstrap replications, generates more accurate forecasts than the RW. The approach, known as bumping, robustly reveals parsimonious models with out-of-sample predictive power at the one-month horizon, and outperforms alternative methods, including Bayesian methods, bagging, and standard forecast combinations. Chapter 5 exploits the predictive content of daily commodity prices for monthly commodity-currency exchange rates. It builds on the idea that the effect of daily commodity price fluctuations on commodity currencies is short-lived, and therefore harder to pin down at low frequencies. Using MIxed DAta Sampling (MIDAS) models, and Bayesian estimation methods to account for time-variation in predictive ability, the chapter demonstrates the usefulness of suitably exploiting such short-lived effects to improve exchange rate forecasts. It further shows that the usual low-frequency predictors, such as money supply and interest rate differentials, typically receive little support from the data at the monthly frequency, whereas MIDAS models featuring daily commodity prices receive high posterior probability. The chapter also introduces the random walk Metropolis-Hastings technique as a new tool for estimating MIDAS regressions.
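MIDAS regressions map many high-frequency lags onto one low-frequency regressor through a parsimonious weight function. A sketch of the commonly used exponential Almon weighting (the 22-day lag length and the theta values here are illustrative choices, not those of the thesis):

```python
import math

def exp_almon_weights(n_lags, theta1, theta2):
    """Exponential Almon lag polynomial used in MIDAS regressions:
    w_j proportional to exp(theta1*j + theta2*j^2), normalised to sum to one."""
    raw = [math.exp(theta1 * j + theta2 * j * j) for j in range(1, n_lags + 1)]
    s = sum(raw)
    return [r / s for r in raw]

def midas_regressor(daily_x, n_lags, theta1, theta2):
    """Collapse the last n_lags daily observations into one monthly regressor,
    weighting the most recent observation by w_1."""
    w = exp_almon_weights(n_lags, theta1, theta2)
    recent = list(reversed(daily_x[-n_lags:]))   # most recent day first
    return sum(wi * xi for wi, xi in zip(w, recent))

# with theta1 = theta2 = 0 the scheme reduces to a simple average of the lags
daily = [float(v) for v in range(100)]
flat = midas_regressor(daily, 22, 0.0, 0.0)
```

Only the two theta parameters are estimated regardless of the lag length, which is what makes it feasible to let, say, a month of daily commodity prices enter a monthly exchange rate equation without exhausting the degrees of freedom.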
