131

Volatility estimates of ARCH models.

January 2001 (has links)
Chung Kwong-leung. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2001. / Includes bibliographical references (leaves 80-84). / Abstracts in English and Chinese. / ACKNOWLEDGMENTS --- p.iii / LIST OF TABLES --- p.iv / LIST OF ILLUSTRATIONS --- p.vi / CHAPTER / Chapter ONE --- INTRODUCTION --- p.1 / Chapter TWO --- LITERATURE REVIEW --- p.5 / Volatility / ARCH Models / The Accuracy of ARCH Volatility Estimates / Chapter THREE --- METHODOLOGY --- p.11 / Testing and Estimation / Simulation / Chapter FOUR --- DATA DESCRIPTION AND EMPIRICAL RESULTS --- p.29 / Data Description / Testing and Estimation Results / Simulation Results / Chapter FIVE --- CONCLUSION --- p.45 / TABLES --- p.49 / ILLUSTRATIONS --- p.58 / APPENDICES --- p.77 / BIBLIOGRAPHY --- p.80
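The record above concerns testing, estimation and simulation of ARCH volatility. As a minimal sketch of the mechanics involved, the snippet below simulates an ARCH(1) process and compares the latent conditional variance with the noisy squared-return proxy; the parameter values and the comparison are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def simulate_arch1(n, omega=0.1, alpha=0.4, seed=0):
    """Simulate an ARCH(1) return series: r_t = sigma_t * z_t,
    sigma_t^2 = omega + alpha * r_{t-1}^2, with z_t ~ N(0, 1)."""
    rng = np.random.default_rng(seed)
    r = np.zeros(n)
    sigma2 = np.zeros(n)
    sigma2[0] = omega / (1.0 - alpha)   # start at the unconditional variance
    r[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
    for t in range(1, n):
        sigma2[t] = omega + alpha * r[t - 1] ** 2
        r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return r, sigma2

returns, true_var = simulate_arch1(1000)
# The squared return is an unbiased but very noisy ex-post proxy for the latent
# variance -- one reason accuracy studies of ARCH estimates rely on simulation.
print(np.corrcoef(returns ** 2, true_var)[0, 1])
```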
132

A comparison of the Philips price earnings multiple model and the actual future price earnings multiple of selected companies listed on the Johannesburg stock exchange

Coetzee, G. J 12 1900 (has links)
Thesis (MBA)--Stellenbosch University, 2000. / ENGLISH ABSTRACT: The price earnings multiple is a valuation ratio and is published widely in the media as a comparative instrument for investment decisions. It is used to compare company valuation levels and their future growth/franchise opportunities. There have been numerous research studies on the price earnings multiple, but no study has been able to design or derive a model that successfully predicts the future price earnings multiple, where the current stock price and the following year-end earnings per share are used. The most widely accepted method of share valuation is to discount the future cash flows by an appropriate discount rate. Popular and widely used stock valuation models are the Dividend Discount Model and the Gordon Model. Both these models assume that future dividends are cash flows to the shareholder. Thomas K. Philips, the chief investment officer at Paradigm Asset Management in New York, constructed a valuation model at the end of 1999, which he published in The Journal of Portfolio Management. The model (the Philips price earnings multiple model) was derived from the Dividend Discount Model and calculates an implied future price earnings multiple. The Philips price earnings multiple model includes the following independent variables: the cost of equity, the return on equity and the dividend payout ratio. Each variable in the Philips price earnings multiple model is a calculated present year-end point value, which was used to calculate the implied future price earnings multiple (present year stock price divided by following year-end earnings per share). This study used historical five-year (1995-2000) year-end data to calculate the implied and actual future price earnings multiples. Of the 225 Johannesburg Stock Exchange listed companies studied, only 36 met the criteria of the Philips price earnings multiple model. Correlation and population mean tests were conducted on the implied and constructed data sets. These showed that the Philips price earnings multiple model was unsuccessful in predicting the future price earnings multiple at a 0,20 level of statistical significance. The Philips price earnings multiple model is substantially more complex than the Dividend Discount Model and includes greater restrictions and more assumptions. The Philips price earnings multiple model is a theoretical instrument which can be used to analyse hypothetical companies (with all model assumptions and restrictions having been met). The Philips price earnings multiple model thus has little to no applicability in the practical valuation of the stock prices of Johannesburg Stock Exchange listed companies. / AFRIKAANSE OPSOMMING: Die prysverdienste verhouding is 'n waarde bepalingsverhouding en word geredelik gepubliseer in die media. Hierdie verhouding is 'n maatstaf om maatskappye se waarde vlakke te vergelyk en om toekomstige groei geleenthede te evalueer. Daar was al verskeie navorsingstudies gewy aan die prysverdiensteverhouding, maar nog geen model is ontwikkel wat die toekomstige prysverdiensteverhouding (die teenswoordige aandeelprys en toekomstige jaareind verdienste per aandeel) suksesvol kon modelleer nie. Die mees aanvaarbare metode vir waardebepaling van aandele is om toekomstige kontantvloeie te verdiskonteer teen 'n toepaslike verdiskonteringskoers. Van die vernaamste en mees gebruikte waardeberamings modelle is die Dividend Groei Model en die Gordon Model.
Beide modelle gebruik die toekomstige dividendstroom as die toekomstige kontantvloeie wat uitbetaal word aan die aandeelhouers. Thomas K. Philips, die hoof beleggingsbeampte by Paradigm Asset Management in New York, het 'n waardeberamingsmodel ontwerp in 1999. Die model (Philips prysverdienste verhoudingsmodel) was afgelei vanaf die Dividend Groei Model en word gebruik om 'n geïmpliseerde toekomstige prysverdiensteverhouding te bereken. Die Philips prysverdienste verhoudingsmodel sluit die volgende onafhanklike veranderlikes in: die koste van kapitaal, die opbrengs op aandeelhouding en die uitbetalingsverhouding. Elke veranderlike in hierdie model is 'n berekende teenswoordige jaareinde puntwaarde, wat gebruik was om die toekomstige geïmpliseerde prysverdiensteverhouding (teenswoordige jaar aandeelprys gedeel deur die toekomstige verdienste per aandeel) te bereken. In hierdie studie word vyf jaar historiese jaareind besonderhede gebruik om die geïmpliseerde en werklike toekomstige prysverdiensteverhouding te bereken. Van die 225 Johannesburg Effektebeurs genoteerde maatskappye, is slegs 36 gebruik wat aan die vereistes voldoen om die Philips prysverdienste verhoudingsmodel te toets. Korrelasie en populasie gemiddelde statistiese toetse is op die berekende en geïmpliseerde data stelle uitgevoer en gevind dat die Philips prysverdienste verhoudingsmodel, teen 'n statistiese 0,20 vlak van beduidenheid, onsuksesvol was om die toekomstige prysverdiensteverhouding vooruit te skat. Die Philips prysverdienste verhoudingsmodel is meer kompleks as die Dividend Groei Model met meer aannames en beperkings. Die Philips prysverdienste verhoudingsmodel is 'n teoretiese instrument wat gebruik kan word om hipotetiese (alle model aannames en voorwaardes is nagekom) maatskappye te ontleed. Dus het die Philips prysverdienste verhoudingsmodel min tot geen praktiese toepassingsvermoë in die werklike waardasie van aandele nie.
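The abstract names the inputs of the Philips model (cost of equity, return on equity, dividend payout ratio) but not its closed form. The sketch below shows how an implied forward price earnings multiple can be backed out of the dividend discount (Gordon) framework from those three inputs, with growth taken as ROE times the retention ratio; it is a hedged reconstruction for illustration, not necessarily Philips' published formula, and the input values are hypothetical.

```python
def implied_forward_pe(cost_of_equity, return_on_equity, payout_ratio):
    """Gordon-model implied forward P/E from the three inputs the abstract names.
    Sustainable growth g = ROE * (1 - payout); P0/E1 = payout / (r - g).
    Illustrative reconstruction only, not necessarily Philips' exact formula."""
    growth = return_on_equity * (1.0 - payout_ratio)
    if cost_of_equity <= growth:
        raise ValueError("cost of equity must exceed implied growth for a finite multiple")
    return payout_ratio / (cost_of_equity - growth)

# Hypothetical inputs: 15% cost of equity, 20% ROE, 40% payout ratio
print(round(implied_forward_pe(0.15, 0.20, 0.40), 2))  # about 13.3
```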
133

Non-parametric volatility measurements and volatility forecasting models

Du Toit, Cornel 03 1900 (has links)
Assignment (MComm)--Stellenbosch University, 2005. / ENGLISH ABSTRACT: Volatility was originally seen to be constant and deterministic, but it was later realised that return series are non-stationary. Owing to this non-stationary nature of returns, there were no reliable ex-post volatility measurements. Subsequently, researchers focussed on ex-ante volatility models. It was only then realised that before good volatility models can be created, reliable ex-post volatility measurements need to be defined. In this study we examine non-parametric ex-post volatility measurements in order to obtain approximations of the variances of non-stationary return series. A detailed mathematical derivation and discussion of the already developed volatility measurements, in particular the realised volatility and DST measurements, are given. In theory, the higher the sample frequency of returns is, the more accurate the measurements are. These volatility measurements referred to above, however, all have shortcomings in that the realised volatility fails if the sample frequency becomes too high owing to microstructure effects. On the other hand, the DST measurement cannot handle changing instantaneous volatility. In this study we introduce a new volatility measurement, termed microstructure realised volatility, that overcomes these shortcomings. This measurement, as with realised volatility, is based on quadratic variation theory, but the underlying return model is more realistic. / AFRIKAANSE OPSOMMING: Volatiliteit is oorspronklik as konstant en deterministies beskou; dit was eers later dat besef is dat opbrengste nie-stasionêr is. Betroubare volatiliteits metings was nie beskikbaar nie weens die nie-stasionêre aard van opbrengste. Daarom het navorsers gefokus op vooruitskattingvolatiliteits modelle. Dit was eers op hierdie stadium dat navorsers besef het dat die definiëring van betroubare volatiliteit metings 'n voorvereiste is vir die skepping van goeie vooruitskattings modelle. Nie-parametriese volatiliteit metings word in hierdie studie ondersoek om sodoende benaderings van die variansies van die nie-stasionêre opbrengste reeks te beraam. 'n Gedetaileerde wiskundige afleiding en bespreking van bestaande volatiliteits metings, spesifiek gerealiseerde volatiliteit en DST- metings, word gegee. In teorie sal opbrengste wat meer dikwels waargeneem word tot beter akkuraatheid lei. Bogenoemde volatiliteits metings het egter tekortkominge aangesien gerealiseerde volatiliteit faal wanneer die steekproef frekwensie te hoog raak, weens mikrostruktuur effekte. Aan die ander kant kan die DST meting nie veranderlike oombliklike volatiliteit hanteer nie. Ons stel in hierdie studie 'n nuwe volatiliteits meting bekend, naamlik mikro-struktuur gerealiseerde volatiliteit, wat nie hierdie tekortkominge het nie. Net soos met gerealiseerde volatiliteit sal hierdie meting gebaseer wees op kwadratiese variasie teorie, maar die onderliggende opbrengste model is meer realisties.
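The abstract builds on realised volatility as a quadratic-variation-based ex-post measurement that breaks down at very high sampling frequencies. The sketch below computes the standard realised-variance estimator from intraday prices and shows sparse sampling as one common, generic mitigation of microstructure noise; it is not the thesis's own "microstructure realised volatility" estimator, and the price series is simulated.

```python
import numpy as np

def realised_variance(prices):
    """Sum of squared intraday log-returns -- the standard quadratic-variation estimate."""
    log_returns = np.diff(np.log(prices))
    return np.sum(log_returns ** 2)

def sparse_realised_variance(prices, step):
    """Sample every `step`-th price before summing: a common way to trade a little
    efficiency for robustness to microstructure noise (not the thesis's own estimator)."""
    return realised_variance(prices[::step])

# Hypothetical 1-minute prices for one trading day (390 observations)
rng = np.random.default_rng(1)
prices = 100 * np.exp(np.cumsum(0.0005 * rng.standard_normal(390)))
print(realised_variance(prices), sparse_realised_variance(prices, 5))
```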
134

Improving the accuracy of prediction using singular spectrum analysis by incorporating internet activity

Badenhorst, Dirk Jakobus Pretorius 03 1900 (has links)
Thesis (MComm)--Stellenbosch University, 2013. / ENGLISH ABSTRACT: Researchers and investors have been attempting to predict stock market activity for years. The possible financial gain that accurate predictions would offer lit a flame of greed and drive that would inspire all kinds of researchers. However, after many of these researchers had failed, they started to hypothesize that a goal such as this is not only improbable, but impossible. Previous predictions were based on historical data of the stock market activity itself and would often incorporate different types of auxiliary data. This auxiliary data ranged as far as imagination allowed in an attempt to find some correlation and some insight into the future, which could in turn lead to the figurative pot of gold. More often than not, the auxiliary data would not prove helpful. However, with the birth of the internet, endless amounts of new sources of auxiliary data presented themselves. In this thesis I propose that the near-infinite amount of data available on the internet could provide us with information that would improve stock market predictions. With this goal in mind, the different sources of information available on the internet are considered. Previous studies on similar topics presented possible ways in which we can measure internet activity, which might relate to stock market activity. These studies also gave some insights on the advantages and disadvantages of using some of these sources. These considerations are investigated in this thesis. Since a lot of this work is therefore based on the prediction of a time series, it was necessary to choose a prediction algorithm. Previously used linear methods seemed too simple for prediction of stock market activity and a new non-linear method, called Singular Spectrum Analysis, is therefore considered. A detailed study of this algorithm is done to ensure that it is an appropriate prediction methodology to use. Furthermore, since we will be including auxiliary information, multivariate extensions of this algorithm are considered as well. Some of the inaccuracies and inadequacies of these current multivariate extensions are studied and an alternative multivariate technique is proposed and tested. This alternative approach addresses the inadequacies of existing methods. With the appropriate methodology chosen and the appropriate sources of auxiliary information chosen, a concluding chapter assesses whether predictions that include auxiliary information (obtained from the internet) improve on baseline predictions that are simply based on historical stock market data. / AFRIKAANSE OPSOMMING: Navorsers en beleggers is vir jare al op soek na maniere om aandeelpryse meer akkuraat te voorspel. Die moontlike finansiële implikasies wat akkurate vooruitskattings kan inhou het 'n vlam van geldgierigheid en dryf wakker gemaak binne navorsers regoor die wêreld. Nadat baie van hierdie navorsers onsuksesvol was, het hulle begin vermoed dat so 'n doel nie net onwaarskynlik is nie, maar onmoontlik. Vorige vooruitskattings was bloot gebaseer op historiese aandeelprys data en sou soms verskillende tipes bykomende data inkorporeer. Die tipes data wat gebruik was het gestrek so ver soos wat die verbeelding toegelaat het, in 'n poging om korrelasie en inligting oor die toekoms te kry wat na die figuurlike pot goud sou lei.
Navorsers het gereeld gevind dat hierdie verskillende tipes bykomende inligting nie van veel hulp was nie, maar met die geboorte van die internet het 'n oneindige hoeveelheid nuwe bronne van bykomende inligting bekombaar geraak. In hierdie tesis stel ek dus voor dat die data beskikbaar op die internet dalk vir ons inligting kan gee wat verwant is aan toekomstige aandeelpryse. Met hierdie doel in die oog, is die verskillende bronne van inligting op die internet bestudeer. Vorige studies op verwante werk het sekere spesifieke maniere voorgestel waarop ons internet aktiwiteit kan meet. Hierdie studies het ook insig gegee oor die voordele en die nadele wat sommige bronne inhou. Hierdie oorwegings word ook in hierdie tesis bespreek. Aangesien 'n groot gedeelte van hierdie tesis dus gebaseer word op die vooruitskatting van 'n tydreeks, is dit nodig om 'n toepaslike vooruitskattings algoritme te kies. Baie navorsers het verkies om eenvoudige lineêre metodes te gebruik. Hierdie metodes het egter te eenvoudig voorgekom en 'n relatiewe nuwe nie-lineêre metode (met die naam "Singular Spectrum Analysis") is oorweeg. 'n Deeglike studie van hierdie algoritme is gedoen om te verseker dat die metode van toepassing is op aandeelprys data. Verder, aangesien ons gebruik wou maak van bykomende inligting, is daar ook 'n studie gedoen op huidige multivariaat uitbreidings van hierdie algoritme en die probleme wat dit inhou. 'n Alternatiewe multivariaat metode is toe voorgestel en getoets wat hierdie probleme aanspreek. Met 'n gekose vooruitskattingsmetode en gekose bronne van bykomende data is 'n gevolgtrekkende hoofstuk geskryf oor of vooruitskattings, wat die bykomende internet data inkorporeer, werklik in staat is om te verbeter op die eenvoudige vooruitskattings, wat slegs gebaseer is op die historiese aandeelprys data.
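The abstract adopts Singular Spectrum Analysis as the prediction methodology. The sketch below shows the basic univariate SSA decomposition step that such forecasts build on: embed the series in a Hankel trajectory matrix, take an SVD, and reconstruct a smoothed component by diagonal averaging. The window length and number of retained components are illustrative choices, and the multivariate extension proposed in the thesis is not shown.

```python
import numpy as np

def ssa_reconstruct(series, window, n_components):
    """Basic univariate SSA: embed, decompose with SVD, and rebuild a smoothed
    series from the leading components by diagonal (anti-diagonal) averaging."""
    n = len(series)
    k = n - window + 1
    # Trajectory (Hankel) matrix: columns are lagged windows of the series
    X = np.column_stack([series[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Keep the leading components and rebuild the trajectory matrix
    X_hat = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components, :]
    # Diagonal averaging maps the matrix back onto a series of length n
    recon = np.zeros(n)
    counts = np.zeros(n)
    for i in range(window):
        for j in range(k):
            recon[i + j] += X_hat[i, j]
            counts[i + j] += 1
    return recon / counts

# Hypothetical daily series: trend plus noise
rng = np.random.default_rng(2)
y = np.linspace(0, 5, 300) + 0.5 * rng.standard_normal(300)
smooth = ssa_reconstruct(y, window=40, n_components=2)
```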
135

Evidence of volatility clustering on the FTSE/JSE top 40 index

Louw, Jan Paul 12 1900 (has links)
Thesis (MBA (Business Management))--Stellenbosch University, 2008. / ENGLISH ABSTRACT: This research report investigated whether evidence of volatility clustering exists on the FTSE/JSE Top 40 Index. The presence of volatility clustering has practical implications relating to market decisions as well as the accurate measurement and reliable forecasting of volatility. This research report was conducted as an in-depth analysis of volatility, measured over five different return interval sizes covering the sample in non-overlapping periods. The volatility of each return interval size was analysed to reveal its distributional characteristics and whether it violated the normality assumption. The volatility was also analysed to identify in which way, if any, subsequent periods are correlated. For each of the interval sizes one-step-ahead volatility forecasting was conducted using Linear Regression, Exponential Smoothing, GARCH(1,1) and EGARCH(1,1) models. The results were analysed using appropriate criteria to determine which of the forecasting models were more powerful. The forecasting models range from very simple to very complex; the rationale was to determine whether more complex models outperform simpler ones. The analysis showed that there was sufficient evidence to conclude that there was volatility clustering on the FTSE/JSE Top 40 Index. It further showed that more complex models such as the GARCH(1,1) and EGARCH(1,1) only marginally outperformed less complex models, and do not offer any real benefit over simpler models such as Linear Regression. This can be ascribed to the mean reversion effect of volatility and gives further insight into the volatility structure over the sample period. / AFRIKAANSE OPSOMMING: Die navorsingsverslag ondersoek die FTSE/JSE Top 40 Indeks om te bepaal of daar genoegsame bewyse is dat volatiliteitsbondeling teenwoordig is. Die teenwoordigheid van volatiliteitsbondeling het praktiese implikasies vir besluite in finansiële markte en akkurate en betroubare volatiliteitsvooruitskattings. Die verslag doen 'n diepgaande ontleding van volatiliteit, gemeet oor vyf verskillende opbrengs interval groottes wat die steekproef dek in nie-oorvleuelende periodes. Elk van die opbrengs interval groottes se volatiliteitsverdelings word ontleed om te bepaal of dit verskil van die normaalverdeling. Die volatiliteit van die intervalle word ook ondersoek om te bepaal tot watter mate, indien enige, opeenvolgende waarnemings gekorreleer is. Vir elk van die interval groottes word 'n een-stap-vooruit vooruitskatting gedoen van volatiliteit. Dit word gedoen deur middel van Lineêre Regressie, Eksponensiële Gladstryking, GARCH(1,1) en die EGARCH(1,1) modelle. Die resultate word ontleed deur middel van erkende kriteria om te bepaal watter model die beste vooruitskattings lewer. Die modelle strek van baie eenvoudig tot baie kompleks; die rasionaal is om te bepaal of meer komplekse modelle beter resultate lewer as eenvoudiger modelle. Die ontleding toon dat daar genoegsame bewyse is om tot die gevolgtrekking te kom dat daar volatiliteitsbondeling is op die FTSE/JSE Top 40 Indeks. Dit toon verder dat meer komplekse vooruitskattingsmodelle soos die GARCH(1,1) en die EGARCH(1,1) slegs marginaal beter presteer het as die eenvoudiger vooruitskattingsmodelle en nie enige werklike voordeel bied bo eenvoudiger modelle soos Lineêre Regressie nie. Dit kan toegeskryf word aan die neiging van volatiliteit om terug te keer tot die gemiddelde, wat verdere insig lewer oor volatiliteit gedurende die steekproef.
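The abstract compares one-step-ahead volatility forecasts from Linear Regression, Exponential Smoothing, GARCH(1,1) and EGARCH(1,1). As a minimal sketch of the GARCH(1,1) case, the snippet below filters the conditional variance through a return sample and produces the one-step-ahead forecast; the parameter values are assumed rather than estimated, whereas the study would fit them to FTSE/JSE Top 40 returns.

```python
import numpy as np

def garch11_one_step(returns, omega, alpha, beta):
    """Filter the GARCH(1,1) conditional variance through the sample and return
    the one-step-ahead forecast: sigma^2_{t+1} = omega + alpha*r_t^2 + beta*sigma^2_t."""
    sigma2 = np.empty(len(returns))
    sigma2[0] = np.var(returns)            # initialise at the sample variance
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return omega + alpha * returns[-1] ** 2 + beta * sigma2[-1]

# Hypothetical parameter values; in practice omega, alpha and beta are estimated by MLE
rng = np.random.default_rng(3)
r = 0.01 * rng.standard_normal(500)
print(garch11_one_step(r, omega=1e-6, alpha=0.08, beta=0.90))
```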
136

An econophysical investigation : using the Boltzmann distribution to determine market temperature as applied to the JSE all share index

Brand, Rene 03 1900 (has links)
Thesis (MBA (Business Management))--University of Stellenbosch, 2009. / ENGLISH ABSTRACT: Econophysics is a relatively new branch of physics. It entails the application of models from physics to economics. The distributions of financial time series are the aspect most intensely studied by physicists. This study is based on a study by Kleinert and Chen, who applied the Boltzmann distribution to stock exchange data to define a market temperature that may be used by investors to indicate an impending stock market crash. Most econophysicists analysed the tail regions of the distributions, as the tails represent risk in financial data. This study's focus of analysis, on the other hand, is the characterisation of the central portion of the probability distribution. The Boltzmann distribution, a cornerstone in statistical physics, yields an exponential distribution. The objective of this study is to investigate the suitability of using a market volatility forecasting method from econophysics, namely the Boltzmann/market temperature method. As an econometric benchmark, the ARCH/GARCH method is used. Stock market indices are known to be non-normally (non-Gaussian) distributed. The distribution pattern of a stock market index of reasonably high sampling frequency (typically interday or intraday) is leptokurtic with heavy tails. Mesoscopic (interday) distributions of financial time series have been found to be exponential distributions. If the empirical exponential distribution is therefore interpreted as a Boltzmann distribution, then a market temperature can be calculated from the exponential distribution. Empirical data for this study is in the form of daily closing values of the Johannesburg Stock Exchange (JSE) All Share Index (ALSI) and the Standard & Poor's 500 (S & P 500) index for the period 1995 through to 2008. The Kleinert and Chen study made use of intraday data obtained from established markets. This study differs from the Kleinert and Chen study in that interday data obtained from an emerging market, namely the South African stock market, is used. Neither of the aforementioned two differences had a significant influence on the results of this study. The JSE ALSI log-return data displays non-Gaussian properties and the Laplace (double exponential) distribution fit the data well. A plot of the market temperature provided a clear indication of when stock market crashes occurred. Results of the econophysical (Boltzmann/market temperature) method compared well to results of the econometric (ARCH/GARCH) method and, subject to certain improvements, can be utilised successfully. A leptokurtic, non-Gaussian nature was established for daily log-returns of the JSE ALSI and the S & P 500 index. The Laplace (double exponential) distribution fit the annual log-returns of the JSE ALSI and S & P 500 index well. As a result of the good Laplace fit, annual market temperatures could be calculated for the JSE ALSI and the S & P 500 index. The market temperature method was effective in identifying market crashes for both indices, but a limitation of the method is that only annual market temperatures can be determined. The availability of intraday stock index data should improve the interval for which market temperature can be determined. / AFRIKAANSE OPSOMMING: Ekonofisika is ‘n relatiewe nuwe studieveld. Dit behels die toepassing van fisiese modelle op finansiële data. Die waarskynlikheidsverdelings van finansiële tydreekse is die aspek wat meeste deur fisici bestudeer word.
Hierdie studie is gebaseer op ‘n studie deur Kleinert en Chen. Hulle het die Boltzmann-verspreiding op ‘n aandele-indeks toegepas en ‘n mark-temperatuur bepaal. Hierdie mark-temperatuur kan deur ontleders gebruik word as waarskuwingsmeganisme teen moontlike aandelebeurs ineenstortings. Die meeste fisici het die uiterste areas van die verspreidingskurwes geanaliseer omdat hierdie uiterste areas risiko in finansiële data verteenwoordig. Die analitiese fokus van hierdie studie, aan die ander kant, is die karakterisering van die sentrale areas van die waarskynlikheidsverdeling. Die Boltzmann verspreiding, die hoeksteen van Statistiese Fisika, lewer ‘n eksponensiële waarskynlikheidsverdeling. Die doel van hierdie studie is om ‘n ondersoek te doen na die geskiktheid van die gebruik van ‘n ekonofisiese vooruitskattingsmetode, naamlik die Boltzmann/mark-temperatuur model. As ekonometriese verwysing is die "ARCH/GARCH" metode toegepas. Aandelemark indekse is bekend vir die nie-Gaussiese verspreiding daarvan. Die verspreidingspatroon van ‘n aandelemark indeks met ‘n redelike hoë steekproef frekwensie (in die orde van ‘n dag of minder) is leptokurties met breë stert-dele. Mesoskopiese (interdag) verspreidings van finansiële tydreekse is getipeer as eksponensieel. Indien die empiriese eksponensiële-verspreiding as ‘n Boltzmann-verspreiding geïnterpreteer word, kan ‘n mark-temperatuur daarvoor bereken word. Empiriese data vir die gebruik in hierdie studie is in die vorm van daaglikse sluitingswaardes van die Johannesburgse Effektebeurs (JSE) se Alle Aandele Indeks (ALSI) en die Standard en Poor 500 (S & P 500) indeks vir die periode 1995 tot en met 2008. Die Kleinert en Chen studie het van intradag data vanuit ‘n ontwikkelde mark gebruik gemaak. Hierdie studie verskil egter van die Kleinert en Chen studie deurdat van interdag data vanuit ‘n opkomende mark, naamlik die Suid-Afrikaanse aandelemark, gebruik is. Nie een van die twee voorafgaande verskille het ‘n beduidende invloed op die resultate van hierdie studie gehad nie. Die JSE ALSI se logaritmiese opbrengs data vertoon nie-Gaussiese eienskappe en die Laplace (dubbeleksponensiële) verspreiding beskryf die data goed. ‘n Grafiek van die mark-temperatuur vertoon duidelik wanneer aandelemarkineenstortings plaasgevind het. Resultate van die ekonofisiese (Boltzmann/mark-temperatuur) metode vergelyk goed met resultate van die ekonometriese ("ARCH/GARCH") metode en onderhewig aan sekere verbeteringe kan dit met sukses toegepas word. ‘n Leptokurtiese, nie-Gaussiese aard is vir daaglikse opbrengswaardes vir die JSE ALSI en die S & P 500 indeks vasgestel. ‘n Laplace (dubbel-eksponensiële) verspreiding kan goed op die jaarlikse logaritmiese opbrengste van die JSE ALSI en die S & P 500 indeks toegepas word. As gevolg van die goeie aanwending van die Laplace-verspreiding kan ‘n jaarlikse mark-temperatuur vir die JSE ALSI en die S & P 500 indeks bereken word. Die mark-temperatuur metode is effektief in die identifisering van aandelemarkineenstortings vir beide indekse, hoewel daar ‘n beperking is op die aantal mark-temperature wat bereken kan word. Die beskikbaarheid van intradag aandele indekswaardes behoort die interval waarvoor mark-temperature bereken kan word te verbeter.
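The abstract fits a Laplace (double exponential) distribution to annual log-returns and reads a Boltzmann-style market temperature off the fit. A hedged sketch of that idea: estimate the Laplace scale for each calendar year's daily log-returns by maximum likelihood (the mean absolute deviation from the median) and treat it as the temperature proxy. Kleinert and Chen's exact definition may differ, and the function and series names below are hypothetical.

```python
import numpy as np
import pandas as pd

def annual_market_temperature(close: pd.Series) -> pd.Series:
    """Fit a Laplace scale to each year's daily log-returns and treat it as the
    'market temperature'. The MLE of the Laplace scale is the mean absolute
    deviation from the median. (Kleinert and Chen's exact definition may differ.)
    Assumes `close` is a daily close-price series with a DatetimeIndex."""
    log_returns = np.log(close).diff().dropna()

    def laplace_scale(x):
        return np.mean(np.abs(x - np.median(x)))

    return log_returns.groupby(log_returns.index.year).apply(laplace_scale)

# Usage sketch (assumes `alsi_close` holds daily ALSI closing values):
# temps = annual_market_temperature(alsi_close)
# Spikes in `temps` would flag years containing market crashes.
```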
137

Derivation of Probability Density Functions for the Relative Differences in the Standard and Poor's 100 Stock Index Over Various Intervals of Time

Bunger, R. C. (Robert Charles) 08 1900 (has links)
In this study a two-part mixed probability density function was derived which described the relative changes in the Standard and Poor's 100 Stock Index over various intervals of time. The density function is a mixture of two different halves of normal distributions. Optimal values for the standard deviations for the two halves and the mean are given. Also, a general form of the function is given which uses linear regression models to estimate the standard deviations and the means. The density functions allow stock market participants trading index options and futures contracts on the S & P 100 Stock Index to determine probabilities of success or failure of trades involving price movements of certain magnitudes in given lengths of time.
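The abstract describes a density built from two different halves of normal distributions joined at the mean. A minimal sketch of such a two-piece normal density is given below, with a common normalising constant so the two halves integrate to one; the parameter values are illustrative, not the optimal values reported in the study.

```python
import numpy as np

def two_piece_normal_pdf(x, mu, sigma_left, sigma_right):
    """Density built from two half-normals sharing a mode at mu but with different
    spreads on each side; the common factor A makes the pieces integrate to one."""
    A = 2.0 / (np.sqrt(2.0 * np.pi) * (sigma_left + sigma_right))
    sigma = np.where(x <= mu, sigma_left, sigma_right)
    return A * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

# Illustrative parameters only -- the study reports optimal values for its S&P 100 data
x = np.linspace(-0.1, 0.1, 1001)
pdf = two_piece_normal_pdf(x, mu=0.0005, sigma_left=0.012, sigma_right=0.009)
print(np.trapz(pdf, x))  # approximately 1, confirming the normalisation
```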
138

Comprehensibility, Overfitting and Co-Evolution in Genetic Programming for Technical Trading Rules

Seshadri, Mukund 30 April 2003 (has links)
This thesis presents Genetic Programming methodologies to find successful and understandable technical trading rules for financial markets. The methods, when applied to the S&P500, consistently beat the buy-and-hold strategy over a 12-year period, even when considering transaction costs. Some of the methods described discover rules that beat the S&P500 with 99% significance. The work describes the use of a complexity-penalizing factor to avoid overfitting and improve comprehensibility of the rules produced by GPs. The effect of this factor on the returns for this domain area is studied, and the results indicate that it increased the predictive ability of the rules. A restricted set of operators and domain knowledge were used to improve comprehensibility. In particular, arithmetic operators were eliminated, and a number of technical indicators beyond the widely used moving averages, such as trend lines and local maxima and minima, were added. A new evaluation function that tests for consistency of returns in addition to total returns is introduced. Different cooperative coevolutionary genetic programming strategies for improving returns are studied and the results analyzed. We find that paired collaborator coevolution has the best results.
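The abstract mentions a complexity-penalising factor in the GP fitness function and an evaluation that rewards consistency of returns as well as total return. The sketch below illustrates one way such a fitness could be composed; the functional form, the weights and the example returns are assumptions, not the thesis's actual evaluation function.

```python
import numpy as np

def rule_fitness(period_returns, tree_size, size_penalty=0.001, consistency_weight=0.5):
    """Illustrative GP fitness: total return, a bonus for consistency across
    sub-periods, and a penalty that grows with the size of the rule tree.
    The functional form and weights are assumptions, not the thesis's own."""
    total_return = float(np.sum(period_returns))
    consistency = float(np.mean(np.asarray(period_returns) > 0))  # fraction of winning periods
    return total_return + consistency_weight * consistency - size_penalty * tree_size

# A small rule tree with steadier returns can outrank a large, overfit one
print(rule_fitness([0.04, 0.03, 0.05, 0.02], tree_size=15))
print(rule_fitness([0.20, -0.08, 0.15, -0.10], tree_size=80))
```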
139

A study of genetic fuzzy trading modeling, intraday prediction and modeling. / CUHK electronic theses & dissertations collection

January 2010 (has links)
This thesis consists of three parts: a genetic fuzzy trading model for stock trading, incremental intraday information for financial time series forecasting, and intraday effects in conditional variance estimation. Part A investigates a genetic fuzzy trading model for stock trading. This part contributes by using a fuzzy trading model to eliminate undesirable discontinuities, incorporating vague trading rules into the trading model, and using a genetic algorithm to select an optimal trading ruleset. Technical indicators are used to monitor stock price movement and assist practitioners in setting up trading rules to make buy-sell decisions. Although some trading rules have a clear buy-sell signal, the signals are always detected with 'hard' logic. This triggers undesirable discontinuities due to the jumps of Boolean variables that may occur for small changes of the technical indicator. Some trading rules are vague and conflicting. They are difficult to incorporate into the trading system even though they carry significant market information. Various performance comparisons, such as total return, maximum drawdown and profit-loss ratios, among different trading strategies were examined. The genetic fuzzy trading model consistently gave moderate performance. Part B studies and contributes to the literature that focuses on the forecasting of daily financial time series using intraday information. Conventional daily forecasts focus on the use of lagged daily information up to the last market close while neglecting intraday information from the last market close to the current time. Such intraday information is referred to as incremental intraday information. It can improve prediction accuracy not only at a particular instant but also through the intraday time when an appropriate predictor is derived from it. This is demonstrated in two forecasting examples, predictions of daily high and range-based volatility, using linear regression and Neural Network forecasters. The Neural Network forecaster shows a stronger causal effect of incremental intraday information on the predictand. Predictability can be estimated by a correlation without conducting any forecast. Part C explores intraday effects in conditional variance estimation. This contributes to the literature that focuses on conditional variance estimation with intraday effects. Conventional GARCH volatility is formulated with an additive-error mean equation for the daily return and an autoregressive moving-average specification for its conditional variance. However, intraday information is not included in the conditional variance, although it has implications for the daily variance. Using Engle's multiplicative-error model formulation, range-based volatility is proposed as an intraday proxy for several GARCH frameworks. The impact of significant changes in intraday data is reflected in the MEM-GARCH variance. For some frameworks, it is possible to use lagged values of range-based volatility to delay the intraday effects in the conditional variance equation. / Ng, Hoi Shing Raymond. / Adviser: Kai-Pui Lam. / Source: Dissertation Abstracts International, Volume: 72-01, Section: B, page: . / Thesis (Ph.D.)--Chinese University of Hong Kong, 2010. / Includes bibliographical references (leaves 107-114). / Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web. / Electronic reproduction.
Ann Arbor, MI : ProQuest Information and Learning Company, [200-] System requirements: Adobe Acrobat Reader. Available via World Wide Web. / Abstract also in Chinese.
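Part C of the abstract uses range-based volatility as an intraday proxy inside a multiplicative-error GARCH framework. The sketch below computes the Parkinson estimator, one standard range-based daily variance measure derived from the day's high and low; the thesis may define its range-based volatility differently, and the price data are hypothetical.

```python
import numpy as np

def parkinson_variance(high, low):
    """Parkinson (1980) range-based daily variance estimate:
    sigma_t^2 ~ (ln(H_t/L_t))^2 / (4 ln 2). One standard range-based proxy;
    the thesis may define its range-based volatility differently."""
    log_range = np.log(np.asarray(high) / np.asarray(low))
    return log_range ** 2 / (4.0 * np.log(2.0))

# Hypothetical daily highs and lows
high = np.array([101.2, 102.5, 100.8])
low = np.array([99.5, 100.9, 99.2])
print(parkinson_variance(high, low))
```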
140

Three new perspectives for testing stock market efficiency

Chandrashekar, Satyajit 29 August 2008 (has links)
Not available
