1

Bootstrap methods and parameter estimation in time series threshold modelling

Mekaiel, Mohammed M. January 1995 (has links)
The aim of this thesis is to investigate the performance of bootstrap methods (Efron, 1979) in the estimation of parameters of non-linear time series models, in particular SETAR models (Tong, 1993). First and higher order SETAR models with known and unknown thresholds are considered. To assess the performance of bootstrap methods, we first give an extensive simulation study (using simulated normal errors), in chapters 3 and 4, to investigate the large and small sample behaviour of the true sampling distributions of parameter estimates of SETAR models and how they are affected by sample size. An introduction to bootstrap methods (Efron, 1979) is given in chapter 5. The effect of sample size on the bootstrap distributions of parameter estimates of first and higher order SETAR models in the known and unknown threshold cases (for given order, delay and number of thresholds) is also investigated in this chapter, via simulation and using the same models as in the simulated normal errors 'true distribution' case (chapters 3 & 4). The results are compared with the simulated normal case in order to assess the bootstrap results. The Tong and Lim (1980) method is used for fitting SETAR models to the bootstrap samples, as it is in the initial fit. Moreover, applications of the bootstrap to celebrated data sets, namely the logarithmically transformed lynx data covering the period 1821-1934 and the sunspot numbers covering the period 1700-1920, are attempted. The cyclical behaviour of the bootstrap models is also examined. Finally, in chapter 5, an attempt is also made to study, via simulation, the non-linear properties of the skeleton of a non-linear autoregressive process (Jones, 1976), with particular attention to limit cycle behaviour.
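As a rough illustration of the approach described in this abstract, the sketch below simulates a first-order SETAR model with a known threshold, fits it by regime-wise least squares, and applies a residual bootstrap to the parameter estimates. The coefficients, threshold, sample size and number of replicates are illustrative assumptions, not values from the thesis, and the simple regime-wise least-squares fit stands in for the Tong and Lim (1980) fitting procedure used there.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_setar(n, phi1, phi2, r, rng, burn=200):
    """Simulate a first-order SETAR(2;1,1) series with threshold r and delay 1."""
    x = np.zeros(n + burn)
    e = rng.standard_normal(n + burn)
    for t in range(1, n + burn):
        a0, a1 = phi1 if x[t - 1] <= r else phi2
        x[t] = a0 + a1 * x[t - 1] + e[t]
    return x[burn:]

def fit_setar(x, r):
    """Regime-wise least squares for a first-order SETAR model with known threshold r."""
    y, lag = x[1:], x[:-1]
    params, resid = [], np.empty_like(y)
    for mask in (lag <= r, lag > r):
        X = np.column_stack([np.ones(mask.sum()), lag[mask]])
        beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
        params.append(beta)
        resid[mask] = y[mask] - X @ beta
    return params, resid

# "True" model (illustrative values, not from the thesis)
phi1, phi2, r, n = (0.5, 0.6), (-0.3, -0.4), 0.0, 200
x = simulate_setar(n, phi1, phi2, r, rng)
(est1, est2), resid = fit_setar(x, r)

# Residual bootstrap: resample centred residuals, regenerate the series, refit
boot = []
centred = resid - resid.mean()
for _ in range(500):
    e_star = rng.choice(centred, size=n, replace=True)
    xb = np.zeros(n)
    xb[0] = x[0]
    for t in range(1, n):
        a0, a1 = est1 if xb[t - 1] <= r else est2
        xb[t] = a0 + a1 * xb[t - 1] + e_star[t]
    (b1, b2), _ = fit_setar(xb, r)
    boot.append(np.concatenate([b1, b2]))

print("estimates:", np.concatenate([est1, est2]))
print("bootstrap s.e.:", np.std(boot, axis=0))
```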
2

Continuous time threshold autoregressive model

Yeung, Miu Han Iris January 1989 (has links)
No description available.
3

Genetic detection with application of time series analysis

呂素慧 Unknown Date (has links)
This article investigates the detection and identification of regime changes in non-linear time series processes. We apply a genetic algorithm together with the AIC criterion to test for regime changes, an approach that differs from traditional detection methods. In our statistical decision procedure, the moving-average mean and the genetic detection of the underlying time series are used to determine the change points. Finally, an empirical application to the detection and identification of change points in the Taiwan Business Cycle is presented.
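A minimal sketch of AIC-based change-of-regime detection in the spirit of this abstract: a single AR(1) model is compared by AIC against all two-regime splits of the series. The test series, the AR(1) segment models and the exhaustive search are illustrative assumptions; the thesis applies a genetic algorithm to search the candidate space, which matters when several change points are allowed.

```python
import numpy as np

def ar1_rss(x):
    """Residual sum of squares of an AR(1) fit by least squares."""
    y, lag = x[1:], x[:-1]
    X = np.column_stack([np.ones(len(lag)), lag])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sum((y - X @ beta) ** 2))

def aic(rss, m, k):
    # Gaussian AIC up to an additive constant: m effective obs, k parameters
    return m * np.log(rss / m) + 2 * k

def best_change_point(x, min_seg=20):
    """Compare a single AR(1) model against all two-regime splits by AIC."""
    n = len(x)
    aic_null = aic(ar1_rss(x), n - 1, 2)
    best = (None, aic_null)
    for cp in range(min_seg, n - min_seg):
        rss = ar1_rss(x[:cp]) + ar1_rss(x[cp:])
        a = aic(rss, n - 2, 4)
        if a < best[1]:
            best = (cp, a)
    return best, aic_null

# Illustrative series: AR(1) whose coefficient changes at t = 120
rng = np.random.default_rng(1)
x = np.empty(240)
x[0] = 0.0
for t in range(1, 240):
    phi = 0.2 if t < 120 else 0.8
    x[t] = phi * x[t - 1] + rng.standard_normal()

(cp, aic_split), aic_null = best_change_point(x)
print("AIC without split:", round(aic_null, 1))
print("best split at t =", cp, "with AIC", round(aic_split, 1))
```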
4

Suboptimal LULU-estimators in measurements containing outliers

Astl, Stefan Ludwig 12 1900 (has links)
Thesis (MSc)--Stellenbosch University, 2013. / ENGLISH ABSTRACT: Techniques for estimating a signal in the presence of noise which contains outliers are currently not well developed. In this thesis, we consider a constant signal superimposed by a family of noise distributions structured as a tunable mixture f(x) = α g(x) + (1 − α) h(x) between finite-support components of "well-behaved" noise g(x) with small variance and "impulsive" noise h(x) with a large amplitude and strongly asymmetric character. When α ≈ 1, h(x) can for example model a cosmic ray striking an experimental detector. In the first part of our work, a method for obtaining the expected values of the positive and negative pulses in the first resolution level of a LULU Discrete Pulse Transform (DPT) is established. Subsequent analysis of sequences smoothed by the operators L1U1 or U1L1 of LULU theory shows that a robust estimator for the location parameter of g is achieved, in the sense that the contribution by h to the expected average of the smoothed sequences is suppressed to order (1 − α)² or higher. In cases where the specific shape of h is difficult to guess due to the assumed lack of data, it is thus also shown to be of lesser importance. Furthermore, upon smoothing a sequence with L1U1 or U1L1, estimators for the scale parameters of the model distribution become easily available. In the second part of our work, the same problem and data are approached from a Bayesian inference perspective. The Bayesian estimators are found to be optimal in the sense that they make full use of the available information in the data. Heuristic comparison shows, however, that Bayes estimators do not always outperform the LULU estimators. Although the Bayesian perspective provides much insight into the logical connections inherent in the problem, its estimators can be difficult to obtain in analytic form and are slow to compute numerically. Suboptimal LULU estimators are shown to be reasonable compromises in practical problems.
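The sketch below illustrates the noise model and the L1/U1 smoothers described in the abstract under assumed, illustrative settings (uniform components for g and h, α = 0.98, a constant signal at 5.0, simple end-point handling). Isolated impulses are removed by either composition, so the impulsive component h biases the smoothed mean only when impulses land on adjacent samples, i.e. at order (1 − α)².

```python
import numpy as np

def L1(x):
    """LULU lower smoother for window size 1: removes isolated upward spikes."""
    lo = np.minimum(x[:-2], x[1:-1])
    hi = np.minimum(x[1:-1], x[2:])
    out = x.copy()            # end points are simply kept unchanged in this sketch
    out[1:-1] = np.maximum(lo, hi)
    return out

def U1(x):
    """LULU upper smoother for window size 1: removes isolated downward spikes."""
    lo = np.maximum(x[:-2], x[1:-1])
    hi = np.maximum(x[1:-1], x[2:])
    out = x.copy()
    out[1:-1] = np.minimum(lo, hi)
    return out

# Mixture noise f(x) = alpha*g(x) + (1 - alpha)*h(x): mostly well-behaved noise g,
# occasionally a large one-sided impulse h (e.g. a cosmic-ray hit on a detector).
rng = np.random.default_rng(2)
n, alpha, location = 10_000, 0.98, 5.0
well_behaved = rng.uniform(-0.5, 0.5, n)     # g: small, symmetric, finite support
impulsive = rng.uniform(20.0, 30.0, n)       # h: large, strongly asymmetric
is_impulse = rng.random(n) > alpha
x = location + np.where(is_impulse, impulsive, well_behaved)

print("sample mean (biased by impulses):", round(x.mean(), 3))
print("mean after L1U1 smoothing:       ", round(L1(U1(x)).mean(), 3))
print("mean after U1L1 smoothing:       ", round(U1(L1(x)).mean(), 3))
```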
5

On the estimation of time series regression coefficients with long range dependence

Chiou, Hai-Tang 28 June 2011 (has links)
In this paper, we study parameter estimation for the multiple linear time series regression model with long-memory stochastic regressors and innovations. Robinson and Hidalgo (1997) and Hidalgo and Robinson (2002) proposed a class of frequency-domain weighted least squares estimates, which are shown to achieve the Gauss-Markov bound with the standard convergence rate. In this study, we propose a time-domain generalized LSE approach, in which the inverse autocovariance matrix of the innovations is estimated via autoregressive coefficients. Simulation studies are performed to compare the proposed estimates with those of Robinson and Hidalgo (1997) and Hidalgo and Robinson (2002). The results show that the time-domain generalized LSE is comparable to the frequency-domain estimates and attains higher efficiency when the autoregressive or moving-average coefficients of the FARIMA models are large. A variance reduction estimator, called the TF estimator, based on a linear combination of the proposed estimator and that of Hidalgo and Robinson (2002), is further proposed to improve efficiency. The bootstrap method is applied to estimate the weights of the linear combination. Simulation results show that the TF estimator outperforms both the frequency-domain and the time-domain approaches.
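The following sketch illustrates a time-domain feasible-GLS idea related to the one described: the error dependence is approximated by autoregressive coefficients fitted to the OLS residuals, which are then used to whiten the regression before re-estimating. The AR(1) error process, the regressor and the AR order are illustrative assumptions standing in for the FARIMA regressors and innovations studied in the thesis, and the whitening step is a simplified stand-in for building the inverse autocovariance matrix from AR coefficients.

```python
import numpy as np

def fit_ar_ols(u, p):
    """Fit AR(p) coefficients to a residual series by least squares."""
    Y = u[p:]
    X = np.column_stack([u[p - k:-k] for k in range(1, p + 1)])
    phi, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return phi

def ar_filter(z, phi):
    """Quasi-difference z_t -> z_t - phi_1 z_{t-1} - ... - phi_p z_{t-p}."""
    p = len(phi)
    out = z[p:].astype(float).copy()
    for k, c in enumerate(phi, start=1):
        out -= c * z[p - k:-k]
    return out

def feasible_gls(y, X, p=4):
    # Step 1: ordinary least squares
    b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ b_ols
    # Step 2: approximate the error dependence with an AR(p) fit to the residuals
    phi = fit_ar_ols(u, p)
    # Step 3: whiten y and X with the AR filter and re-estimate
    y_w = ar_filter(y, phi)
    X_w = np.column_stack([ar_filter(X[:, j], phi) for j in range(X.shape[1])])
    b_gls, *_ = np.linalg.lstsq(X_w, y_w, rcond=None)
    return b_ols, b_gls

# Illustrative regression with strongly autocorrelated errors
rng = np.random.default_rng(3)
n = 1000
x1 = np.cumsum(rng.standard_normal(n)) / np.sqrt(n)   # slowly varying regressor
X = np.column_stack([np.ones(n), x1])
u = np.zeros(n)
for t in range(1, n):                                  # AR(1) errors as a stand-in
    u[t] = 0.9 * u[t - 1] + rng.standard_normal()      # for long-memory innovations
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + u

b_ols, b_gls = feasible_gls(y, X)
print("OLS:", b_ols.round(3), " feasible GLS:", b_gls.round(3))
```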
6

The effectiveness of central bank interventions in the foreign exchange market

Seerattan, Dave Arnold January 2012 (has links)
The global foreign exchange market is the largest financial market, with turnover often outstripping the GDP of the countries in which trading takes place. The dynamics of this market, especially price dynamics, have huge implications for financial asset values, financial returns and volatility in the international financial system. It is therefore an important area of study. Exchange rates have often departed significantly from the level implied by fundamentals and exhibit excessive volatility. This reality creates a role for central bank intervention in this market: to keep the rate in line with economic fundamentals and the overall policy mix, to stabilize market expectations and to calm disorderly markets. Studies that attempt to measure the effectiveness of intervention in the foreign exchange market in terms of exchange rate trends and volatility have had mixed results. This, in many cases, reflects the unavailability of data and the weaknesses of the empirical frameworks used to measure effectiveness. This thesis utilises the most recent data available and some of the latest methodological advances to measure the effectiveness of central bank intervention in the foreign exchange markets of a variety of countries. It therefore makes a contribution in the area of applied empirical methodologies for measuring the dynamics of intervention in the foreign exchange market. It demonstrates that, by using high-frequency data and more robust and appropriate empirical methodologies, central bank intervention in the foreign exchange market can be shown to be effective. Moreover, a framework that takes account of the interactions between different central bank policy instruments and price dynamics, the reaction function of the central bank, different states of the market, liquidity in the market and the profitability of the central bank can improve the measurement of the impact of central bank policy in the foreign exchange market and provide useful information to policy makers.
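As a heavily simplified illustration of how intervention effectiveness might be quantified, the sketch below regresses a next-day volatility proxy on an intervention dummy using entirely hypothetical data. The sample size, effect size and specification are assumptions for illustration only and bear no relation to the high-frequency, multi-instrument framework developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 750  # hypothetical daily sample

# Hypothetical data: intervention days calm volatility slightly the next day
intervened = rng.random(n) < 0.1
vol = 0.6 + 0.3 * rng.standard_normal(n) ** 2
vol[1:] -= 0.15 * intervened[:-1]          # assumed next-day calming effect

# Regress the volatility proxy on a lagged intervention dummy
y = vol[1:]
X = np.column_stack([np.ones(n - 1), intervened[:-1].astype(float)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s2 = resid @ resid / (len(y) - X.shape[1])
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
print("effect of intervention on next-day volatility: "
      f"{beta[1]:.3f} (s.e. {se[1]:.3f})")
```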
7

Testy linearity v časových řadách / Tests for time series linearity

Melicherčík, Martin January 2013 (has links)
Title: Testing for linearity in time series Author: Martin Melicherčík Department: Department of Probability and Mathematical Statistics Supervisor: doc. RNDr. Zuzana Prášková, CSc., Department of Probability and Mathematical Statistics Abstract: In the first part of the thesis, the necessary theoretical background from time series analysis is developed and subsequently used to formulate several tests for linearity. Reflecting the variety of approaches, the theory covers a wide range of material from correlation and spectral analysis and introduces some basic nonlinear models. In the second part, linearity tests are described, classified and compared, both theoretically and practically, on data simulated from several linear and nonlinear models. At the end, some scripts and hints in the R language are introduced that could be used when applying the tests to real data. Keywords: linear time series, bispectrum, testing for linearity, nonlinear models
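One concrete linearity test in the family this abstract refers to is the surrogate-data approach: phase-randomized surrogates preserve the linear (spectral) structure of the series, so a statistic sensitive to nonlinearity can be compared against its surrogate distribution. The sketch below is an illustrative version of that idea, using a third-order moment statistic and a bilinear-type test series as assumptions; it is not a reproduction of the specific tests treated in the thesis.

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Surrogate with the same periodogram as x but randomized Fourier phases,
    i.e. a realization consistent with a linear Gaussian process."""
    n = len(x)
    f = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0, 2 * np.pi, len(f))
    phases[0] = 0.0
    if n % 2 == 0:
        phases[-1] = 0.0          # keep the Nyquist coefficient real
    return np.fft.irfft(np.abs(f) * np.exp(1j * phases), n) + x.mean()

def third_order_stat(x):
    """Sample third-order moment E[x_t x_{t-1} x_{t-2}]; near zero for linear Gaussian series."""
    z = x - x.mean()
    return np.mean(z[2:] * z[1:-1] * z[:-2])

rng = np.random.default_rng(5)

# Illustrative nonlinear series: a simple bilinear-type model
n = 1000
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.4 * x[t - 1] + 0.6 * x[t - 2] * e[t - 1] + e[t]

t_obs = third_order_stat(x)
t_surr = np.array([third_order_stat(phase_randomized_surrogate(x, rng))
                   for _ in range(200)])
p_value = np.mean(np.abs(t_surr) >= np.abs(t_obs))
print(f"observed statistic {t_obs:.4f}, surrogate p-value {p_value:.3f}")
```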
8

Aplicação de teoria de sistema dinâmicos para inferência de causalidade entre séries temporais sintéticas e biológicas. / Applications of dynamical systems theory to the inference of causality between synthetic and biological time series.

Silva, Rafael Lopes Paixão da 03 April 2018 (has links)
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq) / Mathematical modeling is an almost omnipresent tool in the fields of theoretical ecology and mathematical biology. However, such models, which try to reproduce part of the natural dynamics, are limited, which quickly exhausts the possibilities for data-driven investigation and exploration. To circumvent this, we turn to the context of phase-space reconstruction, since we want to extract further information from what we have in hand: the observation of nature, the data. With this application of the theory of dynamical systems, a new kind of inference about the observed phenomenon, and about causes that remained hidden in the models, becomes possible. The technique of convergent cross mapping between attractors generated by phase-space reconstruction, in which the original phase space is represented in a Euclidean space formed by the original time series and its delays, enables a more pragmatic and more effective inference of causality for systems governed by nonlinear dynamics, which is the case for many ecological and biological series of interest. / 131659/2016-2.
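A minimal sketch of delay-coordinate reconstruction and convergent cross mapping, using a pair of coupled logistic maps as an assumed illustrative system rather than the biological series analysed in the thesis: because x drives y, the shadow manifold reconstructed from y can be used to cross-map x, and the cross-map skill should rise with the library size. The embedding dimension, coupling and simplified in-library skill measure are assumptions for illustration.

```python
import numpy as np

def delay_embed(x, E, tau=1):
    """Delay-coordinate reconstruction: rows are (x_t, x_{t-tau}, ..., x_{t-(E-1)tau})."""
    n = len(x) - (E - 1) * tau
    return np.column_stack([x[(E - 1) * tau - k * tau : (E - 1) * tau - k * tau + n]
                            for k in range(E)])

def cross_map(source_embed, target, lib_size):
    """Predict `target` from nearest neighbours on the shadow manifold of the source."""
    M = source_embed[:lib_size]
    y = target[:lib_size]
    preds = np.empty(lib_size)
    for i in range(lib_size):
        d = np.linalg.norm(M - M[i], axis=1)
        d[i] = np.inf                              # exclude the point itself
        idx = np.argsort(d)[:M.shape[1] + 1]       # E + 1 nearest neighbours
        w = np.exp(-d[idx] / max(d[idx][0], 1e-12))
        preds[i] = np.sum(w * y[idx]) / w.sum()
    return np.corrcoef(preds, y)[0, 1]

# Two coupled logistic maps: x drives y (illustrative system, not the thesis data)
n = 1200
x, y = np.empty(n), np.empty(n)
x[0], y[0] = 0.4, 0.2
for t in range(n - 1):
    x[t + 1] = x[t] * (3.8 - 3.8 * x[t])
    y[t + 1] = y[t] * (3.5 - 3.5 * y[t] - 0.2 * x[t])

E = 3
My = delay_embed(y, E)                # shadow manifold of the driven variable
x_aligned = x[(E - 1):]               # align x with the embedded rows of y
for L in (100, 400, 1100):
    print(f"library {L}: cross-map skill y->x = {cross_map(My, x_aligned, L):.3f}")
```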
9

A comparative evaluation of non-linear time series analysis and singular spectrum analysis for the modelling of air pollution

Diab, Anthony Francis 12 1900 (has links)
Thesis (MScEng)--University of Stellenbosch, 2000. / ENGLISH ABSTRACT: Air pollution is a major concern in the Cape Metropole. A major contributor to the air pollution problem is road transport. For this reason, a national vehicle emissions study is in progress with the aim of developing a national policy regarding motor vehicle emissions and control. Such a policy could bring about vehicle emission control and regulatory measures, which may have far-reaching social and economic effects. Air pollution models are important tools in predicting the effectiveness and the possible secondary effects of such policies. It is therefore essential that these models are fundamentally sound so as to maintain a high level of prediction accuracy. Complex air pollution models are available, but they require spatial, time-resolved information on emission sources and a vast amount of processing power. It is unlikely that South African cities will have the necessary spatial, time-resolved emission information in the near future. An alternative air pollution model is one based on the Gaussian Plume Model. This model, however, relies on gross simplifying assumptions that affect model accuracy. It is proposed that statistical and mathematical analysis techniques will be the most viable approach to modelling air pollution in the Cape Metropole. These techniques make it possible to establish statistical relationships between pollutant emissions, meteorological conditions and pollutant concentrations without gross simplifying assumptions or excessive information requirements. This study investigates two analysis techniques that fall into the aforementioned category, namely Non-linear Time Series Analysis (specifically, the method of delay co-ordinates) and Singular Spectrum Analysis (SSA). During the past two decades, important progress has been made in the field of Non-linear Time Series Analysis. An entire "toolbox" of methods is available to assist in identifying non-linear determinism and to enable the construction of predictive models. It is argued that the dynamics that govern a pollution system are inherently non-linear due to the strong correlation with weather patterns and the complexity of the chemical reactions and physical transport of the pollutants. In addition to this, a statistical technique (the method of surrogate data) showed that a pollution data set, the oxides of nitrogen (NOx), displayed a degree of non-linearity, albeit with a high degree of noise contamination. This suggested that a pollution data set would be amenable to non-linear analysis and, hence, Non-linear Time Series Analysis was applied to the data set. SSA, on the other hand, is a linear data analysis technique that decomposes the time series into statistically independent components. The basis functions in terms of which the data are decomposed are data-adaptive, which makes the technique well suited to the analysis of non-linear systems exhibiting anharmonic oscillations. The statistically independent components into which the data have been decomposed have limited harmonic content. Consequently, these components are more amenable to prediction than the time series itself. The fact that SSA's ability has been proven in the analysis of short, noisy non-linear signals prompted the use of this technique. The aim of the study was to establish which of these two techniques is best suited to the modelling of air pollution data.
To this end, a univariate model to predict NOx concentrations was constructed using each of the techniques. The prediction ability of each model was taken as indicative of its accuracy and was therefore used as the basis against which the two techniques were evaluated. The procedure used to construct the model and to quantify the model accuracy was the same for both the Non-linear Time Series Analysis model and the SSA model, so as to allow for unbiased comparison. In both cases, no noise reduction schemes were applied to the data prior to the construction of the model. The accuracy of a 48-hour step-ahead prediction scheme and a 100-hour step-ahead prediction scheme was used to compare the two techniques. The accuracy of the SSA model was markedly superior to that of the Non-linear Time Series Analysis model. The paramount reason for the superior accuracy of the SSA model is its ability to analyse and cope with noisy data sets such as the NOx data set. This observation provides evidence to suggest that Singular Spectrum Analysis is better suited to the modelling of air pollution data. It should therefore be the analysis technique of choice when more advanced, multivariate modelling of air pollution data is carried out. It is recommended that noise reduction schemes that decontaminate the data without destroying important higher-order dynamics be researched, since the application of an effective noise reduction scheme could lead to an improvement in model accuracy. In addition to this, the univariate SSA model should be extended to a more complex multivariate model that explicitly encompasses variables such as traffic flow and weather patterns. This will expose the inter-relationships between the variables and will enable sensitivity studies and the evaluation of a multitude of scenarios.
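For the SSA side of the comparison, the sketch below shows the basic decomposition used in such models: embed the series into a trajectory matrix, take its SVD, keep a few leading eigentriples, and diagonally average back to a series. The test signal, window length and number of retained components are illustrative assumptions, not the NOx data or the settings used in the thesis.

```python
import numpy as np

def ssa_reconstruct(x, L, components):
    """Basic SSA: embed, SVD the trajectory matrix, keep chosen eigentriples,
    and diagonal-average back to a series."""
    n = len(x)
    K = n - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])   # L x K trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = sum(s[k] * np.outer(U[:, k], Vt[k]) for k in components)
    # Diagonal (Hankel) averaging back to a length-n series
    rec = np.zeros(n)
    counts = np.zeros(n)
    for j in range(K):
        rec[j:j + L] += Xr[:, j]
        counts[j:j + L] += 1
    return rec / counts

# Illustrative signal: anharmonic oscillation plus trend and noise
rng = np.random.default_rng(7)
t = np.arange(600)
signal = 0.002 * t + np.sin(2 * np.pi * t / 50) + 0.4 * np.sin(2 * np.pi * t / 25)
x = signal + 0.5 * rng.standard_normal(len(t))

trend_and_cycles = ssa_reconstruct(x, L=120, components=range(5))
rmse_raw = np.sqrt(np.mean((x - signal) ** 2))
rmse_ssa = np.sqrt(np.mean((trend_and_cycles - signal) ** 2))
print(f"RMSE against the clean signal: raw {rmse_raw:.3f}, SSA reconstruction {rmse_ssa:.3f}")
```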
