1

Exploring functional asymptotic confidence intervals for a population mean

Tuzov, Ekaterina 10 April 2014
We take a Student process that is based on independent copies of a random variable X and has trajectories in the function space D[0,1]. As a consequence of a functional central limit theorem for this process, with X in the domain of attraction of the normal law, we consider convergence in distribution of several functionals of this process and derive the respective asymptotic confidence intervals for the mean of X. We explore the expected lengths and finite-sample coverage probabilities of these confidence intervals and of the one obtained from the asymptotic normality of the Student t-statistic, thus identifying alternatives to the latter interval that are shorter and/or have coverage probabilities at least as high.
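A minimal Monte Carlo sketch (not from the thesis) of the benchmark against which the functional intervals are compared: the finite-sample coverage and expected length of the classical Student-t confidence interval for the mean, with X drawn here from a t(3) law, an illustrative heavy-tailed choice in the domain of attraction of the normal law; the sample size and replication count are arbitrary.

```python
# Monte Carlo check of the classical Student-t interval: empirical coverage and
# expected length for the mean of X, with X ~ t(3) (illustrative heavy-tailed law).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mu_true, n, reps, alpha = 0.0, 30, 5000, 0.05

covered, lengths = 0, []
for _ in range(reps):
    x = stats.t.rvs(df=3, size=n, random_state=rng)
    m, s = x.mean(), x.std(ddof=1)
    half = stats.t.ppf(1 - alpha / 2, df=n - 1) * s / np.sqrt(n)
    covered += (m - half <= mu_true <= m + half)
    lengths.append(2 * half)

print(f"empirical coverage: {covered / reps:.3f}")   # nominal level is 0.95
print(f"mean interval length: {np.mean(lengths):.3f}")
```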
2

A unified approach to structural change tests based on F statistics, OLS residuals, and ML scores

Zeileis, Achim January 2005
Three classes of structural change tests (or tests for parameter instability), which have received much attention in both the statistics and econometrics communities but have been developed in rather loosely connected lines of research, are unified by embedding them into the framework of generalized M-fluctuation tests (Zeileis and Hornik, 2003). These classes are tests based on F statistics (supF, aveF, expF tests), on OLS residuals (OLS-based CUSUM and MOSUM tests) and on maximum likelihood scores (including the Nyblom-Hansen test). We show that (representatives from) these classes are special cases of the generalized M-fluctuation tests, based on the same functional central limit theorem but employing different functionals for capturing excessive fluctuations. After embedding these tests into the same framework, and thus understanding the relationship between these procedures for testing in historical samples, it is shown how the tests can also be extended to a monitoring situation. This is achieved by establishing a general M-fluctuation monitoring procedure and then applying the different functionals corresponding to monitoring with F statistics, OLS residuals and ML scores. In particular, an extension of the supF test to a monitoring scenario is suggested and illustrated on a real-world data set. / Series: Research Report Series / Department of Statistics and Mathematics
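As an illustration of one member of the OLS-residual class, the following hand-rolled Python sketch computes the OLS-based CUSUM statistic: the empirical fluctuation process is the scaled cumulative sum of OLS residuals and the test statistic is its supremum norm, compared with the boundary from the Brownian-bridge limit (approximately 1.36 at the 5% level). The toy data, sample size and regression design are illustrative; this is a sketch of the idea, not the authors' implementation.

```python
# Sketch of the OLS-based CUSUM test: W_n(t) is the scaled cumulative sum of OLS
# residuals, and sup_t |W_n(t)| is compared with the Brownian-bridge boundary.
import numpy as np

def ols_cusum_stat(y, X):
    """Supremum of the absolute OLS-based CUSUM (empirical fluctuation) process."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n, k = X.shape
    sigma = resid.std(ddof=k)                      # residual standard deviation
    W = np.cumsum(resid) / (sigma * np.sqrt(n))    # empirical fluctuation process
    return np.max(np.abs(W))

# toy data: regression on a constant, with a mean shift halfway through the sample
rng = np.random.default_rng(1)
n = 200
y = np.concatenate([rng.normal(0.0, 1.0, n // 2), rng.normal(1.0, 1.0, n // 2)])
X = np.ones((n, 1))

stat = ols_cusum_stat(y, X)
print(f"sup |W_n(t)| = {stat:.2f}  (approx. 5% Brownian-bridge critical value: 1.36)")
```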
3

Understanding the Functional Central Limit Theorems with Some Applications to Unit Root Testing with Structural Change

Aquino, Juan Carlos, Rodríguez, Gabriel 10 April 2018
The application of different unit root statistics is by now standard practice in empirical work. Even though their use is a routine practical matter, these statistics have complex nonstandard distributions that depend on functionals of certain stochastic processes, and their derivations represent a barrier even for many theoretical econometricians. These derivations are based on rigorous and fundamental statistical tools which are not (very) well known to applied econometricians. This paper aims to fill this gap by explaining in a simple way one of these fundamental tools: the Functional Central Limit Theorem. To this end, the paper analyzes the foundations and applicability of two versions of the Functional Central Limit Theorem within the framework of a unit root with a structural break. Initial attention is focused on the probabilistic structure of the time series to be considered. Thereafter, attention turns to the asymptotic theory for nonstationary time series proposed by Phillips (1987a), which is applied by Perron (1989) to study the effects of an (assumed) exogenous structural break on the power of the augmented Dickey-Fuller test, and by Zivot and Andrews (1992) to criticize the exogeneity assumption and propose a method for estimating an endogenous breakpoint. A systematic method for dealing with efficiency issues is introduced by Perron and Rodríguez (2003), which extends the Generalized Least Squares detrending approach of Elliott et al. (1996). An empirical application is provided.
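To make the role of the Functional Central Limit Theorem concrete, here is an illustrative Python sketch (not from the paper): it compares the Monte Carlo distribution of the normalized Dickey-Fuller coefficient statistic n(rho_hat - 1), computed on simulated driftless random walks, with its FCLT limit, the ratio of (W(1)^2 - 1)/2 to the integral of W(r)^2 over [0,1] for a standard Brownian motion W. Sample sizes and replication counts are arbitrary.

```python
# Dickey-Fuller coefficient statistic n*(rho_hat - 1) for a driftless random walk
# versus its FCLT limit, ((W(1)^2 - 1)/2) / integral_0^1 W(r)^2 dr.
import numpy as np

rng = np.random.default_rng(2)
n, reps = 500, 2000
df_stats, limit_draws = [], []

for _ in range(reps):
    # finite-sample statistic from a simulated random walk y_t = y_{t-1} + eps_t
    y = np.cumsum(rng.normal(size=n))
    ylag = y[:-1]
    rho_hat = (ylag @ y[1:]) / (ylag @ ylag)
    df_stats.append(n * (rho_hat - 1.0))

    # the limiting functional, evaluated on an independent discretized Brownian path
    W = np.cumsum(rng.normal(size=n)) / np.sqrt(n)
    limit_draws.append((W[-1] ** 2 - 1.0) / 2.0 / np.mean(W ** 2))

for q in (0.01, 0.05, 0.10, 0.50):
    print(f"{q:4.2f} quantile:  finite sample {np.quantile(df_stats, q):7.2f}   "
          f"FCLT limit {np.quantile(limit_draws, q):7.2f}")
```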
4

Special problems of non-stationarity in financial time series

Radič, Pavol January 2015
The aim of this thesis is a detailed analysis of selected approaches to unit root testing. The first chapter covers basic results from the theory of stochastic processes. We then describe Dickey-Fuller tests, t-tests and likelihood ratio tests for the presence of a unit root and derive their asymptotic properties. The numerical studies compare the accuracy of the parameter estimates, estimate quantiles of the presented distributions, display them graphically, and determine the power of the tests. The theoretical results are applied to real data, analyzed using Mathematica and R.
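In the spirit of the numerical studies described above, a short hedged sketch estimating the empirical size and power of the augmented Dickey-Fuller test by Monte Carlo against near-unit-root AR(1) alternatives; it assumes statsmodels is available, and the sample size, replication count and AR coefficients are illustrative.

```python
# Monte Carlo size and power of the augmented Dickey-Fuller test against
# near-unit-root AR(1) alternatives (parameters are illustrative).
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(3)
n, reps, alpha = 200, 500, 0.05

def simulate_ar1(rho, n):
    """AR(1) path y_t = rho * y_{t-1} + eps_t with standard normal innovations."""
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = rho * y[t - 1] + rng.normal()
    return y

for rho in (1.00, 0.98, 0.95, 0.90):
    rejections = sum(adfuller(simulate_ar1(rho, n), regression="c")[1] < alpha
                     for _ in range(reps))
    label = "size " if rho == 1.00 else "power"
    print(f"rho = {rho:.2f}: empirical {label} = {rejections / reps:.2f}")
```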
5

Stochastic Process Limits for Topological Functionals of Geometric Complexes

Andrew M Thomas (11009496) 23 July 2021
This dissertation establishes limit theory for topological functionals of geometric complexes from a stochastic process viewpoint. Standard filtrations of geometric complexes, such as the Čech and Vietoris-Rips complexes, have a natural parameter r which governs the formation of simplices: this is the basis for persistent homology. However, the parameter r may also be considered the time parameter of an appropriate stochastic process which summarizes the evolution of the filtration.

Here we examine the stochastic behavior of two of the foremost classes of topological functionals of such filtrations: the Betti numbers and the Euler characteristic. There are also two distinct setups in which the points underlying the complexes are generated: one where the points are distributed randomly in R^d according to a general density (the traditional setup) and one where the points lie in the tail of a heavy-tailed or exponentially decaying “noise” distribution (the extreme-value theory (EVT) setup).

These results constitute some of the first results combining topological data analysis (TDA) and stochastic process theory. The first collection of results establishes stochastic process limits for Betti numbers of Čech complexes of Poisson and binomial point processes for two specific regimes in the traditional setup: the sparse regime, when the parameter r governing the formation of simplices causes the Betti numbers to concentrate on components of the lowest order, and the critical regime, when the parameter r is of the order n^(-1/d) and the geometric complex becomes highly connected with topological holes of every dimension. The second collection of results establishes a functional strong law of large numbers and a functional central limit theorem for the Euler characteristic of a random geometric complex for the critical regime in the traditional setup. The final collection of results establishes functional strong laws of large numbers for geometric complexes in the EVT setup for the two classes of “noise” densities mentioned above.
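An illustrative sketch (not the dissertation's code) of the kind of topological functional studied here: the Euler characteristic curve r -> chi(VR_r) of a Vietoris-Rips filtration built on a binomial point process in the unit square, truncated at dimension 2 so that chi = #vertices - #edges + #triangles; the point count and radii are arbitrary.

```python
# Euler characteristic curve of a Vietoris-Rips filtration on a binomial point
# process in the unit square, truncated at dimension 2.
import numpy as np
from itertools import combinations
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(4)
pts = rng.uniform(size=(60, 2))        # 60 i.i.d. uniform points (binomial point process)
D = squareform(pdist(pts))             # pairwise distance matrix
n = len(pts)

def euler_characteristic(r):
    """Truncated Euler characteristic of the Vietoris-Rips complex at scale r."""
    adj = D <= r
    edges = sum(adj[i, j] for i, j in combinations(range(n), 2))
    triangles = sum(adj[i, j] and adj[j, k] and adj[i, k]
                    for i, j, k in combinations(range(n), 3))
    return n - edges + triangles

for r in (0.05, 0.10, 0.15, 0.20):
    print(f"r = {r:.2f}:  chi(VR_r) = {euler_characteristic(r)}")
```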
6

Limit theorems for limit order books

Paulsen, Michael Christoph 21 August 2014
In the first part of the thesis, we define a random state-dependent discrete model of a two-sided limit order book in terms of its key quantities: the best bid [ask] price and the standing buy [sell] volume density. We prove a limit theorem for a simple two-time-scale scaling: a slower time scale for the bid/ask prices, corresponding to the classical law of large numbers, and a faster time scale for limit order placements and cancelations that keeps the expected volume rate over the considered price interval invariant. The limit theorem states that, given regularity conditions on the random order flow, the key quantities converge in the sense of a strong law of large numbers to a tractable continuous limiting model. In the limiting model the best bid and ask price dynamics can be described by two coupled ODEs, while the dynamics of the relative buy and sell volume density functions are given as the unique solutions of two linear first-order hyperbolic PDEs with variable coefficients, specified by the expectation of the order flow parameters. The solutions are available in closed form. In the second part, we prove a functional central limit theorem, i.e. an invariance principle, for an order book model with block-shaped volume densities close to the spread. The weak limit of the two-dimensional price process (best bid and ask price) is given by a semimartingale reflecting Brownian motion in the set of admissible prices.
Simultaneously, the relative buy and sell volume densities close to the spread converge weakly to the modulus of a two-parameter Brownian motion. We also demonstrate, by example, how an SPDE for the relative volume densities can be derived in a simple case, when a strong stationarity assumption is imposed on the limit order placements and cancelations in the model suggested in the first part. In the third and final part of the thesis, we prove an averaging principle and an invariance principle for discrete processes taking values in Banach and Hilbert spaces, respectively.
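The two-dimensional semimartingale reflecting Brownian motion of the second part is beyond a short snippet, but the one-dimensional Skorokhod reflection below illustrates the basic mechanism under the Donsker scaling; this is a generic textbook construction with arbitrary parameters, not the thesis's order book model.

```python
# One-dimensional illustration of Skorokhod reflection: a Donsker-scaled random walk
# W and its reflection at 0, R(t) = W(t) - min(0, inf_{s<=t} W(s)), which stays in
# the admissible set [0, infinity).
import numpy as np

rng = np.random.default_rng(5)
n = 10_000
steps = rng.choice([-1.0, 1.0], size=n)                 # simple random walk increments
W = np.cumsum(steps) / np.sqrt(n)                       # approx. Brownian motion (Donsker scaling)
running_min = np.minimum.accumulate(np.minimum(W, 0.0)) # min(0, inf_{s<=t} W(s))
R = W - running_min                                     # reflected path

print(f"min of W: {W.min():.3f}   (the free path goes negative)")
print(f"min of R: {R.min():.3f}   (the reflected path stays nonnegative)")
print(f"fraction of time near the boundary (R < 0.01): {np.mean(R < 0.01):.3f}")
```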
7

Estimating the mean electricity consumption curve by survey sampling, taking auxiliary information into account

Lardin, Pauline 26 November 2012
In this thesis, we are interested in estimating the mean electricity consumption curve. Since the study variable is functional and storage capacities are limited or transmission costs are high, survey sampling techniques are interesting alternatives to signal compression techniques. We extend, in this functional framework, estimation methods that take available auxiliary information into account and can improve the accuracy of the Horvitz-Thompson estimator of the mean trajectory. The first approach uses the auxiliary information at the estimation stage: the mean curve is estimated using model-assisted estimators based on functional linear regression models. The second method involves the auxiliary information at the sampling stage, considering πps (unequal probability) sampling designs and the functional Horvitz-Thompson estimator. Under conditions on the entropy of the sampling design, the covariance function of the Horvitz-Thompson estimator can be estimated with the Hájek approximation extended to the functional framework. For each method we show, under weak hypotheses on the sampling design and the regularity of the trajectories, asymptotic properties of the estimator of the mean curve and of its covariance function, and we establish a functional central limit theorem. Next, we compare two methods for building confidence bands on a dataset of real load curves. The first one is based on simulations of Gaussian processes, with an asymptotic justification given for each of the proposed estimators. The second one uses bootstrap techniques in a finite population framework, adapted to take the functional nature of the data into account.
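A minimal sketch of the functional Horvitz-Thompson estimator of a mean curve under an unequal-probability (here Poisson) sampling design, with inclusion probabilities proportional to an auxiliary size variable; all curves, the auxiliary variable and the parameters are synthetic and illustrative, not EDF load data or the thesis's code.

```python
# Functional Horvitz-Thompson estimation of a mean curve under Poisson sampling
# with inclusion probabilities proportional to an auxiliary size variable.
import numpy as np

rng = np.random.default_rng(6)
N, T, n_expected = 1000, 48, 100                    # population size, time grid, expected sample size

t = np.linspace(0.0, 1.0, T)
aux = rng.lognormal(mean=0.0, sigma=0.5, size=N)    # auxiliary variable (e.g. a contracted-power proxy)
curves = aux[:, None] * (1.0 + 0.3 * np.sin(2 * np.pi * t))[None, :] \
         + rng.normal(0.0, 0.2, size=(N, T))        # synthetic consumption curves Y_i(t)

pi = np.clip(n_expected * aux / aux.sum(), None, 1.0)   # pi-ps inclusion probabilities
sampled = rng.uniform(size=N) < pi                      # Poisson sampling design

mu_true = curves.mean(axis=0)
mu_ht = (curves[sampled] / pi[sampled, None]).sum(axis=0) / N   # (1/N) * sum_{i in s} Y_i(t) / pi_i

print(f"realized sample size: {sampled.sum()}")
print(f"max pointwise error |mu_HT - mu_true|: {np.max(np.abs(mu_ht - mu_true)):.3f}")
```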
