  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
81

Optimal choice of the smoothing parameter in nonparametric density estimation for continuous-time stationary processes

El Heda, Khadijetou 25 October 2018 (has links)
This thesis addresses the choice of the smoothing parameter in nonparametric estimation of the density function for stationary ergodic continuous-time processes. The accuracy of the estimate depends greatly on this choice. The main goal is to build an automatic bandwidth selection procedure and to establish its asymptotic properties under a dependence framework general enough to be easily used in practice. The manuscript is divided into three parts. The first part reviews the literature, sets out the state of the art, and situates our contribution within it. In the second part, we design an automatic method for selecting the smoothing parameter when the density is estimated by the kernel method; this choice, derived from the cross-validation method, is asymptotically optimal. In the third part, we establish asymptotic properties of the cross-validated bandwidth, in the form of almost-sure convergence results with rates.
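The cross-validation bandwidth choice described in this abstract can be illustrated in the classical i.i.d. Gaussian-kernel setting (the thesis treats the much harder continuous-time ergodic case; this sketch only shows the least-squares cross-validation criterion itself, and all function names, the sample, and the bandwidth grid are our own illustrative choices):

```python
import numpy as np

def lscv_score(x, h):
    """Least-squares cross-validation score for a Gaussian-kernel KDE."""
    n = len(x)
    d = (x[:, None] - x[None, :]) / h
    # Integral of fhat^2: the Gaussian kernel convolved with itself is N(0, 2).
    conv = np.exp(-d**2 / 4.0) / (2.0 * np.sqrt(np.pi))
    int_f2 = conv.sum() / (n**2 * h)
    # Leave-one-out density estimate at each sample point.
    k = np.exp(-d**2 / 2.0) / np.sqrt(2.0 * np.pi)
    np.fill_diagonal(k, 0.0)
    loo = k.sum(axis=1) / ((n - 1) * h)
    return int_f2 - 2.0 * loo.mean()

def select_bandwidth(x, grid):
    """Pick the bandwidth minimizing the cross-validation score on a grid."""
    scores = [lscv_score(x, h) for h in grid]
    return grid[int(np.argmin(scores))]

rng = np.random.default_rng(0)
x = rng.standard_normal(300)
grid = np.linspace(0.05, 1.5, 60)
h_star = select_bandwidth(x, grid)
print(h_star)
```

The asymptotic optimality established in the thesis says, roughly, that such a data-driven bandwidth performs as well, in the limit, as the unknown optimal one.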
82

Financial time series analysis with competitive neural networks

Roussakov, Maxime 08 1900 (has links)
No description available.
83

Empirical Testing of the New Keynesian Phillips Curve in the Czech Republic

Plašil, Miroslav January 2003 (has links)
The New Keynesian Phillips curve (NKPC) has become a central model for studying the relation between inflation and real economic activity, notably in the framework of optimal monetary policy design. However, some recent evidence suggests that empirical data are usually at odds with the underlying theory, and the model, due to its inherent structure, represents a statistical challenge in its own right. Since Galí and Gertler (1999) published their seminal paper introducing estimation via GMM techniques, it has triggered a heated debate on its empirical relevance. Their approach has been heavily criticised by later authors, mainly on the grounds of the questionable behaviour of the GMM estimator in the NKPC context and/or its small-sample properties. The common criticisms include sensitivity to the choice of instrument set, weak identification, and small-sample bias. In this thesis I propose a new estimation strategy that remedies these shortcomings and allows reliable estimates to be obtained. The procedure exploits recent advances in GMM theory as well as in other fields of statistics, in particular time-series factor analysis and the bootstrap. The proposed strategy consists of several consecutive steps: first, to reduce the small-sample bias resulting from excessive use of instruments, I summarize all available information by employing factor analysis and include the estimated factors in the information set. In the second step I use statistical information criteria to select optimal instruments, and finally I obtain confidence intervals on the parameters using the bootstrap. In the NKPC context these methods are used for the first time, and each can also be used independently; their combination, however, provides a synergistic effect that helps improve the properties of the estimates and check the efficiency of the individual steps. The results suggest that the NKPC model can explain Czech inflation dynamics fairly well and provide some support for the underlying theory. Among other things, the results imply that a policy of disinflation may not be as costly in terms of lost aggregate product as earlier versions of the Phillips curve would indicate. However, finding a good proxy for real economic activity has proved difficult. In particular, we demonstrated that the results are conditional on how the measure is calculated; some measures even showed countercyclical behaviour. This issue, discussed in the thesis only in passing, is a subject of future research. In addition to the proposed strategy and the parameter estimates, the thesis presents some partial simulation-based findings. The simulations elaborate on the earlier literature on the naive bootstrap in the GMM context and study the performance of bootstrap modifications of unit-root and KPSS tests.
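The factor-augmentation step — summarizing a large instrument set by a few estimated factors before GMM estimation — can be sketched with a principal-components extraction. This is a simplified stand-in for the time-series factor analysis used in the thesis; the simulated panel, the choice of two factors, and all names are illustrative:

```python
import numpy as np

def pca_factors(x, k):
    """Extract k principal-component factors from a T x N instrument panel."""
    z = (x - x.mean(0)) / x.std(0)        # standardize each instrument series
    u, s, vt = np.linalg.svd(z, full_matrices=False)
    return u[:, :k] * s[:k]               # estimated factors, shape (T, k)

rng = np.random.default_rng(1)
t_len, n_instr = 120, 20
common = rng.standard_normal((t_len, 1))  # one true common factor
panel = common @ rng.standard_normal((1, n_instr)) \
    + 0.3 * rng.standard_normal((t_len, n_instr))
f = pca_factors(panel, 2)
# The leading estimated factor should track the true common factor closely.
corr = abs(np.corrcoef(f[:, 0], common[:, 0])[0, 1])
print(round(corr, 2))
```

In the strategy above, such factors would then enter the instrument set, with information criteria deciding how many to keep.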
84

Relationships between forest habitat diversity and bat communities at different spatial scales in Europe: implications for their conservation and the maintenance of their predation function

Charbonnier, Yohan 02 December 2014 (has links)
Insectivorous bats are increasingly recognized as potential regulators of pest insect populations. They also represent the group of European mammals with the most unfavorable conservation status. Forests are key habitats for many bat species but are currently under threat from climate change and fragmentation. It is therefore urgent to better understand the relationships between bats, their prey, and their habitats in forests. Our main objective was to quantify the effects, at multiple spatial scales, of the main attributes of forest habitats on the activity, species richness, functional diversity, and composition of European bat communities. These were studied using manipulative experiments in Aquitaine plantation forests and automatic recordings in the network of exploratory plots set up in six European countries by the FunDivEurope project. From the plot to the continental scale, increasing tree species diversity, the proportion of broadleaved trees, and the amount of dead wood had positive effects on bat communities through an increase in prey and roost resources. These effects were not stationary, however, being stronger at higher latitudes, probably due to lower habitat carrying capacity under harsher climatic conditions. In addition, we experimentally demonstrated that the numerical and functional responses of bats to prey density can result in effective regulation of pine processionary moth populations. Forest management strategies aimed at enhancing key habitat structures are finally proposed in order to improve the conservation of bats and to increase the pest regulation service they can provide.
85

Numerical analysis for random processes and fields and related design problems

Abramowicz, Konrad January 2011 (has links)
In this thesis, we study numerical analysis for random processes and fields. We investigate the behavior of the approximation accuracy for specific linear methods based on a finite number of observations. Furthermore, we propose techniques for optimizing the performance of the methods for particular classes of random functions. The thesis consists of an introductory survey of the subject and related theory and four papers (A-D). In Paper A, we study a Hermite spline approximation of quadratic mean continuous and differentiable random processes with an isolated point singularity. We consider a piecewise polynomial approximation combining two different Hermite interpolation splines for the interval adjacent to the singularity point and for the remaining part. For locally stationary random processes, sequences of sampling designs eliminating asymptotically the effect of the singularity are constructed. In Paper B, we focus on approximation of quadratic mean continuous real-valued random fields by a multivariate piecewise linear interpolator based on a finite number of observations placed on a hyperrectangular grid. We extend the concept of local stationarity to random fields and, for the fields from this class, we provide exact asymptotics for the approximation accuracy. Some asymptotic optimization results are also provided. In Paper C, we investigate numerical approximation of integrals (quadrature) of random functions over the unit hypercube. We study the asymptotics of a stratified Monte Carlo quadrature based on a finite number of randomly chosen observations in strata generated by a hyperrectangular grid. For the locally stationary random fields (introduced in Paper B), we derive exact asymptotic results together with some optimization methods. Moreover, for a certain class of random functions with an isolated singularity, we construct a sequence of designs eliminating the effect of the singularity. In Paper D, we consider a Monte Carlo pricing method for arithmetic Asian options. An estimator is constructed using a piecewise constant approximation of an underlying asset price process. For a wide class of Lévy market models, we provide upper bounds for the discretization error and the variance of the estimator. We construct an algorithm for accurate simulations with controlled discretization and Monte Carlo errors, and obtain estimates of the option price with a predetermined accuracy at a given confidence level. Additionally, for the Black-Scholes model, we optimize the performance of the estimator by using a suitable variance reduction technique.
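The piecewise-constant Monte Carlo estimator of Paper D can be illustrated in the Black-Scholes special case. This is only a sketch: the paper covers general Lévy models and controlled error bounds, which this toy version omits, and the parameter values and function name are our own:

```python
import numpy as np

def asian_call_mc(s0, strike, r, sigma, t, n_steps, n_paths, seed=0):
    """Monte Carlo price of an arithmetic-average Asian call under
    Black-Scholes, holding the price constant between grid nodes."""
    rng = np.random.default_rng(seed)
    dt = t / n_steps
    z = rng.standard_normal((n_paths, n_steps))
    # Exact GBM evolution at the grid nodes t_1, ..., t_n.
    log_s = np.log(s0) + np.cumsum(
        (r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    avg = np.exp(log_s).mean(axis=1)       # arithmetic average over the grid
    disc = np.exp(-r * t) * np.maximum(avg - strike, 0.0)
    price = disc.mean()
    stderr = disc.std(ddof=1) / np.sqrt(n_paths)
    return price, stderr

price, se = asian_call_mc(100, 100, 0.05, 0.2, 1.0, 50, 100_000)
print(price, se)
```

The standard error returned alongside the price is what an algorithm with a predetermined accuracy target would monitor, refining the grid and the number of paths until both error sources are controlled.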
86

Risk Homeostasis Reconsidered - The Limits of Traffic Safety Regulation

Kalus, Falk 01 October 2001 (has links) (PDF)
Risk homeostasis theory (RHT) is a behavioural theory of risk taking in road traffic. So far, most published papers testing RHT on long time series have relied on econometric methods that are not well suited to the purpose. We propose instead to address the issue using the econometric concept of stationarity together with a method based on dummy variables. We then test RHT against German traffic accident data, analyzing specifically the effectiveness of compulsory traffic safety measures (here, the tightening of the seat-belt requirement with penalties) as well as the effects of German reunification; according to RHT, both should be ineffective. Our results, obtained using several risk measures, show only weak evidence for RHT. Contrary to its postulate, we find that consistently enforced safety measures backed by penalties had a strongly positive effect on the road traffic accident situation. We also examine the complex decision-making processes of road users in the context of several theories of individual decision-making under uncertainty, and show that these theoretical concepts are well suited to explaining the actual behaviour of road users.
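The dummy-variable approach mentioned above amounts to an intervention regression: a step dummy switched on at the date of the safety measure, whose coefficient measures the level shift in accident risk. A minimal sketch on simulated data (the break date, trend, and effect size are invented for illustration, not taken from the thesis):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
t = np.arange(n)
policy = (t >= 120).astype(float)      # hypothetical enforcement date
# Simulated log accident rate: slow trend plus a level shift at the break.
y = 5.0 - 0.002 * t - 0.30 * policy + 0.05 * rng.standard_normal(n)

# OLS with intercept, trend, and the intervention dummy.
x = np.column_stack([np.ones(n), t, policy])
beta, *_ = np.linalg.lstsq(x, y, rcond=None)
print(round(beta[2], 2))               # estimated intervention effect
```

Under RHT the dummy coefficient should be indistinguishable from zero; a significantly negative estimate, as recovered here by construction, is the pattern the thesis reports for the German data.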
87

Location-based estimation of the autoregressive coefficient in ARX(1) models.

Kamanu, Timothy Kevin Kuria January 2006 (has links)
In recent years, two estimators have been proposed to correct the bias exhibited by the least-squares (LS) estimator of the lagged dependent variable (LDV) coefficient in dynamic regression models when the sample is finite. They have been termed the 'mean-unbiased' and 'median-unbiased' estimators. Relative to other similar procedures in the literature, the two location-based estimators have the advantage that they offer an exact and uniform methodology for LS estimation of the LDV coefficient in a first-order autoregressive model with or without exogenous regressors, i.e. ARX(1).

However, no attempt has been made to accurately establish and/or compare the statistical properties of these estimators, either among themselves or relative to those of the LS estimator, when the LDV coefficient is restricted to realistic values. Neither has there been an attempt to compare their performance in terms of mean squared error (MSE) when various forms of the exogenous regressors are considered. Furthermore, only implicit confidence intervals have been given for the 'median-unbiased' estimator; explicit confidence bounds that are directly usable for inference are not available for either estimator. In this study a new estimator of the LDV coefficient is proposed: the 'most-probably-unbiased' estimator. Its performance properties vis-a-vis the existing estimators are determined and compared when the parameter space of the LDV coefficient is restricted.

In addition, the following new results are established: (1) an explicit computable form for the density of the LS estimator is derived for the first time, and an efficient method for its numerical evaluation is proposed; (2) the exact bias, mean, median, and mode of the distribution of the LS estimator are determined in three specifications of the ARX(1) model; (3) the exact variance and MSE of the LS estimator are determined; (4) the standard errors associated with determining the same quantities by simulation rather than numerical integration are established, and the two methods are compared in terms of computational time and effort; (5) an exact method of evaluating the density of the three estimators is described; (6) their exact bias, mean, variance, and MSE are determined and analysed; and finally, (7) a method of obtaining explicit exact confidence intervals from the distribution functions of the estimators is proposed.

The discussion and results show that the estimators are still biased in the usual sense, 'in expectation', but the bias is substantially reduced compared to that of the LS estimator. The findings are important for the specification of time-series regression models, point and interval estimation, decision theory, and simulation.
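The small-sample bias that motivates these location-based corrections, and the median function whose inversion defines the median-unbiased estimator, can both be illustrated by simulation. This sketch (our own, with a simple non-stationary start-up and illustrative sample size) estimates the median of the LS estimator in an AR(1) with intercept when the true coefficient is 0.9:

```python
import numpy as np

def ls_ar1(y):
    """Least-squares estimate of the AR(1) coefficient (with intercept)."""
    x = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    beta, *_ = np.linalg.lstsq(x, y[1:], rcond=None)
    return beta[1]

def median_of_ls(rho, n, n_rep, rng):
    """Simulated median of the LS estimator when the true value is rho."""
    est = np.empty(n_rep)
    for rep in range(n_rep):
        e = rng.standard_normal(n)
        y = np.empty(n)
        y[0] = e[0]                       # simple start-up, not stationary
        for i in range(1, n):
            y[i] = rho * y[i - 1] + e[i]
        est[rep] = ls_ar1(y)
    return np.median(est)

rng = np.random.default_rng(3)
m = median_of_ls(0.9, 30, 2000, rng)
print(round(m, 2))                        # noticeably below the true 0.9
```

A median-unbiased estimator inverts this map: given an observed LS estimate, it reports the rho whose simulated (or exact) median equals that estimate.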
88

Modeling non-stationary resting-state dynamics in large-scale brain models

Hansen, Enrique Carlos 27 February 2015 (has links)
The complexity of human cognition is revealed in the spatio-temporal organization of brain dynamics. We can gain insight into this organization through the analysis of blood-oxygenation-level-dependent (BOLD) signals obtained from functional magnetic resonance imaging (fMRI). In BOLD data we observe statistical dependencies between brain regions, a phenomenon known as functional connectivity (FC). Computational models are being developed to reproduce the FC of the brain. As in previous empirical studies, these models assume that FC is stationary, i.e. that the mean and the covariance of the BOLD time series underlying the FC are constant over time. Nevertheless, recent empirical studies of the dynamics of FC at different time scales show that FC varies in time, a feature not reproduced in the data simulated by previous computational models. Here we enhance the non-linearity of the local dynamics in a large-scale computational model; with this enhancement, the model is able to reproduce the variability of FC observed in empirical data.
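Time-varying FC of the kind discussed here is commonly quantified with sliding-window correlation matrices. A minimal sketch on synthetic data (the window length, step, and imposed coupling are illustrative choices, not those of the thesis):

```python
import numpy as np

def sliding_window_fc(bold, win, step):
    """Correlation matrices in sliding windows over a regions x time array."""
    n_reg, n_t = bold.shape
    mats = []
    for start in range(0, n_t - win + 1, step):
        mats.append(np.corrcoef(bold[:, start:start + win]))
    return np.array(mats)                 # shape: windows x regions x regions

rng = np.random.default_rng(4)
n_reg, n_t = 5, 600
bold = rng.standard_normal((n_reg, n_t))
# Impose a coupling between regions 0 and 1 in the second half only.
bold[1, 300:] = 0.8 * bold[0, 300:] + 0.2 * rng.standard_normal(300)
fc = sliding_window_fc(bold, win=100, step=50)
print(fc[0, 0, 1].round(2), fc[-1, 0, 1].round(2))
```

A stationary model would produce windowed matrices that fluctuate only by sampling noise; the variability across windows is the signature the enhanced non-linear model aims to reproduce.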
89

Development of a precipitation index-based conceptual model to overcome sparse data barriers in runoff prediction in cold climate

Akanegbu, J. O. (Justice Orazulukwe) 07 December 2018 (has links)
Abstract: This thesis describes the development of a new precipitation index-based conceptual water balance model whose parameters are easily regionalized through functional relationships with catchment and climate attributes. It also presents a simple method for improving model dynamics for streamflow simulation in a non-stationary climate. The model was developed for streamflow modelling and prediction in high-latitude catchments, where model parameter regionalization is difficult due to the limited availability of hydrological data. The model couples a snow accumulation and melt formulation with a current precipitation index (CPI) formulation to simulate the daily runoff hydrograph of catchments with seasonal snow cover. Using new runoff conversion factors CT and Lf and a threshold flow factor ThQ, the simulated CPI hydrograph is converted into daily runoff and routed using the transformation function Maxbas. The model was implemented in a Microsoft Excel workbook and tested on 32 catchments in Finland, a region with considerable seasonal snow cover. The results showed that the model can adequately simulate and reproduce the dynamics of daily runoff from catchments whose underlying physical conditions are not known. In addition, incorporating the temperature conditions that influence inter-annual variability in streamflow into the model structure improved its structural dynamics, and thereby its performance in a non-stationary climate. Most model parameters showed strong relationships with observable catchment characteristics, climate characteristics, or both. The parameter functional relationships derived from these relationships produced equally good results when applied to independent test catchments treated as mock-ungauged catchments. Including snow-water-equivalent records and using multiple objective functions for snow-water-equivalent and runoff simulations during model optimization helped reduce the effect of parameter equifinality, making it easier to determine optimal parameter values. The current precipitation index (CPIsnow) model is thus a parsimonious tool for predicting streamflow in data-limited high-latitude regions.
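The snow accumulation-and-melt component coupled to the CPI formulation is, in spirit, a degree-day scheme: precipitation accumulates as snowpack below a temperature threshold and melts at a rate proportional to the temperature excess. A sketch under that assumption (the threshold and melt-factor values are illustrative; the actual CPIsnow parameters CT, Lf, and ThQ are not reproduced here):

```python
import numpy as np

def degree_day_snow(precip, temp, t_thresh=0.0, ddf=3.0):
    """Degree-day snow routine: precip (mm/day) falls as snow below
    t_thresh (deg C); melt is ddf * (temp - t_thresh) mm/day when warmer,
    limited by the available snowpack."""
    pack = 0.0
    melt_out = np.zeros_like(precip)
    rain_out = np.zeros_like(precip)
    for i, (p, t) in enumerate(zip(precip, temp)):
        if t < t_thresh:
            pack += p                      # accumulate snowfall
        else:
            rain_out[i] = p                # precipitation reaches the ground
            melt = min(pack, ddf * (t - t_thresh))
            pack -= melt
            melt_out[i] = melt
    return melt_out, rain_out

# Cold spell with snowfall followed by a thaw.
precip = np.array([5.0, 5.0, 0.0, 0.0, 0.0, 2.0])
temp = np.array([-5.0, -3.0, 2.0, 4.0, 6.0, 5.0])
melt, rain = degree_day_snow(precip, temp)
print(melt, rain)
```

In the full model, the melt-plus-rain series would feed the CPI formulation, which converts it into the daily runoff hydrograph via the conversion and routing parameters.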
90

Hedging volatility and correlation risk in a portfolio

Malongo Elouaï, Hassan 11 February 2014 (has links)
This work focuses on modelling the dynamics of volatilities and correlations between financial asset returns. After a review of the literature on univariate and multivariate GARCH-type models, the author establishes existence and uniqueness results for the stationary solutions of dynamic conditional correlation models (DCC; Engle, 2002). He then extends this class of models by including instantaneous volatilities and regime-switching probabilities in the dynamics of the correlations. The new models are evaluated empirically on a portfolio of MSCI indices. Formal tests show that some of these new specifications improve the predictive power of the returns covariance matrix and would be useful in portfolio management. Finally, turning to volatility risk, the author shows that hedging strategies for the main European equity indices based on implied volatility indices (VIX, VSTOXX) are relevant and make it possible both to hedge and to reduce the equity risk of a portfolio.
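The DCC recursion (Engle, 2002) whose stationary solutions the thesis studies can be sketched as follows. This is a simulation of the correlation path given standardized residuals, not an estimator; a + b < 1 is the usual sufficient condition assumed here, and the parameter values and data are illustrative:

```python
import numpy as np

def dcc_correlations(eps, a, b):
    """DCC recursion on standardized residuals eps (T x N):
    Q_t = (1 - a - b) * S + a * e_{t-1} e_{t-1}' + b * Q_{t-1},
    with R_t obtained by rescaling Q_t to unit diagonal."""
    assert a >= 0 and b >= 0 and a + b < 1   # stationarity-type condition
    t_len, n = eps.shape
    s = np.corrcoef(eps.T)                   # unconditional correlation target
    q = s.copy()
    r_path = np.empty((t_len, n, n))
    for t in range(t_len):
        d = 1.0 / np.sqrt(np.diag(q))
        r_path[t] = q * np.outer(d, d)       # correlation matrix R_t
        e = eps[t][:, None]
        q = (1 - a - b) * s + a * (e @ e.T) + b * q
    return r_path

rng = np.random.default_rng(5)
eps = rng.standard_normal((500, 3))
r = dcc_correlations(eps, a=0.05, b=0.90)
print(r.shape)
```

The extensions studied in the thesis modify this recursion by letting instantaneous volatilities and regime-switching probabilities enter the correlation dynamics.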
