1 |
Estimation of operator-valued kernel-based vector autoregressive models: Application to network inference. Lim, Néhémy, 02 April 2015 (has links)
In multivariate time series analysis, existing models are often used for forecasting, i.e. estimating future values of the observed system from previously observed values. Another purpose is to uncover causal relationships among the state variables of a dynamical system. We focus on the latter and develop tools to address this problem. In this thesis, we define a new family of nonparametric vector autoregressive models based on operator-valued kernels. Assuming a sparse underlying structure, we control the model's sparsity through a loss function that includes sparsity-inducing penalties on the model parameters (which are the vectors weighting a linear combination of kernels). The selected kernels sometimes involve hyperparameters that must be learned depending on the nature of the problem. On the one hand, when expert knowledge or working assumptions allow presetting the kernel parameters, the learning problem reduces to estimating only the model parameters. To optimize the corresponding loss function, we develop a proximal algorithm. On the other hand, when no prior knowledge is available, some kernels have parameters that cannot be fixed a priori, which leads to learning the kernel parameters jointly with the model parameters. For this, we resort to an alternating optimization scheme that relies on proximal methods. Subsequently, we propose to build an estimate of the adjacency matrix encoding the underlying causal network by computing a statistic of the instantaneous Jacobian matrices.
In a high-dimensional setting, i.e. with an insufficient amount of data relative to the number of variables, we design an ensemble methodology that shares features of boosting and random forests. To demonstrate the performance of the developed models, we apply them to two datasets: simulated data from gene regulatory networks and real climate data.
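The proximal step for an l1-type sparsity penalty mentioned in the abstract reduces to coordinatewise soft-thresholding. A minimal numpy sketch of such a proximal gradient (ISTA) loop, with illustrative names and a generic design matrix standing in for the thesis's operator-valued kernel construction, might look like:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrink each entry toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(K, y, lam, step, n_iter=500):
    # Proximal gradient (ISTA) for 0.5 * ||y - K c||^2 + lam * ||c||_1,
    # where c holds the parameters weighting the kernel combination.
    c = np.zeros(K.shape[1])
    for _ in range(n_iter):
        grad = K.T @ (K @ c - y)                          # gradient of smooth part
        c = soft_threshold(c - step * grad, step * lam)   # proximal step
    return c

# Tiny illustration: recover a sparse coefficient vector from a random design.
rng = np.random.default_rng(0)
K = rng.standard_normal((50, 10))
c_true = np.zeros(10)
c_true[[1, 4]] = [2.0, -3.0]
y = K @ c_true
c_hat = ista(K, y, lam=0.1, step=1.0 / np.linalg.norm(K, 2) ** 2)
```

The step size 1 / ||K||^2 guarantees descent for the smooth part; the thesis's actual algorithm operates on vector-valued parameters and structured penalties, which this scalar sketch does not capture.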
|
2 |
Nil. Liu, Tse-Tseng, 27 July 2000 (has links)
Nil
|
3 |
Inflation Targeting in Developing Countries and Its Applicability to the Turkish Economy. Tutar, Eser, 01 August 2002 (has links)
Inflation targeting is a monetary policy regime characterized by the public announcement of official target ranges or quantitative targets for price level increases, and by explicit acknowledgement that low inflation is the most crucial long-run objective of the monetary authorities. There are three prerequisites for inflation targeting: 1) central bank independence, 2) having a sole target, and 3) the existence of a stable and predictable relationship between monetary policy instruments and inflation. In many developing countries, the use of seigniorage revenues as an important source of financing public debt, the lack of commitment to low inflation as a primary goal by monetary authorities, considerable exchange rate flexibility, and the lack of substantial operational independence of the central bank or of powerful models for domestic inflation forecasts hinder the satisfaction of these requirements. This study investigates the applicability of inflation targeting to the Turkish economy. Central bank independence in Turkey has been hindered mainly by "fiscal dominance" through the monetization of high budget deficits. In addition, although serious steps have recently been taken under a new law to create an independent central bank, such as a formal commitment to price stability as the primary objective and the prohibition of credit extension to the government, the central bank does not satisfy the independence criteria because of problems associated with appointments made by the government and the Treasury's share in the bank. Having a sole inflation target was long hindered by the fixed exchange rate system. However, in February 2001, Turkey switched to a floating exchange rate regime, which is important for a successful inflation-targeting regime.
Having a sole target within the system has also been supported by the new central bank law, which gives priority to price stability and supports any other objective as long as it is consistent with price stability. In this thesis, an empirical investigation is carried out to assess Turkey's statistical readiness to satisfy the requirements of inflation targeting, using vector autoregressive (VAR) models. The results suggest that inflation is an inertial phenomenon in Turkey, and that innovations in money, interest rates, and nominal exchange rates are not economically or statistically important determinants of prices. Most of the variance in prices is explained by prices themselves. According to the VAR evidence, the direct linkages between monetary policy instruments and inflation do not appear to be strong, stable, or predictable. As a result, while the second requirement of the inflation-targeting regime seems to have been satisfied, there remain problems with central bank independence and with the existence of a stable and predictable relationship between monetary policy instruments and inflation in Turkey. / Master of Arts
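The kind of VAR evidence described above, inertia dominating with weak links from monetary instruments, can be illustrated with a toy two-variable VAR(1) estimated by OLS. The series names and coefficients below are invented for illustration and are not taken from the study:

```python
import numpy as np

def fit_var1(X):
    # OLS estimate of A in X_t = A X_{t-1} + e_t for a zero-mean system.
    Y, Z = X[1:], X[:-1]
    A_hat = np.linalg.lstsq(Z, Y, rcond=None)[0].T
    return A_hat, Y - Z @ A_hat.T

# Illustration: an inertial "inflation" series driven mostly by its own lag,
# with only a weak link from a "money growth" series.
rng = np.random.default_rng(1)
A_true = np.array([[0.95, 0.02],   # inflation: almost all own-lag persistence
                   [0.10, 0.50]])  # money growth
X = np.zeros((2001, 2))
for t in range(2000):
    X[t + 1] = A_true @ X[t] + rng.standard_normal(2)
A_hat, resid = fit_var1(X)
```

A large own-lag coefficient in the first row of `A_hat`, alongside a near-zero cross coefficient, is the VAR signature of inertial inflation that the abstract reports.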
|
4 |
Essays on oil price shocks and financial markets. Wang, Jiayue, January 2012 (has links)
This thesis is composed of three chapters, which can be read independently. The first chapter investigates how oil price volatility affects investment decisions for a panel of Japanese firms. The model is estimated using a system generalized method of moments technique for panel data. The results show a U-shaped relationship between oil price volatility and Japanese firm investment, and subsample results indicate that this U-shaped relationship is more pronounced for oil-intensive firms and small firms. The second chapter examines the underlying causes of changes in the real oil price and their transmission mechanisms in the Japanese stock market. I decompose real oil price changes into three components, namely oil supply shocks, aggregate demand shocks, and oil-specific demand shocks, and then estimate the dynamic effects of each component on stock returns using a structural vector autoregressive (SVAR) model. I find that the responses of aggregate Japanese real stock returns differ substantially with the underlying causes of oil price changes. In the long run, oil shocks account for 43% of the variation in Japanese real stock returns. The response of Japanese real stock returns to oil price shocks can be attributed in its entirety to cash flow variations. The third chapter tests the robustness of the SVAR approach and investigates the impact of oil price shocks on different U.S. stock indices. I find that the responses of real stock returns across stock indices differ substantially depending on the underlying cause of the oil price increase, and that the magnitude and duration of the effect depend on firm size. The response of U.S. stock returns to oil price shocks can be attributed to variations in expected discount rates and expected cash flows.
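The decomposition of reduced-form residuals into orthogonal structural shocks can be sketched with a recursive (Cholesky) identification scheme, a standard textbook device for this kind of SVAR. The data and the lower-triangular matrix below are simulated for illustration, not the chapter's estimated model:

```python
import numpy as np

# Recursive identification: map reduced-form VAR residuals u_t into orthogonal
# structural shocks e_t via u_t = B0 e_t, with B0 lower-triangular.
# Ordering here (illustrative): oil supply, aggregate demand, oil-specific demand.
rng = np.random.default_rng(2)
B0_true = np.array([[1.0, 0.0, 0.0],
                    [0.3, 1.0, 0.0],
                    [0.2, 0.5, 1.0]])
e = rng.standard_normal((5000, 3))       # true structural shocks (unit variance)
u = e @ B0_true.T                        # observed reduced-form residuals

Sigma = np.cov(u, rowvar=False)          # residual covariance matrix
B0_hat = np.linalg.cholesky(Sigma)       # recovers B0 up to sampling error
e_hat = np.linalg.solve(B0_hat, u.T).T   # implied structural shocks
```

The recovered shocks are mutually uncorrelated by construction, which is what allows the variance of stock returns to be apportioned across the three shock types.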
|
5 |
Weekly Two-Stage Robust Generation Scheduling for Hydrothermal Power Systems. Dashti, Hossein; Conejo, Antonio J.; Jiang, Ruiwei; Wang, Jianhui, 11 1900 (has links)
Compared to short-term forecasting (e.g., 1 day), it is often challenging to accurately forecast the volume of precipitation over a medium-term horizon (e.g., 1 week). As a result, fluctuations in water inflow can trigger generation shortages and electricity price spikes in a power system with major or predominant hydro resources. In this paper, we study a two-stage robust scheduling approach for a hydrothermal power system. We consider water inflow uncertainty and employ a vector autoregressive (VAR) model to represent its seasonality, from which we construct an uncertainty set for the robust optimization approach. We design a Benders' decomposition algorithm to solve this problem and present results for the proposed approach on a real-world case study.
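One simple way to turn a fitted VAR into an uncertainty set for robust optimization is an interval (box) set around the point forecast, scaled by residual standard deviations. The sketch below assumes this box form purely for illustration; the paper's actual set construction and seasonal VAR specification may differ:

```python
import numpy as np

# Illustrative pipeline: fit a VAR(1) to simulated "weekly inflow" histories,
# then build a box uncertainty set around the one-week-ahead point forecast.
rng = np.random.default_rng(5)
A = np.array([[0.7, 0.1],
              [0.0, 0.6]])               # assumed inflow dynamics (two reservoirs)
X = np.zeros((500, 2))
for t in range(499):
    X[t + 1] = A @ X[t] + rng.standard_normal(2) * 0.5

A_hat = np.linalg.lstsq(X[:-1], X[1:], rcond=None)[0].T   # OLS VAR(1) fit
resid = X[1:] - X[:-1] @ A_hat.T
sigma = resid.std(axis=0)                # per-reservoir residual spread

forecast = A_hat @ X[-1]                 # one-step-ahead point forecast
lower = forecast - 2.0 * sigma           # box uncertainty set bounds that the
upper = forecast + 2.0 * sigma           # robust second stage would hedge against
```

The width multiplier (here 2) plays the role of a robustness budget: widening it protects against rarer inflow realizations at the cost of more conservative schedules.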
|
6 |
Macroeconomic Forecasting: Statistically Adequate, Temporal Principal Components. Dorazio, Brian Arthur, 05 June 2023 (has links)
The main goal of this dissertation is to expand the use of Principal Component Analysis (PCA) in macroeconomic forecasting, particularly in cases where traditional principal components fail to account for all of the systematic information making up common macroeconomic and financial indicators. At the outset, PCA is viewed as a statistical model derived from the reparameterization of the Multivariate Normal model in Spanos (1986). To motivate a PCA forecasting framework prioritizing sound model assumptions, it is demonstrated through simulation experiments that model mis-specification erodes the reliability of inferences. The Vector Autoregressive (VAR) model at the center of these simulations allows for the Markov (temporal) dependence inherent in macroeconomic data and serves as the basis for extending conventional PCA. Stemming from the relationship between PCA and the VAR model, an operational out-of-sample forecasting methodology is prescribed, incorporating statistically adequate, temporal principal components, i.e. principal components that capture not only Markov dependence but all of the other relevant information in the original series. The macroeconomic forecasts produced by applying this framework to several common macroeconomic indicators are shown to outperform standard benchmarks in terms of predictive accuracy over longer forecasting horizons. / Doctor of Philosophy / The landscape of macroeconomic forecasting and nowcasting has shifted drastically with the advent of big data. Armed with significant growth in computational power and data collection resources, economists have augmented their arsenal of statistical tools to include those that produce reliable results in big-data environments. At the forefront of such tools is Principal Component Analysis (PCA), a method that reduces a set of predictors to a few factors containing the majority of the variation in the original data series.
This dissertation expands the use of PCA in forecasting key macroeconomic indicators, particularly in instances where traditional principal components fail to account for all of the systematic information comprising the data. Ultimately, a forecasting methodology is established that incorporates temporal principal components, which are capable of capturing both time dependence and the other relevant information in the original series. In the final analysis, the methodology is applied to several common macroeconomic and financial indicators, and the forecasts produced using this framework are shown to outperform standard benchmarks in terms of predictive accuracy over longer forecasting horizons.
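A rough illustration of "temporal" principal components is PCA applied to a lag-augmented panel, so that the extracted components can absorb Markov dependence as well as cross-sectional variation. This is an illustrative device on simulated data, not the dissertation's exact construction:

```python
import numpy as np

def pca(X, k):
    # Top-k principal components of the rows of X, via the eigendecomposition
    # of the sample covariance matrix.
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(Xc) - 1)
    vals, vecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
    W = vecs[:, ::-1][:, :k]             # top-k loading vectors
    return Xc @ W, W

# Simulate a panel of 8 series driven by one persistent (AR(1)) common factor.
rng = np.random.default_rng(3)
f = np.zeros(301)
for t in range(300):
    f[t + 1] = 0.8 * f[t] + rng.standard_normal()
panel = f[1:, None] * rng.uniform(0.5, 1.5, 8) + 0.1 * rng.standard_normal((300, 8))

# Augmenting each series with its own lag lets a component capture the
# temporal dependence alongside the cross-sectional common variation.
lagged = np.hstack([panel[1:], panel[:-1]])
scores, W = pca(lagged, k=1)
```

Because the factor is persistent, a single component of the lag-augmented panel already reconstructs most of the variation, which is the intuition behind components that are "statistically adequate" with respect to Markov dependence.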
|
7 |
Time series and spatial analysis of crop yield. Assefa, Yared, January 1900 (has links)
Master of Science / Department of Statistics / Juan Du / Space and time are often vital components of research data sets. Accounting for and utilizing space and time information in statistical models becomes beneficial when the response variable in question is shown to exhibit space and time dependence. This work focuses on the modeling and analysis of crop yield over space and time. Specifically, two different yield data sets were used. The first, a yield and environmental data set, was collected across selected counties in Kansas from yield performance tests conducted over multiple years. The second was a survey data set collected by the USDA across the US from 1900 to 2009. The objectives of our study were to investigate crop yield trends in space and time, quantify the variability in yield explained by genetics and space-time (environment) factors, and study how spatio-temporal information can be incorporated into, and utilized for, modeling and forecasting yield. Based on the format of these data sets, trends of irrigated and dryland crops were analyzed using time series techniques. Traditional linear regressions and smoothing techniques are first used to obtain the yield function. These models were then improved by incorporating time and space information, either as explanatory variables or as auto- or cross-correlations adjusted for in the residual covariance structures. In addition, a multivariate time series modeling approach was conducted to demonstrate how space and time correlation information can be utilized to model and forecast yield and related variables. The conclusion from this research clearly emphasizes the importance of the space and time components of data sets in research analysis, partly because they can often account for underlying variables and factor effects that are not measured or not well understood.
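The motivation for adjusting residual covariance structures, namely detrended yields that remain serially correlated, can be shown on simulated data. The trend slope and AR coefficient below are invented for illustration and are not estimates from either data set:

```python
import numpy as np

# Simulate yearly yields as a linear trend plus AR(1)-correlated residuals.
rng = np.random.default_rng(6)
years = np.arange(1950, 2010)
e = np.zeros(len(years) + 1)
for t in range(len(years)):
    e[t + 1] = 0.6 * e[t] + rng.standard_normal()      # serially correlated noise
yield_bu = 30 + 0.8 * (years - years[0]) + 3 * e[1:]   # trend + correlated noise

# Step 1: fit the trend by ordinary least squares.
slope, intercept = np.polyfit(years, yield_bu, 1)
resid = yield_bu - (slope * years + intercept)

# Step 2: the lag-1 autocorrelation left in the residuals is the temporal
# dependence that a plain regression ignores and a covariance-adjusted
# (e.g. AR-error) model would absorb.
rho1 = float(np.corrcoef(resid[:-1], resid[1:])[0, 1])
```

A clearly positive `rho1` signals that the independence assumption behind plain OLS standard errors is violated, which is exactly when modeling the residual covariance pays off.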
|
8 |
The macroeconomic effects of international uncertainty shocks. Crespo Cuaresma, Jesus; Huber, Florian; Onorante, Luca, 03 1900 (has links) (PDF)
We propose a large-scale Bayesian VAR model with factor stochastic volatility to investigate the macroeconomic consequences of international uncertainty shocks on the G7 countries. The factor structure enables us to identify an international uncertainty shock by assuming that it is the factor most correlated with forecast errors related to equity markets and permits fast sampling of the model. Our findings suggest that the estimated uncertainty factor is strongly related to global equity price volatility, closely tracking other prominent measures commonly adopted to assess global uncertainty. The dynamic responses of a set of macroeconomic and financial variables show that an international uncertainty shock exerts a powerful effect on all economies and variables under consideration. / Series: Department of Economics Working Paper Series
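A crude stand-in for the paper's uncertainty factor is the first principal component of absolute one-step forecast errors across equity-like series. This proxy is assumed here purely for illustration and is not the authors' factor stochastic volatility estimator:

```python
import numpy as np

# Simulate forecast errors whose common volatility follows a persistent
# log-volatility state h_t, then recover a common "uncertainty" factor as the
# first principal component of the absolute errors.
rng = np.random.default_rng(4)
T, n = 2000, 6
h = np.zeros(T)
for t in range(T - 1):
    h[t + 1] = 0.95 * h[t] + 0.2 * rng.standard_normal()
errors = np.exp(h / 2)[:, None] * rng.standard_normal((T, n))

A = np.abs(errors)
A = A - A.mean(axis=0)
_, _, Vt = np.linalg.svd(A, full_matrices=False)
factor = A @ Vt[0]                        # first principal component scores

# How well does the extracted factor track the true volatility state?
corr = float(np.corrcoef(factor, np.exp(h / 2))[0, 1])
```

The sign of a principal component is arbitrary, so only the magnitude of `corr` is meaningful; the point is that a single factor extracted from equity forecast errors co-moves with the common volatility state, mirroring the identification idea in the abstract.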
|
9 |
The regional transmission of uncertainty shocks on income inequality in the United States. Fischer, Manfred M.; Huber, Florian; Pfarrhofer, Michael, January 2019 (has links) (PDF)
This paper explores the relationship between household income inequality and macroeconomic uncertainty in the United States. Using a novel large-scale macroeconometric model, we shed light on regional disparities in the responses of inequality to a national uncertainty shock. The results suggest that income inequality decreases in most states, with a pronounced degree of heterogeneity in the dynamic responses. By contrast, a few states, mostly located in the Midwest, display increasing levels of income inequality over time. Forecast error variance and historical decompositions highlight the importance of uncertainty shocks in explaining income inequality in most regions considered. Finally, we explain differences in the responses of income inequality by means of a simple regression analysis. These regressions reveal that income composition as well as labor market fundamentals determine the directional pattern of the dynamic responses. / Series: Working Papers in Regional Science
|
10 |
Implications of Macroeconomic Volatility in the Euro Area. Hauzenberger, Niko; Böck, Maximilian; Pfarrhofer, Michael; Stelzer, Anna; Zens, Gregor, 04 1900 (has links) (PDF)
In this paper, we estimate a Bayesian vector autoregressive (VAR) model with factor stochastic volatility in the error term to assess the effects of an uncertainty shock in the Euro area (EA). This allows us to incorporate uncertainty directly into the econometric framework and treat it as a latent quantity. Only a limited number of papers estimate the impacts of uncertainty and its macroeconomic consequences jointly, and most of this literature is based on single countries. We analyze the special case of a shock restricted to the Euro area, whose member countries are, by construction, highly interrelated. Among other results, we find a significant decrease in real activity, measured by GDP, in most Euro area countries over a period of roughly one year following an uncertainty shock. / Series: Department of Economics Working Paper Series
|