431

Variabilité du taux de change, flux commerciaux et croissance économique : le cas de Madagascar / Exchange rate variability, trade flows and economic growth: the case of Madagascar

Razafindramanana, Olivasoa Miaranirainy 30 November 2015 (has links)
In this thesis, we study the relationship between exchange rate variability, trade flows and economic growth in Madagascar; in other words, we examine the effects of exchange rate volatility and misalignment on exports, imports and economic growth. The study uses annual data covering 1971-2012 for aggregate exports and imports and 1990-2011 for exports and imports by sector. Volatility is measured in two ways, as a moving standard deviation and as a GARCH-based estimate, and the variables are studied with cointegration methods. Using the NATREX model, misalignment is computed as the difference between the real effective exchange rate (REER) at time t and its equilibrium value. In the last part of the work, a Seemingly Unrelated Regression (SUR) is used to estimate a two-equation model for export volumes and import volumes.
The results for Madagascar show that, on the export side, misalignment has a significant positive impact on aggregate exports whatever the definition of volatility: an overvaluation of the Ariary increases exports. Volatility has a significant positive impact on aggregate exports only when measured by VOLGARCHTCEN. On the import side, misalignment has a significant positive impact on aggregate imports when volatility is measured by VOLMASDTCER or VOLMASDTCEN; here too, an overvaluation of the Ariary increases imports. Volatility has a significant positive impact on imports in three cases: VOLMASDTCEN, VOLGARCHTCER and VOLGARCHTCEN. Misalignment has no significant impact on the growth rate, whereas volatility has a significant negative impact on growth when measured by VOLMASDTCER or VOLMASDTCEN.
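As an illustration of the two volatility measures named in this abstract, a moving standard deviation and a GARCH(1,1) recursion, here is a minimal Python sketch; the return series, window length and GARCH parameters are hypothetical placeholders, not values from the thesis.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Hypothetical monthly log-returns of the real effective exchange rate (REER).
returns = pd.Series(rng.normal(0.0, 0.02, 504))

# 1) Moving-standard-deviation volatility over a 12-period window.
vol_masd = returns.rolling(window=12).std()

# 2) GARCH(1,1) volatility with illustrative (not estimated) parameters:
#    sigma2_t = omega + alpha * e_{t-1}^2 + beta * sigma2_{t-1}
omega, alpha, beta = 1e-5, 0.08, 0.90
sigma2 = np.empty(len(returns))
sigma2[0] = returns.var()
for t in range(1, len(returns)):
    sigma2[t] = omega + alpha * returns.iloc[t - 1] ** 2 + beta * sigma2[t - 1]
vol_garch = pd.Series(np.sqrt(sigma2), index=returns.index)

print(vol_masd.tail(3), vol_garch.tail(3), sep="\n")
```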
432

Estimation and misspecification risks in VaR estimation / Estimation and misspecification risks in VaR evaluation

Telmoudi, Fedya 19 December 2014 (has links)
In this thesis, we study the estimation of conditional Value at Risk (VaR) taking into account estimation risk and model risk. First, we consider a two-step method for estimating the VaR. The first step estimates the volatility parameter using a generalized quasi-maximum likelihood estimator (gQMLE) based on an instrumental density h. The second step estimates a quantile of the innovations from the empirical quantile of the residuals obtained in the first step. We give conditions under which the two-step estimator of the VaR is consistent and asymptotically normal, and we compare the efficiencies of the estimators for various choices of the instrumental density h. When the innovations do not have density h, the first step generally gives a biased estimator of the volatility parameter and the second step a biased estimator of the quantile of the innovations; however, we show that the two errors counterbalance each other to give a consistent estimate of the VaR.
We then focus on VaR estimation within the framework of GARCH models, using the gQMLE based on the class of double generalized gamma instrumental densities, which contains the Gaussian distribution. Our goal is to compare the performance of the Gaussian QMLE with that of the gQMLE. The choice of the optimal estimator depends essentially on the parameter d that minimizes the asymptotic variance, and we test whether this parameter is equal to 2. When the test is applied to real series of financial returns, the hypothesis that the Gaussian QMLE is optimal is generally rejected. Finally, we consider non-parametric machine learning methods for estimating the VaR. These methods aim to free the estimation from model risk because they do not rely on a specific form of the volatility. We use support vector regression (SVR) based on the least-squares (LS) loss function and, to improve the solution of the LS-SVR model, we use weighted LS-SVR and fixed-size LS-SVR models. Numerical illustrations highlight the contribution of the proposed models for estimating the VaR while taking specification and estimation risks into account.
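A minimal sketch of the two-step idea described above, in Python: a volatility filter is applied in the first step, and the empirical quantile of the standardized residuals gives the VaR in the second. For brevity an EWMA recursion stands in for the gQMLE-fitted GARCH volatility of the thesis, and all numbers are simulated assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
returns = rng.standard_t(df=5, size=2000) * 0.01  # hypothetical daily returns

# Step 1: volatility filter. An EWMA recursion stands in for the gQMLE-fitted
# GARCH volatility (illustrative lambda, not an estimate).
lam = 0.94
sigma2 = np.empty_like(returns)
sigma2[0] = returns.var()
for t in range(1, len(returns)):
    sigma2[t] = lam * sigma2[t - 1] + (1 - lam) * returns[t - 1] ** 2
sigma = np.sqrt(sigma2)

# Step 2: empirical quantile of the standardized residuals.
residuals = returns / sigma
alpha = 0.01
q_alpha = np.quantile(residuals, alpha)

# One-step-ahead 1% VaR, reported as a positive loss number.
var_next = -sigma[-1] * q_alpha
print(round(var_next, 5))
```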
433

Modelos Black-Litterman e GARCH ortogonal para uma carteira de títulos do Tesouro Nacional / Black-Litterman and orthogonal GARCH models for a portfolio of bonds issued by the National Treasury

Roberto Beier Lobarinhas 02 March 2012 (has links)
A major challenge in financial management is to combine quantitative methods with traditional management in a single framework. Traditional managers tend not to believe that quantitative methods can capture their whole view and experience, whereas quantitative analysts tend to underestimate the importance of the traditional approach, creating clear disharmony and inefficiency in risk analysis. A model that seeks to narrow the distance between these two views is the Black-Litterman model (BLM). More specifically, it seeks to reduce the problems faced when applying modern portfolio theory in practice, particularly those arising from the Markowitz model. The Markowitz model has been the basis of portfolio theory for more than half a century, since the publication of Portfolio Selection [Mar52], and it addresses the most central objective of an investment: maximizing expected return for a given level of risk. Yet despite the prominent role of the mean-variance approach in academia, several difficulties appear when one tries to use it in practice, and perhaps for this reason its impact on the investment world has been quite limited. Even so, the idea of maximizing return for a given level of risk is so appealing to investors that the search for better-behaved models continued, and it is in this context that the Black-Litterman model appeared. In 1992, Fischer Black and Robert Litterman published the article Portfolio Optimization [Bla92], commenting on the minor role played by quantitative asset allocation, and introduced the model now known as Black-Litterman. A key difference between the BLM and a traditional mean-variance model is that, while the latter generates portfolio weights from an optimization routine, the BLM starts from the long-run equilibrium market portfolio (CAPM). Another strength of the model is that it gives investors a clear way to express their short-term views and, more importantly, provides a structure for consistently combining the long-run equilibrium information (the prior) with those views, producing a set of expected returns from which the weights on each asset are obtained.
In choosing the estimation method, we took into account that large covariance matrices play an important role in investment analysis, since the risk of a portfolio is fundamentally determined by the covariance matrix of its assets, and that the model should remain workable as the number of assets grows. The orthogonal GARCH model fulfils this role: it can generate the covariance matrix of the original system from a few univariate volatilities and is therefore computationally very simple. The variances and correlations are transformations of two or three variances of orthogonal factors obtained by GARCH estimation, and the orthogonal factors themselves are obtained by principal component analysis. Decomposing the variance of the system into risk factors makes it possible to quantify the variability contributed by each factor, which is highly relevant because the risk manager can more easily direct attention to the most important factors. The central idea of the orthogonalization is to work in a reduced space of components: enough risk factors are retained, and the movements not captured by them are treated as noise that is insignificant for the system. The accuracy of discarding some components depends on the number of retained principal components being sufficient to explain most of the variation of the system, so the method works best where principal component analysis works best, namely in term structures and other highly correlated systems. The orthogonal GARCH remains equally useful and feasible when one wants to build the covariance matrix across distinct risk factors, both highly and weakly correlated, as in a portfolio containing, for instance, currencies, stocks, fixed income and commodities: it suffices to perform the principal component analysis within correlated groups, obtain the covariance matrices by GARCH estimation, and then combine them into the covariance matrix of the original system. GARCH estimation was chosen because it captures the main stylized facts that characterize financial time series, that is, statistical patterns observed empirically and believed to be common to a large number of series; financial series at sufficiently high frequency (intraday and daily observations) usually display them.
An ARMA model was used to estimate the returns, and with these estimates the BLM generated an optimal portfolio at an initial point in time; forecasts were then made, yielding portfolios for the following weeks. We show that combining the BLM with orthogonal GARCH estimation can produce quite satisfactory results that are coherent with intuition, while keeping the model simple. The study is carried out on fixed-income returns, specifically bonds issued in the domestic market by the Brazilian National Treasury. Both the choice of the BLM and the choice of a portfolio of National Treasury bonds were motivated by the aim of bringing statistical tools closer to applications in finance, in particular federal government bonds issued in the market, which have become increasingly familiar to retail investors, above all through the Tesouro Direto program. This study is thus intended to bring useful information both to investors and to debt managers, since the mean-variance framework serves those who buy bonds and seek to maximize return for a given level of risk as well as those who issue bonds and seek to reduce issuance costs at prudent levels of risk.
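The orthogonal GARCH construction described above can be sketched in a few lines of Python: principal components are extracted from the standardized returns, a univariate GARCH-type variance recursion is run on each retained component, and the factor variances are mapped back to a full covariance matrix. The data, the number of components and the GARCH parameters below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
T, n, k = 1500, 6, 2  # observations, assets, principal components retained

# Hypothetical correlated bond returns driven by two latent factors.
loadings = rng.normal(size=(n, k))
returns = rng.normal(size=(T, k)) @ loadings.T + 0.2 * rng.normal(size=(T, n))

# 1) Principal components of the standardized returns.
X = (returns - returns.mean(0)) / returns.std(0)
eigval, eigvec = np.linalg.eigh(np.cov(X, rowvar=False))
W = eigvec[:, ::-1][:, :k]          # loadings of the k largest components
pcs = X @ W                         # orthogonal factor scores

# 2) Univariate GARCH(1,1)-style variance recursion on each component
#    (fixed illustrative parameters standing in for estimated ones);
#    after the loop, f_var holds the latest conditional factor variances.
omega, alpha, beta = 0.05, 0.07, 0.90
f_var = np.var(pcs, axis=0)
for t in range(1, T):
    f_var = omega + alpha * pcs[t - 1] ** 2 + beta * f_var

# 3) Reconstruct the conditional covariance of the original system.
D = np.diag(returns.std(0))
cov_std = W @ np.diag(f_var) @ W.T  # covariance implied by the k factors
cov_full = D @ cov_std @ D          # back to the original return scale
print(np.round(cov_full, 6))
```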
434

IG-GARJI模型下之住宅抵押貸款保險評價 / Valuation of mortgage insurance contracts in the IG-GARJI model

林思岑, Lin, Szu Tsen Unknown Date (has links)
Mortgage insurance products are an important tool for managing default risk, and after the 2008 subprime crisis they have attracted even more attention from financial institutions concerned with credit and default risk in the mortgage market. To forecast house prices more accurately and efficiently and to value mortgage insurance contracts properly, we follow the approach of Christoffersen, Heston and Jacobs (2006) for stock returns and propose a new GARCH model in which an Inverse Gaussian innovation replaces the normal distribution, so as to capture the autocorrelation and the stylized facts present in house price series, while also allowing for the price jumps observed in the housing market. To distinguish it from the traditional GARCH model, we name the new model the IG-GARJI model. Traditional GARCH models generally have no closed-form solution, so insurance prices must be computed by simulation, which adds forecasting error; we therefore provide a semi-analytical solution of the IG-GARJI model to improve efficiency and accuracy. The risk-neutral measure is identified with the Esscher transform proposed by Bühlmann et al. (1996), and the recursive procedure of Heston and Nandi (2000) is then used to obtain a suitable valuation model for mortgage insurance. The empirical results indicate that the Inverse Gaussian specification outperforms the normal one in the newly-built house market, while no significant difference between the models is found for previously occupied houses. In addition, following Bardhan, Karapandža and Urošević (2006), we use the different valuation models to compare the impact on mortgage insurance premiums when house ownership cannot be transferred in time, providing a more accurate valuation method for mortgage insurance.
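The jump component mentioned in this abstract can be illustrated with a generic GARCH(1,1) return process augmented by compound Poisson jumps. This is not the IG-GARJI specification of the thesis (the innovations here are Gaussian), and every parameter value is an assumption chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 1000
omega, alpha, beta = 1e-6, 0.05, 0.90   # illustrative GARCH(1,1) parameters
lam, mu_j, sig_j = 0.05, -0.01, 0.02    # jump intensity, jump mean and std (assumed)

r = np.zeros(T)
h = np.full(T, omega / (1 - alpha - beta))      # conditional variance path
for t in range(1, T):
    h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    n_jumps = rng.poisson(lam)                  # number of jumps this period
    jump = rng.normal(mu_j, sig_j, n_jumps).sum() if n_jumps else 0.0
    r[t] = np.sqrt(h[t]) * rng.standard_normal() + jump

print(r[:5], h[:5])
```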
435

Analyzing value at risk and expected shortfall methods: the use of parametric, non-parametric, and semi-parametric models

Huang, Xinxin 25 August 2014 (has links)
Value at Risk (VaR) and Expected Shortfall (ES) are methods often used to measure market risk. Inaccurate and unreliable VaR and ES models can lead to underestimation of the market risk that a firm or financial institution is exposed to, and may therefore jeopardize its well-being or survival during adverse markets. The objective of this study is to examine various VaR and ES models, including fatter-tail models, in order to analyze their accuracy and reliability. Thirteen VaR and ES models under three main approaches (parametric, non-parametric and semi-parametric) are examined. The results show that the proposed model (ARMA(1,1)-GJR-GARCH(1,1)-SGED) gives the most balanced Value at Risk results, while the semi-parametric model (Extreme Value Theory, EVT) is the most accurate Value at Risk model in this study for the S&P 500. / October 2014
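As a pointer to what the parametric and non-parametric approaches compared in this study look like in practice, here is a minimal Python sketch that computes 1% VaR and ES for a simulated return series under a normal (parametric) assumption and by historical simulation (non-parametric); the data are artificial.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
returns = rng.standard_t(df=4, size=2500) * 0.01   # hypothetical daily returns
alpha = 0.01

# Parametric (normal) VaR and ES, reported as positive loss numbers.
mu, sigma = returns.mean(), returns.std(ddof=1)
var_norm = -(mu + sigma * norm.ppf(alpha))
es_norm = -(mu - sigma * norm.pdf(norm.ppf(alpha)) / alpha)

# Non-parametric (historical simulation) VaR and ES.
q = np.quantile(returns, alpha)
var_hist = -q
es_hist = -returns[returns <= q].mean()

print(f"VaR  normal={var_norm:.4f}  historical={var_hist:.4f}")
print(f"ES   normal={es_norm:.4f}  historical={es_hist:.4f}")
```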
436

Kenngrößen für die Abhängigkeitsstruktur in Extremwertzeitreihen / Characteristics for Dependence in Time Series of Extreme Values

Ehlert, Andree 31 August 2010 (has links)
No description available.
437

運用Elman類神經網路與時間序列模型預測LME銅價之研究 / A study on applying Elman neural networks and time series model to predict the price of LME copper

黃鴻仁, Huang, Hung Jen Unknown Date (has links)
Copper prices have repeatedly set record highs in recent years. Because Taiwan's booming electronics, semiconductor and machine-tool industries all require copper, Taiwan ranks fifth in the world in copper imports (ICSG, 2009), and the production costs of Taiwanese firms are therefore strongly affected by fluctuations in the international copper price. About 70% of copper worldwide is traded at the price quoted on the London Metal Exchange (LME), so this study builds forecasting models for the future trend of the LME copper price. The data are LME three-month copper prices from January 2, 2003 to July 14, 2011, together with explanatory variables selected from the literature: LME copper stocks, the three-month prices of aluminum, lead, nickel, zinc and tin, gold and silver prices, the oil price, the US producer price index, the US consumer price index, and the federal funds rate. Time series analysis and neural networks have been widely used to forecast stock and futures markets. We first use a vector autoregressive (VAR) model to screen for influential variables, then build a GARCH time series forecasting model and a recurrent Elman neural network model, and finally combine the two into a GARCH-Elman neural network forecasting model.
The VAR model shows that the copper price is negatively affected by the first lag of gold, aluminum and copper stocks, its own second lag, the third lag of nickel and tin, and the fourth lag of zinc, and positively affected by the second lag of oil. Copper's own lags explain the largest share of the variance and copper stocks the smallest, suggesting that the effect of stocks is already efficiently reflected in the copper price. This confirms that forecasting models must take macroeconomic variables into account and that screening variables with a VAR model reduces noise and improves the predictive ability of the neural network. The resulting GARCH model achieves a cumulative return of 33.81%, the Elman neural network 38.11%, and the combined GARCH-Elman model 56.46%, all better than the cumulative return of the actual copper price index. Firms that need copper can thus forecast price trends more accurately and use the forecasts to decide the price and duration of contracts with raw-material suppliers, avoiding higher production costs caused by misjudging the price trend. Five suggestions are offered for future research.
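The Elman network used in this study is a simple recurrent network in which the previous hidden state is fed back as a context input. The sketch below shows only the forward pass with random placeholder weights; in the study the weights would be trained on the VAR-screened inputs, and all dimensions here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Minimal Elman (simple recurrent) network: the hidden state at t-1 is fed
# back as a "context" input at t. Weights here are random placeholders;
# in practice they would be trained, e.g. by backpropagation through time.
n_in, n_hidden = 4, 8          # e.g., 4 VAR-screened inputs per time step
Wx = rng.normal(scale=0.3, size=(n_hidden, n_in))
Wh = rng.normal(scale=0.3, size=(n_hidden, n_hidden))
Wy = rng.normal(scale=0.3, size=(1, n_hidden))
bh, by = np.zeros(n_hidden), np.zeros(1)

def elman_forecast(x_seq):
    """One-step-ahead forecast from a sequence of input vectors."""
    h = np.zeros(n_hidden)                     # context / hidden state
    for x in x_seq:
        h = np.tanh(Wx @ x + Wh @ h + bh)      # Elman recurrence
    return (Wy @ h + by)[0]                    # linear output layer

x_seq = rng.normal(size=(12, n_in))            # 12 periods of hypothetical inputs
print(elman_forecast(x_seq))
```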
438

MODELOS DE SÉRIES TEMPORAIS APLICADOS A DADOS DE UMIDADE RELATIVA DO AR / MODELS OF TEMPORAL SERIES APPLIED TO AIR RELATIVE HUMIDITY DATA

Tibulo, Cleiton 11 December 2014 (has links)
Time series models have been used in many areas of knowledge and have become a necessity for companies to survive in a globalized and competitive market, while climatic factors have always been a concern because of the many ways they interfere in human life. In this context, this work compares the performance of the following classes of time series models fitted to relative air humidity (UR) data: ARIMA, ARMAX and exponential smoothing. It also examines the volatility present in the series through non-linear ARCH/GARCH models fitted to the residuals of the ARIMA and ARMAX models. The data were collected from INMET for the period from October 1, 2001 to January 22, 2014. The criteria MAPE, EQM (mean squared error), MAD and SSE were used to compare the results and select the best model. The results show that the ARMAX(3,0) model, with the inclusion of exogenous variables, produced better forecasts than the competing SARMA(3,0)(1,1)12 and multiplicative Holt-Winters models. In the study of the volatility of the series via a non-linear ARCH(1) model fitted to the squared residuals of the SARMA(3,0)(1,1)12 and ARMAX(3,0) models, it was observed that the volatility does not tend to influence future observations in the long run. It is concluded that the classes of models used and compared in this study showed good performance and fit for data on a climatological variable. We highlight the broad applicability of time series techniques when one needs to make forecasts or describe a temporal process; they can serve as an efficient support tool in decision making.
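The comparison criteria named in this abstract are simple error statistics; a small Python helper such as the following (with made-up humidity numbers) illustrates how they are computed.

```python
import numpy as np

def forecast_criteria(actual, forecast):
    """Illustrative versions of the comparison criteria named in the abstract."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    e = actual - forecast
    return {
        "MAPE": np.mean(np.abs(e / actual)) * 100,  # mean absolute percentage error
        "EQM":  np.mean(e ** 2),                    # mean squared error
        "MAD":  np.mean(np.abs(e)),                 # mean absolute deviation
        "SSE":  np.sum(e ** 2),                     # sum of squared errors
    }

# Hypothetical relative-humidity observations (%) and model forecasts.
obs  = [78.0, 81.5, 74.2, 69.8, 72.1]
pred = [76.5, 80.0, 75.0, 71.2, 70.9]
print(forecast_criteria(obs, pred))
```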
439

[en] A COMPARATIVE STUDY OF THE FORECAST CAPABILITY OF VOLATILITY MODELS / [pt] ESTUDO COMPARATIVO DA CAPACIDADE PREDITIVA DE MODELOS DE ESTIMAÇÃO DE VOLATILIDADE

LUIS ANTONIO GUIMARAES BENEGAS 15 January 2002 (has links)
The concept of risk is defined as the distribution of unexpected results arising from variations in the values of the variables that describe the market. Risk, however, is not an observable variable, and its quantification depends on the model used to evaluate it; using different models can therefore lead to significantly different risk forecasts. The main objective of this dissertation is to compare the most widely used volatility models (the sample variance over the last k observations, exponential smoothing models, and Bollerslev's GARCH(1,1)) with respect to their ability to forecast volatility. The models are compared in terms of their forecasting performance for portfolios composed of stocks traded in the Brazilian market, with the volatility forecasts checked against the actual out-of-sample volatility. Since the actual volatility is not observable, the same procedure adopted by RiskMetrics for computing the optimal decay factor is used: the mean return of each stock portfolio is assumed to be zero, so that the one-step-ahead forecast of the return variance made at date t equals the expected value of the squared return at date t. The final objective is to determine, using backtesting techniques, which of the volatility forecasting models performs best on the comparison criteria relative to the computational effort required, and thus which offers the best cost-benefit relation for the Brazilian equity market.
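The comparison carried out in this dissertation can be mimicked in miniature: below, one-step-ahead variance forecasts from a k-period sample variance and from a RiskMetrics-style EWMA are checked against the squared return, which the text uses as the proxy for the unobservable realized variance. The returns, the window length and the decay factor are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
r = rng.standard_normal(1500) * 0.012        # hypothetical daily portfolio returns
k, lam = 60, 0.94                            # rolling window and RiskMetrics decay

# EWMA recursion for the conditional variance (mean return taken as zero).
ewma = np.empty(len(r))
ewma[0] = r[:k].var()
for t in range(1, len(r)):
    ewma[t] = lam * ewma[t - 1] + (1 - lam) * r[t - 1] ** 2

# Out-of-sample comparison: the forecast for date t is checked against r_t^2.
mse_sample, mse_ewma, n = 0.0, 0.0, 0
for t in range(k, len(r)):
    f_sample = np.mean(r[t - k:t] ** 2)      # k-period sample-variance forecast
    mse_sample += (r[t] ** 2 - f_sample) ** 2
    mse_ewma += (r[t] ** 2 - ewma[t]) ** 2
    n += 1
print("MSE sample:", mse_sample / n, " MSE EWMA:", mse_ewma / n)
```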
440

[en] A STUDY ON THE BEHAVIOR OF THE PRICES OF SOYBEAN IN BRAZIL: AN APPROACH TO THE METHOD OF MEAN REVERSION WITH JUMPS / [pt] UM ESTUDO SOBRE O COMPORTAMENTO DOS PREÇOS DA SOJA NO MERCADO BRASILEIRO: UMA ABORDAGEM PELO MÉTODO DE REVERSÃO À MÉDIA COM SALTOS

CRISTIANE BATISTA RODRIGUES 01 November 2017 (has links)
Brazil has shown good results in its agricultural activities, which is explained by its favorable natural conditions and the advent of agricultural technology. Agribusiness already represents approximately 33 percent of Brazil's Gross Domestic Product (MAPA, 2007), and in the case of soybeans Brazil is the second largest producer in the world, with soybeans accounting for 6.77 percent of the country's total exports (CONAB, 2007). Within the agribusiness context, soybean production is subject to various risks and uncertainties, such as climatic conditions, the production cycle, a highly perishable product and pests, as well as the economic conditions of the market, which directly influence the price of the commodity. In an attempt to minimize the uncertainties and risks inherent in this activity, it is common to find hedging operations, such as futures or forward contracts, associated with agribusiness. The logic of these hedging mechanisms is protection against possible variations in the price of the assets up to a set date. The proper functioning of these operations depends on a reliable legal and methodological apparatus: a legal framework that can guarantee the liquidity of the goods within standards previously defined in the contracts, and an appropriate methodology that leads, above all, to a reliable price for the commodity in the future. The main objective of this work is therefore to analyze, from a historical series, the behavior of soybean prices in the Brazilian market, to test their adherence to a mean-reversion process with jumps, and to test for ARCH and GARCH effects in the volatility of this process.
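A mean-reversion-with-jumps process of the kind tested in this study can be simulated with an Ornstein-Uhlenbeck recursion on the log price plus Poisson-distributed jumps. The parameter values below (reversion speed, long-run level, volatility, jump intensity and size) are assumptions chosen for illustration, not estimates from the soybean series.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative simulation of a mean-reverting (Ornstein-Uhlenbeck) log-price
# with Poisson jumps; all parameter values are assumptions, not estimates.
T, dt = 5 * 252, 1 / 252                        # five years of daily steps
kappa, x_bar, sigma = 2.0, np.log(25.0), 0.25   # reversion speed, long-run level, vol
lam, mu_j, sig_j = 3.0, 0.05, 0.10              # jumps per year, jump mean / std

x = np.empty(T)
x[0] = x_bar
for t in range(1, T):
    drift = kappa * (x_bar - x[t - 1]) * dt
    diffusion = sigma * np.sqrt(dt) * rng.standard_normal()
    n_jumps = rng.poisson(lam * dt)
    jump = rng.normal(mu_j, sig_j, n_jumps).sum() if n_jumps else 0.0
    x[t] = x[t - 1] + drift + diffusion + jump

price = np.exp(x)                               # simulated soybean price path
print(price[:5].round(2), price[-5:].round(2))
```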
