61

縮小股價升降單位對實現波動率之影響 / Tick Size Reduction and Realized Volatility on the Taiwan Stock Exchange

張皓雯, Chang, Hao Wen Unknown Date
In this study, we examine the impact of the tick size reduction implemented on the Taiwan Stock Exchange on March 1, 2005. Using intraday data, we investigate changes in trading activity and return volatility, discuss how market participants react to new information, and evaluate the effectiveness of the new tick size regime. First, we compare four commonly used volatility measures and select the most robust one for our study. Second, we examine the relationship between daily return volatility and trading activity. Finally, because intraday return volatility typically follows a U-shaped pattern, we re-examine this relationship in the first hour after the open and the last hour before the close, to see whether volatility also declines in the most turbulent trading sessions. Our empirical results based on the robust realized volatility measure confirm that both daily and intraday return volatility decline significantly after the tick size reduction, and that the number of trades is a prominent factor in explaining realized volatility. Moreover, the percentage decrease in realized volatility is most pronounced near the beginning and end of each trading day. Overall, our findings indicate that the new tick size regime achieved its policy objectives.
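Realized volatility of the kind studied here is conventionally computed as the square root of the sum of squared intraday log returns on a fixed calendar-time grid; a minimal sketch, assuming a pandas Series of tick prices with a DatetimeIndex (names are illustrative, not taken from the thesis):

```python
import numpy as np
import pandas as pd

def realized_volatility(prices: pd.Series, freq: str = "5min") -> float:
    """Daily realized volatility from tick prices.

    Resamples ticks onto a calendar-time grid, then returns the
    square root of the sum of squared intraday log returns.
    Assumes `prices` has a DatetimeIndex covering one trading day.
    """
    sampled = prices.resample(freq).last().dropna()
    log_returns = np.log(sampled).diff().dropna()
    return float(np.sqrt((log_returns ** 2).sum()))
```

Computed day by day, the average of this measure before and after March 1, 2005 would mirror the before/after comparison the thesis performs.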
62

Mesure et Prévision de la Volatilité pour les Actifs Liquides / Measuring and Forecasting Volatility for Liquid Assets

Chaker, Selma 04 1900
The observed high-frequency price series is contaminated with market microstructure frictions, or noise. We explore the measurement and forecasting of the fundamental volatility through novel approaches to the frictions problem. In the first paper, while maintaining the standard additive model of noise and frictionless price, we use trading volume, quoted depths, a trade direction indicator, and the bid-ask spread to absorb the noise. The econometric model is a price impact linear regression. We show that incorporating these liquidity-cost variables delivers more precise volatility estimators. If the noise is only partially absorbed, the residual noise is closer to white noise than the original, which lessens misspecification of the noise characteristics. Our approach is also robust to a specific form of endogeneity under which the common noise-robust measures are inconsistent. In the second paper, we model the variance of the microstructure noise as an affine function of the fundamental volatility, so that the noise is time-varying within the day. Using the eigenfunction representation of the general stochastic volatility class of models, we quantify the forecasting performance of several volatility measures under our model assumptions. In the third paper, instead of assuming the standard additive model for the observed price series, we specify the conditional distribution of the frictionless price, assumed to be bounded by the bid and ask quotes, given the available information, which includes quotes and volumes. We derive new realized measures by characterizing the conditional mean of the integrated variance.
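The first paper's price impact regression absorbs noise by projecting observed returns on liquidity-cost variables; a rough sketch of that idea under assumed variable names (not the author's exact specification):

```python
import numpy as np

def filter_noise(returns, trade_sign, volume, spread):
    """Regress observed tick returns on liquidity-cost variables and
    keep the residual as a cleaner proxy for the efficient-price return."""
    X = np.column_stack([np.ones_like(returns), trade_sign,
                         trade_sign * volume, spread])
    beta, *_ = np.linalg.lstsq(X, returns, rcond=None)
    return returns - X @ beta  # residual returns

# Realized variance can then be computed on the filtered series:
# rv = np.sum(filter_noise(r, d, v, s) ** 2)
```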
63

Předpovídání Realizované Volatility Pomocí Neuronových Sítí / Forecasting Realized Volatility Using Neural Networks

Jurkovič, Jindřich January 2013
In this work, neural networks are used to forecast the daily Realized Volatility of the EUR/USD, GBP/USD and USD/CHF currency pair time series. Their performance is benchmarked against the nowadays popular Heterogeneous Autoregressive model of Realized Volatility (HAR) and traditional ARIMA models. As a by-product of our research, we introduce a simple yet effective enhancement of the HAR model, which we name the HARD extension. Forecasting performance tests of the HARD model are conducted as well, promoting it to a reference benchmark for the neural networks and ARIMA models.
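For reference, the HAR benchmark mentioned above (Corsi, 2009) regresses next-day realized volatility on daily, weekly, and monthly RV averages; a minimal sketch, assuming `rv` is a daily realized-volatility series (the HARD extension itself is not reproduced here):

```python
import pandas as pd
import statsmodels.api as sm

def fit_har(rv: pd.Series):
    """HAR-RV: RV_{t+1} = b0 + bd*RV_t + bw*mean(RV_{t-4..t}) + bm*mean(RV_{t-21..t})."""
    X = pd.DataFrame({
        "daily": rv,
        "weekly": rv.rolling(5).mean(),    # average over the past week
        "monthly": rv.rolling(22).mean(),  # average over the past month
    })
    data = pd.concat([rv.shift(-1).rename("target"), X], axis=1).dropna()
    model = sm.OLS(data["target"],
                   sm.add_constant(data[["daily", "weekly", "monthly"]]))
    return model.fit()
```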
64

Modely neuronových sítí pro podmíněné kvantily finančních výnosů a volatility / Neural network models for conditional quantiles of financial returns and volatility

Hauzr, Marek January 2016
This thesis investigates the forecasting performance of Quantile Regression Neural Networks in forecasting multi-period quantiles of realized volatility and quantiles of returns. It relies on model-free measures of realized variance and its components (realized variance, median realized variance, integrated variance, jump variation, and positive and negative semivariances). The data used are S&P 500 futures and WTI Crude Oil futures contracts. The resulting models of returns and volatility perform well both in absolute terms and relative to linear quantile regression models. In-sample, the models estimated by Quantile Regression Neural Networks provide better estimates than linear quantile regression models; out-of-sample, the two are equally good.
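Quantile models of this kind are typically trained by minimizing the pinball (quantile) loss, regardless of whether the regression function is linear or a neural network; a short sketch of the loss itself (not specific to this thesis's architecture):

```python
import numpy as np

def pinball_loss(y_true: np.ndarray, y_pred: np.ndarray, tau: float) -> float:
    """Average quantile (pinball) loss at level tau in (0, 1); minimizing
    it pushes y_pred toward the tau-quantile of y_true."""
    error = y_true - y_pred
    return float(np.mean(np.maximum(tau * error, (tau - 1.0) * error)))

# e.g. tau = 0.05 for the 5% quantile of returns (a VaR-style target)
```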
65

Atividade prescrita e Atividade realizada: reflexões críticas de uma professora de Inglês / Prescribed Activity and Realized Activity: Critical Reflections of an English Teacher

Charlariello, Luciane Nigro 19 December 2005
This work belongs to the research area Language and Education. It investigates the didactic unit "No violence: It's just a game", developed by a group of teachers, including the author, who work in public schools of the state of São Paulo. The unit was created during the last module of the course "Reflection on Action". I set out to examine this unit (the prescribed activity) and its application (the realized activity), explain them theoretically, criticize them, and propose changes for reconstructing the unit. In other words, the development of this work is basically a process of critical reflection in the sense of Smith (1992): describing, informing, confronting, and reconstructing. The theoretical framework is based on Socio-Historical-Cultural Activity Theory (Vygotsky, 1924/1984, 1934/2002; Leontiev, 1977, 1978, 1998; Engeström, 1987, 1999, 1999a). It also draws on the Brazilian National Curricular Parameters (PCNs, 1996, 1998), which were discussed in relation to foreign language teaching, sociointeractionism, a critical view of teaching-learning, and a dialogical perspective on language. This research is critical with a collaborative orientation, because it involves a joint process of investigating action, aiming at the understanding, analysis, and critique of the action context with a view to its transformation; the researcher is one of the focal participants, studying her own action context, intervening in it, and transforming it.
The methodology combines documentary analysis (of the didactic unit) with the recording and transcription of the nine classes (the application of the unit). The analysis and interpretation of the data suggest that the didactic unit and its classes partially favor the development of a discursive basis for discursive engagement in the English language, as well as the development of critical consciousness, and that the unit was entirely grounded in a transversal theme.
66

Combinação de projeções de volatilidade baseadas em medidas de risco para dados em alta frequência / Volatility forecast combination using risk measures based on high frequency data

Araújo, Alcides Carlos de 29 April 2016
High-frequency trading (HFT) has grown significantly in recent years, raising the need for research on high-frequency data in the Brazilian stock market. Estimators of asset-price volatility built from high-frequency trade data are the main objects of study. Following Aldridge (2010) and Vuorenmaa (2013), HFT is defined as the rapid reallocation of trading capital, with transactions executed within milliseconds by complex algorithms that manage order submission, analyze incoming data, and make the best buy and sell decisions. The main information source for HFT analysis is tick-by-tick data, known as high-frequency data.
Realized volatility is a risk measure derived from high-frequency data analysis and used in risk management. According to Andersen et al. (2003), Pong et al. (2004), Koopman et al. (2005), and Corsi (2009), there is a consensus in finance that volatility forecasts based on this measure outperform volatility estimates from GARCH models. Volatility forecasting is a key tool in financial management for provisioning reserves against possible losses; because several forecasting methods exist, one must either select a single model or combine several forecasts. The main challenge in combining forecasts is the choice of weights: most research in the area develops weighting methods that minimize forecast errors. The literature lacks, however, a method that addresses the risk of forecasting volatility below its realized level. Aiming to fill this gap, the main goal of this thesis is to propose a combination of asset-price volatility forecasts using high-frequency data for the Brazilian stock market. As its main innovation, the thesis proposes, in an unprecedented way, the use of a function based on the Lower Partial Moment (LPM) to estimate the weights for combining volatility forecasts. Although the LPM is well known in the literature, its use for forecast combination had not yet been studied. The thesis contributes to the study of combinations of forecasts produced by the HAR, MIDAS, ARFIMA, and Nearest Neighbor models, and proposes two new combination methods, called LPMFE (Lower Partial Moment Forecast Error) and DLPMFE (Discounted LPMFE). The methods show promising results for cases where the aim is to avoid losses above the expected level without provisioning excessively from a budgetary standpoint.
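The lower partial moment underlying the proposed LPMFE weighting penalizes only shortfalls, i.e., periods when realized volatility exceeded the forecast; a minimal sketch of that building block together with a hypothetical inverse-LPM weighting (the thesis's exact scheme is not reproduced here):

```python
import numpy as np

def lower_partial_moment(realized, forecast, order: int = 2) -> float:
    """Average of shortfalls (realized above forecast) raised to `order`;
    over-forecasts contribute nothing to the measure."""
    shortfall = np.maximum(np.asarray(realized) - np.asarray(forecast), 0.0)
    return float(np.mean(shortfall ** order))

def combination_weights(realized, forecasts, order: int = 2) -> np.ndarray:
    """Hypothetical inverse-LPM weights across candidate models."""
    lpms = np.array([lower_partial_moment(realized, f, order) for f in forecasts])
    inv = 1.0 / np.maximum(lpms, 1e-12)  # guard against a zero LPM
    return inv / inv.sum()
```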
67

Ensaios em cópulas e finanças empíricas / Essays on Copulas and Empirical Finance

Silva, Fernando Augusto Boeira Sabino da January 2017
In this thesis we discuss copula-based approaches to describing statistical dependencies among financial instruments and evaluate their performance. Many financial crises have occurred since the late 1990s, including the Asian crisis (1997), the Russian national debt crisis (1998), the dot-com bubble crisis (2000), the crises after 9/11 (2001) and the Iraq war (2003), the subprime mortgage crisis or global financial crisis (2007-08), and the European sovereign debt crisis (2009). All of these crises led to a massive loss of financial wealth and a rise in observed volatility, and they emphasized the importance of a more robust macroprudential policy. In other words, financial disruptions make economic processes highly nonlinear, leading the major central banks to take counter-measures to contain financial distress. Methods for modeling uncertainty and evaluating market risk have come under greater scrutiny since the global financial crisis. Given the complex dependence patterns of financial markets, a high-dimensional multivariate approach to tail dependence analysis is surely more insightful than assuming multivariate normal returns.
Given their flexibility, copulas can better model the empirically verified regularities normally attributed to multivariate financial returns: (1) asymmetric conditional volatility, with higher volatility for large negative returns and smaller volatility for positive returns (HAFNER, 1998); (2) conditional skewness (AIT-SAHALIA; BRANDT, 2001; CHEN; HONG; STEIN, 2001; PATTON, 2001); (3) excess kurtosis (TAUCHEN, 2001; ANDREOU; PITTIS; SPANOS, 2001); and (4) nonlinear temporal dependence (CONT, 2001; CAMPBELL; LO; MACKINLAY, 1997). The principal contribution of the essays is to assess whether approaches more sophisticated than the distance method and the plain Markowitz model can take advantage of market anomalies and frictions. The essays attempt to provide a proper analysis of these issues using long-term, comprehensive datasets. We show empirically that copula-based approaches are useful in all essays, proving beneficial for modeling dependencies in different scenarios, assessing downside risk measures more adequately, and yielding higher profitability than the benchmarks.
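Copula-based modeling of this kind starts by mapping each return series to pseudo-uniform observations via ranks; a minimal sketch of that first step plus an empirical lower-tail-dependence estimate (illustrative only, not the essays' estimators):

```python
import numpy as np
from scipy import stats

def pseudo_observations(x: np.ndarray) -> np.ndarray:
    """Scaled ranks in (0, 1), the standard input for copula estimation."""
    return stats.rankdata(x) / (len(x) + 1)

def lower_tail_dependence(u: np.ndarray, v: np.ndarray, q: float = 0.05) -> float:
    """Empirical estimate of P(V <= q | U <= q) for pseudo-observations u, v."""
    mask = u <= q
    return float(np.mean(v[mask] <= q)) if mask.any() else 0.0
```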
68

On the Normal Inverse Gaussian Distribution in Modeling Volatility in the Financial Markets

Forsberg, Lars January 2002
We discuss the Normal inverse Gaussian (NIG) distribution in modeling volatility in the financial markets. Refining the work of Barndorff-Nielsen (1997) and Andersson (2001), we introduce a new parameterization of the NIG distribution to build the GARCH(p,q)-NIG model. This new parameterization allows the model to be a strong GARCH in the sense of Drost and Nijman (1993). It also allows us to standardize the observed returns to be i.i.d., so that we can use standard inference methods when we evaluate the fit of the model. We use the realized volatility (RV), calculated from intraday data, to standardize the returns of the ECU/USD foreign exchange rate. We show that normality cannot be rejected for the RV-standardized returns, i.e., the Mixture-of-Distributions Hypothesis (MDH) of Clark (1973) holds. We build a link between the conditional RV and the conditional variance. This link allows us to use the conditional RV as a proxy for the conditional variance, and we give an empirical justification of the GARCH-NIG model using this approximation. In addition, we introduce a new General GARCH(p,q)-NIG model. This model has as special cases the Threshold-GARCH(p,q)-NIG model, to model the leverage effect; the Absolute Value GARCH(p,q)-NIG model, to model the conditional standard deviation; and the Threshold Absolute Value GARCH(p,q)-NIG model, to model asymmetry in the conditional standard deviation. The properties of the maximum likelihood estimates of the model parameters are investigated in a simulation study.
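The MDH check described above, that returns standardized by realized volatility look Gaussian, can be sketched with a standard normality test (a generic illustration, not the thesis's exact procedure):

```python
import numpy as np
from scipy import stats

def mdh_normality_check(daily_returns: np.ndarray, realized_vol: np.ndarray):
    """Standardize daily returns by realized volatility and test normality.
    Under the Mixture-of-Distributions Hypothesis, z should be ~ N(0, 1)."""
    z = daily_returns / realized_vol
    statistic, p_value = stats.jarque_bera(z)
    return statistic, p_value  # a large p-value means normality is not rejected
```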
70

Optimizable Multiresolution Quadratic Variation Filter For High-frequency Financial Data

Sen, Aykut 01 February 2009
As tick-by-tick data on financial transactions become easier to obtain, processing that much information efficiently and correctly to estimate the integrated volatility gains importance. However, empirical findings show that much of this data may become unusable due to microstructure effects. The most common way around this problem is to sample the data at equidistant intervals on a calendar, tick, or business time scale. Comparative research on the subject generally asserts that the most successful scheme is calendar time sampling every 5 to 20 minutes. But this typically means throwing out more than 99 percent of the data, so a more efficient sampling method is clearly needed. Although some research explores alternative techniques, none has been proven best. Our study concerns a sampling scheme that uses the information contained in different frequency scales and is less prone to microstructure effects. We introduce a new concept of business intensity, whose sampler we name the Optimizable Multiresolution Quadratic Variation Filter. Our filter uses multiresolution analysis techniques to decompose the data into different scales and quadratic variation to build the new business time scale. Our empirical findings show that our filter is clearly less prone to microstructure effects than any other common sampling method. We use classified tick-by-tick data for the Turkish Interbank FX market. The market is closed for nearly 14 hours a day, so big jumps occur between closing and opening prices; we also propose a new smoothing algorithm to reduce the effects of those jumps.
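The business-time idea behind such a filter, sampling a new observation whenever accumulated quadratic variation crosses a threshold rather than every k minutes, can be sketched as follows (a simplified illustration, not the author's multiresolution construction):

```python
import numpy as np

def business_time_sample(log_prices: np.ndarray, threshold: float) -> np.ndarray:
    """Indices at which cumulative squared tick returns cross successive
    multiples of `threshold`, defining an activity-based sampling clock."""
    cum_qv = np.cumsum(np.diff(log_prices) ** 2)
    sampled, next_level = [0], threshold
    for i, qv in enumerate(cum_qv, start=1):
        if qv >= next_level:
            sampled.append(i)
            next_level += threshold
    return np.array(sampled)
```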
