41

Five contributions to econometric theory and the econometrics of ultra-high-frequency data

Meitz, Mika January 2006 (has links)
No description available.
42

Improvement And Development Of High-frequency Wireless Token-ring Protocol

Kurtulus, Taner 01 January 2011 (has links) (PDF)
STANAG 5066 Edition 2 is a node-to-node protocol developed by NATO for communication over HF media. IP integration was added to broaden the use of the STANAG 5066 protocol; however, this integration made the already slow communication even slower. To increase speed and allow communication within a single-frequency multi-node network, HFTRP, a derivative of WTRP, was developed. The protocol has two parts: first, a message design for the management tokens exchanged by communicating nodes, and second, the algorithms used to create, maintain, and repair the ring of nodes in the network. The scope of this thesis is to devise and implement a faster ring setup and ring-growing procedure. Finding optimum values for the HFTRP tuning parameters is also within the scope of this thesis.
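As an illustration of the ring mechanics the abstract refers to (token rotation and ring growth), the following is a minimal sketch of a WTRP-style logical token ring. The class names, the join procedure, and the absence of timers and repair logic are simplifying assumptions for illustration; this is not the actual HFTRP message design.

```python
# Minimal sketch of a WTRP-style logical token ring (illustrative only;
# names and the join procedure are simplified assumptions, not HFTRP frames).

class RingNode:
    def __init__(self, address):
        self.address = address
        self.successor = None  # next node in the logical ring

class TokenRing:
    def __init__(self, creator):
        self.nodes = [creator]
        creator.successor = creator  # a single-node ring points to itself
        self.token_holder = creator

    def grow(self, new_node):
        """Insert a joining node between the token holder and its successor."""
        holder = self.token_holder
        new_node.successor = holder.successor
        holder.successor = new_node
        self.nodes.append(new_node)

    def rotate_token(self):
        """Pass the right-to-transmit to the next node in the ring."""
        self.token_holder = self.token_holder.successor
        return self.token_holder

# Usage: build a three-node ring and rotate the token around it once.
a, b, c = RingNode("A"), RingNode("B"), RingNode("C")
ring = TokenRing(a)
ring.grow(b)
ring.grow(c)
for _ in range(3):
    print("token at", ring.rotate_token().address)
```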
43

Mesure et Prévision de la Volatilité pour les Actifs Liquides / Measurement and Forecasting of Volatility for Liquid Assets

Chaker, Selma 04 1900 (has links)
The observed high-frequency price series is contaminated with market microstructure frictions, or noise. We explore the measurement and forecasting of the fundamental volatility through novel approaches to the frictions problem. In the first paper, while maintaining the standard framework of an additive noise-plus-efficient-price model, we use the trading volume, quoted depths, the trade direction indicator and the bid-ask spread to absorb the noise. The econometric model is a price-impact linear regression. We show that incorporating these liquidity-cost variables delivers more precise volatility estimators. If the noise is only partially absorbed, the remaining noise is closer to white noise than the original one, which lessens misspecification of the noise characteristics. Our approach is also robust to a specific form of endogeneity under which the common noise-robust measures are inconsistent. In the second paper, we model the variance of the market microstructure noise that contaminates the efficient price as an affine function of the fundamental volatility, so that the noise is time-varying within the day. Using the eigenfunction representation of the general stochastic volatility class of models, we quantify the forecasting performance of several volatility measures under our model assumptions. In the third paper, instead of assuming the standard additive model for the observed price series, we specify the conditional distribution of the frictionless price given the available information, which includes quotes and volumes; the frictionless price is assumed to be bounded by the bid and ask prices. We derive new realized measures by characterizing the conditional mean of the integrated variance.
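A sketch of the standard framework the abstract builds on (notation assumed here, not taken from the thesis): the observed log-price is the efficient price plus microstructure noise, realized variance is the sum of squared intraday returns, and the second paper's assumption makes the noise variance affine in the fundamental volatility.

```latex
% Additive microstructure-noise framework and realized variance (notation assumed)
\begin{align*}
  p_{t_i} &= p^{*}_{t_i} + u_{t_i}
      && \text{observed log-price = efficient price + noise,}\\
  \widehat{\mathrm{RV}} &= \sum_{i=1}^{n} \bigl(p_{t_i} - p_{t_{i-1}}\bigr)^{2}
      && \text{realized variance, biased by the noise at high sampling frequencies,}\\
  \operatorname{Var}(u_{t_i}) &= \alpha + \beta\,\sigma_{t_i}^{2}
      && \text{affine noise variance, as assumed in the second paper.}
\end{align*}
```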
44

Four essays on the econometric modelling of volatility and durations

Amado, Cristina January 2009 (has links)
The thesis "Four Essays on the Econometric Modelling of Volatility and Durations" consists of four research papers in the area of financial econometrics on topics of the modelling of financial market volatility and the econometrics of ultra-high-frequency data. The aim of the thesis is to develop new econometric methods for modelling and hypothesis testing in these areas. The second chapter introduces a new model, the time-varying GARCH (TV-GARCH) model, in which volatility has a smooth time-varying structure of either additive or multiplicative type. To characterize smooth changes in the (un)conditional variance we assume that the parameters vary smoothly over time according to the logistic transition function. A data-based modelling technique is used for specifying the parametric structure of the TV-GARCH models. This is done by testing a sequence of hypotheses by Lagrange multiplier tests presented in the chapter. Misspecification tests are also provided for evaluating the adequacy of the estimated model. The third chapter addresses the issue of modelling deterministic changes in the unconditional variance over a long return series. The modelling strategy is illustrated with an application to the daily returns of the Dow Jones Industrial Average (DJIA) index from 1920 until 2003. The empirical results sustain the hypothesis that the assumption of constancy of the unconditional variance is not adequate over long return series and indicate that deterministic changes in the unconditional variance may be associated with macroeconomic factors. In the fourth chapter we propose an extension of the univariate multiplicative TV-GARCH model to the multivariate Conditional Correlation GARCH (CC-GARCH) framework. The variance equations are parameterized such that they combine the long-run and the short-run dynamic behaviour of the volatilities. In this framework, the long-run behaviour is described by the individual unconditional variances, and it is allowed to vary smoothly over time according to the logistic transition function. The effects of modelling the nonstationary variance component are examined empirically in several CC-GARCH models using pairs of seven daily stock return series from the S&P 500 index. The results show that the magnitude of such effect varies across different stock series and depends on the structure of the conditional correlation matrix. An important feature of financial durations is the evidence of a strong diurnal variation over the trading day. In the fifth chapter we propose a new parameterization for describing the diurnal pattern of trading activity. The parametric structure of the diurnal component allows the duration process to change smoothly over the time-of-day according to the logistic transition function. The empirical results suggest that the diurnal variation may not always have the inverted U-shaped pattern for the trade durations as documented in earlier studies.
45

Stochastic Modelling of Random Variables with an Application in Financial Risk Management.

Moldovan, Max January 2003 (has links)
The problem of determining whether or not a theoretical model is an accurate representation of an empirically observed phenomenon is one of the most challenging in empirical scientific investigation. The following study explores the problem of stochastic model validation. Special attention is devoted to the unusual two-peaked shape of the empirically observed distributions of financial returns conditional on realised volatility. The application of statistical hypothesis testing and simulation techniques leads to the conclusion that returns conditional on realised volatility follow a specific, previously undocumented distribution. The probability density of this distribution is derived, characterised and applied to the validation of the financial model.
46

Análise das cotações e transações intradiárias da Petrobrás utilizando dados irregularmente espaçados / Analysis of Petrobras intraday quotes and trades using irregularly spaced data

Silva, Marília Gabriela Elias da 27 August 2014 (has links)
This study uses data provided by BM&FBovespa to analyze Petrobras' stock for the months between July and August 2010 and for October 2008. First, we present a detailed discussion of data handling, showing that the mid-price quote cannot be used because of the large number of buy/sell orders with extremely high/low prices. We check some of the empirical stylized facts pointed out by Cont (2001), among others established in the microstructure literature; in general, the data replicate the stylized facts. We apply the filter proposed by Brownlees and Gallo (2006) to Petrobras' stock and analyze how the number of possible outliers found by the filter responds to variations in the filter's parameters. We propose using the Akaike criterion to rank and select conditional duration models whose duration samples have different lengths. The selected models are not always those in which the data have been filtered: for the ACD(1,1) specification, when only well-fitted models (with no residual autocorrelation) are considered, the Akaike criterion indicates as the best model the one in which the data were not filtered.
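For reference, the Brownlees and Gallo (2006) rule flags a tick as an outlier when its price is too far from a trimmed local average of its neighbours. The sketch below is a simplified illustration of that idea; the neighbourhood size k, trimming level delta and granularity gamma are assumed tuning parameters, and the exact implementation in the thesis may differ.

```python
import numpy as np

def bg_filter(prices, k=20, delta=0.10, gamma=0.02):
    """Flag possible outliers in a high-frequency price series, in the spirit of
    Brownlees & Gallo (2006): an observation is kept if it lies within
    3 * (trimmed std of its k neighbours) + gamma of their trimmed mean.
    Simplified illustration; parameter choices are assumptions."""
    prices = np.asarray(prices, dtype=float)
    n = len(prices)
    is_outlier = np.zeros(n, dtype=bool)
    half = k // 2
    for i in range(n):
        # neighbourhood of about k observations around i, excluding i itself
        lo, hi = max(0, i - half), min(n, i + half + 1)
        neigh = np.delete(prices[lo:hi], i - lo)
        # delta-trimmed mean and standard deviation of the neighbourhood
        trim = int(np.floor(delta * len(neigh)))
        sorted_neigh = np.sort(neigh)
        trimmed = sorted_neigh[trim:len(neigh) - trim] if trim > 0 else sorted_neigh
        mean, std = trimmed.mean(), trimmed.std(ddof=1)
        is_outlier[i] = abs(prices[i] - mean) >= 3.0 * std + gamma
    return is_outlier

# Usage: count how many ticks a given parameterization would discard.
rng = np.random.default_rng(0)
ticks = 30.0 + np.cumsum(rng.normal(0, 0.01, size=500))
ticks[100] += 1.0  # inject an artificial outlier
print(bg_filter(ticks).sum(), "possible outliers flagged")
```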
47

Descoberta de preço nas opções de Petrobrás / Price discovery in Petrobras options

Suzuki, Yurie Yassunaga January 2015 (has links)
This work studies the behavior of Petrobras' stock and options markets using price discovery methodology. With high-frequency data provided by BM&FBOVESPA, the econometric models used in this methodology were estimated and the Information Share (IS) and Component Share (CS) measures were calculated. The analyses indicated dominance of the spot market in the price discovery process: for this market, values above 66% were observed for IS and above 74% for CS. Graphical analysis of the impulse response function also indicated that the spot market is more efficient than the options market.
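For context, the two measures mentioned are conventionally computed from a vector error-correction model (VECM) of the cointegrated spot and option price series. The formulas below follow the standard Hasbrouck (1995) information share and Gonzalo-Granger component share definitions; the notation is assumed here and not taken from the dissertation.

```latex
% Standard price-discovery measures from a bivariate VECM (notation assumed)
\begin{align*}
  \Delta p_t &= \alpha\beta' p_{t-1} + \sum_{j=1}^{k} \Gamma_j \Delta p_{t-j} + e_t,
      \qquad \operatorname{Var}(e_t) = \Omega = FF' \ \text{(Cholesky factor } F\text{)},\\
  \mathrm{IS}_i &= \frac{\bigl([\psi F]_i\bigr)^{2}}{\psi\,\Omega\,\psi'}
      && \text{Hasbrouck information share, } \psi \text{ the common row of the long-run impact matrix,}\\
  \mathrm{CS}_i &= \frac{\alpha_{\perp,i}}{\alpha_{\perp,1}+\alpha_{\perp,2}}
      && \text{Gonzalo--Granger component share, } \alpha_{\perp} \text{ orthogonal to the adjustment vector } \alpha.
\end{align*}
```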
48

Efektivnost trhu a automatické obchodní systémy / Market efficiency and automated trading

ZEMAN, Petr January 2013 (has links)
The dissertation deals with the problem of efficiency of the spot currency market. Its main aim is to verify the efficient-market hypothesis on the major foreign exchange pairs, especially in the short term. The author focuses on the effective functioning of foreign exchange markets. The behaviour of the five main spot foreign exchange pairs - EUR/USD, GBP/USD, USD/CHF, USD/JPY and USD/CAD - is analyzed in the thesis. Due to the rise of intraday trading and the growing popularity of margin accounts among retail investors, spot rates are investigated primarily through high-frequency data collected over periods equal to or shorter than one day. The hypothesis of efficient exchange rate behaviour is verified both with statistical methods and through automated trading systems, which were designed to assess the economic importance of the theory and to confirm or rule out the possibility of retail investors achieving above-average profits on the foreign exchange markets.
49

Modelování durací mezi finančními transakcemi / Modeling of duration between financial transactions

Voráčková, Andrea January 2018 (has links)
This diploma thesis deals with properties of the ACD process and methods of its estimation. First, the basic definitions and the relations between ARMA and GARCH processes are stated. In the second part of the thesis, the ACD process is defined and the relation between ARMA and ACD is shown. Then we show the methods of data adjustment, estimation, prediction and verification of the ACD model. After that, the particular cases of the ACD process - EACD, WACD, GACD, GEVACD - are introduced together with their properties and motivational examples. The numerical part is performed in R software and concerns the precision of the estimates and predictions of the special cases of the ACD model depending on the length of the series and the number of simulations. In the last part, we apply the methods stated in the theoretical part to real data. The adjustment of the data and the estimation of the parameters are performed, as well as the verification of the ACD model. After that, we predict a few steps ahead and compare them with real durations.
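As a reminder of the model class the abstract refers to, here is the standard Engle-Russell ACD(1,1) formulation and its ARMA(1,1) representation (notation assumed; the special cases listed above differ in the distribution assumed for the innovation, e.g. exponential for EACD and Weibull for WACD).

```latex
% Engle--Russell ACD(1,1) for durations x_i and its ARMA(1,1) representation
\begin{align*}
  x_i &= \psi_i\,\varepsilon_i, \qquad \varepsilon_i \ \text{i.i.d. positive},\ \mathbb{E}\,\varepsilon_i = 1,\\
  \psi_i &= \omega + \alpha\, x_{i-1} + \beta\, \psi_{i-1},\\
  x_i &= \omega + (\alpha+\beta)\, x_{i-1} + \eta_i - \beta\,\eta_{i-1},
      \qquad \eta_i := x_i - \psi_i \ \text{(a martingale difference).}
\end{align*}
```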
50

Combinação de projeções de volatilidade baseadas em medidas de risco para dados em alta frequência / Volatility forecast combination using risk measures based on high frequency data

Alcides Carlos de Araújo 29 April 2016 (has links)
High-frequency trading (HFT) has grown significantly in recent years, which raises the need to study the Brazilian stock market in the context of high-frequency data. Volatility estimators of asset prices based on high-frequency trade data are the main objects of study. Following Aldridge (2010) and Vuorenmaa (2013), HFT is defined as the rapid reallocation of trading capital, with transactions executed within milliseconds by complex algorithms that manage order submission, analyze the incoming data and make the best buy and sell decisions. The main source of information for HFT analysis is tick-by-tick data, known as high-frequency data. Realized volatility is a risk measure derived from high-frequency data analysis and is used for risk management. According to Andersen et al. (2003), Pong et al. (2004), Koopman et al. (2005) and Corsi (2009), there is a consensus in finance that volatility forecasts based on this risk measure are more accurate than volatility estimates from GARCH models. Volatility forecasting is a key tool in financial management for provisioning reserves against possible losses. Because several forecasting methods exist, one must either select a single model or combine the forecasts. The main challenge in combining forecasts is the choice of the weights; most research in the field focuses on methods for choosing weights that minimize forecast errors. The literature, however, lacks a method that accounts for the risk of forecasting volatility below its realized level. Aiming to fill this gap, the main goal of this thesis is to propose a combination of asset-price volatility forecasts using high-frequency data for the Brazilian market. As its main innovation, the thesis proposes the use of a function based on the Lower Partial Moment (LPM) to estimate the combination weights. Although the LPM is well known in the literature, its use for forecast combination had not yet been studied. The thesis contributes to the literature by studying combinations of forecasts produced by HAR, MIDAS, ARFIMA and Nearest Neighbor models, and by proposing two new combination methods, called LPMFE (Lower Partial Moment Forecast Error) and DLPMFE (Discounted LPMFE). The methods show promising results for cases in which the aim is to avoid losses larger than expected without over-provisioning the budget.
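A rough illustration of the idea behind an LPM-based weighting scheme (a generic sketch under assumed definitions, not the thesis's actual LPMFE/DLPMFE algorithms): compute each candidate model's lower partial moment of forecast errors, penalizing only under-forecasts of volatility, and give more weight to models with a smaller LPM.

```python
import numpy as np

def lower_partial_moment(errors, target=0.0, order=2):
    """LPM of forecast errors e_t = realized - forecast: only shortfalls
    (forecast below realized, e_t > target) are penalized. Generic textbook
    definition, used here as an assumed illustration of an LPM-type loss."""
    shortfall = np.maximum(np.asarray(errors) - target, 0.0)
    return np.mean(shortfall ** order)

def lpm_weights(realized, forecasts, target=0.0, order=2):
    """Combination weights inversely proportional to each model's LPM
    (a simple sketch; the thesis's LPMFE/DLPMFE weighting may differ)."""
    lpms = np.array([lower_partial_moment(realized - f, target, order)
                     for f in forecasts])
    inv = 1.0 / np.maximum(lpms, 1e-12)   # guard against a zero LPM
    return inv / inv.sum()

# Usage: combine three hypothetical volatility forecast series.
rng = np.random.default_rng(1)
rv = np.abs(rng.normal(1.0, 0.2, size=250))             # realized volatility proxy
models = [rv + rng.normal(0, s, size=250) for s in (0.05, 0.10, 0.20)]
w = lpm_weights(rv, models)
combined = sum(wi * f for wi, f in zip(w, models))
print("weights:", np.round(w, 3))
```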
