101

Plánovaná versus skutečná plodnost - dotazníkové šetření realizované plodnosti na vzorku žen v ČR / Planned versus realized fertility - a questionnaire survey of realized fertility of a small sample of women in the Czech Republic

Klementová, Lenka January 2015 (has links)
The objective of the thesis is to compare the planned and realized fertility of a small sample of Czech women with completed fertility. A questionnaire was used to determine both the planned and the actual number of children, their gender, birth order and spacing, and the woman's age at first birth and her marital status. The realization of these events was compared and the reasons for failure were identified. A sample of 47 respondents anonymously completed a two-page questionnaire comprising four parts -- plan, reality, personal characteristics and additional questions. The additional questions concerned contraception, abortion, religion, etc. The survey showed an average of 2.34 children per woman, whereas the planned average was 2.28. On average, 53 % of the women fulfilled their plans regarding the number of children, their gender and order, spacing, age at first birth and age at first marriage. Finally, the relationship between the observed and the planned number of children was analysed on the gathered data. The resulting model showed that a planned number of one child corresponds to a realized number of 1.47 children.
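The reported mapping from a plan of one child to roughly 1.47 realized children is consistent with a simple linear regression of realized on planned counts. The abstract does not state the exact estimation method, so the following Python sketch is only an illustration under that assumption; the data and variable names are hypothetical, not taken from the thesis.

```python
import numpy as np

# Hypothetical questionnaire data (illustrative only, not the thesis's sample):
# planned and realized numbers of children per respondent.
planned = np.array([1, 2, 2, 3, 2, 1, 2, 3, 2, 2], dtype=float)
realized = np.array([2, 2, 3, 3, 2, 1, 2, 4, 2, 3], dtype=float)

# Ordinary least-squares fit: realized = a + b * planned
X = np.column_stack([np.ones_like(planned), planned])
(a, b), *_ = np.linalg.lstsq(X, realized, rcond=None)

print(f"intercept = {a:.2f}, slope = {b:.2f}")
print(f"predicted realized fertility for a plan of one child: {a + b:.2f}")
```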
102

Mémoire longue, volatilité et gestion de portefeuille / Long memory, volatility and portfolio management

Coulon, Jérôme 20 May 2009 (has links)
Cette thèse porte sur l’étude de la mémoire longue de la volatilité des rendements d’actions. Dans une première partie, nous apportons une interprétation de la mémoire longue en termes de comportement d’agents grâce à un modèle de volatilité à mémoire longue dont les paramètres sont reliés aux comportements hétérogènes des agents pouvant être rationnels ou à rationalité limitée. Nous déterminons de manière théorique les conditions nécessaires à l’obtention de mémoire longue. Puis nous calibrons notre modèle à partir des séries de volatilité réalisée journalière d’actions américaines de moyennes et grandes capitalisations et observons le changement de comportement des agents entre la période précédant l’éclatement de la bulle internet et celle qui la suit. La deuxième partie est consacrée à la prise en compte de la mémoire longue en gestion de portefeuille. Nous commençons par proposer un modèle de choix de portefeuille à volatilité stochastique dans lequel la dynamique de la log-volatilité est caractérisée par un processus d’Ornstein-Uhlenbeck. Nous montrons que l’augmentation du niveau d’incertitude sur la volatilité future induit une révision du plan de consommation et d’investissement. Puis dans un deuxième modèle, nous introduisons la mémoire longue grâce au mouvement brownien fractionnaire. Cela a pour conséquence de transposer le système économique d’un cadre markovien à un cadre non-markovien. Nous fournissons donc une nouvelle méthode de résolution fondée sur la technique de Monte Carlo. Puis, nous montrons toute l’importance de modéliser correctement la volatilité et mettons en garde le gérant de portefeuille contre les erreurs de spécification de modèle. / This PhD thesis studies the long memory of the volatility of stock returns. In the first part, we provide an interpretation of long memory in terms of agents' behavior through a long-memory volatility model whose parameters are linked to the heterogeneous behavior of agents, who may be rational or boundedly rational. We determine theoretically the conditions necessary for long memory to arise. We then calibrate our model on the daily realized-volatility series of mid- and large-cap American stocks and observe the change in agents' behavior between the period before the bursting of the internet bubble and the period after it. The second part is devoted to taking long memory into account in portfolio management. We begin by proposing a stochastic-volatility portfolio-choice model in which the dynamics of the log-volatility are characterized by an Ornstein-Uhlenbeck process. We show that an increase in uncertainty about future volatility induces a revision of the consumption and investment plan. Then, in a second model, we introduce long memory through fractional Brownian motion. As a consequence, the economic system moves from a Markovian to a non-Markovian framework, so we provide a new solution method based on Monte Carlo techniques. Finally, we show how important it is to model volatility correctly and warn the portfolio manager against model misspecification errors.
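The first portfolio model described above lets log-volatility follow an Ornstein-Uhlenbeck process. A minimal simulation sketch of that single ingredient is given below; the parameter values and Euler discretization are arbitrary placeholders, not the thesis's calibration or solution method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not calibrated to any data)
kappa, theta, sigma = 5.0, np.log(0.2), 0.8   # mean reversion, long-run log-vol, vol of log-vol
T, n = 1.0, 252                               # one year of daily steps
dt = T / n

logv = np.empty(n + 1)
logv[0] = theta
for t in range(n):
    # Euler scheme for d(log v) = kappa*(theta - log v) dt + sigma dW
    logv[t + 1] = logv[t] + kappa * (theta - logv[t]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

vol = np.exp(logv)
print(f"mean simulated volatility: {vol.mean():.3f}")
```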
103

Entrevoir la construction de niche des Iroquoiens du Saint-Laurent dans les paysages de la vallée laurentienne aux XVIe et XVIIe siècles : quels enseignements pouvons-nous tirer des feux contrôlés dans la gestion des milieux naturels?

Fortin, Daniel 03 1900 (has links)
Ce mémoire a pour objectif général d’apporter une contribution à la compréhension de la nature et l’étendue des modifications du paysage par les Iroquoiens du Saint-Laurent dans la vallée laurentienne au XVIe siècle. Comme « organismes vivants », ce groupe culturel apparenté qui occupait un grand territoire entre l’embouchure des Grands Lacs et le golfe du Saint-Laurent constituant une vaste zone de captage de ressources, a créé des niches pour assurer sa reproduction. Nous nous intéressons particulièrement à l’empreinte que ces niches réalisées ont laissé sur leur environnement en prenant spécifiquement le paysage comme objet d’étude. Nous avons cherché à évaluer l’importance ou l’étendue de cette « transformation ». Puisqu’il est difficile de « voir » ces paysages du passé, nous avons tenté de les décrire en utilisant des descriptions des premiers explorateurs, missionnaires, aventuriers, administrateurs et colonisateurs européens, d’une part, et à des études paléoécologiques et archéologiques plus récentes, de l’autre. La démarche adoptée est celle de la multidisciplinarité. Les études protohistoriques sur le paysage étant relativement peu fréquentes en écologie végétale et dans les disciplines de l’aménagement, nous nous appuierons principalement sur une revue de littérature dans les domaines de l’ethnologie, de l’ethnohistoire, de l’histoire, de l’ethnologie comparative, de la géographie, de l’écologie, de l’agronomie, de la palynologie, de l’étude des charbons de bois fossiles, de l’archéologie et de l’archéologie du paysage, considérant certains paysages comme une niche réalisée. Au final, plus de 400 textes ont été consultés dont 160 ont été cités dans ce mémoire. Les illustrations de l’explorateur et du cartographe Samuel de Champlain, sur la côte est de la Nouvelle-Angleterre, et celles de la vallée du Saint-Laurent ont été étudiées pour déterminer à la fois leur valeur de « vérité » et leur valeur de « connaissances » et, ainsi nous permettre de mieux comprendre les paysages anthropiques de ces territoires. Un des outils les plus efficaces pour défricher de larges pans des forêts pour ainsi ouvrir le paysage et permettre, en autre, la mise en culture de la terre est l’utilisation du feu. Nous avons recensé un certain nombre d’observations dans ce sens dans le nord-est de l’Amérique, mais pas strictement dans la vallée du Saint-Laurent. La niche réalisée par les Iroquoiens du Saint-Laurent devait constituer un ou des paysages de type mosaïque. C’est-à-dire que de grandes étendues de forêts denses pouvaient alterner avec des forêts de type « parc ». Les sources historiques retenues dans notre mémoire tendent à confirmer ce type de paysages dans certaines parties de la vallée laurentienne et autour des Grands Lacs. / The purpose of this thesis is to contribute to the understanding of the nature and extent of the landscape changes made by the St. Lawrence Iroquoians in the Laurentian Valley in the sixteenth century. As "living organisms" (Homo sapiens), this related cultural group, which occupied a vast territory between the outlet of the Great Lakes and the Gulf of St. Lawrence that formed a vast resource catchment area, created niches to ensure its reproduction. We are particularly interested in the "footprint" that these realized niches left on their environment, taking the landscape specifically as the object of study. We sought to assess the importance or extent of this "transformation".
Since it is difficult to "see" these landscapes of the past, we have tried to describe them using the accounts of early European explorers, missionaries, adventurers, administrators and colonizers, on the one hand, and more recent paleoecological and archaeological studies, on the other. The approach adopted is multidisciplinary. Because protohistorical studies of the landscape are relatively infrequent in plant ecology and in the planning disciplines, we rely primarily on a literature review in the fields of ethnology, ethnohistory, history, comparative ethnology, geography, ecology, agronomy, palynology, the study of fossil charcoal, archaeology and landscape archaeology, considering certain landscapes as a realized niche. In the end, more than 400 texts were consulted, of which 160 are cited in this master's thesis. Illustrations by the explorer and cartographer Samuel de Champlain of the east coast of New England and of the St. Lawrence Valley were studied to determine both their value as "truth" and their value as "knowledge", and thus to allow us to "see" the anthropic landscapes of these territories. One of the most effective tools for clearing large tracts of forest, opening up the landscape and allowing, among other things, the cultivation of the land is the use of fire. We have identified a number of observations along these lines in northeastern America, but not strictly in the St. Lawrence Valley. The niche realized by the St. Lawrence Iroquoians likely constituted one or more mosaic-type landscapes, that is, large areas of dense forest alternating with "park"-type forests. The historical sources retained in our thesis tend to confirm this type of landscape in certain parts of the Laurentian Valley and around the Great Lakes. As part of this master's thesis, we also ask whether the controlled fires used by Aboriginal peoples to create or maintain a niche could be used in the management of natural environments, taking as an example the speckled alder (Alnus incana ssp. rugosa) overgrowth in the marshes of the Lake Saint-François National Wildlife Area.
104

Analýza a hodnocení rizik realizace úspory energie pomocí rekonstrukce školského zařízení / Analysis and risk assessment of implementation of energy saving via renovation of school facilities

Šafářová, Tereza January 2016 (has links)
The thesis focuses on analysis and risk assessment during the reconstruction of a school building. Appropriate methods, in particular a checklist and a risk catalog, will be applied to a specific school reconstruction. The risks that could cause failure to comply with the schedule of works, and thus restrict the operation of the school and the surrounding area, will be identified. Measures will then be set out for these risks to determine how such scenarios could have been avoided.
105

[pt] ENSAIOS SOBRE A PRECIFICAÇÃO EMPÍRICA DE ATIVOS, POLÍTICA MONETÁRIA E SUAS INTER-RELAÇÕES / [en] ESSAYS ON EMPIRICAL ASSET PRICING, MONETARY POLICY AND THEIR INTER-RELATIONS

FLÁVIO DE FREITAS VAL 20 September 2016 (has links)
[pt] A presente tese trata da estimação do risco e da precificação de ativos financeiros, de medidas que buscam estimar como os agentes de mercado estão avaliando a política monetária, bem como da inter-relação entre o mercado acionário e a política monetária. Esta inter-relação é representada pela estimação da reação do mercado acionário às mudanças na política monetária. O primeiro trabalho implementa dois recentes modelos de estimação de volatilidade que utilizam dados de alta frequência. O modelo Auto-Regressivo Heterogêneo (HAR) e o modelo Componente (2-Comp) são estimados e os resultados são comparados com os encontrados pelas estimações que utilizam a família de modelos Auto-Regressivos com Heteroscedasticidade Generalizados (GARCH). Durante o período analisado, os modelos que usam dados intradiários obtiveram melhores previsões de retornos dos ativos avaliados, tanto dentro como fora da amostra, confirmando assim que esses modelos possuem informações importantes para uma série de agentes econômicos. No trabalho seguinte se estima a credibilidade da política monetária implementada pelo Banco Central do Brasil - BCB nos últimos dez anos. Esta credibilidade foi estimada por meio de implementação do filtro de Kalman em medidas derivadas de expectativas inflacionárias de pesquisa ao consumidor, da pesquisa Focus do BCB e de curvas de juros dos títulos governamentais. Os resultados fornecem evidências da existência de três movimentos da credibilidade inflacionária estimada pela medida implícita e pela Focus no período analisado: (i) cedeu fortemente em meados de 2008, durante o momento mais crítico da Crise Subprime; (ii) relativa estabilidade entre o início de 2009 e meados de 2010 (meados de 2013, pela medida Focus); (iii) uma tendência de queda a partir de então, quando houve uma taxa real de juros abaixo da mínima compatível com a meta de inflação. Já a credibilidade inflacionária estimada a partir de pesquisa ao consumidor apresentou um comportamento mais errático que as demais, apresentando uma tendência de queda mais intensa a partir do início de 2013 e permanecendo em patamares próximos a zero desde então. Ao mesmo tempo, os resultados indicam que alterações da inflação são importantes para a previsão da credibilidade estimada a partir de pesquisa ao consumidor, validando sua característica backward looking e de ser formada a partir de expectativa adaptativa dos consumidores. A metodologia adotada possibilita desenvolver estimativas em tempo real do grau desta credibilidade e retornar avaliação quantitativa sobre a consistência da política monetária em um ambiente de metas de inflação. Ele contribui para a literatura existente ao implementar o teste de credibilidade de Svensson (1993) e o estender dentro de um arcabouço econométrico de espaço de estado, permitindo a estimação probabilística do grau de credibilidade da política monetária implementada pela autoridade monetária brasileira no período analisado. Finalmente, o terceiro e último trabalho é um estudo empírico da relação entre a política monetária, implementada pelo BCB, e o mercado de ações brasileiro. Utilizando a metodologia de Estudo de Eventos, analisa-se o efeito dos componentes esperados e não esperados das decisões de política monetária nos retornos do Índice Bovespa e de trinta e cinco ações de diferentes empresas. 
Os resultados fornecem evidências de que a política monetária possui um efeito significativo no mercado acionário, sendo que o evento de reversão na direção da política monetária tende a potencializar a resposta deste mercado. A análise no nível setorial indica que o setor de consumo cíclico é o mais afetado por esta política, enquanto os setores de utilidade pública e de petróleo, gás e biocombustíveis não são afetados significativamente. Os ativos individuais respondem de forma bastante heterogênea à política monetária, porém, ao se utilizar os retornos anormais destes ativos, identificou-se uma forte redução na intensidade e no número de empresas impactadas pela política monetária. Além disso, a surpresa monetária é explicada por variações não esperadas da taxa de desemprego, do índice de produção industrial e do IPCA, sendo Granger causada por variações não esperadas do índice de produção industrial, indicando a importância desta variável para a previsão da política monetária. / [en] The present thesis discusses the estimation of risk and the pricing of financial assets, measures that seek to estimate how market players are evaluating monetary policy, and the inter-relationship between the stock market and monetary policy. This inter-relationship is represented by the estimation of the stock market's reaction to changes in monetary policy. The first essay implements the estimation of two recent volatility models using high-frequency data. The Heterogeneous Autoregressive (HAR) model and the Component (2-Comp) model are estimated and the results are compared with those obtained from the Generalized Autoregressive Conditional Heteroskedasticity (GARCH) family of models. Over the analyzed period, the models using intraday data produced better forecasts of the returns of the evaluated assets, both in-sample and out-of-sample, confirming that these models carry important information for economic agents. The next essay estimates the credibility of the monetary policy implemented by the Central Bank of Brazil (BCB) over the last ten years. This credibility was estimated by applying a Kalman filter to measures of inflation expectations derived from a consumer survey, from the BCB's Focus survey and from the yield curves of government bonds. The results provide evidence of three movements in inflationary credibility over the analyzed period: (i) a sharp downturn in mid-2008; (ii) relative stability between early 2009 and mid-2010 (mid-2013 for the Focus measure); (iii) a downward trend thereafter, when the real interest rate fell below the minimum compatible with the inflation target. In addition, the inflationary credibility estimated from the consumer survey behaved more erratically than the others, falling more sharply from the beginning of 2013 and remaining at levels close to zero since then. At the same time, the results indicate that changes in inflation are important for predicting the credibility estimated from the consumer survey, validating its backward-looking character and its construction from adaptive consumer expectations. The adopted methodology makes it possible to develop real-time estimates of BCB credibility and to return a quantitative assessment of the consistency of monetary policy under an inflation-targeting regime.
This work adds to the existing literature by implementing the Svensson (1993) credibility test and extending it within a state-space econometric framework, allowing a probabilistic estimation of the degree of credibility of the monetary policy implemented by the Brazilian monetary authority during the analyzed period. Finally, the third essay is an empirical study of the relationship between the monetary policy implemented by the BCB and the Brazilian stock market. Using the event-study methodology, it analyzes the effect of the expected and unexpected components of monetary policy decisions on the returns of the Bovespa index and of thirty-five individual stocks. The results provide evidence that monetary policy has a significant effect on stock market returns, and that a reversal in the direction of monetary policy tends to amplify the market's response. The sector-level analysis indicates that the cyclical consumer sector is the most affected by this policy, while the public utilities and the oil, gas and biofuels sectors are not significantly affected. Individual assets respond to monetary policy in a very heterogeneous way; however, when abnormal returns are used, we identify a strong reduction in the intensity and in the number of companies affected by monetary policy. Furthermore, the monetary surprise is explained by unexpected variations in the unemployment rate, the industrial production index and the CPI, and it is Granger-caused by unexpected variations in the industrial production index, indicating the importance of this variable for forecasting monetary policy.
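The first essay above compares HAR and Component volatility models against GARCH-family benchmarks. As a point of reference, a minimal HAR-RV regression, which predicts next-day realized volatility from daily, weekly and monthly averages, can be sketched as follows; the placeholder data and lag conventions are illustrative assumptions, not the thesis's specification.

```python
import numpy as np

def har_design(rv):
    """Build HAR-RV regressors: daily, weekly (5-day) and monthly (22-day) averages of realized volatility."""
    rows, target = [], []
    for t in range(21, len(rv) - 1):
        rows.append([1.0, rv[t], rv[t - 4:t + 1].mean(), rv[t - 21:t + 1].mean()])
        target.append(rv[t + 1])
    return np.array(rows), np.array(target)

# Illustrative realized-volatility series (random placeholder, not the thesis's data)
rng = np.random.default_rng(1)
rv = np.abs(rng.standard_normal(500)) * 0.01

X, y = har_design(rv)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS estimate of the HAR coefficients

# One-step-ahead forecast built from the most recent daily, weekly and monthly averages
latest = np.array([1.0, rv[-1], rv[-5:].mean(), rv[-22:].mean()])
print("HAR coefficients:", np.round(beta, 4))
print("next-day RV forecast:", round(float(latest @ beta), 5))
```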
106

Modeling the Relation Between Implied and Realized Volatility / Modellering av relationen mellan implicit och realiserad volatilitet

Brodd, Tobias January 2020 (has links)
Options are an important part of today's financial market. It is therefore important to understand when options are overvalued or undervalued in order to gain an edge on the market. To determine this, the relation between the volatility of the underlying asset, called realized volatility, and the market's expected volatility, called implied volatility, can be analyzed. In this thesis, five models were investigated for modeling the relation between implied and realized volatility. The five models consisted of one Ornstein–Uhlenbeck model, two autoregressive models and two artificial neural networks. To analyze the performance of the models, different accuracy measures were calculated for out-of-sample forecasts. Signals from the models were also calculated and used in a simulated options trading environment to get a better understanding of how well they perform in trading applications. The results suggest that artificial neural networks are able to model the relation more accurately than more traditional time series models. It was also shown that a trading strategy based on forecasting the relation was able to generate significant profits. Furthermore, it was shown that profits could be increased by combining a forecasting model with a signal classification model. / Optioner är en viktig del i dagens finansiella marknad. Det är därför viktigt att kunna förstå när optioner är över- och undervärderade för att vara i framkant av marknaden. För att bestämma detta kan relationen mellan den underliggande tillgångens volatilitet, kallad realiserad volatilitet, och marknadens förväntade volatilitet, kallad implicit volatilitet, analyseras. I den här avhandlingen undersöktes fem modeller för att modellera relationen mellan implicit och realiserad volatilitet. De fem modellerna var en Ornstein–Uhlenbeck modell, två autoregressiva modeller samt två artificiella neurala nätverk. För att analysera modellernas prestanda undersöktes olika noggrannhetsmått för prognoser från modellerna. Signaler från modellerna beräknades även och användes i en simulerad optionshandelsmiljö för att få en bättre förståelse för hur väl de presterar i en handelstillämpning. Resultaten tyder på att artificiella neurala nätverk kan modellera relationen bättre än de mer traditionella tidsseriemodellerna. Det visades även att en handelsstrategi baserad på prognoser av relationen kunde generera en signifikant vinst. Det visades dessutom att vinster kunde ökas genom att kombinera en prognosmodell med en modell som klassificerar signaler.
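The forecast-then-signal workflow summarized above can be illustrated with a toy autoregressive model of the implied-to-realized volatility ratio. The sketch below uses simulated placeholder data and a naive signal rule; it is a stand-in for, not a reproduction of, any of the five models estimated in the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)

# Placeholder daily series of realized (RV) and implied (IV) volatility.
rv = 0.15 + 0.03 * np.abs(rng.standard_normal(300))
iv = rv * np.exp(0.10 + 0.20 * rng.standard_normal(300))

# Model the relation as an AR(1) on x_t = log(IV_t / RV_t), one simple
# autoregressive specification among those the thesis compares.
x = np.log(iv / rv)
X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
(c, phi), *_ = np.linalg.lstsq(X, x[1:], rcond=None)

x_next = c + phi * x[-1]   # one-step-ahead forecast of the log-ratio
# Naive signal: options look expensive if implied vol is forecast to exceed realized vol.
signal = "overvalued options (sell volatility)" if x_next > 0 else "undervalued options (buy volatility)"
print(f"AR(1): c={c:.3f}, phi={phi:.3f}, forecast log(IV/RV)={x_next:.3f} -> {signal}")
```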
107

Essays on Volatility Risk, Asset Returns and Consumption-Based Asset Pricing

Kim, Young Il 25 June 2008 (has links)
No description available.
108

Development of a Novel Social Media Sentiment Risk Model for Financial Assets / Utveckling av ett finansiellt riskmått med hänsyn till sentimentalitet från sociala medier

Rudert, Emelie January 2023 (has links)
This thesis investigates the potential effects on Value at Risk (VaR) measurements of including social media sentiment from Reddit and Twitter. The investigated stocks are Apple, Alphabet and Tesla. The VaR measurements are computed from volatility forecasts combined with assumptions about the return distribution. Volatility is forecast by two different models, each estimated both with and without social media sentiment, so there are four different volatility forecasts for each stock. The volatility models are the Heterogeneous Autoregression (HAR) model and the Heterogeneous Autoregression Neural Network (HAR-NN) model. The assumed return distributions are a log-logistic and a log-normal distribution. The VaR measurements are then evaluated through the number of breaches for each volatility forecast and for both distributional assumptions. The results show an improvement in volatility forecasting for Apple and Alphabet, as well as fewer VaR breaches under both assumptions for the log-return distribution. For Tesla, however, the volatility forecasts were better when social media sentiment was excluded. A possible reason is that Twitter posts by influential people, such as Elon Musk, may have a larger effect on volatility than the average sentiment score over the day. Another possible explanation is multicollinearity. Overall, the results showed that the log-logistic assumption was more suitable than the log-normal return distribution for all three stocks. / Den här studien undersöker de potentiella effekterna av att inkludera sentiment från Reddit och Twitter vid beräkning av det finansiella riskmåttet VaR. De undersökta aktierna är Apple, Alphabet och Tesla. VaR-måtten beräknas genom att förutspå volatiliteten samt genom att göra antaganden om aktiernas avkastningsfördelning. Volatiliteten förutspås genom två olika modeller och bägge modellerna kommer både att inkludera och exkludera sentiment från sociala medier. Därav kommer det totalt att vara fyra olika volatilitetsprognoser för vardera aktie. Volatilitetsmodellerna som används i denna studie är HAR-modellen och HAR-NN-modellen. De antaganden som görs om logaritmen av avkastningsfördelningarna är att de följer en logistisk fördelning samt en normalfördelning. Dessutom är VaR-måtten beräknade och evaluerade genom antalet gånger portföljen överskrider VaR-måttet för varje volatilitetsprognos och för vardera antagande om avkastningsfördelning. Resultaten av denna studie visar att inkludering av sentiment från sociala medier förbättrar volatilitetsprognosen för Apple och Alphabet, samt att portföljen överskrider dessa VaR-mått färre gånger för båda fördelningsantagandena. Däremot visar resultaten för Tesla att volatilitetsprognosen är sämre då sentiment från sociala medier inkluderas i modellerna. En möjlig anledning till detta skulle kunna vara inflytelserika personer, såsom Elon Musk, vars Twitterinlägg har större påverkan på aktievolatiliteten än medelsentimentet. En annan möjlig förklaring är multikollinearitet, ifall sentimenten för Tesla är starkt korrelerade med volatiliteten. Sammantaget visade resultaten att antagandet att logaritmen av avkastningarna följer en logistisk fördelning var mer passande än antagandet om en normalfördelning för alla tre aktier.
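As a rough illustration of the VaR construction described above, turning a one-day volatility forecast plus a distributional assumption into a VaR level and then counting breaches, here is a minimal sketch. It uses a normal assumption and placeholder data for simplicity; it is not the thesis's HAR/HAR-NN pipeline or its log-logistic variant.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Placeholder inputs: one-day-ahead volatility forecasts and realized log-returns.
sigma_hat = 0.02 + 0.005 * np.abs(rng.standard_normal(250))
returns = sigma_hat * rng.standard_normal(250)

alpha = 0.01  # 99% VaR

# VaR under a normal assumption for returns, scaled by the forecast volatility.
var_normal = -norm.ppf(alpha) * sigma_hat

# Count breaches: days on which the realized loss exceeds the VaR forecast.
breaches = int(np.sum(-returns > var_normal))
print(f"99% VaR breaches (normal assumption): {breaches} of {len(returns)} days")
```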
109

ESSAYS ON OPTION IMPLIED VOLATILITY RISK MEASURES FOR BANKS

ANSELMI, GIULIO 03 March 2016 (has links)
La tesi comprende tre saggi sul ruolo della volatilità implicita per le banche. La tesi è organizzata in tre capitoli. Capitolo I - studia il ruolo di skew e spread della volatilità implicita nel determinare i rendimenti delle azioni bancarie. Capitolo II - analizza gli effetti degli skew della volatilità implicita e della realized volatility sulla leva finanziaria delle banche. Capitolo III - si focalizza sul rapporto tra il coefficiente di liquidità delle banche e le misure per il rischio estratte dalla volatilità (skew, spread, realized volatility). / The thesis comprises three essays on option-implied volatility risk measures for banks. The thesis is organized in three chapters. Chapter I - studies the informational content of options' implied volatility skews and spreads for banks' stock returns. Chapter II - analyzes the effects of volatility risk measures (implied volatility skew and realized volatility) on banks' leverage. Chapter III - studies the relationship between banks' liquidity ratio and volatility risk measures (skew, spread, realized volatility).
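The abstract does not define the skew and spread measures precisely. A common convention, assumed here purely for illustration, takes the skew as the gap between out-of-the-money put and at-the-money call implied volatilities and the spread as implied minus realized volatility:

```python
def implied_vol_skew(otm_put_iv: float, atm_call_iv: float) -> float:
    """Skew under one common convention: OTM put IV minus ATM call IV (assumed definition)."""
    return otm_put_iv - atm_call_iv

def implied_realized_spread(atm_iv: float, realized_vol: float) -> float:
    """Spread under one common convention: ATM implied minus realized volatility (assumed definition)."""
    return atm_iv - realized_vol

# Illustrative annualized volatilities for a hypothetical bank stock
print(round(implied_vol_skew(0.34, 0.28), 2))         # positive skew: downside protection is pricier
print(round(implied_realized_spread(0.30, 0.26), 2))  # positive spread: implied exceeds realized vol
```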
110

Efficient estimation using the characteristic function : theory and applications with high frequency data

Kotchoni, Rachidi 05 1900 (has links)
The attached file is created with Scientific Workplace Latex / Nous abordons deux sujets distincts dans cette thèse: l'estimation de la volatilité des prix d'actifs financiers à partir des données à haute fréquence, et l'estimation des paramétres d'un processus aléatoire à partir de sa fonction caractéristique. Le chapitre 1 s'intéresse à l'estimation de la volatilité des prix d'actifs. Nous supposons que les données à haute fréquence disponibles sont entachées de bruit de microstructure. Les propriétés que l'on prête au bruit sont déterminantes dans le choix de l'estimateur de la volatilité. Dans ce chapitre, nous spécifions un nouveau modèle dynamique pour le bruit de microstructure qui intègre trois propriétés importantes: (i) le bruit peut être autocorrélé, (ii) le retard maximal au delà duquel l'autocorrélation est nulle peut être une fonction croissante de la fréquence journalière d'observations; (iii) le bruit peut avoir une composante correlée avec le rendement efficient. Cette dernière composante est alors dite endogène. Ce modèle se différencie de ceux existant en ceci qu'il implique que l'autocorrélation d'ordre 1 du bruit converge vers 1 lorsque la fréquence journalière d'observation tend vers l'infini. Nous utilisons le cadre semi-paramétrique ainsi défini pour dériver un nouvel estimateur de la volatilité intégrée baptisée "estimateur shrinkage". Cet estimateur se présente sous la forme d'une combinaison linéaire optimale de deux estimateurs aux propriétés différentes, l'optimalité étant défini en termes de minimisation de la variance. Les simulations indiquent que l'estimateur shrinkage a une variance plus petite que le meilleur des deux estimateurs initiaux. Des estimateurs sont également proposés pour les paramètres du modèle de microstructure. Nous clôturons ce chapitre par une application empirique basée sur des actifs du Dow Jones Industrials. Les résultats indiquent qu'il est pertinent de tenir compte de la dépendance temporelle du bruit de microstructure dans le processus d'estimation de la volatilité. Les chapitres 2, 3 et 4 s'inscrivent dans la littérature économétrique qui traite de la méthode des moments généralisés. En effet, on rencontre en finance des modèles dont la fonction de vraisemblance n'est pas connue. On peut citer en guise d'exemple la loi stable ainsi que les modèles de diffusion observés en temps discrets. Les méthodes d'inférence basées sur la fonction caractéristique peuvent être envisagées dans ces cas. Typiquement, on spécifie une condition de moment basée sur la différence entre la fonction caractéristique (conditionnelle) théorique et sa contrepartie empirique. Le défit ici est d'exploiter au mieux le continuum de conditions de moment ainsi spécifié pour atteindre la même efficacité que le maximum de vraisemblance dans les inférences. Ce défit a été relevé par Carrasco et Florens (2000) qui ont proposé la procédure CGMM (continuum GMM). La fonction objectif que ces auteurs proposent est une forme quadratique hilbertienne qui fait intervenir l'opérateur inverse de covariance associé au continuum de condition de moments. Cet opérateur inverse est régularisé à la Tikhonov pour en assurer l'existence globale et la continuité. Carrasco et Florens (2000) ont montré que l'estimateur obtenu en minimisant cette forme quadratique est asymptotiquement aussi efficace que l'estimateur du maximum de vraisemblance si le paramètre de régularisation (α) tend vers zéro lorsque la taille de l'échatillon tend vers l'infini. 
La nature de la fonction objectif du CGMM soulève deux questions importantes. La première est celle de la calibration de α en pratique, et la seconde est liée à la présence d'intégrales multiples dans l'expression de la fonction objectif. C'est à ces deux problématiques qu'essayent de répondent les trois derniers chapitres de la présente thèse. Dans le chapitre 2, nous proposons une méthode de calibration de α basée sur la minimisation de l'erreur quadratique moyenne (EQM) de l'estimateur. Nous suivons une approche similaire à celle de Newey et Smith (2004) pour calculer un développement d'ordre supérieur de l'EQM de l'estimateur CGMM de sorte à pouvoir examiner sa dépendance en α en échantillon fini. Nous proposons ensuite deux méthodes pour choisir α en pratique. La première se base sur le développement de l'EQM, et la seconde se base sur des simulations Monte Carlo. Nous montrons que la méthode Monte Carlo délivre un estimateur convergent de α optimal. Nos simulations confirment la pertinence de la calibration de α en pratique. Le chapitre 3 essaye de vulgariser la théorie du chapitre 2 pour les modèles univariés ou bivariés. Nous commençons par passer en revue les propriétés de convergence et de normalité asymptotique de l'estimateur CGMM. Nous proposons ensuite des recettes numériques pour l'implémentation. Enfin, nous conduisons des simulations Monte Carlo basée sur la loi stable. Ces simulations démontrent que le CGMM est une méthode fiable d'inférence. En guise d'application empirique, nous estimons par CGMM un modèle de variance autorégressif Gamma. Les résultats d'estimation confirment un résultat bien connu en finance: le rendement est positivement corrélé au risque espéré et négativement corrélé au choc sur la volatilité. Lorsqu'on implémente le CGMM, une difficulté majeure réside dans l'évaluation numérique itérative des intégrales multiples présentes dans la fonction objectif. Les méthodes de quadrature sont en principe parmi les plus précises que l'on puisse utiliser dans le présent contexte. Malheureusement, le nombre de points de quadrature augmente exponentiellement en fonction de la dimensionalité (d) des intégrales. L'utilisation du CGMM devient pratiquement impossible dans les modèles multivariés et non markoviens où d≥3. Dans le chapitre 4, nous proposons une procédure alternative baptisée "reéchantillonnage dans le domaine fréquentielle" qui consiste à fabriquer des échantillons univariés en prenant une combinaison linéaire des éléments du vecteur initial, les poids de la combinaison linéaire étant tirés aléatoirement dans un sous-espace normalisé de ℝ^{d}. Chaque échantillon ainsi généré est utilisé pour produire un estimateur du paramètre d'intérêt. L'estimateur final que nous proposons est une combinaison linéaire optimale de tous les estimateurs ainsi obtenus. Finalement, nous proposons une étude par simulation et une application empirique basées sur des modèles autorégressifs Gamma. Dans l'ensemble, nous faisons une utilisation intensive du bootstrap, une technique selon laquelle les propriétés statistiques d'une distribution inconnue peuvent être estimées à partir d'un estimé de cette distribution. Nos résultats empiriques peuvent donc en principe être améliorés en faisant appel aux connaissances les plus récentes dans le domaine du bootstrap. / In estimating the integrated volatility of financial assets using noisy high frequency data, the time series properties assumed for the microstructure noise determines the proper choice of the volatility estimator. 
In the first chapter of the current thesis, we propose a new model for the microstructure noise with three important features. First, our model assumes that the noise is L-dependent. Second, the memory lag L is allowed to increase with the sampling frequency. And third, the noise may include an endogenous part, that is, a piece that is correlated with the latent returns. The main difference between this microstructure model and existing ones is that it implies a first-order autocorrelation that converges to 1 as the sampling frequency goes to infinity. We use this semi-parametric model to derive a new shrinkage estimator for the integrated volatility. The proposed estimator makes an optimal signal-to-noise trade-off by combining a consistent estimator with an inconsistent one. Simulation results show that the shrinkage estimator behaves better than the better of the two combined ones. We also propose estimators for the parameters of the noise model. An empirical study based on stocks listed in the Dow Jones Industrials shows the relevance of accounting for possible time dependence in the noise process. Chapters 2, 3 and 4 pertain to the generalized method of moments based on the characteristic function. Indeed, the likelihood functions of many financial econometrics models are not known in closed form. For example, this is the case for the stable distribution and for discretely observed continuous-time models. In these cases, one may estimate the parameter of interest by specifying a moment condition based on the difference between the theoretical (conditional) characteristic function and its empirical counterpart. The challenge is then to exploit the whole continuum of moment conditions thus defined to achieve maximum-likelihood efficiency. This problem has been solved by Carrasco and Florens (2000), who propose the CGMM procedure. The objective function of the CGMM is a quadratic form on the Hilbert space defined by the moment function. That objective function depends on a Tikhonov-type regularized inverse of the covariance operator associated with the moment function. Carrasco and Florens (2000) have shown that the estimator obtained by minimizing this objective function is asymptotically as efficient as the maximum likelihood estimator provided that the regularization parameter (α) converges to zero as the sample size goes to infinity. However, the nature of this objective function raises two important questions. First, how do we select α in practice? And second, how do we implement the CGMM when the multiplicity (d) of the integrals embedded in the objective function is large? These questions are tackled in the last three chapters of the thesis. In Chapter 2, we propose to choose α by minimizing the approximate mean square error (MSE) of the estimator. Following an approach similar to Newey and Smith (2004), we derive a higher-order expansion of the estimator from which we characterize the finite-sample dependence of the MSE on α. We provide two data-driven methods for selecting the regularization parameter in practice. The first relies on the higher-order expansion of the MSE, whereas the second uses only simulations. We show that our simulation technique delivers a consistent estimator of α. Our Monte Carlo simulations confirm the importance of the optimal selection of α. The goal of Chapter 3 is to illustrate how to efficiently implement the CGMM for d≤2.
To start with, we review the consistency and asymptotic normality properties of the CGMM estimator. Next we suggest some numerical recipes for its implementation. Finally, we carry out a simulation study with the stable distribution that confirms the accuracy of the CGMM as an inference method. An empirical application based on the autoregressive variance Gamma model led to a well-known conclusion: investors require a positive premium for bearing the expected risk while a negative premium is attached to the unexpected risk. In implementing the characteristic function based CGMM, a major difficulty lies in the evaluation of the multiple integrals embedded in the objective function. Numerical quadratures are among the most accurate methods that can be used in the present context. Unfortunately, the number of quadrature points grows exponentially with d. When the data generating process is Markov or dependent, the accurate implementation of the CGMM becomes roughly unfeasible when d≥3. In Chapter 4, we propose a strategy that consists in creating univariate samples by taking a linear combination of the elements of the original vector process. The weights of the linear combinations are drawn from a normalized set of ℝ^{d}. Each univariate index generated in this way is called a frequency domain bootstrap sample that can be used to compute an estimator of the parameter of interest. Finally, all the possible estimators obtained in this fashion can be aggregated to obtain the final estimator. The optimal aggregation rule is discussed in the paper. The overall method is illustrated by a simulation study and an empirical application based on autoregressive Gamma models. This thesis makes an extensive use of the bootstrap, a technique according to which the statistical properties of an unknown distribution can be estimated from an estimate of that distribution. It is thus possible to improve our simulations and empirical results by using the state-of-the-art refinements of the bootstrap methodology.
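The shrinkage idea described for Chapter 1, an optimal linear combination of two integrated-volatility estimators, reduces to a simple weight formula once the error variances and covariance of the two estimators are known or estimated. The sketch below shows that generic combination; it illustrates the principle only, not the estimator actually derived in the thesis, and all numbers are placeholders.

```python
def shrinkage_weight(var1: float, var2: float, cov12: float) -> float:
    """
    Weight w minimizing the variance of w*T1 + (1 - w)*T2 for two estimators
    T1, T2 of the same quantity, given their error variances and covariance.
    """
    return (var2 - cov12) / (var1 + var2 - 2.0 * cov12)

# Purely illustrative numbers: a noisy estimator (var1) combined with a smoother one (var2).
w = shrinkage_weight(var1=0.8, var2=0.3, cov12=0.1)
t1, t2 = 1.05, 0.98          # hypothetical daily integrated-volatility estimates
combined = w * t1 + (1.0 - w) * t2
print(f"weight on first estimator: {w:.2f}, combined estimate: {combined:.3f}")
```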
