51

Mesure et Prévision de la Volatilité pour les Actifs Liquides / Measurement and Forecasting of Volatility for Liquid Assets

Chaker, Selma 04 1900
The observed high-frequency price series is contaminated with market microstructure frictions, or noise, while the efficient price is latent. We explore the measurement and forecasting of the fundamental volatility through novel approaches to the frictions problem. In the first paper, while maintaining the standard framework of an additive noise-frictionless price model, we use trading volume, quoted depths, the trade direction indicator, and the bid-ask spread to absorb the noise. The econometric model is a price-impact linear regression. We show that incorporating these liquidity-cost variables delivers more precise volatility estimators. If the noise is only partially absorbed, the remaining noise is closer to white noise than the original, which lessens misspecification of the noise characteristics. Our approach is also robust to a specific form of endogeneity under which common noise-robust measures are inconsistent. In the second paper, we model the variance of the market microstructure noise that contaminates the frictionless price as an affine function of the fundamental volatility. Under our model, the noise is time-varying intraday. Using the eigenfunction representation of the general class of stochastic volatility models, we quantify the forecasting performance of several volatility measures under our model assumptions. In the third paper, instead of assuming the standard additive model for the observed price series, we specify the conditional distribution of the frictionless price given the available information, which includes quotes and volumes. We derive new volatility measures by characterizing the conditional mean of the integrated variance.
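The additive-noise setting the first paper starts from can be illustrated numerically. The sketch below simulates an efficient log-price plus i.i.d. microstructure noise and shows why naive realized variance is biased upward at the highest frequencies — the problem that noise-absorbing regressions and noise-robust estimators address. All parameter values are invented for illustration; this is not the authors' estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# One trading day of an efficient log-price (Brownian motion) observed
# with additive i.i.d. microstructure noise, the standard model the
# first paper maintains. All parameters are illustrative.
n = 23400                      # one observation per second
sigma = 0.02                   # daily fundamental volatility
noise_std = 0.0005             # std of the microstructure noise

efficient = np.cumsum(rng.normal(0.0, sigma / np.sqrt(n), n))
observed = efficient + rng.normal(0.0, noise_std, n)

def realized_variance(log_prices, step):
    """Sum of squared returns sampled every `step` observations."""
    r = np.diff(log_prices[::step])
    return np.sum(r ** 2)

# At the highest frequency the noise dominates: realized variance is
# biased upward by roughly 2 * n * noise_std**2, while sparse sampling
# trades bias for statistical efficiency.
rv_fast = realized_variance(observed, 1)
rv_slow = realized_variance(observed, 300)   # 5-minute sampling
print(rv_fast, rv_slow, sigma ** 2)
```

The gap between the two estimates is what motivates using observable liquidity variables to soak up the noise rather than simply discarding most of the data.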
52

Limits to the Efficiency of the Capital Market

Vyhlídka, Jan January 2009
The aim of this study is to gather insights into market efficiency and the mechanisms at work in financial markets. It provides a framework, with an emphasis on liquidity and the failure of arbitrage, that deepens our understanding of various financial crises. The mechanisms described are particularly relevant to recent crises, including the 2007-2009 financial crisis, the collapse of LTCM, and the dot-com bubble. The first chapter introduces the concept of efficient markets. The second chapter challenges it from the point of view of noise trader theory and the limits of arbitrage. The third chapter deals with market microstructure and liquidity. The last chapter shows the importance and adverse effects of externalities, particularly those causing liquidity spirals.
53

Optimal liquidation in dark pools in discrete and continuous time

Kratz, Peter 30 August 2011
We study optimal trading strategies of a risk-averse investor who has to liquidate a portfolio within a finite time horizon [0,T]. The investor has the option to trade at a traditional exchange (the "primary venue"), which incurs price impact, and to place orders in a dark pool. The liquidity in dark pools is not openly displayed, and dark pools do not contribute to the price formation process: orders are executed at the price of the primary venue. Hence, they have no price impact, but their execution is uncertain. The investor thus faces a trade-off between the price impact costs at the primary venue and the indirect costs resulting from execution uncertainty in the dark pool. In a discrete-time market model, we consider a cost functional that incorporates expected price impact costs and market risk costs. For linear price impact it is linear-quadratic, and we obtain a recursion for the optimal trading strategy. For single-asset liquidation, the investor trades out of her position slowly at the primary venue, with the remainder being placed in the dark pool. For multi-asset liquidation this is not optimal because of the correlation of the assets. In the presence of adverse selection in the one-dimensional setting, the dark pool becomes less attractive. In continuous time, the liquidation constraint implies a singularity of the value function at the terminal time T. In the linear-quadratic case without adverse selection, the value function is described by the limit of a sequence of solutions of a matrix differential equation. By means of a matrix inequality, we obtain bounds on these solutions, the existence of the limit, and a verification argument via the HJB equation. In the presence of adverse selection, the value function has an unusual structure, which we obtain via extensive heuristic considerations: it is a quadratic "quasi-polynomial" whose coefficients depend on the asset position in a non-trivial way. We characterize the value function semi-explicitly and carry out a verification argument.
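The discrete-time trade-off can be sketched in a stylized single-asset toy version. Suppose quadratic impact cost eta*x**2 for trading x at the primary venue, a running risk cost gamma*y**2 on the unexecuted position y, and a dark pool that fills the posted remainder completely with probability p each period. This is a deliberate simplification of the thesis's model (all parameters and the all-or-nothing fill are assumptions), but the value function stays quadratic and the backward recursion is explicit:

```python
import numpy as np

# Stylized single-asset liquidation with a dark pool: with V_t(y) = a[t]*y**2,
# the Bellman step is V_t(y) = min_x [eta*x**2 + (gamma + (1-p)*a[t+1])*(y-x)**2],
# since with probability p the dark pool absorbs the remainder at no cost.
# Parameters are illustrative, not taken from the thesis.
def liquidation_recursion(T, eta, gamma, p):
    """Return a[t] (value-function coefficients) and the optimal fraction
    of the remaining position traded at the primary venue at each step."""
    a = np.zeros(T + 1)
    frac = np.zeros(T + 1)
    a[T] = eta          # at the deadline the whole remainder is dumped
    frac[T] = 1.0
    for t in range(T - 1, -1, -1):
        c = gamma + (1.0 - p) * a[t + 1]   # cost of the leftover position
        frac[t] = c / (eta + c)            # optimal fraction traded now
        a[t] = eta * c / (eta + c)
    return a, frac

a, frac = liquidation_recursion(T=10, eta=1.0, gamma=0.1, p=0.3)
# A dark pool that fills more often makes waiting cheaper, so less is
# traded at the primary venue early on:
a2, frac2 = liquidation_recursion(T=10, eta=1.0, gamma=0.1, p=0.8)
print(frac[0], frac2[0])
```

The comparison of the two runs reproduces, in miniature, the paper's qualitative finding: the more reliable the dark pool fill, the smaller the early primary-venue trades.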
54

Méthodes et modèles numériques appliqués aux risques du marché et à l’évaluation financière / Numerical methods and models in market risk and financial valuations area

Infante Acevedo, José Arturo 09 December 2013
This thesis addresses two topics: (i) a novel numerical method to price options on many assets, and (ii) liquidity risk, limit order book modeling, and market microstructure.

First topic: greedy algorithms and applications for solving partial differential equations in high dimension. Many problems of interest in various applications (material sciences, finance, etc.) involve high-dimensional partial differential equations (PDEs). The typical example in finance is the pricing of a basket option, which can be obtained by solving the Black-Scholes PDE whose dimension is the number of underlying assets. We investigate an algorithm recently proposed and analyzed in [ACKM06, BLM09] to solve such problems and try to circumvent the curse of dimensionality. The idea is to represent the solution as a sum of tensor products and to compute the terms of this sum iteratively using a greedy algorithm. The resolution of high-dimensional PDEs is closely related to the representation of high-dimensional functions. In Chapter 1, we describe various approaches for representing high-dimensional functions and introduce the high-dimensional problems in finance addressed in this work. The method studied in this manuscript is a non-linear approximation method called the Proper Generalized Decomposition (PGD). Chapter 2 shows the application of this method to approximate the solution of a linear PDE (the Poisson problem) and to approximate a square-integrable function by a sum of tensor products. A numerical study of this last problem is presented in Chapter 3. The Poisson problem and the approximation of a square-integrable function serve as a basis in Chapter 4 for solving the Black-Scholes equation using the PGD approach. In numerical experiments, we obtain results for up to 10 underlyings. Beyond approximating the solution of the Black-Scholes equation, we also propose a variance reduction method for classical Monte Carlo pricing of financial options.

Second topic: liquidity risk, limit order book modeling, and market microstructure. Liquidity risk and market microstructure have become important topics in mathematical finance in recent years. One possible reason is the deregulation of markets and the competition between them to attract as many investors as possible. Thus, quotation rules are changing and, in general, more information is available. In particular, it is possible to know at each instant the orders awaiting execution on some stocks and to have a record of all past transactions. In this work, we study how to use this information to optimally execute buy or sell orders, which is linked to the behaviour of traders who want to minimize their trading costs. Orders can only be placed on a price grid, and at each instant the number of pending buy (or sell) orders at each price is recorded. In [AFS10], Alfonsi, Fruth, and Schied proposed a simple limit order book (LOB) model in which it is possible to derive explicitly the optimal strategy for buying (or selling) a given amount of shares before a given deadline. Basically, one has to split the large buy (or sell) order into smaller ones in order to find the best trade-off between attracting new orders and the price of the orders. This thesis focuses on an extension of the LOB model with general shape introduced by Alfonsi, Fruth, and Schied. The novelty is to allow the depth of the order book to vary with time, a feature of the LOB highlighted in [JJ88, GM92, HH95, KW96]. We solve the optimal execution problem in this framework for both discrete- and continuous-time strategies. This gives, in particular, sufficient conditions to exclude price manipulations in the sense of Huberman and Stanzl [HS04] or transaction-triggered price manipulations (see Alfonsi, Schied, and Slynko). These conditions give interesting qualitative insights into how market makers may create price manipulations.
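The greedy sum-of-tensor-products idea behind the PGD can be shown in its simplest discrete form: approximate a function sampled on a 2-D grid by rank-one terms u_k ⊗ v_k, added one at a time via alternating least-squares sweeps on the residual. This is only a toy illustration of the representation, not the thesis's PDE solver; the test function and grid are invented.

```python
import numpy as np

# Greedy rank-one approximation: each new term is the best rank-one fit
# to the current residual, found by alternating least squares (for fixed
# u, the optimal v is R.T @ u / (u @ u), and symmetrically for u).
def greedy_rank_one_sum(F, n_terms, n_sweeps=50):
    R = F.copy()                      # residual after the terms so far
    terms = []
    for _ in range(n_terms):
        u = np.ones(F.shape[0])
        v = np.ones(F.shape[1])
        for _ in range(n_sweeps):     # alternating minimization
            v = R.T @ u / (u @ u)
            u = R @ v / (v @ v)
        terms.append((u, v))
        R = R - np.outer(u, v)        # deflate and continue greedily
    return terms, R

x = np.linspace(0.0, 1.0, 60)
y = np.linspace(0.0, 1.0, 60)
F = 1.0 / (1.0 + np.add.outer(x, y))          # a smooth, non-separable function
terms, residual = greedy_rank_one_sum(F, n_terms=3)
err = np.linalg.norm(residual) / np.linalg.norm(F)
print(err)   # decays quickly with the number of terms for smooth functions
```

The same mechanism, with each grid axis playing the role of one underlying asset, is what makes sum-of-tensor-product representations attractive against the curse of dimensionality.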
55

Essays on market microstructure : empirical evidence from some Nordic exchanges

Niemeyer, Jonas January 1994
This dissertation consists of five separate and self-contained essays. They have been written as distinct papers, and although there is a fair amount of overlap and cross-reference in analysis and discussion, the intention is that readers should be able to read them separately.

Essay 1: An Empirical Analysis of the Trading Structure at the Stockholm Stock Exchange. This essay describes and analyzes the trading structure at the Stockholm Stock Exchange. In the empirical part, we report stylized facts based on intraday transaction and order book data, focusing on the intraday behavior of returns, trading activity, order placement and the bid/ask spread, on the importance of the tick size, and on some characteristics of the limit order book. Our main empirical conclusions are that a) the intraday U-shape in trading activity found in earlier U.S. studies on the whole also pertains to the Stockholm Stock Exchange, b) limit order placement also follows an intraday U-shape, c) there is no distinct intraday pattern in returns, d) volatility and the bid/ask spread seem to be higher at the beginning of the trading day, e) the tick size is economically important, and f) the price impact of an order is a nonlinear function of its quantity, implying price-inelastic demand and supply.

Essay 2: An Empirical Analysis of the Trading Structure at the Stockholm Options and Forwards Exchange, OM. We first describe and analyze the trading structure at the Stockholm Options and Forwards Exchange, OM Stockholm. It is characterized by some interesting market microstructure features, such as a high degree of transparency in a fully computerized trading system and the possibility to submit combination orders. We also present empirical results from tests on the intra- and interday trading volume of the OMX index derivatives, both in terms of the number of contracts traded and the number of transactions. There is evidence of a high degree of intraday variation in trading volume and some interday variation. The extension of the trading hours of the underlying stocks during the studied period should, according to modern trade concentration models, affect the distribution of trading across the day. Although no formal test of the models is possible with this data set, we are able to shed some supportive additional light on all of them.

Essay 3: Tick Size, Market Liquidity and Trading Volume: Evidence from the Stockholm Stock Exchange (co-authored with Patrik Sandås). The regulated tick size at a securities exchange puts a lower bound on the bid/ask spread. We use cross-sectional and cross-daily data from the Stockholm Stock Exchange to assess whether this lower bound is economically important and whether it has any direct effect on market depth and traded volume. We find a) strong support that the tick size is positively correlated with market depth and b) some support that it is negatively related to traded volume. We identify different groups of agents to whom a lower tick size would be beneficial and to whom it would be detrimental.

Essay 4: An Analysis of the Lead-Lag Relationship between the OMX Index Forwards and the OMX Cash Index. This essay investigates the intraday lead-lag structure in returns between the OMX cash index on the one hand and the OMX index forwards and the OMX synthetic index forwards on the other. The data set covers 22 months, from December 1991 to September 1993, and is divided into three sub-periods. The main conclusion is that there is a high degree of bidirectional interdependence, with both series Granger-causing each other. Using a Sims test, we find that the forwards as well as the synthetic forwards lead the cash index by between fifteen and thirty minutes, while the cash index leads the forwards by about ten to fifteen minutes. This implies a longer lead from the cash index to the forwards than in previous studies. The strong interdependence could possibly be due to higher transaction costs, lower liquidity in the forward market, and the specific trading environments used for Swedish securities.

Essay 5: Order Flow Dynamics: Evidence from the Helsinki Stock Exchange (co-authored with Kaj Hedvall). This essay investigates the dynamics of the order flow in a limit order book. In contrast to previous studies, our data set from the Helsinki Stock Exchange encompasses the entire order book structure, including dealer identities. This enables us to focus on the order behavior of individual dealers. We classify the events in the order book and study the structure of subsequent events using contingency tables. In particular, the structure of subsequent events initiated by the same dealer is compared to the overall event structure. We find that order splitting is more frequent than order imitation. Furthermore, if the spread increases as a result of a trade, other dealers quickly restore it by submitting new limit orders. One conclusion is therefore that there exists a body of potential limit orders outside the formal limit order book and that there is a high degree of resiliency in our limit order book market. As a logical consequence, a large dealer strategically splits his order in order for the market to supply additional liquidity. One interpretation of our results is that a limit order book market can accommodate larger orders than is first apparent from the outstanding limit orders. Another is that a limit order book structure gives room for informed traders to successively trade on their information. A third is that prices only slowly incorporate new information. / Diss. Stockholm : Handelshögskolan, 1994
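The lead-lag measurement at the heart of Essay 4 boils down to computing correlations between one return series and lagged values of another. The sketch below builds in a known lead of a simulated "forward" series over a "cash" series and recovers it; the data-generating process and all parameters are invented and have nothing to do with the actual OMX data.

```python
import numpy as np

# Simulate two return series driven by a common factor, with the "cash"
# series reacting three steps after the "forward" series.
rng = np.random.default_rng(1)
n, true_lag = 5000, 3

common = rng.normal(size=n + true_lag)
forward = common[true_lag:] + 0.5 * rng.normal(size=n)
cash = common[:n] + 0.5 * rng.normal(size=n)   # lags the forward by 3 steps

def lagged_corr(x, y, lag):
    """corr(x_t, y_{t-lag}); a positive lag means y leads x."""
    if lag > 0:
        x, y = x[lag:], y[:-lag]
    elif lag < 0:
        x, y = x[:lag], y[-lag:]
    return np.corrcoef(x, y)[0, 1]

lags = list(range(-10, 11))
corrs = [lagged_corr(cash, forward, k) for k in lags]
best = lags[int(np.argmax(corrs))]
print(best)   # -> 3, the built-in lead of the forward series
```

A Sims test, as used in the essay, formalizes the same idea by regressing one series on past and future values of the other and testing whether the future coefficients are jointly zero.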
56

Contrôle optimal dans des carnets d'ordres limites / Optimal control in limit order books

Guilbaud, Fabien 01 February 2013
We propose a quantitative treatment of various problems in high-frequency trading, addressing several aspects of the practice, from minimizing indirect trading costs to market making and, more generally, profit-maximization strategies over a finite time horizon. We establish an original framework that reflects the specific features of high-frequency trading, notably the distinction between passive and active trading, using mixed stochastic control methods. Particular care is taken in modeling high-frequency market phenomena, and for each of them we propose calibration methods compatible with the practical constraints of algorithmic trading.
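The passive/active distinction can be illustrated with a bare-bones quoting policy: the dealer posts bid and ask quotes skewed by inventory (passive trading) and crosses the spread (active trading) only when inventory breaches a limit. This is a toy heuristic, not the optimal mixed stochastic control studied in the thesis; all parameters and the policy itself are assumptions for illustration.

```python
def quotes(mid, inventory, half_spread=0.05, skew=0.01, limit=10):
    """Return (bid, ask, active_order). Inventory shifts both quotes so
    the dealer is more likely to trade back toward zero; beyond the
    limit, an aggressive order unwinds one unit immediately."""
    shift = -skew * inventory           # long inventory lowers both quotes
    bid = mid - half_spread + shift
    ask = mid + half_spread + shift
    active = 0
    if inventory > limit:
        active = -1                     # sell aggressively (hit the bid)
    elif inventory < -limit:
        active = +1                     # buy aggressively (lift the ask)
    return bid, ask, active

bid, ask, active = quotes(mid=100.0, inventory=12)
print(bid, ask, active)   # quotes shifted down, plus one aggressive sell
```

The optimal policy in the thesis arises from a control problem rather than fixed thresholds, but it shares this structure: quote placement most of the time, with occasional aggressive orders when the inventory risk dominates.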
58

Essays in Market Microstructure

Hoffmann, Peter 13 July 2011
This thesis covers three topics in market microstructure. Chapter 1 demonstrates that market access frictions may play a significant role in the competition between trading platforms. Analyzing a recent dataset of trading activity in French and German stocks, we provide evidence that the incumbent markets dominate because the sole market entrant exposes liquidity providers to excessive adverse selection risk due to a lack of noise traders. Chapter 2 presents a theoretical model of price formation in a dynamic limit order market with slow human traders and fast algorithmic traders. We show that in most cases, algorithmic trading has a detrimental effect on human traders' welfare. Finally, Chapter 3 empirically analyzes the role of pre-trade transparency in call auctions. Comparing the trading mechanisms in place on the French and German stock exchanges, we find that transparency is associated with higher trading volume, greater liquidity, and better price discovery.
59

Is high-frequency trading a threat to financial stability?

Virgilio, Gianluca January 2017
The purpose of this thesis is: (i) to produce an in-depth data analysis and computer-based simulations of the market environment to investigate whether financial stability is affected by the presence of high-frequency investors; (ii) to verify how high-frequency trading and financial stability interact under non-linear conditions; (iii) to examine whether non-illicit behaviours can still lead to potentially destabilising effects; and (iv) to provide quantitative support for these theses, either from audit trail data or from simulations. Simulations test whether high-frequency trading: (a) has an impact on market volatility, (b) leads to the market splitting into two tiers, and (c) takes the lion's share of arbitrage opportunities. Audit trail data is analysed to verify some hypotheses on the dynamics of the Flash Crash. The simulation on the impact of high-frequency trading on market volatility confirms that when markets are under stress, high-frequency trading may cause volatility to increase significantly. However, as the number of ultra-fast participants increases, this phenomenon tends to disappear and volatility realigns with its standard values. The market tiering simulation suggests that high-frequency traders have some tendency to deal with each other, which in turn causes low-frequency traders to deal with other slow traders, albeit to a lesser extent. This, too, is a form of market instability. High-frequency trading potentially allows a few fast traders to capture all the arbitrage-led profits, falsifying the Efficient Market Hypothesis. This phenomenon may disappear as more high-frequency traders enter the competition, leading to declining profits; yet the whole matter seems to be a dispute over abnormal gains among a few sub-second traders. All simulations have been carefully designed to provide robust results: the behaviours simulated are drawn from the existing literature, and simplifying assumptions have been kept to a minimum. This maximises the reliability of the results and minimises the potential for bias. Finally, the data analysis suggests that the impact of high-frequency trading on the Flash Crash was significant; other sudden crashes have occurred since, and more can be expected in the near future. Overall, it can be concluded that high-frequency trading shows some controversial aspects impacting financial stability. The results are to a certain extent confirmed by the audit trail data analysis, although only indirectly, since the details allowing high-frequency traders to be matched to their behaviour are confidential and not publicly available. Nevertheless, the findings about HFT-induced volatility, market segmentation, and sub-optimal market efficiency, albeit not definitive, suggest that careful monitoring by regulators and policy-makers may be required.
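The volatility question in the first simulation can be caricatured with a deliberately crude agent-based sketch: slow agents trade toward a fixed fundamental value, fast agents chase the last price move, and the price responds linearly to net order flow. The mechanism and every parameter here are assumptions made for illustration and do not reproduce the thesis's simulation design; the sketch only shows that momentum-chasing flow mechanically amplifies price moves.

```python
import numpy as np

def simulate_volatility(n_fast, n_slow=50, steps=2000, seed=2):
    """Std of price moves when n_fast momentum-chasers trade against
    n_slow fundamental traders, with linear price impact of net flow."""
    rng = np.random.default_rng(seed)
    price, fundamental, last_move = 100.0, 100.0, 0.0
    moves = []
    for _ in range(steps):
        slow_flow = n_slow * 0.05 * (fundamental - price)  # mean-reverting
        fast_flow = n_fast * last_move                     # chase the move
        move = 0.01 * (slow_flow + fast_flow + rng.normal())
        price += move
        last_move = move
        moves.append(move)
    return np.std(moves)

vol_no_hft = simulate_volatility(n_fast=0)
vol_hft = simulate_volatility(n_fast=60)
print(vol_no_hft, vol_hft)   # positive feedback raises the move size
```

Whether real high-frequency traders behave this way under stress is exactly what the thesis's richer simulations and audit trail analysis investigate; the toy model only isolates the feedback channel.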
60

Clustering in foreign exchange markets : price, trades and traders / Clustering sur les marchés FX : prix, trades et traders

Lallouache, Mehdi 10 July 2015 (has links)
The aim of this thesis is to study three types of clustering in foreign exchange markets, namely in prices, trade arrivals and investors' decisions. We investigate the statistical properties of the EBS order book for the EUR/USD and USD/JPY currency pairs and the impact of a ten-fold tick size reduction on its dynamics. A large fraction of limit orders is still placed right at or halfway between the old allowed prices. This generates price barriers where the best quotes lie for much of the time, which causes the emergence of distinct peaks in the average shape of the book at round distances. Furthermore, we argue that this clustering is mainly due to manual traders who remained set to the old price resolution.
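The price clustering just described can be quantified by checking which grid each quote sits on. The sketch below assumes illustrative tick values for a ten-fold reduction (1e-4 to 1e-5, i.e. pips to decimal pips), not the EBS feed's actual parameters:

```python
def old_grid_fraction(prices, old_tick=1e-4, new_tick=1e-5):
    """Return the share of quotes sitting exactly on the pre-reduction
    grid (multiples of old_tick) and the share exactly halfway between
    two old prices. Tick values are illustrative assumptions."""
    ratio = round(old_tick / new_tick)       # 10 for a ten-fold reduction
    on_old = on_half = 0
    for p in prices:
        steps = round(p / new_tick)          # price as an integer number of new ticks
        if steps % ratio == 0:
            on_old += 1
        elif 2 * (steps % ratio) == ratio:   # exactly halfway between old prices
            on_half += 1
    return on_old / len(prices), on_half / len(prices)
```

Applied to a real quote stream, a combined fraction far above the 1/ratio expected under uniform price use would signal the congestion effect the abstract reports.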
Automatic traders easily take price priority by submitting limit orders one tick ahead of clusters, as shown by the prominence of buy (sell) limit orders posted with rightmost digit one (nine). The clustering of trade arrivals is well known in financial markets, and Hawkes processes are particularly suited to describing this phenomenon. We raise the question of exactly what part of market dynamics Hawkes processes are able to account for. We document the accuracy of such processes as one varies the calibration time interval, and compare the performance of various types of kernels made up of sums of exponentials. Because of their around-the-clock opening times, FX markets are ideally suited to our aim, as they allow us to avoid the complications of the long daily overnight closures of equity markets. One can achieve statistical significance according to three simultaneous tests provided that one uses kernels with two exponentials when fitting an hour at a time, and two or three exponentials for full days; longer periods could not be fitted to statistical satisfaction because of the non-stationarity of the endogenous process. Fitted timescales are relatively short, and the endogeneity factor is high but sub-critical, at about 0.8. Most agent-based models of financial markets implicitly assume that agents interact through asset prices and exchanged volumes. Some of them add an explicit trader-trader interaction network on which rumors propagate, or one that encodes groups taking common decisions. Contrary to other types of data, such networks, if they exist, are necessarily implicit, which makes their determination a more challenging task. We analyze the transaction data of all the clients of two liquidity providers, encompassing several years of trading. By assuming that the links between agents are determined by systematic simultaneous activity or inactivity, we show that interaction networks do exist.
In addition, we find that the (in)activity of some agents systematically triggers the (in)activity of other traders, defining lead-lag relationships between the agents. This implies that the global investment flow is predictable, which we check using sophisticated machine learning methods.
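The self-exciting trade-arrival dynamics discussed in this abstract can be sketched with a branching (cluster) simulation of a Hawkes process with a two-exponential kernel. The 0.8 endogeneity (branching) ratio matches the sub-critical level reported above; the immigrant rate, kernel amplitudes and decay rates are illustrative assumptions:

```python
import math
import random

def _poisson(rng, lam):
    # Knuth's multiplication method, adequate for the small lam used here.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_hawkes(mu=0.5, T=2000.0, kernels=((0.5, 1.0), (0.3, 10.0)), seed=3):
    """Branching simulation of a Hawkes process with kernel
    phi(t) = sum_i a_i * b_i * exp(-b_i * t), kernels = ((a_1, b_1), (a_2, b_2)).
    The endogeneity factor sum_i a_i is 0.8 here, the sub-critical level
    reported in the thesis; other parameter values are illustrative."""
    rng = random.Random(seed)
    assert sum(a for a, _ in kernels) < 1.0, "process must be sub-critical"
    # Immigrants: homogeneous Poisson(mu) arrivals on [0, T].
    immigrants, t = [], 0.0
    while True:
        t += rng.expovariate(mu)
        if t >= T:
            break
        immigrants.append(t)
    # Each event spawns Poisson(a_i) offspring per kernel, at Exp(b_i) lags.
    events, stack = list(immigrants), list(immigrants)
    while stack:
        parent = stack.pop()
        for a, b in kernels:
            for _ in range(_poisson(rng, a)):
                child = parent + rng.expovariate(b)
                if child < T:
                    events.append(child)
                    stack.append(child)
    return sorted(events), len(immigrants)

events, n_imm = simulate_hawkes()
endogenous_share = 1.0 - n_imm / len(events)  # close to the 0.8 branching ratio
```

In this representation the endogeneity factor is simply the expected number of children per event, so the fraction of non-immigrant events concentrates near 0.8, the regime the abstract describes as high but sub-critical.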
