11 |
Community Hawkes Models for Continuous-time Networks
Soliman, Hadeel, 15 September 2022 (has links)
No description available.
|
12 |
Quoting behaviour of a market-maker under different exchange fee structures
Kiseľ, Rastislav, January 2018 (has links)
During the last few years, market microstructure research has been active in analysing the dependence of market efficiency on different market characteristics. Make-take fees are one of those topics as they might modify the incentives for participating agents, e.g. broker-dealers or market-makers. In this thesis, we propose a Hawkes process-based model that captures statistical differences arising from different fee regimes and we estimate the differences on limit order book data. We then use these estimates in an attempt to measure the execution quality from the perspective of a market-maker. We appropriate existing theoretical market frameworks; however, for the purpose of finding optimal market-making policies we apply a novel method of deep reinforcement learning. Our results suggest, firstly, that maker-taker exchanges provide better liquidity to the markets, and secondly, that deep reinforcement learning methods may be successfully applied to the domain of optimal market-making.
JEL Classification: C32, C45, C61, C63
Keywords: make-take fees, Hawkes process, limit order book, market-making, deep reinforcement learning
Author's e-mail: kiselrastislav@gmail.com
Supervisor's e-mail: barunik@fsv.cuni.cz
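Since every entry on this page builds on the same object, a minimal, self-contained sketch of a univariate Hawkes process may help: the intensity below uses an exponential kernel and is simulated with Ogata's thinning algorithm. The kernel choice and parameter values are illustrative assumptions, not the specification used in the thesis above.

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, horizon, seed=None):
    """Simulate a univariate Hawkes process with intensity
    lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))
    using Ogata's thinning algorithm."""
    rng = np.random.default_rng(seed)
    events = []
    t = 0.0
    while t < horizon:
        # The intensity decays between events, so its value just after t
        # upper-bounds it until the next accepted event.
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        t += rng.exponential(1.0 / lam_bar)
        if t >= horizon:
            break
        lam_t = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        if rng.uniform() <= lam_t / lam_bar:  # accept with probability lambda(t)/lam_bar
            events.append(t)
    return np.array(events)

# Example: branching ratio alpha/beta = 0.5, i.e. moderately self-exciting order flow
times = simulate_hawkes(mu=0.5, alpha=1.0, beta=2.0, horizon=100.0, seed=42)
print(len(times), "events on [0, 100]")
```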
|
13 |
Causalité des marchés financiers : asymétrie temporelle et réseaux multi-échelles de meneurs et suiveurs / Causality in financial markets : time reversal asymmetry and multi-scale lead-lag networks
Cordi, Marcus, 07 March 2019 (has links)
Cette thèse a pour but d’explorer la structure de causalité qui sous-tend les marchés financiers. Elle se concentre sur l’inférence multi-échelle de réseaux de causalité entre investisseurs dans deux bases de données contenant les identifiants des investisseurs. La première partie de cette thèse est consacrée à l’étude de la causalité dans les processus de Hawkes. Ces derniers définissent la façon dont l’activité d’un investisseur (par exemple) dépend du passé; sa version multivariée inclut l’interaction entre séries temporelles, à toutes les échelles. Les résultats principaux de cette partie sont que l’estimation par maximum de vraisemblance des paramètres du processus change remarquablement peu lorsque la direction du temps est inversée, tant pour les processus univariés que pour les processus multivariés avec noyaux d’influence mutuelle symétriques, et que la causalité effective de ces processus dépend de leur endogénéité. Cela implique qu’on ne peut pas utiliser ce type de processus pour l’inférence de causalité sans précautions. L’utilisation de tests statistiques permet la différenciation des directions du temps pour de longues données synthétiques. Par contre, l’analyse de données empiriques est plus problématique : il est tout à fait possible de trouver des données financières pour lesquelles la vraisemblance des processus de Hawkes est plus grande si le temps s’écoule en sens inverse. Les processus de Hawkes multivariés avec noyaux d’influence asymétriques ne sont pas affectés par une faible causalité. Il est malheureusement difficile de les calibrer aux actions individuelles des investisseurs présents dans nos bases de données, pour deux raisons. Nous avons soigneusement vérifié que l’activité des investisseurs est hautement non stationnaire et qu’on ne peut pas supposer que leur activité est localement stationnaire, faute de données en nombre suffisant, bien que nos bases de données contiennent chacune plus de 1 million de transactions. Ces problèmes sont renforcés par le fait que les noyaux dans les processus de Hawkes codent l’influence mutuelle des investisseurs pour toutes les échelles de temps simultanément. Afin de pallier ce problème, la deuxième partie de cette thèse se concentre sur la causalité entre des échelles de temps spécifiques. Un filtrage supplémentaire est obtenu en réduisant le nombre effectif d’investisseurs grâce aux Réseaux Statistiquement Validés. Ces derniers sont utilisés pour catégoriser les investisseurs, qui sont groupés selon le degré de synchronisation de leurs actions (achat, vente, neutre) dans des intervalles déterminés à une échelle temporelle donnée. Cette partie propose une méthode pour l’inférence de réseaux de meneurs et suiveurs déterminés à une échelle de temps donnée dans le passé et à une autre dans le futur. Trois variations de cette méthode sont étudiées. Cette méthode permet de caractériser la causalité d’une façon novatrice. Nous avons comparé l’asymétrie temporelle des actions des investisseurs et celle de la volatilité des prix, et concluons que la structure de causalité des investisseurs est considérablement plus complexe que celle de la volatilité.
De façon attendue, les investisseurs institutionnels, dont l’impact sur l’évolution des prix est beaucoup plus grand que celui des clients privés, ont une structure causale proche de celle de la volatilité : en effet, la volatilité, étant une quantité macroscopique, est le résultat d’une agrégation des comportements de tous les investisseurs, qui fait disparaître la structure causale des investisseurs privés. / This thesis aims to uncover the underlying causality structure of financial markets by focusing on the inference of investor causal networks at multiple timescales in two trader-resolved datasets. The first part of this thesis is devoted to the causal strength of Hawkes processes. These processes describe in a clearly causal way how the activity rate of e.g. an investor depends on his past activity rate; its multivariate version also makes it possible to include the interactions between the agents, at all time scales. The main result of this part is that the classical MLE estimation of the process parameters does not vary significantly if the arrow of time is reversed in the univariate and symmetric multivariate case. This means that blindly trusting univariate and symmetric multivariate Hawkes processes to infer causality from data is problematic. In addition, we find a dependency between the level of causality in the process and its endogeneity. For long time series of synthetic data, one can discriminate between the forward and backward arrows of time by performing rigorous statistical tests on the processes, but for empirical data the situation is much more ambiguous, as it is entirely possible to find a better Hawkes process fit when time runs backwards compared to forwards. Asymmetric Hawkes processes do not suffer from very weak causality. Fitting them to the individual traders’ actions found in our datasets is unfortunately not very successful for two reasons. We carefully checked that traders’ actions in both datasets are highly non-stationary, and that local stationarity cannot be assumed to hold as there is simply not enough data, even if each dataset contains about one million trades. This is also compounded by the fact that Hawkes processes encode the pairwise influence of traders for all timescales simultaneously. In order to alleviate this problem, the second part of this thesis focuses on causality between specific pairs of timescales. Further filtering is achieved by reducing the effective number of investors; Statistically Validated Networks are applied to cluster investors into groups based on the statistically high synchronisation of their actions (buy, sell or neutral) in time intervals of a given timescale. This part then generalizes single-timescale lead-lag SVNs to lead-lag networks between two timescales and introduces three slightly different methods. These methods make it possible to characterize causality in a novel way. We are able to compare the time reversal asymmetry of trader activity and that of price volatility, and conclude that the causal structure of trader activity is considerably more complex than that of the volatility for a given category of traders. Expectedly, institutional traders, whose impact on prices is much larger than that of retail clients, have a causality structure that is closer to that of volatility. This is because volatility, being a macroscopic quantity, aggregates the behaviour of all types of traders, thereby hiding the causality structure of minor players.
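To make the time-reversal experiment described above concrete, here is a hedged sketch of maximum-likelihood estimation for a univariate Hawkes process with an exponential kernel, fitted once to the observed event times and once to their time-reversed counterpart. The kernel, the optimizer settings and the input file name are illustrative assumptions; the thesis also treats multivariate processes and other kernels.

```python
import numpy as np
from scipy.optimize import minimize

def hawkes_neg_loglik(log_params, times, T):
    """Negative log-likelihood of a univariate Hawkes process with
    lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))."""
    mu, alpha, beta = np.exp(log_params)  # log-parametrisation keeps parameters positive
    A, prev, loglik = 0.0, None, 0.0
    for t in times:
        if prev is not None:
            A = np.exp(-beta * (t - prev)) * (A + 1.0)  # Ozaki recursion
        loglik += np.log(mu + alpha * A)
        prev = t
    compensator = mu * T + (alpha / beta) * np.sum(1.0 - np.exp(-beta * (T - times)))
    return -(loglik - compensator)

def fit(times, T):
    res = minimize(hawkes_neg_loglik, x0=np.log([0.5, 0.5, 1.0]),
                   args=(np.asarray(times), T), method="Nelder-Mead")
    return np.exp(res.x)  # (mu, alpha, beta)

times = np.sort(np.loadtxt("event_times.txt"))  # hypothetical file of event times
T = times[-1] + 1.0
print("forward :", fit(times, T))
print("backward:", fit(np.sort(T - times), T))  # same events, arrow of time reversed
```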
|
14 |
New Spatio-temporal Hawkes Process Models For Social Good
Wen-Hao Chiang (12476658), 28 April 2022 (has links)
As more and more datasets with self-exciting properties become available, the demand for robust models that capture contagion across events is also getting stronger. Hawkes processes stand out given their ability to capture a wide range of contagion and self-excitation patterns, including the transmission of infectious disease, earthquake aftershock distributions, near-repeat crime patterns, and overdose clusters. The Hawkes process is flexible in modeling these various applications through parametric and non-parametric kernels that model event dependencies in space, time and on networks.
In this thesis, we develop new frameworks that integrate Hawkes process models with multi-armed bandit algorithms, high-dimensional marks, and high-dimensional auxiliary data to solve problems in search and rescue, forecasting infectious disease, and early detection of overdose spikes.
In Chapter 3, we develop a method with applications to the crisis of increasing overdose mortality over the last decade. We first encode the molecular substructures found in a drug overdose toxicology report. We then cluster these overdose encodings into different overdose categories and model these categories with spatio-temporal multivariate Hawkes processes. Our results demonstrate that the proposed methodology can improve estimation of the magnitude of an overdose spike based on the substances found in an initial overdose.
In Chapter 4, we build a framework for multi-armed bandit problems arising in event detection where the underlying process is self-exciting. We derive the expected number of events for Hawkes processes given a parametric model for the intensity and then analyze the regret bound of a Hawkes process UCB-normal algorithm. By introducing Hawkes process modeling into the upper confidence bound construction, our models can detect more events of interest under the multi-armed bandit problem setting. We apply the Hawkes bandit model to spatio-temporal data on crime events and earthquake aftershocks. We show that the model can quickly learn to detect hotspot regions, when events are unobserved, while striking a balance between exploitation and exploration.
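The sketch below shows only the generic shape of the upper-confidence-bound loop referred to in Chapter 4: each round a region is chosen, an event count is observed, and the estimate for that region is updated. In the thesis the expected counts and confidence widths come from a fitted Hawkes intensity; here the environment is a plain black-box callback, so this is a skeleton rather than the actual Hawkes UCB-normal algorithm.

```python
import numpy as np

def ucb_event_detection(observe, n_arms, n_rounds, c=2.0, seed=None):
    """Generic UCB loop for choosing which region to monitor each round.
    `observe(arm, rng)` returns the (random) number of events seen when the
    given region is monitored; in a Hawkes-based version this reward model
    would be replaced by the process's expected event count."""
    rng = np.random.default_rng(seed)
    pulls = np.zeros(n_arms)
    means = np.zeros(n_arms)
    total = 0
    for t in range(1, n_rounds + 1):
        if t <= n_arms:                      # play each arm once to initialise
            arm = t - 1
        else:
            ucb = means + c * np.sqrt(np.log(t) / pulls)
            arm = int(np.argmax(ucb))
        reward = observe(arm, rng)           # events detected in the chosen region
        pulls[arm] += 1
        means[arm] += (reward - means[arm]) / pulls[arm]
        total += reward
    return total, means

# Toy environment: region 2 is the (unknown) hotspot
rates = [0.5, 1.0, 3.0, 0.8]
total, est = ucb_event_detection(lambda a, g: g.poisson(rates[a]), len(rates), 500)
print(total, est)
```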
In Chapter 5, we present a new spatio-temporal framework for integrating Hawkes processes with multi-armed bandit algorithms. Compared to the methods proposed in Chapter 4, the upper confidence bound is constructed through Bayesian estimation of a spatial Hawkes process to balance the trade-off between exploiting and exploring geographic regions. The model is validated on simulated datasets and real-world datasets such as flooding events and improvised explosive device (IED) attack records. The experimental results show that our model outperforms baseline spatial MAB algorithms in terms of reward and ranking metrics.
In Chapter 6, we demonstrate that the Hawkes process is a powerful tool for modeling infectious disease transmission. We develop models using Hawkes processes with spatio-temporal covariates to forecast COVID-19 transmission at the county level. In the proposed framework, we show how to estimate the dynamic reproduction number of the virus within an EM algorithm through a regression on Google mobility indices. We also include demographic covariates as spatial information to enhance accuracy. The approach is tested on both short-term and long-term forecasting tasks. The results show that the Hawkes process outperforms several benchmark models published in a public forecast repository. The model also provides insights into important covariates and mobility factors that impact COVID-19 transmission in the U.S.
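A minimal sketch of the EM idea mentioned in Chapter 6, reduced to a plain temporal Hawkes model with an exponential kernel and a static reproduction number R: the E-step assigns each event a probability of being exogenous or triggered by an earlier event, and the M-step re-estimates (mu, R, beta) from those probabilities. The dynamic, mobility-driven R and the spatial covariates used in the thesis are not reproduced here; all names and the kernel are assumptions.

```python
import numpy as np

def em_hawkes_reproduction(times, T, n_iter=100):
    """EM estimates of (mu, R, beta) for
    lambda(t) = mu + R * sum_{t_j < t} beta * exp(-beta * (t - t_j)),
    where R plays the role of a (static) reproduction number."""
    times = np.asarray(times, dtype=float)
    n = len(times)
    mu, R, beta = n / (2.0 * T), 0.5, 1.0
    dt = times[:, None] - times[None, :]        # dt[i, j] = t_i - t_j
    causal = dt > 0
    safe_dt = np.where(causal, dt, 0.0)         # avoid overflow in exp for non-causal pairs
    for _ in range(n_iter):
        # E-step: probability that event i was triggered by event j (or is background)
        trig = np.where(causal, R * beta * np.exp(-beta * safe_dt), 0.0)
        denom = mu + trig.sum(axis=1)
        p_bg = mu / denom
        p_trig = trig / denom[:, None]
        # M-step: re-estimate background rate, reproduction number and kernel decay
        mu = p_bg.sum() / T
        R = p_trig.sum() / n
        beta = p_trig.sum() / (p_trig * safe_dt).sum()
    return mu, R, beta

# Usage (hypothetical daily case-report times on a window of length T days):
# mu_hat, R_hat, beta_hat = em_hawkes_reproduction(case_times, T=120.0)
```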
Finally, in Chapter 7, we discuss implications of the research and future research directions.
|
15 |
Portfolio optimization in presence of a self-exciting jump process: from theory to practice
Veronese, Andrea, 27 April 2022 (has links)
We aim at generalizing the celebrated portfolio optimization problem "à la Merton", where the asset evolution is steered by a self-exciting jump-diffusion process. We first define the rigorous mathematical framework needed to introduce the stochastic optimal control problem we are interested in. Then, we provide a proof of a specific version of the Dynamic Programming Principle (DPP) for the general class of self-exciting processes under study. Afterwards, we state the Hamilton-Jacobi-Bellman (HJB) equation, whose solution gives the value function of the corresponding optimal control problem.
The resulting HJB equation takes the form of a Partial Integro-Differential Equation (PIDE), for which we prove both existence and uniqueness of the solution in the viscosity sense. We further derive a suitable numerical scheme to solve the HJB equation corresponding to the portfolio optimization problem. We also provide a detailed study of the solution's dependence on the parameters of the problem. The analysis is performed by calibrating the model on ENI asset levels during the worldwide COVID-19 outbreak. In particular, the calibration routine is based on a sophisticated Sequential Monte Carlo algorithm.
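Purely as a schematic reference (the thesis's exact state variables, controls and operators differ), an HJB equation for a Merton-type problem driven by a self-exciting jump intensity typically takes a PIDE form such as the following, where x is wealth, pi the fraction invested in the risky asset, and lambda the jump intensity, which jumps by delta at each event and mean-reverts at speed kappa:

```latex
% Schematic HJB-PIDE; illustrative notation only, not the thesis's formulation.
\[
0 = \sup_{\pi}\Big\{ \partial_t V
    + x\big(r + \pi(\mu - r)\big)\,\partial_x V
    + \tfrac{1}{2}\,\pi^{2}\sigma^{2}x^{2}\,\partial_{xx} V
    + \kappa(\bar\lambda - \lambda)\,\partial_\lambda V
    + \lambda \int_{\mathbb{R}} \big[ V(t,\, x + \pi x z,\, \lambda + \delta) - V(t, x, \lambda) \big]\,\nu(\mathrm{d}z)
  \Big\}
\]
```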
|
16 |
金融大數據之應用 : Hawkes相互激勵模型於跨市場跳躍傳染現象之實證分析 / Empirical Analysis on Financial Contagion using Hawkes Mutually Exciting Model
簡宇澤, Chien, Yu Tse, Unknown Date (has links)
本研究使用美國、德國、英國股票指數期貨之日內交易資料，從報酬率中分離出連續波動度與跳躍項，再以MLE法估計Hawkes相互激勵過程之參數，衡量跨市場跳躍傳染現象。擴展文獻中僅兩市場的分析至三市場模型，更能從整體的角度解釋市場間的關係及跳躍傳染途徑。實證結果顯示，美國能直接影響其他市場，而其他市場反過來不易干涉美國，呈現非對稱影響效果。歐洲兩國能互相傳染，英國對德國的影響較大，也更有能力影響美國，稱英國為歐洲的影響輸出國，德國為歐洲的影響輸入國。 / This study uses intraday trading data on US, German, and UK stock index futures, separates the continuous volatility and jump components from returns, and then estimates the parameters of a Hawkes mutually exciting process by MLE to measure cross-market jump contagion. Extending the two-market analyses in the literature to a three-market model makes it possible to explain the relationships between markets and the channels of jump contagion from a more holistic perspective. The empirical results show that the US can directly influence the other markets, whereas the other markets can hardly affect the US in return, an asymmetric effect. The two European markets infect each other; the UK has a larger influence on Germany and is also more capable of influencing the US, so the UK can be called Europe's influence exporter and Germany Europe's influence importer.
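For reference, the mutually exciting intensities estimated in this study can be written as below; the exponential kernel is an illustrative assumption, and the asymmetric contagion found empirically corresponds to an asymmetric cross-excitation matrix (alpha_mk).

```latex
% Intensity of jumps in market m, excited by past jumps in all three markets.
\[
\lambda_m(t) = \mu_m + \sum_{k \in \{\mathrm{US},\,\mathrm{DE},\,\mathrm{UK}\}} \; \sum_{t_i^{k} < t} \alpha_{mk}\, e^{-\beta_{mk}\,(t - t_i^{k})},
\qquad m \in \{\mathrm{US},\,\mathrm{DE},\,\mathrm{UK}\}
\]
```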
|
17 |
Quantitative Finance under rough volatility / Finance quantitative sous les modèles à volatilité rugueuse
El Euch, Omar, 25 September 2018 (has links)
Cette thèse a pour objectif la compréhension de plusieurs aspects du caractère rugueux de la volatilité observé de manière universelle sur les actifs financiers. Ceci est fait en six étapes. Dans une première partie, on explique cette propriété à partir des comportements typiques des agents sur le marché. Plus précisément, on construit un modèle de prix microscopique basé sur les processus de Hawkes reproduisant les faits stylisés importants de la microstructure des marchés. En étudiant le comportement du prix à long terme, on montre l’émergence d’une version rugueuse du modèle de Heston (appelé modèle rough Heston) avec effet de levier. En utilisant ce lien original entre les processus de Hawkes et les modèles de Heston, on calcule dans la deuxième partie de cette thèse la fonction caractéristique du log-prix du modèle rough Heston. Cette fonction caractéristique est donnée en terme d’une solution d’une équation de Riccati dans le cas du modèle de Heston classique. On montre la validité d’une formule similaire dans le cas du modèle rough Heston, où l’équation de Riccati est remplacée par sa version fractionnaire. Cette formule nous permet de surmonter les difficultés techniques dues au caractère non markovien du modèle afin de valoriser des produits dérivés. Dans la troisième partie, on aborde la question de la gestion des risques des produits dérivés dans le modèle rough Heston. On présente des stratégies de couverture utilisant comme instruments l’actif sous-jacent et la courbe variance forward. Ceci est fait en spécifiant la structure markovienne infini-dimensionnelle du modèle. Étant capable de valoriser et couvrir les produits dérivés dans le modèle rough Heston, nous confrontons ce modèle à la réalité des marchés financiers dans la quatrième partie. Plus précisément, on montre qu’il reproduit le comportement de la volatilité implicite et historique. On montre également qu’il génère l’effet Zumbach qui est une asymétrie par inversion du temps observée empiriquement sur les données financières. On étudie dans la cinquième partie le comportement limite de la volatilité implicite à la monnaie à faible maturité dans le cadre d’un modèle à volatilité stochastique général (incluant le modèle rough Bergomi), en appliquant un développement de la densité du prix de l’actif. Alors que l’approximation basée sur les processus de Hawkes a permis de traiter plusieurs questions relatives au modèle rough Heston, nous examinons dans la sixième partie une approximation markovienne s’appliquant sur une classe plus générale de modèles à volatilité rugueuse. En utilisant cette approximation dans le cas particulier du modèle rough Heston, on obtient une méthode numérique pour résoudre les équations de Riccati fractionnaires. Enfin, nous terminons cette thèse en étudiant un problème non lié à la littérature sur la volatilité rugueuse. Nous considérons le cas d’une plateforme cherchant le meilleur système de make-take fees pour attirer de la liquidité. En utilisant le cadre principal-agent, on décrit le meilleur contrat à proposer au market maker ainsi que les cotations optimales affichées par ce dernier. Nous montrons également que cette politique conduit à une meilleure liquidité et à une baisse des coûts de transaction pour les investisseurs. / The aim of this thesis is to study various aspects of the rough behavior of the volatility observed universally on financial assets. This is done in six steps. 
In the first part, we investigate how rough volatility can naturally emerge from typical behaviors of market participants. To do so, we build a microscopic price model based on Hawkes processes in which we encode the main features of the market microstructure. By studying the asymptotic behavior of the price on the long run, we obtain a rough version of the Heston model exhibiting rough volatility and leverage effect. Using this original link between Hawkes processes and the Heston framework, we compute in the second part of the thesis the characteristic function of the log-price in the rough Heston model. In the classical Heston model, the characteristic function is expressed in terms of a solution of a Riccati equation. We show that rough Heston models enjoy a similar formula, the Riccati equation being replaced by its fractional version. This formula enables us to overcome the non-Markovian nature of the model in order to deal with derivatives pricing. In the third part, we tackle the issue of managing derivatives risks under the rough Heston model. We establish explicit hedging strategies using as instruments the underlying asset and the forward variance curve. This is done by specifying the infinite-dimensional Markovian structure of the rough Heston model. Being able to price and hedge derivatives in the rough Heston model, we challenge the model to practice in the fourth part. More precisely, we show the excellent fit of the model to historical and implied volatilities. We also show that the model reproduces the Zumbach effect, that is a time reversal asymmetry which is observed empirically on financial data. While the Hawkes approximation enabled us to solve the pricing and hedging issues under the rough Heston model, this approach cannot be extended to an arbitrary rough volatility model. We study in the fifth part the behavior of the at-the-money implied volatility for small maturity under general stochastic volatility models. In the same spirit as the Hawkes approximation, we look in the sixth part of this thesis for a tractable Markovian approximation that holds for a general class of rough volatility models. By applying this approximation on the specific case of the rough Heston model, we derive a numerical scheme for solving fractional Riccati equations. Finally, we end this thesis by studying a problem unrelated to rough volatility. We consider an exchange looking for the best make-take fees system to attract liquidity in its platform. Using a principal-agent framework, we describe the best contract that the exchange should propose to the market maker and provide the optimal quotes displayed by the latter. We also argue that this policy leads to higher quality of liquidity and lower trading costs for investors.
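For orientation, the Riccati structure described above can be written schematically as follows, with D^alpha the fractional (Caputo) derivative and I^{1-alpha} the fractional integral; parameter conventions vary and should be checked against the thesis.

```latex
% Classical Heston: h solves an ordinary Riccati ODE.
\[
\partial_t h(a,t) = \tfrac{1}{2}\big(-a^{2} - ia\big) + \lambda\big(ia\rho\nu - 1\big)\, h(a,t) + \tfrac{(\lambda\nu)^{2}}{2}\, h^{2}(a,t)
\]
% Rough Heston: the time derivative is replaced by its fractional counterpart.
\[
D^{\alpha} h(a,t) = \tfrac{1}{2}\big(-a^{2} - ia\big) + \lambda\big(ia\rho\nu - 1\big)\, h(a,t) + \tfrac{(\lambda\nu)^{2}}{2}\, h^{2}(a,t),
\qquad I^{1-\alpha} h(a,0) = 0
\]
```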
|
18 |
Modélisation du carnet d’ordres, Applications Market Making / Limit order book modelling, Market Making Applications
Lu, Xiaofei, 04 October 2018 (has links)
Cette thèse aborde différents aspects de la modélisation de la microstructure du marché et des problèmes de Market Making, avec un accent particulier du point de vue du praticien. Le carnet d’ordres, au cœur du marché financier, est un système de files d’attente complexe à haute dimension. Nous souhaitons améliorer la connaissance du LOB pour la communauté de la recherche, proposer de nouvelles idées de modélisation et développer des applications pour les Market Makers. Nous remercions en particulier l’équipe Automated Market Making d’avoir fourni la base de données haute-fréquence de très bonne qualité et une grille de calculs puissante, sans laquelle ces recherches n’auraient pas été possibles. Le Chapitre 1 présente la motivation de cette recherche et reprend les principaux résultats des différents travaux. Le Chapitre 2 se concentre entièrement sur le LOB et vise à proposer un nouveau modèle qui reproduit mieux certains faits stylisés. A travers cette recherche, non seulement nous confirmons l’influence des flux d’ordres historiques sur l’arrivée de nouveaux, mais un nouveau modèle est également fourni qui réplique beaucoup mieux la dynamique du LOB, notamment la volatilité réalisée en haute et basse fréquence. Dans le Chapitre 3, l’objectif est d’étudier les stratégies de Market Making dans un contexte plus réaliste. Cette recherche contribue à deux aspects : d’une part le nouveau modèle proposé est plus réaliste mais reste simple à appliquer pour la conception de stratégies, d’autre part la stratégie pratique de Market Making est beaucoup améliorée par rapport à une stratégie naïve et est prometteuse pour l’application pratique. La prédiction à haute fréquence avec la méthode d’apprentissage profond est étudiée dans le Chapitre 4. De nombreux résultats de la prédiction en une étape et en plusieurs étapes ont retrouvé la non-linéarité, la stationnarité et l’universalité de la relation entre les indicateurs de microstructure et le changement du prix, ainsi que la limitation de cette approche en pratique. / This thesis addresses different aspects of market microstructure modelling and market-making problems, with a special emphasis on the practitioner’s viewpoint. The limit order book (LOB), at the heart of the financial market, is a complex continuous high-dimensional queueing system. We wish to improve the knowledge of the LOB for the research community, propose new modelling ideas and develop concrete applications in the interest of Market Makers. We would like to specifically thank the Automated Market Making team for providing a large high-frequency database of very high quality as well as a powerful computational grid, without which this research would not have been possible. The first chapter introduces the motivation of this research and summarizes the main results of the different works. Chapter 2 fully focuses on the LOB and aims to propose a new model that better reproduces some stylized facts. Through this research, not only do we confirm the influence of historical order flows on the arrival of new ones, but a new model is also provided that captures the LOB dynamics much better, notably the realized volatility at high and low frequency. In Chapter 3, the objective is to study Market Making strategies in a more realistic context.
This research contributes in two respects: on the one hand, the newly proposed model is more realistic but still simple enough to be applied for strategy design; on the other hand, the practical Market Making strategy is a large improvement over the naive one and is promising for practical use. High-frequency prediction with deep learning methods is studied in Chapter 4. The 1-step and multi-step prediction results reveal the non-linearity, stationarity and universality of the relationship between microstructural indicators and price changes, as well as the limitations of this approach in practice.
|
19 |
Generative Models of Link Formation and Community Detection in Continuous-Time Dynamic Networks
Arastuie, Makan, January 2020 (has links)
No description available.
|
20 |
Financial risk sources and optimal strategies in jump-diffusion frameworks
Prezioso, Luca, 25 March 2020 (has links)
An optimal dividend problem with investment opportunities is considered, taking into account a source of strategic risk as well as the effect of market frictions on the decision process of financial entities.
It concerns the problem of determining an optimal control of the dividend under debt constraints and investment opportunities in an economy with business cycles. It is assumed that the company is allowed to accept or reject investment opportunities arriving at random times with random sizes, by changing its outstanding indebtedness, which would impact its capital structure and risk profile. This work mainly focuses on the strategic risk faced by companies and, in particular, on the manager's problem of setting appropriate priorities for deploying the limited resources available. This component is taken into account by introducing frictions in the capital structure modification process.
The problem is formulated as a bi-dimensional singular control problem under regime switching in the presence of jumps. An explicit condition is obtained to ensure that the value function is finite. A viscosity solution approach is used to obtain qualitative descriptions of the solution.
Moreover, a lending scheme for a system of interconnected banks with probabilistic constraints on failure is considered. The problem arises from the fact that financial institutions cannot possibly carry enough capital to withstand counterparty failures or systemic risk. In such situations, the central bank or the government effectively becomes the risk manager of last resort or, in extreme cases, the lender of last resort.
While, on the one hand, the health of the whole financial system depends on government intervention, on the other hand guaranteeing a high probability of salvage may increase the moral hazard of the banks in the financial network. A closed-form solution for an optimal control problem related to interbank lending schemes has been derived, subject to terminal probability constraints on the failure of banks that are interconnected through a financial network. The derived solution can be applied to real bank networks, since a general solution is obtained when the aforementioned probability constraints are imposed for all the banks. We also present a direct method to compute the systemic relevance parameter for each bank within the network.
Finally, a possible computation technique for the Default Risk Charge under regulatory risk measurement processes is considered. We focus on the Default Risk Charge measure as an effective alternative to the Incremental Risk Charge, proposing its implementation by a quasi-exhaustive heuristic algorithm to determine the minimum capital required of a bank facing the market risk associated with portfolios based on assets issued by several financial agents.
While most banks use the Monte Carlo simulation approach and the empirical quantile to estimate this risk measure, we provide new computational approaches, exhaustive or heuristic, which are now becoming feasible thanks to both new regulation and the high-speed, low-cost technology available nowadays.
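As an illustration of the Monte Carlo-plus-empirical-quantile baseline mentioned above, here is a hedged sketch that computes a DRC-style figure as the 99.9% quantile of simulated one-year default losses under a one-factor Gaussian latent-variable model. The inputs, the factor model and the function name are illustrative assumptions, not the regulatory specification or the thesis's exhaustive/heuristic algorithm.

```python
import numpy as np
from scipy.stats import norm

def drc_monte_carlo(pd, lgd, exposure, loading, n_sims=200_000, q=0.999, seed=None):
    """Empirical q-quantile of one-year default losses under a one-factor
    Gaussian latent-variable default model (illustrative, not regulatory)."""
    rng = np.random.default_rng(seed)
    pd, lgd, exposure, loading = map(np.asarray, (pd, lgd, exposure, loading))
    threshold = norm.ppf(pd)                          # default iff latent variable < threshold
    systemic = rng.standard_normal((n_sims, 1))
    idio = rng.standard_normal((n_sims, len(pd)))
    latent = loading * systemic + np.sqrt(1.0 - loading**2) * idio
    losses = (latent < threshold).astype(float) @ (lgd * exposure)
    return np.quantile(losses, q)

# Toy portfolio of three issuers
print(drc_monte_carlo(pd=[0.01, 0.02, 0.005],
                      lgd=[0.6, 0.6, 0.4],
                      exposure=[1e6, 5e5, 2e6],
                      loading=[0.3, 0.3, 0.5],
                      seed=1))
```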
|