131 |
Les approches extrêmes de la contagion sur les marchés financiers / Extreme approaches of contagion in financial markets. Xu, Bei, 16 November 2012 (has links)
The thesis consists of three parts. The first introduces a number of measures of extreme dependence. An application to the stock and bond markets of 49 countries shows that multivariate extreme value theory leads to results that differ from those based on the correlation coefficient, but are relatively close to those obtained from the multivariate conditional Spearman's rho. This part also assesses the risk of large simultaneous losses. The second part examines the determinants of extreme co-movements between 5 core countries and 49 non-core countries. Shock transmission mechanisms vary between earlier and more recent periods, between developed and emerging markets, and between normal and extreme shocks. The third part examines gold's role as a safe haven over the period 1986-2012. Extreme positive gains on gold can be linked to extreme losses on the S&P. However, this link does not always hold: it evolves over time and appears to be conditioned by other factors.
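The multivariate conditional Spearman's rho mentioned above can be sketched, in a bivariate toy version, as a rank correlation restricted to observations in the joint lower tail; the factor model and tail level below are illustrative assumptions, not the thesis's data or estimator:

```python
import random

def spearman_rho(xs, ys):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

def lower_tail_spearman(xs, ys, q=0.10):
    """Conditional version: rank correlation over the days on which
    BOTH series fall below their own q-quantile (joint losses)."""
    tx = sorted(xs)[int(q * len(xs))]
    ty = sorted(ys)[int(q * len(ys))]
    sub = [(a, b) for a, b in zip(xs, ys) if a <= tx and b <= ty]
    return spearman_rho([a for a, _ in sub], [b for _, b in sub])

random.seed(0)
# toy returns for two markets driven by a common factor
f = [random.gauss(0, 1) for _ in range(5000)]
x = [0.8 * fi + 0.6 * random.gauss(0, 1) for fi in f]
y = [0.8 * fi + 0.6 * random.gauss(0, 1) for fi in f]
print(round(spearman_rho(x, y), 2), round(lower_tail_spearman(x, y), 2))
```

Comparing the unconditional and tail-conditional values is exactly the kind of contrast the abstract draws between correlation-based and extreme-dependence measures.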
|
132 |
Measuring and managing operational risk in the insurance and banking sectors / Mesure et gestion du risque opérationnel en assurance et finance. Karam, Elias, 26 June 2014 (has links)
Our interest in this thesis is, first, to combine the different measurement techniques for operational risk in financial companies, with particular attention to the consequences of estimation risk, which we treat as a specific component of operational risk.
In the first part, we present a full overview of operational risk, from the regulatory laws and regulations to the associated mathematical and actuarial concepts, together with a numerical application of the Advanced Measurement Approach, using the Loss Distribution Approach to calculate the capital requirement and then applying extreme value theory. We also focus on estimation risk, illustrated by scenario analysis of expert opinion in conjunction with internal loss data to assess our exposure to high-severity events. We conclude this first part by setting out a scaling technique based on ordinary least squares (OLS) that enables us to normalise external data to a local Lebanese bank. In the second part, we address estimation risk directly: we measure the error induced on the SCR by the estimation error of the parameters, propose an alternative method for estimating a yield curve, and finish by arguing that careful reflection on the calculation assumptions, namely a hypothesis "consistent with market values", is more appropriate and effective than complicating the model, which generates additional errors and instability. The chapters in this part illustrate the different aspects of estimation risk as a part of operational risk, highlighting the attention it should receive in working models.
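The frequency-severity logic of the Loss Distribution Approach mentioned above can be sketched as a compound Poisson-lognormal Monte Carlo: simulate an annual number of loss events, draw a severity for each, and read the capital charge off a high quantile of the aggregate loss. The parameters below are illustrative, not calibrated to any bank's loss data:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's method for a Poisson draw (adequate for moderate lam)."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

def annual_loss(lam, mu, sigma, rng):
    """Aggregate yearly loss: Poisson(lam) events with lognormal severities."""
    return sum(rng.lognormvariate(mu, sigma) for _ in range(poisson(lam, rng)))

rng = random.Random(42)
losses = sorted(annual_loss(lam=25, mu=10, sigma=2, rng=rng)
                for _ in range(20000))
# 99.9% quantile of the simulated aggregate loss, the AMA confidence level
var_999 = losses[int(0.999 * len(losses))]
print(f"capital charge ~ {var_999:,.0f}")
```

Scenario-based expert opinion typically enters such a model by modifying the severity tail, which is where the 99.9% quantile is decided.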
|
133 |
Contributions aux algorithmes stochastiques pour le Big Data et à la théorie des valeurs extrêmes multivariées / Contributions to stochastic algorithms for Big Data and multivariate extreme value theory. Ho, Zhen Wai Olivier, 04 October 2018 (has links)
This thesis is divided into two parts. The first part studies models for multivariate extremes. We give a method to construct multivariate regularly varying random vectors, based on a multivariate extension of Breiman's lemma, which states that the product $RZ$ of a non-negative regularly varying random variable $R$ and a non-negative, sufficiently integrable variable $Z$ is also regularly varying. Replacing $Z$ with a random vector $\mathbf{Z}$, we show that the product $R\mathbf{Z}$ is regularly varying and characterise its limit measure. We then show that specific choices of distribution for $\mathbf{Z}$ recover classical max-stable models, such as the extremal-t and Hüsler-Reiss models, and we extend the construction to non-standard regular variation. Next, we show that the Pareto model associated with the Hüsler-Reiss max-stable model (which we call the Hüsler-Reiss Pareto model) forms a full exponential family; we establish some properties of this model, give an algorithm for exact simulation, and study maximum likelihood inference. We then extend the model using non-standard regular variation, study maximum likelihood inference for the generalised model, and propose a parameter estimation method. The first part closes with a numerical study of the maximum likelihood estimator for the Hüsler-Reiss Pareto model.
In the second part, which concerns statistical learning, we first give a lower bound on the smallest singular value of a matrix perturbed by appending a column, and derive a greedy column-selection algorithm for feature extraction, illustrated on real time-series data in which each series is a column of the matrix. Secondly, we show that if a matrix $X$ has an incoherence property, then $X$ also satisfies a weakened version of the null space property (NSP). Thirdly, we study the problem of incoherent submatrix selection: given $X \in \mathbb{R}^{n \times p}$ and $\mu > 0$, we seek the largest submatrix of $X$ with coherence below $\mu$. The problem is formulated as a linear program with a quadratic constraint on $\{0,1\}^p$; since it is NP-hard, we consider a relaxation on the sphere and bound the relaxation error. Finally, we analyse projected stochastic gradient descent for online principal component analysis: we show that, in expectation, the algorithm converges to a leading eigenvector, propose a step-size selection scheme, and illustrate the algorithm with a simulation experiment.
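The last step described above, projected stochastic gradient descent for online PCA, can be sketched with an Oja-style update: move the iterate along the stochastic gradient $x x^\top w$, then project back onto the unit sphere. The step-size rule and toy data below are illustrative assumptions, not the thesis's algorithm:

```python
import random

def online_pca_step(w, x, eta):
    """One projected stochastic gradient step: w += eta * (x xᵀ) w,
    followed by projection onto the unit sphere."""
    g = sum(wi * xi for wi, xi in zip(w, x))          # xᵀw
    w = [wi + eta * g * xi for wi, xi in zip(w, x)]
    norm = sum(wi * wi for wi in w) ** 0.5
    return [wi / norm for wi in w]

rng = random.Random(7)
w = [1.0, 0.0]
for t in range(1, 20001):
    # streaming data stretched along (1, 1)/sqrt(2), the leading
    # eigenvector of the (illustrative) covariance matrix
    z = rng.gauss(0, 3)
    x = [z + rng.gauss(0, 1), z + rng.gauss(0, 1)]
    w = online_pca_step(w, x, eta=1.0 / t)            # decaying step size
align = abs(w[0] + w[1]) / 2 ** 0.5                   # |cosine| to (1,1)/√2
print(round(align, 3))
```

With a large eigengap, the iterate aligns with the leading eigenvector even under the crude 1/t step size; the thesis's contribution includes a principled way to choose that step.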
|
134 |
Estimação de medidas de risco utilizando modelos CAViaR e CARE / Risk measures estimation using CAViaR and CARE models. Silva, Francyelle de Lima e, 06 August 2010 (has links)
In this work, Value at Risk and Expected Shortfall are defined, discussed, and estimated. These are market risk measures widely used by companies and investors to manage the risk to which they may be exposed. The aim is to present and apply several methods and models for estimating these measures and to establish which model is most appropriate in given scenarios.
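Historical (non-parametric) versions of the two measures defined above can be sketched directly from a sample of returns; the confidence level and simulated data are illustrative, and the CAViaR/CARE models of the thesis are far richer:

```python
import random

def var_es_historical(returns, alpha=0.95):
    """Historical VaR and ES at level alpha, as positive loss numbers.
    VaR is the alpha-quantile of losses; ES is the mean loss beyond VaR."""
    losses = sorted(-r for r in returns)       # losses are negated returns
    k = int(alpha * len(losses))
    var = losses[k]
    es = sum(losses[k:]) / len(losses[k:])
    return var, es

rng = random.Random(1)
rets = [rng.gauss(0.0005, 0.01) for _ in range(2500)]   # ~10y of toy daily returns
var95, es95 = var_es_historical(rets)
print(f"VaR95 = {var95:.4f}, ES95 = {es95:.4f}")
```

By construction ES is at least as large as VaR at the same level, since it averages the losses beyond the VaR threshold.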
|
135 |
Použití koherentních metod měření rizika v modelování operačních rizik / The use of coherent risk measures in operational risk modeling. Lebovič, Michal, January 2012 (has links)
The debate on quantitative operational risk modeling only started at the beginning of the last decade, and best practices are still far from established. Estimation of capital requirements for operational risk under the Advanced Measurement Approaches of Basel II depends critically on the choice of risk measure, which quantifies the risk exposure based on the underlying simulated distribution of losses. Despite its well-known caveats, Value-at-Risk remains the predominant risk measure used in operational risk management. We describe several serious drawbacks of Value-at-Risk and explain why it can lead to misleading conclusions. As a remedy we suggest the use of coherent risk measures, notably the statistic known as Expected Shortfall, as a suitable alternative or complement for quantifying operational risk exposure. We demonstrate that applying Expected Shortfall to operational loss modeling is feasible and produces reasonable and consistent results. We also consider a variety of statistical techniques for modeling the underlying loss distribution and find the extreme value theory framework most suitable for this purpose. Using stress tests we further compare the robustness and consistency of selected models and their implied risk capital estimates...
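One of the well-known caveats of Value-at-Risk alluded to above, its failure of subadditivity, can be reproduced on a two-position toy book; the default probabilities below are illustrative, not taken from the thesis:

```python
from itertools import product

def var_discrete(dist, alpha):
    """VaR at level alpha for a discrete loss distribution {loss: prob}:
    the smallest x with P(L <= x) >= alpha."""
    acc = 0.0
    for loss in sorted(dist):
        acc += dist[loss]
        if acc >= alpha:
            return loss

# two independent positions, each losing 100 with probability 0.04
single = {0: 0.96, 100: 0.04}
portfolio = {}
for (l1, p1), (l2, p2) in product(single.items(), single.items()):
    portfolio[l1 + l2] = portfolio.get(l1 + l2, 0.0) + p1 * p2

v_single = var_discrete(single, 0.95)   # 0: default prob is below 5%
v_port = var_discrete(portfolio, 0.95)  # 100: the joint book breaches 5%
print(v_single, v_port)                 # → 0 100
```

The portfolio VaR (100) exceeds the sum of the stand-alone VaRs (0 + 0), so diversification appears to *increase* risk; Expected Shortfall, being coherent, cannot behave this way.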
|
137 |
以風險值衡量銀行外匯部位資本之計提 / Measuring Banks' Capital Requirements for Foreign Exchange Positions Using Value at Risk. 陳昀聖 (Chen Yun-Sheng), Unknown Date (has links)
This thesis compares the capital charges for banks' foreign exchange positions under the standardized approach and under the Value at Risk (VaR) approach. For the VaR approach, three measurement methods are used: the variance-covariance method, historical simulation, and the extreme value method; backtesting is applied to assess each method's ability to forecast risk. The standardized approach refers to the capital provisioning method prescribed by the Ministry of Finance.
The empirical results show that the capital charge computed under the VaR approach is about half of that required under the standardized approach; in other words, the standardized approach imposes excessive funding costs. From a safety perspective, backtesting shows that historical simulation and the extreme value method are reliable approaches to capital provisioning, whereas the variance-covariance method tends to underestimate risk. Because the extreme value method requires a very large amount of data, historical simulation is recommended; relative to the standardized approach, it saves considerable funding cost.
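The backtesting step described above can be sketched with Kupiec's proportion-of-failures test, a standard way to check whether the number of days on which losses exceeded the VaR forecast matches the model's nominal rate; the violation counts below are illustrative:

```python
import math

def kupiec_pof(n, x, p):
    """Kupiec likelihood-ratio statistic for x VaR violations in n days
    under nominal violation probability p (asymptotically chi2 with 1 df).
    Requires 0 < x < n."""
    phat = x / n
    def loglik(q):
        return x * math.log(q) + (n - x) * math.log(1 - q)
    return -2.0 * (loglik(p) - loglik(phat))

# 9 violations in a 250-day year against a 99% VaR (2.5 expected)
lr = kupiec_pof(n=250, x=9, p=0.01)
print(round(lr, 2), "reject at 5%" if lr > 3.84 else "accept")
```

A model like the variance-covariance method that understates risk produces too many violations and a large LR statistic, which is how the unreliable methods would be flagged.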
Contents:
Chapter 1: Research Motivation and Objectives
Chapter 2: Domestic and International Capital Adequacy Regulations
  Section 1: Development of the BIS Capital Adequacy Framework
  Section 2: Relevant Taiwanese Regulations
Chapter 3: Literature Review
Chapter 4: Research Methods and Models
  Section 1: VaR Models
  Section 2: Backtesting
Chapter 5: Empirical Analysis
  Section 1: Description of the Empirical Data
  Section 2: Empirical Results
Chapter 6: Conclusion
References
|
138 |
Development of Methods for Structural Reliability Analysis Using Design and Analysis of Computer Experiments and Data-Based Extreme Value Analysis. Panda, Satya Swaroop, 06 1900 (has links)
The work reported in this thesis is in the area of computational modeling of the reliability of engineering structures. The emphasis of the study is on developing methods suitable for the analysis of large-scale structures such as aircraft structural components. This class of problems continues to challenge the analyst, the most difficult aspects being the treatment of nonlinearity in structural behavior, the non-Gaussian nature of uncertainties, and the quantification of low probabilities of failure (of the order of 10^-5 or less), which requires significant computational effort. The present study covers static/dynamic behavior, Gaussian/non-Gaussian models of uncertainties, and linear/nonlinear structures. The novel elements of the study consist of two components:
• application of modeling tools that already exist in the area of design and analysis of computer experiments, and
• application of data-based extreme value analysis procedures that are available in the statistics literature.
The first component of the work provides an opportunity to combine space-filling sampling strategies (which hold promise for reducing estimation variance) with kriging-based modeling in reliability studies, an opportunity that has not been explored in the existing literature. The second component exploits the limiting behavior of extremes of sequences of random variables together with Monte Carlo simulations of structural response, a strategy for reliability modeling that has likewise not been explored. The hope here is that failure events with probabilities of the order of 10^-5 or less can be investigated with relatively few Monte Carlo runs. The study also brings out the issues involved in combining the above sources of existing knowledge with finite element modeling of engineering structures, thereby leading to new tools for structural reliability analysis.
The thesis is organized into four chapters. The first chapter provides a review of literature that covers methods of reliability analysis and also the background literature on design and analysis of computer experiments and extreme value analysis.
The problem of reliability analysis of randomly parametered, linear or nonlinear structures subjected to static and/or dynamic loads is considered in Chapter 2. A deterministic finite element model is assumed to be available to analyze sample realizations of the structure. The reliability analysis is carried out within the framework of response surface methods, which involves constructing surrogate models for the performance functions employed in reliability calculations. These surrogates serve as models of models, hence termed meta-models, for structural behavior in the neighborhood of the design point. In the present study, this construction combines space-filling optimal Latin hypercube sampling with kriging models. Illustrative examples on the numerical prediction of the reliability of a ten-bay truss and a W-seal in an aircraft structure are presented. Limited Monte Carlo simulations are used to validate the approximate procedures developed.
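The space-filling sampling step named above can be sketched as follows; this is a generic Latin hypercube construction (one point per stratum in each dimension), not the optimized variant used in the thesis:

```python
import random

def latin_hypercube(n, d, rng):
    """n space-filling samples in [0,1]^d: each dimension is split into
    n equal strata, one point drawn per stratum, and the stratum order
    is shuffled independently per dimension."""
    cols = []
    for _ in range(d):
        perm = list(range(n))
        rng.shuffle(perm)
        cols.append([(p + rng.random()) / n for p in perm])
    return [tuple(col[i] for col in cols) for i in range(n)]

rng = random.Random(3)
pts = latin_hypercube(n=10, d=2, rng=rng)
# verify the one-point-per-stratum property in each dimension
for dim in range(2):
    assert sorted(int(p[dim] * 10) for p in pts) == list(range(10))
print(pts[0])
```

The stratification guarantees marginal coverage with few samples, which is what makes such designs attractive for training kriging meta-models of expensive finite element runs.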
The reliability of nonlinear vibrating systems under stochastic excitations is investigated in Chapter 3 using a two-stage Monte Carlo simulation strategy. Systems subjected to Gaussian random excitation are considered for the study. It is assumed that the probability distribution of the maximum response in the steady state belongs to the basin of attraction of one of the classical asymptotic extreme value distributions. The first stage of the solution strategy consists of an objective selection of the form of the extreme value distribution based on hypothesis tests, and the next involves the estimation of parameters of the relevant extreme value distribution. Both these steps are implemented using data from limited Monte Carlo simulations of the system response. The proposed procedure is illustrated with examples of linear/nonlinear single-degree and multi-degree of freedom systems driven by random excitations. The predictions from the proposed method are compared with results from large-scale Monte Carlo simulations and also with classical analytical results, when available, from theory of out-crossing statistics. The method is further extended to cover reliability analysis of nonlinear dynamical systems with randomly varying system parameters. Here the methods of meta-modeling developed in Chapter 2 are extended to develop response surface models for parameters of underlying extreme value distributions. Numerical examples presented cover a host of low-dimensional dynamical systems and also the analysis of a wind turbine structure subjected to turbulent wind loads and undergoing large amplitude oscillations.
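The two-stage strategy above (simulate block maxima of the response, fit a classical asymptotic extreme value law, extrapolate to rare failure levels) can be sketched with a moment-based fit of the Gumbel distribution, one of the classical limit laws; the response model, block size, and failure level are illustrative assumptions:

```python
import math
import random
import statistics

def fit_gumbel(maxima):
    """Method-of-moments Gumbel fit: scale from the standard deviation,
    location from the mean (0.5772... is the Euler-Mascheroni constant)."""
    beta = statistics.stdev(maxima) * math.sqrt(6) / math.pi
    mu = statistics.mean(maxima) - 0.5772156649 * beta
    return mu, beta

def exceed_prob(level, mu, beta):
    """P(max > level) under the fitted Gumbel distribution."""
    return 1.0 - math.exp(-math.exp(-(level - mu) / beta))

rng = random.Random(11)
# block maxima from limited Monte Carlo: peak |response| over 1000 steps,
# with a stand-in Gaussian response process
maxima = [max(abs(rng.gauss(0, 1)) for _ in range(1000)) for _ in range(200)]
mu, beta = fit_gumbel(maxima)
p_fail = exceed_prob(5.5, mu, beta)   # extrapolate beyond the observed maxima
print(f"mu={mu:.2f} beta={beta:.2f} P(exceed 5.5)={p_fail:.2e}")
```

The point of the extrapolation is that small failure probabilities can be estimated from only a few hundred simulated blocks, instead of the millions of direct Monte Carlo runs a 10^-5 event would require; the thesis additionally selects the limit-law family by hypothesis tests rather than assuming Gumbel.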
A summary of contributions made along with a few suggestions for further research is presented in Chapter 4.
|
139 |
Analyzing value at risk and expected shortfall methods: the use of parametric, non-parametric, and semi-parametric models. Huang, Xinxin, 25 August 2014 (has links)
Value at Risk (VaR) and Expected Shortfall (ES) are methods often used to measure market risk. Inaccurate and unreliable Value at Risk and Expected Shortfall models can lead to underestimation of the market risk that a firm or financial institution is exposed to, and therefore may jeopardize the well-being or survival of the firm or financial institution during adverse markets. The objective of this study is therefore to examine various Value at Risk and Expected Shortfall models, including fatter tail models, in order to analyze the accuracy and reliability of these models.
Thirteen VaR and ES models under three main approaches (Parametric, Non-Parametric and Semi-Parametric) are examined in this study. The results of this study show that the proposed model (ARMA(1,1)-GJR-GARCH(1,1)-SGED) gives the most balanced Value at Risk results. The semi-parametric model (Extreme Value Theory, EVT) is the most accurate Value at Risk model in this study for S&P 500. / October 2014
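Of the approach families compared above, the simplest parametric and non-parametric VaR estimators can be placed side by side on fat-tailed toy data; this sketch omits the ARMA-GARCH dynamics and skewed distributions used in the thesis:

```python
import random
import statistics

def var_normal(returns, z=2.326):
    """Parametric VaR: normal fit to mean/sigma plus the 99% quantile z."""
    return -(statistics.mean(returns) - z * statistics.stdev(returns))

def var_historical(returns, alpha=0.99):
    """Non-parametric VaR: the empirical alpha-quantile of losses."""
    losses = sorted(-r for r in returns)
    return losses[int(alpha * len(losses))]

rng = random.Random(5)
# fat-tailed toy returns: calm days plus occasional jump days
rets = [rng.gauss(0, 0.01) + (rng.gauss(0, 0.05) if rng.random() < 0.02 else 0)
        for _ in range(5000)]
print(f"normal: {var_normal(rets):.4f}  historical: {var_historical(rets):.4f}")
```

When the return distribution has heavier tails than the normal, the two estimates diverge, which is the motivation for the fatter-tail and semi-parametric (EVT) models the study examines.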
|
140 |
Brown-Resnick Processes: Analysis, Inference and Generalizations. Engelke, Sebastian, 14 December 2012 (has links)
No description available.
|