211
A Statistical Framework for Distinguishing Between Aleatory and Epistemic Uncertainties in the Best-Estimate Plus Uncertainty (BEPU) Nuclear Safety Analyses
Pun-Quach, Dan 11 1900 (has links)
In 1988, the US Nuclear Regulatory Commission approved an amendment that allowed the use of best-estimate methods. This led to increased development and application of Best Estimate Plus Uncertainty (BEPU) safety analyses. However, a greater burden was placed on the licensee to justify all uncertainty estimates. A review of the current state of BEPU methods identifies a number of significant criticisms that limit the BEPU methods from reaching their full potential as a comprehensive licensing basis. The most significant criticism relates to the lack of a formal framework for distinguishing between aleatory and epistemic uncertainties. This has led to a prevalent belief that such a separation of uncertainties is made for convenience rather than out of necessity.
In this thesis, we address the above concerns by developing a statistically rigorous framework to characterize the different uncertainty types. This framework is grounded in the philosophical concepts of knowledge. Considering the Plato problem, we explore the use of probability as a means to gain knowledge, which allows us to relate the inherent distinctness in knowledge to the different uncertainty types for any complex physical system. The framework is demonstrated using nuclear analysis problems, and we show through the use of structural models that the separation of these uncertainties leads to more accurate tolerance limits relative to existing BEPU methods. In existing BEPU methods, where such a distinction is not applied, the total uncertainty is essentially treated as aleatory uncertainty. Thus, the resulting estimated percentile is much larger than the actual (true) percentile of the system's response.
Our results support the premise that the separation of these two distinct uncertainty types is necessary and leads to more accurate estimates of the reactor safety margins. / Thesis / Doctor of Philosophy (PhD)
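The tolerance limits discussed above are, in standard BEPU practice, order-statistics (Wilks-type) limits. The abstract gives no formulas, so the following sketch only illustrates that standard approach, with a hypothetical Gaussian stand-in for the safety-code response; it is not the thesis's framework.

```python
import math
import random

def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest N such that the sample maximum bounds the `coverage` quantile
    with probability `confidence` (first-order, one-sided Wilks criterion:
    1 - coverage**N >= confidence)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

def run_bepu_case():
    # Hypothetical stand-in for one best-estimate code run; in practice this
    # would be a full thermal-hydraulics calculation with sampled inputs.
    return random.gauss(mu=600.0, sigma=25.0)  # e.g. a peak temperature

N = wilks_sample_size()              # 59 runs for a one-sided 95/95 limit
runs = [run_bepu_case() for _ in range(N)]
print(N, max(runs))                  # the sample maximum is the 95/95 limit
```

Treating the total uncertainty as aleatory, as the abstract notes, inflates this limit; the thesis's point is that separating the epistemic contribution tightens it toward the true percentile.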
212
Extreme Value Theory Applied to Securitizations Rating Methodology / Extremvärdesteori tillämpat på värdepapperisering
Barbouche, Tarek January 2017 (has links)
One of today’s financial trends is securitization. Evaluating securitization risk requires strong quantitative skills and a deep understanding of both credit and market risk. For international securitization programs it is mandatory to take exchange-rate-related risks into account. We will see the different methods to evaluate extreme variations in exchange rates using Extreme Value Theory and Monte Carlo simulations.
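The abstract names the tools without implementation details; a minimal peaks-over-threshold (POT) sketch for extreme exchange-rate moves could look as follows. The simulated returns, threshold choice, and quantile level are assumptions for illustration, not taken from the thesis.

```python
import numpy as np
from scipy.stats import genpareto

# Hypothetical stand-in for daily FX log returns; a real study would load
# an exchange-rate history here.
rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=5000) * 0.006

losses = -returns                           # work with the loss tail
u = np.quantile(losses, 0.95)               # threshold at the 95th percentile
excesses = losses[losses > u] - u
xi, _, beta = genpareto.fit(excesses, floc=0.0)   # GPD shape and scale

# Extreme quantile from the POT formula:
# VaR_p = u + (beta/xi) * (((1 - p)/zeta_u)**(-xi) - 1), with zeta_u = P(L > u)
p = 0.999
zeta_u = excesses.size / losses.size
var_p = u + beta / xi * (((1 - p) / zeta_u) ** (-xi) - 1)
print(f"threshold={u:.4f}, xi={xi:.3f}, VaR={var_p:.4f}")
```

A Monte Carlo check would simulate from the fitted tail and compare empirical exceedance rates against the target level.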
213
[en] ANALYSIS OF EXTREME VALUES THEORY AND MONTE CARLO SIMULATION FOR THE CALCULATION OF VALUE-AT-RISK IN STOCK PORTFOLIOS / [pt] ANÁLISE DA TEORIA DOS VALORES EXTREMOS E DA SIMULAÇÃO DE MONTE CARLO PARA O CÁLCULO DO VALUE-AT-RISK EM CARTEIRAS DE INVESTIMENTOS DE ATIVOS DE RENDA VARIÁVEL
GUSTAVO JARDIM DE MORAIS 16 July 2018 (has links)
[en] After the recent financial crises that hit financial markets around the world, most notably that of 2008/2009, but also the Eastern Europe crisis of July/2007, the Russian moratorium of October/1998 and, at the national level, the change in the Brazilian exchange rate regime in January/1999, financial institutions incurred large losses in each of these events, and one of the main questions raised about financial models concerned risk management. The various methods of calculating Value-at-Risk, as well as the simulations and scenarios drawn up by analysts, could neither predict the magnitude of these crises nor prevent them from worsening. For that reason, I set out to study financial risk management systems, since they can and must be improved, lest even greater financial catastrophes occur. Although the literature on the subject is vast, the methodologies for calculating value at risk are not exact and free of flaws. In this context, it is necessary to develop and improve risk management tools capable of assisting in a better allocation of available resources, assessing the level of risk to which an investment is exposed and its compatibility with the expected return.
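As a companion to the comparison described above, here is a minimal Monte Carlo VaR sketch for an equity portfolio; the two-asset means, covariance, and weights are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical two-stock portfolio: daily mean returns, covariance, weights.
mu = np.array([0.0004, 0.0002])
cov = np.array([[0.00025, 0.00010],
                [0.00010, 0.00040]])
w = np.array([0.6, 0.4])

# Simulate 100,000 one-day portfolio returns and read the 99% VaR off the
# empirical 1% quantile of the return distribution.
sims = rng.multivariate_normal(mu, cov, size=100_000) @ w
var_99 = -np.quantile(sims, 0.01)
print(f"1-day 99% Monte Carlo VaR: {var_99:.4%} of portfolio value")
```

An EVT variant would replace the Gaussian engine with a fitted tail model, which is the kind of comparison the dissertation draws.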
214
Software for Manipulating and Embedding Data Interrogation Algorithms Into Integrated Systems
Allen, David W. 20 January 2005 (has links)
In this study, a software package for easily creating and embedding structural health monitoring (SHM) data interrogation processes in remote hardware is presented. The software comprises two pieces. The first is a client that allows graphical construction of data interrogation processes. The second is node software for remote execution of those processes on remote sensing and monitoring hardware. The client software is built around a catalog of data interrogation algorithms, known as DIAMOND II, compiled over several years of research at Los Alamos National Laboratory. This study also includes encapsulating the DIAMOND II algorithms into independent, interchangeable functions and expanding the catalog with work in feature extraction and statistical discrimination.
The client software also includes methods for interfacing with the node software over an Internet connection. Once connected, the client software can upload a developed process to the integrated sensing and processing node. The node software has the ability to run the processes and return results. This software creates a distributed SHM network without individual nodes relying on each other or a centralized server to monitor a structure.
For the demonstration summarized in this study, the client software is used to create data collection, feature extraction, and statistical modeling processes. Data are collected from monitoring hardware connected to the client by a local area network. A structural health monitoring process is created on the client and uploaded to the node software residing on the monitoring hardware. The node software runs the process and monitors a test structure for induced damage, returning the current structural-state indicator in near real time to the client.
Current integrated health monitoring systems rely on processes statically loaded onto the monitoring node before the node is deployed in the field. The primary new contribution of this study is a software paradigm that allows processes to be created remotely and uploaded to the node in a dynamic fashion over the life of the monitoring node without taking the node out of service. / Master of Science
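The actual client/node protocol is not described in the abstract, so the sketch below is purely hypothetical: it only illustrates the paradigm of uploading a process (an ordered list of catalogued steps) to a node that executes it against fresh sensor data. All names and the JSON-over-socket transport are invented.

```python
import json
import socket

# Invented stand-ins for catalogued data interrogation algorithms.
STEP_CATALOG = {
    "collect": lambda data: data,                    # data acquisition
    "feature": lambda data: [abs(x) for x in data],  # feature extraction
    "statistic": lambda data: max(data),             # structural-state indicator
}

def node_serve(port=5055):
    """Accept one uploaded process, run it on fresh data, return the result."""
    with socket.create_server(("", port)) as srv:
        conn, _ = srv.accept()
        with conn:
            process = json.loads(conn.recv(4096).decode())
            data = [0.1, -0.4, 0.25]                 # stand-in sensor read
            for step in process:                     # chain the catalogued steps
                data = STEP_CATALOG[step](data)
            conn.sendall(json.dumps({"indicator": data}).encode())

def client_upload(process, host="localhost", port=5055):
    """Upload a process built in the client and await the node's result."""
    with socket.create_connection((host, port)) as conn:
        conn.sendall(json.dumps(process).encode())
        return json.loads(conn.recv(4096).decode())
```

In use, node_serve would run on the monitoring hardware while client_upload(["collect", "feature", "statistic"]) pushes a new process without taking the node out of service.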
215
Value at risk and expected shortfall : traditional measures and extreme value theory enhancements with a South African market application
Dicks, Anelda 12 1900 (has links)
Thesis (MComm)--Stellenbosch University, 2013. / ENGLISH ABSTRACT: Accurate estimation of Value at Risk (VaR) and Expected Shortfall (ES) is critical in the management of extreme market risks. These risks occur with small probability, but the financial impacts could be large.
Traditional models to estimate VaR and ES are investigated. Following usual practice, 99% 10-day VaR and ES measures are calculated. A comprehensive theoretical background is first provided and then the models are applied to the Africa Financials Index from 29/01/1996 to 30/04/2013. The models considered include independent, identically distributed (i.i.d.) models and Generalized Autoregressive Conditional Heteroscedasticity (GARCH) stochastic volatility models. Extreme Value Theory (EVT) models that focus especially on extreme market returns are also investigated. For this, the Peaks Over Threshold (POT) approach to EVT is followed. For the calculation of VaR, various scaling methods from one day to ten days are considered and their performance evaluated.
The GARCH models fail to converge during periods of extreme returns. During these periods, EVT forecast results may be used. As a novel approach, this study considers the augmentation of the GARCH models with EVT forecasts. The two-step procedure of pre-filtering with a GARCH model and then applying EVT, as suggested by McNeil (1999), is also investigated.
This study identifies some of the practical issues in model fitting. It is shown that no single forecasting model is universally optimal; the choice depends on the nature of the data. For this data series, the best approach was to augment the GARCH stochastic volatility models with EVT forecasts during the periods where the former do not converge. Model performance is judged by the actual number of VaR and ES violations compared to the expected number. The expected number is taken as the number of return observations over the entire sample period, multiplied by 0.01 for the 99% VaR and ES calculations.
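McNeil's two-step procedure lends itself to a short sketch. The following assumes the third-party arch package for the GARCH(1,1) step and scipy for the GPD tail fit, with simulated heavy-tailed data standing in for the Africa Financials Index; it illustrates the general method, not the study's code.

```python
import numpy as np
from arch import arch_model
from scipy.stats import genpareto

rng = np.random.default_rng(1)
returns = rng.standard_t(df=5, size=3000) * 0.01   # stand-in index returns

# Step 1: GARCH(1,1) pre-filter to strip volatility clustering.
res = arch_model(100 * returns, vol="GARCH", p=1, q=1).fit(disp="off")
z = res.resid / res.conditional_volatility          # standardized residuals

# Step 2: POT/EVT fit on the standardized loss tail.
u = np.quantile(-z, 0.95)
exc = -z[-z > u] - u
xi, _, beta = genpareto.fit(exc, floc=0.0)
zeta = exc.size / z.size
z_q = u + beta / xi * (((1 - 0.99) / zeta) ** (-xi) - 1)   # 99% residual quantile

# One-day 99% VaR recombines the conditional volatility forecast with the
# EVT quantile of the innovations.
fc = res.forecast(horizon=1)
sigma_next = np.sqrt(fc.variance.values[-1, 0]) / 100
print(f"1-day 99% VaR: {sigma_next * z_q:.4%}")
```

Scaling to the 10-day horizon would then apply one of the scaling methods the study evaluates (e.g. square-root-of-time under i.i.d. assumptions).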
216
用極值理論分析次級房貸風暴的衝擊-以全球市場為例 / Using extreme value theory to analyze the US sub-prime mortgage crisis on the global stock market
彭富忠, Peng, Fu Chung Unknown Date (has links)
The US sub-prime mortgage crisis greatly affected not only the US economy but also other countries around the world. This thesis employs extreme value theory and Value at Risk (VaR) analysis to assess the impact of the US sub-prime mortgage crisis on various stock markets through the MSCI indexes, covering 10 countries and 7 regions. It is reasonable to expect that VaR should increase after the crisis. The empirical analyses of these indexes conclude that (1) the American market indexes do not agree with this expectation after the crisis, and four American indexes behave identically; (2) not all the Asian market indexes are consistent with the expectation; (3) the European market indexes agree with the expectation; (4) MSCI AC PACIFIC, NEW ZEALAND, and AUSTRALIA are consistent with the expectation; (5) in some MSCI indexes, the behavior of the positive log returns differs from that of the negative returns. Overall, the impact of the US sub-prime mortgage crisis is not the same across these markets.
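Finding (5), the asymmetry between positive and negative log returns, amounts to fitting the two tails separately and comparing the estimated GPD shape parameters. The sketch below illustrates this with simulated returns in place of the MSCI data; the threshold fraction is an assumption.

```python
import numpy as np
from scipy.stats import genpareto

def tail_shape(x, tail_frac=0.05):
    """GPD shape (tail heaviness) fitted to the upper tail of x via POT."""
    u = np.quantile(x, 1 - tail_frac)
    exc = x[x > u] - u
    xi, _, _ = genpareto.fit(exc, floc=0.0)
    return xi

rng = np.random.default_rng(7)
r = rng.standard_t(df=4, size=4000) * 0.01   # stand-in for MSCI log returns

# Fit the gain tail and the loss tail separately; differing shapes signal
# the asymmetry reported for some indexes.
print(f"gain-tail shape: {tail_shape(r):.3f}")
print(f"loss-tail shape: {tail_shape(-r):.3f}")
```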
217
Les approches extrêmes de la contagion sur les marchés financiers / Extreme approaches of contagion in financial markets
Xu, Bei 16 November 2012 (has links)
The thesis consists of three parts. The first part introduces a number of measures of extreme dependence. An application to the stock and bond markets of 49 countries shows that multivariate extreme value theory leads to results different from those based on the correlation coefficient, but relatively close to those obtained from the multivariate conditional Spearman's rho. This part also assesses the risk of large simultaneous losses. The second part examines the determinants of extreme co-movements between 5 core countries and 49 non-core countries. The transmission mechanisms of shocks vary from the less recent to the recent period, from developed to emerging markets, and from normal to extreme shocks. The third part examines the role of gold as a safe haven over the period 1986-2012. Extreme positive gains on gold can be linked to extreme losses on the S&P. However, this link is not always valid; it evolves over time and appears to be conditioned by other factors.
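One simple member of the family of extreme-dependence measures surveyed in the first part is the empirical tail-dependence coefficient. The following sketch is a generic illustration with an invented core/non-core return pair, not the thesis's estimator.

```python
import numpy as np

def upper_tail_dependence(x, y, q=0.95):
    """Empirical P(y exceeds its q-quantile | x exceeds its q-quantile).
    Values near 0 suggest tail independence; values near 1 indicate strong
    extreme co-movement."""
    x_ext = x > np.quantile(x, q)
    y_ext = y > np.quantile(y, q)
    return (x_ext & y_ext).sum() / x_ext.sum()

rng = np.random.default_rng(3)
core = rng.standard_t(df=4, size=5000)                        # core market
noncore = 0.6 * core + 0.8 * rng.standard_t(df=4, size=5000)  # linked market
print(f"tail-dependence estimate: {upper_tail_dependence(core, noncore):.3f}")
```

Unlike the correlation coefficient, this statistic looks only at joint exceedances, which is why the two can disagree as the abstract reports.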
218
Measuring and managing operational risk in the insurance and banking sectors / Mesure et gestion du risque opérationnel en assurance et finance
Karam, Elias 26 June 2014 (has links)
Our interest in this thesis is first to combine the different measurement techniques for operational risk in financial companies, highlighting throughout the consequences of estimation risk, which we treat as a particular component of operational risk. In the first part, we present a full overview of operational risk, from the regulatory laws and regulations to the associated mathematical and actuarial concepts, together with a numerical application of the Advanced Measurement Approach, using the Loss Distribution Approach to calculate the capital requirement and then applying Extreme Value Theory. We conclude this first part by setting out a scaling technique based on ordinary least squares (OLS) that enables us to normalize external data to a local Lebanese bank. In the second part, we address estimation risk: we first measure the error induced on the SCR by the estimation error of the parameters, then propose an alternative yield-curve estimation, and finally argue that careful reflection on the calculation assumptions, rather than insistence on the so-called hypothesis "consistent with market values", would be more appropriate and effective than complicating the models, which generates additional errors and instability. The chapters in this part illustrate estimation risk in its different aspects as a part of operational risk, highlighting the attention it should be given in our models.
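The Loss Distribution Approach mentioned above is commonly implemented by Monte Carlo compounding of a frequency and a severity distribution. The sketch below illustrates that generic recipe with invented Poisson/lognormal parameters, not the thesis's calibration.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical LDA cell: Poisson annual loss counts, lognormal severities.
lam, mu, sigma = 25.0, 9.0, 2.0
years = 100_000

annual_losses = np.array([
    rng.lognormal(mu, sigma, size=n).sum()      # aggregate loss for one year
    for n in rng.poisson(lam, size=years)
])

# AMA-style capital charge: the 99.9% quantile of the aggregate annual loss.
print(f"99.9% aggregate-loss quantile: {np.quantile(annual_losses, 0.999):,.0f}")
```

Applying Extreme Value Theory, as the abstract describes, would replace or augment the lognormal body with a GPD fit to the largest severities.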
219
Spin-glass models and interdisciplinary applications / Modèles de verre de spin et applications interdisciplinaires
Zarinelli, Elia 13 January 2012 (has links)
The main subject of this thesis is the physics of spin glasses. Introduced in the early 70s to describe dilute magnetic alloys, spin-glass models have since been considered prototype models for understanding the behavior of supercooled liquids. Problems of combinatorial optimization are among the systems that can be described and analyzed in the language of disordered systems. In the first part of the thesis, we consider spin-glass models with Kac interactions in order to investigate the supercooled phase of glass-forming liquids. We then show how some features of spin-glass models can be obtained from results of Random Matrix Theory in connection with Extreme Value Statistics. Finally, at the interface of spin-glass theory and computer science, we put forward a new algorithm with immediate application to financial problems.
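The Random Matrix Theory and Extreme Value Statistics connection can be seen in a short numerical experiment: the largest eigenvalue of a GOE matrix fluctuates around the spectral edge on the Tracy-Widom scale n^(-2/3), unlike the maximum of independent variables, which would follow a Gumbel law. The sketch below is illustrative only; matrix size and sample count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)

def goe_max_eigenvalue(n):
    """Largest eigenvalue of an n x n GOE matrix, scaled so the spectral
    edge sits near 2."""
    a = rng.normal(size=(n, n))
    h = (a + a.T) / np.sqrt(2 * n)
    return np.linalg.eigvalsh(h)[-1]   # eigvalsh returns ascending order

samples = np.array([goe_max_eigenvalue(200) for _ in range(500)])
# Fluctuations around the edge are of order n**(-2/3) (Tracy-Widom law).
print(f"mean: {samples.mean():.3f}, spread: {samples.std():.4f}")
```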
220
Contributions aux algorithmes stochastiques pour le Big Data et à la théorie des valeurs extrêmes multivariées / Contributions to stochastic algorithms for Big Data and multivariate extreme value theory
Ho, Zhen Wai Olivier 04 October 2018 (has links)
This thesis is divided into two parts. The first part studies models for multivariate extremes. We give a method to construct multivariate regularly varying random vectors. The method is based on a multivariate extension of a Breiman lemma, which states that a product $RZ$ of a non-negative regularly varying random variable $R$ and a non-negative, sufficiently integrable variable $Z$ is also regularly varying. Replacing $Z$ with a random vector $\mathbf{Z}$, we show that the product $R\mathbf{Z}$ is regularly varying and we characterise its limit measure. Then, we show that for suitably chosen distributions of $\mathbf{Z}$, we recover classical max-stable models such as the extremal-t and Hüsler-Reiss models. We extend our construction to the notion of non-standard multivariate regular variation. Next, we show that the Pareto model associated with the Hüsler-Reiss max-stable model (the Hüsler-Reiss Pareto model) forms a full exponential family. We establish some properties of this model, give an algorithm for exact simulation, and study maximum likelihood inference. Finally, we consider an extension of the Hüsler-Reiss Pareto model using non-standard regular variation, study its maximum likelihood inference, propose a parameter estimation method, and provide a numerical study of the maximum likelihood estimator.
In the second part, on statistical learning, we start by giving a lower bound on the smallest singular value of a matrix perturbed by appending a column. We then propose a greedy column-selection algorithm for feature extraction and illustrate it on a real time-series dataset, where each series is taken as a column of the matrix. Secondly, we show that an incoherent matrix $X$ also satisfies a weakened version of the null space property (NSP). Thirdly, we study the problem of incoherent-submatrix selection: given $X \in \mathbb{R}^{n \times p}$ and $\mu > 0$, we seek the largest submatrix of $X$ with coherence below $\mu$. The problem is formulated as a linear program with a quadratic constraint on $\{0,1\}^p$; since it is NP-hard, we consider a relaxation on the sphere and bound the relaxation error. Finally, we analyse projected stochastic gradient descent for online PCA: we show that, in expectation, the algorithm converges to a leading eigenvector, propose a step-size selection rule, and illustrate the algorithm with a simulation experiment.
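The final contribution, projected stochastic gradient descent for online PCA, admits a compact Oja-style sketch. The data stream and the 1/t step size below are illustrative assumptions, not the thesis's step-size selection rule.

```python
import numpy as np

rng = np.random.default_rng(5)

# Stream with one dominant direction (invented for illustration).
d = 20
top = rng.normal(size=d)
top /= np.linalg.norm(top)

w = rng.normal(size=d)
w /= np.linalg.norm(w)                     # start on the unit sphere

for t in range(1, 50_001):
    x = 3.0 * rng.normal() * top + 0.5 * rng.normal(size=d)  # one observation
    w += (1.0 / t) * x * (x @ w)           # stochastic gradient step on w'Cw
    w /= np.linalg.norm(w)                 # project back onto the sphere

print(f"alignment with leading eigenvector: {abs(w @ top):.4f}")
```

In expectation the iterate aligns with the leading eigenvector of the covariance, which is the convergence property the abstract states.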