221 |
Métodos de Monte Carlo Hamiltoniano na inferência Bayesiana não-paramétrica de valores extremos / Hamiltonian Monte Carlo methods in non-parametric Bayesian inference of extreme values. Hartmann, Marcelo. 09 March 2015 (has links)
In this work we propose a Bayesian nonparametric approach for modeling extreme value data. We treat the location parameter μ of the generalized extreme value distribution as a random function and place a Gaussian process prior on it (Rasmussen & Williams 2006). This configuration leads to no closed-form expression for the high-dimensional posterior distribution. To tackle this problem we use the Riemannian manifold Hamiltonian Monte Carlo algorithm, which allows sampling from posterior distributions with complex shapes and unusual correlation structures (Calderhead & Girolami 2011). Moreover, we propose an autoregressive time series model of order p with generalized extreme value noise and derive its Fisher information matrix. Throughout this work we use computational simulation studies to assess the performance of the algorithm in its variants, and we present many examples with simulated and real data sets.
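The generalized extreme value (GEV) building block of the model above is small enough to sketch directly. This is an illustrative sketch, not the thesis's code; the parameterization (location mu, scale sigma, shape xi) is the standard one, and in the thesis mu would itself be a Gaussian-process-distributed function rather than a scalar.

```python
import math

def gev_logpdf(x, mu, sigma, xi):
    """Log-density of the generalized extreme value (GEV) distribution.
    mu: location (a Gaussian-process function in the thesis's model),
    sigma > 0: scale, xi: shape."""
    z = (x - mu) / sigma
    if abs(xi) < 1e-12:                 # Gumbel limit as xi -> 0
        return -math.log(sigma) - z - math.exp(-z)
    arg = 1.0 + xi * z
    if arg <= 0.0:                      # outside the support
        return float("-inf")
    return -math.log(sigma) - (1.0 + 1.0 / xi) * math.log(arg) - arg ** (-1.0 / xi)

def gev_return_level(p, mu, sigma, xi):
    """Level exceeded with probability p in one block, i.e. the (1-p)-quantile."""
    y = -math.log(1.0 - p)
    if abs(xi) < 1e-12:
        return mu - sigma * math.log(y)
    return mu + sigma * (y ** (-xi) - 1.0) / xi
```

For example, gev_return_level(0.01, 0.0, 1.0, 0.1) is the level exceeded once per 100 blocks on average under a standardized GEV with shape 0.1.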
|
222 |
Estimação de medidas de risco utilizando modelos CAViaR e CARE / Risk measures estimation using CAViaR and CARE models. Silva, Francyelle de Lima e. 06 August 2010 (has links)
In this work, Value at Risk and Expected Shortfall are defined, discussed, and estimated. These are market risk measures widely used by companies and investors to manage the risks to which they may be exposed. The aim is to present and apply several methods and models for estimating these measures, and to establish which model is most appropriate in given scenarios.
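As a rough illustration of the two measures being compared, a minimal historical-simulation estimator (one of the standard baselines, not the CAViaR/CARE models of the thesis) can be written as:

```python
def var_es_historical(returns, alpha=0.95):
    """Historical-simulation VaR and Expected Shortfall at level alpha.
    Losses are the negated returns; VaR is the alpha-quantile of losses,
    ES the average loss beyond VaR."""
    losses = sorted((-r for r in returns), reverse=True)   # worst first
    k = max(1, round((1.0 - alpha) * len(losses)))         # tail sample size
    var = losses[k - 1]
    es = sum(losses[:k]) / k
    return var, es
```

For 100 daily returns and alpha = 0.95, the worst five losses define the tail: VaR is the fifth-worst loss and ES their average, which is why ES is always at least as large as VaR.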
|
223 |
Použití koherentních metod měření rizika v modelování operačních rizik / The use of coherent risk measures in operational risk modeling. Lebovič, Michal. January 2012 (has links)
The debate on quantitative operational risk modeling only started at the beginning of the last decade, and best practices are still far from established. Estimation of capital requirements for operational risk under the Advanced Measurement Approaches of Basel II depends critically on the choice of risk measure, which quantifies the risk exposure based on the underlying simulated distribution of losses. Despite its well-known caveats, Value-at-Risk remains the predominant risk measure used in operational risk management. We describe several serious drawbacks of Value-at-Risk and explain why it can lead to misleading conclusions. As a remedy we suggest the use of coherent risk measures - namely the statistic known as Expected Shortfall - as a suitable alternative or complement for quantifying operational risk exposure. We demonstrate that the application of Expected Shortfall in operational loss modeling is feasible and produces reasonable and consistent results. We also consider a variety of statistical techniques for modeling the underlying loss distribution, and we find the extreme value theory framework the most suitable for this purpose. Using stress tests we further compare the robustness and consistency of selected models and their implied risk capital estimates...
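The non-subadditivity of Value-at-Risk that motivates coherent measures can be seen in a classic two-loan toy example (an illustration of the general point, not an example taken from the thesis):

```python
from itertools import product

def var_discrete(outcomes, alpha):
    """alpha-level VaR of a discrete loss distribution, given as a list of
    (loss, probability) pairs: the smallest loss whose cumulative
    probability reaches alpha."""
    total = 0.0
    for loss, prob in sorted(outcomes):
        total += prob
        if total >= alpha:
            return loss
    return max(loss for loss, _ in outcomes)

# Two independent loans, each losing 100 with probability 0.04.
single = [(0, 0.96), (100, 0.04)]
joint = {}
for (la, pa), (lb, pb) in product(single, single):
    joint[la + lb] = joint.get(la + lb, 0.0) + pa * pb
portfolio = sorted(joint.items())

var_single = var_discrete(single, 0.95)       # 0: default prob. is below 5%
var_portfolio = var_discrete(portfolio, 0.95) # 100: P(no default) = 0.9216 < 0.95
```

Here VaR(A+B) = 100 > VaR(A) + VaR(B) = 0, so diversification appears to increase risk; Expected Shortfall, being coherent, cannot behave this way.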
|
225 |
以風險值衡量銀行外匯部位資本之計提 / Measuring capital charges for banks' foreign exchange positions with Value at Risk. 陳昀聖, Chen Yun-Sheng. Unknown Date (has links)
The purpose of this thesis is to compare the standardized approach and the Value-at-Risk (VaR) approach in terms of the capital charges they require for foreign exchange positions. For the VaR approach, three measurement methods are adopted: the variance-covariance method, historical simulation, and an extreme value method, and backtesting is used to examine each method's ability to forecast risk. The standardized approach refers to the standard capital-charge rules prescribed by the Ministry of Finance.
The empirical results show that the capital charge computed under the VaR approach is about half of that required under the standardized approach; in other words, the standardized approach ties up excessive funding costs. From a safety perspective, backtesting shows that historical simulation and the extreme value method are trustworthy bases for capital charges, whereas the variance-covariance method tends to underestimate risk. However, because the extreme value method requires a very large amount of data, historical simulation is recommended; relative to the standardized approach it can save considerable funding costs.
Chapter 1: Research motivation and objectives
Chapter 2: Domestic and international capital adequacy regulations
Section 1: Development of the BIS capital adequacy requirements
Section 2: Relevant Taiwanese regulations
Chapter 3: Literature review
Chapter 4: Methodology and models
Section 1: VaR models
Section 2: Backtesting
Chapter 5: Empirical analysis
Section 1: Data description
Section 2: Empirical results
Chapter 6: Conclusions
References
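A sketch of two of the ingredients compared in this thesis, the variance-covariance (delta-normal) VaR and a simple exception-count backtest, might look as follows. This is illustrative only; the parameter choices are assumptions, not the thesis's.

```python
import statistics

def var_variance_covariance(returns, position_value, alpha=0.99):
    """Delta-normal (variance-covariance) VaR of a single FX position:
    normal quantile times return volatility times position size."""
    z = {0.95: 1.645, 0.99: 2.326}[alpha]    # standard normal quantiles
    return z * statistics.pstdev(returns) * position_value

def backtest_exceptions(realized_losses, var_forecasts):
    """Exception count for backtesting: days on which the realized loss
    exceeded the VaR forecast (compare with the expected (1 - alpha) * n)."""
    return sum(1 for loss, var in zip(realized_losses, var_forecasts)
               if loss > var)
```

A model underestimates risk, as the thesis finds for the variance-covariance method, when the exception count is significantly above the expected (1 - alpha) fraction of days.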
|
226 |
CCM3 as applied to an idealized all-land zonally symmetric planet, Terra Blanda 3. Mahajan, Salil. 17 February 2005 (has links)
Community Climate Model 3 (CCM3) is run on an idealized all-land, zonally symmetric planet (Terra Blanda) with no seasonality, no snow, and fixed soil moisture, to obtain a stationary time series effectively much longer than conventional runs with geography and seasons. The surface temperature field generated is studied to analyze the spatial and temporal spectra, estimate the length and time scales of the model, and test the linearity of the response to periodic and steady heat-source forcings. The length scale of the model is found to be in the range of 1000-2000 km, and the time scale is estimated to be 24 days for the globally averaged surface temperature field. The response of the globally averaged temperature is found to be fairly linear in both the periodic and the steady heat-source forcings. The results are compared with those of a similar study that used CCM0. The fluctuation-dissipation theorem is also tested for applicability to CCM3: the response of the surface temperature field to a step-function forcing is shown to be very similar to the decay of naturally occurring anomalies and to the autocorrelation function. The return period of surface temperature anomalies is also studied; it is observed that the length of the data obtained from CCM3, though sufficient for analysis of first and second moments, is significantly deficient for return-period analysis. An AR(1) process is fitted to the globally averaged surface temperature of Terra Blanda 3 to assess the sampling error associated with short runs.
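The AR(1) null model used above for the globally averaged temperature can be sketched in a few lines. This is an illustration, not the thesis's code; the 24-day figure is used only to show how an e-folding time scale maps to a lag-one autocorrelation.

```python
import math
import random

def simulate_ar1(n, phi, sigma_eps, seed=0):
    """AR(1) process x_t = phi * x_{t-1} + eps_t, eps_t ~ N(0, sigma_eps^2),
    a common null model for globally averaged temperature anomalies."""
    rng = random.Random(seed)
    x, series = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, sigma_eps)
        series.append(x)
    return series

def ar1_time_scale(phi, dt=1.0):
    """e-folding decorrelation time tau: the lag-k autocorrelation of an
    AR(1) process is phi**k = exp(-k * dt / tau)."""
    return -dt / math.log(phi)
```

With daily steps, a 24-day time scale corresponds to phi = exp(-1/24), roughly 0.96, which is why very long runs are needed before tail statistics such as return periods become reliable.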
|
227 |
Analyzing value at risk and expected shortfall methods: the use of parametric, non-parametric, and semi-parametric models. Huang, Xinxin. 25 August 2014 (has links)
Value at Risk (VaR) and Expected Shortfall (ES) are methods often used to measure market risk. Inaccurate and unreliable VaR and ES models can lead to underestimation of the market risk to which a firm or financial institution is exposed, and may therefore jeopardize its well-being or survival during adverse markets. The objective of this study is to examine various VaR and ES models, including fatter-tailed models, in order to analyze their accuracy and reliability.
Thirteen VaR and ES models under three main approaches (parametric, non-parametric, and semi-parametric) are examined in this study. The results show that the proposed model (ARMA(1,1)-GJR-GARCH(1,1)-SGED) gives the most balanced Value at Risk results, and that the semi-parametric model (extreme value theory, EVT) is the most accurate Value at Risk model in this study for the S&P 500. / October 2014
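As a sketch of the semi-parametric (EVT) approach that the study finds most accurate, the peaks-over-threshold VaR formula based on a fitted generalized Pareto distribution can be coded directly. The GPD parameters below are illustrative, not fitted values from the thesis.

```python
def evt_var_pot(u, beta, xi, n, n_exceed, p):
    """Peaks-over-threshold VaR from a generalized Pareto fit: u is the
    threshold, beta and xi (nonzero) the GPD scale and shape fitted to the
    n_exceed excesses over u among n observations, p the confidence level."""
    ratio = (n / n_exceed) * (1.0 - p)
    return u + (beta / xi) * (ratio ** (-xi) - 1.0)
```

For example, a threshold u = 2 with 50 exceedances among 1000 observations and fitted GPD parameters beta = 0.5, xi = 0.2 gives a 99% VaR of about 2.95, i.e. moderately beyond the threshold.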
|
228 |
Characterization and construction of max-stable processes. Strokorb, Kirstin. 02 July 2013 (has links)
No description available.
|
229 |
Measuring and managing operational risk in the insurance and banking sectors. Karam, Elias. 26 June 2014 (has links) (PDF)
Our interest in this thesis is first to combine the different measurement techniques for operational risk in financial companies, highlighting the consequences of estimation risk, which we treat as a particular part of operational risk. In the first part, we present a full overview of operational risk, from the regulatory laws and regulations to the associated mathematical and actuarial concepts, as well as a numerical application of the Advanced Measurement Approach, using the Loss Distribution Approach to calculate the capital requirement and then applying Extreme Value Theory. We conclude this first part by setting out a scaling technique based on ordinary least squares (OLS) that enables us to normalize our external data to a local Lebanese bank. In the second part, we focus on estimation risk: we first measure the error induced on the SCR by the estimation error of the parameters, then propose an alternative yield-curve estimation, and finally argue that reflecting on the assumptions of the calculation, rather than concentrating on the so-called hypothesis of consistency with market values, would be more appropriate and effective than complicating the models and generating additional errors and instability. The chapters in this part illustrate estimation risk in its different aspects as a part of operational risk, highlighting the care that should be taken in treating our models.
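A minimal Loss Distribution Approach simulation of the kind described above (Poisson frequency compounded with an arbitrary severity distribution) can be sketched as follows; the rate, severity, and quantile choices are assumptions for illustration, not the thesis's calibration.

```python
import math
import random

def poisson_draw(rng, lam):
    """Knuth's Poisson sampler (adequate for moderate lam > 0)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while p > threshold:
        p *= rng.random()
        k += 1
    return k - 1

def lda_quantile(lam, severity, n_years, q, seed=0):
    """Loss Distribution Approach sketch: simulate n_years of aggregate
    annual losses (Poisson frequency, arbitrary severity sampler) and
    return the q-quantile of the simulated distribution, i.e. the
    capital charge at level q."""
    rng = random.Random(seed)
    totals = sorted(
        sum(severity(rng) for _ in range(poisson_draw(rng, lam)))
        for _ in range(n_years)
    )
    return totals[min(int(q * n_years), n_years - 1)]
```

For instance, lda_quantile(10.0, lambda r: r.lognormvariate(0.0, 1.0), 20000, 0.999) estimates a 99.9% capital charge for a Poisson(10) frequency with lognormal severities; Extreme Value Theory would typically be layered on top to model the severity tail.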
|
230 |
電路設計中電流值之罕見事件的統計估計探討 / A study of statistical methods for estimating rare events in IC current. 彭亞凌, Peng, Ya Ling. Unknown Date (has links)
Obtaining the tail distribution of current beyond 4 to 6 standard deviations from the mean is nowadays a key issue in integrated circuit (IC) design, and computer simulation is a popular tool for estimating the tail values. Since generating rare events by Monte Carlo simulation is time-consuming, linear extrapolation methods (such as regression analysis) are often applied to improve efficiency; past work has shown, however, that the tail values are likely to behave differently when the operating voltage is lowered, and that covariates can be hard to collect, which makes tail estimation difficult. In this study, a statistical method is introduced for the lower-voltage case. The data are first examined via the Box-Cox (power) transformation to decide whether they should be transformed toward normality, improving the estimation of the tail distribution, and weighted regression is then used to extrapolate the rare tail values. Specifically, the independent variable is the empirical CDF under a logarithm or z-score transformation, and a down-weighting scheme is applied so that the extreme observations carry relatively more information. The study also considers the case where full circuit covariates can be collected, using them as regressors in the weighted regression. In addition to regression analysis, extreme value theory (EVT) is adopted as an estimation method.
Computer simulations and data sets from an IC manufacturer in Hsinchu are used to evaluate the proposed method with respect to mean squared error. In the simulations, the data are generated from normal, Student's t, or gamma distributions, and the results support the feasibility of the weighted regression approach. For the empirical data, there are 10^8 observations, and the tail values with probabilities 10^(-4), 10^(-5), 10^(-6), and 10^(-7) are set as the study goal given that only 10^5 observations are available. Compared with the traditional methods and EVT, the proposed weighted regression with Box-Cox transformation performs best in estimating both left- and right-tail values, with the logarithm transformation preferred for the right tail and the z-score transformation for the left tail; if the IC current is produced from a regression equation and the independent variables are available, weighted regression achieves the most accurate estimates for the left-tail rare events. EVT with a 1% threshold also produces stable, moderately accurate estimates across operating voltages, provided that the tail probabilities to be estimated and the available observations are on a similar scale, e.g., probabilities 10^(-5)-10^(-7) vs. 10^5 observations.
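Two of the ingredients above, the Box-Cox transformation and a weighted least-squares line fit, are small enough to sketch directly. This is illustrative; the thesis's actual weighting scheme and variable transformations are more elaborate.

```python
import math

def box_cox(x, lam):
    """Box-Cox power transform toward normality (requires x > 0);
    lam = 0 gives the log transform."""
    if abs(lam) < 1e-12:
        return math.log(x)
    return (x ** lam - 1.0) / lam

def weighted_line_fit(xs, ys, weights):
    """Weighted least squares for y ~ a + b * x.  Shifting weight toward
    the most extreme observations mimics the emphasis the thesis places
    on tail samples when extrapolating."""
    sw = sum(weights)
    mx = sum(w * x for w, x in zip(weights, xs)) / sw
    my = sum(w * y for w, y in zip(weights, ys)) / sw
    b = (sum(w * (x - mx) * (y - my) for w, x, y in zip(weights, xs, ys))
         / sum(w * (x - mx) ** 2 for w, x in zip(weights, xs)))
    return my - b * mx, b
```

In the thesis's setting, x would be a transformed empirical CDF (log or z-score) and y the Box-Cox-transformed current, with the fitted line extrapolated out to tail probabilities of 10^(-4) to 10^(-7).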
|