81 |
Enhanced Power System Operational Performance with Anticipatory Control under Increased Penetration of Wind Energy (January 2016)
Abstract: As the world embraces a sustainable energy future, alternative energy resources, such as wind power, are increasingly being seen as an integral part of the future electric energy grid. Ultimately, integrating such a dynamic and variable mix of generation requires a better understanding of renewable generation output, in addition to power grid systems that improve power system operational performance in the presence of anticipated events such as wind power ramps. Because of the stochastic, uncontrollable nature of renewable resources, a thorough and accurate characterization of wind activity is necessary to maintain grid stability and reliability. Wind power ramps from an existing wind farm are studied to characterize persistence forecasting errors using extreme value analysis techniques. In addition, a novel metric that quantifies the amount of non-stationarity in time-series wind power data is proposed and used in a real-time algorithm to provide a rigorous method that adaptively determines training data for forecasts. Lastly, because large swings in generation or load can cause system frequency and tie-line flows to deviate from nominal, an anticipatory MPC-based secondary control scheme was designed and integrated into an automatic generation control loop to improve the ability of an interconnection to respond to anticipated large events and fluctuations in the power system. / Doctoral Dissertation, Electrical Engineering, 2016
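A minimal sketch of the persistence-forecast idea described above, using a synthetic wind power series as a stand-in for the wind farm data (the series, horizon, and quantile levels are illustrative assumptions, not values from the dissertation):

```python
import numpy as np

# Hypothetical wind power series (MW), sampled every 10 minutes;
# a stand-in for real wind farm output.
rng = np.random.default_rng(7)
power = np.clip(50 + np.cumsum(rng.normal(0, 2, 1000)), 0, 100)

# Persistence forecast: the forecast for time t + h is the value at time t.
h = 6  # 1-hour-ahead horizon at 10-minute resolution
forecast = power[:-h]
actual = power[h:]
errors = actual - forecast

# Characterize the error tails (the dissertation does this with extreme
# value analysis; here we only inspect high quantiles of the ramp errors).
for q in (0.95, 0.99):
    print(f"{q:.0%} absolute-error quantile: {np.quantile(np.abs(errors), q):.2f} MW")
```

Large values of the high error quantiles flag the ramp events for which persistence forecasting breaks down, which is where extreme value techniques become useful.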
82 |
Inferência Bayesiana para valores extremos / Bayesian inference for extremes. Bernardini, Diego Fernando de (15 August 2018)
Advisor: Laura Leticia Ramos Rifo. Master's dissertation, Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica. Made available in DSpace on 2018-08-15.
Previous issue date: 2010 / Abstract: We begin this work by presenting a brief introduction to extreme value theory, studying in particular the behavior of the random variable that represents the maximum of a sequence of independent and identically distributed random variables.
We see that the Extremal Types Theorem (or Fisher-Tippett Theorem) is a fundamental tool in the study of the asymptotic behavior of these maxima, allowing the modeling of data that represent a sequence of observed maxima of a given phenomenon or random process through a class of distributions known as the Generalized Extreme Value (GEV) family. The Gumbel distribution, associated with the maxima of distributions such as the Normal or Gamma, is a particular case of this family. We are interested in making inference about the parameters of this family. Specifically, the comparison between the Gumbel and GEV models constitutes the main focus of this work. In Chapter 1 we study, in the context of classical inference, the method of maximum likelihood estimation for these parameters and a likelihood ratio test procedure suitable for testing the null hypothesis associated with the Gumbel model against the hypothesis that represents the complete GEV model. We proceed, in Chapter 2, with a brief review of Bayesian inference theory, obtaining inferences for the parameters of interest in terms of their posterior distribution. We also study the predictive distribution for future values. With respect to model comparison, we first study the Bayes factor and the posterior Bayes factor in the Bayesian context. Next we study the Full Bayesian Significance Test (FBST), a significance test particularly suitable for testing precise hypotheses, such as the hypothesis characterizing the Gumbel model. Furthermore, we study two other criteria for comparing models, the BIC (Bayesian Information Criterion) and the DIC (Deviance Information Criterion). We study these evidence measures specifically in the context of the comparison between the Gumbel and GEV models, as well as the predictive distribution, credible intervals, and posterior inference for the return levels associated with fixed return periods. Chapter 1 and part of Chapter 2 provide the basic theoretical foundations of this work and are strongly based on Coles (2001) and O'Hagan (1994).
In Chapter 3 we present the well-known Metropolis-Hastings algorithm for simulating probability distributions, and the particular algorithm used in this work to obtain simulated samples from the posterior distribution of the parameters of interest. In the next chapter we formulate the modeling of the observed maxima, presenting the likelihood function and setting the prior distribution for the parameters. Two applications are presented in Chapter 5. The first deals with observations of the quarterly maxima of unemployment rates in the United States of America, between the first quarter of 1994 and the first quarter of 2009. In the second application we study the semiannual maxima of sea levels at Newlyn, in the southwest of England, between 1990 and 2007. Finally, a brief discussion is presented in Chapter 6. / Master's in Statistics (Mestre em Estatística)
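The classical Gumbel-versus-GEV comparison of Chapter 1 can be sketched with a likelihood ratio test; the data below are simulated stand-ins, and the use of scipy (whose `genextreme` shape parameter is the negative of the usual GEV shape) is an assumption of this illustration, not the thesis's code:

```python
import numpy as np
from scipy import stats

# Simulated block maxima standing in for observed maxima.
maxima = stats.genextreme.rvs(c=-0.2, loc=10.0, scale=1.0, size=200, random_state=0)

# Fit the full GEV model and the restricted Gumbel model (shape = 0).
gev_params = stats.genextreme.fit(maxima)
gum_params = stats.gumbel_r.fit(maxima)

ll_gev = stats.genextreme.logpdf(maxima, *gev_params).sum()
ll_gum = stats.gumbel_r.logpdf(maxima, *gum_params).sum()

# Likelihood ratio statistic: under H0 (Gumbel), approximately
# chi-squared distributed with 1 degree of freedom.
lr = 2.0 * (ll_gev - ll_gum)
p_value = stats.chi2.sf(lr, df=1)
print(f"LR = {lr:.2f}, p-value = {p_value:.4f}")
```

A small p-value favors the full GEV model over the Gumbel restriction, which is the frequentist counterpart of the Bayesian comparisons (Bayes factor, FBST, BIC, DIC) studied in the thesis.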
83 |
The best are never normal: exploring the distribution of firm performance. Buchbinder, Felipe (08 June 2011)
Previous issue date: 2011-06-08 / Competitive Strategy literature predicts three different mechanisms of performance generation, distinguishing among firms that have a competitive advantage, firms that have a competitive disadvantage, and firms that have neither. Nonetheless, previous works in the field have fitted a single normal distribution to model firm performance. Here, we develop a new approach that distinguishes among performance-generating mechanisms and allows the identification of firms with competitive advantage or disadvantage. Theorizing on the positive feedback loops by which firms with competitive advantage gain facilitated access to new resources, we propose a distribution that we believe data on firm performance should follow. We illustrate our model by assessing its fit to data on firm performance, addressing its theoretical implications, and comparing it to previous works.
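The core claim that several performance-generating mechanisms produce a non-normal aggregate can be illustrated with a hedged sketch; the three-component mixture and all its parameters below are invented for illustration and are not the distribution the dissertation proposes:

```python
import numpy as np
from scipy import stats

# Invented example: three performance-generating mechanisms produce a
# mixture that a single normal cannot describe.
rng = np.random.default_rng(42)
parity = rng.normal(0.0, 1.0, 7000)          # neither advantage nor disadvantage
advantage = rng.normal(3.0, 1.5, 1500)       # assumed right-shifted regime
disadvantage = rng.normal(-3.0, 1.5, 1500)   # assumed left-shifted regime
performance = np.concatenate([parity, advantage, disadvantage])

# D'Agostino-Pearson normality test: the pooled sample departs sharply
# from a single normal, while the pure one-mechanism sample typically does not.
_, p_mixture = stats.normaltest(performance)
_, p_single = stats.normaltest(parity)
print(f"p (mixture): {p_mixture:.2e}, p (single mechanism): {p_single:.3f}")
```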
84 |
Power Markets and Risk Management Modeling / Trhy s elektrickou energií a modelování v řízení rizik. Paholok, Igor (January 2012)
The main target of this thesis is to summarize and explain the specifics of power markets and to test the application of models that might be used especially in the risk management area. The thesis starts with a definition of market subjects, a typology of traded contracts, and a description of market development, with a focus on the Czech Republic. It continues with the development of theoretical concepts of short-term/spot electricity markets and the potential link between spot and forward electricity markets. After deriving these microeconomic fundamental models, we continue with stochastic models (a jump-diffusion mean-reverting process and Extreme Value Theory) in order to depict patterns of spot and forward power contract price volatility. The last chapter deals with the credit risk specifics of power trading and develops a model (using the concept known as Credit Value Adjustment) to compare the economic efficiency of OTC and exchange power trading. The models developed and described are tested on selected power markets, again with a focus on the Czech power market data set.
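The jump-diffusion mean-reverting process mentioned above can be sketched as a discretized simulation (Ornstein-Uhlenbeck dynamics plus occasional price spikes); all parameter values are illustrative assumptions, not calibrated to Czech market data:

```python
import numpy as np

# Minimal discretized mean-reverting jump-diffusion for a spot power price.
rng = np.random.default_rng(1)

kappa, mu, sigma = 5.0, 50.0, 8.0             # reversion speed, level, volatility
jump_prob, jump_mu, jump_sigma = 0.05, 20.0, 5.0  # spike frequency and size
dt, n_steps = 1 / 365, 365

prices = np.empty(n_steps)
prices[0] = mu
for t in range(1, n_steps):
    # Mean-reverting diffusion step plus an occasional upward spike.
    diffusion = kappa * (mu - prices[t - 1]) * dt + sigma * np.sqrt(dt) * rng.normal()
    jump = rng.normal(jump_mu, jump_sigma) if rng.random() < jump_prob else 0.0
    prices[t] = prices[t - 1] + diffusion + jump

print(f"mean: {prices.mean():.1f}, max: {prices.max():.1f}")
```

The spikes decay back toward the long-run level at rate `kappa`, which is the qualitative pattern (spiky but mean-reverting) that distinguishes spot power prices from financial asset prices.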
85 |
Essays on Emissions Trading Markets. Dhavala, Kishore (05 November 2012)
This dissertation is a collection of three economics essays on different aspects of carbon emission trading markets. The first essay analyzes the dynamic optimal emission control strategies of two nations. With the potential to become the largest buyer under the Kyoto Protocol, the US is assumed to be a monopsony, whereas with a large number of tradable permits on hand, Russia is assumed to be a monopoly. Optimal costs of emission control programs are estimated for both countries under four different market scenarios: non-cooperative no trade, US monopsony, Russia monopoly, and cooperative trading. The US monopsony scenario is found to be the most Pareto cost-efficient. The Pareto-efficient outcome, however, would require the US to make side payments to Russia, which would even out the differences in the cost savings from cooperative behavior.

The second essay analyzes the price dynamics of the Chicago Climate Exchange (CCX), a voluntary emissions trading market. By examining the volatility in market returns using AR-GARCH and Markov switching models, the study associates the market price fluctuations with two different political regimes of the US government. Further, the study identifies high volatility in the returns a few months before the market collapse. Three possible regulatory and market-based forces are identified as probable causes of the market's volatility and its ultimate collapse. Organizers of other voluntary markets in the US and worldwide may watch closely for these regime-switching forces in order to avert emission market crashes.

The third essay compares excess skewness and kurtosis in carbon prices between the CCX and EU ETS (European Union Emission Trading Scheme) Phase I and II markets by examining the tail behavior when market expectations exceed a threshold level. Dynamic extreme value theory is used to estimate the mean price exceedance over the threshold levels and to estimate the risk of loss. The calculated risk measures suggest that CCX and EU ETS Phase I were extremely immature markets for a risk investor, whereas EU ETS Phase II is a more stable market that could develop into a mature carbon market in future years.
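The dynamic-EVT tail estimation described above typically rests on the peaks-over-threshold method; a hedged sketch on simulated heavy-tailed data (not CCX or EU ETS prices) might look like this, using the standard GPD-based VaR formula:

```python
import numpy as np
from scipy import stats

# Simulated heavy-tailed losses standing in for carbon market returns.
rng = np.random.default_rng(3)
losses = stats.t.rvs(df=4, size=5000, random_state=rng)

u = np.quantile(losses, 0.95)       # threshold at the 95th percentile
excesses = losses[losses > u] - u

# Fit a Generalized Pareto Distribution to the excesses (location fixed at 0).
xi, _, beta = stats.genpareto.fit(excesses, floc=0)

# Tail VaR at level p via the standard peaks-over-threshold formula:
# VaR_p = u + (beta / xi) * (((n / n_u) * (1 - p)) ** (-xi) - 1)
p = 0.99
n, n_u = len(losses), len(excesses)
var_p = u + (beta / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1)
print(f"threshold u = {u:.3f}, xi = {xi:.3f}, VaR_99 = {var_p:.3f}")
```

A fitted shape parameter `xi` well above zero signals the kind of heavy, immature-market tail behavior the essay associates with CCX and EU ETS Phase I.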
86 |
Modeling and Simulation of Spatial Extremes Based on Max-Infinitely Divisible and Related Processes. Zhong, Peng (17 April 2022)
The statistical modeling of extreme natural hazards is becoming increasingly important due to climate change, whose effects have been increasingly visible throughout the last decades. It is thus crucial to understand the dependence structure of rare, high-impact events over space and time for realistic risk assessment. For spatial extremes, max-stable processes have played a central role in modeling block maxima. However, the spatial tail dependence strength is persistent across quantile levels in those models, which is often not realistic in practice. This lack of flexibility implies that max-stable processes cannot capture weakening dependence at increasingly extreme levels, resulting in a drastic overestimation of joint tail risk.
To address this, we develop new dependence models in this thesis from the class of max-infinitely divisible (max-id) processes, which contain max-stable processes as a subclass and are flexible enough to capture different types of dependence structures. Furthermore, exact simulation algorithms for general max-id processes are typically not straightforward due to their complex formulations, and both simulation and inference can be computationally prohibitive in high dimensions. We provide fast and exact algorithms to simulate max-id processes, together with methods to implement our models in high dimensions based on the Vecchia approximation method. These proposed methodologies are illustrated through various environmental datasets, including air temperature data in South-Eastern Europe, in an attempt to assess the effect of climate change on heatwave hazards, and sea surface temperature data for the entire Red Sea. In another application, focused on assessing how the spatial extent of extreme precipitation has changed over time, we develop new time-varying $r$-Pareto processes, which are the counterparts of max-stable processes for high threshold exceedances.
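The weakening tail dependence that motivates max-id models can be seen empirically by tracking the measure chi(u) = P(V > u | U > u) across quantile levels; this sketch uses a Gaussian copula (which is asymptotically independent) as a simple stand-in, not the thesis's models:

```python
import numpy as np
from scipy import stats

# Empirical tail dependence chi(u) = P(V > u | U > u). For a Gaussian
# copula, chi(u) decays toward 0 as u -> 1: exactly the weakening
# dependence at extreme levels that max-stable models cannot capture
# (a max-stable model would keep chi(u) constant in u).
rng = np.random.default_rng(2022)
rho = 0.7
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=100_000)
u1, u2 = stats.norm.cdf(z[:, 0]), stats.norm.cdf(z[:, 1])  # uniform margins

chis = []
for u in (0.90, 0.99, 0.999):
    joint = np.mean((u1 > u) & (u2 > u))
    chis.append(joint / (1.0 - u))
    print(f"u = {u}: chi(u) = {chis[-1]:.3f}")
```

Fitting a max-stable model to data whose empirical chi(u) decays like this is what produces the drastic overestimation of joint tail risk noted in the abstract.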
87 |
Risk Assessment of International Mixed Asset Portfolio with Vine Copulas. Nilsson, Axel (January 2022)
This thesis gives an example of assessing the risk of a financial portfolio with international assets, where the assets may be of different classes, by the use of Monte Carlo simulation and Extreme Value Theory. The simulation uses univariate models of the assets' returns as stochastic processes, as well as vine copulas to create dependence between the variables. For the asset returns, a modified version of a discretized Merton jump-diffusion model was used. The risk assessment used Extreme Value Theory to calculate the Value at Risk and Expected Shortfall of the simulated portfolio. However, the resulting return distribution, and the risk assessment based on it, was not entirely satisfactory, due to the unreasonably large values obtained.
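The Monte Carlo pipeline described above (simulate jump-diffusion returns, then measure tail risk) can be sketched as follows; this simplified single-asset version with empirical VaR and Expected Shortfall, and all its parameters, are assumptions for illustration rather than the thesis's multi-asset vine-copula setup:

```python
import numpy as np

# Simulate one-day returns from a discretized Merton jump-diffusion,
# then estimate Value at Risk and Expected Shortfall empirically.
rng = np.random.default_rng(11)

mu, sigma, dt = 0.05, 0.2, 1 / 252              # annual drift, vol, daily step
lam, jump_mu, jump_sigma = 25.0, -0.02, 0.03    # jump intensity and jump sizes

n_paths = 50_000
n_jumps = rng.poisson(lam * dt, n_paths)
returns = (
    (mu - 0.5 * sigma**2) * dt
    + sigma * np.sqrt(dt) * rng.normal(size=n_paths)
    + n_jumps * jump_mu + np.sqrt(n_jumps) * jump_sigma * rng.normal(size=n_paths)
)

alpha = 0.99
var = -np.quantile(returns, 1 - alpha)    # one-day 99% VaR (loss convention)
es = -returns[returns <= -var].mean()     # Expected Shortfall beyond VaR
print(f"VaR(99%) = {var:.4f}, ES(99%) = {es:.4f}")
```

In the thesis's setting the marginal simulations would be linked through a vine copula before aggregating to portfolio returns; the tail measurement step is the same.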
88 |
Initiation of Particle Movement in Turbulent Open Channel Flow. Valyrakis, Manousos (11 May 2011)
The objective of this thesis is to investigate the flow conditions that lead to coarse grain entrainment at near-incipient-motion conditions. Herein, a new conceptual approach is proposed which, in addition to the magnitude of the hydrodynamic force or flow power, takes into account the duration of the flow event. Two criteria for the inception of grain entrainment, namely the critical impulse and critical energy concepts, are proposed and compared. These frameworks adopt a force or energy perspective, considering the momentum or energy transfer from each flow event to the particle, respectively, to describe the phenomenon.

A series of mobile particle experiments is analyzed to examine the validity of the proposed approaches. First, a set of bench-top experiments incorporates an electromagnet which applies pulses of known magnitude and duration to a steel spherical particle in a controlled fashion, so as to identify the critical level for entrainment. The utility of the above criteria is also demonstrated for the case of entrainment by the action of turbulent flow, via analysis of a series of flume experiments in which both the history of hydrodynamic forces exerted on the particle and its response are recorded simultaneously.
Statistical modeling of the distribution of impulses, as well as conditional excess impulses, is performed using distributions from Extreme Value Theory to effectively model the episodic nature of the occurrence of these events. For the examined uniform and low mobility flow conditions, a power law relationship is proposed for describing the magnitude and frequency of occurrence of the impulse events. The Weibull and exponential distributions provide a good fit for the time between particle entrainments. In addition to these statistical tools, a number of Adaptive Neuro-Fuzzy Inference Systems employing different input representations are used to learn the nonlinear dynamics of the system and perform statistical prediction. The performance of these models is assessed in terms of their broad validity, efficiency and forecast accuracy.
Even though the impulse and energy criteria are deeply interrelated, the latter is shown to be advantageous with regard to its performance, applicability and extension ability. The effect of single or multiple highly energetic events carried by certain coherent flow structures (mainly strong sweep events) with regard to the particle response is also investigated. / Ph. D.
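The impulse concept above lends itself to a small sketch: detect exceedances of an assumed critical force level in a synthetic force record, integrate each event's excess force into an impulse, and fit an exponential distribution to the times between events (all signals and thresholds here are invented, not flume measurements):

```python
import numpy as np
from scipy import stats

# Synthetic hydrodynamic force record and an assumed critical level.
rng = np.random.default_rng(5)
dt = 0.01                                # s, sampling interval
force = rng.gumbel(1.0, 0.4, 200_000)    # N, skewed proxy for turbulent drag

f_crit = 2.0                             # N, assumed critical force level
above = force > f_crit

# Locate runs of consecutive exceedances (events).
padded = np.concatenate(([False], above, [False]))
d = np.diff(padded.astype(int))
starts = np.flatnonzero(d == 1)          # first sample of each event
ends = np.flatnonzero(d == -1)           # one past the last sample

# Impulse of each event: integral of the excess force over its duration.
impulses = np.array([np.sum(force[s:e] - f_crit) * dt for s, e in zip(starts, ends)])

# Inter-arrival times between events, fitted by an exponential distribution.
inter_arrivals = (starts[1:] - ends[:-1]) * dt
_, scale = stats.expon.fit(inter_arrivals, floc=0)
print(f"{len(impulses)} events, mean impulse = {impulses.mean():.4f} N s, "
      f"mean inter-arrival = {scale:.3f} s")
```

Under the impulse criterion, only events whose integrated excess force exceeds a critical impulse would entrain the particle, so the upper tail of `impulses` is what the extreme-value modeling targets.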
89 |
Mathematical methods for portfolio management. Ondo, Guy-Roger Abessolo (08 1900)
Portfolio Management is the process of allocating an investor's wealth to investment opportunities over a given planning period. Not only should Portfolio Management be treated within a multi-period framework, but one should also take into consideration the stochastic nature of related parameters. After a short review of key concepts from Finance Theory, e.g. utility functions, risk attitude, Value-at-Risk estimation methods, and mean-variance efficiency, this work describes a framework for the formulation of the Portfolio Management problem in a Stochastic Programming setting. Classical solution techniques for the resolution of the resulting Stochastic Programs (e.g. L-shaped Decomposition, approximation of the probability function) are presented. These are discussed within both the two-stage and the multi-stage case, with a special emphasis on the former. A description of how Importance Sampling and EVPI are used to improve the efficiency of classical methods is presented. Postoptimality Analysis, a sensitivity analysis method, is also described. / Statistics / M. Sc. (Operations Research)
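The EVPI quantity mentioned above (Expected Value of Perfect Information) can be illustrated with a toy two-stage allocation problem; the scenario returns and probabilities are invented for the example:

```python
import numpy as np

# Toy two-stage setting: allocate between a risky and a risk-free asset
# under three discrete return scenarios. All numbers are illustrative.
scenarios = np.array([1.30, 1.05, 0.80])   # gross returns of the risky asset
probs = np.array([0.3, 0.4, 0.3])
risk_free = 1.02
weights = np.linspace(0.0, 1.0, 101)       # fraction placed in the risky asset

def expected_wealth(w):
    # Expected terminal wealth of allocation w under the scenario distribution.
    return float(np.sum(probs * (w * scenarios + (1 - w) * risk_free)))

# Here-and-now: choose one allocation before uncertainty resolves.
here_and_now = max(expected_wealth(w) for w in weights)

# Wait-and-see: with perfect information, pick the best asset per scenario.
wait_and_see = float(np.sum(probs * np.maximum(scenarios, risk_free)))

# EVPI = wait-and-see value minus here-and-now value: an upper bound on
# what better information (or finer scenario sampling) could be worth.
evpi = wait_and_see - here_and_now
print(f"here-and-now = {here_and_now:.4f}, wait-and-see = {wait_and_see:.4f}, "
      f"EVPI = {evpi:.4f}")
```

In the stochastic programming algorithms the dissertation reviews, EVPI-style bounds are used to decide where solution effort and sampling should be concentrated.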
90 |
不對稱分配於風險值之應用 - 以台灣股市為例 / An application of asymmetric distribution in value at risk - taking Taiwan stock market as an example. 沈之元 (Shen, Chih-Yuan). Unknown Date
This thesis studies the Taiwan weighted stock index using an AR(3)-GJR-GARCH(1,1) model with seven candidate innovation distributions: Normal, Skew-Normal, Student t, skew-t, EPD, SEPD, and AEPD. It focuses on two questions: (1) comparing the Student t family and the EPD family in terms of model fit and Value at Risk (VaR) estimation; and (2) splitting the VaR forecast period into a low-volatility and a high-volatility regime and comparing the forecasting performance of the different distributions across the two regimes.

The empirical analysis shows that, whether only kurtosis is considered (the t and EPD distributions) or a skewness parameter is added (the skew-t and SEPD distributions), the t family fits better than the EPD family. Going further and allowing the two tails to have different thicknesses, the AEPD distribution gives the best fit among the seven distributions.

For VaR estimation in the low-volatility regime, the Normal distribution and the heavy-tailed distributions all pass the backtests, so heavy-tailed distributions bring little benefit there. In the high-volatility regime, no distribution passes all the left-tail VaR backtests, but the AEPD distribution remains the best. Finally, comparing loss functions, the AEPD distribution is best for left-tail VaR estimation, while for right-tail VaR the results are inconclusive. We therefore consider the AEPD distribution a useful tool for risk management.
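VaR backtesting of the kind referred to above is commonly done with Kupiec's proportion-of-failures test; this sketch is a generic illustration with made-up exceedance counts, not the thesis's procedure:

```python
import math

# Unconditional-coverage (Kupiec POF) backtest: does the observed VaR
# exceedance rate match the nominal level?
def kupiec_pof(n_obs: int, n_exceed: int, alpha: float) -> float:
    """Likelihood-ratio statistic for the proportion-of-failures test."""
    p = 1 - alpha                       # expected exceedance probability
    phat = n_exceed / n_obs
    if n_exceed in (0, n_obs):          # guard against log(0)
        phat = max(min(phat, 1 - 1e-12), 1e-12)
    ll_null = n_exceed * math.log(p) + (n_obs - n_exceed) * math.log(1 - p)
    ll_alt = n_exceed * math.log(phat) + (n_obs - n_exceed) * math.log(1 - phat)
    return -2 * (ll_null - ll_alt)

# 99% VaR over 500 days: about 5 exceedances are expected. 6 observed is
# consistent with the model; 15 is a clear rejection (the chi-squared(1)
# critical value at the 5% level is about 3.84).
print(kupiec_pof(500, 6, 0.99))   # small statistic
print(kupiec_pof(500, 15, 0.99))  # large statistic
```

Passing or failing this test in the low- and high-volatility regimes separately is what distinguishes the candidate distributions in the empirical comparison above.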