61

Ekstremumų asimptotinė analizė, kai imties didumo skirstinys yra neigiamas binominis / Asymptotic Analysis of Extremes when the Sample Size Follows a Negative Binomial Distribution

Sidekerskienė, Tatjana 05 June 2006 (has links)
This work considers the structure of maxima and minima when the number of observations is itself random and follows a negative binomial distribution. Theorems are proved that yield the limit distribution functions of these normalized statistics; they generalize earlier propositions for the case where the sample size is a geometric random variable. A concrete analysis is also carried out for specific underlying distributions: the exponential, generalized logistic, and uniform distributions.
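The random-sample-size structure studied here can be sketched with a short simulation: the maximum of N iid Exponential(1) variables, where N is negative binomial. The parameter choices (r = 5, p = 0.1) and the exponential marginal are illustrative, not values taken from the thesis:

```python
import bisect
import random

random.seed(1)

def neg_binomial(r, p):
    # number of failures before the r-th success in Bernoulli(p) trials
    failures, successes = 0, 0
    while successes < r:
        if random.random() < p:
            successes += 1
        else:
            failures += 1
    return failures

def random_size_maximum(r, p):
    # maximum of N iid Exponential(1) variables, N ~ negative binomial
    n = max(1, neg_binomial(r, p))
    return max(random.expovariate(1.0) for _ in range(n))

samples = sorted(random_size_maximum(5, 0.1) for _ in range(20000))

def ecdf(x):
    # empirical distribution function of the simulated maxima
    return bisect.bisect_right(samples, x) / len(samples)

print(round(ecdf(3.0), 2), round(ecdf(8.0), 2))
```

The empirical distribution function computed this way is the object whose limit the thesis's theorems describe.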
62

極值理論與整合風險衡量 / Extreme Value Theory and Integrated Risk Measurement

黃御綸 Unknown Date (has links)
Since the 1990s, global financial markets have repeatedly been shaken by the mishandling of financial instruments and by financial crises, making risk management ever more important and putting a growing premium on the accuracy of quantitative risk models. Motivated by well-documented properties of financial data such as heteroskedasticity and heavy tails, this thesis combines three modeling approaches, the AR(1)-GARCH(1,1) model, extreme value theory, and copula functions, to estimate Value at Risk (VaR). Three classes of return-distribution assumptions are considered: first, the nonparametric historical simulation method; second, parametric models under a normality assumption that account for stochastic volatility; and third, an extreme value approach that fits the tail of the distribution to historical data. These are tested for one-day VaR, ten-day VaR, and portfolio VaR on four Taiwanese stocks (UMC, Hon Hai, Cathay Financial Holdings, and China Steel) and three exchange rates (TWD/USD, JPY/USD, and GBP/USD). Empirically, for one-day VaR the dynamic methods perform relatively well at the 95% confidence level, while at the 99% level both the dynamic extreme value method and dynamic historical simulation estimate risk well. Ten-day VaR is harder to estimate, since asset returns over the next ten days may be driven by specific events; overall, the conditional GPD combined with Monte Carlo simulation performs relatively well at the 99% level. For portfolio VaR, simulating the joint distribution of the stock or currency portfolios with copulas, and with a Clayton copula with GPD marginals, yields the best VaR estimates at both the 95% and 99% confidence levels. Although Taiwanese stock prices are subject to a 7% daily price limit and the TWD/USD exchange rate is subject to central bank intervention, describing the tails of asset returns with extreme value theory still gives better risk estimates than the other two distributional assumptions.
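One building block of this kind of approach, an unconditional peaks-over-threshold VaR estimate, can be sketched as follows. The synthetic Gaussian returns, the 90th-percentile threshold, and the method-of-moments GPD fit are illustrative stand-ins; the conditional version in the thesis would first filter returns through an AR(1)-GARCH(1,1) model and would typically fit the GPD by maximum likelihood:

```python
import random
import statistics

random.seed(7)
# synthetic daily returns standing in for a real price series
returns = [random.gauss(0, 0.015) for _ in range(2000)]

losses = sorted(-r for r in returns)        # losses as positive numbers
u = losses[int(0.9 * len(losses))]          # threshold at the 90th percentile
exc = [x - u for x in losses if x > u]      # threshold exceedances

# method-of-moments fit of the Generalized Pareto Distribution (GPD)
m, v = statistics.mean(exc), statistics.variance(exc)
xi = 0.5 * (1 - m * m / v)                  # shape parameter
beta = 0.5 * m * (m * m / v + 1)            # scale parameter

def var_pot(q):
    # peaks-over-threshold VaR at confidence level q
    n, nu = len(losses), len(exc)
    return u + (beta / xi) * (((n / nu) * (1 - q)) ** (-xi) - 1)

print(round(var_pot(0.95), 4), round(var_pot(0.99), 4))
```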
63

Enhanced Power System Operational Performance with Anticipatory Control under Increased Penetration of Wind Energy

January 2016 (has links)
abstract: As the world embraces a sustainable energy future, alternative energy resources, such as wind power, are increasingly being seen as an integral part of the future electric energy grid. Ultimately, integrating such a dynamic and variable mix of generation requires a better understanding of renewable generation output, in addition to power grid systems that improve power system operational performance in the presence of anticipated events such as wind power ramps. Because of the stochastic, uncontrollable nature of renewable resources, a thorough and accurate characterization of wind activity is necessary to maintain grid stability and reliability. Wind power ramps from an existing wind farm are studied to characterize persistence forecasting errors using extreme value analysis techniques. In addition, a novel metric that quantifies the amount of non-stationarity in time series wind power data was proposed and used in a real-time algorithm to provide a rigorous method that adaptively determines training data for forecasts. Lastly, large swings in generation or load can cause system frequency and tie-line flows to deviate from nominal, so an anticipatory MPC-based secondary control scheme was designed and integrated into an automatic generation control loop to improve the ability of an interconnection to respond to anticipated large events and fluctuations in the power system. / Dissertation/Thesis / Doctoral Dissertation Electrical Engineering 2016
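The persistence-error characterization described above can be illustrated with a toy example: generate a synthetic wind power series, form persistence forecast errors, and extract block maxima as inputs to an extreme value analysis of ramps. The mean-reverting series, its parameters, and the 24-point block size are assumptions for illustration, not the dissertation's data or methods:

```python
import random

random.seed(3)
# synthetic normalized wind power output via a clipped mean-reverting walk
power, x = [], 0.5
for _ in range(5000):
    x += 0.05 * (0.5 - x) + random.gauss(0, 0.04)
    x = min(max(x, 0.0), 1.0)          # keep output in [0, 1]
    power.append(x)

# persistence forecast: predict the next value to equal the current one,
# so the forecast error is simply the one-step change
errors = [power[t + 1] - power[t] for t in range(len(power) - 1)]

# block maxima of the error series (24-point blocks, e.g. hourly data by day),
# the raw material for a GEV-based analysis of extreme ramp errors
block = 24
maxima = [max(errors[i:i + block])
          for i in range(0, len(errors) - block + 1, block)]
print(len(maxima), round(max(maxima), 3))
```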
64

Inferencia Bayesiana para valores extremos / Bayesian inference for extremes

Bernardini, Diego Fernando de, 1986- 15 August 2018 (has links)
Orientador: Laura Leticia Ramos Rifo / Dissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica
/ Abstract: We begin this work by presenting a brief introduction to extreme value theory, studying in particular the behavior of the random variable that represents the maximum of a sequence of independent and identically distributed random variables. The Extremal Types Theorem (or Fisher-Tippett Theorem) is a fundamental tool in the study of the asymptotic behavior of these maxima, allowing the modeling of data that represent a sequence of observed maxima of a given phenomenon or random process through a class of distributions known as the Generalized Extreme Value (GEV) family. We are interested in making inference about the parameters of this family. Specifically, the comparison between the Gumbel and GEV models constitutes the main focus of this work. In Chapter 1 we study, in the context of classical inference, maximum likelihood estimation for these parameters and a likelihood ratio test procedure suitable for testing the null hypothesis associated with the Gumbel model against the hypothesis representing the complete GEV model. We proceed, in Chapter 2, with a brief review of Bayesian inference theory. We also study the predictive distribution for future values. With respect to model comparison, we first study the Bayes factor and the posterior Bayes factor in the Bayesian context. Next we study the Full Bayesian Significance Test (FBST), a significance test particularly suitable for testing precise hypotheses, such as the hypothesis characterizing the Gumbel model. Furthermore, we study two other criteria for comparing models, the BIC (Bayesian Information Criterion) and the DIC (Deviance Information Criterion). We study these evidence measures specifically in the context of the comparison between the Gumbel and GEV models, as well as the predictive distribution, credible intervals, and posterior inference for the return levels associated with fixed return periods.
Chapter 1 and part of Chapter 2 provide the basic theoretical foundations of this work, and are strongly based on Coles (2001) and O'Hagan (1994). In Chapter 3 we present the well-known Metropolis-Hastings algorithm for simulating probability distributions, and the particular algorithm used in this work to obtain simulated samples from the posterior distribution of the parameters of interest. In the next chapter we formulate the modeling of the observed maxima, presenting the likelihood function and setting the prior distribution for the parameters. Two applications are presented in Chapter 5. The first deals with observations of the quarterly maxima of unemployment rates in the United States of America between the first quarter of 1994 and the first quarter of 2009. In the second application we study the semiannual maxima of sea levels at Newlyn, in the southwest of England, between 1990 and 2007. Finally, a brief discussion is presented in Chapter 6. / Mestrado / Estatística / Mestre em Estatística
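The Metropolis-Hastings step described in Chapter 3 can be sketched for the simpler Gumbel sub-model (location mu, scale beta). The synthetic data, flat priors, starting values, and proposal scales below are illustrative assumptions, not the dissertation's actual prior specification:

```python
import math
import random

random.seed(11)
# synthetic block maxima from a Gumbel(mu=10, beta=2) distribution
data = [10.0 - 2.0 * math.log(-math.log(random.random())) for _ in range(200)]

def log_lik(mu, beta):
    # Gumbel log-likelihood
    if beta <= 0:
        return -math.inf
    z = [(x - mu) / beta for x in data]
    return sum(-zi - math.exp(-zi) for zi in z) - len(data) * math.log(beta)

mu, beta = 8.0, 1.0
chain = []
for _ in range(20000):
    mu_p = mu + random.gauss(0, 0.2)
    beta_p = beta * math.exp(random.gauss(0, 0.1))
    # flat priors; log(beta_p / beta) is the Hastings correction for the
    # multiplicative (log-scale) proposal on beta
    log_a = log_lik(mu_p, beta_p) - log_lik(mu, beta) + math.log(beta_p / beta)
    if math.log(random.random()) < log_a:
        mu, beta = mu_p, beta_p
    chain.append((mu, beta))

post = chain[5000:]          # discard burn-in
mu_hat = sum(m for m, _ in post) / len(post)
beta_hat = sum(b for _, b in post) / len(post)
print(round(mu_hat, 2), round(beta_hat, 2))
```

The posterior means should land near the true values (10, 2); the full GEV model adds the shape parameter to this scheme.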
65

The best are never normal: exploring the distribution of firm performance

Buchbinder, Felipe 08 June 2011 (has links)
Competitive strategy literature predicts three different mechanisms of performance generation, thus distinguishing among firms that have competitive advantage, firms that have competitive disadvantage, and firms that have neither. Nonetheless, previous work in the field has fitted a single normal distribution to model firm performance. Here, we develop a new approach that distinguishes among performance-generating mechanisms and allows the identification of firms with competitive advantage or disadvantage. Theorizing on the positive feedback loops by which firms with competitive advantage have facilitated access to new resources, we propose a distribution we believe data on firm performance should follow. We illustrate our model by assessing its fit to data on firm performance, addressing its theoretical implications, and comparing it to previous works.
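The dissertation's proposed distribution is its own construction, but the general point, that a single normal fit masks distinct performance-generating mechanisms, can be illustrated by comparing a single normal fit against a two-component Gaussian mixture fitted by EM on synthetic data (the mixture form and all parameters are illustrative assumptions):

```python
import math
import random
import statistics

random.seed(5)
# synthetic "firm performance" data: a majority component plus a small
# high-performing component, so a single normal underfits the extremes
data = [random.gauss(0.05, 0.03) for _ in range(900)] + \
       [random.gauss(0.20, 0.05) for _ in range(100)]

def norm_pdf(x, m, s):
    return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

# EM algorithm for a two-component Gaussian mixture
w, m1, s1, m2, s2 = 0.5, 0.0, 0.05, 0.15, 0.05
for _ in range(200):
    # E-step: responsibility of the second (high-performing) component
    resp = [w * norm_pdf(x, m2, s2) /
            (w * norm_pdf(x, m2, s2) + (1 - w) * norm_pdf(x, m1, s1))
            for x in data]
    # M-step: reweighted parameter updates
    w = sum(resp) / len(data)
    m2 = sum(r * x for r, x in zip(resp, data)) / sum(resp)
    s2 = math.sqrt(sum(r * (x - m2) ** 2 for r, x in zip(resp, data)) / sum(resp))
    m1 = sum((1 - r) * x for r, x in zip(resp, data)) / sum(1 - r for r in resp)
    s1 = math.sqrt(sum((1 - r) * (x - m1) ** 2 for r, x in zip(resp, data))
                   / sum(1 - r for r in resp))

ll_mix = sum(math.log(w * norm_pdf(x, m2, s2) + (1 - w) * norm_pdf(x, m1, s1))
             for x in data)
mu, sd = statistics.mean(data), statistics.stdev(data)
ll_norm = sum(math.log(norm_pdf(x, mu, sd)) for x in data)
print(ll_mix > ll_norm)
```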
66

Power Markets and Risk Management Modeling / Trhy s elektrickou energií a modelování v řízení rizik

Paholok, Igor January 2012 (has links)
The main goal of this thesis is to summarize and explain the specifics of power markets and to test models that might be applied especially in the risk management area. The thesis starts with a definition of market participants, a typology of traded contracts, and a description of market development, with a focus on the Czech Republic. It continues by developing theoretical concepts of short-term/spot electricity markets and the potential link between spot and forward electricity markets. After deriving these microeconomic fundamental models, we continue with stochastic models (a jump-diffusion mean-reverting process and extreme value theory) in order to capture the volatility patterns of spot and forward power prices. The last chapter deals with the credit-risk specifics of power trading and develops a model (using the concept known as Credit Value Adjustment) to compare the economic efficiency of OTC and exchange-based power trading. The developed models are tested on selected power markets, again with a focus on Czech power market data.
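A minimal sketch of a jump-diffusion mean-reverting spot price process of the kind mentioned above, on a daily grid with exponentially distributed upward spikes; all parameter values are illustrative, not calibrated to Czech market data:

```python
import random

random.seed(2)

def simulate_spot(s0=50.0, mean=50.0, kappa=0.2, sigma=2.0,
                  jump_prob=0.02, jump_size=15.0, n=1000):
    # mean-reverting diffusion with occasional upward price spikes (dt = 1 day)
    path, s = [s0], s0
    for _ in range(n):
        jump = random.expovariate(1.0 / jump_size) if random.random() < jump_prob else 0.0
        s += kappa * (mean - s) + sigma * random.gauss(0, 1) + jump
        path.append(s)
    return path

path = simulate_spot()
avg = sum(path) / len(path)
spikes = sum(p > 70 for p in path)        # count of spike-level prices
print(len(path), round(avg, 2), spikes)
```

Mean reversion pulls the price back toward its long-run level after each jump, which is the qualitative spot-price pattern such models are chosen to reproduce.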
67

Essays on Emissions Trading Markets

Dhavala, Kishore 05 November 2012 (has links)
This dissertation is a collection of three economics essays on different aspects of carbon emission trading markets. The first essay analyzes the dynamic optimal emission control strategies of two nations. With the potential to become the largest buyer under the Kyoto Protocol, the US is assumed to be a monopsony, whereas with a large number of tradable permits on hand Russia is assumed to be a monopoly. Optimal costs of emission control programs are estimated for both countries under four different market scenarios: non-cooperative no trade, US monopsony, Russia monopoly, and cooperative trading. The US monopsony scenario is found to be the most Pareto cost-efficient. The Pareto-efficient outcome, however, would require the US to make side payments to Russia, which would even out the differences in the cost savings from cooperative behavior. The second essay analyzes the price dynamics of the Chicago Climate Exchange (CCX), a voluntary emissions trading market. By examining the volatility in market returns using AR-GARCH and Markov switching models, the study associates the market price fluctuations with two different political regimes of the US government. Further, the study identifies high volatility in returns a few months before the market collapse. Three possible regulatory and market-based forces are identified as probable causes of the market volatility and its ultimate collapse. Organizers of other voluntary markets in the US and worldwide may watch closely for these regime-switching forces in order to avert emission market crashes. The third essay compares excess skewness and kurtosis in carbon prices between the CCX and EU ETS (European Union Emission Trading Scheme) Phase I and II markets by examining the tail behavior when market expectations exceed the threshold level. Dynamic extreme value theory is used to find the mean price exceedance of the threshold levels and to estimate the risk of loss.
The calculated risk measures suggest that the CCX and EU ETS Phase I were extremely immature markets for a risk investor, whereas EU ETS Phase II is a more stable market that could develop into a mature carbon market in future years.
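A standard diagnostic underlying this kind of threshold-exceedance analysis is the empirical mean-excess function, sketched here on synthetic Pareto-tailed losses (the tail index and thresholds are illustrative, not estimates from the CCX or EU ETS data):

```python
import random

random.seed(9)
# synthetic losses with a Pareto tail (tail index 3), shifted to start at 0;
# purely illustrative, not fitted to carbon price data
losses = [random.paretovariate(3.0) - 1.0 for _ in range(5000)]

def mean_excess(u):
    # empirical mean-excess function e(u) = E[X - u | X > u]
    exc = [x - u for x in losses if x > u]
    return sum(exc) / len(exc) if exc else float("nan")

# for a heavy (GPD-type) tail, e(u) grows linearly in the threshold u;
# for this Pareto example the theoretical value is e(u) = (1 + u) / 2
for u in (0.5, 1.0, 2.0):
    print(u, round(mean_excess(u), 3))
```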
68

Modeling and Simulation of Spatial Extremes Based on Max-Infinitely Divisible and Related Processes

Zhong, Peng 17 April 2022 (has links)
The statistical modeling of extreme natural hazards is becoming increasingly important due to climate change, whose effects have been increasingly visible throughout the last decades. It is thus crucial to understand the dependence structure of rare, high-impact events over space and time for realistic risk assessment. For spatial extremes, max-stable processes have played a central role in modeling block maxima. However, the spatial tail dependence strength is persistent across quantile levels in those models, which is often not realistic in practice. This lack of flexibility implies that max-stable processes cannot capture weakening dependence at increasingly extreme levels, resulting in a drastic overestimation of joint tail risk. To address this, we develop new dependence models in this thesis from the class of max-infinitely divisible (max-id) processes, which contain max-stable processes as a subclass and are flexible enough to capture different types of dependence structures. Furthermore, exact simulation algorithms for general max-id processes are typically not straightforward due to their complex formulations. Both simulation and inference can be computationally prohibitive in high dimensions. Fast and exact simulation algorithms to simulate max-id processes are provided, together with methods to implement our models in high dimensions based on the Vecchia approximation method. These proposed methodologies are illustrated through various environmental datasets, including air temperature data in South-Eastern Europe in an attempt to assess the effect of climate change on heatwave hazards, and sea surface temperature data for the entire Red Sea. In another application focused on assessing how the spatial extent of extreme precipitation has changed over time, we develop new time-varying $r$-Pareto processes, which are the counterparts of max-stable processes for high threshold exceedances.
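The weakening tail dependence that motivates moving beyond max-stable models can be seen in a simple bivariate Gaussian example, where the empirical conditional exceedance probability chi(u) decays as the quantile level u rises; a max-stable model instead keeps this dependence strength roughly constant at extreme levels. The correlation 0.7 and the sample size are illustrative:

```python
import math
import random

random.seed(6)
# bivariate Gaussian sample with correlation rho = 0.7
rho, n = 0.7, 100000
pairs = []
for _ in range(n):
    z1 = random.gauss(0, 1)
    z2 = rho * z1 + math.sqrt(1 - rho * rho) * random.gauss(0, 1)
    pairs.append((z1, z2))

xs = sorted(x for x, _ in pairs)
ys = sorted(y for _, y in pairs)

def chi(u):
    # empirical chi(u) = P(Y > q_u(Y) | X > q_u(X))
    qx, qy = xs[int(u * n)], ys[int(u * n)]
    joint = sum(1 for x, y in pairs if x > qx and y > qy)
    marginal = sum(1 for x, _ in pairs if x > qx)
    return joint / marginal

print(round(chi(0.90), 3), round(chi(0.99), 3))
```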
69

Risk Assessment of International Mixed Asset Portfolio with Vine Copulas

Nilsson, Axel January 2022 (has links)
This thesis gives an example of assessing the risk of a financial portfolio of international assets, possibly of different classes, using Monte Carlo simulation and Extreme Value Theory. The simulation uses univariate models of the assets' returns as stochastic processes, together with vine copulas to create dependence between the variables. Asset returns are modeled with a modified version of a discretized Merton jump-diffusion model. The risk assessment uses Extreme Value Theory to calculate the Value at Risk and Expected Shortfall of the simulated portfolio. However, the resulting return distribution, and the risk assessment based on it, was not entirely satisfactory, because unreasonably large values were obtained.
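A bare-bones version of the return model and risk measures used here, a discretized Merton jump-diffusion without the vine-copula dependence layer or the thesis's modifications, can be sketched as follows (all parameters are illustrative):

```python
import random

random.seed(4)

def merton_return(mu=0.0003, sigma=0.01, lam=0.05, jump_mu=-0.02, jump_sigma=0.02):
    # one discretized daily log-return: diffusion part plus at most one jump
    r = mu - 0.5 * sigma ** 2 + sigma * random.gauss(0, 1)
    if random.random() < lam:          # Bernoulli approximation of Poisson jumps
        r += random.gauss(jump_mu, jump_sigma)
    return r

sims = sorted(merton_return() for _ in range(100000))

# empirical Value at Risk and Expected Shortfall at the 99% level
k = int(0.01 * len(sims))              # number of worst outcomes
var_99 = -sims[k]                      # loss at the 1% quantile
es_99 = -sum(sims[:k]) / k             # average loss beyond VaR
print(round(var_99, 4), round(es_99, 4))
```

Expected Shortfall always sits above VaR at the same level, since it averages over the tail rather than reading off a single quantile.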
70

Mathematical methods for portfolio management

Ondo, Guy-Roger Abessolo 08 1900 (has links)
Portfolio Management is the process of allocating an investor's wealth to investment opportunities over a given planning period. Not only should Portfolio Management be treated within a multi-period framework, but one should also take into consideration the stochastic nature of related parameters. After a short review of key concepts from Finance Theory, e.g. utility functions, risk attitude, Value-at-Risk estimation methods, and mean-variance efficiency, this work describes a framework for the formulation of the Portfolio Management problem in a Stochastic Programming setting. Classical solution techniques for the resolution of the resulting Stochastic Programs (e.g. L-shaped Decomposition, approximation of the probability function) are presented. These are discussed within both the two-stage and the multi-stage case, with a special emphasis on the former. A description of how Importance Sampling and EVPI are used to improve the efficiency of classical methods is presented. Postoptimality Analysis, a sensitivity analysis method, is also described. / Statistics / M. Sc. (Operations Research)
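The two-stage setting and the role of EVPI can be illustrated with a tiny scenario-based example. The scenarios, probabilities, log utility, and grid search below are illustrative simplifications; the methods discussed in this work, such as L-shaped Decomposition, are designed for far larger stochastic programs:

```python
import math

# two-stage setting: choose allocation x in [0, 1] to a risky asset before
# the scenario is revealed (stage 1); stage 2 simply realizes the outcome
scenarios = [(-0.12, 0.3), (0.02, 0.4), (0.15, 0.3)]   # (risky return, prob)
risk_free = 0.01
grid = [i / 100 for i in range(101)]

def utility(x, r):
    # log utility of end-of-period wealth per unit invested
    return math.log(1 + x * r + (1 - x) * risk_free)

def expected_utility(x):
    return sum(p * utility(x, r) for r, p in scenarios)

best_x = max(grid, key=expected_utility)
here_and_now = expected_utility(best_x)

# wait-and-see solution: optimize separately under each scenario, as if the
# outcome were known in advance
wait_and_see = sum(p * max(utility(x, r) for x in grid) for r, p in scenarios)
evpi = wait_and_see - here_and_now     # expected value of perfect information
print(best_x, round(here_and_now, 5), round(evpi, 5))
```

EVPI is always non-negative: perfect foresight can never make the decision-maker worse off, and its size indicates how much the randomness itself costs.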
