131

Nonlinear dependence and extremes in hydrology and climate

Khan, Shiraj 01 June 2007 (has links)
The presence of nonlinear dependence and chaos has strong implications for predictive modeling and the analysis of dominant processes in hydrology and climate. Analysis of extremes may aid in developing predictive models in hydro-climatology by giving an enhanced understanding of the processes driving the extremes and perhaps delineating possible anthropogenic or natural causes. This dissertation develops and applies a set of tools for predictive modeling, specifically for nonlinear dependence, extremes, and chaos, and tests the viability of these tools on real data. Commonly used dependence measures, such as linear correlation, the cross-correlogram, or Kendall's tau, cannot capture the complete dependence structure in data unless that structure is restricted to linear, periodic, or monotonic forms. Mutual information (MI) has frequently been used to capture the complete dependence structure, including nonlinear dependence. Since geophysical data are generally finite and noisy, this dissertation addresses a key gap in the literature: the evaluation of recently proposed MI-estimation methods, particularly their robustness for short and noisy data, in order to choose the best method for capturing nonlinear dependence. Kernel density estimators (KDE) and k-nearest neighbors (KNN) perform best for 100 data points at high and low noise-to-signal levels, respectively, whereas KNN is best for 1000 data points consistently across noise levels. One real application of MI-based nonlinear dependence is to capture extrabasinal connections between the El Nino-Southern Oscillation (ENSO) and river flows in the tropics and subtropics, specifically the Nile, Amazon, Congo, Parana, and Ganges rivers, which reveals 20-70% higher dependence than suggested so far by linear correlations. For extremes analysis, this dissertation develops a new measure, the precipitation extremes volatility index (PEVI), which quantifies the variability of extremes and is defined as a ratio of return levels. The spatio-temporal variability of the PEVI, based on the Poisson-generalized Pareto (Poisson-GP) model, is investigated using weekly maxima observations available on 2.5-degree grids for 1940-2004 in South America. From 1965-2004, the PEVI shows increasing trends in a few parts of the Amazon basin and the Brazilian highlands, north-west Venezuela including Caracas, north Argentina, Uruguay, Rio de Janeiro, Sao Paulo, Asuncion, and Cayenne. Catingas, a few parts of the Brazilian highlands, Sao Paulo, and Cayenne experience an increasing number of consecutive 2- and 3-day extremes from 1965-2004. This dissertation also addresses the ability to detect a chaotic signal from a finite time-series observation of hydrologic systems. Tests with simulated data demonstrate the presence of thresholds, in terms of noise-to-chaotic-signal and seasonality-to-chaotic-signal ratios, beyond which the set of currently available tools cannot detect the chaotic component. Our results indicate that the decomposition of a simulated time series into its corresponding random, seasonal, and chaotic components is possible from finite data. Real streamflow data from the Arkansas and Colorado rivers do not exhibit chaos: while a chaotic component can be extracted from the Arkansas data, such a component is either not present or cannot be extracted from the Colorado data.
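As a rough illustration of the Poisson-GP building block behind the PEVI, the sketch below fits a generalized Pareto distribution to threshold exceedances and forms a ratio of return levels. The 50- and 10-year return periods, the threshold choice, and the synthetic weekly series are assumptions for illustration only; the dissertation specifies the PEVI simply as a ratio of return levels.

```python
# Sketch of the Poisson-GP (peaks-over-threshold) ingredient behind the PEVI:
# fit a generalized Pareto distribution to threshold exceedances and take a
# ratio of return levels. Return periods, threshold and data are assumptions.
import numpy as np
from scipy.stats import genpareto

def return_level(excesses, threshold, rate, period_years):
    """POT return level; `rate` is the mean number of exceedances per year."""
    shape, _, scale = genpareto.fit(excesses, floc=0.0)
    m = rate * period_years                      # expected exceedances over the period
    if abs(shape) < 1e-6:                        # exponential-tail limit
        return threshold + scale * np.log(m)
    return threshold + (scale / shape) * (m ** shape - 1.0)

rng = np.random.default_rng(0)
weekly_precip = rng.gamma(shape=2.0, scale=10.0, size=52 * 65)  # 65 "years" of synthetic weekly maxima
u = np.quantile(weekly_precip, 0.95)                            # threshold choice is an assumption
exceedances = weekly_precip[weekly_precip > u] - u
lam = exceedances.size / 65.0

rl_50 = return_level(exceedances, u, lam, 50)
rl_10 = return_level(exceedances, u, lam, 10)
print(f"PEVI-style ratio of 50-yr to 10-yr return levels: {rl_50 / rl_10:.2f}")
```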
132

Ekstremumų asimptotinė analizė, kai imties didumo skirstinys yra neigiamas binominis / Asymptotic Analysis of Extremes when the Sample Size Follows a Negative Binomial Distribution

Sidekerskienė, Tatjana 05 June 2006 (has links)
This work considers maxima and minima structures in which the number of observations is random and follows a negative binomial distribution. Theorems are established that yield the limiting distribution functions of these standard structures; they generalize earlier propositions for the case in which the sample size is a geometric random variable. A concrete analysis is also carried out for selected underlying distributions: the exponential, generalized logistic, and uniform.
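A minimal simulation sketch of the object studied here, the maximum of a negative-binomially distributed number of i.i.d. observations, is given below; the exponential parent distribution and all parameter values are arbitrary assumptions used only to illustrate the setup.

```python
# Sketch: simulate the maximum of a random number N of i.i.d. exponential
# observations, with N negative-binomially distributed. All parameter values
# are arbitrary assumptions used only to illustrate the setup.
import numpy as np
from scipy.stats import nbinom, expon

rng = np.random.default_rng(1)
r, p = 5, 0.01                                   # negative binomial parameters (assumed)
maxima = []
for _ in range(5000):
    n = nbinom.rvs(r, p, random_state=rng) + 1   # at least one observation
    maxima.append(expon.rvs(size=n, random_state=rng).max())
maxima = np.array(maxima)

mean_n = r * (1 - p) / p                         # E[N] for this parameterization
print("mean of maxima centred by log E[N]:", np.mean(maxima - np.log(mean_n)).round(3))
```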
133

極值理論與整合風險衡量 / Extreme Value Theory and Integrated Risk Measurement

黃御綸 Unknown Date (has links)
Since the 1990s, the mishandling of financial instruments and the shocks of financial crises have repeatedly destabilized global financial markets, making risk management ever more important and the accuracy of quantitative risk models an increasing concern. Motivated by properties of financial data such as heteroskedasticity and heavy tails, this thesis combines three approaches, the AR(1)-GARCH(1,1) model, extreme value theory, and copula functions, for the estimation of Value at Risk (VaR). Assumptions on the return distribution are grouped into three classes: a non-parametric historical simulation method; a parametric model that accounts for stochastic volatility under a normality assumption; and an extreme value theory approach that fits the tail of the distribution to historical data. These are tested for one-day VaR, ten-day VaR, and portfolio VaR on four individual stocks (UMC, Hon Hai, Cathay Financial Holdings, China Steel) and three exchange rates (TWD/USD, JPY/USD, GBP/USD). The empirical results show that, for one-day VaR, dynamic VaR methods perform relatively well at the 95% confidence level, while at the 99% confidence level both the dynamic extreme value theory method and dynamic historical simulation give good estimates. Ten-day VaR is harder to estimate because asset returns over the next ten days may be affected by specific events; overall, the conditional GPD combined with Monte Carlo simulation performs comparatively well at the 99% confidence level. For portfolio VaR, simulating the joint distribution of the stock or foreign exchange portfolios with copulas (a Clayton copula with GPD marginals) yields the best VaR estimates at both the 95% and 99% confidence levels. Although Taiwanese stock prices are subject to a ±7% daily price limit and the TWD/USD exchange rate is subject to central bank intervention, describing the tails of asset distributions with extreme value theory still gives better estimates than the other two distributional assumptions.
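As an illustration of the unconditional EVT step, the sketch below estimates a one-day 99% VaR from a generalized Pareto fit to the loss tail. The thesis additionally pre-filters returns with an AR(1)-GARCH(1,1) model; that step is omitted here, and the 95% threshold and synthetic return series are assumptions for illustration.

```python
# Sketch of the (unconditional) EVT step: a one-day 99% VaR from a generalized
# Pareto distribution fitted to the loss tail. The synthetic returns, the 95%
# threshold and the omission of the GARCH pre-filter are assumptions.
import numpy as np
from scipy.stats import genpareto

def gpd_var(losses, threshold_q=0.95, alpha=0.99):
    u = np.quantile(losses, threshold_q)
    exceedances = losses[losses > u] - u
    xi, _, beta = genpareto.fit(exceedances, floc=0.0)
    n, n_u = losses.size, exceedances.size
    # POT quantile formula: VaR_alpha = u + (beta/xi) * ((n*(1-alpha)/n_u)**(-xi) - 1)
    return u + (beta / xi) * ((n * (1 - alpha) / n_u) ** (-xi) - 1.0)

rng = np.random.default_rng(2)
losses = -rng.standard_t(df=4, size=2500) * 0.01   # heavy-tailed synthetic daily returns
print(f"one-day 99% VaR from the GPD tail: {gpd_var(losses):.4f}")
```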
134

Utilising probabilistic techniques in the assessment of extreme coastal flooding frequency-magnitude relationships using a case study from south-west England

Whitworth, Michael Robert Zordan January 2015 (has links)
Recent events such as the New Orleans floods and the Japanese tsunami of 2011 have highlighted the uncertainty in quantifying the magnitude of natural hazards. The research undertaken here focuses on the uncertainty in evaluating storm surge magnitudes using a range of statistical techniques, including the Generalised Extreme Value distribution, joint probability methods, and Monte Carlo simulation. To support the evaluation of storm surge frequency-magnitude relationships, a unique hard-copy observed sea level data set recording hourly observations was acquired and digitised for Devonport, Plymouth, creating a 40-year data set. In conjunction with the Devonport data, Newlyn (1915-2012) tide gauge records were analysed, creating a data set of 2 million data points. The different statistical techniques analysed led to an uncertainty range of 0.4 m for a 1 in 250 year storm surge event, and 0.7 m for a 1 in 1000 year event. This compares to a 0.5 m uncertainty range between the low and high predictions for sea level rise by 2100. Geographical Information System modelling of the uncertainty indicated that, for a 1 in 1000 year event, the level uncertainty (0.7 m) led to an increase of 100% in the number of buildings and 50% in the total land area affected. Within the study area of south-west England there are several critical structures, including a nuclear licensed site. Incorporating the uncertainty in storm surge and wave height predictions indicated that the site would potentially be affected today by a 1 in 1000 year storm surge event coincident with a 1 in 1000 year wave. In addition to the evaluation of frequency-magnitude relations, this study identified several trends in the data set. Over the data period, sea level rise is modelled as exponential growth (0.0001 mm/yr²), indicating that the modelled sea level rise of 1.9 mm/yr at Newlyn and 2.2 mm/yr at Devonport will potentially increase over the next century, giving a minimum rise of 0.2 m by 2100. The increase in storm frequency identified in this analysis is attributed to the rise in sea level rather than to an increase in the severity of storms, with decadal variations in the observed frequency potentially linked to the North Atlantic Oscillation. The identification in this study of a significant uncertainty in the evaluation of storm surge frequency-magnitude relationships has global significance for the evaluation of natural hazards. Guidance on the evaluation of external hazards does not currently consider the effect of uncertainty adequately; an uncertainty of 0.7 m, as identified in this study, could potentially affect on the order of 500 million people worldwide living close to the coast.
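For a sense of one of the statistical routes compared here, the sketch below fits a GEV distribution to annual maxima and reads off the 1-in-250 and 1-in-1000 year return levels; the synthetic 40-year record merely stands in for the digitised Devonport series, and none of the values come from the thesis.

```python
# Sketch: GEV fit to annual maximum sea levels and the 1-in-250 / 1-in-1000
# year return levels. The synthetic 40-year record is an assumption and does
# not reproduce the Devonport or Newlyn data.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)
annual_max = 5.5 + 0.15 * rng.gumbel(size=40)     # ~40 years of synthetic annual maxima (m)
c, loc, scale = genextreme.fit(annual_max)

for T in (250, 1000):
    level = genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)
    print(f"1-in-{T} year return level: {level:.2f} m")
```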
135

Enhanced Power System Operational Performance with Anticipatory Control under Increased Penetration of Wind Energy

January 2016 (has links)
Abstract: As the world embraces a sustainable energy future, alternative energy resources, such as wind power, are increasingly being seen as an integral part of the future electric energy grid. Ultimately, integrating such a dynamic and variable mix of generation requires a better understanding of renewable generation output, in addition to power grid systems that improve power system operational performance in the presence of anticipated events such as wind power ramps. Because of the stochastic, uncontrollable nature of renewable resources, a thorough and accurate characterization of wind activity is necessary to maintain grid stability and reliability. Wind power ramps from an existing wind farm are studied to characterize persistence forecasting errors using extreme value analysis techniques. In addition, a novel metric that quantifies the amount of non-stationarity in time-series wind power data was proposed and used in a real-time algorithm, providing a rigorous method that adaptively determines the training data for forecasts. Lastly, large swings in generation or load can cause system frequency and tie-line flows to deviate from nominal values, so an anticipatory model predictive control (MPC)-based secondary control scheme was designed and integrated into an automatic generation control loop to improve the ability of an interconnection to respond to anticipated large events and fluctuations in the power system. / Dissertation/Thesis / Doctoral Dissertation Electrical Engineering 2016
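A small sketch of the persistence-forecast errors whose extremes are characterized above: the hour-ahead persistence forecast simply repeats the last observation, and a GEV is fitted to daily maxima of the absolute error. The synthetic wind-power series, the hourly resolution, and the block-maxima choice are assumptions for illustration.

```python
# Sketch: persistence forecast errors for a synthetic wind-power series and a
# GEV fit to their daily maxima. The data, the hourly resolution and the
# block-maxima choice are assumptions for illustration.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(4)
wind_power = np.clip(np.cumsum(rng.normal(0, 2, size=24 * 365 + 1)) % 100, 0, 100)  # MW, synthetic
persistence_forecast = wind_power[:-1]            # forecast for t+1 is the value observed at t
errors = wind_power[1:] - persistence_forecast    # ramp-like forecast errors

daily_max_error = np.abs(errors).reshape(365, 24).max(axis=1)
c, loc, scale = genextreme.fit(daily_max_error)
print("GEV shape parameter for daily maximum persistence error:", round(c, 3))
```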
136

Inferencia Bayesiana para valores extremos / Bayesian inference for extremes

Bernardini, Diego Fernando de, 1986- 15 August 2018 (has links)
Advisor: Laura Leticia Ramos Rifo / Dissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica / Abstract: We begin this work by presenting a brief introduction to extreme value theory, specifically studying the behavior of the random variable that represents the maximum of a sequence of independent and identically distributed random variables. The Extremal Types Theorem (or Fisher-Tippett Theorem) is a fundamental tool in the study of the asymptotic behavior of these maxima, allowing the modeling of data that represent a sequence of observed maxima of a given phenomenon or random process through a class of distributions known as the Generalized Extreme Value (GEV) family. The Gumbel distribution, associated with the maxima of distributions such as the Normal and Gamma, is a particular case of this family. We are interested in making inference about the parameters of this family; specifically, the comparison between the Gumbel and GEV models constitutes the main focus of this work. In Chapter 1 we study, in the context of classical inference, maximum likelihood estimation of these parameters and a likelihood ratio test procedure suitable for testing the null hypothesis associated with the Gumbel model against the hypothesis that represents the complete GEV model. We proceed, in Chapter 2, with a brief review of Bayesian inference theory, obtaining inferences for the parameters of interest in terms of their posterior distribution, and we also study the predictive distribution for future values. With respect to model comparison, we first study the Bayes factor and the posterior Bayes factor in the Bayesian context. Next we study the Full Bayesian Significance Test (FBST), a significance test particularly suitable for testing precise hypotheses, such as the hypothesis characterizing the Gumbel model. Furthermore, we study two other criteria for comparing models, the BIC (Bayesian Information Criterion) and the DIC (Deviance Information Criterion). We study these evidence measures specifically in the context of the comparison between the Gumbel and GEV models, as well as the predictive distribution, credible intervals, and posterior inference for the return levels associated with fixed return periods. Chapter 1 and part of Chapter 2 provide the basic theoretical foundations of this work and are strongly based on Coles (2001) and O'Hagan (1994). In Chapter 3 we present the well-known Metropolis-Hastings algorithm for simulating probability distributions, and the particular algorithm used in this work to obtain simulated samples from the posterior distribution of the parameters of interest. In the following chapter we formulate the modeling of the observed maxima data, presenting the likelihood function and setting the prior distribution for the parameters. Two applications are presented in Chapter 5. The first deals with observations of the quarterly maxima of unemployment rates in the United States of America, between the first quarter of 1994 and the first quarter of 2009. In the second application we study the semiannual maxima of sea levels at Newlyn, in the southwest of England, between 1990 and 2007. Finally, a brief discussion is presented in Chapter 6. / Mestrado / Estatística / Mestre em Estatística
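A minimal random-walk Metropolis sketch for the posterior of the GEV parameters, in the spirit of the sampler described in Chapter 3, is given below; the flat prior, proposal scales, and synthetic annual-maxima data are assumptions and do not reproduce the dissertation's actual prior or tuning.

```python
# Minimal random-walk Metropolis sketch for the posterior of GEV parameters
# (mu, log sigma, xi). Flat prior, proposal scales and synthetic data are
# assumptions; the dissertation's actual prior and tuning are not reproduced.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)
data = genextreme.rvs(c=-0.1, loc=10.0, scale=2.0, size=60, random_state=rng)

def log_post(theta):
    mu, log_sigma, xi = theta
    # scipy's shape c equals -xi under the usual GEV sign convention
    ll = genextreme.logpdf(data, c=-xi, loc=mu, scale=np.exp(log_sigma)).sum()
    return ll if np.isfinite(ll) else -np.inf     # flat (improper) prior assumed

theta = np.array([data.mean(), np.log(data.std()), 0.0])
samples, lp = [], log_post(theta)
for _ in range(20_000):
    proposal = theta + rng.normal(scale=[0.2, 0.05, 0.05])
    lp_prop = log_post(proposal)
    if np.log(rng.uniform()) < lp_prop - lp:      # Metropolis accept/reject
        theta, lp = proposal, lp_prop
    samples.append(theta.copy())

posterior = np.array(samples[5000:])              # discard burn-in
print("posterior mean of the shape parameter xi:", posterior[:, 2].mean().round(3))
```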
137

The best are never normal: exploring the distribution of firm performance

Buchbinder, Felipe 08 June 2011 (has links)
Competitive Strategy literature predicts three different mechanisms of performance generation, thus distinguishing between firms that have competitive advantage, firms that have competitive disadvantage, and firms that have neither. Nonetheless, previous works in the field have fitted a single normal distribution to model firm performance. Here, we develop a new approach that distinguishes among performance-generating mechanisms and allows the identification of firms with competitive advantage or disadvantage. Theorizing on the positive feedback loops by which firms with competitive advantage gain facilitated access to new resources, we propose a distribution that we believe data on firm performance should follow. We illustrate our model by assessing its fit to data on firm performance, addressing its theoretical implications and comparing it to previous works.
138

Power Markets and Risk Management Modeling / Trhy s elektrickou energií a modelování v řízení rizik

Paholok, Igor January 2012 (has links)
The main goal of this thesis is to summarize and explain the specifics of power markets and to test the application of models that may be used, in particular, in the area of risk management. The thesis starts with a definition of market participants, a typology of traded contracts, and a description of market development, with a focus on the Czech Republic. It continues by developing theoretical concepts for short-term/spot electricity markets and the potential link between spot and forward electricity markets. After deriving these fundamental microeconomic models, we continue with stochastic models (a mean-reverting jump-diffusion process and extreme value theory) in order to capture the volatility patterns of spot and forward power contract prices. The last chapter deals with the credit risk specifics of power trading and develops a model (using the concept known as Credit Value Adjustment) to compare the economic efficiency of OTC and exchange-based power trading. The models are tested on selected power markets, again with a focus on a Czech power market data set.
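A sketch of the mean-reverting jump-diffusion model class mentioned above, simulating one path of log spot power prices, is given below; all parameter values are illustrative assumptions, not estimates from Czech market data.

```python
# Sketch of a mean-reverting jump-diffusion path for log spot power prices,
# the "jump diffusion mean reverting" model class mentioned above. All
# parameter values are illustrative assumptions, not Czech market estimates.
import numpy as np

rng = np.random.default_rng(6)
n_days, dt = 365, 1.0
kappa, mu, sigma = 0.05, np.log(50.0), 0.04       # reversion speed, long-run log level, diffusion vol
jump_prob, jump_scale = 0.02, 0.5                 # daily jump probability and jump-size scale

x = np.empty(n_days)
x[0] = mu
for t in range(1, n_days):
    jump = rng.normal(0.0, jump_scale) if rng.uniform() < jump_prob else 0.0
    x[t] = x[t - 1] + kappa * (mu - x[t - 1]) * dt + sigma * np.sqrt(dt) * rng.normal() + jump

spot = np.exp(x)                                  # e.g. EUR/MWh
print("maximum simulated spot price:", round(float(spot.max()), 1))
```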
139

Essays on Emissions Trading Markets

Dhavala, Kishore 05 November 2012 (has links)
This dissertation is a collection of three economics essays on different aspects of carbon emission trading markets. The first essay analyzes the dynamic optimal emission control strategies of two nations. With the potential to become the largest buyer under the Kyoto Protocol, the US is assumed to be a monopsony, whereas, holding a large number of tradable permits, Russia is assumed to be a monopoly. Optimal costs of emission control programs are estimated for both countries under four different market scenarios: non-cooperative no trade, US monopsony, Russia monopoly, and cooperative trading. The US monopsony scenario is found to be the most Pareto cost-efficient. The Pareto-efficient outcome, however, would require the US to make side payments to Russia, which would even out the differences in the cost savings from cooperative behavior. The second essay analyzes the price dynamics of the Chicago Climate Exchange (CCX), a voluntary emissions trading market. By examining the volatility in market returns using AR-GARCH and Markov switching models, the study associates the market price fluctuations with two different political regimes of the US government. Further, the study also identifies high volatility in the returns a few months before the market's collapse. Three possible regulatory and market-based forces are identified as probable causes of the market volatility and its ultimate collapse. Organizers of other voluntary markets in the US and worldwide may watch closely for these regime-switching forces in order to avert emission market crashes. The third essay compares excess skewness and kurtosis in carbon prices between the CCX and the EU ETS (European Union Emission Trading Scheme) Phase I and II markets, by examining the tail behavior when market expectations exceed the threshold level. Dynamic extreme value theory is used to find the mean price exceedance of the threshold levels and to estimate the risk of loss. The calculated risk measures suggest that the CCX and EU ETS Phase I are extremely immature markets for a risk investor, whereas EU ETS Phase II is a more stable market that could develop into a mature carbon market in future years.
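As a small illustration of the "mean price exceedance of the threshold" quantity used in the third essay, the sketch below computes the empirical mean-excess function over a few thresholds; the synthetic heavy-tailed return series and the threshold grid are assumptions.

```python
# Sketch: the empirical mean-excess function, i.e. the mean exceedance of a
# threshold, for a synthetic heavy-tailed return series. The data and the
# threshold grid are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(7)
returns = rng.standard_t(df=3, size=1500) * 0.02  # synthetic allowance-price returns

for u in np.quantile(returns, [0.90, 0.95, 0.99]):
    exceedances = returns[returns > u] - u
    print(f"threshold {u:.4f}: mean excess {exceedances.mean():.4f} "
          f"over {exceedances.size} exceedances")
```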
140

The Largest Void and Cluster in Non-Standard Cosmology

Castello, Sveva January 2020 (has links)
We employ observational data about the largest cosmic void and the most massive galaxy cluster known to date, the 'Cold Spot' void and the 'El Gordo' cluster, in order to constrain the parameter |fR0| from the f(R) gravity formulation by Hu and Sawicki and the present-day matter power spectrum normalization, σ8. We obtain the marginalized posterior distribution for these two parameters through a Markov Chain Monte Carlo analysis, where the likelihood function is modeled through extreme value statistics. The prior distribution for the additional cosmological parameters included in the computations (Ωdm h², Ωb h², h and ns) is matched to recent constraints. By combining the likelihood functions for voids and clusters, we obtain a mean value log|fR0| = -5.1 ± 1.6, which is compatible with General Relativity (log|fR0| ≤ -8) at the 95% confidence level, but suggests a preference for a non-negligible modified gravity correction.
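A toy sketch of the extreme value statistics logic behind such a likelihood: if N objects are drawn from a mass function with CDF F, the most massive one has density N f(m) F(m)^(N-1). The lognormal toy mass function and the value of N below are assumptions, not the halo mass function or survey volume used in the thesis.

```python
# Toy sketch of the extreme value statistics behind the likelihood: for N
# objects drawn from a mass function with CDF F, the most massive one has
# density N * f(m) * F(m)**(N - 1). The lognormal toy mass function and the
# value of N are assumptions, not the halo mass function used in the thesis.
import numpy as np
from scipy.stats import lognorm

N = 10_000                                  # assumed number of clusters in the survey volume
mass = np.linspace(1e14, 5e15, 400)         # solar masses
mass_function = lognorm(s=0.5, scale=3e14)

pdf_max = N * mass_function.pdf(mass) * mass_function.cdf(mass) ** (N - 1)
print(f"most probable 'largest cluster' mass: {mass[np.argmax(pdf_max)]:.2e} Msun")
```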
