  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
101

Spin-glass models and interdisciplinary applications

Zarinelli, Elia 13 January 2012 (has links) (PDF)
The main subject of this thesis is the physics of spin glasses. Spin glasses were introduced in the early 1970s to describe dilute magnetic alloys. They have since been used to understand the behavior of supercooled liquids. Combinatorial optimization problems are among the systems that can be described in the language of disordered systems. In the first part of this thesis, we consider spin-glass models with Kac interactions to investigate the low-temperature phase of supercooled liquids. In the following chapters, we show how certain features of spin-glass models can be obtained from results of random matrix theory in connection with extreme value statistics. In the last part of the thesis, we consider the connection between spin-glass theory and computational science, and present a new algorithm that can be applied to certain problems in finance.
102

Inferences for the Weibull parameters based on interval-censored data and its application

Huang, Jinn-Long 19 June 2000 (has links)
In this article, we make inferences for the Weibull parameters and propose two test statistics for comparing two Weibull distributions based on interval-censored data. However, the distributions of the two statistics are unknown and not easy to obtain, so a simulation study is necessary. An urn model for simulating interval-censored data was proposed by Lee (1999) to select random intervals. We therefore propose a simulation procedure based on the urn model to approximate the quantiles of the two statistics. We demonstrate an example from an AIDS study to illustrate how the tests can be applied to the infection-time distributions of AIDS.
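The interval-censored likelihood underlying this kind of analysis can be sketched as follows. This is a minimal illustration, not the thesis's actual procedure: the inspection grid, parameter values, and helper names are invented for the example. Each observation contributes log(S(L) − S(R)) to the log-likelihood, where S is the Weibull survival function and (L, R] is the inspection interval containing the failure time.

```python
import math
import random

def weibull_sf(t, lam, k):
    # Weibull survival function S(t) = exp(-(t/lam)^k)
    return math.exp(-((t / lam) ** k))

def interval_censor(t, grid):
    # report the inspection interval (L, R] containing the failure time t;
    # failures after the last inspection are right-censored
    for lo, hi in zip(grid[:-1], grid[1:]):
        if lo < t <= hi:
            return (lo, hi)
    return (grid[-1], math.inf)

def loglik(data, lam, k):
    # interval-censored Weibull log-likelihood: sum of log(S(L) - S(R))
    total = 0.0
    for lo, hi in data:
        s_hi = 0.0 if hi == math.inf else weibull_sf(hi, lam, k)
        total += math.log(weibull_sf(lo, lam, k) - s_hi)
    return total

random.seed(42)
grid = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0]   # hypothetical inspection times
true_lam, true_k = 2.0, 1.5
times = [random.weibullvariate(true_lam, true_k) for _ in range(500)]
data = [interval_censor(t, grid) for t in times]
```

Maximizing `loglik` over (lam, k) would give the interval-censored MLE; here the log-likelihood at the true parameters can simply be compared against a clearly wrong pair as a sanity check.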
103

Extreme behavior and VaR of Short-term interest rate of Taiwan

Chiang, Ming-Chu 21 July 2008 (has links)
The current study empirically analyzes the extreme behavior of changes in Taiwan's short-term interest rate, and the impact of deregulation policies and financial turmoil on that behavior. A better knowledge of short-term interest rate properties, such as heavy tails, asymmetry, and uneven tail fatness between the right and left tails, provides insight into the extreme behavior of the short-term interest rate as well as a more accurate estimation of interest rate risk. The predictive performances of filtered and unfiltered VaR (Value at Risk) models are also examined to suggest proper models for the management of interest rate risk. Applying extreme value theory (EVT), tail behavior is analyzed and tested, and VaR based on parametric and non-parametric EVT models is calculated. The empirical findings show that, first, the distribution of rate changes is heavy-tailed, indicating that the actual risk would be underestimated under a normality assumption. Second, the unconditional distribution is consistent with heavier-tailed distributions such as an ARCH process or Student's t. Third, the right tail of the distribution is significantly heavier than the left one, indicating that the probabilities and magnitudes of rises in the rate could be higher than those of drops. Fourth, the amount of tail fatness increases after 1999, and the decisive factors behind the structural break in the tail index are the interest rate policies of the central bank of Taiwan rather than the deregulation policies in the money market. Fifth, given the two break points found in the tail indices of the right and left tails, a long sample of CP rates should not be treated as a sample from a single distribution. Sixth, the dependence and heteroscedasticity of the data series should be accounted for when applying EVT, to improve the accuracy of VaR forecasts.
Finally, EVT models predict VaR accurately before 2001, while the benchmark models, HS and GARCH, are generally superior to EVT models after 2001. Among the EVT models, MRE and CHE are relatively consistent and reliable in VaR prediction.
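The parametric-EVT VaR used in studies like this is typically computed with the peaks-over-threshold (GPD) formula. The sketch below shows that formula with illustrative numbers only; the threshold, GPD parameters, and exceedance counts are assumptions, not values from the thesis.

```python
def gpd_var(u, beta, xi, n, n_exceed, q):
    """VaR at confidence level q from a GPD(beta, xi) fitted to the
    n_exceed losses above threshold u, out of n observations:
    VaR_q = u + (beta/xi) * (((n/n_exceed) * (1 - q))**(-xi) - 1)."""
    return u + (beta / xi) * (((n / n_exceed) * (1.0 - q)) ** (-xi) - 1.0)

# illustrative values: 5000 daily rate changes, 500 exceedances over u = 0.8
var95 = gpd_var(u=0.8, beta=0.5, xi=0.2, n=5000, n_exceed=500, q=0.95)
var99 = gpd_var(u=0.8, beta=0.5, xi=0.2, n=5000, n_exceed=500, q=0.99)
```

A positive shape parameter xi encodes the heavy right tail the study reports, and the 99% VaR sits well beyond the 95% VaR, as expected for a fat-tailed loss distribution.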
104

An assessment of uncertainties and limitations in simulating tropical cyclone climatology and future changes

Suzuki-Parker, Asuka 04 May 2011 (has links)
The recent elevated North Atlantic hurricane activity has generated considerable interest in the interaction between tropical cyclones (TCs) and climate change. A possible connection between TCs and the changing climate has been indicated by observational studies based on historical TC records; they indicate emerging trends in TC frequency and intensity in some TC basins, but the detection of trends has been hotly debated due to TC track data issues. Dynamical climate modeling has also been applied to the problem, but brings its own set of limitations owing to limited model resolution and uncertainties. The final goal of this study is to project the future changes of North Atlantic TC behavior with global warming for the next 50 years using the Nested Regional Climate Model (NRCM). In the course of reaching this goal, various uncertainties and limitations in simulating TCs with the NRCM are identified and explored. First we examine the tracking algorithm used to detect and track simulated TCs in model output. The criteria and thresholds used in the tracking algorithm control the simulated TC climatology, making it difficult to objectively assess the model's ability to simulate TC climatology. Existing tracking algorithms used in previous studies are surveyed, and their criteria and thresholds are found to be very diverse. Simulated TC climatology is highly sensitive to the choice of criteria and thresholds, especially the intensity and duration thresholds. The commonly used criteria may not be strict enough to filter out intense extratropical and hybrid systems. We propose that a better distinction between TCs and other low-pressure systems can be achieved by adding the cyclone phase technique.
Two sets of NRCM simulations are presented in this dissertation: one in hindcasting mode, and the other forced by the Community Climate System Model (CCSM) to project into the future with global warming. Both are assessed using the tracking algorithm with the cyclone phase technique. The NRCM is run in hindcasting mode for the global tropics to assess its ability to simulate the current observed TC climatology. The NRCM captures the general spatial and temporal distributions of TCs, but tends to overproduce TCs, particularly in the Northwest Pacific. The overprediction of TCs is associated with the model's overall convective tendency, combined with wave energy accumulation, a prominent theory of TC genesis. On the other hand, TC frequency in the tropical North Atlantic is underpredicted due to the lack of moist African easterly waves. The importance of high resolution is shown with an additional simulation using two-way nesting. The NRCM is then forced by the CCSM to project future changes in North Atlantic TCs. An El Nino-like SST bias in the CCSM induces high vertical wind shear in the tropical North Atlantic, preventing TCs from forming in this region; a simple bias correction method is applied to remove this bias. The model projects an increase in both TC frequency and intensity owing to enhanced TC genesis in the main development region, where the large-scale environment becomes more favorable for TC genesis. However, the model cannot explicitly simulate intense (Category 3-5) storms at its limited resolution. To extrapolate the prediction to intense storms, we propose a hybrid approach that combines the model results with statistical modeling based on extreme value theory.
Specifically, the current observed TC intensity is statistically modeled with the Generalized Pareto distribution, and the simulated intensity changes from the NRCM are applied to the statistical model to project the changes in intense storms. The results suggest that the occurrence of Category 5 storms may increase by approximately 50% by 2055.
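The hybrid extrapolation step can be illustrated with the Generalized Pareto survival function: fit a GPD to current intensities above a threshold, shift its scale by a model-projected intensity change, and compare exceedance probabilities of the Category 5 threshold. Every number below is made up for illustration; the actual fitted parameters and projected changes are in the dissertation.

```python
def gpd_sf(x, u, beta, xi):
    # GPD survival function P(X > x) for exceedances above threshold u
    return (1.0 + xi * (x - u) / beta) ** (-1.0 / xi)

CAT5 = 70.0                     # m/s, approximate Category 5 wind threshold
u, beta, xi = 33.0, 9.0, 0.1    # hypothetical current-climate GPD fit

p_now = gpd_sf(CAT5, u, beta, xi)
p_future = gpd_sf(CAT5, u, beta * 1.12, xi)   # assumed 12% scale increase
increase = p_future / p_now - 1.0             # relative change in Cat-5 odds
```

Even a modest shift in the GPD scale inflates the far-tail exceedance probability substantially, which is the mechanism behind projecting large relative increases in the rarest storms.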
105

Nonlinear dependence and extremes in hydrology and climate

Khan, Shiraj 01 June 2007 (has links)
The presence of nonlinear dependence and chaos has strong implications for predictive modeling and the analysis of dominant processes in hydrology and climate. Analysis of extremes may aid in developing predictive models in hydro-climatology by giving enhanced understanding of the processes driving the extremes and perhaps delineating possible anthropogenic or natural causes. This dissertation develops and utilizes different sets of tools for predictive modeling, specifically for nonlinear dependence, extremes, and chaos, and tests the viability of these tools on real data. Commonly used dependence measures, such as linear correlation, the cross-correlogram, or Kendall's tau, cannot capture the complete dependence structure in data unless the structure is restricted to linear, periodic, or monotonic. Mutual information (MI) has frequently been utilized for capturing the complete dependence structure, including nonlinear dependence. Since geophysical data are generally finite and noisy, this dissertation addresses a key gap in the literature: the evaluation of recently proposed MI-estimation methods, particularly their robustness for short and noisy data, in order to choose the best method for capturing nonlinear dependence. Kernel density estimators (KDE) and k-nearest neighbors (KNN) perform best for 100 data points at high and low noise-to-signal levels, respectively, whereas KNN is best for 1000 data points consistently across noise levels. One real application of MI-based nonlinear dependence is capturing extrabasinal connections between the El Nino-Southern Oscillation (ENSO) and river flows in the tropics and subtropics, specifically the Nile, Amazon, Congo, Parana, and Ganges rivers, which reveals 20-70% higher dependence than suggested so far by linear correlations.
For extremes analysis, this dissertation develops a new measure, the precipitation extremes volatility index (PEVI), defined as a ratio of return levels, which quantifies the variability of extremes. The spatio-temporal variability of the PEVI, based on the Poisson-generalized Pareto (Poisson-GP) model, is investigated using weekly maxima observations available on 2.5-degree grids for 1940-2004 in South America. From 1965-2004, the PEVI shows increasing trends in a few parts of the Amazon basin and the Brazilian highlands, north-west Venezuela including Caracas, north Argentina, Uruguay, Rio de Janeiro, Sao Paulo, Asuncion, and Cayenne. Catingas, a few parts of the Brazilian highlands, Sao Paulo, and Cayenne experience an increasing number of consecutive 2- and 3-day extremes from 1965-2004. This dissertation also addresses the ability to detect a chaotic signal from a finite time-series observation of hydrologic systems. Tests with simulated data demonstrate the presence of thresholds, in terms of noise-to-chaotic-signal and seasonality-to-chaotic-signal ratios, beyond which the set of currently available tools cannot detect the chaotic component. Our results indicate that the decomposition of a simulated time series into its random, seasonal, and chaotic components is possible from finite data. Real streamflow data from the Arkansas and Colorado rivers do not exhibit chaos: while a chaotic component can be extracted from the Arkansas data, such a component is either not present or cannot be extracted from the Colorado data.
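The PEVI is a ratio of return levels, and under a Poisson-GP model the T-year return level has a closed form, so the index can be sketched directly. The parameter values are invented, and taking the 100-year/10-year ratio as the concrete definition is an assumption for this example, since the abstract does not fix the return periods.

```python
def poisson_gp_return_level(T, rate, u, beta, xi):
    # T-year return level when exceedances of threshold u arrive as a
    # Poisson process with `rate` events/year and excesses are GPD(beta, xi):
    # z_T = u + (beta/xi) * ((rate*T)**xi - 1)
    return u + (beta / xi) * ((rate * T) ** xi - 1.0)

def pevi(rate, u, beta, xi, t_long=100.0, t_short=10.0):
    # assumed concrete form of the index: long over short return level
    return (poisson_gp_return_level(t_long, rate, u, beta, xi)
            / poisson_gp_return_level(t_short, rate, u, beta, xi))
```

With a heavier tail (larger xi) the long-period return level grows faster, so the index increases; that is the sense in which PEVI measures the volatility of extremes.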
106

Ekstremumų asimptotinė analizė, kai imties didumo skirstinys yra neigiamas binominis / Asymptotic analysis of extremes when the sample size has a negative binomial distribution

Sidekerskienė, Tatjana 05 June 2006 (has links)
This work considers maxima and minima structures in which the number of observations is random and follows a negative binomial distribution. Theorems are proved that help to find the limit distribution functions of these standard structures; they generalize known propositions for the case where the sample size is a geometric random variable. A concrete analysis is also carried out for selected distributions: the exponential, generalized logistic, and uniform.
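The setting, maxima over a sample whose size is negative binomial, can be explored by simulation. The sketch below (all parameters illustrative) draws N ~ NB(r, p) as a sum of r geometric variables and takes the maximum of N exponential draws; a larger r inflates the typical sample size and hence the typical maximum.

```python
import math
import random

def geometric(p):
    # number of Bernoulli(p) trials up to and including the first success
    return int(math.log(1.0 - random.random()) / math.log(1.0 - p)) + 1

def neg_binomial(r, p):
    # negative binomial as a sum of r independent geometric(p) variables
    return sum(geometric(p) for _ in range(r))

def random_size_max(r, p, rate=1.0):
    # maximum of N i.i.d. Exponential(rate) variables, with N ~ NB(r, p)
    n = neg_binomial(r, p)
    return max(random.expovariate(rate) for _ in range(n))

random.seed(7)
small = [random_size_max(r=2, p=0.2) for _ in range(2000)]
large = [random_size_max(r=10, p=0.2) for _ in range(2000)]
```

Comparing the empirical distributions of `small` and `large` against the limit laws derived in the thesis would be the natural next step; here the simulation only illustrates the monotone effect of the size distribution on the maxima.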
107

極值理論與整合風險衡量 / Extreme value theory and integrated risk measurement

黃御綸 Unknown Date (has links)
Since the 1990s, improper handling of financial instruments and the shocks of financial crises have repeatedly destabilized global financial markets, making risk management increasingly important and the accuracy of quantitative risk models ever more critical. Given properties of financial data such as heteroskedasticity and heavy tails, this thesis combines three approaches, the AR(1)-GARCH(1,1) model, extreme value theory, and copula functions, to estimate Value at Risk (VaR). Three assumptions on the return distribution are considered: a nonparametric historical simulation method; a parametric model under the normality assumption that accounts for stochastic volatility; and an extreme value approach that fits the tail distribution to historical data. These are tested on one-day VaR, ten-day VaR, and portfolio VaR for four Taiwanese stocks (UMC, Hon Hai, Cathay Financial Holdings, China Steel) and three exchange rates (TWD/USD, JPY/USD, GBP/USD). The empirical results show that, for one-day VaR, dynamic methods perform relatively well at the 95% confidence level, while both the dynamic extreme-value method and dynamic historical simulation estimate well at the 99% level. Ten-day VaR is harder to estimate because asset returns over the next ten days may be affected by specific events; overall, the conditional GPD with Monte Carlo simulation performs relatively well at the 99% confidence level. For portfolio VaR, simulating the joint distribution of stock or foreign-exchange portfolios with a copula, and with a Clayton copula with GPD marginals, gives the best VaR estimates at both the 95% and 99% confidence levels. Although Taiwanese stock prices are subject to a 7% daily price limit and the TWD/USD rate is subject to central bank intervention, describing the tail of the asset distribution with extreme value theory still estimates risk better than the other two distributional assumptions.
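The Clayton-copula-with-GPD-marginals portfolio simulation rests on drawing dependent uniforms from the Clayton copula. Below is a standard conditional-inversion sampler as a sketch; the dependence parameter theta, the seed, and the sample size are arbitrary, and the GPD marginal transform is left out.

```python
import random

def clayton_pair(theta):
    # conditional-inversion sampler for the Clayton copula (theta > 0):
    # draw U, then invert the conditional CDF of V given U at a uniform W
    u = 1.0 - random.random()          # in (0, 1], avoids 0**negative
    w = 1.0 - random.random()
    v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
    return u, v

def pearson(xs, ys):
    # plain Pearson correlation, used here as a crude dependence check
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

random.seed(3)
pairs = [clayton_pair(theta=2.0) for _ in range(4000)]
us = [u for u, v in pairs]
vs = [v for u, v in pairs]
rho = pearson(us, vs)
```

Feeding `us` and `vs` through inverse GPD tail marginals would complete the portfolio-loss simulation; the Clayton copula is the usual choice here because it concentrates dependence in the joint lower tail, where portfolio losses cluster.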
108

Utilising probabilistic techniques in the assessment of extreme coastal flooding frequency-magnitude relationships using a case study from south-west England

Whitworth, Michael Robert Zordan January 2015 (has links)
Recent events such as the New Orleans floods and the Japanese tsunami of 2011 have highlighted the uncertainty in quantifying the magnitude of natural hazards. The research undertaken here focusses on the uncertainty in evaluating storm surge magnitudes using a range of statistical techniques, including the generalised extreme value distribution, joint probability, and Monte Carlo simulation. To support the evaluation of storm surge frequency-magnitude relationships, a unique hard-copy observed sea-level data set recording hourly observations was acquired and digitised for Devonport, Plymouth, creating a 40-year data set. In conjunction with the Devonport data, Newlyn (1915-2012) tide gauge records were analysed, creating a data set of 2 million data points. The different statistical techniques led to an uncertainty range of 0.4 m for a 1-in-250-year storm surge event and 0.7 m for a 1-in-1000-year event. This compares to a 0.5 m uncertainty range between the low and high predictions for sea-level rise by 2100. Geographical Information System modelling of the uncertainty indicated that, for a 1-in-1000-year event, the level uncertainty (0.7 m) led to an increase of 100% in buildings and 50% in total land area affected. Within the study area of south-west England there are several critical structures, including a nuclear licensed site. Incorporating the uncertainty in storm surge and wave height predictions indicated that the site would potentially be affected today by a 1-in-1000-year storm surge event coincident with a 1-in-1000-year wave. In addition to the evaluation of frequency-magnitude relations, this study has identified several trends in the data set.
Over the data period, sea-level rise is modelled as exponential growth (0.0001 mm/yr²), indicating that the modelled sea-level rise of 1.9 mm/yr for Newlyn and 2.2 mm/yr for Devonport will potentially increase over the next century, by a minimum of 0.2 m by 2100. The increase in storm frequency identified in this analysis is attributed to the rise in sea level rather than to an increase in the severity of storms, with decadal variations in the observed frequency potentially linked to the North Atlantic Oscillation. The significant uncertainty identified here in evaluating storm surge frequency-magnitude relationships has global significance for the evaluation of natural hazards. Guidance on the evaluation of external hazards currently does not adequately consider the effect of uncertainty; an uncertainty of 0.7 m, as identified in this study, could potentially affect on the order of 500 million people worldwide living close to the coast.
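Storm-surge return levels of the kind discussed above come from the GEV quantile function. The sketch below uses invented parameters, not the Devonport or Newlyn fits, and perturbs the shape parameter as a crude stand-in for fitting uncertainty, to show how a modest parameter change propagates into a tens-of-centimetres spread at long return periods.

```python
import math

def gev_return_level(T, mu, sigma, xi):
    # 1-in-T-year level from a GEV(mu, sigma, xi) annual-maximum fit:
    # z_T = mu - (sigma/xi) * (1 - y_T**(-xi)), with y_T = -log(1 - 1/T)
    y = -math.log(1.0 - 1.0 / T)
    return mu - (sigma / xi) * (1.0 - y ** (-xi))

# hypothetical annual-maximum sea-level fit (metres)
z250 = gev_return_level(250.0, mu=3.0, sigma=0.15, xi=0.05)
z1000 = gev_return_level(1000.0, mu=3.0, sigma=0.15, xi=0.05)

# perturbed shape parameter as a stand-in for estimation uncertainty
z1000_hi = gev_return_level(1000.0, mu=3.0, sigma=0.15, xi=0.12)
spread = z1000_hi - z1000
```

The longer the return period, the more the estimate depends on the poorly constrained shape parameter, which is why the study's 1-in-1000-year uncertainty band (0.7 m) is wider than its 1-in-250-year band (0.4 m).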
109

Enhanced Power System Operational Performance with Anticipatory Control under Increased Penetration of Wind Energy

January 2016 (has links)
Abstract: As the world embraces a sustainable energy future, alternative energy resources, such as wind power, are increasingly seen as an integral part of the future electric energy grid. Ultimately, integrating such a dynamic and variable mix of generation requires a better understanding of renewable generation output, in addition to power grid systems that improve operational performance in the presence of anticipated events such as wind power ramps. Because of the stochastic, uncontrollable nature of renewable resources, a thorough and accurate characterization of wind activity is necessary to maintain grid stability and reliability. Wind power ramps from an existing wind farm are studied to characterize persistence forecasting errors using extreme value analysis techniques. In addition, a novel metric that quantifies the amount of non-stationarity in time-series wind power data is proposed and used in a real-time algorithm to provide a rigorous method that adaptively determines training data for forecasts. Lastly, large swings in generation or load can cause system frequency and tie-line flows to deviate from nominal, so an anticipatory MPC-based secondary control scheme was designed and integrated into an automatic generation control loop to improve the ability of an interconnection to respond to anticipated large events and fluctuations in the power system. / Dissertation/Thesis / Doctoral Dissertation Electrical Engineering 2016
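Persistence forecasting, using the current value as the forecast h steps ahead, produces the error series whose tails such a study characterizes. A minimal sketch on synthetic "wind power" data follows; the signal shape, noise level, and horizon are arbitrary choices for the example, not the study's data.

```python
import math
import random

def persistence_errors(series, h):
    # error of the persistence forecast x_hat(t+h) = x(t)
    return [series[t + h] - series[t] for t in range(len(series) - h)]

random.seed(11)
# synthetic wind-power-like series: a slow sinusoid plus Gaussian noise
x = [50.0 + 30.0 * math.sin(2.0 * math.pi * t / 288.0) + random.gauss(0.0, 2.0)
     for t in range(2000)]

errs = persistence_errors(x, h=12)      # 12-step-ahead persistence errors
worst_ramp = max(abs(e) for e in errs)  # the extreme a tail analysis targets
```

Fitting an extreme value distribution to the upper tail of `errs` (rather than just reporting `worst_ramp`) is what turns this error series into a probabilistic statement about large ramps.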
110

Inferencia Bayesiana para valores extremos / Bayesian inference for extremes

Bernardini, Diego Fernando de, 1986- 15 August 2018 (has links)
Orientador: Laura Leticia Ramos Rifo / Dissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Matematica, Estatistica e Computação Cientifica / Previous issue date: 2010
Abstract: We begin this work by presenting a brief introduction to extreme value theory, specifically studying the behavior of the random variable representing the maximum of a sequence of independent and identically distributed random variables. We see that the Extremal Types Theorem (or Fisher-Tippett Theorem) is a fundamental tool in the study of the asymptotic behavior of those maxima, allowing the modeling of data which represent a sequence of maxima observations of a given phenomenon or random process through a class of distributions known as the Generalized Extreme Value (GEV) family. We are interested in making inference about the parameters of this family. Specifically, the comparison between the Gumbel and GEV models constitutes the main focus of this work. In Chapter 1 we study, in the context of classical inference, the method of maximum likelihood estimation for these parameters and a likelihood ratio test procedure suitable for testing the null hypothesis associated with the Gumbel model against the hypothesis representing the complete GEV model. We proceed, in Chapter 2, with a brief review of Bayesian inference theory. We also study the predictive distribution for future values. With respect to the comparison of models, we initially study the Bayes factor and the posterior Bayes factor, in the Bayesian context. Next we study the Full Bayesian Significance Test (FBST), a significance test particularly suitable for testing precise hypotheses, such as the hypothesis characterizing the Gumbel model. Furthermore, we study two other criteria for comparing models, the BIC (Bayesian Information Criterion) and the DIC (Deviance Information Criterion). We study these evidence measures specifically in the context of the comparison between the Gumbel and GEV models, as well as the predictive distribution, credible intervals, and posterior inference for the return levels associated with fixed return periods.
Chapter 1 and part of Chapter 2 provide the basic theoretical foundations of this work, and are strongly based on Coles (2001) and O'Hagan (1994). In Chapter 3 we present the well-known Metropolis-Hastings algorithm for simulating probability distributions, and the particular algorithm used in this work to obtain simulated samples from the posterior distribution of the parameters of interest. In the next chapter we formulate the modeling of the observed maxima, presenting the likelihood function and setting the prior distribution for the parameters. Two applications are presented in Chapter 5. The first deals with observations of the quarterly maxima of unemployment rates in the United States of America between the first quarter of 1994 and the first quarter of 2009. In the second application we study the semiannual maxima of sea levels at Newlyn, in the southwest of England, between 1990 and 2007. Finally, a brief discussion is presented in Chapter 6. / Mestrado em Estatística (Master's in Statistics)
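A Metropolis-Hastings sampler of the kind presented in Chapter 3 can be sketched for the simplest case: the Gumbel location parameter with known scale and a flat prior. This is an illustrative reduction, not the thesis's algorithm; the data, scale, proposal width, and chain length are all invented.

```python
import math
import random

def gumbel_loglik(data, mu, beta):
    # Gumbel(mu, beta) log-likelihood: sum of -z - exp(-z) - log(beta),
    # with z = (x - mu) / beta
    total = 0.0
    for x in data:
        z = (x - mu) / beta
        if z < -30.0:                 # guard against overflow in exp(-z)
            return -math.inf
        total += -z - math.exp(-z) - math.log(beta)
    return total

random.seed(1)
TRUE_MU, BETA = 10.0, 2.0
# Gumbel samples via inverse CDF: x = mu - beta * log(-log(U))
data = [TRUE_MU - BETA * math.log(-math.log(random.random())) for _ in range(200)]

# random-walk Metropolis with a flat prior on mu
mu, cur = 0.0, gumbel_loglik(data, 0.0, BETA)
samples = []
for step in range(6000):
    prop = mu + random.gauss(0.0, 0.5)
    ll = gumbel_loglik(data, prop, BETA)
    if math.log(random.random()) < ll - cur:   # accept/reject step
        mu, cur = prop, ll
    if step >= 1000:                           # discard burn-in
        samples.append(mu)

post_mean = sum(samples) / len(samples)
```

With a flat prior the acceptance ratio reduces to the likelihood ratio; in the full GEV problem the same loop runs over (mu, sigma, xi) jointly, with a proper prior entering the acceptance probability.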
