181.
Aplicação da Teoria do Valor Extremo e Copulas para avaliar risco de mercado de ações brasileiras / Application of extreme value theory and copulas to assess risk of the stock market in Brazil
Angelo Santos Alves, 26 September 2013 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / As instituições financeiras são obrigadas por acordos internacionais, como o Acordo de Basiléia, a avaliar o risco de mercado ao qual a instituição está propensa, de forma a evitar possíveis contaminações de desastres financeiros em seu patrimônio. Com o intuito de capturar tais fenômenos, surge a necessidade de construir modelos que capturem com mais acurácia movimentos extremos das séries de retornos. O trabalho teve como principal objetivo aplicar a Teoria do Valor Extremo juntamente com Copulas na estimação de quantis extremos para o VaR. Ele utiliza técnicas de simulação de Monte Carlo, Teoria do Valor Extremo e Cópulas com distribuições gaussianas e t. Em contrapartida, as estimativas produzidas serão comparadas com as de um segundo modelo, chamado de simulação histórica de Monte Carlo filtrada, mais conhecida como filtered historical simulation (FHS). As técnicas serão aplicadas a um portfólio de ações de empresas brasileiras. / Financial institutions are required by international agreements, such as the Basel Accord, to assess the market risk to which they are exposed, so as to prevent financial disasters from contaminating their assets. In order to capture such phenomena, there is a need for models that capture extreme movements of return series more accurately. The main objective of this work was to apply Extreme Value Theory together with copulas to estimate extreme VaR quantiles. It uses Monte Carlo simulation, Extreme Value Theory and copulas with Gaussian and t distributions. The estimates produced are then compared with those of a second model, known as filtered historical simulation (FHS). The techniques are applied to a portfolio of stocks of Brazilian companies.
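The peaks-over-threshold step implied by this abstract — fit a generalized Pareto distribution (GPD) to losses above a high threshold, then read off an extreme VaR quantile — can be illustrated as follows. This is a hedged sketch, not code from the thesis: the simulated heavy-tailed losses merely stand in for the Brazilian stock portfolio, and the threshold choice (95th percentile) is an assumption.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
# Simulated heavy-tailed daily losses (a stand-in for the real portfolio)
losses = rng.standard_t(df=4, size=5000)

u = np.quantile(losses, 0.95)       # high threshold
exceed = losses[losses > u] - u     # peaks over the threshold
xi, _, beta = genpareto.fit(exceed, floc=0)  # GPD shape and scale by MLE

# Standard POT quantile formula for the 99% VaR
n, n_u, p = len(losses), len(exceed), 0.99
var_p = u + (beta / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1)
print(float(var_p))
```

With a reasonable threshold, the GPD estimate should track the empirical 99% quantile while extrapolating more stably to quantiles beyond the sample.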
182.
Modelo de regressão de valor extremo para dados agrupados
Santo, Jonatas Silva do Espirito, 11 March 2013 (has links)
Financiadora de Estudos e Projetos / One of the distributions used to model extremal events is the type I extreme-value distribution (Gumbel distribution). The usual extreme-value regression model requires independent observations. In this work, using generalized linear models (McCullagh and Nelder, 1989) and generalized estimating equations (Liang and Zeger, 1986), we develop an extreme-value regression model for the case of independent clusters formed by dependent responses. The behavior of the parameter estimators of the proposed model is studied through Monte Carlo simulations. / A distribuição valor extremo tipo I, também conhecida como distribuição Gumbel, é uma das distribuições utilizadas para a modelagem de eventos extremos. Os modelos existentes de regressão valor extremo supõem que as observações sejam independentes, inviabilizando o uso desses modelos quando existe dependência entre as observações. Nesta dissertação, utilizando modelos lineares generalizados (McCullagh e Nelder, 1989) e equações de estimação generalizadas (Liang e Zeger, 1986), desenvolvemos o modelo de regressão valor extremo para o caso em que há grupos independentes formados por respostas dependentes. O comportamento dos estimadores dos parâmetros do modelo proposto é avaliado através de simulações de Monte Carlo.
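The building block of the model described above is a type I extreme-value (Gumbel) fit. The GEE extension for clustered responses is beyond a short snippet, but a maximum-likelihood Gumbel fit to independent block maxima, with a return level derived from it, might look like this (simulated data; the block size and return period are illustrative assumptions):

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(0)
# 60 "annual" maxima, each the maximum of 365 i.i.d. daily values
maxima = rng.normal(size=(60, 365)).max(axis=1)

loc, scale = gumbel_r.fit(maxima)   # MLE of the Gumbel location and scale
# 100-block return level: exceeded on average once every 100 blocks
rl_100 = gumbel_r.ppf(1 - 1 / 100, loc=loc, scale=scale)
print(float(loc), float(scale), float(rl_100))
```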
183.
[en] ESSAYS ON THE RISK ASSOCIATED TO FORECASTING ELECTRICITY PRICES AND ON MODELING THE DEMAND OF ENERGY FROM AN ELECTRICITY DISTRIBUTOR / [pt] ENSAIOS SOBRE O RISCO DE PREVISÃO DE PREÇOS DE ENERGIA ELÉTRICA E MODELAGEM DE CARGA DEMANDADA A UMA DISTRIBUIDORA DE ELETRICIDADE
MARIO DOMINGUES DE PAULA SIMOES, 31 July 2018 (has links)
[pt] A presente tese trata da avaliação do risco associado à incerteza presente na previsão dos preços de energia elétrica, bem como os aspectos de incerteza associados à previsão de demanda da carga de energia elétrica exigida de uma distribuidora de eletricidade. O primeiro trabalho trata do risco associado à previsão dos preços da energia elétrica, partindo do conhecido fato de que os vários modelos de previsão destes preços são sabidamente imprecisos; assim sendo, qual deve ser o risco incorrido ao se utilizar determinada técnica de modelagem, considerando-se que provavelmente estaremos fazendo uma previsão errônea. A abordagem utilizada é a modelagem dos erros de previsão com a Teoria de Valores Extremos, que se mostra bastante segura para modelagens dos quantis extremos da distribuição dos resíduos, desde 98 porcento até acima de 99,5 porcento, para diferentes frequências de amostragem dos dados. No capítulo seguinte, é feita uma avaliação da carga elétrica demandada a uma distribuidora, primeiramente considerando a abordagem utilizando modelos do tipo ARMA e ARMAX, buscando avaliar sua eficiência preditiva. Estes modelos são sabidamente apropriados para previsões no curto prazo, e mostramos através de simulações de Monte Carlo, que sua extensão para previsões de longo prazo torna inócua a busca de sofisticação através do trabalho de incorporação de variáveis exógenas. O motivo é que dado que o erro incorrido em quaisquer destas previsões mais longas com tais modelos é tão grande, ainda que sejam
modelos mais ou menos sofisticados, com variáveis exógenas ou não, um modelo simples produzirá o mesmo efeito que aquele de maior sofisticação, em termos de confiança na previsão média obtida. Finalmente, o último trabalho aborda o tema de possíveis não linearidades no processo de geração de dados da carga elétrica demandada de uma distribuidora, admitindo não ser este um processo apenas linear. Para tal são usados modelos não lineares auto-regressivos de mudança de regimes, que se mostram vantajosos por serem inerentemente resistentes a possíveis quebras estruturais na série de carga utilizada, além de serem particularmente apropriados para modelar assimetrias no processo gerador de dados. Mostramos que mesmo modelos do tipo TAR simples, com apenas dois regimes e auto excitados, isto é, não incorporando quaisquer variáveis exógenas, podem ser mais apropriados do que modelos lineares auto-regressivos, demonstrando melhor capacidade de previsão fora-da-amostra. Ao mesmo tempo, tais modelos têm relativa facilidade de cálculo, não exigindo sofisticados recursos computacionais. / [en] This thesis discusses the risk associated with the uncertainty present in forecasting electricity prices, as well as the uncertainty in the forecast of the electrical energy load required from an electricity distributor. The first essay deals with the risk inherent in forecasting electricity prices, bearing in mind that the various existing models are notoriously imprecise. Therefore, we attempt to determine the forecast risk incurred when a certain forecasting technique is used, given that the forecast will probably be inaccurate. The approach used is to model the forecast residuals with Extreme Value Theory, which proves satisfactorily accurate for modeling the distribution of residuals at extreme quantiles, from 98 per cent up to over 99.5 per cent, for different data sampling frequencies.
The following chapter evaluates the electricity load required from a distributor, first using models such as ARMA and ARMAX and assessing their predictive efficiency. These models are known to be appropriate for short-term predictions, and we show, by means of Monte Carlo simulations, that extending them to long-term forecasts renders useless any attempt at sophistication through the incorporation of exogenous variables. Since the error in such longer forecasts is so large either way, with exogenous variables or not, a simpler model will be as useful as any in terms of the error in the mean prediction. Finally, the last work discusses possible nonlinear effects in the data-generating process of the electrical load demanded from an energy distributor, without assuming this process to be purely linear. To accomplish this, we use nonlinear autoregressive regime-switching models, which are shown to be inherently resistant to possible structural breaks in the load series, while being particularly appropriate for modeling asymmetries in the data-generating process. We show that even relatively simple self-excited TAR models with only two regimes, that is, not resorting to any exogenous variables, can be more appropriate than linear autoregressive models, yielding better out-of-sample forecasts. At the same time, such models are relatively simple to compute, not requiring sophisticated computational means.
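As a hedged illustration of the self-excited two-regime TAR model mentioned above (not the thesis's fitted model — the threshold and autoregressive coefficients below are arbitrary assumptions), the data-generating process and its one-step forecast can be written in a few lines:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_setar(n, threshold=0.0, phi_low=0.8, phi_high=0.3, sigma=1.0):
    """Self-excited TAR(1): the AR coefficient switches on the lagged value."""
    y = np.zeros(n)
    for t in range(1, n):
        phi = phi_low if y[t - 1] <= threshold else phi_high
        y[t] = phi * y[t - 1] + sigma * rng.normal()
    return y

def setar_forecast(y_last, threshold=0.0, phi_low=0.8, phi_high=0.3):
    """One-step-ahead conditional mean of the same TAR(1)."""
    phi = phi_low if y_last <= threshold else phi_high
    return phi * y_last

y = simulate_setar(5000)
print(setar_forecast(-1.0), setar_forecast(2.0))  # prints -0.8 0.6
```

Below the threshold the persistence is 0.8, above it only 0.3, which is exactly the kind of asymmetry a single linear AR model cannot represent.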
184.
Mesure du capital réglementaire par des modèles de risque de marché / Measure of capital requirement by market risk models
Kourouma, Lancine, 11 May 2012 (has links)
Suite à la crise financière et économique de 2008, il a été constaté sur le portefeuille de négociation des banques un montant de capital réglementaire significativement inférieur aux pertes réelles. Pour comprendre les causes de cette insuffisance de capital réglementaire, il nous a paru important d'évaluer la fiabilité des modèles de mesure de risque de marché et de proposer des méthodologies de stress test pour la gestion des risques extrêmes. L'objectif est de mesurer le capital réglementaire sur un portefeuille de négociation composé d'actions et de matières premières par la mesure de la Value at Risk (VaR) et l'Expected Shortfall. Pour réaliser cet objectif, nous avons utilisé le modèle Generalized Pareto Distribution (GPD) et deux modèles internes utilisés par les banques : méthode de simulation historique et modèle de la loi normale. Une première évaluation de la fiabilité effectuée sur les trois modèles de risque sous l'hypothèse de volatilité constante, montre que les modèles internes des banques et le modèle GPD ne mesurent pas correctement le risque du portefeuille d'étude pendant les périodes de crise. Néanmoins, le modèle GPD est fiable en période de faible volatilité mais avec une forte surestimation du risque réel ; cela peut conduire les banques à bloquer plus de fonds propres réglementaires qu'il est nécessaire. Une seconde évaluation de la fiabilité des modèles de risque a été effectuée sous l'hypothèse du changement de la volatilité et par la prise en compte de l'effet asymétrique des rentabilités financières. Le modèle GPD s'est révélé le plus fiable quelles que soient les conditions des marchés. La prise en compte du changement de la volatilité a amélioré la performance des modèles internes des banques. L'intégration des scénarios historiques et hypothétiques dans les modèles de risque a permis d'évaluer le risque extrême tout en diminuant la subjectivité reprochée aux techniques de stress test. 
Le stress test réalisé avec les modèles internes des banques ne permet pas une mesure correcte du risque extrême. Le modèle GPD est mieux adapté pour le stress test. Nous avons développé un algorithme de stress test qui permettra aux banques d'évaluer le risque extrême de leurs portefeuilles et d'identifier les facteurs de risque responsables de ce risque. Le calcul du capital réglementaire sur la base de la somme de la VaR et du stress VaR n'est pas logique et entraîne un doublement des fonds propres réglementaires des banques. Le doublement de ces fonds propres aura pour conséquence le resserrement du crédit à l'économie. Nous observons que le coefficient multiplicateur et le principe de la racine carrée du temps de l'accord de Bâle conduisent les banques à faire un arbitrage en faveur des modèles de risque non fiables. / During the financial and economic crisis of 2008, it was observed that the capital requirement for banks' trading portfolios was significantly lower than the actual losses. To understand the causes of this shortfall, it seemed important to assess the reliability of market risk models and to propose stress-testing methodologies for the management of extreme risks. The objective is to measure the capital requirement on a trading portfolio, composed of stocks and commodities, through the Value at Risk (VaR) and the Expected Shortfall. To achieve this goal, we use the Generalized Pareto Distribution (GPD) and two internal models commonly used by banks: the historical simulation method and the normal distribution model. A first reliability evaluation of the three risk models, under the hypothesis of constant volatility, shows that the banks' internal models and the GPD model do not correctly measure the risk of the studied portfolio during crisis periods.
However, the GPD model is reliable in periods of low volatility, albeit with a strong overestimation of the real risk; this can lead banks to set aside more regulatory capital than necessary. A second reliability evaluation was carried out under the hypothesis of changing volatility and taking into account the asymmetric effect of financial returns. The GPD model proved the most reliable regardless of market conditions. Taking changing volatility into account improved the performance of the banks' internal models. Integrating historical and hypothetical scenarios into the risk models improved the estimation of extreme risk while reducing the subjectivity for which stress-testing techniques are criticized. Stress tests carried out with the banks' internal models do not allow a correct measure of extreme risk; the GPD model is better suited to stress testing. We developed a stress-testing algorithm that allows banks to estimate the extreme risk of their portfolios and to identify the risk factors responsible for it. Computing the capital requirement as the sum of the VaR and the stressed VaR is not logical and doubles banks' regulatory capital, which in turn leads to a credit crunch in the economy. We observe that the multiplier coefficient and the square-root-of-time rule of the Basel Accord lead banks to arbitrage in favor of unreliable risk models.
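The GPD-based quantities at the heart of the comparison above have closed forms. A minimal sketch of the standard peaks-over-threshold formulas for VaR and Expected Shortfall, with illustrative parameter values rather than those estimated in the thesis:

```python
def gpd_var_es(u, beta, xi, n, n_u, p):
    """VaR and Expected Shortfall from a GPD fitted above threshold u,
    using the standard POT formulas (valid for xi < 1, xi != 0)."""
    var_p = u + (beta / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1.0)
    es_p = var_p / (1.0 - xi) + (beta - xi * u) / (1.0 - xi)
    return var_p, es_p

# Illustrative inputs: threshold 2.0, 125 exceedances out of 2500 observations
var99, es99 = gpd_var_es(u=2.0, beta=0.6, xi=0.2, n=2500, n_u=125, p=0.99)
print(round(var99, 3), round(es99, 3))  # prints 3.139 4.174
```

The Expected Shortfall is always above the VaR at the same level, which is one reason it is preferred as a measure of extreme risk.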
185.
Downscaling estocástico para extremos climáticos via interpolação espacial
Carvalho, Daniel Matos de, 31 May 2010
Conselho Nacional de Desenvolvimento Científico e Tecnológico / Present-day weather forecast models usually cannot provide realistic descriptions of local and particularly extreme weather conditions. However, for lead times of a few days, they provide reliable forecasts of the atmospheric circulation that encompasses the subscale processes leading to extremes. Hence, forecasts of extreme events can only be achieved through a combination of dynamical and statistical analysis methods, where a stable and significant statistical model, based on prior physical reasoning, establishes a statistical-dynamical relation between the local extremes and the large-scale circulation. Here we present the development and application of such a statistical model calibration on the basis of extreme value theory, in order to derive probabilistic forecasts for extreme local temperature. The downscaling is applied to NCEP/NCAR reanalysis data, in order to derive estimates of daily temperature at weather stations in the Brazilian Northeast. / Os dados de reanálise de temperatura do ar e precipitação do NCEP (National Centers for Environmental Prediction) serão refinados para a produção dos níveis de retorno para eventos extremos nas 9 capitais do Nordeste Brasileiro (NB): São Luís, Teresina, Fortaleza, Natal, João Pessoa, Recife, Maceió, Aracaju e Salvador. A grade do NCEP possui resolução espacial de 2.5° x 2.5°, disponibilizando séries históricas de 1948 à atualidade. Com esta resolução, a grade envolve o NB utilizando 72 localizações (séries). A primeira etapa consiste em ajustar os modelos da Distribuição Generalizada de Valores Extremos (GEV) e da Distribuição Generalizada de Pareto (GPD) para cada ponto da grade. Utilizando o método geoestatístico denominado Krigagem, os parâmetros da GEV e da GPD serão interpolados espacialmente. Considerando a interpolação espacial dos parâmetros, os níveis de retorno para extremos de temperatura do ar e precipitação poderão ser obtidos onde o NCEP não fornece informação relevante. Visando validar os resultados desta proposta, serão ajustados os modelos GEV e GPD às séries observacionais diárias de temperatura e precipitação de cada capital nordestina, para comparação com os resultados obtidos a partir da interpolação espacial. Por fim, o método de Regressão Quantílica será utilizado, como método mais tradicional, com a finalidade de comparação de métodos.
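Once GEV parameters are available at a location (fitted or kriged), the return level is a direct quantile computation. A sketch with scipy, noting that scipy's shape parameter `c` equals minus the climatological shape ξ; the parameter values here are hypothetical, not NCEP estimates:

```python
from scipy.stats import genextreme

def return_level(mu, sigma, xi, T):
    """T-block return level of a GEV(mu, sigma, xi); scipy uses c = -xi."""
    return genextreme.ppf(1 - 1 / T, c=-xi, loc=mu, scale=sigma)

# Hypothetical air-temperature GEV parameters at one grid point
z50 = return_level(mu=30.0, sigma=2.0, xi=0.1, T=50)
print(round(float(z50), 2))  # prints 39.55
```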
187.
Aplicação da Teoria dos valores extremos em estratégias "long-short"
Monte-mor, Danilo Soares, 17 December 2010 (has links)
Absolute return funds (hedge funds), whose main objective is to improve performance through arbitrage strategies such as long-short strategies, have increasingly appeared in the investment market. It is the disproportionate, and even antagonistic, evolution of asset prices that allows players to structure strategies that generate additional returns, above opportunity costs and independent of market movements. In this work we used Extreme Value Theory (EVT), an important branch of probability, to model the series of the direct relation between the prices of two pairs of assets. The quantiles obtained from this modeling, together with the quantiles given by the normal distribution, were superimposed on data for periods subsequent to the period analyzed. From the comparison of these data we created a new quantitative long-short arbitrage strategy, called the GEV Long-Short Strategy. / Cada vez mais têm surgido no mercado de investimento fundos de retorno absoluto (hedge funds) que têm como objetivo principal melhorar seus desempenhos através de estratégias de arbitragem, como é o caso das estratégias long-short. É o comportamento desproporcional e até mesmo antagônico dos preços dos ativos que permite aos players estruturar estratégias para gerar retornos adicionais, superiores aos custos de oportunidade e independentes do movimento do mercado. Neste trabalho foi utilizada a Teoria de Valores Extremos (TVE), um importante ramo da probabilidade, para que fossem modeladas as séries da relação direta entre preços de dois pares de ativos. Os quantis obtidos a partir de tal modelagem, juntamente com os quantis fornecidos pela normal, foram superpostos aos dados para períodos subsequentes ao período analisado. A partir da comparação desses dados foi criada uma nova estratégia quantitativa long-short de arbitragem, a qual denominamos GEV Long-Short Strategy.
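A rough sketch of the idea (an assumption-laden toy, not the thesis's calibrated strategy): fit GEVs to weekly maxima and minima of the price ratio of a pair, take high quantiles as bands, and trade when the ratio leaves the bands.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)
# Stand-in for the price ratio of a cointegrated pair (2520 trading days)
ratio = 1.0 + 0.05 * rng.standard_normal(2520)

blocks = ratio.reshape(-1, 5)              # weekly blocks of 5 days
hi = genextreme.fit(blocks.max(axis=1))    # GEV for weekly maxima
lo = genextreme.fit(-blocks.min(axis=1))   # minima via negated maxima

upper = genextreme.ppf(0.99, *hi)          # short the spread above this band
lower = -genextreme.ppf(0.99, *lo)         # long the spread below this band

# -1 = short, +1 = long, 0 = stay out
signal = np.where(ratio > upper, -1, np.where(ratio < lower, 1, 0))
print(float(lower), float(upper), int(np.abs(signal).sum()))
```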
188.
Value at Risk no mercado financeiro internacional: avaliação da performance dos modelos nos países desenvolvidos e emergentes / Value at Risk in international finance: evaluation of the models performance in developed and emerging countries
Luiz Eduardo Gaio, 01 April 2015 (has links)
Diante das exigências estipuladas pelos órgãos reguladores pelos acordos internacionais, tendo em vista as inúmeras crises financeiras ocorridas nos últimos séculos, as instituições financeiras desenvolveram diversas ferramentas para a mensuração e controle do risco inerente aos negócios. Apesar da crescente evolução das metodologias de cálculo e mensuração do risco, o Value at Risk (VaR) se tornou referência como ferramenta de estimação do risco de mercado. Nos últimos anos novas técnicas de cálculo do Value at Risk (VaR) vêm sendo desenvolvidas. Porém, nenhuma tem sido considerada como a que melhor ajusta os riscos para diversos mercados e em diferentes momentos. Não existe na literatura um modelo conciso e coerente com as diversidades dos mercados. Assim, o presente trabalho tem por objetivo geral avaliar os estimadores de risco de mercado, gerados pela aplicação de modelos baseados no Value at Risk (VaR), aplicados aos índices das principais bolsas dos países desenvolvidos e emergentes, para os períodos normais e de crise financeira, de modo a apurar os mais efetivos nessa função. Foram considerados no estudo os modelos VaR Não condicional, pelos modelos tradicionais (Simulação Histórica, Delta-Normal e t-Student) e baseados na Teoria de Valores Extremos; o VaR Condicional, comparando os modelos da família ARCH e Riskmetrics e o VaR Multivariado, com os modelos GARCH bivariados (Vech, Bekk e CCC), funções cópulas (t-Student, Clayton, Frank e Gumbel) e por Redes Neurais Artificiais. A base de dados utilizada refere-se as amostras diárias dos retornos dos principais índices de ações dos países desenvolvidos (Alemanha, Estados Unidos, França, Reino Unido e Japão) e emergentes (Brasil, Rússia, Índia, China e África do Sul), no período de 1995 a 2013, contemplando as crises de 1997 e 2008. Os resultados do estudo foram, de certa forma, distintos das premissas iniciais estabelecidas pelas hipóteses de pesquisa.
Diante de mais de mil modelagens realizadas, os modelos condicionais foram superiores aos não condicionais na maioria dos casos. Em específico, o modelo GARCH (1,1), tradicional na literatura, teve uma efetividade de ajuste em 93% dos casos. Para a análise multivariada, não foi possível definir um modelo mais assertivo. Os modelos Vech, Bekk e Cópula-Clayton tiveram desempenhos semelhantes, com bons ajustes em 100% dos testes. Diferentemente do que era esperado, não foi possível perceber diferenças significativas entre os ajustes para países desenvolvidos e emergentes e os momentos de crise e normal. O estudo contribuiu na percepção de que os modelos utilizados pelas instituições financeiras não são os que apresentam melhores resultados na estimação dos riscos de mercado, mesmo sendo recomendados pelas instituições renomadas. Cabe uma análise mais profunda sobre o desempenho dos estimadores de riscos, utilizando simulações com as carteiras de cada instituição financeira. / Given the requirements stipulated by regulatory agencies in international agreements, and in light of the numerous financial crises of the last centuries, financial institutions have developed several tools to measure and control business risk. Despite the continuing evolution of calculation and measurement methodologies, Value at Risk (VaR) has become the reference tool for estimating market risk. In recent years, new techniques for calculating Value at Risk (VaR) have been developed. However, none has been considered the one that best fits the risks of different markets at different times; the literature offers no single model consistent with the diversity of markets.
Thus, this work aims to assess the market risk estimates generated by models based on Value at Risk (VaR), applied to the indices of the major stock exchanges of developed and emerging countries, in both normal and financial crisis periods, in order to identify the most effective ones. The study considered unconditional VaR, using both traditional models (historical simulation, delta-normal and Student t) and models based on Extreme Value Theory; conditional VaR, comparing ARCH-family models and RiskMetrics; and multivariate VaR, with bivariate GARCH models (Vech, Bekk and CCC), copula functions (Student t, Clayton, Frank and Gumbel) and artificial neural networks. The database consists of daily returns of the main stock indexes of developed countries (Germany, United States, France, United Kingdom and Japan) and emerging countries (Brazil, Russia, India, China and South Africa) from 1995 to 2013, covering the 1997 and 2008 crises. The results were somewhat different from the premises established by the research hypotheses. Across more than a thousand estimated models, conditional models were superior to unconditional ones in the majority of cases. In particular, the GARCH(1,1) model, a standard in the literature, produced adequate fits in 93% of cases. For the multivariate analysis, it was not possible to single out a best-performing model: the Vech, Bekk and Clayton copula models had similar performance, with good fits in 100% of the tests. Contrary to expectations, no significant differences were observed between the fits for developed and emerging countries, or between crisis and normal periods. The study contributed to the perception that the models used by financial institutions, even those recommended by renowned institutions, are not the best performers in estimating market risk. A deeper analysis of the performance of the risk estimators, using simulations with each financial institution's portfolio, remains warranted.
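The best-performing conditional model in the study, GARCH(1,1), maps a variance recursion to a one-day-ahead VaR. A hedged sketch with fixed, illustrative parameters (in practice ω, α, β are estimated by maximum likelihood, for example with the `arch` package, and the returns below are simulated stand-ins):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
returns = 0.01 * rng.standard_normal(1000)  # stand-in for daily index returns

# Illustrative GARCH(1,1) parameters (not estimated from real data)
omega, alpha, beta = 1e-6, 0.08, 0.90       # alpha + beta < 1: stationarity

sigma2 = np.empty(len(returns) + 1)
sigma2[0] = returns.var()
for t in range(len(returns)):               # conditional variance recursion
    sigma2[t + 1] = omega + alpha * returns[t] ** 2 + beta * sigma2[t]

# One-day-ahead 99% VaR under conditional normality
var99 = -norm.ppf(0.01) * np.sqrt(sigma2[-1])
print(float(var99))
```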
189.
Théorie des valeurs extrêmes et applications en environnement / Extreme value theory and applications in environment
Rietsch, Théo, 14 November 2013 (has links)
Les deux premiers chapitres de cette thèse s'attachent à répondre à des questions cruciales en climatologie. La première est de savoir si un changement dans le comportement des extrêmes de température peut être détecté entre le début du siècle et aujourd'hui. Nous utilisons la divergence de Kullback Leibler, que nous adaptons au contexte des extrêmes. Des résultats théoriques et des simulations permettent de valider notre approche. La deuxième question est de savoir où retirer des stations météo pour perdre le moins d'information sur le comportement des extrêmes. Un algorithme, le Query By Committee, est développé puis appliqué à un jeu de données réelles. Le dernier chapitre de la thèse traite de l'estimation robuste du paramètre de queue d'une distribution de type Weibull en présence de co-variables aléatoires. Nous proposons un estimateur robuste basé sur un critère de minimisation de la divergence entre deux densités et étudions ses propriétés. / In the first two chapters, we address two questions that are critical in climatology. The first is whether a change in the behaviour of temperature extremes occurred between the beginning of the century and today. We use a version of the Kullback-Leibler divergence tailored to the extreme value context, and provide theoretical and simulation results to validate our approach. The second question is where to remove stations from a network so as to lose the least information about the behaviour of the extremes. An algorithm called Query By Committee is developed and applied to real data. The last chapter of the thesis deals with a more theoretical subject: the robust estimation of a Weibull-type tail index in the presence of random covariates. We propose a robust estimator based on a criterion of minimization of the divergence between two densities and study its properties.
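The first chapter's tool — a Kullback-Leibler divergence adapted to extremes — can be illustrated in simplified form by computing the KL divergence between two generalized Pareto tails numerically. This is a generic sketch, not the thesis's tailored estimator, and the parameter values are arbitrary:

```python
import numpy as np
from scipy.stats import genpareto
from scipy.integrate import quad

def kl_gpd(xi_p, beta_p, xi_q, beta_q):
    """KL(p||q) between two GPDs with loc=0, by numerical integration."""
    p = lambda x: genpareto.pdf(x, xi_p, scale=beta_p)
    q = lambda x: genpareto.pdf(x, xi_q, scale=beta_q)
    val, _ = quad(lambda x: p(x) * np.log(p(x) / q(x)), 0, np.inf)
    return val

d_same = kl_gpd(0.2, 1.0, 0.2, 1.0)   # identical tails: divergence 0
d_diff = kl_gpd(0.2, 1.0, 0.4, 1.5)   # different tails: divergence > 0
print(d_same, round(d_diff, 4))
```

A divergence significantly above zero between tails fitted to two periods is the kind of signal that suggests a change in extreme behaviour.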
190.
Novelty detection with extreme value theory in vital-sign monitoring
Hugueny, Samuel Y., January 2013
Every year in the UK, tens of thousands of hospital patients suffer adverse events, such as unplanned transfers to Intensive Therapy Units or unexpected cardiac arrests. Studies have shown that in a large majority of cases, significant physiological abnormalities can be observed within the 24-hour period preceding such events. Such warning signs may go unnoticed if they occur between observations by the nursing staff, or are simply not identified as such. Timely detection of these warning signs and appropriate escalation schemes have been shown to improve both patient outcomes and the use of hospital resources, most notably by reducing patients’ length of stay. Automated real-time early-warning systems appear to be cost-efficient answers to the need for continuous vital-sign monitoring. Traditionally, a limitation of such systems has been their sensitivity to noisy and artefactual measurements, resulting in false-alert rates that made them unusable in practice, or earned them the mistrust of clinical staff. Tarassenko et al. (2005) and Hann (2008) proposed a novelty detection approach to the problem of continuous vital-sign monitoring, which, in a clinical trial, was shown to yield clinically acceptable false-alert rates. In this approach, an observation is compared to a data fusion model, and its “normality” is assessed by comparing a chosen statistic to a pre-set threshold. The method, while informed by large amounts of training data, has a number of heuristic aspects. This thesis proposes a principled approach to multivariate novelty detection in stochastic time-series, where novelty scores have a probabilistic interpretation and are explicitly linked to the starting assumptions made. Our approach stems from the observation that novelty detection using complex multivariate, multimodal generative models is generally an ill-defined problem when attempted in the data space.
In situations where “novel” is equivalent to “improbable with respect to a probability distribution”, formulating the problem in a univariate probability space allows us to use classical results of univariate statistical theory. Specifically, we propose a multivariate extension to extreme value theory and, more generally, order statistics, suitable for performing novelty detection in time-series generated from a multivariate, possibly multimodal, model. All the methods introduced in this thesis are applied to a vital-sign monitoring problem and compared to the existing method of choice. We show that it is possible to outperform the existing method while retaining a probabilistic interpretation. In addition to their application to novelty detection for vital-sign monitoring, the contributions in this thesis to extreme value theory and order statistics are also valid in the broader context of data modelling, and may be useful for analysing data from other complex systems.
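A simplified flavour of EVT-based novelty detection (a generic univariate sketch under stated assumptions — the thesis develops a multivariate, multimodal extension): fit a GPD to the tail of "normal" novelty scores and derive an alarm threshold for a chosen per-observation false-alarm probability.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
# Novelty scores of "normal" observations (simulated stand-in data)
scores = rng.gamma(shape=2.0, scale=1.0, size=10000)

u = np.quantile(scores, 0.95)          # tail threshold
exc = scores[scores > u] - u
xi, _, beta = genpareto.fit(exc, floc=0)
zeta = (scores > u).mean()             # empirical exceedance probability

p_fa = 1e-4                            # desired false-alarm probability
if abs(xi) > 1e-6:                     # POT quantile; xi -> 0 limit below
    thr = u + beta / xi * ((p_fa / zeta) ** (-xi) - 1.0)
else:                                  # exponential-tail limit
    thr = u - beta * np.log(p_fa / zeta)
print(float(thr))
```

An observation whose score exceeds `thr` is flagged as novel; raising `p_fa` makes the detector more sensitive at the cost of more false alerts.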