171

Aplicação da Teoria do Valor Extremo e Copulas para avaliar risco de mercado de ações brasileiras / Application of extreme value theory and copulas to assess risk of the stock market in Brazil

Angelo Santos Alves 26 September 2013
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) / Financial institutions are required by international agreements, such as the Basel Accord, to assess the market risk to which they are exposed, in order to shield their capital from the contagion of financial disasters. Capturing such phenomena requires models that describe extreme movements of return series more accurately. The main objective of this work was to apply Extreme Value Theory together with copulas to estimate extreme quantiles for the VaR. It combines Monte Carlo simulation, Extreme Value Theory and copulas with Gaussian and t distributions. The resulting estimates are then compared with those of a second model, known as filtered historical simulation (FHS). The techniques are applied to a portfolio of stocks of Brazilian companies.
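As a rough sketch of the approach this abstract describes (the Gaussian-copula branch only), the fragment below fits GPD lower tails to two return series, joins the margins with a Gaussian copula, and reads a portfolio VaR off the simulated returns. The function names, the 10% tail fraction and the equal weights are illustrative assumptions, not taken from the thesis.

```python
import numpy as np
from scipy.stats import genpareto, norm, rankdata

def semi_param_ppf(u, sample, tail=0.10):
    """Inverse CDF: empirical quantiles in the body, GPD in the lower tail."""
    u = np.asarray(u, dtype=float)
    thr = np.quantile(sample, tail)                  # lower-tail threshold
    exc = thr - sample[sample < thr]                 # losses beyond the threshold
    xi, _, beta = genpareto.fit(exc, floc=0.0)       # GPD shape and scale
    out = np.quantile(sample, u)                     # body: empirical quantile
    lo = u < tail                                    # tail: invert the GPD mapping
    out[lo] = thr - genpareto.ppf(1.0 - u[lo] / tail, xi, loc=0.0, scale=beta)
    return out

def copula_evt_var(r1, r2, w=(0.5, 0.5), n_sim=100_000, alpha=0.01, seed=0):
    """One-day portfolio VaR from a Gaussian copula with semi-parametric margins."""
    rng = np.random.default_rng(seed)
    u1 = (rankdata(r1) - 0.5) / len(r1)              # pseudo-observations
    u2 = (rankdata(r2) - 0.5) / len(r2)
    rho = np.corrcoef(norm.ppf(u1), norm.ppf(u2))[0, 1]
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n_sim)
    u = norm.cdf(z)                                  # dependent uniforms
    port = w[0] * semi_param_ppf(u[:, 0], r1) + w[1] * semi_param_ppf(u[:, 1], r2)
    return -np.quantile(port, alpha)                 # positive number = 99% VaR
```

Swapping the bivariate normal draw for a multivariate t draw would give the t-copula variant the abstract also mentions.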
172

[en] ANALYSIS OF EXTREME VALUES THEORY AND MONTE CARLO SIMULATION FOR THE CALCULATION OF VALUE-AT-RISK IN STOCK PORTFOLIOS / [pt] ANÁLISE DA TEORIA DOS VALORES EXTREMOS E DA SIMULAÇÃO DE MONTE CARLO PARA O CÁLCULO DO VALUE-AT-RISK EM CARTEIRAS DE INVESTIMENTOS DE ATIVOS DE RENDA VARIÁVEL

GUSTAVO JARDIM DE MORAIS 16 July 2018
[en] After the recent crises that hit financial markets around the world (most notably that of 2008/2009, but also the Eastern European crisis of July 2007, the Russian moratorium of October 1998 and, in Brazil, the change of exchange-rate regime in January 1999), financial institutions incurred large losses in each of these events, and one of the main questions raised about financial models concerned risk management. The various methods for computing Value-at-Risk, as well as the simulations and scenarios drawn up by analysts, could neither anticipate the magnitude of these crises nor prevent them from worsening. This motivates the study of financial risk-management systems, which can and must be improved, lest even larger financial catastrophes occur. Although the literature on the subject is vast, the methodologies for computing value at risk are neither exact nor free of flaws. In this context, it is necessary to develop and refine risk-management tools capable of supporting a better allocation of available resources, assessing the level of risk to which an investment is exposed and its compatibility with the expected return.
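For reference, here is a minimal sketch of the three textbook one-day 99% VaR estimates this kind of study contrasts, run on synthetic data (the Student-t placeholder returns and the 5% exceedance threshold are assumptions of this illustration, not the thesis' setup):

```python
import numpy as np
from scipy.stats import norm, genpareto

alpha = 0.01                                         # 99% confidence level
rng = np.random.default_rng(1)
r = rng.standard_t(df=4, size=2_000) * 0.01          # placeholder daily returns

# 1) Historical simulation: empirical quantile of observed returns.
var_hist = -np.quantile(r, alpha)

# 2) Monte Carlo under normality: simulate from the fitted Gaussian.
mu, sigma = r.mean(), r.std(ddof=1)
sims = rng.normal(mu, sigma, size=100_000)
var_mc = -np.quantile(sims, alpha)

# 3) EVT (peaks over threshold): fit a GPD to the 5% worst losses.
u = np.quantile(r, 0.05)
exc = u - r[r < u]                                   # exceedances below u
xi, _, beta = genpareto.fit(exc, floc=0.0)
var_evt = -(u - genpareto.ppf(1.0 - alpha / 0.05, xi, loc=0.0, scale=beta))

print(f"historical {var_hist:.4f}  MC-normal {var_mc:.4f}  EVT {var_evt:.4f}")
```

On heavy-tailed data the Gaussian Monte Carlo estimate is typically the smallest of the three, which is the understatement of extreme risk the abstract warns about.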
173

[en] ESSAYS ON THE RISK ASSOCIATED TO FORECASTING ELECTRICITY PRICES AND ON MODELING THE DEMAND OF ENERGY FROM AN ELECTRICITY DISTRIBUTOR / [pt] ENSAIOS SOBRE O RISCO DE PREVISÃO DE PREÇOS DE ENERGIA ELÉTRICA E MODELAGEM DE CARGA DEMANDADA A UMA DISTRIBUIDORA DE ELETRICIDADE

MARIO DOMINGUES DE PAULA SIMOES 31 July 2018
[en] This thesis addresses the risk associated with the uncertainty inherent in forecasting electricity prices, as well as the uncertainty in forecasting the electrical load demanded from an electricity distributor. The first essay deals with the risk inherent in electricity price forecasts, starting from the well-known fact that the various forecasting models are notoriously imprecise; the question is then what risk is incurred when a given modelling technique is used, given that the forecast will probably be inaccurate. The approach adopted models the forecast errors with Extreme Value Theory, which proves reliable for modelling the extreme quantiles of the residual distribution, from 98 per cent to above 99.5 per cent, for different data-sampling frequencies. The following chapter evaluates the electrical load demanded from a distributor, first using ARMA and ARMAX models and assessing their predictive efficiency. These models are known to be appropriate for short-term forecasting, and we show by means of Monte Carlo simulations that extending them to long-term forecasts renders pointless any attempt at sophistication through the incorporation of exogenous variables: since the error incurred in any of these longer forecasts is so large, with or without exogenous variables, a simple model yields the same confidence in the mean forecast as a more sophisticated one. Finally, the last essay addresses possible nonlinearities in the data-generating process of the load demanded from a distributor, allowing that this process may not be purely linear. For this purpose, nonlinear autoregressive regime-switching models are used, which prove advantageous because they are inherently robust to possible structural breaks in the load series and are particularly well suited to modelling asymmetries in the data-generating process. We show that even simple self-excited TAR models with only two regimes, i.e. incorporating no exogenous variables, can be more appropriate than linear autoregressive models, delivering better out-of-sample forecasts while remaining computationally cheap to estimate.
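The peaks-over-threshold calculation behind such extreme quantile estimates can be sketched as follows (a generic implementation, assuming an array of forecast residuals and a nonzero GPD shape parameter; the 95% threshold choice is illustrative):

```python
import numpy as np
from scipy.stats import genpareto

def pot_quantile(errors, p=0.995, u_frac=0.95):
    """p-quantile of forecast errors via peaks over threshold (assumes xi != 0)."""
    u = np.quantile(errors, u_frac)                  # high threshold
    exc = errors[errors > u] - u                     # exceedances over u
    xi, _, beta = genpareto.fit(exc, floc=0.0)       # GPD shape and scale
    zeta = exc.size / errors.size                    # empirical P(error > u)
    # invert the tail approximation P(E > x) = zeta * (1 + xi*(x - u)/beta)^(-1/xi)
    return u + (beta / xi) * (((1.0 - p) / zeta) ** (-xi) - 1.0)

# e.g. q995 = pot_quantile(residuals) for hourly or daily price-forecast residuals
```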
174

Mesure du capital réglementaire par des modèles de risque de marché / Measure of capital requirement by market risk models

Kourouma, Lancine 11 May 2012
Following the financial and economic crisis of 2008, the regulatory capital held against banks' trading books proved significantly lower than the actual losses. To understand the causes of this shortfall, it seemed important to assess the reliability of market risk models and to propose stress-testing methodologies for managing extreme risks. The objective is to measure the regulatory capital for a trading portfolio composed of stocks and commodities, using Value at Risk (VaR) and Expected Shortfall. To this end, we use the Generalized Pareto Distribution (GPD) model and two internal models commonly used by banks: the historical simulation method and the normal-distribution model. A first reliability assessment of the three risk models under the assumption of constant volatility shows that neither the banks' internal models nor the GPD model measure the portfolio's risk correctly during crisis periods. The GPD model is, however, reliable in periods of low volatility, but strongly overestimates the actual risk, which can lead banks to set aside more regulatory capital than necessary. A second reliability assessment was carried out under the assumption of changing volatility and taking into account the asymmetric effect of financial returns. The GPD model proved the most reliable regardless of market conditions, and allowing for changing volatility improved the performance of the banks' internal models. Integrating historical and hypothetical scenarios into the risk models made it possible to assess extreme risk while reducing the subjectivity for which stress-testing techniques are criticized. Stress tests performed with the banks' internal models do not measure extreme risk correctly; the GPD model is better suited to stress testing. We developed a stress-testing algorithm that allows banks to assess the extreme risk of their portfolios and to identify the risk factors responsible for it. Computing regulatory capital as the sum of the VaR and the stressed VaR is not logical and doubles banks' regulatory capital, which in turn tightens credit to the economy. We also observe that the multiplier coefficient and the square-root-of-time rule of the Basel Accord lead banks to arbitrage in favour of unreliable risk models.
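The kind of reliability check described here can be illustrated with the standard Kupiec proportion-of-failures backtest (a generic sketch, not the author's code; `returns` and `var_series` are assumed to be aligned NumPy arrays of realized returns and out-of-sample VaR forecasts):

```python
import numpy as np
from scipy.stats import chi2

def kupiec_pof(returns, var_series, alpha=0.01):
    """Likelihood-ratio test that the VaR violation rate equals alpha."""
    hits = returns < -var_series                     # days the loss exceeded VaR
    n, x = hits.size, int(hits.sum())
    if x in (0, n):                                  # degenerate case: LR undefined
        return np.nan, np.nan
    pi = x / n                                       # observed violation rate
    lr = -2.0 * (x * np.log(alpha / pi)
                 + (n - x) * np.log((1.0 - alpha) / (1.0 - pi)))
    return lr, chi2.sf(lr, df=1)                     # reject correct coverage if p is small
```

A model that overestimates risk, as the GPD model reportedly does in calm periods, shows too few violations and fails this test from the conservative side.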
176

Aplicação da Teoria dos valores extremos em estratégias "Long-Short" / Application of extreme value theory in "long-short" strategies

Monte-mor, Danilo Soares 17 December 2010
Absolute-return funds (hedge funds) have increasingly appeared in the investment market, with the main objective of improving performance through arbitrage strategies such as long-short strategies. It is the disproportionate, and even antagonistic, evolution of asset prices that allows players to structure strategies that generate additional returns, above opportunity costs and independent of market movements. In this work, Extreme Value Theory (EVT), an important branch of probability, was used to model the series of the direct relation between the prices of two pairs of assets. The quantiles obtained from this modelling, together with the quantiles given by the normal distribution, were superimposed on data for periods subsequent to the period analysed. From the comparison of these data a new quantitative long-short arbitrage strategy was created, called the GEV Long-Short Strategy.
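A loose reconstruction of the idea in Python (the block length, quantile level and trading rule are guesses for illustration, not the thesis' specification): fit a GEV to block extremes of the price ratio and treat its extreme quantiles as trading bands.

```python
import numpy as np
from scipy.stats import genextreme

def gev_bands(ratio, block=21, p=0.95):
    """Extreme quantile bands for a price ratio from GEV fits to block extremes."""
    m = len(ratio) // block * block                  # trim to whole blocks
    blocks = np.asarray(ratio[:m]).reshape(-1, block)
    fit_hi = genextreme.fit(blocks.max(axis=1))      # GEV for block maxima
    fit_lo = genextreme.fit(-blocks.min(axis=1))     # GEV for negated block minima
    upper = genextreme.ppf(p, *fit_hi)               # "ratio unusually high" band
    lower = -genextreme.ppf(p, *fit_lo)              # "ratio unusually low" band
    return lower, upper

# e.g. short asset 1 / long asset 2 when the ratio crosses above `upper`,
# and the reverse position when it falls below `lower`.
```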
177

Value at Risk no mercado financeiro internacional: avaliação da performance dos modelos nos países desenvolvidos e emergentes / Value at Risk in international finance: evaluation of the models' performance in developed and emerging countries

Luiz Eduardo Gaio 01 April 2015
Given the requirements stipulated by regulators under international agreements, and in view of the numerous financial crises of recent decades, financial institutions have developed a variety of tools to measure and control the risk inherent in their business. Despite the ongoing evolution of risk calculation and measurement methodologies, Value at Risk (VaR) has become the reference tool for estimating market risk. In recent years new techniques for calculating VaR have been developed, yet none has emerged as the one that best fits risk across different markets and periods; the literature offers no single model consistent with the diversity of markets. This work therefore evaluates the market-risk estimates generated by VaR-based models applied to the indices of the main stock exchanges of developed and emerging countries, in both normal and crisis periods, in order to identify the most effective ones. The study considered unconditional VaR models (the traditional historical simulation, delta-normal and Student-t approaches, and models based on Extreme Value Theory); conditional VaR, comparing ARCH-family models and RiskMetrics; and multivariate VaR, with bivariate GARCH models (VECH, BEKK and CCC), copula functions (Student-t, Clayton, Frank and Gumbel) and artificial neural networks. The data consist of daily returns of the main stock indices of developed countries (Germany, United States, France, United Kingdom and Japan) and emerging countries (Brazil, Russia, India, China and South Africa) from 1995 to 2013, covering the crises of 1997 and 2008. The results differed somewhat from the premises established by the research hypotheses. Across more than a thousand estimated models, conditional models outperformed unconditional ones in most cases; in particular the GARCH(1,1) model, a standard in the literature, fitted adequately in 93% of cases. For the multivariate analysis no single model proved clearly superior: the VECH, BEKK and Clayton-copula models performed similarly, with good fits in 100% of the tests. Contrary to expectations, no significant differences were found between the fits for developed and emerging countries, or between crisis and normal periods. The study contributes the insight that the models used by financial institutions, even those recommended by renowned institutions, are not the ones that best estimate market risk. A deeper analysis of the risk estimators' performance, using simulations with each financial institution's actual portfolios, remains warranted.
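The conditional model singled out above can be sketched with the `arch` package (assumed available; `r` is a hypothetical series of daily returns in percent, and the constant-mean Gaussian GARCH(1,1) specification is one common choice among those the study tests):

```python
import numpy as np
from scipy.stats import norm
from arch import arch_model

# r: daily percentage returns (pandas Series or 1-d numpy array), assumed given
am = arch_model(r, mean="Constant", vol="GARCH", p=1, q=1, dist="normal")
res = am.fit(disp="off")
fc = res.forecast(horizon=1)
mu = fc.mean.values[-1, 0]                           # next-day conditional mean
sigma = np.sqrt(fc.variance.values[-1, 0])           # next-day conditional volatility
var_99 = -(mu + sigma * norm.ppf(0.01))              # one-day 99% conditional VaR
```

Because sigma is re-estimated each day, the VaR widens in turbulent periods, which is why conditional models tend to pass coverage backtests that unconditional ones fail.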
178

Théorie des valeurs extrêmes et applications en environnement / Extreme value theory and applications in environment

Rietsch, Théo 14 November 2013
The first two chapters of this thesis address two questions that are crucial in climatology. The first is whether a change in the behaviour of temperature extremes can be detected between the beginning of the century and today. We use the Kullback-Leibler divergence, adapted to the extreme-value context, and provide theoretical and simulation results to validate the approach. The second question is where to remove stations from a weather network so as to lose the least information about the behaviour of the extremes. An algorithm, the Query By Committee, is developed and applied to a real data set. The last chapter deals with a more theoretical subject: the robust estimation of the tail index of a Weibull-type distribution in the presence of random covariates. We propose a robust estimator based on a criterion of minimization of the divergence between two densities and study its properties.
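One plausible rendering of the first idea, as a generic numerical Kullback-Leibler divergence between GEV fits to two periods of annual maxima (not the thesis' tailored estimator; it assumes the two fitted distributions have overlapping support):

```python
import numpy as np
from scipy.stats import genextreme
from scipy.integrate import quad

def kl_between_gev_fits(maxima_early, maxima_late):
    """Numerical KL(P||Q) between GEV fits to two samples of block maxima."""
    p = genextreme(*genextreme.fit(maxima_early))
    q = genextreme(*genextreme.fit(maxima_late))
    lo, hi = p.ppf(1e-4), p.ppf(1.0 - 1e-4)          # effective support of P
    f = lambda x: p.pdf(x) * (p.logpdf(x) - q.logpdf(x))
    return quad(f, lo, hi, limit=200)[0]             # large value => changed extremes

# e.g. kl_between_gev_fits(annual_max_1900_1950, annual_max_1970_2020)
```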
179

Novelty detection with extreme value theory in vital-sign monitoring

Hugueny, Samuel Y. January 2013
Every year in the UK, tens of thousands of hospital patients suffer adverse events, such as unplanned transfers to Intensive Therapy Units or unexpected cardiac arrests. Studies have shown that in a large majority of cases, significant physiological abnormalities can be observed within the 24-hour period preceding such events. Such warning signs may go unnoticed if they occur between observations by the nursing staff, or are simply not identified as such. Timely detection of these warning signs and appropriate escalation schemes have been shown to improve both patient outcomes and the use of hospital resources, most notably by reducing patients' length of stay. Automated real-time early-warning systems appear to be cost-efficient answers to the need for continuous vital-sign monitoring. Traditionally, a limitation of such systems has been their sensitivity to noisy and artefactual measurements, resulting in false-alert rates that made them unusable in practice, or earned them the mistrust of clinical staff. Tarassenko et al. (2005) and Hann (2008) proposed a novelty detection approach to the problem of continuous vital-sign monitoring, which, in a clinical trial, was shown to yield clinically acceptable false-alert rates. In this approach, an observation is compared to a data fusion model, and its “normality” assessed by comparing a chosen statistic to a pre-set threshold. The method, while informed by large amounts of training data, has a number of heuristic aspects. This thesis proposes a principled approach to multivariate novelty detection in stochastic time-series, where novelty scores have a probabilistic interpretation and are explicitly linked to the starting assumptions made. Our approach stems from the observation that novelty detection using complex multivariate, multimodal generative models is generally an ill-defined problem when attempted in the data space. In situations where “novel” is equivalent to “improbable with respect to a probability distribution”, formulating the problem in a univariate probability space allows us to use classical results of univariate statistical theory. Specifically, we propose a multivariate extension to extreme value theory and, more generally, order statistics, suitable for performing novelty detection in time-series generated from a multivariate, possibly multimodal model. All the methods introduced in this thesis are applied to a vital-sign monitoring problem and compared to the existing method of choice. We show that it is possible to outperform the existing method while retaining a probabilistic interpretation. In addition to their application to novelty detection for vital-sign monitoring, the contributions of this thesis to extreme value theory and order statistics are also valid in the broader context of data modelling, and may be useful for analysing data from other complex systems.
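A deliberately simplified univariate version of the extreme-value novelty score conveys the flavour (the thesis' contribution is the multivariate, multimodal extension; the Gaussian vital-sign model and the numbers below are purely illustrative):

```python
from scipy.stats import norm

def novelty_score(x, mu, sigma, n):
    """P(max of n i.i.d. N(mu, sigma) draws exceeds x); small => novel."""
    return 1.0 - norm.cdf(x, loc=mu, scale=sigma) ** n

# e.g. a heart rate of 115 bpm against a fitted N(80, 10) model, window n = 60
score = novelty_score(115, 80, 10, 60)               # ~0.014 for these numbers
alert = score < 0.01                                 # probabilistic alert threshold
```

The score has a direct probabilistic reading (how surprising is this extremum under the model?), which is the interpretability the abstract contrasts with heuristic thresholds.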
180

Doba nezaměstnanosti v České republice pohledem analýzy přežití / Unemployment Duration in the Czech Republic Through the Lens of Survival Analysis

Čabla, Adam January 2017
The aim of this thesis is to apply survival analysis methods to data from the Labour Force Survey, which are interval-censored. Given this type of data, I use methods designed specifically for it, in particular the Turnbull estimate, the weighted log-rank test and the AFT model. A further objective is the design, application and interpretation of a methodology for modelling unemployment duration as a function of the available factors. The thesis also evaluates the evolution of the probability distribution of unemployment duration and, last but not least, constructs a more accurate estimate of the tail using extreme value theory. The main contributions include a methodology for examining Labour Force Survey data based on standard survival-analysis techniques; since the data are internationally comparable, the methodology is applicable across European Union countries and several others. Further contributions are the estimation of the parameters of the generalized Pareto distribution on interval-censored data and the construction and comparison of models of piecewise-connected distribution functions, with a solution to the connection problem. Empirically, the most important results are the comparison of three different data approaches and the specific relationships between selected factors and the time to find a job or the duration of unemployment.
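The Turnbull estimate named above can be sketched as the classical self-consistency (EM) algorithm for interval-censored durations (a generic implementation, not the author's code; the quarterly example intervals are hypothetical):

```python
import numpy as np

def turnbull(left, right, n_iter=1000, tol=1e-8):
    """Self-consistency (EM) masses for interval-censored data (left, right]."""
    pts = np.unique(np.concatenate([left, right]))   # candidate support points
    A = (pts[None, :] > left[:, None]) & (pts[None, :] <= right[:, None])
    p = np.full(pts.size, 1.0 / pts.size)            # uniform starting masses
    for _ in range(n_iter):
        denom = A @ p                                # mass in each observation's interval
        p_new = (A * p / denom[:, None]).mean(axis=0)
        if np.abs(p_new - p).max() < tol:
            return pts, p_new
        p = p_new
    return pts, p                                    # survival curve: 1 - cumsum(p)

# e.g. unemployment spells known only to quarters: (0,3], (3,6], (0,6] months
pts, mass = turnbull(np.array([0.0, 3.0, 0.0]), np.array([3.0, 6.0, 6.0]))
```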
