121

Non-Life Excess of Loss Reinsurance Pricing / Oceňování zajištění škodního nadměrku v neživotním pojištění

Hrevuš, Jan. January 2010.
Reinsurance is most often described as insurance for insurance companies: through reinsurance the cedant (the insurance company) cedes part of its risk to the reinsurer. Reinsurance now plays a crucial role in the insurance industry, as it not only reduces the reinsured's exposure but can also significantly reduce the required solvency capital. Over the past few decades various approaches to actuarial reinsurance modelling have been published, and many actuaries now specialise exclusively in reinsurance. The thesis provides an overview of the actuarial aspects of modelling a non-life excess of loss structure, per risk and, for motor third party liability, per event; to the author's knowledge no study of such wide scope exists, and the various aspects otherwise have to be gathered from fragmented articles published worldwide. The thesis is based on recent industry literature describing the latest trends and methodologies, and the theory is compared with practice, as the author has working experience in underwriting at a CEE reinsurer and in actuarial reinsurance modelling at a global reinsurance broker. The topics are treated in the order of the steps an actuary takes when modelling reinsurance, and each step is discussed in detail. Starting with data preparation, the thesis covers loss inflation, introduces several individual claims development methods, and constructs its own probabilistic model. Burning cost analysis and probabilistic rating focused on heavy-tailed distributions are then discussed. Special attention is given to exposure rating, a discipline not widely known among actuaries outside the reinsurance industry, and different methodologies for property and casualty exposure modelling are introduced together with many best-practice suggestions. All main approaches to reinsurance modelling are also illustrated on either real or realistic data, similar to those provided by European insurance companies to their reinsurers during renewal periods.
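
As an illustration of the kind of probabilistic rating of an excess of loss layer described above (a minimal sketch, not code from the thesis), the snippet below fits a Pareto distribution to large losses and prices a hypothetical 4 xs 1 layer by its expected annual loss; the loss data, threshold and claim frequency are made up.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical large-loss history (losses above a 1m reporting threshold), in millions.
losses = stats.pareto.rvs(b=1.8, scale=1.0, size=300, random_state=rng)

# Fit a Pareto tail by maximum likelihood (location fixed at 0).
alpha, loc, scale = stats.pareto.fit(losses, floc=0)

# Price a 4m xs 1m layer: expected annual layer loss = frequency * E[min(X, 5) - 1 | X > 1].
deductible, limit = 1.0, 4.0
annual_frequency = 12.0  # hypothetical expected number of losses above the threshold per year

grid = np.linspace(deductible, deductible + limit, 2001)
cond_survival = (stats.pareto.sf(grid, alpha, loc=loc, scale=scale)
                 / stats.pareto.sf(deductible, alpha, loc=loc, scale=scale))
expected_layer_loss_per_claim = np.trapz(cond_survival, grid)  # integral of conditional survival over the layer

expected_layer_loss = annual_frequency * expected_layer_loss_per_claim
print(f"Expected annual loss to the 4 xs 1 layer: {expected_layer_loss:.2f}m")
```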
122

Modélisation de la structure de dépendance d'extrêmes multivariés et spatiaux / Modelling the dependence structure of multivariate and spatial extremes

Béranger, Boris. 18 January 2016.
Projection of future extreme events is a major issue in a large number of areas, including the environment and risk management. Although univariate extreme value theory is well understood, complexity increases when trying to understand the joint extreme behaviour of two or more variables. Particular interest is given to events that are spatial by nature and which define the context of infinite dimensions. Under the assumption that events correspond marginally to univariate extremes, the main focus is then on the dependence structure that links them. First, we provide a review of parametric dependence models in the multivariate framework and illustrate different estimation strategies. The spatial extension of multivariate extremes is introduced through max-stable processes. We derive the finite-dimensional distribution of the widely used Brown-Resnick model, which permits inference via full and composite likelihood methods. We then use skew-symmetric distributions to develop a spectral representation of a wider max-stable model: the extremal skew-t model, from which most models available in the literature can be recovered. This model has the nice advantage of exhibiting skewness and nonstationarity, two properties often held by environmental spatial events, and it enables a larger spectrum of dependence structures. Indicators of extremal dependence can be calculated using its finite-dimensional distribution. Finally, we introduce a kernel-based non-parametric estimation procedure for univariate and multivariate tail densities and apply it to model selection. Our method is illustrated by the example of selection of physical climate models.
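
For concreteness (my own addition, not an excerpt from the thesis): the bivariate finite-dimensional margin of the Brown-Resnick model is the Hüsler-Reiss distribution. With unit Fréchet margins and Φ the standard normal distribution function, one common parametrisation reads:

```latex
P\{Z(s_1) \le x,\; Z(s_2) \le y\} = \exp\{-V(x,y)\}, \qquad
V(x,y) = \frac{1}{x}\,\Phi\!\left(\lambda + \frac{1}{2\lambda}\log\frac{y}{x}\right)
       + \frac{1}{y}\,\Phi\!\left(\lambda + \frac{1}{2\lambda}\log\frac{x}{y}\right)
```

where λ = sqrt(γ(s_1 - s_2))/2 with γ the variogram of the underlying Gaussian process (conventions differ by a factor of two if the semivariogram is used instead). Pairwise composite likelihoods are built from exactly such low-dimensional margins.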
123

Modèle de mélange et modèles linéaires généralisés, application aux données de co-infection (arbovirus & paludisme) / Mixture model and generalized linear models, application to co-infection data (arbovirus & malaria)

Loum, Mor Absa. 28 August 2018.
In this thesis we are interested in mixture models and generalized linear models, with an application to co-infection data on arboviruses and malaria parasites. After a first part devoted to the study of co-infection with a multinomial logistic model, we study mixtures of generalized linear models in a second part. The proposed method for estimating the parameters of the mixture combines a method of moments with a spectral method. Finally, a last part is devoted to mixtures of extreme values under random censoring; the estimation method proposed there proceeds in two steps based on likelihood maximisation.
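
As background notation (added here for clarity, not taken from the thesis), a K-component mixture of generalized linear models for a response Y given covariates x can be written as:

```latex
f(y \mid x) = \sum_{k=1}^{K} \pi_k \, f_k\!\left(y;\, \mu_k(x), \phi_k\right), \qquad
g\!\left(\mu_k(x)\right) = x^{\top}\beta_k, \qquad \sum_{k=1}^{K}\pi_k = 1
```

where each f_k is an exponential-family density with dispersion φ_k and g is the link function; the mixing weights π_k and regression coefficients β_k are the parameters the thesis estimates by combining moment and spectral methods.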
124

Variability and long-term trends of climate extremes over the Limpopo, South Africa

Sikhwari, Thendo. 20 September 2019.
MENVSC / Department of Geography and Geo-Information Sciences / Climate change has a crucial impact on livelihoods, the economy and water resources through the occurrence of extreme weather and climate events such as floods, droughts and heat waves. Extreme weather events have been increasing worldwide, hence the need to understand their nature and trends. The aim of this study was to analyse the spatial variability and long-term trends of climate extremes over the Limpopo province of South Africa from 1960 to 2014. Rainfall, temperature and circulation fields were analysed to understand the extent and nature of climate extremes over the Limpopo. Extreme value theory (EVT), which provides statistical models for rarely observed events, was also employed. The R statistical software, which offers a variety of clustering functions, was used for the cluster analysis. Any station whose value exceeded the 95th percentile on any day of the season was considered a widespread extreme event. The results show that the study area is highly vulnerable to extreme events due to its latitudinal location and low altitude. Anomalous cut-off lows, tropical cyclones and tropical storms are the major extreme-producing systems affecting the Limpopo province, while the Botswana High becomes dominant during heat waves and drought. Extreme weather events are common in Limpopo during summer and often coincide with mature phases of the El Nino Southern Oscillation. After a suitable model was chosen for the data, return levels of extreme maximum rainfall were derived. The fitted model gives a 5-year return level of approximately 223.89 mm, meaning that monthly rainfall of 223.89 mm or more should occur at that station on average once every five years. / NRF / http://hdl.handle.net/11602/1485
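
To make the return-level statement above concrete (a minimal sketch on simulated placeholder data, not the study's code or data), a T-year return level is the (1 - 1/T) quantile of a GEV fitted to block maxima:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Placeholder block maxima (mm) standing in for the station rainfall series used in the study.
block_maxima = stats.genextreme.rvs(c=-0.1, loc=120, scale=40, size=55, random_state=rng)

# Fit the GEV by maximum likelihood (scipy's shape c is the negative of the usual xi).
c_hat, loc_hat, scale_hat = stats.genextreme.fit(block_maxima)

# The T-year return level is the (1 - 1/T) quantile of the fitted GEV.
T = 5
return_level = stats.genextreme.ppf(1 - 1 / T, c_hat, loc=loc_hat, scale=scale_hat)
print(f"Estimated {T}-year return level: {return_level:.1f} mm")
```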
125

Échantillonnage et inférence dans réseaux complexes / Sampling and inference in complex networks

Kazhuthuveettil Sreedharan, Jithin. 02 December 2016.
The recent emergence of large networks, mainly due to the rise of online social networks, has brought out the difficulty of gathering a complete picture of a network and prompted the development of new distributed techniques. In this thesis, we design and analyze algorithms based on random walks and diffusion for sampling, estimation and inference of network functions, and for approximating the spectrum of graph matrices. The thesis starts with the classical problem of finding the dominant eigenvalues and eigenvectors of symmetric graph matrices, such as the Laplacian of an undirected graph. Using the fact that the eigenspectrum is associated with a Schrödinger-type differential equation, we develop scalable techniques based on diffusion over the graph and on gossiping algorithms; they are also adaptable to a simple algorithm based on quantum computing. Next, we consider sampling and estimation of network functions (sums and averages) using random walks on the graph. In order to avoid the burn-in time of random walks, using the idea of regeneration at revisits to a fixed node, we develop an estimator of the aggregate function which is non-asymptotically unbiased and derive an approximation to its Bayesian posterior. An estimator based on reinforcement learning is also developed making use of regeneration. The final part of the thesis deals with the use of extreme value theory to make inference from the stationary samples of the random walks. Extremal events, such as the first hitting time of a large-degree node, order statistics and mean cluster size, are well captured by the parameter "extremal index", which we study theoretically and estimate for different random walk sampling techniques.
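
The sketch below (my own illustration, not the thesis's regeneration-based estimator) shows the basic idea of estimating a network average from a random walk by reweighting each visited node by 1/degree, on a toy networkx graph:

```python
import random
import networkx as nx

random.seed(0)

# Toy graph standing in for a large network that cannot be crawled completely.
G = nx.barabasi_albert_graph(n=5000, m=3)

def random_walk_average(G, f, steps=20000, start=0):
    """Estimate the average of f over all nodes from a simple random walk.

    The walk's stationary distribution is proportional to node degree, so each
    sample is reweighted by 1/degree (a standard correction, not the
    regeneration-based estimator developed in the thesis).
    """
    node = start
    num, den = 0.0, 0.0
    for _ in range(steps):
        node = random.choice(list(G.neighbors(node)))
        w = 1.0 / G.degree(node)
        num += w * f(node)
        den += w
    return num / den

# Example: estimate the average degree and compare with the exact value.
est = random_walk_average(G, lambda v: G.degree(v))
exact = sum(d for _, d in G.degree()) / G.number_of_nodes()
print(f"random-walk estimate: {est:.2f}, exact: {exact:.2f}")
```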
126

Fitting extreme value distributions to the Zambezi River flood water levels recorded at Katima Mulilo in Namibia (1965-2003)

Kamwi, Innocent Silibelo. January 2005.
Magister Scientiae - MSc / This study sought to identify and fit the appropriate extreme value distribution to flood data using the method of maximum likelihood, to examine the uncertainty of the estimated parameters, and to evaluate the goodness of fit of the identified model. The study revealed that the three-parameter Weibull and the generalised extreme value (GEV) distributions fit the data very well. Standard errors for the estimated parameters were calculated from the empirical information matrix. An upper limit to the flood levels followed from the fitted distribution.
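
As a rough sketch of that workflow (simulated placeholder data standing in for the 1965-2003 Katima Mulilo record, and a numerical Hessian used as the observed/empirical information), maximum likelihood estimates and approximate standard errors for a GEV fit might be obtained like this:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Placeholder annual maximum flood levels (metres).
levels = stats.genextreme.rvs(c=-0.1, loc=5.5, scale=0.8, size=39, random_state=rng)

# Maximum likelihood fit of the GEV (scipy's shape c is the negative of the usual xi).
theta_hat = np.array(stats.genextreme.fit(levels))

def neg_log_lik(theta):
    c, loc, scale = theta
    return -np.sum(stats.genextreme.logpdf(levels, c, loc=loc, scale=scale))

def hessian(f, x, eps=1e-4):
    """Central-difference Hessian, here the observed information of the GEV fit."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei, ej = np.eye(n)[i] * eps, np.eye(n)[j] * eps
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * eps**2)
    return H

info = hessian(neg_log_lik, theta_hat)
std_errors = np.sqrt(np.diag(np.linalg.inv(info)))
print("MLEs (c, loc, scale):", theta_hat)
print("approximate standard errors:", std_errors)
```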
127

Extreme Value Theory Applied to Securitizations Rating Methodology / Extremvärdesteori tillämpat på värdepapperisering

Barbouche, Tarek. January 2017.
One of today's financial trends is securitization. Evaluating securitization risk requires strong quantitative skills and a deep understanding of both credit and market risk. For international securitization programs it is mandatory to take exchange-rate risk into account. We present the different methods for evaluating extreme variations of exchange rates using extreme value theory and Monte Carlo simulations.
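
A minimal sketch of how the two ingredients named above can be combined (peaks over threshold for the tail, Monte Carlo for simulation); the FX return series, threshold choice and confidence level are hypothetical, and this is not the methodology of the thesis itself:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Placeholder daily FX log returns standing in for the real exchange-rate series.
returns = stats.t.rvs(df=4, scale=0.004, size=2500, random_state=rng)
losses = -returns  # work with losses so "extreme" means a large depreciation

# Peaks over threshold: fit a generalized Pareto distribution to exceedances over u.
u = np.quantile(losses, 0.95)
exceedances = losses[losses > u] - u
xi, _, beta = stats.genpareto.fit(exceedances, floc=0)

# Tail quantile (VaR) at level p from the standard POT formula.
p = 0.999
n, n_u = len(losses), len(exceedances)
var_p = u + (beta / xi) * (((n / n_u) * (1 - p)) ** (-xi) - 1)

# Monte Carlo check: resample the body empirically and draw the tail from the fitted GPD.
sim = rng.choice(losses[losses <= u], size=100_000)
tail = rng.random(100_000) < n_u / n
sim[tail] = u + stats.genpareto.rvs(xi, loc=0, scale=beta, size=tail.sum(), random_state=rng)
print(f"POT VaR({p}): {var_p:.4%}, Monte Carlo VaR({p}): {np.quantile(sim, p):.4%}")
```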
128

[en] ANALYSIS OF EXTREME VALUES THEORY AND MONTE CARLO SIMULATION FOR THE CALCULATION OF VALUE-AT-RISK IN STOCK PORTFOLIOS / [pt] ANÁLISE DA TEORIA DOS VALORES EXTREMOS E DA SIMULAÇÃO DE MONTE CARLO PARA O CÁLCULO DO VALUE-AT-RISK EM CARTEIRAS DE INVESTIMENTOS DE ATIVOS DE RENDA VARIÁVEL

GUSTAVO JARDIM DE MORAIS. 16 July 2018.
[en] After the recent financial crises that hit markets around the world, most notably in 2008/2009, but also the Eastern European crisis of July 2007, the Russian moratorium of October 1998 and, in Brazil, the change of the exchange rate regime in January 1999, financial institutions incurred large losses in each of these events, and one of the main questions raised about financial models concerned risk management. The various methods of calculating Value-at-Risk, as well as the simulations and scenarios produced by analysts, could neither anticipate the magnitude of these events nor prevent the crises from worsening. I therefore set out to study financial risk management systems, since they can and must be improved if even larger financial disasters are to be avoided. Although the literature on the subject is vast, the methodologies for calculating Value-at-Risk are not exact or free of flaws. In this context, it is necessary to develop and improve risk management tools that can support a better allocation of the available resources by assessing the level of risk to which an investment is exposed and its compatibility with the expected return.
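
A bare-bones Monte Carlo VaR calculation of the kind referred to above might look like this (a sketch only; the portfolio weights, return model and parameters are placeholders, not figures from the thesis):

```python
import numpy as np

rng = np.random.default_rng(123)

# Placeholder setup: daily log returns of a 3-stock portfolio modelled as multivariate normal.
weights = np.array([0.5, 0.3, 0.2])
mu = np.array([0.0004, 0.0003, 0.0005])
cov = np.array([[4.0, 1.5, 1.0],
                [1.5, 3.0, 0.8],
                [1.0, 0.8, 2.5]]) * 1e-4

# Monte Carlo simulation of one-day portfolio returns.
sims = rng.multivariate_normal(mu, cov, size=100_000) @ weights

# 99% one-day VaR: the loss exceeded on only 1% of simulated days.
var_99 = -np.quantile(sims, 0.01)
print(f"99% one-day Monte Carlo VaR: {var_99:.3%} of portfolio value")
```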
129

Value at risk and expected shortfall : traditional measures and extreme value theory enhancements with a South African market application

Dicks, Anelda.
Thesis (MComm)--Stellenbosch University, 2013. / ENGLISH ABSTRACT: Accurate estimation of Value at Risk (VaR) and Expected Shortfall (ES) is critical in the management of extreme market risks. These risks occur with small probability, but their financial impact can be large. Traditional models for estimating VaR and ES are investigated. Following usual practice, 99% 10-day VaR and ES measures are calculated. A comprehensive theoretical background is first provided and the models are then applied to the Africa Financials Index from 29/01/1996 to 30/04/2013. The models considered include independent, identically distributed (i.i.d.) models and Generalized Autoregressive Conditional Heteroscedasticity (GARCH) stochastic volatility models. Extreme Value Theory (EVT) models, which focus specifically on extreme market returns, are also investigated; for this, the Peaks Over Threshold (POT) approach to EVT is followed. For the calculation of VaR, various scaling methods from one day to ten days are considered and their performance evaluated. The GARCH models fail to converge during periods of extreme returns, and during these periods EVT forecasts may be used instead. As a novel approach, this study considers augmenting the GARCH models with EVT forecasts. The two-step procedure of pre-filtering with a GARCH model and then applying EVT, as suggested by McNeil (1999), is also investigated. The study identifies some of the practical issues in model fitting and shows that no single forecasting model is universally optimal: the choice depends on the nature of the data. For this data series, the best approach was to augment the GARCH stochastic volatility models with EVT forecasts during the periods in which the former do not converge. Model performance is judged by comparing the actual number of VaR and ES violations with the expected number, taken as the number of return observations over the entire sample period multiplied by 0.01 for the 99% VaR and ES calculations.
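
The McNeil-style two-step procedure mentioned above can be sketched as follows (a hedged illustration on simulated data, using the third-party arch package for the GARCH step; none of this is the thesis's code, and the 10-day scaling it compares is only noted in a comment):

```python
import numpy as np
from scipy import stats
from arch import arch_model  # assumes the 'arch' package is installed

rng = np.random.default_rng(5)

# Placeholder daily log returns (in %) standing in for the Africa Financials Index series.
returns = 100 * stats.t.rvs(df=5, scale=0.01, size=3000, random_state=rng)

# Step 1: pre-filter with a GARCH(1,1) model to obtain standardized residuals.
res = arch_model(returns, vol="Garch", p=1, q=1, dist="normal").fit(disp="off")
z = -(res.resid / res.conditional_volatility)  # standardized losses

# Step 2: apply EVT (peaks over threshold) to the standardized losses.
u = np.quantile(z, 0.95)
exc = z[z > u] - u
xi, _, beta = stats.genpareto.fit(exc, floc=0)
q = 0.99
z_q = u + (beta / xi) * (((len(z) / len(exc)) * (1 - q)) ** (-xi) - 1)

# One-day-ahead conditional VaR; the thesis also compares scalings to a 10-day horizon
# (e.g. the square-root-of-time rule), which is omitted here.
fcast = res.forecast(horizon=1)
sigma_next = float(np.sqrt(fcast.variance.values[-1, 0]))
mu = float(res.params["mu"])
var_99 = -mu + sigma_next * z_q
print(f"99% one-day conditional VaR: {var_99:.2f}% of portfolio value")
```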
130

用極值理論分析次級房貸風暴的衝擊-以全球市場為例 / Using extreme value theory to analyze the US sub-prime mortgage crisis on the global stock market

彭富忠 (Peng, Fu Chung). Date unknown.
The US sub-prime mortgage crisis greatly affected not only the US economy but also other countries around the world. This thesis employs extreme value theory and Value at Risk (VaR) analysis to assess the impact of the US sub-prime mortgage crisis on various MSCI stock market indexes, covering 10 countries and 7 regions. It is reasonable to conjecture that VaR should increase after the crisis. The empirical analyses of these indexes conclude that (1) the American market indexes do not agree with this conjecture after the crisis, and moreover four American indexes are identical; (2) not all the Asian market indexes are consistent with the conjecture; (3) the European market indexes agree with the conjecture; (4) MSCI AC PACIFIC, NEW ZEALAND and AUSTRALIA are consistent with the conjecture; (5) for some MSCI indexes the behaviour of the positive log returns differs from that of the negative returns. Overall, the impact of the US sub-prime mortgage crisis was not the same across these countries.
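
To illustrate the before/after comparison underlying conclusions (1)-(4) (a sketch on simulated data, reusing the same peaks-over-threshold VaR construction as in the earlier snippets; the split date and return models are hypothetical):

```python
import numpy as np
from scipy import stats

def evt_var(returns, q=0.99, tail_frac=0.10):
    """One-day VaR at level q from a GPD fitted to the largest losses (POT)."""
    losses = -np.asarray(returns)
    u = np.quantile(losses, 1 - tail_frac)
    exc = losses[losses > u] - u
    xi, _, beta = stats.genpareto.fit(exc, floc=0)
    return u + (beta / xi) * (((len(losses) / len(exc)) * (1 - q)) ** (-xi) - 1)

rng = np.random.default_rng(9)
# Placeholder daily log returns for one MSCI index, split at a hypothetical crisis date;
# the post-crisis sample is given fatter tails to illustrate the comparison.
pre = stats.t.rvs(df=6, scale=0.009, size=1000, random_state=rng)
post = stats.t.rvs(df=3, scale=0.012, size=1000, random_state=rng)

print(f"pre-crisis  99% VaR: {evt_var(pre):.3%}")
print(f"post-crisis 99% VaR: {evt_var(post):.3%}")
```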
