1

Risk-Taking Evidence from the Insurance Industry: Panel Data Threshold Regression Model and Extreme Value Theory

Tsen, Hsiao-ping 12 July 2007 (has links)
The number of insurance companies in Taiwan has grown rapidly since insurance deregulation in 1992. The main challenge the industry faces is declining profit due to the growing number of competitors. An insurer that confronts this problem and offers a solution will enjoy better solvency. We therefore explore two issues: one is to investigate the relationship between asset risk and capital adjustment decisions in Taiwan's life insurance industry from 1993 to 2005, and the other is to provide empirical evidence on the retention limit of excess-of-loss reinsurance in Taiwan's property insurance industry. On the first issue, a life insurance company with more capital or a higher capital ratio bears less risk and has better solvency; however, this also brings higher opportunity cost, which means a lower average profit in the long run. Although the topic has been intensively discussed, there is no consensus on how to balance capital adjustment against risk-taking decisions in the life insurance industry. Therefore, with the methodology of panel data threshold regression, we divide life insurance companies into two categories according to the ratio of life insurance and annuity insurance premiums to total premiums: indemnification-oriented life insurers and savings-oriented life insurers. In conclusion, we identify a negative correlation between capital ratio and risk for indemnification-oriented life insurers, and a positive correlation between capital ratio and risk for savings-oriented life insurers. On the second issue, because of the recent increase in natural disasters in Taiwan, property insurance companies must face risks that reinsurance companies are unwilling to underwrite, so excess-of-loss reinsurance has become a viable solution in Taiwan's property insurance industry. We apply extreme value theory to the tail of Taiwan property insurance claims to estimate VaR and to calculate the retention limit of excess-of-loss reinsurance. The empirical results show that the distribution of Taiwan property insurance claims is fat-tailed. We suggest using the Generalized Pareto Distribution (GPD) to model the extreme losses and derive the retention limit of excess-of-loss reinsurance.
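As an illustration of the peaks-over-threshold recipe the abstract describes (not the thesis's code or data), the following sketch fits a GPD to synthetic claim excesses and reads off a high quantile as a candidate retention limit; the threshold and quantile level are assumptions:

```python
# Sketch: peaks-over-threshold VaR as a candidate retention limit.
# `claims`, the threshold u and the 99.5% level are illustrative choices.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
claims = rng.pareto(1.5, size=5000) * 1e5   # synthetic heavy-tailed claims (placeholder)

u = np.quantile(claims, 0.95)               # threshold: 95th sample percentile (assumption)
exceedances = claims[claims > u] - u
xi, _, beta = genpareto.fit(exceedances, floc=0.0)   # GPD fit to the excesses over u

def pot_var(p, n, n_u, u, xi, beta):
    """POT quantile estimate: u + (beta/xi) * (((n/n_u)*(1-p))**(-xi) - 1)."""
    return u + (beta / xi) * (((n / n_u) * (1.0 - p)) ** (-xi) - 1.0)

retention = pot_var(0.995, len(claims), len(exceedances), u, xi, beta)
print(f"xi={xi:.3f}, beta={beta:.1f}, candidate retention limit = {retention:,.0f}")
```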
2

On Value-at-Risk and the more extreme : A study on quantitative market risk measurements

Lindholm, Dennis January 2015 (has links)
In line with the third pillar of the Basel accords, quantitative market risk measurements are investigated and evaluated, comparing JP Morgan's RiskMetrics and Bollerslev's GARCH with the Peaks over Threshold and Block Maxima approaches from the Extreme Value Theory framework. Value-at-Risk and Expected Shortfall (Conditional Value-at-Risk), at 95% and 99% confidence, are predicted for 25 years of the OMXS30. The study finds Bollerslev's suggested t distribution to be a more appropriate distributional assumption, but no evidence to prefer GARCH over RiskMetrics. The more demanding Extreme Value Theory procedures trail behind, as they are found wasteful of data and more difficult to backtest and therefore evaluate.
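For reference, RiskMetrics amounts to an exponentially weighted moving-average variance with a fixed decay factor. A minimal sketch under placeholder returns and the standard daily lambda of 0.94; the Gaussian quantile used here is exactly the distributional assumption the study finds wanting:

```python
# Sketch: RiskMetrics-style EWMA variance and a one-day 99% VaR forecast.
# `returns` is a placeholder series, not the OMXS30 data used in the thesis.
import numpy as np
from scipy.stats import norm

def ewma_var_forecast(returns, lam=0.94):
    """RiskMetrics recursion: sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_{t-1}**2."""
    sigma2 = np.var(returns[:30])            # seed with an initial sample variance
    for r in returns:
        sigma2 = lam * sigma2 + (1.0 - lam) * r * r
    return sigma2

returns = np.random.default_rng(1).normal(0.0, 0.01, size=2500)  # placeholder daily returns
sigma = np.sqrt(ewma_var_forecast(returns))
var_99 = -norm.ppf(0.01) * sigma             # one-day 99% VaR under a normal assumption
print(f"sigma = {sigma:.4%}, one-day 99% VaR = {var_99:.4%}")
```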
3

Improved estimation procedures for a positive extreme value index

Berning, Thomas Louw 12 1900 (has links)
Thesis (PhD (Statistics))--University of Stellenbosch, 2010. / ENGLISH ABSTRACT: In extreme value theory (EVT) the emphasis is on extreme (very small or very large) observations. The crucial parameter when making inferences about extreme quantiles is called the extreme value index (EVI). This thesis concentrates on only the right tail of the underlying distribution (extremely large observations), and specifically situations where the EVI is assumed to be positive. A positive EVI indicates that the underlying distribution of the data has a heavy right tail, as is the case with, for example, insurance claims data. There are numerous areas of application of EVT, since there are a vast number of situations in which one would be interested in predicting extreme events accurately. Accurate prediction requires accurate estimation of the EVI, which has received ample attention in the literature from a theoretical as well as practical point of view. Countless estimators of the EVI exist in the literature, but the practitioner has little information on how these estimators compare. An extensive simulation study was designed and conducted to compare the performance of a wide range of estimators, over a wide range of sample sizes and distributions. A new procedure for the estimation of a positive EVI was developed, based on fitting the perturbed Pareto distribution (PPD) to observations above a threshold, using Bayesian methodology. Attention was also given to the development of a threshold selection technique. One of the major contributions of this thesis is a measure which quantifies the stability (or rather instability) of estimates across a range of thresholds. This measure can be used to objectively obtain the range of thresholds over which the estimates are most stable. It is this measure which is used for the purpose of threshold selection for the proposed PPD estimator. A case study of five insurance claims data sets illustrates how data sets can be analyzed in practice. It is shown to what extent discretion can/should be applied, as well as how different estimators can be used in a complementary fashion to give more insight into the nature of the data and the extreme tail of the underlying distribution. The analysis is carried out from the raw data to the construction of tables which can be used directly to gauge the risk of the insurance portfolio over a given time frame.
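For orientation only: the classical Hill estimator is one of the many EVI estimators such a comparison covers, and its sensitivity to the choice of threshold is precisely what a stability measure must contend with. A minimal sketch on synthetic Pareto data (this is not the PPD-based procedure developed in the thesis):

```python
# Sketch: Hill estimator of a positive EVI across a range of top-k thresholds.
import numpy as np

def hill_estimator(x, k):
    """Hill estimate from the k largest observations:
    (1/k) * sum_{i=1..k} log(X_(n-i+1) / X_(n-k))."""
    xs = np.sort(x)
    return np.mean(np.log(xs[-k:] / xs[-k - 1]))

rng = np.random.default_rng(2)
sample = rng.pareto(2.0, size=2000) + 1.0   # Pareto with alpha = 2: true EVI = 0.5
for k in (50, 100, 200, 400):               # stability of the estimate across thresholds
    print(f"k = {k:4d}  EVI estimate = {hill_estimator(sample, k):.3f}")
```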
4

Bivariate extreme value analysis of commodity prices

Joyce, Matthew 21 April 2017 (has links)
The crude oil, natural gas, and electricity markets are among the most widely traded and talked about commodity markets across the world. Over the past two decades each commodity has seen price volatility due to political, economic, social, and technological reasons. With that comes a significant amount of risk that both corporations and governments must account for to ensure expected cash flows and to minimize losses. This thesis analyzes the portfolio risk of the major US commodity hubs for crude oil, natural gas and electricity by applying Extreme Value Theory to historical daily price returns between 2003 and 2013. The risk measures used are Value-at-Risk and Expected Shortfall, estimated by fitting the Generalized Pareto Distribution to the data using the peaks-over-threshold method. We consider both the univariate and bivariate cases in order to determine the effects that price shocks within and across commodities will have in a mixed portfolio. The results show that electricity is the most volatile, and therefore most risky, commodity of the three markets considered, for both positive and negative returns. In addition, we find that the univariate and bivariate results are statistically indistinguishable, leading to the conclusion that for the three markets analyzed during this period, price shocks in one commodity do not directly impact the volatility of another commodity's price.
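The finding that univariate and bivariate results coincide can be probed with a crude empirical check of joint tail behaviour. The sketch below illustrates this under synthetic, partially dependent returns; the series names are placeholders, not the thesis's data:

```python
# Sketch: naive empirical upper-tail dependence between two return series,
# in the spirit of asking whether extremes in one commodity coincide with
# extremes in another. Both series are synthetic placeholders.
import numpy as np

def upper_tail_dependence(x, y, q=0.95):
    """Estimate P(Y > F_Y^{-1}(q) | X > F_X^{-1}(q)) from sample quantiles."""
    qx, qy = np.quantile(x, q), np.quantile(y, q)
    return np.mean((x > qx) & (y > qy)) / (1.0 - q)

rng = np.random.default_rng(3)
oil = 0.02 * rng.standard_t(4, size=2600)               # placeholder "crude oil" returns
gas = 0.3 * oil + 0.02 * rng.standard_t(4, size=2600)   # partially dependent "natural gas"
print(f"upper-tail dependence estimate: {upper_tail_dependence(oil, gas):.2f}")
```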
5

Förstudie till införandet av centralt loggsystem hos Försvarsmakten / Prestudy for the Introduction of a Central Logging System for the Swedish Armed Forces

Hellqvist, Olof January 2011 (has links)
Modern IT systems tend to become more and more complex, while the number of active systems in companies increases. Furthermore, the number of security-related incidents is at an all-time high. These new conditions impose new demands on organizations. For example, it is no longer possible to manually collect and examine the systems' log messages. The purpose of this thesis has been to make a comprehensive study of solutions for automated collection and management of log messages, to analyze the Swedish Armed Forces' specification for central log collection and management, and to evaluate existing solutions. The work consisted primarily of literature studies and evaluations of two products selected by the Swedish Armed Forces: NetIQ Security Manager and Splunk. The conclusion was that neither of the two products met the non-optional requirements posed by the specification. I personally think that the Swedish Armed Forces' requirements specification for central log management is far too strict and should hence be revised. A number of requirements in the current specification can be removed; other requirements should be reformulated and/or re-evaluated.
6

The Effect of Processing Conditions on the Energetic Diagram of CdTe Thin Films Studied by Photoluminescence

Collins, Shamara P. 02 July 2018 (has links)
The photovoltaic properties of CdTe-based thin films depend on recombination levels formed in the CdTe layer and at the heterojunction. The localized states result from structural defects (metal sublattice, chalcogen sublattice, interstitial), controlled doping, the deposition process, and/or post-deposition annealing. Photoluminescence study of CdTe thin films, from both the bulk and the heterojunction, can reveal radiative states due to different defects or impurities. Identification of defects allows for potential explanation of their roles and influence on solar cell performance. A thorough understanding of the material properties responsible for solar cell performance is critical to further advancing device efficiency. The presented work is a systematic investigation using photoluminescence to study CdTe thin films with varying deposition processes. The thin (polycrystalline) films explored in this study were deposited by either the elemental vapor transport technique (EVT) or close spaced sublimation (CSS). Two device architectures were investigated: the typical CdTe/CdS device and the CdSexTe1-x (CST) alloy device. Post-deposition annealing processes were either laser or thermal. The study of the CdTe thin films is grouped in three general categories: (a) EVT films, intrinsic and extrinsic (Group V: Sb and P), (b) CST alloys, and (c) post-deposition laser annealed (LA) films. The main goal of this dissertation is to understand the influence of fabrication procedures (deposition conditions, post-deposition thermal and chemical treatments, added impurities, and device architecture) on the defect structure of the CdTe thin films. The behavior of the photoluminescence (PL), studied as a function of measurement temperature and excitation intensity, provides insight into the mechanisms causing the radiative recombination levels. Analysis of the PL spectra for CdTe films with intrinsic doping demonstrated stoichiometric control of native defects for both the Cd- and Te-rich conditions. PL spectra of CdTe:Sb films showed unique Sb-related bands. Also, impurity-related defects were identified in the CdTe:P spectra. The spectral analysis supports the need for optimization of dopant concentration. The effects of selenium (Se) thickness and post-deposition processing on the formation of the CST alloy were demonstrated in the changing PL spectra. The native defects (and complexes) identified in films with thermal anneal processing were the same as those identified in films with laser anneal post-deposition processing. The PL data were collected and other characterization techniques were used to support the defect assignments. A repository of material properties, which includes the recombination levels along with structural defect assignments for each of the CdTe deposition processes, is provided. This project will provide the solar cell community with information on CdTe defects for different processing conditions, ultimately influencing the fabrication of improved solar cells.
7

Towards an integration of theories of achievement motivation.

Wellman, David Allen, mikewood@deakin.edu.au January 2001 (has links)
This thesis investigated children's school achievement in terms of an integration of three theories of achievement motivation. The three theoretical outlooks were expectancy-value theory (EVT), implicit theories of intelligence (ITI), and flow theory (FT). The first of two studies was an exploratory investigation of the effectiveness of each theory, independently and combined, in predicting children's achievement in four school subjects: maths, reading, instrumental music and sport. Participants were 84 children (40 females and 44 males) aged 9 to 10 years, one of each child's parents, and school teachers of each child in the four subject areas. All data were collected through questionnaires based on the three models. The results indicated that EVT and FT, but not ITI, accounted for a significant amount of the variance in children's achievement, including effects for subject area and gender. A second, confirmatory study tested EVT, FT and an integrated model for the prediction of achievement in maths, reading and instrumental music. The participants were a further 141 children (74 females and 67 males) aged 10 to 11 years, and a parent and teachers of each child. Data collection using questionnaires occurred early in the school year (Time 1) and approximately five months later (Time 2). For EVT, children's and parents' competence beliefs were significant predictors of children's achievement in each subject area. Females tended to believe themselves more competent at reading and instrumental music and also valued these subjects more highly than boys. Modeling results for flow theory indicated that children's emotional responses to classes (happiness and confusion) were significant predictors of achievement, the type of emotion varying between subject areas and time periods. Females generally had a more positive emotional reaction to reading and instrumental music classes than males did. The integrated model results indicated significant relationships between EVT and flow theories for each subject area, with EVT explaining most achievement variance in the integrated model. Children's and parents' competence beliefs were the main predictors of achievement at Time 1 and Time 2. Subject area and gender differences were found which provide direction for future research. Anecdotal reports of parents and teachers often attest to individual differences in children's involvement in various school domains. Even among children of apparently similar intelligence, it is not uncommon to find one who likes nothing better than to work on a mathematics problem while another much prefers to read a novel or play a musical instrument. Some children appear to achieve good results for most of the activities in which they are engaged while others achieve in a less consistent manner, sometimes particularly excelling in one activity. Some children respond to failure experiences with a determination to improve their performance in the future while others react with resignation and acceptance of their low ability. Some children appear to become totally absorbed in the activity of playing sport while others cannot wait for the game to end. The primary research objective guiding the current thesis is how children's thoughts and feelings about school subjects differ and are related to their school achievement. A perusal of the achievement motivation literature indicates several possible models and concepts that can be applied to explain individual differences in children's school achievement.
Concepts such as academic self-concept, multiple intelligences, intrinsic and extrinsic motivation, self-beliefs, competence beliefs, subjective task values, mastery and performance goals, 'flow' experiences and social motivation are just some of the constructs used to explain children's achievement motivation, both within and between various activity domains. These constructs are proposed by researchers from different theoretical perspectives on achievement motivation. Although there is much literature relevant to each perspective, there is little research indicating how the various perspectives may relate to each other. The current thesis will begin by reviewing three currently popular theoretical orientations cited in achievement motivation research: subjective beliefs and values; implicit theories of intelligence; and flow experience and family complexity. Following this review, a framework will be proposed for testing the determinants of children's school achievement, both within each of the three theoretical perspectives and also in combination.
8

Essays on risk management and financial stability / Essais sur la gestion des risques et la stabilité financière

Ben Hadj, Saifeddine 04 July 2017 (has links)
The thesis analyses the stability of the international financial system as a whole and, more precisely, how to improve its resilience. Each chapter focuses on one type of actor in this complex system: banks, supervisory bodies, and international regulators. The first chapter introduces new optimization techniques to speed up the computation of risk measures in banks and financial institutions, together with a theoretical study validating the proposed optimization algorithms. The second chapter quantifies the negative externality generated by the activities of a bank or financial institution. Finally, the last chapter addresses cooperation between national regulators in the presence of coordination costs, through an analysis grounded in game theory. / We first investigate the computational complexity of estimating quantile-based risk measures, such as the widespread Value at Risk for banks and Solvency II capital requirements for insurance companies, via nested Monte Carlo simulations. The estimator is a conditional expectation type estimate where two-stage simulations are required to evaluate the risk measure: an outer simulation is used to generate risk factor scenarios that govern price movements, and an inner simulation is used to evaluate the future portfolio value based on each of those scenarios. The second essay considers financial stability from a macro perspective. Measuring the negative externalities of banks is a major challenge for financial regulators. We propose a new risk management approach to enhance financial stability and to increase the fairness of financial transactions. The basic idea is that a bank should assume as much risk as it creates. Any imbalance in the tails of the distribution of profit and losses is a sign of the bank's failure to internalize its externalities or the social costs associated with its activities. The aim of the third essay is to find a theoretical justification for the mutual benefits of membership in a banking union, in the context of a strategic interaction model. We use a unique contagion dynamic that marries the rich literatures of game theory, contagion in pandemic crises, and the study of collaboration between regulators. The model focuses on regulating asset classes, not individual banks. This special design addresses moral hazard issues that could result from government intervention in the case of crisis.
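The nested simulation structure described in the first essay can be sketched generically: an outer loop draws risk-factor scenarios, an inner loop revalues the portfolio conditional on each scenario, and the risk measure is an empirical quantile of the resulting losses. Everything in the sketch (model, sample sizes) is an illustrative assumption, not the thesis's algorithm:

```python
# Sketch: generic nested Monte Carlo estimate of a quantile-based risk measure.
import numpy as np

rng = np.random.default_rng(4)
N_OUTER, N_INNER = 2000, 200     # illustrative sample sizes

def inner_portfolio_value(scenario, n):
    """Hypothetical repricing: future portfolio value given a risk-factor scenario."""
    return np.mean(100.0 * np.exp(scenario + 0.1 * rng.standard_normal(n)))

scenarios = 0.05 * rng.standard_normal(N_OUTER)          # outer risk-factor draws
values = np.array([inner_portfolio_value(s, N_INNER) for s in scenarios])
losses = values.mean() - values                          # loss relative to the mean value
var_99 = np.quantile(losses, 0.99)                       # empirical 99% quantile
print(f"nested-MC 99% VaR estimate: {var_99:.2f}")
```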
9

Modelling portfolios with heavy-tailed risk factors / Modelování portfolií s risk faktory s těžkými chvosty

Kyselá, Eva January 2015 (has links)
The thesis investigates approaches to modelling portfolio returns with heavy-tailed risk factors. It first elaborates on univariate time series models, comparing the benchmark model's (GARCH with Student t innovations, or its GJR extension) predictive performance with that of two competitors, the EVT-GARCH model and the Markov-Switching Multifractal (MSM) model. The motivation for the EVT extension of the GARCH specification is to use a more appropriate distribution for the innovations, based on the empirical distribution function. The MSM is one of the best performing models in the multifractal literature, a Markov-switching model unique for its parsimonious specification and variability. The performance of these models is assessed with Mincer-Zarnowitz regressions as well as by comparing the quality of VaR and expected shortfall predictions, and the empirical analysis shows that for risk management purposes the EVT-GARCH dominates both the benchmark and the MSM. The second part addresses dependence structure modelling, using the Gaussian and t-copulas to model portfolio returns, and compares the results with the classic variance-covariance approach, concluding that copulas offer more realistic estimates of future extreme quantiles.
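A minimal sketch of the two-step EVT-GARCH idea referenced above, assuming the `arch` package is available: fit a GARCH(1,1), model the lower tail of the standardized residuals with a GPD, and scale the resulting quantile by the one-step volatility forecast. Data, threshold and confidence level are placeholders, not the thesis's specification:

```python
# Sketch: two-step EVT-GARCH VaR on synthetic returns (in percent).
import numpy as np
from arch import arch_model
from scipy.stats import genpareto

r = 100 * 0.01 * np.random.default_rng(5).standard_t(5, size=3000)  # placeholder returns
res = arch_model(r, p=1, q=1, dist="t").fit(disp="off")

z = res.std_resid[~np.isnan(res.std_resid)]     # standardized residuals
neg = -z                                        # losses mapped to the right tail
u = np.quantile(neg, 0.95)                      # threshold: 95th percentile (assumption)
xi, _, beta = genpareto.fit(neg[neg > u] - u, floc=0.0)
q = u + (beta / xi) * ((len(neg) / (neg > u).sum() * 0.01) ** (-xi) - 1.0)  # 99% quantile

sigma_next = np.sqrt(res.forecast(horizon=1).variance.values[-1, 0])
print(f"one-day 99% VaR estimate: {q * sigma_next:.2f}%")
```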
10

Portfolio Value at Risk and Expected Shortfall using High-frequency data / Portfólio Value at Risk a Expected Shortfall s použitím vysoko frekvenčních dat

Zváč, Marek January 2015 (has links)
The main objective of this thesis is to investigate whether multivariate models using high-frequency data provide significantly more accurate forecasts of Value at Risk and Expected Shortfall than multivariate models using only daily data. The objective is very topical, since the Basel Committee announced in 2013 that it is going to change the risk measure used for the calculation of capital requirements from Value at Risk to Expected Shortfall. Further improvement in the accuracy of both risk measures can also be achieved by incorporating high-frequency data, which are increasingly available thanks to significant technological progress. We therefore employ the parsimonious Heterogeneous Autoregression and its asymmetric version, which use high-frequency data to model the realized covariance matrix. The chosen benchmark models are the well-established DCC-GARCH and EWMA. Value at Risk (VaR) and Expected Shortfall (ES) are computed through parametric, semi-parametric and Monte Carlo simulations. The loss distributions are represented by the multivariate Gaussian, Student t, multivariate distributions simulated by copula functions, and multivariate filtered historical simulations. The univariate loss distributions used are the Generalized Pareto Distribution from EVT, and empirical and standard parametric distributions. The main finding is that the Heterogeneous Autoregression model using high-frequency data delivered VaR forecasts of superior, or at least equal, accuracy compared to the benchmark models based on daily data. Finally, the backtesting of ES remains very challenging, and the applied Tests I and II did not provide credible validation of the forecasts.
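For reference, the Heterogeneous Autoregression (HAR) regresses next-day realized variance on daily, weekly and monthly averages of past realized variance. A minimal univariate sketch on a synthetic series (the thesis models the full realized covariance matrix; this only illustrates the regression structure):

```python
# Sketch: HAR regression RV_{t+1} = b0 + bd*RV_t + bw*mean(RV_{t-4..t})
#                                       + bm*mean(RV_{t-21..t}) + e.
# The realized-variance series is synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(6)
rv = np.abs(rng.standard_normal(1000)) * 1e-4 + 1e-4   # placeholder daily RV

def har_design(rv):
    d = rv[21:-1]                                                    # daily lag
    w = np.array([rv[t - 4:t + 1].mean() for t in range(21, len(rv) - 1)])   # weekly
    m = np.array([rv[t - 21:t + 1].mean() for t in range(21, len(rv) - 1)])  # monthly
    return np.column_stack([np.ones_like(d), d, w, m]), rv[22:]

X, y = har_design(rv)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)           # OLS fit
fitted_last = X[-1] @ beta                             # in-sample fit, last observation
print("HAR coefficients:", np.round(beta, 4), " fitted RV:", fitted_last)
```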
