About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
171

Algorithmic Analysis of a General Class of Discrete-based Insurance Risk Models

Singer, Basil Karim January 2013 (has links)
The aim of this thesis is to develop algorithmic methods for computing particular performance measures of interest for a general class of discrete-based insurance risk models. We build upon and generalize the insurance risk models considered by Drekic and Mera (2011) and Alfa and Drekic (2007) by incorporating a threshold-based dividend system in which dividends are paid only if a period of good financial health is sustained above a pre-specified threshold level. We employ two fundamental methods for calculating the performance measures under the more general framework. The first method adopts the matrix-analytic approach originally used by Alfa and Drekic (2007) to calculate various ruin-related probabilities of interest, such as the trivariate distribution of the time of ruin, the surplus prior to ruin, and the deficit at ruin. Specifically, we begin by introducing a particular trivariate Markov process and expressing its transition probability matrix in block-matrix form. From this characterization, we identify an initial probability vector for the process, from which certain important conditional probability vectors are defined. To compute these vectors efficiently, we derive recursive expressions for each of them. Using these probability vectors, we then derive expressions which enable the calculation of conditional ruin probabilities, from which their unconditional counterparts follow naturally. The second method involves the first-claim conditioning approach (i.e., conditioning on the time and size of the first claim) employed in many ruin-theoretic articles, including Drekic and Mera (2011). We derive expressions for the finite-time ruin-based Gerber-Shiu function as well as the moments of the total dividends paid by a finite time horizon or before ruin occurs, whichever happens first. Both functions can be expressed as elegant, albeit long, recursive formulas. With the algorithmic derivations obtained from the two fundamental methods, we then turn to computational aspects of the model class, comparing six different types of models belonging to this class and providing numerical calculations for several parametric examples, highlighting the robustness and versatility of our model class. Finally, we identify several potential areas for future research and possible ways to optimize the numerical calculations.
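As a concrete illustration of the recursive style of computation described above, the following sketch evaluates finite-time ruin probabilities in a basic discrete-time (compound binomial) risk model: a premium of 1 is received each period, a claim occurs with probability q, and claim sizes follow a simple distribution p. The dividend threshold and the general model structure treated in the thesis are deliberately omitted, and the parameters q and p are purely illustrative.

```python
from functools import lru_cache

q = 0.3                                   # per-period claim probability (illustrative)
p = {1: 0.5, 2: 0.3, 3: 0.2}              # claim-size distribution (illustrative)

@lru_cache(maxsize=None)
def ruin(u, n):
    """Probability that the surplus drops below 0 within n periods,
    starting from integer surplus u, in a compound binomial model."""
    if n == 0:
        return 0.0
    out = (1 - q) * ruin(u + 1, n - 1)    # no claim: surplus grows by the premium
    for k, pk in p.items():
        if u + 1 - k < 0:
            out += q * pk                 # claim exceeds surplus plus premium: ruin
        else:
            out += q * pk * ruin(u + 1 - k, n - 1)
    return out

for u0 in (0, 2, 5):
    print(u0, round(ruin(u0, 100), 6))
```

The memoised recursion builds the finite-horizon ruin probabilities period by period, loosely mirroring how the conditional probability vectors of the matrix-analytic approach are computed recursively, albeit for a scalar surplus process rather than the trivariate Markov process used in the thesis.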
172

Optimal Reinsurance Designs: from an Insurer’s Perspective

Weng, Chengguo 09 1900 (has links)
Research on optimal reinsurance design dates back to the 1960s. For nearly half a century, the quest for optimal reinsurance designs has remained a fascinating subject, drawing significant interest from both academics and practitioners. Its appeal lies in its potential as an effective risk management tool for insurers. There are many ways of formulating the optimal design of reinsurance, depending on the chosen objective and constraints. In this thesis, we address the problem of optimal reinsurance design from an insurer's perspective. For an insurer, an appropriate use of reinsurance helps to reduce adverse risk exposure and improve the overall viability of the underlying business. On the other hand, reinsurance incurs an additional cost to the insurer in the form of a reinsurance premium. This implies a classical risk-and-reward tradeoff faced by the insurer. The primary objective of the thesis is to develop theoretically sound and yet practical solutions in the quest for optimal reinsurance designs. To achieve this objective, the thesis is divided into two parts. In the first part, a number of reinsurance models are developed and their optimal reinsurance treaties are derived explicitly. This part focuses on risk-measure-minimization reinsurance models and discusses the optimal reinsurance treaties by exploiting two of the most common risk measures, the Value-at-Risk (VaR) and the Conditional Tail Expectation (CTE). Additional important economic factors, such as the reinsurance premium budget and the insurer's profitability, are also considered. The second part proposes an innovative method of formulating reinsurance models, which we refer to as the empirical approach since it exploits the insurer's empirical loss data explicitly. The empirical approach has the advantage of being practical and intuitively appealing. It is motivated by the difficulty that reinsurance models are often infinite-dimensional optimization problems, so that explicit solutions are achievable only in some special cases. The empirical approach effectively reformulates the optimal reinsurance problem as a finite-dimensional optimization problem. Furthermore, we demonstrate that second-order conic programming can be used to obtain the optimal solutions for a wide range of reinsurance models formulated by the empirical approach.
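A minimal sketch of the empirical idea described above, under strong simplifying assumptions: rather than the second-order conic programming formulation developed in the thesis, it grid-searches over stop-loss retentions d, prices the ceded layer with an expected-value premium principle, and picks the retention that minimizes the empirical VaR of the insurer's total cost (retained loss plus premium) subject to a premium budget. The loss data, loading, and budget are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)
losses = rng.lognormal(mean=1.0, sigma=1.0, size=50_000)   # hypothetical empirical loss data
alpha, loading, budget = 0.95, 0.2, 1.0

def evaluate(d):
    """Empirical premium, VaR and CTE of the insurer's total cost under a
    stop-loss treaty with retention d."""
    ceded = np.maximum(losses - d, 0.0)
    premium = (1 + loading) * ceded.mean()       # expected-value premium principle
    total = np.minimum(losses, d) + premium      # retained loss plus premium
    var = np.quantile(total, alpha)
    cte = total[total >= var].mean()
    return premium, var, cte

best = None
for d in np.linspace(0.0, np.quantile(losses, 0.999), 400):
    premium, var, cte = evaluate(d)
    if premium <= budget and (best is None or var < best[1]):
        best = (d, var, cte)

d_star, var_star, cte_star = best
print(f"retention d* = {d_star:.2f}, VaR = {var_star:.2f}, CTE = {cte_star:.2f}")
```

Restricting the search to stop-loss treaties keeps this example one-dimensional; the empirical approach in the thesis optimizes over a much richer class of ceded-loss functions, which is what makes the conic-programming reformulation valuable.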
173

Queueing Analysis of a Priority-based Claim Processing System

Ibrahim, Basil January 2009 (has links)
We consider a setting in which a single employee is responsible for processing claims arriving at an insurance company, where each claim can be classified as one of two possible types. More specifically, we consider a priority-based system with separate buffers to store high-priority and low-priority incoming claims. We construct a mathematical model and perform queueing analysis to evaluate the performance of this priority-based system, which incorporates the possibility of claims being redistributed, lost, or prematurely processed.
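The sketch below simulates a simplified version of such a system: a single server (the employee), two buffers, and non-preemptive priority for high-priority claims, with Poisson arrivals and exponential processing times. Redistribution, loss, and premature processing of claims, which the thesis incorporates, are omitted, and all rates are illustrative.

```python
import random
from collections import deque

def simulate_priority_queue(lam_hi, lam_lo, mu, horizon, seed=0):
    """Mean waiting time per class in a two-buffer, single-server system with
    non-preemptive priority: high-priority claims are always served first,
    but a claim already in service is never interrupted."""
    rng = random.Random(seed)

    def arrival_times(lam):
        t, out = 0.0, []
        while True:
            t += rng.expovariate(lam)
            if t >= horizon:
                return out
            out.append(t)

    future = {"hi": deque(arrival_times(lam_hi)), "lo": deque(arrival_times(lam_lo))}
    buf = {"hi": deque(), "lo": deque()}          # arrival times waiting in each buffer
    waits = {"hi": [], "lo": []}
    clock = 0.0

    while any(future.values()) or any(buf.values()):
        # Admit every claim that has arrived by the current time.
        for cls in ("hi", "lo"):
            while future[cls] and future[cls][0] <= clock:
                buf[cls].append(future[cls].popleft())
        if buf["hi"] or buf["lo"]:
            cls = "hi" if buf["hi"] else "lo"     # high-priority buffer served first
            arrived = buf[cls].popleft()
            waits[cls].append(clock - arrived)
            clock += rng.expovariate(mu)          # process the claim to completion
        else:
            # Server idle: jump to the next arrival.
            clock = min(q[0] for q in future.values() if q)

    return {cls: sum(w) / len(w) for cls, w in waits.items() if w}

print(simulate_priority_queue(lam_hi=0.3, lam_lo=0.5, mu=1.0, horizon=50_000))
```

With these rates the server is busy roughly 80% of the time, and the simulation reproduces the familiar pattern that low-priority claims wait substantially longer than high-priority ones.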
177

Modelling longitudinally measured outcome HIV biomarkers with immuno genetic parameters.

Bryan, Susan Ruth. January 2011 (has links)
According to the Joint United Nations Programme on HIV/AIDS (UNAIDS) 2009 AIDS epidemic update, there were a total of 33.3 million (31.4 million–35.3 million) people living with HIV worldwide in 2009. The majority of the epidemic occurs in Sub-Saharan Africa: of the 33.3 million people living with HIV worldwide in 2009, a vast majority of 22.5 million (20.9 million–24.2 million) were from Sub-Saharan Africa. There were 1.8 million (1.6 million–2.0 million) new infections and 1.3 million (1.1 million–1.5 million) AIDS-related deaths in Sub-Saharan Africa in 2009 (UNAIDS, 2009). Statistical models and analysis are required in order to further understand the dynamics of HIV/AIDS and to design intervention and control strategies. Despite the prevalence of this disease, its pathogenesis is still poorly understood. A thorough understanding of HIV and of the factors that influence progression of the disease is required in order to prevent the further spread of the virus. Modelling provides us with a means to better understand and predict the progression of the disease. Certain genetic factors play a key role in the way the disease progresses in the human body. For example, HLA-B types and IL-10 genotypes are among the genetic factors that have been independently associated with the control of HIV infection. Both HLA-B and IL-10 may influence the quality and magnitude of immune responses, and IL-10 has also been shown to down-regulate the expression of certain HLA molecules. Studies are therefore required to investigate how HLA-B types and IL-10 genotypes may interact to affect HIV infection outcomes. This dissertation uses the Sinikithemba study data from the HIV Pathogenesis Programme (HPP) at the Medical School, University of KwaZulu-Natal, involving 450 HIV-positive and treatment-naive individuals, to model how certain outcome biomarkers (CD4+ counts and viral loads) are associated with immunogenetic parameters (HLA-B types and IL-10 genotypes). The work also seeks to exploit novel longitudinal data methods in statistics in order to efficiently model longitudinally measured HIV outcome data. Statistical techniques such as linear mixed models and generalized estimating equations were used to model these data. The findings from the current work agree quite closely with what is expected from the biological understanding of the disease. / Thesis (M.Sc.)-University of KwaZulu-Natal, Pietermaritzburg, 2011.
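A rough sketch of how the two modelling techniques mentioned above could be fitted in Python with statsmodels; the file name and column names (sqrt_cd4, weeks, hla_b57, il10_genotype, patient_id) are hypothetical placeholders rather than the Sinikithemba study's actual variables.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("longitudinal_biomarkers.csv")   # hypothetical data set

# Linear mixed model: random intercept per patient, repeated biomarker
# measurements regressed on time and immunogenetic covariates.
mixed = smf.mixedlm("sqrt_cd4 ~ weeks * hla_b57 + il10_genotype",
                    data=df, groups=df["patient_id"]).fit()
print(mixed.summary())

# Generalized estimating equations: marginal model with an exchangeable
# working correlation structure for within-patient dependence.
gee = smf.gee("sqrt_cd4 ~ weeks * hla_b57 + il10_genotype",
              groups="patient_id", data=df,
              cov_struct=sm.cov_struct.Exchangeable(),
              family=sm.families.Gaussian()).fit()
print(gee.summary())
```

The mixed model targets subject-specific trajectories, while the GEE targets population-averaged effects; fitting both, as the dissertation does, is a common way to check whether conclusions about the immunogenetic covariates are robust to the choice of longitudinal framework.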
178

Estimation and analysis of measures of disease for HIV infection in childbearing women using serial seroprevalence data.

Sewpaul, Ronel. January 2011 (has links)
The prevalence and the incidence are two primary epidemiological parameters in infectious disease modelling. The incidence is also closely related to the force of infection, or, in survival analysis terms, the hazard of infection; the two measures carry the same information about a disease because both measure the rate at which new infections occur. The disease prevalence gives the proportion of infected individuals in the population at a given time, while the incidence is the rate of new infections. The thesis discusses methods for estimating HIV prevalence, incidence rates and the force of infection, against age and time, using cross-sectional seroprevalence data for pregnant women attending antenatal clinics. The data were collected on women aged 12 to 47 in rural KwaZulu-Natal for each of the years 2001 to 2006. The generalized linear model for a binomial response is used extensively. First, the logistic regression model is used to estimate annual HIV prevalence by age. It was found that the estimated prevalence for each year increases with age, to peaks of between 36% and 57% in the mid to late twenties, before declining steadily toward the forties. Fitted prevalence for 2001 is lower than for the other years across all ages. Several models for estimating the force of infection are discussed and applied. The fitted force of infection rises with age to a peak of 0.074 at age 15 and then decreases toward higher ages. The force of infection measures the potential risk of infection per individual per unit time. A proportional hazards model of the age to infection is applied to the data and shows that additional variables such as partner's age and the number of previous pregnancies do have a significant effect on the infection hazard. Studies for estimating incidence from multiple prevalence surveys are reviewed. The relative inclusion rate (RIR), which accounts for the fact that the probability of inclusion in a prevalence sample depends on the individual's HIV status, and its role in incidence estimation are discussed as a possible future approach for extending the current work. / Thesis (M.Sc.)-University of KwaZulu-Natal, Pietermaritzburg, 2011.
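As a sketch of the first step described above (annual prevalence-by-age curves from a binomial GLM), something like the following could be adapted; the file and column names (age, hiv_status, year) are hypothetical, and the quadratic-in-age logistic model is just one simple way to capture the rise-and-decline pattern reported in the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("antenatal_serosurvey.csv")            # hypothetical data set

for year, sub in df.groupby("year"):
    # Logistic regression of HIV status on age, fitted separately per survey year.
    fit = smf.glm("hiv_status ~ age + I(age**2)", data=sub,
                  family=sm.families.Binomial()).fit()
    grid = pd.DataFrame({"age": np.arange(12, 48)})
    prev = fit.predict(grid)
    print(f"{year}: peak prevalence {100 * prev.max():.1f}% "
          f"at age {grid.age[prev.idxmax()]}")
```

From fitted prevalence curves such as these, the force of infection can then be derived, for instance via the classical relation λ(a) = π′(a)/(1 − π(a)) under the assumptions of a closed cohort without differential mortality, which is the step the thesis develops in detail.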
179

Insurance portfolios with dependent risks

Badran, Rabih 23 January 2014 (has links)
This thesis deals with insurance portfolios with dependent risks in risk theory.

The first chapter treats models with equicorrelated risks. We propose a mathematical structure that leads to a particular probability generating function (pgf) proposed by Tallis. This pgf involves equicorrelated variables. We then study the effect of this type of dependence on quantities of interest in the actuarial literature, such as the distribution function of the aggregate claim amount, stop-loss premiums, and finite-horizon ruin probabilities. We use the proposed structure to correct errors in the literature that arose because several authors proceeded as if the sum of equicorrelated random variables necessarily had the pgf proposed by Tallis.

In the second chapter, we propose a model that combines shock models and common mixture models by introducing a variable that controls the level of the shock. Within this new model, we consider two applications in which we generalize the Bernoulli shock model and the Poisson shock model. In both applications, we study the effect of dependence on the distribution function of the claim amounts, stop-loss premiums, and ruin probabilities over finite and infinite horizons. For the second application, we propose a copula-based construction that allows the level of dependence to be controlled through the level of the shock.

In the third chapter, we propose a generalization of the classical Poisson model in which claim amounts and interclaim times are assumed to be dependent. We compute the Laplace transform of the survival probabilities. In the particular case where claim amounts are exponentially distributed, we obtain explicit formulas for the survival probabilities.

In the fourth chapter, we generalize the classical Poisson model by introducing dependence between interclaim times. We use the link between fluid queues and the risk process to model the dependence. We compute survival probabilities using a numerical algorithm and treat the case where claim amounts and interclaim times have phase-type distributions. / Doctorat en Sciences
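To make the dependence effects discussed in the first two chapters concrete, the sketch below compares the stop-loss premium of a portfolio of Bernoulli claims under independence and under a simple common-mixture construction (a latent shock raises every policy's claim probability while keeping the marginal claim probability fixed). All parameters are illustrative and the construction is a toy illustration rather than any of the specific models developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)
n_policies, q, claim_size, retention = 100, 0.05, 10.0, 80.0
n_sims, p_shock, q_hi = 200_000, 0.4, 0.10          # illustrative shock parameters

def stop_loss_premium(total_claims, d):
    return np.mean(np.maximum(total_claims - d, 0.0))

# Independent Bernoulli claim indicators.
ind = rng.binomial(1, q, size=(n_sims, n_policies)).sum(axis=1) * claim_size

# Common-mixture construction: with probability p_shock every policy claims
# with the higher probability q_hi, otherwise with q_lo, calibrated so that
# the marginal claim probability remains q.
q_lo = (q - p_shock * q_hi) / (1 - p_shock)
probs = np.where(rng.random(n_sims) < p_shock, q_hi, q_lo)
dep = rng.binomial(1, probs[:, None], size=(n_sims, n_policies)).sum(axis=1) * claim_size

print("independent stop-loss premium:   ", stop_loss_premium(ind, retention))
print("common-mixture stop-loss premium:", stop_loss_premium(dep, retention))
```

The positive dependence induced by the latent shock fattens the right tail of the aggregate claim distribution, so the stop-loss premium under the mixture noticeably exceeds the independent one even though the expected aggregate claims are identical.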
180

Generalized Bühlmann-Straub credibility theory for correlated data

Andblom, Mikael January 2023 (has links)
In this thesis, we first go through classical results from the field of credibility theory. One of the most well-known models in the field is the Bühlmann-Straub model. The model is relatively straightforward to apply in practice and is widely used. A major advantage of the model is its simplicity and the intuitive dependence on its parameters. From our perspective, the main drawback is the assumption of uncorrelated data. We show that the correlation can be used to cancel observational noise and therefore obtain more accurate estimators. This leads to an extended credibility formula that contains the Bühlmann-Straub model as a special case. This comes at the cost of introducing singularities, which may cause the estimator to behave unexpectedly under certain circumstances. Further research is needed to better understand how often these circumstances are met in practice and whether transforming the optimal weights could be a way forward in such cases. Finally, a simulation study based on real-world data shows that the proposed model outperforms the Bühlmann-Straub model.
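For reference, here is a compact sketch of the classical Bühlmann-Straub estimators that serve as the thesis's starting point (not the correlated-data extension it develops); the loss ratios and weights in the example are made up.

```python
import numpy as np

def buhlmann_straub(X, w):
    """Credibility premiums per contract. X[i, j] is the observed loss ratio of
    contract i in year j and w[i, j] the corresponding exposure weight."""
    X, w = np.asarray(X, float), np.asarray(w, float)
    I, n = X.shape
    wi = w.sum(axis=1)                                  # total weight per contract
    Xi = (w * X).sum(axis=1) / wi                       # weighted contract means
    Xw = (wi * Xi).sum() / wi.sum()                     # overall weighted mean
    # Within-contract variance (expected process variance).
    sigma2 = (w * (X - Xi[:, None]) ** 2).sum() / (I * (n - 1))
    # Between-contract variance (variance of hypothetical means), floored at 0.
    c = wi.sum() - (wi ** 2).sum() / wi.sum()
    tau2 = max(((wi * (Xi - Xw) ** 2).sum() - (I - 1) * sigma2) / c, 0.0)
    if tau2 == 0.0:
        return np.full(I, Xw)                           # no credibility: collective mean
    Z = wi / (wi + sigma2 / tau2)                       # credibility factors
    mu = (Z * Xi).sum() / Z.sum()                       # credibility-weighted collective mean
    return Z * Xi + (1 - Z) * mu

X = [[0.9, 1.1, 1.0], [1.6, 1.4, 1.8], [0.7, 0.8, 0.6]]   # loss ratios (illustrative)
w = [[10, 12, 11], [3, 4, 3], [25, 27, 26]]               # exposures (illustrative)
print(buhlmann_straub(X, w))
```

The credibility factor Z grows with a contract's own exposure, which is exactly the intuitive dependence on the parameters praised in the abstract; the thesis's extension modifies the optimal weights when the observations within a contract are correlated.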
