41.
Confidence intervals for estimators of welfare indices under complex sampling. Kirchoff, Retha.
Thesis (MComm (Statistics and Actuarial Science))--University of Stellenbosch, 2010.
ENGLISH ABSTRACT: The aim of this study is to obtain estimates and confidence intervals for welfare
indices under complex sampling. It begins by looking at sampling in general with
specific focus on complex sampling and weighting. For the estimation of the welfare
indices, two resampling techniques, viz. jackknife and bootstrap, are discussed.
They are used for the estimation of bias and standard error under simple random
sampling and complex sampling. Three confidence intervals are discussed, viz. standard
(asymptotic), percentile and bootstrap-t. An overview of welfare indices and
their estimation is given. The indices are categorized into measures of poverty and
measures of inequality. Two Laeken indices, viz. at-risk-of-poverty and quintile
share ratio, are included in the discussion. The study considers two poverty lines,
namely an absolute poverty line based on percy (ratio of total household income
to household size) and a relative poverty line based on equivalized income (ratio of
total household income to equivalized household size). The data set used as surrogate
population for the study is the Income and Expenditure survey 2005/2006
conducted by Statistics South Africa and details of it are provided and discussed.
An analysis of simulation data from the surrogate population was carried out using
techniques mentioned above and the results were graphed, tabulated and discussed.
Two issues were considered, namely whether the design of the survey should be taken into account
and whether resampling techniques provide reliable results, especially for
confidence intervals. The results were mixed. Overall, however, it was found
that weighting showed promise in many cases, especially in the improvement of the
coverage probabilities of the confidence intervals. It was also found that the bootstrap
resampling technique was reliable (by looking at standard errors). Further
research options are mentioned as possible solutions towards the mixed results.
AFRIKAANSE OPSOMMING: The aim of the study is to obtain estimates and confidence intervals for welfare measures under complex sampling. Sampling in general is discussed, with specific focus on complex sampling and weighting. Two resampling techniques, namely the jackknife and the bootstrap, are discussed as methods for estimating the measures. These techniques are used for bias estimation as well as the estimation of standard errors under simple random sampling and under complex sampling. Three confidence intervals are discussed, namely the standard (asymptotic), the percentile and the bootstrap-t intervals. There is also an overview of welfare measures and their estimation. These measures fall into two categories, namely measures of poverty and measures of inequality. Also included in this discussion are the at-risk-of-poverty and quintile share ratio measures, which form part of the Laeken indices. Two poverty lines, an absolute and a relative one, are used in this study. The absolute poverty line is based on percy, the ratio of total household income to household size, while the relative poverty line is based on equivalized income, the ratio of total household income to equivalized household size. The data set that served as surrogate population in this study is the Income and Expenditure Survey of 2005/2006 conducted by Statistics South Africa; information on this survey is also given. Simulated data from the surrogate population were analysed by means of the resampling techniques mentioned. The results of the simulation are presented and discussed in graphs and tables. Two questions arose from the simulation, namely whether the design of a sample, and hence weighting, should be taken into account, and whether the resampling techniques deliver reliable results, especially in the case of the confidence intervals. The results obtained varied widely. It was, however, established that weighting generally delivered promising results in many, though not all, of the cases; in particular, it improved the coverage probabilities of the confidence intervals. It was also established, by examining the standard errors of the bootstrap estimators, that the bootstrap technique delivered reliable results. Further research possibilities are mentioned as potential improvements on the mixed results obtained.
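To make the resampling machinery above concrete, here is a minimal sketch of a percentile-bootstrap confidence interval for a design-weighted poverty headcount ratio. The function names and data arrays are illustrative assumptions, and the i.i.d. resampling of households is a simplification: a faithful treatment of a complex design would resample primary sampling units within strata.

```python
import numpy as np

def headcount(income, weight, z):
    """Design-weighted poverty headcount: weighted share with income below line z."""
    return np.sum(weight * (income < z)) / np.sum(weight)

def percentile_ci(income, weight, z, B=2000, alpha=0.05, seed=0):
    """Naive percentile-bootstrap interval, resampling households i.i.d."""
    rng = np.random.default_rng(seed)
    n = income.size
    reps = np.empty(B)
    for b in range(B):
        i = rng.integers(0, n, n)                    # resample with replacement
        reps[b] = headcount(income[i], weight[i], z)
    lo, hi = np.quantile(reps, [alpha / 2, 1 - alpha / 2])
    return headcount(income, weight, z), (lo, hi)
```

Ignoring the strata and clusters in this way corresponds to the "design not considered" arm of the simulation study, which is precisely the design question investigated above.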
42.
A brief introduction to basic multivariate economic statistical process control. Mudavanhu, Precious.
Thesis (MComm)--Stellenbosch University, 2012.
ENGLISH ABSTRACT: Statistical process control (SPC) plays a very important role in monitoring and improving
industrial processes to ensure that products produced or shipped to the customer meet the
required specifications. The main tool that is used in SPC is the statistical control chart. The
traditional way of statistical control chart design assumed that a process is described by a
single quality characteristic. However, according to Montgomery and Klatt (1972) industrial
processes and products can have more than one quality characteristic and their joint effect
describes product quality. Process monitoring in which several related variables are of
interest is referred to as multivariate statistical process control (MSPC). The most vital and
commonly used tool in MSPC is the statistical control chart as in the case of the SPC. The
design of a control chart requires the user to select three parameters which are: sample size,
n, the sampling interval, h, and the control limits, k. Several authors have developed control charts
based on more than one quality characteristic, among them Hotelling (1947), who
pioneered the use of multivariate process control techniques through the development of the
T²-control chart, which is well known as the Hotelling T²-control chart.
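As a rough illustration, the sketch below evaluates the T² statistic for a subgroup mean against known in-control parameters, i.e. the known-parameter case in which the control limit k is a chi-square quantile. All numerical values are hypothetical.

```python
import numpy as np
from scipy import stats

mu0 = np.array([0.0, 0.0])                    # in-control mean vector (assumed known)
Sigma0 = np.array([[1.0, 0.3],
                   [0.3, 1.0]])               # in-control covariance (assumed known)
n, p = 5, 2                                   # subgroup size, number of characteristics
k = stats.chi2.ppf(0.995, df=p)               # control limit for known parameters

def t2(xbar):
    """Hotelling statistic: T^2 = n (xbar - mu0)' Sigma0^{-1} (xbar - mu0)."""
    d = xbar - mu0
    return n * (d @ np.linalg.solve(Sigma0, d))

xbar = np.array([0.8, -0.5])                  # mean of the latest subgroup
print(t2(xbar), t2(xbar) > k)                 # signal if T^2 exceeds k
```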
Since the introduction of the control chart technique, the most common and widely used
method of control chart design has been the statistical design. However, according to Montgomery
(2005), the design of control charts also has economic implications. There are costs that are incurred
during the design of a control chart and these are: costs of sampling and testing, costs
associated with investigating an out-of-control signal and possible correction of any
assignable cause found, costs associated with the production of nonconforming products, etc.
This paper gives an overview of the different methods or techniques that have been
employed to develop the different economic statistical models for MSPC.
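To make the cost trade-off concrete, here is a deliberately simplified, Duncan-style toy objective for the economic design of a T² chart. It is not the Montgomery and Klatt (1972) or Lorenzen and Vance (1986) model, and every cost parameter is hypothetical.

```python
from scipy import stats

def toy_hourly_cost(n, h, k, p, shift_ncp, lam, c_fixed, c_var, c_false, c_oc):
    """Toy expected cost per hour: sampling + false alarms + out-of-control time.
    shift_ncp is the per-observation noncentrality of the shift to be detected;
    lam is the rate of assignable causes per hour."""
    alpha = stats.chi2.sf(k, df=p)                    # P(signal | in control)
    power = stats.ncx2.sf(k, df=p, nc=n * shift_ncp)  # P(signal | shifted)
    sampling = (c_fixed + c_var * n) / h              # cost of sampling and testing
    false_alarms = c_false * alpha / h                # expected false alarms per hour
    out_of_control = c_oc * lam * (h / power)         # shift rate x expected detection delay
    return sampling + false_alarms + out_of_control
```

Minimizing such an objective over (n, h, k), possibly subject to statistical constraints on the false-alarm rate and the power, is the essence of the economic and economic statistical designs surveyed here.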
The first multivariate economic model presented in this paper is the economic design of the
Hotelling T²-control chart to maintain current control of a process, developed by
Montgomery and Klatt (1972). This is followed by the work done by Kapur and Chao (1996)
in which the concept of creating a specification region for the multiple quality characteristics
together with the use of a multivariate quality loss function is implemented to minimize total
loss to both the producer and the customer. Another approach, by Chou et al. (2002), is also
presented, in which a procedure is developed that simultaneously monitors the process mean
and covariance matrix through the use of a quality loss function. The procedure is based on the test statistic -2 ln L, and the cost model is based on Montgomery and Klatt's (1972) as well
as Kapur and Chao's (1996) ideas. One example of the use of the variable sample size
technique on the economic and economic statistical design of the control chart will also be
presented. Specifically, an economic and economic statistical design of the T²-control chart
with two adaptive sample sizes (Faraz et al., 2010) will be presented. Faraz et al. (2010)
developed a cost model of a variable sample size T²-control chart for the economic and
economic statistical design using Lorenzen and Vance's (1986) model.
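The core of an adaptive scheme like this is a simple switching rule. The sketch below shows the generic two-state variable-sample-size logic, with a warning limit w below the control limit k; it illustrates the idea rather than Faraz et al.'s optimized design.

```python
def next_sample_size(t2, w, k, n_small, n_large):
    """Generic VSS rule for a T^2 chart: use the small sample while T^2 < w,
    the large sample while w <= T^2 < k, and signal when T^2 >= k."""
    if t2 >= k:
        return None   # out-of-control signal: stop and search for an assignable cause
    return n_small if t2 < w else n_large
```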
There are several other approaches to the multivariate economic statistical process control
(MESPC) problem, but in this project the focus is on cases based on the Phase II stage
of the process, where the mean vector and the covariance matrix have been fairly well
established and can be taken as known, but both are subject to assignable causes. This latter
aspect is often ignored by researchers. Nevertheless, the article by Faraz et al. (2010) is
included to give more insight into how more sophisticated approaches may fit in with
MESPC, even if only the mean vector may be subject to assignable causes.
Keywords: control chart; statistical process control; multivariate statistical process control;
multivariate economic statistical process control; multivariate control chart; loss function.
AFRIKAANSE OPSOMMING: Statistical process control (SPC) plays a very important role in the monitoring and improvement of industrial processes, to ensure that manufactured products, or products shipped to customers, do meet the required specifications. The principal technique used in SPC is the statistical control chart. The traditional way in which statistical control charts are designed assumes that a process is described by only a single quality variable. Montgomery and Klatt (1972), however, contend that industrial processes and products can have more than one quality characteristic and that jointly these can describe the quality of a product. Process monitoring in which several related variables may be of interest is known as multivariate statistical process control (MSPC). The most important and most common technique used in MSPC is likewise the statistical control chart, as is the case in SPC. The design of a control chart requires the user to choose three parameters, namely the sample size, n, the sampling interval, h, and the control limits, k. Several authors have developed control charts based on more than one quality characteristic, among them Hotelling, who pioneered multivariate process control techniques with the development of the T²-control chart, commonly known as Hotelling's T²-control chart (Hotelling, 1947).

Since the introduction of the control chart technique, its statistical design has been the most common approach, and it has also been used in that form. Nevertheless, according to Montgomery and Klatt (1972) and Montgomery (2005), the design of the control chart also has economic implications. There are costs involved in designing the control chart: costs of sampling and testing, costs associated with investigating an out-of-control signal and the possible repair if any correctable cause of such a signal is found, costs associated with the production of nonconforming products, and so on. In the univariate case the treatment of these economic aspects has already been investigated in depth. This paper gives an overview of some of the different methods or techniques that have been developed for the various economic statistical models for MSPC. Particular attention is given to the cases where the mean vector as well as the covariance matrix is subject to potential shifts, in contrast to a tendency to consider only the mean vector, in isolation, as being subject to possible shifts.
43.
Optimal asset allocation for South African pension funds under the revised Regulation 28. Koegelenberg, Frederik Johannes.
Thesis (MComm)--Stellenbosch University, 2012.
ENGLISH ABSTRACT: On 1 July 2011 the revised version of Regulation 28, which governs the South African
pension fund industry with regard to investments, took effect. The new version allows
pension funds to invest up to 25 percent of their total investment in foreign assets, compared
with 20 percent under the previous version. The aim of this study is to determine whether
it would be optimal for a South African pension fund to invest the full 25 percent of its
portfolio in foreign assets.
Seven different optimization models are evaluated in this study to determine the optimal
asset mix. The optimization models were selected through an extensive literature study
in order to address key optimization issues, e.g. which risk measure to use, whether
parametric or non-parametric optimization should be used, and whether the mean-variance model
for optimization defined by Markowitz, which has been the benchmark with regard to asset
allocation, is the best model for determining long-term asset allocation strategies.
The results obtained from the different models were used to recommend the optimal
long-term asset allocation for a South African pension fund, and were also compared to determine
which optimization model proved to be the most efficient.
The study found that, when using only the past ten years of data to construct the
portfolios, it would have been optimal to invest in only South African asset classes, with
statistically significant return differences in some cases. Using the past 20 years of data
to construct the optimal portfolios provided mixed results, while the 30-year period was
more in favour of an international portfolio with the full 25% invested in foreign asset
classes.
A comparison of the different models provided a clear winner with regard to the probability
of outperformance: the Historical Resampled Mean Variance optimization provided the highest probability of outperforming the benchmark. From the study it also became
evident that a 20-year data period is the optimal period when considering the historical
data that should be used to construct the optimal portfolio.
AFRIKAANSE OPSOMMING: On 1 July 2011 the revised Regulation 28, which regulates the investments of South African pension funds, came into effect. This revised version allows pension funds to invest 25% of their funds in foreign asset classes instead of 20%, as in the previous version. This study establishes whether it would really be advantageous for a South African pension fund to invest the full 25% in foreign asset classes.

Seven different optimization models were used to try to construct the optimal portfolio. The optimization models were chosen after an extensive literature study so that some of the key issues with regard to optimization could be addressed. These issues include which risk measure ought to be used in the optimization process, whether a parametric or a non-parametric model should be used, and whether the mean-variance model defined by Markowitz in 1952, which has served for many years as the benchmark for portfolio optimization, is still the best model to use.

The eventual results obtained from the different optimization models were subsequently used to construct the optimal long-term asset allocation for a South African pension fund. The different optimization models were also compared with one another to determine whether any one model is better than the rest.

From the results it was clear that a portfolio consisting only of South African assets would perform better if only the past 10 years of data are used to construct the portfolio. In most cases these results were also confirmed by means of hypothesis tests. Using the past 20 years of data to construct the portfolios delivered mixed results, while the past 30 years of data in most cases identified an internationally diversified portfolio as the better portfolio.

In a comparison of the different optimization models, the Historical Resampled Mean Variance model was clearly identified as the better model. This model achieved the highest probability of outperforming the fixed benchmark portfolios. The results also pointed to the 20-year period as the best data period to use when the optimal portfolio is constructed.
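As an illustration of the kind of constrained optimization involved, here is a minimal mean-variance sketch with the Regulation 28 foreign cap imposed as a linear constraint. The four asset classes, the expected returns and covariances, and the risk-aversion value are hypothetical, not estimates from the thesis data.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical annual inputs: SA equity, SA bonds, foreign equity, foreign bonds.
mu = np.array([0.12, 0.08, 0.10, 0.06])
cov = np.array([[0.040, 0.005, 0.012, 0.002],
                [0.005, 0.010, 0.003, 0.002],
                [0.012, 0.003, 0.035, 0.004],
                [0.002, 0.002, 0.004, 0.008]])
foreign = np.array([0.0, 0.0, 1.0, 1.0])      # flags the asset classes under the cap
gamma = 3.0                                   # risk aversion

def neg_utility(w):
    return -(mu @ w - 0.5 * gamma * (w @ cov @ w))

cons = [{"type": "eq",   "fun": lambda w: w.sum() - 1.0},       # fully invested
        {"type": "ineq", "fun": lambda w: 0.25 - foreign @ w}]  # Regulation 28 cap
res = minimize(neg_utility, x0=np.full(4, 0.25),
               bounds=[(0.0, 1.0)] * 4, constraints=cons)
print(res.x.round(3), "foreign share:", (foreign @ res.x).round(3))
```

Whether the cap binds at the optimum is exactly the question the thesis asks with far richer models and data.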
44.
Stochastic Mortality Models with Applications in Financial Risk Management. Li, Siu Hang (18 June 2007).
In product pricing and reserving, actuaries are often required to make predictions of future death rates. In the past, this has been done using deterministic improvement scales that give only a single mortality trajectory. However, future death rates are very likely to turn out different from the projected ones, and so a better assessment of longevity risk is one that consists of both a mean estimate and a measure of uncertainty. Such an assessment can be performed using a stochastic mortality model, which is the core of this thesis.
The Lee-Carter model is one of the most popular stochastic mortality models. While it does an excellent job in mean forecasting, it has been criticized for providing overly narrow prediction intervals that may understate uncertainty. This thesis mitigates this problem by relaxing the assumption on the distribution of death counts. We found that the generalization from Poisson to negative binomial is equivalent to allowing gamma heterogeneity within each age-period cell. The proposed extension gives not only a better fit, but also a more conservative prediction interval that may better reflect the uncertainty entailed.
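The Poisson-gamma equivalence mentioned above is easy to illustrate: mixing a gamma frailty with mean 1 into the Poisson rate yields negative binomial counts whose variance exceeds the Poisson mean. The parameterization and numbers below are illustrative only.

```python
import numpy as np

def negbin_deaths(rng, expected, r, size):
    """Negative binomial counts as a Poisson-gamma mixture: the frailty is
    Gamma(shape=r, scale=1/r), so it has mean 1 and variance 1/r."""
    frailty = rng.gamma(shape=r, scale=1.0 / r, size=size)
    return rng.poisson(expected * frailty)

rng = np.random.default_rng(0)
m = 50.0                                      # hypothetical E[deaths] in one cell
poisson_sim = rng.poisson(m, 100_000)
negbin_sim = negbin_deaths(rng, m, r=10.0, size=100_000)
print(poisson_sim.var(), negbin_sim.var())    # the mixture is overdispersed: ~ m + m^2/r
```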
The proposed extension is then applied to the construction of mortality improvement scales for Canadian insured lives. Given that the insured lives data series are too short for a direct Lee-Carter projection, we build an additional relational model that borrows strength from the Canadian population data, which cover a far longer period. The resultant scales include explicit measures of uncertainty.
The prediction of the tail of a survival distribution requires special treatment owing to the lack of high-quality old-age mortality data. We utilize asymptotic results from modern extreme value theory to extrapolate death probabilities to the advanced ages, and to determine statistically the age at which the life table should be closed. This technique is further integrated with the Lee-Carter model to produce a stochastic analysis of old-age mortality and a prediction of the highest attained age for various cohorts.
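A minimal sketch of the peaks-over-threshold idea follows: ages at death above a threshold are fitted with a generalized Pareto distribution, whose finite endpoint (when the fitted shape is negative) suggests an age at which to close the table. The fixed threshold and the off-the-shelf fit are simplifying assumptions, not the thesis's threshold-selection procedure.

```python
import numpy as np
from scipy.stats import genpareto

def fit_tail(ages_at_death, u=95.0):
    """Fit a GPD to exceedances of ages at death over threshold u."""
    ages = np.asarray(ages_at_death, dtype=float)
    exc = ages[ages > u] - u
    xi, _, beta = genpareto.fit(exc, floc=0.0)   # shape xi, scale beta, location fixed at 0
    return xi, beta, (ages > u).mean()           # also return P(age at death > u)

def survival_beyond(x, u, xi, beta, p_u):
    """P(age at death > x) for x > u under the fitted tail; for xi < 0 this
    reaches zero at the finite endpoint u - beta/xi, a natural closing age."""
    return p_u * genpareto.sf(x - u, xi, scale=beta)
```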
The mortality models we considered are further applied to the valuation of mortality-related financial products. In particular, we investigate the no-negative-equity guarantee that is offered in most fixed-repayment lifetime mortgages in Britain. The valuation of such a guarantee requires simultaneous consideration of both longevity risk and house price inflation risk. We found that house price returns can be well described by an ARMA-EGARCH time-series process. Under an ARMA-EGARCH process, however, the Black-Scholes formula no longer applies, so we derive our own pricing formula based on the conditional Esscher transformation. Finally, we propose some possible hedging and capital reserving strategies for managing the risks associated with the guarantee.
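For the house price component, the sketch below fits an AR(1) + EGARCH(1,1) model with t innovations using the arch package on synthetic stand-in data. This approximates rather than reproduces the ARMA-EGARCH specification, since arch's built-in mean models cover AR rather than full ARMA.

```python
import numpy as np
from arch import arch_model

# Synthetic stand-in for a monthly house price index (the thesis uses UK data).
rng = np.random.default_rng(0)
index = 100.0 * np.exp(np.cumsum(rng.normal(0.003, 0.01, 400)))
returns = 100.0 * np.diff(np.log(index))      # percentage log-returns

# AR(1) mean, EGARCH(1,1) volatility, Student-t innovations.
am = arch_model(returns, mean="AR", lags=1, vol="EGARCH", p=1, o=1, q=1, dist="t")
res = am.fit(disp="off")
print(res.summary())
```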
45.
The Valuation and Risk Management of a DB Underpin Pension Plan. Chen, Kai (January 2007).
Hybrid pension plans offer employees the best features of both defined benefit and defined contribution plans. In this work, we consider the hybrid design offering a
defined contribution benefit with a guaranteed minimum defined benefit underpin, and we apply the contingent claims approach to value it. The study shows that entry age, utility function parameters and the market price of risk each have a significant effect on the value of retirement benefits.
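A toy version of the contingent-claims idea: the underpin benefit max(DC fund, guarantee) equals the DC fund plus a put on that fund struck at the guarantee, so under a simple lognormal assumption its value has a Black-Scholes form. The fixed guarantee, the GBM fund and the absence of salary dynamics are simplifications this sketch makes that the thesis does not.

```python
import numpy as np
from scipy.stats import norm

def underpin_value(F0, G, r, sigma, T):
    """Value of max(F_T, G) paid at T = current fund F0 + put struck at G,
    with the DC fund following GBM under the risk-neutral measure."""
    d1 = (np.log(F0 / G) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    put = G * np.exp(-r * T) * norm.cdf(-d2) - F0 * norm.cdf(-d1)
    return F0 + put

print(underpin_value(F0=100.0, G=120.0, r=0.04, sigma=0.15, T=20.0))
```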
We also consider risk management for this defined benefit underpin pension plan. Assuming fixed interest rates, and assuming that salaries can be treated as a tradable asset, contribution rates are developed for the Entry Age Normal (EAN), Projected Unit Credit (PUC) and Traditional Unit Credit (TUC) funding methods. For the EAN, the contribution rates are constant throughout the service period; however, the hedge parameters for this method are not tradable. For the accruals methods, the individual contribution rates are not constant. For both the PUC and the TUC, a delta hedging strategy is derived and explained.
The analysis is then extended to relax the tradability assumption for salaries, using inflation as a partial hedge. Finally, methods for incorporating volatility reduction and risk management are considered.
46.
Multivariate Time Series Analysis of the Investment Guarantee in Canadian Segregated Fund Products. Liu, Jie (20 May 2008).
In the context of guarantee liability valuation, the sophisticated fund-of-funds structure of some Canadian segregated fund products often requires us to model multiple market indices simultaneously in order to benchmark the return of the underlying fund. In this thesis, we apply multivariate GARCH models with Gaussian and non-Gaussian noise to project the future investment scenarios of the fund. We further conduct a simulation study to investigate the differences, among the proposed multivariate models, in the valuation of the Guaranteed Minimum Maturity Benefit (GMMB) option.
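As a schematic of the valuation exercise, the sketch below Monte Carlo-values a GMMB payoff on a fund benchmarked to weighted index returns. I.i.d. multivariate normal monthly log-returns stand in for the multivariate GARCH scenarios of the thesis, and all inputs are hypothetical.

```python
import numpy as np

def gmmb_value(F0, G, mu, cov, weights, T_months, r, n_paths=100_000, seed=0):
    """Discounted expected GMMB payoff max(G - F_T, 0) under i.i.d. multivariate
    normal monthly log-returns on the underlying indices."""
    rng = np.random.default_rng(seed)
    Z = rng.multivariate_normal(mu, cov, size=(n_paths, T_months))
    fund_log_return = (Z @ weights).sum(axis=1)   # benchmark: weighted index returns
    F_T = F0 * np.exp(fund_log_return)
    payoff = np.maximum(G - F_T, 0.0)
    return np.exp(-r * T_months / 12) * payoff.mean()

mu = np.array([0.004, 0.003])                     # hypothetical monthly means
cov = np.array([[0.0025, 0.0010],
                [0.0010, 0.0016]])                # hypothetical monthly covariance
print(gmmb_value(100.0, 100.0, mu, cov, np.array([0.6, 0.4]), T_months=120, r=0.03))
```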
Based on preliminary data analysis, the proposed multivariate GARCH models are data driven. The goodness of fit of the models is evaluated through formal statistical tests from univariate and multivariate perspectives. The estimation and associated practical issues are discussed in detail, and the impact of the innovation distributions is addressed. More importantly, we demonstrate an actuarial approach to managing the guarantee liability for complex segregated fund products.
47.
Analysis of Islamic Stock Indices. Mohammed, Ansarullah Ridwan (January 2009).
In this thesis, an attempt is made to build on the quantitative research in the field of Islamic finance. Firstly, univariate modelling using special GARCH-type models is performed on both the FTSE All World and FTSE Shari'ah All World indices. The AR(1) + APARCH(1,1) model with standardized skewed Student-t innovations provided the best overall fit and was the most successful at VaR modelling for long and short trading positions. A risk assessment is done using the Conditional Tail Expectation (CTE) risk measure, which concluded that in short trading positions the FTSE Shari'ah All World index was riskier than the FTSE All World index, but in long trading positions the results were not conclusive as to which is riskier. Secondly, under the Markowitz model of risk and return, the performance of Islamic equity is compared to conventional equity using various Dow Jones indices. The results indicated that even though the Islamic portfolio is relatively less diversified than the conventional portfolio, owing to several investment restrictions, the Shari'ah screening process excluded various industries whose absence resulted in risk reduction. As a result, the Islamic portfolio provided a basket of stocks with special and favourable risk characteristics. Lastly, copulas are used to model the dependence structure between the filtered returns of the FTSE All World and FTSE Shari'ah All World indices after fitting the AR(1) + APARCH(1,1) model with standardized skewed Student-t innovations. The t copula outperformed the others, and a demonstration of forecasting using the copula-extended model is given.
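For reference, empirical versions of the two tail measures used above take only a few lines; the convention that losses are the negated (filtered) returns of the position held is an assumption of this sketch.

```python
import numpy as np

def var_cte(losses, level=0.99):
    """Empirical VaR and CTE at the given level; losses are positive for bad outcomes."""
    losses = np.sort(np.asarray(losses, dtype=float))
    var = np.quantile(losses, level)
    cte = losses[losses >= var].mean()    # average loss beyond VaR
    return var, cte
```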
48.
Option Pricing and Hedging Analysis under Regime-switching Models. Qiu, Chao (January 2013).
This thesis explores option pricing and hedging in a discrete time regime-switching environment. If the regime risk cannot be hedged away, then we cannot ignore this risk and use the Black-Scholes pricing and hedging framework to generate a unique
pricing and hedging measure. We develop a risk neutral pricing measure by applying an Esscher Transform to the real world asset price process, with the focus on the issue of
incompleteness of the market. The Esscher transform turns out to be a convenient and effective tool for option pricing under the
discrete time regime switching models. We apply the pricing measure to both univariate European options and multivariate
options. To better understand the effect of the pricing method, we also compared the results with those generated from two
other risk neutral methods: the Black-Scholes model, and the natural equivalent martingale method.
We further investigate the difference in hedging associated with different pricing measures. This is of interest when the choice of pricing method is uncertain under regime switching models. We compare four hedging strategies: delta hedging for the three risk neutral pricing methods under
study, and mean variance hedging. We also develop a more general tool of tail
ordering for hedging analysis in a general incomplete market with the uncertainty of the risk neutral measures. As a result of the
analysis, we propose that pricing and hedging using the Esscher transform may be an effective strategy for a market where
the regime switching process brings uncertainty.
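A minimal sketch of the pricing recipe described above: within each regime the Esscher tilt of normal log-returns shifts the mean so that the discounted price is a one-step martingale, and the option is then priced by simulating the tilted regime-switching dynamics. The two-regime parameters are hypothetical, and regime risk itself is left unpriced here, which is one face of the incompleteness discussed above.

```python
import numpy as np

def esscher_h(mu, sigma, r):
    """Esscher parameter per regime: for N(mu, sigma^2) log-returns, this tilt
    makes the discounted price a one-step martingale."""
    return (r - mu - 0.5 * sigma**2) / sigma**2

def rs_call_price(S0, K, r, mus, sigmas, P, T, n_paths=20_000, seed=0):
    """Monte Carlo price of a European call under a two-regime lognormal model,
    simulating the Esscher-tilted dynamics N(mu + h sigma^2, sigma^2) per regime."""
    rng = np.random.default_rng(seed)
    hs = [esscher_h(m, s, r) for m, s in zip(mus, sigmas)]
    payoffs = np.empty(n_paths)
    for i in range(n_paths):
        state, logS = 0, np.log(S0)
        for _ in range(T):
            mu_q = mus[state] + hs[state] * sigmas[state] ** 2  # equals r - sigma^2/2
            logS += rng.normal(mu_q, sigmas[state])
            state = rng.choice(2, p=P[state])
        payoffs[i] = max(np.exp(logS) - K, 0.0)
    return np.exp(-r * T) * payoffs.mean()

P = np.array([[0.95, 0.05], [0.10, 0.90]])        # hypothetical transition matrix
print(rs_call_price(100.0, 100.0, 0.002, [0.005, -0.010], [0.03, 0.08], P, T=12))
```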
49.
Estimation and allocation of insurance risk capital. Kim, Hyun Tae (27 April 2007).
Estimating tail risk measures such as Value at Risk (VaR) and Conditional Tail Expectation
(CTE) is a vital component in financial and actuarial risk management.
The CTE is a preferred risk measure, due to its coherence and widespread acceptance
in the actuarial community. In particular, we focus on the estimation of the CTE using
both parametric and nonparametric approaches.
In the parametric case, the conditional tail expectation and variance are analytically
derived for the exponential distribution family and its transformed distributions.
For small i.i.d. samples, the exact bootstrap (EB) and the influence function are
used as nonparametric methods for estimating the bias and the variance of the empirical
CTE. In particular, it is shown that the bias of the empirical CTE is corrected by the bootstrap. For variance estimation, the influence function of the bootstrapped
quantile is derived, and it can be used to estimate, without simulation, the variance of any bootstrapped
L-estimator, including the VaR and the CTE, via the nonparametric
delta method. An industry model is provided by applying the theoretical findings
on the bias and the variance of the estimated CTE.
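As a sketch of the bias-correction idea, the ordinary simulated bootstrap below estimates and removes the bias of the empirical CTE. The thesis's exact bootstrap evaluates this expectation analytically, without the Monte Carlo noise this version retains.

```python
import numpy as np

def empirical_cte(x, alpha=0.95):
    """Empirical CTE: mean of the sample values at or above the alpha-quantile."""
    x = np.sort(np.asarray(x, dtype=float))
    q = np.quantile(x, alpha)
    return x[x >= q].mean()

def bias_corrected_cte(x, alpha=0.95, B=5000, seed=0):
    """Bootstrap bias correction: theta_hat minus the estimated bias."""
    x = np.asarray(x, dtype=float)
    rng = np.random.default_rng(seed)
    theta = empirical_cte(x, alpha)
    boot = np.array([empirical_cte(rng.choice(x, size=x.size), alpha)
                     for _ in range(B)])
    return theta - (boot.mean() - theta)
```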
Finally, a new capital allocation method is proposed. Inspired by the allocation
of the solvency exchange option by Sherris (2006), this method resembles the CTE
allocation in its form and properties, but has its own unique features, such as manager-based
decomposition. Through a numerical example, the proposed allocation is shown
to fail the no-undercut axiom, but we argue that this axiom may not be aligned with
economic reality.