41

Funding Liquidity and Limits to Arbitrage

Aoun, Bassam 01 June 2012 (has links)
Arbitrageurs play an important role in keeping market prices close to their fundamental values by providing market liquidity. Most arbitrageurs, however, use leverage. When funding conditions worsen, they are forced to reduce their positions; the resulting selling pressure depresses market prices and, in certain situations, pushes arbitrage spreads to levels exceeding many standard deviations. This phenomenon drove many century-old financial institutions into bankruptcy during the 2007–2009 financial crisis. In this thesis, we provide empirical evidence and demonstrate analytically the effects of funding liquidity on arbitrage. We further discuss the implications for risk management. To conduct our empirical studies, we construct a novel Funding Liquidity Stress Index (FLSI) using principal component analysis. Its constituents are measures representing various funding channels. We study the relationship between the FLSI index and three different arbitrage strategies that we reproduce with real, daily transactional data. We show that the FLSI index has strong explanatory power for changes in arbitrage spreads and is an important source of contagion between various arbitrage strategies. In addition, we perform event studies surrounding changes in margin requirements on futures contracts. These event studies provide empirical evidence supporting important assumptions and predictions of various theoretical work on market microstructure. Next, we explain the mechanism through which funding liquidity affects arbitrage spreads. To do so, we study the liquidity risk premium in a market microstructure framework where market prices are determined by the supply and demand of securities. We extend the model developed by Brunnermeier and Pedersen [BP09] to multiple periods and generalize their work by considering all market participants to be risk-averse. We further decompose the liquidity risk premium into two components: 1) a fundamental risk premium and 2) a systemic risk premium. The fundamental risk premium compensates market participants for providing liquidity in a security whose fundamental value is volatile, while the systemic risk premium compensates them for taking positions in a market that is vulnerable to funding liquidity shocks. The first component is therefore related to the nature of the security, while the second is related to the fragility of the market microstructure (such as the leverage of market participants and margin-setting mechanisms).
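As an illustration of the index construction described above, here is a minimal sketch of extracting the first principal component from standardized funding-market indicators. The inputs named in the comments are hypothetical, since the abstract does not list the FLSI's actual constituents:

```python
import numpy as np

def funding_stress_index(X):
    """First principal component of standardized funding indicators.

    X: (T, k) array of k daily funding-channel series -- hypothetical
    inputs such as interbank spreads or repo haircuts; the thesis's
    actual constituents are not listed in this abstract.
    """
    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each series
    eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    w = eigvecs[:, -1]                         # loadings of the largest PC
    if w.sum() < 0:                            # orient so stress is positive
        w = -w
    return Z @ w                               # one index value per day
```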
43

Cognitive Vulnerability and the Actuarial Prediction of Depressive Course

Grant, David Adam January 2012 (has links)
A wealth of research indicates that depression is a serious global health issue, and that it is often characterized by a complicated and varied course. The ability to predict depressive course would be tremendously valuable for clinicians. However, the extant literature has not yet produced an accurate and efficient means by which to predict the course of depression. Research also indicates that cognitive variables - and cognitive vulnerability factors in particular - are related to the course of depression. In examining data provided by participants in the Temple-Wisconsin Cognitive Vulnerability to Depression Project (N = 345), the current study aimed to elucidate the relationship between cognitive vulnerability and depressive course using an actuarial statistical method. Results indicated that several cognitive measures predicted aspects of the onset and course of depression at rates significantly better than chance; foremost among these was the Cognitive Style Questionnaire (CSQ; Alloy et al., 2000). The CSQ was found to be the variable that best differentiated between participants who developed an episode of depression and those who did not. Furthermore, in comparison to participants who did not develop an episode of depression, the CSQ was found to differentiate between participants who recovered from a given depressive episode and those who did not, as well as between participants who experienced a single episode and those experiencing a recurrent course of the disorder across the prospective phase of the study. Conceptual and clinical implications of these results are discussed, as are directions for future research. / Psychology
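A hedged illustration of what prediction "at rates significantly better than chance" means operationally: the area under the ROC curve of a single score, with 0.5 as the chance level. The arrays below are simulated stand-ins, not the project's data:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical stand-ins: csq holds Cognitive Style Questionnaire
# scores; onset is 1 if a depressive episode later developed.
rng = np.random.default_rng(0)
onset = rng.integers(0, 2, size=345)
csq = rng.normal(loc=onset, scale=2.0)   # weak simulated signal

# AUC = 0.5 is chance-level discrimination; "significantly better
# than chance" corresponds to an AUC reliably above 0.5.
print(roc_auc_score(onset, csq))
```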
44

'n Ondersoek na die eindige steekproefgedrag van inferensiemetodes in ekstreemwaarde-teorie [An investigation into the finite-sample behaviour of inference methods in extreme value theory]

Van Deventer, Dewald 03 1900 (has links)
Thesis (MComm (Statistics and Actuarial Science))--University of Stellenbosch, 2005. / Extremes are unusual or rare events. However, when such events – for example, earthquakes, tidal waves and market crashes – do take place, they typically cause enormous losses, both in terms of human lives and monetary value. For this reason, it is of critical importance to model extremal events accurately. Extreme value theory entails the development of statistical models and techniques to describe and model such rare observations. In this document we discuss aspects of extreme value theory. The theory consists of two approaches: the classical maxima method, based on the properties of the maximum of a sample, and the more popular threshold approach, based on the properties of exceedances of a specified threshold value. This document provides the practitioner with the theoretical and practical tools for both approaches, enabling him or her to perform extreme value analyses with confidence. Extreme value theory – for both approaches – is based upon asymptotic arguments. For finite samples, the limiting result for the sample maximum holds only approximately. Similarly, for finite choices of the threshold, the limiting distribution for exceedances of that threshold holds only approximately. In this document we investigate the quality of extreme-value-based inferences about the unknown underlying distribution when the sample size or threshold is finite. Estimation of extreme tail quantiles of the underlying distribution, together with the calculation of confidence intervals, is typically the most important objective of an extreme value analysis; for that reason, we evaluate the accuracy of extreme-value-based inferences in terms of these estimates. The investigation was carried out as a simulation study performed with the software package S-Plus.
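A minimal sketch of the threshold approach the abstract describes, assuming SciPy's generalized Pareto distribution; for a finite threshold the fitted tail quantile holds only approximately, which is the thesis's point:

```python
import numpy as np
from scipy.stats import genpareto

def tail_quantile(data, u, p):
    """Peaks-over-threshold estimate of the level exceeded with prob p.

    Fits a generalized Pareto distribution to exceedances of the
    threshold u. Assumes a non-zero fitted shape parameter xi.
    """
    exc = data[data > u] - u
    xi, _, beta = genpareto.fit(exc, floc=0.0)   # shape xi, scale beta
    zeta = len(exc) / len(data)                  # empirical P(X > u)
    # Invert P(X > x) = zeta * (1 + xi * (x - u) / beta)^(-1/xi)
    return u + (beta / xi) * ((p / zeta) ** (-xi) - 1.0)
```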
45

Confidence intervals for estimators of welfare indices under complex sampling

Kirchoff, Retha 03 1900 (has links)
Thesis (MComm (Statistics and Actuarial Science))--University of Stellenbosch, 2010. / The aim of this study is to obtain estimates and confidence intervals for welfare indices under complex sampling. It begins by looking at sampling in general, with specific focus on complex sampling and weighting. For the estimation of the welfare indices, two resampling techniques, viz. the jackknife and the bootstrap, are discussed; they are used for the estimation of bias and standard error under simple random sampling and complex sampling. Three confidence intervals are discussed, viz. the standard (asymptotic), percentile and bootstrap-t intervals. An overview of welfare indices and their estimation is given. The indices are categorized into measures of poverty and measures of inequality. Two Laeken indices, viz. the at-risk-of-poverty rate and the quintile share ratio, are included in the discussion. The study considers two poverty lines: an absolute poverty line based on "percy" (the ratio of total household income to household size) and a relative poverty line based on equivalized income (the ratio of total household income to equivalized household size). The data set used as surrogate population for the study is the Income and Expenditure Survey 2005/2006 conducted by Statistics South Africa; details of it are provided and discussed. An analysis of simulation data from the surrogate population was carried out using the techniques mentioned above, and the results were graphed, tabulated and discussed. Two issues were examined: whether the design of the survey should be taken into account, and whether resampling techniques provide reliable results, especially for confidence intervals. The results were a mixed bag. Overall, however, it was found that weighting showed promise in many cases, especially in improving the coverage probabilities of the confidence intervals. It was also found that the bootstrap resampling technique was reliable (judging by standard errors). Further research options are mentioned as possible ways to resolve the mixed results.
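A minimal sketch of the percentile bootstrap interval discussed above, assuming a simplified welfare index; the naive resampling shown ignores the survey design, which is precisely the question the study raises:

```python
import numpy as np

def percentile_ci(y, w, stat, B=1000, alpha=0.05, seed=0):
    """Percentile bootstrap interval for a weighted welfare index.

    y: incomes, w: survey weights, stat(y, w) -> float. This naive
    version resamples households independently; a design-consistent
    bootstrap would resample whole clusters/PSUs instead.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    reps = np.empty(B)
    for b in range(B):
        idx = rng.integers(0, n, size=n)    # resample with replacement
        reps[b] = stat(y[idx], w[idx])
    return tuple(np.quantile(reps, [alpha / 2, 1 - alpha / 2]))

# Example index (hypothetical simplified form): weighted headcount
# below 60% of the unweighted median income.
arpr = lambda y, w: np.average(y < 0.6 * np.median(y), weights=w)
```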
46

A brief introduction to basic multivariate economic statistical process control

Mudavanhu, Precious 12 1900 (has links)
Thesis (MComm)--Stellenbosch University, 2012. / Statistical process control (SPC) plays a very important role in monitoring and improving industrial processes to ensure that products produced or shipped to the customer meet the required specifications. The main tool used in SPC is the statistical control chart. The traditional approach to statistical control chart design assumes that a process is described by a single quality characteristic. However, according to Montgomery and Klatt (1972), industrial processes and products can have more than one quality characteristic, and their joint effect describes product quality. Process monitoring in which several related variables are of interest is referred to as multivariate statistical process control (MSPC). As in SPC, the most vital and commonly used tool in MSPC is the statistical control chart. The design of a control chart requires the user to select three parameters: the sample size n, the sampling interval h and the control limits k. Several authors have developed control charts based on more than one quality characteristic; among them, Hotelling (1947) pioneered the use of multivariate process control techniques through the development of the T²-control chart, well known as Hotelling's T²-control chart. Since the introduction of the control chart technique, the most common and widely used design method has been the statistical design. However, according to Montgomery (2005), the design of a control chart also has economic implications. Costs are incurred during the operation of a control chart: costs of sampling and testing, costs associated with investigating an out-of-control signal and possibly correcting any assignable cause found, costs associated with the production of nonconforming products, and so on. This paper gives an overview of the different methods and techniques that have been employed to develop economic statistical models for MSPC. The first multivariate economic model presented is the economic design of Hotelling's T²-control chart to maintain current control of a process, developed by Montgomery and Klatt (1972). This is followed by the work of Kapur and Chao (1996), in which a specification region for the multiple quality characteristics is created and a multivariate quality loss function is used to minimize total loss to both the producer and the customer. Another approach, by Chou et al. (2002), is also presented, in which a procedure is developed that simultaneously monitors the process mean and covariance matrix through the use of a quality loss function; the procedure is based on the test statistic −2 ln L, and the cost model builds on the ideas of Montgomery and Klatt (1972) as well as Kapur and Chao (1996). One example of the use of the variable-sample-size technique in the economic and economic statistical design of a control chart is also presented: specifically, the economic and economic statistical design of the T²-control chart with two adaptive sample sizes (Faraz et al., 2010). Faraz et al. (2010) developed a cost model of a variable-sample-size T²-control chart for economic and economic statistical design using Lorenzen and Vance's (1986) model. 
There are several other approaches to the multivariate economic statistical process control (MESPC) problem, but this project focuses on cases based on the Phase II stage of the process, where the mean vector and the covariance matrix have been fairly well established and can be taken as known, yet both remain subject to assignable causes. This latter aspect is often ignored by researchers. Nevertheless, the article by Faraz et al. (2010) is included to give more insight into how more sophisticated approaches may fit in with MESPC, even if only the mean vector may be subject to an assignable cause. Keywords: control chart; statistical process control; multivariate statistical process control; multivariate economic statistical process control; multivariate control chart; loss function.
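A minimal sketch of the Phase II Hotelling T² statistic with known in-control parameters, as assumed in the text; the 0.995 control limit is an illustrative choice:

```python
import numpy as np
from scipy.stats import chi2

def t2(xbar, mu0, Sigma, n):
    """Hotelling T^2 for a subgroup mean against known targets.

    Phase II setting as in the text: the in-control mean vector mu0
    and covariance matrix Sigma are taken as known; n is the sample
    size of the subgroup with mean vector xbar.
    """
    d = xbar - mu0
    return n * d @ np.linalg.solve(Sigma, d)

p = 2                          # number of quality characteristics
ucl = chi2.ppf(0.995, df=p)    # chart signals when t2(...) > ucl
```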
47

Optimal asset allocation for South African pension funds under the revised Regulation 28

Koegelenberg, Frederik Johannes 03 1900 (has links)
Thesis (MComm)--Stellenbosch University, 2012. / On 1 July 2011 the revised version of Regulation 28, which governs the investments of the South African pension fund industry, took effect. The new version allows a pension fund to invest up to 25 percent of its total assets in foreign assets, compared with 20 percent under the previous version. The aim of this study is to determine whether it would be optimal for a South African pension fund to invest the full 25 percent of its portfolio in foreign assets. Seven different optimization models are evaluated to determine the optimal asset mix. The models were selected through an extensive literature study in order to address key optimization issues, e.g. which risk measure to use, whether parametric or non-parametric optimization should be used, and whether the mean-variance model defined by Markowitz, long the benchmark for asset allocation, is the best model for determining long-term asset allocation strategies. The results obtained from the different models were used to recommend the optimal long-term asset allocation for a South African pension fund, and the models were compared to determine which proved most efficient. The study found that, when using only the past ten years of data to construct the portfolios, it would have been optimal to invest in only South African asset classes, with statistically significant return differences in some cases. Using the past 20 years of data produced mixed results, while the 30-year period favoured an international portfolio with the full 25% invested in foreign asset classes. A comparison of the different models produced a clear winner with regard to the probability of outperformance: the historical resampled mean-variance optimization provided the highest probability of outperforming the benchmark. The study also found that a 20-year window is the optimal historical data period for constructing the optimal portfolio.
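As a sketch of one of the optimization problems compared above, here is a long-only minimum-variance allocation with a Regulation 28-style 25% foreign cap, using SciPy's SLSQP solver; the inputs are illustrative, not the study's data:

```python
import numpy as np
from scipy.optimize import minimize

def min_variance(mu, Sigma, foreign, cap=0.25, target=None):
    """Long-only minimum-variance weights with a foreign-asset cap.

    mu, Sigma: expected returns and covariance of the asset classes;
    foreign: numpy boolean mask for foreign classes; cap mirrors the
    25% Regulation 28 limit. Illustrative inputs only.
    """
    k = len(mu)
    cons = [{"type": "eq",   "fun": lambda w: w.sum() - 1.0},
            {"type": "ineq", "fun": lambda w: cap - w[foreign].sum()}]
    if target is not None:   # optional minimum expected return
        cons.append({"type": "ineq", "fun": lambda w: w @ mu - target})
    res = minimize(lambda w: w @ Sigma @ w, np.full(k, 1.0 / k),
                   bounds=[(0.0, 1.0)] * k, constraints=cons)
    return res.x
```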
48

The use of discharge risk assessment instruments in general psychiatric services in the United Kingdom

Stein, William Morris January 2000 (has links)
No description available.
49

Stochastic Mortality Models with Applications in Financial Risk Management

Li, Siu Hang 18 June 2007 (has links)
In product pricing and reserving, actuaries are often required to make predictions of future death rates. In the past, this has been done using deterministic improvement scales that give only a single mortality trajectory. However, future death rates are very likely to turn out different from the projected ones, so a better assessment of longevity risk is one that consists of both a mean estimate and a measure of uncertainty. Such an assessment can be performed using a stochastic mortality model, which is the core of this thesis. The Lee-Carter model is one of the most popular stochastic mortality models. While it does an excellent job in mean forecasting, it has been criticized for providing overly narrow prediction intervals that may understate uncertainty. This thesis mitigates the problem by relaxing the assumption on the distribution of death counts. We find that the generalization from Poisson to negative binomial is equivalent to allowing gamma heterogeneity within each age-period cell. The proposed extension gives not only a better fit, but also a more conservative prediction interval that may better reflect the uncertainty entailed. The extension is then applied to the construction of mortality improvement scales for Canadian insured lives. Given that the insured-lives data series are too short for a direct Lee-Carter projection, we build an additional relational model that borrows strength from the Canadian population data, which cover a far longer period. The resultant scales come with explicit measures of uncertainty. Predicting the tail of a survival distribution requires special treatment owing to the lack of high-quality old-age mortality data. We use asymptotic results from modern extreme value theory to extrapolate death probabilities to the advanced ages, and to determine statistically the age at which the life table should be closed. This technique is further integrated with the Lee-Carter model to produce a stochastic analysis of old-age mortality and a prediction of the highest attained age for various cohorts. The mortality models considered are further applied to the valuation of mortality-related financial products. In particular, we investigate the no-negative-equity guarantee offered in most fixed-repayment lifetime mortgages in Britain. Valuing this guarantee requires simultaneous consideration of both longevity risk and house price inflation risk. We find that house price returns are well described by an ARMA-EGARCH time-series process. Under an ARMA-EGARCH process, however, the Black-Scholes formula no longer applies, and we derive our own pricing formula based on the conditional Esscher transform. Finally, we propose possible hedging and capital reserving strategies for managing the risks associated with the guarantee.
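A minimal sketch of the basic Lee-Carter fit that the thesis extends, using the standard SVD estimator of the age and period effects (the Poisson/negative binomial likelihood versions discussed in the abstract replace this estimation step):

```python
import numpy as np

def lee_carter(log_m):
    """Rank-one Lee-Carter decomposition log m(x,t) ~ a_x + b_x * k_t.

    log_m: (ages x years) matrix of log central death rates.
    Uses the usual constraints sum(b) = 1 and (approximately)
    sum(k) = 0 from row-centering.
    """
    a = log_m.mean(axis=1)                    # a_x: average log rate
    U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
    b = U[:, 0] / U[:, 0].sum()               # normalized age loadings
    k = s[0] * Vt[0] * U[:, 0].sum()          # rescaled period index
    return a, b, k
```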
50

The Valuation and Risk Management of a DB Underpin Pension Plan

Chen, Kai January 2007 (has links)
Hybrid pension plans offer employees the best features of both defined benefit and defined contribution plans. In this work, we consider a hybrid design offering a defined contribution benefit with a guaranteed defined benefit minimum underpin. The study applies the contingent claims approach to value this benefit and shows that entry age, utility function parameters and the market price of risk each have a significant effect on the value of retirement benefits. We also consider risk management for the defined benefit underpin pension plan. Assuming fixed interest rates, and assuming that salaries can be treated as a tradable asset, contribution rates are developed for the Entry Age Normal (EAN), Projected Unit Credit (PUC) and Traditional Unit Credit (TUC) funding methods. For the EAN method, the contribution rates are constant throughout the service period; however, the hedge parameters for this method are not tradable. For the accruals methods, the individual contribution rates are not constant. For both the PUC and TUC, a delta hedging strategy is derived and explained. The analysis is then extended to relax the tradability assumption for salaries, using inflation as a partial hedge. Finally, methods for incorporating volatility reduction and risk management are considered.
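Because the underpin pays the shortfall of the DC account below the DB benefit, it can be sketched as a European put under strong simplifications (fixed rates, lognormal fund value, deterministic guarantee); this is a stylized illustration, not the thesis's full model, which also treats salary risk and funding methods:

```python
from math import exp, log, sqrt
from scipy.stats import norm

def underpin_put(F, G, r, sigma, T):
    """Stylized value of a DB underpin as a European put on the
    DC account: the guarantee pays max(G - F_T, 0) at retirement.

    F: current DC fund value, G: (here deterministic) DB amount,
    r: risk-free rate, sigma: fund volatility, T: years to retirement.
    """
    d1 = (log(F / G) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return G * exp(-r * T) * norm.cdf(-d2) - F * norm.cdf(-d1)
```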
