  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
311

Stochastic Solvency Testing in Life Insurance

Hayes, Genevieve Katherine, genevieve.hayes@anu.edu.au January 2009 (has links)
Stochastic solvency testing methods have existed for more than 20 years, yet little research has been conducted in this area, particularly in Australia. There are a number of reasons for this, the most pertinent being the lack of computing power available in the past to implement more sophisticated techniques. However, recent advances in computing have made stochastic solvency testing possible in practice and have produced a trend towards its use in advanced studies.

The purpose of this thesis is to develop a realistic solvency testing model in a form that can be implemented by Australian Life Insurers, in anticipation that the Australian insurance regulator, APRA, will ultimately follow the world trend and require stochastic solvency testing to be carried out in Australia. The model is constructed from three interconnected stochastic sub-models describing the economic environment and the mortality and lapsation experience of the portfolio of policies under consideration. Australian economic and Life Insurance data are used to fit a number of candidate sub-models, such as generalised linear models, over-dispersion models and asset models, and the "best" model is selected in each case. The selected models are a modified CAS/SOA economic sub-model; either a Poisson or negative binomial (NB1) distribution (depending on the policy type considered) as the mortality sub-model; and a normal-Poisson lapsation sub-model.

Based on tests carried out using this model, it is demonstrated that, for portfolios of level and yearly-renewable term insurance business, the current deterministic solvency capital requirements provide little protection against insolvency. In fact, for the test portfolios of term insurance policies considered, the deterministic capital requirements have levels of sufficiency of less than 2% (on a Value-at-Risk basis) when compared with the change in capital distribution over a three-year time horizon. This is of concern, as yearly-renewable term insurance comprises a significant volume of Life Insurance business in Australia: there were over 426,000 yearly-renewable term insurance policies on the books of Australian Life Insurers in 1999, with more business expected since then.

A sensitivity analysis shows that the results of the stochastic asset requirement calculations are sensitive to the choice of sub-model used to forecast economic variables and to the choice of formulae used to describe the mean mortality and lapsation rates. The implication is that, if APRA were to require Life Insurers to calculate their solvency capital requirements on a stochastic basis, some guidance would need to be provided on the components of the solvency testing model used. The model is not, however, sensitive to whether an allowance is made for mortality or lapsation rate over-dispersion, nor to whether dependency relationships between mortality rates, lapsation rates and the economy are allowed for. Thus, over-dispersion and dependency relationships between the sub-models can be ignored in a stochastic solvency testing model without significantly affecting the calculated solvency requirements.
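The choice between a Poisson and a negative binomial (NB1) mortality sub-model above comes down to over-dispersion: the Poisson forces variance equal to the mean, while NB1 allows variance proportional to it. A minimal sketch of that check, on toy simulated claim counts (not the thesis's data or fitted models):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical annual death counts for a block of term policies.
# Under a Poisson model, variance ~= mean; NB1 allows variance = phi * mean.
counts = rng.negative_binomial(n=5, p=0.5, size=200)  # deliberately over-dispersed toy data

mean = counts.mean()
var = counts.var(ddof=1)
dispersion = var / mean  # ratio well above 1 favours NB1 over Poisson

print(round(dispersion, 2))
```

In practice the decision would be made with a formal test or by comparing fitted GLM deviances, but the variance-to-mean ratio is the intuition behind it.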
312

Incorporating discontinuities in value-at-risk via the poisson jump diffusion model and variance gamma model

Lee, Brendan Chee-Seng, Banking & Finance, Australian School of Business, UNSW January 2007 (has links)
We utilise several asset pricing models that allow for discontinuities in the returns and volatility time series in order to obtain estimates of Value-at-Risk (VaR). The first class of model mixes a continuous diffusion process with discrete jumps at random points in time (the Poisson Jump Diffusion Model). We also apply a purely discontinuous model whose underlying distribution contains no continuous component at all (the Variance Gamma Model). These models have been shown to have some success in capturing certain characteristics of return distributions, among them leptokurtosis and skewness. Calibrating these models to the returns of an index of Australian stocks (the All Ordinaries Index), we then use the resulting parameters to obtain daily estimates of VaR. To obtain the VaR estimates for the Poisson Jump Diffusion Model and the Variance Gamma Model, we introduce an innovation from option pricing techniques that works with the models' more tractable characteristic functions. Having obtained a series of VaR estimates, we apply a variety of criteria to assess how each model performs, and also evaluate the models against traditional approaches to calculating VaR, such as that suggested by J.P. Morgan's RiskMetrics. Our results show that while the Poisson Jump Diffusion Model proved the most accurate at the 95% VaR level, neither the Poisson Jump Diffusion nor the Variance Gamma Model was dominant in the other performance criteria examined. Overall, no model was clearly superior according to all the performance criteria analysed, and the extra computational time required to calibrate the Poisson Jump Diffusion and Variance Gamma Models for VaR estimation does not appear to provide sufficient reward over the approach currently employed by RiskMetrics.
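The thesis obtains its VaR figures via characteristic functions; a simpler route to the same quantity under a jump-diffusion is straight Monte Carlo on daily log-returns. The sketch below uses illustrative parameters (not values calibrated to the All Ordinaries) and a Merton-style compound-Poisson jump term:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative annualised parameters, chosen for the demo only
mu, sigma = 0.05, 0.20            # diffusion drift and volatility
lam, mj, sj = 10.0, -0.02, 0.03   # jump intensity, mean and std of log-jump size
dt = 1.0 / 252                    # one trading day
n = 100_000                       # simulated daily log-returns

# Conditional on k jumps, the jump contribution is N(k*mj, k*sj^2)
jumps = rng.poisson(lam * dt, n)
ret = ((mu - 0.5 * sigma**2) * dt
       + sigma * np.sqrt(dt) * rng.standard_normal(n)
       + jumps * mj + np.sqrt(jumps) * sj * rng.standard_normal(n))

var_95 = -np.quantile(ret, 0.05)  # 95% one-day VaR, expressed as a positive loss
print(round(var_95, 4))
```

The characteristic-function approach the authors use avoids this simulation noise by inverting the distribution's Fourier transform directly, but the Monte Carlo version makes the definition of VaR concrete.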
313

FAMILIES AT RISK - A CRITICAL ANALYSIS OF IMPLICATIONS FOR POLICY AND SERVICES

Roe, Miranda, manroe@aapt.net.au January 2006 (has links)
This thesis examines policy and service delivery issues in the development of health and support for families at risk. The research focuses on families with children less than 7 years of age living in some of the most disadvantaged neighbourhoods of metropolitan Adelaide. The thesis draws on evidence of (a) barriers to service support perceived by these families and (b) their strengths and resources in order to identify and develop arguments related to key issues of policy and service delivery.
314

證券商市場風險管理與風險值的應用:以某證券商為例 / Market Risk Management of Securities Firms and the Application of Value at Risk: A Case Study of a Securities Firm

李榮福 Unknown Date (has links)
Severe volatility in financial markets often inflicts heavy losses on investors and firms, and can even threaten a firm's survival and the stability and development of the financial system as a whole. Whenever financial markets shift, securities-related industries are usually the first affected. Although the business risks facing a securities firm can be divided into six categories (market risk, credit risk, liquidity risk, operational risk, legal risk and systemic risk), market risk is the principal source, as the many major losses suffered by financial institutions at home and abroad in recent years attest. Domestic securities firms are trending towards larger scale, internationalisation and diversification. As their businesses grow larger and more diverse, their holdings of financial assets increase, business complexity and the difficulty of organisational operation and management rise, and business risk rises correspondingly. An appropriate risk management mechanism, sustained risk management capability and suitable resource allocation are therefore matters to which securities firms must pay particular attention while pursuing business expansion.

This study examines the business risks faced by domestic securities firms and the problems in, and recommendations for, their risk management, seeking solutions to the main market risk management problems through a case analysis. Risk control encompasses the organisation and operation of risk management, risk measurement techniques, risk management strategy, and risk management policy and its execution. Besides general risk management strategies (the principles and methods of risk diversification, risk transfer, risk retention and dynamic hedging), the study investigates the application and models of Value-at-Risk measurement, which has attracted considerable attention in recent years: the definition and practical use of VaR, and the calculation methods and characteristics of the individual models (historical simulation, Monte Carlo simulation, the variance-covariance method, and approaches to volatility estimation). On current risk management policy, the focus is on the external regulation and internal systems governing securities firms' risk control, and the problems that exist there.

Regarding the risk management problems facing domestic securities firms and their countermeasures, this study argues that, beyond cultivating and recruiting talent and enforcing management systems, firms need both an efficient risk management tool and an organisational structure and operating model suited to risk management. On the former, the principles of financial engineering and information technology can help a firm, amid rapidly changing market conditions, quickly grasp the size of the risks arising from its business and investment decisions and its capacity to bear them, and then adopt appropriate hedging strategies. This study recommends building a VaR-based risk management information system. On the latter, it recommends an organisational operating model built on professional division of labour, checks and balances on authority, an independent risk control function, and risk-adjusted performance measurement.
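Of the VaR models the study surveys, historical simulation is the simplest: the VaR forecast is just an empirical quantile of past profit-and-loss outcomes, with no distributional assumption. A minimal sketch on hypothetical desk P&L (the figures and currency unit are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical daily P&L history for a trading desk (in NT$ millions)
pnl = rng.normal(0.0, 1.5, size=500)

# Historical-simulation VaR: the loss exceeded on only 1% / 5% of past days
var_99 = -np.quantile(pnl, 0.01)
var_95 = -np.quantile(pnl, 0.05)

print(round(var_95, 2), round(var_99, 2))
```

The variance-covariance and Monte Carlo methods the study also covers differ only in how the P&L distribution is obtained; the quantile step is the same.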
315

台灣債券投資組合風險值之評估 / The Evaluation of Value at Risk (VaR) on Taiwan Bond Portfolio

謝振耀, Hsieh, Chen-Yao Unknown Date (has links)
With Taiwan about to join the WTO, securities firms, banks and other financial institutions, seeking to raise their competitiveness, have made profit maximisation and risk minimisation their primary goals. The importance of risk control is therefore growing by the day, and risk management methods continue to evolve; finding the method best suited to one's own institution among the many available is an urgent research task, and Value at Risk is one such recently developed risk control tool.

This study takes Taiwan bond portfolios as its example, constructing portfolios of short-term and long-term government bonds for evaluation. The methods employed are the first-order and second-order normal methods, a skewness-adjusted method, Monte Carlo simulation and historical simulation, combined with different confidence levels, moving windows, term structures of interest rates and standard deviation estimators, to compare, analyse and validate VaR for the bond portfolios. For validation, both backtesting and forward testing are used, together with statistical tests based on means and variances, to verify each of the models above.
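The backtesting step mentioned above amounts to counting "exceptions": days on which the realised loss exceeded the VaR forecast, compared with the count the confidence level implies. A minimal sketch with invented returns and a fixed (rather than daily re-estimated) VaR forecast:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical daily returns and a fixed 95% VaR forecast for each day
returns = rng.normal(0.0, 0.01, size=250)
var_95 = 0.0165  # forecast loss threshold, as a positive number

# Backtest: count days whose loss exceeded the VaR forecast
exceptions = int(np.sum(-returns > var_95))
expected = 0.05 * len(returns)  # about 12.5 exceptions expected at the 95% level

print(exceptions, expected)
```

A formal backtest (e.g. Kupiec's proportion-of-failures test) then asks whether the observed exception count is statistically consistent with the expected one.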
316

綜合證券商風險資產之評估-Value at Risk 的應用 / Evaluating the Risk Assets of Integrated Securities Firms: An Application of Value at Risk

蔡明孝 Unknown Date (has links)
With the internationalisation and liberalisation of the financial environment, capital now moves rapidly and derivative instruments have developed; while this has made financial markets more active, the correspondingly sharp fluctuations have also added many risks. With financial storms and incidents occurring frequently around the world, the management and control of risk has become critically important.

Among the many risk management tools, this thesis studies Value-at-Risk, the most popular. VaR answers the question: over a given future period, with what level of confidence can we be certain that the maximum loss on the firm's investment positions will not exceed a given amount?

The thesis first discusses VaR models, from the parametric equally weighted moving average, exponentially weighted moving average and delta-gamma methods through to the non-parametric historical simulation, Monte Carlo simulation and stress testing methods.

It then uses purpose-written programs to compute VaR for the portfolios actually held each month by a securities firm's proprietary trading and warrant/hedging departments. Beyond computing VaR for different departments, different portfolios and different VaR models, the realised risk figures are used to discuss how the firm applies risk control in the warrant-hedging process.

Finally, drawing on reports and recommendations from the BIS and IOSCO, several suggestions are offered to securities firms and the competent authority.
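The exponentially weighted moving average model mentioned above is the RiskMetrics approach: today's variance forecast is a decay-weighted blend of yesterday's forecast and yesterday's squared return. A minimal sketch on invented returns, with the conventional decay factor 0.94:

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical daily returns; RiskMetrics-style EWMA with lambda = 0.94
returns = rng.normal(0.0, 0.012, size=300)
lam = 0.94

var_t = returns[0] ** 2
for r in returns[1:]:
    var_t = lam * var_t + (1 - lam) * r ** 2  # recursive variance update

sigma = np.sqrt(var_t)
var_95 = 1.645 * sigma  # parametric 95% one-day VaR under a normality assumption

print(round(var_95, 4))
```

The equally weighted moving average variant simply replaces the recursion with a flat average of squared returns over a fixed window.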
317

應用風險值評估共同基金之績效 / Applying Value at Risk to Evaluate Mutual Fund Performance

張雅惠 Unknown Date (has links)
The Sharpe Ratio is the most widely used measure of mutual fund performance, but because it is built on an assumption of normally distributed returns, it can be biased when fund returns are not normal. Normality tests on domestic mutual funds in this thesis find that return distributions are left-skewed and leptokurtic rather than normal. The thesis therefore exploits Value at Risk (VaR), which measures downside risk without assuming normal returns, applying it to fund performance measurement to correct the Sharpe Ratio's bias under non-normal returns and to serve as a supplementary reference in fund evaluation. An empirical study on domestic fund data yields the following conclusions.

In performance rankings, the indicator that replaces the standard deviation in the Sharpe Ratio with VaR does produce rankings that differ from the Sharpe Ratio's. Among general equity funds, those whose rankings rise under the VaR-based indicator all share the feature of a smaller VaR; for OTC equity funds and technology equity funds, on the other hand, the larger differences in returns mean that returns dominate the ordering, so changing the risk measure has little effect on rankings.

The thesis also compares the VaR-based indicator with an indicator that replaces the risk-free rate with a benchmark return, and with one in which both the return and the VaR account for market effects, in order to understand the influence of market conditions on fund performance. Empirically, OTC fund rankings all climb, while technology equity funds' rankings fall once market factors are taken into account.

On predictive power, the Sharpe Ratio and the VaR-based indicator are not statistically significant, whereas the benchmark-return indicator and the market-adjusted return/VaR indicator are statistically significant and thus have predictive reference value.
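The substitution the thesis studies, swapping the Sharpe denominator's standard deviation for VaR, can be sketched in a few lines. The two funds below are invented: one has a fat left tail, which the empirical VaR penalises but the standard deviation partly hides:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical monthly excess returns; fund_b suffers occasional crashes
fund_a = rng.normal(0.010, 0.03, size=120)
fund_b = rng.normal(0.012, 0.03, size=120)
fund_b[rng.choice(120, 6, replace=False)] -= 0.15  # inject a fat left tail

def sharpe(r):
    return r.mean() / r.std(ddof=1)

def reward_to_var(r, alpha=0.05):
    # Sharpe-style ratio with the empirical 95% VaR as the risk denominator
    return r.mean() / -np.quantile(r, alpha)

print(sharpe(fund_a), sharpe(fund_b))
print(reward_to_var(fund_a), reward_to_var(fund_b))
```

Because the denominator now reflects downside risk only, funds with left-skewed, leptokurtic returns, like those the thesis finds in the domestic market, rank lower under the VaR-based indicator than under the Sharpe Ratio.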
318

利用類神經網路估算國內電子股投資風險值績效 / Estimating the Value at Risk of Domestic Electronics Stocks Using Artificial Neural Networks

高世儒 Unknown Date (has links)
This study is the first to use the future critical rate of return as the output variable, employing two kinds of artificial neural network to estimate the Value at Risk (VaR) of the returns of representative domestic electronics stocks. The research design takes into account the effect of computing the independent variables over different horizon lengths, giving rise to two forecasting methods. Backtesting is used to examine the differences between the VaR estimated by the neural-network method via the critical rate of return and the VaR estimated by the conventional variance-covariance or Monte Carlo simulation methods.

The study's academic and practical contributions are fourfold:
1. Designing the critical rate of return as the basis for estimating VaR avoids the problem, inherent in conventional VaR calculation, of subjectively specifying the return distribution.
2. Past research has not combined neural networks with VaR; this study is the first to apply neural networks to VaR estimation.
3. The study also estimates VaR using several different measurement horizons for the basic variables, a design that may help isolate the sources of differences.
4. One possible limitation of the neural-network approach here is the design of the critical return value; the reasons neural networks may outperform other forecasting tools are that (1) they learn the characteristics of latent factors, (2) their forecasts are nonlinear, and (3) they need not rely on normality or any specific distributional assumption. Past neural-network studies that race the competing tools against one another have rarely examined why the neural network wins or loses, and this is the focus of the present study's design.
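The idea of training a network to output a critical (threshold) rate of return directly, rather than assuming a return distribution, can be illustrated with the pinball (quantile) loss: a model trained under it converges to the conditional quantile, i.e. the negative of VaR. The toy below is a single linear neuron, not the thesis's actual architecture or data, trained on invented returns:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: next-day return depends weakly on a single input feature
x = rng.normal(0, 1, size=(2000, 1))
y = 0.01 * x[:, 0] + rng.normal(0, 0.02, size=2000)

# One linear "neuron" trained with the pinball loss at alpha = 0.05, so its
# output approximates the conditional 5% return quantile (i.e. minus the VaR)
alpha, lr = 0.05, 0.05
w, b = 0.0, 0.0
for _ in range(200):
    q = w * x[:, 0] + b
    grad = np.where(y < q, 1 - alpha, -alpha)  # subgradient of pinball loss w.r.t. q
    w -= lr * np.mean(grad * x[:, 0])
    b -= lr * np.mean(grad)

var_95 = -b  # intercept part of the 5% quantile, as a positive loss
print(round(var_95, 3))
```

A multi-layer network trained the same way gives a nonlinear, distribution-free VaR estimate, which is the property the study credits for the neural approach's potential edge over variance-covariance methods.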
319

Kontrollera, minimera, spekulera : En studie om kontroll och styrning, riskhantering och value-at-risk på treasuryavdelningar / Control, Minimise, Speculate: A Study of Control and Governance, Risk Management and Value-at-Risk in Treasury Departments

Thunström, Erick, Björk, Kristofer January 2005 (has links)
The activities of a treasury department have changed in recent years: from merely monitoring a company's daily cash flows to trading in securities, speculation, as a daily activity. This has created a greater need for control, since speculation means greater risk-taking. This thesis studies three factors that matter for control in a treasury department: control and governance, risk management, and the financial risk measure Value-at-Risk. The treasury department's principal task is to control and manage risks. A common measure for quantifying and controlling financial risk is Value-at-Risk, which measures the largest possible loss under certain given conditions.

A treasury department can be evaluated as a cost centre or as a profit centre, among other models. If it is evaluated as a cost centre, the focus on risk management increases, but the opportunities to make a profit through speculation are forfeited; if it is evaluated as a profit centre, those opportunities are created. This study concludes that a treasury department should be evaluated as a profit centre in order to generate profit for the company, and that this should be done by setting clear earnings targets for the operation. At the same time, the need for effective control increases, and it is important that the three factors above interact. It is also important that the departments' employees know what is expected of them and that risk management is applied effectively. The choice of risk measures becomes decisive, and the measures used must suit the type of activity performed. This study finds that Value-at-Risk is an effective measure of risk in a treasury department, although its use could be improved.
320

NIG distribution in modelling stock returns with assumption about stochastic volatility : Estimation of parameters and application to VaR and ETL.

Kucharska, Magdalena, Pielaszkiewicz, Jolanta January 2009 (has links)
We model Normal Inverse Gaussian distributed log-returns under the assumption of stochastic volatility. We consider different methods of parametrising returns and, following the paper of Lindberg [21], we assume that the volatility is a linear function of the number of trades. Going beyond Lindberg's paper, we suggest daily stock volumes and traded amounts as alternative measures of the volatility. As an application of the models, we perform Value-at-Risk and Expected Tail Loss predictions using both Lindberg's volatility model and our own suggested model. These applications are new and not described in the literature. For a better understanding of our calculations, programs and simulations, basic information about and properties of the Normal Inverse Gaussian and Inverse Gaussian distributions are provided. Practical applications of the models are implemented on the Nasdaq-OMX, where we have calculated Value-at-Risk and Expected Tail Loss for Ericsson B stock data over the period 1999 to 2004.
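The NIG distribution used above is a normal variance-mean mixture whose mixing variable is Inverse Gaussian, which makes it easy to simulate and to read VaR and Expected Tail Loss off the simulated returns. The parameters below are illustrative, not the thesis's fitted Ericsson B values:

```python
import numpy as np

rng = np.random.default_rng(2009)

# Illustrative NIG(alpha, beta, mu, delta) parameters for daily log-returns
alpha, beta, mu, delta = 50.0, -2.0, 0.0005, 0.02
gamma_ = np.sqrt(alpha**2 - beta**2)

# NIG as a normal variance-mean mixture: X | V ~ N(mu + beta*V, V),
# with V ~ InverseGaussian(mean = delta/gamma, shape = delta^2)
n = 200_000
v = rng.wald(delta / gamma_, delta**2, size=n)
x = mu + beta * v + np.sqrt(v) * rng.standard_normal(n)

var_99 = -np.quantile(x, 0.01)
etl_99 = -x[x <= -var_99].mean()  # Expected Tail Loss: mean loss beyond the 99% VaR

print(round(var_99, 4), round(etl_99, 4))
```

By construction ETL is at least as large as the corresponding VaR, since it averages only the losses beyond the VaR threshold; the gap between the two widens with the heaviness of the left tail that the NIG's parameters control.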
