1 |
Kvantifiering av sjukdomars svårighetsgrad vid prioriteringar av sjukvårdsresurser: En jämförelse av kvantitativ och kvalitativ metod [Quantifying disease severity in healthcare priority setting: a comparison of quantitative and qualitative methods]. Beer, Cristina; Sandstedt, Isabelle. January 2023
The Swedish healthcare system strives to maximize the health of the population while also considering a fair distribution of health. In this trade-off, the severity of disease is an important attribute, as high severity signals a greater need for healthcare. Currently, the Dental and Pharmaceutical Benefits Agency (TLV) operationalizes this approach by striking a balance between efficiency and equity when deciding which prescription pharmaceuticals should be reimbursed. The agency uses a qualitative approach to assess the severity of diseases. There are, however, quantitative methods that can be used to assess disease severity, such as absolute and proportional shortfall, which indicate how much health is lost compared to the general population. In this study, we quantify the severity of different diseases using absolute and proportional shortfall based on information available from TLV decisions and compare those estimates with the agency's qualitative assessment. To quantify severity, we use data from publicly available TLV decision documents and estimated health of the general population. The results show that quantitative assessments of disease severity with absolute and proportional shortfall differ from the qualitative assessments made by TLV. A large range of absolute and proportional shortfall estimates was observed for diseases assessed by TLV to have a specific severity level. A quantitative approach to disease severity could contribute to more objective severity assessments than current practice. Furthermore, as disease severity is an important aspect of TLV's prioritization, a quantitative assessment could contribute to a fairer allocation of resources in healthcare. / Swedish healthcare strives to maximize the health of the population while also taking into account how health is distributed. How the scarce resources of healthcare should be allocated rests on three principles from an ethical platform: the human dignity principle, the needs and solidarity principle, and the cost-effectiveness principle. Disease severity is therefore an important basis for prioritization when healthcare resources are allocated, since high severity implies a greater need for care. The Dental and Pharmaceutical Benefits Agency (TLV), for example, uses this approach when deciding which prescription pharmaceuticals should be reimbursed. TLV currently uses a qualitative approach in its assessment of severity. There are also quantitative methods, so-called shortfall measures, based on how much health patients are expected to lose compared with individuals without the disease. In this study we quantify severity using absolute and proportional shortfall for diseases that have been subject to TLV decisions and compare these with TLV's qualitative assessments. To quantify severity, we use data from TLV's decision documents together with estimated health in the general population. The results show that the quantitative estimates of severity based on absolute and proportional shortfall differ from TLV's qualitative assessments. A large range of absolute and proportional shortfall values is observed for diseases that TLV has assigned a specific severity level. A quantitative approach could therefore contribute to more objective estimates of severity than today's qualitative approach. Since the severity assessment is an important part of TLV's prioritization decisions, one proposal is that the agency's qualitative approach be complemented with quantitative measures to ensure a fair and efficient allocation of resources in healthcare.
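Both shortfall measures have simple definitions: absolute shortfall is the difference between the quality-adjusted life expectancy (QALE) of the general population and that of the patient group, and proportional shortfall divides that loss by the general population's QALE. A minimal sketch, with hypothetical QALE inputs of the kind that could be read from TLV decision documents:

```python
def shortfall(qale_general: float, qale_patients: float) -> tuple[float, float]:
    """Absolute and proportional shortfall from quality-adjusted life expectancies.

    qale_general  -- expected remaining QALYs for the matched general population
    qale_patients -- expected remaining QALYs for patients with the disease
    """
    absolute = qale_general - qale_patients      # QALYs lost because of the disease
    proportional = absolute / qale_general       # share of remaining healthy life lost
    return absolute, proportional

# Hypothetical example: a patient group whose age/sex-matched general population
# has 12.5 expected QALYs remaining, while patients are expected to retain 5.0.
abs_sf, prop_sf = shortfall(12.5, 5.0)
print(f"absolute shortfall: {abs_sf:.1f} QALYs, proportional shortfall: {prop_sf:.0%}")
```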
|
2 |
An application of value at risk and expected shortfall. Mayorga, Rodrigo de Oliveira. January 2016
MAYORGA, Rodrigo de Oliveira. An application of value at risk and expected shortfall. 2016. 60 f. Tese (Doutorado) - Universidade Federal do Ceará, Programa de Pós-Graduação em Economia, CAEN, Fortaleza, 2016.
The last two decades have been characterized by significant volatility in the financial world, marked by a few major crises, market crashes, bankruptcies of large corporations and liquidations of major financial institutions. In this context, this study considers Extreme Value Theory (EVT), which provides well-established statistical models for the computation of extreme risk measures such as Value at Risk (VaR) and Expected Shortfall (ES), and examines how EVT can be used to model tail risk measures and the related confidence intervals, applying it to daily log-returns on four market indices. These market indices represent the countries with the greatest commercial trade with Brazil over the last decade (China, the U.S. and Argentina). We calculate the daily VaR and ES for the returns of the IBOV, SPX, SHCOMP and MERVAL stock indices from January 2nd, 2004 to September 8th, 2014, combining EVT with GARCH models. The results show that EVT can be useful for assessing the size of extreme events and that it can be applied to financial market return series. We also verified that MERVAL is the stock market most exposed to extreme losses, followed by the IBOV. The least exposed to daily extreme variations are the SPX and SHCOMP. / The last two decades have been characterized by significant volatility in the financial world, with major crises, market crashes, bankruptcies of large corporations and liquidations of large financial institutions. In this context, this study considers the development of Extreme Value Theory (EVT), which provides well-established statistical models for computing extreme risk measures such as Value at Risk (VaR) and Expected Shortfall (ES), and examines how EVT can be used to model rare-event risk measures and establish confidence intervals, applying it to the daily log-returns of four market indices. These markets represent the countries with the largest commercial exchange with Brazil (China, the U.S. and Argentina). We calculate daily VaR and ES for the IBOV, SPX, SHCOMP and MERVAL indices, with daily data from January 2, 2004 to September 8, 2014, combining EVT with GARCH models. The results show that EVT can be useful for assessing the size of extreme events and that it can be applied to financial market return series. It is also found that MERVAL is the stock market most exposed to extreme losses, followed by the IBOV. The least exposed to extreme daily variations are the SPX and SHCOMP.
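As a rough illustration of the peaks-over-threshold machinery that such EVT analyses rely on, the sketch below fits a generalized Pareto distribution to losses above a high threshold and converts the fit into tail VaR and ES estimates. This is a generic calculation on simulated raw losses, not the author's GARCH-filtered pipeline; the threshold choice, the simulated data and the confidence level are assumptions.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
losses = -rng.standard_t(df=4, size=5000) * 0.01   # placeholder daily losses (negated log-returns)

q = 0.99                                            # confidence level for VaR/ES
u = np.quantile(losses, 0.95)                       # high threshold: 95th percentile of losses
excess = losses[losses > u] - u                     # exceedances over the threshold
xi, _, beta = genpareto.fit(excess, floc=0)         # GPD shape (xi) and scale (beta)

# Standard GPD tail formulas (they assume 0 != xi < 1)
n, n_u = len(losses), len(excess)
var_q = u + beta / xi * ((n / n_u * (1 - q)) ** (-xi) - 1)
es_q = var_q / (1 - xi) + (beta - xi * u) / (1 - xi)
print(f"VaR {q:.0%}: {var_q:.4f}   ES {q:.0%}: {es_q:.4f}")
```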
|
3 |
Investment Analysis: Evaluating the Loss and Risk of a Stocks and Options Portfolio. Infantino, Shanna. 02 May 2012
With the ripples in the financial markets and economic stresses that occur around the world today, it would be beneficial to have some insight into the tools that help investors learn about the riskiness of their portfolios. At what value is one's portfolio in danger of being completely wiped out? We aim to further the understanding of values such as these and give an assessment of some risk measures by investing in an interactive portfolio, as well as estimating the values at risk and expected shortfalls of this portfolio.
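For readers unfamiliar with the two measures the project estimates, a minimal historical-simulation sketch is shown below: VaR at level alpha is an empirical loss quantile, and expected shortfall is the average loss beyond that quantile. The portfolio return series is a stand-in, not data from the project's portfolio:

```python
import numpy as np

def hist_var_es(returns, alpha: float = 0.95) -> tuple[float, float]:
    """Historical-simulation VaR and ES, reported as positive loss numbers."""
    losses = -np.asarray(returns)                 # losses are negated returns
    var = np.quantile(losses, alpha)              # loss exceeded with probability 1 - alpha
    es = losses[losses >= var].mean()             # average loss in the worst (1 - alpha) tail
    return var, es

# Placeholder daily portfolio returns
rng = np.random.default_rng(1)
rets = rng.normal(0.0005, 0.012, size=1000)
var95, es95 = hist_var_es(rets, 0.95)
print(f"1-day VaR 95%: {var95:.3%}, ES 95%: {es95:.3%}")
```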
|
4 |
Investment Analysis: Evaluating the Loss and Risk of a Stocks and Options Portfolio. Shah, Azuri. 02 May 2012
With the ripples in the financial markets and economic stresses that occur around the world today, it would be beneficial to have some insight into the tools that help investors learn about the riskiness of their portfolios. At what value is one's portfolio in danger of being completely wiped out? We aim to further the understanding of values such as these and give an assessment of some risk measures by investing in an interactive portfolio, as well as estimating the values at risk and expected shortfalls of this portfolio.
|
5 |
Backtesting para o Expected Shortfall do Trading Book: avaliação e análise das metodologias [Backtesting for the Expected Shortfall of the Trading Book: evaluation and analysis of the methodologies]. Castro, Leonardo Nascimento. 01 January 2017
Due to the Crisis of 2008, the Basel Committee accelerated the process of updating the Accord and identified some weaknesses, such as the inability of VaR to capture tail risk. Subsequently, it was recommended that VaR, a non-coherent risk measure due to the absence of subadditivity, be replaced by CVaR. However, in 2011 the absence of elicitability for CVaR was shown, and this has led some people to believe that it is impossible to perform a backtest for this risk measure. Elicitability is a mathematical property for model selection and not for validation, although the convexity of its scoring function is required for backtesting. It is also important to know identifiability and testability, which are related to elicitability. For a good backtest in the Trading Book, the testable function must be sharp, that is, strictly increasing and decreasing with respect to the predictive and realized variables, respectively, and must meet the ridge backtest requirement, which should depend as little as possible on the VaR. CVaR, while being neither testable nor elicitable, is at least conditionally elicitable and therefore also conditionally testable. To validate the CVaR models, simulations were made with the three Acerbi methods, two methods from this study, and another adapted from the quantile (VaR-level) approximation. Of these six, none was perfect, but two presented better results than the VaR backtest. This study analyzed the risk measures VaR and CVaR using the Historical Simulation, Delta-Normal, Correlated Normal, Monte Carlo and Quasi-Monte Carlo Simulation methods at the 95%, 97.5% and 99% confidence levels for Brazilian bond and stock portfolios, as well as the Brazilian Real against the Dollar, Euro and Yen currencies, and applied several backtests to the two measures. This study also proposed a method to improve the backtesting results for VaR. / Due to the 2008 crisis, the Basel Committee accelerated the process of updating the Accord and identified some weaknesses, such as the inability of VaR to capture tail risk. It was subsequently recommended that VaR, a non-coherent risk measure because it lacks subadditivity, be replaced by CVaR. However, in 2011 the absence of elicitability for CVaR was demonstrated, which led some to believe that it is impossible to backtest this risk measure. Elicitability is a mathematical property for model selection, not for validation, although the convexity of its scoring function is necessary for backtesting. The concepts of identifiability and testability, which are related to elicitability, were introduced. For a good backtest in the Trading Book, the testable function must be sharp, that is, strictly increasing and decreasing with respect to the predictive and realized variables, respectively, and must meet the ridge backtest requirement of depending as little as possible on the VaR. CVaR, although neither elicitable nor testable, is at least conditionally elicitable and therefore also conditionally testable. To validate the CVaR models, simulations were run with the three Acerbi methods, two methods from this research, and another adapted from the VaR-level approximation. Of these six, none was perfect, but two produced better results than the VaR backtest. This research analyzed the VaR and CVaR risk measures using the Historical Simulation, Delta-Normal, Correlated Normal, Monte Carlo and Quasi-Monte Carlo Simulation methods at the 95%, 97.5% and 99% confidence levels for Brazilian bond and stock portfolios, as well as the Brazilian Real against the Dollar, Euro and Yen, and applied several backtests to the two measures. This research also proposed a method to improve the VaR backtesting results.
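The "Acerbi methods" referenced above are built around statistics that compare realized tail losses with the ES forecasts. A minimal sketch of one commonly cited unconditional form (often labelled Z2 in the Acerbi-Szekely family) is given below; the sign conventions and the placeholder forecasts are our assumptions, not the dissertation's implementation:

```python
import numpy as np

def acerbi_szekely_z2(pnl, var_pred, es_pred, alpha=0.025):
    """Unconditional ES backtest statistic (Z2-style).

    pnl      -- realized daily P&L (negative values are losses)
    var_pred -- VaR forecasts at level alpha, as positive numbers
    es_pred  -- ES forecasts at level alpha, as positive numbers
    Under a correct model E[Z2] = 0; clearly negative values suggest the
    model underestimates tail risk.
    """
    pnl, var_pred, es_pred = map(np.asarray, (pnl, var_pred, es_pred))
    breach = pnl + var_pred < 0                     # days on which the VaR was exceeded
    T = len(pnl)
    return float(np.sum(pnl * breach / (T * alpha * es_pred)) + 1.0)

# Placeholder forecasts: i.i.d. normal model with constant VaR/ES at alpha = 2.5%
rng = np.random.default_rng(2)
x = rng.normal(0.0, 0.01, size=250)
z2 = acerbi_szekely_z2(x, var_pred=np.full(250, 0.0196), es_pred=np.full(250, 0.0234))
print(f"Z2 = {z2:.3f}")
```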
|
6 |
An examination of budget reductions in high-wealth property school districts and low-wealth property school districts in Texas. Sauceda, Dora E. 25 July 2012
An Examination of Budget Reductions in High-Wealth Property School Districts and Low-Wealth Property School Districts in Texas
Dora E. Sauceda, Ed.D.
The University of Texas at Austin, 2012
Supervisor: Julian Vasquez Heilig

In June of 2011, the 82nd Legislature approved a reduction to Texas public education funding of upwards of $4 billion. Districts, regardless of wealth, responded by making budgetary reductions that affected personnel, programs, and services. The reduction in funding is expected to continue into the next biennium. This study examined the prioritization of budget reductions, the process utilized by high-wealth and low-wealth property school districts to enact budget reductions to the various operating expenditures, and the inequities that surfaced as a result of the reductions.

The research questions included in the study were:
1. What budget-reduction options are prioritized at the district level for high-wealth property school districts versus low-wealth property school districts?
2. What budget-reduction process was utilized at the district level by high-wealth property school districts and low-wealth property school districts?
3. What district-level budget functions were slated for reduction at high-wealth property and low-wealth property school districts, and what are the equity implications that surfaced as a result of the reductions?

The study utilized a mixed-methods design. A 5-point Likert-scale survey and semi-structured interviews were used to examine budget-reduction prioritization and process. An independent-samples t-test was utilized to examine 2010-2011 and 2011-2012 per-pupil expenditures by function (N=60). The sample included 30 high-wealth and 30 low-wealth school districts.

The qualitative data indicated that districts prioritize communication with stakeholders and school boards when deciding on budgetary reductions. Communicating the budget problem to all stakeholders was a high priority so as to ensure buy-in once decisions on budget reductions were made. The semi-structured interviews revealed emergent themes that included maintaining the vision, transparency, stakeholder participation, equity, and the impact of budget reductions. The t-test revealed statistically significant differences in the areas of instruction, security services, and payroll. The results also revealed that programs and services aimed at assisting the students with the greatest need were either reduced or eliminated.

Findings derived from this study will provide educational practitioners and policymakers with a wealth of information on how school-district leaders are examining their financial resources, areas designated for reduction, and areas they perceive as vital for preservation.
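For illustration only, a minimal sketch of the kind of independent-samples comparison described above, run on made-up per-pupil expenditure figures rather than the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical 2011-2012 per-pupil instruction expenditures (USD) for 30 districts per group
high_wealth = rng.normal(5800, 450, size=30)
low_wealth = rng.normal(5300, 400, size=30)

# Independent-samples t-test (Welch's version, not assuming equal variances)
t_stat, p_value = stats.ttest_ind(high_wealth, low_wealth, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```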
|
7 |
Empirical Analysis of Value at Risk and Expected Shortfall in Portfolio Selection Problem. Ding, Liyuan (1988-). 14 March 2013
The safety-first criterion and the mean-shortfall criterion both address asset allocation in the presence of downside risk. In this paper, I compare the safety-first portfolio selection problem with the mean-shortfall portfolio optimization problem, considering risk-averse investors in practice. Safety-first portfolio selection uses Value at Risk (VaR) as its risk measure, while mean-shortfall portfolio optimization uses expected shortfall. VaR is estimated with a semi-parametric method based on extreme value theory. Expected shortfall is estimated by two nonparametric methods: a natural estimator and a kernel-weighted estimator.
I use daily data on three international stock indices, ranging from January 1986 to February 2012, to provide empirical evidence on asset allocation and to illustrate the performance of the safety-first and mean-shortfall approaches with their respective risk measures. The historical data were also divided in two ways; one subsample is truncated at 1998 and used to explore performance during the tech boom and the financial crisis. The mean-shortfall portfolio optimization with the kernel-weighted method performed better than the safety-first criterion, while the safety-first criterion performed better than the mean-shortfall optimization with the natural estimation method.
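The two expected shortfall estimators named above can be written compactly: the natural estimator averages the returns below the empirical alpha-quantile, while the kernel-weighted estimator replaces the hard tail indicator with a smooth kernel weight. The sketch below is a generic version with a Gaussian kernel and a rule-of-thumb bandwidth, both of which are assumptions rather than the thesis's exact specification:

```python
import numpy as np
from scipy.stats import norm

def es_natural(returns, alpha=0.05):
    """Natural estimator: average return below the empirical alpha-quantile, as a positive loss."""
    r = np.asarray(returns)
    q = np.quantile(r, alpha)
    return -r[r <= q].mean()

def es_kernel(returns, alpha=0.05):
    """Kernel-weighted estimator: smooth the tail indicator 1{r <= q} with a Gaussian kernel CDF.
    Normalizing by the total weight (a self-normalized variant) keeps the sketch simple."""
    r = np.asarray(returns)
    h = 1.06 * r.std(ddof=1) * len(r) ** (-1 / 5)   # rule-of-thumb bandwidth (an assumption)
    q = np.quantile(r, alpha)                       # quantile kept empirical for simplicity
    w = norm.cdf((q - r) / h)                       # smoothed tail weights
    return -np.sum(r * w) / np.sum(w)

rng = np.random.default_rng(4)
r = rng.standard_t(df=5, size=2500) * 0.01          # placeholder daily index returns
print(f"natural: {es_natural(r):.4f}   kernel-weighted: {es_kernel(r):.4f}")
```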
|
8 |
Applying Multivariate Expected Shortfall on High Frequency Foreign Exchange Data / Implementering av multidimensionell Expected Shortfall på högfrekvent växelkursdata. Holmsäter, Sara; Malmberg, Emelie. January 2016
This thesis aims at implementing and evaluating the performance of multivariate Expected Shortfall models on high frequency foreign exchange data. The implementation is conducted with a unique portfolio consisting of five foreign exchange rates: EUR/SEK, EUR/NOK, EUR/USD, USD/SEK and USD/NOK. High frequency is in this context defined as observations with time intervals from second by second up to minute by minute. The thesis consists of three main parts. In the first part, the exchange rates are modelled individually with time series models for returns and realized volatility. In the second part, the dependence between the exchange rates is modelled with copulas. In the third part, Expected Shortfall is calculated, the risk contribution of each exchange rate is derived and the models are backtested. The results of the thesis indicate that three of the five final models can be rejected at a 5% significance level if the risk is measured by Expected Shortfall (ES0.05). The two models that cannot be rejected are based on the Clayton and Student's t copulas, the only two copulas with heavy left tails. The rejected models are based on the Gaussian, Gumbel-Hougaard and Frank copulas. The fact that some of the copula models are rejected emphasizes the importance of choosing an appropriate dependence structure. The risk contribution calculations show that the risk contributions are highest from EUR/NOK and USD/NOK, and that EUR/USD has the lowest risk contribution and even decreases the portfolio risk in some cases. Regarding the underlying models, it is concluded that for the data used in this thesis, the final combined time series and copula models perform quite well, given that the purpose is to measure the risk. However, the most important parts to capture seem to be the fluctuations in the volatilities as well as the tail dependencies between the exchange rates. Thus, the predictions of the return mean values play a less significant role, even though they still improve the results and are necessary in order to proceed with other parts of the modelling. As future research, we first and foremost recommend including the liquidity aspect in the models. / The purpose of this master's thesis is to implement and evaluate multivariate Expected Shortfall models on high-frequency foreign exchange data. The implementation and evaluation are carried out with a unique portfolio of five exchange rates: EUR/SEK, EUR/NOK, EUR/USD, USD/SEK and USD/NOK. High-frequency observations are in this thesis defined as observations from second-by-second up to minute-by-minute. The thesis consists of three main parts. In the first part, the exchange rates are modelled individually with time series models for the rate changes in the form of returns and realized volatility. In the second part, the dependence structures between the exchange rates are modelled using copulas. In the third and final part, Expected Shortfall and the risk contributions from the individual exchange rates are calculated, after which the models are backtested. The final results indicate that three of the five proposed models can be rejected at a 5% significance level if risk is measured by Expected Shortfall (ES0.05). The two models that cannot be rejected are based on the Clayton and Student's t copulas, which differ from the other copulas in having heavy left tails. The rejected models are based on the Gaussian, Gumbel-Hougaard and Frank copulas. The fact that some copula models are rejected emphasizes the importance of choosing an appropriate dependence structure. The risk contribution calculations show that EUR/NOK and USD/NOK contribute the most to the total portfolio risk and that EUR/USD has the lowest risk contribution, with EUR/USD even reducing the risk in some cases. Regarding the underlying models, it is shown that for the data available in this thesis, time series models combined with copulas work well, given that the purpose is to measure risk. However, the results suggest that volatility fluctuations and tail dependencies between the exchange rates are the most essential parts to capture. The mean forecasts of the returns have less impact on the final calculations, even though they still improve the results and are in themselves necessary for further modelling. For future studies, we first and foremost recommend including liquidity aspects in the models.
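To make the copula-based ES step concrete, the sketch below simulates from a Student's t copula with an assumed correlation matrix and degrees of freedom, maps the uniforms through placeholder normal marginals for the five exchange rates, and computes portfolio Expected Shortfall by Monte Carlo. The correlations, marginal volatilities and equal weights are invented for illustration and are not the thesis's fitted values:

```python
import numpy as np
from scipy.stats import t as student_t, norm

rng = np.random.default_rng(5)
d, nu, n = 5, 4.0, 100_000                       # five FX rates, copula d.o.f., simulations

R = np.full((d, d), 0.4) + 0.6 * np.eye(d)       # assumed correlation matrix
L = np.linalg.cholesky(R)

# Student's t copula: correlated normals divided by a common chi-square mixing factor
z = rng.standard_normal((n, d)) @ L.T
w = rng.chisquare(nu, size=(n, 1)) / nu
u = student_t.cdf(z / np.sqrt(w), df=nu)         # copula samples in [0, 1]^d

# Placeholder marginals: one-minute log-returns ~ N(0, sigma_i)
sigmas = np.array([2.0e-4, 2.0e-4, 1.5e-4, 2.5e-4, 2.5e-4])   # EUR/SEK, EUR/NOK, EUR/USD, USD/SEK, USD/NOK
rets = norm.ppf(u) * sigmas

weights = np.full(d, 1.0 / d)                    # equally weighted portfolio
losses = -(rets @ weights)
var95 = np.quantile(losses, 0.95)
es95 = losses[losses >= var95].mean()
print(f"simulated ES 95%: {es95:.6f}")
```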
|
9 |
Optimal Portfolio in Outperforming Its Liability Benchmark for a Defined Benefit Pension Plan. Li, Yi-Feng (李意豐). Unknown Date
Abstract
Under a defined benefit pension plan, this paper investigates the fund manager's optimal portfolio for maximizing the managerial objective before the worst-case funding shortfall scenario occurs. The fund ratio process is defined as the ratio of the fund's current value to the liability benchmark, and the manager maximizes the probability of reaching the predetermined management target before a specified worst fund ratio is hit. The time-varying investment opportunity set includes risk-free cash, bonds and stocks. This study constructs a stochastic control model to describe the optimization problem and solves it by dynamic programming. The results show that the manager's optimal strategy consists of a hedging component that minimizes the variation of the fund ratio, a component related to risk preference and the intertemporal investment opportunity set, and a hedging component related to the model's state variables. The study uses the Markov chain approximation method to approximate the numerical solution of the stochastic control problem; the results show that the fund manager must hold a large position in bonds and that the investment horizon has a large influence on the optimal investment decision.
Keywords: shortfall, defined benefit, liability benchmark, stochastic control, dynamic programming. / Abstract
This paper analyzes the portfolio problem of a pension fund manager who has to maximize the probability of reaching his managerial goal before the worst-case shortfall scenario occurs in a defined benefit pension scheme. The fund ratio process, defined as the ratio between the fund level and its accrued liability benchmark, is controlled so as to maximize the probability that the predetermined target is achieved before the ratio falls below an intolerable boundary. The time-varying opportunity set in our study includes risk-free cash, bonds and a stock index. The problem is formulated in a stochastic control framework and solved through dynamic programming. In this study, the optimal portfolio is characterized by three components: the liability hedging component, the intertemporal hedging component against changes in the opportunity set, and the temporal hedging component minimizing the variation in fund ratio growth. The Markov chain approximation method is employed to approximate the stochastic control solutions numerically. The results show that fund managers should hold large proportions of bonds and that the time horizon plays a crucial role in constructing the optimal portfolio.
Keywords: shortfall; defined benefit; liability benchmark; stochastic control; dynamic programming.
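In symbols, and using only quantities named in the abstract, the setup can be summarized as below; the notation is ours and is only a sketch of the standard formulation:

```latex
% Fund ratio and the manager's objective (notation assumed for illustration):
% A(t) is the fund value, L(t) the accrued liability benchmark,
% U the managerial target ratio and b the intolerable lower boundary.
\[
  F(t) = \frac{A(t)}{L(t)}, \qquad
  \tau_U = \inf\{t \ge 0 : F(t) \ge U\}, \qquad
  \tau_b = \inf\{t \ge 0 : F(t) \le b\},
\]
\[
  \max_{\text{investment strategy}} \; \Pr\bigl(\tau_U < \tau_b \mid F(0) = f_0\bigr).
\]
```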
|
10 |
On Value-at-Risk and the more extreme: A study on quantitative market risk measurements. Lindholm, Dennis. January 2015
In line with the third pillar of the Basel accords, quantitative market risk measurements are investigated and evaluated by comparing JP Morgan's RiskMetrics and Bollerslev's GARCH with the Peaks over Threshold and Block Maxima approaches from the Extreme Value Theory framework. Value-at-Risk and Expected Shortfall (Conditional Value-at-Risk), at the 95% and 99% confidence levels, are predicted for 25 years of the OMXS30. The study finds Bollerslev's suggested t distribution to be a more appropriate distributional assumption, but no evidence to prefer the GARCH over the RiskMetrics model. The more demanding Extreme Value Theory procedures trail behind, as they are found to be wasteful of data and more difficult to backtest and therefore to evaluate.
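As a point of reference for the RiskMetrics benchmark discussed above, the sketch below computes an EWMA volatility forecast with the classic decay factor 0.94 and turns it into one-day normal VaR and ES. The return series and the normality assumption are placeholders, not the thesis's OMXS30 data or its preferred t distribution:

```python
import numpy as np
from scipy.stats import norm

def riskmetrics_var_es(returns, lam=0.94, alpha=0.99):
    """One-day-ahead VaR and ES from an EWMA (RiskMetrics-style) volatility forecast,
    assuming conditionally normal returns with zero mean."""
    var_t = np.var(returns)                        # initialize the variance recursion
    for r in returns:
        var_t = lam * var_t + (1 - lam) * r ** 2   # EWMA update
    sigma = np.sqrt(var_t)
    z = norm.ppf(alpha)
    var = z * sigma                                # VaR as a positive loss
    es = sigma * norm.pdf(z) / (1 - alpha)         # ES under normality
    return var, es

rng = np.random.default_rng(6)
rets = rng.normal(0, 0.012, size=2500)             # placeholder daily returns
print(riskmetrics_var_es(rets, alpha=0.99))
```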
|