251

Risks in Commodity and Currency Markets

Bozovic, Milos 17 April 2009 (has links)
This thesis analyzes market risk factors in commodity and currency markets. It focuses on the impact of extreme events on the prices of financial products traded in these markets, and on the overall market risk faced by investors. The first chapter develops a simple two-factor jump-diffusion model for the valuation of contingent claims on commodities in order to investigate the pricing implications of shocks that are exogenous to this market. The second chapter analyzes the nature and pricing implications of abrupt changes in exchange rates, as well as the ability of these changes to explain the shapes of option-implied volatility "smiles". Finally, the third chapter employs the notion that key results of univariate extreme value theory can be applied separately to the principal components of ARMA-GARCH residuals of a multivariate return series. The proposed approach yields more precise Value at Risk forecasts than conventional multivariate methods, while maintaining the same efficiency.
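A minimal sketch of the third chapter's idea, assuming ARMA-GARCH filtering has already produced standardized residuals: principal components are extracted, a generalized Pareto tail is fitted to each component separately, and the component quantiles are mapped back to portfolio space with a deliberately rough aggregation. All names, data and the aggregation step are illustrative assumptions, not the author's code.

```python
import numpy as np
from scipy.stats import genpareto

def component_var(residuals, weights, alpha=0.99, tail_frac=0.10):
    """Sketch: univariate EVT applied to principal components of
    (ARMA-GARCH) standardized residuals to approximate portfolio VaR."""
    cov = np.cov(residuals, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    pcs = residuals @ eigvec                      # (nearly) uncorrelated components

    q = np.empty(pcs.shape[1])
    for j in range(pcs.shape[1]):
        losses = -pcs[:, j]                       # left tail of component j
        u = np.quantile(losses, 1 - tail_frac)    # high threshold
        exc = losses[losses > u] - u
        xi, _, beta = genpareto.fit(exc, floc=0)  # GPD fit to exceedances
        # GPD quantile formula (assumes xi != 0), with P(loss > u) = tail_frac
        q[j] = u + beta / xi * (((1 - alpha) / tail_frac) ** (-xi) - 1)

    w_pc = eigvec.T @ weights                     # weights in component space
    return float(np.sqrt(np.sum((w_pc * q) ** 2)))  # rough aggregation, illustration only

# Synthetic stand-in for standardized residuals of four return series
rng = np.random.default_rng(0)
resid = rng.standard_t(df=5, size=(2000, 4))
print(round(component_var(resid, np.full(4, 0.25)), 3))
```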
252

Modelling of extremes

Hitz, Adrien January 2016 (has links)
This work focuses on statistical methods to understand how frequently rare events occur and what the magnitude of extreme values, such as large losses, is. It lies in a field called extreme value analysis, whose scope is to provide support for scientific decision making when extreme observations are of particular importance, such as in environmental applications, insurance and finance. In the univariate case, I propose new techniques to model tails of discrete distributions and illustrate them in an application on word frequency and multiple birth data. Suitably rescaled, the limiting tails of some discrete distributions are shown to converge to a discrete generalized Pareto distribution and a generalized Zipf distribution respectively. In the multivariate high-dimensional case, I suggest modelling tail dependence between random variables by a graph such that its nodes correspond to the variables and shocks propagate through the edges. Relying on the ideas of graphical models, I prove that if the variables satisfy a new notion called asymptotic conditional independence, then the density of the joint distribution can be simplified and expressed in terms of lower-dimensional functions. This generalizes the Hammersley-Clifford theorem and enables us to infer tail distributions from observations in reduced dimension. As an illustration, extreme river flows are modelled by a tree graphical model whose structure appears to recover almost exactly the actual river network. A fundamental concept when studying limiting tail distributions is regular variation. I propose a new notion in the multivariate case called one-component regular variation, to which Karamata's theorem and the representation theorem, two important results in the univariate case, are generalized. Finally, I turn my attention to website visit data and fit a censored copula Gaussian graphical model allowing the visualization of users' behavior by a graph.
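A hedged illustration of the univariate idea on word-frequency-like data: the tail of a discrete, heavy-tailed count distribution above a high threshold is approximated by a continuous generalized Pareto fit, mimicking the discrete generalized Pareto limit discussed above. The data, threshold and jittering of ties are illustrative choices, not the thesis procedure.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
counts = rng.zipf(a=1.8, size=20_000)        # stand-in for word-frequency counts

u = np.quantile(counts, 0.99)                # high threshold
exceedances = counts[counts > u] - u         # discrete exceedances above u

# Continuous GPD approximation; a little jitter smooths ties between integer counts
jittered = exceedances + rng.uniform(size=exceedances.size)
xi, _, beta = genpareto.fit(jittered, floc=0)
print(f"estimated tail shape xi = {xi:.2f}, scale beta = {beta:.2f}")
```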
253

Développement d'un outil statistique pour évaluer les charges maximales subies par l'isolation d'une cuve de méthanier au cours de sa période d'exploitation / Development of a statistical tool to determine sloshing loads to be applied on cargo containment system of a LNG carrier for structural strength assessment

Fillon, Blandine 19 December 2014 (has links)
This thesis focuses on statistical tools for the assessment of maxima of sloshing loads in LNG tanks. Depending on ship features, tank cargo and sailing conditions, a sloshing phenomenon is observed inside LNG tanks. The determination of the sloshing loads supported by the tank structure is derived from impact pressure measurements performed on a test rig. Pressure maxima per impact, extracted from the test measurements, are investigated. The test duration is equivalent to 5 hours at full scale, which is not sufficient to determine pressure maxima associated with long return periods (40 years). It is therefore necessary to use a probabilistic model to extrapolate pressure maxima. Usually, a Weibull model is used. As we focus on the extreme values of the samples, fittings are also performed with the generalized extreme value distribution and the generalized Pareto distribution, using the block maximum method and the peaks-over-threshold method. The originality of this work is the use of an alternative measurement system, more relevant than the usual system for capturing pressure maxima, and of 480 hours of measurements available for the same test conditions. This provides a reference distribution for the pressure maxima, which is used to assess the relevance of the selected probabilistic models. Particular attention is paid to assessing the quality of the fits using statistical tests and to quantifying the uncertainties on the estimated values. The resulting methodology has been implemented in a software tool called Stat_R, which makes the handling and processing of results easier.
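A minimal sketch of the two fitting routes mentioned above, block maxima with a GEV and peaks over threshold with a GPD, applied to a generic array of per-impact pressure maxima. All names and the synthetic data are illustrative; this is not part of Stat_R.

```python
import numpy as np
from scipy.stats import genextreme, genpareto

def fit_block_maxima(pressures, block_size=500):
    """GEV fit to maxima of consecutive blocks (block maximum method)."""
    n_blocks = len(pressures) // block_size
    blocks = pressures[:n_blocks * block_size].reshape(n_blocks, block_size)
    return genextreme.fit(blocks.max(axis=1))      # (c, loc, scale); scipy's c = -xi

def fit_peaks_over_threshold(pressures, quantile=0.95):
    """GPD fit to exceedances above a high threshold (peaks-over-threshold method)."""
    u = np.quantile(pressures, quantile)
    exceedances = pressures[pressures > u] - u
    xi, _, beta = genpareto.fit(exceedances, floc=0)
    return u, xi, beta

# Synthetic stand-in for measured impact pressure maxima
rng = np.random.default_rng(1)
pressures = rng.weibull(0.8, size=50_000) * 100.0
print(fit_block_maxima(pressures))
print(fit_peaks_over_threshold(pressures))
```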
254

Modeling sea-level rise uncertainties for coastal defence adaptation using belief functions / Utilisation des fonctions de croyance pour la modélisation des incertitudes dans les projections de l'élévation du niveau marin pour l'adaptation côtière

Ben Abdallah, Nadia 12 March 2014 (has links)
Coastal adaptation is an imperative to deal with the rise of the global sea level caused by ongoing global warming. However, when defining adaptation actions, coastal engineers encounter substantial uncertainties in the assessment of future hazards and risks. These uncertainties may stem from limited knowledge (e.g., about the magnitude of the future sea-level rise) or from the natural variability of some quantities (e.g., extreme sea conditions). A proper consideration of these uncertainties is of principal concern for efficient design and adaptation. The objective of this work is to propose a methodology for uncertainty analysis based on the theory of belief functions, an uncertainty formalism that offers greater flexibility than probabilities for handling both aleatory and epistemic uncertainties. In particular, it allows experts' incomplete knowledge (quantiles, intervals, etc.) to be represented more faithfully and evidence from multiple sources to be combined while taking into account their dependences and reliabilities. Statistical evidence can be modelled by likelihood-based belief functions, which are simply the translation of some inference principles into evidential terms. By exploiting the mathematical equivalence between belief functions and random intervals, uncertainty can be propagated through models by Monte Carlo simulation. We use this method to quantify uncertainty in projections of the global sea-level rise by 2100 and evaluate its impact on some coastal risk indicators used in coastal design. Sea-level rise projections are derived from physical modelling, expert elicitation, and historical sea-level measurements. Then, within a methodologically oriented case study, we assess the impact of climate change on extreme sea conditions and evaluate the reinforcement of a typical coastal defence asset required to maintain its functional performance.
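A toy sketch of the propagation step described above, assuming expert knowledge on sea-level rise is encoded as random intervals (the random-set view of a belief function): Monte Carlo draws of [lower, upper] bounds are pushed through a monotone exceedance criterion, and the belief and plausibility of exceeding a design freeboard are the fractions of intervals that necessarily or possibly exceed it. All numbers and the criterion are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Random intervals for sea-level rise by 2100 (metres): each draw is a [low, high]
# interval, a crude stand-in for an expert-elicited belief function
centre = rng.normal(0.6, 0.15, size=n)
half_width = rng.uniform(0.05, 0.30, size=n)
slr_low, slr_high = centre - half_width, centre + half_width

freeboard = 0.9   # hypothetical design margin of the coastal structure (m)

necessarily_exceeds = slr_low > freeboard    # even the lower bound exceeds the margin
possibly_exceeds = slr_high > freeboard      # at least the upper bound exceeds it

belief = necessarily_exceeds.mean()          # Bel(exceedance)
plausibility = possibly_exceeds.mean()       # Pl(exceedance) >= Bel(exceedance)
print(f"Bel = {belief:.3f}, Pl = {plausibility:.3f}")
```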
255

Stochastic Modelling of Daily Peak Electricity Demand Using Extreme Value Theory

Boano-Danquah, Jerry 21 September 2018 (has links)
MSc (Statistics) / Department of Statistics / Daily peak electricity data from Eskom, the South African power utility company, for the period January 1997 to December 2013, consisting of 6209 observations, were used in this dissertation. Since 1994, increased electricity demand has led to sustainability issues in South Africa. In addition, electricity demand continues to rise every day due to a variety of driving factors. Considering this, if the electricity generating capacity in South Africa does not show potential signs of meeting the country's demand in the coming years, this may have a significant impact on the national grid, causing it to operate in a risky and vulnerable state and leading to disturbances such as the load shedding experienced during the past few years. In particular, it is of great interest to have sufficient information about the extreme values of the stochastic load process in time, for proper planning and for designing the generation and distribution system and the storage devices, as these ensure the efficient use of electrical energy and help maintain stability in the grid. More importantly, electricity is an important commodity used mainly as a source of energy in the industrial, residential and commercial sectors. Effective monitoring of electricity demand is of great importance because demand that exceeds the maximum power generated will lead to power outages and load shedding. It is in light of this that the study seeks to assess the frequency of occurrence of extreme peak electricity demand in order to come up with a full electricity demand distribution capable of managing uncertainties in the grid system. In order to achieve stationarity in the daily peak electricity demand (DPED), we apply a penalized regression cubic smoothing spline to ensure the data are non-linearly detrended. The R package “evmix” is used to estimate the thresholds using the boundary-corrected kernel density plot. The non-linearly detrended datasets were divided into summer, spring, winter and autumn according to the calendar dates in the Southern Hemisphere for frequency analysis. The data are declustered using Ferro and Segers' automatic declustering method. The cluster maxima are extracted using the R package “evd”. We fit a Poisson GPD and a stationary point process to the cluster maxima, and the intensity function of the point process, which measures the frequency of occurrence of the daily peak electricity demand per year, is calculated for each dataset. The formal goodness-of-fit tests based on the Cramér-von Mises statistic and the Anderson-Darling statistic supported the null hypothesis that each dataset follows a Poisson GPD (σ, ξ) at the 5 percent level of significance. The modelling framework, which is easily extensible to other peak load parameters, is based on the assumption that peak power follows a Poisson process. The parameters of the developed models were estimated using maximum likelihood. The usual asymptotic properties underlying the Poisson GPD were satisfied by the model. / NRF
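The workflow above is built on the R packages “evmix” and “evd”; the following Python fragment is only a rough re-sketch of the core steps, substituting a simple runs declustering for the Ferro and Segers method and fitting a GPD to the resulting cluster maxima on synthetic data. It is a sketch under those simplifying assumptions, not a reproduction of the dissertation's code.

```python
import numpy as np
from scipy.stats import genpareto

def runs_decluster(series, threshold, run_length=3):
    """Simple runs declustering: exceedances separated by at least `run_length`
    non-exceedances are treated as belonging to separate clusters."""
    cluster_maxima, current, gap = [], [], run_length
    for value in series:
        if value > threshold:
            current.append(value)
            gap = 0
        else:
            gap += 1
            if current and gap >= run_length:
                cluster_maxima.append(max(current))
                current = []
    if current:
        cluster_maxima.append(max(current))
    return np.array(cluster_maxima)

rng = np.random.default_rng(7)
demand = rng.gumbel(size=1500)                    # stand-in for detrended seasonal DPED
u = np.quantile(demand, 0.95)                     # threshold (evmix would inform this)

maxima = runs_decluster(demand, u)
xi, _, sigma = genpareto.fit(maxima - u, floc=0)  # GPD(sigma, xi) for cluster maxima
rate = len(maxima) / (len(demand) / 365.25)       # cluster frequency per year
print(f"clusters/year = {rate:.1f}, xi = {xi:.2f}, sigma = {sigma:.2f}")
```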
256

Modelling equity risk and extremal dependence: A survey of four African stock markets

Samuel, Richard Abayomi 18 May 2019 (has links)
Department of Statistics / MSc (Statistics) / The ripple effect of a stock market crash due to extremal dependence is a global issue receiving key attention, and it is at the core of all modelling efforts in risk management. Two methods of extreme value theory (EVT) were used in this study to model equity risk and extremal dependence in the tails of stock market indices from four African emerging markets: South Africa, Nigeria, Kenya and Egypt. The first is the "bivariate-threshold-excess model" and the second is the "point process approach". With regard to the univariate analysis, the first finding of the study shows, in descending hierarchy, that volatility with persistence is highest in the South African market, followed by the Egyptian market, then the Nigerian market and lastly the Kenyan equity market. In terms of risk hierarchy, the Egyptian EGX 30 market is the most risk-prone, followed by the South African JSE-ALSI market, then the Nigerian NIGALSH market, and the least risky is the Kenyan NSE 20 market. It is therefore concluded that risk is not a brainchild of volatility in these markets. For the bivariate modelling, the extremal dependence findings indicate that the African continent's regional equity markets present a huge investment platform for investors and traders, and offer tremendous opportunity for portfolio diversification and investment synergies between markets. These synergistic opportunities are due to the markets being asymptotically (extremal) independent or (very) weakly asymptotically dependent and negatively dependent. This outcome is consistent with the findings of Alagidede (2008), who analysed these same markets using co-integration analysis. The bivariate-threshold-excess and point process models are appropriate for modelling the markets' risks. For modelling the extremal dependence, however, given the same marginal threshold quantile, the point process has more access to the extreme observations due to its wider sphere of coverage than the bivariate-threshold-excess model. / NRF
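A small illustration of what extremal dependence means in practice, assuming two aligned loss series: the empirical coefficient chi(q) = P(Y above its q-quantile | X above its q-quantile) tends to zero as q approaches 1 under asymptotic independence and stays positive under asymptotic dependence. This generic diagnostic is only a stand-in, not the bivariate-threshold-excess or point-process fits used in the dissertation.

```python
import numpy as np

def empirical_chi(x, y, q=0.95):
    """Empirical tail dependence: P(y above its q-quantile | x above its q-quantile)."""
    x, y = np.asarray(x), np.asarray(y)
    joint = np.mean((x > np.quantile(x, q)) & (y > np.quantile(y, q)))
    return joint / (1.0 - q)

# Two simulated "market loss" series with weak dependence
rng = np.random.default_rng(3)
z = rng.standard_t(df=4, size=(10_000, 2))
losses_a = z[:, 0]
losses_b = 0.3 * z[:, 0] + np.sqrt(1 - 0.3 ** 2) * z[:, 1]

for q in (0.90, 0.95, 0.99):
    print(q, round(empirical_chi(losses_a, losses_b, q), 3))
```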
257

Evaluation of methods for quantifying returns within the premium pension / Utvärdering av metoder för beräkning av internräntani premiepensionen

Backman, Emil, Petersson, David January 2020 (has links)
Pensionsmyndigheten's (the Swedish Pensions Agency's) current calculation of the internal rate of return for 7.7 million premium pension savers is both time and resource consuming. This rate of return mirrors the overall performance of the funded part of the pension system and is analyzed internally, but also reported to the public monthly and yearly based on differently sized data samples. This thesis investigates the possibility of utilizing other approaches in order to improve the performance of these calculations. Further, the study aims to verify the results stemming from said calculations and investigate their robustness. In order to investigate competitive matrix methods, a sample of approaches is compared to the more classical numerical methods. The approaches are compared in different scenarios intended to mirror real practice. The robustness of the results is then analyzed with a stochastic modeling approach, where a small error term is introduced to mimic possible errors that could arise in data management. It is concluded that a combination of Halley's method and the Jacobi-Davidson algorithm is the most robust and best-performing method. The proposed method combines the speed of numerical methods with the robustness of matrix methods. The results show a performance improvement of 550% in time, while maintaining the accuracy of the current server computations. The analysis of error propagation suggests that the output error is less than 0.12 percentage points in 99 percent of cases, given an introduced error term of large proportions. In this extreme case, the modeled expected number of individuals with an error exceeding 1 percentage point is estimated to be 212 out of the whole population.
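A compact sketch of the numerical core, Halley's method applied to the internal-rate-of-return equation of one saver's cash flows. The cash-flow representation, starting guess and stopping rule are illustrative assumptions; the Jacobi-Davidson part of the proposed combination is not reproduced here.

```python
import numpy as np

def irr_halley(cashflows, times, r0=0.05, tol=1e-12, max_iter=50):
    """Internal rate of return via Halley's method on the NPV equation.
    cashflows: signed amounts (contributions negative, ending value positive);
    times:     times in years at which the cash flows occur."""
    cf, t = np.asarray(cashflows, float), np.asarray(times, float)
    r = r0
    for _ in range(max_iter):
        d = 1.0 + r
        f = np.sum(cf * d ** (-t))                     # NPV(r)
        f1 = np.sum(-t * cf * d ** (-t - 1))           # NPV'(r)
        f2 = np.sum(t * (t + 1) * cf * d ** (-t - 2))  # NPV''(r)
        step = 2 * f * f1 / (2 * f1 ** 2 - f * f2)     # Halley update
        r -= step
        if abs(step) < tol:
            break
    return r

# Hypothetical saver: yearly contributions of 100, account worth 1500 after 10 years
flows = [-100] * 10 + [1500]
times = list(range(10)) + [10]
print(f"IRR ≈ {irr_halley(flows, times):.4%}")
```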
258

Simulation-Based Portfolio Optimization with Coherent Distortion Risk Measures / Simuleringsbaserad portföljoptimering med koherenta distortionsriskmått

Prastorfer, Andreas January 2020 (has links)
This master's thesis studies portfolio optimization using linear programming algorithms. The contribution of the thesis is an extension of the convex framework for portfolio optimization with Conditional Value-at-Risk introduced by Rockafellar and Uryasev. The extended framework considers risk measures belonging to the intersection of the classes of coherent risk measures and distortion risk measures, known as coherent distortion risk measures. The risk measures from this class considered here are the Conditional Value-at-Risk, the Wang Transform, the Block Maxima and the Dual Block Maxima measures. The extended portfolio optimization framework is applied to a reference portfolio consisting of stocks, options and a bond index, all from the Swedish market. The returns of the assets in the reference portfolio are modelled with elliptical distributions and with normal copulas with asymmetric marginal return distributions. The portfolio optimization framework is simulation-based and measures risk using scenarios simulated from the assumed portfolio distribution model. To model the return data with asymmetric distributions, the tails of the marginal distributions are fitted with generalized Pareto distributions, and the dependence structure between the assets is captured using a normal copula. The results of the optimizations are compared across the different distributional return assumptions and the four risk measures. A Markowitz solution to the problem is also computed, using the mean average deviation as the risk measure; this serves as the benchmark against which the optimal solutions under the coherent distortion risk measures are compared. The coherent distortion risk measures have the tractable property of being able to assign user-defined weights to different parts of the loss distribution and hence to value increasing loss severities as greater risks. This user-defined loss-weighting property and the asymmetric return distribution models are used to find optimal portfolios that account for extreme losses. An important finding of this project is that optimal solutions for asset returns simulated from asymmetric distributions are associated with greater risks, which is a consequence of more accurate modelling of the distribution tails. Furthermore, weighting larger losses with increasingly larger weights shows that the portfolio risk is greater, and a safer position is taken.
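The Rockafellar-Uryasev construction that the framework extends can be written as a small linear program over simulated scenarios. The sketch below is a generic long-only version of that LP with Conditional Value-at-Risk only (no Wang-transform or block-maxima distortions) and with illustrative scenario data; it is not the thesis implementation.

```python
import numpy as np
from scipy.optimize import linprog

def min_cvar_portfolio(scenarios, alpha=0.95):
    """Rockafellar-Uryasev LP: minimise CVaR_alpha of the portfolio loss over
    long-only weights, given an S x d matrix of simulated return scenarios."""
    S, d = scenarios.shape
    # Decision vector: [w (d weights), zeta (VaR level), z (S auxiliary excesses)]
    c = np.concatenate([np.zeros(d), [1.0], np.full(S, 1.0 / ((1 - alpha) * S))])

    # z_s >= -r_s @ w - zeta   <=>   -r_s @ w - zeta - z_s <= 0
    A_ub = np.hstack([-scenarios, -np.ones((S, 1)), -np.eye(S)])
    b_ub = np.zeros(S)

    # Full investment: weights sum to one
    A_eq = np.concatenate([np.ones(d), [0.0], np.zeros(S)]).reshape(1, -1)
    b_eq = [1.0]

    bounds = [(0, 1)] * d + [(None, None)] + [(0, None)] * S
    res = linprog(c, A_ub, b_ub, A_eq, b_eq, bounds=bounds, method="highs")
    return res.x[:d], res.fun                      # optimal weights, minimal CVaR

# Illustrative run on Gaussian scenarios for five assets
rng = np.random.default_rng(11)
scen = rng.multivariate_normal(np.full(5, 5e-4), 1e-4 * np.eye(5), size=2000)
w, cvar = min_cvar_portfolio(scen)
print(np.round(w, 3), round(cvar, 5))
```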
259

Tail Risk Protection via reproducible data-adaptive strategies

Spilak, Bruno 15 February 2024 (has links)
This dissertation shows the potential of machine learning methods for managing tail risk in a non-stationary and high-dimensional setting. For this, we compare in a robust manner data-dependent approaches from parametric or non-parametric statistics with data-adaptive methods. As these methods need to be reproducible to ensure trust and transparency, we start by proposing a new platform called Quantinar, which aims to set a new standard for academic publications. In the second chapter, we turn to the core subject of this thesis and compare various parametric, local parametric, and non-parametric methods to create a dynamic trading strategy that protects against tail risk in the Bitcoin cryptocurrency. In the third chapter, we propose a new portfolio allocation method, called NMFRB, that deals with high dimensions thanks to a dimension reduction technique, convex Non-negative Matrix Factorization. This technique allows us to find latent, interpretable portfolios that are diversified out-of-sample. We show in two universes that the proposed method outperforms other classical machine-learning-based methods, such as Hierarchical Risk Parity (HRP), in terms of risk-adjusted returns. We also test the robustness of our results via Monte Carlo simulation. Finally, the last chapter combines our previous approaches to develop a tail-risk protection strategy for portfolios: we extend NMFRB to tail-risk measures, address the non-linear relationships between assets during tail events by developing a specific non-linear latent factor model, and develop a dynamic tail-risk protection strategy that deals with the non-stationarity of asset returns using classical econometric models. We show that our strategy successfully reduces large drawdowns and outperforms other modern tail-risk protection strategies such as the Value-at-Risk-spread strategy. We verify our findings by performing various data-snooping tests.
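A stylised sketch of what a dynamic tail-risk protection rule can look like, assuming daily returns of a single risky asset: when a rolling empirical Value-at-Risk forecast breaches a target level, the strategy steps out of the asset into cash for the next day. This deliberately simple rule, with all parameters chosen for illustration, is only a stand-in for the data-adaptive strategies described above.

```python
import numpy as np

def tail_protected_returns(returns, window=250, alpha=0.99, var_target=0.03):
    """Hold the asset only while the rolling empirical VaR forecast is below the
    target; otherwise hold cash (zero return) for the next day."""
    returns = np.asarray(returns, float)
    out = np.zeros_like(returns)
    for t in range(window, len(returns)):
        var_forecast = -np.quantile(returns[t - window:t], 1 - alpha)  # 1-day VaR
        out[t] = returns[t] if var_forecast <= var_target else 0.0
    return out

# Synthetic stand-in for a volatile asset such as Bitcoin
rng = np.random.default_rng(5)
r = rng.standard_t(df=3, size=3000) * 0.02
protected = tail_protected_returns(r)
print("worst cumulative loss, protected:", round(np.min(np.cumsum(protected)), 3),
      "vs unprotected:", round(np.min(np.cumsum(r)), 3))
```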
260

Aprofundando as noções de dependência e envelhecimento em distribuições bivariadas de probabilidade / Deepening the notions of dependence and aging in bivariate probability distributions

Pinto, Jayme Augusto Duarte Pereira 21 March 2014 (has links)
The bivariate Marshall-Olkin distribution is extended by relaxing the assumption of exponentially distributed shocks and by allowing dependence between the individual shocks. A similar approach is considered for its dual version. A copula representation, probabilistic and reliability properties, as well as extreme-value results are then obtained. The bivariate lack-of-memory property is extended by assuming a memoryless dependence function. A new class of distributions characterized by this extended property is introduced. The corresponding geometric interpretations, construction procedures, stochastic representation, relation to the survival copula and reliability properties are derived. / Keywords: Bivariate Marshall-Olkin model, Dual model, Exponential representation, Dependence function, Bivariate aging, Copula, Survival copula, Stochastic order, Bivariate extreme value distribution, Pickands measure, Pickands dependence function, Failure rate, Bivariate hazard gradient, Bivariate lack-of-memory, Residual lifetime vector, Characterization.
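For context on the starting point of the extension, a brief sketch of the classical Marshall-Olkin construction: each lifetime is the minimum of an individual exponential shock and a common exponential shock, which is what produces the singular component on the diagonal and the bivariate lack-of-memory property. The rates below are illustrative.

```python
import numpy as np

def simulate_marshall_olkin(n, lam1=1.0, lam2=1.5, lam12=0.5, seed=0):
    """Classical Marshall-Olkin: X = min(E1, E12), Y = min(E2, E12), with
    E1, E2, E12 independent exponential shocks at rates lam1, lam2, lam12."""
    rng = np.random.default_rng(seed)
    e1 = rng.exponential(1.0 / lam1, size=n)     # shock hitting component 1 only
    e2 = rng.exponential(1.0 / lam2, size=n)     # shock hitting component 2 only
    e12 = rng.exponential(1.0 / lam12, size=n)   # common shock hitting both
    return np.minimum(e1, e12), np.minimum(e2, e12)

x, y = simulate_marshall_olkin(100_000)
# Mass on the diagonal should be close to lam12 / (lam1 + lam2 + lam12) = 1/6
print("P(X == Y) ≈", round(float(np.mean(x == y)), 3))
```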
