101

Extreme-Value Analysis of Self-Normalized Increments / Extremwerteigenschaften der normierten Inkremente

Kabluchko, Zakhar 23 April 2007 (has links)
No description available.
102

台灣銀行業系統重要性之衡量 / Measuring Systemic Importance of Taiwan’s Banking System

林育慈, Lin, Yu Tzu Unknown Date (has links)
In this thesis, we apply the measure proposed by Gravelle and Li (2013) to examine the contribution of nine listed Taiwanese financial-holding banks to systemic risk. A bank's systemic importance is defined as the increase in systemic risk conditional on the crash of that bank, and is estimated with multivariate extreme value theory. Our empirical evidence shows that, first, the most systemically important bank is First Commercial Bank and the least important is CTBC Bank; CTBC Bank is significantly less important than the other banks, while the differences among the remaining banks are not significant. Second, banks established earlier have higher systemic importance, and banks with public ownership contribute more to systemic risk, on average, than private banks. Third, bank size and risk contribution are broadly positively related: the bigger a bank, the more systemically important it is, which raises a potential too-big-to-fail problem. Finally, banks with lower loan-to-deposit ratios are less systemically important, while no clear relation emerges between the capital adequacy ratio and systemic importance.
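To make the conditional definition above concrete, here is a minimal numerical sketch (not Gravelle and Li's actual multivariate-EVT estimator, and with simulated returns standing in for the nine banks): a bank's importance is proxied by how many other banks crash, on average, on days when that bank itself crashes, compared with the unconditional rate.

```python
# Illustrative proxy for systemic importance via empirical tail co-crash counts.
# Returns are simulated; a real study would use the banks' equity returns and a
# multivariate EVT estimate of the joint tail, as in Gravelle and Li (2013).
import numpy as np

rng = np.random.default_rng(0)
n_days, n_banks, q = 2000, 9, 0.05                 # 5% left tail defines a "crash"

cov = 0.5 * np.ones((n_banks, n_banks)) + 0.5 * np.eye(n_banks)
returns = rng.multivariate_normal(np.zeros(n_banks), cov, size=n_days)

thresholds = np.quantile(returns, q, axis=0)       # per-bank crash thresholds
crashes = returns < thresholds                     # boolean crash indicators

for i in range(n_banks):
    others = np.delete(np.arange(n_banks), i)
    uncond = crashes[:, others].sum(axis=1).mean()            # baseline co-crashes
    cond = crashes[crashes[:, i]][:, others].sum(axis=1).mean()
    print(f"bank {i}: {cond:.2f} co-crashes given its own crash "
          f"(baseline {uncond:.2f})")
```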
103

Statistical Post-Processing Methods And Their Implementation On The Ensemble Prediction Systems For Forecasting Temperature In The Use Of The French Electric Consumption

Gogonel, Adriana Geanina 27 November 2012 (has links) (PDF)
The objective of this thesis is to study new statistical methods for correcting temperature predictions that may be implemented on the ensemble prediction system (EPS) of Meteo France, so as to improve its use for electric system management at EDF France. The Meteo France EPS we work with contains 51 members (forecasts per time step) and gives temperature predictions for 14 days. The thesis contains three parts: in the first we present the EPS, implement two statistical methods improving the accuracy or the spread of the EPS, and introduce criteria for comparing the results. In the second part we introduce extreme value theory and the mixture models we use to combine the model built in the first part with models fitting the distribution tails. In the third part we introduce quantile regression as another way of studying the tails of the distribution.
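As a rough illustration of the quantile-regression idea in the third part (with invented forecast-error data and an invented lead-time covariate), the sketch below fits a linear model for the 95th percentile by minimizing the pinball loss:

```python
# Quantile regression by direct minimization of the pinball (check) loss.
# Data are synthetic: forecast errors whose spread grows with lead time.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 500
lead_time = rng.uniform(1, 14, n)                  # horizon in days, as in the EPS
error = 0.3 * lead_time + rng.standard_normal(n) * (0.5 + 0.2 * lead_time)

X = np.column_stack([np.ones(n), lead_time])       # intercept + horizon
tau = 0.95                                         # target quantile

def pinball(beta):
    r = error - X @ beta
    return np.mean(np.maximum(tau * r, (tau - 1) * r))

fit = minimize(pinball, x0=np.zeros(2), method="Nelder-Mead")
print("95%% quantile of the error ~ %.2f + %.2f * lead_time" % tuple(fit.x))
```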
104

Modeling, analysis, and optimization for wireless networks in the presence of heavy tails

Wang, Pu 13 January 2014 (has links)
The heavy-tailed traffic from wireless users, driven by emerging Internet and multimedia applications, induces an extremely dynamic and variable network environment, which can fundamentally change the way wireless networks are conceived, designed, and operated. This thesis is concerned with the modeling, analysis, and optimization of wireless networks in the presence of heavy tails. First, a novel traffic model is proposed that captures the inherent relationship between traffic dynamics and the joint effects of the mobility variability of network users and the spatial correlation in the physical phenomena they observe. Next, the asymptotic delay distribution of wireless users is analyzed under different traffic patterns and spectrum conditions, revealing the critical conditions under which wireless users experience heavy-tailed delay with significantly degraded QoS performance. Based on the delay analysis, the fundamental impact of the heavy-tailed environment on network stability is studied. Specifically, a new network stability criterion, namely moment stability, is introduced to better characterize QoS performance in the heavy-tailed environment, and a throughput-optimal scheduling algorithm is proposed to maximize network throughput while guaranteeing moment stability. Furthermore, the impact of heavy-tailed spectrum on network connectivity is investigated, and necessary conditions for the existence of delay-bounded connectivity are derived. To enhance connectivity, a mobility-assisted data forwarding scheme is exploited, and its key design parameters, such as the critical mobility radius, are derived. Finally, the latency in wireless mobile networks is analyzed and shown to grow asymptotically linearly in the initial distance between mobile users.
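A quick numerical aside on why a moment-based stability notion matters in this setting (illustrative only; the tail index is arbitrary): for a Pareto tail with index alpha, moments of order alpha and above are infinite, so the corresponding sample moments never settle no matter how much data arrives.

```python
# Heavy-tail sanity check: with tail index alpha = 1.5 the mean is finite
# (alpha/(alpha-1) = 3) but the variance is infinite, so the sample variance
# keeps growing with the sample size instead of converging.
import numpy as np

rng = np.random.default_rng(2)
alpha = 1.5
x = rng.pareto(alpha, 10_000_000) + 1.0        # Pareto(alpha) samples on [1, inf)

for n in (10**4, 10**5, 10**6, 10**7):
    print(f"n={n:>8}: sample mean {x[:n].mean():7.3f}, "
          f"sample variance {x[:n].var():14.1f}")
```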
105

Mathematical methods for portfolio management

Ondo, Guy-Roger Abessolo 08 1900 (has links)
Portfolio Management is the process of allocating an investor's wealth to investment opportunities over a given planning period. Not only should Portfolio Management be treated within a multi-period framework, but one should also take into consideration the stochastic nature of related parameters. After a short review of key concepts from Finance Theory, e.g. utility functions, risk attitude, Value-at-Risk estimation methods, and mean-variance efficiency, this work describes a framework for formulating the Portfolio Management problem in a Stochastic Programming setting. Classical solution techniques for the resolution of the resulting Stochastic Programs (e.g. L-shaped Decomposition, approximation of the probability function) are presented. These are discussed within both the two-stage and the multi-stage case, with special emphasis on the former. A description of how Importance Sampling and EVPI are used to improve the efficiency of classical methods is presented. Postoptimality Analysis, a sensitivity analysis method, is also described. / Statistics / M. Sc. (Operations Research)
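For readers new to the two-stage setting emphasized above, a minimal example (all numbers invented) can be solved directly in its extensive form as a single linear program; the L-shaped method would solve the same problem by iterating between a first-stage master problem and per-scenario recourse problems.

```python
# Toy two-stage stochastic program, extensive form: buy quantity x at unit cost c
# before demand is known; in scenario s, the recourse decision y_s sells up to
# min(x, demand_s) at price r. Minimize cost minus expected revenue.
import numpy as np
from scipy.optimize import linprog

c, r = 1.0, 1.5
demand = np.array([60.0, 100.0, 140.0])            # scenario demands
prob = np.array([0.3, 0.4, 0.3])                   # scenario probabilities

obj = np.concatenate([[c], -prob * r])             # variables: [x, y1, y2, y3]
A_ub = np.hstack([-np.ones((3, 1)), np.eye(3)])    # y_s - x <= 0
b_ub = np.zeros(3)
bounds = [(0, None)] + [(0, d) for d in demand]    # 0 <= y_s <= demand_s

res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("optimal first-stage quantity:", res.x[0])   # 100 for these numbers
print("expected profit:", -res.fun)
```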
106

Aplicação da Teoria dos valores extremos em estratégias "Long-Short"

Monte-mor, Danilo Soares 17 December 2010 (has links)
Absolute-return funds (hedge funds) have increasingly appeared on the investment market with the main objective of improving their performance through arbitrage strategies, such as long-short strategies. It is the disproportionate, and even antagonistic, behavior of asset prices that allows players to structure strategies generating additional returns, above opportunity costs and independent of market movements. In this work we used Extreme Value Theory (EVT), an important branch of probability theory, to model the series of the direct price relationship between two pairs of assets. The quantiles obtained from this modeling, together with the quantiles provided by the normal distribution, were superimposed on data for periods subsequent to the period analyzed. From the comparison of these data we created a new quantitative long-short arbitrage strategy, which we call the GEV Long-Short Strategy.
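A minimal sketch of the GEV step described above, on simulated data (the toy price-ratio series, the monthly block size, and the quantile level are all assumptions, not the thesis's inputs): fit a GEV to block maxima of the log-changes of the ratio and compare its upper quantile with the Gaussian one.

```python
# GEV block-maxima fit for the price ratio of an asset pair, versus a normal fit.
import numpy as np
from scipy.stats import genextreme, norm

rng = np.random.default_rng(3)
ratio = np.exp((rng.standard_t(df=4, size=2520) * 0.01).cumsum())  # toy price ratio
rel = np.diff(np.log(ratio))                                       # log-changes

block = 21                                         # roughly one trading month
maxima = rel[: len(rel) // block * block].reshape(-1, block).max(axis=1)

c, loc, scale = genextreme.fit(maxima)             # GEV shape, location, scale
print("GEV    99% quantile of block maxima:",
      genextreme.ppf(0.99, c, loc=loc, scale=scale))
print("Normal 99% quantile of block maxima:",
      norm.ppf(0.99, loc=maxima.mean(), scale=maxima.std()))
```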
107

Aplicação da Teoria do Valor Extremo e Copulas para avaliar risco de mercado de ações brasileiras / Application of extreme value theory and copulas to assess risk of the stock market in Brazil

Angelo Santos Alves 26 September 2013 (has links)
Financial institutions are required by international agreements, such as the Basel Accord, to assess the market risk to which they are exposed, so as to shield their assets from contamination by financial disasters. To capture such phenomena, models are needed that describe extreme movements of the return series more accurately. The main objective of this work was to apply Extreme Value Theory, together with copulas, to the estimation of extreme quantiles for VaR. It uses Monte Carlo simulation, Extreme Value Theory, and copulas with Gaussian and t distributions. The resulting estimates are then compared with those of a second model, filtered historical simulation (FHS). The techniques are applied to a portfolio of stocks of Brazilian companies.
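To illustrate the copula ingredient (a hedged sketch, not the thesis's exact specification): simulate joint returns for two stocks with a Gaussian copula over Student-t margins, then read the one-day VaR off the simulated portfolio loss distribution.

```python
# Gaussian copula over t margins, Monte Carlo VaR for an equally weighted pair.
import numpy as np
from scipy.stats import norm, t

rng = np.random.default_rng(4)
n_sim, rho = 100_000, 0.6

z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n_sim)
u = norm.cdf(z)                                    # Gaussian copula: uniforms

# t(4) margins rescaled to roughly 2% daily volatility per stock
returns = t.ppf(u, df=4) * 0.02 / np.sqrt(4 / (4 - 2))

weights = np.array([0.5, 0.5])
loss = -(returns @ weights)
print("1-day 99% VaR:", np.quantile(loss, 0.99))
```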
108

[en] ESSAYS ON THE RISK ASSOCIATED TO FORECASTING ELECTRICITY PRICES AND ON MODELING THE DEMAND OF ENERGY FROM AN ELECTRICITY DISTRIBUTOR / [pt] ENSAIOS SOBRE O RISCO DE PREVISÃO DE PREÇOS DE ENERGIA ELÉTRICA E MODELAGEM DE CARGA DEMANDADA A UMA DISTRIBUIDORA DE ELETRICIDADE

MARIO DOMINGUES DE PAULA SIMOES 31 July 2018 (has links)
[en] This thesis discusses the risk associated with the uncertainty present in forecasting electricity prices, as well as the uncertainty in forecasting the electrical load demanded from an electricity distributor. The first essay deals with the risk inherent in forecasting electricity prices, bearing in mind that the various existing models are notoriously imprecise; we therefore attempt to determine the risk incurred when a given forecasting technique is used, given that its predictions will probably be inaccurate. The approach adopted models the forecast residues with Extreme Value Theory, which proves satisfactorily accurate for the distribution of residues at quantiles as extreme as 98 per cent up to over 99.5 per cent, for different data sampling frequencies. The next chapter evaluates the electrical load demanded from a distributor, first using models such as ARMA and ARMAX and assessing their predictive efficiency. These models are known to be appropriate for short-term predictions, and we show by means of Monte Carlo simulations that extending them to long-term forecasts renders useless any attempt at sophistication through incorporating exogenous variables: since the error of any such long-range forecast is so large either way, with exogenous variables or without, a simple model is as useful as a sophisticated one in terms of confidence in the mean prediction. Finally, the last essay discusses the possibility of nonlinear effects in the data-generating process of the electrical load demanded from an energy distributor, without assuming that the process is purely linear. To this end we use nonlinear autoregressive regime-switching models, which are inherently resistant to possible structural breaks in the load series and are particularly appropriate for modeling asymmetries in the data-generating process. We show that even relatively simple self-excited TAR models with only two regimes, i.e. not resorting to any exogenous variables, can be more appropriate than linear autoregressive models, delivering better out-of-sample forecasts, while remaining relatively easy to estimate without sophisticated computational resources.
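As an illustration of the self-excited two-regime TAR models mentioned in the last essay (simulated load-like data with invented parameters), the threshold and the per-regime AR coefficients can be recovered by a simple grid search that minimizes the in-sample sum of squared residuals:

```python
# Two-regime SETAR(1) fit by grid search over the threshold.
import numpy as np

rng = np.random.default_rng(5)
n = 1000
y = np.zeros(n)
for k in range(1, n):                              # regime depends on the lagged value
    phi = 0.8 if y[k - 1] <= 0.0 else 0.3
    y[k] = phi * y[k - 1] + 0.5 * rng.standard_normal()

def ssr(thr):
    lo = y[:-1] <= thr                             # regime indicator on the lag
    total = 0.0
    for mask in (lo, ~lo):
        x, z = y[:-1][mask], y[1:][mask]
        phi_hat = (x @ z) / (x @ x)                # per-regime least-squares slope
        total += ((z - phi_hat * x) ** 2).sum()
    return total

grid = np.quantile(y, np.linspace(0.15, 0.85, 71))  # candidate thresholds
best = grid[int(np.argmin([ssr(th) for th in grid]))]
print("estimated threshold:", best)                # true value is 0.0
```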
109

Mesure du capital réglementaire par des modèles de risque de marché / Measure of capital requirement by market risk models

Kourouma, Lancine 11 May 2012 (has links)
Following the financial and economic crisis of 2008, it was observed that the regulatory capital held against banks' trading portfolios was significantly lower than the actual losses. To understand the causes of this capital shortfall, we evaluate the reliability of market risk models and propose stress-testing methodologies for managing extreme risks. The objective is to measure the capital requirement on a trading portfolio composed of shares and commodities, through the Value at Risk (VaR) and the Expected Shortfall. To this end we use the Generalized Pareto Distribution (GPD) model and two internal models commonly used by banks: the historical simulation method and the normal-distribution model. 
A first evaluation of reliability, made on the three risk models under the hypothesis of constant volatility, shows that the banks' internal models and the GPD model do not measure the portfolio's risk correctly during crisis periods. The GPD model is reliable in periods of low volatility, but with a strong overestimation of the real risk, which can lead banks to hold more regulatory capital than necessary. A second evaluation of reliability was made under the hypothesis of changing volatility and by taking into account the asymmetric effect of financial returns. The GPD model proved the most reliable irrespective of market conditions, and accounting for changing volatility improved the performance of the banks' internal models. Integrating historical and hypothetical scenarios into the risk models improved the estimation of extreme risk while reducing the subjectivity for which stress-testing techniques are criticized. Stress tests performed with the banks' internal models do not measure extreme risk correctly; the GPD model is better suited to stress testing. We developed a stress-testing algorithm that allows banks to estimate the extreme risk of their portfolios and to identify the risk factors responsible for it. Calculating the regulatory capital as the sum of the VaR and the stressed VaR is not logical and doubles banks' regulatory capital, which in turn tightens credit to the economy. We observe that the multiplier coefficient and the square-root-of-time rule of the Basel accord lead banks to arbitrate in favor of unreliable risk models.
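A peaks-over-threshold sketch of how GPD-based VaR and Expected Shortfall are typically computed (these are the standard EVT formulas, e.g. in McNeil, Frey and Embrechts; the loss series and the threshold choice here are illustrative, not the thesis's portfolio):

```python
# GPD tail fit on threshold exceedances, then VaR and ES at the 99% level.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(6)
losses = rng.standard_t(df=3, size=5000) * 0.01    # toy daily portfolio losses

u = np.quantile(losses, 0.95)                      # threshold at the 95th percentile
exc = losses[losses > u] - u                       # exceedances over the threshold
xi, _, beta = genpareto.fit(exc, floc=0)           # GPD shape xi and scale beta

n, n_u, q = len(losses), len(exc), 0.99
var_q = u + beta / xi * ((n / n_u * (1 - q)) ** (-xi) - 1)
es_q = (var_q + beta - xi * u) / (1 - xi)          # valid when xi < 1
print(f"99% VaR: {var_q:.4f}   99% Expected Shortfall: {es_q:.4f}")
```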
