381

Régulation du marché des matières premières / Regulation in commodities markets

Huang, Xiaoying 11 April 2016 (has links)
The thesis is organized around two parts, the physical and the financial market for agricultural commodities, and three themes: mitigating the negative impact of extreme price behaviour through effective risk management, the impact of new derivative products such as index funds on commodity storage, and the often-ignored regulatory role of central banks. It provides an overview of the evolution of commodity markets and focuses on price behaviour in both the physical and the financial derivatives markets. Instead of directly analysing regulation in commodity markets, it highlights market changes and specific market behaviour, which carry potential implications for market regulation.

Results in physical markets confirm the evidence of jumps in agricultural commodity prices. Price volatility varies with time and is not constant. Commodity prices, at least in agricultural markets, oscillated sharply during 2007/2008, coinciding with the financial crisis. Relatively frequent, small jumps in commodity prices are probably due to financial market factors rather than market fundamentals. The implication of these results for the risk management of agricultural cooperatives is that, although a risk measure accounting for jumps does not outperform a traditional VaR based on the normal distribution, considering this kind of extreme price variation can be complementary for risk managers facing a highly volatile commodity market.

Findings for financial derivatives markets lead to two conclusions. Firstly, commodity index funds are confirmed again as having a short-term impact on futures prices for most products. Building on the theoretical conclusion about the intermediary role of inventory in the impact of speculation on commodity markets, we find that commodity index funds can influence commodity futures prices without necessarily passing through inventory changes, probably owing to price inelasticity. Secondly, regarding the impact of central bank announcements on commodity prices, the study shows that the central bank can act as an alternative regulator influencing the commodity market, and that commodity prices incorporate information from macroeconomic factors such as the exchange rate and the inflation rate.
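To make the role of jumps concrete, here is a minimal sketch (not the model estimated in the thesis) that simulates returns from a diffusion plus a rare jump component and compares a parametric-normal VaR with the empirical quantile at a very high confidence level; every parameter value is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
diffusion = rng.normal(0.0, 0.01, n)            # smooth daily variation
has_jump = rng.random(n) < 0.02                 # roughly 2% of days jump
jumps = has_jump * rng.normal(0.0, 0.05, n)     # rare but large moves
returns = diffusion + jumps

alpha = 0.001                                   # 99.9% confidence level
z = -3.090                                      # standard normal 0.1% quantile
var_normal = -(returns.mean() + returns.std() * z)
var_empirical = -np.quantile(returns, alpha)
print(f"normal VaR: {var_normal:.4f}  empirical VaR: {var_empirical:.4f}")
```

At this level the normal model understates the loss badly, while at the more usual 99% level the two estimates are much closer, which is consistent with the finding that a jump-aware VaR does not automatically dominate the traditional one.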
382

Risk–based modeling, simulation and optimization for the integration of renewable distributed generation into electric power networks / Modélisation, simulation et optimisation basée sur le risque pour l’intégration de génération distribuée renouvelable dans des réseaux de puissance électrique

Mena, Rodrigo 30 June 2015 (has links)
Renewable distributed generation (DG) is expected to continue playing a fundamental role in the development and operation of sustainable, efficient and reliable electric power systems, by virtue of offering a practical alternative to diversify and decentralize the overall power generation, benefiting from cleaner and safer energy sources. The integration of renewable DG into existing electric power networks poses socio-techno-economic challenges, which have attracted substantial research and advancement.

In this context, the focus of the present thesis is the design and development of a modeling, simulation and optimization framework for the integration of renewable DG into electric power networks. The specific problem considered is that of selecting the technology, size and location of renewable generation units, under technical, operational and economic constraints. Within this problem, the key research questions to be addressed are: (i) the representation and treatment of the uncertain physical variables (such as the availability of diverse primary renewable energy sources, bulk power supply, power demands and the occurrence of component failures) that dynamically determine the DG-integrated network operation, (ii) the propagation of these uncertainties onto the system operational response and the control of the associated risk, and (iii) the intensive computational effort resulting from the complex combinatorial optimization problem of renewable DG integration.

For the evaluation of the system under a given plan of renewable DG, a non-sequential Monte Carlo simulation and optimal power flow (MCS-OPF) computational model has been designed and implemented, which emulates the DG-integrated network operation. Random realizations of operational scenarios are generated by sampling from the distributions of the different uncertain variables, and for each scenario the system performance is evaluated in terms of economics and reliability of power supply, represented by the global cost (CG) and the energy not supplied (ENS), respectively. To measure and control the risk relative to system performance, two indicators are introduced: the conditional value-at-risk (CVaR) and the CVaR deviation (DCVaR).

For the optimal selection of technology, size and location of the renewable DG units, two distinct multi-objective optimization (MOO) approaches have been implemented with heuristic optimization (HO) search engines. The first approach is based on the fast non-dominated sorting genetic algorithm (NSGA-II) and aims at the concurrent minimization of the expected values of CG and ENS, denoted ECG and EENS respectively, combined with their corresponding CVaR(CG) and CVaR(ENS) values; the second approach carries out a MOO differential evolution (DE) search to minimize simultaneously ECG and its associated deviation DCVaR(CG). Both optimization approaches embed the MCS-OPF computational model to evaluate the performance of each DG-integrated network proposed by the HO search engine. The challenge posed by the large computational effort required by the proposed simulation and optimization frameworks has been addressed by introducing an original technique that nests hierarchical clustering analysis (HCA) within a DE search engine.

Examples of application of the proposed frameworks have been worked out, regarding an adaptation of the IEEE 13-bus distribution test feeder and a realistic setting of the IEEE 30-bus sub-transmission and distribution test system. The results show that these frameworks are effective in finding optimal DG-integrated network solutions, while controlling risk from two distinct perspectives: directly through the use of CVaR, and indirectly by targeting uncertainty in the form of DCVaR. Moreover, CVaR acts as an enabler of trade-offs between optimal expected performance and risk, and DCVaR also integrates uncertainty into the analysis, providing a wider spectrum of information for well-supported and confident decision making.
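The two risk indicators can be computed directly from Monte Carlo output. Below is a minimal sketch assuming the usual Rockafellar-Uryasev definitions, with a lognormal sample standing in for the global-cost scenarios produced by the MCS-OPF model; it does not reproduce any part of the thesis's power-network simulation.

```python
import numpy as np

def cvar(samples: np.ndarray, alpha: float = 0.95) -> float:
    """Mean of the worst (1 - alpha) fraction of outcomes (higher is worse)."""
    var = np.quantile(samples, alpha)       # value-at-risk at level alpha
    return samples[samples >= var].mean()

def dcvar(samples: np.ndarray, alpha: float = 0.95) -> float:
    """CVaR deviation: CVaR of the centered variable, a pure dispersion measure."""
    return cvar(samples - samples.mean(), alpha)

rng = np.random.default_rng(0)
cg = rng.lognormal(mean=10.0, sigma=0.4, size=50_000)   # stand-in global costs
print(f"ECG   = {cg.mean():.1f}")
print(f"CVaR  = {cvar(cg):.1f}")
print(f"DCVaR = {dcvar(cg):.1f}")
```

In the multi-objective searches, ECG and EENS would be minimized together with these indicators, each evaluated on the sampled scenarios of a candidate network.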
383

Causalidade Granger em medidas de risco / Granger Causality with Risk Measures

Patricia Nagami Murakami 02 May 2011 (has links)
This work presents a study of bivariate Granger causality in risk applied to financial time series. In the case of financial series, risk events are related to the assessment of the Value at Risk of positions in assets. To identify these events, CAViaR models, which belong to the family of quantile regression models, were used. The main concepts involved in the modeling are presented, together with the definitions needed to understand them. By analysing Granger causality in risk between two series, we can investigate whether one of them is able to predict the occurrence of an extreme value of the other. The usual Granger causality analysis was also carried out, for comparison only. / Keywords: Quantile Regression, Value at Risk, CAViaR Model, Granger Causality, Granger Causality in Risk
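For readers unfamiliar with CAViaR, the sketch below implements the symmetric absolute value (SAV) specification of Engle and Manganelli, one member of the family used in the dissertation. The coefficients are illustrative assumptions, not estimates; in practice they are obtained by minimizing the quantile (pinball) loss, and the resulting exceedance indicator is the ingredient fed into tests of Granger causality in risk.

```python
import numpy as np

def caviar_sav(returns: np.ndarray, beta: tuple, q0: float) -> np.ndarray:
    """SAV recursion: VaR_t = b0 + b1 * VaR_{t-1} + b2 * |r_{t-1}|."""
    b0, b1, b2 = beta
    q = np.empty_like(returns)
    q[0] = q0
    for t in range(1, len(returns)):
        q[t] = b0 + b1 * q[t - 1] + b2 * abs(returns[t - 1])
    return q

rng = np.random.default_rng(1)
r = rng.standard_t(df=5, size=1000) * 0.01      # illustrative return series
var_series = caviar_sav(r, beta=(0.002, 0.85, 0.25), q0=0.02)
hits = -r > var_series                          # days on which loss exceeds VaR
print(f"hit rate: {hits.mean():.3f} (should match the tail level after fitting)")
```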
384

Ensaios em modelagem de dependência em séries financeiras multivariadas utilizando cópulas / Essays on dependence modeling in multivariate financial series using copulas

Tófoli, Paula Virgínia January 2013 (has links)
This work was motivated by the strong demand for more precise and realistic dependence models for applications to multivariate financial data. The financial crisis of 2007-2009 made clear how important precise dependence modeling is for the accurate assessment of financial risk: misperceptions about extreme dependencies between different financial assets were an important element of the subprime crisis. The famous theorem of Sklar (1959) introduced copulas as a tool to model more intricate patterns of dependence. It states that any n-dimensional joint distribution function can be decomposed into its n marginal distributions and a copula, where the latter completely characterizes the dependence among the variables. While there is a variety of bivariate copula families that can match a wide range of complex dependencies, the set of higher-dimensional copulas was quite restricted until recently. Joe (1996) proposed a construction of multivariate distributions based on pair-copulas (bivariate copulas), called the pair-copula construction or vine copula model, which has overcome this issue. In this thesis, we develop three papers that explore copula theory to obtain very flexible multivariate dependence models for applications to financial data.

Patton (2006) extended Sklar's theorem to the conditional case and rendered the dependence parameter of the copula time-varying. In the first paper, we introduce a new approach to modeling dependence between international financial returns over time, combining time-varying copulas and the Markov switching model. We apply these copula models, and also those proposed by Patton (2006), Jondeau and Rockinger (2006) and Silva Filho et al. (2012a), to the returns of the FTSE 100, CAC 40 and DAX indexes. We compare these methodologies in terms of the resulting dynamics of dependence and the models' abilities to forecast Value-at-Risk (VaR). Interestingly, all the models identify a long period of high dependence between the returns beginning in 2007, when the subprime crisis was evolving. Surprisingly, the elliptical copulas perform best in forecasting the extreme quantiles of the portfolio returns.

In the second paper, we extend our study to the case of n > 2 variables, using the vine copula model to investigate the dependence structure of the broad stock market indexes CAC 40, DAX, FTSE 100, S&P 500 and IBOVESPA, and, in particular, to check the asymmetric dependence hypothesis in this case. Based on our empirical results, however, this hypothesis cannot be verified. Perhaps asymmetric dependence with stronger lower tails occurs only temporarily, which suggests that incorporating time variation into the vine copula model can improve it as a tool to model multivariate international financial data. So, in the third paper, we introduce dynamics into the vine copula model by allowing the dependence parameters of the pair-copulas in a D-vine decomposition to be potentially time-varying, following a nonlinear restricted ARMA(1,m) process as in Patton (2006). The proposed model is evaluated in simulations and further assessed with respect to the accuracy of Value-at-Risk (VaR) forecasts in crisis periods. The Monte Carlo experiments are quite favorable to the dynamic D-vine copula in comparison with a static D-vine copula. Moreover, the dynamic D-vine copula outperforms the static D-vine copula in terms of predictive accuracy for our data sets.
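Sklar's theorem has a direct computational reading: any joint law can be sampled by drawing from a copula and pushing the uniform coordinates through arbitrary inverse marginal CDFs. The sketch below does this for a bivariate Gaussian copula; the correlation and the marginals are illustrative assumptions, unrelated to the indexes studied in the essays.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
rho = 0.6
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=10_000)
u = stats.norm.cdf(z)                      # copula sample: uniform marginals

x = stats.t.ppf(u[:, 0], df=4) * 0.01      # fat-tailed "return" marginal
y = stats.lognorm.ppf(u[:, 1], s=0.3)      # skewed marginal
print(f"rank correlation carried by the copula: {stats.spearmanr(x, y)[0]:.3f}")
```

A vine construction chains bivariate building blocks of exactly this kind, one pair-copula per edge of the decomposition, which is what lets each pair have its own family and, in the dynamic version, its own time path.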
385

Value at risk e expected shortfall: medidas de risco e suas propriedades: um estudo empírico para o mercado brasileiro / Value at Risk and expected shortfall: risk measures and their properties: an empirical study for the Brazilian market

Moraes, Camila Corrêa 29 January 2013 (has links)
Value at Risk (VaR) and Expected Shortfall (ES) are quantitative models for measuring the market risk of portfolios of financial assets. The purpose of this study is to evaluate the results of these models for a portfolio traded in the Brazilian market through four backtesting methods, the Basel Traffic Light Test, the Kupiec test, the Christoffersen test and the McNeil and Frey test, covering periods of domestic (2002) and international (2008) financial crisis. The VaR model described here follows two approaches: Parametric Normal, where the distribution of asset returns is assumed to be Normal, and Historical Simulation, where no assumption is made about the distribution of asset returns, but they are assumed to be independent and identically distributed. The VaR results were also evaluated with the Cornish-Fisher expansion, which approximates the empirical distribution to a Normal distribution using its skewness and kurtosis. Another aspect examined was the property of coherence, which checks whether the risk measure satisfies four basic axioms: monotonicity, translation invariance, homogeneity and subadditivity. VaR is not considered a coherent risk measure because it does not satisfy subadditivity in all cases; ES, on the other hand, satisfies all four axioms and is therefore a coherent measure. The ES model was evaluated under the Parametric Normal approach. This work also verified, through the backtests, whether the coherence property improves the accuracy of the analyzed risk measures.
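A minimal sketch of the quantities compared in the dissertation, on an illustrative fat-tailed sample: parametric-normal VaR, historical-simulation VaR, the Cornish-Fisher adjustment (the standard four-term expansion in skewness and excess kurtosis), and parametric-normal ES. The data and the 99% level are assumptions for the example only.

```python
import numpy as np
from scipy import stats

def cornish_fisher_z(alpha: float, s: float, k: float) -> float:
    """Adjust the normal quantile using skewness s and excess kurtosis k."""
    z = stats.norm.ppf(alpha)
    return (z + (z**2 - 1) * s / 6
              + (z**3 - 3 * z) * k / 24
              - (2 * z**3 - 5 * z) * s**2 / 36)

rng = np.random.default_rng(3)
r = rng.standard_t(df=4, size=5000) * 0.012       # illustrative daily returns
mu, sigma = r.mean(), r.std()
s, k = stats.skew(r), stats.kurtosis(r)           # kurtosis() is excess kurtosis
alpha = 0.01                                      # 99% confidence

var_normal = -(mu + sigma * stats.norm.ppf(alpha))
var_hist = -np.quantile(r, alpha)
var_cf = -(mu + sigma * cornish_fisher_z(alpha, s, k))
es_normal = -(mu - sigma * stats.norm.pdf(stats.norm.ppf(alpha)) / alpha)
print(f"VaR normal {var_normal:.4f} | historical {var_hist:.4f} | "
      f"Cornish-Fisher {var_cf:.4f} | ES normal {es_normal:.4f}")
```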
386

Modelagem de perdas com ações trabalhistas em instituições financeiras / Modeling losses from labor lawsuits in financial institutions

Rachman, Luciano 07 August 2013 (has links)
According to Basel, labor-related losses in financial institutions represent a substantial amount that should be considered in the regulatory capital model for operational risk. This dissertation demonstrates a way to measure the risk to which financial institutions are exposed in this type of loss. Several types of distributions are analyzed according to their fit to both the frequency and the severity of losses. For the frequency values, a sample of actual data was obtained, whilst for severity, values taken from research institute reports were used as inputs for calculating labor claims under the Brazilian legislation in force, the CLT (Consolidation of Labor Laws).
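The frequency/severity approach described above reduces, computationally, to simulating a compound distribution. Here is a minimal sketch assuming a Poisson frequency and a lognormal severity (common choices, though the dissertation tests several candidates for fit); all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)
n_years = 50_000
lam = 12.0                                  # assumed mean number of lawsuits/year
counts = rng.poisson(lam, size=n_years)
aggregate = np.array([
    rng.lognormal(mean=10.0, sigma=1.2, size=n).sum()   # one simulated year
    for n in counts
])
print(f"mean annual loss: {aggregate.mean():,.0f}")
print(f"99.9% quantile (capital-style figure): {np.quantile(aggregate, 0.999):,.0f}")
```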
387

Tail Empirical Processes: Limit Theorems and Bootstrap Techniques, with Applications to Risk Measures

Loukrati, Hicham 07 May 2018 (has links)
In recent years, major changes in insurance and finance have drawn increasing attention to the need for a standardized framework for risk measurement. Recently, there has been growing interest among insurance experts in the conditional tail expectation (CTE), because it has properties considered desirable and applicable in various situations. In particular, it meets the requirements of a "coherent" risk measure in the sense of Artzner [2]. This thesis contributes to statistical inference by developing tools, based on the convergence of functional integrals, for the estimation of the CTE, which is of considerable interest to actuarial science. First, we develop a tool for estimating the conditional mean E[X | X > x]; we then construct estimators of the CTE, develop the asymptotic theory needed for these estimators, and use that theory to build confidence intervals. For the first time, the nonparametric bootstrap approach is explored in this thesis, yielding new results applicable to Value at Risk (VaR) and the CTE. Simulation studies illustrate the performance of the bootstrap technique.
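A minimal sketch of the empirical CTE and a nonparametric bootstrap confidence interval, in the spirit of the tools developed in the thesis (the thesis derives the asymptotic theory; this sketch only resamples). The Pareto loss sample is an illustrative assumption.

```python
import numpy as np

def cte(x: np.ndarray, level: float = 0.95) -> float:
    """Empirical CTE: mean of the losses above the empirical quantile."""
    return x[x >= np.quantile(x, level)].mean()

rng = np.random.default_rng(5)
losses = rng.pareto(3.0, size=2000) + 1.0    # illustrative heavy-tailed losses

boot = np.array([
    cte(rng.choice(losses, size=losses.size, replace=True))
    for _ in range(2000)
])
lo, hi = np.quantile(boot, [0.025, 0.975])
print(f"CTE_0.95 = {cte(losses):.3f}, 95% bootstrap CI = [{lo:.3f}, {hi:.3f}]")
```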
388

Methods of optimizing investment portfolios

Seepi, Thoriso P.J. January 2013 (has links)
Magister Scientiae - MSc / In this thesis, we discuss methods for optimising the expected rate of return of a portfolio with minimal risk. As part of the work we look at Modern Portfolio Theory, which tries to maximise the portfolio's expected rate of return for a certain amount of risk. We also use quadratic programming to optimise portfolios. It is generally recognised that portfolios with a high expected return carry higher risk; Modern Portfolio Theory assists when choosing portfolios with the lowest possible risk. There is a finite number of assets in a portfolio and we therefore want to allocate them in such a way that we are able to optimise the expected rate of return with minimal risk. We also use the Markowitz approach to allocate these assets. The Capital Asset Pricing Model is also used, which helps us reduce our efficient portfolio to a single portfolio. Furthermore, we use the Black-Litterman model to try to optimise our portfolio with a view to understanding current market conditions, as well as considering how the market will perform in the future. An additional tool we use is Value at Risk, which enables us to manage market risk. To this end, we follow the three basic approaches from Jorion [Value at Risk. USA: McGraw-Hill, 2001]. The Value at Risk tool has become essential in calculating a portfolio's risk over the last decade. It works by monitoring algorithms in order to find the worst possible scenarios within the portfolio. We perform several numerical experiments in MATLAB and Microsoft Excel and these are presented in the thesis with the relevant descriptions.
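The Markowitz step has a closed form in the simplest case. The sketch below computes minimum-variance weights under the full-investment constraint only, w = S^{-1}1 / (1' S^{-1}1); the covariance matrix is an assumed example, and the thesis additionally imposes return targets via quadratic programming.

```python
import numpy as np

# Assumed annualized covariance matrix for three assets
sigma = np.array([[0.040, 0.006, 0.002],
                  [0.006, 0.090, 0.010],
                  [0.002, 0.010, 0.160]])

ones = np.ones(sigma.shape[0])
w = np.linalg.solve(sigma, ones)      # solve S w = 1 instead of inverting S
w /= w @ ones                         # normalize so the weights sum to one
print("min-variance weights:", np.round(w, 3))
print(f"portfolio variance: {w @ sigma @ w:.4f}")
```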
389

Řízení kurzového rizika výrobního podniku / Hedging of currency risk of manufacturing company

Fomina, Elena January 2017 (has links)
This thesis aims to create a currency-risk hedging strategy for an exporting company. The main reason for hedging is the possible losses that can be triggered by changes in the exchange rate. When exchange rates change, an exporting company may face three different types of exposure: transaction, translation and economic exposure. This thesis concentrates on transaction exposure and builds a hedging strategy for the exporting company AAA a.s. The firm is analyzed from a qualitative as well as a quantitative side, presented in the form of a historical overview of the company and its position within its international group. Based on this analysis, as well as on the theoretical findings, a hedging strategy for AAA a.s. is proposed. The strategy uses both external and internal means of hedging.
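The mechanics of hedging transaction exposure with a forward contract can be shown in a few lines. The amounts and rates below are illustrative assumptions, not figures from the thesis or from AAA a.s.

```python
receivable_eur = 1_000_000          # invoice due in six months
forward_czk_per_eur = 24.80         # rate locked in with the bank today

for spot_at_maturity in (23.50, 24.80, 26.00):
    unhedged = receivable_eur * spot_at_maturity
    hedged = receivable_eur * forward_czk_per_eur
    print(f"spot {spot_at_maturity:.2f}: unhedged {unhedged:,.0f} CZK, "
          f"hedged {hedged:,.0f} CZK")
```

The forward removes both the downside and the upside, which is one reason such external instruments are typically combined with internal means of hedging (for example netting or the choice of invoicing currency).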
390

Využití teorie extrémních hodnot při řízení operačních rizik / Extreme Value Theory in Operational Risk Management

Vojtěch, Jan January 2009 (has links)
Currently, financial institutions are required to analyze and quantify a new type of banking risk, known as operational risk, to which they are exposed in their everyday activities. The main objective of this work is to construct an acceptable statistical model for computing the capital requirement. Such a model must respect the specificity of losses arising from operational risk events. The fundamental task is the search for a suitable distribution describing the probabilistic behavior of losses arising from this type of risk. The work makes strong use of the Pickands-Balkema-de Haan theorem from extreme value theory: roughly speaking, the distribution of a random variable exceeding a given high threshold converges in distribution to the generalized Pareto distribution. The theorem is subsequently used in estimating a high percentile from a simulated distribution. The simulated distribution is a compound model for the aggregate loss random variable, constructed as a combination of a frequency distribution for the number of losses and a so-called severity distribution for the individual loss. The proposed model is then used to estimate a final quantile, which represents the sought amount of capital requirement, that is, the amount of funds the bank is supposed to retain in order to make up for a projected lack of funds; there is a given, commonly quite small, probability that the capital charge will be exceeded. Although combining a frequency distribution with a severity distribution is the common way to deal with this problem, the final application is often problematic. In practice the severity distribution is often a combination of two or three distributions, for instance lognormal distributions with different location and scale parameters; models like these usually have no theoretical background and, in particular, the connecting of the distribution functions is often not conducted properly. In this work, we deal with both problems. In addition, maximum likelihood estimates are derived for a lognormal distribution satisfying F_LN(u) = p, where u and p are given. The results achieved can be used in the everyday practice of financial institutions for operational risk quantification, and also for the analysis of a variety of sample data with so-called heavy tails, where standard distributions do not offer any help. As an integral part of this work, a CD with the source code of each function used in the model is included. All of these functions were created in the S statistical programming language, in the S-PLUS software. The fourth annex contains a complete description of each function, its purpose, and the general syntax for possible use in solving different kinds of problems.
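A minimal sketch of the peaks-over-threshold step the work builds on: losses above a high threshold are fitted with a generalized Pareto distribution and the standard POT quantile formula yields the high percentile. The data, the threshold choice, and the target level are illustrative assumptions (the original implementation was in S-PLUS; this sketch uses Python).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
losses = rng.lognormal(mean=9.0, sigma=1.5, size=20_000)   # stand-in loss data

u = np.quantile(losses, 0.95)                 # high threshold (assumed choice)
excesses = losses[losses > u] - u
xi, _, beta = stats.genpareto.fit(excesses, floc=0)        # GPD shape and scale

p = 0.999                                     # target confidence level
n, n_u = losses.size, excesses.size
var_p = u + beta / xi * ((n / n_u * (1 - p)) ** (-xi) - 1)  # assumes xi != 0
print(f"xi = {xi:.3f}, beta = {beta:,.0f}, estimated {p:.1%} quantile = {var_p:,.0f}")
```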
