141

Modelling of Capital Requirements using LSTM and A-SA in CRR 3 / Modellering av kapitalkrav med hjälp av LSTM och A-SA i regelverket CRR 3

Gan, William January 2022 (has links)
In response to the Great Financial Crisis of 2008, a number of measures were taken to increase resilience against a similar disaster in the future. Global financial regulatory bodies implemented several new directives with the intention of strengthening global capital markets, leading to regulatory frameworks in which financial participants (FPs) are subject to own funds requirements for market risk. This thesis investigates two methods presented in the framework Capital Requirements Regulation 3 (CRR 3), a framework stemming from the Basel Committee and implemented in EU legislation, for determining an FP's capital requirement. The first method, the Alternative Standardised Approach (A-SA), uses categorical data, whereas the second method, the Alternative Internal Model Approach (A-IMA), uses the risk measure Expected Shortfall (ES) to determine the capital requirement and therefore requires the FP to estimate ES with a proprietary/internal model based on time series data. The proprietary model in this thesis uses a recurrent neural network (RNN) with several long short-term memory (LSTM) layers to predict the next day's ES from the previous 20 days' returns. The data consisted of categorical and time series data for a portfolio with the Nasdaq 100 companies as positions. The thesis concludes that A-IMA, with an LSTM network as the proprietary model, gives a lower capital requirement than A-SA, but is less reliable in real-life applications due to its "black box" behaviour and is thus less compliant from a regulatory standpoint. The LSTM model showed promising results in capturing the overall trend in the data, for example during periods of high volatility, but underestimated the true ES. / Efter finanskrisen 2008 vidtogs flera åtgärder av världens största finansiella myndigheter, som ett svar på det tidigare icke-transparenta klimatet inom finanssektorn, med intentionen att förstärka de globala kapitalmarknaderna. Detta innebar att nya och strängare regelverk etablerades med direktiv såsom hårdare kapitalkrav. Detta examensarbete är en empirisk undersökning och jämförelse av två metoder i regelverket "Capital Requirements Regulation 3" (CRR 3) som kan användas för att beräkna en finansiell institutions kapitalkrav. Den första metoden, den så kallade "alternativa schablonmetoden" (A-SA), använder kategorisk data för att beräkna kapitalkravet, medan den andra metoden, "den alternativa internmodellen" (A-IMA), kräver att man först beräknar riskmåttet "Expected Shortfall" (ES) med hjälp av en internmodell baserad på tidsseriedata, innan kapitalkravet kan beräknas. CRR 3 innehåller tydliga riktlinjer för hur en sådan internmodell ska utformas, och i detta projekt testas en modell baserad på "återkommande neurala nätverk" (RNN) med den specifika arkitekturen "Long Short-Term Memory" (LSTM) för att estimera ES. De slutsatser som kan dras är att A-IMA med en LSTM-modell ger ett mindre kapitalkrav än A-SA. Däremot är A-IMA mindre tillförlitlig inom riskapplikationer på grund av risken att neurala nätverk kan bete sig som svarta lådor, vilket gör modellen mindre kompatibel ur ett regelverksperspektiv. LSTM-modellen visade sig kunna fånga den generella trenden i portföljdatan (exempelvis perioder med hög volatilitet) men gav konservativa prediktioner i jämförelse med testdatan.
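
The abstract describes an LSTM network that maps the previous 20 days of portfolio returns to a next-day ES estimate. The sketch below illustrates such a model; the layer widths, loss function, and data preparation are illustrative assumptions, not the thesis's actual architecture.

```python
import numpy as np
import tensorflow as tf

WINDOW = 20  # look-back window of daily returns, as described in the abstract

def build_es_model(window: int = WINDOW) -> tf.keras.Model:
    """Stacked-LSTM network mapping a window of returns to a next-day ES estimate."""
    model = tf.keras.Sequential([
        tf.keras.layers.LSTM(32, return_sequences=True, input_shape=(window, 1)),
        tf.keras.layers.LSTM(16),      # several LSTM layers, as in the abstract
        tf.keras.layers.Dense(1),      # predicted next-day ES
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

def make_windows(returns: np.ndarray, es_targets: np.ndarray, window: int = WINDOW):
    """Slice a return series into (samples, window, 1) inputs aligned with ES targets."""
    X = np.stack([returns[i:i + window] for i in range(len(returns) - window)])
    return X[..., None], es_targets[window:]
```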
142

Sustainability scores for portfolio performance / Hållbarhetsbetyg för portföljintegration

Stern, Felix January 2020 (has links)
In this thesis, the traditional method of using only ESG scores to screen stocks for sustainable portfolios is broadened. The selection of securities instead depends on the aggregation, weighting and normalization of a wider set of sustainability variables, creating more all-encompassing sustainability scores. These scores are then implemented in index-tracking portfolios. The portfolios combine a hybrid approach between active and passive investment, with the aim of creating sustainable enhanced index funds that can beat the index without adding significant risk. This also allows a comparison of how different combinations and levels of sustainability affect returns, risk and index tracking. The results show that, within the scope of the thesis, it is possible to create a sustainability score which both increases the average sustainability of the portfolios and yields risk-adjusted returns. We also studied how a net increase in sustainability scores relative to a control portfolio results in higher active returns, with a small drop-off in information ratio once too strong a sustainability constraint is applied to the portfolios. The combination of sustainability variables that showed the highest risk-adjusted returns was created using equal parts z-scored ESG ratings, ESG risk ratings and ESG momentum. / Detta examensarbete breddar de traditionella metoderna för att skapa hållbara portföljer. Genom att basera urvalet av aktier på aggregering, viktning och normalisering av ett större set av hållbarhetsvariabler, jämfört med traditionell screening baserad på endast ESG-betyg, skapas mer omfattande hållbarhetsbetyg. Syftet med studien är att implementera dessa hållbarhetsbetyg vid skapandet av indexportföljer och analysera resultaten. Dessa portföljer kombinerar både aktiva och passiva investeringsprinciper, med målet att skapa hållbara indexnära fonder som kan prestera bättre än indexet utan signifikant höjd risk. Dessa hållbarhetsbetyg tillåter även jämförelse av hur olika kombinationer och nivåer av hållbarhet påverkar avkastning, risk och närhet till index. Resultaten visar tydligt att det, inom uppsatsens avgränsningar, är möjligt att skapa hållbarhetsbetyg som både ökar hållbarheten av portföljer i snitt och skapar riskjusterad avkastning. Det visar även hur en relativ höjning av hållbarhetsbetygen resulterar i högre aktiv avkastning jämfört med en kontrollportfölj. Vid en viss nivå av höjning sker dock en avtappning av den riskjusterade avkastningen. Den kombination av hållbarhetsvariabler som visar högst riskjusterad avkastning när de aggregeras till ett hållbarhetsbetyg är en kombination, i lika delar, av ESG-betyg, ESG-risk och ESG-momentum.
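
The best-performing score above combines equal parts z-scored ESG ratings, ESG risk ratings and ESG momentum. A minimal sketch of that aggregation follows; the column names and the sign convention for ESG risk are assumptions.

```python
import pandas as pd

COLUMNS = ["esg_rating", "esg_risk", "esg_momentum"]  # assumed column names

def sustainability_score(df: pd.DataFrame) -> pd.Series:
    """df: one row per stock with the three sustainability variables as columns."""
    z = (df[COLUMNS] - df[COLUMNS].mean()) / df[COLUMNS].std(ddof=0)  # cross-sectional z-scores
    z["esg_risk"] = -z["esg_risk"]   # lower ESG risk is better, so flip its sign (assumption)
    return z.mean(axis=1)            # equal-weighted average of the three z-scores
```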
143

Evaluating volatility forecasts, A study in the performance of volatility forecasting methods / Utvärdering av volatilitetsprognoser, En undersökning av kvaliteten av metoder för volatilitetsprognostisering

Verhage, Billy January 2023 (has links)
In this thesis, the foundations of evaluating the performance of volatility forecasting methods are explored, and a mathematical framework is created to determine overall forecasting performance based on observed daily returns across multiple financial instruments. Multiple volatility responses are investigated, and theoretical corrections are derived under the assumption that the log returns follow a normal distribution. Performance measures that are independent of the long-term volatility profile are explored and tested. Well-established volatility forecasting methods, such as moving average and GARCH(p,q) models, are implemented and validated on multiple volatility responses. The results reveal no significant difference in performance between the moving average and GARCH(1,1) volatility forecasts. However, the observed non-zero bias and a separate analysis of the distribution of the log returns reveal that the theoretically derived corrections are insufficient when the log returns are not normally distributed. Furthermore, absolute performance is observed to depend strongly on the chosen evaluation period, suggesting that comparisons between periods should not be made. This study is limited by the fact that the bootstrapped confidence regions are ill-suited for determining significant performance differences between forecasting methods. In future work, statistical significance can be gained by bootstrapping the difference in performance measures. Furthermore, a more in-depth analysis is needed to determine more appropriate theoretical corrections for the volatility responses based on the observed distribution of the log returns. This will increase the overall forecasting performance and improve the overall quality of the evaluation framework. / I detta arbete utforskas grunderna för utvärdering av prestandan hos volatilitetsprognoser och ett matematiskt ramverk skapas för att bestämma den övergripande prestandan baserat på observerade dagliga avkastningar för flera finansiella instrument. Ett antal volatilitetsskattningar undersöks och teoretiska korrigeringar härleds under antagandet att log-avkastningen följer en normalfördelning. Prestationsmått som är oberoende av den långsiktiga volatilitetsprofilen utforskas och testas. Väletablerade metoder för volatilitetsprognostisering, såsom glidande medelvärden och GARCH-modeller, implementeras och utvärderas mot flera volatilitetsskattningar. De erhållna resultaten visar att det inte finns någon signifikant skillnad i prestation mellan prognoser producerade av det glidande medelvärdet och GARCH(1,1). Den observerade icke-noll-biasen och en separat analys av fördelningen av log-avkastningen visar dock att de teoretiskt härledda korrigeringarna är otillräckliga för att fullständigt korrigera volatilitetsskattningarna under icke-normalfördelade log-avkastningar. Dessutom observeras att prestandan i hög grad beror på den använda utvärderingsperioden, vilket tyder på att jämförelser mellan perioder inte bör göras. Denna studie är begränsad av det faktum att de använda bootstrappade konfidensregionerna inte är lämpade för att fastställa signifikanta skillnader i prestanda mellan prognosmetoder. I framtida arbeten behövs fortsatt analys för att bestämma mer lämpliga teoretiska korrigeringar för volatilitetsskattningarna baserat på den observerade fördelningen av log-avkastningen. Detta kommer att öka den övergripande prestandan och förbättra den övergripande kvaliteten på prognoserna.
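
The two forecast families compared above can be sketched as follows; the rolling-window length, the return scaling, and the use of the `arch` package for GARCH(1,1) are assumptions rather than the thesis's exact setup.

```python
import numpy as np
import pandas as pd
from arch import arch_model

def moving_average_vol(returns: pd.Series, window: int = 60) -> pd.Series:
    """Rolling sample standard deviation, shifted so each value is a next-day forecast."""
    return returns.rolling(window).std().shift(1)

def garch_one_day_vol(returns: pd.Series) -> float:
    """One-day-ahead GARCH(1,1) volatility forecast estimated on the full sample."""
    fitted = arch_model(100 * returns.dropna(), vol="Garch", p=1, q=1).fit(disp="off")
    variance = fitted.forecast(horizon=1).variance.iloc[-1, 0]
    return float(np.sqrt(variance) / 100)  # undo the percentage scaling
```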
144

Modélisation du carnet d'ordres limites et prévision de séries temporelles

Simard, Clarence 10 1900 (has links)
Le contenu de cette thèse est divisé de la façon suivante. Après un premier chapitre d’introduction, le Chapitre 2 est consacré à introduire aussi simplement que possible certaines des théories qui seront utilisées dans les deux premiers articles. Dans un premier temps, nous discuterons des points importants pour la construction de l’intégrale stochastique par rapport aux semimartingales avec paramètre spatial. Ensuite, nous décrirons les principaux résultats de la théorie de l’évaluation en monde neutre au risque et, finalement, nous donnerons une brève description d’une méthode d’optimisation connue sous le nom de dualité. Les Chapitres 3 et 4 traitent de la modélisation de l’illiquidité et font l’objet de deux articles. Le premier propose un modèle en temps continu pour la structure et le comportement du carnet d’ordres limites. Le comportement du portefeuille d’un investisseur utilisant des ordres de marché est déduit et des conditions permettant d’éliminer les possibilités d’arbitrages sont données. Grâce à la formule d’Itô généralisée il est aussi possible d’écrire la valeur du portefeuille comme une équation différentielle stochastique. Un exemple complet de modèle de marché est présenté de même qu’une méthode de calibrage. Dans le deuxième article, écrit en collaboration avec Bruno Rémillard, nous proposons un modèle similaire mais cette fois-ci en temps discret. La question de tarification des produits dérivés est étudiée et des solutions pour le prix des options européennes de vente et d’achat sont données sous forme explicite. Des conditions spécifiques à ce modèle qui permettent d’éliminer l’arbitrage sont aussi données. Grâce à la méthode duale, nous montrons qu’il est aussi possible d’écrire le prix des options européennes comme un problème d’optimisation d’une espérance sur un ensemble de mesures de probabilité. Le Chapitre 5 contient le troisième article de la thèse et porte sur un sujet différent. Dans cet article, aussi écrit en collaboration avec Bruno Rémillard, nous proposons une méthode de prévision des séries temporelles basée sur les copules multivariées. Afin de mieux comprendre le gain en performance que donne cette méthode, nous étudions à l’aide d’expériences numériques l’effet de la force et la structure de dépendance sur les prévisions. Puisque les copules permettent d’isoler la structure de dépendance et les distributions marginales, nous étudions l’impact de différentes distributions marginales sur la performance des prévisions. Finalement, nous étudions aussi l’effet des erreurs d’estimation sur la performance des prévisions. Dans tous les cas, nous comparons la performance des prévisions en utilisant des prévisions provenant d’une série bivariée et d’une série univariée, ce qui permet d’illustrer l’avantage de cette méthode. Dans un intérêt plus pratique, nous présentons une application complète sur des données financières. / This thesis is structured as follows. After an introductory first chapter, Chapter 2 presents as simply as possible the different notions that are used in the first two papers. First, we discuss the main steps required to build stochastic integrals with respect to semimartingales with a spatial parameter. Secondly, we describe the main results of risk-neutral valuation theory and, finally, we give a short description of an optimization method known as duality. Chapters 3 and 4 consider the problem of modelling illiquidity, which is covered by two papers. The first one proposes a continuous-time model for the structure and the dynamics of the limit order book. The dynamics of the portfolio of an investor using market orders is deduced, and conditions to rule out arbitrage are given. With the help of Itô’s generalized formula, it is also possible to write the value of the portfolio as a stochastic differential equation. A complete example of a market model, along with a calibration method, is also given. In the second paper, written in collaboration with Bruno Rémillard, we propose a similar model with discrete-time trading. We study the problem of derivatives pricing and give explicit formulas for European put and call option prices. Specific conditions to rule out arbitrage are also provided. Using the dual optimization method, we show that the price of European options can be written as the optimization of an expectation over a set of probability measures. Chapter 5 contains the third paper and studies a different topic. In this paper, also written with Bruno Rémillard, we propose a forecasting method for time series based on multivariate copulas. To provide a better understanding of the proposed method, we use numerical experiments to study the effect of the strength and structure of the dependence on prediction performance. Since copulas make it possible to isolate the dependence structure from the marginal distributions, we study the impact of different marginal distributions on prediction performance. Finally, we also study the effect of estimation errors on the predictions. In all cases, we compare predictions based on a bivariate series with predictions based on a univariate series, which illustrates the advantage of the proposed method. For practical purposes, we provide a complete example of an application to financial data.
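
The copula-based forecasting idea in Chapter 5 can be illustrated with a deliberately simplified one-step sketch: model the dependence between consecutive observations with a Gaussian copula on empirical margins and forecast the next value by its conditional median. The thesis works with general multivariate copulas; the choice of a lag-1 Gaussian copula here is purely illustrative.

```python
import numpy as np
from scipy.stats import norm, rankdata

def gaussian_copula_forecast(x: np.ndarray) -> float:
    """Forecast x[t+1] from x[t] via a lag-1 Gaussian copula with empirical margins."""
    pairs = np.column_stack([x[:-1], x[1:]])
    u = rankdata(pairs, axis=0) / (len(pairs) + 1)      # pseudo-observations in (0, 1)
    z = norm.ppf(u)                                      # normal scores
    rho = np.corrcoef(z[:, 0], z[:, 1])[0, 1]            # copula correlation estimate
    # Normal score of the last observed value under the empirical margin.
    u_last = np.searchsorted(np.sort(x[:-1]), x[-1], side="right") / (len(x) - 1 + 1)
    z_cond = rho * norm.ppf(np.clip(u_last, 1e-6, 1 - 1e-6))
    # Conditional median on the uniform scale, mapped back through the empirical quantile.
    return float(np.quantile(x[1:], norm.cdf(z_cond)))
```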
145

Studies of robustness in stochastic analysis and mathematical finance

Perkowski, Nicolas Simon 07 February 2014 (has links)
Diese Dissertation behandelt Fragen aus der stochastischen Analysis und der Finanzmathematik, die sich unter dem Begriff der Robustheit zusammenfassen lassen. Zunächst betrachten wir finanzmathematische Modelle mit Arbitragemöglichkeiten. Wir identifizieren die Abwesenheit von Arbitragemöglichkeiten der ersten Art (NA1) als minimale Eigenschaft, die in jedem finanzmathematischen Modell gelten muss, und zeigen, dass (NA1) äquivalent zur Existenz eines dominierenden lokalen Martingalmaßes ist. Als Beispiel für Prozesse, die (NA1) erfüllen, studieren wir stetige lokale Martingale, die darauf bedingt werden nie Null zu treffen. Anschließend verwenden wir eine modellfreie Version der (NA1) Eigenschaft, die es erlaubt, qualitative Eigenschaften von “typischen Preistrajektorien” zu beschreiben. Hier konstruieren wir ein pfadweises Itô-Integral. Dies deutet an, dass sich typische Preispfade als rough-path-Integratoren verwenden lassen. Nun entwickeln wir mittels Fourierentwicklungen einen alternativen Zugang zur rough-path-Theorie. Wir zerlegen das Integral in drei Operatoren mit verschiedenen Eigenschaften. So wird offensichtlich, dass Integratoren mit der Regularität der Brownschen Bewegung mit ihrer Lévy-Fläche versehen werden müssen, um ein pfadweise stetiges Integral zu erhalten. Daraufhin bemerken wir, dass die Integration zweier Funktionen gegeneinander äquivalent dazu ist, eine Funktion mit der Ableitung einer anderen (im Allgemeinen eine Distribution) zu multiplizieren. In höheren Dimensionen ist das Multiplikationsproblem jedoch allgemeiner. Wir verwenden Littlewood-Paley-Theorie, um unseren Fourier-Zugang zur rough-path-Theorie auf Funktionen mehrdimensionaler Variablen zu erweitern. Wir konstruieren einen Operator, der für Funktionen mit dem punktweisen Produkt übereinstimmt und in einer geeigneten Topologie stetig ist. Nun lassen sich stochastische partielle Differentialgleichungen lösen, die bisher aufgrund von Nichtlinearitäten nicht zugänglich waren. / This thesis deals with various problems from stochastic analysis and from mathematical finance that can best be summarized under the common theme of robustness. We begin by studying financial market models with arbitrage opportunities. We identify the weak notion of absence of arbitrage opportunities of the first kind (NA1) as the minimal property that every sensible asset price model should satisfy, and we prove that (NA1) is equivalent to the existence of a dominating local martingale measure. As examples of processes that satisfy (NA1) but do not admit equivalent local martingale measures, we study continuous local martingales conditioned not to hit zero. We continue by working with a model-free formulation of the (NA1) property, which makes it possible to describe qualitative properties of “typical asset price trajectories”. We construct a pathwise Itô integral for typical price paths. Our results indicate that typical price paths can be used as integrators in the theory of rough paths. Next, we use a Fourier series expansion to develop an alternative approach to rough path integration. We decompose the integral into three components with different behavior. It is then easy to see that integrators with the regularity of Brownian motion must be equipped with their Lévy area to obtain a pathwise continuous integral operator. We then note that integrating two functions against each other is equivalent to multiplying one with the derivative of the other, which will in general only be a distribution.
In higher index dimensions however, the multiplication problem is more general. We use Littlewood-Paley theory to extend our Fourier approach from rough path integrals to multiplying functions of a multidimensional index. We construct an operator which agrees with the usual product for smooth functions, and which is continuous in a suitable topology. We apply this to solve stochastic partial differential equations that were previously difficult to access due to nonlinearities.
146

Revision Moment for the Retail Decision-Making System

Juszczuk, Agnieszka Beata, Tkacheva, Evgeniya January 2010 (has links)
In this work we address problems arising in loan origination decision-making systems. In accordance with the basic principles of the loan origination process, we consider the main rules for estimating a client's parameters, a change-point problem for given data, and a disorder-moment detection problem for real-time observations. In the first part of the work, the main principles of parameter estimation are given. The change-point problem is also considered for a given sample, in discrete and continuous time, using the maximum likelihood method. In the second part of the work, the problem of detecting the disorder moment from real-time observations is treated as a disorder problem for a non-homogeneous Poisson process. The corresponding optimal stopping problem is reduced to a free-boundary problem with a complete analytical solution for the case when the intensity of defaults increases. Thereafter, a scheme for real-time detection of the disorder moment is given.
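
The maximum-likelihood change-point idea for a given sample can be illustrated schematically as below: choose the split point that maximizes the combined Poisson log-likelihood with separate intensities before and after it. This is only a sketch of the general principle, not the paper's non-homogeneous Poisson disorder model.

```python
import numpy as np
from scipy.stats import poisson

def ml_change_point(counts: np.ndarray) -> int:
    """Return the index at which an intensity change in a count series is most likely."""
    n = len(counts)
    best_k, best_ll = 1, -np.inf
    for k in range(1, n):
        lam1, lam2 = counts[:k].mean(), counts[k:].mean()  # MLEs before/after the split
        ll = (poisson.logpmf(counts[:k], max(lam1, 1e-9)).sum()
              + poisson.logpmf(counts[k:], max(lam2, 1e-9)).sum())
        if ll > best_ll:
            best_k, best_ll = k, ll
    return best_k
```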
147

Pricing of exotic options under the Kou model by using the Laplace transform

Dzharayan, Gayk, Voronova, Elena January 2011 (has links)
In this thesis we present the Laplace transform method of option pricing and its implementation, and compare it with other methods. We consider vanilla and exotic options, but pay particular attention to two-asset correlation options. As the model of the underlying stock price dynamics we chose one of the modifications of the Black-Scholes model: the Kou jump-diffusion model, in which jump sizes follow a double exponential distribution. The computations were carried out via the Laplace transform and its inversion by the Euler method. We present in detail the derivation of the Laplace transforms of two-asset correlation put and call options, the calculation of the moment generating function of the jump-diffusion via the Lévy-Khintchine formula in the cases without jumps and with independent jumps, and the direct calculation of the risk-neutral expectation by solving a double integral. Our work also contains program code for two-asset correlation call and put options, and we demonstrate the program on real data. As a result, we see how the model performs on the NASDAQ OMX Stockholm market, considering two-asset correlation options in three cases based on the stock prices of Handelsbanken and Ericsson and the index OMXS30.
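
The Lévy-Khintchine step mentioned above rests on the cumulant (log moment generating) function of the Kou double exponential jump-diffusion, so that E[exp(θ·X_t)] = exp(t·G(θ)). A small sketch of that function is given below using the usual Kou (2002) notation; it is not the thesis's pricing code, and the parameter values are placeholders.

```python
def kou_cumulant(theta: float, mu: float, sigma: float,
                 lam: float, p: float, eta1: float, eta2: float) -> float:
    """Cumulant function G(theta) of the Kou model, valid for -eta2 < theta < eta1."""
    if not (-eta2 < theta < eta1):
        raise ValueError("theta must lie in (-eta2, eta1) for the MGF to exist")
    q = 1.0 - p  # probability of a downward jump
    jump_part = lam * (p * eta1 / (eta1 - theta) + q * eta2 / (eta2 + theta) - 1.0)
    return mu * theta + 0.5 * sigma**2 * theta**2 + jump_part

# Example with placeholder parameters: G(1) for a lightly jumpy asset.
print(kou_cumulant(1.0, mu=0.05, sigma=0.2, lam=1.0, p=0.4, eta1=10.0, eta2=5.0))
```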
148

Detection of the Change Point and Optimal Stopping Time by Using Control Charts on Energy Derivatives

AL, Cihan, Koroglu, Kubra January 2011 (has links)
No description available.
149

Análise das dificuldades enfrentadas por alunos do ensino médio em interpretar e resolver problemas de matemática financeira

Fonseca, Simone de Jesus da 31 May 2016 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES / This research had the purpose to identify students' difficulties in the financial mathematics problem solving as well, analyze the mistakes made by them. The study involved 39 students of the 3rd year in high school of a state school at the Sergipano High Hinterland. The data collection included the application of two questionnaires with four questions each, both involving the same Financial Mathematics subjects. For the analysis of the first questionnaire was used Error Analysis (Cury 1994) and Movshovitz-Hadar, Zaslavsky and Inbar Model (1986, 1987) quoted by Cury (2007), in order to meet and categorize the types of errors made by the students in the resolution of issues. In the second questionnaire, consisted of problems, we used the qualitative analysis and the phases considered by Polya (1995) to solve problems. From this perspective, we tried to identify the stage that presents itself as the most difficulties of students in problem solving. In the analysis of the first questionnaire, we detected that the biggest difficulty faced is related to technical errors, errors involving calculations and algebraic manipulations. This showed us how the deficit in the operations reflects in the learning of other mathematical content. The second questionnaire proved our mistrust that the greatest difficulty faced by students in problem solving is in the interpretation of the statements. / Esta pesquisa teve o propósito de identificar as dificuldades dos alunos na resolução de problemas em Matemática Financeira bem como, analisar os erros cometidos por eles. O estudo envolveu 39 estudantes do 3º ano do Ensino Médio de um colégio estadual do Alto Sertão Sergipano. A coleta de dados contou com a aplicação de dois questionários com quatro questões cada um, ambos envolvendo os mesmos assuntos de Matemática Financeira. Para a análise do primeiro questionário foi utilizada a Análise de Erros (Cury 1994) e o Modelo de Movshovitz-Hadar, Zaslavsky e Inbar (1986, 1987) citado por Cury (2007), com a finalidade de conhecer e categorizar os tipos de erros cometidos pelos alunos na resolução das questões. Já no segundo questionário, composto por problemas, foi utilizada a análise qualitativa de conteúdo e as fases consideradas por Polya (1995) para a resolução de problemas. Nessa ótica, procuramos identificar a fase que se apresenta como a maior dificuldade dos alunos na resolução de problemas. Na análise do primeiro questionário detectamos que a maior dificuldade enfrentada está relacionada a erros técnicos, que envolvem erros de cálculos e manipulações algébricas. Isso nos mostrou como o déficit nas operações reflete na aprendizagem dos demais conteúdos matemáticos. O segundo questionário comprovou nossa suspeita de que a maior dificuldade enfrentada pelos discentes na resolução de problemas está na interpretação dos enunciados.
150

Carbon Intensity Estimation of Publicly Traded Companies / Uppskattning av koldioxidintensitet hos börsnoterade bolag

Ribberheim, Olle January 2021 (has links)
The purpose of this master's thesis is to develop a model to estimate the carbon intensity, i.e. the carbon emissions relative to economic activity, of publicly traded companies that do not report their carbon emissions. Using statistical and machine learning models, the core of the thesis is to develop and compare different methods and models with regard to accuracy, robustness, and explanatory value when estimating carbon intensity. Both discrete variables, such as the region and sector the company operates in, and continuous variables, such as revenue and capital expenditures, are used in the estimation. Six methods were compared: two statistically derived and four machine learning methods. The thesis consists of three parts: data preparation, model implementation, and model comparison. The comparison indicates that the boosted decision tree is both the most accurate and the most robust model. Lastly, the strengths and weaknesses of the methodology are discussed, as well as the suitability and legitimacy of the boosted decision tree for estimating carbon intensity. / Syftet med denna masteruppsats är att utveckla en modell som uppskattar koldioxidintensiteten, det vill säga koldioxidutsläppen i förhållande till ekonomisk aktivitet, hos publika bolag som inte rapporterar sina koldioxidutsläpp. Med hjälp av statistiska modeller och maskininlärningsmodeller är stommen i uppsatsen att utveckla och jämföra olika metoder och modeller utifrån träffsäkerhet, robusthet och förklaringsvärde vid uppskattning av koldioxidintensitet. Både diskreta och kontinuerliga variabler används vid uppskattningen, till exempel region och sektor som företaget är verksamt i, samt omsättning och kapitalinvesteringar. Sex metoder jämfördes, två statistiskt härledda och fyra maskininlärningsmetoder. Arbetet består av tre delar: förberedelse av data, modellutveckling och modelljämförelse, där jämförelsen indikerar att boosted decision tree är den modell som är både mest träffsäker och mest robust. Slutligen diskuteras styrkor och svagheter med metodiken, samt lämpligheten och tillförlitligheten i att använda ett boosted decision tree för att uppskatta koldioxidintensitet.
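
The boosted-decision-tree approach described above can be sketched with standard tooling on mixed categorical and continuous features; the feature names below are assumptions, and the model settings are library defaults rather than the thesis's tuned configuration.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

CATEGORICAL = ["region", "sector"]   # assumed discrete features
CONTINUOUS = ["revenue", "capex"]    # assumed continuous features

def build_pipeline() -> Pipeline:
    preprocess = ColumnTransformer([
        ("cat", OneHotEncoder(handle_unknown="ignore"), CATEGORICAL),
        ("num", "passthrough", CONTINUOUS),
    ])
    return Pipeline([
        ("prep", preprocess),
        ("model", HistGradientBoostingRegressor()),  # boosted decision trees
    ])

# Usage sketch: pipeline = build_pipeline()
#               pipeline.fit(df[CATEGORICAL + CONTINUOUS], df["carbon_intensity"])
```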
