  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
181

Liquidity and capital market imperfections /

Kessler, Stephan Markus. January 2006 (has links) (PDF)
Univ., Diss.--St. Gallen, 2006.
182

Predicting and hedging credit portfolio risk with macroeconomic factors /

Bär, Tobias. January 2002 (has links)
Frankfurt (Main), University, Thesis (doctoral), 2001.
183

Aspects of Modeling Fraud Prevention of Online Financial Services

Dan, Gorton January 2015 (has links)
Banking and online financial services are part of our critical infrastructure. As such, they comprise an Achilles heel in society and need to be protected accordingly. The last ten years have seen a steady shift from traditional show-off hacking towards cybercrime with great economic consequences for society. The different threats against online services are getting worse, and risk management with respect to denial-of-service attacks, phishing, and banking Trojans is now part of the agenda of most financial institutions. This trend is overseen by responsible authorities who step up their minimum requirements for risk management of financial services and, among other things, require regular risk assessment of current and emerging threats. For the financial institution, this situation creates a need to understand all parts of the incident response process of the online services, including the technology, sub-processes, and the resources working with online fraud prevention. The effectiveness of each countermeasure has traditionally been measured for one technology at a time, leaving the fraud prevention manager with separate values for the effectiveness of, for example, authentication, intrusion detection, and fraud prevention. In this thesis, we address two problems with this situation. Firstly, there is a need for a tool which is able to model current countermeasures in light of emerging threats. Secondly, the development process of fraud detection is hampered by the lack of accessible data. In the main part of this thesis, we highlight the importance of looking at the “big risk picture” of the incident response process, rather than focusing on one technology at a time. In the first article, we present a tool which makes it possible to measure the effectiveness of the incident response process. We call this an incident response tree (IRT). In the second article, we present additional scenarios relevant for risk management of online financial services using IRTs. Furthermore, we introduce a complementary model which is inspired by existing models used for measuring credit risk. This enables us to compare different online services using two measures, which we call Expected Fraud and Conditional Fraud Value at Risk. Finally, in the third article, we create a simulation tool which enables us to use scenario-specific results together with models like return on security investment to support decisions about future security investments. In the second part of the thesis, we develop a method for producing realistic-looking data for testing fraud detection. In the fourth article, we introduce multi-agent-based simulations together with social network analysis to create data which can be used to fine-tune fraud prevention, and in the fifth article, we continue this effort by adding a platform for testing fraud detection.
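The Expected Fraud measure described above can be illustrated with a minimal sketch: treat the incident response process as a tree whose leaves carry a path probability and a fraud loss. The branch probabilities, the loss figure, and the helper `expected_fraud` below are all hypothetical illustrations, not the thesis's actual IRT implementation.

```python
def expected_fraud(leaves):
    """Probability-weighted fraud loss over the leaves of an incident
    response tree; each leaf is (path probability, loss on that path)."""
    total_p = sum(p for p, _ in leaves)
    assert abs(total_p - 1.0) < 1e-9, "leaf probabilities must sum to 1"
    return sum(p * loss for p, loss in leaves)

# Hypothetical tree: authentication blocks 90% of attacks; fraud
# detection catches 80% of the remainder; the rest succeed.
leaves = [
    (0.90, 0),             # blocked at authentication
    (0.10 * 0.80, 0),      # caught by fraud detection
    (0.10 * 0.20, 10_000), # successful fraud, average loss 10 000
]
```

Building one such tree per threat scenario then allows the per-service comparisons the second article describes.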
184

Efektyviojo investicinio portfelio valdymas rizikos vertės metodu / An Effective Investment Portfolio Management Using Value-at-Risk

Lukšys, Kęstutis 07 June 2006 (has links)
One widely used risk measure, Value-at-Risk (VaR), is analyzed in this work. A complete definition of VaR is presented and three classical calculation methods are examined: parametric, historical simulation, and Monte Carlo generation. The main advantages and disadvantages of applying VaR are reviewed. The effect of correlation on the risk diversification of two assets is examined, and the Markowitz method for calculating the efficient frontier is presented. The analyzed methods were implemented in a program which calculates the first moments of the portfolio's returns and the correlations between assets and, for a given return, adjusts the weights so as to minimize the dispersion of the portfolio's returns. For every efficient portfolio, VaR can be calculated at any confidence level. The program was used to analyze three investment portfolios: one of generated data with a normal distribution, one of LITIN-10 index stocks, and one of OMX Vilnius index stocks. The efficient frontier and the VaR along the whole efficient frontier were calculated for these portfolios. We noticed a difference between the minimal-VaR and minimal-standard-deviation portfolios; consequently, three investment strategies were implemented. The best results for the analyzed portfolios were achieved with the minimized-VaR strategy.
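The three classical VaR methods the abstract examines can be sketched in a few lines of Python. This is a generic illustration (a normal model for the parametric and Monte Carlo variants), not the thesis's program; the function names are ours.

```python
import random
from statistics import NormalDist, mean, stdev

def parametric_var(returns, level=0.95):
    """Variance-covariance VaR under a fitted normal distribution."""
    mu, sigma = mean(returns), stdev(returns)
    return -(mu - NormalDist().inv_cdf(level) * sigma)

def historical_var(returns, level=0.95):
    """Historical-simulation VaR: the empirical loss quantile."""
    losses = sorted(-r for r in returns)
    return losses[min(int(level * len(losses)), len(losses) - 1)]

def monte_carlo_var(returns, level=0.95, n_sims=50_000, seed=1):
    """Monte Carlo VaR: resample from a normal fitted to the data."""
    rng = random.Random(seed)
    mu, sigma = mean(returns), stdev(returns)
    return historical_var([rng.gauss(mu, sigma) for _ in range(n_sims)], level)
```

On the same return series the three estimates generally differ, which is precisely why the thesis reviews their respective advantages and disadvantages.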
185

Dynamic Hedging: CVaR Minimization and Path-Wise Comparison

Smirnov, Ivan Unknown Date
No description available.
186

Extreme value modelling with application in finance and neonatal research

Zhao, Xin January 2010 (has links)
Modelling the tails of distributions is important in many fields, such as environmental science, hydrology, insurance, engineering and finance, where the risk of unusually large or small events is of interest. This thesis applies extreme value models in neonatal and finance studies and develops novel extreme value modelling for financial applications, to overcome issues associated with the dependence induced by volatility clustering and with threshold choice. The instability of preterm infants motivates interest in estimating the underlying variability of the physiology measurements typically taken on neonatal intensive care patients. The stochastic volatility model (SVM), fitted using Bayesian inference and a particle filter to capture the on-line latent volatility of oxygen concentration, is used in estimating the variability of medical measurements of preterm infants to highlight instabilities resulting from their under-developed biological systems. Alternative volatility estimators are considered to evaluate the performance of the SVM estimates; the results suggest that the stochastic volatility model provides a good estimator of the variability of the oxygen concentration data and may therefore be used to estimate the instantaneous latent volatility of the physiological measurements of preterm infants. The classical extreme value model, the generalized Pareto distribution (GPD), is then combined with the peaks-over-threshold (POT) method to ameliorate the impact of dependence in the extremes and to infer the extreme quantiles of the SVM-based variability estimates. Financial returns typically show clusters of observations in the tails, often termed “volatility clustering”, which creates challenges when applying extreme value models, since classical extreme value theory assumes independence of the underlying process.
Explicit modelling of the GARCH-type dependence behaviour of extremes is developed by implementing a GARCH conditional variance structure via the extreme value model parameters. Both simulation and empirical results show that the combined GEV-GARCH model is better suited to explaining the extreme quantiles. Another important benefit of the proposed model is that, as a one-stage model, it makes inference, and the accounting of all uncertainties, much easier than the traditional two-stage approach for capturing this dependence. To tackle the challenge of threshold choice in extreme value modelling and the generally asymmetric distribution of financial data, a two-tail GPD mixture model with Bayesian inference is proposed to capture both upper and lower tail behaviours simultaneously. The proposed two-tail GPD mixture modelling approach can estimate both thresholds, along with the other model parameters, and can therefore account for the uncertainty associated with the threshold choice in later inferences. The two-tail GPD mixture model provides a very flexible model for capturing all forms of tail behaviour, potentially allowing for asymmetry between the two tails, and is demonstrated to be more applicable in financial applications than the one-tail GPD mixture models previously proposed in the literature. A new Value-at-Risk (VaR) estimation method is then constructed by adopting the proposed mixture model in a two-stage method: volatility estimation using a latent volatility model (or realized volatility) is followed by the two-tail GPD mixture model applied to the independent innovations, overcoming the key issue of dependence and accounting for the uncertainty associated with threshold choice. The proposed method is applied to forecasting VaR for empirical return data during the current financial crisis period.
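Once a GPD has been fitted to the exceedances over a threshold u, the POT approach the thesis builds on yields a closed-form tail quantile. A sketch of the standard formula follows, with our own parameter names; the thesis's Bayesian fitting procedure is not reproduced here.

```python
def pot_var(u, sigma, xi, n, n_u, q):
    """Tail quantile (VaR) at level q from a GPD(sigma, xi) fitted to
    the n_u exceedances over threshold u in a sample of size n.
    Assumes shape parameter xi != 0."""
    return u + (sigma / xi) * (((n / n_u) * (1.0 - q)) ** (-xi) - 1.0)
```

For example, with u = 0.02, sigma = 0.01, xi = 0.2 and 50 exceedances in 1000 observations, the 99% quantile comes out at about 0.039, i.e. noticeably beyond the threshold itself.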
187

Extreme value theory and copula theory: a risk management application with energy futures.

Liu, Jia 06 April 2011 (has links)
Deregulation of the energy market and surging trading activities have made the energy markets even more volatile in recent years. Under such circumstances, it becomes increasingly important to assess the probability of rare and extreme price movements in the risk management of energy futures. Like other financial time series, energy futures exhibit time-varying volatility and fat tails. An appropriate risk measure for energy futures should be able to capture both of these features of the returns. In the first portion of this dissertation, we use the conditional Extreme Value Theory model to estimate Value-at-Risk (VaR) and Expected Shortfall (ES) for long and short trading positions in the energy markets. Statistical tests on the backtests show that this approach provides a significant improvement over the widely used Normal-distribution-based VaR and ES models. In the second portion of this dissertation, we extend our analysis from a single security to a portfolio of energy futures. In recent years, commodity futures have gained tremendous popularity as many investors believe they provide much-needed diversification to their portfolios. In order to properly account for any diversification benefits, we employ a time-varying conditional bivariate copula approach to model the dependence structure between energy futures. In contrast to previous studies on the same subject, we introduce fundamental supply and demand factors into the copula models. We find that energy futures are more likely to move together during down markets than up markets. In the third part of this dissertation, we extend our study of bivariate copula models to multivariate copula theory. We employ a pair-copula approach to estimate the VaR and ES of a portfolio consisting of energy futures, the S&P 500 index and the US Dollar index.
Our empirical results show that although the pair copula approach does not offer any added advantage in VaR and ES estimation over a long backtest horizon, it provides much more accurate estimates of risk during the period of high co-dependence among assets after the recent financial crisis.
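The copula idea underlying the dissertation can be illustrated with a static Gaussian copula sampler, a deliberately simplified stand-in for the time-varying conditional copulas actually fitted in the work; the function name and parameters are ours.

```python
import math
import random
from statistics import NormalDist

def gaussian_copula_sample(rho, n, seed=0):
    """Draw n dependent pairs (u, v) on [0,1]^2 from a static Gaussian
    copula with correlation rho: correlate two standard normals, then
    map each margin to a uniform via the normal CDF."""
    rng = random.Random(seed)
    nd = NormalDist()
    pairs = []
    for _ in range(n):
        z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        x2 = rho * z1 + math.sqrt(1.0 - rho * rho) * z2  # induce correlation
        pairs.append((nd.cdf(z1), nd.cdf(x2)))
    return pairs
```

Mapping u and v back through each asset's inverse marginal CDF produces dependent return scenarios, which is how copula draws feed a portfolio VaR or ES simulation.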
188

Which GARCH model is best for Value-at-Risk?

Berggren, Erik, Folkelid, Fredrik January 2015 (has links)
The purpose of this thesis is to identify the best volatility model for Value-at-Risk (VaR) estimation. We estimate 1 % and 5 % VaR figures for Nordic indices and stocks by using two symmetrical and two asymmetrical GARCH models under different error distributions. Out-of-sample volatility forecasts are produced using a 500-day rolling window estimation on data covering January 2007 to December 2014. The VaR estimates are thereafter evaluated through Kupiec’s test and Christoffersen’s test in order to find the best model. The results suggest that asymmetrical models perform better than symmetrical models, albeit the simple ARCH is often good enough for 1 % VaR estimates.
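Kupiec’s test used in the evaluation above can be sketched as a likelihood-ratio check of the observed violation count against the target rate; the statistic is compared with the chi-square(1) critical value, 3.841 at the 5 % level. A minimal implementation, with our own function name:

```python
import math

def kupiec_pof(n, x, p):
    """Kupiec proportion-of-failures LR statistic: n observations,
    x VaR violations, target violation rate p (e.g. 0.01 for 1% VaR)."""
    def loglik(prob):
        ll = 0.0
        if n - x > 0:
            ll += (n - x) * math.log(1.0 - prob)  # non-violation days
        if x > 0:
            ll += x * math.log(prob)              # violation days
        return ll
    # LR of the target rate p against the observed rate x/n
    return -2.0 * (loglik(p) - loglik(x / n))
```

For instance, 3 violations in 250 days of a 1 % VaR gives a statistic near 0.09, so the model is not rejected, while 20 violations gives a statistic around 49 and a clear rejection.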
189

The impact of the market risk of capital regulations on bank activities

Eksi, Emrah January 2006 (has links)
Banking has a unique role in the well-being of an economy. This role makes banks one of the most heavily regulated and supervised industries. In order to strengthen the soundness and stability of banking systems, regulators require banks to hold adequate capital. While credit risk was the only risk that was covered by the original Basle Accord, with the 1996 amendment, banks have also been required to assign capital for their market risk starting from 1998. In this research, the impact of the market risk capital regulations on bank capital levels and derivative activities is investigated. In addition, this study also evaluates the impact of using different approaches that are allowed to be used while calculating the required market risk capital, as well as the accuracy of VaR models. The implementation of the market risk capital regulations can influence banks either by increasing their capital or by decreasing their trading activities and in particular trading derivative activities. The literature review concerning capital regulations illustrates that in particular the impact of these regulations on bank capital levels and derivative activities is an issue that has not yet been explored. In order to fill this gap, the changes in capital and derivatives usage ratios are modelled by using a partial adjustment framework. The main results of this analysis suggest that the implementation of the market risk capital regulations has a significant and positive impact on the risk-based capital ratios of BHCs. However, the results do not indicate any impact of these regulations on derivative activities. The empirical findings also demonstrate that there is no significant relationship between capital and derivatives. The market risk capital regulations allow the use of either a standardised approach or the VaR methodologies to determine the required capital amounts to cover market risk. 
In order to evaluate these approaches, the differences in bank VaR practices are first investigated through a documentary analysis, comparing the VaR models of 25 international banks. The survey results demonstrate that there is no industry consensus on the methodology for calculating VaR. This analysis also indicates that the assumptions underlying VaR models vary considerably among financial institutions. Therefore, it is very difficult for financial market participants to make comparisons across institutions on the basis of single VaR values. Secondly, the required capital amounts are calculated for two hypothetical foreign exchange portfolios by using both the standardised approach and three different VaR methodologies, and these capital amounts are then compared. These simulations are conducted to understand to what extent the approaches allowed by the market risk capital regulations produce different capital levels. The results indicate that the VaR estimates are dependent upon the VaR methodology. Thirdly, three backtesting methodologies are applied to the VaR models. The results indicate that a VaR model that provides accurate estimates for a specific portfolio could fail when the portfolio composition changes. The results of the simulations indicate that the market risk capital regulations do not provide a 'level playing field' for banks that are subject to these regulations. In addition, giving banks the option to determine the VaR methodology could create a moral hazard problem, as banks may choose an inaccurate model that yields lower required capital amounts.
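Under the internal models approach introduced by the 1996 amendment, the required market risk capital is, broadly, the larger of the most recent daily VaR and a supervisory multiplier (at least 3) times the average VaR over the last 60 trading days. A simplified sketch of that rule, omitting the backtesting add-on to the multiplier:

```python
def market_risk_capital(var_yesterday, var_last_60, multiplier=3.0):
    """Simplified internal-models capital charge: max of the most recent
    daily VaR and multiplier times the 60-day average VaR."""
    avg_var = sum(var_last_60) / len(var_last_60)
    return max(var_yesterday, multiplier * avg_var)
```

The multiplier term usually binds, which is why the choice of VaR methodology feeds directly through to the capital levels the simulations in this thesis compare.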
190

Optimal Deployment of Direction-finding Systems

Kim, Suhwan 03 October 2013 (has links)
A direction-finding system with multiple direction finders (DFs) is a military intelligence system designed to detect the positions of transmitters of radio frequencies. This dissertation studies three decision problems associated with the direction-finding system. The first part of this dissertation prescribes DF deployment to maximize the effectiveness with which transmitter positions are estimated in an area of interest (AOI). Three methods are presented to prescribe DF deployment. The first method uses Stansfield’s probability density function to compute objective function coefficients numerically. The second and the third employ surrogate measures of effectiveness as objective functions. The second method, like the first, involves complete enumeration; the third formulates the problem as an integer program and solves it with an efficient network-based label-setting algorithm. Our results show that the third method, which uses a surrogate measure as an objective function and an exact label-setting algorithm, is the most effective. The second part of this dissertation minimizes the number of DFs needed to cover an AOI effectively, considering obstacles between DFs and transmitters. We formulate this problem as a partial set multicover problem in which at least a specified fraction of the likely transmitter positions must be covered, each by at least k direction finders. We present greedy heuristics with random selection rules for the partial set multicover problem, estimating statistical bounds on the unknown optimal values. Our results show that the greedy heuristic with a column selection rule, which gives priority to selecting a column that advances more rows to k-coverage, performs best on the partial set multicover problems. Results also show that the heuristic with random row and column selection rules is the best of the heuristics with respect to statistical bounds.
The third part of this dissertation deals with the problem of deploying direction finders with the goal of maximizing the effectiveness with which transmitter positions can be estimated in an AOI while hedging against enemy threats. We present four formulations, considering the probability that a direction finder deployed at a location will survive enemy threats over the planning horizon (i.e., not be rendered inoperative by an attack). We formulate the first two as network flow problems and present an efficient label-setting algorithm. The third and the fourth use the well-known Conditional Value at Risk (CVaR) risk measure to deal with the risk of being rendered inoperative by the enemy. Computational results show that risk-averse decision models tend to deploy some or all DFs in locations that are not close to the enemy to reduce risk. Results also show that a direction-finding system with 5 DFs provides improved survivability under enemy threats.
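The Conditional Value-at-Risk measure used in the third part can be sketched, for a discrete sample of losses, as the average of the worst (1 − alpha) fraction of outcomes. This is a simple empirical estimator for illustration only; the dissertation's CVaR optimization formulations are not reproduced here.

```python
import math

def cvar(losses, alpha=0.95):
    """Empirical Conditional Value-at-Risk: the mean of the worst
    (1 - alpha) fraction of the sampled losses."""
    s = sorted(losses)                     # ascending: worst losses last
    k = math.ceil(alpha * len(s))          # index of the VaR cut-off
    tail = s[k:] if k < len(s) else [s[-1]]
    return sum(tail) / len(tail)
```

Because CVaR averages over the tail beyond VaR, minimizing it pushes deployments away from high-threat locations, which matches the risk-averse behaviour reported above.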
