  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
181

A study on the performance evaluation of financial holding company

Kuo, Chen-Ling 19 August 2002 (has links)
No description available.
182

The Application Of VaR In Taiwan Property And Casualty Insurance Industry And Influence Factor Of Underwriting Risk Research

Liu, Cheng-chung 02 July 2008 (has links)
Abstract: In recent years, Value at Risk (VaR) has become an important risk-management tool in the banking industry. The property and casualty insurance industry, by contrast, has seen little related research in this area, largely because the data needed to study underwriting risk are difficult to collect, and domestic studies on the topic are few. This paper draws its statistical data from the TEJ databank; the sample comprises 9 property insurance companies. Using the public information in the TEJ databank, we obtain yearly and quarterly data, apply the "Fuzzy Distance Weighting Method" to convert the quarterly data into monthly data, compute yearly, quarterly, and monthly loss ratios, and then use the idea of VaR to compare the loss ratio-at-risk across the three frequencies. The study also examines the factors that influence the underwriting risk of the domestic property and casualty insurance industry. We find that yearly data underestimate the actual loss ratio-at-risk. In addition, regression analysis shows that the underwriting loss ratio-at-risk is influenced by free cash flow, leverage ratio, and firm size. The results can serve as a reference for property and casualty insurers and supervisory authorities when setting risk-management rules. Keywords: Value at risk, Loss ratio, Loss ratio-at-risk, Underwriting risk
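The loss ratio-at-risk compared across data frequencies in this abstract amounts to taking an upper quantile of observed loss ratios. A minimal sketch with invented monthly figures (not the paper's actual data) shows why coarser aggregation understates the tail:

```python
import numpy as np

def loss_ratio_at_risk(loss_ratios, confidence=0.95):
    """Upper quantile of observed loss ratios, in the spirit of VaR."""
    return float(np.quantile(loss_ratios, confidence))

# Hypothetical monthly loss ratios (claims / premiums) for one insurer.
monthly = np.array([0.61, 0.55, 0.72, 0.48, 0.90, 0.66,
                    0.59, 0.81, 0.52, 0.70, 0.63, 0.75])

# Averaging into coarser periods smooths away the extreme months,
# which is why yearly data understate the loss ratio-at-risk.
yearly_average = float(monthly.mean())
print(loss_ratio_at_risk(monthly), yearly_average)
```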
183

Liquidity and capital market imperfections /

Kessler, Stephan Markus. January 2006 (has links) (PDF)
Univ., Diss.--St. Gallen, 2006.
184

Predicting and hedging credit portfolio risk with macroeconomic factors /

Bär, Tobias. January 2002 (has links)
Frankfurt (Main), University, Thesis (doctoral), 2001.
185

Aspects of Modeling Fraud Prevention of Online Financial Services

Dan, Gorton January 2015 (has links)
Banking and online financial services are part of our critical infrastructure. As such, they constitute an Achilles heel in society and need to be protected accordingly. The last ten years have seen a steady shift from traditional show-off hacking towards cybercrime with great economic consequences for society. The threats against online services are getting worse, and risk management with respect to denial-of-service attacks, phishing, and banking Trojans is now on the agenda of most financial institutions. This trend is overseen by responsible authorities, who are stepping up their minimum requirements for risk management of financial services and, among other things, require regular risk assessment of current and emerging threats.

For the financial institution, this situation creates a need to understand all parts of the incident response process of the online services, including the technology, sub-processes, and the resources working with online fraud prevention. The effectiveness of each countermeasure has traditionally been measured for one technology at a time, leaving the fraud prevention manager with separate values for the effectiveness of authentication, intrusion detection, and fraud prevention. In this thesis, we address two problems with this situation. Firstly, there is a need for a tool that can model current countermeasures in light of emerging threats. Secondly, the development of fraud detection is hampered by the lack of accessible data.

In the main part of this thesis, we highlight the importance of looking at the "big risk picture" of the incident response process, rather than focusing on one technology at a time. In the first article, we present a tool that makes it possible to measure the effectiveness of the incident response process; we call this an incident response tree (IRT). In the second article, we present additional scenarios relevant for risk management of online financial services using IRTs. Furthermore, we introduce a complementary model inspired by existing models used for measuring credit risk. This enables us to compare different online services using two measures, which we call Expected Fraud and Conditional Fraud Value at Risk. Finally, in the third article, we create a simulation tool that enables us to use scenario-specific results together with models such as return on security investment to support decisions about future security investments.

In the second part of the thesis, we develop a method for producing realistic-looking data for testing fraud detection. In the fourth article, we introduce multi-agent-based simulations together with social network analysis to create data that can be used to fine-tune fraud prevention, and in the fifth article, we continue this effort by adding a platform for testing fraud detection.
186

Efektyviojo investicinio portfelio valdymas rizikos vertės metodu / An Effective Investment Portfolio Management Using Value-at-Risk

Lukšys, Kęstutis 07 June 2006 (has links)
This work analyzes one risk measure, Value-at-Risk (VaR). A complete definition of VaR is presented, and three classical calculation methods are examined: parametric, historical simulation, and Monte Carlo generation. The main advantages and disadvantages of applying VaR are reviewed. The effect of correlation on risk diversification between two assets is examined, and the Markowitz method for calculating the efficient frontier is presented. The analyzed methods were implemented in a program that calculates the first moments of portfolio returns and the correlations between assets and, for a given return, adjusts the weights so as to minimize the dispersion of portfolio returns. For every efficient portfolio, VaR can be calculated at any confidence level. The program was used to analyze three investment portfolios: one of generated data with a normal distribution, one of LITIN-10 index stocks, and one of OMX Vilnius index stocks. The efficient frontier and the VaR along the whole efficient frontier were calculated for these portfolios. We observed a difference between the minimal-VaR and minimal-standard-deviation portfolios, so three investment strategies were implemented. For the analyzed portfolios, the best results were achieved with the minimized-VaR strategy.
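The three classical calculation methods named in the abstract can be sketched side by side. The return series below is simulated and the 99% standard-normal quantile is hard-coded to keep the sketch dependency-free; this illustrates the methods, not the thesis's program:

```python
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.01, 1000)  # hypothetical daily returns
alpha = 0.99

# 1. Parametric (variance-covariance): assume normally distributed returns.
mu, sigma = returns.mean(), returns.std(ddof=1)
z_99 = 2.3263  # 99% standard-normal quantile
var_parametric = z_99 * sigma - mu

# 2. Historical simulation: empirical quantile of the observed returns.
var_historical = -float(np.quantile(returns, 1 - alpha))

# 3. Monte Carlo: simulate from a fitted model (here, simply normal).
sims = rng.normal(mu, sigma, 100_000)
var_montecarlo = -float(np.quantile(sims, 1 - alpha))

print(var_parametric, var_historical, var_montecarlo)
```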
187

Dynamic Hedging: CVaR Minimization and Path-Wise Comparison

Smirnov, Ivan Unknown Date
No description available.
188

Extreme value modelling with application in finance and neonatal research

Zhao, Xin January 2010 (has links)
Modelling the tails of distributions is important in many fields, such as environmental science, hydrology, insurance, engineering and finance, where the risk of unusually large or small events is of interest. This thesis applies extreme value models in neonatal and finance studies and develops novel extreme value modelling for financial applications, to overcome issues associated with the dependence induced by volatility clustering and with threshold choice.

The instability of preterm infants motivates estimating the underlying variability of the physiological measurements typically taken on neonatal intensive care patients. A stochastic volatility model (SVM), fitted using Bayesian inference and a particle filter to capture the on-line latent volatility of oxygen concentration, is used to estimate the variability of medical measurements of preterm infants and to highlight instabilities resulting from their under-developed biological systems. Alternative volatility estimators are considered to evaluate the performance of the SVM estimates; the results suggest that the stochastic volatility model provides a good estimator of the variability of the oxygen concentration data and may therefore be used to estimate the instantaneous latent volatility of the physiological measurements of preterm infants. The classical extreme value distribution, the generalized Pareto distribution (GPD), is then applied with the peaks-over-threshold (POT) method to ameliorate the impact of dependence in the extremes and to infer the extreme quantiles of the SVM-based variability estimates.

Financial returns typically show clusters of observations in the tails, often termed "volatility clustering", which creates challenges when applying extreme value models, since classical extreme value theory assumes independence of the underlying process. Explicit modelling of GARCH-type dependence in the extremes is developed by embedding a GARCH conditional variance structure in the extreme value model parameters. With this combination of GEV and GARCH models, both simulation and empirical results show that the combined model is better suited to explaining the extreme quantiles. Another important benefit of the proposed model is that, as a one-stage model, it makes inference, and the accounting for all uncertainties, much easier than the traditional two-stage approach for capturing this dependence.

To tackle the challenge of threshold choice in extreme value modelling, and the generally asymmetric distribution of financial data, a two-tail GPD mixture model is proposed with Bayesian inference to capture both upper and lower tail behaviour simultaneously. The proposed approach can estimate both thresholds, along with the other model parameters, and can therefore account for the uncertainty associated with threshold choice in subsequent inference. The two-tail GPD mixture model is a very flexible model for capturing all forms of tail behaviour, potentially allowing for asymmetry between the two tails, and is demonstrated to be more applicable in financial applications than the one-tail GPD mixture models previously proposed in the literature. A new Value-at-Risk (VaR) estimation method is then constructed by combining the proposed mixture model with a two-stage method: volatility estimation using a latent volatility model (or realized volatility) is followed by the two-tail GPD mixture model applied to the independent innovations, to overcome the key issue of dependence and to account for the uncertainty associated with threshold choice. The proposed method is applied to forecasting VaR for empirical return data during the recent financial-crisis period.
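For the POT approach mentioned in the abstract, once the GPD tail parameters have been fitted, the VaR quantile follows in closed form. The parameter values below are invented for illustration; this is the standard POT quantile formula, not the thesis's fitted model:

```python
import math

def gpd_var(u, xi, beta, n, n_exceed, q):
    """Value-at-Risk from a GPD fitted to exceedances over threshold u.

    u: threshold, xi: shape, beta: scale,
    n: sample size, n_exceed: number of exceedances, q: confidence level.
    """
    tail_prob = n / n_exceed * (1 - q)
    if abs(xi) < 1e-12:          # exponential-tail limit as xi -> 0
        return u - beta * math.log(tail_prob)
    return u + (beta / xi) * (tail_prob ** (-xi) - 1)

# Illustrative (invented) fit for daily loss returns:
print(gpd_var(u=0.02, xi=0.2, beta=0.008, n=2500, n_exceed=120, q=0.99))
```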
189

Extreme value theory and copula theory: a risk management application with energy futures.

Liu, Jia 06 April 2011 (has links)
Deregulation of the energy market and surging trading activities have made the energy markets even more volatile in recent years. Under such circumstances, it becomes increasingly important to assess the probability of rare and extreme price movement in the risk management of energy futures. Similar to other financial time series, energy futures exhibit time varying volatility and fat tails. An appropriate risk measurement of energy futures should be able to capture these two features of the returns. In the first portion of this dissertation, we use the conditional Extreme Value Theory model to estimate Value-at-Risk (VaR) and Expected Shortfall (ES) for long and short trading positions in the energy markets. The statistical tests on the backtests show that this approach provides a significant improvement over the widely used Normal distribution based VaR and ES models. In the second portion of this dissertation, we extend our analysis from a single security to a portfolio of energy futures. In recent years, commodity futures have gained tremendous popularity as many investors believe they provide much needed diversification to their portfolios. In order to properly account for any diversification benefits, we employ a time-varying conditional bivariate copula approach to model the dependence structure between energy futures. In contrast to previous studies on the same subject, we introduce fundamental supply and demand factors into the copula models to study the dependence structure between energy futures. We find that energy futures are more likely to move together during down markets than up markets. In the third part of this dissertation, we extend our study of bivariate copula models to multivariate copula theory. We employ a pair-copula approach to estimate VaR and ES of a portfolio consisting of energy futures, the S&P 500 index and the US Dollar index. 
Our empirical results show that although the pair copula approach does not offer any added advantage in VaR and ES estimation over a long backtest horizon, it provides much more accurate estimates of risk during the period of high co-dependence among assets after the recent financial crisis.
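The dissertation uses time-varying conditional copulas; the simplest static instance, a bivariate Gaussian copula, can be sketched to show how a copula separates the dependence structure from the marginals. This is a minimal sketch, not the dissertation's model:

```python
import math
import numpy as np

def gaussian_copula_sample(rho, size, seed=0):
    """Draw (u, v) uniform pairs whose dependence is a Gaussian copula."""
    rng = np.random.default_rng(seed)
    z1 = rng.standard_normal(size)
    z2 = rho * z1 + math.sqrt(1.0 - rho**2) * rng.standard_normal(size)
    # The standard normal CDF maps each margin to Uniform(0, 1).
    norm_cdf = np.vectorize(lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0))))
    return norm_cdf(z1), norm_cdf(z2)

u, v = gaussian_copula_sample(rho=0.7, size=5000)
# u and v can now be pushed through any marginals, e.g. empirical
# distributions of two energy-futures return series.
print(float(np.corrcoef(u, v)[0, 1]))
```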
190

Which GARCH model is best for Value-at-Risk?

Berggren, Erik, Folkelid, Fredrik January 2015 (has links)
The purpose of this thesis is to identify the best volatility model for Value-at-Risk (VaR) estimation. We estimate 1 % and 5 % VaR figures for Nordic indices and stocks by using two symmetrical and two asymmetrical GARCH models under different error distributions. Out-of-sample volatility forecasts are produced using a 500-day rolling window estimation on data covering January 2007 to December 2014. The VaR estimates are thereafter evaluated through Kupiec's test and Christoffersen's test in order to find the best model. The results suggest that asymmetrical models perform better than symmetrical models, albeit the simple ARCH is often good enough for 1 % VaR estimates.
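Kupiec's test, used in this abstract to evaluate the VaR estimates, is a likelihood-ratio test of whether the observed violation frequency matches the nominal level. A minimal sketch with invented backtest counts:

```python
import math

def kupiec_pof(n_obs, n_violations, p):
    """Kupiec's proportion-of-failures LR statistic (chi-square, 1 df)."""
    x, n = n_violations, n_obs
    if x == 0:                       # no violations observed
        return -2.0 * n * math.log(1.0 - p)
    if x == n:                       # a violation every day
        return -2.0 * n * math.log(p)
    pi_hat = x / n
    def loglik(prob):
        return x * math.log(prob) + (n - x) * math.log(1.0 - prob)
    return -2.0 * (loglik(p) - loglik(pi_hat))

# A 500-day backtest of a 1 % VaR model with 9 violations (5 expected).
lr = kupiec_pof(500, 9, 0.01)
print(lr)  # compare with the 5 % chi-square(1) critical value, 3.84
```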
