  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Essays on theoretical credit risk modelling

Xu, Dapeng January 2003 (has links)
No description available.
2

Credit Risk Modeling and Implementation

Gunnars, Johan January 2017 (has links)
The financial crisis and the bankruptcy of Lehman Brothers in 2008 led to stricter regulations for the banking industry, including larger capital reserves for banks. One component of this increased capital reserve was the credit valuation adjustment capital charge, which can be described as the market value of counterparty default risk. The purpose of the credit valuation adjustment capital charge is to capitalize the risk of future changes in the market value of counterparty default risk. One financial contract that played a key role in the financial crisis was the credit default swap. A credit default swap involves three parties: a contract seller, a contract buyer and a reference entity. The credit default swap can be seen as insurance against a credit event, for example a default of the reference entity. This thesis focuses on the study and calculation of the credit valuation adjustment of credit default swaps. The credit valuation adjustment on a credit default swap can be implemented under two different assumptions. In the first case, the seller (buyer) of the contract is assumed to be default risk free, so only the buyer (seller) contributes to the default risk. In the second case, both the seller and the buyer of the contract are assumed to be default risky, and therefore both parties contribute to the default risk.
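The unilateral case described above (only the counterparty is default-risky) can be sketched numerically: CVA is a loss-given-default multiplied by a sum of discounted expected exposures, weighted by the counterparty's default probabilities over a time grid. The constant hazard rate, flat exposure profile and all parameter values below are illustrative assumptions, not figures from the thesis.

```python
import math

def survival(lam, t):
    # Survival probability under a constant hazard rate lam
    return math.exp(-lam * t)

def unilateral_cva(expected_exposure, lam_cpty, recovery, r, grid):
    """Unilateral CVA: only the counterparty is default-risky.
    CVA = (1 - R) * sum over intervals of discounted EE * default probability."""
    cva = 0.0
    for t0, t1 in zip(grid[:-1], grid[1:]):
        tm = 0.5 * (t0 + t1)                                 # midpoint convention
        pd = survival(lam_cpty, t0) - survival(lam_cpty, t1)  # default in (t0, t1]
        cva += (1.0 - recovery) * math.exp(-r * tm) * expected_exposure(tm) * pd
    return cva

# Assumed inputs: flat expected exposure of 1,000,000, 2% hazard rate,
# 40% recovery, 3% flat discount rate, quarterly grid over 5 years
grid = [0.25 * i for i in range(21)]
cva = unilateral_cva(lambda t: 1_000_000.0, 0.02, 0.40, 0.03, grid)
```

The midpoint evaluation of discount factor and exposure is one common discretization choice; finer grids reduce the discretization error.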
3

Empirical Examination of the Ex ante and Ex post Determinants of the ICT Adoption

Cheng, kai-yun 13 July 2004 (has links)
This article uses plant-level data from Taiwan's manufacturing industry to study the determinants of the timing of information and communication technology (ICT) adoption. It then investigates whether the determinants of ICT utilization differ between high-tech and traditional industries. We find that the size variable has the most significant effect, and that its impact differs across industries.
4

The Impact of E-commerce Adoption on Firm's Performance

Lo, Wen-Shin 03 August 2005 (has links)
This paper examines, through a duration model, the determinants of the timing of E-commerce adoption in Taiwan's manufacturing industries. It also investigates whether the determinants of E-commerce utilization differ across industries, and constructs measures of E-commerce spillovers, vertical integration and diversification to study how E-commerce affects a plant's boundaries and transaction efficiency by changing the transaction costs among plants.
5

The Valuation of Credit Default Swaps

Diallo, Nafi C 11 January 2006 (has links)
The credit derivatives market has seen an incredible development since its advent in the 1990s. Today there is a plethora of credit derivatives, ranging from the simplest ones, credit default swaps (CDS), to more complex ones such as synthetic single-tranche collateralized debt obligations. Valuing this rich range of products involves modeling credit risk. For this purpose, two main approaches have been explored and proposed since the mid-1970s. The first is the structural approach, proposed by Merton in 1974 following the work of Black and Scholes on pricing stock options; it relies on the capital structure of a firm to model its probability of default. The other is the reduced-form, or hazard rate, approach, pioneered by Duffie, Lando and Jarrow, among others; its main thesis is that default should be modeled as a jump process. The objective of this work is to value asset-backed credit default swaps using the hazard rate approach.
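In the reduced-form setting this abstract describes, a constant hazard rate λ already suffices to illustrate how a plain CDS is priced: the par spread equates the protection leg to the premium annuity, and comes out close to the "credit triangle" approximation s ≈ λ(1 − R). The parameter values below are assumed for illustration only.

```python
import math

def cds_par_spread(lam, recovery, r, maturity, steps_per_year=4):
    """Par CDS spread under a constant hazard rate (reduced-form model).
    Premium leg PV = s * sum of df * survival * dt (the risky annuity);
    protection leg PV = (1 - R) * sum of df * incremental default probability."""
    dt = 1.0 / steps_per_year
    n = int(maturity * steps_per_year)
    annuity, protection = 0.0, 0.0
    for i in range(1, n + 1):
        t0, t1 = (i - 1) * dt, i * dt
        df = math.exp(-r * 0.5 * (t0 + t1))                 # midpoint discounting
        annuity += df * math.exp(-lam * t1) * dt            # premium paid if alive
        protection += df * (math.exp(-lam * t0) - math.exp(-lam * t1))
    return (1.0 - recovery) * protection / annuity

# Assumed inputs: 2% hazard, 40% recovery, 3% rates, 5-year contract
spread = cds_par_spread(lam=0.02, recovery=0.40, r=0.03, maturity=5.0)
```

The credit triangle gives λ(1 − R) = 0.02 × 0.6 = 120 basis points; the discretized legs reproduce this up to a small timing bias.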
6

Local Labor Market Scale, Search Duration, and Re-Employment Match Quality for U.S. Displaced Workers

Wilkin, Kelly R 18 December 2012 (has links)
Geographic space is an important friction preventing the instantaneous matching of unemployed workers to job vacancies. Cities reduce spatial frictions by decreasing the average distance between potential match partners. Owing to these search efficiencies, theories of agglomeration predict that unemployed workers in larger labor markets find employment faster than observationally similar workers in smaller markets. Existing studies rely on cross-sectional variation in aggregate unemployment rates across spatially distinct labor markets to test for scale effects in job search. A major difficulty with these studies is that the unemployment rate reflects, at any given time, both the incidence and the duration of unemployment. Conclusions about unemployment exits drawn from the unemployment rate are therefore confounded by transitions into unemployment. This dissertation examines the relationship between market scale and unemployment duration for permanently laid-off workers in the U.S. Using a large sample of individual unemployment spells in 259 MSAs, proportional hazard model estimates indicate a negative relationship between market scale and the hazard of exiting unemployment. This effect is strengthened when space is explicitly controlled for and measured with greater precision. These results are consistent with the hypothesis that search efficiencies lead workers to increase their reservation wages. 2SLS estimates show that re-employment earnings for permanently laid-off workers increase with market scale after controlling for endogenous search duration. These effects are robust to standard controls, as well as controls for local labor market conditions. These results challenge the view that search efficiencies lead to lower unemployment rates through faster job-finding rates.
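A minimal way to see the proportional-hazards mechanism discussed above is to let the exit hazard scale with log market size. The baseline hazard, the (negative) coefficient matching the reported scale effect, and the market sizes below are assumed purely for illustration; the dissertation estimates these from MSA-level spell data.

```python
import math, random

def simulate_duration(h0, beta, log_scale, rng):
    """Exponential proportional-hazards sketch: exit hazard = h0 * exp(beta * log_scale).
    A negative beta means larger markets have a LOWER exit hazard, hence longer spells."""
    hazard = h0 * math.exp(beta * log_scale)
    return rng.expovariate(hazard)

rng = random.Random(0)
beta = -0.1  # assumed sign and size, mirroring the negative scale effect above
small = [simulate_duration(0.5, beta, math.log(1e5), rng) for _ in range(20000)]
large = [simulate_duration(0.5, beta, math.log(1e7), rng) for _ in range(20000)]
mean_small = sum(small) / len(small)
mean_large = sum(large) / len(large)
```

With these assumed numbers, average spells in the larger market come out noticeably longer, the pattern the hazard estimates above imply.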
7

Model for Bathtub-Shaped Hazard Rate: Monte Carlo Study

Leithead, Glen S. 01 May 1970 (has links)
A new model developed for the entire bathtub-shaped hazard rate curve has been evaluated for its usefulness as a method of reliability estimation. The model is of the form F(t) = 1 − exp[−(θ1·t^L + θ2·t + θ3·t^M)], where L and M were assumed known. The reliability estimate obtained from the new model was compared with the traditional restricted-sample estimate for four different time intervals and was found to have less bias and variance at all time points. This was a Monte Carlo study, and the data generated showed that the new model has much potential as a method for estimating reliability. (51 pages)
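The quoted model can be written down directly: the cumulative hazard is θ1·t^L + θ2·t + θ3·t^M, so the hazard rate is its derivative, and choosing L < 1 with M > 1 produces the bathtub shape (high infant-mortality hazard, a flat middle, then wear-out). The parameter values below are illustrative assumptions, not values from the study.

```python
import math

def reliability(t, th1, th2, th3, L, M):
    """R(t) = 1 - F(t) = exp(-(th1*t^L + th2*t + th3*t^M)) for the model above."""
    return math.exp(-(th1 * t**L + th2 * t + th3 * t**M))

def hazard(t, th1, th2, th3, L, M):
    # Derivative of the cumulative hazard th1*t^L + th2*t + th3*t^M
    return th1 * L * t**(L - 1) + th2 + th3 * M * t**(M - 1)

# Assumed values: L < 1 drives early failures, M > 1 drives wear-out
params = dict(th1=0.5, th2=0.05, th3=0.001, L=0.5, M=3.0)
h = [hazard(t, **params) for t in (0.1, 5.0, 15.0)]
# hazard is high early, low mid-life, and rising late: the bathtub shape
```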
8

Credit Risk from Theory to Application

Yi, Chuang 04 1900 (has links)
In this thesis, we calibrate a one-factor CIR model for the interest rate and a two-factor CIR model for the hazard rate of each of 21 firms. Time series of the interest rate and of each firm's hazard rate are also obtained. The extended Kalman filter (EKF) and quasi-maximum likelihood estimation (QMLE) are used as the numerical scheme. The empirical results suggest that multifactor CIR models are not better than the multifactor Hull-White model. Positive correlations between hazard rates and the interest rate are discovered, although most hazard rates are found to be negatively correlated with the default-free interest rate. The 21 filtered hazard rate time series suggest that there may be a hidden common factor shared only by the intensities. Monte Carlo simulation is conducted for both the interest rate and the hazard rates. The simulations indicate that both the SKF and the EKF work well as filtering tools but may produce poor estimates of the value of the likelihood function. QMLE works well for models in linear state-space form, but it does a poor job in the non-linear case. / Thesis / Master of Science (MSc)
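For readers unfamiliar with the CIR dynamics calibrated in this thesis, a one-factor path can be simulated with a simple full-truncation Euler scheme. The parameters below are assumed for illustration and are not the thesis's EKF/QMLE estimates.

```python
import math, random

def simulate_cir(x0, kappa, theta, sigma, dt, n, rng):
    """Full-truncation Euler discretization of the CIR process
    dx = kappa*(theta - x) dt + sigma*sqrt(x) dW, here read as a hazard rate."""
    path = [x0]
    for _ in range(n):
        x = path[-1]
        xp = max(x, 0.0)  # truncation keeps the square root real
        x_next = x + kappa * (theta - xp) * dt + sigma * math.sqrt(xp * dt) * rng.gauss(0.0, 1.0)
        path.append(x_next)
    return path

rng = random.Random(42)
# Assumed parameters: mean reversion to theta = 3% at speed kappa = 0.5
path = simulate_cir(x0=0.02, kappa=0.5, theta=0.03, sigma=0.05,
                    dt=1 / 252, n=252 * 5, rng=rng)
long_run_mean = sum(path[len(path) // 2:]) / (len(path) - len(path) // 2)
```

With these parameters the Feller condition 2κθ > σ² holds, so the truncation rarely binds and the path hovers around the long-run mean θ.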
9

兩個二段式指數分配比較之研究 / Comparison of two exponential distributions with a change point

賴武志, Lai, Wu Chih Unknown Date (has links)
In survival analysis, the exponential distribution with a change point (also called a two-phase exponential distribution) is often used to study the recurrence rate of certain diseases, in order to determine whether a treatment is effective. In the literature, however, this model has mostly been studied for a single population, where the problems are twofold: testing whether the change point exists, and estimating the change point.   This thesis extends the problem from one population to two, comparing whether two populations share the same change point, initial hazard rate or transition rate. We analyze the problem using both Bayesian and classical methods.   With the Bayesian approach, we derive the posterior distributions of the ratios or differences of the parameters of the two populations under various known conditions. These distributions are almost all complex in form, making further analysis difficult. We therefore introduce Gibbs sampling, using the full conditional posterior distributions to extract the marginal posterior distributions for inference.   In the classical analysis, we adopt the likelihood ratio test. The main difficulty is that when the change point is unknown, the distribution of the log-likelihood ratio is also unknown. In addition to reviewing two methods from the literature for estimating the change point, we use the bootstrap to estimate the distribution of the log-likelihood ratio for testing purposes.   Such two-population comparisons are highly meaningful in medicine and industry. This thesis not only derives the statistical framework for the comparison but also provides concrete and practical sampling methods, contributing substantially to the analysis of this problem.
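The classical side of this change-point problem can be sketched for a single population: under a two-phase exponential model the log-likelihood can be profiled over candidate change points on a grid. The rates, sample size and grid below are illustrative assumptions; the thesis treats the harder two-population comparison with the bootstrap and Gibbs sampling.

```python
import math, random

def loglik(times, tau, lam1, lam2):
    """Log-likelihood of a two-phase (change-point) exponential model:
    hazard lam1 on [0, tau), hazard lam2 on [tau, infinity)."""
    ll = 0.0
    for t in times:
        if t < tau:
            ll += math.log(lam1) - lam1 * t
        else:
            ll += math.log(lam2) - lam1 * tau - lam2 * (t - tau)
    return ll

def sample(n, tau, lam1, lam2, rng):
    # Memoryless property: draw at rate lam1, restart at rate lam2 past tau
    out = []
    for _ in range(n):
        t = rng.expovariate(lam1)
        if t >= tau:
            t = tau + rng.expovariate(lam2)
        out.append(t)
    return out

rng = random.Random(1)
data = sample(5000, tau=2.0, lam1=1.0, lam2=0.2, rng=rng)
# Profile the likelihood over a grid of candidate change points (rates known here)
grid = [0.5 + 0.25 * i for i in range(15)]  # 0.5 .. 4.0
best_tau = max(grid, key=lambda tau: loglik(data, tau, 1.0, 0.2))
```

With the rates held at their true values, the profiled likelihood peaks near the true change point; in practice the rates are estimated jointly and the log-likelihood ratio's null distribution is approximated, e.g. by the bootstrap as in this thesis.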
10

Testing the Hazard Rate, Part I

Liero, Hannelore January 2003 (has links)
We consider a nonparametric survival model with random censoring. To test whether the hazard rate has a parametric form, the unknown hazard rate is estimated by a kernel estimator. Based on a limit theorem stating the asymptotic normality of the quadratic distance of this estimator from the smoothed hypothesis, an asymptotic α-test is proposed. Since the test statistic depends on the maximum likelihood estimator for the unknown parameter in the hypothetical model, properties of this parameter estimator are investigated. Power considerations complete the approach.
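The kernel estimator underlying such a test can be sketched, in the uncensored case, as a Ramlau-Hansen-type smoothing of the Nelson-Aalen increments 1/Y(t_i). The Epanechnikov kernel, bandwidth and simulated data below are illustrative assumptions.

```python
import math, random

def epanechnikov(u):
    # Standard Epanechnikov kernel, supported on [-1, 1], integrates to 1
    return 0.75 * (1.0 - u * u) if abs(u) < 1.0 else 0.0

def kernel_hazard(t, times, b):
    """Kernel hazard estimate: smooth the Nelson-Aalen increments 1/Y(t_i)
    with bandwidth b (no censoring in this sketch)."""
    times = sorted(times)
    n = len(times)
    est = 0.0
    for i, ti in enumerate(times):
        at_risk = n - i  # Y(t_i): subjects still at risk just before t_i
        est += epanechnikov((t - ti) / b) / (b * at_risk)
    return est

rng = random.Random(3)
data = [rng.expovariate(0.5) for _ in range(4000)]  # true hazard is constant 0.5
h_hat = kernel_hazard(1.0, data, b=0.5)
```

Evaluated at an interior point, the estimate recovers the constant hazard; a goodness-of-fit test of the kind described above would then compare such estimates against the smoothed parametric hypothesis via a quadratic distance.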
