271

Pricing CDOs with Standardized Skew Student-t Distribution Copula Model

黃于騰, Huang, Yu Teng Unknown Date
The most widely used method for pricing collateralized debt obligations (CDOs) is the one-factor copula model under the large homogeneous portfolio (LHP) assumption. The literature indicates that the structure of CDO products has been changing gradually since 2008, and the various extended one-factor copula models still leave room for improvement when pricing the new product types. In this thesis we replace the Gaussian distribution with the Standardized Skew Student-t distribution (SSTD), which has parameters controlling skewness and kurtosis, to price CDO tranches. Like the Student-t distribution, however, the SSTD is not stable under convolution, which adds computation time to the pricing process. The empirical analysis shows that the one-factor SSTD copula model prices the tranches of the new CDO types well, and offers more parametric flexibility than the one-factor Gaussian copula model.
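As a point of reference for the model family discussed above, the following minimal Python sketch computes a tranche's expected loss under the baseline one-factor Gaussian copula with the LHP assumption (the model the thesis replaces with the SSTD). All portfolio parameters here are illustrative assumptions, not values from the thesis.

import numpy as np
from scipy.stats import norm

def lhp_tranche_expected_loss(attach, detach, p_default, rho, recovery=0.4, n_grid=4001):
    # Expected fractional loss of the [attach, detach] tranche under the
    # one-factor Gaussian copula with the LHP assumption.
    c = norm.ppf(p_default)                        # default threshold
    m = np.linspace(-8.0, 8.0, n_grid)             # grid for the common factor M
    # conditional default probability given M = m
    p_m = norm.cdf((c - np.sqrt(rho) * m) / np.sqrt(1.0 - rho))
    loss = (1.0 - recovery) * p_m                  # portfolio loss fraction given m
    tranche = np.clip(loss - attach, 0.0, detach - attach) / (detach - attach)
    return np.sum(tranche * norm.pdf(m)) * (m[1] - m[0])   # integrate over M

# illustrative numbers: 5% default probability, 30% correlation, 3-7% tranche
print(lhp_tranche_expected_loss(0.03, 0.07, 0.05, 0.30))

Swapping the factor distribution for a heavier-tailed, skewed alternative such as the SSTD changes the conditional default probability and, with it, the fit across tranches.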
272

Frequency Analysis of Droughts Using Stochastic and Soft Computing Techniques

Sadri, Sara January 2010
In the Canadian Prairies, recurring droughts are a reality that can have significant economic, environmental, and social impacts. For example, the droughts of 1997 and 2001 cost over $100 million across different sectors. Drought frequency analysis is a technique for analyzing how frequently a drought event of a given magnitude may be expected to occur. This study reviews the state of the science in drought frequency analysis. The main contributions of this thesis include a Matlab model that uses Fuzzy C-Means (FCM) clustering and corrects the resulting regions to meet the criteria of effective hydrological regions; in FCM, each site has a degree of membership in each cluster. The algorithm takes the number of regions and the return period as inputs and outputs the final corrected clusters for most scenarios. Since drought is a bivariate phenomenon whose two statistical variables, duration and severity, must be analyzed simultaneously, an important step in this study is extending the initial Matlab model to correct regions based on L-comoment statistics (as opposed to L-moments). Another contribution is a reasonably straightforward approach to bivariate drought frequency analysis using bivariate L-comoments and copulas, illustrated in the sketch after this abstract. Quantile estimation at ungauged sites for return periods of interest is studied using two classes of neural-network and machine-learning techniques, Radial Basis Function (RBF) networks and Support Vector Machine Regression (SVM-R), selected for their favorable reviews in the literature on function estimation and nonparametric regression. Their performance is compared with the traditional nonlinear regression (NLR) method, and with a regionalized nonlinear regression in which catchments are first grouped using FCM. Drought data from 36 natural catchments in the Canadian Prairies are used. The study provides a methodology for bivariate drought frequency analysis that can be practiced in any part of the world.
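To make the bivariate copula idea concrete, here is a minimal Python sketch (not the thesis's L-comoment procedure) that couples gamma-distributed drought duration and severity through a Gaussian copula and estimates an "AND" joint return period empirically; every parameter value is an invented illustration.

import numpy as np
from scipy.stats import norm, gamma

rng = np.random.default_rng(0)

# illustrative marginals: duration (months) and severity (index units)
dur_dist = gamma(a=2.0, scale=3.0)
sev_dist = gamma(a=1.5, scale=5.0)

# Gaussian copula: correlated normals -> uniforms -> gamma marginals
rho = 0.7
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=100_000)
u = norm.cdf(z)
duration = dur_dist.ppf(u[:, 0])
severity = sev_dist.ppf(u[:, 1])

# empirical joint exceedance probability; assuming one drought event per
# year on average, the "AND" return period is its reciprocal
d0, s0 = 10.0, 12.0
p_and = np.mean((duration > d0) & (severity > s0))
print(f"P(D>{d0}, S>{s0}) = {p_and:.4f}, return period ~ {1.0 / p_and:.1f} years")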
273

A New Approach to Jointly Estimating the Lerner Index and Cost Efficiency for Multi-output Banks under a New Meta-Frontier Framework

江典霖, Chiang, Dien Lin Unknown Date
This paper proposes a copula-based simultaneous stochastic frontier model (CSSFM), composed of a cost frontier and two output price frontiers for the banking sector, in order to measure cost efficiency and market power in the loan and investment markets jointly. Estimating the Lerner index from this simultaneous system of three frontier equations avoids the negative Lerner measures that can arise in the conventional calculation. We then apply the new meta-frontier model of Huang et al. (2014) to estimate and compare cost efficiency and market power across five Western European countries over the period 1998-2010. The salient feature of the proposed approach is that it calculates the technology gap ratio from the meta cost frontier and evaluates a potential Lerner index from the output price frontiers, which can be decomposed into the country-specific Lerner index and a marginal cost gap ratio (MCGR), allowing the degree of competition to be compared across countries.
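For reference, the conventional Lerner index that the joint estimation is designed to keep well-behaved is, for output price P and marginal cost MC,

    L = (P - MC)/P,

which lies between 0 (perfect competition, P = MC) and 1 (monopoly) but turns negative whenever the estimated marginal cost exceeds the observed price — the problem the simultaneous frontier system avoids by estimating the price frontiers jointly with the cost frontier.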
275

Stress-Test Exercises and the Pricing of Very Long-Term Bonds

Dubecq, Simon 28 January 2013
In the first part of this thesis, we introduce a new methodology for stress-test exercises. Our approach makes it possible to consider richer exercises, which assess the impact of modifying the whole distribution of the factors driving asset prices, rather than focusing, as is common practice, on a single realization of these factors, and which take into account the portfolio manager's potential reaction to the shock. The second part of the thesis is devoted to the pricing of bonds with very long maturities (more than ten years). Modeling the volatility of very long-term rates is a challenge, owing to the constraints imposed by the no-arbitrage assumption; as a consequence, most no-arbitrage term structure models assume a constant limiting rate (the rate of infinite maturity). The second chapter investigates the compatibility of the so-called "level" factor, whose variations have a uniform impact on the modeled yield curve, with the no-arbitrage assumptions. In the third chapter we introduce a new class of arbitrage-free term structure factor models that allows the limiting rate to be stochastic, and present its empirical properties on a dataset of US T-Bonds.
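To make the "level factor" discussion concrete (a generic affine-model sketch, not the thesis's own notation): the zero-coupon yield curve takes the form

    y_t(\tau) = a(\tau) + b(\tau)^\top X_t,

and a level factor is one whose loading b(\tau) is approximately constant across maturities \tau, so that its movements shift the whole curve in parallel. The tension with no-arbitrage comes from the Dybvig-Ingersoll-Ross result that the limiting rate \lim_{\tau\to\infty} y_t(\tau) can never fall over time in an arbitrage-free model, which is why most models pin it to a constant; the class introduced in the third chapter keeps the model arbitrage-free while letting this limiting rate move stochastically.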
276

Applying Time Series Models to the Pricing and Prediction of Synthetic CDOs

張弦鈞, Chang, Hsien Chun Unknown Date
Previous studies on pricing synthetic collateralized debt obligations most widely use the one-factor Gaussian copula model under the large homogeneous portfolio (LHP) assumption; however, the normal distribution's inability to capture heavy tails and skewness produces large deviations from market quotes and gives rise to the implied correlation smile. Extensions such as the one-factor Normal Inverse Gaussian (NIG) copula model under the LHP assumption of Kalemanova et al. (2007), and the one-factor copula model based on a mixture of the NIG and Closed Skew Normal (CSN) distributions (the MIX model) of Qiu Yan Ye (2007), were proposed to improve pricing across tranches. Earlier work, however, must feed CDS spreads and actual tranche quotes into the model to obtain correlations and prices, so static models mostly serve as ex-post verification. On the static side, we compare the original copula model with two alternatives: a different scheme for selecting CDS spreads, and a treatment in which the number of periods to the relative maturity date decreases over time. On the dynamic side, we embed time-series methods into the existing pricing models, price synthetic CDOs with different product structures, and compare the two models in an empirical analysis. Finally, we use the time-series model to forecast each tranche.
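As a minimal illustration of embedding a time-series layer in the pricing exercise (a generic sketch, not the thesis's specification), the following Python fits an ARIMA model to a simulated tranche-spread series and produces a short-horizon forecast; the data are invented stand-ins for market quotes.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# simulated daily tranche spreads in basis points (placeholder for quotes)
rng = np.random.default_rng(1)
spread = 150.0 + np.cumsum(rng.normal(0.0, 2.0, size=250))

model = ARIMA(spread, order=(1, 1, 0)).fit()   # AR(1) on first differences
print(model.forecast(steps=5))                 # 5-step-ahead spread forecast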
277

Applying the Copula-Based Network Stochastic Frontier Approach to Study the Efficiency of Taiwan’s Life Insurance Industry

巫瑞虔, Wu, Ruei Cian Unknown Date
Using an unbalanced panel of 26 life insurance companies in Taiwan over 2000-2012, this study applies the copula-based network stochastic frontier approach developed by Huang et al. (2013), which divides the production process into a marketing stage and an investment stage: premium income produced in the first stage enters the second stage as an input in the production of investment income. The estimates are used to compute scale and cost elasticities that characterize the production technology of Taiwan's life insurance industry, intertemporal technical change is analyzed as well, and operating efficiency is finally compared across groups of insurers. The empirical results show that life insurers use relatively few internal staff and more fixed assets in the marketing stage and the reverse in the investment stage, consistent with how life insurers actually operate, and that investment-stage efficiency exceeds marketing-stage efficiency. The 2008 financial crisis lowered the operating efficiency of the industry as a whole. Domestic life insurers outperform branches of foreign insurers in both technical and cost efficiency, insurers affiliated with financial holding companies are technically more efficient than unaffiliated insurers, and insurers established after 1993 are on average technically more efficient than the older, established insurers.
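As generic background for the network frontier idea (a sketch of the model class, not the exact specification of Huang et al. 2013): each stage receives its own stochastic frontier, for example

    Stage 1 (marketing):   \ln Y^{(1)}_{it} = f_1(X^{(1)}_{it}; \beta_1) + v^{(1)}_{it} - u^{(1)}_{it}
    Stage 2 (investment):  \ln Y^{(2)}_{it} = f_2(X^{(2)}_{it}, Y^{(1)}_{it}; \beta_2) + v^{(2)}_{it} - u^{(2)}_{it},

where the v's are two-sided noise terms, the u's >= 0 capture inefficiency, technical efficiency is TE = \exp(-u), the first stage's output Y^{(1)} (premium income) enters the second stage as an input, and a copula joins the two stages' composed errors to capture the dependence between them.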
278

Pricing Single-Tranche Credit Default Swaps and Single-Tranche CDOs: A Copula Approach

林晚容 Unknown Date
Banks hold large volumes of corporate loans, secured loans, and consumer credit of all kinds, exposing financial institutions to a wide range of credit risks. The new Basel Capital Accord substantially revised the calculation of credit risk, and credit derivatives are now recognized as credit risk mitigants. This study therefore examines, for a basket of reference entities, two tailor-made structured credit products: single-tranche credit default swaps and single-tranche collateralized debt obligations. The default intensity is modeled as a stochastic process of the Ornstein-Uhlenbeck type, a special case of the Vasicek model, and a closed-form solution analogous to a risky bond price is derived to replace the survival function. For simplicity, the risk-free rate is assumed constant, and both products are priced with a copula approach. In the numerical study, a synthetic single-tranche CDO is constructed from actual market data. The parameters of the default-intensity dynamics and the copula functions are first estimated and calibrated to market data, and fair credit spreads are then computed from the pricing formula. The results show that the better a copula captures default-dependent credit events, the stronger the clustering of successive defaults among the reference assets. For the product studied here, the Clayton copula best captures default clustering; for the equity tranche, however, which responds to the first default, the number of defaults simulated by Monte Carlo under the Clayton copula is lower than under the other two copulas, so its credit spread is correspondingly lower, whereas the senior default swap, which responds to clustered defaults, carries a higher credit spread. In the value-at-risk comparison there is no clear difference, possibly because the number of reference assets is small.
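The closed form alluded to above follows the same algebra as the Vasicek bond price: if the default intensity follows an Ornstein-Uhlenbeck process, the survival probability is exponential-affine. A standard textbook statement (not the thesis's exact notation):

    d\lambda_t = \kappa(\theta - \lambda_t)\,dt + \sigma\,dW_t,
    S(0,t) = E\!\left[e^{-\int_0^t \lambda_s\,ds}\right] = e^{A(t) - B(t)\lambda_0},
    B(t) = (1 - e^{-\kappa t})/\kappa,
    A(t) = (\theta - \sigma^2/(2\kappa^2))(B(t) - t) - \sigma^2 B(t)^2/(4\kappa).

Replacing the survival function with this closed form is what removes the need for numerical integration of the intensity path in the pricing step.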
279

Pricing Synthetic CDOs

林聖航 Unknown Date
Previous studies on pricing synthetic CDOs most widely use the one-factor Gaussian copula model under the large homogeneous portfolio (LHP) assumption; however, it fails to fit market quotes for synthetic CDO tranches and produces the implied correlation smile. The literature shows that adding heavy tails or skewness to the one-factor copula model mitigates these problems and improves tranche pricing; examples include the one-factor Normal Inverse Gaussian (NIG) copula model under the LHP assumption of Kalemanova et al. (2007) and the one-factor copula model based on a mixture of the NIG and Closed Skew Normal (CSN) distributions (the MIX model) of Qiu Yan Ye (2007), both of which performed very well empirically. Since 2008 the structure of synthetic CDOs has begun to change, whereas earlier pricing work treated a single product structure. This thesis prices synthetic CDOs with different product structures using one-factor copula models based on the Gaussian, NIG, and CSN distributions and the NIG-CSN mixture, calibrating by minimizing absolute error, and compares the models. The empirical results show that the one-factor NIG(2) copula model outperforms the others, demonstrating that the second parameter β of the NIG distribution improves pricing performance, which differs from conclusions in the earlier literature; the MIX model, however, is the only one consistent with the LHP assumption.
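For concreteness, the NIG factor construction used in this strand of the literature typically reads (sketched here in generic notation, not the thesis's calibration)

    X_i = \sqrt{\rho}\,M + \sqrt{1-\rho}\,Z_i,

with the common factor M and idiosyncratic terms Z_i drawn from NIG distributions whose parameters are matched through the NIG scaling property so that X_i is again NIG: the family is closed under convolution only when the tail parameters \alpha and \beta agree across summands. Setting \beta = 0 gives the symmetric case; freeing \beta adds skewness, which is the extra degree of freedom the NIG(2) results above credit for the improved tranche fits.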
280

Fusion of heterogeneous remote sensing images by credibilist methods

Hammami, Imen 08 December 2017
With the advent of new image acquisition techniques and the emergence of high-resolution satellite systems, remote sensing data have become increasingly rich and varied, and combining them has become essential to improve the extraction of useful information about the physical nature of the observed surfaces. However, these data are generally heterogeneous and imperfect, which raises several problems for their joint processing and requires the development of specific methods. This thesis addresses that context by developing a new evidential fusion method dedicated to processing heterogeneous high-resolution remote sensing images. To this end, we first develop a new approach to estimating belief functions, based on Kohonen's self-organizing map, that simplifies mass assignment for the large volumes of data these images occupy; the proposed method models not only the ignorance and imprecision of the information sources but also their paradox. We then exploit this estimation approach to propose an original fusion technique that addresses the problems caused by the wide variety of knowledge provided by these heterogeneous sensors. Finally, we study how the dependence between the sources can be taken into account in the fusion process by means of copula theory, introducing a new technique for choosing the most appropriate copula. The experimental part of this work is devoted to land-use mapping of agricultural areas using SPOT-5 and RADARSAT-2 images; the experiments demonstrate the robustness and effectiveness of the approaches developed in this thesis.
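To ground the belief-function machinery mentioned above, here is a minimal Python sketch of Dempster's rule of combination over a toy frame of discernment of land-cover classes; it is a generic illustration, not the thesis's Kohonen-map estimation scheme, and the two "sensor" mass functions are invented.

from itertools import product

def dempster_combine(m1, m2):
    # Combine two mass functions (dicts mapping frozenset focal elements
    # to masses) by Dempster's rule, normalizing out the conflict.
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y          # mass assigned to the empty set
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# toy frame: {crop, water}; two sources with partial ignorance
crop, water = frozenset({"crop"}), frozenset({"water"})
theta = crop | water                   # total ignorance
m_optical = {crop: 0.6, theta: 0.4}
m_radar = {crop: 0.3, water: 0.5, theta: 0.2}
print(dempster_combine(m_optical, m_radar))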
