1

Implementing a Class of Permutation Tests: The coin Package

Zeileis, Achim, Wiel, Mark A. van de, Hornik, Kurt, Hothorn, Torsten 11 1900 (has links) (PDF)
The R package coin implements a unified approach to permutation tests providing a huge class of independence tests for nominal, ordered, numeric, and censored data as well as multivariate data at mixed scales. Based on a rich and flexible conceptual framework that embeds different permutation test procedures into a common theory, a computational framework is established in coin that likewise embeds the corresponding R functionality in a common S4 class structure with associated generic functions. As a consequence, the computational tools in coin inherit the flexibility of the underlying theory and conditional inference functions for important special cases can be set up easily. Conditional versions of classical tests - such as tests for location and scale problems in two or more samples, independence in two- or three-way contingency tables, or association problems for censored, ordered categorical or multivariate data - can easily be implemented as special cases using this computational toolbox by choosing appropriate transformations of the observations. The paper gives a detailed exposition of both the internal structure of the package and the provided user interfaces along with examples on how to extend the implemented functionality. (authors' abstract)
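As a rough illustration of the interface described in this abstract (a minimal sketch with hypothetical toy data, not an example taken from the paper), a conditional independence test for a two-sample location problem can be set up with the coin package as follows, with the null distribution approximated by Monte Carlo resampling:

    library(coin)
    ## hypothetical two-sample data: numeric response y, two-level factor g
    set.seed(1)
    dat <- data.frame(y = c(rnorm(10), rnorm(10, mean = 1)),
                      g = factor(rep(c("A", "B"), each = 10)))
    ## conditional independence test; the permutation null distribution
    ## is approximated by resampling
    independence_test(y ~ g, data = dat, distribution = "approximate")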
2

Implementing a Class of Permutation Tests: The coin Package

Hothorn, Torsten, Hornik, Kurt, van de Wiel, Mark A., Zeileis, Achim January 2007 (has links) (PDF)
The R package coin implements a unified approach to permutation tests providing a huge class of independence tests for nominal, ordered, numeric, and censored data as well as multivariate data at mixed scales. Based on a rich and flexible conceptual framework that embeds different permutation test procedures into a common theory, a computational framework is established in coin that likewise embeds the corresponding R functionality in a common S4 class structure with associated generic functions. As a consequence, the computational tools in coin inherit the flexibility of the underlying theory and conditional inference functions for important special cases can be set up easily. Conditional versions of classical tests - such as tests for location and scale problems in two or more samples, independence in two- or three-way contingency tables, or association problems for censored, ordered categorical or multivariate data - can easily be implemented as special cases using this computational toolbox by choosing appropriate transformations of the observations. The paper gives a detailed exposition of both the internal structure of the package and the provided user interfaces. / Series: Research Report Series / Department of Statistics and Mathematics
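Continuing the sketch above (still a hedged illustration, using the hypothetical data frame dat constructed earlier), one of the classical special cases the abstract mentions, the Wilcoxon-Mann-Whitney test, is available directly in coin, here with its exact conditional null distribution:

    ## exact conditional Wilcoxon-Mann-Whitney test, a special case of the
    ## general framework (equivalent to a rank transformation of y)
    wilcox_test(y ~ g, data = dat, distribution = "exact")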
3

Rare Events Simulations with Applications to the Performance Evaluation of Wireless Communication Systems

Ben Rached, Nadhir 08 October 2018 (has links)
The probability that a sum of random variables (RVs) exceeds (respectively falls below) a given threshold is often encountered in the performance analysis of wireless communication systems. Generally, a closed-form expression for the sum distribution does not exist, and a naive Monte Carlo (MC) simulation is computationally expensive when dealing with rare events. An alternative approach is to use variance reduction techniques, known for requiring fewer computations to achieve the same accuracy. For the right-tail region, we develop a unified hazard rate twisting importance sampling (IS) technique that has the advantage of being logarithmically efficient for arbitrary distributions under the independence assumption. A further improvement of this technique is then developed wherein the twisting is applied only to the components having more impact on the probability of interest than the others. Another challenging problem arises when the components are correlated and follow the Log-normal distribution. In this setting, we develop a generalized hybrid IS scheme based on mean shifting and covariance matrix scaling techniques, and we prove that logarithmic efficiency holds again for two particular instances. We also propose two unified IS approaches to estimate the left tail of sums of independent positive RVs. The first applies to arbitrary distributions and enjoys the logarithmic efficiency criterion, whereas the second satisfies the bounded relative error criterion under a mild assumption but is only applicable to the case of independent and identically distributed RVs. The left tail of correlated Log-normal variates is also considered. In fact, we construct an estimator combining an existing mean shifting IS approach with a control variate technique and prove that it possesses the asymptotically vanishing relative error property. A further interesting problem is the left-tail estimation of sums of ordered RVs. Two estimators are presented. The first is based on IS and achieves bounded relative error under a mild assumption. The second is based on a conditional MC approach and achieves the bounded relative error property in the Generalized Gamma case and logarithmic efficiency in the Log-normal case.
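The hazard rate twisting scheme of the thesis is not reproduced here, but the flavor of twisting-based IS can be sketched in a simple special case: estimating P(S_n > gamma) for a sum of n i.i.d. Exp(lambda) variables via exponential twisting, with the twisting parameter chosen so that the rare event becomes typical under the sampling distribution. All parameter values below are illustrative, not from the thesis.

    ## Exponentially twisted IS for P(S_n > gamma), S_n = sum of n iid Exp(lambda).
    ## Twisting by theta turns Exp(lambda) into Exp(lambda - theta); the
    ## likelihood ratio is (lambda / (lambda - theta))^n * exp(-theta * S_n).
    set.seed(1)
    n <- 10; lambda <- 1; gamma <- 40; R <- 1e5   # E[S_n] = 10, so the event is rare
    theta <- lambda - n / gamma                   # makes E_theta[S_n] = gamma
    S  <- colSums(matrix(rexp(n * R, rate = lambda - theta), nrow = n))
    lr <- (lambda / (lambda - theta))^n * exp(-theta * S)
    Z  <- (S > gamma) * lr                        # unbiased IS estimator per sample
    c(estimate = mean(Z), relative.error = sd(Z) / (sqrt(R) * mean(Z)))

A crude MC estimator at this sample size would almost surely return zero; the twisted estimator returns a positive estimate with a small relative error.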
4

Numerical methods for foreign exchange option pricing under hybrid stochastic and local volatility models

Cozma, Andrei January 2017 (has links)
In this thesis, we study the FX option pricing problem and put forward a 4-factor hybrid stochastic-local volatility model. The model, which describes the dynamics of an exchange rate, its volatility, and the domestic and foreign short rates, allows for perfect calibration to European options and has good hedging performance. Due to the high dimensionality of the problem, we propose a Monte Carlo simulation scheme that combines the full truncation Euler scheme for the stochastic volatility component and the stochastic short rates with the log-Euler scheme for the exchange rate. We analyze exponential integrability properties of Euler discretizations for the square-root process driving the stochastic volatility and the short rates, properties which play a key role in establishing the finiteness of moments and the strong convergence of numerical approximations for a large class of stochastic differential equations in finance, including the ones studied in this thesis. Hence, we prove the strong convergence of the exchange rate approximations and the convergence of Monte Carlo estimators for a number of vanilla and exotic options. Then, we calibrate the model to market data and discuss its fitness for pricing FX options. Next, due to the relatively slow convergence of the Monte Carlo method in the number of simulations, we examine a variance reduction technique obtained by mixing Monte Carlo and finite difference methods via conditioning. We consider a purely stochastic version of the model and price vanilla and exotic options by simulating the paths of the volatility and the short rates, and then evaluating the "inner" Black-Scholes-type expectation by means of a partial differential equation. We prove the convergence of numerical approximations and carry out a theoretical variance reduction analysis. Finally, we illustrate the efficiency of the method through a detailed quantitative assessment.
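The full truncation Euler scheme mentioned above can be sketched for a single square-root (CIR-type) path: the discretized process is allowed to go negative, but its positive part is taken inside the drift and diffusion coefficients. This is a minimal sketch with hypothetical parameter values, not the thesis's 4-factor implementation.

    ## Full truncation Euler for dV = kappa*(theta - V) dt + sigma*sqrt(V) dW:
    ## the iterate may become negative, but the coefficients only ever see
    ## its positive part.
    full_truncation_euler <- function(v0, kappa, theta, sigma, Tend, nsteps) {
      dt <- Tend / nsteps
      v <- numeric(nsteps + 1)
      v[1] <- v0
      for (k in 1:nsteps) {
        vplus <- max(v[k], 0)                # truncation inside the coefficients
        v[k + 1] <- v[k] + kappa * (theta - vplus) * dt +
          sigma * sqrt(vplus) * sqrt(dt) * rnorm(1)
      }
      v
    }
    vol_path <- full_truncation_euler(v0 = 0.04, kappa = 1.5, theta = 0.04,
                                      sigma = 0.5, Tend = 1, nsteps = 252)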
5

Advanced Monte Carlo Methods with Applications in Finance

Joshua Chi Chun Chan Unknown Date (has links)
The main objective of this thesis is to develop novel Monte Carlo techniques with emphasis on various applications in finance and economics, particularly in the fields of risk management and asset returns modeling. New stochastic algorithms are developed for rare-event probability estimation, combinatorial optimization, parameter estimation and model selection. The contributions of this thesis are fourfold. Firstly, we study an NP-hard combinatorial optimization problem, the Winner Determination Problem (WDP) in combinatorial auctions, where buyers can bid on bundles of items rather than bidding on them sequentially. We present two randomized algorithms, namely, the cross-entropy (CE) method and the ADAptive Multilevel splitting (ADAM) algorithm, to solve two versions of the WDP. Although an efficient deterministic algorithm has been developed for one version of the WDP, it is not applicable to the other version considered. In addition, the proposed algorithms are straightforward and easy to program, and do not require specialized software. Secondly, two major applications of conditional Monte Carlo for estimating rare-event probabilities are presented: a complex bridge network reliability model and several generalizations of the widely used normal copula model for managing portfolio credit risk. We show how certain efficient conditional Monte Carlo estimators developed for simple settings can be extended to handle complex models involving hundreds or thousands of random variables. In particular, by utilizing an asymptotic description of how the rare event occurs, we derive algorithms that are not only easy to implement, but also compare favorably to existing estimators. Thirdly, we make a contribution at the methodological front by proposing an improvement of the standard CE method for estimation. The improved method is relevant, as recent research has shown that in some high-dimensional settings the likelihood ratio degeneracy problem becomes severe and the importance sampling estimator obtained from the CE algorithm becomes unreliable. In contrast, the performance of the improved variant does not deteriorate as the dimension of the problem increases. Its utility is demonstrated via a high-dimensional estimation problem in risk management, namely, a recently proposed t-copula model for credit risk. We show that even in this high-dimensional model involving hundreds of random variables, the proposed method performs remarkably well, and compares favorably to existing importance sampling estimators. Furthermore, the improved CE algorithm is applied to estimating the marginal likelihood, a quantity that is fundamental in Bayesian model comparison and Bayesian model averaging. We present two empirical examples to demonstrate the proposed approach. The first example involves women's labor market participation, where we compare three different binary response models to find the one that best fits the data. The second example utilizes two vector autoregressive (VAR) models to analyze the interdependence and structural stability of four U.S. macroeconomic time series: GDP growth, unemployment rate, interest rate, and inflation. Lastly, we contribute to the growing literature of asset returns modeling by proposing several novel models that explicitly take into account various recent findings in the empirical finance literature. Specifically, two classes of stylized facts are particularly important. The first set is concerned with the marginal distributions of asset returns. 
One prominent feature of asset returns is that the tails of their distributions are heavier than those of the normal distribution - large returns (in absolute value) occur much more frequently than one would expect from a normally distributed random variable. Another robust empirical feature of asset returns is skewness, where the tails of the distributions are not symmetric - large losses are observed more frequently than large gains. The second set of stylized facts is concerned with the dependence structure among asset returns. Recent empirical studies have cast doubts on the adequacy of the linear dependence structure implied by the multivariate normal specification. For example, data from various asset markets, including equities, currencies and commodities markets, indicate the presence of extreme co-movement in asset returns, and this observation is again incompatible with the usual assumption that asset returns are jointly normally distributed. In light of the aforementioned empirical findings, we consider various novel models that generalize the usual normal specification. We develop efficient Markov chain Monte Carlo (MCMC) algorithms to estimate the proposed models. Moreover, since the number of plausible models is large, we perform a formal Bayesian model comparison to determine the model that best fits the data. In this way, we can directly compare the two approaches to modeling asset returns: copula models and the joint modeling of returns.
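The conditional Monte Carlo idea described in this abstract can be sketched in a toy setting (an illustrative sketch, not one of the thesis's models): to estimate P(X + Y > gamma) for independent Exp(1) variables, X is integrated out analytically given Y, so only Y needs to be simulated and the indicator's variance in the X coordinate disappears.

    ## Conditional Monte Carlo for P(X + Y > gamma), X, Y iid Exp(1).
    ## Given Y = y, P(X > gamma - y) = exp(-max(gamma - y, 0)) is known in
    ## closed form, so only Y is simulated.
    set.seed(1)
    gamma <- 20; R <- 1e5
    y <- rexp(R)
    cmc <- pexp(gamma - y, lower.tail = FALSE)    # equals 1 when y >= gamma
    crude <- (rexp(R) + rexp(R)) > gamma          # crude MC: almost surely all zeros
    c(conditional = mean(cmc), crude = mean(crude),
      truth = (1 + gamma) * exp(-gamma))          # Gamma(2,1) tail, ~4.3e-8

At this sample size the crude estimator returns zero, while the conditional estimator produces a positive, low-variance estimate close to the closed-form value.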
