611 |
Some questions in risk management and high-dimensional data analysis. Wang, Ruodu, 04 May 2012 (has links)
This thesis addresses three topics in statistics and probability, with applications in risk management. First, for testing problems in high-dimensional (HD) data analysis, we present a novel method for formulating empirical likelihood tests and jackknife empirical likelihood tests by splitting the sample into subgroups. New tests are constructed for the equality of two HD means, the coefficients in HD linear models, and HD covariance matrices. Second, we propose jackknife empirical likelihood methods for constructing interval estimates of important quantities in actuarial science and risk management, such as risk-distortion measures, Spearman's rho, and parametric copulas. Lastly, we introduce the theory of completely mixable (CM) distributions. We give properties of CM distributions, show that a few classes of distributions are CM, and use the new technique to find bounds for the sum of individual risks with given marginal distributions but unspecified dependence structure. The result partially solves a problem that had been a challenge for decades, and directly leads to bounds on quantities of interest in risk management, such as the variance, the stop-loss premium, the price of European options, and the Value-at-Risk associated with a joint portfolio.
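As background for the jackknife empirical likelihood methods described above, the sketch below (Python; not from the thesis) shows the standard construction of jackknife pseudo-values, to which ordinary empirical likelihood for a mean is then applied. The statistic and data used here are illustrative assumptions.

```python
import numpy as np

def jackknife_pseudo_values(x, statistic):
    """Jackknife pseudo-values V_i = n*T(x) - (n-1)*T(x with i-th point removed).

    Standard empirical likelihood for a mean can then be applied to the
    pseudo-values, which is the core idea behind jackknife empirical likelihood.
    """
    n = len(x)
    theta_full = statistic(x)
    pseudo = np.empty(n)
    for i in range(n):
        theta_loo = statistic(np.delete(x, i))      # leave-one-out estimate
        pseudo[i] = n * theta_full - (n - 1) * theta_loo
    return pseudo

# Illustrative statistic: Gini mean difference (a U-statistic), illustrative data
rng = np.random.default_rng(0)
x = rng.lognormal(size=200)
gini_mean_diff = lambda s: np.abs(s[:, None] - s[None, :]).sum() / (len(s) * (len(s) - 1))
V = jackknife_pseudo_values(x, gini_mean_diff)
print(V.mean(), gini_mean_diff(x))   # the pseudo-value mean approximates the statistic
```

The mean of the pseudo-values is the usual jackknife estimate, and treating the pseudo-values as approximately independent observations is what makes the subsequent empirical likelihood step tractable for nonlinear statistics.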
|
612 |
Likelihood ratio tests of separable or double separable covariance structure, and the empirical null distribution. Gottfridsson, Anneli, January 2011 (has links)
The focus in this thesis is on the calculation of an empirical null distribution for likelihood ratio tests testing either separable or double separable covariance matrix structures versus an unstructured covariance matrix. These calculations have been performed for various dimensions and sample sizes, and are compared with the asymptotic χ2-distribution that is commonly used as an approximate distribution. Tests of separable structures are of particular interest in cases when data are collected such that more than one relation between the components of the observation is suspected. For instance, if there are both a spatial and a temporal aspect, a hypothesis of two covariance matrices, one for each aspect, is reasonable.
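The thesis builds empirical null distributions for separable-versus-unstructured tests; the sketch below (Python; not from the thesis) illustrates the general recipe on a simpler likelihood ratio test of a fully specified covariance matrix against an unstructured one, with a Kronecker (separable) matrix playing the role of the null covariance. The statistic, dimensions and sample size are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def lrt_specified_cov(X, sigma0):
    """-2 log Lambda for H0: Sigma = sigma0 (known) vs. an unstructured Sigma."""
    n, p = X.shape
    S = np.cov(X, rowvar=False, bias=True)            # MLE of the covariance
    A = S @ np.linalg.inv(sigma0)
    return n * (np.trace(A) - np.log(np.linalg.det(A)) - p)

def empirical_null(stat, sigma0, n, reps=5000, seed=1):
    """Draw samples from N(0, sigma0) and recompute the statistic each time."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(sigma0)
    p = sigma0.shape[0]
    return np.array([stat(rng.standard_normal((n, p)) @ L.T, sigma0)
                     for _ in range(reps)])

# Illustrative separable (Kronecker) covariance: spatial x temporal
sigma0 = np.kron([[1.0, 0.5], [0.5, 1.0]],
                 [[1.0, 0.3], [0.3, 1.0]])
null_draws = empirical_null(lrt_specified_cov, sigma0, n=25)

df = sigma0.shape[0] * (sigma0.shape[0] + 1) // 2     # chi-square degrees of freedom
print("empirical 95% critical value:", np.quantile(null_draws, 0.95))
print("asymptotic chi2 approximation:", stats.chi2.ppf(0.95, df))
```

For small sample sizes the empirical critical value typically exceeds the asymptotic chi-square value, which is exactly the gap the empirical null distribution is meant to quantify.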
|
613 |
Cornish-Fisher Expansion and Value-at-Risk method in application to risk management of large portfolios. Sjöstrand, Maria; Aktaş, Özlem, January 2011 (has links)
One of the major problems faced by banks is how to manage the risk exposure in large portfolios. According to the Basel II regulation, banks have to measure the risk using Value-at-Risk at a 99% confidence level. However, this regulation does not specify how to calculate Value-at-Risk. The easiest way to calculate Value-at-Risk is to assume that portfolio returns are normally distributed. Although this is the most common way to calculate Value-at-Risk, other methods also exist. The previous crisis showed that the standard methods are unfortunately not always enough to prevent bankruptcy. This paper compares the classical methods of estimating risk with other methods, such as the Cornish-Fisher expansion (CFVaR) and the assumption of a generalized hyperbolic distribution. To carry out this study, we estimate the risk in a large portfolio consisting of ten stocks. These stocks are chosen from the NASDAQ 100 list in order to have highly liquid stocks (blue chips). The stocks are chosen from different sectors to make the portfolio well diversified. To investigate the impact of dependence between the stocks in the portfolio, we remove the two most correlated stocks and consider the resulting eight-stock portfolio as well. In both portfolios we assign equal weights to the included stocks. The results show that for a well-diversified large portfolio none of the risk measures are violated. However, for a portfolio consisting of only one highly volatile stock, we show that the classical methods are violated but the modern methods mentioned above are not.
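A minimal sketch (Python; not from the paper) of the two parametric calculations being compared: normal VaR and Cornish-Fisher (modified) VaR, in which the normal quantile is adjusted for sample skewness and excess kurtosis. The simulated fat-tailed returns stand in for the stock data used in the paper.

```python
import numpy as np
from scipy import stats

def normal_var(returns, alpha=0.99):
    """Parametric VaR assuming normal returns; the loss is reported as a positive number."""
    mu, sigma = returns.mean(), returns.std(ddof=1)
    return -(mu + stats.norm.ppf(1 - alpha) * sigma)

def cornish_fisher_var(returns, alpha=0.99):
    """Modified VaR: the normal quantile is corrected for skewness and excess kurtosis."""
    mu, sigma = returns.mean(), returns.std(ddof=1)
    s = stats.skew(returns)
    k = stats.kurtosis(returns)                 # excess kurtosis
    z = stats.norm.ppf(1 - alpha)
    z_cf = (z
            + (z**2 - 1) * s / 6
            + (z**3 - 3 * z) * k / 24
            - (2 * z**3 - 5 * z) * s**2 / 36)
    return -(mu + z_cf * sigma)

# Illustrative fat-tailed daily returns (Student-t), not actual stock data
rng = np.random.default_rng(42)
r = 0.01 * rng.standard_t(df=4, size=1000)
print("99% normal VaR:        ", normal_var(r))
print("99% Cornish-Fisher VaR:", cornish_fisher_var(r))
```

With heavy-tailed returns the Cornish-Fisher quantile is pushed further into the tail, so the modified VaR exceeds the normal VaR, which is the effect the comparison in the paper examines.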
|
614 |
Provisions estimation for portfolio of CDO in Gaussian financial environment. Maximchuk, Oleg; Volkov, Yury, January 2011 (has links)
The problem of managing portfolio provisions is of great importance for any financial institution. In this paper we provide both static and dynamic models of provisions estimation: for the case when the decision about provisions is made at the initial time in the absence of information, and for the cases of complete and incomplete information. A hedging strategy for a defaultable market is also presented as another tool for reducing the risk of default. The default time is modelled as the first-passage time of a standard Brownian motion through a deterministic barrier. Some methods of numerical provision estimation are also presented.
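A minimal sketch (Python; not from the paper) of the default-time model mentioned above: the first-passage time of a standard Brownian motion through a barrier is simulated on a time grid, and for a constant barrier b the Monte Carlo default probability can be checked against the reflection-principle formula P(tau <= t) = 2*Phi(-b/sqrt(t)). The barrier level and horizon are illustrative assumptions, and discrete monitoring slightly underestimates the crossing probability.

```python
import numpy as np
from scipy import stats

def first_passage_times(barrier, T=1.0, n_steps=2000, n_paths=20000, seed=0):
    """Simulate standard Brownian motion paths and record the first time each
    path crosses the (deterministic) barrier; np.inf if it never does."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    t = np.linspace(dt, T, n_steps)
    increments = rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt)
    W = np.cumsum(increments, axis=1)
    crossed = W >= barrier(t)                     # barrier evaluated on the grid
    first_idx = crossed.argmax(axis=1)            # index of the first crossing
    return np.where(crossed.any(axis=1), t[first_idx], np.inf)

b = 1.0                                           # illustrative constant barrier
tau = first_passage_times(lambda t: b)
print("MC default probability P(tau <= 1):", np.mean(tau <= 1.0))
print("closed form 2*Phi(-b/sqrt(1)):     ", 2 * stats.norm.cdf(-b))
```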
|
615 |
On an epidemic model given by a stochastic differential equation. Zararsiz, Zarife, January 2009 (has links)
We investigate a certain epidemic model, with and without noise. Some parameter analysis is performed, together with computer simulations. The model was presented in Iacus (2008).
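As one concrete example of the kind of model treated in Iacus (2008), the sketch below (Python) simulates a stochastic SIS model with the Euler-Maruyama scheme; the specific model studied in the thesis may differ, and the parameters here are illustrative assumptions. Setting sigma = 0 recovers the noise-free case.

```python
import numpy as np

def simulate_sis_sde(beta, gamma, sigma, I0=0.01, T=50.0, n_steps=5000, seed=0):
    """Euler-Maruyama simulation of a stochastic SIS model for the infected
    fraction I_t:  dI = (beta*I*(1 - I) - gamma*I) dt + sigma*I*(1 - I) dW."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    I = np.empty(n_steps + 1)
    I[0] = I0
    for k in range(n_steps):
        drift = beta * I[k] * (1 - I[k]) - gamma * I[k]
        diffusion = sigma * I[k] * (1 - I[k])
        I[k + 1] = I[k] + drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal()
        I[k + 1] = min(max(I[k + 1], 0.0), 1.0)   # keep the fraction in [0, 1]
    return I

path_noise = simulate_sis_sde(beta=0.6, gamma=0.3, sigma=0.15)
path_deterministic = simulate_sis_sde(beta=0.6, gamma=0.3, sigma=0.0)
print(path_noise[-1], path_deterministic[-1])     # endemic level near 1 - gamma/beta = 0.5
```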
|
616 |
Change Point Estimation for Stochastic Differential Equations. Yalman, Hatice, January 2009 (has links)
A stochastic differential equation driven by a Brownian motion, where the dispersion is determined by a parameter, is considered. The parameter undergoes a change at a certain time point. Estimates of the change point, and of the parameter before and after that time, are considered. The estimates were presented in Iacus (2008). Two cases are considered: (1) the drift is known, (2) the drift is unknown and the dispersion is space-independent. Applications to the Dow Jones index 1971-1974 and Goldman Sachs closing prices 2005 to May 2009 are given.
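A minimal sketch (Python; not from the thesis) of change-point estimation for the dispersion of dX_t = sigma(t) dW_t: squared increments divided by the step size have mean sigma(t)^2, so a least-squares split into two constant levels locates the change and gives the dispersion estimates before and after it. This is one simple estimator, not necessarily the one presented in Iacus (2008); the simulated path is an illustrative assumption.

```python
import numpy as np

def dispersion_change_point(x, dt):
    """Least-squares change-point estimate for a dispersion change:
    z_i = (x_{i+1} - x_i)^2 / dt has mean sigma^2, so find the split
    that best fits two constant levels."""
    z = np.diff(x) ** 2 / dt
    n = len(z)
    best_k, best_cost = None, np.inf
    for k in range(1, n - 1):
        left, right = z[:k], z[k:]
        cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if cost < best_cost:
            best_k, best_cost = k, cost
    sigma1 = np.sqrt(z[:best_k].mean())
    sigma2 = np.sqrt(z[best_k:].mean())
    return best_k, sigma1, sigma2

# Illustrative path: sigma jumps from 0.5 to 1.5 halfway through
rng = np.random.default_rng(7)
dt, n = 1 / 252, 1000
sigma = np.where(np.arange(n) < n // 2, 0.5, 1.5)
x = np.cumsum(sigma * np.sqrt(dt) * rng.standard_normal(n))
print(dispersion_change_point(x, dt))   # change point near 500, sigmas near 0.5 and 1.5
```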
|
617 |
Some recent simulation techniques of diffusion bridge. Sekerci, Yadigar, January 2009 (has links)
We apply some recent numerical methods for diffusion bridges described in Iacus (2008). One is an approximate scheme from Bladt and Sørensen (2007); another, from Beskos et al. (2006), is an exact algorithm with no numerical error at the given grid points.
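Neither the Bladt-Sørensen scheme nor the exact algorithm of Beskos et al. is reproduced here. As context, the sketch below (Python; illustrative assumptions throughout) shows only the naive rejection approach that such methods improve on: forward Euler-Maruyama paths are simulated and a path is kept if it ends close to the required terminal value.

```python
import numpy as np

def naive_diffusion_bridge(drift, sigma, a, b, T=1.0, n_steps=100,
                           eps=0.05, max_tries=200000, seed=0):
    """Crude rejection sampler for a diffusion bridge from a to b: simulate
    forward Euler-Maruyama paths started at a and keep the first one ending
    within eps of b. This is only the baseline that modern schemes improve on."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    for _ in range(max_tries):
        x = np.empty(n_steps + 1)
        x[0] = a
        for k in range(n_steps):
            x[k + 1] = (x[k] + drift(x[k]) * dt
                        + sigma(x[k]) * np.sqrt(dt) * rng.standard_normal())
        if abs(x[-1] - b) < eps:
            return x
    raise RuntimeError("no path accepted; widen eps or increase max_tries")

# Illustrative Ornstein-Uhlenbeck bridge from 0 to 1 on [0, 1]
bridge = naive_diffusion_bridge(drift=lambda x: -x, sigma=lambda x: 0.5,
                                a=0.0, b=1.0)
print(bridge[0], bridge[-1])
```

The acceptance probability of this baseline collapses as eps shrinks or the endpoint becomes unlikely, which is precisely why the approximate and exact bridge constructions cited above are needed.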
|
618 |
Reliability-based structural design: a case of aircraft floor grid layout optimization. Chen, Qing, 07 January 2011 (has links)
In this thesis, several Reliability-Based Design Optimization (RBDO) methods and algorithms for airplane floor grid layout optimization are proposed. A general RBDO process is proposed and validated by an example. Copulas are introduced as a mathematical tool for modelling correlations between random variables, both to discover the correlations and to produce correlated data samples for Monte Carlo simulations. Based on the Hasofer-Lind (HL) method, a correlated HL method is proposed to evaluate the reliability index under correlation. As an alternative approach, the reliability index is interpreted as the solution of an optimization problem, and two nonlinear programming algorithms are introduced to evaluate it. To evaluate the reliability index by Monte Carlo simulation in a time-efficient way, a kriging-based surrogate model is proposed and compared to the original model in terms of computing time. Since the reliability constraint obtained by Monte Carlo simulation does not have an analytical form in RBDO optimization models, a kriging-based response surface is built. Kriging-based response surface models are usually piecewise functions without a uniform expression over the design space, whereas most optimization algorithms require a uniform expression for constraints. To solve this problem, a heuristic gradient-based direct search algorithm is proposed. These methods and algorithms, together with the general RBDO process, are applied to the layout optimization of the aircraft floor grid structural design.
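A minimal sketch (Python; not from the thesis) of the "reliability index as an optimization problem" interpretation mentioned above: in standard normal space the Hasofer-Lind index beta is the distance from the origin to the nearest point of the limit-state surface g(u) = 0, found here with a general-purpose nonlinear programming routine. The limit-state function is an illustrative assumption.

```python
import numpy as np
from scipy.optimize import minimize

def hasofer_lind_beta(g, u0):
    """Reliability index beta = min ||u|| subject to g(u) = 0, solved in
    standard normal (u-) space with SLSQP; u0 is the starting point."""
    res = minimize(lambda u: np.linalg.norm(u), u0, method="SLSQP",
                   constraints={"type": "eq", "fun": g})
    return np.linalg.norm(res.x), res.x   # beta and the design point

# Illustrative linear limit state g(u) = 3 - u1 - u2, whose exact beta is 3/sqrt(2)
g = lambda u: 3.0 - u[0] - u[1]
beta, u_star = hasofer_lind_beta(g, u0=np.array([1.0, 1.0]))
print(beta, 3 / np.sqrt(2))              # both about 2.12
```

Handling correlated or non-normal variables, as in the correlated HL method of the thesis, additionally requires a transformation of the physical variables into independent standard normals before this optimization is performed.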
|
619 |
Empirical likelihood and extremes. Gong, Yun, 17 January 2012 (has links)
In 1988, Owen introduced empirical likelihood as a nonparametric method for constructing confidence intervals and regions. Since then, empirical likelihood has been studied extensively in the literature due to its generality and effectiveness. It is well known that empirical likelihood has several attractive advantages compared with competitors such as the bootstrap: it determines the shape of confidence regions automatically using only the data; it straightforwardly incorporates side information expressed through constraints; and it is Bartlett correctable. The main part of this thesis extends the empirical likelihood method to several interesting and important statistical inference situations. The thesis has four components. The first component (Chapter II) proposes a smoothed jackknife empirical likelihood method to construct confidence intervals for the receiver operating characteristic (ROC) curve, in order to overcome the computational difficulty posed by nonlinear constraints in the maximization problem. The second component (Chapters III and IV) proposes smoothed empirical likelihood methods to obtain interval estimates of the conditional Value-at-Risk when the volatility model is an ARCH/GARCH model or a nonparametric regression, respectively, with applications in financial risk management. The third component (Chapter V) derives the empirical likelihood for intermediate quantiles, which play an important role in the statistics of extremes. Finally, the fourth component (Chapters VI and VII) presents two additional results: in Chapter VI, we show that, when the third moment is infinite, the Student's t-statistic may be preferable to the sample mean standardized by the true standard deviation; in Chapter VII, we present a method for testing a subset of parameters for a given parametric model of stationary processes.
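As background for the empirical likelihood extensions above, a minimal sketch (Python; not from the thesis) of Owen's empirical likelihood ratio for a univariate mean: the Lagrange multiplier is found by root-finding, and -2 log R(mu) is compared with a chi-square(1) quantile. The data and the tested value mu0 are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_log_ratio(x, mu):
    """-2 log R(mu) for the mean, following Owen: weights p_i are proportional
    to 1 / (1 + lam*(x_i - mu)), with lam chosen so the weighted mean equals mu."""
    d = x - mu
    if d.min() >= 0 or d.max() <= 0:
        return np.inf                                  # mu outside the convex hull
    f = lambda lam: np.sum(d / (1 + lam * d))
    lo = -1 / d.max() + 1e-10                          # keep all 1 + lam*d_i positive
    hi = -1 / d.min() - 1e-10
    lam = brentq(f, lo, hi)
    return 2 * np.sum(np.log1p(lam * d))

rng = np.random.default_rng(3)
x = rng.exponential(size=100)
mu0 = 1.0                                              # candidate mean (the true value here)
print(el_log_ratio(x, mu0) <= chi2.ppf(0.95, df=1))    # inside the 95% EL interval?
```

The smoothed and jackknife variants in the thesis keep this same profile-likelihood structure but replace the raw observations with smoothed or pseudo-value quantities so that the constraints remain tractable.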
|