21 
Optimal martingale measures and hedging in models driven by Lévy processes. Kollar, Jozef. January 2011.
Our research falls into the broad area of pricing and hedging of contingent claims in incomplete markets. In the first part we introduce Lévy processes as a suitable class of processes for financial modelling purposes. Their use in general renders the market incomplete, and therefore the martingale measure for pricing/hedging purposes has to be chosen by introducing some subjective criteria. We study several such criteria in the second section for a general stochastic volatility model driven by a Lévy process, leading to the minimal martingale measure, the variance-optimal martingale measure, or the more general q-optimal martingale measure, for which we show convergence to the minimal entropy martingale measure as q → 1. The martingale measures studied in the second section are put to use in the third section, where we consider various hedging problems in both the martingale and the semimartingale setting. We study the locally risk-minimizing hedging problem, mean-variance hedging and the more general p-optimal hedging, of which mean-variance hedging is the special case p = 2. Our model allows us to explicitly determine the variance-optimal martingale measure and the mean-variance hedging strategy using the structural results of Gourieroux, Laurent and Pham (1998), extended to the discontinuous case by Arai (2005a). Assuming a Markovian framework and appealing to the Feynman-Kac theorem, the optimal hedge can be found by solving a three-dimensional partial integro-differential equation. We illustrate this in the last section by considering the variance-optimal hedge of a European put option, and find the solution numerically by applying a finite difference method.
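The abstract itself contains no code; as a hedged illustration of the kind of Lévy-driven asset model it discusses, the following Python sketch simulates a Merton jump-diffusion (Brownian motion plus compound Poisson jumps with Gaussian log-sizes, one of the simplest Lévy models) and computes the raw average payoff of a European put by Monte Carlo. All parameter values and names are illustrative assumptions, not taken from the thesis, and no discounting or change of measure is performed.

```python
import numpy as np

def simulate_merton_paths(s0=100.0, mu=0.05, sigma=0.2,
                          jump_rate=0.5, jump_mean=-0.1, jump_sd=0.15,
                          T=1.0, n_steps=252, n_paths=1000, seed=0):
    """Euler scheme for log-prices under a Merton jump-diffusion:
    Brownian motion plus compound Poisson jumps with Gaussian log-sizes.
    All parameter values are illustrative, not calibrated."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    log_s = np.full(n_paths, np.log(s0))
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)
        n_jumps = rng.poisson(jump_rate * dt, n_paths)
        # sum of n_jumps i.i.d. N(jump_mean, jump_sd^2) log-jump sizes
        jumps = rng.normal(jump_mean * n_jumps, jump_sd * np.sqrt(n_jumps))
        log_s += (mu - 0.5 * sigma**2) * dt + sigma * dw + jumps
    return np.exp(log_s)

paths = simulate_merton_paths()
put_payoff = np.maximum(95.0 - paths, 0.0)  # undiscounted European put payoff
print(round(put_payoff.mean(), 4))
```

A full variance-optimal hedge, as in the thesis, would instead require the PIDE machinery described above; this sketch only provides the underlying price model.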

22 
Nonparametric competing risks with multivariate frailty models. Masuadi, E. January 2013.
This research focuses on two theories: (i) competing risks and (ii) random-effect (frailty) models. The theory of competing risks provides a structure for inference in problems where cases are subject to several types of failure. Random effects in competing risk models involve two underlying distributions: the conditional distribution of the response variables given the random effect, depending on the explanatory variables, each with a failure-type-specific random effect; and the distribution of the random effect itself. In this situation, the distribution of interest is the unconditional distribution of the response variable, which may or may not have a tractable form. Parametric competing risk models, in which the failure times are assumed to come from a known distribution such as the Weibull or Gamma, are widely used. The Gamma distribution has been widely used as a frailty distribution, perhaps due to its simplicity, since it yields a closed-form expression for the unconditional hazard function. However, it is unrealistic to believe that a few parametric models are suitable for all types of failure time. This research focuses on a distribution-free treatment of multivariate frailty models. Another approach used to overcome this problem is a finite mixture of parametric frailties, especially those that have a closed form of the unconditional survival function. In addition, the advantages and disadvantages of parametric competing risk models with multivariate parametric and/or nonparametric frailty (correlated random effects) are investigated. In this research, four main models are proposed: first, an application of a new computation and analysis of a multivariate frailty with competing risk model using a Cholesky decomposition of the log-normal frailty. Second, a correlated Inverse Gaussian frailty in the presence of competing risks. Third, a nonparametric multivariate frailty with a parametric competing risk model.
Finally, a simulation study of a finite mixture of Inverse Gaussian frailties showed the ability of this model to fit different frailty distributions. One main issue in multivariate analysis is the time needed to fit the model. The proposed nonparametric model showed a significant decrease in the time taken to estimate the model parameters (about 80% less than the log-normal frailty with nested loops). Real data on the recurrence of breast cancer are used as the application of these models.
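The closed-form unconditional survival that makes the Gamma frailty convenient can be shown concretely. Assuming a mean-one Gamma frailty Z with variance θ and a constant baseline hazard λ (both choices are mine, for illustration), the unconditional survival is S(t) = E[exp(−Z Λ(t))] = (1 + θλt)^(−1/θ). The sketch checks the closed form against a Monte Carlo average over the frailty distribution:

```python
import numpy as np

def gamma_frailty_survival(t, theta=0.5, lam=1.0):
    """Closed-form unconditional survival under a mean-one Gamma frailty
    with variance theta and baseline hazard lam, so Lambda(t) = lam * t:
    S(t) = (1 + theta * Lambda(t)) ** (-1 / theta)."""
    return (1.0 + theta * lam * t) ** (-1.0 / theta)

# Monte Carlo check: S(t) = E[exp(-Z * Lambda(t))], Z ~ Gamma(1/theta, theta)
rng = np.random.default_rng(1)
theta, lam, t = 0.5, 1.0, 2.0
z = rng.gamma(shape=1.0 / theta, scale=theta, size=200_000)
mc = float(np.exp(-z * lam * t).mean())
print(round(gamma_frailty_survival(t, theta, lam), 3), round(mc, 3))
```

For frailty distributions without such a closed form (the Inverse Gaussian mixtures above aside), only the Monte Carlo or numerical-integration route is available, which is one motivation for the nonparametric approach the abstract describes.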

23 
Probability, chances and belief. Derksen, A. A. January 1974.
No description available.

24 
Some contributions to filtering theory with applications in financial modelling. Jalen, Luka. January 2009.
Two main groups of filtering algorithms are characterised and developed. Their applicability is demonstrated using actuarial and financial time series data. The first group of algorithms involves hidden Markov models (HMM), where the parameters of an asset price model switch between regimes in accordance with the dynamics of a Markov chain. We start with the known HMM filtering setup and extend the framework to the case where the drift and volatility have independent probabilistic behaviour. In addition, a non-normal noise term is considered, and recursive formulae for the online re-estimation of model parameters are derived for the case of Student's t-distributed noise. A change of reference probability is employed in the construction of the filters. Both extensions are then tested on financial and actuarial data. The second group of filtering algorithms deals with sigma-point filtering techniques. We propose a method to generate sigma points from symmetric multivariate distributions. The algorithm matches the first three moments exactly and the fourth moment approximately; this minimises the worst-case mismatch using a semidefinite programming approach. The sigma-point generation procedure is in turn applied to construct algorithms for the latent state estimation of nonlinear time series models; a numerical demonstration of the procedure's effectiveness is given. Finally, we propose a partially linearised sigma-point filter, an alternative technique for the optimal state estimation of a wide class of nonlinear time series models. In particular, sigma points are employed for generating samples of possible state values, and a linear-programming-based procedure is utilised in the update step of the state simulation. The performance of the filtering technique is then assessed on a simulated, highly nonlinear multivariate interest rate process and is shown to perform significantly better than the extended Kalman filter in terms of computational time.
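The thesis's sigma-point construction matches third and fourth moments via semidefinite programming; those details are not in the abstract, so the sketch below shows only the classical symmetric unscented sigma-point set, which matches the first two moments exactly, as the baseline such methods extend. The `kappa` parameter and point layout follow the standard unscented transform, not the thesis's construction.

```python
import numpy as np

def unscented_sigma_points(mean, cov, kappa=1.0):
    """Standard symmetric sigma-point set: 2n + 1 weighted points whose
    sample mean and covariance reproduce (mean, cov) exactly. The thesis
    goes further, matching the third moment exactly and the fourth
    approximately; this is only the classical baseline."""
    n = len(mean)
    L = np.linalg.cholesky((n + kappa) * cov)  # scaled matrix square root
    points = [mean]
    weights = [kappa / (n + kappa)]
    for i in range(n):
        points.append(mean + L[:, i])
        points.append(mean - L[:, i])
        weights.extend([0.5 / (n + kappa)] * 2)
    return np.array(points), np.array(weights)

mean = np.array([1.0, -2.0])
cov = np.array([[2.0, 0.3], [0.3, 1.0]])
pts, w = unscented_sigma_points(mean, cov)
print(np.allclose(w @ pts, mean))                                    # mean matched
print(np.allclose((pts - mean).T @ np.diag(w) @ (pts - mean), cov))  # cov matched
```

Propagating these points through a nonlinearity and re-averaging gives the unscented approximation to the transformed mean and covariance, the basic operation inside any sigma-point filter.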

25 
Algebraic structures in stochastic differential equations. Curry, Charles. January 2014.
We define a new numerical integration scheme for stochastic differential equations driven by Lévy processes, with uniformly lower mean-square remainder than that of the scheme of the same strong order of convergence obtained by truncating the stochastic Taylor series. In doing so we generalise recent results concerning stochastic differential equations driven by Wiener processes. The aforementioned works studied integration schemes obtained by applying an invertible mapping to the stochastic Taylor series, truncating the resulting series and applying the inverse of the original mapping. The shuffle Hopf algebra and its associated convolution algebra play important roles in their analysis, arising from the combinatorial structure of iterated Stratonovich integrals. It was recently shown that the algebra generated by iterated Itô integrals of independent Lévy processes is isomorphic to a quasi-shuffle algebra. We utilise this to consider map-truncate-invert schemes for Lévy processes. To facilitate this, we derive a new form of stochastic Taylor expansion from those of Wagner and Platen, enabling us to extend existing algebraic encodings of integration schemes. We then derive an alternative method of computing map-truncate-invert schemes using a single step, resolving difficulties encountered at the inversion step in previous methods.
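The shuffle product mentioned above can be made concrete in a few lines (my illustration; the thesis works with the quasi-shuffle generalisation needed for Itô integrals of Lévy processes): the shuffle of two words enumerates all interleavings that preserve each word's internal letter order, mirroring the product rule for iterated Stratonovich integrals indexed by those words.

```python
def shuffle(u, v):
    """All shuffles of words u and v: interleavings preserving the
    internal order of each word. This product governs products of
    iterated Stratonovich integrals; iterated Ito integrals of Levy
    processes obey the quasi-shuffle generalisation instead."""
    if not u:
        return [v]
    if not v:
        return [u]
    return ([u[0] + w for w in shuffle(u[1:], v)] +
            [v[0] + w for w in shuffle(u, v[1:])])

# Single letters shuffle to both orders, mirroring integration by parts:
print(sorted(shuffle("a", "b")))  # → ['ab', 'ba']
# The number of shuffles of words of lengths m and n is C(m+n, m):
print(len(shuffle("ab", "cd")))   # → 6
```

In the quasi-shuffle case an extra "merge" term appears for each pair of letters, reflecting the bracket terms in the Itô product rule; that is the structure the thesis exploits for its map-truncate-invert schemes.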

26 
Topics in population genetics and mathematical evolutionary biology. Xu, Yiyang. January 2015.
Part A studies optimal strategies for seed germination problems where the population has a class structure under a fluctuating environment. In particular, a multidimensional age-class model is studied using a dynamic programming method. Numerical results are computed for the so-called optimal stochastic strategy, which makes use of information about previous environmental states. Comparing the optimal stochastic strategy with the optimal population-based strategy shows that the optimal stochastic strategy is highly effective in general. A potentially useful diffusion approximation for the seed germination problem is also derived, with numerical results. In Part B, a multidimensional Moran model is studied using a diffusion approximation approach. The scaling limit and the corresponding governing stochastic differential equations (SDEs) are derived. An expansion method is used to approximate the stationary distribution of the SDEs. An approximation formula for the effective migration rate is then derived.
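As a hedged companion to Part B, the following sketch simulates the simplest one-locus, two-allele neutral Moran model (the thesis studies a multidimensional version and its diffusion limit; all parameters here are illustrative). A standard property of the neutral model is that an allele's fixation probability equals its initial frequency, which the simulation reproduces approximately:

```python
import numpy as np

def fixation_fraction(n=30, k0=15, n_runs=300, seed=2):
    """Neutral Moran model: at each event one individual is chosen
    uniformly to reproduce and an independent one to die. Tracks the
    count k of allele A among n haploids until absorption at 0 or n and
    returns the fraction of runs in which A fixes; for the neutral
    model this should approximate the initial frequency k0 / n."""
    rng = np.random.default_rng(seed)
    fixed = 0
    for _ in range(n_runs):
        k = k0
        while 0 < k < n:
            p = k / n
            # +1 if the reproducer carries A, -1 if the victim does
            k += int(rng.random() < p) - int(rng.random() < p)
        fixed += (k == n)
    return fixed / n_runs

print(fixation_fraction())  # close to 15/30 = 0.5
```

The diffusion approximation studied in the thesis arises by rescaling time by n² and letting the allele frequency k/n become a continuous state; this discrete simulation is the object being approximated.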

27 
Survival and connection probabilities of open three-dimensional billiards. Rahman, Mohammed Rizwanur. January 2015.
The purpose of the work presented in this thesis was to carry out survival and connection probability analyses in open three-dimensional billiards. The results obtained build on and extend previous work on the analogous two-dimensional counterparts. Typical analyses of such systems use the law of reflection to study the ranges of paths travelled by particles and/or waves that move with constant velocity unless they make contact with the boundary or escape the system. Results are presented for two kinds of system: billiards (the probability that an enclosed particle does not escape the region in question for a minimum measure of time), in our case spherical in shape (hence the spherical billiard problem), and wireless communication channels (connection probabilities within a network of nodes). We let A and B denote the constant and O(1/t) terms in the long-time expansion of the survival probability of the billiard system in question at time t. For the spherical billiard, studied analytically as well as numerically for a circular hole configuration, and at best numerically for a square hole configuration, we find a long-time trend of the form A + B/t. In both hole configurations, analytic expressions for the constant terms in the long-time survival probability expansions have been computed. For the circular hole, the O(1/t) coefficient has a small-ε expansion (ε being the angular displacement from a vector pointing to the centre of the hole to a vector pointing to its boundary), with one of its terms resulting from the Riemann hypothesis, perhaps the greatest unsolved number-theoretic problem. A key extension of the single-hole problem is the multi-hole problem. Regarding the wireless communications work, we find that the connection probability within a network of nodes is approximately exponentially dependent on the absolute distances between nodal pairs and depends on reflections from the channel boundary.
In particular, results for communication channels with convex geometries (from past work) are presented, and it is justified that the corresponding analysis can be used in the study of non-convex geometries (in this work).
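A minimal two-dimensional analogue of the survival problem can be sketched in a few lines: a unit-circle billiard with a hole on the boundary, where each trajectory is a fixed boundary rotation per bounce. The uniform sampling of initial conditions below is an illustrative simplification, not the invariant billiard measure, and the hole width, cut-off on glancing angles and particle counts are my assumptions, so the estimates only indicate the qualitative decay of survival with time, not the thesis's A + B/t coefficients.

```python
import numpy as np

def survival_probability(t, hole_width=0.3, n_particles=4000, seed=3):
    """Monte Carlo survival probability in a unit-circle billiard with a
    boundary hole covering the arc [0, hole_width). Each trajectory
    advances by a fixed boundary rotation delta per bounce and travels a
    chord of length 2*sin(delta/2) between bounces. Initial point and
    rotation are sampled uniformly (an illustrative choice), with
    near-glancing angles excluded to keep chord lengths bounded away
    from zero."""
    rng = np.random.default_rng(seed)
    survived = 0
    for _ in range(n_particles):
        theta = rng.uniform(0.0, 2.0 * np.pi)
        delta = rng.uniform(0.05, 2.0 * np.pi - 0.05)
        step = 2.0 * np.sin(delta / 2.0)  # chord length per bounce
        length, alive = 0.0, True
        while length < t:
            theta = (theta + delta) % (2.0 * np.pi)
            if theta < hole_width:  # bounce point falls in the hole
                alive = False
                break
            length += step
        survived += alive
    return survived / n_particles

for t in (5.0, 20.0, 60.0):
    print(t, survival_probability(t))
```

Survival decays with t but levels off toward a positive constant, driven by near-resonant rotation angles whose orbits avoid the hole for long times; this is the qualitative mechanism behind the constant term A in the expansions above.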

28 
Probabilistic topographic information visualisation. Rice, Iain. January 2015.
The focus of this thesis is the extension of topographic visualisation mappings to allow for the incorporation of uncertainty. Few visualisation algorithms in the literature are capable of mapping uncertain data, and fewer still are able to represent observation uncertainties in visualisations. As such, modifications are made to NeuroScale, Locally Linear Embedding, Isomap and Laplacian Eigenmaps to incorporate uncertainty in the observation and visualisation spaces. The proposed mappings are called Normally-distributed NeuroScale (NNS), T-distributed NeuroScale (TNS), Probabilistic LLE (PLLE), Probabilistic Isomap (PIso) and Probabilistic Weighted Neighbourhood Mapping (PWNM). These algorithms generate a probabilistic visualisation space in which each latent visualised point is transformed to a multivariate Gaussian or T-distribution, using a feed-forward RBF network. Two types of uncertainty are then characterised, dependent on the data and the mapping procedure. Data-dependent uncertainty is the inherent observation uncertainty, whereas mapping uncertainty is defined by the Fisher information of a visualised distribution. The latter indicates how well the data have been interpolated, offering a level of 'surprise' for each observation. These new probabilistic mappings are tested on three datasets of vectorial observations and three datasets of real-world time series observations for anomaly detection. In order to visualise the time series data, a method for analysing observed signals and noise distributions, Residual Modelling, is introduced. The performance of the new algorithms on the tested datasets is compared qualitatively with the latent space generated by the Gaussian Process Latent Variable Model (GPLVM). A quantitative comparison using existing evaluation measures from the literature allows the performance of each mapping function to be compared. Finally, the mapping uncertainty measure is combined with NeuroScale to build a deep learning classifier, the Cascading RBF.
This new structure is tested on the MNIST dataset, achieving world-record performance whilst avoiding the flaws seen in other deep learning machines.
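The Fisher-information notion of mapping uncertainty can be illustrated for the Gaussian case: the Fisher information of N(μ, Σ) with respect to its mean μ is Σ⁻¹, so a widely spread visualised distribution carries little information about where its point belongs. Summarising the information matrix by a log-determinant, as below, is my illustrative choice, not necessarily the thesis's exact definition.

```python
import numpy as np

def mapping_surprise(cov):
    """Fisher information of a Gaussian N(mu, cov) with respect to its
    mean is inv(cov); summarising it by -log det(inv(cov)), which equals
    log det(cov), gives a scalar 'surprise': widely spread visualised
    distributions (poorly pinned-down points) score high. The scalar
    summary is an illustrative choice, not the thesis's definition."""
    info = np.linalg.inv(cov)
    return -np.linalg.slogdet(info)[1]  # equals log det(cov)

tight = np.diag([0.1, 0.1])  # confidently placed visualised point
loose = np.diag([4.0, 4.0])  # uncertain placement
print(mapping_surprise(tight) < mapping_surprise(loose))
```

Ranking observations by such a score is one simple way a probabilistic visualisation could flag candidate anomalies, in the spirit of the anomaly-detection experiments described above.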

29 
Stability of stochastic differential systems. Okonta, Peter Nwanneka. January 1976.
No description available.

30 
The combination of experts' opinions in probability assessment: some theoretical considerations. McConway, Kevin James. January 1978.
No description available.
