41

Stochastic models and methods for the assessment of earthquake risk in insurance

Jiménez-Huerta, Diego January 2009 (has links)
The problem of earthquake risk assessment and management in insurance is a challenging one at the interface of geophysics, engineering seismology, stochastics, insurance mathematics and economics. In this work, I propose stochastic models and methods for the assessment of earthquake risk from an insurer's point of view, where the aim is not to address problems in the financial mathematics and economics of risk selection, pricing, portfolio management, and risk transfer strategies such as reinsurance and securitisation, but to enable the latter through the characterisation of the foundation of any risk management consideration in insurance: the distribution of losses over a period of time for a portfolio of risks. Insurance losses are assumed to be generated by a loss process governed by two components: an earthquake process, a point process marked with each earthquake's hypocentre and magnitude, and a conditional loss distribution for an insurance portfolio, which governs the loss size given the hypocentre and magnitude of the earthquake and the physical characteristics of the portfolio as described in the individual policy records. From the modelling perspective, I examine the (non-trivial) minutiae of the infrastructure underpinning the loss process. A novel model of the earthquake process, a Poisson marked point process with spatial gamma intensity measure on the hypocentral space, and extensions of the Poisson and stress release models through the inclusion of hypocentral location in the mark, are proposed. I discuss the general architectural considerations for constructing the conditional loss distribution, and propose a new model as an alternative to the traditional ground motion attenuation and seismic vulnerability approach in engineering risk assessment. On the actuarial mathematics front, given a fully specified loss process, I address the problem of constructing simulation-based and, where possible, analytical approximations to the distribution of portfolio losses over a period of time. I illustrate the applicability of the stochastic models and methods proposed in this work through the analysis of a residential homeowners property catastrophe portfolio exposed to earthquake risk in California. I construct approximations to the distribution of portfolio losses over a period of time under each of the three models of the earthquake process that I propose, and discuss their relative merits.
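For illustration only, the following sketch simulates an annual portfolio loss distribution from a marked Poisson earthquake process. The occurrence rate, magnitude distribution and conditional loss function are all assumed for the example and are not those of the thesis (which uses a spatial gamma intensity measure on the hypocentral space and a portfolio-specific conditional loss model).

```python
# Minimal sketch (assumed parameters, not the thesis model): a homogeneous Poisson
# earthquake process marked with magnitude, feeding a hypothetical conditional loss
# function for a portfolio, to approximate the annual loss distribution by simulation.
import numpy as np

rng = np.random.default_rng(0)

RATE = 3.0           # assumed mean number of damaging events per year
BETA = np.log(10)    # Gutenberg-Richter-style magnitude decay rate (assumed)
M_MIN = 5.0          # smallest magnitude considered

def simulate_annual_losses(portfolio_value=1e9, n_years=10_000):
    """Monte Carlo approximation of the distribution of annual portfolio losses."""
    losses = np.zeros(n_years)
    for i in range(n_years):
        n_events = rng.poisson(RATE)
        if n_events == 0:
            continue
        # Marks: magnitudes from a shifted exponential (illustrative only).
        mags = M_MIN + rng.exponential(1.0 / BETA, size=n_events)
        # Hypothetical conditional loss given magnitude: the mean damage ratio
        # rises smoothly with magnitude and is perturbed by lognormal noise.
        damage_ratio = 1e-3 / (1.0 + np.exp(-(mags - 7.0)))
        noise = rng.lognormal(mean=0.0, sigma=0.5, size=n_events)
        losses[i] = np.sum(portfolio_value * damage_ratio * noise)
    return losses

annual = simulate_annual_losses()
print("mean annual loss :", round(annual.mean()))
print("99.5% quantile   :", round(np.quantile(annual, 0.995)))
```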
42

The methodology of flowgraph models

Ren, Yu January 2011 (has links)
Flowgraph models are directed graph models for describing the dynamic changes in a stochastic process. They are one class of multistate models that are applied to analyse time-to-event data. The main motivation of flowgraph models is to determine the distribution of the total waiting time until an event of interest occurs in a stochastic process that progresses through various states. This thesis applies the methodology of flowgraph models to the study of Markov and semi-Markov processes. The underlying approach of the thesis is that access to the moment generating function (MGF) and cumulant generating function (CGF), provided by Mason's rule, enables us to use the method of moments (MM), which depends on moments and cumulants. We give a new derivation of Mason's rule to compute the total waiting time MGF based on the internode transition matrix of a flowgraph. Next, we demonstrate methods to determine and approximate the distribution of the total waiting time based on inversion of the MGF, including an alternative approach using the Padé approximation of the MGF, which always yields a closed-form density. For parameter estimation, we extend the Expectation-Maximization (EM) algorithm to estimate parameters in mixtures of exponential densities with negative weights. Our second contribution is to develop a bias correction method for the method of moments (BCMM). By investigating methods for tail area approximation, we propose a new way to estimate the total waiting time density function and survival function.
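As a hypothetical illustration of the internode-transition-matrix view of Mason's rule, the sketch below computes the total-waiting-time MGF of a small three-state flowgraph by solving m(s) = (I - Q(s))^{-1} R(s), where Q(s) collects transmittances among transient states and R(s) the transmittances into the absorbing state; the branch probabilities and exponential rates are made up for the example.

```python
# Minimal sketch (hypothetical example, not the thesis code): the total-waiting-time
# MGF of a small flowgraph computed from its internode transmittance matrix.
# Branch transmittances are p * lambda/(lambda - s) for exponential waiting times.
import numpy as np

# Illustrative illness-death flowgraph: 0 -> 1 -> 2 and 0 -> 2, with state 2 absorbing.
p01, p02 = 0.7, 0.3                    # assumed branch probabilities out of state 0
lam01, lam02, lam12 = 1.0, 0.5, 2.0    # assumed exponential rates

def branch_mgf(lam, s):
    return lam / (lam - s)             # valid for s < lam

def total_mgf(s):
    # Q(s): transmittances among transient states {0, 1}; R(s): into absorbing state 2.
    Q = np.array([[0.0, p01 * branch_mgf(lam01, s)],
                  [0.0, 0.0]])
    R = np.array([p02 * branch_mgf(lam02, s),
                  1.0 * branch_mgf(lam12, s)])
    m = np.linalg.solve(np.eye(2) - Q, R)
    return m[0]                        # MGF of the first passage time from state 0

# Mean total waiting time via a central difference of the MGF at s = 0.
h = 1e-5
mean_wait = (total_mgf(h) - total_mgf(-h)) / (2 * h)
print("E[total waiting time] ≈", mean_wait)
```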
43

Dimensionality reduction in nonparametric conditional density estimation with applications to nonlinear time series

Rosemarin, Roy January 2012 (has links)
Nonparametric methods for estimating conditional density functions when the dimension of the explanatory variable is large are known to suffer from slow convergence rates due to the 'curse of dimensionality'. When estimating the conditional density of a random variable Y given a random d-vector X, a significant reduction in dimensionality can be achieved, for example, by approximating the conditional density by that of Y given θᵀX, where the unit vector θ is chosen to optimise the approximation under the Kullback-Leibler criterion. As a first step, this thesis pursues this 'single-index' approximation by standard kernel methods. Under strong-mixing conditions, we derive a general asymptotic representation for the orientation estimator, and as a result, the approximated conditional density is shown to enjoy the same first-order asymptotic properties as it would have if the optimal θ were known. We then generalise this result to a 'multi-index' approximation of Projection Pursuit (PP) type. We propose a multiplicative PP approximation of the conditional density of the form f(y|x) = f₀(y) ∏_{m=1}^{M} h_m(y, θ_mᵀx), where the projection directions θ_m and the multiplicative elements h_m, m = 1,...,M, are chosen to minimise a weighted version of the Kullback-Leibler relative entropy between the true and the estimated conditional densities. We first establish the validity of the approximation by proving some probabilistic properties, and in particular we show that the PP approximation converges weakly to the true conditional density as M approaches infinity. An iterative procedure for estimation is outlined, and a variant of the bootstrap information criterion is suggested for terminating the iterations. Finally, the theory established for the single-index model serves as a building block in deriving the asymptotic properties of the PP estimator under strong-mixing conditions. All methods are illustrated in simulations with nonlinear time-series models, and some applications to prediction of daily exchange-rate data are demonstrated.
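A minimal sketch of the single-index idea, with simulated data and assumed bandwidths (not the thesis estimator): θ is chosen to maximise a leave-one-out kernel log-likelihood, a sample analogue of minimising the Kullback-Leibler distance between the true and approximated conditional densities.

```python
# Minimal sketch: single-index approximation f(y | theta'x) fitted by choosing theta
# to maximise the leave-one-out kernel log-likelihood. Data and bandwidths are assumed.
import numpy as np

rng = np.random.default_rng(1)
n, d = 500, 2
X = rng.normal(size=(n, d))
theta_true = np.array([0.8, 0.6])                 # a unit vector
Y = np.sin(X @ theta_true) + 0.3 * rng.normal(size=n)

def gauss(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def loo_loglik(theta, h=0.3, g=0.2):
    """Leave-one-out log-likelihood of the kernel estimate of f(y | theta'x)."""
    u = X @ theta
    Ku = gauss((u[:, None] - u[None, :]) / h)      # kernel weights in the index
    Ky = gauss((Y[:, None] - Y[None, :]) / g) / g  # kernel density in y
    np.fill_diagonal(Ku, 0.0)                      # leave-one-out
    dens = (Ku * Ky).sum(axis=1) / np.maximum(Ku.sum(axis=1), 1e-12)
    return np.mean(np.log(np.maximum(dens, 1e-12)))

# theta is a unit vector; in two dimensions it suffices to search over the angle.
angles = np.linspace(0, np.pi, 181)
scores = [loo_loglik(np.array([np.cos(a), np.sin(a)])) for a in angles]
best = angles[int(np.argmax(scores))]
print("estimated theta:", np.cos(best), np.sin(best))
```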
44

A dynamic contagion process for modelling contagion risk in finance and insurance

Zhao, Hongbiao January 2012 (has links)
We introduce a new point process, the dynamic contagion process, by generalising the Hawkes process and the Cox process with shot-noise intensity. Our process includes both self-excited and externally excited jumps, which can be used to model the dynamics of contagion impact from endogenous and exogenous factors of the underlying system. We systematically analyse the theoretical distributional properties of this new process, based on the piecewise-deterministic Markov process theory developed in Davis (1984) and an extension of the martingale methodology used in Dassios and Embrechts (1989). Analytic expressions for the Laplace transform of the intensity process and the probability generating function of the point process are derived. A simulation algorithm is provided for industrial implementation and statistical analysis. Some extensions of this process and comparisons with other similar processes are also investigated. The main objective of this study is to produce a general mathematical framework for modelling the dependence structure of arriving events with dynamic contagion, which has the potential to be applicable to a variety of problems in economics, finance and insurance. We apply our results to the default probability in credit risk and the ruin probability in risk theory.
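The sketch below simulates a dynamic-contagion-style intensity by Ogata-type thinning, with an exponentially decaying intensity, self-excited jumps at event times and externally excited jumps arriving at Poisson times; all parameters and jump-size distributions are assumed for illustration and this is not the paper's algorithm.

```python
# Minimal sketch (assumed parameters): simulating a dynamic-contagion-style process by
# Ogata-style thinning. Between jumps the intensity decays exponentially toward a;
# events add self-excited jumps and external Poisson shocks add externally excited jumps.
import numpy as np

rng = np.random.default_rng(2)

a, delta, lam0 = 0.5, 1.0, 1.2   # reversion level, decay rate, initial intensity (assumed)
rho = 0.4                        # rate of externally excited shocks (assumed)

def simulate(T=50.0):
    # External shock times and their (exponential) jump sizes are drawn up front.
    n_ext = rng.poisson(rho * T)
    ext_times = np.sort(rng.uniform(0, T, size=n_ext))
    ext_sizes = rng.exponential(0.8, size=n_ext)
    events, t, lam, k = [], 0.0, lam0, 0
    while t < T:
        next_shock = ext_times[k] if k < len(ext_times) else np.inf
        # Between shocks the intensity only decays, so the current lam is a valid bound.
        w = rng.exponential(1.0 / lam)
        if t + w >= next_shock:
            # No event before the shock: decay to the shock time and add its jump.
            lam = a + (lam - a) * np.exp(-delta * (next_shock - t)) + ext_sizes[k]
            t, k = next_shock, k + 1
            continue
        t += w
        lam_t = a + (lam - a) * np.exp(-delta * w)   # decayed intensity at the candidate
        if rng.uniform() <= lam_t / lam and t < T:
            events.append(t)                         # accept: an event occurs
            lam = lam_t + rng.exponential(0.6)       # self-excited jump (assumed sizes)
        else:
            lam = lam_t                              # reject: keep the decayed intensity
    return np.array(events)

print(len(simulate()), "events on [0, 50]")
```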
45

Applied probabilistic forecasting

Binter, Roman January 2012 (has links)
In any actual forecast, the future evolution of the system is uncertain and the forecasting model is mathematically imperfect. Both ontic uncertainty about the future (due to true stochasticity) and epistemic uncertainty about the model (reflecting structural imperfections) complicate the construction and evaluation of probabilistic forecasts. In almost all nonlinear forecast models, the evolution of uncertainty in time is not tractable analytically and Monte Carlo approaches ("ensemble forecasting") are widely used. This thesis advances our understanding of the construction of forecast densities from ensembles, the evolution of the resulting probability forecasts and methods of establishing skill (benchmarks). A novel method of partially correcting the model error is introduced and shown to outperform a competitive approach. The properties of kernel dressing, a method of transforming ensembles into probability density functions, are investigated and the convergence of the approach is illustrated. A connection between forecasting and information theory is examined by demonstrating that kernel dressing via minimization of Ignorance implicitly leads to minimization of the Kullback-Leibler divergence. The Ignorance score is critically examined in the context of other information-theoretic measures. The method of Dynamic Climatology is introduced as a new approach to establishing skill (benchmarking). Dynamic Climatology is a new, relatively simple, nearest-neighbor-based model shown to be of value in benchmarking the global circulation models of the ENSEMBLES project. ENSEMBLES is a project funded by the European Union bringing together all major European weather forecasting institutions in order to develop and test state-of-the-art seasonal weather forecasting models. By benchmarking the seasonal forecasts of the ENSEMBLES models we demonstrate that Dynamic Climatology can help us better understand the value and forecasting performance of large-scale circulation models. Lastly, a new approach to correcting (improving) an imperfect model is presented, an idea inspired by [63]. The main idea is a two-stage procedure in which a second-stage 'corrective' model iteratively corrects systematic parts of the forecasting errors produced by a first-stage 'core' model. The corrector is iterative, so that at a given time t the core model forecast is corrected and then used as an input into the next iteration of the core model to generate a time t + 1 forecast. Using two nonlinear systems we demonstrate that the iterative corrector is superior to alternative approaches based on direct (non-iterative) forecasts. While the choice of the corrector model class is flexible, we use radial basis functions, which are frequently used in statistical learning and surface approximation and involve a number of computational aspects that we discuss in some detail.
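A minimal sketch of kernel dressing and the Ignorance score on synthetic forecast-outcome pairs (the data, kernel form and bandwidth grid are assumptions, not the thesis setup): the dressing width is chosen by minimising in-sample Ignorance and the score is then evaluated out of sample.

```python
# Minimal sketch: Gaussian kernel dressing of an ensemble into a predictive density,
# with the bandwidth chosen by minimising the Ignorance score (-log2 density at the outcome).
import numpy as np

rng = np.random.default_rng(3)

def dressed_density(ensemble, y, sigma):
    """Equal-weight mixture of N(member, sigma^2) densities, evaluated at y."""
    z = (y - ensemble) / sigma
    return np.mean(np.exp(-0.5 * z**2) / (sigma * np.sqrt(2 * np.pi)))

def ignorance(ensembles, outcomes, sigma):
    p = np.array([dressed_density(e, y, sigma) for e, y in zip(ensembles, outcomes)])
    return -np.mean(np.log2(np.maximum(p, 1e-300)))

# Synthetic forecast archive: 200 forecast-outcome pairs, 16-member ensembles.
truth = rng.normal(size=200)
ensembles = truth[:, None] + 0.4 * rng.normal(size=(200, 16)) + 0.1   # biased, under-dispersed
outcomes = truth + 0.5 * rng.normal(size=200)

# Fit sigma on the first half of the archive, evaluate Ignorance on the second half.
sigmas = np.linspace(0.05, 2.0, 60)
train = slice(0, 100)
best = sigmas[int(np.argmin([ignorance(ensembles[train], outcomes[train], s) for s in sigmas]))]
print("chosen kernel width     :", best)
print("out-of-sample Ignorance :", ignorance(ensembles[100:], outcomes[100:], best))
```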
46

Quantitative modelling of market booms and crashes

Sheynzon, Ilya January 2012 (has links)
Multiple equilibria models are one of the main categories of theoretical models for stock market crashes. To the best of my knowledge, existing multiple equilibria models have been developed within a discrete time framework and only explain the intuition behind a single crash on the market. The main objective of this thesis is to model multiple equilibria and demonstrate how market prices move from one regime into another in a continuous time framework. As a consequence of this, a multiple jump structure is obtained with both possible booms and crashes, which are defined as points of discontinuity of the stock price process. I consider five different models for stock market booms and crashes, and look at their pros and cons. For all of these models, I prove that the stock price is a càdlàg semimartingale process and find conditional distributions for the time of the next jump, the type of the next jump and the size of the next jump, given the public information available to market participants. Finally, I discuss the problem of model parameter estimation and conduct a number of numerical studies.
47

Statistical inference on linear and partly linear regression with spatial dependence : parametric and nonparametric approaches

Thawornkaiwong, Supachoke January 2012 (has links)
The typical assumption made in regression analysis with cross-sectional data is that of independent observations. However, this assumption can be questionable in some economic applications where spatial dependence of observations may arise, for example, from local shocks in an economy, interaction among economic agents and spillovers. The main focus of this thesis is on regression models under three different models of spatial dependence. First, a multivariate linear regression model with the disturbances following the Spatial Autoregressive process is considered. It is shown that the Gaussian pseudo-maximum likelihood estimates of the regression and spatial autoregressive parameters can be root-n-consistent under strong spatial dependence or explosive variances, provided these are not too strong, without making restrictive assumptions on the parameter space. To achieve efficiency improvements, adaptive estimation, in the sense of Stein (1956), is also discussed, where the unknown score function is estimated nonparametrically by power series estimation. A large section is devoted to an extension of power series estimation to random variables with unbounded supports. Second, linear and semiparametric partly linear regression models with the disturbances following a generalized linear process for triangular arrays proposed by Robinson (2011) are considered. It is shown that instrumental variables estimates of the unknown slope parameters can be root-n-consistent even under some strong spatial dependence. A simple nonparametric estimate of the asymptotic variance matrix of the slope parameters is proposed. An empirical illustration of the estimation technique is also conducted. Finally, linear regression where the random variables follow a marked point process is considered. The focus is on a family of random signed measures, constructed from the marked point process, that are second-order stationary, and their spectral properties are discussed. Asymptotic normality of the least squares estimate of the regression parameters is derived from the associated random signed measures under mixing assumptions. Nonparametric estimation of the asymptotic variance matrix of the slope parameters is discussed, and an algorithm to obtain a positive definite estimate with faster rates of convergence than the traditional ones is proposed.
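For the first model, a minimal sketch (simulated data, dense linear algebra, not the thesis estimator) of Gaussian pseudo-maximum likelihood for a regression with SAR disturbances, profiling out the regression coefficients and error variance and maximising over the spatial autoregressive parameter:

```python
# Minimal sketch: Gaussian pseudo-ML for y = X beta + u, u = lambda W u + eps,
# with an assumed row-normalised circular contiguity weight matrix W.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(4)
n = 200

W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5   # each unit has two neighbours

X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true, lam_true = np.array([1.0, 2.0]), 0.4
u = np.linalg.solve(np.eye(n) - lam_true * W, rng.normal(size=n))
y = X @ beta_true + u

def neg_profile_loglik(lam):
    A = np.eye(n) - lam * W
    Xs, ys = A @ X, A @ y                          # transform implied by a given lambda
    beta = np.linalg.lstsq(Xs, ys, rcond=None)[0]
    resid = ys - Xs @ beta
    sigma2 = resid @ resid / n
    _, logdet = np.linalg.slogdet(A)
    return -(logdet - 0.5 * n * np.log(sigma2))    # negative profile log-likelihood

lam_hat = minimize_scalar(neg_profile_loglik, bounds=(-0.9, 0.9), method="bounded").x
A = np.eye(n) - lam_hat * W
beta_hat = np.linalg.lstsq(A @ X, A @ y, rcond=None)[0]
print("lambda_hat:", round(lam_hat, 3), "beta_hat:", beta_hat.round(3))
```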
48

Bayesian inference for indirectly observed stochastic processes : applications to epidemic modelling

Dureau, Joseph January 2013 (has links)
Stochastic processes are mathematical objects that offer a probabilistic representation of how some quantities evolve in time. In this thesis we focus on estimating the trajectory and parameters of dynamical systems in cases where only indirect observations of the driving stochastic process are available. We first explored means of using weekly recorded numbers of cases of influenza to capture how the frequency and nature of contacts with infected individuals evolved in time. The latter was modelled with diffusions and can be used to quantify the impact of varying drivers of epidemics such as holidays, climate, or prevention interventions. Following this idea, we estimated how the frequency of condom use evolved during the intervention of the Gates Foundation against HIV in India. In this setting, the available estimates of the proportion of individuals infected with HIV were not only indirect but also very scarce, leading to specific difficulties. Finally, we developed a methodology for fractional Brownian motions (fBM), here a fractional stochastic volatility model, indirectly observed through market prices. The intractability of the likelihood function, requiring augmentation of the parameter space with the diffusion path, is ubiquitous in this thesis. We aimed for inference methods robust to refinements of the time discretisation, made necessary to enforce the accuracy of Euler schemes. The particle marginal Metropolis-Hastings (PMMH) algorithm exhibits this mesh-free property. We propose the use of fast approximate filters as a pre-exploration tool to estimate the shape of the target density, for a quicker and more robust adaptation phase of the asymptotically exact algorithm. The fBM problem could not be treated with the PMMH, and required an alternative methodology based on reparameterisation and advanced Hamiltonian Monte Carlo techniques on the diffusion path space, which would also be applicable in the Markovian setting.
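A minimal sketch of the PMMH idea on a toy state-space model (a latent log-intensity random walk observed through Poisson counts; the model, proposal and prior are assumptions, not the thesis applications): a bootstrap particle filter supplies an unbiased likelihood estimate inside a Metropolis-Hastings loop.

```python
# Minimal sketch: particle marginal Metropolis-Hastings (PMMH) on a toy model
# x_t = x_{t-1} + sigma * noise, y_t ~ Poisson(exp(x_t)), inferring sigma.
import numpy as np

rng = np.random.default_rng(5)

T, sigma_true = 100, 0.15
x = np.cumsum(sigma_true * rng.normal(size=T)) + 2.0
y = rng.poisson(np.exp(x))

def log_lik_hat(sigma, n_particles=500):
    """Bootstrap particle filter estimate of the log-likelihood for a given sigma."""
    particles = np.full(n_particles, 2.0)
    ll = 0.0
    for t in range(T):
        particles = particles + sigma * rng.normal(size=n_particles)
        logw = y[t] * particles - np.exp(particles)     # Poisson log-weight (up to a constant)
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        particles = particles[idx]                      # multinomial resampling
    return ll

# PMMH with a random-walk proposal on log(sigma) and a flat prior on log(sigma) (assumed),
# so the acceptance ratio reduces to the likelihood-estimate ratio.
sigma, ll = 0.3, log_lik_hat(0.3)
chain = []
for _ in range(500):
    prop = sigma * np.exp(0.2 * rng.normal())
    ll_prop = log_lik_hat(prop)
    if np.log(rng.uniform()) < ll_prop - ll:
        sigma, ll = prop, ll_prop
    chain.append(sigma)
print("posterior mean of sigma:", np.mean(chain[100:]))
```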
49

Temporal and spatial modelling of extreme river flow values in Scotland

Franco Villoria, Maria January 2013 (has links)
Extreme river flows can lead to inundation of floodplains, with consequent impacts for society, the environment and the economy. Flood risk estimates rely on river flow records, hence a good understanding of the patterns in river flow, and in particular in extreme river flow, is important to improve estimation of risk. In Scotland, a number of studies suggest a West to East rainfall gradient and increased variability in rainfall and river flow. This thesis presents and develops a number of statistical methods for analysing different aspects of extreme river flows, namely their variability, temporal trend, seasonality and spatial dependence. The methods are applied to a large data set, provided by SEPA, of daily river flow records from 119 gauging stations across Scotland. The records range in length from 10 to 80 years and are characterized by non-stationarity and long-range dependence. Non-stationarity is examined using wavelets. The results revealed significant changes in the variability of the seasonal pattern over the last 40 years, with periods of high and low variability associated with flood-rich and flood-poor periods respectively. Results from a wavelet coherency analysis suggest a significant influence of large-scale climatic indices (NAO, AMO) on river flow. A quantile regression model is then developed based on an additive regression framework using P-splines, where the parameters are fitted via weighted least squares. The proposed model includes a trend and a seasonal component, estimated using the back-fitting algorithm. Incorporation of covariates and extension to higher-dimensional data sets is straightforward. The model is applied to a set of eight Scottish rivers to estimate the trend and seasonality in the 95th quantile of river flow. The results suggest differences in the long-term trend between the East and the West and a more variable seasonal pattern in the East. Two different approaches are then considered for modelling spatial extremes. The first approach consists of a conditional probability model and concentrates on small subsets of rivers. A spatial quantile regression model is then developed, extending the temporal quantile model above to estimate a spatial surface using the tensor product of the marginal B-spline bases. Residual spatial correlation, modelled with a Gaussian correlation function, is incorporated into standard error estimation. Results from the 95th quantile fitted for individual months suggest changes in the spatial pattern of extreme river flow over time. The extension of the spatial quantile model to a fully spatio-temporal model is briefly outlined and the main statistical issues are identified.
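A minimal sketch of a P-spline quantile fit on simulated "flow" data (the basis size, penalty weight and the IRLS approximation of the check loss are assumptions, not the thesis implementation): the 95th quantile is estimated by iteratively reweighted least squares with a second-order difference penalty.

```python
# Minimal sketch: P-spline regression for the 95th quantile of a seasonal series,
# fitted by iteratively reweighted least squares on the pinball (check) loss.
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(6)

# Synthetic daily "flow" with a seasonal cycle and heavy-tailed spikes.
t = np.arange(0, 10 * 365) / 365.0
flow = 50 + 20 * np.sin(2 * np.pi * t) + rng.gamma(2.0, 5.0, size=t.size)

# Cubic B-spline basis on equally spaced knots (slightly padded at the boundaries).
k, n_knots = 3, 20
interior = np.linspace(t.min() - 0.01, t.max() + 0.01, n_knots)
knots = np.r_[[interior[0]] * k, interior, [interior[-1]] * k]
B = BSpline.design_matrix(t, knots, k).toarray()

D = np.diff(np.eye(B.shape[1]), n=2, axis=0)    # second-order difference penalty
lam, tau = 10.0, 0.95

# IRLS for the check loss: weight tau/|r| above the fit and (1 - tau)/|r| below it.
alpha = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ flow)
for _ in range(50):
    r = flow - B @ alpha
    w = np.where(r >= 0, tau, 1 - tau) / np.maximum(np.abs(r), 1e-6)
    BW = B * w[:, None]
    alpha = np.linalg.solve(B.T @ BW + lam * D.T @ D, BW.T @ flow)

q95 = B @ alpha
print("fitted 95th-quantile range:", q95.min().round(1), "to", q95.max().round(1))
```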
50

The applicability of statistical techniques to credit portfolios with specific reference to the use of risk theory in banking

O'Connor, Ronan B. January 1996 (has links)
This thesis examines the use of statistical techniques in credit portfolio management, with emphasis on actuarial and risk-theoretic pricing and reserving measures. The bank corporate loan portfolio is envisaged as an insurance collective, with margins charged for credit risk forming premium income, provisions made forming claims outgo, and variation over time in provisioning and profitability producing a need for reserves. The research leads to the computation of portfolio-specific measures of risk, and suggests that a value-at-risk (VAR) based reserve computation has several advantages over current regulatory reserving methodology in bank capital adequacy regulation (CAR) with respect to non-residential private sector loan portfolios. These current CAR practices are invariably used by banks to compute the capital adequacy backing required on loans. A loan pricing model is developed that allocates the required capital by reference to observed provisioning rates across a total of 64 differing combinations of rating factors. This represents a statistically rigorous return on risk-adjusted capital (RORAC) approach to loan pricing. The suggested approach is illustrated by reference to a particular portfolio of loans. The reserving and pricing measures computed are portfolio specific, but the methodology developed and tested on the specific portfolio (dataset) of loans has a wider, more general applicability. A credit market comprising portfolios that are both more and less risky than the original portfolio is hypothesised, and existing regulation is compared with VAR regulation in the context of this hypothesised credit market. Fewer insolvencies are observed under the VAR framework than under existing regulation, and problem portfolios are identified earlier than under existing regulation. For the particular portfolio of loans, existing algorithm-based loan pricing is compared with the proposed loan pricing model. Significant differences are observed in loan pricing by reference to gearing and collateral, and the elimination of observed inefficiencies in pricing is recommended. Although the proposed model has some limitations, it is argued to be an improvement on existing regulatory and banking practice.
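A minimal sketch of a VaR-based reserve on a hypothetical loan portfolio (the one-factor Gaussian copula used to induce default dependence is a standard stand-in, not the thesis's provisioning-based approach): annual losses are simulated and the 99.5% VaR is compared with a flat capital charge.

```python
# Minimal sketch: simulated annual credit losses for a hypothetical loan portfolio and a
# VaR-based reserve, set against a flat percentage capital charge for comparison.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

# Hypothetical portfolio: exposures, default probabilities and a common loss-given-default.
n_loans = 2000
exposure = rng.lognormal(mean=12, sigma=1.0, size=n_loans)
pd = rng.choice([0.002, 0.005, 0.02, 0.08], size=n_loans, p=[0.4, 0.3, 0.2, 0.1])
lgd = 0.45
thresholds = norm.ppf(pd)

def simulate_losses(n_sims=20_000, rho=0.15):
    """One-factor Gaussian copula: a common factor induces dependence between defaults."""
    losses = np.empty(n_sims)
    for s in range(n_sims):
        z = rng.normal()                                  # common systematic factor
        idio = rng.normal(size=n_loans)                   # idiosyncratic factors
        default = np.sqrt(rho) * z + np.sqrt(1 - rho) * idio < thresholds
        losses[s] = np.sum(exposure[default]) * lgd
    return losses

losses = simulate_losses()
expected = losses.mean()
var_99_5 = np.quantile(losses, 0.995)
print("expected annual loss         :", round(expected))
print("99.5% VaR reserve            :", round(var_99_5 - expected))
print("flat 8% capital (for scale)  :", round(0.08 * exposure.sum()))
```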
