  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Calibration of Option Pricing in Reproducing Kernel Hilbert Space

Ge, Lei 01 January 2015 (has links)
Volatility, a parameter in the Black-Scholes equation, measures the variation of the price of a financial instrument over time. Determining volatility is a fundamental issue in the valuation of financial instruments, and it gives rise to an inverse problem known as the calibration problem for option pricing. This problem is shown to be ill-posed. We propose a regularization method and reformulate the calibration problem as one of finding the local volatility in a reproducing kernel Hilbert space. We define a new volatility function that captures both the financial and time factors of the options. We discuss the existence of the minimizer using the regularized reproducing kernel method and show that the regularizer resolves the numerical instability of the calibration problem. Finally, we apply the proposed method to index option data sets through simulation tests and discuss the empirical results obtained.
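As a point of reference for the forward problem that the calibration inverts, the following is a minimal Python sketch (not the thesis's regularized RKHS method) of the Black-Scholes call price and a naive single-quote implied-volatility inversion by bisection; all parameter values are illustrative.

```python
import math
from statistics import NormalDist

def bs_call_price(S, K, T, r, sigma):
    """Black-Scholes price of a European call: spot S, strike K,
    maturity T (years), risk-free rate r, volatility sigma."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def implied_vol(price, S, K, T, r, lo=1e-4, hi=5.0, tol=1e-8):
    """Invert the pricing formula for one quote by bisection: the
    simplest, unregularized form of the calibration problem."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bs_call_price(S, K, T, r, mid) > price:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# recover the volatility used to generate a synthetic quote
quote = bs_call_price(100.0, 100.0, 1.0, 0.02, 0.25)
print(round(implied_vol(quote, 100.0, 100.0, 1.0, 0.02), 4))   # ~0.25
```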
2

Multiple Kernel Imputation : A Locally Balanced Real Donor Method

Pettersson, Nicklas January 2013 (has links)
We present an algorithm for imputation of incomplete datasets based on Bayesian exchangeability through Pólya sampling. Each (donee) unit with a missing value is imputed multiple times with observed (real) values from units in a donor pool. The donor pools are constructed using auxiliary variables. Several features from kernel estimation are used to counteract imbalances caused by sparse and bounded data. Three of the balancing features can be used with a single continuous auxiliary variable, but a fourth feature requires multiple continuous auxiliary variables. They contribute mainly by reducing nonresponse bias. We examine how the donor pool size, that is, the number of potential donors within the pool, should be determined. External information is shown to be easily incorporated into the imputation algorithm. Our simulation studies show that with a study variable that can be seen as a function of one or two continuous auxiliaries plus residual noise, the method performs as well or almost as well as competing methods when the function is linear, but usually much better when the function is nonlinear. / At the time of the doctoral defense, the following papers were unpublished and had the following status: Paper 1: In press. Paper 3: Submitted. Paper 4: Submitted.
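To fix ideas, here is a minimal real-donor imputation sketch with a single continuous auxiliary variable: each donee draws m imputations with replacement from the observed values of its nearest respondents. It omits the Pólya sampling and the kernel-based balancing features of the thesis; the function name and tuning values are hypothetical.

```python
import random

def impute_real_donor(values, aux, pool_size=5, m=5, seed=1):
    """For each unit with a missing study value, draw m imputations with
    replacement from the observed values of the pool_size respondents
    closest on the auxiliary variable."""
    rng = random.Random(seed)
    donors = [(a, v) for a, v in zip(aux, values) if v is not None]
    result = []
    for a, v in zip(aux, values):
        if v is not None:
            result.append([v])          # observed units keep their value
            continue
        pool = sorted(donors, key=lambda d: abs(d[0] - a))[:pool_size]
        result.append([rng.choice(pool)[1] for _ in range(m)])
    return result

values = [2.0, None, 3.5, None, 1.0, 4.0]   # study variable with nonresponse
aux    = [1.2, 1.3, 2.9, 3.1, 0.8, 3.0]     # fully observed auxiliary
print(impute_real_donor(values, aux, pool_size=3, m=3))
```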
3

First- and Second-Order Properties of Spatiotemporal Point Patterns in the Space-Time and Frequency Domains

Dorai-Raj, Sundardas Samuel 10 August 2001 (has links)
Point processes are common in many physical applications in engineering and biology. These processes can be observed in one dimension as a time series or in two dimensions as a spatial point pattern, and an extensive literature is devoted to their analysis. However, if the observed process is a hybrid of spatial and temporal point processes, very few practical methods exist. In such cases, practitioners often remove the temporal component and analyze the spatial dependencies. This marginal spatial analysis may lead to misleading results if time is an important factor in the process. In this dissertation we extend the current analysis of spatial point patterns to include a temporal dimension. First- and second-order intensity measures for analyzing spatiotemporal point patterns are explicitly defined. Estimation of first-order intensities is examined using 3-dimensional smoothing techniques. Conditions for weak stationarity are provided so that subsequent second-order analysis can be conducted. We consider second-order analysis of spatiotemporal point patterns first in the space-time domain through an extension of Ripley's K-function. An alternative analysis is given in the frequency domain through construction of a spatiotemporal periodogram. The methodology is tested through simulation of spatiotemporal point patterns and by analysis of a real data set. The biological application concerns the estimation of the home range of groups of the endangered red-cockaded woodpecker in the Fort Bragg area of North Carolina. Monthly or bimonthly point patterns of the bird distribution are analyzed and integrated over a 23-month period. / Ph. D.
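For orientation, a minimal, edge-uncorrected estimate of the purely spatial Ripley K-function is sketched below; the dissertation's spatiotemporal extension adds a temporal lag argument and edge corrections, which are omitted here.

```python
import numpy as np

def ripley_k(points, area, radii):
    """Naive estimate K(r) = |A| / n^2 * sum_{i != j} 1(d_ij <= r),
    with no edge correction."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)                 # exclude self-pairs
    return np.array([area * (d <= r).sum() / n**2 for r in radii])

rng = np.random.default_rng(0)
pts = rng.uniform(0, 1, size=(200, 2))          # CSR on the unit square
r = np.linspace(0.01, 0.2, 5)
print(np.round(ripley_k(pts, area=1.0, radii=r), 4))
print(np.round(np.pi * r**2, 4))                # theoretical K under CSR
```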
4

On gamma kernel function in recursive density estimation

Ma, Xiaoxiao 09 August 2019 (has links)
In this thesis we investigate the convergence rate of gamma kernel estimators in recursive density estimation. Unlike traditional symmetric kernels with a fixed shape, the gamma kernel is a kernel function with bounded support and varying shape. Gamma kernels have been used to address the boundary bias problem that occurs when a symmetric kernel is used to estimate a density supported on [0, ∞). Recursive density estimation is useful when additional data arrive on-line from the population density that we want to estimate. We utilize ideas and results from adaptive kernel estimation to show that the L_2 convergence rate of recursive kernel density estimators that use gamma kernels is n^(-4/5).
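A rough sketch of a recursive density estimate with gamma kernels (a Chen-type kernel with shape x/b + 1 and scale b) follows; the bandwidth sequence b_i = c * i^(-power) and the constants are illustrative choices rather than those analysed in the thesis.

```python
import numpy as np
from scipy.stats import gamma

def gamma_kernel_recursive(data, x_grid, c=1.0, power=0.4):
    """Recursive density estimate on [0, inf): the i-th observation enters
    with its own bandwidth b_i, so the sums can be updated on-line as new
    data arrive."""
    x = np.asarray(x_grid, dtype=float)
    est = np.zeros_like(x)
    for i, xi in enumerate(data, start=1):
        b = c * i ** (-power)
        # gamma kernel indexed by the evaluation points x, evaluated at the datum xi
        est += gamma.pdf(xi, a=x / b + 1.0, scale=b)
    return est / len(data)

rng = np.random.default_rng(0)
sample = rng.exponential(scale=1.0, size=2000)
grid = np.linspace(0.01, 4.0, 8)
print(np.round(gamma_kernel_recursive(sample, grid), 3))
print(np.round(np.exp(-grid), 3))    # true Exp(1) density for comparison
```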
5

Income Distribution Dynamics and Cross-Region Convergence in Europe. Spatial filtering and novel stochastic kernel representations

Fischer, Manfred M., Stumpner, Peter 04 1900 (has links) (PDF)
This paper suggests an empirical framework for analysing income distribution dynamics and cross-region convergence in the European Union of 27 member states, 1995-2003. The framework lies in the research tradition that allows the state income space to be continuous, puts emphasis on both shape and intra-distribution dynamics, and uses stochastic kernels for studying transition dynamics and implied long-run behaviour. In this paper stochastic kernels are described by conditional density functions, estimated by a product kernel estimator of conditional density and represented by means of novel visualisation tools. The technique of spatial filtering is used to account for spatial effects, in order to avoid misguided inferences and interpretations caused by the presence of spatial autocorrelation in the income distributions. The results reveal a slow catching-up of the poorest regions and a process of polarisation, with a small group of very rich regions shifting away from the rest of the cross-section. This is well evidenced by both the unfiltered and the filtered ergodic density views. Differences exist in detail, and these emphasise the importance of properly dealing with the spatial autocorrelation problem. (authors' abstract)
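The stochastic-kernel building block, a product-kernel estimate of a conditional (transition) density, can be sketched as follows; this toy version uses Gaussian kernels with fixed bandwidths on simulated data and omits the spatial filtering step described in the paper.

```python
import numpy as np

def conditional_density(x0, y_grid, X, Y, hx, hy):
    """Product-kernel estimate of f(y | x0) = f(x0, y) / f(x0); evaluated
    over a grid of x0 values it gives a discretised stochastic kernel."""
    gauss = lambda u: np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    wx = gauss((x0 - X) / hx) / hx              # kernel weights in the conditioning variable
    num = np.array([(wx * gauss((y - Y) / hy) / hy).mean() for y in y_grid])
    return num / wx.mean()

rng = np.random.default_rng(0)
X = rng.normal(size=3000)                       # e.g. relative income at time t
Y = 0.8 * X + 0.2 * rng.normal(size=3000)       # relative income at time t + s
grid = np.linspace(-2, 2, 9)
print(np.round(conditional_density(0.5, grid, X, Y, hx=0.2, hy=0.2), 3))
```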
6

Nestacionární procesy částic / Nonstationary particle processes

Jirsák, Čeněk January 2011 (has links)
Title: Nonstationary particle processes Author: Čeněk Jirsák Department: Department of Probability and Mathematical Statistics Supervisor: Doc. RNDr. Jan Rataj, CSc., Mathematical Institute, Charles University Supervisor's e-mail address: rataj@karlin.mff.cuni.cz Abstract: Many real phenomena can be modeled as random closed sets of different Hausdorff dimension in R^d. One of the main characteristics of such a random set is its expected Hausdorff measure. In case this measure has a density, the density is called the intensity function. In the present paper we define a nonparametric kernel estimator of the intensity function. The concept of an H^k-rectifiable set has a key role here. Properties of the kernel estimator, such as unbiasedness and convergence behavior, are studied. As the estimator may be difficult to compute precisely, numerical approximations are derived for practical use. Parametric models are also briefly mentioned, and the kernel estimator is used with the minimum contrast method to estimate the parameters of the model. Finally, the suggested methods are tested on simulated data. Keywords: stochastic geometry, intensity measure, random closed set, kernel estimation
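As a simple special case (an ordinary planar point pattern rather than a general H^k-rectifiable random set), a kernel estimate of the intensity function can be sketched as follows; the Gaussian kernel, bandwidth and grid are illustrative and edge effects are ignored.

```python
import numpy as np

def kernel_intensity(points, x_grid, y_grid, h):
    """Kernel intensity estimate lambda(x) = sum_i K_h(x - x_i) for a
    planar point pattern, without edge correction."""
    xx, yy = np.meshgrid(x_grid, y_grid, indexing="ij")
    out = np.zeros_like(xx)
    for px, py in np.asarray(points, dtype=float):
        d2 = (xx - px) ** 2 + (yy - py) ** 2
        out += np.exp(-0.5 * d2 / h**2) / (2 * np.pi * h**2)
    return out

rng = np.random.default_rng(0)
pts = rng.uniform(0, 1, size=(300, 2))          # Poisson-like pattern, intensity ~300
grid = np.linspace(0.1, 0.9, 5)
print(np.round(kernel_intensity(pts, grid, grid, h=0.1), 0))  # roughly 300 away from the boundary
```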
7

Mobile Services Based Traffic Modeling

Strengbom, Kristoffer January 2015 (has links)
Traditionally, communication systems have been dominated by voice applications. Today, with the emergence of smartphones, focus has shifted towards packet-switched networks. The Internet provides a wide variety of services such as video streaming, web browsing and e-mail, and IP traffic models are needed in all stages of product development, from early research to system tests. In this thesis, we propose a multi-level model of IP traffic in which the user behavior and the actual IP traffic generated by different services are considered as two independent random processes. The model is based on observations of IP packet header logs from live networks. In this way models can be updated to reflect the ever-changing service and end-user equipment usage. Thus, the work can be divided into two parts. The first part is concerned with modeling the traffic from different services. A subscriber is interested in enjoying the services provided on the Internet, and traffic modeling should reflect the characteristics of these services. An underlying assumption is that different services generate their own characteristic pattern of data. The FFT is used to analyze the packet traces. We show that the traces contain strong periodicities and that some services are more or less deterministic. For some services this strong frequency content is due to the characteristics of the cellular network, and for others it is actually a programmed behavior of the service. The periodicities indicate that there are strong correlations between individual packets or bursts of packets. The second part is concerned with the user behavior, i.e., how users access the different services in time. We propose a model based on a Markov renewal process and estimate the model parameters. In order to evaluate the model we compare it to two simpler models. We use model selection, with the model's ability to predict future observations as the selection criterion. We show that the proposed Markov renewal model is the best of the three models in this sense. The model selection framework can be used to evaluate future models.
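A minimal periodogram computation of the kind used to expose such periodicities is sketched below on a synthetic packet-count trace; the bin width, burst period and rates are invented for illustration.

```python
import numpy as np

def periodogram(counts, dt):
    """Periodogram of a packet-count series binned at interval dt;
    strong peaks indicate (quasi-)periodic service behaviour."""
    x = np.asarray(counts, dtype=float)
    x = x - x.mean()                               # drop the DC component
    spec = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    return np.fft.rfftfreq(len(x), d=dt), spec

# synthetic trace: 0.2 s bursts every 2 s on top of Poisson background, 10 ms bins
dt, n = 0.01, 20000
t = np.arange(n) * dt
counts = np.random.default_rng(0).poisson(0.3, n) + 5 * (np.mod(t, 2.0) < 0.2)
freqs, spec = periodogram(counts, dt)
print("dominant frequency (Hz):", freqs[1:][np.argmax(spec[1:])])   # ~0.5 Hz
```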
8

Imagerie tomographique d'un volcan à l'aide des muons atmosphériques / Tomographic imaging of volcanoes using atmospheric muons

Béné, Samuel 22 December 2016 (has links)
Les muons atmosphériques sont des particules élémentaires créées lors de l’interaction des rayons cosmiques de haute énergie avec les atomes de la haute atmosphère. Leur capacité à traverser de grandes quantités de matière et leur abondance au niveau du sol permet d’utiliser leur flux comme support à la radiographie de grands objets. Cette technique, la muographie, possède notamment comme sujet d’application possible l’étude de volcans. La collaboration Tomuvol, au sein de laquelle cette thèse s’est déroulée, vise à mettre au point un détecteur et les techniques d’analyse permettant la réalisation d’une telle mesure avec comme sujet d’étude un volcan auvergnat : le Puy de Dôme. Ce manuscrit présente les contributions à ce travail du point de vue instrumental tout d’abord, avec la calibration et l’optimisation des performances des chambres GRPC utilisées pour la mesure. Les performances du détecteur lors des diverses campagnes de prise de données qui se sont déroulées au pied du Puy de Dôme sont également résumées. Dans une deuxième partie, l’accent est porté sur l’analyse physique des données obtenues avec, dans un premier temps, la description des travaux de simulation Monte-Carlo mis en œuvre avec le logiciel GEANT4. Puis, une technique d’estimation du flux transmis de muons atmosphériques à l’aide d’une méthode de type noyaux est présentée, et la carte de densité estimée du Puy de Dôme qui en découle est comparée aux résultats issus de techniques géophysiques. / Atmospheric muons are elementary particles originating from the interaction of high-energy cosmic rays with atoms in the upper atmosphere. Their ability to travel through a large amount of matter and their abundance at ground level allow their flux to be used as a probe for the radiography of large objects. This technique, muography, can in particular be of interest for the study of volcanoes. The Tomuvol collaboration, within which this thesis took place, aims at developing a detector and analysis techniques allowing such a measurement to be performed, using a volcano in Auvergne as a case study: the Puy de Dôme. This document describes the author’s contributions to this work, focusing first on the instrumentation aspect, with the calibration and optimisation of the GRPC chambers used to perform the measurement. The performance of the detector during the various data-taking campaigns at the base of the Puy de Dôme is also summarised. A second part is dedicated to the physical analysis of the data, with, firstly, a description of the Monte Carlo simulations developed using the GEANT4 software. Then, a kernel-type method for estimating the transmitted flux of atmospheric muons is described, and the resulting estimated density map of the Puy de Dôme is compared with results from geophysical techniques.
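As a toy illustration of kernel smoothing of reconstructed track directions, the sketch below estimates a detected-rate map over a few lines of sight; it is far simpler than the collaboration's method (no acceptance correction, no solid-angle weighting, no open-sky normalisation), and all numbers are invented.

```python
import numpy as np

def directional_rate(tracks, directions, h):
    """Kernel-smoothed number of detected tracks around each (zenith,
    azimuth) direction in radians; dividing by exposure and comparing
    with an open-sky reference would give the opacity along each line
    of sight."""
    tracks = np.asarray(tracks, dtype=float)
    out = []
    for th0, ph0 in directions:
        d2 = (tracks[:, 0] - th0) ** 2 + (tracks[:, 1] - ph0) ** 2
        out.append(float((np.exp(-0.5 * d2 / h**2) / (2 * np.pi * h**2)).sum()))
    return out

rng = np.random.default_rng(0)
# 5000 toy tracks clustered around zenith 1.3 rad, azimuth 0 rad
tracks = np.column_stack([rng.normal(1.3, 0.1, 5000), rng.normal(0.0, 0.2, 5000)])
print([round(v) for v in directional_rate(tracks, [(1.3, 0.0), (1.3, 0.4), (1.1, 0.0)], h=0.05)])
```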
9

Développement de modèles non paramétriques et robustes : application à l’analyse du comportement de bivalves et à l’analyse de liaison génétique / Development of nonparametric and robust models: application to the analysis of bivalve behavior and to genetic linkage analysis

Sow, Mohamedou 20 May 2011 (has links)
Le développement des approches robustes et non paramétriques pour l’analyse et le traitement statistique de gros volumes de données présentant une forte variabilité, comme dans les domaines de l’environnement et de la génétique, est fondamental. Nous modélisons ici des données complexes de biologie appliquées à l’étude du comportement de bivalves et à l’analyse de liaison génétique. L’application des mathématiques à l’analyse du comportement de mollusques bivalves nous a permis d’aller vers une quantification et une traduction mathématique de comportements d’animaux in-situ, en milieu proche ou lointain. Nous avons proposé un modèle de régression non paramétrique et comparé 3 estimateurs non paramétriques, récursifs ou non, de la fonction de régression pour optimiser le meilleur estimateur. Nous avons ensuite caractérisé des rythmes biologiques, formalisé l’évolution d’états d’ouvertures, proposé des méthodes de discrimination de comportements, utilisé la méthode des shot-noises pour caractériser différents états d’ouverture-fermetures transitoires et développé une méthode originale de mesure de croissance en ligne. En génétique, nous avons abordé un cadre plus général de statistiques robustes pour l’analyse de liaison génétique. Nous avons développé des estimateurs robustes aux hypothèses de normalités et à la présence de valeurs aberrantes, nous avons aussi utilisé une approche statistique, où nous avons abordé la dépendance entre variables aléatoires via la théorie des copules. Nos principaux résultats ont montré l’intérêt pratique de ces estimateurs sur des données réelles de QTL et eQTL. / The development of robust and nonparametric approaches for the analysis and statistical treatment of high-dimensional data sets exhibiting high variability, as seen in the environmental and genetic fields, is instrumental. Here, we model complex biological data with application to the analysis of bivalves’ behavior and to linkage analysis. The application of mathematics to the analysis of mollusk bivalves’ behavior gave us the possibility to quantify and translate mathematically the animals’ behavior in situ, in close or far field. We proposed a nonparametric regression model and compared three nonparametric estimators (recursive or not) of the regression function to optimize the best estimator. We then characterized the biological rhythms, formalized the states of opening, proposed methods able to discriminate the behaviors, used shot-noise analysis to characterize various opening/closing transitory states and developed an original approach for measuring online growth. In genetics, we proposed a more general framework of robust statistics for linkage analysis. We developed estimators robust to distribution assumptions and the presence of outlier observations. We also used a statistical approach where the dependence between random variables is specified through copula theory. Our main results showed the practical interest of these estimators on real data for QTL and eQTL analysis.
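For context, a classical Nadaraya-Watson regression estimator and a recursive variant (per-observation bandwidths, so the sums can be updated on-line) are sketched below; these are generic textbook estimators rather than the three compared in the thesis, and the tuning constants are illustrative.

```python
import numpy as np

def nw(x0, X, Y, h):
    """Classical Nadaraya-Watson estimate of E[Y | X = x0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x0 - X) / h) ** 2)
    return (w * Y).sum() / w.sum()

def nw_recursive(x0, X, Y, c=0.5, power=0.2):
    """Recursive variant: observation i uses bandwidth h_i = c * i**(-power),
    so numerator and denominator can be accumulated as data stream in."""
    num = den = 0.0
    for i, (xi, yi) in enumerate(zip(X, Y), start=1):
        h = c * i ** (-power)
        w = np.exp(-0.5 * ((x0 - xi) / h) ** 2) / h
        num += w * yi
        den += w
    return num / den

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, 2000)
Y = np.sin(2 * X) + 0.3 * rng.normal(size=2000)     # nonlinear signal plus noise
for x0 in (-1.0, 0.0, 1.0):
    print(x0, round(nw(x0, X, Y, h=0.2), 3), round(nw_recursive(x0, X, Y), 3), round(float(np.sin(2 * x0)), 3))
```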
10

Essays on nonparametric estimation of asset pricing models

Dalderop, Jeroen Wilhelmus Paulus January 2018 (has links)
This thesis studies the use of nonparametric econometric methods to reconcile the empirical behaviour of financial asset prices with theoretical valuation models. The confrontation of economic theory with asset price data requires various functional form assumptions about the preferences and beliefs of investors. Nonparametric methods provide a flexible class of models that can prevent misspecification of agents’ utility functions or the distribution of asset returns. Evidence for potential nonlinearity is seen in the presence of non-Gaussian distributions and excessive volatility of stock returns, or non-monotonic stochastic discount factors in option prices. More robust model specifications are therefore likely to contribute to risk management and return predictability, and lend credibility to economists’ assertions. Each of the chapters in this thesis relaxes certain functional form assumptions that seem most important for understanding certain asset price data. Chapter 1 focuses on the state-price density in option prices, which confounds the nonlinearity in both the preferences and the beliefs of investors. To understand both sources of nonlinearity in equity prices, Chapter 2 introduces a semiparametric generalization of the standard representative-agent consumption-based asset pricing model. Chapter 3 returns to option prices to understand the relative importance of changes in the distribution of returns and in the shape of the pricing kernel. More specifically, Chapter 1 studies the use of noisy high-frequency data to estimate the time-varying state-price density implicit in European option prices. A dynamic kernel estimator of the conditional pricing function and its derivatives is proposed that can be used for model-free risk measurement. Infill asymptotic theory is derived that applies when the pricing function is either smoothly varying or driven by diffusive state variables. Trading times and moneyness levels are modelled by marked point processes to capture intraday trading patterns. A simulation study investigates the performance of the estimator using an iterated plug-in bandwidth in various scenarios. Empirical results using S&P 500 E-mini European option quotes find significant time-variation at intraday frequencies. An application to delta- and minimum-variance hedging further illustrates the use of the estimator. Chapter 2 proposes a semiparametric asset pricing model to measure how consumption and dividend policies depend on unobserved state variables, such as economic uncertainty and risk aversion. Under a flexible specification of the stochastic discount factor, the state variables are recovered from cross-sections of asset prices and volatility proxies, and the shape of the policy functions is identified from the pricing functions. The model leads to closed-form price-dividend ratios under polynomial approximations of the unknown functions and affine state variable dynamics. In the empirical application, uncertainty and risk aversion are separately identified from size-sorted stock portfolios, exploiting the heterogeneous impact of uncertainty on dividend policy across small and large firms. I find an asymmetric and convex response in consumption (-) and dividend growth (+) towards uncertainty shocks, which, together with moderate uncertainty aversion, can generate large leverage effects and divergence between macroeconomic and stock market volatility.
Chapter 3 studies the nonparametric identification and estimation of projected pricing kernels implicit in the pricing of options, the underlying asset, and a risk-free bond. The sieve minimum-distance estimator based on conditional moment restrictions avoids the need to compute ratios of estimated risk-neutral and physical densities, and leads to stable estimates even in regions with low probability mass. The conditional empirical likelihood (CEL) variant of the estimator is used to extract implied densities that satisfy the pricing restrictions while incorporating the forward-looking information from option prices. Moreover, I introduce density combinations in the CEL framework to measure the relative importance of changes in the physical return distribution and in the pricing kernel. The nonlinear dynamic pricing kernels can be used to understand return predictability, and provide model-free quantities that can be compared against those implied by structural asset pricing models.
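As a much-simplified illustration of recovering a projected pricing kernel from pricing restrictions, the sketch below fits a low-order polynomial m(r) by least squares on instrumented sample moments for the underlying asset and a risk-free bond. It is far cruder than the sieve minimum-distance and CEL estimators of the chapter (no optimal weighting, no option moments), and the simulated returns, instrument choice and polynomial order are arbitrary.

```python
import numpy as np

def fit_projected_kernel(R, Rf, X, degree=2):
    """Fit m(r) = sum_j theta_j r^j so that the sample analogues of
    E[a(X_t) * (m(R_{t+1}) * p_{t+1} - 1)] = 0 hold as closely as possible
    (least squares), for payoffs p = R (underlying) and p = Rf (bond),
    with instruments a(X) = (1, X).  The moments are linear in theta."""
    T = len(R)
    basis = np.vander(R, degree + 1, increasing=True)     # columns r^0 .. r^degree
    rows, rhs = [], []
    for payoff in (R, np.full(T, Rf)):
        for a in (np.ones(T), X):
            rows.append((a[:, None] * payoff[:, None] * basis).mean(axis=0))
            rhs.append(a.mean())
    A, b = np.array(rows), np.array(rhs)
    theta, *_ = np.linalg.lstsq(A, b, rcond=None)
    return theta, A @ theta - b                           # coefficients, remaining moment errors

rng = np.random.default_rng(0)
X = rng.normal(size=5000)                                 # conditioning variable
R = np.exp(0.02 + 0.05 * X + 0.15 * rng.normal(size=5000))  # toy gross returns
theta, resid = fit_projected_kernel(R, Rf=1.01, X=X)
print(np.round(theta, 2), np.round(resid, 4))             # fitted coefficients and sample pricing errors
```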
