11

On New Constructive Tools in Bayesian Nonparametric Inference

Al Labadi, Luai January 2012
Bayesian nonparametric inference requires the construction of priors on infinite-dimensional spaces such as the space of cumulative distribution functions and the space of cumulative hazard functions. Well-known priors on the space of cumulative distribution functions are the Dirichlet process, the two-parameter Poisson-Dirichlet process and the beta-Stacy process; the beta process, on the other hand, is a popular prior on the space of cumulative hazard functions. This thesis is divided into three parts. In the first part, we tackle the problem of sampling from the above-mentioned processes. Sampling from these processes plays a crucial role in many applications of Bayesian nonparametric inference; however, exact samples from these processes are impossible to obtain, and the existing algorithms are either slow or very complex and may be difficult for many users to apply. We derive new approximation techniques for simulating the above processes. These new approximations provide simple, yet efficient, procedures for simulating these important processes. We compare the efficiency of the new approximations to several other well-known approximations and demonstrate a significant improvement. In the second part, we develop explicit expressions for calculating the Kolmogorov, Lévy and Cramér-von Mises distances between the Dirichlet process and its base measure. The derived expression for each distance is used to select the concentration parameter of a Dirichlet process. We also propose a Bayesian goodness-of-fit test for simple and composite hypotheses for non-censored and censored observations. Illustrative examples and simulation results are included. Finally, we describe the relationship between frequentist and Bayesian nonparametric statistics. We show that, when the concentration parameter is large, the two-parameter Poisson-Dirichlet process and its corresponding quantile process share many asymptotic properties with the frequentist empirical process and the frequentist quantile process. Among these properties are the functional central limit theorem, the strong law of large numbers and the Glivenko-Cantelli theorem.
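The thesis's own approximation schemes are not reproduced here, but the standard truncated stick-breaking (Sethuraman) construction gives a feel for what approximately sampling a Dirichlet process means in practice. The sketch below is a minimal illustration under that assumption; the function name, the truncation level and the choice of base measure are illustrative, not taken from the thesis.

```python
import numpy as np

def sample_dirichlet_process(alpha, base_sampler, n_atoms=500, rng=None):
    """Approximate draw from DP(alpha, H) via truncated stick-breaking (illustrative, not the thesis's method)."""
    rng = np.random.default_rng(rng)
    betas = rng.beta(1.0, alpha, size=n_atoms)
    # Stick-breaking weights: w_k = beta_k * prod_{j<k} (1 - beta_j)
    weights = betas * np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    weights /= weights.sum()              # renormalise after truncation
    atoms = base_sampler(n_atoms, rng)    # atom locations drawn i.i.d. from the base measure H
    return atoms, weights

# Example: base measure H = N(0, 1), concentration parameter alpha = 5
atoms, weights = sample_dirichlet_process(5.0, lambda n, rng: rng.normal(size=n))
# Value of the (approximate) random CDF at x = 0
F_at_0 = weights[atoms <= 0.0].sum()
```

A larger truncation level n_atoms brings the discrete approximation closer to a true Dirichlet process draw at the cost of more computation; comparing such approximations in terms of speed and accuracy is the kind of efficiency question the first part of the thesis addresses.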
12

Information on a default time : Brownian bridges on a stochastic intervals and enlargement of filtrations / Information sur le temps de défaut : ponts browniens sur des intervalles stochastiques et grossissement de filtrations

Bedini, Matteo 12 October 2012 (has links)
In this PhD thesis, the information process concerning a default time τ in a credit risk model is described by a Brownian bridge over the random time interval [0, τ]. Such a bridge process is characterised as better suited for modelling than the classical model based on the indicator function I[0,τ]. After the study of the related Bayes formulas, this approach to modelling information about the default time is combined with other financial information. This is done with the help of the theory of enlargement of filtrations, where the filtration generated by the information process is enlarged with a reference filtration modelling other information not directly associated with the default. Particular attention is paid to the classification of the default time with respect to the minimal filtration as well as to the enlarged filtration. Sufficient conditions under which τ is totally inaccessible are discussed, and an example is given in which τ avoids the stopping times of the reference filtration, is totally inaccessible with respect to its own filtration and is predictable with respect to the enlarged filtration. Finally, common financial contracts such as defaultable bonds and credit default swaps are considered in the setting described above.
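As a rough illustration of the modelling object, the sketch below simulates one path of a Brownian bridge on a random interval [0, τ], using the representation β_t = W_t − (t/τ) W_τ for t ≤ τ and β_t = 0 afterwards. This is a generic simulation sketch, not code from the thesis; the function name, the discretisation and the exponential law chosen for τ are assumptions made only for the example.

```python
import numpy as np

def information_process_path(tau, T=1.0, n_steps=1000, rng=None):
    """One simulated path of a Brownian bridge on [0, tau], frozen at 0 after tau (illustrative sketch)."""
    rng = np.random.default_rng(rng)
    t = np.linspace(0.0, T, n_steps + 1)
    dt = t[1] - t[0]
    # Standard Brownian motion on [0, T]
    W = np.concatenate(([0.0], np.cumsum(rng.normal(scale=np.sqrt(dt), size=n_steps))))
    beta = np.zeros_like(t)
    on_bridge = t <= tau
    W_tau = np.interp(tau, t, W)    # W evaluated (approximately) at time tau
    beta[on_bridge] = W[on_bridge] - (t[on_bridge] / tau) * W_tau
    return t, beta

# Example: default time tau ~ Exp with mean 0.7, capped at the horizon T = 1
rng = np.random.default_rng(0)
tau = min(rng.exponential(scale=0.7), 1.0)
t, beta = information_process_path(tau, rng=rng)
```

Observing such a path carries progressively more information about τ than the indicator I[0,τ], which only reveals whether the default has already occurred; this is the sense in which the bridge process is the richer model studied in the thesis.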
13

Semi-analytische und simulative Kreditrisikomessung synthetischer Collateralized Debt Obligations bei heterogenen Referenzportfolios / Unternehmenswertorientierte Modellentwicklung und transaktionsbezogene Modellanwendungen / Semi-Analytical and Simulative Credit Risk Measurement of Synthetic Collateralized Debt Obligations with Heterogeneous Reference Portfolios / A Modified Asset-Value Model and Transaction-Based Model Applications

Jortzik, Stephan 03 March 2006 (has links)
No description available.
14

Advanced Modeling of Longitudinal Spectroscopy Data

Kundu, Madan Gopal January 2014
Indiana University-Purdue University Indianapolis (IUPUI) / Magnetic resonance (MR) spectroscopy is a neuroimaging technique widely used to quantify the concentration of important metabolites in brain tissue. An imbalance in the concentration of brain metabolites has been found to be associated with the development of neurological impairment, and there has been an increasing trend of using MR spectroscopy as a diagnostic tool for neurological disorders. We established statistical methodology to analyze data obtained from MR spectroscopy in the context of HIV-associated neurological disorder. First, we developed novel methodology to study the association of a marker of neurological disorder with the MR spectrum from the brain and how this association evolves with time. The problem fits into the framework of a scalar-on-function regression model with the individual spectrum as the functional predictor. We extended one of the existing cross-sectional scalar-on-function regression techniques to the longitudinal set-up. Advantages of the proposed method include: (1) the ability to model a flexible time-varying association between the response and the functional predictor, and (2) the ability to incorporate prior information. The second part of the research studies the influence of clinical and demographic factors on the progression of brain metabolites over time. To understand the influence of these factors in a fully non-parametric way, we proposed the LongCART algorithm to construct regression trees with longitudinal data. Such a regression tree helps identify smaller subpopulations (characterized by baseline factors) with differential longitudinal profiles and hence helps identify the influence of baseline factors. Advantages of the LongCART algorithm include: (1) it maintains the type-I error rate in determining the best split, (2) it substantially reduces computation time, and (3) it is applicable even when observations are taken at subject-specific time points. Finally, we carried out an in-depth analysis of longitudinal changes in brain metabolite concentrations in three brain regions, namely white matter, gray matter and basal ganglia, in chronically infected HIV patients enrolled in the HIV Neuroimaging Consortium study. We studied the influence of important baseline factors (clinical and demographic) on these longitudinal profiles of brain metabolites using the LongCART algorithm in order to identify subgroups of patients at higher risk of neurological impairment. / Partial research support was provided by the National Institutes of Health grants U01-MH083545, R01-CA126205 and U01-CA086368
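The thesis's longitudinal scalar-on-function estimator and the LongCART algorithm are not reproduced here, but the cross-sectional starting point, regressing a scalar outcome on a functional predictor through a basis expansion with a ridge penalty, can be sketched in a few lines. Everything below (the Gaussian-bump basis standing in for splines, the variable names, the simulated spectra) is an illustrative assumption, not the method of the thesis.

```python
import numpy as np

def scalar_on_function_fit(X, y, n_basis=8, ridge=1e-2):
    """Fit y_i ~ integral of X_i(s) * beta(s) ds with beta expanded in a small smooth basis.

    X: (n_subjects, n_grid) spectra discretised on a common grid over [0, 1].
    y: (n_subjects,) scalar outcome, e.g. a marker of neurological impairment.
    """
    n, p = X.shape
    s = np.linspace(0.0, 1.0, p)
    centers = np.linspace(0.0, 1.0, n_basis)
    # Gaussian bumps as a stand-in for a proper B-spline basis
    B = np.exp(-0.5 * ((s[:, None] - centers[None, :]) / 0.1) ** 2)   # (p, n_basis)
    Z = X @ B / p                                  # numerical integral of X_i(s) * basis_k(s)
    coef = np.linalg.solve(Z.T @ Z + ridge * np.eye(n_basis), Z.T @ y)
    return B @ coef                                # estimated coefficient function on the grid

# Toy usage with simulated spectra
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 200))
true_beta = np.sin(2 * np.pi * np.linspace(0.0, 1.0, 200))
y = X @ true_beta / 200 + rng.normal(scale=0.1, size=50)
beta_hat = scalar_on_function_fit(X, y)
```

The longitudinal extension described in the abstract additionally lets the coefficient function vary with time and admits prior information, and LongCART then partitions subjects by baseline covariates; both go well beyond this sketch.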
