  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Prédiction, inférence sélective et quelques problèmes connexes [Prediction, selective inference and some related problems]

Yadegari, Iraj January 2017 (has links)
Abstract: We study the problem of point estimation and predictive density estimation of the mean of a selected population, obtaining novel developments that include bias analysis, decomposition of risk, and problems with restricted parameters (Chapter 2). We propose predictive density estimators that are efficient under Kullback-Leibler and Hellinger losses (Chapter 3), improving on plug-in procedures via a dual loss and via a variance expansion scheme. Finally (Chapter 4), we present findings on improving on the maximum likelihood estimator (MLE) of a bounded normal mean under a class of loss functions, including reflected normal loss, with implications for predictive density estimation. Namely, we give conditions on the loss and on the width of the parameter space under which the Bayes estimator with respect to the uniform prior on the boundary dominates the MLE.
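The "variance expansion" idea mentioned in the abstract can be illustrated with a classical Gaussian example (a sketch of the general phenomenon, not the thesis's own construction): with X ~ N(θ, σ²) observed and a future Y ~ N(θ, σ²) to be predicted, the plug-in predictive density N(X, σ²) is dominated under Kullback-Leibler loss by the variance-expanded density N(X, 2σ²). The simulation below (numpy; the variable names and constants are illustrative assumptions) estimates both KL risks by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 1.0
theta = 0.7          # arbitrary true mean; the KL risks here do not depend on it
n = 200_000          # Monte Carlo replications

x = rng.normal(theta, np.sqrt(sigma2), size=n)   # observed X in each replication

def kl_normal(mu_true, var_true, mu_hat, var_hat):
    """KL( N(mu_true, var_true) || N(mu_hat, var_hat) ) for scalar normals."""
    return 0.5 * (var_true / var_hat
                  + (mu_true - mu_hat) ** 2 / var_hat
                  - 1.0
                  + np.log(var_hat / var_true))

# Plug-in predictive density N(X, sigma2): risk = E[ (theta-X)^2 / (2 sigma2) ] = 0.5
risk_plugin = kl_normal(theta, sigma2, x, sigma2).mean()

# Variance-expanded density N(X, 2 sigma2): risk = 0.25 + 0.5*(ln 2 - 0.5) ~ 0.347
risk_expanded = kl_normal(theta, sigma2, x, 2.0 * sigma2).mean()
```

The expanded density accepts a small penalty for being "too wide" at the true mean in exchange for much better behavior when X lands far from θ; under KL loss that trade is uniformly favorable here, which is the intuition behind improving plug-in procedures by inflating the variance.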
2

NONPARAMETRIC EMPIRICAL BAYES SIMULTANEOUS ESTIMATION FOR MULTIPLE VARIANCES

KWON, YEIL January 2018 (has links)
Shrinkage estimation has proven to be very useful when dealing with a large number of mean parameters. In this dissertation, we consider the problem of simultaneous estimation of multiple variances and construct a shrinkage-type, nonparametric estimator. We take the nonparametric empirical Bayes approach, starting with an arbitrary prior on the variances. Under an invariant loss function, the resulting Bayes estimator relies on the marginal cumulative distribution function of the sample variances. Replacing the marginal cdf by the empirical distribution function, we obtain a Nonparametric Empirical Bayes estimator for multiple Variances (NEBV). The proposed estimator converges to the corresponding Bayes version uniformly over a large set; consequently, the NEBV works well in a post-selection setting. We then apply the NEBV to construct confidence intervals for mean parameters in a post-selection setting. It is shown that the intervals based on the NEBV are the shortest among all intervals that guarantee a desired coverage probability. Through real data analysis, we further show that the NEBV-based intervals lead to the smallest number of discordances, a desirable property in the face of the current "replication crisis". / Statistics
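To give a flavor of why shrinking many sample variances toward each other helps, here is a deliberately simpler moment-based sketch, not the dissertation's NEBV (which is built from the marginal cdf of the sample variances): a James-Stein-type shrinkage of the log sample variances toward their grand mean. The setup (lognormal prior on the true variances, chi-squared sampling noise, and all constants) is an illustrative assumption; the bias and variance of the log-scale noise are estimated by Monte Carlo to keep the sketch numpy-only.

```python
import numpy as np

rng = np.random.default_rng(1)
p, nu = 500, 6                                   # p variances, nu df each

true_var = rng.lognormal(mean=0.0, sigma=1.0, size=p)
s2 = true_var * rng.chisquare(nu, size=p) / nu   # sample variances S_i^2

# On the log scale, log(S^2) = log(sigma^2) + noise, where the noise has a
# known distribution (log of chi^2_nu / nu).  Estimate its mean and variance
# by Monte Carlo instead of using digamma/trigamma functions.
noise = np.log(rng.chisquare(nu, size=1_000_000) / nu)
bias, A = noise.mean(), noise.var()

d = np.log(s2) - bias                            # de-biased log variances
dbar = d.mean()
ss = np.sum((d - dbar) ** 2)
shrink = max(0.0, 1.0 - (p - 3) * A / ss)        # James-Stein-type factor
var_shrunk = np.exp(dbar + shrink * (d - dbar))  # shrink toward the grand mean

# Compare squared error on the log scale: raw sample variances vs. shrunk ones.
err_naive = np.mean((np.log(s2) - np.log(true_var)) ** 2)
err_shrunk = np.mean((np.log(var_shrunk) - np.log(true_var)) ** 2)
```

With only a handful of degrees of freedom per variance, each individual S_i² is noisy, and pooling strength across the p coordinates reduces the aggregate error; the NEBV pursues the same goal nonparametrically, without assuming any particular prior on the variances.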
