1

GENERAL-PURPOSE STATISTICAL INFERENCE WITH DIFFERENTIAL PRIVACY GUARANTEES

Zhanyu Wang (13893375) 06 December 2023 (has links)
<p dir="ltr">Differential privacy (DP) uses a probabilistic framework to measure the level of privacy protection of a mechanism that releases data analysis results to the public. Although DP is widely used by both government and industry, there is still a lack of research on statistical inference under DP guarantees. On the one hand, existing DP mechanisms mainly aim to extract dataset-level information rather than population-level information. On the other hand, DP mechanisms introduce calibrated noise into the released statistics, which often makes the sampling distributions more complex and less tractable than their non-private counterparts. This dissertation aims to provide general-purpose methods for statistical inference, such as confidence intervals (CIs) and hypothesis tests (HTs), that satisfy the DP guarantees.</p><p dir="ltr">In the first part of the dissertation, we examine a DP bootstrap procedure that releases multiple private bootstrap estimates to construct DP CIs. We present new DP guarantees for this procedure and propose to use deconvolution with DP bootstrap estimates to derive CIs for inference tasks such as population mean, logistic regression, and quantile regression. Our method achieves the nominal coverage level in both simulations and real-world experiments and offers the first approach to private inference for quantile regression.</p><p dir="ltr">In the second part of the dissertation, we propose to use the simulation-based "repro sample" approach to produce CIs and HTs based on DP statistics. Our methodology has finite-sample guarantees and can be applied to a wide variety of private inference problems. It appropriately accounts for biases introduced by DP mechanisms (such as by clamping) and improves over other state-of-the-art inference methods in terms of the coverage and type I error of the private inference.</p><p dir="ltr">In the third part of the dissertation, we design a debiased parametric bootstrap framework for DP statistical inference. We propose the adaptive indirect estimator, a novel simulation-based estimator that is consistent and corrects the clamping bias in the DP mechanisms. We also prove that our estimator has the optimal asymptotic variance among all well-behaved consistent estimators, and that the parametric bootstrap results based on our estimator are consistent. Simulation studies show that our framework produces valid DP CIs and HTs in finite-sample settings, and that it is more efficient than other state-of-the-art methods.</p>
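To make the DP bootstrap idea concrete, the following is a minimal sketch of releasing bootstrap means for a clamped population mean through the Gaussian mechanism and forming a percentile CI. It is illustrative only: it is not the dissertation's deconvolution-based procedure, the even budget split and the (ε, δ) accounting are deliberately naive, and all parameter choices (clamping range, δ, number of bootstrap releases) are assumptions.

```python
import numpy as np

def dp_bootstrap_mean_ci(data, eps_total=1.0, n_boot=200, clamp=(0.0, 1.0),
                         alpha=0.05, seed=None):
    """Sketch of a DP bootstrap CI for a population mean.

    Each bootstrap mean is clamped and released via the Gaussian mechanism;
    the total privacy budget is split evenly across the releases (a crude
    composition bound, used here only to keep the sketch short).
    """
    rng = np.random.default_rng(seed)
    lo, hi = clamp
    n = len(data)
    sensitivity = (hi - lo) / n          # sensitivity of a clamped mean
    eps = eps_total / n_boot             # naive even split of the budget
    delta = 1e-5                         # illustrative choice
    sigma = sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / eps
    releases = []
    for _ in range(n_boot):
        sample = rng.choice(data, size=n, replace=True)
        m = np.mean(np.clip(sample, lo, hi))
        releases.append(m + rng.normal(0.0, sigma))   # Gaussian mechanism
    lo_ci, hi_ci = np.quantile(releases, [alpha / 2, 1 - alpha / 2])
    return lo_ci, hi_ci

data = np.random.default_rng(0).uniform(0.3, 0.7, size=2000)
ci = dp_bootstrap_mean_ci(data, seed=1)
```

The percentile interval here is wide because the per-release budget is tiny; the deconvolution step described above is precisely what recovers a sharper sampling distribution from such noisy releases.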
2

Estimateur neuronal de ratio pour l'inférence de la constante de Hubble à partir de lentilles gravitationnelles fortes

Campeau-Poirier, Ève 12 1900 (has links)
The two main methods of measuring the Hubble constant, the current expansion rate of the Universe, find different values. One of them relies heavily on the currently accepted cosmological model describing the cosmos; the other rests on a direct measurement. The disagreement therefore arouses suspicions about the existence of new physics outside this model. If another method, independent of the two in conflict, supported one of the two values, it would guide cosmologists' efforts to resolve the tension. Strong gravitational lensing is among the candidate methods. This phenomenon occurs when a light source aligns with a massive object along a telescope's line of sight. When crossing the curved space-time in the vicinity of the mass, the light deviates from its trajectory along several paths, resulting in a distorted, magnified, and amplified image. In the case of a point light source, two or four images stand out clearly. If this source is also variable, each of its fluctuations appears at different moments in each image, because each path has a different length. The time delays between the image signals depend intimately on the Hubble constant. This approach faces many challenges, however. First, it requires several days for specialists to run the Markov chain Monte Carlo (MCMC) method that evaluates the parameters of a single lensing system at a time.
With the detection of thousands of lensing systems forecast by the Rubin Observatory in the coming years, this approach is impractical. It also introduces simplifications that risk biasing the inference, which runs counter to the objective of shedding light on the discrepancy between the Hubble constant measurements. This thesis presents a simulation-based inference strategy to address these issues. Several previous studies have accelerated lens modeling through machine learning. Our approach complements their efforts by training a neural ratio estimator to determine the distribution of the Hubble constant from lens-modeling products and time-delay measurements. The neural ratio estimator's results agree with those of the traditional analysis on simple simulations, have acceptable statistical consistency, are unbiased, and are obtained significantly faster.
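The likelihood-to-evidence ratio estimation underlying this approach can be sketched on a toy problem: train a classifier to distinguish dependent (θ, x) pairs from independent ones, so that its logit approximates log p(x|θ)/p(x), then sweep θ at the observed x to trace the posterior shape. Everything below is an assumption-laden stand-in — a one-dimensional Gaussian "simulator" replaces the real lens-modeling and time-delay pipeline, and a quadratic-feature logistic regression replaces the neural network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the lens-modeling + time-delay pipeline (illustrative):
# the "observable" is H0 plus Gaussian measurement noise.
def simulate(h0):
    return h0 + rng.normal(0.0, 5.0, size=h0.shape)

def features(h0, x):
    # Quadratic features let a linear classifier represent the Gaussian
    # log-likelihood-ratio, which is quadratic in (h0, x).
    return np.column_stack([np.ones_like(h0), h0, x, h0**2, x**2, h0 * x])

n = 5000
h0 = rng.uniform(50.0, 100.0, size=n)          # draws from a flat prior
x = simulate(h0)

# Class 1: joint (dependent) pairs; class 0: marginal pairs (x shuffled).
X = np.vstack([features(h0, x), features(h0, rng.permutation(x))])
y = np.concatenate([np.ones(n), np.zeros(n)])

# Standardize, then fit logistic regression by plain gradient descent.
mu, sd = X.mean(0), X.std(0)
sd[sd == 0] = 1.0
Z = (X - mu) / sd
w = np.zeros(Z.shape[1])
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Z @ w))
    w -= 0.1 * Z.T @ (p - y) / len(y)

# The classifier logit estimates log r(h0, x) = log p(x|h0)/p(x), so with a
# flat prior, sweeping h0 at the observed x traces the posterior shape.
x_obs = 72.0
grid = np.linspace(50.0, 100.0, 201)
F = (features(grid, np.full_like(grid, x_obs)) - mu) / sd
log_ratio = F @ w
h0_map = grid[np.argmax(log_ratio)]
```

Amortization is the point of the design: once the estimator is trained, evaluating the ratio on a new system is a forward pass over a grid rather than a days-long MCMC run per lens.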
