ROBUST INFERENCE FOR HETEROGENEOUS TREATMENT EFFECTS WITH APPLICATIONS TO NHANES DATA
Ran Mo (20329047), 10 January 2025
Estimating the conditional average treatment effect (CATE) using data from the National Health and Nutrition Examination Survey (NHANES) provides valuable insight into the heterogeneous impacts of health interventions across diverse populations, informing public health strategies that account for individual differences in health behaviors and conditions. However, estimating the CATE with NHANES data faces challenges common to observational studies, such as outliers, heavy-tailed error distributions, skewed data, model misspecification, and the curse of dimensionality. To address these challenges, this dissertation presents three consecutive studies that explore robust methods for estimating heterogeneous treatment effects.

The first study introduces an outlier-resistant estimation method based on M-estimation, replacing the \(L_2\) loss in the traditional inverse propensity weighting (IPW) method with a robust loss function. To assess the robustness of this approach, we investigate its influence function and breakdown point. We also derive the asymptotic properties of the proposed estimator, enabling valid inference for the outlier-resistant CATE estimator.

The method proposed in the first study relies on a symmetry assumption commonly required by standard outlier-resistant methods. To remove this assumption while maintaining unbiasedness, the second study employs the adaptive Huber loss, which adjusts the robustification parameter with the sample size to achieve an optimal tradeoff between bias and robustness. The robustification parameter is derived explicitly from theoretical results, making time-consuming data-driven selection unnecessary. We also derive concentration and Berry-Esseen inequalities to precisely quantify the convergence rates and finite-sample performance.

In both previous studies, the propensity score is estimated parametrically, which is sensitive to model misspecification. The third study extends the robust estimator from the first study by plugging in a kernel-based nonparametric estimate of the propensity score with sufficient dimension reduction (SDR). Specifically, we adopt robust minimum average variance estimation (rMAVE) for the central mean subspace under the potential outcome framework. Together with higher-order kernels, the resulting CATE estimator gains efficiency.

In all three studies, theoretical results are derived and confidence intervals are constructed for inference based on these findings. The properties of the proposed estimators are verified through extensive simulations. Applying the methods to NHANES data further validates the estimators' ability to handle diverse and contaminated datasets, demonstrating their effectiveness in real-world settings.
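As a rough illustration of the approach described in the first two studies, the sketch below combines an IPW pseudo-outcome regression with a Huber loss whose robustification parameter grows with the sample size. It is a minimal sketch under stated assumptions: the pseudo-outcome formulation, the logistic propensity model, the helper names (fit_propensity, ipw_pseudo_outcome, huber_cate), and the scaling rule for the robustification parameter are all illustrative and not taken from the dissertation.

```python
# Minimal sketch: outlier-resistant CATE estimation by replacing the L2 loss in
# an IPW-based regression with a Huber loss. Names and the adaptive scaling
# rule are illustrative assumptions, not the dissertation's actual code.
import numpy as np
from scipy.optimize import minimize
from sklearn.linear_model import LogisticRegression

def fit_propensity(X, T):
    """Parametric (logistic) propensity score, as in the first two studies."""
    return LogisticRegression(max_iter=1000).fit(X, T).predict_proba(X)[:, 1]

def ipw_pseudo_outcome(Y, T, e):
    """IPW transform whose conditional mean given X equals the CATE."""
    return Y * (T - e) / (e * (1.0 - e))

def huber_loss(r, c):
    """Huber loss with robustification parameter c."""
    return np.where(np.abs(r) <= c, 0.5 * r**2, c * np.abs(r) - 0.5 * c**2)

def huber_cate(X, Y, T, c=None):
    """Fit a linear CATE model tau(x) = [1, x] @ beta by minimizing Huber loss."""
    n = len(Y)
    e = np.clip(fit_propensity(X, T), 1e-3, 1.0 - 1e-3)
    Z = ipw_pseudo_outcome(Y, T, e)
    Xd = np.column_stack([np.ones(n), X])
    if c is None:
        # Adaptive Huber idea: let the robustification parameter grow with n so
        # that the truncation bias vanishes asymptotically. A sqrt(n/(d+log n))
        # scaling is a common rule of thumb from the adaptive Huber literature;
        # the dissertation derives its parameter explicitly from theory.
        c = np.std(Z) * np.sqrt(n / (Xd.shape[1] + np.log(n)))
    beta0 = np.linalg.lstsq(Xd, Z, rcond=None)[0]          # L2 warm start
    obj = lambda beta: huber_loss(Z - Xd @ beta, c).sum()
    return minimize(obj, beta0, method="BFGS").x
```

Letting the robustification parameter tend to infinity recovers the ordinary \(L_2\)/IPW fit, so this single parameter controls the bias-robustness tradeoff discussed above.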
[en] COMBINING STRATEGIES FOR ESTIMATION OF TREATMENT EFFECTS / [pt] COMBINANDO ESTRATÉGIAS PARA ESTIMAÇÃO DE EFEITOS DE TRATAMENTO
RAFAEL DE CARVALHO CAYRES PINTO, 19 January 2018
[pt] An important tool in the evaluation of economic policies is the estimation of the average effect of a program or treatment on a variable of interest. The main difficulty in this calculation is that assignment of the treatment to potential participants is generally not random, which causes selection bias when ignored. One way to address this problem is to assume that the econometrician observes a set of characteristics that determine participation, up to a strictly random component. Under this assumption, known as Ignorability, semiparametric estimation methods have been developed, including imputation of counterfactual values and sample reweighting. Both are consistent and can asymptotically attain the semiparametric efficiency bound. However, in the samples typically available, the performance of these methods is not always satisfactory. The goal of this work is to study how combining the two strategies can produce estimators with better small-sample properties. To this end, we consider two ways of integrating these approaches, taking as theoretical reference the doubly robust estimation literature developed by James Robins and co-authors. We analyze their properties and discuss why they may outperform the isolated use of each of the component techniques. Finally, we compare, in a Monte Carlo exercise, the performance of these estimators with those of imputation and reweighting. The results show that combining strategies can reduce bias and variance, but this depends on how the combination is implemented. We conclude that the choice of smoothing parameters is decisive for estimation performance in moderately sized samples.

/ [en] Estimation of the mean treatment effect is an important tool for evaluating economic policy. The main difficulty in this calculation is the nonrandom assignment of potential participants to treatment, which leads to selection bias when ignored. A solution to this problem is to suppose that the econometrician observes a set of covariates that determine participation, except for a strictly random component. Under this assumption, known as Ignorability, semiparametric methods have been developed, including imputation of counterfactual outcomes and sample reweighting. Both are consistent and can asymptotically achieve the semiparametric efficiency bound. However, at the sample sizes commonly available, their performance is not always satisfactory. The goal of this dissertation is to study how combining these strategies can lead to better estimation in small samples. We consider two different ways of merging these methods, based on the doubly robust inference literature developed by James Robins and his co-authors, analyze their properties, and discuss why they would outperform the isolated use of each of their components. Finally, we compare the proposed estimators to imputation and reweighting in a Monte Carlo exercise. The results show that while combined strategies may reduce bias and variance, this depends on how they are implemented. We conclude that the choice of smoothing parameters is critical to obtaining good estimates in moderately sized samples.
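As a rough sketch of the doubly robust idea referenced above (combining imputation of counterfactual outcomes with sample reweighting), the snippet below implements a textbook augmented IPW (AIPW) estimator of the average treatment effect. The parametric propensity and outcome models and the name aipw_ate are illustrative assumptions; the dissertation studies combinations in which these components are nonparametric, which is where the smoothing parameters highlighted in the conclusion enter.

```python
# Minimal sketch of a doubly robust (AIPW) estimator combining imputation and
# reweighting. Model choices and names here are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

def aipw_ate(X, Y, T):
    """Augmented IPW estimate of the average treatment effect and its std. error."""
    n = len(Y)
    # Reweighting component: estimated propensity score, clipped for stability.
    e = LogisticRegression(max_iter=1000).fit(X, T).predict_proba(X)[:, 1]
    e = np.clip(e, 1e-3, 1.0 - 1e-3)
    # Imputation component: outcome regressions fit separately by treatment arm.
    m1 = LinearRegression().fit(X[T == 1], Y[T == 1]).predict(X)
    m0 = LinearRegression().fit(X[T == 0], Y[T == 0]).predict(X)
    # Imputation term plus IPW-weighted residual corrections; consistent if
    # either the propensity model or the outcome models are correctly specified.
    psi = m1 - m0 + T * (Y - m1) / e - (1 - T) * (Y - m0) / (1.0 - e)
    return psi.mean(), psi.std(ddof=1) / np.sqrt(n)
```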