
Signal-to-noise ratio aware minimaxity and its asymptotic expansion

Since its development, the minimax framework has been one of the cornerstones of theoretical statistics and has contributed to the popularity of many well-known estimators, such as regularized M-estimators for high-dimensional problems. In this thesis, we first show, through the example of the sparse Gaussian sequence model, that theoretical results under the classical minimax framework are insufficient for explaining empirical observations. In particular, both the hard and soft thresholding estimators are (asymptotically) minimax, yet in practice they often exhibit suboptimal performance at various signal-to-noise ratio (SNR) levels. We demonstrate that this discrepancy can be resolved if the signal-to-noise ratio is taken into account in the construction of the parameter space; we call the resulting framework signal-to-noise ratio aware minimaxity. We then showcase how higher-order asymptotics can be used to obtain accurate approximations of the SNR-aware minimax risk and to discover minimax estimators. The theoretical findings obtained from this refined minimax framework provide new insights and practical guidance for the estimation of sparse signals.
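
To illustrate the gap described above, here is a minimal Monte Carlo sketch, not taken from the thesis, comparing the empirical squared-error risk of hard and soft thresholding in the sparse Gaussian sequence model at several SNR levels. The sparsity level, signal magnitudes, and the universal threshold used below are illustrative assumptions.

    # Sparse Gaussian sequence model y_i = theta_i + z_i, z_i ~ N(0, 1).
    # Illustrative settings only; not the thesis's construction.
    import numpy as np

    rng = np.random.default_rng(0)

    def hard_threshold(y, t):
        """Keep observations exceeding the threshold, zero out the rest."""
        return y * (np.abs(y) > t)

    def soft_threshold(y, t):
        """Shrink observations toward zero by the threshold amount."""
        return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

    def empirical_risk(estimator, theta, t, n_rep=2000):
        """Average squared-error loss over repeated noisy observations."""
        risks = []
        for _ in range(n_rep):
            y = theta + rng.standard_normal(theta.size)
            risks.append(np.sum((estimator(y, t) - theta) ** 2))
        return np.mean(risks)

    n, k = 1000, 20                    # ambient dimension and sparsity (illustrative)
    t = np.sqrt(2 * np.log(n))         # universal threshold, a common default
    for snr in [1.0, 2.0, 4.0, 8.0]:   # nonzero signal magnitude plays the role of SNR
        theta = np.zeros(n)
        theta[:k] = snr
        print(f"SNR {snr:>4}: hard {empirical_risk(hard_threshold, theta, t):8.1f}  "
              f"soft {empirical_risk(soft_threshold, theta, t):8.1f}")

Running a sketch like this shows the two estimators trading places as the signal magnitude varies, which is the kind of SNR-dependent behavior the classical minimax framework does not capture.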

In a broader context, we investigate the same problem for sparse linear regression. We assume a random design and allow the feature matrix to be high-dimensional, with 𝑿 ∈ ℝ^{𝑛×𝑝} and 𝑝 ≫ 𝑛, which adds an extra layer of difficulty to the estimation of the coefficients. Previous studies have largely relied on results expressed in terms of rate-minimaxity, where estimators are compared based on the order of the minimax risk without specifying the precise constant in the approximation. This lack of precision contributes to the notable gap between the theoretical optimality of asymptotically minimax estimators and their empirically observed suboptimality. This thesis addresses the gap by first refining the classical minimax result, providing a characterization of the constant in the first-order approximation. Then, following the SNR-aware minimax framework introduced above, we derive improved approximations of the minimax risk at different SNR levels. Notably, these refined results align better with empirical findings than the classical minimax results do. As showcased in the thesis, the enhanced SNR-aware minimax framework not only offers a more accurate depiction of sparse estimation but also unveils the crucial role of SNR in the problem, which emerges as a pivotal factor in assessing the optimality of estimators.
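
For the regression setting, the following is a minimal sketch, under assumed dimensions and a commonly used penalty scale that the abstract does not specify, of the high-dimensional random-design setup 𝒚 = 𝑿𝜷 + noise with 𝑝 ≫ 𝑛, fitted with the Lasso (a regularized M-estimator) via iterative soft thresholding (ISTA).

    # Sparse linear regression with a Gaussian random design and p >> n.
    # Dimensions, sparsity, and the penalty level lam are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    n, p, k = 200, 1000, 10                   # p >> n regime
    X = rng.standard_normal((n, p))           # Gaussian random design
    beta = np.zeros(p)
    beta[:k] = 3.0                            # k-sparse coefficient vector
    y = X @ beta + rng.standard_normal(n)     # unit-variance noise

    # Lasso objective: (1/(2n)) * ||y - X b||^2 + lam * ||b||_1
    lam = np.sqrt(2 * np.log(p) / n)          # a common theory-motivated penalty scale
    step = n / np.linalg.norm(X, 2) ** 2      # 1 / Lipschitz constant of the gradient

    b = np.zeros(p)
    for _ in range(1000):                     # ISTA: gradient step, then soft threshold
        grad = X.T @ (X @ b - y) / n
        z = b - step * grad
        b = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)

    print("l2 estimation error:", np.linalg.norm(b - beta))

The thesis's contribution concerns sharp, constant-level approximations of the minimax risk in this regime; the sketch only fixes ideas about the estimation problem itself.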

Identifier oai:union.ndltd.org:columbia.edu/oai:academiccommons.columbia.edu:10.7916/cb6q-cd35
Date January 2023
Creators Guo, Yilin
Source Sets Columbia University
Language English
Detected Language English
Type Theses
