Paper Title

Globally-convergent Iteratively Reweighted Least Squares for Robust Regression Problems

Paper Authors

Bhaskar Mukhoty, Govind Gopakumar, Prateek Jain, Purushottam Kar

Paper Abstract

We provide the first global model recovery results for the IRLS (iteratively reweighted least squares) heuristic for robust regression problems. IRLS is known to offer excellent performance, despite bad initializations and data corruption, for several parameter estimation problems. Existing analyses of IRLS frequently require careful initialization, thus offering only local convergence guarantees. We remedy this by proposing augmentations to the basic IRLS routine that not only offer guaranteed global recovery, but in practice also outperform state-of-the-art algorithms for robust regression. Our routines are more immune to hyperparameter misspecification in basic regression tasks, as well as applied tasks such as linear-armed bandit problems. Our theoretical analyses rely on a novel extension of the notions of strong convexity and smoothness to weighted strong convexity and smoothness, and establishing that sub-Gaussian designs offer bounded weighted condition numbers. These notions may be useful in analyzing other algorithms as well.
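The abstract concerns the IRLS heuristic for robust regression: alternately re-weight data points by their residuals and re-solve a weighted least-squares problem, so that corrupted points are progressively down-weighted. The paper's augmented routine is not reproduced here; as a minimal illustrative sketch, the classic IRLS iteration with inverse-residual weights (which approximates the robust L1 objective) looks as follows. All names and the weighting scheme below are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def irls_robust_regression(X, y, n_iters=50, eps=1e-6):
    """Basic IRLS sketch for robust linear regression.

    Uses inverse-residual weights (approximate L1 regression), so points
    with large residuals -- e.g. corrupted responses -- are down-weighted.
    This is a generic textbook iteration, not the paper's augmented routine.
    """
    w = np.linalg.lstsq(X, y, rcond=None)[0]  # ordinary least-squares init
    for _ in range(n_iters):
        residuals = np.abs(y - X @ w)
        weights = 1.0 / np.maximum(residuals, eps)  # down-weight outliers
        # Solve the weighted normal equations: (X^T S X) w = X^T S y
        WX = weights[:, None] * X
        w_new = np.linalg.solve(X.T @ WX, X.T @ (weights * y))
        if np.linalg.norm(w_new - w) < 1e-10:
            break
        w = w_new
    return w

# Synthetic check: recover w_star despite 10% corrupted responses
rng = np.random.default_rng(0)
n, d = 500, 5
X = rng.standard_normal((n, d))
w_star = rng.standard_normal(d)
y = X @ w_star
corrupt = rng.choice(n, size=n // 10, replace=False)
y[corrupt] += 10 * rng.standard_normal(len(corrupt))
w_hat = irls_robust_regression(X, y)
print(np.linalg.norm(w_hat - w_star))  # recovery error; should be small
```

With a sub-Gaussian design and a minority of corrupted responses, the iteration typically drives the clean residuals toward zero, so the recovered `w_hat` lands close to `w_star` even though the initialization is a plain (non-robust) least-squares fit.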
