Paper Title

NysADMM: faster composite convex optimization via low-rank approximation

Paper Authors

Shipu Zhao, Zachary Frangella, Madeleine Udell

Paper Abstract

This paper develops a scalable new algorithm, called NysADMM, to minimize a smooth convex loss function with a convex regularizer. NysADMM accelerates the inexact Alternating Direction Method of Multipliers (ADMM) by constructing a preconditioner for the ADMM subproblem from a randomized low-rank Nyström approximation. NysADMM comes with strong theoretical guarantees: it solves the ADMM subproblem in a constant number of iterations when the rank of the Nyström approximation is the effective dimension of the subproblem regularized Gram matrix. In practice, ranks much smaller than the effective dimension can succeed, so NysADMM uses an adaptive strategy to choose the rank that enjoys analogous guarantees. Numerical experiments on real-world datasets demonstrate that NysADMM can solve important applications, such as the lasso, logistic regression, and support vector machines, in half the time (or less) required by standard solvers. The breadth of problems on which NysADMM beats standard solvers is a surprise: it suggests that ADMM is a dominant paradigm for numerical optimization across a wide range of statistical learning problems that are usually solved with bespoke methods.
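
To make the core idea concrete, the sketch below illustrates the kind of computation the abstract describes: building a randomized low-rank Nyström approximation of a regularized Gram matrix and using it as a preconditioner for conjugate gradient on an ADMM-style subproblem. This is a minimal, illustrative sketch, not the authors' released implementation; the function names (nystrom_approx, nystrom_pcg), the rank, and the penalty parameter rho are assumptions chosen for the example, and the Gram matrix is assumed to be accessed only through matrix products.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular, svd
from scipy.sparse.linalg import cg, LinearOperator

def nystrom_approx(A_mv, n, rank, seed=0):
    """Randomized rank-`rank` Nystrom approximation A ~ U diag(lam) U^T of an
    n x n PSD matrix accessed only through products A_mv(V) = A @ V."""
    rng = np.random.default_rng(seed)
    Omega, _ = np.linalg.qr(rng.standard_normal((n, rank)))   # orthonormal test matrix
    Y = A_mv(Omega)                                            # sketch Y = A @ Omega
    nu = np.sqrt(n) * np.finfo(Y.dtype).eps * np.linalg.norm(Y, 2)
    Y_nu = Y + nu * Omega                                      # small shift for numerical stability
    C = cholesky(Omega.T @ Y_nu)                               # upper triangular, C^T C = Omega^T Y_nu
    B = solve_triangular(C, Y_nu.T, trans='T', lower=False).T  # B = Y_nu C^{-1}
    U, S, _ = svd(B, full_matrices=False)
    lam = np.maximum(S**2 - nu, 0.0)                           # approximate eigenvalues of A
    return U, lam

def nystrom_pcg(A_mv, b, rho, U, lam):
    """Solve (A + rho * I) x = b by CG, preconditioned with the Nystrom factors."""
    n = b.size
    lam_r = lam[-1]                                            # smallest retained eigenvalue
    def apply_precond(v):
        Uv = U.T @ v
        return U @ (Uv / (lam + rho)) + (v - U @ Uv) / (lam_r + rho)
    M = LinearOperator((n, n), matvec=apply_precond)
    K = LinearOperator((n, n), matvec=lambda v: A_mv(v) + rho * v)
    x, _ = cg(K, b, M=M)
    return x

# Illustrative use: a lasso-style ADMM x-update solves (X^T X + rho I) x = rhs.
rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 500))
A_mv = lambda V: X.T @ (X @ V)            # Gram-matrix product without forming X^T X
U, lam = nystrom_approx(A_mv, n=500, rank=50)
rhs = rng.standard_normal(500)
x = nystrom_pcg(A_mv, rhs, rho=1.0, U=U, lam=lam)
```

In this sketch the Nyström factors are computed once and reused across ADMM iterations, so each x-update reduces to a small number of preconditioned CG steps; choosing the rank adaptively, as the abstract describes, is what NysADMM adds on top of this basic recipe.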
