Paper Title
Multinomial Logistic Regression Algorithms via Quadratic Gradient
Paper Authors
Paper Abstract
Multinomial logistic regression, also known by other names such as multiclass logistic regression and softmax regression, is a fundamental classification method that generalizes binary logistic regression to multiclass problems. A recent work proposed a faster gradient called $\texttt{quadratic gradient}$ that can accelerate binary logistic regression training, and presented an enhanced Nesterov's accelerated gradient (NAG) method for binary logistic regression. In this paper, we extend this work to multiclass logistic regression and propose an enhanced Adaptive Gradient Algorithm (Adagrad) that can accelerate the original Adagrad method. We test the enhanced NAG method and the enhanced Adagrad method on several multiclass datasets. Experimental results show that both enhanced methods converge faster than their original counterparts.
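The abstract does not spell out how the quadratic gradient is built in the multiclass case. The following is a minimal NumPy sketch only, assuming the binary-case construction carries over: each gradient coordinate is scaled by a fixed diagonal Hessian bound $\bar{B}_{ii} = \epsilon + \frac{1}{4}\sum_j |(X^\top X)_{ij}|$, and the scaled gradient is then fed into a standard Adagrad accumulator. The function name `enhanced_adagrad_softmax`, the constant $1/4$, and the ordering of the preconditioning relative to Adagrad's accumulation are assumptions for illustration, not the authors' exact method.

```python
import numpy as np

def softmax(Z):
    Z = Z - Z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def enhanced_adagrad_softmax(X, Y, lr=1.0, eps=1e-8, iters=100):
    """Sketch of Adagrad on softmax regression with a quadratic-gradient-style
    diagonal preconditioner (assumed form; not the paper's verified algorithm).

    X: (n, d) feature matrix; Y: (n, k) one-hot labels.
    """
    n, d = X.shape
    k = Y.shape[1]
    W = np.zeros((d, k))
    # Fixed diagonal bound on the Hessian, following the binary-case recipe:
    # B_ii = eps + 0.25 * sum_j |(X^T X)_ij|  (the 0.25 factor is an assumption)
    XtX = X.T @ X
    B_diag = eps + 0.25 * np.abs(XtX).sum(axis=1)  # shape (d,)
    G_accum = np.zeros_like(W)                     # Adagrad's running sum of squares
    for _ in range(iters):
        P = softmax(X @ W)                # (n, k) predicted class probabilities
        g = X.T @ (P - Y) / n             # plain cross-entropy gradient, (d, k)
        g_quad = g / B_diag[:, None]      # quadratic gradient: diag(B)^{-1} g
        G_accum += g_quad ** 2
        W -= (lr / (np.sqrt(G_accum) + eps)) * g_quad
    return W
```

In this sketch the diagonal bound plays the role of the quadratic gradient's $\bar{B}^{-1}$ scaling from the binary-regression work; because the bound is precomputed once from $X$, the per-iteration cost matches plain Adagrad.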