Paper Title

A Multi-parameter Updating Fourier Online Gradient Descent Algorithm for Large-scale Nonlinear Classification

Authors

Chen, Yigying

Abstract

Large-scale nonlinear classification is a challenging task in the field of support vector machines. Online random Fourier feature map algorithms are important methods for dealing with large-scale nonlinear classification problems. The main shortcomings of these methods are as follows: (1) Since only the hyperplane vector is updated during learning while the random directions are fixed, there is no guarantee that these online methods can adapt to changes in the data distribution as the data arrives one by one. (2) The dimension of the random directions is often set high to obtain better classification accuracy, which results in longer test time. To overcome these shortcomings, a multi-parameter updating Fourier online gradient descent algorithm (MPU-FOGD) is proposed for large-scale nonlinear classification problems, based on a novel random feature map. In the proposed method, the suggested random feature map has a lower dimension, while the multi-parameter updating strategy guarantees that the learned model can better adapt to changes in the data distribution as the data arrives one by one. Theoretically, it is proved that, compared with existing random Fourier feature maps, the proposed random feature map gives a tighter error bound. Empirical studies on several benchmark data sets demonstrate that, compared with state-of-the-art online random Fourier feature map methods, the proposed MPU-FOGD obtains better test accuracy.
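To make the baseline the abstract builds on concrete, the sketch below shows a standard Fourier online gradient descent (FOGD) setup: inputs are mapped through random Fourier features (cosine features with fixed random directions and phases, in the style of Rahimi and Recht), and only the hyperplane vector is updated online via hinge-loss subgradient steps. This is a minimal illustration of the conventional scheme whose fixed random directions MPU-FOGD improves upon, not the paper's algorithm; all function names and hyperparameter values here are illustrative assumptions.

```python
import numpy as np

def random_fourier_features(X, W, b):
    """Map inputs to a D-dimensional randomized cosine feature space that
    approximates an RBF kernel; W holds fixed random directions, b random phases."""
    D = W.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def fogd_train(X, y, D=200, gamma=0.5, eta=0.5, seed=0):
    """Plain FOGD baseline: one online pass with hinge-loss subgradient updates.
    Note that only the weight vector w is updated; the random directions W stay
    fixed throughout, which is the limitation the abstract points out."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, D))  # random directions
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)                # random phases
    w = np.zeros(D)
    for x_t, y_t in zip(X, y):
        z = random_fourier_features(x_t[None, :], W, b)[0]
        if y_t * (w @ z) < 1.0:      # margin violated: take a subgradient step
            w += eta * y_t * z
    return w, W, b

def fogd_predict(X, w, W, b):
    """Predict labels in {-1, +1} with the learned hyperplane vector."""
    Z = random_fourier_features(X, W, b)
    return np.sign(Z @ w)

# Illustrative run on synthetic data with a simple separating boundary.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] >= 0.0, 1.0, -1.0)
w, W, b = fogd_train(X, y)
acc = np.mean(fogd_predict(X, w, W, b) == y)
```

The multi-parameter updating idea in the abstract would, on top of this, also adjust the random directions `W` (and phases) during learning rather than freezing them after initialization, so the feature map itself can track distribution shift in the stream.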
