Title

Enhancing Classifier Conservativeness and Robustness by Polynomiality

Authors

Ziqi Wang, Marco Loog

Abstract

We illustrate the detrimental effect, such as overconfident decisions, that exponential behavior can have in methods like classical LDA and logistic regression. We then show how polynomiality can remedy the situation. This, among others, leads purposefully to random-level performance in the tails, away from the bulk of the training data. A directly related, simple, yet important technical novelty we subsequently present is softRmax: a reasoned alternative to the standard softmax function employed in contemporary (deep) neural networks. It is derived through linking the standard softmax to Gaussian class-conditional models, as employed in LDA, and replacing those by a polynomial alternative. We show that two aspects of softRmax, conservativeness and inherent gradient regularization, lead to robustness against adversarial attacks without gradient obfuscation.
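The abstract does not spell out softRmax's exact formula, but the contrast it draws can be sketched numerically. The snippet below is a hedged illustration, not the paper's actual definition: `polynomial_posterior` is a hypothetical stand-in that weights classes by inverse polynomials of the distance to assumed class centers, so that far from all centers the posterior tends to uniform (the "random-level performance in the tails" the abstract describes), whereas the standard softmax saturates to a near one-hot, overconfident output as logits grow.

```python
import numpy as np

def softmax(z):
    """Standard softmax: exponential behavior saturates toward a
    one-hot vector for large logits, i.e. overconfident decisions."""
    e = np.exp(z - z.max())  # shift for numerical stability
    return e / e.sum()

def polynomial_posterior(x, centers, n=2):
    """Hypothetical sketch (NOT the paper's exact softRmax): class
    weights proportional to inverse polynomials of the distance to
    each class center. Far from every center the distances become
    comparable, so the posterior decays to uniform (conservative)."""
    d = np.linalg.norm(np.asarray(centers) - np.asarray(x), axis=1)
    w = 1.0 / (d ** n + 1e-12)  # epsilon avoids division by zero at a center
    return w / w.sum()
```

For example, `softmax([0.0, 10.0])` already assigns essentially all mass to the second class, while `polynomial_posterior` evaluated at a point far from both of two nearby centers returns roughly (0.5, 0.5), illustrating the conservative tail behavior the abstract attributes to polynomiality.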
