Paper Title


Improving Covariance Conditioning of the SVD Meta-layer by Orthogonality

Authors

Yue Song, Nicu Sebe, Wei Wang

Abstract


Inserting an SVD meta-layer into neural networks is prone to make the covariance ill-conditioned, which could harm the model in the training stability and generalization abilities. In this paper, we systematically study how to improve the covariance conditioning by enforcing orthogonality to the Pre-SVD layer. Existing orthogonal treatments on the weights are first investigated. However, these techniques can improve the conditioning but would hurt the performance. To avoid such a side effect, we propose the Nearest Orthogonal Gradient (NOG) and Optimal Learning Rate (OLR). The effectiveness of our methods is validated in two applications: decorrelated Batch Normalization (BN) and Global Covariance Pooling (GCP). Extensive experiments on visual recognition demonstrate that our methods can simultaneously improve the covariance conditioning and generalization. Moreover, the combinations with orthogonal weight can further boost the performances.
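The Nearest Orthogonal Gradient idea can be illustrated with a small sketch. The abstract does not spell out the construction, so the following assumes NOG replaces a gradient matrix with its nearest orthogonal matrix (in Frobenius norm), which is obtained from the polar decomposition, i.e. by dropping the singular values in the gradient's SVD; an orthogonal matrix has condition number 1, which is how conditioning improves. The helper name `nearest_orthogonal` is illustrative, not from the paper.

```python
import numpy as np

def nearest_orthogonal(G: np.ndarray) -> np.ndarray:
    """Project G onto the nearest orthogonal matrix (Frobenius norm).

    Assumed NOG-style construction: with G = U diag(s) Vt,
    the nearest orthogonal matrix is U @ Vt (polar decomposition).
    """
    U, _, Vt = np.linalg.svd(G, full_matrices=False)
    return U @ Vt

rng = np.random.default_rng(0)
G = rng.standard_normal((4, 4))          # a possibly ill-conditioned gradient
G_orth = nearest_orthogonal(G)

# The projected gradient is orthogonal, so its condition number is 1,
# while the raw gradient's condition number can be arbitrarily large.
print(np.linalg.cond(G), np.linalg.cond(G_orth))
```

The projection keeps the gradient's "directions" (its singular vectors) while equalizing their scales, which is one plausible reason it avoids the performance drop the paper reports for enforcing orthogonality on the weights themselves.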
