Paper Title
Multilevel-in-Layer Training for Deep Neural Network Regression
Paper Authors
Paper Abstract
A common challenge in regression is that for many problems, the degrees of freedom required for a high-quality solution also allow for overfitting. Regularization is a class of strategies that seek to restrict the range of possible solutions so as to discourage overfitting while still enabling good solutions, and different regularization strategies impose different types of restrictions. In this paper, we present a multilevel regularization strategy that constructs and trains a hierarchy of neural networks, each of which has layers that are wider versions of the previous network's layers. We draw intuition and techniques from the field of Algebraic Multigrid (AMG), traditionally used for solving linear and nonlinear systems of equations, and specifically adapt the Full Approximation Scheme (FAS) for nonlinear systems of equations to the problem of deep learning. Training through V-cycles then encourages the neural networks to build a hierarchical understanding of the problem. We refer to this approach as \emph{multilevel-in-width} to distinguish it from prior multilevel works, which hierarchically alter the depth of neural networks. The resulting approach is a highly flexible framework that can be applied to a variety of layer types, which we demonstrate with both fully-connected and convolutional layers. We show experimentally on PDE regression problems that our multilevel training approach is an effective regularizer, improving the generalization performance of the neural networks studied.
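To make the V-cycle structure concrete, below is a minimal, hypothetical sketch of one multilevel-in-width V-cycle for a two-level hierarchy of small fully-connected networks. It is an illustration under simplifying assumptions, not the paper's method: the pair-averaging restriction and duplication prolongation stand in for the paper's AMG-derived transfer operators, the FAS tau correction is omitted for brevity, and all function names (`restrict`, `prolong_correction`, `v_cycle`) are our own.

```python
# Minimal sketch of multilevel-in-width training (assumptions noted above).
import torch
import torch.nn as nn

def make_mlp(width: int) -> nn.Sequential:
    return nn.Sequential(nn.Linear(1, width), nn.Tanh(), nn.Linear(width, 1))

def train(net, x, y, steps, lr=1e-2):
    # Plain SGD on an MSE regression loss; plays the role of "smoothing".
    opt = torch.optim.SGD(net.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        nn.functional.mse_loss(net(x), y).backward()
        opt.step()

@torch.no_grad()
def restrict(fine, coarse):
    # Coarse hidden neuron j averages fine neurons 2j and 2j+1
    # (a stand-in for an AMG-style restriction operator).
    coarse[0].weight.copy_(0.5 * (fine[0].weight[0::2] + fine[0].weight[1::2]))
    coarse[0].bias.copy_(0.5 * (fine[0].bias[0::2] + fine[0].bias[1::2]))
    # Sum outgoing weights so the coarse net's output scale matches.
    coarse[2].weight.copy_(fine[2].weight[:, 0::2] + fine[2].weight[:, 1::2])
    coarse[2].bias.copy_(fine[2].bias)

@torch.no_grad()
def prolong_correction(coarse, snap, fine):
    # FAS-style update: add the coarse-level *change* back to the fine
    # net, duplicating each coarse neuron onto its pair of fine neurons.
    d_w1 = coarse[0].weight - snap["w1"]
    d_b1 = coarse[0].bias - snap["b1"]
    fine[0].weight += d_w1.repeat_interleave(2, dim=0)
    fine[0].bias += d_b1.repeat_interleave(2, dim=0)
    d_w2 = coarse[2].weight - snap["w2"]
    fine[2].weight[:, 0::2] += 0.5 * d_w2
    fine[2].weight[:, 1::2] += 0.5 * d_w2
    fine[2].bias += coarse[2].bias - snap["b2"]

def v_cycle(fine, coarse, x, y):
    train(fine, x, y, steps=20)    # pre-smoothing on the wide (fine) net
    restrict(fine, coarse)
    snap = {"w1": coarse[0].weight.clone(), "b1": coarse[0].bias.clone(),
            "w2": coarse[2].weight.clone(), "b2": coarse[2].bias.clone()}
    train(coarse, x, y, steps=50)  # cheaper training on the narrow net
    prolong_correction(coarse, snap, fine)
    train(fine, x, y, steps=20)    # post-smoothing

# Toy 1D regression target standing in for a PDE regression problem.
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = torch.sin(3 * x)
fine, coarse = make_mlp(16), make_mlp(8)
for _ in range(5):
    v_cycle(fine, coarse, x, y)
```

A deeper hierarchy would apply the same restrict/train/prolong pattern recursively at each level; the hoped-for regularization effect comes from the narrow networks constraining the fine network toward coarse-scale structure before its extra width is used.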