Paper Title
Neural Complexity Measures
Paper Authors
Paper Abstract
While various complexity measures for deep neural networks exist, specifying an appropriate measure capable of predicting and explaining generalization in deep networks has proven challenging. We propose Neural Complexity (NC), a meta-learning framework for predicting generalization. Our model learns a scalar complexity measure through interactions with many heterogeneous tasks in a data-driven way. The trained NC model can be added to the standard training loss to regularize any task learner in a standard supervised learning scenario. We contrast NC's approach against existing manually designed complexity measures and other meta-learning models, and we validate NC's performance on multiple regression and classification tasks.
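The abstract's claim that a trained NC model "can be added to the standard training loss to regularize any task learner" can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the trained NC model can be queried as a differentiable scalar function of the learner's parameters, and a fixed quadratic form stands in for the learned NC network. All names (`A`, `lam`, `w`) are illustrative.

```python
import numpy as np

# Sketch: train a linear task learner on the objective
#   total_loss = task_loss + lam * NC(w)
# where NC is a stand-in for a trained Neural Complexity model.
# Here NC(w) = w^T A w with a fixed A (illustrative assumption,
# not the paper's learned network).

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 4))
true_w = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ true_w + 0.1 * rng.normal(size=64)

A = np.eye(4)    # frozen "trained NC" quadratic form (illustrative)
lam = 0.05       # regularization strength
lr = 0.05
w = np.zeros(4)

for _ in range(200):
    residual = X @ w - y
    grad_task = 2 * X.T @ residual / len(y)   # gradient of MSE task loss
    grad_nc = lam * (A + A.T) @ w             # gradient of lam * w^T A w
    w -= lr * (grad_task + grad_nc)           # descend the regularized loss

task_loss = np.mean((X @ w - y) ** 2)
total_loss = task_loss + lam * (w @ A @ w)
```

Because the complexity term enters the loss additively, any learner trained by gradient descent can be regularized this way; only the stand-in `NC` would be replaced by the meta-learned model.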