Paper Title
Meta Learning Backpropagation And Improving It
Paper Authors
Paper Abstract
Many concepts have been proposed for meta learning with neural networks (NNs), e.g., NNs that learn to reprogram fast weights, Hebbian plasticity, learned learning rules, and meta recurrent NNs. Our Variable Shared Meta Learning (VSML) unifies the above and demonstrates that simple weight-sharing and sparsity in an NN is sufficient to express powerful learning algorithms (LAs) in a reusable fashion. A simple implementation of VSML where the weights of a neural network are replaced by tiny LSTMs allows for implementing the backpropagation LA solely by running in forward-mode. It can even meta learn new LAs that differ from online backpropagation and generalize to datasets outside of the meta training distribution without explicit gradient calculation. Introspection reveals that our meta learned LAs learn through fast association in a way that is qualitatively different from gradient descent.
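To make the core idea more concrete, the following is a minimal, illustrative NumPy sketch of the architecture the abstract describes: a layer in which each scalar weight is replaced by a tiny LSTM whose parameters are shared across all positions, so that learning proceeds purely through forward-mode state updates rather than explicit gradient computation. The class and argument names (VSMLLayer, fwd_msg, err_msg), the state sizes, and the exact message-passing scheme are assumptions made for illustration and simplify the actual VSML formulation; meta-training of the shared parameters is not shown.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class VSMLLayer:
    """Illustrative layer in which every scalar "weight" (i, j) is a tiny LSTM.

    All n_in * n_out LSTMs share a single set of parameters (the slow,
    meta-learned variables), while each keeps its own hidden and cell state
    (the fast variables that implement learning at run time).
    """

    def __init__(self, n_in, n_out, hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(hidden)
        msg_size = 2  # each cell sees one forward message and one feedback message
        # Shared LSTM parameters (four gates stacked) and a shared scalar readout.
        self.W = rng.uniform(-scale, scale, (4 * hidden, msg_size + hidden))
        self.b = np.zeros(4 * hidden)
        self.readout = rng.uniform(-scale, scale, hidden)
        # Per-"weight" recurrent state, shape (n_in, n_out, hidden).
        self.h = np.zeros((n_in, n_out, hidden))
        self.c = np.zeros((n_in, n_out, hidden))

    def forward(self, fwd_msg, err_msg):
        """One forward-mode step; no autograd or explicit gradients involved.

        fwd_msg: (n_in,) activations arriving from the previous layer.
        err_msg: (n_out,) feedback signal (e.g. an error or target) routed to
                 this layer as an ordinary input.
        """
        n_in, n_out, _ = self.h.shape
        # LSTM (i, j) receives the forward message of input unit i and the
        # feedback message of output unit j.
        x = np.stack([np.repeat(fwd_msg[:, None], n_out, axis=1),
                      np.repeat(err_msg[None, :], n_in, axis=0)], axis=-1)
        z = np.concatenate([x, self.h], axis=-1) @ self.W.T + self.b
        i, f, g, o = np.split(z, 4, axis=-1)
        self.c = sigmoid(f) * self.c + sigmoid(i) * np.tanh(g)
        self.h = sigmoid(o) * np.tanh(self.c)
        # Every LSTM emits one scalar message; summing over the input axis
        # plays the role of the usual weighted sum over i of w_ij * x_i.
        messages = self.h @ self.readout      # (n_in, n_out)
        return messages.sum(axis=0)           # (n_out,)


# Hypothetical usage: repeated forward-mode steps update the per-weight LSTM
# states, which is where "learning" would take place once the shared
# parameters have been meta-trained (meta-training itself is omitted here).
layer = VSMLLayer(n_in=4, n_out=3)
output = layer.forward(np.random.randn(4), np.zeros(3))
```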