Title


Empirical Analysis of Limits for Memory Distance in Recurrent Neural Networks

Authors

Steffen Illium, Thore Schillman, Robert Müller, Thomas Gabor, Claudia Linnhoff-Popien

Abstract

Common to all different kinds of recurrent neural networks (RNNs) is the intention to model relations between data points through time. When there is no immediate relationship between subsequent data points (e.g., when the data points are generated at random), we show that RNNs are still able to remember a few data points back into the sequence by memorizing them by heart using standard backpropagation. However, we also show that for classical RNN, LSTM, and GRU networks, the distance between data points across recurrent calls that can be reproduced this way is highly limited (compared to even a loose connection between data points) and subject to various constraints imposed by the type and size of the RNN in question. This implies the existence of a hard limit (well below the information-theoretic one) on the distance between related data points within which RNNs are still able to recognize said relation.
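The setup the abstract describes can be phrased as a synthetic recall task: feed the network a sequence of random data points and train it to output the point seen a fixed number of steps earlier, so that any success must come from memorization rather than from structure in the data. A minimal sketch of such a data generator (a hypothetical reconstruction for illustration; the function name and parameters are not from the paper):

```python
import numpy as np

def make_recall_batch(batch_size, seq_len, distance, dim, rng):
    """Generate a batch for a memory-distance recall task.

    x: random sequences with no relation between time steps,
       shape (batch_size, seq_len, dim).
    y: the data point that appeared `distance` steps before the
       end of each sequence, shape (batch_size, dim).
    An RNN trained on (x, y) can only succeed by carrying that
    point through `distance` recurrent calls in its hidden state.
    """
    x = rng.standard_normal((batch_size, seq_len, dim))
    y = x[:, seq_len - 1 - distance, :].copy()
    return x, y

# Example: sequences of length 20; the target lies 5 steps back.
rng = np.random.default_rng(0)
x, y = make_recall_batch(batch_size=4, seq_len=20, distance=5, dim=3, rng=rng)
```

Sweeping `distance` upward while holding the network type and size fixed, and measuring reconstruction error on held-out batches, is one way to probe the hard limit the abstract refers to.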
