Paper Title

Emergent Symbols through Binding in External Memory

Authors

Webb, Taylor W., Sinha, Ishan, Cohen, Jonathan D.

Abstract

A key aspect of human intelligence is the ability to infer abstract rules directly from high-dimensional sensory data, and to do so given only a limited amount of training experience. Deep neural network algorithms have proven to be a powerful tool for learning directly from high-dimensional data, but currently lack this capacity for data-efficient induction of abstract rules, leading some to argue that symbol-processing mechanisms will be necessary to account for this capacity. In this work, we take a step toward bridging this gap by introducing the Emergent Symbol Binding Network (ESBN), a recurrent network augmented with an external memory that enables a form of variable-binding and indirection. This binding mechanism allows symbol-like representations to emerge through the learning process without the need to explicitly incorporate symbol-processing machinery, enabling the ESBN to learn rules in a manner that is abstracted away from the particular entities to which those rules apply. Across a series of tasks, we show that this architecture displays nearly perfect generalization of learned rules to novel entities given only a limited number of training examples, and outperforms a number of other competitive neural network architectures.
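The abstract's core idea of "variable-binding and indirection" via an external memory can be illustrated with a minimal sketch: a memory that binds abstract key vectors (playing the role of emergent symbols) to concrete entity embeddings, and retrieves a bound value by attention over the stored keys. This is a toy illustration under assumed interfaces, not the authors' ESBN implementation; the class and method names are hypothetical.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

class ExternalBindingMemory:
    """Toy key-value memory illustrating variable binding:
    symbol-like keys are bound to entity embeddings, and rules can
    operate on keys without depending on the bound entities."""

    def __init__(self):
        self.keys = []    # abstract vectors (e.g., produced by a controller)
        self.values = []  # concrete entity embeddings (e.g., from an encoder)

    def write(self, key, value):
        # Bind a key to a value by appending the pair to memory.
        self.keys.append(np.asarray(key, dtype=float))
        self.values.append(np.asarray(value, dtype=float))

    def read(self, query_key):
        # Indirection: attend over stored keys, return the weighted
        # sum of the values bound to them.
        sims = np.array([k @ query_key for k in self.keys])
        w = softmax(sims)
        return sum(wi * v for wi, v in zip(w, self.values))

# Bind two arbitrary entity embeddings to two (sharply separated) keys.
mem = ExternalBindingMemory()
entity_a = np.array([1.0, 2.0, 3.0])
entity_b = np.array([-1.0, 0.5, 2.0])
mem.write([10.0, 0.0], entity_a)
mem.write([0.0, 10.0], entity_b)

# Querying with the first key retrieves (approximately) entity_a,
# regardless of what vector happens to be bound to it.
retrieved = mem.read(np.array([10.0, 0.0]))
```

Because retrieval depends only on the keys, the same read/write pattern generalizes to novel entity embeddings, which is the intuition behind the near-perfect generalization claimed in the abstract.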
