Paper Title

Distributed Associative Memory Network with Memory Refreshing Loss

Authors

Park, Taewon, Choi, Inchul, Lee, Minho

Abstract

Despite recent progress in memory-augmented neural network (MANN) research, associative memory networks with a single external memory still show limited performance on complex relational reasoning tasks. In particular, content-based addressable memory networks often fail to encode input data into sufficiently rich representations for relational reasoning, which limits the relation modeling performance of MANNs on long temporal sequence data. To address these problems, we introduce a novel Distributed Associative Memory architecture (DAM) with a Memory Refreshing Loss (MRL), which enhances the relational reasoning performance of MANNs. Inspired by how the human brain works, our framework encodes data with distributed representations across multiple memory blocks and repeatedly refreshes the contents for enhanced memorization, similar to the brain's rehearsal process. For this procedure, we replace a single external memory with a set of multiple smaller associative memory blocks and update these sub-memory blocks simultaneously and independently to obtain a distributed representation of the input data. Moreover, we propose MRL, which assists the task's target objective while learning relational information present in the data. MRL enables a MANN to reinforce the association between input data and the task objective by reproducing stochastically sampled input data from the stored memory contents. Through this procedure, the MANN further enriches the stored representations with relational information. In experiments, we apply our approaches to the Differentiable Neural Computer (DNC), one of the representative content-based addressing memory models, and achieve state-of-the-art performance on both memorization and relational reasoning tasks.
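The two core ideas of the abstract can be sketched in a few lines: several small memory blocks written and read simultaneously and independently (instead of one large external memory), and a refreshing loss that penalizes reconstruction error when a stochastically sampled stored input is read back. The sketch below is a minimal, hedged illustration only; the dimensions, the simplified softmax content addressing, and the averaging read are assumptions for illustration, not the paper's actual DNC-based formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): K sub-memory blocks,
# each with S slots of width W, replacing one large external memory.
K, S, W = 4, 8, 16
memory_blocks = [np.zeros((S, W)) for _ in range(K)]

def _address(m, x):
    """Simplified content-based addressing: softmax over slot similarities."""
    scores = m @ x
    w = np.exp(scores - scores.max())
    return w / w.sum()

def write(x):
    """Write input x to every block independently, so the blocks
    together hold a distributed representation of the input."""
    for m in memory_blocks:
        m += np.outer(_address(m, x), x)   # additive write

def read(x):
    """Read from all blocks in parallel and combine by averaging."""
    return np.mean([_address(m, x) @ m for m in memory_blocks], axis=0)

def memory_refreshing_loss(inputs):
    """MRL, sketched: stochastically sample a stored input and measure
    how well it can be reproduced from the memory contents."""
    x = inputs[rng.integers(len(inputs))]
    return float(np.mean((read(x) - x) ** 2))

inputs = [rng.standard_normal(W) for _ in range(5)]
for x in inputs:
    write(x)
loss = memory_refreshing_loss(inputs)  # added to the task loss during training
```

In the paper's setting this auxiliary loss is optimized jointly with the task objective, so minimizing it pushes the controller to store representations from which the inputs remain recoverable.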
