Paper Title
Energy-based General Sequential Episodic Memory Networks at the Adiabatic Limit
Paper Authors
Paper Abstract
The General Associative Memory Model (GAMM) has a constant state-dependent energy surface that leads the output dynamics to fixed points, retrieving single memories from a collection of memories that can be asynchronously preloaded. We introduce a new class of General Sequential Episodic Memory Models (GSEMM) that, in the adiabatic limit, exhibit a temporally changing energy surface, leading to a series of metastable states that are sequential episodic memories. The dynamic energy surface is enabled by newly introduced asymmetric synapses with signal propagation delays in the network's hidden layer. We study the theoretical and empirical properties of two memory models from the GSEMM class, which differ in their activation functions. LISEM has nonlinearities in the feature layer, whereas DSEM has nonlinearities in the hidden layer. In principle, DSEM has a storage capacity that grows exponentially with the number of neurons in the network. We introduce a learning rule for the synapses based on the energy minimization principle and show that it can learn single memories and their sequential relationships online. This rule is similar to the Hebbian learning algorithm and Spike-Timing-Dependent Plasticity (STDP), which describe the conditions under which synapses between neurons change strength. Thus, GSEMM combines the static and dynamic properties of episodic memory under a single theoretical framework and bridges neuroscience, machine learning, and artificial intelligence.
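The abstract does not include the LISEM/DSEM equations, but the mechanism it describes, symmetric Hebbian synapses that stabilize individual memories as fixed points plus delayed asymmetric synapses that tilt the energy surface toward the next memory in the sequence, can be illustrated with a minimal classical sketch. Everything below (the pattern matrix xi, the coupling strength lam, the delay constant tau_d, the tanh activation) is an illustrative assumption, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three random bipolar patterns stored as an episodic sequence xi_0 -> xi_1 -> xi_2.
N, K = 100, 3
xi = rng.choice([-1.0, 1.0], size=(K, N))

# Symmetric (Hebbian) synapses stabilize individual memories;
# asymmetric synapses map each pattern onto its successor in the sequence.
W_sym = (xi.T @ xi) / N
W_asym = (np.roll(xi, -1, axis=0).T @ xi) / N   # sum_k xi_{k+1} xi_k^T / N

lam = 2.0     # strength of the asymmetric, sequence-driving synapses (assumed)
tau_d = 15.0  # delay time constant of the slow trace, in update steps (assumed)

v = xi[0] + 0.3 * rng.standard_normal(N)  # noisy cue of the first memory
v_slow = v.copy()                         # delayed copy of the state

for t in range(120):
    v_slow += (v - v_slow) / tau_d                   # slow, delayed trace of v
    h = W_sym @ v + lam * W_asym @ np.sign(v_slow)   # fast drive + delayed sequence drive
    v = np.tanh(2.0 * h)                             # graded neuron activation
    if t % 10 == 0:
        m = xi @ v / N                               # overlap with each stored pattern
        print(f"t={t:3d}  active memory={np.argmax(m)}  overlaps={np.round(m, 2)}")
```

Because the asymmetric term is read through the delayed trace, each stored pattern behaves as a metastable state: the printed overlaps show the network dwelling near one pattern for a time set by tau_d before hopping to its successor, which is the qualitative behavior the abstract attributes to the time-varying energy surface.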