Paper Title

Vector Symbolic Finite State Machines in Attractor Neural Networks

Paper Authors

Madison Cotteret, Hugh Greatorex, Martin Ziegler, Elisabetta Chicca

Paper Abstract

Hopfield attractor networks are robust distributed models of human memory, but lack a general mechanism for effecting state-dependent attractor transitions in response to input. We propose construction rules such that an attractor network may implement an arbitrary finite state machine (FSM), where states and stimuli are represented by high-dimensional random vectors, and all state transitions are enacted by the attractor network's dynamics. Numerical simulations show the capacity of the model, in terms of the maximum size of implementable FSM, to be linear in the size of the attractor network for dense bipolar state vectors, and approximately quadratic for sparse binary state vectors. We show that the model is robust to imprecise and noisy weights, and so a prime candidate for implementation with high-density but unreliable devices. By endowing attractor networks with the ability to emulate arbitrary FSMs, we propose a plausible path by which FSMs could exist as a distributed computational primitive in biological neural networks.
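To make the setting concrete, the sketch below builds the attractor substrate the abstract starts from: dense bipolar random state vectors stored in a standard Hopfield network via a Hebbian outer-product rule, with recall by iterated sign dynamics. This is a minimal illustration only, not the paper's FSM construction (which additionally encodes input-conditioned transitions in the weights); all names and parameter values here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256   # network size (number of neurons); illustrative choice
P = 5     # number of stored attractor states, well below capacity (~0.14 N)

# Dense bipolar random state vectors, as in the paper's dense-coding regime
states = rng.choice([-1, 1], size=(P, N))

# Hebbian outer-product weights: a plain Hopfield store, with zeroed diagonal
W = (states.T @ states) / N
np.fill_diagonal(W, 0)

def recall(x, steps=20):
    """Synchronous Hopfield dynamics: iterate x <- sign(W x) to a fixed point."""
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1  # break ties so the state stays bipolar
    return x

# Corrupt 10% of one stored state's entries; the dynamics should clean it up
probe = states[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
recovered = recall(probe)
overlap = float(recovered @ states[0]) / N  # 1.0 means exact recovery
```

For small P relative to N, the crosstalk between random patterns is weak and the corrupted probe converges back to the stored attractor; the paper's contribution is a rule for shaping such dynamics so that an input stimulus vector drives transitions between these attractors.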
