Paper Title
Transient chaotic dimensionality expansion by recurrent networks
Paper Authors
Paper Abstract
Neurons in the brain communicate with spikes, which are discrete events in time and value. Functional network models often employ rate units that are continuously coupled by analog signals. Is there a qualitative difference implied by these two forms of signaling? We develop a unified mean-field theory for large random networks to show that first- and second-order statistics in rate and binary networks are in fact identical if rate neurons receive the right amount of noise. Their response to presented stimuli, however, can be radically different. We quantify these differences by studying how nearby state trajectories evolve over time, asking to what extent the dynamics is chaotic. Chaos in the two models is found to be qualitatively different. In binary networks we find a network-size-dependent transition to chaos and a chaotic submanifold whose dimensionality expands stereotypically with time, while rate networks with matched statistics are nonchaotic. Dimensionality expansion in chaotic binary networks aids classification in reservoir computing and optimal performance is reached within about a single activation per neuron; a fast mechanism for computation that we demonstrate also in spiking networks. A generalization of this mechanism extends to rate networks in their respective chaotic regimes.
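The abstract's central diagnostic, following how two nearby state trajectories diverge over time, can be illustrated with a minimal simulation. The sketch below builds a random binary network with Gaussian couplings, flips a single neuron in one of two otherwise identical initial states, and tracks the Hamming distance between the two trajectories under parallel sign updates. All parameters (network size, coupling scale, update rule) are illustrative assumptions, not the paper's actual model or values.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500                    # network size (illustrative choice)
g = 1.5                    # coupling scale (assumption)

# Random Gaussian coupling matrix, variance g^2 / N, no self-coupling
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
np.fill_diagonal(J, 0.0)

def step(s, J):
    """Parallel binary update: each neuron takes the sign of its input."""
    return np.where(J @ s > 0.0, 1.0, -1.0)

# Two nearby initial states differing in exactly one neuron
s_a = rng.choice([-1.0, 1.0], size=N)
s_b = s_a.copy()
s_b[0] *= -1.0

# Hamming distance between the trajectories over time
distances = []
for t in range(30):
    distances.append(int(np.sum(s_a != s_b)))
    s_a = step(s_a, J)
    s_b = step(s_b, J)

print(distances)
```

If the dynamics is chaotic in the sense the abstract describes, the recorded distance grows from its initial value of 1 as the single-neuron perturbation spreads through the network; in a nonchaotic regime it would shrink back toward 0. This is only a toy probe of trajectory divergence, not the paper's mean-field analysis of the chaotic submanifold.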