Paper Title

Transient amplification in balanced neural networks

Paper Authors

Tarnowski, Wojciech

Abstract

Transient amplification has been proposed as an important mechanism not only in neuroscience but also in many other areas modeled by dynamical systems. Despite that, there is no clear, biologically plausible mechanism that fine-tunes the coupling matrix or selects the signals to be amplified. In this work we quantitatively study transient dynamics in the Rajan-Abbott model of a recurrent neural network [K. Rajan and L. F. Abbott, PRL 97, 188104 (2006)]. We find a second-order transition between a phase of weakly amplified or non-amplified transients and a phase of strong amplification, in which the average trajectory is amplified. In the latter phase, the combination of Dale's principle and excitatory/inhibitory balance allows for strong weights while keeping the system at the edge of chaos. Moreover, we show that the amplification goes hand in hand with greater variability of the dynamics. By numerically studying the full probability density of the squared norm, we observe that as the strength of the weights grows, the right tail of the distribution becomes heavier, moving from a Gaussian to an exponential tail.
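
The setup described in the abstract can be illustrated with a short numerical sketch. The snippet below is a minimal illustration, not the authors' code: it builds a Rajan-Abbott-style coupling matrix whose column means respect Dale's principle and satisfy the excitatory/inhibitory balance condition, then Euler-integrates linear rate dynamics dx/dt = -x + Jx and tracks the squared norm of a random unit-norm initial state to expose transient amplification. All parameter values (N, f, g, mu) and the choice of linear dynamics are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000        # number of neurons (assumed)
f = 0.5         # fraction of excitatory neurons (assumed)
g = 0.9         # std of the random part; keeps the bulk spectrum inside the unit circle
mu = 6.0        # scale of the structured E/I means ("strong weights", assumed)

N_E = int(f * N)
mu_E = mu / np.sqrt(N)                # mean outgoing weight of excitatory neurons
mu_I = f / (1.0 - f) * mu_E           # balance condition: f*mu_E = (1-f)*mu_I

# Dale's principle: every outgoing column of an excitatory neuron has a
# positive mean, every inhibitory column a negative mean; the balance
# condition makes the mean part of each row sum to zero.
means = np.concatenate([np.full(N_E, mu_E), np.full(N - N_E, -mu_I)])
J = means[None, :] + g * rng.standard_normal((N, N)) / np.sqrt(N)

# Linear rate dynamics dx/dt = -x + J x, Euler-integrated from a random
# unit-norm initial condition; track the squared norm of the trajectory.
dt, T = 0.02, 15.0
x = rng.standard_normal(N)
x /= np.linalg.norm(x)
sq_norm = []
for _ in range(int(T / dt)):
    x = x + dt * (-x + J @ x)
    sq_norm.append(float(x @ x))

print(f"initial squared norm: 1.0, peak along trajectory: {max(sq_norm):.2f}")
```

In this sketch the balanced mean part contributes no eigenvalue outlier, so the dynamics stay stable, while its strongly non-normal rank-one structure still amplifies the norm transiently; increasing mu raises the peak.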
