Paper Title

Enabling Resource-Aware Mapping of Spiking Neural Networks via Spatial Decomposition

Paper Authors

Adarsha Balaji, Shihao Song, Anup Das, Jeffrey Krichmar, Nikil Dutt, James Shackleford, Nagarajan Kandasamy, Francky Catthoor

Paper Abstract

With growing model complexity, mapping Spiking Neural Network (SNN)-based applications to tile-based neuromorphic hardware is becoming increasingly challenging. This is because the synaptic storage resources on a tile, viz. a crossbar, can accommodate only a fixed number of pre-synaptic connections per post-synaptic neuron. For complex SNN models that have many pre-synaptic connections per neuron, some connections may need to be pruned after training to fit onto the tile resources, leading to a loss in model quality, e.g., accuracy. In this work, we propose a novel unrolling technique that decomposes a neuron function with many pre-synaptic connections into a sequence of homogeneous neural units, where each neural unit is a function computation node, with two pre-synaptic connections. This spatial decomposition technique significantly improves crossbar utilization and retains all pre-synaptic connections, resulting in no loss of the model quality derived from connection pruning. We integrate the proposed technique within an existing SNN mapping framework and evaluate it using machine learning applications on the DYNAP-SE state-of-the-art neuromorphic hardware. Our results demonstrate an average 60% lower crossbar requirement, 9x higher synapse utilization, 62% lower wasted energy on the hardware, and between 0.8% and 4.6% increase in model quality.
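To make the decomposition concrete: the abstract describes unrolling a neuron with many pre-synaptic connections into a sequence of homogeneous neural units, each a function computation node with exactly two pre-synaptic connections. The Python sketch below illustrates one plausible reading of that chaining, where each unit combines the previous unit's output with one more original input. The names NeuralUnit and unroll_neuron are hypothetical, and the spiking dynamics, synaptic weights, and crossbar placement from the paper are deliberately omitted; this is a structural illustration, not the authors' implementation.

```python
# Illustrative sketch of spatial decomposition (hypothetical names, not the
# paper's code): unroll a neuron with n pre-synaptic connections into a
# chain of homogeneous 2-input neural units.
from dataclasses import dataclass
from typing import List

@dataclass
class NeuralUnit:
    """A function computation node with exactly two pre-synaptic inputs."""
    inputs: List[str]   # names of the two pre-synaptic sources
    output: str         # name of this unit's output

def unroll_neuron(presynaptic: List[str], neuron_id: str) -> List[NeuralUnit]:
    """Decompose a neuron with len(presynaptic) inputs into a sequence of
    2-input units: the first unit combines two original inputs, and each
    later unit combines the previous unit's output with one more input."""
    assert len(presynaptic) >= 2, "need at least two pre-synaptic connections"
    units = []
    carry = presynaptic[0]
    for i, src in enumerate(presynaptic[1:]):
        out = f"{neuron_id}_u{i}"
        units.append(NeuralUnit(inputs=[carry, src], output=out))
        carry = out  # chain: next unit consumes this unit's output
    return units

# A neuron with 5 pre-synaptic connections becomes 4 chained 2-input units,
# so every connection is retained instead of pruned to fit a crossbar.
for u in unroll_neuron(["x0", "x1", "x2", "x3", "x4"], "n0"):
    print(u.inputs, "->", u.output)
```

Under this reading, a neuron with n pre-synaptic connections yields n-1 uniform 2-input units, which is what lets the mapping framework pack them densely onto fixed-size crossbars without dropping connections.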
