Paper Title
A Low-cost Fault Corrector for Deep Neural Networks through Range Restriction
Paper Authors
Paper Abstract
The adoption of deep neural networks (DNNs) in safety-critical domains has engendered serious reliability concerns. A prominent example is hardware transient faults, which are growing in frequency due to progressive technology scaling and can lead to failures in DNNs. This work proposes Ranger, a low-cost fault corrector that directly rectifies faulty outputs caused by transient faults without re-computation. DNNs are inherently resilient to benign faults (which do not cause output corruption), but not to critical faults (which can result in erroneous outputs). Ranger is an automated transformation that selectively restricts the value ranges in DNNs, reducing the large deviations caused by critical faults and transforming them into benign faults that the DNNs' inherent resilience can tolerate. Our evaluation on 8 DNNs demonstrates that Ranger significantly increases the error resilience of the DNNs (by 3x to 50x), with no loss in accuracy and with negligible overhead.
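To illustrate the idea of range restriction described in the abstract, the sketch below inserts a clamping module after activation layers of a PyTorch model. This is a minimal illustration, not the paper's actual transformation: the layer indices, the bound values, and the clamping placement are assumptions for the example; in practice, the per-layer bounds would be derived by profiling fault-free executions on representative inputs.

```python
# Minimal sketch of range restriction: clamp activations to profiled
# [low, high] bounds so that a transient-fault-induced large deviation
# is truncated to a benign value. Bounds and placement are illustrative.
import torch
import torch.nn as nn


class RangeRestrict(nn.Module):
    """Clamp activations to a pre-profiled interval."""

    def __init__(self, low: float, high: float):
        super().__init__()
        self.low, self.high = low, high

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.clamp(x, self.low, self.high)


def insert_range_checks(model: nn.Sequential, bounds: dict) -> nn.Sequential:
    """Rebuild a Sequential model, placing a RangeRestrict module after
    every layer whose index appears in `bounds` (index -> (low, high))."""
    layers = []
    for idx, layer in enumerate(model):
        layers.append(layer)
        if idx in bounds:
            low, high = bounds[idx]
            layers.append(RangeRestrict(low, high))
    return nn.Sequential(*layers)


if __name__ == "__main__":
    # Toy model; in a real setting, bounds come from profiling the DNN
    # (e.g., the maximum activation observed per layer on clean inputs).
    model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4))
    protected = insert_range_checks(model, bounds={1: (0.0, 6.0)})
    out = protected(torch.randn(2, 8))
    print(out.shape)  # torch.Size([2, 4])
```

Because the clamp only truncates values that exceed ranges never observed in fault-free execution, it leaves correct computations untouched, which is consistent with the abstract's claim of no accuracy loss and negligible overhead.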