Paper Title
Nish: A Novel Negative Stimulated Hybrid Activation Function
Paper Authors
Abstract
An activation function has a significant impact on the efficiency and robustness of a neural network. As an alternative, we developed a novel non-monotonic activation function, the Negative Stimulated Hybrid Activation Function (Nish). It acts as a Rectified Linear Unit (ReLU) in the positive region and as a sinus-sigmoidal function in the negative region. In other words, it combines a sigmoid and a sine function, gaining new dynamics over the classical ReLU. We analyzed the consistency of Nish across different combinations of essential network architectures and the most common activation functions on several popular benchmarks. The experimental results show that the accuracy rates achieved by Nish are slightly better than those of Mish in classification tasks.
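The abstract describes a piecewise, hybrid structure: identity (ReLU-like) for positive inputs and a sinus-sigmoidal branch for negative inputs. The exact negative-branch composition is defined in the paper body; the sketch below uses `x * sigmoid(x) * sin(x)` purely as a hypothetical stand-in to illustrate how such a hybrid activation can be assembled.

```python
import math

def sigmoid(x: float) -> float:
    """Standard logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-x))

def nish_sketch(x: float) -> float:
    """Illustrative sketch of a Nish-style hybrid activation.

    Positive region: identity, exactly like ReLU.
    Negative region: a sinus-sigmoidal term. The composition
    x * sigmoid(x) * sin(x) is an assumption for illustration,
    NOT the paper's exact formula.
    """
    if x > 0:
        return x
    return x * sigmoid(x) * math.sin(x)
```

Note that any such negative branch yields nonzero outputs for negative inputs, which is the source of the "new dynamics over classical ReLU" (and, like Mish, makes the function non-monotonic).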