Title
On the stability properties of Gated Recurrent Units neural networks
Authors
Abstract
The goal of this paper is to provide sufficient conditions guaranteeing the Input-to-State Stability (ISS) and the Incremental Input-to-State Stability (δISS) of Gated Recurrent Unit (GRU) neural networks. These conditions, devised for both single-layer and multi-layer architectures, consist of nonlinear inequalities on the network's weights. They can be employed to check the stability of trained networks, or they can be enforced as constraints during the training procedure of a GRU. The resulting training procedure is tested on a Quadruple Tank nonlinear benchmark system, showing satisfactory modeling performance.
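The δISS property described above means that trajectories started from different initial states converge toward one another when driven by the same input sequence. The abstract does not reproduce the paper's inequalities, so the following is a purely illustrative sketch: the single-layer GRU update is the standard one, but the weights are small random matrices (hypothetical, not taken from the paper or from any stability certificate), chosen so that the convergent behavior can be observed numerically.

```python
import numpy as np

def gru_step(x, u, W):
    """One step of a standard single-layer GRU state update."""
    sig = lambda v: 1.0 / (1.0 + np.exp(-v))
    z = sig(W["Wz"] @ u + W["Uz"] @ x + W["bz"])            # update gate
    r = sig(W["Wr"] @ u + W["Ur"] @ x + W["br"])            # reset gate
    h = np.tanh(W["Wh"] @ u + W["Uh"] @ (r * x) + W["bh"])  # candidate state
    return z * x + (1.0 - z) * h

# Hypothetical setup: small random weights (NOT the paper's conditions),
# scaled down so the update map is contractive in practice.
rng = np.random.default_rng(0)
n, m = 3, 2  # state and input dimensions
W = {f"W{g}": 0.1 * rng.standard_normal((n, m)) for g in "zrh"}
W |= {f"U{g}": 0.1 * rng.standard_normal((n, n)) for g in "zrh"}
W |= {f"b{g}": 0.1 * rng.standard_normal(n) for g in "zrh"}

# Two trajectories from different initial states, fed the SAME inputs:
# a deltaISS-type behavior shows up as a vanishing state difference.
xa, xb = np.ones(n), -np.ones(n)
for _ in range(50):
    u = rng.standard_normal(m)
    xa, xb = gru_step(xa, u, W), gru_step(xb, u, W)

print(np.linalg.norm(xa - xb))  # small: the trajectories have converged
```

This only demonstrates the phenomenon on one hand-built example; the paper's contribution is to give weight inequalities that certify such behavior for all inputs and initial conditions, which a single simulation cannot do.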