Paper Title
Parallel, Self Organizing, Consensus Neural Networks
Paper Authors
Paper Abstract
A new neural network architecture, the Parallel, Self-Organizing, Consensus Neural Network (PSCNN), is developed to improve the performance and speed of neural networks. The architecture retains the advantages of previous models, such as self-organization, and adds further strengths such as input parallelism and decision making based on consensus. Because of these properties, implementations were studied on both a parallel processor (the Ncube machine) and a conventional sequential machine. The architecture organizes its own modules in a way that maximizes performance. Since it is completely parallel, both recall and learning are very fast. The network's performance was compared with that of backpropagation networks on problems in language perception, remote sensing, and binary logic (Exclusive-Or). PSCNN showed superior performance in all cases studied.
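The abstract describes decision making by consensus among independently trained, self-organized modules. The sketch below (Python/NumPy) illustrates only that consensus idea on the Exclusive-Or problem mentioned above; the module internals (a fixed random hidden layer with a least-squares readout) and the majority-vote rule are illustrative assumptions, not the PSCNN algorithm itself.

import numpy as np

# Exclusive-Or problem: 4 input patterns, binary targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

rng = np.random.default_rng(0)

class Module:
    """One independently trained module (hypothetical internals:
    fixed random hidden layer + one-shot least-squares readout)."""
    def __init__(self, n_hidden=8):
        self.W = rng.normal(size=(2, n_hidden))  # random input-to-hidden weights
        self.b = rng.normal(size=n_hidden)

    def _features(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        # Solve the readout weights by least squares (no backpropagation).
        H = self._features(X)
        self.w_out, *_ = np.linalg.lstsq(H, y, rcond=None)

    def predict(self, X):
        return (self._features(X) @ self.w_out > 0.5).astype(int)

# Train several modules independently; each could run on its own processor.
modules = [Module() for _ in range(5)]
for m in modules:
    m.fit(X, y)

# Consensus decision: majority vote over the module outputs.
votes = np.stack([m.predict(X) for m in modules])  # shape (n_modules, n_samples)
consensus = (votes.mean(axis=0) > 0.5).astype(int)

print("module votes:\n", votes)
print("consensus   :", consensus)
print("targets     :", y.astype(int))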