Paper Title

Breaking the Expressive Bottlenecks of Graph Neural Networks

Authors

Yang, Mingqi, Shen, Yanming, Qi, Heng, Yin, Baocai

Abstract

Recently, the Weisfeiler-Lehman (WL) graph isomorphism test has been used to measure the expressiveness of graph neural networks (GNNs), showing that neighborhood-aggregation GNNs are at most as powerful as the 1-WL test in distinguishing graph structures. Improvements have also been proposed in analogy to the $k$-WL test ($k>1$). However, the aggregators in these GNNs are far from injective as the WL test requires, and their weak distinguishing strength makes them the expressive bottlenecks. In this paper, we improve expressiveness by exploring powerful aggregators. We reformulate aggregation with the corresponding aggregation coefficient matrix, and then systematically analyze the requirements on this matrix for building more powerful, even injective, aggregators. This can also be viewed as a strategy for preserving the rank of hidden features, and it implies that basic aggregators correspond to a special case of low-rank transformations. We also show the necessity of applying nonlinear units ahead of aggregation, which differs from most aggregation-based GNNs. Based on our theoretical analysis, we develop two GNN layers, ExpandingConv and CombConv. Experimental results show that our models significantly boost performance, especially on large and densely connected graphs.
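To make the abstract's central idea concrete, here is a minimal NumPy sketch of aggregation rewritten as multiplication by an aggregation coefficient matrix, with the nonlinearity applied before aggregation as the paper advocates. All names and the rank-based check are illustrative assumptions, not the paper's actual API or definitions.

```python
import numpy as np

def aggregate(H, C, sigma=np.tanh):
    """Aggregate hidden features H (n x d) using coefficient matrix C (n x n).

    The nonlinearity sigma is applied *before* aggregation, following the
    abstract's observation; most aggregation-based GNNs apply it afterwards.
    """
    return C @ sigma(H)

# Toy 3-node path graph (edges 0-1 and 1-2), with self-loops:
A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]], dtype=float)

# A basic mean aggregator corresponds to a row-normalized coefficient
# matrix; such matrices can be rank-deficient, which the paper links to
# weak distinguishing strength (a low-rank transformation).
C_mean = A / A.sum(axis=1, keepdims=True)

H = np.random.randn(3, 4)        # hidden features, 4 dims per node
out = aggregate(H, C_mean)       # shape (3, 4)

# A full-rank coefficient matrix (here: identity plus scaled adjacency,
# a hypothetical choice for illustration) preserves the rank of the
# hidden features, the property the paper associates with more powerful,
# even injective, aggregators.
C_full = np.eye(3) + 0.5 * A
assert np.linalg.matrix_rank(C_full) == 3
```

The contrast between `C_mean` and `C_full` is only meant to illustrate the rank-preservation viewpoint; the paper's ExpandingConv and CombConv layers construct their coefficient matrices differently.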
