Paper Title

Linkless Link Prediction via Relational Distillation

Authors

Zhichun Guo, William Shiao, Shichang Zhang, Yozen Liu, Nitesh V. Chawla, Neil Shah, Tong Zhao

Abstract

Graph Neural Networks (GNNs) have shown exceptional performance in the task of link prediction. Despite their effectiveness, the high latency brought by non-trivial neighborhood data dependency limits GNNs in practical deployments. Conversely, the known efficient MLPs are much less effective than GNNs due to the lack of relational knowledge. In this work, to combine the advantages of GNNs and MLPs, we start with exploring direct knowledge distillation (KD) methods for link prediction, i.e., predicted logit-based matching and node representation-based matching. Upon observing direct KD analogs do not perform well for link prediction, we propose a relational KD framework, Linkless Link Prediction (LLP), to distill knowledge for link prediction with MLPs. Unlike simple KD methods that match independent link logits or node representations, LLP distills relational knowledge that is centered around each (anchor) node to the student MLP. Specifically, we propose rank-based matching and distribution-based matching strategies that complement each other. Extensive experiments demonstrate that LLP boosts the link prediction performance of MLPs with significant margins, and even outperforms the teacher GNNs on 7 out of 8 benchmarks. LLP also achieves a 70.68x speedup in link prediction inference compared to GNNs on the large-scale OGB dataset.
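
The abstract only names the two matching strategies; below is a minimal PyTorch sketch of what anchor-centered matching losses of this kind could look like. It assumes link scores are plain logits, that distribution-based matching is a temperature-scaled KL divergence over the scores of context nodes sampled around one anchor, and that rank-based matching is a margin ranking loss on teacher-ordered pairs. All function names, hyperparameters, and score definitions here are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

# Hypothetical sketch of two anchor-centered matching losses, in the spirit of
# the rank-based and distribution-based strategies named in the abstract.
# Assumptions (not from the abstract): teacher/student link scores are logits
# over a set of context nodes sampled around the same anchor node.

def distribution_matching_loss(teacher_scores, student_scores, temperature=1.0):
    """KL divergence between teacher and student link-score distributions
    over the anchor's context nodes (assumed form of distribution matching)."""
    t = F.softmax(teacher_scores / temperature, dim=-1)
    s = F.log_softmax(student_scores / temperature, dim=-1)
    return F.kl_div(s, t, reduction="batchmean")

def rank_matching_loss(teacher_scores, student_scores, margin=0.1):
    """Margin ranking loss: for each pair of context nodes, the student should
    preserve the ordering implied by the teacher (assumed form of rank matching)."""
    n = teacher_scores.size(-1)
    i, j = torch.triu_indices(n, n, offset=1)          # all unordered pairs
    sign = torch.sign(teacher_scores[..., i] - teacher_scores[..., j])
    return F.margin_ranking_loss(
        student_scores[..., i], student_scores[..., j], sign, margin=margin
    )

# Toy usage: logits of 6 context nodes with respect to one anchor node.
teacher = torch.randn(6)                        # e.g., GNN link logits (frozen)
student = torch.randn(6, requires_grad=True)    # e.g., MLP link logits
loss = distribution_matching_loss(teacher, student) + rank_matching_loss(teacher, student)
loss.backward()
```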
