Paper Title


Modern graph neural networks do worse than classical greedy algorithms in solving combinatorial optimization problems like maximum independent set

Authors

Maria Chiara Angelini and Federico Ricci-Tersenghi

Abstract

The recent work "Combinatorial Optimization with Physics-Inspired Graph Neural Networks" [Nat Mach Intell 4 (2022) 367] introduces a physics-inspired unsupervised Graph Neural Network (GNN) to solve combinatorial optimization problems on sparse graphs. To test the performance of these GNNs, the authors of that work show numerical results for two fundamental problems: maximum cut and maximum independent set (MIS). They conclude that "the graph neural network optimizer performs on par or outperforms existing solvers, with the ability to scale beyond the state of the art to problems with millions of variables." In this comment, we show that a simple greedy algorithm, running in almost linear time, can find solutions to the MIS problem of much better quality than the GNN. The greedy algorithm is also faster than the GNN by a factor of $10^4$ on problems with a million variables. We see no good reason to solve the MIS with these GNNs, just as there is no reason to use a sledgehammer to crack nuts. More generally, many claims about the superiority of neural networks in solving combinatorial problems are at risk of not being solid enough, since we lack standard benchmarks based on really hard problems. We propose one such hard benchmark, and we hope to see future neural network optimizers tested on these problems before any claim of superiority is made.
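The abstract's claim rests on a simple degree-based greedy heuristic for MIS. As a rough illustration (an assumption on our part — the comment does not spell out the exact variant here, and the authors' implementation may differ), the standard minimum-degree greedy repeatedly adds a lowest-degree vertex to the independent set and deletes it together with its neighbors; with a heap this runs in near-linear time on sparse graphs:

```python
import heapq

def greedy_mis(adj):
    """Minimum-degree greedy heuristic for maximum independent set.

    adj: dict mapping each vertex to the set of its neighbors.
    Returns a maximal independent set as a Python set.
    """
    degree = {v: len(adj[v]) for v in adj}
    heap = [(degree[v], v) for v in adj]  # (current degree, vertex)
    heapq.heapify(heap)
    removed = set()
    independent = set()
    while heap:
        d, v = heapq.heappop(heap)
        # Skip vertices already deleted and stale heap entries
        # whose recorded degree no longer matches.
        if v in removed or d != degree[v]:
            continue
        independent.add(v)
        removed.add(v)
        for u in adj[v]:          # delete all neighbors of v
            if u in removed:
                continue
            removed.add(u)
            for w in adj[u]:      # their surviving neighbors lose a degree
                if w not in removed:
                    degree[w] -= 1
                    heapq.heappush(heap, (degree[w], w))
    return independent
```

For example, on the path graph 0-1-2-3 the heuristic returns an independent set of size 2, which is optimal for that graph. Each edge is touched a bounded number of times, which is what makes the almost-linear running time on sparse graphs plausible.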
