Paper Title
Random Laplacian Features for Learning with Hyperbolic Space
Paper Authors
Paper Abstract
Due to its geometric properties, hyperbolic space can support high-fidelity embeddings of tree- and graph-structured data, upon which various hyperbolic networks have been developed. Existing hyperbolic networks encode geometric priors not only for the input, but also at every layer of the network. This approach involves repeatedly mapping to and from hyperbolic space, which makes these networks complicated to implement, computationally expensive to scale, and numerically unstable to train. In this paper, we propose a simpler approach: learn a hyperbolic embedding of the input, then map once from it to Euclidean space using a mapping that encodes geometric priors by respecting the isometries of hyperbolic space, and finish with a standard Euclidean network. The key insight is to use a random feature mapping via the eigenfunctions of the Laplace operator, which we show can approximate any isometry-invariant kernel on hyperbolic space. Our method can be used together with any graph neural network: even using a linear graph model yields significant improvements in both efficiency and performance over other hyperbolic baselines on both transductive and inductive tasks.
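The core mechanism the abstract describes, a random feature map built from eigenfunctions of the hyperbolic Laplacian, can be sketched concretely. Below is a minimal NumPy sketch, not the authors' released code, assuming the Poincaré ball model: each feature takes the standard eigenfunction form exp(((d-1)/2) * B_b(x)) * cos(lam * B_b(x) + theta), where B_b(x) = log((1 - ||x||^2) / ||x - b||^2) is the Busemann function toward a boundary point b. The function names and the Gaussian sampling of the frequency lam are illustrative assumptions, not the paper's exact recipe.

```python
import numpy as np

def busemann(x, b):
    """Busemann function B_b(x) in the Poincare ball model.

    x: (N, d) points with ||x|| < 1;  b: (R, d) boundary points with ||b|| = 1.
    Returns an (N, R) matrix of log((1 - ||x||^2) / ||x - b||^2).
    """
    one_minus_sq = 1.0 - np.sum(x ** 2, axis=1, keepdims=True)          # (N, 1)
    sq_dist = np.sum((x[:, None, :] - b[None, :, :]) ** 2, axis=2)      # (N, R)
    return np.log(one_minus_sq / sq_dist)

def random_laplacian_features(x, num_features=1000, lambda_scale=1.0, seed=0):
    """Map points in the Poincare ball to Euclidean random features.

    Each feature is a real eigenfunction of the hyperbolic Laplacian:
    exp(((d-1)/2) * B_b(x)) * cos(lam * B_b(x) + theta), with the boundary
    direction b uniform on the sphere, frequency lam Gaussian (an assumed
    sampling choice), and phase theta uniform on [0, 2*pi).
    """
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    b = rng.normal(size=(num_features, d))
    b /= np.linalg.norm(b, axis=1, keepdims=True)         # boundary points on the sphere
    lam = lambda_scale * rng.normal(size=num_features)    # random frequencies
    theta = rng.uniform(0.0, 2 * np.pi, size=num_features)
    B = busemann(x, b)                                    # (N, R)
    feats = np.exp(0.5 * (d - 1) * B) * np.cos(lam * B + theta)
    return feats / np.sqrt(num_features)

# Example: embed a few points near the origin, then feed `feats` to any
# standard Euclidean model (e.g., a linear classifier or GNN layer).
x = 0.1 * np.random.default_rng(1).normal(size=(5, 2))
feats = random_laplacian_features(x, num_features=256)
print(feats.shape)  # (5, 256)
```

As in Euclidean random Fourier features, averaging products of such random features estimates a kernel; here the construction respects hyperbolic isometries because the Busemann function depends on x only through hyperbolic geometry, which is what lets a single fixed map replace the per-layer hyperbolic operations of prior networks.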