Paper Title

Stationary Kernels and Gaussian Processes on Lie Groups and their Homogeneous Spaces I: the compact case

Paper Authors

Iskander Azangulov, Andrei Smolensky, Alexander Terenin, Viacheslav Borovitskiy

Paper Abstract

Gaussian processes are arguably the most important class of spatiotemporal models within machine learning. They encode prior information about the modeled function and can be used for exact or approximate Bayesian learning. In many applications, particularly in physical sciences and engineering, but also in areas such as geostatistics and neuroscience, invariance to symmetries is one of the most fundamental forms of prior information one can consider. The invariance of a Gaussian process' covariance to such symmetries gives rise to the most natural generalization of the concept of stationarity to such spaces. In this work, we develop constructive and practical techniques for building stationary Gaussian processes on a very large class of non-Euclidean spaces arising in the context of symmetries. Our techniques make it possible to (i) calculate covariance kernels and (ii) sample from prior and posterior Gaussian processes defined on such spaces, both in a practical manner. This work is split into two parts, each involving different technical considerations: part I studies compact spaces, while part II studies non-compact spaces possessing certain structure. Our contributions make the non-Euclidean Gaussian process models we study compatible with well-understood computational techniques available in standard Gaussian process software packages, thereby making them accessible to practitioners.
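
To make claim (i) above concrete, here is a minimal illustrative sketch of a stationary kernel on the simplest compact Lie group, the circle S¹ ≅ SO(2), built from a truncated Laplacian eigenfunction (Fourier) expansion weighted by a Matérn-type spectrum. This is not the paper's implementation or any particular package's API; the function names (`matern_spectrum`, `circle_kernel`), parameter choices, and normalization are assumptions made for illustration only.

```python
import numpy as np

def matern_spectrum(lam, nu=1.5, kappa=1.0):
    # Matérn-type spectral density evaluated at a Laplacian eigenvalue `lam`
    # (dimension d = 1 for the circle); the overall normalization is left arbitrary.
    return (2.0 * nu / kappa**2 + lam) ** (-nu - 0.5)

def circle_kernel(x, y, num_levels=50, nu=1.5, kappa=1.0):
    # Stationary Matérn-like kernel on the circle S^1, via a truncated Fourier
    # (Laplacian eigenfunction) expansion. `x`, `y` are angles in radians;
    # the result depends only on the difference x - y.
    k = matern_spectrum(0.0, nu, kappa) * np.ones(np.broadcast(x, y).shape)  # constant mode, n = 0
    for n in range(1, num_levels + 1):
        lam = float(n ** 2)  # eigenvalue of -d^2/dtheta^2 for frequency n
        # cos(nx)cos(ny) + sin(nx)sin(ny) = cos(n(x - y)): manifestly stationary
        k = k + matern_spectrum(lam, nu, kappa) * np.cos(n * (x - y))
    return k

# Usage: a small covariance matrix over a uniform grid of angles.
theta = np.linspace(0.0, 2.0 * np.pi, 5, endpoint=False)
K = circle_kernel(theta[:, None], theta[None, :])
print(K.shape)  # (5, 5); K[i, j] depends only on theta[i] - theta[j]
```

On a uniform grid the resulting matrix is circulant, reflecting invariance of the kernel under rotations of the circle; the same spectral recipe, with the Fourier modes replaced by matrix coefficients of irreducible representations or eigenfunctions of the Laplace–Beltrami operator, is what extends to general compact Lie groups and their homogeneous spaces.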
