Title


Scalable symmetric Tucker tensor decomposition

Authors

Jin, Ruhui, Kileel, Joe, Kolda, Tamara G., Ward, Rachel

Abstract


We study the best low-rank Tucker decomposition of symmetric tensors. The motivating application is decomposing higher-order multivariate moments. Moment tensors have special structure and are important to various data science problems. We advocate for the projected gradient descent (PGD) method and the higher-order eigenvalue decomposition (HOEVD) approximation as computation schemes. Most importantly, we develop scalable adaptations of the basic PGD and HOEVD methods to decompose sample moment tensors. With the help of implicit and streaming techniques, we evade the overhead cost of building and storing the moment tensor. These reductions make computing the Tucker decomposition feasible for large data instances in high dimensions. Numerical experiments demonstrate the efficiency of the algorithms and the applicability of moment tensor decompositions to real-world datasets. Finally, we study convergence on the Grassmannian manifold and prove that the update sequence derived by the PGD solver achieves first- and second-order criticality.
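To make the setting concrete, here is a minimal sketch (not the paper's implementation) of an HOEVD-style symmetric Tucker approximation applied to an empirical third-moment tensor. The function name `hoevd` and all variable names are our own; the sketch assumes numpy and materializes the moment tensor explicitly, which is exactly the overhead the paper's implicit and streaming adaptations avoid.

```python
import numpy as np

def hoevd(T, r):
    """One-shot HOEVD-style symmetric Tucker approximation (illustrative sketch).

    T : symmetric tensor of shape (d, d, d)
    r : target Tucker rank
    Returns a core tensor G of shape (r, r, r) and a factor U with
    orthonormal columns, so that T is approximated by G x_1 U x_2 U x_3 U.
    """
    d = T.shape[0]
    M = T.reshape(d, -1)              # mode-1 unfolding; any mode works by symmetry
    _, V = np.linalg.eigh(M @ M.T)    # eigenvectors of the Gram matrix, ascending order
    U = V[:, -r:]                     # top-r eigenvectors span the factor subspace
    # Core: contract T with U^T along every mode.
    G = np.einsum('abc,ai,bj,ck->ijk', T, U, U, U)
    return G, U

# Empirical third-moment tensor of synthetic data (the motivating application).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 8))                  # 100 samples in dimension 8
T = np.einsum('na,nb,nc->abc', X, X, X) / 100.0    # (1/n) * sum of x ⊗ x ⊗ x

G, U = hoevd(T, 3)
T_hat = np.einsum('ijk,ai,bj,ck->abc', G, U, U, U)  # low-rank reconstruction
```

Because a single factor matrix is shared across all modes, only one eigenvalue decomposition is needed; the PGD method studied in the paper would instead refine the subspace spanned by `U` iteratively on the Grassmannian.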
