Paper Title
Tensor-based Sequential Learning via Hankel Matrix Representation for Next Item Recommendations
Paper Authors
Paper Abstract
Self-attentive transformer models have recently been shown to solve the next item recommendation task very efficiently. The learned attention weights capture sequential dynamics in user behavior and generalize well. Motivated by the special structure of the learned parameter space, we question whether it is possible to mimic it with an alternative and more lightweight approach. We develop a new tensor factorization-based model that ingrains structural knowledge about sequential data within the learning process. We demonstrate how certain properties of a self-attention network can be reproduced with our approach based on a special Hankel matrix representation. The resulting model has a shallow linear architecture and compares competitively to its neural counterpart.
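For intuition about the Hankel matrix representation mentioned in the abstract, the sketch below shows one common way to arrange a user's item sequence into a Hankel matrix of lagged windows using `scipy.linalg.hankel`. The window length, the padding scheme, and the helper name `sequence_to_hankel` are illustrative assumptions and not the paper's exact construction.

```python
# Minimal sketch (assumption): represent a user's item sequence as a Hankel
# matrix whose rows are lagged windows of the sequence. The window length and
# zero-padding below are illustrative choices, not the paper's exact setup.
import numpy as np
from scipy.linalg import hankel


def sequence_to_hankel(item_ids, window=3, pad_id=0):
    """Arrange a 1-D item-ID sequence into a Hankel matrix.

    Each row holds `window` consecutive items; successive rows are shifted by
    one position, so every anti-diagonal repeats a single item ID.
    """
    seq = np.asarray(item_ids)
    # Pad the tail so the final window is complete (illustrative assumption).
    padded = np.concatenate([seq, np.full(window - 1, pad_id)])
    # hankel(c, r): c is the first column (the full sequence),
    # r is the last row (the padded final window).
    return hankel(padded[: len(seq)], padded[len(seq) - 1:])


if __name__ == "__main__":
    # Example: a short interaction history of item IDs.
    history = [11, 25, 7, 42, 3]
    H = sequence_to_hankel(history, window=3)
    print(H)
    # [[11 25  7]
    #  [25  7 42]
    #  [ 7 42  3]
    #  [42  3  0]
    #  [ 3  0  0]]
```

Stacking such per-user matrices yields a tensor in which the lag structure of the sequence is hard-wired into the data layout, which is the kind of structural prior the abstract says the factorization model exploits in place of learned attention weights.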