Paper title
Deep Latent State Space Models for Time-Series Generation
Paper authors
Paper abstract
Methods based on ordinary differential equations (ODEs) are widely used to build generative models of time series. In addition to the high computational overhead of explicitly computing the hidden-state recurrence, existing ODE-based models fall short when learning sequence data with sharp transitions (common in many real-world systems) due to numerical challenges during optimization. In this work, we propose LS4, a generative model for sequences whose latent variables evolve according to a state space ODE to increase modeling capacity. Inspired by recent deep state space models (S4), we achieve speedups by leveraging a convolutional representation of LS4 that bypasses the explicit evaluation of hidden states. We show that LS4 significantly outperforms previous continuous-time generative models in terms of marginal distribution, classification, and prediction scores on real-world datasets from the Monash Forecasting Repository, and is capable of modeling highly stochastic data with sharp temporal transitions. LS4 sets a new state of the art for continuous-time latent generative models, with significant improvements in mean squared error and tighter variational lower bounds on irregularly sampled datasets, while also being 100x faster than other baselines on long sequences.
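For readers unfamiliar with the convolutional view mentioned in the abstract, the sketch below is a minimal, generic illustration rather than the authors' implementation. A discretized linear state space layer x_k = A_bar x_{k-1} + B_bar u_k, y_k = C x_k can be evaluated either by explicitly stepping the hidden state (a recurrence) or, equivalently, by convolving the input with the kernel K = (C B_bar, C A_bar B_bar, C A_bar^2 B_bar, ...); the latter is the kind of representation that S4-style models, including LS4, exploit to avoid materializing hidden states. All names here (ssm_recurrent, ssm_convolutional, A_bar, B_bar, C) are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the authors' code): a discretized linear state space layer
#   x_k = A_bar x_{k-1} + B_bar u_k,   y_k = C x_k,
# evaluated two equivalent ways: an explicit hidden-state recurrence, and a
# convolution with the kernel K = (C B_bar, C A_bar B_bar, C A_bar^2 B_bar, ...).

def ssm_recurrent(A_bar, B_bar, C, u):
    """Step the hidden state explicitly (what convolutional SSMs avoid)."""
    N = A_bar.shape[0]
    x = np.zeros(N)
    ys = []
    for u_k in u:
        x = A_bar @ x + B_bar[:, 0] * u_k   # state update
        ys.append(C[0] @ x)                 # readout
    return np.array(ys)

def ssm_convolutional(A_bar, B_bar, C, u):
    """Same output via a length-L convolution kernel (FFT-able in practice)."""
    L = len(u)
    K = np.array([(C @ np.linalg.matrix_power(A_bar, k) @ B_bar)[0, 0]
                  for k in range(L)])
    return np.convolve(u, K)[:L]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, L = 4, 64
    A_bar = 0.9 * np.eye(N) + 0.05 * rng.standard_normal((N, N))
    B_bar = rng.standard_normal((N, 1))
    C = rng.standard_normal((1, N))
    u = rng.standard_normal(L)
    # Both evaluation strategies agree on the output sequence.
    assert np.allclose(ssm_recurrent(A_bar, B_bar, C, u),
                       ssm_convolutional(A_bar, B_bar, C, u), atol=1e-6)
```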