Paper Title

Quantum Long Short-Term Memory

Authors

Samuel Yen-Chi Chen, Shinjae Yoo, Yao-Lung L. Fang

Abstract

Long short-term memory (LSTM) is a kind of recurrent neural network (RNN) for modeling sequential and temporally dependent data, and its effectiveness has been extensively established. In this work, we propose a hybrid quantum-classical model of LSTM, which we dub QLSTM. We demonstrate that the proposed model successfully learns several kinds of temporal data. In particular, we show that for certain testing cases, this quantum version of LSTM converges faster, or equivalently, reaches a better accuracy, than its classical counterpart. Due to the variational nature of our approach, the requirements on qubit counts and circuit depth are eased, and our work thus paves the way toward implementing machine learning algorithms for sequence modeling on noisy intermediate-scale quantum (NISQ) devices.
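The hybrid quantum-classical idea described in the abstract can be illustrated with a toy sketch: a scalar LSTM cell in which each gate's pre-activation is produced by a small variational quantum circuit (VQC) rather than a classical affine layer. The circuit layout below (2 qubits, RY angle encoding, one trainable RY layer, one CNOT) and all function names are illustrative assumptions, not the paper's actual architecture; the quantum state is simulated directly with NumPy so the example is self-contained.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix (real-valued)."""
    c, s = np.cos(theta / 2.0), np.sin(theta / 2.0)
    return np.array([[c, -s], [s, c]])

# CNOT with qubit 0 as control, qubit 1 as target (basis order 00,01,10,11).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def apply_single(state, gate, qubit, n_qubits=2):
    """Apply a single-qubit gate to one qubit of an n-qubit state vector."""
    ops = [np.eye(2)] * n_qubits
    ops[qubit] = gate
    full = ops[0]
    for op in ops[1:]:
        full = np.kron(full, op)
    return full @ state

def vqc(angles, params):
    """Toy 2-qubit variational circuit: angle-encode the inputs with RY,
    apply trainable RY rotations and a CNOT, return per-qubit <Z>."""
    state = np.zeros(4)
    state[0] = 1.0  # start in |00>
    for q in range(2):
        state = apply_single(state, ry(angles[q]), q)
    for q in range(2):
        state = apply_single(state, ry(params[q]), q)
    state = CNOT @ state
    probs = state ** 2  # state stays real: RY and CNOT are real matrices
    z0 = probs[0] + probs[1] - probs[2] - probs[3]
    z1 = probs[0] - probs[1] + probs[2] - probs[3]
    return np.array([z0, z1])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def qlstm_cell(x, h, c, thetas):
    """One step of a scalar QLSTM cell (hypothetical layout): the four gate
    pre-activations of a classical LSTM come from VQCs fed [x_t, h_{t-1}]."""
    angles = np.array([x, h])
    f = sigmoid(vqc(angles, thetas[0]).mean())  # forget gate
    i = sigmoid(vqc(angles, thetas[1]).mean())  # input gate
    g = np.tanh(vqc(angles, thetas[2]).mean())  # candidate cell update
    o = sigmoid(vqc(angles, thetas[3]).mean())  # output gate
    c_new = f * c + i * g                       # classical cell-state update
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```

To process a sequence, the cell is unrolled step by step while carrying `(h, c)` forward; in a variational setting the angles in `thetas` would be trained by gradient descent, e.g. via the parameter-shift rule on hardware or backpropagation through a simulator.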
