Paper Title

Quantum Natural Gradient with Efficient Backtracking Line Search

Paper Authors

Touheed Anwar Atif, Uchenna Chukwu, Jesse Berwald, Raouf Dridi

Paper Abstract

We consider the Quantum Natural Gradient Descent (QNGD) scheme, which was recently proposed for training variational quantum algorithms. QNGD is Steepest Gradient Descent (SGD) operating on the complex projective space equipped with the Fubini-Study metric. Here we present an adaptive implementation of QNGD based on Armijo's rule, an efficient backtracking line search that enjoys proven convergence. The proposed algorithm is tested using noisy simulators on three different models with various initializations. Our results show that Adaptive QNGD dynamically adapts the step size and consistently outperforms the original QNGD, which requires knowledge of the optimal step size to perform competitively. In addition, we show that the additional complexity involved in performing the line search in Adaptive QNGD is minimal, ensuring that the gains provided by the proposed adaptive strategy dominate any increase in complexity. Additionally, our benchmarking demonstrates that a simple SGD algorithm (implemented in Euclidean space) equipped with the adaptive scheme above can yield performance similar to the QNGD scheme with the optimal step size. Our results are yet another confirmation of the importance of differential geometry in variational quantum computation. As a matter of fact, we foresee advanced mathematics playing a prominent role in the NISQ era, guiding the design of faster and more efficient algorithms.
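For concreteness, the following is a minimal Python sketch of one adaptive step of the kind described in the abstract: a natural-gradient direction (the Fubini-Study metric inverse applied to the gradient) combined with an Armijo backtracking line search. The callables cost, grad, and metric_tensor, as well as the defaults eta0, beta, and c, are illustrative assumptions and not taken from the paper; this is not the authors' implementation.

```python
import numpy as np

def armijo_qngd_step(theta, cost, grad, metric_tensor,
                     eta0=1.0, beta=0.5, c=1e-4, max_halvings=20):
    """One Adaptive-QNGD step (illustrative sketch).

    `cost`, `grad`, and `metric_tensor` are assumed callables returning the
    objective value, its Euclidean gradient, and the Fubini-Study metric
    (quantum Fisher information matrix) at `theta`, respectively.
    """
    g = grad(theta)
    F = metric_tensor(theta)
    # Natural-gradient direction: solve F d = g (pseudo-inverse for stability).
    d = np.linalg.pinv(F) @ g
    f0 = cost(theta)
    eta = eta0
    # Armijo rule: shrink the step until sufficient decrease is achieved.
    for _ in range(max_halvings):
        if cost(theta - eta * d) <= f0 - c * eta * (g @ d):
            break
        eta *= beta
    return theta - eta * d, eta
```

The backtracking loop only requires extra evaluations of the cost function, not of the gradient or the metric tensor, which is consistent with the abstract's claim that the added complexity of the line search is minimal.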
