Paper Title
Supplementing Recurrent Neural Network Wave Functions with Symmetry and Annealing to Improve Accuracy
Paper Authors
Paper Abstract
Recurrent neural networks (RNNs) are a class of neural networks that have emerged from the paradigm of artificial intelligence and have enabled many interesting advances in the field of natural language processing. Interestingly, these architectures were shown to be powerful ansatze for approximating the ground state of quantum systems. Here, we build on the results of [Phys. Rev. Research 2, 023358 (2020)] and construct a more powerful RNN wave function ansatz in two dimensions. We use symmetry and annealing to obtain accurate estimates of ground state energies of the two-dimensional (2D) Heisenberg model, on the square lattice and on the triangular lattice. We show that our method is superior to the Density Matrix Renormalisation Group (DMRG) for system sizes larger than or equal to $14 \times 14$ on the triangular lattice.
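To make the notion of an RNN wave function concrete, the following is a minimal illustrative sketch (not the paper's 2D architecture): a positive-amplitude ansatz in one dimension, where the amplitude factorizes autoregressively as $\psi(s) = \prod_i \sqrt{p(s_i \mid s_{<i})}$ and each conditional is produced by a vanilla RNN cell followed by a softmax over the two spin states. All layer sizes, weights, and function names here are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, H = 6, 8  # number of spins, hidden units (illustrative sizes)

# Randomly initialized RNN parameters (a real ansatz would train these
# variationally to minimize the energy).
Wh = rng.normal(scale=0.1, size=(H, H))  # hidden-to-hidden weights
Ws = rng.normal(scale=0.1, size=(H, 2))  # input (one-hot spin) weights
U = rng.normal(scale=0.1, size=(2, H))   # hidden-to-logits weights
b = np.zeros(H)
c = np.zeros(2)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sample_and_logpsi():
    """Sample one spin configuration autoregressively; return (config, log psi)."""
    h = np.zeros(H)
    s_prev = np.zeros(2)  # one-hot of the previous spin (zeros before the first site)
    config, logpsi = [], 0.0
    for _ in range(N):
        h = np.tanh(Wh @ h + Ws @ s_prev + b)
        p = softmax(U @ h + c)        # conditional p(s_i | s_{<i})
        s = rng.choice(2, p=p)        # exact (autocorrelation-free) sampling
        logpsi += 0.5 * np.log(p[s])  # psi = prod_i sqrt(p_i)
        s_prev = np.eye(2)[s]
        config.append(int(s))
    return np.array(config), logpsi

def logpsi_of(config):
    """Evaluate log psi for a given configuration (same forward pass, no sampling)."""
    h, s_prev, lp = np.zeros(H), np.zeros(2), 0.0
    for s in config:
        h = np.tanh(Wh @ h + Ws @ s_prev + b)
        p = softmax(U @ h + c)
        lp += 0.5 * np.log(p[s])
        s_prev = np.eye(2)[s]
    return lp
```

Because each conditional is a normalized softmax, the ansatz is normalized by construction, $\sum_s |\psi(s)|^2 = 1$, which is what allows direct sampling without Markov chains; in a variational Monte Carlo loop, such samples would be used to estimate the energy and its gradients.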