Paper Title

Neural Laplace: Learning diverse classes of differential equations in the Laplace domain

Authors

Samuel Holt, Zhaozhi Qian, Mihaela van der Schaar

Abstract

Neural Ordinary Differential Equations model dynamical systems with ODEs learned by neural networks. However, ODEs are fundamentally inadequate to model systems with long-range dependencies or discontinuities, which are common in engineering and biological systems. Broader classes of differential equations (DE) have been proposed as remedies, including delay differential equations and integro-differential equations. Furthermore, Neural ODE suffers from numerical instability when modelling stiff ODEs and ODEs with piecewise forcing functions. In this work, we propose Neural Laplace, a unified framework for learning diverse classes of DEs including all the aforementioned ones. Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials. To make learning more efficient, we use the geometrical stereographic map of a Riemann sphere to induce more smoothness in the Laplace domain. In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs, including the ones with complex history dependency and abrupt changes.
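The abstract mentions using the geometrical stereographic map of a Riemann sphere to induce smoothness in the Laplace domain. A minimal sketch of that projection and its inverse is below; the function names `to_sphere` and `to_plane` are illustrative choices, not identifiers from the paper, and this shows only the coordinate map itself, not the learned model.

```python
def to_sphere(s: complex) -> tuple[float, float, float]:
    """Inverse stereographic projection: map a point of the complex
    (Laplace) plane onto the unit Riemann sphere, so that the whole
    plane is represented by a bounded, smooth coordinate patch."""
    d = 1.0 + abs(s) ** 2
    return (2 * s.real / d, 2 * s.imag / d, (abs(s) ** 2 - 1) / d)

def to_plane(X: float, Y: float, Z: float) -> complex:
    """Stereographic projection back to the plane. Undefined at the
    north pole Z = 1, which corresponds to the point at infinity."""
    return complex(X, Y) / (1.0 - Z)

# Round trip: a Laplace-domain query point lands on the unit sphere
# and is recovered exactly by the inverse map.
s = 3.0 - 4.0j
X, Y, Z = to_sphere(s)
assert abs(X**2 + Y**2 + Z**2 - 1.0) < 1e-12  # on the unit sphere
assert abs(to_plane(X, Y, Z) - s) < 1e-12     # inverse recovers s
```

Working on the sphere keeps coordinates bounded even as |s| grows, which is one plausible reading of why the map "induces more smoothness" for a neural network predicting values over the Laplace domain.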
