Paper Title
Replacing Automatic Differentiation by Sobolev Cubatures fastens Physics Informed Neural Nets and strengthens their Approximation Power
Paper Authors
Paper Abstract
We present a novel class of approximations for variational losses, applicable to the training of physics-informed neural nets (PINNs). The loss formulation reflects classic Sobolev space theory for partial differential equations (PDEs) and their weak formulations. The loss computation rests on an extension of Gauss-Legendre cubatures, which we term Sobolev cubatures, replacing automatic differentiation (A.D.). We prove that the runtime complexity of training the resulting Sobolev-PINNs (SC-PINNs) is lower than that of PINNs relying on A.D. On top of the one-to-two orders of magnitude speed-up, the SC-PINNs are demonstrated to achieve closer solution approximations than established PINNs for prominent forward and inverse PDE problems.
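The abstract's central mechanism, computing derivatives and variational losses on Gauss-Legendre nodes rather than through automatic differentiation, can be illustrated in a few lines. The sketch below is not the paper's implementation; it assumes the core recipe is to (1) evaluate a candidate solution at Gauss-Legendre nodes, (2) apply a precomputed Lagrange differentiation matrix to obtain derivatives, and (3) weight the squared PDE residual with the cubature weights to approximate an L2-type loss. All names here are hypothetical.

```python
import numpy as np
from numpy.polynomial.legendre import leggauss

def lagrange_diff_matrix(nodes):
    """Differentiation matrix D such that D @ f(nodes) approximates
    f'(nodes) for the polynomial interpolant through the given nodes."""
    n = len(nodes)
    # Barycentric weights w_j = 1 / prod_{k != j} (x_j - x_k).
    w = np.ones(n)
    for j in range(n):
        for k in range(n):
            if k != j:
                w[j] /= nodes[j] - nodes[k]
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                D[i, j] = (w[j] / w[i]) / (nodes[i] - nodes[j])
        D[i, i] = -D[i].sum()  # each row of D sums to zero
    return D

# Gauss-Legendre nodes/weights on [-1, 1].
nodes, weights = leggauss(16)
D = lagrange_diff_matrix(nodes)

# Toy 1D problem: u'' + pi^2 u = 0 has u(x) = sin(pi x) as a solution.
u = np.sin(np.pi * nodes)               # stand-in for network outputs at the nodes
residual = D @ (D @ u) + np.pi**2 * u   # PDE residual without autodiff
loss = weights @ residual**2            # cubature approximation of the squared L2 norm

print(f"derivative error: {np.max(np.abs(D @ u - np.pi * np.cos(np.pi * nodes))):.2e}")
print(f"residual loss:    {loss:.2e}")
```

In an actual training loop, `u` would be the network's outputs at the fixed nodes; since `D` and `weights` are precomputed constants, the loss reduces to a few matrix products per step, which is plausibly the source of the speed-up over differentiating through the autodiff graph that the abstract claims.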