Paper Title

Data Augmentation vs. Equivariant Networks: A Theory of Generalization on Dynamics Forecasting

Paper Authors

Rui Wang, Robin Walters, Rose Yu

Paper Abstract

Exploiting symmetry in dynamical systems is a powerful way to improve the generalization of deep learning. The model learns to be invariant to transformation and hence is more robust to distribution shift. Data augmentation and equivariant networks are two major approaches to injecting symmetry into learning. However, their exact role in improving generalization is not well understood. In this work, we derive the generalization bounds for data augmentation and equivariant networks, characterizing their effect on learning in a unified framework. Unlike most prior theories for the i.i.d. setting, we focus on non-stationary dynamics forecasting with complex temporal dependencies.
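To make the contrast between the two approaches concrete, below is a minimal NumPy sketch (not from the paper) for a toy 2D forecasting task with rotation symmetry: data augmentation injects the symmetry by transforming training trajectories, while an equivariant forecaster builds the symmetry into the model so that rotating the input rotates the prediction. The function names and the linear-extrapolation forecaster are illustrative assumptions, not the authors' method.

```python
import numpy as np

def rotate(traj, theta):
    """Rotate a 2D trajectory of shape (T, 2) by angle theta about the origin."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return traj @ R.T

def augment_batch(trajs, rng):
    """Data augmentation: apply a random rotation to each training trajectory,
    so an unconstrained model sees many symmetry-transformed copies of the data."""
    return np.stack([rotate(t, rng.uniform(0.0, 2.0 * np.pi)) for t in trajs])

def equivariant_step(traj):
    """A trivially rotation-equivariant one-step forecaster (linear extrapolation):
    because the map is linear, it commutes with rotation, i.e.
    equivariant_step(rotate(x, a)) == rotate(equivariant_step(x), a)."""
    return traj[-1] + (traj[-1] - traj[-2])

rng = np.random.default_rng(0)
trajs = rng.normal(size=(4, 10, 2))      # 4 toy trajectories of length 10 in 2D
augmented = augment_batch(trajs, rng)     # symmetry injected through the data
pred = equivariant_step(trajs[0])         # symmetry injected through the architecture

# Check the equivariance property for the architectural approach.
theta = 0.7
assert np.allclose(equivariant_step(rotate(trajs[0], theta)), rotate(pred, theta))
```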
