Paper Title

On the potential benefits of entropic regularization for smoothing Wasserstein estimators

Paper Authors

Jérémie Bigot, Paul Freulon, Boris P. Hejblum, Arthur Leclaire

Paper Abstract

This paper is focused on the study of entropic regularization in optimal transport as a smoothing method for Wasserstein estimators, through the prism of the classical tradeoff between approximation and estimation errors in statistics. Wasserstein estimators are defined as solutions of variational problems whose objective function involves the use of an optimal transport cost between probability measures. Such estimators can be regularized by replacing the optimal transport cost by its regularized version using an entropy penalty on the transport plan. The use of such a regularization has a potentially significant smoothing effect on the resulting estimators. In this work, we investigate its potential benefits on the approximation and estimation properties of regularized Wasserstein estimators. Our main contribution is to discuss how entropic regularization may reach, at a lower computational cost, statistical performances that are comparable to those of un-regularized Wasserstein estimators in statistical learning problems involving distributional data analysis. To this end, we present new theoretical results on the convergence of regularized Wasserstein estimators. We also study their numerical performances using simulated and real data in the supervised learning problem of proportions estimation in mixture models using optimal transport.
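
As a rough illustration of the regularization discussed in the abstract, the sketch below computes an entropy-regularized optimal transport cost between two discrete measures using Sinkhorn iterations; the regularized Wasserstein estimators studied in the paper minimize variational objectives built from such a cost. This is a minimal NumPy sketch, not the authors' code: the function name entropic_ot_cost, the regularization strength eps, the iteration count, and the toy measures are illustrative assumptions.

    import numpy as np

    def entropic_ot_cost(a, b, M, eps=0.05, n_iter=1000):
        """Entropy-regularized OT cost between discrete measures a and b
        with ground cost matrix M, computed via Sinkhorn iterations
        (illustrative sketch, not the paper's implementation)."""
        K = np.exp(-M / eps)              # Gibbs kernel induced by the entropy penalty
        u = np.ones_like(a)
        v = np.ones_like(b)
        for _ in range(n_iter):
            u = a / (K @ v)               # alternating marginal-scaling updates
            v = b / (K.T @ u)
        P = u[:, None] * K * v[None, :]   # regularized transport plan
        return np.sum(P * M)              # transport cost of the regularized plan

    # Toy usage: two discretized Gaussian-like measures on a grid of the unit interval.
    x = np.linspace(0.0, 1.0, 50)
    a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
    b = np.exp(-(x - 0.7) ** 2 / 0.01); b /= b.sum()
    M = (x[:, None] - x[None, :]) ** 2    # squared-distance ground cost
    print(entropic_ot_cost(a, b, M))

Larger values of eps give a smoother (more diffuse) transport plan and faster convergence of the scaling updates, which is the computational/statistical trade-off the paper investigates.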
