Paper Title
Minimax Efficient Finite-Difference Stochastic Gradient Estimators Using Black-Box Function Evaluations
Paper Authors
Paper Abstract
Standard approaches to stochastic gradient estimation, with only noisy black-box function evaluations, use the finite-difference method or its variants. While natural, to our knowledge it remains open whether their statistical accuracy is the best possible. This paper argues so by showing that central finite-difference is a nearly minimax optimal zeroth-order gradient estimator for a suitable class of objective functions and mean squared risk, among both the class of linear estimators and the much larger class of all (nonlinear) estimators.
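To make the setting concrete, the following is a minimal illustrative sketch (not the paper's exact construction) of a central finite-difference gradient estimator that uses only noisy black-box evaluations of the objective. The function `f`, the step size `h`, and the noise level are all hypothetical choices for illustration.

```python
import numpy as np

def central_fd_gradient(f, x, h=1e-3):
    """Estimate the gradient of a black-box function f at x using
    central finite differences: g_i = (f(x + h e_i) - f(x - h e_i)) / (2h).
    Only (possibly noisy) function evaluations of f are used."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

# Hypothetical noisy quadratic objective; the true gradient at x is 2x.
rng = np.random.default_rng(0)
def f(x):
    return float(x @ x) + 1e-6 * rng.standard_normal()

x0 = np.array([1.0, -2.0, 0.5])
print(central_fd_gradient(f, x0))  # close to [2., -4., 1.]
```

Central differences cancel the first-order truncation error of the one-sided scheme, which is why the estimator's bias is smaller for smooth objectives; the step size `h` trades this bias against the variance contributed by the evaluation noise.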