Paper Title
Iterative Implicit Gradients for Nonconvex Optimization with Variational Inequality Constraints
Paper Authors
Paper Abstract
We propose an implicit-gradient-based scheme for constrained optimization problems with nonconvex loss functions, which can potentially be used to analyze a variety of applications in machine learning, including meta-learning, hyperparameter optimization, and reinforcement learning. The proposed algorithm is based on the iterative differentiation (ITD) strategy. Motivated by learning under constraints, we extend the convergence and rate analysis in the current bilevel optimization literature to a constrained bilevel structure. Addressing bilevel optimization with any first-order scheme requires the gradient of the inner-level optimal solution with respect to the outer variable (the implicit gradient). In this paper, taking a possible large-scale structure into account, we propose an efficient way of obtaining the implicit gradient. We further provide error bounds with respect to the true gradients, as well as nonasymptotic rate results.
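For readers unfamiliar with the ITD strategy the abstract refers to, the following is a minimal JAX sketch of the generic idea, not the paper's method: unroll a few gradient steps on the inner problem and differentiate through the unrolled iterations to approximate the implicit gradient. The objectives `inner_loss` and `outer_loss` here are hypothetical toy quadratics, and the paper's variational inequality constraints are not modeled.

```python
import jax
import jax.numpy as jnp

# Hypothetical toy objectives for illustration only; the paper's constrained
# (variational inequality) setting is not captured by this sketch.
def inner_loss(x, y):
    return jnp.sum((y - x) ** 2) + 0.1 * jnp.sum(y ** 2)

def outer_loss(x, y):
    return jnp.sum((y - 1.0) ** 2)

def inner_solve(x, y0, steps=50, lr=0.1):
    # Unrolled gradient descent on the inner problem; differentiating through
    # this loop is the iterative differentiation (ITD) strategy.
    y = y0
    for _ in range(steps):
        y = y - lr * jax.grad(inner_loss, argnums=1)(x, y)
    return y

def hypergradient(x, y0):
    # Gradient of outer_loss(x, y*(x)) with y*(x) approximated by the unrolled
    # solver; JAX backpropagates through the inner iterations automatically,
    # yielding an approximation of the implicit gradient.
    return jax.grad(lambda x_: outer_loss(x_, inner_solve(x_, y0)))(x)

x = jnp.zeros(3)
print(hypergradient(x, jnp.zeros(3)))
```

In this unrolled approach the approximation error relative to the true implicit gradient depends on the number of inner steps, which is the kind of error the abstract's bounds with respect to the true gradients quantify.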