Paper Title

MiLeNAS: Efficient Neural Architecture Search via Mixed-Level Reformulation

Authors

Chaoyang He, Haishan Ye, Li Shen, Tong Zhang

Abstract

Many recently proposed methods for Neural Architecture Search (NAS) can be formulated as bilevel optimization. For efficient implementation, its solution requires approximations of second-order methods. In this paper, we demonstrate that gradient errors caused by such approximations lead to suboptimality, in the sense that the optimization procedure fails to converge to a (locally) optimal solution. To remedy this, this paper proposes MiLeNAS, a mixed-level reformulation for NAS that can be optimized efficiently and reliably. It is shown that even when using a simple first-order method on the mixed-level formulation, MiLeNAS can achieve a lower validation error for NAS problems. Consequently, architectures obtained by our method achieve consistently higher accuracies than those obtained from bilevel optimization. Moreover, MiLeNAS proposes a framework beyond DARTS. It is upgraded via model-size-based search and early stopping strategies to complete the search process in around 5 hours. Extensive experiments within the convolutional architecture search space validate the effectiveness of our approach.
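The mixed-level idea described in the abstract can be sketched numerically: the weights w follow the training loss alone, while the architecture variables a descend a weighted sum of training and validation gradients, so no second-order approximation is needed. The toy quadratic losses, dimensions, learning rate, and balance coefficient below are illustrative assumptions for the sketch, not the paper's actual search space or hyperparameters.

```python
import numpy as np

# Toy stand-ins for the training and validation losses, chosen so their
# gradients are easy to write by hand. These are assumptions for the sketch.
def l_tr(w, a):
    return np.sum((w - a) ** 2) + 0.5 * np.sum(a ** 2)

def l_val(w, a):
    return np.sum((w + a - 1.0) ** 2)

def grad_w_tr(w, a):
    return 2.0 * (w - a)

def grad_a_tr(w, a):
    return -2.0 * (w - a) + a

def grad_a_val(w, a):
    return 2.0 * (w + a - 1.0)

def milenas_search(steps=200, lr=0.05, lam=1.0):
    """First-order mixed-level alternation (a sketch of the update rule):
    w follows grad_w L_tr only; a follows grad_a L_tr + lam * grad_a L_val."""
    w = np.zeros(3)
    a = np.ones(3)
    for _ in range(steps):
        w = w - lr * grad_w_tr(w, a)
        a = a - lr * (grad_a_tr(w, a) + lam * grad_a_val(w, a))
    return w, a
```

Because both gradients for a are evaluated at the current w, each step costs only first-order gradient computations, which is the efficiency argument the abstract makes against the second-order approximations needed by the bilevel formulation.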
