Paper Title
Progressive Automatic Design of Search Space for One-Shot Neural Architecture Search
Paper Authors
Paper Abstract
Neural Architecture Search (NAS) has attracted growing interest. To reduce the search cost, recent work has explored weight sharing across models and made major progress in One-Shot NAS. However, it has been observed that a model with higher one-shot accuracy does not necessarily perform better when trained stand-alone. To address this issue, in this paper we propose Progressive Automatic Design of search space, named PAD-NAS. Unlike previous approaches, where all layers of the supernet share the same operation search space, we formulate a progressive search strategy based on operation pruning and build a layer-wise operation search space. In this way, PAD-NAS can automatically design the operations for each layer and achieve a trade-off between search space quality and model diversity. During the search, we also take hardware platform constraints into consideration for efficient neural network model deployment. Extensive experiments on ImageNet show that our method achieves state-of-the-art performance.
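To make the core idea concrete, the layer-wise progressive pruning described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual algorithm: the function names (`progressive_prune`, `score_fn`), the pruning ratio, and the scoring proxy are all assumptions introduced here; in the paper the scoring would come from one-shot supernet evaluation under hardware constraints.

```python
# Hypothetical sketch of PAD-NAS-style progressive operation pruning.
# Each layer starts with the full candidate-operation set; over several
# stages, low-scoring operations are pruned per layer, yielding a
# layer-wise search space instead of one shared space for all layers.

def progressive_prune(search_space, score_fn, prune_ratio=0.5, min_ops=2, stages=3):
    """Shrink each layer's candidate-operation set over several stages.

    search_space: list of lists -- candidate operation names per layer.
    score_fn: callable(layer_idx, op) -> float; assumed to be a proxy
              such as the one-shot accuracy of sub-models using `op`
              at `layer_idx` (the real scoring is defined by the paper).
    """
    for _ in range(stages):
        new_space = []
        for i, ops in enumerate(search_space):
            # Rank this layer's operations by the (assumed) score proxy.
            ranked = sorted(ops, key=lambda op: score_fn(i, op), reverse=True)
            # Keep the top fraction, but never fewer than min_ops,
            # to preserve some model diversity in every layer.
            keep = max(min_ops, int(len(ranked) * prune_ratio))
            new_space.append(ranked[:keep])
        search_space = new_space
    return search_space


if __name__ == "__main__":
    # Toy example: 4 supernet layers, 8 candidate operations each,
    # with a toy score that prefers lower-indexed operations.
    space = [[f"op{j}" for j in range(8)] for _ in range(4)]
    toy_score = lambda i, op: -int(op[2:])
    pruned = progressive_prune(space, toy_score)
    print(pruned[0])  # each layer shrinks 8 -> 4 -> 2 -> 2 candidates
```

In a real pipeline, each pruning stage would be interleaved with further supernet training, so later stages score operations within an already-narrowed, better-trained search space.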