Title
Novelty Driven Evolutionary Neural Architecture Search
Authors
Abstract
Evolutionary algorithm (EA) based neural architecture search (NAS) involves evaluating each architecture by training it from scratch, which is extremely time-consuming. This cost can be reduced by using a supernet to estimate the fitness of an architecture, exploiting weight sharing among all architectures in the search space. However, the estimated fitness is very noisy due to the co-adaptation of the operations in the supernet, which causes NAS methods to get trapped in local optima. In this paper, we propose a method called NEvoNAS wherein the NAS problem is posed as a multi-objective problem with two objectives: (i) maximize architecture novelty, and (ii) maximize architecture fitness/accuracy. Novelty search is used to maintain a diverse set of solutions at each generation, which helps avoid local-optimum traps while the architecture fitness is calculated using the supernet. NSGA-II is used to find the \textit{Pareto optimal front} for the NAS problem, and the best architecture in the Pareto front is returned as the searched architecture. Experimentally, NEvoNAS gives better results on 2 different search spaces while using significantly fewer computational resources than previous EA-based methods. The code for our paper can be found at https://github.com/nightstorm0909/NEvoNAS.
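The two-objective selection described in the abstract can be sketched in a few lines. This is an illustrative simplification, not the paper's implementation: architectures are assumed to be encoded as numeric vectors, novelty is taken as the mean distance to the k nearest archive members (a common novelty-search definition), and only the non-dominated filtering step of NSGA-II is shown (a full run would also use non-dominated sorting ranks and crowding distance).

```python
import math

def novelty(candidate, archive, k=3):
    """Novelty score: mean Euclidean distance from the candidate's
    encoding to its k nearest neighbors in the archive of
    previously seen architecture encodings."""
    dists = sorted(math.dist(candidate, other) for other in archive)
    m = min(k, len(dists))
    return sum(dists[:m]) / m

def pareto_front(population):
    """Return the non-dominated subset of (novelty, fitness) pairs,
    where both objectives are maximized."""
    front = []
    for p in population:
        # p is dominated if some distinct q is at least as good
        # in both objectives.
        dominated = any(
            q != p and q[0] >= p[0] and q[1] >= p[1]
            for q in population
        )
        if not dominated:
            front.append(p)
    return front
```

For example, with `population = [(0.9, 0.5), (0.5, 0.9), (0.7, 0.7), (0.4, 0.4)]`, the point `(0.4, 0.4)` is dominated by `(0.7, 0.7)` and is excluded, while the other three are mutually non-dominated and form the Pareto front from which the best architecture would be picked.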