Paper Title

AutoML-Zero: Evolving Machine Learning Algorithms From Scratch

Authors

Esteban Real, Chen Liang, David R. So, Quoc V. Le

Abstract

Machine learning research has advanced in multiple aspects, including model structures and learning methods. The effort to automate such research, known as AutoML, has also made significant progress. However, this progress has largely focused on the architecture of neural networks, where it has relied on sophisticated expert-designed layers as building blocks---or similarly restrictive search spaces. Our goal is to show that AutoML can go further: it is possible today to automatically discover complete machine learning algorithms just using basic mathematical operations as building blocks. We demonstrate this by introducing a novel framework that significantly reduces human bias through a generic search space. Despite the vastness of this space, evolutionary search can still discover two-layer neural networks trained by backpropagation. These simple neural networks can then be surpassed by evolving directly on tasks of interest, e.g. CIFAR-10 variants, where modern techniques emerge in the top algorithms, such as bilinear interactions, normalized gradients, and weight averaging. Moreover, evolution adapts algorithms to different task types: e.g., dropout-like techniques appear when little data is available. We believe these preliminary successes in discovering machine learning algorithms from scratch indicate a promising new direction for the field.
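
The abstract's central idea is a search over complete learning programs built only from basic mathematical operations, driven by an evolutionary loop. The sketch below is only an illustration of that style of search, not the authors' implementation: the operation set, register memory, fixed program length, single mutation type, and toy regression target are all assumptions chosen to keep the example self-contained, whereas the actual framework evolves separate setup/predict/learn functions and evaluates them on real tasks such as the CIFAR-10 variants mentioned above.

```python
import random

# Illustrative only: programs are flat lists of basic math instructions acting
# on a small scalar memory, and a regularized-evolution loop (tournament
# selection, mutate the winner, retire the oldest individual) searches for a
# program that fits a toy regression target.

OPS = {
    "add": lambda a, b: a + b,
    "sub": lambda a, b: a - b,
    "mul": lambda a, b: a * b,
}
NUM_REGS = 4      # size of the scalar memory each program works with
PROG_LEN = 8      # fixed program length, for simplicity
POP_SIZE = 50
TOURNAMENT = 10
CYCLES = 2000

def random_instruction():
    """One instruction: write op(reg[a], reg[b]) into reg[out]."""
    op = random.choice(list(OPS))
    return (op, random.randrange(NUM_REGS), random.randrange(NUM_REGS),
            random.randrange(NUM_REGS))

def random_program():
    return [random_instruction() for _ in range(PROG_LEN)]

def run(program, x):
    """Execute the program: register 0 holds the input, register 1 is the prediction."""
    regs = [0.0] * NUM_REGS
    regs[0] = x
    for op, a, b, out in program:
        regs[out] = OPS[op](regs[a], regs[b])
    return regs[1]

def fitness(program):
    """Negative squared error on a toy target, y = 2x + x^2 (an assumption for illustration)."""
    xs = [i / 10.0 for i in range(-10, 11)]
    return -sum((run(program, x) - (2 * x + x * x)) ** 2 for x in xs)

def mutate(program):
    """Replace one randomly chosen instruction with a fresh random one."""
    child = list(program)
    child[random.randrange(len(child))] = random_instruction()
    return child

def regularized_evolution():
    population = [random_program() for _ in range(POP_SIZE)]
    scores = [fitness(p) for p in population]
    for _ in range(CYCLES):
        # Tournament selection: copy and mutate the best of a random sample.
        idxs = random.sample(range(POP_SIZE), TOURNAMENT)
        parent = max(idxs, key=lambda i: scores[i])
        child = mutate(population[parent])
        # Regularized evolution: the oldest individual is removed.
        population.pop(0)
        scores.pop(0)
        population.append(child)
        scores.append(fitness(child))
    best = max(range(POP_SIZE), key=lambda i: scores[i])
    return population[best], scores[best]

if __name__ == "__main__":
    program, score = regularized_evolution()
    print("best score:", score)
    for instruction in program:
        print(instruction)
```

Even in this stripped-down form, the loop shows why a generic instruction-level search space reduces human bias: nothing in the search procedure encodes layers, gradients, or any other expert-designed building block; such structure can only appear if the evolved instruction sequences themselves construct it.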
