Faster Discovery of Neural Architectures by Searching for Paths in a Large Model


We propose an approach for automatic model design that is significantly faster and less expensive than previous methods. The method, which we name Efficient Neural Architecture Search (ENAS), learns to discover neural architectures by searching for an optimal path within a larger predetermined model. ENAS generates a discrete mask at each layer and is trained by a policy gradient method to improve training accuracy. In our experiments, ENAS achieves comparable test accuracies while being 10x faster and requiring 100x fewer resources than NAS. On the CIFAR-10 dataset, ENAS designs novel architectures that achieve a test error of 3.86%, compared to 3.41% by standard NAS (Zoph et al., 2017). On the Penn Treebank dataset, ENAS also discovers a novel architecture, which achieves a test perplexity of 64.6, compared to 62.4 by standard NAS.
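To make the mechanism concrete, the following is a minimal sketch of the search loop described above: a controller holds one logit vector per layer of the large model, samples a discrete branch choice (mask) at each layer, and is updated with a REINFORCE-style policy gradient. Everything here is illustrative, not the authors' implementation: the layer and branch counts, the learning rate, the fixed baseline, and the `reward` stub (which stands in for the sampled sub-model's training accuracy) are all assumptions.

```python
import math
import random

random.seed(0)

NUM_LAYERS = 3    # layers in the large predetermined model (assumed)
NUM_BRANCHES = 4  # candidate operations per layer (assumed)
LR = 0.1          # controller learning rate (assumed)

# Controller parameters: one logit vector per layer; a softmax over the
# logits gives the sampling distribution for that layer's discrete mask.
logits = [[0.0] * NUM_BRANCHES for _ in range(NUM_LAYERS)]

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def sample_path():
    """Sample one branch index (discrete mask) per layer."""
    path = []
    for layer_logits in logits:
        probs = softmax(layer_logits)
        r, acc = random.random(), 0.0
        choice = NUM_BRANCHES - 1
        for i, p in enumerate(probs):
            acc += p
            if r <= acc:
                choice = i
                break
        path.append(choice)
    return path

def reward(path):
    """Stand-in for the sampled sub-model's training accuracy.
    Assumed for illustration: branch 0 is best at every layer."""
    return sum(1.0 for b in path if b == 0) / NUM_LAYERS

def reinforce_step(baseline=0.5):
    """One policy-gradient update: raise the log-probability of paths
    whose reward exceeds the (fixed, assumed) baseline."""
    path = sample_path()
    advantage = reward(path) - baseline
    for layer, chosen in enumerate(path):
        probs = softmax(logits[layer])
        for i in range(NUM_BRANCHES):
            # d log pi(chosen) / d logit_i for a softmax policy
            grad = (1.0 if i == chosen else 0.0) - probs[i]
            logits[layer][i] += LR * advantage * grad

for _ in range(2000):
    reinforce_step()

# After training, the controller typically concentrates its probability
# mass on branch 0 at each layer (the branch the stub reward favors).
best_path = [max(range(NUM_BRANCHES), key=lambda i: lay[i]) for lay in logits]
print(best_path)
```

In the actual method the reward comes from evaluating the sampled path inside the shared large model, so no candidate architecture is ever trained from scratch; the toy reward above merely keeps the sketch runnable.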