Bayesian Meta-network Architecture Learning

Albert Shaw
Weiyang Liu
Le Song
NIPS 2018 Workshop on Bayesian Deep Learning (2018)

Abstract

For deep neural networks, the particular structure often plays a vital role in achieving state-of-the-art performance in many practical applications. Much recent work has focused on designing novel structures for neural networks. However, due to the combinatorial nature of the design space, hand-designing architectures is expensive and potentially sub-optimal. Developing techniques to automatically search this space has become the focus of many recent efforts, and methods such as genetic and reinforcement-learning-based algorithms have been quite successful in achieving state-of-the-art performance on several tasks. However, existing methods search the neural network structure in a strongly task-dependent way, so the architecture search must be repeated for each new task. In this paper, we first propose a Bayesian view of differentiable architecture search, by which we can easily generalize structure search to the few-shot meta-learning setting. Following the optimization embedding technique (Dai et al., 2018) for variational inference, we propose an efficient method for meta-network architecture search. We test the algorithm on the few-shot learning benchmark, demonstrating the superiority of the proposed algorithm.
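To make the Bayesian view of differentiable architecture search concrete, the sketch below shows a minimal DARTS-style mixed operation whose architecture logits are drawn from a learned Gaussian posterior via the reparameterization trick, with a KL term that would enter an ELBO-style objective. This is an illustrative assumption, not the paper's actual method: the candidate operation set, the Gaussian posterior family, and the KL weight are all hypothetical choices made for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianMixedOp(nn.Module):
    """DARTS-style mixed operation with architecture logits sampled from
    a learned Gaussian posterior q(alpha) = N(mu, sigma^2) (illustrative sketch)."""
    def __init__(self, channels):
        super().__init__()
        # Hypothetical candidate operations; the paper's search space is not
        # specified in the abstract.
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.AvgPool2d(3, stride=1, padding=1),
            nn.Identity(),
        ])
        # Variational parameters of the architecture posterior.
        self.alpha_mu = nn.Parameter(torch.zeros(len(self.ops)))
        self.alpha_log_sigma = nn.Parameter(torch.full((len(self.ops),), -3.0))

    def forward(self, x):
        # Sample alpha ~ q(alpha) by reparameterization, then mix operations.
        eps = torch.randn_like(self.alpha_mu)
        alpha = self.alpha_mu + eps * self.alpha_log_sigma.exp()
        weights = F.softmax(alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

    def kl_to_standard_normal(self):
        # KL(q(alpha) || N(0, I)); acts as the architecture prior term in an ELBO.
        sigma2 = (2 * self.alpha_log_sigma).exp()
        return 0.5 * (sigma2 + self.alpha_mu ** 2 - 1 - 2 * self.alpha_log_sigma).sum()

# Usage sketch: a task loss plus the KL penalty on the architecture posterior.
op = BayesianMixedOp(channels=8)
x = torch.randn(2, 8, 16, 16)
loss = op(x).pow(2).mean() + 1e-3 * op.kl_to_standard_normal()
loss.backward()
```

In a meta-learning setting, the per-task posterior over architecture parameters would be adapted from a shared meta-level prior; the sketch only shows the single-task building block.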