About

Charles Weill is a research engineer at Google. Prior to joining Google Research, Charles received his master's in computer science from Cornell University. His current research focuses on developing open-source tools for automated machine learning (AutoML), specifically neural architecture search for structured data, image recognition, and natural language processing.

Charles currently works on AdaNet, a member of the AutoML family that adaptively learns both the architecture and the weights of a deep neural network, and that benefits from strong learning guarantees (Cortes et al., ICML 2017). The algorithm leverages recent deep boosting theory (Cortes et al., ICML 2014) and is closely related to ensemble methods.
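As a rough sketch of the kind of objective this line of work optimizes (the notation below is a paraphrase, not a quotation from the papers): mixture weights $w_j$ over candidate subnetworks $h_j$ are chosen by trading empirical loss against a per-subnetwork complexity penalty,

$$
F(\mathbf{w}) \;=\; \frac{1}{m}\sum_{i=1}^{m}\Phi\!\Big(1 - y_i \sum_{j} w_j\, h_j(x_i)\Big) \;+\; \sum_{j}\big(\lambda\, r_j + \beta\big)\,\lvert w_j\rvert,
$$

where $\Phi$ is a convex surrogate loss, $r_j$ measures the complexity (e.g., the Rademacher complexity) of the family containing $h_j$, and $\lambda, \beta \ge 0$ control how strongly complex subnetworks are penalized. This is what gives the procedure its learning guarantees: subnetworks only enter the ensemble if their contribution outweighs their complexity cost.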

The key observation is that an ensemble of neural networks whose outputs are linearly combined is also a neural network. AdaNet seeks to answer the question: can we adaptively learn a neural architecture as an ensemble of subnetworks with fewer total trainable parameters and better performance than any single neural network trained end-to-end?
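To make that observation concrete, here is a minimal, hypothetical TensorFlow sketch (the names `subnetwork`, `small`, and `large` are illustrative only, not part of AdaNet's API): linearly combining the logits of two subnetworks through a trainable weight layer yields a model that is itself just one larger neural network.

```python
import tensorflow as tf

def subnetwork(units, name):
    # Each candidate subnetwork is an ordinary feed-forward model.
    return tf.keras.Sequential(
        [
            tf.keras.layers.Dense(units, activation="relu"),
            tf.keras.layers.Dense(1),  # one logit per subnetwork
        ],
        name=name,
    )

inputs = tf.keras.Input(shape=(10,))
# Two subnetworks of different capacities, evaluated on the same input.
logits = [subnetwork(16, "small")(inputs), subnetwork(64, "large")(inputs)]
# A trainable Dense(1) over the stacked logits learns the mixture
# weights w_j; the resulting ensemble is itself a single neural network.
stacked = tf.keras.layers.concatenate(logits)
ensemble_logit = tf.keras.layers.Dense(1, use_bias=False)(stacked)
ensemble = tf.keras.Model(inputs, ensemble_logit)
ensemble.summary()
```

In AdaNet, the candidate subnetworks are proposed adaptively rather than fixed up front, and the mixture weights are learned subject to the complexity-penalized objective sketched above.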

While the original paper focused on adaptively learning fully-connected neural networks for binary classification tasks, AdaNet is a flexible algorithm and framework that can be extended to regression and multiclass classification across many domains. An open-source TensorFlow implementation for training AdaNet models on large-scale datasets is currently in the works.