Efficient Neural Architecture Search

A tutorial summarizing recent progress in Neural Architecture Search,
presented at the 35th AAAI Conference on Artificial Intelligence (AAAI 2021).

With the advances and success of deep learning in solving complex AI problems, the natural next step is to build systems that automate the decisions required to set up a deep learning pipeline. Tackling this automation is not only crucial for speeding up the deployment of deep learning models, but also helps extend these models to other challenging scenarios, such as modelling with limited data or under resource constraints. This tutorial provides a comprehensive overview of the approaches to this problem based on neural architecture search. It is also the first tutorial with a strong focus on transfer and meta-learning, going beyond classic neural architecture search.

The tutorial is geared toward graduate students, AI researchers, and practitioners who are interested in automating parts of their deep learning pipelines, want to learn the principles of automated machine learning and deep learning, and want to apply those principles to make their own work more effective and less arduous.

The prerequisite knowledge assumed of the audience includes a basic understanding of deep learning, optimization, and machine learning concepts. Familiarity with some state-of-the-art convolutional neural network architectures can facilitate understanding but is not required.

Tutorial Material

Slides: Download


Tejaswini Pedapati

Tejaswini Pedapati works at IBM Research. Her research focuses on interpretability and automating deep learning. To that end, she has been involved in developing tools and algorithms that provide these capabilities for IBM products. She holds a master's degree from Columbia University.

Martin Wistuba

Martin Wistuba is a researcher at IBM Research, where he develops tools to automate deep learning. He received his Ph.D. in Machine Learning from the University of Hildesheim. His research interests include AutoML, in particular the idea of meta-knowledge transfer to speed up Bayesian optimization and Neural Architecture Search.


@misc{pedapati2021efficient,
author = {Tejaswini Pedapati and Martin Wistuba},
title = {Efficient Neural Architecture Search},
howpublished = {Tutorial at AAAI 2021},
year = {2021},
url = {https://neural-architecture-search.github.io/tutorial-aaai-2021}
}