Neural Architecture Search (NAS) automates the design of neural network architectures. NAS approaches optimize the topology of a network, including how nodes are connected and which operators are chosen. User-defined optimization metrics such as accuracy, model size, or inference time can thereby guide the search towards an architecture that is optimal for a specific application. Due to the extremely large search space, traditional evolution- or reinforcement-learning-based AutoML algorithms tend to be computationally expensive. Hence, recent research on the topic has focused on more efficient approaches to NAS. In particular, recently developed gradient-based and multi-fidelity methods have provided a promising path and boosted research in these directions. Our group has been very active in developing state-of-the-art NAS methods and has been at the forefront of driving NAS research forward. Below we give a summary of a few important recent works released by our group:
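To make the search-space view of NAS concrete, here is a minimal, purely illustrative sketch of the simplest possible strategy: random search over a small cell-based space of operators. All names (`OPS`, `sample_architecture`, `evaluate`, the number of edges) are hypothetical placeholders rather than an interface from any of the papers listed below; the methods below replace this naive sampling with far more efficient gradient-based or multi-fidelity strategies.

```python
# Hypothetical sketch: NAS framed as search over a discrete space of operators.
# All names here are illustrative placeholders, not code from our papers.
import random

OPS = ["conv3x3", "conv5x5", "max_pool", "skip_connect"]  # candidate operators per edge
NUM_EDGES = 6  # edges in a small cell whose operators we choose


def sample_architecture():
    """Randomly pick one operator for every edge of the cell."""
    return [random.choice(OPS) for _ in range(NUM_EDGES)]


def evaluate(architecture):
    """Placeholder for training and validating the sampled architecture.
    In practice this would return e.g. validation accuracy, model size, or latency."""
    return random.random()  # stand-in score, for illustration only


def random_search(n_trials=20):
    """Baseline NAS strategy: sample architectures and keep the best-scoring one."""
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture()
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score


if __name__ == "__main__":
    arch, score = random_search()
    print("Best architecture:", arch, "score:", round(score, 3))
```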
Selected NAS Papers
Literature Overview
NAS is one of the most rapidly growing subfields of AutoML, and the number of published papers is increasing quickly. To give a comprehensive overview of recent trends, we provide the following resources:
- NAS survey paper [JMLR 2020]
- A book chapter on NAS from our open-access book, “AutoML: Methods, Systems, Challenges”
- A continuously updated page with a comprehensive NAS literature overview
- A GitHub repository tracking recent work at the intersection of NAS and Transformers: awesome-transformer-search
One-Shot NAS Methods
- Understanding and Robustifying Differentiable Architecture Search [ICLR 2020, Oral]
Meta Learning of Neural Architectures
Neural Ensemble Search
- Neural Ensemble Search for Uncertainty Estimation and Dataset Shift [NeurIPS 2021]
- Multi-headed Neural Ensemble Search [ICML 2021, UDL Workshop]
Joint NAS and Hyperparameter Optimization
- Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search [ICML 2018, AutoML Workshop]
- Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization [ICML 2021, AutoML Workshop]
Multi-Objective NAS
- LEMONADE: Efficient Multi-objective Neural Architecture Search via Lamarckian Evolution [ICLR 2019]
- Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization [ICML 2021, AutoML Workshop]
Application-Specific NAS
Large-scale study of NAS methods
- How Powerful are Performance Predictors in Neural Architecture Search? [NeurIPS 2021]
- NAS-Bench-Suite: NAS Evaluation is (Now) Surprisingly Easy [ICLR 2022]
- NAS-Bench-Suite-Zero: Accelerating Research on Zero Cost Proxies [NeurIPS 2022, Datasets & Benchmarks Track]
Our Blogs
Please also check out our blog posts on related work: