Development of Neural Architecture Search

Google launched its AutoML project last year, in an effort to automate the search for the most appropriate neural network design for a particular classification problem. Designing neural networks has been time-consuming, despite the availability of TensorFlow, Keras, and other deep learning frameworks. The Google Brain team therefore devised Neural Architecture Search (NAS), which uses a recurrent neural network controller, trained with reinforcement learning, to find neural networks for image classifiers. (See their blog entry.)
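The core loop is easier to see in miniature. Below is a toy sketch of the idea, not Google's actual system: a controller keeps a softmax policy over a tiny, made-up search space (filter count and kernel size are my illustrative assumptions), samples architectures, scores each with a stand-in "validation accuracy," and updates the policy with REINFORCE.

```python
import math
import random

# Illustrative, assumed search space -- not the one used in the NAS paper.
SEARCH_SPACE = {
    "filters": [16, 32, 64],
    "kernel": [3, 5],
}

def softmax(logits):
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def evaluate(arch):
    # Stand-in for training a child network and measuring validation
    # accuracy; here we simply pretend more filters and kernel 3 are best.
    return (0.5 + 0.1 * SEARCH_SPACE["filters"].index(arch["filters"])
                + (0.1 if arch["kernel"] == 3 else 0.0))

def search(steps=500, lr=0.1, seed=0):
    rng = random.Random(seed)
    # One logit vector per decision the controller makes.
    logits = {k: [0.0] * len(v) for k, v in SEARCH_SPACE.items()}
    baseline = 0.0
    for _ in range(steps):
        # Sample an architecture from the current policy.
        choice = {}
        for k, opts in SEARCH_SPACE.items():
            probs = softmax(logits[k])
            choice[k] = rng.choices(range(len(opts)), weights=probs)[0]
        reward = evaluate({k: SEARCH_SPACE[k][i] for k, i in choice.items()})
        baseline = 0.9 * baseline + 0.1 * reward  # moving-average baseline
        adv = reward - baseline
        # REINFORCE: raise the log-probability of the sampled choices
        # in proportion to the advantage.
        for k, idx in choice.items():
            probs = softmax(logits[k])
            for j in range(len(probs)):
                grad = (1.0 if j == idx else 0.0) - probs[j]
                logits[k][j] += lr * adv * grad
    # Return the most likely architecture under the learned policy.
    return {k: SEARCH_SPACE[k][max(range(len(v)), key=lambda j: v[j])]
            for k, v in logits.items()}

best = search()
```

In the real system the controller is an RNN that emits one decision per layer and the reward is the trained child network's validation accuracy, which is exactly why each trial is so expensive.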

With its state-of-the-art hardware, Google can afford to run such an experiment on the CIFAR-10 dataset using 450 GPUs for three to four days. But this makes the approach inaccessible to small companies or individual researchers.

Then came an improvement to NAS: Efficient Neural Architecture Search via Parameter Sharing (ENAS), a much more efficient method that restricts the search to subgraphs of one large computational graph, so that all candidate architectures share parameters instead of being trained from scratch. This greatly reduces the number of GPUs needed.
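The parameter-sharing trick can be sketched in a few lines. This is a minimal illustration under my own assumptions (the ops and pool layout are made up, and no actual training happens): every sampled subgraph resolves its weights through one shared pool keyed by (layer, op), so two architectures that use the same op at the same layer reuse the very same weights.

```python
import random

class SharedPool:
    """One pool of weights for the whole supergraph (toy illustration)."""
    def __init__(self, rng):
        self.weights = {}  # (layer, op) -> weight vector
        self.rng = rng

    def get(self, layer, op):
        # Lazily create, then always return, the one shared weight vector
        # for this (layer, op) position in the supergraph.
        key = (layer, op)
        if key not in self.weights:
            self.weights[key] = [self.rng.gauss(0, 0.1) for _ in range(4)]
        return self.weights[key]

# Assumed, illustrative op choices per layer.
OPS = ["conv3x3", "conv5x5", "maxpool"]

def sample_architecture(rng, depth=3):
    # A candidate architecture is just one op choice per layer,
    # i.e. a subgraph of the full (depth x len(OPS)) supergraph.
    return [rng.choice(OPS) for _ in range(depth)]

rng = random.Random(42)
pool = SharedPool(rng)

arch_a = sample_architecture(rng)
arch_b = sample_architecture(rng)

# Both subgraphs pull their layer weights from the shared pool, so any
# (layer, op) they have in common resolves to the *same* object --
# shared, not copied.
params_a = [pool.get(i, op) for i, op in enumerate(arch_a)]
params_b = [pool.get(i, op) for i, op in enumerate(arch_b)]
for i, (oa, ob) in enumerate(zip(arch_a, arch_b)):
    if oa == ob:
        assert params_a[i] is params_b[i]
```

Because evaluating a new candidate reuses already-trained shared weights rather than retraining a child network from scratch, ENAS needs a small fraction of the compute that the original NAS did.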

While I do not think it is a threat to machine learning engineers, it is a great algorithm to note. It looks to me like a brute-force search, and it still takes scientists and engineers to draw insights from the results. I believe further development of the theory behind neural networks is much needed.

  • “Using Machine Learning to Explore Neural Network Architecture,” Google Research Blog, 2017. [Google]
  • Barret Zoph, Quoc V. Le, “Neural Architecture Search with Reinforcement Learning,” arXiv:1611.01578 (2016). [arXiv]
  • “AutoML for large scale image classification and object detection,” Google Research Blog, 2017. [Google]
  • Esteban Real, Sherry Moore, Andrew Selle, Saurabh Saxena, Yutaka Leon Suematsu, Jie Tan, Quoc Le, Alex Kurakin, “Large-Scale Evolution of Image Classifiers,” arXiv:1703.01041 (2017). [arXiv]
  • Hieu Pham, Melody Y. Guan, Barret Zoph, Quoc V. Le, Jeff Dean, “Efficient Neural Architecture Search via Parameter Sharing,” arXiv:1802.03268 (2018). [arXiv]
  • tobe, “ENAS: Designing Neural Network Models More Efficiently (AutoML),” TensorFlow column, Zhihu (2018). [Zhihu]
  • “Neural Architecture Search.” [Wikipedia]
  • “How realistic is AutoML (Google’s attempts to build neural networks without human intervention)? Is this a real threat for machine learning engineers?” [Quora]
