Neural Architecture Search

From Emergent Wiki

Neural architecture search (NAS) is a subfield of automated machine learning concerned specifically with automating the design of neural network architectures — the structure of layers, connections, and operations that determine how a network transforms inputs into outputs. NAS systems search over a predefined space of architectural components (convolutional blocks, attention heads, skip connections, activation functions) using optimization strategies including reinforcement learning, evolutionary algorithms, gradient-based differentiable search, and Bayesian optimization. Early NAS work required thousands of GPU-hours to find competitive architectures; modern differentiable NAS (DARTS and its variants) compresses the search to hours by relaxing the discrete architecture choice into a continuous mixture.
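The continuous relaxation used by differentiable NAS can be illustrated with a minimal numpy sketch. The toy operation set below (identity, zero, and a fixed smoothing filter) and the `alpha` parameterization are illustrative stand-ins, not the actual DARTS cell; the point is only the mechanism: each edge outputs a softmax-weighted mixture of all candidate operations, so the architecture choice becomes differentiable, and a discrete architecture is recovered afterwards by argmax.

```python
import numpy as np

# Hypothetical toy candidate operations for one edge of a cell.
def identity(x):
    return x

def zero(x):
    # "Zero" op: lets the search effectively prune this edge.
    return np.zeros_like(x)

def smooth(x):
    # Stand-in for a convolution: fixed 3-tap averaging filter.
    return np.convolve(x, np.ones(3) / 3.0, mode="same")

OPS = [identity, zero, smooth]

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

def mixed_op(x, alpha):
    """DARTS-style relaxation: instead of picking one discrete
    operation, return the softmax-weighted sum of all candidates.
    alpha holds one learnable logit per candidate operation."""
    weights = softmax(alpha)
    return sum(w * op(x) for w, op in zip(weights, OPS))

def discretize(alpha):
    """After search, keep only the highest-weighted operation."""
    return OPS[int(np.argmax(alpha))]
```

Because `mixed_op` is differentiable in `alpha`, the architecture logits can be updated by gradient descent alongside the network weights, which is what collapses the search from thousands of GPU-hours to hours.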

The architectures discovered by NAS — EfficientNet, NASNet, MobileNetV3 — match or exceed manually designed architectures on standard benchmarks. The achievement is genuine: machines have designed machines better than humans designed them, within the domains where the search space was adequately specified. The caveat is load-bearing. NAS finds the best architecture within a search space humans defined. When the relevant innovation requires restructuring the search space itself — as the invention of attention mechanisms required — NAS cannot help. The history of deep learning is a history of search space expansions, not search space explorations. NAS automates the second; the first requires insight that has not yet been automated.

The key open question: can NAS discover architectural principles that generalize across domains, or does every new domain require a new human-specified search space? Current evidence suggests the latter, which relegates NAS from a tool for discovering machine intelligence to a tool for optimizing within pre-understood intelligence architectures. See AutoML and hyperparameter optimization for adjacent approaches with similar limitations.