Neural Architecture Search (NAS) is a method for automatically designing neural network architectures instead of hand-crafting them. NAS explores a defined search space of possible model architectures and uses a search strategy—such as reinforcement learning, evolutionary algorithms, or gradient-based methods—to find high-performing designs.
The goal is to discover models that balance accuracy, efficiency, and resource constraints (e.g., latency, memory, FLOPs). Search spaces can include layer types, connections, kernel sizes, and depth. Popular NAS approaches include ENAS (Efficient NAS), DARTS (Differentiable Architecture Search), and FBNet.
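As a concrete illustration of a search space and a (very simple) search strategy, here is a minimal random-search sketch. The space, the candidate encoding, and the `proxy_score` function are all hypothetical stand-ins: a real NAS run would train each candidate (or a shared supernet) and score it on validation accuracy plus resource constraints.

```python
import random

# Hypothetical toy search space: layer types, widths, and network depth.
SEARCH_SPACE = {
    "layer_type": ["conv3x3", "conv5x5", "depthwise", "identity"],
    "width": [16, 32, 64],
    "depth": [4, 8, 12],
}

def sample_architecture(rng):
    """Sample one candidate: a depth plus a (type, width) choice per layer."""
    depth = rng.choice(SEARCH_SPACE["depth"])
    return {
        "depth": depth,
        "layers": [
            (rng.choice(SEARCH_SPACE["layer_type"]),
             rng.choice(SEARCH_SPACE["width"]))
            for _ in range(depth)
        ],
    }

def proxy_score(arch):
    """Stand-in for training + validation; a real NAS run would train here.
    This mock just assigns fixed per-operation scores for demonstration."""
    score = 0.0
    for layer_type, width in arch["layers"]:
        score += {"conv3x3": 1.0, "conv5x5": 0.8,
                  "depthwise": 0.9, "identity": 0.2}[layer_type]
        score += width / 640.0
    return score / arch["depth"]  # normalize so depth alone isn't rewarded

def random_search(n_trials=50, seed=0):
    """Sample n_trials candidates and return the best-scoring one."""
    rng = random.Random(seed)
    candidates = [sample_architecture(rng) for _ in range(n_trials)]
    return max(candidates, key=proxy_score)

best = random_search()
print(best["depth"], round(proxy_score(best), 3))
```

Random search is a surprisingly strong baseline; methods like ENAS or DARTS replace the independent sampling and per-candidate training with weight sharing or a differentiable relaxation of the same kind of search space.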
NAS can be computationally expensive, though techniques like weight sharing, early stopping, and proxy tasks help reduce costs. Once trained, the discovered architectures can outperform manually designed models, especially on edge or mobile devices.
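One common cost-reduction pattern combines early stopping with proxy evaluation: give every candidate a small training budget first, discard the weaker half, and spend larger budgets only on survivors (successive halving). The sketch below is a toy version; the candidates and the noisy `evaluate` function are hypothetical placeholders for real architectures and partial training runs.

```python
import random

def successive_halving(candidates, budget_schedule, evaluate, keep_frac=0.5):
    """Allocate small budgets first, discard the weaker fraction at each rung.
    `evaluate(cand, budget)` stands in for partial training; in real NAS it
    would train `cand` for `budget` epochs and return validation accuracy."""
    pool = list(candidates)
    for budget in budget_schedule:
        scored = sorted(pool, key=lambda c: evaluate(c, budget), reverse=True)
        keep = max(1, int(len(scored) * keep_frac))
        pool = scored[:keep]
    return pool[0]

# Toy demo: candidates are numbers; "accuracy" is noisy closeness to 0.7,
# and larger budgets give less noisy estimates (like longer training).
rng = random.Random(0)
cands = [rng.random() for _ in range(16)]

def evaluate(c, budget):
    noise = rng.gauss(0, 1.0 / budget)  # more budget -> lower-variance score
    return -abs(c - 0.7) + noise

winner = successive_halving(cands, budget_schedule=[1, 4, 16], evaluate=evaluate)
print(round(winner, 3))
```

The same idea underlies Hyperband-style NAS schedules; weight sharing reduces cost differently, by amortizing training across all candidates in one supernet.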
NAS is commonly applied in image classification, object detection, and segmentation but is expanding to NLP and multimodal tasks.