ANAS: Asymptotic NAS for Large-Scale Proxyless Search and Multi-Task Transfer Learning

Publication
Pattern Recognition, 2023, DOI: 10.1016/j.patcog.2023.109821

Abstract: Neural Architecture Search (NAS) is an effective way to design lightweight networks that trade off accuracy against speed, freeing researchers from tedious manual trials. Its main shortcoming, however, is the high and unstable memory consumption of the search, especially for large-scale tasks. This study proposes the Asymptotic Neural Architecture Search network (ANAS), which performs a proxyless search for large-scale tasks with economical and stable memory consumption. Instead of searching on a proxy task as other NAS algorithms do, ANAS directly learns the network architecture on the target task. ANAS reduces the peak memory consumption through an asymptotic method and keeps memory consumption stable via the linked adjustment of a series of key indexes; a pruning operation and efficient candidate operations further decrease the total memory consumption. ANAS achieves a good trade-off between accuracy and speed on the CIFAR-10, CIFAR-100, and ImageNet classification tasks. Beyond classification, it shows strong multi-task transfer learning ability on the CamVid and Cityscapes segmentation tasks. ANAS reaches a 22.8% test error with 5M parameters on ImageNet, and 72.9 mIoU (mean Intersection over Union) at 119.9 FPS (Frames Per Second) on Cityscapes.
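To illustrate the general idea of shrinking memory during a differentiable search by progressively pruning candidate operations, here is a minimal, hypothetical PyTorch sketch. The class and method names (MixedOp, prune_weakest) and the small candidate set are illustrative assumptions only; they are not the authors' ANAS implementation.

```python
# Minimal sketch (assumed, not the ANAS code): a DARTS-style mixed operation
# whose weakest candidates are pruned stage by stage, so the number of
# activations that must be kept in memory shrinks as the search proceeds.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixedOp(nn.Module):
    """A weighted mixture of candidate operations on one edge."""

    def __init__(self, channels):
        super().__init__()
        # Lightweight candidate operations; real search spaces are larger.
        self.ops = nn.ModuleList([
            nn.Identity(),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.Conv2d(channels, channels, 3, padding=2, dilation=2, bias=False),
            nn.AvgPool2d(3, stride=1, padding=1),
        ])
        # One architecture weight (alpha) per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))
        self.active = list(range(len(self.ops)))  # candidates still in the search

    def forward(self, x):
        w = F.softmax(self.alpha[self.active], dim=0)
        return sum(w[i] * self.ops[j](x) for i, j in enumerate(self.active))

    def prune_weakest(self):
        """Drop the lowest-weight candidate; fewer ops means less memory kept."""
        if len(self.active) > 1:
            w = F.softmax(self.alpha[self.active], dim=0)
            self.active.pop(int(w.argmin()))


# Usage: prune one candidate per stage, so peak memory decreases over the
# search instead of staying at its initial maximum.
op = MixedOp(channels=16)
x = torch.randn(2, 16, 32, 32)
for stage in range(3):
    y = op(x)           # a search/training step would go here
    op.prune_weakest()  # asymptotically narrow the candidate set
```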

Related