FOX-NAS: Fast, On-device and Explainable Neural Architecture Search

Chia Hsiang Liu, Yu Shin Han, Yuan Yao Sung, Yi Lee, Hung Yueh Chiang, Kai Chiang Wu

Research output: Conference contribution › peer-reviewed

3 citations (Scopus)

Abstract

Neural architecture search can discover neural networks with good performance, and One-Shot approaches are prevalent. One-Shot approaches typically require a supernet with weight sharing and predictors that estimate the performance of candidate architectures. However, previous methods take substantial time to generate performance predictors and are thus inefficient. To this end, we propose FOX-NAS, which consists of fast and explainable predictors based on simulated annealing and multivariate regression. Our method is quantization-friendly and can be efficiently deployed to the edge. Experiments on different hardware show that FOX-NAS models outperform some other popular neural network architectures. For example, FOX-NAS matches MobileNetV2 and EfficientNet-Lite0 accuracy with 240% and 40% less latency on the edge CPU. Search code and pre-trained models are released at https://github.com/great8nctu/FOX-NAS.
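The combination the abstract describes, a regression-based performance predictor guiding a simulated-annealing search, can be sketched as follows. This is a minimal illustration of the general technique, not the FOX-NAS implementation: the architecture encoding, the regression weights, and the latency budget below are all invented for the example.

```python
import math
import random

# Hypothetical search space: per-stage (depth, width index, kernel index).
STAGES = 4
DEPTHS = [1, 2, 3, 4]
WIDTHS = [0, 1, 2]   # index into assumed width multipliers
KERNELS = [0, 1]     # index into assumed kernel sizes, e.g. {3x3, 5x5}

def random_arch():
    return [(random.choice(DEPTHS), random.choice(WIDTHS), random.choice(KERNELS))
            for _ in range(STAGES)]

def features(arch):
    # Flatten the per-stage choices into one feature vector for the predictor.
    return [x for stage in arch for x in stage]

# Toy multivariate-regression coefficients; in practice these would be fit
# from measured accuracy and latency of sampled sub-networks.
ACC_W = [0.8, 0.5, 0.3] * STAGES
LAT_W = [1.0, 0.9, 0.4] * STAGES

def predict(arch):
    f = features(arch)
    acc = sum(w * x for w, x in zip(ACC_W, f))
    lat = sum(w * x for w, x in zip(LAT_W, f))
    return acc, lat

def score(arch, lat_budget=20.0, penalty=5.0):
    # Predicted accuracy, penalized when predicted latency exceeds the budget.
    acc, lat = predict(arch)
    return acc - penalty * max(0.0, lat - lat_budget)

def mutate(arch):
    # Re-sample the choices of one randomly picked stage.
    arch = list(arch)
    i = random.randrange(STAGES)
    arch[i] = (random.choice(DEPTHS), random.choice(WIDTHS), random.choice(KERNELS))
    return arch

def anneal(steps=2000, t0=1.0, cooling=0.995, seed=0):
    random.seed(seed)
    cur = random_arch()
    cur_s = score(cur)
    best, best_s = cur, cur_s
    t = t0
    for _ in range(steps):
        cand = mutate(cur)
        cand_s = score(cand)
        # Accept improvements always; accept regressions with probability
        # exp(delta / t), which shrinks as the temperature cools.
        if cand_s > cur_s or random.random() < math.exp((cand_s - cur_s) / t):
            cur, cur_s = cand, cand_s
            if cur_s > best_s:
                best, best_s = cur, cur_s
        t *= cooling
    return best, best_s
```

Because the predictor is a linear model, each coefficient directly indicates how much a design choice is estimated to contribute to accuracy or latency, which is the sense in which such predictors are explainable.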

Original language: English
Host publication title: Proceedings - 2021 IEEE/CVF International Conference on Computer Vision Workshops, ICCVW 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 789-797
Number of pages: 9
ISBN (electronic): 9781665401913
DOIs
Publication status: Published - 2021
Event: 18th IEEE/CVF International Conference on Computer Vision Workshops, ICCVW 2021 - Virtual, Online, Canada
Duration: 11 Oct 2021 - 17 Oct 2021

Publication series

Name: Proceedings of the IEEE International Conference on Computer Vision
Volume: 2021-October
ISSN (print): 1550-5499

Conference

Conference: 18th IEEE/CVF International Conference on Computer Vision Workshops, ICCVW 2021
Country/Territory: Canada
City: Virtual, Online
Period: 11/10/21 - 17/10/21
