Abstract
Differentiable architecture search (DARTS) is an effective continuous relaxation-based neural architecture search (NAS) method with low search cost. It has attracted significant attention in AutoML research and has become one of the most effective paradigms in NAS. Although DARTS offers great efficiency over traditional NAS approaches in handling the complex parameter search process, it often suffers from stability issues, producing degraded architectures when the found continuous architecture is discretized. To address this issue, we propose a mean-shift based DARTS (MS-DARTS) that improves stability through architecture sampling, perturbation, and shifting. The proposed mean-shift approach in MS-DARTS effectively improves the stability and accuracy of DARTS by smoothing the loss landscape and sampling the architecture parameters within a suitable bandwidth. We investigate the convergence of our mean-shift approach as well as the effect of bandwidth selection on stability and accuracy. Evaluations on CIFAR-10, CIFAR-100, and ImageNet show that MS-DARTS achieves competitive performance among state-of-the-art NAS methods with reduced search cost.

Impact Statement: The proposed MS-DARTS can significantly improve the stability and accuracy of DARTS methods. Although DARTS greatly improves efficiency over traditional NAS approaches in searching for better architectures in the continuous architecture space, it suffers from stability issues when discretizing the found continuous architecture. The proposed mean-shift approach smooths out the complex NAS loss landscape and thus improves stability. Effective bandwidth selection in MS-DARTS can trade off and optimize both accuracy and stability.
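The core idea described above, sampling architecture parameters within a bandwidth, weighting the samples on a smoothed loss, and shifting toward the weighted mean, can be sketched minimally. This is an illustrative toy under stated assumptions, not the paper's MS-DARTS algorithm: the function `mean_shift_step`, the exponential weighting with temperature `tau`, and the quadratic surrogate loss are all hypothetical choices for demonstration.

```python
import numpy as np

def mean_shift_step(alpha, loss_fn, bandwidth=0.1, n_samples=16, tau=0.1, rng=None):
    """One illustrative mean-shift update on a vector of architecture parameters.

    Samples perturbed copies of `alpha` within `bandwidth`, weights each
    sample by exp(-loss / tau) so lower-loss samples count more, and shifts
    `alpha` to the weighted mean of the sampled neighborhood.
    (Hypothetical sketch; not the authors' implementation.)
    """
    rng = np.random.default_rng(rng)
    # Sample perturbed architecture parameters within the bandwidth.
    samples = alpha + rng.uniform(-bandwidth, bandwidth,
                                  size=(n_samples,) + np.shape(alpha))
    losses = np.array([loss_fn(s) for s in samples])
    # Kernel weights: lower loss -> larger weight (smooths the landscape).
    weights = np.exp(-(losses - losses.min()) / tau)
    # Mean shift: move toward the weighted mean of the neighborhood.
    return (weights[:, None] * samples).sum(axis=0) / weights.sum()

# Toy usage: a smooth surrogate loss with its minimum at alpha = 0.5.
rng = np.random.default_rng(0)
alpha = np.zeros(4)
for _ in range(50):
    alpha = mean_shift_step(alpha, lambda a: float(np.sum((a - 0.5) ** 2)), rng=rng)
```

Repeated steps drift the parameters toward the low-loss region while the bandwidth bounds each move, which is the stability/accuracy trade-off the abstract attributes to bandwidth selection.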
| Original language | English |
| --- | --- |
| Pages (from-to) | 1235-1246 |
| Number of pages | 12 |
| Journal | IEEE Transactions on Artificial Intelligence |
| Volume | 5 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 1 Mar. 2024 |