Perturbed Gradients Updating within Unit Space for Deep Learning

Ching Hsun Tseng, Hsueh Cheng Liu, Shin Jye Lee*, Xiaojun Zeng

*Corresponding author of this work

Research output: Conference contribution · Peer-reviewed

Abstract

In deep learning, optimization plays a vital role. Focusing on image classification, this work investigates the pros and cons of widely used optimizers and proposes a new one: the Perturbed Unit Gradient Descent (PUGD) algorithm, which extends the normalized gradient operation on a tensor with a perturbation step so that updates are taken within unit space. Via a set of experiments and analyses, we show that PUGD performs locally bounded updates, meaning the change from step to step is controlled. Moreover, PUGD can push models toward a flat minimum, where the error remains approximately constant, not only because gradient normalization naturally avoids stationary points but also because the perturbation scans sharpness within a unit ball. In a series of rigorous experiments, PUGD helps models achieve state-of-the-art Top-1 accuracy on Tiny ImageNet and competitive performance on CIFAR-{10, 100}. We open-source our code at: https://github.com/hanktseng131415go/PUGD.
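As a rough illustration of the idea the abstract describes, the following is a minimal sketch of one PUGD-style step: unit-normalize the gradient, perturb the weights within a small ball along that direction (the "sharpness scan"), then update the original weights with the unit-normalized gradient taken at the perturbed point. The function name `pugd_step`, the radius `rho`, and the exact order of operations are assumptions for illustration only; the authors' repository contains the reference implementation.

```python
import numpy as np

def pugd_step(w, grad_fn, lr=0.1, rho=0.05, eps=1e-12):
    """One illustrative PUGD-style update (hypothetical sketch, not the
    authors' exact algorithm).

    w       -- current weight vector
    grad_fn -- callable returning the gradient at a given point
    lr      -- learning rate
    rho     -- perturbation radius (assumed hyperparameter)
    """
    # 1) Gradient at the current point, normalized onto the unit sphere.
    g = grad_fn(w)
    g_unit = g / (np.linalg.norm(g) + eps)
    # 2) Perturb within a ball of radius rho to probe local sharpness.
    w_pert = w + rho * g_unit
    # 3) Gradient at the perturbed point, again unit-normalized, so the
    #    update magnitude is bounded by lr regardless of gradient scale.
    g_pert = grad_fn(w_pert)
    g_pert_unit = g_pert / (np.linalg.norm(g_pert) + eps)
    # 4) Update the ORIGINAL weights with the perturbed unit gradient.
    return w - lr * g_pert_unit
```

Because every update has norm at most `lr`, the step-to-step change is bounded, which matches the "locally bounded updating" property the abstract claims.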

Original language: English
Title of host publication: 2022 International Joint Conference on Neural Networks, IJCNN 2022 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728186719
DOIs
Publication status: Published - 2022
Event: 2022 International Joint Conference on Neural Networks, IJCNN 2022 - Padua, Italy
Duration: 18 Jul 2022 - 23 Jul 2022

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2022-July

Conference

Conference: 2022 International Joint Conference on Neural Networks, IJCNN 2022
Country/Territory: Italy
City: Padua
Period: 18/07/22 - 23/07/22
