Boosting standard classification architectures through a ranking regularizer

Ahmed Taha, Yi-Ting Chen, Teruhisa Misu, Abhinav Shrivastava, Larry Davis

Research output: Conference contribution, peer-reviewed

12 citations (Scopus)

Abstract

We employ triplet loss as a feature-embedding regularizer to boost classification performance. Standard architectures, such as ResNet and Inception, are extended to support both losses with minimal hyper-parameter tuning. This promotes generality while fine-tuning pretrained networks. Triplet loss is a powerful surrogate for recently proposed embedding regularizers, yet it is often avoided due to its large batch-size requirement and high computational cost. Through our experiments, we re-assess these assumptions. During inference, our network supports both classification and embedding tasks without any computational overhead. Quantitative evaluation shows a steady improvement on five fine-grained recognition datasets, and further evaluation on an imbalanced video dataset achieves a significant improvement. Triplet loss brings feature-embedding capabilities, such as nearest-neighbor retrieval, to classification models. Code available at http://bit.ly/2LNYEqL.
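The abstract describes training a classifier with a joint objective: standard cross-entropy on the logits plus a triplet-loss term on the feature embedding. A minimal numpy sketch of that joint loss is below; the weighting factor `lam`, the margin value, and the toy embeddings/logits are illustrative assumptions, not values from the paper, and the paper's actual triplet-mining strategy is not shown.

```python
import numpy as np

def cross_entropy(logits, label):
    # Softmax cross-entropy for a single example (numerically stable).
    z = logits - logits.max()
    log_probs = z - np.log(np.exp(z).sum())
    return -log_probs[label]

def triplet_loss(anchor, positive, negative, margin=0.2):
    # Hinge on Euclidean distances: max(0, d(a,p) - d(a,n) + margin).
    d_ap = np.linalg.norm(anchor - positive)
    d_an = np.linalg.norm(anchor - negative)
    return max(0.0, d_ap - d_an + margin)

# Toy embeddings and logits (hypothetical values for illustration).
anchor   = np.array([1.0, 0.0])
positive = np.array([0.9, 0.1])   # same class as anchor
negative = np.array([0.0, 1.0])   # different class
logits   = np.array([2.0, 0.5, 0.1])

lam = 1.0  # classification/embedding trade-off weight (assumed)
joint = cross_entropy(logits, label=0) + lam * triplet_loss(anchor, positive, negative)
```

At inference time only the classification head (or only the embedding) is read out, which is why the paper reports no extra inference cost.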

Original language: English
Title of host publication: Proceedings - 2020 IEEE Winter Conference on Applications of Computer Vision, WACV 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 747-755
Number of pages: 9
ISBN (electronic): 9781728165530
DOIs
Publication status: Published - Mar 2020
Event: 2020 IEEE/CVF Winter Conference on Applications of Computer Vision, WACV 2020 - Snowmass Village, United States
Duration: 1 Mar 2020 - 5 Mar 2020

Publication series

Name: Proceedings - 2020 IEEE Winter Conference on Applications of Computer Vision, WACV 2020

Conference

Conference: 2020 IEEE/CVF Winter Conference on Applications of Computer Vision, WACV 2020
Country/Territory: United States
City: Snowmass Village
Period: 1/03/20 - 5/03/20
