Deep dictionary learning for fine-grained image classification

M. Srinivas, Yen Yu Lin, Hong Yuan Mark Liao

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

11 Scopus citations

Abstract

Fine-grained image classification is challenging due to high inter-class similarity and large intra-class variations. A further difficulty is the small number of training images relative to the large number of classes to be identified. To address these challenges, we propose a model for fine-grained image classification and apply it to bird species recognition. Based on features extracted by a bilinear convolutional neural network (BCNN), we propose an online dictionary learning algorithm in which the principle of sparsity is integrated into classification. The features extracted by the BCNN encode pairwise neuron interactions in a translation-invariant manner, a property valuable for fine-grained classification. The proposed dictionary learning algorithm further carries out sparsity-based classification, in which training data are represented with fewer dictionary atoms. This alleviates the problems caused by insufficient training data and makes classification much more efficient. Our approach is evaluated and compared with state-of-the-art approaches on the CUB-200-2011 dataset. The promising experimental results demonstrate its efficacy and superiority.
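The abstract names two concrete ingredients: bilinear (BCNN) feature pooling and sparsity-based classification over an online-learned dictionary. The following is a minimal sketch of both, assuming NumPy and scikit-learn's MiniBatchDictionaryLearning as a stand-in for the paper's online dictionary learner; all function names, parameters, and the per-class dictionary layout are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only -- not the authors' code. It pairs BCNN-style
# bilinear pooling with sparse-representation classification over
# dictionaries learned with online (mini-batch) updates.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning, sparse_encode

def bilinear_pool(feat_a, feat_b):
    """Pool two sets of local CNN descriptors into one bilinear vector.

    feat_a, feat_b: (H*W, C_a) and (H*W, C_b) arrays of per-location
    features. Summing outer products over all locations discards spatial
    order, which is what makes the descriptor translation-invariant.
    """
    b = feat_a.T @ feat_b                    # (C_a, C_b) interaction matrix
    b = b.ravel()
    b = np.sign(b) * np.sqrt(np.abs(b))      # signed square root
    return b / (np.linalg.norm(b) + 1e-12)   # L2 normalisation

def learn_class_dictionaries(X, y, n_atoms=32, alpha=1.0):
    """Learn one small dictionary per class from pooled features.

    Mini-batch (online) updates keep memory bounded, which matters when
    each class has few training images but features are high-dimensional.
    """
    dicts = {}
    for c in np.unique(y):
        learner = MiniBatchDictionaryLearning(
            n_components=n_atoms, alpha=alpha, batch_size=8)
        dicts[c] = learner.fit(X[y == c]).components_  # (n_atoms, dim)
    return dicts

def classify(x, dicts, alpha=1.0):
    """Sparse-representation classification: assign the class whose
    dictionary reconstructs x with the smallest residual."""
    best_c, best_err = None, np.inf
    for c, D in dicts.items():
        code = sparse_encode(x[None, :], D,
                             algorithm='lasso_lars', alpha=alpha)
        err = np.linalg.norm(x - code @ D)
        if err < best_err:
            best_c, best_err = c, err
    return best_c
```

Under these assumptions, a test image is pooled with bilinear_pool and assigned by classify; because each sample is represented with only a few atoms, the residual test stays cheap even when the number of classes is large.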
Original language: American English
Title of host publication: 2017 IEEE International Conference on Image Processing (ICIP)
Publisher: IEEE Computer Society
Pages: 835-839
Number of pages: 5
ISBN (Print): 9781509021758
DOIs
State: Published - 20 Feb 2018

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
Volume: 2017-September

Keywords

  • Deep learning
  • Fine-grained image classification
  • On-line dictionary learning
  • Sparse representation
