Adaptive Similarity-Aware Hyperparameter Tuners for Classification Tasks

Chi Lin Hsieh, Kuei Chung Chang*, Tien Fu Chen

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


With the success of deep learning in recent years, many different AI models have been applied in the real world. At the same time, training a model that performs well has become a problem practitioners must face. One of the most important steps is hyperparameter tuning, since it determines the settings of the training flow. However, most conventional hyperparameter algorithms are inefficient because they search from scratch for each new task, which is why they require many search trials to find a good combination of hyperparameters. In this paper, we present a systematic hyperparameter tuning framework that exploits prior knowledge through a suggestion algorithm and an adaptive controller to improve efficiency, rather than searching from scratch for each task. In this way, the proposed method achieves better performance within the same search budget. In experiments on tens of popular datasets, the results show that the proposed methods outperform other approaches.
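The core idea in the abstract, reusing the best hyperparameters from the most similar previously seen dataset as a starting point rather than searching from scratch, can be sketched roughly as below. This is a minimal illustration, not the authors' actual algorithm: the meta-features, hyperparameter names (`lr`, `depth`), similarity measure, and toy objective are all assumptions made for the example.

```python
import math
import random

# Hypothetical prior-knowledge store: meta-features of past datasets
# mapped to the best hyperparameters found for them (illustrative values).
PRIOR_TASKS = [
    {"meta": (0.2, 0.8), "best": {"lr": 0.01, "depth": 4}},
    {"meta": (0.9, 0.1), "best": {"lr": 0.10, "depth": 8}},
]

def similarity(a, b):
    """Negative Euclidean distance between meta-feature vectors."""
    return -math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def suggest(meta, trials=10, rng=None):
    """Warm-start suggestion: seed the search from the most similar prior
    task, then propose small evolutionary-style mutations around it."""
    rng = rng or random.Random(0)
    nearest = max(PRIOR_TASKS, key=lambda t: similarity(t["meta"], meta))
    seed = nearest["best"]
    candidates = [dict(seed)]  # first trial is the transferred configuration
    for _ in range(trials - 1):
        candidates.append({
            "lr": seed["lr"] * rng.uniform(0.5, 2.0),          # perturb learning rate
            "depth": max(1, seed["depth"] + rng.choice([-1, 0, 1])),  # jitter depth
        })
    return candidates

# Toy objective standing in for validation accuracy (purely illustrative).
def score(cfg):
    return -abs(math.log10(cfg["lr"]) + 1.5) - abs(cfg["depth"] - 5)

new_task_meta = (0.3, 0.7)  # meta-features of the new dataset
best = max(suggest(new_task_meta), key=score)
```

Because every trial starts near a configuration that already worked on a similar task, the same budget of trials concentrates on a promising region instead of sampling the whole space uniformly, which is the efficiency argument the paper makes.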

Original language: English
Pages (from-to): 11089-11101
Number of pages: 13
Journal: IEEE Access
State: Published - 2023


  • Classification
  • dataset similarity
  • evolutionary algorithm
  • hyperparameter tuning
  • transfer learning


