TY - GEN
T1 - Hierarchical hypothesis structure for ensemble learning
AU - Yu, Chu En
AU - Liu, Chien-Liang
AU - Hsieh, Hsin Lung
N1 - Publisher Copyright:
© 2017 IEEE.
PY - 2018/6/21
Y1 - 2018/6/21
N2 - One of the goals of machine learning research is to improve classification accuracy. Over the past decades, many studies have focused on developing novel algorithms based on problem domains and statistical learning theory to continuously improve classification performance. Recently, many researchers have found that a performance bottleneck often occurs when only a single classification algorithm is used, since each algorithm has its own strengths and weaknesses. Ensemble learning, which combines several classifiers or hypotheses into a strong classifier or learner, relies on the combination of various hypotheses rather than on a single state-of-the-art algorithm. In ensemble learning, hypothesis selection is crucial to performance, and the diversity of the selected hypotheses is an important selection criterion. This work proposes three algorithms that generate a hierarchical hypothesis structure to achieve hypothesis selection, in which two hypotheses are combined based on a particular criterion. We conduct experiments on eight data sets, and the experimental results indicate that the proposed method outperforms random forest, a state-of-the-art method.
AB - One of the goals of machine learning research is to improve classification accuracy. Over the past decades, many studies have focused on developing novel algorithms based on problem domains and statistical learning theory to continuously improve classification performance. Recently, many researchers have found that a performance bottleneck often occurs when only a single classification algorithm is used, since each algorithm has its own strengths and weaknesses. Ensemble learning, which combines several classifiers or hypotheses into a strong classifier or learner, relies on the combination of various hypotheses rather than on a single state-of-the-art algorithm. In ensemble learning, hypothesis selection is crucial to performance, and the diversity of the selected hypotheses is an important selection criterion. This work proposes three algorithms that generate a hierarchical hypothesis structure to achieve hypothesis selection, in which two hypotheses are combined based on a particular criterion. We conduct experiments on eight data sets, and the experimental results indicate that the proposed method outperforms random forest, a state-of-the-art method.
KW - Ensemble Learning
KW - Hypothesis Divergence
KW - Hypothesis Hierarchical Structure
KW - Hypothesis Selection
UR - http://www.scopus.com/inward/record.url?scp=85050204060&partnerID=8YFLogxK
U2 - 10.1109/FSKD.2017.8393044
DO - 10.1109/FSKD.2017.8393044
M3 - Conference contribution
AN - SCOPUS:85050204060
T3 - ICNC-FSKD 2017 - 13th International Conference on Natural Computation, Fuzzy Systems and Knowledge Discovery
SP - 1827
EP - 1832
BT - ICNC-FSKD 2017 - 13th International Conference on Natural Computation, Fuzzy Systems and Knowledge Discovery
A2 - Zhao, Liang
A2 - Wang, Lipo
A2 - Cai, Guoyong
A2 - Li, Kenli
A2 - Liu, Yong
A2 - Xiao, Guoqing
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 13th International Conference on Natural Computation, Fuzzy Systems and Knowledge Discovery, ICNC-FSKD 2017
Y2 - 29 July 2017 through 31 July 2017
ER -