TY - JOUR
T1 - Machine learning for multi-class protein fold classification based on neural networks with feature gating
AU - Huang, Chuen-Der
AU - Chung, I-Fang
AU - Pal, Nikhil Ranjan
AU - Lin, Chin-Teng
PY - 2003
Y1 - 2003
AB - The success of a classification system depends heavily on two things: the tools being used and the features considered. In bioinformatics applications, the role of appropriate features has not received adequate attention. In this investigation we use two novel ideas. First, we use neural networks in which each input node is associated with a gate. At the beginning of training all gates are almost closed, i.e., no feature is allowed to enter the network. During training, gates are opened or closed as required. At the end of training, gates corresponding to good features are completely open, while gates corresponding to bad features are closed more tightly; some gates may remain partially open. Thus the network not only selects features in an online manner as learning proceeds, it also performs some feature extraction. The second novel idea is a hierarchical machine learning architecture: at the first level the network classifies the data into four major structural classes (all alpha, all beta, alpha + beta, and alpha / beta), and at the second level another set of networks further classifies the data into twenty-seven folds. This approach yields the following benefits. The gating network drastically reduces the number of features; for the first level, just 50 features selected by the gating network give test accuracy comparable to that obtained by neural classifiers using 125 features. The process also gives better insight into the folding process: by tracking the evolution of the gates we can identify which characteristics (features) of the data are more important for folding. It also reduces the computation time. Finally, the hierarchical architecture further improves performance.
UR - http://www.scopus.com/inward/record.url?scp=33646418489&partnerID=8YFLogxK
M3 - Article
AN - SCOPUS:33646418489
SN - 0302-9743
VL - 2714
SP - 1168
EP - 1175
JO - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
JF - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
ER -
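
A minimal sketch of the input-gating idea described in the abstract, for illustration only: each input feature is multiplied by a sigmoid gate whose parameter is learned jointly with the network weights, with gates initialized almost closed so that no feature initially reaches the network. The gate function, network size, optimizer, and synthetic data below are assumptions made for this sketch, not the authors' exact formulation or experimental setup.

# Illustrative NumPy sketch of feature gating for a neural classifier.
# Assumptions (not from the cited paper): sigmoid gates, one tanh hidden
# layer, softmax/cross-entropy training by plain gradient descent, and
# synthetic data with 2 informative features out of 10.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Synthetic data: only features 0 and 1 determine the 4-class label.
n, d, k, h = 400, 10, 4, 16
X = rng.normal(size=(n, d))
y = (X[:, 0] > 0).astype(int) * 2 + (X[:, 1] > 0).astype(int)  # labels 0..3
Y = np.eye(k)[y]

# Gate pre-activations start very negative, so every gate is almost closed.
a = np.full(d, -4.0)                      # one gate parameter per input
W1 = rng.normal(scale=0.3, size=(d, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.3, size=(h, k)); b2 = np.zeros(k)

lr = 0.05
for epoch in range(3000):
    g = sigmoid(a)                        # gate openings in (0, 1)
    Xg = X * g                            # attenuate each feature by its gate
    H = np.tanh(Xg @ W1 + b1)
    P = softmax(H @ W2 + b2)
    # Backpropagate the cross-entropy gradient through the gates as well,
    # so useful gates open while useless ones stay (or become) closed.
    dZ2 = (P - Y) / n
    dW2 = H.T @ dZ2; db2 = dZ2.sum(0)
    dH = dZ2 @ W2.T * (1 - H**2)
    dW1 = Xg.T @ dH; db1 = dH.sum(0)
    dXg = dH @ W1.T
    da = (dXg * X).sum(0) * g * (1 - g)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1
    a  -= lr * da

print("final gate openings:", np.round(sigmoid(a), 2))
# Gates for features 0 and 1 are expected to open toward 1 while the rest
# stay near 0, which is the online feature selection effect the abstract
# describes; the final gate values can then be used to rank features.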