School of Cognitive Sciences
Paper IPM / Cognitive Sciences / 13146
Abstract:
Negative Correlation Learning (NCL) is a popular ensemble method that employs a special error function for the simultaneous training of base neural network (NN) experts. In this article, we propose an improved version of the NCL method in which the capability of the gating network, the combining component of the Mixture of Experts method, is used to combine the base NNs of the NCL ensemble. The special error function of NCL encourages each NN expert to learn a different part or aspect of the training data, so the local competence of the experts should be taken into account when their outputs are combined. The gating network provides exactly this functionality for combining the NCL experts; the proposed method is therefore called Gated NCL. The improved ensemble method is compared with the approaches previously used for combining NCL experts, namely the winner-take-all (WTA) and average (AVG) combining techniques, on several classification problems from the UCI machine learning repository. The experimental results show that the proposed ensemble method significantly improves performance over the previous combining approaches.
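The three combining schemes mentioned in the abstract can be contrasted in a minimal sketch. This is not the authors' code; the expert outputs and the linear gating network below are hypothetical stand-ins, used only to show how AVG (uniform weights), WTA (single most confident expert), and a gated combination (input-dependent softmax weights over experts) differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax along the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

n_experts, n_classes, n_features = 3, 2, 4
x = rng.normal(size=n_features)  # one input sample (hypothetical)

# Hypothetical outputs of NCL-trained experts: one class-probability
# vector per expert (each row sums to 1).
expert_out = softmax(rng.normal(size=(n_experts, n_classes)))

# AVG combining: uniform average of the experts.
avg_pred = expert_out.mean(axis=0)

# WTA combining: keep only the single most confident expert.
wta_pred = expert_out[expert_out.max(axis=1).argmax()]

# Gated combining: a toy linear gating network maps the input to one
# softmax weight per expert, so locally competent experts dominate.
W_gate = rng.normal(size=(n_experts, n_features))  # assumed gating weights
g = softmax(W_gate @ x)          # gating weights, sum to 1
gated_pred = g @ expert_out      # weighted ensemble output, shape (n_classes,)

print(avg_pred, wta_pred, gated_pred)
```

Because the gating weights and each expert's output both sum to one, the gated prediction is again a valid probability vector; unlike AVG, the weights change with the input, which is the property the paper exploits for combining NCL experts.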