School of Cognitive Sciences
Paper IPM / Cognitive Sciences / 13781
Abstract:
Mixture of Experts is one of the most popular ensemble methods in pattern recognition systems. Although diversity among the experts is a necessary condition for the success of combining methods, ensemble systems based on Mixture of Experts suffer from a lack of diversity among the experts, caused by unfavorable initial parameters. In the conventional Mixture of Experts, each expert receives the whole feature space. To increase diversity among the experts, resolve structural issues of Mixture of Experts such as the zero-coefficient problem, and improve the efficiency of the system, we propose a model, entitled Mixture of Feature Specified Experts, in which each expert receives a different subset of the original feature set. To this end, we first select a set of feature subsets that lead to a set of diverse and efficient classifiers. Then the initial parameters are infused into the system by training classifiers on the selected feature subsets. Finally, we train the expert and gating networks using the learning rule of the classical Mixture of Experts, both to organize collaboration between the members of the system and to aid the gating network in finding the best partitioning of the problem space. To evaluate the proposed method, we use six datasets from the UCI repository. In addition, the generalization capability of the proposed method is assessed on a real-world database from an EEG-based Brain-Computer Interface. The performance of our method is evaluated with various appraisal criteria, and all practical tests indicate a significant improvement in the recognition rate of the proposed method.
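The training scheme described in the abstract — experts restricted to fixed feature subsets, combined by a gating network trained with the classical Mixture of Experts learning rule — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the class name is hypothetical, the experts are simple linear softmax classifiers, the update is plain gradient ascent on the mixture log-likelihood, and the feature-subset selection step is replaced by subsets supplied by the caller.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class MixtureOfFeatureSpecifiedExperts:
    """Sketch of a feature-specified MoE: each expert is a linear softmax
    classifier over its own feature subset; a gating network over the full
    feature space weights the experts (assumed gradient-ascent MoE rule)."""

    def __init__(self, feature_subsets, n_classes, lr=0.5, epochs=200):
        self.subsets = feature_subsets  # one feature-index list per expert
        self.n_classes = n_classes
        self.lr = lr
        self.epochs = epochs

    def fit(self, X, y):
        n, d = X.shape
        K = len(self.subsets)
        Y = np.eye(self.n_classes)[y]  # one-hot targets
        # expert weights: one linear map per feature subset (plus bias row)
        self.W = [rng.normal(0, 0.01, (len(s) + 1, self.n_classes))
                  for s in self.subsets]
        # gating weights over the full feature space (plus bias row)
        self.V = rng.normal(0, 0.01, (d + 1, K))
        Xb = np.hstack([X, np.ones((n, 1))])
        for _ in range(self.epochs):
            # expert posteriors, each computed on its own feature subset
            O = [softmax(np.hstack([X[:, s], np.ones((n, 1))]) @ W)
                 for s, W in zip(self.subsets, self.W)]
            g = softmax(Xb @ self.V)  # (n, K) gating probabilities
            # responsibility h[i, k]: posterior weight of expert k for sample i
            lik = np.stack([(o * Y).sum(axis=1) for o in O], axis=1)
            h = g * lik
            h /= h.sum(axis=1, keepdims=True) + 1e-12
            # responsibility-weighted gradient steps (classical MoE rule)
            for k, s in enumerate(self.subsets):
                Xk = np.hstack([X[:, s], np.ones((n, 1))])
                self.W[k] += self.lr / n * Xk.T @ (h[:, [k]] * (Y - O[k]))
            self.V += self.lr / n * Xb.T @ (h - g)
        return self

    def predict(self, X):
        n = X.shape[0]
        g = softmax(np.hstack([X, np.ones((n, 1))]) @ self.V)
        O = [softmax(np.hstack([X[:, s], np.ones((n, 1))]) @ W)
             for s, W in zip(self.subsets, self.W)]
        P = sum(g[:, [k]] * O[k] for k in range(len(O)))
        return P.argmax(axis=1)
```

Giving the experts disjoint feature subsets is what injects the diversity the abstract calls for: even with similar initial parameters, the experts cannot collapse onto the same decision function, and pretraining each expert on its own subset (as the paper does before the joint MoE phase) would further bias the gating network toward a useful partition of the problem space.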