ALGORITHM OF PRUNING OF HYBRID NEURAL NETWORKS ENSEMBLE

O. I. Chumachenko, A. O. Kuzmenko

Abstract


Although an ensemble is usually more accurate than a single network, existing ensemble techniques tend to create unreasonably large ensembles, which increases memory use and computational cost. Ensemble pruning addresses this problem. The article analyzes the trade-off between accuracy and diversity and shows that classifiers that are more accurate and make more correct predictions in the minority group are more important for constructing the subensemble. A metric that accounts for both accuracy and diversity is proposed to evaluate the contribution of an individual classifier, which helps select the required number of networks with the best results.
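The idea of scoring each classifier by its accuracy plus a bonus for correct predictions made when most of the ensemble is wrong (the minority group) can be sketched as follows. This is a minimal illustration, not the authors' exact metric; the weighting factor `alpha` and the function names are assumptions introduced for the example.

```python
import numpy as np

def contribution_scores(preds, y, alpha=1.0):
    """Score each classifier by its accuracy plus a minority-group bonus.

    preds: array of shape (n_classifiers, n_samples) with predicted labels.
    y: array of shape (n_samples,) with true labels.
    alpha: hypothetical weight of the minority bonus (not from the article).
    """
    preds = np.asarray(preds)
    y = np.asarray(y)
    correct = preds == y                       # element-wise correctness
    frac_correct = correct.mean(axis=0)        # ensemble agreement per sample
    minority_weight = 1.0 - frac_correct       # correct votes count more when rare
    accuracy = correct.mean(axis=1)
    minority_bonus = (correct * minority_weight).mean(axis=1)
    return accuracy + alpha * minority_bonus

def prune(preds, y, k, alpha=1.0):
    """Return indices of the k highest-scoring classifiers (the subensemble)."""
    scores = contribution_scores(preds, y, alpha)
    return np.argsort(scores)[::-1][:k]
```

For example, a classifier that is right on samples where most other members fail receives a larger bonus than one whose correct predictions merely duplicate the majority vote, so `prune` favors members that are both accurate and diverse.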

Keywords


Ensemble pruning; bagging; accuracy; diversity

References


Zhenyu Lu, Xindong Wu, Xingquan Zhu, and Josh Bongard, Ensemble Pruning via Individual Contribution Ordering. Burlington. 2007, 10 p.

Gonzalo Martínez-Muñoz, Daniel Hernández-Lobato, and Alberto Suárez, “An Analysis of Ensemble Pruning Techniques Based on Ordered Aggregation,” IEEE Transactions on Pattern Analysis and Machine Intelligence, no. 2, 2009, pp. 245–259.

Z.-H. Zhou, J. Wu, and W. Tang, “Ensembling Neural Networks: Many Could Be Better than All,” Artificial Intelligence, vol. 137, pp. 239–263, 2002.






This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.