STRUCTURAL-PARAMETRIC SYNTHESIS OF HYBRID NEURAL NETWORKS ENSEMBLES

Authors

  • O. I. Chumachenko National Technical University of Ukraine “Ihor Sikorsky Kyiv Polytechnic Institute,” Kyiv
  • A. T. Kot National Technical University of Ukraine “Ihor Sikorsky Kyiv Polytechnic Institute,” Kyiv

DOI:

https://doi.org/10.18372/1990-5548.54.12323

Keywords:

Neural networks, ensemble, training, optimization, topology

Abstract

An approach to the design of neural network ensembles is considered, in which a finite collection of neural networks is trained for the same task and their individual solutions are then combined. An algorithm is proposed for the optimal choice of the topologies of the neural networks and of their number when they are included as ensemble members. The ensemble composition is further refined by a pruning operation. The output of the ensemble is a weighted average of the outputs of the individual networks, with the ensemble weights determined as a function of the relative error of each network measured during training. A novel approach is also presented in which the ensemble weights are determined dynamically as part of the training algorithm, proportionally to the certainty of the respective outputs.
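
The combination rule described above (a weighted average with weights derived from each network's relative error, followed by pruning) can be illustrated by the following minimal Python sketch. It is not the authors' exact algorithm: member networks are assumed to expose a predict method, the weights are taken to be inversely proportional to each member's validation error as a stand-in for the relative-error weighting, pruning is a simple greedy backward elimination, and the names members, X_val, y_val are hypothetical.

# Illustrative sketch only: error-weighted averaging of ensemble member
# outputs and greedy pruning of the ensemble composition.
import numpy as np

def ensemble_weights(members, X_val, y_val, eps=1e-12):
    """Weight each member in inverse proportion to its validation error."""
    errors = np.array([np.mean((m.predict(X_val) - y_val) ** 2) for m in members])
    inv = 1.0 / (errors + eps)      # smaller error -> larger weight
    return inv / inv.sum()          # normalize so the weights sum to 1

def ensemble_predict(members, weights, X):
    """Ensemble output as the weighted average of the member outputs."""
    preds = np.stack([m.predict(X) for m in members])  # (n_members, n_samples)
    return np.tensordot(weights, preds, axes=1)

def prune_ensemble(members, X_val, y_val):
    """Greedily drop members whose removal does not increase validation error."""
    kept = list(members)

    def val_error(ms):
        w = ensemble_weights(ms, X_val, y_val)
        return np.mean((ensemble_predict(ms, w, X_val) - y_val) ** 2)

    improved = True
    while improved and len(kept) > 1:
        improved = False
        base = val_error(kept)
        for i in range(len(kept)):
            trial = kept[:i] + kept[i + 1:]
            if val_error(trial) <= base:
                kept, improved = trial, True
                break
    return kept

The dynamic, certainty-based weighting mentioned in the abstract would replace the fixed validation-error weights with weights recomputed per input from each member's output confidence; that step is not reproduced here.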

Author Biographies

O. I. Chumachenko, National Technical University of Ukraine “Ihor Sikorsky Kyiv Polytechnic Institute,” Kyiv

Technical Cybernetics Department

Candidate of Science (Engineering), Associate Professor

A. T. Kot, National Technical University of Ukraine “Ihor Sikorsky Kyiv Polytechnic Institute,” Kyiv

Technical Cybernetics Department

Post-graduate student

References

G. E. Hinton, A Practical Guide to Training Restricted Boltzmann Machines, Tech. Rep. 2010-000, Machine Learning Group, University of Toronto, Toronto, 2010, pp. 160–169.

J. Horn, N. Nafpliotis, and D. E. Goldberg, “A niched Pareto genetic algorithm for multiobjective optimization,” in Proceedings of the First IEEE Conference on Evolutionary Computation, vol. 1, Piscataway, 1994, pp. 82–87.

E. I. Chumachenko, D. Yu. Koval, G. A. Sipakov, and D. D. Shevchuk, “Features of hybrid neural networks use with input data of different types,” Electronics and Control Systems, no. 4(42), Kyiv: NAU, 2014, pp. 91–97.

O. I. Chumachenko and I. V. Kryvenko, “Neural networks module learning,” Electronics and Control Systems, no. 2(48), Kyiv: NAU, 2016, pp. 76–80.

L. Breiman. Bagging predictors. Machine Learning, 24(2):123–140, 1996.

L. Breiman. Random forests. Machine Learning, 45(1):5–32, 2001.

W. Fan, H. Wang, P. S. Yu, and S. Ma, “Is random model better? On its accuracy and efficiency,” in Proceedings of the 3rd IEEE International Conference on Data Mining, 2003, pp. 51–58.

Y. Freund and R. E. Schapire. A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 55(1):119–139, 1997.

T. K. Ho. The random subspace method for constructing decision forests. IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(8):832–844, 1998.

L.K. Hansen and P. Salamon, “Neural Network Ensembles,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 12, pp. 993-1001, 1990.

A. Krogh and J. Vedelsby, “Neural Network Ensembles, Cross Validation, and Active Learning,” Advances in Neural Information Processing Systems, G. Tesauro, D. Touretzky, and T. Leen, eds., vol. 7, pp. 231-238, MIT Press, 1995.

Y. Freund and R.E. Schapire, “A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting,” Proc. Second European Conf. Computational Learning Theory, pp. 23-37, 1995.

E. Bauer and R. Kohavi, “An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants,” Machine Learning, vol. 36, nos. 1-2, pp. 105-139, 1999.

G. Martínez-Muñoz and A. Suárez, “Aggregation Ordering in Bagging,” Proc. IASTED Int’l Conf. Artificial Intelligence and Applications, pp. 258–263, 2004.

G. Martínez-Muñoz and A. Suárez, “Using Boosting to Prune Bagging Ensembles,” Pattern Recognition Letters, vol. 28, no. 1, pp. 156–165, 2007.

Y. Zhang, S. Burer, and W.N. Street, “Ensemble Pruning via Semi-Definite Programming,” J. Machine Learning Research, vol. 7, pp. 1315-1338, 2006.

G. Tsoumakas, L. Angelis, and I. Vlahavas, “Selective Fusion of Heterogeneous Classifiers,” Intelligent Data Analysis, vol. 9, pp. 511-525, 2005.

R.E. Banfield, L. O. Hall, K. W. Bowyer, and W. P. Kegelmeyer, “A Comparison of Decision Tree Ensemble Creation Techniques,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 29, no. 1, pp. 173–180, Jan. 2007.

Z. Lu, X. Wu, X. Zhu, and J. Bongard, “Ensemble Pruning via Individual Contribution Ordering,” http://www.cs.uvm.edu/~jbongard/papers/2010_KDD_Lu.pdf

I. Maqsood, M. R. Khan, and A. Abraham, “An ensemble of neural networks for weather forecasting,” Neural Computing & Applications, vol. 13, pp. 112–122, 2004, doi: 10.1007/s00521-004-0413-4.

Section

COMPUTER-AIDED DESIGN SYSTEMS